– and it designs back…
I’ve tried not to get into this, on account of it being a veritable can of ticked-off killer bees, but I think I must, so with a due sense of dread I utter the word: – Microsoft…
(disclaimer: while I am very much a Mac fan, this is not going to be a “Mac vs. PC” article, and I probably won’t respond very well to that line of discussion. Clear? OK, let’s proceed)
So what’s the deal here?
Well, as any designer with a user experience focus should, I have taken note of this thing, the computer – in fact, the very recent history and speedy ascent of this thing presents us with a unique opportunity to see what design actually does in a complex environment.
You see, the computer operating system is the first widely accepted, uniform “object” ever to have taken on its shape purely by design, and to have spread far, far beyond any specific demographic, environment or circumstance.
What I mean is, the chair, the hammer, the glasses, the car – all the other designed objects we have – were made into their overall shapes by a meeting of function and design. So there are a million chairs in the world, but they all have a seat and they all have some manner of footing, and a car won’t work unless it has a reasonably intuitive steering method that is also reliable, and so on – you get the point.
Not so for the computer OS. Nobody had any expectations for this entity, and there were literally no limitations – anything could have been built on the 0s and 1s that make up computing.
To say that something was limited only by imagination has rarely been more true.
Anyway, long story short, someone at Xerox PARC came up with a programming paradigm called Smalltalk, somebody thought of multitasking in windows, there was some borrowing, some stealing, some lawsuits and some business shenanigans, and presto, what we now know as Microsoft Windows became the prevalent operating system of computers all over the world.
And this design, it’s designing back – and this, as they say, is where the plot thickens…
For example, pretty well 90% of all computer users consider crashing and recovering from it a basic condition of working with a computer, much like refueling a car, but this is by design, not by function.
The only reason they think so is that the only OS they’ve ever known, Windows, is prone to crashing. They don’t know that a computer is not a thing that’s supposed to crash any more than the aforementioned car is, and that any particular model prone to doing so should be treated the same way a crash-prone car would be.
Similarly, the way most people react when a piece of technology doesn’t do what it should is something along the lines of “– oh, I’m so stupid with machines!” – but this, too, may (!) be traced back to the fact that Windows, most people’s first and only direct acquaintance with hi-tech, notoriously chides the user when something goes wrong, and/or talks tech above the user’s head with gibberish alerts that sound ominously complicated, yet serve no purpose since nobody in the room understands what they mean.
It has done so ever since it was first brought to the market, and an entire generation of people just got used to it.
The list could go on but as I said this is not about bashing Windows, so let’s just say the point is made.
(if you feel like a real tirade, albeit an extremely well spoken and interesting one, this guy did his homework and then some)
Now, I’d like to think there’s more of a reason for looking at this, and certainly for writing about it, than just establishing that I don’t like the Windows experience particularly – and there is.
That reason is that there’s a cost – mostly in actual money, too; we’re not talking abstract cost here – and we’re failing to notice it.
For example, look at the concept called the productivity paradox. If you don’t feel like following that link, the brief version is that, in the business world, there is hardly any visible gain in productivity with the increase in IT expenses.
This is called a paradox and has statisticians and other researchers flailing for an explanation, but I believe the answer is right at hand: – the most prevalent computer user interface is Windows, and Windows does not increase productivity, at least not in any way commensurate with the expenses incurred by using it.
Let’s do a bit of napkin-math on it:
Acme is a copywriting company that has, so far, worked on electric typewriters. It employs some 25 writers, each earning $40,000 a year.
So let’s get them computerized – each workstation costs $750, and each copy of Windows is $300.
That’s $26,250 right there – roughly two-thirds of a full salary – just for the first year (and we haven’t bought antivirus software yet, peripherals, or even an office package).
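To make the napkin-math explicit, here’s a minimal sketch of that first-year purchase cost – all figures are just the illustrative assumptions from the example above, not real market prices:

```python
# Napkin-math: Acme's initial computerization outlay.
# All figures are the illustrative assumptions from the example above,
# not real market data.

writers = 25            # number of copywriters
salary = 40_000         # yearly salary per writer, in dollars
workstation = 750       # hardware cost per seat
windows_license = 300   # OS license per seat

initial_cost = writers * (workstation + windows_license)
print(f"Initial outlay: ${initial_cost:,}")                           # $26,250
print(f"Fraction of one yearly salary: {initial_cost / salary:.2f}")  # 0.66
```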
(image caption: imagine a pile of these reaching from earth to the moon…)
The computers better make these people that much more effective, but how would they do that? They’re still just writing stuff.
It gets worse though.
A piece of hardware such as a computer should be able to run pretty well for at least 5-6 years (in fact, there’s no real reason it shouldn’t run much longer than that), but, mainly because of Windows, this company is probably going to have to upgrade most of its workstations every 3 years or so. On top of that, there will be point updates to the system, which cost money too (say, a couple of hundred dollars per workstation per year on average).
But it gets worse yet.
Because, unless these people are mostly superusers, having 25 workstations running Windows is going to require at least one full-time IT support employee. If the company has servers too, one person probably won’t cut it (certainly not if they run Windows Server, IIS and the like).
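Extending the same napkin-math to the recurring costs just described – a hardware refresh roughly every three years, point updates at a couple of hundred dollars per workstation per year, and one full-time support hire – gives a rough yearly figure. Note that the support salary below is a pure placeholder assumption, since no figure is given above:

```python
# Rough yearly running cost, under the assumptions above.
# The support salary is a placeholder guess, not a figure from the example.

writers = 25
workstation = 750        # dollars per seat
refresh_years = 3        # hardware replaced roughly every 3 years
point_updates = 200      # dollars per workstation per year ("a couple of hundred")
support_salary = 40_000  # ASSUMPTION: one full-time IT support employee

yearly_hardware = writers * workstation / refresh_years
yearly_updates = writers * point_updates
yearly_total = yearly_hardware + yearly_updates + support_salary
print(f"Recurring cost per year: ${yearly_total:,.0f}")  # ~$51,250
```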
– and just when you thought it couldn’t get any worse, it probably will; the company is all but guaranteed to experience serious downtime and expenses due to a virus or hacker attack. Most attacks (even taking the spread of Windows into account) exploit weaknesses and security holes in Windows that should not exist in the first place.
Again, my point here is not that they’d be better off with Mac systems – my point is that we, society, businesses, take this as a prerequisite condition of day-to-day operation without question, even though similar conditions in any other field of human endeavour would have us frothing at the mouth.
Could this be because the omnipresence and uniformity of Windows has redesigned our perception? I submit that it has – simply because there is no other explanation for our glaring blind spots regarding computers.
I think we, people, need to start learning to cope with technology and design – in fact, we’re overdue; the personal computer is spreading in the form of smartphones, and these are already beginning to show the same kind of vulnerability. If we just sigh and resign ourselves to this development, we will react to it in the wrong way, also known as the Microsoft way: – the technological equivalent of frantically trying to heal a broken leg, a burst appendix and a gunshot wound with two Tylenol and a Band-Aid.
(image caption: Remember these? Gone. By the wheelbarrow.)
The world of computers should be teaching us this, to the tune of billions of dollars every year in costs incurred by virus, bot and worm attacks, spam (some estimates have more than 75% of all spam spreading via Windows PCs, thanks to their inherent vulnerabilities) and plain wasted time.
And it’s not even that we don’t know – articles flood the news every time there’s a major virus or worm attack. It just never gets traced back to Microsoft, which is even more baffling when you realize that significant action could be taken against these attacks, and the costs they cause, by redesigning the computer operating system in a sensible way.
I think this is really an appeal to the true power of design – just imagine if we solved this problem, born from (bad) design, with (good) design, and saved the world billions upon billions of dollars…