The English language is a great tool. It’s expressive, powerful, inclusive, and evolves through the democratic and open-source processes of accepting change on the basis of common usage. Great, but you know what it doesn’t have? Enough useable swear words.
Think about it: you probably know eight to ten “emotional verbalizations” applicable to a complete and unmitigated, but easily prevented, disaster caused by human laziness or incompetence. Give it some thought and you might make it to fifteen before having to repeat entire phrases. That’s nothing; a German or Russian wouldn’t even be getting warmed up at thirty, and a lot of those fairly routinely make it into the media, where we’re limited to one or two acceptable euphemisms.
“Glitch,” for example, is the officially sanctified media term for a colossal — well, you know — involving computers. It doesn’t matter how bad or easily preventable the disaster was; to the mass media the presence of a computer in the scenario absolves management of all responsibility.
A Minor Malfunction
Not all dictionaries list the word; those that do tend to describe a glitch as a minor malfunction without an obvious cause — more comical than serious in its consequences. For example, Yahoo’s online dictionary describes a glitch as “A minor malfunction, mishap, or technical problem” and gives this etymology:
Although glitch seems a word that people would always have found useful, it is first recorded in English in 1962 in the writing of John Glenn: “Another term we adopted to describe some of our problems was ‘glitch.’” Glenn then gives the technical sense of the word the astronauts had adopted: “Literally, a glitch is a spike or change in voltage in an electrical current.” It is easy to see why the astronauts, who were engaged in a highly technical endeavor, might have generalized a term from electronics to cover other technical problems. Since then glitch has passed beyond technical use and now covers a wide variety of malfunctions and mishaps.
In a recent search, Google News returned 1,410 hits for a search on the terms “computer glitch” (without the quotation marks).
One of these, a “glitch” at the Royal Bank of Canada (RBC), directly disrupted the accounts of about ten million people for over a week and shut down its payroll and direct deposit services to thousands of clients — leaving several hundred thousand marginal earners unable to meet their mortgage obligations. By the time this “glitch” fades into history, it will have affected roughly a quarter of the bank accounts in the country; but all of Canada’s major newspapers concurred in calling it a glitch. Thus “Computer failure RBC -glitch” and “Computer failure ‘Royal Bank’ -glitch” return no hits.
A Cavalier Attitude
Meanwhile, over in England, a glitch functionally shut down travel in British airspace for a number of hours. No big deal, just a glitch whose ripple effects across the world cost millions of dollars and caused tens of thousands of people to suffer delays or missed flights.
That cavalier attitude seems to extend to all things computer. For example the Pakistan Daily Times, the Los Angeles Times, and Forbes Magazine all describe a tendency for the digital speedometer on some 2004 Honda bikes to underreport the actual vehicle speed as “a computer glitch.” So some people got tickets they didn’t deserve, or got hurt in bike crashes they shouldn’t have gotten into. So what? It’s just a “computer glitch.” Can’t they just reboot their lives?
Glitches really get around. Google turns up glitch attributions for things ranging from billings for nonexistent 911 calls, through missing ballots in the North Dakota special election, to failures in Medicaid processing in Georgia and a magical decision by the Boca Raton emergency warning fanout system to drop about 20 percent of the numbers it should have called, but call most of the other 60,000 or so twice — generally after 11:00 PM.
A Complacent Media
In every single one of these cases, people’s lives were affected while the news media devalued the consequences by describing entire event cycles as mere glitches. That’s the Windows mentality at work; in the Windows world, there are no consequences to failure: just reboot and move on.
Sometimes, we should just call a cigar a cigar. These weren’t glitches. They were the logical and necessary outcome of pressure-induced incompetence and idiocy at work. Try it first on the production system? Of course. Test on the customer’s dime? Efficient! Cut your processing margin so close to zero that you have no capacity to recover when a batch fails? Just good asset management, eh?
English just doesn’t have the words to describe management failures like these — and, believe me, the standard vocabulary doesn’t remotely cover it. Worse, the media’s complacent use of the phrase “computer glitch” to defang even the most serious failures and their consequences is a big part of the reason no one ever rethinks the underlying causes or hangs the bosses whose decisions created them.
Paul Murphy, a LinuxInsider columnist, wrote and published The Unix Guide to Defenestration. Murphy is a 20-year veteran of the IT consulting industry, specializing in Unix and Unix-related management issues.
This story was originally published on June 10, 2004, and is brought to you today as part of our Best of ECT News series.
I am often amazed at the decline in concern over quality that has come over the industry. I recall when, around 1980, I worked at Ontel, a small manufacturer of dedicated word processors. When complaints started surfacing that the systems might need to be rebooted as often as once every few weeks, there was a major investigation.
I attribute the whole acceptance of lackadaisical computer performance and frequent failures to the spread of Microsoft’s phenomenally atrocious early versions of Windows; who in the industry doesn’t remember when the normal "fix" for any problem in Windows ’95 or ’98 was to reinstall (!) the operating system?
Before the PC revolution, computers were hidden gods attended by their priesthoods, and resentment over their potential for errors was high. I still remember a song from those days, I believe it was by Charlie Daniels, where a frustrated customer faced with an erroneous account stabbed his punched card statement full of holes, told the utility (phone company? whatever) to "shove that up your computer," and was rewarded by a large refund check.
Later, hundreds of millions of people were exposed to computers on their desks at work, at school, or at home, and largely thanks to Microsoft’s shoddy software, found that the “gods” were more fallible and pathetic than any human could imagine. On the one hand, this eliminated the real fear of confronting mechanized perfection that people had twenty-five or thirty years ago, when they were afraid practically to touch a keyboard for fear that they would “mess it up”; but on the other hand, it accustomed them to accepting and even expecting that computers were especially unreliable and notoriously sensitive “pieces of junk.” This contempt was at the heart of the Y2K scare, as people whose only personal experience with a computer was the constantly crashing box on their desk were suddenly confronted with the ubiquity of “real” computer systems in telephone, air traffic control, and power switching systems and expected them to be as fully prone to fail as all of the “personal” systems that they’d experienced.
Sadly, only very few people have been regularly exposed to higher quality systems such as those from Apple or Sun, so that they can trust their computers and expect them to perform with anything like the reliability of an automobile or even a lawn mower.
To my mind, one of the absolutely worst things that Bill Gates and Company have done is to foster this contempt for computers and implicit expectation of unreliability among the general public, to say nothing of the billions of man-hours and trillions of dollars that have been wasted dealing with incompetent software.
We in the IT industry can only work diligently over the coming decades in pursuit of quality, in the hope of one day rebuilding the image of credibility that is now only a memory of a bygone era.