
The Challenge Apple Faces in Enterprise Computing

By Paul Murphy MacNewsWorld ECT News Network
May 4, 2004 9:00 AM PT

Apple's role in desktop software has historically been that of the one-eyed man in the country of the blind: roundly reviled by all but discreetly trailed by the mob. Partly as a result, its periodic efforts to break into major business markets have generally been rebuffed -- not because its products don't belong in an enterprise-computing architecture, but because the decision-makers are usually too far behind the technology curve to recognize their value.


Technology products can be seen as representations of ideas about how things should work or how work should be done. In theory, the more the ideas articulated in the product overlap those held by its intended customers, the more the thing should sell and the more effective it should be when applied.

That's a key reason Unix products like Linux are so powerful when used as intended -- and so destructive of corporate value when misapplied. Unfortunately, the ideas behind the first Macintosh were in sync with what the user community needed, but out of sync with what IT decision-makers thought users should have. As a result, most of the experts who direct corporate IT decisions labeled the first Apple Lisas as slow, expensive and wrong for business.

They were wrong then, of course, and because nothing fundamental has changed since, they're still wrong today. To really understand how far that original Mac was ahead of the IBM PC idea, and thus why the traditional IT decision-maker feels so uncomfortable with it, you need to compare the ideas behind the two products, not their physical realizations.

Original Mac: Ahead of Its Time

Consider, for example, this statement from a 1979 presentation by Jef Raskin, the Mac's original designer:

The design assumes the existence of a network allowing nationwide communications. Macintosh is a communications device.

In 1979, data-processing professionals worked in glass rooms serving 370s and dreamed of upgrading to larger glass rooms serving the new 30XX series. Things like desktop computing, Unix, automated networking and the use of computers for collaboration and communication were not on their horizons.

Instead, the daily battle was for increased centralization and the elimination of internal data-processing competitors through the consolidation of things like departmental minicomputers from suppliers including Data General, Digital, Honeywell and Prime.

Today, that history is repeating itself as Microsoft makes its third attempt to get client-server to work, this time via more control centralization and another forklift upgrade. Indeed, Raskin's words from twenty-five years ago don't need a lot of change to describe Microsoft's impending Mono, WinFX, WinFS and related technologies:

The design assumes the existence of a network allowing enterprise-wide communications. Longhorn is a communications service.

That parallelism extends to an ironic recurrence both transcending and illustrating the maxim that the more things change, the more they stay the same. In 1979, while Raskin was inventing the Mac, Microsoft was advertising -- but not yet actually delivering -- its first operating system product, Xenix, which it failed to port from AT&T's Version 7 Unix code and had to subcontract, first to Human Computing Resources and then to SCO.

Apple: Largest Unix Workstation Maker

Today, Mac OS X has made Apple the world's largest Unix workstation manufacturer, while Microsoft advertises (but can't yet deliver) a reinvented Pick OS for the Internet age. As someone like Dave Barry might say, you couldn't make this stuff up on a bet.

Twenty years ago, however, the machine that should have established Apple as a leader in enterprise computing was the Mac XL, introduced in January 1985 as an upgradeable repackaging of the Lisa 2. This machine offered 1 MB of RAM, an integrated 720 x 364 screen, a 10 MB disk, an MC68000 CPU, a sharable connection to the PostScript-based LaserWriter and a set of GUI applications for a total price of US$4,495 -- five bucks less than the 256 KB IBM PC/AT with only BASIC and PC-DOS on a 24 x 80 screen. Like today's Xserve, it should have swept the enterprise market; it didn't.

Instead, even Apple's deal with PC World to sponsor Mac Magazine couldn't stand against the relentless hype put out by IBM and Microsoft. Companies that had IT professionals to guide their decision-making uniformly bought the IBM product and let their people denigrate the Mac as an expensive toy for amateurs. As a result, Apple couldn't meet manufacturing commitments on the product and withdrew from this market segment only four months after releasing the XL.

Capitalizing on Strengths

Meanwhile, of course, educators, graphic artists and others whose response to the Mac desktop-computing idea wasn't mediated by in-house IT expertise bought into the product. Those Apple executives who could spare time from the internal wars brought about by the Mac's failure in the business market adjusted both its marketing focus and its development efforts to capitalize on the product's demonstrated strengths in these sectors.

The fundamentally counterintuitive situation driving that hasn't changed: Customer IT expertise is still a negative indicator for Apple sales. You'd expect the opposite, of course, with greater expertise manifesting as a stronger preference for things that work. Instead, what you see is that the more experienced and competent corporate decision-makers are thought to be with respect to IT, the less likely they are to choose Apple products.

There are two possible explanations: Either our definition of IT expertise is naive or both the original PC-DOS environment and today's Microsoft client-server architecture are significantly superior to the original MacOS and today's Darwin/MacOS combination on at least some high-value measures.

'There's More of Us'

So far, twenty-three years of increasingly strident attempts by the PC-mainframe community to support the latter hypothesis boil down to "there's more of us" -- something that could just as easily be said of people on the left-hand side of the IQ curve, and not an actual argument.

In contrast, it's not that hard to support the idea that we tend to value the wrong kinds of expertise in deciding who the IT experts are. Consider, for example, this quotation from a story by Lafe Low in CIO Magazine's March 15, 2004 issue:

The business unit leaders wanted to change the order of certain data-entry fields to record customers' telephone numbers before their addresses. So Nickolaisen started the ERP customization project, which proved costly and time-consuming.

He realized the time and effort spent customizing the order-entry screens to change the order of data input did nothing to help acquire additional customers or even improve customer service. It cost US$100,000 and was, in his view, a waste of the company's time and resources.

CIO Magazine caters to key IT decision-makers -- the people whose expertise led to the rapid rise of the IBM PC and blocked Apple's early effort to serve the enterprise computing market. Read that quotation twice, consider that this laudatory review of IT expertise at work was written in 2004 -- not 1979 -- and you'll probably agree that more of the disconnect lies with our view of IT expertise than with the Apple products that expertise leads its holders to reject.

Inapplicable Experience

Unfortunately, that kind of expertise, in which the wrong people apply inapplicable experience to make wrong decisions, perpetuates its own reality and thus defines the challenge Apple faces in the enterprise computing market -- explaining why spotting an Apple Xserve and RAID array amid the Wintel clutter in Fortune 1000 data centers is marginally harder than visually tracking Mercury on a sunny day.

So what can Apple do about it? In the short term, nothing.

The people in charge in the data centers will perpetuate their ideas until younger companies with smarter leadership do a sufficiently better job of meeting customer needs either to put their employers out of business or to take them over.

In the long term, a renewed focus on serving smaller companies now could move things along, but the bottom line is that trying to sell tomorrow's ideas to yesterday's people is a recipe for yet another failure to crack the enterprise market.

Paul Murphy, a LinuxInsider columnist, wrote and published The Unix Guide to Defenestration. Murphy is a 20-year veteran of the IT consulting industry, specializing in Unix and Unix-related management issues.
