As technology rapidly evolves, dealing with naturally risk-averse, conservative IT policies becomes ever less attractive to consumers when compared with the enablement available to the adventurous.
IT is left to languish unappreciated, a resource of last resort, with an insurmountable burden of legacy-induced maintenance, and no chance of being seen as anything other than an impediment. A look at some possible causes of past migrations suggests how IT might be able to avoid the ignominy of being dumped and tumbled under the waves of change.
Wave 1: Escape from the Mainframe to the Super Mini
Many of us can remember the heady days of the mainframe — the amazing 360 series with its glittering lights, and the dark promise of the closeted 370. In those days, IT was indisputably king, and programmers stalked the halls like gods, demanding and receiving obeisance from all. Alas, the miracle of the mainframe soon began to pale in direct relationship to the growth of the programming backlog, and end user dissatisfaction climbed exponentially as the backlog continued to increase.
In its last days, the mainframe was seen as a monolithic roadblock incapable of responding to business needs and providing only basic engine-room services that were often aligned with an outdated business model. The few items that were actually removed from the backlog only made things worse, as ill-considered changes piled on top of each other and exacerbated already challenging maintenance problems. Poor — even zero — service levels, mounting frustrations, and a healthy dose of scapegoating primed the end users for a rapid escape from mainframe land as soon as the lifeboat of minicomputers hove into sight.
Consumers were ecstatic. They had escaped the dead hand of the corporate mainframe, and now each division had its own minicomputer, which existed for the sole purpose of responding to the owners with rapid solutions to their needs. Progress surged ahead, with solutions appearing daily.
As the load on these minicomputers expanded, it soon became apparent that overlooking the overheads imposed by security, backup protection, disaster recovery, and standardized availability was causing critical failures in the minis’ ability to service a once-enthusiastic clientele. As outages became more frequent and the inability to recover lost or corrupted data impacted both the business and some personal reputations, discontent began to grow.
The wise old CIO (EDP manager) suggested that divisions might now be able to benefit from the disciplines and standards that made the mainframe so highly available and so secure. In a vain attempt at alleviating their pain, divisions returned the minicomputer to the control of the central IT group. Almost immediately, the unstable and insecure environments were brought under disciplined control, and availability improved dramatically.
Alas, the same old process for handling change requests still existed, and so the programming backlog began to increase again along with a similar increase in the time taken to add computing or storage resources for new projects. This backlog was so severe that in many instances the end consumer could confidently expect their requests for services to disappear into the IT black hole never to be seen again.
If a new server and storage were required for a new business venture, chances were slim that this could happen in anything approaching a reasonable time frame without CEO intervention. Discontent simmered, and the masses seethed with rebellion, desperately seeking an alternative that would actually allow them to enable their business. At the very peak of dissatisfaction, along came the PC.
Wave 2: Escape from the Super Mini to the PC
At last, the end user was freed from any dependence on IT (or so it seemed at the time). The popularity of PCs grew so rapidly that business folk were seen toting 100 lbs. of pseudo-portable 286 PCs onto their 707 airliners. On the desktop, the financial guys were in seventh heaven. They could count and manipulate beans whenever and however they desired — and in as many ways as their imagination could produce. Indeed, many organizations were so busy counting beans they didn’t notice most of them were rolling off the table.
In the financial shakeout of that recession, PCs assumed an even more important role as costs came down and capability went up in astonishing increments. End users were totally empowered, except for one small thing: They did not have access to the data held in the dead hands of corporate IT. Some few souls ventured into the lair of IT, and through a combination of courage and bluff, managed to have data downloaded onto their PCs, where they immediately performed miracles of analytical reporting.
As change accelerated, the increasing frequency of upgrades and patches coincided with a need to share between PCs. Software often seemed free, since it could be copied onto multiple PCs without any apparent constraint. Then, along came Ethernet and the Internet, and the unsecured desktop. The once physically discrete PC became simply one of many PCs accessible over the network, internally and often externally. These were now easy prey to a single user’s error, the 12-year-old vandal and the sophisticated eastern hacker.
In addition, software vendors started to pay attention to licensing issues, and a licensing audit caused more fear and trembling in the user community than an IRS audit of personal taxes. The wise old CIO typically suggested that if PCs were placed under IT control, IT could manage all those pesky licenses and multiple revision levels, roll out patches as needed, and implement sophisticated security. The battered and bruised end user agreed — and once again, the dead hand of IT reached out and scooped up the latest technology.
Wave 3: Escape from the PC to the Handheld Device
End consumers breathed a sigh of relief when the intolerable burden of PC maintenance and security was lifted from their weary shoulders, allowing them to focus on performing their business functions instead of maintaining PCs. Alas, it was not too long before Microsoft released yet another operating system and yet another release of MS Office to go with it — the latter containing absolutely must-have features for dedicated end users wanting to better enable their business functions with technology.
In older days, end consumers would have simply gone off to a local electronics retailer and picked up the latest version. Now they had to — yes, you guessed it — put in a request to the dreaded central IT. And when could they expect the upgrade they saw as so essential to the survival of their division, let alone the organization? Yes, you guessed it again — it would be a loooong way off.
Discontent grew and became exacerbated as end users found that the ability to load pussycat screensavers had been eliminated, and what was once a very personalized personal computer suddenly became an immovable IT-supported desktop computer. There were restrictions on memory sticks and writable CDs/DVDs, and rumor had it that even the PC would soon be taken away and replaced with a “dumb” terminal. Once again, it seemed that poor service and restricted access had driven the end consumer to despair. But fret not, technology again rode to the rescue, this time with the handheld device, the smart — if not super-smart — phone.
Wave 4: Escape from the Handheld to ‘Your Own Device’
Astute end users yet again found an escape path from the dead hand of IT as it descended upon the desktop. They went back to their favorite technology store where they were greeted like old friends (customers). There, excited sales persons showed valued customers a dazzling array of smartphones — each lighter, faster, and more capable than the one before. They could send and receive not only business but also personal email, create simple documents with the promise of more later, send information to and download information from the company, friends and family, cruise the Internet with no restriction — even take photos and record favorite songs for playback while reading email. Oh yeah, and make phone calls too. Near Nirvana.
Prudent end users at leading-edge organizations heard mutterings that IT was thinking of taking over their handheld devices on the grounds of security. It seems IT thought that if you lost your handset, it should be able to wipe any corporate information before someone picked it up and misused it. In addition, IT was concerned that inappropriate use of handhelds to text, sext or next someone could expose the organization to legal retaliation.
The wise old CIO set standards for the use of handhelds and limited end users to two or three choices, none of which were what the end consumer really wanted and most of which would not supply the tools and toys viewed as so essential to personal productivity. End consumers saw the dead hand of IT hovering over their productivity once again.
Having suffered through the capture and takeover of their personal desktops, they feared IT would hold their handhelds hostage. Dissatisfaction grew. Requests for features, functions and new versions seemed to fall into a black hole. End users sullenly retreated to their home offices where they used their own PCs and their own handhelds, and attempted to do their jobs without much of anything from corporate except an Internet interface to the corporate email. IT’s stock sank to an all-time low.
Wave 5: Escape to the Cloud
Having tried several generations of new technology, battered end consumers experienced brief blazes of productivity as they escaped IT control over one technology for the latest and greatest. But sooner or later, the hand of IT always caught up, imposing security, availability, repeatability and connectability, hoping to optimize the corporation rather than the individual.
Experienced end consumers found out “if you can’t beat em, join em” was a good strategy. They went to night school and learned programming. It wasn’t that big a step, as they were very experienced in using technology for personal productivity. They figured, “I will write the programs I need for myself and won’t have to wait on IT.”
That innovative strategy unfortunately foundered once those adventurous programmers found they couldn’t run their software on their own devices and, of course, couldn’t access IT resources like servers and storage without once again going through that incredibly frustrating and lengthy process imposed by corporate IT.
Ever innovative, our end users found cloud computing services and several suppliers. It seemed one could simply choose server and storage needs from an Internet menu and, once a credit card was entered, the resources would immediately become available for use.
Even better, extending and growing compute needs and storage needs would be just as easy. Availability, backup, and disaster recovery were all options they could select according to need, and those too could be paid for with a credit card and be immediately available. What was not to love? If only IT had worked like this, end users would never have left it.
As they proudly presented their latest work to their executive teams, they heard rumors and grumblings that IT would soon be imposing control over use of the cloud for security reasons. They wondered how the cloud could be any less secure than their own organizations, with their unappreciated, underpaid, sullen and rebellious administrators toiling in the bowels of IT.
For a moment, fear clutched their hearts, but they realized it would take a long time for IT to figure out how to control the external cloud. Hopefully they didn’t even know about it yet, and in the meantime the end users could make hay while the sun shone.
What to Do?
So, what can the “Wise old CIO” do to prevent these mass escapes of end consumers, with their attendant security, confidentiality and reputational exposures? First, it seems end consumers will remain sullen but not rebellious unless an alternative is available. Once an alternative is available, however, open rebellion will break out and the end consumers will rapidly adopt new technology regardless of risk, and usually way ahead of the curve being followed by IT.
Why is IT’s vision confined? Partly it is the legacy-induced maintenance load, exacerbated by an inability to make a budget case. But most importantly, it is an inability to respond quickly to two critical types of request: 1) provisioning requests for new or extended resources; and 2) change requests for programming functionality. This article focuses only on the infrastructure side, as so much has been written on application architectures for rapid deployment.
How does the wise old CIO transition his application-focused, biggest-bang-for-the-buck infrastructure to one that optimizes the ability to respond to end consumers and application developers alike with the speedy provisioning of server and storage resources?
The answer lies in the adoption of the service provider model, through which end consumer services are reliably and quickly provided from a common, rationalized, tiered infrastructure at a known cost and at contracted service levels. This is the foundation of the deployment model called “cloud.” Under cloud, the actual service selection and service provisioning have been automated, along with subsequent billing for those services.
The combination of guaranteed service levels with quick service selection and provisioning, as well as cost-conscious decision making, drives end consumer satisfaction and cost-effective service delivery of infrastructure. The tiering of infrastructure to align with business services ensures that IT responds directly to business need as well as allowing IT to evolve performance-optimized infrastructure into a service-optimized infrastructure.
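The select-provision-bill loop described above can be sketched in a few lines. This is a minimal illustration, not any vendor’s API: the tier names, prices, and the `ServiceProvider` class are hypothetical stand-ins for a real service catalog with contracted service levels and chargeback.

```python
# Minimal sketch of the service provider model: a tiered catalog,
# automated provisioning, and subsequent billing at a known cost.
# All tier names, figures, and the API shape are hypothetical.

from dataclasses import dataclass, field

# Rationalized, tiered offerings: each tier bundles a contracted
# service level with a known monthly cost.
CATALOG = {
    "gold":   {"cpus": 16, "storage_gb": 2000, "availability": "99.99%", "monthly_cost": 900},
    "silver": {"cpus": 8,  "storage_gb": 1000, "availability": "99.9%",  "monthly_cost": 450},
    "bronze": {"cpus": 4,  "storage_gb": 500,  "availability": "99.5%",  "monthly_cost": 200},
}

@dataclass
class ServiceProvider:
    provisioned: list = field(default_factory=list)

    def provision(self, owner: str, tier: str) -> dict:
        """Automated service selection: validate the tier, allocate,
        and record the order for later billing."""
        if tier not in CATALOG:
            raise ValueError(f"unknown tier: {tier!r}")
        order = {"owner": owner, "tier": tier, **CATALOG[tier]}
        self.provisioned.append(order)
        return order

    def monthly_bill(self, owner: str) -> int:
        """Subsequent billing: chargeback per owner at the known cost."""
        return sum(o["monthly_cost"] for o in self.provisioned if o["owner"] == owner)

provider = ServiceProvider()
provider.provision("finance", "gold")
provider.provision("finance", "bronze")
print(provider.monthly_bill("finance"))  # 1100
```

The point of the sketch is that once offerings are tiered and costed up front, provisioning becomes a lookup rather than a project, and billing falls out of the same record — which is what lets the business make cost-conscious service selections at request time rather than at budget time.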
IT moves from building for “just in case” to building for “what’s actually needed” and thus builds what the business is prepared to pay for at budget time. This journey can be complex, but the benefits are outstanding in terms of dramatic improvements in end consumer satisfaction, as well as significant cost savings through business-aligned infrastructure and cost-conscious service selection.