It may be too late to worry about how 2010 affected us all, but it is not too late to think about how it will shape events this year. In some respects, 2010 did not change any single course dramatically. Most of the year revolved around familiar themes: consolidation, M&A, emerging technologies and market maturity. In other words, 2010 was a case of old wine in a new bottle.
The technology industry and the marketplace exhibited familiar characteristics, ones indicative of an economy that is slowly thawing. For IT “consumers,” that means that as 2011 kicks into full gear, they should not expect old dogs to learn new tricks. With this theme in mind, let us look at some areas that 2010 illuminated.
Consolidation Continues To Drive IT Growth
I have always maintained that the IT industry will continue to grow by consolidation. Nowhere was this more prominent than in the storage industry. As has been the case all along, the bigger fish in the ocean started swallowing the smaller fish. The likes of IBM, EMC, HP and even Dell acquired smaller — but well-established — players in an attempt to reenergize their storage portfolios.
It has become a case of survival of the fittest, and the fittest are the ones with enough cash to buy storage technologies that would have been too time-consuming, or too expensive in the long run, to build organically.
HP and Dell bidding for 3PAR (with HP eventually winning) and Dell pulling off a “take under” of Compellent show how important it is for these companies to refresh their storage product road maps. Much depends on how well the acquirers integrate these companies, but the deals should give the well-established players pause to consider what they mean for competition.
Think about what Dell and HP now have in terms of their respective acquisitions: hardware and software bases that can be used to create a brand new storage portfolio that will replace older platforms (HP) or perhaps complement a reseller relationship that was getting a bit uneasy (Dell).
So, what does this mean for buyers? It means thinking twice before investing in products from smaller players, because in this environment a smaller vendor typically faces one of three outcomes:
- The first is that the company could get acquired by a bigger player, forcing customers into relationships they may not like. For example, many customers hated having to deal with Oracle for their investments in Sun’s technology.
- Second, it could mean the company succumbs to market forces and withers away because it cannot stand up to the competition from bigger companies.
- Third, it could continue to grow on its own merits and become wildly successful, in which case a bigger player may swoop in with a friendly or hostile takeover.
One way or another, it seems the risks of investing in smaller startups just went up in 2010.
Clouds of 2010
Call me a contrarian, but in 2010 the cloud remained largely hype. Companies were all too eager to dip their toes into the water to see what the cloud might become. Exhibiting a “me too” syndrome, many made cloud-related projects a highlight of their 2010 initiatives.
Far too many customers I have talked to know they want to do something with the cloud, but when asked for specifics, they are fuzzy on the details. The biggest problem with the cloud seems to be a lack of understanding of what it actually means and what benefits it actually delivers. Only once that understanding solidifies will cloud deployments become more fruitful.
2010 may have brought some clarity to the issue, but not as much as the industry needs to measure success. The industry continues to be largely fragmented in this space, and lack of adopted standards will continue to be a drag on this issue.
Big companies like EMC did not shy away from marketing their cloud initiatives and continued to use their influence to steer the industry in a particular direction. Some customers bought in, but most remained unsure. Let us hope that in 2011 vendors will work toward evolving standards such as TRILL, toward cloud interoperability and toward ratifying common use cases.
The Wikileaks Wakeup Call
2010 was the year of Wikileaks, not because of the political drama it caused, but because of its implications for how companies manage their data. All it takes is one disgruntled employee to put sensitive data out on the Internet.
2010 showed us that companies cannot be naive about security and, more importantly, cannot deal with security as a flat concept. Security needs to be treated like data storage — in a hierarchical manner.
Just as different types of data receive storage treatments based on their importance to the company, so too should they receive appropriate levels of security. No longer can companies assume that data is immune from flowing out of the company, or that every one of its employees is a good corporate citizen.
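The hierarchical treatment described above can be sketched as a simple classification policy. This is only an illustrative sketch; the tier names, control flags and function names below are my own assumptions, not any particular product's model:

```python
from enum import IntEnum

class Sensitivity(IntEnum):
    """Hypothetical data tiers, ordered from least to most sensitive."""
    PUBLIC = 0
    INTERNAL = 1
    CONFIDENTIAL = 2
    RESTRICTED = 3

# Controls tighten as the tier rises -- the storage-like hierarchy
# described above. The flag names are illustrative, not a real API.
CONTROLS = {
    Sensitivity.PUBLIC:       {"encrypt_at_rest": False, "audit_access": False, "need_to_know": False},
    Sensitivity.INTERNAL:     {"encrypt_at_rest": False, "audit_access": True,  "need_to_know": False},
    Sensitivity.CONFIDENTIAL: {"encrypt_at_rest": True,  "audit_access": True,  "need_to_know": False},
    Sensitivity.RESTRICTED:   {"encrypt_at_rest": True,  "audit_access": True,  "need_to_know": True},
}

def may_access(user_clearance: Sensitivity, data_tier: Sensitivity) -> bool:
    """A user may read data at or below their clearance level."""
    return user_clearance >= data_tier
```

The point is not the mechanism but the shape: access decisions and controls are keyed to a data tier rather than applied uniformly, so a leak of low-tier data and a leak of restricted data are no longer treated as the same event.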
I expect that like the U.S. government, companies will make the most of 2011 to create detailed blueprints on how access to data is managed and restricted. If my prediction is wrong, then unfortunately Wikileaks will have taught corporate America nothing, and one day it could very well be your company’s data out on the Internet.
The Double Edge of Cost Optimization
A noticeable trend in 2010 was companies’ focus on driving cost efficiencies. The economy may be thawing, but companies have latched onto cost optimization like iron filings to a magnet. That may be a good thing in the long run, but companies should not forget that with too much cost optimization comes the increased risk of a small issue having an amplified impact.
Nowhere is this more prominent than in the areas of server virtualization and asset consolidation. Companies have tried hard to reduce their compute and data footprints — be it by way of ultra-dense server virtualization, ultra-dense storage frames or having multiple applications run on the same compute environment.
While these changes occur, their operations models, including staffing, organizational structure, policies and processes, etc., are still suited to the old way of managing IT. A common theme that emerged in 2010 was the mismatch between how technology was adopted (architecture, consolidation, virtualization, etc.) and how it was handled (organizational structure, change management, etc.).
Driving cost efficiencies will continue to push companies toward cutting-edge technologies (as 2010 demonstrated), but not toward updating their internal structures to deal with them. In effect, companies are directing their IT organizations to walk through minefields. 2010 should be a wakeup call.
That Old Outsourcing Animal
Don’t get me wrong, I am not against outsourcing. However, things start going wrong when companies outsource everything, including functions that should be maintained in-house.
This is not because outsourcing companies cannot do a good job, but because of a conflict of interest: the time and monetary investment required to maintain these functions at the levels best for the company’s business rarely aligns with an outsourcer’s incentives.
As a trend, companies are beginning to bring architecture, engineering and capacity management functions back in-house. This was not particularly a 2010 trend, but it deserves mention because it is even more critical in a next-generation technology environment — that is, one that is bleeding edge, highly consolidated and densely virtualized, and where one small blip can have a domino effect.
So there it is, a few highlights of 2010 from my side. By no means is this meant to be a complete catchall; it’s just some musings to get 2011 started. Here’s to an exciting year in IT.
Ashish Nadkarni is practice lead at GlassHouse Technologies.