
Accelerating Hadoop's Big Data Momentum

Some may consider the Pivotal and Hortonworks partnership on Apache Ambari simplistic or even mundane. However, without better-quality operations tools, Hadoop seems unlikely to evolve enough, or quickly enough, to fully achieve what its proponents envision. The fact is that in many or even most cases, sublime achievement requires a mundane foundation.

By Charles King | E-Commerce Times, ECT News Network
08/05/14 6:20 AM PT

The concept of "collaboration" has become such a messaging mainstay that it often verges on cliché. So, does the recently announced Pivotal and Hortonworks plan to improve Hadoop operations by collaborating on Apache Ambari qualify as anything new or special?

In fact, it does.

Innovation vs. Reliability

Hadoop is a core technology in many or most modern Big Data analytics solutions and strategies, but it has struggled to develop the practical features enterprises demand for dependable management and performance.

Hadoop's attractions are numerous, but continuing to deny enterprises those features, or simply delaying them, would likely hobble wider adoption.

This challenge is anything but unique to Hadoop. In fact, Linux and some related efforts suffered similarly painful paths to maturity. Open source collaboration can encourage -- and has delivered -- marvelous innovations, but it often stumbles in areas that benefit from more linear development efforts.

More Efficient and Effective

In his commentary on the Pivotal deal, Shaun Connolly, Hortonworks' VP of corporate strategy, specified how Apache projects can span "five distinct pillars to form a complete enterprise data platform: data access, data management, security, operations and governance."

Per the two companies' statements, this new effort aims to leverage their considerable skills and open source experience to improve Hadoop operations and make Apache Ambari the standard management tool for Hadoop.

So what exactly is Apache Ambari? It is a framework for provisioning, managing and monitoring Hadoop that allows administrators to

  • easily provision Hadoop clusters of virtually any size;
  • simplify Hadoop cluster management tasks, including controlling service and component lifecycles, modifying configurations and managing growth;
  • efficiently monitor Hadoop clusters by preconfiguring alerts and visualizing operational data; and
  • effectively integrate Hadoop (via a RESTful API) with existing data center tools, like Microsoft System Center and Teradata Viewpoint, and operational processes.
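Because Ambari exposes that management surface as a RESTful API, it can be scripted as readily as it can be clicked through. What follows is a minimal sketch, in Python with the requests library, of how an administrator or an external data center tool might list a cluster's services and stop one of them. The server address, credentials, cluster name and service shown are illustrative assumptions, not details from either company's announcement.

# Minimal sketch of driving Ambari's REST API; the host, credentials,
# cluster name, and service below are assumptions for illustration.
import requests

AMBARI = "http://ambari-host.example.com:8080/api/v1"   # assumed server
AUTH = ("admin", "admin")                                # assumed credentials
HEADERS = {"X-Requested-By": "ambari"}                   # required on write calls
CLUSTER = "my_cluster"                                   # assumed cluster name

# List the services Ambari is managing on the cluster.
resp = requests.get(f"{AMBARI}/clusters/{CLUSTER}/services", auth=AUTH)
resp.raise_for_status()
for item in resp.json()["items"]:
    print(item["ServiceInfo"]["service_name"])

# Stop a service (HDFS here) by setting its desired state to INSTALLED;
# Ambari runs the operation asynchronously and tracks it as a request.
requests.put(
    f"{AMBARI}/clusters/{CLUSTER}/services/HDFS",
    auth=AUTH,
    headers=HEADERS,
    json={"RequestInfo": {"context": "Stop HDFS via REST"},
          "Body": {"ServiceInfo": {"state": "INSTALLED"}}},
)

Those same endpoints are what allow tools such as Microsoft System Center and Teradata Viewpoint to fold Hadoop into existing operational processes without ever opening Ambari's own console.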

A Big Big-Data Future

The main "wood behind the arrowhead" in this collaboration are the Pivotal engineers who will combine parts of the company's installation and configuration manager technologies in Ambari to substantially expand its core capabilities.

Some may consider this partnership and its goals simplistic or even mundane. However, without better-quality operations tools, Hadoop seems unlikely to evolve enough, or quickly enough, to fully achieve what its proponents envision.

The fact is that in many or even most cases, sublime achievement requires a mundane foundation.

Without the foundational operations capabilities enterprises demand -- capabilities the Pivotal and Hortonworks collaboration aims to deliver -- Hadoop's Big Data future would be far smaller than many hope or believe it will be.


E-Commerce Times columnist Charles King is principal analyst for Pund-IT, an IT industry consultancy that emphasizes understanding technology and product evolution, and interpreting the effects these changes will have on business customers and the greater IT marketplace. Though Pund-IT provides consulting and other services to technology vendors, the opinions expressed in this commentary are King's alone.

