Oracle Makes Bold, Risky Move With Big Data Appliance Launch
Oracle's speed to market with its Big Data Appliance is indeed impressive, but what remains to be seen is whether the product will be a success. "Something like 40 percent of computer PhDs are working on Hadoop projects, sometimes unsuccessfully," noted Schooner CTO John Busch. "It can be a very complex process, so I don't know if bundling it with hardware will be easy as one-two-three."
Oracle has released its Big Data Appliance, which it announced last year at Oracle OpenWorld. Because no release date had been promised, the industry was surprised by the product's relatively quick arrival.
Less surprising was Oracle's enlistment of Cloudera -- the leading provider of Hadoop system management tools -- to provide an Apache Hadoop distribution and tools for the appliance. With Cloudera's established track record and relatively large installed base, partnering with the company was almost a no-brainer.
Assembling the Pieces
Oracle Big Data Appliance incorporates Cloudera's distribution, including Apache Hadoop and Cloudera Manager; an open source distribution of R; Oracle NoSQL Database Community Edition; Oracle HotSpot Java Virtual Machine; and Oracle Linux running on Oracle's Sun servers.
"This is a unique offering in the market," Ed Albanese, head of business development for Cloudera, told the E-Commerce Times. "It is a purpose-built appliance that contains a sophisticated high-end software configuration with a good combination of software."
The chief element of the software is the open source distribution of Apache Hadoop -- CDH.
The partnership gives Oracle users the benefit of a new product, Albanese said. Apache Hadoop's users get the benefit of a tighter combo of hardware and software.
"Oracle has a very large ecosystem of all sorts of vendors, and we anticipate that there will be a much larger number of services and tech products available that will be easier to integrate," Albanese said.
Oracle also announced Oracle Big Data Connectors, a portfolio of software products aimed at integrating data stored in CDH's Hadoop distributed file system or Oracle NoSQL Database with Oracle Database 11g.
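The basic idea behind such connectors -- moving the output of a Hadoop job into relational tables where SQL tools can reach it -- can be sketched in a few lines of Python. This is a toy illustration only: sqlite3 stands in for Oracle Database, and the pipe-delimited export format is invented, not anything the Oracle connectors actually use.

```python
import sqlite3

# Hypothetical delimited records, as a Hadoop job might export them.
hadoop_output = [
    "2012-01-10|clicks|1042",
    "2012-01-10|purchases|87",
]

# sqlite3 here is a stand-in for a relational target like Oracle Database.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE daily_metrics (day TEXT, metric TEXT, value INTEGER)")

# Parse each exported line and bulk-load it into the relational table.
rows = (line.split("|") for line in hadoop_output)
conn.executemany("INSERT INTO daily_metrics VALUES (?, ?, ?)", rows)

# Once loaded, ordinary SQL analytics apply to the Hadoop-derived data.
total = conn.execute(
    "SELECT SUM(value) FROM daily_metrics WHERE day = '2012-01-10'"
).fetchone()[0]
print(total)  # 1129
```

The real connectors handle scale, parallel loading and format negotiation; the sketch only shows why bridging the two worlds is attractive.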
New Computational Model
The partnership also holds interesting implications for Oracle's future direction, John Busch, founder and CTO of Schooner, told the E-Commerce Times.
"Certainly, Hadoop has a strong level of adoption, and I am sure that played part of a role in Oracle's decision to partner with it," he said. "Hadoop is also a new computation model that applies very well for big data."
In short, it is different from Oracle's traditional relational database approach.
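That computational model -- a map step fanned out independently over pieces of the data, followed by a reduce step that aggregates the results -- can be illustrated with a pure-Python word count. This is a sketch of the programming pattern Hadoop popularized, not Hadoop's actual Java API:

```python
from collections import defaultdict

def map_phase(documents):
    # Map: emit a (word, 1) pair for every word. Each document is
    # processed independently, which is what lets Hadoop spread this
    # step across many commodity machines.
    for doc in documents:
        for word in doc.split():
            yield word, 1

def reduce_phase(pairs):
    # Shuffle + reduce: group the pairs by key and sum the counts.
    counts = defaultdict(int)
    for word, n in pairs:
        counts[word] += n
    return dict(counts)

docs = ["big data big appliance", "data data everywhere"]
counts = reduce_phase(map_phase(docs))
print(counts["data"])  # 3
```

The contrast with a relational database is the point: instead of declaring a query over structured tables, the programmer writes map and reduce functions that the framework runs over raw, distributed data.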
"The idea of bundling Hadoop computational model with Oracle's hardware and developing appliances is very interesting," Busch continued.
However, the execution of this partnership may be more difficult than the industry expects.
"I read somewhere that something like 40 percent of computer PhDs are working on Hadoop projects, sometimes unsuccessfully," Busch noted. "It can be a very complex process, so I don't know if bundling it with hardware will be easy as one-two-three."
The market may not be interested in the appliance, suggested Quantivo Senior Director of Products Jim Chiang.
"Businesses and users are demanding cost-effective analytics, with infinite flexibility and a pay-as-you-go model," he told the E-Commerce Times. "You just can't get that with yet another hardware product to add to your data center."
A Defensive Move
It was inevitable that Oracle would produce something like this appliance, Ken Bado, CEO of MarkLogic, told the E-Commerce Times.
It is a defensive move, as more organizations abandon relational technology -- Oracle's core competency -- for new approaches to modern data problems, he said.
"Oracle's model is to drive revenue stream by joining forces with open source in order to maintain product viability. It's really a block on any customer who is thinking about transitioning from Oracle," Bado continued.
"Hadoop was built for commodity hardware and batch processing, and this partnership is taking Hadoop out of its comfort zone," he explained. "Today's big data solutions need to address a few major points -- they must scale out on commodity hardware, provide real-time and ad-hoc access to data, and go beyond analytics to provide enterprise-class big data applications."