Recently, while presenting the findings of a Big Data survey to an executive team, I was taken aback when the CEO stopped me and said, “I’ll listen to what you learned from the survey as long as you don’t use those two words, ‘Big Data,’ again. I’ve already told my team there will be hell to pay if one more person tells me ‘we ought to take a look at what Big Data can do for us.’ That may be the last suggestion they make at this company.” I would love to say I was surprised, but it was not the first time I had heard this sentiment from an executive.
What is driving this resistance? The term Big Data has become so overused that its meaning is eroding; it makes up in hype what it lacks in specificity, and people are tired of hearing it. Though they may be unwitting victims of that hype, these executives are emblematic of a new wave of sentiment gaining momentum within corporate America.
Hard work and planning are required for Big Data to deliver on its promises.
Critical to Success
Big Data is real and can be a critical component of enterprise success. The backlash is focused on the trend’s hype rather than its impact. One of the problems with the term comes from the word “Big.” People are getting caught up in the quantity side of the equation rather than the quality of the business insights that analytics can unearth.
The fact is that much of the media attention has focused on what companies like Google, LinkedIn, eBay and Facebook have been able to accomplish through strategic Big Data initiatives. Unfortunately, these companies’ business models are dramatically different from those of most traditional American businesses: they were created from, and born into, a Big Data world. Rather than leveraging the Internet, social media, and sensor information, the average Fortune 500 business leader is tasked with simply managing and making sense of years of legacy data stored throughout the organization.
Finding a better way to more quickly glean valuable insights from these existing sources of disparate data is the real near-term victory for businesses. Variety and velocity are the primary Big Data issues enterprises are facing today, not volume. Enterprises’ ability to quickly analyze and leverage diverse data sources is the true game changer.
Rather than pulling every imaginable piece of data into analytics models, companies should focus on the speed and quality of their analytics. The sheer amount of data has little bearing on the success of a Big Data initiative. Big Data tools and platforms enable companies to more cost-effectively take existing information that historically was not readily accessible to analytics processes and put it into action, thereby decreasing their time-to-answer (TTA) for specific queries.
TTA is a key concept for enterprises considering Big Data initiatives. Currently, many enterprises are swamped with legacy information, and even basic questions can take weeks, if not months, to answer. Focusing on accelerating TTA is a surefire way to ensure companies enjoy the benefits promised by Big Data evangelists. An important step in achieving these goals is to clearly delineate known business insights from unknown ones.
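TTA is, at bottom, just the elapsed time between a business question being posed and an answer being delivered, which makes it easy to track as a metric. The sketch below is purely illustrative, assuming a hypothetical `Query` record and made-up dates; nothing here comes from a specific tool or platform.

```python
from datetime import datetime, timedelta

class Query:
    """Hypothetical record of a business question and when it was answered."""
    def __init__(self, question, asked_at, answered_at):
        self.question = question
        self.asked_at = asked_at
        self.answered_at = answered_at

    def time_to_answer(self):
        # TTA: elapsed time from question to answer.
        return self.answered_at - self.asked_at

def average_tta(queries):
    """Average TTA across a set of queries, as a timedelta."""
    total = sum((q.time_to_answer() for q in queries), timedelta())
    return total / len(queries)

# Illustrative comparison: the same question answered via a slow legacy
# process versus a streamlined one with ready access to known data.
legacy = Query("Churn by region?", datetime(2013, 1, 1), datetime(2013, 1, 22))
fast   = Query("Churn by region?", datetime(2013, 3, 1), datetime(2013, 3, 3))

print(legacy.time_to_answer().days)  # 21
print(fast.time_to_answer().days)    # 2
```

Tracking even a crude number like this per question gives executives the concrete, visible metric the article calls for.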
Knowns and Unknowns
A key to sustainable success in the business world is to learn what you do not know. Successful companies constantly challenge their assumptions and beliefs by looking for new insights into their customers, markets, products and risks. To do this well, they must pursue the new while continuing to operate on the known — creating a healthy continuous improvement model for Big Data initiatives.
Before companies can create meaningful new insights from Big Data, they must first govern their existing sources of data — the known. To do so, they should follow these architecture standards and guidelines for governing known information:
- Cleansed, mastered and pre-integrated analytic and operational data
- Common metrics, models and BI capabilities
- Up-to-date business and technical metadata
- Rigorous change management
- TTA Accelerators: Standards, scale and high availability
Conversely, new sources of data can be a bit more unruly. To gain new insights, organizations must draw on data from internal and external sources, including social media, government data, service and sales data, and research. This new discovery work should then be lightly governed and include the following:
- Access to known data & definitions
- Secure, large workspace for working with external data
- Rapid profiling, quality and data integration tools
- Non-standard analytic routines and data structures
- Ability to load complete data sets without data modeling
- TTA Accelerators: Powerful infrastructure, specialized tools
The new and the known must be symbiotic systems connected to and feeding each other. New analyses require rapid access to all the known data representing the reality of today’s business. Conversely, there must be a disciplined approach to promoting new insights, data, and models to evolve the known. Without this linkage, the systems diverge into incoherence.
Accelerating TTA is a concrete goal that Big Data initiatives can strive to achieve, providing metrics that executives can see. The key to streamlining the time from a corporate question to game-changing business insight is to establish a thoughtful, manageable approach to data analysis.
Strong governance and the oversight of known data capabilities must coexist with agile data analysis to pave the way for new discoveries and insights. The promise of Big Data is enormous, but it must be taken in context. New and emerging Big Data technologies can help dramatically reduce TTA, but enterprises must be prepared to take time and build an ecosystem that can: capture and create data, cleanse and organize it, mine business insights from it, and use those insights for business gains. Once that happens, executives will be happy to bring “Big Data” back into their lexicon.