Heap on Tuesday introduced a new feature set for its autonomous customer insight platform.
Among the additions:
- Non-destructive data modeling — Users can define and model new insights without touching the raw data structure, allowing faster iteration. Virtual event definitions let users retroactively update metrics on the fly wherever they are used.
- Sources — 15 new data source connectors, including connectors to Salesforce, Marketo, and leading email and payment providers, make all historical data and events available with one-click connectivity. No custom coding is required.
- Data integrity core — A data scoring system automatically ranks all customer data by level of usefulness and trustworthiness so companies can measure, define and control data access.
- Dashboards — Users can access data visualization reports across departmental data silos to get a bird’s-eye view of customers.
Event definitions, user segments and other higher-level semantic concepts “live in our data control plane and can be edited at will,” noted Heap CMO Shawn Hansen. “Renaming an event, combining multiple events into one, creating new user segments and more are all possible by just interacting with our UI. There’s no need to edit tracking code or perform SQL migrations.”
Heap’s non-destructive data modeling “is the secret sauce,” said Constellation Research Principal Analyst Ray Wang.
“The other features are expected if you want to play in this space,” he told CRM Buyer.
The Advent of Sources
Heap’s Sources analytics tool integrates with 18 third-party services to let users gain insights into the effectiveness of marketing campaigns and ads, for example, without needing to write code.
Sources offers the following capabilities:
- Captures all data from every source;
- Lets management maintain full control over which events to expose to teams and how the events are structured in reports;
- Retroactively backfills all historical data whenever possible; and
- Has a standardized event schema that can be accessed in users’ data warehouse through Heap SQL.
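Heap has not published its schema, but the idea of a standardized event schema spanning many connectors can be sketched as follows; the field names and source payloads here are invented for illustration:

```python
# Hypothetical standardized event schema: every connector maps its
# source-specific payload onto the same set of fields.
STANDARD_FIELDS = ("user_id", "event_name", "timestamp", "source", "properties")

def normalize_salesforce(payload):
    """Map a (hypothetical) Salesforce-style record onto the standard schema."""
    return {
        "user_id": payload["ContactId"],
        "event_name": "opportunity_" + payload["StageName"].lower(),
        "timestamp": payload["LastModifiedDate"],
        "source": "salesforce",
        "properties": {"amount": payload.get("Amount")},
    }

def normalize_marketo(payload):
    """Map a (hypothetical) Marketo-style record onto the standard schema."""
    return {
        "user_id": payload["leadId"],
        "event_name": payload["activityType"],
        "timestamp": payload["activityDate"],
        "source": "marketo",
        "properties": {"campaign": payload.get("campaignName")},
    }

events = [
    normalize_salesforce({"ContactId": "c-1", "StageName": "Won",
                          "LastModifiedDate": "2021-06-01T12:00:00Z",
                          "Amount": 4200}),
    normalize_marketo({"leadId": "c-1", "activityType": "email_open",
                       "activityDate": "2021-06-02T08:30:00Z",
                       "campaignName": "spring-launch"}),
]

# Every event, regardless of origin, now answers to the same schema,
# so a single query (e.g. via a SQL interface) can span all sources.
for e in events:
    assert set(e) == set(STANDARD_FIELDS)
```

Once every connector emits the same shape, backfilling history is just replaying old payloads through the same normalizers.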
The underlying raw data is separated from the semantic names and definitions, which “makes it really easy to impose whatever schema or taxonomy you want and change that on the fly,” Heap’s Hansen said.
“Storage is so cheap that we can capture everything,” he told CRM Buyer.
“Because we capture everything and let you create virtual schema you can blow away on demand, you can think up and ask new questions and do it on the fly,” Hansen pointed out. “Customers can access the data and instantly rewire the question.”
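The separation Hansen describes can be sketched as an immutable raw event log plus a virtual semantic layer that is applied only at query time; the selectors, event names and metric below are invented for illustration, not Heap's actual data model:

```python
# Immutable raw capture: every interaction is stored as-is.
RAW_EVENTS = [
    {"user": "u1", "selector": "button#buy", "page": "/pricing"},
    {"user": "u2", "selector": "button#buy", "page": "/home"},
    {"user": "u1", "selector": "a.signup", "page": "/home"},
]

# Virtual definitions: semantic names mapped onto raw data by predicates.
# Editing this mapping "retroactively" changes every metric, because each
# metric is recomputed against the full raw history on every query.
definitions = {
    "Clicked Buy": lambda e: e["selector"] == "button#buy",
    "Signed Up":   lambda e: e["selector"] == "a.signup",
}

def count(event_name):
    match = definitions[event_name]
    return sum(1 for e in RAW_EVENTS if match(e))

print(count("Clicked Buy"))   # 2

# "Rewire the question" on the fly: narrow the definition in the UI layer;
# no tracking-code edit, no SQL migration, raw data untouched.
definitions["Clicked Buy"] = (
    lambda e: e["selector"] == "button#buy" and e["page"] == "/pricing"
)
print(count("Clicked Buy"))   # 1
```

The design choice is that definitions are cheap metadata while raw capture is the expensive, append-only asset — which is why "storage is so cheap that we can capture everything" makes the whole approach viable.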
Heap’s data is stored on the Amazon Web Services cloud. Its datastore is a custom distributed system built on top of PostgreSQL.
How Heap Works
The Heap platform automatically captures, validates and connects all customer data, which lets users derive meaningful customer insights instantly to better drive business decisions.
“Infinite ambient orchestration is where applications are headed next,” Constellation’s Wang said. Heap’s “customer insight platform is automating the hard part of integration and orchestration.”
The Heap autonomous customer insights platform has three key layers — data capture, control and insights:
- The data capture plane automatically captures all behavioral data from sources across departments and domain-specific tools into one standard schema;
- The control plane ensures data integrity and lets users change event definitions on the fly; and
- The insights plane, which sits on top of the data capture plane, lets users process networked insights across marketing, sales and customer success silos.
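The three planes above can be rendered as a toy pipeline; the function names and data shapes are assumptions for the sketch, not Heap internals:

```python
def capture_plane(raw_sources):
    """Pull events from every departmental source into one standard shape."""
    return [{"user": user, "event": event, "dept": dept}
            for dept, records in raw_sources.items()
            for (user, event) in records]

def control_plane(events, definitions):
    """Apply editable semantic definitions without mutating the raw events."""
    return [dict(e, label=definitions.get(e["event"], e["event"]))
            for e in events]

def insights_plane(events):
    """Aggregate across departmental silos for a cross-silo view per user."""
    by_user = {}
    for e in events:
        by_user.setdefault(e["user"], set()).add(e["dept"])
    return {user: sorted(depts) for user, depts in by_user.items()}

sources = {
    "marketing": [("u1", "email_open")],
    "sales":     [("u1", "demo_booked"), ("u2", "demo_booked")],
}
events = control_plane(capture_plane(sources), {"demo_booked": "Booked Demo"})
print(insights_plane(events))   # u1 appears in both marketing and sales
```

The point of the layering is that the control plane can be re-run with new definitions at any time, while the capture plane's output never changes.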
“Bringing together all sources and points of customer data insights is a common problem for companies that have developed their marketing and sales teams over time,” noted Rebecca Wettemann, vice president of research at Nucleus Research.
“Key to the success of any such approach is usability, where Heap appears to have significant success, speed of integration — and of course, price,” she told CRM Buyer.
Keeping the Raw Data Pristine
Data is benchmarked against peer groups and checked statistically for freshness, accuracy and usefulness.
“The data integrity core appears like a modern approach to master data management on a broad scale,” Wettemann remarked, which “could be very interesting provided [Heap’s] scoring is transparent and understandable to users.”
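Wettemann's caveat about transparency can be made concrete with a toy integrity score that reports exactly which checks passed; the checks and weights below are invented, not Heap's actual methodology:

```python
from datetime import datetime, timezone

def score_record(record, now, peer_median_fields):
    """Toy data-integrity score over freshness, accuracy and usefulness proxies."""
    checks = {
        # Freshness: captured within the last 30 days.
        "fresh": (now - record["captured_at"]).days <= 30,
        # Accuracy proxy: no required field is missing.
        "complete": all(record.get(f) is not None
                        for f in ("user_id", "event_name")),
        # Usefulness proxy: at least as rich as the peer-group median.
        "rich": len(record) >= peer_median_fields,
    }
    # Transparent output: the score AND exactly which checks produced it.
    return sum(checks.values()) / len(checks), checks

now = datetime(2021, 6, 15, tzinfo=timezone.utc)
record = {"user_id": "u1", "event_name": "signup",
          "captured_at": datetime(2021, 6, 1, tzinfo=timezone.utc)}
score, detail = score_record(record, now, peer_median_fields=3)
print(score, detail)   # 1.0, with all three checks passing
```

Returning the per-check breakdown alongside the score is one simple way to keep such a system "transparent and understandable to users."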