Salesforce earlier this year introduced its Einstein Vision capability, an idea with a lot of promise but not a great deal of precedent. Who had applications that could see, and how would this be used?
For decades, we’ve been content with scanning documents and analyzing them with optical character recognition tools, or we’ve used bar codes and QR codes — but it all came down to recognizing simple symbols.
Suddenly there was something much closer to human reading that needed explaining. Visual Search, introduced with Einstein Vision, gives customers the ability to photograph things and use them in searches for products and services. Vendors can use it to identify things in processes that didn’t have analogs before.
For instance, with Einstein Vision, marketers can quickly analyze photos for the presence of brand images and understand how brands are perceived and used. With a picture, you don’t have to rely on your gut or your experience, however faulty they might be.
Vision services also can be used in product identification to give service reps a way to evaluate possible service issues before dispatch, so that the right resources can be sent.
All of this leverages existing customer technologies, mostly in handheld devices. This is important, because it reduces the time it takes to diffuse the solution throughout the marketplace.
If, for instance, these vision solutions required special cameras or a wired connection, it would take far longer to diffuse the solution through a customer base. Or maybe the solution, however useful it is, might never make it to market.
The Einstein Vision announcement whetted appetites with its ability to recognize photos and logos, and Salesforce anticipated it would be able to provide applications to support visual search, brand detection and product ID in short order — which it did last month for its Social Studio.
Recognizing that social itself has become a visually oriented medium, it was logical for Salesforce to add vision recognition to the mix, and that’s what it delivered by adding Einstein Vision to Social Studio.
So what does this buy you? Well, in marketing and service, it can mean a lot. Right now the solution is available only for Twitter, but with it, marketers and service people can search their streams for images that tell them something about their brand, products or customers’ experiences.
Finding your brand or logo in a stream, for instance, might give you reason to try to understand the context. What do the words that go with the pictures say? The sentiment, which also could be analyzed by Salesforce, might convey happiness or the opposite in each case, prompting different actions from a vendor.
Seeing a brand or product with a negative sentiment might kick off a service outreach. At the same time, unambiguous displays of logos or branding, say at an event, can tell a vendor how well sponsorship ads are performing.
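The branching just described amounts to a simple routing rule: only act when the brand is present, and let sentiment decide whether the post goes to service or to marketing. A minimal sketch follows; the function name, sentiment labels and action names are illustrative assumptions, not part of Social Studio or any Salesforce API.

```python
def route_social_post(brand_detected: bool, sentiment: str) -> str:
    """Hypothetical routing rule for one post in a social stream.

    brand_detected: did image recognition find our brand or logo?
    sentiment: sentiment of the accompanying text
               ("positive", "neutral" or "negative").
    """
    if not brand_detected:
        return "ignore"            # no brand presence, nothing to act on
    if sentiment == "negative":
        return "service_outreach"  # possible problem: kick off service
    if sentiment == "positive":
        return "marketing_engage"  # happy customer: thank or amplify
    return "monitor"               # neutral: log for brand-exposure metrics
```

In a real workflow, the two inputs would come from the vision and sentiment analyses the article mentions; the point here is only that the same image result prompts different actions depending on sentiment.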
Other insights are possible too. Sorting through a social stream, for instance, can provide basic research into potential trends.
If a noticeable percentage of your customers can be seen doing, eating or having something, it might indicate the early stages of a trend. Of course, none of this raw data is enough to make investments in, but it serves as level one research that you can test. This beats relying on your gut or thinking you know the customer. Everybody knows the customer… but well enough to make an investment?
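This level-one research is essentially frequency counting: tally how often a label turns up across the analyzed posts and flag labels whose share crosses a threshold. A hypothetical sketch, assuming the image-recognition pass has already produced a set of labels per post; the function name and threshold are illustrative.

```python
from collections import Counter

def candidate_trends(labels_per_post, min_share=0.05):
    """Flag image labels seen in at least `min_share` of posts.

    labels_per_post: list of label sets, one per analyzed post
                     (e.g. the output of an image-recognition pass).
    Returns (label, share) pairs sorted by share, highest first --
    a starting point to test, not a basis for investment on its own.
    """
    total = len(labels_per_post)
    if total == 0:
        return []
    counts = Counter(label for labels in labels_per_post
                     for label in set(labels))
    return sorted(
        ((label, n / total) for label, n in counts.items()
         if n / total >= min_share),
        key=lambda pair: pair[1],
        reverse=True,
    )
```

The threshold is the whole judgment call: set it too low and everything looks like a trend, too high and you miss the early stages the article is talking about.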
So, the first version of AI-powered vision recognition from Salesforce is out there, and I expect it will be one of the many new ideas that get coverage at Dreamforce. I wouldn’t put it past Salesforce to announce more uses for it, or at least to announce a road map.
Going Into Overdrive
A few years ago, two MIT professors, Andrew McAfee and Erik Brynjolfsson, alerted us to the reality that the tech revolution was about to go into overdrive. Their reasoning was simple: We build on top of prior successes, and at this point there’s a lot to build on.
Because of this organic growth, we really can’t forecast the uses and applications of these developments, the professors said, but we know uses can and will be invented.
To simplify with a concrete example, there were no computer programmers until there were computers. That makes sense to us today, but to someone in the late 1940s those words might have looked like modern English while making no sense at all.
We’re at it again. Disruption is all around us, but history teaches us that we shouldn’t fear the future. Organic growth has a way of working itself out.