Tuesday, June 11, 2019

Tableau, Looker, and Origami Logic Acquisitions Show Analytics Is In Fashion

One of the unwritten laws of punditry is that one event is random, two events are interesting, and three events make a trend. By that measure, the purchases of data analytics vendors Looker by Google, Tableau by Salesforce, and Origami Logic by Intuit within a three-week span must signify something. Is it that martech suites must now include business intelligence software?

I think not. Even though the acquired products were fairly similar, each of these deals had a different motivation. Conveniently, the buyers all stated their purposes quite clearly in their announcements.

- Intuit bought Origami Logic to advance its strategy to become an “A.I.-driven expert platform”. Specifically, it sees Origami Logic as providing a “strong data architecture” that will “accelerate Intuit’s ability to organize, understand, and use data to deliver personalized insights that help customers quickly achieve success and build confidence whenever they use Intuit products.” Reading between the lines, Intuit recognized that its existing data architecture couldn’t support the kinds of analysis needed to generate AI-based recommendations for its clients, and bought Origami Logic to close that technology gap. In other words, Origami Logic will be the foundation of new Intuit products.

- Google bought Looker “to provide customers with a more comprehensive analytics solution — from ingesting and integrating data to gain insights, to embedded analytics and visualizations — enabling enterprises to leverage the power of analytics, machine learning and AI.” That’s somewhat similar to Intuit’s purpose, but Looker is building applications on top of Google Cloud’s existing, extremely powerful data management capabilities rather than providing a new data management foundation. Indeed, Looker already runs on Google Cloud. So Looker is adding another layer of value to Google Cloud, letting it meet more needs of its existing clients.

- Salesforce bought Tableau so it can “play an even greater role in driving digital transformation, enabling companies around the world to tap into data across their entire business and surface deeper insights to make smarter decisions, drive intelligent, connected customer experiences and accelerate innovation”. That’s not exactly pithy, but we’re dealing with Marc Benioff. The key is digital transformation, which lets Salesforce participate in projects beyond its current base in sales and marketing departments. That is, the purpose isn’t to add products for existing customers but to serve entirely new customers. The huge size of Tableau’s customer community – “more than 1 million passionate data enthusiasts” – was clearly a draw for Salesforce. This makes complete sense for Salesforce, which is always straining to maintain its growth rate.

Is there some commonality here? Sure: each of these vendors is striving to offer products based on advanced data management and analytics. Intuit is focused on the data management foundation while Google Cloud and Salesforce are focused more on analytics. All are acknowledging that it’s easier to buy mature technology than to build it from scratch. But of the three buyers, only Salesforce is a martech vendor, and its purpose is explicitly to serve customers outside the martech user base. So whatever these deals prove, it’s not that business intelligence is the latest martech must-have.

Friday, October 17, 2014

Dreamforce 2014: Process Is More Important Than Analytics

photo by Dion Hinchcliffe

Salesforce.com’s Dreamforce conference this year was the usual mix of spectacle, congestion, and heartfelt philanthropy. But the main announcements felt fairly slight: a new “analytics cloud” that is primarily about visualization and a mobile app builder for the Salesforce1 platform.

The analytics cloud* is a step forward only because Salesforce has been so far behind: it bulk loads data into a star schema relational database that uses an inverted index for speed, which is a solid but old-fashioned approach. Of course, it’s cloud-based, but so are other, newer approaches that are ultimately more flexible and scalable. Solutions to the really hard problems of entity association (matching identifiers for the same person in different systems) and predictive analytics are not included. Nor does the system handle real-time updates or allow queries by external systems for purposes like message personalization. The visualization itself is indeed fast and pretty, but it’s not obviously superior to Birst (also cloud-based), Tableau, or QlikView. The core technology was acquired when Salesforce.com bought EdgeSpring last June.
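
For readers who haven’t worked with one, a star schema is simply a central fact table of transactions surrounded by small dimension tables that describe them; queries join facts to dimensions and aggregate. Here is a minimal sketch of the idea in Python, purely illustrative (the tables and figures are invented, not taken from Wave):

```python
# Hypothetical miniature of a star schema: one fact table of sales events keyed
# to small dimension tables, queried by joining facts to dimensions and aggregating.
# Table names and numbers are invented for illustration; this is not Wave's code.
from collections import defaultdict

# Dimension tables: small lookup tables keyed by surrogate IDs.
dim_product = {1: {"name": "Pickles", "category": "Grocery"},
               2: {"name": "Soda", "category": "Beverage"}}
dim_store = {10: {"city": "Milwaukee", "type": "Convenience"},
             11: {"city": "Chicago", "type": "Supermarket"}}

# Fact table: one row per sale, holding only dimension keys and measures.
fact_sales = [
    {"product_id": 1, "store_id": 10, "units": 3,  "revenue": 8.97},
    {"product_id": 1, "store_id": 11, "units": 40, "revenue": 119.60},
    {"product_id": 2, "store_id": 10, "units": 25, "revenue": 37.25},
]

# A typical BI query: revenue by product name and store city.
totals = defaultdict(float)
for row in fact_sales:
    product = dim_product[row["product_id"]]["name"]
    city = dim_store[row["store_id"]]["city"]
    totals[(product, city)] += row["revenue"]

for (product, city), revenue in sorted(totals.items()):
    print(f"{product} in {city}: ${revenue:.2f}")
```

The inverted index mentioned above is, in essence, a map from each dimension value to the fact rows that contain it, which is what makes these filtered joins fast; the trade-off is that the data has to be bulk loaded and reshaped into this form before anyone can query it.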

The mobile app builder for Salesforce1** is the sort of innovation only a geek would love: after all, most people don’t think much about system building in general, let alone get excited about making it easier to build mobile apps for Salesforce. But it’s certainly the more important of the two announcements, because it illustrates how broad the scope of Salesforce has become. The most impressive demonstrations were operational processes such as remote order-taking and customer support, which are far removed from traditional sales automation. They also illustrated how absolutely central mobile devices have become to most business processes, something we all vaguely realize but are still not necessarily acting upon. Business processes need to be reimagined from a mobile perspective, taking into account the possibilities of doing things instantly while on-site at a store, in a shopper’s home, or on the road. This is no longer a new thought, but few companies have actually done it. By providing a drag-and-drop mobile app builder, Salesforce opens up possibilities for companies to innovate along these lines quickly, easily, and cheaply. That’s important to everyone, not just Salesforce geeks.

In fact, the closest thing I had to a deep thought during the conference was that people put too much emphasis on distributing data for decisions and not enough on distributing processes. Demonstrations for tools like Wave always show users drilling into sales data to uncover weak pickle sales at convenience stores in Milwaukee – something that’s exciting the first time but that you don’t do on a regular basis. By contrast, a distributed process like better store shelf allocations provides continuous benefits, even though it doesn’t require a human analyst to have a brilliant insight. A really good organization has smoothly running processes that handle each situation according to rules that require little or no judgment. (Of course, a certain amount of discretion by empowered employees is still necessary – but I’d argue the sorts of decisions that make for, say, a great hotel experience have nothing to do with advanced data analysis.) People like decision management guru James Taylor have long known this and distinguished operational decisions from strategic decisions, so I guess this isn’t really a new thought, either. But, like the growing centrality of mobile, it’s something that companies need to address by devoting resources to it. Winners will; losers won’t. It’s that simple.
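
To make the distinction concrete, here is a hypothetical sketch of what a rules-driven operational decision looks like: a shelf-allocation rule that adjusts a product’s facings based on its recent sales velocity. The thresholds, names, and numbers are invented for illustration, not drawn from any vendor’s product.

```python
# Hypothetical rules-driven operational decision: adjust shelf facings from
# sales velocity. Thresholds and field names are invented for illustration.

def recommended_facings(current_facings: int, weekly_units_sold: float) -> int:
    """Apply simple allocation rules; no analyst judgment required."""
    units_per_facing = weekly_units_sold / max(current_facings, 1)
    if units_per_facing > 20:         # selling fast: add a facing
        return current_facings + 1
    if units_per_facing < 5:          # selling slowly: give up a facing
        return max(current_facings - 1, 1)
    return current_facings            # normal range: leave it alone

# The same rule runs for every product in every store, every week,
# with no dashboard or drill-down involved.
print(recommended_facings(current_facings=4, weekly_units_sold=95))   # 5
print(recommended_facings(current_facings=4, weekly_units_sold=10))   # 3
```

The payoff comes from running a rule like this everywhere, continuously, rather than from any single brilliant insight pulled out of a dashboard.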

And while I’m being blunt: two Hawaiian dances in a keynote is two Hawaiian dances too many.


_____________________________________________________________

*a.k.a. “Wave”, apparently to justify many Hawaii-themed promotions and an appearance by the Beach Boys.

**called “Lightning”, which suggests it was named separately from Wave, since it's unsafe to surf during electrical storms. But nomenclature notwithstanding, the two systems do seem to work together.