This week marked the 11th annual user conference for Tableau Software (NYSE: DATA). Set against the festive backdrop of New Orleans, the event drew (purportedly) 17,000 enthusiastic data jockeys and analytical ninjas, and I had the pleasure of attending and soaking in the vibes. Amid the trappings of a typical Tableau Conference, the company touted its rapid pace of innovation and its penchant for early-stage, IP-heavy acquisitions with a number of announcements. On the back end of the analytical value chain, we heard about continued investment in capabilities for rapid, large-scale data processing through Hyper, as well as more robust and developed data preparation tools. On the business-facing side, we got a glimpse into the assimilation of ClearGraph, Tableau’s recent acquisition in the world of natural language processing (NLP), and the announcement of Ask Data.

My overall observation, though, was that Tableau’s message to the market seemed significantly broader than in years past. Where the main-stage talk tracks once seemed narrowly targeted at the analytical elite (or, as Tableau dubbed them, “data rock stars”), this year they expanded to share the message with a wider audience of potential users. This was readily apparent in CEO Adam Selipsky’s opening keynote, in which he talked about the concept of making analytics more universal, discussing analogous examples of other modern technologies that grew out of a primitive state to reach a level of near ubiquity (e.g., from the cold cellar to the smart refrigerator, from the DoD’s ARPANET to the modern World Wide Web).

Having covered the analytics space for 10 years, and in processing my own view of its recent evolution, I see three factors that could drive the market toward something close to ubiquity. Rather than go line by line through each Tableau announcement, I’ll try to fit the most relevant aspects of #TC2018 into each factor. If analytics is ever to become ingrained in the daily business lives of nearly everyone, the world will need to see advancement in three key areas:

  • Technology. Make no mistake about it, analytics isn’t easy. Forget about writing queries or creating predictive models; most people, quite understandably in my view, struggle with some of the most rudimentary concepts. People just want answers. They want to know things. They want new questions to ask. This is where NLP comes into play. Debate as you will whether it falls under the umbrella of AI and machine learning; frankly, I’m not sure it matters. NLP (and its younger language-generation cousin, NLG) has the potential to translate human thought into actionable answers, better questions, and improved business. Tableau’s acquisition of ClearGraph and the incorporation of the Ask Data NLP engine into its workflows puts the company right in the thick of the land grab for non-technical business thinkers.
  • Infrastructure. I shudder to think of the IT effort required to support tens of thousands of analytical users at an organization, but many are doing it today, including Tableau customers such as Charles Schwab and Pfizer. Ubiquity, however, is a completely different animal. Statista puts the number of U.S. full-time employees at just over 125 million. Perhaps more analogous, Microsoft claims over 1.2 billion people across the globe are using some form of MS Office product. The effort to support even a small fraction of that many people with some form of analytics is daunting, to put it in laughably mild terms. However, Tableau’s clearly stated enterprise strategy is one geared toward chipping away at the analytical masses through the vaunted passageway of IT. With deepening cloud relationships (AWS, Azure, Google Cloud), enterprise support for Linux, and developing data preparation and governance capabilities, to name but a few, Tableau has made a serious push to make its software play nicely in almost any infrastructure or data environment.
  • Culture. Everyone’s favorite nausea-inducing MBA buzzword rears its ugly head in a major way in the conversation about analytical ubiquity. How do you get that many people to engage with analytics on a regular basis? Can you force them? Can we solve it with cross-functional teams, focus groups, team-building exercises, or trust falls? I would argue that ubiquity is a pipe dream for analytics without a concrete way of placing these capabilities in context, almost hidden in plain sight, for the many, many millions (billions?) of global employees who touch some form of software on a daily basis. Enter embedded analytics: decision-supporting tools seamlessly integrated into the electronic workflow of a human being. I don’t believe analytics needs to build its own widespread culture from the ground up; it just needs to piggyback on the existing culture and workflow of the millions of organizations that have already undertaken the effort to deploy software on a widespread basis. Tableau has had an embedded analytics (often blandly dubbed OEM) strategy for many years, but has been somewhat quiet about it publicly, at least at Tableau Conferences. Last year’s introduction of the Extensions API and the continued strategy of empowering developers to embed Tableau capabilities in other applications (and vice versa) all signal a concerted effort around embedded analytics, one that I’ll watch closely (and probably continue to harp on). A minimal sketch of what such an embed looks like follows this list.
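To make the embedded-analytics idea concrete, here is a minimal sketch of dropping a Tableau dashboard into a host web application using the Tableau JavaScript API (v2). The container element ID and the workbook URL below are hypothetical placeholders, so treat this as an illustration of the pattern under those assumptions, not a definitive implementation.

// Minimal embedding sketch (TypeScript). Assumes the Tableau JavaScript
// API v2 script has already been loaded onto the page, which exposes a
// global "tableau" object; declared loosely here for illustration.
declare const tableau: any;

function embedDashboard(): void {
  // "vizContainer" is a hypothetical <div> in the host application's page
  const container = document.getElementById("vizContainer");

  // Hypothetical view URL on a Tableau Server; any published view works
  const url = "https://my-tableau-server/views/SalesWorkbook/ExecutiveDashboard";

  const options = {
    hideTabs: true, // hide workbook tabs so the viz blends into the host app
    onFirstInteractive: () => {
      // Fires once the embedded view is rendered and interactive
      console.log("Dashboard ready");
    },
  };

  // The Viz constructor renders the named view inside the container element
  new tableau.Viz(container, url, options);
}

embedDashboard();

The particular API matters less than the pattern: the analytics show up inside software people already use, which is exactly the cultural shortcut argued for above.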

It’s not entirely clear where we are on the adoption curve for analytics, or how close to full saturation we could ever get. However, the pace of innovation continues to border on astounding, and the demonstrable evidence across analytical circles points to significant investment in the types of technologies and infrastructures that can support massive scalability. The concepts of culture and adoption are always the trickiest, though. I don’t know that there is a perfect analogy for the adoption trajectory of analytics: it isn’t a stand-alone focal-point technology (like the smart fridge), nor is it a widespread technology-enabling infrastructure (like the modern internet). In a utopian future of analytical ubiquity, people will make better, faster, and smarter decisions without knowing or caring what helps them do it.

If we’re looking for an analogy to model the expansion and ultimate ubiquity of analytics across the business landscape, I’ll hark back to the third bullet point above and offer another suggestion.

Two words: Intel® Inside
