Toward the end of fiscal year 2017, Tableau Software (NYSE:DATA) will most likely cross the $1 billion revenue threshold. This will put them in an elite group of software providers and an even more exclusive club among those in the business intelligence (BI) and analytics space.
With new CEO Adam Selipsky at the helm, Tableau is angling for the next stage of hyper-growth as well as the kind of market leadership that Selipsky helped bring to Amazon Web Services (AWS) during his decade-plus tenure there. Tableau’s evolution into a bigger vendor with a much broader reach has intrigued me, and their annual user conference (this year set in Austin, TX) figured to be an interesting one.
A Shift in Messaging
Compared to last year’s conference, I noticed a palpable shift in the messaging and the positioning of the company, a shift that emphasized Tableau’s relevance to a broader audience of users. Tableau has always been at the forefront of the discussion around self-service analytics, with a stated mission to help people “see and understand data.” However, the 2015 show (and others before that, from what I’m told) seemed to focus on a more technically sophisticated, “Data Rockstar” (their term) audience. I blogged about the impressive, yet borderline exclusionary vibe that permeated the show, wondering if I was smart enough to use Tableau.
I’ll say right up front that this year’s show had no shortage of impressive and technically sophisticated product announcements. In fact, it would seem as though Tableau has a solution for just about every old, new, and next-generation analytical activity, from query optimization and data prep to collaboration, machine learning, and geospatial analysis. Building on the acquisition of Hyper, a high-performance database system, Tableau is making major investments in the user’s ability to access high volumes of complex and disparate data in an accelerated time frame, an innovation category they refer to as “instant analytics.” Other key areas of innovation include natural language processing as well as time and space analysis.
Technology aside, however, the general feel of the show seemed more approachable to a broader audience, even amidst a parade of fairly technical on-stage demonstrations from Tableau’s developer team. Some might argue that the show has outgrown the quirky coolness that characterized earlier iterations, becoming more corporate and staged. However, the DNA of the younger, pre-IPO company certainly came through to me as I absorbed the main stage and super-session presentations.
Two areas stood out to me as critical elements of Tableau’s strategy moving forward:
- Data prep: This year Tableau introduced Project Maestro, a stand-alone, self-service data preparation platform, most likely to be sold as a separate product. With capabilities for data cleansing, blending, and profiling, this product is intended to support a growing audience of less technical users who have an elevated interest in the cleanliness and relevance of their data. With an enhanced set of features for governance and oversight built into Project Maestro (and other products), Tableau continues to widen its path into larger enterprise-level accounts.
- Embedded analytics: The messaging and overall story for embedded didn’t make its way to the main stage for the morning keynote, but was discussed in several other sessions and formed the basis for many of my conversations with the executive team. Tableau has always had an OEM strategy to embed visualization capabilities within other applications in the ISV community. However, a growing number of non-technology enterprises now want to embed analytics in customer- or partner-facing solutions, or to monetize the data flowing through their organizations, and that demand is bringing analytics to the forefront for companies not typically predisposed toward analytical activity.
Particularly with respect to embedded analytics, I for one see tremendous opportunity for Tableau to expand its brand across the business landscape.
With more people, in more companies, interacting with some form of software as an everyday aspect of their job, the potential enterprise user base for analytics is exploding, and many (if not most) of these users will not be a good fit for a traditional, separate, stand-alone analytics application. These people need the capabilities not just embedded into their applications, but into their everyday workflow and processes. Such an embedded approach offers greater relevance as capabilities get tailored to the application and the user at hand, ultimately leading to a higher degree of adoption and engagement.
Some of my prior research demonstrates how top companies with an embedded approach could deliver analytics more pervasively across several functional areas (Figure 1).
Figure 1: Analytics Woven into the Organization
Moving into the early part of 2017, Tableau will hit the ground running with a more developed, enterprise-grade story. The lingering question for me is their acquisition strategy, regarding which Tableau has been (understandably) tight-lipped. This is clearly an innovation-driven company, and apart from Hyper and other very small acquisitions, Tableau has been quiet on the M&A front.
As they cross the billion-dollar threshold, though, will their pace of innovation support the growth that Wall Street expects? Assuming there is an acquisition strategy in the works for the next stage of growth, it’s unclear which piece of the analytical “stack” would be most attractive to them.
From inception to IPO, the company has grown rapidly based largely on the strength of two products: Tableau Desktop and Tableau Server. The analyst sessions spent considerable time talking about Tableau Online, the offering designed for flexibility and scalability in enterprise cloud environments. However, the foray into Project Maestro, likely to be sold as a stand-alone product, holds the most interest for me. For this product to truly stand on its own, it will eventually need to serve BI and analytical environments other than Tableau while maintaining its stated mission as a self-service data prep solution. Time will tell how successful the product will be, to what extent it will step on the toes of the extensive partner network, and how well it will perform outside the Tableau ecosystem. At the same time, this does represent more than a small extension of product capabilities and puts Tableau in the realm of developing a much more comprehensive portfolio of analytical offerings.
Revenue-wise, Tableau is still very much dwarfed by the other software behemoths (IBM, SAP, Oracle). But, when compared to pure-play analytics providers of similar stature (SAS, MicroStrategy), Tableau is nearing the top of the heap with a reputation for being far more nimble. Will they ever be the 800-pound gorilla in the software world? Probably not, nor does it seem like they have any desire to be.
However, with continued growth, innovation, judicious acquisitions, and savvy use of embedded partnerships, Tableau could become one of the fiercest and most agile competitors in the tech space.