Tableau Software (NYSE: DATA) has long boasted one of the most loyal and impassioned customer bases in the tech world, and it would be hard to imagine a greater test of that loyalty than this year’s location for TC 2017. One week after the worst mass shooting in U.S. history, Tableau leadership decided to forge ahead with their annual event scheduled for the Mandalay Bay Resort and Casino in Las Vegas.
Many people (myself included) were skeptical that the event would be well-attended or bring with it the same energy and excitement that we’ve come to associate with Tableau Conference. I, for one, was delighted to find that my skepticism was misplaced as the lights went down at the T-Mobile arena and CEO Adam Selipsky took the stage to a packed house of 14,000 fervently screaming data jockeys.
After an acknowledgement of the recent tragedy, a moment of contemplative silence, and palpably genuine gratitude on the part of the Tableau team for the strong turnout, it was time to set in motion the annual tradition of nerding out over things like machine learning and database query acceleration.
As usual, there were too many announcements to list out and comment on, so the following four takeaways represent what I consider the key themes of the event.
Two things Tableau is taking very seriously:
Enterprise readiness. Since their IPO, Tableau has made no secret of their plan to ramp up elephant hunting and land more large enterprise customers. A significant share of their product enhancements and acquisitions can be traced back to this strategy. High-speed analysis of large data sets with Hyper, enhanced governance capabilities, greater data connectivity, full Linux compatibility, and several other announcements this year either directly or indirectly support this “big fish” approach.
Data prep. In skunk-works fashion, last year’s announcement of Project Maestro seemed to signal that Tableau was testing the waters with data prep and feeling out how it would land with customers. A year later it seems pretty clear that data prep is a major area of continued investment. By Tableau-izing data prep with visualizations and intuitive data joining capabilities (among other things), Maestro seems to be aimed at democratizing data prep for a broader and less technical line-of-business audience.
When pressed about Maestro’s (potentially negative) impact on alliance partners in the data prep space, such as Alteryx, the Tableau executive team vehemently argued that the “rising tide” theory applies here. In other words, the market footprint of data prep is growing rapidly and thus leaves room (and market share) for healthy competition from a variety of vendors.
The pool into which Tableau is dipping its toe (or wading up to its knees…)
AI and natural language P-Q-G-C. Processing. Query. Generation. Conflagration. Usually situated under the banner of artificial intelligence (AI), natural language capabilities help translate human thought and speech into machine languages, and vice versa. (OK, “conflagration” is not a thing. At least not until someone figures out how to ignite a large fire using natural language.)
The recent acquisition of Natural Language Query (NLQ) startup ClearGraph doesn’t exactly cement Tableau’s position as a leader in this emerging area, but it definitely signals a strong interest. Considering their long-standing mission of helping people see and understand data, natural language capabilities certainly seem like a logical extension of the product’s strengths today. With a handful of use cases, from table join recommendation engines to executive-friendly search analytics, Tableau seems to be hot on AI. I expect this to dominate keynote main stage time at TC 2018.
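To make the NLQ idea concrete, here’s a toy sketch of how a plain-English question might be mapped to a structured query. To be clear, this is not how ClearGraph or any Tableau product works; the grammar, table, and field names are invented for illustration, and real NLQ systems rely on far richer parsing and semantic models.

```python
import re

# Invented aggregate vocabulary for this toy example.
AGGREGATES = {"average": "AVG", "total": "SUM", "count": "COUNT"}

def to_sql(question: str, table: str = "orders") -> str:
    """Translate questions like 'average sales by region' into SQL.

    Only handles the pattern '<aggregate> <measure> by <dimension>'.
    """
    match = re.match(r"(average|total|count)\s+(\w+)\s+by\s+(\w+)",
                     question.lower())
    if not match:
        raise ValueError(f"Could not parse question: {question!r}")
    agg, measure, dimension = match.groups()
    return (f"SELECT {dimension}, {AGGREGATES[agg]}({measure}) "
            f"FROM {table} GROUP BY {dimension}")

print(to_sql("Average sales by region"))
# SELECT region, AVG(sales) FROM orders GROUP BY region
```

The hard part, of course, is everything this sketch leaves out: ambiguity, synonyms, context, and mapping loose phrasing onto a real data model, which is precisely the expertise an acquisition like ClearGraph brings.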
One area where they were kinda-sorta quiet:
Embedded analytics. Perhaps it’s because Tableau Conference is an event focused on traditional customers leveraging Tableau as a stand-alone entity. Perhaps it’s because embedded analytics would compromise Tableau’s messaging (and price point) as an enterprise ready “platform.” Whatever the reason, they didn’t talk much about what I think is the most important trend in the world of analytics – the move toward embedding analytical capabilities where people work, that is, within the context of the software they interact with on a daily basis.
I see a huge portion of the world’s analytical activity taking place in this type of model, especially as more companies emerge from the paper-based stone age and automate their operational activity with software. To be fair, Tableau did announce the Extensions API (to raucous applause, I might add), which will purportedly make embedding Tableau and extending/tailoring its features significantly easier. Nevertheless, it seemed like it got short shrift for a big trend.
From the smallest micro features, like grid lines for table axis alignment, to long-term, strategic feature sets like Maestro, there was no shortage of announcements this year. Time will tell if Maestro becomes the business-friendly market-leading answer to data prep that Tableau hopes it will be, or if AI and NLP will take hold as critical defining features of Tableau’s future coolness.
What’s not in question is the zeal with which self-proclaimed data rock stars continue to raise the analytical bar across the business world, or, frankly, Tableau’s commitment to serving them.