Breakthrough after Breakthrough

In 2003 Tableau spun out of Stanford University with VizQL™, a technology that fundamentally changed how people work with data by letting simple drag-and-drop actions create sophisticated visualizations. The fundamental innovation is a patented query language that translates your actions into database queries and then expresses the responses graphically.
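To illustrate the idea, here is a minimal, hypothetical sketch (not Tableau's actual implementation, and `shelf_to_sql` is an invented name): a drag-and-drop "shelf" specification is translated into a SQL query whose result can feed a chart.

```python
# Hypothetical sketch of the VizQL idea: a shelf specification
# (what the user dragged where) becomes an aggregate SQL query.

def shelf_to_sql(table, columns_shelf, rows_shelf):
    """Translate a simple shelf spec into an aggregate SQL query.

    columns_shelf: dimension placed on the columns shelf (x-axis).
    rows_shelf:    (aggregate, measure) placed on the rows shelf (y-axis).
    """
    agg, measure = rows_shelf
    return (
        f"SELECT {columns_shelf}, {agg}({measure}) "
        f"FROM {table} GROUP BY {columns_shelf}"
    )

# Dragging "region" to columns and "SUM(amount)" to rows...
sql = shelf_to_sql("sales", "region", ("SUM", "amount"))
print(sql)
# SELECT region, SUM(amount) FROM sales GROUP BY region
```

The result of that query, one row per region, maps directly onto the marks of a bar chart, which is the sense in which the visual specification *is* the query.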

The next breakthrough was the ability to do ad-hoc analysis of billions of rows of data in seconds with Hyper, Tableau's data engine technology.


Hyper is a high-performance in-memory data engine technology that helps customers analyze large or complex data sets faster, by efficiently evaluating analytical queries directly in the transactional database. A core Tableau platform technology, Hyper uses proprietary dynamic code generation and cutting-edge parallelism techniques to achieve fast performance for extract creation and query execution.

Hyper's unique design

Over the past decade, in-memory data engines and analytical database technologies have delivered incredible query performance improvements through techniques such as sampling and summarization. These performance improvements come at a cost, however. Many systems sacrifice write performance—critical for fast extract creation and refresh performance—in favor of optimizing analytical workload performance. Poor write speeds lead to stale and disconnected data. The result? A lag between people and the data they want to analyze. Our mission with Hyper is to bring people closer to their data by giving you fast write speed and fast analytical workload performance. In short, Hyper delivers fresh data, faster—so you can analyze a larger, more complete view of your data.

Rethinking system architecture: one state for transactions and analytical queries

With Hyper, transactions and analytical queries are processed on the same column store, with no post-processing needed after data ingestion. This reduces stale data and minimizes the connection gap between specialized systems. Hyper's unique approach allows a true combination of read- and write-heavy workloads in a single system. This means you can have fast extract creation without sacrificing fast query performance. (We call that a win-win.)
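A toy sketch of that single-state design (illustrative only; real engines like Hyper add MVCC, compression, and concurrency control on top): writes append directly to the column arrays, and analytical reads scan those same arrays, so there is no separate load or post-processing step between the two.

```python
# Toy column store: transactional writes and analytical reads share
# one data structure, echoing the single-state design described above.

class TinyColumnStore:
    def __init__(self, columns):
        self.data = {c: [] for c in columns}  # one array per column

    def insert(self, row):
        # Transactional write: append straight into the column arrays.
        for col, value in row.items():
            self.data[col].append(value)

    def sum_where(self, measure, col, value):
        # Analytical read: scan the same arrays the writes just hit,
        # so freshly inserted rows are immediately visible.
        return sum(m for m, v in zip(self.data[measure], self.data[col])
                   if v == value)

store = TinyColumnStore(["region", "amount"])
store.insert({"region": "west", "amount": 10})
store.insert({"region": "east", "amount": 5})
store.insert({"region": "west", "amount": 7})
print(store.sum_where("amount", "region", "west"))  # 17
```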

A new approach to query execution: dynamic code generation

Hyper uses a novel just-in-time compilation execution model. Many other systems use a traditional query execution model that cannot take full advantage of modern multicore hardware. Instead, Hyper optimizes and compiles queries into custom machine code to make better use of the underlying hardware. When Hyper receives a query, it creates a tree, logically optimizes the tree, and then uses it as a blueprint to create a unique program, which is then executed. The end result is better utilization of modern hardware for faster query execution.
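The flavor of that compilation step can be sketched in a heavily simplified form (Hyper generates native machine code; here we generate and compile Python source instead, purely to show the idea of specializing a program to one query):

```python
# Simplified sketch of dynamic code generation: instead of interpreting
# a generic query plan row by row, we emit source code specialized to
# this exact query, with its constants baked in, then compile and run it.

def compile_sum_filter(threshold):
    src = (
        "def q(rows):\n"
        "    total = 0\n"
        "    for r in rows:\n"
        f"        if r > {threshold}:\n"
        "            total += r\n"
        "    return total\n"
    )
    namespace = {}
    exec(compile(src, "<generated>", "exec"), namespace)
    return namespace["q"]  # a unique program for this one query

q = compile_sum_filter(10)
print(q([5, 12, 30, 8]))  # 42
```

The generated function contains no plan-interpretation overhead at all; everything the optimizer decided is frozen into straight-line code, which is what lets a compiling engine exploit the hardware so effectively.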

Leveraging more of your hardware: morsel-driven parallelization

We designed Hyper from the ground up with large, multi-core environments in mind. Our parallelization model is based on very small units of work (morsels). These morsels are assigned efficiently across all available cores, allowing Hyper to better account for differences in core speed. This translates into more efficient hardware utilization and faster performance.
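The scheduling idea can be sketched as follows (illustrative only, not Hyper's code): the input is split into small morsels that worker threads pull from a shared queue, so a faster core simply ends up taking more morsels than a slower one.

```python
# Sketch of morsel-driven scheduling: small chunks of work are pulled
# from a shared queue by worker threads, balancing load automatically.

import queue
import threading

def parallel_sum(data, n_workers=4, morsel_size=1024):
    work = queue.Queue()
    for i in range(0, len(data), morsel_size):
        work.put(data[i:i + morsel_size])    # one morsel per chunk

    results = []
    lock = threading.Lock()

    def worker():
        while True:
            try:
                morsel = work.get_nowait()   # pull the next small unit
            except queue.Empty:
                return                       # no morsels left
            partial = sum(morsel)            # process just this morsel
            with lock:
                results.append(partial)

    threads = [threading.Thread(target=worker) for _ in range(n_workers)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return sum(results)

print(parallel_sum(list(range(10000))))  # 49995000
```

Because no core is assigned a fixed share of the data up front, a core that finishes its morsel early immediately grabs another, rather than sitting idle while a slower core works through a large static partition.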

Hyper began as an academic research project at the Technical University of Munich (TUM) in 2010. It spun off into an independent organization in 2015 with the goal of bringing Hyper to industry and shipping a commercial version of the technology. Hyper was acquired by Tableau in 2016; the core technology now powers the Tableau data engine.


Natively visual and therefore faster

At the heart of Tableau is a proprietary technology that makes interactive data visualization an integral part of understanding data. A traditional analysis tool forces you to analyze data in rows and columns, choose a subset of your data to present, organize that data into a table, and then create a chart from that table. VizQL skips those steps and creates a visual representation of your data right away, giving you visual feedback as you analyze. As a result, you get a much deeper understanding of your data and can work much faster than with conventional methods, up to 100 times faster.

VizQL enables a broad range of visualizations

A new language for data means you can say more

This fundamentally new architecture does for data interactions in visual form what SQL did for data interactions in text form. VizQL statements describe an infinite class of sophisticated multi-dimensional visualizations. With VizQL, people have a single analysis interface and database visualization tool to produce a broad range of graphical summaries. Tableau can create a shockingly broad range of visualizations, from bar and line charts to maps and sophisticated linked views. This flexibility allows you to understand data in an entirely new way. It allows you to find insights that would be lost if you had to shoehorn your data into rigid charting templates.

Supports natural patterns of thought

Thinking is naturally a pattern of questioning and answering, incrementally making progress and taking new information into account. It’s rare that you know exactly where you’re going when you begin an analysis. Yet that’s what traditional BI tools require. There’s an alternative: VizQL allows you to explore your data visually and find the best representation of it. You learn as you go, add more data if needed, and ultimately get deeper insights. We call this the cycle of visual analysis. When you’ve gone through this cycle you can communicate a much better story about your data.

It doesn’t exist anywhere else in the world

Because of VizQL, fast analytics and visualization are a reality. People with little or no training can see and understand data faster than ever, and in ways that were never possible before. And that's the biggest difference of all.