Introduction and Executive Summary
Tableau is a software company that specializes in highly visual and rapid-fire reporting and analytics, using breakthrough technology for both its visualizations and data management. In its current version, Tableau is taking business intelligence (BI) to a whole new level of speed. This white paper examines Tableau’s visual capabilities and its new data engine, which together make for a formidable partnership for delivering BI solutions. Here is a brief summary of the contents of this paper:
- Visual analysis for data discovery is a must in today’s BI space. Knowing that the human brain processes information visually, Tableau’s developers created an intuitive product that aligns with human thought. Much like our tendency to ponder questions and arrive at conclusions – often termed “stream of consciousness” – the Tableau user interface allows analysis to progress through familiar visual representations. To do so, it offers a visual drag-and-drop interface and conventional chart types for the desktop, web browsers, and iPad/mobile devices.
- While Tableau has always made it possible for users to connect to live data sources, either local files (e.g., Excel) or databases (e.g., Oracle, SQL Server, Teradata), previous versions of Tableau could be constrained by the underlying data sources, both in the speed of analytics and the volume of data that could be processed. Tableau was configured either for local usage (e.g., files, spreadsheets) or to query data directly in data marts or other data stores. Both modes of usage could give rise to latency problems. Running natively on a PC using local data sources, Tableau lacked the ability to handle large amounts of data. When accessing data in data marts the processing speed depended primarily on the speed of the database being accessed. So while Tableau’s analytics capabilities were up to the task, the user could experience sluggish performance if the back-end database was slow. With Version 6 of Tableau, this became a thing of the past.
- Tableau Version 6 introduced a new data engine that provides in-memory analytics optimized for speed. The improvement in performance it delivers is dramatic.
- It achieves its speed by emulating some of the high-performance capabilities of column-store databases, such as Sybase IQ and Vertica, making effective use of data compression and storing data in columns rather than rows. The approach is tailored for the single-PC environment by the inclusion of a virtual memory paging capability. (A small illustrative sketch of the columnar approach appears after this summary.)
- Because of this data engine, Tableau is now capable of processing very large data sets locally – up to about a billion rows – and because the majority of that data is held in memory, it produces results rapidly. It also retains direct access to data marts, so organizations with fast, high-performing databases can continue to query them directly with Tableau.
- Tableau has been a BI platform since the introduction of Tableau Server in Tableau 4.0, and many customers use Tableau as their primary BI platform. Report authors can build reports, dashboards, or even drill-down analytics and distribute them to less sophisticated users with existing data and user security enforced.
- It is clear from our interviews with two Tableau Version 6 customers (Kaleida Health and Nokia) that Tableau is a changed product. It alters the way in which BI can be carried out. Because Tableau can now perform analytics so swiftly and gives people the choice to connect directly to fast databases or use Tableau’s in-memory data engine, it has become much more powerful with respect to data exploration and data discovery. This leads to analytical insights that would most likely have been missed before.
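To make the column-store idea above more concrete, the following toy sketch (in Python, with entirely hypothetical names and sample data) shows how dictionary encoding compresses a repetitive column and how an aggregation can scan only the columns a query touches. It illustrates the general column-store technique described in this paper; it does not represent Tableau’s proprietary data engine.

```python
# Illustrative sketch only: a toy dictionary-encoded column store.
# It demonstrates the general ideas of storing data by column and
# compressing repetitive values, not Tableau's actual implementation.
from array import array
from collections import defaultdict


class DictEncodedColumn:
    """Stores a column of repetitive values as integer codes plus a dictionary."""

    def __init__(self, values):
        self.dictionary = []        # code -> original value
        seen = {}                   # original value -> code
        self.codes = array("I")     # compact, contiguous storage of codes
        for v in values:
            if v not in seen:
                seen[v] = len(self.dictionary)
                self.dictionary.append(v)
            self.codes.append(seen[v])

    def decode(self, i):
        """Recover the original value stored at row i."""
        return self.dictionary[self.codes[i]]


def group_sum(group_col, measure_col):
    """Column-wise aggregation: scan only the two columns the query needs."""
    totals = defaultdict(float)
    for code, amount in zip(group_col.codes, measure_col):
        totals[group_col.dictionary[code]] += amount
    return dict(totals)


if __name__ == "__main__":
    # Hypothetical sample data: a 'region' dimension and a 'sales' measure.
    regions = DictEncodedColumn(["East", "West", "East", "North", "West", "East"])
    sales = array("d", [100.0, 250.0, 75.0, 310.0, 40.0, 125.0])
    print(group_sum(regions, sales))  # {'East': 300.0, 'West': 290.0, 'North': 310.0}
```

Because only the region and sales columns are read, and the repetitive region values are held as compact integer codes, a scan of this kind stays fast as row counts grow – the same general reason column-oriented storage suits analytic workloads.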
Beyond Rapid-Fire BI
The name Tableau is synonymous with visualization and rapid-fire BI.
Born from years of academic research, Tableau stormed out of Stanford in 2003, carrying on its shoulders a highly innovative database visualization language, VizQL™ (Visual Query Language). This language had the same analytic and query capabilities as its predecessors, SQL and MDX, but it took reporting one important step further: it facilitated an unlimited number of unique and customizable visual expressions of data. Recognizing the need for more effective BI, creators Pat Hanrahan and Chris Stolte developed VizQL™ as a means of addressing an area of perpetual neglect: visual analysis.
Let’s face it, we 21st century humans are visual creatures. Since the advent of television, the way we process knowledge has been shaped by movement, images, and color as opposed to the mundane rows of text and numbers that were once the only expression of BI. When personal computers became more about graphics than programming, it only enhanced the user’s need for visual representation. To some purists, this was a travesty; however, it could not be undone.
A founding principle of Tableau is that data analysis should mimic the user’s inherent ability to think visually. Instead of running a query against a typical SQL database, producing a report, and then creating a visualization to accompany the results, Tableau’s visualization engine marries the query and the visualization into a single step. In the Tableau world, analysis and the visual aids that represent and explain the analysis should not be separate.
Information Discovery
Tableau’s user interface doesn’t just help the user create visualizations. It serves as a data exploration tool that partners with the user’s train of thought, allowing for a visual cycle of analysis. Because of Tableau’s careful, incremental, stream-of-consciousness visualizations, the user often finds unexpected results, which in turn call for deeper analysis. After all, the goal of analysis is not to find one answer to one question; it is to find a collection of answers and prompt more questions.
Tableau does not inundate the user with a vast array of unconventional graphics; it relies on the expectations of the end user. What is familiar? What will be easy to understand? Simple, elegant, and richly colored bar charts, tables, maps, and time series are the most common types of business graphics. Although nearly any type of graphic can be generated, Tableau’s practice demonstrates that design is second to the data and must act as an enhancement, not a distraction.
Versatile visual display is not the only compelling reason to use Tableau. All of Tableau’s products – Desktop, Server, Digital and Public – are fast, user-driven applications that can be configured for the desktop and/or the web, and all of them keep the user in control.
Though it has traditionally fallen to IT, it is not IT’s job to read the minds of business users concerning exactly what type of report or representation of data is required. Little satisfaction can be gained on either side when a report is requested and generated, only to be requested again for lack of a specific or forgotten detail. And while neither side is particularly at fault, this dynamic has led to the widening ravine that separates the back end from the front end.
Tableau Limited
The power to make good decisions depends on the data available. Tableau gives users the analytical power they need, and it makes exporting results as easy as a few clicks of the mouse. Where IT is able to deliver the appropriate data service, business professionals can serve themselves the data and reports. However, there were limitations.
Prior to Version 6, it was not practical to extract large amounts of data into Tableau, because query performance suffered at that scale. With Version 6, Tableau has made it practical to extract and work with large volumes of data at fast query speeds.