Tableau and Google BigQuery allow people to analyze massive amounts of data and get answers fast using an easy-to-use, visual interface. Used together, the tools let you:
Optimizing the two technologies together will yield significant performance gains, shorten design cycles, and help users and organizations succeed. In this paper, we discuss techniques for optimizing data modeling and query formation to maximize the responsiveness of visualizations. We also discuss techniques for achieving the best cost efficiency when using Tableau and BigQuery together.
BigQuery can process petabytes of data in seconds using plain SQL, with no fine-tuning or special skill set required. Powered by Dremel, Google's technology for analyzing massive data sets, BigQuery provides a level of performance that large businesses previously had to pay millions to obtain, at a cost of pennies per gigabyte.
BigQuery is a data warehouse best suited for running SQL queries against massive structured and semi-structured data sets. Example use cases and data sets include:
BigQuery is completely NoOps and maintenance-free, and is integrated with Google Cloud Platform. Unlike other cloud-based analytics solutions, BigQuery does not require you to provision a cluster of servers in advance; processing clusters are sized and provisioned by BigQuery at runtime. As your data size grows, BigQuery automatically adds processing power, yet you pay the same price per gigabyte.
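The per-gigabyte pricing model described above can be sketched with a small cost estimator. This is an illustrative sketch only: the $5-per-TiB on-demand rate below is an assumption for the example, not an authoritative figure, and `estimate_query_cost` is a hypothetical helper, not part of any BigQuery API. Consult Google Cloud's current pricing page for real rates.

```python
# Illustrative sketch of on-demand, per-byte-scanned pricing.
# PRICE_PER_TIB_USD is an assumed rate for this example only;
# check Google Cloud's pricing documentation for current figures.

PRICE_PER_TIB_USD = 5.00
BYTES_PER_TIB = 1024 ** 4

def estimate_query_cost(bytes_scanned: int) -> float:
    """Estimate the on-demand cost of a query from the bytes it scans."""
    return bytes_scanned / BYTES_PER_TIB * PRICE_PER_TIB_USD

# A query that scans 200 GiB of a petabyte-scale table is billed only
# for the bytes it actually reads, not the full table size:
cost = estimate_query_cost(200 * 1024 ** 3)
print(f"${cost:.2f}")  # roughly $0.98
```

The key design point this models is that cost scales with data scanned, not with provisioned hardware, which is why narrowing a query's column and row scope (discussed later in this paper) directly lowers its cost.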