It used to be that setting up a large database required a huge investment of money, time, and people. What’s worse, in return, you got a tortured analytics process that yielded mediocre rewards. In many cases, the juice simply wasn’t worth the squeeze.

Here’s to putting those days behind us.

With Google BigQuery, you can store billions of rows of data in the cloud. And with Tableau, you can visualize and analyze the data in seconds without writing a single line of code.

In this demonstration, we’ll connect to a database of 300 million rows from the Global Database of Events, Language, and Tone (GDELT) Project. Then we’ll paint a picture of global news coverage based on a machine-scored index of what journalists are reporting. I'll show you how to:

  • Quickly profile large amounts of data in Google BigQuery.
  • Slice and dice in real time using filters to focus on data of interest.
  • Link views together in dashboards.
  • Sequence snapshots in a storyline.

This is not a highly scripted demonstration. We approach the data as you would and let patterns reveal themselves through iteration. Along the way, you’ll learn to:
  • Conduct real-time queries on large data sets using Google BigQuery.
  • Use Tableau’s Show Me feature to examine data from various perspectives.
  • Blend local Excel data with cloud-hosted BigQuery data.
  • Create functions and formulas to push down to BigQuery for evaluation.
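
To make the "push down" idea concrete, here is a minimal sketch of the kind of aggregate query that gets evaluated inside BigQuery rather than on your desktop. The table path (`gdelt-bq.gdeltv2.events`) and column names (`Actor1CountryCode`, `AvgTone`) refer to GDELT's public BigQuery dataset and may differ from the exact table used in the video:

```python
# Sketch: build an aggregate query over the public GDELT events table.
# Because the GROUP BY runs inside BigQuery, only the small grouped
# result -- not hundreds of millions of rows -- travels back to the client.

GDELT_TABLE = "gdelt-bq.gdeltv2.events"  # assumed public dataset path

def build_tone_query(table: str = GDELT_TABLE, limit: int = 10) -> str:
    """Average machine-scored tone (AvgTone) per actor country."""
    return (
        "SELECT Actor1CountryCode AS country, "
        "AVG(AvgTone) AS avg_tone, COUNT(*) AS events "
        f"FROM `{table}` "
        "WHERE Actor1CountryCode IS NOT NULL "
        "GROUP BY country "
        f"ORDER BY events DESC LIMIT {int(limit)}"
    )

# To execute it with the google-cloud-bigquery client library
# (requires configured Google Cloud credentials):
#
#   from google.cloud import bigquery
#   rows = bigquery.Client().query(build_tone_query()).result()
```

Tableau generates queries of this shape automatically as you drag fields onto a view; the sketch above just shows what the round trip looks like if you issue one yourself.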

To follow along, download a free trial of Tableau Desktop and sign up for free GDELT BigQuery credentials.


