Dr. Robin Bloor recently undertook a research project to understand the value that in-memory technology offers, and how customers are using it. The short answer? In-memory changes the way people can work with big data. But it's not for every data set, every time. Read on.

In Analytics at the Speed of Thought, Bloor says about Tableau's in-memory Data Engine:
"It alters the way in which BI can be carried out. Because Tableau can now do analytics so swiftly and gives people the choice to connect directly to fast databases or use Tableau's in-memory data engine, it has become much more powerful in respect of data exploration and data discovery. This leads to analytical insights that would most likely have been missed before."

How Customers are Using In-Memory Technology

Robin went on to talk with two Tableau customers, Kaleida Health and Nokia, about how and when they use in-memory technology. He notes that Kaleida Health's BI department was able to work with doctors and nurses on resource utilization analysis, work they'd previously had to hand off to consultants.

Looking at Nokia's use of Tableau for marketing analytics, Bloor writes, "in-memory capabilities basically offer two benefits: they provide a do-it-yourself interface, and they increase the speed of query performance."

Robin Bloor is one of the most thoughtful analysts working in the business intelligence community today. I've yet to read one of his reports without coming away seeing the topic with more depth and nuance than before. You can read his paper here.

In-memory or Direct Connect?

Both approaches have their advantages, and the truth is neither is right for every scenario. We take up this subject in another whitepaper: In Memory or Live Data: Which is Better?

In-memory is ideal when:

  • Your database is too slow for interactive analytics
  • You need to take load off a transactional database
  • You need to be offline and can't connect to your data live

But live connections can be preferable when:

  • You have a fast database, like Vertica, Teradata, or another analytics-optimized database
  • You need up-to-the-minute data

For most people, the answer to "which is better?" is: both. Lack of an in-memory solution will constrain analysis of large, slow data sets. But always being forced into an in-memory approach can negate the investment you've made in a fast analytical database.

Download the white paper to learn more.

Would it be accurate to say that the Tableau Data Engine is not like other "in-memory" databases, that instead of loading the entire data set into RAM, the TDE intelligently selects what data to load into RAM, so that we can work with data sets larger than our available RAM?

Yes, that is true, and there's some discussion of that in the second whitepaper mentioned above.
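To make the idea in that answer concrete, here is a minimal, purely illustrative Python sketch of the general out-of-core technique the question describes: streaming rows through memory and keeping only small running aggregates in RAM, so the full data set never has to fit in memory at once. This is a hypothetical example of the general principle, not a description of how the Tableau Data Engine is actually implemented.

```python
import csv
import io

# Stand-in for a file far larger than available memory; in practice this
# would be a file handle streamed from disk, not an in-memory string.
raw = "region,sales\neast,100\nwest,250\neast,50\nwest,25\n"

def aggregate_sales(lines):
    """Stream rows one at a time; only the running totals live in RAM."""
    totals = {}
    for row in csv.DictReader(lines):
        totals[row["region"]] = totals.get(row["region"], 0) + int(row["sales"])
    return totals

print(aggregate_sales(io.StringIO(raw)))  # {'east': 150, 'west': 275}
```

Because memory use here scales with the number of distinct groups rather than the number of rows, the same pattern works whether the input is four rows or four billion.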

Thanks for both articles above, very interesting. But for me it would be even more interesting to compare the new Tableau Data Engine with the long-established in-memory data engines from QlikView and Spotfire. Can you publish a comparison of those three engines? (It would then be perceived as less promotional and more objective.)

My second question is: how does the first article, "Analytics at the Speed of Thought," relate to the article Stephen Few wrote 6+ years ago, titled "Data Analysis at the Speed of Thought," which was also about Tableau?


a.p., to your second question: The Bloor paper describes the new Data Engine, which was introduced in Tableau 6.0 and did not exist when Few wrote his paper. It focuses on how performance improvements help the process of analysis. Few's older article is more about the cycle of analysis and the principle that iteration is important. So both discuss the same general concept, but Bloor's paper more specifically covers the new Data Engine technology.

We don't have a head-to-head comparison of the Data Engine vs. Spotfire and Qlikview, so apologies but I can't help you there.
