Dr. Robin Bloor recently undertook a research project to understand the value that in-memory technology offers, and how customers are using it. The short answer? In-memory changes the way people can work with big data. But it's not for every data set, every time. Read on.
"It alters the way in which BI can be carried out. Because Tableau can now do analytics so swiftly and gives people the choice to connect directly to fast databases or use Tableau’s in-memory data engine, it has become much more powerful in respect of data exploration and data discovery. This leads to analytical insights that would most likely have been missed before," says Bloor.
How Customers are Using In-Memory Technology
Robin went on to talk with two Tableau customers, Kaleida Health and Nokia, about how and when they use in-memory technology. He notes that Kaleida Health's BI department was able to work with doctors and nurses on resource utilization analysis, which they'd previously had to outsource to consultants.
Looking at Nokia's use of Tableau for marketing analytics, Bloor writes, "in-memory capabilities basically offer two benefits: they provide a do-it-yourself interface, and they increase the speed of query performance."
Robin Bloor is one of the most thoughtful analysts working in the business intelligence community today. I've yet to read one of his reports without coming away seeing a topic with more depth and nuance than I did before. You can read his paper here.
In-memory or Direct Connect?
Both approaches have their advantages, and the truth is neither is right for every scenario. We take up this subject in another whitepaper: In Memory or Live Data: Which is Better?
In-memory is ideal when:
- Your database is too slow for interactive analytics
- You need to take load off a transactional database
- You need to be offline and can't connect to your data live
But live connections can be preferable when:
- You have a fast database, like Vertica, Teradata, or another analytics-optimized database
- You need up-to-the-minute data
For most people, the answer to "which is better?" is: both. Lacking an in-memory solution will constrain analysis of large, slow data sets. But being forced to always use an in-memory approach can negate the investment you've made in a fast analytical database.
Download the white paper to learn more.