Is in-memory or live data better when running reports from a SQL Server database? The short answer is both. Companies today use both to deal with ever-larger volumes of data. For the business user, analyzing large data sets can be challenging simply because traditional tools are slow and cumbersome. Some companies deal with this by investing in fast analytical databases that are distinct from their transactional systems. Others are turning to in-memory analytics, which lets users extract a set of data and take advantage of the computing power on their local machine to speed up analytics and take query load off their transactional systems. This is especially important for SQL Server, which is often a critical part of the IT environment.
So which approach is better? There are times when you need to work within the comfort of your own PC without touching the database. On the other hand, sometimes a live connection to your SQL database is exactly what you need. The most important thing is not whether you choose in-memory or live, but that you have the option to choose.
Let’s look at some scenarios where in-memory or live data might be preferable.
When your database is too slow for interactive analytics
Not all databases are as fast as we’d like them to be. If you’re working with live data and it’s simply too slow for interactive, speed-of-thought analysis, then you may want to bring your data in-memory on your local machine. The advantage of working interactively with your data is that you can follow your train of thought and explore new ideas without constantly being slowed down waiting for queries.
When you need to take load off a transactional database
Your database may be fast or it may be slow, but if it’s the primary workhorse for your transactional systems, you may want to keep all non-transactional load off it. That includes analytics. Analytical queries can tax a transactional database and slow it down. So bring a set of that data in-memory to run fast analytics without compromising the speed of critical business systems.
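To make the extract idea above concrete, here is a minimal sketch in Python, using SQLite as a stand-in for SQL Server (with SQL Server you would connect through an ODBC driver instead). The table and column names are invented for illustration; this is not Tableau's implementation, just the general pattern of copying a result set into a local in-memory store so analytical queries stop touching the transactional source.

```python
import sqlite3

def extract_to_memory(source: sqlite3.Connection, query: str) -> sqlite3.Connection:
    """Copy the result of `query` into a fresh in-memory database so that
    later analytical queries never touch the transactional source."""
    cur = source.execute(query)
    cols = [d[0] for d in cur.description]  # column names from the result set
    mem = sqlite3.connect(":memory:")
    mem.execute(f"CREATE TABLE extract ({', '.join(cols)})")
    mem.executemany(
        f"INSERT INTO extract VALUES ({', '.join('?' for _ in cols)})",
        cur.fetchall(),
    )
    mem.commit()
    return mem

# Demo: a tiny "transactional" table, then analytics against the extract only.
src = sqlite3.connect(":memory:")  # stands in for the busy transactional DB
src.execute("CREATE TABLE orders (region TEXT, amount REAL)")
src.executemany("INSERT INTO orders VALUES (?, ?)",
                [("East", 100.0), ("West", 250.0), ("East", 50.0)])
src.commit()

local = extract_to_memory(src, "SELECT region, amount FROM orders")
totals = dict(local.execute(
    "SELECT region, SUM(amount) FROM extract GROUP BY region"))
print(totals)  # {'East': 150.0, 'West': 250.0}
```

Once `extract_to_memory` returns, every query against `local` runs entirely on the local machine; the source database sees exactly one read, no matter how much exploration follows.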
When you need to be offline
Until the Internet comes to every part of the earth and sky, you’re occasionally going to be offline. Get your work done even while not connected by bringing data in-memory so you can work with it right on your local machine. Just don’t forget your power cord or battery; you’ll still need that!
When you have a fast database
You’ve invested in making your SQL Server implementation blazing fast. Why should you have to move that data into another system to analyze it?
You shouldn’t. Leverage your database by connecting directly to it. Avoid data silos and ensure a single point of truth by pointing your analyses at a single, optimized database. You can give business people the power to ask and answer questions of massive data sets just by pointing to the source data. It’s a simple, elegant and highly secure approach.
When you need up-to-the-minute data
If things are changing so fast that you need to see them in real-time, you need a live connection to your data. All your operational dashboards can be hooked up directly to live data so you know when your plant is facing overutilization or when you’re experiencing peak demand.
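As a small illustration of the live-connection pattern, the sketch below re-runs its query against the source on every dashboard refresh, so the numbers shown are always current. SQLite again stands in for a live SQL Server connection, and the `plant_load` table and 90% utilization threshold are invented for the example.

```python
import sqlite3

live = sqlite3.connect(":memory:")  # stands in for a live SQL Server connection
live.execute("CREATE TABLE plant_load (machine TEXT, utilization REAL)")
live.executemany("INSERT INTO plant_load VALUES (?, ?)",
                 [("press-1", 0.72), ("press-2", 0.91)])
live.commit()

def refresh_dashboard(conn: sqlite3.Connection, threshold: float = 0.9):
    """One dashboard tick: re-query the live source and flag overutilization."""
    return conn.execute(
        "SELECT machine, utilization FROM plant_load WHERE utilization > ?",
        (threshold,),
    ).fetchall()

print(refresh_dashboard(live))  # [('press-2', 0.91)]

# Because the connection is live, a change in the source shows up on the
# very next tick with no extract to rebuild:
live.execute("UPDATE plant_load SET utilization = 0.95 WHERE machine = 'press-1'")
live.commit()
overloaded = refresh_dashboard(live)
print(overloaded)  # both presses now flagged
```

The trade-off is the one the surrounding sections describe: every refresh costs the source database a query, which is fine against a fast analytical database but exactly the load you may want to keep off a busy transactional one.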
Even better is when you don’t have to choose between in-memory and live connections at all. Instead of looking for a solution that supports one or the other, look for one that supports choice. You should be able to switch back and forth between in-memory and live connections as needed.
Neither in-memory nor a live connection is always the right answer. If you’re forced to choose, you’ll lose something every time. So don’t choose—or rather, choose as you go. Bring your data in-memory, then connect live. Or bring recent data in-memory and work offline. Work the way that makes sense for you.
Tableau’s Data Engine lets you do ad-hoc, in-memory analysis of millions of rows of data in seconds. The Data Engine is a high-performance analytics database on your PC. It offers the speed benefits of traditional in-memory solutions without their limitation that your data must fit in memory. No custom scripting is needed to use the Data Engine. And of course, using it is optional: you can always connect live to your data instead.