In-memory analysis speeds up queries, makes exploration possible
He noted that Tableau's in-memory capabilities essentially offer two benefits: they provide a do-it-yourself interface, and they improve query performance.
"It's letting the analyst do more analysis himself or herself without IT coming between them and their data," he said. "Using this kind of in-memory capability, I do see this being useful in exploring more complex and largish data sets, which were inaccessible before."
Nokia uses Tableau for marketing analysis, layering it on top of the company's own database. Bandaru said that running live queries against that database normally does not deliver the response times analysts need, but running queries against Tableau's data engine provides the kind of instant responses they expect and can work with. He said it is useful for ad hoc analysis, and that most analysts refresh their data on a weekly basis.
Bandaru said he sometimes recommends Tableau for certain users, and other times they come to him wanting to use it. The typical end user he deals with is not versed in analytics, statistics or SQL.
Useful for all levels of user
Said Bandaru, "If the end user is currently data savvy or analytics savvy, he loves using Tableau because it's providing him access to lots of data sources now." But a less sophisticated user might just use it as another BI tool, and it's up to management to provide more education to explain that "you can actually leverage this to do these kinds of complex things which are impossible while using the rest of the tools."
Nokia is currently using Tableau Version 6/6.1, and Bandaru said he expects several improvements in 7.0, such as sharing extracts across workbooks, allowing an extract to serve as a data source, and making these extracts available to multiple users across the globe.
He said this type of streamlined collaboration would be particularly useful for a new employee who is not yet familiar with company operations. Today, he said, the problem is that analysts always have to go back to the original data source; a future benefit would be enabling that employee to extract and leverage data created by any department, anywhere.
Bandaru said he tries to ensure product adoption by offering in-house collaboration resources, such as wikis, training sessions, documentation, and best practices.
The difference that makes a difference
This user experience of Tableau Version 6 illustrates Tableau's movement from being a BI tool in the traditional sense to becoming a BI platform that can take responsibility for a large portion of an organization's BI needs. Technically, the difference is in the architecture. Direct-connect leverages existing high-performance data sources, and the in-memory data engine drives the possibilities. Efficient caching and processing mean that data sets do not even need to be fully loaded into memory before analysis commences.
But the user is unlikely to know or even care about that. It's speed and scope that they notice. Analysis can be performed at the speed of thought, leveraging more data on less hardware. This is true ad hoc analysis, where the user does not have to determine in advance which measures to aggregate or query. The user can explore the data along every one of its dimensions, digging down into detail or summarizing into categories. Almost every form of data visualization is there, available at the speed of thought and capable of processing very large data sets.
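To make the ad hoc idea concrete, here is a minimal sketch (outside Tableau itself) of what querying an in-memory data set without precomputed aggregates looks like; the data, column names, and use of pandas are illustrative assumptions, not Nokia's actual setup:

```python
import pandas as pd

# Hypothetical marketing data held entirely in memory
df = pd.DataFrame({
    "region":  ["EMEA", "EMEA", "APAC", "APAC"],
    "channel": ["web", "retail", "web", "retail"],
    "revenue": [120.0, 80.0, 200.0, 150.0],
})

# Nothing is aggregated in advance: the analyst chooses
# measures and dimensions at query time, interactively.
by_region = df.groupby("region")["revenue"].sum()
by_channel = df.groupby("channel")["revenue"].sum()
```

Because no summary tables are fixed ahead of time, switching from a regional view to a channel view is just another in-memory query rather than a request back to IT.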
Learn More: Nokia and Big Data Analytics with Tableau
Tableau sits down with Nokia's Lee Feinberg, Sr. Manager of Decision Planning and Visualizations, to discuss Nokia's use of Tableau and how it's changing the way they work. Watch the video or read the transcript here.