The biggest data sets I've worked on have been about 10 million, and the query speed is incredibly good. It's only a couple of seconds.
Richard Leeke, Co-owner
|Tableau:||What kinds of problems are you trying to solve with Tableau?|
|Richard:||I use Tableau for almost everything I touch. Mostly, I'm analyzing performance metrics around customers' IT systems. We work on helping customers ensure that their IT systems work fast enough, and that generates heaps and heaps of data in myriad formats. It's all about pulling that data into a form where we can analyze it and get to the bottom of a problem, and as soon as we get to the bottom of a problem we move on. It's not long-term analysis where we're refining and refining.
The biggest project I've been working on since we started with Tableau is analyzing the performance metrics around a 3G mobile phone network that's being rolled out by my customer. We're looking at the impact on their IT systems of the new patterns of usage that this new technology and network bring.|
|Tableau:||What kinds of data are you working with, and what's the size of the data?|
|Richard:||Most of it is interactions with the network and the back-end systems, like the billing system, having to do with 3G data usage – everything going onto the internet from your smartphone. Typically, we get a few million interactions a day. I've got aggregate data dating back a year and typically work on data sets of millions.|
|Tableau:||What kind of query speeds have you seen with the Tableau Data Engine?|
|Richard:||The biggest data sets I've worked on have been about 10 million, and the query speed is incredibly good. It's only a couple of seconds. In my session earlier, I talked about the very first visualization that I did when I got the data engine. The analysis that I happened to be working on at the time, when the first technology preview of the data engine became available, was something involving a few million records worth of data. My coworker Paul and I had been trying to get to the bottom of a difficult problem for about a week, and using Tableau 5 the visualizations were taking 20 minutes to redraw every time I moved something onto another shelf, which meant I couldn't keep my train of thought. Paul didn't work on the same floor. If he happened to be walking past my desk, he'd stop, we'd try to remember where we were, do another bit of analysis, and see the hourglass. Then he'd go off to a meeting, and I'd do something else. The day I got the first release of the data engine, I refreshed my data extract, switched off to do something else, and suddenly I looked up to find that not only had the data extract refreshed, but the visualization was drawn. It was a feeling of disbelief. A visualization that used to take 20 minutes was drawing in a couple of seconds. The next day Paul was walking past and I called him over. We set to on the same problem, and immediately he could see how quickly we could draw down. He pulled up a chair, sat down, and within 5 or 10 minutes we got to the root of this problem that we'd been chasing for a week.|
|Tableau:||In your presentation, you also talked about a number of “aha” moments in looking at the way mobile handsets were sucking down data on your customer's 3G network. Can you talk a bit about that analysis?|
|Richard:||One of the things you look at in the performance of a system is how the interactions are distributed in time. Spiky load – lots of requests arriving at the exact same moment in time – is the classic cause of failures or problems in the back-end systems. With Tableau, we were able to visualize the time distribution of requests down to the second, and this highlighted very quickly that one particular make of handset was requesting and synchronizing email at exactly the same time – which was causing a factor-of-10 increase over the normal background load. Analyzing that data in a more conventional way, where you might average it per minute or per hour, you wouldn't see that effect at all. But being able to draw right down to the fine-grain detail, the behavior just leapt out of the page.|
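The effect Richard describes – a one-second spike that all but vanishes when you average per minute – can be sketched with hypothetical numbers (the 10-requests-per-second background and the single synchronized burst below are illustrative assumptions, not his client's figures):

```python
# Hypothetical per-second request counts over one hour:
# a steady background of 10 requests/second, plus one second
# where a particular handset model synchronizes email en masse.
counts = {s: 10 for s in range(3600)}
counts[1800] += 90  # the synchronized burst: 10x the background load

# At one-second grain the spike is obvious.
per_second_peak = max(counts.values())  # 100

# Averaged per minute, the same data looks almost flat.
per_minute_avg_peak = max(
    sum(counts[m * 60 + s] for s in range(60)) / 60
    for m in range(60)
)  # 11.5 – barely above the background of 10

print(per_second_peak, per_minute_avg_peak)
```

A burst that is a factor of 10 at one-second resolution shows up as a 15% bump in the per-minute average – which is exactly why pre-aggregated reporting never surfaces it.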
|Tableau:||Can you compare Tableau with other tools you've worked with in the past, particularly when it comes to working with large data sets and doing ad hoc analysis?|
|Richard:||Certainly the speed of being able to do ad hoc analysis is just phenomenal. It makes it viable to collaborate on that sort of analysis. In the performance engineering space, I'm often working on a large, complex integrated system which has multiple vendors involved. Typically when there are issues, all the vendors move into the mode of defending themselves and suggesting the problem might be emanating from somewhere else – whether they're directly pointing the finger or just saying it's not their issue. I like to treat diagnosis as a team sport. I try to get everyone working together in a collaborative mode. If you can get everyone looking at the same data and throwing in ideas, then maybe you've got a specialist in the networking area, and a database specialist, and someone else in the infrastructure, and the architect for the software all in the same room. These may all be from different vendors, but they are all working towards the same goal of establishing what the problem is, rather than defending their own patch. It can be really powerful, but when you have half a dozen, or ten, highly paid key people working on a project, all sitting, watching the same hourglass, it just doesn't work in the same way. You see the phones come out, and people checking their next appointment, and someone has to go off for a meeting. The difference is the ability to do it in real time. The moment someone throws out an idea, you explore that idea and immediately you can say, "yes that looks like one that's worth exploring" or "no, it's not". I like to say you're looking for a needle in a haystack, and the first thing you have to do is make sure you're looking in the right haystack. People can spend ages diving into the local haystack because that's the one they know, but you're never going to find the needle if you're in the wrong haystack. 
Being able to take masses of data and find ways of identifying and localizing the problem and drilling down from there again and again – doing that collaboratively – is a very powerful way of working.|
|Tableau:||Tell me about another of your “aha” moments working with Telco clients.|
|Richard:||Recently, we were looking at the total number of requests being made by all the customers over a day. Where's all the load coming from? When analysis is so quick, you can just try things, ranking things by every dimension you can think of. You don't know what you're looking for, but you can just look at things different ways. I ranked which handsets, which phone numbers, were generating the most requests in a day, and looked to see if there were any patterns in common with those. What we found was that 9 out of the top 10 handsets were the same make and model, and they were all prepaid customers. It turned out that there was some particular application running on this particular handset that would say, “can I go to the internet and use data please?” and the billing system would say “no” because there was no credit. Three seconds later it would ask again, and the system would say no again. All day long it was stuck on repeat because the particular application on the handset wasn't well behaved. The Telco customer and my customer were able to go back to the handset supplier and investigate and find out what application was behaving like that. More importantly, it highlighted the risk that any particular application running on any handset, any time, any day, can come along and start behaving that way. So, they built additional safeguards into the way that the system works.|
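The "rank by every dimension you can think of" step above can be sketched in a few lines; the phone numbers, handset models, and request counts below are invented for illustration:

```python
from collections import Counter

# Invented request log: one (phone_number, handset_model) entry per request.
requests = (
    [("064-0001", "ModelX")] * 900    # misbehaving prepaid handsets,
    + [("064-0002", "ModelX")] * 850  # retrying "can I use data?" all day
    + [("064-0003", "ModelX")] * 800
    + [("064-0100", "ModelY")] * 120  # normal usage
    + [("064-0101", "ModelZ")] * 95
)

# Rank phone numbers by requests in the day...
by_number = Counter(number for number, _ in requests)
top = by_number.most_common(3)

# ...then look for a pattern in common among the top talkers.
model_of = dict(requests)
top_models = {model_of[number] for number, _ in top}
print(top)         # [('064-0001', 900), ('064-0002', 850), ('064-0003', 800)]
print(top_models)  # {'ModelX'} – the heavy hitters share a make and model
```

The point is not the code but the workflow: when each ranking takes seconds rather than minutes, trying every dimension until one shows a pattern becomes a viable strategy.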
|Tableau:||What would your Telco clients be doing without this analysis?|
|Richard:||Typically, this sort of thing only comes to light once it starts having a customer impact. The key driver for Telcos is to avoid an adverse impact on the customer experience because customers can change suppliers. The key thing is hanging on to your customers. So preempting problems, rather than reacting to them once they've actually materialized, is really powerful.|
|Tableau:||How did you first get involved with Tableau?|
|Richard:||It was about three years ago now. We were working in the performance engineering space and very often dealing with large data sets. Unfortunately, performance testing and performance engineering tools tend to aggregate all the goodness out of the data. They take the view that there's too much data to work with, so they aggregate some of the data away, which doesn't allow you to drill down to the fine detail for diagnostic purposes. We had been looking for alternatives, and we did a lot of work in Excel pre-aggregating the data ourselves. But the trouble with that is you don't know which dimensions to aggregate by. You make assumptions, but if there's a pattern in a dimension that you've already aggregated out, you're not going to find it. When Microsoft came out with Excel 2007, which was supposed to lift all the limits and only lifted one or two of them, we decided we had to do something else. We set off down the path of building our own tool. We actually built a very crude tool, specialized for the little bits of the problem we were interested in. Then one of my colleagues came across Tableau somewhere. We decided to stop working on our own tool and adopt Tableau. There were some things at the time that Tableau didn't do for us, but the number of things it did immediately was great, and the direction it was going was obvious. In version 6, a lot of that has come to life.|
|Tableau:||Can you talk a little more about how other tools force you to pre-aggregate data and how Tableau stands out in comparison?|
|Richard:||When I'm trying to get to the bottom of issues in the performance space, I like to start from the high-level view, the 50,000-foot view, and try to work out what basic area to look in for the answer. Then, it's really important to be able to drill down to the very fine-grain detail. Lots of built-in reporting tools pre-aggregate the data. They assume you're going to be happy with 10-second, or one-minute, or one-second aggregates. But if you have 1,000 transactions a second, the average over a second distills out a lot of the important diagnostic value. So, the ability to have massive data that you can draw right down into is really very powerful.|
|Tableau:||Let's talk about how you use Tableau to run collaborative meetings. A lot of our customers are trying to do that. What are the keys to doing it effectively?|
|Richard:||I do a lot of this work where I get several people together who are experts in different aspects of the same problem, or involved with the same system in different ways. One of the key things is to establish a common language – a common understanding of the metrics we're looking at. I often find myself doing statistics 101 and explaining the difference between percentiles and medians, just so everyone is talking the same language and understands why I'm drawing down to this detail, or why I'm doing this, or why they're seeing distributional data rather than just averages.
Then it's a question of encouraging the collaborative approach to work. People often say no question is a stupid question; we want it to be that no suggestion is a stupid suggestion. People say, “I wonder if it's correlated to such and such?” It would be very easy to say it's not worth dragging that onto the shelf. But my view is that you should never assume the answer when you're looking for a problem. The golden rule is: you don't know anything, so try everything. If the technology lets you do things in a sufficiently short time, you don't have to make assumptions about where to go looking.|
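The "statistics 101" point above – why a group needs shared vocabulary around averages, medians, and percentiles before exploring together – can be illustrated with an invented latency sample (the numbers are assumptions chosen to make the tail visible):

```python
import statistics

# Invented response times (ms): 90 fast requests and a slow tail of 10.
latencies = [40] * 90 + [2000] * 10

mean_ms = statistics.mean(latencies)      # 236.0 – dominated by the tail
median_ms = statistics.median(latencies)  # 40.0 – the "typical" request
p95_ms = sorted(latencies)[int(0.95 * len(latencies))]  # 2000

print(mean_ms, median_ms, p95_ms)
```

The median says the system is fine, the mean says it is mediocre, and the 95th percentile says one request in ten is painful – three different stories from the same data, which is why the vocabulary has to be settled before the ideas start flying.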
|Tableau:||One of the strengths of Tableau, we believe, is that it lets business leaders and subject matter experts do their own analysis versus sending it out to some department and waiting for a report. As both an owner of Equinox and a subject matter expert, can you talk a bit about why it's so important to you to be able to do analysis yourself?|
|Richard:||The key thing here is that very often you can't describe to someone what you want them to find. Tableau allows me to explore data without knowing what I'm looking for. I know something's in there, but I don't know quite what I'm going to find until I find it. You follow your nose, trust your instincts, and see the pattern jump out of the page. It's hard to express that as a requirement to someone else. In problem solving, I often see the situation where the client says that's this vendor's responsibility, and that's another vendor's responsibility, and we'll get each vendor to check on their piece, and the next day everyone reports back and says, "I can't find anything". All that tells me is that they couldn't find anything – not that it isn't there. The important thing for me is to be able to draw, explore, and slice and dice every which way until a pattern jumps out of the page and tells me the answer. The next person might have the same visual but not join the dots to form an answer.|
|Tableau:||Your clients are obviously seeing some financial benefit from your use of Tableau, such as your examples of solving problems before they have an end customer impact. Can you talk a little about ROI for you or your customers?|
|Richard:||It's difficult, because most of my clients don't really understand what they're getting. They don't know how hard the questions they're having answered are. All they understand is that issues are being preempted. Tableau enables me to provide a better service to my customers, whether or not that's visible to them. I can get more insights more quickly into what's going on in their data, help them get to the root cause of problems, and preempt problems before they manifest in a way that impacts their customers. Of course, that increases the chance that they'll ask me to do more work for them.|