"What do you do?" It's a question you probably get all the time, like I do. As a QA engineer at Tableau, my usual quick answer is "I test data visualization software." Simple enough, right?
Well, I've found that different people hear different things out of a statement like that. For some, visualizing data is the last step -- a way of constructing charts of information for documentation or presentation. But for others, visualizing data is the first step -- a way to understand data before analyzing it in detail.
Either way is certainly a valid approach for using Tableau. Personally, I like taking the latter approach, using visualization as a tool for analysis. I think it's because I have a background in the physical sciences, where the idea of using simple visuals to answer complex questions has a long history. After all, if you can come up with the right thought experiment, visualization, or graph, you can get profound results without going through a more costly, time-consuming series of experiments. In the business world, this might be called "efficiency"; in science, it's often called "elegance."
Take physics, for instance, the scientific field I'm most familiar with. Physicists throughout history -- including geniuses like Galileo, Einstein, and Feynman -- have taken particular pleasure in deriving profound insights from simple illustrations and reasoning.
This came to mind recently when I read a wonderful story about Geoffrey Taylor, the eminent fluid dynamicist. Professor Taylor cracked one of America's most closely guarded secrets -- the yield of the Trinity nuclear test of July 1945 -- using dimensional reasoning and a magazine photograph to arrive at an amazingly close estimate.
The photo he used was the one above, showing the expanding, nearly-spherical blast wave of hot air following the Trinity explosion. Immediately after the war, pictures like this of the A-bomb's results were considered good for propaganda value, but the yield of atomic weapons was a closely guarded national-security secret. As the sole nuclear power, the US was wary of providing any clues that might help A-bomb development -- or defensive measures -- in other countries.
So how did he estimate the size of the explosion from this picture?
Here is a simplified version of the reasoning he used: Assume that a nuclear explosion releases a finite amount of energy (E) at a single point in space and time. Assume that most of the energy goes into movement (i.e., kinetic energy) of the air in the expanding blast wave. Now kinetic energy is a function of mass (density times volume) and velocity (distance per unit time); furthermore, volume is a cubic function of distance.
Put it all together, and the energy (E) must be a function of distance, time, and density scales. The obvious distance, time, and density scales in this problem are the blast-wave radius (r), the time after the explosion (t), and the mean density of atmospheric air (d), respectively. Combining them gives E ~ d*r^5/t^2 -- you can derive this algebraically if you want, but it's not really necessary; that's the only combination of d, r, and t that makes the units of energy work out correctly.
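In symbols, the dimensional bookkeeping in the paragraph above looks like this (a units-only sketch, with all constants of order one dropped):

```latex
\begin{aligned}
E &\sim \underbrace{(d\,r^{3})}_{\text{mass}}\;\underbrace{(r/t)^{2}}_{\text{velocity}^{2}}
   \;=\; \frac{d\,r^{5}}{t^{2}} \\[4pt]
\text{units: } \frac{\mathrm{kg}}{\mathrm{m}^{3}}\cdot\frac{\mathrm{m}^{5}}{\mathrm{s}^{2}}
   &= \mathrm{kg}\,\mathrm{m}^{2}\,\mathrm{s}^{-2} = \mathrm{J}
\end{aligned}
```

No other product of powers of d, r, and t yields joules, which is why the scaling law is forced on us.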
When Taylor saw the photo above in Life magazine, he realized he had all the information he needed to estimate the yield (E). From the picture's distance scale, he estimated r = 140 m; the US government helpfully labeled the time after detonation as t = 0.025 sec. Using d ~ 1 kg/m^3 for air density (a decent mean value), plugging in d, r, and t gives E ~ 86 terajoules, or 21 kilotons TNT. The actual (classified) answer was 20 kilotons. Not bad for just going off a picture. In truth, he got a bit lucky that the factors of 2, 3, pi, and so forth tend to cancel out in this case, but he wasn't looking for an exact answer anyway.
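If you want to check the arithmetic, the estimate above takes only a few lines; this is just the scaling law with the values quoted in the text, ignoring the order-one prefactor exactly as the simplified derivation does:

```python
# Estimate the Trinity yield from the scaling law E ~ d * r^5 / t^2.
r = 140.0   # blast-wave radius in meters (from the photo's distance scale)
t = 0.025   # time after detonation in seconds (labeled on the photo)
d = 1.0     # mean air density in kg/m^3

E = d * r**5 / t**2          # energy in joules
kilotons = E / 4.184e12      # 1 kiloton TNT = 4.184e12 joules

print(f"E = {E:.2e} J = {kilotons:.0f} kilotons TNT")
# prints "E = 8.61e+13 J = 21 kilotons TNT"
```

Rounding 8.61e13 J through the TNT conversion gives the 21 kilotons quoted above.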
When Taylor published this result, many people assumed that he was leaking top-secret information. But there was no leak. Sir Geoffrey had demonstrated the power of images, intuition, and visual analysis, a concept that has roots much older than visualization software.
Elegant analyses like these have a beauty in their own right, but they are also important because they are efficient -- they save time, conserve resources, persuade convincingly, and prevent wasted effort. And maybe -- if we are successful at Tableau -- that's the real answer when someone asks, "What do you do?" Based on your experience with Tableau, have we succeeded? I'd like to hear your comments, pro and con.