Will Alexa become my companion data analyst?

How a smart assistant can eventually become your organization's next data rock star.


This piece first appeared in ComputerWorld.

Alexa joined our family at Christmas.

She lives inside Amazon’s Echo device, now in our kitchen, and responds to our voice commands. Nothing feels more natural than asking Alexa to turn on the radio. There are no buttons, no scanning, just a voice command to start whatever we want to listen to. As my children grow up, they’ll reach a point where they can’t imagine pressing buttons for something as mundane as playing music in the room they’re in.

But what if Alexa could help me understand my data? Stephen Few, a leading writer in the field of data sensemaking, recently wrote a great article explaining why he thinks natural language processing (NLP) does not have a future as an input mechanism for making sense of data.

Respectfully, I disagree.

Today, NLP is hopeless for anything but the most basic questions about data, but that will change. If we don’t start taking faltering steps toward the goal, we will never get there.

Can a conversation ever drive data exploration?

First, let’s remind ourselves why we explore data. At the individual level, we want insight as quickly as possible. At the organizational level, we want to democratize access to data. We aim to build interfaces so intuitive that anyone can find valuable insights in their data, even without deep training.

As the questions get more complex, it should remain just as easy for anyone to use natural language to “flow” with the data and navigate toward insight. Vidya Setlur, a colleague and NLP researcher, says, “Rarely can a person’s questions be answered by a single static chart. They create and explore a whole series of charts to answer new questions that arise. A critical requirement for any system is to answer iterative questions intelligently without expecting the user to be a skilled statistician or database expert.”

For example, imagine we’re exploring a dataset about earthquakes. “Show me large earthquakes in the USA” might be a valid starting point, which would likely produce a map. “How about Texas?” would be a natural way to ask a follow-up question. The challenge for NLP is to maintain context from one question to the next while allowing us to speak naturally, using what artificial intelligence researchers call language pragmatics.

“This approach is promising in maintaining flow,” says Vidya. “Users may be able to express questions more easily in natural language rather than translating those questions to appropriate graphical-user-interface commands.”
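To make that idea concrete, here is a minimal sketch in Python of the context carry-over the earthquake example relies on. Everything in it is hypothetical: the quakes table, the assumption that “large” means magnitude 6.0 or above, and the keyword matching, which merely stands in for a real natural language parser. The point is only that the follow-up question is answered against the filters left behind by the first one.

```python
import pandas as pd

# A hypothetical earthquake table; the columns and values are
# invented purely for illustration.
quakes = pd.DataFrame({
    "magnitude": [7.1, 4.2, 6.3, 6.8],
    "state": ["California", "Texas", "Alaska", "Texas"],
})

class ConversationContext:
    """Carries filters forward so follow-up questions inherit context."""

    def __init__(self):
        self.filters = {}  # e.g. {"magnitude": 6.0, "state": "Texas"}

    def ask(self, utterance):
        text = utterance.lower()
        # Naive keyword matching stands in for a real NLP parser.
        if "large" in text:
            self.filters["magnitude"] = 6.0  # assumed meaning of "large"
        for state in quakes["state"].unique():
            if state.lower() in text:
                # A new place overrides the old one; other filters persist.
                self.filters["state"] = state
        return self._run()

    def _run(self):
        result = quakes
        if "magnitude" in self.filters:
            result = result[result["magnitude"] >= self.filters["magnitude"]]
        if "state" in self.filters:
            result = result[result["state"] == self.filters["state"]]
        return result

ctx = ConversationContext()
print(ctx.ask("Show me large earthquakes"))  # the "large" filter is set
print(ctx.ask("How about Texas?"))           # inherits "large", adds Texas
```

A real system would resolve far richer pragmatics than this, but the shape is the same: each utterance edits a shared context rather than starting a fresh query.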

Why wouldn’t we strive to let people ask complex questions of their data using language?

I believe developments in NLP will go hand in hand with voice recognition technology. I already dictate most Google searches and phone text messages because speech recognition is good enough. It’s easier to dictate than type. As soon as speech recognition is powerful enough to enable conversations with data, I will be the first to ditch my mouse and keyboard.

Mobile phones are where we spend most of our time these days. Consider how we interact with them: there’s no mouse, keyboards are impractical, and our fingers are inaccurate. Elon Musk raised this challenge at Recode’s Code conference last year: “We’re I/O bound -- particularly output bound. Your output level is so low, particularly on a phone, your two thumbs sort of tapping away. This is ridiculously slow.” That’s why there is so much work going into speech-to-text. Ultimately, data analysis on the phone will be possible, and will feel natural, when it is driven by language.

We’re not close to that vision yet, of course. Alexa forgets each question the moment I’ve asked it. It’s not a conversation. Data analysis driven by voice commands can cope with only the most basic questions. Remember, though, that fifty years ago nobody thought a computer could ever beat a human at chess. Today, machines not only defeat us at chess, they’ve also overtaken the best Go players and have even beaten humans at poker, a game that requires reading emotions and bluffing as well as calculating pure probability.

Take a look at robotics for other examples. Sure, some of the heroic failures make us laugh (check out @randy_olson’s tweets), but step back and consider how far the technology has come, and how few steps might separate failure from success. The robots may fall over right now, but I bet it won’t be long before they can kick those balls properly.

The best data analysis is exploratory. Today I am forced to converse with my data using a mouse, a keyboard, and a graphical user interface. I would much rather have an iterative conversation with my data, driven by my voice. Surely that goal is worth pursuing? Some of the early steps in NLP and speech recognition look silly, but we should stay on this path so that anybody can intuitively ask multiple, complex questions without having to learn complex interfaces.

So what might the future of data analysis look like? Here’s how I imagine it: