The 3 waves of data visualization: A brief history and predictions for the future

Data enthusiast Elijah Meeks provides an overview of the waves of data visualization throughout recent history—and shares how this history has culminated in a third wave of data visualization.

This post is based on Elijah Meeks’ keynote from the 2018 Tapestry Conference. Elijah is a Senior Data Visualization Engineer at Netflix. A version of this post originally appeared on Medium.com.

Fifteen years ago, there was no D3, no Tableau, no ggplot or even Prefuse/Flare. If you wanted to do network visualization you might use the newly published Cytoscape, though it was focused on bioinformatics—the science of collecting and analyzing complex biological data like genetic codes.

Fast forward to today. There are many more tools for analysts to leverage, but our community still falls into a familiar trap. Instead of seeing data visualization as a universal medium, we split the data visualization community into categories:

  • Analysts using BI tools for reporting
  • Developers using code to make custom data visualization
  • Journalists creating data-driven stories
  • Data scientists leveraging exploratory data analysis

These categories of practice map directly to particular tools and modes that have, as of late, begun to transform. In this post, I’ll provide an overview of the waves of data visualization throughout recent history—and share how this history has culminated in a third wave of data visualization, characterized by the convergence of tools, audiences, and modes.


Footage of analysts operating Excel 2.0 to make pie charts c. 1988

First wave: Clarity

The first wave of data visualization, in the modern sense, centered on the work of Edward Tufte, who emphasized clarity, simplicity, and a direct one-to-one mapping of data points. From this era we see the rise of spartan color schemes, often neutral or desaturated palettes with one standout color; the importance of labels and natural-language titles; and a sort of idealization of the perfect chart that is immediately readable, accessible, and actionable. This is a kind of chart-as-sentence, with clear structures and rules like those you might find in The Elements of Style.

Second wave: Systems

The second wave focused on systematizing the encoding of information, which was necessary to develop the tools that produce data visualization. It centered on the most influential work on the subject: The Grammar of Graphics by Leland Wilkinson. In the first wave, charts were the product of a spartan design ethic for storytelling; in this wave, charts are outputs from the specifications of the tools. The Grammar of Graphics aspired to enable people to create an ensemble of graphics, and if we look at any data visualization library today, we can still see that philosophy in action.

These libraries focused on “recipes” that allowed authors to take any set of building blocks and, through a defined “grammar,” produce a near-infinite set of charts. Each was potentially a valid chart, since it was built from acceptable starting points. However, Wilkinson acknowledged that just because a logical relationship between data and graphics can be defined doesn’t mean every chart has useful meaning. A grammar of graphics, and the tools built on it, do not guarantee legibility.
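To make the “building blocks plus grammar” idea concrete, here is a minimal sketch in Python (not the API of any real library; all names are invented for illustration) of how a grammar separates data, aesthetic mappings, and geometry, so that swapping any one component yields a different but structurally valid chart:

```python
from dataclasses import dataclass

# A toy grammar of graphics: a chart spec is just data, plus aesthetic
# mappings, plus a geometry. Swapping any one component yields a new,
# structurally valid chart -- the combinatorial power Wilkinson described.

@dataclass
class Chart:
    data: list        # rows as dicts
    mapping: dict     # aesthetic -> column, e.g. {"x": "year", "y": "views"}
    geometry: str     # "point", "line", "bar", ...

    def describe(self) -> str:
        aes = ", ".join(f"{k}={v}" for k, v in sorted(self.mapping.items()))
        return f"{self.geometry}({aes})"

rows = [{"year": 2016, "views": 10}, {"year": 2017, "views": 30}]

# The same data and mapping, rendered through three geometries,
# produce three distinct charts from one small set of building blocks.
charts = [Chart(rows, {"x": "year", "y": "views"}, geom)
          for geom in ("point", "line", "bar")]

for c in charts:
    print(c.describe())
```

Note that nothing in this toy grammar prevents a nonsensical mapping (say, encoding year as both x and color on a bar of one row): every combination is accepted as a valid specification, which is exactly Wilkinson’s caveat that a grammar alone does not guarantee a legible chart.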

The second wave’s focus on tools had consequences: practitioners produced a forest of dashboards and reports, sometimes forgoing design best practices.

Third wave: Convergence

In recent years, there has been a convergence of tools and user expectations. Modern tools such as ggplot2 (in R) and Tableau are built around the fundamentals of the Grammar of Graphics. At the same time, consumers are more design-savvy, with beautiful visualizations proliferating in the media.

In response, we need to shift our emphasis away from individual charts to the construction, evaluation, and delivery of the products where those charts appear. For example, at Netflix, we’re experimenting with analytical notebooks, designed to explain data visualizations and to foster collaboration and communication.

These factors all contribute to what I believe will define a third wave of data visualization, characterized by a shift in mindset around the following topics:

  • Create charts for reuse: That requires us to move away from the expectation that we are making isolated charts optimized for immediate readability. We currently evaluate and celebrate data visualization that is designed and optimized for a single visit. That’s fine, but we also need to promote, evaluate, and better understand data visualization designed and optimized for multiple visits. UI and UX need to be first-class concerns and interactivity can’t just be an attribute of the geometry. To do this, we need to move away from the model of the isolated genius creating bespoke data visualization and bring in best practices from collaborative projects that are common to the larger software development community.
  • Foster critique: Critique is an attempt to push us to not only celebrate what’s great about data visualization, but also to speak honestly about what’s bad. We need to do more of that if we want to move forward as a community. But criticism is hard — hard to hear and hard to give well. We need to actively work to develop our community to be a place to give, receive and model critical discourse. Ben Jones’ recent piece about building a healthy data visualization community provides some good guidance based on real-world experience that we should all seek to follow.
  • Understand design: Survey responses and conversation indicate design is an important theme to improving data visualization practice. But practitioners have expressed confusion as to what design means. Do we mean graphic design? UI design? General design thinking concepts? Information design? This third wave, characterized by convergence, might emphasize a more thorough understanding of how to incorporate design into the field of data visualization. In a lot of ways, this is already happening, but it should continue to grow as part of early career development for data visualization professionals.
  • Embrace the attention economy: When I first came to Netflix from Stanford, I thought naively that because it was a business, employees would be forced to use and learn about the data visualizations that I made. I found out quickly that was not the case. Netflix’s own culture was against such dictatorial measures but even more, my stakeholders were trying to make critical decisions and my data visualization products were competing with a dozen or more other dashboards and reports.

    As a result, the work we’re doing internally at Netflix leverages images, GIFs, playful color, and novel visual methods to delight users. While the general prohibition against “chart junk” is a good rule, like all rules, it can be applied too strictly. A more decorative approach that acknowledges the existence of an attention economy, even in a data-driven organization, will result in more effective data visualization.

One thing is for sure: a wealth of individuals have set the stage for modern data visualization as we know it. As tools and processes evolve, we will see a new wave of data visualization emerge, one focused on converging best practices from the past with new ways of thinking.

Watch the full presentation on the Tapestry YouTube channel or view Elijah's speaker slides (with notes) for more information.