Every year a gaggle of the world’s best and brightest minds descends on Davos, Switzerland, to discuss many of the world’s most important new developments. In 2012, for the first time, data made the cut as a critical new form of economic currency that, in the eyes of these global thought leaders, is as potent a force as gold, oil, or money itself.
“It’s a revolution,” says Gary King, director of Harvard’s Institute for Quantitative Social Science. “The march of quantification, made possible by enormous new sources of data, will sweep through academia, business, and government. There is no area that is going to be untouched.”
While any new concept carries a risk of hyperbole, big data is driving significant opportunities across industries and the public sector. And as more and more data-generating resources emerge and find their way into the hands of hundreds of millions of new global users, data volumes will continue to increase by orders of magnitude.
“Data, data, and more data … data is everywhere, and it’s important,” says Ravi Kalakota, CEO of E-Business Strategies. “By 2015 nearly 3 billion people will be online, pushing the data created and shared to nearly 8 zettabytes.” For comparison purposes, that is the informational equivalent of 1.8 million new Libraries of Congress being created every year.
Gartner defines big data as “high volume, velocity, and variety information assets that demand cost-effective, innovative forms of information processing for enhanced insight and decision making.” Obviously, the definition of big data is relative to an organization’s capacity to generate, manage, and make sense of it. Smaller companies may consider many terabytes of data “big,” while their larger counterparts don’t assign it that value until it pushes into the petabytes or more.
Not surprisingly, this rapid onset of the “industrial revolution of data” has taken many organizations by surprise and led to more than a few cases of what some industry sources call “data analysis paralysis.” However, this wealth of new data offers exciting new opportunities for spearheading innovation and growth.
Indeed, key determinants to corporate success will be the willingness of organizations not merely to adapt to these changes, but to eagerly embrace this data windfall and adopt new technology and analytical best practices capable of making the most of it.
The Data on Data
Seeds of the Data Explosion
At some point most children become fascinated with big numbers and, when their budding vocabularies are no match for all those zeroes, they’ll cook up definitions of their own (think gajillion, bazillion, and so on). We say this because a child’s imagination may someday be enlisted to coin a term capable of defining the staggering volume of data created by organizations, the Internet, and its digital peripherals.
And there is every indication that we’ve only just begun; that the mountains of data already generated by IT system logs, forms, multimedia files, email, social media feeds, Web analytics, metadata, mobile devices, and countless other applications are but the proverbial tip of the iceberg; that, in the words of Google Chairman Eric Schmidt, when it comes to data “people [and organizations] aren’t nearly ready for the technology revolution that’s going to happen to them.”
Consider that just two years ago EMC and IDC jointly predicted an astounding 45-fold annual data growth rate through 2020, yet today experts suggest that figure may seriously underestimate the potential. This year alone, IDC predicts that 1.2 zettabytes of raw (or, in analytical parlance, “unstructured”) data will be generated—that’s 1,200,000,000,000,000,000,000 to the awe-struck child within.
The simple truth is that as more and more of the connective tissue that defines, feeds, and grows the digital Web “comes online” (e.g., content channels, user platforms, data applications) and billions more people embrace it, nobody really knows just how much data all of that gadgetry in the hands of all those individuals will produce.
The Rise of Unstructured Data
During the early years of the digital age, data wasn’t nearly so thorny an issue for organizations. Most companies and their IT departments implemented relational data management systems that, by definition, had their structural intelligence already baked in. This, in turn, enabled them to use basic desktop spreadsheet systems for analysis.
But as the technology pendulum swings squarely into the hands of individuals; as a seemingly endless cavalcade of new data sources emerge; and as data volumes explode, today’s organizations are finding themselves overwhelmed by data that falls “outside of the structured rows and columns of databases that enterprises have traditionally looked to as their primary sources of information,” says Ananth Krishnan, CTO of $10.2 billion Tata Consultancy Services.
Even for technically sophisticated organizations, the task of preparing for, managing, and analyzing disparate data types can be enormously challenging. Add in massive amounts of the stuff and an organization can quickly reach a kind of information overload. Worse still, many companies play it safe and focus only on the data and analytical practices with which they already are comfortable.
“The problem with this reasoning,” says Thomas Davenport, a senior advisor to Deloitte Analytics, “is that the advance of big data shows no signs of slowing. If companies sit out this trend’s early days … they risk falling behind as competitors and channel partners gain nearly unassailable advantages.”
What’s in it for Business?
Algorithms Over Instinct
Throughout history businesses largely depended on the veteran wisdom of their leaders for making important tactical and strategic decisions. The fates of these organizations often hinged on the outcomes of these educated, yet nevertheless subjective choices. Today, however, the rapid ascendance of data and its ability to deliver critical, objective insights into enterprise operations, consumer behavior, and more, is changing all that.
“Organizational judgment is in the midst of a fundamental change,” says Erik Brynjolfsson, director of MIT Sloan’s Center for Digital Business. Businesses are being forced to transition “from a reliance on a leader’s ‘gut instinct’ to increasingly data-based analytics.”