Chris Anderson has written a great article in Wired on the data deluge and the new challenges it poses to companies. He writes that in the petabyte age we live in, information is no longer a matter of simple three- and four-dimensional taxonomy and order, but of dimensionally agnostic statistics. For companies that have or gather enormous amounts of data, the implication is how quickly they can sift through these massive volumes; the successful ones will be those that can track and measure behavior with unprecedented precision and scale. Take a look:
Speaking at the O'Reilly Emerging Technology Conference this past March, Peter Norvig, Google's research director, offered an update to George Box's maxim: "All models are wrong, and increasingly you can succeed without them."
This is a world where massive amounts of data and applied mathematics replace every other tool that might be brought to bear. Out with every theory of human behavior, from linguistics to sociology. Forget taxonomy, ontology, and psychology. Who knows why people do what they do? The point is they do it, and we can track and measure it with unprecedented fidelity. With enough data, the numbers speak for themselves.
The big target here isn't advertising, though. It's science. The scientific method is built around testable hypotheses. These models, for the most part, are systems visualized in the minds of scientists. The models are then tested, and experiments confirm or falsify theoretical models of how the world works. This is the way science has worked for hundreds of years.
Scientists are trained to recognize that correlation is not causation, that no conclusions should be drawn simply on the basis of correlation between X and Y (it could just be a coincidence). Instead, you must understand the underlying mechanisms that connect the two. Once you have a model, you can connect the data sets with confidence. Data without a model is just noise.
But faced with massive data, this approach to science - hypothesize, model, test - is becoming obsolete. Consider physics: Newtonian models were crude approximations of the truth (wrong at the atomic level, but still useful). A hundred years ago, statistically based quantum mechanics offered a better picture - but quantum mechanics is yet another model, and as such it, too, is flawed, no doubt a caricature of a more complex underlying reality. The reason physics has drifted into theoretical speculation about n-dimensional grand unified models over the past few decades (the "beautiful story" phase of a discipline starved of data) is that we don't know how to run the experiments that would falsify the hypotheses - the energies are too high, the accelerators too expensive, and so on.
Read more on understanding data
Here's an interesting trend from eMarketer on what to expect in the next couple of years: what content customers will watch on TV, how they will consume entertainment, and the data this will make available for analytics and customer-centric marketing.
"At eMarketer, we believe TV viewers will watch more, not less, TV content in the future," says Ben Macklin, senior analyst at eMarketer and author of the new report, TV Trends: Consumers Demand Control. "But they will be accessing and viewing it in different ways from the past."
eMarketer estimates that by 2012 nearly 25% of all TV content watched each day will be time-shifted, on-demand, on the Web or on a mobile device.
"Video-on-demand, digital video recorders, the broadband Web and 3G mobile phones are giving consumers new ways to access and watch TV," says Mr. Macklin. "This does not spell the end of the traditional live TV broadcast or the traditional 30-second ad break, but TV advertising will need to evolve if it is to keep pace with consumer usage."
"Traditional TV broadcasters and advertisers have little time to wait to reinvent themselves and their organizations to take advantage of the interactive, on-demand and mobile video future," says Mr. Macklin.