Lately, we in the technology world are bombarded by the word Analytics at every turn. We have endless debates, verging on theology, about the distinction between business intelligence and business analytics. Is it real-time and synchronous? Is it time-delayed, like a DVR? Is it about fancy reports? Is it about smart people walking around with charts on iPads? Is it in-memory? Is it languishing on tapes? The distinguishing attributes raised are endless. The fact is, however, that analytics has been around for ages. It has been around for so long that we don’t even think about it. Let’s step around the whirlpool of buzzwords and see how.
Analytics through the Ages
How did Cro-Magnons and Neanderthals make day-to-day decisions? They collected data. They interpreted data. They acted on the interpretations. See the graphic below for how analytics worked in prehistory.
Prehistoric humans observed data; data in the form of sky, stars, moon, climate, flora and fauna; fauna like the woolly mammoths. They interpreted what was observed. Did the mammoth follow a pattern of behaviour? Was it vulnerable? Was it away from the herd? Did its path present an opportunity? Based on the interpretation of the data, they acted. They acted not just alone, but in cohorts. Significant human collaborative action brought down the mighty woolly mammoth. Of course, the extra protein in the human diet changed the course of humanity, and so on and so forth. I’m digressing. Let’s get back to analytics!
21st century human ways are not fundamentally different. We still collect data, interpret it, and act on it. Let’s look at what has changed and what has not.
For starters, the words have changed. We refer to data as Big Data. We refer to interpretation as Analytics. We refer to action as Processes; business processes or more generally, social processes.
New-fangled terminology apart, have we really changed? Yes and no. We have not changed, because the fundamental human pattern of data, decision, and action has not changed. We have changed because technology has altered how, and how much, data is created and collected; technology has altered how we analyse that data; and technology has changed how we design and implement the processes that act on that data.
Thanks to networking technology, we can now collect enormous amounts of data from entities (human or otherwise) that are connected. See this post about the X2X world to see how connectivity helps. Thanks to computing technology, we can process that data to derive insights. Thanks to a combination of computers and networks, we can collaborate at a scale never before achieved and act on those insights by implementing technology-aided processes. See this article on virtual social networks to see how human collaboration is getting a fillip from technology.
Big Data Analytics Makes for Efficient Processes
Without technology, analysing the data generated from the interactions of our massive network of entities would have required thousands of woolly mammoths in the back office. Implementing processes based on that analysis would have taken many more woolly mammoths still. Our vast network of computers continues to ease that burden by automating those processes. Rather than employing thousands of woolly mammoths in the back office, we are deploying computers to collect, interpret, and act on data. We have replaced woolly mammoths with machines, with hardware and software. Alternatively, we have made the existing woolly mammoths more productive.
What is in store for us now that humanity has collaborated to slay woolly mammoths, curtailed the use of woolly mammoths at work, made working woolly mammoths efficient, and made collaboration between present-day woolly mammoths a breeze? How about increased innovation and real productivity gains? Am I being optimistic? Perhaps. But isn’t that the story of humanity? Doesn’t technology drive productivity, time and again?
Also published on Medium.