Criticality of Big Data Wins Over Its Availability – Any Day!
The discipline of analytics is really about telling a story through data – stitching together pieces of data from different sources, discovering patterns both obvious and concealed, and presenting a story in a way that's easy to comprehend. The journey of analytics has moved from the more basic slicing and dicing of data along diverse dimensions into the realm of self-discovery and what-if scenario analysis. The distance between what's interesting and what's valuable is gradually shrinking. Much of that is possible due to the progression of both storage technology and data management capabilities, coupled with the ability to process data in near-real time or real time. But much more is needed to realize the full value spectrum of data's usefulness.
All Data Created Equal?
The buzz around Big Data and associated technology would nearly make us assume that data is ubiquitous, hence easily affordable and nearly equally useful. The falling per-unit cost of data storage has encouraged this mindset probably more than anything else. But not all data is created equal, and as such, not all data is equally valuable. While analytics tools try to address this gap by letting users explore and consume data in a multi-dimensional way, they often fall short of linking the content of the data to its context – which is where the utility truly comes in.
The context of data is grounded in its meaning and usage. It's the environment that gives data its purpose and defines its role as part of a business process or decision-making step. The usefulness of data, whether a single element or a collection, increases when context sensitivity matches the trust, relevance, and timeliness of that data with the job at hand. Such confidence in data improves not only its half-life but also how much it contributes downstream. The incremental awareness of what a data element adds to the overall story is the measure of its usefulness in the bigger context. So the story unfolds one step at a time, enriched by the context and meaning associated with it. A recent CMO survey by the AMA and Duke's Fuqua School of Business found that the share of marketing projects using marketing analytics to drive decisions decreased from an already low 37% in 2012 to below 30% a year later. Could it be because most of the data isn't as useful as it was once thought to be, despite the fact that the volume and availability of data increased manifold during that one-year period?
Does Usefulness Matter in a Big Data World?
Analytics performed on such rich, context-aware data is far more likely to deliver what we actually want from valuable data, rather than analytics for its own sake. This is even more apparent in the big data world. A recent Capgemini survey reveals that only 27% of respondents consider their big data initiatives "successful," with a measly 8% describing them as "very successful." Big data especially presents this conundrum: data is abundant, yet for many it poses a needle-in-a-haystack discovery problem.
Correlation among data elements in a myriad set of data doesn't necessarily add up to causality or a meaningful derivation of underlying behavior: a storyline we so want the data to tell, perhaps sometimes too eagerly. We would almost like big data capabilities to provide the panacea for understanding what causes what – simply because more data is available. But is that the real key to detecting true meaning or diagnosing a root cause? Zoomdata CEO Justin Langseth voiced a similar concern when he noted that when it comes to big data, design is as important as performance. Bigness, or the volume factor, doesn't amount to much, as it turns out, if usefulness is missing. So, is it still a good enough reason to treat all data as identical simply because we can process, store, and access it? Or is much more needed to understand the relative currency value of each piece of data and how it contributes to the overall understanding of business issues? Is it time to treat big data – and other data, for that matter – not as ubiquitous as experts may want you to believe, but as a scarce resource, to be handled with care and caution? Is it time to focus more on the usefulness of each data element than on the means of acquiring and processing it?