Big Data 2.0: Seeing Will Make Us All Big Data Believers
Big data offers zero value unless a business can extract the intelligence it's looking for. As advanced analytics with intuitive data visualization comes online, big data 2.0 will come into its own as the driver of fully personalized and actionable insights.
By Allen Bonde, Vice President of Product Marketing and Innovation, Actuate Corporation
Can you hear me over all the noise coming from the big data hype machine?
Between big data, small data, and even fast data, it's easy to feel overwhelmed. Yet within all these conversations lies a great deal of business-transforming discussion and opportunity.
The concern, however, is that we may be buying into the promise of big data without knowing whether it can deliver, which is why alternative views such as the small data movement are taking hold. The market is sold on the big-picture, big-capacity side of the big data revolution, as though going in with everything guarantees ending up with something that adds value.
However, data volume and data source diversity do not equal data value, and we shouldn't be surprised to hear reports of big data initiatives that simply rack up huge costs and leave users confused and overwhelmed. Along the way, the point -- generating useful business insights -- has been lost in the noise.
Of course, as with any major business trend, a wave of initial interest is followed by a period of hype, then a crash, and (hopefully) a period of realistic expectations and productivity gains. Big data is currently teetering at this critical point, and many buyers are on the verge of slipping off the hype "cliff" into disillusionment.
No wonder Gartner described big data technologies as nudging the "peak of inflated expectations" -- and it is hard to argue with that sobering notion.
Time for the Next Generation?
As the big data hype ebbs and organizations look beyond the "big bang" approach for more practical ways to put big data to work for more users on everyday tasks, there's an opportunity to redefine what big data actually means and to build on the early successes and lessons learned from the first wave of deployments. Big data can certainly be corralled, but only if CIOs and line-of-business executives start from a position of ruthless pragmatism and understand that identifying their organizations' critical questions is the necessary first step before diving in and making the 3Vs of big data bend to their will.
The second step must be a laser focus on straightforward, effective analysis of only the most promising datasets against those questions.
Finally, all parties must agree to spend an equal amount of time on making the resulting insights and answers accessible -- and actionable -- to the broadest set of users.
Once we filter out 90 percent of the noise, and concentrate on making big data a useful driver of helpful insights via advanced analytics and effective visualization for everyday tasks, then organizations can, and will, get something useful from it.
Big Data Needs to Get "Insight-full"
To get there, we need to refocus on the best ways to deliver timely, meaningful insights to both technical and non-technical audiences. This increasingly means presenting data visually, via data visualization technology.
Formally speaking, data visualization is the presentation of complex statistical data from myriad sources in an easy-to-use visual format that allows business analysts and users to understand the most important trends. Data visualization tools help users identify patterns, outliers, anomalies, cause-and-effect relationships, and other associations within the data -- the kind of business intelligence that prompted executives to write the Big Check for the big data project in the first place.
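The pattern-spotting described above can be sketched in a few lines. This is a minimal illustration of the kind of outlier-surfacing a visualization tool automates; the sample order data and the two-standard-deviation threshold are assumptions for demonstration, not part of any specific product.

```python
# Illustrative sketch: flag the points a chart would make obvious,
# using a simple z-score rule. Sample data and the 2-sigma threshold
# are assumptions for demonstration only.
from statistics import mean, stdev

daily_orders = [102, 98, 110, 105, 97, 310, 101, 99, 104, 100]

def find_outliers(values, threshold=2.0):
    """Return (index, value) pairs lying more than `threshold`
    standard deviations from the mean."""
    mu = mean(values)
    sigma = stdev(values)
    return [(i, v) for i, v in enumerate(values)
            if abs(v - mu) > threshold * sigma]

print(find_outliers(daily_orders))  # the day with 310 orders stands out
```

A human scanning the raw numbers might miss that spike; plotted on a chart -- or flagged programmatically as above -- it is unmistakable, which is the whole point of the visualization layer.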
At the same time, don't forget that advanced analytics (including data mining and predictive techniques) are needed to hit that target. Such tools bring richness and meaning to big data: predictive analytics, for instance, can analyze and model big data to help you predict events, while text mining and natural language processing (NLP) solutions unlock hidden consumer sentiment or extract underlying meaning from textual data. In fact, good data analysis depends on the solid bedrock of proper business analytics.
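To make the sentiment-mining idea concrete, here is a deliberately tiny lexicon-based sketch. The word lists and sample reviews are assumptions for illustration; production NLP solutions rely on trained language models rather than hand-built lists, but the goal is the same: turning raw text into a signal a business can act on.

```python
# Minimal lexicon-based sentiment scoring, to illustrate how
# text-mining tools surface consumer sentiment from raw feedback.
# Word lists and sample reviews are illustrative assumptions.
POSITIVE = {"great", "love", "fast", "helpful"}
NEGATIVE = {"slow", "broken", "awful", "refund"}

def sentiment_score(text):
    """Return positive-minus-negative word count for a piece of text."""
    words = text.lower().split()
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

reviews = [
    "Great service and fast shipping",
    "Awful experience, want a refund",
]
print([sentiment_score(r) for r in reviews])  # positive vs. negative reviews
```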
Big Data 2.0
Whether you call it small data or big data 2.0, delivering advanced analytics and visualization tools that allow users (who aren't data scientists) to derive timely, meaningful insight from their data assets is the key to helping organizations "see" the value in big data. These tools help fill in the "last mile" of big data by presenting the complex relationships found within unstructured, structured, and even multi-structured big data in ways that make it easy to turn insights into action. The tools query and model the underlying data sources (in many cases, via the power of in-memory computing) before presenting a visual analysis of the data.
Such systems are particularly suited to the exploratory style of analysis that big data demands, thanks to their ability to recognize patterns and communicate data in a way that business users find more tangible and meaningful than picking through reams of tabular data.
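The "query, model, then present" pipeline described above can be sketched with Python's built-in sqlite3 as a stand-in for an in-memory analytics engine. The table and column names are illustrative assumptions; a real deployment would point a visualization layer at the aggregated result instead of printing it.

```python
# Sketch of the query-then-present pipeline, using sqlite3's
# in-memory mode as a stand-in for an in-memory analytics engine.
# Table and column names are illustrative assumptions.
import sqlite3

conn = sqlite3.connect(":memory:")  # the whole database lives in RAM
conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?)",
    [("east", 120.0), ("west", 80.0), ("east", 200.0), ("west", 95.0)],
)

# Aggregate in memory; a visualization front end would chart this
# summary rather than handing users the raw rows.
rows = conn.execute(
    "SELECT region, SUM(amount) FROM sales GROUP BY region ORDER BY region"
).fetchall()
print(rows)
```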
The message is clear. We need to position ourselves carefully to avoid the fallout from the impending big data "hype hangover." Meanwhile, we need to look closely at how best to achieve real payback for our big data investments. The best way to do just that is to operationalize analytics by focusing on high-value use cases -- and use intuitive and engaging data visualization tools. That's a combination that provides a sensible way forward for next-generation big data 2.0, producing new ways to drive brand loyalty, improve profitability, and give customers the targeted, useful service they expect.
That's a message that will be heard through all the hype.
5 Best Practices for Developing Big Data Insight
1. Leverage and maximize
Leverage and maximize any investments in your conventional BI reporting, dashboard, and visualization tools. Such tools will easily work against structured big data stores, such as SQL-based in-memory or MPP databases. Look for the same level of tool support against other big data stores, including Hadoop HDFS or a NoSQL database. Press your vendor about its plans in this area; you need this functionality today, not next year.
2. Deliver insight
Data visualization tools provide a hugely helpful way to illustrate patterns, trends, warnings, and opportunities within your data. Give proper thought to how you present insights to end users. Think "personalized analytics" -- that is, provide self-service BI tools for more advanced users, and tools with a context- and role-based delivery interface for the front-line workers or casual business users who need them. Meanwhile, remember that the most successful data visualizations tell a story about the data and provide an environment for users to interactively explore and probe the data further.
3. Recognize that support for pre-packaged apps is changing
Until now, big data offerings have focused on packaging tools and content that deliver insights sourced from structured data. Today, purpose-built appliances with preconfigured software and hardware (including delivery interfaces) are fast becoming a recognized, popular deployment model for big data work. Expect this trend to extend to more multi-structured data environments in the short to medium term.
4. Scale securely
It's one thing to deliver personalization on a 1:1 basis; it's another to deliver it to millions of users in a safe and secure fashion. When you're personalizing at mega scale, it's even more important that security is appropriate (contextual) for each and every user. Make sure security is enforced at a granular, per-user level.
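Contextual, per-user security can be sketched as a row-level filter: each user sees only the records their role and region entitle them to. The user model and record fields below are assumptions for illustration, not any product's actual security model.

```python
# Sketch of contextual (row-level) security: filter records down to
# what each user's role and region allow. Field names and the
# role/region rules are illustrative assumptions.
records = [
    {"id": 1, "region": "east", "restricted": False},
    {"id": 2, "region": "west", "restricted": False},
    {"id": 3, "region": "east", "restricted": True},
]

def visible_records(records, user):
    """Return only the records this user's context entitles them to see."""
    return [
        r for r in records
        if r["region"] == user["region"]
        and (user["role"] == "manager" or not r["restricted"])
    ]

analyst = {"role": "analyst", "region": "east"}
manager = {"role": "manager", "region": "east"}
print([r["id"] for r in visible_records(records, analyst)])  # analyst: [1]
print([r["id"] for r in visible_records(records, manager)])  # manager: [1, 3]
```

The design point is that the filter runs per request from the user's context, so the same personalization pipeline can serve millions of users without ever materializing data a given user isn't entitled to see.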
5. Find the skills you need
Can you cope with the coming data deluge? You will struggle and fail without the right people helping you. That means developing the necessary skills -- data literacy -- across your organization so it understands how to value data, assess its quality and validity, and use it to generate the right insights. Bottom line: buy all the tech you can, but if you skimp on internal brain power, you are wasting your big data chance.
Allen Bonde is vice president of product marketing and innovation at business analytics software specialist Actuate Corporation, the developer of BIRT iHub, BIRT Content Services for CCM, and BIRT Analytics. You can contact the author at