It’s hard to find a topic hotter than Data Science right now, and it can be equally hard to find one more confusing. Data Science techniques have revolutionized nearly every industry you can imagine, and in some cases created whole new ones from thin air. Despite this, much of Data Science remains couched in mystery: a magic black box that is supposed to solve all of our problems.
Data lakes are coming on strong as a modern and practical way of managing the large volumes and broad range of data types and sources that enterprises are facing today. TDWI sees data lakes managing diverse data successfully for business-driven use cases, such as omni-channel marketing, multi-module ERP, the digital supply chain, and data warehouses extended for business analytics. Yet, even in business-driven examples like these, user organizations still haven’t achieved full business value and return on investment from their data lakes.
Philip Russom, Ph.D.
The volume of data and the speed at which data is produced continue to increase on an exponential scale. Consumer transaction data, client records, and data in motion from mobile devices, IoT sensors, and other sources usually contain associated geographic coordinates that require geospatial processing to extract value. Given the volume and variety of this data, organizations need a location strategy that includes big data technology that can join disparate data sets (geoenrichment) and perform location analytics to reveal actionable business and operational insights.
Pitney Bowes Software Solutions
A revolution is occurring in modern analytics, driven by our ability to capture new sources of information at a level of detail previously too complex and costly to imagine. As more data comes from new sources (from machines to social media) and is applied to new applications, data is evolving into greater diversity, spanning every variation of data type from unstructured to multistructured. Even as new tools to analyze and manipulate this newly available resource come online, it is not enough to look at the data manipulation layer alone.
Philip Russom, Ph.D.
Organizations of all sizes are in competition to realize value from data – and to realize it faster. To do so, they increasingly need flexible and agile business intelligence (BI), analytics, and data infrastructure, not systems that take too long to develop and fail to give users the dynamic, iterative, and interactive access to data that they need. Fortunately, technology developments are trending in a positive direction for organizations seeking to accelerate their path to value with BI, analytics, and the critical supporting data infrastructure. These include self-service BI and visual analytics, self-service data preparation, cloud computing and software as a service (SaaS), and new data integration technologies.
Cambridge Semantics, Looker, Modemetric, SAP, SAS, Tableau Software, Unifi Software, Zoomdata
An increase in data maturity correlates with an increase in business success. Yet although organizations gladly allocate budget to business projects, they neglect data maturity—even to the point of allowing it to deteriorate.
As BI and analytics become more mainstream, organizations are realizing that it makes sense to both enrich and augment their data in order to gain more insight. Successful companies realize that relying solely on traditional structured data for analytics is a non-starter. Organizations are increasingly adding ‘new’ data sources to the mix, including demographic data, text data, and geospatial data. They are also looking to external data, such as social media data, weather data, and other third-party sources. Demand from data consumers has also driven many new organizations to pursue sharing their data. Many of these data sources are cloud-based.
Fern Halper, Ph.D.