Many end-user organizations are launching or expanding big data and big data analytics initiatives. These organizations want to understand how to approach big data and where they stand relative to other companies, especially their competitors. In late October 2013, TDWI launched its Big Data Maturity Model Assessment Tool, which can help guide IT and business professionals on their big data journey. The assessment evaluates companies across five dimensions that affect maturity: organization, infrastructure, data management, analytics, and governance.
Fern Halper, Ph.D., Krish Krishnan
Content Provided by
TDWI, IBM, Cloudera, MarkLogic, Pentaho
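The kind of dimensional scoring the assessment above describes can be pictured with a small, purely illustrative sketch in Python. The five dimension names come from the report description; the 1-5 scale and the simple average are assumptions for this sketch, not TDWI's actual assessment methodology.

```python
# Purely illustrative: score an organization on the five maturity dimensions
# named in the report above. The 1-5 scale and the plain average are
# assumptions for this sketch, not TDWI's actual scoring logic.

from statistics import mean

dimension_scores = {
    "organization": 3,
    "infrastructure": 2,
    "data management": 3,
    "analytics": 4,
    "governance": 2,
}

overall = mean(dimension_scores.values())

for dimension, score in sorted(dimension_scores.items(), key=lambda kv: kv[1]):
    print(f"{dimension:<16} {score}/5")
print(f"{'overall':<16} {overall:.1f}/5")

# The lowest-scoring dimension suggests where to focus next.
weakest = min(dimension_scores, key=dimension_scores.get)
print(f"Weakest dimension: {weakest}")
```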
Predictive analytics is quickly becoming a decisive advantage for achieving desired business outcomes, including higher customer profitability, stickier websites, more relevant products and services, and more efficient and effective operations and finances. Predictive analytics involves methods and technologies to help organizations spot patterns and trends in data, test large numbers of variables, develop and score models, and mine data for unexpected insights. Sources for predictive analytics are expanding to include machine data and semi-structured and unstructured data, making it important to include text analytics and mining in technology portfolios.
Fern Halper, Ph.D.
Sponsored by
Birst, Actuate - now OpenText, Alteryx, Pentaho, SAP, Tableau Software
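The "develop and score models" workflow described in the predictive analytics report above can be sketched in a few lines. This is a generic illustration assuming scikit-learn and synthetic churn-style data; it does not represent any of the sponsors' products.

```python
# A minimal sketch of developing and scoring a predictive model, using
# scikit-learn and synthetic data. The features and churn label are invented
# stand-ins for real predictors.

import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)

# Synthetic customer features (e.g., tenure, monthly spend, support calls)
# and a churn label derived from them plus noise.
X = rng.normal(size=(1000, 3))
y = (X[:, 0] - 0.5 * X[:, 2] + rng.normal(scale=0.5, size=1000) > 0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Develop the model on historical data...
model = LogisticRegression().fit(X_train, y_train)

# ...then score new records with churn probabilities and evaluate the fit.
scores = model.predict_proba(X_test)[:, 1]
print(f"Holdout AUC: {roc_auc_score(y_test, scores):.3f}")
```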
More and more, companies are looking to a variety of data types and new forms of analysis in order to remain competitive. Geospatial data is emerging as an important source of information in both traditional and big data analytics. Companies are using geospatial data and geospatial analytics in applications ranging from marketing to operations, and the analytics are moving beyond simple mapping to more sophisticated use cases.
Fern Halper, Ph.D.
Sponsored by
Alteryx, Information Builders, Tableau Software
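As a small example of geospatial analytics that goes beyond mapping, the sketch below computes great-circle (haversine) distances to find the store nearest a customer. The store names and coordinates are invented for illustration and are not tied to any of the tools listed above.

```python
# Illustrative geospatial calculation: which store is closest to a customer,
# by great-circle (haversine) distance between latitude/longitude pairs.

from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometers between two (lat, lon) points."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371.0 * asin(sqrt(a))  # mean Earth radius ~6371 km

# Hypothetical store locations and one customer location.
stores = {"Downtown": (41.8781, -87.6298), "Airport": (41.9742, -87.9073)}
customer = (41.90, -87.65)

nearest = min(stores, key=lambda name: haversine_km(*customer, *stores[name]))
print(f"Nearest store: {nearest}")
```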
Organizations today are seeking to drive deep analysis, detect patterns, and find anomalies across terabytes or petabytes of raw big data. Whether you’re trying to discover the root cause of the latest customer churn or the hidden costs that are eroding the bottom line, you need analytic tools and techniques that work well with unstructured and multi-structured data in its original raw form.
Apache Hadoop is maturing into a loosely coupled stack for inexpensive batch storage and processing, where you don't need to know data formats or schemas in advance to store and process the data.
Philip Russom, Ph.D.
Sponsored by
Splunk
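The schema-on-read idea mentioned above (store raw data first, impose structure only when it is queried) can be illustrated without any Hadoop or Splunk machinery. The sketch below uses plain Python; the file name, event formats, and fields are hypothetical.

```python
# Schema-on-read in miniature: raw, multi-structured records are stored as-is,
# and structure is applied only when the data is read.

import json
import re

RAW_LOG = "events.raw"  # hypothetical landing file: lines kept exactly as received

# Store step: append whatever arrives, with no schema enforced up front.
with open(RAW_LOG, "w") as f:
    f.write('2024-01-05T10:00:01 level=ERROR svc=billing msg="card declined"\n')
    f.write('{"ts": "2024-01-05T10:00:02", "level": "INFO", "svc": "web"}\n')

# Read step: impose structure at query time, handling each format found.
def parse(line):
    line = line.strip()
    if line.startswith("{"):
        return json.loads(line)                             # JSON event
    return dict(re.findall(r'(\w+)=("[^"]*"|\S+)', line))   # key=value event

with open(RAW_LOG) as f:
    events = [parse(line) for line in f]

errors = [e for e in events if e.get("level", "").strip('"') == "ERROR"]
print(f"{len(errors)} error event(s) found")
```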
There is considerable buzz in the IT industry about in-memory technologies. At the same time, there is considerable confusion about the role and benefits of these technologies in business intelligence processing and performance. A major reason for this confusion is that in-memory computing means different things to different people.
Colin White
Sponsored by
Tableau Software
Organizations are increasingly recognizing the intrinsic value in growing information resources to drive decisions on product and service innovation, customer relationships, operations, and value-chain activities. Yet management of information from a systemic perspective has historically reflected a dichotomy between the business side of the organization, where information is created, shared, delivered, and discovered, and the IT department, where information/data is being stored, protected, secured, and preserved.
David Loshin
Sponsored by
Hewlett Packard Enterprise
Master data management (MDM) comprises a robust set of processes, techniques, and technologies. MDM has matured to the point where organizations now recognize the need to transition away from “integration” and toward “information use.” That implies a sharper focus on customer data visibility to support corporate directives for improving the customer experience in ways that create and maintain both corporate and customer value.
David Loshin
Sponsored by
TDWI and IBM Content
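One core MDM technique implied above, matching duplicate customer records and merging them into a single master view, can be sketched very simply. The match rule (shared e-mail) and survivorship rule (longest non-empty value) below are deliberately naive placeholders for the far richer logic real MDM hubs apply.

```python
# Simplified MDM sketch: cluster source records that share a normalized e-mail,
# then merge each cluster into one "golden" master record.

from collections import defaultdict

records = [
    {"source": "CRM",     "name": "Ann Smith", "email": "ann.smith@example.com", "phone": ""},
    {"source": "Billing", "name": "A. Smith",  "email": "ann.smith@example.com", "phone": "555-0100"},
    {"source": "Support", "name": "Bob Jones", "email": "bob.jones@example.com", "phone": "555-0199"},
]

# Match rule (illustrative): same normalized e-mail means same customer.
clusters = defaultdict(list)
for rec in records:
    clusters[rec["email"].strip().lower()].append(rec)

# Survivorship rule (illustrative): keep the longest non-empty value per attribute.
masters = []
for email, recs in clusters.items():
    master = {"email": email}
    for field in ("name", "phone"):
        master[field] = max((r[field] for r in recs), key=len)
    masters.append(master)

print(f"{len(records)} source records -> {len(masters)} master records")
for m in masters:
    print(m)
```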