There is no question that mainframes continue to serve a wide range of organizations by providing a secure, high-performance, and scalable computing platform that’s hard to match on other systems. The issue arises when you attempt to extend mainframe data or applications to participate in new business applications on so-called open systems. Non-relational data, mainframe COBOL programs, and 3270 screen-based applications are difficult to access from open systems, and this inhibits modern data-driven business practices such as 360-degree views, on-demand performance management, just-in-time inventory, business intelligence, and so on.
Sponsored By Progress DataDirect Shadow
Data profiling, data integration, and data quality go together like bread, peanut butter, and jam, because all three address related issues in data assessment, acquisition, and improvement. Because they overlap and complement each other, the three are increasingly practiced in tandem, often by the same team within the same data-driven initiative. Hence, there are good reasons and ample precedent for bringing the three related practices together. The result is an integrated practice for data profiling, integration, and quality (dPIQ).
Sponsored By Pitney Bowes Business Insight
Some people don’t believe data integration has an architecture, under the assumption that data integration is a small component of a larger data warehouse architecture. If you fail to recognize the autonomous architectures that data integration has developed in recent years, you can’t address how architecture affects data integration’s scalability, staffing, cost, and ability to support real-time operation, master data management, SOA, and interoperability with related integration and quality tools. And all of these are worth addressing. This Webinar makes a case for data integration architecture by defining what it does, where it’s going, and why you should care.
Sponsored By Informatica Corporation
Usage rates for BI tools have nudged up from 18 percent three years ago to 24 percent today, according to TDWI Research. This abysmally low percentage accounts for most of an organization’s power users and a handful of very determined casual users. What can you do to make BI more pervasive? Most BI managers latch onto the notion of self-service BI as a panacea for increasing adoption. But this strategy usually backfires unless it’s balanced with a careful understanding of the information requirements of various types of users. Organizations need to balance self-service with tailored delivery of information, and they need to understand what self-service means to different groups of users. This Webinar will address the major pitfalls organizations face when deploying BI tools and recommend steps to make BI accessible to the remaining 80 percent of employees who have yet to become active users.
Sponsored By Tableau Software
TDWI's MDM Insight Online Event was held June 16 & 17 and attended by hundreds of people. The sessions taught attendees how master data management can help companies enhance business process efficiency, connect more effectively with suppliers and customers, and drive higher sales and profits.
Jill Dyché, Philip Russom
Sponsored By IBM, Initiate Systems, Melissa Data, Teradata Aster
In this Webinar, Wayne Eckerson, author of the best-selling book “Performance Dashboards: Measuring, Monitoring, and Managing Your Business,” will explore new developments in performance dashboard technology. Wayne will define requirements for successful dashboard implementations, including layered delivery of information, interactive screens, synchronized charts and tables, remote data access, mobile data delivery, intelligent alerts, and embedded analytics. Sponsor MicroStrategy will then demonstrate how it supports these capabilities.
Sponsored By MicroStrategy
Partnering companies have long exchanged data associated with supply chains and financial routing networks, and more recently with online trade exchanges, e-commerce, and business process outsourcing. Many large companies sync data across business units in a similar fashion. Applications for business-to-business (B2B) data exchange have been around for years, and many have been modernized by interoperating with platforms for enterprise application integration (EAI) and business process management (BPM). They have also begun incorporating the tools and techniques of data integration (DI), which help these applications cope with the numerous data standards common in B2B data exchange, as well as their versions and variants. DI also gives B2B data exchange the BI, data quality, stewardship, and remediation functions it has lacked. When DI is used in this context, it’s called B2B data integration.
Sponsored By Informatica Corporation
New technologies often change the rules of the game, making it possible for BI teams to address business needs in new and creative ways. BI teams that understand how to harness the power of the cloud, open source, virtualization, and high-performance analytical databases can create new opportunities to serve the business while saving money and time. For example, the combination of these technologies will enable organizations to spawn analytical sandboxes on demand to help business people optimize their response to sudden market changes, such as a new competitor, wholesale shifts in pricing, natural disasters, and economic freefalls. These technologies can also be used to offload expensive processing from overburdened data warehouses, avoiding costly upgrades, or to prototype new proof-of-concept BI applications without having to purchase and implement new data center infrastructure. This Webinar will provide an overview of these new data center technologies and describe how BI teams can exploit them for business advantage.
Sponsored By Jaspersoft, Vertica