
Harness the Power of Data Streaming

TDWI Solution Spotlight


Strategies for faster and automated data ingestion, replication, and updating

Data streaming is revolutionizing analytics for customer engagement, operational decisions, supply chain optimization, fraud detection, and much more. In most cases, the closer organizations can get to real-time data, the more valuable that data is for gaining predictive insights, running AI/ML-driven applications, and providing situational awareness to users of dashboards and data analytics platforms. The ability to run Apache Kafka on cloud platforms, on premises, or a hybrid of both has proven critical to expanding the range of use cases that can benefit from real-time insights.
However, to achieve the potential of data and event streaming, organizations need to address key challenges:

  • Complexity. Data teams need better visibility and less confusion as they prepare multiple data streams for Kafka-based platforms and for target analytics and data consumption.
  • Efficient operations at scale. Organizations need to minimize operational complexity to ensure high performance and scalability as streaming grows throughout the organization.
  • Self-service and less manual coding. Data teams can’t afford delays and errors due to heavy manual coding. Organizations need to increase automation and provide users with modern, self-service interfaces—steps that can improve productivity and reduce training pressures.
  • Easier development of event streaming applications. Organizations need to build applications that deliver fresh insights from semistructured data faster, with easier data preparation and pipeline maintenance.
  • Legacy data availability. Mainframe databases and older enterprise applications can’t be left behind. Organizations need to make this data available to Kafka platforms as real-time streams for analytics and data consumption.
  • Integrated change data capture and replication. Capturing changes at data sources is an efficient, less-intrusive way of keeping target data warehouses, applications, and microservices up to date. Organizations need to integrate CDC and replication with real-time data streaming; a minimal sketch of the idea follows this list.
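
To make the last bullet concrete, here is a minimal, vendor-neutral sketch in Python of how a stream of change events keeps a target up to date. The event shape ("op", "key", "row") and the in-memory stand-in for a target table are assumptions for illustration; in a real pipeline, such events would arrive through Kafka topics populated by a CDC tool.

    # Conceptual CDC sketch: apply a stream of change events to a target.
    # The event shape below is an assumption for illustration only.
    target_table = {}  # stand-in for a warehouse table or microservice cache

    events = [
        {"op": "INSERT", "key": 1, "row": {"id": 1, "status": "new"}},
        {"op": "UPDATE", "key": 1, "row": {"id": 1, "status": "shipped"}},
        {"op": "DELETE", "key": 1, "row": None},
    ]

    for event in events:  # in practice, consumed from a Kafka topic
        if event["op"] == "DELETE":
            target_table.pop(event["key"], None)
        else:  # INSERT and UPDATE both become an upsert on the target
            target_table[event["key"]] = event["row"]
        print(event["op"], "->", target_table)

Only the rows that change move through the pipeline, which is why CDC is less intrusive than repeatedly re-extracting whole tables.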

Join this free Virtual Solution Spotlight webinar to learn how your organization can harness the power of data and event streaming for important initiatives. You will hear from David Stodder, TDWI senior director of research for business intelligence; Rankesh Kumar, partner solution engineer at Confluent; and John Neal of the partner engineering team at Qlik. Learn about technology directions and best practices for making data and event streaming central to your strategy for faster data and real-time analytics.

Watch Now On-Demand
Duration: 90 min.

  • David Stodder

    Sr. Director of Research for Business Intelligence at TDWI

    Delivering the Benefits of Real-Time Data and Event Streaming

    Organizations are excited about the potential of data and event streaming for accelerating analytics and enabling operational managers and automated applications to work with steady streams of the freshest data possible. Real-time streams can take predictive analytics to a new level and enable proactive response to situations, trends, patterns, and customer behavior. However, delivering the benefits is not as easy as opening the floodgates to real-time data streams. It requires carefully defining use case requirements and assembling a comprehensive strategy that addresses varied workloads.
    In this session, TDWI’s David Stodder will share research insights to discuss:

    • Where peer organizations are now with reducing data latency and their priorities for real-time data and event streaming
    • How real-time streaming fits into a comprehensive strategy
    • Technology trends impacting data and event streaming and real-time analytics
    • Recommendations for moving forward to deliver business benefits
  • Rankesh Kumar

    Partner Solution Engineer, Confluent

    Event Streaming On-Premises, in the Cloud, and on Hybrid Architectures to Enable Real-Time Analytics

    Event streaming unlocks many use cases for larger enterprises, but most still require on-premises data centers. We take you through the journey of migrating to the cloud and of streamlining and future-proofing infrastructure through a unified streaming platform, and we explain how Kafka and Confluent deliver real economic benefits. We will talk about streaming across hybrid and multi-cloud environments, running mission-critical enterprise applications, and how an event streaming architecture enables real-time analytics.
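
    As a minimal illustration of producing events into such a platform, the sketch below uses the confluent-kafka Python client with Confluent Cloud-style SASL settings. The endpoint, credentials, and topic name are placeholders for illustration, not values from this session.

        from confluent_kafka import Producer

        # Producer sketch; connection values are placeholders.
        producer = Producer({
            "bootstrap.servers": "BROKER_ENDPOINT:9092",
            "security.protocol": "SASL_SSL",
            "sasl.mechanisms": "PLAIN",
            "sasl.username": "API_KEY",
            "sasl.password": "API_SECRET",
        })

        def on_delivery(err, msg):
            # Report per-message delivery results from the broker.
            if err is not None:
                print(f"delivery failed: {err}")
            else:
                print(f"delivered to {msg.topic()} [{msg.partition()}]")

        producer.produce("clickstream", key="user-42",  # hypothetical topic
                         value='{"page": "/home"}',
                         on_delivery=on_delivery)
        producer.flush()  # block until outstanding messages are delivered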

    Speaker Bio: Rankesh helps grow and scale the Confluent technology ecosystem by working with partners on integrations with Kafka and Confluent. He began his career building application integrations on SOA, working with a market-leading SOA vendor to develop application and B2B integrations. He built a bulk-payment gateway for some of the largest global financial institutions, integrating multiple payment and audit systems. Most recently, he was a product specialist with a market-leading iPaaS vendor, where he helped the presales team and leading partners with data engineering solutions.

  • John Neal

    Partner Engineering Team, Qlik

    Data in Motion: Building Stream-Based Architectures with Qlik Replicate and Kafka

    Data streaming is fundamental to today’s component-based architectures, particularly those constructed in the cloud. These solutions are challenged by the fact that data written to disk, whether to files or to a database, loses all its momentum. These architectures are further complicated when an enterprise’s most important data is found elsewhere: stored in a relational database, located on a mainframe or other ‘legacy’ system, or managed by an application such as SAP or Salesforce.

    In this session, we will explore how Qlik Replicate can be used to get an enterprise’s data in motion again and how development of modern, component-based solutions can be accelerated by coupling Qlik Replicate with Kafka.
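
    As a rough sketch of the consuming side, the snippet below reads change messages from a Kafka topic that a Replicate-style CDC feed might populate, using the confluent-kafka Python client. The topic name and the JSON envelope (an operation plus before/after row images) are assumptions for illustration; the actual message format depends on how the Kafka target endpoint is configured.

        import json
        from confluent_kafka import Consumer

        # Consumer sketch; topic and message envelope are assumptions.
        consumer = Consumer({
            "bootstrap.servers": "localhost:9092",  # placeholder broker
            "group.id": "replicate-reader",
            "auto.offset.reset": "earliest",
        })
        consumer.subscribe(["customers.changes"])  # hypothetical per-table topic

        try:
            while True:
                msg = consumer.poll(timeout=1.0)
                if msg is None or msg.error():
                    continue
                change = json.loads(msg.value())
                op = change.get("operation")  # e.g. INSERT/UPDATE/DELETE
                before, after = change.get("beforeData"), change.get("data")
                if op == "DELETE":
                    print("remove downstream row:", before)
                else:
                    print("upsert downstream row:", after)
        finally:
            consumer.close()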

    Speaker Bio: John Neal loves getting his geek on and finding solutions to challenging problems, especially when it involves writing code. Early in his career he worked on government programs that he still can’t talk about … but you can read about them in Tom Clancy novels. Go figure.

    John was a recognized expert in mapping complex application data models into the precursors of today’s NoSQL databases, and he developed the first production-ready adapters that allowed Oracle GoldenGate to deliver data at scale into “modern” targets, including Kafka.

    Today John works on the Partner Engineering team at Qlik, where he gets paid to do what he enjoys most: researching new and interesting technologies and applying that knowledge in creative ways when solving problems.

Watch the solution spotlight on demand today!

Presented By

Qlik