
TDWI Chicago Update

At TDWI, we have been working hard to navigate this ever-changing landscape in the face of COVID-19, and we want to assure you that the health and well-being of our employees, customers, and vendor partners are our top priority. Due to the growing concern around the virus, and in alignment with the guidelines laid out by the CDC and WHO, we have decided to merge this year’s TDWI Chicago Conference (May 10-15) with TDWI Orlando 2020 (November 8-13), where we can deliver a successful experience for everyone. The Chicago 2020 agenda will be replicated at TDWI Orlando 2020.

Our registration team will be in contact with individual registrants and sponsors directly.

Course Description

T6P Data Engineering for Batch, Streaming, and Data Pipelines

May 12, 2020

2:15 pm - 5:30 pm

Duration: Half-Day Course

Level: Intermediate to Advanced

Prerequisite: None

John O'Brien

President

Radiant Advisors

Data integration has evolved from traditional batch ETL into data engineering, with new concepts and technologies that are proving to be more agile, scalable, and affordable. Understanding the key concepts and relationships of modern data integration will allow data integration teams to evolve into data engineering teams that manage batch ETL, streaming data ingestion, and data pipelines in a single architecture.

This course places additional emphasis on how to organize and simplify data ingestion environments by exploring how database replication can stream data from operational databases non-intrusively. We will also look at the DataOps principles and data integration design patterns that should be adopted to deliver the data needed in a portable architecture across on-premises and cloud environments.
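
As a hedged illustration of the replication-based ingestion described above, the Python sketch below reads change events that a log-based replication tool might publish to a Kafka topic. The topic name, broker address, event fields, and the use of the kafka-python client are assumptions made for this sketch only; they are not taken from the course material. The non-intrusive aspect comes from reading the database’s transaction log rather than querying the source tables directly.

import json

from kafka import KafkaConsumer  # requires the kafka-python package

# Minimal sketch: reading a change-data-capture (CDC) stream that a
# log-based replication tool publishes to Kafka. The topic name, event
# shape, and broker address are illustrative assumptions.
consumer = KafkaConsumer(
    "orders.cdc",                           # hypothetical CDC topic
    bootstrap_servers="localhost:9092",     # assumed local broker
    auto_offset_reset="earliest",
    value_deserializer=lambda m: json.loads(m.decode("utf-8")),
)

for record in consumer:
    event = record.value
    # A typical change event carries the operation type and the row image;
    # the exact field names depend on the replication tool in use.
    operation = event.get("op", "unknown")           # e.g. insert/update/delete
    row = event.get("after") or event.get("before")  # new or old row image
    print(f"{operation}: {row}")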

You Will Learn

  • Essential data ingestion factors including database replication, streams, and managed files
  • How streaming data hubs act as data brokers with Apache Kafka
  • How to engineer data pipelines with integration design patterns (see the sketch after this list)
  • How to develop, deploy, and manage data pipelines with DataOps
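
To make the design-pattern bullet above more concrete, here is a minimal sketch of one common integration pattern: a pipeline composed of small, single-purpose steps that are chained in order, so each step can be developed, tested, and reused independently, which is also what DataOps-style development and deployment relies on. The step names and record format below are invented for illustration and are not drawn from the course material.

from typing import Callable, Dict, Iterable, List

# Minimal sketch of a composable pipeline pattern: each step is a small
# function that takes an iterable of records and yields transformed records.
Record = Dict[str, object]
Step = Callable[[Iterable[Record]], Iterable[Record]]

def drop_incomplete(records: Iterable[Record]) -> Iterable[Record]:
    # Filter out records that are missing a customer identifier.
    return (r for r in records if r.get("customer_id") is not None)

def normalize_country(records: Iterable[Record]) -> Iterable[Record]:
    # Standardize the country code to upper case.
    for r in records:
        r["country"] = str(r.get("country", "")).upper()
        yield r

def run_pipeline(records: Iterable[Record], steps: List[Step]) -> List[Record]:
    # Chain the steps in order; each consumes the previous step's output.
    for step in steps:
        records = step(records)
    return list(records)

if __name__ == "__main__":
    raw = [
        {"customer_id": 1, "country": "us"},
        {"customer_id": None, "country": "de"},
    ]
    print(run_pipeline(raw, [drop_incomplete, normalize_country]))
    # -> [{'customer_id': 1, 'country': 'US'}]

Because each step only depends on the shape of the records it receives, the same chain can run against a batch extract or a micro-batch drained from a stream, which is one reason this pattern travels well across on-premises and cloud environments.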

Geared To

  • Integration architects
  • DW/ETL developers
  • Data engineers
  • Data scientists
  • Developers
  • Database administrators
  • Data architects