Modernizing Data Pipelines for Greater Productivity and Cost Efficiency (Mexico Time)
Webinar Speaker: James Kobielus, Senior Research Director, Data Management
Date: Monday, April 24, 2023
Time: 10:00 a.m. CST
Modern cloud-based data pipelines are the key to putting scalable machine learning and other advanced analytics applications into production. However, pipeline modernization can be a daunting challenge if enterprises lack a clear target architecture, have failed to define a comprehensive migration plan that considers future requirements, or struggle to bring together data scientists, data engineers, and other professionals to build and deploy pipelines.
Modern data pipelines shift focus from infrastructure management to driving greater productivity and efficiency in every aspect of analytics-driven business operations. However, organizations can’t realize those benefits unless they minimize the complexity of their infrastructure, ramp up its performance and scalability, maximize the automation of its key functions, instrument it with the observability needed for 24/7 operations, and provide self-service tools for augmenting the collaborations of disparate data and analytics pipeline stakeholders.
Please join TDWI’s senior research director James Kobielus for this webinar, which focuses on the steps needed to put a modern data pipeline into production:
- Migrating disparate data and analytics pipelines non-disruptively to a fully managed cloud platform
- Consolidating data processing infrastructure to reduce costs and enhance lineage tracking and data governance
- Providing a self-service framework and APIs that make data analytics applications programmable through the languages and tools of developers’ choice
- Building, training, and operationalizing data analytics more rapidly and efficiently by enabling flexible collaboration among data engineers, data scientists, and other roles and personas
Following the presentation, Kobielus will be joined by Jeremiah Hansen, principal architect for data engineering in Snowflake’s CTO Office, to discuss trends, issues, and best practices in data pipeline modernization.
Jeremiah Hansen, Principal Architect for Data Engineering, Office of the CTO, Snowflake
Jeremiah is a principal data platform architect in the Field CTO office at Snowflake, where he currently leads the data engineering community of practice. He has an extensive background in data engineering and DevOps, having spent the past 16 years in those fields as a developer, architect, development manager, consulting leader, sales engineer, and, currently, field CTO. He is a frequent speaker and blogger on data architecture and engineering.