
Building Data Pipelines with Airflow


30 People

2 Days

About Course

This course teaches participants how to build, manage, and scale data pipelines using Apache Airflow to handle large datasets. It covers key concepts such as DAGs, tasks, and operators, along with advanced topics such as workflow automation and error handling.

Key Learning Objectives:

  • Build data processing frameworks
  • Set up and manage data pipelines
  • Handle pipeline failures and ensure data quality
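
To give a flavour of what participants will work with, below is a minimal sketch of an Airflow DAG written with the TaskFlow API. It assumes Airflow 2.4 or later (which accepts the schedule parameter); the pipeline name, task names, and sample data are illustrative only, not material from the course.

# Minimal sketch of an Airflow DAG, assuming Airflow 2.4+ and the TaskFlow API.
# Names and sample data are illustrative placeholders.
from datetime import datetime

from airflow.decorators import dag, task


@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
def example_etl_pipeline():
    @task
    def extract():
        # Pull raw records from a source (placeholder data for illustration).
        return [{"id": 1, "value": 10}, {"id": 2, "value": 20}]

    @task
    def transform(records):
        # Apply a simple transformation to each record.
        return [{**r, "value": r["value"] * 2} for r in records]

    @task
    def load(records):
        # Persist the transformed records (here, just log the count).
        print(f"Loaded {len(records)} records")

    load(transform(extract()))


example_etl_pipeline()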

Instructor


Kan Ouivirach

Data Product Developer and Technical Enabler

Course Outline

Who is this course for?

  • Data Engineers – looking to build automated data pipelines for large-scale data
  • Developers – working with data teams or managing data platforms
  • Data Analysts & Scientists – looking to overcome database limitations and answer business questions