Training schedule
IN-COMPANY TRAINING PROGRAMS
Contact Giovanni Lanzani if you want to know more about custom data & AI training for your teams. He’ll be happy to help you!
Develop workflows with Apache Airflow!
Have you ever scheduled a chain of cron jobs only to discover down the line that data was missing? Apache Airflow eliminates that problem. Learn to author, schedule, and monitor workflows through hands-on experience with the leading open source platform in the space.
“It was a hands-on and tangible course. We could apply what we learned in a matter of minutes. The trainer did a great job of answering ad-hoc questions that complemented the material. We appreciated the fact that we could apply what we were taught directly to our company.” —Technical Leader & Software Architect, bol.com
Once you pass the training, you will receive a certificate that you can show off to your organization and peers alike.
This training is for you if…
- You want to schedule and monitor your data pipelines in a user-friendly way.
- You have complex data pipelines with a lot of dependencies.
- You want to schedule your data pipelines on external systems.
This training is not for you if…
- You don’t run any batch data pipelines.
- You want to use Airflow as an execution engine.
- You want to learn how to set up Airflow for production.
Clients we've helped
What you'll learn
- A rundown of the Apache Airflow user interface
- How to create and monitor DAGs
- The basics of using the most critical operators
- How to create dynamic workflows with branching
- How to communicate with external systems using hooks and connections
- How to trigger your workflows with sensors
The schedule
The program consists of both theory and hands-on exercises.
- The essential components of Apache Airflow
- Running and managing workflows
- Creating dynamic workflows with Jinja templating
- Sharing state between tasks with XComs
After the training you will be able to:
- Use basic Airflow concepts, such as DAGs, Operators, and Hooks.
- Design and create task workflows using Airflow DAGs.
- Schedule Airflow DAGs and use concepts like backfilling and catchup.
- Use Airflow Operators to execute predefined tasks which integrate with many different systems.
- Use the Airflow UI to get insights into the status of your workflows.
Get Certified in Apache Airflow Fundamentals

Apache Airflow is the leading orchestrator for authoring, scheduling, and monitoring data pipelines. It has quickly become an invaluable asset in any data professional’s toolbox.
Receiving a Certification for Apache Airflow Fundamentals demonstrates your knowledge of Airflow’s core concepts and your ability to make wise architectural decisions, understand applied use cases, and design data pipelines.
Kris Geusebroek
Big Data Hacker and Trainer
Kris is a seasoned and communicative developer with a passion for combining technologies to create new possibilities for the people around him. He started developing with Java and gained extensive experience building Geographical Information Systems. Over time, Kris gradually developed a passion for open source solutions.
In recent years, Kris has worked with distributed systems such as Hadoop and graph databases such as Neo4j for large enterprises.
Clients include: Rabobank, Wehkamp, Dutch National Police, ING, KNAB, Schiphol, ABN AMRO, and Technische Unie
Training Courses:
Certified Apache Airflow Training