Founded by passionate advocates of learning and innovation, Learni set out to make professional training accessible to everyone, everywhere in the world. Our team works in major cities such as Paris, Lyon, and Marseille, as well as internationally, to support talent and organizations in developing their skills.
Which format do you prefer?
30 free minutes with a training advisor — no commitment.
Master influence and persuasion skills for 2026 with proven strategies, emerging tech, and practical exercises tailored for professional growth in a dynamic world.
Discover step-by-step methods to master bookkeeping and accounting fundamentals in April 2026. Explore online courses, tools, practice tips, and future trends like AI integration for aspiring professionals.
Artificial Intelligence training in Cardiff in May 2026 with Learni. Certified, expert trainers, eligible for employer funding. Free quote.
Professional training in Memphis in October 2026 with Learni. Certified, expert trainers, eligible for employer funding. Free quote.
Don't let this gap widen
Without mastery of Apache Airflow, your data workflows become chaotic and manual, generating delays and recurring errors.
Data engineers lose on average 25 to 40% of their time on manual maintenance, more than €50,000 per year per team in wasted productivity.
70% of critical data incidents are linked to faulty orchestration, exposing the company to massive financial losses and a loss of competitiveness.
Every month without solid Airflow skills compromises strategic projects, threatens data architects' careers, and slows business growth.
The Maîtriser Apache Airflow pour l'Orchestration de Workflows de Données training is delivered in person or remotely (blended learning, e-learning, virtual classroom, remote instructor-led). At Learni, a Qualiopi-certified training organization, each program is designed to maximize skills acquisition, regardless of the training mode chosen.
The trainer alternates between demonstrative, interrogative, and active methods (through practical exercises and/or real-world scenarios). This pedagogical approach ensures concrete and directly applicable learning in the workplace.
To ensure the quality of the Maîtriser Apache Airflow pour l'Orchestration de Workflows de Données training, Learni provides the following teaching resources:
For in-house training held at a location outside Learni's premises, the client commits to providing all teaching materials (IT equipment, internet connection, etc.) required for the training to run properly, in accordance with the prerequisites stated in the training program provided.
The assessment of skills acquired during the Maîtriser Apache Airflow pour l'Orchestration de Workflows de Données training is carried out through:
Learni is committed to the accessibility of its professional training programs. All our training programs are accessible to people with disabilities. Our teams are available to adapt teaching methods to your specific needs. Do not hesitate to contact us for any accommodation request.
Learni training programs are available for inter-company and intra-company settings, both in-person and remote. Registration is possible up to 48 business hours before the start of training. Our programs are eligible for OPCO, Pôle emploi, and FNE-Formation funding. Contact us to discuss your training project and funding possibilities.
Introduction to data workflow orchestration, origins and key concepts of Airflow, architecture (Scheduler, Workers, Executor, PostgreSQL, WebUI), local installation via Docker or pip, getting started with the web interface, first DAGs and simple tasks, basic operators, connection management, configuration of variables and secrets.
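At the heart of this first module is the DAG: a set of tasks ordered by dependencies, where the scheduler only runs a task once all of its upstream tasks have succeeded. As a minimal, Airflow-free sketch of that ordering idea (hypothetical task names, Python standard library only):

```python
from graphlib import TopologicalSorter

# Hypothetical pipeline: extract -> transform -> load, plus an audit task
# that only needs extract. Keys are tasks; values are upstream dependencies
# (the equivalent of `upstream >> task` in an Airflow DAG file).
dag = {
    "extract": set(),
    "transform": {"extract"},
    "load": {"transform"},
    "audit": {"extract"},
}

# Topological order: every task appears after all of its upstream tasks,
# which is the ordering guarantee the scheduler enforces per DAG run.
order = list(TopologicalSorter(dag).static_order())
print(order)
```

The same cycle detection that `TopologicalSorter` performs is why Airflow rejects DAGs with circular dependencies at parse time.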
Writing dynamic DAGs in Python, advanced operators (BashOperator, PythonOperator, EmailOperator, etc.), managing dependencies between tasks, scheduling, retry policies, parallelism control, Sensors and custom operators, manual triggering vs. periodic schedules, hooks and interfacing with external systems (SQL databases, cloud, APIs, S3), managing templates and macros.
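Among the topics above, retry policies are the easiest to picture in isolation: Airflow configures them through task arguments such as `retries` and `retry_delay`, optionally with exponential backoff. The underlying logic can be sketched in plain Python (the `flaky_fetch` task below is hypothetical and fails twice before succeeding):

```python
# Sketch of a retry policy with exponential backoff, the behavior Airflow
# exposes via task arguments such as retries and retry_delay.
attempts = {"count": 0}

def flaky_fetch():
    """Hypothetical task: fails on the first two calls, then succeeds."""
    attempts["count"] += 1
    if attempts["count"] < 3:
        raise ConnectionError("transient failure")
    return "payload"

def run_with_retries(task, retries=3, base_delay=1.0):
    delays = []
    for attempt in range(retries + 1):
        try:
            return task(), delays
        except ConnectionError:
            if attempt == retries:
                raise  # retries exhausted: the task run is marked failed
            delays.append(base_delay * 2 ** attempt)  # 1s, 2s, 4s, ...
            # time.sleep(delays[-1])  # a real runner would wait here

result, delays = run_with_retries(flaky_fetch)
print(result, delays)  # "payload" after two failed attempts, delays [1.0, 2.0]
```

Capping the backoff (as Airflow does with `max_retry_delay`) keeps a long outage from producing hour-long waits between attempts.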
Deploying Airflow on servers, Docker, or Kubernetes, production configuration, security management, large-scale scheduling, monitoring and alerting (logs, emails, SLAs, monitoring), error handling and automatic recovery, DAG maintenance, managing upgrades, pipeline optimization, real-world use cases, integration with Cloud solutions (GCP, AWS, Azure), log auditing, and best practices for reliability and scalability.
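One monitoring primitive listed above is the SLA: when a task takes longer than its allotted time, an alert is raised. The sketch below uses a simplified notion of SLA (task runtime against a fixed threshold, with hypothetical run records); note that Airflow's actual SLA semantics are measured relative to the DAG run's scheduled time, not just the task's own duration:

```python
from datetime import datetime, timedelta

# Hypothetical finished task runs: (task_id, start, end).
runs = [
    ("extract", datetime(2026, 1, 1, 0, 0), datetime(2026, 1, 1, 0, 5)),
    ("transform", datetime(2026, 1, 1, 0, 5), datetime(2026, 1, 1, 0, 40)),
]

def sla_misses(runs, sla=timedelta(minutes=30)):
    """Return the task_ids whose runtime exceeded the SLA threshold."""
    return [task for task, start, end in runs if end - start > sla]

missed = sla_misses(runs)
print(missed)  # the 35-minute transform run exceeds the 30-minute SLA
```

In production, a miss like this would typically fire an email or a callback rather than just being printed, feeding the alerting channels (logs, emails) mentioned in the module.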
Target audience
Data engineers, developers, data architects, and IT professionals looking to automate and orchestrate data pipelines
Prerequisites
Basic knowledge of Python and fundamental data management concepts





























