Founded by passionate advocates of learning and innovation, Learni set out to make professional training accessible to everyone, everywhere. Our teams work in major cities such as Paris, Lyon, and Marseille, as well as internationally, supporting individuals and organizations in developing their skills.
Which format do you prefer?
30 free minutes with a training advisor — no commitment.
Don't let this gap widen
Without mastery of Apache Airflow, 70% of complex data pipelines fail daily, causing insurmountable delays in analysis and decision-making.
Each incident costs an average of €12,000 in manual debugging hours and lost business opportunities, with data teams spending up to 25 hours per week on inefficient maintenance.
For the company, this amounts to a 15-20% erosion of data competitiveness, exposure to regulatory sanctions, and career stagnation for its engineers.
Every month without advanced orchestration amplifies these risks, turning data assets into costly liabilities.
The Maîtrisez Apache Airflow : Orchestration Efficace de Workflows de Données training is delivered in person or remotely (blended learning, e-learning, virtual classroom, live remote sessions). At Learni, a Qualiopi-certified training organization, each program is designed to maximize skills acquisition, whatever the chosen training format.
The trainer alternates between demonstrative, interrogative, and active methods (through practical exercises and/or real-world scenarios). This pedagogical approach ensures concrete and directly applicable learning in the workplace.
To ensure the quality of the Maîtrisez Apache Airflow : Orchestration Efficace de Workflows de Données training, Learni provides the following teaching resources:
For in-house training at a location outside Learni's premises, the client commits to providing all the teaching materials required (IT equipment, internet connection, etc.) for the proper delivery of the training session, in accordance with the prerequisites stated in the training program provided.
The assessment of skills acquired during the Maîtrisez Apache Airflow : Orchestration Efficace de Workflows de Données training is carried out through:
Learni is committed to the accessibility of its professional training programs. All our training programs are accessible to people with disabilities. Our teams are available to adapt teaching methods to your specific needs. Do not hesitate to contact us for any accommodation request.
Learni training programs are available for inter-company and intra-company settings, both in-person and remote. Registration is possible up to 48 business hours before the start of training. Our programs are eligible for OPCO, Pôle emploi, and FNE-Formation funding. Contact us to discuss your training project and funding possibilities.
Introduction to workflow orchestration and Airflow's place in the data stack. Overview of the key components (Scheduler, Worker, Webserver, Metadata Database). Cloud and on-premise architecture considerations. Guided installation of Apache Airflow, basic configuration, tour of the WebUI, understanding logs and scheduling.
Creating DAGs and writing principles (python_operator, bash_operator, sensors). Managing dependencies and dynamic parameters. Implementing advanced operators and customization. Error handling, conditional tasks, automatic retries, and failure management. Hands-on exercises: building and deploying batch and real-time data pipelines.
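To make the DAG-authoring concepts above concrete, here is a minimal DAG definition sketch, assuming Airflow 2.4+ (the `dag_id`, task names, and bash command are illustrative, not from the course materials). It is a pipeline definition fragment parsed and run by Airflow's scheduler, not a standalone script.

```python
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.bash import BashOperator
from airflow.operators.python import PythonOperator


def extract():
    # Hypothetical extract step; a real DAG would read from an actual source.
    return [{"id": 1}, {"id": 2}]


with DAG(
    dag_id="example_daily_etl",  # illustrative name
    start_date=datetime(2026, 1, 1),
    schedule="@daily",
    catchup=False,
    default_args={
        "retries": 2,  # automatic re-executions on failure
        "retry_delay": timedelta(minutes=5),
    },
):
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = BashOperator(task_id="load", bash_command="echo 'loading'")
    extract_task >> load_task  # "load" runs only after "extract" succeeds
```

The `>>` operator expresses the task dependency, and `default_args` applies the retry policy to every task in the DAG.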
Integrating Airflow with relational databases, APIs, S3, BigQuery, Hadoop, Spark, and BI tools. Securing access (RBAC, LDAP, secrets, audit logs). Advanced scheduling with variables, templates, and macros. Performance optimization (scaling, pools, queues, XComs). Advanced monitoring and alerting, deployment, scaling, maintenance, and best practices for running in production.
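XComs, mentioned above, let one task pass small values to a downstream task. The following is a toy stand-in, not Airflow code, meant only to make the push/pull idea runnable without an Airflow install (all names here are invented for illustration):

```python
class ToyXCom:
    """Minimal stand-in for Airflow's XCom store (a toy, not Airflow's API)."""

    def __init__(self):
        self._data = {}

    def push(self, task_id, value, key="return_value"):
        # Values are keyed by (task_id, key), mirroring how XComs are addressed.
        self._data[(task_id, key)] = value

    def pull(self, task_id, key="return_value"):
        return self._data[(task_id, key)]


xcom = ToyXCom()


def extract():
    # Upstream task publishes its result for downstream consumers.
    xcom.push("extract", [{"id": 1}, {"id": 2}])


def load():
    # Downstream task pulls the upstream result by task_id,
    # analogous to xcom_pull(task_ids="extract") in a real operator.
    rows = xcom.pull("extract")
    return len(rows)


extract()
row_count = load()
```

In real Airflow, a `PythonOperator`'s return value is pushed automatically, and downstream tasks retrieve it via the task instance's `xcom_pull`; XComs are intended for small metadata, not bulk data.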
Target audience
Data engineers, data scientists, developers, and system administrators looking to orchestrate, automate, and monitor complex data pipelines
Prerequisites
Good knowledge of Python, basic system administration skills (Linux), and familiarity with ETL and data management concepts





























