Founded by passionate advocates of learning and innovation, Learni set out to make professional training accessible to everyone, everywhere in the world. Our team works in major cities such as Paris, Lyon, and Marseille, and internationally, to support talent and organizations in developing their skills.
The Document AI Pipelines - Automate AI Analysis of Documents training is delivered in person or remotely (blended learning, e-learning, virtual classroom, live remote sessions). At Learni, a Qualiopi-certified training organization, each program is designed to maximize skills acquisition, regardless of the training mode chosen.
The trainer alternates between demonstrative, interrogative, and active methods (through practical exercises and/or real-world scenarios). This pedagogical approach ensures concrete and directly applicable learning in the workplace.
To ensure the quality of the Document AI Pipelines - Automate AI Analysis of Documents training, Learni provides the following teaching resources:
For in-house training held at a location outside Learni's premises, the client commits to providing all necessary teaching materials (IT equipment, internet connection...) for the proper delivery of the training, in accordance with the prerequisites stated in the training program provided.
Skills acquired during the Document AI Pipelines - Automate AI Analysis of Documents training are assessed through:
Learni is committed to the accessibility of its professional training programs. All our training programs are accessible to people with disabilities. Our teams are available to adapt teaching methods to your specific needs. Do not hesitate to contact us for any accommodation request.
Learni training programs are available for inter-company and intra-company settings, both in-person and remote. Registration is possible up to 48 business hours before the start of training. Our programs are eligible for OPCO, Pôle emploi, and FNE-Formation funding. Contact us to discuss your training project and funding possibilities.
Analysis of business needs to structure high-performance Document AI pipelines; hands-on work with orchestration tools such as MLflow and Kubeflow; modeling of an end-to-end pipeline on real invoice and contract cases; practical containerization exercises with Docker; creation of architecture diagrams and a first functional prototype, tested in groups to validate scalability.
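The modular, end-to-end layout described in this first module can be sketched with a minimal, stdlib-only Python pipeline. The `Document` and `Pipeline` classes and the stage names (`normalize`, `tag_type`) are illustrative assumptions for this sketch, not tooling from the course (which works with MLflow and Kubeflow):

```python
from dataclasses import dataclass, field
from typing import Callable

@dataclass
class Document:
    """Carries text plus metadata accumulated by each stage."""
    text: str
    metadata: dict = field(default_factory=dict)

class Pipeline:
    """Chains independent stages, mirroring the modular layout above."""
    def __init__(self) -> None:
        self.stages: list[Callable[[Document], Document]] = []

    def add_stage(self, stage: Callable[[Document], Document]) -> "Pipeline":
        self.stages.append(stage)
        return self

    def run(self, doc: Document) -> Document:
        for stage in self.stages:
            doc = stage(doc)
        return doc

# Hypothetical stages standing in for OCR cleanup and classification.
def normalize(doc: Document) -> Document:
    doc.text = " ".join(doc.text.split())
    return doc

def tag_type(doc: Document) -> Document:
    doc.metadata["type"] = "invoice" if "invoice" in doc.text.lower() else "other"
    return doc

pipeline = Pipeline().add_stage(normalize).add_stage(tag_type)
result = pipeline.run(Document(text="  INVOICE   n. 42  "))
```

Keeping each stage a plain callable is what makes the architecture testable in isolation and easy to swap, which is the scalability property the group prototype is meant to validate.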
Deployment of advanced OCR models such as Tesseract and LayoutLM to extract text, tables, and images from complex documents; batch processing of scanned PDFs and other formats; exercises on real enterprise datasets with automatic data cleaning; integration into a modular pipeline; production of structured datasets ready for AI analysis; accuracy evaluation with custom metrics.
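The cleaning and custom-metric ideas from this module can be illustrated without the OCR engines themselves. This stdlib-only sketch assumes hypothetical helpers `clean_ocr_text` and `char_accuracy`; in the course, the raw input would come from Tesseract or LayoutLM:

```python
import difflib
import re

def clean_ocr_text(raw: str) -> str:
    """Drop non-printable artifacts typical of scanned PDFs, then collapse whitespace."""
    text = re.sub(r"[^\x20-\x7E\n\u00C0-\u017F]", "", raw)
    return re.sub(r"[ \t]+", " ", text).strip()

def char_accuracy(predicted: str, reference: str) -> float:
    """A simple custom metric: character-level similarity in [0, 1]."""
    return difflib.SequenceMatcher(None, predicted, reference).ratio()

# Simulated OCR output with a form-feed artifact and irregular spacing.
raw = "Invoice\x0c  No.  1042\n Total:   250.00  EUR"
cleaned = clean_ocr_text(raw)
score = char_accuracy(cleaned, "Invoice No. 1042\nTotal: 250.00 EUR")
```

Tracking such a metric per dataset is what lets the pipeline flag documents whose extraction quality falls below a usable threshold.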
Development of semantic analysis modules with BERT and transformers for document classification and named entity recognition; combination with YOLO for visual element detection; practical exercises on legal contracts and financial reports; fine-tuning of pre-trained models via Hugging Face; enrichment of the pipeline with confidence scoring; automated report generation and visualization of insights for rapid business decisions.
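The confidence-scoring step mentioned above can be sketched independently of the models producing the scores. The `Entity` class, the labels, and the 0.8 threshold are assumptions for illustration; in practice the scores would come from the fine-tuned transformer:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Entity:
    label: str
    text: str
    confidence: float  # model score in [0, 1]

def filter_by_confidence(entities, threshold=0.8):
    """Split predictions: confident ones flow downstream automatically,
    the rest are routed to human review."""
    accepted = [e for e in entities if e.confidence >= threshold]
    review = [e for e in entities if e.confidence < threshold]
    return accepted, review

# Hypothetical NER output for a legal contract.
preds = [
    Entity("PARTY", "Acme Corp", 0.97),
    Entity("DATE", "2026-07-01", 0.91),
    Entity("AMOUNT", "12,500 EUR", 0.64),
]
accepted, review = filter_by_confidence(preds)
```

This accept/review split is the mechanism that makes automated report generation safe for business decisions: only high-confidence extractions are used without a human in the loop.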
Configuration of orchestrators such as Apache Airflow and Kubernetes for distributed pipelines; migration to AWS SageMaker or GCP Vertex AI in the cloud; management of asynchronous and parallel workflows on massive volumes; load-peak simulation exercises with real-time monitoring; cost and resource optimization; end-to-end integration tests on the capstone project; deployment documentation for DevOps teams.
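The dependency-driven scheduling that Airflow performs can be sketched with the standard library's `graphlib`. The task graph below is a hypothetical Document AI DAG, not the course material; each "wave" groups tasks that could run in parallel:

```python
from graphlib import TopologicalSorter

# Hypothetical task graph: each key runs after its listed upstream tasks,
# mirroring how an Airflow DAG declares dependencies.
dag = {
    "ingest": set(),
    "ocr": {"ingest"},
    "nlp": {"ocr"},
    "vision": {"ocr"},
    "report": {"nlp", "vision"},
}

ts = TopologicalSorter(dag)
ts.prepare()
waves = []
while ts.is_active():
    ready = sorted(ts.get_ready())  # tasks whose dependencies are all done
    waves.append(ready)
    ts.done(*ready)
```

The third wave contains both `nlp` and `vision`, showing where an orchestrator would fan work out across workers before joining at `report`.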
Production deployment with GitHub Actions CI/CD; advanced monitoring via Prometheus and the ELK stack for logs and alerts; performance optimization through model pruning and distillation; debugging exercises on real incidents; A/B testing of pipeline versions; securing of sensitive data in compliance with GDPR; finalization of the capstone project with an ROI report and a maintenance plan for immediate enterprise use.
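The A/B testing of pipeline versions reduces, at its simplest, to comparing a quality metric on the same labeled sample. The labels and predictions below are invented for illustration; a real comparison would also check statistical significance before promoting a version:

```python
def accuracy(predictions, labels):
    """Fraction of documents classified correctly."""
    correct = sum(p == l for p, l in zip(predictions, labels))
    return correct / len(labels)

# Hypothetical ground truth and outputs of two pipeline versions.
labels    = ["invoice", "contract", "invoice", "report", "invoice"]
version_a = ["invoice", "contract", "report",  "report", "invoice"]
version_b = ["invoice", "contract", "invoice", "report", "invoice"]

acc_a = accuracy(version_a, labels)
acc_b = accuracy(version_b, labels)
winner = "B" if acc_b > acc_a else "A"
```

Logging `acc_a` and `acc_b` per deployment is exactly the kind of signal a Prometheus dashboard would track to catch a regression before full rollout.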
Target audience
Data scientists, machine learning engineers, data engineers in enterprises seeking to upskill on AI pipelines
Prerequisites
Mastery of Python, ML frameworks (TensorFlow/PyTorch), NLP and computer vision