Founded by passionate advocates of learning and innovation, Learni set out to make professional training accessible to everyone, everywhere in the world. Our team works in major cities such as Paris, Lyon, and Marseille, as well as internationally, to support talent and organizations in developing their skills.
The Databricks Training - Mastering the Lakehouse for Data Pros training is delivered in person or remotely (blended learning, e-learning, virtual classroom, or live remote sessions). At Learni, a Qualiopi-certified training organization, each program is designed to maximize skills acquisition regardless of the chosen training mode.
The trainer alternates demonstrative, questioning, and active methods (practical exercises and/or real-world scenarios). This pedagogical approach ensures concrete learning that is directly applicable in the workplace.
To ensure the quality of the Databricks Training - Mastering the Lakehouse for Data Pros training, Learni provides the following teaching resources:
For in-house training held at a site outside Learni's premises, the client commits to providing all teaching materials needed (IT equipment, internet connection...) for the training to run properly, in accordance with the prerequisites stated in the training program provided.
The assessment of skills acquired during the Databricks Training - Mastering the Lakehouse for Data Pros training is carried out through:
Learni is committed to the accessibility of its professional training programs. All our training programs are accessible to people with disabilities. Our teams are available to adapt teaching methods to your specific needs. Do not hesitate to contact us for any accommodation request.
Learni training programs are available for inter-company and intra-company settings, both in-person and remote. Registration is possible up to 48 business hours before the start of training. Our programs are eligible for OPCO, Pôle emploi, and FNE-Formation funding. Contact us to discuss your training project and funding possibilities.
Discovery of the Databricks platform: creation of a professional workspace account; configuration of initial Spark clusters for fast data processing; hands-on work in interactive notebooks with Python and SQL; practical exercises importing real enterprise datasets; simple exploratory analyses with built-in visualizations; production of a first basic dashboard to validate immediate learning outcomes.
Deep dive into Databricks notebooks for distributed querying with Spark SQL: DataFrame manipulation via the pyspark Python API; exercises on cleaning and transforming large datasets drawn from concrete business cases; use of magic commands for seamless execution; real-time collaboration on shared notebooks; generation of team-ready interactive reports; skill consolidation through a personal ongoing project.
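As a minimal local illustration of the grouped-aggregation logic covered in this module, the stdlib sketch below mimics what a pyspark `df.groupBy("region").agg(sum("amount"))` would compute on a cluster. The column names and sample rows are invented for illustration; in Databricks the data would live in a Spark DataFrame.

```python
from collections import defaultdict

# Hypothetical sales rows; in Databricks these would be a Spark DataFrame.
rows = [
    {"region": "North", "amount": 120.0},
    {"region": "South", "amount": 80.0},
    {"region": "North", "amount": 50.0},
]

def total_by_region(rows):
    """Plain-Python analogue of df.groupBy("region").agg(F.sum("amount"))."""
    totals = defaultdict(float)
    for row in rows:
        totals[row["region"]] += row["amount"]
    return dict(totals)

print(total_by_region(rows))  # {'North': 170.0, 'South': 80.0}
```

The point of the exercise is the shape of the computation (partition by key, then aggregate), which transfers directly to the pyspark API once a cluster is available.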
Intensive work on the Spark engine in Databricks for scalable ETL: advanced DataFrame transformations with aggregations and window functions; practical exercises building automated data pipelines for terabyte-scale volumes; cluster tuning for optimal performance; enterprise use cases such as log and sales processing; development of scheduled batch jobs; delivery of reusable scripts tested in real-world conditions.
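The window functions mentioned in this module can be sketched without Spark. The stdlib example below computes a running total per store ordered by date, the same logic as pyspark's `F.sum("amount").over(Window.partitionBy("store").orderBy("date"))`; the store names, dates, and amounts are hypothetical.

```python
from itertools import groupby
from operator import itemgetter

# Hypothetical daily sales rows: (store, date, amount) — illustrative only.
sales = [
    ("storeA", "2026-01-01", 10),
    ("storeA", "2026-01-02", 15),
    ("storeB", "2026-01-01", 7),
]

def running_totals(rows):
    """Running sum per store, ordered by date — a local stand-in for a
    Spark window aggregation partitioned by store."""
    out = []
    for store, group in groupby(sorted(rows), key=itemgetter(0)):
        total = 0
        for _, date, amount in group:
            total += amount
            out.append((store, date, total))
    return out

print(running_totals(sales))
```

Unlike a plain `groupBy`, a window function keeps one output row per input row, which is why the sketch appends a result inside the inner loop rather than once per store.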
Mastery of Delta Lake for transactional data lakes on Databricks: implementation of ACID tables with merge and upsert operations on live datasets; time-travel exercises for audits and rollbacks; storage optimization with Z-Ordering and compaction; integration with Unity Catalog for secure governance; hands-on projects with sensitive enterprise data; production of reliable pipelines with built-in monitoring for immediate ROI.
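The merge/upsert and time-travel semantics covered in this module can be illustrated with a toy in-memory "table". This sketch is not the Delta Lake API: it only mirrors the `WHEN MATCHED THEN UPDATE / WHEN NOT MATCHED THEN INSERT` behavior of a `MERGE INTO`, and keeps versioned snapshots to imitate time travel. All keys and values are invented.

```python
import copy

# Toy stand-in for a Delta table keyed by id — illustrative only.
target = {
    1: {"id": 1, "name": "Alice", "score": 10},
    2: {"id": 2, "name": "Bob", "score": 20},
}

updates = [
    {"id": 2, "name": "Bob", "score": 25},    # matched -> update
    {"id": 3, "name": "Carol", "score": 30},  # not matched -> insert
]

def merge_upsert(table, rows):
    """MERGE semantics: update when the key exists, insert otherwise."""
    for row in rows:
        table[row["id"]] = row
    return table

# Snapshots imitate Delta's versioned history ("time travel" reads).
history = [copy.deepcopy(target)]          # version 0
merge_upsert(target, updates)
history.append(copy.deepcopy(target))      # version 1

print(history[0][2]["score"], target[2]["score"])  # 20 25
```

In real Delta Lake the history is maintained by the transaction log, not by the caller, but the observable behavior (old versions stay readable after an upsert) is the same idea.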
Wrap-up with Databricks workflows: end-to-end automation with the Jobs scheduler; introduction to MLflow for tracking simple models; exercises deploying complete Lakehouse pipelines; monitoring of performance and cluster costs; business cases such as sales predictions at scale; collaborative code review and live debugging; final presentation of the ongoing project to validate production-ready professional skills.
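To show what "tracking simple models" means in practice, here is a minimal sketch of run tracking in plain Python. It is a toy stand-in for MLflow's param/metric logging, not the MLflow API itself; the run name, parameter, and metric are hypothetical.

```python
import time

class RunTracker:
    """Toy stand-in for MLflow-style tracking: each run records its
    parameters and metrics so experiments stay comparable."""

    def __init__(self):
        self.runs = []

    def start_run(self, name):
        run = {"name": name, "params": {}, "metrics": {}, "start": time.time()}
        self.runs.append(run)
        return run

    def log_param(self, run, key, value):
        run["params"][key] = value

    def log_metric(self, run, key, value):
        run["metrics"][key] = value

tracker = RunTracker()
run = tracker.start_run("sales_forecast_v1")  # hypothetical run name
tracker.log_param(run, "max_depth", 5)        # hypothetical hyperparameter
tracker.log_metric(run, "rmse", 0.42)         # hypothetical evaluation metric
print(run["name"], run["params"], run["metrics"])
```

The design choice worth noting is that runs are append-only records: comparing candidate models becomes a lookup over `tracker.runs` rather than digging through notebook output.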
Target audience
Data analysts, beginner data engineers, IT managers for big data upskilling
Prerequisites
Basic knowledge of Python and SQL, familiarity with structured data concepts





























