Founded by passionate advocates of learning and innovation, Learni set out to make professional training accessible to everyone, everywhere. Our teams work in major cities such as Paris, Lyon, and Marseille, as well as internationally, supporting individuals and organizations in developing their skills.
The Kafka Connect - Integrating Real-Time Data Pipelines training is delivered in person or remotely (blended learning, e-learning, virtual classroom, remote instructor-led). At Learni, a Qualiopi-certified training organization, each program is designed to maximize skills acquisition, regardless of the training format chosen.
The trainer alternates between demonstrative, interrogative, and active methods (through practical exercises and/or real-world scenarios). This pedagogical approach ensures concrete and directly applicable learning in the workplace.
To ensure the quality of the Kafka Connect - Integrating Real-Time Data Pipelines training, Learni provides the following teaching resources:
For in-house training held at a location outside Learni's premises, the client commits to providing all the teaching materials required (IT equipment, internet connection, etc.) for the proper delivery of the training, in accordance with the prerequisites stated in the training program provided.
The assessment of skills acquired during the Kafka Connect - Integrating Real-Time Data Pipelines training is carried out through:
Learni is committed to the accessibility of its professional training programs. All our training programs are accessible to people with disabilities. Our teams are available to adapt teaching methods to your specific needs. Do not hesitate to contact us for any accommodation request.
Learni training programs are available for inter-company and intra-company settings, both in-person and remote. Registration is possible up to 48 business hours before the start of training. Our programs are eligible for OPCO, Pôle emploi, and FNE-Formation funding. Contact us to discuss your training project and funding possibilities.
- Quick installation of Kafka Connect in standalone and distributed modes with Docker
- Hands-on work with source connectors such as JDBCSource to extract enterprise SQL data
- Practical exercises on ingesting application logs into Kafka topics
- Configuration of FileSink and KafkaSink to export to NoSQL databases
- End-to-end tests with Avro schema validation
- Debugging via the REST API and logs
- Creation of a first operational pipeline and group sharing of deliverables
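A JDBC source connector like the one practiced in this module is typically declared as a JSON payload sent to the Connect worker's REST API (`POST /connectors`, port 8083 by default). The sketch below assumes Confluent's kafka-connect-jdbc plugin is installed; the connection details, table name, and topic prefix are placeholders:

```json
{
  "name": "jdbc-orders-source",
  "config": {
    "connector.class": "io.confluent.connect.jdbc.JdbcSourceConnector",
    "connection.url": "jdbc:postgresql://db:5432/shop",
    "connection.user": "connect",
    "connection.password": "secret",
    "mode": "incrementing",
    "incrementing.column.name": "id",
    "table.whitelist": "orders",
    "topic.prefix": "jdbc-"
  }
}
```

With `mode=incrementing`, the connector polls the `orders` table and publishes each new row (tracked via the `id` column) to the `jdbc-orders` topic.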
- Exploration of Single Message Transforms (SMTs) to filter, enrich, and route data in real time
- Practical implementation of SMTs such as ReplaceField and TimestampConverter on real enterprise data streams
- Integration of the Debezium connector for CDC from MySQL/PostgreSQL
- Exercises on complex transformations with regex and HoistField
- Error handling and dead letter queues
- Offset optimization for incident recovery
- Production of tested, transformed pipelines using tools like kcat
- Focus on immediate business value
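The SMT chain and dead letter queue covered in this module might look like the following fragment of a connector configuration. The field and topic names are illustrative; note that the `errors.deadletterqueue.*` settings apply to sink connectors:

```json
{
  "transforms": "dropPii,formatTs",
  "transforms.dropPii.type": "org.apache.kafka.connect.transforms.ReplaceField$Value",
  "transforms.dropPii.exclude": "ssn,email",
  "transforms.formatTs.type": "org.apache.kafka.connect.transforms.TimestampConverter$Value",
  "transforms.formatTs.field": "created_at",
  "transforms.formatTs.target.type": "string",
  "transforms.formatTs.format": "yyyy-MM-dd'T'HH:mm:ss",
  "errors.tolerance": "all",
  "errors.deadletterqueue.topic.name": "dlq-orders",
  "errors.deadletterqueue.context.headers.enable": "true"
}
```

Transforms run in the order listed in `transforms`: here each record first has sensitive fields removed, then its timestamp reformatted; records that still fail are routed to the `dlq-orders` topic instead of stopping the task.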
- Deployment in a Kubernetes cluster with horizontal scaling of Kafka Connect workers
- Advanced configuration for high availability and fault tolerance
- Monitoring setup with Prometheus/Grafana and alerting on latency and throughput
- Development of a custom Java connector for proprietary business APIs
- Exercises on offset management and exactly-once semantics
- Real enterprise case study: migrating a legacy system to streaming
- Code review and presentation of the capstone project
- Post-training implementation plan
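A minimal sketch of the Kubernetes deployment described above, assuming Confluent's cp-kafka-connect image; the replica count, image tag, broker address, and topic names are illustrative placeholders:

```yaml
apiVersion: apps/v1
kind: Deployment
metadata:
  name: kafka-connect
spec:
  replicas: 3                 # workers join one group; tasks rebalance across them
  selector:
    matchLabels:
      app: kafka-connect
  template:
    metadata:
      labels:
        app: kafka-connect
    spec:
      containers:
        - name: connect
          image: confluentinc/cp-kafka-connect:7.6.0   # example image/tag
          env:
            - name: CONNECT_BOOTSTRAP_SERVERS
              value: kafka:9092
            - name: CONNECT_GROUP_ID
              value: connect-cluster       # same group id = same distributed cluster
            - name: CONNECT_CONFIG_STORAGE_TOPIC
              value: connect-configs
            - name: CONNECT_OFFSET_STORAGE_TOPIC
              value: connect-offsets
            - name: CONNECT_STATUS_STORAGE_TOPIC
              value: connect-status
          ports:
            - containerPort: 8083          # Connect REST API
```

Because distributed-mode workers store configs, offsets, and status in Kafka topics rather than on local disk, scaling is a matter of changing `replicas`; a lost worker's tasks are reassigned to the survivors.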
Target audience
Data engineers, DevOps engineers, and Big Data architects aiming to upskill on streaming data flows
Prerequisites
Knowledge of Apache Kafka, basic Java, and JSON/Avro formats