Founded by passionate advocates of learning and innovation, Learni set out to make professional training accessible to everyone, everywhere. Our teams work in major French cities such as Paris, Lyon, and Marseille, as well as internationally, supporting individuals and organizations in developing their skills.
The Apache Kafka - Managing Massive Real-Time Data Streams training is delivered in person or remotely (blended learning, e-learning, virtual classroom, live remote sessions). At Learni, a Qualiopi-certified training organization, each program is designed to maximize skills acquisition, whichever delivery mode you choose.
The trainer alternates between demonstrative, interrogative, and active methods (through practical exercises and/or real-world scenarios). This pedagogical approach ensures concrete and directly applicable learning in the workplace.
To ensure the quality of the Apache Kafka - Managing Massive Real-Time Data Streams training, Learni provides the following teaching resources:
For in-house training delivered at a location external to Learni, the client is responsible for providing all teaching resources (IT equipment, internet connection, etc.) needed for the proper delivery of the training, in accordance with the prerequisites stated in the training program provided.
Skills acquired during the Apache Kafka - Managing Massive Real-Time Data Streams training are assessed through:
Learni is committed to the accessibility of its professional training programs. All our training programs are accessible to people with disabilities. Our teams are available to adapt teaching methods to your specific needs. Do not hesitate to contact us for any accommodation request.
Learni training programs are available for inter-company and intra-company settings, both in-person and remote. Registration is possible up to 48 business hours before the start of training. Our programs are eligible for OPCO, Pôle emploi, and FNE-Formation funding. Contact us to discuss your training project and funding possibilities.
Dive into advanced Kafka cluster administration:
- Multi-broker cluster configuration with ZooKeeper or KRaft
- Optimized topic partitioning for horizontal scaling
- Hands-on exercises on creating high-availability topics
- Failure simulations to test synchronous/asynchronous replication
- Production of a documented, enterprise-ready cluster schema
- Personalized trainer feedback on your configurations
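The per-key partitioning that underpins horizontal scaling can be sketched in plain Python. This is a toy stand-in (Kafka's default partitioner actually hashes keys with murmur2); it only illustrates the invariant that equal keys always map to the same partition:

```python
import zlib

def pick_partition(key: bytes, num_partitions: int) -> int:
    """Deterministically map a message key to a partition.

    CRC32 stands in for Kafka's murmur2 hash: the point is that
    equal keys always land on the same partition, which preserves
    per-key ordering as the topic scales out across brokers.
    """
    return zlib.crc32(key) % num_partitions

# Messages sharing a key stay on one partition, so consumer-side
# ordering per key survives even with many partitions.
orders = [b"customer-42", b"customer-7", b"customer-42"]
partitions = [pick_partition(k, num_partitions=6) for k in orders]
assert partitions[0] == partitions[2]  # same key, same partition
```

Keyless messages, by contrast, are spread round-robin (sticky batching in recent Kafka versions), trading ordering for balance.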
Develop robust, high-throughput producers and consumers:
- Idempotent and transactional producers with Snappy/LZ4 compression
- Offset management for exactly-once semantics
- Rebalance-aware consumers with consumer-group management
- Hands-on workshops on simulated streams of 1M messages/second
- Parameter tuning to minimize latency
- Creation of a scalable consumer group with Prometheus metrics
Leverage these skills immediately in professional projects.
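The broker-side deduplication behind idempotent producers can be illustrated with a toy model. This is a deliberate simplification: real Kafka tracks sequence numbers per producer id and partition, and you enable the behavior with enable.idempotence=true on the producer.

```python
class Broker:
    """Toy model of Kafka's idempotent-producer dedup: the broker
    remembers the last accepted sequence number per producer and
    silently drops retried duplicates."""

    def __init__(self):
        self.log = []
        self.last_seq = {}  # producer_id -> last accepted sequence

    def append(self, producer_id: str, seq: int, msg: str) -> bool:
        if self.last_seq.get(producer_id, -1) >= seq:
            return False  # duplicate retry: rejected, log unchanged
        self.last_seq[producer_id] = seq
        self.log.append(msg)
        return True

broker = Broker()
broker.append("p1", 0, "order-created")
broker.append("p1", 1, "order-paid")
broker.append("p1", 1, "order-paid")  # network retry of the same send
assert broker.log == ["order-created", "order-paid"]
```

This is why a producer can safely retry on timeout without duplicating writes, which is the foundation exactly-once transactions build on.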
Master stream processing in real time:
- Kafka Streams API for stateful applications
- Topologies with windowing and aggregations on continuous streams
- KSQL integration for SQL queries on streams
- Exercises on real-time e-commerce cases such as fraud detection
- Deployment of Streams applications in Docker containers
- Production of a stream-processing microservice prototype
Turn your data into actionable business insights.
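A tumbling-window count, the building block of the fraud-detection exercise, can be sketched in plain Python as an analogue of Kafka Streams' windowed aggregations. The event shapes and threshold below are illustrative, not part of any Kafka API:

```python
from collections import defaultdict

def tumbling_window_counts(events, window_ms):
    """Count events per (window, key) over fixed, non-overlapping
    windows, like a windowedBy(...).count() in Kafka Streams.

    events: iterable of (timestamp_ms, key) pairs.
    """
    counts = defaultdict(int)
    for ts, key in events:
        window_start = (ts // window_ms) * window_ms  # align to window
        counts[(window_start, key)] += 1
    return dict(counts)

# Three uses of the same card inside one minute looks suspicious.
events = [(1_000, "card-A"), (30_000, "card-A"),
          (59_000, "card-A"), (61_000, "card-B")]
counts = tumbling_window_counts(events, window_ms=60_000)
flagged = [key for (win, key), n in counts.items() if n >= 3]
assert flagged == ["card-A"]
```

Kafka Streams adds what this sketch omits: fault-tolerant state stores, late-arrival handling via grace periods, and incremental emission of updated counts.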
Secure and supervise your clusters:
- Kafka security with SASL/SCRAM and TLS for in-transit encryption
- Granular ACL setup for producers and consumers
- Advanced monitoring with Kafka Manager and Grafana
- Workshops on automatic partition rebalancing with Cruise Control
- Attack simulations to test your defenses
- Delivery of a complete supervision dashboard
Protect your critical pipelines and strengthen their reliability.
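Kafka's authorization model checks (principal, operation, resource) tuples against ACLs, denying by default when allow.everyone.if.no.acl.found=false is set on the broker. A minimal sketch of that lookup, with made-up principals and topic names:

```python
# Toy ACL table mirroring Kafka's (principal, operation, resource) model.
# Principals and topics here are invented for illustration.
ACLS = {
    ("User:payments-svc", "WRITE", "topic:payments"),
    ("User:fraud-svc", "READ", "topic:payments"),
}

def is_authorized(principal: str, operation: str, resource: str) -> bool:
    """Deny by default, like allow.everyone.if.no.acl.found=false."""
    return (principal, operation, resource) in ACLS

assert is_authorized("User:fraud-svc", "READ", "topic:payments")
assert not is_authorized("User:fraud-svc", "WRITE", "topic:payments")
```

The real authorizer also supports wildcard and prefixed resource patterns and explicit DENY rules that override ALLOW, which this sketch leaves out.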
Integrate Kafka with the wider data ecosystem:
- Spark, Flink, and Elasticsearch integration via Kafka Connect
- Deployment on Kubernetes with the Strimzi Operator
- Optimization for the cloud (AWS MSK) or on-premise
- Real-world enterprise real-time ETL cases
- Disaster-recovery and blue-green deployment exercises
- Finalization of the capstone project with GitHub Actions CI/CD
Leave with a concrete portfolio demonstrating your advanced Kafka skills.
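Registering a connector through Kafka Connect's REST API amounts to POSTing a JSON config. A minimal sketch, assuming the Confluent Elasticsearch sink connector; the connector name, topic, and URL are placeholders:

```python
import json

# Hypothetical sink registration: the class and config keys follow the
# Confluent Elasticsearch sink connector; name, topic, and URL are made up.
connector = {
    "name": "orders-es-sink",
    "config": {
        "connector.class": "io.confluent.connect.elasticsearch.ElasticsearchSinkConnector",
        "topics": "orders",                          # source topic to index
        "connection.url": "http://elasticsearch:9200",
        "tasks.max": "2",                            # parallelism
        "key.ignore": "true",
    },
}
payload = json.dumps(connector)
# In practice this payload is POSTed to the Connect REST API, e.g.:
# requests.post("http://connect:8083/connectors", json=connector)
```

Connect then spawns the requested tasks and handles offsets, retries, and scaling, so no bespoke consumer code is needed to move data into Elasticsearch.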
Target audience
Data engineers, big data architects, and DevOps professionals seeking to upskill on data pipelines
Prerequisites
Experience in Java or Python, basic knowledge of Apache Kafka, and familiarity with distributed systems and Linux