Kafka Confluent Developer

Crescendo Global

2 - 5 years

Gurugram

Posted: 28/02/2026

Job Description

Senior Software Engineer (Confluent and Kafka)


Experience: 6+ Years (3+ Years in Confluent and Kafka)

Location: Gurugram

Employment Type: Full-Time

Function: Integration / Engineering

Notice Period: Immediate to 30 days


Role Purpose

We are seeking a Senior Software Engineer with strong hands-on experience in Confluent Platform, Apache Kafka, and Apache Flink to support the evolution of enterprise streaming capabilities.

This role sits within the Integration function and focuses on enabling real-time data, event-driven architecture, and high-performance integrations across the organization.

This is an exciting opportunity to join a technology transformation journey focused on building a consolidated cloud-based platform using Agile and DevOps practices.

You may be required to provide out-of-hours standby support to ensure timely incident resolution and operational continuity.


Key Accountabilities

Design, develop, test, and deploy event-driven services and streaming data pipelines

Develop Kafka topics, schemas, and streaming applications using Kafka, Kafka Connect, Schema Registry, and Flink

Collaborate with architects and platform teams to shape the event streaming roadmap

Provide subject-matter expertise in distributed streaming and event-driven architectures

Review streaming applications to ensure quality, scalability, and best practices

Troubleshoot production issues and conduct root cause analysis

Promote platform standards, governance models, and reusable patterns

Collaborate with vendor teams and professional services partners

Actively participate in Agile ceremonies and technical discussions


Functional / Technical Skills

6+ years of software engineering experience

3+ years of hands-on experience with Confluent Platform / Apache Kafka

Experience designing and building distributed streaming applications

Strong understanding of event-driven architecture (event streaming, event sourcing, Pub/Sub, stream processing)

Experience with serialization formats (Avro, JSON, Protobuf) and Schema Registry

Integration experience with SQL/NoSQL databases, APIs, and cloud data platforms

Familiarity with API design, data modelling, and microservice integration patterns

Proficiency with Git, Jira, Azure DevOps

Experience with Docker and Kubernetes

Working knowledge of AWS and/or Azure

Strong understanding of clean code, reusable component design, and Agile/DevOps practices

Knowledge of SDLC methodologies (Agile and Waterfall)
