Kafka Developer

Tata Consultancy Services

6 - 10 years

Bengaluru

Posted: 08/01/2026

Job Description

Dear Candidates,

Greetings from TCS!

TCS is looking for a Kafka Developer.

Experience: 6-10 years

Location: Hyderabad/Pune/Bangalore/Chennai/Kolkata


Required Technical Skill Set: Apache Kafka development


Job description: We are seeking a skilled and motivated Mid-Level Kafka Developer to join our team. The ideal candidate will have a solid background in building scalable, high-throughput data pipelines and real-time streaming solutions using Apache Kafka. You will work closely with architects, engineers, and product managers to implement event-driven architectures and integrate Kafka with various systems.


Key Responsibilities:

  • Design, develop, and maintain real-time streaming solutions using Apache Kafka.
  • Implement Kafka producers, consumers, topics, and stream processing logic.
  • Develop and maintain Kafka Connectors and Kafka Streams applications.
  • Optimize Kafka clusters for performance, scalability, and reliability.
  • Collaborate with DevOps teams for Kafka deployment, monitoring, and alerting.
  • Integrate Kafka with data sources like databases, microservices, and cloud services.
  • Participate in code reviews, design discussions, and sprint planning.
  • Troubleshoot and resolve issues related to message delivery, data loss, and system failures.


Primary skills:

  • 6+ years of software development experience.
  • Hands-on experience with Apache Kafka.
  • Strong knowledge of Kafka architecture (brokers, ZooKeeper, partitions, offsets, replication).
  • Experience with Kafka Streams, Kafka Connect, or ksqlDB.
  • Proficiency in Java, Scala, or Python.
  • Understanding of distributed systems and real-time data processing.
  • Experience with RESTful APIs and microservices architecture.


Good-to-have skills:

  • Experience with Kafka on Confluent Platform or AWS MSK.
  • Familiarity with Schema Registry, Avro/Protobuf, and serialization formats.
  • Knowledge of containerization (Docker, Kubernetes).
  • Experience integrating Kafka with Big Data technologies (e.g., Hadoop, Spark, Flink).
  • Familiarity with monitoring tools such as Prometheus, Grafana, Kafka Manager, or Confluent Control Center.
