
Confluent Kafka Developer

Nice Software Solutions Pvt. Ltd.

2-5 years

Pune

Posted: 12/02/2026


Job Description

Kafka/Confluent Developer (Banking Domain)

Location: Pune/Nagpur

Experience: 5+ Years


About the Role

We are seeking a highly skilled Kafka/Confluent Developer to design, build, and optimize real-time data integration and streaming solutions across our banking systems. This role requires strong expertise in Java, Confluent Kafka, and Kafka Connect APIs to ensure high-throughput, low-latency event-driven architectures that support mission-critical banking use cases.


Key Responsibilities

Develop and maintain event-driven architectures using Confluent Kafka for real-time integration across core banking, CRM, fraud detection, and compliance systems.

Design and implement Kafka producers and consumers to handle high-volume, low-latency banking transactions.
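For illustration, a producer configured for this kind of transaction flow might look like the sketch below. The broker address and tuning values are assumptions, not requirements of the role:

```java
import java.util.Properties;

import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.common.serialization.StringSerializer;

// Illustrative producer settings for high-volume, low-latency transaction
// publishing; broker address and tuning values are assumptions.
public class TxnProducerConfig {
    public static Properties producerProps() {
        Properties props = new Properties();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // assumed broker
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        props.put(ProducerConfig.ACKS_CONFIG, "all");                 // full durability for banking payloads
        props.put(ProducerConfig.ENABLE_IDEMPOTENCE_CONFIG, "true");  // no duplicate sends on retry
        props.put(ProducerConfig.LINGER_MS_CONFIG, "5");              // small batching window to keep latency low
        props.put(ProducerConfig.COMPRESSION_TYPE_CONFIG, "lz4");
        return props;
    }
}
```

The combination of `acks=all` and idempotence trades a little latency for exactly-once delivery semantics on the producer side, which is usually the right default for financial transactions.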

Build reusable streaming components using Kafka Streams and ksqlDB for fraud detection, customer notifications, and operational alerts.
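As a minimal sketch of such a streaming component, the topology below routes unusually large transactions to an alert topic. The topic names and the flat threshold are assumptions for illustration, not a real detection rule:

```java
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.Topology;
import org.apache.kafka.streams.kstream.KStream;

// Illustrative fraud-alert topology; topic names and the threshold
// are placeholders, not a production detection rule.
public class FraudTopology {
    public static Topology build() {
        StreamsBuilder builder = new StreamsBuilder();
        // key: account id, value: transaction amount
        KStream<String, Double> txns = builder.stream("transactions");
        txns.filter((account, amount) -> amount != null && amount > 10_000.0) // flag large transactions
            .to("fraud-alerts");
        return builder.build();
    }
}
```

Real deployments would typically replace the flat threshold with windowed aggregations or per-account baselines, but the topology shape stays the same.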

Collaborate with the Data Governance team to ensure data lineage, quality, and metadata standards are upheld.

Enforce schema evolution best practices with Confluent Schema Registry to manage compatibility across applications.

Develop custom Kafka Connectors (Source/Sink) and implement robust error handling, retries, and logging.
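The retry handling mentioned above often amounts to wrapping a connector's external calls in a backoff loop. A generic sketch (names and limits are illustrative):

```java
import java.util.concurrent.Callable;

// Generic retry-with-backoff helper of the kind often wrapped around a
// connector's calls to external systems; names and limits are illustrative.
public class RetryingCall {
    public static <T> T withRetries(Callable<T> call, int maxAttempts, long initialBackoffMs)
            throws Exception {
        Exception last = null;
        long backoff = initialBackoffMs;
        for (int attempt = 1; attempt <= maxAttempts; attempt++) {
            try {
                return call.call();
            } catch (Exception e) {
                last = e; // in a real connector, log attempt number, cause, and task id here
                if (attempt < maxAttempts) {
                    Thread.sleep(backoff);
                    backoff *= 2; // exponential backoff between attempts
                }
            }
        }
        throw last; // exhausted retries: surface the final failure to the framework
    }
}
```

In Kafka Connect itself, transient failures can also be deferred by throwing `RetriableException` from a task, letting the framework re-invoke it; the helper above is for finer-grained control inside a single poll or put cycle.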

Integrate Kafka with external systems such as databases, REST APIs, and SOAP services.

Work with DevOps, cybersecurity, and platform teams to ensure seamless deployment, monitoring, and security compliance.

Partner with business units (Retail, Islamic Finance, Risk, Compliance) to gather requirements and translate them into scalable Kafka-based solutions.

Support data platform architects and project managers with integration roadmaps and impact assessments.

Enable real-time use cases such as customer onboarding status, transaction streaming, digital engagement analytics, and branch performance monitoring.


Required Skills & Experience

Strong Java proficiency (must-have).

Hands-on experience with Confluent Kafka and Kafka Connect APIs (Source/Sink connectors, Task interfaces).

Deep understanding of Kafka topics, partitions, offset management, and replication.

Proficiency in schema handling (SchemaBuilder, Struct, schema evolution).
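For reference, the Connect data API usage this bullet refers to looks roughly like the following; field names and the schema namespace are assumptions. Marking later additions optional is what keeps older consumers compatible as the schema evolves:

```java
import org.apache.kafka.connect.data.Schema;
import org.apache.kafka.connect.data.SchemaBuilder;
import org.apache.kafka.connect.data.Struct;

// Illustrative Connect schema for a transaction record; field names and
// the namespace are assumptions for this sketch.
public class TxnSchema {
    public static final Schema TRANSACTION = SchemaBuilder.struct()
            .name("com.example.banking.Transaction") // assumed namespace
            .field("id", Schema.STRING_SCHEMA)
            .field("amount", Schema.FLOAT64_SCHEMA)
            // field added in a later version: optional, so old records still validate
            .field("channel", SchemaBuilder.string().optional().build())
            .build();

    public static Struct sample() {
        return new Struct(TRANSACTION)
                .put("id", "TXN-1001")
                .put("amount", 250.75);
    }
}
```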

Strong error handling, retry implementation, and connector debugging using Kafka Connect logs.

Experience integrating Kafka with databases, REST APIs, and SOAP services.

Proficiency in SQL for query optimization and data validation.

Knowledge of data governance, security standards, and compliance in banking/financial systems.

Familiarity with Kafka Streams and ksqlDB for real-time analytics.


Preferred Qualifications

Experience in banking, fintech, or financial services domain.

Exposure to cloud platforms (AWS, Azure, GCP) for Kafka deployment.

Familiarity with CI/CD pipelines and containerization (Docker/Kubernetes).

Strong collaboration skills with cross-functional teams (DevOps, Business Analytics, Risk, Compliance).
