Technical Lead - Data Engineering

Birlasoft

5 - 10 years

Pune

Posted: 23/11/2025

Job Description

Area(s) of responsibility

JD - AWS Data Platform Framework Development Engineer

Key Responsibilities
- Design and develop reusable and scalable data processing frameworks and libraries for data ingestion, processing, and ETL pipelines on AWS, alongside the platform development team.
- Collaborate closely with framework developers, data engineers, architects, and analysts to standardize data pipelines and processing patterns.
- Develop and enhance Debezium Kafka CDC pipeline frameworks to enable rapid instantiation of CDC data ingestion workflows.
- Build and maintain AWS Glue PySpark job frameworks aligned with medallion architecture principles.
- Implement and maintain ETL frameworks for loading data into Snowflake.
- Develop Infrastructure as Code using Terraform and GitHub to automate provisioning and deployment of platform components.
- Ensure platform reliability, scalability, and observability.
- Contribute to improving development standards, code reviews, and best practices focused on framework and platform engineering.

Required Skills & Experience
- Master's degree in software engineering, computer science, or an equivalent field.
- AWS certifications (Solutions Architect Associate, Developer Associate, Data Engineer Associate).
- Strong software engineering background with expertise in Python, especially PySpark.
- Experience with and a thorough understanding of Kafka and Kafka Connect concepts.
- Proven track record of developing reusable frameworks or libraries with a focus on scalability and maintainability.
- Sound understanding and practical application of OOP and SOLID principles (encapsulation, inheritance, polymorphism, abstraction).
- Hands-on experience with AWS services including Glue, ECS, S3, Kafka (including Debezium), and Snowflake.
- Experience building and orchestrating data pipelines using Airflow or similar tools.
- Proficiency in Infrastructure as Code using Terraform.
- Familiarity with CI/CD workflows using GitHub or similar platforms.
- Strong problem-solving skills and the ability to write clean, modular, and well-documented code.
- Excellent communication skills and the ability to work collaboratively in an international team of highly skilled IT engineers.

Preferred Qualifications
- Experience with Iceberg or other open table formats in a data lakehouse environment.
- Prior experience working on CDC (Change Data Capture) pipelines or Kafka streaming frameworks.
- Experience with big data processing frameworks is considered a plus.
- Understanding of medallion architecture and data lakehouse design patterns.
- Multiple years of application development experience with an OOP-native programming language such as Java, C++, or C#.

About Company

Birlasoft is a global IT services and consulting company that is part of the CK Birla Group. It specializes in digital transformation, enterprise application services, and IT modernization for industries such as manufacturing, life sciences, BFSI, and energy. Birlasoft is known for its strong capabilities in SAP, Oracle, cloud, and analytics, helping clients drive innovation, reduce costs, and improve agility.
