Back-End Engineer (Data Engineering)
The Glove
2 - 5 years
Bengaluru
Posted: 20/04/2026
Job Description
Job Title: Back-End Engineer (Data Engineering)
Location: Bangalore, India
Experience: 5+ Years
Employment Type: Full-Time
About the Role
We are looking for a skilled Back-End Engineer with strong expertise in data engineering to design, build, and maintain scalable backend systems and data pipelines.
In this role, you will work with large-scale datasets, distributed systems, and cloud technologies, collaborating closely with data scientists and engineering teams to deliver high-impact, data-driven solutions.
Key Responsibilities
- Design, develop, and maintain scalable data pipelines using Spark, Scala, Java, and Python
- Build efficient ETL workflows and write optimized SQL queries
- Work with DataFrames for large-scale data processing and analysis
- Develop and manage real-time streaming pipelines using Kafka/MSK
- Implement cloud-based data solutions (AWS or GCP)
- Manage data storage using S3 or similar object storage systems
- Work with data lake technologies (e.g., Iceberg)
- Ensure data quality, integrity, and security across systems
- Collaborate with cross-functional teams to understand and deliver data requirements
- Troubleshoot and optimize performance of data pipelines and backend systems
Required Skills & Expertise
Must-Have
- 5+ years of experience in backend/data engineering
- Strong proficiency in Spark, Python, Scala, and Java
- Expertise in SQL and relational databases
- Hands-on experience with DataFrames and large-scale data processing
- Experience with cloud platforms (AWS / GCP)
- Experience with Kafka/MSK for real-time data streaming
- Experience with S3 or similar object storage systems
- Understanding of data warehousing concepts
- Strong communication and collaboration skills
Good-to-Have
- Experience with data modeling and schema design
- Knowledge of data governance and data quality frameworks
- Exposure to DevOps practices and CI/CD pipelines
- Certifications such as AWS Data Engineer / GCP Data Engineer
Technical Environment
- Languages: Python, Scala, Java
- Frameworks: Apache Spark
- Streaming: Kafka / MSK
- Cloud: AWS / GCP
- Storage: S3, Data Lakes (Iceberg)
- Databases: SQL-based systems
What We're Looking For
- Strong analytical and problem-solving mindset
- Ability to work with large-scale distributed systems
- Experience in building reliable and scalable backend solutions
- Strong team collaboration and communication skills
Education
- Bachelor's degree in Computer Science, IT, or a related field (or equivalent experience)
Why Join
- Work on large-scale data systems and modern cloud platforms
- Opportunity to collaborate with data scientists and high-performing engineering teams
- Exposure to real-time data processing and advanced data lake technologies