
Senior Data Engineer

ImmersiveData.AI

5 - 10 years

Pune

Posted: 09/03/2026

Job Description: Senior Data Engineer / Technical Lead

Location: Pune, India (In-office)

Job Type: Full-Time

Experience Required: 5 Years

About ImmersiveData.ai

At ImmersiveData.ai, we go beyond traditional data transformation. We help organizations rethink and redefine their business models through advanced AI, intelligent automation, and modern data platforms. Our goal is to enable enterprises to unlock deeper insights, improve decision-making, and drive meaningful innovation using cutting-edge technologies.


Role Overview

  • Experience: 5 Years
  • Core Tech: Python, PySpark, Big Data (Hadoop/Spark), Cloud (AWS/Azure/GCP)
  • Leadership: Big Data Lead


Key Responsibilities

1. Architectural Design & Strategy

  • Design and implement end-to-end Data Pipelines (Batch & Streaming) that handle multi-terabyte datasets.
  • Architect scalable, cloud-native data lakes and warehouses on GCP, AWS, or Azure.
  • Select the right tools for the job, balancing performance, cost, and maintainability.

2. Advanced Data Engineering

  • Write expert-level Python and PySpark code to transform complex raw data into actionable insights.
  • Optimize Spark jobs for performance, addressing issues like data skew, shuffling, and memory management.
  • Implement robust ETL/ELT frameworks with a focus on Data Quality and Observability.
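One of the skew mitigations the role calls for is two-stage "salted" aggregation: a hot key is split across several salted sub-keys so no single Spark partition receives all of its records. Below is a minimal sketch of the idea in plain Python (in PySpark the same pattern appends a random salt column before the first groupBy); the function and key names are illustrative, not from the posting.

```python
import random
from collections import defaultdict

N_SALTS = 4  # sub-keys per hot key; a tuning assumption, not a fixed rule

def salted_count(records):
    """Count occurrences per key in two stages, as a salted Spark job would."""
    # Stage 1: partial counts on (key, salt) pairs -- mirrors the map-side
    # aggregation after a random salt is appended to each key.
    partial = defaultdict(int)
    for key in records:
        partial[(key, random.randrange(N_SALTS))] += 1
    # Stage 2: strip the salt and merge partials -- the final, cheap reduce.
    totals = defaultdict(int)
    for (key, _salt), cnt in partial.items():
        totals[key] += cnt
    return dict(totals)

data = ["user_a"] * 1000 + ["user_b"] * 3   # user_a is the hot key
print(salted_count(data))  # {'user_a': 1000, 'user_b': 3}
```

Because stage 2 sums every salted partial, the final counts are exact; only the intermediate distribution of work changes.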

3. Technical Leadership & Standards

  • Enforce Coding Standards: Lead code reviews and champion PEP 8, SOLID principles, and DRY methodologies.
  • Protocols & Security: Implement data governance, encryption at rest and in transit, and secure API protocols.
  • Mentorship: Guide junior and mid-level engineers, fostering a culture of technical excellence.

4. DevOps & Automation

  • Manage CI/CD pipelines for data projects using Jenkins, GitLab CI, or GitHub Actions.
  • Orchestrate workflows using tools like Apache Airflow, Prefect, or Dagster.
  • Ensure "Infrastructure as Code" (Terraform/CloudFormation) is integrated into the data lifecycle.
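At their core, orchestrators like Airflow, Prefect, and Dagster execute tasks in dependency order. The sketch below illustrates that idea with the standard library's `graphlib`; the task names and the `run_pipeline` helper are hypothetical, for illustration only, and stand in for a real DAG definition.

```python
from graphlib import TopologicalSorter

def run_pipeline(tasks, deps):
    """Run tasks in topological order.

    tasks: {name: callable}; deps: {name: set of upstream task names}.
    This mimics what an orchestrator does after parsing a DAG definition.
    """
    order = list(TopologicalSorter(deps).static_order())
    for name in order:
        tasks[name]()
    return order

executed = []
tasks = {
    "extract":   lambda: executed.append("extract"),
    "transform": lambda: executed.append("transform"),
    "load":      lambda: executed.append("load"),
}
# Each task lists its upstream dependencies, as in an Airflow DAG.
deps = {"extract": set(), "transform": {"extract"}, "load": {"transform"}}
order = run_pipeline(tasks, deps)
print(order)  # ['extract', 'transform', 'load']
```

A production orchestrator adds scheduling, retries, and observability on top of this ordering, but the dependency graph is the contract.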


Technical Skills Required

Category - Must-Have Expertise

Languages - Python (Expert), SQL (Advanced), Scala (Bonus)

Big Data - PySpark, Apache Spark, Hive, Delta Lake/Iceberg

Cloud - AWS (Glue, EMR, S3, Redshift) or Azure/GCP equivalent

Streaming - Kafka, Kinesis, or Spark Streaming

Databases - NoSQL (MongoDB, Cassandra), Graph DBs, and RDBMS

Tools - Airflow, Docker, Kubernetes, Terraform


Soft Skills & "The X-Factor"

  • Strong Communicator: Ability to explain complex architectural trade-offs to non-technical stakeholders.
  • Problem Solver: You don't just fix bugs; you identify systemic bottlenecks and re-engineer them.
  • Standard-Bearer: A passion for unit testing, integration testing, and documentation that actually stays up to date.
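The testing culture described above often starts with unit tests for small data-quality checks. A minimal sketch, assuming a hypothetical `no_nulls` helper (not a function named in the posting):

```python
def no_nulls(rows, column):
    """Return True if no row has a missing value in `column`."""
    return all(row.get(column) is not None for row in rows)

def test_no_nulls_detects_missing_values():
    # A test like this would normally run under pytest in CI.
    clean = [{"id": 1}, {"id": 2}]
    dirty = [{"id": 1}, {"id": None}]
    assert no_nulls(clean, "id")
    assert not no_nulls(dirty, "id")

test_no_nulls_detects_missing_values()
```

Checks of this shape are also the building blocks of the data-quality and observability frameworks mentioned under Key Responsibilities.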

