Data Engineer
Aarvian
2 - 5 years
Bengaluru
Posted: 21/03/2026
Job Description
Location: Pune / Chennai / Bangalore (Hybrid/On-site)
Experience: 4–6 Years
Role Type: Full-time
About the Role
Are you passionate about building large-scale data processing systems and modern Lakehouse architectures? We are looking for a high-caliber Data Engineer to join our growing team. You will sit at the intersection of AWS and Databricks, designing and optimizing scalable ETL/ELT pipelines that drive business-critical analytics.
If you have a deep understanding of Medallion Architecture, Unity Catalog, and serverless data orchestration, we want to hear from you.
The Tech Stack You'll Command
- Platform: Databricks Lakehouse, Delta Lake, Delta Sharing
- Cloud: AWS (S3, Glue, Lambda, Step Functions)
- Languages: Python, SQL, Scala, or Java
- Governance: Unity Catalog
Key Responsibilities
- Architect & Build: Design and implement robust data pipelines using the Databricks Lakehouse platform.
- Medallion Strategy: Apply Medallion Architecture (Bronze, Silver, Gold layers) for streamlined data processing and refinement.
- Modern Governance: Implement data governance and granular access control using Unity Catalog.
- Serverless Orchestration: Utilize AWS Lambda and Step Functions for efficient, serverless data processing workflows.
- ETL Excellence: Develop and manage complex ETL pipelines using AWS Glue and Amazon S3.
- Collaborate: Partner with Data Scientists and cross-functional Agile teams to deliver high-quality data solutions.
What You'll Bring
- Databricks Expert: Hands-on experience with the Databricks Lakehouse Platform and relevant certifications.
- AWS Professional: Strong proficiency in AWS data services, specifically Glue, S3, Lambda, and Step Functions.
- Programming Mastery: High proficiency in Python, Scala, or Java, along with advanced SQL skills.
- Data Modeling: Proven experience in data modeling for Lakehouse and modern warehouse architectures.
- Best Practices: Deep understanding of ETL best practices, data transformation techniques, and governance frameworks.
Nice to Have
- Advanced AWS Certifications (Data Engineer / Solutions Architect).
- Experience with multi-cloud environments (e.g., Azure).
- Familiarity with modern data engineering tools and frameworks.
What We Offer
- Work on cutting-edge, global-scale data projects.
- Direct exposure to the latest cloud technologies and Databricks features.
- A collaborative environment that values innovation and technical excellence.
If you are ready to engineer the future of data, please apply with your updated resume highlighting your Databricks and AWS experience.
