Data Engineer
AideWiser SolTek
2-5 years
Pune City
Posted: 09/04/2026
Job Description
Job Title: Data Engineer (Databricks / Airflow / AWS Lakehouse)
Location: Pune, Bangalore, Chennai
Experience: 5-7 Years
Type: C2H (Contract-to-Hire)
About the Role
We are looking for a skilled Data Engineer with strong expertise in building scalable data pipelines and modern data platforms using Databricks, Apache Airflow, and the AWS Lakehouse architecture. The ideal candidate has hands-on experience with big data processing, workflow orchestration, and cloud-based data solutions.
Key Responsibilities
- Design, build, and maintain scalable and reliable data pipelines using Databricks and Apache Airflow
- Develop and optimize ETL/ELT workflows for structured and unstructured data
- Implement and manage AWS Lakehouse architecture using services like S3, Glue, and Redshift
- Work with Apache Iceberg for data lake table management and optimization
- Ensure data quality, integrity, and governance across pipelines
- Collaborate with data analysts, data scientists, and stakeholders to understand business requirements
- Monitor, troubleshoot, and optimize data workflows for performance and cost efficiency
- Implement best practices for data security, scalability, and reliability
- Automate workflows and deployments using CI/CD pipelines
Required Skills
- Strong experience with Databricks (PySpark, Delta Lake)
- Hands-on experience with Apache Airflow for workflow orchestration
- Solid understanding of AWS Lakehouse architecture
- Proficiency in Python for data engineering tasks
- Experience with Apache Iceberg
- Good knowledge of SQL and data modeling techniques
- Experience working with large-scale distributed data systems