GCP Data Engineer
Confidential
2 - 5 years
Gurugram
Posted: 20/03/2026
Job Description
Role: GCP Data Engineer
Job Location: Bangalore / Chennai / Gurgaon
Experience: 6 to 12 Years
Notice Period: Immediate to 30 days (official)
Role Summary
We are seeking a skilled GCP Data Engineer with strong hands-on experience in SQL, PySpark, and Google Cloud Platform. The role involves designing, developing, and optimizing scalable data pipelines while ensuring high performance, reliability, and CI/CD readiness.
Mandatory Skills
- Advanced SQL (complex joins, window functions, performance optimization)
- Google Cloud Platform (GCP)
- BigQuery (partitioning, clustering, and cost optimization)
- Python
- Dataproc (PySpark / Spark)
- Google Cloud Storage (GCS)
- Cloud Composer (Airflow)
Additional GCP Services
- Pub/Sub
- Cloud Functions / Cloud Run
- Dataflow (Apache Beam)
DevOps & Tools
- Git / GitHub
- CI/CD pipelines
- Linux / Shell scripting
Key Responsibilities
- Develop and maintain ETL/ELT pipelines using Python and PySpark
- Design, build, and optimize Spark jobs on Dataproc
- Write and optimize BigQuery SQL queries with focus on performance and cost efficiency
- Orchestrate and schedule workflows using Cloud Composer (Airflow)
- Implement data validation, monitoring, and error handling mechanisms
- Manage incremental and batch data processing pipelines
- Support Hadoop-to-GCP migration initiatives
- Perform metadata management, pipeline optimization, and performance tuning
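The validation, monitoring, and error-handling responsibility above can be sketched in plain Python. This is an illustrative sketch only, not an implementation from the posting; all names (`validate_record`, `process_batch`, the sample records) are hypothetical, and in practice the transform step would run in PySpark on Dataproc with results landing in BigQuery:

```python
"""Minimal sketch of a batch-pipeline step with validation, monitoring,
and error handling. Names and rules are illustrative assumptions."""

import logging
from dataclasses import dataclass, field

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("pipeline")


@dataclass
class BatchMetrics:
    """Simple monitoring counters emitted per batch run."""
    processed: int = 0
    rejected: int = 0
    errors: list = field(default_factory=list)


def validate_record(record: dict) -> bool:
    # Hypothetical rule: require an "id" and a positive numeric "amount".
    return (
        "id" in record
        and isinstance(record.get("amount"), (int, float))
        and record["amount"] > 0
    )


def process_batch(records: list) -> BatchMetrics:
    metrics = BatchMetrics()
    for record in records:
        try:
            if not validate_record(record):
                metrics.rejected += 1
                log.warning("rejected record: %r", record)
                continue
            # Placeholder for the real transform/load step.
            metrics.processed += 1
        except Exception as exc:
            # Capture the failure per record instead of crashing the batch.
            metrics.errors.append((record, str(exc)))
    return metrics


batch = [{"id": 1, "amount": 10.0}, {"id": 2, "amount": -5}, {"amount": 3}]
m = process_batch(batch)
print(m.processed, m.rejected)  # → 1 2
```

The same pattern (validate, count, log, continue) carries over directly to PySpark jobs, where rejected rows are typically routed to a quarantine table rather than logged.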