Data Engineer GCP (Full-time with Fortune 500 company)
HARP
2-5 years
Pune
Posted: 22/02/2026
Job Description
Job Title: GCP Big Data Engineer
Locations: Bangalore | Gurugram | Pune | Chennai | Hyderabad
Work Mode: Hybrid
Experience: 6-12 Years
Employment Type: Full-time
About the Role
We are seeking an experienced GCP Big Data Engineer with strong expertise in Google Cloud Platform data services and modern big data technologies. The ideal candidate will have deep hands-on experience in building scalable, enterprise-grade data pipelines across GCP environments.
This is a high-impact role where you will contribute to designing, developing, and optimizing cloud-based data solutions in a collaborative and fast-paced environment.
Must Have Skills
- Google Cloud Platform (GCP)
- BigQuery
- Dataflow
- Dataproc
- Cloud Composer (Airflow)
- PySpark
- SQL
- Python
Good to Have
- Apache Kafka
- Git
- Jenkins
- CI/CD Pipelines
- Hadoop
- Hive
Key Responsibilities
- Architect and implement scalable data pipelines on GCP
- Develop and optimize ETL/ELT processes using Dataflow & Dataproc
- Work extensively with BigQuery for analytics and performance tuning
- Implement workflow orchestration using Cloud Composer (Airflow)
- Build distributed data processing solutions using Spark / PySpark
- Collaborate with stakeholders to translate business requirements into technical solutions
- Ensure adherence to best practices in DevOps, CI/CD, and performance optimization
Qualifications
- 6-12 years of overall IT experience
- Strong hands-on experience in GCP data stack
- Expertise in Big Data technologies (Spark, Hadoop, Hive)
- Advanced SQL and Python proficiency
- Experience in CI/CD and version control
- GCP Professional Data Engineer Certification (Preferred)