GCP Data Engineer
MethodHub
2 - 5 years
Hyderabad
Posted: 14/03/2026
Job Description:
We are looking for a Data Engineer with strong experience in SQL, PySpark, and Google Cloud Platform (GCP). The role involves building and maintaining ETL pipelines, processing large datasets with Spark, and optimizing cluster performance, and it requires a strong understanding of both data engineering and data analytics.
Required Skills:
- Strong SQL skills (complex queries, data processing)
- Hands-on experience with PySpark / Spark SQL
- Experience working on Google Cloud Platform (GCP)
- Experience in ETL pipeline development
- Knowledge of cluster optimization and Spark performance tuning
- Strong understanding of data engineering and data analytics