GCP Data Engineer

Impetus

10 - 12 years

Chennai

Posted: 22/02/2026

Job Description

  • 8–10 years of experience in data engineering or ETL development, with at least 4 years in Google Cloud Dataflow.
  • Good hands-on experience with Google Cloud data services (Dataflow, Cloud Storage, BigQuery, Cloud Composer, Secret Manager, etc.).
  • Strong understanding of ETL/ELT concepts and experience migrating terabyte-scale data.
  • Develop and optimize end-to-end data pipelines using Dataflow.
  • Develop and implement generic, reusable pipelines for integrating incremental data.
  • Expertise in leading a data processing team and delivering high-quality results.
  • Ensure data quality throughout the data pipeline.
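One of the responsibilities above is building generic, reusable pipelines for integrating incremental data. A common approach is the watermark pattern, sketched below in plain Python (the function name `load_incremental`, the in-memory source, and the field names are illustrative assumptions, not any specific GCP API — in practice the watermark would be persisted in a metadata table and the source would be BigQuery or Cloud Storage):

```python
from datetime import datetime, timezone

def load_incremental(source_rows, watermark, ts_field="updated_at"):
    """Return only rows modified after `watermark`, plus the new watermark.

    A minimal, storage-agnostic sketch of the watermark pattern: the
    caller persists `new_watermark` between runs and passes it back on
    the next run, so each run ingests only the delta.
    """
    fresh = [r for r in source_rows if r[ts_field] > watermark]
    new_watermark = max((r[ts_field] for r in fresh), default=watermark)
    return fresh, new_watermark

# Example: two runs over a growing source table.
rows = [
    {"id": 1, "updated_at": datetime(2024, 1, 1, tzinfo=timezone.utc)},
    {"id": 2, "updated_at": datetime(2024, 1, 2, tzinfo=timezone.utc)},
]
epoch = datetime(1970, 1, 1, tzinfo=timezone.utc)
batch, wm = load_incremental(rows, epoch)   # first run: full load
rows.append({"id": 3, "updated_at": datetime(2024, 1, 3, tzinfo=timezone.utc)})
delta, wm = load_incremental(rows, wm)      # second run: only the new row
```

Because the filtering logic is independent of the storage layer, the same function can back a reusable Dataflow or Composer task across many source tables.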


Roles & Responsibilities


  • Proficiency in designing, implementing, and optimizing data engineering solutions over large volumes of data (TB–PB scale) using GCP data services.
  • Proven expertise in GCP services including Dataflow, BigQuery, Cloud Storage, Cloud Composer, and Cloud Functions; experience building scalable data lakes and pipelines.
  • Proficiency in PySpark, Python, Spark SQL, and automating workflows.
  • Good exposure to writing optimized SQL (BigQuery SQL preferred).
  • Good communication and problem-solving skills.
  • Able to create POCs to validate proposed solutions and perform code reviews for the team.
  • Understanding of GenAI technologies and ability to implement solutions using GenAI code-assist tools (GitHub Copilot, Windsurf, etc.).
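The role also calls for writing optimized BigQuery SQL. A staple optimization on date-partitioned tables is filtering directly on the partitioning column so BigQuery can prune partitions and scan only the requested range. A minimal sketch of a reusable query builder (the table, columns, and helper name are illustrative assumptions):

```python
def build_pruned_query(table, start_date, end_date,
                       partition_col="event_date", columns=("*",)):
    """Build a BigQuery SQL string that filters on the partition column.

    Filtering on the partitioning column lets BigQuery prune partitions
    and scan only the requested date range instead of the whole table.
    Table and column names here are hypothetical examples.
    """
    cols = ", ".join(columns)
    return (
        f"SELECT {cols} FROM `{table}` "
        f"WHERE {partition_col} BETWEEN '{start_date}' AND '{end_date}'"
    )

sql = build_pruned_query(
    "project.dataset.events", "2024-01-01", "2024-01-07",
    columns=("user_id", "event_name"),
)
```

Selecting explicit columns rather than `*` matters too: BigQuery bills by bytes scanned, and columnar storage means unreferenced columns are never read.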
