
GCP Data Engineer

Impetus

7 - 9 years

Bengaluru

Posted: 18/03/2026


Job Description

Role Overview:

We are looking for a skilled GCP Data Engineer with 4–7 years of experience in designing and building scalable data pipelines and data processing systems on Google Cloud Platform (GCP). The ideal candidate should have strong expertise in Python, SQL, and GCP data services such as BigQuery, Dataflow, Pub/Sub, Cloud Storage, and Cloud Composer to support large-scale data analytics and data integration initiatives.

The candidate will work closely with data analysts, data scientists, and platform teams to build reliable, efficient, and scalable data solutions.


Key Responsibilities

Design, develop, and maintain scalable data pipelines on Google Cloud Platform (GCP).

Build ETL/ELT workflows using Python and SQL for ingesting, transforming, and processing large datasets.

Implement real-time and batch data processing pipelines using Dataflow and Pub/Sub.

Develop and optimize data warehouse solutions using BigQuery.

Orchestrate data workflows using Cloud Composer (Apache Airflow).

Integrate multiple data sources including APIs, databases, and streaming systems.

Ensure data quality, integrity, and reliability across pipelines.

Optimize query performance and cost efficiency in BigQuery.

Collaborate with data scientists, analysts, and engineering teams to support analytics and machine learning use cases.

Implement monitoring, logging, and troubleshooting for data pipelines.

Follow best practices for data governance, security, and compliance.


Required Skills & Qualifications

4–7 years of experience in Data Engineering or Big Data development.

Strong programming experience in Python for data processing and automation.

Advanced SQL skills for data transformation and analysis.

Hands-on experience with Google Cloud Platform (GCP) data services including:

  • BigQuery
  • Dataflow
  • Pub/Sub
  • Cloud Storage
  • Cloud Composer

Experience building batch and real-time data pipelines.

Strong understanding of data warehousing concepts and ETL frameworks.

Experience working with large-scale distributed data systems.

Familiarity with Apache Beam / Airflow concepts.

Experience with CI/CD pipelines and Git-based version control.


Good to Have

Experience with Spark, Hadoop, or other big data frameworks.

Knowledge of data modeling techniques (star schema, dimensional modeling).

Exposure to machine learning data pipelines.

Experience with Terraform or Infrastructure as Code.

GCP certifications such as Professional Data Engineer.


Education

Bachelor's or Master's degree in Computer Science, Information Technology, Data Engineering, or a related field.


Key Competencies

Strong problem-solving and analytical skills

Ability to work with large and complex datasets

Good communication and collaboration skills

Experience working in Agile environments
