GCP Data Engineers with Dataplex | 4 to 8 years | Bangalore | 18 to 20 LPA | Big4

Acme Services

2 - 5 years

Bengaluru

Posted: 09/03/2026

Job Description

Job Title: GCP Data Engineer with Dataplex

Experience: 4 to 8 Years

Budget: 18 to 20 LPA

Location: Bangalore

Type: Full-Time


Responsibilities:

Job Summary: As a skilled GCP Data Engineer with hands-on experience in Google Cloud Platform (GCP) and Dataplex, you will be responsible for building, maintaining, and optimizing data pipelines and data lake infrastructure. You will work closely with data architects, governance teams, and business stakeholders to deliver reliable, high-quality, and governed data products on GCP.


Core Responsibilities:

Design, develop, and maintain scalable data ingestion, transformation, and orchestration pipelines using GCP-native services such as Dataflow, Dataproc, Cloud Composer, and Pub/Sub

Implement and manage GCP Dataplex lakes and zones, configuring data discovery, metadata tagging, data quality rules, and governance policies

Build and manage data lake and lakehouse architectures on Cloud Storage and BigQuery, following best practices for partitioning, clustering, and cost optimisation

Collaborate with Data Governance and Architecture teams to ensure data quality, lineage, and cataloging standards are enforced across data pipelines

Develop and maintain ELT/ETL workflows using Cloud Composer (Apache Airflow), Dataflow (Apache Beam), and Dataproc (Spark/Hadoop)

Implement data access controls and security policies including IAM roles, column/row-level security, and VPC Service Controls on GCP

Monitor, troubleshoot, and optimise pipeline performance and reliability in production environments

Participate in code reviews, technical documentation, and contribute to data engineering best practices and standards

Support data migration and modernisation initiatives by moving on-premise or legacy data workloads to GCP cloud-native solutions
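Much of the transformation work described above runs in Dataflow (Apache Beam). As an illustration only, here is a minimal record-cleaning step of the kind such a pipeline performs, written as a plain Python function so the sketch stays runnable without the Beam SDK; in a real pipeline it would sit inside a `beam.Map` or `DoFn`, and the field names are hypothetical, not taken from this posting.

```python
from datetime import datetime, timezone

# Hypothetical raw-zone record; the field names are illustrative
# assumptions, not part of the job description.
RAW = {"order_id": " A-1001 ", "amount": "49.90", "ts": "2026-03-09T10:15:00"}

def clean_record(rec: dict) -> dict:
    """Normalise one raw-zone record into a refined-zone shape.

    In a Dataflow job this logic would live inside a Beam DoFn or
    a beam.Map step; it is a plain function here so the example
    needs nothing beyond the standard library.
    """
    return {
        "order_id": rec["order_id"].strip(),          # trim stray whitespace
        "amount": round(float(rec["amount"]), 2),     # string -> numeric
        "ingested_at": datetime.fromisoformat(rec["ts"])
                               .replace(tzinfo=timezone.utc)
                               .isoformat(),          # tag as UTC
    }

refined = clean_record(RAW)
```

The same function can be unit-tested outside the pipeline, which is one reason to keep transform logic out of the orchestration layer.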


Mandatory skill sets:

Hands-on experience with GCP Dataplex: creating and managing lakes, zones, assets, data quality rules, and metadata discovery

Strong proficiency in BigQuery: writing complex SQL, data modelling, performance tuning, partitioning, and clustering

Experience building data pipelines using Cloud Dataflow (Apache Beam) and/or Dataproc (Apache Spark)

Hands-on experience with Cloud Composer (Apache Airflow) for pipeline orchestration and scheduling

Proficiency in Python and SQL for data transformation and pipeline development

Experience with Cloud Storage for data lake design: raw, refined, and curated layer management

Working knowledge of data governance concepts: data quality, metadata management, data lineage, and cataloging
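The BigQuery partitioning and clustering skills listed above come down to DDL like the following. This is a sketch with placeholder project, dataset, table, and column names; it builds the statement as a string rather than executing it, so it needs no GCP credentials or client library.

```python
def partitioned_table_ddl(project, dataset, table, partition_col, cluster_cols):
    """Build a BigQuery CREATE TABLE statement for a date-partitioned,
    clustered table. All identifiers here are illustrative placeholders.
    """
    return (
        f"CREATE TABLE `{project}.{dataset}.{table}` (\n"
        f"  event_date DATE,\n"
        f"  customer_id STRING,\n"
        f"  amount NUMERIC\n"
        f")\n"
        f"PARTITION BY {partition_col}\n"              # prunes scanned partitions
        f"CLUSTER BY {', '.join(cluster_cols)}"        # co-locates related rows
    )

ddl = partitioned_table_ddl("my-project", "analytics", "orders",
                            "event_date", ["customer_id"])
```

Partitioning on the date column limits how much data each query scans, and clustering on frequently filtered columns reduces cost further, which is the "cost optimisation" aspect the role calls out.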
