
GCP Data Engineer

LTM

2 - 5 years

Bengaluru

Posted: 13/05/2026


Job Description

A GCP BigQuery professional designs, develops, and optimizes large-scale data warehouses, ELT/ETL pipelines, and SQL scripts within Google Cloud. Responsibilities include managing data ingestion (Pub/Sub, Dataflow), optimizing performance and cost, and implementing security and governance. Key skills include SQL, Python, Apache Airflow, and data modeling.


Key Responsibilities

Data Architecture & Modeling: Design and optimize BigQuery data models, schemas, and storage for performance and cost efficiency

Pipeline Development (ETL/ELT): Build scalable data pipelines and workflows using tools like Cloud Composer (Airflow), Dataflow, and Dataproc

Query Optimization: Write complex SQL scripts and tune performance to minimize BigQuery slot usage

Data Integration: Ingest, transform, and manage data from diverse sources into BigQuery

Governance & Security: Implement data security, access controls, and best practices
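The query-optimization responsibility above can be sketched in miniature. The snippet below builds a query that filters on a date partition column (so BigQuery prunes partitions and scans fewer bytes) and estimates on-demand scan cost. The table and column names are illustrative, and the per-TiB rate is an assumption, not current Google Cloud pricing:

```python
# Illustrative sketch only: table/column names are hypothetical,
# and the on-demand rate should be checked against current GCP pricing.

def partition_filtered_query(table: str, day: str) -> str:
    """Build a query restricted to one partition via the DATE column,
    so BigQuery scans only that day's data instead of the full table."""
    return (
        f"SELECT user_id, SUM(amount) AS total "
        f"FROM `{table}` "
        f"WHERE event_date = DATE '{day}' "  # partition pruning happens here
        f"GROUP BY user_id"
    )

def on_demand_cost_usd(bytes_scanned: int, usd_per_tib: float = 6.25) -> float:
    """Estimate on-demand analysis cost from bytes scanned (rate is an assumption)."""
    return bytes_scanned / 2**40 * usd_per_tib

sql = partition_filtered_query("my_project.analytics.events", "2026-05-01")
print(sql)
print(on_demand_cost_usd(512 * 2**30))  # cost for 0.5 TiB scanned
```

The same pruning idea extends to clustering: filtering on clustered columns further reduces the bytes each slot must process.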

Required Skills & Qualifications

Core Technical: Proficiency in GoogleSQL, Python, and Google Cloud Platform (GCP) services (BigQuery, Cloud Storage, Dataflow)

Data Processing: Experience with Apache Spark, Beam, or Hadoop

Database Knowledge: Strong understanding of relational (RDBMS) and NoSQL databases


Skills

Mandatory Skills: GCP BigQuery, GCP Storage, Python for Data
