GCP Data Engineer

Tata Consultancy Services

2 - 5 years

Bengaluru

Posted: 05/02/2026

Job Description

Role: GCP Data Engineer

Required Technical Skill Set: GCP, PySpark, Dataproc, HDFS, Hadoop, SQL


Job Description:


  • Build, maintain, and troubleshoot pipelines using GCP services (e.g., Dataproc, Pub/Sub, Cloud Functions, Cloud Composer/Apache Airflow) to ingest, transform, and load data from various sources (relational databases, APIs, streaming data, flat files).
  • Implement batch data processing solutions.
  • Identify and resolve data-related issues, including data quality problems, pipeline failures, and performance bottlenecks.
  • Sound programming knowledge of PySpark and SQL for processing large volumes of semi-structured and unstructured data.
  • Working knowledge of Avro and Parquet file formats.
  • Knowledge of the Hadoop big data platform and its ecosystem.

Good-to-Have

  • Knowledge of Jira, Agile, Sonar, TeamCity, and CI/CD.
  • Exposure to or experience with an international banking client, multi-vendor setups, or multi-geography teams.


Role & Responsibility:

  • Design, implement, and optimize scalable, reliable, and secure data architectures on GCP, including data lakes, data warehouses, and streaming solutions.
  • Develop and maintain data models for optimal storage, retrieval, and analysis.
  • Build, maintain, and troubleshoot robust ETL/ELT pipelines using GCP services (e.g., Dataflow, Pub/Sub, Cloud Functions, Cloud Composer/Apache Airflow) to ingest, transform, and load data from various sources (relational databases, APIs, streaming data, flat files).
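The ETL/ELT responsibility above can be sketched end to end with a toy, stdlib-only example. This is not the posting's own code: `sqlite3` stands in for a real relational source and warehouse target, and all table and column names (`orders`, `user_totals`) are invented for illustration.

```python
import sqlite3

def extract(conn):
    """Extract: pull raw rows from a relational source."""
    return conn.execute("SELECT user_id, amount FROM orders").fetchall()

def transform(rows):
    """Transform: drop bad records (data-quality filter) and aggregate per user."""
    totals = {}
    for user_id, amount in rows:
        if user_id is None or amount is None:
            continue
        totals[user_id] = totals.get(user_id, 0) + amount
    return totals

def load(conn, totals):
    """Load: write the aggregate into a target table."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS user_totals (user_id TEXT PRIMARY KEY, total REAL)"
    )
    conn.executemany(
        "INSERT OR REPLACE INTO user_totals VALUES (?, ?)",
        sorted(totals.items()),
    )
    conn.commit()

if __name__ == "__main__":
    # In-memory stand-ins for the source and target databases.
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE orders (user_id TEXT, amount REAL)")
    conn.executemany(
        "INSERT INTO orders VALUES (?, ?)",
        [("u1", 10.0), ("u1", 5.0), (None, 3.0), ("u2", 7.5)],
    )
    load(conn, transform(extract(conn)))
    print(conn.execute("SELECT * FROM user_totals ORDER BY user_id").fetchall())
    # → [('u1', 15.0), ('u2', 7.5)]
```

In a real GCP pipeline the extract step would read from Cloud SQL, an API, or Pub/Sub, the transform would run on Dataflow or Dataproc, and Cloud Composer would orchestrate the steps; the three-stage shape stays the same.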
