
GCP Data Engineer

Intuitive.ai

5 - 7 years

Ahmedabad

Posted: 03/04/2026


Job Description

About us:

Intuitive is an innovation-led engineering company delivering business outcomes for hundreds of enterprises globally. With a reputation as a Tiger Team and a trusted partner of enterprise technology leaders, we help solve the most complex digital transformation challenges across the following Intuitive Superpowers:


Modernization & Migration

  • Application & Database Modernization
  • Platform Engineering (IaC/EaC, DevSecOps & SRE)
  • Cloud Native Engineering, Migration to Cloud, VMware Exit
  • FinOps


Data & AI/ML

  • Data (Cloud Native / DataBricks / Snowflake)
  • Machine Learning, AI/GenAI


Cybersecurity

  • Infrastructure Security
  • Application Security
  • Data Security
  • AI/Model Security


SDx & Digital Workspace (M365, G Suite)

  • SDDC, SD-WAN, SDN, NetSec, Wireless/Mobility
  • Email, Collaboration, Directory Services, Shared Files Services


Intuitive Services:

  • Professional and Advisory Services
  • Elastic Engineering Services
  • Managed Services
  • Talent Acquisition & Platform Resell Services


About the job:

Title: GCP Data Engineer

Start Date: Immediate

Position Type: Full Time

Location: Ahmedabad, India


Job Summary

We are looking for a skilled GCP Data Engineer with 2–5 years of hands-on experience in building and maintaining scalable data pipelines on Google Cloud Platform. The ideal candidate will work closely with analytics, product, and engineering teams to enable reliable, high-performance data solutions.


Key Responsibilities

  • Design, build, and maintain ETL/ELT data pipelines on Google Cloud Platform
  • Develop batch and streaming pipelines using BigQuery, Dataflow (Apache Beam), and Pub/Sub
  • Optimize BigQuery queries for performance, scalability, and cost efficiency
  • Manage data ingestion from multiple sources using Cloud Storage (GCS)
  • Orchestrate workflows using Cloud Composer (Apache Airflow)
  • Ensure data quality, consistency, security, and monitoring across pipelines
  • Collaborate with analysts, data scientists, and stakeholders to support data-driven decisions
  • Document data architecture, pipelines, and operational processes
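The batch side of the pipelines described above can be sketched in plain Python. This is a hedged, illustrative example only: a production pipeline would express each step as an Apache Beam PTransform running on Dataflow, and the field names (`user_id`, `amount`) are invented for the sketch, not taken from any real schema.

```python
import json

def parse_event(line):
    """Extract step: parse one raw JSON event; return None for malformed input."""
    try:
        return json.loads(line)
    except json.JSONDecodeError:
        return None

def run_batch_transform(raw_lines):
    """ETL-style flow: extract (parse), transform (filter bad rows), load (aggregate).

    In a real Dataflow job each stage below would be a Beam transform; here
    generators stand in for the distributed steps.
    """
    events = (parse_event(line) for line in raw_lines)
    valid = (e for e in events if e is not None and e.get("amount", 0) > 0)
    totals = {}
    for e in valid:
        totals[e["user_id"]] = totals.get(e["user_id"], 0) + e["amount"]
    return totals

raw = [
    '{"user_id": "a", "amount": 10}',
    '{"user_id": "a", "amount": 5}',
    'not json',                          # malformed record, dropped by extract
    '{"user_id": "b", "amount": -1}',    # invalid amount, dropped by transform
]
print(run_batch_transform(raw))  # {'a': 15}
```

The same shape applies to the streaming case, with Pub/Sub as the source and windowed aggregation replacing the in-memory dictionary.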


Required Skills & Qualifications

  • 2–5 years of experience in Data Engineering
  • Strong hands-on experience with Google Cloud Platform (GCP)
  • Expertise in:
      • BigQuery
      • Dataflow (Apache Beam)
      • Cloud Storage (GCS)
  • Strong SQL skills for analytical and large-scale datasets
  • Proficiency in Python (or Java) for data processing
  • Experience with workflow orchestration tools (Airflow / Cloud Composer)
  • Solid understanding of data warehousing and ETL/ELT concepts
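The ELT concept listed above can be illustrated with an in-memory SQLite database standing in for the warehouse. This is a minimal sketch with invented table and column names; on GCP the load step would target a BigQuery staging table and the transform would typically be a scheduled SQL job (e.g. via Cloud Composer).

```python
import sqlite3

# ELT: load raw rows first, then transform with SQL inside the warehouse
# (as opposed to ETL, where transformation happens before loading).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE raw_orders (order_id TEXT, amount REAL, status TEXT)")

# Load: raw data lands untransformed (rows are illustrative).
rows = [("o1", 20.0, "paid"), ("o2", 35.5, "paid"), ("o3", 10.0, "cancelled")]
conn.executemany("INSERT INTO raw_orders VALUES (?, ?, ?)", rows)

# Transform: SQL inside the warehouse derives the analytical table.
conn.execute("""
    CREATE TABLE paid_revenue AS
    SELECT status, SUM(amount) AS total
    FROM raw_orders
    WHERE status = 'paid'
    GROUP BY status
""")
total = conn.execute("SELECT total FROM paid_revenue").fetchone()[0]
print(total)  # 55.5
```

Keeping raw data intact and transforming inside the warehouse is what makes BigQuery-centric ELT cheap to re-run when transformation logic changes.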


Good to Have (Preferred Skills)

  • Experience with Pub/Sub for real-time streaming pipelines
  • Spark / Dataproc experience
  • Infrastructure as Code (Terraform, Deployment Manager)
  • CI/CD pipelines and Git-based workflows
  • GCP Professional Data Engineer certification
  • Exposure to data formats like Parquet, Avro, JSON
