
Senior GCP Data Engineer (Big Data Platform)

ThreatXIntel

5 - 10 years

Chennai

Posted: 25/04/2026

Job Description

Company Description

ThreatXIntel is a growing Cybersecurity, IT Staffing, and Consulting company delivering end-to-end technology and security solutions.

We are hiring on behalf of a corporate client; ThreatXIntel is the official hiring partner for this requirement.


Job Overview

We are looking for a senior, hands-on GCP Data Engineer who can independently design, build, and optimize large-scale data solutions on Google Cloud Platform.

The role is part of a Marketing Platforms team focused on building a Unified Marketing Data Layer to enable campaign analytics, reporting, and personalized customer experiences.

The engineer will contribute to the Flywheel modernization initiative, aimed at improving marketing efficiency and accelerating campaign delivery.


Key Responsibilities

Data Engineering & GCP Development

  • Design and build scalable data pipelines on GCP
  • Work extensively with BigQuery, Bigtable, and Dataproc
  • Develop and optimize batch and streaming data processing workflows

Workflow Orchestration

  • Build and manage data workflows using Cloud Composer (Airflow)
  • Ensure reliability, monitoring, and scheduling of pipelines

Event-Driven & Messaging Systems

  • Develop event-driven architectures using Pub/Sub
  • Enable real-time data ingestion and processing

Data Platform & Analytics

  • Build and maintain a Unified Marketing Data Layer
  • Enable analytics and reporting for marketing campaign performance

DevOps & CI/CD

  • Implement CI/CD pipelines using GitHub Actions
  • Ensure code quality, version control, and automated deployments

Collaboration

  • Work with marketing, analytics, and engineering teams
  • Translate business requirements into scalable data solutions

Required Skills (Mandatory)

  • Strong hands-on experience with Google Cloud Platform (GCP)
  • Expertise in BigQuery and Bigtable
  • Strong experience with Dataproc (Spark-based processing)
  • Hands-on experience with Cloud Composer (Airflow)
  • Proficiency in Python programming
  • Experience in building data pipelines and ETL workflows
  • Strong understanding of data modeling and large-scale data processing

Nice to Have

  • Experience with Pub/Sub (event-driven architecture)
  • Knowledge of Java programming
  • Experience with CI/CD (GitHub Actions)
  • Exposure to marketing data platforms or campaign analytics systems

Industry & Job Function

  • Industry: Information Technology & Services / Financial Services
  • Job Function: Engineering / Information Technology
