Data Engineer

UST

2 - 5 years

Bengaluru

Posted: 26/02/2026

Job Description

About the Role

We are seeking a highly skilled Senior Data Engineer with strong expertise in Python and Google Cloud Platform (GCP) to design, build, and maintain scalable, high-performance data pipelines and integration solutions.

The ideal candidate is a hands-on engineer with deep knowledge of data architecture, ETL/ELT development, and real-time/batch data processing. You will collaborate closely with analytics, development, DevOps, and business teams to ensure secure, reliable, and efficient data delivery across the organization.



Responsibilities



  • Design, develop, and maintain scalable data pipelines and ETL/ELT workflows using Python and GCP services.
  • Build batch and real-time data ingestion pipelines using APIs, CDC tools, and orchestration frameworks.
  • Develop and optimize data models, schemas, and cloud-based data architectures.
  • Implement data transformation and data quality validation frameworks.
  • Work with analytics and business teams to deliver high-quality, reliable datasets.
  • Monitor, troubleshoot, and optimize data workflows for performance and scalability.
  • Implement CI/CD pipelines for data engineering workflows.
  • Ensure compliance, governance, and security best practices across cloud data systems.
  • Perform root cause analysis and system performance tuning.
  • Stay current with emerging data engineering and cloud technologies.


Qualifications



  • 8+ years of experience in Data Engineering or related roles.
  • Bachelor's or Master's degree in Computer Science, Information Systems, Data Engineering, or a related field.


Required Skills



  • Strong Python programming skills with hands-on experience in data pipeline development.
  • Proven experience with Google Cloud Platform (GCP) services, including:
      • BigQuery
      • Dataflow
      • Pub/Sub
      • Dataproc
      • Cloud Composer
      • Cloud Functions
      • Cloud Scheduler
      • Datastream (CDC)
      • Google Cloud Storage (GCS)
  • Experience with Apache Beam or Apache Spark for distributed data processing.
  • Strong SQL skills and solid understanding of relational and cloud-native databases.
  • Experience building REST API-based ingestion pipelines and handling JSON-based integrations.
  • Strong understanding of data warehousing concepts, data modeling, and ETL/ELT principles.
  • Experience with CI/CD and infrastructure tooling such as GitHub, Terraform, and Cloud Build.
  • Knowledge of data security, access control, and governance in cloud environments.
  • Experience working in large-scale, cloud-based, or enterprise environments.


Preferred Skills



  • Professional certifications in Google Cloud Platform (GCP) or Big Data Engineering.
  • Experience with Change Data Capture (CDC) architectures.
  • Experience with performance tuning and system optimization in distributed environments.
  • Knowledge of Shell or Perl scripting.
  • Exposure to DevOps collaboration in globally distributed teams.
  • Experience designing real-time streaming architectures.
  • Strong documentation and communication skills.
