
Data Engineer

Talentmatics

3-12 years

Bengaluru

Posted: 24/12/2025


Job Description

Data Engineering (GCP Big Data) | 3-12 Years | Bangalore / Gurgaon


Roles & Responsibilities:

  • Design, develop, and manage scalable data pipelines and ETL/ELT workflows using BigQuery, Dataflow, Dataproc, and Cloud Composer (Airflow).
  • Work extensively on Big Data analytics solutions leveraging Hadoop, Hive, Spark, and GCP services.
  • Build and optimize data models and SQL queries for performance and scalability.
  • Collaborate with cross-functional teams to enable data integration, transformation, and reporting solutions.
  • Implement and manage data pipelines using Airflow/Cloud Composer, Kafka, Git, Jenkins, and CI/CD frameworks.
  • Troubleshoot, monitor, and improve system performance and ensure high availability of data platforms.
  • Contribute to Agile/Scrum-based development cycles, providing input during sprint planning and reviews.
  • Drive continuous improvement and innovation in data engineering practices.
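As a minimal illustration of the ETL/ELT shape the first bullet describes, here is a sketch using only the Python standard library. All names and data here are hypothetical; in the actual role, extraction would read from sources such as GCS or Kafka, transformation would run on Dataflow or Spark, and loading would target BigQuery.

```python
import csv
import io

# Hypothetical raw input; a real pipeline would pull this from GCS or Kafka.
RAW_EVENTS = "user_id,amount\nu1,10.5\nu2,4.0\nu1,2.5\n"

def extract(raw: str) -> list[dict]:
    # Extract: parse raw CSV rows into dictionaries.
    return list(csv.DictReader(io.StringIO(raw)))

def transform(rows: list[dict]) -> dict[str, float]:
    # Transform: aggregate spend per user (would run on Dataflow/Spark at scale).
    totals: dict[str, float] = {}
    for row in rows:
        totals[row["user_id"]] = totals.get(row["user_id"], 0.0) + float(row["amount"])
    return totals

def load(totals: dict[str, float]) -> None:
    # Load: printed here for illustration; a real pipeline writes to BigQuery.
    for user, total in sorted(totals.items()):
        print(f"{user}: {total}")

load(transform(extract(RAW_EVENTS)))
```

In Cloud Composer, each of these stages would typically become a task in an Airflow DAG, with the orchestrator handling scheduling, retries, and monitoring.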


Key Skills Required:

  • Google Cloud Platform (GCP): BigQuery, Dataflow, Dataproc, Cloud Composer (Airflow)
  • Programming & Processing: PySpark, Python
  • BI/Reporting Tools: Tableau or MicroStrategy (MSTR)
  • Big Data Ecosystem: Hadoop, Hive, Spark
  • Databases & SQL: Advanced SQL, data model optimization
  • Data Engineering Tools: Airflow, Kafka, Git, Jenkins, CI/CD pipelines


Note: Only candidates who can join immediately or within 15 days will be considered.
