Data Engineer
Sourcebae
2 - 5 years
Bengaluru
Posted: 08/04/2026
Job Description
Position: GCP Data Engineer
Experience: 6-8 Years
Location: Bangalore
Work Mode: Hybrid
Employment Type: Permanent
Client: Applied Systems
Interview Process: 1st Round Virtual | 2nd Round Face-to-Face (No relocation candidates)
Notice Period: Immediate / 15 Days
Important Note
Profiles from Andhra Pradesh / Telangana will be auto-rejected
No relocation candidates (must be Bangalore-based, or able to join without any relocation dependency)
Role Overview
We are seeking a highly skilled GCP Data Engineer to design, build, and optimize scalable data pipelines and data platforms. The ideal candidate will have strong experience with Google Cloud Platform (GCP), modern ETL tools, and data warehousing solutions.
Key Responsibilities
Design, develop, and maintain scalable data pipelines on GCP
Build robust ETL workflows with automated data quality checks
Implement data lifecycle, lineage, metadata, and governance frameworks
Develop and optimize BigQuery SQL procedures, functions, and queries
Design and deploy end-to-end pipelines (ingestion → transformation → consumption)
Work with Dataflow, Dataproc, and Airflow/Composer for data processing (a minimal DAG sketch follows this list)
Perform data parsing (JSON, XML, text) using Python/Dataflow (see the parsing sketch after this list)
Build and maintain data models (star schema, snowflake schema)
Optimize performance, scalability, and reliability of data systems
Implement monitoring, alerting, and troubleshooting mechanisms
Ensure data quality, validation, and consistency across systems
Maintain technical documentation for pipelines and data models
Collaborate with cross-functional teams and stakeholders
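To make the orchestration expectation concrete, here is a minimal sketch of an Airflow/Composer DAG wiring an ingestion step into a transformation step. It is illustrative only, not the client's actual pipeline: the DAG id, task names, and stubbed records are hypothetical, and it assumes Airflow 2.x with the standard PythonOperator.

    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def extract(**context):
        # Stub for the ingestion step; a real DAG would read from GCS,
        # Pub/Sub, or a source database. These records are invented.
        return [{"id": 1, "value": 42}]

    def transform(**context):
        # Pull the ingested rows from XCom and apply a trivial transform.
        rows = context["ti"].xcom_pull(task_ids="extract")
        return [{**row, "value": row["value"] * 2} for row in rows]

    with DAG(
        dag_id="daily_ingest",            # hypothetical DAG name
        start_date=datetime(2026, 1, 1),
        schedule="@daily",
        catchup=False,
    ) as dag:
        extract_task = PythonOperator(task_id="extract", python_callable=extract)
        transform_task = PythonOperator(task_id="transform", python_callable=transform)
        extract_task >> transform_task    # ingestion feeds transformation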
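And a small sketch of the kind of JSON/XML parsing the role calls for, using only the Python standard library; the sample payloads and field names are invented for illustration.

    import json
    import xml.etree.ElementTree as ET

    raw_json = '{"order_id": 7, "items": [{"sku": "A1", "qty": 2}]}'
    order = json.loads(raw_json)
    skus = [item["sku"] for item in order["items"]]              # ['A1']

    raw_xml = "<order id='7'><item sku='A1' qty='2'/></order>"
    root = ET.fromstring(raw_xml)
    xml_skus = [item.get("sku") for item in root.iter("item")]   # ['A1']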
Mandatory Skills
GCP Services: BigQuery, Dataflow, Dataproc
Orchestration: Apache Airflow / Composer
Programming: Python
Data Transformation: DBT
Version Control: Git
ETL Tools: Talend / Fivetran / Celigo (or similar)
Required Experience
6+ years in Data Engineering / ETL Development
At least 1 year of hands-on experience with GCP
Strong experience building cloud-based data warehouses (BigQuery preferred; a query sketch follows this list)
Hands-on experience with ETL pipeline development & optimization
Expertise in SQL, Python, DBT, Airflow
Experience working with structured & semi-structured data (JSON/XML)
Familiarity with Data Lake & Data Warehousing concepts
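For reference, a minimal sketch of querying BigQuery from Python with the google-cloud-bigquery client; the project id and table name are hypothetical, and running it requires real GCP credentials.

    from google.cloud import bigquery

    client = bigquery.Client(project="my-project")   # hypothetical project id

    query = """
        SELECT user_id, COUNT(*) AS events
        FROM `my-project.analytics.events`           -- hypothetical table
        GROUP BY user_id
        ORDER BY events DESC
        LIMIT 10
    """
    # client.query() submits the job; .result() blocks until rows are ready.
    for row in client.query(query).result():
        print(row.user_id, row.events)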
Good to Have
Experience with Celigo integrations
Strong knowledge of data modeling techniques
Exposure to data governance and metadata management tools
Education
Bachelor's degree in Computer Science / IT / MIS / CIS, or equivalent experience
Key Competencies
Strong problem-solving and analytical skills
Ability to work independently and in teams
Excellent communication and stakeholder management skills
Ideal Candidate Profile
Hands-on GCP Data Engineer with end-to-end pipeline ownership
Strong in BigQuery + DBT + Airflow ecosystem
Experience in scalable, production-grade data systems