GCP Data Engineer
People Prime Worldwide
2 - 5 years
Hyderabad
Posted: 20/12/2025
Job Description
Our client is a trusted global innovator of IT and business services. They help clients transform through consulting, industry solutions, business process services, digital and IT modernization, and managed services, enabling clients, as well as society, to move confidently into the digital future. They are committed to their clients' long-term success and combine global reach with local client attention to serve organizations in over 50 countries around the globe.
Job Title: GCP Data Engineer
Location: Pune/ Hyderabad
Experience: 5+ years
Job Type: Contract to hire
Notice Period: Immediate joiners
Mandatory Skills:
Primary Skills: GCP, Python, SQL, BigQuery, Pub/Sub, Dataflow
Secondary Skills: Airflow, Control-M
Responsibilities:
Monitor and maintain data ingestion pipelines built on GCP services such as Dataflow and Dataproc
Respond to and resolve production issues related to data pipelines and databases on GCP
Conduct root cause analysis and implement corrective actions to prevent recurrence
Implement monitoring solutions to detect pipeline failures, data quality issues or performance bottlenecks
Configure alerts for critical issues and proactively address potential problems
Maintain documentation for data pipelines, ETL processes, database schemas, and system configurations
Share knowledge and best practices with team members and stakeholders
Plan and implement scaling strategies for data infrastructure to accommodate growing data volumes and user demands
Collaborate with analysts, BI developers, and business stakeholders to understand data requirements and support initiatives
Participate in cross-functional meetings and provide technical guidance on data engineering solutions
Identify opportunities to automate repetitive tasks, streamline workflows, and optimize data processes
Stay updated with GCP services and features to leverage new capabilities and improve data engineering operations
Optimize data processing workflows and queries for performance, cost efficiency and scalability
Implement caching mechanisms and indexing strategies to enhance query response times
Implement data quality checks and validation processes to ensure data accuracy and consistency
Enforce data governance policies related to data access, privacy, and compliance
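To give candidates a concrete sense of the data-quality work described above, here is a minimal, hedged sketch of a batch validation step. The field names, types, and error-rate threshold are hypothetical, not taken from the client's actual pipelines, and the sketch uses only the Python standard library.

```python
import json

# Hypothetical per-record rules: required fields and their expected types.
REQUIRED_FIELDS = {"order_id": str, "amount": float, "country": str}

def validate_record(record):
    """Return a list of human-readable issues for one record (empty = clean)."""
    issues = []
    for field, expected_type in REQUIRED_FIELDS.items():
        if field not in record or record[field] is None:
            issues.append(f"missing field: {field}")
        elif not isinstance(record[field], expected_type):
            issues.append(f"bad type for {field}: {type(record[field]).__name__}")
    return issues

def quality_report(records, max_error_rate=0.05):
    """Summarise batch quality; flag the batch if too many records fail."""
    failures = {i: validate_record(r) for i, r in enumerate(records)}
    failures = {i: msgs for i, msgs in failures.items() if msgs}
    error_rate = len(failures) / len(records) if records else 0.0
    return {
        "total": len(records),
        "failed": len(failures),
        "error_rate": error_rate,
        "passed": error_rate <= max_error_rate,
        "failures": failures,
    }

if __name__ == "__main__":
    batch = [
        {"order_id": "A1", "amount": 10.5, "country": "IN"},
        {"order_id": "A2", "amount": "oops", "country": "IN"},
    ]
    print(json.dumps(quality_report(batch), indent=2))
```

In production such a report would typically feed the alerting described above (e.g. a failed batch raises an incident rather than silently loading bad rows).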
Technical Skills
5+ years of experience in Python and SQL
2+ years of experience developing, deploying, and testing Airflow DAGs
2+ years of experience with BigQuery
Good knowledge of GCP Cloud Run
Good to have: familiarity with Qlik and Looker reporting tools
Knowledge of JIRA and Confluence
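For context on how the primary skills fit together in practice, here is a minimal, hypothetical sketch of the element-wise transform a Dataflow job might apply to incoming Pub/Sub messages before streaming rows into a BigQuery table. The schema and field names are invented for illustration, and the sketch deliberately uses only the Python standard library rather than any actual GCP client API.

```python
import json
from datetime import datetime, timezone

# Hypothetical required fields for a BigQuery events table.
REQUIRED = ("event_id", "user_id", "event_ts")

def parse_pubsub_message(data: bytes):
    """Turn one Pub/Sub payload into a BigQuery-ready row dict.

    Returns None for malformed messages so the caller can route them
    to a dead-letter destination instead of failing the pipeline.
    """
    try:
        payload = json.loads(data.decode("utf-8"))
    except (UnicodeDecodeError, json.JSONDecodeError):
        return None
    if any(field not in payload for field in REQUIRED):
        return None
    try:
        # Normalise epoch seconds to an ISO-8601 UTC timestamp string.
        ts = datetime.fromtimestamp(float(payload["event_ts"]), tz=timezone.utc)
    except (TypeError, ValueError, OverflowError):
        return None
    return {
        "event_id": str(payload["event_id"]),
        "user_id": str(payload["user_id"]),
        "event_ts": ts.isoformat(),
    }
```

The dead-letter pattern (returning None rather than raising) reflects the production-support emphasis in this role: one bad message should never stall an entire streaming pipeline.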