GCP Data Engineer
Ampstek
2 - 5 years
Bengaluru
Posted: 30/04/2026
Job Description
About Us:
AmpsTek, a global technology leader since 2013, is transforming how businesses approach technology and staffing solutions. Founded by seasoned technology leaders across the UK, Europe, APAC, North America, and LATAM, and with registered offices in 30+ countries, we deliver exceptional service, scalable solutions, and measurable impact.
With a portfolio of 200+ clients and millions of users across web and mobile platforms, we empower businesses to innovate, grow, and succeed.
Join our team and be part of a dynamic, growth-oriented organization that values talent, creativity, and results.
Role : GCP Data Engineer
Location : Bengaluru (Hybrid 2 days onsite/week)
Contract to Hire (C2H)
Key Responsibilities:
Design and develop solutions using tools such as Dataflow, Dataproc, and BigQuery.
Apply extensive hands-on experience in object-oriented programming with Python and the PySpark APIs.
Build data pipelines for large volumes of data.
Extract, transform, and load data from various sources, including databases, APIs, and flat files, using Python and PySpark.
Implement and maintain data ingestion processes.
Write basic to advanced optimized queries in SQL and BigQuery.
Design, implement, and maintain data transformation jobs using the most efficient tools and technologies.
Ensure the performance, quality, and responsiveness of solutions through rigorous testing, validation, and monitoring of data accuracy and reliability.
Participate in code reviews to maintain code quality.
Work with data scientists, analysts, and other stakeholders to understand data requirements and deliver solutions.
Adhere to security best practices for cloud environments and ensure compliance with regulatory standards.
Manage and optimize the entire data lifecycle, from ingestion to archiving and deletion.
Write shell scripts as needed.
Use GitHub for source version control.
Set up and maintain CI/CD pipelines.
Create and maintain documentation for data solutions, including design specifications and user guides.
Troubleshoot, debug, and upgrade existing applications and ETL job chains.
Stay up to date with the latest advancements in data technologies and incorporate them into Amex's data strategy.
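To illustrate the kind of extract-transform-load work described above, here is a minimal sketch in plain Python using only the standard library (the file layout, column names, and table name are hypothetical; a production pipeline would use PySpark or Dataflow and load into BigQuery rather than SQLite):

```python
import csv
import io
import sqlite3

def extract(csv_text):
    """Extract: parse records from a flat file (here an in-memory CSV)."""
    return list(csv.DictReader(io.StringIO(csv_text)))

def transform(rows):
    """Transform: normalize types and drop malformed records."""
    out = []
    for row in rows:
        try:
            amount = float(row["amount"])
        except (KeyError, ValueError):
            continue  # skip rows with a missing or non-numeric amount
        out.append({"id": row["id"].strip(), "amount": round(amount, 2)})
    return out

def load(rows, conn):
    """Load: idempotently upsert the cleaned rows into a table."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS txns (id TEXT PRIMARY KEY, amount REAL)"
    )
    conn.executemany(
        "INSERT OR REPLACE INTO txns (id, amount) VALUES (:id, :amount)", rows
    )
    conn.commit()

if __name__ == "__main__":
    raw = "id,amount\na1,10.5\na2,not-a-number\na3,7.25\n"
    conn = sqlite3.connect(":memory:")
    load(transform(extract(raw)), conn)
    print(conn.execute("SELECT COUNT(*) FROM txns").fetchone()[0])  # 2 valid rows
```

The `INSERT OR REPLACE` keyed on a primary key keeps the load step idempotent, so re-running the pipeline after a failure does not duplicate data.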
Required Skills and Qualifications:
Bachelor's degree in Computer Science, Engineering, or a related field.
Proven experience as a Data Engineer or in a similar role.
Strong proficiency in object-oriented programming using Python.
Experience with ETL jobs design principles.
Solid understanding of SQL and data modelling.
Knowledge of Unix/Linux and shell scripting principles.
Familiarity with Git and version control systems.
Experience with Jenkins and CI/CD pipelines.
Knowledge of software development best practices and design patterns.
Excellent problem-solving skills and attention to detail.
Strong communication and collaboration skills.
Experience with cloud platforms such as Google Cloud Platform (GCP).
