Senior Data Engineer
Radiant Digital
10 - 12 years
Hyderabad
Posted: 31/01/2026
Job Description
Position: Senior Data Engineer GCP
Experience: 10+ Years
Job Type: Full-time
Position Overview:
We are seeking a highly skilled Senior Data Engineer with over 10 years of experience in designing, building, and optimizing large-scale data solutions within the Google Cloud Platform (GCP) ecosystem. The successful candidate will work closely with business stakeholders, data scientists, and application teams to design pipelines and data architectures that feed into BigQuery and manage enterprise-wide data governance using Dataplex. This role requires strong SQL expertise, deep knowledge of GCP-native services, and the ability to ensure data solutions are scalable, reliable, and secure.
Key Responsibilities:
- Design and implement scalable data pipelines to ingest, process, and manage structured and unstructured data in GCP.
- Develop and optimize BigQuery data warehouses for analytics, reporting, and machine learning workloads.
- Leverage Dataplex to establish data governance, metadata management, lineage, and quality frameworks across the enterprise.
- Collaborate with data scientists, analysts, and application developers to enable advanced analytics and AI/ML workloads.
- Implement data security, compliance, and access controls aligned with organizational and regulatory requirements.
- Automate infrastructure and workflows using Terraform, Cloud Composer (Airflow), and CI/CD pipelines.
- Monitor, troubleshoot, and optimize data workflows for cost efficiency, reliability, and performance.
- Document solution designs, data models, and operational procedures to ensure maintainability and knowledge transfer.
- Stay current with emerging GCP services, data engineering best practices, and modern data platform patterns.
Required Skills:
- Minimum of 10 years in data engineering, with at least 5 years in GCP-based environments.
- Strong proficiency in SQL (including query optimization and advanced analytical functions).
- Hands-on experience with BigQuery (partitioning, clustering, optimization, cost control).
- Expertise in Dataplex for governance, metadata, and lifecycle management.
- Experience with ETL/ELT pipeline development using Dataflow, Dataproc, Pub/Sub, Cloud Functions, or Cloud Run.
- Knowledge of data modeling techniques (star schema, snowflake, data vault) and best practices for large-scale analytics platforms.
- Proficiency in Python or Java for building scalable data pipelines.
- Knowledge of Terraform or Deployment Manager for infrastructure as code.
- Experience with CI/CD tools (e.g., Cloud Build, GitLab CI/CD).
- Knowledge of relational and NoSQL databases (e.g., PostgreSQL, Cloud Spanner, Firestore).
Preferred Qualifications:
- Bachelor's or Master's degree in Computer Science, Information Systems, or a related field (or equivalent experience).
- Google Cloud Professional Data Engineer or Professional Cloud Architect certification strongly preferred.
- Exposure to machine learning pipelines and integration with Vertex AI is a plus.
- Experience with real-time data streaming using Pub/Sub or Kafka.
