
Data Engineer

Anblicks

2 - 5 years

Hyderabad

Posted: 07/05/2026


Job Description

Job Title: Data Engineer (Snowflake & dbt)

Location: Hyderabad

Experience: 4-8 years


Role Overview

We are looking for a highly skilled Data Engineer with strong hands-on experience in Snowflake and dbt to design, build, and optimize modern cloud data platforms.

The ideal candidate will work closely with Data Architects, Analytics Engineers, and business stakeholders to deliver scalable, high-performance data solutions following modern ELT and analytics engineering best practices.


Key Responsibilities

Data Platform & Pipeline Development

  • Design, develop, and maintain end-to-end ELT pipelines on Snowflake
  • Build scalable and reliable batch data pipelines using SQL and Python
  • Implement and manage Bronze/Silver/Gold data layer architectures
  • Ensure high data quality, reliability, and performance across pipelines
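As a rough illustration of the Bronze/Silver/Gold layering above, a batch step might look like the sketch below. All table shapes, column names, and cleansing rules here are invented for the example; in practice these layers would be Snowflake tables populated by ELT jobs.

```python
# Minimal Bronze -> Silver -> Gold batch sketch using plain Python
# structures (illustrative only; real pipelines run SQL in Snowflake).

def to_silver(bronze_rows):
    """Clean raw (Bronze) records: drop incomplete rows, normalize types."""
    silver = []
    for row in bronze_rows:
        if row.get("order_id") is None or row.get("amount") is None:
            continue  # data-quality rule: reject incomplete records
        silver.append({
            "order_id": int(row["order_id"]),
            "customer": str(row["customer"]).strip().lower(),
            "amount": float(row["amount"]),
        })
    return silver

def to_gold(silver_rows):
    """Aggregate cleaned (Silver) records into a reporting (Gold) summary."""
    totals = {}
    for row in silver_rows:
        totals[row["customer"]] = totals.get(row["customer"], 0.0) + row["amount"]
    return totals

bronze = [
    {"order_id": "1", "customer": " Acme ", "amount": "10.5"},
    {"order_id": None, "customer": "Bad", "amount": "1"},   # dropped in Silver
    {"order_id": "2", "customer": "acme", "amount": "4.5"},
]
print(to_gold(to_silver(bronze)))  # {'acme': 15.0}
```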

dbt & Analytics Engineering

  • Develop and maintain dbt models, tests, snapshots, and documentation
  • Implement dimensional and analytical data models for reporting and BI
  • Apply dbt best practices: modular models, testing, freshness checks, and exposures
  • Optimize transformation logic for cost and performance in Snowflake

Performance & Optimization

  • Tune SQL queries and Snowflake warehouse usage for performance and efficiency
  • Manage clustering, partitioning, and resource monitoring in Snowflake
  • Identify and resolve data quality and pipeline performance issues

Collaboration & Delivery

  • Work closely with Data Architects, BI teams, and business stakeholders
  • Participate in requirement gathering, design discussions, and code reviews
  • Contribute to Agile/Scrum ceremonies and sprint deliveries
  • Maintain clear documentation for data models, pipelines, and processes


Required Technical Skills

  • Strong hands-on experience with Snowflake: warehouses, schemas, roles, performance tuning, and cost optimization
  • Strong experience with dbt (Core or Cloud): models, macros, tests, snapshots, and CI/CD integration
  • Advanced SQL skills (analytical queries, optimization techniques)
  • Working experience with Python for data processing or orchestration
  • Experience with data modeling: dimensional modeling (Star/Snowflake schema) and analytical, reporting-oriented models
  • Experience with Git-based version control and CI/CD pipelines
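To make the dimensional-modeling requirement concrete, a star schema pairs a fact table of measures with dimension tables of descriptive attributes. The hypothetical tables and keys below are invented for the example; in a warehouse this would be a SQL join.

```python
# Illustrative star-schema join: a fact table keyed into a product
# dimension, aggregated for a category-level report.

dim_product = {  # dimension: surrogate key -> descriptive attributes
    10: {"name": "widget", "category": "tools"},
    11: {"name": "gadget", "category": "toys"},
}

fact_sales = [  # fact: measures plus foreign keys into dimensions
    {"product_key": 10, "qty": 3},
    {"product_key": 11, "qty": 5},
    {"product_key": 10, "qty": 2},
]

def sales_by_category(facts, dim):
    """Join facts to the product dimension and sum qty per category."""
    out = {}
    for f in facts:
        cat = dim[f["product_key"]]["category"]
        out[cat] = out.get(cat, 0) + f["qty"]
    return out

print(sales_by_category(fact_sales, dim_product))  # {'tools': 5, 'toys': 5}
```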


Good to Have

  • Experience with ingestion tools such as Fivetran, Informatica, Matillion, Kafka, or Airflow
  • Exposure to cloud platforms (AWS / Azure / GCP)
  • Experience supporting BI tools like Power BI, Looker, Tableau, Sigma
  • Prior experience with data migrations from on-prem or legacy warehouses
  • Familiarity with data governance, data quality frameworks, and monitoring


Soft Skills & Competencies

  • Strong problem-solving and analytical skills
  • Ability to work independently and in team-based delivery models
  • Clear communication with technical and non-technical stakeholders
  • Ownership-driven mindset with attention to detail
  • Comfortable working in fast-paced, client-facing environments


Education

  • Bachelor's degree in Computer Science, Engineering, Data Science, or a related field
