Snowflake Data Engineer

Anblicks

5-10 years

Hyderabad

Posted: 12/02/2026

Job Description

Experience: 5-10 years


About the Role

We are seeking a skilled Data Engineer to design, build, and optimize modern data pipelines and analytics platforms. The ideal candidate will have strong hands-on experience with Snowflake, dbt, Airflow, and Python/Spark, along with solid SQL and data modeling skills. You'll collaborate with cross-functional teams (data analysts, data scientists, and business stakeholders) to ensure data reliability, scalability, and performance.


Key Responsibilities

Design, develop, and maintain data pipelines for ingestion, transformation, and delivery using Airflow, dbt, and Snowflake.

Implement and optimize ETL/ELT workflows for structured and semi-structured data.

Develop and maintain data models, schemas, and views in Snowflake to support analytics and reporting.

Build and manage data processing frameworks using Spark (PySpark or Spark SQL).

Integrate data from various sources (databases, APIs, files, cloud storage, streaming data).

Monitor data pipelines for performance, reliability, and cost optimization.

Implement data quality checks, observability, and error handling mechanisms.

Collaborate with data analysts/scientists to understand data needs and deliver scalable solutions.

Apply CI/CD best practices for data pipeline deployment and version control (Git).

Ensure compliance with data governance, security, and privacy policies.


Required Skills & Experience

5-10 years of experience in data engineering or a related field.

Strong expertise with Snowflake (warehousing concepts, performance tuning, cost optimization, security).

Proven experience with dbt (data modeling, testing, documentation, modular SQL).

Hands-on experience with Apache Airflow (DAG design, scheduling, orchestration).

Proficiency in SQL and Python for data manipulation and automation.

Experience with Apache Spark (PySpark preferred).

Strong understanding of ETL/ELT design patterns, data modeling (Kimball, Data Vault), and dimensional modeling.

Experience with Git, CI/CD, and Cloud Platforms (AWS, Azure, or GCP).

Knowledge of data quality, observability, and monitoring tools (e.g., Great Expectations, Monte Carlo, or similar).
