ETL Data Engineer

AGAM Technologies

3 - 5 years

Coimbatore

Posted: 28/04/2026

Job Description

Experience: 2 to 4 Years

Location: Coimbatore (Work from Office)

Employment Type: Full-time

Availability: Immediate Joiner Preferred



Job Summary


We are looking for a motivated ETL Data Engineer with 2-3 years of experience building and maintaining data pipelines. The ideal candidate should have hands-on experience with Snowflake and strong expertise in ETL/ELT processes, data warehousing, and SQL. You will play a key role in transforming raw data into reliable, scalable, and high-quality datasets for analytics and business intelligence.


Key Responsibilities


Design, develop, and maintain ETL/ELT pipelines using Snowflake

Design, develop, and maintain modular dbt models (SQL & Python) using best practices such as DRY (Don't Repeat Yourself) and version control; a brief sketch of a typical staging model follows this list.

Extract data from multiple sources (databases, APIs, flat files, cloud systems)

Transform and load data into Snowflake data warehouse efficiently

Write optimized SQL queries for data processing and transformation

Develop and maintain data models (star schema, snowflake schema)

Ensure data quality, integrity, and consistency across pipelines

Monitor, troubleshoot, and optimize ETL workflows and job performance

Collaborate with data analysts, BI developers, and business stakeholders

Implement data validation, error handling, and logging mechanisms

Maintain documentation for data pipelines, workflows, and architecture
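
To give candidates a concrete sense of the modular dbt work described above, here is a minimal sketch of a staging model. It is illustrative only: the source ('raw'), table ('orders'), and column names are hypothetical, not taken from an actual project.

-- models/staging/stg_orders.sql (hypothetical names throughout)
-- One CTE per logical step keeps the model modular and DRY;
-- {{ source() }} keeps the raw-table reference version-controlled in sources.yml.

with source_data as (

    select * from {{ source('raw', 'orders') }}

),

renamed as (

    select
        order_id,
        customer_id,
        cast(order_ts as timestamp) as ordered_at,
        amount
    from source_data
    where order_id is not null  -- basic data-quality guard

)

select * from renamed

Downstream models would reference this via {{ ref('stg_orders') }}, so dbt can build the dependency graph automatically.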


Required Skills & Qualifications


2-3 years of experience in ETL/Data Engineering

Hands-on experience with dbt (Core or Cloud), including macros, packages, and hooks.

Hands-on experience with Snowflake (tables, stages, warehouses, data loading)

Strong proficiency in SQL (joins, window functions, CTEs, performance tuning); a worked example follows this list

Experience with ETL/ELT tools (e.g., Azure Data Factory, Informatica, Talend)

Good understanding of data warehousing concepts

Knowledge of data modeling techniques (dimensional modeling)

Familiarity with file formats (CSV, JSON, Parquet)

Experience handling large datasets and optimizing performance

Basic scripting knowledge (Python or Shell scripting is a plus)

Strong analytical and problem-solving skills
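
As a concrete illustration of the Snowflake loading and SQL proficiency expected above, the following sketch loads staged CSV data into a table and then uses a CTE with a window function to pick each customer's latest order. Stage, table, and column names (raw_stage, orders, customer_id, ordered_at) are hypothetical.

-- Load staged CSV data into a Snowflake table (names are illustrative)
copy into orders
from @raw_stage/orders/
file_format = (type = 'csv' skip_header = 1);

-- Latest order per customer, using a CTE and the ROW_NUMBER() window function
with ranked_orders as (
    select
        customer_id,
        order_id,
        ordered_at,
        row_number() over (
            partition by customer_id
            order by ordered_at desc
        ) as rn
    from orders
)
select customer_id, order_id, ordered_at
from ranked_orders
where rn = 1;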


Preferred Skills


Experience with cloud platforms (Azure/AWS/GCP)

Knowledge of Snowflake features like Time Travel, Cloning, and Data Sharing (illustrated briefly after this list)

Familiarity with orchestration tools (Airflow, Prefect, etc.)

Exposure to CI/CD pipelines and version control (Git)

Understanding of Agile methodologies (Scrum/Kanban)
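
For reference, the Snowflake Time Travel and Cloning features mentioned above look like this in practice (table names are illustrative):

-- Query a table as it existed one hour ago, via Time Travel
select *
from orders at(offset => -3600);

-- Zero-copy clone for safe testing; the source table is untouched
create table orders_dev clone orders;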


Education


Bachelor's degree in Computer Science, Information Technology, or a related field


Nice to Have


Snowflake certification (SnowPro Core or equivalent)
