Data Engineer

Apptad

8–10 years

Bengaluru

Posted: 05/02/2026

Job Description

Job Title: Data Engineer Specialist (Snowflake)

Location: Offshore / India (Open to BLR & HYD)

Experience: 10–12 years


Key Responsibilities

  • Analyze and solve complex problems using technical expertise, judgment, and prior experience
  • Provide informal guidance and support to new team members
  • Explain complex technical concepts in a clear and straightforward manner

1. Data Engineering & Modeling

  • Design & Develop Scalable Data Pipelines: Build and manage end-to-end data pipelines on AWS using Glue (ETL), Kafka, DMS, Lambda, and Step Functions
  • Workflow Orchestration: Build, deploy, and manage automated workflows using Apache Airflow to ensure efficient data processing
  • Snowflake Data Warehouse: Design, implement, and maintain Snowflake data warehouses ensuring optimal performance and scalability
  • Infrastructure Automation: Automate cloud infrastructure provisioning using Terraform and CloudFormation, ensuring security and scalability
  • Data Modeling: Design and implement high-performance logical and physical data models using Star and Snowflake schemas
  • Modeling Tools: Use Erwin or similar tools to create, maintain, and optimize data models aligned with business requirements
  • Continuous Optimization: Monitor and enhance data models to improve performance, scalability, and security

2. Collaboration, Communication & Continuous Improvement

  • Collaborate closely with data scientists, analysts, and business stakeholders to gather requirements and deliver tailored data solutions
  • Provide guidance on data security best practices and ensure adherence to secure coding standards
  • Stay updated with emerging trends in data engineering, cloud technologies, and data security
  • Proactively identify opportunities for system optimization, automation, and performance improvements


Key Skills & Expertise

  • Snowflake: Hands-on experience with performance tuning, RBAC, dynamic masking, data sharing, encryption, and row/column-level security
  • Data Modeling: Strong expertise in physical and logical data modeling using Star and Snowflake schemas
  • AWS & Cloud Tooling: DMS, Glue (ETL), Step Functions, Airflow, Lambda, CloudFormation, S3, IAM, EKS, and Terraform
  • Programming: Proficiency in Python, R, Scala, PySpark, and SQL (including stored procedures)
  • DevOps & CI/CD: Experience with CI/CD pipelines and IaC tools such as Terraform, JFrog, Jenkins, and CloudFormation
  • Problem Solving: Strong analytical and troubleshooting skills
  • Communication: Excellent interpersonal and stakeholder management skills


Qualifications & Experience

  • Bachelor's degree in Computer Science, Engineering, or a related field
  • 7–8 years of experience in designing and implementing large-scale Data Lake and Data Warehouse solutions

Certifications

  • AWS Certified Data Analytics – Specialty or AWS Certified Solutions Architect (Preferred)
  • Snowflake Advanced Architect and/or Snowflake Core certification (Required)
