Senior Data Architect

Tredence Inc.

5 - 10 years

Bengaluru

Posted: 10/12/2025

Job Description

SNOWFLAKE ARCHITECT

Location: Bengaluru, Pune, Gurugram, Kolkata, Chennai and Hyderabad

Primary Roles and Responsibilities :

  • Working experience in Snowflake: use of the SnowSQL CLI, Snowpipe, creation of custom functions and Snowflake stored procedures, schema modelling, performance tuning, etc.
  • Expertise in Snowflake data modelling, ELT using Snowflake SQL, Snowflake task orchestration, implementing complex stored procedures, and standard DWH and ETL concepts
  • Extensive experience with the DBT CLI, DBT Cloud, GitHub version control and repositories, and DBT scripting to design and develop SQL processes for complex ELT and data pipeline builds.
  • Ability to independently envision and develop innovative ETL and reporting solutions and execute them through to completion.
  • Triage issues to find gaps in existing pipelines and fix them
  • Analyze data quality, align the technical design with data governance, and address operational (non-business) requirements during the design and build of data pipelines
  • Develop and maintain data pipelines using DBT
  • Provide advice, guidance, and best practices around Snowflake
  • Provide guidance on moving data across different environments in Snowflake
  • Create relevant documentation around database objects
  • Troubleshoot production support issues post-deployment and come up with solutions as required
  • Good understanding and knowledge of the CI/CD process and GitHub → DBT → Snowflake integrations.
  • Advanced SQL knowledge and hands-on experience writing complex queries using analytical functions; troubleshooting, problem-solving, and performance tuning of SQL queries accessing the data warehouse; strong knowledge of stored procedures.
  • Experience with advanced Snowflake concepts such as resource monitors, virtual warehouse sizing, query performance tuning, zero-copy cloning, and time travel, and an understanding of how to apply these features
  • Help junior team members resolve issues and technical challenges.
  • Drive technical discussions with client architects and team members
  • Good experience developing scripts for data auditing and automating manual activities across database platforms.
  • Understanding of the full software lifecycle and how development teams should work with DevOps to deliver more software, faster.

Excellent communication skills; experience working in Agile/Scrum methodology.


Skills and Qualifications :

  • Bachelor's and/or master's degree in Computer Science, or equivalent experience.
  • Must have 13+ years of total IT experience and 5+ years of experience in data integration, ETL/ELT development, and database or data warehouse design
  • Deep understanding of Star and Snowflake dimensional modelling.
  • Strong knowledge of Data Management principles
  • Working experience in Snowflake: use of the SnowSQL CLI, schema modelling, performance tuning, etc.
  • Develop and maintain data pipelines using DBT
  • Experience with writing complex SQL queries, especially dynamic SQL
  • Expertise in Snowflake data modelling, ELT using Snowflake SQL, Snowflake task orchestration, implementing complex stored procedures, and standard DWH and ETL concepts
  • Experience with performance tuning and optimization of SQL queries
  • Experience working with retail data
  • Experience with data security and role-based access controls
  • Comfortable working in a dynamic, fast-paced, innovative environment with several ongoing concurrent projects
  • Should have experience working in Agile methodology
  • Strong verbal and written communication skills.

Strong analytical and problem-solving skills with high attention to detail.
