Snowflake Lead / Sr. Lead
Birlasoft
6 - 12 years
Bengaluru
Posted: 23/04/2026
Job Description
Job Title: Data Engineer (Snowflake + DBT)
Location: Bangalore, Hyderabad, Pune, Mumbai, Noida
Experience: 6-12 Years
Employment Type: Full-time
Job Summary
We are seeking a Data Engineer (6+ years) with strong expertise in Snowflake, DBT, Apache Airflow, and StreamSets, and mandatory experience in Regulatory Data within the Life Sciences domain. This role will lead enterprise data platform modernization, including legacy-to-cloud migrations, multi-source data integration, and regulatory-compliant data engineering in highly regulated environments.
Key Responsibilities
1) Data Platform Architecture (Snowflake)
a. Design and implement scalable Snowflake-based data platforms (Landing, Curated, and Data Mart layers).
b. Implement Snowpipe, Streams & Tasks (CDC), Time Travel, and Zero-copy cloning.
c. Ensure data security (RBAC, masking, row-level security) and performance optimization.
2) Data Migration & Modernization
a. Lead end-to-end migration from legacy systems (Teradata, Oracle, SQL Server, Netezza) to Snowflake.
b. Define migration strategies (bulk load, incremental, parallel run validation).
c. Perform data profiling, transformation mapping, reconciliation, and validation.
3) Data Ingestion & Integration
a. Build scalable ingestion pipelines for structured, semi-structured, and API-based data.
b. Use StreamSets, Snowpipe, and CDC mechanisms for real-time and batch ingestion.
c. Develop metadata-driven, resilient pipelines with monitoring and schema evolution support.
4) Transformation (DBT)
a. Architect DBT layers (staging, intermediate, marts).
b. Implement incremental models, snapshots, testing frameworks, and lineage documentation.
c. Optimize transformations for Snowflake performance and cost efficiency.
5) Orchestration & Automation
a. Orchestrate workflows using Apache Airflow (DAGs, SLAs, recovery).
b. Implement CI/CD pipelines for DBT, Airflow, and Snowflake.
c. Build data observability and monitoring frameworks.
6) Data Modelling
a. Design dimensional, Data Vault 2.0, and canonical models.
b. Enable cross-domain data harmonization across enterprise systems.
7) Life Sciences Domain & Regulatory Compliance
a. Hands-on experience with Regulatory Information Management (RIM) systems and submission data.
b. Strong understanding of Clinical, Regulatory, Pharmacovigilance, and RWE data domains.
c. Working knowledge of GxP, 21 CFR Part 11, HIPAA/GDPR, and ALCOA+ compliance.
d. Understanding of IDMP standards.
e. Ensure audit readiness, traceability, and data integrity for regulatory use cases.
Required Qualifications:
6+ years in Data Engineering and enterprise data platforms.
Strong hands-on expertise in:
- Snowflake
- DBT
- Apache Airflow
- StreamSets
Proven experience in:
- Large-scale data migration & modernization
- Multi-source ingestion frameworks
- Mandatory: Experience with Regulatory Data in Life Sciences
Experience with AWS/Azure/GCP
