Azure Data Warehouse Developer
Ensar Solutions Inc
2 - 5 years
Hyderabad
Posted: 28/02/2026
Job Description:
We are seeking an Azure Data Warehouse Developer with 5+ years of experience to join our growing data team. The ideal candidate will have a deep understanding of cloud-based data architecture, specifically within Microsoft Azure, and will play a key role in designing, implementing, and maintaining data warehouse solutions on Azure. You will collaborate with cross-functional teams to deliver scalable, high-performance data solutions.
Key Responsibilities:
- Designing and implementing dimensional models: star and snowflake schemas, slowly changing dimensions (SCD Types 1, 2, 3), and fact/dimension table strategies
- Writing complex, optimized T-SQL including window functions, CTEs, dynamic SQL, and query tuning
- Table design considerations within Fabric Warehouse: distribution strategies, indexing, partitioning, and statistics management
- Data quality enforcement through constraints, validation logic, and reconciliation frameworks
- Understanding of Fabric Warehouse's current limitations vs. Azure Synapse Dedicated SQL Pools, and how to work within them
- Hands-on Synapse Analytics experience: dedicated/serverless SQL pools, distribution strategies
- Deep ADLS Gen2 knowledge: hierarchical namespace, Parquet/Delta file formats, partitioning strategies
- Storage security: RBAC, ACLs, managed identities, private endpoints
- Managing large-scale file formats: Parquet, Delta, CSV, JSON, and Avro
- Experience connecting to source systems: databases, REST APIs, flat files, SaaS platforms
- Query and storage performance tuning: file size optimization, indexing, partitioning
- Hands-on experience with Fabric's T-SQL Warehouse and understanding how it differs from Synapse
- Knowledge of cross-database and cross-item queries spanning Lakehouse and Warehouse
- Understanding of Fabric's V-Order optimization and Delta Lake file management
- Identifying and resolving data skew, spill, and shuffle issues in Spark workloads
- Monitoring capacity usage, query execution plans, and resource contention
- Experience building semantic models directly on top of Fabric Warehouse or Lakehouse
- Understanding of when to choose Fabric DW over Lakehouse SQL endpoint for a given workload
- OneLake architecture: shortcuts to ADLS Gen2, Lakehouse vs. Warehouse trade-offs
- Medallion architecture implementation (Bronze/Silver/Gold)
- Lakehouse mirroring and real-time data ingestion patterns
- Fabric Data Pipelines, Notebooks (PySpark/SQL), and Dataflows Gen2
- Delta Lake operations: OPTIMIZE, VACUUM, Z-ordering, time travel
- Incremental load strategies: watermark patterns, change data capture (CDC), and merge/upsert logic
- Implementing row-level security (RLS) and column-level security (CLS) in Fabric Warehouse
- Managing workspace roles, item permissions, and OneLake access controls
- Understanding of data lineage tracking and cataloging within Fabric
- Compliance awareness: GDPR, HIPAA, or industry-specific data handling requirements
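Two of the patterns named above, SCD Type 2 and merge/upsert logic, can be sketched in plain Python. This is a minimal illustrative model only; the field names (`key`, `start_date`, `end_date`, `is_current`) are assumed for the example, not a prescribed schema:

```python
def scd2_apply(history, change, effective_date):
    """Apply an SCD Type 2 change: expire the current version of the
    changed business key, then append the new version.

    'history' is a list of dicts standing in for a dimension table;
    the field names here are illustrative assumptions, not a fixed schema.
    """
    for rec in history:
        if rec["key"] == change["key"] and rec["is_current"]:
            rec["is_current"] = False          # close out the old version
            rec["end_date"] = effective_date
    history.append({**change,
                    "start_date": effective_date,
                    "end_date": None,
                    "is_current": True})       # new current version
    return history
```

In a real warehouse this bookkeeping is typically expressed as a single T-SQL `MERGE` (or Delta Lake `MERGE INTO`) statement; the sketch only shows the version-expiry logic being merged.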
Required Skills:
- Direct Lake mode in Power BI and semantic model design
- End-to-end security model spanning ADLS and Fabric
- Row-level and column-level security in DW and semantic layers
- Sensitivity labels and Microsoft Information Protection
- Designing end-to-end solutions where ADLS underpins Fabric via OneLake shortcuts
- ELT/ETL pattern selection and pipeline orchestration
- Microsoft Purview for governance and lineage across ADLS and Fabric
- Multi-workspace and multi-domain architecture patterns for enterprise environments
- Disaster recovery, backup strategies, and SLA considerations within Fabric
- Fabric REST APIs and CI/CD using Azure DevOps or GitHub
- Comfortable with Fabric's rapid evolution and incomplete documentation
- Experience integrating with human-in-the-middle workflows