Azure Data Engineer

Wall Street Consulting Services LLC

2 - 5 years

Hyderabad

Posted: 10/12/2025

Job Description

Job Title: Azure Data Engineer

Location: Onsite in Rayadurgam, Hyderabad

Employment Type: Full-time


About the Role: We are seeking an experienced Azure Data Engineer with hands-on expertise in designing and implementing scalable data platforms in the Microsoft Azure ecosystem. The ideal candidate will have delivered multiple projects end to end and will bring deep knowledge of Azure Data Factory (ADF), Azure Data Lake, Databricks, and SQL, along with P&C Insurance domain experience. This role will be responsible for architecting data solutions, developing data pipelines, and partnering with business and technology stakeholders to drive enterprise data initiatives.


Key Responsibilities:

  • Lead architecture, design, and implementation of enterprise data platforms on Microsoft Azure.
  • Develop and optimize ETL/ELT data pipelines using Azure Data Factory, Databricks, and Azure Data Lake Storage.
  • Architect and manage data models, data ingestion frameworks, data lake structures, and data transformations.
  • Collaborate with BI, Data Governance, and Application teams to ensure integration and alignment across the data ecosystem.
  • Ensure data quality, lineage, metadata management, and security across cloud data assets.
  • Work with business stakeholders in Property & Casualty (P&C) Insurance to translate business requirements into technical designs and data flows.
  • Develop performance monitoring dashboards and CI/CD release automations for data pipelines.
  • Champion best practices, coding standards, and reusable framework components across data engineering.


Required Skills & Experience:

  • 6-8+ years of professional experience in Data Engineering, including architecture and data platform design.
  • Strong hands-on experience with Azure Data Factory (ADF), Azure Data Lake (Gen2), Databricks (PySpark / Spark SQL), and Azure SQL / Synapse.
  • End-to-end cloud data pipeline design and production deployment experience is required.
  • Solid understanding of Data Modeling, Data Warehousing, and ELT/ETL patterns.
  • Proven experience in the Property & Casualty (P&C) Insurance domain; familiarity with policy, claims, and billing data structures preferred.
  • Strong SQL development skills, including performance tuning and query optimization.
  • Experience with CI/CD pipelines (Azure DevOps / Git).
  • Understanding of data governance, security, and master data management (MDM) principles.
  • Excellent communication, documentation, and stakeholder engagement skills.


Preferred Qualifications:

  • Azure Certifications (AZ-305, DP-203, DP-500, or equivalent).
  • Knowledge of Guidewire data models or other major P&C insurance systems.
  • Experience with Snowflake or Synapse Analytics is a plus.
