Senior Data Engineer (Databricks | Insurance | Data Migration)
PwC India
12 - 14 years
Mumbai
Posted: 17/12/2025
Job Description
Location: Pan India
Experience: 7-12 years
Role Overview
We are looking for a highly skilled Senior Manager / Manager / Senior Data Engineer with deep expertise in Databricks data management, logical and physical data modelling, and insurance domain data workflows. The candidate will work on a strategic data migration initiative for a leading UK-based insurance company, moving data from Guidewire into Databricks Silver and Gold layers with strong governance, lineage, and scalability standards.
Key Responsibilities
Databricks Data Engineering & Management
- Design, build, and optimize Silver and Gold layer data pipelines in Databricks using PySpark, SQL, Delta Lake, and Workflow orchestration.
- Implement data quality, lineage, schema evolution, and governance controls across curated layers.
- Optimize Databricks jobs for performance, scalability, and cost efficiency.
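As an illustration of the kind of data-quality gate typically applied when promoting records into a Silver layer, here is a minimal pure-Python sketch. The rule set and field names are hypothetical; in Databricks this logic would normally be expressed as Delta Live Tables expectations or PySpark filters rather than plain Python.

```python
# Minimal sketch of a Silver-layer quality gate (hypothetical rules and fields).
# In a real Databricks pipeline this would run as PySpark/Delta expectations;
# plain Python is used here only to show the shape of the logic.

REQUIRED_FIELDS = ("policy_id", "effective_date", "premium")

def validate_record(record: dict) -> list:
    """Return a list of rule violations for one raw (Bronze) record."""
    errors = []
    for field in REQUIRED_FIELDS:
        if record.get(field) in (None, ""):
            errors.append(f"missing:{field}")
    premium = record.get("premium")
    if isinstance(premium, (int, float)) and premium < 0:
        errors.append("negative:premium")
    return errors

def promote_to_silver(bronze: list) -> tuple:
    """Split raw records into Silver-ready rows and quarantined rows."""
    silver, quarantine = [], []
    for rec in bronze:
        errs = validate_record(rec)
        if errs:
            # Keep failing rows with their violations for lineage/audit.
            quarantine.append({**rec, "_dq_errors": errs})
        else:
            silver.append(rec)
    return silver, quarantine
```

Quarantining failed rows (rather than silently dropping them) is what makes the lineage and governance controls mentioned above auditable.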
Guidewire-to-Databricks Migration
- Lead the end-to-end migration of large-scale insurance data from Guidewire PolicyCenter/ClaimCenter/BillingCenter into Databricks.
- Map and transform complex Guidewire entity structures into normalized and star-schema models.
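To illustrate the mapping step above, here is a small sketch that splits a denormalized policy extract into one star-schema fact row plus dimension rows. All field and table names are hypothetical placeholders, not the actual Guidewire PolicyCenter entity model.

```python
# Sketch of flattening one denormalized policy extract into star-schema rows.
# Field/table names are illustrative assumptions, not real Guidewire entities.

def to_star_schema(extract: dict) -> dict:
    """Split one flattened policy record into fact and dimension rows."""
    dim_customer = {
        "customer_key": extract["customer_id"],
        "customer_name": extract["customer_name"],
    }
    dim_product = {
        "product_key": extract["product_code"],
        "product_name": extract["product_name"],
    }
    fact_policy = {
        "policy_id": extract["policy_id"],
        # Fact rows reference dimensions by surrogate/business keys.
        "customer_key": dim_customer["customer_key"],
        "product_key": dim_product["product_key"],
        "written_premium": extract["premium"],
    }
    return {
        "fact_policy": fact_policy,
        "dim_customer": dim_customer,
        "dim_product": dim_product,
    }
```

In the actual migration the same split would be done per entity in PySpark, with dimensions deduplicated and keyed before facts are written to the Gold layer.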
Data Modelling & Architecture
- Develop robust logical and physical data models aligned to insurance business processes.
- Build high-quality curated data marts (Gold) for analytics, reporting, pricing, underwriting, and claims.
- Define standards for metadata, naming conventions, partitioning, and model documentation.
Insurance Domain Expertise
- Understand core insurance data entities such as policy, claims, billing, customer, underwriting, rating, and product hierarchies.
- Apply domain knowledge to rationalize Guidewire data structures and create business-ready datasets.
Solutioning & Ideation
- Collaborate with client SMEs, architects, and business analysts to shape data solutions and propose design improvements.
- Ideate, simplify complex data flows, and contribute to the overall solution architecture.
Required Skills & Experience
Technical
- 7-12 years of experience in data engineering, data modelling, and data management.
- Strong hands-on experience in Databricks, Delta Lake, PySpark, Spark SQL, and ETL/ELT pipelines.
- Expertise in logical & physical data modelling (3NF, Star Schema, Data Vault preferred).
- Practical knowledge of Guidewire data model and prior migration experience (mandatory).
- Experience working with large-scale insurance datasets.
- Strong understanding of data quality frameworks, lineage, cataloging, and governance.
Soft Skills
- Strong problem-solving and conceptualization / ideation capability.
- Excellent communication and stakeholder-management skills for a UK client environment.
- Ability to work in fast-paced delivery tracks with cross-functional global teams.
Preferred Qualifications
- Certifications in Databricks, Azure/AWS, and Data Migration are an added advantage.
- Experience delivering enterprise-grade data lake or lakehouse architectures.
Why Join This Role?
- Work on a flagship insurance data modernisation project for a top UK carrier.
- Opportunity to shape enterprise-scale data models on the Databricks Lakehouse.
- High-visibility role with strong career growth in insurance data engineering.