Data Engineer
The Hartford
2 - 5 years
Hyderabad
Posted: 12/02/2026
Job Description
Experience Level: 6 to 9 years
Key Responsibilities
- Lead teams to build and operate a large, complex data ecosystem leveraging data domains, data products, cloud, and a modern technology stack
- Lead teams responsible for implementing data and AI pipelines that bring together structured, semi-structured, and unstructured data to support AI and agentic solutions, including pre-processing with extraction, chunking, embedding, and grounding strategies to prepare the data
- Lead teams implementing real-time data streaming pipelines using technologies such as Apache Kafka, AWS Kinesis, Spark Streaming, or similar
- Stay up to date with industry advancements in GenAI and apply modern technologies and methodologies to our systems, including leading prototypes (POCs), conducting experiments, and recommending innovative tools and technologies to enhance data capabilities that enable business strategy
- Identify and champion AI-driven data engineering productivity capabilities that accelerate the end-to-end data delivery lifecycle
- Lead teams designing and developing graph database solutions for complex data relationships supporting AI systems, including developing and optimizing queries (e.g., Cypher, SPARQL) to enable complex reasoning, relationship discovery, and contextual enrichment for AI agents
- Lead implementation of best practices in reliability engineering, including redundancy, fault tolerance, and disaster recovery strategies
- Simplify architectures to increase velocity, quality, and reliability
- Enable data-driven decisions that advance The Hartford's strategic growth agenda
- Champion a strong data culture across the enterprise
- Own outcomes for programs/portfolios; establish delivery strategy, guardrails, and success metrics
- Drive execution from initiation through release and stabilization; manage scope, estimates, schedules, risks/issues
- Promote advanced Agile behaviors; coach teams toward self-organization and continuous improvement
- Serve as a trusted point of contact for sponsors and leaders; proactively manage risks and dependencies
- Mentor Scrum Masters/Data Engineers/Tech Leads; coordinate partner resources; uphold engineering and documentation standards
- Align delivery with technical design; leverage knowledge of modern data stacks (AWS, Snowflake, ETL/ELT), CI/CD, and DevOps
- Support the development and implementation of project and portfolio strategy, roadmaps, and implementations
- Operationalize GitHub Copilot and Snowflake Copilot across multiple scrum teams with review standards and SDLC governance
- Own portfolio-level AI adoption metrics (hours saved per sprint, PR review effort, pipeline lead time, retrieval success rate); publish quarterly accelerator impact to EDS leadership.
- Standardize Copilot review gates and agent guardrails; achieve a 20% reduction in boilerplate coding time and a 15% improvement in documentation completeness.
Required Skills & Experience:
- 7 to 8 years in technology leadership/delivery/program/project management with complex, multi-team initiatives
- 10+ years of overall data engineering experience, including data solutions, SQL and NoSQL, Snowflake, ETL/ELT tools, CI/CD, big data, cloud technologies (AWS/Google/Azure), Python/Spark, and data mesh, data lake, or data fabric architectures.
- Demonstrated Agile leadership (Scrum/Kanban/SAFe) and coaching skills (3+ years intermediate/expert)
- Successful management of 3-4 scrum teams (3+ years intermediate/expert)
- Coaching and Mentoring skills (3+ years intermediate/expert)
- Excellent communication and stakeholder management (3+ years intermediate/expert)
- Proficiency with delivery tooling (Rally, dashboards) and SDLC governance (3+ years intermediate/expert)
- Proven delivery of GenAI-enabled engineering workflows (prompt-to-SQL, pipeline scaffolds, doc/runbook generation) with human-in-the-loop controls.
Nice to Have
- Experience in data and analytics delivery and cloud migration (AWS/Snowflake)
- Certifications: SAFe (SPC/RTE), PMP, PgMP, CSM/PSM
- Background leading vendor/partner teams and optimizing onshore/offshore models
- Cloud certifications preferred
What We Offer
- Collaborative work environment with global teams.
- Competitive salary and comprehensive benefits.
- Continuous learning and professional development opportunities.
