Data Engineer
EliteRecruitments
2 - 5 years
Hyderabad
Posted: 17/02/2026
Job Description
Hiring a Data Engineer for an insurtech organisation with locations in Mumbai, Pune, Hyderabad, and Bangalore
Primary Responsibilities:
Development (Approximately 70%)
Implement and manage Snowflake data warehouse platforms, ensuring scalability and performance.
Design, develop, and implement data transformations using ETL/ELT tools and/or stored procedures within the data warehouse and data mart.
Build and maintain data ingestion frameworks using core Python to integrate data from multiple heterogeneous sources.
Build and optimize data pipelines for processing large-scale datasets efficiently.
Develop and support data sharing options for our clients using various technologies (Kafka, Azure Service Bus, Azure Data Factory, Streams, etc.)
Build processes in an automated, templated, repeatable fashion to increase reusability and predictable patterns.
Produce solutions that maximize performance, scalability, and extensibility.
Assist with requirements gathering and reverse engineering of existing processes and reports
Partner with Reporting and BI teams to procure and transform the necessary data to support our internal and external stakeholders
Develop and maintain documentation for various audiences
Participate in agile/scrum process to groom and develop tasks
Design (Approximately 10%)
Drive the platform modernization, consolidation, and migration of applications to Azure cloud
Collaborate with Business Leaders/Product Management and Engineering to continuously improve illumifin's data strategy
Other responsibilities as required
Customer Interaction (Approximately 10%)
Designing and implementing data solutions for our customers' needs and use cases, ranging from streaming to Snowflake to analytics and beyond, across a continually evolving technical stack
Guide customers through the process of migrating off LTCG's on-prem solution to Snowflake, and develop methodologies to improve the migration process
Provide guidance on resolving customer-specific technical challenges arising from LTCG's migration to Snowflake
Troubleshoot issues as they arise
Mentoring (Approximately 10%)
Coaching and mentoring team members to build a strong data platform team.
Providing thought leadership by recommending the right technologies and solutions for a given use case, from the application layer to infrastructure, and applying the team leadership and coding skills needed to take solutions into production, ensuring performance, security, scalability, and robust data integration
This role will also be expected to share learnings with peers for knowledge sharing
Qualifications:
Degree in computer science, engineering, mathematics or related fields, or equivalent experience
Minimum 8+ years of experience in data warehousing/BI/analytics
Minimum 5+ years as a Data Engineer or equivalent
Strong hands-on experience with advanced Python
Understanding of migration, CI/CD pipelines and DevOps practices for data engineering.
Good understanding of Data Modelling and Data Governance.
Preferred Qualifications:
Experience with Snowflake DWH and with Python, focusing on libraries such as NumPy and Pandas, and on DataFrames/dataclasses.
Experience with Snowflake capabilities like Snowpipe, Streams, etc., and with data transformation tools such as dbt
Experience with SQL
Familiarity with Unix Scripting
Familiarity and experience with common BI and data exploration tools (e.g. Power BI)
Familiarity with Data Governance processes
Strong analytical mind with hands-on experience in pattern-based solutions, data transformations, and creative ways of using BI tools, such as leveraging them to convert canned reports into exploratory reports
If you find the role interesting, please share your resume.