Data Engineer
Omni Recruit Private Limited
7 - 9 years
Bengaluru
Posted: 12/02/2026
Job Description
Position Description
The Data Engineering function is growing across ingestion, integration, transformation, and consumption
capabilities to deliver data products that transform how the organization leverages data. The Data Engineering and
Platforms team is seeking an experienced Senior Data Engineer to provide technical leadership and expertise to
both internal and partner teams within the Enterprise Data environment. This role will be instrumental in shaping
technical roadmaps, defining best practices, and contributing significantly to architectural decisions to drive
innovation at scale.
This role demands deep technical expertise, strong problem-solving skills, and architectural understanding across
data engineering initiatives. It requires close collaboration with architects and stakeholders to contribute to
enterprise-wide data engineering strategies, ensuring scalable, resilient, and cost-optimized data solutions.
Success in this engineering role requires a highly motivated individual with an innovative mindset and a
willingness to drive tangible outcomes. The individual must be able to articulate complex technical topics,
collaborate with internal and external partners, and ensure quality delivery of required data products.
Responsibilities
Lead the technical design and hands-on development of scalable, high-performance data platforms and
solutions, and provide technical guidance to the engineering team.
Drive data-driven decision-making to deliver measurable business impact across the organization.
Contribute to defining and implementing enterprise data architecture principles aligned with organizational data
strategy.
Migrate legacy data products to modern data architectures, ensuring performance and scalability.
Establish standards, frameworks, and patterns aligned with security, compliance, and performance best
practices.
Provide hands-on technical guidance and mentor junior engineers.
Architect and oversee fault-tolerant, cost-optimized data pipelines across Azure, Databricks, and GCP.
Partner with security teams to ensure compliance with security and regulatory standards.
Collaborate with Product Owners, Data Architects, and Engineering Squads in agile delivery.
Evaluate new tools and trends, and contribute to AI-driven automation, DataOps, and DevSecOps adoption.
Lead technical discussions in sprints, backlog refinement, and iterative solution design.
Drive POCs, rapid prototyping, and pilot implementations for new approaches.
Provide advanced technical solutions for complex ingestion, processing, and storage challenges.
Ensure consistency in data engineering patterns across domains for a unified data ecosystem.
Qualifications
Bachelor's degree in Computer Science, Software Engineering, or equivalent professional experience.
7 years of experience delivering enterprise-scale data solutions in the cloud (Databricks, Azure, GCP
preferred).
Additional Skills / Preferences
Proven track record in leading and delivering complex data projects.
Expertise in data management, information integration, and analytics practices.
Experience with modern data architectures and methodologies (Domain-driven design, scalable pipelines,
DataOps CI/CD, API-centric design, SQL/NoSQL, FAIR principles).
Strong expertise in Azure Data Factory (ADF).
Proficiency with GitHub for version control and CI/CD workflows.
Experience in DevSecOps culture, CI/CD, and TDD practices.
Familiarity with ML workflows, data quality, and data governance.
Experience in complex, multi-stakeholder environments.
Experience mentoring junior engineers.
Strong interpersonal and communication skills.