đź”” FCM Loaded

Lead DataOps Engineer

BD

5 - 10 years

Bengaluru

Posted: 25/02/2026


Job Description

Lead DataOps Engineer

Are you ready to join a team known for innovation and ambition at BD? We are hiring a Lead DataOps Engineer to take on an essential role within our Digital Transformation Organization. In this position you will work with advanced technology in a constantly evolving environment and contribute to developing outstanding solutions that improve business results and patient health.

Position Summary

As a Lead DataOps Engineer, you will take on a data operations leadership role within our Digital Transformation Organization. You are highly motivated, self-starting, and dedicated to delivering results, with a strong sense of ownership and the persistence to see issues through to resolution. Your excellent organizational skills will allow you to juggle multiple tasks and thrive in a fast-paced environment with minimal oversight. Strong analytical, problem-solving, and troubleshooting abilities are essential in this position.

Educational Background

  • Bachelor’s degree or equivalent experience in Computer Science, Data Analytics, or related fields.

Professional Experience

  • More than 5 years of professional experience with Databricks, Azure Data Factory, and Power BI.

Job Responsibilities

Databricks and Azure Data Factory tools:

  • Proactive Monitoring & Support: Monitor ADF pipeline runs, Databricks scheduled job runs, and activity logs across all tracks. Handle failures, retries, and blocking issues.
  • Ensure SLA compliance for data movement and transformation.
  • Process files by manually initiating the necessary jobs according to business needs.
  • Upgrade clusters when performance issues arise.
  • Pause & resume jobs and identify impacted objects due to source-side issues.
  • Provide month-end assistance according to business requirements.
  • Build artifacts and document issues & object-level inventory.
  • Collaborate with teams and lead knowledge-sharing sessions.
  • Connect daily with teams and clients to deliver timely updates.
  • Perform data corrections to maintain consistency between source and downstream systems.
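The SLA-monitoring duty above can be sketched as a small pure-Python check over pipeline run records. This is an illustrative sketch only: the record fields (`name`, `status`, `start`, `end`) and the two-hour SLA limit are assumptions for the example, not BD's actual schema, thresholds, or the ADF/Databricks API response format.

```python
from datetime import datetime, timedelta

# Illustrative SLA window; a real deployment would read this from config.
SLA_LIMIT = timedelta(hours=2)

def find_sla_breaches(runs, sla_limit=SLA_LIMIT):
    """Return runs that failed outright or exceeded the SLA window."""
    breaches = []
    for run in runs:
        duration = run["end"] - run["start"]
        if run["status"] != "Succeeded" or duration > sla_limit:
            breaches.append({**run, "duration": duration})
    return breaches

# Hypothetical run records, as a monitoring job might collect them.
runs = [
    {"name": "daily_load", "status": "Succeeded",
     "start": datetime(2026, 2, 25, 1, 0), "end": datetime(2026, 2, 25, 2, 30)},
    {"name": "cdc_merge", "status": "Failed",
     "start": datetime(2026, 2, 25, 1, 0), "end": datetime(2026, 2, 25, 1, 10)},
    {"name": "dim_refresh", "status": "Succeeded",
     "start": datetime(2026, 2, 25, 1, 0), "end": datetime(2026, 2, 25, 3, 15)},
]

for b in find_sla_breaches(runs):
    print(f"{b['name']}: {b['status']}, ran {b['duration']}")
```

A check like this would typically run after each scheduled window and feed the incident process described below.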

Incident Management:

  • Perform root cause analysis for failures and raise incidents when necessary.
  • Communicate issues to engineering teams and keep them informed.
  • Interact with users and resolve issues.

Scheduling & Trigger Management:

  • Configure and maintain scheduled, tumbling window, and event-based triggers for ADF.
  • Modify schedules according to business requirements.

Deployment & CI/CD:

  • Validate BD internal requests and coordinate deployments across Dev, QA, and Prod environments.
  • Maintain Git integration and release pipelines.

DevOps Activities:

  • Build folders in DEV, QAS, and PROD ADLS.
  • Manage files across ADLS environments.
  • Make changes in DAB files and upload them to ADLS.

Power BI:

  • Monitor scheduled dataset, dataflow, and pipeline refreshes across workspaces.
  • Track refresh failures, identify root causes, and take corrective actions.
  • Maintain daily refresh status reports and send updates to collaborators.
  • Ensure gateway health, connection stability, and capacity utilization.
  • Troubleshoot visualization issues, including filters, bookmarks, drill-downs, and broken navigation.
  • Validate and fix data mismatches, performance problems, and incorrect measures.
  • Support users with access requests, RLS issues, and permissions.
  • Coordinate deployments using deployment pipelines, PBIX file migration, and parameter updates.
  • Validate releases in QA and Production through Quality Control checks.
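The refresh-monitoring and daily-status duties above can be sketched as a small summarizer over refresh-history records. The fields (`dataset`, `status`) and status values are simplified assumptions for illustration; the real Power BI REST API refresh history returns richer records than this.

```python
from collections import Counter

def refresh_status_report(refreshes):
    """Summarize refresh outcomes and list datasets needing follow-up."""
    counts = Counter(r["status"] for r in refreshes)
    failed = sorted({r["dataset"] for r in refreshes if r["status"] == "Failed"})
    return {"counts": dict(counts), "needs_followup": failed}

# Hypothetical refresh history across workspaces.
history = [
    {"dataset": "Sales", "status": "Completed"},
    {"dataset": "Finance", "status": "Failed"},
    {"dataset": "Finance", "status": "Completed"},
    {"dataset": "Ops", "status": "Failed"},
]

report = refresh_status_report(history)
print(report["counts"])          # {'Completed': 2, 'Failed': 2}
print(report["needs_followup"])  # ['Finance', 'Ops']
```

The `needs_followup` list is what would drive the root-cause and escalation steps, while the counts feed the daily status report sent to collaborators.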

Knowledge and Skills

  • Practical experience working directly with Databricks, Azure Data Factory, and Power BI.
  • Strong ETL and SQL knowledge is a must.
  • Advanced knowledge of Power BI (DAX, Power Query, data modeling).
  • Familiarity with Azure Data Services or similar cloud platforms.
  • Excellent problem-solving and interpersonal skills.
  • Experience with Azure DevOps / GitHub.
  • Strong critical thinking, troubleshooting, and leadership skills.

Desired / Additional Skills & Knowledge

  • Knowledge of ServiceNow and Microsoft Azure cloud platform.
  • Good interpersonal, behavioral, and soft skills.

If you are ready to make an impact and drive digital transformation at BD, we want to hear from you!

Required Skills

Batch Monitoring, Communication, Data Engineering, Data Monitoring, DataOps, Data Pipelines, End-to-End Orchestration, Microsoft Azure Databricks, Operations Orchestration, Problem Resolution, Root Cause Analysis (RCA), Security Monitoring, SLA Monitoring

Optional Skills

Collaborating, Customer Engagement, Emotional Intelligence, User Engagement


Primary Work Location

IND Bengaluru - Technology Campus


About Company

BD (Becton, Dickinson and Company) is a global medical technology company headquartered in Franklin Lakes, New Jersey. Founded in 1897, BD is a leading provider of medical devices, instruments, and reagents for healthcare institutions, life sciences, and laboratories. The company's product offerings include diagnostic equipment, infusion pumps, insulin delivery systems, surgical instruments, and laboratory supplies. BD is also involved in the development of vaccines, diagnostics, and critical care solutions. With a focus on improving patient outcomes and advancing healthcare, BD aims to enhance the quality of care through innovative technologies and solutions in the medical and life sciences fields.
