Databricks Platform Expert

MICHELIN

6 - 8 years

Pune

Posted: 25/02/2026

Job Description

Key Responsibilities

  • Manage, configure, and administer Databricks workspaces, clusters, SQL Warehouses, serverless compute, jobs, and workspace objects.

  • Implement and manage Unity Catalog, including catalogs, schemas, tables, access controls, and data lineage.

  • Optimize cluster policies, auto-scaling strategies, and cost management for Serverless and Classic compute.

  • Serve as the SME for Databricks infrastructure, governance, and security best practices.

  • Monitor workspace performance, cluster stability, logs, job reliability, and platform health.

  • Implement CI/CD pipelines for notebooks, jobs, and Delta Live Tables using Git integration.

  • Support user provisioning, access controls (ACLs), secrets management, and workspace SSO.

  • Write efficient Spark (PySpark / SQL / Scala) code for ETL, data transformations, and pipeline optimizations.

  • Assist data engineering teams with Spark job debugging, performance tuning, and code reviews.

  • Build and maintain production-grade pipelines leveraging Delta Lake, Databricks Jobs, and DLT.

  • Implement and manage RBAC, SCIM provisioning, IAM, service principals, and cluster access controls.

  • Ensure compliance with enterprise data governance, audit, and logging requirements.

  • Manage secrets in Azure Key Vault and enforce secure credential handling.

  • Support audit reports, compliance reviews, and workspace security configuration.

  • Monitor job failures, cluster lifecycle performance, and system events using Databricks logs and cloud-native monitoring tools (Azure Monitor).

  • Create automated alerts and observability dashboards for platform usage, cost, and performance.

  • Troubleshoot Databricks runtime issues, library conflicts, and Spark execution failures.

  • Collaborate with cloud and network teams on VNet, peering, and private-link connectivity issues.

  • Develop cost governance policies for cluster sizes, job policies, and SQL Warehouse tiers.

  • Identify opportunities to reduce cost via autoscaling, spot instances (classic clusters), and job consolidation.

 

Required Qualifications

  • 4–6 years of experience working with Databricks as an administrator or data engineer.

  • Strong expertise in Apache Spark programming (PySpark preferred; SQL or Scala is a plus).

  • Hands-on experience with Databricks Jobs, cluster configuration, SQL Warehouses, and Unity Catalog.

  • Deep understanding of Delta Lake, ACID transactions, and lakehouse architecture.

  • Experience with Git, CI/CD, and DevOps concepts for data engineering workflows.

  • Knowledge of cloud platforms (Azure).

  • Familiarity with IAM, networking basics, monitoring tools, and security patterns.

About Company

Michelin is a global tire manufacturer known for its high-performance tires used in automobiles, trucks, and aircraft. The company is committed to sustainability, producing eco-friendly products and investing in technologies that improve fuel efficiency, safety, and environmental impact.
