
Cloud Databricks Architect

ARA Resources Pvt. Ltd.

2 - 5 years

Bengaluru

Posted: 12/02/2026

Job Description

About ARA's Client

ARA's Client is a global leader in end-to-end data and analytics solutions, with nearly two decades of experience helping enterprises unlock value from data-driven capabilities. Operating at the scale of a global consulting firm with the agility of a niche specialist, ARA's Client partners with customers worldwide on large-scale data modernization and analytics transformation initiatives.

With a strong presence in India and a growing global footprint, ARA's Client is known for its high-performance culture, reusable analytics accelerators, and focus on continuous learning and innovation.


Role Summary

Design and optimize cloud-native data architectures on platforms such as Databricks and Snowflake, enabling scalable data engineering, advanced analytics, and AI/ML solutions aligned with business needs.


Key Responsibilities

  • Design and implement Lakehouse architectures using Databricks, Delta Lake, and Apache Spark.
  • Lead the development of data pipelines, ETL/ELT processes, and data integration strategies.
  • Collaborate with business and technical teams to define data architecture standards, governance, and security models.
  • Optimize performance and cost-efficiency of Databricks clusters and jobs.
  • Provide technical leadership and mentorship to data engineers and developers.
  • Integrate Databricks with cloud platforms (Azure, AWS, or GCP) and enterprise systems.
  • Evaluate and recommend tools and technologies to enhance the data ecosystem.
  • Ensure compliance with data privacy and regulatory requirements.
  • Contribute to proposal and pre-sales activities.


Must-Have Qualifications

  • Expertise in data engineering, data architecture, or analytics.
  • Hands-on experience with Databricks and Apache Spark.
  • Hands-on experience with Snowflake.
  • Strong proficiency in Python, SQL, and PySpark.
  • Deep understanding of Delta Lake, Lakehouse architecture, and data mesh principles.
  • Deep understanding of Data Governance and Unity Catalog.
  • Experience with cloud platforms (Azure preferred, AWS or GCP acceptable).


Good-to-Have Skills:

  • Good understanding of CI/CD pipelines.
  • Working experience with GitHub.
  • Experience delivering data engineering solutions in other tools while balancing architecture requirements, required effort, and customer-specific needs.


Qualifications:

  • Bachelor's degree in computer science, engineering, or a related field.
  • Demonstrated continued learning through one or more technical certifications or related methods.
  • 10+ years of relevant experience in ETL tools.
  • Relevant experience in the Retail domain.
