AWS Databricks Engineer
CitiusTech
2 - 5 years
Bengaluru
Posted: 21/03/2026
Job Description
We are looking for a highly skilled Databricks Engineer to design, build, optimize, and maintain large-scale data processing pipelines and analytics workloads on the Databricks platform.
This role requires strong hands-on experience with Databricks Jobs, Delta Tables, Pipelines, Spark SQL, AWS integrations, Genie workflows, and Databricks features such as Unity Catalog, Delta Lake optimizations, Workflow orchestration, and advanced cluster management.
The ideal candidate is expected to have proven experience in both data engineering and cloud engineering.
Key Responsibilities -
Databricks Platform Engineering
- Build, configure, and maintain Databricks Jobs, Workflows, Pipelines, and Delta Tables.
- Develop scalable, high-performance ETL pipelines using PySpark, SQL, and Databricks notebooks.
- Apply optimization techniques to SQL queries and Delta Lake tables, including partitioning strategies and liquid clustering.
- Ensure workflows scale efficiently and maintain reliability.
- Manage cluster performance, debug jobs, and configure resource auto-scaling.
- Document workflows and data models.
- Build, publish, and maintain Databricks Dashboards.
- Apply Databricks backup and restore methods as required.
Security, Governance & Compliance
- Implement Unity Catalog, data lineage, access controls, and governance best practices.
- Manage secrets, tokens, and secure integrations.
- Ensure compliance with enterprise security and data governance standards.
AWS & Databricks Integration
- Integrate Databricks with AWS services (S3, EC2, IAM).
- Manage IAM roles and secure data lake permissions.
- Optimize cluster configuration for cost, performance, and scalability on AWS.