Technical Specialist-Data Engg

Birlasoft

2 - 5 years

Noida

Posted: 22/09/2025

Job Description

Area(s) of responsibility

ROLE SUMMARY

We are seeking a highly skilled PySpark Developer with hands-on experience in Databricks to join the IT Systems Development unit in an offshore capacity. This role focuses on designing, building, and optimizing large-scale data pipelines and processing solutions on the Databricks Unified Analytics Platform. The ideal candidate will have expertise in big data frameworks, distributed computing, and cloud platforms, with a deep understanding of Databricks architecture. This is an excellent opportunity to work with cutting-edge technologies in a dynamic, fast-paced environment.

ROLE RESPONSIBILITIES

Data Engineering and Processing:

  • Develop and manage data pipelines using PySpark on Databricks (a minimal ETL sketch follows this list).
  • Implement ETL/ELT processes to process structured and unstructured data at scale.
  • Optimize data pipelines for performance, scalability, and cost-efficiency in Databricks.
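
To give a flavor of the day-to-day work, the snippet below is a minimal PySpark ETL sketch for Databricks: read raw files, apply basic transformations, and write a Delta table. All paths, schemas, and table names are illustrative placeholders, not specifics from this posting.

```python
# Minimal PySpark ETL sketch for Databricks; paths, columns, and table names
# below are hypothetical placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()  # provided automatically in Databricks notebooks

# Extract: read raw landing-zone files (hypothetical ADLS path)
raw = (spark.read
       .option("header", "true")
       .csv("abfss://landing@examplestorage.dfs.core.windows.net/orders/"))

# Transform: type casting, basic cleansing, and a derived partition column
orders = (raw
          .withColumn("order_ts", F.to_timestamp("order_ts"))
          .withColumn("amount", F.col("amount").cast("double"))
          .filter(F.col("amount").isNotNull())
          .withColumn("order_date", F.to_date("order_ts")))

# Load: write to a Delta table, partitioned for downstream query performance
(orders.write
 .format("delta")
 .mode("overwrite")
 .partitionBy("order_date")
 .saveAsTable("analytics.orders_curated"))
```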

Databricks Platform Expertise:

  • Design, develop, and deploy solutions using Azure services (Data Factory, Databricks, PySpark, SQL).
  • Develop and maintain scalable data pipelines and build new Data Source integrations to support increasing data volume and complexity.
  • Leverage the Databricks Lakehouse architecture for advanced analytics and machine learning workflows.
  • Manage Delta Lake for ACID transactions and data versioning (see the Delta Lake sketch after this list).
  • Develop notebooks and workflows for end-to-end data solutions.
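
The Delta Lake responsibilities above roughly translate to code like the following sketch, which upserts a staging DataFrame into a Delta table with MERGE (an ACID operation) and then uses table history and time travel for versioning. Table and column names are hypothetical.

```python
# Sketch of a Delta Lake upsert (MERGE) and time travel on Databricks.
# Table and column names are hypothetical placeholders.
from delta.tables import DeltaTable
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

updates = spark.table("staging.orders_updates")                 # hypothetical staging table
target = DeltaTable.forName(spark, "analytics.orders_curated")  # hypothetical target table

# ACID upsert: matched rows are updated, new rows inserted, in one atomic commit
(target.alias("t")
 .merge(updates.alias("u"), "t.order_id = u.order_id")
 .whenMatchedUpdateAll()
 .whenNotMatchedInsertAll()
 .execute())

# Data versioning: inspect the commit history and read an earlier snapshot (time travel)
spark.sql("DESCRIBE HISTORY analytics.orders_curated").show(truncate=False)
previous = spark.sql("SELECT * FROM analytics.orders_curated VERSION AS OF 0")
print("Rows in version 0:", previous.count())
```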

Cloud Platforms and Deployment:

  • Deploy and manage Databricks on Azure (e.g., Azure Databricks).
  • Use Databricks Jobs, Clusters, and Workflows to orchestrate data pipelines (a Jobs API sketch follows this list).
  • Optimize resource utilization and troubleshoot performance issues on the Databricks platform.
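
Orchestration with Databricks Jobs can also be scripted; the sketch below creates a two-task job through the Jobs 2.1 REST API. The workspace URL, token handling, notebook paths, and cluster settings are placeholders, and the payload should be verified against the current Databricks Jobs API documentation.

```python
# Sketch: creating a multi-task Databricks Job via the Jobs 2.1 REST API.
# Host, token, notebook paths, and cluster settings are hypothetical; store
# credentials in a secret scope or Azure Key Vault rather than in code.
import requests

DATABRICKS_HOST = "https://adb-1234567890123456.7.azuredatabricks.net"  # placeholder workspace URL
TOKEN = "dapiXXXXXXXX"                                                   # placeholder token

job_spec = {
    "name": "orders-daily-pipeline",
    "job_clusters": [{
        "job_cluster_key": "etl_cluster",
        "new_cluster": {
            "spark_version": "14.3.x-scala2.12",
            "node_type_id": "Standard_DS3_v2",
            "num_workers": 2,
        },
    }],
    "tasks": [
        {
            "task_key": "ingest",
            "job_cluster_key": "etl_cluster",
            "notebook_task": {"notebook_path": "/Repos/data-eng/pipelines/ingest_orders"},
        },
        {
            "task_key": "curate",
            "depends_on": [{"task_key": "ingest"}],
            "job_cluster_key": "etl_cluster",
            "notebook_task": {"notebook_path": "/Repos/data-eng/pipelines/curate_orders"},
        },
    ],
}

resp = requests.post(
    f"{DATABRICKS_HOST}/api/2.1/jobs/create",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json=job_spec,
    timeout=30,
)
resp.raise_for_status()
print("Created job_id:", resp.json()["job_id"])
```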

CI/CD and Testing:

  • Build and maintain CI/CD pipelines for Databricks workflows using tools like Azure DevOps, GitHub Actions, or Jenkins.
  • Write unit and integration tests for PySpark code using frameworks like Pytest or unittest (see the test sketch below).
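
A typical shape for such tests is sketched below: a local SparkSession fixture plus a Pytest case that exercises a small transformation against an in-memory DataFrame. The function and column names are illustrative only.

```python
# Sketch of a Pytest unit test for a PySpark transformation, runnable locally
# (outside Databricks). The transformation and column names are hypothetical.
import pytest
from pyspark.sql import SparkSession, functions as F


def add_order_date(df):
    """Transformation under test: derive order_date from order_ts."""
    return df.withColumn("order_date", F.to_date("order_ts"))


@pytest.fixture(scope="session")
def spark():
    # Small local Spark session shared across the test session
    return (SparkSession.builder
            .master("local[2]")
            .appName("unit-tests")
            .getOrCreate())


def test_add_order_date(spark):
    df = spark.createDataFrame(
        [("o-1", "2025-09-22 10:15:00")],
        ["order_id", "order_ts"],
    )
    result = add_order_date(df).collect()[0]
    assert str(result["order_date"]) == "2025-09-22"
```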

Collaboration and Documentation:

  • Work closely with data scientists, data analysts, and IT teams to deliver robust data solutions.
  • Document Databricks workflows, configurations, and best practices for internal use.

About Company

Birlasoft is a global IT services and consulting company that is part of the CK Birla Group. It specializes in digital transformation, enterprise application services, and IT modernization for industries such as manufacturing, life sciences, BFSI, and energy. Birlasoft is known for its strong capabilities in SAP, Oracle, cloud, and analytics, helping clients drive innovation, reduce costs, and improve agility.
