Job Summary
We are seeking a Senior Developer with 4 to 6 years of experience to join our team. The ideal candidate will have expertise in Spark with Scala, Apache Airflow, Python, and Databricks SQL. Experience in Asset Management Operations is a plus. This is a hybrid role with day shifts and no travel required.
Responsibilities
- Develop and maintain scalable data pipelines using Spark with Scala to ensure efficient data processing.
- Implement and manage workflows using Apache Airflow to automate data tasks and ensure timely execution.
- Write clean, efficient, and maintainable code in Python to support data engineering tasks.
- Use Databricks SQL to perform complex queries and data transformations for analytics and reporting.
- Collaborate with cross-functional teams to understand data requirements and deliver solutions that meet business needs.
- Conduct code reviews to ensure code quality and adherence to best practices.
- Troubleshoot and resolve data pipeline issues to maintain data integrity and availability.
- Optimize data processing performance to improve system efficiency and reduce latency.
- Document data workflows, processes, and code to support knowledge sharing and maintainability.
- Stay current with industry trends and technologies to continuously improve data engineering practices.
- Provide technical guidance and mentorship to junior developers to foster team growth.
- Participate in agile ceremonies, contributing to sprint planning, retrospectives, and daily stand-ups.
- Ensure compliance with data governance and security policies to protect sensitive information.
Qualifications
- Must have strong experience in Spark with Scala for developing scalable data pipelines.
- Must have hands-on experience with Apache Airflow for workflow automation.
- Must be proficient in Python for data engineering tasks.
- Must have experience with Databricks SQL for data querying and transformation.
- Nice to have experience in the Asset Management Operations domain.
- Must have excellent problem-solving skills and attention to detail.
- Must have strong communication and collaboration skills.
- Must be able to work in a hybrid model with day shifts.
- Must be able to work independently and as part of a team.
- Must have a proactive attitude and a willingness to learn new technologies.
- Must have a strong understanding of data governance and security practices.
- Must be able to document processes and code effectively.
Certifications Required
- Certified Spark Developer
- Apache Airflow Certification
- Python Certification