Data Engineer - Airflow DAGs
GS Lab & GAVS
5 - 8 years
Remote
Posted: 25/07/2025
Job Description
Our Culture
At Neurealm, we believe in the power of performance, creativity, and collaboration. Our team members, whom we call "Neuronauts," thrive in an environment driven by innovation, trust, and continuous learning. We encourage everyone to challenge boundaries and explore the edge of what’s possible.
Join us and be part of a purpose-led, AI-driven future. At Neurealm, your ideas matter, your work has impact, and your career can reach new heights.
We look forward to the possibility of welcoming you into our world — now as Neurealm — where human ingenuity and technology come together to shape what’s next!
WORK TIMINGS: 2 PM - 11 PM IST (3 hours overlap with the EST time zone)
Description:
We seek a dedicated and detail-oriented Senior Developer to join our dynamic team.
The successful candidate will handle repetitive technical tasks such as Healthy Planet MS SQL file loads into a data warehouse, monitor Airflow DAGs, manage alerts, and rerun failed processes.
The role also involves monitoring various daily and weekly jobs, which may include generating revenue cycle reports and delivering data to external vendors.
The ideal candidate will have strong experience with MS/Oracle SQL, Python, Epic Health Systems, and other relevant technologies.
Overview:
As a Senior Developer, you will play a vital role in improving the operation of our data load and management processes.
Your primary responsibilities will be to ensure the accuracy and timeliness of data loads, maintain the health of data pipelines, and verify that all scheduled jobs complete successfully.
You will collaborate with cross-functional teams to identify and resolve issues, improve processes, and maintain a high standard of data integrity.
Responsibilities:
· Manage and perform Healthy Planet file loads into a data warehouse.
· Monitor Airflow DAGs for successful completion, manage alerts, and rerun failed tasks as necessary.
· Monitor and oversee other daily and weekly jobs, including FGP cash reports and external reports.
· Collaborate with the data engineering team to streamline data processing workflows.
· Develop automation scripts to reduce manual intervention in repetitive tasks using SQL and Python.
· Ensure all data-related tasks are performed accurately and on time.
· Investigate and resolve data discrepancies and processing issues.
· Prepare and maintain documentation for processes and workflows.
· Conduct periodic data audits to ensure data integrity and compliance with defined standards.
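The monitor-and-rerun responsibility above can be sketched in plain Python. This is a minimal, illustrative sketch: task names and the task-state dict are hypothetical stand-ins for what the Airflow REST API or CLI would report, not the actual pipelines referenced in this role.

```python
def find_failed_tasks(task_states):
    """Return task IDs whose state is 'failed' or 'upstream_failed'.

    task_states: dict mapping task_id -> Airflow-style state string.
    """
    return sorted(t for t, s in task_states.items()
                  if s in ("failed", "upstream_failed"))


def rerun_plan(task_states, attempts, max_retries=2):
    """Split failed tasks into those to rerun vs. those to escalate.

    attempts: dict mapping task_id -> number of reruns already tried.
    Tasks under max_retries are queued for rerun; the rest are
    escalated for manual investigation (alerting).
    """
    rerun, escalate = [], []
    for task in find_failed_tasks(task_states):
        if attempts.get(task, 0) < max_retries:
            rerun.append(task)
        else:
            escalate.append(task)
    return rerun, escalate


# Hypothetical example: one task worth rerunning, one out of retries.
states = {
    "load_hp_file": "failed",        # illustrative task names
    "build_report": "success",
    "deliver_vendor": "upstream_failed",
}
attempts = {"load_hp_file": 1, "deliver_vendor": 2}
print(rerun_plan(states, attempts))  # (['load_hp_file'], ['deliver_vendor'])
```

In practice the state dict would be populated from Airflow itself (e.g. its REST API) and the rerun step would clear the failed task instances, but the triage logic follows this pattern.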
Skillset Requirements:
· MS/Oracle SQL
· Python
· Data warehousing and ETL processes
· Monitoring tools such as Apache Airflow
· Data quality and integrity assurance
· Strong analytical and problem-solving abilities
· Excellent written and verbal communication
Additional Skillset:
· Familiarity with monitoring and managing Apache Airflow DAGs.
Experience:
· Minimum of 5 years’ experience in a similar role, with a focus on data management and process automation.
· Proven track record of successfully managing complex data processes and meeting deadlines.
Education:
Bachelor’s degree in Computer Science, Information Technology, Data Science or a related field.
About Company
GS Lab and GAVS have merged to offer end-to-end digital transformation and IT services. Their combined expertise spans AI/ML, cloud modernization, infrastructure management, and cybersecurity. They serve clients in healthcare, BFSI, and enterprise IT.