Data Engineer ETL

Vistec Partners

2 - 4 years

Noida

Posted: 26/02/2026

Job Description

Position: Data Engineer

Experience: 5 - 6 Years

Location: Work From Home (WFH)

Office Requirement: Once a week in Noida

Time Overlap: Mandatory overlap with US EST


Role Summary

We are seeking an experienced Data Engineer with 5 - 6 years of industry experience, including at least 2 years of hands-on expertise in Databricks and Azure Data Factory (ADF). The role involves designing, building, and optimizing scalable data pipelines and analytics solutions on Azure. Daily overlap with the US EST time zone is required for collaboration with US-based stakeholders.


Key Responsibilities

  • Design, develop, and maintain ETL/ELT pipelines using Azure Data Factory and Databricks.
  • Build scalable batch and streaming data processing workflows.
  • Develop Databricks notebooks, jobs, and Delta Lake tables.
  • Perform performance tuning and cost optimization of data workloads.
  • Implement robust data quality checks, validations, and monitoring.
  • Develop Python-based data transformation and automation scripts.
  • Write and optimize complex SQL queries for analytics and reporting.
  • Collaborate with Analytics, BI, and Product teams.
  • Document technical designs, workflows, and operational procedures.
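The data-quality and validation responsibilities above can be sketched as a minimal, hypothetical check in plain Python, of the kind that might run inside a Databricks notebook step. The field names and rules are illustrative assumptions, not taken from this posting:

```python
# Minimal sketch of a data-quality validation step, as might run
# inside a Databricks notebook or an ADF custom activity.
# Field names and rules below are illustrative assumptions.

def validate_rows(rows):
    """Split rows into valid and rejected based on simple quality rules."""
    valid, rejected = [], []
    for row in rows:
        errors = []
        if not row.get("order_id"):
            errors.append("missing order_id")
        if not isinstance(row.get("amount"), (int, float)) or row["amount"] < 0:
            errors.append("invalid amount")
        if errors:
            # Keep rejected rows with their error list for monitoring
            rejected.append({**row, "_errors": errors})
        else:
            valid.append(row)
    return valid, rejected

sample = [
    {"order_id": "A1", "amount": 120.5},
    {"order_id": "", "amount": 10.0},
    {"order_id": "A3", "amount": -5},
]
valid, rejected = validate_rows(sample)
print(len(valid), len(rejected))  # 1 valid row, 2 rejected
```

In production such checks would typically feed a quarantine table and a monitoring dashboard rather than an in-memory list.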


Required Skills & Experience

  • Databricks: Minimum 2 years of hands-on experience with Spark, notebooks, workflows, Delta Lake.
  • Azure Data Factory (ADF): Minimum 2 years of experience building production-grade pipelines.
  • Python: Strong scripting and transformation capabilities.
  • SQL: Advanced querying, joins, window functions, and optimization.
  • Data Modeling: Knowledge of data warehousing concepts (star/snowflake schemas).
  • Azure Cloud: Familiarity with Azure data services and architecture.
  • Version Control: Experience with Git / DevOps workflows.
  • Debugging: Strong troubleshooting and problem-solving skills.
  • Communication: Ability to collaborate effectively with US-based stakeholders.
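As an illustration of the window-function skills listed above, here is a small self-contained example using Python's built-in sqlite3 module (table and column names are made up for the demo; the syntax in Databricks SQL is analogous):

```python
import sqlite3

# Demo of a SQL window function: rank orders per customer by amount.
# Requires SQLite >= 3.25 (bundled with modern Python builds).
# Table and data are invented for illustration.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (customer TEXT, amount REAL);
    INSERT INTO orders VALUES
        ('alice', 50), ('alice', 120), ('bob', 80);
""")

rows = conn.execute("""
    SELECT customer,
           amount,
           RANK() OVER (PARTITION BY customer ORDER BY amount DESC) AS rnk
    FROM orders
    ORDER BY customer, rnk
""").fetchall()

for customer, amount, rnk in rows:
    print(customer, amount, rnk)
# alice 120.0 1
# alice 50.0 2
# bob 80.0 1
```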


Preferred / Good-to-Have

  • Experience with streaming technologies (Kafka / Event Hub).
  • CI/CD pipelines using Azure DevOps.
  • Expertise in Data Lake / Delta Lake architecture.
  • Exposure to BI tools such as Power BI.
  • Experience in performance and cost optimization initiatives.


Work Conditions

  • Primarily Work From Home (WFH).
  • Mandatory once-a-week office visit in Noida.
  • Mandatory daily overlap with US EST time zone.
  • Flexibility to work evening hours as required.


#DataEngineer #Databricks #AzureDataFactory #Python #SQL #DataEngineering #Hiring #NoidaJobs #WFH #Azure #ETL #BigData
