Senior Azure Data Engineer
Tata Consultancy Services
6 - 8 years
Bengaluru
Posted: 17/12/2025
Job Description
Greetings from TCS!
Job Title: Senior Azure Data Engineer
Required Skillset: Azure Data Factory, Databricks, Data Lake, and automation and performance optimization of ETL.
Location: Gurugram
Experience Range: 6-8 years
Must-Have:
- Good knowledge of Databricks Lakehouse and Azure Data Lake concepts, including Databricks Delta
- Delta Live Tables (DLT)
- Strong hands-on experience in ELT pipeline development using Azure Data Factory, Databricks Auto Loader, notebook scripting, and Azure Synapse Copy activity and Data Flow tasks
- Strong knowledge of metadata-driven data pipelines, metadata management, and dynamic logic
- In-depth knowledge of data storage solutions, including Azure Data Lake Storage (ADLS) and Azure Serverless SQL Pool.
- Experience with data transformation using Spark and SQL. Solid understanding of design patterns and cloud-stack best practices.
- Experience with code management and version control using Git or similar tools.
- Strong problem-solving and debugging skills in ETL workflows and data pipelines.
- Strong understanding of Azure Databricks and Azure Synapse internals, features, and capabilities.
- Knowledge of Azure DevOps and continuous integration and deployment (CI/CD) processes.
- Knowledge of data quality and data profiling techniques, with experience in data validation and data cleansing
Good To Have:
- PowerShell Knowledge
- Power BI Knowledge
Responsibilities of / Expectations from the Role:
- Conducting technical sessions, design reviews, code reviews, and demos of pipelines and their functionality
- Developing technical specifications for data pipelines and workflows and getting sign-off from Architect and Stryker leads.
- Developing, deploying, and maintaining workflows and data pipelines using Azure Databricks and Azure Synapse.
- Developing pipelines/notebooks for Delta Live Tables (DLT), Databricks Auto Loader, notebook scripting, and Azure Synapse Copy activity and Data Flow tasks.
- Writing efficient, high-performing ETL code using PySpark and SQL.
Thanks & Regards,
Ria Aarthi A.
