IN_Senior Associate – Data Engineer – India IT – IFS – PAN India
PWC
4 - 8 years
Gurugram
Posted: 31/05/2025
Job Description
Line of Service
Internal Firm Services
Industry/Sector
Not Applicable
Specialism
IFS - Information Technology (IT)
Management Level
Senior Associate
Job Description & Summary
At PwC, our people in integration and platform architecture focus on designing and implementing seamless integration solutions and robust platform architectures for clients. They enable efficient data flow and optimise technology infrastructure for enhanced business performance.

In enterprise architecture at PwC, you will focus on designing and implementing architectural solutions that align with the organisation's overall strategy and goals. Your work will involve understanding business products, business strategies and customer usage of products. You will be responsible for defining architectural principles, analysing business and technology landscapes, and developing frameworks to guide technology decisions and investments.

Working in this area, you will have familiarity with business strategy and processes, and experience in business solutions that enable an organisation's technology infrastructure. You will help to confirm that technology infrastructure is optimised, scalable, and aligned with business needs, enabling efficient data flow, interoperability, and agility. Through your work, you will communicate a deep understanding of the business and a broad knowledge of architecture and applications.
Job Description & Summary:
A strong team player who will be part of the core technical architect team of PwC India, responsible for application/solution design, development, management, understanding of technical architecture constructs, and operational support.
Hands-on technical experience of 4-8 years, with a good understanding of the SDLC, integration architecture, Azure Cloud and data engineering, and Microsoft technologies, along with strong verbal and business communication skills. The candidate will be responsible for designing, building, and maintaining scalable data pipelines and systems to support our data-driven initiatives and ensure the availability and quality of our data, working closely with analysts and other stakeholders to implement robust data solutions.
Role & Responsibilities:
- Collaborate with cross-functional teams to understand data requirements and translate them into technical specifications.
- Write clean, scalable code and low-code/no-code solutions on cross-cutting technology platforms.
- Design and develop scalable, reliable, and performant data pipelines to process and manage large datasets.
- Create and manage data mapping and transformation routines to ensure data accuracy and consistency.
- Implement data ingestion pipelines from multiple data sources using Azure Data Factory.
- Implement and maintain ETL processes to extract, transform, and load data from various sources into our data warehouse.
- Optimize and maintain existing data systems and infrastructure to improve efficiency and scalability.
- Ensure data quality and integrity by implementing robust data validation and monitoring processes.
- Develop and maintain documentation related to data architecture, processes, and workflows.
- Stay updated with the latest industry trends and best practices in data engineering and apply them to enhance our data systems.
- Assist in the integration of new data sources and tools into the existing data ecosystem.
- Provide support and troubleshooting for data-related issues as needed.
- Demonstrate an understanding of integration architecture.
- Handle operational support responsibilities.
- Attend daily scrum calls and update the ADO user stories.
- Familiarity with architecture patterns, APIs (REST, SOAP, RPC), XML and JSON formats
- Working experience on Azure Cloud data storage and processing.
- Proficiency in Azure Data Factory, Azure Storage, Azure SQL, Azure Functions, Azure Logic Apps, Azure App Services, Azure Networking, and Azure Key Vault integrations.
- Experience in ETL operations between on-premises/cloud sources and Azure SQL, Azure Blob Storage, Azure Data Lake Storage Gen2, or Azure Synapse Analytics using Azure Data Factory V2.
- Good hands-on experience in writing Azure Data Factory pipelines to fetch data from different sources and load it into Azure Data Lake and Azure SQL Database, facilitating seamless data movement and transformation to support advanced analytics. Responsible for creating linked services, datasets, and pipelines in Azure Data Factory, creating data flows to transform data in Azure, and scheduling triggers and notification jobs.
- Proficiency in programming languages such as Python, .NET, C#, TypeScript, JavaScript, and Java.
- Strong knowledge of SQL and experience with relational databases.
- Willingness to learn and contribute to cutting-edge data projects that drive business insights and innovation.
- Excellent problem-solving skills and attention to detail.
- Strong communication and collaboration skills, with the ability to work effectively in a team environment.
- Understanding of CI/CD and DevOps.
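As an illustration of the ETL responsibilities listed above, here is a minimal, generic sketch (a hypothetical example using only the Python standard library, not PwC's actual stack) that extracts rows from a CSV source, applies transformations with a basic data-quality gate, and loads the result into a SQL table. In practice, Azure Data Factory would orchestrate the equivalent steps via linked services, datasets, and pipelines.

```python
# Minimal ETL sketch: extract from CSV, transform, load into SQLite.
# The table name, schema, and sample data are hypothetical illustrations.
import csv
import io
import sqlite3

def run_etl(csv_text: str, conn: sqlite3.Connection) -> int:
    """Extract rows from CSV text, normalise them, and load them into a SQL table."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS orders (id INTEGER PRIMARY KEY, customer TEXT, amount REAL)"
    )
    rows = []
    for rec in csv.DictReader(io.StringIO(csv_text)):
        # Transform: trim whitespace and cast types; reject malformed rows.
        try:
            rows.append((int(rec["id"]), rec["customer"].strip(), float(rec["amount"])))
        except (KeyError, ValueError):
            continue  # basic data-quality gate: drop rows that fail validation
    conn.executemany("INSERT OR REPLACE INTO orders VALUES (?, ?, ?)", rows)
    conn.commit()
    return len(rows)

source = "id,customer,amount\n1, Acme ,19.99\n2,Globex,5.00\nbad,row,oops\n"
conn = sqlite3.connect(":memory:")
loaded = run_etl(source, conn)
print(loaded)  # 2 valid rows loaded; the malformed row is rejected
```

The same extract/transform/load separation scales up to pipeline tools: the transformation step is where mapping routines and quality checks live, regardless of whether the runtime is a script, Spark, or Data Factory.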
Good to have Qualifications:
• Experience in data engineering or a related role, with a strong understanding of data architecture and data processing frameworks.
• Familiarity with Microsoft Fabric is good to have.
• Write notebooks in an Apache Spark pool using PySpark and Spark SQL to ingest data from the Data Lake to Blob Storage, with data quality checks and transformations.
• Understanding of Agile methodologies.
• Experience with big data technologies such as Hadoop, Pig, and Spark is good to have.
• Creation of Databricks notebooks with PySpark.
• Use PySpark to read CSV, Parquet, and JSON files, apply transformations, and load the results into SQL tables.
• Experience with data visualization tools or platforms.
• Knowledge of machine learning concepts and tools.
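The data-quality checks mentioned above can be sketched in a framework-agnostic way (the field names and rules below are hypothetical examples, not a prescribed implementation); in PySpark the same rules would typically be expressed as DataFrame filters before writing to the target table.

```python
# Sketch of simple data-quality checks applied before loading records downstream.
# The rules and field names are hypothetical illustrations.
from typing import Any

def validate(record: dict[str, Any]) -> list[str]:
    """Return a list of data-quality violations for one record (empty = clean)."""
    errors = []
    if not record.get("id"):
        errors.append("missing id")
    amount = record.get("amount")
    if not isinstance(amount, (int, float)) or amount < 0:
        errors.append("amount must be a non-negative number")
    if not str(record.get("customer", "")).strip():
        errors.append("customer must be non-empty")
    return errors

records = [
    {"id": 1, "customer": "Acme", "amount": 19.99},
    {"id": 2, "customer": "  ", "amount": -5},
]
clean = [r for r in records if not validate(r)]
rejected = [r for r in records if validate(r)]
print(len(clean), len(rejected))  # 1 1
```

Keeping the per-record violations (rather than a bare pass/fail flag) makes it easier to route rejected rows to a quarantine table with a reason attached, which is a common pattern in monitored ingestion pipelines.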
Mandatory skill sets:
Data engineering, Azure Data Factory, ETL, Azure SQL, and cloud computing.
Preferred skill sets:
Essential Skills & Personal Attributes:
● Positive, "can-do" attitude towards colleagues, clients and problems alike.
● Team player.
● Lateral thinker.
● Inquisitive mind and capacity to delve into details.
● Work in an organized manner.
● Adhere to timelines.
Years of experience required:
4-8 years
Education qualification:
Bachelor's degree
Shift Hours
IST
Education (if blank, degree and/or field of study not specified)
Degrees/Field of Study required: Bachelor Degree
Degrees/Field of Study preferred:
Certifications (if blank, certifications not specified)
Required Skills
Azure Data Factory, Data Engineering
Optional Skills
Team Player
Desired Languages (If blank, desired languages not specified)
Travel Requirements
Available for Work Visa Sponsorship?
Government Clearance Required?
Job Posting End Date
About Company
PricewaterhouseCoopers (PwC) is a global professional services firm providing audit, tax, and consulting services. PwC helps organizations manage financial risks, comply with regulations, and improve performance through its expertise in industries like finance, healthcare, and technology.