Analyst Level 3
An Azure/PySpark Developer with a good understanding of the entire ETL/Azure lifecycle and a background in data projects.
Work you'll do
- Design and develop ETL systems and data integration solutions, ensuring proper coding standards, procedures, and techniques are followed
- Act as the subject matter expert on development techniques
- Design and develop the migration of data/ETL to Azure cloud services such as Logic Apps, Synapse, etc.
- Design and develop PySpark code and serve as the expert on it
- Responsible for code changes in .NET
- Create technical code documentation and participate as a contributor in solution design and estimation
- Perform unit testing and analyze and report the results to the relevant teams
- Troubleshoot errors and problems reported by QA, product owners, and end users while documenting how to resolve issues
- Identify and troubleshoot application code-related issues (provide third-level production support and issue resolution)
- Contribute to direction, product and project planning, and SDLC practices for ETL/data-oriented projects. Maintain in-depth knowledge of, and enforce, standards, procedures, and methodologies. Support the maturing of systems engineering, quality assurance, and analysis policies, procedures, and standards for data integration projects in line with organizational standards
The team
Solutions Delivery-Canada is an integral part of the Information Technology Services group. The principal focus of this organization is the development and maintenance of technology solutions that e-enable the delivery of Function and Marketplace Services and Management Information Systems.
Solutions Delivery Canada develops and maintains solutions built on varied technologies such as Siebel, PeopleSoft, Microsoft technologies, and Lotus Notes. It has various groups that provide best-of-breed solutions to clients by following a streamlined system development methodology. Solutions Delivery Canada comprises groups such as Usability, Application Architecture, Development, Quality Assurance, and Performance.
Qualifications
Required:
University degree in Computer Science or equivalent work experience
1.5 to 2.5 years of experience with PySpark, Azure Data Factory, and related Azure services such as Logic Apps, Synapse, etc.
Familiarity with ETL tools
Expertise in developing solutions and delivering high-quality projects using PySpark and Azure Data Factory
Expertise in data integration tools such as DataStage is good to have
Knowledge of basic Python programming - Preferable
Knowledge of .NET - Preferable
Solid grounding in relational databases (Microsoft SQL Server)
Expertise in web service design and development (including SOAP, XML, .NET) - Preferable
Expertise in message queue design and development - Preferable
Knowledge of cloud-based ETL services
Knowledge of Microsoft System Center Orchestrator
Excellent organizational and communication skills
Strong interpersonal skills and the ability to work effectively with others in teams
Strong problem-solving skills in identifying and resolving issues
Adeptness at learning new technologies and tools
The ability to effectively manage multiple assignments and responsibilities in a fast-paced environment
Strong commitment to professional client service excellence
Should be a strong individual contributor
Work Location:
Hyderabad