Role Summary: Solutioning Lead for Data Engineering, with AWS and Snowflake as the primary stack
Role Responsibilities:
- Architecture and Solutioning on AWS and Snowflake platforms – data warehouse, lakehouse, data fabric and data mesh
- Sizing, Estimation and Implementation plan for solutioning
- Solution Prototyping, Advisory and orchestration of in-person/remote workshops
- Work with hyperscalers and platform vendors to understand and test platform roadmaps and develop joint solutions
- Own end-to-end solutions working across various teams in Cognizant - Sales, Delivery and Global solutioning
- Own key accounts as architecture advisor and establish deep client relationships
- Contribute to practice by developing reusable assets and solutions
Job Requirements:
- Bachelor’s or Master’s degree in computer science, engineering, information systems or a related field
- Minimum 15 years’ experience as Solution Architect designing and developing data architecture patterns
- Minimum 5 years' hands-on experience in building AWS & Snowflake based solutions
- Minimum 3 years' experience as Solution Architect in a pre-sales team driving the sales process from a technical solution standpoint
- Excellent verbal and written communication skills with the ability to present complex Cloud Data Architecture solution concepts to technical and executive audiences (leveraging PPTs, demos and whiteboards)
- Deep expertise in designing AWS and Snowflake solutions
- Strong expertise in handling large and complex RFPs/RFIs and collaborating with multiple service lines & platform vendors in a fast-paced environment
- Strong relationship building skills and ability to provide technical advisory and guidance
Technology Architecture & Implementation – deep implementation experience with data solutions
- 15–20 years of experience in Data Engineering, including 5+ years of cloud data engineering experience
- Technology pre-sales experience – architecture, effort sizing, estimation and solution defense
- Data architecture patterns – Data Warehouse, Data Lake, Data Mesh, Lakehouse, Data as a Product
- Develop or Co-develop proofs of concept and prototypes with customer teams
- Excellent understanding of distributed computing fundamentals
- Experience working with one or more major cloud vendors
- Deep expertise in end-to-end pipeline (or ETL) development following best practices, including orchestration and optimization of data pipelines
- Strong understanding of the full CI/CD lifecycle
- Experience with large legacy migrations (Hadoop, Teradata and the like) to cloud data platforms
- Expert-level proficiency in engineering and optimizing various data engineering ingestion patterns – Batch, Micro-batch, Streaming and API
- Understanding of change data capture imperatives, with a point of view on tools and best practices
- Architect and solution Data Governance capability pillars supporting a modern data ecosystem
- Experience with data services and various consumption archetypes, including semantic layers, BI tools and AI & ML
- Thought leadership in designing self-service data engineering platforms and solutions
Core Platform – AWS & Snowflake
- Ability to engage and offer differing points of view on a customer's architecture using the AWS and Snowflake platforms
- Strong understanding of the Snowflake platform including evolving services like Snowpark
- Implementation expertise using AWS services – EMR, Redshift, Glue, Kinesis, Lambda, AWS Lake Formation – and Snowflake
- Security design and implementation on AWS & Snowflake
- Pipeline development in a multi-hop pipeline architecture
- Architecture and implementation experience with Spark and Snowflake performance tuning, including topics such as cluster sizing
#LI-CTSAPAC