Skill: AWS Cloud Data Architect
Role: Solution Architect
Roles and responsibilities:
Solutioning lead for NA Data Engineering, with AWS and Snowflake as the primary stack
1) Architecture and solutioning on AWS and Snowflake platforms: data warehouse, lakehouse, data fabric, and data mesh
2) Sizing, estimation, and implementation planning for solutions
3) Solution prototyping, advisory, and orchestration of in-person/remote workshops
4) Work with hyperscalers and platform vendors to understand and test platform roadmaps and develop joint solutions
5) Own end-to-end solutions, working with various teams across Cognizant: Sales, Delivery, and Global Solutioning
6) Own key accounts as an architecture advisor and establish deep client relationships
7) Contribute to practice by developing reusable assets and solutions
Required skills:
1. Excellent verbal and written communication skills, with the ability to present complex cloud data architecture concepts to technical and executive audiences (leveraging PPTs, demos, and whiteboards)
2. Deep expertise in designing AWS and Snowflake solutions
3. Strong expertise in handling large and complex RFPs/RFIs and collaborating with multiple service lines & platform vendors in a fast-paced environment
4. Strong relationship building skills and ability to provide technical advisory and guidance
5. Minimum 15 years’ experience as a Solution Architect designing and developing data architecture patterns
6. Minimum 5 years’ hands-on experience building AWS & Snowflake based solutions
7. Minimum 3 years’ experience as a Solution Architect on a pre-sales team, driving the sales process from a technical solution standpoint
8. Bachelor’s or Master’s degree in computer science, engineering, information systems or a related field
1. Technology architecture & implementation experience, with deep hands-on experience implementing data solutions
• 15–20 years of experience in data engineering, including 5+ years of cloud data engineering experience
• Technology pre-sales experience: architecture, effort sizing, estimation, and solution defense
• Data architecture patterns: data warehouse, data lake, data mesh, lakehouse, and data as a product
• Develop or co-develop proofs of concept and prototypes with customer teams
• Excellent understanding of distributed computing fundamentals (see the sketch below)
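As a minimal illustration of these distributed computing fundamentals, the hedged PySpark sketch below shows rows being hash-partitioned and aggregated in parallel; the column names and partition count are illustrative assumptions, not part of the role description.

    # Minimal PySpark sketch: rows are split across partitions and
    # aggregated in parallel, then merged after a shuffle. Column names
    # and the partition count are illustrative assumptions.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("distributed-fundamentals").getOrCreate()

    # Toy data; in practice this would be read from S3 or Snowflake.
    df = spark.createDataFrame(
        [("retail", 120.0), ("retail", 80.0), ("health", 200.0)],
        ["domain", "revenue"],
    ).repartition(4, "domain")  # co-locate each key's rows in one partition

    # Each partition computes partial sums; a shuffle merges them per key.
    df.groupBy("domain").agg(F.sum("revenue").alias("total_revenue")).show()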
2. Experience working with one or more major cloud vendors
• Deep expertise in end-to-end pipeline (or ETL) development following best practices, including orchestration and optimization of data pipelines
• Strong understanding of the full CI/CD lifecycle
• Experience migrating large legacy platforms (e.g., Hadoop, Teradata) to cloud data platforms
• Expert-level proficiency in engineering and optimizing data ingestion patterns: batch, micro-batch, streaming, and API
• Understand the imperatives of change data capture (CDC), with a point of view on tools and best practices (see the Snowpark sketch after this list)
• Architect and solution data governance capability pillars supporting the modern data ecosystem
• Data services and various consumption archetypes, including semantic layers, BI tools, and AI/ML
• Thought leadership in designing self-service data engineering platforms & solutions
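To make the CDC item concrete, here is a minimal sketch of a CDC-style upsert using the Snowflake Snowpark Python API's Table.merge; the connection parameters, table names, and columns are hypothetical placeholders, not a prescribed design.

    # Snowpark CDC sketch: merge a batch of change records into a target
    # table (upsert). Connection parameters, table and column names are
    # hypothetical placeholders.
    from snowflake.snowpark import Session
    from snowflake.snowpark.functions import when_matched, when_not_matched

    session = Session.builder.configs({
        "account": "<account>", "user": "<user>", "password": "<password>",
        "warehouse": "<wh>", "database": "<db>", "schema": "<schema>",
    }).create()

    target = session.table("CUSTOMERS")           # assumed target table
    changes = session.table("CUSTOMERS_CHANGES")  # assumed CDC staging table

    # Upsert: update rows whose keys match, insert the rest.
    target.merge(
        changes,
        target["CUSTOMER_ID"] == changes["CUSTOMER_ID"],
        [
            when_matched().update({"EMAIL": changes["EMAIL"]}),
            when_not_matched().insert(
                {"CUSTOMER_ID": changes["CUSTOMER_ID"], "EMAIL": changes["EMAIL"]}
            ),
        ],
    )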
3. Core Platform – AWS & Snowflake
• Ability to engage and offer differing points of view on customers' architecture using the AWS and Snowflake platforms
• Strong understanding of the Snowflake platform, including evolving services like Snowpark
• Implementation expertise using AWS services (EMR, Redshift, Glue, Kinesis, Lambda, AWS Lake Formation) and Snowflake
• Security design and implementation on AWS & Snowflake
• Pipeline development in multi-hop pipeline architectures (see the PySpark sketch after this list)
• Architecture and implementation experience with Spark and Snowflake performance tuning, including topics such as cluster sizing
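As a hedged sketch of the multi-hop and performance-tuning items above, the PySpark flow below moves data through bronze, silver, and gold layers with one illustrative tuning knob; the paths, schemas, and settings are assumptions that would be sized per workload.

    # Multi-hop (bronze -> silver -> gold) PySpark sketch. S3 paths, columns,
    # and the shuffle setting are illustrative; real sizing depends on data
    # volume and cluster shape.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = (
        SparkSession.builder.appName("multi-hop")
        # One example tuning knob; tuned alongside executor count/memory.
        .config("spark.sql.shuffle.partitions", "200")
        .getOrCreate()
    )

    bronze = spark.read.json("s3://lake/bronze/orders/")  # raw, as landed

    silver = (                                            # cleaned, conformed
        bronze.dropDuplicates(["order_id"])
              .filter(F.col("amount").isNotNull())
              .withColumn("order_date", F.to_date("order_ts"))
    )
    silver.write.mode("overwrite").parquet("s3://lake/silver/orders/")

    gold = silver.groupBy("order_date").agg(F.sum("amount").alias("daily_revenue"))
    gold.write.mode("overwrite").parquet("s3://lake/gold/daily_revenue/")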
4. Gen-AI architecture patterns
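A common Gen-AI pattern in this space is retrieval-augmented generation (RAG). The self-contained sketch below shows only the retrieve-then-prompt shape: the toy embed() function stands in for a real embedding model, and the final prompt would be sent to an LLM, so everything here is an illustrative assumption.

    # Toy retrieval-augmented generation (RAG) skeleton. embed() is a
    # stand-in for a real embedding model; the prompt would go to an LLM.
    import hashlib
    import math

    def embed(text, dim=16):
        # Deterministic toy embedding (NOT a real model): hashed 3-grams.
        vec = [0.0] * dim
        for i in range(len(text) - 2):
            h = int(hashlib.md5(text[i:i + 3].encode()).hexdigest(), 16)
            vec[h % dim] += 1.0
        norm = math.sqrt(sum(v * v for v in vec)) or 1.0
        return [v / norm for v in vec]

    docs = [
        "Snowflake warehouses are sized in t-shirt sizes.",
        "AWS Glue runs serverless Spark ETL jobs.",
    ]
    index = [(d, embed(d)) for d in docs]

    question = "How do Glue ETL jobs run?"
    q = embed(question)
    best = max(index, key=lambda pair: sum(x * y for x, y in zip(pair[1], q)))[0]

    prompt = f"Answer using this context:\n{best}\n\nQuestion: {question}"
    print(prompt)  # in a real system, this prompt is sent to an LLM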
5. Data Quality and Data Governance
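A minimal sketch of rule-based data quality enforcement, assuming a PySpark pipeline rather than any specific governance tool; the rules, paths, and 1% threshold are illustrative assumptions.

    # Rule-based data quality sketch: validate rows, quarantine failures,
    # and fail fast past a threshold. Columns, paths, and the threshold
    # are assumed for illustration.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("dq-checks").getOrCreate()
    df = spark.read.parquet("s3://lake/silver/orders/")  # assumed input

    rule = F.col("order_id").isNotNull() & (F.col("amount") >= 0)
    invalid = df.filter(~rule)

    # Quarantine bad rows, then stop the pipeline if too many failed.
    invalid.write.mode("append").parquet("s3://lake/quarantine/orders/")
    failure_rate = invalid.count() / max(df.count(), 1)
    assert failure_rate <= 0.01, f"DQ failure rate too high: {failure_rate:.2%}"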
6. Cloud Cost Monitoring and Optimization
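For cost monitoring, a minimal sketch that pulls month-to-date spend by service from the AWS Cost Explorer API via boto3; the date range and grouping are illustrative, and the call assumes Cost Explorer is enabled and ce:GetCostAndUsage is permitted.

    # Month-to-date AWS spend grouped by service, via Cost Explorer.
    # Assumes Cost Explorer is enabled and credentials are configured.
    from datetime import date
    import boto3

    ce = boto3.client("ce")
    today = date.today()

    resp = ce.get_cost_and_usage(
        TimePeriod={"Start": today.replace(day=1).isoformat(),
                    "End": today.isoformat()},
        Granularity="MONTHLY",
        Metrics=["UnblendedCost"],
        GroupBy=[{"Type": "DIMENSION", "Key": "SERVICE"}],
    )

    for group in resp["ResultsByTime"][0]["Groups"]:
        amount = float(group["Metrics"]["UnblendedCost"]["Amount"])
        print(f"{group['Keys'][0]}: ${amount:,.2f}")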