AWS Data Architect

Tata Consultancy Services

2 - 5 years

Bengaluru

Posted: 09/01/2026

Job Description

Greetings from Tata Consultancy Services Limited!

Thank you for exploring career opportunities with Asia's largest IT company.

Exciting job opportunities for experienced professionals


Role: AWS Data Architect

Experience Range: 8 - 15 years

Location: PAN India


Job Description: AWS Data Architect


Responsibilities:

  • As a member of the Data on Cloud CoE (Centre of Excellence), provide technical pre-sales enablement covering data architecture, data engineering, data modeling, data consumption, data platform migration, and data governance focusing on AWS data platform.
  • Apply expert-level knowledge of the AWS data platform across data engineering, performance, consumption, security, governance, and administration.
  • Collaborate with cross-functional teams in onsite/offshore setups and solve technical problems with various stakeholders, including customer teams.
  • Create technical proposals and respond to large-scale RFPs.
  • Discuss existing solutions, design/optimize solutions, and prepare execution plans for development, deployment, and enabling end users to utilize the data platform.
  • Organize workshops and meetings with account teams, leadership, and senior client stakeholders, including CXO level.
  • Create points of view (POVs) and conduct proofs of concept (PoCs).
  • Liaise with technology partners like AWS, Databricks, etc.

Required Skills:

  • 10 - 15 years of total experience, with at least 3 years of expertise in AWS data platform technologies (Glue, EMR, Redshift, Databricks, etc.).
  • At least one end-to-end AWS data platform implementation covering architecture, design, data engineering, visualization, and governance.
  • Strong experience in data migrations and development of ODS, EDW, Data Lakes, and Data Marts.
  • Hands-on SQL and Data Warehousing lifecycle knowledge.
  • Experience with cloud ETL/ELT tools (dbt, Glue, EMR, Matillion) and exposure to the Big Data ecosystem (Hadoop).
  • Expertise in traditional data warehouses (Oracle, Teradata, Oracle Exadata).
  • Excellent communication skills to liaise with business and IT stakeholders.
  • Expertise in project planning and effort estimation.
  • Understanding of Data Vault, data mesh, and data fabric architecture patterns.
  • Exposure to Agile ways of working.

Good to Have:

  • Coding experience in Python and PySpark.
  • Knowledge of DevOps, CI/CD, GitHub.
  • Experience or understanding of Banking and Financial Services domains.
