Data Engineer (AWS, S3, Data Lake, AWS Glue, Lambda, EMR, Python, PySpark)
Tata Consultancy Services
2 - 5 years
Pune
Posted: 02/01/2026
Job Description
Role: AWS Data Engineer
Required Technical Skill Set: AWS
Desired Experience Range: 4-6 yrs.
Location of Requirement: Pune
Desired Competencies (Technical/Behavioral Competency)
Must-Have
- Expertise in creating data warehouses on AWS using the following tools: EC2, S3, EMR, Athena, SageMaker, Aurora, Snowflake, Kafka, Kinesis, Glue, Lambda, DMS, AppFlow, and Power BI
- Advanced development experience with cloud technologies such as AWS Aurora/RDS/S3, Lambda, JSON, and Python
- Proficiency in scripting, querying, and analytics tools: Linux, Python, SQL
- Analyze, re-architect, and re-platform on-premise/cloud data sources onto the AWS platform using AWS Glue
- Design, build, and automate AWS data pipelines from data ingestion through to the consumption layer using Java/Python
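To illustrate the kind of ingestion-to-consumption pipeline work described above, here is a minimal Python sketch of a Lambda-style handler that normalizes incoming records. The field names (`id`, `amount`, `source`) and the event shape are hypothetical examples, not taken from this posting; in a real pipeline the output would be written to S3 or another consumption layer.

```python
import json


def transform_record(raw: dict) -> dict:
    """Normalize a raw ingestion record into the consumption-layer shape.
    Field names here are illustrative only."""
    return {
        "id": str(raw["id"]),
        "amount": round(float(raw.get("amount", 0)), 2),
        "source": raw.get("source", "unknown").lower(),
    }


def lambda_handler(event, context):
    """AWS Lambda-style entry point: transform each incoming record.
    A real handler would persist the results (e.g. to S3); this sketch
    simply returns them."""
    records = [
        transform_record(json.loads(r["body"])) for r in event["Records"]
    ]
    return {"statusCode": 200, "records": records}
```

The same transform logic could be reused unchanged inside a Glue PySpark job by mapping it over a DataFrame's rows.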
Good-to-Have
- Basic knowledge of Red Hat Linux and Windows operating systems
- Proficiency with the AWS Management Console, the AWS CLI, and APIs
- Knowledge of writing infrastructure as code (IaC) using CloudFormation or Terraform
- AWS APIs for integration
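As a sketch of the IaC skill mentioned above, the following minimal CloudFormation fragment declares a versioned S3 bucket for a data-lake landing zone. The bucket name and description are hypothetical, for illustration only.

```yaml
# Hypothetical CloudFormation sketch: an S3 landing-zone bucket.
# The bucket name below is illustrative, not from the posting.
AWSTemplateFormatVersion: "2010-09-09"
Description: Minimal IaC example for a data-lake landing bucket
Resources:
  RawDataBucket:
    Type: AWS::S3::Bucket
    Properties:
      BucketName: example-raw-landing-zone
      VersioningConfiguration:
        Status: Enabled
```

An equivalent Terraform definition would use an `aws_s3_bucket` resource; either way, the stack can be reviewed and versioned alongside application code.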
Role Descriptions / Expectations from the Role
1. Ability to understand and articulate the different functions within AWS and design an appropriate solution, with HLD and LLD around it.
2. Ability to identify and gather requirements to define a solution to be built and operated on AWS, and to perform high-level and low-level design.
3. The AWS Data Engineer will be responsible for building the services as per the design.
