Job Title: Data Engineer with PySpark
Location: Kolkata
Experience: 3+ years
Responsibilities:
· Minimum 3 years of experience building and deploying Big Data applications using PySpark
· 2+ years of experience with AWS Cloud data integration using Spark and AWS Glue/EMR
· In-depth understanding of Spark architecture and distributed systems
· Good exposure to Spark job optimization
· Expertise in handling complex, large-scale Big Data environments
· Able to design, develop, test, deploy, maintain, and improve data integration pipelines
Mandatory Skills:
· 4+ years of experience with PySpark
· 2+ years of experience with AWS Glue/EMR
· Strong knowledge of SQL
· Excellent written and spoken communication skills, and strong time management skills
Nice-to-Have:
· Skills in any cloud platform
· Knowledge of any ETL tool