Scala Developer
Tata Consultancy Services
2 - 5 years
Mumbai
Posted: 05/01/2026
Job Description
Location: Powai
Must have:
- Minimum 5 years of experience in Spark and Scala development
- Experience in designing and developing Big Data solutions using Hadoop ecosystem technologies such as HDFS, Spark, Hive, the Parquet file format, YARN, MapReduce, and Sqoop
- Good experience in writing and optimizing Spark jobs and Spark SQL; should have worked on both batch and streaming data processing (see the illustrative sketch after this list)
- Experience in writing and optimizing complex Hive and SQL queries to process large data volumes; good with UDFs, tables, joins, views, etc.
- Experience in debugging Spark code
- Working knowledge of basic UNIX commands and shell scripting
- Experience with Autosys and Gradle
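
For illustration only, below is a minimal sketch of the kind of Spark Scala batch job this role describes: reading Parquet data from HDFS, running a Spark SQL query, and writing partitioned output. All names, paths, and columns (TradeEnrichmentJob, hdfs:///data/raw/trades, trade_id, symbol, etc.) are hypothetical placeholders, not details taken from this posting.

```scala
// Hypothetical sketch of a Spark Scala batch job (names and paths are placeholders).
import org.apache.spark.sql.SparkSession

object TradeEnrichmentJob {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("TradeEnrichmentJob")
      .enableHiveSupport()            // assumes a Hive metastore is configured
      .getOrCreate()

    // Read Parquet input from HDFS (hypothetical path).
    val trades = spark.read.parquet("hdfs:///data/raw/trades")

    // Register a temp view and use Spark SQL for filtering and a derived column.
    trades.createOrReplaceTempView("trades")
    val enriched = spark.sql(
      """
        |SELECT t.trade_id, t.symbol, t.quantity * t.price AS notional
        |FROM trades t
        |WHERE t.trade_date = current_date()
      """.stripMargin)

    // Write the result back to HDFS, partitioned for downstream Hive queries.
    enriched.write
      .mode("overwrite")
      .partitionBy("symbol")
      .parquet("hdfs:///data/curated/trades_enriched")

    spark.stop()
  }
}
```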
Good to have:
- Good analytical and debugging skills
- Ability to coordinate with SMEs and stakeholders, manage timelines and escalations, and provide on-time status updates
- Write clear and precise documentation and specifications
- Work in an agile environment
- Document all developed mappings
