Developer
Tata Consultancy Services
2 - 5 years
Bengaluru
Posted: 08/01/2026
Job Description
Role - Spark Scala Developer
Required Technical Skills - Spark, PySpark, Python, Hive, Scala, Unix shell scripting
Desired Experience Range - 5+ years
Location of Requirement - Pan India
Must-Have
Roles and Responsibilities:
Enhance machine learning models using PySpark or Scala
Work with data scientists to build ML models based on business requirements and follow the ML lifecycle to deploy them all the way to the production environment
Participate in feature engineering, model training, scoring, and retraining
Architect data pipelines and automate data ingestion and model jobs
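The train, score, and retrain cycle named in the responsibilities above can be sketched at its simplest in plain Python. This is an illustrative toy only: the threshold "model", the sample batches, and the drift check are all invented for the example, and a real pipeline in this role would use PySpark/SparkML rather than stdlib Python.

```python
def train(rows):
    # rows: list of (feature, label) pairs, label in {0, 1}.
    # The "model" is just a decision threshold at the midpoint of the
    # two class means -- a stand-in for a real SparkML estimator.
    pos = [x for x, y in rows if y == 1]
    neg = [x for x, y in rows if y == 0]
    threshold = (sum(pos) / len(pos) + sum(neg) / len(neg)) / 2
    return {"threshold": threshold}

def score(model, features):
    # Scoring step: apply the trained threshold to new feature values.
    return [1 if x >= model["threshold"] else 0 for x in features]

def accuracy(model, rows):
    preds = score(model, [x for x, _ in rows])
    return sum(p == y for p, (_, y) in zip(preds, rows)) / len(rows)

# Initial training batch, then a new batch whose distribution has shifted.
batch1 = [(0.2, 0), (0.3, 0), (0.8, 1), (0.9, 1)]
model = train(batch1)

batch2 = [(0.6, 0), (0.7, 0), (1.6, 1), (1.7, 1)]
if accuracy(model, batch2) < 0.9:        # crude drift detection
    model = train(batch1 + batch2)       # retrain on the combined data
```

The same shape carries over to SparkML: `train` becomes a `Pipeline.fit`, `score` a `model.transform` over a DataFrame, and the retrain trigger a scheduled monitoring job.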
Skills and competencies:
Required:
- Strong analytical skills in conducting sophisticated statistical analysis using bureau/vendor data, customer performance data, and macroeconomic data to solve business problems
- Working experience in PySpark and Scala, developing code to validate and implement models in credit risk/banking
- Experience with distributed systems such as Hadoop/MapReduce, Spark, streaming data processing, cloud architecture.
- Familiarity with machine learning frameworks and libraries (e.g., scikit-learn, SparkML, TensorFlow, PyTorch)
- Experience in systems integration, web services, batch processing
- Experience in migrating code to PySpark/Scala is a big plus
- The ability to act as a liaison, conveying the information needs of the business to IT and data constraints to the business; this applies equally to business strategy and IT strategy, business processes, and workflows
- Flexibility in approach and thought process
- Willingness to learn and keep up with periodic changes in regulatory requirements as per the FED
