
Data Engineer

Vivriti Capital

2 - 5 years

Bengaluru

Posted: 13/01/2026

Job Description

About Vivriti Group

Vivriti Group is a trailblazer in the mid-market lending space, offering customized debt solutions to mid-sized enterprises. The group operates through two core businesses:

  • Vivriti Capital Limited: A systematically important Non-Banking Financial Company (NBFC ND-SI) regulated by the Reserve Bank of India (RBI). Vivriti Capital has disbursed over USD 3 billion to 300+ enterprise borrowers and holds a CRISIL rating of A+.
  • Vivriti Asset Management: A fixed-income fund manager, managing multiple Alternative Investment Funds (AIFs). With over USD 550 million in commitments from 900+ institutional and private contributors, Vivriti AMC has invested more than USD 600 million across 90+ entities.

Location: Bengaluru (on-site)


The Opportunity

The Data Engineering unit at Vivriti not only serves as the backbone for the whole data team but also works closely with almost all business units in VCL and VAM to design custom, high-value end-to-end workflows. The team is heavily involved with Finance, Risk, Operations, and Credit, helping define the business problem statement rather than only designing the solution. At Vivriti, while we believe in building highly efficient process workflows, we also believe that every problem is everyone's problem, and the Data Engineering unit stands at the centre of this motto.

As a Data Engineer at Vivriti, while you learn to build end-to-end workflows on a state-of-the-art platform like Databricks, you will also gain a deep understanding of how a top-tier mid-corporate lender operates at its peak. You will learn about the risk and credit policies the company implements across business sectors such as co-lending, institutional finance, and supply chain finance to keep NPAs to a minimum. These policies shape the requirements for the Data Engineering unit, which in turn define the tech stack we use.

We're looking for candidates who are passionate about building highly efficient end-to-end data workflows with immediate business impact. If you're someone who enjoys decoding complex financial problems with data solutions and thrives on working with petabytes of data while enabling real-time insights, we'd like to have a chat.


Key Responsibilities

  • Build and maintain high-scale end-to-end data workflows in Databricks for different business sectors such as Co-lending, Institutional Finance, and Supply Chain Finance.
  • Communicate effectively with stakeholders to thoroughly understand requirements at a business level before designing the solution; maintain an "align and commit" philosophy.
  • Understand the importance of platform engineering, data quality measures, and version control for better scalability.
  • Maintain ongoing communication with stakeholders on projects/workflows in development and call out any changes that affect delivery.
  • Maintain and promote best-in-class coding standards to ensure high quality and minimal risk.
  • Bring a growth mindset, introduce new approaches, and look to improve existing workflows.
  • Identify loopholes and tech debt in the data platform and provide solutions to fix them.
  • Work with junior team members and interns, helping them understand problem statements and solve them effectively.
  • Be a part of Vivriti's culture: maintain a positive, approachable mindset and feel free to approach anyone with any problem statement.

Qualifications for the job

  • 2-3+ years of experience in building large-scale data engineering pipelines using PySpark, SQL, Python, Airflow, or Databricks.
  • 2+ years of experience in effectively communicating with stakeholders to understand their requirements and deliver as expected.
  • 2+ years of experience in data modelling on top of KPIs defined by stakeholders.
  • 2+ years of experience in maintaining data warehouses.
  • 3+ years of experience in writing complex SQL queries.
  • Highly proficient in designing platform-level algorithms for efficiency.
  • Considerable experience in the fintech/lending/banking domain.
  • Experience with cloud-based services such as AWS, GCP, Azure, and Databricks.
