
Azure PySpark Solution Architect

UST

2 - 5 years

Hapur, Thiruvananthapuram

Posted: 08/01/2026


Job Description

Azure Databricks / PySpark Solution Architect - Banking & Financial Services

Pune (Balewadi) / Trivandrum


We're looking for an Azure Databricks & PySpark Solution Architect to design, build, and scale enterprise-grade data platforms for leading Banking & Financial Services clients.

If you excel in Lakehouse architecture, PySpark engineering, data governance, and cloud-first analytics, this role is for you!


What You'll Do

Architecture & Data Platform Design

  • Build Databricks Lakehouse architecture using Delta Lake & Unity Catalog
  • Design scalable batch + streaming data pipelines using PySpark, ADF/Synapse, Event Hubs/Kafka
  • Define HLD/LLD, orchestration, CI/CD workflows, and cluster optimization
  • Architect ingestion for core banking datasets: payments, transactions, fraud, regulatory


Data Engineering, Performance & Analytics Enablement

  • Develop high-performance ETL/ELT pipelines using PySpark, SQL, Auto Loader, Structured Streaming
  • Optimize Spark workloads with AQE, caching, partitioning, Z-Ordering
  • Implement Delta Lake best practices: CDC, schema evolution, versioning
  • Enable MLOps pipelines using MLflow, Feature Store, and Databricks Workflows
  • Ensure observability, reliability, and performance at scale


Governance, Security & Technical Leadership

  • Implement governance with Unity Catalog / Purview (lineage, RBAC, PII/PCI classification)
  • Ensure compliance: PCI-DSS, AML/KYC, SOX, GDPR
  • Architect secure environments: Key Vault, VNET, Private Endpoints
  • Lead engineering teams, code reviews, and DevOps automation (Azure DevOps; IaC: Terraform/Bicep)
  • Engage with architects, SMEs, and business teams to define strategy


Mandatory Skills

  • Azure Databricks, PySpark, Delta Lake, Auto Loader, Structured Streaming
  • Azure Data Services: ADF, Synapse, ADLS Gen2, Event Hubs/Kafka
  • Lakehouse architecture & distributed computing
  • SQL/NoSQL + strong data modeling expertise
  • Spark performance tuning for large-scale workloads
  • Governance frameworks: Unity Catalog / Purview
  • CI/CD, DevOps, IaC (Terraform/ARM/Bicep)
  • Strong documentation & communication skills


Good to Have

  • Banking/FS domain expertise (fraud, payments, risk, regulatory, customer 360)
  • Experience with Databricks across multiple clouds (Azure/AWS/GCP)
  • Knowledge of event-driven and microservices-based architectures
