
Azure Databricks GCP Data Engineer-4-9Y-PWC SDC

PwC Acceleration Center India

2 - 5 years

Bengaluru

Posted: 23/12/2025


Job Description

Position: Azure Databricks + GCP

Experience: 4-9Y


Must Have:

  • Proficient with Azure Data Services, Databricks and GCP
  • Deep understanding of both traditional and modern data architecture and processing concepts, including relational databases (e.g., SQL Server, MySQL, PostgreSQL), data warehousing, big data, NoSQL, and business analytics
  • Rich experience working with Azure ADLS, Databricks, Data Flows, HDInsight, and Azure Analysis Services
  • Hands-on experience with Snowflake utilities, SnowSQL, Snowpipe, ETL data pipelines, and big data modelling techniques using Python/Java

  • Experience building stream-processing systems using solutions such as Storm or Spark Streaming
  • Design and implement scalable ETL/ELT pipelines using Databricks and Apache Spark
  • Optimize data workflows and ensure efficient data processing
  • Understanding of big data use-cases and Lambda/Kappa design patterns
  • Knowledge of real-time/stream analytics trends
  • Architecture, design, implementation, and/or support of complex application architectures (e.g., having an architectural sense for connecting data sources, data visualization, and structured and unstructured data)
  • Demonstrable hands-on experience implementing big data solutions using the Microsoft Data Platform and Azure Data Services: Azure SQL DB, Azure Synapse Analytics, Azure HDInsight, Azure Data Lake Storage, Azure Data Lake Analytics, Azure Machine Learning, Stream Analytics, Azure Data Factory, Azure Cosmos DB, and Power BI

  • Exposure to open-source technologies such as Apache Spark, Hadoop, NoSQL, Kafka, and Solr/Elasticsearch
  • Drive adoption and rollout of Power BI dashboards, standardizing reporting and enabling self-service BI capabilities for finance stakeholders.
  • Well versed in quality processes and their implementation
  • Good knowledge of application DevOps tools (Git, CI/CD frameworks); experience with Jenkins or GitLab, and rich experience with source code management services such as CodePipeline, CodeBuild, and CodeCommit
  • Guide the assessment and implementation of finance data marketplaces, ensuring seamless integration across business units
  • Good communication and presentation skills
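The Lambda design pattern named above can be sketched in miniature: a batch layer recomputes views over the full history, a speed layer covers only recent events, and a serving layer merges the two at query time. This is a minimal pure-Python illustration of that split; the event shape (`(user, clicks)` pairs) and all function names are hypothetical, not taken from the job description, and a real implementation would use Spark or Databricks rather than in-memory counters.

```python
from collections import Counter

def batch_view(events):
    """Batch layer: recompute an aggregate over the full historical dataset."""
    view = Counter()
    for user, clicks in events:
        view[user] += clicks
    return view

def speed_view(recent_events):
    """Speed layer: incrementally aggregate only events not yet absorbed by batch."""
    view = Counter()
    for user, clicks in recent_events:
        view[user] += clicks
    return view

def serving_layer(batch, speed):
    """Serving layer: merge batch and real-time views at query time."""
    merged = Counter(batch)
    merged.update(speed)
    return merged

# Hypothetical data: historical events already in the batch view,
# plus recent events seen only by the speed layer.
historical = [("alice", 3), ("bob", 1), ("alice", 2)]
recent = [("bob", 4), ("carol", 1)]

totals = serving_layer(batch_view(historical), speed_view(recent))
print(totals["alice"], totals["bob"], totals["carol"])  # 5 5 1
```

A Kappa architecture would drop the batch layer entirely and recompute everything by replaying the stream, which is the trade-off the two patterns differ on.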

Desired Knowledge / Skills:

  • Experience in Big Data ML toolkits, such as Mahout, SparkML, or H2O
  • Knowledge in Python
  • Certification on AWS Architecture desirable
  • Worked in Offshore / Onsite Engagements
  • Experience with AWS services such as Step Functions and Lambda
  • Good Project Management skills with consulting experience in Complex Program Delivery
  • Good to have: knowledge of cloud platforms (AWS, GCP, Informatica Cloud, Oracle Cloud) and cloud DWs
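The stream-processing experience asked for above usually comes down to windowed aggregation over an unbounded event stream. The sketch below shows a tumbling (fixed, non-overlapping) window count in plain Python as a stand-in for what Spark Structured Streaming's `groupBy(window(...))` does at scale; the event shape (`(timestamp_seconds, key)` tuples) is an assumption for illustration.

```python
from collections import defaultdict

def tumbling_window_counts(events, window_size):
    """Assign each event to a fixed, non-overlapping time window and count per key."""
    counts = defaultdict(int)
    for ts, key in events:
        # Floor the timestamp to the start of its window.
        window_start = (ts // window_size) * window_size
        counts[(window_start, key)] += 1
    return dict(counts)

# Hypothetical events: (timestamp_seconds, key).
events = [(1, "a"), (4, "a"), (5, "b"), (11, "a")]
print(tumbling_window_counts(events, 10))
# {(0, 'a'): 2, (0, 'b'): 1, (10, 'a'): 1}
```

A real streaming engine adds what this sketch omits: incremental state, late-data handling via watermarks, and fault tolerance.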


Interested candidates, kindly share your profile on
