Azure Data Streaming Engineer
Recro
2 - 5 years
Bengaluru
Posted: 02/05/2026
Job Description
Role: Azure Data Streaming Engineer
Location: Bangalore, India
Experience: 9+ years
Skills: ADF (Must have), Databricks, PySpark, SQL, Python, Azure Functions, Real-Time Streaming (Must have), Azure IoT Hub (Must have), Data Pipeline, Delta Live Tables (Good to have), Apache Kafka (Must have), Azure Event Hubs (Must have).
About the Role:
We are looking for an Azure Data Streaming Engineer who lives and breathes high-velocity data. If you are a Data Engineer who thinks like a Software Engineer, prioritizing low latency, event-driven architecture, and clean code over simple batch ETL, this role is for you.
You will be joining our team to architect, build, and maintain mission-critical streaming pipelines. You won't just be moving data from A to B; you will be building the systems that power live tracking, real-time triggers, and high-frequency event processing.
What You Will Do:
- Architect Real-Time Pipelines: Design and implement robust, scalable, and low-latency data streaming architectures.
- Build & Optimize: Develop high-performance data processing pipelines using Python, PySpark, and Databricks.
- Event-Driven Logic: Build complex event-driven workflows (e.g., real-time notifications, IoT telemetry processing, live logistics tracking).
- Bridge Backend & Data: Collaborate with software engineers to integrate data pipelines directly into application backends.
- Monitor & Scale: Ensure system reliability, uptime, and performance in a high-concurrency Azure environment.
Must-Have Qualifications:
- Strong Python Backend Coding: This is not a SQL-only role. You must be comfortable writing production-grade, maintainable Python code.
- Real-Time/Streaming Expertise: Hands-on experience with streaming datasets and real-time architectures (not limited to batch).
- Azure Mastery: Deep knowledge of the Azure Data ecosystem, specifically Azure Event Hubs and Azure IoT Hub.
- Processing Frameworks: Proven experience with PySpark and Databricks.
- Message Brokers: Professional experience with Apache Kafka or similar distributed streaming platforms.
- Data Fundamentals: Strong SQL skills for complex data transformation and analytical queries.