
Analytics Engineer

Six Sense Mobility

2 - 5 years

Delhi

Posted: 28/02/2026


Job Description

Company Description

Six Sense Mobility is a Connected Mobility Company incubated by IIT Delhi and funded by MEITY (Ministry of Electronics and Information Technology). The company specializes in advanced vehicle telematics with AI-powered solutions, using state-of-the-art plug-and-play devices. All products and technologies are designed, developed, and manufactured in India to transform regular vehicles into smart, connected vehicles. Six Sense Mobility is committed to innovation and cutting-edge technology to improve mobility experiences.

About the Role

We are looking for a highly skilled Analytics Engineer to join our data team and help scale our real-time IoT vehicle telemetry architecture. In this role, you will sit at the intersection of data engineering and analytics, ensuring that massive volumes of streaming sensor data are accurately ingested, modeled, and made ready for high-performance querying.

You will take ownership of data quality, troubleshoot complex pipeline discrepancies, and optimize our reporting layers to ensure our analytics are both lightning-fast and structurally sound.

Key Responsibilities
  • Data Modeling & Architecture: Design, build, and maintain robust data models for high-speed analytical reporting and Machine Learning.
  • Query Optimization: Write, debug, and fine-tune complex SQL queries. Identify and eliminate performance bottlenecks, avoiding architectural traps such as Cartesian fan-outs caused by joining against unmerged data parts.
  • Pipeline Troubleshooting: Act as the first line of defense for data consistency. Investigate and resolve data drops or ingestion mismatches across distributed systems (e.g., Kafka, Elasticsearch, and ClickHouse).
  • Exploratory Data Analysis (EDA): Use Python or R (specifically Pandas, NumPy, and RStudio) to perform deep dives into raw JSON telemetry, validate sensor metrics, and identify dirty data patterns.
  • Cross-System Integration: Work closely with backend teams to ensure seamless data flow from Kafka topics into analytical and search databases, managing JSON extraction and schema enforcement.
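To give a flavour of the EDA work described above, here is a minimal sketch in Python/Pandas. The record shape and field names (`device_id`, `gps`, `speed_kmph`) and the plausibility thresholds are illustrative assumptions, not the company's actual schema:

```python
import json
import pandas as pd

# Hypothetical raw telemetry records, as they might arrive from a Kafka topic.
raw = [
    '{"device_id": "D1", "ts": "2026-02-01T10:00:00Z", "gps": {"lat": 28.61, "lon": 77.21}, "speed_kmph": 42.0}',
    '{"device_id": "D2", "ts": "2026-02-01T10:00:01Z", "gps": {"lat": 28.62, "lon": 77.20}, "speed_kmph": -5.0}',
    '{"device_id": "D1", "ts": "2026-02-01T10:00:02Z", "gps": {"lat": null, "lon": 77.22}, "speed_kmph": 310.0}',
]

# Flatten the nested JSON into a tabular frame (gps -> gps.lat, gps.lon).
df = pd.json_normalize([json.loads(r) for r in raw])

# Flag physically implausible sensor readings (illustrative thresholds).
df["dirty"] = (
    df["speed_kmph"].lt(0)          # negative speed
    | df["speed_kmph"].gt(250)      # above any plausible road speed
    | df["gps.lat"].isna()          # missing GPS fix
)

print(df[["device_id", "speed_kmph", "dirty"]])
```

In practice the same pattern (flatten, then vectorised range checks) scales to large batches of telemetry and feeds directly into data-quality dashboards.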
Required Qualifications & Skills
  • Experience: 3+ years of hands-on experience in Analytics Engineering, Data Engineering, or a heavily data-focused backend role.
  • Advanced SQL: Exceptional SQL skills with a deep understanding of analytical databases. You must know how to pre-aggregate data, manage CTEs, and tune queries for execution speed. (Prior ClickHouse/Databricks experience is a strong advantage.)
  • Python: Strong proficiency in Python for scripting, automation, and EDA.
  • Data Engineering Principles: Solid grasp of streaming data concepts, batch vs. stream processing, eventual consistency, and idempotent data ingestion.
  • Elasticsearch / NoSQL: Hands-on experience querying, aggregating, and debugging document-based databases via REST APIs or Dev Tools.
  • Kafka: Familiarity with consuming and monitoring event-driven data streams.
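The "idempotent data ingestion" principle listed above can be sketched in plain Python: replaying a batch of events (as an at-least-once broker like Kafka may deliver duplicates) must leave the stored state unchanged. All names and record shapes here are hypothetical:

```python
# Toy illustration of idempotent ingestion: applying the same events twice
# must not change the stored state. Record shapes are hypothetical.

def ingest(events, store, seen_ids):
    """Apply events to `store`, skipping any event id already processed."""
    for ev in events:
        if ev["event_id"] in seen_ids:
            continue  # duplicate delivery (at-least-once semantics) -> no-op
        seen_ids.add(ev["event_id"])
        store[ev["device_id"]] = ev["speed_kmph"]  # last-write-wins per device

store, seen = {}, set()
batch = [
    {"event_id": "e1", "device_id": "D1", "speed_kmph": 40.0},
    {"event_id": "e2", "device_id": "D2", "speed_kmph": 55.0},
]
ingest(batch, store, seen)
ingest(batch, store, seen)  # replayed batch: state is unchanged
print(store)  # {'D1': 40.0, 'D2': 55.0}
```

Production systems track the seen-id set durably (e.g. a unique key in the sink database) rather than in memory, but the invariant being tested is the same.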


Nice-to-Haves
  • Experience with ClickHouse Cloud or managing SharedMergeTree engines.
  • Knowledge of advanced stream processing frameworks (e.g., Apache Flink).


Salary: As per industry standards

Experience: 3+ Years

Role Type: Full-Time





