
Data Engineer

Fluid Touch Pte Ltd

2 - 5 years

Hyderabad

Posted: 15/05/2026


Job Description

Job Title: Data Engineer - Analytics & AI Insights Platform

Experience: 3+ Years

Employment Type: Full-Time

Location: Hyderabad, On-Site

About the Role

We are the team behind Noteshelf, a leading note-taking app loved by millions of users across iOS and Android. As we scale, data is becoming the backbone of every decision we make, from understanding how users engage with their notes to improving the product experience in real time.

We're looking for a talented Data Engineer to own and evolve our entire data journey: from instrumenting analytics events inside the Noteshelf iOS & Android apps, to ingesting and processing that data in the cloud, to powering an AI-driven analytics chat application where any team member can type a question in plain English and instantly get insights from our data.

This is a high-impact, end-to-end role. You'll shape the data foundation of our product from the ground up.

What You'll Build & Own

Mobile & App Analytics Instrumentation

  • Integrate and manage analytics SDKs (e.g., Firebase, Amplitude, Segment, or custom event tracking) within the Noteshelf iOS and Android apps.
  • Define, standardize, and maintain the analytics event taxonomy, covering key user actions such as note creation, handwriting input, page interactions, sync events, and subscription flows, in collaboration with Product and Engineering.
  • Ensure reliable, accurate, and complete event capture across all key user flows.
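One way to picture the taxonomy enforcement described above is a small validator that rejects events with unregistered names or missing required properties. This is only a sketch; the event names and properties below are hypothetical examples, not Noteshelf's actual schema.

```python
# Minimal event-taxonomy validator. Event names and required
# properties here are hypothetical, for illustration only.
TAXONOMY = {
    "note_created": {"note_id", "notebook_id", "input_mode"},
    "page_turned": {"note_id", "page_index"},
    "sync_completed": {"device_id", "duration_ms"},
}

def validate_event(name: str, properties: dict) -> list[str]:
    """Return a list of problems; an empty list means the event is valid."""
    if name not in TAXONOMY:
        return [f"unknown event name: {name}"]
    missing = TAXONOMY[name] - properties.keys()
    return [f"missing property: {p}" for p in sorted(missing)]
```

Running a check like this in CI (or at SDK-wrapper level in the apps) is one common way to keep event capture complete and consistent across platforms.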

Cloud Data Infrastructure

  • Design and build scalable data ingestion pipelines that collect analytics events and land them in cloud storage (GCP Cloud Storage / AWS S3).
  • Set up and manage a modern cloud data warehouse, Google BigQuery (preferred for analytics scale) or AWS Redshift, for structured querying and reporting.
  • Implement both batch and real-time/streaming data processing (Pub/Sub, Kafka, or Kinesis) as the product scales.
  • Ensure data quality, reliability, governance, and cost efficiency across all pipelines.
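For the batch side of the ingestion described above, a minimal sketch might serialize event batches as newline-delimited JSON under date-partitioned object keys, a layout that BigQuery and Redshift load jobs commonly consume. The prefix and key layout here are illustrative assumptions, not a prescribed design.

```python
import json
from datetime import datetime, timezone

def partition_key(prefix: str, event_time: datetime, batch_id: str) -> str:
    """Date-partitioned object key (Hive-style dt=/hour= layout) for cloud storage."""
    return f"{prefix}/dt={event_time:%Y-%m-%d}/hour={event_time:%H}/{batch_id}.ndjson"

def serialize_batch(events: list[dict]) -> bytes:
    """Newline-delimited JSON, a format warehouse load jobs accept."""
    return "\n".join(json.dumps(e, sort_keys=True) for e in events).encode()
```

Partitioning by date and hour keeps backfills cheap and lets the warehouse prune scans by partition.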

AI-Powered Analytics Chat Application

  • Build the data and backend layer powering a conversational analytics interface.
  • Develop a Text-to-SQL pipeline where user queries (typed in natural language) are interpreted by an LLM and translated into accurate SQL queries against our data warehouse.
  • Integrate with LLM APIs (OpenAI / Anthropic Claude) and frameworks like LangChain or LlamaIndex to handle query understanding, context management, and result summarization.
  • Design and expose clean data APIs that feed results back to the chat interface.
  • Ensure query results are accurate, performant, and contextualized with the right metadata for the AI layer to interpret.
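The plumbing around a Text-to-SQL pipeline like the one above might look roughly like this sketch: pass the warehouse schema to the LLM as context, then run a read-only guardrail on the generated SQL before executing it. The LLM call itself is left out, and the schema, function names, and prompt wording are illustrative assumptions, not part of any actual stack.

```python
import re

# Hypothetical warehouse schema passed to the LLM as context.
SCHEMA_HINT = "Table events(user_id STRING, event_name STRING, event_ts TIMESTAMP)"

def build_prompt(question: str) -> str:
    """Give the LLM the schema as context and ask for a single SELECT."""
    return (
        "You translate questions into BigQuery SQL.\n"
        f"Schema:\n{SCHEMA_HINT}\n"
        f"Question: {question}\n"
        "Return a single SELECT statement."
    )

# Keywords that must never appear in generated, user-facing SQL.
FORBIDDEN = re.compile(r"\b(insert|update|delete|drop|alter|create|merge|grant)\b", re.I)

def is_safe_select(sql: str) -> bool:
    """Guardrail: accept only read-only SELECT statements before execution."""
    stripped = sql.strip().rstrip(";")
    return stripped.lower().startswith("select") and not FORBIDDEN.search(stripped)
```

In practice the guardrail would also be backed by a read-only warehouse role, so a missed keyword can't mutate data.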

Data Modeling & Reporting

  • Build clean, well-documented data models using dbt for metrics, KPIs, and product analytics.
  • Create reusable datasets and semantic layers that power both the chat app and any BI dashboards.
  • Collaborate with Product and Business teams to define KPIs and translate them into reliable, queryable data structures.

Required Technical Skills

Programming & Querying

  • Python: data pipelines, scripting, API development
  • SQL: advanced querying, optimization, data modeling
  • PySpark: distributed processing (good to have)

Cloud Platforms

  • GCP / AWS
  • BigQuery
  • Redshift / Athena
  • Cloud Storage / S3
  • Cloud Dataflow / Dataproc, Glue / EMR
  • Pub/Sub, Kinesis / SQS
  • Cloud Functions, Lambda

Data Engineering & Orchestration

  • dbt: data transformation and modeling
  • Apache Airflow / Prefect / Dagster: pipeline orchestration
  • Apache Spark / Databricks: large-scale data processing

AI & LLM Integration

  • Experience working with LLM APIs (OpenAI, Anthropic, Gemini)
  • Familiarity with LangChain, LlamaIndex, or similar frameworks
  • Understanding of Text-to-SQL, prompt engineering, and RAG patterns
  • Ability to build reliable AI pipelines where accuracy and data trust are critical

Analytics & Event Tracking

  • Experience with analytics SDKs: Firebase Analytics, Amplitude, Segment, or custom eventing
  • Understanding of event-driven data models and funnel/retention analytics

Databases

  • PostgreSQL / MySQL
  • Experience with columnar data warehouses (BigQuery, Redshift, Snowflake)

DevOps & Collaboration

  • Git, GitHub / GitLab
  • CI/CD pipelines (GitHub Actions, Cloud Build)
  • Familiarity with Docker / containerisation


Preferred Qualifications

  • 3+ years of experience in Data Engineering or a closely related role.
  • Proven track record of building end-to-end data pipelines from ingestion to insight.
  • Hands-on experience with iOS and/or Android app analytics instrumentation, ideally in a consumer app with a large user base.
  • Prior experience building or contributing to AI/LLM-powered data applications.
  • Strong understanding of data quality, reliability engineering, and data contracts.
  • Comfortable working in a fast-paced startup environment where ownership and initiative matter.
  • Excellent communication skills; able to work directly with non-technical stakeholders to understand data needs.

Nice to Have

  • Experience with streaming data architectures (Kafka, Pub/Sub, Flink).
  • Familiarity with BI tools (Looker, Metabase, Superset).
  • Backend API development experience (FastAPI, Flask) for serving data to applications.
  • Knowledge of data privacy regulations (GDPR, CCPA) and anonymization techniques.

What We Offer

  • Shape the data culture of a product used by millions of users globally across iOS and Android.
  • Work directly with product, engineering, and leadership teams.
  • Exposure to cutting-edge AI + data technologies.
  • Competitive compensation with equity.
  • Flexible, asynchronous work culture.
  • A team that values craftsmanship, curiosity, and impact over process.
