
Technical Lead

Nuaav

5 - 10 years

Noida

Posted: 12/04/2026


Job Description

If you are keen, please apply with your resume and details. Immediate joiners preferred.


About the Company

Nuaav is a boutique technology consulting firm focused on delivering innovative, scalable, and secure data engineering and AI solutions. We partner with corporate clients to drive impactful change through specialized expertise in product engineering, data systems, AI/ML, and user experience. Our services emphasize personalized attention, agility, and quality, enabling clients to confidently transform and grow in evolving business environments.


About the Role

We are seeking a hands-on Tech Lead with over 10 years of work experience, including 7+ years in data engineering, to lead an in-flight, multi-track project on AWS. The project centers on a data lakehouse platform that ingests data from heterogeneous sources, organizes it through semantic layers, exposes it via REST APIs, and is being enhanced with AI-powered natural language query capabilities and entity resolution. The ideal candidate brings deep experience with at least one modern data lakehouse, data warehouse, or data mesh platform (e.g., Databricks, Snowflake, Dremio, Apache Iceberg/Trino, or similar) and can translate that expertise into leading architecture decisions, pipeline development, API design, and team delivery on AWS.


Responsibilities

  • Lead end-to-end architecture, design, and engineering across all project tracks, serving as the primary technical decision-maker.
  • Design and optimize data ingestion strategies for heterogeneous sources, using native platform connectors where available and custom Python pipelines where not.
  • Architect and maintain semantic layers (raw/bronze, business/silver, application/gold) ensuring data quality, lineage, and governance.
  • Design, develop, and maintain REST APIs (read and write patterns) with appropriate transformations for diverse consumer needs.
  • Drive the AI-powered natural language query capability: define the data catalog and data dictionary structure, integrate with LLM services (Claude or similar), and validate response accuracy.
  • Evaluate, design, and implement deterministic, probabilistic, and ML-based approaches to data entity resolution.
  • Establish and enforce CI/CD, version control, testing, and deployment best practices for data pipelines, APIs, and platform configurations.
  • Collaborate with data scientists, analytics teams, and product stakeholders to translate requirements into technical designs and delivery plans.
  • Mentor and guide team members, conduct code reviews, and foster a culture of engineering excellence.
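To illustrate the deterministic entity-resolution responsibility above, here is a minimal Python sketch. The record fields and normalization rules are hypothetical illustrations, not details taken from the project itself:

```python
import re

def normalize(record):
    """Build a match key by lowercasing and stripping punctuation/whitespace."""
    def clean(s):
        return re.sub(r"[^a-z0-9]", "", s.lower())
    return (clean(record["name"]), clean(record["email"]))

def deterministic_match(a, b):
    """Deterministic rule: two records refer to the same entity
    when their normalized (name, email) keys are identical."""
    return normalize(a) == normalize(b)

r1 = {"name": "Jane Doe", "email": "JANE.DOE@example.com"}
r2 = {"name": "jane doe", "email": "jane.doe@example.com "}
print(deterministic_match(r1, r2))  # True
```

Probabilistic and ML-based approaches generalize this idea by scoring partial matches across many fields instead of requiring exact key equality.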


Qualifications

  • Bachelor's degree in Computer Science, Information Technology, or a related field.

Required Skills

  • 7+ years of hands-on data engineering experience, with at least 2 years in a tech lead or senior technical role.
  • Strong experience with AWS cloud services (S3, CodePipeline/CodeBuild, RDS, Secrets Manager, IAM, and related services).
  • Hands-on expertise with at least one modern data lakehouse, data warehouse, or data mesh platform (e.g., Databricks, Snowflake, Dremio, Trino/Presto, Apache Iceberg, Delta Lake, or similar) with a strong understanding of semantic/virtual layers, reflections/materializations, and query optimization.
  • Proficiency in Python for data pipeline development, API development, and automation.
  • Experience designing and building REST APIs (Flask, FastAPI, or similar frameworks) with transformation logic for diverse consumers.
  • Experience with SQL across multiple database dialects (SQL Server, MySQL, PostgreSQL, Oracle) and with search engines (Solr, Elasticsearch) as data sources.
  • Understanding of data cataloging, data dictionary development, and metadata management practices.
  • Familiarity with entity resolution concepts and techniques (deterministic matching, probabilistic models, or ML-based approaches).
  • Experience with CI/CD pipelines, infrastructure-as-code, and DevOps practices in a cloud environment.
  • Strong communication skills with the ability to articulate technical designs to both technical and non-technical stakeholders.
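The semantic-layer skills above (raw/bronze to business/silver promotion) can be sketched in plain Python. The column names, source rows, and cleaning rules here are assumptions for illustration only:

```python
from datetime import datetime

# Hypothetical bronze (raw) records ingested as-is from a source system.
bronze = [
    {"id": "1", "amount": " 1,200.50", "ts": "2024-01-05T10:00:00"},
    {"id": "1", "amount": " 1,200.50", "ts": "2024-01-05T10:00:00"},  # duplicate
    {"id": "2", "amount": "99", "ts": "2024-01-06T11:30:00"},
]

def to_silver(rows):
    """Bronze -> silver: enforce types, clean strings, drop exact duplicates."""
    seen, silver = set(), []
    for r in rows:
        key = (r["id"], r["ts"])
        if key in seen:
            continue  # deduplicate on the natural key
        seen.add(key)
        silver.append({
            "id": int(r["id"]),
            "amount": float(r["amount"].replace(",", "").strip()),
            "ts": datetime.fromisoformat(r["ts"]),
        })
    return silver

silver = to_silver(bronze)
print(len(silver))  # 2
```

On a real lakehouse platform the same quality, deduplication, and typing rules would be expressed as pipeline transformations (e.g., Spark or SQL), with lineage and governance tracked by the platform's catalog.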


Preferred Skills

  • Experience with Dremio (Arctic, Sonar, reflections, virtual datasets).
  • Experience integrating LLMs (Claude, GPT, or similar) into data platforms for natural language querying or AI-assisted analytics.
  • Familiarity with data lakehouse file formats (Parquet, Iceberg, Delta) and object storage patterns on S3.
  • Experience with graph-based or network-based entity resolution frameworks (e.g., Zingg, Splink, or custom solutions).
  • Knowledge of data governance frameworks and tools (e.g., Apache Atlas, Collibra, or similar).
  • Exposure to Power BI or similar BI tools connecting to lakehouse/warehouse platforms.
  • Experience working in agile consulting environments and managing multiple workstreams concurrently.
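For the natural-language query capability mentioned in both sections, one common pattern is serializing the data dictionary into LLM context so the model can generate SQL. A minimal sketch, where the dictionary entries and prompt wording are hypothetical:

```python
# Hypothetical data-dictionary entries describing the semantic layer.
DATA_DICTIONARY = [
    {"table": "gold.customers", "column": "customer_id",
     "description": "Unique customer key"},
    {"table": "gold.customers", "column": "revenue_inr",
     "description": "Annual revenue in INR"},
]

def build_nl2sql_prompt(question: str) -> str:
    """Serialize the catalog into prompt context for SQL generation."""
    schema = "\n".join(
        f"- {e['table']}.{e['column']}: {e['description']}"
        for e in DATA_DICTIONARY
    )
    return (
        "You are a SQL assistant. Available columns:\n"
        f"{schema}\n\n"
        f"Write a single SQL query answering: {question}"
    )

prompt = build_nl2sql_prompt("Which customers have revenue above 1 crore?")
```

The resulting prompt would then be sent to an LLM service (Claude or similar), and the generated SQL validated against the catalog before execution, supporting the response-accuracy validation called out in the responsibilities.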

Pay range and compensation package: INR 35-40 LPA

Candidates able to join immediately or within 30 days will be prioritized for this role.

Location: Noida / Hybrid || Employment Type: Full-time


Equal Opportunity Statement

Nuaav is committed to diversity and inclusivity in the workplace. We encourage applications from individuals of all backgrounds and experiences.


If you are keen, please share your profile and details at artee.puri@nuaav.com or hiring@nuaav.com.
