AI Engineer (Conversational Analytics & GenAI Systems)

IRISS, Inc.

3 - 5 years

Bengaluru

Posted: 10/12/2025

Job Description

Company Overview:

IRISS, Inc. is a leading innovator in the field of advanced technological solutions, providing cutting-edge products and services to enhance safety, reliability, and efficiency of assets across various industries. Our commitment to pushing boundaries and delivering exceptional solutions has positioned us as a trusted partner for clients seeking top-tier technical expertise in Condition Based Monitoring.




Position: AI Engineer (Conversational Analytics & GenAI Systems)

Location: Bengaluru, India


About the Product:

You will work on IRISS's conversational analytics platform, a GenAI-powered chatbot that transforms natural language queries into validated, compliant, and tenant-aware SQL and visual insights. The platform enables users to ask business questions such as "Show me last month's motor temperature anomalies in Plant 3" and get immediate, accurate dashboards and reports generated safely through AI-driven data pipelines.


Our AI stack:

- Interprets user intent using LLMs.

- Generates validated, policy-compliant SQL.

- Executes and visualizes data with context and feedback loops.

- Powers a Retrieval-Augmented Generation (RAG) framework integrated with existing IoT and analytics microservices.
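
The intent-to-SQL flow described above can be sketched as a minimal pipeline. This is an illustrative sketch only: `call_llm`, `validate_sql`, and `answer` are hypothetical stand-ins for the real LLM client (e.g. Bedrock), policy engine, and database layer.

```python
# Minimal sketch of a natural-language-to-SQL pipeline.
# call_llm and validate_sql are hypothetical stand-ins for the real
# LLM client and policy-enforcement layer.

def call_llm(question: str, schema: str) -> str:
    # Stub: a real implementation would prompt an LLM with the question
    # and the tenant's schema, then return generated SQL.
    return ("SELECT sensor_id, AVG(temperature) AS avg_temp "
            "FROM telemetry WHERE tenant_id = :tenant_id "
            "GROUP BY sensor_id")

ALLOWED_VERBS = ("SELECT",)

def validate_sql(sql: str) -> str:
    # Reject anything that is not a read-only, tenant-scoped query.
    statement = sql.strip().rstrip(";")
    if not statement.upper().startswith(ALLOWED_VERBS):
        raise ValueError("only read-only queries are permitted")
    if ":tenant_id" not in statement:
        raise ValueError("query must be tenant-scoped")
    return statement

def answer(question: str, schema: str) -> str:
    sql = call_llm(question, schema)
    return validate_sql(sql)  # safe to hand to the execution layer

print(answer("Show me last month's motor temperature anomalies", "telemetry(...)"))
```

The key design point is that the LLM's output is never executed directly; it always passes through the validation gate first.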


Job Overview:

You will design, develop, and maintain the AI chatbot platform that serves as the intelligence layer for our SaaS ecosystem. This includes architecting end-to-end conversational pipelines from LLM prompt design to data retrieval, integrating vector-based search systems and RAG pipelines into our service mesh, leveraging AWS AI/ML and orchestration services such as Bedrock, Kendra, OpenSearch, Lambda, ECS, and S3 to build scalable and secure infrastructure, and partnering with full-stack and front-end engineers to embed AI features directly into user workflows.


Backend:

- ASP.NET Core with ABP & ASP.NET Zero modules, EF Core, and SQL Server for tenancy-aware domain logic

- Python (FastAPI/Flask) for new microservices and migration targets

- APIs: SignalR hubs and REST endpoints exposed through the Web Host

- Infrastructure:

- AWS Services: ECS for container orchestration, RDS (Aurora) for databases, S3 for storage, Lambda for serverless functions

- Hangfire for background jobs; log4net + custom middleware for correlation-aware logging

- HealthChecks, Stripe + Firebase integrations

- DevOps: AWS CDK-driven Infrastructure as Code with containerized services, Redis caching, and microservice extensions


Frontend:

- Angular 18 (with standalone components support)

- TypeScript 5.5

- RxJS 7.4 for reactive programming

- PrimeNG, Angular Material, ngx-charts for UI components


Key Responsibilities:

- Design and implement backend services in .NET Core (ASP.NET Core Web API) using Entity Framework Core and LINQ

- Help migrate our backend APIs to a Python microservices architecture

- Build clean, testable Angular 18+ UIs and reusable (standalone) components

- Design and evolve multi-tenant backend services for assets, sensors, work orders, notifications, and AI workflows

- Integrate data sources: SQL (SQL Server/Aurora) and InfluxDB for time-series telemetry

- Implement background jobs, rate limiting, and observability using Hangfire, Redis, and log-enrichment patterns

- Extend REST and SignalR endpoints while maintaining tenant isolation and role-based access control

- Collaborate with IoT and data teams to expose sensor data, alerts, reports, and analytics

- Implement authentication/authorization, input validation, and error handling across the stack

- Participate in code reviews, ADRs, grooming, and release-readiness checks

- Contribute to CI/CD pipelines (GitHub Actions), basic observability, and performance profiling

- Define service boundaries, transactional integrity, and performance within core application layers
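
Tenant isolation, mentioned above, essentially means every read is forced through a tenant filter so one tenant can never see another's rows. A minimal sketch, with hypothetical names (`Asset`, `fetch_assets`) standing in for the real domain model:

```python
# Illustrative sketch of tenant isolation at the query layer.
# The in-memory DB list stands in for a real multi-tenant table.
from dataclasses import dataclass

@dataclass
class Asset:
    id: int
    tenant_id: str
    name: str

DB = [
    Asset(1, "acme", "Motor A"),
    Asset(2, "acme", "Pump B"),
    Asset(3, "globex", "Fan C"),
]

def fetch_assets(tenant_id: str) -> list[Asset]:
    # The tenant filter is applied unconditionally, mirroring how an
    # ABP/EF Core global data filter or a FastAPI dependency would
    # inject the caller's tenant into every query.
    return [a for a in DB if a.tenant_id == tenant_id]

print([a.name for a in fetch_assets("acme")])  # only acme's assets
```

In production this filter lives in the data-access layer, not in each endpoint, so no individual handler can forget it.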



Core Stack & Technologies


AI/ML & Data Intelligence


- Python 3.10+ (FastAPI, LangChain, Haystack, or equivalent)


- LLMs: OpenAI, Anthropic, Hugging Face, or open-source models (LLaMA, Mistral, Falcon)


- RAG Systems: FAISS, Pinecone, OpenSearch Vector Store, or ChromaDB

- Prompt Orchestration: LangChain, Semantic Kernel, or internal tooling


- Data Validation & Safety: SQL sanitization layers and policy enforcement modules


- Visualization Layer: Chart.js or D3.js integration for generated insights
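
The embeddings-based retrieval at the heart of a RAG pipeline can be illustrated with a toy example. A real system would use a vector store (FAISS, Pinecone, ChromaDB) and learned embeddings; here a bag-of-words vector and cosine similarity stand in for both, purely to show the shape of retrieval:

```python
# Toy sketch of embeddings-based retrieval for a RAG pipeline.
import math
from collections import Counter

def embed(text: str) -> Counter:
    # Stand-in for a learned embedding model.
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[t] * b[t] for t in a)
    norm = (math.sqrt(sum(v * v for v in a.values()))
            * math.sqrt(sum(v * v for v in b.values())))
    return dot / norm if norm else 0.0

DOCS = [
    "motor temperature limits for plant 3",
    "work order escalation policy",
    "sensor calibration schedule",
]

def retrieve(query: str, k: int = 1) -> list[str]:
    # Rank documents by similarity to the query and return the top k,
    # which would then be injected into the LLM prompt as context.
    q = embed(query)
    ranked = sorted(DOCS, key=lambda d: cosine(q, embed(d)), reverse=True)
    return ranked[:k]

print(retrieve("motor temperature anomalies in plant 3"))
```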


Cloud & Infrastructure:

- AWS Bedrock, Kendra, OpenSearch, Lambda, S3, CloudWatch, ECS, and EC2

- API Gateway for AI microservices

- Redis or DynamoDB for caching and conversation state

- OpenTelemetry for observability

- CI/CD using GitHub Actions, AWS CDK, and Docker-based microservices
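
Conversation state, mentioned above, typically means keeping a bounded per-conversation history so multi-turn context fits in the prompt. A sketch under stated assumptions: a plain dict stands in for Redis/DynamoDB, and the keys and trim policy are illustrative, not the production schema.

```python
# Sketch of per-conversation state with a bounded context window.
STORE: dict[str, list[dict]] = {}
MAX_TURNS = 4  # keep only the most recent turns to bound prompt size

def append_turn(conversation_id: str, role: str, text: str) -> list[dict]:
    history = STORE.setdefault(conversation_id, [])
    history.append({"role": role, "text": text})
    # Trim oldest turns first, like an LTRIM on a Redis list.
    del history[:-MAX_TURNS]
    return history

for i in range(6):
    append_turn("conv-1", "user", f"question {i}")

print(len(STORE["conv-1"]))  # only the newest turns survive
```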

Front-End & Integration:

- Works closely with Angular 18+ applications and .NET/Python backend microservices

- Exposes APIs to the Full-Stack and Front-End teams for seamless user interactions

- Implements real-time feedback mechanisms for model evaluation and tuning


Key Responsibilities:

- Architect, develop, and maintain the GenAI chatbot platform from the ground up

- Build multi-turn conversation flows and contextual memory for data queries

- Implement RAG pipelines using vector databases and curated embeddings

- Integrate open-source and commercial LLMs through APIs or local deployment

- Create safety and compliance modules that validate SQL and policy rules before execution

- Collaborate with backend engineers to design AI microservices that scale horizontally

- Deploy, monitor, and optimize models using AWS Bedrock, Kendra, and OpenSearch

- Maintain observability and feedback loops for improving model accuracy and reliability

- Partner with front-end teams to deliver chat-first analytics interfaces

- Contribute to documentation, testing, and architectural decision records for AI systems
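
The safety and compliance gate described in the responsibilities above might look like the following. This is only a hedged sketch of the shape of the check; a production system would use a proper SQL parser and policy engine rather than a keyword denylist.

```python
# Illustrative SQL safety gate run before any generated query executes.
import re

FORBIDDEN = re.compile(r"\b(DROP|DELETE|UPDATE|INSERT|ALTER|GRANT|TRUNCATE)\b",
                       re.IGNORECASE)

def is_safe(sql: str) -> bool:
    statement = sql.strip().rstrip(";")
    if ";" in statement:             # reject stacked statements
        return False
    if FORBIDDEN.search(statement):  # reject write/DDL verbs
        return False
    return statement.upper().startswith("SELECT")

print(is_safe("SELECT * FROM alerts WHERE tenant_id = 7"))  # True
print(is_safe("SELECT 1; DROP TABLE alerts"))               # False
```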


Requirements:

- Bachelor's or Master's degree in Computer Science, Data Science, or a related field

- Minimum 3 years of experience developing and deploying AI-powered applications or chatbots

- Strong Python expertise (FastAPI, Flask, or Django for microservices)

- Experience with LLM integration (OpenAI, Bedrock, Hugging Face, or local models)

- Hands-on experience with the AWS ecosystem, including Bedrock, Kendra, OpenSearch, ECS, Lambda, and CloudWatch

- Deep understanding of RAG architecture, vector databases, and embeddings-based retrieval

- Knowledge of prompt design, model orchestration, and AI safety validation

- Familiarity with SQL and multi-tenant data systems

- Experience with Docker, Git-based CI/CD, and microservice architectures

Nice-to-Have:

- Experience fine-tuning or hosting open-source LLMs (LLaMA, Mistral, Falcon)

- Understanding of LangChain Agents or Semantic Kernel pipelines

- Familiarity with Angular and .NET ecosystems for end-to-end integration

- Exposure to observability frameworks such as OpenTelemetry, Prometheus, or Grafana

- Knowledge of enterprise data governance and AI compliance frameworks

- Contributions to open-source AI projects or custom LLM integrations

What You'll Work On:

- Migration of .NET Core backend services to Python microservices

- Tenant-aware APIs powering asset hierarchies, predictive maintenance, and automated work orders

- Real-time dashboards and notifications for sensor events, alerts, and chat integration

- Performance and reliability for data-heavy dashboards (pagination, caching, change detection)

- Background workflows orchestrating AI-driven insights and report exports

- REST services consumed by Angular dashboards and mobile clients

- Observability hooks (health checks, telemetry, correlation IDs) for enterprise-grade reliability

- Developer experience improvements (codegen, linting, templates, better local envs)


What You Will Build:

- A conversational analytics chatbot capable of generating real-time, compliant SQL queries

- RAG pipelines that fetch and embed domain knowledge across tenants

- Context-aware AI microservices integrated with IRISS's monitoring and reporting systems

- Evaluation dashboards for prompt performance, latency, and query accuracy

- Continuous learning and feedback loops to improve the GenAI system over time

Development Environment:

- Python 3.10+, FastAPI, LangChain

- AWS Bedrock, OpenSearch, Kendra, Lambda, ECS

- Angular 18+ for embedded UIs

- Node.js 16+, Yarn, VS Code

- GitHub Actions and AWS CDK for CI/CD

- Dockerized microservices architecture


Compensation:

Competitive salary, benefits, and strong growth opportunities.
