AI Engineer
Genisys Group
2 - 5 years
Bengaluru
Posted: 29/01/2026
Job Description
Job Description: Junior Engineer, Hands-On (Agentic AI) / AI Engineer
Role Details
- Experience: 2-4 years
- Primary Tech/Domain: Agentic AI/GenAI/DevOps/ConvAI/MLOps
- 100% hands-on
Overview & Expectations
Role Summary:
Lead and deliver high-impact initiatives aligned to the AI & Data charter. Own execution excellence with measurable business value, technical depth, and governance.
Key Outcomes (0-6 months):
- Ship production-grade solutions with clear ROI, reliability (SLOs), and security.
- Establish engineering standards, pipelines, and observability for repeatable delivery.
- Build GenAI applications.
- Mentor talent; uplift team capability through reviews, playbooks, and hands-on guidance.
Responsibilities:
- Translate business problems into well-posed technical specifications and architectures.
- Lead design reviews, prototype quickly, and harden solutions for scale.
- Build automated GenAI applications and model/data governance across environments.
- Define and track KPIs: accuracy/latency/cost, adoption, and compliance readiness.
- Partner with Product, Security, Compliance, and Ops to land safe-by-default systems.
Technical Skills:
- Tracks: Agentic AI/GenAI/DevOps/Conversational AI/MLOps
- Python + Cloud: FastAPI, async IO; AWS/Azure/GCP basics
- Agents & RAG: LangChain/CrewAI basics, embeddings, vector DBs
- DevOps: Docker, CI, unit/integration tests, logging
- Conversational AI: intents, NLU, dialog management, evaluation
- MLOps foundations: model packaging, simple pipelines, monitoring
- Design multi-agent architectures using orchestration frameworks (e.g., LangChain/CrewAI/LangGraph) with clear roles, handoffs, and acceptance criteria.
- Build event-driven platforms leveraging cloud messaging (EventBridge/Event Grid/Pub/Sub), durable state, retries, and idempotency for reliable agent workflows.
- Integrate LLMs as reasoning engines (Azure OpenAI/AWS Bedrock/Vertex AI) with tool/function calling, structured outputs (JSON), and guardrails.
- Develop robust tool adapters for agents (search, DB/SQL, vector stores, HTTP APIs, code execution), including error handling, circuit breakers, and fallbacks.
- Implement observability at scale: tracing of agent steps and LLM calls, metrics (latency, cost per task), logs, incident playbooks, and dashboards for SLOs.
- Harden security and compliance: IAM/RBAC, secrets management (Key Vault/KMS/Secret Manager), PII redaction, audit trails, and policy enforcement.
- Optimize deployment and performance: containerized microservices on AKS/EKS/GKE, autoscaling, caching/batching, concurrency controls, and cost governance.
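To give candidates a concrete sense of the tool-adapter expectation above (error handling, retries, fallbacks), here is a minimal Python sketch. The `ToolAdapter` class, its parameters, and the `flaky_search` tool are illustrative assumptions, not part of any actual codebase for this role.

```python
import time


class ToolAdapter:
    """Wraps a tool callable with bounded retries and a fallback result."""

    def __init__(self, tool, retries=2, backoff=0.0, fallback=None):
        self.tool = tool          # the underlying tool callable
        self.retries = retries    # extra attempts after the first failure
        self.backoff = backoff    # seconds to sleep between attempts
        self.fallback = fallback  # value returned when every attempt fails

    def call(self, *args, **kwargs):
        last_error = None
        for attempt in range(self.retries + 1):
            try:
                return self.tool(*args, **kwargs)
            except Exception as exc:
                last_error = exc
                if attempt < self.retries and self.backoff:
                    time.sleep(self.backoff)
        # All attempts failed: degrade gracefully rather than crash the agent.
        if self.fallback is not None:
            return self.fallback
        raise last_error


# Hypothetical usage: a tool that times out once, then succeeds.
calls = {"n": 0}

def flaky_search(query):
    calls["n"] += 1
    if calls["n"] < 2:
        raise TimeoutError("upstream timeout")
    return f"results for {query}"

adapter = ToolAdapter(flaky_search, retries=2)
result = adapter.call("vector DBs")  # succeeds on the second attempt
```

A production version would add per-tool circuit-breaker state (trip after N consecutive failures, half-open probes) and emit latency/error metrics on each call, per the observability bullet above.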
Architecture & Tooling Stack:
- Source control & workflow: Git, branching standards, PR reviews, trunk-based delivery.
- Containers & orchestration: Docker, Kubernetes, Helm; secrets, configs, RBAC.
- Observability: logs, metrics, traces; dashboards with alerting & on-call runbooks.
- Data/Model registries: metadata, lineage, versioning; staged promotions.
Performance & Reliability:
- Define SLAs/SLOs for accuracy, tail latency, throughput, and availability.
- Capacity planning with autoscaling; load tests; cache design; graceful degradation.
- Cost controls: instance sizing, spot/reserved strategies, storage tiering.
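The tail-latency SLO work above boils down to checks like the following sketch: compute a nearest-rank percentile over recorded request latencies and compare it to a target. The p95/300 ms numbers are assumed examples, not requirements of this role.

```python
import math


def percentile(samples, p):
    """Nearest-rank percentile (p in 0..100) of a non-empty sample list."""
    ordered = sorted(samples)
    k = math.ceil(p / 100 * len(ordered)) - 1
    return ordered[max(0, k)]


def meets_slo(latencies_ms, p=95, target_ms=300):
    """True when the p-th percentile latency is within the SLO target."""
    return percentile(latencies_ms, p) <= target_ms


# Hypothetical usage: one 500 ms outlier among a hundred 100 ms requests
# still satisfies a p95 <= 300 ms objective.
ok = meets_slo([100] * 99 + [500], p=95, target_ms=300)
```

In practice these samples would come from the tracing/metrics pipeline described earlier, with the check evaluated per window to drive alerting and error budgets.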
Qualifications:
- Bachelor's/Master's in CS/CE/EE/Data Science or equivalent practical experience.
- Strong applied programming in Python; familiarity with modern data/ML ecosystems.
- Proven track record of shipping and operating systems in production.