
Technical Architect

EXL

4 - 9 years

Gurugram

Posted: 05/03/2026


Job Description

About the Role:


We are looking for a Technical Architect / DevOps & Solution Engineer to join our GenAI team, which builds enterprise-grade GenAI assistants for a variety of analytical and operational use cases. This role sits at the intersection of cloud infrastructure, DevOps, ML/GenAI systems, and solution architecture.

You will help design, deploy, and operate scalable, secure, and observable GenAI platforms, working closely with ML engineers, data scientists, backend engineers, and product teams. The ideal candidate has hands-on experience deploying Python-based ML/GenAI systems to the cloud and is eager to grow into a broader solution architecture role.


Key Responsibilities - Cloud & DevOps:


Design, deploy, and maintain cloud-native infrastructure for GenAI and ML workloads.

Build and operate CI/CD pipelines for backend services, ML pipelines, and GenAI applications.

Containerize applications using Docker and deploy them using managed container platforms or orchestration tools.

Implement high availability, horizontal scaling, auto-scaling, and cost optimization strategies.

Set up and maintain monitoring, logging, alerting, and observability for production systems.

Ensure security best practices across infrastructure, secrets management, networking, and CI/CD pipelines.

Troubleshoot production issues, perform root cause analysis, and improve system reliability.


Key Responsibilities - GenAI & ML Platform Enablement:

Support deployment and operation of GenAI assistants, LLM-powered services, and ML APIs.

Work closely with ML engineers to productionize model inference services, prompt pipelines, and RAG systems.

Enable stateless, scalable architectures for GenAI applications.

Assist in setting up model serving, experiment tracking, and versioning workflows.

Contribute to infrastructure design for agent-based and multi-step GenAI workflows.


Key Responsibilities - Solution Architecture & Collaboration:


Translate functional and non-functional requirements into cloud and platform architecture designs.

Partner with backend, frontend, and ML teams to ensure end-to-end solution scalability and reliability.

Document architecture, deployment patterns, and operational runbooks.

Support POCs and gradually help scale them into production-ready systems.


Required Qualifications:


4-9 years of professional experience in DevOps, cloud engineering, platform engineering, or solution engineering roles.

Strong Python skills, especially for backend services, ML/GenAI pipelines, or automation.

Hands-on experience deploying applications on at least one major cloud provider (AWS, Azure, or GCP).

Experience with Docker and container-based deployments.

Working knowledge of CI/CD tools and automation pipelines.

Experience monitoring production systems (metrics, logs, alerts, cost monitoring).

Exposure to ML or GenAI projects such as LLM-based applications, ML inference APIs, or data/ML pipelines.

Understanding of microservices architecture, APIs, and cloud-native design principles.


Good to Have:


Multi-cloud exposure (AWS, Azure, GCP).

Prior experience deploying enterprise or customer-facing cloud solutions end to end.

Experience with GenAI frameworks such as LangChain, LlamaIndex, or similar.

Exposure to agentic AI systems (any framework), preferably LangGraph.

Experience deploying or operating MCP (Model Context Protocol) servers or similar orchestration components.

Familiarity with Kubernetes or managed Kubernetes services.

Knowledge of Infrastructure-as-Code tools (Terraform, ARM, CDK, etc.).

Understanding of security, compliance, and governance considerations for GenAI systems.
