Hi,

I'm Gurunathan
AI Solutions Architect · Agentic AI & Immersive Experiences

I design and build production-grade AI systems, multi-agent architectures, and real-time XR platforms that power intelligent automation, simulations, and interactive experiences.

Download CV
More About Me – Guru’s Folio

Building the future of
intelligent, immersive systems

I’m an AI Solutions Architect specialising in agentic AI pipelines, conversational AI systems, immersive XR platforms, industrial digital twins, and real-time voice-driven interfaces. I architect end-to-end solutions that are built to scale, built to last.

My track record includes production-grade systems for global enterprises such as Airbus, Deutsche Bahn, Starbucks, Koch, Flinthill Resources, and Invista, spanning aerospace, rail, industrial, and consumer domains.

I enjoy building products from the ground up — validating ideas through rapid prototyping, testing with real users, and delivering scalable, high-performance systems that connect people, data, and environments in powerful new ways.

My work sits at the intersection of two worlds that rarely converge — deep XR engineering and applied AI architecture. The goal isn’t to integrate AI into experiences — it’s to design systems where AI and immersion are inseparable.

My Approach & Philosophy

I focus on opportunities over challenges, exploring creative ways to innovate even under constraints. Guided by data-driven decision-making, I’m passionate about software architecture and designing systems that are both elegant and efficient.

Collaboration is at the core of how I work. I believe excellence achieved as a team is far more powerful than individual success. Art and technology blend naturally in everything I build — from immersive XR to intelligent, AI-driven systems.

I’m a lifelong learner and technology enthusiast. Outside of work I stay active through football and the gym, and love connecting with like-minded creators, technologists, and innovators.


What I bring to the table

Immersive & Real-Time Systems

Unity · Unreal Engine · AR/VR/MR · NVIDIA Omniverse · Digital Twins · IoT / MQTT · Photon · Meta Quest · HoloLens · PICO · Magic Leap

Agentic AI & Orchestration

LangChain · LangGraph · Multi-Agent Systems · RAG Pipelines · ChromaDB · Prompt Engineering · Tool Use · Memory Systems · Human-in-the-Loop · Agent Evaluation

Voice & Conversational AI

ElevenLabs TTS · Deepgram STT · Agora SDK · Real-time Voice · Hume AI · OVR Lip Sync · XR Avatars

Cloud AI & Azure

Azure OpenAI · Azure AI Foundry · Azure AI Search · Cognitive Services · Azure STT/TTS · Responsible AI · Content Safety

Architecture & Engineering

End-to-end AI Systems · C# · Python · Node.js · JavaScript · Firebase · PostgreSQL · Redis · AWS · Azure · GCP · REST APIs

Leadership & Delivery

Enterprise Delivery · Stakeholder Management · Cross-functional Teams · Agile · Technical Docs · Mentoring · Performance Optimisation

How I design AI systems

When architecting an AI solution, I think in layers — each one solving a distinct problem. What makes my approach different is that every layer is also designed to surface inside a real-time human experience, not just run as a backend service.

01

LLM Orchestration & Tool Use

The reasoning core. Deciding when to call tools, which model fits the task, how to constrain outputs to stay grounded and safe, and how to handle tool failures gracefully.

LangGraph · Azure OpenAI · Tool Binding · Output Parsers · Responsible AI

In immersive use

Powers intelligent avatar responses, context-aware NPC behaviour, and adaptive training flows in real time
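As a concrete sketch of this layer: the dispatch loop below routes a model's tool request and degrades gracefully when a tool fails. It is plain, framework-free Python; the tool name and the request shape are illustrative assumptions, not a specific framework API.

```python
# Minimal tool-use dispatch: the model emits a tool request, the
# orchestrator runs it and never lets a tool error crash the turn.
# The tool and the {"name", "args"} request shape are illustrative.

def lookup_weather(city: str) -> str:
    if city == "Paris":
        return "18°C, clear"
    raise KeyError(f"no data for {city}")

TOOLS = {"lookup_weather": lookup_weather}

def dispatch(tool_call: dict) -> str:
    """Run the requested tool; surface failures back to the model as text."""
    tool = TOOLS.get(tool_call["name"])
    if tool is None:
        return f"[error] unknown tool: {tool_call['name']}"
    try:
        return tool(**tool_call["args"])
    except Exception as exc:
        return f"[error] {tool_call['name']} failed: {exc}"

dispatch({"name": "lookup_weather", "args": {"city": "Paris"}})  # -> "18°C, clear"
```

Returning the error as a string, rather than raising, is what lets the reasoning core see the failure and choose a recovery path on the next step.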
02

Multi-Agent Coordination

Breaking complex problems into specialised agents with clear responsibilities — a tutor agent, an evaluation agent, an emotion agent — each doing one job well, orchestrated by LangGraph.

LangGraph Nodes · Conditional Routing · Agent Handoff · State Management

In immersive use

Multiple agents running simultaneously — one listening, one evaluating emotion, one generating the avatar’s next response
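The routing pattern behind this can be sketched without any framework: a router function inspects shared state and decides which specialised agent runs next, which is the role LangGraph's conditional edges play. The agent names and state keys below are illustrative.

```python
# Conditional routing across specialised agents sharing one state dict.
# Agents ("listener", "emotion", "tutor") and state keys are illustrative.

def listener(state: dict) -> dict:
    state["transcript"] = state["audio"].lower()
    return state

def emotion(state: dict) -> dict:
    state["frustrated"] = "stuck" in state["transcript"]
    return state

def tutor(state: dict) -> dict:
    tone = "encouraging" if state["frustrated"] else "neutral"
    state["reply"] = f"[{tone}] Let's look at that step again."
    return state

AGENTS = {"listener": listener, "emotion": emotion, "tutor": tutor}

def route(state: dict) -> str:
    """Pick the next agent based on what the state still lacks."""
    if "transcript" not in state:
        return "listener"
    if "frustrated" not in state:
        return "emotion"
    if "reply" not in state:
        return "tutor"
    return "END"

def run(state: dict) -> dict:
    while (nxt := route(state)) != "END":
        state = AGENTS[nxt](state)
    return state

result = run({"audio": "I'm STUCK on lesson two"})
```

Each agent does one job well; the router, not the agents, owns the control flow, so agents can be swapped or added without touching each other.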
03

Memory Architecture

Short-term context for the current session. Long-term persistence across sessions via database checkpointers. The difference between an AI that feels present and one that forgets you.

MemorySaver · PostgresSaver · Session Context · Cross-session Recall

In immersive use

Avatar remembers previous training sessions — what the learner struggled with, what topics are complete, how to adapt the next session
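A minimal sketch of the two tiers, assuming an illustrative profile schema: a rolling short-term buffer for the current session, plus a keyed long-term store that survives sessions, which is the role a checkpointer such as MemorySaver or PostgresSaver plays in LangGraph.

```python
# Two memory tiers: a short-term context window (per session) and a
# long-term store keyed by learner id (across sessions). The profile
# schema here is an illustrative assumption.

class Memory:
    def __init__(self, window: int = 4):
        self.window = window
        self.long_term: dict[str, dict] = {}  # learner_id -> persistent profile

    def session(self, learner_id: str):
        """Start a session: fetch (or create) the profile, empty the buffer."""
        profile = self.long_term.setdefault(
            learner_id, {"struggled_with": [], "completed": []}
        )
        return profile, []

    def remember(self, buffer: list, turn: str) -> list:
        """Append a turn, keeping only the most recent window of context."""
        buffer.append(turn)
        return buffer[-self.window:]

mem = Memory(window=2)
profile, buf = mem.session("learner-42")
profile["struggled_with"].append("fractions")
buf = mem.remember(buf, "q1")
buf = mem.remember(buf, "q2")
buf = mem.remember(buf, "q3")      # oldest turn falls out of the window

profile_later, _ = mem.session("learner-42")   # a later session
```

The short-term buffer bounds prompt size; the long-term profile is what makes the next session open with "last time you struggled with fractions".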
04

RAG Pipelines & Vector Retrieval

Grounding responses in trusted knowledge — chunking, embedding, semantic search, and retrieval. Ensures the AI speaks from your data, not from hallucination.

ChromaDB · Azure AI Search · Embeddings · Semantic Retrieval · LCEL Chains

In immersive use

Industrial digital twins retrieve live equipment specs. Training avatars pull curriculum knowledge. VR assistants answer from product metadata
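The retrieval core can be shown end to end with a toy corpus: chunks are embedded (here with a bag-of-words vector standing in for a real embedding model) and ranked by cosine similarity. The corpus and query are invented for illustration.

```python
# Toy RAG retrieval: embed chunks, rank by cosine similarity, return top-k.
# A Counter of words stands in for a real embedding model; the corpus
# below is invented for illustration.
from collections import Counter
from math import sqrt

def embed(text: str) -> Counter:
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[w] * b[w] for w in a)
    norm = sqrt(sum(v * v for v in a.values())) * sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

chunks = [
    "Pump P-7 max operating pressure is 12 bar",
    "the espresso machine descaling cycle runs weekly",
    "Quest headsets support hand tracking",
]
index = [(c, embed(c)) for c in chunks]   # chunk -> vector

def retrieve(query: str, k: int = 1) -> list[str]:
    """Return the k chunks most similar to the query."""
    ranked = sorted(index, key=lambda p: cosine(embed(query), p[1]), reverse=True)
    return [c for c, _ in ranked[:k]]

retrieve("what is the operating pressure of pump P-7?")
```

In production the Counter is replaced by a real embedding model and the sorted list by a vector store such as ChromaDB or Azure AI Search, but the shape of the pipeline (chunk, embed, rank, return) is exactly this.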
05

Real-Time Interaction Layer

Voice pipelines (STT → LLM → TTS), WebSocket streaming, low-latency response design, and emotion detection. This is where AI stops being a service and becomes a presence.

Deepgram STT · ElevenLabs TTS · Agora SDK · Hume AI · OVR Lip Sync

In immersive use

3D avatars that speak, listen, respond to emotion, and lip-sync in real time — creating a genuinely human-feeling AI interaction inside XR
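The turn loop itself is simple; latency engineering is the hard part. Below is a skeleton of one STT → LLM → TTS turn with all three stages stubbed: in production each stub would be a streaming call to services like those named above, and the wiring shown is an assumption.

```python
# Skeleton of one voice turn: STT -> LLM -> TTS, with per-turn latency
# measured. All three stages are stubs standing in for streaming calls
# to real services (e.g. Deepgram, an LLM endpoint, ElevenLabs).
import time

def stt(audio: bytes) -> str:       # speech-to-text stub
    return audio.decode()

def llm(prompt: str) -> str:        # reasoning stub
    return f"You said: {prompt}"

def tts(text: str) -> bytes:        # text-to-speech stub
    return text.encode()

def voice_turn(audio_in: bytes) -> tuple[bytes, float]:
    """Run one conversational turn and measure its latency, the number
    a low-latency response budget is tracked against."""
    start = time.perf_counter()
    reply_audio = tts(llm(stt(audio_in)))
    return reply_audio, time.perf_counter() - start

audio_out, latency = voice_turn(b"hello avatar")
```

The real systems stream partial results between stages over WebSockets rather than waiting for each to finish, which is what keeps a turn feeling conversational instead of transactional.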
06

Cloud-Native Deployment

Containerised services, managed vector stores, observability via LangSmith, responsible AI guardrails, and scalable infrastructure that holds under enterprise load.

Docker · AWS / GCP / Azure · LangSmith · Content Safety · CI/CD

In immersive use

AI backend serving 1,000+ concurrent XR users with monitored, traced, and auditable agent interactions across every session

Where AI meets immersion

Most AI architects build pipelines. Most XR developers build environments. I build systems where the pipeline is the environment — where the AI response becomes the avatar’s voice, the training scenario’s outcome, or the digital twin’s next state.

🥽

Intelligent XR Training

AI-powered virtual mentors that replace scripted flows with natural conversation — learners ask anything, get grounded, contextually accurate answers in real time.

RAG + Azure OpenAI + STT/TTS → Adaptive training avatar
🏭

Industrial Digital Twins

Physical systems mirrored in XR — engineers interact with live equipment data, get AI-generated operational guidance, and control real hardware from inside VR.

IoT + MQTT + LLM context injection → Real-time industrial AI
🎓

Emotionally Aware AI Tutoring

A 3D teacher avatar that listens, detects emotion, retrieves curriculum knowledge, and adapts its teaching style — remembering each student across every session.

LangGraph + RAG + Hume AI + ElevenLabs → Ms. Nova
🤖

Context-Aware NPC Systems

Virtual characters that don’t follow scripts — they reason about their environment, respond to user intent, and maintain consistent personality across interactions.

Behaviour trees + LLM reasoning → Intelligent NPCs

Currently building

AI Tutoring Platform

Founder & Architect — 2026 – Present

In Development
  • Architecting a full-stack AI tutoring platform featuring a 3D conversational teacher avatar with real-time voice interaction and emotion-aware responses.
  • Designed a multi-agent orchestration pipeline using LangGraph, integrating LLM reasoning, speech (STT/TTS), and emotion detection for adaptive learning experiences.
  • Built end-to-end system across Unity (XR interface), Node.js backend, and cloud infrastructure (PostgreSQL, Redis, Firebase, AWS/GCP).
  • Implementing RAG-based knowledge retrieval using vector embeddings and ChromaDB for context-aware and reliable responses.
  • Designed for global accessibility with multilingual support — English and Hindi at launch, with additional languages planned.

Where I’ve worked

Koch

Aug 2023 – Present

Ontario, Canada

Flinthill Resources · Invista

AI Integration Specialist & XR Solutions Architect

  • Led architecture and delivery of an enterprise-scale AI and XR platform supporting 1,000+ global users across Pico, Meta Quest, mobile, and desktop — serving industrial clients in training and simulation use cases.
  • Transformed static training workflows into adaptive AI-driven systems by deploying real-time conversational AI using Azure OpenAI — combining STT/TTS pipelines and intelligent virtual mentors powered by live LLM interactions.
  • Designed and implemented multi-agent AI architectures integrating LLM reasoning, speech pipelines, and orchestration layers for scalable, modular AI systems.
  • Engineered industrial digital twin solutions integrating real-time IoT telemetry via MQTT, enabling live monitoring and operational optimisation in rail and industrial workflows.
  • Built a platform-agnostic XR framework in Unity/C# enabling single-codebase deployment across Meta Quest, Pico, mobile, and desktop — eliminating platform-specific redevelopment entirely.
  • Established performance engineering standards achieving stable 90Hz on standalone XR devices while reducing overall delivery timelines by ~40%.
  • Defined architecture governance and mentored engineering teams on scalable system design and modular AI integration.

Multiverse Labs

Mar 2022 – Mar 2023

Remote

Shinsegae · Gov. of Sharjah · Starbucks

XR Technical Consultant & AI Systems Integrator

  • Designed and implemented intelligent NPC systems — architecting behaviour trees, state machines, and interaction logic giving virtual characters context-aware, responsive personalities.
  • Led architecture and delivery of AI-enhanced XR platforms serving 1,000+ global users across Pico, Meta Quest, and mobile.
  • Acted as primary bridge between executive leadership and engineering teams — translating product vision into prioritised technical roadmaps within Agile delivery cycles.
  • Introduced platform-level telemetry and analytics to capture user interaction patterns, enabling data-driven feature prioritisation and continuous product optimisation.

Sopra Steria

Jan 2014 – Mar 2022

Bengaluru, India

Airbus · Deutsche Bahn · Stelia · CIMPA

XR Solutions Architect

  • Spearheaded design, development, and delivery of 20+ enterprise AR/VR/XR solutions including industrial digital twins and immersive simulations across aerospace, automotive, and manufacturing domains.
  • Built a Robotic Arm Digital Twin — bidirectional control system connecting Unity XR with physical hardware via MQTT, enabling real-time VR-based industrial control.
  • Developed Smart Drill HoloLens AR application connected over MQTT, providing technicians with contextual real-time operational guidance overlaid on physical machinery.
  • Architected cross-platform XR solutions across Meta Quest, HoloLens, HTC Vive, Nreal, Android, and iOS using Unity and Unreal Engine.
  • Led and mentored a cross-functional team of 5, establishing architectural standards, performance optimisation practices, and reusable system patterns.

Zynga Games

Aug 2011 – Jan 2014

Bengaluru, India

Game Developer

  • Contributed to development of YoVille, Hidden Shadows, and Vampire Wars — developing interactive prototypes and collaborating with artists and principal designers in a fast-paced live-ops environment.
  • Participated in game design workshops and rapid prototyping cycles to validate new mechanics through player testing.

Bharat Electronics Limited

Apr 2011 – Aug 2011

Kerala, India

Interactive Designer / Developer

  • Designed and developed interactive learning solutions for defence and enterprise training use cases — creating wireframes, prototypes, and high-fidelity UI mockups that translated complex content into engaging, accessible experiences.

Pixal

Jul 2009 – Apr 2011

Kerala, India

Interactive Designer / Developer

  • Created 2D and 3D graphic content for enterprise training and product demonstration use cases, working closely with stakeholders to translate complex requirements into engaging visual and interactive solutions.
  • Developed interactive learning experiences using Unity and interactive media tools — building a foundation in end-to-end content production and client-facing delivery.

My toolkit

XR & 3D: Unity 6 · Unreal Engine · Blender · Maya · 3ds Max · ARCore · Vuforia · A-Frame · Babylon.js · NVIDIA Omniverse
AI & Agents: LangChain · LangGraph · ChromaDB · Azure OpenAI · Azure AI Foundry · Azure AI Search · Hume AI · Ollama · Groq
Voice & Speech: ElevenLabs · Deepgram · Agora SDK · Azure TTS/STT · OVR Lip Sync · Azure Cognitive Services
Languages: C# · Python · JavaScript · Node.js · Next.js · C++ · UE Blueprints · HTML5 · CSS
Backend & Cloud: Node.js/Express · PostgreSQL · Redis · Firebase · AWS · GCP · Microsoft Azure · Docker · FastAPI
Networking: Photon · Mirror · Unity Netcode · MQTT · REST APIs · WebSockets
Devices: Meta Quest · PICO · MS HoloLens · HTC Vive · Magic Leap · Nreal · Android · iOS
Dev Tools: GitHub · Perforce · Jira · Azure DevOps · Figma · Adobe Creative Suite · LangSmith

Education & Certifications

Certification – In Progress

Microsoft Azure AI Apps & Agents Developer Associate (AI-103)

Microsoft

Azure AI Foundry · Agentic Systems · RAG Pipelines · Responsible AI · Multimodal AI

Postgraduate Certificate

Virtual Reality Production

Conestoga College, Ontario, Canada

2022 – 2023

President’s Honour List

Degree

Bachelor of Computer Science

Kerala University, India

2005 – 2008


Awards

Sopra Steria Global Innovation Award — Concept Finalist · Jan 2020
Sopra Steria Global Innovation Award — Concept Finalist · Jan 2018
Code Gladiator India — Concept Finalist · Jun 2015

Let’s build something remarkable

Open to senior AI Solutions Architect roles, contract engagements, and freelance AI integration projects.

→ guru.lak87@gmail.com