Prajoth Kumar

Available to hire

Hi, I’m Prajoth Kumar, a Senior AI & Cloud Software Engineer with hands-on experience building cloud-native microservices, enterprise SaaS apps, and scalable AI-driven platforms across multi-cloud environments (GCP, AWS, Azure). I specialize in turning complex business requirements into intelligent applications using distributed data processing, ML pipelines, and Large Language Models (LLMs). I’ve built corporate knowledge assistants, semantic search platforms, and AI-driven analytics services with LangChain, LangGraph, Retrieval-Augmented Generation (RAG), and vector databases to empower enterprise decision-making.

I’m proficient in high-performance Python backends (FastAPI, Flask) and service-oriented architectures, containerization with Docker and Kubernetes, and DevOps/MLOps practices that enable reliable, scalable deployments. I enjoy collaborating with cross-functional teams to deliver data-intensive solutions—from real-time analytics and pricing optimization to network analytics and operational intelligence—while emphasizing observability, security, and iterative experimentation.


Work Experience

Senior AI / Cloud Software Engineer at Model N
July 1, 2024 - Present
Developed a multi-tenant revenue intelligence platform and built cloud-native microservices in Python, FastAPI, and Node.js for contract analytics, pricing validation, and AI-driven recommendations. Implemented multi-cloud architectures (AWS, Azure, GCP) with Docker and Kubernetes, automating infrastructure with Terraform and CI/CD pipelines. Designed and deployed an enterprise knowledge assistant with LLMs, LangChain, and RAG, plus semantic retrieval workflows over Pinecone, FAISS, and OpenSearch to surface pricing clauses and rebate terms. Built distributed data pipelines with Spark, Kafka, and Airflow, unifying ERP, CRM, and CPQ data for analytics and forecasting. Created NLP pipelines with Hugging Face Transformers and exposed results via REST and GraphQL APIs. Implemented asynchronous processing (AsyncIO, multiprocessing) and robust observability with Prometheus, Grafana, and CloudWatch. Automated deployments with GitHub Actions and Jenkins and enforced secure access with OAuth and RBAC.
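As an illustration of the retrieval step behind the knowledge assistant described above, here is a minimal sketch of semantic search over contract clauses. The clause text is hypothetical, and a toy bag-of-words embedding stands in for the production encoder and vector stores (Pinecone, FAISS, OpenSearch):

```python
from collections import Counter
import numpy as np

# Hypothetical contract clauses -- illustrative only, not real platform data.
CLAUSES = [
    "Rebate of 5 percent applies when quarterly volume exceeds 10000 units.",
    "Pricing is fixed for the first 12 months of the contract term.",
    "Either party may terminate with 90 days written notice.",
]

# Vocabulary built from the corpus; a real system would use a learned embedding model.
VOCAB = sorted({tok for clause in CLAUSES for tok in clause.lower().split()})

def embed(text: str) -> np.ndarray:
    """Bag-of-words vector over VOCAB, L2-normalized.
    A toy stand-in for a neural text encoder."""
    counts = Counter(text.lower().split())
    vec = np.array([counts[w] for w in VOCAB], dtype=float)
    norm = np.linalg.norm(vec)
    return vec / norm if norm else vec

def semantic_search(query: str, docs: list[str], top_k: int = 1) -> list[str]:
    """Rank documents by cosine similarity to the query -- the retrieval step in RAG."""
    q = embed(query)
    ranked = sorted(docs, key=lambda d: float(q @ embed(d)), reverse=True)
    return ranked[:top_k]
```

In a production RAG flow, the top-ranked clauses would then be passed as context to the LLM rather than returned directly.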
AI Platform / Python Developer at Juniper Networks
July 1, 2023 - May 1, 2024
Engineered real-time network analytics platforms that process high-volume telemetry data using Python, NumPy, Kafka, Spark Streaming, and PySpark. Developed anomaly detection models (Isolation Forest, One-Class SVM) and time-series forecasts (ARIMA, LSTM) to identify outages and performance degradation. Built feature pipelines for latency, jitter, throughput, and traffic spikes, and served real-time predictions via FastAPI and Flask. Implemented AI-assisted troubleshooting tools using vector embeddings and semantic search with OpenSearch for faster root-cause analysis. Designed microservices with REST and event-driven patterns, containerized workloads with Docker and Kubernetes, and built CI/CD and MLOps pipelines with Jenkins and MLflow. Ensured observability with Prometheus and Grafana; persisted data in PostgreSQL, MongoDB, and Redis.
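The telemetry anomaly detection described above can be sketched in simplified form. A trailing-window z-score stands in here for the Isolation Forest and One-Class SVM models used in production, and the latency samples are synthetic:

```python
import numpy as np

def detect_latency_anomalies(samples, window: int = 20, z_thresh: float = 3.0):
    """Flag telemetry points whose z-score against a trailing window exceeds z_thresh.
    A simplified stand-in for the unsupervised models (e.g. Isolation Forest)
    a production platform would train on latency/jitter/throughput features."""
    samples = np.asarray(samples, dtype=float)
    flagged = []
    for i in range(window, len(samples)):
        hist = samples[i - window:i]          # trailing baseline window
        mu, sigma = hist.mean(), hist.std()
        if sigma > 0 and abs(samples[i] - mu) / sigma > z_thresh:
            flagged.append(i)                 # index of the anomalous sample
    return flagged
```

In a streaming setting the same logic would run per network element over a Kafka or Spark Streaming window rather than a static array.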
Software Engineer / Associate Software Engineer at Ericsson
January 1, 2020 - June 1, 2022
Developed cloud-based analytics solutions for telecom clients, enabling business teams to process large volumes of network performance data for reporting and decision support. Built backend services in Python and Node.js exposing secure REST APIs for ingestion, transformation, and analytics workflows. Implemented distributed ETL with Apache Spark and Airflow, and developed asynchronous pipelines using AsyncIO and multiprocessing. Created interactive analytics dashboards with Django, DynamoDB, and Chart.js; optimized SQL queries and database schemas (PostgreSQL, DynamoDB). Containerized deployments on AWS; built automated tests with pytest and JUnit and CI/CD pipelines with Jenkins. Collaborated with data engineers, analysts, and product managers to translate requirements into scalable analytics solutions.
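The asynchronous pipeline pattern mentioned above can be sketched as follows. The record schema and transform step are hypothetical; a semaphore bounds concurrency the way a production pipeline would bound in-flight I/O to a database or downstream API:

```python
import asyncio

async def transform(record: dict) -> dict:
    """Hypothetical transform step: normalize one raw KPI record.
    The sleep stands in for real I/O (a DB write or API call)."""
    await asyncio.sleep(0)
    return {"cell": record["cell"].upper(), "kpi": round(record["kpi"], 2)}

async def run_pipeline(records: list[dict], concurrency: int = 4) -> list[dict]:
    """Fan records out to a bounded number of concurrent workers.
    gather() preserves input order in its results."""
    sem = asyncio.Semaphore(concurrency)

    async def guarded(rec: dict) -> dict:
        async with sem:                      # cap in-flight transforms
            return await transform(rec)

    return await asyncio.gather(*(guarded(r) for r in records))
```

For CPU-bound transforms the same fan-out shape would hand work to a process pool (multiprocessing) instead of awaiting coroutines.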
Software Engineer at Cipla
May 1, 2016 - October 1, 2019
Developed enterprise backend applications using Python, Django, and REST APIs to support pharmaceutical workflows. Built database integration layers over SQL Server, Oracle, and MySQL for structured data and production processes. Worked in Agile environments delivering SOAP- and REST-based APIs for enterprise system integration. Optimized database queries and indexing, improved backend performance for high-volume transactions, and developed modular Python components for maintainability and scalability. Collaborated with cross-functional teams to deliver software supporting pharmaceutical business processes.
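As a small illustration of the query and indexing optimization mentioned above (illustrative schema only, not actual production tables), an index on a frequently filtered column lets the database satisfy a lookup without a full table scan. SQLite stands in here for the SQL Server/Oracle/MySQL backends:

```python
import sqlite3

# Hypothetical batch-tracking table with 10,000 rows.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE batches (id INTEGER PRIMARY KEY, lot_no TEXT, qty INTEGER)")
conn.executemany(
    "INSERT INTO batches (lot_no, qty) VALUES (?, ?)",
    [(f"LOT-{i:05d}", i % 500) for i in range(10_000)],
)

# Index on the column used in the frequent WHERE clause.
conn.execute("CREATE INDEX idx_batches_lot_no ON batches (lot_no)")

# EXPLAIN QUERY PLAN shows the lookup uses the index rather than scanning the table.
plan = conn.execute(
    "EXPLAIN QUERY PLAN SELECT qty FROM batches WHERE lot_no = ?", ("LOT-00042",)
).fetchone()
row = conn.execute(
    "SELECT qty FROM batches WHERE lot_no = ?", ("LOT-00042",)
).fetchone()
```

The same principle applies on the larger engines: matching indexes to the predicates of hot queries is usually the first lever for high-volume transaction performance.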

Industry Experience

Telecommunications, Software & Internet, Professional Services