Sharath Kumar Cherukumalla

Hello, I’m Sharath Kumar Cherukumalla, a Generative AI and AI/ML Engineer with 10+ years of experience delivering enterprise-grade AI solutions across finance, healthcare, insurance, and retail. I specialize in Generative AI, LLMs, and agentic AI systems, building production-ready multi-agent workflows using LangChain, LangGraph, AutoGen, and CrewAI. I have hands-on experience with GPT-4/4o, Llama 2/3, and Hugging Face Transformers, and I design robust Retrieval-Augmented Generation (RAG) pipelines and semantic search systems that support data-driven decision-making. I excel in end-to-end NLP pipelines, scalable data architectures, and MLOps practices across AWS, Azure, and GCP. I’m passionate about HIPAA-compliant healthcare NLP, insurance underwriting, claims automation, and operational risk scoring, with a focus on measurable ROI and secure, compliant AI deployments. I enjoy mentoring teams, improving observability, and delivering enterprise-grade AI platforms that reduce manual effort and accelerate critical workflows.


Available to hire


Experience Level

Expert

Language

English
Fluent

Work Experience

Generative AI Engineer at Lending Tree
March 1, 2024 - Present
Led the design and end-to-end architecture of Generative AI and agentic AI solutions for underwriting and risk scoring, aligning technical decisions with business goals, compliance requirements, and scalability standards. Designed and deployed AI-driven underwriting platforms on Microsoft Azure using Python, GPT-4, and Llama, improving document processing throughput and decision accuracy. Integrated key data sources including loan applications, credit reports, bank statements, and downstream fintech systems to support automated underwriting. Implemented Retrieval-Augmented Generation (RAG) pipelines using LangChain, Pinecone, and FAISS to enable semantic search and context-aware analysis over large volumes of unstructured financial documents. Architected vector embedding and storage strategies to support high-volume ingestion, low-latency retrieval, and reliable performance for underwriting and document intelligence use cases. Developed autonomous agent workflows using LangGraph.
Generative AI Engineer/ Data Scientist at McKesson
April 1, 2022 - February 29, 2024
Designed and developed clinical NLP and Generative AI solutions to support healthcare workflows, working closely with clinicians, data engineers, and compliance teams to ensure scalability and HIPAA compliance. Helped set platform standards for clinical NLP, Generative AI pipelines, and MLOps practices for consistent and reliable deployments. Worked with claims, prescriptions, EMR text, and patient history data to support clinical and operational analytics. Built and maintained large ETL and data processing pipelines using Azure Data Factory, Databricks, Airflow, dbt, and Snowflake to clean, transform, and prepare healthcare data for modeling. Created HIPAA-compliant synthetic data to allow safe experimentation and model development without exposing sensitive information. Built RAG and question-answering workflows using embeddings, vector search, and medical knowledge bases to assist clinicians in reviewing and interpreting documents. Developed clinical NLP and Generative AI pipelines.
AI/ML Engineer/ Data Scientist at Nationwide Insurance
December 1, 2019 - March 31, 2022
Designed and developed ML solutions to support fraud detection, claims severity prediction, anomaly detection, and underwriting risk scoring. Worked with large insurance datasets (claims, policy, notes, customer records) to support analytics and modeling. Built end-to-end ML pipelines (data preprocessing, feature engineering, model training, validation, production inference) using Python-based workflows. Developed NLP pipelines (BERT, RoBERTa, early Hugging Face transformers) for text classification, NER, summarization, and document embeddings. Deployed and operationalized ML models with Docker and Kubernetes for scalable inference, versioned deployments, and automated retraining. Integrated model outputs into enterprise systems via batch scoring and data pipelines. Implemented model tracking and monitoring (MLflow, monitoring tools) to detect data and concept drift. Produced dashboards in Power BI and Tableau for fraud indicators, underwriting risk signals, and overall model performance.
Data Scientist at Walmart
June 1, 2017 - November 30, 2019
Designed and developed data science solutions for pricing optimization, demand forecasting, and shrink reduction. Built large-scale ETL pipelines (SQL, PySpark, Hive, Airflow) and leveraged Hadoop/Spark on AWS for analytics at scale. Engineered feature generation for demand forecasting, pricing, supply chain analytics, and customer insights. Developed predictive models using Scikit-learn, XGBoost, and TensorFlow. Built NLP workflows (SpaCy, NLTK, LSTMs) for sentiment and topics from customer reviews and call-center transcripts. Deployed modular inference services with Docker and Kubernetes; integrated streaming data (Kafka, AWS Kinesis) for near real-time monitoring. Implemented data quality checks and anomaly detection to improve pipeline stability. Created dashboards in Tableau and Power BI for business KPIs and model monitoring. Produced SHAP/LIME explanations for model transparency and regulatory audits. Ensured security and compliance in data handling and deployments.
Python Developer at Edvensoft Solutions India Pvt. Ltd
May 1, 2015 - February 28, 2017
Developed backend services and business logic using Python for multi-client applications on GCP Compute Engine. Implemented RESTful APIs with Flask and Django to integrate backend services with web front ends and external systems on GCP. Built automated ETL pipelines and data ingestion workflows using Python and SQL with Cloud Storage as staging. Designed schemas and queries for MySQL/PostgreSQL, ensuring performance and reliability. Implemented Python automation scripts for file processing, data validation, and scheduled jobs. Deployed Python apps on GCP Linux instances, ensuring secure and repeatable rollouts. Integrated third‑party APIs (SMS, payments, email) with secure request handling. Collaborated with front-end teams for API integration, wrote unit/integration tests, and migrated legacy scripts to Python for better scalability.

Education

Bachelor of Engineering (B.E.) – Computer Science & Engineering at Dhanalakshmi Srinivasan Engineering College


Industry Experience

Financial Services, Healthcare, Retail, Software & Internet, Professional Services