Yallaiah Onteru

Available to hire

Hello! I’m Yallaiah Onteru, a Senior AI Agent Developer with over a decade of experience designing scalable multi‑agent systems for insurance, healthcare, banking, and consulting. I specialize in LangGraph orchestration, Model Context Protocol implementations, and Agent‑to‑Agent communication patterns that drive production‑grade AI workflows with a focus on reliability, explainability, and collaboration with cross‑functional teams.

I have deployed production agents on AWS SageMaker and AWS Bedrock, built hybrid RAG architectures with Neo4j knowledge graphs, and implemented observability and CI/CD pipelines to support real‑time decision‑making and regulatory compliance in time‑sensitive environments. I enjoy refining system prompts, improving latency, and aligning agent behavior with domain standards across healthcare and financial services through continuous feedback and robust testing.

Experience Level

Expert

Language

English
Fluent

Work Experience

Senior AI Lead Developer at State Farm
January 1, 2025 - Present
Design and implement multi‑agent systems using LangGraph for insurance claim processing, coordinating document analysis agents with validation agents via Agent‑to‑Agent protocols. Implement Model Context Protocols to maintain conversation state across agent interactions, ensuring consistent information flow for claims adjusters between underwriting, fraud detection, and subrogation tasks. Build hybrid RAG pipelines with Neo4j knowledge graphs and vector embeddings to retrieve regulations, policy documents, and historical claim data for risk assessment and pricing recommendations. Deploy containerized agents to Kubernetes with horizontal pod autoscaling, and establish CI/CD, logging, monitoring, and automated feedback loops to maintain high availability and low latency in peak claim periods. Integrate AWS SageMaker endpoints for hosting fine‑tuned models and AWS Bedrock foundation models for comparative performance studies, with attention to regulatory compliance.
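The document‑analysis/validation handoff described above can be sketched in plain Python. This is a minimal illustration of the coordination pattern, not the actual LangGraph or Agent‑to‑Agent API; the agent functions, field names, and validation rules are hypothetical.

```python
from dataclasses import dataclass, field

@dataclass
class ClaimContext:
    """Shared state carried between agents, analogous to a Model Context Protocol payload."""
    claim_id: str
    extracted_fields: dict = field(default_factory=dict)
    validation_errors: list = field(default_factory=list)

def document_analysis_agent(ctx: ClaimContext, raw_text: str) -> ClaimContext:
    # Hypothetical extraction step: pull "key: value" pairs from the document text.
    for line in raw_text.splitlines():
        if ":" in line:
            key, value = line.split(":", 1)
            ctx.extracted_fields[key.strip()] = value.strip()
    return ctx

def validation_agent(ctx: ClaimContext) -> ClaimContext:
    # Hypothetical rule: every claim needs a policy number and a loss date.
    for required in ("policy_number", "loss_date"):
        if required not in ctx.extracted_fields:
            ctx.validation_errors.append(f"missing {required}")
    return ctx

ctx = ClaimContext(claim_id="CLM-001")
ctx = document_analysis_agent(ctx, "policy_number: PN-42\nloss_date: 2025-01-15")
ctx = validation_agent(ctx)
print(ctx.validation_errors)  # → []
```

The key design point is that both agents read and write one shared context object rather than exchanging free‑form messages, which keeps information flow between tasks consistent and auditable.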
Senior AI Developer at Johnson & Johnson
August 1, 2021 - December 1, 2024
Construct multi‑agent workflows using LangGraph for clinical data analysis, processing patient records, medical imaging reports, and lab results while maintaining HIPAA compliance. Apply Model Context Protocols to enable seamless handoffs between medical data extraction agents and clinical decision support agents, preserving patient context across interaction turns. Assemble hybrid RAG architectures linking medical literature databases with electronic health records to reference current guidelines while analyzing patient histories for personalized care recommendations. Provision and manage AWS SageMaker training jobs for fine‑tuning biomedical language models, and orchestrate AWS Bedrock API calls with rate limiting and token tracking. Analyze agent performance logs with Databricks and PySpark to identify improvement opportunities, and explore CrewAI and AutoGen for coordinating radiology interpretation, pathology analysis, and patient history summarization. Collaborate with healthcare teams.
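The rate limiting and token tracking mentioned for Bedrock API calls can be sketched as a sliding‑window token budget. This is an illustrative pattern in plain Python, not an AWS SDK feature; the class name and limits are hypothetical.

```python
import time
from collections import deque

class TokenRateLimiter:
    """Sliding-window limiter: cap tokens consumed per rolling interval."""
    def __init__(self, max_tokens: int, window_seconds: float):
        self.max_tokens = max_tokens
        self.window = window_seconds
        self.events = deque()  # (timestamp, tokens) pairs, oldest first

    def _used(self, now: float) -> int:
        # Drop events that have aged out of the window, then sum the rest.
        while self.events and now - self.events[0][0] > self.window:
            self.events.popleft()
        return sum(tokens for _, tokens in self.events)

    def try_acquire(self, tokens: int, now: float = None) -> bool:
        now = time.monotonic() if now is None else now
        if self._used(now) + tokens > self.max_tokens:
            return False  # caller should back off and retry
        self.events.append((now, tokens))
        return True

limiter = TokenRateLimiter(max_tokens=1000, window_seconds=60.0)
print(limiter.try_acquire(800, now=0.0))   # → True
print(limiter.try_acquire(300, now=1.0))   # → False: would exceed 1000 in window
print(limiter.try_acquire(300, now=61.0))  # → True: the first call has aged out
```

Tracking consumed tokens rather than request counts matters for LLM APIs, where per‑request cost varies widely with prompt and completion length.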
Senior ML Engineer at State of Maine
April 1, 2020 - July 1, 2021
Delivered Medicaid eligibility models and migrated data pipelines to Azure ML Studio and Azure Data Factory. Built ETL processes with PySpark on Azure Databricks to transform claims data, engineered features for risk scoring, and validated model performance with hold‑out testing and cross‑validation. Standardized deployment templates in Azure ML for reusable scoring endpoints and monitored data drift to trigger retraining. Communicated model insights to stakeholders and ensured HIPAA‑compliant data handling and auditability throughout the pipeline.
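The hold‑out validation step above can be sketched in plain Python, with no Azure dependencies; the threshold "model" and the income field are hypothetical stand‑ins for the real risk‑scoring features.

```python
import random

def holdout_split(rows, test_fraction=0.2, seed=42):
    """Shuffle and split labeled rows into train and hold-out sets."""
    rng = random.Random(seed)
    shuffled = rows[:]
    rng.shuffle(shuffled)
    cut = int(len(shuffled) * (1 - test_fraction))
    return shuffled[:cut], shuffled[cut:]

def accuracy(model, rows):
    """Fraction of rows whose label the model predicts correctly."""
    correct = sum(1 for features, label in rows if model(features) == label)
    return correct / len(rows)

# Hypothetical eligibility rule: flag when income is below 20000.
model = lambda features: features["income"] < 20000

rows = [({"income": i}, i < 20000) for i in range(0, 40000, 1000)]
train, test = holdout_split(rows)
print(f"hold-out accuracy: {accuracy(model, test):.2f}")
```

Evaluating only on rows the model never saw during fitting is what makes hold‑out testing a meaningful check before deploying a scoring endpoint.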
Data Scientist at Bank of America
January 1, 2018 - March 1, 2020
Developed fraud detection models using XGBoost, extracting features from transactional data to identify suspicious patterns while minimizing false positives. Explored customer segmentation with K‑Means for targeted campaigns, evaluated model fairness across demographics, and built historical features using SQL windows and joins. Automated retraining schedules and maintained PCI‑DSS compliance documentation for data access and security controls.
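The historical features built with SQL window functions can be illustrated with the standard‑library sqlite3 module (SQLite supports window functions from version 3.25). The table and column names here are hypothetical, not the bank's actual schema.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE txns (account_id TEXT, txn_date TEXT, amount REAL);
INSERT INTO txns VALUES
  ('A', '2019-01-01', 50.0),
  ('A', '2019-01-02', 75.0),
  ('A', '2019-01-03', 900.0),
  ('B', '2019-01-01', 20.0);
""")

# Rolling average of each account's prior transactions: a typical
# historical feature for fraud models, built with a window function.
rows = conn.execute("""
SELECT account_id, txn_date, amount,
       AVG(amount) OVER (
         PARTITION BY account_id ORDER BY txn_date
         ROWS BETWEEN UNBOUNDED PRECEDING AND 1 PRECEDING
       ) AS prior_avg
FROM txns
ORDER BY account_id, txn_date
""").fetchall()

for row in rows:
    print(row)
# The third 'A' row shows amount 900.0 against a prior average of 62.5,
# exactly the kind of deviation a fraud model keys on.
```

The `1 PRECEDING` frame bound excludes the current transaction from its own feature, which avoids target leakage when the feature feeds a model scoring that transaction.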
Data Engineer at Hexaware
October 1, 2015 - December 1, 2017
Loaded data from multiple sources into Hadoop clusters via Sqoop, processed data with Hive, and mapped transformations with Informatica PowerCenter. Implemented validation and data quality checks, documented ETL dependencies and data lineage, and supported batch processing with Unix shell scripts and cron. Gained exposure to MapReduce, HDFS, and distributed computing concepts while assisting senior engineers to optimize performance.
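The validation and data quality checks mentioned above follow a simple pattern: named predicate rules applied to each record, with failures collected for reporting. This is a generic sketch; the rule names and fields are illustrative, not Informatica or Hive specifics.

```python
def run_quality_checks(records, rules):
    """Apply named predicate rules to each record; return (index, rule) failures."""
    failures = []
    for i, record in enumerate(records):
        for name, predicate in rules.items():
            if not predicate(record):
                failures.append((i, name))
    return failures

rules = {
    "non_null_id": lambda r: r.get("id") is not None,
    "positive_amount": lambda r: r.get("amount", 0) > 0,
}

records = [
    {"id": 1, "amount": 10.0},
    {"id": None, "amount": 5.0},
    {"id": 3, "amount": -2.0},
]
print(run_quality_checks(records, rules))  # → [(1, 'non_null_id'), (2, 'positive_amount')]
```

Recording which rule failed on which record, rather than just rejecting a batch, is what makes data lineage and ETL dependency documentation tractable downstream.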

Education

B.Tech in Computer Science at KITS
Graduated January 2015

Qualifications

HIPAA Compliance
Valid through January 5, 2026
PCI‑DSS Compliance
Valid through January 5, 2026

Industry Experience

Healthcare, Financial Services, Professional Services, Media & Entertainment