Swetha Sakhamuri

Available to hire

I am a results-driven Senior AI/ML Engineer with 10+ years of experience spanning machine learning, Big Data, Cloud Data Engineering, and end-to-end ML pipeline development. I design, build, and deploy ML/DL models for credit risk, fraud detection, customer segmentation, and healthcare analytics, with a strong focus on scalable MLOps and production-grade delivery.

I have hands-on experience fine-tuning LLMs (GPT, LLaMA, Falcon) for document summarization and other NLP tasks, building real-time streaming analytics, and implementing compliant data governance. I enjoy collaborating with cross-functional teams in Agile environments to translate complex business problems into measurable AI solutions and practical product outcomes.

Language

English
Fluent

Work Experience

Senior AI/ML Engineer at Bank of America
April 1, 2024 - Present
Designed and deployed end-to-end ML pipelines for credit risk modeling, fraud detection, and customer segmentation using TensorFlow, PyTorch, and Scikit-learn. Fine-tuned and deployed large language models (GPT, LLaMA, Falcon) for document summarization, sentiment analysis, and call center automation. Built MLOps workflows (MLflow, Kubeflow, SageMaker) covering model versioning, training, deployment, monitoring, and drift detection in production. Implemented real-time anomaly detection via streaming data pipelines with Apache Kafka and Spark Structured Streaming; applied NLP techniques (NER, topic modeling, embeddings) to extract insights from financial documents and regulatory transcripts. Deployed models as scalable RESTful microservices using Flask/FastAPI, containerized them with Docker/Kubernetes, and built model monitoring dashboards with Prometheus/Grafana. Led A/B testing, validated model performance, and integrated SHAP/LIME explainability. Collaborated with Data Governance to ensure compliance with GDPR, SOX, CCAR, and SR 11-7 model risk guidelines; mentored junior staff and contributed to Generative AI initiatives using LangChain and OpenAI.
AI/ML Engineer at Blue Cross Blue Shield
March 1, 2024 - October 9, 2025
Designed and implemented ML models for patient risk prediction, claims fraud detection, and care-gap identification. Built end-to-end ML pipelines using Python, TensorFlow, and Scikit-learn for data preprocessing, model training, evaluation, and deployment. Developed NLP models (BERT, BioBERT, GPT-based) for clinical text classification, ICD code prediction, and claims data extraction. Processed PHI and HIPAA-compliant datasets with anonymization and encryption to meet regulatory standards; leveraged Spark/PySpark for distributed processing of large-scale claims and provider network datasets. Implemented model explainability (SHAP, LIME) for clinicians and compliance teams. Deployed REST APIs via Flask/FastAPI, containerized with Docker, and orchestrated with Kubernetes. Adopted MLOps practices (MLflow, Kubeflow) for versioning and monitoring; integrated Kafka for near real-time alerts. Conducted A/B tests and validation to ensure accuracy, fairness, and regulatory compliance. Collaborated with clinicians and business stakeholders; designed feature engineering workflows and feature stores; monitored data drift and compliance risk. Integrated Generative AI pipelines (LangChain, LlamaIndex) for automated clinical
Data Engineer at United Airlines
October 1, 2021 - October 9, 2025
Utilized AWS services (Redshift, S3, EMR) to streamline big data storage and processing. Migrated datasets with Sqoop into HDFS, Hive, and HBase; configured and maintained Hadoop MapReduce and HDFS. Built a centralized data lake on AWS using S3, Glue, Lambda, DynamoDB, Elasticsearch, CloudWatch, and Athena. Engineered MapReduce workflows, orchestrated real-time ingestion with Apache Flume and Kafka streams, and used Spark SQL to curate datasets and integrate Hive sources into S3. Automated data pipelines with NiFi, built Spark/Scala transformations, and implemented Oozie workflows for batch scheduling. Designed Hive tables and Snowflake integrations for analytics; built streaming pipelines with Spark Streaming and HBase. Collaborated with cross-functional teams using Python, Pandas, NumPy, Django, R, MySQL, and MongoDB to deliver end-to-end data solutions.
Data Engineer at IBM
December 1, 2019 - October 9, 2025
Designed dimensional data models and built ETL pipelines for enterprise data warehouses. Implemented data ingestion with Talend Open Studio, DataStage, and Sqoop; created Hive external tables and custom SerDes; built real-time pipelines with Kafka and Storm. Automated workflows with Airflow; processed data via Spark/PySpark; used Spark SQL and Hive for analytics; built MapReduce and Pig scripts for data cleansing; migrated queries to Spark for performance; integrated with Snowflake; built BI dashboards in Tableau. Implemented Oozie for scheduling; collaborated with multi-disciplinary teams and ensured data governance.
Big Data Engineer at Fractal Analytics
April 1, 2016 - October 9, 2025
Designed and implemented big data analytics pipelines; built Hadoop clusters and data ingestion with Flume and Sqoop; wrote Hive queries and MapReduce/Pig scripts for data cleansing and analytics. Built Spark jobs in Scala/PySpark; automated workflows with Apache Airflow/Oozie; orchestrated data flows with NiFi; processed data in HDFS with Hadoop ecosystem; integrated Snowflake and BI tools (Tableau). Enabled real-time data ingestion with Kafka and Spark Streaming; migrated legacy Hive queries to Spark; worked with multiple RDBMSs and cloud-native analytics.

Industry Experience

Financial Services, Healthcare, Travel & Hospitality, Software & Internet, Professional Services