Siri Reddy Anaka

I am a data science professional with over 9 years of experience designing, building, and deploying advanced ML/AI solutions across healthcare, financial services, and investment domains. I thrive on turning complex data problems into production-ready pipelines using Python, PyTorch, TensorFlow, and a suite of modern ML tools, while collaborating with cross-functional teams to deliver measurable business impact.

My strengths include end-to-end ML lifecycle management, scalable cloud deployments, model governance, explainability, and rapid experimentation. I’ve delivered multi-cloud, production-grade platforms with robust monitoring and retraining capabilities, built retrieval-augmented generation NLP systems, and created dashboards that drive decision making and operational efficiency.


Work Experience

Senior Data Scientist / ML Engineer at Duke Health
April 1, 2024 - November 27, 2025
Developed AI-driven healthcare intelligence models using Python, GPT models, and Azure OpenAI Service for clinical decision automation. Built predictive patient outcome systems leveraging Azure ML, Databricks, and Synapse Analytics to support early disease risk identification. Implemented retrieval-augmented generation pipelines with Azure Cognitive Search and OpenAI embeddings and designed NLP workflows for entity extraction from medical transcripts. Deployed scalable ML pipelines with Docker, Terraform, and Azure Kubernetes Service, and built healthcare data pipelines with Azure Data Factory, Blob Storage, and Event Hubs. Enhanced model explainability with SHAP and LIME visualizations in Power BI dashboards. Led automated retraining with MLflow and Azure DevOps; engineered patient similarity search with FAISS; implemented federated learning workflows; secured endpoints via Azure AD and Purview.
Data Scientist at JPMorgan Chase (JPMC)
November 1, 2021 - April 1, 2024
Developed credit risk models using Python, PyTorch, and Amazon SageMaker to predict PD, LGD, and EAD. Built feature pipelines with AWS Glue, S3, and EMR; implemented near-real-time fraud detection and AML case prioritization with PyTorch, SageMaker, and DynamoDB streams. Automated CECL loss estimation workflows and training pipelines with Step Functions, Lambda, and SageMaker Pipelines; deployed containerized services with Docker and ECS, and provisioned regulated ML environments with Terraform. Created model monitoring dashboards (CloudWatch, SageMaker Model Monitor, SageMaker Clarify, QuickSight), designed streaming ingestion with Kinesis, Lambda, and DynamoDB, and conducted hyperparameter tuning via SageMaker Experiments. Facilitated model governance with SHAP and LIME, enabled API deployment with API Gateway and Lambda authorizers, established CI/CD via GitLab CI, and delivered backtest-enabled risk dashboards.
ML Engineer at Wellington Management
January 1, 2020 - November 1, 2021
Developed portfolio optimization models in Python (Scikit-learn, Prophet, XGBoost) for alpha generation. Engineered factor risk models with BigQuery, Apache Beam, and Dataflow to compute daily exposures. Built real-time market sentiment pipelines with TensorFlow and NLP tooling, and implemented event-driven trade surveillance on GKE with Pub/Sub. Orchestrated ETL workflows with Cloud Composer, BigQuery, and Airflow; built backtesting frameworks with pandas and Dask; deployed microservices on GKE with Docker and Terraform. Implemented feature stores in BigQuery, developed dashboards in Data Studio and Tableau, explored Bayesian optimization and MLflow-based governance, and collaborated with quants to move research notebooks into production.
ML Analyst at Carmatec
August 1, 2017 - September 1, 2019
Developed sentiment classifiers, built ETL pipelines with Airflow and MySQL, and performed topic modeling with Gensim and NLTK to extract product feedback themes. Created Tableau dashboards for release-quality analytics; containerized NLP microservices with Docker and Flask; trained models with Scikit-learn; established version control with Git and GitLab. Optimized SQL queries, built REST endpoints, automated A/B testing, and implemented data quality checks with Great Expectations. Designed feature stores and anomaly detection utilities, configured logging, monitoring, and hyperparameter tuning, and packaged reusable libraries.
NLP Engineer at TatvaSoft
December 1, 2015 - July 1, 2017
Developed KPI dashboards using Power BI, SQL Server, and Python; built reproducible notebooks and optimized stored procedures. Engineered data extraction workflows with Python and pyodbc, automated daily reporting, and created statistical models to forecast ticket inflow. Visualized operational metrics, built cohort analyses, and implemented data validation scripts and Git-based version control. Deployed REST prediction endpoints, automated A/B test analyses, built document clustering pipelines and data quality checks, created reusable utilities, and delivered executive summaries to stakeholders.

Education

Bachelor of Technology (B.Tech) in Information Technology at JNTUH, Hyderabad, Telangana, India
Graduated January 1, 2016

Industry Experience

Healthcare, Financial Services, Education, Professional Services, Software & Internet