Gnaneshwar Javvaji

Available to hire

I’m a Senior AI Solution Architect and Platform Engineer with 10+ years of experience helping large organizations adopt enterprise AI. I design agentic workflows and AI platforms using Python, Snowflake, and AWS to increase developer productivity and automate operations.

I’ve led cross-functional squads, defined North Star AI guidelines, and delivered scalable GenAI capabilities across banking, healthcare, and retail. I’m passionate about governance, reproducibility, and building practical AI that adds measurable business value.

Experience Level

Expert

Language

English
Fluent

Work Experience

GenAI Engineer at Bank of America
January 1, 2024 - Present
Developed agentic AI workflows with LangGraph and CrewAI for multi-step task decomposition and autonomous delegation across specialized LLM agents in customer interaction pipelines. Fine-tuned a small BERT model in TensorFlow for transaction categorization, and implemented model monitoring with CloudWatch and SageMaker Model Monitor. Built MLOps pipelines on AWS using SageMaker Pipelines, Lambda, and Step Functions. Architected transformer-based GenAI solutions for financial categorization, led North Star governance for AI adoption, and migrated forecasting workloads from Hadoop/Hive to Databricks. Re-implemented legacy econometric formulas in PyTorch to ensure numerical parity. Deployed an enterprise-grade AI platform and AI agents that automate internal workflows, increasing developer productivity by 40%. Built RAG-enabled retrieval pipelines and secure REST APIs for enterprise integration.
Senior AI/ML Engineer at Johnson & Johnson Innovative Medicine
July 1, 2022 - December 1, 2023
Designed and deployed ML models for disease detection and treatment monitoring using MRI/CT imaging, clinical, and genomic data. Implemented production-grade RAG systems with vector databases (Pinecone/FAISS) and word embeddings for real-time document retrieval. Fine-tuned open-source LLMs (Llama 2) for medical terminology and built multimodal AI solutions integrating imaging and clinical text. Developed modular, production-grade RAG pipelines with LangChain on GPU clusters; deployed HIPAA-compliant services on AWS; translated statistical models from R into modular PyTorch architectures for predictive analytics.
ML Platform Engineer at Best Buy
May 1, 2020 - July 1, 2022
Architected and productionized end-to-end NLP pipelines and conversational AI prototypes. Led transition from experimental to production ML using AKS for containerized deployments. Built explainable models (GAMs) and established a robust MLOps foundation with Artifactory for versioning. Migrated critical predictive workloads from Hadoop/Hive to AWS/Databricks and implemented modular ML pipelines for enterprise-scale AI features.
Data Engineer at AT&T
August 1, 2018 - April 1, 2020
Developed automated ETL pipelines with Alteryx and Snowflake; created interactive dashboards in Tableau to communicate KPIs. Built NLP pipelines (spaCy, Hugging Face) and established data warehousing practices. Enabled scalable data processing, versioned code, and governance for AI-enabled analytics.
Python Developer at Netflix
February 1, 2014 - September 1, 2016
Contributed to RESTful APIs and backend components with Django/Flask; containerized Python services with Docker/Kubernetes; designed scalable databases (PostgreSQL/MySQL). Implemented CI/CD via Jenkins; built speech-to-text and search-ready data pipelines; led data extraction and transformation tasks for streaming analytics.

Industry Experience

Software & Internet, Healthcare, Financial Services, Retail, Professional Services