Available to hire

Hi, I’m Ivan Caban, a Senior Software Engineer with 15+ years of experience delivering AI and machine-learning solutions across healthcare and enterprise domains. I specialize in transformer-based NLP, building robust backend microservices, and crafting scalable data pipelines. I enjoy turning complex clinical data problems into reliable systems that clinicians can trust.

I thrive in cross-functional teams and prioritize collaboration, security, and compliance. My focus areas include MLOps, Kubernetes, Docker, CI/CD automation, and HIPAA-compliant deployments, with an emphasis on human-in-the-loop workflows to continuously improve model accuracy and dataset quality.



Language

English
Fluent

Work Experience

Software Engineer at CorroHealth
November 1, 2020 - Present
Developed and maintained scalable AI-powered clinical data extraction pipelines using transformer models (BERT, GPT variants) for automated medical record summarization and information retrieval. Designed and deployed backend microservices with FastAPI and Flask for real-time NLP inference, enabling interoperability with EHR systems via HL7/FHIR. Implemented retrieval-augmented generation with vector search (FAISS, Elasticsearch) and fine-tuned LLMs to enhance clinical Q&A and decision support. Built data ingestion/streaming workflows with Apache Kafka and Spark for HIPAA-compliant processing. Automated MLOps with Kubernetes, Docker, and GitLab CI/CD, including data drift detection and scheduled retraining with Airflow. Collaborated with data scientists and product teams to optimize fine-tuning strategies, incorporating RLHF. Developed React/TypeScript annotation tools for human-in-the-loop workflows. Established monitoring dashboards with Prometheus/Grafana for AI system health and latency.
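To illustrate the retrieval step behind the RAG work described above, here is a minimal NumPy sketch of top-k cosine-similarity lookup; the corpus, embeddings, and function names are hypothetical stand-ins for what FAISS or Elasticsearch would do at scale:

```python
import numpy as np

def top_k_similar(query_vec, doc_matrix, k=3):
    """Return indices of the k most cosine-similar rows in doc_matrix."""
    # Normalize both sides so the dot product equals cosine similarity.
    q = query_vec / np.linalg.norm(query_vec)
    docs = doc_matrix / np.linalg.norm(doc_matrix, axis=1, keepdims=True)
    scores = docs @ q
    # argsort is ascending; take the last k and reverse for best-first order.
    return np.argsort(scores)[-k:][::-1]

# Toy 4-document "index" with 3-dimensional embeddings.
index = np.array([
    [1.0, 0.0, 0.0],
    [0.9, 0.1, 0.0],
    [0.0, 1.0, 0.0],
    [0.0, 0.0, 1.0],
])
hits = top_k_similar(np.array([1.0, 0.05, 0.0]), index, k=2)
print(hits.tolist())  # prints [0, 1]: the two vectors closest to the query
```

In a production RAG system the retrieved document texts would then be injected into the LLM prompt; the index would also be an approximate-nearest-neighbor structure rather than a brute-force matrix product.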
Software Engineer at Biofourmis
June 1, 2016 - November 1, 2020
Developed distributed model serving pipelines using TensorFlow Serving, gRPC, and Envoy across multi-region GCP clusters with autoscaling and request tracing. Built JAX/Flax-based training workflows for ad ranking models, optimized for TPU v4 pods, with full integration into internal orchestration systems (Borg + Blaze). Refactored data preprocessing with Apache Beam, reducing pipeline latency by 45% and enabling real-time feature generation from streaming logs. Implemented feature caching layers with Redis and Bigtable, coordinated via Kafka message routing and structured Pub/Sub topics. Created Go-based CLI tools for launching model experiments, tracking rollout metrics, and managing evaluation datasets across teams. Built real-time model monitoring dashboards with React + TypeScript, visualizing output drift, latency trends, and top-K error distribution.
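The feature-caching idea above can be sketched with an in-process stand-in for a Redis-style TTL cache; the class and key names are illustrative only, not the production design:

```python
import time

class TTLFeatureCache:
    """Minimal in-memory stand-in for a Redis-style feature cache with TTL."""

    def __init__(self, ttl_seconds=60.0):
        self.ttl = ttl_seconds
        self._store = {}  # key -> (expiry_timestamp, value)

    def put(self, key, features):
        self._store[key] = (time.monotonic() + self.ttl, features)

    def get(self, key):
        entry = self._store.get(key)
        if entry is None:
            return None
        expiry, value = entry
        if time.monotonic() >= expiry:
            del self._store[key]  # lazy eviction on read, like Redis key expiry
            return None
        return value

cache = TTLFeatureCache(ttl_seconds=0.05)
cache.put("user:42", [0.1, 0.7])
print(cache.get("user:42"))  # fresh entry -> [0.1, 0.7]
time.sleep(0.06)
print(cache.get("user:42"))  # expired entry -> None
```

A real deployment would add shared access across serving replicas and cache-warming from the batch feature store, which is what Redis plus Bigtable provides.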
Software Developer at EliseAI
October 1, 2014 - May 1, 2016
Developed scalable backend microservices using Python (FastAPI, Flask) and Node.js (Express) deployed on Kubernetes clusters, enabling real-time inference for symptom classification and early warning AI models. Built clinician-facing web applications and dashboards with React, TypeScript, and D3.js to visualize patient health data and predictive alerts. Engineered streaming data pipelines with Apache Kafka and Spark Streaming to process sensor, EHR, and patient-generated health data for downstream ML workflows. Designed and fine-tuned transformer-based NLP and time-series models (BERT, custom LSTMs) for personalized digital therapeutics and risk scoring. Automated ML pipelines with Apache Airflow, handling data validation, model retraining, deployment, and continuous monitoring via Prometheus and Grafana. Established robust CI/CD workflows with Jenkins, Docker, Helm, and GitLab CI/CD to ensure HIPAA-compliant, secure, and reliable AI service deployments.
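The streaming aggregation pattern used in pipelines like the one above can be shown in miniature with a tumbling-window event counter; field names (`patient_id`, timestamps) are hypothetical, and Spark Streaming performs the equivalent grouping over a Kafka topic rather than an in-memory list:

```python
from collections import defaultdict

def tumbling_window_counts(events, window_seconds=60):
    """Count events per (window, patient) over fixed, non-overlapping windows.

    Each event is a (timestamp_seconds, patient_id) pair; the window start is
    the timestamp rounded down to the nearest window boundary.
    """
    counts = defaultdict(int)
    for ts, patient_id in events:
        window_start = (ts // window_seconds) * window_seconds
        counts[(window_start, patient_id)] += 1
    return dict(counts)

events = [(5, "p1"), (30, "p1"), (61, "p1"), (62, "p2")]
print(tumbling_window_counts(events, window_seconds=60))
# {(0, 'p1'): 2, (60, 'p1'): 1, (60, 'p2'): 1}
```

Downstream ML features (e.g. alert rates per patient per minute) are typically derived from exactly this kind of windowed count.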
Software Developer at Clarius Mobile Health
February 1, 2011 - September 1, 2014
Developed backend services and APIs using Python (Flask) and Node.js (Express) to support mobile ultrasound imaging applications and cloud-based data synchronization. Designed and implemented real-time image processing pipelines for mobile ultrasound devices, leveraging OpenCV and early computer vision techniques to enhance image clarity and diagnostic quality. Created scalable microservices deployed on AWS and on-premise servers, focusing on performance, security, and HIPAA-compliant handling of medical imaging data. Participated in cross-functional teams alongside radiologists and product managers to gather clinical requirements and translate them into technical specifications and AI model enhancements.
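One of the simplest enhancement steps an imaging pipeline like the one above might apply is a percentile-based contrast stretch; this NumPy sketch is illustrative only and not the production OpenCV algorithm:

```python
import numpy as np

def contrast_stretch(frame, low_pct=2, high_pct=98):
    """Rescale pixel intensities so the low/high percentiles map to 0/255.

    Clipping at the percentiles (rather than the min/max) makes the stretch
    robust to a few outlier pixels.
    """
    lo, hi = np.percentile(frame, [low_pct, high_pct])
    if hi <= lo:
        return frame.copy()  # flat image: nothing to stretch
    stretched = (frame.astype(np.float64) - lo) / (hi - lo)
    return (np.clip(stretched, 0.0, 1.0) * 255).astype(np.uint8)

# A dim 4x4 "frame" with intensities clustered in a narrow band.
frame = np.array([[100, 110, 105, 108],
                  [102, 112, 101, 109],
                  [104, 111, 103, 107],
                  [106, 113, 100, 110]], dtype=np.uint8)
out = contrast_stretch(frame)
print(out.min(), out.max())  # prints 0 255: full dynamic range recovered
```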
Junior Software Developer at Enlitic
May 1, 2010 - February 1, 2011
Assisted in developing machine learning pipelines for medical image analysis, focusing on preprocessing DICOM datasets and annotating radiology images for training CNNs. Supported backend API development using Python and Flask to enable integration between AI models and clinical systems. Contributed to data cleaning and augmentation workflows, improving dataset quality and model training efficacy. Collaborated with senior engineers and data scientists to prototype early deep learning models for abnormality detection in chest X-rays. Participated in testing and debugging AI service components, ensuring reliable deployment in hospital environments. Documented codebases and contributed to onboarding efforts.
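Two of the basic augmentation/normalization steps from workflows like the one above can be sketched as follows; this is a simplified stand-in (real DICOM preprocessing adds intensity windowing, resampling, and more):

```python
import numpy as np

def augment(image, rng):
    """Random horizontal flip plus zero-mean/unit-variance normalization."""
    out = image.astype(np.float64)
    if rng.random() < 0.5:
        out = out[:, ::-1]  # horizontal flip (anatomy permitting)
    std = out.std()
    # Normalize so the network sees inputs on a consistent scale.
    return (out - out.mean()) / std if std > 0 else out - out.mean()

rng = np.random.default_rng(0)
img = np.arange(16, dtype=np.float64).reshape(4, 4)
aug = augment(img, rng)
print(round(aug.mean(), 6), round(aug.std(), 6))  # ~0.0 and ~1.0 after normalization
```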

Education

B.S. in Computer Science at The University of Melbourne
August 1, 2006 - April 1, 2010


Industry Experience

Healthcare, Software & Internet, Professional Services, Other, Life Sciences