Robert Williams

Available to hire

I am Robert Williams, an AI Staff Engineer with 13+ years of experience building distributed, fault-tolerant infrastructure and high-performance APIs using Python, Java, and cloud-native technologies. I've led cross-functional teams, prioritizing reliability, security, and performance, and delivered scalable ML/DL solutions in e-commerce and enterprise contexts. I enjoy turning data into actionable insights and guiding teams through complex technical initiatives.

Currently at Walmart, I architect scalable AI systems and data pipelines, mentor engineers, and drive cross-team initiatives that empower product and business outcomes. I thrive on simplifying hard problems, communicating clearly with stakeholders, and delivering robust systems that scale with business needs.


Language

Javanese: Advanced
English: Fluent

Work Experience

AI Staff Engineer at Walmart
May 1, 2019 - Present
- Architected and deployed scalable serverless APIs using AWS Lambda, API Gateway, and Python, improving backend throughput by 40% while supporting multi-tenant SaaS integrations.
- Led development and deployment of AI systems using Python, TensorFlow, and PyTorch for scalable ML/DL solutions.
- Designed and optimized ETL pipelines leveraging SQL and PySpark to process terabytes of streaming data daily into AWS S3 and Redshift, achieving a 50% increase in data ingestion efficiency.
- Migrated monolithic Django APIs to a microservices architecture containerized with Docker and managed via Kubernetes, reducing deployment time by 30% and increasing service reliability.
- Integrated AI-powered tools such as GitHub Copilot, Zencoder, and Cursor into the development workflow.
Software Engineer (E6) at Facebook
June 1, 2013 - April 30, 2019
- Developed backend services using Django ORM and Python, processing over 500k records daily on MySQL with a 25% performance gain.
- Built scalable backend APIs in Python (Django, Flask) supporting financial and healthcare applications, improving API response times by 30% via query optimizations and caching.
- Implemented ETL pipelines ingesting and processing 500+ GB of data weekly using Apache Kafka, Hive, and Python, increasing throughput by 20%.
- Integrated C++ modules to accelerate data processing, achieving a 20% reduction in batch processing time.
- Built real-time dashboards with React and D3.js, enabling faster data-driven decisions.
- Re-architected legacy services into serverless RESTful APIs with API Gateway and Lambda, reducing infrastructure costs by 35% and improving latency by 25%.
- Designed and maintained SQL queries across Snowflake-like warehouses and PostgreSQL for fast, reliable ETL workflows.
- Built data pipelines processing over 10TB of real-time event data using PySpark and Hive.
Software Developer at Wish
April 1, 2013 - June 1, 2013
- Created a robust web scraper using Beautiful Soup, Scrapy, and Selenium to gather business data from BizBuySell.com based on client criteria.
- Engineered a system to store scraped data in PostgreSQL on Google Cloud Platform, ensuring data integrity and easy access.
- Developed functionality to regularly update a Google Sheets document with new data, handling duplicates and removed listings.
- Implemented data export to CSV and Excel using pandas.
- Designed and implemented an AI-based scraper to enhance data extraction accuracy and efficiency.
- Built a platform to automate and structure sales data by combining AI, a robust backend, and cloud deployment.

Education

Bachelor's Degree in Computer Science at Texas Christian University (TCU)
January 1, 2009 - January 1, 2013

Qualifications

Certified AI Developer
January 1, 2020 - December 24, 2025
Certified Python Developer
January 1, 2018 - December 24, 2025

Industry Experience

Software & Internet, Retail, Media & Entertainment, Education, Professional Services, Computers & Electronics