I am an AI Systems Engineer specializing in Production Inference and LLM Architecture. I don't just train models; I deploy them. My core expertise includes building custom Transformers from scratch (PyTorch), architecting high-frequency distributed data pipelines (FastAPI/SQLAlchemy), and ensuring technical accuracy for SOTA LLMs via RLHF. I bridge the gap between Data Science and Backend Engineering, turning research code into scalable, low-latency production systems.

Evelyn Azalea da Silva Olimpio

Available to hire


Experience Level

Expert

Language

English
Fluent
Portuguese
Fluent

Work Experience

RLHF Specialist at Outlier.ai (Scale AI)
January 1, 2026 - Present
Selected as a Domain Expert to evaluate Python code trajectories for SOTA LLM training. I specialize in Chain-of-Thought reasoning, debugging logic, and rewriting complex algorithms to ensure technical accuracy and safety.

Education

Bachelor of Science and Technology (Focus in Computer Science) at Universidade Federal de São Paulo (UNIFESP)
December 1, 2027 - January 11, 2030


Industry Experience

Software & Internet, Professional Services, Computers & Electronics, Financial Services
Projects

AI-Powered Background Remover (Full-Stack, Deep Learning Architecture)

Engineered a U2Net saliency-detection model from scratch in PyTorch, trained on the P3M-10k dataset for high-precision alpha matting with hair-level detail preservation. Built a responsive React (Vite) frontend with Material UI and Clerk authentication, integrated with a FastAPI backend that manages user credits and processing quotas via SQLAlchemy.

Technologies: Python, PyTorch, FastAPI, React, JavaScript, Material UI, Clerk, SQLAlchemy.
QuantCandle API

Architected a high-performance REST API that serves pre-processed, stationarity-engineered features to live machine-learning models in real time. Built with FastAPI and asynchronous SQLAlchemy, it is the delivery layer for the QuantCandle ecosystem, handling high-concurrency requests for 1-minute Bitcoin candle data. Key features:

- Async architecture: fully non-blocking database queries for millisecond-latency responses.
- Cloud deployment: hosted on an Oracle Cloud VPS, secured behind a RapidAPI proxy.
- ML-ready schema: feature vectors formatted for LSTM, Transformer, and XGBoost consumption.
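The non-blocking serving pattern described above can be sketched with the standard-library asyncio alone; the in-memory candle store and function names here are illustrative stand-ins for the real FastAPI routes and async SQLAlchemy session.

```python
import asyncio

# Hypothetical in-memory stand-in for the candle store; the real
# service would query a database through async SQLAlchemy.
CANDLES = {i: {"ts": i * 60, "log_return": 0.001 * i} for i in range(5)}

async def fetch_candle(candle_id: int) -> dict:
    # Simulates a non-blocking query: while the (fake) I/O is in
    # flight, control returns to the event loop so other requests run.
    await asyncio.sleep(0.01)
    return CANDLES[candle_id]

async def handle_requests(ids):
    # All requests share one event loop; total wall time is roughly
    # one query's latency rather than the sum of all of them.
    return await asyncio.gather(*(fetch_candle(i) for i in ids))

results = asyncio.run(handle_requests(range(5)))
```

This is the core reason an async stack sustains high concurrency: latency comes from I/O waits, and overlapping those waits keeps each response in the low-millisecond range.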
QuantCandle Engine: High-Frequency Data Pipeline

Architected the core data-ingestion engine, processing high-throughput cryptocurrency market data with zero downtime. The service maintains a persistent WebSocket connection to Binance and applies vectorized NumPy transformations in real time, converting raw price action into stationary, machine-learning-ready features.
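A minimal sketch of the vectorized price-to-feature step, with toy close prices standing in for the Binance stream (the exact feature set is not specified above; log returns are a standard stationary choice):

```python
import numpy as np

# Toy close prices for a few 1-minute candles (illustrative values).
closes = np.array([100.0, 100.5, 99.8, 101.2, 101.0])

# One vectorized NumPy call over the whole window replaces a
# per-tick Python loop: log returns are stationary where raw
# prices are not.
log_returns = np.diff(np.log(closes))
```

Rolling statistics (volatility, volume z-scores, etc.) can be vectorized the same way, keeping the hot path free of Python-level loops.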
Bitcoin Quant Core: Stationary ML Features Dataset

Authored a specialized financial dataset engineered for machine-learning stability, pre-processing 1-minute Bitcoin candles to eliminate non-stationary noise. The dataset applies robust scaling and arcsinh transformations to normalize distributions across market regimes, yielding strictly continuous data streams that support high-frequency model training without look-ahead bias.
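The robust-scaling-plus-arcsinh combination can be sketched as follows; the toy return series and the median/IQR formulation are illustrative assumptions, not the dataset's documented recipe:

```python
import numpy as np

# Toy 1-minute log returns with one fat-tail outlier (illustrative).
returns = np.array([-0.02, -0.001, 0.0, 0.0005, 0.03, 0.25])

# Robust scaling: center on the median and scale by the IQR, so a
# single flash-crash outlier does not dominate the scale the way a
# mean/standard-deviation z-score would.
median = np.median(returns)
iqr = np.percentile(returns, 75) - np.percentile(returns, 25)
scaled = (returns - median) / iqr

# Arcsinh grows like log(2|x|) for large |x| yet is defined at zero
# and for negatives, compressing fat tails without clipping them.
features = np.arcsinh(scaled)
```

Because both steps are strictly monotonic, the transform normalizes the distribution without reordering observations, which is what keeps it safe for regime-spanning training data.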
Multi-Target Transformer for Financial Forecasting (From Scratch)

Published a thoroughly documented implementation of a multi-target prediction (MTP) time-series Transformer built entirely from scratch in PyTorch. The project features a custom dual-output decoder that performs simultaneous regression on log returns and classification of volatility regimes. For model stability, the system trains strictly on stationarity-engineered features and uses Spearman rank-correlation analysis during EDA to prune redundant features.
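A minimal PyTorch sketch of such a dual-output design, using a stock encoder layer for brevity; all names, layer sizes, and the choice of last-step pooling are illustrative assumptions, not the published architecture:

```python
import torch
import torch.nn as nn

class DualHeadForecaster(nn.Module):
    """Toy dual-output model: a shared Transformer encoder feeds a
    regression head (next-step log return) and a classification head
    (volatility-regime logits)."""

    def __init__(self, d_model: int = 32, n_regimes: int = 3):
        super().__init__()
        layer = nn.TransformerEncoderLayer(d_model, nhead=4, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=1)
        self.reg_head = nn.Linear(d_model, 1)          # log-return regression
        self.cls_head = nn.Linear(d_model, n_regimes)  # regime classification

    def forward(self, x: torch.Tensor):
        h = self.encoder(x)[:, -1]  # representation of the last time step
        return self.reg_head(h).squeeze(-1), self.cls_head(h)

model = DualHeadForecaster()
x = torch.randn(8, 16, 32)  # (batch, seq_len, d_model) engineered features
ret_pred, regime_logits = model(x)
```

Training such a model typically combines an MSE loss on the regression output with a cross-entropy loss on the regime logits into one weighted joint objective.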