I’m Olesya, a software engineer with 5+ years of full‑stack development experience and hands‑on expertise in AI‑enabled solutions. I enjoy turning complex problems into reliable web apps, building scalable APIs, and shaping data models to power product experiences.
I’ve shipped AI-powered tools using RAG pipelines (Supabase pgvector) and LangChain/OpenAI prompt design, and I also automate workflows with n8n. My background spans production web apps (React, Node.js, NestJS, Python), with a strong focus on API design and data modeling. I’m passionate about delivering clean code and delightful user experiences.
An interactive Python-based Travel Planning Assistant that helps users plan trips by suggesting hotels, restaurants, attractions, and more — all based on city, activity type, and budget. The assistant also supports natural language question answering powered by Retrieval-Augmented Generation (RAG) using LangChain, OpenAI, and Supabase.
Features
Interactive itinerary planning
Select city, activity type, and budget.
Get personalized recommendations with descriptions and pricing.
RAG-powered Q&A
Ask natural language travel questions.
Answers generated by combining a large language model with vector search over a Supabase database (see the sketch below this feature list).
Supports continuous, conversational Q&A.
Data backend
Uses Supabase for storing travel data and vector embeddings.
Efficient retrieval with PostgreSQL vector search.
Built with modern tools
Python 3.x
LangChain & OpenAI API
Supabase vector database
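Below is a minimal sketch of how the RAG Q&A layer can be wired together with these tools. The "documents" table and "match_documents" function follow the common Supabase/pgvector pattern and are illustrative names rather than the project's exact schema:

```python
# Minimal RAG Q&A sketch (illustrative names and schema, not the project's exact code).
import os

from langchain_openai import ChatOpenAI, OpenAIEmbeddings
from langchain_community.vectorstores import SupabaseVectorStore
from supabase import create_client

# Supabase client: URL and service key come from the environment.
supabase = create_client(os.environ["SUPABASE_URL"], os.environ["SUPABASE_SERVICE_KEY"])

# Vector store over a pgvector-backed table; "documents" / "match_documents"
# follow the standard Supabase pattern and are assumptions here.
store = SupabaseVectorStore(
    client=supabase,
    embedding=OpenAIEmbeddings(model="text-embedding-3-small"),
    table_name="documents",
    query_name="match_documents",
)

llm = ChatOpenAI(model="gpt-4o-mini", temperature=0)

def answer(question: str) -> str:
    """Retrieve the most relevant travel snippets and let the LLM answer from them."""
    docs = store.similarity_search(question, k=4)
    context = "\n\n".join(d.page_content for d in docs)
    prompt = (
        "Answer the travel question using only the context below.\n\n"
        f"Context:\n{context}\n\nQuestion: {question}"
    )
    return llm.invoke(prompt).content

print(answer("Which budget-friendly attractions are there in Lisbon?"))
```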
Skills: Web Development · UI/UX · AI Integration · Project Showcasing
I designed and developed a personal portfolio website to present my AI and software development projects in a structured, accessible, and professional format. The goal was to create a clean, modern interface that highlights practical skills, technical stacks, and real-world use cases.
Implemented features:
Responsive, minimalist website showcasing AI & software projects
Project pages with detailed descriptions, tech stacks, and demo links
Dynamic content structure — easy to update and scale with new work
Clear separation of AI tools, chatbots, automations, and web apps
Integration of external links (e.g., GitHub, YouTube tutorials, live demos)
Tech stack: HTML, CSS, and JavaScript (React/Next.js where applicable), deployed as a static site via GitHub Pages or a similar host
📎 You can explore the live portfolio here:
I developed a semantic search web application that finds relevant texts based on the meaning of the query rather than keywords alone. I used OpenAI to generate embeddings and Supabase as the database and backend.
Implemented features:
Semantic search using OpenAI Embeddings
React-based UI for uploading, searching, and displaying texts
Supabase as backend and embedding storage
Query storage and processing with PostgreSQL
Interactive and minimalist user interface
Tech stack: React.js, OpenAI Embeddings, Supabase, PostgreSQL
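For a clearer picture of what happens behind the React UI, here is a condensed sketch of the embedding and retrieval calls. The "texts" table and "match_texts" RPC are illustrative names and assume a pgvector column plus a SQL similarity function on the Supabase side:

```python
# Backend-side sketch of the semantic search flow (the UI itself is React).
# Table and RPC names ("texts", "match_texts") are illustrative assumptions.
import os

from openai import OpenAI
from supabase import create_client

openai_client = OpenAI()  # reads OPENAI_API_KEY from the environment
supabase = create_client(os.environ["SUPABASE_URL"], os.environ["SUPABASE_ANON_KEY"])

def embed(text: str) -> list[float]:
    """Turn a text into an embedding vector with OpenAI."""
    response = openai_client.embeddings.create(model="text-embedding-3-small", input=text)
    return response.data[0].embedding

def add_text(content: str) -> None:
    """Store a text together with its embedding."""
    supabase.table("texts").insert({"content": content, "embedding": embed(content)}).execute()

def search(query: str, limit: int = 5):
    """Find texts whose meaning is closest to the query via pgvector similarity."""
    result = supabase.rpc(
        "match_texts",
        {"query_embedding": embed(query), "match_count": limit},
    ).execute()
    return result.data

add_text("The museum offers free entry on the first Sunday of every month.")
print(search("When can I visit the museum without paying?"))
```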
📎 I recorded a YouTube tutorial where I walk through the entire project step by step:
I created a Telegram bot that identifies plants from photos using the Plant.id API. The user sends an image of a plant — the bot processes it, identifies the species, and returns information. All requests and results are automatically recorded in Google Sheets for further analysis.
Implemented features:
Integration with the Plant.id API for plant recognition
Connection to the Telegram Bot API
Data storage in Google Sheets via Google API
Image processing and user response in chatbot format
Error handling and fallback logic for failed recognitions
Tech stack: Node.js, Plant.id API, Google Sheets, Telegram Bot API
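The bot itself is written in Node.js; the sketch below re-expresses the same flow in Python purely for illustration. The Plant.id request shape follows its v2 API and the spreadsheet name is a placeholder, so treat this as a reading aid rather than the production code:

```python
# Flow sketch only: photo in -> Plant.id identification -> Google Sheets log -> chat reply.
import base64
import os

import gspread
import requests

TELEGRAM_TOKEN = os.environ["TELEGRAM_TOKEN"]
PLANT_ID_KEY = os.environ["PLANT_ID_API_KEY"]
TG_API = f"https://api.telegram.org/bot{TELEGRAM_TOKEN}"

# Illustrative sheet name; gspread reads service-account credentials from its default path.
sheet = gspread.service_account().open("plant-bot-log").sheet1

def download_photo(file_id: str) -> bytes:
    """Fetch the photo bytes Telegram stored for this message."""
    file_path = requests.get(f"{TG_API}/getFile", params={"file_id": file_id}).json()["result"]["file_path"]
    return requests.get(f"https://api.telegram.org/file/bot{TELEGRAM_TOKEN}/{file_path}").content

def identify(image_bytes: bytes) -> dict:
    """Ask Plant.id for the most likely species (v2-style request; check current docs)."""
    payload = {"images": [base64.b64encode(image_bytes).decode()], "api_key": PLANT_ID_KEY}
    suggestions = requests.post("https://api.plant.id/v2/identify", json=payload).json().get("suggestions", [])
    return suggestions[0] if suggestions else {}

def handle_photo(chat_id: int, file_id: str) -> None:
    """Identify the plant, log the result to Google Sheets, and reply in the chat."""
    best = identify(download_photo(file_id))
    name = best.get("plant_name")
    if name:
        reply = f"This looks like {name} (confidence {best.get('probability', 0):.0%})."
    else:
        reply = "Sorry, I couldn't identify this plant. Try a clearer photo."  # fallback path
    sheet.append_row([chat_id, name or "unrecognized"])
    requests.post(f"{TG_API}/sendMessage", json={"chat_id": chat_id, "text": reply})
```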
📎 I recorded a detailed video (in English) where I walk through the entire process of building this bot step by step:
In this project, I created a fully automated system that collects business leads (e.g., real estate agencies or clinics) based on specified keywords and geolocation.
Manual lead collection means hours of searching on Google, browsing websites, and copying data into spreadsheets. The result? Outdated or incomplete information.
This system solves that problem.
By automating the entire process, you get:
50–100 targeted leads from Google Maps in just a few minutes
Structured, ready-to-use data (name, website, business type, address) directly in Airtable
This isn’t just a tool — it’s a scalable lead generation engine you can rely on every day.
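As a rough illustration of what the pipeline does under the hood, here is a simplified Python sketch of the same flow. The Airtable base, table, and field names are placeholders, and the website field (which needs a follow-up Place Details request) is omitted for brevity:

```python
# Illustrative sketch of the lead pipeline: query Google Places, push rows to Airtable.
import os

import requests

GOOGLE_KEY = os.environ["GOOGLE_MAPS_API_KEY"]
AIRTABLE_TOKEN = os.environ["AIRTABLE_TOKEN"]
AIRTABLE_URL = "https://api.airtable.com/v0/appXXXXXXXX/Leads"  # placeholder base/table

def fetch_leads(keyword: str, location: str) -> list[dict]:
    """Search Google Maps places, e.g. 'real estate agency in Chicago'."""
    response = requests.get(
        "https://maps.googleapis.com/maps/api/place/textsearch/json",
        params={"query": f"{keyword} in {location}", "key": GOOGLE_KEY},
    )
    return response.json().get("results", [])

def push_to_airtable(place: dict) -> None:
    """Write one structured lead row (name, address, business type) to Airtable."""
    fields = {
        "Name": place.get("name"),
        "Address": place.get("formatted_address"),
        "Business type": ", ".join(place.get("types", [])),
    }
    requests.post(
        AIRTABLE_URL,
        headers={"Authorization": f"Bearer {AIRTABLE_TOKEN}"},
        json={"fields": fields},
    )

for lead in fetch_leads("real estate agency", "Chicago"):
    push_to_airtable(lead)
```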