Marius Morar

Available to hire

I’m Marius Morar, a Senior Data Engineer with 9+ years of experience building reliable cloud-based data platforms. I focus on ELT pipelines, data warehousing, and orchestration using Snowflake, AWS, Airflow, PySpark, dbt, and SQL. I collaborate with BI and architecture teams to translate business requirements into analytics-ready datasets, while maintaining clear documentation and steady improvements that keep data systems scalable and dependable.

In my approach, I value calm, methodical problem-solving, strong data quality and observability, and proactive reliability improvements across end-to-end data workflows. I enjoy migrating legacy systems to modern cloud platforms and empowering stakeholders with timely, accurate data.

Experience Level

Expert

Language

English
Fluent
Romanian
Fluent

Work Experience

Senior Data Engineer at Vollcom Digital
May 1, 2025 - November 1, 2025
- Designed and maintained robust cloud-based ELT pipelines using Snowflake, AWS, Python, SQL, PySpark, Airflow, dbt, and Fivetran, processing 20M+ records per month.
- Built and operated Apache Airflow orchestration with retries, backfills, SLAs, and observability hooks, improving pipeline reliability and reducing manual recovery effort by ~30%.
- Developed dbt transformation models on Snowflake, standardizing business logic and delivering analytics-ready dimensional datasets for BI teams.
- Supported migration from legacy data systems to modern cloud data platforms, validating historical data and ensuring seamless BI dashboard continuity.
- Implemented data quality and observability checks, including SQL assertions, freshness monitoring, and validation tests, to detect pipeline failures proactively.
- Optimized ELT performance through incremental loading, clustering, and query tuning, reducing nightly processing time by ~25%.
- Collaborated closely with business intelligence and data architecture teams to translate business requirements into analytics-ready datasets.
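The retry behaviour described above, which Airflow applies per task, can be illustrated as a minimal plain-Python sketch (function names here are illustrative, not production code):

```python
import time

def run_with_retries(task, max_retries=3, base_delay=1.0):
    """Run `task` (a zero-argument callable), retrying on failure.

    Mirrors the retry/backoff behaviour a scheduler like Airflow
    applies per task: each attempt that raises is retried after an
    exponentially growing delay, up to `max_retries` extra attempts.
    """
    for attempt in range(max_retries + 1):
        try:
            return task()
        except Exception:
            if attempt == max_retries:
                raise  # retries exhausted: surface the failure
            time.sleep(base_delay * (2 ** attempt))  # exponential backoff
```

A transient failure (for example, a warehouse connection dropping) is then absorbed automatically instead of requiring manual recovery.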
Data Platform Engineer | ML Engineer at Wonderflow B.V.
November 1, 2020 - May 1, 2025
- Designed and operated large-scale ELT pipelines using Apache Spark, PySpark, AWS, Snowflake, and Airflow, processing 1M+ events per day.
- Managed Airflow-based orchestration for batch and near-real-time workflows, increasing observability and reducing recurring failures by ~35%.
- Integrated and maintained Fivetran connectors for SaaS and marketing platforms, automating ingestion across legacy and modern data systems.
- Built dimensional data models and reporting layers to support BI tools and executive dashboards with consistent metrics.
- Optimized Spark SQL workloads via partitioning, caching, and query refactoring, reducing runtime by ~28% on multi-terabyte datasets.
- Supported data quality, governance, and reliability initiatives through validation rules, monitoring metrics, and documentation.
- Maintained CI/CD pipelines and Dockerized data services, enabling safe, repeatable deployments.
- Worked cross-functionally with BI, product, and engineering teams to deliver timely, high-quality data.
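The partitioning idea behind that Spark optimization can be shown with a small stand-alone sketch (illustrative names and data, not the actual workloads): grouping rows by date means a query scans only the partitions it needs instead of the full table.

```python
from collections import defaultdict

def partition_by_date(rows):
    """Group event rows into per-date partitions, the same idea as
    writing a table partitioned by event_date in Spark or Snowflake."""
    parts = defaultdict(list)
    for row in rows:
        parts[row["event_date"]].append(row)
    return dict(parts)

def count_events(parts, dates):
    """Scan only the requested partitions (partition pruning):
    the query touches a fraction of the stored data."""
    return sum(len(parts.get(d, [])) for d in dates)
```

In a real engine the pruning happens in the query planner, but the payoff is the same: work proportional to the partitions read, not to the table size.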
Data Engineer at ArtSoft Consult
June 1, 2017 - August 1, 2020
- Developed Python- and SQL-based ELT pipelines loading operational and financial data into PostgreSQL and SQL Server warehouses, processing around 1M records per day.
- Designed dimensional schemas with fact and dimension tables, including Slowly Changing Dimensions, enabling accurate historical analysis and consistent KPI reporting.
- Built scheduled batch workflows and basic orchestration scripts to automate recurring data integration and reporting tasks, reducing manual effort by ~30%.
- Optimized SQL queries and indexing strategies, improving nightly batch performance by roughly 30% while maintaining data accuracy and integrity.
- Created curated reporting layers consumed by BI tools, replacing spreadsheet-based workflows and improving turnaround time for management reports.
- Implemented reconciliation and data validation checks between source systems and warehouse outputs, reducing recurring discrepancies by ~25%.
- Worked directly with business stakeholders to gather requirements and translate them into analytics-ready datasets.
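A Slowly Changing Dimension Type 2 update, as used in those dimensional schemas, can be sketched in plain Python (the row shape and function name are illustrative; in practice this is a MERGE/upsert in the warehouse):

```python
def apply_scd2(dim_rows, incoming, today):
    """Apply an SCD Type 2 update to a dimension.

    `dim_rows`: list of dicts with keys key, value, valid_from,
    valid_to (None means the current version). `incoming` maps key
    to its latest value. Changed rows are closed out and a new
    current row is appended, preserving full history so reports can
    reconstruct the dimension as of any past date.
    """
    out, seen = [], set()
    for row in dim_rows:
        if row["valid_to"] is None and row["key"] in incoming:
            seen.add(row["key"])
            if incoming[row["key"]] != row["value"]:
                out.append(dict(row, valid_to=today))  # expire old version
                out.append({"key": row["key"], "value": incoming[row["key"]],
                            "valid_from": today, "valid_to": None})
                continue
        out.append(row)  # unchanged or historical row passes through
    for key, value in incoming.items():
        if key not in seen and all(r["key"] != key for r in dim_rows):
            # brand-new key: insert its first current row
            out.append({"key": key, "value": value,
                        "valid_from": today, "valid_to": None})
    return out
```

The same pattern, expressed in SQL, is an UPDATE that closes the old row plus an INSERT of the new version.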
Junior Software Engineer / Data Analyst (Internship) at WebGurus
August 1, 2016 - May 1, 2017
- Wrote Python scripts and SQL queries for data cleaning, transformation, and exploratory analysis, supporting early internal analytics initiatives.
- Assisted in building simple ETL workflows that consolidated data from multiple operational systems into a centralized PostgreSQL reporting database.
- Created SQL views and aggregated tables that reduced manual reporting work by approximately 20% for internal teams.
- Built basic operational dashboards and reports to track daily metrics and data freshness for non-technical stakeholders.
- Documented data logic, assumptions, and validation steps, improving transparency and reliability of shared reports.
- Supported senior engineers with testing and troubleshooting of data pipelines, building a strong foundation in data engineering best practices.
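The reconciliation checks mentioned across these roles boil down to comparing source and warehouse row counts and flagging tables that drift. A minimal sketch (table names and tolerance are hypothetical):

```python
def reconcile(source_counts, warehouse_counts, tolerance=0.0):
    """Compare per-table row counts between a source system and the
    warehouse; return tables whose relative difference exceeds
    `tolerance`, mapped to their (source, warehouse) counts."""
    mismatches = {}
    for table, src in source_counts.items():
        wh = warehouse_counts.get(table, 0)
        if src == 0:
            diff = 0.0 if wh == 0 else 1.0  # avoid dividing by zero
        else:
            diff = abs(src - wh) / src  # relative discrepancy
        if diff > tolerance:
            mismatches[table] = (src, wh)
    return mismatches
```

Run after each load, a check like this turns silent discrepancies into an explicit, alertable signal.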

Education

Bachelor’s Degree in Software Engineering at University of Alba Iulia
October 1, 2013 - July 1, 2016

Industry Experience

Software & Internet
