Remy Lager

Available to hire

I’m Remy Lager, a Senior Data Engineer specializing in scalable data pipelines. I design and deploy Databricks pipelines that collect, clean, and transform millions of rows of complex data into actionable insights. I leverage advanced Delta Lake features (Auto Loader, change data capture, and time travel) and build high-quality PySpark notebooks with modular, testable Python code and efficient execution logic.

I have built automated, end-to-end Azure infrastructure with Terraform, including Databricks job scheduling and related resources, and I engineer dynamic Power BI dashboards with careful data modeling and fully automated deployments. I work in agile environments alongside cross-functional data, DevOps, and product teams, with a strong focus on performance, reliability, CI/CD, and data governance.

Experience Level

Expert

Language

French: Fluent
English: Fluent
Spanish: Fluent
Arabic: Beginner

Education

DUT at Université Lumière Lyon 2
2009–2011

Industry Experience

Software & Internet, Computers & Electronics, Professional Services
