I’m Remy Lager, a Senior Data Engineer who designs and deploys scalable data pipelines. I build Databricks pipelines that collect, clean, and transform millions of rows of complex data into actionable insights. I leverage advanced Delta Lake features—Auto Loader, change data capture (CDC), and time travel—and I write high-quality PySpark notebooks with modular, testable Python code and efficient execution logic.
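As a rough illustration of the kind of ingestion step described above, here is a minimal sketch of a Delta Lake Auto Loader pipeline in a Databricks notebook. It assumes the ambient `spark` session that Databricks provides; the storage paths, file format, and target table name are illustrative placeholders, not details from this profile.

```python
# Sketch of an Auto Loader ingestion step (assumes a Databricks notebook,
# where `spark` is the ambient SparkSession). Paths and names are hypothetical.
from pyspark.sql import functions as F

raw = (
    spark.readStream.format("cloudFiles")                         # Auto Loader source
    .option("cloudFiles.format", "json")                          # incoming file format
    .option("cloudFiles.schemaLocation", "/mnt/schemas/events")   # schema inference/tracking
    .load("/mnt/landing/events")
)

# A simple cleaning/enrichment step: tag each row with its ingestion time.
cleaned = raw.withColumn("ingested_at", F.current_timestamp())

(
    cleaned.writeStream
    .option("checkpointLocation", "/mnt/checkpoints/events")      # exactly-once bookkeeping
    .trigger(availableNow=True)                                   # drain the backlog, then stop
    .toTable("bronze_events")                                     # Delta table sink
)
```

The `availableNow` trigger lets the same streaming code run as a scheduled batch job, which pairs naturally with Databricks job scheduling.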
I automated end-to-end Azure infrastructure, including Databricks job scheduling and supporting resources, using Terraform. I engineered and optimized dynamic Power BI dashboards with smart data modeling and seamless automated deployments. I work in agile environments with cross-functional data, DevOps, and product teams, with a strong focus on performance, reliability, CI/CD, and data governance.
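To make the Terraform-managed job scheduling concrete, here is a minimal config fragment for a scheduled Databricks job using the Databricks Terraform provider. The job name, notebook path, cluster sizing, and cron schedule are illustrative assumptions, not details from this profile.

```terraform
# Hypothetical sketch: a nightly Databricks notebook job managed by Terraform.
resource "databricks_job" "nightly_pipeline" {
  name = "nightly-etl"

  task {
    task_key = "transform"

    notebook_task {
      notebook_path = "/Repos/pipelines/transform"   # illustrative path
    }

    new_cluster {
      spark_version = "14.3.x-scala2.12"
      node_type_id  = "Standard_DS3_v2"              # Azure VM type
      num_workers   = 2
    }
  }

  schedule {
    quartz_cron_expression = "0 0 2 * * ?"           # run nightly at 02:00
    timezone_id            = "Europe/Paris"
  }
}
```

Keeping the job definition in Terraform means the schedule, cluster shape, and notebook reference are versioned and deployed through the same CI/CD flow as the rest of the infrastructure.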