I’m a developer who specializes in designing and building cloud-native data solutions on Azure. I help businesses create scalable ETL/ELT pipelines, transform complex datasets, and deliver analytics-ready platforms. My expertise includes Azure Data Factory, Databricks, PySpark, Delta Lake, SQL, and Snowflake, with strong experience in data governance, performance optimization, and workflow automation.
What sets me apart is my ability to handle end-to-end data engineering challenges, from ingestion and transformation to production-ready pipelines, with a focus on reliability, security, and cost-efficient cloud design. I have worked across the insurance, healthcare, and pharmaceutical domains, giving me the sector knowledge to understand your business needs quickly and deliver actionable data solutions.
Employment and Project Experience
Azure Data Engineer (Contract) – Intact Insurance, Canada
February 2025 – Present
Design and build ETL/ELT pipelines with ADF, Databricks, and PySpark
Process large-scale datasets efficiently and ensure data quality, security, and compliance
Implement CI/CD automation, workflow orchestration, and monitoring for production-ready pipelines
Senior Data Engineer – Altimetrik (Novartis)
February 2023 – February 2025
Developed distributed data processing systems using PySpark and Kafka
Delivered reusable, parameterized pipelines for batch and streaming workloads
Ensured data governance, security, and operational excellence across pipelines
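To make "reusable, parameterized pipelines" concrete, here is a minimal plain-Python sketch of the pattern: a config object drives the pipeline's behavior so the same code serves multiple sources and workloads. All names here (PipelineConfig, run_pipeline, the "claims" source) are illustrative assumptions, not taken from the actual project, where this logic would live in PySpark jobs.

```python
from dataclasses import dataclass

@dataclass
class PipelineConfig:
    # Hypothetical parameters -- names are illustrative, not from a real project
    source: str
    mode: str = "batch"            # "batch" or "streaming"
    dedupe_keys: tuple = ("id",)   # columns that identify a unique record

def run_pipeline(records, config):
    """Deduplicate records on the configured keys, keeping the last occurrence."""
    seen = {}
    for rec in records:
        key = tuple(rec[k] for k in config.dedupe_keys)
        seen[key] = rec            # later records overwrite earlier duplicates
    return list(seen.values())

# Example: two records share id=1; only the latest version survives
cfg = PipelineConfig(source="claims", dedupe_keys=("id",))
rows = [{"id": 1, "v": "old"}, {"id": 2, "v": "x"}, {"id": 1, "v": "new"}]
result = run_pipeline(rows, cfg)
```

The same idea scales up in PySpark, where the config would select source paths and the dedupe step would become a `dropDuplicates` over the key columns.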
Designed and built cloud-native ETL/ELT pipelines for Intact Insurance using Azure Data Factory, Azure Databricks, PySpark, SQL Server, Snowflake, and Delta Lake. The project focused on processing large-scale insurance datasets efficiently and reliably while ensuring data quality, governance, and security.
Key highlights:
Developed PySpark backend jobs for high-volume, low-latency data processing, improving pipeline performance by ~30%.
Built orchestrated workflows using ADF and Airflow with retries, monitoring, and alerts for production-grade reliability.
Integrated data from APIs, relational databases (MySQL, SQL Server), and event-driven sources, ensuring consistency and secure access.
Applied data governance and role-based access controls using Unity Catalog and Azure security features.
Automated CI/CD deployment using Git and Azure DevOps for repeatable, safe pipeline delivery.
Supported identity-adjacent data flows, including user onboarding events, access-controlled datasets, and audit logging for compliance.
This project delivered robust, production-ready data solutions that enabled analytics, reporting, and business insights across insurance operations.
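The retries-monitoring-alerts pattern mentioned in the highlights can be sketched in a few lines of plain Python. In the actual pipelines this would be declared on the ADF activity or Airflow task itself; the function and names below are hypothetical stand-ins to show the control flow.

```python
import time
import logging

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("pipeline")

def run_with_retries(task, max_attempts=3, backoff_seconds=0.0):
    """Run a pipeline task, retrying on failure and alerting when all attempts fail.

    Illustrative sketch only: real orchestration belongs in ADF/Airflow, where
    retry counts, delays, and alert hooks are configuration, not code.
    """
    for attempt in range(1, max_attempts + 1):
        try:
            return task()
        except Exception as exc:
            log.warning("attempt %d/%d failed: %s", attempt, max_attempts, exc)
            if attempt == max_attempts:
                log.error("task exhausted retries; alerting on-call")  # stand-in for an alert hook
                raise
            time.sleep(backoff_seconds * attempt)  # linear backoff between attempts

# Example: a task that fails twice, then succeeds on the third attempt
attempts = {"n": 0}
def flaky_extract():
    attempts["n"] += 1
    if attempts["n"] < 3:
        raise RuntimeError("transient source error")
    return "extracted rows"

result = run_with_retries(flaky_extract, max_attempts=3)
```

Keeping the retry policy outside the task function, as here, is what lets the orchestrator own reliability concerns while the task stays focused on data logic.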