Available to hire
I’m a data engineer with 4 years of experience delivering scalable data platforms and analytics solutions across Banking and Enterprise IT Services domains. I specialize in cloud-native ELT/ETL, real-time streaming, and analytics-ready data products using AWS, Azure, Snowflake, and GCP.
I love turning complex datasets into actionable insights, collaborating with risk, compliance, and product teams in Agile environments, and continually improving data quality, lineage, and governance to empower data-driven decisions.
Skills
Work Experience
Data Engineer at BMO Bank
January 1, 2025 - Present
Designed and implemented scalable ETL/ELT pipelines processing over 50 million daily banking transactions from core banking systems, payment platforms, and external data sources into cloud data lakes and Snowflake data warehouses.
Orchestrated and monitored 200+ production workflows using Apache Airflow, achieving 99.9% SLA and reducing manual intervention by 60%.
Led cloud modernization, migrating legacy on-premise financial reporting pipelines to GCP and Snowflake, improving scalability, reducing infrastructure dependency, and supporting petabyte-scale storage and compute.
Built end-to-end data pipelines on GCP using Cloud Storage, BigQuery, Pub/Sub, Dataflow, and Cloud Composer (Airflow), enabling scalable ingestion and processing of high-volume banking datasets.
Implemented Snowflake-based cloud data warehousing for regulatory reporting and risk analytics, leveraging Snowpipe, Streams, and Tasks, and tuning warehouses to support both near real-time and batch workloads.
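The batch transform step of an ELT pipeline like those described above can be sketched in miniature with plain Python. The record fields, validation rules, and function name here are illustrative assumptions, not taken from the actual role; a production version would run as an orchestrated Airflow task loading into Snowflake rather than returning a dict.

```python
from collections import defaultdict

def transform_transactions(records):
    """Validate raw transaction records and aggregate amounts per account.

    Mimics the transform step of a batch ELT pipeline: rows failing the
    quality gate are rejected, valid rows are aggregated for loading.
    """
    totals = defaultdict(float)
    rejected = 0
    for rec in records:
        # Basic quality gate: account id present and amount is numeric.
        if not rec.get("account_id") or not isinstance(rec.get("amount"), (int, float)):
            rejected += 1
            continue
        totals[rec["account_id"]] += rec["amount"]
    return {"totals": dict(totals), "rejected": rejected}

# Hypothetical sample batch; one row fails validation.
raw = [
    {"account_id": "A1", "amount": 100.0},
    {"account_id": "A1", "amount": 25.5},
    {"account_id": "A2", "amount": 40.0},
    {"account_id": None, "amount": 10.0},
]
result = transform_transactions(raw)
print(result)  # {'totals': {'A1': 125.5, 'A2': 40.0}, 'rejected': 1}
```

Keeping a rejected-row count alongside the aggregates is what makes the pipeline monitorable: an SLA alert can fire when the rejection rate spikes.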
Data Analyst at GlobalLogic
January 1, 2021 - December 1, 2023
Analyzed and interpreted data from multiple enterprise systems to support 20+ client accounts, enabling data-driven operational decisions.
Processed millions of records per month using Python (Pandas, NumPy) and SQL, increasing data analysis efficiency by 40%.
Developed and maintained 30+ dashboards and reports in Power BI to track KPIs, SLAs, and service performance.
Automated recurring reporting, data preparation, and validation workflows, reducing manual effort by 50%.
Collaborated with service delivery managers and stakeholders to translate requirements into analytical solutions, improving reporting turnaround by 35%.
Performed data quality checks and anomaly detection, contributing to a 20% improvement in SLA adherence.
Delivered ad hoc analyses and executive summaries for client reviews, supporting audit and governance.
Built cloud-based data pipelines and datasets in AWS (S3, Athena, Glue) and Azure, enabling scalable analytics and improved query performance across large datasets.
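The anomaly-detection work mentioned above typically follows a simple statistical pattern. This is a minimal stdlib sketch under assumed details: the z-score threshold, sample values, and function name are illustrative, not drawn from the original role.

```python
from statistics import mean, stdev

def flag_anomalies(values, threshold=3.0):
    """Return values whose z-score exceeds the threshold.

    A simple quality check of the kind run on operational metrics
    before they feed an SLA dashboard.
    """
    if len(values) < 2:
        return []
    mu, sigma = mean(values), stdev(values)
    if sigma == 0:
        return []  # no spread, nothing to flag
    return [v for v in values if abs(v - mu) / sigma > threshold]

# Hypothetical ticket response times in minutes; 500 is an obvious outlier.
samples = [12, 14, 11, 13, 12, 500, 13, 12]
print(flag_anomalies(samples, threshold=2.0))  # [500]
```

A z-score check is deliberately crude but cheap to run on every batch; flagged rows can then be routed to a manual-review queue instead of silently skewing SLA reports.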
Education
Master's in Management Information Systems at Northern Illinois University
January 11, 2030 - February 16, 2026
Qualifications
Industry Experience
Financial Services, Professional Services