Skills
Work Experience
Sr. Azure Data Engineer at Western Alliance Bank
July 1, 2023 - November 6, 2025
Led design and implementation of cloud-native data engineering solutions on Microsoft Azure for enterprise banking operations and analytics. Architected end-to-end data pipelines using Azure Data Factory, Azure Synapse Analytics, and Databricks to integrate data from core banking systems. Built a data lakehouse architecture on Azure Data Lake Gen2 with Delta Lake for scalable storage and fast analytics. Developed complex PySpark transformations for cleansing, enrichment, and business-rule application on financial datasets. Implemented real-time data pipelines with Azure Event Hubs and Azure Stream Analytics for fraud detection and transaction monitoring. Enforced data quality and lineage via Azure Purview; created star/snowflake schema data warehouses in Synapse for risk, lending, and compliance reporting. Implemented CDC-based incremental refreshes; optimized PySpark and SQL jobs; automated CI/CD via Azure DevOps. Enabled governance with RBAC, Key Vault, and encryption, and supported SOX compliance.
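As an illustration of the CDC-based incremental refresh pattern named above, here is a minimal watermark-driven extract sketch in plain Python (all names are hypothetical; in the actual pipelines this filter would typically run as a parameterized source query in ADF or PySpark):

```python
from datetime import datetime

def incremental_extract(rows, last_watermark):
    """Return rows modified after the stored watermark, plus the new watermark.

    `rows` stands in for a source table with a `modified_at` change-tracking
    column; only records changed since the previous run are reprocessed.
    """
    changed = [r for r in rows if r["modified_at"] > last_watermark]
    new_watermark = max((r["modified_at"] for r in changed), default=last_watermark)
    return changed, new_watermark

# Only the row changed after the last successful run is picked up.
rows = [
    {"id": 1, "modified_at": datetime(2024, 1, 1)},
    {"id": 2, "modified_at": datetime(2024, 2, 1)},
]
changed, wm = incremental_extract(rows, datetime(2024, 1, 15))
```

The returned watermark is persisted after each run, so each refresh touches only the delta rather than the full table.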
Sr. Azure Data Engineer at McKesson Corporation
June 1, 2023 - June 1, 2023
Designed and developed end-to-end data pipelines in Azure to process healthcare, clinical, and pharmaceutical data from multiple sources. Built ETL/ELT workflows with Azure Data Factory to extract from on-premises SQL Server, Oracle, and SAP into ADLS Gen2. Created transformation frameworks in Databricks (PySpark/SQL) for cleansing, aggregation, and validation; modeled data warehouses in Synapse for analytics and compliance reporting. Implemented CDC and incremental loads for near real-time data integration; created reusable pipelines and SHIR connectivity for hybrid environments. Implemented data quality rules, deduplication, and auditing, ensuring HIPAA-compliant datasets. Architected Data Lake zones (Raw, Curated, Consumption) for governance and lifecycle management. Partnered with Power BI teams to publish curated datasets and optimize data models for dashboards. Introduced event-driven data flows using Event Grid, Logic Apps, and Functions for real-time updates; migrated legacy ETL workflows.
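The Raw/Curated/Consumption zone layout described above can be sketched as a simple path-builder for an ADLS Gen2 lake (the account name, folder hierarchy, and zone names here are illustrative assumptions, not the actual convention used):

```python
ZONES = ("raw", "curated", "consumption")

def lake_path(zone, domain, dataset, load_date):
    """Build an ADLS Gen2-style abfss:// folder path for a governed lake zone.

    Enforcing a fixed zone list keeps writers from inventing ad-hoc folders,
    which is what makes lifecycle and access policies enforceable per zone.
    """
    if zone not in ZONES:
        raise ValueError(f"unknown zone: {zone}")
    return (f"abfss://{zone}@datalake.dfs.core.windows.net/"
            f"{domain}/{dataset}/load_date={load_date}")

print(lake_path("raw", "clinical", "encounters", "2024-06-01"))
```

Partitioning each dataset by `load_date` also gives the incremental loads a natural unit of reprocessing.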
Azure Data Engineer at 3M
February 1, 2020 - February 1, 2020
Migrated 20TB of data from on-prem SFTP to Azure Cloud, reducing data processing time by 30%. Optimized query performance in Azure SQL Database using bulk insert techniques and materialized views. Ensured high availability and scalability with Azure Cosmos DB, implementing effective partitioning and indexing. Leveraged Databricks, Data Lake, Blob Storage, Data Factory, and HDInsight for storage, processing, and analysis. Developed CI/CD pipelines in Azure DevOps (YAML), secured data with Key Vault, and provisioned infrastructure using ARM templates and Bicep. Automated scripting with PowerShell and Bash for pipeline automation. Applied Spark streaming for real-time analytics and built dashboards in Synapse. Designed and implemented partitioned Hive-like data structures and real-time data ingestion patterns with Event Hubs and Kafka integrations. Employed Snowflake-style cloning concepts and advanced SQL practices to optimize workflows.
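The bulk-insert optimization mentioned above rests on one idea: send rows in fixed-size batches instead of one statement per row. A minimal, generic sketch of that chunking step (batch size and row shape are assumptions; the real load would hand each batch to a bulk-copy API):

```python
def batches(rows, size):
    """Yield fixed-size chunks of `rows` so a loader can insert them in bulk.

    Batching amortizes per-statement overhead (round trips, logging) across
    many rows, which is where most of the bulk-insert speedup comes from.
    """
    batch = []
    for row in rows:
        batch.append(row)
        if len(batch) == size:
            yield batch
            batch = []
    if batch:  # flush the final partial batch
        yield batch

# Seven rows with a batch size of 3 produce chunks of 3, 3, and 1.
chunks = list(batches(list(range(7)), 3))
```

In practice the batch size is tuned against transaction-log pressure and lock duration on the target table.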
Big Data Developer at Allstate Insurance Company
May 1, 2018 - May 1, 2018
Designed, developed, and optimized data ingestion pipelines for large-scale insurance data using Hadoop ecosystem components (HDFS, Hive, Pig, Sqoop, Spark). Built ETL workflows in Spark (PySpark/Scala) for policy, claims, and customer datasets; created Hive tables and partitions; scheduled data loads with Oozie/Azkaban. Implemented real-time streaming with Kafka and Spark Streaming; built data marts for business users and data scientists. Developed Unix shell and Python automation scripts for ingestion and validation; implemented data governance with Apache Atlas and Informatica Metadata Manager; supported HIPAA/compliance initiatives and ad-hoc data requests. Migrated workflows from on-prem Cloudera to AWS EMR; produced technical documentation and managed production support.
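Deduplication is a recurring step in claims pipelines like those above: replayed Kafka messages and re-ingested files produce multiple versions of the same record. A plain-Python sketch of the keep-latest-version rule (field names are hypothetical; at scale this is the same logic a Spark window/`row_number` dedup expresses):

```python
def dedupe_latest(records, key="claim_id", ts="updated_at"):
    """Keep only the most recent version of each record, by key.

    Later records with a higher timestamp replace earlier ones; ties keep
    the first-seen record, so reprocessing the same input is idempotent.
    """
    latest = {}
    for rec in records:
        k = rec[key]
        if k not in latest or rec[ts] > latest[k][ts]:
            latest[k] = rec
    return list(latest.values())

records = [
    {"claim_id": 1, "updated_at": 1, "status": "open"},
    {"claim_id": 1, "updated_at": 3, "status": "closed"},
    {"claim_id": 2, "updated_at": 2, "status": "open"},
]
deduped = dedupe_latest(records)
```

Running the dedup before loading the data marts keeps downstream aggregates from double-counting replayed events.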
ETL Developer at Capital One Financial Corp.
October 1, 2016 - October 1, 2016
Designed and implemented ETL workflows using Informatica PowerCenter to extract, transform, and load banking and customer data into enterprise data warehouses. Built complex mappings, transformations, and reusable mapplets; integrated data from Oracle, SQL Server, flat files, and mainframe sources. Implemented incremental loads and CDC for near real-time synchronization; performed data cleansing, de-duplication, and standardization. Optimized performance with partitioning, pushdown, and caching; developed error handling, logging, and restartability features. Deployed changes via Control-M; supported PCI/compliance initiatives; collaborated with BI teams to deliver trusted datasets for dashboards. Participated in data migrations during core-banking upgrades and warehouse consolidations; documented data flows and ensured governance and security.
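The CDC-driven synchronization above boils down to applying a stream of insert/update/delete change records to a keyed target. A minimal sketch under assumed field names (`op`, `key`, `data`), mirroring the merge an Informatica mapping or SQL MERGE would perform:

```python
def apply_cdc(target, changes):
    """Apply insert/update/delete change records to a keyed target table.

    `target` is a dict keyed by primary key; "I" and "U" both upsert, so
    replaying the same change batch leaves the target unchanged (idempotent).
    """
    for ch in changes:
        if ch["op"] == "D":
            target.pop(ch["key"], None)   # delete, tolerating missing keys
        else:                             # "I" or "U": upsert full row image
            target[ch["key"]] = ch["data"]
    return target

warehouse = {1: {"name": "Alice"}}
changes = [
    {"op": "U", "key": 1, "data": {"name": "Alicia"}},
    {"op": "I", "key": 2, "data": {"name": "Bob"}},
    {"op": "D", "key": 2},
]
apply_cdc(warehouse, changes)
```

Treating inserts and updates identically is also what makes restartability cheap: a failed batch can simply be re-applied.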
Education
Bachelor's in CSE at Chandigarh University
January 11, 2030 - June 1, 2012
Master's in Computer Science at Oklahoma Christian University
January 11, 2030 - December 1, 2013
Qualifications
Industry Experience
Financial Services, Healthcare, Professional Services, Software & Internet, Education