Harish Sathiyanandan

Available to hire

I am a passionate Data Engineer with extensive experience building scalable, efficient data pipelines on modern cloud platforms such as Azure, Databricks, and Snowflake. I specialize in automating data workflows, improving data quality, and delivering actionable insights through analytics and Power BI dashboards. I enjoy using PySpark, SQL, and Azure Data Factory to transform raw data into reliable, analytics-ready data assets.

I have a strong background in cloud data engineering, with hands-on experience designing end-to-end solutions, leading migrations to modern platforms, and optimizing data transformations and pipelines for performance and cost efficiency. I thrive in hybrid cloud environments and am committed to building robust, well-tested, and maintainable data infrastructure that empowers stakeholders and drives business decisions.



Work Experience

Data Engineer at Adventure Works
August 1, 2023 - August 26, 2025
Built a modular, layered Azure data pipeline that ingests, transforms, and serves Adventure Works data using Azure Data Factory, Databricks (PySpark), Delta Lake, and Synapse, integrating with Power BI for interactive, analytics-ready insights. Designed hybrid cloud data engineering solutions with self-hosted integration runtime, enabling secure ingestion, scalable transformation, and analytics delivery. Engineered a complete Snowflake solution for ODI cricket JSON data including ingestion, cleaning, modeling, and automated workflows with Snowflake Tasks, enabling actionable insights via Snowsight dashboards.
Azure Data Engineer at ODI Cricket Data Pipeline
May 1, 2025 - August 26, 2025
Focused on turning complex data into reliable infrastructure by building automated end-to-end pipelines using PySpark, SQL, and the Azure stack (ADF, Databricks). Led migrations to modern platforms like Snowflake. Designed and automated end-to-end data pipelines in Azure Data Factory to ingest multi-sourced data from APIs, flat files, and databases for sales and inventory reporting. Utilized PySpark within Databricks to perform complex data transformations, cleaning raw data for analysis and reducing data refresh time by 40%.
Data Analyst at LinkIn
September 1, 2021 - August 26, 2025
Extracted and cleaned multi-store sales files using SQL and Python, preparing structured datasets for trend and performance analysis. Documented data flows from POS to store analytics sheets, including mapping source fields and transformation rules, improving clarity and reducing misreporting. Automated routine reporting workflows and created structured dashboards in Google Sheets for better visibility of sales trends.
Data Engineer at Adventure Works Cloud ETL Pipeline
August 1, 2023 - August 26, 2025
Built a modular, layered Azure data pipeline that ingests, transforms, and serves Adventure Works data using Azure Data Factory, Databricks (PySpark), Delta Lake, and Synapse, integrated with Power BI for interactive, analytics-ready insights.
Data Engineer at Hybrid Cloud Data Engineering with Self-Hosted IR and Delta Lake
August 1, 2023 - August 26, 2025
Designed an end-to-end solution with Azure Data Factory (Self-Hosted Integration Runtime), Databricks (PySpark), Delta Lake, and Synapse, enabling secure ingestion, scalable transformation, and analytics delivery.
Data Engineer at Snowflake ODI Cricket JSON Data Pipeline
August 1, 2023 - August 26, 2025
Engineered a complete Snowflake solution for ODI cricket JSON data, streamlining data ingestion, cleaning, and modeling, automating workflows with Snowflake Tasks, and enabling actionable insights through Snowsight dashboards.
Azure Data Engineer at Azure Data Engineering
August 1, 2023 - August 26, 2025
Built automated end-to-end pipelines using PySpark, SQL, and the Azure stack (ADF, Databricks). Led migrations to modern platforms such as Snowflake and developed efficient, scalable data systems through engineering and automation.
Data Engineer at Metro Tiles World
September 1, 2021 - August 26, 2025
Extracted and cleaned multi-store sales files using SQL and Python, preparing structured datasets for trend and performance analysis. Documented data flows, improved clarity, and automated reporting workflows with Google Sheets dashboards for enhanced visibility of sales trends.

Education

MSc Computing Science at University College Cork
January 1, 2018 - August 1, 2020
Bachelor of Engineering in Computer Science and Engineering at VelTech Hightech Engineering College


Industry Experience

Retail, Software & Internet, Professional Services, Financial Services

Experience Level

Expert
Expert
Expert
Expert
Expert
Expert
Expert
Expert
Intermediate
Intermediate
Intermediate
Intermediate
Intermediate
Intermediate
See more