SARATH BOLLEDDU

Available to hire

I am a Senior Data Engineer and Architect with 4+ years of experience delivering secure, scalable data platforms on Azure, AWS, and GCP. I lead engineering on an Azure/Snowflake platform, owning solution design, networking and security, and end-to-end delivery from ingestion to analytics, while coaching junior analysts and engineers. I have worked across payments, credit, retail and postal logistics, collaborating with stakeholders to design and productionise data solutions.

I’m hands-on with PySpark, Databricks, Cosmos DB, Event Hub, API integrations, IaC and CI/CD. I thrive on owning outcomes, planning and tracking work with Jira, and documenting runbooks in Confluence. I have led architectural discussions, built cross-cloud observability, and delivered impact like improved attribution models and recovering previously unallocated transactions.


Language

English
Fluent

Work Experience

Data Engineer (acting Senior Data Engineer) at Pioneer Credit
August 1, 2024 - Present
Operate as the senior engineer on the Azure and Snowflake data platform, leading solution design, delivery and mentoring of a team of analysts and engineers. Migrated workloads from on-prem SQL Server, Oracle and PostgreSQL to cloud ingestion, warehousing and analytics; standardised landing, staging and curated layers, SLAs/SLOs and data-quality controls; and defined modelling patterns (Kimball and OBT) and multi-environment Snowflake setups. Delivered event-driven and CDC ingestion with ADF, Fivetran, Event Hub, Snowpipe and Streams/Tasks; implemented Cosmos DB as a marketing capture layer; and authored Snowflake JavaScript stored procedures to adapt to evolving source schemas. Implemented CI/CD with Terraform to provision secure infrastructure and established cross-cloud observability. Delivered a payment attribution model using exponential attribution, improving accuracy by ~35% and recovering around AUD 500k in previously unallocated transactions. Mentored five analysts and engineers.
Data Engineer (Consultant) at ANZ Worldline Payments Solutions
September 1, 2023 - June 1, 2024
Designed a Databricks Medallion architecture (bronze/silver/gold) to modernise data pipelines, with reusable PySpark ingestion components for payments and settlement data. Coordinated pipeline modernisation using Jira and Confluence, and delivered ADF-triggered ELT integrating Oracle, Teradata and SAP BO/BW with SLA dashboards, lineage views and reconciliation for financial controls. Improved data quality by ~30% and reduced processing time by ~15% through standardisation and optimisation. Automated SharePoint transfers with Power Automate, saving ~10 hours per week. Collaborated with RBI and Deloitte on compliance and governance.
Reporting Analyst (Consultant) at Woolworths Group
May 1, 2023 - November 1, 2023
Integrated AI ScanAssist outputs with BigQuery-based operational datasets and built Looker/Tableau dashboards to enable non-technical store and category teams to act on insights, reducing operational costs. Optimised BigQuery SQL pipelines, cutting data readiness time by ~25% and improving reliability for daily reporting. Partnered with category managers, data engineers and product teams to prioritise metrics and ensure reports matched business needs.
Data Analyst (Consultant) at Australia Post
January 1, 2023 - June 1, 2023
Migrated on-premises data to Google Cloud Platform BigQuery, standardising ingestion across CSV, TXT and Excel sources and implementing governance basics. Integrated AWS Lambda-based product analytics with GCP via API delivery and payload normalisation into BigQuery. Built BigQuery SQL models and dashboards for customer segmentation, product analytics and budget tracking. Implemented Cloud Monitoring for throughput and freshness with alerting/runbooks, and combined Adobe Analytics with customer feedback to recommend UX improvements.
Software Developer at sensen.ai
July 1, 2021 - November 1, 2022
Delivered Power BI dashboards for Crown Resorts using real-time camera sensor data to improve operations and incident monitoring. Partnered with R&D to generate analytics supporting AI product development and tuning. Developed Python time-series models for compliance improvements and built CI/CD pipelines to automate ML workflow deployment; created real-time hazard detection models in TensorFlow and OpenCV for industrial clients.

Education

Master of Data Science at Deakin University
Bachelor of Technology at IIT Roorkee

Qualifications

Microsoft Certified: Azure Data Engineer Associate
SnowPro Core (in progress)

Industry Experience

Financial Services, Retail, Software & Internet, Transportation & Logistics, Professional Services, Media & Entertainment