Nikhil Dandaboina

Available to hire

Hi, I’m Nikhil Dandaboina, a data engineer with 5+ years of experience delivering enterprise-grade data platforms across Azure, AWS, and GCP. I design and optimize cloud-native ETL/ELT pipelines, data warehouses, and analytics solutions to enable real-time insights and scalable decision-making. I build and steward data products using Snowflake, Databricks, dbt, Airflow, Terraform, Docker, and Kubernetes, empowering cross-functional teams to derive value from complex datasets.

I thrive in Agile environments, love automating workflows and deployments, and focus on robust data modeling, governance, and monitoring. My dashboards and BI work (Power BI, Tableau, Looker) translate data into actionable business outcomes, while production support and incident response ensure reliable, scalable platform operations.

Experience Level

Expert

Language

English
Fluent

Work Experience

Senior Data Engineer / Data Analyst at JPMorgan Chase & Co
April 1, 2024 - November 21, 2025
Architected scalable enterprise analytics solutions with Snowflake, dbt, and Python, optimizing SQL performance and building both batch and streaming data pipelines. Designed Snowflake data marts and ELT pipelines with Spark, Airflow, and dbt, along with Databricks ETL pipelines modeled on star-schema designs. Implemented infrastructure as code with Terraform and managed data governance through Unity Catalog. Developed statistical models and interactive dashboards (Power BI, Tableau, Looker) covering fintech transactions and customer behavior, conducted exploratory data analysis and root-cause analysis, and provided production support in close cross-functional collaboration.
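Batch ELT pipelines of the kind described above commonly rely on watermark-based incremental loads: each run moves only rows newer than the last loaded timestamp. A minimal, stdlib-only sketch of that pattern (all table and column names here are hypothetical, and SQLite stands in for the warehouse):

```python
import sqlite3

def incremental_load(conn, watermark):
    # Pull only rows newer than the current watermark from the source table.
    rows = conn.execute(
        "SELECT id, amount, updated_at FROM src_transactions "
        "WHERE updated_at > ? ORDER BY updated_at",
        (watermark,),
    ).fetchall()
    # Upsert into the target fact table so reruns are idempotent.
    conn.executemany(
        "INSERT OR REPLACE INTO fact_transactions (id, amount, updated_at) "
        "VALUES (?, ?, ?)",
        rows,
    )
    conn.commit()
    # Advance the watermark to the newest row seen this run.
    return rows[-1][2] if rows else watermark

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE src_transactions (id INTEGER, amount REAL, updated_at TEXT)")
conn.execute("CREATE TABLE fact_transactions (id INTEGER PRIMARY KEY, amount REAL, updated_at TEXT)")
conn.executemany(
    "INSERT INTO src_transactions VALUES (?, ?, ?)",
    [(1, 10.0, "2024-01-01"), (2, 25.5, "2024-01-02")],
)
wm = incremental_load(conn, "2023-12-31")
```

A second run with the returned watermark finds no newer rows and leaves the target untouched, which is what makes the load safe to schedule repeatedly from an orchestrator such as Airflow.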
Data Engineer at CVS Health
July 1, 2022 - July 1, 2022
Developed Spark Streaming applications that processed raw Kafka data into JSON for analytics, and built ETL pipelines feeding Snowflake data warehousing. Processed X12/EDI transactions in alignment with HIPAA, implemented HL7/FHIR data interoperability, and mapped medical coding standards (ICD, CPT, LOINC, SNOMED). Followed SDLC principles and design patterns, building reusable components for scalable pipelines.
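X12/EDI interchanges like those mentioned above are delimited text: segments separated by a terminator and fields by an element separator. A simplified illustration of that structure follows; a real parser reads the delimiters from the ISA header, whereas this sketch assumes the common defaults ("~" and "*"), and the sample claim fragment is invented:

```python
def parse_x12(raw, seg_term="~", elem_sep="*"):
    # Split the interchange into segments, then each segment into elements.
    segments = [s for s in raw.strip().split(seg_term) if s]
    return [seg.split(elem_sep) for seg in segments]

# Invented fragment of an 837 (claim) transaction set for illustration.
sample = "ST*837*0001~BHT*0019*00*123*20240101*1200*CH~SE*3*0001~"
parsed = parse_x12(sample)
```

Each parsed segment is a list whose first element is the segment identifier (ST, BHT, SE, ...), which downstream mapping logic can dispatch on.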
Data Engineer at FedEx Dataworks
October 1, 2021 - October 1, 2021
Built ETL/ELT jobs orchestrated with Airflow and Cloud Composer, implemented data monitoring with CI/CD pipelines and Terraform, and developed Kafka-based streaming. Performed secure data migrations between GCP and Azure using Azure Data Factory, processed large datasets with Pandas and SQL, and wrote Sqoop scripts for Oracle-to-big-data migration.
ETL Developer & Data Engineer at Apex Laboratories Pvt Ltd
April 1, 2020 - April 1, 2020
Designed scalable batch and real-time data pipelines using Spark, Kafka, and Python, and built data models and optimized ETL workflows for Snowflake, Redshift, and SQL Server. Implemented CI/CD with Airflow and Terraform, established data monitoring and alerting, and leveraged the Hadoop ecosystem (HDFS, HBase) with performance tuning.

Education

Master of Business Analytics at Trine University
Completed January 2024
Bachelor's in Computer Science at Masterji Degree and PG College
Completed January 2019

Industry Experience

Software & Internet, Financial Services, Professional Services, Healthcare, Media & Entertainment