Available to hire
I’m Rohith Naidu, a Senior Data Engineer with 11+ years of experience architecting and delivering enterprise-grade data solutions across AWS, Azure, and GCP. I specialize in Snowflake development, Databricks workflows, cloud-native ETL/ELT orchestration, and automated data-governance frameworks.
I lead cross-functional teams to build real-time analytics pipelines, data lakehouses, and cost-efficient, scalable infrastructure, leveraging Python and Scala for ETL/ELT development, data wrangling, and feature engineering across structured, semi-structured, and streaming datasets.
Experience Level
Language
English
Fluent
Work Experience
Sr. Data Engineer at Nationwide Insurance
March 1, 2023 - Present
- Automated and optimized complex ETL/ELT pipelines using Azure Data Factory and Microsoft Fabric to centralize enterprise financial and operational data.
- Architected unified ingestion frameworks for Oracle ERP, SQL Server, and legacy platforms into Azure Synapse Analytics.
- Developed real-time and batch data pipelines supporting financial transactions, risk exposure, and credit-portfolio analytics.
- Built Power BI dashboards delivering interactive visualizations for KPIs, financial health, and SOX-audit compliance.
- Implemented Azure Purview and Apache Atlas for governance and lineage tracking aligned with SOX, PCI-DSS, HIPAA, and GDPR.
- Integrated NoSQL/distributed databases for real-time fraud detection.
- Built predictive pipelines leveraging Azure ML, AWS SageMaker, and Google Vertex AI.
- Tuned SQL/PL-SQL for regulatory reporting and deployed CI/CD workflows with Azure DevOps and GitHub Actions.
- Orchestrated cross-cloud workflows with Airflow, AWS Glue, and Google Dataflow.
- Enforced data-quality controls.
Sr. Data Engineer at Ascension
January 1, 2021 - February 28, 2023
- Engineered scalable ETL pipelines using AWS Glue, Azure Data Factory, Google Dataflow, and Apache NiFi for EHR, claims, and pharmacy data integration.
- Consolidated structured and semi-structured data from S3, ADLS, and GCS into a unified healthcare warehouse.
- Built real-time streaming pipelines with Apache Kafka, Confluent Kafka, AWS Kinesis, Google Pub/Sub, and Azure Event Hubs; managed Databricks clusters across AWS, Azure, and GCP.
- Automated ingestion workflows with AWS Step Functions and Airflow; enforced HIPAA/HITECH, GDPR, and PCI-DSS compliance using RBAC and Purview/Atlas governance.
- Created reusable ETL templates and data models; modeled NoSQL architectures with DynamoDB, Bigtable, and Firestore.
- Delivered analytics-ready datasets with Delta Lake and Spark SQL; implemented Airflow/Cloud Composer for workflow scheduling.
Data Engineer at Western Alliance Bank
October 1, 2018 - December 31, 2020
- Designed and implemented scalable ETL/ELT pipelines for financial operations, built streaming ingestion using Kafka/Kinesis, and deployed Spark-based transformations.
- Managed Snowflake, Redshift, and Azure Synapse data environments; implemented data governance and lineage; containerized ETL workloads with Docker/Kubernetes.
- Delivered analytics for risk and liquidity reporting and enabled cross-cloud data orchestration with modern CI/CD practices.
Data Engineer at State of Montana
May 1, 2016 - September 30, 2018
- Designed and deployed scalable ETL pipelines, data models, and governance for statewide analytics.
- Implemented data lineage and compliance controls.
- Collaborated with IT and regulatory teams to deliver validated datasets for reporting.
Data Engineer at Charter Communications
February 1, 2015 - April 30, 2016
- Developed ETL pipelines for telecom data; built real-time streams and batch processes.
- Leveraged Spark/Databricks for data transformations; implemented data lineage and governance.
Data Engineer at Charter Communications
February 1, 2014 - January 31, 2015
- Designed and deployed enterprise-scale data pipelines for telemetry and operations analytics; consolidated multi-source data into Snowflake/BigQuery/Redshift.
- Built streaming ingestion with Kafka/Kinesis; implemented ETL orchestration with Airflow.
- Ensured SOX/GDPR compliance and data governance.
Data Engineer at Chevron
February 1, 2015 - September 30, 2018
- Delivered data pipelines for manufacturing and operations analytics.
- Implemented predictive maintenance and reliability analytics using Spark MLlib.
- Built CI/CD for ETL using Terraform; containerized ETL workloads.
- Implemented data governance and security policies.
Industry Experience
Financial Services, Healthcare, Government, Manufacturing, Software & Internet