Mohammed Shaik

Available to hire
Hello, I’m Mohammed Shaik, a Senior Data Engineer with 9 years of progressive experience in software development and a deliberate transition into data engineering, ETL pipelines, and scalable data platforms. I bring a solid backend foundation in Node.js and .NET, and I now design data-driven systems using SQL Server, Amazon Redshift, APIs, and automation to solve enterprise data challenges.

I build batch and real-time data pipelines, develop data integration services, and work across the full SDLC, from data modeling and transformation to reporting. I’m proficient with cloud ecosystems (AWS, Azure, GCP), HL7/FHIR data, PHI/PII compliance, and streaming technologies such as Amazon Kinesis and Apache Kafka, and I focus on delivering reliable, scalable solutions in close collaboration with product and business partners.


Experience Level

Expert

Language

English
Fluent

Work Experience

Senior Data Engineer at Noridian Healthcare, Deerfield, IL
September 1, 2017 - Present
Implemented HIPAA-compliant data security (IAM policies, KMS-based encryption, VPC endpoints, audit logging) for Medicaid data pipelines. Built scalable ETL workflows in Apache Airflow on MWAA to process eligibility, claims, and adjudication data; ingested HL7 and X12 formats into Redshift via S3 triggers and Glue crawlers. Developed real-time adjudication pipelines using Lambda, Step Functions, and DynamoDB to process claim outcomes; created FHIR-compatible JSON payloads and exposed data through API Gateway. Created batch adjudication pipelines with AWS Glue and S3, logging job metadata in DynamoDB for traceability. Migrated legacy ETL jobs to Spark-based Databricks pipelines and maintained versioned Glue tables for the claims lifecycle. Implemented data quality checks with Great Expectations integrated into Airflow; monitored ETL via CloudWatch and alerted via SNS. Supported CI/CD with GitHub Actions and AWS CodePipeline, provisioned infrastructure with Terraform, and enforced encryption throughout.
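As a flavor of the real-time adjudication flow described above, here is a minimal, hypothetical sketch of a Lambda-style handler that maps a claim outcome event to a FHIR-compatible ClaimResponse payload for API Gateway. The event shape and the internal outcome codes (PAID, DENIED, PENDED) are illustrative assumptions, not the actual Noridian schema; the FHIR outcome values follow the standard ClaimResponse resource.

```python
import json

# Illustrative mapping from hypothetical internal adjudication outcomes
# to FHIR ClaimResponse.outcome codes (per the HL7 FHIR ClaimResponse resource).
OUTCOME_CODES = {"PAID": "complete", "DENIED": "error", "PENDED": "partial"}

def handler(event, context=None):
    """Lambda-style entry point: turn a claim adjudication event into a
    FHIR-compatible JSON body shaped for an API Gateway proxy response."""
    claim_id = event["claim_id"]
    outcome = event.get("outcome", "PENDED")
    payload = {
        "resourceType": "ClaimResponse",
        "id": claim_id,
        "status": "active",
        "outcome": OUTCOME_CODES.get(outcome, "partial"),
    }
    return {"statusCode": 200, "body": json.dumps(payload)}
```

In a Step Functions orchestration, a handler like this would sit behind the adjudication rules step, with the outcome record also written to DynamoDB for traceability.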
Senior Data Engineer at Hewlett Packard (HP), Plano, TX
October 31, 2015 - June 30, 2017
Participated in the SDLC by translating business requirements into data pipeline logic across Azure Data Factory, Databricks, and Synapse Analytics. Processed large retail datasets with Spark Structured Streaming in Databricks and wrote outputs to ADLS Gen2. Designed and deployed CI/CD pipelines using Azure DevOps and Jenkins to automate testing and deployment of Spark notebooks and data services. Automated infrastructure provisioning with Terraform and ARM templates for resources including ADF, Azure SQL DB, Synapse, Event Hubs, and Blob Storage. Managed job execution and error handling in ADF, integrating with Azure Monitor and Log Analytics for proactive alerts. Created deployment artifacts for Spark jobs and Python wheels, published them via Azure Artifacts, and deployed them through Release pipelines. Built web-based data services and Python APIs with App Service and secured them via API Management, enforcing RBAC and encryption. Demonstrated founder-level ownership by coordinating requirements and testing.
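The error-handling and proactive-alerting pattern mentioned above can be sketched as a small retry wrapper around a pipeline activity: each failure is logged so a monitor (e.g. Azure Monitor via a log sink) can alert on it, and transient errors are retried with exponential backoff. The function names and parameters here are illustrative assumptions, not HP's actual code.

```python
import logging
import time

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("pipeline")

def run_with_retries(activity, max_attempts=3, base_delay=0.1):
    """Run a pipeline activity callable, retrying transient failures with
    exponential backoff; every failed attempt is logged for monitoring."""
    for attempt in range(1, max_attempts + 1):
        try:
            return activity()
        except Exception as exc:
            log.warning("attempt %d/%d failed: %s", attempt, max_attempts, exc)
            if attempt == max_attempts:
                raise  # exhausted retries: surface the error to the orchestrator
            time.sleep(base_delay * 2 ** (attempt - 1))
```

In ADF this role is typically played by the activity's built-in retry policy; a wrapper like this is the equivalent pattern inside custom Python activities or Databricks jobs.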

Education

Bachelor’s in Information Technology at Monad University
January 11, 2030 - January 1, 2016


Industry Experience

Healthcare, Financial Services, Professional Services, Government, Software & Internet, Media & Entertainment, Other, Education