Context
Complex platform with multiple backend services, APIs, and a web UI. Frequent releases and cross-service dependencies created regression risk.
Goal
Increase release confidence and reduce production issues by validating user journeys end-to-end and ensuring backend service interactions remained consistent across environments.
What I did
Defined a practical QA strategy: smoke, regression, integration checkpoints, and risk-based coverage.
Validated API contracts and service interactions to catch integration failures early.
Drove end-to-end testing of key business flows (auth, permissions/roles, core CRUD, data validations).
Built repeatable defect reporting: severity, impact, reproduction steps, and environment details.
Worked closely with engineering/DevOps to align QA gates with CI/CD and reduce “last-minute QA”.
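The contract validation mentioned above can be sketched as a minimal type-and-presence check against an agreed response shape. This is a hedged illustration, not the actual system: the contract fields, endpoint payload, and `validate_contract` helper are all hypothetical.

```python
# Minimal contract check: assert a service response matches the agreed shape.
# The fields and types here are illustrative, not from a real system.

EXPECTED_CONTRACT = {
    "id": int,
    "email": str,
    "roles": list,
    "is_active": bool,
}

def validate_contract(payload: dict, contract: dict) -> list:
    """Return a list of contract violations (empty list means the payload conforms)."""
    errors = []
    for field, expected_type in contract.items():
        if field not in payload:
            errors.append(f"missing field: {field}")
        elif not isinstance(payload[field], expected_type):
            errors.append(
                f"{field}: expected {expected_type.__name__}, "
                f"got {type(payload[field]).__name__}"
            )
    return errors

# Example: a response with a wrong type is flagged before it breaks a consumer.
response = {"id": "42", "email": "user@example.com", "roles": ["admin"], "is_active": True}
print(validate_contract(response, EXPECTED_CONTRACT))  # ['id: expected int, got str']
```

A check like this runs as an integration checkpoint in CI, so a service that silently changes a field's type fails the build instead of a downstream consumer.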
Impact
Higher stability during releases and fewer regressions in critical flows.
Faster triage due to clearer bug reports and structured evidence.
Improved predictability of deployments through QA aligned with delivery pipelines.
Context
Web application with evolving requirements and recurring regression issues due to fast iteration.
Goal
Create a scalable approach to test design and automation that improves traceability between product requirements and executable tests.
What I did
Converted user flows and acceptance criteria into Gherkin scenarios (Given/When/Then).
Organized scenarios into a “testbook” structure for review/approval before implementation.
Implemented automation with maintainability in mind: stable selectors, reusable utilities, clear naming, and consistent structure.
Added coverage around edge cases: invalid inputs, permissions, error states, and cross-browser/responsive behavior where relevant.
Produced test evidence and reporting that stakeholders could understand without digging into raw logs.
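The Gherkin-to-automation flow described above can be sketched as follows. The scenario, role names, and step helpers are hypothetical stand-ins for real UI-driver calls (e.g. Playwright or Selenium); the point is the one-to-one mapping from reviewed scenario steps to reusable, clearly named helpers.

```python
# Scenario (reviewed in the "testbook" before implementation):
#
#   Scenario: Viewer cannot delete a record
#     Given I am logged in as a user with the "viewer" role
#     When I open the record detail page
#     Then the "Delete" button is not shown
#
# The functions below are stubs standing in for real UI-driver calls.

def login_as(role: str) -> dict:
    # Stand-in for an auth helper; returns the session context.
    return {"role": role}

def visible_actions(session: dict, page: str) -> set:
    # Stand-in for a page object that reads the rendered action buttons.
    actions = {"Export"}
    if session["role"] == "admin":
        actions |= {"Delete", "Edit"}
    return actions

def test_viewer_cannot_delete_record():
    session = login_as("viewer")                         # Given
    actions = visible_actions(session, "record-detail")  # When
    assert "Delete" not in actions                       # Then

test_viewer_cannot_delete_record()
```

Because each Then clause maps to a single assertion, a failing test points directly back to the acceptance criterion it encodes.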
Impact
Clearer alignment between product expectations and QA deliverables.
More reliable regression coverage and faster feedback cycles.
Reduced ambiguity: scenarios became a shared language across QA, product, and engineering.
Context
Microservices platform with frequent releases and a need for faster, safer deployments. Manual QA cycles were slowing down delivery and occasionally missing regressions.
Goal
Shift testing left by embedding automated quality checks directly into the CI/CD pipeline, reducing reliance on manual verification before releases.
What I did
- Configured automated test suites to run on every pull request and merge to main branches.
- Implemented quality gates that blocked deployments when critical tests failed.
- Optimized test execution time through parallelization and selective test runs based on changed files.
- Set up test reporting and notifications (Slack, GitHub comments) so developers could act on failures immediately.
- Collaborated with DevOps to ensure test environments were consistent and reliable (Docker, Kubernetes).
Impact
- Faster feedback: developers knew within minutes if their changes broke something.
- Reduced production incidents by catching regressions before merge.
- Increased deployment frequency: teams trusted the pipeline to validate releases.
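The selective-test-run idea can be sketched as a mapping from changed files (e.g. the output of `git diff --name-only`) to the suites that must run. The directory layout, `TEST_MAP` entries, and always-run smoke suite below are illustrative assumptions, not a real project structure.

```python
# Sketch of selective test runs: map changed file paths to test suites.
# Paths and mapping rules are illustrative, not a real project layout.

TEST_MAP = {
    "services/auth/": ["tests/auth/", "tests/e2e/login/"],
    "services/billing/": ["tests/billing/"],
    "web/": ["tests/ui/"],
}
CRITICAL_SUITES = ["tests/smoke/"]  # always run; acts as the quality gate

def select_suites(changed_files: list) -> list:
    """Pick test suites for a change set; unknown paths trigger a full run."""
    suites = set(CRITICAL_SUITES)
    for path in changed_files:
        matches = [tests for prefix, tests in TEST_MAP.items() if path.startswith(prefix)]
        if not matches:
            return ["tests/"]  # unmapped change: be safe, run everything
        for tests in matches:
            suites.update(tests)
    return sorted(suites)

print(select_suites(["services/auth/api.py", "web/app.tsx"]))
# ['tests/auth/', 'tests/e2e/login/', 'tests/smoke/', 'tests/ui/']
```

Falling back to a full run on unmapped paths keeps the optimization safe: speed is only gained where the mapping is known to be correct.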
Context
B2B data analytics platform (Goliath) processing large-scale datasets for credit risk assessment. Data accuracy was business-critical: errors could lead to wrong financial decisions for clients.
Goal
Ensure data integrity throughout the ETL pipeline: from raw file ingestion to final processed outputs used in client dashboards and reports.
What I did
- Validated data transformations at each stage of the ETL pipeline (extraction, cleaning, aggregation).
- Built automated data quality checks: null detection, duplicate identification, schema validation, and referential integrity.
- Tested edge cases with malformed files, unexpected formats, and boundary values in large datasets.
- Collaborated with data engineers to define data contracts and validation rules.
- Created monitoring dashboards to track data quality metrics over time (GCP, BigQuery).
Impact
- Reduced data-related incidents reaching production.
- Increased client confidence in platform outputs for critical financial decisions.
- Faster detection of data pipeline issues through automated monitoring.
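The automated data-quality checks described above (null detection, duplicate identification, schema validation) can be sketched over rows represented as dicts. The column names and rules are illustrative assumptions, not the actual Goliath schema.

```python
# Minimal sketch of automated data-quality checks over row dicts.
# Column names and rules are illustrative, not a real schema.

SCHEMA = {"account_id": int, "balance": float, "currency": str}

def quality_report(rows: list) -> dict:
    issues = {"nulls": 0, "duplicates": 0, "schema": 0}
    seen_ids = set()
    for row in rows:
        for col, expected_type in SCHEMA.items():
            value = row.get(col)
            if value is None:
                issues["nulls"] += 1          # missing or null cell
            elif not isinstance(value, expected_type):
                issues["schema"] += 1         # wrong type for the column
        key = row.get("account_id")
        if key in seen_ids:
            issues["duplicates"] += 1         # same account appears twice
        seen_ids.add(key)
    return issues

rows = [
    {"account_id": 1, "balance": 10.0, "currency": "EUR"},
    {"account_id": 1, "balance": 12.5, "currency": "EUR"},   # duplicate id
    {"account_id": 2, "balance": None, "currency": "USD"},   # null balance
]
print(quality_report(rows))  # {'nulls': 1, 'duplicates': 1, 'schema': 0}
```

Run at each pipeline stage and exported as metrics, counts like these feed the monitoring dashboards so quality drift is visible before it reaches client reports.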