3-Part Blog Series: Strategies for Testing – Part 3: Simplify Your Validation Summary Documentation

We are rolling out a 3-part blog series that discusses how to test your clinical automation system effectively and accurately. Subscribe here so you don’t miss an article.

Validating your rules in a clinical setting using actual specimens is the best way to exercise your auto-verification rules. Challenging your rule base with samples that represent your patient population and acuity provides a real-life, outcomes-based test. Yet this process is not widely understood, and it is often not undertaken in a way that appropriately approves rules for the system’s intended use.

In part 1, Practical Approaches to Risk-Based Automation System Testing of our 3-part Strategies for Testing blog series, we discussed how to apply risk analysis to your rules and workflows to optimize your testing efforts. In part 2 of our series, Build A Resilient Wet Testing Plan, we described how to create a risk-based validation protocol document and test plans that follow this strategic approach.

Now, in part 3, we will provide a framework for efficiently documenting your auto-verification rule testing outcomes. Documentation is necessary but often cumbersome and time-consuming. This blog focuses on striking the right balance: demonstrating adequate testing documentation without overwhelming the approvers with too much test data to review.

Validation Summary Report

At the conclusion of the user acceptance testing, a Validation Summary report should be created to communicate the testing outcomes to the stakeholders. The goal is to provide a concise overview of the entire validation effort in an easy-to-read, understandable document targeted for leadership review and approval. The information presented in the Validation Summary report should be consistent with the Validation Protocol and should conform to your approved laboratory procedures. It becomes the official signal to the team that all planned testing activities are now complete. The report should:

  • Mirror key sections of the Validation Protocol that describe the objectives, criteria, and strategy of the overall testing project.

  • Bring together the collected documentation, summarize the testing data, and list all issues encountered along with their related mitigation actions.

  • Make recommendations to management for the formal release of the auto-verification rules into live, production use.

Pro-Tip. Present the data in a logical manner so the reviewer can align your Validation Protocol against your Validation Summary findings. The use of tables helps summarize your data and observations.
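For example, a short results table (the test plan names and counts below are hypothetical placeholders) lets reviewers see the testing outcomes at a glance:

Test Plan                        Tests Run   Passed   Failed   Non-Conformances          Final Status
Chemistry auto-verification      40          40       0        0                         Passed
Hematology auto-verification     35          33       2        2 (resolved, re-tested)   Passed
Critical value hold workflow     12          12       0        0                         Passed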

Pro-Tip. Keep the report brief. It is not necessary to describe every step of the validation process. The Validation Protocol and Test Plans describe these in detail; there is no need to repeat this information here.

Collating Test Plan Data for the Validation Summary

The data documented and gathered when the test plans were executed should be appropriately aggregated for final review and approval. Following these best practices can serve as a final end-to-end check that the test plans were completed appropriately before you begin to prepare your Validation Summary report.

  • Use standard test plan templates that have clear data entry fields and pass/fail prompts.

  • Complete all test plan testing output in full, with no open prompts.

  • Obtain testing evidence through output reports or screen prints to document the key testing points.

  • Ensure that all testing evidence is captured according to the test plan. Where evidence was not collected as required by the test plan, document the reasons for the non-conformance.

  • Include any pertinent notes needed to remove ambiguity about the testing status and any issues associated with the testing (see the example note after this list).

  • Have testers date and sign all testing documents. Any added or updated information should be identified with new performance dates.

  • Store executed test plans, testing evidence, and other pertinent supporting information in a single secure storage location. The testing information should be available to all stakeholders and retrievable for inspection purposes.
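For example, a brief tester note such as the following (the scenario described is hypothetical) removes ambiguity about a deviation and its resolution:

"Step 9: the screen print of the delta-check hold could not be captured because the terminal session timed out; the outcome was documented from the LIS audit trail instead. The step was re-executed, passed, and is dated and initialed below."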

Validation Summary Report Elements

The Validation Summary report should include information that succinctly describes the testing program, its outcomes, and who performed it, and it should close by seeking a specific action, typically leadership approval to move to the go-live phase. A Validation Summary report generally includes elements that cover the validation project name/identifier, deliverables, overview, document references, summary, acceptance criteria, outcomes, and approvals. We’ll describe each of these and then show an example skeleton:

Project Name/Identifier – The name and identifiers used for the validation project should match across all documents, including the Validation Protocol and Test Plans.

Deliverables – An inventory of the deliverables (test plans, testing evidence, and other supporting materials) created during the validation effort, including any document identifiers.

Overview – A high-level summary (a table works well for this) of results obtained for each test plan including a status of the testing components as either passed or failed. Include a discussion of any non-conformance items, mitigation activities (how issues were resolved), and re-testing results. Describe all corrective actions or changes that were made to the system during the testing including any changes to the software, documentation, or protocols.

Be sure all stated issues have been mitigated and/or risk-assessed with a stated impact on the operation of the system. Do not leave any issue open without an appropriate discussion of mitigation and/or corrective action.

Document References – A list of the documents used to execute the validation, including the Validation Protocol, Test Plans, etc.

Validation Brief – Summarize the outcome of the collation of data in a few sentences. List outstanding issues and planned follow-up activities, if any, that may impact the use of auto-verification rules in production.

Acceptance Criteria – Provide a list of the validation testing criteria that were used to evaluate the results of the testing. For each acceptance criterion, provide the acceptance testing status (passed, failed, or not applicable) of the testing outputs.

Validation Outcomes – State a clear and concise conclusion of the project testing and whether the testing met the acceptance criteria. A statement should be included that indicates whether or not the auto-verification system or functionality tested is recommended for approval to move to production. If the auto-verification rules and associated software are not recommended for release to production, this section should describe the reasons why, as well as the conditions that need to be met for such a recommendation.

Document Approval – The author, reviewer(s), and approver(s) should sign and date the Validation Summary Report.
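Pulled together, a Validation Summary report skeleton might look like the outline below. The project name and identifier are hypothetical placeholders; adapt the section names and order to your laboratory's approved document templates:

Validation Summary Report – Core Chemistry Auto-Verification Rules (e.g., VAL-AV-001)
  1. Project Name/Identifier
  2. Deliverables (test plans, testing evidence, supporting materials, document identifiers)
  3. Overview (results table, non-conformances, mitigations, re-testing, system changes)
  4. Document References (Validation Protocol, Test Plans, related procedures)
  5. Validation Brief (outcome summary, outstanding issues, planned follow-up)
  6. Acceptance Criteria (each criterion with status: passed, failed, or not applicable)
  7. Validation Outcomes (conclusion and recommendation for production release)
  8. Document Approval (author, reviewer, and approver signatures and dates)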

Know Your Audience

In preparing your Validation Summary report, consider your audience. The reviewers of the Validation Summary report could be laboratory and/or quality management stakeholders. This report is the official culmination of your testing activities and will become an auditable document for internal compliance teams and outside accreditation organizations and regulatory agencies. These readers are interested in confirming that your testing was completed according to your original Validation Protocol and that your testing was comprehensive enough to validate your auto-verification rule software and workflow for the intended use.

Pro-Tip. Ensure your language is clear and concise. Avoid using descriptors or acronyms not described in your original Validation Protocol or Test Plan documents.

Key Takeaways

A Validation Summary report is a final check that your user acceptance wet testing is complete. When it is approved and signed, the Validation Summary report signals the official end of the testing phase. It lends confidence to the team, letting everyone know that the system has been vetted and the project is now ready to move to the go-live readiness phase.