LESSONS LEARNED FROM IMPLEMENTING THE FDA’S CSA GUIDANCE: PART 3 – SUMMARIZING

This is the last installment of a three-article series presenting lessons learned by our validation team while completing a pair of validation projects using the methodology in the FDA’s draft Computer Software Assurance for Production and Quality System Software guidance (CSA). When the validation effort reaches its summary stage, no more hardware or software modifications to the system are needed, testing is complete, and the system has been shown to perform its intended use. However, the project is not yet finished: the information gathered and the deliverables generated during the effort are neither in a format that internal/external auditors can efficiently review nor ready to undergo that scrutiny. The information used in the system’s validation is distributed across numerous deliverables and resides in an array of file types and formats. A well-written validation summary report not only states the conclusions supported by the validation deliverables but also provides a roadmap of the entire validation effort, permitting inspectors or auditors to understand the plan, its execution, and how the various activities and deliverables fulfilled the requirements of that plan. Summarizing the activities and deliverables is central to any validation effort, but it can be more challenging when the effort employs CSA methodology, because the drug manufacturer often does not have full control of the generation, execution, and storage of all the validation deliverables. This article identifies issues encountered during the summary phase of a validation using CSA methodologies and presents actions that can minimize them by increasing a reviewer’s understanding of how the validation effort’s execution met the needs specified in the validation plan.

Lesson 1: Perform Appropriate Review of All Validation Deliverables

Using supplier documentation during validation can introduce test methods or test-result storage and retrieval practices that do not fully adhere to a manufacturer’s validation procedures. Any supplier documentation used in the validation must be verified through the supplier’s approved quality management system (QMS), adhere to the draft CSA guidance, and meet the intent of the manufacturer’s procedures. If the supplier’s QMS is not acceptable to the manufacturer or the information is incomplete, the manufacturer must ensure that the supplier deliverables meet the minimum requirements through thorough inspection of each deliverable and, if needed, remediation. Completing these reviews during the validation effort is especially important when using supplier documentation as described in the CSA guidance, because there will be limited access to the supplier’s resources to resolve any questions about the deliverables afterward. Address any gaps, inconsistencies, or other flaws detected during this review rather than trying to cover them up; any shortcomings in the validation reporting will compromise the intent of the summary report and may cost goodwill with the reviewer/auditor. It is good practice to begin reviewing validation deliverables before the summary phase, preferably during the execution phase of the project, to minimize adverse impact to the project timeline from deliverable issues, especially delays exacerbated by the use of supplier documents.
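One way a team might keep this early review systematic is to track each deliverable’s source, QMS pedigree, completeness, and open findings in a simple inventory. The Python sketch below is only an illustration of that idea; the Deliverable fields, document IDs, and the needs_remediation rule are hypothetical examples, not part of the CSA guidance or any specific QMS.

```python
from dataclasses import dataclass, field

@dataclass
class Deliverable:
    """One validation deliverable, internal or supplier-supplied (hypothetical layout)."""
    doc_id: str
    source: str            # "internal" or "supplier"
    qms_verified: bool     # produced under a QMS acceptable to the manufacturer?
    complete: bool         # content meets the manufacturer's minimum requirements?
    gaps: list = field(default_factory=list)  # open findings from the review

def needs_remediation(d: Deliverable) -> bool:
    """Flag items whose QMS pedigree, completeness, or review findings fall short."""
    return (d.source == "supplier" and not d.qms_verified) or not d.complete or bool(d.gaps)

# Reviewing during the execution phase surfaces gaps before the summary phase.
inventory = [
    Deliverable("SUP-014", "supplier", qms_verified=True, complete=False,
                gaps=["missing test evidence for REQ-101"]),
    Deliverable("INT-003", "internal", qms_verified=True, complete=True),
]
for d in inventory:
    if needs_remediation(d):
        print(f"{d.doc_id}: remediate before summarizing ({'; '.join(d.gaps) or 'incomplete'})")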

Lesson 2: Avoid Complexities and Highlight Critical Information

The CSA approach often involves depending on suppliers for rigorous technical assessments of their own software systems, which may include code analysis, security evaluations, and quality assurance programs. The summary report should condense these technical assessments into a concise summary that is understandable to auditors with varying levels of technical expertise. Highlight critical validation activities, including verification of high-risk requirements, any deviations, and the impact and resolution of those deviations. Distill large, complex results into a focused report that conveys the reasoning behind the actions and the significance of the results. This approach gives reviewers an understanding of what the deliverables achieved and how they meet the actions defined in the validation plan. If tasks were performed outside the control of the manufacturer, highlight that the supplier’s QMS was assessed and found acceptable to the manufacturer. Writing a summary report for a CSA-driven project requires careful consideration of both the technical complexity involved and the needs of stakeholders, and it underscores the importance of communicating these results clearly and concisely.
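As a minimal sketch of this distillation, assuming deviation records with hypothetical IDs, risk ratings, impacts, and resolutions (the record layout is an invented example, not a prescribed format), one could present high-risk deviations in full while rolling low-risk items up into a single line with a pointer to the deviation log:

```python
# Raw deviation records as they might come from execution logs (hypothetical).
deviations = [
    {"id": "DEV-01", "risk": "high", "requirement": "REQ-101",
     "impact": "none after retest", "resolution": "supplier patch v2.1, re-verified"},
    {"id": "DEV-02", "risk": "low", "requirement": "REQ-230",
     "impact": "cosmetic", "resolution": "documented, no further action"},
]

# Lead with high-risk items in full; summarize low-risk items as a count.
high_risk = [d for d in deviations if d["risk"] == "high"]
low_risk_count = sum(1 for d in deviations if d["risk"] != "high")

for d in high_risk:
    print(f"{d['id']} ({d['requirement']}): impact: {d['impact']}; resolution: {d['resolution']}")
print(f"{low_risk_count} low-risk deviation(s) closed; see the deviation log for detail.")
```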

Lesson 3: Consistency in Terminology

Define any specialized terms or acronyms used in the report and use this terminology consistently throughout the body of the summary report. CSA has introduced several terms that did not previously exist in the computer system validation (CSV) world, such as critical thinking, ad hoc testing, and the preferential use of the word verification instead of validation. It is important to understand the terminology being used, especially as CSA methodology promotes new terms and the use of supplier testing in the verification process. The goal of verification is the same as it has always been, but the terminology from the CSA-inspired validation plan should be used when describing the validation activity rather than regressing to the comfortable terms used in earlier summary reports. Do not confuse readers by slipping between terms in the summary, such as interchanging installation qualification, installation verification, and user acceptance testing, or using verification and validation as synonyms in different parts of the report. It is hard to overstate the confusion created when interchanging these terms leads people away from a shared understanding of what was expected in the validation planning and how it was delivered during the validation execution. Always be clear in the summary report about how the requirements were verified during the validation project, and use consistent terminology throughout the validation to make that clarity easier to achieve.
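Terminology consistency can even be checked mechanically before review. The following sketch is one hedged illustration, assuming a hypothetical preferred-term map that would, in practice, mirror the definitions section of your own validation plan; the example terms and draft text are invented:

```python
import re

# Hypothetical map of off-plan terms to the terms the validation plan defines.
PREFERRED = {
    "installation qualification": "installation verification",
    "validated": "verified",  # where the plan names the activity verification
}

def lint_terminology(text: str):
    """Yield (line_number, off-plan term, planned term) for each occurrence."""
    for n, line in enumerate(text.splitlines(), start=1):
        for term, planned in PREFERRED.items():
            if re.search(rf"\b{re.escape(term)}\b", line, flags=re.IGNORECASE):
                yield n, term, planned

draft = "The installation qualification confirmed ...\nEach requirement was validated ..."
for n, term, planned in lint_terminology(draft):
    print(f"line {n}: use '{planned}' per the validation plan, not '{term}'")
```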

Lesson 4: Balance Detail and Brevity

Traditional summary reports are template-driven, reporting the results of qualification tests with no regard for supplier testing or other inputs that may document that a system is fit to perform its intended function. CSA methodology promotes using supplier-created tests and deliverables to verify much of the system’s capability to perform its intended use, so many more documents are available to support the system’s operation in the summary report. This increased volume of documentation makes it easy to pull in copious quantities of data, resulting in a bloated summary report that requires painstaking review to interpret. A CSA-based summary report should leverage the validation plan and the results of the risk assessment to focus on verifying that the system’s critical functions operate as intended; it should not spend inordinate time describing the verification of low-risk functions. As with any document, citing references for supporting information rather than restating the results in the summary is an excellent way to downsize an oversized summary report, but any reference used must be accurate and complete. Use these tips to provide enough detail about the verification process, especially for critical functions, to accurately convey the results of the validation effort while keeping the summary report concise and focused on the essential verification information stakeholders need.
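To make the detail-versus-brevity decision concrete, here is a minimal sketch of letting the risk assessment drive how much narrative each requirement receives; the requirement records, risk ratings, and evidence reference IDs are hypothetical examples rather than a prescribed structure:

```python
# Hypothetical traceability records linking requirements to risk and evidence.
requirements = [
    {"id": "REQ-101", "risk": "high", "result": "pass", "evidence": "OQ-07"},
    {"id": "REQ-230", "risk": "low",  "result": "pass", "evidence": "SUP-TR-12"},
]

for r in requirements:
    if r["risk"] == "high":
        # Critical functions: describe the verification fully in the report body.
        print(f"{r['id']}: verified as intended; full narrative with evidence {r['evidence']}")
    else:
        # Low-risk functions: a one-line result plus a reference keeps the report lean.
        print(f"{r['id']}: {r['result']} (see {r['evidence']})")
```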

A summary report is an extremely important part of the validation effort, as it is where most audits or reviews begin, and a good summary report is a useful tool for explaining the validation effort to people who did not participate in it. The challenge of distilling test results from various sources, with different complexities and formats, into a compact and understandable message makes a CSA-driven summary report different from the template-driven summary reports of traditional validations. CSA methodology replaces a standard report of IQ, OQ, and PQ results with a more challenging report that must tell the more complicated story of using supplier deliverables in addition to qualification testing to verify that the system works as intended. Implementing the suggestions in this article during the execution and summarizing phases of the validation effort will help produce a practical, concise summary report that meets the challenges of a CSA-driven validation effort by effectively presenting and substantiating the conclusions of the verification activities without oversimplifying the actions or omitting critical information.