LESSONS LEARNED FROM IMPLEMENTING THE FDA’S CSA GUIDANCE: PART 2 – EXECUTING

This is the second in a series of three articles presenting lessons our validation team learned while completing a pair of validation projects that used the FDA’s draft of the Computer Software Assurance for Production and Quality System Software Guidance (CSA). The number of resources, the amount of work being performed, and the number of completed deliverables are at their highest levels during the executing phase. Because of this, the executing phase is where most issues occur during the validation effort, significantly impacting project cost and schedule. This article presents actions that can prevent, or at least reduce the impact of, issues that arise when implementing CSA methodologies during the executing phase.

Lesson 1: Create Comprehensive Testing Plans and Protocols for All Verification Testing

The validation plan identifies the deliverables needed to complete the validation of the systems, but the test plans and protocols must contain the information and tests that meet the verification requirements defined in that plan. One problematic area introduced when using the CSA guidance is the use of vendor-generated documents to verify low-risk requirements. Vendor test documents are usually created outside the scope of the validation effort, and they often do not include documented test expectations and results, issue identification and resolution, or the identity of the test executor and the date of test execution. The CSA guidance requires this information in the test documentation when the test results are used to verify the system’s operation. Failure to record these items during testing can delay project completion because the inadequately documented tests must be re-executed. Assess a vendor’s quality program and procedures to confirm that their deliverables satisfy the guidance’s requirements for appropriate records. If the vendor tests are reviewed and do not satisfy these requirements, it may be possible to modify the tests before they are executed: adding a space for initials/dates, referencing the resolution of an issue found, or stating the objective of the test more clearly. A proactive approach to upgrading vendor test documents can eliminate the re-execution of vendor tests that do not originally meet the CSA documentation requirements for verifying GxP records.
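To make that record-completeness check concrete, the following Python sketch models the elements the CSA guidance expects in an acceptable test record. The class name, field names, and record ID below are hypothetical, chosen only for illustration; they would map to whatever fields the vendor’s actual test forms capture.

    from dataclasses import dataclass, field
    from typing import Optional

    # Hypothetical model of the elements an acceptable test record
    # should contain; field names are illustrative, not prescribed.
    @dataclass
    class VendorTestRecord:
        test_id: str
        objective: str                           # what the test verifies
        expected_result: Optional[str] = None    # documented expectation
        actual_result: Optional[str] = None      # documented outcome
        issues: list = field(default_factory=list)  # (issue, resolution) pairs
        executed_by: Optional[str] = None        # tester initials/signature
        executed_on: Optional[str] = None        # date of execution

        def gaps(self) -> list:
            """Return the missing elements that would force re-execution."""
            missing = []
            if not self.expected_result:
                missing.append("documented expected result")
            if not self.actual_result:
                missing.append("documented actual result")
            if any(resolution is None for _, resolution in self.issues):
                missing.append("unresolved issue")
            if not (self.executed_by and self.executed_on):
                missing.append("executor signature/date")
            return missing

    record = VendorTestRecord("FAT-014", "Verify alarm on high pressure")
    print(record.gaps())  # every element still missing from this record

Running a check like gaps() across a vendor’s draft test documents before execution flags exactly which forms need an added initials/date field or an issue-resolution reference.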

Requirements that are medium or high risk are usually verified using more robust protocols that are created using GxP-compliant processes and procedures and that are reviewed and approved by the drug manufacturer instead of the vendor. These test results usually contain all the elements listed in the CSA guidance for an acceptable test record. When creating these documents, a good practice is to include a section in the protocols that identifies the requirements already verified by completed testing that do not require further verification. For example, Site Acceptance Testing (SAT) can reference Factory Acceptance Testing (FAT) or development tests that verify low-risk requirements, and Installation and Operational Qualifications can reference SATs or FATs in the same way. Approval of protocols containing these sections provides evidence that the Validation and Quality Assurance groups accepted these results before moving to the next testing stage. Following these practices prevents excessive and unneeded testing, and it prevents surprises to project stakeholders when requirements and test results are reviewed later in the project.
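As an illustration, the bookkeeping behind such a cross-reference section can be sketched in a few lines of Python; the requirement IDs, document names, and field layout below are hypothetical.

    # Hypothetical requirement records: (requirement ID, risk level,
    # document that already verified it, or None if still unverified).
    requirements = [
        ("REQ-001", "low",    "FAT-2023-01"),
        ("REQ-002", "medium", "SAT-2023-02"),
        ("REQ-003", "high",   None),
        ("REQ-004", "low",    "DEV-TEST-07"),
    ]

    def previously_verified(reqs):
        """Requirements the protocol can cite as already verified."""
        return [(rid, doc) for rid, _, doc in reqs if doc is not None]

    def still_open(reqs):
        """Requirements the current protocol must actually test."""
        return [rid for rid, _, doc in reqs if doc is None]

    # Protocol section listing prior verifications for approval:
    for rid, doc in previously_verified(requirements):
        print(f"{rid}: verified by {doc}; no further testing required")

The previously_verified list becomes the protocol section that Validation and Quality Assurance approve, while still_open defines the tests the protocol must actually execute.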

Lesson 2: Closely Monitor All Testing

Remember that many vendors are accustomed to performing their development and factory testing with little oversight, informal test result recording, and nominal documentation of issues and their resolutions. Formally documenting their test results, recording the issues they encounter (and their resolutions), and signing and dating all data entries are often new practices. Even after attending a training session explaining the new testing and documentation process, vendors can fall back into old habits, especially when dealing with issues and deviations. This can result in delivered test results and documentation that do not meet good documentation practice (GDP) requirements for use in the validation effort. Implementing continuous monitoring mechanisms to detect and address issues promptly during execution can minimize the number of unacceptable test results delivered by vendor-performed testing. It is particularly important to monitor the vendor deliverables during final development tests and FAT, when the vendor is focused on getting the system approved for shipment rather than on generating acceptable validation documentation.

One method of providing this additional oversight is to hold short status meetings during vendor testing to answer questions and reinforce test requirements. Reviewing a sampling of test results during these meetings and presenting examples of properly documented test results, especially early in the testing, can verify adherence to the protocols, provide a visual model of properly recorded results, and prevent having to correct many documented test results later in the project. Those corrections add a surprising amount of time and resources to the later stages of the project, so it is good practice to review test results early, before they cause issues later. This type of oversight also expedites issue resolution early in the project, as it discourages the past vendor practice of noting many issues detected during development testing and even FAT but not completing the repairs until SAT or qualification testing. That practice was acceptable pre-CSA, when most companies only accepted qualification test results during their validation effort, but it is a problem when the results from development testing and FAT are used as appropriate records in the system validation.
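One way to make that sampling repeatable is to script it. The sketch below assumes a hypothetical list of executed test record IDs and deliberately weights the sample toward the earliest records, so documentation habits are corrected before they spread.

    import random

    # Hypothetical IDs for the vendor test records executed so far.
    executed = [f"FAT-{n:03d}" for n in range(1, 41)]

    # Always review the earliest records plus a random sample of the
    # rest at each status meeting.
    sample = executed[:5] + random.sample(executed[5:], k=5)
    print("Records to review for GDP compliance:", sample)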

Lesson 3: Communication and Collaboration

Validation plans that follow existing validation methodologies require effective communication and considerable flexibility during their execution. Employing new practices from the CSA guidance magnifies the importance of these two tools across the project’s life. Hold meetings that focus on issues, especially those caused by adherence to the CSA guidance. Promote openness in the team, as some members may feel uncomfortable with the changes and need reinforcement to perform as desired. Foster a collaborative approach to deviation and issue resolution during these meetings as well. Establish clear communication channels for reporting and addressing issues, providing project updates, and identifying the proper resources for completing tasks. These actions help maintain the project timeline by fast-tracking issue resolution, keeping team members informed, and increasing execution efficiency. A good practice is to create a requirements traceability matrix early in the project and update it during each validation step. The matrix identifies each requirement’s risk, its associated testing rigor, and the document that verifies the test results for the requirement; it also visually documents project progress, identifies any changes in the deliverables, and minimizes the gaps discovered when the project is nearing completion. The ability to swiftly adjust the validation in response to these findings minimizes the impact on the test plans and schedule.
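A minimal sketch of such a traceability matrix, assuming hypothetical requirement IDs and column names, might look like this in Python:

    # Hypothetical requirements traceability matrix, updated at each
    # validation step; column names are illustrative.
    rtm = [
        {"req": "REQ-001", "risk": "low",    "rigor": "vendor test",
         "verified_by": "FAT-2023-01", "status": "complete"},
        {"req": "REQ-002", "risk": "medium", "rigor": "scripted SAT",
         "verified_by": "SAT-2023-02", "status": "complete"},
        {"req": "REQ-003", "risk": "high",   "rigor": "scripted OQ",
         "verified_by": None,          "status": "open"},
    ]

    def gap_report(matrix):
        """List requirements still lacking verification evidence."""
        return [row["req"] for row in matrix if row["verified_by"] is None]

    print("Open requirements:", gap_report(rtm))  # -> ['REQ-003']

Updating the verified_by and status columns after each testing stage keeps the gap report current, so open requirements surface long before the project nears completion.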

The executing phase will always be the most unpredictable phase of a validation project, and implementing the CSA guidance exposes many resources to a different philosophy of validating software than they are accustomed to, which can make testing even more unpredictable. Implementing the actions suggested in this article can remove some of this unpredictability from the executing phase by correcting issues early in the project to reduce later rework. The executing phase deliverables are extremely important because they provide any regulatory or future reviews with documented evidence of how the system was validated. Validation results must be clear, concise, and complete, and must verify that the system requirements are met, no matter how much testing and retesting is involved, so it makes sense to set up the project to do it right the first time.