Medical Device Software Validation and Verification

For medical device software validation and verification, a good starting point is the

US FDA Definition of Validation

Validation is “establishing documented evidence which provides a high degree of assurance that a specific process will consistently produce a product meeting predetermined specifications and quality attributes”.

 

Categories of non-conformance related to medical device software.

A useful measure of software validation problems is to review the US FDA-issued 483s related to medical device software validation and verification. The top 10 categories are as follows:

Extract page from “Medical Device Validation” presentation



1 – Testing and/or qualification
2 – Development methodology
3 – Validation methodology and planning
4 – Change control and management
5 – Quality assurance and auditing
6 – Operating procedures
7 – Security
8 – Hardware, equipment, records and maintenance
9 – Device manufacturing
10 – Training, education and experience

 

Validation and verification from a quality perspective.

All regulations related to validation are based on a premise of good quality practices. Quality practices must keep pace with changes occurring in the industry, and software validation is good business practice. For current Good Manufacturing Practices …
 

Medical Device Validation, information & training presentation >>>
Software Validation, information and training presentation >>>
 

GAMP – Good Automated Manufacturing Practice. How should GAMP be used?

i) As part of an effective and compliant quality management procedure.
ii) As the basis for efficient, complete contracts when engaging third-party vendors.
iii) To build auditing and checking procedures that ensure compliance with best practice.
iv) As a reference for policies and procedures.
v) In the education and training of staff, whether they are involved in validating or verifying the software or in using it.

 

What are the various problem areas that arise in a validation?

Common problem areas include:

• Missing or incomplete validation plans
• Poorly drafted system requirements and design documents
• An inability to trace requirements through the development life cycle
• Poor configuration management of the software or of the validation itself
• Procedures and manuals that lack detail or omit key requirements
• Incomplete test cases and test protocols

 

The Validation Plan.

We need to draft a validation plan. The validation plan will define who is responsible, i.e. it will define the validation team. The plan will also define the computer or software system; normally a flow diagram of the current and proposed systems will be created. The functional requirements will be defined and the software development methodology outlined. The plan will then address the system specifications: are the proposed specifications adequate to meet the functional requirements, and will the level of documentation be acceptable? What will the test process be, and how will document, software and hardware change control be achieved? What is the agreed configuration management? Finally, what is the current standard operating procedure for software validation, and how does it define the procedures for moving from the “old system” to the “new system”?

 

The standard operating procedure for validation needs to consider aspects such as:

• How the system operates
• The backup procedure
• How change control will be implemented
• The process for handling source code
• The frequency and extent of any revalidation
• How hardware maintenance will be carried out
• The process for software maintenance and enhancements
• The process for raw data management
• The identification and implementation of training needs
• The standard operating procedure (SOP) for system crash and recovery
• How security will be implemented, and the compliance requirements (e.g. 21 CFR Part 11)

Note: The FDA’s 21 CFR Part 11 requires special attention and has been the focus of much discussion, especially on questions such as hybrid systems, non-durable storage media (e.g. flash memory), recording of audio data, etc.

 

Looking at software security, you need to consider:

• Access to the system
• Who can implement data changes
• What is the back-up procedure
• Frequency, duration and extent of archiving
• System failure and recovery practices
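The access-control and audit-trail aspects of this checklist can be sketched in code. The example below is purely illustrative, not a 21 CFR Part 11 compliant implementation: the class, users and record names are all hypothetical.

```python
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import List

@dataclass
class AuditEntry:
    timestamp: str
    user: str
    action: str
    detail: str

class SecureRecordStore:
    """Accepts data changes only from authorised users and logs every attempt."""

    def __init__(self, authorised_users: set):
        self._authorised = set(authorised_users)
        self._records = {}
        self.audit_trail: List[AuditEntry] = []

    def _log(self, user: str, action: str, detail: str) -> None:
        self.audit_trail.append(AuditEntry(
            datetime.now(timezone.utc).isoformat(), user, action, detail))

    def change_data(self, user: str, key: str, value) -> None:
        if user not in self._authorised:
            self._log(user, "DENIED", f"attempted change to {key}")
            raise PermissionError(f"{user} is not authorised to change data")
        old = self._records.get(key)
        self._records[key] = value
        self._log(user, "CHANGE", f"{key}: {old!r} -> {value!r}")

store = SecureRecordStore({"alice"})
store.change_data("alice", "batch_42_result", 7.2)
try:
    store.change_data("bob", "batch_42_result", 9.9)  # unauthorised user
except PermissionError:
    pass
print(len(store.audit_trail))  # 2 entries: one change, one denied attempt
```

The key design point is that denied attempts are logged as well as successful changes, so the audit trail supports failure investigation, not just change history.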

 

Vendor Supplied Software.

Looking at vendor-supplied software, there will be a need to audit any potential vendor. You need to consider their software development SOPs. What programming standards do they apply? How effective is their software quality assurance program, and what issues have arisen during your own audit? There will also be a need to consider the vendor’s history. What has been their market history, are you aware of the experiences of their previous customers, and is the profile of their previous customers similar to your own?

 

Software or system quality assurance encompasses:

• Analysis, design, coding, and testing methods and tools
• Formal technical reviews applied during each software engineering step
• A testing strategy
• Control of software documentation and changes
• Procedures to assure compliance with software development standards
• Measurement and reporting mechanisms.

 

Audit Questions: There is a range of potential audit questions designed to assess how the firm develops, controls and changes its software:

• How does/did the firm identify and manage the many existing versions of a program and its documentation in a manner that will enable changes to be accommodated efficiently?
• How does the firm control change before and after software is released to a customer?
• What are the company’s overall quality assurance SOPs that are used to generate the computer system SOPs?
• When were the SOPs for the software engineering process developed?
• Which SOPs, or versions of the SOPs, were in effect at the time of software development for this system?
• Who was or is responsible for approving and prioritizing changes?
• How did the firm ensure that changes have been made properly?
• What mechanism was used to inform others of changes that were made?

 

When considering potential software vendors you need to:

• Look at all manufacturing processes.
• Identify the critical processes.
• Develop a plan for validation
– New and old systems
– Set priorities
• Establish a vendor audit program
– Have specific requirements based on the type of product purchased.
• Have good documentation and rationale for validation practices
– Development (who is on the team?)
– Implementation
– Level of testing
• Be proactive – don’t wait for the FDA to find problems.

 

Simplified definition of requirements

• The Requirements section defines what the software is supposed to do and the environment in which it is supposed to work.
• The requirements describe, in detail, WHAT the software will do – not HOW it will do it.
• Around 60% of software defects are attributable to incorrect requirement specifications.

 

User requirements specification.

This describes what the equipment or system is supposed to do and, as such, is normally written by the client. An initial version of the URS may be included with the Invitation To Tender (ITT) sent to potential suppliers. This version should include all essential requirements (musts) and, if possible, a prioritized set of desirable requirements (wants).

 

Functional specification.

The Functional Specification is a description of the product to be supplied in terms of the functions it will perform and the facilities required to meet the user requirements as defined in the URS. The Functional Specification should be written in such a way that it is understood by both supplier and customer. This document is the controlling specification against which the system will be tested.

 

Hardware design specification.

The Hardware Design Specification is a description of the hardware on which the software resides and how it is to be connected to any existing system or plant equipment.

 

Software design specification.

The Software Design Specification is a description of the software components and sub-systems to be provided as part of the product.

If there is only one module, the Design Specification should contain enough information to enable the code to be produced. In this case the module design specification, test specification and integration test specification are not required.

 

Software Module Design Specification.

For each software sub-system (module) identified in the Software Design Specification, a Software Module Design Specification should be produced.

The Software Module Design Specification should contain enough information to enable coding of the module to proceed.

 

Coding.

Coding consists of the actual writing of the code for the program. The programmer works in a continuous cycle of writing and testing; this testing is part of writing the code, not of validating it. The programmer’s testing IS NOT the testing used for validation.

Application software production
The following should be considered in each implementation activity:

– Where possible, appropriate implementation methodologies and tools should be used to formalize the production process. The use of these methods and tools should be documented.
– Rules and conventions such as programming rules, programming languages, consistent naming conventions, coding and commentary rules should be formally specified and observed.
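The rules above can be made concrete with a short example. The function below is hypothetical (the sensor, names and calibration values are invented), but it illustrates formally specified conventions in practice: a descriptive name, type hints, named constants as parameters and a documented contract.

```python
# Illustrative only: this function and its calibration values are invented
# to demonstrate naming, typing and commentary conventions, not taken from
# any real device codebase.

def convert_adc_counts_to_mmhg(adc_counts: int,
                               counts_per_mmhg: float = 12.5,
                               offset_counts: int = 100) -> float:
    """Convert a raw ADC reading to a pressure in mmHg.

    Args:
        adc_counts: raw value read from the pressure sensor ADC.
        counts_per_mmhg: calibration gain (counts per mmHg).
        offset_counts: zero-pressure offset of the sensor.
    """
    return (adc_counts - offset_counts) / counts_per_mmhg

print(convert_adc_counts_to_mmhg(350))  # (350 - 100) / 12.5 = 20.0
```

Conventions like these pay off at validation time: a reviewer can check the code against the module design specification line by line.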

 

Testing.

Testing includes:
– Tests of each module
– Tests of groups of modules
– Tests of the whole program
– Finally, tests of the system as it will be used

Testing may reveal flaws in the code, the design and/or the requirements.

 

Software module test specification.

For each Software Module Design Specification, an associated Software Module Test Specification should be produced. The Software Module tests to be carried out should ensure that the software module meets its specification.
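A module test specification typically maps each specified behaviour to an executable test case. The sketch below assumes a hypothetical module (a bolus dose calculator); both the function and the test cases are invented to show the shape of such a specification, using Python’s standard `unittest` framework.

```python
import unittest

# Hypothetical module-under-test, invented for illustration.
def bolus_volume_ml(dose_mg: float, concentration_mg_per_ml: float) -> float:
    """Return the volume (ml) needed to deliver dose_mg at the given concentration."""
    if concentration_mg_per_ml <= 0:
        raise ValueError("concentration must be positive")
    return dose_mg / concentration_mg_per_ml

class TestBolusVolume(unittest.TestCase):
    def test_nominal_dose(self):
        # Spec clause: 50 mg at 10 mg/ml delivers 5 ml.
        self.assertAlmostEqual(bolus_volume_ml(50, 10), 5.0)

    def test_zero_concentration_rejected(self):
        # Spec clause: invalid concentration must raise, never return a volume.
        with self.assertRaises(ValueError):
            bolus_volume_ml(50, 0)

suite = unittest.defaultTestLoader.loadTestsFromTestCase(TestBolusVolume)
result = unittest.TextTestRunner(verbosity=0).run(suite)
print(result.testsRun, len(result.failures))  # 2 0
```

Note that each test case cites the specification clause it verifies; that comment is what later allows a reviewer to confirm the module meets its specification.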

 

Software integration test specification.

The Software Integration Test Specification defines those tests which demonstrate that all software modules communicate with each other correctly and that the software system meets its design specification. A Software Integration Test Specification should be produced where more than one software module has been produced.

 

Hardware acceptance test specification.

The Hardware Acceptance Test Specification details those tests to be carried out on the hardware described in the Hardware Design Specification. These tests should ensure that the hardware to be supplied meets its specification and integrates correctly with any existing computer hardware or plant equipment.

 

System acceptance test specification.

The System Acceptance Test Specification is a description of those tests to be carried out to permit acceptance of the system by the user. Typically it should address the following:
– System functionality.
– System performance.
– Critical parameters.
– Operating procedures.

 

Verify requirement and design specifications:

• How does the firm trace the specifications through every stage of the life cycle? Are they using a traceability matrix?
• Are all of the specifications mapped?
• Have control-flow diagram and data-flow diagram analyses been done to assure that the requirement specifications are complete?
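The traceability-matrix check behind these questions is mechanical and easy to sketch. In the example below the requirement IDs, design element IDs and test case IDs are all hypothetical; the point is simply that unmapped requirements fall out as an auditable gap.

```python
# Hypothetical requirement, design and test identifiers for illustration.
requirements = {
    "REQ-001": "Measure pressure",
    "REQ-002": "Alarm on overpressure",
    "REQ-003": "Log all readings",
}

# Traceability matrix: requirement -> (design element, test case)
traceability = {
    "REQ-001": ("SDS-3.1", "TC-010"),
    "REQ-002": ("SDS-3.4", "TC-014"),
}

unmapped = sorted(set(requirements) - set(traceability))
print(unmapped)  # ['REQ-003'] - a gap the audit should flag
```

In practice the matrix is usually maintained in a requirements management tool or spreadsheet, but the completeness check is exactly this set difference.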

Verify that all functions were included for testing

• What are the critical functions?
• What tests were done?
– functional
– stress
– boundary
• What method was used to map functions to tests?

Verify that the testing steps to be performed were included

• Were the steps specific, complete and unambiguous so that they could be repeated?
• Was each expected result listed?
• Was there a place to write results and comments?
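These three properties of a test protocol step can be captured in a simple record. The sketch below is illustrative; the field names and example steps are hypothetical.

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class TestStep:
    """One repeatable protocol step: specific instruction, listed expected
    result, and a place to record the actual result and comments."""
    step_id: str
    instruction: str
    expected_result: str
    actual_result: str = ""        # completed during execution
    comments: str = ""
    passed: Optional[bool] = None  # None until evaluated

protocol: List[TestStep] = [
    TestStep("TP-05.1", "Power on the unit and wait 30 s",
             "Self-test completes; status LED turns green"),
    TestStep("TP-05.2", "Apply 100 mmHg reference pressure",
             "Display reads 100 +/- 2 mmHg"),
]

# During execution the tester records what actually happened:
protocol[0].actual_result = "Self-test OK, LED green"
protocol[0].passed = True

open_steps = sum(1 for step in protocol if step.passed is None)
print(open_steps)  # 1 step still awaiting execution
```

Because every step carries its own expected result and result field, the protocol can be repeated by a different tester and reviewed after the fact.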

Verify expected outputs and the evaluation criteria

• What are the critical outputs?
• Who determined that the outputs are acceptable when out of range? What was the justification?
• Who is responsible for modifying the software and what are the priorities?
• How were the test results reviewed?
• What steps were taken when errors were discovered?

 

Examples of validation and verification failures as identified during FDA audits:

Software validation FDA Device Quality System Regulations, non-compliance finding:
• Failure to establish and maintain procedures to control all documents that are required by 21 CFR 820.40, and failure to use authority checks to ensure that only authorized individuals can use the system and alter records, as required by 21 CFR 11.10(g). For example, engineering drawings for manufacturing equipment and devices are stored in AutoCAD form on a desktop computer. The storage device was not protected from unauthorized access and modification of the drawings.

Warning Letters related to FDA software validation & 21 CFR 11.
• Failure to maintain laboratory records to include complete data derived from all tests necessary to assure compliance with established specifications and standards. Specifically, your firm failed to maintain electronic files containing data secured in the course of tests from 20 HPLCs and 3 GLCs. Additionally, no investigation was conducted to determine the cause of missing data and no corrective measures were implemented to prevent the recurrence of this event.

• Failure to establish appropriate procedures to assure that computerized processing control systems and data storage systems are secured and managed to assure the integrity of processes and data that could affect conformance to specifications.

 

 

US FDA Quality System Regulations (QSR’s) >>>
How Medical Devices are Regulated in Australia >>>
Medical Device Risk Management >>>
Good Manufacturing Practices for Medical Devices >>>
Classification of Medical Devices in Europe >>>
Operational Risk Management >>>