Software Quality Assurance SOPs for Healthcare Manufacturers Second Edition
Steven R. Mallory
Interpharm /CRC Boca Raton London New York Washington, D.C.
Library of Congress Cataloging-in-Publication Data

Mallory, Steven R.
Software quality assurance SOPs for healthcare manufacturers / Steven R. Mallory.—2nd ed.
p. ; cm.
Companion v. to: Software development and quality assurance for the healthcare manufacturing industries.
Includes index.
ISBN 1-57491-135-X
1. Medical instruments and apparatus—Data processing—Quality control. 2. Pharmaceutical industry—Data processing—Quality control. 3. Computer software—Quality control.
[DNLM: 1. Software—standards. 2. Quality Control. 3. Software Design. 4. Software Validation. W 26.55.S6 M255sa 2002] I. Mallory, Steven R. Software development and quality assurance for the healthcare manufacturing industries. II. Title.
R856.6 .M35 2002
681′.761′0285—dc21
2002004293
This book contains information obtained from authentic and highly regarded sources. Reprinted material is quoted with permission, and sources are indicated. A wide variety of references are listed. Reasonable efforts have been made to publish reliable data and information, but the authors and the publisher cannot assume responsibility for the validity of all materials or for the consequences of their use.

Neither this book nor any part may be reproduced or transmitted in any form or by any means, electronic or mechanical, including photocopying, microfilming, and recording, or by any information storage or retrieval system, without prior permission in writing from the publisher. The consent of CRC Press LLC does not extend to copying for general distribution, for promotion, for creating new works, or for resale. Specific permission must be obtained in writing from CRC Press LLC for such copying. Direct all inquiries to CRC Press LLC, 2000 N.W. Corporate Blvd., Boca Raton, Florida 33431.

Trademark Notice: Product or corporate names may be trademarks or registered trademarks, and are used only for identification and explanation, without intent to infringe.
Visit the CRC Press Web site at www.crcpress.com

© 2002 by CRC Press LLC
Interpharm is an imprint of CRC Press LLC

No claim to original U.S. Government works
International Standard Book Number 1-57491-135-X
Library of Congress Card Number 2002004293
Printed in the United States of America  1 2 3 4 5 6 7 8 9 0
Printed on acid-free paper
CONTENTS

Dedication  vii
Preface  ix
Preface to the First Edition  xi
Introduction  xiii
Introduction to the First Edition  xv

The Software Quality Assurance Program: Methodology and Process Documents

SE-CMG  Software Engineering Configuration Management Guidelines  1
SE-CMP  Software Engineering Configuration Management Policies  23
SE-DCP  Software Engineering Design Control Policies  53
SE-SDG  Software Engineering Software Development Guidelines  61
SE-SDP  Software Engineering Software Development Policies  103
SE-SMG  Software Engineering Software Metrics Guidelines  155
SE-PCMP  Software Engineering Project Control and Management Policies  169
SE-PDP  Software Engineering Project Directory Policies  189
SE-VVG  Software Engineering Verification and Validation Guidelines  201
SE-VVP  Software Engineering Verification and Validation Policies  245

The Software Quality Assurance Program: Software Project Planning Documents

SCMP  Software Configuration Management Plan  293
SDP  Software Development Plan (without SCMP and SQAP)  317
     Software Development Plan (with SCMP and SQAP)  353
SDTP  Software Development Test Plan  397
SEAP  Software End-product Acceptance Plan  423
SQAP  Software Quality Assurance Plan  435
SVTP  Software Validation Test Plan  451
SVVP  Software Verification and Validation Plan  483

The Software Quality Assurance Program: Software Requirements and Design Documents

IDS  Interface Design Specification  523
RTM  Requirements Traceability Matrix  537
SADS  Software Architecture Design Specification  539
SDDS  Software Detailed Design Specification  551
SRS  Software Requirements Specification (without RTM)  571
     Software Requirements Specification (with RTM)  593

The Software Quality Assurance Program: Software Hazards and Safety Documents

SFMEA  Software Failure Modes and Effects Analysis  615
SFMECA  Software Failure Modes Effects Criticality Analysis  617

The Software Quality Assurance Program: Software Testing and Verification and Validation Documents

SAR  Software Anomaly Report  626
DTIS  Development Test Information Sheet  629
VTIS  Validation Test Information Sheet  630
SVTL  Software Validation Test Log  631
SVTPR  Software Validation Test Procedures  633
SVVR  Software Verification and Validation Report  647

The Software Quality Assurance Program: Software Configuration Management Documents

CRA  Change Request/Approval (CRA) Form  651
SCAR  Software Configuration Audit Report  653
SCSR  Software Configuration Status Report  655

The Software Quality Assurance Program: Software Process Improvement Document

STRC  [Project/Product Name] Software Team Report Card  657
DEDICATION
This book is dedicated to all of the healthcare industry software engineers, companies, and corporations, and to the software professional societies, that helped provide the inspiration to create an integrated whole.
PREFACE
I am very grateful for and flattered by the favorable reception of this work. In this second edition, I have made a large number of refinements and have included new material. A casual reader will notice little difference between this edition and the previous except for the organization of the material. Most of the document templates are fundamentally the same as in the previous edition, but they have been updated and brought in line with the Software Development and Quality Assurance for the Healthcare Manufacturing Industries text. In addition, at the request of some readers, several new templates have been included.

I thank all of you who told me that the information could be better organized and could provide a more centralized approach for reference. I hope that this new organization satisfies your requirements.

Steven R. Mallory
December, 2001
PREFACE TO THE FIRST EDITION
The difficulty with a project that encompasses a predefined and integrated approach to software engineering policies, Standard Operating Procedures (SOPs), guidelines, and project documentation is that you tend to think that it is not really needed. I have continually struggled with the notion that nearly every company already has formats in place for their software engineering policies, SOPs, and project documents; that nearly every company already has acceptable software project documentation in place; and that nearly every company already has software engineering policies, SOPs, and documents. However, since writing Software Development and Quality Assurance for the Healthcare Manufacturing Industries I have found that this is not usually the case.

The intent of Software Development and Quality Assurance for the Healthcare Manufacturing Industries was to present a logical, organized, and coordinated effort at defining a software quality assurance program for healthcare manufacturers. The approach was to discuss and build a general and generic framework in basic and fundamental terms that would allow the users and readers to construct their own software quality assurance programs and outfit them with details tailored to the culture of their company.

Since publication of that book, however, many individuals and client organizations have suggested that a more concrete approach to a healthcare software quality assurance program should be created, namely, the generation of a suite of example software engineering policies, SOPs, and project documents. Their recommendation was to provide a set of interrelated and interdependent policies that could be used verbatim or modified to some degree in order to reflect the particular software engineering environment within their company. In addition, they argued, examples of the software project documents based on those policies and SOPs would be valuable because the formats, content, and layouts could be used to compare and contrast with those that were already in use within their companies or used as templates by those organizations that do not have document standards.

The logical conclusion of their arguments and the logical extension of my first book is, then, this book, Software Quality Assurance SOPs for Healthcare Manufacturers. It presents the concrete formalization of the software engineering policies, SOPs, and project documents of the first book in a form that is useful to software engineers and managers, regulatory affairs personnel, and anyone interested in establishing a software quality assurance program from the grassroots level. This volume presents information based on the information in Software Development and Quality Assurance for the Healthcare Manufacturing Industries and, in fact, can be considered a companion volume to that work.

Steven R. Mallory
March, 1997
INTRODUCTION
This work represents an updated companion volume to my book Software Development and Quality Assurance for the Healthcare Manufacturing Industries and presents examples of software engineering policies, Standard Operating Procedures (SOPs), and project documents discussed in broad terms in that book. The reader is cautioned against interpreting the word "examples" too narrowly; the software policies, SOPs, and project documents presented here are usable as software quality assurance documentation. For example, my book provided guidance on the form, content, and intent of software development policies and left the details to the reader; this book presents a usable software development policy document. In addition, it presents usable documentation examples that directly support the specifics of the software development policy.

The emphasis of this work is on the same concrete, formal, useful, and practical approach to software quality assurance policies, SOPs, and project documentation presented in Software Development and Quality Assurance for the Healthcare Manufacturing Industries. The intent is to provide users with "fill-in-the-blank" or "search-and-replace" templates for software engineering policies, SOPs, and project documents. This work can provide not only guidance for generating documents and the content of those documents but also templates that can be used to establish the documentation for a software quality assurance program for healthcare manufacturers.

Readers can use the policies and guidelines as presented, modify them for use within their organizations, or use them to compare and contrast with any existing software engineering policies and SOPs within their organizations. They can also use the project documentation examples as presented or, again, modify them for use within their organizations. The content of the software project documentation as presented here satisfies the software policies and SOPs that are also presented within this volume. For an explanation of how to convert the specific information within the square brackets of these examples, please see the Introduction to the First Edition, below.
INTRODUCTION TO THE FIRST EDITION
This work represents a companion volume to my book Software Development and Quality Assurance for the Healthcare Manufacturing Industries and presents examples of software engineering policies, SOPs, and project documents discussed in broad terms in that book. The reader is cautioned against interpreting the word "examples" too narrowly; the software policies, SOPs, and project documents presented here are usable as software quality assurance documentation. For example, my earlier book provided guidance on the form, content, and intent of software development policies and left the details to the reader; this book presents a usable software development policy document. In addition, it presents usable documentation examples that directly support the specifics of the software development policy.

The emphasis is on the same concrete, formal, useful, and practical approach to software quality assurance policies, SOPs, and project documentation presented in Software Development and Quality Assurance for the Healthcare Manufacturing Industries. The intent, then, is to provide users with "fill-in-the-blank" or "search-and-replace" templates for software engineering policies, SOPs, and project documents. The hope is that the reader will understand that this work can provide not only guidance but templates that can be used to establish the documentation for a software quality assurance program for healthcare manufacturers. Readers can use the policies and guidelines as presented, modify them for use within their organizations, or use them to compare and contrast with any existing software engineering policies and SOPs within their organizations. Furthermore, they can also use the project documentation examples as presented or, again, modify them for use within their organizations. I stress that the content of the software project documentation as presented here satisfies the software policies and SOPs that are also presented in this volume.

A few words about the general formats of the information found here are in order. Each document in this work should be viewed as if it could stand alone as a complete, signed-off, and released document. Consequently, document cover pages contain the rudimentary details required to make the entire document usable within any document control system. For example, each document requires signatures from the document's writer, reviewer, and approver, as well as inclusion of their printed names and titles or positions. This information is indicated within the square brackets (i.e., [Name/Title/Position]). For the policy documents found here, a second approval signature is shown as required, and this is the location where the appropriate officer would sign for approval if necessary. Below the signature area is the document configuration identification area where the document identification, revision, and pagination counts are found. Users may want to edit this so that it appears at the foot of every page in the document. Below the document identification area is the revision history of the document; users may want to edit this information so that it appears as the second, stand-alone page of the document.
In general, information appearing within square brackets on any page of the enclosed documentation is expected to be provided by users or to be tailored in order to comply with any existing documentation formats within their organizations. As a further example, suppose that the software project document templates shown here are to be used on the "New Generation Model" project. A global search and replace of the character string "[project/product name]" with "New Generation Model" will convert these documents from generic templates into specific project-related documents.

In some instances, specific character strings may require more thought in order to convert them into their real-life equivalents. For example, the string "[project title/position]" appears in project-related documents and might refer to a project leader, software development leader, or software verification and validation leader, depending on the context of the discussion in which the string appears. The strings "[title/position]" or "[corporate title/position]" might refer to a software manager, a software quality assurance manager, or an appropriate vice president within the organization, depending on their context. Some discretion is required when the generic square brackets and their contents are replaced with the correct substitutions.

On the technical side, the same ground rules apply. For example, a software project document may indicate where additional task description sections are required. Users can then include the additional information in the outline provided within the document. However, if no additional information is required, users may delete any unused or unnecessary sections. In fact, if a document contains information that is not applicable to the project or product, it is reasonable to expect that users would eliminate unnecessary sections or information. Furthermore, it would be expected that the user would include any necessary, pertinent, and additional information that is not readily apparent in the included document templates, either in an existing section or by creating entirely new headings and sections.

The contents of this book are meant to form the basis of a software quality assurance program. The software policies and SOPs were written in such a way as to be meaningful in a regulatory environment and to convey the activities, tasks, responsibilities, and deliverables of good software engineering practices. The software documents here were written in such a way as to fully support and comply with the software policies and SOPs presented here. The software quality assurance documents can be used verbatim as fill-in-the-blank templates, as modifiable and customized templates where no such software quality assurance documentation exists, or as a set of standards that can be used to compare and contrast existing software quality assurance documentation.
SE-CMG SOFTWARE ENGINEERING CONFIGURATION MANAGEMENT GUIDELINES
Written by:
[Name/Title/Position]
Date
Reviewed by:
[Name/Title/Position]
Date
Approved by:
[Name/Title/Position]
Date
Document Number: [aaa]-SECMG-[#.#]    Revision: [#.#]    Page: 1 of [#]
REVISION HISTORY

Revision    Description               Date
[##.##]     [Revision description]    [mm/dd/yy]
CONTENTS

1.0  INTRODUCTION  3
2.0  SOFTWARE REPORTING PRACTICES  4
3.0  PROJECT DIRECTORY DEFINITION  6
4.0  CONFIGURATION MANAGEMENT RESPONSIBILITIES  7
APPENDIX A-1  Software Anomaly Report  16
APPENDIX A-2  Instructions for Completing Software Anomaly Report  17
APPENDIX B  Software Configuration Management Process Record of Deviation or Waiver Approval  18
APPENDIX C  Project Directory Structure  19
APPENDIX D  Matrix for Configuration Management Script Execution  20
GLOSSARY  21
1.0 INTRODUCTION
1.1 Purpose

This document has been developed to define consistent methods, standards, conventions, practices, and styles for software configuration management.

1.2 Scope

The guidelines in this document are to be used to promote consistency in the configuration management of all software. The identified software configuration manager will use this guide to establish common practices for software configuration management.

1.3 Overview

This document covers the creation of project directories, standard configuration management scripts, and the reporting of discrepancies from documented processes.
1.4 References

• Product Development Safety Design Guidelines, Revision [#.#], dated [date]
• Product Development User Interface Design Guidelines, Revision [#.#], dated [date]
• Software Engineering Development Guidelines, Revision [#.#], dated [date]
• Software Engineering Configuration Management Policies, Revision [#.#], dated [date]
• Software Engineering Software Development Policies, Revision [#.#], dated [date]
• Software Engineering Verification and Validation Guidelines, Revision [#.#], dated [date]
• Software Engineering Verification and Validation Policies, Revision [#.#], dated [date]
2.0 SOFTWARE REPORTING PRACTICES

2.1 Bug, Error, and Anomaly Reporting

The reporting and documenting of software configuration discrepancies is accomplished through the use of the Software Anomaly Report.
2.1.1 Software Anomaly Report

Problem reporting is initiated by the project configuration manager with a Software Anomaly Report, which identifies problems detected during software development activities. The specific information required on an anomaly report identifies how, when, and where the problem occurred and the impact of the problem on the system capability and on the continued conduct of verification and validation (V&V) phase activities. Appendixes A-1 and A-2 show, respectively, an example of the anomaly report and instructions for completing the report.
2.1.2 Anomaly Reporting and Resolution

The software configuration manager is responsible for ensuring the proper documentation and reporting of Software Anomaly Reports, and all anomalies are reported regardless of the perceived impact on software development or severity level with respect to the system operation. Unreported and unresolved problems can have a significant adverse impact in the later stages of the software development cycle, which may include little time for resolution. The projected impact of an anomaly is determined by evaluating the severity of its effect on the operation of the system. The severity of an anomaly report is defined as one of the following:

• High. The change is required to correct a condition that prevents or seriously degrades a system objective and no alternative exists, or to correct a safety-related problem.
• Medium. The change is required to correct a condition that degrades a system objective, to provide for performance improvement, or to confirm that the user and system requirements can be met.
• Low. The change is desirable to maintain the system, correct operator inconvenience, or other.

Resolution of any anomaly with a severity of "high" is required before the development effort proceeds to the next software development phase.
Software Anomaly Reports are reviewed by the software lead engineer of the project for anomaly validity, type, and severity, and the software lead engineer can direct additional investigation, if required, to assess the validity of the anomaly or the proposed solution. When an anomaly solution is approved and the personnel responsible for performing the corrective action are indicated, the software lead engineer will authorize implementation of the corrective action. The software V&V lead engineer is responsible for anomaly report closure, which includes documenting that corrective action has been taken and verifying the incorporation of authorized changes as described in the anomaly report. If the anomaly requires a change to a baselined configuration item, a Change Request/Approval (CRA) is prepared by a member of the software development team for the item(s) to be changed. A reference to applicable anomaly reports will be documented in the issued CRA.
2.2 Configuration Management Deviation or Waiver

Circumstances may require deviation(s) or waiver(s) from policy. A written request for a deviation is generated by the project configuration manager in advance of a future activity, event, or product in order that software engineering (SE) management be made aware of the project personnel's intention to employ a higher risk development approach. A written request for a waiver is generated by the project configuration manager in those cases where the activity, event, or product has already been initiated. The deviations or waivers are submitted to the cognizant [project title/position] for review, and a recommendation will be made to the [title/position] and/or [title/position] for approval or disapproval of the proposed deviation or waiver. A proposed deviation or waiver must be approved by the [title/position] and/or [title/position] before commencing the software development tasks affected by that deviation or waiver. A copy of each approved deviation or waiver shall be forwarded to the secretary of the Software Configuration Management Policy Change Control Board (CCB). A copy shall also be placed in the product history file. A permanent record of deviation and waiver approvals shall be maintained for each project using the form depicted in Appendix B. Each request for a deviation or waiver identifies the following:

• Each specific policy or policy requirement for which it applies
• Alternative policy approach to be taken by the project
• Impact on project schedule, performance, and/or risk
This record shall be initiated during development of the product objectives documentation and shall serve as a record of all subject approvals for the duration of the project.
3.0 PROJECT DIRECTORY DEFINITION

All projects are controlled electronically, and a standard directory structure is defined for uniformity and consistency. When a new project is begun, the directory structure and environment are defined using configuration management (CM) scripts. Appendix C defines the project and software repository library directory structures.
3.1 Project Directory Scripts

The software CM scripts are stored in the configuration management directory at [enter directory path here]. Each member of the new <project_name> group must have the path to these scripts in [enter file names that must define the CM directory path names]. It is the responsibility of each <project_name> member to ensure that they modify their files to accomplish this.
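For illustration, a member's shell start-up file might gain an entry like the following sketch. A Bourne-style profile is assumed, and the directory shown is a hypothetical stand-in for the bracketed path above.

    # Hypothetical example: put the CM scripts on the search path (~/.profile).
    CM_SCRIPTS=/usr/local/cm/scripts    # stand-in for [enter directory path here]
    PATH="$PATH:$CM_SCRIPTS"
    export PATH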
3.2 Project Directory Source Repository

The repository where the project's source data and information products are stored is under a directory called <project_name>, and ownership is given to login_name pdcm. The availability of disk space and the location of the project directory structure are determined by the system administrator and the configuration manager. The project name should be no more than six characters in length. It is the responsibility of the configuration manager to ensure that this task is done.
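As a minimal sketch, assuming a hypothetical parent directory /projects and the hypothetical six-character project name "ngm", the setup reduces to:

    # Create the project source repository and give ownership to login_name pdcm.
    mkdir -p /projects/ngm      # <project_name>, six characters or fewer
    chown pdcm /projects/ngm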
3.3 Project User and Group Identification

Access to the stored source information, data, and code for a project is limited to the users of a specific group. Files are stored and accessed by means of standard operating system user and group IDs. Write access will be denied to any user not in the specific group. The system administrator will set up <project> and <project>_vnv as project groups for any project, adding login_name pdcm to each group. In addition to the above two project groups, the global groups, integ and vnv, reside in the group file. All software integrators on each active project are members of the integ group. The scripts to be run by the project integrator are group owned by integ in order to make execution of the scripts possible by all integrators.
Page 6 of 22 SE-CMG
Copyright © 2002 Interpharm Press
Software Engineering Configuration Management Guidelines
7
All members of the project’s <project>_vnv group are placed in the vnv group. The scripts to be run by members of any <project>_vnv group are group owned by V&V personnel in order to make execution of the scripts possible by all project V&V members. It is the responsibility of the configuration manager to ensure that these groups are set up.
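A sketch of this group setup follows, assuming Linux-style groupadd and usermod commands (on other systems the administrator would edit /etc/group directly) and the hypothetical project name "ngm":

    # Project-specific groups, with login_name pdcm added to each.
    groupadd ngm
    groupadd ngm_vnv
    usermod -a -G ngm,ngm_vnv pdcm
    # Global groups shared across all active projects.
    groupadd integ    # all software integrators
    groupadd vnv      # all project V&V members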
4.0 CONFIGURATION MANAGEMENT RESPONSIBILITIES
4.1 Integrator Responsibilities

The CM scripts depend upon the project integrator performing the following tasks for the duration of the project:

• <project>/se/project_utilities/data/directory_list.def contains an entry for each physically executable software task added under the /.../se/project_code directory.
• The correct directory structure is created and maintained under the <project>/se/project_utilities directory. Because this directory contains only directories, it is just one level deep. The contents of the project_utilities/data directory, which contains all files associated with the MAKE process, must be put under software configuration control. Software configuration control is optional for all other directories under project_utilities.
• After the new project directory structure has been created, the MAKE files must be added from the project_utilities/data directory to the source directory of each task under the project_code directory. These files must also be checked in under the software configuration control directory in the project_code/task/source directory.
• Initially, check files under the <project>/se/project_include directory into a software configuration control directory.
• Before a baseline is requested, the project_code, project_include, and project_utilities/data directories must be placed under software configuration control and have been checked in by the integrator through the script "integ_in."
4.2 Configuration Management Scripts

The following sections describe the CM scripts in terms of their function, the phase of the project to which they apply, the area of the project directories that is affected, who should run the script, and the tasks that the script accomplishes.
4.2.1 Script "new_project"

Script "new_project" is the first of the CM scripts to be run. This script, run only one time per project by login_name pdcm, accomplishes the following tasks (a shell sketch follows the list):

• Creates an empty project directory structure
• Builds the user_tool_info directories
• Copies the data directory from the software library to the project data directory and places them under software configuration control
• Copies the necessary tool files from the software library to the project tool directories
• Creates any MAKESYS for the project
• Sets ownership of the project directory
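The script sources are not published in these guidelines; the following Bourne shell sketch only illustrates the tasks listed above, with the software library path and project root assumed for the example.

    #!/bin/sh
    # Sketch of "new_project": run once per project by login_name pdcm.
    PROJ=/projects/$1                    # $1 = <project_name>; assumed root
    LIB=/usr/local/swlib                 # assumed software library path
    mkdir -p $PROJ/se/project_code \
             $PROJ/se/project_include \
             $PROJ/se/project_utilities/data \
             $PROJ/se/user_tool_info
    cp $LIB/data/* $PROJ/se/project_utilities/data   # seed the data directory
    cp $LIB/tools/* $PROJ/se/user_tool_info          # copy the tool files
    # (Placing the data files under software configuration control and creating
    # any MAKESYS are omitted; both depend on the unnamed control tool.)
    chown -R pdcm $PROJ                  # set ownership of the project directory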
4.2.2 Script "new_frame"

Script "new_frame" runs within the development directories only, is initiated by the project integrator, and must be run from the destination location in /se. The purpose of this script is to check an existing file in to the local software configuration control directory for the first time. It accomplishes the following tasks:

• Checks the corresponding directory in /build for a file of the same name
• If a duplicate exists, informs the user and exits
• If a duplicate file does not exist in /build, checks the file into the local software configuration control directory and sets the user table to allow access by all members of the project group
4.2.3 Script "integ_out"

Script "integ_out" is run within the development directories only, is initiated by the project integrator, and must be run from the destination location in /se. The purpose of this script is to check a file out of software configuration control; the default is a file tagged with the user ID. The option for an untagged file is -t, and the option to change the tag from the user ID is -n. This script accomplishes the following tasks:

• Checks the current directory in /se for the presence of the requested file
• If the file exists in /se and is not checked out, checks it out as tagged or untagged, as requested
• If the requested file does not exist in /se, examines the corresponding directory in /build, and either informs the user that the file does not exist in /build and exits, or copies the file from /build to /se
• Checks the file copied from /build into the /se local software configuration control directory
• Clears the software configuration control user table
• Sets the contents of the user table to the project group
• Checks a tagged or an untagged file out of the local /se software configuration control directory for editing, as requested
4.2.4 Script "integ_in"

Script "integ_in" is run within the development area only by the project integrator and must be run from the destination location in /se. The purpose of this script is to check a file into software configuration control, clearing the user table and then opening access to the file for the entire project group. The read-only copy of the file resides at the software configuration control directory level and reflects the file checked in by "integ_in." Script "integ_in" should be the last script run on an /se source file before a baseline is requested.
4.2.5 Script "check_out"

Script "check_out" is run within the development area only by members of the project group and must be run from the destination location in /se. The purpose of this script is to check out a tagged file to the requester, and it accomplishes the following tasks (a shell sketch follows the list):

• Checks the current directory for the presence of the requested file. If the file exists and is not checked out, checks it out as tagged; if it is checked in but inaccessible (for example, locked to the developer who checked it in and/or to the integrator), reports this and exits. If the file exists but is checked out, reports this and exits.
• If the requested file does not exist in /se, goes to the same location in /build. If the file does not exist in /build, reports this and exits. If the file does exist in /build, copies the file from /build to /se, checks the file into the local /se software configuration control directory with a user table containing the entire project group, and checks a tagged copy out of the local software configuration control directory for editing.
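A Bourne shell sketch of this logic follows; the scc_* commands are hypothetical stand-ins for the unnamed software configuration control tool and are not part of these guidelines.

    #!/bin/sh
    # Sketch of "check_out": check out a tagged copy of file $1 to the requester.
    FILE=$1
    TAG=`id -un`                                 # tag is the requester's user ID
    if [ -f "./$FILE" ]; then
        if scc_is_locked "$FILE"; then           # hypothetical lock query
            echo "$FILE is already checked out"; exit 1
        fi
        scc_edit -tag "$TAG" "$FILE" ||
            { echo "$FILE is checked in but inaccessible"; exit 1; }
    else
        BUILD_DIR=`pwd | sed 's;/se;/build;'`    # same location under /build
        if [ ! -f "$BUILD_DIR/$FILE" ]; then
            echo "$FILE does not exist in /build"; exit 1
        fi
        cp "$BUILD_DIR/$FILE" .                  # copy from /build to /se
        scc_checkin -usertable project "$FILE"   # user table = entire project group
        scc_edit -tag "$TAG" "$FILE"             # then check out a tagged copy
    fi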
4.2.6 Script "check_in"

Script "check_in" is run within the development area only by any member of the project group and must be run from the destination location in /se. The purpose of this script is to allow developers to check in a file yet retain exclusive access to the file until it is released to the integrator. The read-only copy of the file resides at the software configuration control directory level and does not reflect the file checked in by "check_in." The user table is then set to include the individual developer and the integrator group.
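A matching sketch, under the same hypothetical scc_* stand-ins, shows how the exclusive-access rule might be realized:

    #!/bin/sh
    # Sketch of "check_in": check a file in while keeping it locked to the
    # developer and the integrator group until it is released.
    FILE=$1
    scc_checkin -keep "$FILE"                  # read-only copy is not refreshed
    scc_usertable "$FILE" "`id -un`" integ     # developer plus integ group only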
4.2.7 Script "prebase_check"

Script "prebase_check" is run by login_name pdcm before each baseline requested by the project and checks that each file under /project_utilities, /project_include, and /project_code is checked in to its software configuration control directory. The output of this script is directed to /pdcm/check.file. Any file shown to be checked out by this script must be resolved with the project integrator before the next script is run.
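A sketch of the check follows; scc_is_locked is again a hypothetical stand-in, and only the /pdcm/check.file destination comes from the text above.

    #!/bin/sh
    # Sketch of "prebase_check", run by login_name pdcm from <project>/se.
    for d in project_utilities project_include project_code; do
        find "$d" -type f -print
    done | while read f; do
        scc_is_locked "$f" && echo "checked out: $f"   # hypothetical query
    done > /pdcm/check.file
    # Any file named in /pdcm/check.file must be resolved with the integrator.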
4.2.8 Script "prebase_diff"

Script "prebase_diff" is run by login_name pdcm after script "prebase_check" has been resolved, and it determines software configuration control differences on each file under /project_utilities, /project_include, and /project_code. A difference between the source file and the software configuration control file other than expected variances indicates that the file has not been integrated. The output of this script is directed to /pdcm/diff.file, and any differences reported by the script must be resolved with the project integrator before the next script is run.
4.2.9 Script "prebase_get"

Script "prebase_get" is run by login_name pdcm after script "prebase_diff" has been resolved. This script performs a software configuration control get -k on each file under /project_utilities, /project_include, and /project_code to ensure that the source files being copied to /baseline contain appropriate software configuration control information.
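The "get -k" wording matches SCCS usage, so the core of the script might resemble this sketch; an SCCS layout with per-directory SCCS subdirectories is an assumption, not something these guidelines state.

    #!/bin/sh
    # Sketch of "prebase_get", run by login_name pdcm from <project>/se.
    for d in project_utilities project_include project_code; do
        find "$d" -type d -name SCCS | while read sdir; do
            # get -k fetches writable copies without keyword expansion.
            ( cd "$sdir/.." && sccs get -k SCCS )
        done
    done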
4.2.10 Script "se_2_baseline"

Script "se_2_baseline" is run by login_name pdcm for each baseline required by the project. This script assumes that the three prebaseline scripts above have been run successfully. Script "se_2_baseline" accomplishes the following tasks (a shell sketch follows the list):

• Copies /project_code, /project_include, and /project_utilities from <project>/se to <project>/baseline, makes a directory named /project_libraries, and makes a directory named /project_list
• Using directory_list.def in directory <project>/baseline/project_utilities/data, deletes each software configuration control directory under <project>/baseline/project_code/<task>/source and <project>/baseline/project_code/<task>/include
• Removes the contents of /baseline/project_code/<task>/object and generates a log list
• Deletes the software configuration control directory under <project>/baseline/project_include
• Deletes the software configuration control directory under each <project>/baseline/project_utilities directory
• Invokes scripts "s_se_2_baseline.chper1" and "s_se_2_baseline.chper2" to change group ownership of the /baseline directory from <project> to <project>_vnv
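A condensed sketch of these tasks, assuming an SCCS-style tool whose control directories can simply be deleted from the copied tree and the assumed /projects root used earlier:

    #!/bin/sh
    # Sketch of "se_2_baseline", run by login_name pdcm; $1 = <project>.
    PROJ=/projects/$1
    for d in project_code project_include project_utilities; do
        cp -r $PROJ/se/$d $PROJ/baseline/$d
    done
    mkdir -p $PROJ/baseline/project_libraries $PROJ/baseline/project_list
    # Strip software configuration control directories from the baseline copy.
    find $PROJ/baseline -type d -name SCCS -prune -exec rm -rf {} +
    # Hand the baseline to V&V by changing its group ownership.
    chgrp -R ${1}_vnv $PROJ/baseline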
4.2.11 Script "ids_build"

Script "ids_build" is run by login_name pdcm at the request of the project leader and copies the Interface Design Specification (IDS) document from /se/project_documents/specifications/design/ids to directory /build/project_documents/specifications/design/ids.
4.2.12 Script "srs_build"

Script "srs_build" is run by login_name pdcm at the request of the project leader and accomplishes the following tasks:

• Copies /se/analysis to /build/analysis
• Copies /se/stp_configuration to /build/stp_configuration and modifies the "projdir" path in /build/ToolInfo.project from /se to /build
• Copies the project SRS from /se/project_documents/specifications/design/srs to /build/project_documents/specifications/design/srs
4.2.13 Script "ads_build"

Script "ads_build" is run by login_name pdcm at the request of the project leader and accomplishes the following tasks:

• Copies /se/analysis to /build/analysis
• Copies /se/stp_configuration to /build/stp_configuration and modifies the "projdir" path in /build/ToolInfo.project from /se to /build
• Copies the Software Architecture Design Specification (SADS) from /se/project_documents/specifications/design/sads to /build/project_documents/specifications/design/sads
4.2.14 Script "dds_build"

Script "dds_build" is run by login_name pdcm at the request of the project leader and accomplishes the following tasks:

• Copies /se/detail_design to /build/detail_design
• Copies /se/stp_configuration to /build/stp_configuration and modifies the "projdir" path in /build/ToolInfo.project from /se to /build
• Copies the Software Detailed Design Specification (SDDS) from /se/project_documents/specifications/design/dds to /build/project_documents/specifications/design/dds
4.2.15 Script "code_build"

Script "code_build" is run by login_name pdcm at the request of the project leader and accomplishes the following tasks:

• Removes the contents of /build/project_libraries and /build/project_list if they exist, or makes these directories if they do not
• Copies /project_utilities from /baseline to /build, adds a software configuration control directory for data under /project_utilities if this is the first code build, and places the contents of /build/project_utilities/data under software configuration control
• Adds /project_code tasks added since the previous build, using the new directory_list.def in /build/project_utilities/data, or all tasks listed in directory_list.def if this is the first code build
• Copies /baseline/project_include to /build/project_include and places /build/project_include under software configuration control
• Copies /baseline/project_code/<task>/source and /include to /build, places the contents of /build/project_code/<task>/source and /include under software configuration control, removes /build/project_code/<task>/object, and generates a log listing
• Deletes /se/project_code
• Deletes /se/project_include
• Deletes /se/project_libraries
• Deletes /se/project_list
• Reconstructs the /se project_code directory structure
• Checks out a copy of all source and include files in /build as project leader
• Checks out a copy of all files in /build/project_utilities/data as project leader
4.2.16 Script "final_build"

Script "final_build" is run by login_name pdcm at the request of the project leader. This is the last script to be run for a project and should be run only once, prior to final V&V and transfer of the build to document control. This script accomplishes the following tasks:

• Removes the contents of /build/project_libraries and /build/project_list if they exist, or makes these directories if they do not
• Copies /project_utilities from /baseline to /build, adds a software configuration control directory for data under /project_utilities if this is the first code build, and places the contents of /build/project_utilities/data under software configuration control
• Adds /project_code tasks added since the previous build, using directory_list.def in /build/project_utilities/data, or all tasks listed in directory_list.def if this is the first code build
• Copies /baseline/project_include to /build/project_include and places /build/project_include under software configuration control
• Copies /baseline/project_code/<task>/source and /include to /build, places the contents of /build/project_code/<task>/source and /include under software configuration control, removes /build/project_code/<task>/object, and generates a log listing
• Copies /se/project_database to /build
• Copies /se/project_documents to /build
• Copies /se/project_purchased_code to /build
• Copies /se/stp_configuration to /build
• Copies /se/analysis to /build
• Copies /se/detail_design to /build
• Deletes /se/project_code
• Deletes /se/project_include
• Deletes /se/project_libraries
• Deletes /se/project_list
4.2.17 Script "vnv_out"

Script "vnv_out" is run within the /build area by members of the <project>_vnv group and must be run from the destination location in /build. The purpose of this script is to allow members of the <project>_vnv group to make a writable tagged copy of a source file in the /build area. The tagged copy is the result of a copy command and not an actual software configuration control edit. Any number of uniquely tagged source files can be created, and the tag used will be the requester's user ID.
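Because the tagged copy is an ordinary copy rather than a configuration control edit, the script reduces to something like this sketch:

    #!/bin/sh
    # Sketch of "vnv_out": make a writable, user-tagged copy of a /build file.
    FILE=$1
    TAG=`id -un`                 # the tag is the requester's user ID
    cp "$FILE" "$FILE.$TAG"      # plain copy, not a configuration control edit
    chmod u+w "$FILE.$TAG"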
4.2.18 Script "vnv_del"

Script "vnv_del" is run within the /build area by members of the <project>_vnv group and must be run from the destination location in /build. The purpose of this script is to delete a tagged version of a source file copied by script "vnv_out."
4.2.19 Script "update_usertable"

Script "update_usertable" is run within the /se area by the project integrator. The purpose of this script is to update the software configuration control user table of all /project_include and /project_code /source and /include files with the user ID of any new project members not in the original group file. If the source file is not checked out and is either a new frame or has been integrated, its software configuration control user table will be updated.
4.2.20 Script "report"

Script "report" is run by login_name pdcm after running any build script, and it sends a file to /build/project_documents/reports. This file contains a list of all /project_include, /project_code, and /project_utilities source files that make up the build, the changes to those files, and a list of configuration items in the build that are not under software configuration control.
4.2.21 Script "delete_baseline"

Script "delete_baseline" is run by login_name pdcm after running script "first_build" or "next_build" and before running script "se_2_baseline." This script deletes the contents of /baseline.
4.3 Script Execution Matrix

Appendix D describes execute permissions for the CM scripts; execution may be granted to either an individual login_name or a group. The workstation that executes the scripts as login_name pdcm needs root access to the disk containing the project.
APPENDIX A-1  SOFTWARE ANOMALY REPORT

SOFTWARE ANOMALY REPORT

1. Date:                           2. Severity:  H  M  L        3. Anomaly Report Number:
4. Title (briefly describe the problem):
5. System:                         6. Component:                7. Version:
8. Originator:                     9. Organization:             10. Telephone:
11. Approval:
12. Verification and Validation Task:                           13. Reference Document(s):
14. System Configuration:
15. Anomaly Description:
16. Problem Duplication:   During run     Y  N  N/A
                           After restart  Y  N  N/A
                           After reload   Y  N  N/A
17. Source of Anomaly:
    PHASE:  ❑ Requirements  ❑ Architecture Design  ❑ Detailed Design  ❑ Implementation  ❑ Undetermined
    TYPE:   ❑ Documentation  ❑ Software  ❑ Process  ❑ Methodology  ❑ Other  ❑ Undetermined
18. Investigation Time:
19. Proposed Solution:
20. Corrective Action Taken:                                    Date:
21. Closure Sign-off:
    Software Lead Engineer: ______________________  Date: __________
    V&V Lead Engineer:      ______________________  Date: __________
APPENDIX A-2  INSTRUCTIONS FOR COMPLETING SOFTWARE ANOMALY REPORT

1. Date: Form preparation date.
2. Severity: Circle the appropriate code. High: The change is required to correct a condition that prevents or seriously degrades a system objective (where no alternative exists) or to correct a safety-related problem. Medium: The change is required to correct a condition that degrades a system objective, to provide for performance improvement, or to confirm that the user and system requirements can be met. Low: The change is required to maintain the system, correct operator inconvenience, or other.
3. Anomaly report number: Number assigned for control purposes.
4. Title: Brief phrase or sentence describing the problem.
5. System: Name of the system or product against which the anomaly report is written.
6. Component: Component or document name against which the anomaly report is written.
7. Version: Version of the document or code against which the anomaly report is written.
8. Originator: Printed name of individual originating the anomaly report.
9. Organization: Organization of originator of anomaly report.
10. Telephone: Office phone number of the individual originating the anomaly report.
11. Approval: Software management individual or designatee approval for anomaly report distribution.
12. V&V task name: Name of the V&V task being performed when the anomaly was detected.
13. Reference document: Designation of the documents that provide the basis for determining that an anomaly exists.
14. System configuration: Configuration loaded when the anomaly occurred; not applicable for documentation or logic errors.
15. Anomaly description: Description defining the anomaly and a word picture of events leading up to and coincident with the problem. Cite equipment being used, unusual configurations, environment parameters, and so forth, that will enable the programmer to duplicate the situation. If continuation sheets are required, fill in Page _ of _ at the top of the form.
16. Problem duplication: Duplication attempts, successes or failures, for software errors; not applicable for documentation or logic errors.
17. Source of anomaly: On investigation completion, source of the anomaly in terms of phase origination and type.
18. Investigation time: Time, to the nearest half hour, required to determine the cause of the anomaly but not the time to determine a potential solution or time to implement the corrective action.
19. Proposed solution: Description defining in detail a solution to the detected anomaly, including documents, components, and code.
20. Corrective action taken: Disposition of the anomaly report, including a description of any changes initiated as a direct result of this report and the date incorporated.
21. Closure sign-off: Signature of the software lead engineer authorizing implementation of the corrective action. Signature of the V&V lead engineer verifying incorporation of the authorized changes as described in this report. Only the signature of the software lead engineer is required when no corrective action is approved.
APPENDIX B  SOFTWARE CONFIGURATION MANAGEMENT PROCESS RECORD OF DEVIATION OR WAIVER APPROVAL

SOFTWARE CONFIGURATION MANAGEMENT PROCESS RECORD OF DEVIATION OR WAIVER APPROVAL

PROJECT:
TYPE: Deviation or Waiver
PHASE:
SOP Requirement Paragraph(s):

Initiated by:  ________________________________  ________________________  Date: ________________________
               Signature                         Title/Position

Reviewed by:   ________________________________  ________________________  Date: ________________________
               Signature                         Title/Position

Approved by:   ________________________________  ________________________  Date: ________________________
               Signature                         Title/Position

Reason/Rationale/Explanation:

Project schedule and performance impact:

Project risk:

Alternative approach to be used:
APPENDIX C  PROJECT DIRECTORY STRUCTURE

[Figure: project and software repository library directory structure diagram.]
APPENDIX D  MATRIX FOR CONFIGURATION MANAGEMENT SCRIPT EXECUTION

Script Name         Configuration Manager   V&V Personnel   Project Integrator   Project Personnel
new_proj                      X
new_frame                                                           X
integ_out                                                           X
integ_in                                                            X
check_out                                                                                  X
check_in                                                                                   X
prebase_check                 X
prebase_diff                  X
prebase_get                   X
se_2_baseline                 X
ids_build                     X
srs_build                     X
ads_build                     X
dds_build                     X
code_build                    X
final_build                   X
vnv_out                                           X
vnv_del                                           X
update_usertable                                                    X
report                        X
delete_baseline               X
GLOSSARY

Anomaly: Anything observed in the documentation or operation of software that deviates from expectations based on previously verified software products or reference documents.

Baseline: Specification or product that has been formally reviewed and agreed upon, that thereafter serves as the basis for further development, and that can be changed only through formal change control procedures.

Change control: Process by which a change is proposed, evaluated, approved or rejected, scheduled, and tracked.

Change Request/Approval (CRA): Form used to document changes to a baseline.

Configuration control: Process of evaluating, approving or disapproving, and coordinating changes to configuration items after formal establishment of their configuration identification.

Configuration identification: Process of designating the configuration items in a system and recording their characteristics.

Configuration item: Aggregation of hardware, software, or any of its discrete parts that satisfies an end-use function.

Configuration management (CM): Process of identifying and defining the configuration items in a system, controlling the release and change of these items throughout the product life cycle, recording and reporting the status of configuration items and change requests, and verifying the completeness and correctness of configuration items.

Deviation: Authorization for a future activity, event, or product to depart from standard procedures.

Documentation: Manuals, written procedures or policies, records, or reports that provide information concerning uses, maintenance, or validation of software.

Integrity: Accuracy in an item's compliance with its requirements.

Software configuration management (SCM): Discipline of identifying the configuration of a software system at discrete points in time for the purpose of systematically controlling changes to this configuration and maintaining the integrity and traceability of this configuration throughout the development process.
Software development library: Software library containing computer-readable and human-readable information relevant to a software development effort.

Software library: Controlled collection of software and related documentation designed to aid in software development, use, or maintenance.

Software project: Planned and authorized undertaking of specified scope and duration that results in the expenditure of resources toward the development of a product that is primarily one or more computer programs.

Source code: Original software expressed in human-readable form (programming language) that must be translated into machine-readable form before it can be executed by the computer.

Waiver: Authorization to depart from SE policy for an activity, event, or product that has already been initiated.
SE-CMP SOFTWARE ENGINEERING CONFIGURATION MANAGEMENT POLICIES
Written by:
[Name/Title/Position]
Date
Reviewed by:
[Name/Title/Position]
Date
Approved by:
[Name/Title/Position]
Date
Approved by:
[Name/Title/Position]
Date
Document Number: [aaa]-SECMP-[#.#]    Revision: [#.#]    Page: 1 of [#]
REVISION HISTORY

Revision    Description               Date
[##.##]     [Revision description]    [mm/dd/yy]
CONTENTS

PREAMBLE  SE Software Configuration Management Policies  3
POLICY 1  Software Configuration Management Organization  7
POLICY 2  Software Configuration Identification  9
POLICY 3  Software Configuration Management of Subcontractor and Vendor Products  10
POLICY 4  Software Configuration Management Plan  12
POLICY 5  Software Configuration Change Processing  14
POLICY 6  Change Request/Approval Form  16
POLICY 7  Software Change Review Board  17
POLICY 8  Software Configuration Status Accounting  19
POLICY 9  Software Configuration Status Report  20
POLICY 10  Software Configuration Audits and Reviews  21
POLICY 11  Anomaly Reporting and Resolution  23
GLOSSARY  26
PREAMBLE  SE SOFTWARE CONFIGURATION MANAGEMENT POLICIES
Policy

Software engineering (SE) software projects shall comply with a set of Software Configuration Management (SCM) Policies, which are established, maintained, and used to achieve quality in all phases of the software life cycle. Justified and necessary departures from these policies may be authorized in response to a written request. A permanent board shall be established to control and maintain the SE SCM Policies.
Requirements

1. The SE SCM Policies shall be applied to all SE software projects. Projects in which effort will be expended in order to modify or enhance existing software are also subject to this requirement.

2. The SE SCM Policies shall be maintained by the Software Configuration Management Policy Change Control Board (CCB). The CCB chairman shall be appointed by the SE [title/position] with the approval of the [title/position], and board members shall be appointed in writing by the SE [title/position]. The SE [title/position] shall serve as the secretary to the CCB and shall be responsible for scheduling board meetings and maintaining minutes of meetings and permanent files of CCB actions. Proposed changes to the SE SCM Policies must be submitted in writing to the board. At least once each year, the CCB shall convene to review all of the policies for relevancy and currency. Where appropriate, the board shall propose revisions to the policies, subject to the review and approval of the [title/position]. After approval by the SE [title/position], the policies shall be approved by the [title/position] and [title/position].

3. Circumstances may require deviation(s) or waiver(s) from policy. A written request for a deviation shall be submitted by the project configuration manager in advance of a future activity, event, or product in order that SE management be made aware of the project's intention to employ a higher risk approach to SCM. A written request for a waiver shall be submitted by the project configuration manager in those cases where the activity, event, or product has already been initiated. Deviations and waivers shall be reviewed by the [project title/position] and submitted to the SE [title/position] for review. The SE [title/position] will make a recommendation to the [title/position] and/or [title/position] for approval or disapproval of the proposed deviation or waiver. A proposed deviation or
waiver must be approved by the [title/position] and/or [title/position] before commencing the SCM tasks affected by that deviation or waiver.

4. Each request for a deviation or waiver shall identify:
   a. Each specific policy or policy requirement for which it applies
   b. The alternative policy approach to be taken by the project
   c. The impact on project schedule, performance, and/or risk

5. A copy of each approved deviation or waiver shall be forwarded to the secretary of the Software Configuration Management Policy CCB. A copy shall also be placed in the product history file.

6. These policies refer to and govern a set of SE SCM procedures. The procedures are intended to provide detailed guidance within the framework and requirements provided by these policies (see Figures 1 and 2). It is the responsibility of the project configuration manager to apply the existing relevant SE SCM procedures. New SE SCM procedures are to be submitted to the SE [title/position] prior to their use, in order that they can be reviewed and approved (see Figure 3).
Figure 1  SE Software Configuration Management Policies

Policy categories: Configuration Identification; Project Management; Product Management and Acceptance; Configuration Change Control; Configuration Status Accounting.

Policy Topic Title                                                        Policy Number
Software Configuration Management Organization                             1
Software Configuration Identification                                      2
Software Configuration Management of Subcontractor and Vendor Products     3
Software Configuration Management Plan (SCMP)                              4
Software Configuration Change Processing                                   5
Software Change Request/Approval Form (CRA)                                6
Software Change Review Board (SCRB)                                        7
Software Configuration Status Accounting                                   8
Software Configuration Status Report (SCSR)                                9
Software Configuration Audits and Reviews                                 10
Anomaly Reporting and Resolution (SAR)                                    11
7. A permanent record of deviation and waiver approvals shall be maintained for each project using the form depicted in the SE configuration management procedures. This record shall be initiated during development of the Product Objectives Document and shall serve as a record of all subject approvals for the duration of the project.
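To make the record keeping of requirements 4 and 7 concrete, the following is a minimal sketch of how a project might model a deviation or waiver request and the permanent approval record. The class and field names (PolicyDepartureRequest, DeviationWaiverLog) are hypothetical illustrations, not terms defined by these policies; the actual record form is the one depicted in the SE configuration management procedures.

```python
from dataclasses import dataclass, field
from datetime import date
from enum import Enum
from typing import List, Optional


class RequestKind(Enum):
    DEVIATION = "deviation"  # requested in advance of the affected activity
    WAIVER = "waiver"        # requested after the activity has been initiated


@dataclass
class PolicyDepartureRequest:
    """One deviation or waiver request, carrying the content of requirement 4."""
    kind: RequestKind
    policy_reference: str      # a. the specific policy or policy requirement
    alternative_approach: str  # b. the alternative approach the project will take
    impact_statement: str      # c. impact on schedule, performance, and/or risk
    approved: bool = False
    approval_date: Optional[date] = None


@dataclass
class DeviationWaiverLog:
    """Permanent per-project record of approvals, as in requirement 7."""
    project: str
    entries: List[PolicyDepartureRequest] = field(default_factory=list)

    def record_approval(self, request: PolicyDepartureRequest, when: date) -> None:
        # Only approved departures become part of the permanent record.
        request.approved = True
        request.approval_date = when
        self.entries.append(request)
```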
Figure 2  SE Software Configuration Management Policies Throughout the Software Development Life Cycle

[In the original, Figure 2 is a matrix that maps each policy topic (Software Configuration Management Organization; Software Configuration Identification; Software Configuration Management of Subcontractor and Vendor Products; Software Configuration Management Plan (SCMP); Software Configuration Change Processing; Software Change Request/Approval (CRA) Form; Software Change Review Board (SCRB); Software Configuration Status Accounting; Software Configuration Status Report (SCSR); Software Configuration Audits and Reviews; Anomaly Reporting and Resolution (SAR)) against the software life cycle phases: Project Start-up, Interface Design, Requirements, Architecture Design, Detailed Design, Code and Test, Integrate and Test, and Software Validation.]

Notes: 1. D indicates that a deliverable or activity is required at that time. 2. E indicates that the procedure requirements are in effect for the entire phase. 3. S indicates that the procedure requirements can start at any time.
Figure 3  Matrix of Responsibilities for Software Configuration Management Policy Documents

[In the original, Figure 3 is a matrix that assigns responsibilities for each document title (software configuration management procedures; software configuration management deviation; software configuration management waiver; Software Configuration Management Plan (SCMP); Software Change Request (SCR) Form (Class I and II); Software Change Request (SCR) Form (Class III); configuration status accounting; Software Configuration Status Report (SCSR); Software Trouble Report (STR) Form (low); Software Trouble Report (STR) Form (medium); Software Trouble Report (STR) Form (high)) across the PCM (1), SLE (2), V&VLE (3), Director, Program SE Manager, and SCRB (4), using the codes defined in the notes.]

Notes: 1. Assigned project configuration manager. 2. Project software lead engineer assigned to the project. 3. Project V&V lead engineer assigned to the project. 4. Software Change Review Board. 5. G means generate. 6. A means assigns ID number. 7. E/M/A means establish, maintain, and archive. 8. R means review. 9. R/D means review and disposition. 10. V means verify disposition.
Responsibilities

The project configuration manager is responsible for the following:
1. Generating changes to SE SCM Policies
2. Generating written deviations and waivers
3. Generating changes to SCM procedures
4. Applying relevant SCM procedures to the project
The SE [title/position] is responsible for the following:
1. Review and approval or disapproval of SE SCM Policies
2. Review and recommendation of deviations and waivers from SE SCM Policies
3. Review and approval or disapproval of SCM procedures

The [title/position] and/or [title/position] are responsible for the following:
1. Approval or disapproval of SE SCM Policies
2. Approval or disapproval of deviations and waivers from SE SCM Policies

The [project title/position] is responsible for the review and submittal of deviations and waivers from SE SCM Policies. The managers of organizations supporting and sponsoring the project should share the commitment to implementing these policies.
POLICY 1 SOFTWARE CONFIGURATION MANAGEMENT ORGANIZATION
Policy

SE software projects shall assign a project configuration manager to be responsible for the software configuration management (SCM) of the software end products developed on the project. The project configuration manager shall perform the configuration management activities described in the Software Configuration Management Plan (SCMP). In order to achieve the successful configuration management of a software project, the project configuration manager shall establish a product development library to be the depository for software end products placed under configuration control.
Requirements

1. Management of the functions and tasks of SCM for each SE software project shall be the responsibility of the project configuration manager. The project configuration manager shall be responsible for making decisions regarding the performance of SCM, assigning priorities to SCM tasks, estimating the level of effort for an SCM task, tracking the progress of work, and assuring adherence to SE standards in all configuration management efforts. The authority for resolving issues raised by SCM tasks shall reside with the SE [title/position]. A non-project-related person shall be assigned the responsibility for the SCM tasks with the mutual approval of the software lead engineer and the SE [title/position]. This person shall report operationally to the software lead engineer and shall receive functional direction from the SE [title/position].
2. The project shall prepare and maintain an SCMP in compliance with the SE Software Development Policies. The SCMP shall describe the SCM requirements for the project. The project's SCMP shall be provided to the project configuration manager by the software lead engineer prior to the Software Requirements Review (SRR). The project configuration manager shall review the SCMP and establish the appropriate configuration management organization for implementing the prescribed configuration management activities.
3. A product software development library shall be established by the project configuration manager to be the depository for software end products placed under configuration control. The product development library shall provide storage of and controlled access to software and documentation in both human-readable and machine-readable form.
4. The project configuration manager shall be responsible for the following activities on the project:
a. Providing support and guidance to the software lead engineer on all matters relating to SCM
b. Preparing and issuing SCM procedures for SCM activities unique to the project
c. Coordinating with the project's technical team and other technical organizations on matters pertaining to SCM
Responsibilities

In addition to the responsibilities identified in the preamble to these policies, the following responsibilities are prescribed by this policy:
1. The project configuration manager shall be responsible for the following:
a. Managing the functions and tasks of SCM for an SE software project
b. Reviewing the project's SCMP and establishing a configuration management organization to implement the prescribed configuration management activities
c. Establishing a product software development library to be the depository for software end products placed under configuration control
d. Providing support and guidance to the software lead engineer on all matters relating to SCM
e. Preparing and issuing SCM procedures for configuration management activities unique to the project
f. Coordinating with the project's technical team and other technical organizations on matters pertaining to SCM
2. The software lead engineer shall be responsible for providing the project configuration manager with the SCMP prior to the Software Requirements Review (SRR).
3. The SE [title/position] shall be responsible for the following:
a. Providing functional review and guidance to project personnel performing SCM tasks
b. Resolving issues raised by SCM tasks
POLICY 2 SOFTWARE CONFIGURATION IDENTIFICATION
Policy

SE software projects shall provide configuration identification of software end products developed on the project. The configuration of each software end product shall be identified by its technical documentation. Configuration baselines shall be established at specific points during software development to further define the configuration of the items as they are developed. The project configuration manager shall be responsible for defining the configuration of the software items at each baseline and assigning configuration identification numbers to each item baselined.
Requirements

1. Interface Design, Requirements, Architecture Design, Detailed Design, Implementation, and Software Validation baselines shall be established for the software project at the appropriate software development phases. These baselines shall define a formal departure point for control of future changes to the performance and/or design of the software end products.
2. The project configuration manager shall be responsible for defining the technical documentation that identifies and establishes the configuration of the software end products at each configuration baseline.
3. The items identified for configuration management at each configuration baseline, referred to as configuration items, shall be uniquely identified by a configuration identifier that is easily recognized and understood by the software project personnel. The project configuration manager shall be responsible for composing and assigning all configuration identification numbers to configuration items.
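The policy leaves the identifier format to the project configuration manager; the following is a minimal sketch of one possible numbering convention. The PROJECT-BASELINE-TYPE-NNN layout and the baseline abbreviations are assumptions for illustration only, not a prescribed scheme.

```python
# Baseline abbreviations are illustrative; the six baselines come from requirement 1.
BASELINES = {
    "IFD": "Interface Design",
    "REQ": "Requirements",
    "ARD": "Architecture Design",
    "DET": "Detailed Design",
    "IMP": "Implementation",
    "VAL": "Software Validation",
}


def make_config_id(project: str, baseline: str, item_type: str, seq: int) -> str:
    """Compose a unique, easily recognized configuration identifier."""
    if baseline not in BASELINES:
        raise ValueError(f"unknown baseline abbreviation: {baseline}")
    return f"{project.upper()}-{baseline}-{item_type.upper()}-{seq:03d}"


# Example: the first Software Requirements Specification item of project "pump".
print(make_config_id("pump", "REQ", "SRS", 1))  # PUMP-REQ-SRS-001
```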
Responsibilities

In addition to the responsibilities identified in the preamble to these policies, the project configuration manager shall be responsible for defining the configuration of the software items at each baseline and assigning configuration identification numbers to each configuration item.
POLICY 3 SOFTWARE CONFIGURATION MANAGEMENT OF SUBCONTRACTOR AND VENDOR PRODUCTS
Policy

Subcontractors and vendors who design and/or produce software that is delivered as a component of an SE software project shall comply with the same software configuration management (SCM) requirements imposed on the project, insofar as they are applicable. These requirements are imposed on subcontractors and vendors through a procurement package. After source selection and negotiation, the procurement package becomes the contract. SE reserves the right to review the subcontractor's or vendor's configuration management system prior to contract award and to periodically audit that system subsequent to contract award to assure that adequate methods are implemented for identifying and controlling each product produced for SE.
Requirements

1. The SE system of controlling subcontractors and vendors with respect to SCM shall include the following:
a. A proven source selection process that recognizes and implements SCM requirements
b. Contractually required compliance with these SE SCM Policies and SCM procedures
c. Continuing liaison, monitoring, and audits to ensure that SCM requirements are met
2. SE's subcontractors and vendors shall be required to meet the SCM requirements specified in the procurement package. Computer programs developed by subcontractors and vendors can range in complexity from previously developed standard "off-the-shelf" parts to highly sophisticated system elements developed for an SE software project. Therefore, the SCM requirements of the procurement package may vary with each subcontractor or vendor product, depending on the magnitude and complexity of the item being procured. The procurement package shall indicate those parts of the SE SCM Policies and SCM procedures applicable to each subcontractor and vendor and product.
3. The demonstrated capabilities of a subcontractor or vendor to support an SCM program shall be confirmed by SE during the subcontract or vendor selection process. The project configuration manager shall be responsible for reviewing and evaluating the subcontractor's or vendor's capabilities as they relate to configuration identification, change control, and configuration status records.
4. Baseline material and changes to baseline material on an SE software project that are submitted by subcontractors and vendors shall be reviewed and approved by SE prior to their submission for project approval. This processing shall include systematic checking for compliance with all SCM requirements.
5. The project configuration manager shall monitor the subcontractor's or vendor's configuration management procedures to ensure that adequate methods are being implemented for identifying and controlling each product produced for SE.
Responsibilities

In addition to the responsibilities identified in the preamble to these policies, the project configuration manager shall be responsible for the following:
1. Reviewing and evaluating the subcontractor's or vendor's capabilities as they relate to configuration identification, change control, and configuration status records
2. Monitoring the subcontractor's configuration management procedures to ensure that adequate methods are being implemented for identifying and controlling each product produced for SE
POLICY 4 SOFTWARE CONFIGURATION MANAGEMENT PLAN
Policy

SE software projects shall perform software configuration management functions that establish a series of baselines and methods for controlling changes to these baselines. The degree of formality and control employed and manpower used should be contingent upon the size and complexity of the project, the significance of the product, and the investment risks. The project's software configuration management activities shall follow a Software Configuration Management Plan (SCMP) to be prepared and approved prior to the Software Requirements Review (SRR).
Requirements

1. An SCMP that addresses the requirements of this policy and any unique requirements shall be prepared by the project's software configuration manager and approved by the SE [title/position] prior to the SRR. Depending upon the scope of the project, the SCMP may be a separate document or a section of the Software Development Plan (SDP). The plan shall be in the form prescribed by the relevant software development procedures and shall specify the following:
a. Baselines to be used by the project and that are established after formal review and project approval
b. Configuration identification requirements, based on the relevant software development procedures, including the types of products to be controlled and the rules for naming, marking, or otherwise identifying these products
c. Configuration control mechanisms, including the definition of the change and approval or disapproval processes for controlled products and the handling of waivers and deviations
d. Problem reporting system, including definition of the forms to be used and their relationship to the change control process
e. Configuration status accounting system, including the records and reports required to provide traceability of changes to controlled products and to provide a basis for communications of configuration information within the project
f. Configuration verification approach to be used in order to assure that products are developed and maintained according to the requirements above
2. The project shall accomplish the issuance, retention, change control, packaging, and delivery of the physical computer program products in conformance with approved configuration control procedures and marking requirements.
3. The project shall establish and operate a product software development library in which the master representation of each software product is maintained and controlled.
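As a sketch of how the content requirements of item 1 might be checked mechanically, the checklist below pairs each required topic (a-f) with a simple scan of a draft plan's section headings. The topic keywords and the function name are illustrative assumptions, not part of the policy.

```python
# The six content areas an SCMP must specify (requirement 1, items a-f).
REQUIRED_SCMP_TOPICS = [
    "baselines",                     # a. baselines used by the project
    "configuration identification",  # b. products controlled; naming and marking rules
    "configuration control",         # c. change processing, waivers, and deviations
    "problem reporting",             # d. forms and their tie-in to change control
    "status accounting",             # e. records and reports providing traceability
    "configuration verification",    # f. assurance that the above are followed
]


def missing_scmp_topics(section_headings: list) -> list:
    """Return required topics not mentioned in any heading of a draft SCMP."""
    text = " ".join(section_headings).lower()
    return [topic for topic in REQUIRED_SCMP_TOPICS if topic not in text]


headings = ["Baselines", "Configuration Identification", "Configuration Control"]
print(missing_scmp_topics(headings))
# ['problem reporting', 'status accounting', 'configuration verification']
```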
Responsibilities

In addition to the responsibilities identified in the preamble to these policies, the following responsibilities are prescribed by this policy:
1. The project's software configuration manager shall be responsible for generating and obtaining the approval of the SCMP.
2. The project's software lead engineer shall be responsible for the following:
a. Reviewing the SCMP and obtaining approval of it
b. Ensuring that the software project adheres to the provisions of the SCMP
3. The SE [title/position] shall support and audit each software project's configuration management program by:
a. Reviewing and approving the project SCMP
b. Providing functional review and guidance to project personnel performing configuration management duties
c. Furnishing part- or full-time specialists to the project when such support is requested by the software lead engineer
d. Providing, at project request, interpretive project documentation, standards, and procedures for the guidance of ongoing configuration management activities
e. Providing the means for physical media control of baselines to the extent defined by the approved project SCMP
f. Reviewing the effectiveness of the project software configuration management program as an element of periodic assurance audits
POLICY 5 SOFTWARE CONFIGURATION CHANGE PROCESSING
Policy

SE software projects shall provide for the systematic evaluation, coordination, approval or disapproval, and implementation of all changes to the configuration of a software item after establishment of its configuration identification. The project configuration manager shall be responsible for establishing a system to support the configuration change processing described in the project's Software Configuration Management Plan (SCMP). The implementation and verification of approved changes to baselined material shall conform to the project's SCMP and Software Verification and Validation Plan (SVVP).
Requirements

1. The project configuration manager shall select and/or generate the SCM procedures required to establish a system to support the change control processing described in the project's SCMP.
2. The project configuration manager shall be responsible for configuration control of the software end products developed on the project.
3. The software items that compose each configuration baseline shall be provided to the project configuration manager upon approval and/or verification and validation (V&V). The project configuration manager shall be the custodian of the "to-be-established" configuration baseline, previously baselined material, and all changes to baselined material during the completion of the software development phase associated with the configuration baseline.
4. The project configuration manager shall enter baseline material in the product software development library for version and access control. Version control shall ensure the rapid, comprehensive, and accurate treatment of approved changes to items under configuration control. Access control shall ensure restricted access to configuration items undergoing change.
5. Changes to baselined material shall be documented on a Change Request/Approval (CRA) form and uniquely identified by a number that combines the name of the software project and the responsible engineer's initials, which preserves a record of who requested each change. The project configuration manager shall be responsible for assigning identification numbers to CRAs in accordance with the relevant SCM procedures.
6. CRAs shall be classified as one of the following:
a. Class I: changes that affect the performance, function, or technical requirements
b. Class II: changes that require a change to other baselined material but that do not meet the criteria defined for Class I changes
c. Class III: changes that do not require a change to any other baselined material
7. All CRAs shall be reviewed by the project's software lead engineer. Approval or disapproval of Class III changes shall be the responsibility of the project's software lead engineer. Class I and Class II proposed changes shall be submitted by the project's software lead engineer to the project's Software Change Review Board (SCRB) with a change status recommendation.
8. The project's software lead engineer is responsible for the distribution, implementation, and status update of approved CRAs in accordance with the project's SCMP.
9. The project's V&V project leader shall be responsible for the verification of revised baselined material in accordance with the project's SCMP and SVVP.
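The version- and access-control behavior asked for in requirement 4 can be sketched as follows. The class and method names are hypothetical; a real project would typically realize this behavior with its configuration management tooling rather than ad hoc code.

```python
class ProductDevelopmentLibrary:
    """Minimal sketch of version control and access control (requirement 4)."""

    def __init__(self):
        self._versions = {}     # config ID -> list of revisions (version control)
        self._checked_out = {}  # config ID -> CRA number holding it (access control)

    def enter_baseline(self, config_id, content):
        """Place a baselined item under version control."""
        self._versions.setdefault(config_id, []).append(content)

    def check_out(self, config_id, cra_number):
        """Restrict access: an item undergoing change is held by a single CRA."""
        if config_id in self._checked_out:
            raise PermissionError(f"{config_id} is already undergoing change")
        self._checked_out[config_id] = cra_number
        return self._versions[config_id][-1]

    def check_in(self, config_id, new_content):
        """Record the approved change as a new revision and release the item."""
        self._checked_out.pop(config_id, None)
        self._versions[config_id].append(new_content)
        return len(self._versions[config_id])  # new revision number
```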
Responsibilities

In addition to the responsibilities identified in the preamble to these policies, the following responsibilities are prescribed by this policy:
1. The project configuration manager shall be responsible for the following:
a. Selecting and/or generating the SCM procedures required to establish a system to support the change control processing described in the project's SCMP
b. Maintaining the "to-be-established" configuration baseline, previously baselined material, and all changes to baselined material
c. Entering baseline material in the product software development library for version and access control
d. Assigning identification numbers to CRAs
2. The project software lead engineer shall be responsible for the following:
a. Reviewing all proposed changes to baselined material
b. Approving or disapproving Class III CRAs
c. Reviewing and recommending Class I and Class II CRAs
d. Distributing, implementing, and status updating approved CRAs in accordance with the project's SCMP
3. The project V&V leader shall be responsible for the verification of revised baselined material in accordance with the project's SCMP and SVVP.
POLICY 6 CHANGE REQUEST/APPROVAL FORM
Policy

Changes to baselined material shall be documented on a Change Request/Approval (CRA) form. The CRA shall document the affected SE software project name, a description of the proposed change(s), its effect on configuration baselines, and the status of the change(s). Changes to baselined material may be initiated by anyone through the generation of a CRA.
Requirements

1. A CRA shall be used to identify proposed changes to baselined material. The form of the CRA shall be as defined in the relevant SE SCM procedures.
2. The CRA shall be submitted by the initiator to the project software lead engineer.
3. Each CRA shall identify the following:
a. Affected SE software project name
b. CRA number
c. Application level of the proposed change
d. Initiator
e. Configuration baseline(s) affected
f. Change classification
g. Documents affected
h. Proposed change and estimated impact on other systems, software, or equipment
4. The disposition of the proposed change(s) shall be documented on the CRA, including the date of disposition and the signature of the appropriate dispositioner.
5. The results of change verification shall be documented on the CRA. At the completion of change verification, the conductor of the verification and validation (V&V) activities shall sign and date the CRA.
6. A CRA shall be considered closed upon successful implementation and verification of the change or upon disapproval of the change. The project software lead engineer is responsible for determining when a CRA shall be considered closed. Upon closure of the CRA, the project software lead engineer shall date and sign the CRA.
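A field-level sketch of the CRA, combining the identification items a-h of requirement 3 with the closure rule of requirement 6, might look as follows. The class and field names are illustrative assumptions; the prescribed form is the one defined in the relevant SE SCM procedures.

```python
from dataclasses import dataclass
from enum import Enum
from typing import List, Optional


class CraClass(Enum):
    CLASS_I = 1    # affects performance, function, or technical requirements
    CLASS_II = 2   # changes other baselined material without meeting Class I criteria
    CLASS_III = 3  # requires no change to any other baselined material


@dataclass
class ChangeRequestApproval:
    project_name: str                  # a. affected SE software project name
    cra_number: str                    # b. assigned by the project configuration manager
    application_level: str             # c. application level of the proposed change
    initiator: str                     # d. who proposed the change
    baselines_affected: List[str]      # e. configuration baseline(s) affected
    classification: CraClass           # f. change classification
    documents_affected: List[str]      # g. documents affected
    proposal_and_impact: str           # h. proposed change and estimated impact
    disposition: Optional[str] = None  # "approved" or "disapproved", dated and signed
    verified: bool = False             # set at completion of change verification
    closed: bool = False

    def close(self):
        """Requirement 6: close on verified implementation or on disapproval."""
        approved_and_verified = self.disposition == "approved" and self.verified
        if self.disposition == "disapproved" or approved_and_verified:
            self.closed = True
        else:
            raise ValueError("CRA is not yet verified or disapproved")
```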
Responsibilities

In addition to the responsibilities identified in the preamble to these policies, the software lead engineer shall be responsible for the following:
1. Determining when a CRA shall be considered closed
2. Dating and signing closed CRAs
POLICY 7 SOFTWARE CHANGE REVIEW BOARD
Policy

SE software projects shall establish a Software Change Review Board (SCRB) to coordinate, review, and decide the disposition of Class I and Class II changes to baselined material. Change Request/Approval (CRA) forms classified as Class I or Class II shall be submitted to the SCRB. The SE [title/position] shall chair the project's SCRB, and the project configuration manager shall serve as the SCRB secretary.
Requirements

1. An SCRB shall be established for each project to evaluate and determine the disposition of changes to baselined material that are classified as Class I or Class II changes.
2. The SE [title/position] shall be responsible for chairing the project's SCRB and for proposed change disposition. The members of the project's SCRB shall be appointed by the SE [title/position] and shall include SE personnel involved in the development of the baselined material to be changed and in the verification and validation (V&V) of the software project.
3. The project configuration manager shall serve as the SCRB secretary and shall be responsible for scheduling SCRB meetings and maintaining minutes of meetings and permanent files of SCRB actions.
4. The SCRB shall analyze and identify the impact of the proposed change and ensure that one or more of the following criteria are met:
a. Effects substantial life-cycle cost savings
b. Significantly increases system effectiveness
c. Corrects deficiencies
d. Prevents slippage of the approved development schedule
5. The SCRB shall direct the proposed change disposition to be one of the following:
a. Additional problem analysis
b. A change different from that proposed in the change document
c. Approval of the change as proposed
d. Disapproval of any change
6. The SCRB secretary shall distribute approved CRAs to the project software lead engineer for implementation. A copy of all CRAs shall be provided to the project configuration manager for configuration accounting purposes.
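The SCRB's decision space in requirements 4 and 5 can be sketched as two enumerations and a screening rule. The names below are illustrative only; the board's actual deliberation is not reducible to code.

```python
from enum import Enum


class Justification(Enum):
    """Requirement 4: at least one criterion must be met."""
    COST_SAVINGS = "effects substantial life-cycle cost savings"
    EFFECTIVENESS = "significantly increases system effectiveness"
    DEFICIENCY = "corrects deficiencies"
    SCHEDULE = "prevents slippage of the approved development schedule"


class Disposition(Enum):
    """Requirement 5: the board directs exactly one of these outcomes."""
    ADDITIONAL_ANALYSIS = "additional problem analysis"
    MODIFIED_CHANGE = "a change different from that proposed"
    APPROVED = "approval of the change as proposed"
    DISAPPROVED = "disapproval of any change"


def screen_proposed_change(justifications: set) -> Disposition:
    """A change meeting no qualifying criterion cannot be approved as proposed."""
    if not justifications:
        return Disposition.DISAPPROVED
    return Disposition.APPROVED  # placeholder: the SCRB decides the real outcome
```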
Responsibilities

In addition to the responsibilities identified in the preamble to these policies, the following responsibilities are prescribed by this policy:
1. The project configuration manager shall be responsible for the following:
a. Serving as the SCRB secretary
b. Scheduling SCRB meetings
c. Maintaining minutes of meetings
d. Maintaining permanent files of SCRB actions
e. Distributing approved CRAs to the software lead engineer
2. The SE [title/position] shall be responsible for the following:
a. Chairing the project's SCRB
b. Appointing the members to the SCRB
c. Proposed change disposition
POLICY 8 SOFTWARE CONFIGURATION STATUS ACCOUNTING
Policy

SE software projects shall establish and maintain configuration status accounting records for the software end products developed on the project. Configuration status accounting shall provide identification of the project's configuration baselines, traceability from the baselines resulting from approved changes, and a management tool for monitoring the accomplishment of all related tasks resulting from approved changes. The results of this configuration management activity shall be documented in the project's Software Configuration Status Report (SCSR).
Requirements

1. Configuration status accounting shall provide for the identification of the project's configuration baselines and traceability from the baselines resulting from approved changes. Configuration status accounting shall also be used as a management tool for monitoring the accomplishment of related tasks resulting from each approved change to the project's configuration baselines. The results of this configuration management activity are reported in an SCSR.
2. Configuration status accounting records shall be established and maintained for the software project by the project configuration manager. These records shall provide a listing of the approved configuration identification, the status of proposed changes to the configuration, and the implementation status of approved changes. The types and formats of the configuration status accounting records shall be in accordance with the project's Software Configuration Management Plan (SCMP).
3. Configuration status accounting records shall be maintained until all software end products developed on the project have been approved in accordance with the SE Software Development Policies and the project's SCMP. Upon end-product acceptance, all configuration status accounting records shall be archived by the project configuration manager in accordance with the project's SCMP to provide project history for use in future software development planning.
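The three record types named in requirement 2 can be sketched as a simple structure; the class name, field names, and archiving shape below are illustrative assumptions.

```python
from dataclasses import dataclass, field
from typing import Dict, List


@dataclass
class StatusAccountingRecords:
    """The three listings requirement 2 asks status accounting to maintain."""
    approved_configuration: List[str] = field(default_factory=list)   # config IDs
    proposed_changes: Dict[str, str] = field(default_factory=dict)    # CRA number -> status
    implemented_changes: Dict[str, str] = field(default_factory=dict)  # approved CRA -> progress

    def archive(self, destination: dict):
        """Requirement 3: archive all records upon end-product acceptance."""
        destination["status_accounting"] = {
            "approved_configuration": list(self.approved_configuration),
            "proposed_changes": dict(self.proposed_changes),
            "implemented_changes": dict(self.implemented_changes),
        }
```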
Responsibilities

In addition to the responsibilities identified in the preamble to these policies, the project configuration manager shall be responsible for the following:
1. Establishing and maintaining configuration status accounting records for the software project
2. Archiving all configuration status accounting records upon end-product acceptance
POLICY 9 SOFTWARE CONFIGURATION STATUS REPORT
Policy

SE software projects shall document the results of configuration status accounting in a Software Configuration Status Report (SCSR). The SCSR shall be generated at the conclusion of the software development phase associated with each configuration baseline. The SCSR shall be distributed to the project software lead engineer and the SE [title/position]. The project configuration manager shall be responsible for the generation and distribution of the SCSR.
Requirements

1. The SCSR shall document the status of the configuration items being developed on the project, and the form of the SCSR shall be as defined in the relevant SE SCM procedures.
2. The results of configuration status accounting for each configuration baseline shall be reported in the SCSR. The SCSR shall list the following information:
a. Baseline identification
b. A list of all baselined material, indicating the current revision and referenced CRAs
c. A list of all CRAs, including current disposition or date of closure
3. The SCSR shall be generated and distributed at the conclusion of the software development phase associated with each configuration baseline and upon request of the project software lead engineer. The project configuration manager shall be responsible for the generation and distribution of the SCSR to the project software lead engineer and the SE [title/position].
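A minimal formatter for the three content items of requirement 2 might look like the following; the function name and input shapes are assumptions for illustration.

```python
def format_scsr(baseline_id, baselined_material, cras):
    """Render a plain-text SCSR.

    baselined_material: item name -> (current revision, [referenced CRA numbers])
    cras: CRA number -> current disposition or date of closure
    """
    lines = [f"Software Configuration Status Report for baseline {baseline_id}"]
    lines.append("Baselined material (current revision; referenced CRAs):")
    for item, (revision, refs) in sorted(baselined_material.items()):
        lines.append(f"  {item}: rev {revision}; CRAs: {', '.join(refs) or 'none'}")
    lines.append("CRAs (current disposition or date of closure):")
    for number, status in sorted(cras.items()):
        lines.append(f"  {number}: {status}")
    return "\n".join(lines)


print(format_scsr("REQ",
                  {"SRS": ("B", ["PUMP-JD-004"])},
                  {"PUMP-JD-004": "closed 03/15/02"}))
```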
Responsibilities

In addition to the responsibilities identified in the preamble to these policies, the following responsibilities are prescribed by this policy:
1. The project configuration manager shall be responsible for generating and distributing the SCSR.
2. The project software lead engineer shall be responsible for reviewing the SCSR.
3. The SE [title/position] shall be responsible for reviewing the SCSR.
POLICY 10 SOFTWARE CONFIGURATION AUDITS AND REVIEWS
Policy

SE software projects shall conduct technical reviews and configuration audits as described in the SE Software Development Policies and SE Software Verification and Validation Policies. Additionally, SE software projects shall ensure the integrity, completeness, and traceability of "to-be-established" and "established" configuration baselines. The project configuration manager shall be responsible for providing configuration management support during the project's technical reviews and configuration audits. The project configuration manager is responsible for performing configuration audits of pre- and post-baselined material.
Requirements

1. The project configuration manager shall be responsible for supporting the project software lead engineer in the preparation and conduct of the project's technical reviews.
2. The project configuration manager shall be responsible for supporting the project V&V leader in the preparation and conduct of the project's software configuration audit of the validated software.
3. The project configuration manager shall be responsible for the following during the project's technical reviews and configuration audits:
a. Providing an update of the project's Software Configuration Status Report (SCSR)
b. Obtaining copies of baselined material for use during the review or audit
c. Providing a copy of all Change Request/Approval (CRA) forms generated since the last review or audit
4. The project configuration manager shall be responsible for conducting configuration audits of the project's "to-be-established" and "established" configuration baselines. These audits shall be conducted to ensure the following:
a. Technical and administrative integrity of pre- and/or post-baselined material
b. Each element maps, directly or through its parent elements, when traced through preceding baselines
c. System and software requirements are fulfilled by the software configuration specified in the "to-be-established" and "established" baseline
d. Changes made to baselined material are implemented as intended
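The traceability check of requirement 4, item b, can be sketched as a parent-chain walk. The data shapes and element identifiers below are illustrative assumptions.

```python
def untraceable_elements(new_baseline, preceding_baseline):
    """Return elements that do not map, directly or through parents, to the preceding baseline.

    new_baseline: element ID -> parent element ID (or None if it has no parent)
    preceding_baseline: set of element IDs in the preceding baseline
    """
    orphans = []
    for element in new_baseline:
        node, seen = element, set()
        # Climb the parent chain until we reach the preceding baseline or run out.
        while node is not None and node not in preceding_baseline and node not in seen:
            seen.add(node)
            node = new_baseline.get(node)
        if node is None or node in seen:  # no mapping found (or a cycle)
            orphans.append(element)
    return orphans


print(untraceable_elements({"DET-3": "ARD-1", "DET-9": None}, {"ARD-1", "ARD-2"}))
# ['DET-9'] cannot be traced to the preceding baseline
```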
Responsibilities

In addition to the responsibilities identified in the preamble to these policies, the project configuration manager shall be responsible for the following:
1. Supporting the project software lead engineer in the preparation and conduct of the project's technical reviews
2. Supporting the project V&V leader in the preparation and conduct of the project's software configuration audit of the validated software
3. Providing an update of the project's SCSR to the review or audit team
4. Obtaining copies of baselined material for use during the review or audit
5. Providing a copy of all CRAs generated since the last review or audit
6. Conducting configuration audits of the project's "to-be-established" and "established" configuration baselines
POLICY 11 ANOMALY REPORTING AND RESOLUTION
Policy

The project configuration manager shall be responsible for the proper documentation and reporting of software configuration management anomalies on a Software Anomaly Report. All anomalies shall be reported regardless of the perceived impact on software development or the severity of the anomaly with respect to system operation. Software Anomaly Reports shall be reviewed by the project software lead engineer for anomaly solution determination and implementation authorization. The project software V&V lead engineer shall be responsible for anomaly report closure. The SE [title/position] shall be responsible for the approval or disapproval of the distribution of Software Anomaly Reports.
Requirements

1. A Software Anomaly Report shall be used to identify problems detected during software configuration management activities. The specific information required includes the following:
a. Description and location of the anomaly
b. Severity of the anomaly
c. Cause and method of identifying the anomalous behavior
d. Recommended action and actions taken to correct the anomalous behavior
e. Impact of the problem on the system capability of the product and on the continued conduct of V&V phase activities
2. The form of the Software Anomaly Report shall be as defined in the relevant SE software configuration management procedures. The configuration identification, tracking, and status reporting of Software Anomaly Reports shall be in accordance with the project's Software Configuration Management Plan (SCMP).
3. The projected impact of an anomaly shall be determined by evaluating the severity of its effect on the operation of the system. The severity of a Software Anomaly Report shall be defined as one of the following:
• High. The change is required to correct a condition that prevents or seriously degrades a system objective where no alternative exists, or to correct a safety-related problem.
• Medium. The change is required to correct a condition that degrades a system objective, to provide for performance improvement, or to confirm that the user or system requirements can be met.
• Low. The change is desirable to maintain the system, to correct an operator inconvenience, or for any other minor reason.
4. The project software V&V lead engineer shall be responsible for ensuring the proper documentation and reporting of software anomalies. All anomalies shall be reported regardless of the perceived impact on software development or the severity of the anomaly with respect to the system operation.
5. Software Anomaly Reports shall be reviewed by the project software lead engineer for anomaly validity, type, and severity. The project software lead engineer can direct additional investigation if required to assess the validity of the anomaly or the proposed solution. An anomaly solution that does not require a change to a baselined software configuration item may be approved by the project software lead engineer. If the anomaly requires a change to a baselined software configuration item, then the anomaly solution shall be approved in accordance with the project's SCMP.
6. When an anomaly solution is approved and the personnel responsible for performing the corrective action are indicated, the project software lead engineer shall authorize implementation of the corrective action.
7. The project software V&V lead engineer shall be responsible for anomaly report closure, which includes the following:
a. Documenting the corrective action(s) taken
b. Verifying the incorporation of authorized changes as described in the anomaly report
c. Reporting the status of the Software Anomaly Report to the software lead engineer and the SE [title/position]
8. The SE [title/position] shall be responsible for the approval or disapproval of the distribution of Software Anomaly Reports that are reported closed. When distribution is approved, the project software V&V lead engineer shall distribute closed Software Anomaly Reports to the software project Quality Assurance representative(s).
9. The SE [title/position] shall ensure the resolution of anomalies that are indicated on the Software Anomaly Report with a severity of "high" before the software project proceeds to the next software development phase.
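A sketch of the severity scale (requirement 3) and the phase-gate rule (requirement 9) follows; the enum and function names are illustrative, not prescribed by this policy.

```python
from enum import Enum


class Severity(Enum):
    HIGH = "prevents or seriously degrades a system objective, or is safety related"
    MEDIUM = "degrades a system objective or confirms requirements can be met"
    LOW = "maintenance, operator convenience, or other minor correction"


def may_proceed_to_next_phase(open_anomaly_severities):
    """Requirement 9: no unresolved high-severity anomalies may remain."""
    return Severity.HIGH not in open_anomaly_severities


print(may_proceed_to_next_phase([Severity.LOW, Severity.MEDIUM]))  # True
print(may_proceed_to_next_phase([Severity.HIGH]))                  # False
```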
Responsibilities

In addition to the responsibilities identified in the preamble to these policies, the following responsibilities are prescribed by this policy:
1. The project software V&V lead engineer shall be responsible for the following:
a. Proper documentation and reporting of software anomalies
b. Anomaly report closure
c. Distribution of closed Software Anomaly Reports to the software project Quality Assurance representative(s)
2. The software lead engineer shall be responsible for the following:
a. Review of Software Anomaly Reports for anomaly validity, type, and severity
b. Direction of additional investigation if required to assess the validity of the anomaly or the proposed solution
c. Approval or disapproval of an anomaly solution that does not require a change to a baselined software configuration item
d. Authorization of corrective action implementation
3. The SE [title/position] shall be responsible for the following:
a. Approval or disapproval of the distribution of closed Software Anomaly Reports
b. Ensuring the resolution of anomalies that are indicated on the Software Anomaly Report with a severity of "high" before the software project proceeds to the next software development phase
GLOSSARY

Accuracy: Quantitative assessment of freedom from error.
Archive: Provisions made for storing and retrieving records over a long period of time.
Audit: Independent review for the purpose of assessing compliance with software requirements, specifications, baselines, standards, procedures, instructions, and coding requirements.
Baseline: Specification or product that has been formally reviewed and agreed upon, that thereafter serves as the basis for further development, and that can be changed only through formal change control procedures.
Change control: Process by which a change is proposed, evaluated, approved or rejected, scheduled, and tracked.
Change Request/Approval (CRA): Form used to document changes to a baseline.
Code: Loosely, one or more computer programs or part of a computer program.
Completeness: Those attributes of the software or documentation that provide full implementation of the functions required.
Component: Unit of code that performs a specific task or a group of logically related code units that perform a specific task or set of tasks.
Computer program: Sequence of instructions suitable for processing by a computer. Processing may include the use of an assembler, a compiler, an interpreter, or a translator to prepare the program for execution as well as to execute it.
Configuration audit: Process of verifying that all required configuration items have been produced, that the current version agrees with specified requirements, that the technical documentation completely and accurately describes the configuration items, and that all change requests have been resolved.
Configuration control: Process of evaluating, approving or disapproving, and coordinating changes to configuration items after their configuration identification has been formally established.
Configuration identification: Process of designating the configuration items in a system and recording their characteristics.
Configuration item: Aggregation of hardware, software, or any of its discrete parts that satisfies an end-use function.
Configuration management (CM): Process of identifying and defining the configuration items in a system, controlling the release and change of these items throughout the product life cycle, recording and reporting the status of configuration items and change requests, and verifying the completeness and correctness of configuration items.
Configuration status accounting: Recording and reporting of the information that is needed to manage a configuration effectively, including a listing of the approved configuration identification, the status of proposed changes to the configuration, and the implementation status of approved changes.
Correctness: Extent to which software is free of design defects, coding defects, and faults; meets its specified requirements; and meets user expectations.
Delivery: Transfer of responsibility for an item from one activity to another, as in the delivery of the validated software product to quality assurance personnel for certification.
Design phase: Period in the software development cycle during which the designs for architecture, software components, interfaces, and data are created, documented, and verified to satisfy requirements.
Deviation: Authorization for a future activity, event, or product to depart from standard procedures.
Documentation: Manuals, written procedures or policies, records, or reports that provide information concerning uses, maintenance, or validation of software.
Error: Discrepancy between a computed, observed, or measured value or condition and the true, specified, or theoretically correct value or condition.
Evaluation: Process of determining whether an item or activity meets specified criteria.
Failure: Inability of a system or system component to perform its required function (see fault).
Fault: Defect of a system or system component, caused by a defective, missing, or extraneous instruction or set of related instructions in the definition, specification, design, or implementation of a system, that may lead to a failure.
Implementation phase: Period in the software development cycle during which a software product is created from design documentation and debugged.
Integrity: Accuracy in an item's compliance with its requirements.
Procurement package: All specifications, documents, statements of work, and data required to describe a complete task for submittal to prospective subcontractors and vendors.
Product history file: Compilation of records containing the complete development history of a finished product.
Quality assurance (QA): Planned and systematic pattern of all actions necessary to provide adequate confidence that the item or product conforms to established technical requirements.
Requirements phase: Period in the software development cycle during which the requirements, such as functional and performance capabilities for a software product, are defined and documented.
Software: Computer programs, procedures, rules, and associated documentation and data pertaining to the operation of a computer system.
Software Change Review Board (SCRB): Forum for the evaluation, approval, monitoring, and control of changes to software baselines.
Software configuration management (SCM): Discipline of identifying the configuration of a software system at discrete points in time for the purpose of systematically controlling changes to this configuration and maintaining the integrity and traceability of this configuration throughout the development process.
Software Configuration Management Plan (SCMP): Project-specific plan that specifies the methods and planning employed to implement SCM activities.
Software Configuration Management Policy Change Control Board (SCMPCCB): Members appointed to establish and maintain a set of Software Configuration Management Policies in order to achieve quality in all phases of the software development life cycle.
Software Configuration Status Report (SCSR): Document that reports the results of configuration status accounting performed on a software project.
Software development library: Software library containing computer-readable and human-readable information relevant to a software development effort.
Software development life cycle: Period that starts with the development of a software product and ends when the product is validated and delivered for QA certification. This life cycle includes a requirements phase, design phase, implementation phase, and software validation phase.
Software Development Plan (SDP): Project-specific plan that identifies and describes the procedures used to implement the management activities that coordinate schedules, control resources, initiate actions, and monitor progress of the software development effort.
Software end products: Computer programs, software documentation, and databases produced by a software development project.
Software library: Controlled collection of software and related documentation designed to aid in software development, use, or maintenance.
Software project: Planned and authorized undertaking of specified scope and duration that results in the expenditure of resources toward the development of a product that is primarily one or more computer programs.
Software Requirements Review (SRR): Software review conducted to review the provisions of the Software Requirements Specification, which, once approved, will serve as the basis of acceptance of the software end product.
Software Requirements Specification (SRS): Project-specific document that provides a controlled statement of the functional, performance, and external interface requirements for the software end products.
Software validation phase: Period in the software development life cycle in which the components of a software product are evaluated and integrated and the entire software product is evaluated to determine whether requirements have been satisfied.
Software Verification and Validation Plan (SVVP): Project-specific plan that describes the project's unique verification and validation organization, activities, schedule, inputs and outputs, and any deviations from the software policies required for effective management of verification and validation tasks.
Technical reviews: Meetings at which the software end products of a phase of software development are presented for the purpose of end-product review, issue resolution, and obtaining commitment to proceed into the next software development phase.
Validation: Process of evaluating software at the end of the software development process to ensure compliance with software requirements.
Verification: Process of determining whether the products of a given phase of the software development cycle fulfill the requirements established during the previous phase.
Waiver: Authorization to depart from SE policy for an activity, event, or product that has already been initiated.
SE-DCP SOFTWARE ENGINEERING DESIGN CONTROL POLICIES
Written by: [Name/Title/Position]  Date
Reviewed by: [Name/Title/Position]  Date
Approved by: [Name/Title/Position]  Date
Approved by: [Name/Title/Position]  Date

Document Number: [aaa]-SEDCP-[#.#]  Revision: [#.#]  Page: 1 of [#]

REVISION HISTORY

Revision  Description  Date
[##.##]  [Revision description]  [mm/dd/yy]
CONTENTS

1.0 PURPOSE
2.0 SCOPE
3.0 RESPONSIBILITY
4.0 PROCEDURE
APPENDIX A  Process Record of Deviation or Waiver Approval
1.0 PURPOSE
The purpose of this document is to describe the general, high-level software design control philosophy and methodology used for [company name] software.
2.0 SCOPE
This procedure applies to the design control of the following [company name] software: (1) product, engineering, embedded systems, or application software; and (2) quality system software applications used to trace, track, or generate reports containing quality-related data, such as defect tracking, field problem reporting, and so forth. This procedure covers [company name] internal software development, vendor-supplied software, and contractor-supplied software.
3.0 RESPONSIBILITY
This document applies to the design control of all software developed by or for [company name] engineering, [insert any other company organizations, such as administration, Information Systems (IS), or Information Technology (IT), that will be using this design control policy to govern software development].
3.1 Software Development Responsibilities

An engineer or team will be designated by engineering [list other company organizations such as administration, IS, or IT] to undertake a planned and authorized project of specified scope and duration which results in the expenditure of resources toward the development and/or implementation of a software end product.
3.2 Software Verification and Validation Responsibilities

A software verification and validation (V&V) engineer will be designated by engineering [list other company organizations such as administration, IS, or IT] to perform and coordinate software V&V activities, including assessment, evaluation, analysis, review, testing, and the generation of associated documentation.
3.3 Software Design Control Responsibilities

The Quality Systems (QS) group will define and maintain this procedure and audit compliance with this procedure. The QS group will make all decisions regarding the [company name] software packages, applications, or systems that are governed by this procedure.
4.0 PROCEDURE
A software project is a planned and authorized undertaking of specified scope and duration which results in the expenditure of resources toward the development of a product that is primarily one or more computer programs. Each software project requires that V&V activities be performed to ensure an independent assessment and measurement of the correctness, accuracy, consistency, completeness, robustness, and testability of the software requirements, design, and implementation. All software developed by or for [company name] by employees or consultants will be subjected to appropriate V&V activities as generically specified in the following sections.
4.1 [company name] Developed Software for Products or Processes

4.1.1 The specifics of software V&V and software development activities, comprising the software life-cycle methodology, procedures, and practices, shall be detailed in writing, external to the software project, in separate software engineering process documents.
4.1.2 The end result of V&V is written affirmation stating that the software was developed in accordance with documented software engineering procedures, that good quality assurance procedures were adhered to, and that test results demonstrate that system and software specifications and/or functional requirements were met.

4.1.3 The results of the software development and related V&V activities are to be kept on file in [enter location where documents are kept such as Document Control] as part of the [enter formal name of document repository such as Product History File]. This repository will hold the appropriate software development documents, V&V documents, and vendor V&V supporting documents.
4.1.4 Software project documentation will consist of specifications and documents detailing functional units or modules and their operations; traceability to higher requirements, safety, and hazards; and specification of the software project development activities and quality assurance procedures.
4.1.5 Software project documentation will provide specifications of the V&V activities; test plans for all functional units or modules and test completion criteria; a software functional test plan; traceability among requirements, safety, hazards, and testing; and a software V&V test report and summary with results of testing at all levels.
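As an illustration of the traceability called for in sections 4.1.4 and 4.1.5, the sketch below checks that every requirement traces to at least one test case. The matrix shape and the requirement and test identifiers are hypothetical.

```python
def requirements_without_tests(trace_matrix):
    """trace_matrix: requirement ID -> list of test case IDs covering it."""
    return sorted(req for req, tests in trace_matrix.items() if not tests)


matrix = {
    "SRS-001": ["TC-010", "TC-011"],  # covered by two functional tests
    "SRS-002": [],                    # no coverage: flagged for the V&V report
}
print(requirements_without_tests(matrix))  # ['SRS-002']
```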
4.2 [company name] Developed Software for [enter other company organizations such as administration, IS, or IT not using this design control policy] 4.2.1
The specifics of V&V and software development activities shall be documented in separate business system or information system software engineering process documents that describe software activities related to [company name] business system, quality system, or information system software development projects. This can be satisfied with either [company name] [enter other company organizations such as administration, IS, or IT] software engineering procedures or product development software engineering procedures.
4.2.2 The end result of V&V is written affirmation stating that the software was developed in accordance with documented software engineering procedures, that good quality assurance procedures were adhered to, and that test results demonstrate that system specifications and/or functional requirements were met.
4.2.3 The results of the software development and related V&V activities are to be kept on file in the [enter location where documents are kept such as Document Control, IS or IT] area. The [enter location where documents are kept such as Document Control, IS or IT] area will also hold vendor V&V supporting documents.
4.3 Vendor-Supplied ("Off-the-Shelf") Software
4.3.1 All off-the-shelf software shall be validated for intended use with both correct and incorrect inputs. The software package shall produce intended and correct outputs for each type of input.
4.3.2 Software purchased from a supplier does not require V&V when the supplier can provide evidence of at least one of the following: (1) a product V&V certification; (2) an error or bug tracking and reporting capability; (3) software maintenance, revision, and upgrade capabilities; or (4) a software quality assurance program. This type of software has software-industry-accepted V&V based on validation through common usage and wide distribution.
4.4 Vendor-Supplied Commercial Equipment With Software
Commercial equipment with incorporated software purchased from a supplier and proven through use does not require V&V when the supplier can provide evidence of at least one of the following: (1) a product V&V certification; (2) an error or bug tracking and reporting capability; (3) software maintenance, revision, and upgrade capabilities; (4) a software quality assurance program; or (5) test programs that may be used to assure that the equipment will appropriately and accurately perform all intended functions before it is used for routine production.
4.4.1 Automated production and test equipment that is controlled by software can be validated through the use of a "golden unit," which should exercise all functions and decisions in normal and worst-case situations that may be expected during normal production use.
4.4.2 Automated machine tools that are controlled by software can be validated by conducting a first-piece and last-piece inspection of representative product lots. The record of this activity may be noted on the routine quality control or production records for the machine.
4.5 Contractor-Supplied Software
Departments that contract for the purchase of software from subcontractors or vendors who design and/or produce software shall establish a procurement package for assuring that the subcontractor or vendor will produce quality software. The subcontractor or vendor shall comply with the same requirements imposed in this document. The contracting department shall reserve the right to review the subcontractor's or vendor's configuration management system prior to contract award and to periodically audit that system subsequent to contract award to assure that adequate methods are implemented for identifying and controlling each end product produced.
4.5.1 The procurement package will provide assurance that the requirements for the software are clearly defined, communicated, and completely understood by the subcontractor or vendor. This may require written procedures for the preparation of requirements and purchase orders, subcontractor or vendor conferences prior to contract release, and other appropriate methods.
4.5.2 Acceptance procedures for contractor- or vendor-supplied software may include third-party certification. [company name] shall have the primary responsibility for assuring that the software is adequate for its intended use. When third-party certification is used, the certification package shall include adequate documented evidence that the software complies with specified requirements. Examples of such evidence include the criteria listed in the above sections.
APPENDIX A
SOFTWARE ENGINEERING PROCESS RECORD OF DEVIATION OR WAIVER APPROVAL

PROJECT:
TYPE: Deviation or Waiver
PHASE:
SOP Requirement Paragraph(s):

Initiated by: ________________________________  ________________________  Date: ____________
              Signature                          Title/Position

Reviewed by:  ________________________________  ________________________  Date: ____________
              Signature                          Title/Position

Approved by:  ________________________________  ________________________  Date: ____________
              Signature                          Title/Position

Reason/Rationale/Explanation:

Project schedule and performance impact:

Project risk:

Alternative approach to be used:
SE-SDG SOFTWARE ENGINEERING SOFTWARE DEVELOPMENT GUIDELINES
Written by:  [Name/Title/Position]    Date:
Reviewed by: [Name/Title/Position]    Date:
Approved by: [Name/Title/Position]    Date:

Document Number: [aaa]-SESDG-[#.#]    Revision: [#.#]    Page: 1 of [#]
REVISION HISTORY

Revision    Description               Date
[##.##]     [Revision description]    [mm/dd/yy]
CONTENTS

1.0 INTRODUCTION
2.0 PROGRAM ORGANIZATION AND STYLE
3.0 INTERNAL SOURCE CODE DOCUMENTATION
4.0 NAMING CONVENTIONS
5.0 CONSTRUCTS
6.0 CODING PRACTICES
7.0 HARDWARE INTERFACE
8.0 SOFTWARE REPORTING PRACTICES
APPENDIX A: Assembly Language Programming Guidelines
APPENDIX B: C Programming Guidelines
APPENDIX C-1: Software Anomaly Report
APPENDIX C-2: Instructions for Completing Software Anomaly Report
APPENDIX D: Software Development Record of Deviation or Waiver Approval
GLOSSARY
1.0 INTRODUCTION
1.1 Purpose
This document has been developed to define consistent methods, standards, conventions, practices, and styles for software development and the resultant source code.
1.2 Scope
The guidelines in this document are to be used to promote consistency in developing all software. The software development teams will use this guide to establish common practices for software development and code generation. This guide will be used as a general template for any software language.
1.3 Overview
This document covers the following:
• Organization of files and internal documentation
• Coding style and recommended limits on function and file sizes
• Conventions used for naming program items
• Recommended program and logical constructs
• Optimization rationale and generic methods
1.4 References
1.4.1 Industry Standards
Unless otherwise specified, the latest revision of the following documents shall be used:
• ANSI X3.159-1989, C Language Specification
• ANSI X3.53-1976, PL/I Language Specification
• ANSI X3.9-1978, FORTRAN Language Specification
• ANSI/MIL-STD-1815A, Ada Language Specification
• ANSI X3.4-1977, Information Exchange Specification
• ANSI/IEEE 1016-1986, Software Design Description
• ANSI/IEEE 729-1983, Glossary of Software Engineering Terminology
• DOD-STD-2167, Defense Systems Software Standards
• DOD-STD-7935, Automated Data Systems Documentation
• MIL-STD-490, Specification Practices
1.4.2 Corporate Documents
• Product Development Safety Design Guidelines, Revision [#.#], dated [date]
• Product Development User Interface Design Guidelines, Revision [#.#], dated [date]
• Software Engineering Configuration Management Guidelines, Revision [#.#], dated [date]
• Software Engineering Software Configuration Management Policies, Revision [#.#], dated [date]
• Software Engineering Software Development Policies, Revision [#.#], dated [date]
• Software Engineering Verification and Validation Guidelines, Revision [#.#], dated [date]
• Software Engineering Verification and Validation Policies, Revision [#.#], dated [date]
1.4.3 Other Documents
• C Programming Guidelines. T. Plum. Prentice-Hall, 1984.
• The C Programming Language, Second Edition. B.W. Kernighan and D.M. Ritchie. Prentice-Hall, 1988.
• Software Engineering Economics. B.W. Boehm. Prentice-Hall, 1981.
2.0 PROGRAM ORGANIZATION AND STYLE
All software should be written in a manner that enhances readability. Each individual module, file, function, procedure, and subroutine within the program should follow the same general format and be logically ordered. The following guidelines should be used when making style decisions.
2.1 Cohesion
There are seven levels of cohesion, listed here from the least desirable to the most desirable:
1. Coincidental cohesion. Tasks performed by a module are grouped purely by coincidence.
2. Logical cohesion. Tasks performed by a module are independent but logically similar.
3. Temporal cohesion. Tasks performed by a module are related by time.
4. Procedural cohesion. Tasks performed by a module are related by the order in which program control flow must occur.
5. Communicational cohesion. Tasks performed by a module are related through a requirement to access common data.
6. Sequential cohesion. Tasks performed by a module are related by the order in which program data flow must occur, such that the output of one task is the input to the next task.
7. Functional cohesion. The module performs a single task.
A program module should be communicatively, sequentially, or functionally cohesive; functional cohesion is preferred. If the purpose of a module can be summarized in a single sentence of the form "specific verb + specific object(s)," that module is functionally cohesive.
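For illustration, a minimal C sketch of the two extremes (all names are invented for this sketch): the first function passes the single-sentence test ("compute the average"), while the second groups unrelated tasks and is merely coincidentally cohesive.

    #include <stddef.h>

    /* Functionally cohesive: "compute the average" -- one verb, one object. */
    double compute_average(const double *samples, size_t count)
        {
        double sum = 0.0;
        size_t i;

        for (i = 0; i < count; i++)
            {
            sum += samples[i];
            }
        return (count > 0) ? (sum / (double)count) : 0.0;
        }

    /* Coincidentally cohesive: unrelated tasks grouped by accident; avoid. */
    void do_startup_chores(int *alarm_flag, long *uptime_ticks)
        {
        *alarm_flag = 0;       /* alarm handling */
        *uptime_ticks = 0;     /* timekeeping    */
        }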
2.2 Coupling
There are five types of coupling, listed here from the least desirable to the most desirable:
1. Content coupling. Modules are content coupled if a module alters a statement in another module, a module refers to or changes data contained within another module, or a module branches to another module.
2. Common coupling. Modules are common coupled if they share common data.
3. Control coupling. Modules that pass control parameters are control coupled.
4. Stamp coupling. Modules are stamp coupled if they communicate using a data structure.
5. Data coupling. Modules that communicate by passing ordinary data in a call or return statement are data coupled.
Data coupling is the preferred method of coupling, but passing data through several modules before they reach the module that operates on them should be avoided. Control coupling, if required, should be implemented only to pass data from a higher to a lower level module in the hierarchy. Module coupling is considered optimum when a new module can replace any particular module without affecting any other software component in the system.
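A short C sketch of the contrast (names invented): the first routine is common coupled through a shared global, while the second communicates only through its call and return and is therefore data coupled.

    /* Common coupling: the dependency on g_rate is hidden from the caller. */
    static int g_rate;

    static int scale_common(int value)
        {
        return value * g_rate;
        }

    /* Data coupling: all inputs and outputs pass through the interface. */
    static int scale_data(int value, int rate)
        {
        return value * rate;
        }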
2.3 Declarations and Definitions
2.3.1 Function and Variable Definitions
All functions and variables should be defined at the beginning of the source code file before they are used. Explicit data types, return types, and return values should be used instead of default values or types, because the latter can lead to incorrect code and their type is less obvious. Explicit exit types and values should be used for the same reason.
2.3.2 Definition Locations
Definitions of literals, constants, macros, structures, and function prototypes should be grouped into header files that are separate from the source code files. These header files may be combined with source code files to create the file that is then compiled or assembled. The use of header files simplifies generation of source files that use the same definitions.
2.3.3 Data Definition Locations
Data should not be defined in header files, because they may be included in more than one source file, which causes multiple definitions of the same data. Global data definitions should be grouped into source files that have no executable instructions. This will make global data easier to find and will simplify maintenance. Grouping of data that are common to a set of programs that make up a task will also facilitate understanding of the task and simplify maintenance.
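In C, one common way to honor this rule is sketched below with invented file and variable names: the header carries only an extern declaration, and a single data-only source file carries the definition.

    /* motor_data.h -- declaration only; safe to include in many files */
    #ifndef MOTOR_DATA_H
    #define MOTOR_DATA_H

    extern int g_motor_speed;     /* declared here, defined exactly once */

    #endif /* MOTOR_DATA_H */

    /* motor_data.c -- data-only source file with no executable instructions */
    #include "motor_data.h"

    int g_motor_speed = 0;        /* the single definition of the global */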
2.4 Libraries
Groups of object modules should be combined into libraries, which can then be thought of as a single module in terms of usage, configuration control, documentation, and verification and validation (V&V). The advantage of using libraries is that the library will be more stable than a large number of individual files. The use of libraries to group similar modules also encourages reusability of previously debugged and verified code.
2.5 Directives
The directives of a tool, or any other means it provides for performing special and/or unique operations, should be used. The use of directives allows for special non-code-related tool functions and support and makes the human interaction easier. All directives for a given file should be at the beginning of the file unless their functions prohibit this. Directives include those of the following:
• Compiler
• Assembler
• Linker
• Loader
• Locator
• Library manager
• MAKE program
• Debug switches
• Environment variables
Prior to beginning any coding, a thorough review of the tool directives should be made in order to determine the correct assignments.
2.6 File Organization
2.6.1 File Directories
Directory structures should be uniform throughout all software projects, and they should be organized logically to aid in clarity and maintainability. Source and generated files should reside in different directories.
2.6.2 Code File Contents
If only one function resides in each file, only a function prologue is necessary. However, if multiple functions are grouped together within a file, a designation for the group should precede any functions and explain, at a minimum, what the functions have in common and why they are grouped together. Each file should be organized so that text substitutions appear first, data declarations and initialization next, and then the code.
2.6.3 Number of Functions Per File
In general, files should be limited to one function.
2.7 Function Size
Functions should consist of 100 or fewer source lines and be no longer than a screen length. This guideline does not require artificial splitting of cohesive code into smaller pieces.
2.8 Layout
2.8.1 Tabs Versus Spaces
The use of tabs within the documentation of a project can cause document output by different tools to be inconsistent in appearance. If all of the tools used on the project can be configured to produce the same output when using tabs, then tabs may be used. If tabs are used, tab stops should be the equivalent of four spaces. If one or more tools used on a project cannot be configured to the same number of tabs as all of the other tools, spaces should be used for indentation.
2.8.2 White Space and Blank Lines
White space should be used to enhance program readability. Succeeding logical levels should be indented four spaces. Secondary and following lines of a multiple-line statement should be indented to show logical constructs, but not less than two spaces. Blank lines should also be used wherever they promote readability. A blank line should be left between the logical sections of a program. If more than one function resides in a single file, two or more blank lines should be left between them.
2.8.3 Page Width
The maximum page width of source files should be limited to the editor screen width. This allows ease of viewing and editing. An additional limitation should be that the page width must be sized so that it is compatible with the printer used. This allows ease of viewing hard copy material.
2.8.4 Columns
Structure definitions, table initializations, and other similar constructs should be organized in neat columns for easy reading. The enclosing braces of a table should be in the same column.
2.8.5 Code
Multiple statements per line should be avoided.
3.0 INTERNAL SOURCE CODE DOCUMENTATION
The purpose of using source-code-embedded documentation is to provide an internal guide to understanding the source code for debugging, modification, testing, communicating with other readers, and maintaining the program. The basic elements of internal source code documentation should include function prologues, block comments, in-line comments, and documentation available in ROM.
3.1 Include Files
Include files are files that are included in other source files prior to processing by a compiler or an assembler, and they are used to declare the standard library and other project-related library functions. These files are also used to contain the templates for data organization, definitions, macros, function prototypes, and text substitutions that are shared among multiple files. However, include files should not be used for declaration or initialization of source code data. They should be functionally organized, so that declarations for separate subsystems are in separate include files. If a set of declarations is likely to change when code is moved from one machine to another, those declarations should be in a separate include file. Nested include files should be avoided. The purpose of each include file should be explicitly described in a comment block at the beginning of the file, and all items in the include file should be clearly documented. Unique file extensions may be used to categorize the files by content. Include files should be listed at the top of the code module in which they are used.
3.2 Function Prologues
Function prologue comments should appear immediately following the routine header statement and parameter declarations. The basic format for such comments should include the following (a sample prologue appears after this list):
• Description of what the function or subroutine does
• Calling argument descriptions:
  – Input. Arguments that are used or can be accessed
  – Output. Arguments that are modified
  – I/O. Arguments that are used, can be accessed, and are modified
  – Returned value(s)
• Data structures, global data, and parameter descriptions by name:
  – Input parameter descriptions
  – Output parameter descriptions
• Module name. Name of the task or module in which the function or subroutine is incorporated
• Function or subroutine development history, such as copyright notice, original author, date of creation, and revision number
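A minimal prologue in this format might look like the following; the function, module, and history details are invented for illustration.

    /**********************************************************************
     * pump_rate_set -- Set the pump delivery rate.
     *
     * Calling arguments:
     *   Input:   rate_ml_per_hr -- requested rate in mL/hr
     *   Output:  none
     *   Returns: 0 on success, -1 if the rate is out of range
     *
     * Global data:
     *   Output:  g_pump_rate -- updated with the accepted rate
     *
     * Module:  PUMP control task
     * History: Rev 1.0, J. Doe, 01/15/2002 -- original version
     **********************************************************************/
    int pump_rate_set(int rate_ml_per_hr);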
3.3 Source Code Comments
3.3.1 Block Comments
Sections of explanatory text should be used to describe blocks of code rather than commenting on every line. Block comments should be separated from code by using blank lines or indentation and may contain pseudocode for complicated data structures or algorithms. Use comments to explain processing, side effects, or unusual operations.
3.3.2 In-line Comments
Functions or subroutines should be documented with one-line comments placed liberally throughout the source code. A comment should appear immediately above the code it describes and should be tabbed over a uniform distance. A short comment may appear on the same line as the code it describes, provided that the comment does not require a second line and does not violate any other guidelines for page width or columns.
3.4 Documentation in ROM
Internal source code documentation that ends up in the ROM should include the following in at least one location of every subsystem:
• Copyright notice
• Subsystem name
• Software version and revision number
In addition, in one location of every system the following information should be included:
• Copyright notice
• Device type or model identification
• Device serial number
• Compatible electrical engineering (EE) revision number(s)
• Compatible mechanical engineering (ME) revision number(s)
• Compatible software engineering (SE) revision number(s)
4.0 NAMING CONVENTIONS
4.1 General
Names of files, functions, variables, structures, and macros should be selected to impart the contents and purpose of the named unit to programmers, users, and maintainers of the software. The naming format to be used on a project should be selected before coding has started, documented in the Software Detailed Design Specification (SDDS), and maintained for the duration of the life of the product. Names should be selected so that they cannot be confused with other groupings of names. A file name should correspond to the task it represents. Characters selected for names must be compatible with the character set allowed for names in the language selected. If multiple languages are used, only the characters common to all of the languages should be used for names within that project. Names should not be assumed to be case sensitive. The length selected for names must also be compatible with the language selected. Names must be unique within the length allowed by the language. If multiple languages are used, the names must be unique within the shortest of the lengths allowed by the languages used within that project. Names of more than four characters should differ by at least two characters.
4.2 Scope
The scope of a variable should be obvious from its name. A part of the name may be used to allow the user to determine the scope of the variable without the need to look up the definition. Each project should develop a naming scheme for macros, variables, functions, and pointers that identifies these entities as belonging to a particular module, task, or function operation. In addition, global variables that serve as identifiers of specific operating system task entities, such as mailboxes or task control blocks, should be marked appropriately.
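One possible scheme of this kind, assuming an invented convention in which a leading letter marks the scope (g = global, f = file, l = local) and the next letters name the owning module:

    int g_pump_alarm_count;          /* g_ : global, owned by the pump module */
    static int f_pump_retry_limit;   /* f_ : file scope, visible in this file */

    static void pump_service(void)
        {
        int l_elapsed_ticks = 0;     /* l_ : local; scope is clear from name */

        (void)l_elapsed_ticks;       /* placeholder use for this sketch */
        }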
4.2.1 Variables
Variables may have global, physical task, file, or local scope and should not be all uppercase characters.
4.2.2 Data Structures
Data structures may have global, physical task, file, or local scope and should not be all uppercase characters. Data structures should follow conventions associated with their form and use. All data structures should be documented to indicate type and purpose.
4.2.3 Macros
Macro names should be completely separate and unique from all other names and should be defined with names that are all uppercase.
4.2.4 Functions
Function names should be as descriptive as possible of the operational functions that they perform. Functions should also be grouped using one or more characters of the name that allow users to understand which functions make up a task or logically grouped set of functions.
4.3 Mnemonic
After the letters that indicate the scope or grouping of a file, function, variable, or structure, the remaining characters of the name should be used to convey the function of the entity. Descriptive and meaningful names should be used to assist the user in understanding the basic purpose of an entity without the need to look up the definition.
4.4 File Names
File names must conform to the conventions of the hardware platform that will hold the source files of the project. If more than one platform is used on a project, the file names must be constructed to be compatible with all systems used. Some system conventions of concern are the number of characters allowed in file names, the characters allowed in file names, the use of file extensions, the use of version extensions, and the size of extensions. Characters that are unique to the selected system should be avoided.
4.4.1 Source Code Files
The files that make up a task or logical set of functions should be grouped by using file names that are identical in one or more characters. A file name should use the name of the function that it contains or as much of the name as allowed by the file system. If more than one function is contained in the file, the name of the file should indicate the functionality of the set of functions in the file.
4.4.2 Source Files
Files that are related in their function, content, or use should be grouped by using file names that are identical in one or more characters.
4.4.3 Object and Executable Generated Files
The source code object modules and their executable files should have the same name as the source code file name and the same name as the file used to produce them.
4.4.4 Library Files
The library name should indicate the function performed by the library.
4.4.5 Generated Files
Files that are related by their generation or content should be grouped by using file names that are identical in one or more characters.
4.5 Abbreviations
If abbreviations are used, only one abbreviation version should be used for any object. For example, either "ptr" or "pt" may be used as the abbreviation for "pointer," but only one should be used on a given project. Standard abbreviations should be used if possible.
5.0 CONSTRUCTS
Constructs should be used in a disciplined and structured manner, and they should help eliminate obscurity, not cause it.
5.1 Control Constructs
The three types of control structures are sequencing, selection, and repetition. Sequencing is used to indicate that the execution of one statement immediately follows the execution of another statement. It is possible to group statements of a sequence together to form a unique compound statement. Selection control constructs allow a selection to be made among a number of possible alternative statements. This is usually expressed as an "if," "if-then," or "if-then-else" construct, as a "case" statement, or as a select statement with multiple selection. Repetition constructs allow a certain set of instructions to be executed repeatedly a finite number of times. These instructions are expressed as "for," "while," "repeat," or "do" loops.
5.2 Use of Constructs
Multiple selection is favored over sequential instances of single selection if there are three or more choices involved. "GOTO" commands should be avoided and structured constructs used.
5.3 Exit and End Statements
Exit, end, or process-terminating statements and similar commands should be used only for error exits.
5.4 Nesting
Nesting of constructs should be used wherever it enhances program clarity and readability and should be reinforced through the use of indentation. Nesting should not exceed seven levels.
6.0 CODING PRACTICES
The coding practices defined here are to be used as a guide for developing code in areas of the system that require innovative approaches to achieve the required levels of performance. Simple and direct coding should be used in all cases, unless it can be shown that more complex code is mandatory to reach the required level of performance.
6.1 Parentheses and Grouping
Use of implied operator precedence should be avoided in both logical and arithmetic expressions. Parentheses should be used to explicitly demonstrate execution order.
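A trivial C illustration: both expressions below yield the same value, but only the second states the intended execution order explicitly instead of relying on implied precedence.

    int scale(int a, int b, int c)
        {
        /* Relies on implied operator precedence -- avoid. */
        int implied = a + b * c >> 2;

        /* Parentheses make the intended execution order explicit. */
        int stated = ((a + (b * c)) >> 2);

        return (implied == stated) ? stated : 0;   /* identical values here */
        }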
6.2 Assembly Versus High-Level Language
High-level languages are preferred to low-level languages in all cases. A change from a high-level language to assembly language is generally undesirable but may be needed in order to meet performance requirements. If the performance issue is a lack of execution speed, then the change to assembly language should be considered only after the high-level language code has been examined for possible performance improvements.
6.3 Transforming State Machines to Code
State transition diagrams can be converted to executable code through the use of tables, "case" statements, and several other methods. Although one method may be more efficient than another for any particular application, once the best method for the project application has been selected, that method should be standardized and used throughout the project. The use of one method will enhance readability of the program and provide a consistent correlation between the code and the design.
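As an illustration of the table method, a sketch of an invented two-state machine; whichever method a project selects should then be used everywhere, per the guideline above.

    typedef enum { ST_IDLE, ST_RUN, ST_COUNT } state_t;
    typedef enum { EV_START, EV_STOP, EV_COUNT } event_t;

    /* Next-state table: rows are current states, columns are events. */
    static const state_t next_state[ST_COUNT][EV_COUNT] =
        {
        /*            EV_START  EV_STOP  */
        /* ST_IDLE */ { ST_RUN,  ST_IDLE },
        /* ST_RUN  */ { ST_RUN,  ST_IDLE }
        };

    static state_t current_state = ST_IDLE;

    void fsm_step(event_t ev)
        {
        current_state = next_state[current_state][ev];  /* one table lookup */
        }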
6.4 Function Calls and Macros
The trade-off between the use of macros and functions is usually the same as the trade-off between speed and code size. Macros can execute faster, since the call and return overhead of a function is not present, but each macro call essentially places a copy of the code in-line, which increases the size of the code. Macros also may be used to enhance the readability of the source code if used properly. A macro name should give an obvious clue to the function that the macro performs, and a macro should not be so complicated that its functionality goes beyond what its name suggests. Macros cannot be nested.
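The trade-off in miniature (an invented example): the macro expands in place at every use, trading code size for speed, while the function keeps one copy of the code at the cost of call overhead.

    /* Macro: a copy of the code is placed in-line at each use. */
    #define SQUARE_M(x)  ((x) * (x))

    /* Function: one copy of the code; each use pays call/return overhead. */
    static long square_f(long x)
        {
        return x * x;
        }

Note the parentheses around the macro parameter; without them, SQUARE_M(a + 1) would expand into an expression with an unintended evaluation order.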
6.5 Variables
The range, data type, and precision of all variables should be established as a part of the detailed design process. Variables should be checked when used to verify that they are within range, and data tests that show that the value is outside the allowed domain of the variable should cause an error condition to be established. The data may be reconstructed, if possible, or a system error triggered. Data checking is especially critical for safety-related variables.
6.5.1 Scope
Functions and variables should use the lowest scope required by the desired operation. A lower scope level results in a more restrictive access to the functions and variables by other system functions and reduced module coupling.
6.5.2 Global Data Versus Passed Arguments
Variables should be passed as argument parameters, unless the calling overhead becomes burdensome because of large amounts of data or multiple levels of function calls.
6.5.3 Magic Numbers, Literals, and Constants
The use of “magic numbers” within the executable source code is not recommended. Any constant or literal that is required should be defined with the mechanism of the compiler or assembler and placed in a standard area of the source file or in a header file, and the name is then used in the source code. The compiler or assembler may then use the name to interpret and replace it with the correct definition. The origin and intended usage of this type of definition should be well documented.
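A small sketch of the practice (the value and names are invented): the constant is defined once, documented, and referenced by name, so the bare number never appears in executable statements.

    /* Maximum delivery rate in mL/hr; limit drawn from the product
       requirements (value invented for this sketch). */
    #define MAX_RATE_ML_PER_HR  999

    int rate_is_valid(int rate)
        {
        return (rate > 0) && (rate <= MAX_RATE_ML_PER_HR);
        }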
6.5.4 Initialized Versus Uninitialized Variables
It is recommended that all variables be initialized by the run-time code, because this is the safest method of ensuring that the initial values are set correctly. Uninitialized variables may be used so long as the program start-up code has the ability to copy the correct initial values into all uninitialized variables. Initialization of variables is not allowed in any include files, because the file may be included in more than one executable source code file. Default initializations should not be used.
6.5.5 User-Defined Types
User-defined types should be used only to simplify high-level language source code. Defined types should not be used to create new data types that are simple redefinitions of intrinsic data types, unless the new type makes the source code easier to read and understand. User-defined types can be used to simplify complex declarations if the created type name is selected to suggest the properties of the object being declared.
6.6 Calling Conventions
All functions should be written so that the interface to other functions in the system is as simple as possible, because the other functions could be written for an assembler or a high-level language compiler. A calling convention or interface definition should be established at the beginning of the project and maintained throughout the life of the product. Function input parameters may be obtained by reading global values, from CPU registers, or from the program stack. Depending on the language used, the names of these areas may be different or undefined. Methods of returning values should be consistent for all functions in the system, and the return value with the lowest possible scope should be used.
6.7 Side Effects
Side effects are defined as processing or activities performed, or results obtained, that are secondary to the primary function of a program, subprogram, or operation. All functions have side effects, and care should be taken that all side effects are known and desired when writing a function. Undesired side effects could be caused by a register that was not preserved or a global variable that was changed unintentionally. Desired side effects are the reason for writing the function and may include changes to input parameters, global variables, or hardware register contents. The possibility of undesired side effects may be reduced by using functions and variables with the smallest possible scope, by using functions to access variables, and by implementing other data-hiding techniques.
6.8 Defensive Programming
6.8.1 Selection Control Constructs
Selection control statements must provide processing for the unspecified default case.
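In C this means every switch statement carries a default arm, even when all enumerated values appear to be covered; a sketch with invented names follows. The fault_handler routine stands in for whatever error response the project defines.

    typedef enum { MODE_OFF, MODE_STANDBY, MODE_RUN } mode_t;

    extern void fault_handler(void);     /* project-specific error handler */

    void mode_apply(mode_t mode)
        {
        switch (mode)
            {
            case MODE_OFF:        /* mode-specific processing elided */
                break;
            case MODE_STANDBY:
                break;
            case MODE_RUN:
                break;
            default:
                fault_handler();  /* unspecified value: corrupted data */
                break;
            }
        }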
6.8.2 Data Structures
Data structures should be checked to ensure that they are uncorrupted and being referenced correctly; a bounds-checking sketch follows this list. In particular:
1. Arrays. Check array bounds.
2. First in/first out (FIFO) queues, last in/first out (LIFO) queues, and stacks. Check for overflow and underflow conditions.
3. Linked lists. Check that links are intact.
4. Records. Use and check for identifiers where appropriate.
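For the array case, a bounds-checked accessor (names invented) is one simple way to apply the rule:

    #include <stddef.h>

    #define TABLE_SIZE 16

    static int table[TABLE_SIZE];
    extern void fault_handler(void);    /* project-specific error handler */

    int table_read(size_t index)
        {
        int value = 0;

        if (index >= TABLE_SIZE)        /* check array bounds before use */
            {
            fault_handler();            /* reference is out of range */
            }
        else
            {
            value = table[index];
            }
        return value;
        }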
6.8.3 Parameters
Function argument parameters should be checked on entry to ensure that they are within limits.
6.8.4 Entry and Exits
Each routine, function, or procedure should be limited to one entry point and one exit point.
6.8.5 Self-Modifying Code
Code that modifies itself or executes from the stack is prohibited at all times.
6.8.6 Error Handling
It is preferable to write routines that return status and error codes, and when such codes are provided, they should be checked.
6.8.7 Logical Expressions
Logical expressions should be explicitly defined. For example, instead of using the construct "if (expression)", use "if (expression == true)".
6.8.8 Conditional Assembly and Compilation
Conditional assembly and compilation may be used to turn code sections on and off. These sections should be clearly delineated, and the purpose for the conditional assembly or compilation should be documented.
6.8.9 Recursion
Recursive code should be used only if the language facilitates it, and it should not be simulated by the coder.
6.9 Portability
When portability is an issue, macros should be used sparingly, since their evaluation and parameter conventions are not standard or portable. Macros and other non-portable code should be kept in a separate file and clearly documented.
6.10 Optimization
Optimization is the application of techniques to improve system performance or to reduce the use of limited system resources. Optimization is a time-consuming process and, if improperly done, may occur at the expense of good design practice and portability.
6.10.1 Scope
Optimization should be done only to meet system performance requirements or resource constraints, and it should be considered only after the system design is complete. Optimization to meet performance requirements should be delayed until measurements are made to identify performance issues.
6.10.2 Side Effects
Optimization has varying levels of impact on maintainability and portability. The techniques with the lowest impact should be implemented first and followed, if necessary, by techniques with greater impact.
6.10.3 Low-Impact Optimization Techniques
Techniques that have little or no impact on portability and maintainability include the following:
• Using compiler optimization options
• Removing invariants from loops
• Changing a control flow statement to an alternate form
• Changing data types
• Eliminating variables that hold only intermediate values
• Moving code from interrupt service routines to program code
6.10.4 Moderate-Impact Optimization Techniques
Techniques that have some impact on portability and maintainability include the following:
• Converting high-level language to assembler
• Converting subroutine calls to macros
• Converting subroutine calls to in-line code
• Combining two or more modules into a single module, even though it reduces cohesion and increases the size of the module beyond the recommended limit
6.10.5 Significant-Impact Optimization Techniques
Techniques that have significant impact on portability and maintainability include the following:
• Modifying code to use "GOTO" commands
• Adding multiple returns to subroutines
• Employing techniques that increase module coupling
6.11 Linking
There are link options that can be set in order to make the task of debugging easier: for example, explicitly requesting a map file that includes a list of global symbols, creating a cross-reference listing between source lines and addresses, and turning on the option for symbolic debugging.
7.0 HARDWARE INTERFACE
This section provides a general guide for implementing the interface between the software and the hardware platform on which it will operate. The hardware interfaces should be defined during the requirements analysis and design phases and should be documented in the Interface Design Specification (IDS).
7.1 Input/Output (I/O) Interfacing
Several considerations need to be taken into account when interfacing to an I/O device.
7.1.1 I/O Classifications
I/O interfaces may be classified as human oriented, computer oriented, and external environment oriented. Human-oriented interfaces provide interfacing for external devices such as switches, keyboards, lights, LEDs, alphanumeric displays, and LCD displays. Computer-oriented interfaces provide interfacing for external devices such as cassettes, cartridges, floppy disks, modems, remote communications, and printers. External-environment-oriented interfaces provide interfacing for devices such as analog-to-digital converters (ADC), digital-to-analog converters (DAC), relays, sensors, and stepping motors.
7.1.2 I/O Interface Timing
Resolve any differences that may exist regarding the timing between the processor and the peripheral device. In the case where a peripheral device demands immediate attention, the interface may need to produce interrupt signals to force the processor to react quickly.
7.1.3 Data Conversions
Convert the format of the data produced by the processor to the format necessary for the peripheral immediately before the peripheral interface. Convert the format of the data produced by the peripheral to the format necessary for the processor immediately after the peripheral interface.
7.1.4 Buffered I/O
All I/O should be buffered to reduce communication overhead, maintain a consistent response time, and provide appropriate I/O error checking and error recovery.
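A common buffering structure is the ring (circular) buffer sketched below, with invented sizes and names; a real implementation would add the locking or interrupt masking needed when the producer and consumer run in different contexts.

    #include <stdbool.h>

    #define BUF_SIZE 64u

    static unsigned char buf[BUF_SIZE];
    static unsigned int head;           /* next slot to write */
    static unsigned int tail;           /* next slot to read  */

    bool buf_put(unsigned char c)       /* false signals overflow */
        {
        bool ok = false;
        unsigned int next = (head + 1u) % BUF_SIZE;

        if (next != tail)               /* buffer is not full */
            {
            buf[head] = c;
            head = next;
            ok = true;
            }
        return ok;                      /* caller handles the I/O error */
        }

    bool buf_get(unsigned char *c)      /* false signals underflow */
        {
        bool ok = false;

        if (head != tail)               /* buffer is not empty */
            {
            *c = buf[tail];
            tail = (tail + 1u) % BUF_SIZE;
            ok = true;
            }
        return ok;
        }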
7.1.5 I/O Port Names
All I/O ports should be addressed with descriptive names.
7.2 Interrupts
Interrupt handling routines should save the state of the interrupted program, service the interrupt, restore the state of the interrupted program, and return.
7.3 Initialization
Hardware should be explicitly set to the initial state or verified to be in the correct initial state.
8.0 SOFTWARE REPORTING PRACTICES
8.1 Bug, Error, and Anomaly Reporting
The reporting and documenting of coding bugs and errors is accomplished through the use of the Software Anomaly Report.
8.1.1 Software Anomaly Report
Problem reporting is initiated by the software engineer(s) with a Software Anomaly Report, which identifies problems detected during software development activities. The specific information required on an anomaly report identifies how, when, and where the problem occurred and the impact of the problem on the system capability and on the continued conduct of V&V phase activities. Appendix C shows an example of the anomaly report and the instructions for completing the report.
8.1.2 Anomaly Reporting and Resolution
The software V&V lead engineer is responsible for ensuring the proper documentation and reporting of Software Anomaly Reports; all anomalies are reported regardless of the perceived impact on software development or their severity level with respect to system operation. Unreported and unresolved problems can have a significant adverse impact in the later stages of the software development cycle, which may include little time for resolution.
The projected impact of an anomaly is determined by evaluating the severity of its effect on the operation of the system. The severity of an anomaly report is defined as one of the following:
• High. The change is required to correct a condition that prevents or seriously degrades a system objective and no alternative exists, or to correct a safety-related problem.
• Medium. The change is required to correct a condition that degrades a system objective, to provide for performance improvement, or to confirm that the user and system requirements can be met.
• Low. The change is desirable to maintain the system, to correct an operator inconvenience, or for other reasons.
Resolution of a critical anomaly with a severity of "high" is required before the development effort proceeds to the next software development phase. Software Anomaly Reports are reviewed by the software lead engineer for anomaly validity, type, and severity, and the software lead engineer can direct additional investigation, if required, to assess the validity of the anomaly or the proposed solution. When an anomaly solution is approved and the personnel responsible for performing the corrective action are identified, the software lead engineer will authorize implementation of the corrective action. The software V&V lead engineer is responsible for anomaly report closure, which includes documenting that the corrective action(s) have been taken and verifying the incorporation of authorized changes as described in the anomaly report. If the anomaly requires a change to a baselined configuration item, a Change Request/Approval (CRA) is prepared by a member of the software development team for the item(s) to be changed. A reference to applicable anomaly reports will be documented in the issued CRA.
8.2 Software Deviation or Waiver
Circumstances may require deviation(s) or waiver(s) from policy. A written request for a deviation is generated by the software lead engineer in advance of a future activity, event, or product, so that SE management is made aware of the project's intention to employ a higher-risk development approach. A written request for a waiver is generated by the software lead engineer in those cases where the activity, event, or product has already been initiated. Deviations or waivers are submitted to the [project title/position] for review, and a recommendation is made to the [title/position] and/or [title/position] for approval or disapproval of the proposed deviation or waiver.
A proposed deviation or waiver must be approved by the [title/position] and/or [title/position] before work commences on the software development tasks affected by that deviation or waiver. A copy of each approved deviation or waiver shall be forwarded to the secretary of the Software Development Policy CCB. A copy shall also be placed in the product history file. A permanent record of deviation and waiver approvals shall be maintained for each project in the form depicted in Appendix D. Each request for a deviation or waiver identifies the following:
• Each specific policy or policy requirement for which it applies
• The alternative policy approach to be taken by the project
• The impact on project schedule, performance, and/or risk
This record shall be initiated during development of the Product Objectives Document and shall serve as a record of all subject approvals for the duration of the project.
APPENDIX A: ASSEMBLY LANGUAGE PROGRAMMING GUIDELINES
A.1 Introduction
This appendix covers those topics not discussed above that are specific to assembly language programming.
A.2 Commenting Conventions
A.2.1 Looping Constructs
The do-while construct is bracketed by the comment lines "DO WHILE" and "END WHILE." The do-until construct is bracketed by the comment lines "DO" and "END DO." Within the do-until construct, the "UNTIL" condition comment line is placed above the conditional check. The executable code within the loop should be indented to aid readability.
A.2.2 Case Constructs
The if-else and if-then-else constructs are bracketed by the comment lines "IF" and "END IF." Place the "THEN" condition comment line after the conditional check. If there is an else part, place the "ELSE" condition comment line after the body of the if part. All executable code within the construct should be indented.
A.2.3 Multiple Case Constructs
This construct is bracketed by the comment lines "CASE" and "END CASE." Place the "IF" condition comment line above the first check point and an "ELSE IF" condition comment line above each remaining check point. Place the "THEN" condition comment line below each check point. All executable code within the construct should be indented.
A.3 Directives
The following suggestions should be followed concerning the use of directives:
• Don't make the assembly process of the program dependent upon unique directives.
• If string definitions and substitutions are allowed, then ASSIGNing string substitutions allows easy redefinition if assemblers are changed.
• Reduce the number of embedded directives, and list the general directives collected as a block.
• Use this list of general directives, as an include or file, as it applies across all program modules.
• All directives should have the same invocation from within the same program.
A.4 Operations
A.4.1 Registers
Use register operands when needed for speed; the instruction cycle time is reduced, since a memory fetch for the operand is not required. Certain registers in the microprocessor may have unique features, such as accumulator, memory addressing, counting, indexing, and status control functions.
A.4.2 Constants
A constant may be explicitly defined in memory and referenced by its label. A certain data area should be set aside for constants so as to prevent inadvertent destruction. If the microprocessor supports immediate addressing, the constant may become part of the instruction. When using this method, the constant should be associated with a descriptive symbolic label via an assembler directive.
A.4.3 Expressions
An assembler may allow the use of constant expressions as operands. The level of expression evaluation will be assembler specific but may allow the use of nested parentheses and/or arithmetic, logical, relational, and memory operators.
A.5 Instruction Set
Care should be exercised when using the instruction set of a specific assembler. Consistent programming techniques can enhance readability and maintainability of assembler code. Use arithmetic opcodes for arithmetic operations and logical opcodes for logical operations. Use the same register each time a loop counter is required, and be consistent with methods of loop control. For performance improvement, the programmer should also use hardware math functions if available and shift operations for multiplication and division by powers of 2. The use of the smallest form of call or jump will enhance both size and speed, but be certain that the use is consistent with the rest of the modules in the program. Avoid creating macros that look like assembler mnemonics. Use the most common mnemonics, and avoid vendor-specific mnemonics.
A.6 Interfacing with High-Level Languages
A calling convention or interface definition should be established at the beginning of the project and maintained throughout the life of the product. It will usually be strongly influenced by the high-level language being used. The following sections contain information that should be defined as part of the calling convention.
A.6.1 Parameter Passing
Function input parameters may be obtained by reading global values, from CPU registers, or from the program stack. Depending on the high-level language used, the names of these areas may be different or undefined.
A.6.2 Global Variables
Reading global variables is straightforward as long as the calling and called functions are consistent in the way the variable is used and the way in which it is stored. The calling function must set the global variables and then call the desired sub-function. The called sub-function will then simply read the global data and process it, possibly placing results in another global data area or overwriting the input data area.
A.6.3 Usage, Storage, and Variable Naming
Some high-level languages add a prefix or change the variable name in some way, and the names must be arranged so that they will match after the high-level language compiler and the low-level language assembler have processed their respective source code. The responsibility for conforming to the name format is usually placed on the low-level language, so the low-level language names will probably need to be adjusted to conform to the high-level language format. When a low-level language function is called from another low-level language function, or a high-level language function is called by a low-level language function, the same requirements of consistent usage, storage, and variable naming must also be followed.
A.6.4 Register Variables
Reading register variables is easy to do, but high-level languages may not be able to load the registers easily. High-level languages that do have the ability to load registers may be less efficient than they are when using other methods of parameter passing. Care should be exercised when choosing this method of parameter passing, since high-level language compilers usually have restrictions on which registers may be changed. Passing parameters in registers can be very easy and very fast when a low-level language function is called from another low-level language function. However, if the low-level language function is also called from a high-level language, or the low-level language function calls a high-level language function, the use of register-passed parameters will cause the lowered efficiency described above. It may not be possible to directly call a high-level language function with register-passed parameters, because that method may not be supported by the high-level language compiler.
A.6.5 Stack Variables
For high-level languages that support a program stack, variables may be passed on the stack. Some high-level languages manipulate the stack and variables placed on the stack. Low-level language functions that receive parameters on the stack must reference these variables by using the stack pointer, or its equivalent, and an offset to the correct position on the stack. The offset will be the same for any variable on the stack for any call that uses an identical calling sequence. Each high-level language call to the low-level language assembler function must have the same number, order, and size of variables, since these factors affect the offset of every variable on the stack. Also affecting the offset is the size of the return address. When a low-level language function is called from another low-level language function, or a high-level language function is called from a low-level language function, the same requirements of consistent usage, storage, and variable naming must be followed. The calling function must place the parameters on the stack and then call the desired sub-function, which will reference and process the input data from the stack.
A.6.6 Preserving Registers
Some high-level languages have restrictions on which registers may be changed or which registers must be preserved. All registers that are changed should be saved and restored by the called function unless one of the following conditions applies:
• The high-level language defines a register as a return value storage location.
• The high-level language does not require the register to be saved, and a time constraint is in effect for the assembler function. This may be the case for an interrupt service routine but requires careful consideration.
• A low-level language is calling a high-level language that does not require all registers to be preserved. The low-level language interface must be aware of which registers may be changed by the high-level language and preserve appropriate registers before calling the high-level language.
A.6.7 Returning Values
Methods of returning values should be consistent for all functions in the system. Any high-level languages in use will affect the location of the return value. Return values can be passed back to the calling program as a changed global variable, memory location, or register. If a global is used as an input parameter, it may be changed and used as an output parameter. Some high-level languages use a fixed set of registers as the return value storage area. Any low-level language functions should follow this convention if the function is to be called from the high-level language, and should expect return values in those registers if high-level language functions are called from low-level language functions. The return value with the lowest possible scope should be used.
APPENDIX B: C PROGRAMMING GUIDELINES
B.1 Introduction
This appendix covers those topics not discussed above that are specific to the C programming language.
B.2 Scope
A source file is the principal administrative unit of code written in the C language, and it is a key element in the C language scoping rules, allowing data and function "hiding." Since a file may be made up of one or more related C functions, it is important that there be a file prologue distinct from each function prologue. If an operating system task main function is defined within a file, then certain additional elements are required in the prologue.
B.3 Braces
In C language control structures, each opening and closing brace should appear on a line of its own and should be indented to the same tab position as the code enclosed in the braces. Braces should be used around the body of all for, if, while, switch, and do-while conditional statements to prevent errors when code is modified. In data structures, left and right braces should be vertically aligned.
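Applied to a simple if-else, the brace placement described here reads as follows (a formatting sketch only, with invented names):

    static int clamp_and_count(int count, int limit)
        {
        if (count > limit)
            {
            count = limit;    /* braces even around a single statement */
            }
        else
            {
            count++;
            }
        return count;
        }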
B.4 Naming Conventions

The ANSI standard for naming conventions should be followed.
B.5 Constants

All manifest and enumeration constants should be defined with uppercase names, either in the file where used or in a header file when used in more than one file. A set of useful manifest constants, such as YES and NO, should be provided in a project header file or taken directly
from standard header files. Multicharacter constants, such as \r\f, should be avoided because of the non-portability of machine byte order and probable maintenance difficulties. If enumeration constants are utilized as array indices, then this usage should be thoroughly commented where the enumeration constants are defined.
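A brief sketch of these conventions (hypothetical names), including the comment required where enumeration constants serve as array indices:

#define YES 1      /* project-wide manifest constants, uppercase */
#define NO  0

/* NOTE: these enumeration constants are used as indices into
   intensity[]; NUM_COLORS must therefore remain the last entry. */
enum color { RED, GREEN, BLUE, NUM_COLORS };

static int intensity[NUM_COLORS];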
B.6 Variables

Variables should be defined with all lowercase names and be distinct within the first 31 characters. Only one variable should be declared per source line, and a comment should be provided to explain the purpose of the variable.
B.7 Data Conversion

Pointers should be explicitly declared with the correct pointer type. Pointer conversions should be limited to the following (sketched in the example after this list):

• Assignment of the NULL pointer
• Assignment of an allocation function
• Conversion of a general pointer to a pointer to void, or pointer to character and back again
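A minimal sketch of the three permitted conversions (hypothetical names):

#include <stdlib.h>

void sketch(void)
    {
    double *p = NULL;                    /* 1. assignment of the NULL pointer        */
    double *q = malloc(4 * sizeof *q);   /* 2. assignment of an allocation function  */
    void   *v = q;                       /* 3. general pointer to pointer to void... */
    double *r = v;                       /*    ...and back again                     */

    (void)p;
    (void)r;
    free(q);
    }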
B.8 TypeDefs

TypeDefs should be used in the tag definitions of structures, unions, and enumerations. This provides cleaner and shorter declarations and, in some cases, may produce better diagnostics in the case of syntax errors. TypeDefs should be consistently uppercase or lowercase throughout the project. Tag definitions should be separated from variable declarations. Unions should not be used to transfer data from one type to another because of portability considerations.
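A short sketch (hypothetical names), with the tag definition separated from any variable declaration and the typedef name in uppercase, one of the two permitted styles:

typedef struct point_tag
    {
    int x;
    int y;
    } POINT;                          /* uppercase typedef used consistently */

static POINT origin = { 0, 0 };       /* variable declared separately from the tag */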
B.9 Defined Types

The basic data types of char, short, and others should be used instead of artificially defined types, but this does not prevent the use of enumerated types, such as BOOL. The exception
might be in cases where portability is an issue, as with an operating system or communications library. If metatypes are necessary, they should all be defined in a header file. Preprocessor conditionals should be used in conjunction with the header definitions to minimize portability problems.
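A minimal sketch of metatypes confined to a single header and selected by a preprocessor conditional (the TARGET_16BIT switch and the type names are hypothetical):

/* types.h (hypothetical): all metatypes defined in one place. */
#ifdef TARGET_16BIT
typedef unsigned int   UINT16;
typedef unsigned long  UINT32;
#else
typedef unsigned short UINT16;
typedef unsigned int   UINT32;
#endif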
B.10 Defensive Programming

B.10.1 Spaces Around Operators

Readability of the code is enhanced by a uniform layout of the operators. Spaces are related to precedence, because spaces imply a looser binding than the absence of spaces. An improper or non-uniform use of spaces may convey a different meaning to an expression than was intended. Operators are categorized into the following groups:

• High precedence. These operators include the primary operators [], (), ., and -> and the unary operators -, ++, --, !, ~, *, &, (type), and sizeof.
• Medium precedence. These operators include the arithmetic operators +, -, /, and *; the bitwise operators <<, >>, &, ^, and |; the logical operators && and ||; and the relational operators <, <=, >, and >=.
• Low precedence. These operators include the conditional operators ? and :; the assignment operators =, +=, -=, /=, *=, &=, ^=, |=, <<=, >>=, and %=; and the comma operator.
The high-precedence operators should never have space around them. The low-precedence operators should always have space around them, and the medium-precedence operators should usually have space around them. An exception to the rule for the primary operator () is the keywords if, while, for, switch, and return, which should be followed by a space.
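A brief sketch of the spacing rules (hypothetical names):

struct part { int weight; };
struct ctl  { int offset; };

int spacing_demo(struct part items[], struct ctl *base, int i, int count, int done)
    {
    int total = items[i].weight + base->offset * 2;   /* high precedence: no space */
    int ready = (count > 0) && !done;                 /* low precedence: spaced    */

    if (ready)                                        /* keyword followed by space */
        {
        total += 1;
        }
    return total;
    }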
B.10.2 Evaluation Order

Code that depends on the order of evaluation may perform differently with different tools, so programs should not depend upon the order of evaluation of expressions.
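For instance, the two operands of + may be evaluated in either order, so an expression such as a[i] + a[i++] is unpredictable. A minimal sketch of the order-independent form (hypothetical names):

int pair_sum(const int a[], int i)
    {
    /* Order-dependent form (do not write): a[i] + a[i++]              */
    /* Order-independent equivalent, with no side effect in the expression: */
    int sum = a[i] + a[i + 1];

    return sum;
    }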
B.10.3 Bitwise Operators and Parentheses

The bitwise operators &, |, ~, ^, >>, and << should be explicitly parenthesized when combined with other operators, such as arithmetic, bitwise, relational, and logical operators.
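A one-line illustration (hypothetical names); the parentheses are essential here because == binds more tightly than &:

#define READY_MASK 0x01u

int is_ready(unsigned status)
    {
    return (status & READY_MASK) == READY_MASK;   /* explicit and correct */
    }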
B.11 Side Effects

B.11.1 Order of Side Effects

Programs that depend upon the order of side effects may not perform correctly when ported to a new machine or a new compiler. Programs must not depend upon the order in which side effects take place. In particular, the postfix increment and decrement operators may alter the memory at unpredictable times during the evaluation of the expression, and C guarantees only that the side effect will be complete when the next statement is reached.
B.11.2 Side Effects of Macros

Macros can produce hidden dependencies on side effects. If the formal parameter of a macro appears more than once in the body of the macro and the actual parameter has a side effect, then that side effect is executed more than once. If macros must use a formal parameter more than once, then an explicit warning should be given in the documentation.
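The classic illustration, with the required warning documented (hypothetical macro name):

#define SQUARE(x) ((x) * (x))   /* WARNING: evaluates x twice */

int square_next(int i)
    {
    int n;

    /* SQUARE(i++) would increment i twice; perform the side effect first. */
    i++;
    n = SQUARE(i);
    return n;
    }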
B.12 Control Structures

B.12.1 GOTO

The GOTO statement should not be used.
B.12.2 Switch

Switch statements must have a default clause. Break statements are encouraged as the last statement in each case option.
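A minimal sketch (hypothetical names) with a default clause and a break ending each case:

enum cmd { CMD_START, CMD_STOP };

int dispatch(int command)
    {
    int status;

    switch (command)
        {
        case CMD_START:
            status = 1;
            break;              /* break as the last statement of the case */
        case CMD_STOP:
            status = 2;
            break;
        default:                /* default clause is mandatory */
            status = -1;
            break;
        }
    return status;
    }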
B.13 Include Files

B.13.1 Nesting

Nesting of include files is not recommended, but if the need arises, then at the beginning of each include header file there should be a #define of inc_header_file_name. Each time an include file is introduced into another include file, an #ifndef and #endif should be placed around the #include to avoid multiple inclusion.
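A sketch of this scheme for a hypothetical header widget.h:

/* widget.h -- the guard macro is defined at the beginning of the header: */
#define inc_widget_h

extern int widget_count;

/* In any include file that must nest widget.h, the #include is wrapped:  */
#ifndef inc_widget_h
#include "widget.h"
#endif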
B.13.2 Live Code and Initialization

Include files should not contain initialization or live code.
B.13.3 Macros

If a macro is used globally, then the macro should be defined in an include file. If the macro is used locally, then the macro should be defined in the file that uses that particular macro.
B.14 Use of Static Data Between Functions

Static variables should be placed at the top of the file to ensure that there are no repeated static variables throughout the file.
B.15 LINT

LINT is a tool that aids the programmer in producing reliable and portable code free from questionable coding practices. LINT generates warnings about constructions that it considers “suspicious,” even though they may be perfectly legal and good programming style. These warnings must be considered by the programmer, but it is permissible to accept code that generates LINT warnings. LINT cannot be used for the acceptance testing of modules, and it is not a replacement for other forms of code review. LINT should be used across all modules in a task or across all modules referencing a given set of global variables. LINT is capable of checking the consistency of the declarations of global variables across modules, whereas the compiler and linker are not.
B.16 Portability

C programs should adhere to the ANSI standard as much as possible in order to help prevent difficulties in moving code from one project to another. Adherence also makes it possible to compile and link the code on the host for debugging with stubs and/or simulators. When non-standard constructs must be used, they should be localized as much as possible and carefully documented. If it is necessary to have different versions of the source to compile for host debugging and for the target, then conditional compilations and/or conditional include files may be used, and the condition used for switching between versions should be as automatic as possible.
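A minimal sketch of localizing a non-standard construct behind a conditional compilation switch (TARGET_BUILD and the asm syntax are hypothetical; the switch would be supplied automatically by the build command):

#ifdef TARGET_BUILD
#define DISABLE_INTERRUPTS()  asm("cli")    /* non-ANSI, target build only  */
#else
#define DISABLE_INTERRUPTS()  ((void)0)     /* harmless stub for host debug */
#endif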
APPENDIX C-1
SOFTWARE ANOMALY REPORT

1. Date:
2. Severity:  H  M  L
3. Anomaly Report Number:
4. Title (briefly describe the problem):
5. System:
6. Component:
7. Version:
8. Originator:
9. Organization:
10. Telephone:
11. Approval:
12. Verification and Validation Task:
13. Reference Document(s):
14. System Configuration:
15. Anomaly Description:
16. Problem Duplication:  During run: Y  N  N/A    After restart: Y  N  N/A    After reload: Y  N  N/A
17. Source of Anomaly:
    PHASE:  ❑ Requirements  ❑ Architecture Design  ❑ Detailed Design  ❑ Implementation  ❑ Undetermined
    TYPE:  ❑ Documentation  ❑ Software  ❑ Process  ❑ Methodology  ❑ Other  ❑ Undetermined
18. Investigation Time:
19. Proposed Solution:
20. Corrective Action Taken:    Date:
21. Closure Sign-off:
    Software Lead Engineer    Date
    V&V Lead Engineer    Date
APPENDIX C-2
INSTRUCTIONS FOR COMPLETING SOFTWARE ANOMALY REPORT
1. Date: Form preparation date.

2. Severity: Circle the appropriate code. High: The change is required to correct a condition that prevents or seriously degrades a system objective (where no alternative exists) or to correct a safety-related problem. Medium: The change is required to correct a condition that degrades a system objective, to provide for performance improvement, or to confirm that the user and system requirements can be met. Low: The change is required to maintain the system, correct an operator inconvenience, or other.

3. Anomaly report number: Number assigned for control purposes.

4. Title: Brief phrase or sentence describing the problem.

5. System: Name of the system or product against which the anomaly report is written.

6. Component: Component or document name against which the anomaly report is written.

7. Version: Version of the document or code against which the anomaly report is written.

8. Originator: Printed name of the individual originating the anomaly report.

9. Organization: Organization of the originator of the anomaly report.

10. Telephone: Office phone number of the individual originating the anomaly report.

11. Approval: Software management individual or designee approval for anomaly report distribution.

12. V&V task name: Name of the V&V task being performed when the anomaly was detected.

13. Reference document: Designation of the documents that provide the basis for determining that an anomaly exists.

14. System configuration: Configuration loaded when the anomaly occurred; not applicable for documentation or logic errors.

15. Anomaly description: Description defining the anomaly and a word picture of events leading up to and coincident with the problem. Cite the equipment being used, unusual configurations, environment parameters, and so forth, that will enable the programmer to duplicate the situation. If continuation sheets are required, fill in Page _ of _ at the top of the form.

16. Problem duplication: Duplication attempts, successes, or failures for software errors; not applicable for documentation or logic errors.

17. Source of anomaly: On completion of the investigation, the source of the anomaly in terms of phase of origination and type.

18. Investigation time: Time, to the nearest half hour, required to determine the cause of the anomaly, but not the time to determine a potential solution or the time to implement the corrective action.

19. Proposed solution: Description defining in detail a solution to the detected anomaly, including documents, components, and code.

20. Corrective action taken: Disposition of the anomaly report, including a description of any changes initiated as a direct result of this report and the date incorporated.

21. Closure sign-off: Signature of the software lead engineer authorizing implementation of the corrective action. Signature of the V&V lead engineer verifying incorporation of the authorized changes as described in this report. Only the signature of the software lead engineer is required when no corrective action is approved.
APPENDIX D
SOFTWARE DEVELOPMENT RECORD OF DEVIATION OR WAIVER APPROVAL

PROJECT:
TYPE: Deviation or Waiver
PHASE:
SOP Requirement Paragraph(s):

Initiated by: ________________________________ (Signature, Title/Position)    Date: ________________
Reviewed by: ________________________________ (Signature, Title/Position)    Date: ________________
Approved by: ________________________________ (Signature, Title/Position)    Date: ________________

Reason/Rationale/Explanation:

Project schedule and performance impact:

Project risk:

Alternative approach to be used:
GLOSSARY

Cohesion: Degree to which the tasks performed by a single program module are functionally related.

Coupling: Measure of interdependence among modules in a computer program.

Generated file: Any file that can be created or regenerated automatically.

Magic number: Number used in a program that has no obvious meaning or derivation.

Source file: Any file that cannot be created or regenerated from an automated system utility.

State tables: One method of converting state transition diagrams to executable code.
SE-SDP SOFTWARE ENGINEERING SOFTWARE DEVELOPMENT POLICIES
Written by: [Name/Title/Position]    Date
Reviewed by: [Name/Title/Position]    Date
Approved by: [Name/Title/Position]    Date
Approved by: [Name/Title/Position]    Date
Document Number: [aaa]-SESDP-[#.#]    Revision: [#.#]    Page: 1 of [#]
REVISION HISTORY

Revision    Description               Date
[##.##]     [Revision description]    [mm/dd/yy]
CONTENTS
PREAMBLE    SE Software Development Policies    4

POLICY 1    Software Development Estimations    9
POLICY 2    Software Quality Assurance    11
POLICY 3    Software Tools    12
POLICY 4    Hardware Tools    13
POLICY 5    Software Development Plan    15
POLICY 6    Programming Standards and Conventions    16
POLICY 7    Software Documentation    17
POLICY 8    Software User’s Manual    19
POLICY 9    Development Test Information Sheets    20
POLICY 10    Software Design Methodology    22
POLICY 11    System Analysis Methodology    23
POLICY 12    Software Development Test Plan    24
POLICY 13    Software Design Walk-throughs    25
POLICY 14    Software Code Walk-throughs    26
POLICY 15    Software End-product Acceptance Plan    27
POLICY 16    Interface Design Specification    29
POLICY 17    Software Requirements Specification    30
POLICY 18    Software Requirements Review and Acceptance    32
POLICY 19    Software Architecture Design Specification    33
POLICY 20    Software Architecture Design Review and Acceptance    35
POLICY 21    Software Detailed Design Specification    37
POLICY 22    Software Detailed Design Review and Acceptance    39
POLICY 23    Anomaly Reporting and Resolution    41
GLOSSARY    44
PREAMBLE
SE SOFTWARE DEVELOPMENT POLICIES
Policy

Software Engineering (SE) software projects shall comply with a set of Software Development Policies that are established, maintained, and used to promote effective and consistent software development practices in all phases of the software life cycle. Justified and necessary departures from these policies may be authorized in response to a written request. A permanent board shall be established to control and maintain the SE Software Development Policies.
Requirements

1. The SE Software Development Policies shall be applied to all SE software projects. Projects in which effort will be expended in order to modify or enhance existing software are also subject to this requirement.

2. SE Software Development Policies (Figures 1 and 2) shall be maintained by the Software Development Policy Change Control Board (CCB). The chairman of this board shall be appointed by the SE [title/position] with the approval of the [title/position], and board members shall be appointed in writing by the SE [title/position]. The SE [title/position] shall serve as the secretary to the board and shall be responsible for scheduling board meetings and maintaining minutes of meetings and permanent files of CCB actions. Proposed changes to SE Software Development Policies must be submitted in writing to the CCB. At least once each year, the board shall convene to review the policies in their totality for relevancy and currency. Where appropriate, the board shall propose revisions to the policies subject to the review and approval of the SE [title/position]. After approval by the SE [title/position], the policies shall be approved by [title/position] and [title/position] (see Figure 3).

3. Circumstances may require deviation(s) or waiver(s) from policy. A written request for a deviation shall be submitted by the project software lead engineer in advance of a future activity, event, or product so that SE management is made aware of the project’s intention to employ a higher-risk development approach. A written request for a waiver shall be submitted by the project software lead engineer in those cases where the activity, event, or product has already been initiated. Deviations or waivers shall be reviewed by the [project title/position] and be submitted to the SE [title/position] for review. The SE [title/position] will make a recommendation to the [title/position] and/or [title/position] for approval or disapproval of the proposed deviation or waiver. A proposed deviation or waiver must be approved by the [title/position] and/or [title/position] before work commences on the software development tasks affected by that deviation or waiver.
Figure 1    SE Software Development Policies

Policy Category / Policy Topic Title / Policy Number

Project Preparation
  Software Development Estimations    1

Specifications
  Interface Design Specification (IDS)    16
  Software Requirements Specification (SRS)    17
  Software Architecture Design Specification (SADS)    19
  Software Detailed Design Specification (SDDS)    21

Reviews
  Software Requirements Review and Acceptance    18
  Software Architecture Design Review and Acceptance    20
  Software Detailed Design Review and Acceptance    22
  Software Design Walk-throughs    13
  Software Code Walk-throughs    14

Development Practices
  Programming Standards and Conventions    6
  Software Design Methodology    10
  System Analysis Methodology    11
  Software Tools    3
  Hardware Tools    4

Integration, Test, and Operations
  Development Test Information Sheets (DTIS)    9
  Software Development Test Plan (SDTP)    12
  Software User’s Manual (SUM)    8
  Anomaly Reporting and Resolution (SAR)    23

Product Management and Acceptance
  Software Quality Assurance (SQAP)    2
  Software Development Plan (SDP)    5
  Software Documentation    7
  Software End-Product Acceptance Plan (SEAP)    15
Figure 2    SE Software Development Policies Throughout the Software Development Life Cycle

[Matrix relating each policy topic (Software Development Estimations; Interface Design Specification (IDS); Software Requirements Specification (SRS); Software Requirements Review (SRR); Software Architecture Design Specification (SADS); Software Architecture Design Review (SADR); Software Detailed Design Specification (SDDS); Software Detailed Design Review (SDDR); Programming Standards and Conventions; Software Design Methodology; System Analysis Methodology; Software Design Walk-Throughs; Software Code Walk-Throughs; Software Tools; Hardware Tools; Development Test Information Sheet (DTIS); Software Development Test Plan (SDTP); Software User’s Manual (SUM); Software Quality Assurance (SQAP); Software Development Plan (SDP); Software Documentation; Software End-Product Acceptance Plan (SEAP)) to the software life cycle phases (Project Start-up; Requirements; Interface Design; Architecture Design; Detailed Design; Code and Test; Integrate and Test; Software Validation), using the codes defined in the notes.]

Notes:
1. D indicates that a deliverable or activity is required at that time.
2. U indicates that an update of a previous deliverable occurs.
3. E indicates that the procedure requirements are in effect for the entire phase.
4. S indicates that the procedure requirements can start at any time.
4. Each request for a deviation or waiver shall identify:

a. Each specific policy or policy requirement for which it applies
b. The alternative policy approach to be taken by the project
c. The impact on project schedule, performance, and/or risk

5. A copy of each approved deviation or waiver shall be forwarded to the secretary of the Software Development Policy CCB. A copy shall also be placed in the product history file.

6. These policies refer to and govern a set of SE software development procedures. The procedures are intended to provide detailed guidance within the framework and requirements provided by these policies.
Figure 3    Matrix of Responsibilities for Software Development Policy Documents

[Matrix assigning, for each document title (software development procedures; software development deviation; software development waiver; Software Quality Assurance Plan (SQAP); Software Development Plan (SDP); Development Test Information Sheet (DTIS); Software Test Plan (STP); Software End-product Acceptance Plan (SEAP); Interface Design Specification (IDS); Software Requirements Specification (SRS); Software Architecture Design Specification (SADS); Software Detailed Design Specification (SDDS)), the responsibility codes G, R, R/D, and R/S to the SE, SLE, EE, ME, System Engineer, and Project Director/Manager SE, as defined in the notes.]

Notes:
1. SE is a senior software engineer assigned to the project.
2. SLE is the project software lead engineer assigned to the project.
3. EE is a senior electrical engineer assigned to the project.
4. ME is a senior mechanical engineer assigned to the project.
5. G means generate.
6. R/D means review and disposition.
7. R means review.
8. R/S means review and submit.
It is the responsibility of each project software lead engineer to apply the existing relevant SE software development procedures. New SE software development procedures are to be submitted to the SE [title/position] prior to their use, so that they can be reviewed and approved.

7. A permanent record of deviation and waiver approvals shall be maintained for each project using the form depicted in the SE software development procedures. This record shall be initiated during development of the Product Objectives Document and shall serve as a record of all subject approvals for the duration of the project.
Responsibilities

The project software lead engineer is responsible for:

1. Generating written deviations and waivers
2. Generating changes to development policies
3. Generating changes to development procedures
4. Applying relevant development procedures to the project

The SE [title/position] is responsible for:

1. Review and recommendation of software deviations and waivers
2. Review and recommendation of Software Development Policies
3. Review and approval of software development procedures

The [title/position] and/or [title/position] is responsible for:

1. Approval of software development deviations and waivers
2. Approval of Software Development Policies

The [project title/position] is responsible for the review and submittal of deviations and waivers from Software Development Policies. The managers of organizations supporting and sponsoring the project should share the commitment to the implementation of these policies.
POLICY 1 SOFTWARE DEVELOPMENT ESTIMATIONS
Policy

The project software lead engineer shall be responsible for generating and obtaining approval of software development schedule(s), personnel estimates, and tasking prior to beginning any software development. Software development schedule(s), personnel estimates, and tasking shall be based either on bottom-up estimates of the required staffing levels for each Work Breakdown Structure (WBS) element or on software models based on size estimates of each software unit, measured in thousands of delivered source instructions (KDSI). These estimates shall include the major development milestones and the allocation of estimated time to each software development phase. The software development estimates shall be reviewed again at the Software Requirements Review (SRR), Software Architecture Design Review (SADR), and Software Detailed Design Review (SDDR) and shall be adjusted appropriately to reflect changes in the software design and/or in the development plan.
Requirements

1. Software development estimation for software projects shall be based on a preliminary software architecture that is defined to a level of detail and information content commensurate with that required at the SADR. The level of definition of the software components shall be sufficient to uniquely allocate all of the software requirements among the defined components. Each component shall be small enough that its size can be estimated reliably in terms of the number of source language instructions. The count can be based on prior experience for comparable software components that contain the same type of algorithms or by actually counting the number of operations in the chosen language necessary to implement the equations in the algorithms.

2. In the event that it is necessary to create a development estimation for a system whose requirements are not defined to a sufficient level of detail to support the modeling described above, assumptions shall be made concerning the nature of all missing requirements. The development estimation shall then be derived by employing this augmented set of requirements. All such assumptions shall be clearly documented.

3. The development estimate of each software component shall be determined in person-hours by applying a validated software development model to the size estimates expressed in KDSI for that component. The same models shall be used to derive the nominal development schedule in person-months for the project and the nominal allocations of effort
to the various software development phases. Deviations from the nominal values shall be justified and their impact assessed. Personnel estimates for the requirements generation and any training phases shall be estimated separately.

4. The software development estimates shall be modified by the appropriate multipliers in the development model to account for the following software development attributes:

a. Product attributes, which include software reliability, database size, and software and data processing architecture complexity
b. Computer attributes, which include execution time constraints, main memory storage constraints, response-time requirements, integrity of the system software and the development software, capability of analysts assigned to the project, applications software experience, capability of programmers assigned to the project, experience on and/or with the hardware selected, and experience with the selected programming language and development tools
c. Project attributes, which include stability of requirements and exterior interface specifications, degree of concurrency in the software and data processing hardware development, use of automated software tools, development schedule, documentation required, and project control reporting requirements

5. The development estimate of any required or planned conversion or enhancements to existing software shall be estimated on the basis of the following factors:

a. Size of the programs to be converted
b. Percentage of design modifications
c. Code changes required
d. Reintegration required
6. Software development estimates shall be submitted for review to the SE [title/position]. Approval must be obtained prior to submission of the estimates to the project [title/position]. This step shall be repeated at the time of the SRR, SADR, and SDDR, and these software development estimates shall be traced back and reconciled to the original estimates generated during the scheduling phase. Significant variances shall be justified by changes in requirements, design, or development environment.
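As an illustration only, the following sketch shows a COCOMO-style computation of the kind such a validated model might perform; the coefficients and attribute multipliers are hypothetical, and a project would substitute the values from its own validated model:

#include <math.h>
#include <stdio.h>

int main(void)
    {
    double kdsi        = 12.0;   /* component size estimate, in KDSI */
    double reliability = 1.15;   /* product attribute multiplier     */
    double timing      = 1.10;   /* computer attribute multiplier    */
    double tools       = 0.95;   /* project attribute multiplier     */

    /* Nominal effort scaled by the attribute multipliers (person-months). */
    double effort = 3.0 * pow(kdsi, 1.12) * reliability * timing * tools;

    printf("Estimated effort: %.1f person-months\n", effort);
    return 0;
    }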
Responsibilities

In addition to the responsibilities identified in the preamble to these policies, the following responsibilities are prescribed by this policy:

1. The project software lead engineer shall be responsible for generating and obtaining approval of software development schedule(s), personnel estimates, and tasking prior to beginning any software development.
2. The SE [title/position] shall be responsible for review and approval of the software development schedule(s), personnel estimates, and tasking.
POLICY 2 SOFTWARE QUALITY ASSURANCE
Policy

SE software projects shall provide independent assurance that their products meet appropriate standards for software quality. Software quality assurance shall be achieved through ongoing review and by periodic software quality audits. The project’s software quality assurance activities shall follow a Software Quality Assurance Plan (SQAP), which is to be prepared by the software lead engineer and approved within one month of project start-up by the SE [title/position]. The degree of formality, control employed, and personnel used should be contingent upon the size, complexity, and significance of the project and the investment risks.
Requirements

1. A non-project-related person shall be assigned the responsibility for the software quality assurance functions with the mutual approval of the project software lead engineer and the SE [title/position]. This person shall report operationally to the project software lead engineer and shall receive functional direction from the SE [title/position].

2. The project shall prepare and maintain an SQAP in compliance with the relevant SE software development procedures and company policies and standards. The plan shall state the software quality objectives of the project as conditioned by the product requirements and the significance of the intended application. This plan is to be prepared within one month of project start-up by the project software lead engineer and approved by the SE [title/position]. Depending upon the scope of the project, the plan may be a separate document or a section within the Software Development Plan (SDP).

3. The contents of the SQAP shall direct the project to:

a. Identify, prepare, coordinate, and maintain software development procedures for control of critical steps affecting product software quality
b. Schedule and have conducted independent audits of the following for consistent compliance with software development procedures: requirements definition, design, documentation, code production, testing, and the software configuration management (SCM) program. Audit results shall be documented and reported to the project software lead engineer
c. Include SE [title/position] participation in formal project reviews, audits, and control boards
d. Provide for an inspection of the deliverable documents for compliance with software quality assurance provisions of the software development procedures
e. Assure that the software discrepancy reporting system supports change control and forms a database for systematic problem resolution. Perform periodic review of problem reports and make recommendations to the project software lead engineer as necessary
Responsibilities

In addition to the responsibilities indicated in the preamble to these policies, the following responsibilities are prescribed by this policy:

1. The project software lead engineer shall be responsible for generating and obtaining the approval of the SQAP.
2. The SE [title/position] is responsible for:

a. Review and approval of the SQAP
b. Supporting the projects with respect to the required product software quality responsibilities
POLICY 3 SOFTWARE TOOLS
Policy

SE software projects shall use software tools to support software development activities. It is the responsibility of SE to define the requirements for software tools. Where it is not possible to obtain existing software products that serve the required functions, SE personnel will be responsible for the development of such tools. The development methodology to be employed for software tools will be identical to the SE Software Development Policies.
Requirements

1. SE personnel shall be responsible for establishing the requirements for software tools and assuring that the acquisition and/or development of the software tools will be completed in time to support the planned use.

2. An approach for validating the software tools must be defined and documented in a Software Tool Validation Plan. This plan must be sufficient to assure that this software is adequate for its intended purpose. This validation may take the form of inspection, analysis, simulation, test, or some other project-approved method. The results of tool validation shall be reviewed and approved by the SE [title/position] prior to tool utilization.
Responsibilities

In addition to the responsibilities identified in the preamble to these policies, the following responsibilities are prescribed by this policy:

1. SE personnel shall be responsible for:

a. Identifying, defining, obtaining, and validating software tools
b. Assuring that the development of software tools conforms to the SE software policies

2. The SE [title/position] shall be responsible for review and approval of the use of the software tools.
POLICY 4 HARDWARE TOOLS
Policy

SE software projects shall use hardware tools to support software development activities. It is the responsibility of SE personnel to define the requirements for hardware tools. Where it is not possible to obtain existing hardware products that serve the required functions, SE personnel will be responsible for the development of such tools. The development methodology to be employed for such tools will, as a minimum, include a design review of engineering drawings, specifications, and test and maintenance plans for the purpose of: (1) assuring the adequacy of the hardware design and plans; (2) resolving any identified issues; and (3) obtaining commitment to a program supporting tool acceptance and subsequent maintenance.
Requirements

1. SE personnel shall be responsible for establishing the requirements for hardware tools.

2. SE personnel shall perform the following activities for hardware tools developed by SE personnel:

a. Generation of engineering drawings defining device interfaces, electrical interconnections, and supplementary mechanical requirements
b. Review and approval of hardware tool design prior to tool manufacture
c. Material selection in accordance with preferred parts guidelines

3. SE personnel shall be responsible for assuring that the acquisition and/or development of the hardware tools will be completed in time to support their planned use.

4. An approach for validating the hardware tools must be defined and documented in a Hardware Tool Validation Plan. This plan must be sufficient to assure that the tool is adequate for its intended purpose. This validation may take the form of inspection, calibration, test, or some other approved method. The results of tool validation shall be reviewed and approved by the SE [title/position] prior to tool utilization.

5. SE personnel shall be responsible for establishing a maintenance schedule for SE-developed hardware tools and assuring that tool maintenance is accomplished as scheduled.
Responsibilities

In addition to the responsibilities identified in the preamble to these policies, the following responsibilities are prescribed by this policy:

1. SE personnel shall be responsible for identifying, defining, obtaining, validating, and maintaining hardware tools.
2. The SE [title/position] shall be responsible for review and approval of the use of the hardware tools.
POLICY 5 SOFTWARE DEVELOPMENT PLAN
Policy

SE software projects shall provide the means to coordinate schedules, control resources, initiate actions, and monitor the progress of the software development effort. The plan for accomplishing these management activities shall be identified and described in the Software Development Plan (SDP). The SDP shall be prepared by the project software lead engineer and approved by the SE [title/position] prior to the Software Requirements Review (SRR).
Requirements

1. The SDP shall be produced in the format identified in the relevant SE software development procedures.

2. The SDP shall:

a. Identify and describe the organizational structure, personnel, and resources for software development, software configuration management, and software quality assurance.
b. Indicate the development schedule and milestones.
c. Indicate the methods and techniques to be used in requirements definition, requirements review, design, design review, and software testing.
d. Indicate the methods to be used to ensure that the design requirements and requirements for system resources will be met.
e. Describe the configuration control methods and organization for processing changes to the software and associated documentation throughout the development. If a Software Configuration Management Plan (SCMP) is prepared for the project, a reference to that plan shall be made and is sufficient to meet this requirement.
f. Indicate the methods, techniques, and organization for assuring that software end products meet appropriate standards for quality. If a Software Quality Assurance Plan (SQAP) is prepared for the project, a reference to that plan shall be made and is sufficient to meet this requirement.
g. Identify the potential problem and high-risk areas in terms of schedule and technological risks and describe the means by which the software project may minimize the impact of identified risk areas.

3. The SDP shall be prepared by the project software lead engineer, reviewed by the project system engineer, and approved by the SE [title/position] prior to the SRR.
4. The SDP shall be maintained by the project software lead engineer and shall be updated prior to the Software Architecture Design Review (SADR) and Software Detailed Design Review (SDDR) to reflect changes to the management plan for software development.
Responsibilities

In addition to the responsibilities identified in the preamble to these policies, the following responsibilities are prescribed by this policy:

1. The project software lead engineer shall be responsible for generating and obtaining approval of the SDP.
2. The project system engineer shall be responsible for reviewing the SDP.
3. The SE [title/position] shall be responsible for approval of the SDP.
POLICY 6 PROGRAMMING STANDARDS AND CONVENTIONS
Policy

SE software projects shall employ programming standards and conventions in order to promote uniformity, readability, understandability, reliability, maintainability, compatibility, and other quality characteristics of the software products. Where applicable, such standards and conventions should also contribute to portability of software between hardware systems and compatibility with existing and future support software. These standards and conventions shall include the standards to be followed during software design, development, and maintenance. The design and development shall be periodically audited by the cognizant software lead engineer to assess adherence to the programming standards and conventions.
Requirements

1. The programming standards and conventions shall contain, as a minimum:
a. Standards and conventions to be followed in describing processing flows and sequences, including definitions of symbols and conventions for their usage
b. Standards and conventions to be used for preface text and in-line comments within the source code
c. Higher order and assembly language coding standards for every language that will be used to generate code
d. Structured programming standards and conventions for every higher order language that is used to generate code

2. Periodic audits of design documentation and code shall be conducted to assess compliance with programming standards and conventions. The Software Quality Assurance Plan (SQAP) is to establish the procedure and frequency for these audits.

3. The programming standards and conventions document shall be prepared and kept current by personnel appointed and directed by the SE [title/position]. Written signature approval shall be made by the SE [title/position].
Responsibilities

In addition to the responsibilities identified in the preamble to these policies, the following responsibilities are prescribed by this policy:

1. The SE [title/position] shall be responsible for approval of any changes to the programming standards and conventions.
2. The project software lead engineer shall ensure conformance to the programming standards and conventions.
POLICY 7 SOFTWARE DOCUMENTATION
Policy

SE software projects shall produce and maintain a minimum set of software documents that satisfy the SE Software Development Policies and that are necessary design and planning tools for disciplined and successful software development.
Requirements

1. Early in the project planning, the project shall identify the set of software documentation that satisfies the SE Software Development Policies and project needs. This identification shall include, for each document, the document’s title, purpose, and schedule. This set of documentation shall be produced in the format specified in the relevant SE software development procedures.

2. The following minimum documentation is required of SE software projects:

a. Software Quality Assurance Plan (SQAP)
b. Software Configuration Management Plan (SCMP)
c. Software Development Plan (SDP)
d. Software Development Test Plan (SDTP)
e. Software End-product Acceptance Plan (SEAP)
f. Software Requirements Specification (SRS)
g. Software Architecture Design Specification (SADS)
h. Software Detailed Design Specification (SDDS)
3. If a project is a “distributed processing subsystem,” the following additional documentation is required:

a. System-level requirements specification
b. System-level design document, which is to include the rationale for the selected decomposition into subsystems and for the selected interface
c. Interface Design Specification (IDS)

4. Document content, format, and size shall satisfy relevant SE software development procedures and should be appropriate to user and project needs. Documents should be grouped and bound into physical volumes that are consistent with user and project needs.

5. If the project is required to produce documents in addition to those specified by the SE Software Development Policies, then:

a. Those additional documents shall conform with the format and content specified in the relevant SE software development procedures
b. For those additional documents not described in the relevant SE software development procedures, the project shall produce, prior to the SRR, an outline of these documents as to content and format to the approximate level of detail used in the relevant SE software development procedures
POLICY 8 SOFTWARE USER’S MANUAL
Policy

SE software projects that are responsible for the development of software tools shall produce a Software User’s Manual that contains the instructions necessary to operate the software system. An outline of this document shall be reviewed at the Software Architecture Design Review (SADR) in order to obtain SE [title/position] commitment to the design of the user/machine interface.
Requirements

1. The User’s Manual shall contain, as a minimum, a description of how to set up, execute, select options, and interpret printouts and displays for software operation.

2. For operator-oriented software, the User’s Manual shall be expanded to include those operational procedures needed to provide user personnel with instructions sufficient to execute the software. It shall relate these operational procedures to the operational system functions.

3. The User’s Manual shall be produced in the format identified in the relevant SE software development procedures.

4. A top-level software system operational description that specifies the user/machine interface shall be available at the SADR.

5. The Software Detailed Design Review (SDDR) version of the User’s Manual shall be updated as appropriate with the material provided at the SADR.

6. Prior to the start of code and test, a complete preliminary version of the User’s Manual shall be available for use and validation during the project testing phases.

7. The User’s Manual, reflecting the “as built and delivered” software, shall be delivered in final form when the software is delivered to the SE [title/position].

8. The User’s Manual shall be reviewed and approved by the SE [title/position].
Responsibilities

In addition to the responsibilities identified in the preamble to these policies, the following responsibilities are prescribed by this policy:

1. The project software lead engineer shall be responsible for generating and obtaining approval of the User’s Manual.
2. The SE [title/position] shall be responsible for approval of the User’s Manual.
POLICY 9 DEVELOPMENT TEST INFORMATION SHEETS
Policy

SE software projects shall prepare and maintain Development Test Information Sheets (DTISs) for each test conducted during software development in order to provide:

• An organized, accessible collection of all testing and test results
• A means of tracking the progression and status of testing
• A means of test verification

A DTIS shall be prepared for each test defined in the Software Development Test Plan (SDTP) prior to the Software Detailed Design Review (SDDR) and shall be maintained until end-product acceptance. Completed DTISs shall be reviewed for completeness and technical adequacy of the testing conducted and periodically audited to assess compliance with relevant SE software development procedures. The project software lead engineer is responsible for the review and approval of the completed DTISs.
Requirements

1. The DTIS shall be produced in the format specified in the relevant SE software development procedures and shall define:
a. Title of the test to be conducted
b. Requirement(s) to be tested, including the requirement’s unique identifier
c. Specification title where the requirement was defined
d. Objective of the test and the success criteria
e. Test approach to the depth necessary to establish a baseline for resource requirements
f. Required test instrumentation
g. Expected duration of the test
h. Data collection, reduction, and analysis requirements

2. The DTIS for each test defined in the SDTP shall be prepared prior to the SDDR. Review for approval of the software test program at the SDDR shall include an assessment of the adequacy of the test methods and test limits defined in the DTISs.

3. DTISs shall appropriately cover each unit, module, function, or routine during the coding and testing phase of software development and during the integration phase of software development. In addition, any design and code examined during a walk-through shall have a DTIS produced in order to be available for the review.

4. The DTIS shall be used as a guide for test set-up and conduct. Any required test data shall be attached to the DTIS. The test conductor shall sign and date the completed DTIS.

5. At the completion of testing, the project software lead engineer shall review the DTIS for completeness and technical adequacy of the testing conducted and, when satisfied, shall sign and date the DTIS.

6. Periodic audits of DTISs shall be conducted to assess compliance with relevant SE software development procedures. Problems detected in these audits shall be identified in a written summary that shall be attached to the DTIS, and copies shall be sent to the SE [title/position] and the project software lead engineer.
Responsibilities

In addition to the responsibilities identified in the preamble to these policies, the following responsibilities are prescribed by this policy:

1. The DTISs are generated by the project software engineer.
2. The project software lead engineer is responsible for the review and approval of the completed DTIS.
POLICY 10 SOFTWARE DESIGN METHODOLOGY
Policy

SE software projects shall perform software design using a top-down approach, which requires that the design be performed by starting with the top-level system functions and proceeding through a downward allocation, evaluation, and iteration to successively lower levels of design. This design approach enhances design traceability, completeness, and comprehensiveness.
Requirements

1. The design process shall be initiated by establishing a functional design hierarchy in which the top level of design is the overall mission to be performed by the entire software system.

a. Lower levels shall be obtained by breaking down and partitioning the software into blocks with progressively greater functional detail.
b. The software requirements shall then be allocated and mapped onto this design hierarchy.
c. The lowest level of the design hierarchy shall be defined so that its software components can be structured by the program control logic to implement all of the input-to-output paths in the requirements.

2. For the downward development of design below the system level, the following design activities shall be performed:

a. Review and expand upon the functions from previous levels to be performed at this level.
b. Establish criteria for defining the software components at each level, including maximum size of components, use of similar data, and separation of safety-critical functions and data from non-safety-critical functions and data and of time-critical functions from non-time-critical functions.
c. Iteratively evaluate functions, criteria, and design concepts in order to establish the software components defined at each level, the functions performed by each component, and the interactions and interfaces among the software components.
d. Reconcile any differences in the software design and the functional design at each level.
e. Record the criteria, rationale, and trade-offs used to establish the selected software design.
3. The design shall be a hierarchical structure of identifiable software components wherein the highest level of control logic resides at the top of the hierarchy and the computational or algorithmic functions reside at the lower levels. Levels shall be structured so that a lower level does not call upon a higher level.

4. This policy does not preclude the simulation or prototyping of critical lower-level components to the extent necessary to perform design verification as discussed in the sections “Software Architecture Design Review and Acceptance” and “Software Detailed Design Review and Acceptance.”

5. For software programs under modification, the highest level of software structure encompassing all software elements to be modified shall be considered the “top” of the design hierarchy.
POLICY 11 SYSTEM ANALYSIS METHODOLOGY
Policy

SE software projects shall perform system analyses and present the results of these analyses at the Software Architecture Design Review (SADR) and Software Detailed Design Review (SDDR). The system analyses must address the adequacy of the software design to fulfill the software performance and safety requirements and to remain within the allocated design budgets for memory and other storage utilizations, timing allocations, and communications bandwidths. Analytic analysis or simulation shall be employed to demonstrate the adequacy of the selected algorithms to meet accuracy requirements.
Requirements

1. Acceptable analysis techniques include functional simulation, manual static analyses, and walk-throughs.

2. Not-to-exceed design allocation budgets must be defined and validated for each of the following system parameters:

a. Port-to-port thread timing
b. Software task timing
c. Memory utilization
d. Database and secondary storage sizing
e. Effective communication bandwidths
f. Operational window for potential failure modes
g. System critical time

3. The analyses must verify by proof or demonstration that the design is within the allocated budget constraints and fulfills the software performance and accuracy requirements. There must also be analyses that verify the suitability and correctness of critical control algorithms.

4. If a functional simulation is employed, a design notebook for the simulation itself is mandatory, and at least one section of this notebook shall be set aside for the simulator requirements. The top-level requirements for the functional simulation are to be developed by the designers of the software system that is being simulated. These requirements may be documented in a simplified form, such as thread diagrams. Those who are assigned responsibility for implementing the simulator must present a design walk-through of the simulator’s detailed processing threads and data tables to the designers of the software system that is being simulated and obtain their approval prior to the actual implementation of the simulator. Another section of the simulator design notebook shall be devoted to a simulator validation and calibration plan and the results of the simulator validation and calibration efforts.
POLICY 12 SOFTWARE DEVELOPMENT TEST PLAN
Policy

SE software projects shall prepare an overall Software Development Test Plan (SDTP) that defines the scope of software testing that must be successfully completed for each software component. This plan shall be reviewed by the cognizant software lead engineer and approved by the cognizant system engineer.
Requirements

1. The SDTP shall specify how the following items are to be accomplished:

a. Verification of all computations using not only nominal data values but also singular and extreme values
b. Verification of all data input options
c. Verification of all data output options and formats, including error and information messages
d. Exercise of all executable statements in each component
e. Test of all options at each branch point in each component
f. Conduct and monitoring of software testing to assure compliance with this plan

2. The SDTP shall also contain an identification of test input data that must be supplied by external sources and the plan for obtaining these data.

3. The SDTP shall be produced in the format specified in the relevant SE software development procedures.

4. A preliminary version of the SDTP shall be generated by the project software lead engineer prior to the Software Architecture Design Review (SADR). An updated version will be provided at the Software Detailed Design Review (SDDR) for review, and approval will occur prior to code and test.
Responsibilities In addition to the responsibilities identified in the preamble to these policies, the following responsibilities are prescribed by this policy: 1. The project software lead engineer shall be responsible for generating and obtaining approval of the SDTP. 2. The project system engineer shall be responsible for approval of the SDTP.
POLICY 13
SOFTWARE DESIGN WALK-THROUGHS
Policy SE software projects shall conduct component design walk-throughs in order to facilitate the early detection of design errors. Design walk-throughs shall be accomplished by having the software component design reviewed by one or more individuals other than the actual designer.
Requirements 1. Design walk-throughs shall be conducted at the component level as the design of each component is completed. 2. The technique used for the design walk-throughs shall consist of a visual and oral presentation of the design by the originator(s) in the presence of the reviewer(s). 3. The walk-through team shall consist of at least one and no more than four people. It is desirable but not mandatory that the eventual programmer(s) and tester(s) of that component be members of the walk-through team if they did not participate in the design themselves. 4. The scope of the walk-through shall include checks for at least the following items:
a. Responsiveness of the design to the requirements
b. Design completeness and consistency
c. Flow of data through input and output interfaces
d. Testability
e. Error recovery procedures
f. Modularity
g. Simplicity
h. Adherence to SE standards
5. Problems detected during the design walk-through of a component shall be identified in a written summary and made available to the component developer. 6. Development Test Information Sheets (DTISs) should be prepared to support discussion of how the design may be tested.
POLICY 14
SOFTWARE CODE WALK-THROUGHS
Policy SE software projects shall conduct component code walk-throughs in order to facilitate the early detection of design, implementation, and coding errors. Code walk-throughs shall be accomplished by having the software component code reviewed by one or more individuals other than the actual implementor.
Requirements 1. Code walk-throughs shall be conducted at the component level as the implementation of each component is completed. 2. The technique used for the code walk-throughs shall consist of a visual and oral presentation of the implemented code by the originator(s) in the presence of the reviewer(s). 3. The walk-through team shall consist of at least one and no more than four people. The team membership participation criteria should be:
a. It is desirable but not mandatory that the designer(s) and eventual tester(s) of that component be members of the walk-through team
b. It is mandatory that the software verification and validation (V&V) engineer(s) of that component be members of the walk-through team
4. The scope of the walk-through shall include checks for at least the following items:
a. Responsiveness of the code to the requirements
b. Design implementation completeness and consistency
c. Flow of data through input and output interfaces
d. Testability
e. Error recovery procedures
f. Modularity
g. Simplicity
h. Adherence to SE standards
5. Problems detected during the code walk-through of a component shall be identified in a written summary and made available to the component developer. 6. Development Test Information Sheets (DTISs) should be prepared to support discussion of how the code will be tested.
POLICY 15
SOFTWARE END-PRODUCT ACCEPTANCE PLAN
Policy SE software projects shall follow an orderly procedure governed by a written plan to prepare for and achieve cognizant system engineer written approval of all end products of design, code,
and test. This plan, the Software End-product Acceptance Plan (SEAP), shall be prepared by the project software lead engineer and reviewed at the Software Requirements Review (SRR), modified to reflect agreements reached at the SRR, approved by the project system engineer prior to the end of the Software Architecture Design Review (SADR), and modified to reflect agreements reached at the Software Detailed Design Review (SDDR).
Requirements 1. The SEAP shall be prepared as a descriptive checklist of the end products and services required for project system engineer approval. As a minimum, for each such item, the plan shall include:
a. Name or title of the item
b. Indication of the degree of cognizant system engineer concurrence required
c. Required format of the item
d. Schedule for producing and delivering the item
e. Criteria for determining item readiness for close-out
2. The SEAP shall define the procedures and schedules for project system engineer review and approval of each item. 3. The project shall establish and maintain a file to accumulate acceptance-related data for each item. The file for each item shall contain:
a. Applicable part of the SEAP
b. All communications and documentation related to the item's acceptance not contained in the SEAP
c. Physical evidence that the product or service has been produced or performed
d. Written approval where cognizant system engineer approval is required
e. Other supporting documentation as necessary
4. The project shall conduct an acceptance audit near the end of the SE software project, at a time mutually agreed to by the project software lead engineer and the project system engineer, for the purpose of reviewing the status of each acceptance item, achieving close-out of those items still open, and obtaining project system engineer approval of each accepted item. The audit shall be conducted subject to the agreements in the SEAP. 5. The SEAP shall be produced in the format specified in the relevant SE software development procedures.
Responsibilities In addition to the responsibilities identified in the preamble to these policies, the following responsibilities are prescribed by this policy: 1. The project software lead engineer shall be responsible for generating and obtaining approval of the SEAP. 2. The project system engineer shall be responsible for approval of the SEAP.
POLICY 16
INTERFACE DESIGN SPECIFICATION
Policy Systems that consist of multiple autonomous or semi-autonomous subsystems are termed “distributed processing subsystems.” During the course of the design process, such systems shall be decomposed into subsystems, eventually arriving at a state where each of the defined subsystems should contain one autonomous component and could also contain one or more components that are dependent upon the autonomous component. Each such subsystem is then to be developed in accordance with the relevant SE Software Development Policies, with the additional constraint that the inter-subsystem interfaces must be defined and controlled. There will then be an integration and test phase that integrates the individual components into the complete system.
Requirements 1. Given a set of system requirements, there shall be a design stage that results in the decomposition of the system into subsystems. 2. The result of the system-level software design effort is the specification of requirements for each of the subsystems. 3. There shall be an Interface Design Specification (IDS) that completely specifies the interfaces among the subsystems. The IDS shall be placed under configuration control simultaneously with the subsystem requirements specifications.
4. System-level design reviews will result in approval of the Software Requirements Specifications (SRS) and the IDS. 5. The project must establish procedures and mechanisms to achieve the required continuing interaction between the system-level design and the subsystem development efforts. 6. The Software Development Test Plans (SDTPs) and test procedures are to be developed for each of the subsystems and for the overall system.
Responsibilities In addition to the responsibilities identified in the preamble to these policies, the following responsibilities are prescribed by this policy: 1. The project software lead engineer shall be responsible for generating and obtaining approval of the IDS. 2. The IDS shall be reviewed by the: a. Project mechanical engineering manager b. Project electrical engineering manager c. SE [title/position] 3. The project system engineer is responsible for approval of the IDS.
POLICY 17
SOFTWARE REQUIREMENTS SPECIFICATION
Policy SE software projects shall generate a Software Requirements Specification (SRS) that provides a controlled statement of the functional, performance, and external interface requirements for the software end products. The project shall produce an SRS prior to the Software Requirements Review (SRR).
Requirements 1. The SRS shall be produced in the format identified in the relevant SE software development procedures. 2. The SRS shall be approved in writing by the project system engineer and the SE [title/position] as the basis for acceptance of the computer program and database end products. Once approved, the SRS shall be modified only to incorporate changes approved in accordance with the SE Configuration Management Policies. 3. The SRS shall:
a. Identify the system-level functions and the system-level objectives that the software must support
b. Include functional design requirements and performance requirements. Functional requirements shall be derived by allocating the system-level requirements, with the addition of any derived requirements that may be necessary for the orderly execution of the software process. Performance requirements shall establish resource budgets in terms of accuracies, response times, storage, database access and input and output rates, margins for growth, and so on.
c. Define the interfaces between the software to be developed and the hardware, software, external databases, facilities, and personnel with which this software must interact
d. Assign a unique identifier for each unique requirement that may be used in subsequent project phases to trace requirements to (1) parts of the software design and (2) test cases (a traceability sketch follows these requirements)
e. Specify the system's safety-critical parameters and critical indicators identified in the Hazards Analysis, which are controlled or commanded by software
f. Specify criteria for acceptance, including the levels of test, test objectives, and test methods required to validate the software end products
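Requirement 3(d) makes traceability mechanically checkable. The following Python sketch shows one hypothetical way a project might flag requirements that are not yet traced to design components or test cases; all identifiers below are invented for illustration.

# Minimal sketch: trace unique SRS requirement identifiers to design
# components and test cases, and flag untraced requirements.
requirements = {"SRS-001", "SRS-002", "SRS-003"}

design_trace = {"SRS-001": ["CompA"], "SRS-002": ["CompB", "CompC"]}
test_trace = {"SRS-001": ["TC-01"], "SRS-003": ["TC-07"]}

for req in sorted(requirements):
    missing = []
    if req not in design_trace:
        missing.append("design")
    if req not in test_trace:
        missing.append("test")
    if missing:
        print(f"{req}: not traced to {' or '.join(missing)}")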
Responsibilities In addition to the responsibilities identified in the preamble to these policies, the following responsibilities are prescribed by this policy: 1. The project software lead engineer shall be responsible for generating and obtaining approval of the SRS. 2. The project system engineer shall be responsible for approval of the SRS. 3. The SE [title/position] shall be responsible for approval of the SRS.
POLICY 18
SOFTWARE REQUIREMENTS REVIEW AND ACCEPTANCE
Policy After preparation of a Software Requirements Specification (SRS) and prior to the Software Architecture Design Review (SADR), the project software lead engineer shall conduct a Software Requirements Review (SRR) for the SE project. The purpose of this review is to present to and achieve written agreement with the project system engineer on the provisions of the SRS, which will then serve as the basis for software end product acceptance. In preparation for this review, SE software projects shall (1) analyze and evaluate the SRS for its technical acceptability and (2) develop a response to each problem identified in this analysis.
Requirements 1. The project personnel shall prepare for the SRR by performing the following activities:
a. Analyze the completeness, consistency, testability, and technical feasibility of the complete requirements set from both (1) a flow-oriented (system inputs to system outputs) point of view and (2) a functional breakdown (conceptual design) point of view
b. Analyze each individual requirement to verify each of the following:
• Compatibility with system-level objectives where appropriate
• Technical feasibility (i.e., the requirements engineer can postulate a design that will satisfy the requirement and can offer convincing arguments for such a design or can offer a set of design alternatives)
• Testability
• Completeness (i.e., identify TBD items, explicit or implicit)
• Internal consistency
c. Analyze the complete requirements set to determine its compatibility with schedule and other project resources such as personnel, computers, and facilities
d. Develop a response to the problems identified in the above analyses
2. As appropriate, the project shall present at the SRR a description of the analysis techniques utilized, the tools employed to perform these analyses, and the results of these analyses. 3. Issues and problems identified in the requirements analysis and evaluation shall be addressed at the SRR. Agreements and action items with associated due dates resulting
from the SRR shall be documented and co-signed by the project software lead engineer and the project system engineer as a formal output of the SRR. 4. The SRR shall include a review of the Software End-product Acceptance Plan (SEAP). 5. The end result of the SRR shall be an agreement between the project system engineer and SE personnel that the SRS, as modified by agreements reached at the SRR, is acceptable as the basis for software end product acceptance and is established as the formal baseline.
Responsibilities In addition to the responsibilities identified in the preamble to these policies, the following responsibilities are prescribed by this policy: 1. The project software lead engineer shall be responsible for: a. Conducting the SRR b. Documenting and signing agreements and action items resulting from the SRR 2. The project system engineer shall attend the SRR and sign the documented agreements and action items.
POLICY 19
SOFTWARE ARCHITECTURE DESIGN SPECIFICATION
Policy SE software projects shall generate a Software Architecture Design Specification (SADS) for the software end products defined in the Software Requirements Specification (SRS) in order to establish the documented design baseline from which the detailed design will be developed. The SADS shall contain the design information needed to support the detailed definition of the individual software system components and upon completion of the Software Architecture Design Review (SADR) shall become the design baseline for development of the Software Detailed Design Specification (SDDS) used in support of software coding.
Requirements 1. The SADS shall be produced in the format identified in the relevant SE software development procedures. 2. The SADS shall assign each unique requirement in the SRS to specific components of the software design and shall explicitly identify the mapping from requirements to software design. 3. The SADS shall identify and name all the levels of software hierarchy for the software to be developed. 4. For each level of software organization, the SADS shall:
a. Identify by name and function the components at that level
b. Specify the control interfaces affecting components at that level
c. Specify the data interfaces affecting components at that level
d. Identify the data processing flow at that level for each of the basic types of data from point of input to point of output
e. For each of the components that will be adapted from existing software, describe the required development or modification approach
5. The SADS shall specify the data processing resource budgets, such as timing, storage, and accuracy, at an appropriate level of design. 6. The SADS shall identify all required major algorithms and their location within the software design. For all critical algorithms, candidate solutions and their anticipated performance shall be discussed. 7. The SADS shall identify and name the levels of database hierarchy down through the bit or field level. For each level of database hierarchy, the SADS shall identify the name, engineering description, units, defaults, and size of the database components. The dependencies and relationships between the database and software components shall be identified. Each instance of the software routines gaining access to a database entity shall be identified. 8. The SADS shall demonstrate as required that the aggregate data processing resource budgets are within the total available resources and requirements for the selected computer architecture. 9. The SADS shall include a description of the user interface design. 10. The SADS shall address the methods chosen to meet the software testability requirements derived from the test section in the software requirements document and the Software Development Test Plan (SDTP).
11. The SADS shall address the methods chosen to meet the requirements for safety identified in the software requirements document.
Responsibilities In addition to the responsibilities identified in the preamble to these policies, the following responsibilities are prescribed by this policy: 1. The project software lead engineer shall be responsible for generating and obtaining approval of the SADS. 2. The project system engineer shall be responsible for approval of the SADS. 3. The SE [title/position] shall be responsible for approval of the SADS.
POLICY 20
SOFTWARE ARCHITECTURE DESIGN REVIEW AND ACCEPTANCE
Policy SE software projects shall perform those design and planning activities that establish a preliminary software design baseline and the implementation and test plans necessary to proceed into the detailed design and development. The preliminary design, associated plans, and any technical issues shall be reviewed at the Software Architecture Design Review (SADR), conducted by the project software lead engineer, in order to assess the adequacy of the design and plans, identify and resolve issues, and obtain mutual commitment to proceed into the Software Detailed Design Phase.
Requirements 1. The following SADR material shall be available to reviewers at a predetermined time before the SADR. The amount of lead time will be mutually determined by the project software lead engineer and the project system engineer. a. Software Architecture Design Specification (SADS)
b. Approved Software Requirements Specification (SRS) and associated interface specifications
c. Proposed changes to the SRS
d. Preliminary Software Development Test Plan (SDTP)
2. The project shall have completed the following activities prior to the SADR:
a. Preparation of the review materials identified in requirement 1 above
b. Verification that every requirement has been properly accounted for in the design
c. Verification that the design is complete, consistent, feasible, and testable from a flow-oriented (system inputs to system outputs) point of view and a functional breakdown point of view
d. Verification that the aggregate design budgets for items like storage, timing, and accuracy for the software system components satisfy the SRS and do not exceed the limitations of the software's physical and functional environments, including margins for growth
e. Review of the implementation plans and project or performer commitments to these plans
f. Identification of any issues of a technical nature, including any requirements not satisfied, and preparation of a recommended project disposition
g. Assembly of engineering analysis material substantiating software design and algorithm selection and, for high-risk items, identification of the design approach and alternatives
3. For those SE software developments that involve expedient response constraints or high volume of input and output messages, sharing and competing for data processing resources, or where utilization of computer throughput or storage is part of the specifications, the use of functional simulation is strongly recommended to validate the viability and conformance to specifications of the hardware and software architectures. 4. The SADR meeting with the project system engineer shall consist of a presentation by project personnel which addresses at least the following items:
a. An overview of the design identifying software structure, supporting design rationale, software operation in the system environment, functional characteristics, and the user interface
b. Results of the design verification activities identified in requirements 2(c) and 2(d) above
c. An overview of the implementation and test plans
d. Critical technical issues
These items shall then be followed by a resolution of any outstanding issues and an agreement with the project system engineer to proceed into the Software Detailed Design Phase.
5. The disposition of all identified technical issues and other agreements and action items with associated due dates shall be documented and co-signed by the project software lead engineer and the project system engineer as a formal output of the meeting.
Responsibilities In addition to the responsibilities identified in the preamble to these policies, the following responsibilities are prescribed by this policy: 1. The project software lead engineer shall conduct the SADR, which must result in approval to proceed to the next phase of development. 2. The SE [title/position] is responsible for approval of the SADR. 3. The project software lead engineer is responsible for approval of the SADR. 4. The project system engineer is responsible for approval of the SADR.
POLICY 21
SOFTWARE DETAILED DESIGN SPECIFICATION
Policy SE software projects shall update and expand the Software Architecture Design Specification (SADS) to produce a Software Detailed Design Specification (SDDS) for the defined software end products. This specification shall establish a detailed (“build-to”) design from which the code will be produced.
Requirements 1. The SDDS shall be produced in the format identified in the relevant SE software development procedures. 2. The SDDS shall constitute an update to and expansion of the design baseline established at the Software Architecture Design Review (SADR), including a description of the overall program operation and control and the use of common data. The detailed
design shall be described through the lowest component level of software organization and the lowest logical level of database organization. 3. The SDDS shall contain an overview that encompasses the design of the overall computer program. It shall emphasize timing, storage, and accuracy and shall refer to and update the SADS. 4. The detailed design shall adhere to the basic control structures allowed in structured programming as described in the relevant programming standards and conventions parts of the relevant SE software development procedures. 5. The SDDS shall contain the following information for each component (a structured-record sketch follows these requirements):
a. Component name
b. Purpose
c. Assumptions
d. Sizing of code and data
e. Calling sequence, arguments and definitions, error exits, and return status
f. Inputs, processing, and outputs
g. Components called by this component
h. Components calling this component
i. Engineering description, including equations and a processing flow description expressed in a flowchart, in a program design language, or in other suitable description format
j. Restrictions and limitations
6. The SDDS shall contain a complete definition of the database down through the bit or field level. Each instance of the software components gaining access to a database entity shall be identified. 7. The SDDS shall be updated after completion of integration testing to reflect the "as-built" software product configuration, in order to provide a basis for delivery and subsequent software maintenance.
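The per-component information required by item 5 lends itself to a structured record whose completeness can be audited automatically. The Python sketch below is a hypothetical illustration; the field names paraphrase item 5, and the sample component is invented.

# Minimal sketch: a structured record of the SDDS per-component
# information (hypothetical content).
from dataclasses import dataclass, field, fields

@dataclass
class ComponentDesign:
    name: str
    purpose: str
    assumptions: str
    sizing: str                       # sizing of code and data
    calling_sequence: str             # arguments, error exits, return status
    inputs_processing_outputs: str
    calls: list = field(default_factory=list)       # components called
    called_by: list = field(default_factory=list)   # calling components
    engineering_description: str = ""
    restrictions: str = ""

def incomplete_fields(component):
    """Return the names of required textual fields left empty."""
    return [f.name for f in fields(component)
            if isinstance(getattr(component, f.name), str)
            and not getattr(component, f.name)]

c = ComponentDesign("pump_control", "Regulate pump rate", "Timer tick = 1 ms",
                    "2 KB code / 128 B data", "pump_control(rate) -> status",
                    "rate in; PWM duty out")
print(incomplete_fields(c))   # ['engineering_description', 'restrictions']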
Responsibilities In addition to the responsibilities identified in the preamble to these policies, the following responsibilities are prescribed by this policy: 1. The project software lead engineer shall be responsible for generating and obtaining approval of the SDDS. 2. The project system engineer is responsible for approval of the SDDS.
3. The SE [title/position] is responsible for approval of the SDDS.
POLICY 22
SOFTWARE DETAILED DESIGN REVIEW AND ACCEPTANCE
Policy SE software projects shall perform those design and planning activities that establish a Software Detailed Design Specification (SDDS) and the associated implementation and test plans necessary to proceed into the code and test phase. The detailed design, its specification, associated plans, and any critical issues shall be reviewed at a Software Detailed Design Review (SDDR) conducted by the project software lead engineer in order to:
• Gain concurrence in the adequacy of the detailed design and plans
• Resolve any identified issues
• Obtain commitment to proceed into the code and test phase
• Obtain commitment to a test program supporting product acceptance
Requirements 1. The following SDDR material shall be available to reviewers at a predetermined time before the SDDR. The amount of lead time will be mutually determined by the project software lead engineer and the project system engineer.
a. The SDDS
b. The approved Software Requirements Specification (SRS) and associated interface specifications
c. Proposed changes to the SRS and associated interface specifications
d. The updated or current version of the Software Development Test Plan (SDTP)
e. Design evaluation or trade-off study results
2. Project personnel shall have completed the following activities prior to the SDDR:
a. Preparation of the review materials identified in item 1 above
b. Verification that every requirement has been properly accounted for in the design
c. Verification that the detailed design is complete, consistent, feasible, and testable from a flow-oriented (system inputs to system outputs) point of view and a functional breakdown point of view
d. Verification that the detailed design and critical parameter budgets for items such as storage, timing, and accuracy for the software system components do not collectively exceed the limits given in the SRS and additionally do not exceed the limitations of the software's physical and functional environments, including margins for growth
e. Review of the current, detailed implementation and test plans and project or performer commitments to these plans
f. Identification of any critical and technical issues, including any requirements not satisfied, and preparation of a recommended project disposition
g. Assembly of engineering analysis material substantiating software detailed design and algorithm selection
3. The SDDR meeting with the project system engineer shall consist of a presentation by project personnel which addresses at least the following items:
a. An overview of the detailed design identifying software structure, component interface and interaction, supporting design rationale, software operation in the system environment, and detailed user interface
b. Results of the design verification activities
c. An overview of the implementation and test plans
d. Critical technical issues
These items shall be followed by a resolution of any unresolved issues, leading to an agreement with the project system engineer to proceed into the Code and Test Phase and an agreed-to test program supporting product acceptance. 4. The disposition of all identified critical issues and other agreements and action items shall be documented and co-signed by the project software lead engineer and the project system engineer as a formal output of the meeting.
Responsibilities In addition to the responsibilities identified in the preamble to these policies, the following responsibilities are prescribed by this policy: 1. The software lead engineer shall conduct the SDDR, which must result in approval to proceed to the next phase of development.
2. The SE [title/position] is responsible for approval of the SDDR. 3. The project software lead engineer is responsible for approval of the SDDR. 4. The project system engineer is responsible for approval of the SDDR.
POLICY 23
ANOMALY REPORTING AND RESOLUTION
Policy The project software V&V lead engineer shall be responsible for the proper documentation and reporting of software anomalies on a Software Anomaly Report. All anomalies shall be reported regardless of the perceived impact on software development or the severity of the anomaly with respect to system operation. Software Anomaly Reports shall be reviewed by the project software lead engineer for anomaly solution determination and implementation authorization. The project software V&V lead engineer shall be responsible for anomaly report closure. The SE [title/position] shall be responsible for the approval or disapproval of the distribution of Software Anomaly Reports.
Requirements 1. A Software Anomaly Report shall be used to identify problems detected during software development activities. The specific information required includes:
a. Description and location of the anomaly
b. Severity of the anomaly
c. Cause and method of identifying the anomalous behavior
d. Recommended action and actions taken to correct the anomalous behavior
e. Impact of the problem on the system capability of the product and on the continued conduct of V&V phase activities
2. The form of the Software Anomaly Report shall be as defined in the relevant SE software development procedures. The configuration identification, tracking, and status reporting of Software Anomaly Reports shall be in accordance with the project’s Software Configuration Management Plan (SCMP).
3. The projected impact of an anomaly shall be determined by evaluating the severity of its effect on the operation of the system. The severity of a Software Anomaly Report shall be defined as one of the following (a classification sketch follows these requirements):
• High. The change is required to correct a condition that prevents or seriously degrades a system objective where no alternative exists or to correct a safety-related problem.
• Medium. The change is required to correct a condition that degrades a system objective, to provide for performance improvement, or to confirm that the user or system requirements can be met.
• Low. The change is desirable to maintain the system, to correct an operator inconvenience, or to make some other minor improvement.
4. The project software V&V lead engineer shall be responsible for ensuring the proper documentation and reporting of software anomalies. All anomalies shall be reported regardless of the perceived impact on software development or the severity of the anomaly with respect to the system operation. 5. Software Anomaly Reports shall be reviewed by the project's software lead engineer for anomaly validity, type, and severity. The project's software lead engineer can direct additional investigation, if required, to assess the validity of the anomaly or the proposed solution. An anomaly solution that does not require a change to a baselined software configuration item may be approved by the project's software lead engineer. If the anomaly requires a change to a baselined software configuration item, then the anomaly solution shall be approved in accordance with the project's SCMP. 6. When an anomaly solution is approved and the personnel responsible for performing the corrective action are indicated, the project's software lead engineer shall authorize implementation of the corrective action. 7. The project software V&V lead engineer shall be responsible for anomaly report closure, which includes:
a. Documenting the corrective action(s) taken
b. Verifying the incorporation of authorized changes as described in the anomaly report
c. Reporting the status of the Software Anomaly Report to the project's software lead engineer and the SE [title/position]
8. The SE [title/position] shall be responsible for the approval or disapproval of the distribution of Software Anomaly Reports that are closed. Upon approval, the project software V&V lead engineer shall distribute closed Software Anomaly Reports to the software project Quality Assurance representative(s).
9. The SE [title/position] shall ensure the resolution of anomalies that are indicated on the Software Anomaly Report with a severity of “high” before the software project proceeds to the next software development phase.
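Requirement 9 amounts to a simple gate on open high-severity anomalies before phase transition. The Python sketch below is hypothetical; the report data are invented for illustration.

# Minimal sketch: apply the High/Medium/Low severity definitions to a
# phase-transition gate (requirement 9).
anomaly_reports = [
    {"id": "SAR-014", "severity": "high", "open": False},
    {"id": "SAR-015", "severity": "medium", "open": True},
    {"id": "SAR-016", "severity": "high", "open": True},
]

def may_proceed_to_next_phase(reports):
    """Proceed only if every 'high' severity anomaly is resolved."""
    blocking = [r["id"] for r in reports
                if r["severity"] == "high" and r["open"]]
    return (not blocking), blocking

ok, blocking = may_proceed_to_next_phase(anomaly_reports)
print("proceed" if ok else f"blocked by: {blocking}")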
Responsibilities In addition to the responsibilities identified in the preamble to these policies, the following responsibilities are prescribed by this policy: 1. The project software V&V lead engineer shall be responsible for: a. Proper documentation and reporting of software anomalies b. Anomaly report closure c. Distribution of closed Software Anomaly Reports to the software project Quality Assurance representative(s) 2. The project software lead engineer shall be responsible for: a. Review of Software Anomaly Reports for anomaly validity, type, and severity b. Directing additional investigation if required to assess the validity of the anomaly or the proposed solution c. Approval or disapproval of an anomaly solution that does not require a change to a baselined software configuration item d. Authorization of corrective action implementation 3. The SE [title/position] shall be responsible for: a. Approval or disapproval of the distribution of closed Software Anomaly Reports b. Ensuring the resolution of anomalies that are indicated on the Software Anomaly Report with a severity of “high” before the software project proceeds to the next software development phase
GLOSSARY
Acceptance audit: Audit conducted near the end of a software project for the purpose of reviewing the status of each acceptance item, achieving close-out of those items still open, and obtaining approval of each accepted item.
Acceptance criteria: Criteria that a software end product must meet in order to successfully complete a test phase or meet delivery requirements.
Accuracy: Quantitative assessment of freedom from error.
Algorithm: Finite set of well-defined rules for the solution of a problem in a finite number of steps.
Audit: Independent review for the purpose of assessing compliance with software requirements, specifications, baselines, standards, procedures, instructions, and coding requirements.
Baseline: Specification or product that has been formally reviewed and agreed upon, that thereafter serves as the basis for further development, and that can be changed only through formal change control procedures.
Change control: Process by which a change is proposed, evaluated, approved or rejected, scheduled, and tracked.
Code: Loosely, one or more computer programs or part of a computer program.
Code and Test: Phase of the software development life cycle during which a software end product is created from design documentation and tested.
Completeness: Those attributes of the software and documentation that provide full implementation of the functions required.
Component: Unit of code that performs a specific task or a group of logically related code units that perform a specific task or set of tasks.
Computer attributes: Characteristics of a computer that determine multiplicative cost factors that estimate the software development effort.
Computer program: Sequence of instructions suitable for processing by a computer. Processing may include the use of an assembler, a compiler, an interpreter, or a translator to prepare the program for execution as well as to execute it.
Conceptual design: Functional breakdown of a system.
Configuration control: Process of evaluating, approving or disapproving, and coordinating changes to configuration items after formal establishment of their configuration identification.
Configuration identification: Process of designating the configuration items in a system and recording their characteristics.
Configuration item: Aggregation of hardware, software, or any of its discrete parts that satisfies an end-use function.
Configuration management (CM): Process of identifying and defining the configuration items in a system, controlling the release and change of these items throughout the product life cycle, recording and reporting the status of configuration items and change requests, and verifying the completeness and correctness of configuration items.
Consistency: Those attributes of the software and documentation that provide uniformity in the specification, design, and implementation of the product.
Correctness: Extent to which software is free from design defects, coding defects, and faults; meets its specified requirements; and meets user expectations.
Delivery: Transfer of responsibility for an item from one activity to another, as in the delivery of the validated software product to quality assurance personnel for certification.
Design phase: Period in the software development cycle during which the designs for architecture, software components, interfaces, and data are created, documented, and verified to satisfy requirements.
Design requirement: Any requirement that impacts or constrains the design of a software system or software system component.
Deviation: Authorization for a future activity, event, or product that departs from standard procedures.
Distributed processing subsystems: Systems that consist of multiple autonomous or semiautonomous subsystems.
Documentation: Manuals, written procedures or policies, records, or reports that provide information concerning uses, maintenance, or validation of software.
Error: Discrepancy between a computed, observed, or measured value or condition and the true, specified, or theoretically correct value or condition.
Evaluation: Process of determining whether an item or activity meets specified criteria.
Failure: Inability of a system or system component to perform its required function (see fault).
Fault: Defect of a system or system component, caused by a defective, missing, or extraneous instruction or set of related instructions in the definition, specification, design, or implementation of a system, that may lead to a failure.
Flow oriented: Representing flow of system inputs to system outputs.
Functional simulation: Simulation of the functional operations of a system.
Hardware tools: Hardware products that serve the required function of supporting software development activities.
Hazard: Dangerous state of a device or system that may lead to death, injury, occupational illness, or damage to or loss of equipment or property.
Hazard analysis: Listing of potential hazards associated with a device or system, with an estimation of the severity of each hazard and its probability of occurrence.
Implementation phase: Period in the software development cycle during which a software product is created from design documentation and debugged.
Inspection: Formal evaluation technique in which software requirements, design, or code are examined in detail by a person or group other than the author to detect faults, violations of development standards, or other problems.
Integration: Process of combining software elements, hardware elements, or both into an overall system.
Integrity: Accuracy in an item's compliance with its requirements.
Interface Design Specification (IDS): Project-specific document that completely specifies the interfaces among the subsystems.
Iteration: Process of repeatedly executing a given sequence of steps until a given condition is met or while a given condition is true.
KDSI: Thousand delivered source instructions.
Milestone: Scheduled and accountable event that is used to measure progress.
Non-project-related person: Person not directly involved in the specification of software requirements or acceptance criteria.
Product attributes: Characteristics of a product that determine multiplicative cost factors that estimate the software development effort.
Product history file: Compilation of records containing the complete development history of a finished product.
Product Objectives Document (POD): Project-specific document that specifies the objective of the product in terms of marketing and function.
Programming convention: Customary or agreed-upon rule or procedure reflecting a commonly accepted practice.
Programming standard: Definite rule or procedure established and imposed by authority.
Project attributes: Characteristics of a project that determine multiplicative factors that estimate the software development effort.
Quality assurance (QA): Planned and systematic pattern of all actions necessary to provide adequate confidence that the item or product conforms to established technical requirements.
Reliability: Ability of an item to perform a required function under stated conditions for a stated period of time.
Requirements phase: Period in the software development cycle during which the requirements, such as functional and performance capabilities for a software product, are defined and documented.
Safety: Provision of a very high degree of freedom, within the constraints of system effectiveness and cost, from those conditions that can cause death, injury, occupational illness, or damage to or loss of equipment or property.
Safety-critical parameters: System parameters within which the system must operate in order to be safe.
Software: Computer programs, procedures, rules, and associated documentation and data pertaining to the operation of a computer system.
Software Architecture Design Review (SADR): Review conducted for the purpose of: (1) reviewing the project's architecture design, SADS, associated plans, and critical issues; (2) resolving identified issues; and (3) obtaining commitment to proceed into the Detailed Design Phase.
Software Architecture Design Specification (SADS): Project-specific document that contains the design information needed to support the detailed definition of the individual software system components and, upon completion of the Architecture Design Review, becomes the design baseline for development of the SDDS used in support of software coding.
Software code walk-throughs: Reviews conducted at the component source code level, as the implementation of each component is completed, with the purpose of detecting implementation, documentation, and programming standards problems. Correctness and efficiency may also be evaluated.
Software configuration management (SCM): Discipline of identifying the configuration of a software system at discrete points in time for the purpose of systematically controlling changes to this configuration and maintaining the integrity and traceability of this configuration throughout the development process.
Software Configuration Management Plan (SCMP): Project-specific plan that specifies the methods and planning employed to implement SCM activities.
Software design walk-throughs: Reviews conducted at the component level, as the design of each component is completed, with the purpose of detecting design problems.
Software Detailed Design Review (SDDR): Review conducted for the purpose of: (1) reviewing the project's detailed design, SDDS, associated plans, and critical issues; (2) resolving identified issues; (3) obtaining commitment to proceed into the code and test phase; and (4) obtaining commitment to a test program supporting product acceptance.
Software Detailed Design Specification (SDDS): Project-specific document that constitutes an update to and an expansion of the design baseline established at the Architecture Design Review, including a description of the overall program operation and control and the use of common data. The detailed design is described through the lowest component level of software organization and the lowest logical level of database organization.
Software development estimations: Estimations of a project's software development schedule, personnel estimate, tasking, KDSI size, and time allocation.
Software development life cycle: Period that starts with the development of a software product and ends when the product is validated and delivered for QA certification. This life cycle includes a requirements phase, design phase, implementation phase, and software validation phase.
Software Development Plan (SDP): Project-specific plan that identifies and describes the procedures employed to implement the management activities that coordinate schedules, control resources, initiate actions, and monitor progress of the software development effort.
Software Development Policy Change Control Board (SDPCCB): Board that establishes and maintains a set of software policies in order to promote effective and consistent software development, verification and validation, configuration management, and non-project administrative practices.
Software development procedures: Procedures that provide detailed guidance for software development within the framework and requirements provided by the SE Software Development Policies.
Software Development Test Plan (SDTP): Project-specific plan that defines the scope of software testing that must be successfully completed for each software component developed.
Software end products: Computer programs, software documentation, and databases produced by a software development project.
Software End-product Acceptance Plan (SEAP): Project-specific plan designed to serve as a descriptive checklist of the end products and services required for approval.
Software project: Planned and authorized undertaking of specified scope and duration which results in the expenditure of resources toward the development of a product that is primarily one or more computer programs.
Software quality: Totality of features and characteristics of a software product that bear on its ability to satisfy given needs.
Software quality assurance: Independent assurance that a software project's products meet appropriate standards for software quality.
Software Quality Assurance Plan (SQAP): Project-specific plan that states the software quality objectives of the project as determined by the product requirements and the significance of the intended application.
Software reliability: Probability that software will not cause the failure of a system for a specified time under specified conditions.
Software Requirements Review (SRR): Software review conducted to review the provisions of the Software Requirements Specification, which, once approved, will serve as the basis of software end-product acceptance.
Software Requirements Specification (SRS): Project-specific document that provides a controlled statement of the functional, performance, and external interface requirements for the software end products.
Software system: Software components and their interfaces.
Software tool: Computer program used to help develop, test, analyze, or maintain another computer program or its documentation.
Software User's Manual: Project-specific manual that contains the instructions necessary to operate the software system.
Software Validation Phase: Period in the software development life cycle in which the components of a software product are evaluated and integrated and the entire software product is evaluated to determine whether or not requirements have been satisfied.
Source code: Original software expressed in human-readable form (programming language) which must be translated into machine-readable form before it can be executed by the computer.
System analysis: Analysis of a system specifically addressing the adequacy of the software design to fulfill the software performance and safety requirements and to remain within the allocated design budgets for memory and other storage utilizations, timing allocations, and communication bandwidths.
Test Information Sheet (TIS): Document that defines the objectives, approach, and requirements for a specific test.
Testability: Extent to which software facilitates both the establishment of test criteria and the evaluation of the software with respect to those criteria or the extent to which the definition of requirements facilitates analysis of the requirements to establish test criteria.
To be determined (TBD): Any item or issue not fully resolved.
Top-down software design: Design approach that starts with the top-level system functions and proceeds through a downward allocation, evaluation, and iteration to successively lower levels of design and which enhances design traceability, completeness, and comprehension.
Validation: Process of evaluating software at the end of the software development process to ensure compliance with software requirements.
Verification: Process of determining whether or not the products of a given phase of the software development cycle fulfill the requirements established during the previous phase.
Waiver: Authorization to depart from SE policy for an activity, event, or product that has already been initiated.
Walk-through: Review in which the designer or programmer leads members of the review team through a segment of design or code, while the reviewers ask questions and submit comments about technique, style, possible errors, violation of development standards, and other problems.
Work Breakdown Structure (WBS): Form that clarifies staffing levels required for a project by month and engineering category.
SE-SMG SOFTWARE ENGINEERING SOFTWARE METRICS GUIDELINES
Written by: [Name/Title/Position]  Date
Reviewed by: [Name/Title/Position]  Date
Approved by: [Name/Title/Position]  Date
Document Number: [aaa]-SESMG-[#.#]  Revision: [#.#]  Page: 1 of [#]
REVISION HISTORY
Revision  Description  Date
[##.##]  [Revision description]  [mm/dd/yy]
CONTENTS
1.0 INTRODUCTION 3
2.0 METRIC DESCRIPTION 4
3.0 METRIC COLLECTION AND REPORTING 7
GLOSSARY 12
1.0 INTRODUCTION
1.1 Purpose
This document describes a set of industry-standard software metrics for assessing the quality of the software processes and products through each phase of the software development life cycle.
1.2 Scope
This document applies to the development of all engineering and product software.
1.3 Overview
This document describes metrics that are applicable to a given software development phase. The method for collecting and reporting each metric is also described. Through periodic reporting and review of each metric, a continual assessment of the project can be made. This allows for timely adjustments, if required. At the end of the project, the metrics serve as baselines for making predictions about certain attributes of future projects.
1.4 References
• Software Metrics: A Rigorous Approach. Norman E. Fenton. Chapman and Hall, 1991.
• Software Metrics: Establishing a Company-Wide Program. Robert B. Grady and Deborah L. Caswell of Hewlett-Packard. Prentice-Hall, 1987.
• IEEE Standard for a Software Quality Metrics Methodology. IEEE Computer Society. IEEE Std 1061, 1992.
• Structured Testing: A Software Testing Methodology Using the Cyclomatic Complexity Metric. Thomas J. McCabe. National Bureau of Standards Special Publication 500-99, 1982.
2.0 METRIC DESCRIPTION
2.1 Process Metrics

2.1.1 Maturity Level
Upon completion of a project, a measurement will be made of [company name]'s software development process maturity. This metric is derived from the Software Engineering Institute (SEI) Maturity Model. The model is a set of guidelines used to control a software development process. The SEI Maturity Model requires that every software development project maintain a development and implementation process covering all life cycle phases. The goal of the model is to ensure that a quality software product is delivered. The model evaluates software processes by comparing their maturity to a set of standard practices. The evaluation is based on five maturity levels (1–5). A level 1 evaluation means that no process or software quality organization is in place. Each of the successive levels evaluates a software development project according to the processes and use of a software quality organization to guide and control the development process. The main mechanism for controlling software quality processes is internal audits. A software organization is mature when its processes are repeatable, defined, managed, and optimized.
2.1.2 Traceability Coverage
The metrics described in this section provide a measure of completeness with regard to requirements implementation. The first metric is the percentage of software-related system requirements in the System Requirements Document (SRD) which have been decomposed into software requirements as specified in the Software Requirements Specification (SRS). This metric should be 100 percent the first time, unless the Software Development Plan (SDP) specifies a phased release of the SRS. The second metric is the percentage of requirements in the SRS that have been implemented in the code. This metric should be 100 percent the first time. The final metric is the percentage of software-related system requirements in the SRD that are implemented in the code. This metric provides a cross-check between the first two metrics. It should only be 100 percent when the other metrics are 100 percent.
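The three coverage percentages reduce to set arithmetic over the traceability data. The Python sketch below is a hypothetical illustration with invented identifiers and counts.

# Minimal sketch: the three traceability coverage metrics of this
# section, computed from hypothetical traceability sets.
srd_software_reqs = {"SRD-1", "SRD-2", "SRD-3", "SRD-4"}
srd_decomposed_in_srs = {"SRD-1", "SRD-2", "SRD-3"}
srs_reqs = {"SRS-1", "SRS-2", "SRS-3", "SRS-4", "SRS-5"}
srs_implemented = {"SRS-1", "SRS-2", "SRS-3", "SRS-4"}
srd_implemented = {"SRD-1", "SRD-2"}

def pct(part, whole):
    return 100.0 * len(part) / len(whole)

print(f"SRD -> SRS decomposition: {pct(srd_decomposed_in_srs, srd_software_reqs):.0f}%")
print(f"SRS -> code implementation: {pct(srs_implemented, srs_reqs):.0f}%")
print(f"SRD -> code cross-check: {pct(srd_implemented, srd_software_reqs):.0f}%")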
2.1.3 Requirements Stability
The metrics in this category provide a measure of software requirements stability. The first metric is the number of Class I and II CRAs that affect the SRS. The second metric is the
number of anomalies that result in a Class I or II CRA against the SRS. The rate of change of these metrics should decrease over time.
2.1.4 Design Stability
The metrics in this category provide a measure of design stability. The first metric is the number of Class I and II CRAs that affect the Software Detailed Design Specification (SDDS). The second metric is the number of anomalies that result in a Class I or II CRA against the SDDS. The rate of change of these metrics should decrease over time.
2.1.5 Fault Profiles

Anomaly and CRA profiles track the number of open versus closed reports by type and development phase. The number of open reports, and its rate of change, should decrease over time. Another metric is the average time an anomaly report remains open; it measures how quickly anomalies are being corrected and should also decrease over time.
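A minimal sketch of how the open/closed counts and the average open age might be derived from anomaly records. The report dates are hypothetical, and a real profile would also partition the counts by report type and development phase.

    from datetime import date

    # Hypothetical anomaly reports: (date opened, date closed or None if open).
    reports = [
        (date(2002, 1, 7), date(2002, 1, 21)),
        (date(2002, 1, 14), date(2002, 2, 4)),
        (date(2002, 2, 11), None),
    ]
    as_of = date(2002, 3, 4)  # reporting date

    open_count = sum(1 for _, closed in reports if closed is None)
    closed_count = len(reports) - open_count

    # Open age: closed reports accrue age up to closure; open reports accrue
    # age up to the reporting date.
    ages = [((closed or as_of) - opened).days for opened, closed in reports]
    average_open_age = sum(ages) / len(reports)
    print(open_count, closed_count, average_open_age)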
2.2 Product Metrics

2.2.1 Test Coverage

The test coverage metrics provide a measure of the breadth of software requirements testing as well as the degree of component testing. The metrics for requirements test coverage consist of the percentage of requirements tested and the percentage of requirements whose test results were successful. These metrics apply to both the software development and V&V groups. The development metrics should be 100 percent at the end of the Implementation Phase. The V&V metrics should be 100 percent at the end of the Validation Phase. The metric for measuring the degree of component testing is the percentage of [McCabe] test paths that have been tested. This metric should be 100 percent at the end of the Implementation Phase.
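The same percentage arithmetic applies to the test coverage metrics. The counts below are hypothetical; a real report would take them from the development and V&V test logs.

    # Hypothetical counts from test logs.
    reqs_total = 120
    tested_dev, passed_dev = 120, 118     # development group
    tested_vv, passed_vv = 96, 95         # V&V group
    paths_total, paths_tested = 410, 402  # [McCabe] test paths

    def pct(part, whole):
        return 100.0 * part / whole

    print(pct(tested_dev, reqs_total), pct(passed_dev, reqs_total))
    print(pct(tested_vv, reqs_total), pct(passed_vv, reqs_total))
    print(pct(paths_tested, paths_total))  # degree of component testing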
2.2.2 Program Size

The metrics for program size are divided into two categories: number of source lines and token counts. The count of source lines consists of code lines without comments, code lines with comments, comment lines, blank lines, and the total count. The token count consists of number of unique operators, number of unique operands, total number of operators, and total number of operands. The token count metrics are used in the calculation of some of the composite metrics.
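A sketch of the line-count portion of this metric for a C source file. The comment handling is deliberately simplified (block comments spanning multiple lines are not tracked), so treat it as an illustration rather than a measurement tool.

    def count_lines(path):
        # Classify each line of a C source file into the five count categories.
        counts = {"code": 0, "code_with_comment": 0, "comment": 0,
                  "blank": 0, "total": 0}
        with open(path) as src:
            for line in src:
                counts["total"] += 1
                text = line.strip()
                if not text:
                    counts["blank"] += 1
                elif text.startswith("//") or text.startswith("/*"):
                    counts["comment"] += 1
                elif "//" in text or "/*" in text:
                    counts["code_with_comment"] += 1
                else:
                    counts["code"] += 1
        return counts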
2.2.3 Function Size

The metrics for the size of a C function are source line counts and token counts. From these counts, statistical characteristics about function size are derived. These characteristics are the minimum value, maximum value, mean, and standard deviation. The minimum, maximum, and mean of total source lines should be close to the industry-recommended standard of 100 lines.
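Deriving the statistical characteristics is then a one-liner per statistic. The function sizes below are hypothetical total-source-line counts for the C functions of a program.

    import statistics

    function_sizes = [42, 87, 103, 64, 95, 110, 78]  # hypothetical counts

    print(min(function_sizes), max(function_sizes))
    print(statistics.mean(function_sizes))   # compare against the 100-line guideline
    print(statistics.stdev(function_sizes))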
2.2.4 Logic Structure

The logic structure metrics measure aspects of a module's control-flow structure. The cyclomatic complexity metric is the number of independent paths through a module's flow graph and represents the ideal number of test paths that should be exercised during component testing. The industry-recommended threshold is 10. The essential complexity metric is the degree to which a module contains unstructured constructs. The industry-recommended threshold is 4. The module design complexity is a measure of the module's calling pattern to its immediately subordinate modules and represents the minimum number of test paths that should be exercised to test all invocations of subordinate modules. The last logic structure metric is the number of branches in the module's flow graph. From these metrics, statistical characteristics about the logic structure of the program are derived: minimum and maximum value, mean, and standard deviation. These values should be as low as possible; low values provide a qualitative indication that the program is testable, understandable, and maintainable.
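Using the glossary definition v(G) = e – n + 2, cyclomatic complexity and branch count can be computed from a module's flow graph. The adjacency list below represents a hypothetical single-decision module.

    # Flow graph as node -> list of successor nodes (hypothetical module).
    flow_graph = {
        "entry": ["decision"],
        "decision": ["then", "else"],  # branch node: more than one successor
        "then": ["exit"],
        "else": ["exit"],
        "exit": [],
    }

    n = len(flow_graph)                           # nodes
    e = sum(len(s) for s in flow_graph.values())  # edges
    cyclomatic = e - n + 2                        # v(G); threshold 10
    branches = sum(1 for s in flow_graph.values() if len(s) > 1)
    print(cyclomatic, branches)  # 2 independent paths, 1 branch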
2.2.5 Composite

The composite metrics are those defined by Maurice Halstead in the mid-1970s and reflect the view that software comprehension is a process of mental manipulation of program tokens. Some of Halstead's metrics show a correlation to quality factors such as code errors and maintenance effort. The Halstead measures will initially be kept in [company name]'s metric set, with the understanding that some or all may be discarded if they show no statistical significance. The Halstead metrics are Program Length, Program Volume, Program Level, Program Difficulty, Intelligent Content, Programming Effort, Error Estimate, and Programming Time.
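A sketch of the Halstead computations from the token counts of Section 2.2.2, using the defining equations given in the glossary; the token counts themselves are hypothetical.

    import math

    n1, n2 = 18, 31    # distinct operators, distinct operands (hypothetical)
    N1, N2 = 130, 92   # total operators, total operands (hypothetical)

    length = N1 + N2                       # program length N
    volume = length * math.log2(n1 + n2)   # V = N x log2(vocabulary)
    level = (2.0 / n1) * (n2 / N2)         # estimated program level L
    difficulty = 1.0 / level               # program difficulty, inverse of level
    content = level * volume               # intelligent content
    effort = volume / level                # E = V / L
    time_seconds = effort / 18.0           # T = E / S; Stroud number S in 5..20
    error_estimate = volume / 3000.0       # V / Eo; Eo in 3000..3200
    print(volume, difficulty, effort, time_seconds, error_estimate)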
3.0 METRIC COLLECTION AND REPORTING
3.1 Metric Forms

The forms in Figures 1, 2, and 3 are used for metrics reporting. The allocation of those forms to metric categories is indicated in Table 1.
3.2 Metric Collection

Table 2 depicts the applicable metrics for each software development phase.
Table 1  Allocation of Metrics Categories to Metrics Form

METRIC CATEGORY           FIGURE 1   FIGURE 2   FIGURE 3
Traceability coverage        ×
Requirements stability       ×
Design stability             ×
Fault profiles               ×          ×
Test coverage                ×          ×
Program size                                       ×
Function size                                      ×
Logic structure                                    ×
Composite                                          ×
Figure 1  Software Engineering Metrics and Primary Characteristics Form

[Printed form recording, for a given project and date, the value of each metric against its category and primary characteristic: management (measurements: computed maturity level, software engineering environment, methodology, and process); requirements (traceability: percent requirements traced from system requirements to SRS, system requirements to code, and SRS to code; stability: number of requirement changes, stability index, and design stability); and quality/reliability (fault profiles: open and closed anomalies and CRAs, counted by development and by V&V, and average open age; breadth of testing: percent requirements tested and percent requirements passed; depth of testing: degree of code testing; complexity: complexity indices).]
Figure 2  Software Development Code and Error Metrics Form

[Printed form recording, for a given project, date, and revision, the number of files and functions, and the open, closed, and total counts of all anomalies, all CRAs, and debug report entries by phase (requirements, architecture design, detailed design, implementation) and by category (software, documentation, other); the number of anomalies resulting in CRAs, the number of CRAs generated, and the ratio of anomalies to CRAs; and a scatter diagram classifying the number and percent of functions as reliable and maintainable, unreliable and unmaintainable, reliable and unmaintainable, or unreliable and maintainable.]
Figure 3  Software Development Code Metrics Form

[Printed form recording, for a given project, date, and revision, the number of files and functions, together with: program size metrics (number of lines: total, code, comment, blank, and code with comment; token counts: number of operators, number of operands, number of tokens, total operators, total operands, and total token count); function size statistics (minimum, maximum, mean, and standard deviation); logic structure metrics (cyclomatic complexity, essential complexity, design complexity, and branch count); and composite metrics (program length, program volume, program level, program difficulty, intelligent content, programming effort, error estimate, and programming time).]
Table 2  Software Metric Collection by Software Development Life Cycle Phase

[Matrix indicating, for each software life cycle phase (requirements, architecture design, detailed design, code and test, integrate and test, software validation), which metrics are collected: traceability coverage (percent requirements traced from the System Requirements Document to the SRS, from the SRS to code, and from the SDDS to code); requirements and design stability (Class I or II CRAs affecting the SRS, SADS, and SDDS, and the anomalies resulting in them); fault profiles (open and closed anomalies and CRAs for the SRS, SADS, SDDS, and code, process and methodology violations, and average anomaly open time); test coverage (percent requirements tested by development and by V&V, percent passed in component and validation testing, and degree of code testing); program size (line and token counts); function size (line and token count statistics); logic structure (cyclomatic, essential, and design complexity and branch count statistics); and composite (Halstead metric statistics).]
GLOSSARY

Cyclomatic complexity (ν(G)): Measure of the complexity of a module's decision structure. Given a module whose flow graph G has e edges and n nodes, its cyclomatic complexity is ν(G) = e – n + 2. The industry-recommended threshold is 10.

Edge: Transfer of control from one node to another, represented graphically by a line.

Error estimate: Estimate of the number of errors in the module. Defined by the equation V/Eo, where V is program volume and Eo is the programming error rate. Eo is bounded in the range 3000 to 3200.

Essential complexity (eν(G)): Measure of the degree to which a module contains unstructured constructs (i.e., branching out of loops, branching into loops, branching into a decision, or branching out of a decision). The essential complexity of the module's flow graph G is equal to the cyclomatic complexity of the reduced graph G', where G' is obtained by removing all structured constructs from G. In a completely structured module, essential complexity is equal to one. The industry-recommended threshold is 4.

Flow graph: Graphic representation of the nodes and edges of a module of code.

Module design complexity (iν(G)): Complexity of the design-reduced module. Reflects the complexity of the module's calling patterns to its immediately subordinate modules. The term design reduction means treating all decisions and loops that do not contain calls to subordinate modules as if they were straight-line code. The complexity of the reduced flow graph will be the complexity of the module's structure as it relates to those calls. Module design complexity, therefore, can be no greater than the cyclomatic complexity of the original flow graph and is typically much less.

Node: Smallest unit of code in a module. A node consists of a block of one or more sequential statements. Program flow enters this block of code at its first statement and continues executing sequentially until reaching an exit from the block.

Process metric: Measure of an attribute of a software development process.

Product metric: Measure of an attribute of a product that arises out of a software development process.

Program difficulty: Inverse of program level.

Program length: Total number of operators and operands in a module.
Program level: Defined by the ratio V*/V, where V is the actual volume of the module and V* is the potential volume, or the volume of the minimal implementation of the module. An estimate of program level is given by the equation (2/η1) × (η2/N2), where η1 is the number of distinct operators, η2 is the number of distinct operands, and N2 is the total number of operands.

Program volume: Defined by the expression N × log2(n), where N is the total number of operators and operands and n is the total number of distinct operators and operands.

Programming effort: Effort required to generate a module. The unit of measurement is the number of elementary discriminations, defined as the ratio V/L, where V is program volume and L is program level.

Programming time: Estimate of the amount of time, in seconds, to code a module. Defined as the ratio E/S, where E is the programming effort and S is the Stroud number, the number of elementary mental discriminations the programmer can make per second. S is between 5 and 20.

Token: Element within a programming language; arithmetic operators, keywords, and module names are examples.

Software Engineering Institute (SEI) maturity level: Measure of the maturity of an organization's software development process. The measure is based upon five maturity levels (1–5). A level 1 evaluation means that no process or software quality organization is in place. A level 5 evaluation means the software processes are repeatable, defined, managed, and optimized.
SE-PCMP SOFTWARE ENGINEERING PROJECT CONTROL AND MANAGEMENT POLICIES
Written by:
[Name/Title/Position]
Date
Reviewed by:
[Name/Title/Position]
Date
Approved by:
[Name/Title/Position]
Date
Approved by:
[Name/Title/Position]
Date
Document Number [aaa]-SEPCMP-[#.#]        Revision [#.#]        Page 1 of [#]
REVISION HISTORY

Revision    Description                Date
[##.##]     [Revision description]     [mm/dd/yy]
CONTENTS

PREAMBLE    SE Operating Procedure                          3
POLICY 1    Software End-product Release                    5
POLICY 2    Software Development Management                 7
POLICY 3    Data Integrity and Retention                    9
POLICY 4    SE Systems Management                          10
POLICY 5    Software Library Administration                12
POLICY 6    Data Security                                  13
POLICY 7    Product Development Supporting Activities      14
GLOSSARY                                                   17
PREAMBLE
SE OPERATING PROCEDURE
Policy

Software Engineering (SE) project operations shall comply with this SOP, which is established, maintained, and used to achieve quality, uniformity, and consistency in all phases of software project operations.
Requirements

1. The SE Project Control and Management Policies (Figure 1) shall be applied to all SE software projects. Projects where effort will be expended in order to modify or enhance existing software are also subject to this requirement.

2. The SE Project Control and Management Policies shall be maintained by the Software Project Control and Management Policy Change Control Board (CCB). The chairman of this board shall be appointed by the SE [title/position] with the approval of the [title/position], and board members shall be appointed in writing by the SE [title/position]. The SE [title/position] shall serve as the secretary to the board and shall be responsible for scheduling board meetings and maintaining minutes of meetings and permanent files of CCB actions. Proposed changes to SE Project Control and Management Policies must be submitted in writing to the board. At least once each year, the board shall convene to review the policies in their totality for relevancy and currency. Where appropriate, the board shall propose revisions to the policies subject to the review and approval of the [title/position]. After approval by the SE [title/position], the policies shall be approved by the [title/position] and [title/position].

3. Circumstances may require deviation(s) or waiver(s) from policy. A written request for a deviation shall be submitted by the cognizant project software lead engineer in advance of a future activity, event, or product so that SE management is made aware of the project's intention to employ a higher-risk approach to project control and management. A written request for a waiver shall be submitted by the project software lead engineer in those cases where the activity, event, or product has already been initiated. Deviations and waivers shall be reviewed by the [project title/position] and submitted to the SE [title/position] for review. The SE [title/position] will make a recommendation to the [title/position] and/or [title/position] for approval or disapproval of the proposed deviation or waiver. A proposed deviation or waiver must be approved by the [title/position] and/or [title/position] before the project control and management tasks affected by that deviation or waiver are begun.
4. Each request for a deviation or waiver shall identify:
   a. Each specific policy or policy requirement to which it applies
   b. The alternative policy approach to be taken by the project
   c. The impact on project schedule, performance, and/or risk

5. A copy of each approved deviation or waiver shall be forwarded to the secretary of the Software Project Control and Management Policy CCB. A copy shall also be placed in the product history file.

6. These policies refer to and govern a set of SE software configuration management (SCM) procedures. The procedures are intended to provide detailed guidance within the framework and requirements provided by these policies. It is the responsibility of the project configuration manager to apply the existing relevant SE SCM procedures. New SE SCM procedures are to be submitted to the SE [title/position] prior to use so that they can be reviewed and approved.

7. A permanent record of deviation and waiver approvals shall be maintained for each project by using the form depicted in the SE configuration management procedures. This record shall be initiated during development of the Product Objectives Document and shall serve as a record of all subject approvals for the duration of the project.
Figure 1  SE Software Project Control and Management Policies

Policy Category      Policy Topic Title                            Document Governed                Policy Number
Software Release     Software End-product Release                  IDS, SRS, SDDS, SVVR,                1
                                                                   software EPROM
Project Management   Software Development Management                                                    2
                     SE System Management                                                               4
                     Software Library Administration                                                    5
                     Software Development Supporting Activities                                         7
Software Security    Data Integrity and Retention                                                       3
                     Data Security                                                                      6
Responsibilities

1. The project configuration manager is responsible for the following:
   a. Generating changes to SE Project Control and Management Policies
   b. Generating written deviations and waivers
   c. Applying relevant SCM procedures to the project

2. The project software lead engineer is responsible for the following:
   a. Generating changes to SE Project Control and Management Policies
   b. Generating written deviations and waivers
   c. Applying relevant SCM procedures to the project

3. The SE [title/position] is responsible for the following:
   a. Review and approval of SE Project Control and Management Policies
   b. Review and recommendation of deviations and waivers from SE Project Control and Management Policies

4. The [title/position] and/or [title/position] is responsible for the following:
   a. Approval of SE Project Control and Management Policies
   b. Approval of deviations or waivers from SE Project Control and Management Policies

5. The [project title/position] is responsible for the review and submittal of deviations and waivers from SE Project Control and Management Policies.

6. The managers of organizations supporting and sponsoring the project should share the commitment to the implementation of these policies.
POLICY 1 SOFTWARE END-PRODUCT RELEASE
Policy

The release of the software end products that are developed on a software project shall be the responsibility of the project software lead engineer. The software end products to be released are the Interface Design Specification (IDS), the Software Requirements Specification (SRS), the Software Detailed Design Specification (SDDS), the Software Verification and Validation Report (SVVR), and media containing the validated instrument software. These software end products shall be delivered by the software lead engineer to [insert corporate document control function here] upon completion of all software V&V activities and tasks for the software project. Document control numbers for these software end products shall be obtained from [insert corporate document control function here] by the project software configuration manager.
Requirements

1. The project software configuration manager shall obtain document control numbers for the following software end products from [insert corporate document control function here] prior to their release for signature: IDS, SRS, SDDS, and SVVR. The configuration of these documents shall be maintained by the software configuration manager until their delivery to [insert corporate document control function here].

2. The "as-built" IDS, SRS, SDDS, SVVR, and validated instrument software shall be delivered to [insert corporate document control function here] by the project software lead engineer. The IDS, SRS, SDDS, and SVVR shall be delivered in hard copy and on electronic media. The hard copy of each document shall indicate the release change order number on each page.

3. Two EPROMs containing the validated software shall be generated and labeled. The EPROMs shall be placed in separate electrostatic-secure (ESS) bags in accordance with electrostatic discharge (ESD) procedures and delivered to [insert corporate document control function here] by the project software lead engineer. The EPROM labels shall indicate the release change order number.

4. The validated instrument software shall be delivered to [insert corporate document control function here] as files electronically transferred to the corporate drop box and on electronic media. Delivery shall be accomplished by the project software lead engineer upon completion of software V&V and the generation of the SVVR.

5. Subsequent changes to software end products after delivery to [insert corporate document control function here] shall be implemented according to standard operating procedures governing changes to production-released documentation.

6. For each subsequent new software version released to [insert corporate document control function here], the IDS, SRS, and SDDS shall be updated to reflect the current "as-built" software and be released as new document revisions.

7. After the deliverables to [insert corporate document control function here] have been made, the remaining software project data files shall be archived by the project software lead engineer. The electronic media shall be labeled and delivered to [insert corporate document control function here].
Responsibilities

In addition to the responsibilities identified in the preamble to these policies, the following responsibilities are prescribed by this policy:

1. The project software configuration manager shall be responsible for the following:
   a. Obtaining document control numbers for the IDS, SRS, SDDS, and SVVR prior to their release
   b. Obtaining a change order number to release the IDS, SRS, SDDS, SVVR, and EPROMs
   c. Configuration management of the IDS, SRS, SDDS, and SVVR until their release
   d. Archiving the electronic version of the software end products delivered to [insert corporate document control function here] in accordance with software configuration management SOPs

2. The project software lead engineer shall be responsible for the delivery of the "as-built" IDS, SRS, SDDS, SVVR, EPROMs, and validated instrument software to [insert corporate document control function here] upon completion of software V&V activities.
POLICY 2 SOFTWARE DEVELOPMENT MANAGEMENT
Policy

Software project end-product generation shall be governed by a development schedule and a software development and V&V team. Overall responsibility for the development and V&V of the software shall reside with the project software lead engineer. The project software V&V lead engineer shall be responsible for the V&V aspects of the project.
Requirements

1. The project software lead engineer shall direct the individual members of the software team who are charged with technical performance on the project. The project software lead engineer shall take software technical direction from the SE [title/position] and report directly to the project [title/position].

2. The project software V&V lead engineer shall direct the individual members of the software V&V team who are charged with technical performance of V&V on the project. The project software V&V lead engineer shall take technical direction from the project software lead engineer and report directly to the SE [title/position].

3. The project software lead engineer shall serve as a member of the product technical team and be responsible to the project [title/position]. The project software lead engineer shall be the point of contact for project SE personnel, shall provide status, schedule, resource loading, and task assignments, and shall be responsible for software development progress on the project.
Responsibilities

In addition to the responsibilities identified in the preamble to these policies, the following responsibilities are prescribed by this policy:

1. SE personnel assigned to a software project are charged with completing assigned project tasks.

2. The project software lead engineer shall be responsible for the following:
   a. Generating estimates of the development schedule, resource loading, and tasking prior to beginning any work
   b. Providing schedule estimates to the project [title/position]
   c. Assigning and completing project tasks
   d. Providing review, guidance, and direction to project personnel performing tasks on the project
   e. Providing, at project request, interpretive project documentation, standards, status, and effectiveness of tasks
   f. Project [title/position] support

3. The project software V&V lead engineer shall be responsible for the following:
   a. Providing V&V schedule estimates to the project software lead engineer
   b. Providing review, guidance, and direction to project V&V personnel performing tasks on the project
   c. Providing, at project request, interpretive project documentation, standards, status, and effectiveness of tasks
   d. Project software lead engineer support
   e. Reporting to the project software lead engineer for project technical direction and to the SE [title/position] for overall V&V responsibility

4. The SE [title/position] shall be responsible for the following:
   a. Appointing a project software lead engineer
   b. Appointing a project software V&V lead engineer
   c. Reviewing and approving the software development schedule, resource loading, and tasking
POLICY 3 DATA INTEGRITY AND RETENTION
Policy

SE members shall ensure that computer data are duplicated on reliable media for storage in a secure area. SE members shall ensure that completed or suspended work is saved as a set of files that can be used to regenerate the work at a later date.
Requirements

1. The period of retention for this data shall be in accordance with corporate SOPs.

2. The SE system tool administrator(s) shall arrange for storage of their respective tool(s) backup and archive data in the secure, off-site area identified in corporate disaster contingency plan documents.

3. The SE system tool administrator(s) shall perform data recovery as needed in accordance with the corporate disaster contingency plan documents.
Responsibilities

1. SE personnel shall be responsible for the following:
   a. Making their computer systems available to the SE system tool administrator(s) during normal working hours
   b. Restricting project-related and individual working files to the predefined directory structures

2. The SE system tool administrator(s) shall be responsible for ensuring:
   a. Backup and archiving of data
   b. Transferring backup and archive data to the off-site storage location
POLICY 4 SE SYSTEMS MANAGEMENT
Policy

SE systems shall support and enhance the process of product development, including systems security regulation, user account administration, systems monitoring, system backup administration, system utilities, and system selection and purchasing.
Requirements

1. Physical access to SE systems shall be controlled as follows:
   a. Users shall not share nor give out their passwords.
   b. Users shall abide by the system security requirements specified within the corporate information systems SOP.

2. Software projects shall control user access to their systems as follows:
   a. Users shall submit in writing a signed request for a user name and password.
   b. Privileged user names shall be given only upon written approval of project management and the system administrator.
   c. When employment is terminated, the user's account shall be suspended, and the user's files shall be turned over to the appropriate project software lead engineer within one week of employment termination.
3. SE members shall adhere to licensing and copyright requirements for all systems, software, and products installed.

4. Electronic mail users shall abide by appropriate corporate professional standards in their mail communications.

5. SE systems utilities shall conform to the software development standards with regard to design, development, testing, and documentation.

6. Selection and purchasing of SE equipment and tools shall be performed after suitable benchmarking by the designated SE system tool administrator and users. After vendor qualification by the SE system tool administrator, the following shall be the criteria for the selection of SE hardware and software tools:
   a. Verify that tools perform in a known manner
   b. Provide for contingency planning
   c. Provide for a system that allows for compatibility from project to project
   d. Verify that the tools chosen meet requirements as defined by the users
   e. Provide a system for maintaining and updating tools for configuration uniformity
Responsibilities

1. Users shall be responsible for the following:
   a. Periodically changing passwords
   b. Abiding by corporate electronic mail standards
   c. Abiding by copyright and licensing agreements
   d. Not importing programs or data from other computer systems without explicit permission and virus scans
   e. Not exporting programs or data to any other computer system without explicit permission

2. The project software lead engineer shall be responsible for authorizing users to have user names and passwords for their project.

3. The SE system tool administrator(s) shall be responsible for the following:
   a. Ensuring backups and archives and their removal to off-site storage
   b. Providing user help as required
   c. Coordinating with hardware and software vendors for repair and maintenance
   d. Ensuring that appropriate configuration management and directory structure changes are made
   e. Originating capital, supplies, and maintenance purchase requisitions
4. The SE [title/position] shall be responsible for the following:
   a. Authorization of SE system tool administrator(s) to have privileged user names
   b. Review of system utilization and performance
POLICY 5 SOFTWARE LIBRARY ADMINISTRATION
Policy

The SE Software Library shall be the repository for the files that comprise a validated, reusable software end product. Access to the SE Software Library source information shall be restricted to persons authorized by the SE [title/position]. Archival, retrieval, and disaster recovery shall be provided for the files in the SE Software Library in accordance with the information systems SOPs. A software configuration manager shall prescribe and administer the necessary procedures to ensure compliance with this SOP.
Requirements

1. The SE Software Library shall serve as the repository for the master representation of each validated, reusable software end product. These end-product items shall be project independent and maintained in accordance with the requirements for software configuration management specified in the appropriate software configuration management (SCM) SOPs.

2. Automated procedures shall be used to control and monitor access to all files in the SE Software Library. System access controls will be implemented to prevent access except through prescribed check-out and check-in tools (a sketch of such a tool follows these requirements).

3. Access to the SE Software Library source information shall be restricted to persons authorized by the SE [title/position]. Audits shall be performed periodically to ensure that only authorized users have access to the SE Software Library.

4. Automated procedures shall be provided for the archival, retrieval, and disaster recovery of files of the SE Software Library in accordance with the requirements specified elsewhere in this SOP and within relevant information systems SOPs.
5. A software configuration manager shall be responsible for administration of the SE Software Library. The software configuration manager shall prescribe and administer all necessary procedures to ensure compliance of the SE Software Library with the requirements described in this SOP.
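Requirement 2 does not prescribe an implementation, but a check-out/check-in tool can be as simple as a lock file per library item. The sketch below is purely illustrative: the library path, lock convention, and function names are hypothetical, and a production library would sit behind a configuration management system with genuine access controls and audit logging.

    import os
    import shutil

    LIBRARY = "/se/software_library"  # hypothetical repository root

    def check_out(item, workdir, user):
        # Copy a library item into a work area, recording an exclusive lock.
        lock = os.path.join(LIBRARY, item + ".lock")
        if os.path.exists(lock):
            raise RuntimeError(item + " is already checked out")
        with open(lock, "w") as f:
            f.write(user)
        shutil.copy2(os.path.join(LIBRARY, item), workdir)

    def check_in(item, workfile, user):
        # Return a modified item to the library and release the lock.
        lock = os.path.join(LIBRARY, item + ".lock")
        with open(lock) as f:
            if f.read() != user:
                raise RuntimeError(item + " is locked by another user")
        shutil.copy2(workfile, os.path.join(LIBRARY, item))
        os.remove(lock)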
Responsibilities

1. The software configuration manager shall be responsible for prescribing and administering the necessary procedures to ensure compliance of the SE Software Library with the requirements described in this SOP.

2. The SE [title/position] shall be responsible for the following:
   a. Authorizing user access to the SE Software Library
   b. Appointing the Software Library configuration manager
POLICY 6 DATA SECURITY
Policy

SE members shall ensure that confidential or copyright data are physically secure.
Requirements

1. SE members are responsible for the physical security of their assigned workstations and software. Each SE member shall:
   a. Log off from workstations that are not in use
   b. Store computer data media containing software packages away from the printed manuals for those packages
   c. Store printed manuals for software packages in personal secure areas

2. SE members shall ensure that confidential and copyright material is protected from unauthorized use, alteration, and destruction.
3. Network software shall restrict computer network access by one or more of the following methods:
   a. The user shall be required to enter password data before gaining access.
   b. Command or batch files that facilitate automated access to network systems shall not contain passwords or other confidential information.
   c. Passwords shall be protected from discovery by unauthorized personnel.
   d. Access to any data on a logical disk from the workstation over the network shall be limited to those requiring access.

4. Confidential material shall be disposed of as follows:
   a. Paper documents shall be shredded before disposal.
   b. Computer data storage media shall be erased before reuse or disposal in such a way that the data cannot be recovered.
   c. Other media shall be physically destroyed or otherwise rendered unreadable before disposal.
Responsibilities

1. SE personnel shall be responsible for the following:
   a. Security of their assigned workstations
   b. Security of media containing confidential or copyright data in their possession
   c. Security of manuals and other documents in their possession
   d. Delivery of obsolete documents to the proper location to be shredded

2. The SE system tool administrator(s) shall be responsible for ensuring vendor compliance with copyright data protection.
POLICY 7 PRODUCT DEVELOPMENT SUPPORTING ACTIVITIES
Policy

SE is an integral part of the product development activities, and therefore SE personnel are committed to support the overall product development process and administration, to be responsive to product development needs, and to provide technical software experience and expertise to other functional disciplines.
Requirements

1. SE personnel shall support the corporate product development process and administration. SE shall actively support and maintain:
   a. Development SOPs, procedures, and guidelines
   b. Computer Aided Software Engineering (CASE) equipment
   c. An open interface to marketing, program management, and project technical lead engineers
   d. Phased product release

2. SE personnel shall provide a responsive and responsible product development organizational structure that actively fosters software development in support of product development. This will be accomplished in the following manner:
   a. The SE [title/position] shall actively support all project phases.
   b. For each project, the SE [title/position] shall assign a project software lead engineer, who will be responsible for the software development aspects of the project.
   c. For each project, the SE [title/position] shall assign a project software V&V lead engineer, who will be responsible for the software V&V aspects of the project.
   d. The SE [title/position] shall assign the most competent personnel possessing the best match of technical skill to the project.

3. SE personnel shall actively support and embrace the safety and efficacy of products. This will be accomplished in the following manner:
   a. Providing a software hazards analysis as part of product development
   b. Complying with corporate product safety standards
   c. Not knowingly releasing unsafe software

4. SE personnel shall support the development of product hardware by providing nonproduction software. This will be accomplished in the following manner:
   a. Providing support for product system Design Validation Testing (DVT)
   b. Providing DVT software and written DVT procedures for the hardware
   c. Providing software support to other functional areas
   d. Supporting electrical engineering and mechanical engineering product development by providing nonproduction software that tests the evolution of the hardware and the hardware subsystems
Responsibilities

1. SE personnel shall support the corporate product development process and administration.

2. The project software lead engineer shall be responsible for:
   a. Support of product safety and efficacy
   b. Support of product hazards analysis
   c. Support of product DVT
   d. Support and implementation of the software development, V&V, and configuration management SOPs

3. The project software V&V lead engineer shall be responsible for:
   a. Support of product safety and efficacy
   b. Support of product DVT
   c. Support and implementation of the software development, V&V, and configuration management SOPs

4. The SE [title/position] shall be responsible for:
   a. Assigning project resources
   b. Providing nonproject support
GLOSSARY

Archive: Provisions made for storing and retrieving records over a long period of time.

Audit: Independent review to assess compliance with software requirements, specifications, baselines, standards, procedures, instructions, and coding requirements.

Backup: Provisions made for the recovery of data files or software lost due to a system failure, human failure, or disaster.

Baseline: Specification or product that has been formally reviewed and agreed upon, that thereafter serves as the basis for further development, and that can be changed only through formal change control procedures.

Change control: Process by which a change is proposed, evaluated, approved or rejected, scheduled, and tracked.

Code: Loosely, one or more computer programs or part of a computer program.

Component: Unit of code that performs a specific task, or a group of logically related code units that perform a specific task or set of tasks.

Computer data: Electronic data that include systems software, applications software, project software, files of collected data, and documents.

Computer program: Sequence of instructions suitable for processing by a computer. Processing may include the use of an assembler, a compiler, an interpreter, or a translator to prepare the program for execution as well as to execute it.

Confidential data: Computer data or information that includes project software, files of collected data, written applications software, and project documents.

Configuration manager: Individual designated to be responsible for software configuration management activities on a software project.

Copyright data: Computer data or information that includes systems software, purchased applications software, software manuals, and purchased documents.

Delivery: Transfer of responsibility for an item from one activity or function to another, as in the delivery of the validated software to quality assurance personnel for certification.
Hazard: Dangerous state of a device or system that may lead to death, injury, occupational illness, or damage to or loss of equipment or property.

Hazard analysis: Listing of potential hazards associated with a device or system, along with an estimation of the severity of each hazard and its probability of occurrence.

Interface Design Specification (IDS): Project-specific document that delineates completely the interfaces among the subsystems.

Privileged user name: Access to a computer that gives the user varying degrees of rights to modify a computer's hardware, operating system, applications software, and user files.

Safety: Provision of a very high degree of freedom, within the constraints of system effectiveness and cost, from those conditions that can cause death, injury, occupational illness, or damage to or loss of equipment or property.

Software configuration management (SCM): Discipline of identifying the configuration of a software system at discrete points in time for the purpose of systematically controlling changes to this configuration and maintaining the integrity and traceability of this configuration throughout the development process.

Software Detailed Design Specification (SDDS): Project-specific document that constitutes an update to and an expansion of the design baseline established at the Architecture Design Review, including a description of the overall program operation and control and the use of common data. The detailed design is described through the lowest component level of software organization and the lowest logical level of database organization.

Software lead engineer: Individual designated to head a software project development team and provide technical direction in all aspects of software development.

Software project: Planned and authorized undertaking of specified scope and duration that results in the expenditure of resources toward the development of a product that is primarily one or more computer programs.

Software Requirements Specification (SRS): Project-specific document that provides a controlled statement of the functional, performance, and external interface requirements for the software end products.

Software Verification and Validation lead engineer: Individual designated to head a software project verification and validation (V&V) team and provide technical direction in all aspects of software V&V.
System tool administrator: Individual designated to be responsible for, and have the authorization to, maintain, configure, and alter a software development tool.

System utilities: Collection of software tools developed by or authorized by the system tool administrator(s) and used by multiple users.

Validation: Process of evaluating software at the end of the software development cycle to ensure compliance with software and system requirements.

Verification: Process of determining whether the products of a given phase of the software development life cycle fulfill the requirements established during the previous phase.

Work Breakdown Structure (WBS): Form that clarifies staffing levels and tasks required for a project by month and engineering category.
SE-PDP SOFTWARE ENGINEERING PROJECT DIRECTORY POLICIES
Written by:
[Name/Title/Position]
Date
Reviewed by:
[Name/Title/Position]
Date
Approved by:
[Name/Title/Position]
Date
Approved by:
[Name/Title/Position]
Date
Document Number [aaa]-SEPDP-[#.#]        Revision [#.#]        Page 1 of [#]
REVISION HISTORY

Revision    Description                Date
[##.##]     [Revision description]     [mm/dd/yy]
CONTENTS

PREAMBLE    SE Project Directory Policies                                          3
POLICY 1    Project Directory Generation and Management                            6
POLICY 2    Managing Electronic Documents in the Software Development Directory    8
POLICY 3    Managing Electronic Documents in the Baseline V&V Directory            9
POLICY 4    Managing Electronic Documents in the Build V&V Directory              10
GLOSSARY                                                                          12
PREAMBLE
SE PROJECT DIRECTORY POLICIES
Policy

Software Engineering (SE) software projects shall establish and maintain a uniform, consistent, and static project directory structure that is used in all phases of the software life cycle.
Requirements

1. The SE Project Directory Policies shall be applied to all SE software projects. Projects where effort will be expended in order to modify or enhance existing software are also subject to this requirement.

2. The SE Project Directory Policies (Figures 1 and 2) shall be maintained by the Software Project Directory Policy Change Control Board (CCB). The chairman of this board shall be appointed by the SE [title/position] with the approval of the [title/position], and board members shall be appointed in writing by the SE [title/position]. The SE [title/position] shall serve as the secretary to the board and shall be responsible for scheduling board meetings and maintaining minutes of meetings and permanent files of CCB actions. Proposed changes to SE Project Directory Policies must be submitted in writing to the board. At least once each year, the board shall convene to review the policies in their totality for relevancy and currency. Where appropriate, the board shall propose revisions to the policies subject to the review and approval of the [title/position]. After approval by the SE [title/position], the policies shall be approved by the [title/position] and [title/position].

3. Circumstances may require deviation(s) or waiver(s) from policy. A written request for a deviation shall be submitted by the project configuration manager in advance of a future activity, event, or product so that SE management is made aware of the project's intention to employ a higher-risk approach to project directory management. A written request for a waiver shall be submitted by the project configuration manager in those cases where the activity, event, or product has already been initiated. Deviations and waivers shall be reviewed by the [project title/position] and submitted to the SE [title/position] for review. The SE [title/position] will make a recommendation to the [title/position] and/or [title/position] for approval or disapproval of the proposed deviation or waiver. A proposed deviation or waiver must be approved by the [title/position] and/or [title/position] before the project directory management tasks affected by that deviation or waiver are begun.

4. Each request for a deviation or waiver shall identify the following:
   a. Each specific policy or policy requirement to which it applies
   b. The alternative policy approach to be taken by the project
   c. The impact on project schedule, performance, and/or risk

5. A copy of each approved deviation and waiver shall be forwarded to the secretary of the Software Project Directory Policy CCB. A copy shall also be placed in the product history file.

6. These policies refer to and govern a set of SE software configuration management (SCM) procedures. The procedures are intended to provide detailed guidance within the framework and requirements provided by these policies. It is the responsibility of the project software configuration manager to apply the existing relevant SE SCM procedures. New SE SCM procedures are to be submitted to the SE [title/position] prior to their use so that they can be reviewed and approved.

7. A permanent record of deviation and waiver approvals shall be maintained for each project using the form depicted in the SE configuration management procedures. This record shall be initiated during development of the Product Objectives Document and shall serve as a record of all subject approvals for the duration of the project.

8. The definition and usage of the scripts and directory structures discussed in this SOP shall be maintained in the SE configuration management procedures.
Responsibilities

1. The project software configuration manager is responsible for the following:
   a. Generating changes to SE Project Directory Policies
   b. Generating written deviations and waivers
   c. Generating changes to project directory procedures
   d. Applying relevant SCM procedures to the project
   e. Providing and performing the control and maintenance tasks of the software project directories

Figure 1  SE Software Project Directory Policies

Policy Category         Policy Topic Title                                                    Policy Number
Project Preparation     Project Directory Generation and Management                                1
Development Practices   Managing Electronic Documents in the Software Development Directory        2
V&V Practices           Managing Electronic Documents in the Baseline V&V Directory                3
                        Managing Electronic Documents in the Build V&V Directory                   4
Figure 2  SE Software Project Directory Policies Throughout the Software Development Life Cycle

[Matrix showing, for each policy topic (project directory generation and management, and document management in the Software Development, Baseline V&V, and Build V&V Directories), its deliverables and in-effect periods across the software life cycle phases: project start-up, interface design, requirements, architecture design, detailed design, code and test, integrate and test, and software validation.]

Notes:
1. D indicates that a deliverable or activity is required at that time.
2. E indicates that the procedure requirements are in effect for the entire phase.
Figure 3  Matrix of Responsibilities for SE Software Project Directory Policies

[Matrix assigning, for each document or activity (software project directory procedures, deviations, waivers, project directory structure creation, directory structure changes, initial directory content creation, and directory labeling changes), Generate, Request, Review, or Review/disposition responsibilities to the project configuration management engineer, the project software lead engineer, the software V&V lead engineer, and the Director, SE, each as assigned to the project.]
2. The project software V&V lead engineer is responsible for the following:
   a. Generating changes to SE Project Directory Policies
   b. Generating written deviations and waivers
   c. Generating changes to project directory procedures
   d. Applying relevant SCM procedures to the project

3. The project software lead engineer is responsible for the following:
   a. Generating changes to SE Project Directory Policies
   b. Generating written deviations and waivers
   c. Generating changes to project directory procedures
   d. Applying relevant SCM procedures to the project
4. The SE [title/position] is responsible for the following:
   a. Review and approval of SE Project Directory Policies
   b. Review and recommendation of deviations and waivers from SE Project Directory Policies
   c. Review and approval/disapproval of project directory procedures

5. The [title/position] and/or [title/position] are/is responsible for the following:
   a. Approval of SE Project Directory Policies
   b. Approval of deviations and waivers from SE Project Directory Policies

6. The [project title/position] is responsible for the review and submittal of deviations and waivers from SE Project Directory Policies.

7. The managers of organizations supporting and sponsoring the project should share the commitment to the implementation of these policies.
POLICY 1 PROJECT DIRECTORY GENERATION AND MANAGEMENT
Policy

SE software projects shall generate the software project directory structure at the initiation of a software project, and the directory structure shall be maintained and audited throughout the software life cycle. Data files that describe key milestones of the software development shall be archived offline.
Requirements

1. The software project directory shall be generated at the initiation of a software project by the project software configuration manager. The project software configuration manager shall:
   a. Request an account log-in for the software project directory
   b. Generate the directory skeleton by executing the directory generation scripts (see the sketch following these requirements)

2. The project software configuration manager shall maintain the active software project subdirectory through:
   a. Ongoing building of the structure to meet the needs of the project and as directed by the project software lead engineer
   b. Controlling the release to the project directory of a document

3. The project software configuration manager shall be the only individual who possesses source file update privileges of read, write, execute, and delete in the project directory. All other users are limited to read access.

4. The Build V&V Directory shall serve as the source information for the individual builds or prototype versions of each software configuration. This ongoing online history subdirectory shall be maintained at the direction of the project software lead engineer.

5. Software project milestone data files shall be archived offline by the project software lead engineer.

6. The software project directory configuration immediately prior to product release shall be archived offline. This task is to be accomplished by the software lead engineer and coordinated with the corporate document control group.
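The directory generation scripts themselves are defined in the SE configuration management procedures; the sketch below, referenced in requirement 1, only illustrates generating a static skeleton from a fixed list, and the subdirectory names are hypothetical.

    import os

    # Hypothetical skeleton; the actual names come from the SCM procedures.
    SKELETON = [
        "software_development",
        "baseline_vv",
        "build_vv",
        "documents",
    ]

    def generate_project_directory(root):
        # Create the static project directory skeleton under root.
        for subdir in SKELETON:
            os.makedirs(os.path.join(root, subdir), exist_ok=True)

    generate_project_directory("/projects/example_project")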
Responsibilities

In addition to the responsibilities identified in the preamble to these policies, the following responsibilities are prescribed by this policy:

1. The project software lead engineer shall archive the software project milestones and project directory configuration.

2. The project software configuration manager shall ensure that the software directories are configured as described in this SOP and the SE configuration management procedures.

3. The project software V&V lead engineer shall ensure that the appropriate software project directories have been created.
POLICY 2 MANAGING ELECTRONIC DOCUMENTS IN THE SOFTWARE DEVELOPMENT DIRECTORY
Policy

After the software project directory has been generated, the directories shall be populated with the tool files and document template files from the software library. Software project documents and code shall be created and maintained using editors and configuration management tools. The Build V&V Directory shall be the repository for all validated source files.
Requirements

1. After the software project directory structure has been created, the release to the project directory shall be made (a sketch of such a release script follows this list). This will consist of the following:
   a. A script file shall be used to copy the tool files from the software library directories to the Software Development Directory.
   b. A script file shall be used to copy the document template files from the software library directories to the Software Development Directory.
2. After the releases to the project directories have been completed, the code source documents shall be created as necessary, using the source code editor(s) and configuration management tool(s).
3. Other than the release to the project directory, subsequent transfers of source documents shall be accomplished in the following manner:
   a. Copying shall be from the Build V&V Directory to the Software Development Directory by a script file.
   b. Copying of user-designed code source documents from the Build V&V Directory to the Software Development Directory shall be by a script file.
   c. Additional code source documents shall be created using the source code editor and configuration management tool.
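A minimal sketch of the release-to-project-directory copy described in requirement 1; the library and project paths, and the tools/templates grouping, are hypothetical.

    import shutil
    from pathlib import Path

    # Hypothetical locations; actual paths come from the software library setup.
    LIBRARY = Path("/software_library")
    DEV_DIR = Path("/projects/example_project/software_development")

    def release_to_project_directory():
        """Copy tool files and document templates from the software library."""
        DEV_DIR.mkdir(parents=True, exist_ok=True)
        for group in ("tools", "templates"):
            for src in (LIBRARY / group).glob("*"):
                if src.is_file():
                    shutil.copy2(src, DEV_DIR / src.name)  # preserves timestamps
                    print("released", src.name)

    if __name__ == "__main__":
        release_to_project_directory()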
Responsibilities In addition to the responsibilities identified in the preamble to these policies, the following responsibilities are prescribed by this policy:
1. The project software configuration manager shall ensure that:
   a. The appropriate software document templates are released to the project directory software subdirectories.
   b. The appropriate software tool files are released to the project directory software subdirectories.
2. The project software engineers shall:
   a. Create the code source documents using the source code editor(s) and configuration management tool(s)
   b. Transfer or create all subsequent files using the appropriate tools and scripts
POLICY 3 MANAGING ELECTRONIC DOCUMENTS IN THE BASELINE V&V DIRECTORY
Policy Script files shall be used to copy project-developed code to the Baseline V&V Directory. The compilation and executable build of the Baseline V&V Directory code shall be accomplished with script files.
Requirements

1. After any milestone, deliverable, or prototype has been reached, a script file shall be used to copy the project code from the Software Development Directory into the Baseline V&V Directory.
2. After copying the code from the Software Development Directory to the Baseline V&V Directory, a script file(s) shall be used to compile and make the project code (a sketch of such a build script follows this list).
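A minimal sketch of the compile-and-make step in requirement 2. The make target and directory path are hypothetical, and the warning check reflects the guideline elsewhere in these SOPs that all compiles trap for the highest level of errors or warnings.

    import subprocess
    import sys

    def build_baseline(baseline_dir):
        """Compile and make the project code in the Baseline V&V Directory."""
        result = subprocess.run(
            ["make", "all"], cwd=baseline_dir,
            capture_output=True, text=True,
        )
        # Fail the build on any error, and treat compiler warnings as errors.
        if result.returncode != 0 or "warning" in result.stderr.lower():
            print(result.stderr)
            sys.exit("baseline build failed or produced warnings")
        print("baseline build completed")

    if __name__ == "__main__":
        build_baseline("/projects/example_project/baseline_vv")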
Responsibilities In addition to the responsibilities identified in the preamble to these policies, the following responsibilities are prescribed by this policy:
1. The project software lead engineer shall execute the script files to copy the project code from the Software Development Directory to the Baseline V&V Directory.
2. The project software V&V lead engineer shall execute the script files to compile and make the project code in the Baseline V&V Directory.
POLICY 4 MANAGING ELECTRONIC DOCUMENTS IN THE BUILD V&V DIRECTORY
Policy Script files shall be used to copy project-developed code to the Build V&V Directory from the Baseline V&V Directory and documents from the Software Development Directory. The Baseline V&V Directory and Software Development Directory shall be cleaned up in order to remove the baselined data documents.
Requirements

1. After the code has been successfully baselined in the Baseline V&V Directory:
   a. A script file shall be used to copy the Baseline V&V Directory project code to the Build V&V Directory.
   b. A script file shall be used to copy the Software Development Directory source documents to the Build V&V Directory.
2. After transfer of the Baseline V&V Directory and Software Development Directory data documents:
   a. A script file shall be used to delete the Software Development Directory source documents.
   b. A script file shall be used to delete the Baseline V&V Directory source documents.
Responsibilities

In addition to the responsibilities identified in the preamble to these policies, the following responsibilities are prescribed by this policy:

1. The project software V&V lead engineer shall:
   a. Execute the script files to copy the project code from the Baseline V&V Directory to the Build V&V Directory
   b. Execute the script files to copy the appropriate source documents from the Software Development Directory to the Build V&V Directory
2. The project software lead engineer shall execute the appropriate script files to clean up and delete the files in the Baseline V&V Directory and Software Development Directory.
GLOSSARY

Baseline Verification and Validation (V&V) Directory: Software project directory consisting of electronic data files that have been modified since the last build and are waiting for verification and validation.

Build Verification and Validation (V&V) Directory: Software project directory consisting of electronic data files that have been or are being verified and validated.

Electronic document: One or more data files stored on a computer.

Electronic document control: Specific control process for documents in the form of electronic files, such as check-in, check-out, and release.

Initial document release: Point at which a paper or electronic document passes from local configuration control to the group responsible for corporate configuration control.

Project directory: Official structured grouping of directories and their associated electronic documents that make up a project.

Project directory management: Process of generating, structuring, and maintaining the project directories, including both current or active versions and historical versions.

Release to project directory: First time an electronic document is transferred or placed into the project directory.

Software Development Directory: Software project directory consisting of electronic data files that are undergoing software development.

Software Lead Engineer: Individual responsible for the development of the software by the software team.

Software library: Controlled collection of software and related documentation designed to aid in software development, use, or maintenance.

Software project: Planned and authorized undertaking of specified scope and duration, which results in the expenditure of resources toward the development of a product that is primarily one or more computer programs.

Source document: Document in the form of paper or electronic file(s) that contain(s) the source data used for creating the next document revision and subsequent documents.

Verification and validation (V&V) directories: Software project subdirectories consisting of electronic data files used for software verification and validation.
SE-VVG SOFTWARE ENGINEERING VERIFICATION AND VALIDATION GUIDELINES
Written by:
[Name/Title/Position]
Date
Reviewed by:
[Name/Title/Position]
Date
Approved by:
[Name/Title/Position]
Date
Document Number [aaa]-SEVVG-[#.#]    Revision [#.#]    Page 1 of [#]
REVISION HISTORY

Revision    Description               Date
[##.##]     [Revision description]    [mm/dd/yy]
CONTENTS

1.0  INTRODUCTION  3
2.0  TEST OVERVIEW  4
3.0  TEST REQUIREMENTS  8
4.0  VERIFICATION AND VALIDATION PHASES  15
5.0  VERIFICATION AND VALIDATION REPORTING  28
APPENDIX A  List of References for Software Verification and Validation  35
APPENDIX B  Requirements Traceability Matrix  36
APPENDIX C  Software Validation Test Log  37
APPENDIX D  Validation Test Information Sheet  38
APPENDIX E-1  Software Anomaly Report  39
APPENDIX E-2  Instructions for Completing Software Anomaly Report  40
APPENDIX F  Software Verification and Validation Report  41
APPENDIX G  Software Verification and Validation Record of Deviation or Waiver Approval  43
GLOSSARY  44
1.0 INTRODUCTION
1.1 Purpose This document provides the verification and validation (V&V) guidelines for software V&V and encompasses the responsibilities, methodologies, and testing to be performed by the V&V engineers.
1.2 Scope This document outlines the categories of products and equipment that are under the V&V umbrella of responsibilities and the corresponding methodologies and level of testing to be performed.
1.3 Overview The corporate responsibilities of the software V&V group have been expanded from the software development of products to include software-driven test and manufacturing equipment. The present V&V methodologies and practices originated with the support of software development in order to validate the software developed for products, including both full life cycle development projects and product enhancement or iterative life cycle projects. The V&V responsibilities have broadened to include the product enhancements generated by the operations group and software-driven test and manufacturing equipment. This document provides a consistent V&V approach to a diverse set of products and equipment.

The Food and Drug Administration (FDA) has produced several documents to be used as guidance for software validation (see Appendix A). In addition, the Good Manufacturing Practices (GMPs) and the International Organization for Standardization (ISO) 9000 series are listed because of their influence on process validation.

The software V&V described in these guidelines encompasses only the effort to certify that the software, as a component of a larger system, is safe for its intended use; other testing programs take place in the effort to qualify the system. One such effort is Design Validation Testing (DVT), in which hardware fault insertion testing is performed; DVT is a separate and distinct undertaking from the software V&V effort. Another system validation effort is the product, or challenge, test, in which a sample of size N of a particular instrument is run through an acceptance test procedure. Additional performance testing tasks include flow rate testing, accuracy testing, and accelerated life testing, often referred to as HALT testing. All of these tests are distinct and separate from the software V&V effort.
1.4 References

• Product Development Safety Design Guidelines, Revision [#.#], dated [date]
• Product Development User Interface Design Guidelines, Revision [#.#], dated [date]
• Software Engineering Configuration Management Guidelines, Revision [#.#], dated [date]
• Software Engineering Programming Guidelines, Revision [#.#], dated [date]
• Software Engineering Configuration Management Policies, Revision [#.#], dated [date]
• Software Engineering Software Development Policies, Revision [#.#], dated [date]
• Software Engineering Software Development Guidelines, Revision [#.#], dated [date]
• Software Engineering Verification and Validation Policies, Revision [#.#], dated [date]
2.0 TEST OVERVIEW
The Software V&V group validates two types of software applications: products, and test and manufacturing equipment. Although product software is intuitively obvious, the test and manufacturing equipment software ranges from in-house-built hardware and software support equipment to software that is written or modified to execute on a host PC or workstation. While there is a significant difference between products and equipment, there are basic software standards that are applicable to both software applications; consequently, a common approach to software validation is applied to both products and equipment.
There are two fundamental differences between the two applications, however. First, the level of effort will be adjusted on the basis of the potential for harm. Second, testing of equipment is viewed as process validation. Process validation tends to promote black box testing, which requires a survey of the environment in which the equipment is to perform and the procedures used to operate it. White box testing, on the other hand, requires focusing on the software as a component of a system.

Three phases of software development and their activities are concerned with testing:

1. The Code and Test Phase and its activities include software coding and debugging performed by the software developers.
2. The Integrate and Test Phase and its activities include the integration of the software components by the software developers.
3. The Software Validation Phase and its activities encompass the software verification and validation testing that is performed by the V&V engineers on the integrated software and hardware; it is divided into two parts:
   a. Testing that is performed with an in-circuit emulator on the latest hardware with a socket for the microprocessor
   b. Testing that is performed on the latest buttoned-up hardware and software configuration without the use of the in-circuit emulator
2.1 Tools The V&V group provides a common test suite for the various processors and controllers and their accompanying software languages. This functionality is provided in the test tool referred to as the [insert tool library name here]. The test tool is a workstation-based tool that provides both static and dynamic test capabilities and reverse engineering capabilities. The diverse nature of the products and equipment to be processed by the V&V group requires that tools are in place to support the following processors, controllers, and languages:

• Processors and controllers: [insert list here]
• Languages: [insert list here]
2.2 Techniques and Methodologies The software validation approach and test categories to be applied to the product and equipment are based on the categorization of the life cycle that the product or equipment belongs to. If the product or equipment is new and is deemed to be at a high level of concern and risk, then a full development life cycle process will be adhered to. If the product or equipment to be produced is based on an existing baseline that has accumulated sufficient run time to realize a high level of confidence that a majority of the faults have been detected, then an accelerated enhancement development life cycle will be followed.
2.2.1 Test Approach
The test approach to any product or equipment is a combination of requirements testing and safety testing. Requirements testing encompasses a rigorous project development effort that includes the production of requirements and design specifications. After the requirements and design are solidified, the V&V project team systematically details the requirements to be tested and traces them into the design specification. The embodiment of this effort is the Requirements Traceability Matrix (RTM), and completing the RTM requires the V&V team to describe in detail the testing that they will perform in order to validate the requirements. This systematic approach provides high confidence that the requirements testing will be thorough, comprehensive, and rigorous. An example of the RTM is shown in Appendix B.

Safety testing takes a different approach to validation in that the focus is shifted to preventing harm to the user; it is aimed at producing a product or equipment that is safe for its intended use. When applied to products, a key element of this approach is the production of a hazards analysis. This analysis starts with the system as a whole, breaks the instrument down into mechanical-, hardware-, and software-related hazards, and attempts to mitigate single point failures or justify a single error trap by calculating a low probability of occurrence. Safety testing of a product requires the use of the hazards analysis in the software validation effort.

When applied to equipment validation, safety testing takes on a different meaning. Safety in this perspective requires surveying the environment in which the equipment operates and the role and procedures the equipment fills in supporting the production of a safe product. If the equipment is used to make a pass or fail determination of a product on a production line, then the safety issue is elevated. Furthermore, if the equipment is the sole check for a function of a product before it leaves the production line and is boxed for shipment, then the safety issue is elevated to the level of a product. However, if the equipment consists of a label generator, then safety testing takes on the role of ensuring control of the software and validating that the process in place minimizes the chances that human intervention might cause an error.
The preferred software test approach to all products or equipment is white box testing rather than black box testing. The white box approach requires that the V&V team decipher the software down to the lowest level necessary to validate a function. Code inspection as a sole technique for requirements and function validation is to be avoided unless the change is a data change rather than a logic change.

Software testing of a product and equipment shall be effected at two levels. Software component-level testing concentrates on the individual components or the grouping of components. Software system-level testing focuses on the performance of the software as an integrated whole. Software component-level testing is performed by the software developers during the Code and Test Phase activities and the Integrate and Test Phase activities. Software system-level testing is performed by the software V&V engineers during the Software Validation Testing Phase activities.
2.2.2 Test Categories
The test categories for software validation testing include the following types (a short sketch contrasting the first two follows this list):

• Functional Testing. Designed to verify that all of the functional requirements have been satisfied. Termed success oriented, because the tests are expected to produce successful results.
• Robustness Testing. Designed to determine how the software performs with unexpected inputs, such as whether it recovers from such inputs by issuing an error text or audio message, locks the system in an indeterminate state, or continues to operate in a manner that is unpredictable. Termed failure oriented, because the test inputs are designed to cause the product to fail, given foreseeable and reasonably unforeseeable misuse of the product.
• Stress Testing. Designed to determine how the product reacts to a stress condition in which the amount or rate of data exceeds the amount expected. Stress tests can help determine the margin of safety that exists in the product.
• Safety Testing. Designed to verify that the product performs in a safe manner and that a complete assessment of the safety design has been accomplished.
• Regression Testing. Performed whenever a software or hardware change that affects software occurs. Verifies that the change produces the desired results to the altered component while no other component is adversely affected.
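The functional/robustness distinction can be made concrete with a small test sketch. The set_flow_rate function and its limits below are hypothetical, not taken from any product specification.

    import unittest

    MAX_RATE = 999  # hypothetical pump limit, mL/hr

    def set_flow_rate(rate):
        """Hypothetical device call: accept a flow rate or raise ValueError."""
        if not 1 <= rate <= MAX_RATE:
            raise ValueError("flow rate out of range")
        return rate

    class FlowRateTests(unittest.TestCase):
        def test_functional_valid_rate(self):
            # Functional (success oriented): expected input, expected output.
            self.assertEqual(set_flow_rate(125), 125)

        def test_robustness_unexpected_input(self):
            # Robustness (failure oriented): an unexpected input must be
            # rejected, not accepted or left in an indeterminate state.
            with self.assertRaises(ValueError):
                set_flow_rate(-1)

    if __name__ == "__main__":
        unittest.main()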
3.0 TEST REQUIREMENTS
3.1 Software Component-level Testing Software component-level testing is concerned with the individual components of the software product and the integration of those components. This level of testing is the responsibility of the software development engineers. Products that fit into the full development life cycle address this testing in the project-specific Software Development Plan.
3.2 Software System-level Testing Software system-level testing is performed by the software V&V engineers and utilizes two different hardware environments. The first environment employs an in-circuit emulator in place of the actual microprocessor; these tests delve into software control and performance. The tests in the second environment are performed on the latest integrated software and hardware configuration without the in-circuit emulator. The specific tests to be performed by test category are summarized in the following sections. The testing of iterative project products and equipment focuses on changes to the baseline and is coupled with regression testing to ensure that the changes produced no adverse effects.

The sequence of steps for conducting software system-level testing by the software V&V engineers is as follows:

1. Complete the RTM with the exception of the test column, which will be completed upon the performance of all tests.
2. Validate the MAKE process. Software developers must demonstrate the process to create a consistent and reliable build, and all compiles must trap for the highest level errors or warnings.
3. Perform V&V testing from a known and controlled test bed.
4. Perform software fault insertion testing by using the in-circuit emulator whenever possible.
5. Perform functional, robustness, stress, and safety testing using the in-circuit emulator with the latest software and hardware.
6. Remove the in-circuit emulator and perform the following tests on the latest configured software and hardware: functional testing of the user interface, robustness testing of the user interface, and stress testing of the communications interface.
7. Execute the appropriate test suite in order to locate software design errors, locate suspicious coding practices, locate unused and improperly scoped variables, and reverse engineer software in generating test cases.
8. Validate the EPROM burn-in procedure. Typical errors include inconsistent memory fill techniques such as all ones, all zeros, halt OPCODE, or a jump to an interrupt service routine to handle an illegal address error routine.
9. Complete and review the RTM in order to verify that all requirements have been tested and satisfied.
10. Complete the Software Anomaly Report and RTM database reports to provide closure.
11. Notify the project software lead engineer when testing is complete. The software lead engineer will provide a document control number to be inserted on the final Software Verification and Validation Report (SVVR). The SVVR must include the version-specific cyclical redundancy check (CRC) identifier and checksum (a sketch of their computation follows this list) and must list all baselined items that were verified with this version of software and the V&V database reports.
12. Provide the completed SVVR to the software lead engineer for submittal to the documentation control group. Insert into the SE CM library the following: the supporting reports, a copy of the SVVR, all supporting test documentation, and V&V-generated test tools.
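The CRC identifier and checksum in step 11 are version-specific fingerprints of the released firmware image. A minimal sketch of how they might be computed is shown below; the CRC-32 polynomial, the 16-bit additive checksum, and the file name are assumptions, since these guidelines do not prescribe a particular algorithm.

    import binascii

    def firmware_identifiers(image_path):
        """Compute a CRC identifier and checksum for a firmware image."""
        with open(image_path, "rb") as f:
            image = f.read()
        crc = binascii.crc32(image) & 0xFFFFFFFF   # assumed CRC-32
        checksum = sum(image) & 0xFFFF             # assumed 16-bit additive sum
        return "%08X" % crc, "%04X" % checksum

    if __name__ == "__main__":
        crc, checksum = firmware_identifiers("release_build.bin")
        print("CRC:", crc, "Checksum:", checksum)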
3.3 Functional Testing Testing of the functional capabilities involves the exercising of the operational modes and the events that allow a transition between the modes or states. These tests are performed in order to verify that proper mode transitions are executed and proper outputs are generated given the correct context inputs. These tests verify that the software generates the expected output given the expected user input. The expected inputs and outputs include switches, tones, messages, and alarms. Audio frequencies and amplitudes of products are tested by the challenge tests.
3.3.1 Flow Control Tests
The software algorithms employed in controlling motors are scrutinized for proper translation of user inputs into the correct digital-to-analog converter (DAC) value, correct alterations due to feedback values, and error handling for out-of-tolerance conditions. Rate accuracy is tested through the challenge tests, which utilize special testing tools.
3.3.2 Pressure Tests
The software algorithms implemented to detect, condition, or utilize pressure values are scrutinized with tests covering the following:

• The status of the pressure transducer
• Fast and slow filters to condition pressure readings and to trap values beyond the set limits
• Effectiveness in detecting upstream and downstream occlusions
• Detection and handling of out-of-tolerance pressure readings
• The handling of auto-zero pressure calibration
• The time to alarm under the worst case scenario
• The capability to detect low pressure
3.3.3 Air-in-Line Tests
The software algorithms employed in providing air-in-line (AIL) detection are tested to verify detection, timeliness of detection, and granularity of detection and determination.
3.3.4 Sensor Tests
The sensors that are sampled by the software are tested to verify detection, timeliness of detection, and granularity of detection and determination.
3.3.5 Remote Communications Interface Tests
A communications test tool is used to test the proper operation of the remote communications protocol and the functionality of the communications software located in the product under test. As a minimum, the following tests should be performed (a sketch of one such test follows this list):

• Connect and disconnect tests
• Valid commands and inquiries tests
• Handling of invalid commands and inquiries, such as NAK (negative acknowledgment)
• Tests for all baud rates that are supported
• Corrupted frames tests
• Error handling in general and the interface to the product's and equipment's error handler
• Control mode testing with emphasis on the safety aspects
• Monitor mode testing with emphasis on fidelity of values reported
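As an illustration of the corrupted-frames test, the sketch below deliberately damages a frame and verifies that the product answers with NAK. The frame layout, the single ASCII NAK byte, and the pyserial-style port object with write and read methods are all assumptions.

    NAK = b"\x15"  # ASCII negative-acknowledgment byte (assumed protocol)

    def corrupt_frame(frame):
        """Flip one bit in the trailing byte to simulate line corruption."""
        damaged = bytearray(frame)
        damaged[-1] ^= 0x01  # damage the checksum byte
        return bytes(damaged)

    def check_nak_on_corruption(port, frame):
        """Send a corrupted frame; the unit under test must reply with NAK."""
        port.write(corrupt_frame(frame))
        response = port.read(1)
        if response != NAK:
            raise AssertionError("corrupted frame was not rejected with NAK")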
3.3.6 Timing Tests
Timing tests should be performed only for the system critical functions that relate to the system critical time, the system critical volume, and the operational window. As a minimum, the following timing tests are performed:

• Active failure tests are completed within the system critical time
• Passive failure tests are completed within the product- and equipment-defined operational window
3.3.7 Power Tests
Power tests are performed whenever a change has been made to the software that monitors the battery levels. If the new functionality is pushing the product to the edge of hardware resources, such as timing or memory, then battery tests should also be performed because of their potential effect on power-down software routines. Battery testing should include a ramp up and ramp down of voltages in order to test the various levels of warnings, alarms, and errors.
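A minimal sketch of the voltage ramp described above. The set_supply_millivolts and read_alarm_state hooks, and the voltage range, are hypothetical stand-ins for a programmable power supply and the product's alarm reporting interface.

    def battery_alarm_levels(set_supply_millivolts, read_alarm_state,
                             start_mv=9000, stop_mv=5000, step_mv=100):
        """Ramp the supply voltage down, recording where each alarm state trips."""
        observed = {}
        for mv in range(start_mv, stop_mv - 1, -step_mv):
            set_supply_millivolts(mv)
            state = read_alarm_state()  # e.g., "ok", "warning", "alarm", "error"
            if state not in observed:
                observed[state] = mv    # first voltage at which state appeared
        return observed

The returned table can then be compared against the specified warning and alarm thresholds; a ramp up would be run the same way with an ascending range.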
3.4 Robustness Testing

Robustness testing is performed as follows (a boundary-value sketch follows this list):

• Boundary testing for over- and underspecified limits is performed for numerical values that determine logic flow based on a maximum or minimum value; test cases should include negative values
• Overflow and underflow are tested for all algorithms
• The user interface is tested by entering unexpected values and sequences
• Routines that have execution time limits should be altered in order to introduce reasonable delays to help determine the reaction
• Unexpected commands and data are transmitted to the remote communications handler
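A sketch of how boundary test cases, including negative values, might be enumerated for a numerical limit; the rate limits are hypothetical.

    def boundary_cases(minimum, maximum):
        """Enumerate boundary and out-of-range inputs, including negatives."""
        return sorted(set([
            minimum - 1, minimum, minimum + 1,   # lower boundary and neighbors
            maximum - 1, maximum, maximum + 1,   # upper boundary and neighbors
            0, -1, -maximum,                     # zero and negative values
        ]))

    # Example: hypothetical flow rate limits of 1 to 999 mL/hr.
    for rate in boundary_cases(1, 999):
        print(rate)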
3.5 Stress Testing

3.5.1 Duration Tests
Tests that exercise the equipment continuously over varying periods of time and operating parameters determine if latent errors exist in the software. These tests will usually consist of overnight and weekend runs in order to gain the optimum benefit of the allotted test time. These tests are akin to software burn-in tests.
3.5.2 Buffer Overload Tests
Global buffers and data structures are tested under loaded and overflow conditions in order to determine the response of the software.
3.5.3 Remote Communications Load Tests
Tests are performed that verify the transfer rate of remote communications interfaces at the maximum transfer rate and under worst case conditions.
3.5.4 Worst Case Scenario Tests
Tests are performed that verify the product and equipment operating capability under the projected worst case scenario. The worst case scenario tests for products include the following:

• Highest rates with the most restrictive hardware or peripherals, to ascertain high resistance
• Lowest rates with the least restrictive hardware or peripherals, to ascertain low resistance
• Plugging and unplugging the AC/DC power outlet
• Static discharge
• RF interference with the remote communications link
• Event overload for event-driven systems
These tests are limited to reasonable environmental tests that do not include temperature and vibration testing.
3.6 Safety Testing

Tests are performed specifically to verify the fail-safe provisions of the software design. These tests cover error conditions only and do not address warnings and alarms, which are tested under the category of functional tests. Limited, non-destructive fault insertion tests are performed by the software V&V engineer whenever possible. The formal hardware fault insertion testing for products is performed using the Fault Insertion Protocol that is generated for DVT activities. Products that provide error-handling routines must be tested with data corruption tests in order to ensure an acceptable level of safety. The analysis must include a review of active failure tests and their completion within the system critical time, and passive failure tests and their completion within the product-defined operational window.

Some examples of active failure tests include the following (a RAM pattern test sketch follows this list):

• ROM testing via CRC computation and comparison to stored value
• RAM testing via several patterns looking for stuck bits, address decoding problems, and data and address path errors
• Cyclic program execution checked with a timer and a watchdog circuit
• Motor speed check by monitoring the lead screw rotation flag
• Backward calculations for motor speed and pressure
• Sanity checks, which test for relative execution rates between tasks
• LED indicators, by sensing the voltage across them
• LCD displays, by comparing the display memory with the display image in main memory
• Processor and controller checks, by executing varying instructions as diagnostic tests
• Digital-to-analog converter (DAC), by reading output with the analog-to-digital converter
• Pressure transducer and preamp tests
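A minimal sketch of the RAM pattern test named above. The write_byte and read_byte hooks are assumed to reach the target's memory, for example through an in-circuit emulator interface; the patterns are the usual stuck-bit values.

    def ram_pattern_test(write_byte, read_byte, addresses):
        """Write and read back complementary patterns, looking for stuck bits."""
        for pattern in (0x00, 0xFF, 0x55, 0xAA):
            for addr in addresses:
                write_byte(addr, pattern)
            for addr in addresses:
                value = read_byte(addr)
                if value != pattern:
                    raise AssertionError(
                        "stuck bit at 0x%04X: wrote 0x%02X, read 0x%02X"
                        % (addr, pattern, value))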
Some examples of passive failure tests include the following:

• Watchdog timer test
• Watchdog motor disable test
• Hardware RAM tests
• CRC generator
• Power or battery test
• Audio generators and speakers tests
• EEPROM tests
Safety aspects that must also be addressed are as follows:

• Critical parameters and their duplicates
• Events that lead to a loss of audio indicators
• Events that lead to a loss of visual indicators
• Events that lead to tactile feedback misses
• Error handling for corrupted vectors and structures
• Error handling for corrupted sanity checks
• Sufficiency of periodic versus aperiodic tests
• Tests for over- and under-infusion
• Roles that software plays in preventing phlebitis, air embolus, and infiltration
Safety testing of instruments must include use of the hazards analysis, particularly the software part. Special attention should be paid to all single-point failures. In addition, all of the normal power-up, run-time, and power-down safety tests that are performed by the product should be compiled and compared against the instrument under test.
3.7 Regression Testing

Regression testing is performed on products and equipment that have had a change to an established validated baseline. The sequence of steps for conducting regression testing is as follows (a version-difference sketch follows this list):

1. Compare the new software to the existing baseline by executing a version difference tool.
2. Assess the amount of change and the criticality.
3. Determine the level of effort required and assess the risk.
4. Test the new functions and bug fixes.
5. Execute a compiled list of core tests in order to establish that no new unintended changes have been introduced.
6. Devote special attention to the safety implications.
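A minimal sketch of the version difference step in item 1. Hashing files and diffing the result is only one way such a tool might work, and the directory layout is assumed to be flat.

    import hashlib
    from pathlib import Path

    def version_difference(baseline_dir, candidate_dir):
        """Report files that changed, appeared, or vanished since the baseline."""
        def digests(root):
            return {p.name: hashlib.sha256(p.read_bytes()).hexdigest()
                    for p in Path(root).iterdir() if p.is_file()}
        old, new = digests(baseline_dir), digests(candidate_dir)
        changed = sorted(n for n in old.keys() & new.keys() if old[n] != new[n])
        added = sorted(new.keys() - old.keys())
        removed = sorted(old.keys() - new.keys())
        return changed, added, removed

The size of the three resulting lists feeds directly into step 2, the assessment of the amount of change.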
4.0 VERIFICATION AND VALIDATION PHASES
4.1 Program Goals

The verification and validation (V&V) of software is the independent assessment and measurement of the correctness, accuracy, consistency, completeness, robustness, and testability of the software requirements, design, and implementation. The goals of the program are as follows:

• Verify that the products of each development phase comply with previous phase requirements and products; address all safety-related requirements for critical components and functions; satisfy the standards, practices, and conventions of the phase; and establish the proper basis for initiating the next software development phase
• Validate that the completed software end product complies with established software and system requirements
• Validate that the finished product is free of any known safety problems
• Document the results of V&V tasks in support of software and management planning activities
• Facilitate the accomplishment of product quality goals
V&V is an integral part of each phase of the software development cycle; the project-specific details are contained in the project Software Verification and Validation Plan (SVVP). At the completion of each development phase, a project milestone is attained, and the V&V tasks are integrated into the project schedule in order to provide feedback to the development process and support management functions. The management of the V&V tasks and the personnel required to support management and technical reviews requires scheduling of V&V tasks to correspond with project milestones. As the V&V tasks are completed, the results are documented in a V&V task report, and the task reports are correlated at the completion of each software development phase into a Phase V&V Task Summary Report. The exchange of V&V data and results with the development effort is also provided by Software Anomaly Reports. Anomaly reports, task reports, and task summary reports provide feedback to the software development process regarding the technical quality of software products. Resolution of critical anomalies is required before the V&V effort proceeds to the next software development phase.
4.2 Requirements Phase Verification and Validation

The goal of the Requirements Phase V&V is to ensure that both the problem and the constraints upon the solution are specified in a rigorous form. During this phase of software development, the software requirements analysis is performed, and as problem evaluation and solution synthesis are accomplished, the interface characteristics of the software are established and design constraints are uncovered. System specifications such as the product objectives document (POD), product requirements document (PRD), and User Interface Specification (UIS) specify the product or system level requirements of the instrument, and these documents establish the product requirements from which software requirements are allocated. The Software Requirements Specification (SRS) specifies the results of the software requirements analysis. The SRS defines the basic functions, performance, interfaces, flow, and structure of information and validation criteria of a successful software implementation. The emphasis of the Requirements Phase V&V tasks is the analysis and evaluation of the correctness, consistency, completeness, accuracy, and testability of the specified software requirements.
4.2.1 Verification and Validation Tasks
4.2.1.1 Review of Product Requirements Documentation. Review of the product requirements documentation is critical because it establishes the basis upon which all succeeding documents and products are developed. During this phase of development the specifications of system performance, user interface, and critical components of the instrument are reviewed for use in V&V planning and in defining the level of effort required to successfully verify and validate the software. The requirements defined in these documents provide the basis for development of the RTM. Product requirements documentation of the instrument is provided to the software V&V lead engineer by the software lead engineer for review prior to development of the SRS. Review of the POD, PRD, and UIS supports the V&V effort and ensures that the development of a safe, reliable, user-friendly, and cost-effective product is achievable.

4.2.1.2 Verification of the Software Requirements Specification. The SRS will be evaluated for correctness, consistency, completeness, accuracy, and testability. The SRS is provided to the software V&V lead engineer for review prior to the Software Requirements Review (SRR). Review of the SRS by V&V will concentrate on the following areas of requirement definition:

• Specification of the computing environment(s) in which the software must perform
• Specification of the safety requirements, including a description of any unsafe operating condition in terms of critical software functions and goals, the severity of the hazard, and the set of associated critical parameters and critical indicators
• Specification of the hardware interfaces through which the software must gather input and send output
• Specification of the software interfaces, including the purpose of the interface, the type of data to be interchanged via the interface, and an estimate of data quantity and transfer rate requirements
• Specification of the user interfaces, including the characteristics the software must support for each human interface to the software product
• Specification of the interfaces to communications devices, including the name, version, interface type, and required usage
• Specification of the required values of each output expressed using functions and state tables
• Specification of the timing requirements and accuracy and stability requirements for each such output value
• Software design constraints specifying likely changes and desirable subsets that must be accounted for in the design
An assessment should also be made of how well the SRS satisfies system objectives, including system safety considerations.

4.2.1.3 Generation of the RTM. The RTM is developed using a database tool and traces the development of the software product from requirements through software validation. The RTM is developed by V&V for use in the following:

• Evaluating subsequent requirements and design documents
• Developing test events and test data
• Documenting the validation of instrument software

During review of the instrument POD, PRD, and UIS, software-related requirements are listed in the RTM with a reference to the document that specified them. These requirements will be refined in subsequent levels of engineering documentation and entered into the database with a reference to the higher level requirement document. Every requirement in the RTM must be traceable to the product requirements documentation and to the code and subsequent test(s). The RTM is used to generate tests designed to validate a specific requirement or a group of related requirements. The RTM detects inconsistencies in the refinement of requirements, incomplete definition of requirements in lower level specifications and code, and incomplete specification of testing for requirements (a sketch of such a check follows). Concurrent with evaluation of the SRS, the RTM is updated to document the tracing of the specified software and interface requirements to requirements in the product requirements documentation.
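A minimal sketch of the kind of traceability check an RTM database supports; the entry fields and identifiers are illustrative only, not a prescribed schema.

    # Each RTM entry traces a requirement from its source document through
    # design and on to the tests planned for it.
    rtm = [
        {"req_id": "SRS-014", "source": "PRD 4.2", "design": "SADS 3.1",
         "tests": ["VTIS-021", "VTIS-022"]},
        {"req_id": "SRS-015", "source": "UIS 2.7", "design": "SADS 3.4",
         "tests": []},
    ]

    # An empty test list flags an incomplete specification of testing.
    untested = [entry["req_id"] for entry in rtm if not entry["tests"]]
    print("requirements without planned tests:", untested)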
4.2.2 Inputs and Outputs
The inputs to Requirements Phase V&V include the product requirements documentation, SRS, and periodic program status reports. The outputs of Requirements Phase V&V tasks and activities are the Requirements Phase V&V Task reports, Requirements Phase V&V Task Summary report, RTM, and updates to the Software V&V Plan as required in order to accommodate changes in product requirements and/or program objectives. The outputs from this phase are inputs to subsequent V&V tasks. V&V task reports are developed for each V&V task conducted during the requirements phase and will be used to document discrepancies between requirements documentation and previously defined product requirements. V&V task reports are distributed by the software V&V lead engineer to the software lead engineer and a software manager for review and initiation of corrective action. The Requirements Phase V&V Task Summary report summarizes the results of the V&V performed and provides an assessment of the quality of progress and recommendations. The Requirements Phase V&V Task Summary report is distributed to the software lead engineer and a software manager.
4.3 Architecture Design Phase Verification and Validation The goal of the Architecture Design Phase V&V is to ensure that the software architectural design establishes the design baseline from which the detailed design will be developed. A Software Architecture Design Specification (SADS) is generated during this phase of software development and describes how the software system will be structured to satisfy the requirements identified in the SRS. The SADS translates the software requirements into a description of the software structure, software components, interfaces, and data necessary for the detail design phase. The goal of Architecture Design Phase V&V tasks is to ensure internal consistency, completeness, correctness, and clarity of the information needed to support the detailed definition of the individual software system components. A plan for software validation of the instrument software end product will be generated by V&V during this software development phase.
4.3.1 Verification and Validation Tasks
4.3.1.1 Verification of the Software Architecture Design Specification. The tasks defined for the Architecture Design Phase V&V concentrate on evaluating the preliminary specification of the software design for the instrument. The relationships between the requirements of the SADS and the SRS are analyzed for correctness, consistency, and accuracy. The inclusion of safety features in the software design will be evaluated for compliance with approved software safety design guidelines and safety considerations identified in the hazards analysis, which are controlled and/or commanded by software. The SADS is provided to the software V&V lead engineer by the software lead engineer for review prior to the Software Architecture Design Review (SADR). Review of the SADS by V&V is accomplished by the following:
• Evaluating the form, structure, and functional description of the design for correctness, consistency, completeness, and accuracy
• Evaluating the software structure for robustness, testability, and compliance with established software development procedures and SE Software Development Policies
• Analyzing the data items defined at each interface for correctness, consistency, completeness, and accuracy
4.3.1.2 Generation of the Software Validation Test Plan. The generation of a plan for software validation testing is the responsibility of the software V&V lead engineer and is accomplished concurrently with design analysis. This document, the Software Validation Test Plan (SVTP), defines the methods for verifying the following:

• Correct implementation of software requirements
• Software system capabilities
• Throughput and timing requirements
• Safety design requirements
• Correct interface to the system environment
Development of the SVTP is accomplished in parallel with the review and verification of the SADS and describes the tests conducted and resource requirements for software validation of the software end products. The tests to be performed during software validation are carefully selected to verify correct system operation under the range of environments and input conditions defined in the SRS. During software validation the following are measured:

• Compliance of the complete software product with all functional requirements while operating in all system environment(s)
• Performance at hardware, software, and user interfaces
• Performance at boundaries and under stress conditions
• Compliance with safety design requirements
4.3.1.3 Review of the Software Test Plan. The scope of testing that must be successfully completed for each software component by the software developers is defined in the Software Test Plan (STP). The STP is provided to the software V&V lead engineer by the software lead engineer for review prior to document approval. The completeness, correctness, and consistency of the software testing described in the STP required for each software component is evaluated. Compliance of the STP with the requirements specified in the SE Software Development Policies and software development procedures is also verified.

4.3.1.4 Updates to the RTM. The RTM is updated to document the tracing of the software structure specified in the SADS to requirements in the SRS. The updated RTM is subsequently used in the generation of the SVTP to ensure completeness and consistency in test coverage. At the completion of this development phase, the RTM is updated to cross-reference each software requirement to the test(s) described in the SVTP and the STP. The software V&V lead engineer is responsible for ensuring the accuracy and completeness of RTM updates.
4.3.2 Inputs and Outputs
The inputs to Architecture Design Phase V&V include the SRS, hazards analysis, RTM, SADS, STP, and periodic program status reports. The outputs of Architecture Design Phase V&V are the V&V Task reports, V&V Task Summary report, SVTP, RTM updates, and updates to the SVVP as required in order to accommodate changes in product and software requirements. Outputs from this phase are inputs to subsequent V&V tasks. V&V task reports are developed for each V&V task conducted during this phase and are used to document discrepancies between the architectural design documentation and previously defined software requirements. V&V task reports are distributed by the software V&V lead engineer to the software lead engineer and a software manager for review and initiation of corrective action. Prior to completion of the SADR, the SVTP will be reviewed by the software lead engineer and approved by a software manager. The Architecture Design Phase V&V Task Summary Report summarizes the results of the V&V performed and provides an assessment of the quality of progress and recommendations. The V&V task summary report is distributed to the software lead engineer, a software manager, and quality assurance representative(s).
4.4 Detailed Design Phase Verification and Validation The goal of the Detailed Design Phase V&V is to ensure that the detailed software design satisfies the requirements and constraints specified in the SRS and augments the design specified in the SADS. A Software Detailed Design Specification (SDDS) generated during this phase of software development describes how the software system will be structured to satisfy the requirements identified in the SRS and supports the design specified in the SADS. The SDDS translates the software requirements into a description of the software structure,
software components, interfaces, and data necessary for the implementation phase. The result is a solution specification that can be implemented in code with little additional refinement. The goal of Detailed Design Phase V&V tasks is to ensure internal consistency, completeness, correctness, and clarity of the SDDS and to verify that the implemented design will satisfy the requirements specified in the SRS. The SVTP will be updated as required to incorporate the additional design details of the SDDS. Software Validation Test Information Sheets (VTISs) are developed by the V&V group to define the objectives, approach, and requirements of each test defined in the SVTP.
4.4.1 Verification and Validation Tasks
4.4.1.1 Verification of the Detailed Design Specification. The tasks defined for Detailed Design Phase V&V concentrate on evaluating the specification of the software design for the instrument. The relationships between the requirements of the SDDS and SRS and the design of the SADS and SDDS are analyzed for correctness, consistency, completeness, and accuracy. The inclusion of safety features in the software design is evaluated for compliance with approved software safety design guidelines and safety considerations identified in the hazards analysis, which are controlled and/or commanded by software. The SDDS is provided to the software V&V lead engineer by the software lead engineer for review prior to the Software Detailed Design Review (SDDR). Review of the SDDS is accomplished by the following:

• Evaluating the form, structure, and functional description of the design for correctness, consistency, completeness, and accuracy
• Evaluating the software structure for robustness, testability, and compliance with established SE Software Development Policies and procedures
• Analyzing the data items defined in the SDDS at each hardware, software, and user interface for correctness, consistency, completeness, and accuracy
An assessment will be made of how well the software structures defined in the SDDS satisfy the fundamentals of structured design. The structured design techniques that provide a foundation for good design methods include the following:

• Evaluating the preliminary software structure to reduce coupling and improve cohesion
• Minimizing structures with high fan-out and striving for fan-in as depth increases
• Keeping the scope of effect of a component within the scope of control of that component
• Evaluating component interfaces to reduce complexity and redundancy and improve consistency
• Defining components whose function is predictable but avoiding components that are overly restrictive
• Striving for single-entry, single-exit components and avoiding content coupling
• Packaging software on the basis of design constraints and portability requirements
• Selecting the size of each component so that independence is maintained
4.4.1.2 Review of Software Development Test Information Sheets. A software Development Test Information Sheet (DTIS) is prepared by the software developers for each software component test defined in the STP. The DTISs are provided to the software V&V lead engineer by the software lead engineer for review prior to the SDDR. Verification of the adequacy of software component testing is supported by the review of the DTISs. The DTISs are analyzed by V&V to evaluate the following:

• The adequacy of the test methods and test limits defined
• The adequacy of test coverage
• Software behavior
• Software reliability
4.4.1.3 Generation of the Software Validation Test Information Sheets. The generation of Software VTISs is accomplished concurrently with design analysis. These test documents provide the following:

• An organized and accessible collection of all testing and test results
• A means of tracking the progression and status of testing
• A means of test verification

For each test conducted during software validation, a VTIS is generated and maintained that describes the following:

• Objectives of the test and the success criteria
• Item under test
• Test approach
• Required test instrumentation
• Test phasing, scheduling, and duration
• Data collection, reduction, and analysis requirements
The VTISs are provided to the software lead engineer prior to the SDDR for use in assessing the adequacy of the test methods and limits defined for the software validation test program. Upon successful completion of the SDDR, the VTISs will serve as the basis for developing the Software Validation Test Procedures (SVTPR).

4.4.1.4 Updates to the RTM. The RTM will be updated to document the tracing of the software structure specified in the SDDS to requirements in the SRS.
4.4.2 Inputs and Outputs
The inputs to Detailed Design Phase V&V include the SRS, hazards analysis, SADS, RTM, SDDS, DTISs, and periodic program status reports. The outputs of Detailed Design Phase V&V are the Detailed Design Phase V&V Task reports, a Detailed Design Phase V&V Task Summary report, VTISs, RTM updates, and updates to the SVVP as required in order to accommodate changes in the product and software requirements. Outputs from this phase are inputs to subsequent V&V tasks. V&V task reports are developed for each V&V task conducted during this phase and are used to document discrepancies between the specification of software design and/or tests and previously defined software requirements. V&V task reports are distributed by the software V&V lead engineer to the software lead engineer and a software manager for review and initiation of corrective action. The Detailed Design Phase V&V Task Summary report summarizes the results of the V&V performed and provides an assessment of the quality of progress and recommendations. The V&V task summary report will be distributed to the software lead engineer and a software manager.
4.5 Implementation Phase Verification and Validation The goal of the Implementation Phase V&V is to ensure that the design is correctly implemented in code and results in a program or system that is ready for validation. The Implementation Phase of the software development effort encompasses the activities defined
in the SE Software Development Policies for the Code and Test Phase and the Integrate and Test Phase. The goal of Implementation Phase V&V tasks is to ensure the accurate translation of the detailed design and to detect undiscovered errors. Verification of the implementation phase activities performed by software developers is accomplished by reviewing and auditing the code and software integration results. The instructions for validation test setup, operation, and evaluation are generated by software V&V engineers for approval prior to test execution.
4.5.1 Verification and Validation Tasks
4.5.1.1 Source Code Verification. The V&V tasks performed during the Implementation Phase emphasize the analysis and evaluation of the source code against the SDDS. A traceability analysis is performed to identify the source code implementation of the design and assess the correctness, consistency, completeness, and accuracy of that implementation. The source code is also evaluated for robustness, testability, and compliance with established programming standards and conventions. Code walk-throughs are conducted by the code developer(s) during the Implementation Phase to examine both high-level and detailed properties of the source code. V&V will participate in these walk-throughs and provide results of source code V&V to the software lead engineer and a software manager. V&V audits of source code documentation will do the following (a partial automation of such an audit is sketched after this list):

• Evaluate the structure of the source code for compliance with SE coding standards
• Assess the communication value of the source code
• Evaluate the source code for algorithm, memory, execution, and input/output efficiency
• Evaluate the source code for consistency, completeness, and traceability to software requirements and design
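Parts of such an audit can be automated. The sketch below is illustrative only and assumes a hypothetical coding standard that limits line length and forbids tab characters; the project's actual SE coding standards govern:

    MAX_LINE = 80  # assumed limit; the actual SE standard governs

    def audit_source(path):
        """Collect candidate coding-standard findings for one source file."""
        findings = []
        with open(path) as src:
            for lineno, line in enumerate(src, start=1):
                if len(line.rstrip("\n")) > MAX_LINE:
                    findings.append((lineno, "line exceeds maximum length"))
                if "\t" in line:
                    findings.append((lineno, "tab character present"))
        return findings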
Discrepancies and deficiencies found during V&V of source code are documented in Software Anomaly Reports. 4.5.1.2 Verification of Software Component Testing. During the Implementation Phase, the software developers will use the DTISs to conduct software component testing. At the successful completion of the testing described, the DTIS is signed and dated by the software lead engineer. The DTISs and the associated test data will be provided to the software V&V lead
engineer by the software lead engineer as each test is completed. The completed DTISs will be analyzed by the software V&V engineers to evaluate the following:

• Adequacy of test coverage
• Adequacy of test data
• Software behavior
• Software reliability
Discrepancies and deficiencies found during V&V of software component testing are documented in Software Anomaly Reports. 4.5.1.3 Generation of Software Validation Test Procedures. The generation of test procedures for software validation will be accomplished concurrently with code and integration analysis. The SVTPR will be developed using the test information defined in the VTISs as an outline and adding procedures for test setup, operation, and evaluation. Test setup requirements and computing environments defined in the challenge test protocols will be included in the SVTPR to ensure that test setup and computer environment configurations are as accurate as possible prior to validation test execution. Software Validation Test Procedures specify the following (a sketch of procedure generation from a VTIS follows this list):

• The steps for executing the set of tests defined in the SVTP
• Requirements for logging test activities
• Criteria for procedure stop and restart
• Methods of collecting and analyzing test data
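The derivation of procedures from the VTISs can be pictured with a small sketch. This is a hypothetical Python illustration, with field names echoing Appendix D and invented contents; it is not a required tool:

    vtis = {
        "test_number": "FU1",
        "objective": "Verify power-up self test",
        "approach": "Cycle power and observe the self-test report",
        "environment": "Bench unit with serial data logger",
    }

    # Expand one VTIS record into an SVTPR section skeleton with the
    # setup, operation, and evaluation steps called for above.
    procedure = (
        f"Test {vtis['test_number']}: {vtis['objective']}\n"
        f"  Setup: {vtis['environment']}\n"
        f"  Operation: {vtis['approach']}\n"
        "  Evaluation: record pass/fail against the success criteria\n"
    )
    print(procedure)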
4.5.2
Inputs and Outputs
Inputs to Implementation Phase V&V include the SDDS, SRS, code, SVTP, and VTISs. The outputs are V&V task reports, anomaly reports, a task summary report, SVTPR, updates to the RTM, and updates to the SVVP as required. Outputs from this phase are inputs to subsequent V&V tasks. The Implementation Phase V&V Task reports and anomaly reports are distributed to the software lead engineer and a software manager for review and initiation of corrective action. The Implementation Phase V&V Task Summary report summarizes the results of the V&V performed and provides an assessment of the quality of progress and recommendations. The task summary report is distributed to the software lead engineer and a software manager.
4.6 Software Validation Phase Verification and Validation The goal of the Software Validation Phase V&V tasks and activities is to verify that the software satisfies the requirements and design specified in the SRS and SDDS.
4.6.1
Prevalidation Software Configuration Control
At the completion of software component testing, the software is delivered to the software configuration manager by the software lead engineer for baseline processing. The baselined source code and associated files are stored in the SE project library, which provides internal source file control, problem identification, change traceability, and status determination of software and associated documentation.
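Baseline processing of this kind can be supported by a manifest of file digests so that any later change to a baselined item is detectable. The sketch below is one hypothetical approach in Python, not the configuration management tooling the SOP assumes:

    import hashlib
    from pathlib import Path

    def baseline_manifest(root):
        """Map every file under root to a SHA-256 digest of its contents."""
        manifest = {}
        for path in sorted(Path(root).rglob("*")):
            if path.is_file():
                manifest[str(path)] = hashlib.sha256(path.read_bytes()).hexdigest()
        return manifest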
4.6.2
Software Validation Testing
Software validation is performed using the current controlled version of the software as determined by the software configuration manager. Software validation is conducted in accordance with the SVTP using the SVTPR. The results of software validation are documented on the VTISs, and the validation test results are analyzed to determine if the software satisfies software requirements and objectives. Software Anomaly Reports are generated to document test failures and software faults. Software validation is conducted to ensure the verification of the following software performance requirements:

• Satisfaction of applicable human interface requirements
• Satisfaction of applicable system safety and data integrity requirements
• Proper operation, including initiation, data entries via peripheral devices, and system operation monitoring and control
• Proper interface of all hardware specified in the software requirements specification
4.6.3
Regression Testing
Regression testing will be conducted during software validation as necessary in order to confirm that the redesign of corrected software has been effective and has not introduced other errors. This retesting includes repeat testing of all test procedures that revealed problems in the previous testing and that verify functions affected by the corrections. The SVTPR is corrected
to incorporate changes resulting from validation of the procedures through execution and from procedure modifications made to accommodate approved software design changes.
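The selection rule in this section (repeat what failed, plus whatever exercises the corrected functions) can be stated compactly. The following Python sketch is illustrative; the test numbers and coverage data are invented:

    def select_regression_tests(failed_previously, affected_functions, coverage):
        """Repeat every procedure that revealed a problem, plus every
        procedure covering a function affected by the correction.
        coverage maps a test number to the set of functions it exercises."""
        selected = set(failed_previously)
        for test, functions in coverage.items():
            if functions & affected_functions:
                selected.add(test)
        return sorted(selected)

    # Hypothetical example: FU3 failed previously and the fix touched pid.
    print(select_regression_tests({"FU3"}, {"pid"},
                                  {"FU3": {"pid"}, "SA1": {"safety_task"}}))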
4.6.4
Software Verification and Validation Report
The SVVR is generated by the software V&V lead engineer at the completion of all V&V tasks during the Software Validation Phase and is a summary of all V&V activities and results, including status and disposition of anomalies. An assessment of the overall software quality and recommendations for software process improvements is documented in the report.
4.6.5
Inputs and Outputs
The inputs to the Software Validation Phase V&V are the product requirements documentation, SRS, SDDS, SVTP, VTISs, and SVTPR. The outputs are completed VTISs, Software Validation Phase V&V Task Summary report, SVVR, anomaly reports, and updates to the SVVP as required in order to accommodate changes in the software validation program. The Software Validation Phase V&V Task Summary report is generated at the conclusion of all testing and includes a summary and detail of the test results, a detailed test history, an evaluation of test results and recommendations, and a record of test procedure deviations. Anomaly reports generated during this V&V phase document discrepancies detected during testing and the software configuration audit. The Software Validation Phase V&V Task Summary report and anomaly reports are distributed to the software lead engineer and a software manager for review and initiation of corrective action.
5.0 VERIFICATION AND VALIDATION REPORTING
The test documentation to be produced for a particular product and equipment depends on the product's life cycle classification and its level of concern classification. Upon successful completion of the validation effort, the SVVR, which contains a section listing the pertinent documentation for the revision of software, is issued.
5.1 Verification and Validation Documentation Templates In order to create an efficient and standardized V&V program, a collection of V&V documentation templates has been generated and placed on the network for access by the software V&V engineers. At the beginning of a V&V assignment, the software V&V engineer shall classify the type of product and equipment under test and determine the level of effort and documentation deliverables that are required. The software V&V engineer will then access the required templates as a starting point. The following directories and templates are provided on the network [insert directory path here]:

• [insert directory path name]  Software V&V policies
• [insert directory path name]  Full Cycle Software V&V Plan
• [insert directory path name]  Short Cycle Software V&V Plan
• [insert directory path name]  Software Validation Test Plan
• [insert directory path name]  Software Validation Test Procedures
• [insert directory path name]  Full Cycle Software V&V Report
• [insert directory path name]  Short Cycle Software V&V Report
• [insert directory path name]  Forms, including VTIS, CRA, and Anomaly
• [insert directory path name]  RTM database
• [insert directory path name]  Anomaly database
• [insert directory path name]  CRA database
Upon completion of a project, the software V&V lead engineer shall insert the completed templates above into the corresponding project history directory located under [insert directory path here]. Each project history directory shall contain the following subdirectories (a sketch that creates this structure follows the list):

• doc  All documents
• rtm  RTM database files
• anomaly  Anomaly database files
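The sketch below creates that structure; it is a hypothetical Python illustration, and the path arguments are placeholders:

    from pathlib import Path

    def create_project_history(root, project):
        """Create a project history directory with the doc, rtm, and
        anomaly subdirectories named above."""
        for sub in ("doc", "rtm", "anomaly"):
            Path(root, project, sub).mkdir(parents=True, exist_ok=True)

    create_project_history("history_root", "example_project")  # hypothetical paths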
5.2 Verification and Validation 510(k) Submittal V&V support for a 510(k) submittal requires that an SVVP be generated for the particular product. A full life cycle development effort for a new product requires the broader coverage SVVP template contained in the full cycle subdirectory listed above, and a product enhancement 510(k) submittal requires the accelerated SVVP located in the short cycle subdirectory listed above. A 510(k) submittal requires that the software V&V engineer produce the SVVP, RTM, SVTP, and SVTPR, along with the SVVR and its test documentation of Test Log, VTIS, and Software Anomaly Report. The primary difference between the two templates is that the product enhancement SVVP template eliminates the SDP, STP, and Software Quality Assurance Plan (SQAP).
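The template choice reduces to a simple rule. This sketch is a hypothetical Python illustration; the directory names stand in for the full cycle and short cycle subdirectories listed in Section 5.1:

    def svvp_template(submission_type):
        """Select the SVVP template for a 510(k) submittal."""
        if submission_type == "new product":
            return "full_cycle"       # broader coverage SVVP template
        if submission_type == "product enhancement":
            return "short_cycle"      # accelerated SVVP template
        raise ValueError(f"unknown submission type: {submission_type}")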
5.3 Test Log A test log book shall be kept of all significant V&V tests performed on the software. See a sample test log form in Appendix C.
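A log kept electronically would carry the same columns as the Appendix C form. The sketch below is one hypothetical way to append an entry; it is not part of the SOP:

    import csv
    import datetime

    def log_test(logfile, test_number, entry, references, engineer):
        """Append one row with the columns of the Appendix C test log."""
        with open(logfile, "a", newline="") as f:
            csv.writer(f).writerow(
                [datetime.datetime.now().isoformat(timespec="minutes"),
                 test_number, entry, references, engineer])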
5.4 Requirements Traceability Matrix The RTM shown in Appendix B traces the higher-level system requirements to the derived software requirements, through the implemented design, and to the tests in which each requirement was verified. Once the RTM is completed and all anomalies are closed, the Software Verification and Validation Report (SVVR) is generated.
5.5 Validation Test Information Sheets The VTISs are utilized during the software system testing activity for full life cycle development projects. The VTIS shown in Appendix D incorporates all of the necessary test information, including purpose and success criteria of the test, test engineer, test approach, test environment, test results, and comments. The VTISs are to be retained by the V&V group in a central location for review by the software developers and management. The V&V personnel use the VTISs in the following manner (a sketch of the VTIS fields follows this list):

• Before testing, the VTIS is used as a planning tool to outline the tests to be performed, updated as the test scenario or plan is completed, and used to generate the SVTPR.
• During testing, the VTIS is used as a means of tracking the progress and status of all testing and as a means of test verification.
• After testing, the completed VTISs are included in the Software Test Report and delivered to the software configuration manager for archiving.
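The information a VTIS carries can be modeled directly. The following Python sketch mirrors the Appendix D fields; the record contents are invented and the class is purely illustrative:

    from dataclasses import dataclass, field

    @dataclass
    class VTIS:
        """Validation Test Information Sheet, after Appendix D."""
        test_number: str
        objective: str
        approach: str
        environment: str
        duration: str = ""
        results: str = ""                      # filled in during testing
        comments: list = field(default_factory=list)

    sheet = VTIS("FU1", "Verify power-up self test",
                 "Cycle power and observe the self-test report", "Bench unit")
    sheet.results = "Pass"                     # updated as testing progresses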
5.6 Software Anomaly Report Problem reporting is initiated by the software V&V engineer(s) with a Software Anomaly Report (Appendix E) that identifies problems detected during V&V activities. The specific information required on an anomaly report identifies how, when, and where the problem occurred and the impact of the problem on the system capability and on the continued conduct of V&V phase activities.
5.7 Anomaly Reporting and Resolution The software V&V lead engineer is responsible for the proper documentation and reporting of Software Anomaly Reports, and all anomalies are reported regardless of the perceived impact on software development or severity level with respect to system operation. Unreported and unresolved problems can have a significant adverse impact in the later stages of the software development cycle, when little time may remain for resolution. The projected impact of an anomaly is determined by evaluating the severity of its effect on the operation of the system. The severity of an anomaly report is defined as one of the following:

• High. The change is required in order to correct a condition that prevents or seriously degrades a system objective and no alternative exists, or to correct a safety-related problem.
• Medium. The change is required to correct a condition that degrades a system objective, to provide for performance improvement, or to confirm that the user and system requirements can be met.
• Low. The change is desirable to maintain the system, to correct an operator inconvenience, or for other minor reasons.
Resolution of any anomaly with a severity of “high” is required before the V&V effort can proceed to the next software development phase; a sketch of this gating check follows.
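The sketch below expresses that gate in Python; it is a hypothetical illustration, not a prescribed tool:

    SEVERITY_ORDER = {"low": 0, "medium": 1, "high": 2}

    def may_proceed(open_anomaly_severities):
        """Permit entry into the next phase only when no open anomaly
        carries high severity."""
        return all(SEVERITY_ORDER[s] < SEVERITY_ORDER["high"]
                   for s in open_anomaly_severities)

    print(may_proceed(["low", "medium"]))  # True
    print(may_proceed(["high"]))           # False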
Software Anomaly Reports are reviewed by the software lead engineer for anomaly validity, type, and severity, and the software lead engineer can direct additional investigation if required to assess the validity of the anomaly or the proposed solution. When an anomaly solution is approved and the personnel responsible for performing the corrective action are indicated, the software lead engineer will authorize implementation of the corrective action. The software V&V lead engineer is responsible for anomaly report closure, which includes documenting that the corrective action(s) have been taken and verifying the incorporation of authorized changes as described in the anomaly report. If the anomaly requires a change to a baselined configuration item, a Change Request/Approval (CRA) is prepared by a member of the software development team for the item(s) to be changed. A reference to applicable anomaly reports will be documented in the issued CRA.
5.8 Task Reporting The results of individual V&V tasks are documented in a V&V task report, which identifies the V&V phase at which the task was conducted, the responsible V&V engineer(s), the responsible software development team member(s), interim results, and status and recommended corrective action, if any. The V&V task report may be in a format that is appropriate for technical disclosure, such as technical reports or memos. The V&V task reports are provided to the software lead engineer and a software manager in a timely manner to aid in the detection and resolution of problems prior to the start of the next software development phase.
5.9 Verification and Validation Phase Summary Report At the conclusion of each V&V phase, the V&V Phase Summary Report is generated; it summarizes the results of V&V performed during the applicable software development phase. This summary report contains the following:

• A description of the V&V tasks performed
• A summary of task results
• A summary of anomalies and implemented resolutions
• An assessment of software quality
• Any recommendations
The V&V Phase Summary Report may be in a format that is appropriate for technical disclosure, such as technical reports or memos.
5.10 Software Verification and Validation Report When the RTM is completed and all anomalies have been closed, the SVVR (Appendix F) is generated. The SVVR certifies that the software is safe for its intended use and includes the following:

• A list of documents reviewed and utilized
• A summary of all V&V activities
• Specific test results
• Test history
• Pertinent pages of the RTM database
• Pertinent pages of the anomaly database
• An assessment of overall software acceptability
• Any process recommendations
The RTM and anomaly database reports are attached to the SVVR and become one package, which is provided to the software lead engineer, who is responsible for submitting the software and related documentation to corporate document control. For non-instrument software, the department may choose to impose local control, thereby assuming all responsibility for configuration management.
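The precondition for generating the SVVR can be checked mechanically. This Python sketch is illustrative; the record layouts are invented:

    def ready_for_svvr(rtm_rows, anomaly_reports):
        """Gate SVVR generation: every RTM row carries a recorded test
        result and every Software Anomaly Report is closed."""
        rtm_complete = all(row.get("result") for row in rtm_rows)
        anomalies_closed = all(report["status"] == "closed"
                               for report in anomaly_reports)
        return rtm_complete and anomalies_closed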
5.11 Verification and Validation Deviation or Waiver Circumstances may require deviation(s) and waiver(s) from policy. A written request for a deviation is generated by the cognizant project software V&V lead engineer in advance of a future activity, event, or product in order that SE management be made aware of the project’s intention to employ a higher risk approach to V&V. A written request for a waiver is generated by the cognizant project software V&V lead engineer in those cases where the activity,
event, or product has already been initiated. The deviations and waivers are submitted to the project [project title/position] for review, and a recommendation is made to the [title/position] and/or [title/position] for approval or disapproval of the proposed deviation or waiver. A proposed deviation or waiver must be approved by the [title/position] and/or [title/position] before the verification and validation tasks affected by that deviation or waiver are begun. A copy of each approved deviation and waiver shall be forwarded to the secretary of the Verification and Validation Policy CCB. A copy shall also be placed in the product history file. A permanent record of deviation and waiver approvals shall be maintained for each project. The form depicted in Appendix G is to be used. Each request for a deviation or waiver shall identify the following:

• Each specific policy or policy requirement to which it applies
• The alternative policy approach to be taken by the project
• The impact on project schedule, performance, and/or risk
This record shall be initiated during development of the product objectives and shall serve as a record of all subject approvals for the duration of the project.
APPENDIX A
LIST OF REFERENCES FOR SOFTWARE VERIFICATION AND VALIDATION
FDA Policy for the Regulation of Computer Products (Draft), November 1989

Application of the Medical Device GMPs to Computerized Devices and Manufacturing Processes—Medical Device GMP Guidance for FDA Investigators (Draft), November 1990

Medical Device Industry Computer Software Committee Comments on the November 1990 Draft, May 1991

Review of 510(k)s for Computer Controlled Medical Devices, 510(k) Memorandum #K91-1, August 1991

HIMA (MDICSC) Rewrite of Reviewer Guidance for Computer Controlled Devices, November 1989

Technical Reference on Software Development Activities

Preproduction Quality Assurance Planning: Recommendations for Medical Device Manufacturers, HHS Publication FDA 90-423, September 1989

Process Validation: Guideline on General Principles of Process Validation, May 1987

International Standard, ISO 9000: Quality management and quality assurance standards—Guidelines for selection and use, First Edition, 1987-03-15, Reference number ISO 9000:1987(E)

International Standard, ISO 9001: Quality systems—Model for quality assurance in design/development, production, installation, and servicing, First Edition, 1987-03-15, Reference number ISO 9001:1987(E)

International Standard, ISO 9002: Quality systems—Model for quality assurance in production and installation, First Edition, 1987-03-15, Reference number ISO 9002:1987(E)

International Standard, ISO 9003: Quality systems—Model for quality assurance in final inspection and test, First Edition, 1987-03-15, Reference number ISO 9003:1987(E)

International Standard, ISO 9004: Quality management and quality system elements—Guidelines, First Edition, 1987-03-15, Reference number ISO 9004:1987(E)
APPENDIX B

REQUIREMENTS TRACEABILITY MATRIX

Req.  Requirement        SRS Para.  SDDS Para.  Software       Test       Verification  Test
No.   Description        Number     Number      Component(s)   Number(s)  Method(s)     Results
001   Power-up test      3.1.1      3.3.1       power_up       FU1+RE3                  P/F
002   Power down         3.8        3.10        shutdown       FU2                      P/F
003   Motor control      3.1        3.3         motor_task     FU1+RE1                  P/F
004   Control algorithm  3.1.2      3.3.2       pid            FU3                      P/F
005   User interface     3.2        3.4         transitions    RO15                     P/F
006   Safety tests       3.6        3.8         safety_task    SA1                      P/F
007   Alarms             3.3        3.5         alarms_task    FU5+RE5                  P/F
008   Warnings           3.4        3.6         warnings       FU6                      P/F
009   Errors             3.5        3.7         errors_task    FU10                     P/F
010                                                                                     P/F
APPENDIX C
SOFTWARE VALIDATION TEST LOG

SOFTWARE VALIDATION TEST LOG

Location: ____________    Software Project: ____________    Date: ________    Page ____ of ____

Time    Test Number    Entry    References    Engineer
APPENDIX D
SOFTWARE VALIDATION TEST INFORMATION SHEET

SOFTWARE VALIDATION TEST INFORMATION SHEET

Test Category: ____________    Test Number: ____________
Requirement: ____________    Requirement Number: ____________

1. Objectives and success criteria
2. Test approach
3. Test instrumentation
4. Test duration
5. Data collection, reduction, and analysis requirements
6. Comments
7. Results
8. Signatures:
Test Conductor    Date

V&V Lead Engineer    Date
APPENDIX E-1
SOFTWARE ANOMALY REPORT

SOFTWARE ANOMALY REPORT

1. Date:                    2. Severity: H M L            3. Anomaly Report Number:
4. Title (briefly describe the problem):
5. System:                  6. Component:                 7. Version:
8. Originator:              9. Organization:              10. Telephone:
11. Approval:
12. Verification and Validation Task:                     13. Reference Document(s):
14. System Configuration:
15. Anomaly Description:
16. Problem Duplication:    During run Y N N/A    After restart Y N N/A    After reload Y N N/A
17. Source of Anomaly:
    PHASE: ❑ Requirements  ❑ Architecture Design  ❑ Detailed Design  ❑ Implementation  ❑ Undetermined
    TYPE:  ❑ Documentation  ❑ Software  ❑ Process  ❑ Methodology  ❑ Other  ❑ Undetermined
18. Investigation Time:
19. Proposed Solution:
20. Corrective Action Taken:                              Date:
21. Closure Sign-off:
    Software Lead Engineer    Date
    V&V Lead Engineer         Date
APPENDIX E-2
INSTRUCTIONS FOR COMPLETING SOFTWARE ANOMALY REPORT
2. Severity: Circle the appropriate code. High: The change is required to correct a condition that prevents or seriously degrades a system objective (where no alternative exists) or to correct a safety-related problem. Medium: The change is required to correct a condition that degrades a system objective, to provide for performance improvement, or to confirm that the user and system requirements can be met. Low: The change is desirable to maintain the system, to correct an operator inconvenience, or for other minor reasons.
3. Anomaly report number: Number assigned for control purposes.
4. Title: Brief phrase or sentence describing the problem.
5. System: Name of the system or product against which the anomaly report is written.
6. Component: Component or document name against which the anomaly report is written.
7. Version: Version of the document or code against which the anomaly report is written.
8. Originator: Printed name of the individual originating the anomaly report.
9. Organization: Organization of the originator of the anomaly report.
10. Telephone: Office phone number of the individual originating the anomaly report.
11. Approval: Software management individual or designee approval for anomaly report distribution.
12. V&V task name: Name of the V&V task being performed when the anomaly was detected.
13. Reference document: Designation of the documents that provide the basis for determining that an anomaly exists.
14. System configuration: Configuration loaded when the anomaly occurred; not applicable for documentation or logic errors.
15. Anomaly description: Description defining the anomaly and a word picture of events leading up to and coincident with the problem. Cite equipment being used, unusual configurations, environment parameters, and so forth, that will enable the programmer to duplicate the situation. If continuation sheets are required, fill in Page _ of _ at the top of the form.
16. Problem duplication: Duplication attempts, successes, or failures for software errors; not applicable for documentation or logic errors.
17. Source of anomaly: On investigation completion, the source of the anomaly in terms of phase of origination and type.
18. Investigation time: Time, to the nearest half hour, required to determine the cause of the anomaly but not the time to determine a potential solution or the time to implement the corrective action.
19. Proposed solution: Description defining in detail a solution to the detected anomaly, including documents, components, and code.
20. Corrective action taken: Disposition of the anomaly report, including a description of any changes initiated as a direct result of this report and the date incorporated.
21. Closure sign-off: Signature of the software lead engineer authorizing implementation of the corrective action. Signature of the V&V lead engineer verifying incorporation of the authorized changes as described in this report.
APPENDIX F
SOFTWARE VERIFICATION AND VALIDATION REPORT
SOFTWARE VERIFICATION AND VALIDATION REPORT (SVVR)

PREPARED BY:    [V&V lead engineer name and signature]         DATE: [mm/dd/yy]
REVIEWED BY:    [software lead engineer name and signature]    DATE: [mm/dd/yy]
APPROVED BY:    [title/position and signature]                 DATE: [mm/dd/yy]
SUBJECT:        [project/product name] Revision [#.#] SVVR
Document Number [######] Revision [L]
Software CRC [######], Checksum [######]

V&V Testing of the [project/product name] Revision [#.##] software has been completed with no outstanding safety anomalies. The quality of the software is deemed acceptable for release. The following [project/product name] software specifications and reports have been created and reviewed for this version of software and are located [enter controlled storage location name]:

• Interface Design Specification (IDS)    Revision [#]
• Software Requirements Specification (SRS)    Revision [#]
• Software Detailed Design Specification (SDDS)    Revision [#]
• Requirements Traceability Matrix (RTM)    Dated [date]
• Software Anomaly Reports    Dated [date]

Phase Summary Reports were generated for each phase [enter summary of V&V tasks performed]. The following tests and reviews were performed:

1. CRA Review Summary CRA [Mnnn-aaa-nnn]: Reviewed source code changes from the prior Revision [#.#]. Noted that this Revision [special notes]. The major changes included [summary of changes by file].

2. CRA Test Summary CRA [Mnnn-aaa-nnn]: [Summarize those functions that were tested and describe how each was tested and the relevant results.]

[OPTIONAL if not performed: 3. CRA Code Fix Test Summary CRA [Mnnn-aaa-nnn]: Inspected code change to fix [errant behavior] for Revisions [#.#].]

4. Chronological Test Record

• Requirements Phase    [Enter ending date]
• Architecture Design Phase    [Enter ending date]
• Detailed Design Phase    [Enter ending date]
• Code and Test, Integrate and Test Phase    [Enter ending date]
• Software Validation Phase    [Enter ending date]
[OPTIONAL: 5. Test Input Sources Other Than the SRS and SDDS
A. [Enter source]]

[OPTIONAL: 6. Process Recommendations
The following software process recommendations are provided as feedback for future software development efforts.
A. [Enter recommendation]]
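The cover of the SVVR records a software CRC and checksum. One plausible computation is sketched below in Python; the SOP does not fix the algorithms, so CRC-32 and a 16-bit byte sum are assumptions:

    import binascii

    def software_identity(image_path):
        """Compute a CRC-32 and a 16-bit byte-sum checksum of a software
        image for the SVVR cover fields."""
        with open(image_path, "rb") as image:
            data = image.read()
        crc = binascii.crc32(data) & 0xFFFFFFFF
        checksum = sum(data) & 0xFFFF
        return f"{crc:08X}", f"{checksum:04X}"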
APPENDIX G
SOFTWARE VERIFICATION AND VALIDATION RECORD OF DEVIATION OR WAIVER APPROVAL
SOFTWARE VERIFICATION AND VALIDATION RECORD OF DEVIATION OR WAIVER APPROVAL

PROJECT: ____________    TYPE: Deviation or Waiver    PHASE: ____________

SOP Requirement Paragraph(s):

Initiated by:    ________________________________    ________________________    Date: ________
                 Signature                           Title/Position

Reviewed by:     ________________________________    ________________________    Date: ________
                 Signature                           Title/Position

Approved by:     ________________________________    ________________________    Date: ________
                 Signature                           Title/Position
Reason/Rationale/Explanation:
Project schedule and performance impact:
Project risk:
Alternative approach to be used:
GLOSSARY

Active failure: Failure or combination of failures that are dangerous (see fault).

Black box testing: Method of testing in which the focus is on input-processing-output of the system under test, where the system is given input stimuli, performs processing, and produces outputs. The internals of the system are ignored.

Code and debug: Activity of software development during which a software product is created from design documentation and errors or malfunctions are removed from it.

Critical indicator: Indicator that displays a critical parameter.

Critical parameter: Parameter that may cause harm to the patient or user if it is incorrectly entered or displayed.

Fault: Defect of a system or system component, caused by a defective, missing, or extraneous instruction or set of related instructions in the definition, specification, design, or implementation of a system, that may lead to a failure.

Fault insertion testing: Software testing in which a fault condition is directly inserted in order to verify that the product can detect expected error conditions.

Functional testing: Testing designed to verify that all the functional requirements have been satisfied.

Hazard: Dangerous state of a device or system that may lead to death, injury, occupational illness, or damage to or loss of equipment or property.

Operational window: Period over which the probability of two or more independent failures combining to become dangerous is judged to be acceptably low and at the beginning of which tests are made to determine that specific passive failures are not present in the system.

Passive failure: Failure that is not dangerous in and of itself but that in combination with other failures is dangerous.

Regression testing: Selective retesting to detect faults that might have been introduced during modification, to verify that modifications have not caused unintended adverse effects, and to verify that a modified system or system component still meets its specified requirements.

Requirements testing: Testing that encompasses a rigorous development effort and includes the production of a requirements and design specification, systematically tracking the requirements to be tested and tracing them to the design specification and ultimately to where they were tested.

Robustness: Extent to which software can continue to operate correctly despite the introduction of invalid inputs.

Robustness testing: Testing designed to see how the software performs given unexpected inputs, by determining whether the software recovers from an unexpected input by issuing an error text or audio message, locks the system in an indeterminate state, or continues to operate in a manner that is unpredictable.

Safety-critical computer software components: Software components, such as processes, functions, values, or program states, whose errors, such as inadvertent or unauthorized occurrence, failure to occur when required, occurrence out of sequence, occurrence in combination with other functions, or erroneous values, can result in a potential hazard or loss of predictability or control of a system.

Safety testing: Testing designed to verify that the product performs in a safe manner and that a complete assessment of the safety design is accomplished.

Software critical components path: Identification of safety-critical computer software components, the components that use or create the safety-critical computer software components, and the tracing of the paths back to the origins.

Stress testing: Testing designed to determine how the product reacts to a stress condition in which the amount or rate of data exceeds the amount expected. Stress tests can help determine the margin of safety in the product.

System integration: Activity of software development during which the software and hardware are integrated and tested.

System critical time: Maximum period of time during which the system can operate in a hazardous state without causing harm to the patient or to the user. Active failures must be detected and acted upon within the system critical time.

System critical volume: Delivery of the system critical time equivalent fluid or more than 0.5 ml fluid before corrective action is taken under worst case failure conditions.

Validation: Process of evaluating software at the end of the software development process to ensure compliance with software requirements.

Verification: Process of determining whether the products of a given phase of the software development cycle fulfill the requirements established during the previous phase.

White box testing: Testing that scrutinizes the internal workings of the system, especially as it relates to required system functionality.
SE-VVP SOFTWARE ENGINEERING VERIFICATION AND VALIDATION POLICIES
Written by:     [Name/Title/Position]    Date
Reviewed by:    [Name/Title/Position]    Date
Approved by:    [Name/Title/Position]    Date
Approved by:    [Name/Title/Position]    Date
Document Number [aaa]-SEVVP-[#.#]    Revision [#.#]    Page 1 of [#]
REVISION HISTORY

Revision    Description               Date
[##.##]     [Revision description]    [mm/dd/yy]
CONTENTS
PREAMBLE
SE Software Verification and Validation Policies
4
POLICY 1
Software Verification and Validation Management
9
POLICY 2
Software Verification and Validation Plan
10
POLICY 3
Verification and Validation Reporting
12
POLICY 4
Anomaly Reporting and Resolution
13
POLICY 5
Verification and Validation Task Iteration
16
POLICY 6
Interface Design Phase Verification and Validation
17
POLICY 7
Requirements Phase Verification and Validation
18
POLICY 8
Requirements Traceability Matrix
20
POLICY 9
Software Architecture Design Phase Verification and Validation
22
POLICY 10
Software Validation Test Plan
24
POLICY 11
Validation Test Information Sheets
26
POLICY 12
Software Validation Test Procedures
27
POLICY 13
Software Detailed Design Phase Verification and Validation
28
POLICY 14
Code and Test Phase Verification and Validation
31
POLICY 15
Integration and Test Phase Verification and Validation
33
POLICY 16

Software Validation Phase Verification and Validation

36

POLICY 17

Software Configuration Audit Report

38

POLICY 18

Software Verification and Validation Report

39

GLOSSARY

40
PREAMBLE
SE SOFTWARE VERIFICATION AND VALIDATION POLICIES
Policy Software Engineering (SE) software projects shall comply with a set of Software Verification and Validation (V&V) Policies, which are established, maintained, and used to achieve quality in all phases of the software life cycle. Justified and necessary departures from these policies may be authorized in response to a written request. A permanent board shall be established by SE to control and maintain the SE Software V&V Policies.
Requirements

1. The SE Software V&V Policies (Figures 1 and 2) shall be applied to all SE software projects. Projects where effort will be expended in order to modify or enhance existing software are also subject to this requirement.

2. The SE Software V&V Policies shall be maintained by the Verification and Validation Policy Change Control Board (CCB). The chairman of this board shall be appointed by the SE [title/position] with the approval of the [title/position], and board members shall be appointed in writing by the SE [title/position]. The SE [title/position] shall serve as the secretary to the board and shall be responsible for scheduling board meetings and maintaining minutes of meetings and permanent files of CCB actions. Proposed changes to SE Software V&V Policies must be submitted in writing to the board. At least once each year, the board shall convene to review the policies in their totality for relevancy and currency. Where appropriate, they shall propose revisions to the policies subject to the review and approval of the SE [title/position]. After approval by the SE [title/position], the policies shall be approved by the [title/position] and [title/position].

3. Circumstances may require deviation(s) and waiver(s) from policy. A written request for a deviation shall be submitted by the project software V&V lead engineer in advance of a future activity, event, or product in order that SE management be made aware of the project’s intention to employ a higher risk approach to V&V. A written request for a waiver shall be submitted by the project software V&V lead engineer in those cases where the activity, event, or product has already been initiated. Deviations and waivers shall be reviewed by the project [project title/position] and submitted to the SE [title/position] for review. The SE [title/position] will make a recommendation to the [title/position] and/or [title/position] for approval or disapproval of the proposed deviation or waiver. A proposed deviation or waiver must be approved by the [title/position] and/or [title/position] before the verification and validation tasks affected by that deviation or waiver are begun.
4. Each request for a deviation or waiver shall identify the following:

a. Each specific policy or policy requirement to which it applies
b. The alternative policy approach to be taken by the project
c. The impact on project schedule, performance, and/or risk

5. A copy of each approved deviation and waiver shall be forwarded to the secretary of the Verification and Validation Policy CCB. A copy shall also be placed in the product history file.

6. These policies refer to and govern a set of SE software V&V procedures. The procedures are intended to provide detailed guidance within the framework and requirements provided by these policies. It is the responsibility of the cognizant project software V&V lead engineer to apply the existing relevant SE software V&V procedures. New SE software V&V procedures are to be submitted to the SE [title/position] prior to their use in order that they can be reviewed and approved.
Figure 1    SE Software Verification and Validation Policies

Policy Category                 Policy Topic Title                                          Policy Number

Project Management              Software Verification and Validation Management             1
                                Software Verification and Validation Plan (SVVP)           2
                                Verification and Validation Task Iteration                 5

Phase Verification and          Interface Design Phase Verification and Validation         6
Validation                      Software Requirements Phase Verification and Validation    7
                                Software Architecture Design Phase Verification and
                                Validation                                                  9
                                Software Detailed Design Phase Verification and
                                Validation                                                  13
                                Code and Test Phase Verification and Validation            14
                                Integration and Test Phase Verification and Validation     15
                                Software Validation Phase Verification and Validation      16

Validation Test and             Requirements Traceability Matrix (RTM)                      8
Operations                      Software Validation Test Plan (SVTP)                       10
                                Validation Test Information Sheet (VTIS)                   11
                                Software Validation Test Procedures (SVTPR)                12

Verification and Validation     Verification and Validation Reporting                       3
Reporting                       Anomaly Reporting and Resolution (SAR)                     4
                                Software Configuration Audit Report (SCAR)                 17
                                Software Verification and Validation Report (SVVR)         18
Figure 2    SE Software Verification and Validation Policies Throughout the Software Development Life Cycle

[Matrix: policy topic titles (rows) versus software life cycle phases (columns: Project Start-up, Interface Design, Requirements, Architecture Design, Detailed Design, Code and Test, Integrate and Test, Software Validation), with cell entries drawn from the codes D, U, E, and S defined in the notes below.]

Policy topic titles: Verification and Validation Management; Software Verification and Validation Plan (SVVP); Verification and Validation Task Iteration; Interface Design Phase Verification and Validation; Requirements Phase Verification and Validation; Architecture Design Phase Verification and Validation; Detailed Design Phase Verification and Validation; Code and Test Phase Verification and Validation; Integration and Test Phase Verification and Validation; Software Validation Phase Verification and Validation; Requirements Traceability Matrix (RTM); Software Validation Test Plan (SVTP); Validation Test Information Sheet (VTIS); Software Validation Test Procedures (SVTPR); Verification and Validation Reporting; Anomaly Reporting and Resolution; Software Configuration Audit Report (SCAR); Software Verification and Validation Report (SVVR).

Notes:
1. D indicates that a deliverable or activity is required at that time.
2. U indicates that an update of a previous deliverable occurs.
3. E indicates that the procedure requirements are in effect for the entire phase.
4. S indicates that the procedure requirements can start at any time.
7. A permanent record of deviation and waiver approvals shall be maintained for each project using the form depicted in the SE software V&V procedures. This record shall be initiated during development of the Product Objectives Document and shall serve as a record of all subject approvals for the duration of the project.
Responsibilities

The project software V&V lead engineer is responsible for the following:

1. Generating changes to SE Software V&V Policies
2. Generating written deviations and waivers
3. Generating changes to V&V procedures
4. Applying relevant V&V procedures to the project

The SE [title/position] is responsible for the following:

1. Review and approval or disapproval of SE Software V&V Policies
2. Review and recommendation of deviations and waivers from SE Software V&V Policies
3. Review and approval or disapproval of V&V procedures

The [title/position] and/or [title/position] is responsible for the following:

1. Approval of SE Software V&V Policies
2. Approval of deviations and waivers from SE Software V&V Policies

The project [project title/position] is responsible for the review and submittal of deviations and waivers from SE Software V&V Policies. The managers of organizations supporting and sponsoring the project should share the commitment to the implementation of these policies.
Figure 3    Matrix of Responsibilities for Software Verification and Validation Policy Documents

[Matrix: document titles (rows) versus responsible parties (columns: V&VE1, V&VLE2, SLE3, System Engineer, Project Engineer, Director SE), with cell entries drawn from the codes defined in the notes below.]

Document titles: Software verification and validation procedures; Software development deviation; Software development waiver; Software Verification and Validation Plan (SVVP); Verification and Validation Reports; Software Trouble Report (low); Software Trouble Report (medium); Software Trouble Report (high); Interface Design Phase Task Summary Report; Requirements Phase Task Summary Report; Requirements Traceability Matrix (RTM); Architecture Design Phase Task Summary Report; Software Validation Test Plan (SVTP); Verification Test Information Sheets (VTIS); Software Validation Test Procedures (SVTPR); Detailed Design Phase Task Summary Report; Code and Test Phase Task Summary Report; Integrate and Test Phase Task Summary Report; Software Configuration Audit Report (SCAR); Software Verification and Validation Report (SVVR).

Notes:
1. A senior software V&V engineer assigned to the project.
2. The project V&V lead engineer assigned to the project.
3. The project software lead engineer assigned to the project.
4. G means generate.
5. G/U means generate and update.
6. G/D means generate and disposition.
7. R means review.
8. R/D means review and disposition.
9. R/S means review and submit.
10. R/R means review and recommend.
POLICY 1 SOFTWARE VERIFICATION AND VALIDATION MANAGEMENT
Policy SE software projects shall provide the verification and validation required to ensure an independent assessment and measurement of the correctness, accuracy, consistency, completeness, robustness, and testability of the software requirements, design, and implementation. The methodology and procedures for achieving these project-specific activities shall be described in a Software Verification and Validation Plan (SVVP), which is generated by the project software V&V lead engineer, reviewed by the cognizant software lead engineer, and approved by the SE [title/position].
Requirements

1. Management of the functions and tasks of V&V for each SE software project shall be the responsibility of the project software V&V lead engineer, who shall be responsible for making decisions regarding the performance of V&V, assigning priorities to V&V tasks, estimating the level of effort for a task, tracking the progress of work, determining the need for V&V task iteration or initiation of new V&V tasks, and assuring adherence to SE standards in all V&V efforts. The authority for resolving issues raised by V&V tasks and the approval or disapproval of V&V products shall reside with the SE [title/position]. A non-project-related person shall be assigned the responsibility for the V&V tasks with the mutual approval of the project software lead engineer and the SE [title/position]. This person shall report operationally to the project software lead engineer and shall receive functional direction from the SE [title/position].

2. The project shall prepare and maintain an SVVP in compliance with the relevant SE V&V procedures. The SVVP shall describe the project’s V&V organization, activities, schedule, and inputs and outputs and any deviation from these policies required for effective management of V&V tasks. The SVVP shall be prepared within one month of project start-up by the project software V&V lead engineer, reviewed by the project software lead engineer, and approved by the SE [title/position].

3. The requirement for reperformance of previous V&V tasks or initiation of new V&V tasks to address software changes for an SE software project shall be determined by the project software V&V lead engineer, by continuous review of V&V efforts, technical accomplishments, resource utilization, future planning, and risk assessment.
4. Periodic reviews of the V&V effort shall be conducted for the SE [title/position]. An evaluation of the technical quality and results of the outputs of V&V tasks, based on these periodic reviews, will be given to the SE [title/position]. Periodic reviews and V&V task evaluation are performed in order to support the recommendation by the project software V&V lead engineer to proceed or not proceed to the next software development phase and to define changes to V&V tasks described in the SVVP in order to improve the V&V effort.
Responsibilities

In addition to the responsibilities identified in the preamble to these policies, the following responsibilities are prescribed by this policy:

1. The project software V&V lead engineer shall be responsible for the following:

a. Generating, maintaining, and obtaining approval of the SVVP
b. Management of the functions and tasks of V&V for an SE software project
c. Evaluating the need for changes to V&V tasks to improve the V&V effort
d. Making a recommendation to the SE [title/position] to proceed or not proceed to the next software development phase

2. The project software lead engineer shall be responsible for the review of the SVVP.

3. The SE [title/position] shall be responsible for the following:

a. Providing functional review and guidance to project personnel performing V&V tasks
b. Resolving issues raised by V&V tasks
c. Reviewing and approving the SVVP
d. Evaluating the technical quality and results of V&V tasks in order to support the decision to proceed or not proceed to the next phase of software development and define improvements to the V&V effort
POLICY 2 SOFTWARE VERIFICATION AND VALIDATION PLAN
Policy SE software projects shall prepare a Software Verification and Validation Plan (SVVP) that identifies and describes the plan for V&V of the software end products. The SVVP shall define the
V&V program to be applied throughout all the phases of software development. The SVVP shall be prepared within one month of project start-up by the project software V&V lead engineer, reviewed by the project software lead engineer, and approved by the SE [title/position].
Requirements

1. The project software V&V lead engineer is responsible for the generation of the SVVP, which shall be produced in the format specified in the relevant SE V&V procedures. The SVVP shall be reviewed by the project software lead engineer and approved by the SE [title/position] within one month of project start-up.

2. The contents of the SVVP shall specify the following:

a. Organization of the V&V effort, lines of communication within the V&V effort, and the relationship of V&V to the other software project efforts, such as development, program management, and quality assurance
b. The schedule of V&V tasks
c. Tools, techniques, and methodologies employed by the V&V effort, including plans for acquisition, training, and support
d. Resources needed to perform the V&V tasks, including staffing, facilities, tools, and special procedural requirements
e. Organizational elements or individuals responsible for performing the V&V tasks
f. Risks and assumptions associated with the V&V tasks, including schedule, resources, and approach
g. Identification of all V&V deviations

3. The maintenance of the SVVP shall be the responsibility of the project software V&V lead engineer. SVVP updates required to correlate V&V tasks with software evolution shall be completed and approved prior to task implementation.
Responsibilities

In addition to the responsibilities identified in the preamble to these policies, the following responsibilities are prescribed by this policy:

1. The project software V&V lead engineer shall be responsible for generating, maintaining, and obtaining approval of the SVVP.
2. The project software lead engineer shall be responsible for reviewing the SVVP.
3. The SE [title/position] shall be responsible for approving the SVVP.
POLICY 3
VERIFICATION AND VALIDATION REPORTING
Policy The results of implementing V&V shall be documented in V&V reports. V&V reporting shall occur throughout all phases of software development. The project software V&V lead engineer is responsible for the generation and distribution of V&V reports.
Requirements

1. The results of V&V tasks shall be documented in a V&V task report, which identifies the following:

a. The V&V phase at which the task was conducted
b. Responsible V&V personnel
c. Responsible software development team member(s)
d. Interim results and status
e. Recommended corrective action

2. At the conclusion of each V&V phase, the appropriate Phase V&V task summary report shall be generated. The V&V task summary report summarizes the results of V&V performed during the corresponding software development phase. This summary report shall contain, as a minimum, the following:

a. Description of the V&V tasks performed
b. Summary of task results
c. Summary of anomalies and implemented resolutions
d. Assessment of software quality
e. Recommendations

3. All inputs and outputs of the V&V effort for each project shall be controlled and maintained in accordance with the procedures defined in the project’s Software Configuration Management Plan (SCMP). V&V document change control and configuration status accounting shall be implemented to ensure that the validity of V&V results is protected from accidental or unauthorized alteration. All V&V inputs and outputs shall be archived to provide project history for use in future software development planning.
4. The project software V&V lead engineer is responsible for the generation and distribution of V&V reports. The format, timing, and distribution of V&V reports shall be in accordance with the project’s Software Verification and Validation Plan (SVVP).
Responsibilities

In addition to the responsibilities identified in the preamble to these policies, the following responsibilities are prescribed by this policy:

1. The project software V&V lead engineer shall be responsible for generating and distributing V&V reports in accordance with the SVVP.
2. The project software lead engineer shall be responsible for reviewing the V&V reports.
3. The SE [title/position] shall be responsible for reviewing the V&V reports.
POLICY 4 ANOMALY REPORTING AND RESOLUTION
Policy The project software V&V lead engineer shall be responsible for the proper documentation and reporting of software anomalies on a Software Anomaly Report. All anomalies shall be reported regardless of the perceived impact on software development or the severity of the anomaly with respect to system operation. Software Anomaly Reports shall be reviewed by the project software lead engineer for anomaly solution determination and implementation authorization. The project software V&V lead engineer shall be responsible for anomaly report closure. The SE [title/position] shall be responsible for the approval or disapproval of the distribution of Software Anomaly Reports.
Requirements 1. A Software Anomaly Report shall be used to identify problems detected during V&V activities. Specific information required includes the following:
a. Description and location of the anomaly
b. Severity of the anomaly
c. Cause and method of identifying the anomalous behavior
d. Recommended action and actions taken to correct the anomalous behavior
e. Impact of the problem on the system capability of the product and on the continued conduct of V&V phase activities
2. The form of the Software Anomaly Report shall be as defined in the relevant SE V&V procedures. The configuration identification, tracking, and status reporting of Software Anomaly Reports shall be in accordance with the project’s Software Configuration Management Plan (SCMP). 3. The projected impact of an anomaly shall be determined by evaluating the severity of its effect on the operation of the system. The severity of a Software Anomaly Report shall be defined as one of the following (an illustrative sketch follows this list):
• High. The change is required to correct a condition that prevents or seriously degrades a system objective where no alternative exists or to correct a safety-related problem.
• Medium. The change is required to correct a condition that degrades a system objective, to provide for performance improvement, or to confirm that the user or system requirements can be met.
• Low. The change is desirable to maintain the system, to correct an operator inconvenience, or to make any other minor improvement.
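The three severity levels, together with the phase-gate rule in requirement 9 below, can be expressed as a simple enumeration and check. This is a minimal sketch under the same assumptions as the record above.

```python
from enum import Enum

class Severity(Enum):
    HIGH = "high"      # prevents or seriously degrades a system objective, or safety related
    MEDIUM = "medium"  # degrades a system objective, performance improvement, or confirmation
    LOW = "low"        # maintenance, operator convenience, or other minor improvement

def may_proceed_to_next_phase(open_reports) -> bool:
    """Requirement 9: every 'high' severity anomaly must be resolved before
    the project proceeds to the next software development phase."""
    return not any(r.severity == Severity.HIGH.value for r in open_reports)
```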
4. The project software V&V lead engineer shall be responsible for the proper documentation and reporting of software anomalies. All anomalies shall be reported, regardless of the perceived impact on software development or the severity of the anomaly with respect to the system operation. 5. Software Anomaly Reports shall be reviewed by the project software lead engineer for anomaly validity, type, and severity. The project software lead engineer can direct additional investigation if required to assess the validity of the anomaly or the proposed solution. An anomaly solution that does not require a change to a baselined software configuration item may be approved by the project software lead engineer. If the anomaly requires a change to a baselined software configuration item, then the anomaly solution shall be approved in accordance with the project’s SCMP. 6. When an anomaly solution is approved and the personnel responsible for performing the corrective action are indicated, the project software lead engineer shall authorize implementation of the corrective action. 7. The project software V&V lead engineer shall be responsible for anomaly report closure, which includes the following: a. Documenting the corrective action(s) taken
b. Verifying the incorporation of authorized changes as described in the anomaly report c. Reporting the status of the Software Anomaly Report to the project software lead engineer and the SE [title/position] 8. The SE [title/position] shall be responsible for approval or disapproval of the distribution of Software Anomaly Reports that are closed. Upon approval, the project software V&V lead engineer shall distribute closed Software Anomaly Reports to the software project quality assurance representative(s). 9. The SE [title/position] shall ensure the resolution of anomalies that are indicated on the Software Anomaly Report with a severity of “high” before the software project proceeds to the next software development phase.
Responsibilities In addition to the responsibilities identified in the preamble to these policies, the following responsibilities are prescribed by this policy: 1. The project software V&V lead engineer shall be responsible for the following: a. Proper documentation and reporting of software anomalies b. Anomaly report closure c. Distribution of closed Software Anomaly Reports to the software project quality assurance representative(s) 2. The project software lead engineer shall be responsible for the following: a. Review of Software Anomaly Reports for anomaly validity, type, and severity b. Directing additional investigation, if required, to assess the validity of the anomaly or the proposed solution c. Approval or disapproval of an anomaly solution that does not require a change to a baselined software configuration item d. Authorization of corrective action implementation 3. The SE [title/position] shall be responsible for the following: a. Approval or disapproval of the distribution of closed Software Anomaly Reports b. Ensuring the resolution of anomalies that are indicated on the Software Anomaly Report with a severity of “high” before the software project proceeds to the next software development phase
POLICY 5
VERIFICATION AND VALIDATION TASK ITERATION
Policy The effects of software changes on previously completed V&V tasks or future V&V tasks shall be analyzed in order to determine the requirement for reperformance of previous V&V tasks or initiation of new V&V tasks. V&V task iteration requirements shall be established by the project software V&V lead engineer and approved or disapproved by the SE [title/position].
Requirements 1. The project software V&V lead engineer shall be responsible for the continuous review of V&V efforts, technical accomplishments, resource utilization, future planning, and risk assessment in order to establish V&V task iteration requirements. 2. V&V tasks that uncover significant problems and/or tasks for which a significant part of the defined activity was not completed shall be candidates for V&V task iteration. 3. When the source information to the V&V task has undergone significant changes to the representation of system or software requirements, these V&V tasks shall be candidates for V&V task iteration. 4. Required iteration of V&V tasks shall be determined by the project software V&V lead engineer through assessments of change, criticality, and quality effects. The project software V&V lead engineer shall document required iteration of V&V tasks as an update to the project’s Software Verification and Validation Plan (SVVP) in accordance with the project’s Software Configuration Management Plan (SCMP). The SE [title/position] shall approve or disapprove V&V task iterations.
Responsibilities In addition to the responsibilities identified in the preamble to these policies, the following responsibilities are prescribed by this policy: 1. The project software V&V lead engineer shall be responsible for the following: a. Continuous review of V&V efforts, technical accomplishments, resource utilization, future planning, and risk assessment
b. Establishing V&V task iteration requirements 2. The SE [title/position] shall be responsible for the approval or disapproval of V&V task iterations.
POLICY 6
INTERFACE DESIGN PHASE VERIFICATION AND VALIDATION
Policy V&V tasks shall be performed to ensure that the decomposition of the system into subsystems and the interfaces among the subsystems are completely specified. The results of Interface Design Phase V&V tasks shall be documented in an Interface Design Phase V&V task summary report.
Requirements 1. Review of the Interface Design Specification (IDS) shall be performed prior to document approval in order to provide feedback to the development process and to support software project management. The IDS shall be evaluated for consistency, completeness, accuracy, and testability. Review of the IDS shall ensure the following: a. Decomposition of the system into subsystems is complete, and each subsystem specified is traceable to the system level. b. All interfaces are completely and accurately defined. 2. The project software V&V lead engineer shall be responsible for ensuring that all V&V tasks are performed and are documented in a V&V task report. These reports shall be submitted by the project software V&V lead engineer to the project software lead engineer and SE [title/position] for review and initiation of appropriate corrective action. 3. The Interface Design Phase V&V task summary report shall be generated by the project software V&V lead engineer. The task summary report summarizes the results of the V&V performed and provides an assessment of the quality of progress and recommendations. The V&V summary report shall be distributed to the project software lead engineer, SE [title/position], and the software project quality assurance representative(s).
Responsibilities In addition to the responsibilities identified in the preamble to these policies, the following responsibilities are prescribed by this policy: 1. The project software V&V lead engineer shall be responsible for the following: a. Ensuring that all V&V tasks performed are documented in a V&V task report and submitted to the project software lead engineer and SE [title/position] b. Generating the Interface Design Phase V&V task summary report and distributing it to the project software lead engineer, SE [title/position], and software project quality assurance representative(s) 2. The project software lead engineer shall be responsible for review of the V&V task reports and initiation of appropriate corrective action. 3. The SE [title/position] shall be responsible for review of the V&V task reports and initiation of appropriate corrective action.
POLICY 7 REQUIREMENTS PHASE VERIFICATION AND VALIDATION
Policy The Requirements Phase V&V shall ensure that both the problem and the constraints upon the solution are specified in a rigorous form. Evaluation and analysis of the Software Requirements Specification (SRS) shall ensure the correctness, consistency, completeness, accuracy, and testability of the functional, performance, and external interface requirements for the software end product. The results of Requirements Phase V&V tasks shall be documented in a Requirements Phase V&V task summary report.
Requirements 1. Those product requirements documents that specify the system performance, user interface, and critical system components shall be reviewed for use in V&V planning and in defining the level of effort required to successfully verify and validate the software end
product. Product requirements documentation for the software end product shall be provided to the project software V&V lead engineer by the project software lead engineer prior to development of the SRS. 2. During the review of the product requirements documentation, software requirements shall be documented in the Requirements Traceability Matrix (RTM), with a reference to the document that specified them. Generation and maintenance of the RTM shall be the responsibility of the project software V&V lead engineer. 3. The SRS shall be provided to the project software V&V lead engineer by the project software lead engineer prior to the Software Requirements Review (SRR). The analysis and evaluation of the SRS shall ensure compliance with the requirements defined in the SE Software Development Policies for the SRS. Verification of the SRS shall ensure the testability of all requirements specified and compliance of the SRS with established SE software development procedures. An assessment shall also be made of how well the SRS satisfies system objectives, including system safety considerations identified in the Hazards Analysis, which are controlled and/or commanded by software. 4. The RTM shall be updated to document the tracing of the software and interface requirements specified in the SRS to requirements defined in the product requirements documentation. The project software V&V lead engineer shall be responsible for ensuring the accuracy and completeness of RTM updates. 5. The project software V&V lead engineer shall be responsible for ensuring that all V&V tasks are performed and are documented in a V&V task report. These reports shall be submitted by the project software V&V lead engineer to the project software lead engineer and SE [title/position] for review and initiation of appropriate corrective action. 6. The project software V&V lead engineer shall be responsible for supporting the project software lead engineer in preparation for the SRR. All V&V outputs generated during the Requirements Phase shall be made available to the project software lead engineer prior to SRR in order to support management in determining the adequacy, correctness, and testability of the stated software and interface requirements. 7. The Requirements Phase V&V task summary report shall be generated by the project software V&V lead engineer. The task summary report summarizes the results of the V&V performed and provides an assessment of the quality of progress and recommendations. The V&V summary report shall be distributed to the project software lead engineer, SE [title/position], and the software project quality assurance representative(s).
Responsibilities In addition to the responsibilities identified in the preamble to these policies, the following responsibilities are prescribed by this policy: 1. The project software V&V lead engineer shall be responsible for the following: a. Generating and ensuring the accuracy and completeness of updates to the RTM b. Ensuring that all V&V tasks performed are documented in a V&V task report and submitted to the project software lead engineer and SE [title/position] c. Supporting the project software lead engineer in preparation for the SRR d. Generating the Requirements Phase V&V task summary report and distributing it to the project software lead engineer, SE [title/position], and software project quality assurance representative(s) 2. The project software lead engineer shall be responsible for the following: a. Providing the project software V&V lead engineer with the product requirements documentation prior to development of the SRS b. Providing the project software V&V lead engineer with the SRS prior to the SRR c. Review of the RTM d. Review of V&V task reports and initiation of appropriate corrective action 3. The SE [title/position] shall be responsible for the following: a. Review of the RTM b. Review of V&V task reports and initiation of appropriate corrective action
POLICY 8 REQUIREMENTS TRACEABILITY MATRIX
Policy SE software projects shall prepare a Requirements Traceability Matrix (RTM), which documents the development of the software end products from requirements specification through software validation. The RTM shall be generated during the Requirements Phase of software development and updated at each subsequent phase. All software requirements shall be traced on the RTM from the Software Requirements Specification (SRS) to the code and subsequent testing. The RTM shall be included in the Software Verification and Validation Report (SVVR).
Requirements 1. The RTM shall accurately reflect the documented software end product requirements, design, and validation. 2. The RTM shall be produced in the format specified in the relevant SE V&V procedures. The requirements of the software end products shall be identified on the RTM and assigned a requirement number for reference. For each requirement number in the RTM, the following information shall be provided (an illustrative sketch follows these requirements):
a. A description of the requirement
b. The high-level RTM requirement number used to identify the source of a low-level requirement generated during top-down software design
c. Paragraph number(s) of the document that specifies the requirement
d. Paragraph number of the design document that specifies the requirement
e. Component name(s) implementing the requirement
f. Test(s) required to verify the requirement
g. Result(s) of testing
3. The RTM shall be developed for use in the following: a. Evaluating subsequent requirements and design documents b. Developing test events and test data c. Documenting the validation of software end products 4. The RTM shall be generated during the Requirements Phase of software development and updated at each subsequent phase. 5. The RTM shall be reviewed by V&V for inconsistencies in the refinement of requirements, incomplete definition of requirements in lower level specifications and code, and incomplete specification of testing for requirements. Discrepancies found during the review shall be documented in the appropriate phase V&V task summary report. 6. The project software V&V lead engineer is responsible for the generation and maintenance of the RTM. The appropriate phase V&V task summary report shall include the updated RTM. 7. The RTM shall be reviewed by the project software lead engineer and the SE [title/position] at each phase of the software project for initiation of appropriate corrective action.
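The RTM content enumerated in requirement 2 is, in effect, one row per requirement number. The sketch below is illustrative; the column names are assumptions, and the controlling format is the one specified in the relevant SE V&V procedures.

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class RTMEntry:
    """One illustrative row of the Requirements Traceability Matrix."""
    req_number: str                                           # assigned RTM requirement number
    description: str                                          # a. description of the requirement
    parent_req: Optional[str] = None                          # b. high-level number sourcing a derived low-level requirement
    spec_paragraphs: List[str] = field(default_factory=list)  # c. paragraph(s) specifying the requirement
    design_paragraph: str = ""                                # d. design document paragraph
    components: List[str] = field(default_factory=list)       # e. implementing component name(s)
    tests: List[str] = field(default_factory=list)            # f. verifying test(s)
    test_results: List[str] = field(default_factory=list)     # g. result(s) of testing

def untested_requirements(rtm: List[RTMEntry]) -> List[str]:
    """Flag incomplete specification of testing, one of the V&V review
    criteria in requirement 5."""
    return [entry.req_number for entry in rtm if not entry.tests]
```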
Responsibilities In addition to the responsibilities identified in the preamble to these policies, the following responsibilities are prescribed by this policy: 1. The project software V&V lead engineer shall be responsible for the generation and maintenance of the RTM. 2. The project software lead engineer shall be responsible for review of the RTM at each development phase and initiation of appropriate corrective action. 3. The SE [title/position] shall be responsible for review of the RTM at each development phase and initiation of appropriate corrective action.
POLICY 9 SOFTWARE ARCHITECTURE DESIGN PHASE VERIFICATION AND VALIDATION
Policy The Software Architecture Design Phase V&V shall ensure that the preliminary software design establishes the design baseline from which the detailed design will be developed. Evaluation and analysis of the Software Architecture Design Specification (SADS) shall ensure the internal consistency, completeness, correctness, and clarity of the information needed to support the detailed definition of the individual software components. A plan for software validation of the software end product shall be generated for approval prior to completion of the Software Architecture Design Review (SADR). The results of Software Architecture Design Phase V&V tasks shall be documented in a Software Architecture Design Phase V&V task summary report.
Requirements 1. The SADS shall be provided to the project software V&V lead engineer by the project software lead engineer prior to the SADR. The analysis and evaluation of the SADS shall ensure compliance with the requirements defined in the SE Software Development Policies for the SADS. Verification of the SADS shall also ensure the following: a. The robustness and testability of the design
b. Compliance with established SE software development procedures 2. The generation of a plan for validation of the software end product shall be accomplished concurrent with design analysis. This test document, the Software Validation Test Plan (SVTP), shall be generated by the project software V&V lead engineer, reviewed by the project software lead engineer, and approved by the SE [title/position] prior to completion of SADR. 3. The Software Test Plan (STP) shall be provided to the project software V&V lead engineer by the project software lead engineer prior to the SADR. The STP shall be reviewed for the following: a. Completeness, correctness, and consistency of the software testing required for each software component b. Compliance with the requirements specified in the SE Software Development Policies and the established SE software development procedures 4. The project software V&V lead engineer shall be responsible for ensuring the accuracy and completeness of Requirements Traceability Matrix (RTM) updates. The RTM shall be updated to: a. Document the tracing of the software design to the requirements in the Software Requirements Specification (SRS) b. Provide a cross-reference of each software requirement to the test(s) described in the STP and SVTP 5. A review of the preliminary version of the User’s Manual shall be performed. Evaluation of the User’s Manual shall ensure compliance with the requirements defined in the SE Software Development Policies and SE software development procedures for the User’s Manual. 6. The project software V&V lead engineer shall be responsible for ensuring that all V&V tasks are performed and are documented in a V&V task report. These reports shall be submitted by the project software V&V lead engineer to the project software lead engineer and SE [title/position] for review and initiation of appropriate corrective action. 7. The project software V&V lead engineer shall be responsible for supporting the project software lead engineer in preparation for the SADR. All V&V outputs generated during the Software Architecture Design Phase shall be made available to the project software lead engineer prior to the SADR in order to support management in establishing the compatibility, reliability, and testability of stated software and interface design. 8. The Software Architecture Design Phase V&V task summary report shall be generated by the project software V&V lead engineer. The task summary report summarizes the results of the V&V performed and provides an assessment of the quality of progress and recommendations. The V&V summary report shall be distributed to the project software lead engineer, SE [title/position], and project quality assurance representative(s).
Responsibilities In addition to the responsibilities identified in the preamble to these policies, the following responsibilities are prescribed by this policy: 1. The project software V&V lead engineer shall be responsible for the following: a. Ensuring the accuracy and completeness of RTM updates b. Ensuring that all V&V tasks performed are documented in a V&V task report and submitted to the project software lead engineer and SE [title/position] c. Generating and obtaining approval of the SVTP d. Supporting the project software lead engineer in preparation for the SADR e. Generating the Software Architecture Design Phase V&V task summary report and distributing it to the project software lead engineer, SE [title/position], and project quality assurance representative(s) 2. The project software lead engineer shall be responsible for the following: a. Providing the project software V&V lead engineer with the SADS and STP prior to the SADR b. Review of the SVTP c. Review of the RTM d. Review of V&V task reports and initiation of appropriate corrective action 3. The SE [title/position] shall be responsible for the following: a. Approval of the SVTP b. Review of the RTM c. Review of V&V task reports and initiation of appropriate corrective action
POLICY 10
SOFTWARE VALIDATION TEST PLAN
Policy SE software projects shall prepare a Software Validation Test Plan (SVTP) that defines the testing required to verify that the software end products satisfy the requirements of the Software Requirements Specification (SRS). The generation and maintenance of the SVTP shall be the responsibility of the project software V&V lead engineer. The SVTP shall be reviewed by the project software lead engineer and approved by the SE [title/position] prior to the completion of the Software Architecture Design Review (SADR).
Requirements 1. The project software V&V lead engineer is responsible for the generation of the SVTP. The SVTP shall be produced in the format specified in the relevant SE V&V procedures. The SVTP shall be reviewed by the project software lead engineer and approved by the SE [title/position] prior to completion of the SADR. 2. The SVTP shall define the methods for verifying the following:
a. Correct implementation of software requirements
b. Software system capabilities
c. Throughput and timing requirements
d. Safety design requirements
e. Correct interface to the system environment
3. The SVTP shall specify tests to measure the following: a. Compliance of the complete software product with all functional requirements while operating in all system environments b. Performance of hardware, software, and user interfaces c. Performance at software limits and under stress conditions d. Compliance with safety design requirements e. Performance in terms of accuracy, response times, storage, input and output rates, and margins for growth 4. The SVTP shall define the objectives, test methods, and system environments for each test to be performed during the Software Validation Phase. 5. The SVTP shall contain a test validation matrix that correlates the SRS requirements to the type of test verification method(s) to be utilized and the test level(s) in which the requirements will be tested (an illustrative sketch follows these requirements). 6. The SVTP shall define the procedures for collecting, reviewing, and evaluating test results.
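The test validation matrix of requirement 5 is a mapping from each SRS requirement to verification method(s) and test level(s). The sketch below is illustrative; the method and level vocabularies are assumptions, not terms prescribed by this policy.

```python
from typing import Dict, List, Tuple

# Hypothetical vocabularies; actual values come from the SE V&V procedures.
METHODS = {"inspection", "analysis", "demonstration", "test"}
LEVELS = {"component", "integration", "system"}

# Maps an SRS requirement number to (verification methods, test levels).
ValidationMatrix = Dict[str, Tuple[List[str], List[str]]]

example_matrix: ValidationMatrix = {
    "SRS-042": (["test"], ["integration", "system"]),
    "SRS-043": (["analysis", "test"], ["system"]),
}

def uncovered(srs_requirements: List[str], matrix: ValidationMatrix) -> List[str]:
    """Requirements with no planned verification method or test level."""
    return [r for r in srs_requirements if r not in matrix or not all(matrix[r])]
```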
Responsibilities In addition to the responsibilities identified in the preamble to these policies, the following responsibilities are prescribed by this policy: 1. The project software V&V lead engineer shall be responsible for generating and obtaining approval of the SVTP. 2. The project software lead engineer shall be responsible for reviewing the SVTP. 3. The SE [title/position] shall be responsible for approving the SVTP.
POLICY 11
VALIDATION TEST INFORMATION SHEETS
Policy SE software projects shall prepare and maintain Validation Test Information Sheets (VTISs) for each test conducted during the Software Validation Phase. The VTISs shall provide (1) an organized, accessible collection of all testing and test results; (2) a means of tracking the progression and status of testing; and (3) a means of test verification. A VTIS shall be prepared for each test defined in the Software Validation Test Plan (SVTP), shall be prepared prior to the Software Detailed Design Review (SDDR), and shall be maintained and included as a part of the Software Verification and Validation Report (SVVR). Completed VTISs shall be reviewed for completeness and technical adequacy of the testing conducted and periodically audited to assess compliance with relevant SE V&V procedures. The VTISs shall be generated by the project software V&V engineer and approved by the project software V&V lead engineer upon completion.
Requirements 1. The project V&V engineer is responsible for the generation of the VTISs. The VTIS shall be produced in the format specified in the relevant SE V&V procedures and shall define the following (an illustrative sketch follows these requirements):
a. Title of the test to be conducted
b. Objectives of the test and the success criteria
c. Requirements to be tested, including each requirement’s unique identifier
d. Specification title where the requirement was defined
e. Test approach to the depth necessary to establish a baseline for resource requirements
f. Required test instrumentation
g. Test phasing, scheduling, and duration
h. Data collection, reduction, and analysis requirements
2. The VTIS for each test defined in the SVTP shall be prepared prior to SDDR. Review for approval or disapproval of the software test program at SDDR shall include an assessment of the adequacy of the test methods and test limits defined in the VTISs. 3. The VTIS shall be used as a basis for development of the Software Validation Test Procedures (SVTPR). The results of testing described by the VTIS shall be documented on the VTIS. The test conductor shall sign and date the completed VTIS. The project
software V&V lead engineer shall review the VTIS for completeness and technical adequacy of the testing conducted and, when satisfied, shall sign and date the VTIS. 4. Periodic audits of VTISs shall be conducted to assess compliance with relevant SE V&V procedures. Problems detected in these audits shall be identified in a written summary, which shall be attached to the VTIS and copies given to the SE [title/position] and project software V&V lead engineer. 5. Completed SVTP VTISs shall be maintained until included as a part of the Software Verification and Validation Report.
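The VTIS content of requirement 1, together with the sign-off flow of requirement 3, again suggests a structured record. A minimal sketch under assumed names:

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class ValidationTestInformationSheet:
    """Illustrative VTIS; the controlling format is defined in the
    relevant SE V&V procedures."""
    title: str                                   # a. title of the test to be conducted
    objectives: str                              # b. objectives and success criteria
    requirements: List[str]                      # c. unique identifiers of requirements tested
    source_specification: str                    # d. specification where the requirement is defined
    approach: str                                # e. approach, to baseline resource requirements
    instrumentation: List[str] = field(default_factory=list)  # f. required test instrumentation
    schedule: str = ""                           # g. phasing, scheduling, and duration
    data_requirements: str = ""                  # h. collection, reduction, and analysis
    results: str = ""                            # documented on the VTIS at completion
    conductor_signoff: Optional[str] = None      # test conductor signs and dates
    vv_lead_signoff: Optional[str] = None        # V&V lead signs and dates when satisfied
```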
Responsibilities In addition to the responsibilities identified in the preamble to these policies, the following responsibilities are prescribed by this policy: 1. The project V&V engineer shall be responsible for generating the VTISs. 2. The project software V&V lead engineer shall be responsible for the review and approval or disapproval of the VTISs. 3. The project software lead engineer shall be responsible for reviewing the VTISs. 4. The SE [title/position] shall be responsible for reviewing the VTISs.
POLICY 12
SOFTWARE VALIDATION TEST PROCEDURES
Policy SE software projects shall prepare a Software Validation Test Procedures (SVTPR) document, which describes the detailed procedures for performing the testing defined in the Software Validation Test Plan (SVTP). The generation and maintenance of the SVTPR shall be the responsibility of the project software V&V lead engineer. The SVTPR shall be reviewed by the project software lead engineer and approved by the SE [title/position] prior to test execution.
Requirements 1. The project software V&V lead engineer is responsible for the generation of the SVTPR. The SVTPR shall be produced in the format specified in the relevant SE V&V procedures. The SVTPR shall be reviewed by the project software lead engineer and approved by the SE [title/position] prior to test execution. 2. The SVTPR shall be developed using the test information defined in the Validation Test Information Sheets (VTISs). In addition, the SVTPR shall specify the following (an illustrative sketch follows this list):
a. Steps for executing the set of tests defined in the SVTP
b. Requirements for logging test activities
c. Criteria for procedure stop and restart
d. Methods of collecting and analyzing test data
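Items a through d above amount to an ordered execution loop with logging and a stop criterion. The following sketch is illustrative only; none of its names or behavior is prescribed by this policy.

```python
import logging
from typing import Callable, List

logging.basicConfig(level=logging.INFO)  # b. requirement for logging test activities
log = logging.getLogger("svtpr")

def run_procedure(steps: List[Callable[[], bool]], stop_on_failure: bool = True) -> List[bool]:
    """Execute SVTPR steps in order, logging each outcome (a, b).

    A False return models a stop criterion (c); restart would resume from
    the failed step once the anomaly is dispositioned. The collected
    results feed the analysis of test data (d).
    """
    results: List[bool] = []
    for number, step in enumerate(steps, start=1):
        passed = step()
        log.info("step %d: %s", number, "pass" if passed else "fail")
        results.append(passed)
        if not passed and stop_on_failure:
            log.warning("stop criterion met at step %d; restart after disposition", number)
            break
    return results
```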
Responsibilities In addition to the responsibilities identified in the preamble to these policies, the following responsibilities are prescribed by this policy: 1. The project software V&V lead engineer shall be responsible for generating and obtaining approval of the SVTPR. 2. The project software lead engineer shall be responsible for reviewing the SVTPR. 3. The SE [title/position] shall be responsible for approving the SVTPR.
POLICY 13
SOFTWARE DETAILED DESIGN PHASE VERIFICATION AND VALIDATION
Policy The Software Detailed Design Phase V&V shall ensure that the detailed software design satisfies the requirements and constraints specified in the Software Requirements Specification (SRS) and augments the design specified in the Software Architecture Design Specification (SADS). Evaluation and analysis of the Software Detailed Design Specification (SDDS) shall ensure the internal consistency, completeness, correctness, and clarity of the SDDS and verify
that the design, when implemented, will satisfy the requirements specified in the SRS. The Software Validation Test Plan (SVTP) shall be updated to incorporate the additional design details of the SDDS. Validation Test Information Sheets (VTISs) shall be developed to define the objectives, approach, and requirements of each test defined in the SVTP. The results of Software Detailed Design Phase V&V tasks shall be documented in a Detailed Design Phase V&V task summary report.
Requirements 1. The SDDS shall be provided to the project software V&V lead engineer by the project software lead engineer prior to the Software Detailed Design Review (SDDR). The analysis and evaluation of the SDDS shall ensure compliance with the requirements defined in the SE Software Development Policies for the SDDS. Verification of the SDDS shall also ensure the following: a. Consistency with the preliminary design b. Completeness in the allocation of the software end product’s functional capabilities to one or more components c. Identification of relationships among and interfaces between all software components d. Correctness of the algorithms defined e. The robustness and testability of the design f. Compliance with established SE software development procedures 2. The Software Test Plan (STP) shall be provided to the project software V&V lead engineer by the project software lead engineer prior to the SDDR. The STP shall be reviewed for the following: a. Completeness, correctness, and consistency of the software testing required for each software component b. Compliance with the requirements specified in the SE Software Development Policies and the established SE software development procedures 3. The Development Test Information Sheets (DTISs) prepared by the software engineer shall be provided to the project software V&V lead engineer by the project software lead engineer prior to the SDDR. Analysis and evaluation of the adequacy of software component testing is supported by the review of the DTISs. 4. Updates to the SVTP required to incorporate the additional design details of the SDDS shall be generated by the project software V&V lead engineer, reviewed by the project software lead engineer, and approved by the SE [title/position] prior to the completion of the SDDR.
5. The generation of VTISs for each test defined in the SVTP shall be accomplished concurrent with design analysis and shall be the responsibility of the project V&V engineer. The VTISs shall be provided to the project software lead engineer by the project software V&V lead engineer prior to the SDDR and used for assessing the adequacy of the test methods and limits defined for the software test program. 6. A review of the preliminary version of the User’s Manual shall be performed. Evaluation of the User’s Manual shall ensure compliance with the requirements defined in the SE Software Development Policies and SE software development procedures for the User’s Manual. 7. The Requirements Traceability Matrix (RTM) shall be updated to document the tracing of the software design to the requirements in the SRS. The project software V&V lead engineer shall be responsible for ensuring the accuracy and completeness of RTM updates. 8. The project software V&V lead engineer shall be responsible for ensuring that all V&V tasks are performed and are documented in a V&V task report. These reports shall be submitted by the project software V&V lead engineer to the project software lead engineer and SE [title/position] for review and initiation of appropriate corrective action. 9. The project software V&V lead engineer shall be responsible for supporting the project software lead engineer in preparation for the SDDR. All V&V outputs generated during the Software Detailed Design Phase shall be made available to the project software lead engineer prior to the SDDR in order to support management in establishing the compatibility, reliability, and testability of stated software and interface design. 10. The Detailed Design Phase V&V task summary report shall be generated by the project software V&V lead engineer. The task summary report summarizes the results of the V&V performed and provides an assessment of the quality of progress and recommendations. The V&V summary report shall be distributed to the project software lead engineer, SE [title/position], and project quality assurance representative(s).
Responsibilities In addition to the responsibilities identified in the preamble to these policies, the following responsibilities are prescribed by this policy: 1. The project V&V engineer shall be responsible for the generation of the VTISs to be used during software validation testing of the software end products. 2. The project software V&V lead engineer shall be responsible for the following: a. Ensuring the accuracy and completeness of RTM updates b. Providing the project software lead engineer with the VTISs prior to the SDDR
c. Ensuring that all V&V tasks performed are documented in a V&V task report and submitted to the project software lead engineer and SE [title/position] d. Supporting the project software lead engineer in preparation for the SDDR e. Generating the Detailed Design Phase V&V task summary report and distributing it to the project software lead engineer, SE [title/position], and project quality assurance representative(s) 3. The project software lead engineer shall be responsible for the following: a. Providing the project software V&V lead engineer with the SDDS prior to the SDDR b. Providing the project software V&V lead engineer with the DTISs and updated STP prior to the SDDR c. Review of the VTISs d. Review of the RTM e. Review of V&V task reports and initiation of appropriate corrective action 4. The SE [title/position] shall be responsible for the following: a. Review of the RTM b. Review of V&V task reports and initiation of appropriate corrective action
POLICY 14
CODE AND TEST PHASE VERIFICATION AND VALIDATION
Policy The Code and Test Phase V&V shall ensure the correctness, consistency, completeness, and accuracy of the implementation of the design defined in the Software Detailed Design Specification (SDDS). The results of Code and Test Phase V&V tasks shall be documented in a Code and Test Phase V&V task summary report.
Requirements 1. The code for the software end products shall be provided to the project software V&V lead engineer by the project software lead engineer upon successful completion of
compilation. Code audits shall be performed to examine both high-level and detailed properties of the code and shall verify the following (an illustrative sketch follows these requirements): a. Consistency of the code with the SDDS b. Adherence of the code to SE programming standards and conventions c. Correctness and efficiency of the code 2. Anomalies detected during V&V task performance shall be documented in a Software V&V Anomaly report. The project software V&V lead engineer shall be responsible for the proper documentation and reporting of software end product anomalies. Software V&V Anomaly reports shall be submitted by the project software V&V lead engineer to the project software lead engineer and SE [title/position] for review and initiation of appropriate corrective action. 3. The Requirements Traceability Matrix (RTM) shall be updated to document the tracing of source code to the software design and the requirements in the Software Requirements Specification (SRS). The project software V&V lead engineer shall be responsible for ensuring the accuracy and completeness of RTM updates. 4. The project software V&V lead engineer shall be responsible for ensuring that all V&V tasks are performed and are documented in a V&V task report. These reports shall be submitted by the project software V&V lead engineer to the project software lead engineer and SE [title/position] for review and initiation of appropriate corrective action. 5. The Code and Test Phase V&V task summary report shall be generated by the project software V&V lead engineer. The task summary report summarizes the results of the V&V performed and provides an assessment of the quality of progress and recommendations. The V&V summary report shall be distributed to the project software lead engineer, SE [title/position], and project quality assurance representative(s).
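Parts of the code audit in requirement 1 can be assisted by a simple scanner for local coding conventions. The sketch below checks two assumed rules only and is in no way a substitute for the audit against the SDDS and the SE programming standards.

```python
from pathlib import Path
from typing import List

MAX_LINE_LENGTH = 80  # assumed local programming standard, for illustration only

def audit_file(path: Path) -> List[str]:
    """Return findings for two illustrative convention checks."""
    findings: List[str] = []
    for lineno, line in enumerate(path.read_text().splitlines(), start=1):
        if len(line) > MAX_LINE_LENGTH:
            findings.append(f"{path}:{lineno}: line exceeds {MAX_LINE_LENGTH} characters")
        if "\t" in line:
            findings.append(f"{path}:{lineno}: tab character (spaces assumed required)")
    return findings

def audit_tree(root: Path, pattern: str = "*.c") -> List[str]:
    """Audit every source file under root matching the assumed pattern."""
    findings: List[str] = []
    for source in sorted(root.rglob(pattern)):
        findings.extend(audit_file(source))
    return findings
```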
Responsibilities In addition to the responsibilities identified in the preamble to these policies, the following responsibilities are prescribed by this policy: 1. The project software V&V lead engineer shall be responsible for the following: a. Ensuring the accuracy and completeness of RTM updates b. Ensuring that all anomalies detected during V&V task performance are documented in a Software V&V Anomaly report and submitted to the project software lead engineer and SE [title/position]
c. Ensuring that all V&V tasks performed are documented in a V&V task report and submitted to the project software lead engineer and SE [title/position] d. Generating the Code and Test Phase V&V task summary report and distributing it to the project software lead engineer, SE [title/position], and project quality assurance representative(s) 2. The project software lead engineer shall be responsible for the following: a. Providing the project software V&V lead engineer with the code upon successful completion of compilation b. Review of the RTM c. Review of Software V&V Anomaly reports and initiation of appropriate corrective action d. Review of V&V task reports and initiation of appropriate corrective action 3. The SE [title/position] shall be responsible for the following: a. Review of the RTM b. Review of Software V&V Anomaly reports and initiation of appropriate corrective action c. Review of V&V task reports and initiation of appropriate corrective action
POLICY 15 INTEGRATION AND TEST PHASE VERIFICATION AND VALIDATION
Policy The Integration and Test Phase V&V shall ensure the correctness, completeness, and accuracy of software component interactions and interfaces. Analysis and evaluation of software integration and testing shall be accomplished through review of the Development Test Information Sheets (DTISs), which document the successful completion of integration and test for designated software components. The instructions for validation test set-up, operation, and evaluation are generated by the project software V&V lead engineer for approval by the SE [title/position] prior to test execution. The results of Integration and Test Phase V&V tasks shall be documented in an Integration and Test Phase V&V task summary report.
Requirements 1. The DTISs and associated test data shall be provided to the project software V&V lead engineer by the project software lead engineer as each test is successfully completed. The DTISs shall be analyzed to evaluate the following:
a. Adequacy of test coverage
b. Adequacy of test data
c. Software behavior
d. Software reliability
2. The Software Validation Test Procedures (SVTPR) are the procedures required to execute the tests defined in the Software Validation Test Plan (SVTP) and are developed using the test information defined in the VTISs. The SVTPR shall be generated by the project software V&V lead engineer, reviewed by the project software lead engineer, and approved by the SE [title/position] prior to test execution. Software validation testing shall not be conducted without an approved SVTPR. 3. The Requirements Traceability Matrix (RTM) shall be updated to document any changes to the tracing of source code to the software design and the requirements in the Software Requirements Specification (SRS). The project software V&V lead engineer shall be responsible for ensuring the accuracy and completeness of RTM updates. 4. Anomalies detected during V&V task performance shall be documented in a Software V&V Anomaly report. The project software V&V lead engineer shall be responsible for the proper documentation and reporting of software end product anomalies. Software V&V Anomaly reports shall be submitted by the project software V&V lead engineer to the project software lead engineer and SE [title/position] for review and initiation of appropriate corrective action. 5. The project software V&V lead engineer shall be responsible for ensuring that all V&V tasks are performed and are documented in a V&V task report. These reports shall be submitted by the project software V&V lead engineer to the project software lead engineer and SE [title/position] for review and initiation of appropriate corrective action. 6. The Integration and Test Phase V&V task summary report shall be generated by the project software V&V lead engineer. The task summary report summarizes the results of the V&V performed and provides an assessment of the quality of progress and recommendations. The V&V summary report shall be distributed to the project software lead engineer, SE [title/position], and project quality assurance representative(s).
Responsibilities In addition to the responsibilities identified in the preamble to these policies, the following responsibilities are prescribed by this policy: 1. The project software V&V lead engineer shall be responsible for the following: a. Ensuring the accuracy and completeness of RTM updates b. Generating and obtaining approval of the SVTPR c. Ensuring that all anomalies detected during V&V task performance are documented in a Software V&V Anomaly report and submitted to the project software lead engineer and SE [title/position] d. Ensuring that all V&V tasks performed are documented in a V&V task report and submitted to the project software lead engineer and SE [title/position] e. Generating the Integration and Test Phase V&V task summary report and distributing it to the project software lead engineer, SE [title/position], and project quality assurance representative(s) 2. The project software lead engineer shall be responsible for the following: a. Providing the project software V&V lead engineer with the completed DTISs as each test is successfully completed b. Review of the SVTPR c. Review of the RTM d. Review of Software V&V Anomaly reports and initiation of appropriate corrective action e. Review of V&V task reports and initiation of appropriate corrective action 3. The SE [title/position] shall be responsible for the following: a. Approval of the SVTPR b. Review of the RTM c. Review of Software V&V Anomaly reports and initiation of appropriate corrective action d. Review of V&V task reports and initiation of appropriate corrective action
POLICY 16 SOFTWARE VALIDATION PHASE VERIFICATION AND VALIDATION
Policy The Software Validation Phase V&V shall verify that the software end products satisfy the requirements and design specified in the Software Requirements Specification (SRS) and Software Detailed Design Specification (SDDS), respectively. The project software V&V lead engineer shall ensure that all validation is performed in accordance with the Software Validation Test Plan (SVTP) using the Software Validation Test Procedures (SVTPR). The project software V&V lead engineer shall ensure that all procedures for the configuration management of software end products during and at the completion of testing are implemented in accordance with the project’s Software Configuration Management Plan (SCMP). At the completion of all V&V activities, a Software Verification and Validation Report (SVVR) is generated by the project software V&V lead engineer.
Requirements 1. Software validation shall be performed using the current controlled version of the software as determined in accordance with the procedures defined in the project’s SCMP. 2. Software validation shall be conducted in accordance with the SVTP by using the SVTPR. The results of software validation shall be documented on the Validation Test Information Sheets (VTISs). The VTISs shall be included as a part of the SVVR. 3. Test results shall be analyzed to determine if the software end products satisfy the software requirements and design specified in the SRS and SDDS. 4. Approved changes to the software under test shall be verified by regression testing to confirm that the redesign of corrected software has been effective and has not introduced other errors. 5. The Requirements Traceability Matrix (RTM) shall be updated to document the verification of each requirement in the SRS (an illustrative completeness check follows these requirements). The project software V&V lead engineer shall be responsible for ensuring the accuracy and completeness of RTM updates. 6. Software V&V Anomaly reports shall be generated to document test failures and software faults. Software V&V Anomaly reports shall be submitted by the project software V&V lead engineer to the project software lead engineer and SE [title/position] for review and initiation of appropriate corrective action. Anomaly reports shall be included as a part of the SVVR.
7. A software configuration audit of the validated software shall be conducted. The version description of all software end products is verified to demonstrate that the delivered software corresponds to the software subjected to software validation. Completion of the software configuration audit is contingent upon closure of all outstanding software discrepancies and deficiencies. The project software V&V lead engineer shall be responsible for generating a Software Configuration Audit Report (SCAR) to document the final configuration of the software end products. The SCAR shall be included as a part of the SVVR. 8. The SVVR is generated by the project software V&V lead engineer at the completion of all V&V tasks during the Software Validation Phase. The SVVR shall be reviewed by the project software lead engineer and approved for distribution by the SE [title/position]. The project software V&V lead engineer shall distribute the approved SVVR to the project software lead engineer, SE [title/position], and project quality assurance representative(s). 9. The project software V&V lead engineer shall be responsible for ensuring that all V&V tasks are performed and are documented appropriately.
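Requirement 5's RTM update reduces to a completeness check that every SRS requirement carries a passing validation result. A minimal sketch, reusing the illustrative RTMEntry from Policy 8 and an assumed "pass" result vocabulary:

```python
from typing import List

def unverified_requirements(rtm: List["RTMEntry"]) -> List[str]:
    """SRS requirements that lack a passing result; per requirement 7, all
    outstanding discrepancies must be closed before the software
    configuration audit can complete."""
    return [entry.req_number for entry in rtm
            if not entry.test_results
            or any(result != "pass" for result in entry.test_results)]
```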
Responsibilities In addition to the responsibilities identified in the preamble to these policies, the following responsibilities are prescribed by this policy: 1. The project software V&V lead engineer shall be responsible for the following: a. Ensuring that all anomalies detected during V&V task performance are documented in a Software V&V Anomaly report and submitted to the project software lead engineer and SE [title/position] b. Ensuring the accuracy and completeness of RTM updates c. Generating and distributing the SCAR d. Generating and obtaining approval of the SVVR e. Distributing the SVVR to the project software lead engineer, SE [title/position], and project quality assurance representative(s) f. Ensuring that all V&V tasks are performed and are documented appropriately 2. The project software lead engineer shall be responsible for the following:
a. Review of the RTM
b. Review of the SCAR
c. Review of the SVVR
d. Review of Software V&V Anomaly reports and initiation of appropriate corrective action
e. Review of V&V task reports and initiation of appropriate corrective action
3. The SE [title/position] shall be responsible for the following:
a. Review of the RTM
b. Review of the SCAR
c. Approval of the SVVR
d. Review of Software V&V Anomaly reports and initiation of appropriate corrective action
e. Review of V&V task reports and initiation of appropriate corrective action
POLICY 17
SOFTWARE CONFIGURATION AUDIT REPORT
Policy SE software project personnel shall prepare the Software Configuration Audit Report (SCAR), which summarizes the results of the software configuration audit of the validated software. The project software V&V lead engineer shall be responsible for generating the SCAR. This report of the final configuration of the software end products for the software project shall be reviewed by the project software lead engineer and the SE [title/position].
Requirements 1. At the conclusion of testing during the Software Validation Phase, a software configuration audit of the validated software is conducted. At the completion of the software configuration audit, the SCAR shall be generated to document the final configuration of the software end products. The SCAR is a checklist that does the following (an illustrative sketch follows these requirements): a. Identifies and describes each item of software b. Verifies that the software configurations are what they were intended and proclaimed to be c. Verifies that the configuration of each item of software is the same configuration validated during the Software Validation Phase 2. The SCAR shall be generated by the project software V&V lead engineer and reviewed by the project software lead engineer and SE [title/position]. The format of the SCAR shall be in accordance with the project’s Software Verification and Validation Plan (SVVP).
3. The SCAR shall be completed and reviewed prior to the completion of the Software Validation Phase. The SCAR shall be included in the project’s Software Verification and Validation Report (SVVR).
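The three checklist functions of requirement 1 translate into a straightforward comparison of delivered versus validated configurations. The sketch assumes each software item carries an identifier and a version string, which this policy does not prescribe.

```python
from dataclasses import dataclass
from typing import Dict, List

@dataclass
class SoftwareItem:
    item_id: str       # a. identifies each item of software
    description: str   # a. describes the item
    version: str       # configuration actually delivered

def configuration_audit(delivered: List[SoftwareItem],
                        validated_versions: Dict[str, str]) -> List[str]:
    """Return discrepancies between delivered and validated configurations
    (checklist items b and c); an empty list supports SCAR closure."""
    problems: List[str] = []
    for item in delivered:
        expected = validated_versions.get(item.item_id)
        if expected is None:
            problems.append(f"{item.item_id}: not part of the validated configuration")
        elif item.version != expected:
            problems.append(f"{item.item_id}: delivered {item.version}, validated {expected}")
    return problems
```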
Responsibilities In addition to the responsibilities identified in the preamble to these policies, the following responsibilities are prescribed by this policy: 1. The project software V&V lead engineer shall be responsible for the generation and distribution of the SCAR. 2. The project software lead engineer shall be responsible for reviewing the SCAR. 3. The SE [title/position] shall be responsible for reviewing the SCAR.
POLICY 18 SOFTWARE VERIFICATION AND VALIDATION REPORT
Policy SE software projects shall prepare the Software Verification and Validation Report (SVVR), which summarizes the V&V tasks and results, including the status and disposition of anomalies. An assessment of the overall software quality and recommendations for software and/or development process improvements shall also be documented in the SVVR. The SVVR shall be generated upon completion of all V&V tasks during the Software Validation Phase.
Requirements 1. The project software V&V lead engineer shall be responsible for generating the SVVR. The SVVR shall be produced in the format specified in the relevant SE V&V procedures. 2. The SVVR shall include, as a minimum, the following (an illustrative aggregation follows these requirements): a. A summary of all V&V tasks performed
b. A summary of task results
c. A summary of anomalies and resolutions
d. An assessment of overall software quality
e. Recommendations for software and/or development process improvements
3. The SVVR shall include the actual results of the testing performed by V&V personnel. As a minimum, the SVVR shall include the following: a. A test history, which provides a chronological record of the actual conduct of a test b. The Software Validation Test Plan Test Information Sheets, which provide test results c. The completed Requirements Traceability Matrix, which provides a tracing of all software requirements, from specification through validation 4. The SVVR shall include the Software Configuration Audit Report. 5. The SVVR shall be generated upon completion of all V&V tasks during the Software Validation Phase. 6. The SVVR shall be reviewed by the project software lead engineer and approved for distribution by the SE [title/position]. 7. Upon approval, the project software V&V lead engineer shall distribute the SVVR to the project software lead engineer, SE [title/position], and project quality assurance representative(s).
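Taken together, requirements 2 through 4 make the SVVR an aggregation of artifacts produced earlier under these policies. A final illustrative sketch, reusing assumed types from the earlier sketches:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class SVVR:
    """Illustrative aggregation of the minimum SVVR content."""
    task_summaries: List[str] = field(default_factory=list)     # 2a/2b. tasks performed and results
    anomaly_summaries: List[str] = field(default_factory=list)  # 2c. anomalies and resolutions
    quality_assessment: str = ""                                # 2d. overall software quality
    recommendations: List[str] = field(default_factory=list)    # 2e. recommended improvements
    test_history: List[str] = field(default_factory=list)       # 3a. chronological record of testing
    vtis_sheets: List["ValidationTestInformationSheet"] = field(default_factory=list)  # 3b. test results
    rtm: List["RTMEntry"] = field(default_factory=list)         # 3c. completed traceability matrix
    scar: str = ""                                              # 4. Software Configuration Audit Report
```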
Responsibilities In addition to the responsibilities identified in the preamble to these policies, the following responsibilities are prescribed by this policy: 1. The project software V&V lead engineer shall be responsible for generating and obtaining approval of the SVVR. 2. The project software lead engineer shall be responsible for reviewing the SVVR. 3. The SE [title/position] shall be responsible for approving the distribution of the SVVR.
GLOSSARY
Accuracy: Quantitative assessment of freedom from error.
Algorithm: Finite set of well-defined rules for the solution of a problem in a finite number of steps.
Anomaly: Anything observed in the documentation or operation of software that deviates from expectations on the basis of previously verified software products or reference documents.
Audit: Independent review for the purpose of assessing compliance with software requirements, specifications, baselines, standards, procedures, instructions, and coding requirements.
Baseline: Specification or product that has been formally reviewed and agreed upon and that thereafter serves as the basis for further development; it can be changed only through formal change control procedures.
Change control: Process by which a change is proposed, evaluated, approved or rejected, scheduled, and tracked.
Code: Loosely, one or more computer programs or part of a computer program.
Code audit: Independent review of source code by a person, team, or tool to verify compliance with software design documentation and programming standards. Correctness and efficiency may also be evaluated.
Completeness: Those attributes of the software or documentation that provide full implementation of the functions required.
Component: Unit of code that performs a specific task or a group of logically related code units that perform a specific task or set of tasks.
Component testing: Testing conducted to verify the implementation of the design for one software component or a collection of software components.
Computer program: Sequence of instructions suitable for processing by a computer. Processing may include the use of an assembler, a compiler, an interpreter, or a translator to prepare the program for execution as well as to execute it.
Configuration identification: Process of designating the configuration items in a system and recording their characteristics.
Configuration item: Aggregation of hardware, software, or any of their discrete parts that satisfies an end-use function.
Configuration status accounting: Recording and reporting of the information that is needed to manage a configuration effectively, including a listing of the approved configuration identification, the status of proposed changes to the configuration, and the implementation status of approved changes.
Consistency: Those attributes of the software or documentation that provide uniformity in the specification, design, and implementation of the product.
Correctness: Extent to which software is free of design defects, coding defects, and faults; meets its specified requirements; and meets user expectations.
Criticality: Classification of a software error or fault based upon an evaluation of the degree of impact of that error or fault on the development or operation of a system.
Delivery: Transfer of responsibility for an item from one activity to another, as in the delivery of the validated software product to Quality Assurance for certification.
Design requirement: Any requirement that impacts or constrains the design of a software system or software system component.
Deviation: Authorization for a future activity, event, or product that departs from SE policy.
Documentation: Manuals, written procedures or policies, records, or reports that provide information concerning uses, maintenance, or validation of software.
Error: Discrepancy between a computed, observed, or measured value or condition and the true, specified, or theoretically correct value or condition.
Evaluation: Process of determining whether an item or activity meets specified criteria.
Failure: Inability of a system or system component to perform its required function (see fault).
Fault: Defect of a system or system component, caused by a defective, missing, or extraneous instruction or set of related instructions in the definition, specification, design, or implementation of a system, that may lead to a failure.
Hazard: Dangerous state of a device or system that may lead to death, injury, occupational illness, or damage to or loss of equipment or property.
Hazards analysis: Analysis of components and subsystems that identifies where faults or failures would cause hazardous conditions or actions, describes the fault or failure, and estimates the probability and severity of the hazard.
Integration: Process of combining software elements, hardware elements, or both into an overall system.
Iteration: Process of repeatedly executing a given sequence of steps until a given condition is met or while a given condition is true.
Product history file: Compilation of records containing the complete production history of a finished device.
Quality assurance: Planned and systematic pattern of all actions necessary to provide adequate confidence that the item or product conforms to established technical requirements.
Regression testing: Selective retesting to detect faults introduced during modification, to verify that modifications have not caused unintended adverse effects, and to verify that a modified system or system component still meets its specified requirements.
Reliability: Ability of an item to perform a required function under stated conditions for a stated period of time.
Requirements Phase: Period in the software life cycle during which the requirements, such as functional and performance capabilities for a software end product, are defined and documented.
Requirements Traceability Matrix (RTM): Matrix that traces the development of software end products from requirements specification through software validation.
Robustness: Extent to which software can continue to operate correctly despite the introduction of invalid inputs.
Safety: Provision of a very high degree of freedom, within the constraints of system effectiveness and cost, from those conditions that can cause death, injury, occupational illness, or damage to or loss of equipment or property.
Software: Computer programs, procedures, rules, and associated documentation and data pertaining to the operation of a computer system.
Software Architecture Design Review (SADR): Review conducted for the purpose of: (1) reviewing the project's preliminary design, SADS, associated plans, and technical issues; (2) resolving identified issues; and (3) obtaining commitment to proceed into the detailed design phase.
Software Architecture Design Specification (SADS): Project-specific document that contains the design information needed to support the detailed definition of the individual software system components and that, upon completion of the Software Architecture Design Review, becomes the design baseline for development of the SDDS used in support of software coding.
Software Configuration Audit Report (SCAR): Document that summarizes the results of the software configuration audit of the validated software and is a report of the final configuration of the software end products for the software project.
Software configuration management (SCM): Discipline of identifying the configuration of a software system at discrete points in time for the purpose of systematically controlling changes to this configuration and maintaining the integrity and traceability of this configuration throughout the development process.
Software Configuration Management Plan (SCMP): Project-specific plan that specifies the methods and planning employed to implement software configuration management activities.
Software Detailed Design Review (SDDR): Review conducted for the purpose of: (1) reviewing the project's detailed design, SDDS, associated plans, and critical issues; (2) resolving identified issues; (3) obtaining commitment to proceed into the Code and Test Phase; and (4) obtaining commitment to a test program supporting product acceptance.
Software Detailed Design Specification (SDDS): Project-specific document that constitutes an update to and an expansion of the design baseline established at the Software Architecture Design Review, including a description of the overall program operation and control and the use of common data. The detailed design is described through the lowest component level of software organization and the lowest logical level of database organization.
Software development life cycle: Period that starts with the development of a software product and ends when the product is validated and delivered for QA certification. This life cycle includes a requirements phase, design phase, implementation phase, and software validation phase.
Software end products: Computer programs, software documentation, and databases produced by a software development project.
Software project: Planned and authorized undertaking of specified scope and duration that results in the expenditure of resources toward the development of a product that is primarily one or more computer programs.
Software quality: Totality of features and characteristics of a software product that bear on its ability to satisfy given needs.
Software reliability: Probability that software will not cause the failure of a system for a specified time under specified conditions.
Software Requirements Review (SRR): Review of the provisions of the Software Requirements Specification, which, once approved, will serve as the basis of software end-product acceptance.
Software Requirements Specification (SRS): Project-specific document that provides a controlled statement of the functional, performance, and external interface requirements for the software end products.
Software Test Plan (STP): Project-specific plan that defines the scope of software testing that must be successfully completed for each software component.
Software Validation Test Plan (SVTP): Project-specific plan that defines the testing required to verify that the software end products satisfy the requirements of the SRS.
Software Validation Test Procedures (SVTPR): Project-specific set of procedures for performing the testing defined in the SVTP.
Software Verification and Validation Plan (SVVP): Project-specific plan that describes the project's unique verification and validation (V&V) organization, activities, schedule, inputs and outputs, and any deviations from SE policies required for effective management of V&V tasks.
Software Verification and Validation Policy Change Control Board (CCB): Board that establishes and maintains a set of verification and validation policies in order to achieve quality in all phases of the software development life cycle.
Software Verification and Validation Report (SVVR): Final report specific to the project that summarizes the V&V tasks and results, including the status and disposition of anomalies.
Software verification and validation tasks: Activities performed during verification and validation, including assessment, evaluation, analysis, review, testing, and the associated document development.
Source code: Original software expressed in human-readable form (programming language), which must be translated into machine-readable form before it can be executed by the computer.
Test Information Sheet (TIS): Development or validation test document that defines the objectives, approach, and requirements for a specific test.
Testability: Extent to which software facilitates both the establishment of test criteria and the evaluation of the software with respect to those criteria, or the extent to which the definition of requirements facilitates analysis of the requirements to establish test criteria.
Top-down software design: Design approach that starts with the top-level system functions and proceeds through downward allocation, evaluation, and iteration to successively lower levels of design; it enhances design traceability, completeness, and comprehensiveness.
Validation: Process of evaluating software at the end of the software development process to ensure compliance with software requirements.
Verification: Process of determining whether or not the products of a given phase of the software life cycle fulfill the requirements established during the previous phase.
Waiver: Authorization to depart from SE policy for an activity, event, or product that has already been initiated.
[Project/Product Name] SCMP
SOFTWARE CONFIGURATION MANAGEMENT PLAN
Written by:
[Name/Title/Position]
Date
Reviewed by:
[Name/Title/Position]
Date
Approved by:
[Name/Title/Position]
Date
Document Number [aaa]-SCMP-[#.#]     Revision [#.#]     Page 1 of [#]
REVISION HISTORY

Revision     Description                Date
[##.##]      [Revision description]     [mm/dd/yy]
CONTENTS

1.0 INTRODUCTION  3
2.0 MANAGEMENT  5
3.0 SOFTWARE CONFIGURATION MANAGEMENT ACTIVITIES  9
4.0 TOOLS, TECHNIQUES, AND METHODOLOGIES  15
5.0 RECORDS COLLECTION AND RETENTION  17
APPENDIX A  Software Configuration Milestones  18
APPENDIX B  Change Request/Approval (CRA) Form  19
APPENDIX C  Automated Tools Used in Software Configuration Management  20
GLOSSARY  21
1.0 INTRODUCTION
1.1 Purpose

This plan identifies and describes the planning and procedures to be performed for software configuration management (SCM) of the software end products developed for the [project/product name].
1.2 Scope

This plan is applicable to the software end products produced and used during [project/product name] software development, which are under the control of the software development team. The organizations, activities, and phases of the [project/product name] software development relative to SCM are also presented. The program of SCM defined in this document will be applied throughout all phases of the software development, which is defined in the [project/product name] Software Development Plan (SDP).
1.3 Overview

The requirements and procedures for SCM of the software end products, associated development plans, and verification and validation (V&V) documentation developed by the [project/product name] software development team are established by this plan in order to meet the following objectives:
• Ensure that the software end products meet the requirements established in requirements, design, and implementation documentation
• Provide visibility to the software development process
• Provide traceability between "as designed" and "as built"
• Support the definition and verification of software configurations
• Control changes to the software end products
• Monitor the implementation of change
• Track configurations of the software end products
• Provide a consistent approach to configuration management
1.4 Referenced Documents

The following documents of the exact issue shown form a part of this specification to the extent specified herein. In the event of conflict between the documents referenced herein and the content of this specification, the contents of this specification shall be considered a superseding requirement.

1.4.1 Project Specifications

• [project/product name] Product Objectives Document, Document Number [aaa]-POD-[#.#], Revision [#.#], dated [date]
• [project/product name] Product Requirements Document, Document Number [aaa]-PRD-[#.#], Revision [#.#], dated [date]
• [project/product name] Software Development Plan, Document Number [aaa]-SDP-[#.#], Revision [#.#], dated [date]
• [project/product name] Software Development Test Plan, Document Number [aaa]-DTP-[#.#], Revision [#.#], dated [date]
• [project/product name] Software End Product Acceptance Plan, Document Number [aaa]-EAP-[#.#], Revision [#.#], dated [date]
• [project/product name] Software Quality Assurance Plan, Document Number [aaa]-QAP-[#.#], Revision [#.#], dated [date]
• [project/product name] Software Verification and Validation Plan, Document Number [aaa]-VVP-[#.#], Revision [#.#], dated [date]

1.4.2 Procedures and Guidelines

• Product Development Safety Design Guidelines, Revision [#.#], dated [date]
• Product Development User Interface Design Guidelines, Revision [#.#], dated [date]
• Software Engineering Configuration Management Guidelines, Revision [#.#], dated [date]
• Software Engineering Software Development Guidelines, Revision [#.#], dated [date]
• Software Engineering Software Configuration Management Policies, Revision [#.#], dated [date]
• Software Engineering Software Development Policies, Revision [#.#], dated [date]
• Software Engineering Verification and Validation Guidelines, Revision [#.#], dated [date]
• Software Engineering Software Verification and Validation Policies, Revision [#.#], dated [date]
2.0 MANAGEMENT
2.1 Organization

The [project/product name] has been organized under the direction of [title/position], who has assembled a program team that represents all concerned disciplines. This team coordinates the development activities and provides support for product development. Product development of [project/product name] involves [insert engineering disciplines here] and software engineering. The interface between these disciplines during product development is provided through the technical team, who meet to address and resolve [project/product name] product development issues.

A lead software engineer heads the [project/product name] software development and provides technical direction in all aspects of software development. Other software engineers have been assigned to the team, and all participants in the development of the [project/product name] software are responsible for ensuring that their efforts are in compliance with the Software Development Policies and Software Configuration Management Policies.

The functions and tasks of verification and validation (V&V) for the [project/product name] software are organized under the direction of [corporate title/position]. The [project/product name] V&V organization is composed of software engineers who have not been directly involved in the development of the software being verified and have not established the criteria against which the software is validated. The development and administration of the V&V program is the responsibility of the V&V lead software engineer. This individual is responsible for planning, organizing, monitoring, and controlling the V&V tasks performed for the [project/product name] project. The project V&V organization will comply with the Software Verification and Validation Policies and Software Configuration Management Policies.

All [project/product name] software development and V&V team members are responsible for ensuring that their efforts are in compliance with the SCM procedures described in this plan.
2.2 Responsibilities

The [corporate title/position] will support the SCM of the [project/product name] software by:
• Reviewing and approving this plan
• Providing resources and guidance to project personnel performing configuration management duties
• Reviewing and approving changes to software baselines
• Reviewing the effectiveness of the configuration management program described in this plan as an element of periodic project performance evaluations

The software lead engineer will support the SCM of the [project/product name] software by:
• Reviewing and providing comments to this plan
• Providing direction to team members performing configuration management duties
• Ensuring that the development of the software end product is controlled and maintained in accordance with this plan

The software engineer will support the SCM of the [project/product name] software by:
• Reviewing and providing comments to this plan
• Generating change documentation as required by this plan
• Complying with the procedures for change documentation and control described in this plan throughout the development of the software end product

The software V&V engineer will support the SCM of the [project/product name] software by:
• Reviewing and providing comments to this plan
• Maintaining V&V documentation in accordance with this plan
• Complying with the procedures for V&V described in this plan throughout the software development effort
2.3 Software Configuration Management Plan Implementation

SCM of the [project/product name] software end product will begin with the development of the [project/product name] Software Requirements Specification (SRS) and end with the delivery of the software end product to the quality assurance group for [insert term for design, system, or final product level testing here]. The major milestones associated with this development process have been defined in the [project/product name] Software Development Plan (SDP). Each milestone is associated with a set of development activities to be performed and documented and represents a point in time where a software configuration baseline is established or updated. The software configuration baseline:
• Establishes the current software configuration identification
• Serves as the basis for further development
• Can be changed only through the change control procedures defined in this plan
2.4 Milestone Management

The major milestones or phases of software development for the [project/product name] project are:
• Requirements
• Architecture design
• Detailed design
• Implementation
• Software validation

The products of each major milestone are reviewed and/or audited prior to advancing to the next phase of software development. Appendix A identifies the development activities, documents generated, and reviews and/or audits performed for each major milestone of the [project/product name] software development effort.
2.5 Software Library

The [project/product name] software library is a controlled collection of software, documentation, and associated tools and procedures used to facilitate the orderly development of the software end product. The [project/product name] software library provides storage of and controlled access to software and documentation in both human-readable and machine-readable form. The machine-readable software will be stored and maintained in project subdirectories. The [project/product name] software library will provide for:
• Positive identification of all software under configuration control
• Rapid, comprehensive, and accurate treatment of proposed changes to items under configuration control
• Implementation of approved changes and dissemination of corrected configuration items
• Maintenance of accurate records and status of all changes
• Verification of changes, control, identification, and status accounting of configuration items
2.6 Software Change Review Board

The review, approval, and scheduling of changes to baselined configuration items is accomplished by the Software Change Review Board (SCRB). The SCRB is established by the [corporate title/position] to direct the analysis, approval or disapproval, and incorporation of changes to the specification, design, implementation, and test of software requirements. The SCRB will consist of the [project/product name] software lead engineer, the V&V lead engineer, the configuration manager for the project, and any other personnel, who need not be associated with the [project/product name] software development effort. The SCRB ensures the documentation, control, and historical maintenance of the configuration items listed in this plan.

It is not the intent of the SCRB to process changes to products of the software development process that are still evolving and are not yet baselined or approved. Therefore, changes to implementation and test documentation prior to software validation are not subject to the SCRB process, unless a change to a baselined configuration item is required as a by-product of those changes.
3.0 SOFTWARE CONFIGURATION MANAGEMENT ACTIVITIES
Software configuration management (SCM) is the activity within software development management that provides identification, change control, baseline control, and status accounting to the specification, design, implementation, and test of a software end product. The procedures and methods for SCM of the [project/product name] software are described below.
3.1 Software Configuration Identification

Identification is one of the basic SCM principles: it establishes the software items for which status is to be determined and that are accounted for and controlled. The materials to be identified and baselined at each major milestone in the development of the [project/product name] software are shown in Appendix A. The contents of each software configuration baseline of the [project/product name] software end product will be identified by SCM.

It is important that the configuration identifier be easily recognized and understood by the software project team. Therefore, identification of the [project/product name] software end product will combine the name of the project, the name of the baselined material being identified, and the current revision of the baselined material. Changes applied by SCM to the software end product will be identified by a unique number combined with the responsible engineer's initials, providing valuable historical information during test as to who made each change. Each software configuration baseline will be identified by the product name, the major milestone, and a version descriptor that is incremented for each successive revision of that baseline.
3.1.1 Baselined Material Identification

A configuration identifier will be assigned to all baselined material. Configuration identifiers will be structured as [AAA]-LLL-X.XX. The "[AAA]" specifies the three-letter abbreviation for the software end-product name; "LLL" specifies the three-letter abbreviation for the baselined material; and "X.XX" is a decimal number used to track the revision history of the baselined material.
3.1.2 Change Documentation Identification

Change documentation will be identified by the software end-product name, the responsible engineer's initials, and a unique number assigned to the problem description document. The identification will be structured as [AAA]-III-XX. The "[AAA]" specifies the three-letter abbreviation for the software end-product name; "III" specifies the three letters representing the initials of the responsible engineer; and "XX" is a two-digit number that is consecutively assigned, beginning with 01.
3.1.3 Configuration Baseline Identification

Each version of a software configuration baseline will be identified with a unique identifier structured as [AAA]-LLL-XX. The "[AAA]" specifies the three-letter abbreviation for the software end-product name; "LLL" specifies the three-letter abbreviation for the major milestone; and "XX" is a two-digit number used to track updates in a software configuration baseline. The major milestone baseline abbreviations are Requirements (RQB), Architecture Design (ADB), Detailed Design (DDB), Implementation (IMB), and Software Validation (SVB).
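Sections 3.1.1 through 3.1.3 define three identifier formats that differ only in their final field. A minimal validation sketch in Python follows; the example identifiers are invented, and note that the formats of 3.1.2 and 3.1.3 are syntactically identical (three letters, hyphen, two digits), so they can be told apart only by context:

```python
import re

# Illustrative patterns for the identifier formats of Sections 3.1.1-3.1.3.
PATTERNS = {
    # [AAA]-LLL-X.XX, e.g., a hypothetical baselined SRS at revision 1.02: INF-SRS-1.02
    "baselined_material": re.compile(r"^[A-Z]{3}-[A-Z]{3}-\d\.\d{2}$"),
    # [AAA]-III-XX, engineer initials plus a consecutively assigned number: INF-JQP-01
    "change_document": re.compile(r"^[A-Z]{3}-[A-Z]{3}-\d{2}$"),
    # [AAA]-LLL-XX restricted to the five milestone abbreviations of Section 3.1.3
    "configuration_baseline": re.compile(r"^[A-Z]{3}-(RQB|ADB|DDB|IMB|SVB)-\d{2}$"),
}

def classify_identifier(identifier: str) -> list[str]:
    """Return every format a candidate identifier satisfies.

    A baseline identifier also matches the change-document pattern, so the
    caller must disambiguate by context (milestone abbreviations versus
    engineer initials).
    """
    return [name for name, rx in PATTERNS.items() if rx.match(identifier)]

assert classify_identifier("INF-SRS-1.02") == ["baselined_material"]
assert "configuration_baseline" in classify_identifier("INF-RQB-01")
```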
3.2 Software Configuration Control

Software configuration control is the mechanism used to control the development, interaction, and suitability of the software configuration items of each configuration baseline. Software configuration control not only monitors change to the software system but also monitors the specific implementation of the already approved system design. The implementation of software configuration control is discussed in terms of managing software baselines, change classification, the mechanism for baseline change, responsibilities of the SCRB, and use of the software development library.
3.2.1 Software Baseline Management

All changes to baselined materials will be performed following software developer review and/or test. Changes to the contents of a previous software configuration baseline are generally created incrementally. At the completion of the baseline creation event, the following activities will occur:
• Entry of baselined materials into the [project/product name] software library
• Verification that all changes to previous baselines have been incorporated in accordance with the approved change document(s)
• Preparation of change summary documentation
• Distribution of the affected material

Software configuration procedures shall be implemented for source code on the [project/product name] project in a manner that will not restrict the design and development effort and yet will control changes to the software once the design is approved. The configuration of source code during the implementation phase of software development will be controlled by the source code control system. The source code developed and tested during the implementation phase will be reviewed by the software lead engineer to determine whether it is mature enough to be baselined. Following approval, all source code will be uniquely identified and stored in the [project/product name] software library. At the completion of the implementation phase, a [project/product name] software end-product version update will be performed, and all source code resident in the software development library will be baselined. Only one user at a time is allowed access to modify the files stored in the software development library. All changes to files stored in the software development library are available for documenting, reporting, and analyzing.
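The single-writer rule stated above is, in effect, a check-out lock on each controlled file. A minimal sketch of one way a library tool might enforce it, assuming a hypothetical lock-file convention (the path and names are invented, not prescribed by this plan):

```python
import os
from pathlib import Path

LIBRARY = Path("/projects/example/software_library")  # hypothetical location

def check_out(relative_path: str, engineer: str) -> Path:
    """Acquire the single-writer lock for a controlled file.

    O_CREAT | O_EXCL makes lock creation atomic, so two engineers cannot
    check out the same file simultaneously; the second attempt raises
    FileExistsError.
    """
    lock = LIBRARY / (relative_path + ".lock")
    fd = os.open(lock, os.O_CREAT | os.O_EXCL | os.O_WRONLY)
    os.write(fd, engineer.encode())  # record who holds the file
    os.close(fd)
    return LIBRARY / relative_path

def check_in(relative_path: str) -> None:
    """Release the lock once the change has been documented and stored."""
    (LIBRARY / (relative_path + ".lock")).unlink()
```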
3.2.2 Change Classification

Changes to baselined materials will be categorized as Class I, Class II, or Class III. Class I changes affect the performance, functional, or technical requirements, such as performance outside of stated tolerances, interface characteristics, schedules, compatibility with support equipment, or resource requirements. Class II changes require a change to other baselined material but do not meet the criteria defined for Class I changes; examples include correction of errors, addition of clarifying notes or views, and editorial corrections. Class III changes do not require a change to any other baselined material.

The classification of a recommended change is the responsibility of the requestor. Changes classified as Class I or Class II will be reviewed by the software lead engineer for completeness, accuracy, and clarity, and a recommendation for approval or disapproval will be made prior to submittal to the SCRB. Approval of Class III changes may be obtained from the software lead engineer.
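The class definitions and approval routing above are mechanical enough to express directly. A purely illustrative sketch of the triage rule (the attribute names are invented for the example):

```python
from dataclasses import dataclass

@dataclass
class ProposedChange:
    affects_requirements: bool      # performance, functional, or technical requirements
    affects_other_baselines: bool   # requires a change to other baselined material

def classify(change: ProposedChange) -> str:
    """Apply the Class I/II/III definitions of Section 3.2.2."""
    if change.affects_requirements:
        return "Class I"
    if change.affects_other_baselines:
        return "Class II"
    return "Class III"

def approver(change_class: str) -> str:
    """Class I and II go to the SCRB; Class III stops at the software lead engineer."""
    return "SCRB" if change_class in ("Class I", "Class II") else "software lead engineer"

assert approver(classify(ProposedChange(False, True))) == "SCRB"
assert approver(classify(ProposedChange(False, False))) == "software lead engineer"
```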
3.2.3 Change Mechanism

Changes to baselined materials will be documented on a Change Request/Approval (CRA) form (Appendix B). The CRA is used to document a description of the proposed change, its effect on baselines, and the change status. The requestor of the change will obtain a change number and then forward the completed CRA to the software lead engineer for evaluation. If the CRA has a classification of Class I or Class II, the software lead engineer will submit the CRA to the SCRB with a change-status recommendation. The SCRB will:
Evaluate the proposed change
•
Review the recommendation of the software lead engineer
•
Direct additional change analysis, approve the change, or disapprove the change
If the CRA has a classification of Class III, the software lead engineer is responsible for approving or disapproving the proposed change. If the CRA is disapproved at any point during the evaluation process, a copy of the CRA will be sent to the requestor, stating the reasons for the disapproval. The software lead engineer is responsible for assuring that approved CRAs are distributed for implementation and status update.

The baselined material to be modified will be obtained from the [project/product name] software library, and the proposed change(s) will be incorporated by the originator of the baselined material. Review and/or retest of revised baselined material will be accomplished by the [project/product name] V&V engineer(s) to ensure that:
Revision processing was accomplished in accordance with the approved CRA
•
Faults were not introduced during modification
•
Modification has not caused unintended adverse effects
•
The baselined material still meets its specified requirements
The results of the V&V activities will be documented on the CRA. The updated CRA is then reviewed by the software lead engineer to assure the successful implementation of the proposed change(s). If the change implementation is successful, the software lead engineer marks the CRA as closed, and the original is placed in the library. If the change implementation is not successful, the software lead engineer is responsible for directing additional problem analysis and ensuring that a new or revised CRA is classified, reviewed, and submitted for approval in accordance with this plan.
Baselined documents requiring approval signatories will be resubmitted by the originator to the appropriate [project/product name] personnel for authorization to release the revised document.
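Taken together, this change mechanism and the SCRB procedure of Section 3.2.4 imply a small lifecycle for each CRA. A sketch of the permitted transitions, with invented state names; the SOP text, not this sketch, is authoritative:

```python
# CRA state transitions implied by Sections 3.2.3 and 3.2.4 (illustrative).
TRANSITIONS = {
    "submitted":           {"under_evaluation"},
    "under_evaluation":    {"additional_analysis", "approved", "disapproved"},
    "additional_analysis": {"under_evaluation"},
    "approved":            {"implemented"},
    "implemented":         {"verified", "failed_verification"},
    "verified":            {"closed"},
    "failed_verification": {"submitted"},  # a new or revised CRA re-enters the process
    "disapproved":         set(),          # copy returned to requestor with reasons
    "closed":              set(),          # original placed in the software library
}

def advance(state: str, next_state: str) -> str:
    """Move a CRA to its next state, rejecting transitions the plan does not allow."""
    if next_state not in TRANSITIONS[state]:
        raise ValueError(f"CRA may not move from {state} to {next_state}")
    return next_state
```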
3.2.4 Software Change Review Board

The [project/product name] SCRB is established by the [corporate title/position] to coordinate, review, and decide the disposition of Class I and Class II CRAs. The scheduling of meetings of the SCRB is the responsibility of the software configuration manager and is dictated by the impact of the proposed change(s) and the phase of the software development. The [project/product name] SCRB analyzes and identifies the impact of the change documented in the CRA. The CRA is updated to reflect the decisions of the SCRB to:
Direct additional problem analysis
•
Direct that a change different from that proposed in the CRA be implemented
•
Approve the change as proposed
•
Disapprove any change
The SCRB will distribute the original CRA to the software lead engineer, who will provide a copy of the updated CRA to the originator(s) of the affected material. The original CRA will be placed in the [project/product name] software library.
3.2.5 [project/product name] Software Library

The [project/product name] software library will provide source-file control, problem identification, change traceability, and status determination functions for the [project/product name] software development effort. These functions will be provided through the use of:
A source code control system that provides a history of the revisions to source documents and access control for change processing
•
File storage space and access control
•
Development of a file activity and change-status database
•
A set of procedures for the use of these tools
Baselined materials will be distributed from the [project/product name] software library at each major milestone and upon request of the software lead engineer. Software validation of the [project/product name] software will be executed on baselined source code retrieved from the [project/product name] software library.
3.3 Software Configuration Status Accounting

The object of configuration status accounting is to provide (1) identification of the [project/product name] configuration baselines and traceability from the baselines resulting from approved changes and (2) a management tool for monitoring the accomplishment of all related tasks resulting from each approved change. The result of this configuration management activity is a Software Configuration Status Report that is distributed at the completion of each major milestone and upon request of the software lead engineer. This report will list the following information (a generation sketch follows the list):
• Baseline identification, including version
• A list of all baselined material, indicating the current revision and referenced CRAs
• A list of all CRAs, including current disposition or date of closure
• A list of all anomaly reports, including the date of detection, date of fix, associated CRAs if any, date of retest, and current disposition
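Since every fact the report requires is already recorded on baseline entries, CRAs, and Software Anomaly Reports, generation reduces to formatting those records. A sketch against hypothetical record types (field names invented, anomaly fields abbreviated):

```python
from dataclasses import dataclass, field

@dataclass
class CRA:
    number: str
    disposition: str       # e.g., "approved", "closed", "disapproved"
    date_closed: str = ""  # empty until the CRA is closed

@dataclass
class AnomalyReport:
    number: str
    date_detected: str
    disposition: str = "open"
    cra_numbers: list = field(default_factory=list)

def status_report(baseline_id, materials, cras, anomalies):
    """Emit the report sections required by Section 3.3."""
    lines = [f"Baseline: {baseline_id}", "Baselined material (name, revision):"]
    lines += [f"  {name}  rev {rev}" for name, rev in sorted(materials.items())]
    lines.append("CRAs (number, disposition, date closed):")
    lines += [f"  {c.number}  {c.disposition}  {c.date_closed}" for c in cras]
    lines.append("Anomaly reports (number, detected, disposition, CRAs):")
    lines += [f"  {a.number}  {a.date_detected}  {a.disposition}  {','.join(a.cra_numbers)}"
              for a in anomalies]
    return "\n".join(lines)
```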
3.4 Audits and Reviews

The audits and reviews performed during development of the [project/product name] software are shown in Appendix A. The following material is used during the reviews and audits:
• An update of the [project/product name] software configuration status report
• Copies of baselined materials
• A copy of all CRAs generated since the last review or audit, for reference
4.0 TOOLS, TECHNIQUES, AND METHODOLOGIES
SCM of the [project/product name] software end product will be accomplished by software baseline management, software configuration identification, software configuration control, and software configuration status accounting. The [project/product name] software end-product configuration includes the following elements:
• Identification of all components that make up the system
• A description of each component in the system
• A description of the interdependencies of the components of the system
• A description of how the components fit together to form the system
• Identification of the proper revision of each component that makes up a version or software baseline
• Documentation of the changes made to each component
• The accurate building of the desired version of the system

These activities will be performed using automated tools and manual procedures for configuration audits.
4.1 Automated Tools

Appendix C contains the list of automated tools that will be used for the [project/product name] software development effort.
4.1.1 Tools Used for SCM of Prebaselines

During the implementation phase of software development, the configuration of the [project/product name] software end product will be identified, controlled, and reported by the [project/product name] software engineers with the use of the following types of automated tools:
• Object module librarian tool. This tool is used to create, organize, reconstruct, dissect, and compress libraries of object code modules. Using modules from a library promotes structured programming and supports code reusability. Object module libraries used in the [project/product name] software will be placed under configuration management in the [project/product name] software library.
• Executable module creation tool. This tool is used to build and maintain computer programs by recording which parts of the software system depend upon which other parts, together with the steps necessary to reconstruct a correct program or system when some parts are changed, and to automate the software updating process by assuring that the most recent object module copy is used in building the software system (a sketch of this dependency rule follows the list).
• Source code control system. This tool enables the software engineers to control the software system's organization, construction, and maintenance by maintaining a history of the revisions to source documents, as well as access control for change processing.
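The dependency-recording behavior attributed to the executable module creation tool is the classic make-style staleness rule. A sketch of just that rule, with an invented dependency graph; this illustrates the principle rather than any particular vendor tool:

```python
import os

# Hypothetical dependency graph: target -> the source files it is built from.
DEPENDS = {
    "pump.exe": ["pump.obj", "motor.obj"],
    "pump.obj": ["pump.c", "pump.h"],
    "motor.obj": ["motor.c", "pump.h"],
}

def needs_rebuild(target: str) -> bool:
    """A target is stale if it is missing, or older than any of its inputs,
    or depends on something that is itself stale."""
    if not os.path.exists(target):
        return True
    built_at = os.path.getmtime(target)
    for dep in DEPENDS.get(target, []):
        if needs_rebuild(dep) or os.path.getmtime(dep) > built_at:
            return True
    return False
```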
4.1.2 Tools Used for SCM of Baselines

The following tools will be used by the [project/product name] software configuration manager to identify, control, and report the configuration of baselined material:
• Source code control system. All [project/product name] software source documents will be entered into and removed from the [project/product name] software library using this tool, which will document the history of changes made to a source document: who made each change, what the change involved, and when the change was made. This tool will also regenerate prior revisions of a source document, which is useful during regression testing and in isolating faults induced by software changes. The elements belonging to a baseline will be identified by assigning each element the same revision number.
• Secondary storage. Secondary storage will be used to support the storage, archiving, and file access control of baselined material for the [project/product name] software end product. The electronic media of the [project/product name] software library will be placed in the secondary storage.
• Database tool. Configuration status accounting will be implemented using a database tool to generate and maintain status accounting records and reports for baseline identification, CRAs, and Software Anomaly Reports.
4.2 Manual Techniques

SCM for the [project/product name] software end product will require manual audits of pre- and post-baselined material. The objectives of these audits are to assure the following:
• Technical and administrative integrity of a "to-be-established" software baseline
• Each element in the "to-be-established" baseline maps directly or through parents when it is traced through preceding baselines
• System and software requirements are fulfilled through the software configuration specified in the "to-be-established" baseline
• Changes made to a baseline, resulting in an update or version, are implemented as intended

These objectives will be met by the coordination of V&V and SCM activities. A manual check will be performed of all material submitted for baseline assignment against the associated V&V review documentation, and all discrepancies will be reported to the software lead engineer. A new or updated software baseline is established only after the completion of a successful configuration audit of the "to-be-established" baseline material.
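The cross-check described above reduces to a set comparison between the material submitted for baseline assignment and the V&V review records. A minimal sketch, with invented identifiers:

```python
def audit_discrepancies(submitted: set[str], vv_reviewed: set[str]) -> dict[str, set[str]]:
    """Compare material submitted for baseline assignment against V&V records.

    Items submitted without V&V review documentation, and reviewed items
    missing from the submission, are both reportable discrepancies.
    """
    return {
        "submitted_without_review": submitted - vv_reviewed,
        "reviewed_but_not_submitted": vv_reviewed - submitted,
    }

report = audit_discrepancies(
    submitted={"INF-SRS-1.02", "INF-SSC-1.00"},  # hypothetical identifiers
    vv_reviewed={"INF-SRS-1.02"},
)
assert report["submitted_without_review"] == {"INF-SSC-1.00"}
```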
5.0 RECORDS COLLECTION AND RETENTION
The baselined materials, CRAs, and Software Anomaly Reports generated during the development of the [project/product name] software will be collected and stored in the software development library. Distribution of the software documents entered into the software development library will be performed by retrieving a read-only copy of the file for reproduction. Data backup, recovery, and retrieval of the source documents entered into the software development library will be performed in accordance with corporate disaster contingency plans.

Upon completion of the Software Validation Phase and delivery of the [project/product name] software to the quality assurance group for system testing, all source documents of the [project/product name] software entered in the software development library will be archived in secondary storage. Archived files will be saved and made available for retrieval in accordance with corporate disaster contingency plans.
APPENDIX A

SOFTWARE CONFIGURATION MILESTONES

Project Start-Up
• Configuration baselines: (none)
• Configuration items: Software Development Plan (SDP), Software Configuration Management Plan (SCMP), Software Quality Assurance Plan (SQAP), Software Verification and Validation Plan (SVVP), Software End-product Acceptance Plan (SEAP)
• Reviews and audits: (none)

Requirements
• Configuration baselines: Requirements Baseline
• Configuration items: Software Requirements Specification (SRS), Interface Design Specification (IDS), CRAs
• Reviews and audits: Software Requirements Review (SRR)

Architecture Design
• Configuration baselines: Architecture Design Baseline
• Configuration items: Software Architecture Design Specification (SADS), Software Development Test Plan (SDTP), Software Validation Test Plan (SVTP), CRAs
• Reviews and audits: Software Architecture Design Review (SADR), design walkthroughs

Detailed Design
• Configuration baselines: Detailed Design Baseline
• Configuration items: Software Detailed Design Specification (SDDS), CRAs
• Reviews and audits: Software Detailed Design Review (SDDR), design walkthroughs

Code and Test and Integrate and Test
• Configuration baselines: Implementation Baseline
• Configuration items: Source code, Software Development Test Information Sheets (DTISs), Software Validation Test Procedures (SVTPR), Software Anomaly Reports, CRAs, software configuration status
• Reviews and audits: Code walkthroughs, code audits

Software Validation
• Configuration baselines: Validation Baseline
• Configuration items: Software Validation Test Information Sheets (VTISs), Software Verification and Validation Report (SVVR), Software Anomaly Reports, CRAs, software configuration status
• Reviews and audits: Software configuration audit
APPENDIX B

CHANGE REQUEST/APPROVAL (CRA) FORM
1. System name: ______________________________________      2. CRA Number: __________

3. Application Level:   ❑ SOFTWARE   ❑ DOCUMENT   ❑ OTHER

4. a. Originating Organization: ______________________
   b. Initiator: ______________________
   c. Telephone: ______________________
   d. Date: ______________________

5. Configuration Baseline Affected (highest level): ______________________

6. Change Classification:   ❑ Class I   ❑ Class II   ❑ Class III

7. Configuration Items Affected:
   a. ______________________
   b. ______________________
   c. ______________________
   d. ______________________
   e. ______________________

8. Narrative: (if additional space is needed, indicate here ___ Page ___ of ___.)
   a. Description of change:
   b. Need for change:
   c. Estimated effects on other systems, software, or equipment:
   d. Alternatives:
   e. Anomaly Number (if any) used to generate this CRA:

9. Disposition:   ❑ Additional Analysis   ❑ Approved   ❑ Disapproved
   Date: ____________   Signature: ____________________________________________

10. Change Verification Results:

11. V&V Signature: ________________________________   12. Date: ______

13. Date Closed: ________________   14. Signature: __________________________________
APPENDIX C

AUTOMATED TOOLS USED IN SOFTWARE CONFIGURATION MANAGEMENT

Tool/Application Name and Version             Vendor Name                    Tool/Application Description
[Enter tool/application and version number]   [Enter vendor/supplier name]   [Enter brief description of tool/application]
[Enter tool/application and version number]   [Enter vendor/supplier name]   [Enter brief description of tool/application]
[Enter tool/application and version number]   [Enter vendor/supplier name]   [Enter brief description of tool/application]
[Enter tool/application and version number]   [Enter vendor/supplier name]   [Enter brief description of tool/application]
[Enter tool/application and version number]   [Enter vendor/supplier name]   [Enter brief description of tool/application]
GLOSSARY

Accuracy: Quantitative assessment of freedom from error.
Anomaly: Anything observed in the documentation or operation of software that deviates from expectations based on previously verified software products or reference documents.
Archiving: Provisions made for storing and retrieving records over a long period of time.
Audit: Independent review for the purpose of assessing compliance with software requirements, specifications, baselines, standards, procedures, instructions, and coding requirements.
Backup: Provisions made for the recovery of data files or software lost due to a system failure, human failure, or disaster.
Baseline: Specification or product that has been formally reviewed and agreed upon, that thereafter serves as the basis for further development, and that can be changed only through formal change control procedures.
Baselined: A completed baseline.
Change control: Process by which a change is proposed, evaluated, approved or rejected, scheduled, and tracked.
Change Request/Approval (CRA): Form used to document changes to a baseline.
Code: Loosely, one or more computer programs or part of a computer program.
Completeness: Those attributes of the software or documentation that provide full implementation of the functions required.
Component: Unit of code that performs a specific task or a group of logically related code units that perform a specific task or set of tasks.
Computer program: Sequence of instructions suitable for processing by a computer. Processing may include the use of an assembler, a compiler, an interpreter, or a translator to prepare the program for execution as well as to execute it.
Configuration control: Process of evaluating, approving or disapproving, and coordinating changes to configuration items after formal establishment of their configuration identification.
Configuration identification: Process of designating the configuration items in a system and recording their characteristics.
Configuration item: Aggregation of hardware and software, or any of their discrete parts, that satisfies an end-use function.
Configuration management (CM): Process of identifying and defining the configuration items in a system, controlling the release and change of these items throughout the product life cycle, recording and reporting the status of configuration items and change requests, and verifying the completeness and correctness of configuration items.
Configuration status accounting: Recording and reporting of the information that is needed to manage a configuration effectively, including a listing of the approved configuration identification, the status of proposed changes to the configuration, and the implementation status of approved changes.
Correctness: Extent to which software is free of design defects, coding defects, and faults; meets its specified requirements; and meets user expectations.
Delivery: Transfer of responsibility for an item from one activity to another, as in the delivery of the validated software product to Quality Assurance for certification.
Design phase: Period in the software development cycle during which the designs for architecture, software components, interfaces, and data are created, documented, and verified to satisfy requirements.
Documentation: Manuals, written procedures or policies, records, or reports that provide information concerning uses, maintenance, or validation of software.
Error: Discrepancy between a computed, observed, or measured value or condition and the true, specified, or theoretically correct value or condition.
Evaluation: Process of determining whether an item or activity meets specified criteria.
Failure: Inability of a system or system component to perform its required function (see fault).
Fault: Defect of a system or system component, caused by a defective, missing, or extraneous instruction or set of related instructions in the definition, specification, design, or implementation of a system, which may lead to a failure.
Implementation phase: Period in the software development cycle during which a software product is created from design documentation and debugged.
Milestone: Scheduled and accountable event that is used to measure progress.
Quality assurance (QA): Planned and systematic pattern of all actions necessary to provide adequate confidence that the item or product conforms to established technical requirements.
Regression testing: Selective retesting to detect faults introduced during modification, to verify that modifications have not caused unintended adverse effects and that a modified system or system component still meets its specified requirements.
Requirements phase: Period in the software development cycle during which the requirements, such as functional and performance capabilities for a software product, are defined and documented.
Software: Computer programs, procedures, rules, and associated documentation and data pertaining to the operation of a computer system.
Software Change Review Board (SCRB): Forum for the evaluation, approval, monitoring, and control of changes to software baselines.
Software configuration management (SCM): Discipline of identifying the configuration of a software system at discrete points in time for the purpose of systematically controlling changes to this configuration and maintaining the integrity and traceability of this configuration throughout the development process.
Software development library: Software library containing computer-readable and human-readable information relevant to a software development effort.
Software development life cycle: Period that starts with the development of a software product and ends when the product is validated and delivered for QA certification. This life cycle includes a requirements phase, design phase, implementation phase, and software validation phase.
Software Development Plan (SDP): Project-specific plan that identifies and describes the procedures employed to implement the management activities that coordinate schedules, control resources, initiate actions, and monitor progress of the software development effort.
Software end products: Computer programs, software documentation, and databases produced by a software development project.
Software library: Controlled collection of software and related documentation designed to aid in software development, use, or maintenance.
Software Requirements Specification (SRS): Project-specific document that provides a controlled statement of the functional, performance, and external interface requirements for the software end products.
Software tool: Computer program used to help develop, test, analyze, or maintain another computer program or its documentation.
Software Validation Phase: Period in the software development life cycle in which the components of a software product are evaluated and integrated and the entire software product is evaluated to determine whether requirements have been satisfied.
Source code: Original software expressed in human-readable form (programming language), which must be translated into machine-readable form before it can be executed by the computer.
Validation: Process of evaluating software at the end of the software development process to ensure compliance with software requirements.
Verification: Process of determining whether the products of a given phase of the software development cycle fulfill the requirements established during the previous phase.
[Project/Product Name] SDP
SOFTWARE DEVELOPMENT PLAN (without SCMP and SQAP)
Written by:
[Name/Title/Position]
Date
Reviewed by:
[Name/Title/Position]
Date
Approved by:
[Name/Title/Position]
Date
Document Number [aaa]-SDP-[#.#]     Revision [#.#]     Page 1 of [#]
REVISION HISTORY

Revision     Description                Date
[##.##]      [Revision description]     [mm/dd/yy]
CONTENTS

1.0 INTRODUCTION  3
2.0 SOFTWARE DEVELOPMENT OVERVIEW  5
3.0 SOFTWARE DEVELOPMENT REQUIREMENTS  9
4.0 SOFTWARE DEVELOPMENT ADMINISTRATIVE PROCEDURES  25
APPENDIX A  Schedule of Software Development Tasks  27
APPENDIX B  Software Development Personnel and Resource Requirements  29
APPENDIX C  Software Development Tools, Techniques, and Methodologies  30
GLOSSARY  31
1.0 INTRODUCTION
1.1 Purpose

This plan identifies and describes the plan for the development of the software for the [project/product name]. The program of software development defined in this document will be applied throughout all phases of the software development life cycle for the [project/product name] and constitutes a formal software project.
1.2 Scope

The scope of the [project/product name] software development is defined to be those tasks and the information necessary to manage and perform those tasks that are required to ensure the development of quality software for the [project/product name]. The basis for the development of software for the [project/product name] will be the capabilities defined in the system requirements documents. The program described in this plan assures that an appropriate level of software verification and validation (V&V) will be applied to all phases of the software development, while supporting the [project/product name] market strategy and product launch schedule. The software V&V supported by this software development program is contained in the [project/product name] Software Verification and Validation Plan (SVVP).
1.3 Overview

This document describes the organization, activities, schedule, and inputs and outputs required for an effective [project/product name] software development program. The scope of participation by associated organizations in the development of the [project/product name] software product is also identified. Software development will be defined for each phase of the [project/product name] software development life cycle relative to:
• Tasks
• Methods and evaluation criteria
• Inputs and outputs
• Schedule
• Resources
• Risks and assumptions
• Roles and responsibilities
1.4 Referenced Documents The following documents of the exact issue shown form a part of this specification to the extent specified herein. In the event of conflict between the documents referenced herein and the content of this specification, the contents of this specification shall be considered a superseding requirement.
1.4.1 Project Specification
•  [project/product name] Product Objectives Document, Document Number [aaa]-POD-[#.#], Revision [#.#], dated [date]
•  [project/product name] Product Requirements Document, Document Number [aaa]-PRD-[#.#], Revision [#.#], dated [date]
•  [project/product name] Software Configuration Management Plan, Document Number [aaa]-CMP-[#.#], Revision [#.#], dated [date]
•  [project/product name] Software Development Test Plan, Document Number [aaa]-DTP-[#.#], Revision [#.#], dated [date]
•  [project/product name] Software End Product Acceptance Plan, Document Number [aaa]-EAP-[#.#], Revision [#.#], dated [date]
•  [project/product name] Software Quality Assurance Plan, Document Number [aaa]-QAP-[#.#], Revision [#.#], dated [date]
•  [project/product name] Software Verification and Validation Plan, Document Number [aaa]-VVP-[#.#], Revision [#.#], dated [date]
1.4.2 Procedures and Guidelines
•  Product Development Safety Design Guidelines, Revision [#.#], dated [date]
•  Product Development User Interface Design Guidelines, Revision [#.#], dated [date]
•  Software Engineering Configuration Management Guidelines, Revision [#.#], dated [date]
•  Software Engineering Software Development Guidelines, Revision [#.#], dated [date]
•  Software Engineering Software Configuration Management Policies, Revision [#.#], dated [date]
•  Software Engineering Software Development Policies, Revision [#.#], dated [date]
•  Software Engineering Verification and Validation Guidelines, Revision [#.#], dated [date]
•  Software Engineering Software Verification and Validation Policies, Revision [#.#], dated [date]
2.0 SOFTWARE DEVELOPMENT OVERVIEW
2.1 Organization The [project/product name] has been organized under the direction of [title/position], who has assembled a program team that represents all concerned disciplines. This team coordinates the development activities and provides support for product development. Product development of [project/product name] involves [insert engineering disciplines here], and software engineering. The interface between these disciplines during product development is provided through the technical team, who meet to address and resolve [project/product name] product development issues. A lead software engineer heads the [project/product name] software development and provides technical direction in all aspects of software development and V&V. Other software engineers have been assigned to the team, and all participants in the development of the [project/product name] software are responsible for ensuring that their efforts are in compliance with the Software Development Policies.
The functions and tasks of V&V for the [project/product name] software are organized under the direction of [corporate title/position]. The V&V tasks, policies, and procedures are administered and approved by this individual. The authority for resolving project-related issues raised by the V&V tasks and approval of the V&V products resides with this individual or designate. The [project/product name] V&V organization is composed of software engineers who have not been directly involved in the development of the software being verified and have not established the criteria against which the software is validated. Software task assignments will be made by the software lead engineer. A software development schedule will be developed and milestones set for each task and phase. These milestones will include reviews or reports indicating the completion of the requirements for each phase and the methods for assessing what requirements are needed for the subsequent phases. This plan will be administered by the software lead engineer. Any updates or deviations reside under this authority. The software lead engineer also has the responsibility for ensuring that the tasks, activities, assignments, and milestones are properly met.
2.2 Master Schedule The [project/product name] software development consists of the following types of activities:
•  Project administration
•  System interface analysis and design
•  Requirements analysis
•  Software architecture and design
•  Code implementation and testing
•  Code integration and testing
•  System design validation testing
•  Simulation and prototyping support
These types of activities are subdivided into the following software development phases:
•  Project Start-Up Phase
•  Interface Design Phase
•  Requirements Phase
•  Software Architecture Phase
•  Detailed Design Phase
•  Code and Test Phase
•  Integrate and Test Phase
•  Software Validation Phase
A schedule of the [project/product name] software development tasks and the relationship of each task to the phases of software development is presented as Appendix A. Software V&V is an integral part of each phase of the software development life cycle. The V&V tasks are integrated into the project schedule in order to provide feedback to the development process and support management functions. A schedule of the [project/product name] V&V tasks and the relationship of each task to the phases of software development is presented in the SVVP.
2.3 Resources The personnel and material resources required to perform software development of the [project/product name] software are presented as Appendix B. The factors that were analyzed in determining these resource requirements are the product features and performance requirements of the [project/product name] product as specified in the product requirements documentation, and the development of the [project/product name] software project in compliance with the Software Development Policies.
2.4 Responsibilities The project software development organization is responsible for performing the tasks defined in this plan. The personnel selected to perform the development of the [project/product name] software have the technical credibility to understand the source of problems related to software quality, to follow through with recommended corrective actions to the development process, and to abort delivery of defective software end products. Members of the project software development organization will not be assigned to the V&V of the software to be produced on the project, but they will establish the criteria against which the software is validated. The specific roles and responsibilities of the V&V organization during each phase of software development are presented in the SVVP.
2.5 Tools, Techniques, and Methodologies Development of the [project/product name] software will be accomplished by implementing, testing, checking, or otherwise establishing and documenting the conformance of the software to specified requirements. These tasks will be performed manually and by means of automated tools and techniques. Examples of the automated tools to be utilized include static analyzers, dynamic analyzers, comparators, editors, compilers, debuggers, linkers and loaders, and change trackers. Manual tools to be utilized include walk-throughs, formal reviews, and algorithm analysis. Support tools and techniques for development include the following:
•  General system utilities and text processing tools for code and test preparation, organization, and modification
•  Data reduction and report generation tools
•  Library support systems consisting of database management systems and configuration control systems
•  Code editors, assemblers, compilers, linkers and loaders, and debug tools
•  Test drivers and test languages
The selection of tools for the development tasks is based on the objectives and goals for each phase of software development. The necessary tools for software development are listed in Appendix C.
2.6 Software Configuration Management The configuration control of the [project/product name] software is specified in the [project/product name] Software Configuration Management Plan (SCMP).
2.7 Software Quality Assurance The quality assurance of the [project/product name] software is specified in the [project/product name] Software Quality Assurance Plan (SQAP).
2.8 Software Verification and Validation The V&V of the [project/product name] software is specified in the [project/product name] SVVP and its supporting documents.
3.0 SOFTWARE DEVELOPMENT REQUIREMENTS
3.1 Management The management of the software development program described in this plan spans all phases of [project/product name] software development. The [corporate title/position] designates a software lead engineer who is responsible for performing both the software development management tasks and technical direction. The management tasks to be performed for the [project/product name] software development program include, but are not limited to, the following:
•  [project/product name] Software Development Plan (SDP) generation and maintenance
•  Software baseline change assessment for effects on previously baselined tasks
•  Periodic review of the V&V effort, technical accomplishments, resource utilization, future planning, and risk management
•  Daily management of software phase activities, including the technical quality of final and interim testing and reports
•  Review and evaluation of V&V results in order to determine when to proceed to the next software development life-cycle phase and define changes to V&V tasks, which will improve the V&V effort
•  Maintenance of good communication with all team members to ensure the accomplishment of project quality assurance goals and objectives
At each phase of software development, the software tasks and associated inputs and outputs, schedules, resource requirements, risks and assumptions, and personnel responsible for performing the task are evaluated. This evaluation establishes the criteria for updating this plan. Maintenance is performed as necessary to ensure the completeness and consistency of this plan with the changes in software developed for the project. The [corporate title/position] will support the management of software development for the [project/product name] software through reviews of software activities. Periodic reviews of the development effort, technical accomplishments, resource utilization, future planning, and risk management will be conducted by [corporate title/position]. The technical quality and results of the outputs of each phase of the software development will be evaluated in order to provide management support for the software lead engineer's recommendation to proceed or not proceed to the next development phase and to define changes to V&V tasks to improve the V&V effort. Updates to this plan during software development will be reviewed and approved by [corporate title/position] prior to implementation.
3.2 Interface Design Phase Software Development Activities The software development activities that occur during this phase are to determine and document the hardware-to-software and software-to-software interfaces for the product. The hardware-to-software interface tasks will include the following:
•  Review electrical circuits and/or components that must be tested by software
•  Analyze performance requirements, testing procedure, sequencing, and timing as required
•  Identify faults that can be detected by the software tests
•  Determine signals that are passed between software and external devices, components, or peripherals
•  Identify and define discrete signal states, reset states, and initialization values
•  Define relevant transfer functions for any analog signals
•  Define algorithms that encompass electromechanical systems
The software-to-software interface tasks will include the following:
•  Mailbox structure and communication
•  Data structure, values, and defaults
•  Pointer, index, and parameter schemes for accessing data structures and arrays
•  Parameter passing conventions
•  Definition of semaphores, flags, and system status indicators
•  Basic computational units and conversion schemes
The results of this phase will be documented in the [project/product name] Interface Design Specification (IDS). This document will be updated throughout the [project/product name] software development as necessary in order to reflect the evolving software.
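For illustration only: a minimal C sketch of how interface definitions of the kind identified above (discrete signal states, reset and initialization values, a transfer function for an analog channel, and a mailbox message layout) might be captured in a header derived from the IDS. Every name, signal, and value in the sketch is a hypothetical assumption, not a requirement of this plan.

/* ids_signals.h -- illustrative sketch only; every name, signal, and
 * value below is a hypothetical example, not a requirement of this plan. */
#ifndef IDS_SIGNALS_H
#define IDS_SIGNALS_H

#include <stdint.h>

/* Discrete signal states for a hypothetical pump-enable output line;
 * PUMP_ENABLE_OFF is also the defined reset state. */
typedef enum {
    PUMP_ENABLE_OFF = 0,
    PUMP_ENABLE_ON  = 1
} pump_enable_state_t;

/* Initialization values applied at power-up, before any task runs */
#define PUMP_ENABLE_INIT    PUMP_ENABLE_OFF
#define FLOW_SETPOINT_INIT  0u      /* in basic units of FLOW_UNIT_TENTHS_ML */
#define FLOW_UNIT_TENTHS_ML 1u      /* basic computational unit: 0.1 mL/h */

/* Transfer function for a hypothetical analog pressure channel:
 * pressure_kPa = (adc_counts - PRESSURE_OFFSET_COUNTS) * PRESSURE_KPA_PER_COUNT */
#define PRESSURE_OFFSET_COUNTS 512
#define PRESSURE_KPA_PER_COUNT 0.25f

/* Software-to-software mailbox message exchanged between a sensor task
 * and a control task */
typedef struct {
    uint8_t  msg_id;         /* message discriminator */
    uint8_t  src_task;       /* sending task identifier */
    uint16_t flow_setpoint;  /* in FLOW_UNIT_TENTHS_ML units */
    uint16_t pressure_raw;   /* unconverted ADC counts */
} ctrl_msg_t;

#endif /* IDS_SIGNALS_H */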
3.3 Requirements Phase Software Development Activities The goal of the Requirements Phase is to ensure that both the problem and the constraints upon the solution are specified in a rigorous form. During this phase of software development, the software requirements analysis is performed. As problem evaluation and solution synthesis are accomplished, the characteristics of the software are established and design constraints are uncovered. The Product Requirements Document specifies the product or system-level requirements of the [project/product name] and establishes the product requirements from which software requirements are allocated. The [project/product name] Software Requirements Specification (SRS) is the document that specifies the results of the software requirements analysis. The SRS defines the basic functions, performance, interfaces, and flow or structure of information and validation criteria of the successful software implementation. The emphasis of the Software Requirements Phase V&V tasks is the analysis and evaluation of the correctness, consistency, completeness, accuracy, and testability of the specified software requirements.
3.3.1 Software Requirements Specification (SRS)
Review of the product requirements documentation is critical because it establishes the basis upon which all succeeding documents and products are developed. During this phase of development, the specifications of system performance, user interface, and critical components of [project/product name] are reviewed for use in planning and in defining the level of effort required to successfully implement the software. Product requirements documentation of the project is provided to the V&V leader by the software lead engineer for review prior to development of the SRS. The [project/product name] SRS will concentrate on the following areas of requirement definition:
•  Specification of the computing environment(s) in which the software must perform
•  Specification of the safety requirements, including a description of the unsafe operating conditions in terms of critical software functions and goals, the severity of hazards, and the set of associated critical parameters and critical indicators
•  Specification of the hardware interfaces through which the software must gather input and send output
•  Specification of the software interfaces, including the purpose of the interface, the type of data to be interchanged via the interface, and an estimate of data quantity and transfer rate requirements
•  Specification of the user interfaces, including the characteristics that the software must support for each human interface to the software product
•  Specification of the interfaces to communications devices, including the name, version, interface type, and required usage
•  Specification of the required values of each output, expressed through functions and state tables
•  Specification of the timing, accuracy, and stability requirements for each such output value
•  Software design constraints specifying likely changes and desirable subsets that must be accounted for in the design
3.3.2 Inputs and Outputs
The inputs to the Software Requirements Phase include the product requirements documentation, [project/product name] Interface Design Specification (IDS), and periodic program status reports. The outputs of the Software Requirements Phase are the Requirements Phase V&V Task reports, Requirements Phase V&V Task Summary report, the [project/product name] Requirements Traceability Matrix (RTM), the [project/product name] SRS, and updates to this plan as required in order to accommodate changes in product requirements and/or program objectives.
3.3.3 Software Requirements Review (SRR)
The goal of the SRR is to review the progress of the [project/product name] project to date; review the SRS and determine the adequacy, correctness, and testability of the stated software and interface requirements; and determine whether to proceed or not to proceed to the next development phase. All V&V outputs generated during this development phase will be provided by the V&V leader to the software lead engineer prior to the SRR.
3.3.4 Risks and Assumptions
Accomplishment of the scheduled tasks for this phase of software development depends on the following assumptions: that the [corporate title/position] will provide the project organization with the resources required to fulfill the tasks defined, ensure that the necessary inputs are provided in a timely manner, and review the V&V outputs and provide feedback on the completeness and adequacy of each in supporting goals and objectives; and that development of the SRS will comply with the requirements defined in the Software Development Policies.
3.4 Software Architecture Phase Software Development Activities The goal of the Software Architecture Phase is to ensure that a preliminary software design has been achieved that establishes the design baseline from which the detailed design will be developed. The [project/product name] Software Architecture Design Specification (SADS) is generated during this phase of software development. The SADS describes how the software system will be structured to satisfy the requirements identified in the [project/product name] SRS. The SADS translates the software requirements into a description of the software structure, software components, interfaces, and data necessary for the detailed design phase. The goal of the Software Architecture Phase V&V tasks is to ensure internal consistency, completeness, correctness, and clarity of the information needed to support the detailed definition of the individual software system components.
3.4.1 Software Architecture Design Tasks
The architecture design tasks defined for the Software Architecture Phase include the following:
•  Partitioning of the [project/product name] processes into appropriate tasks
•  Determination and design of start-up and shutdown tasks
•  Determination and design of interrupt service routines
•  Partition of the data and control structures into global and local task elements
•  Definition of mailboxes, messages, timers, and pointers
•  Definition of task priorities that are required for intertask communications and synchronization
The software safety features will be included in this design and will be in compliance with the approved software safety design guidelines and with the safety considerations identified in the system Hazards Analysis for functions that are controlled and/or commanded by software. The results of these design tasks will be documented in the Software Architecture Design Specification (SADS). Design walk-throughs will be conducted by the software designer(s) during this phase to examine the characteristics of the software architecture design. V&V will participate in these walk-throughs and will provide results of design V&V to the software lead engineer and [corporate title/position].
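By way of illustration, the following minimal C sketch records the kind of task partitioning and priority definitions that these activities produce for the SADS. The task names, the priority values, and the static-table form are hypothetical assumptions for the example only.

/* task_config.c -- illustrative sketch only; the task names, priority
 * values, and table form are hypothetical examples, not requirements
 * of this plan. */
#include <stddef.h>
#include <stdio.h>

typedef void (*task_entry_t)(void);

typedef struct {
    const char  *name;      /* task identifier recorded in the SADS */
    unsigned     priority;  /* lower number = higher priority */
    task_entry_t entry;     /* task entry point */
} task_def_t;

/* Stub task bodies, standing in for the real partitioned tasks */
static void safety_monitor_task(void) { /* check critical parameters */ }
static void motor_control_task(void)  { /* drive the pump motor */ }
static void ui_task(void)             { /* service the operator display */ }

/* Illustrative priority policy: safety monitoring preempts motor
 * control, which preempts the user interface. */
static const task_def_t task_table[] = {
    { "SAFETY_MONITOR", 1u, safety_monitor_task },
    { "MOTOR_CONTROL",  2u, motor_control_task  },
    { "USER_INTERFACE", 3u, ui_task             },
};

int main(void)
{
    /* List the partitioning so it can be checked against the SADS. */
    for (size_t i = 0; i < sizeof task_table / sizeof task_table[0]; i++)
        printf("%-14s priority %u\n", task_table[i].name, task_table[i].priority);
    return 0;
}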
3.4.2 Inputs and Outputs
The inputs to the Software Architecture Phase include the [project/product name] SRS, Hazards Analysis, [project/product name] RTM, [project/product name] Software Development Test Plan (SDTP), and [project/product name] periodic program status reports. The outputs of the Software Architecture Phase are the V&V Task reports, V&V Task Summary report, [project/product name] Software Validation Test Plan (SVTP), [project/product name] SADS, [project/product name] RTM updates, and updates to this plan as required in order to accommodate changes in [project/product name] product or software requirements.
3.4.3 Software Architecture Review (SAR)
The goal of the SAR is to
•  Review the progress of the [project/product name] project to date
•  Review the SADS and determine the adequacy, correctness, and testability relative to the stated software and interface requirements
•  Evaluate the form, structure, and functional description of the design for correctness, consistency, completeness, and accuracy
•  Evaluate the software structure for robustness, testability, and compliance with established software development procedures and Software Development Policies
•  Analyze the data items defined at each interface for correctness, consistency, completeness, and accuracy
•  Determine whether to proceed or not to proceed to the next development phase
All V&V outputs generated during this development phase will be provided by the V&V leader to the software lead engineer prior to the SAR.
3.4.4 Risks and Assumptions
Accomplishment of the scheduled tasks for this phase of software development depends on the following assumptions: that the [corporate title/position] will provide the project organization with the resources required to fulfill the tasks defined, ensure that the necessary inputs are provided in a timely manner, and review the V&V outputs and provide feedback on the completeness and adequacy of each in supporting goals and objectives; and that development of the [project/product name] SADS will comply with the requirements defined in the Software Development Policies.
3.5 Detailed Design Phase Software Development Activities The goal of the Detailed Design Phase is to ensure that the detailed software design satisfies the requirements and constraints specified in the [project/product name] SRS and augments the design specified in the [project/product name] SADS. A [project/product name] Software Detailed Design Specification (SDDS) is generated during this phase of software development. The SDDS describes how the software system will be structured to satisfy the requirements identified in the SRS and supports the design specified in the SADS. The SDDS translates the software requirements into a description of the software structure, software components, interfaces, and data necessary for the implementation phases. The result is a solution specification that can be implemented in code with little additional refinement. The goal of the Detailed Design Phase V&V tasks is to ensure the internal consistency, completeness, correctness, and clarity of the [project/product name] SDDS and to verify that the implemented design will satisfy the requirements specified in the [project/product name] SRS.
3.5.1 Software Detailed Design Tasks
The detailed design tasks defined for the Detailed Design Phase include the following:
•  Partitioning of the [project/product name] SADS tasks into appropriate subtasks
•  Final design of start-up and shutdown tasks
•  Final design of interrupt service routines
•  Partition of the data and control structures into global, local, and task elements
•  Final definition of mailboxes, messages, timers, and pointers
•  Final definition of task priorities that are required for intertask communications and synchronization
•  Development of structure charts
•  Development of final state transition, data flow, and control flow diagrams
The software safety features will be included in this design and will be in compliance with the approved software safety design guidelines and with the safety considerations identified in the system Hazards Analysis for functions that are controlled and/or commanded by software. The results of these design tasks will be documented in the SDDS.
Design walk-throughs are conducted by the software designer(s) during the detailed design phase to examine the characteristics of the detailed software design. V&V will participate in these walk-throughs and will provide results of design V&V to the software lead engineer and the [corporate title/position].
3.5.2 Software Development Test Information Sheets
A Software Development Test Information Sheet (DTIS) is prepared by the software developers for each software component test defined in the SDTP. The DTISs are provided to the V&V leader by the software lead engineer for review prior to the Software Detailed Design Review (SDDR). Verification of the adequacy of software component testing is supported by the review of the DTISs.
3.5.3 Inputs and Outputs
The inputs to the Detailed Design Phase include the [project/product name] SRS, Hazards Analysis, [project/product name] SADS, [project/product name] RTM, and [project/product name] periodic program status reports. The outputs of the Detailed Design Phase are the Detailed Design Phase V&V Task reports, Detailed Design Phase V&V Task Summary report, [project/product name] VTISs, [project/product name] SDDS, [project/product name] DTISs, [project/product name] RTM updates, and updates to this plan as required in order to accommodate changes in [project/product name] product or software requirements.
3.5.4 Software Detailed Design Review
The goal of the SDDR is to
•  Review the progress of the [project/product name] project to date
•  Review the SDDS and determine the adequacy, correctness, and testability relative to the stated software and interface requirements
•  Evaluate the form, structure, and functional description of the design for correctness, consistency, completeness, and accuracy
•  Evaluate the software structure for robustness, testability, and compliance with established software development procedures and Software Development Policies
•  Analyze the data items defined at each interface for correctness, consistency, completeness, and accuracy
•  Determine whether to proceed or not to proceed to the next development phase
All V&V outputs generated during this development phase will be provided by the V&V leader to the software lead engineer prior to the SDDR. An assessment will be made of how well the software structures defined in the [project/product name] SDDS satisfy the fundamentals of structured design. Structured design techniques that provide a foundation for "good" design methods include the following (a brief illustrative sketch follows the list):
•  Evaluating the preliminary software structure to reduce coupling and improve cohesion
•  Minimizing structures with high fan-out and striving for fan-in as depth increases
•  Keeping the scope of effect of a component within the scope of control of that component
•  Evaluating component interfaces to reduce complexity and redundancy and improve consistency
•  Defining components that have predictable function, but avoiding components that are overly restrictive
•  Striving for single-entry, single-exit components, and avoiding content coupling
•  Packaging software on the basis of design constraints and portability requirements
•  Selecting the size of each component so that independence is maintained
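The sketch below, in C, illustrates several of these criteria in miniature: a single-entry, single-exit component with a predictable function that communicates only through its parameter list and return value, avoiding content coupling. The component name, its types, and the limit values are hypothetical.

/* alarm_eval.c -- illustrative sketch only; the component, its types,
 * and the limit values are hypothetical. It shows a single-entry,
 * single-exit component that avoids content coupling by communicating
 * only through its parameter list and return value. */
#include <stdbool.h>
#include <stddef.h>

typedef struct {
    int pressure_kpa;  /* measured pressure */
    int limit_kpa;     /* configured alarm limit */
} alarm_input_t;

/* Predictable function: a pure mapping from inputs to one result, so
 * the component's scope of effect stays within its scope of control. */
static bool pressure_alarm(const alarm_input_t *in)
{
    bool alarm = false;   /* single exit through one return statement */

    if (in != NULL && in->pressure_kpa > in->limit_kpa)
        alarm = true;

    return alarm;
}

int main(void)
{
    alarm_input_t sample = { 310, 300 };  /* hypothetical reading */
    return pressure_alarm(&sample) ? 0 : 1;
}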
3.5.5 Risks and Assumptions
Accomplishment of the scheduled tasks for this phase of software development depends on the following assumptions: that the [corporate title/position] will provide the project organization with the resources required to fulfill the tasks defined, ensure that the necessary inputs are provided in a timely manner, and review the V&V outputs and provide feedback on the completeness and adequacy of each in supporting goals and objectives; and that development of the [project/product name] SDDS and DTISs will comply with the requirements defined in the Software Development Policies.
3.6 Code and Test Phase Software Development Activities The goals of the Code and Test Phase are as follows:
•  To ensure that the design is correctly implemented in code, resulting in a program or system that is ready for integration and validation
•  To ensure the accurate translation of the detailed design
•  To detect undiscovered errors
Verification of the Code and Test Phase activities performed by software developers is accomplished by reviewing code and test results.
3.6.1 Code and Test Tasks
The [project/product name] software will be coded, using the text editor approved for the project, in conformance with the applicable programming guidelines. Code testing will be performed in accordance with the SDTP and DTIS requirements at the level of detail represented by the software at the time of the test. The source code submitted for integration will be free of unacceptable diagnostic-utility findings and compiler warnings. A list of acceptable diagnostic and compiler warnings and switch settings will be maintained by the software integrator. Software components will be analyzed by a complexity analysis tool. Code walk-throughs will be conducted by the code developer(s) during the implementation to examine both high-level and detailed properties of the source code. V&V will participate in these walk-throughs and provide results of source code V&V to the software lead engineer and the [corporate title/position]. Code reviews will also be performed by V&V. The V&V reviews of source code will
•  Evaluate the structure of the source code for compliance with coding standards
•  Assess the communication value of the source code
•  Evaluate the source code for efficiency of algorithms, memory efficiency, execution efficiency, and input and output efficiency
•  Evaluate the source code for consistency, completeness, and traceability to software requirements and design
Discrepancies and deficiencies found during V&V of the source code are documented in Software Anomaly Reports. During the Code and Test Phase, software developers will use DTISs to conduct software component testing (a minimal illustrative test driver appears at the end of this subsection). At the successful completion of the testing described, the DTIS is signed and dated by the software lead engineer. DTISs and associated test data will be provided to the V&V leader by the software lead engineer as each test is completed. Completed DTISs will be analyzed by V&V to evaluate the following:
•  Adequacy of test coverage
•  Adequacy of test data
•  Software behavior
•  Software reliability
Discrepancies and deficiencies found during software component testing are documented in Software Anomaly Reports.
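As an illustration of the component testing conducted under a DTIS, the following minimal C test driver exercises a hypothetical component (range_clamp) against nominal and boundary cases and reports a pass/fail result. The component, the cases, and the reporting format are assumptions for the example, not prescribed by the SDTP or any DTIS.

/* dtis_driver.c -- illustrative sketch only; the component under test
 * (range_clamp), its test cases, and the reporting format are
 * hypothetical, not prescribed by the SDTP or any DTIS. */
#include <stdio.h>

/* Component under test: clamp a value into the range [lo, hi]. */
static int range_clamp(int value, int lo, int hi)
{
    if (value < lo) return lo;
    if (value > hi) return hi;
    return value;
}

int main(void)
{
    /* Cases of the kind a DTIS would enumerate: nominal and boundary */
    struct { int value, lo, hi, expected; } cases[] = {
        {  5, 0, 10,  5 },  /* nominal: value inside range     */
        { -3, 0, 10,  0 },  /* boundary: below the lower limit */
        { 42, 0, 10, 10 },  /* boundary: above the upper limit */
    };
    unsigned failures = 0;

    for (unsigned i = 0; i < sizeof cases / sizeof cases[0]; i++) {
        int got = range_clamp(cases[i].value, cases[i].lo, cases[i].hi);
        if (got != cases[i].expected) {
            printf("case %u FAILED: got %d, expected %d\n", i, got, cases[i].expected);
            failures++;
        }
    }
    printf("component test result: %s\n", failures ? "FAIL" : "PASS");
    return (int)failures;
}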
3.6.2 Inputs and Outputs
Inputs to the Code and Test Phase include the [project/product name] SDDS, [project/product name] SRS, [project/product name] code, [project/product name] SDTP, [project/product name] DTISs, [project/product name] SVTP, and [project/product name] Validation Test Information Sheets (VTISs). The outputs are V&V task reports, anomaly reports, a task summary report, [project/product name] Software Validation Test Procedures (SVTPR), updates to the [project/product name] RTM, and updates to this plan as required.
3.6.3 Risks and Assumptions
Accomplishment of the scheduled tasks for this phase of software development depends on the following assumptions: that the [corporate title/position] will provide the project organization with the resources required to fulfill the tasks defined, ensure that the necessary inputs are provided in a timely manner, and review the V&V outputs and provide feedback on the completeness and adequacy of each in supporting goals and objectives; that development of the [project/product name] code and conduct of software component testing will comply with the requirements defined in the Software Development Policies; and that, when component testing has been completed successfully, [project/product name] code will be delivered to the V&V organization for baselining in accordance with this plan and the [project/product name] SCMP.
3.7 Integrate and Test Phase Software Development Activities The goals of the Integrate and Test Phase are as follows:
•  To ensure that the design is correctly implemented in code, resulting in a program or system that is ready for validation
•  To ensure the accurate translation of the implemented code and detailed design
•  To detect undiscovered errors
Verification of the Integrate and Test Phase activities performed by software developers is accomplished by reviewing code and test results.
3.7.1 Integrate and Test Tasks
Integration will be the responsibility of the software integrator, who will receive and audit the software developer's deliverables. Code to be integrated will be verified against the current integrated baseline to ensure that it does not cause unexpected side effects. The code to be integrated will conform to the requirements specified in the Product Requirements Document (PRD) and IDS. If the audit and integration are acceptable, the code will be added to the integration baseline. Otherwise, the code will be returned to the developer for error resolution and correction. Code walk-throughs may be conducted by the code developer(s) during the integration phase at the request of the software lead engineer or software integrator to examine both high-level and detailed properties of the source code. V&V will participate in these walk-throughs and will provide results of source code V&V to the software lead engineer and the [corporate title/position]. Code reviews will also be performed by V&V. The V&V reviews of source code will
•  Evaluate the structure of the integrated code for compliance with coding standards
•  Assess the communication value of the integrated code
•  Evaluate the integrated code for efficiency of algorithms, memory efficiency, execution efficiency, and input/output efficiency
•  Evaluate the integrated code for consistency, completeness, and traceability to software requirements and design
Discrepancies and deficiencies found during V&V of the source code are documented in Software Anomaly Reports. During the Integrate and Test Phase, software developers will use DTISs to conduct software component integration testing. At the successful completion of the testing described, the DTIS is signed and dated by the software lead engineer. DTISs and associated test data will be provided to the V&V leader by the software lead engineer as each test is completed. Completed DTISs will be analyzed by V&V to evaluate the following:
•  Adequacy of test coverage
•  Adequacy of test data
•  Software behavior
•  Software reliability
Discrepancies and deficiencies found during software integration testing are documented in Software Anomaly Reports.
3.7.2 Inputs and Outputs
Inputs to the Integrate and Test Phase include the following:
•  [project/product name] SDDS
•  [project/product name] SRS
•  [project/product name] code
•  [project/product name] SDTP
•  [project/product name] DTISs
•  [project/product name] SVTP
•  [project/product name] VTISs
The outputs are V&V task reports, anomaly reports, a task summary report, updates to the [project/product name] SVTPR, updates to the [project/product name] RTM, and updates to this plan as required.
3.7.3 Risks and Assumptions
Accomplishment of the scheduled tasks for this phase of software development depends on the following assumptions:
•  The [corporate title/position] will provide the project organization with the resources required to fulfill the tasks defined, ensure that the necessary inputs are provided in a timely manner, and review the V&V outputs and provide feedback on the completeness and adequacy of each in supporting goals and objectives.
•  Integration of the [project/product name] code and conduct of software component integration testing will comply with the requirements defined in the Software Development Policies.
•  When component testing has been completed successfully, the [project/product name] code will be delivered to the V&V organization for baselining in accordance with this plan and the [project/product name] SCMP.
3.8 Validation Phase Software V&V Activities The goal of the Software Validation Phase V&V is to verify that the [project/product name] software satisfies the requirements and design specified in the [project/product name] SRS and SDDS.
3.8.1 Prevalidation Software Configuration Control
At the completion of software component testing, the software is placed under configuration control for baseline processing. The baselined source code and associated files will be stored in the project software library in accordance with the [project/product name] SCMP. The library will provide internal source file control, problem identification, change traceability, and status determination of the software and associated documentation. By this means, software configuration is controlled prior to software validation.
3.8.2 Software Validation Testing
Software validation is performed using the current controlled version of the software. Software validation is conducted in accordance with the [project/product name] SVTP using the [project/product name] SVTPR. The results of software validation are documented on the VTISs. Validation test results are analyzed to determine whether the software satisfies software requirements and objectives.
Software Anomaly Reports are generated to document test failures and software faults. Control of the software configuration under test is maintained by implementing the procedures in the [project/product name] SCMP. The SCMP describes the required steps for processing, reporting, and recording approved software changes and dissemination of baselined descriptive documentation and software media.
3.8.3 Software Configuration Audit
A Software Configuration Audit of the validated software is conducted by V&V at the conclusion of software validation. The baselined software documentation is audited to determine that all the software products to be delivered for certification are present. The version description of all items is verified to demonstrate that the delivered software end products correspond to the software subjected to software validation. Discrepancies and deficiencies found during the software configuration audit are documented in Software Anomaly Reports. All anomaly reports are provided to the software lead engineer and the [corporate title/position] for initiation of corrective action. Completion of the Software Configuration Audit is contingent upon closure of all outstanding software discrepancies and deficiencies. Upon successful completion of this audit, the certified software products are delivered by the V&V leader to the software lead engineer for final product certification. The Software Configuration Audit Report is generated by the V&V leader to document the final configuration of the software products delivered for product certification.
3.8.4 Software Verification and Validation Report
The [project/product name] Software Verification and Validation Report (SVVR) is generated by the V&V leader at the completion of all V&V tasks during the Software Validation Phase. The SVVR is a summary of all V&V activities and results, including status and disposition of anomalies. An assessment of the overall software quality and recommendations for software and/or development process improvements are documented in the report.
3.8.5 Inputs and Outputs
The inputs to the Software Validation Phase V&V are the following:
•  [project/product name] product requirements documentation
•  [project/product name] SRS
•  [project/product name] SDDS
•  [project/product name] SVTP
•  [project/product name] VTISs
•  [project/product name] SVTPR
The outputs are the completed [project/product name] VTISs, Software Validation Phase V&V Task Summary report, Software Configuration Audit Report, [project/product name] SVVR, anomaly reports, and updates to this plan as required in order to accommodate changes in the [project/product name] software validation program.
3.8.6 Risks and Assumptions
Accomplishment of the scheduled tasks for this phase of software development depends on the following assumptions:
•  The [corporate title/position] will provide the project organization with the resources required to fulfill the V&V tasks defined, will ensure that the necessary inputs are provided in a timely manner, and will review the V&V outputs and provide feedback on the completeness and adequacy of each in supporting goals and objectives.
•  Changes to [project/product name] requirements that have not been approved by the [title/position] will not be implemented in the [project/product name] code without the approval of the [corporate title/position] or designate.
•  [project/product name] software to be tested will be obtained by the V&V engineer(s) from the configuration management library.
4.0 SOFTWARE DEVELOPMENT ADMINISTRATIVE PROCEDURES
4.1 Additional Software Development Procedures The [project/product name] does not require any additional software development procedures.
4.2 Commercially Available, Reusable Software The [project/product name] will use [insert language and version number here] as its source language and libraries. The product will also use [insert any purchased software products, libraries, or package names here].
4.3 Software End-product Repository The released [project/product name] product software, data, documents, specifications, and libraries will be entered into the product quality assurance system for product life cycle configuration control in accordance with established software end-product SOPs. The non-released [project/product name] product software end products will be entered into the software quality assurance system for product life cycle configuration control in accordance with established software end-product SOPs. The non-released [project/product name] product hardware, firmware, and hardware-control-support software end products will be entered into the software quality assurance system for product life cycle configuration control in accordance with established software end-product SOPs.
4.4 Software Development Metrics The [project/product name] software team will collect the following software metrics:
•  Defects, counted as a function of the Software Anomaly Reports and Change Request/Approvals (CRAs)
•  Number of software components subjected to walk-throughs
•  Number of software components integrated
•  Code complexity, measured by a software complexity analysis tool (see the illustrative sketch following this list)
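For the complexity metric, a brief illustration may help: many complexity analysis tools report McCabe's cyclomatic number, which for a single structured function can be counted as the number of decision points plus one. The function below is hypothetical.

/* complexity_example.c -- illustrative sketch only; the function is
 * hypothetical. For a single structured function, McCabe's cyclomatic
 * number can be counted as decision points + 1. Here one `for` and two
 * `if`s give 3 decisions, so V(G) = 4. */
static int count_alarms(const int readings[], int n, int limit)
{
    int alarms = 0;

    for (int i = 0; i < n; i++) {   /* decision 1 */
        if (readings[i] < 0)        /* decision 2: invalid input */
            return -1;
        if (readings[i] > limit)    /* decision 3 */
            alarms++;
    }
    return alarms;
}

int main(void)
{
    int samples[] = { 290, 305, 312 };  /* hypothetical pressure readings */
    return count_alarms(samples, 3, 300) == 2 ? 0 : 1;
}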
APPENDIX A SCHEDULE OF SOFTWARE DEVELOPMENT TASKS

A.1 TASKS, ACTIVITIES, AND DOCUMENTS BY PHASE

Project Start-Up: Estimations; Software Development Plan (SDP); Software Configuration Management Plan (SCMP); Software Quality Assurance Plan (SQAP); Software End-product Acceptance Plan (SEAP)
Interface Design: Interface requirements and design analysis; Interface Design Specification (IDS)
Requirements: System requirements document review; Software requirements analysis; Software Requirements Specification (SRS); Software Requirements Review (SRR)
Architecture Design: Software architecture design analysis; Software architecture design walkthrough; Software Architecture Design Specification (SADS); Software Development Test Plan (SDTP); Software Architecture Design Review (SADR)
Detailed Design: Software detailed design analysis; Software detailed design walkthrough; Software Detailed Design Specification (SDDS); Software Development Test Information Sheets (DTISs); Software Detailed Design Review (SDDR)
Code and Test: Code implementation; Code walkthroughs; Code audits; DTIS execution
Integrate and Test: Integration implementation; Code walkthroughs; Code audits; DTIS execution
Software Validation: Software validation test conduct; Regression test conduct; Software configuration audit and report; Software Verification and Validation Report (SVVR)
A.2 SOFTWARE IMPLEMENTATION PHASED DELIVERABLES

[Enter intended order of build and baseline deliverables]
APPENDIX B SOFTWARE DEVELOPMENT PERSONNEL AND RESOURCE REQUIREMENTS

[Matrix: for each resource below, enter the headcount required during each phase: Project Start-Up, Interface Design, Requirements, Architecture Design, Detailed Design, Code and Test, Integrate and Test, and Software Validation.]

Resource Requirement
Software Engineers: Software Lead Engineer; Software Scientist; System Analyst; Software Analyst; Programmer Analyst; Programmer; Associate Programmer
Software V&V Engineers: Software V&V Lead Engineer; Senior Software V&V Scientist; Software V&V Scientist; Senior Software V&V Analyst; Software V&V Analyst; Software V&V Engineer; Associate Software V&V Engineer
Hardware Support Software Engineers; Senior Software Technician; Software Technician
APPENDIX C SOFTWARE DEVELOPMENT TOOLS, TECHNIQUES, AND METHODOLOGIES

[Matrix: for each resource below, mark (X) the phases in which it is required: Project Start-Up, Interface Design, Requirements, Architecture Design, Detailed Design, Code and Test, Integrate and Test, and Software Validation.]

Resource Requirement
Software Development Software: Database software; Text-processing software; Graphical/drawing software; Spreadsheet software; Project management software; Performance analyzer; Logic analyzer; Complexity analysis software; Reverse engineering software; Configuration control software; Code debugger software; Compiler software; Assembler software; Linker/loader software; Application development software
Software Development Hardware: Emulator; PC/workstation; Laptop/notebook computers; Controlled current power supply; Test fixtures; EPROM generator
GLOSSARY
Accuracy: Quantitative assessment of freedom from error.
Algorithm: Finite set of well-defined rules for the solution of a problem in a finite number of steps.
Algorithm analysis: Examination of an algorithm to determine its correctness with respect to its intended use, to determine its operational characteristics, or to understand it more fully in order to modify, simplify, or improve it.
Anomaly: Anything observed in the documentation or operation of software that deviates from expectations based on previously verified software products or reference documents.
Audit: Independent review for the purpose of assessing compliance with software requirements, specifications, baselines, standards, procedures, instructions, and coding requirements.
Baseline: Specification or product that has been formally reviewed and agreed upon, which thereafter serves as the basis for further development and can be changed only through formal change control procedures.
Change control: Process by which a change is proposed, evaluated, approved or rejected, scheduled, and tracked.
Code: Loosely, one or more computer programs or part of a computer program.
Completeness: Those attributes of the software or documentation that provide full implementation of the functions required.
Component: Unit of code that performs a specific task or a group of logically related code units that perform a specific task or set of tasks.
Component testing: Testing conducted to verify the implementation of the design for one software component or collection of software components.
Computer program: Sequence of instructions suitable for processing by a computer. Processing may include the use of an assembler, a compiler, an interpreter, or a translator to prepare the program for execution as well as to execute it.
Configuration item: Aggregation of hardware, software, or any of its discrete parts that satisfies an end-use function.
Configuration management: Process of identifying and defining the configuration items in a system, controlling the release and change of these items throughout the product life cycle, recording and reporting the status of configuration items and change requests, and verifying the completeness and correctness of configuration items.
Consistency: Those attributes of the software or documentation that provide uniformity in the specification, design, and implementation of the product.
Correctness: Extent to which software is free of design defects, coding defects, and faults; meets its specified requirements; and meets user expectations.
Critical software: Software whose failure could have an impact on safety.
Delivery: Transfer of responsibility for an item from one activity to another, as in the delivery of the validated software product to Quality Assurance for certification.
Design phase: Period in the software development cycle during which the designs for architecture, software components, interfaces, and data are created, documented, and verified to satisfy requirements.
Deviation: Authorization for a future activity, event, or product that departs from standard procedures.
Documentation: Manuals, written procedures or policies, records, or reports that provide information concerning uses, maintenance, or validation of software.
Error: Discrepancy between a computed, observed, or measured value or condition and the true, specified, or theoretically correct value or condition.
Evaluation: Process of determining whether an item or activity meets specified criteria.
Failure: Inability of a system or system component to perform its required function (see fault).
Fault: Defect of a system or system component, caused by a defective, missing, or extraneous instruction or set of related instructions in the definition, specification, design, or implementation of a system, which may lead to a failure.
Hazard: Dangerous state of a device or system that may lead to death, injury, occupational illness, or damage to or loss of equipment or property.
Implementation phase: Period in the software development cycle during which a software product is created from design documentation and debugged.
Integration: Process of combining software elements, hardware elements, or both into an overall system.
Milestone: Scheduled and accountable event that is used to measure progress.
Quality assurance: Planned and systematic pattern of all actions necessary to provide adequate confidence that the item or product conforms to established technical requirements.
Requirements phase: Period in the software development cycle during which the requirements, such as functional and performance capabilities for a software product, are defined and documented.
Robustness: Extent to which software can continue to operate correctly despite the introduction of invalid inputs.
Safety: Provision of a very high degree of freedom, within the constraints of system effectiveness and cost, from those conditions that can cause death, injury, occupational illness, or damage to or loss of equipment or property.
Software: Computer programs, procedures, rules, and associated documentation and data pertaining to the operation of a computer system.
Software Architecture Design Specification (SADS): Project-specific document that contains the design information needed to support the detailed definition of the individual software system components and, upon completion of the Architecture Design Review, becomes the design baseline for development of the SDDS used in support of software coding.
Software Configuration Management Plan (SCMP): Project-specific plan that specifies the methods and planning employed to implement software configuration management activities.
Software Detailed Design Review (SDDR): Review conducted for the purpose of (1) reviewing the project's detailed design, SDDS, associated plans, and critical issues; (2) resolving identified issues; (3) obtaining commitment to proceed into the code and test phase; and (4) obtaining commitment to a test program supporting product acceptance.
Software Detailed Design Specification (SDDS): Project-specific document that constitutes an update to and an expansion of the design baseline established at the Architecture Design Review, including a description of the overall program operation and control and the use of common data. The detailed design is described through the lowest component level of software organization and the lowest logical level of database organization.
Software development life cycle: Period that starts with the development of a software product and ends when the product is validated and delivered for QA certification. This life cycle includes a requirements phase, design phase, implementation phase, and software validation phase.
Software Development Plan (SDP): Project-specific plan that identifies and describes the procedures employed to implement the management activities that coordinate schedules, control resources, initiate actions, and monitor progress of the software development effort.
Software Development Test Plan (SDTP): Project-specific plan that defines the scope of software testing that must be completed successfully for each software component developed.
Software end products: Computer programs, software documentation, and databases produced by a software development project.
Software library: Controlled collection of software and related documentation designed to aid in software development, use, or maintenance.
Software quality: Totality of features and characteristics of a software product that bear on its ability to satisfy given needs.
Software Quality Assurance Plan (SQAP): Project-specific plan that states the software quality objectives of the project as conditioned by the product requirements and the significance of the intended application.
Software reliability: Probability that software will not cause the failure of a system for a specified time under specified conditions.
Software Requirements Review (SRR): Review of the provisions of the Software Requirements Specification, which, once approved, will serve as the basis of software end-product acceptance.
Software Requirements Specification (SRS): Project-specific document that provides a controlled statement of the functional, performance, and external interface requirements for the software end products.
Software Validation Phase: Period in the software development life cycle in which the components of a software product are evaluated and integrated and the entire software product is evaluated to determine whether requirements have been satisfied.
Software Validation Test Plan (SVTP): Project-specific plan that describes the software testing required to verify that the software product satisfies the specified requirements.

Source code: Original software expressed in human-readable form (programming language), which must be translated into machine-readable form before it can be executed by the computer.

Test Information Sheet (TIS): Document that defines the objectives, approach, and requirements for a specific test.

Testability: Extent to which software facilitates both the establishment of test criteria and the evaluation of the software with respect to those criteria, or the extent to which the definition of requirements facilitates analysis of the requirements to establish test criteria.

Validation: Process of evaluating software at the end of the software development process to ensure compliance with software requirements.

Verification: Process of determining whether the products of a given phase of the software development cycle fulfill the requirements established during the previous phase.

Walk-through: Review in which the designer or programmer leads members of the review team through a segment of design or code, and the reviewers ask questions and submit comments about technique, style, possible errors, violation of development standards, and other problems.
SDP Software Development Plan (with SCMP and SQAP)
[Project/Product Name] SDP
SOFTWARE DEVELOPMENT PLAN

Written by:   [Name/Title/Position]   Date
Reviewed by:  [Name/Title/Position]   Date
Approved by:  [Name/Title/Position]   Date

Document Number: [aaa]-SDP-[#.#]   Revision: [#.#]   Page: 1 of [#]
REVISION HISTORY

Revision   Description              Date
[##.##]    [Revision description]   [mm/dd/yy]
CONTENTS

1.0 INTRODUCTION
2.0 SOFTWARE DEVELOPMENT OVERVIEW
3.0 SOFTWARE DEVELOPMENT REQUIREMENTS
4.0 SOFTWARE DEVELOPMENT ADMINISTRATIVE PROCEDURES
APPENDIX A Schedule of Software Development Tasks
APPENDIX B Software Development Personnel and Resource Requirements
APPENDIX C Software Development Tools, Techniques, and Methodologies
APPENDIX D Software Configuration Milestones
APPENDIX E Change Request/Approval (CRA) Form
GLOSSARY
1.0 INTRODUCTION
1.1 Purpose

This plan identifies and describes the approach to be taken in developing the software for the [project/product name]. The program of software development defined in this document will be applied throughout all phases of the software development life cycle for the [project/product name] and constitutes a formal software project.
1.2 Scope

The scope of the [project/instrument name] software development is defined as the tasks, and the information necessary to manage and perform those tasks, that are required to ensure the development of quality software for the [project/product name]. The basis for the development of software for the [project/product name] will be the capabilities as defined in the system requirements documents. The program described in this plan assures that an appropriate level of software verification and validation (V&V) will be applied to all phases of the software development, while supporting the [project/instrument name] market strategy and product launch schedule. The software V&V supported by this software development program is contained in the [project/product name] Software Verification and Validation Plan (SVVP).
1.3 Overview

This document describes the organization, activities, schedule, and inputs and outputs required for an effective [project/product name] software development program. The scope of participation by associated organizations in the development of the [project/product name] software product is also identified. Software development will be defined for each phase of the [project/product name] software development life cycle relative to

• Tasks
• Methods and evaluation criteria
• Inputs and outputs
• Schedule
• Resources
• Risks and assumptions
• Roles and responsibilities
1.4 Referenced Documents

The following documents of the exact issue shown form a part of this specification to the extent specified herein. In the event of conflict between the documents referenced herein and the content of this specification, the contents of this specification shall be considered a superseding requirement.
1.4.1 Project Specification

• [project/product name] Product Objectives Document, Document Number [aaa]-POD-[#.#], Revision [#.#], dated [date]
• [project/product name] Product Requirements Document, Document Number [aaa]-PRD-[#.#], Revision [#.#], dated [date]
• [project/product name] Software Development Test Plan, Document Number [aaa]-DTP-[#.#], Revision [#.#], dated [date]
• [project/product name] Software End Product Acceptance Plan, Document Number [aaa]-EAP-[#.#], Revision [#.#], dated [date]
• [project/product name] Software Verification and Validation Plan, Document Number [aaa]-VVP-[#.#], Revision [#.#], dated [date]

1.4.2 Procedures and Guidelines

• Product Development Safety Design Guidelines, Revision [#.#], dated [date]
• Product Development User Interface Design Guidelines, Revision [#.#], dated [date]
• Software Engineering Configuration Management Guidelines, Revision [#.#], dated [date]
• Software Engineering Software Development Guidelines, Revision [#.#], dated [date]
• Software Engineering Software Configuration Management Policies, Revision [#.#], dated [date]
• Software Engineering Software Development Policies, Revision [#.#], dated [date]
• Software Engineering Verification and Validation Guidelines, Revision [#.#], dated [date]
• Software Engineering Software Verification and Validation Policies, Revision [#.#], dated [date]
2.0 SOFTWARE DEVELOPMENT OVERVIEW
2.1 Organization

The [project/product name] has been organized under the direction of [title/position], who has assembled a program team that represents all concerned disciplines. This team coordinates the development activities and provides support for product development. Product development of [project/product name] involves [insert engineering disciplines here], and software engineering. The interface between these disciplines during product development is provided through the technical team, who meet to address and resolve [project/product name] product development issues. A lead software engineer heads the [project/product name] software development and provides technical direction in all aspects of software development and V&V. Other software engineers have been assigned to the team, and all participants in the development of the [project/product name] software are responsible for ensuring that their efforts are in compliance with the Software Development Policies.

The functions and tasks of V&V for the [project/product name] software are organized under the direction of [corporate title/position]. The V&V tasks, policies, and procedures are administered and approved by this individual. The authority for resolving project-related issues raised by the
V&V tasks and approval of the V&V products resides with this individual or designate. The [project/product name] V&V organization is composed of software engineers who have not been directly involved in the development of the software being verified and have not established the criteria against which the software is validated. Software task assignments will be made by the software lead engineer. A software development schedule will be developed and milestones set for each task and phase. These milestones will include reviews or reports indicating the completion of the requirements for each phase and the methods for assessing what requirements are needed for the subsequent phases. This plan will be administered by the software lead engineer. Any updates or deviations reside under this authority. The software lead engineer also has the responsibility for ensuring that the tasks, activities, assignments, and milestones are properly met.
2.2 Master Schedule

The [project/product name] software development consists of the following types of activities:

• Project administration
• System interface analysis and design
• Requirements analysis
• Software architecture and design
• Code implementation and testing
• Code integration and testing
• System design validation testing
• Simulation and prototyping support

These types of activities are subdivided into the following software development phases:

• Project Start-Up Phase
• Interface Design Phase
• Requirements Phase
• Software Architecture Phase
• Detailed Design Phase
• Code and Test Phase
• Integration and Test Phase
• Software Validation Phase
A schedule of the [project/product name] software development tasks and the relationship of each task to the phases of software development is presented as Appendix A. Software V&V is an integral part of each phase of the software development life cycle. The V&V tasks are integrated into the project schedule in order to provide feedback to the development process and support management functions. A schedule of the [project/product name] V&V tasks and the relationship of each task to the phases of software development is presented in the SVVP.
2.3 Resources

The personnel and material resources required to perform software development of the [project/product name] software are presented as Appendix B. The factors that were analyzed in determining these resource requirements are the product features and performance requirements of the [project/product name] product as specified in the product requirements documentation, and the development of the [project/product name] software project in compliance with the Software Development Policies.
2.4 Responsibilities

The project software development organization is responsible for performing the tasks defined in this plan. The personnel selected to perform the development of the [project/product name] software have the technical credibility to understand the source of problems related to software quality, to follow through with recommended corrective actions to the development process, and to abort delivery of defective software end products. Members of the project software development organization will not be assigned to the V&V of the software to be produced on the project, but they will establish the criteria against which
the software is validated. The specific roles and responsibilities of the V&V organization during each phase of software development are presented in the SVVP.
2.5 Tools, Techniques, and Methodologies

Development of the [project/product name] software will be accomplished by implementing, testing, checking, or otherwise establishing and documenting the conformance of the software to specified requirements. These tasks will be performed manually and by means of automated tools and techniques. Examples of the automated tools to be utilized include static analyzers, dynamic analyzers, comparators, editors, compilers, debuggers, linkers and loaders, and change trackers. Manual tools to be utilized include walk-throughs, formal reviews, and algorithm analysis. Support tools and techniques for development include the following:

• General system utilities and text processing tools for code and test preparation, organization, and modification
• Data reduction and report generation tools
• Library support systems consisting of database management systems and configuration control systems
• Code editors, assemblers, compilers, linkers and loaders, and debug tools
• Test drivers and test languages
The selection of tools for the development tasks is based on the objectives and goals for each phase of software development. The necessary tools for software development are listed in Appendix C.
2.6 Software Configuration Management

Software Configuration Management (SCM) is the activity within software development management that provides identification, change control, baseline control, and status accounting to the specification, design, implementation, and test of a software end product. The procedures and methods for SCM of the [project/product name] software are described below.
2.6.1 Software Configuration Identification
Identification is one of the basic SCM principles and establishes the software items that are to be assigned a status, accounted for, and controlled. The materials to be identified and baselined at each major milestone in the development of the [project/product name] software are shown in Appendix D. The contents of each software configuration baseline of the [project/product name] software end product will be identified by SCM. It is important that the configuration identifier be easily recognized and understood by the software project team. Therefore, identification of the [project/product name] software end product will combine the name of the project, the name of the baselined material being identified, and the current revision of the baseline material. Changes that are applied by SCM to the software end product will be identified by a unique number combined with the responsible engineer's initials, providing valuable historical information during test as to who made the change. Each software configuration baseline will be identified by the product name, the major milestone, and a version descriptor that is incremented for each successive revision of that baseline.

2.6.1.1 Baselined Material Identification. A configuration identifier will be assigned to all baselined material. Configuration identifiers will be structured as [AAA]-LLL-X.XX. The "[AAA]" specifies the three-letter abbreviation for the software end-product name; "LLL" specifies the three-letter abbreviation for the baselined material; and "X.XX" is a decimal number used to track the revision history of the baselined material.

2.6.1.2 Change Documentation Identification. Change documentation will be identified by the software end-product name, the responsible engineer's initials, and a unique number assigned to the problem description document. The identification will be structured as [AAA]-LLL-XX. The "[AAA]" specifies the three-letter abbreviation for the software end-product name; "LLL" specifies the three letters representing the initials of the responsible engineer; and "XX" is a two-digit number that is consecutively assigned, beginning with 01.

2.6.1.3 Configuration Baseline Identification. Each version of a software configuration baseline will be identified with a unique identifier structured as [AAA]-LLL-XX. The "[AAA]" specifies the three-letter abbreviation for the software end-product name; "LLL" specifies the three-letter abbreviation for the major milestone; and "XX" is a two-digit number used to track updates in a software configuration baseline. The major milestone baseline abbreviations are Requirements (RQB), Architecture Design (ADB), Detailed Design (DDB), Implementation (IMB), and Software Validation (SVB).
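These identifier formats are regular enough to be checked mechanically. The following is a minimal illustrative sketch, not part of the plan itself; the pattern strings, example identifiers, and function name are assumptions made for the example:

```python
import re

# Illustrative patterns for the three identifier formats defined in
# Sections 2.6.1.1 through 2.6.1.3 (the concrete examples are invented):
#   baselined material:     [AAA]-LLL-X.XX,  e.g., INF-SRS-1.02
#   change documentation:   [AAA]-LLL-XX,    e.g., INF-JQP-01 (engineer initials)
#   configuration baseline: [AAA]-LLL-XX,    e.g., INF-RQB-01 (milestone abbreviation)
MATERIAL_ID = re.compile(r"^[A-Z]{3}-[A-Z]{3}-\d\.\d{2}$")
CHANGE_ID = re.compile(r"^[A-Z]{3}-[A-Z]{3}-\d{2}$")
MILESTONES = {"RQB", "ADB", "DDB", "IMB", "SVB"}

def is_valid_baseline_id(identifier: str) -> bool:
    """A baseline identifier has the two-digit form, and its middle field
    must be one of the five milestone abbreviations."""
    if not CHANGE_ID.match(identifier):
        return False
    return identifier.split("-")[1] in MILESTONES

# Usage:
assert MATERIAL_ID.match("INF-SRS-1.02")
assert CHANGE_ID.match("INF-JQP-01")
assert is_valid_baseline_id("INF-RQB-01")
assert not is_valid_baseline_id("INF-XXX-01")
```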
2.6.2 Software Configuration Control

Software configuration control is the mechanism used to control the development, interaction, and suitability of the software configuration items of each configuration baseline. Software configuration control not only monitors change to the software system but also monitors the specific implementation of the already approved system design. The implementation of software configuration control is discussed in terms of managing software baselines, change classification, the mechanism for baseline change, responsibilities of the Software Change Review Board (SCRB), and using the software development library.

2.6.2.1 Software Baseline Management. All changes to baselined materials will be performed following software developer review and/or test. Changes to the contents of a previous software configuration baseline are generally created incrementally. At the completion of the baseline creation event, the following activities will occur:

• Entry of baselined materials into the [project/product name] software library
• Verification that all changes to previous baselines have been incorporated in accordance with the approved change document(s)
• Preparation of change summary documentation
• Distribution of the affected material
Software configuration procedures shall be implemented for source code on the [project/product name] project in a manner that will not restrict the design and development effort and yet will control changes to the software once the design is approved. The configuration of source code during the implementation phase of software development will be controlled by the source code control system. The source code developed and tested during the implementation phase will be reviewed by the software lead engineer to determine if it is mature enough to be baselined. Following approval, all source code will be uniquely identified and stored in the [project/product name] software library. At the completion of the implementation phase, a [project/product name] software end-product version update will be performed, and all source code resident in the software development library will be baselined.

Only one user at any one time is allowed access to modify the files stored in the software development library. All changes to files stored in the software development library are available for documenting, reporting, and analyzing.

2.6.2.2 Change Classification. Changes to baselined materials will be categorized as Class I, Class II, or Class III. Class I changes affect the performance, functional, or technical requirements, such as performance outside of stated tolerance, interface characteristics, schedules, compatibility with support equipment, or resource requirements. Class II changes require a change to other baselined material but do not meet the criteria defined for Class I changes, such as correction of errors, addition of clarifying notes or views, or editorial corrections. Class III changes do not require a change to any other baselined material. The classification of a recommended change is the responsibility of the requestor. Changes classified as Class I or Class II will be reviewed by the software lead engineer for completeness, accuracy, and clarity, and a recommendation for approval or disapproval will be made prior to submittal to the SCRB. Approval of Class III changes may be obtained from the software lead engineer.

2.6.2.3 Change Mechanism. Changes to baselined materials will be documented on a Change Request/Approval (CRA) form. Appendix E shows an example of the CRA form. The CRA is used to document a description of the proposed change, its effect on baselines, and the change status. The requestor of the change will obtain a change number and then forward the completed CRA to the software lead engineer for evaluation. If the CRA has a classification of Class I or Class II, the software lead engineer will submit the CRA to the SCRB with a change status recommendation. The SCRB will:

• Evaluate the proposed change
• Review the recommendation of the software lead engineer
• Direct additional change analysis, approve the change, or disapprove the change
If the CRA has a classification of Class III, the software lead engineer is responsible for approving or disapproving the proposed change. If the CRA is disapproved at any point during the evaluation process, a copy of the CRA will be sent to the requestor, stating the reasons for the disapproval. The software lead engineer is responsible for assuring that approved CRAs are distributed for implementation and status update. The baselined material to be modified will be obtained from the [project/product name] software library and the proposed change(s) will be incorporated by the originator of the baselined material. Review and/or retest of revised baselined material will be accomplished by the [project/product name] V&V engineer(s) to ensure that:

• Revision processing was accomplished in accordance with the approved CRA
• Faults were not introduced during modification
• Modification has not caused unintended adverse effects
• The baselined material still meets its specified requirements

The results of the V&V activities will be documented on the CRA. The updated CRA is then reviewed by the software lead engineer to assure successful implementation of the proposed change(s). If the change implementation is successful, the software lead engineer marks the CRA as closed, and the original is placed in the library. If the change implementation is not successful, the software lead engineer is responsible for directing additional problem analysis
and ensuring that a new or revised CRA is classified, reviewed, and submitted for approval in accordance with this plan. Baselined documents requiring approval signatories will be resubmitted by the originator to the appropriate [project/product name] personnel for authorization to release the revised document.

2.6.2.4 Software Change Review Board. The [project/product name] SCRB is established by the [corporate title/position] to coordinate, review, and decide the disposition of Class I and Class II CRAs. The scheduling of meetings for the SCRB is the responsibility of the software configuration manager and is dictated by the impact of the proposed change(s) and the phase of the software development. The [project/product name] SCRB analyzes and identifies the impact of the change documented in the CRA. The CRA is updated to reflect the decisions of the SCRB to:

• Direct additional problem analysis
• Direct that a change different from that proposed in the CRA be implemented
• Approve the change as proposed
• Disapprove any change
The SCRB will distribute the original CRA to the software lead engineer, who will provide a copy of the updated CRA to the originator(s) of the affected baselined material. The original CRA will be placed in the [project/product name] software library.

2.6.2.5 [project/product name] Software Library. The [project/product name] software library will provide source file control, problem identification, change traceability, and status determination functions for the [project/product name] software development effort. These functions will be provided through the use of:

• A source code control system that provides a history of the revisions to source documents and access control for change processing
• File storage space and access control
• Development of a file activity and change status database
• A set of procedures for the use of these tools
Baselined materials will be distributed from the [project/product name] software library at each major milestone and upon request of the software lead engineer. Software validation of the [project/product name] software will be executed on baselined source code retrieved from the [project/product name] software library.
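Taken together, Sections 2.6.2.2 through 2.6.2.4 describe a small approval workflow: the requestor classifies the CRA, Class I and Class II changes are routed to the SCRB, and Class III changes may be dispositioned by the software lead engineer. The sketch below models only that routing rule; the type names and statuses are assumptions made for the example, not terms defined by this plan.

```python
from dataclasses import dataclass
from enum import Enum

class ChangeClass(Enum):
    CLASS_I = 1    # affects performance, functional, or technical requirements
    CLASS_II = 2   # changes other baselined material, below Class I criteria
    CLASS_III = 3  # requires no change to any other baselined material

class CraStatus(Enum):
    SUBMITTED = "submitted"
    APPROVED = "approved"
    DISAPPROVED = "disapproved"
    CLOSED = "closed"

@dataclass
class ChangeRequestApproval:
    number: str               # e.g., "INF-JQP-01" per Section 2.6.1.2
    change_class: ChangeClass
    description: str
    status: CraStatus = CraStatus.SUBMITTED

def approval_authority(cra: ChangeRequestApproval) -> str:
    """Route a CRA per Sections 2.6.2.2 and 2.6.2.4: Class I and Class II
    go to the SCRB; Class III may be approved by the software lead engineer."""
    if cra.change_class in (ChangeClass.CLASS_I, ChangeClass.CLASS_II):
        return "SCRB"
    return "software lead engineer"

# Usage: a Class III editorial change is dispositioned by the lead engineer.
cra = ChangeRequestApproval("INF-JQP-01", ChangeClass.CLASS_III, "clarifying note")
assert approval_authority(cra) == "software lead engineer"
```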
2.6.3 Software Configuration Status Accounting
The object of configuration status accounting is to provide identification of the [project/product name] configuration baselines and traceability from the baselines resulting from approved changes and to provide a management tool for monitoring the accomplishment of all related tasks resulting from each approved change. The result of this configuration management activity is a software configuration status report, distributed at the completion of each major milestone and upon request of the software lead engineer. This status report will list the following information (an illustrative sketch follows the list):

• Baseline identification, including version
• A list of all baselined material, indicating the current revision and referenced CRAs
• A list of all CRAs, including current disposition or date of closure
• A list of all anomaly reports, including the date of detection, date of fix, associated CRAs if any, date of retest, and current disposition
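As an illustration of the report's content, the four required items can be assembled from simple records such as the following; all field and function names here are invented for the example.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class BaselinedItem:
    name: str           # e.g., "SRS" (invented)
    revision: str       # e.g., "1.02"
    cra_refs: list      # CRA numbers applied to this item

@dataclass
class AnomalyReport:
    number: str
    detected: str             # date of detection
    fixed: Optional[str]      # date of fix, if any
    cra_ref: Optional[str]    # associated CRA, if any
    retested: Optional[str]   # date of retest, if any
    disposition: str

def status_report(baseline_id, items, cra_dispositions, anomalies):
    """Assemble the four items of information required by Section 2.6.3."""
    lines = [f"Baseline: {baseline_id}"]
    for i in items:
        lines.append(f"  {i.name} rev {i.revision}, CRAs: {', '.join(i.cra_refs) or 'none'}")
    for number, disposition in cra_dispositions.items():
        lines.append(f"  CRA {number}: {disposition}")
    for a in anomalies:
        lines.append(f"  Anomaly {a.number}: detected {a.detected}, fixed {a.fixed or 'open'}, "
                     f"CRA {a.cra_ref or 'none'}, retested {a.retested or 'n/a'}, {a.disposition}")
    return "\n".join(lines)
```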
2.6.4 Audits and Reviews

The audits and reviews performed during development of the [project/product name] software are shown in Appendix D. The following material is used during the reviews and audits:

• An update of the [project/product name] software configuration status report
• Copies of baselined materials
• Copies of all CRAs generated since the last review or audit, for reference
2.7 Software Quality Assurance

2.7.1 Activities, Methods, and Tools

The Software Development Policies contain the requirements for the development of quality software and for performing software quality assurance (QA) activities. The following procedures and methods governing QA activities for the [project/product name] software are consistent and in compliance with those policies and will be used to ensure software quality:

• Analysis of specifications to verify that the software requirements are accurately and completely identified
• Review of the hazards analysis to ensure that all hazards are identified
• Design reviews for the purpose of identifying design requirements, detecting design deficiencies, and verifying adherence to the development policies
• Design and code walk-throughs to verify that all requirements are addressed and that established policies, procedures, and guidelines are being followed
• Review of development progress and compliance with software development policy requirements, software project plans, and software development estimates
• Review of test plans and procedures to ensure that all specified requirements are adequately tested
• Software tools used as necessary throughout the software development process for evaluating, analyzing, and documenting the software activities and products
• Use of software CRA forms to ensure proper resolution and timely implementation, documentation, and close-out
2.7.2 Use of Development Plans, Procedures, and Tools
All planning documents will be reviewed to assure that they are adequate to support the software project, comply with the Software Development Policies, and are consistent with associated plans. The execution of these plans will be monitored throughout the software project, and corrections to the procedures or their implementation will be made as required.
2.7.3 Use of Configuration Management
Software configuration management practices will be utilized during all phases of software development. Software Anomaly Reports and CRAs will be used to ensure that software changes are properly incorporated and completed.
2.7.4 Use of the Software Library
Software designated for the [project/product name] software library will be entered and maintained in accordance with the procedures defined in Section 2.6, Software Configuration
Management. As a minimum, configuration management of the library will ensure that (1) the most recent authorized version of materials under each level of configuration control are clearly identified and are the ones routinely available from the library; and (2) previous versions of materials under configuration control are clearly identified and controlled to provide an audit trail that permits reconstruction of all changes made to the item.
2.7.5 Design Reviews and Code Walk-Throughs
Prior to design review or code walk-through, the responsible software engineer will make available to other [project/product name] software developers and V&V members a review package. Code walk-throughs will be at the discretion of the [project/product name] software lead engineer. The [project/product name] SDP details the code walk-through practices, format, and content.
2.7.6 Corrective Action System
Corrective action of discrepancies and deficiencies found during software development and test will be processed through the use of Software Anomaly Reports and CRAs. Processing of Software Anomaly Reports is described in the [project/product name] Software Verification and Validation Plan (SVVP), and processing of the CRAs is described in Section 2.6, Software Configuration Management. Software QA will assure the following for the Software Anomaly Reports:

• Reports are reviewed and analyzed.
• Appropriate corrective action is taken.
• Trends are analyzed in performance of work to prevent the development of noncompliant products.
• Corrective measures are reviewed to ensure that problems and discrepancies have been resolved and correctly reflected in the appropriate documents.
2.7.7 Documentation Review
All software documentation prepared during each phase of the [project/product name] software development will be reviewed to assure compliance with the following standards and requirements:

• Adherence to required format and documentation standards defined in the [project/product name] software project plans and software development procedures
• Compliance with Software Development Policies
• Internal consistency
• Understandability
• Traceability to the indicated document(s)
• Consistency with the indicated document(s)
2.7.8 Testing
Test plans, Test Information Sheets (TISs), and test procedures will be reviewed for compliance with the standards and requirements described in the [project/product name] SVVP, SDP, and Software Development Policies. Tests will be conducted in accordance with approved test plans, TISs, and test procedures. Test results will be specified in a test report.
2.8 Software Verification and Validation

The V&V of the [project/product name] software is specified in the [project/product name] SVVP and its supporting documents.
3.0 SOFTWARE DEVELOPMENT REQUIREMENTS
3.1 Management

The management of the software development program described in this plan spans all phases of [project/product name] software development. The [corporate title/position] designates a software lead engineer who is responsible for performing both the software development management tasks and technical direction. The management tasks to be performed for the [project/product name] software development program include, but are not limited to, the following:
• [project/product name] SDP generation and maintenance
• Software baseline change assessment for effects on previously baselined tasks
• Periodic review of the V&V effort, technical accomplishments, resource utilization, future planning, and risk management
• Daily management of software phase activities, including the technical quality of final and interim testing and reports
• Review and evaluation of V&V results in order to determine when to proceed to the next software development life cycle phase and define changes to V&V tasks, which will improve the V&V effort
• Maintenance of good communication with all team members to ensure the accomplishment of project quality assurance goals and objectives
At each phase of software development, the software tasks and associated inputs and outputs, schedules, resource requirements, risks and assumptions, and personnel responsible for performing the task are evaluated. This evaluation establishes the criteria for updating this plan. Maintenance is performed as necessary to ensure the completeness and consistency of this plan with the changes in software developed for the project.

The [corporate title/position] will support the management of software development for the [project/product name] software through reviews of software activities. Periodic reviews of the development effort, technical accomplishments, resource utilization, future planning, and risk management will be conducted by [corporate title/position]. The technical quality and results of the outputs of each phase of the software development will be evaluated in order to provide management support for the software leader's recommendation to proceed or not proceed to the next development phase and to define changes to V&V tasks to improve the V&V effort. Updates to this plan during software development will be reviewed and approved by [corporate title/position] prior to implementation.
3.2 Interface Design Phase Software Development Activities

The software development activities that occur during this phase are to determine and document the hardware-to-software and software-to-software interfaces for the product. The hardware-to-software interface tasks will include the following:

• Review electrical circuits and/or components that must be tested by software.
• Analyze performance requirements, testing procedure, sequencing, and timing as required.
• Identify faults that can be detected by the software tests.
• Determine signals that are passed between software and external devices, components, or peripherals.
• Identify and define discrete signal states, reset states, and initialization values.
• Define relevant transfer functions for any analog signals.
• Define algorithms that encompass electromechanical systems.

The software-to-software interface tasks will include the following:

• Mailbox structure and communication
• Data structure, values, and defaults
• Pointer, index, and parameter schemes for accessing data structures and arrays
• Parameter passing conventions
• Definition of semaphores, flags, and system status indicators
• Basic computational units and conversion schemes

The results of this phase will be documented in the [project/product name] Software Interface Design Specification. This document will be updated throughout the [project/product name] software development as necessary in order to reflect the evolving software.
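Interface definitions of the kind listed above (mailbox structures, discrete signal states, transfer functions) are often captured as typed records and small conversion functions. The fragment below is purely hypothetical; the message layout, signal states, and conversion coefficients are placeholders, not values this plan prescribes.

```python
from dataclasses import dataclass
from enum import Enum

class DiscreteState(Enum):
    # Discrete signal states, reset states, and initialization values would be
    # enumerated here from the hardware-to-software interface analysis.
    LOW = 0
    HIGH = 1

@dataclass
class MailboxMessage:
    """Hypothetical software-to-software mailbox message layout."""
    sender_task: str
    receiver_task: str
    message_id: int
    payload: bytes = b"\x00"  # default value per the data-structure definitions

def adc_counts_to_volts(counts: int, gain: float = 0.00122, offset: float = 0.0) -> float:
    """Example transfer function for an analog input; coefficients are placeholders."""
    return gain * counts + offset
```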
3.3 Requirements Phase Software Development Activities

The goal of the Requirements Phase is to ensure that both the problem and the constraints upon the solution are specified in a rigorous form. During this phase of software development, the software requirements analysis is performed. As problem evaluation and solution synthesis are accomplished, the characteristics of the software are established and design constraints are uncovered. The Product Requirements Document specifies the product or system-level requirements of the [project/product name] and establishes the product requirements from which software requirements are allocated.
The [project/product name] Software Requirements Specification (SRS) is the document that specifies the results of the software requirements analysis. The SRS defines the basic functions, performance, interfaces, and flow or structure of information and validation criteria of the successful software implementation. The emphasis of the Software Requirements Phase V&V tasks is the analysis and evaluation of the correctness, consistency, completeness, accuracy, and testability of the specified software requirements.
3.3.1 Software Requirements Specification (SRS)

Review of the product requirements documentation is critical because it establishes the basis upon which all succeeding documents and products are developed. During this phase of development, the specifications of system performance, user interface, and critical components of [project/product name] are reviewed for use in planning and in defining the level of effort required to successfully implement the software. Product requirements documentation of the project is provided to the V&V leader by the software lead engineer for review prior to development of the SRS. The [project/product name] SRS will concentrate on the following areas of requirement definition:

• Specification of the computing environment(s) in which the software must perform
• Specification of the safety requirements, including a description of the unsafe operating conditions in terms of critical software functions and goals, the severity of hazards, and the set of associated critical parameters and critical indicators
• Specification of the hardware interfaces through which the software must gather input and send output
• Specification of the software interfaces, including the purpose of the interface, the type of data to be interchanged via the interface, and an estimate of data quantity and transfer rate requirements
• Specification of the user interfaces, including the characteristics that the software must support for each human interface to the software product
• Specification of the interfaces to communications devices, including the name, version, interface type, and required usage
• Specification of the required values of each output, expressed through functions and state tables
• Specification of the timing, accuracy, and stability requirements for each such output value
• Software design constraints specifying likely changes and desirable subsets that must be accounted for in the design
3.3.2 Inputs and Outputs
The inputs to the Software Requirements Phase include the product requirements documentation, [project/product name] Interface Design Specification (IDS), and periodic program status reports. The outputs of the Software Requirements Phase are the Requirements Phase V&V Task reports, Requirements Phase V&V Task Summary report, the [project/product name] Requirements Traceability Matrix (RTM), the [project/product name] SRS, and updates to this plan as required in order to accommodate changes in product requirements and/or program objectives.
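The RTM named among the outputs is, in essence, a mapping from each SRS requirement to the design elements and tests that trace to it. A minimal sketch under that reading, with invented identifiers:

```python
from collections import defaultdict

# rtm maps an SRS requirement ID to the design elements and tests that trace
# to it; all identifiers below are invented placeholders.
rtm = defaultdict(lambda: {"design": [], "tests": []})
rtm["SRS-001"]["design"].append("SADS 3.2 pump control task")
rtm["SRS-001"]["tests"].append("TIS-014")

def untraced(requirement_ids, rtm):
    """Requirements with no design element or no test yet traced to them."""
    return [r for r in requirement_ids
            if not rtm[r]["design"] or not rtm[r]["tests"]]

print(untraced(["SRS-001", "SRS-002"], rtm))  # -> ['SRS-002']
```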
3.3.3 Software Requirements Review (SRR)
The goal of the SRR is to review the progress of the [project/product name] project to date; review the SRS and determine the adequacy, correctness, and testability of the stated software and interface requirements; and determine whether to proceed or not to proceed to the next development phase. All V&V outputs generated during this development phase will be provided by the V&V leader to the software lead engineer prior to the SRR.
3.3.4 Risks and Assumptions
Accomplishment of the scheduled tasks for this phase of software development depends on the following assumptions: that the [corporate title/position] will provide the project organization with the resources required to fulfill the tasks defined, ensure that the necessary inputs are provided in a timely manner, and review the V&V outputs and provide feedback on the completeness and adequacy of each in supporting goals and objectives; and that development of the SRS will comply with the requirements defined in the Software Development Policies.
3.4 Software Architecture Phase Software Development Activities

The goal of the Software Architecture Phase is to ensure that a preliminary software design has been achieved that establishes the design baseline from which the detailed design will be developed. The [project/product name] Software Architecture Design Specification (SADS) is generated during this phase of software development. The SADS describes how the software system will be structured to satisfy the requirements identified in the [project/product name] SRS. The SADS translates the software requirements into a description of the software structure, software components, interfaces, and data necessary for the detailed design phase. The goal of the Software Architecture Phase V&V tasks is to ensure internal consistency, completeness, correctness, and clarity of the information needed to support the detailed definition of the individual software system components.
3.4.1 Software Architecture Design Tasks

The requirements analysis tasks defined for the Software Architecture Phase software development include the following:

• Partitioning of the [project/product name] processes into appropriate tasks
• Determination and design of start-up and shutdown tasks
• Determination and design of interrupt service routines
• Partition of the data and control structures into global and local task elements
• Definition of mailboxes, messages, timers, and pointers
• Definition of task priorities that are required for intertask communications and synchronization
The software safety features will be included in this design and will be in compliance with approved software safety design guidelines and with the safety considerations identified in the system Hazards Analysis that are controlled and/or commanded by software. The results of these tasks will be documented in the Software Architecture Design Specification (SADS).

Design walk-throughs are conducted by the software designer(s) during the architecture design phase to examine the characteristics of the architecture design. V&V will participate in these walk-throughs and will provide results of design V&V to the software lead engineer and the [corporate title/position].
3.4.2 Inputs and Outputs
The inputs to the Software Architecture Phase include the [project/product name] SRS, Hazards Analysis, [project/product name] RTM, [project/product name] Software Development Test Plan
(SDTP), and [project/product name] periodic program status reports. The outputs of the Software Architecture Phase are the V&V Task reports, V&V Task Summary report, [project/product name] Software Validation Test Plan (SVTP), [project/product name] SADS, [project/product name] RTM updates, and updates to this plan as required in order to accommodate changes in [project/product name] product or software requirements.
3.4.3 Software Architecture Review (SAR)

The goal of the SAR is to

• Review the progress of the [project/product name] project to date
• Review the SADS and determine the adequacy, correctness, and testability relative to the stated software and interface requirements
• Evaluate the form, structure, and functional description of the design for correctness, consistency, completeness, and accuracy
• Evaluate the software structure for robustness, testability, and compliance with established software development procedures and Software Development Policies
• Analyze the data items defined at each interface for correctness, consistency, completeness, and accuracy
• Determine whether to proceed or not to proceed to the next development phase
All V&V outputs generated during this development phase will be provided by the V&V leader to the software lead engineer prior to the SAR.
3.4.4 Risks and Assumptions
Accomplishment of the scheduled tasks for this phase of software development depends on the following assumptions: that the [corporate title/position] will provide the project organization with the resources required to fulfill the tasks defined, ensure that the necessary inputs are provided in a timely manner, and review the V&V outputs and provide feedback on the completeness and adequacy of each in supporting goals and objectives; and that development of the [project/product name] SADS will comply with the requirements defined in the Software Development Policies.
3.5 Detailed Design Phase Software Development Activities

The goal of the Detailed Design Phase is to ensure that the detailed software design satisfies the requirements and constraints specified in the [project/product name] SRS and augments the design specified in the [project/product name] SADS. A [project/product name] Software Detailed Design Specification (SDDS) is generated during this phase of software development. The SDDS describes how the software system will be structured to satisfy the requirements identified in the SRS and supports the design specified in the SADS. The SDDS translates the software requirements into a description of the software structure, software components, interfaces, and data necessary for the implementation phases. The result is a solution specification that can be implemented in code with little additional refinement. The goal of the Detailed Design Phase V&V tasks is to ensure the internal consistency, completeness, correctness, and clarity of the [project/product name] SDDS and to verify that the implemented design will satisfy the requirements specified in the [project/product name] SRS.
3.5.1 Software Detailed Design Tasks

The detailed design tasks defined for the Detailed Design Phase software development include the following:

• Partitioning of the [project/product name] SADS tasks into appropriate subtasks
• Final design of start-up and shutdown tasks
• Final design of interrupt service routines
• Partition of the data and control structures into global, local, and task elements
• Final definition of mailboxes, messages, timers, and pointers
• Final definition of task priorities that are required for intertask communications and synchronization
• Development of structure charts
• Development of final state transition, data flow, and control flow diagrams
The software safety features will be included in this design and will be in compliance with approved software safety design guidelines and with the safety considerations identified in the system Hazards Analysis that are controlled and/or commanded by software. The results of these tasks will be documented in the SDDS.
Design walk-throughs are conducted by the software designer(s) during the detailed design phase to examine the characteristics of the detailed software design. V&V will participate in these walk-throughs and will provide results of design V&V to the software lead engineer and the [corporate title/position].
3.5.2 Software Development Test Information Sheets
A Software Development Test Information Sheet (DTIS) is prepared by the software developers for each software component test defined in the SDTP. The DTISs are provided to the V&V leader by the software lead engineer for review prior to the Software Detailed Design Review (SDDR). Verification of the adequacy of software component testing is supported by the review of the DTISs.
3.5.3 Inputs and Outputs
The inputs to the Detailed Design Phase include the [project/product name] SRS, Hazards Analysis, [project/product name] SADS, [project/product name] RTM, and [project/product name] periodic program status reports. The outputs of Detailed Design Phase V&V are the Detailed Design Phase V&V Task reports, Detailed Design Phase V&V Task Summary report, [project/product name] VTISs, [project/product name] SDDS, [project/product name] DTISs, [project/product name] RTM updates, and updates to this plan as required in order to accommodate changes in [project/product name] product or software requirements.
3.5.4 Software Detailed Design Review
The goal of the SDDR is to

• Review the progress of the [project/product name] project to date
• Review the SDDS and determine the adequacy, correctness, and testability relative to the stated software and interface requirements
• Evaluate the form, structure, and functional description of the design for correctness, consistency, completeness, and accuracy
• Evaluate the software structure for robustness, testability, and compliance with established software development procedures and Software Development Policies
• Analyze the data items defined at each interface for correctness, consistency, completeness, and accuracy
• Determine whether to proceed or not to proceed to the next development phase
All V&V outputs generated during this development phase will be provided by the V&V leader to the software lead engineer prior to the SDDR.

An assessment will be made of how well the software structures defined in the [project/product name] SDDS satisfy the fundamentals of structured design. Structured design techniques that provide a foundation for "good" design methods include the following (an illustrative sketch follows the list):
• Evaluating the preliminary software structure to reduce coupling and improve cohesion
• Minimizing structures with high fan-out and striving for fan-in as depth increases
• Keeping the scope of effect of a component within the scope of control of that component
• Evaluating component interfaces to reduce complexity and redundancy and improve consistency
• Defining components that have predictable function, but avoiding components that are overly restrictive
• Striving for single-entry, single-exit components, and avoiding content coupling
• Packaging software on the basis of design constraints and portability requirements
• Selecting the size of each component so that independence is maintained
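The fan-out and fan-in guidance in the list above can be checked from a component call graph. A minimal sketch, assuming the call graph is available as a dictionary; the component names and the numeric threshold are invented for the example.

```python
from collections import Counter

# call_graph maps each component to the components it invokes; the component
# names are invented for the example.
call_graph = {
    "executive": ["scheduler", "alarm_mgr", "display_mgr"],
    "scheduler": ["pump_ctl"],
    "alarm_mgr": ["display_mgr"],
    "display_mgr": [],
    "pump_ctl": [],
}

fan_out = {comp: len(callees) for comp, callees in call_graph.items()}
fan_in = Counter(callee for callees in call_graph.values() for callee in callees)

# Flag components whose fan-out exceeds a review threshold; the plan sets no
# numeric limit, so the threshold here is an assumption.
FAN_OUT_LIMIT = 7
high_fan_out = [c for c, n in fan_out.items() if n > FAN_OUT_LIMIT]
print(fan_out, dict(fan_in), high_fan_out)
```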
3.5.5 Risks and Assumptions
Accomplishment of the scheduled tasks for this phase of software development depends on the following assumptions: that the [corporate title/position] will provide the project organization with the resources required to fulfill the tasks defined, ensure that the necessary inputs are provided in a timely manner, and review the V&V outputs and provide feedback on the completeness and adequacy of each in supporting goals and objectives; and that development of the [project/product name] SDDS and DTISs will comply with the requirements defined in the Software Development Policies.
3.6 Code and Test Phase Software Development Activities

The goals of the Code and Test Phase are as follows:
• To ensure that the design is correctly implemented in code, resulting in a program or system that is ready for integration and validation
• To ensure the accurate translation of the detailed design
• To detect undiscovered errors
Verification of the Code and Test Phase activities performed by software developers is accomplished by reviewing code and test results.
3.6.1 Code and Test Tasks
The [project/product name] software will be coded, using the text editor approved for the project, in conformance with the applicable programming guidelines. Code testing will be performed in accordance with the SDTP and DTIS requirements at the level of detail represented by the software at the time of the test. The source code submitted for integration will be free of unacceptable diagnostic utility and compiler warnings. A list of acceptable diagnostic and compiler warnings and switch settings will be maintained by the software integrator. Software components will be analyzed by a complexity analysis tool; a brief sketch of one such measure appears at the end of this section.

Code walk-throughs will be conducted by the code developer(s) during the implementation to examine both high-level and detailed properties of the source code. V&V will participate in these walk-throughs and provide results of source code V&V to the software lead engineer and the [corporate title/position]. Code reviews will also be performed by V&V. The V&V reviews of source code will

• Evaluate the structure of the source code for compliance with coding standards
• Assess the communication value of the source code
• Evaluate the source code for efficiency of algorithms, memory efficiency, execution efficiency, and input and output efficiency
• Evaluate the source code for consistency, completeness, and traceability to software requirements and design
Discrepancies and deficiencies found during V&V of the source code are documented in Software Anomaly Reports. During the Code and Test Phase, software developers will use DTISs to conduct software component testing. At the successful completion of the testing described, the DTIS is signed
and dated by the software lead engineer. DTISs and associated test data will be provided to the V&V leader by the software lead engineer as each test is completed. Completed DTISs will be analyzed by V&V to evaluate the following:

• Adequacy of test coverage
• Adequacy of test data
• Software behavior
• Software reliability
Discrepancies and deficiencies found during software component testing are documented in Software Anomaly Reports.
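The complexity analysis called for in the code and test tasks is commonly a cyclomatic (McCabe) measure: one plus the number of decision points in a component. The sketch below approximates that count for Python source using the standard ast module; it is an illustration of the measure only, not the project's mandated tool.

```python
import ast

# Node types counted as decision points; boolean operators add paths as well.
_DECISION_NODES = (ast.If, ast.For, ast.While, ast.IfExp,
                   ast.ExceptHandler, ast.And, ast.Or)

def cyclomatic_complexity(source: str) -> int:
    """Approximate McCabe complexity: one plus the number of decision points."""
    tree = ast.parse(source)
    return 1 + sum(isinstance(node, _DECISION_NODES) for node in ast.walk(tree))

sample = "def f(x):\n    if x > 0:\n        return x\n    return -x\n"
print(cyclomatic_complexity(sample))  # -> 2
```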
3.6.2 Inputs and Outputs
Inputs to the Code and Test Phase include the [project/product name] SDDS, [project/product name] SRS, [project/product name] code, [project/product name] SDTP, [project/product name] DTISs, [project/product name] SVTP, and [project/product name] VTISs. The outputs are V&V task reports, anomaly reports, a task summary report, [project/product name] SVTPR, updates to the [project/product name] RTM, and updates to this plan as required.
3.6.3 Risks and Assumptions
Accomplishment of the scheduled tasks for this phase of software development depends on the following assumptions: that the [corporate title/position] will provide the project organization with the resources required to fulfill the tasks defined, ensure that the necessary inputs are provided in a timely manner, and review the V&V outputs and provide feedback on the completeness and adequacy of each in supporting goals and objectives; that development of the [project/product name] code and conduct of software component testing will comply with the requirements defined in the Software Development Policies; and that when component testing has been completed successfully, [project/product name] code will be delivered to the V&V organization for baselining in accordance with this plan and the [project/product name] SCMP.
3.7 Integrate and Test Phase Software Development Activities

The goals of the Integrate and Test Phase are as follows:
• To ensure that the design is correctly implemented in code, resulting in a program or system that is ready for validation
• To ensure the accurate translation of the implemented code and detailed design
• To detect undiscovered errors
Verification of the Integrate and Test Phase activities performed by software developers is accomplished by reviewing code and test results.
3.7.1 Integrate and Test Tasks
Integration will be the responsibility of the software integrator, who will receive and audit the software developer's deliverables. Code to be integrated will be verified against the existing integration baseline to ensure that it does not cause unexpected side effects. The code to be integrated will conform to the requirements specified in the Product Requirements Document (PRD) and IDS. If the audit and integration are acceptable, the code will be added to the integration baseline. Otherwise, the code will be returned to the developer for error resolution and correction.

Code walk-throughs may be conducted by the code developer(s) during the integration phase at the request of the software lead engineer or software integrator to examine both high-level and detailed properties of the source code. V&V will participate in these walk-throughs and will provide results of source code V&V to the software lead engineer and the [corporate title/position]. Code reviews will also be performed by V&V. The V&V reviews of source code will

• Evaluate the structure of the integrated code for compliance with coding standards
• Assess the communication value of the integrated code
• Evaluate the integrated code for efficiency of algorithms, memory efficiency, execution efficiency, and input/output efficiency
• Evaluate the integrated code for consistency, completeness, and traceability to software requirements and design
Discrepancies and deficiencies found during V&V of the source code are documented in Software Anomaly Reports. During the Integrate and Test Phase, software developers will use DTISs to conduct software component integration testing. At the successful completion of the testing described, the DTIS
is signed and dated by the software lead engineer. DTISs and associated test data will be provided to the V&V leader by the software lead engineer as each test is completed. Completed DTISs will be analyzed by V&V to evaluate the following:
• Adequacy of test coverage
• Adequacy of test data
• Software behavior
• Software reliability
Discrepancies and deficiencies found during software integration testing are documented in Software Anomaly Reports.
3.7.2 Inputs and Outputs
Inputs to the Integrate and Test Phase include the following:
• [project/product name] SDDS
• [project/product name] SRS
• [project/product name] code
• [project/product name] SDTP
• [project/product name] DTISs
• [project/product name] SVTP
• [project/product name] VTISs
The outputs are V&V task reports, anomaly reports, a task summary report, updates to the [project/product name] SVTPR, updates to the [project/product name] RTM, and updates to this plan as required.
3.7.3 Risks and Assumptions
Accomplishment of the scheduled tasks for this phase of software development depends on these assumptions:
• The [corporate title/position] will provide the project organization with the resources required to fulfill the tasks defined, ensure that the necessary inputs are provided in a timely manner, and review the V&V outputs and provide feedback on the completeness and adequacy of each in supporting goals and objectives.
• Integration of the [project/product name] code and conduct of software component integration testing will comply with the requirements defined in the Software Development Policies.
• When component testing has been completed successfully, the [project/product name] code will be delivered to the V&V organization for baselining in accordance with this plan and the [project/product name] SCMP.
3.8 Validation Phase Software V&V Activities
The goal of the Software Validation Phase V&V is to verify that the [project/product name] software satisfies the requirements and design specified in the [project/product name] SRS and SDDS.
3.8.1 Prevalidation Software Configuration Control
At the completion of software component testing, the software is placed under configuration control for baseline processing. The baselined source code and associated files will be stored in the project software library in accordance with the [project/product name] SCMP. The library will provide internal source file control, problem identification, change traceability, and status determination of the software and associated documentation. By this means, software configuration is controlled prior to software validation.
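One lightweight way to support the source file control and status determination described above is to record a checksum manifest at the moment the baseline is created. This is a minimal sketch under stated assumptions, not the mandated mechanism; the directory and manifest file names are illustrative only.

```python
import hashlib
import json
from pathlib import Path

def create_baseline_manifest(library_dir: str, manifest_path: str) -> None:
    """Record the SHA-256 digest of every file placed under baseline control."""
    manifest = {}
    for path in sorted(Path(library_dir).rglob("*")):
        if path.is_file():
            digest = hashlib.sha256(path.read_bytes()).hexdigest()
            manifest[str(path.relative_to(library_dir))] = digest
    Path(manifest_path).write_text(json.dumps(manifest, indent=2))

# Illustrative names only; the project software library location is project specific.
create_baseline_manifest("project_software_library", "baseline_manifest.json")
```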
3.8.2 Software Validation Testing
Software validation is performed using the current controlled version of the software. Software validation is conducted in accordance with the [project/product name] SVTP using the [project/product name] SVTPR. The results of software validation are documented on the VTISs. Validation test results are analyzed to determine if the software satisfies software requirements and objectives. Software Anomaly Reports are generated to document test failures and software faults. Control of the software configuration under test is maintained by implementing the procedures in the [project/product name] SCMP. The SCMP describes the required steps for processing, reporting, and recording approved software changes and dissemination of baselined descriptive documentation and software media.
3.8.3 Software Configuration Audit
A Software Configuration Audit of the validated software is conducted by V&V at the conclusion of software validation. The baselined software documentation is audited to determine that all the software products to be delivered for certification are present. The version description of all items is verified to demonstrate that the delivered software end products correspond to the software subjected to software validation. Discrepancies and deficiencies found during the software configuration audit are documented in Software Anomaly Reports. All anomaly reports are provided to the software lead engineer and the [corporate title/position] for initiation of corrective action. Completion of the Software Configuration Audit is contingent upon closure of all outstanding software discrepancies and deficiencies. Upon successful completion of this audit, the certified software products are delivered by the V&V leader to the software lead engineer for final product certification. The Software Configuration Audit Report is generated by the V&V leader to document the final configuration of the software products delivered for product certification.
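The version-description check can be assisted by comparing the delivered files against the manifest recorded at prevalidation (see the sketch in Section 3.8.1). This fragment is illustrative only and does not replace the audit itself; any discrepancy it reports would still be documented in a Software Anomaly Report.

```python
import hashlib
import json
from pathlib import Path

def audit_against_manifest(delivered_dir: str, manifest_path: str) -> list[str]:
    """Report files that are missing, extra, or changed relative to the baseline."""
    manifest = json.loads(Path(manifest_path).read_text())
    delivered = {
        str(p.relative_to(delivered_dir)): hashlib.sha256(p.read_bytes()).hexdigest()
        for p in Path(delivered_dir).rglob("*") if p.is_file()
    }
    findings = []
    for name, digest in manifest.items():
        if name not in delivered:
            findings.append(f"missing from delivery: {name}")
        elif delivered[name] != digest:
            findings.append(f"does not match baseline: {name}")
    findings.extend(f"not in baseline: {n}" for n in delivered if n not in manifest)
    return findings
```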
3.8.4 Software Verification and Validation Report
The [project/product name] Software Verification and Validation Report (SVVR) is generated by the V&V leader at the completion of all V&V tasks during the Software Validation Phase. The SVVR is a summary of all V&V activities and results, including status and disposition of anomalies. An assessment of the overall software quality and recommendations for software and/or development process improvements are documented in the report.
3.8.5 Inputs and Outputs
The inputs to the Software Validation Phase V&V are the following:
• [project/product name] product requirements documentation
• [project/product name] SRS
• [project/product name] SDDS
• [project/product name] SVTP
• [project/product name] VTISs
• [project/product name] SVTPR
The outputs are the completed [project/product name] VTISs, Software Validation Phase V&V Task Summary report, Software Configuration Audit Report, [project/product name] SVVR, anomaly reports, and updates to this plan as required to accommodate changes in the [project/product name] software validation program.
3.8.6 Risks and Assumptions
Accomplishment of the scheduled tasks for this phase of software development depends on these assumptions:
• The [corporate title/position] will provide the project organization with the resources required to fulfill the V&V tasks defined, will ensure that the necessary inputs are provided in a timely manner, and will review the V&V outputs and provide feedback on the completeness and adequacy of each in supporting goals and objectives.
• Changes to [project/product name] requirements that have not been approved by the [title/position] will not be implemented in the [project/product name] code without the approval of the [corporate title/position] or designate.
• [project/product name] software to be tested will be obtained by the V&V engineer(s) from the configuration management library.
4.0 SOFTWARE DEVELOPMENT ADMINISTRATIVE PROCEDURES
4.1 Additional Software Development Procedures
The [project/product name] does not require any additional software development procedures.
4.2 Commercially Available, Reusable Software
The [project/product name] will use [insert language and version number here] as its source language and libraries. The product will also use [insert any purchased software products, libraries, or package names here].
4.3 Software End-product Repository
The released [project/product name] product software, data, documents, specifications, and libraries will be entered into the product quality assurance system for product life cycle configuration control in accordance with established software end-product SOPs. The non-released [project/product name] product software end products will be entered into the software quality assurance system for product life cycle configuration control in accordance with established software end-product SOPs. The non-released [project/product name] product hardware, firmware, and hardware-control-support software end products will be entered into the software quality assurance system for product life cycle configuration control in accordance with established software end-product SOPs.
4.4 Software Development Metrics
The [project/product name] software team will collect the following software metrics:
• Defects, as a function of the Software Anomaly Reports and CRAs
• Number of software components subjected to walk-throughs
• Number of software components integrated
• Code, analyzed by a software complexity analysis tool (a stand-in sketch follows this list)
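The plan does not name a particular complexity analysis tool. As a stand-in illustration, the sketch below approximates McCabe cyclomatic complexity for Python source by counting decision points in the abstract syntax tree; a project coding in another language would substitute its own analyzer.

```python
import ast

# Node types that add a decision point (an approximation, sufficient for a sketch).
DECISION_NODES = (ast.If, ast.For, ast.While, ast.ExceptHandler,
                  ast.BoolOp, ast.IfExp, ast.Assert)

def cyclomatic_complexity(source: str) -> int:
    """Approximate McCabe complexity: one plus the number of decision points."""
    tree = ast.parse(source)
    return 1 + sum(isinstance(node, DECISION_NODES) for node in ast.walk(tree))

# Example: a function with two decision points scores 3.
sample = """
def clamp(x, lo, hi):
    if x < lo:
        return lo
    if x > hi:
        return hi
    return x
"""
print(cyclomatic_complexity(sample))  # -> 3
```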
APPENDIX A SCHEDULE OF SOFTWARE DEVELOPMENT TASKS

A.1 TASKS, ACTIVITIES, AND DOCUMENTS BY PHASE

• Project Start-Up: estimations; Software Development Plan (SDP); Software Configuration Management Plan (SCMP); Software Quality Assurance Plan (SQAP); Software End-product Acceptance Plan (SEAP)
• Requirements: system requirements document review; software requirements analysis; Software Requirements Specification (SRS); Software Requirements Review (SRR)
• Interface Design: interface requirements and design analysis; Interface Design Specification (IDS)
• Architecture Design: software architecture design analysis; software architecture design walkthrough; Software Architecture Design Specification (SADS); Software Development Test Plan (SDTP); Software Architecture Design Review (SADR)
• Detailed Design: software detailed design analysis; software detailed design walkthrough; Software Detailed Design Specification (SDDS); Software Development Test Information Sheets (DTISs); Software Detailed Design Review (SDDR)
• Code and Test: code implementation; code walkthroughs; code audits; DTIS execution
• Integrate and Test: integration implementation; code walkthroughs; code audits; DTIS execution
• Software Validation: software validation test conduct; regression test conduct; software configuration audit and report; Software Verification and Validation Report (SVVR)
A.2 SOFTWARE IMPLEMENTATION PHASED DELIVERABLES
[Enter intended order of build and baseline deliverables]
APPENDIX B SOFTWARE DEVELOPMENT PERSONNEL AND RESOURCE REQUIREMENTS

Staffing requirements are tabulated by phase: Project Start-Up, Interface Design, Requirements, Architecture Design, Detailed Design, Code and Test, Integrate and Test, and Software Validation. [Enter the staffing level required for each role in each phase.]

Resource Requirement
Software Engineers:
• Software Lead Engineer
• Software Scientist
• System Analyst
• Software Analyst
• Programmer Analyst
• Programmer
• Associate Programmer
Software V&V Engineers:
• Software V&V Lead Engineer
• Senior Software V&V Scientist
• Software V&V Scientist
• Senior Software V&V Analyst
• Software V&V Analyst
• Software V&V Engineer
• Associate Software V&V Engineer
Hardware Support Software Engineers
Senior Software Technician
Software Technician
APPENDIX C SOFTWARE DEVELOPMENT TOOLS, TECHNIQUES, AND METHODOLOGIES

Resource requirements are tabulated by phase: Project Start-Up, Interface Design, Requirements, Architecture Design, Detailed Design, Code and Test, Integrate and Test, and Software Validation. [Mark an X for each phase in which the resource is required.]

Resource Requirement
Software Development Software:
• Database software
• Text-processing software
• Graphical/drawing software
• Spreadsheet software
• Project management software
• Performance analyzer
• Logic analyzer
• Complexity analysis software
• Reverse engineering software
• Configuration control software
• Code debugger software
• Compiler software
• Assembler software
• Linker/loader software
• Application development software
Software Development Hardware:
• Emulator
• PC/workstation
• Laptop/notebook computers
• Controlled current power supply
• Test fixtures
• EPROM generator
APPENDIX D SOFTWARE CONFIGURATION MILESTONES

Configuration management description by phase:

• Project Start-Up. Configuration items: Software Development Plan (SDP); Software Configuration Management Plan (SCMP); Software Quality Assurance Plan (SQAP); Software End-product Acceptance Plan (SEAP); Software Verification and Validation Plan (SVVP).
• Requirements. Configuration baseline: Requirements Baseline. Configuration items: Software Requirements Specification (SRS). Reviews and audits: Software Requirements Review (SRR).
• Interface Design. Configuration items: Interface Design Specification (IDS).
• Architecture Design. Configuration baseline: Architecture Design Baseline. Configuration items: Software Architecture Design Specification (SADS); Software Development Test Plan (SDTP); Software Validation Test Plan (SVTP); CRAs. Reviews and audits: Software Architecture Design Review (SADR); design walkthroughs.
• Detailed Design. Configuration baseline: Detailed Design Baseline. Configuration items: Software Detailed Design Specification (SDDS); CRAs. Reviews and audits: Software Detailed Design Review (SDDR); design walkthroughs.
• Code and Test and Integrate and Test. Configuration baseline: Implementation Baseline. Configuration items: source code; Software Development Test Information Sheets (DTISs); Software Validation Test Procedures (SVTPR); Software Anomaly Reports; CRAs. Reviews and audits: code walkthroughs; code audits. Software configuration status reported.
• Software Validation. Configuration baseline: Validation Baseline. Configuration items: Software Validation Test Information Sheets (VTISs); Software Anomaly Reports; Software Verification and Validation Report (SVVR); CRAs. Reviews and audits: software configuration audit. Software configuration status reported.
APPENDIX E CHANGE REQUEST/APPROVAL (CRA) FORM

CHANGE REQUEST/APPROVAL (CRA) FORM
1. System name: ______________________________________    2. CRA Number: __________

3. Application Level:    ❑ SOFTWARE    ❑ DOCUMENT    ❑ OTHER

4. Originating Organization:
   a. Organization: ______________________
   b. Initiator: ______________________
   c. Telephone: ______________________
   d. Date: ______________________

5. Configuration Baseline Affected (highest level): ______________________

6. Change Classification:    ❑ Class I    ❑ Class II    ❑ Class III

7. Configuration Items Affected:
   a. ______________________
   b. ______________________
   c. ______________________
   d. ______________________
   e. ______________________

8. Narrative: (if additional space is needed, indicate here ___ Page ___ of ___.)
   a. Description of change:
   b. Need for change:
   c. Estimated effects on other systems, software, or equipment:
   d. Alternatives:
   e. Anomaly Number (if any) used to generate this CRA:

9. Disposition:    ❑ Additional Analysis    ❑ Approved    ❑ Disapproved
   Signature: ____________________________________________________________________    DATE: ______

10. Change Verification Results:

11. V&V Signature: ________________________________    12. Date: ______

13. Date Closed: ________________    14. Signature: __________________________________
GLOSSARY

Accuracy: Quantitative assessment of freedom from error.
Algorithm: Finite set of well-defined rules for the solution of a problem in a finite number of steps.
Algorithm analysis: Examination of an algorithm to determine its correctness with respect to its intended use, to determine its operational characteristics, or to understand it more fully in order to modify, simplify, or improve it.
Anomaly: Anything observed in the documentation or operation of software that deviates from expectations based on previously verified software products or reference documents.
Audit: Independent review for the purpose of assessing compliance with software requirements, specifications, baselines, standards, procedures, instructions, and coding requirements.
Baseline: Specification or product that has been formally reviewed and agreed upon, which thereafter serves as the basis for further development and that can be changed only through formal change control procedures.
Change control: Process by which a change is proposed, evaluated, approved or rejected, scheduled, and tracked.
Code: Loosely, one or more computer programs or part of a computer program.
Completeness: Those attributes of the software or documentation that provide full implementation of the functions required.
Component: Unit of code that performs a specific task or a group of logically related code units that perform a specific task or set of tasks.
Component testing: Testing conducted to verify the implementation of the design for one software component or collection of software components.
Computer program: Sequence of instructions suitable for processing by a computer. Processing may include the use of an assembler, a compiler, an interpreter, or a translator to prepare the program for execution as well as to execute it.
Configuration item: Aggregation of hardware, software, or any of its discrete parts that satisfies an end-use function.
Configuration management (CM): Process of identifying and defining the configuration items in a system, controlling the release and change of these items throughout the product life cycle, recording and reporting the status of configuration items and change requests, and verifying the completeness and correctness of configuration items.
Consistency: Those attributes of the software or documentation that provide uniformity in the specification, design, and implementation of the product.
Correctness: Extent to which software is free of design defects, coding defects, and faults; meets its specified requirements; and meets user expectations.
Critical software: Software whose failure could have an impact on safety.
Delivery: Transfer of responsibility for an item from one activity to another, as in the delivery of the validated software product to Quality Assurance for certification.
Design phase: Period in the software development cycle during which the designs for architecture, software components, interfaces, and data are created, documented, and verified to satisfy requirements.
Deviation: Authorization for a future activity, event, or product that departs from standard procedures.
Documentation: Manuals, written procedures or policies, records, or reports that provide information concerning uses, maintenance, or validation of software.
Error: Discrepancy between a computed, observed, or measured value or condition and the true, specified, or theoretically correct value or condition.
Evaluation: Process of determining whether an item or activity meets specified criteria.
Failure: Inability of a system or system component to perform its required function (see fault).
Fault: Defect of a system or system component, caused by a defective, missing, or extraneous instruction or set of related instructions in the definition, specification, design, or implementation of a system, which may lead to a failure.
Hazard: Dangerous state of a device or system that may lead to death, injury, occupational illness, or damage to or loss of equipment or property.
Implementation phase: Period in the software development cycle during which a software product is created from design documentation and debugged.
Integration: Process of combining software elements, hardware elements, or both into an overall system.
Milestone: Scheduled and accountable event that is used to measure progress.
Quality assurance (QA): Planned and systematic pattern of all actions necessary to provide adequate confidence that the item or product conforms to established technical requirements.
Requirements phase: Period in the software development cycle during which the requirements, such as functional and performance capabilities for a software product, are defined and documented.
Robustness: Extent to which software can continue to operate correctly despite the introduction of invalid inputs.
Safety: Provision of a very high degree of freedom, within the constraints of system effectiveness and cost, from those conditions that can cause death, injury, occupational illness, or damage to or loss of equipment or property.
Software: Computer programs, procedures, rules, and associated documentation and data pertaining to the operation of a computer system.
Software Architecture Design Specification (SADS): Project-specific document that contains the design information needed to support the detailed definition of the individual software system components and, upon completion of the Architecture Design Review, becomes the design baseline for development of the SDDS used in support of software coding.
Software Configuration Management Plan (SCMP): Project-specific plan that specifies the methods and planning employed to implement software configuration management activities.
Software Detailed Design Review (SDDR): Review conducted for the purpose of: (1) reviewing the project's detailed design, SDDS, associated plans, and critical issues; (2) resolving identified issues; (3) obtaining commitment to proceed into the code and test phase; and (4) obtaining commitment to a test program supporting product acceptance.
Software Detailed Design Specification (SDDS): Project-specific document that constitutes an update to and an expansion of the design baseline established at the Architecture Design Review, including a description of the overall program operation and control and the use of common data. The detailed design is described through the lowest component level of software organization and the lowest logical level of database organization.
Software development life cycle: Period that starts with the development of a software product and ends when the product is validated and delivered for QA certification. This life cycle includes a requirements phase, design phase, implementation phase, and software validation phase.
Software Development Plan (SDP): Project-specific plan that identifies and describes the procedures employed to implement the management activities that coordinate schedules, control resources, initiate actions, and monitor progress of the software development effort.
Software Development Test Plan (SDTP): Project-specific plan that defines the scope of software testing that must be completed successfully for each software component developed.
Software end products: Computer programs, software documentation, and databases produced by a software development project.
Software library: Controlled collection of software and related documentation designed to aid in software development, use, or maintenance.
Software quality: Totality of features and characteristics of a software product that bear on its ability to satisfy given needs.
Software Quality Assurance Plan (SQAP): Project-specific plan that states the software quality objectives of the project as conditioned by the product requirements and the significance of the intended application.
Software reliability: Probability that software will not cause the failure of a system for a specified time under specified conditions.
Software Requirements Review (SRR): Review of the provisions of the Software Requirements Specification, which, once approved, will serve as the basis of software end-product acceptance.
Software Requirements Specification (SRS): Project-specific document that provides a controlled statement of the functional, performance, and external interface requirements for the software end products.
Software Validation Phase: Period in the software development life cycle in which the components of a software product are evaluated and integrated and the entire software product is evaluated to determine whether requirements have been satisfied.
Software Validation Test Plan (SVTP): Project-specific plan that describes the software testing required to verify that the software product satisfies the specified requirements.
Source code: Original software expressed in human-readable form (programming language), which must be translated into machine-readable form before it can be executed by the computer.
Test Information Sheet (TIS): Document that defines the objectives, approach, and requirements for a specific test.
Testability: Extent to which software facilitates both the establishment of test criteria and the evaluation of the software with respect to those criteria, or the extent to which the definition of requirements facilitates analysis of the requirements to establish test criteria.
Validation: Process of evaluating software at the end of the software development process to ensure compliance with software requirements.
Verification: Process of determining whether the products of a given phase of the software development cycle fulfill the requirements established during the previous phase.
Walk-through: Review in which the designer or programmer leads members of the review team through a segment of design or code, and the reviewers ask questions and submit comments about technique, style, possible errors, violation of development standards, and other problems.
[Project/Product Name] SDTP
SOFTWARE DEVELOPMENT TEST PLAN
Written by: [Name/Title/Position]    Date: ______
Reviewed by: [Name/Title/Position]    Date: ______
Approved by: [Name/Title/Position]    Date: ______

Document Number [aaa]-SDTP-[#.#]    Revision [#.#]    Page 1 of [#]
REVISION HISTORY

Revision    Description    Date
[##.##]    [Revision description]    [mm/dd/yy]
CONTENTS

1.0 INTRODUCTION 3
2.0 TESTING OVERVIEW 5
3.0 TESTING REQUIREMENTS 8
4.0 TEST REPORTING 13
APPENDIX A Allocation of Test Specifications 15
APPENDIX B-1 Hardware Requirements for Component and Integration Testing Support 16
APPENDIX B-2 Software Requirements for Component and Integration Testing Support 17
APPENDIX C-1 Hardware Configurations for Software Component and Integration Testing 18
APPENDIX C-2 Software Configurations for Software Component and Integration Testing 19
APPENDIX D-1 Software Component and Integration Testing Risks 20
APPENDIX D-2 Software Component and Integration Testing Contingencies 21
APPENDIX E Software Development Test Information Sheet 22
GLOSSARY 23
1.0 INTRODUCTION
1.1 Purpose
This plan identifies and describes the software testing to be conducted during development of the software for the [project/product name] project.
1.2 Scope
This plan describes the tests that are executed during software development for the [project/product name] project. These tests are to be implemented by the [project/product name] software development team under the direction of the software lead engineer.
1.3 Overview
The qualification of the [project/product name] software for system Design Verification Testing is established by the successful completion of the testing described in the [project/product name] Software Validation Test Plan (SVTP) and this SDTP. Review and analysis of the recorded results of the testing described in this plan is performed by the V&V engineers and is a prerequisite to initiating the tests described in the SVTP. This plan defines how the [project/product name] software testing will be conducted during the Code and Test Phase and the Integrate and Test Phase of the software development life cycle.
1.4 Referenced Documents
The following documents of the exact issue shown form a part of this specification to the extent specified herein. In the event of conflict between the documents referenced herein and the content of this specification, the content of this specification shall be considered a superseding requirement.
1.4.1 Project Specifications

• [project/product name] Product Objectives Document, Document Number [aaa]-POD-[#.#], Revision [#.#], dated [date]
• [project/product name] Product Requirements Document, Document Number [aaa]-PRD-[#.#], Revision [#.#], dated [date]
• [project/product name] Software Configuration Management Plan, Document Number [aaa]-CMP-[#.#], Revision [#.#], dated [date]
• [project/product name] Software Development Plan, Document Number [aaa]-SDP-[#.#], Revision [#.#], dated [date]
• [project/product name] Software End-product Acceptance Plan, Document Number [aaa]-EAP-[#.#], Revision [#.#], dated [date]
• [project/product name] Software Quality Assurance Plan, Document Number [aaa]-QAP-[#.#], Revision [#.#], dated [date]
• [project/product name] Software Verification and Validation Plan, Document Number [aaa]-VVP-[#.#], Revision [#.#], dated [date]

1.4.2 Procedures and Guidelines
• Product Development Safety Design Guidelines, Revision [#.#], dated [date]
• Product Development User Interface Design Guidelines, Revision [#.#], dated [date]
• Software Engineering Configuration Management Guidelines, Revision [#.#], dated [date]
• Software Engineering Software Development Guidelines, Revision [#.#], dated [date]
• Software Engineering Software Configuration Management Policies, Revision [#.#], dated [date]
• Software Engineering Software Development Policies, Revision [#.#], dated [date]
• Software Engineering Verification and Validation Guidelines, Revision [#.#], dated [date]
• Software Engineering Verification and Validation Policies, Revision [#.#], dated [date]
2.0 TESTING OVERVIEW
This plan describes the software testing to be performed in order to verify the design and functionality of the [project/product name] software components. Testing will include both the individual component testing and the integration testing of multiple components. The [project/product name] software development team will implement testing methods consistent with the Software Development Policies in order to ensure the development of quality software.
2.1 Qualification Methodology
Qualification of the [project/product name] software for delivery to the quality assurance group for system Design Validation Testing is established by the successful completion of the testing described in the [project/product name] SVTP and this SDTP. The SDTP shall describe the scope of the software testing that must be successfully completed for each software component and for component integration. The [project/product name] software test requirements are defined in terms of:
• Levels of testing
• Test categories
• Test verification methods
2.1.1 Levels of Testing
Qualification of the [project/product name] software at the component level is established by the successful completion of two levels of software testing. The software component testing shall verify the correct operation of each software component described in the [project/product name] Software Detailed Design Specification (SDDS). The software integration testing shall verify the correct operation of all software tasks operating together as described in the SDDS.
2.1.2 Test Categories
The test categories for software component testing and software integration testing will include the following types of testing:
• Functional Testing. Tests designed to verify that all the functional requirements have been satisfied. This category is termed success oriented, because the tests are expected to produce successful results.
• Robustness Testing. Tests designed to evaluate software performance given unexpected inputs. This category is termed failure oriented, because the test inputs are designed to cause the product to fail, given foreseeable and reasonably unforeseeable misuse of the product. (A sketch contrasting a success-oriented case with a failure-oriented case follows this list.)
• Stress Testing. Tests designed to evaluate software in a stress condition in which the amount or rate of data exceeds the amount expected.
• Safety Testing. Tests designed to verify that the software performs in a safe manner and that a complete assessment of the safety design is accomplished.
• Growth Testing. Tests performed to verify that the margins of growth specified for any particular component are supported by the software.
• Regression Testing. Tests performed whenever a software change occurs, to detect faults introduced during modification, to verify that modifications have not caused unintended adverse effects, and to verify that the software still meets its specified requirements.
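To make the success-oriented/failure-oriented distinction concrete, here is a minimal sketch using Python's unittest against a hypothetical infusion-rate setter; the function name, range limits, and exception are invented for illustration and are not defined by this plan.

```python
import unittest

def set_rate(rate_ml_per_hr: float) -> float:
    """Hypothetical device function: accepts rates in (0, 999]."""
    if not 0 < rate_ml_per_hr <= 999:
        raise ValueError("rate out of range")
    return rate_ml_per_hr

class RateTests(unittest.TestCase):
    def test_functional_nominal_rate(self):
        # Success oriented: a requirement-derived nominal value must be accepted.
        self.assertEqual(set_rate(125.0), 125.0)

    def test_robustness_rejects_invalid_input(self):
        # Failure oriented: foreseeable misuse must be rejected, not propagated.
        for bad in (0, -1, 1000, float("nan")):
            with self.assertRaises(ValueError):
                set_rate(bad)

if __name__ == "__main__":
    unittest.main()
```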
2.1.3 Test Verification Methods
The methods of test verification for software component testing will include the following:
• Inspection. Visual examination of an item.
• Analysis. Evaluation of theoretical or empirical data.
• Demonstration. Operational movement or adjustment of an item.
• Test. Operation of an item and recording and evaluation of quantitative data.
2.2 Testing Strategy
The testing strategy for the [project/product name] software will include actions to detect deficiencies with the minimum amount of testing. The test requirements will depend upon previous actions for detecting and reducing deficiencies prior to actually applying test methods. The methods applied will be formulated to support the development and V&V activities of the overall software development effort.
2.2.1 Software Component Breakdown
The definition of tasks, components, subroutines, functions, and data will be detailed in the [project/product name] SDDS.
2.2.2 Integration Strategy
The integration of the [project/product name] software components will be performed on the [project/product name] hardware platform. The start-up code, basic input and output services, and executive, director, or operating system will be integrated first. The remaining software components will then be integrated in an order designed to minimize resource contentions and provide basic functionality before advanced functionality.
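One mechanical way to derive such an order is to sort the components topologically by their dependencies, so that the start-up code and services come first. A sketch using Python's standard graphlib module; the component names and dependency map are invented for illustration.

```python
from graphlib import TopologicalSorter

# Hypothetical dependency map: component -> set of components it depends on.
dependencies = {
    "startup_code": set(),
    "io_services": {"startup_code"},
    "executive": {"startup_code", "io_services"},
    "basic_functions": {"executive"},
    "advanced_functions": {"basic_functions", "io_services"},
}

# static_order() yields an integration order that respects every dependency,
# e.g. startup_code and io_services before the executive and the functions.
print(list(TopologicalSorter(dependencies).static_order()))
```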
2.2.3 Test Methodology
The objective of the testing to be performed on the [project/product name] software will be to detect functional deficiencies as well as deficiencies due to unknown side effects. The testing will also be designed to uncover potential performance issues. The software testing will begin at the component level and will conclude with the completion of all integration testing tasks.
2.2.4 Test Specification
The objective of the test specification is to ensure that any given test verifies the performance of the software component. The allocation of test specifications to the various test schema is given in Appendix A.
2.2.5 Integration Prerequisites
Prior to software integration and test, the [project/product name] code must satisfy a set of prerequisites. Satisfaction of the prerequisites is the responsibility of the individual code author. The software integrator will reject any software component that is submitted for integration if these prerequisites have not been satisfied. Each software subroutine, function, or component will meet the following prerequisites (an automated gate for two of them is sketched after the list):
• Accurately implements the header preamble
• Has had satisfactory completion of a code walk-through, if requested
• Has had trace verification that the code satisfies the design requirements
• Conforms to the software programming standards
• Has had error-free preprocessing, compilation, and assembly
• Has passed a language diagnostic facility or tool and is free of any warnings
• Has passed individual functional, robustness, stress, safety, and growth testing
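Two of these prerequisites, error-free compilation and freedom from warnings, can be gated automatically before submission to the integrator. A minimal sketch, assuming a C code base and the gcc toolchain; the flags shown are commonplace, but the project's actual compiler and options would be substituted.

```python
import subprocess
import sys

def gate_component(source_file: str) -> bool:
    """Reject a component whose syntax check emits any error or warning."""
    result = subprocess.run(
        ["gcc", "-Wall", "-Wextra", "-Werror", "-fsyntax-only", source_file],
        capture_output=True, text=True,
    )
    if result.returncode != 0:
        print(result.stderr, file=sys.stderr)
        return False
    return True

if __name__ == "__main__":
    ok = all(gate_component(f) for f in sys.argv[1:])
    sys.exit(0 if ok else 1)   # nonzero exit blocks submission to integration
```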
2.2.6 Accepted Software Components
When the code within a task or the integration of tasks has successfully passed testing, the software lead engineer will deliver that task to the software configuration manager for configuration control. The code will be placed into the [project/product name] software library and be available for V&V activities.
3.0 TEST REQUIREMENTS
3.1 Component Testing
Component testing will be performed by the code author and will be defined at the [project/product name] SDDS level.
3.1.1 Component Test Requirements
The component testing requirements include the following:
• Out-of-bounds testing
• Extreme-value testing (see the boundary-value sketch following this section's lists)
• Branching and logic control testing
• Interrupt and re-entry testing
• Component-failure testing

Component testing will be performed using the following code execution:
• Via source code debugger
• Via the target platform using an emulator
• Analysis involving evaluation of data or control flow
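Out-of-bounds and extreme-value cases follow a standard boundary-value pattern; the helper below generates them from a parameter's specified range. This is a sketch only, with an invented parameter range by way of example.

```python
def boundary_values(lo: float, hi: float, step: float = 1.0) -> dict[str, list[float]]:
    """Classic boundary-value selection for a parameter specified as [lo, hi]."""
    return {
        "in_bounds": [lo, lo + step, (lo + hi) / 2, hi - step, hi],  # extremes + nominal
        "out_of_bounds": [lo - step, hi + step],                     # must be rejected
    }

# Hypothetical example: a setting specified over the range [1, 999].
cases = boundary_values(1, 999)
print(cases["in_bounds"])      # [1, 2.0, 500.0, 998.0, 999]
print(cases["out_of_bounds"])  # [0.0, 1000.0]
```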
3.1.2 Component Test Responsibilities
A Development Test Information Sheet (DTIS) will be generated for every component test. The DTIS will be written by the component author and reviewed by the software lead engineer. The actual testing will be performed by the DTIS author, and the results will be reviewed by the software lead engineer.
3.1.3 Component Test Categories
All components will receive functional and safety test category testing. Other test categories will be applied as necessary.
3.1.4 Component Test Schedule
Component testing will be performed after code development test completion and before other code segments are completed. Use of several test techniques will allow the testing to occur independently, before other components are complete. During code development, components will be tested as they become available.
3.2 Integration Testing
Integration testing will encompass testing of the following:
• Complete code
• Integration of tasks
• Tasks themselves
3.2.1 Integration Test Requirements
The integration testing requirements include the following:
• Out-of-bounds testing
• Extreme-value testing
• Branching and logic control testing
• Interrupt and re-entry testing
• Component-failure testing
• Discrete and LSI physical component-failure testing
• Software structure testing
Component testing will be performed using code execution on the target platform by means of an emulator, together with analysis involving evaluation of data or control flow. Component regression testing will be performed to ensure that the component integration has not adversely affected previously successful component testing. After the components have been fully integrated into their respective tasks, detailed integration testing will begin. After the tasks have been integrated, detailed testing of software system integration will be performed. These tests will include all functionality of the [project/product name] software as specified in the Software Requirements Specification (SRS). Additionally, several random tests will be performed by seeding errors and recording the system response.
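Error seeding is most useful when it is reproducible. The sketch below drives corruption from a fixed-seed pseudo-random generator so that every recorded system response can be regenerated; process_message is a hypothetical entry point, not a function defined by this plan.

```python
import random

def seed_errors(message: bytes, rng: random.Random, n_flips: int = 1) -> bytes:
    """Corrupt a message by flipping randomly chosen bits (the seeded errors)."""
    corrupted = bytearray(message)
    for _ in range(n_flips):
        pos = rng.randrange(len(corrupted))
        corrupted[pos] ^= 1 << rng.randrange(8)
    return bytes(corrupted)

rng = random.Random(42)   # fixed seed so the test sequence is reproducible
baseline = b"\x01\x02\x03\x04"
for trial in range(5):
    corrupted = seed_errors(baseline, rng)
    # The system response to each corrupted input would be recorded on the DTIS:
    # response = process_message(corrupted)   # process_message is hypothetical
    print(trial, corrupted.hex())
```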
3.2.2 Integration Test Responsibilities
A DTIS will be generated for every integration test. The DTIS will be written by the software integration engineer and reviewed by the software lead engineer. The actual testing will be performed by the software integration engineer, and the results will be reviewed by the software lead engineer.
3.2.3 Integration Test Categories
The complete [project/product name] software will be tested using functional, robustness, stress, safety, and growth tests. The individually integrated code will be tested using functional,
robustness, stress, and safety tests, and the integrated tasks will be tested using functional, robustness, stress, and safety tests.
3.2.4 Integration Test Schedule
Integration testing will be performed after the components have been completed and have successfully passed code development testing. Several test techniques will be used, in order to allow the testing to occur independently before other components are complete. During code development, integration testing will occur as tasks and components become available. The integration of individual tasks will be completed as other tasks are beginning. This testing will continue throughout the Integration and Test Phase of the software development life cycle.
3.3 Component Integration Order
The anticipated order of the testing for components, tasks, and integrated tasks will follow the order of task integration as specified in the [project/product name] Software Development Plan (SDP).
3.4 Resource Requirements
Software development testing of the [project/product name] software is the responsibility of the [project/product name] software development team. In order to support the component and integration testing, the following resources have been identified.
3.4.1 Facilities
The component and integration testing will be performed at the company facilities, using equipment located there.
3.4.2 Personnel
The component and integration testing will be performed by personnel of the [project/product name] software development team. Additional testing will be performed by the [project/product name] software V&V personnel in accordance with the SVTP.
3.4.3 Hardware
The hardware required to support the component and integration testing of the [project/product name] software is listed in Appendix B-1.
3.4.4 Software
The software required to support the component and integration testing of the [project/product name] software is listed in Appendix B-2.
3.4.5 Resources Plan
The resources needed to complete [project/product name] component and integration testing will be composed of corporate personnel. If these resources are insufficient, then qualified outside consultants will be hired temporarily and assigned to the [project/product name] software development team.
3.5 Test Configurations

3.5.1 Hardware Platform Configuration
The [project/product name] test bed hardware is listed in Appendix C-1.
3.5.2 Software Platform Configuration
The [project/product name] test bed software is listed in Appendix C-2.
3.5.3 Software Library
The [project/product name] software project will implement a software library that will hold the baselined [project/product name] software. This library will be placed under [project/product name] software development configuration control.
3.6 Risks and Contingencies

3.6.1 Assumptions
In order to successfully complete the component and integration testing of the [project/product name] software, the following assumptions are made:
• Training requirements and learning curves for the software development tools can be assimilated while actually using the tools.
• No major requirements will be either changed or added after the Software Requirements Phase of the software development life cycle.
• The forecasts of memory and CPU bandwidth requirements are accurate.
3.6.2 Risks
The risks associated with the [project/product name] software component and integration testing are listed in Appendix D-1.
3.6.3 Contingencies
The contingencies associated with the [project/product name] software component and integration testing are listed in Appendix D-2.
4.0 TEST REPORTING
4.1 Development Test Information Sheet
Each software development component and integration test will have a software DTIS generated for it. Each DTIS will be reviewed by the software lead engineer. Each DTIS will comply with the following:
• The DTIS will be written such that the test is repeatable.
• The DTIS will present the test objective, test approach, and success criteria.
• The DTIS will present the complete test setup.
• The DTIS will be written such that the test can migrate from the component testing level to the integration testing level.
• The DTIS will be written such that the test results can be recorded on the form.
• The DTIS standard inputs are defined by the code requirements.
An example of the DTIS is shown in Appendix E.
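Where DTIS records are also kept electronically, a structured record mirroring the form helps enforce that every field needed for repeatability is captured. A sketch only; the field set paraphrases the form in Appendix E, and the storage format is not mandated by this plan.

```python
from dataclasses import dataclass, field, asdict
import json

@dataclass
class DTISRecord:
    """Electronic mirror of a Development Test Information Sheet."""
    test_number: str
    test_category: str            # functional, robustness, stress, safety, growth
    component_name: str
    requirement_satisfied: str
    objective: str
    approach: str
    success_criteria: str
    test_setup: str               # complete setup, so the test is repeatable
    results: list[str] = field(default_factory=list)

# Hypothetical example values for illustration only.
dtis = DTISRecord(
    test_number="DTIS-042", test_category="functional",
    component_name="pump_controller", requirement_satisfied="SRS-3.2.1",
    objective="Verify nominal rate handling", approach="Test",
    success_criteria="Returned rate equals commanded rate",
    test_setup="Configuration 3 (workstation, emulator, target hardware)",
)
dtis.results.append("PASS: 125.0 ml/h accepted")
print(json.dumps(asdict(dtis), indent=2))
```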
4.2 Validation Test Reporting
Validation testing of the [project/product name] software is specified in the SVTP.
APPENDIX A ALLOCATION OF TEST SPECIFICATIONS

• Interfaces. Verification includes: transfer of control; parameters received and sent; data table interactions; adaptive parameters; hardware interactions; mailbox transfers. Test categories: Functional, Robustness, Stress.
• Computations. Verification includes: singular, extreme, out-of-range, and nominal values; all areas. Test categories: Functional, Robustness, Safety.
• Data input. Verification includes: sampling and sampling rates of data input options. Test categories: Functional, Robustness, Stress.
• Data output. Verification includes: output options and formats; error messages; informational messages. Test categories: Functional, Stress.
• Code execution. Verification includes: all executable code; branching logic and control decisions; interrupt handling, interrupt lockout, data write lockout, and reentry. Test categories: Functional, Robustness, Safety, Stress.
• Code environment. Verification includes: illegal conditions or range of conditions at a singular condition or state. Test categories: Robustness, Safety.
• Design structures. Verification includes: processing time; memory usage; task schedule latency. Test categories: Growth, Stress.
APPENDIX B-1 HARDWARE REQUIREMENTS FOR COMPONENT AND INTEGRATION TESTING SUPPORT

Quantity    Description
[#]    Emulator
[#]    Workstation and/or PC
1    DATAIO UNISITE PROM Programmer
1    EPROM Ultraviolet Eraser
[Enter quantity required]    [Enter hardware description]
APPENDIX B-2 SOFTWARE REQUIREMENTS FOR COMPONENT AND INTEGRATION TESTING SUPPORT

Item    Description
1    Software CASE tool
1    Complexity analysis tool
1    Cross and native compiler
1    Assembler
1    Cross and native debugger
1    Module librarian
1    Source code control system
1    Text editor or publishing system
1    Graphical or drawing system
1    Database application system
[Enter quantity required]    [Enter software descriptions]
APPENDIX C-1 HARDWARE CONFIGURATIONS FOR SOFTWARE COMPONENT AND INTEGRATION TESTING

Configuration Number    Configuration Description
1    Workstations and/or PCs
2    Workstations and/or PCs with emulators
3    Workstations and/or PCs with emulators and target hardware
4    Target hardware with EPROM
[enter sequential number]    [Enter hardware configuration description]
APPENDIX C-2 SOFTWARE CONFIGURATIONS FOR SOFTWARE COMPONENT AND INTEGRATION TESTING

Configuration Number    Configuration Description
1    Workstations and/or PCs with target software executing within debugger environment
2    Emulator with target software operating under real-time executive, operating system, or driver
3    Emulator with target software operating under real-time executive, operating system, or driver on target hardware
4    Target software embedded within EPROM operating under real-time executive, operating system, or driver on target hardware
[enter sequential number]    [Enter software configuration description]
APPENDIX D-1 SOFTWARE COMPONENT AND INTEGRATION TESTING RISKS

Item    Description
1    Memory and CPU bandwidth requirements not yet fully determined
[enter sequential number]    [Enter task, activity, or technical description of risk]
APPENDIX D-2 SOFTWARE COMPONENT AND INTEGRATION TESTING CONTINGENCIES

Item    Description
1    Eliminate selected system requirements
2    Add staff to support an increase in the estimated code size
3    Extend the schedule to accommodate new software requirements
[enter sequential number]    [Enter task, activity, or technical description of contingency]
APPENDIX E SOFTWARE DEVELOPMENT TEST INFORMATION SHEET

SOFTWARE DEVELOPMENT TEST INFORMATION SHEET

Test Category: ______    Test Number: ______    Component Name: ______    Requirement Number Satisfied: ______

1. Component objectives and component success criteria
2. Test objectives and success criteria
3. Test approach
4. Test instrumentation
5. Test duration
6. Input data and format
7. Output data and format
8. Data collection, reduction, and analysis requirements
9. Test script
10. Test drivers
11. Test stubs
12. Test data stream
13. Test control flow stream
14. Pretest comments
15. Post-test comments
16. Component build description (and subordinate DTISs)
17. Signatures:

Test Conductor: __________________    Date: ______
Software Lead Engineer: __________________    Date: ______
GLOSSARY

Baseline: Specification or product, formally reviewed and agreed upon, that thereafter serves as the basis for further development and that can be changed only through formal change control procedures.
Change control: Process by which a change is proposed, evaluated, approved or rejected, scheduled, and tracked.
Code: Loosely, one or more computer programs or part of a computer program.
Component: Unit of code that performs a specific task or a group of logically related code units that perform a specific task or set of tasks.
Component testing: Testing conducted to verify the implementation of the design for one software component or a collection of software components.
Computer program: Sequence of instructions suitable for processing by a computer. Processing may include the use of an assembler, a compiler, an interpreter, or a translator to prepare the program for execution as well as to execute it.
Design phase: Period in the software development cycle during which the designs for architecture, software components, interfaces, and data are created, documented, and verified to satisfy requirements.
Design requirement: Any requirement that impacts or constrains the design of a software system or software system component.
Documentation: Manuals, written procedures or policies, records, or reports that provide information concerning uses, maintenance, or validation of software.
Error: Discrepancy between a computed, observed, or measured value or condition and the true, specified, or theoretically correct value or condition.
Evaluation: Process of determining whether an item or activity meets specified criteria.
Failure: Inability of a system or system component to perform its required function (see fault).
Fault: Defect, which may lead to a failure, of a system or system component that is caused by a defective, missing, or extraneous instruction or set of related instructions in the definition, specification, design, or implementation of a system.
Implementation phase: Period in the software development life cycle during which a software product is created from design documentation and debugged.
Inspection: Formal evaluation technique in which software requirements, design, or code are examined in detail by a person or group other than the author in order to detect faults, violations of development standards, or other problems.
Regression testing: Selective retesting to detect faults introduced during modification, to verify that modifications have not caused unintended adverse effects and that a modified system or system component still meets its specified requirements.
Requirements phase: Period in the software development cycle during which the requirements, such as functional and performance capabilities for a software product, are defined and documented.
Robustness: Extent to which software can continue to operate correctly despite the introduction of invalid inputs.
Safety: Provision of a very high degree of freedom, within the constraints of system effectiveness and cost, from those conditions that can cause death, injury, occupational illness, or damage to or loss of equipment or property.
Software: Computer programs, procedures, rules and associated documentation, and data pertaining to the operation of a computer system.
Software Detailed Design Specification (SDDS): Project-specific document that constitutes an update to and an expansion of the design baseline established at the Architecture Design Review and includes a description of the overall program operation and control and the use of common data. The detailed design is described through the lowest component level of software organization and the lowest logical level of database organization.
Software development life cycle: Period that starts with the development of a software product and ends when the product is validated and delivered for certification. This life cycle includes a requirements phase, design phase, implementation phase, and software validation phase.
Software library: Controlled collection of software and related documentation designed to aid in software development, use, or maintenance.
Software Validation Phase: Period in the software development life cycle in which the components of a software product are evaluated and integrated and the entire software product is then evaluated to determine whether requirements have been satisfied.
Software Validation Test Plan (SVTP): Project-specific plan that describes the software testing required to verify that the software product satisfies the specified requirements.
Test bed: Hardware and software platform configuration to be used for testing purposes.
Test Information Sheet (TIS): Document that defines the objectives, approach, and requirements for a specific test.
Validation: Process of evaluating software at the end of the software development process to ensure its compliance with software requirements.
Verification: Process of determining whether the products of a given phase of the software development life cycle fulfill the requirements established during the previous phase.
[Project/Product Name] SEAP
SOFTWARE END-PRODUCT ACCEPTANCE PLAN
Written by: [Name/Title/Position]    Date: ______
Reviewed by: [Name/Title/Position]    Date: ______
Approved by: [Name/Title/Position]    Date: ______

Document Number [aaa]-SEAP-[#.#]    Revision [#.#]    Page 1 of [#]
REVISION HISTORY

Revision    Description    Date
[##.##]    [Revision description]    [mm/dd/yy]
CONTENTS

1.0 INTRODUCTION 3
2.0 SOFTWARE END-PRODUCT ACCEPTANCE REQUIREMENTS 5
3.0 METHODS FOR ACCUMULATION OF ACCEPTANCE-RELATED DATA 6
APPENDIX A Software End-product Acceptance Check List 7
APPENDIX B Software End-product Acceptance Sign-Off List 11
GLOSSARY 12
1.0 INTRODUCTION
1.1 Purpose
This document describes the plans for acceptance of the software end products developed for the [project/product name] product. The term end products in this plan refers to all deliverables generated by the [project/product name] software development team. This plan complies with the Software Development Policies and follows their guidance and control.
1.2 Scope This plan describes the criteria that will be met in order to obtain written approval of the [project/product name] software by the [project/product name] software development team and the [project/product name] [title/position]. This plan is to be implemented by the [project/product name] software development team under the direction of the software lead engineer and is in effect for the full range of tasks that are described in the [project/product name] Software Development Plan (SDP).
1.3 Overview This plan describes the end products, the services required, and their acceptance criteria for the [project/product name] software project. This plan lists the acceptance criteria, reviews, audits, and approval schedules by which the end products and services will be judged complete. This plan also supports other activities so that proper software verification and validation (V&V) can be performed.
1.4 Referenced Documents The following documents of the exact issue shown form a part of this specification to the extent specified herein. In the event of conflict between the documents referenced herein and the content of this specification, the content of this specification shall be considered a superseding requirement.
1.4.1 Project Specifications

• [project/product name] Product Objectives Document, Document Number [aaa]-POD-[#.#], Revision [#.#], dated [date]
• [project/product name] Product Requirements Document, Document Number [aaa]-PRD-[#.#], Revision [#.#], dated [date]
• [project/product name] Software Configuration Management Plan, Document Number [aaa]-CMP-[#.#], Revision [#.#], dated [date]
• [project/product name] Software Development Plan, Document Number [aaa]-SDP-[#.#], Revision [#.#], dated [date]
• [project/product name] Software Development Test Plan, Document Number [aaa]-DTP-[#.#], Revision [#.#], dated [date]
• [project/product name] Software Quality Assurance Plan, Document Number [aaa]-QAP-[#.#], Revision [#.#], dated [date]
• [project/product name] Software Verification and Validation Plan, Document Number [aaa]-VVP-[#.#], Revision [#.#], dated [date]

1.4.2 Procedures and Guidelines
• Product Development Safety Design Guidelines, Revision [#.#], dated [date]
• Product Development User Interface Design Guidelines, Revision [#.#], dated [date]
• Software Engineering Configuration Management Guidelines, Revision [#.#], dated [date]
• Software Engineering Software Development Guidelines, Revision [#.#], dated [date]
• Software Engineering Software Configuration Management Policies, Revision [#.#], dated [date]
• Software Engineering Software Development Policies, Revision [#.#], dated [date]
• Software Engineering Verification and Validation Guidelines, Revision [#.#], dated [date]
• Software Engineering Verification and Validation Policies, Revision [#.#], dated [date]
2.0 SOFTWARE END-PRODUCT ACCEPTANCE REQUIREMENTS
Each end product produced by the software development team for the [project/product name] project requires one of three levels of concurrence (high, medium, or low) by the [project/product name] [title/position]. The high level indicates that the [title/position] must agree with the content of that particular end product. The medium level indicates that the [title/position] understands the content of the end product. A low level indicates that the [title/position] does not require a full understanding of the end-product content. Regardless of the concurrence level, the signature of the [title/position] indicates that the end product has been received and that the specified criteria were met. For each end product identified in this plan, the following information is provided in the format shown:

Required format: Delineation of the content, media, or source governing the format
Number of deliverables: Maximum number of distinct versions that could be produced, including the original
Producing schedule: Date that the producing activity occurs
Delivering schedule: Date that the end product is scheduled to be delivered to the [title/position]
Review and acceptance schedule: Date that the end product is scheduled to be accepted by the [title/position]
Final criteria: The judgment criteria used to accept the end product
Concurrence level: The level to which the [title/position] must agree with the end-product content
Appendix A contains the checklist of the [project/product name] software end products to which this plan applies. Appendix B, the signature sign-off list that corresponds to the checklist of software end products shown in Appendix A, becomes the formal signature document that is signed by the [title/position].
3.0 METHODS FOR ACCUMULATION OF ACCEPTANCE-RELATED DATA
The [project/product name] software development project will establish and maintain an electronic and hard-copy file system to accumulate end-product acceptance data for each software end product. These files will contain the following:
• Physical evidence of the product or sub-products having been produced or performed
• Written approvals
• Other supporting documentation
Each file system will be set up under the direction and control of the [project/product name] software lead engineer.
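For illustration only, and not as part of the controlled plan template, the acceptance data for each end product can be kept as a structured electronic record that mirrors the criteria fields of Section 2.0. The sketch below is a minimal example; all identifiers (AcceptanceRecord, ConcurrenceLevel) are hypothetical, and any actual file system layout remains under the direction of the software lead engineer.

```python
# Illustrative sketch only: a minimal electronic record for end-product
# acceptance data, mirroring the criteria fields defined in Section 2.0.
# All names here are hypothetical, not mandated by the plan.
from dataclasses import dataclass, field
from enum import Enum
from typing import List

class ConcurrenceLevel(Enum):
    HIGH = "high"      # approver must agree with the end-product content
    MEDIUM = "medium"  # approver understands the content
    LOW = "low"        # full understanding of the content not required

@dataclass
class AcceptanceRecord:
    item_number: int
    description: str       # e.g., "Software Requirements Specification (SRS)"
    required_format: str   # content, media, or source governing the format
    deliverables: int      # maximum distinct versions, including the original
    producing_date: str
    delivering_date: str
    review_date: str
    final_criteria: str
    concurrence: ConcurrenceLevel
    evidence: List[str] = field(default_factory=list)   # physical evidence (file paths)
    approvals: List[str] = field(default_factory=list)  # written approvals (signatories)

# Example entry corresponding to checklist item 7 in Appendix A.
record = AcceptanceRecord(
    item_number=7,
    description="Software Requirements Specification (SRS)",
    required_format="Per Software Development Policies",
    deliverables=3,
    producing_date="mm/dd/yy",
    delivering_date="mm/dd/yy",
    review_date="mm/dd/yy",
    final_criteria="Product delivered in required format",
    concurrence=ConcurrenceLevel.MEDIUM,
)
```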
APPENDIX A SOFTWARE END-PRODUCT ACCEPTANCE CHECK LIST

Item 1: Software development estimates
Required format: Per Software Development Policies
Number of deliverables: 1
Producing schedule: [scheduled date(s)]
Delivering schedule: [scheduled date(s)]
Review and acceptance schedule: [scheduled date(s)]
Final criteria: Product delivered in required format
Concurrence level: Medium

Item 2: Software Quality Assurance Plan (SQAP)
Required format: Per Software Development Policies
Number of deliverables: 1
Producing schedule: [scheduled date(s)]
Delivering schedule: [scheduled date(s)]
Review and acceptance schedule: [scheduled date(s)]
Final criteria: Product delivered in required format
Concurrence level: Low

Item 3: Interface Design Specification (IDS)
Required format: Per Software Development Policies
Number of deliverables: [number required]
Producing schedule: [scheduled date(s)]
Delivering schedule: [scheduled date(s)]
Review and acceptance schedule: [scheduled date(s)]
Final criteria: Product delivered in required format
Concurrence level: Medium

Item 4: Software Configuration Management Plan (SCMP)
Required format: Per Software Development Policies
Number of deliverables: 1
Producing schedule: [scheduled date(s)]
Delivering schedule: [scheduled date(s)]
Review and acceptance schedule: [scheduled date(s)]
Final criteria: Product delivered in required format
Concurrence level: Low

Item 5: Software Development Plan (SDP)
Required format: Per Software Development Policies
Number of deliverables: [number required]
Producing schedule: [scheduled date(s)]
Delivering schedule: [scheduled date(s)]
Review and acceptance schedule: [scheduled date(s)]
Final criteria: Product delivered in required format
Concurrence level: Medium

Item 6: Software End-product Acceptance Plan (SEAP)
Required format: Per Software Development Policies
Number of deliverables: 3
Producing schedule: [scheduled date(s)]
Delivering schedule: [scheduled date(s)]
Review and acceptance schedule: [scheduled date(s)]
Final criteria: Product delivered in required format
Concurrence level: High
Item 7: Software Requirements Specification (SRS)
Required format: Per Software Development Policies
Number of deliverables: 3
Producing schedule: [scheduled date(s)]
Delivering schedule: [scheduled date(s)]
Review and acceptance schedule: [scheduled date(s)]
Final criteria: Product delivered in required format
Concurrence level: Medium

Item 8: Software Requirements Review (SRR)
Required format: Per Software Development Policies
Number of deliverables: 1
Producing schedule: [scheduled date(s)]
Delivering schedule: [scheduled date(s)]
Review and acceptance schedule: [scheduled date(s)]
Final criteria: Memo announcing review and follow-up memo documenting results
Concurrence level: Medium

Item 9: Software design walk-throughs
Required format: Per Software Development Policies
Number of deliverables: Indeterminate
Producing schedule: As scheduled
Delivering schedule: As scheduled
Review and acceptance schedule: As scheduled
Final criteria: Collection of memos announcing walk-throughs and follow-up memos documenting results
Concurrence level: Medium

Item 10: Software Architecture Design Specification (SADS)
Required format: Per Software Development Policies
Number of deliverables: 1
Producing schedule: [scheduled date(s)]
Delivering schedule: [scheduled date(s)]
Review and acceptance schedule: [scheduled date(s)]
Final criteria: Product delivered in required format
Concurrence level: Medium

Item 11: Software Development Test Plan (SDTP)
Required format: Per Software Development Policies
Number of deliverables: 2
Producing schedule: [scheduled date(s)]
Delivering schedule: [scheduled date(s)]
Review and acceptance schedule: [scheduled date(s)]
Final criteria: Product delivered in required format
Concurrence level: Medium

Item 12: Software Architecture Design Review (SADR)
Required format: Per Software Development Policies
Number of deliverables: 1
Producing schedule: [scheduled date(s)]
Delivering schedule: [scheduled date(s)]
Review and acceptance schedule: [scheduled date(s)]
Final criteria: Memo announcing review and follow-up memo stating results
Concurrence level: Medium
Item 13: Software Detailed Design Specification (SDDS)
Required format: Per Software Development Policies
Number of deliverables: 1
Producing schedule: [scheduled date(s)]
Delivering schedule: [scheduled date(s)]
Review and acceptance schedule: [scheduled date(s)]
Final criteria: Product delivered in required format
Concurrence level: Medium

Item 14: Software Detailed Design Review (SDDR)
Required format: Per Software Development Policies
Number of deliverables: 1
Producing schedule: [scheduled date(s)]
Delivering schedule: [scheduled date(s)]
Review and acceptance schedule: [scheduled date(s)]
Final criteria: Memo announcing review and follow-up memo stating results
Concurrence level: Medium

Item 15: Software code walk-throughs
Required format: Per Software Development Policies
Number of deliverables: Indeterminate
Producing schedule: As scheduled
Delivering schedule: As scheduled
Review and acceptance schedule: As scheduled
Final criteria: Collection of memos announcing walk-throughs and follow-up memos stating results
Concurrence level: Medium

Item 16: Closure on all Test Information Sheets (DTISs and VTISs)
Required format: Per Software Development SOPs and Software V&V SOPs
Number of deliverables: 1
Producing schedule: [scheduled date(s)]
Delivering schedule: [scheduled date(s)]
Review and acceptance schedule: [scheduled date(s)]
Final criteria: Collection of all DTISs and VTISs completed, signed off, and closed
Concurrence level: Medium

Item 17: Closure on all Software Anomaly Reports
Required format: Per Software V&V Policies
Number of deliverables: 1
Producing schedule: [scheduled date(s)]
Delivering schedule: [scheduled date(s)]
Review and acceptance schedule: [scheduled date(s)]
Final criteria: Collection of all Software Anomaly Reports completed, signed off, and closed
Concurrence level: Medium
Item 18: Released software
Required format: Per Software Engineering Policies
Number of deliverables: 1
Producing schedule: [scheduled date(s)]
Delivering schedule: [scheduled date(s)]
Review and acceptance schedule: [scheduled date(s)]
Final criteria: Collection of source code, source code listings, compilation listings, link/load map listing, and EPROM
Concurrence level: Medium

Item 19: Hardware support software
Required format: Product delivered in appropriate format
Number of deliverables: 3
Producing schedule: [scheduled date(s)]
Delivering schedule: [scheduled date(s)]
Review and acceptance schedule: [scheduled date(s)]
Final criteria: Collection of source code, source code listings, compilation listings, link/load map listing, and EPROM if applicable
Concurrence level: Medium

Item 20: Audit on end-product acceptance
Required format: Per Software Engineering Policies, SEAP
Number of deliverables: Indeterminate
Producing schedule: As scheduled
Delivering schedule: As scheduled
Review and acceptance schedule: As scheduled
Final criteria: Collection of memos documenting audit and original sign-off sheet completed
Concurrence level: High

Item 21: [Enter item description]
Required format: [Enter controlling format document]
Number of deliverables: [Enter number required]
Producing schedule: [scheduled date(s)]
Delivering schedule: [scheduled date(s)]
Review and acceptance schedule: [scheduled date(s)]
Final criteria: [Enter product required format]
Concurrence level: [Enter level]
APPENDIX B SOFTWARE END-PRODUCT ACCEPTANCE SIGN-OFF LIST

Item Number    End-product Description    Initials    Date
1    Software development estimates
2    Software Quality Assurance Plan (SQAP)
3    Interface Design Specification (IDS)
4    Software Configuration Management Plan (SCMP)
5    Software Development Plan (SDP)
6    Software End-product Acceptance Plan (SEAP)
7    Software Requirements Specification (SRS)
8    Software Requirements Review (SRR)
9    Software design walk-throughs
10    Software Architecture Design Specification (SADS)
11    Software Development Test Plan (SDTP)
12    Software Architecture Design Review (SADR)
13    Software Detailed Design Specification (SDDS)
14    Software Detailed Design Review (SDDR)
15    Software code walk-throughs
16    Closure on all DTISs
17    Closure on all VTISs
18    Closure on all Software Anomaly Reports
19    Released software
20    Hardware support software
21    Audit on end-product acceptance
[#]    [Enter item description]
GLOSSARY

Acceptance criteria: Criteria that a software end product must meet in order to successfully complete a test phase or meet delivery requirements.
Audit: Independent review for the purpose of assessing compliance with software requirements, specifications, baselines, standards, procedures, instructions, and coding requirements.
Baseline: Specification or product that has been formally reviewed and agreed upon, that thereafter serves as the basis for further development, and that can be changed only through formal change control procedures.
Change control: Process by which a change is proposed, evaluated, approved or rejected, scheduled, and tracked.
Computer program: Sequence of instructions suitable for processing by a computer. Processing may include the use of an assembler, a compiler, an interpreter, or a translator to prepare the program for execution as well as to execute it.
Delivery: Transfer of responsibility for an item from one activity to another, as in the delivery of the validated software product to Quality Assurance for certification.
Documentation: Manuals, written procedures or policies, records, or reports that provide information concerning uses, maintenance, or validation of software.
Software: Computer programs, procedures, rules, and associated documentation and data pertaining to the operation of a computer system.
Software Development Plan (SDP): Project-specific plan that identifies and describes the procedures employed to implement the management activities that coordinate schedules, control resources, initiate actions, and monitor progress of the software development effort.
Software end products: Computer programs, software documentation, and databases produced by a software development project.
Software project: Planned and authorized undertaking, of specified scope and duration, that results in the expenditure of resources toward the development of a product that is primarily one or more computer programs.
Validation: Process of evaluating software at the end of the software development process to ensure compliance with software requirements.
Verification: Process of determining whether the products of a given phase of the software development cycle fulfill the requirements established during the previous phase.
[Project/Product Name] SQAP SOFTWARE QUALITY ASSURANCE PLAN
Written by:
[Name/Title/Position]
Date
Reviewed by:
[Name/Title/Position]
Date
Approved by:
[Name/Title/Position]
Date
Document Number [aaa]-SQAP-[#.#]    Revision [#.#]    Page 1 of [#]

REVISION HISTORY
Revision    Description    Date
[##.##]    [Revision description]    [mm/dd/yy]
CONTENTS

1.0 INTRODUCTION 3
2.0 MANAGEMENT 5
3.0 SOFTWARE QUALITY ASSURANCE PROCEDURES 7
APPENDIX A Software Quality Assurance Milestones 11
APPENDIX B Summary of Software Quality Assurance Activities 12
GLOSSARY 13
1.0 INTRODUCTION
1.1 Purpose This plan defines the quality assurance actions that the [project/product name] software development team will undertake in order to assure that the software developed for the [project/product name] product complies with the Software Development Policies, relevant software development procedures, and corporate quality assurance policies and standards.
1.2 Scope This plan is applicable to the software end products, produced and used during [project/product name] software development, which are under the control of the software development team. This plan provides the basis for the software quality assurance (SQA) activities that are applied to the [project/product name] software. This plan also defines the software products and processes that are being acted upon and describes or references the procedures that implement the activities throughout the software development life cycle and process.
1.3 Overview The [project/product name] software development team recognizes that an effective SQA program is the key to meeting schedule objectives while meeting performance, safety, and reliability requirements. This plan has been designed to provide the methods necessary to do the following:
• Minimize the incorporation of deficiencies
• Detect introduced deficiencies
• Effect corrective action when deficiencies occur
Implementation of this plan is the responsibility of the [project/product name] software lead engineer. The quality assurance approach set forth in this plan defines the activities that software developers and software V&V personnel will perform during the development of the [project/product name] software in order to ensure the quality of the product software. The software lead engineer will provide the following:
• Visibility to assure the accomplishment of a disciplined software development process
• Assignment of group and V&V reviews to assure that the software product and related documentation comply with the Software Development Policies, relevant software development procedures, and corporate quality assurance policies and standards
1.4 Referenced Documents The following documents of the exact issue shown form a part of this specification to the extent specified herein. In the event of conflict between the documents referenced herein and the content of this specification, the content of this specification shall be considered a superseding requirement.
1.4.1 Project Specifications

• [project/product name] Product Objectives Document, Document Number [aaa]-POD-[#.#], Revision [#.#], dated [date]
• [project/product name] Product Requirements Document, Document Number [aaa]-PRD-[#.#], Revision [#.#], dated [date]

1.4.2 Procedures and Guidelines
• Product Development Safety Design Guidelines, Revision [#.#], dated [date]
• Product Development User Interface Design Guidelines, Revision [#.#], dated [date]
• Software Engineering Configuration Management Guidelines, Revision [#.#], dated [date]
• Software Engineering Software Development Guidelines, Revision [#.#], dated [date]
• Software Engineering Software Configuration Management Policies, Revision [#.#], dated [date]
• Software Engineering Software Development Policies, Revision [#.#], dated [date]
• Software Engineering Verification and Validation Guidelines, Revision [#.#], dated [date]
• Software Engineering Verification and Validation Policies, Revision [#.#], dated [date]
2.0 MANAGEMENT
2.1 Organization The [project/product name] program has been organized under the direction of [title/position], who has assembled a program team that represents all concerned disciplines. This team coordinates the development activities and provides support for product development. Product development of [project/product name] involves [insert engineering disciplines here] and software engineering. The interface between these disciplines during product development is provided through the technical team, which meets to address and resolve [project/product name] product development issues.

A lead software engineer heads the [project/product name] software development and provides technical direction in all aspects of software development. Other software engineers have been assigned to the team, and all participants in the development of the [project/product name] software are responsible for ensuring that their efforts are in compliance with the Software Development Policies.

The functions and tasks of V&V for the [project/product name] software are organized under the direction of [corporate title/position]. The [project/product name] V&V organization is composed of software engineers who have not been directly involved in the development of the software being verified and have not established the criteria against which the software is validated. The development and administration of the V&V program is the responsibility of the V&V lead software engineer. This individual is responsible for planning, organizing, monitoring, and controlling the V&V tasks performed for the [project/product name] project. The project V&V organization will comply with the Software Verification and Validation Policies.

All [project/product name] software development and V&V team members are responsible for ensuring that their efforts are in compliance with the SQA procedures described in this plan.
2.2 Responsibilities The following SQA functions will be performed:
• Generate, maintain, and approve this plan
• Assure the proper use of the tools, techniques, methodologies, and records used to aid in the production of quality software
• Assure identification, preparation, coordination, and maintenance of and compliance with software development procedures for control of critical steps affecting product software quality
• Assure compliance of the software project documentation and code with software programming standards and conventions
• Assure the use of software library procedures and controls to handle source code, object code, and related data
• Assure the use of peer reviews in accordance with Software Development Policies
• Ensure the development and update of a software hazards analysis
• Ensure the development and update of a requirements traceability matrix to track flow-down of specific requirements (illustrated in the sketch following this list)
• Assure that software testing produces verifiable results that are traceable to requirements specifications
• Assure the prompt detection, identification, correction, documentation, and reporting of software end-product anomalies and deficiencies
• Coordinate with support groups on matters pertaining to SQA of the [project/product name] software
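As an illustration of the traceability function above (not part of the plan itself), a requirements traceability matrix can be held as a mapping from each SRS requirement to the design elements and validation tests that cover it; requirements lacking a covering test then fall out mechanically. All identifiers in this minimal sketch are hypothetical.

```python
# Minimal requirements traceability matrix (RTM) sketch. Requirement,
# design, and test identifiers are assumed for illustration only.
from typing import Dict, List

rtm: Dict[str, Dict[str, List[str]]] = {
    "SRS-001": {"design": ["SADS-3.1", "SDDS-4.2"], "tests": ["VTIS-010"]},
    "SRS-002": {"design": ["SDDS-4.7"], "tests": []},  # not yet covered by a test
}

def untested_requirements(matrix: Dict[str, Dict[str, List[str]]]) -> List[str]:
    """Return requirement IDs that no validation test currently traces to."""
    return [req for req, links in matrix.items() if not links["tests"]]

print(untested_requirements(rtm))  # ['SRS-002']
```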
2.3 Plan Implementation SQA of the [project/product name] software end product will begin with the development of the [project/product name] Software Requirements Specification (SRS) and end with the delivery of the software end product to the Quality Assurance (QA) group for challenge testing. The major milestones associated with this development process have been defined in the [project/product name] Software Development Plan (SDP). Each milestone is associated with a set of development activities to be performed and documented and represents a point in time when specific SQA activities are applied. The major milestones or phases of software development for the [project/product name] project are as follows:
1. Requirements
2. Architecture design
3. Detailed design
4. Code and test
5. Integrate and test
6. Software validation
The products of each major milestone are reviewed and/or tested prior to any advance to the next phase of software development. Appendix A identifies the specific SQA activities applied at each major milestone of the [project/product name] software project and their relationship to the development activities, documents, and review(s) performed. Appendix B lists the SQA activities that are applied to the [project/product name] software.
3.0 SOFTWARE QUALITY ASSURANCE PROCEDURES
3.1 Activities, Methods, and Tools The Software Development Policies contain the requirements for the development of quality software and the performance of SQA activities. The following procedures and methods governing SQA activities for the [project/product name] software are consistent and in compliance with those policies and will be used to ensure software quality:
• Analysis of specifications to verify that the software requirements are accurately and completely identified
• Review of the hazards analysis to ensure that all hazards are identified
• Design reviews to identify design requirements, detect design deficiencies, and verify adherence to the development policies
• Design and code walk-throughs to verify that all requirements are addressed and that established policies, procedures, and guidelines are being followed
• Review of development progress and compliance with software development policy requirements, software project plans, and software development estimates
• Review of test plans and procedures to ensure that all specified requirements are adequately tested
• Software tools used as necessary throughout the software development process for evaluating, analyzing, and documenting the software activities and products
• Use of software CRA forms to ensure proper resolution and timely implementation, documentation, and close-out
3.2 Use of Development Plans, Procedures, and Tools All planning documents will be reviewed to assure that they are adequate to support the software project, comply with the Software Development Policies, and are consistent with associated plans. The execution of these plans will be monitored throughout the software project, and corrections to the procedures or their implementation will be made as required.
3.3 Use of Configuration Management Software configuration management practices will be used during all phases of software development. The [project/product name] Software Configuration Management Plan (SCMP) details the configuration management practices. Software Anomaly Reports and CRAs will be used to ensure that software changes are properly incorporated and completed.
3.4 Use of the Software Library Software designated for the [project/product name] software library will be entered and maintained in accordance with the procedures defined in the [project/product name] SCMP. As a minimum, configuration management of the library will ensure that (1) the most recent authorized version of materials under each level of configuration control is clearly identified and is the one routinely available from the library; and (2) previous versions of materials under configuration control are clearly identified and controlled to provide an audit trail that permits reconstruction of all changes made to the item.
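The two library guarantees above reduce to (1) exactly one clearly identified current authorized version per controlled item and (2) an append-only history from which every change can be reconstructed. The following minimal sketch illustrates that bookkeeping; the record layout and all names are assumptions for illustration, not requirements of the SCMP.

```python
# Sketch of software-library bookkeeping: one clearly identified current
# version per controlled item, plus an append-only audit trail that lets
# all changes be reconstructed. Names are illustrative, not mandated.
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class LibraryItem:
    name: str
    current_version: str
    history: List[str] = field(default_factory=list)  # audit trail entries

    def release(self, new_version: str, cra_number: str) -> None:
        """Authorize a new version; the prior one stays traceable in history."""
        self.history.append(
            f"{self.current_version} superseded by {new_version} per CRA {cra_number}"
        )
        self.current_version = new_version

library: Dict[str, LibraryItem] = {
    "pump_control": LibraryItem("pump_control", "1.0"),
}
library["pump_control"].release("1.1", "CRA-042")
print(library["pump_control"].current_version)  # 1.1
print(library["pump_control"].history)          # full reconstruction trail
```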
3.5 Design Reviews and Code Walk-Throughs Prior to a design review or code walk-through, the responsible software engineer will make a review package available to other [project/product name] software developers and V&V members. Code walk-throughs will be held at the discretion of the [project/product name] software lead engineer. The [project/product name] SDP details the code walk-through practices, format, and content.
3.6 Corrective Action System Corrective action for discrepancies and deficiencies found during software development and test will be processed through the use of Software Anomaly Reports and CRAs. Processing of Software Anomaly Reports is described in the [project/product name] Software Verification and Validation Plan (SVVP). Processing of the CRAs is described in the [project/product name] SCMP. SQA will assure the following for the Software Anomaly Reports:
• Reports are reviewed and analyzed.
• Appropriate corrective action is taken.
• Trends in the performance of work are analyzed to prevent the development of noncompliant products.
• Corrective measures are reviewed to ensure that problems and discrepancies have been resolved and correctly reflected in the appropriate documents.
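One way to make these assurances auditable, shown here purely as an illustration, is to treat each Software Anomaly Report as a small state machine in which closure is reachable only through analysis, corrective action, and verification. The states and transitions below are assumed for the sketch and are not mandated by the V&V policies.

```python
# Illustrative state machine for a Software Anomaly Report: a report may
# be closed only after its corrective action has been reviewed and
# verified. States and transitions are assumptions for illustration.
OPEN, ANALYZED, CORRECTED, VERIFIED, CLOSED = range(5)

ALLOWED = {
    OPEN: {ANALYZED},       # report reviewed and analyzed
    ANALYZED: {CORRECTED},  # appropriate corrective action taken
    CORRECTED: {VERIFIED},  # corrective measure reviewed against documents
    VERIFIED: {CLOSED},     # resolution reflected in appropriate documents
}

def advance(state: int, new_state: int) -> int:
    """Move a report to new_state, rejecting any out-of-order transition."""
    if new_state not in ALLOWED.get(state, set()):
        raise ValueError(f"illegal transition {state} -> {new_state}")
    return new_state

state = OPEN
for step in (ANALYZED, CORRECTED, VERIFIED, CLOSED):
    state = advance(state, step)  # raises if any step is skipped
```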
3.7 Documentation Review All software documentation prepared during each phase of the [project/product name] software development will be reviewed to assure compliance with the following standards and requirements:
• Adherence to required format and documentation standards defined in the [project/product name] software project plans and software development procedures
• Compliance with Software Development Policies
• Internal consistency
• Understandability
• Traceability to the indicated document(s)
• Consistency with the indicated document(s)
3.8 Testing Test plans, Test Information Sheets (TISs), and test procedures will be reviewed for compliance with the standards and requirements described in the [project/product name] SVVP, SDP, and Software Development Policies. Tests will be conducted in accordance with approved test plans, TISs, and test procedures. Test results will be specified in a test report.
APPENDIX A SOFTWARE QUALITY ASSURANCE MILESTONES

[Matrix table: the development activities and documents listed below (rows) are charted against the software development phases (columns): Project Start-Up, Interface Design, Requirements, Architecture Design, Detailed Design, Code and Test, Integrate and Test, and Software Validation. Each cell is marked D or E per the notes.]

Development activities and documents: Software configuration identification; Software Quality Assurance Plan (SQAP); Software Verification and Validation Plan (SVVP); Interface Design Phase V&V; Software Configuration Status Report (SCSR); Interface Design Specification (IDS); Software configuration audits and reviews; Requirements Phase V&V; Software Configuration Management Plan (SCMP); Software Development Plan (SDP); Software End-product Acceptance Plan (SEAP); Software Requirements Specification (SRS); Requirements Traceability Matrix (RTM); Software requirements review and acceptance; Software design walk-throughs; Architecture Design Phase V&V; Software Architecture Design Specification (SADS); Software Validation Test Plan (SVTP); Software Development Test Plan (SDTP); Software Architecture Design Review and acceptance; Software Detailed Design Phase V&V; Software Development Test Information Sheets (DTISs); Software Detailed Design Specification (SDDS); Software Validation Test Information Sheets (VTISs); Software Detailed Design Review and acceptance; Software code walk-throughs; Code development and testing; Code and Test Phase V&V; Code integration and testing; Integrate and Test Phase V&V; Software Validation Test Procedures (SVTPR); Software validation test conduct; Software Validation Phase verification and validation; Software Configuration Audit Report (SCAR); Software Verification and Validation Report (SVVR).

Notes:
1. D denotes deliverable for the phase.
2. E denotes procedure is in effect for the entire phase.
APPENDIX B SUMMARY OF SOFTWARE QUALITY ASSURANCE ACTIVITIES

1. Review software development SOPs for SQA requirements
2. Review and update SQAP
3. Use of SCRB for CRA processing and close-out
4. Use of Software Anomaly Report for detection, correction, and reporting of anomalies against baseline software
5. Review of SDP, SCMP, and SVVP
6. Review of SRS and support of SRR
7. Ensure development and update of RTM
8. Review software design, SADS, SDDS, and support of SADR and SDDR
9. Use of configuration management procedures
10. Use of software development libraries
11. Use of programming standards, conventions, and guidelines
12. Use of code walk-throughs
13. Review of test plans and procedures for adequacy of coverage
14. Review of development progress
15. Generation, tracking, updating, and maintenance of software metrics
[#] [Enter other project-specific SQA activities here]
GLOSSARY

Anomaly: Anything observed in the documentation or operation of software that deviates from expectations based on previously verified software products or reference documents.
Audit: Independent review for the purpose of assessing compliance with software requirements, specifications, baselines, standards, procedures, instructions, and coding requirements.
Baseline: Specification or product that has been formally reviewed and agreed upon, that thereafter serves as the basis for further development, and that can be changed only through formal change control procedures.
Change control: Process by which a change is proposed, evaluated, approved or rejected, scheduled, and tracked.
Change Request/Approval (CRA): Form used to document changes to a baseline.
Code: Loosely, one or more computer programs or part of a computer program.
Code and Test: Phase of the software development life cycle during which a software end product is created from design documentation and tested.
Completeness: Those attributes of the software or documentation that provide full implementation of the functions required.
Component: Unit of code that performs a specific task, or a group of logically related code units that perform a specific task or set of tasks.
Computer program: Sequence of instructions suitable for processing by a computer. Processing may include the use of an assembler, a compiler, an interpreter, or a translator to prepare the program for execution as well as to execute it.
Configuration control: Process of evaluating, approving or disapproving, and coordinating changes to configuration items after formal establishment of their configuration identification.
Configuration identification: Process of designating the configuration items in a system and recording their characteristics.
Configuration item: Aggregation of hardware, software, or any of its discrete parts that satisfies an end-use function.
Configuration management (CM): Process of identifying and defining the configuration items in a system, controlling the release and change of these items throughout the product life cycle, recording and reporting the status of configuration items and change requests, and verifying the completeness and correctness of configuration items.
Consistency: Those attributes of the software or documentation that provide uniformity in the specification, design, and implementation of the product.
Correctness: Extent to which software is free of design defects, coding defects, and faults; meets its specified requirements; and meets user expectations.
Delivery: Transfer of responsibility for an item from one activity to another, as in the delivery of the validated software product to Quality Assurance for certification.
Design phase: Period in the software development cycle during which the designs for architecture, software components, interfaces, and data are created, documented, and verified to satisfy requirements.
Design requirement: Any requirement that impacts or constrains the design of a software system or software system component.
Deviation: Authorization for a future activity, event, or product that departs from standard procedures.
Documentation: Manuals, written procedures or policies, records, or reports that provide information concerning uses, maintenance, or validation of software.
Error: Discrepancy between a computed, observed, or measured value or condition and the true, specified, or theoretically correct value or condition.
Failure: Inability of a system or system component to perform its required function (see fault).
Fault: Defect of a system or system component, caused by a defective, missing, or extraneous instruction or set of related instructions in the definition, specification, design, or implementation of a system, which may lead to a failure.
Hazard: Dangerous state of a device or system, which may lead to death, injury, occupational illness, or damage to or loss of equipment or property.
Hazard analysis: Listing of potential hazards associated with a device or system, along with an estimation of the severity of each hazard and its probability of occurrence.
Implementation phase: Period in the software development cycle during which a software product is created from design documentation and debugged.
Milestone: Scheduled and accountable event that is used to measure progress.
Quality assurance: Planned and systematic pattern of all actions necessary to provide adequate confidence that the item or product conforms to established technical requirements.
Reliability: Ability of an item to perform a required function under stated conditions for a stated period of time.
Requirements phase: Period in the software development cycle during which the requirements, such as functional and performance capabilities for a software product, are defined and documented.
Safety: Provision of a very high degree of freedom, within the constraints of system effectiveness and cost, from those conditions that can cause death, injury, occupational illness, or damage to or loss of equipment or property.
Software: Computer programs, procedures, rules, and associated documentation and data pertaining to the operation of a computer system.
Software configuration management (SCM): Discipline of identifying the configuration of a software system at discrete points in time for the purpose of systematically controlling changes to this configuration and maintaining the integrity and traceability of this configuration throughout the development process.
Software Configuration Management Plan (SCMP): Project-specific plan that specifies the methods and planning employed to implement software configuration management activities.
Software development life cycle: Period that starts with the development of a software product and ends when the product is validated and delivered for QA certification. This life cycle includes a requirements phase, design phase, implementation phase, and software validation phase.
Software Development Plan (SDP): Project-specific plan that identifies and describes the procedures employed to implement the management activities that coordinate schedules, control resources, initiate actions, and monitor progress of the software development effort.
Software end products: Computer programs, software documentation, and databases produced by a software development project.
Software library: Controlled collection of software and related documentation designed to aid in software development, use, or maintenance.
Software project: Planned and authorized undertaking, of specified scope and duration, that results in the expenditure of resources toward the development of a product that is primarily one or more computer programs.
Software quality: Totality of features and characteristics of a software product that bear on its ability to satisfy given needs.
Software Requirements Specification (SRS): Project-specific document that provides a controlled statement of the functional, performance, and external interface requirements for the software end products.
Software tool: Computer program used to help develop, test, analyze, or maintain another computer program or its documentation.
Software Validation Phase: Period in the software development life cycle in which the components of a software product are evaluated and integrated and the entire software product is evaluated to determine whether requirements have been satisfied.
Software Verification and Validation Plan (SVVP): Project-specific plan that describes the project's unique verification and validation organization, activities, schedule, inputs and outputs, and any deviations from the software policies required for effective management of verification and validation tasks.
Source code: Original software expressed in human-readable form (programming language), which must be translated into machine-readable form before it can be executed by the computer.
Test Information Sheet (TIS): Document that defines the objectives, approach, and requirements for a specific test.
Validation: Process of evaluating software at the end of the software development process to ensure compliance with software requirements.
Verification: Process of determining whether the products of a given phase of the software development cycle fulfill the requirements established during the previous phase.
Walk-through: Review in which the designer or programmer leads members of the review team through a segment of design or code, and the reviewers ask questions and submit comments about technique, style, possible errors, violation of development standards, and other problems.
[Project/Product Name] SVTP SOFTWARE VALIDATION TEST PLAN
Written by:
[Name/Title/Position]
Date
Reviewed by:
[Name/Title/Position]
Date
Approved by:
[Name/Title/Position]
Date
Document Number [aaa]-SVTP-[#.#]    Revision [#.#]    Page 1 of [#]

REVISION HISTORY
Revision    Description    Date
[##.##]    [Revision description]    [mm/dd/yy]
CONTENTS

1.0 INTRODUCTION 4
2.0 TEST OVERVIEW 6
3.0 TEST REQUIREMENTS 11
4.0 TEST REPORTING 13
5.0 TEST ADMINISTRATION PROCEDURES 15
APPENDIX A Software Validation Criteria 16
APPENDIX B-1 Software Validation Functional Tests 17
APPENDIX B-2 Software Validation Robustness Tests 18
APPENDIX B-3 Software Validation Stress Tests 19
APPENDIX B-4 Software Validation Safety Tests 20
APPENDIX C-1 Requirements for Software Validation Testing Support Hardware 21
APPENDIX C-2 Requirements for Software Validation Testing Support Software 22
APPENDIX D-1 Software Validation Testing Risks 23
APPENDIX D-2 Software Validation Testing Contingencies 24
APPENDIX E Software Validation Test Information Sheet (VTIS) 25
APPENDIX F Software Validation Test Log Sheet 26
APPENDIX G-1 Task Preparation List for Software Validation Testing 27
APPENDIX G-2 Software Validation Task Interdependencies 28
GLOSSARY 29
1.0 INTRODUCTION
1.1 Purpose This plan identifies and describes the software validation testing of the software developed for the [project/product name] project.
1.2 Scope This plan describes the tests that are executed during software validation of the [project/product name] project. These tests verify that the [project/product name] software satisfies the software requirements of the [project/product name] Product Requirements Document (PRD). This plan also describes the test organization, schedule, reporting, and administration. The test program described in this plan will be performed by the [project/product name] V&V software engineers during the Software Validation Phase of the [project/product name] software development.
1.3 Overview The qualification of the [project/product name] software for system Design Verification Testing is established by the successful completion of the testing described in the [project/product name] Software Development Test Plan (SDTP) and this SVTP. Review and analysis by the V&V engineers of the recorded results of the testing described in the SDTP will be a prerequisite to initiating the tests described herein. The software validation testing described in this plan includes methods for the verification of the following:
• Correct implementation of software requirements
• Software system capabilities
• Throughput and timing requirements
• Safety design requirements
• Correct interface to the system environment
1.4 Referenced Documents The following documents of the exact issue shown form a part of this specification to the extent specified herein. In the event of conflict between the documents referenced herein and the content of this specification, the content of this specification shall be considered a superseding requirement.
1.4.1 Project Specifications

• [project/product name] Interface Design Specification, Document Number [aaa]-IDS-[#.#], Revision [#.#], dated [date]
• [project/product name] Product Objectives Document, Document Number [aaa]-POD-[#.#], Revision [#.#], dated [date]
• [project/product name] Product Requirements Document, Document Number [aaa]-PRD-[#.#], Revision [#.#], dated [date]
• [project/product name] Software Configuration Management Plan, Document Number [aaa]-CMP-[#.#], Revision [#.#], dated [date]
• [project/product name] Software Development Plan, Document Number [aaa]-SDP-[#.#], Revision [#.#], dated [date]
• [project/product name] Software Development Test Plan, Document Number [aaa]-DTP-[#.#], Revision [#.#], dated [date]
• [project/product name] Software End-product Acceptance Plan, Document Number [aaa]-EAP-[#.#], Revision [#.#], dated [date]
• [project/product name] Software Quality Assurance Plan, Document Number [aaa]-QAP-[#.#], Revision [#.#], dated [date]
• [project/product name] Software Requirements Specification, Document Number [aaa]-SRS-[#.#], Revision [#.#], dated [date]
• [project/product name] Software Verification and Validation Plan, Document Number [aaa]-VVP-[#.#], Revision [#.#], dated [date]
1.4.2 Procedures and Guidelines
• Product Development Safety Design Guidelines, Revision [#.#], dated [date]
• Product Development User Interface Design Guidelines, Revision [#.#], dated [date]
• Software Engineering Configuration Management Guidelines, Revision [#.#], dated [date]
• Software Engineering Software Development Guidelines, Revision [#.#], dated [date]
• Software Engineering Software Configuration Management Policies, Revision [#.#], dated [date]
• Software Engineering Software Development Policies, Revision [#.#], dated [date]
• Software Engineering Verification and Validation Guidelines, Revision [#.#], dated [date]
• Software Engineering Verification and Validation Policies, Revision [#.#], dated [date]
2.0 TEST OVERVIEW
2.1 Organization The organizational structure of the [project/product name] project is described in the [project/product name] Software Development Plan (SDP). A software test program is organized and conducted independently by both the software developers and the V&V software engineers. Software component testing is conducted by the software developers in accordance with the [project/product name] SDTP to verify the processing and input and output requirements of each software component defined in the [project/product name] Software Detailed Design Specification (SDDS). Software validation testing is conducted by the V&V software engineers in accordance with this plan to verify that the fully integrated software end products satisfy the requirements in the [project/product name] Software Requirements Specification (SRS). This layered approach to testing provides assurance that software errors introduced in code development are detected and that the software end products are verified by personnel independent of the software developers.

The V&V engineers will make recommendations to the [project/product name] software lead engineer and the [corporate title/position] based on testing and analysis of the software. The V&V engineers will receive functional direction from and report to the [corporate title/position] in order to accomplish an independent assessment of software quality.
2.2 Master Schedule

2.2.1 Testing Schedule
The [project/product name] testing schedule is described in the [project/product name] SDP and [project/product name] Software Verification and Validation Plan (SVVP). The SDP and SVVP describe the software development phases of the [project/product name] project. Software validation testing will be performed during the Software Validation Phase.
The development activities during the implementation phases and Software Validation Phase are not necessarily performed sequentially. During these phases of software development, software components are developed and tested independently. The individually tested software components are integrated incrementally and tested to ensure the integrity of the software interface. This process is repeated until multiple software components have been integrated into functional capabilities and/or tasks. Multiple components that have successfully completed the testing defined in the [project/product name] SDTP will be placed in the [project/product name] software library. Software validation testing will be exercised only against software components that have successfully completed testing as defined in the SDTP and have been baselined in accordance with the [project/product name] Software Configuration Management Plan (SCMP). Software validation testing will begin when baselined [project/product name] software is available from the software library and will continue until the end of the Software Validation Phase.
2.2.2 Test Documentation Schedule
The test documentation schedule is described in the [project/product name] SDP and SVVP, and this SVTP will be completed and approved prior to the [project/product name] Software Architecture Design Review (SADR). Validation Test Information Sheets (VTISs) will be generated for each test defined in this plan and will be prepared prior to the Software Detailed Design Review (SDDR). The [project/product name] Software Validation Test Procedures (SVTPR) contain the detailed procedures for the testing described in this plan and will be generated and approved by the [corporate title/position] prior to test execution. Software validation testing will be performed only with the approved SVTPR. At the conclusion of testing, a record of the software validation testing performed for the [project/product name] software will be provided in the [project/product name] Software Verification and Validation Report (SVVR).
2.3 Responsibilities

2.3.1 Software Development Engineers
The [project/product name] software development engineers will be responsible for the generation of the [project/product name] software and for the following activities related to software validation testing:
• Generating the SDTP and associated DTISs
• Software component testing in accordance with the SDTP
• Complying with the procedures for software configuration control as specified in the SCMP
• Implementing the corrective action required by an approved Software Anomaly Report
• Implementing the corrective action required by an approved CRA form
• Supporting the conduct of software validation testing as required
2.3.2 Software V&V Engineers
The functional responsibilities of the V&V software engineers during software validation testing are segregated into the following functional positions:
• Test Administrator
• Test Supervisor
• Test Team
• Data Analyst
These positions do not necessarily refer to organizational positions, and the responsibilities of some positions may be performed by the same person.

2.3.2.1 Test Administrator. The development and administration of the software validation testing program described in this plan is the responsibility of the Test Administrator. The Test Administrator is responsible for planning, organizing, monitoring, and controlling the test activities for the [project/product name] project. This includes ensuring that only the latest controlled version of [project/product name] software from the software library is used in the validation test.

2.3.2.2 Test Supervisor. The conduct of testing performed in accordance with this plan and the detailed test procedures is the responsibility of the Test Supervisor. The Test Supervisor is responsible for the following:
1. Preparing the SVTPR
2. Preparing and maintaining the VTISs
3. Reviewing the results of the software component testing
4. Scheduling support personnel as required during testing
5. Monitoring the conduct of the software validation testing
6. Preparing and maintaining the Software Validation Test Log
7. Generating test reports

2.3.2.3 Test Team. The execution of testing during the software validation testing is the responsibility of the Test Team. The Test Team is responsible for the following:
1. Executing approved test procedures
2. Preparing Software Anomaly Reports on problems observed during test conduct
3. Updating the test log daily with accomplishments, recommendations, and/or results as appropriate
4. Operating equipment during testing
5. Advising the Test Supervisor of recommended changes in test conduct, data extraction, and emulation operation and/or debug activities

2.3.2.4 Data Analyst. Data collection and analysis performed in accordance with this plan and detailed test procedures is the responsibility of the Data Analyst. The Data Analyst is responsible for the following:
1. Data requirements definition
2. Data reduction
3. Data analysis
2.4 Tools, Techniques, and Methodologies Testing performed in accordance with this plan will verify that the fully integrated software end products satisfy the requirements of the [project/product name] SRS. At the start of software validation testing, the latest controlled version of software will be checked out of the [project/product name] software library, and the executable version of the software will be built by V&V. Successful completion of this task is required before further testing of the software may begin.
2.4.1 Test Categories
The test categories for software validation testing will include the following types of testing:

• Functional Testing. To verify that all the functional requirements have been satisfied. This test category is termed success-oriented, because the tests are expected to produce successful results.
• Robustness Testing. To evaluate software performance given unexpected inputs. This test category is termed failure-oriented, because the test inputs are designed to cause the product to fail given foreseeable and reasonably unforeseeable misuse of the product.
• Stress Testing. To evaluate software in a stress condition in which the amount or rate of data exceeds the amount expected.
• Safety Testing. To verify that the software performs in a safe manner and that a complete assessment of the safety design is accomplished.
• Regression Testing. Performed whenever a software change occurs, to detect faults introduced during modification, verify that modifications have not caused unintended adverse effects, and verify that the software still meets its specified requirements.
2.4.2 Test Verification Methods
The methods of test verification for software validation testing will include the following:

• Inspection. Visual examination of an item.
• Analysis. Evaluation of theoretical or empirical data.
• Demonstration. Operational movement or adjustment of an item.
• Test. Operation of an item and the recording and evaluation of quantitative data.
2.4.3 Validation Criteria
The detailed pass or fail criteria for the [project/product name] software are contained in the [project/product name] SVTPR. The validation criteria for each software requirement of the SRS are defined by one or more test categories and test verification methods, as shown in Appendix A.
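Where the Appendix A criteria are also tracked electronically, the matrix lends itself to a simple keyed record. The sketch below is illustrative only; the names, fields, and sample entry are hypothetical and form no part of this plan.

# Illustrative sketch only: one Appendix A row as a record that binds
# an SRS paragraph to Section 2.4.1 categories and 2.4.2 methods.
from dataclasses import dataclass

TEST_CATEGORIES = {"Functional", "Robustness", "Stress", "Safety", "Regression"}
VERIFICATION_METHODS = {"Inspection", "Analysis", "Demonstration", "Test"}

@dataclass
class ValidationCriterion:
    srs_paragraph: str     # SRS paragraph number, e.g., "4.2.1" (hypothetical)
    title: str             # SRS paragraph title
    categories: frozenset  # one or more Section 2.4.1 test categories
    methods: frozenset     # one or more Section 2.4.2 verification methods

    def __post_init__(self):
        # Every requirement must carry at least one recognized category
        # and one recognized verification method (Section 2.4.3).
        assert self.categories and self.categories <= TEST_CATEGORIES
        assert self.methods and self.methods <= VERIFICATION_METHODS

# Hypothetical entry mirroring one Appendix A row:
row = ValidationCriterion("4.2.1", "Alarm limit checking",
                          frozenset({"Functional", "Safety"}),
                          frozenset({"Test"}))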
3.0 TEST REQUIREMENTS
3.1 Functional Testing

The objectives of Functional Testing are as follows:

• To define and detect specified classes of faults by executing each functional capability of the [project/product name] software against fault-revealing test data
• To detect failures that correspond to wrong sequences of function invocation or to wrong transmission of data between functions

The [project/product name] software functional tests are shown in Appendix B-1.
3.2 Robustness Testing

The objective of robustness testing is to determine the performance of the [project/product name] software given foreseeable and reasonably unforeseeable misuse of the [project/product name] product. This testing will measure the compliance of the software with the requirements of the SRS and the design of the SDDS given unexpected and/or invalid inputs. The [project/product name] software robustness tests are shown in Appendix B-2.
3.3 Stress Testing

The objective of stress testing is to verify the compliance of the software with the SRS under operating conditions in which the amount or rate of data exceeds the amount expected. The [project/product name] software stress tests are shown in Appendix B-3.
3.4 Safety Testing

The objective of safety testing is to verify that the software performs in accordance with the safety requirements of the SRS and that a complete assessment of the safety design is accomplished. The [project/product name] software safety tests are shown in Appendix B-4.
3.5 Regression Testing

The objectives of regression testing are as follows:

• Detect faults introduced during modification
• Verify that modifications have not caused unintended adverse effects
• Verify that the software still meets its specified requirements
Tests previously performed during software validation testing will be repeated as required in order to ensure the integrity of the [project/product name] software revisions. The degree of retest that is required will be a function of the impact of software modification on the safety and functionality of the software. All regression testing performed will be documented prior to completion of software validation testing.
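The selection of the regression set can be made repeatable by deriving it from the requirements-to-test traces. The following sketch assumes a hypothetical trace mapping and is illustrative only; it is not a procedure required by this plan.

# Illustrative sketch only: selecting regression tests from a trace map.
# 'rtm' maps each requirement identifier to the tests that validate it.
def select_regression_tests(rtm, modified_requirements, safety_requirements):
    """Return the sorted list of tests to repeat after a modification.

    The retest set widens to all safety-traced tests when any
    safety-related requirement is affected (see Section 3.4)."""
    selected = set()
    for req in modified_requirements:
        selected.update(rtm.get(req, ()))
    if any(req in safety_requirements for req in modified_requirements):
        for req in safety_requirements:
            selected.update(rtm.get(req, ()))
    return sorted(selected)

# Hypothetical usage:
rtm = {"SRS-4.1": ["FU-1"], "SRS-4.9": ["SA-1", "FU-3"]}
print(select_regression_tests(rtm, ["SRS-4.9"], {"SRS-4.9"}))  # ['FU-3', 'SA-1']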
3.6 Resource Requirements

3.6.1 Personnel
The [project/product name] V&V lead software engineer will function as the [insert titles from Section 2.3.2 here]. Each of the [insert number of V&V engineers here] V&V software engineers will function as the [insert titles from Section 2.3.2 here].
3.6.2 Hardware
The hardware required to support the development of the [project/product name] SVTPR, the actual conduct of testing, and data analysis is listed in Appendix C-1. Software validation testing will require the final hardware configuration of the [project/product name] product(s).
3.6.3 Support Software
The support software required for the development and maintenance of test documentation, test preparation, test conduct, and data analysis is listed in Appendix C-2.
3.7 Risks and Contingencies

Risk management for software validation testing will be performed by the V&V group at each [project/product name] software development milestone in order to identify areas of uncertainty that are significant sources of risk and to formulate a cost-effective strategy for resolving the sources of risk. Updates to this plan and to the SVVP, if necessary, will be generated to document any modification required as a result of this risk analysis. The risks and contingencies associated with the [project/product name] software validation testing are listed in Appendices D-1 and D-2, respectively.
4.0 TEST REPORTING
4.1 Test Recording

Test recording will be accomplished through the development and maintenance of the following:

• SVTP VTISs
• Software Validation Test Log
• Software Validation Test Report
4.1.1 Validation Test Information Sheet
A VTIS will be generated for each test described in this plan; an example of the VTIS is shown in Appendix E. Each VTIS will describe the test purpose and success criteria, test approach, and test environment. This information is used in conjunction with this plan in order to develop the SVTPR. During testing, the VTIS for each test is updated to document the test conductor, test results, any associated comments, and test completion signatures. The VTISs will be retained by the [project/product name] V&V group in a central location, for review by the [project/product name] software developers, software lead engineer, and the [corporate title/position].
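Where VTISs are maintained electronically rather than on paper, the Appendix E fields map naturally onto a record type. The sketch below is a hypothetical illustration, not a required implementation; the field names are assumptions.

# Illustrative sketch only: a VTIS record mirroring the Appendix E fields.
from dataclasses import dataclass, field

@dataclass
class VTIS:
    test_category: str        # Section 2.4.1 category
    test_number: str          # e.g., "FU-1" (hypothetical numbering)
    requirement: str          # requirement title or text
    requirement_number: str   # SRS paragraph number
    objectives_and_criteria: str = ""
    test_approach: str = ""
    results: str = ""
    comments: str = ""
    signatures: dict = field(default_factory=dict)  # name -> date

    def is_complete(self):
        # A sheet closes only when results are recorded and both
        # Appendix E signatures are present.
        required = {"Test Conductor", "V&V Lead Engineer"}
        return bool(self.results) and required <= set(self.signatures)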
4.1.2 Test Log
The [project/product name] Software Validation Test Log will be used to record all the chronological events relevant to software validation testing. A sample [project/product name] Software Validation Test Log sheet is shown in Appendix F.
4.1.3 Test Report
The Software Validation Test Report will be generated at the conclusion of all testing and will be a record of the software validation testing performed for the [project/product name] software. The Software Validation Test Report may be in a format appropriate for technical disclosure and will consist of the following:

• Summary of the software validation testing
• Software Validation Test Log
• Validation Test Information Sheets
• Summary of the analysis and evaluation of test results
• Recommendations
4.2 Anomaly Reporting

All software deficiencies discovered during software validation testing will be reported on a Software Anomaly Report. A description of the information required on the anomaly report is provided in the [project/product name] SVVP, along with instructions for completing the anomaly
report. A Software Anomaly Report generated during software validation testing will be distributed to the [project/product name] software lead engineer for review and initiation of appropriate corrective action.
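A minimal electronic analogue of this reporting flow is sketched below. The record fields and routing shown are hypothetical; the SVVP remains the authoritative definition of the anomaly report content.

# Illustrative sketch only: logging a Software Anomaly Report and
# routing it to the software lead engineer. Fields are hypothetical.
import datetime

def file_anomaly_report(log, description, severity, observed_in):
    report = {
        "number": len(log) + 1,
        "date": datetime.date.today().isoformat(),
        "description": description,
        "severity": severity,        # e.g., "critical", "high", "low"
        "observed_in": observed_in,  # test number from this plan
        "assigned_to": "software lead engineer",
        "status": "open",
    }
    log.append(report)
    return report

anomaly_log = []
file_anomaly_report(anomaly_log, "Display blanks on power-up", "high", "FU-2")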
5.0 TEST ADMINISTRATION PROCEDURES
5.1 Tasks

The set of tasks necessary to prepare for and perform software validation testing is given in Appendix G-1.
5.2 Intertask Dependencies

The V&V intertask dependencies are listed in Appendix G-2.
5.3 Special Skills

Software validation testing will not require special skills.
5.4 Test Administration

Software validation testing is an iterative process. Anomalies discovered during testing may require another phase of validation testing to ensure that the anomalies are fixed and that no adverse side effects have resulted from the fixes. The degree of retest that may be required will be a function of the impact of the software corrections on the safety and functionality of the software.
APPENDIX A  SOFTWARE VALIDATION CRITERIA

SRS PARAGRAPH IDENTIFICATION | TEST IDENTIFICATION
Number | Title | Category | Methods
[Enter SRS paragraph number here] | [Enter SRS paragraph title here] | [Enter Section 2.4.1 test category type here] | [Enter Section 2.4.2 test method type here]
[Enter SRS paragraph number here] | [Enter SRS paragraph title here] | [Enter Section 2.4.1 test category type here] | [Enter Section 2.4.2 test method type here]
[Enter SRS paragraph number here] | [Enter SRS paragraph title here] | [Enter Section 2.4.1 test category type here] | [Enter Section 2.4.2 test method type here]
[Enter SRS paragraph number here] | [Enter SRS paragraph title here] | [Enter Section 2.4.1 test category type here] | [Enter Section 2.4.2 test method type here]
[Enter SRS paragraph number here] | [Enter SRS paragraph title here] | [Enter Section 2.4.1 test category type here] | [Enter Section 2.4.2 test method type here]
APPENDIX B-1  SOFTWARE VALIDATION FUNCTIONAL TESTS

FUNCTIONAL TEST REQUIREMENTS
Number | Title | Description
FU-1 | [Enter title of test to be conducted] | [Enter a functional test description for each operational mode, state, or function. Include initializations, defaults, errors, and displays.]
FU-2 | [Enter title of test to be conducted] | [Enter a functional test description for each operational mode, state, or function. Include initializations, defaults, errors, and displays.]
FU-3 | [Enter title of test to be conducted] | [Enter a functional test description for each operational mode, state, or function. Include initializations, defaults, errors, and displays.]
FU-4 | [Enter title of test to be conducted] | [Enter a functional test description for each operational mode, state, or function. Include initializations, defaults, errors, and displays.]
FU-5 | [Enter title of test to be conducted] | [Enter a functional test description for each operational mode, state, or function. Include initializations, defaults, errors, and displays.]
FU-[sequential number] | [Enter title of test to be conducted] | [Enter a functional test description for each operational mode, state, or function. Include initializations, defaults, errors, and displays.]
APPENDIX B-2  SOFTWARE VALIDATION ROBUSTNESS TESTS

ROBUSTNESS TEST REQUIREMENTS
Number | Title | Description
RO-1 | [Enter title of test to be conducted] | [Enter a robustness test description for timing, interfaces, communications, user interface, and operational mode, state, or function.]
RO-2 | [Enter title of test to be conducted] | [Enter a robustness test description for timing, interfaces, communications, user interface, and operational mode, state, or function.]
RO-3 | [Enter title of test to be conducted] | [Enter a robustness test description for timing, interfaces, communications, user interface, and operational mode, state, or function.]
RO-4 | [Enter title of test to be conducted] | [Enter a robustness test description for timing, interfaces, communications, user interface, and operational mode, state, or function.]
RO-5 | [Enter title of test to be conducted] | [Enter a robustness test description for timing, interfaces, communications, user interface, and operational mode, state, or function.]
RO-[sequential number] | [Enter title of test to be conducted] | [Enter a robustness test description for timing, interfaces, communications, user interface, and operational mode, state, or function.]
APPENDIX B-3  SOFTWARE VALIDATION STRESS TESTS

STRESS TEST REQUIREMENTS
Number | Title | Description
ST-1 | [Enter title of test to be conducted] | [Enter a stress test description for timing, interfaces, communications, and extended operations of the user interface and operational mode, state, or function.]
ST-2 | [Enter title of test to be conducted] | [Enter a stress test description for timing, interfaces, communications, and extended operations of the user interface and operational mode, state, or function.]
ST-3 | [Enter title of test to be conducted] | [Enter a stress test description for timing, interfaces, communications, and extended operations of the user interface and operational mode, state, or function.]
ST-4 | [Enter title of test to be conducted] | [Enter a stress test description for timing, interfaces, communications, and extended operations of the user interface and operational mode, state, or function.]
ST-5 | [Enter title of test to be conducted] | [Enter a stress test description for timing, interfaces, communications, and extended operations of the user interface and operational mode, state, or function.]
ST-[sequential number] | [Enter title of test to be conducted] | [Enter a stress test description for timing, interfaces, communications, and extended operations of the user interface and operational mode, state, or function.]
APPENDIX B-4  SOFTWARE VALIDATION SAFETY TESTS

SAFETY TEST REQUIREMENTS
Number | Title | Description
SA-1 | [Enter title of test to be conducted] | [Enter a safety test description for each critical function, parameter, or fail-safe design function.]
SA-2 | [Enter title of test to be conducted] | [Enter a safety test description for each critical function, parameter, or fail-safe design function.]
SA-3 | [Enter title of test to be conducted] | [Enter a safety test description for each critical function, parameter, or fail-safe design function.]
SA-4 | [Enter title of test to be conducted] | [Enter a safety test description for each critical function, parameter, or fail-safe design function.]
SA-5 | [Enter title of test to be conducted] | [Enter a safety test description for each critical function, parameter, or fail-safe design function.]
SA-[sequential number] | [Enter title of test to be conducted] | [Enter a safety test description for each critical function, parameter, or fail-safe design function.]
APPENDIX C-1  REQUIREMENTS FOR SOFTWARE VALIDATION TESTING SUPPORT HARDWARE

VALIDATION TESTING HARDWARE REQUIREMENTS
Quantity | Description
2 | HP 64700 Emulator
2 | Sun Workstations
1 | DATAIO UNISITE PROM Programmer
[Enter quantity required] | [Enter description of hardware required]
APPENDIX C-2  REQUIREMENTS FOR SOFTWARE VALIDATION TESTING SUPPORT SOFTWARE

VALIDATION TESTING SUPPORT SOFTWARE REQUIREMENTS
Quantity | Description
1 | Software CASE tool
1 | Complexity analysis tool
1 | Cross and native compiler
1 | Assembler
1 | Cross and native debugger
1 | Module librarian
1 | Source code control system
1 | Text editor or publishing system
1 | Graphical or drawing system
1 | Database application system
[Enter quantity required] | [Enter description of software required]
APPENDIX D-1  SOFTWARE VALIDATION TESTING RISKS

VALIDATION TESTING RISKS
Item | Description
[Enter sequential number] | [Enter task, activity, or technical description of risk.]
[Enter sequential number] | [Enter task, activity, or technical description of risk.]
[Enter sequential number] | [Enter task, activity, or technical description of risk.]
[Enter sequential number] | [Enter task, activity, or technical description of risk.]
[Enter sequential number] | [Enter task, activity, or technical description of risk.]
APPENDIX D-2  SOFTWARE VALIDATION TESTING CONTINGENCIES

VALIDATION TESTING CONTINGENCIES
Item | Description
[Enter sequential number] | [Enter task, activity, or technical description of risk contingency.]
[Enter sequential number] | [Enter task, activity, or technical description of risk contingency.]
[Enter sequential number] | [Enter task, activity, or technical description of risk contingency.]
[Enter sequential number] | [Enter task, activity, or technical description of risk contingency.]
[Enter sequential number] | [Enter task, activity, or technical description of risk contingency.]
APPENDIX E  SOFTWARE VALIDATION TEST INFORMATION SHEET (VTIS)

SOFTWARE VALIDATION TEST INFORMATION SHEET
Test Category: __________    Test Number: __________
Requirement: __________    Requirement Number: __________
1. Objectives and success criteria
2. Test approach
3. Test instrumentation
4. Test duration
5. Data collection, reductions, and analysis requirements
6. Comments
7. Results
8. Signatures:
   Test Conductor: ____________________  Date: __________
   V&V Lead Engineer: ____________________  Date: __________
APPENDIX F  SOFTWARE VALIDATION TEST LOG SHEET

SOFTWARE VALIDATION TEST LOG
Location: __________    Software Project: __________    Date: __________
Time | Test Number | Entry | References | Engineer
Page ____ of ____
APPENDIX G-1  TASK PREPARATION LIST FOR SOFTWARE VALIDATION TESTING

VALIDATION TESTING PREREQUISITES
Item | Description
1 | Preparation of required documents:
1.a | Product or system requirements document
1.b | Interface Design Specification (IDS)
1.c | Software Development Plan (SDP)
1.d | Software Quality Assurance Plan (SQAP)
1.e | Software Configuration Management Plan (SCMP)
1.f | Software Verification and Validation Plan (SVVP)
1.g | Software Requirements Specification (SRS)
1.h | Software Architecture Design Specification (SADS)
1.i | Software Detailed Design Specification (SDDS)
1.j | Software Development Test Plan (SDTP)
1.k | Software Validation Test Plan (SVTP)
1.l | Development TISs
1.m | Validation TISs
1.n | Software Validation Test Procedures (SVTPR)
[Enter sequential number] | [Enter description of prerequisite for validation testing]
[Enter sequential number] | [Enter description of prerequisite for validation testing]
[Enter sequential number] | [Enter description of prerequisite for validation testing]
[Enter sequential number] | [Enter description of prerequisite for validation testing]
APPENDIX G-2  SOFTWARE VALIDATION TASK INTERDEPENDENCIES

VALIDATION TESTING TASK INTERDEPENDENCIES
Item | Description
1 | SDDS will be approved prior to beginning development of SVTPR and VTISs
2 | SVTP will be approved prior to completion of Architecture Design Phase
3 | SVTPR and VTISs will be approved prior to start of validation testing
[Enter sequential number] | [Enter description of intertask dependency for validation testing]
[Enter sequential number] | [Enter description of intertask dependency for validation testing]
GLOSSARY

Anomaly: Anything observed in the documentation or operation of software that deviates from expectations based on previously verified software products or reference documents.
Baseline: Specification or product, formally reviewed and agreed upon, that thereafter serves as the basis for further development and that can be changed only through formal change control procedures.
Change control: Process by which a change is proposed, evaluated, approved or rejected, scheduled, and tracked.
Code: Loosely, one or more computer programs or part of a computer program.
Completeness: Those attributes of the software or documentation that provide full implementation of the functions required.
Component: Unit of code that performs a specific task or a group of logically related code units that perform a specific task or set of tasks.
Component testing: Testing conducted to verify the implementation of the design for one software component or collection of software components.
Computer program: Sequence of instructions suitable for processing by a computer. Processing may include the use of an assembler, a compiler, an interpreter, or a translator to prepare the program for execution as well as to execute it.
Configuration item: Aggregation of hardware or software or any of its discrete parts that satisfies an end-use function.
Configuration management (CM): Process of identifying and defining the configuration items in a system, controlling the release and change of these items throughout the product life cycle, recording and reporting the status of configuration items and change requests, and verifying the completeness and correctness of configuration items.
Design phase: Period in the software development cycle during which the designs for architecture, software components, interfaces, and data are created, documented, and verified to satisfy requirements.
Design requirement: Any requirement that impacts or constrains the design of a software system or software system component.
Documentation: Manuals, written procedures or policies, records, or reports that provide information concerning uses, maintenance, or validation of software.
Error: Discrepancy between a computed, observed, or measured value or condition and the true, specified, or theoretically correct value or condition.
Evaluation: Process of determining whether an item or activity meets specified criteria.
Failure: Inability of a system or system component to perform its required function (see fault).
Fault: Defect, which may lead to a failure, of a system or system component that is caused by a defective, missing, or extraneous instruction or set of related instructions in the definition, specification, design, or implementation of a system.
Implementation phase: Period in the software development life cycle during which a software product is created from design documentation and debugged.
Inspection: Formal evaluation technique in which software requirements, design, or code are examined in detail by a person or group other than the author in order to detect faults, violations of development standards, or other problems.
Milestone: Scheduled and accountable event that is used to measure progress.
Regression testing: Selective retesting to detect faults introduced during modification, to verify that modifications have not caused unintended adverse effects, and to verify that a modified system or system component still meets its specified requirements.
Requirements phase: Period in the software development cycle during which the requirements, such as functional and performance capabilities for a software product, are defined and documented.
Robustness: Extent to which software can continue to operate correctly despite the introduction of invalid inputs.
Safety: Provision of a very high degree of freedom, within the constraints of system effectiveness and cost, from those conditions that can cause death, injury, occupational illness, or damage to or loss of equipment or property.
Software: Computer programs, procedures, rules, and associated documentation and data pertaining to the operation of a computer system.
Software Architecture Design Review (SADR): Software review conducted for the purpose of (1) reviewing the project's SADS, associated plans, and technical issues; (2) resolving identified issues; and (3) obtaining commitment to proceed into the detailed design phase.
Software Configuration Management Plan (SCMP): Project-specific plan that specifies the methods and planning employed to implement software configuration management activities.
Software Detailed Design Review (SDDR): Software review conducted for the purpose of (1) reviewing the project's SDDS, associated plans, and critical issues; (2) resolving identified issues; (3) obtaining commitment to proceed into the code and test phase; and (4) obtaining commitment to a test program supporting product acceptance.
Software Detailed Design Specification (SDDS): Project-specific document that constitutes an update to and an expansion of the design baseline established at the Architecture Design Review and includes a description of the overall program operation and control and the use of common data. The detailed design is described through the lowest component level of software organization and the lowest logical level of database organization.
Software development life cycle: Period that starts with the development of a software product and ends when the product is validated and delivered for certification. This life cycle includes a requirements phase, design phase, implementation phase, and software validation phase.
Software Development Plan (SDP): Project-specific plan that identifies and describes the procedures employed to implement the management activities that coordinate schedules, control resources, initiate actions, and monitor progress of the software development effort.
Software Development Test Plan (SDTP): Project-specific plan that defines the scope of software testing that must be successfully completed for each software component.
Software end products: Computer programs, software documentation, and databases produced by a software development project.
Software library: Controlled collection of software and related documentation designed to aid in software development, use, or maintenance.
Software quality: Totality of features and characteristics of a software product that bear on its ability to satisfy given needs.
Software Requirements Specification (SRS): Project-specific document that provides a controlled statement of the functional, performance, and external interface requirements for the software end products.
Software Validation Phase: Period in the software development life cycle in which the components of a software product are evaluated and integrated and the entire software product is then evaluated to determine whether requirements have been satisfied.
Software Validation Test Plan (SVTP): Project-specific plan that describes the software testing required to verify that the software product satisfies the specified requirements.
Test Information Sheet (TIS): Document that defines the objectives, approach, and requirements for a specific test.
Validation: Process of evaluating software at the end of the software development process to ensure its compliance with software requirements.
Verification: Process of determining whether or not the products of a given phase of the software development life cycle fulfill the requirements established during the previous phase.
[Project/Product Name] SVVP
SOFTWARE VERIFICATION AND VALIDATION PLAN

Written by: [Name/Title/Position]    Date: __________
Reviewed by: [Name/Title/Position]    Date: __________
Approved by: [Name/Title/Position]    Date: __________

Document Number [aaa]-SVVP-[#.#] | Revision [#.#] | Page 1 of [#]

REVISION HISTORY
Revision | Description | Date
[##.##] | [Revision description] | [mm/dd/yy]
CONTENTS

1.0 INTRODUCTION
2.0 V&V OVERVIEW
3.0 V&V REQUIREMENTS
4.0 V&V REPORTING
5.0 V&V ADMINISTRATION PROCEDURES
APPENDIX A  Schedule of V&V Tasks by Software Development Life Cycle Phase
APPENDIX B  V&V Resource Requirements by Software Development Life Cycle Phase
APPENDIX C  V&V Tasks, Roles, and Responsibilities by Software Development Life Cycle Phase
APPENDIX D-1  Software Anomaly Report
APPENDIX D-2  Instructions for Completing Software Anomaly Report
GLOSSARY
1.0 INTRODUCTION
1.1 Purpose

This plan identifies and describes the program for the verification and validation (V&V) of the software developed for the [project/product name]. The program of V&V defined in this document will be applied throughout all phases of the software development that is defined in the [project/product name] Software Development Plan (SDP).
1.2 Scope

The scope of the [project/product name] V&V is defined to be those tasks, and the information necessary to manage and perform those tasks, that are required in order to ensure the development of quality software for the [project/product name]. The basis for the development of software for the [project/product name] will be the capabilities defined in the system requirements documents. The V&V program described in this plan has been tailored to assure that an appropriate level of V&V is applied to all phases of the software development, while supporting the [project/product name] market strategy and product launch schedule.

This document describes the V&V organization, activities, schedule, and inputs and outputs that are required for an effective [project/product name] V&V program. The scope of participation by associated organizations in the V&V of the [project/product name] software product is also identified. V&V will be defined for each phase of the [project/product name] software development relative to:
• V&V tasks
• Methods and evaluation criteria
• Inputs and outputs
• Schedule
• Resources
• Risks and assumptions
• Roles and responsibilities
1.3 Overview

The verification and validation of software is defined to be the independent assessment and measurement of the correctness, accuracy, consistency, completeness, robustness, and testability of software requirements, design, and implementation. The goals of the [project/product name] V&V program are:

1. Verify that the products of each development phase comply with previous phase requirements and products; address all safety-related requirements for critical components/functions; satisfy the standards, practices, and conventions of the phase; and establish the proper basis for initiating the next software development phase.
2. Validate that the completed software end product complies with established software and system requirements.
3. Document the results of the V&V tasks in support of software and management planning activities.
4. Facilitate the accomplishment of the product quality goals.
1.4 Referenced Documents

The following documents of the exact issue shown form a part of this specification to the extent specified herein. In the event of conflict between the documents referenced herein and the content of this specification, the content of this specification shall be considered a superseding requirement.
1.4.1 Project Specifications

• [project/product name] Product Objectives Document, Document Number [aaa]-POD-[#.#], Revision [#.#], dated [date]
• [project/product name] Product Requirements Document, Document Number [aaa]-PRD-[#.#], Revision [#.#], dated [date]
• [project/product name] Software Configuration Management Plan, Document Number [aaa]-CMP-[#.#], Revision [#.#], dated [date]
• [project/product name] Software Development Plan, Document Number [aaa]-SDP-[#.#], Revision [#.#], dated [date]
• [project/product name] Software Development Test Plan, Document Number [aaa]-DTP-[#.#], Revision [#.#], dated [date]
• [project/product name] Software End-product Acceptance Plan, Document Number [aaa]-EAP-[#.#], Revision [#.#], dated [date]
• [project/product name] Software Quality Assurance Plan, Document Number [aaa]-QAP-[#.#], Revision [#.#], dated [date]

1.4.2 Procedures and Guidelines

• Product Development Safety Design Guidelines, Revision [#.#], dated [date]
• Product Development User Interface Design Guidelines, Revision [#.#], dated [date]
• Software Engineering Configuration Management Guidelines, Revision [#.#], dated [date]
• Software Engineering Software Development Guidelines, Revision [#.#], dated [date]
• Software Engineering Software Configuration Management Policies, Revision [#.#], dated [date]
• Software Engineering Software Development Policies, Revision [#.#], dated [date]
• Software Engineering Verification and Validation Guidelines, Revision [#.#], dated [date]
• Software Engineering Verification and Validation Policies, Revision [#.#], dated [date]
2.0 V&V OVERVIEW
2.1 Organization

The [project/product name] has been organized under the direction of [title/position], who has assembled a program team that represents all concerned disciplines. This team coordinates the development activities and provides support for product development. Product development of [project/product name] involves disposables, [insert engineering disciplines here], and software engineering. The interface between these disciplines during product development is provided through the technical team that meets to address and resolve [project/product name] product development issues.
A lead software engineer heads the [project/product name] software development and provides technical direction in all aspects of software development. Other software engineers have been assigned to the team, and all participants in the development of the [project/product name] software are responsible for ensuring that their efforts are in compliance with the Software Development Policies.

The functions and tasks of V&V for the [project/product name] software are organized under the direction of [corporate title/position]. The V&V tasks, policies, and procedures are administered and approved by this individual. The authority for resolving project-related issues raised by the V&V tasks and approval of the V&V products resides with this individual or designee. The [project/product name] V&V organization is composed of software engineers who have not been directly involved in the development of the software being verified and have not established the criteria against which the software is validated.

The functional responsibilities of the V&V organization are separated into two functions, the software V&V lead engineer and the V&V software engineer. These two functions do not necessarily refer to organizational positions, and the responsibilities may be performed by one person. The development and administration of the V&V program described by this plan is the responsibility of the V&V lead software engineer. This individual is responsible for planning, organizing, monitoring, and controlling the V&V tasks performed for the [project/product name] project. The application of V&V tasks to each phase of software development is the responsibility of the V&V software engineer.

The project V&V organization will comply with all approved software configuration management requirements and procedures that are described in the [project/product name] Software Configuration Management Plan (SCMP) in the development and submittal of the V&V products defined in this plan.
2.2 Master Schedule

V&V is an integral part of each phase of the software development life cycle. The V&V tasks are integrated into the project schedule in order to provide feedback to the development process and to support management functions. A schedule of the [project/product name] V&V tasks and the relationship of each task to the phases of software development are presented in Appendix A.

The management of the V&V tasks and personnel that are required to support management and technical reviews requires the scheduling of those tasks to correspond to and meet project milestones. As the individual V&V tasks are completed, the results are documented in a V&V task report; the task reports are correlated at the completion of each software development phase into a Phase V&V Task Summary Report. The exchange of V&V data and results with the development effort is also provided by anomaly reports.
The anomaly reports, task reports, and task summary reports provide feedback to the software development process regarding the technical quality of software products. Resolution of critical or high-severity anomalies is required before the V&V effort proceeds on to the next software development phase.
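The rule that critical or high-severity anomalies must be resolved before the V&V effort advances lends itself to a simple phase-gate check, sketched below under hypothetical record names; it is illustrative only and not a required mechanism of this plan.

# Illustrative sketch only: a phase-exit gate that blocks advancement
# while critical or high-severity anomalies remain open.
BLOCKING_SEVERITIES = {"critical", "high"}

def may_proceed_to_next_phase(anomaly_log):
    open_blockers = [a for a in anomaly_log
                     if a["status"] == "open"
                     and a["severity"] in BLOCKING_SEVERITIES]
    return (len(open_blockers) == 0, open_blockers)

ok, blockers = may_proceed_to_next_phase(
    [{"status": "open", "severity": "high", "number": 1}])
assert not ok  # an open high-severity anomaly holds the phase gate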
2.3 Resources

The personnel and material resources required to perform V&V of the [project/product name] software are presented in Appendix B. The factors that were analyzed in determining these resource requirements are the product features and performance requirements of the [project/product name] product, as specified in the product requirements documentation, and development of the [project/product name] software project in compliance with the Software Development Policies.
2.4 Responsibilities

The project V&V organization is responsible for performing the V&V tasks defined in this plan. The personnel selected to perform the V&V of [project/product name] software have the technical credibility to:

• Understand the source of software quality-related problems
• Follow through with recommended corrective actions to the development process
• Abort delivery of defective software end products

Members of the project V&V organization will not be assigned to the development of software to be produced on the project or to establishing the criteria against which the software is validated. The specific roles and responsibilities of the V&V organization during each phase of software development are presented in Appendix C.
2.5 Tools, Techniques, and Methodologies

V&V of the [project/product name] software will be accomplished by reviewing, testing, checking or otherwise establishing, and documenting the conformance of the software to specified requirements. These techniques will be performed manually and by means of automated tools and techniques. Examples of the automated tools to be used include traceability analyzers,
static analyzers, dynamic analyzers, comparators, test result analyzers, and change trackers. Manual tools to be utilized include walk-throughs, formal reviews, and algorithm analysis. Support tools and techniques for testing include the following:

• General system utilities and text-processing tools for test preparation, organization, and modification
• Data reduction and report generation tools
• Library support systems consisting of database management systems and configuration control systems
• Test drivers and test languages

The selection of tools for the V&V tasks is based on the V&V objectives and goals for each phase of software development. The necessary tools for V&V are listed in Appendix B.
3.0 V&V REQUIREMENTS
3.1 Management

The management of the V&V program described in this plan spans all phases of [project/product name] software development. The [corporate title/position] designates a V&V lead software engineer, who is responsible for performing the V&V management tasks for the V&V program. The V&V lead is responsible for making decisions regarding the performance of V&V, assigning priorities to V&V tasks, estimating the level of effort for a task, tracking the progress of work, determining the need for V&V task iteration or initiation of new V&V tasks, and assuring adherence to standards in all phases of the [project/product name] V&V program.

The management tasks to be performed for the [project/product name] V&V program include, but are not limited to, the following:

• [project/product name] SVVP generation and maintenance
• Software baseline change assessment for effects on previously completed V&V tasks
• Periodic review of the V&V effort, technical accomplishments, resource utilization, future planning, and risk management
• Daily management of V&V phase activities, including the technical quality of final and interim V&V reports
• Review and evaluation of V&V results in order to determine when to proceed to the next software development life cycle phase and to define changes to V&V tasks that will improve the V&V effort
• Maintenance of good communication with all team members to ensure the accomplishment of project quality assurance goals and objectives
At each phase of software development, the V&V tasks and associated inputs and outputs, schedule, resource requirements, risks and assumptions, and the personnel responsible for performing the task are evaluated. This evaluation establishes the criteria for updating this plan. Maintenance is performed as necessary to ensure the completeness and consistency of this plan with the changes in software developed for the project.

The [corporate title/position] will support the management of V&V for the [project/product name] software through reviews of V&V activities. Periodic reviews of the V&V effort, technical accomplishments, resource utilization, future planning, and risk management will be conducted by the [corporate title/position]. The technical quality and results of the outputs of each phase of V&V will be evaluated in order to provide management support for the V&V leader's recommendation to proceed or not to proceed to the next development phase and to define changes to V&V tasks to improve the V&V effort. Updates to this plan during software development will be reviewed and approved by the [corporate title/position] prior to implementation.
3.2 Software Requirements Phase V&V

The goal of the Requirements Phase V&V is to ensure that both the problem and the constraints upon the solution are specified in a rigorous form. During this phase of software development the software requirements analysis is performed. As problem evaluation and solution synthesis are accomplished, the interface characteristics of the software are established and design constraints are uncovered. The Product Requirements Document specifies the product or system-level requirements of the [project/product name]. This document establishes the product requirements from which software requirements are allocated.

The [project/product name] Software Requirements Specification (SRS) is the document that specifies the results of the software requirements analysis. The SRS defines the basic functions,
performance, interfaces, flow and structure of information, and validation criteria of a successful software implementation. The emphasis of the Software Requirements Phase V&V tasks is the analysis and evaluation of the correctness, consistency, completeness, accuracy, and testability of the specified software requirements.
3.2.1 Verification and Validation Tasks

3.2.1.1 Review of Product Requirements Documentation. Review of the product requirements documentation is critical because it establishes the basis upon which all succeeding documents and products are developed. During this phase of development, the specifications of system performance, user interface, and critical components of [project/product name] are reviewed for use in V&V planning and in defining the level of effort required to successfully verify and validate the software. The requirements defined in these product requirements documents provide the basis for development of the [project/product name] Requirements Traceability Matrix (RTM). Product requirements documentation of the project is provided to the V&V leader by the software lead engineer for review prior to development of the SRS. Review of the Product Requirements Specification supports V&V of the software product and ensures that the development of a safe, reliable, user-friendly, cost-effective product is achievable.

3.2.1.2 Verification of the Software Requirements Specification. The [project/product name] SRS will be evaluated for correctness, consistency, completeness, accuracy, and testability. The SRS is provided to the V&V leader for review prior to the Software Requirements Review (SRR). The review of the SRS by V&V will concentrate on the following areas of requirement definition:

1. Specification of the computing environment(s) in which the software must perform
2. Specification of the safety requirements, including a description of the unsafe operating conditions in terms of critical software functions and goals, the severity of hazards, and the set of associated critical parameters and critical indicators
3. Specification of the hardware interfaces through which the software must gather input and send output
4. Specification of the software interfaces, including the purpose of the interface, the type of data to be interchanged via the interface, and an estimate of data quantity and transfer rate requirements
5. Specification of the user interfaces, including the characteristics that the software must support for each human interface to the software product
6. Specification of the interfaces to communications devices, including the name, version, interface type, and required usage
7. Specification of the required values of each output, expressed through functions and state tables
8. Specification of the timing, accuracy, and stability requirements for each such output value
9. Software design constraints specifying likely changes and desirable subsets that must be accounted for in the design

An assessment will also be made of how well the [project/product name] SRS satisfies system objectives, including system safety considerations.

3.2.1.3 Generation of the RTM. The [project/product name] RTM traces the development of the software product from requirements through software validation. The RTM is developed by V&V for use in:
Evaluating subsequent requirements and design documents
•
Developing test events and test data
•
Documenting the validation of the software
During review of the Product Requirements Specification, the software-related requirements are listed in the RTM with a reference to the document that specified them. These requirements will be refined in subsequent levels of engineering documentation and entered into a database with a reference to the higher-level requirement document. Every requirement in the RTM will be traceable in both directions to the product requirements documentation and to the code and subsequent test(s). The RTM will be used to generate tests designed to validate a specific requirement or a group of related requirements. Inconsistencies in the refinement of requirements, incomplete definition of requirements in lower-level specifications and code, and incomplete specification of testing for requirements are detected by the RTM.

Concurrent with evaluation of the [project/product name] SRS, the [project/product name] RTM is updated to document the tracing of the specified software and interface requirements to requirements in the product requirements documentation.
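Bidirectional tracing of this kind can be checked mechanically. The sketch below assumes hypothetical RTM structures and simply reports requirements that fail to trace in either direction; it is illustrative only and not part of this plan.

# Illustrative sketch only: detecting traceability gaps in an RTM.
# 'upstream' maps each SRS requirement to its product requirement;
# 'downstream' maps each SRS requirement to the tests that validate it.
def trace_gaps(requirements, upstream, downstream):
    missing_source = [r for r in requirements if not upstream.get(r)]
    missing_tests = [r for r in requirements if not downstream.get(r)]
    return missing_source, missing_tests

# Hypothetical usage:
reqs = ["SRS-3.1", "SRS-3.2"]
up = {"SRS-3.1": "PRD-4.2"}            # SRS-3.2 lacks a parent requirement
down = {"SRS-3.1": ["FU-1"], "SRS-3.2": ["RO-2"]}
print(trace_gaps(reqs, up, down))      # (['SRS-3.2'], [])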
3.2.2 Inputs and Outputs

The inputs to the Software Requirements Phase V&V include the product requirements documentation, [project/product name] SRS, and periodic program status reports. The outputs of the Software Requirements Phase V&V are the Requirements Phase V&V Task reports,
Requirements Phase V&V Task Summary report, the [project/product name] RTM, and updates to this plan as required in order to accommodate changes in product requirements and/or program objectives. Outputs from this phase are inputs to subsequent V&V tasks.

V&V task reports are developed for each V&V task that is conducted during the requirements phase and will be used to document discrepancies between requirements documentation and previously defined product requirements. The V&V task reports are distributed by the V&V leader to the software lead engineer and the [corporate title/position] for review and initiation of corrective action. The Software Requirements Phase V&V Task Summary report summarizes the results of the V&V performed and provides an assessment of the quality of progress and recommendations. The Software Requirements Phase V&V Task Summary report will be distributed to the software lead engineer and the [corporate title/position].

All V&V outputs generated during this development phase will be provided by the V&V leader to the software lead engineer prior to the SRR in order to support management in determining the adequacy, correctness, and testability of the stated software and interface requirements. A copy of all V&V outputs from this development phase will be delivered, at task completion, to the software configuration manager for archiving.
3.2.3 Risks and Assumptions
Accomplishment of the scheduled V&V for this phase of software development assumes that the [corporate title/position] will provide the V&V organization with the resources required to fulfill the V&V tasks defined, ensure that the necessary inputs are provided in a timely manner, and review the V&V outputs and provide feedback on the completeness and adequacy of each in supporting goals and objectives. Development of the SRS will comply with the requirements defined in the Software Development Policies.
3.3 Software Architecture Phase V&V

The goal of the Software Architecture Phase V&V is to ensure that the preliminary software design establishes the design baseline from which the detailed design will be developed. The [project/product name] Software Architecture Design Specification (SADS) is generated during this phase of software development. The SADS describes how the software system will be structured to satisfy the requirements identified in the [project/product name] SRS. The SADS translates the software requirements into a description of the software structure, software components, interfaces, and data necessary for the detailed design phase.

The goal of the Software Architecture Phase V&V tasks is to ensure internal consistency, completeness, correctness, and clarity of the information needed to support the detailed definition
of the individual software system components. V&V will generate a plan for software validation of the software end product during this software development phase.
3.3.1 Verification and Validation Tasks
3.3.1.1 Architecture Design Verification. The tasks defined for the Software Architecture Phase V&V concentrate on evaluating the preliminary software design. The relationships between the requirements of the [project/product name] SADS and the [project/product name] SRS are analyzed for correctness, consistency, and accuracy. The inclusion of safety features in the software design will be evaluated for compliance with approved software safety design guidelines and safety considerations identified in the system Hazards Analysis that are controlled and/or commanded by software.

Design walk-throughs are conducted by the software designer(s) during this phase to examine the characteristics of the software architecture design. V&V will participate in these walk-throughs and will provide results of design V&V to the software lead engineer and [corporate title/position]. The SADS is provided to the V&V leader by the software lead engineer for review prior to the Software Architecture Design Review (SADR). Review of the [project/product name] SADS by V&V is accomplished by performing the following:
Evaluating the form, structure, and functional description of the design for correctness, consistency, completeness, and accuracy
•
Evaluating the software structure for robustness, testability, and compliance with established software development procedures and Software Development Policies
•
Analysis of the data items defined at each interface for correctness, consistency, completeness, and accuracy
3.3.1.2 Generation of the Software Validation Test Plan. The generation of a plan for software validation testing is the responsibility of the V&V leader and is accomplished concurrently with design analysis. This document is the [project/product name] Software Validation Test Plan (SVTP) and defines the methods for verifying:

• Correct implementation of software requirements
• Software system capabilities
• Throughput and timing requirements
• Safety design requirements
• Correct interface to the system environment

The V&V leader is responsible for generating, maintaining, and obtaining approval of the SVTP.
Development of the [project/product name] SVTP is accomplished in parallel with the review and verification of the [project/product name] SADS. The SVTP describes the tests to be conducted and the resource requirements for software validation of the software end products. The tests to be performed during software validation are carefully selected to verify correct system operation under the range of environments and input conditions defined in the [project/product name] SRS. During software validation the following are measured:

• Compliance of the complete software product with all functional requirements while operating in all system environment(s)
• Software performance at hardware, software, and user interfaces
• Performance at boundaries and under stress conditions
• Compliance with safety design requirements
3.3.1.3 Review of the Software Development Test Plan. The scope of software testing that must be successfully completed for each software component by the software developers is defined in the [project/product name] Software Development Test Plan (SDTP). The software lead engineer provides the SDTP to the V&V leader for review prior to document approval. The completeness, correctness, and consistency of the software testing described in the SDTP for each software component are evaluated. Compliance of the SDTP with the requirements specified in the Software Development Policies and software development procedures is also verified.

3.3.1.4 Updates to the RTM. The [project/product name] RTM is updated to document the tracing of the software structure specified in the [project/product name] SADS to requirements in the [project/product name] SRS. The updated RTM is subsequently used in the generation of the [project/product name] SVTP to ensure completeness and consistency in test coverage. At the completion of this development phase, the [project/product name] RTM is updated to cross-reference each software requirement to the test(s) described in the SVTP and the [project/product name] SDTP. The V&V leader is responsible for ensuring the accuracy and completeness of the RTM updates.
3.3.2 Inputs and Outputs
The inputs to the Software Architecture Phase V&V include the [project/product name] SRS, Hazards Analysis, [project/product name] RTM, [project/product name] SADS, [project/product name] SDTP, and [project/product name] periodic program status reports. The outputs of the Software Architecture Phase V&V are the V&V Task reports, V&V Task Summary report, [project/product name] SVTP, [project/product name] RTM updates, and updates to this plan as required in order to accommodate changes in [project/product name] product/software requirements. Outputs from this phase are inputs to subsequent V&V tasks.
V&V task reports are developed for each V&V task conducted during this phase and are used to document discrepancies between the preliminary design documentation and previously defined software requirements. V&V task reports are distributed by the V&V leader to the software lead engineer and the [corporate title/position] for review and initiation of corrective action. Prior to completion of the [project/product name] Software Architecture Design Review (SADR), the [project/product name] SVTP will be reviewed by the software lead engineer and approved by the [corporate title/position]. The Software Architecture Phase V&V Task Summary report summarizes the results of the V&V performed and provides an assessment of the quality of progress and recommendations. The V&V task summary report will be distributed to the software lead engineer and [corporate title/position].

All V&V outputs generated during this development phase are provided by the V&V leader to the software lead engineer prior to the SADR in order to support management in determining the compatibility, reliability, and testability of the stated software and interface design. The [project/product name] SVTP is delivered, upon approval, to the software configuration manager for configuration management. Distribution of the SVTP will be made to the V&V leader, the software lead engineer, and the [corporate title/position] prior to the SADR. A copy of all other V&V outputs from this development phase is delivered, at task completion, to the software configuration manager for archiving.
3.3.3
Risks and Assumptions
Accomplishment of the scheduled V&V for this phase of software development assumes the following:
• The [corporate title/position] will provide the V&V organization with the resources required to fulfill the V&V tasks defined, ensure that the necessary inputs are provided in a timely manner, and review the V&V outputs and provide feedback on the completeness and adequacy of each in supporting goals and objectives.
• Development of the [project/product name] SADS and SDTP will comply with the requirements defined in the Software Development Policies.
3.4 Detailed Design Phase V&V The goal of the Detailed Design Phase V&V is to ensure that the detailed software design satisfies the requirements and constraints specified in the [project/product name] SRS and augments the design specified in the [project/product name] SADS. A [project/product name] Software Detailed Design Specification (SDDS) is generated during this phase of software development.
The SDDS describes how the software system will be structured to satisfy the requirements identified in the SRS and supports the design specified in the SADS. The SDDS translates the software requirements into a description of the software structure, software components, interfaces, and data necessary for the implementation phase. The result is a solution specification that can be implemented in code with little additional refinement. The goal of the Detailed Design Phase V&V tasks is to ensure the internal consistency, completeness, correctness, and clarity of the [project/product name] SDDS and to verify that the implemented design will satisfy the requirements specified in the [project/product name] SRS. The [project/product name] SVTP will be updated as required in order to incorporate the additional design details of the SDDS. Software Validation Test Information Sheets (VTISs) are developed by V&V to define the objectives, approach, and requirements of each test defined in the SVTP.
3.4.1
Verification and Validation Tasks
3.4.1.1 Detailed Design Verification. The tasks defined for the Detailed Design Phase V&V concentrate on evaluating the software design for the [project/product name] project. The relationships between the requirements of the [project/product name] SDDS and SRS and the design of the [project/product name] SADS and SDDS are analyzed for correctness, consistency, completeness, and accuracy. The inclusion of safety features in the software design is evaluated for compliance with approved software safety design guidelines and safety considerations identified in the Hazards Analysis that are controlled and/or commanded by software. Design walk-throughs are conducted by the software designer(s) during the detailed design phase to examine the characteristics of the detailed software design. V&V will participate in these walk-throughs and will provide results of design V&V to the software lead engineer and the [corporate title/position]. The [project/product name] SDDS is provided to the V&V leader by the software lead engineer for review prior to the Software Detailed Design Review (SDDR). Review of the SDDS is accomplished by performing the following:
• Evaluating the form, structure, and functional description of the design for correctness, consistency, completeness, and accuracy
• Evaluating the software structure for robustness, testability, and compliance with established software development procedures and Software Development Policies
• Analyzing the data items defined in the SDDS at each hardware, software, and user interface for correctness, consistency, completeness, and accuracy
• Assessing how well the software structures defined in the [project/product name] SDDS satisfy the fundamentals of structured design
Structured design techniques that provide a foundation for “good” design methods include the following (a brief code sketch follows the list):
• Evaluating the preliminary software structure to reduce coupling and improve cohesion
• Minimizing structures with high fan-out and striving for fan-in as depth increases
• Keeping the scope of effect of a component within the scope of control of that component
• Evaluating component interfaces to reduce complexity and redundancy and improve consistency
• Defining components whose function is predictable, but avoiding components that are overly restrictive
• Striving for single-entry, single-exit components, and avoiding content coupling
• Packaging software on the basis of design constraints and portability requirements
• Selecting the size of each component so that independence is maintained
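To make several of these techniques concrete, the following small Python sketch shows a component that observes single entry and exit, performs one cohesive task, and is reached only through data coupling. The function names and the checksum scheme are invented for illustration and are not part of these guidelines.

```python
def checksum(data: bytes) -> int:
    """Single-entry, single-exit component performing one cohesive task."""
    total = 0
    for byte in data:
        total = (total + byte) & 0xFF  # simple illustrative 8-bit checksum
    return total

def verify_record(record: bytes, expected: int) -> bool:
    # Depends on checksum() only through its parameters and return value
    # (data coupling); it does not reach into checksum's internals
    # (no content coupling), which also keeps fan-out low and predictable.
    return checksum(record) == expected
```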
3.4.1.2 Review of Software Development Test Information Sheets. A Software Development Test Information Sheet (DTIS) is prepared by the software developers for each software component test defined in the SDTP. The DTISs are provided to the V&V leader by the software lead engineer for review prior to the SDDR. Verification of the adequacy of software component testing is supported by the review of the DTISs. The DTISs are analyzed by V&V to evaluate the following:
• Adequacy of the test methods and test limits defined
• Adequacy of test coverage
• Software behavior
• Software reliability
3.4.1.3 Generation of the VTIS. The generation of VTISs is accomplished concurrently with the software design analysis. These test documents provide the following:
• An organized and accessible collection of all testing and test results
• A means of tracking the progression/status of testing
• A means of test verification
For each test conducted during [project/product name] software validation, a VTIS is generated and maintained which describes the following (a sketch of such a record follows the list):
• Objectives of the test and the success criteria
• Item under test
• Test approach
• Required test instrumentation
• Test phasing, scheduling, and duration
• Data collection, reduction, and analysis requirements
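As an illustration only, a VTIS record of this kind might be modeled as follows; the field names mirror the items listed above but are assumptions, not a prescribed format.

```python
from dataclasses import dataclass, field

@dataclass
class ValidationTestInformationSheet:
    test_id: str
    objectives: str                 # objectives of the test and success criteria
    item_under_test: str
    approach: str
    instrumentation: list[str] = field(default_factory=list)
    schedule: str = ""              # test phasing, scheduling, and duration
    data_requirements: str = ""     # collection, reduction, and analysis
    status: str = "planned"         # supports tracking progression/status
```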
The [project/product name] VTISs are provided to the software lead engineer prior to the SDDR, for use in assessing the adequacy of the test methods and limits defined for the software validation test program. Upon successful completion of the SDDR, the VTISs will serve as the basis for development of the [project/product name] Software Validation Test Procedures (SVTPR). 3.4.1.4 Updates to the RTM. The [project/product name] RTM will be updated to document the tracing of the software structure specified in the [project/product name] SDDS to requirements in the [project/product name] SRS.
3.4.2
Inputs and Outputs
The inputs to Detailed Design Phase V&V include the [project/product name] SRS, Hazards Analysis, [project/product name] SADS, [project/product name] RTM, [project/product name] SDDS, [project/product name] DTISs, and [project/product name] periodic program status reports. The outputs of Detailed Design Phase V&V are the Detailed Design Phase V&V Task reports, Detailed Design Phase V&V Task Summary report, [project/product name] VTISs, [project/product name] RTM updates, and updates to this plan as required in order to accommodate changes in [project/product name] product/software requirements. Outputs from this phase are inputs to subsequent V&V tasks. V&V task reports are developed for each V&V task conducted during this phase and are used to document discrepancies between the specification of software design and/or tests and previously defined software requirements. V&V task reports are distributed by the V&V leader to the software lead engineer and the [corporate title/position] for review and initiation of corrective action.
The Detailed Design Phase V&V Task Summary report summarizes the results of the V&V performed and provides an assessment of the quality of progress and recommendations. The V&V task summary report will be distributed to the software lead engineer and the [corporate title/position]. All V&V outputs generated during this development phase are provided by the V&V leader to the software lead engineer prior to the SDDR in order to support management in determining the compatibility, reliability, and testability of the stated software and interface design. A copy of all V&V outputs from this development phase will be delivered, at task completion, to the software configuration manager for archiving.
3.4.3
Risks and Assumptions
Accomplishment of the scheduled V&V for this phase of software development depends on the following assumptions:
• The [corporate title/position] will provide the V&V organization with the resources required to fulfill the V&V tasks defined, ensure that the necessary inputs are provided in a timely manner, and review the V&V outputs and provide feedback on the completeness and adequacy of each in supporting goals and objectives.
• Development of the [project/product name] SDDS and DTISs will comply with the requirements defined in the Software Development Policies.
3.5 Implementation Phase V&V The goal of the Implementation Phase V&V is to ensure that the design is correctly implemented in code, resulting in a program or system ready for validation. The Implementation Phase of the [project/product name] software development effort encompasses the activities defined in the Software Development Policies for the Code and Test Phase and the Integrate and Test Phase. The goal of the Implementation Phase V&V tasks is to ensure the accurate translation of the detailed design and to detect undiscovered errors. Verification of the Implementation Phase activities performed by software developers is accomplished by reviewing code and software integration results. The instructions for validation test setup, operation, and evaluation are generated by V&V for approval prior to test execution.
3.5.1
Verification and Validation Tasks
3.5.1.1 Source Code Verification. The V&V tasks performed during the Implementation Phase emphasize the analysis and evaluation of the source code against the [project/product name] SDDS. A traceability analysis is performed to identify the source code implementation of the design and assess the correctness, consistency, completeness, and accuracy of that implementation. The source code is also evaluated for robustness, testability, and compliance with established programming standards and conventions. Code walk-throughs are conducted by the code developer(s) during the Implementation Phase to examine both high-level and detailed properties of the source code. V&V personnel will participate in these walk-throughs and provide results of source code V&V to the software lead engineer and the [corporate title/position]. Code reviews will also be performed by V&V personnel. V&V reviews of source code will (a sketch of one automatable check follows the list):
• Evaluate the structure for compliance with coding standards
• Assess the communication value of the code
• Evaluate the efficiency of algorithms, memory usage, execution, and input and output
• Evaluate consistency, completeness, and traceability to software requirements and design
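Some of these review criteria can be supported by simple tooling. The sketch below shows the flavor of such an automated check, assuming a hypothetical project standard that limits function length; the threshold and names are illustrative, and a real code audit covers far more than this.

```python
import ast
import sys

MAX_FUNCTION_LINES = 50  # assumed project coding standard, not from this plan

def audit(path: str) -> list[str]:
    """Flag functions that exceed the assumed length limit."""
    with open(path) as source:
        tree = ast.parse(source.read(), filename=path)
    findings = []
    for node in ast.walk(tree):
        if isinstance(node, ast.FunctionDef):
            length = (node.end_lineno or node.lineno) - node.lineno + 1
            if length > MAX_FUNCTION_LINES:
                findings.append(f"{path}:{node.lineno}: {node.name} spans "
                                f"{length} lines (limit {MAX_FUNCTION_LINES})")
    return findings

if __name__ == "__main__":
    for finding in audit(sys.argv[1]):
        print(finding)
```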
Discrepancies and deficiencies found during V&V of the source code are documented in Software Anomaly Reports. 3.5.1.2 Verification of Software Component Testing. During the implementation phase, software developers will use DTISs to conduct software component testing. At the successful completion of the testing described, the DTIS is signed and dated by the software lead engineer. DTISs and associated test data will be provided to the V&V leader by the software lead engineer in an incremental manner, as each test is completed. Completed DTISs will be analyzed by V&V to evaluate the following:
• Adequacy of test coverage
• Adequacy of test data
• Software behavior
• Software reliability
Discrepancies and deficiencies found during software component testing are documented in Software Anomaly Reports. 3.5.1.3 Generation of Software Validation Test Procedures. The generation of test procedures for Software Validation will be accomplished concurrently with code and integration analysis. The [project/product name] SVTPR will be developed using the test information defined in the VTISs as an outline and adding procedures for test setup, operation, and evaluation. Test setup requirements and computing environments defined in the [project/product name] Challenge Test Protocol will be included in the [project/product name] SVTPR to ensure that test setup and computer environment configurations are as accurate as possible prior to validation test execution. The [project/product name] SVTPR will specify the following (a sketch of one way to structure a procedure step follows the list):
• Steps for executing the set of tests defined in the [project/product name] SVTP
• Requirements for logging test activities
• Criteria for procedure stop and restart
• Methods of collecting and analyzing test data
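For illustration, one way a single SVTPR step might be structured and logged is sketched below; the record layout, time-stamped log format, and stop-criteria handling are assumptions, not requirements of this plan.

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class ProcedureStep:
    number: int
    action: str      # operator or test-harness action to perform
    expected: str    # expected result, i.e., the pass criterion

def run_step(step: ProcedureStep, observed: str, log: list[str]) -> bool:
    """Execute one step, log the outcome, and report pass/fail."""
    passed = observed == step.expected
    log.append(f"{datetime.now().isoformat()} step {step.number}: "
               f"{'PASS' if passed else 'FAIL'} (observed: {observed})")
    return passed  # a failure may invoke the procedure's stop criteria
```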
The test procedures will be reviewed by the software lead engineer and approved by the [corporate title/position] prior to test execution. 3.5.1.4 Updates to the RTM. The [project/product name] RTM is updated to indicate the successful completion of software component tests.
3.5.2
Inputs and Outputs
Inputs to Implementation Phase V&V include the [project/product name] SDDS, [project/product name] SRS, [project/product name] code, [project/product name] SVTP, and [project/product name] VTISs. The outputs are V&V task reports, anomaly reports, a task summary report, [project/product name] SVTPR, updates to the [project/product name] RTM, and updates to this plan as required. The outputs from this phase are inputs to subsequent V&V tasks. The Implementation Phase V&V Task reports and anomaly reports are distributed to the software lead engineer and the [corporate title/position] for review and initiation of corrective action. The Implementation Phase V&V Task Summary report summarizes the results of the V&V performed and provides an assessment of the quality of progress and recommendations. The task summary report is distributed to the software lead engineer and the [corporate title/position].
The [project/product name] SVTPR document is delivered, upon approval, to the software configuration manager for configuration management. Distribution of the SVTPR will be made by the software configuration manager to the V&V leader, software lead engineer, and the [corporate title/position] prior to the start of the Software Validation Phase. All other V&V outputs from this development phase will be delivered, at task completion, to the software configuration manager for archiving.
3.5.3
Risks and Assumptions
Accomplishment of the scheduled V&V for this phase of software development assumes the following:
• The [corporate title/position] will provide the V&V organization with the resources required to fulfill the V&V tasks defined, ensure that the necessary inputs are provided in a timely manner, and review the V&V outputs and provide feedback on the completeness and adequacy of each in supporting goals and objectives.
• Development of the [project/product name] code and conduct of software component testing will comply with the requirements defined in the Software Development Policies.
• At the successful completion of component testing, [project/product name] code will be delivered to the V&V organization for baselining in accordance with the [project/product name] SDP and SCMP.
3.6 Software Validation Phase V&V The goal of the Software Validation Phase V&V is to verify that the [project/product name] software satisfies the requirements and design specified in the [project/product name] SRS and SDDS.
3.6.1
Verification and Validation Tasks
3.6.1.1 Prevalidation Software Configuration Control. At the completion of software component testing, the software is placed under configuration control for baseline processing. The baselined source code and associated files will be stored in the project software library in accordance with the [project/product name] SCMP. The library will provide internal source file control, problem identification, change traceability, and status determination of the software
and associated documentation. By this means, software configuration is controlled prior to Software Validation. 3.6.1.2 Software Validation Testing. Software Validation is performed using the current controlled version of the software. Software Validation is conducted in accordance with the [project/product name] SVTP using the [project/product name] SVTPR. The results of Software Validation are documented on the VTISs. Validation test results are analyzed to determine if the software satisfies software requirements and objectives. Software Anomaly Reports are generated to document test failures and software faults. Control of the software configuration under test is maintained by implementing the procedures in the [project/product name] SCMP. The SCMP describes the required steps for processing, reporting, and recording approved software changes and for disseminating baselined descriptive documentation and software media. Software Validation is conducted to ensure, as a minimum, the verification of the following software performance requirements:
• Satisfaction of applicable human interface requirements
• Satisfaction of applicable system safety and data integrity requirements
• Proper operation, including initiation, data entries via peripheral devices, and system operation monitoring and control
• Proper interface of all hardware specified in the software requirements specification
3.6.1.3 Regression Testing. Regression testing will be conducted during Software Validation as necessary in order to confirm that the redesign of corrected software has been effective and has not introduced other errors. This retesting includes repeat testing of all test procedures that revealed problems in the previous testing and repeat testing of all test procedures that verify functions that are affected by the corrections. The SVTPR will be corrected to incorporate changes resulting from procedure validation through execution and from procedure modifications that accommodate approved software design changes.
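One way to realize the retest selection rule just described is sketched below; it assumes the RTM is available as a mapping from requirements to test identifiers and that corrected components can be traced to affected requirements. The data shapes and names are illustrative.

```python
def select_regression_tests(rtm: dict[str, list[str]],
                            affected_requirements: set[str],
                            previously_failed: set[str]) -> set[str]:
    """Repeat all tests that revealed problems, plus all tests verifying
    functions affected by the corrections."""
    selected = set(previously_failed)
    for requirement, tests in rtm.items():
        if requirement in affected_requirements:
            selected.update(tests)
    return selected
```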
3.6.1.4 Software Configuration Audit. A Software Configuration Audit of the validated software is conducted by V&V at the conclusion of Software Validation. The baselined software documentation is audited to determine that all the software products to be delivered for certification are present. The version description of all items is verified to demonstrate that the delivered software end products correspond to the software subjected to Software Validation. Discrepancies and deficiencies found during the software configuration audit are documented in Software Anomaly Reports. All anomaly reports are provided to the software lead engineer and the [corporate title/position] for initiation of corrective action. Completion of the Software Configuration Audit is contingent upon closure of all outstanding software discrepancies and deficiencies.
Upon successful completion of this audit, the certified software products are delivered by the V&V leader to the software lead engineer for final product certification. The Software Configuration Audit Report is generated by the V&V leader to document the final configuration of the software products delivered for product certification. 3.6.1.5 Software Verification and Validation Report (SVVR). The [project/product name] Software Verification and Validation Report (SVVR) is generated by the V&V leader at the completion of all V&V tasks during the Software Validation Phase. The SVVR is a summary of all V&V activities and results, including status and disposition of anomalies. An assessment of the overall software quality and recommendations for software and/or development process improvements are documented in the report.
3.6.2
Inputs and Outputs
The inputs to the Software Validation Phase V&V are the [project/product name] product requirements documentation, [project/product name] SRS, [project/product name] SDDS, [project/product name] SVTP, [project/product name] VTISs, and [project/product name] SVTPR. The outputs are the completed [project/product name] VTISs, Software Validation Phase V&V Task Summary report, Software Configuration Audit Report, [project/product name] SVVR, anomaly reports, and updates to this plan as required in order to accommodate changes in the [project/product name] software validation program. The Software Validation Phase V&V Task Summary report is generated at the conclusion of all testing. This report includes a summary and details of the test results, a detailed test history, an evaluation of the test results with recommendations, and a record of test procedure deviations. Anomaly reports generated during this V&V phase document discrepancies detected during testing and the software configuration audit. The Software Validation Phase V&V Task Summary report and anomaly reports are distributed to the software lead engineer and the [corporate title/position] for review and initiation of corrective action. The Software Configuration Audit Report and the SVVR are distributed to the software lead engineer and the [corporate title/position]. All V&V outputs from the Software Validation phase will be placed under configuration control at task completion.
3.6.3
Risks and Assumptions
Accomplishment of the scheduled V&V for this phase of software development depends on the following assumptions:
• The [corporate title/position] will provide the V&V organization with the resources required to fulfill the V&V tasks defined, ensure that the necessary inputs are provided in a timely manner, and review the V&V outputs and provide feedback on the completeness and adequacy of each in supporting goals and objectives.
• Changes to [project/product name] requirements that have not been approved by the [title/position] will not be implemented in the [project/product name] code without the approval of the [corporate title/position] or designee.
• [project/product name] software to be tested will be obtained by the V&V engineer(s) from the configuration management library.
4.0 V&V REPORTING
4.1 Task Reporting The results of individual V&V tasks are documented in a V&V task report. The task report identifies the V&V phase at which the task was conducted, the responsible V&V engineer(s), the responsible software development team member(s), interim results, status, and recommended corrective action, if any. The V&V task report may be in a format appropriate for technical disclosure. V&V task reports are provided to the software lead engineer and the [corporate title/position] in a timely manner to aid in the detection and resolution of problems prior to the start of the next software development phase. V&V task reports will be placed in the [project/product name] archive files.
4.2 V&V Phase Summary Report At the conclusion of each V&V phase, the V&V Phase Summary Report summarizes the results of V&V performed during the applicable software development phase. This summary report contains, as a minimum, the following:
• Description of the V&V tasks performed
• Summary of task results
• Summary of anomalies and implemented resolutions
• Assessment of software quality
• Recommendations
The V&V Phase Summary Report may be in a format appropriate for technical disclosure and will be placed in the [project/product name] archive files.
4.3 Anomaly Report Problem reporting is initiated by the V&V software engineer(s) with a Software Anomaly Report that identifies problems detected during V&V activities. The specific information required on an anomaly report identifies how, when, and where the problem occurred and the impact of the problem on the system capability of the product and on the continued conduct of V&V phase activities. The form used to document anomalies detected by the V&V effort is shown in Appendix D-1, and the instructions for completing the anomaly report are shown in Appendix D-2. Each anomaly report contains the following (a sketch of such a record follows the list):
• Description and location of the anomaly
• Severity of the anomaly, if determinable
• Cause and method of identifying the anomalous behavior
• Recommended action and actions taken to correct the anomalous behavior
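A hypothetical rendering of such a report as a structured record is sketched below, loosely following the form in Appendix D-1; the field and type names are illustrative only.

```python
from dataclasses import dataclass, field
from enum import Enum

class Severity(Enum):
    HIGH = "H"    # prevents or seriously degrades a system objective, or safety-related
    MEDIUM = "M"  # degrades a system objective, or needed for performance/confirmation
    LOW = "L"     # maintenance, operator convenience, or other

@dataclass
class AnomalyReport:
    report_number: str
    title: str
    severity: Severity
    description: str                 # description and location of the anomaly
    cause: str = ""                  # cause and method of identification
    recommended_action: str = ""
    actions_taken: list[str] = field(default_factory=list)
    closed: bool = False
```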
Anomaly reports are delivered to the [project/product name] software configuration manager for configuration identification, tracking, and status reporting.
4.4 Software Configuration Audit Report The Software Configuration Audit Report (SCAR) is a checklist that documents the results of the software configuration audit. The SCAR does the following:
• Identifies and describes each item of the software products to be certified
• Verifies that the software configurations are what they were intended and proclaimed to be
• Verifies that the configuration of the software products to be certified is the same configuration validated at the completion of the Software Validation phase
The SCAR may be in a format appropriate for technical disclosure and will be placed in the [project/product name] archive files.
4.5 V&V Final Report The [project/product name] SVVR is issued upon completion of all V&V tasks during the Software Validation phase of software development. The following are included, as a minimum, in the SVVR:
• Summary of all V&V tasks performed
• Summary of task results
• Summary of anomalies and resolutions
• Assessment of overall software quality
• Recommendations
The SVVR may be in a format appropriate for technical disclosure and will be placed in the [project/product name] archive files.
5.0 V&V ADMINISTRATION PROCEDURES
5.1 Anomaly Reporting and Resolution The V&V leader is responsible for the proper documentation and reporting of Software Anomaly Reports. All anomalies are reported, regardless of the perceived impact on software development or the severity level with respect to the system operation. Unreported, unresolved problems can have a significant adverse impact in later stages of the software development life cycle, possibly when there is little time for resolution.
All anomaly reports are placed under configuration management. The projected impact of an anomaly is determined by evaluating the severity of its effect on the operation of the system. The severity of an anomaly report is defined as one of the following:
• High. The change is required to correct a condition that prevents or seriously degrades a system objective and no alternative exists, or to correct a safety-related problem.
• Medium. The change is required to correct a condition that degrades a system objective, a change is required to provide for performance improvement, or a change is required to confirm that the user and system requirements can be met.
• Low. The change is desirable to maintain the system, correct operator inconvenience, or other.
Resolution of critical or “high” anomalies is required before the V&V effort proceeds to the next software development phase. Software Anomaly Reports are reviewed by the software lead engineer for anomaly validity, type, and severity. The software lead engineer can direct that additional investigation be performed if required in order to assess the validity of the anomaly or the proposed solution. When an anomaly solution is approved and the personnel responsible for performing the corrective action are indicated, the software lead engineer will authorize implementation of the corrective action. The V&V leader is responsible for anomaly report closure, which includes documenting the corrective action(s) taken and verifying the incorporation of authorized changes as described in the anomaly report. If the anomaly requires a change to a baselined configuration item, a CRA is prepared by a member of the software development team for the item(s) to be changed. A reference to applicable anomaly reports will be documented in the issued CRA. CRAs will be processed in accordance with the [project/product name] SCMP.
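Reusing the hypothetical AnomalyReport and Severity types sketched in Section 4.3, the phase-gate rule above could be expressed as follows; this is an illustration of the policy, not a mandated implementation.

```python
def may_proceed_to_next_phase(reports: list[AnomalyReport]) -> bool:
    # Resolution of "high" severity anomalies is required before the
    # V&V effort proceeds to the next software development phase.
    return not any(r.severity is Severity.HIGH and not r.closed
                   for r in reports)
```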
5.2 Task Iteration Policy The software life cycle includes the need for anomaly correction, performance enhancements, requirement changes and clarifications, and management of the effects of software changes on previously completed V&V tasks or future V&V tasks. The requirement for the reperformance of previous V&V tasks, or the initiation of new V&V tasks in order to address these software changes, is established by the V&V leader and approved by the [corporate title/position].
Continuous review of V&V efforts, technical accomplishments, resource utilization, future planning, and risk assessment is required for effective V&V management. V&V tasks that uncover significant problems and/or tasks for which a significant part of the defined activity was not completed are candidates for V&V task iteration once corrections to outstanding problems have been implemented. Other opportunities for V&V task iteration are found when the inputs to the V&V task have undergone significant changes to the representation of the system or software requirements. Required iteration of V&V tasks is determined by the V&V leader through assessments of change, criticality, and quality effects.
5.3 Control Procedures All inputs and outputs of the V&V effort for each project are placed under configuration management for historical archiving. All software development material used in any V&V task is configured, protected, and assigned a status upon V&V task completion. Procedures for V&V document change control and configuration status accounting are implemented in accordance with the [project/product name] SCMP to ensure that the validity of V&V results is protected from accidental or unauthorized alteration. All V&V inputs and outputs are retained, in accordance with the corporate disaster contingency plan, in order to provide project history for use in future software development planning and analysis of development cycle requirements.
APPENDIX A

SCHEDULE OF V&V TASKS BY SOFTWARE DEVELOPMENT LIFE CYCLE PHASE

[Figure: flowchart mapping V&V tasks and outputs to the software development life cycle phases Project Start-Up, Interface Design, Requirements, Architecture Design, Detailed Design, Code and Test, Integrate and Test, and Software Validation. Tasks and outputs shown include estimations; Software Quality Assurance Plan (SQAP) and Software Verification and Validation Plan (SVVP) generation and updates; Interface Design Specification (IDS) evaluation; system requirements document review and software requirements analysis; Software Requirements Specification (SRS), Software Development Plan (SDP), Software Configuration Management Plan (SCMP), Software End-product Acceptance Plan (SEAP), Software Development Test Plan (SDTP), Software Architecture Design Specification (SADS), Software Detailed Design Specification (SDDS), and Software Development Test Information Sheet (DTIS) evaluations; the Requirements Traceability Matrix (RTM) and its updates; Software Requirements Review (SRR), Software Architecture Design Review (SADR), and Software Detailed Design Review (SDDR) support; software architecture and detailed design analyses and walk-throughs; Software Validation Test Plan (SVTP) generation and updates; Validation Test Information Sheet (VTIS) generation and updates; code walk-throughs, code audits, and anomaly generation; Software Validation Test Procedure (SVTPR) generation; software validation and regression test conduct; Software Configuration Audit Report (SCAR) generation; Software Verification and Validation Report (SVVR) generation; and V&V Task Summary Reports at each phase.]
APPENDIX B

V&V RESOURCE REQUIREMENTS BY SOFTWARE DEVELOPMENT LIFE CYCLE PHASE

[Table: matrix marking, with an X, the life cycle phases (Project Start-Up, Interface Design, Requirements, Architecture Design, Detailed Design, Code and Test, Integrate and Test, and Software Validation) in which each resource is required. Resources are grouped as V&V functional requirements (V&V lead engineer, V&V engineer); V&V hardware requirements (emulator, PC or workstation, controlled current power supply, test fixtures, EPROM generator); and V&V software requirements (database software, text-processing software, graphical or drawing software, spreadsheet software, project management software, performance analyzer, logic analyzer, complexity analysis software, reverse engineering software, configuration control software, code debugger software, compiler software, assembler software, and linker/loader software).]
APPENDIX C

V&V TASKS, ROLES, AND RESPONSIBILITIES BY SOFTWARE DEVELOPMENT LIFE CYCLE PHASE

Project Start-up: Estimations; Software Quality Assurance Plan (SQAP) evaluation; Software Verification and Validation Plan (SVVP) generation.

Interface Design: Interface Design Specification (IDS) evaluation; V&V Task Summary Report.

Requirements: Product or system requirements documentation review; software requirements traceability analysis; Software Requirements Specification (SRS) evaluation; Software Requirements Review (SRR) support; Software Configuration Management Plan (SCMP) evaluation; Software End-product Acceptance Plan (SEAP) evaluation; SVVP update; V&V Task Summary Report.

Architecture Design: Software architecture design analysis; Software Architecture Design Specification (SADS) evaluation; Software Development Test Plan (SDTP) evaluation; software architecture design walk-through; Software Architecture Design Review (SADR) support; Software Validation Test Plan (SVTP) generation; RTM update; SVVP update; V&V Task Summary Report.

Detailed Design: Software detailed design analysis; Software Detailed Design Specification (SDDS) evaluation; software detailed design walk-through; Software Development Test Information Sheet (DTIS) evaluation; Software Detailed Design Review (SDDR) support; SDTP evaluation; Validation Test Information Sheet (VTIS) generation; RTM update; SVVP update; SVTP update; V&V Task Summary Report.

Code and Test: Code walk-throughs; code audits; anomaly generation; DTIS evaluation; RTM update; SVVP update; VTIS update; V&V Task Summary Report.

Integrate and Test: Code walk-throughs; code audits; anomaly generation; DTIS evaluation; Software Validation Test Procedure (SVTPR) generation; RTM update; SVVP update; VTIS update; V&V Task Summary Report.

Software Validation: Software validation test conduct; regression test conduct; Software Configuration Audit Report (SCAR) generation; anomaly generation; RTM update; Software Verification and Validation Report (SVVR) generation.

[In the original table, each task is also assigned by X marks to the V&V lead engineer and/or the V&V engineer.]
APPENDIX D-1

SOFTWARE ANOMALY REPORT

SOFTWARE ANOMALY REPORT
1. Date:    2. Severity: H M L    3. Anomaly Report Number:
4. Title (briefly describe the problem):
5. System:    6. Component:    7. Version:
8. Originator:    9. Organization:    10. Telephone:    11. Approval:
12. Verification and Validation Task:    13. Reference Document(s):
14. System Configuration:
15. Anomaly Description:
16. Problem Duplication: During run Y N N/A    After restart Y N N/A    After reload Y N N/A
17. Source of Anomaly:
PHASE: ❑ Requirements ❑ Architecture Design ❑ Detailed Design ❑ Implementation ❑ Undetermined
TYPE: ❑ Documentation ❑ Software ❑ Process ❑ Methodology ❑ Other ❑ Undetermined
18. Investigation Time:
19. Proposed Solution:
20. Corrective Action Taken:    Date:
21. Closure Sign-off:
Software Lead Engineer    Date
V&V Lead Engineer    Date
APPENDIX D-2
INSTRUCTIONS FOR COMPLETING SOFTWARE ANOMALY REPORT
1. Date: Form preparation date.
2. Severity: Circle the appropriate code. High: The change is required to correct a condition that prevents or seriously degrades a system objective (where no alternative exists) or to correct a safety-related problem. Medium: The change is required to correct a condition that degrades a system objective, to provide for performance improvement, or to confirm that the user and system requirements can be met. Low: The change is required to maintain the system, correct operator inconvenience, or other.
3. Anomaly report number: Number assigned for control purposes.
4. Title: Brief phrase or sentence describing the problem.
5. System: Name of the system or product against which the anomaly report is written.
6. Component: Component or document name against which the anomaly report is written.
7. Version: Version of the document or code against which the anomaly report is written.
8. Originator: Printed name of the individual originating the anomaly report.
9. Organization: Organization of the originator of the anomaly report.
10. Telephone: Office phone number of the individual originating the anomaly report.
11. Approval: Software management individual or designee approval for anomaly report distribution.
12. V&V task name: Name of the V&V task being performed when the anomaly was detected.
13. Reference document: Designation of the documents that provide the basis for determining that an anomaly exists.
14. System configuration: Configuration loaded when the anomaly occurred; not applicable for documentation or logic errors.
15. Anomaly description: Description defining the anomaly and a word picture of events leading up to and coincident with the problem. Cite equipment being used, unusual configurations, environment parameters, and so forth, that will enable the programmer to duplicate the situation. If continuation sheets are required, fill in Page _ of _ at the top of the form.
16. Problem duplication: Duplication attempts, successes, or failures for software errors; not applicable for documentation or logic errors.
17. Source of anomaly: On investigation completion, source of the anomaly in terms of phase of origination and type.
18. Investigation time: Time, to the nearest half hour, required to determine the cause of the anomaly, excluding the time to determine a potential solution or to implement the corrective action.
19. Proposed solution: Description defining in detail a solution to the detected anomaly, including documents, components, and code.
20. Corrective action taken: Disposition of the anomaly report, including a description of any changes initiated as a direct result of this report and the date incorporated.
21. Closure sign-off: Signature of the software lead engineer authorizing implementation of the corrective action. Signature of the V&V lead engineer verifying incorporation of the authorized changes as described in this report. Only the signature of the software lead engineer is required when no corrective action is approved.
GLOSSARY

Accuracy: Quantitative assessment of freedom from error.

Algorithm: Finite set of well-defined rules for the solution of a problem in a finite number of steps.

Algorithm analysis: Examination of an algorithm to determine its correctness with respect to its intended use, to determine its operational characteristics, or to understand it more fully in order to modify, simplify, or improve it.

Anomaly: Anything observed in the documentation or operation of software that deviates from expectations based on previously verified software products or reference documents.

Audit: Independent review for the purpose of assessing compliance with software requirements, specifications, baselines, standards, procedures, instructions, and coding requirements.

Baseline: Specification or product that has been formally reviewed and agreed upon, that thereafter serves as the basis for further development, and that can be changed only through formal change control procedures.

Change control: Process by which a change is proposed, evaluated, approved or rejected, scheduled, and tracked.

Code: Loosely, one or more computer programs or part of a computer program.

Code audit: Independent review of source code by a person, team, or tool to verify compliance with software design documentation and programming standards. Correctness and efficiency may also be evaluated.

Completeness: Those attributes of the software and documentation that provide full implementation of the functions required.

Component: Unit of code that performs a specific task, or a group of logically related code units that perform a specific task or set of tasks.

Component testing: Testing conducted to verify the implementation of the design for one software component or collection of software components.

Computer program: Sequence of instructions suitable for processing by a computer. Processing may include the use of an assembler, a compiler, an interpreter, or a translator to prepare the program for execution as well as to execute it.
Configuration identification: Process of designating the configuration items in a system and recording their characteristics.

Configuration item: Aggregation of hardware, software, or any of its discrete parts, that satisfies an end-use function.

Configuration management: Process of identifying and defining the configuration items in a system, controlling the release and change of these items throughout the product life cycle, recording and reporting the status of configuration items and change requests, and verifying the completeness and correctness of configuration items.

Configuration status accounting: Recording and reporting of the information needed to manage a configuration effectively, including a listing of approved configuration identification, status of proposed changes to the configuration, and implementation status of approved changes.

Consistency: Those attributes of the software and documentation that provide uniformity in the specification, design, and implementation of the product.

Correctness: Extent to which software is free of design defects, coding defects, and faults; meets its specified requirements; and meets user expectations.

Critical software: Software whose failure could have an impact on safety.

Criticality: Classification of a software error or fault based upon evaluation of the degree of impact of that error or fault on development or operation of a system.

Delivery: Transfer of responsibility for an item from one activity to another, as in delivery of the validated software product to Quality Assurance for certification.

Design phase: Period in the software development cycle during which the designs for architecture, software components, interfaces, and data are created, documented, and verified to satisfy requirements.

Design requirement: Any requirement that impacts or constrains the design of a software system or software system component.

Deviation: Authorization for a future activity, event, or product to depart from standard procedures.

Documentation: Manuals, written procedures or policies, records, or reports that provide information concerning uses, maintenance, or validation of software.
Error: Discrepancy between a computed, observed, or measured value or condition and the true, specified, or theoretically correct value or condition.

Evaluation: Process of determining whether an item or activity meets specified criteria.

Failure: Inability of a system or system component to perform its required function (see fault).

Fault: Defect of a system or system component, caused by a defective, missing, or extraneous instruction or set of related instructions in the definition, specification, design, or implementation of a system, that may lead to a failure.

Hazard: Dangerous state of a device or system that may lead to death, injury, occupational illness, or damage to or loss of equipment or property.

Hazard analysis: Listing of potential hazards associated with a device or system, along with an estimation of the severity of each hazard and its probability of occurrence.

Implementation phase: Period in the software development cycle during which a software product is created from design documentation and debugged.

Integration: Process of combining software elements, hardware elements, or both into an overall system.

Iteration: Process of repeatedly executing a given sequence of steps until a given condition is met or while a given condition is true.

Milestone: Scheduled and accountable event that is used to measure progress.

Quality assurance (QA): Planned and systematic pattern of all actions necessary to provide adequate confidence that the item or product conforms to established technical requirements.

Regression testing: Selective retesting to detect faults introduced during modification, to verify that modifications have not caused unintended adverse effects and that a modified system or system component still meets its specified requirements.

Requirements phase: Period in the software development cycle during which the requirements, such as functional and performance capabilities for a software product, are defined and documented.
Robustness: Extent to which software can continue to operate correctly despite introduction of invalid inputs.

Safety: Provision of a very high degree of freedom, within the constraints of system effectiveness and cost, from those conditions that can cause death, injury, occupational illness, or damage to or loss of equipment or property.

Software: Computer programs, procedures, rules, and associated documentation and data pertaining to the operation of a computer system.

Software Architecture Design Specification (SADS): Project-specific document that contains the design information needed to support the detailed definition of the individual software system components and, upon completion of the Architecture Design Review, becomes the design baseline for development of the SDDS used in support of software coding.

Software Configuration Management Plan (SCMP): Project-specific plan that specifies the methods and planning employed to implement software configuration management activities.

Software Detailed Design Review (SDDR): Software review conducted for the purpose of: (1) reviewing the project’s SDDS, associated plans, and critical issues; (2) resolving identified issues; (3) obtaining commitment to proceed into the code and test phase; and (4) obtaining commitment to a test program supporting product acceptance.

Software Detailed Design Specification (SDDS): Project-specific document that constitutes an update to and an expansion of the design baseline established at the Architecture Design Review, including a description of the overall program operation and control and the use of common data. The detailed design is described through the lowest component level of software organization and the lowest logical level of database organization.

Software development life cycle: Period that starts with the development of a software product and ends when the product is validated and delivered for QA certification. This life cycle includes a requirements phase, design phase, implementation phase, and software validation phase.

Software Development Plan (SDP): Project-specific plan that identifies and describes the procedures employed to implement the management activities that coordinate schedules, control resources, initiate actions, and monitor progress of the software development effort.

Software Development Test Plan (SDTP): Project-specific plan that defines the scope of software testing that must be successfully completed for each software component developed.
Software end products: Computer programs, software documentation, and databases produced by a software development project.

Software library: Controlled collection of software and related documentation designed to aid in software development, use, or maintenance.

Software quality: Totality of features and characteristics of a software product that bear on its ability to satisfy given needs.

Software Quality Assurance Plan (SQAP): Project-specific plan that states the software quality objectives of the project as conditioned by the product requirements and the significance of the intended application.

Software reliability: Probability that software will not cause the failure of a system for a specified time under specified conditions.

Software Requirements Review (SRR): Review of the provisions of the Software Requirements Specification that, once approved, will serve as the basis of software end-product acceptance.

Software Requirements Specification (SRS): Project-specific document that provides a controlled statement of the functional, performance, and external interface requirements for the software end products.

Software Validation Phase: Period in the software development life cycle in which the components of a software product are evaluated and integrated and the entire software product is evaluated to determine whether requirements have been satisfied.

Software Validation Test Plan (SVTP): Project-specific plan that describes the software testing required to verify that the software product satisfies the specified requirements.

Source code: Original software expressed in human-readable form (programming language) that must be translated into machine-readable form before it can be executed by the computer.

Test Information Sheet (TIS): Document that defines the objectives, approach, and requirements for a specific test.

Testability: Extent to which software facilitates both the establishment of test criteria and the evaluation of the software with respect to those criteria, or the extent to which the definition of requirements facilitates analysis of the requirements to establish test criteria.
Validation: Process of evaluating software at the end of the software development process to ensure compliance with software requirements.

Verification: Process of determining whether the products of a given phase of the software development cycle fulfill the requirements established during the previous phase.

Walk-through: Review in which the designer or programmer leads members of the review team through a segment of design or code, and the reviewers ask questions and submit comments about technique, style, possible errors, violation of development standards, and other problems.
[Project/Product Name] IDS INTERFACE DESIGN SPECIFICATION
Written by:
[Name/Title/Position]
Date
Reviewed by:
[Name/Title/Position]
Date
Approved by:
[Name/Title/Position]
Date
Document Number [aaa]-IDS-[#.#]

Revision [#.#]

Page 1 of [#]

REVISION HISTORY

Revision	Description	Date
[##.##]	[Revision description]	[mm/dd/yy]
CONTENTS

1.0 INTRODUCTION 3

2.0 HARDWARE-TO-SOFTWARE INTERFACES 5

3.0 SOFTWARE-TO-SOFTWARE INTERFACES 11

GLOSSARY 13
1.0 INTRODUCTION
1.1 Purpose This document specifies the interfaces between the hardware and software for the [project/product name] project and the hardware platform on which the software will execute.
1.2 Scope This document applies to all of the hardware and software interfaces that are resident in the [project/product name] product.
1.3 Overview This specification delineates the hardware-to-software and software-to-software interfaces within the executable code of the [project/product name] product. Section 2 specifies the hardwareto-software interfaces and details the kernel functions, such as processor, memory, interrupt system, timers, DMA system, wait state generator, hardware CRC generator, and serial communications controller; power manager functions, such as power management, watchdog, and real-time clock; product hardware functions, such as audio, keyboard, displays, motors, analog inputs, and sensors; communications functions; self-test descriptions such as logic board, analog board, display module, switch panel power supply, and battery; and port, register, and data definitions. Section 3 specifies the software-to-software interfaces and details the mailbox design, such as structure definitions, message definitions, and message values and types; global declarations; global data variables, such as RAM data and data constants; interrupt service parameters; driver and controller data; and macro definitions.
1.4 Referenced Documents The following documents of the exact issue shown form a part of this specification to the extent specified herein. In the event of conflict between the documents referenced herein and the content of this specification, the content of this specification shall be considered a superseding requirement.
1.4.1
Project Specifications
• [project/product name] Product Objectives Document, Document Number [aaa]-POD-[#.#], Revision [#.#], dated [date]
• [project/product name] Product Requirements Document, Document Number [aaa]-PRD-[#.#], Revision [#.#], dated [date]
• [project/product name] Software Configuration Management Plan, Document Number [aaa]-CMP-[#.#], Revision [#.#], dated [date]
• [project/product name] Software Development Plan, Document Number [aaa]-SDP-[#.#], Revision [#.#], dated [date]
• [project/product name] Software Development Test Plan, Document Number [aaa]-DTP-[#.#], Revision [#.#], dated [date]
• [project/product name] Software End-product Acceptance Plan, Document Number [aaa]-EAP-[#.#], Revision [#.#], dated [date]
• [project/product name] Software Quality Assurance Plan, Document Number [aaa]-QAP-[#.#], Revision [#.#], dated [date]
• [project/product name] Software Verification and Validation Plan, Document Number [aaa]-VVP-[#.#], Revision [#.#], dated [date]
1.4.2
Procedures and Guidelines
• Product Development Safety Design Guidelines, Revision [#.#], dated [date]
• Product Development User Interface Design Guidelines, Revision [#.#], dated [date]
• Software Engineering Configuration Management Guidelines, Revision [#.#], dated [date]
• Software Engineering Software Development Guidelines, Revision [#.#], dated [date]
• Software Engineering Configuration Management Policies, Revision [#.#], dated [date]
• Software Engineering Software Development Policies, Revision [#.#], dated [date]
• Software Engineering Verification and Validation Guidelines, Revision [#.#], dated [date]
• Software Engineering Verification and Validation Policies, Revision [#.#], dated [date]
1.4.3
Other
[Insert hardware reference manuals, component handbook or data sheet call outs here.]
2.0 HARDWARE-TO-SOFTWARE INTERFACES
2.1 Kernel Functions

2.1.1 Processor

[Insert the microprocessor specifications, attributes, architecture, or characteristics here.]

2.1.2 Memory
2.1.2.1 ROM (Read Only Memory). [Insert ROM characteristics, specifications, attributes, usable address range(s), and interrupt vector information here.]
2.1.2.2 RAM (Random Access Memory). [Insert RAM characteristics, specifications, attributes, and usable address range(s) here.]
2.1.2.3 EEPROM (Electronically Erasable and Programmable Read Only Memory). [Insert EEPROM characteristics, specifications, attributes, usable address range(s), access method, interrupt vector, and operations and register values here.]
2.1.3 Interrupt System

[Insert interrupt characteristics, specifications, and attributes by name, function, and value here.]
2.1.4 Input and Output

[Insert input and output port configuration and memory-mapped input and output here.]
2.1.5 Timers

[Insert timer characteristics, specifications, attributes, names, and functions here.]

2.1.6 DMA (Direct Memory Access) System

[Insert DMA channel characteristics, specifications, attributes, and functions here.]

2.1.7 Wait State Generator

[Insert wait state generator characteristics, specifications, attributes, and functions here.]

2.1.8 Hardware CRC (Cyclical Redundancy Check) Generator

[Insert hardware CRC generator characteristics, specifications, attributes, name(s), connections, channel(s), and functions here.]

2.1.9 Serial Communications Controller

[Insert serial communications controller characteristics, specifications, attributes, functions, device(s) controlled, register usage and values, device select priority, and communications type here.]
2.2 Power Manager Functions

2.2.1 Power Manager Functions

[Insert power manager characteristics, specifications, attributes, functions, connections, warnings, alarms, and calibration here.]

2.2.2 Watchdog

[Insert watchdog characteristics, specifications, attributes, functions, interval, alarms, testing interval, channel or port connections, and name(s) and values here.]

2.2.3 Real-time Clock

[Insert real-time clock characteristics, specifications, attributes, functions, channel or port connections, and name(s) and values here.]

2.2.4 Always “On” Displays

[Insert displays that are always “on,” including their characteristics, specifications, attributes, functions, channel or port connections, and name(s) and values here.]
2.3 Product Hardware Functions

2.3.1 Audio

[Insert audio characteristics, specifications, attributes, functions, frequencies, amplitudes, durations, register controls, and name(s) and values here.]

2.3.2 Keyboard

[Insert keyboard characteristics, specifications, attributes, functions, channel or port connections, access method, commands, register usage, and name(s) and values here.]
2.3.3 Displays

[Insert the graphics, LCD, and LED display characteristics, specifications, attributes, functions, channel or port connections, controller, intensity and contrast, registers, bit assignments, and name(s) and values here.]

2.3.4 Motor(s)

[Insert the motor(s) characteristics, specifications, attributes, functions, channel or port connections, register usage, timer definitions, and name(s) and values here.]

2.3.5 Analog Inputs

[Insert the analog-to-digital and digital-to-analog converter characteristics, specifications, attributes, functions, channels or port connections, register usage, timer definitions, and name(s) and values here.]

2.3.6 Sensors

[Insert the sensors characteristics, specifications, attributes, functions, channel or port connections, register usage, timer definitions, and name(s) and values here.]

2.3.7 Switches

[Insert the switches characteristics, specifications, attributes, functions, channel or port connections, access method, commands, register usage, and name(s) and values here.]

2.4 Communications Functions

2.4.1 Serial Communications

[Insert the serial communications characteristics, specifications, attributes, functions, channel or port connections, register usage, timer definitions, and name(s) and values here.]
2.4.2 Insert Other Communications
[Insert any other communications characteristics, specifications, attributes, functions, channel or port connections, register usage, timer definitions, and name(s) and values here.]
2.5 Self-Test Descriptions

2.5.1 Logic Board

[Insert self-test definitions of the logic board, delineating test designator, test functions, performance level, scope of failure, and name(s) and values here.]

2.5.2 Analog Board

[Insert self-test definitions of the analog board, delineating test designator, test functions, performance level, scope of failure, and name(s) and values here.]

2.5.3 Displays

[Insert self-test definitions of the graphics, LCD, and LED displays, delineating test designator, test functions, performance level, scope of failure, and name(s) and values here.]

2.5.4 Switches

[Insert self-test definitions of the switches, delineating test designator, test functions, performance level, scope of failure, and name(s) and values here.]

2.5.5 Sensors

[Insert self-test definitions of the sensors, delineating test designator, test functions, performance level, scope of failure, and name(s) and values here.]
2.5.6 Motor(s)

[Insert self-test definitions of the motor(s), delineating test designator, test functions, performance level, scope of failure, and name(s) and values here.]

2.6 Port, Register, and Data Definitions

2.6.1 CPU Internal Registers

[Insert internal peripheral control block ports, delineating register label, I/O address, reset state, initial value, and name(s) here.]

2.6.2 Ports’ and Registers’ Bit Assignments

[Insert bit assignments for ports, channels, system status, and registers, delineating field name; bit position numbers; functions; in, out, or read back designation; and active state here.]

2.6.3 Serial Communications Controller Data Register Definitions

[Insert serial communications controller transmit, receive, and data register definitions, delineating command, response to command, operation, register name, and values here.]

2.6.4 Serial Communications Controller Data Register Definitions for Peripherals, Displays, LCDs, and LEDs

[Insert serial communications controller data for peripheral, display, LCD, and LED transmit, receive, and data register definitions, delineating command, response to command, operation, register name, and values here.]

2.6.5 Serial Communications Controller Data Register Definitions for Keyboard Operations

[Insert serial communications controller data for keyboard operations, delineating command, operation code, number of data bytes, command description, and values here.]
2.6.6 Serial Communications Controller Data Register Definitions for EEPROM Operations

[Insert serial communications controller data for EEPROM operations, delineating operation, register, and values here.]
3.0 SOFTWARE-TO-SOFTWARE INTERFACES
3.1 Mailbox Design

[Insert mailbox structure definitions, message definitions, and message values and types here.]

3.2 Global Declarations

[Insert global definitions, structures, and values here.]

3.3 Global Data Variables

[Insert names, descriptions, and values of global data, structures, parameters, and constants here.]

3.4 Task Data Variables

[Insert names, descriptions, and values of task local data, structures, parameters, and constants here.]

3.5 Interrupt Service Parameters

[Insert names, descriptions, and values of interrupt service data, structures, parameters, and constants here.]
3.6 Driver and Controller Data

[Insert names, descriptions, and values of driver and controller data, structures, parameters, and constants here.]

3.7 Macro Definitions

[Insert macro definitions for inputs, processing, outputs, and related data here.]

3.8 Critical Data and Parameters

[Insert names, descriptions, locations, functions, and values of critical data, parameters, constants, errors, warnings, and structures here.]
GLOSSARY

Critical software: Software whose failure could have an impact on safety.

Criticality: Classification of a software error or fault based upon an evaluation of the degree of impact of that error or fault on the development or operation of a system.

Error: Discrepancy between a computed, observed, or measured value or condition and the true, specified, or theoretically correct value or condition.

Safety: Provision of a very high degree of freedom, within the constraints of system effectiveness and cost, from those conditions that can cause death, injury, occupational illness, or damage to or loss of equipment or property.

Safety critical indicators: Visual or audible indicators of the condition of the system.

Safety critical parameters: System parameters within which the system must operate in order to be safe.
[Project/Product Name] RTM REQUIREMENTS TRACEABILITY MATRIX
Written by:
[Name/Title/Position]
Date
Reviewed by:
[Name/Title/Position]
Date
Approved by:
[Name/Title/Position]
Date
Document Number [aaa]-RTM-[#.#]    Revision [#.#]    Page 1 of [#]
REVISION HISTORY

Revision    Description    Date
[##.##]    [Revision description]    [mm/dd/yy]
REQUIREMENTS TRACEABILITY MATRIX

Requirement Number | Higher Level Requirement Description | SRS Paragraph Number | DDS Paragraph Number | Test Requirement Number | Software Component(s) | Test Number(s) | Verification Method(s) | Test Results Pass/Fail
 | | | | | | | | P / F
 | | | | | | | | P / F
 | | | | | | | | P / F
[Project/Product Name] SADS SOFTWARE ARCHITECTURE DESIGN SPECIFICATION
Written by:
[Name/Title/Position]
Date
Reviewed by:
[Name/Title/Position]
Date
Approved by:
[Name/Title/Position]
Date
Document Number [aaa]-SADS-[#.#]    Revision [#.#]    Page 1 of [#]
REVISION HISTORY

Revision    Description    Date
[##.##]    [Revision description]    [mm/dd/yy]
CONTENTS
1.0 INTRODUCTION
2.0 ARCHITECTURE AND TOP-LEVEL DESIGN
3.0 NOTES
APPENDIX A Allocation of Requirements Map
GLOSSARY
1.0 INTRODUCTION
1.1 Purpose

This document specifies the top-level architectural software design for the [project/product name] project.

1.2 Scope

This document applies to all of the software that is to be developed for and reside in the [project/product name] product.

1.3 Overview

The [project/product name] product is a [insert high-level description of product here]. This version of the product implements the following features: [insert product feature descriptions here]. The software is responsible for the instrument’s functionality, user interface, safety checks, and performance accuracy. This document specifies the decomposition of the [project/product name] product software into physically implementable tasks that perform the various functions of the product. The allocation of the software requirements that are described in the [project/product name] Software Requirements Specification (SRS) is presented in the Allocation of Requirements Map in Appendix A.

1.4 Referenced Documents

The following documents of the exact issue shown form a part of this specification to the extent specified herein. In the event of conflict between the documents referenced herein and the content of this specification, the content of this specification shall be considered a superseding requirement.
1.4.1 Project Specifications

• [project/product name] Interface Design Specification, Document Number [aaa]-IDS-[#.#], Revision [#.#], dated [date]
• [project/product name] Product Objectives Document, Document Number [aaa]-POD-[#.#], Revision [#.#], dated [date]
• [project/product name] Product Requirements Document, Document Number [aaa]-PRD-[#.#], Revision [#.#], dated [date]
• [project/product name] Software Configuration Management Plan, Document Number [aaa]-CMP-[#.#], Revision [#.#], dated [date]
• [project/product name] Software Development Plan, Document Number [aaa]-SDP-[#.#], Revision [#.#], dated [date]
• [project/product name] Software Development Test Plan, Document Number [aaa]-DTP-[#.#], Revision [#.#], dated [date]
• [project/product name] Software End-product Acceptance Plan, Document Number [aaa]-EAP-[#.#], Revision [#.#], dated [date]
• [project/product name] Software Quality Assurance Plan, Document Number [aaa]-QAP-[#.#], Revision [#.#], dated [date]
• [project/product name] Software Requirements Specification, Document Number [aaa]-SRS-[#.#], Revision [#.#], dated [date]
• [project/product name] Software Verification and Validation Plan, Document Number [aaa]-VVP-[#.#], Revision [#.#], dated [date]

1.4.2 Procedures and Guidelines
• Product Development Safety Design Guidelines, Revision [#.#], dated [date]
• Product Development User Interface Design Guidelines, Revision [#.#], dated [date]
• Software Engineering Configuration Management Guidelines, Revision [#.#], dated [date]
• Software Engineering Software Development Guidelines, Revision [#.#], dated [date]
• Software Engineering Software Configuration Management Policies, Revision [#.#], dated [date]
• Software Engineering Software Development Policies, Revision [#.#], dated [date]
• Software Engineering Verification and Validation Guidelines, Revision [#.#], dated [date]
• Software Engineering Verification and Validation Policies, Revision [#.#], dated [date]
2.0 ARCHITECTURE AND TOP-LEVEL DESIGN
2.1 Architecture

The processor diagram for the [project/product name] software, shown in Figure 1, comprises [number of tasks] functional tasks, as shown in the data flow diagram of Figure 2. Detailed descriptions of these data flows are contained in the [project/product name] Interface Design Specification (IDS). Each functional task contains individual physical tasks that accomplish the various functions of the [project/product name] product. These tasks are executed under the control of the [insert operating system, executive, or driver system software here].
Figure 1
[project/product name] Processor or Context Diagram
[Insert processor or context diagram here]
Figure 2
[project/product name] Data Flow Diagram of Functional Mode of Operation
[Insert data flow diagram here]
2.2 Functional Allocation

The allocation of the [project/product name] SRS software requirements to the various functional and physical tasks is presented in Appendix A.
2.3 Memory and Processing Time Allocation

The software architecture described in this document supports the following memory and processing time (an illustrative budget-check sketch follows this list):

a. The software fits within [#] bytes of ROM, which begins at address [#].
b. Data storage consists of [#] bytes of RAM, which begins at address [#].
c. Nonvolatile data storage is provided by [#] bit words of [#] word EEPROM, which is addressed [access] via I/O port [port name].
d. The complete cycle of active failure checks is completed in less than the system critical time.
e. The complete cycle of passive failure checks is completed at the beginning of each operational window, which is not less frequent than the system power cycle.
f. Indicator functions are completed within a full duty cycle time of [#] msec.
[g. Insert any additional memory and processing time characteristics or specifications here.]
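For illustration only, the following C sketch shows one way the ROM budget in item (a) might be checked; the linker symbol names (_rom_start, _rom_end) and the budget constant are hypothetical placeholders rather than part of this template.

#include <stddef.h>
#include <stdint.h>

#define ROM_BUDGET_BYTES 0x8000u           /* stands in for the [#] bytes of ROM   */

extern const uint8_t _rom_start[];         /* hypothetical linker-provided symbols */
extern const uint8_t _rom_end[];

/* Returns nonzero if the linked image exceeds its ROM budget; a project
   would typically latch this as a fault during the power-up self-test. */
int rom_budget_exceeded(void)
{
    return (size_t)(_rom_end - _rom_start) > ROM_BUDGET_BYTES;
}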
2.4 Functional Control and Data Flow

Functional control flows and data flows between the [project/product name] functional tasks are shown in Figure 2, and a detailed composition of the control and data flows is given in the [project/product name] IDS.

2.5 Global Data

Global data are shown as “stores” in Figure 2, and detailed descriptions of the global data are given in the [project/product name] IDS.
2.6 Top-Level Design

This section describes the functional organization and decomposition of the [project/product name] software and the functional modes of operation shown in Figure 2. [For each functional task, include a completed section 2.6.n per the outline below.]
2.6.n [functional task name] Functional Task
The [functional task name] functional task is responsible for [insert the functional task physical process high-level description here]. The processes composing the [functional task name] functional task are shown in the data flow diagram, Figure 3, and its operation is shown in the state transition diagram, Figure 4.
Figure 3
[functional task name] Functional Task Data Flow Diagram
[Insert task data flow diagram here]
Figure 4
[functional task name] Functional Task State Transition Diagram
[Insert task state transition diagram here]
2.6.n.1 Inputs. The inputs to the [functional task name] functional task are shown by the data flows in Figure 3, and detailed descriptions of these data are given in the [project/product name] IDS.

2.6.n.2 Global, Critical, and Task Data. Global data, critical data, and task data shared within and specific to the [functional task name] functional task are detailed in the [project/product name] IDS.

2.6.n.3 Interrupts. [Insert the names and processing associated with each interrupt handled by the functional task or indicate that it does not process any hardware-generated interrupts.]
2.6.n.4 Timing, Sequencing, and Activations. [Insert timing, task sequence and activation, and tasks that are activated and deactivated by the functional task, here.]
2.6.n.5 Processing. [For each physical task within the functional task, include the following completed paragraph.] The [insert physical task name] task is responsible for [insert a paragraph description for each physical task implemented within the functional task and include the following:

• Operating or processing logic
• Testing relative to safety, faults, failures, hazards, states, or system status
• Enable and disable decisions
• Critical software
• Error, warning, or alert message processing
• Display, sensor, and indicator processing]

The operation of the [physical task name] task is shown in the state transition diagram of Figure 5.
Figure 5
[physical task name] State Transition Diagram
[Insert physical task state transition diagram here]
2.6.n.6 Outputs. The outputs from the [functional task name] functional task are shown by the data flows in Figure 3, and detailed descriptions of these data are given in the [project/product name] IDS.
3.0 NOTES
[Insert here any notes, comments, observations, processing, timelines, algorithms, or special processing conditions, states, or status that are needed to augment, explain, or amplify the design specified above.]
APPENDIX A ALLOCATION OF REQUIREMENTS MAP

SRS Paragraph Number | SRS Paragraph Title or Description | SADS Document Paragraph Number | Process Number or Description
[#.#.#.#.#] | [Title or description] | N/A | [Description]
[#.#.#.#.#] | [Title or description] | [#.#.#] | [Description]
[#.#.#.#.#] | [Title or description] | [#.#.#] | [Description]
[#.#.#.#.#] | [Title or description] | [#.#.#] | [Description]
GLOSSARY

Acceptance criteria: Criteria that a software end product must meet to successfully complete a test phase or meet delivery requirements.

Accuracy: Quantitative assessment of freedom from error.

Critical software: Software whose failure could have an impact on safety.

Criticality: Classification of a software error or fault based upon an evaluation of the degree of impact of that error or fault on the development or operation of a system.

Documentation: Manuals, written procedures or policies, records, or reports that provide information concerning uses, maintenance, or validation of software.

Error: Discrepancy between a computed, observed, or measured value or condition and the true, specified, or theoretically correct value or condition.

Failure: Inability of a system or system component to perform its required function (see fault).

Fault: Defect of a system or system component, caused by a defective, missing, or extraneous instruction or set of related instructions in the definition, specification, design, or implementation of a system, that may lead to a failure.

Hazard: Dangerous state of a device or system that may lead to death, injury, occupational illness, or damage to or loss of equipment or property.

Interface Design Specification (IDS): Project-specific document that completely specifies the interfaces among the subsystems.

Reliability: Ability of an item to perform a required function under stated conditions for a stated period of time.

Safety: Provision of a very high degree of freedom, within the constraints of system effectiveness and cost, from those conditions that can cause death, injury, occupational illness, or damage to or loss of equipment or property.

Safety critical indicators: Visual or audible indicators of the condition of the system.

Safety critical parameters: System parameters within which the system must operate in order to be safe.

Software: Computer programs, procedures, rules, and associated documentation and data pertaining to the operation of a computer system.

Software reliability: Probability that software will not cause the failure of a system for a specified time under specified conditions.
[Project/Product Name] SDDS SOFTWARE DETAILED DESIGN SPECIFICATION
Written by:
[Name/Title/Position]
Date
Reviewed by:
[Name/Title/Position]
Date
Approved by:
[Name/Title/Position]
Date
Document Number [aaa]-SDDS-[#.#]    Revision [#.#]    Page 1 of [#]
REVISION HISTORY

Revision    Description    Date
[##.##]    [Revision description]    [mm/dd/yy]
CONTENTS
1.0 INTRODUCTION
2.0 ARCHITECTURE AND TOP-LEVEL DESIGN
3.0 DETAILED DESIGN
4.0 NOTES
APPENDIX A Allocation of Requirements Map
APPENDIX B Task Data Flow and Control Flow Diagrams
APPENDIX C Task State Transition Matrices
APPENDIX D Task Structure Charts
APPENDIX E Task Data Structure Charts
GLOSSARY
1.0 INTRODUCTION
1.1 Purpose

This document specifies the low-level software design for the [project/product name] project.

1.2 Scope

This document applies to all of the software that is to be developed for and reside in the [project/product name] product.

1.3 Overview

The [project/product name] product is a [insert high-level description of product here]. This version of the product implements the following features: [insert product feature descriptions here]. The software is responsible for the instrument’s functionality, user interface, safety checks, and performance accuracy. This document specifies the decomposition of the [project/product name] product software into physically implementable tasks that perform the various functions of the product. The allocation of the software requirements that are described in the [project/product name] Software Requirements Specification (SRS) is presented in the Allocation of Requirements Map in Appendix A.

1.4 Referenced Documents

The following documents of the exact issue shown form a part of this specification to the extent specified herein. In the event of conflict between the documents referenced herein and the content of this specification, the content of this specification shall be considered a superseding requirement.
1.4.1 Project Specifications

• [project/product name] Interface Design Specification, Document Number [aaa]-IDS-[#.#], Revision [#.#], dated [date]
• [project/product name] Software Architecture Design Specification, Document Number [aaa]-SADS-[#.#], Revision [#.#], dated [date]
• [project/product name] Software Configuration Management Plan, Document Number [aaa]-CMP-[#.#], Revision [#.#], dated [date]
• [project/product name] Software Development Plan, Document Number [aaa]-SDP-[#.#], Revision [#.#], dated [date]
• [project/product name] Software Development Test Plan, Document Number [aaa]-DTP-[#.#], Revision [#.#], dated [date]
• [project/product name] Software End-product Acceptance Plan, Document Number [aaa]-EAP-[#.#], Revision [#.#], dated [date]
• [project/product name] Software Quality Assurance Plan, Document Number [aaa]-QAP-[#.#], Revision [#.#], dated [date]
• [project/product name] Software Requirements Specification, Document Number [aaa]-SRS-[#.#], Revision [#.#], dated [date]
• [project/product name] Software Verification and Validation Plan, Document Number [aaa]-VVP-[#.#], Revision [#.#], dated [date]

1.4.2 Procedures and Guidelines
• Product Development Safety Design Guidelines, Revision [#.#], dated [date]
• Product Development User Interface Design Guidelines, Revision [#.#], dated [date]
• Software Engineering Configuration Management Guidelines, Revision [#.#], dated [date]
• Software Engineering Software Development Guidelines, Revision [#.#], dated [date]
• Software Engineering Software Configuration Management Policies, Revision [#.#], dated [date]
• Software Engineering Software Development Policies, Revision [#.#], dated [date]
• Software Engineering Verification and Validation Guidelines, Revision [#.#], dated [date]
• Software Engineering Verification and Validation Policies, Revision [#.#], dated [date]
2.0 ARCHITECTURE AND TOP-LEVEL DESIGN
2.1 Architecture

The processor diagram for the [project/product name] software, shown in Figure 1, comprises [number of tasks] functional modes of operation as shown in the data flow diagram of Figure 2. Detailed descriptions of these data flows are contained in the [project/product name] Interface Design Specification (IDS). Each functional mode of operation contains individual physical tasks that accomplish the various functions of the [project/product name] product. These tasks are executed under the control of the [insert operating system, executive, or driver system software here].
Figure 1
[project/product name] Processor or Context Diagram
[Insert processor or context diagram here]
Figure 2
[project/product name] Data Flow Diagram of Functional Mode of Operation
[Insert data flow diagram here]
2.2 Functional Allocation

The allocation of the [project/product name] SRS software requirements to the various functional and physical tasks is presented in Appendix A.
2.3 Memory and Processing Time Allocation

This section describes the timing and sizing of the [project/product name] software.
2.3.1 Memory Allocation

The software design described in this document supports the following memory allocation:

a. The software fits within [#] bytes of ROM, which begins at address [#].
b. Data storage consists of [#] bytes of RAM, which begins at address [#].
c. Nonvolatile data storage is provided by [#] bit words of [#] word EEPROM, which is addressed [access] via I/O port [port name].
[d. Insert any additional memory characteristics or specifications here.]

2.3.2 Timing Allocation

The software design described in this document supports the following processing time:

a. The complete cycle of active failure checks is completed in less than the system critical time.
b. The complete cycle of passive failure checks is completed at the beginning of each operational window, which is not less frequent than the system power cycle.
c. Indicator functions are completed within a full duty cycle time of [#] msec.
[d. Insert any additional processing time characteristics or specifications here.]

2.3.3 Processing Timeline
The [project/product name] processing timeline and task priorities are shown in Figure 3. This diagram reflects a simplified sequence of task execution, which begins with [insert initiating action for task execution sequence]. The task list on the left of the diagram is in priority order from highest to lowest. All tasks are scheduled by [insert operating system, executive, or driver name here], which runs at the highest priority. For the following discussion, the [project/product name] is in the [insert functional mode of operation name here]. [Insert description of processing timeline from initial task through final task. Include task names, postings, interrupts, timers, timing windows, messages, and hardware interfaces.]
The following tasks and functions are not shown in the diagram and are performed after the above processing has concluded. [Insert any additional processing tasks and functions not covered above, such as watchdog processing, safety processing, and any tasks or functions that must be performed before the next processing cycle.]
Figure 3

[project/product name] Processing Timeline and Task Priority

[Insert processing timeline and task priority diagram here]
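A priority-ordered task set of the kind described in Section 2.3.3 is often captured in a static table; the following C sketch is illustrative only, and the task names and entry points are hypothetical.

#include <stdint.h>

typedef void (*task_entry_t)(void);

typedef struct {
    const char  *name;        /* task name as it appears on the diagram  */
    uint8_t      priority;    /* 0 = highest, matching the list order    */
    task_entry_t entry;       /* task entry point                        */
} task_desc_t;

extern void motor_task(void);             /* hypothetical task entry points */
extern void display_task(void);

static const task_desc_t task_table[] = {
    { "MOTOR",   0u, motor_task   },      /* highest application priority   */
    { "DISPLAY", 1u, display_task },      /* next priority down             */
};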
2.4 Functional Control and Data Flow

Functional control flows and data flows between the [project/product name] functional modes of operation are shown in Figure 2, and a detailed composition of the control and data flows is given in the [project/product name] IDS.

2.5 Global Data

Global data are shown as “stores” in Figure 2, and detailed descriptions of the global data are given in the [project/product name] IDS.

2.6 Top-Level Design

This section describes the functional organization and decomposition of the [project/product name] software and the functional modes of operation shown in Figure 2. [For each functional mode of operation include a completed section 2.6.n per the outline below.]
2.6.n [functional mode of operation name] Functional Mode of Operation

The [functional mode of operation name] functional mode of operation is responsible for [insert the functional task physical process high-level description here]. The processes composing the [functional mode of operation name] functional mode of operation are shown in the data flow diagram of Figure 4, and its operation is shown in the state transition diagram of Figure 5.
Figure 4
[functional task name] Functional Task Data Flow Diagram
[Insert functional task data flow diagram here]
Figure 5
[functional task name] Functional Task State Transition Diagram
[Insert functional task state transition diagram here]
2.6.n.1 Inputs. The inputs to the [functional task name] functional task are shown by the data flows in Figure 4, and detailed descriptions of these data are given in the [project/product name] IDS.

2.6.n.2 Global, Critical, and Task Data. Global data, critical data, and task data shared within and specific to the [functional task name] functional task are detailed in the [project/product name] IDS.

2.6.n.3 Interrupts. [Insert the names and processing associated with each interrupt handled by the functional task, or indicate that it does not process any hardware-generated interrupts.]
2.6.n.4 Timing, Sequencing, and Activations. [Insert any timing, task sequencing, task activation, tasks that are activated by the functional task, and task deactivation here.]
2.6.n.5 Processing. [For each physical task within the functional task, include the following completed paragraph.] The [insert physical task name] task is responsible for [insert a paragraph description for each physical task implemented within the functional task and include operating or processing logic; testing relative to safety, faults, failures, hazards, states, or system status; enable and disable decisions; critical software; error, warning, or alert message processing; and display, sensor, and indicator processing.] The operation of the [physical task name] task is shown in the state transition diagram of Figure 6.
Figure 6

[physical task name] State Transition Diagram
[Insert physical task state transition diagram here]
2.6.n.6 Outputs. The outputs from the [functional task name] functional task are shown by the data flows in Figure 4, and detailed descriptions of these data are given in the [project/product name] IDS.

2.6.[n+1] Shared Functions Task. The Shared Functions Task is the collection of [project/product name] software functions that are used by two or more of the physically executable tasks described above.
3.0 DETAILED DESIGN
3.1 Design Conventions and Directives

3.1.1 Naming Conventions
The [project/product name] software will adhere to the naming conventions documented in the SE Programming Guidelines. [Insert any additions, deviations, or different implementations here.]
3.1.2 Programming Conventions
The [project/product name] software uses the [insert operating system, executive, or driver name here], the [insert programming language to be used here], and will adhere to the programming conventions and practices documented in the SE Programming Guidelines. [Insert any additions, deviations, or different conventions, practices, or implementations here.]
Copyright © 2002 Interpharm Press
Page 9 of 19 SDDS
560
Software Quality Assurance SOPs for Healthcare Manufacturers
3.1.3 Compiler Directives

The [project/product name] software will adhere to the compiler conventions documented in the SE Programming Guidelines. [Insert any additions, deviations, or different conventions here.]

3.1.4 Assembler Directives

The [project/product name] software will adhere to the assembler conventions documented in the SE Programming Guidelines. [Insert any additions, deviations, or different conventions here.]

3.1.5 Linker, Loader, and Locator Directives

The [project/product name] software will adhere to the linker, loader, and locator conventions documented in the SE Programming Guidelines. [Insert any additions, deviations, or different conventions here.]

3.1.6 MAKE File Directives

The [project/product name] software will adhere to the MAKE file conventions documented in the SE Programming Guidelines. [Insert any additions, deviations, or different conventions here.]

3.1.7 Configuration Management Directives

The [project/product name] software will adhere to the configuration management conventions documented in the SE Programming Guidelines. [Insert any additions, deviations, or different implementations here.]

3.1.8 Macro File Directives

The [project/product name] software will adhere to the macro file conventions documented in the SE Programming Guidelines. [Insert any additions, deviations, or different implementations here.]

3.1.9 Utility Tool Directives

[Insert any directives here.]
3.2 Data Design

3.2.1 Mailbox Data Design
The mailboxes contain the data that are passed across the mailbox interface and are used as initiation or activation mechanisms. Mailboxes consist of a structure and a valid message. The mailbox structure definitions are contained in [insert generic file names and directory locations]. The mailbox messages are categorized as status messages or service request messages. Status messages are defined as messages broadcast by a task to indicate the status or state of the task. Service request messages are defined as messages posted from one task to another task that request a service to be performed. Valid status message and service request message definitions are contained in [insert generic file names and directory locations]. The definitions of all valid messages that can be received by a task are contained in [insert generic file names and directory locations].
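By way of illustration, a minimal C sketch of the mailbox structure and the status versus service request message taxonomy described above; every name shown is hypothetical, and an actual project would use the definitions in the files cited in this section.

#include <stdint.h>

typedef enum {
    MSG_CLASS_STATUS,            /* broadcast by a task to report its state  */
    MSG_CLASS_SERVICE_REQUEST    /* posted task-to-task to request a service */
} msg_class_t;

typedef struct {
    msg_class_t msg_class;       /* status or service request                */
    uint16_t    msg_id;          /* one of the valid message definitions     */
    uint16_t    sender;          /* identifier of the posting task           */
    uint32_t    value;           /* message value, defined per message ID    */
} mailbox_msg_t;

typedef struct {
    volatile uint8_t full;       /* initiation/activation flag               */
    mailbox_msg_t    msg;        /* data passed across the mailbox interface */
} mailbox_t;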
3.2.2 Global Data Stores
Global data stores are data stores common to more than one of the [project/product name] tasks. Each global data store is uniquely identified by [insert a discussion of how the global data are identified by name]. Global data require a [insert lock and unlock or semaphore mechanism here] to access the data values. Data structures are uniquely identified by [insert a discussion of how the data structures are identified by name].
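An illustrative C sketch of lock-protected access to one such global data store follows; sem_take and sem_give are hypothetical stand-ins for whatever lock and unlock or semaphore mechanism the operating system or executive actually provides.

#include <stdint.h>

extern void sem_take(uint8_t sem_id);     /* hypothetical OS lock primitive   */
extern void sem_give(uint8_t sem_id);     /* hypothetical OS unlock primitive */

#define SEM_PUMP_RATE 3u                  /* hypothetical store identifier    */

static volatile uint32_t g_pump_rate_ml_per_hr;   /* the global data store    */

uint32_t pump_rate_read(void)
{
    uint32_t value;
    sem_take(SEM_PUMP_RATE);              /* lock before touching the value   */
    value = g_pump_rate_ml_per_hr;
    sem_give(SEM_PUMP_RATE);              /* unlock                           */
    return value;
}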
3.2.3 Global Declarations

[Insert a discussion of global declarations, how they are used, what they are used for, file names, and directory locations.]

3.2.4 Global Data Variables

[Insert a discussion of global data variables, how they are used, what they are used for, file names, and directory locations.]

3.2.5 Global Data Constants

[Insert a discussion of global data constants, how they are used, what they are used for, file names, and directory locations.]
3.3 Detailed Design

This section presents the detailed design of the [project/product name] software tasks. The tasks are presented in alphabetical order rather than in the order of their functional mode of operation. [For each functional task indicated in Section 2.6.n, include a completed section 3.3.n per the outline below.]
3.3.n [functional task name] Task

The [functional task name] software files are contained in the directory [insert directory path]. The [functional task name] data flow and control flow diagrams are shown in Appendix B; state transition matrices are shown in Appendix C; structure chart(s) are shown in Appendix D; and data structure charts are shown in Appendix E.

3.3.n.1 [functional task software_name] Task Description. [Insert a discussion of the task, how it is implemented, when and how it is activated, and interfaces to other tasks.]
3.3.n.2 [functional task software_name] Functional Design. [Insert a discussion of the task logic and any special processing conditions or states; functions called; errors, warnings, and alarms processing; hardware interfaces; safety processing; critical parameters used; data accessed and interfaces to other tasks.]
3.3.[n+1] Shared Functions Task. The Shared Functions Task software files are contained in the directory [insert directory path]. The Shared Functions Task consists of those functions that are shared, executed, or used by two or more of the physical tasks discussed above. [For each shared functions task, include a completed section 3.3.n+1 per the outline below.]
3.3.[n+1].1 [functional task name] Description. [Insert a discussion of the function’s logic and any special processing conditions or states; functions called; errors, warnings, and alarms processing; hardware interfaces; safety processing; critical parameters used; data accessed, and interfaces to other tasks.]
3.4 Libraries Used

This section contains a description of each library used in the [project/product name] software. [For each library, include a completed section 3.4.n per the outline below.]
3.4.n [library name] Library Description
[Insert a discussion of the library; functions used if not all are used; errors, warnings, and alarms issued; how the software interfaces with the library; and directory paths of all required files.]
3.5 Macros Used

This section contains a description of each macro used in the [project/product name] software. [For each macro, include a completed section 3.5.n per the outline below.]
3.5.n [macro name] Macro Description
[Insert a discussion of the macro’s logic and any special processing conditions or states; functions called; errors, warnings, and alarms processing; hardware interfaces; safety processing; critical parameters used; data accessed, and interfaces to other tasks.]
4.0 NOTES
[Insert here any notes, comments, observations, timelines, algorithms or special processing conditions, states, or status that are needed to augment, explain, or amplify the design specified above.]
APPENDIX A ALLOCATION OF REQUIREMENTS MAP

SRS Paragraph Number | SRS Paragraph Title or Description | SDDS Document Paragraph Number | Process Number or Description
[#.#.#.#.#] | [Title or description] | N/A | [Description]
[#.#.#.#.#] | [Title or description] | [#.#.#] | [Description]
[#.#.#.#.#] | [Title or description] | [#.#.#] | [Description]
[#.#.#.#.#] | [Title or description] | [#.#.#] | [Description]
APPENDIX B TASK DATA FLOW AND CONTROL FLOW DIAGRAMS
[Insert all data flow and control flow diagrams here.]
APPENDIX C TASK STATE TRANSITION MATRICES
[Insert all state transition matrices here.]
APPENDIX D TASK STRUCTURE CHARTS
[Insert all task structure charts here.]
APPENDIX E TASK DATA STRUCTURE CHARTS
[Insert all task data structure charts here.]
GLOSSARY

Accuracy: Quantitative assessment of freedom from error.

Critical software: Software whose failure could have an impact on safety.

Criticality: Classification of a software error or fault based upon an evaluation of the degree of impact of that error or fault on the development or operation of a system.

Documentation: Manuals, written procedures or policies, records, or reports that provide information concerning uses, maintenance, or validation of software.

Error: Discrepancy between a computed, observed, or measured value or condition and the true, specified, or theoretically correct value or condition.

Failure: Inability of a system or system component to perform its required function (see fault).

Fault: Defect of a system or system component, caused by a defective, missing, or extraneous instruction or set of related instructions in the definition, specification, design, or implementation of a system, that may lead to a failure.

Hazard: Dangerous state of a device or system that may lead to death, injury, occupational illness, or damage to or loss of equipment or property.

Interface Design Specification (IDS): Project-specific document that completely specifies the interfaces among the subsystems.

Reliability: Ability of an item to perform a required function under stated conditions for a stated period of time.

Safety: Provision of a very high degree of freedom, within the constraints of system effectiveness and cost, from those conditions that can cause death, injury, occupational illness, or damage to or loss of equipment or property.

Safety critical indicators: Visual or audible indicators of the condition of the system.

Safety critical parameters: System parameters within which the system must operate in order to be safe.

Software: Computer programs, procedures, rules, and associated documentation and data pertaining to the operation of a computer system.

Software reliability: Probability that software will not cause the failure of a system for a specified time under specified conditions.
[Project/Product Name] SRS SOFTWARE REQUIREMENTS SPECIFICATION
Written by:
[Name/Title/Position]
Date
Reviewed by:
[Name/Title/Position]
Date
Approved by:
[Name/Title/Position]
Date
Document Number [aaa]-SRS-[#.#]    Revision [#.#]    Page 1 of [#]
REVISION HISTORY

Revision    Description    Date
[##.##]    [Revision description]    [mm/dd/yy]
CONTENTS
1.0 INTRODUCTION
2.0 REQUIREMENTS
3.0 QUALIFICATION REQUIREMENTS
4.0 PREPARATION FOR DELIVERY
APPENDIX A Allocation of Requirements Map
APPENDIX B Qualification Criteria
GLOSSARY
1.0 INTRODUCTION
1.1 Purpose

This document establishes the software requirements for the [project/product name] project.

1.2 Scope

This document applies to all of the software that is to be developed for and reside in the [project/product name] product.

1.3 Overview

The [project/product name] product is a [insert high-level description of product here]. This version of the product implements the following features: [insert product feature descriptions here]. The software is responsible for the instrument’s functionality, user interface, safety checks, and performance accuracy. This document establishes and specifies the design, interface, functional performance, and qualification requirements of the [project/product name] product software. The allocation of the software requirements that are described in the [project/product name] Product Requirements Document (PRD) is presented in the Allocation of Requirements Map in Appendix A.

1.4 Referenced Documents

The following documents of the exact issue shown form a part of this specification to the extent specified herein. In the event of conflict between the documents referenced herein and the content of this specification, the content of this specification shall be considered a superseding requirement.
1.4.1 Project Specifications

• [project/product name] Interface Design Specification, Document Number [aaa]-IDS-[#.#], Revision [#.#], dated [date]
• [project/product name] Product Objectives Document, Document Number [aaa]-POD-[#.#], Revision [#.#], dated [date]
• [project/product name] Product Requirements Document, Document Number [aaa]-PRD-[#.#], Revision [#.#], dated [date]
• [project/product name] Software Configuration Management Plan, Document Number [aaa]-CMP-[#.#], Revision [#.#], dated [date]
• [project/product name] Software Development Plan, Document Number [aaa]-SDP-[#.#], Revision [#.#], dated [date]
• [project/product name] Software Development Test Plan, Document Number [aaa]-DTP-[#.#], Revision [#.#], dated [date]
• [project/product name] Software End-product Acceptance Plan, Document Number [aaa]-EAP-[#.#], Revision [#.#], dated [date]
• [project/product name] Software Quality Assurance Plan, Document Number [aaa]-QAP-[#.#], Revision [#.#], dated [date]
• [project/product name] Software Verification and Validation Plan, Document Number [aaa]-VVP-[#.#], Revision [#.#], dated [date]

1.4.2 Procedures and Guidelines
• Product Development Safety Design Guidelines, Revision [#.#], dated [date]
• Product Development User Interface Design Guidelines, Revision [#.#], dated [date]
• Software Engineering Configuration Management Guidelines, Revision [#.#], dated [date]
• Software Engineering Software Development Guidelines, Revision [#.#], dated [date]
• Software Engineering Software Configuration Management Policies, Revision [#.#], dated [date]
• Software Engineering Software Development Policies, Revision [#.#], dated [date]
• Software Engineering Verification and Validation Guidelines, Revision [#.#], dated [date]
• Software Engineering Verification and Validation Policies, Revision [#.#], dated [date]
2.0 REQUIREMENTS
2.1 Design Requirements

2.1.1 Memory Requirements

The [project/product name] software shall meet the following memory requirements:

a. The software shall fit within [#] bytes of ROM, which begins at address [#].
b. Data storage shall consist of [#] bytes of RAM, which begins at address [#].
c. Nonvolatile data storage shall be provided by [#] bit words of [#] word EEPROM, which is addressed [access] via I/O port [port name].
[d. Insert any additional memory characteristics or specifications here.]

2.1.2 Timing Requirements

The [project/product name] software shall meet the following processing timeline requirements:

a. The complete cycle of active failure checks shall be completed in less than the system critical time.
b. The complete cycle of passive failure checks shall be completed at the beginning of each operational window, which is not less frequent than the system power cycle.
c. Indicator functions shall be completed within a full duty cycle time of [#] msec.
[d. Insert any additional processing time characteristics or specifications here.]

2.1.3 Design Standards
The [project/product name] software shall meet the software design standards and procedures of the Software Engineering Software Development Policies and Software Engineering Software Development Guidelines, respectively.
2.1.4 Audio

All audio tones shall have a loudness of at least [insert level] dB. All frequencies shall be within [insert percent range]% of the specified value. All durations and intervals shall be within [insert millisecond range] msec of the specified values.
2.1.5 Safety

Critical information shall be stored in a modified form that carries the same information as the primary copy, so that corruption can be detected. [Insert any additional safety requirements here.]
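One common way to meet this requirement is to store each critical value together with its one’s complement; the following C sketch is illustrative only, and all names are hypothetical.

#include <stdint.h>

typedef struct {
    uint16_t value;     /* the critical information                  */
    uint16_t check;     /* modified form: one's complement of value  */
} critical_u16_t;

void critical_store(critical_u16_t *slot, uint16_t v)
{
    slot->value = v;
    slot->check = (uint16_t)~v;           /* redundant, modified copy */
}

/* Returns nonzero if the stored pair no longer agrees (corruption). */
int critical_corrupted(const critical_u16_t *slot)
{
    return slot->value != (uint16_t)~slot->check;
}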
2.1.6 Display
[Insert display requirements here.]
2.1.7 Remote Communications
The instrument shall support [insert communications requirements specification here] communications up to [insert baud rate here] baud.
[2.1.n Requirement Area

Insert any additional design requirements here.]
2.2 Interface Requirements

2.2.1 Interface Relationships
The [project/product name] software shall support the interfaces shown in the context diagram of Figure 1.
Figure 1
[project/product name] Implementation Context Diagram
[Insert context diagram here]
2.2.2 Interface Identifications and Documentation

The [project/product name] software shall support the interfaces documented in the [project/product name] Interface Design Specification (IDS).
2.2.3 RAM Test

The RAM test interface shall provide a means to test the RAM hardware; a RAM error shall set off the watchdog alarm and stop the instrument.
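An illustrative C sketch of a destructive power-up RAM pattern test follows; the address range and the fault hook are hypothetical, and a real implementation would route a failure to the watchdog alarm as this requirement describes.

#include <stdint.h>

extern void watchdog_alarm_and_halt(void);   /* hypothetical fault hook */

void ram_test(volatile uint8_t *start, volatile uint8_t *end)
{
    static const uint8_t patterns[] = { 0x55u, 0xAAu, 0x00u, 0xFFu };
    for (unsigned i = 0u; i < sizeof patterns; i++) {
        for (volatile uint8_t *p = start; p < end; p++) {
            *p = patterns[i];             /* write the test pattern */
            if (*p != patterns[i]) {      /* read back and compare  */
                watchdog_alarm_and_halt();
            }
        }
    }
}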
2.2.4 EEPROM
The electronically erasable and programmable ROM (EEPROM) shall be used to store calibration and configuration data.
2.2.5 Interrupt System

The interrupt system provides event information from the timers, serial communications, ADC, and DMA. The interrupts used in the [project/product name] software are [insert list of interrupts].
2.2.6 Timers

There are [number] counter/timers internal to the processor. The timers used in the [project/product name] software are [insert list of timers and the functions they are used for].
2.2.7 Direct Memory Access
[Insert description of DMA and how it will be used.]
2.2.8 Wait State Generator
[Insert description of wait state generator and how it will be used.]
2.2.9 Watchdog
The watchdog timer shall be used to verify that the software is executing its functions in a time that is less than the system critical time. The watchdog interface provides the means to specify the watchdog interval, to periodically reset the watchdog, and to reset an alarm condition.
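The three watchdog operations named above might be exposed as in the following illustrative C sketch; the function names and the interval value are hypothetical.

#include <stdint.h>

extern void watchdog_set_interval(uint16_t msec);  /* specify the watchdog interval */
extern void watchdog_reset(void);                  /* periodic watchdog reset       */
extern void watchdog_clear_alarm(void);            /* reset an alarm condition      */

/* Typical use: configure once, then reset the watchdog from the main
   processing loop so that any stall longer than the system critical
   time trips the alarm. */
void main_loop(void)
{
    watchdog_set_interval(250u);   /* must be less than the system critical time */
    for (;;) {
        /* ... one complete cycle of active failure checks ... */
        watchdog_reset();
    }
}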
2.2.10 Real-time Clock

The hardware real-time clock shall be used to determine running time.
2.2.11 CRC Generator

The hardware cyclical redundancy check (CRC) generator is used to verify the contents of ROM and EEPROM. It may also be used to generate and verify the CRCs in remote messages. The CRC generator is accessed via the DMA.
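For clarity, the following illustrative C sketch verifies a memory region against a recorded CRC using a software CRC-16/CCITT; an actual implementation would use the hardware CRC generator via DMA as specified above, and all names here are hypothetical.

#include <stddef.h>
#include <stdint.h>

static uint16_t crc16_ccitt(const uint8_t *data, size_t len)
{
    uint16_t crc = 0xFFFFu;                  /* conventional seed value */
    while (len--) {
        crc ^= (uint16_t)((uint16_t)*data++ << 8);
        for (int bit = 0; bit < 8; bit++)
            crc = (crc & 0x8000u) ? (uint16_t)((crc << 1) ^ 0x1021u)
                                  : (uint16_t)(crc << 1);
    }
    return crc;
}

/* Returns nonzero if the computed CRC disagrees with the recorded one. */
int region_corrupted(const uint8_t *region, size_t len, uint16_t recorded_crc)
{
    return crc16_ccitt(region, len) != recorded_crc;
}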
2.2.12 Audio

The audio generator is used to provide audio prompts, warnings, and alarm messages.

2.2.13 Power Control

The instrument cannot be turned off by hardware action alone, except in the case of errors.

2.2.14 Keyboard

The keyboard interface provides user input to the software. [Insert any additional keyboard requirements here.]

2.2.15 LED Displays

The LED indicator interface provides basic status and alarm indications to the user. [Insert any additional LED requirements here.]
2.2.16 LCD Displays

The LCD display provides the main status and parameter display to the user. [Insert any additional LCD display requirements here.]

2.2.17 Motor

The drive motor interface provides the means to control the motor speed. [Insert any additional motor requirements here.]

2.2.18 ADC

The analog-to-digital converter (ADC) interface provides the means for [insert uses for ADC here]. [Insert any additional ADC requirements here.]

2.2.19 DAC

The digital-to-analog converter (DAC) interface provides the means for [insert uses for DAC here]. [Insert any additional DAC requirements here.]

2.2.20 Communications

The communications interface provides the means for [insert uses for communications here]. [Insert any additional communications requirements here.]
[2.2.n Requirement Area

Insert any additional interface requirements here.]
2.3 Functional and Performance Requirements

2.3.1 Operational Modes

The instrument shall have [number of modes] functional modes of operation. Transitions between the modes shall be initiated by [list of activations] and by the detection of error and alarm conditions. The
transition rules are shown in Figure 2. [Insert operational modes input, processing, and output requirements here.]
Figure 2
Functional Modes of Operation Transition Rules
[Insert operation transition rules here]
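Transition rules such as those of Figure 2 are commonly encoded as a lookup table; the following C sketch is illustrative only, with hypothetical mode and event names standing in for the [number of modes] modes and the [list of activations].

typedef enum { MODE_POWER_UP, MODE_OPERATE, MODE_ALARM, NUM_MODES } op_mode_t;
typedef enum { EV_SELF_TEST_PASS, EV_ERROR_DETECTED, NUM_EVENTS } op_event_t;

/* next_mode[current][event]; an entry equal to the current mode means
   that no transition is permitted for that event. */
static const op_mode_t next_mode[NUM_MODES][NUM_EVENTS] = {
    /* MODE_POWER_UP */ { MODE_OPERATE, MODE_ALARM },
    /* MODE_OPERATE  */ { MODE_OPERATE, MODE_ALARM },
    /* MODE_ALARM    */ { MODE_ALARM,   MODE_ALARM },
};

op_mode_t mode_transition(op_mode_t current, op_event_t ev)
{
    return next_mode[current][ev];
}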
2.3.2 User Interface Function

The user interface function provides the user with the means to control the instrument and for the instrument to inform the user of its current status. [Insert user interface function input, processing, and output requirements here.]

2.3.3 Data Entry Function

The data entry function is to display menus and prompts and accept user key presses. [Insert data entry function input, processing data entry items, rules, and audio and output requirements here.]

2.3.4 Panel Switch Function

The panel switch function provides the user with the means to control the instrument and to support data entry functions. [Insert panel switch function input, processing, and output requirements here.]

2.3.5 Display Function

The display function displays alphanumeric information on the LCD. [Insert display function input, processing, and output requirements here.]

2.3.6 Indicator Function
The indicator function multiplexes the panel indicator LEDs and tests the LEDs and their drivers. [Insert indicator function input, processing, and output requirements here.]
2.3.7
581
Audio Function
The audio function provides audible feedback to the user for data entry functions and for error and alarm indications. [Insert audio function input, processing, and output requirements here.]
2.3.8 Motor Control Function
The motor control function generates commands to the motor to cause it to operate at the proper rate. [Insert motor function input, processing, and output requirements here.]
2.3.9 Remote Communications Function
The remote communications function provides remote monitoring and control of the instrument. [Insert remote communications function input, processing, and output requirements here.]
2.3.10 Power-Up Test Function The power-up test function executes initial testing of the hardware and software. [Insert power-up test function input, processing, and output requirements here.]
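A minimal sketch of such a sequence in C is shown below, assuming the self-test routines named here (rom_crc_ok, ram_ok, eeprom_ok, enter_failure_state) exist elsewhere in the project software; all four names are hypothetical.

/* Illustrative power-up test sequence; the helper routines are assumed. */
#include <stdbool.h>

extern bool rom_crc_ok(void);          /* ROM CRC check, see Section 2.2.11 */
extern bool ram_ok(void);              /* RAM test, see Section 2.2.3       */
extern bool eeprom_ok(void);           /* EEPROM contents check             */
extern void enter_failure_state(void); /* latch the alarm and halt          */

void power_up_test(void)
{
    if (!rom_crc_ok() || !ram_ok() || !eeprom_ok())
        enter_failure_state(); /* does not return */
}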
2.3.11 Hardware Monitoring Function The hardware monitoring function continuously monitors the status of the instrument and its internal components. [Insert hardware monitoring function input, processing, and output requirements here.]
2.3.n Requirement Area [Insert function input, processing, and output requirements here.]
2.4 Adaptation Requirements

2.4.1 System Environment
[Insert any system environment requirements that are known to be modifiable.]
2.4.2 System Parameters
[Insert any system parameter requirements that are known to be modifiable.]
2.5 Traceability The mapping of the requirements in the [project/product name] Product Objectives Document, [project/product name] Product Requirements Document, and [project/product name] Interface Design Specification is given in Appendix A.
3.0 QUALIFICATION REQUIREMENTS
3.1 General Qualification Requirements The software development team shall implement a software development program that utilizes reviews, walk-throughs, audits, static and dynamic analysis, testing, and other methods consistent with SE Software Development Policies in order to ensure the development of quality software for the [project/product name] product.
3.2 Qualification Methodology

3.2.1 Test Plans
Qualification of the [project/product name] software for delivery to the System Design Validation Testing is established by the successful completion of the testing described in the [project/product name] Software Development Test Plan (SDTP) and [project/product name] Software Validation Test Plan (SVTP). The SDTP shall describe the scope of software testing that must be successfully completed for each software component of the [project/product name] product. The SVTP shall describe the software testing required to verify that the fully integrated software product satisfies the requirements specified in this document.
3.2.2 Test Information Sheets
The criteria for the success of each test conducted on the [project/product name] software shall be defined in a test information sheet (TIS). A TIS shall be generated for each test described in the SDTP and SVTP. The DTISs shall be used as a guide for test setup and conduct. The VTISs shall be used as a baseline for the development of the [project/product name] Software Validation Test Procedures (SVTPR). The SVTPR shall be used to conduct the testing described in the SVTP.
3.3 Test Requirements The [project/product name] software test requirements are defined in terms of levels of testing, test categories, and test verification methods.
3.3.1 Levels of Testing
Qualification of the [project/product name] software is established by the successful completion of two levels of software testing. The software component testing shall verify the correct operation of each software component that is described in the [project/product name] Software Detailed Design Specification (SDDS). The software validation testing shall verify that the fully integrated software satisfies the requirements specified in this document.
3.3.2 Test Categories
The test categories for software component testing and software validation testing shall include the following types of testing.
• Functional Testing. Tests designed to verify that all the functional requirements have been satisfied. This category is termed success oriented, because the tests are expected to produce successful results.
• Robustness Testing. Tests designed to evaluate software performance given unexpected inputs. This category is termed failure oriented, because the test inputs are designed to cause the product to fail given foreseeable and reasonably unforeseeable misuse of the product.
• Stress Testing. Tests designed to evaluate software in a stress condition in which the amount or rate of data exceeds the amount expected.
• Safety Testing. Tests designed to verify that the software performs in a safe manner and that a complete assessment of the safety design is accomplished.
• Growth Testing. Tests performed to verify that the margins of growth specified for any particular component are supported by the software.
• Regression Testing. Tests performed whenever a software change occurs to detect faults introduced during modification, verify that modifications have not caused unintended adverse effects, and verify that the software still meets its specified requirements.
3.3.3 Test Verification Methods
The methods of test verification for software component testing and software validation testing shall include the following.
• Inspection. Visual examination of an item.
• Analysis. Evaluation of theoretical or empirical data.
• Demonstration. Operational movement or adjustment of an item.
• Test. Operation of an item and the recording and evaluation of quantitative data.
3.3.4 Qualification Criteria
The qualification criteria of each software requirement specified in this document are defined by one or more levels of testing, test category, and test method. A matrix identifying the qualification criteria for the requirements specified in this document is provided in Appendix B.
3.4 Special Qualification Requirements [Insert any special qualification requirements, such as special tools, techniques, facilities, and acceptance limits, or indicate that none of these is required.]
4.0 PREPARATION FOR DELIVERY
All deliverables shall conform to the standards set forth in the SE Software Development Policies and the [project/product name] Software End-products Acceptance Plan (SEAP).
4.1 Instrument EPROM The copyright notice, instrument model number, and software revision number shall be embedded in each electronically programmable ROM (EPROM).
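One common way to satisfy such a requirement is to place the identification data in a const record that the linker locates in ROM. The C sketch below is illustrative only; the structure layout, field sizes, and accessor name are assumptions.

/* Illustrative ROM identification record; layout and sizes are assumed. */
typedef struct {
    char copyright[48];
    char model[16];
    char revision[8];
} rom_id_t;

static const rom_id_t rom_id = {
    "Copyright (c) [year] [company name]",
    "[model number]",
    "[#.#]"
};

/* Referencing rom_id from live code keeps the linker from discarding it. */
const rom_id_t *get_rom_id(void) { return &rom_id; }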
4.2 Archive Documents The copyright notice, instrument model number, and software revision number shall be included in each document and each file.
APPENDIX A  ALLOCATION OF REQUIREMENTS MAP

SRS Paragraph Number | SRS Paragraph Title or Description | System Document Identification | System Document Paragraph Number
[##.##.##.##] | [Title or description] | [aaa] | [#.#.#]
[##.##.##.##] | [Title or description] | [aaa] | [#.#.#]
[##.##.##.##] | [Title or description] | [aaa] | [#.#.#]
[##.##.##.##] | [Title or description] | N/A | [#.#.#]
Copyright © 2002 Interpharm Press
Software Requirements Specification
APPENDIX B  QUALIFICATION CRITERIA

SRS Paragraph Number | SRS Paragraph Title or Description | Test Qualification Level(s) | Test Qualification Categories | Test Qualification Methods
[#.#.#.#] | [Title or description] | [Select from Section 3.3.1] | [Select from Section 3.3.2] | [Select from Section 3.3.3]
[#.#.#.#] | [Title or description] | [Select from Section 3.3.1] | [Select from Section 3.3.2] | [Select from Section 3.3.3]
[#.#.#.#] | [Title or description] | [Select from Section 3.3.1] | [Select from Section 3.3.2] | [Select from Section 3.3.3]
[#.#.#.#] | [Title or description] | N/A | N/A | N/A
GLOSSARY

Accuracy: Quantitative assessment of freedom from error.
Archive: Provisions made for storing and retrieving records over a long period of time.
Audit: Independent review for the purpose of assessing compliance with software requirements, specifications, baselines, standards, procedures, instructions, and coding requirements.
Baseline: Specification or product that has been formally reviewed and agreed upon, that thereafter serves as the basis for further development, and that can be changed only through formal change control procedures.
Change control: Process by which a change is proposed, evaluated, approved or rejected, scheduled, and tracked.
Code: Loosely, one or more computer programs or part of a computer program.
Code and Test: Phase of the software development life cycle during which a software end product is created from design documentation and tested.
Code audit: Independent review of source code by a person, team, or tool to verify compliance with software design documentation and programming standards. Correctness and efficiency may also be evaluated.
Component: Unit of code that performs a specific task, or group of logically related code units that perform a specific task or set of tasks.
Component testing: Testing conducted to verify the implementation of the design for one software component or collection of software components.
Computer program: Sequence of instructions suitable for processing by a computer. Processing may include the use of an assembler, a compiler, an interpreter, or a translator to prepare the program for execution, as well as to execute it.
Configuration audit: Process of verifying that all required configuration items have been produced, that the current version agrees with specified requirements, that the technical documentation completely and accurately describes the configuration items, and that all change requests have been resolved.
Correctness: Extent to which software is free of design defects, coding defects, and faults; meets its specified requirements; and meets user expectations.
Criticality: Classification of a software error or fault based upon an evaluation of the degree of impact of that error or fault on the development or operation of a system.
Delivery: Transfer of responsibility for an item from one activity to another, as in the delivery of the validated software product to Quality Assurance for certification.
Design phase: Period in the software development cycle during which the designs for architecture, software components, interfaces, and data are created, documented, and verified to satisfy requirements.
Design requirement: Any requirement that impacts or constrains the design of a software system or software system component.
Documentation: Manuals, written procedures or policies, records, or reports that provide information concerning uses, maintenance, or validation of software.
Error: Discrepancy between a computed, observed, or measured value or condition and the true, specified, or theoretically correct value or condition.
Evaluation: Process of determining whether an item or activity meets specified criteria.
Failure: Inability of a system or system component to perform its required function (see fault).
Fault: Defect of a system or system component, caused by a defective, missing, or extraneous instruction or set of related instructions in the definition, specification, design, or implementation of a system, that may lead to a failure.
Implementation phase: Period in the software development cycle during which a software product is created from design documentation and debugged.
Inspection: Formal evaluation technique in which software requirements, design, or code are examined in detail by a person or group other than the author to detect faults, violations of development standards, or other problems.
Interface Design Specification (IDS): Project-specific document that completely specifies the interfaces among the subsystems.
Product Objectives Document (POD): Project-specific document that specifies the objective of the product in terms of marketing and function.
Product Requirements Document (PRD): Project-specific document that specifies the requirements of the product in terms of internal and external functions and performance.
Programming standard: Definite rule or procedure established and imposed by authority.
Regression testing: Selective retesting to detect faults introduced during modification to verify that modifications have not caused unintended adverse effects and that a modified system or system component still meets its specified requirements.
Requirements phase: Period in the software development cycle during which the requirements, such as functional and performance capabilities for a software product, are defined and documented.
Robustness: Extent to which software can continue to operate correctly despite the introduction of invalid inputs.
Safety: Provision of a very high degree of freedom, within the constraints of system effectiveness and cost, from those conditions that can cause death, injury, occupational illness, or damage to or loss of equipment or property.
Software: Computer programs, procedures, rules, and associated documentation and data pertaining to the operation of a computer system.
Software Architecture Design Review (SADR): Software review conducted for the purpose of (1) reviewing the project's SADS, associated plans, and technical issues; (2) resolving identified issues; and (3) obtaining commitment to proceed into the detailed design phase.
Software code walk-throughs: Reviews conducted at the component source code level as the implementation of each component is completed, to detect implementation, documentation, and programming standards problems. Correctness and efficiency may also be evaluated.
Software design walk-throughs: Reviews conducted at the component level as the design of each component is completed, to detect design problems.
Software Detailed Design Review (SDDR): Software review conducted for the purpose of (1) reviewing the project's SDDS, associated plans, and critical issues; (2) resolving identified issues; (3) obtaining commitment to proceed into the code and test phase; and (4) obtaining commitment to a test program supporting product acceptance.
Software Detailed Design Specification (SDDS): Project-specific document that constitutes an update to and an expansion of the design baseline established at the Architecture Design Review, including a description of the overall program operation and control and the use of common data. The detailed design is described through the lowest component level of software organization and the lowest logical level of database organization.
Software development life cycle: Period that starts with the development of a software product and ends when the product is validated and delivered for QA certification. This life cycle includes a requirements phase, design phase, implementation phase, and software validation phase.
Software documentation: Documents developed for software projects that are necessary design and planning tools for disciplined and successful software development.
Software end products: Computer programs, software documentation, and databases produced by a software development project.
Software End-product Acceptance Plan (SEAP): Project-specific plan designed to serve as a descriptive checklist of the end products and services required for approval.
Software Requirements Review (SRR): Review of the provisions of the Software Requirements Specification, which, once approved, will serve as the basis of software end-product acceptance.
Software Test Plan (STP): Project-specific plan that defines the scope of software testing that must be successfully completed for each software component.
Software Validation Phase: Period in the software development life cycle in which the components of a software product are evaluated and integrated and the entire software product is evaluated to determine whether requirements have been satisfied.
Software Validation Test Plan (SVTP): Project-specific plan that describes the software testing required to verify that the software product satisfies the specified requirements.
Source code: Original software expressed in human-readable form (programming language), which must be translated into machine-readable form before it can be executed by the computer.
Technical reviews: Meeting at which the software end products of a phase of software development are presented for end-product review, issue resolution, and obtaining commitment to proceed into the next software development phase.
Test Information Sheet (TIS): Document that defines the objectives, approach, and requirements for a specific test.
Validation: Process of evaluating software at the end of the software development process to ensure compliance with software requirements.
Verification: Process of determining whether the products of a given phase of the software development cycle fulfill the requirements established during the previous phase.
Walk-through: Review in which the designer or programmer leads members of the review team through a segment of design or code, and the reviewers ask questions and submit comments about technique, style, possible errors, violation of development standards, and other problems.
[Project/Product Name] SRS with RTM
SOFTWARE REQUIREMENTS SPECIFICATION
Written by:
[Name/Title/Position]
Date
Reviewed by:
[Name/Title/Position]
Date
Approved by:
[Name/Title/Position]
Date
Document Number [aaa]-SRS-[#.#]    Revision [#.#]    Page 1 of [#]
REVISION HISTORY
Revision | Description | Date
[##.##] | [Revision description] | [mm/dd/yy]
CONTENTS

1.0  INTRODUCTION  3
2.0  REQUIREMENTS  5
3.0  QUALIFICATION REQUIREMENTS  12
4.0  PREPARATION FOR DELIVERY  15
APPENDIX A  Requirements Traceability Matrix  16
APPENDIX B  Qualification Criteria  17
GLOSSARY  18
1.0 INTRODUCTION
1.1 Purpose This document establishes the software requirements for the [project/product name] project.
1.2 Scope This document applies to all of the software that is to be developed for and reside in the [project/ product name] product.
1.3 Overview The [project/product name] product is a [insert high level description of product here]. This version of the product implements the following features: [insert product feature descriptions here]. The software is responsible for the instrument's functionality, user interface, safety checks, and performance accuracy. This document establishes and specifies the design, interface, functional performance, and qualification requirements of the [project/product name] product software. The allocation of the software requirements that are described in the [project/product name] Product Requirements Document (PRD) is presented in the requirements allocation map in Appendix A.
1.4 Referenced Documents The following documents of the exact issue shown form a part of this specification to the extent specified herein. In the event of conflict between the documents referenced herein and the content of this specification, the content of this specification shall be considered a superseding requirement.
1.4.1 Project Specifications
• [project/product name] Interface Design Specification, Document Number [aaa]-IDS-[#.#], Revision [#.#], dated [date]
• [project/product name] Product Objectives Document, Document Number [aaa]-POD-[#.#], Revision [#.#], dated [date]
• [project/product name] Product Requirements Document, Document Number [aaa]-PRD-[#.#], Revision [#.#], dated [date]
• [project/product name] Software Configuration Management Plan, Document Number [aaa]-CMP-[#.#], Revision [#.#], dated [date]
• [project/product name] Software Development Plan, Document Number [aaa]-SDP-[#.#], Revision [#.#], dated [date]
• [project/product name] Software Development Test Plan, Document Number [aaa]-DTP-[#.#], Revision [#.#], dated [date]
• [project/product name] Software End-product Acceptance Plan, Document Number [aaa]-EAP-[#.#], Revision [#.#], dated [date]
• [project/product name] Software Quality Assurance Plan, Document Number [aaa]-QAP-[#.#], Revision [#.#], dated [date]
• [project/product name] Software Verification and Validation Plan, Document Number [aaa]-VVP-[#.#], Revision [#.#], dated [date]

1.4.2 Procedures and Guidelines
• Product Development Safety Design Guidelines, Revision [#.#], dated [date]
• Product Development User Interface Design Guidelines, Revision [#.#], dated [date]
• Software Engineering Configuration Management Guidelines, Revision [#.#], dated [date]
• Software Engineering Software Development Guidelines, Revision [#.#], dated [date]
• Software Engineering Software Configuration Management Policies, Revision [#.#], dated [date]
• Software Engineering Software Development Policies, Revision [#.#], dated [date]
• Software Engineering Verification and Validation Guidelines, Revision [#.#], dated [date]
• Software Engineering Verification and Validation Policies, Revision [#.#], dated [date]
2.0 REQUIREMENTS
2.1 Design Requirements

2.1.1 Memory Requirements
The [project/product name] software shall meet the following memory requirements:
a. The software shall fit within [#] bytes of ROM, which begins at address [#].
b. Data storage shall consist of [#] bytes of RAM, which begins at address [#].
c. Nonvolatile data storage shall be provided by [#]-bit words of [#]-word EEPROM, which is addressed [access] via I/O port [port name].
[d. Insert any additional memory characteristics or specifications here.]

2.1.2 Timing Requirements
The [project/product name] software shall meet the following processing timeline requirements:
a. The complete cycle of active failure checks shall be completed in less than the system critical time.
b. The complete cycle of passive failure checks shall be completed at the beginning of each operational window, which is not less frequent than the system power cycle.
c. Indicator functions shall be completed within a full duty cycle time of [#] msec.
[d. Insert any additional processing time characteristics or specifications here.]
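Memory budgets of the kind required by Section 2.1.1 are often enforced at build time. The C11 sketch below is illustrative only; the base addresses, sizes, and data structure are assumptions standing in for the template's placeholders.

/* Assumed example memory map: ROM at 0x8000, RAM budget of 2 KB at 0x0100. */
#include <stdint.h>

#define ROM_BASE 0x8000u
#define ROM_SIZE 0x8000u
#define RAM_BASE 0x0100u
#define RAM_SIZE 0x0800u

typedef struct {
    uint8_t  display_buffer[512];
    uint16_t calibration[64];
    uint8_t  comms_rx[256];
} ram_layout_t;

/* Fails the build if the declared data no longer fits the RAM budget. */
_Static_assert(sizeof(ram_layout_t) <= RAM_SIZE,
               "RAM layout exceeds the budget in SRS Section 2.1.1");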
2.1.3 Design Standards
The [project/product name] software shall meet the software design standards and procedures of the Software Engineering Software Development Policies and Software Engineering Software Development Guidelines, respectively.
2.1.4 Audio
All audio tones shall have a loudness of at least [insert level] dB. All frequencies shall be within [insert percent range]% of the specified value. All durations and intervals shall be within [insert millisecond range] msec.
2.1.5 Safety
Critical information shall be stored redundantly in a modified form (for example, bitwise complemented) that carries the same information, so that corruption can be detected. [Insert any additional safety requirements here.]
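The C sketch below shows one such scheme: each critical value is stored together with its one's complement, and a mismatch on read-back indicates corruption. This particular scheme is an assumption for illustration; the requirement above only calls for a modified redundant form.

/* Illustrative complement-redundant storage for a critical 16-bit value. */
#include <stdint.h>
#include <stdbool.h>

typedef struct {
    uint16_t value;
    uint16_t complement; /* always ~value */
} critical_u16_t;

void critical_store(critical_u16_t *slot, uint16_t v)
{
    slot->value = v;
    slot->complement = (uint16_t)~v;
}

bool critical_load(const critical_u16_t *slot, uint16_t *out)
{
    if (slot->complement != (uint16_t)~slot->value)
        return false; /* corrupted; caller raises the alarm */
    *out = slot->value;
    return true;
}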
2.1.6 Display
[Insert display requirements here.]
2.1.7 Remote Communications
The instrument shall support [insert communications requirements specification here] communications up to [insert baud rate here] baud.
[2.1.n Requirement Area Insert any additional design requirements here.]
2.2 Interface Requirements

2.2.1 Interface Relationships
The [project/product name] software shall support the interfaces shown in the context diagram of Figure 1.
Figure 1. [project/product name] Implementation Context Diagram
[Insert context diagram here]
2.2.2 Interface Identifications and Documentation
The [project/product name] software shall support the interfaces documented in the [project/product name] Interface Design Specification (IDS).
2.2.3 RAM Test
The RAM test interface shall provide a means to test RAM through the RAM-test hardware; a detected RAM error sets off the watchdog alarm and stops the instrument.
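For illustration, a software walking-ones test over a RAM window is sketched in C below. It assumes interrupts are disabled and the window is not otherwise in use while the test runs; the function name and signature are assumptions, not part of this template.

/* Walking-ones test over a RAM window; each byte is saved and restored. */
#include <stdint.h>
#include <stdbool.h>

bool ram_test_ok(volatile uint8_t *base, uint32_t len)
{
    for (uint32_t i = 0; i < len; i++) {
        uint8_t saved = base[i];
        for (uint8_t bit = 0; bit < 8; bit++) {
            uint8_t pattern = (uint8_t)(1u << bit);
            base[i] = pattern;
            if (base[i] != pattern) { base[i] = saved; return false; }
        }
        base[i] = saved;
    }
    return true;
}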
2.2.4 EEPROM
The electronically erasable and programmable ROM (EEPROM) shall be used to store calibration and configuration data.
2.2.5 Interrupt System
The interrupt system provides event information from the timers, serial communications, ADC, and DMA. The interrupts used in the [project/product name] software are [insert list of interrupts].
2.2.6 Timers
There are [number] counter/timers internal to the processor. The timers used in the [project/product name] software are [insert list of timers and the functions they serve].
2.2.7 Direct Memory Access
[Insert description of DMA and how it will be used.]
2.2.8 Wait State Generator
[Insert description of wait state generator and how it will be used.]
2.2.9 Watchdog
The watchdog timer shall be used to verify that the software is executing its functions in a time that is less than the system critical time. The watchdog interface provides the means to specify the watchdog interval, to periodically reset the watchdog, and to reset an alarm condition.
2.2.10 Real-time Clock The hardware real-time clock shall be used to determine running time.
2.2.11 CRC Generator The hardware cyclic redundancy check (CRC) generator is used to verify the contents of ROM and EEPROM. It may also be used to generate and verify the CRCs in remote messages. The CRC generator is accessed via the DMA.
2.2.12 Audio The audio generator is used to provide audio prompts, warnings, and alarm messages.
2.2.13 Power Control The instrument cannot be turned off by hardware action alone, except in the case of errors.
2.2.14 Keyboard The keyboard interface provides user input to the software. [Insert any additional keyboard requirements here.]
2.2.15 LED Displays The LED indicator interface provides basic status and alarm indications to the user. [Insert any additional LED requirements here.]
2.2.16 LCD Displays The LCD display provides the main status and parameter display to the user. [Insert any additional LCD display requirements here.]
2.2.17 Motor The drive motor interface provides the means to control the motor speed. [Insert any additional motor requirements here.]
2.2.18 ADC The analog-to-digital (ADC) converter interface provides the means for [insert uses for ADC here]. [Insert any additional ADC requirements here.]
2.2.19 DAC The digital-to-analog (DAC) converter interface provides the means for [insert uses for DAC here]. [Insert any additional DAC requirements here.]
2.2.20 Communications The communications interface provides the means for [insert uses for communications here]. [Insert any additional communications requirements here.]
[2.2.n Requirement Area Insert any additional interface requirements here.]
2.3 Functional and Performance Requirements

2.3.1 Operational Modes
The instrument shall have [number of modes] functional modes of operation, and transitions between the modes shall be initiated by [list of activations] and by the detection of error and alarm conditions. The transition rules are shown in Figure 2. [Insert operational modes input, processing, and output requirements here.]
Figure 2. Functional Modes of Operation Transition Rules
[Insert operation transition rules here]
2.3.2 User Interface Function
The user interface function provides the user with the means to control the instrument and for the instrument to inform the user of its current status. [Insert user interface function input, processing, and output requirements here.]
2.3.3 Data Entry Function
The data entry function displays menus and prompts and accepts user key presses. [Insert data entry function input, processing data entry items, rules, and audio and output requirements here.]
2.3.4 Panel Switch Function
The panel switch function provides the user with the means to control the instrument and to support data entry functions. [Insert panel switch function input, processing, and output requirements here.]
2.3.5 Display Function
The display function displays alphanumeric information on the LCD. [Insert display function input, processing, and output requirements here.]
2.3.6 Indicator Function
The indicator function multiplexes the panel indicator LEDs and tests the LEDs and their drivers. [Insert indicator function input, processing, and output requirements here.]
2.3.7 Audio Function
The audio function provides audible feedback to the user for data entry functions and for error and alarm indications. [Insert audio function input, processing, and output requirements here.]
2.3.8 Motor Control Function
The motor control function generates commands to the motor to cause it to operate at the proper rate. [Insert motor function input, processing, and output requirements here.]
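A minimal sketch of such command generation in C follows, assuming a PWM-driven motor; the register address, rate limits, and scaling are hypothetical values for illustration only.

/* Illustrative motor rate command with clamping; all constants assumed. */
#include <stdint.h>

#define PWM_DUTY (*(volatile uint8_t *)0xFF10u) /* assumed PWM duty register */
#define RATE_MIN 10u
#define RATE_MAX 250u

void motor_set_rate(uint16_t rate)
{
    if (rate < RATE_MIN) rate = RATE_MIN; /* clamp to the safe operating range */
    if (rate > RATE_MAX) rate = RATE_MAX;
    PWM_DUTY = (uint8_t)(((uint32_t)rate * 255u) / RATE_MAX); /* scale to 8-bit duty */
}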
2.3.9 Remote Communications Function
The remote communications function provides remote monitoring and control of the instrument. [Insert remote communications function input, processing, and output requirements here.]
2.3.10 Power-Up Test Function The power-up test function executes initial testing of the hardware and software. [Insert power-up test function input, processing, and output requirements here.]
2.3.11 Hardware Monitoring Function The hardware monitoring function continuously monitors the status of the instrument and its internal components. [Insert hardware monitoring function input, processing, and output requirements here.]
2.3.n Requirement Area
[Insert function input, processing, and output requirements here.]
2.4 Adaptation Requirements

2.4.1 System Environment
[Insert any system environment requirements that are known to be modifiable.]
2.4.2 System Parameters
[Insert any system parameter requirements that are known to be modifiable.]
2.5 Traceability The mapping of the requirements in the [project/product name] Product Objectives Document, [project/product name] Product Requirements Document, and [project/product name] Interface Design Specification is given in Appendix A.
3.0 QUALIFICATION REQUIREMENTS
3.1 General Qualification Requirements The software development team shall implement a software development program that utilizes reviews, walk-throughs, audits, static and dynamic analysis, testing, and other methods consistent with SE Software Development Policies in order to ensure the development of quality software for the [project/product name] product.
3.2 Qualification Methodology

3.2.1 Test Plans
Qualification of the [project/product name] software for delivery to the System Design Validation Testing is established by the successful completion of the testing described in the [project/product name] Software Development Test Plan (SDTP) and [project/product name] Software Validation Test Plan (SVTP). The SDTP shall describe the scope of software testing that must be successfully completed for each software component of the [project/product name] product. The SVTP shall describe the software testing required to verify that the fully integrated software product satisfies the requirements specified in this document.
3.2.2 Test Information Sheets
The criteria for the success of each test conducted on the [project/product name] software shall be defined in a test information sheet (TIS). A TIS shall be generated for each test described in the SDTP and SVTP. The DTISs shall be used as a guide for test setup and conduct. The VTISs shall be used as a baseline for the development of the [project/product name] Software Validation Test Procedures (SVTPR). The SVTPR shall be used to conduct the testing described in the SVTP.
3.3 Test Requirements The [project/product name] software test requirements are defined in terms of levels of testing, test categories, and test verification methods.
3.3.1 Levels of Testing
Qualification of the [project/product name] software is established by the successful completion of two levels of software testing. The software component testing shall verify the correct operation of each software component that is described in the [project/product name] Software Detailed Design Specification (SDDS). The software validation testing shall verify that the fully integrated software satisfies the requirements specified in this document.
3.3.2 Test Categories
The test categories for software component testing and software validation testing shall include the following types of testing.
• Functional Testing. Tests designed to verify that all the functional requirements have been satisfied. This category is termed success oriented, because the tests are expected to produce successful results.
• Robustness Testing. Tests designed to evaluate software performance given unexpected inputs. This category is termed failure oriented, because the test inputs are designed to cause the product to fail given foreseeable and reasonably unforeseeable misuse of the product.
• Stress Testing. Tests designed to evaluate software in a stress condition in which the amount or rate of data exceeds the amount expected.
• Safety Testing. Tests designed to verify that the software performs in a safe manner and that a complete assessment of the safety design is accomplished.
• Growth Testing. Tests performed to verify that the margins of growth specified for any particular component are supported by the software.
• Regression Testing. Tests performed whenever a software change occurs to detect faults introduced during modification, verify that modifications have not caused unintended adverse effects, and verify that the software still meets its specified requirements.
3.3.3 Test Verification Methods
The methods of test verification for software component testing and software validation testing shall include the following.
• Inspection. Visual examination of an item.
• Analysis. Evaluation of theoretical or empirical data.
• Demonstration. Operational movement or adjustment of an item.
• Test. Operation of an item and the recording and evaluation of quantitative data.
3.3.4 Qualification Criteria
The qualification criteria of each software requirement specified in this document are defined by one or more levels of testing, test category, and test method. A matrix identifying the qualification criteria for the requirements specified in this document is provided in Appendix B.
3.4 Special Qualification Requirements [Insert any special qualification requirements, such as special tools, techniques, facilities, and acceptance limits, or indicate that none of these is required.]
4.0 PREPARATION FOR DELIVERY
All deliverables shall conform to the standards set forth in the SE Software Development Policies and the [project/product name] Software End-products Acceptance Plan (SEAP).
4.1 Instrument EPROM The copyright notice, instrument model number, and software revision number shall be embedded in each electronically programmable ROM (EPROM).
4.2 Archive Documents The copyright notice, instrument model number, and software revision number shall be included in each document and each file.
APPENDIX A  REQUIREMENTS TRACEABILITY MATRIX

Requirement Number | Requirement Description | Higher Level Requirement Number | SRS Paragraph Number | DDS Paragraph Number | Software Component(s) | Test Number(s) | Verification Method(s) | Test Results Pass/Fail (P/F)
APPENDIX B  QUALIFICATION CRITERIA

SRS Paragraph Number | SRS Paragraph Title or Description | Test Qualification Level(s) | Test Qualification Categories | Test Qualification Methods
[#.#.#.#] | [Title or description] | [Select from Section 3.3.1] | [Select from Section 3.3.2] | [Select from Section 3.3.3]
[#.#.#.#] | [Title or description] | [Select from Section 3.3.1] | [Select from Section 3.3.2] | [Select from Section 3.3.3]
[#.#.#.#] | [Title or description] | [Select from Section 3.3.1] | [Select from Section 3.3.2] | [Select from Section 3.3.3]
[#.#.#.#] | [Title or description] | N/A | N/A | N/A
GLOSSARY

Accuracy: Quantitative assessment of freedom from error.
Archive: Provisions made for storing and retrieving records over a long period of time.
Audit: Independent review for the purpose of assessing compliance with software requirements, specifications, baselines, standards, procedures, instructions, and coding requirements.
Baseline: Specification or product that has been formally reviewed and agreed upon, that thereafter serves as the basis for further development, and that can be changed only through formal change control procedures.
Change control: Process by which a change is proposed, evaluated, approved or rejected, scheduled, and tracked.
Code: Loosely, one or more computer programs or part of a computer program.
Code and Test: Phase of the software development life cycle during which a software end product is created from design documentation and tested.
Code audit: Independent review of source code by a person, team, or tool to verify compliance with software design documentation and programming standards. Correctness and efficiency may also be evaluated.
Component: Unit of code that performs a specific task, or group of logically related code units that perform a specific task or set of tasks.
Component testing: Testing conducted to verify the implementation of the design for one software component or collection of software components.
Computer program: Sequence of instructions suitable for processing by a computer. Processing may include the use of an assembler, a compiler, an interpreter, or a translator to prepare the program for execution, as well as to execute it.
Configuration audit: Process of verifying that all required configuration items have been produced, that the current version agrees with specified requirements, that the technical documentation completely and accurately describes the configuration items, and that all change requests have been resolved.
Correctness: Extent to which software is free of design defects, coding defects, and faults; meets its specified requirements; and meets user expectations.
Criticality: Classification of a software error or fault based upon an evaluation of the degree of impact of that error or fault on the development or operation of a system.
Delivery: Transfer of responsibility for an item from one activity to another, as in the delivery of the validated software product to Quality Assurance for certification.
Design phase: Period in the software development cycle during which the designs for architecture, software components, interfaces, and data are created, documented, and verified to satisfy requirements.
Design requirement: Any requirement that impacts or constrains the design of a software system or software system component.
Documentation: Manuals, written procedures or policies, records, or reports that provide information concerning uses, maintenance, or validation of software.
Error: Discrepancy between a computed, observed, or measured value or condition and the true, specified, or theoretically correct value or condition.
Evaluation: Process of determining whether an item or activity meets specified criteria.
Failure: Inability of a system or system component to perform its required function (see fault).
Fault: Defect of a system or system component, caused by a defective, missing, or extraneous instruction or set of related instructions in the definition, specification, design, or implementation of a system, that may lead to a failure.
Implementation phase: Period in the software development cycle during which a software product is created from design documentation and debugged.
Inspection: Formal evaluation technique in which software requirements, design, or code are examined in detail by a person or group other than the author to detect faults, violations of development standards, or other problems.
Interface Design Specification (IDS): Project-specific document that completely specifies the interfaces among the subsystems.
Product Objectives Document (POD): Project-specific document that specifies the objective of the product in terms of marketing and function.
Product Requirements Document (PRD): Project-specific document that specifies the requirements of the product in terms of internal and external functions and performance.
Programming standard: Definite rule or procedure established and imposed by authority.
Regression testing: Selective retesting to detect faults introduced during modification to verify that modifications have not caused unintended adverse effects and that a modified system or system component still meets its specified requirements.
Requirements phase: Period in the software development cycle during which the requirements, such as functional and performance capabilities for a software product, are defined and documented.
Robustness: Extent to which software can continue to operate correctly despite the introduction of invalid inputs.
Safety: Provision of a very high degree of freedom, within the constraints of system effectiveness and cost, from those conditions that can cause death, injury, occupational illness, or damage to or loss of equipment or property.
Software: Computer programs, procedures, rules, and associated documentation and data pertaining to the operation of a computer system.
Software Architecture Design Review (SADR): Review for (1) reviewing the project's architecture design, SADS, associated plans, and critical issues; (2) resolving identified issues; and (3) obtaining commitment to proceed into the Detailed Design Phase.
Software code walk-throughs: Reviews conducted at the component source code level as the implementation of each component is completed, to detect implementation, documentation, and programming standards problems. Correctness and efficiency may also be evaluated.
Software design walk-throughs: Reviews conducted at the component level as the design of each component is completed, to detect design problems.
Software Detailed Design Review (SDDR): Review for (1) reviewing the project's detailed design, SDDS, associated plans, and critical issues; (2) resolving identified issues; (3) obtaining commitment to proceed into the code and test phase; and (4) obtaining commitment to a test program supporting product acceptance.
Software Detailed Design Specification (SDDS): Project-specific document that constitutes an update to and an expansion of the design baseline established at the Architecture Design Review, including a description of the overall program operation and control and the use of common data. The detailed design is described through the lowest component level of software organization and the lowest logical level of database organization.
Software development life cycle: Period that starts with the development of a software product and ends when the product is validated and delivered for QA certification. This life cycle includes a requirements phase, design phase, implementation phase, and software validation phase.
Software documentation: Documents developed for software projects that are necessary design and planning tools for disciplined and successful software development.
Software end products: Computer programs, software documentation, and databases produced by a software development project.
Software End-product Acceptance Plan (SEAP): Project-specific plan designed to serve as a descriptive checklist of the end products and services required for approval.
Software Requirements Review (SRR): Software review of the provisions of the Software Requirements Specification, which, once approved, will serve as the basis of software end-product acceptance.
Software Test Plan (STP): Project-specific plan that defines the scope of software testing that must be successfully completed for each software component.
Software Validation Phase: Period in the software development life cycle in which the components of a software product are evaluated and integrated and the entire software product is evaluated to determine whether requirements have been satisfied.
Software Validation Test Plan (SVTP): Project-specific plan that describes the software testing required to verify that the software product satisfies the specified requirements.
Source code: Original software expressed in human-readable form (programming language), which must be translated into machine-readable form before it can be executed by the computer.
Technical reviews: Meeting at which the software end products of a phase of software development are presented for end-product review, issue resolution, and obtaining commitment to proceed into the next software development phase.
Test Information Sheet (TIS): Document that defines the objectives, approach, and requirements for a specific test.
Validation: Process of evaluating software at the end of the software development process to ensure compliance with software requirements.
Verification: Process of determining whether the products of a given phase of the software development cycle fulfill the requirements established during the previous phase.
Walk-through: Review in which the designer or programmer leads members of the review team through a segment of design or code, and the reviewers ask questions and submit comments about technique, style, possible errors, violation of development standards, and other problems.
[Project/Product Name] SFMEA SOFTWARE FAILURE MODES AND EFFECTS ANALYSIS
Written by:
[Name/Title/Position]
Date
Reviewed by:
[Name/Title/Position]
Date
Approved by:
[Name/Title/Position]
Date
Document Number [aaa]-SFMEA-[#.#]    Revision [#.#]    Page 1 of [#]
REVISION HISTORY
Revision | Description | Date
[##.##] | [Revision description] | [mm/dd/yy]
Software Function | Failure Mode | Cause | Effect on System | Possible Hazard(s) | Risk Index | Applicable Control
[Project/Product Name] SFMECA SOFTWARE FAILURE MODES EFFECTS CRITICALITY ANALYSIS
Written by:
[Name/Title/Position]
Date
Reviewed by:
[Name/Title/Position]
Date
Approved by:
[Name/Title/Position]
Date
Document Number [aaa]-SFMECA-[#.#]    Revision [#.#]    Page 1 of [#]
REVISION HISTORY
Revision | Description | Date
[##.##] | [Revision description] | [mm/dd/yy]
CONTENTS

1.0  INTRODUCTION  3
2.0  SOFTWARE FAILURE MODES EFFECTS TABLES  3
1.0 INTRODUCTION
1.1 Purpose This document lists the effects of failures in the [project/product name] software and the countermeasures used to prevent them from becoming hazards.
1.2 Scope This document lists the effects of failures in the main processor only.
2.0 SOFTWARE FAILURE MODES EFFECTS TABLES
This section details the contents of the tables listing Software Failure Modes Effects.
2.1 Table for Detection of Hardware Faults by Main Processor Software The “Effect if not detected” column of the table states the effect of the error if it were not detected. The “Module detected by and/or variable name(s)” column of the table states the name of the module, function, or routine that detects the error. For assembler code, this states the name of the closest labels above and below the code that detects the error. If the error is assigned, allocated, or associated with a variable name, then the variable name also appears in this column. The “System effect/error message/error number” column of the table states either the effect, error message, or error number that is associated with the error.
2.2 Table for Detection of Main Processor Software Faults The “Effect if not detected” column of the table states the effect of the error if it were not detected. The “Module detected by and/or variable name(s)” column of the table states the name of the module, function, or routine that detects the error. For assembler code, this states the name of the closest labels above and below the code that detects the error. If the error is assigned, allocated, or associated with a variable name, then the variable name also appears in this column. The “System effect/error message/error number” column of the table states either the effect, error message, or error number that is associated with the error.
2.3 Table Results Attached are the Software Failure Modes Effects Tables.
TABLE OF SOFTWARE FAILURE MODES EFFECTS
DETECTION OF HARDWARE FAULTS BY MAIN PROCESSOR SOFTWARE

PROJECT: __________________________________________________    Page ____ of ____
Initiated by: __________________________________________________    Date: ________
              Signature                              Title/Position
Reviewed by: __________________________________________________    Date: ________
              Signature                              Title/Position
Approved by: __________________________________________________    Date: ________
              Signature                              Title/Position

Item number | Component or description | Failure mode | Effect if not detected | Module detected by and/or variable name(s) | System effect/error message/error number
TABLE OF SOFTWARE FAILURE MODES EFFECTS
DETECTION OF HARDWARE FAULTS BY MAIN PROCESSOR SOFTWARE (CONTINUED)

PROJECT: __________________________________________________    Page ____ of ____

Item number | Component or description | Failure mode | Effect if not detected | Module detected by and/or variable name(s) | System effect/error message/error number
TABLE OF SOFTWARE FAILURE MODES EFFECTS
DETECTION OF MAIN PROCESSOR SOFTWARE FAULTS

PROJECT: __________________________________________________    Page ____ of ____
Initiated by: __________________________________________________    Date: ________
              Signature                              Title/Position
Reviewed by: __________________________________________________    Date: ________
              Signature                              Title/Position
Approved by: __________________________________________________    Date: ________
              Signature                              Title/Position

Item number | Component or description | Failure mode | Effect if not detected | Module detected by and/or variable name(s) | System effect/error message/error number
TABLE OF SOFTWARE FAILURE MODES EFFECTS
DETECTION OF MAIN PROCESSOR SOFTWARE FAULTS (CONTINUED)

PROJECT: __________________________________________________    Page ____ of ____

Item number | Component or description | Failure mode | Effect if not detected | Module detected by and/or variable name(s) | System effect/error message/error number
SOFTWARE TESTING AND VERIFICATION AND VALIDATION DOCUMENTS
THIS SECTION CONTAINS THE FOLLOWING DOCUMENTS:
SOFTWARE ANOMALY REPORT
INSTRUCTIONS FOR COMPLETING SOFTWARE ANOMALY REPORT
DEVELOPMENT TEST INFORMATION SHEET (DTIS)
VALIDATION TEST INFORMATION SHEET (VTIS)
SOFTWARE VALIDATION TEST LOG
SOFTWARE ANOMALY REPORT

1. Date:    2. Severity: H M L    3. Anomaly Report Number:
4. Title (briefly describe the problem):
5. System:    6. Component:    7. Version:
8. Originator:    9. Organization:    10. Telephone:    11. Approval:
12. Verification and Validation Task:    13. Reference Document(s):
14. System Configuration:
15. Anomaly Description:
16. Problem Duplication:  During run Y N N/A    After restart Y N N/A    After reload Y N N/A
17. Source of Anomaly:
    PHASE: ❑ Requirements  ❑ Architecture Design  ❑ Detailed Design  ❑ Implementation  ❑ Undetermined
    TYPE:  ❑ Documentation  ❑ Software  ❑ Process  ❑ Methodology  ❑ Other  ❑ Undetermined
18. Investigation Time:
19. Proposed Solution:
20. Corrective Action Taken:    Date:
21. Closure Sign-off:
    Software Lead Engineer    Date
    V&V Lead Engineer    Date
INSTRUCTIONS FOR COMPLETING SOFTWARE ANOMALY REPORT

The initiator completes Items 1 through 16 and submits the form to the software V&V lead engineer for review. After review, the software V&V lead engineer submits the form to the software lead engineer for investigation, implementation, completion of Items 17 through 20, and appropriate disposition (Item 21). When completed, the form is returned to the software V&V lead engineer for final review, resolution, and appropriate disposition (Item 21).

1. Date: Form preparation date.
2. Severity: Circle the appropriate code. High: The change is required to correct a condition that prevents or seriously degrades a system objective (where no alternative exists) or to correct a safety-related problem. Medium: The change is required to correct a condition that degrades a system objective, to provide for performance improvement, or to confirm that the user and system requirements can be met. Low: The change is required to maintain the system, correct operator inconvenience, or other.
3. Anomaly report number: Number assigned for control purposes.
4. Title: Brief phrase or sentence describing the problem.
5. System: Name of the system or product against which the anomaly report is written.
6. Component: Component or document name against which the anomaly report is written.
7. Version: Version of the document or code against which the anomaly report is written.
8. Originator: Printed name of the individual originating the anomaly report.
9. Organization: Organization of the originator of the anomaly report.
10. Telephone: Office phone number of the individual originating the anomaly report.
11. Approval: Software management individual or designee approval for anomaly report distribution.
12. V&V task name: Name of the V&V task being performed when the anomaly was detected.
13. Reference document: Designation of the documents that provide the basis for determining that an anomaly exists.
14. System configuration: Configuration loaded when the anomaly occurred; not applicable for documentation or logic errors.
15. Anomaly description: Description defining the anomaly and a word picture of events leading up to and coincident with the problem. Cite equipment being used, unusual configurations, environment parameters, and so forth, that will enable the programmer to duplicate the situation. If continuation sheets are required, fill in Page _ of _ at the top of the form.
16. Problem duplication: Duplication attempts, successes, or failures for software errors; not applicable for documentation or logic errors.
17. Source of anomaly: On investigation completion, source of the anomaly in terms of phase origination and type.
18. Investigation time: Time, to the nearest half hour, required to determine the cause of the anomaly but not the time to determine a potential solution or the time to implement the corrective action.
19. Proposed solution: Description defining in detail a solution to the detected anomaly, including documents, components, and code.
INSTRUCTIONS FOR COMPLETING SOFTWARE ANOMALY REPORT (CONTINUED)

20. Corrective action taken: Disposition of the anomaly report, including a description of any changes initiated as a direct result of this report and the date incorporated.
21. Closure sign-off: Signature of the software lead engineer authorizing implementation of the corrective action, and signature of the V&V lead engineer verifying incorporation of the authorized changes as described in this report. Only the signature of the software lead engineer is required when no corrective action is approved.
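Where anomaly reports are tracked electronically rather than on paper, the severity rules of Item 2 can be applied mechanically so that every initiator classifies problems identically. The following is a minimal Python sketch under that assumption; the function and parameter names are illustrative, not part of this SOP.

    from enum import Enum

    class Severity(Enum):
        HIGH = "High"      # prevents or seriously degrades a system objective, or safety related
        MEDIUM = "Medium"  # degrades a system objective or limits performance
        LOW = "Low"        # maintenance, operator inconvenience, or other

    def classify_severity(prevents_objective: bool,
                          safety_related: bool,
                          degrades_objective: bool) -> Severity:
        """Apply the Item 2 severity codes in order of precedence."""
        if prevents_objective or safety_related:
            return Severity.HIGH
        if degrades_objective:
            return Severity.MEDIUM
        return Severity.LOW

For example, classify_severity(False, True, False) returns Severity.HIGH, reflecting the rule that any safety-related problem is classified High regardless of its functional impact.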
DEVELOPMENT TEST INFORMATION SHEET (DTIS)
Page [#] of [#]

Test Category: ________  Test Number: ________  Requirement: ________  Requirement Number(s): ________

1. Component objectives and component success criteria:
2. Test objectives and success criteria:
3. Test approach:
4. Test instrumentation:
5. Test duration:
6. Input data and format:
7. Output data and format:
8. Data collection, reduction, and analysis requirements:
9. Test script(s):
10. Test driver(s):
11. Test stub(s):
12. Test data stream:
13. Test control flow stream:
14. Component build description and subordinate DTISs:
15. Pretest comments:
16. Results:
17. Post-test comments:
18. Signatures:

Test Conductor ______________________  Date __________
Software Lead Engineer ______________________  Date __________
VALIDATION TEST INFORMATION SHEET (VTIS)
Page [#] of [#]

Test Category: ________  Test Number: ________  Requirement: ________  Requirement Number: ________

1. Test objectives and success criteria:
2. Test approach:
3. Test instrumentation:
4. Test duration:
5. Data collection, reduction, and analysis requirements:
6. Comments:
7. Post-test comments:
8. Signatures:

Test Conductor ______________________  Date __________
Project V&V Lead Engineer ______________________  Date __________
SOFTWARE VALIDATION TEST LOG

Location: ______________  Software Project: ______________  Date: ________  Page ____ of ____

Time    Test Number    Entry    References    Engineer
[Project/Product Name] SVTPR
SOFTWARE VALIDATION TEST PROCEDURES

Written by: [Name/Title/Position]    Date
Reviewed by: [Name/Title/Position]    Date
Approved by: [Name/Title/Position]    Date

Document Number [aaa]-SVTPR-[#.#]    Revision [#.#]    Page 1 of [#]
REVISION HISTORY

Revision    Description               Date
[##.##]     [Revision description]    [mm/dd/yy]
CONTENTS

1.0 INTRODUCTION                                3
2.0 GENERIC TEST PROCEDURES                     5
3.0 DETAILED VALIDATION TEST PROCEDURES         6
4.0 TEST REPORTING                              9
APPENDIX A VALIDATION TEST INFORMATION SHEETS  11
GLOSSARY                                       12
1.0 INTRODUCTION
1.1 Purpose

This document describes the detailed procedures for performing the software validation testing to be conducted on the [project/product name] project as described in the Software Validation Test Plan (SVTP).
1.2 Scope

This document is limited to the description of the test procedures necessary to conduct software validation testing on the [project/product name] project. These tests are to be implemented by the [project/product name] software verification and validation (V&V) team under the direction of the software V&V lead engineer.
1.3 Overview

[project/product name] software validation testing will be conducted by the software V&V engineers during the Software Validation Phase of the [project/product name] software development life cycle. Software validation testing is conducted to verify that the [project/product name] software satisfies the requirements and design specified in the [project/product name] Software Requirements Specification (SRS) and Software Detailed Design Specification (SDDS). The mode of testing will be with a buttoned-up instrument or with the use of [insert hardware emulator or other configurations here]. The generic steps necessary to start and stop this hardware are described in Section 2. The individual test procedures are described in Section 3. The Validation Test Information Sheets (VTISs) are contained in Appendix A. The results of the validation testing will be recorded on the VTISs, and data collected during the test will be recorded on the Software Validation Test Log Sheets.
1.4 Referenced Documents

The following documents of the exact issue shown form a part of this specification to the extent specified herein. In the event of conflict between the documents referenced herein and the content of this specification, the content of this specification shall be considered a superseding requirement.
1.4.1 Project Specifications

• [project/product name] Interface Design Specification, Document Number [aaa]-IDS-[#.#], Revision [#.#], dated [date]
• [project/product name] Product Objectives Document, Document Number [aaa]-POD-[#.#], Revision [#.#], dated [date]
• [project/product name] Product Requirements Document, Document Number [aaa]-PRD-[#.#], Revision [#.#], dated [date]
• [project/product name] Software Architecture Design Specification, Document Number [aaa]-ADS-[#.#], Revision [#.#], dated [date]
• [project/product name] Software Configuration Management Plan, Document Number [aaa]-CMP-[#.#], Revision [#.#], dated [date]
• [project/product name] Software Detailed Design Specification, Document Number [aaa]-DDS-[#.#], Revision [#.#], dated [date]
• [project/product name] Software Development Plan, Document Number [aaa]-SDP-[#.#], Revision [#.#], dated [date]
• [project/product name] Software Development Test Plan, Document Number [aaa]-DTP-[#.#], Revision [#.#], dated [date]
• [project/product name] Software End-product Acceptance Plan, Document Number [aaa]-EAP-[#.#], Revision [#.#], dated [date]
• [project/product name] Software Quality Assurance Plan, Document Number [aaa]-QAP-[#.#], Revision [#.#], dated [date]
• [project/product name] Software Requirements Specification, Document Number [aaa]-SRS-[#.#], Revision [#.#], dated [date]
• [project/product name] Software Validation Test Plan, Document Number [aaa]-VTP-[#.#], Revision [#.#], dated [date]
• [project/product name] Software Verification and Validation Plan, Document Number [aaa]-VVP-[#.#], Revision [#.#], dated [date]
1.4.2 Procedures and Guidelines

• Product Development Safety Design Guidelines, Revision [#.#], dated [date]
• Product Development User Interface Design Guidelines, Revision [#.#], dated [date]
• Software Engineering Configuration Management Guidelines, Revision [#.#], dated [date]
• Software Engineering Software Development Guidelines, Revision [#.#], dated [date]
• Software Engineering Software Configuration Management Policies, Revision [#.#], dated [date]
• Software Engineering Software Development Policies, Revision [#.#], dated [date]
• Software Engineering Verification and Validation Guidelines, Revision [#.#], dated [date]
• Software Engineering Verification and Validation Policies, Revision [#.#], dated [date]
2.0 GENERIC TEST PROCEDURES
This section details the test procedures that are repetitive and common to many of the individual test procedures. They are provided here rather than duplicated in every individual procedure that uses them; the individual procedures reference these procedures by section number and name.
2.1 Start-Up Procedure [Insert the correct start-up procedure steps, expected screen outputs, displays, or responses, and any special notes here.]
2.2 Restart Procedure [Insert the correct restart procedure steps, expected screen outputs, displays, or responses, and any special notes here.]
2.3 Suspend Procedure [Insert the correct suspend procedure steps, expected screen outputs, displays, or responses, and any special notes here.]
2.4 Shut-Down Procedure [Insert the correct terminate or shut-down procedure steps, expected screen outputs, displays, or responses, and any special notes here.]
2.5 Test Measurement Procedures [Insert the correct test measurement procedure steps, expected screen outputs, displays, or responses, and any special notes here.]
[2.n [Generic Procedure Name] [Insert additional test procedure steps, expected screen outputs, displays, or responses, and any special notes here.]]
3.0 DETAILED VALIDATION TEST PROCEDURES
Each validation test procedure is assigned a unique identifier of the form TTNN-nn. The "TT" field designates the test type: FU denotes a functional test, RB a robustness test, ST a stress test, and SA a safety test. Each test type is composed of several test categories, represented by the "NN" field in the identifier. Within each test category is a series of tests, and each test is assigned a sequential number, represented by the "nn" field in the identifier. Each VTIS is assigned a unique identifier, which appears as the VTIS Test Number. The identifier is of the form TTNN-nn, and the fields are defined to be the same as the validation test procedure identifier.
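Because every validation test procedure and VTIS shares the TTNN-nn identifier format, identifiers can be checked mechanically before testing begins. The following minimal Python sketch, offered as an illustration rather than a required tool, parses an identifier into its type, category, and sequence fields and rejects malformed values.

    import re

    # TT = test type (FU, RB, ST, SA); NN = test category; nn = sequential test number
    _TEST_ID = re.compile(r"^(FU|RB|ST|SA)(\d{2})-(\d{2})$")

    def parse_test_id(test_id: str) -> tuple[str, int, int]:
        """Split a TTNN-nn identifier such as 'FU01-03' into its three fields."""
        match = _TEST_ID.match(test_id)
        if match is None:
            raise ValueError(f"Malformed test identifier: {test_id!r}")
        test_type, category, number = match.groups()
        return test_type, int(category), int(number)

For example, parse_test_id("RB02-01") returns ("RB", 2, 1), while a stray identifier such as "XX1-2" raises an error instead of propagating into the test records.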
3.1 Functional Testing

3.1.1 FU01 Power-Up Function Tests
[3.1.1.n FU01-nn [test title]. For each power-up functional test series, insert the correct generic and specific procedure steps, expected outputs, displays, or responses, and any special notes in individual sections.]
3.1.2 FU02 Power-Down Function Tests
[3.1.2.n FU02-nn [test title]. For each power-down functional test series, insert the correct generic and specific procedure steps, expected outputs, displays, or responses, and any special notes in individual sections.]
3.1.3 FU03 Data Entry Function Tests
[3.1.3.n FU03-nn [test title]. For each data entry functional test series, insert the correct generic and specific procedure steps, expected outputs, displays, or responses, and any special notes in individual sections.]
[3.1.N FUNN [test title] For each functional test category, insert the correct functional test series in individual sections.]
3.2 Robustness Testing

3.2.1 RB01 Timing Robustness Test
[3.2.1.n RB01-nn [test title]. For each timing robustness test series, insert the correct generic and specific procedure steps, expected outputs, displays, or responses, and any special notes in individual sections.]
3.2.2 RB02 Software Interface Robustness Test
[3.2.2.n RB02-nn [test title]. For each software interface robustness test series, insert the correct generic and specific procedure steps, expected outputs, displays, or responses, and any special notes in individual sections.]
[3.2.N RBNN [test title] For each robustness test category, insert the correct robustness test series in individual sections.]
3.3 Stress Testing

3.3.1 ST01 Extended Operation Stress Test
[3.3.1.n ST01-nn [test title]. For each extended operation stress test series, insert the correct generic and specific procedure steps, expected outputs, displays, or responses, and any special notes in individual sections.]
3.3.2 ST02 User Interface Stress Test
[3.3.2.n ST02-nn [test title]. For each user interface stress test series, insert the correct generic and specific procedure steps, expected outputs, displays, or responses, and any special notes in individual sections.]
[3.3.N STNN [test title] For each stress test category, insert the correct stress test series in individual sections.]
3.4 Safety Testing

3.4.1 SA01 Critical Parameters Safety Tests
[3.4.1.n SA01-nn [test title]. For each critical parameters safety test series, insert the correct generic and specific procedure steps, expected outputs, displays, or responses, and any special notes in individual sections.]
3.4.2 SA02 Fail-Safe Design Safety Tests
[3.4.2.n SA02-nn [test title]. For each fail-safe design parameters safety test series, insert the correct generic and specific procedure steps, expected outputs, displays, or responses, and any special notes in individual sections.]
[3.4.N SANN [test title] For each safety test category, insert the correct safety test series in individual sections.]
4.0 TEST REPORTING
4.1 Validation Test Information Sheet

The VTISs for the validation testing of the [project/product name] software are provided in Appendix A. Completed VTISs will be provided to the [project/product name] software V&V lead engineer for review and approval. All VTISs will be kept in a central location and made available for review by the [project/product name] software lead engineer and the [corporate title/position].
4.2 Software Test Log

The Software Test Log will be used to document all significant V&V activities during the [project/product name] software validation testing. During and upon completion of testing, the Software Test Log will be used to evaluate test results and support regression testing.
4.3 Software Test Report

The Software Validation Test Report will be generated at the conclusion of all testing and will be a record of the software validation testing performed on the [project/product name] software.
4.4 Software Anomaly Report

[project/product name] software problem reporting will be initiated by the V&V lead engineer through the Software Anomaly Report. The specific information required in the report identifies how, when, and where the problem occurred and the impact of the problem on the system capability of the product and on the continued conduct of testing. The Software Anomaly Report form is provided in the [project/product name] Software Verification and Validation Plan (SVVP). Each anomaly report contains the following (a completeness-check sketch follows the list):
• Description and location of the anomaly
• Severity of the anomaly, if determinable
• Cause and method of identifying the anomalous behavior
• Recommended action and actions taken to correct the anomalous behavior
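Before a completed report is filed, the four required content elements above can be verified mechanically. This short Python sketch is illustrative only; the field names are assumptions rather than anything prescribed by this SOP.

    # Required content elements of each Software Anomaly Report
    REQUIRED_FIELDS = (
        "description_and_location",
        "severity",                      # recorded only if determinable
        "cause_and_method",
        "recommended_and_taken_actions",
    )

    def missing_fields(report: dict) -> list[str]:
        """Return the required report elements that are absent or empty."""
        return [field for field in REQUIRED_FIELDS if not report.get(field)]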
APPENDIX A  VALIDATION TEST INFORMATION SHEETS
[Insert all VTISs here.]
GLOSSARY

Anomaly: Anything observed in the documentation or operation of software that deviates from expectations based on previously verified software products or reference documents.

Baseline: Specification or product that has been formally reviewed and agreed upon, that thereafter serves as the basis for further development, and that can be changed only through formal change control procedures.

Change control: Process by which a change is proposed, evaluated, approved or rejected, scheduled, and tracked.

Code: Loosely, one or more computer programs or part of a computer program.

Component: Unit of code that performs a specific task, or a group of logically related code units that perform a specific task or set of tasks.

Computer program: Sequence of instructions suitable for processing by a computer. Processing may include the use of an assembler, a compiler, an interpreter, or a translator to prepare the program for execution as well as to execute it.

Design phase: Period in the software development cycle during which the designs for architecture, software components, interfaces, and data are created, documented, and verified to satisfy requirements.

Deviation: Authorization for a future activity, event, or product that departs from standard procedures.

Documentation: Manuals, written procedures or policies, records, or reports that provide information concerning uses, maintenance, or validation of software.

Implementation phase: Period in the software development cycle during which a software product is created from design documentation and debugged.

Requirements phase: Period in the software development cycle during which the requirements, such as functional and performance capabilities for a software product, are defined and documented.

Robustness: Extent to which software can continue to operate correctly despite the introduction of invalid inputs.
Safety: Provision of a very high degree of freedom, within the constraints of system effectiveness and cost, from those conditions that can cause death, injury, occupational illness, or damage to or loss of equipment or property.

Software: Computer programs, procedures, rules, and associated documentation and data pertaining to the operation of a computer system.

Software Detailed Design Specification (SDDS): Project-specific document that constitutes an update to and an expansion of the design baseline established at the Architecture Design Review, including a description of the overall program operation and control, and the use of common data. The detailed design is described through the lowest component level of software organization and the lowest logical level of database organization.

Software development life cycle: Period that starts with the development of a software product and ends when the product is validated and delivered for QA certification. This life cycle includes a requirements phase, design phase, implementation phase, and software validation phase.

Software documentation: Documents that are necessary design and planning tools for disciplined and successful software development.

Software end products: Computer programs, software documentation, and databases produced by a software development project.

Software project: Planned and authorized undertaking of specified scope and duration that results in the expenditure of resources toward the development of a product that is primarily one or more computer programs.

Software Requirements Specification (SRS): Project-specific document that provides a controlled statement of the functional, performance, and external interface requirements for the software end products.

Software Validation Phase: Period in the software development life cycle in which the components of a software product are evaluated and integrated and the entire software product is evaluated to determine whether requirements have been satisfied.

Software Verification and Validation Plan (SVVP): Project-specific plan that describes the project's unique verification and validation organization, activities, schedule, inputs/outputs, and any deviations from the software policies required for effective management of verification and validation tasks.
Test Information Sheet (TIS): Document that defines the objectives, approach, and requirements for a specific test.

Validation: Process of evaluating software at the end of the software development process to ensure compliance with software requirements.

Verification: Process of determining whether the products of a given phase of the software development cycle fulfill the requirements established during the previous phase.
[Project/Product Name] SVVR
SOFTWARE VERIFICATION AND VALIDATION REPORT

WRITTEN BY:   [signature]                              DATE: [mm/dd/yy]
              [V&V lead engineer name], V&V Lead Engineer

REVIEWED BY:  [signature]                              DATE: [mm/dd/yy]
              [software lead engineer name], Software Lead Engineer

APPROVED BY:  [signature]                              DATE: [mm/dd/yy]
              [name of approver], [title/position]

SUBJECT:      [project/product name] Software Revision [#.##]
              Document Number [######] Revision [L]
              Software CRC [######], Checksum [######]
V&V testing of the [project/product name] Revision [#.##] software has been completed with no outstanding safety anomalies. The quality of the software is deemed acceptable for release. The following [project/product name] software specifications and reports have been created and reviewed for this version of software and are located in [enter controlled storage location name]:

• Interface Design Specification (IDS), Revision [L]
• Software Requirements Specification (SRS), Revision [L]
• Software Detailed Design Specification (SDDS), Revision [L]
• Requirements Traceability Matrix (RTM), dated [date]
• Software Anomaly Reports, dated [date]
Phase Summary Reports were generated for each phase of the project. [Enter summary of V&V tasks performed].
The following tests and reviews were performed:

1. CRA Review Summary CRA [Mnnn-aaa-nnn]: Reviewed source code changes from Revision [#.##]. Noted that this revision [special notes]. The major changes included [summary of changes by file].

2. CRA Test Summary CRA [Mnnn-aaa-nnn]: [Summarize those functions that were tested and describe how they were tested and the relevant results.]

[3. CRA Code Fix Test Summary CRA [Mnnn-aaa-nnn]: Inspected code change to fix [errant behavior] for Revision(s) [#.##].]

4. Chronological Test Record
   • Requirements Phase: [enter ending date]
   • Architecture Design Phase: [enter ending date]
   • Detailed Design Phase: [enter ending date]
   • Code and Test and Integrate and Test Phase: [enter ending date]
   • Software Validation Phase: [enter ending date]
[OPTIONAL

5. Test Input Sources Other Than the SRS and SDDS
   A. [Enter source]

6. Process Recommendations
   The following software process recommendations are provided as feedback for future software development efforts.
   A. [Enter recommendation]]
CHANGE REQUEST/APPROVAL (CRA) FORM

1. System name: ______________________________________    2. CRA Number: __________
3. Application Level:   ❑ SOFTWARE   ❑ DOCUMENT   ❑ OTHER
4. a. Originating Organization    b. Initiator    c. Telephone    d. Date
5. Configuration Baseline Affected (highest level):
6. Change Classification:   Class I ❑   Class II ❑   Class III ❑
7. Configuration Items Affected:
   a. ______________________
   b. ______________________
   c. ______________________
   d. ______________________
   e. ______________________
8. Narrative: (if additional space is needed, indicate here ___ Page ___ of ___)
   a. Description of change:
   b. Need for change:
   c. Estimated effects on other systems, software, or equipment:
   d. Alternatives:
   e. Anomaly Number (if any) used to generate this CRA:
9. Disposition:   Approved ____   Disapproved ____   Additional Analysis ____
   Signature: ____________________________________________________________   DATE: ______
10. Change Verification Results:
11. V&V Signature: ________________________________________________   12. Date: ______
13. Date Closed: ________________   14. Signature: __________________________________
[Project/Product Name] SCAR
SOFTWARE CONFIGURATION AUDIT REPORT

Configuration Number: [aaa-nnn-v.vv]
Item Name or Description: [title, name, or description]
Software Version: [number, if appropriate]
Document Number: [number, if appropriate]
Document Revision: [baseline, build, or version number]

GENERATED BY: [signature]    DATE: [mm/dd/yy]
[configuration manager name], Configuration Management Engineer
[Project/Product Name] SCSR
SOFTWARE CONFIGURATION STATUS REPORT

[date]

TO:       [title/position]
          [name of software lead engineer]
          [name of V&V lead engineer]
          [project/product name] Product History File

FROM:     [name of project configuration management engineer]

SUBJECT:  [project/product name] Software Configuration Status Report (SCSR)

This SCSR documents the results of configuration status accounting for the [project/product name] baseline of [date]. This SCSR was generated in part with the configuration management script [name of automated tool]. The format of the reporting for the software build files under source code control is: file name, type of change since last SCSR, control identification number, date and time of creation, user name of creator, serial number of change, predecessor serial number, and line-by-line change statistics indicating insertion, change, or deletion.
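The per-file record layout described above maps naturally onto a fixed data structure. The following Python sketch, with hypothetical field names, shows one way an automated configuration management script might render a status line per build file; it is an illustration, not a format mandated by this SOP.

    from dataclasses import dataclass

    @dataclass
    class BuildFileStatus:
        file_name: str
        change_type: str   # type of change since the last SCSR
        control_id: str    # control identification number
        created: str       # date and time of creation
        creator: str       # user name of creator
        serial: str        # serial number of this change
        predecessor: str   # predecessor serial number
        inserted: int      # line insertions
        changed: int       # line changes
        deleted: int       # line deletions

    def status_line(s: BuildFileStatus) -> str:
        """Render one file's entry in the reporting order given above."""
        return (f"{s.file_name} {s.change_type} {s.control_id} {s.created} "
                f"{s.creator} {s.serial} {s.predecessor} "
                f"+{s.inserted}/~{s.changed}/-{s.deleted}")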
I.  Baseline Identification: [aaa]-[LLL]-[##]

II.A  List of all Software Anomaly Reports

Anomaly Number and Date    Date of Fix    Associated CRAs    Disposition    Date of Closure
[Enter anomaly report information here.]

II.B  List of all CRAs

CRA Number    Disposition    Date of Closure
[Enter CRA information here.]
III.A  List of all baselined material under [configuration control tool name] control

File Name    Type of Change    Control ID    Date and Time    Username    Serial Number    Predecessor Serial Number    Change Statistics
[Enter build files information here.]

III.B  Build contents not under [configuration control tool name] control

File Name    Type of Change    Control ID    Date and Time    Username    Serial Number    Predecessor Serial Number    Change Statistics
[Enter build files information here.]
[Project/Product Name] SOFTWARE TEAM REPORT CARD

Please grade how well the software team performed the following software aspects of the project, from your perspective as either a software developer or as a software V&V individual. Grades are from highest (1) to lowest (5).

Subject                          Developer      V&V
Analysis                         1 2 3 4 5      1 2 3 4 5
Requirements                     1 2 3 4 5      1 2 3 4 5
Design                           1 2 3 4 5      1 2 3 4 5
Code                             1 2 3 4 5      1 2 3 4 5
Test                             1 2 3 4 5      1 2 3 4 5
Integration                      1 2 3 4 5      1 2 3 4 5
Verification and validation      1 2 3 4 5      1 2 3 4 5
Configuration management         1 2 3 4 5      1 2 3 4 5
Hardware interaction (*)         1 2 3 4 5      1 2 3 4 5

* How well, overall, did the software process and team interact with the hardware teams?