Computer-Assisted Detection Devices in Radiology: Performance Testing, Documentation and Labeling Requirements
This guidance applies to computer-assisted detection (CADe) devices applied to radiology images and radiology device data, classified under 21 CFR 892.2050. These devices are computerized systems intended to identify, mark, highlight, or direct attention to portions of radiology images or data that may reveal abnormalities during interpretation by clinicians. The guidance covers CADe devices marketed as complete packages with review workstations or as add-on software for imaging equipment or platforms.
What You Need to Know?
What documentation level is required for CADe devices under the new FDA software guidance?
Computer-assisted detection (CADe) devices applied to radiology images should generally address recommendations for a Basic Documentation Level, replacing the previous Moderate Level of Concern classification under the superseded 2005 guidance.
Which product codes are covered under this CADe guidance?
The guidance applies to CADe devices classified under 21 CFR 892.2050 with product codes NWE (colon CT CADe), OEB (lung CT CADe), and OMJ (chest x-ray CADe). New product codes will be created for additional CADe types.
When is a clinical performance assessment required for CADe devices?
A clinical performance assessment may not be necessary if you can directly compare the device's standalone performance to the predicate's and demonstrate improved sensitivity at a stable false positive rate (or a reduced false positive rate at stable sensitivity), with mark characteristics consistent with the predicate.
What are the key requirements for standalone performance testing of CADe devices?
Standalone testing must use independent test data, include detection and localization accuracy, provide FROC curves with confidence intervals, and demonstrate performance across relevant subgroups like lesion size and imaging protocols.
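The FROC (free-response ROC) analysis mentioned above plots lesion-level sensitivity against the mean number of false positive marks per case as the mark-confidence threshold is swept. A minimal sketch of that computation (the mark scores and counts below are hypothetical, and a real submission would also bootstrap confidence intervals over cases):

```python
# Minimal FROC sketch: lesion-level sensitivity vs. mean false positives
# per case, swept over the CADe mark confidence threshold.
# All data below are hypothetical illustration values.

def froc_points(marks, n_lesions, n_cases):
    """marks: list of (confidence_score, is_true_positive) for every CADe mark."""
    thresholds = sorted({s for s, _ in marks}, reverse=True)
    points = []
    for t in thresholds:
        kept = [(s, tp) for s, tp in marks if s >= t]
        tps = sum(1 for _, tp in kept if tp)
        fps = sum(1 for _, tp in kept if not tp)
        points.append((fps / n_cases, tps / n_lesions))
    return points  # (mean FPs per case, sensitivity) pairs

marks = [(0.9, True), (0.8, False), (0.7, True), (0.4, False), (0.3, True)]
for fp_rate, sens in froc_points(marks, n_lesions=4, n_cases=2):
    print(f"{fp_rate:.1f} FPs/case -> sensitivity {sens:.2f}")
```

Each point on the resulting curve is one operating point of the device; reporting the curve rather than a single sensitivity/false-positive pair is what distinguishes FROC from a fixed-threshold summary.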
How should test data reuse be handled in CADe device evaluation?
Test data reuse should be minimized. If necessary, implement strict controls including firewalls, audit trails, limited reuse frequency, and ensure algorithm developers cannot access individual case data or performance results.
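One way to implement the audit-trail control described above is an append-only log in which each test-set access is chained to the previous entry by a hash, so later tampering is detectable. This is a hypothetical sketch (class name, fields, and workflow are illustrative, not a prescribed mechanism):

```python
# Hypothetical append-only audit trail for sequestered test-data access:
# each entry records who touched the data and why, and is hash-chained to
# the previous entry so edits to earlier records are detectable.
import hashlib
import json
import time

class AuditTrail:
    def __init__(self):
        self.entries = []

    def log_access(self, user, dataset_id, purpose):
        prev_hash = self.entries[-1]["hash"] if self.entries else "0" * 64
        record = {"user": user, "dataset": dataset_id, "purpose": purpose,
                  "time": time.time(), "prev": prev_hash}
        # Hash is computed over the record before the "hash" key is added.
        record["hash"] = hashlib.sha256(
            json.dumps(record, sort_keys=True).encode()).hexdigest()
        self.entries.append(record)

    def verify(self):
        """Return True only if every entry's hash chain is intact."""
        prev = "0" * 64
        for e in self.entries:
            body = {k: v for k, v in e.items() if k != "hash"}
            expected = hashlib.sha256(
                json.dumps(body, sort_keys=True).encode()).hexdigest()
            if e["prev"] != prev or e["hash"] != expected:
                return False
            prev = e["hash"]
        return True

trail = AuditTrail()
trail.log_access("analyst_1", "test_set_v2", "final standalone evaluation")
print(trail.verify())
```

A production control would also restrict who can call the logger at all; the point of the sketch is that reuse events are recorded immutably, not hidden.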
What training requirements exist for CADe device users?
Training must help clinicians use the device appropriately, covering expected advantages, known limitations, typical true/false positives and negatives, optimal device settings, and suitable reading scenarios using broad patient datasets including normal cases.
What You Need to Do
Recommended Actions
- Determine if clinical performance assessment is needed based on ability to directly compare standalone performance with predicate
- Design comprehensive standalone performance testing protocol with independent test dataset
- Establish reference standard definition and scoring process before testing
- Develop detailed algorithm documentation including design, features, and processing steps
- Conduct generalizability testing across different imaging technologies
- Create user training program covering device capabilities and limitations
- Prepare comprehensive labeling including performance data and warnings
- Document all testing methodologies and maintain data integrity controls
- Submit electronic data for statistical analyses when possible
- Establish audit trail if reusing any test data
Key Considerations
Clinical testing
- Clinical performance assessment may be required when standalone performance cannot be directly compared to predicate device
- Clinical study should compare device performance to control modality (typically unaided reading)
- Study should demonstrate statistical significance in performance improvement
- Study population should be representative of intended use population
Non-clinical testing
- Standalone performance testing required for all submissions
- Testing database must be independent from training data
- Performance metrics should include sensitivity and false positive rates with confidence intervals
- Stratified analysis by relevant confounders required
- Generalizability testing across different acquisition technologies needed
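For the confidence intervals called for above, a standard choice for a binomial proportion such as sensitivity is the Wilson score interval. A short sketch (the 170-of-200 detection count is a hypothetical example, not data from any study):

```python
# Wilson score 95% confidence interval for a binomial proportion, a common
# method for reporting sensitivity with its uncertainty.
import math

def wilson_ci(successes, n, z=1.96):
    p = successes / n
    denom = 1 + z**2 / n
    center = (p + z**2 / (2 * n)) / denom
    half = (z / denom) * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2))
    return center - half, center + half

lo, hi = wilson_ci(170, 200)  # e.g. 170 of 200 lesions detected
print(f"sensitivity 0.85, 95% CI ({lo:.3f}, {hi:.3f})")
```

The same computation applies per subgroup in a stratified analysis; small subgroups will show visibly wider intervals, which is itself useful information for the review.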
Human Factors
- User training procedures must be provided
- Training should cover device advantages and limitations
- Training should help users identify appropriate device settings and reading scenarios
Software
- Documentation level is generally "Basic" for CADe devices
- Software documentation should follow FDA software guidance
- Algorithm design and function must be described in detail
- Processing steps, features, models and classifiers must be documented
Labeling
- Must include indications for use, directions, warnings and precautions
- Should describe device limitations and potential adverse events
- Must include summary of clinical and standalone performance
- Should specify compatible devices and acquisition techniques
Safety
- Warnings about not relying solely on CADe output
- Discussion of potential adverse events from false positives and missed abnormalities required
Other considerations
- Reference standard definition and scoring process must be established before testing
- Electronic submission of study data recommended
- Audit trail required when reusing test data
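A pre-specified scoring process typically includes an objective rule for deciding when a CADe mark counts as a true positive against the reference standard. One common style of rule, sketched here with hypothetical coordinates and a hypothetical distance criterion (the actual criterion must be defined before testing begins):

```python
# Hypothetical pre-specified scoring rule: a CADe mark is a true positive
# if it falls within a fixed radius of a reference-standard lesion center;
# each reference lesion may be credited at most once.
import math

def score_marks(marks, lesions, radius=10.0):
    """marks and lesions are (x, y) points; returns (TP, FP, missed)."""
    matched = set()
    tp = fp = 0
    for mx, my in marks:
        hit = None
        for i, (lx, ly) in enumerate(lesions):
            if i not in matched and math.hypot(mx - lx, my - ly) <= radius:
                hit = i
                break
        if hit is None:
            fp += 1
        else:
            matched.add(hit)
            tp += 1
    return tp, fp, len(lesions) - len(matched)

print(score_marks([(12, 14), (80, 80)], [(10, 10), (50, 50)]))
```

Fixing this rule (and the radius, or an overlap criterion for volumetric data) before testing prevents the scoring definition from being tuned to the results.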
Relevant Guidances
- Content of Premarket Submissions for Device Software Functions
- Technical Performance Assessment and Premarket Requirements for Digital Diagnostic Radiology Display Devices
- Applying Human Factors Engineering and Usability Engineering to Medical Devices
- Content and Decision-Making Process for 510(k) Submissions: Determining Substantial Equivalence
Related references and norms
- DICOM Std: Digital Imaging and Communications in Medicine Standard
- JPEG Std: Joint Photographic Experts Group Standard
- SMPTE Test Pattern: Society of Motion Picture and Television Engineers Test Pattern
Original guidance
- Computer-Assisted Detection Devices in Radiology: Performance Testing, Documentation and Labeling Requirements
- Issue date: 2022-09-28
- Last changed date: 2022-09-27
- Status: FINAL
- Official FDA topics: Medical Devices, Radiation-Emitting Products, Radiology, Premarket
- ReguVirta ID: 5b6f4cc7c6f81c81ef864a4c9e419bcf