Benchmark testing the Digital Imaging Network-Picture Archiving and Communications System proposal of the Department of Defense

1999 
The Department of Defense issued a Request for Proposal (RFP) for its next-generation Picture Archiving and Communications System in January 1997. The RFP was titled Digital Imaging Network-Picture Archiving and Communications System (DIN-PACS). Benchmark testing of the proposed vendors' systems occurred during the summer of 1997. This article highlights the methods used to organize the test material and test system, the major areas tested, and the conduct of the actual testing.

Department of Defense and contract personnel wrote the benchmark test procedures based on the key features of the DIN-PACS Request for Proposal, and identical testing was performed on each vendor's system. The Digital Imaging and Communications in Medicine (DICOM) images used for the benchmark testing included all modalities and were verified as DICOM compliant by the Mallinckrodt Institute of Radiology's Electronic Radiology Laboratory. The Johns Hopkins University Applied Physics Laboratory prepared the Unix-based server that held the DICOM images and operated it during testing; the server was loaded with the images and shipped to each vendor's facility for on-site testing. The Defense Supply Center, Philadelphia (DSCP), the Department of Defense agency managing the DIN-PACS contract, provided representatives at each vendor site to ensure that all tests were performed equitably and without bias. Each vendor's system was evaluated in nine major areas: DICOM Compliance; System Storage and Archive of Images; Network Performance; Workstation Performance; Radiology Information System Performance; Composite Health Care System/Health Level 7 (HL7) Interface Performance; Teleradiology Performance; Quality Control; and Failover Functionality. These major sections were subdivided into workable test procedures, each procedure was scored, and a combined score for each section was compiled from those scores.

The names of the vendors involved and the scores for each are contract sensitive and therefore cannot be discussed. All of the vendors that underwent benchmark testing did well; no single vendor was markedly superior or inferior, abilities followed a typical bell-shaped curve, and each vendor had its own strengths and weaknesses. A standardized benchmark protocol and testing system for PACS architectures would be of great value to any agency planning to purchase a PACS, because the added information would help ensure that the purchased system meets the functional requirements outlined in the purchaser's PACS Request for Proposal.
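As a minimal sketch of the kind of automated header check that DICOM compliance verification can involve, the Python fragment below reads each test image and confirms that a few mandatory attributes are present and non-empty. The article does not describe the Mallinckrodt laboratory's actual procedure; the pydicom library, the "test_images" directory, and the particular attribute list are assumptions chosen purely for illustration.

    from pathlib import Path

    import pydicom
    from pydicom.errors import InvalidDicomError

    # Attributes every test image is assumed to need; the real required set
    # depends on the SOP class and on the RFP's conformance requirements.
    REQUIRED_KEYWORDS = [
        "SOPClassUID",
        "SOPInstanceUID",
        "StudyInstanceUID",
        "SeriesInstanceUID",
        "Modality",
    ]

    def check_file(path):
        """Return a list of problems found in one DICOM file (empty if none)."""
        try:
            ds = pydicom.dcmread(path)  # parses the Part 10 file meta and data set
        except (InvalidDicomError, OSError):
            return ["%s: not a readable DICOM Part 10 file" % path.name]
        problems = []
        for keyword in REQUIRED_KEYWORDS:
            if keyword not in ds or getattr(ds, keyword, "") in ("", None):
                problems.append("%s: missing or empty %s" % (path.name, keyword))
        return problems

    if __name__ == "__main__":
        # "test_images" is a hypothetical directory holding the benchmark image set.
        for dicom_file in sorted(Path("test_images").glob("*.dcm")):
            for problem in check_file(dicom_file):
                print(problem)

In practice such a script would only be a first pass; full conformance also depends on the network services (storage, query/retrieve) exercised during the benchmark itself.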