User interface and method for automated positioning of an examination table relative to a medical imaging system

2014 
The user interface and the method allow a user to freeze the camera image with a first user interaction. Subsequently, the user may set reference location information R in the frozen camera image 15 with a second user interaction. The reference location information R is, for example, a start line for a scanning area of a patient P, or the area to be scanned itself. The user interface and the method eliminate the need for the user to mark the reference position information R as a dot on a sheet, or on low-contrast surface features of a patient P, and to follow it visually while the examination table, and thus the reference position information R, moves into the gantry, during which the user must also, if necessary, observe and control the patient P and the patient's cabling. In contrast, determining the reference location information R in the frozen camera image 15 is considerably more practical, mentally less strenuous, and less prone to error. In a development, the microprocessor detects a pointing gesture in the frozen camera image 15 and determines mark location information M as a function of the detected pointing gesture. Pointing gestures are particularly well suited for producing mark location information M that the user has indicated, and based on which the reference location information R can be set.
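The two-step interaction described above (first interaction freezes the camera image, second interaction derives the reference location information R from a mark location M obtained from a pointing gesture) can be illustrated with a minimal, dependency-free sketch. The class and method names below are illustrative only and assume a horizontal scan start line defined by a pixel row; they are not taken from the patent.

```python
from dataclasses import dataclass
from typing import Optional, Tuple

Pixel = Tuple[int, int]   # (x, y) position in the camera image
Frame = bytes             # placeholder for raw image data


@dataclass
class TablePositioningUI:
    """Minimal model of the two-step interaction described above."""
    frozen_frame: Optional[Frame] = None     # camera image 15 after freezing
    mark_location: Optional[Pixel] = None    # mark location information M
    reference_line_y: Optional[int] = None   # reference location information R (scan start line)

    def freeze(self, live_frame: Frame) -> None:
        """First user interaction: freeze the current camera image."""
        self.frozen_frame = live_frame

    def register_pointing_gesture(self, fingertip: Pixel) -> None:
        """Derive mark location information M from a detected pointing gesture."""
        if self.frozen_frame is None:
            raise RuntimeError("Freeze the camera image before setting a mark")
        self.mark_location = fingertip

    def set_reference(self) -> int:
        """Second user interaction: set R (here, a horizontal start line) from M."""
        if self.mark_location is None:
            raise RuntimeError("No mark location M available")
        self.reference_line_y = self.mark_location[1]  # use the y coordinate as the start line
        return self.reference_line_y


if __name__ == "__main__":
    ui = TablePositioningUI()
    ui.freeze(live_frame=b"\x00" * 640 * 480)            # first interaction: freeze image
    ui.register_pointing_gesture(fingertip=(320, 210))   # pointing gesture -> M
    print("Scan start line (row):", ui.set_reference())  # second interaction -> R
```

In this sketch the reference R is simply the image row indicated by the gesture; in practice the frozen image coordinates would be mapped to a table position for the automated positioning.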