Museum Guide 2.0 – An Eye-Tracking based Personal Assistant for Museums and Exhibits
2011
This paper describes a new prototypical application based on a head-mounted mobile eye tracker combined with content-based image retrieval technology. The application, named “Museum Guide 2.0”, acts as an unobtrusive personal guide for a museum visitor. When it detects that the user is watching a specific art object, it provides audio information on that object via earphones. The mobile eye tracker observes the visitor’s eye movements and synchronizes the images of the scene camera with the detected eye fixations. The built-in image retrieval subsystem recognizes which of the art objects in the exhibition, if any, the user’s eyes are currently fixating. Challenges faced during our research include modifying the retrieval process to exploit a given fixation for better accuracy, and detecting conscious attention to one specific object, as the trigger event for information delivery, while distinguishing it from noise (unconscious fixations). This paper focuses on the application aspect of Museum Guide 2.0. It describes how a database of given art objects is created from scratch and how the runtime application is used. We conclude with a user study conducted to evaluate the acceptance of the system, specifically in contrast to conventional audio-player-based approaches.
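The trigger logic described above, distinguishing a conscious look at an object from unconscious fixation noise, can be sketched as a dwell-time filter over the recognized fixation stream. This is a minimal illustration, not the paper's actual implementation; the function name, event format, and the 1500 ms threshold are assumptions for the sketch:

```python
DWELL_THRESHOLD_MS = 1500  # assumed dwell time separating conscious viewing from noise


def detect_conscious_fixation(events, threshold_ms=DWELL_THRESHOLD_MS):
    """Given a stream of (timestamp_ms, object_id or None) fixation events,
    where object_id is the art object the retrieval subsystem recognized
    for that fixation (None if no object was recognized), yield an object
    once continuous gaze on it exceeds the threshold. Each continuous
    gaze triggers at most once, so audio is not delivered repeatedly."""
    current = None      # object currently being looked at
    start = None        # timestamp when the current gaze began
    triggered = False   # whether this gaze already fired a trigger
    for ts, obj in events:
        if obj != current:
            # Gaze moved to a different object (or away): reset the timer.
            current, start, triggered = obj, ts, False
        elif obj is not None and not triggered and ts - start >= threshold_ms:
            triggered = True
            yield obj  # conscious fixation detected: deliver audio for obj


# Simulated stream: a brief glance at "vase" (noise, below threshold),
# then a sustained gaze at "statue" that crosses the threshold.
events = [(0, "vase"), (300, "vase"), (600, None),
          (900, "statue"), (1500, "statue"), (2600, "statue")]
print(list(detect_conscious_fixation(events)))  # → ['statue']
```

A threshold of this kind is a common way to separate deliberate attention from saccadic noise; the paper's actual criterion may be more elaborate.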