Date Added: Sep 2010
The authors have developed a prototype platform for contextual information access in mobile settings. Objects, people, and the environment are treated as contextual channels, or cues, to further information. Based on gaze, speech, and other implicit feedback signals, the system infers which of the contextual cues are relevant, retrieves information related to those cues, and presents it with Augmented Reality (AR) techniques on a handheld or head-mounted display. The augmented information itself becomes a set of potential contextual cues, whose relevance is assessed in turn to retrieve further information. In essence, the platform turns the real world into an information browser that proactively focuses on the information inferred to be most relevant to the user.
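The core loop the abstract describes (score cues from implicit feedback, retrieve for the relevant ones, feed the retrieved items back in as new cues) can be sketched as below. This is a toy illustration under stated assumptions, not the authors' implementation: `Cue`, the dwell-time relevance estimate, and `retrieve_related` are all hypothetical stand-ins.

```python
from dataclasses import dataclass

@dataclass
class Cue:
    label: str
    dwell_ms: float = 0.0   # accumulated gaze dwell time (implicit feedback)
    depth: int = 0          # 0 = real-world cue, >0 = augmented information

def relevance(cue: Cue) -> float:
    # Toy relevance estimate: longer gaze dwell -> higher relevance,
    # saturating toward 1.0 (half-relevance at 500 ms, an arbitrary choice).
    return cue.dwell_ms / (cue.dwell_ms + 500.0)

def retrieve_related(cue: Cue) -> list[Cue]:
    # Stand-in for a retrieval backend; the returned augmented-info
    # items become candidate cues themselves on the next iteration.
    return [Cue(label=f"{cue.label}/info{i}", depth=cue.depth + 1)
            for i in range(2)]

def step(cues: list[Cue], threshold: float = 0.5) -> list[Cue]:
    """One iteration of the loop: score every cue, retrieve for the
    relevant ones, and add the retrieved items to the cue pool."""
    retrieved = []
    for cue in cues:
        if relevance(cue) > threshold:
            retrieved.extend(retrieve_related(cue))
    return cues + retrieved

# The user dwells on a person but only glances at a poster, so only
# the person-cue triggers retrieval.
world = [Cue("person:Alice", dwell_ms=1200), Cue("poster", dwell_ms=50)]
world = step(world)
print([c.label for c in world])
```

Running `step` again would score the newly added `info` cues as well, giving the self-reinforcing browse-by-looking behaviour the abstract describes.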