Tuesday 2 February 2016

Eye-Based Human-Computer Interaction

An advanced approach to man-machine interaction is proposed, in which computer vision techniques are used to interpret user actions. The key idea is the combined use of head motions for visual navigation and eye pupil positions for context switching within the graphical human-computer interface. This allows a partial decoupling of the visual models used for tracking eye features, with beneficial effects on both computational speed and adaptation to user characteristics. The applications range from navigation and selection in virtual reality and multimedia systems to aids for the disabled and the monitoring of typical user actions in front of advanced terminals.

The feasibility of the approach is tested and discussed in the case of a virtual reality application, the virtual museum.
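As a rough illustration of the decoupling idea described above, the sketch below (not taken from the paper; all names, gains, and thresholds are hypothetical assumptions) maps head rotation to continuous camera motion in the virtual scene, while the pupil position is quantized into coarse regions whose changes trigger discrete context switches in the interface.

```python
# Minimal sketch, assuming a vision front-end that already delivers head pose
# and pupil position. Head motion drives continuous navigation; pupil position
# is quantized into screen regions used as interface contexts.
from dataclasses import dataclass


@dataclass
class HeadPose:
    yaw: float    # degrees, left/right head rotation
    pitch: float  # degrees, up/down head rotation


@dataclass
class PupilPosition:
    x: float  # normalized horizontal position in the eye image, 0..1
    y: float  # normalized vertical position in the eye image, 0..1


def navigation_command(pose: HeadPose, gain: float = 0.05) -> dict:
    """Map head rotation to a continuous pan/tilt of the virtual camera."""
    return {"pan": gain * pose.yaw, "tilt": gain * pose.pitch}


def context_from_pupil(pupil: PupilPosition) -> str:
    """Quantize pupil position into coarse regions acting as UI contexts."""
    col = "left" if pupil.x < 0.33 else "right" if pupil.x > 0.66 else "center"
    row = "top" if pupil.y < 0.33 else "bottom" if pupil.y > 0.66 else "middle"
    return f"{row}-{col}"


def interaction_step(pose: HeadPose, pupil: PupilPosition, current_context: str):
    """One cycle of the loop: head -> navigation, pupil -> context switching."""
    command = navigation_command(pose)
    new_context = context_from_pupil(pupil)
    switched = new_context != current_context
    return command, new_context, switched


if __name__ == "__main__":
    # Simulated measurements standing in for the computer vision trackers.
    pose = HeadPose(yaw=12.0, pitch=-4.0)
    pupil = PupilPosition(x=0.8, y=0.5)
    cmd, ctx, switched = interaction_step(pose, pupil, current_context="middle-center")
    print(cmd, ctx, "context switch" if switched else "same context")
```

Because the two estimators feed separate channels, the head-pose model can run at a different rate or resolution than the pupil model, which is one way to read the claimed benefit for computational speed.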
