Real-Time Virtual View Generation for an Augmented Virtuality System
This paper describes a novel method of real-time virtual view generation for an augmented virtuality system. The aim is to correct the displacement between the user's virtual eye positions and the video cameras attached to the head-mounted display. The method consists of three steps. First, stereo calibration is used to estimate the cameras' intrinsic and extrinsic parameters. Second, the image planes are rectified to the canonical configuration, aligning them so that corresponding points lie on the same scan lines. Finally, the virtual view images between the two cameras are generated using view morphing, a shape-preserving transition from a source view to a destination view.
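As a minimal sketch of the final step, assuming the views have already been rectified to the canonical (parallel-camera) configuration and per-pixel correspondences are known, view morphing reduces to linear interpolation of corresponding point positions together with a cross-dissolve of their intensities. The function and array names below are hypothetical, not from the paper:

```python
import numpy as np

def morph(points_left, points_right, colors_left, colors_right, s):
    """Shape-preserving morph between two rectified (parallel) views.

    In the canonical configuration, linearly interpolating corresponding
    image points with parameter s in [0, 1] yields a geometrically valid
    in-between view; intensities are blended with the same weight.
    Inputs are illustrative arrays of matched features.
    """
    pts = (1.0 - s) * points_left + s * points_right
    cols = (1.0 - s) * colors_left + s * colors_right
    return pts, cols

# Toy example: one corresponding point pair lying on the same scan line
# (equal y after rectification), morphed to the halfway virtual view.
pl = np.array([[10.0, 5.0]])   # point in the left camera image
pr = np.array([[4.0, 5.0]])    # same point in the right camera image
cl = np.array([[200.0]])       # grey level in the left image
cr = np.array([[100.0]])       # grey level in the right image
pts, cols = morph(pl, pr, cl, cr, 0.5)
# Midway view: x = 7.0, y stays 5.0; blended intensity = 150.0
```

Note that this covers only the canonical case; for cameras in general position, view morphing prewarps the images to a common plane, interpolates, and postwarps the result, which is exactly why the rectification step precedes the morph.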