Prince, S., Cheok, A., Farbiz, F., Williamson, T., Johnson, N., Billinghurst, M. and Kato, H. (2002) 3D Live: Real Time Captured Content for Mixed Reality. In: Proceedings of the International Symposium on Mixed and Augmented Reality (ISMAR'02), pp. 7–13. IEEE Computer Society, Los Alamitos, US.
We present a complete system for live capture of 3D content and simultaneous presentation in augmented reality. The user sees the real world from his viewpoint, but modified so that the image of a remote collaborator is rendered into the scene. Fifteen cameras surround the collaborator, and the resulting video streams are used to construct a three-dimensional model of the subject using a shape-from-silhouette algorithm. Users view a two-dimensional fiducial marker using a video-see-through augmented reality interface. The geometric relationship between the marker and head-mounted camera is calculated, and the equivalent view of the subject is computed and drawn into the scene. Our system can generate 384 × 288 pixel images of the models at 25 fps, with a latency of < 100 ms. The result gives the strong impression that the subject is a real part of the 3D scene. We demonstrate applications of this system in 3D videoconferencing and entertainment.
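The core reconstruction step described above — combining silhouettes from the fifteen surrounding cameras into a 3D model — can be illustrated with a minimal voxel-carving sketch. This is not the authors' implementation; it assumes a hypothetical `carve_voxels` helper given precomputed binary silhouette masks and 3 × 4 camera projection matrices, and keeps only those voxels whose projection falls inside the silhouette in every view:

```python
import numpy as np

def carve_voxels(voxels, silhouettes, projections):
    """Shape-from-silhouette carving (illustrative sketch).

    voxels      : (N, 3) array of world-space voxel centres
    silhouettes : list of (H, W) boolean foreground masks, one per camera
    projections : list of (3, 4) camera projection matrices

    Returns the subset of voxels consistent with every silhouette.
    """
    homog = np.hstack([voxels, np.ones((len(voxels), 1))])  # (N, 4) homogeneous
    keep = np.ones(len(voxels), dtype=bool)
    for mask, P in zip(silhouettes, projections):
        proj = homog @ P.T                       # (N, 3) homogeneous image coords
        uv = proj[:, :2] / proj[:, 2:3]          # perspective divide
        u = np.round(uv[:, 0]).astype(int)
        v = np.round(uv[:, 1]).astype(int)
        h, w = mask.shape
        inside = (u >= 0) & (u < w) & (v >= 0) & (v < h)
        hit = np.zeros(len(voxels), dtype=bool)
        hit[inside] = mask[v[inside], u[inside]]  # voxel projects onto foreground?
        keep &= hit                               # must hold in every camera view
    return voxels[keep]
```

In the full system this carving would run per frame over the fifteen synchronised video streams, with the resulting model rendered at the pose recovered from the fiducial marker; achieving the reported 25 fps and sub-100 ms latency requires far more optimised data structures than this dense sketch.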
UCL classification: UCL > School of BEAMS > Faculty of Engineering Science > Computer Science