Invited review
Virtual reality: a tutorial

https://doi.org/10.1016/S0924-980X(97)00086-6

Abstract

Virtual reality (VR) technology is complex and relies on multidisciplinary knowledge. VR applications are attracting increasing interest among neuroscientists, in particular in the study of the human brain. Here we present a brief tutorial in which we address the aspects of VR methodology that are most relevant to neurophysiology applications. After a brief survey of possible applications to neurophysiology, we discuss the following issues in VR: display technology, visual stimulus presentation techniques, visual spatial resolution and accuracy, devices for real-time interaction with the virtual environment, and force-feedback.

Section snippets

What is virtual reality?

The idea of creating synthetic interactive environments has been around for years. The first head-mounted stereo display was demonstrated in the 1960s (Sutherland, 1965; Sutherland, 1968) and, in the mid-1970s, Myron Krueger performed the first experiments on what he defined 'artificial reality'. The term virtual reality (VR) is credited to Jaron Lanier, who coined it in the early 1990s (see Machover and Tice, 1994). Computer scientists create a virtual environment using an interactive visual

What virtual reality offers to neurophysiology

Generally speaking, VR offers the opportunity to create synthetic environments where a large number of physical variables that influence our behavior can be controlled precisely and simultaneously. Furthermore, in a virtual environment we can record our motor responses and use them to interact with and manipulate the same environment.

VR allows the creation of dynamic three-dimensional (3D) views providing the user with visual information that can be quite similar to that achievable when viewing

Display

One of the most striking features of VR is the possibility of displaying stereoscopic views of a desired scene. It is well known that depth perception can be obtained by showing an appropriate two-dimensional view (right-eye and left-eye half-images) to each eye separately. These views are generated by projecting the scene on a plane (or on two different planes, see below) using two different viewpoints. The main difference between the two half-images is a shift in the direction given by the
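A minimal sketch of this idea is given below: a 3D point is projected onto a single screen plane from two horizontally separated viewpoints, and the resulting horizontal shift (parallax) between the half-images encodes depth. The function names, the interocular distance and the viewing distance are illustrative assumptions, not values or code from the article.

```python
import numpy as np

def project_point(p, eye_x, viewing_distance):
    """Perspective-project the 3D point p onto the screen plane z = 0,
    as seen from an eye located at (eye_x, 0, viewing_distance)."""
    x, y, z = p
    # Similar-triangles scale factor of the perspective projection.
    s = viewing_distance / (viewing_distance - z)
    return np.array([eye_x + (x - eye_x) * s, y * s])

def stereo_pair(p, iod=0.065, viewing_distance=0.6):
    """Return the left- and right-eye half-image coordinates of p.
    iod: interocular distance (m); viewing_distance: eye-to-screen (m).
    Both defaults are illustrative assumptions, not taken from the article."""
    left = project_point(p, -iod / 2.0, viewing_distance)
    right = project_point(p, +iod / 2.0, viewing_distance)
    return left, right

# A point 10 cm behind the screen plane is shifted in opposite horizontal
# directions in the two half-images (uncrossed parallax), signalling depth.
left, right = stereo_pair(np.array([0.0, 0.0, -0.10]))
print(left, right)
```

With this geometry, points in front of the screen plane produce crossed parallax, points behind it produce uncrossed parallax, and the size of the horizontal shift grows with the point's distance from the screen plane.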

Interaction with the virtual environment

Virtual environments offer the opportunity to design user-friendly interfaces that take advantage of the multiple affordances of the human operator. Several new interactive devices represent a significant advance over traditional input devices (keyboard, mouse). The devices typically used in VR allow a high rate of information transfer from the user to the computer. If we consider the amount of information that an average human operator can deliver per unit time to the host by using
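The sketch below illustrates, in schematic form, how such devices are commonly serviced: a hand-held six-degree-of-freedom sensor is polled at a fixed rate and its pose is fed back into the virtual environment. The device interface (read_tracker), the update rate and the pose format are hypothetical placeholders, not a description of any specific product or of code from the article.

```python
import time

def read_tracker():
    # Placeholder for a call into a tracker driver; it returns a fixed pose
    # so the sketch runs without hardware.
    position = (0.0, 0.0, 0.0)          # metres, in the tracker frame
    orientation = (1.0, 0.0, 0.0, 0.0)  # unit quaternion (w, x, y, z)
    return position, orientation

def update_scene(position, orientation):
    # In a real application this would move the virtual hand or cursor and
    # re-render the stereo pair; here it simply reports the pose.
    print(f"hand at {position}, orientation {orientation}")

def interaction_loop(rate_hz=60, duration_s=0.5):
    """Poll the tracker and update the virtual environment at a fixed rate."""
    period = 1.0 / rate_hz
    t_end = time.time() + duration_s
    while time.time() < t_end:
        pos, ori = read_tracker()
        update_scene(pos, ori)
        time.sleep(period)

interaction_loop()
```

Because the sensor streams several continuous degrees of freedom on every cycle, the information delivered per unit time is far higher than what a keyboard or mouse can convey, which is the point made in the passage above.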

Conclusions

The application to neurophysiology of the technologies subsumed under the general heading of VR is rapidly expanding. VR methodology and techniques allow the creation of new interfaces that can provide subjects with a considerable amount of information, based on multiple sensory modalities. Moreover, users can provide the controlling system with their feedback by means of new interface devices that do not require specific training. This is due to the fact that the interaction

Acknowledgements

Work from the author's laboratory was partially supported by the Italian Health Ministry, the Italian Space Agency and the Human Frontiers Science Program.

References (52)

  • Burdea, G. et al. Dextrous telerobotics with force-feedback - an overview, Part 1: human factors. Robotica (1991)
  • Burdea, G. et al. Dexterous telerobotics with force-feedback - an overview, Part 2: control and implementation. Robotica (1991)
  • Carrozzo, M. et al. A hybrid frame of reference for visuo-manual coordination. NeuroReport (1994)
  • Collewijn, H. and Erkelens, C.J. Binocular eye movements and the perception of depth. In: E. Kowler (Ed.), Eye...
  • Decety, J. et al. Mapping motor representations with positron emission tomography. Nature (1994)
  • Deering, M. High resolution virtual reality. Comput. Graph. (1992)
  • Deyo, R. and Ingebreston, D. Notes on real-time vehicle simulation. In: Implementing and Interacting with Real-time...
  • Ellis, R.E. What are virtual environments? IEEE Comput. Graph. Appl. (1994)
  • Foley, J.D. Interfaces for advanced computing. Sci. Am. (1987)
  • Foley, J.D., van Dam, A., Feiner, S.K. and Hughes, J.F. Computer Graphics: Principles and Practice, 2nd edn.,...
  • Foley, J.M. Binocular distance perception. Psychol. Rev. (1980)
  • Hirota, K. and Hirose, M. Development of surface display. Proc. Virtual Reality Ann. Int. Symp., IEEE Press,...
  • Hirota, K. et al. Providing force feedback in virtual environments. IEEE Comput. Graph. Appl. (1995)
  • Hodges, L.F. et al. Rotation algorithm artifacts in stereoscopic images. Optic. Eng. (1990)
  • Hodges, L.F. et al. Geometric considerations for stereoscopic virtual environments. Presence (1993)
  • Hodges, L.F. Time multiplexed stereoscopic computer graphics. IEEE Comput. Graph. Appl. (1992)