Vision Science Academy

Visual Pathways and Cortical Processing in Virtual Environments

Aishi Bhowmik, B. Optom

Consultant Optometrist, LV Prasad Eye Institute, Bhubaneswar, India

 

Virtual reality (VR) games immerse users by engaging the visual cortex and the extraocular muscles. The technology presents stereoscopic images that simulate real-world binocular vision. When a headset is worn, visual stimuli are processed by the retina and transmitted via the optic nerve to the primary visual cortex (V1). (1)

 

Retinal Processing and Subcortical Targets

Not all retinal ganglion cell axons project to the Lateral Geniculate Nucleus (LGN), the principal relay to the visual cortex. (1) Some fibres diverge to other subcortical targets. Retinal input to the suprachiasmatic nucleus of the hypothalamus regulates circadian rhythms, synchronising sleep-wake cycles with ambient light. (2) Other axons project to the pretectal nuclei, mediating pupillary reflexes and eye movements. (3) Approximately 10% of retinal ganglion cells project to the superior colliculus, a midbrain structure essential for orienting eye and head movements toward peripheral visual stimuli. (4) A focused light stimulus can activate multiple superior colliculus neurons, triggering reflexive eye movements that align images onto the fovea, thereby facilitating attentional shifts. (5)

 

Superior Colliculus and Thalamic Integration

Like the LGN, the superior colliculus receives feedback from the primary visual cortex and sends projections to subcortical structures, including the reticular formation and spinal cord, while also influencing the pulvinar nucleus of the thalamus, which contributes to higher-order visual processing. (3)

 

Extrastriate Visual Areas and Motion Perception

The primary visual cortex processes fundamental visual features such as edges, orientation, and spatial location through specialised neurons organised in a retinotopic map. From V1, visual information is relayed to V2, where binocular disparity, depth perception, and figure-ground segregation are refined – processes critical for three-dimensional perception in virtual environments. (2) V2 also initiates the division of visual processing into the dorsal (“where”) and ventral (“what”) streams. (6)
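To make the geometry behind binocular disparity concrete, the small-angle relationship δ ≈ I·Δd/d² (I = interpupillary distance, d = fixation distance, Δd = depth offset) can be sketched in a few lines. This example is an illustration added here, not a formula drawn from the cited sources, and the function name and parameter values are assumptions:

```python
def angular_disparity(ipd_m: float, fixation_m: float, depth_offset_m: float) -> float:
    """Approximate binocular (angular) disparity in radians for a point
    offset from the fixation distance, using the small-angle formula
    delta ≈ I * Δd / d² (I = interpupillary distance)."""
    return ipd_m * depth_offset_m / fixation_m ** 2

# A point 10 cm behind a 1 m fixation, with a 63 mm IPD:
print(angular_disparity(0.063, 1.0, 0.1))  # ≈ 0.0063 rad
```

VR renderers exploit this same relationship in reverse: by offsetting the two eye images, the headset induces a disparity the cortex interprets as depth.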

Area V3 contributes to the perception of dynamic form and moving shapes, enabling object recognition during motion. Area V4 specialises in colour perception, curvature analysis, and object identification, supporting detailed scene interpretation. (6) Area V5 (middle temporal area) plays a central role in motion perception, spatial awareness, and the analysis of direction and velocity. It receives strong magnocellular input from V1, V2, and V3 and contains neuron columns selectively tuned to motion direction. Some V5 neurons respond to perceived rather than actual motion, illustrating how contextual cues can alter visual perception, an effect relevant to immersive virtual environments. (1)
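The direction selectivity of V5 neurons described above is often approximated with a cosine tuning curve: the response peaks at the neuron's preferred direction and falls off as the stimulus direction rotates away. The following toy model is an illustration added for this article (the function, baseline, and gain values are assumptions, not data from the cited sources):

```python
import math

def direction_tuned_response(stim_deg: float, preferred_deg: float,
                             baseline: float = 5.0, gain: float = 20.0) -> float:
    """Toy cosine tuning curve for a direction-selective neuron:
    firing rate (spikes/s) peaks at the preferred direction and is
    rectified at zero for strongly non-preferred directions."""
    delta = math.radians(stim_deg - preferred_deg)
    return max(0.0, baseline + gain * math.cos(delta))

print(direction_tuned_response(90, 90))   # peak response at preferred direction
print(direction_tuned_response(270, 90))  # rectified response opposite the preferred direction
```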

Figure 1: A medical professional wearing VR glasses while analysing a patient’s brain scan

Image Courtesy: https://www.freepik.com/free-photo/professional-researcher-wearing-virtual-reality-glasses-using-medical-inovation-lab-analysing-brain-scan-patient-team-neurological-doctors-working-with-equipment-high-tech-simulator-device_17787403.htm#fromView=search&page=1&position=10&uuid=903e59b0-4c98-4fcb-bc1e-ef42b4885c8e&query=Visual+Pathways+and+Cortical+Processing+in+Virtual+Environments

 

Latency in VR Games

Latency beyond tolerable limits is often associated with low-end hardware or poor system optimisation, reducing the effectiveness of virtual reality experiences. (8,9) Typical latency ranges and their perceptual and physiological effects are summarised in Table 1.

 

Latency Range    User Experience              Physiological Impact
< 20 ms          Ideal, smooth experience     Minimal discomfort, high immersion
20–50 ms         Noticeable delay             Mild eye strain or discomfort
60–70 ms         Maximum tolerable            Motion sickness, disorientation
> 70 ms          Poor performance             Severe discomfort, loss of immersion

Table 1: User experience and physiological impact across VR latency ranges
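The bands in Table 1 can be expressed as a simple lookup. This sketch is illustrative; the table leaves the 50–60 ms range unassigned, so grouping it with the "maximum tolerable" band is an assumption of this example:

```python
def classify_latency(latency_ms: float) -> str:
    """Map motion-to-photon latency (ms) onto the bands of Table 1.
    Latencies between the published ranges (50-60 ms) are grouped with
    the 'maximum tolerable' band here as an assumption."""
    if latency_ms < 20:
        return "ideal, smooth experience"
    if latency_ms <= 50:
        return "noticeable delay"
    if latency_ms <= 70:
        return "maximum tolerable"
    return "poor performance"

print(classify_latency(15))  # a well-optimised headset stays in the ideal band
print(classify_latency(90))  # latency this high breaks immersion
```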

 

Conclusion

Despite its ability to closely simulate natural vision, virtual reality remains limited by technical constraints such as latency, reduced resolution, and the vergence-accommodation conflict, all of which can contribute to visual fatigue and motion sickness. (7) Improvements in hardware and software continue to enhance cortical stimulation and the sense of presence in virtual environments. (9)

References

  1. Gupta, M., Ireland, A. C., & Bordoni, B. (2025). Neuroanatomy, visual pathway. StatPearls Publishing.
  2. Celesia, G. G., & DeMarco, P. J. (1994). Anatomy and physiology of the visual system. Journal of Clinical Neurophysiology, 11(5), 482–492.
  3. De Moraes, C. G. (2013). Anatomy of the visual pathways. Journal of Glaucoma, 22(Suppl 5), S2–S7.
  4. Kelts, E. A. (2010). The basic anatomy of the optic nerve and visual system. NeuroRehabilitation, 27(3), 217–222.
  5. Reese, B. E. (2011). Development of the retina and optic pathway. Vision Research, 51(7), 613–632.
  6. Zyda, M. (2005). From visual simulation to virtual reality to games. Computer, 38(9), 25–32.
  7. Pur, D. R., Lee-Wing, N., & Bona, M. D. (2023). The use of augmented and virtual reality in low vision rehabilitation: A systematic review. Graefe’s Archive for Clinical and Experimental Ophthalmology, 261, 1743–1755.
  8. Neugebauer, A., Castner, N., Severitt, B., et al. (2024). Simulating vision impairment in virtual reality. Virtual Reality, 28, 97.
  9. Coelho, H., Monteiro, P., Gonçalves, G., Melo, M., & Bessa, M. (2024). Evaluation of task presentation methodologies in immersive virtual training environments. IEEE Access.

About the Author

Aishi Bhowmik

Consultant Optometrist
LV Prasad Eye Institute, Bhubaneswar, India