
Putting things into perspective: Which visual cues facilitate automatic extraretinal symmetry representation?

Bertamini, M.
2025

Abstract

Objects project different images when viewed from varying locations, but the visual system can correct perspective distortions and identify objects across viewpoints. This study investigated the conditions under which the visual system allocates computational resources to construct view-invariant, extraretinal representations, focusing on planar symmetry. When a symmetrical pattern lies on a plane, its symmetry in the retinal image is degraded by perspective. Visual symmetry activates the extrastriate visual cortex and generates an Event-Related Potential (ERP) called the Sustained Posterior Negativity (SPN). Previous research has shown that the SPN is reduced for perspective symmetry during secondary tasks. We hypothesized that the perspective cost would decrease when visual cues support extraretinal representation. To test this, 120 participants viewed symmetrical and asymmetrical stimuli presented in a frontoparallel or perspective view. The task did not explicitly involve symmetry; participants discriminated the luminance of the patterns. Participants completed four experimental blocks: (1) Baseline block: no depth cues; (2) Monocular viewing block: stimuli viewed with one eye; (3) Static frame block: pictorial depth cues from elements within a flat surface with edges; (4) Moving frame block: motion parallax enhanced 3D interpretation before stimulus onset. Perspective cost was calculated as the difference between SPN responses to frontoparallel and perspective views. Contrary to our pre-registered hypotheses, the perspective cost was consistent across all four blocks. We conclude that the tested visual cues do not substantially reduce the computational cost of processing perspective symmetry. © 2025 The Author(s). Published by Elsevier Ltd. This is an open access article under the CC BY license (http://creativecommons.org/licenses/by/4.0/).
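As a reading aid, the contrast described in the abstract can be written out explicitly. This is a minimal sketch only: it assumes the usual convention from the SPN literature that the SPN is the symmetric-minus-asymmetric amplitude difference at posterior electrodes; the abstract itself does not specify the electrode cluster, time window, or the sign convention used for the cost.

\[
\mathrm{SPN}_{\text{view}} = A_{\text{sym},\,\text{view}} - A_{\text{asym},\,\text{view}},
\qquad
\text{perspective cost} = \mathrm{SPN}_{\text{frontoparallel}} - \mathrm{SPN}_{\text{perspective}}
\]

Here \(A\) denotes the mean ERP amplitude in the relevant condition; a smaller (less negative) SPN in the perspective view relative to the frontoparallel view yields a nonzero cost.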
Files in this record:
No files are associated with this record.

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11577/3549978
Citations
  • PMC: n/a
  • Scopus: n/a
  • Web of Science: 0
  • OpenAlex: n/a