Eye Guidance in Natural Scenes

(A special issue of Visual Cognition)
Edited by Benjamin W. Tatler
Publication date: August 2009

Successfully completing many forms of behaviour requires that humans look in the right place at the right time, and this requirement has generated a large volume of research aimed at understanding how the eyes are guided. Prominent in this research for a number of years has been the notion that visual salience (low-level conspicuity) is a crucial causal factor in determining where people look. However, there is now a critical mass of evidence, drawn from a range of experimental settings, that visual conspicuity is unlikely to play a prominent role in determining where we look under most viewing conditions.

This places us at a very interesting transition period in our understanding of eye guidance. The papers included in this special issue not only underline the limitations of the original visual salience approach, but also present a range of alternative levels of selection that play a role in guiding where people fixate in scenes. Emerging alternative models of eye guidance are presented, offering complementary accounts and theoretical positions. A number of challenges remain unaddressed even in the latest models of eye guidance, and this special issue considers these challenges, which will feature prominently in future research. Importantly, this special issue brings together research from a variety of experimental settings, from static scene viewing in laboratories, to watching movies, to virtual reality, to free behaviour in the real world, in order to fully assess our current understanding of eye guidance in natural scenes. It is clear that a key challenge for the future of this area will be to understand eye guidance in the context of natural behaviour.

Contents

Top-down Control of Eye Movements: Yarbus Revisited
Marianne DeAngelus & Jeff B. Pelz*

Saliency and scan patterns in the inspection of real-world scenes: Eye movements during encoding and recognition
Geoffrey Underwood*, Tom Foulsham and Katherine Humphrey

Overt attentional prioritization of new objects and feature changes during real-world scene viewing
Michi Matsukura, James R. Brockmole* and John M. Henderson

Do we look at lights? Using mixture modelling to distinguish between low- and high-level factors in natural image viewing
Benjamin T. Vincent*, Roland Baddeley, Alessia Correani, Tom Troscianko, Ute Leonards

The nature of the visual representations involved in eye movements when walking down the street
Filipe Cristino & Roland Baddeley*

You look where I look! Effect of gaze cues on overt and covert attention in misdirection
Gustav Kuhn*, Benjamin W. Tatler, and Geoff Cole

Get real! Resolving the debate about equivalent social stimuli
Elina Birmingham*, Walter F. Bischof and Alan Kingstone

Modeling Search for People in 900 Scenes: A combined source model of eye guidance
Krista Ehinger, Barbara Hidalgo-Sotelo, Antonio Torralba, and Aude Oliva

SUN: Top-down saliency using natural statistics
Christopher Kanan*, Matthew H. Tong, Lingyun Zhang, and Garrison W. Cottrell

An Effect of Referential Scene Constraint on Search Implies Scene Segmentation
Gregory J. Zelinsky* and Joseph Schmidt

The prominence of behavioural biases in eye guidance
Benjamin W. Tatler* & Benjamin T. Vincent

How Are Eye Fixation Durations Controlled during Scene Viewing? Further Evidence from a Scene Onset Delay Paradigm
John M. Henderson* & Tim J. Smith

Facilitation of Return during Scene Viewing
Tim J. Smith* & John M. Henderson

Distractor Effect and Saccade Amplitudes: Further Evidence on Different Modes of Processing in Free Exploration of Visual Images
Sebastian Pannasch* & Boris M. Velichkovsky

Gaze allocation in natural stimuli: Comparing free exploration to head-fixed viewing conditions
Bernard Marius ’t Hart, Johannes Vockeroth, Frank Schumann, Klaus Bartl, Erich Schneider, Peter König, and Wolfgang Einhäuser*

Gaze control, change detection and the selective storage of object information while walking in a real world environment
Jason A. Droll* & Miguel P. Eckstein

Modeling the role of task in the control of gaze
Dana H. Ballard* and Mary M. Hayhoe
