Set size manipulations reveal the boundary conditions of perceptual ensemble learning

Andrey Chetverikov, Gianluca Campana, Árni Kristjánsson

Research output: Contribution to journal › Article › peer-review

Abstract

Recent evidence suggests that observers can grasp patterns of feature variation in the environment with surprising efficiency. In visual search tasks where distractors are randomly drawn from a particular distribution rather than being homogeneous, observers can learn highly complex statistical properties of distractor sets. After only a few trials (a learning phase), the statistical properties of the distributions, including their mean, variance and, crucially, their shape, can be learned, and these representations affect search during a subsequent test phase (Chetverikov, Campana, & Kristjánsson, 2016). To assess the limits of such distribution learning, we varied the information available to observers about the underlying distractor distributions by manipulating set size during the learning phase in two experiments. We found that robust distribution learning occurred only for large set sizes. We also used set size to assess whether learning distribution properties makes search more efficient. The results show that a certain minimum of information is required for learning to occur, thereby delineating the boundary conditions of learning statistical variation in the environment. The benefits of distribution learning for search efficiency, however, remain unclear.

Original language: English
Pages (from-to): 144-156
Number of pages: 13
Journal: Vision Research
Volume: 140
DOIs
Publication status: Published - Nov 2017

Bibliographical note

Publisher Copyright: © 2017 Elsevier Ltd

Other keywords

  • Ensemble perception
  • Feature distributions
  • Priming of pop-out
  • Summary statistics
  • Visual search
