Event Abstract

VISUAL SEARCH AND ATTENTION IN BUMBLEBEES

  • 1 Queen Mary University of London, London, United Kingdom

Imagine searching for your bicycle in a university bicycle park. Despite a multitude of visual stimuli and distractors (e.g. other bicycles), you would most likely locate your target with relative ease. Visual search is a constant part of our daily lives, whether we are looking for misplaced files at work or the TV remote control at home. Accordingly, work on this subject has been a prominent theme in human psychology and neuroscience. The efficiency of visual search is of fundamental relevance to animals’ survival and fitness, including their ability to avoid predators or to find food and mates. Yet we know comparatively little about similar search mechanisms in non-primates, or about the attentional constraints that might restrict target location for them. Bumblebees are excellent examples of animals for which visual search and attention are vitally important. They must use sensory cues to locate flowers with high rewards of nectar and pollen while ignoring distracting low-reward flowers. Simultaneously, other factors might make demands on their attention, such as the presence of predatory spiders. Several studies over the last few years have shed new light on how bees might solve this complex problem, as well as on the roles that attention and memory might play during visual search.

These studies demonstrate that bees are capable of making both flexible and rapid decisions during visual search. Bumblebees can, for example, reduce the speed with which they solve difficult colour discrimination tasks when the cost of making errors increases, demonstrating a speed-accuracy trade-off. A speed-accuracy trade-off also exists between individuals: some bees consistently adopt a fast-but-sloppy strategy, whereas others choose more slowly but more accurately. In a classical visual search task, where bees must pick a target amidst an array of distractors, search appears to be strictly serial, since the time taken to find the target increases with the number of distractors. This raises the question of whether bees can see ‘at a glance’ at all (i.e. take in whole scenes simultaneously), or whether they must always scan a scene sequentially. It turns out, however, that bees can accurately choose patterns that are flashed only for brief intervals (<50 ms). When an additional task of avoiding predators (robotic crab spiders) is incorporated, bees are capable of learning visual cues associated with cryptic predators and avoiding them. They can do so even while discriminating between highly rewarding targets and less rewarding or punishing (quinine-laced) distractors, thus showing a capacity for divided attention. This level of cognitive sophistication is also reflected in the ability of bees to flexibly retrieve previously learnt visual target types from their reference memory when presented with a task involving multiple target types. Taken together, these results paint an emerging picture of bees as efficient foragers with a remarkable capability for complex visual search strategies.

Acknowledgements

This work was supported by a Human Frontier Science Program Long-Term Fellowship, a Marie Curie Incoming International Fellowship and a grant from the Natural Environment Research Council, UK.

Keywords: attention, bumblebee, inattentional blindness, rapid perception, speed-accuracy trade-off, visual search

Conference: Tenth International Congress of Neuroethology, College Park, Maryland, United States, 5 Aug - 10 Aug, 2012.

Presentation Type: Invited Symposium

Topic: Sensory: Vision

Citation: Nityananda V, Wang M, Ings T, Proulx M, Skorupski P, Pattrick J and Chittka L (2012). VISUAL SEARCH AND ATTENTION IN BUMBLEBEES. Conference Abstract: Tenth International Congress of Neuroethology. doi: 10.3389/conf.fnbeh.2012.27.00020

Copyright: The abstracts in this collection have not been subject to any Frontiers peer review or checks, and are not endorsed by Frontiers. They are made available through the Frontiers publishing platform as a service to conference organizers and presenters.

The copyright in the individual abstracts is owned by the author of each abstract or his/her employer unless otherwise stated.

Each abstract, as well as the collection of abstracts, are published under a Creative Commons CC-BY 4.0 (attribution) licence (https://creativecommons.org/licenses/by/4.0/) and may thus be reproduced, translated, adapted and be the subject of derivative works provided the authors and Frontiers are attributed.

For Frontiers’ terms and conditions please see https://www.frontiersin.org/legal/terms-and-conditions.

Received: 17 Apr 2012; Published Online: 07 Jul 2012.

* Correspondence: Dr. Vivek Nityananda, Queen Mary University of London, London, United Kingdom, v.nityananda@qmul.ac.uk