Auditory object-based attention uncovers parallels with visual attention

Mentor 1

Adam Greenberg

Location

Union Wisconsin Room

Start Date

5-4-2019 1:30 PM

End Date

5-4-2019 3:30 PM

Description

Attention gates sensory input by selecting and enhancing the behaviorally relevant subset of incoming data. However, it is unclear whether attention is domain-general or domain-specific with regard to different sensory modalities. Research on attention (conducted most frequently in the visual domain) has demonstrated that attentional selection can act upon features, spatial locations, or objects. Evidence for the latter case (known as Object-Based Attention; OBA) has shown that attended objects garner enhanced processing compared to unattended objects, even when spatial locations are overlapping or equidistant. Our goal is to investigate whether OBA in the auditory domain operates analogously to OBA in the visual domain. To induce the percept of separate but simultaneous auditory objects, participants will hear two simultaneous streams of tones. Each stream will be composed of quartets (groups of four tones) clustered around one of five distinct frequencies (300 Hz, 566 Hz, 1068 Hz, 2016 Hz, 3805 Hz). At the start of each trial, attention will be cued to one of the two objects via the presentation of a single tone at the central frequency of that object. Subjects are instructed to detect the target (a quartet of four identical, repeated tones), and response times are recorded. Critically, the target can appear within the cued object (valid condition) or the non-cued object (invalid condition). If OBA functions analogously in vision and audition, we should observe significantly better performance on valid cues versus invalid cues, indicating an advantage for attended objects. Conversely, equivalent performance in these conditions would suggest important disanalogies between attentional mechanisms within the visual and auditory domains. If our hypothesis is supported, we will conduct subsequent studies to more fully understand auditory OBA.
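
The cueing paradigm described above can be sketched as a short simulation. The Python code below is a minimal illustration of the trial structure only; the tone duration, amplitude, frequency jitter, and number of quartets per stream are assumptions for illustration and are not specified in the abstract, and this is not the authors' stimulus-generation code.

```python
# Minimal sketch of the two-stream auditory cueing paradigm described above.
# Timing, amplitude, jitter, and quartet-count parameters are assumptions.
import numpy as np

SAMPLE_RATE = 44100                              # Hz (assumed)
TONE_DUR = 0.1                                   # seconds per tone (assumed)
CENTER_FREQS = [300, 566, 1068, 2016, 3805]      # Hz, from the abstract

def pure_tone(freq, dur=TONE_DUR, sr=SAMPLE_RATE):
    """Generate a sine tone at the given frequency."""
    t = np.arange(int(dur * sr)) / sr
    return 0.5 * np.sin(2 * np.pi * freq * t)

def make_quartet(center_freq, is_target=False, jitter=0.05, rng=None):
    """Four tones clustered around a center frequency.
    A target quartet repeats one identical tone four times; a non-target
    quartet jitters each tone around the center (assumed +/- 5%)."""
    rng = rng or np.random.default_rng()
    if is_target:
        freqs = [center_freq] * 4
    else:
        freqs = center_freq * (1 + rng.uniform(-jitter, jitter, size=4))
    return np.concatenate([pure_tone(f) for f in freqs])

def make_trial(cued_idx, uncued_idx, target_valid, n_quartets=6, rng=None):
    """Build one trial: a cue tone at the cued object's central frequency,
    followed by two simultaneous streams of quartets. The target quartet
    appears in the cued stream (valid) or the uncued stream (invalid)."""
    rng = rng or np.random.default_rng()
    cue = pure_tone(CENTER_FREQS[cued_idx])
    target_pos = rng.integers(n_quartets)
    streams = []
    for idx in (cued_idx, uncued_idx):
        stream_has_target = (idx == cued_idx) == target_valid
        quartets = [
            make_quartet(CENTER_FREQS[idx],
                         is_target=(stream_has_target and q == target_pos),
                         rng=rng)
            for q in range(n_quartets)
        ]
        streams.append(np.concatenate(quartets))
    mixture = streams[0] + streams[1]            # two simultaneous streams
    return np.concatenate([cue, mixture])

# Example: a valid trial cueing the 566 Hz object against the 2016 Hz object.
trial_audio = make_trial(cued_idx=1, uncued_idx=3, target_valid=True)
```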
