The detectability of a 10-ms tone masked by a 400-ms wideband noise was measured as a function of the delay between the onset of the tone and the onset of the noise burst. Unlike most previous studies of auditory overshoot, special attention was given to signal delays between 0 and 45 ms. Nine well-practiced subjects were tested using an adaptive psychophysical procedure in which the level of the masking noise was adjusted to estimate 79% correct detections. Tones of both 3.0 and 4.0 kHz, at different levels, were used as signals. For the subjects showing overshoot, detectability remained approximately constant for at least 20-30 ms of signal delay, and then began to improve gradually toward its maximum at about 150-200 ms. That is, there was a "hesitation" before detectability began to improve, and the duration of this hesitation was similar to that seen in physiological measurements of the medial olivocochlear (MOC) system. This result provides further support for the hypothesis that the MOC efferent system makes a major contribution to overshoot in simultaneous masking.
Funding Information:
This work was supported by a research grant awarded to D.M. by the National Institute on Deafness and Other Communication Disorders (NIDCD; R01 DC000153). The content is solely the responsibility of the authors and does not necessarily represent the official views of the NIDCD or the National Institutes of Health. Two anonymous reviewers and the associate editor provided helpful comments on an earlier version of this report.