Abstract
Researchers use screening protocols to detect suspicious survey submissions in online studies. We evaluated how well a de-duplication and cross-validation protocol identified invalid entries. Data came from the Sexually Explicit Media Study, an Internet-based HIV prevention survey of men who have sex with men. Using our protocol, 146 of 1254 entries (11.6%) were identified as invalid. Most of these had altered responses to the screening questionnaire to gain entry (n = 109, 74.7%), matched another submission's payment profile (n = 56, 41.8%), or came from an IP address that had been recorded previously (n = 43, 29.5%). However, we found few demographic or behavioral differences between the valid and invalid samples. Invalid submissions had lower odds of reporting HIV testing in the past year (OR 0.63) and higher odds of requesting no payment rather than payment by check (OR 2.75). Thus, rates of HIV testing would have been underestimated had invalid submissions not been removed, and payment may not be the only incentive for invalid participation.
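The abstract names three checks: an altered screening questionnaire, a payment profile matching an earlier submission, and a previously recorded IP address. The paper does not publish its implementation; the following is a minimal sketch of how such de-duplication flags could be computed, with illustrative field names (`ip`, `payment`, `screener_edited`) that are assumptions, not the study's actual data schema.

```python
def flag_invalid(subs):
    """Return the ids of submissions flagged by any of three
    illustrative checks: an edited screening questionnaire, a
    payment profile seen in an earlier submission, or an IP
    address seen in an earlier submission. Field names are
    hypothetical, not taken from the SEM Study instrument."""
    flagged = set()
    seen_ips, seen_payments = set(), set()
    for s in subs:  # assume submissions are ordered by arrival time
        if s["screener_edited"]:
            flagged.add(s["id"])
        if s["ip"] in seen_ips:  # IP recorded previously
            flagged.add(s["id"])
        seen_ips.add(s["ip"])
        # "none" means no payment requested; only real payment
        # profiles are checked for duplicates
        if s["payment"] != "none" and s["payment"] in seen_payments:
            flagged.add(s["id"])
        seen_payments.add(s["payment"])
    return flagged


# Toy example: submission 2 reuses submission 1's IP and payment
# profile; submission 3 edited the screener to gain entry.
submissions = [
    {"id": 1, "ip": "203.0.113.5", "payment": "check:a", "screener_edited": False},
    {"id": 2, "ip": "203.0.113.5", "payment": "check:a", "screener_edited": False},
    {"id": 3, "ip": "198.51.100.7", "payment": "none", "screener_edited": True},
    {"id": 4, "ip": "192.0.2.9", "payment": "check:b", "screener_edited": False},
]
```

In this toy data, submissions 2 and 3 would be flagged as invalid and 1 and 4 retained; a submission can trip more than one check, which is why the study's category percentages (74.7%, 41.8%, 29.5%) sum to more than 100%.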
| Field | Value |
|---|---|
| Original language | English (US) |
| Pages (from-to) | 1928-1937 |
| Number of pages | 10 |
| Journal | AIDS and Behavior |
| Volume | 19 |
| Issue number | 10 |
| DOIs | |
| State | Published - Oct 14 2015 |
Bibliographical note
Funding Information: The Sexually Explicit Media (SEM) Study was funded by the National Institute of Mental Health (NIMH), Grant #5R01MH087231-02. The authors are also grateful for the support of Barbara Lea in preparing this manuscript.
Publisher Copyright:
© 2015, Springer Science+Business Media New York.
Keywords
- Bias
- HIV
- Questionnaires
- Survey methods
- Validity