Assessing Quality of User-Submitted Need Statements from Large-Scale Needfinding: Effects of Expertise and Group Size

Cory R. Schaffhausen, Timothy M. Kowalewski

Research output: Contribution to journal › Article › peer-review


Abstract

Collecting data on user needs often results in a surfeit of candidate need statements, and additional analysis is necessary to prioritize a small subset for further consideration. Previous analytic methods have been applied only to small quantities (often fewer than 75 statements). This study presents a simplified quality metric and online interface appropriate for initially screening and prioritizing lists exceeding 500 statements for a single topic or product area. Over 20,000 ratings for 1697 need statements across three common product areas were collected in 6 days. A series of hypotheses were tested: (1) increasing the quantity of participants submitting needs increases the number of high-quality needs as judged by users; (2) increasing the quantity of needs contributed per person increases the number of high-quality needs as judged by users; and (3) increasing levels of self-rated user expertise will not significantly increase the number of high-quality needs per person. The results provided important quantitative evidence of fundamental relationships between the quantity and quality of need statements. Higher quantities of total needs submitted correlated with higher quantities of high-quality need statements, both because of increasing group size and because of increasing counts per person when novel content-rich methods were used to help users articulate needs. Based on a multivariate analysis, a user's topic-specific expertise (self-rated) and experience level (self-rated hours per week) were not significantly associated with increasing quantities of high-quality needs.

Original language: English (US)
Article number: 121102
Journal: Journal of Mechanical Design
Volume: 137
Issue number: 12
State: Published - Dec 1 2015

Bibliographical note

Publisher Copyright:
© 2015 by ASME.

Keywords

  • assessment
  • crowd
  • expertise
  • needfinding
  • needs
  • preferences
  • problems
  • quality
  • rating
  • user
