Over the past forty to fifty years, federal surveys such as the Current Population Survey (CPS) and the National Health Interview Survey (NHIS) have undergone large shifts in survey methods that make analytical harmonization difficult (but not impossible). These changes pose a special challenge to researchers seeking to construct an analytically consistent analysis of change over time in a phenomenon of interest. Sampling changes, instrumentation changes, and data processing changes are the three most common sources of potential gaps in the analysis of trends over time. Changes in the sampling of people within households and of households within Primary Sampling Units can cause problems for variance estimation. Instrumentation changes, including changes in question wording, question universe definitions, and the shift from paper-and-pencil to computer-assisted instruments, can cause estimates to vary for reasons unrelated to the underlying phenomenon. Data processing changes, including how missing data are handled, how out-of-range values are edited, and how universe definitions are enforced, can also alter estimates over time. As part of the integrated public use microdata projects at the Minnesota Population Center, we have developed a variety of methods for bridging these gaps. First, we tightly integrate information about major changes into our documentation, so data users know about the problems when they download the data from our system; knowledge of these changes is essential for performing appropriate analyses. Second, we are developing statistical approaches to bridging the gaps and testing their performance for analyzing trends. We highlight examples from our NHIS and CPS data projects to extract basic rules for dealing with these gaps, and we also point out instances in which the gaps are simply too large to bridge for most analytical purposes.
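As an illustration of the kind of statistical bridging the abstract alludes to (this is a hypothetical sketch, not the authors' actual method), one common approach is to model a trend across a survey redesign with an indicator variable for the post-redesign period, so that the level shift introduced by an instrumentation change is absorbed by the indicator rather than distorting the trend estimate. The redesign year, outcome, and magnitudes below are all simulated for demonstration:

```python
import numpy as np

# Hypothetical example: annual estimates of some health indicator with an
# artificial level shift at a survey redesign (e.g., a switch to
# computer-assisted interviewing). All values here are simulated.
rng = np.random.default_rng(0)

years = np.arange(1980, 2000)
redesign_year = 1990                          # assumed redesign year
post = (years >= redesign_year).astype(float)

# Simulated rates: smooth linear trend + a 2-point jump at the redesign + noise
true_trend = 0.3
rates = (20.0 + true_trend * (years - years[0])
         + 2.0 * post
         + rng.normal(0.0, 0.2, years.size))

# OLS with intercept, linear time trend, and redesign indicator. The
# indicator "bridges" the gap: the trend coefficient is estimated net of
# the instrument-induced level shift.
X = np.column_stack([np.ones(years.size), years - years[0], post])
beta, *_ = np.linalg.lstsq(X, rates, rcond=None)
intercept, trend, shift = beta

print(f"estimated trend:  {trend:.2f} per year (true {true_trend})")
print(f"estimated shift:  {shift:.2f} points at redesign (true 2.0)")
```

This device only works when the redesign plausibly shifts the level but not the slope of the series; when the change also alters the trend itself, the gap may be too large to bridge, as the abstract notes.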
Original language: English (US)
State: Published - 2004