Research highlights of the Consumer Expenditure Survey redesign
Reduce the interview reference period. The CE program has researched the impact of having respondents report their expenditures monthly rather than quarterly. A shorter reference period is valued by survey methodologists: memory studies consistently demonstrate that recent events are recalled more accurately than events further in the past and that memory decay increases with longer recall periods.12 Shortening the reference period may also ease the cognitive burden of recalling expenditures, lowering the overall respondent burden of completing the interview. Past research suggests that moving the CEQ from a quarterly to a monthly interval may improve reporting; however, this change may actually increase respondent burden because interviews would take place more frequently to yield 12 months of data from each household.13
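The memory-decay argument can be illustrated with a toy forgetting model. The sketch below is hypothetical, not drawn from the cited studies: it assumes a simple exponential decay of recall with an invented 60-day half-life, and treats expenditures as spread uniformly over the reference period, purely to show why a 1-month window should yield better average recall than a 3-month window.

```python
def recall_rate(days_elapsed, half_life_days=60.0):
    """Probability an expenditure is still recalled after a delay,
    under a simple exponential forgetting model. The half-life is a
    made-up illustrative value; real decay varies by purchase salience."""
    return 0.5 ** (days_elapsed / half_life_days)

def mean_recall(reference_days, half_life_days=60.0):
    """Average recall rate over a reference period, assuming purchases
    are uniformly distributed across the period."""
    rates = [recall_rate(d, half_life_days) for d in range(1, reference_days + 1)]
    return sum(rates) / len(rates)

monthly = mean_recall(30)    # roughly a 1-month reference period
quarterly = mean_recall(90)  # roughly a 3-month reference period
print(f"monthly: {monthly:.2f}, quarterly: {quarterly:.2f}")
```

Under any decreasing recall curve, not just this one, the longer window averages in older, less-recallable purchases, which is the mechanism the memory studies describe.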
Reduce the interview length. Research has been undertaken to identify the role of interview length in measurement error and respondent burden. The CEQ currently averages approximately an hour to complete, raising concerns about the effect of survey length on data quality. An experiment that changed the order in which sections were administered within the first interview did not conclusively show that reports collected early in the interview are of higher quality than later reports.14
Studies on the use of a split-questionnaire design to reduce the length of the interview have shown promise. Split-questionnaire designs divide the full interview into subsections and administer only selected sections to each respondent. When responses from the first wave of interviews were used to predict whether a respondent made purchases in certain expenditure categories in wave two, fewer questions needed to be administered in the second wave. This technique would reduce the total interview time for expenditure sections by 69 percent, with minimal impact on the precision of the estimates for many expenditure categories.15 Split-questionnaire designs that respond to respondent information collected earlier may also improve the data quality of expenditure reports.
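The responsive logic described above can be sketched in a few lines. This is a deliberately crude stand-in for the predictive models used in the actual research: the section names, the `wave1_reports` structure, and the skip rule (omit any section with no wave-one purchases) are all invented for illustration.

```python
def sections_to_administer(wave1_reports, expenditure_sections, threshold=0.0):
    """Return the expenditure sections to ask in full in wave two.

    wave1_reports maps section name -> amount reported in wave one.
    Sections with no wave-one purchases (or absent entirely) are
    skipped; a real design would use a statistical prediction of
    purchase probability rather than this simple rule.
    """
    return [s for s in expenditure_sections
            if wave1_reports.get(s, 0.0) > threshold]

sections = ["apparel", "utilities", "vehicles", "entertainment"]
wave1 = {"apparel": 120.0, "utilities": 310.0, "vehicles": 0.0}
print(sections_to_administer(wave1, sections))
# "vehicles" and "entertainment" are dropped from the wave-two interview
```

Skipping two of four sections for this hypothetical household is what shortens the wave-two interview; the research cited above quantifies that saving at about 69 percent of expenditure-section time with little loss of estimate precision.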
Because interview costs depend on the amount of time interviewers spend obtaining a completed survey, research was undertaken to determine how removing interview sections would reduce the time needed to conduct an interview, thereby lowering overall wave-one CEQ costs. The research determined that the reduction in time depended heavily on the section from which questions were removed.16 Moreover, a large portion of the costs associated with wave-one interviews is apparently incurred in contacting respondents to participate. Piecemeal reductions in wave-one interview content would therefore yield only minimal cost savings.
In addition, studies have investigated administering “global questions” (questions asked at a more aggregated level), which could reduce the burden imposed on respondents. One study examined whether data quality differed between a single global question and a series of questions asked at a more detailed level, treating higher reported expenditure amounts as an indicator of better data quality.17 Across the 10 expenditure sections tested, global questions yielded higher expenditure amounts than detailed questions, a finding consistent with comparisons of CED detailed and CEQ global food-at-home expenditures.18 Qualitative research likewise found higher expenditure amounts for some global questions, but researchers also raised concerns about the accuracy of responses to these types of questions.19 These findings suggest that caution is called for when substituting global questions for more detailed questions, or when attributing better data quality to the administration of global questions.
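The comparison underlying these studies amounts to taking a ratio of the globally reported amount to the sum of the corresponding detailed reports. The sketch below uses invented figures, not actual CE data, to show the diagnostic.

```python
def global_to_detailed_ratio(global_amounts, detailed_amounts):
    """Per-section ratio of a single globally reported amount to the
    sum of the corresponding detailed reports. Values above 1.0 mean
    the global question captured more spending; hypothetical inputs,
    not actual CE figures."""
    return {section: global_amounts[section] / sum(detailed_amounts[section])
            for section in global_amounts}

# Hypothetical household: one global apparel amount vs. itemized reports.
ratios = global_to_detailed_ratio(
    {"apparel": 250.0},
    {"apparel": [80.0, 60.0, 70.0]},  # detailed reports sum to 210.0
)
print(ratios)
```

A ratio above 1.0 mirrors the pattern the studies found, but, as the qualitative research cautions, a higher global amount does not by itself establish that the global report is the more accurate one.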
12 Robert Groves, Survey errors and survey costs (New York: John Wiley and Sons, 1989).
13 Brett Creech, Jeanette Davis, Scott Fricker, Jeffrey Gonzalez, Meaghan Smith, Lucilla Tan, and Nhien To, “Measurement Issues Study final report,” Statistical and Survey Methods Research Papers (U.S. Bureau of Labor Statistics, 2011), http://www.bls.gov/cex/cesrvmeth_davis.pdf.
14 Janel Brattland, Jennifer Edgar, Shannon Maloney, Peggy Murphy, Barry Steinberg, and Neil Tseng, “Order effects team test report—evaluating order effects changes,” unpublished paper, Order Effects project (U.S. Bureau of Labor Statistics, 2011).
15 Jeffrey Gonzalez, “The use of split questionnaires in a panel survey with responsive design,” paper presented at the U.S. Bureau of Labor Statistics, January 23, 2012.
16 Ian Elkin, “Cost savings from shortening interview 1 in the Consumer Expenditure interview survey,” unpublished paper, Cost savings from shortening interview 1 project (U.S. Bureau of Labor Statistics, 2011).
17 Creech et al., “Measurement Issues Study final report.”
18 Steve Henderson, “Comparing global questions and answers to results from detailed specific questions: data on food expenditures from the Consumer Expenditure Survey,” CE data comparison articles and presentations (U.S. Bureau of Labor Statistics, 2012), http://www.bls.gov/cex/ce_dryinv_199811.pdf.
19 Jennifer Edgar, “What does ‘usual’ usually mean?” paper presented at the American Association for Public Opinion Research annual conference, May 14–17, 2009; and Jennifer Edgar, “Global clothing questions cognitive testing results,” Statistical and Survey Methods Research Papers (U.S. Bureau of Labor Statistics, 2011), http://www.bls.gov/cex/cesrvmeth_cloth.pdf.