Consumer Expenditure Survey Microdata Users’ Workshop and Survey Methods Symposium, 2013
The 2013 Consumer Expenditure Survey Microdata Users’ Workshop and Survey Methods Symposium included presentations from BLS and non-BLS economists and researchers.
The Consumer Expenditure Survey (CE) is the most detailed source of data on expenditures, demographics, and income collected by the federal government. Every year, the Bureau of Labor Statistics (BLS) CE program releases microdata on the CE website from its two component surveys (the Quarterly Interview Survey and the Diary Survey), which are used by researchers in a variety of fields, including academia, government, market research, and other private industry areas.1
In July 2006, the CE program office conducted the first in a series of annual workshops to help users better understand the structure of the CE microdata; provide training in the uses of the surveys; and, through presentations by current users and interactive forums, promote awareness of the different ways the data are used and explore possibilities for collaboration.
In 2012 and 2013, an additional day was added to the event to explore topics in survey methods research to support the major project to redesign the CE survey, called Gemini (more information on this project can be found here: https://www.bls.gov/cex/geminiproject.htm). In addition to the CE program staff, workshop speakers have included economists from BLS regional offices and researchers not affiliated with BLS; similarly, symposium speakers have included CE program staff, other BLS National Office staff, and speakers from outside BLS. In this report, Ian Elkin describes the survey methods symposium, which took place on July 16, 2013, and Geoffrey Paulin describes the workshop that immediately followed it, on July 17–19, 2013.
Survey methods symposium
The goals of the 2013 CE Survey Methods Symposium were 1) to provide an overview and preliminary results from the Web Diary Feasibility test, the status of the Interview-to-Interview Imputation Methods project, and recommendations/overview regarding the use of financial records in the CE Survey; and 2) to feature presentations regarding the Gemini Survey Redesign project and a comparison of other consumer expenditure surveys. There were two sessions, one on each topic.
Web Diary Test: Preliminary Findings. Ian Elkin (CE) provided an overview of the Web Diary Feasibility Test, specifically on preliminary findings and how those findings are being used to improve the Individual Diaries Feasibility Test. Web Diary inputs leading into an improved Individual Diaries Feasibility Test include 1) targeted sampling, 2) paradata monitoring, 3) streamlined and efficient design, 4) straightforward respondent materials, 5) comprehensive field representative training, 6) additional follow-up procedures, and 7) modified receipt/recall procedures. Preliminary results from the Web Diary Feasibility Test analysis illustrate the potential benefits of a multi-modal approach; however, more testing is necessary to fully realize this potential.
Imputing Across Interviews: Balancing Time Savings and Data Quality. The second presentation opened with its stated goal: reducing respondent burden while maintaining a high level of data quality. Geoffrey Paulin (CE) provided insight into the impact of removing the bounding interview from the Quarterly Interview Survey and the work being done to mitigate the resulting increase in interview time for the current second wave.
Initially, Paulin focused on the consequences of discontinuing the bounding interview. These consequences include the need to add information collected only in the bounding interview to the second interview, as well as the additional interview time needed during the second interview. The question was then posed whether expenditures currently collected in the second interview can be successfully imputed from the third, fourth, and fifth interviews to minimize respondent burden. The presentation suggested modifying the Utilities and Apparel sections, as well as the categories for which respondents are asked about “usual” weekly/monthly expenditures. However, general and technical questions must be answered before any determination can be made.
Use of Financial Records in the CE Survey. Brandon Kopp (OSMR) reported that the CE Records Study focused on determining what records are available and from whom, while establishing how respondents’ self-reports compare with those records. Records were provided for 36 percent of the expenditures reported in the first visit. Participant characteristics (being a non-Hispanic White, a woman, a DC resident, or a homeowner) and expenditure characteristics (recent purchases and more expensive purchases) were positively associated with supplying records. Over- and underreporting were not common, but when they did occur, respondents’ accuracy in reporting expenditure amounts was low, off by an average of 36 percent. Overall, the CE Records Study found records to be more accurate than self-reports.
The Records Information and Feasibility Study considered the information necessary for completing the CE Surveys that is available on financial records. Analysis shows that, while transaction data, an item description, and outlet information appear on most records, not all of the information CE needs is easily obtained on financial records and, consequently, respondent interaction is still necessary. The two records studies showed great promise, but future challenges remain, including 1) collecting a comprehensive set of records, 2) capturing transactions that do not yield records, and 3) easily and accurately converting records into tabular data.
Gemini Survey Redesign
A Comparison of Consumer Expenditure Surveys. As input into the CE Gemini Survey Redesign, Brett McBride (CE) presented the Comparison of Consumer Expenditure Surveys, a study of 35 countries’ household expenditure surveys selected for the diversity of their characteristics and the extent of the information they provide. Survey information was collected from program websites, methodology reports, and e-mail correspondence with survey representatives. The study reports that common themes and innovations in other countries’ design characteristics can inform the CE Survey program’s redesign efforts. In addition, while the CE Survey shares similar data collection methods with these surveys, it is unique in using two independent samples, and it differs in some notable design features, including 1) incorporation of new (online) technologies, 2) data collection at the individual level, and 3) the motivation of respondents through incentives.
Ongoing research has led to suggestions, many of which are being recommended as part of the Gemini Redesign Proposal. These include 1) exploring the use of a single sample, 2) identifying various ways to encourage individual reporting, 3) encouraging the use of records and receipts, 4) boosting response rate through incentives, 5) learning how online components can effectively be implemented, 6) learning from redesign efforts that have reduced respondent burden, and 7) considering the use of administrative records.
McBride concluded that survey programs can benefit from communicating best practices for effectively collecting high-quality data and sharing lessons learned from testing new features and implementing new survey designs.
The Gemini Project to Redesign the CE Surveys. Laura Erhard (née Paszkiewicz) (CE) provided an overview of the Gemini Project, including the motivations that provide the foundation for the redesign process. The motivations shaping the current research agenda include 1) evidence of underreporting, and thus the need to reduce measurement error; 2) evidence that the CE is burdensome, and thus the need to reduce respondent burden; 3) evidence that the CE is expensive; and 4) the trend of declining response rates.
To satisfy these objectives, certain aspects of the redesign process are constrained. First, the requirements of the Consumer Price Index must be satisfied. In addition, although not required, other user needs will be addressed if possible. Moreover, survey data collection and processing costs must remain at or below current levels, and response rates must be maintained or improved.
The Gemini Redesign process relied on inputs from expert panels, external discussion events, ongoing research, the National Academies’ Committee on National Statistics (CNSTAT), the Westat independent proposal, and Census staff. It proposes a single integrated sample, a two-interview format based on recall and records, the inclusion of an electronic 1-week diary with paper backup, individual diaries for all household members age 15 and over, performance-based incentives, highly aggregated expenditure categories, and two waves set 1 year apart.
The symposium concluded with a discussion session regarding how the implementation of the Gemini Redesign Project would affect the research of CE data end-users and recommendations on how to mitigate any negative impact thereof.
Concerns ranged from what assistance CE would provide to researchers examining expenditures with longer transaction times, to how the redesign would affect the distribution of spending (specifically healthcare expenditures) and the distribution of observations. In addition, concern was voiced over how perceived respondent burden would affect the response rate. Some of the response was positive, including praise for how critically the team examined detail, the proposed simplification of the survey, and the reduction in respondent burden through the logical grouping of questions and the incorporation of records.
Assessing Measurement Error in CE. The final symposium presentation, by Roger Tourangeau (Westat), focused on developing a specific measure that can be used on an ongoing basis to track measurement error in the CE Survey over time. Tourangeau recommended a multiple-indicator approach consisting of three main categories: 1) internal indicators, 2) external indicators, and 3) record-check studies.
Internal indicators are based solely on CE data or information about the data collection process, such as a comparison of the Interview Survey and the Diary Survey. An external indicator approach is based on comparison with external data sources, such as comparing CE estimates with those of other surveys (e.g., the Medical Expenditure Panel Survey, the Panel Study of Income Dynamics, and the Residential Energy Consumption Survey). The final approach is to compare CE reports with actual bills or other records. Tourangeau recommended that CE build on past efforts and develop a time series with multiple indicators, keeping in mind that no approach is perfect.
Conclusions. CE’s survey methods symposium was a successful event. The symposium illustrated the tangible steps that CE has taken to redesign its surveys and the range of approaches it is considering for future implementation, all while reaffirming its commitment to providing a high-quality product for its end-users.
The main conclusion to be drawn from the presentations and discussions at the Survey Methods Symposium is that, as with any redesign process, there is a measure of uncertainty regarding how end-users will be affected; consequently, the CE program should tailor the process to account for every aspect by which end-users may be impacted.
Microdata users’ workshop
Day one. The first day of the 2013 workshop opened with presenters from the CE program. Ryan Pfirrmann-Powell provided an overview of the CE, featuring topics such as how the data are collected and published. Scott Curtin then presented an introduction to the microdata, including an explanation of its features, such as data file structure and variable naming conventions.
The rest of the day featured several presentations by researchers not affiliated with the CE program who have used the microdata for a variety of purposes (Mehmet Zahid Samancioglu, Stuart Craig, Atara Stephanie Oliver, and David Molina), followed by a continuation of practical training. The day concluded with an information sharing group session among workshop participants and CE program staff. This was an open forum in which attendees met informally to discuss their research and offer suggestions for improving the microdata. Because the practical training is progressive, until 2011 this session was held on the second day to maximize overlap in attendance between newer and more experienced users. However, in response to comments from attendees at prior workshops, in 2012 the session was scheduled for the first day of the workshop, and the successful configuration was repeated in 2013.
Day two. The second day opened with more advanced topics, with Barry Steinberg of the BLS Division of Price Statistical Methods presenting technical details about sampling methods and construction of sample weights, and Evan Hubener (CE program) speaking on imputation and the allocation of expenditure data in the CE.
Following a research presentation on how expenditures are affected by pay frequency (Yiwei Zhang), the technical instruction resumed with a topic of perennial interest to CE microdata users: how to apply longitudinal weights to the interview data. As noted in the introduction by Bill Passero (CE program), the Interview Survey collects data from respondents for four consecutive calendar quarters. During each interview, the respondent is asked to provide information on expenditures for various items during the past 3 months. However, not all participants remain in the sample for all four of these interviews. Those who do remain have different characteristics (e.g., higher rates of homeownership and a higher average age) than those who do not. Therefore, attempting to analyze average annual expenditures by examining only respondents who participate in all four interviews yields biased results. Following the Passero presentation, Mikolaj Slomka of the BLS Division of Consumer Expenditure Information Systems described measures to protect data confidentiality in the CE microdata, such as topcoding.2
The next session, titled “Researcher Access to Confidential Files,” was unique to this workshop, and of interest to advanced researchers who require access to data that have not yet been topcoded or otherwise treated. As background, the BLS sponsors a program to allow such researchers access to confidential data once they have received appropriate clearances through an application process. In addition, researchers may only have access to the data on-site.3 In some past workshops, such a researcher using CE data was on-site during the workshop and delivered a presentation on how this process works. This year, there were two such researchers available (Christine Dobridge and Aditya Aladangady), and they agreed to present a brief panel session (with Pfirrmann-Powell as moderator) describing their experiences with this process.
After a break for lunch, practical training resumed, followed by a presentation on expenditures of older consumers (Patrick Purcell and Kimberly Burham). The day concluded with a special presentation by Terry Schau, managing editor of the Monthly Labor Review, describing the publication process, from submission to printing, for authors interested in having their works appear in that journal.
Day three. On the final day, CE staff featured advanced topics, starting with Barbara Johnson-Cox explaining how sales taxes are applied to expenditure reports during the data production process. Next, Geoffrey Paulin described the correct use of imputed income data and sample weights in computing population estimates. The latter session noted that the proper use of weights requires a special technique to account for sample design effects that, if not employed, results in incorrect estimates of variances and regression parameters.4 The session concluded with a “sneak peek” of developments for CE microdata by Steve Henderson. Of particular note was the announcement that the CE now publishes tables containing 12 months of data on a rolling basis, with tables covering July 2011 through June 2012 published in March 2013 and the 2012 annual data published on September 10, 2013, the earliest annual release ever. Also noted were the forthcoming 2012 tables showing expenditures, income, and demographics for consumer units classified by the education level of the most highly educated member of the consumer unit (rather than by the education level of the reference person, as in the tables last published for the 2011 results), and a coming improvement to income tax estimates: beginning with the 2013 data, the CE will use the National Bureau of Economic Research TAXSIM program to estimate and publish federal and state income taxes.
After a break, presentations by other researchers not affiliated with the CE program (Nirupama Kulkarni, Rejeana Gvillo, and Jiyoon Kim) concluded the morning. In the afternoon, practical training included a presentation of a computer program available with the microdata for computing correct standard errors for means and regression results when using 1) unweighted, non-imputed data; 2) population-weighted, non-imputed data; and 3) multiply imputed income data, both unweighted and population weighted (Paulin).
2014 Symposium and workshop
The next Survey Methods Symposium will be held July 15, 2014, once again in conjunction with the next microdata users’ workshop (July 16–18). While the symposium and workshop will remain free of charge to all participants, advance registration is required. For more information about these and previous events, visit the CE website (www.bls.gov/cex) and look for “Annual Workshop” under the left navigation bar, titled “CE MICRODATA.” For direct access to this information, the link is www.bls.gov/cex/csxannualworkshop.htm. Additional details about the 2013 symposium are available at https://www.bls.gov/cex/geminimaterials.htm.
Highlights of workshop presentations
Following are highlights of the papers presented during the workshop, listed in the order of presentation. They are based on summaries written by the respective authors.
Mehmet Zahid Samancioglu, Ph.D. candidate, University of Michigan, “Consumption in the Panel Study of Income Dynamics (PSID) in Comparison to the CE” (Interview Survey), day one.
I established a match between the CE and the PSID and assessed how well the PSID can match the CE following the recent addition of several new questions about consumption. While the PSID asks about major categories of consumption, the CE collects consumption data at a much more detailed, itemized level.
Stuart Craig, Research Associate, Institute for Social and Policy Studies, Yale University, “Measuring Medical Out of Pocket Expenditures in the CE: A Cross-Survey Comparison” (Interview Survey), day one.
I work on a team that attempts to create a comprehensive measure of household resource instability (http://economicsecurityindex.org). This is done by measuring the proportion of individuals who experience a 25 percent or greater loss of household resources—which is defined as income minus debt burden and out-of-pocket medical expenses—from one year to the next, and who lack sufficient financial wealth to buffer a large loss. Of particular interest is the secular trend in insecurity. However, because data on medical spending and household income are not consistently available in any nationally representative panel, medical spending data from the CE are used to supplement panel income data from the SIPP and CPS.
The measure runs from 1984 to the present, using yearly observations (keeping only individuals who are available for all waves), and uses the Interview Survey (MTBI and FMLI) files for the medical spending imputation.
The model of the joint distribution of changes to medical spending and changes to income from year to year comes from the SIPP, but is only available for select years. Therefore, the CE is used to generalize the model of ranked medical spending in the SIPP to allow the yearly distribution of medical spending to change as it does in the CE.
The presentation included a discussion of our goals and how we arrived at our methods. We also compared the distribution of medical spending to newly available SIPP and CPS data.
Atara Stephanie Oliver, Ph.D. candidate, University of Maryland, Baltimore County, “Information Technology and Transportation: Substitutes or Complements?” (Interview Survey), day one.
The increased availability and prevalence of Information and Communications Technology (ICT) provides opportunities to use such products as substitutes for transportation. Common examples of this substitution are telecommuting, video conferences, and online classes. However, despite the intuitive appeal of the substitution relationship between ICT and transportation, prior research has indicated that the relationship between ICT and transportation is quite complex; at times, ICT substitutes for travel, and at other times, ICT and travel complement each other. Therefore, using a Quadratic Almost Ideal Demand System (QUAIDS) model and data from the U.S. Consumer Expenditure Survey and the Consumer Price Index, I analyzed the effect of ICT expenditures on transportation demand. The analysis indicates that ICT may serve as a substitute for air travel, but primarily serves as a complement for private transportation. Overall, the data support a complementary relationship between ICT and transportation, which indicates that an increase in technology may increase rather than decrease the negative externalities associated with transportation.
David Molina, Associate Professor, University of North Texas, “Hispanic Consumers Through a Sociological Lens” (Interview Survey), day one.
Recent work by Charles et al. (2009) and others has looked at whether African-American and Hispanic consumers are more likely to have higher expenditures on “show-off” items than are White non-Hispanic consumers, the so-called “Veblen effect.” These studies have used BLS categories such as electronics and automobile expenditures. The problem is that automobile expenditures can include repair and maintenance that may not qualify as “show-off” items. We have attempted to construct our own categories, for example, an automobile category that includes rims, stereos, and similar items, and a clothing category that excludes repair and laundry expenditures. We use two basic models. The first, in which we restrict the Veblen variable to be the same for each race/ethnic group, is:
Ln(expenditure component) = a0 + a1*ln(Total Expenditure) + a2*Black + a3*Hispanic + a4*X + a5*(regional controls) + a6*V + a7*VI + e
where V is the emulation variable and VI is the interaction of V with income. The other model, a less restrictive model where we allow the Veblen variable to be different for the different race/ethnic group, is:
Ln(expenditure component) = a0 + a1*ln(Total Expenditure) + a2*Black + a3*Hispanic + a4*X + a5*(regional controls) + a6*VH + a7*VHI + a8*VB + a9*VBI + a10*VW + a11*VWI + e
where VH is the emulation variable for the Hispanic group, VHI is the interaction of V with income for the Hispanic group, VB and VBI are the same variables for the Black group, and VW and VWI are the same variables for the White group.
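The restricted model above is an ordinary least-squares regression and can be sketched as follows. The data below are synthetic stand-ins, since the CE microdata cannot be reproduced here; every variable name and coefficient value is invented for the example.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500

# Hypothetical data standing in for the CE microdata (all names illustrative).
total_exp = rng.lognormal(8, 0.5, n)    # total expenditures
black = rng.integers(0, 2, n)           # race/ethnicity indicators
hispanic = rng.integers(0, 2, n)
x = rng.normal(size=n)                  # a demographic control (the X above)
region = rng.integers(0, 2, n)          # a regional control
v = rng.normal(size=n)                  # emulation ("Veblen") variable V
income = rng.normal(50, 10, n)
vi = v * income                         # VI: interaction of V with income

# Simulated outcome with known coefficients, so the fit can be checked.
y = (1.0 + 0.8 * np.log(total_exp) + 0.1 * black + 0.2 * hispanic
     + rng.normal(0, 0.1, n))

# Restricted model:
#   ln(component) = a0 + a1*ln(total) + a2*Black + a3*Hispanic
#                   + a4*X + a5*(region) + a6*V + a7*VI + e
X_mat = np.column_stack([np.ones(n), np.log(total_exp), black, hispanic,
                         x, region, v, vi])
coefs, *_ = np.linalg.lstsq(X_mat, y, rcond=None)
print(coefs[:4])  # a0..a3, near the simulated values 1.0, 0.8, 0.1, 0.2
```

The less restrictive model simply replaces the single V and V-by-income columns with the group-specific versions (VH, VHI, VB, VBI, VW, VWI).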
Yiwei Zhang, Ph.D. candidate, University of Pennsylvania, “Consumption Responses to Pay Frequency—Evidence from ‘Extra’ Paychecks” (Interview Survey), day two.
I am interested in understanding how households adjust the timing of their consumption to that of their income, especially in the presence of liquidity constraints or recurrent expenditures whose timing is difficult to adjust. To shed light on these questions, I leverage a unique feature of bi-weekly pay schedules that results in predictable monthly variation in income. Specifically, bi-weekly workers receive two paychecks a month with the exception of two months out of the year when they receive three. Importantly, this variation is solely an artifact of being paid bi-weekly (versus a different pay frequency) and should be fully anticipated. I show that there are large responses in durable spending following months with three paychecks and that these responses only exist among bi-weekly workers. On average, durable spending is 23 percent higher in months following those with three paychecks than in other months.
Christine Dobridge, Ph.D. candidate, University of Pennsylvania, “Payday Lending” (Interview Survey), day two.
The research goal is to determine how access to payday lending affects household spending on luxury versus necessity goods, particularly following an income shock (e.g., extreme weather events). I use regression analysis to show that following shocks, loan access mitigates spending declines on necessities such as groceries and rent.
Aditya Aladangady, Ph.D. candidate, University of Michigan, “Household Balance Sheets and Monetary Policy Transmission” (Interview Survey), day two.
This project seeks to understand the impact of house price fluctuations on the spending and borrowing behavior of homeowners. To do this, I exploit regional differences in land availability and zoning laws in different cities. Following a national-level housing demand shock, regions with geographic or regulatory constraints have large house price increases, compared with cities with more available land and looser zoning laws where increased construction keeps prices stable. This heterogeneity provides a means to “difference” across regions to decompose consumption responses into the direct effect from the shock and the amplification arising through fluctuations in housing wealth. Using CE expenditures data linked to geographic variables, I compare the expenditure growth of households living in both constrained and unconstrained regions. Additional tests attempt to further separate wealth and collateral effects to provide a better picture of the welfare implications of house price fluctuations.
Patrick Purcell, Social Security Administration and Kimberly Burham, Investment Company Institute, “Expenditures of the Aged Chartbook, 2010” (Interview and Diary Surveys), day two.
Published by the Social Security Administration, the chartbook provides tables and charts showing expenditure patterns for consumers age 55 and older. This presentation described findings published in the chartbook.
Nirupama Kulkarni, Ph.D. candidate, University of California-Berkeley, “Recourse and Mortgage Debt Overhang” (Interview Survey), day three.
What is the impact of stronger creditor rights on credit market outcomes? This work addresses the question empirically by focusing on recourse laws that provide creditors stronger rights to recoup their losses in residential mortgages in the United States. Using a rich, household-level dataset on expenditures, the work analyzes consumption of borrowers with negative equity in recourse states and compares it with consumption of borrowers in nonrecourse states. While negative equity borrowers reduce principal payments in both recourse and nonrecourse states, borrowers in nonrecourse states reduce principal payments to a greater extent. Negative equity borrowers in nonrecourse states, however, continue to spend on vehicles and home-related durables (furnishings and appliances) that the lender does not have access to in the case of default. In contrast, negative equity borrowers in recourse states reduce all spending.
Methods used: difference-in-differences (diff-in-diff) and triple differences (DDD).
Rejeana Gvillo, Ph.D. candidate, Texas A&M University, “Food from Vending Machines: Information from the CE Diary Files” (Diary Survey), day three.
The rising caloric intake of U.S. consumers has become national news. One source of easy access to high-calorie items is the vending machine. While a few vending machines offer more healthful food choices, little to no research has been conducted on how consumers would react if most vending machines offered healthful options.
The goal of the work is to compare persons with positive vending expenditures to persons with positive expenditures in categories for particular fruits and vegetables. Also, limited dependent variable models are used to estimate the probabilities of having positive vending expenditures.
Preliminary results show that there are statistically significant differences in spending patterns for those who do and do not report purchases from vending machines. Among them, those who purchase from vending machines are more likely than those who do not to report expenditures for such items as cola and potato chips, even when factors such as income and family size are taken into account.
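A limited-dependent-variable model of the kind described above can be sketched as a logit fit by Newton-Raphson. The covariates and coefficients below are invented for illustration; they are not the actual Diary variables or results.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 1000

# Invented covariates; names are hypothetical, not actual Diary variables.
income = rng.normal(0, 1, n)                  # standardized income
fam_size = rng.integers(1, 6, n).astype(float)
X = np.column_stack([np.ones(n), income, fam_size])

# Simulate "positive vending expenditure" from a logit with known coefficients.
true_b = np.array([-1.0, -0.5, 0.3])
p = 1.0 / (1.0 + np.exp(-X @ true_b))
buys_vending = (rng.random(n) < p).astype(float)

# Fit the logit by Newton-Raphson (iteratively reweighted least squares).
b = np.zeros(3)
for _ in range(25):
    mu = 1.0 / (1.0 + np.exp(-X @ b))            # fitted probabilities
    grad = X.T @ (buys_vending - mu)             # score vector
    hess = X.T @ (X * (mu * (1 - mu))[:, None])  # Fisher information
    b += np.linalg.solve(hess, grad)
print(b)  # estimates should be close to true_b
```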
Jiyoon Kim, Ph.D. candidate, University of Michigan, “Stimulus Payments and Expenditures” (Interview Survey), day three.
My analysis samples are consumer unit-quarter observations (i.e., each consumer unit contributes four observations to the data). Robust standard errors are clustered at the consumer-unit level to correct for within-consumer-unit dependence. I wondered whether it would be more correct to use one expenditure observation per consumer unit, rather than to use all reported expenditures of each consumer unit and cluster standard errors by consumer unit.
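The tradeoff Kim raises can be illustrated with a small simulation: when observations within a consumer unit share a common shock, naive (iid) standard errors understate uncertainty, while cluster-robust (Liang-Zeger) standard errors account for it. The data and model below are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)
n_units, quarters = 200, 4
unit = np.repeat(np.arange(n_units), quarters)   # consumer-unit IDs

# Invented data: a shared unit-level shock induces within-unit dependence.
u = rng.normal(size=n_units)
x = rng.normal(size=n_units * quarters)
y = 2.0 + 0.5 * x + u[unit] + rng.normal(size=n_units * quarters)

X = np.column_stack([np.ones_like(x), x])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ beta

# Naive (iid) standard errors ignore the within-unit correlation.
bread = np.linalg.inv(X.T @ X)
sigma2 = resid @ resid / (len(y) - X.shape[1])
se_naive = np.sqrt(np.diag(sigma2 * bread))

# Cluster-robust (Liang-Zeger) variance: sum X_g' e_g e_g' X_g over units g.
meat = np.zeros((2, 2))
for g in range(n_units):
    m = unit == g
    s = X[m].T @ resid[m]
    meat += np.outer(s, s)
se_cluster = np.sqrt(np.diag(bread @ meat @ bread))
print(se_naive, se_cluster)  # clustered SEs are larger for the intercept
```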
Staff of the CE Program
Crain, Vera. Economist, Branch of Information and Analysis (BIA); days one, two, and three. Workshop leader and session coordinator.
Curtin, Scott. Supervisory Economist, Chief, Microdata Section, BIA; day one.
Henderson, Steve. Supervisory Economist, Chief, BIA; day three.
Hubener, Evan. Economist, Branch of Production and Control (P&C); day two.
Johnson-Cox, Barbara. Economist, P&C; day three.
Passero, Bill. Senior Economist, BIA; day two.
Paulin, Geoffrey. Senior Economist, BIA; day three.
Pfirrmann-Powell, Ryan. Economist, BIA; days one and two.
Other BLS speakers
Schau, Terry. Managing Editor, Monthly Labor Review Branch; day two.
Steinberg, Barry. Mathematical Statistician, Division of Price Statistical Methods; day two.
Speakers from outside BLS
Aladangady, Aditya, “Household Balance Sheets and Monetary Policy Transmission” (Interview Survey), panelist, day two.
Craig, Stuart, “Measuring Medical Out of Pocket Expenditures in the CE: A Cross-Survey Comparison” (Interview Survey), day one.
Dobridge, Christine, “Payday Lending” (Interview Survey), panelist, day two.
Gvillo, Rejeana, “Food from Vending Machines: Information from the CE Diary Files” (Diary Survey), day three.
Kim, Jiyoon, “Stimulus Payments and Expenditures” (Interview Survey), day three.
Kulkarni, Nirupama, “Recourse and Mortgage Debt Overhang” (Interview Survey), day three.
Molina, David, “Hispanic Consumers through a Sociological Lens” (Interview Survey), day one.
Oliver, Atara Stephanie, “Technology and Transportation: Substitutes or Complements?” (Interview Survey), day one.
Purcell, Patrick and Kimberly Burham, “Expenditures of the Aged Chartbook” (Interview and Diary Surveys), day two.
Samancioglu, Mehmet Zahid, “Consumption in the Panel Study of Income Dynamics in Comparison to CE” (Interview Survey), day one.
Zhang, Yiwei, “Consumption Responses to Pay Frequency—Evidence from ‘Extra’ Paychecks” (Interview Survey), day two.
Ian Elkin and Geoffrey Paulin, “Consumer Expenditure Survey Microdata Users’ Workshop and Survey Methods Symposium, 2013,” Monthly Labor Review, U.S. Bureau of Labor Statistics, April 2014, https://doi.org/10.21916/mlr.2014.16.
1 The Quarterly Interview Survey is designed to collect data on expenditures for big-ticket items (e.g., major appliances, cars and trucks) and recurring items (e.g., payments for rent, mortgage, or insurance). In the Interview Survey, participants are visited once every 3 months for five consecutive quarters. Data from the first interview are collected only for bounding purposes and are not published.
In the Diary Survey, participants record expenditures daily for 2 consecutive weeks. The survey is designed to collect expenditures for small-ticket and frequently purchased items, such as detailed types of food (e.g., white bread, ground beef, butter, lettuce).
The CE microdata may be downloaded on the CE website (https://www.bls.gov/cex/pumd.htm).
2 To preserve the confidentiality of the data, values for some variables, such as income sources and certain expenditures (e.g., rent, among others), are topcoded. In this process, values that exceed a predetermined critical value are replaced with a new value. In each case, changed values are flagged for user identification. Details about topcoding are provided in the public-use microdata documentation for the year of interest. (See, for example, 2011 Consumer Expenditure Interview Survey, Public Use Microdata, User’s Documentation, September 25, 2012, https://www.bls.gov/cex/2011/csxintvw.pdf.)
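As a rough illustration of the idea, not the BLS's actual procedure or critical values, topcoding replaces values above a threshold and flags the changed values:

```python
import numpy as np

# Illustration only: the values and replacement rule here are invented,
# not the BLS's actual critical values or topcoding procedure.
values = np.array([12_000, 45_000, 60_000, 250_000, 400_000], dtype=float)
critical = 100_000
over = values > critical

topcoded = values.copy()
topcoded[over] = values[over].mean()   # replace with mean of topcoded cases
flags = np.where(over, "T", " ")       # changed values flagged for users
print(topcoded)  # the two largest values both become 325000.0
```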
4 The CE sample design is pseudorandom. The proper use of weights requires the use of the method of balanced repeated replication.
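As a toy illustration of balanced repeated replication (the stratum structure and data below are invented, not the CE's actual replicate design), each replicate keeps one half-sample per stratum, with the selection pattern given by rows of a Hadamard matrix; the spread of the replicate estimates around the full-sample estimate yields the variance estimate:

```python
import numpy as np

rng = np.random.default_rng(2)

def hadamard(k):
    """2^k x 2^k Hadamard matrix via the Sylvester construction."""
    H = np.array([[1.0]])
    for _ in range(k):
        H = np.block([[H, H], [H, -H]])
    return H

n_strata = 8
y = rng.normal(100, 15, size=(n_strata, 2))   # two half-samples per stratum
w = np.ones((n_strata, 2))                    # base sampling weights

full_est = np.average(y, weights=w)           # full-sample weighted mean

H = hadamard(3)                               # 8 balanced replicates
reps = []
for r in range(n_strata):
    wr = w.copy()
    first = H[r] > 0            # which half-sample each stratum keeps
    wr[first, 0] *= 2.0         # kept half-sample gets doubled weight
    wr[first, 1] = 0.0
    wr[~first, 1] *= 2.0
    wr[~first, 0] = 0.0
    reps.append(np.average(y, weights=wr))

# BRR variance: mean squared deviation of replicate estimates.
var_brr = np.mean((np.array(reps) - full_est) ** 2)
print(full_est, var_brr)
```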