The Contingent Worker Supplement (CWS) is a set of questions that has periodically been appended to the nation’s monthly labor force survey, the Current Population Survey (CPS).1 The CWS, first fielded in 1995, is designed to measure the number and characteristics of contingent workers and workers in four alternative employment arrangements—independent contractors, on-call workers, temporary help agency workers, and workers provided by contract firms. The survey was fielded four more times—in 1997, 1999, 2001, and 2005—with a largely unchanged questionnaire.
In 2016, the U.S. Bureau of Labor Statistics (BLS) obtained funding to field the CWS in May 2017. One major goal of the 2017 CWS was to see how the number of contingent workers and workers in alternative employment arrangements had changed since 2005. Therefore, in order to maintain data comparability over time, the 2017 questionnaire was largely the same as that used when the data were last collected.
Many stakeholders were interested in adding questions to the CWS to collect information about a variety of other topics. However, the development of new questions can be a lengthy process, and BLS had limited time to make changes if the survey was to be fielded in May 2017. First, to comply with Office of Management and Budget (OMB) guidance, all substantive changes to federal survey questionnaires are evaluated, and proposed changes are announced and provided to the public for comment, which can take considerable time. Additionally, the U.S. Census Bureau—which conducts the survey for BLS—had adopted new software for its data collection instrument since the 2005 survey, so the CWS needed to be completely reprogrammed in the new software and tested thoroughly. (The data collection instrument is the custom-designed software used by Census Bureau interviewers to conduct the survey and collect responses.) Because the existing survey had an extremely complicated questionnaire, many rounds of systematic testing would be necessary to ensure that the survey instrument was programmed correctly. Developing and adding new questions and ensuring that these questions were programmed correctly would strain an already ambitious schedule.
After consulting with the Census Bureau, BLS determined that, given the time constraints and the need to minimize respondent burden, it was not possible to add more than four straightforward questions—that is, four questions with limited skip-and-fill patterns and with limited response options.2 Also, the four questions would need to be added to the end of the CWS questionnaire so that there was no impact on responses to earlier questions. In addition, placing questions at the end of the questionnaire would simplify the programming of the data collection instrument.
In early 2016, BLS formed a team of economists and survey methodologists to investigate the possibility of adding four questions to the CWS. While the group considered several topics—such as second jobs, flexibility of work, advance notification of work schedule, and contingent work and alternative employment arrangements over a longer time span than the previous week—consensus coalesced around obtaining data about work arrangements that have emerged since 2005.
New terms are being used in relation to this emerging type of work, such as “gig workers” and “gig economy.” BLS does not have a definition for these terms, and there is no generally accepted definition among researchers. Many definitions of gig workers include people in temporary jobs, independent contractors, on-call workers, and day laborers—all of which can be estimated with CWS data. However, many definitions also include people in types of work arrangements that did not exist when the survey was last fielded. Many researchers and policymakers have expressed a need for additional data on emerging work arrangements to paint a more complete picture of gig workers, especially since anecdotal evidence suggests a sharp rise in the number of these workers in recent years.
The CWS seemed an appropriate survey for collecting these new data. BLS decided to focus on one emerging type of work that most researchers consider to be a type of gig work—one that is sometimes referred to as “electronically mediated work” or “online platform work.”3 In this type of employment arrangement, workers
· use a company’s website or mobile app to connect to clients or customers and obtain short jobs, projects, or tasks;
· are paid by or through the company that owns the website or mobile app;
· choose when and whether to work; and
· may do these short jobs, projects, or tasks in person or online.
There are many examples of this type of work. For instance, some people use their own cars to transport others from place to place, obtaining customers through a mobile app that also facilitates payment for the ride. (Companies that currently enable this kind of work include Uber and Lyft.) Others do household chores or yardwork after finding clients through a mobile app or website that also arranges payment for the work. (Examples of companies that focus on these types of short-term jobs include TaskRabbit and Handy.) Still others work entirely online, taking surveys, adding descriptive keywords to photos or documents, or designing webpages for businesses. (Platforms such as Amazon Mechanical Turk and Clickworker enable this type of work.)
Note that workers are not considered electronically mediated workers simply because they use a website or mobile app to do their work. The website or app must be used to connect them directly to customers or short-term jobs or tasks, and workers must also be paid by or through the company that owns the website or app. People who find customers or jobs through online ads but are not paid by the company that owns the website where they posted the ad are not considered electronically mediated workers. For example, work found through a Craigslist.com ad is not considered electronically mediated work.
Moreover, many businesses have websites or mobile apps that their employees may use to carry out their work. For example, a company’s driver may use a mobile app to map a route when making deliveries; however, this alone does not constitute electronically mediated work. Similarly, some businesses—such as coffee shops or fast-food restaurants—allow customers to order through a website or app. These businesses typically have a dedicated staff to complete orders. Thus, a barista at a coffee shop is not considered an electronically mediated worker just because a customer may order a beverage through a mobile app.
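To summarize the definition, a work arrangement counts as electronically mediated work only if it meets all of the conditions above. The sketch below is purely illustrative (the field names and helper function are hypothetical, not CWS variables or BLS code); it simply encodes those conditions as a decision rule, with a Craigslist-style arrangement as a counterexample.

```python
# Illustrative decision rule only; field names and examples are hypothetical,
# not CWS variables or BLS production code.

def is_electronically_mediated(arrangement: dict) -> bool:
    """Return True if a work arrangement meets all of the defining conditions."""
    connects_via_platform = arrangement.get("connects_to_customers_via_company_site_or_app", False)
    paid_through_platform = arrangement.get("paid_by_or_through_platform_company", False)
    short_tasks = arrangement.get("short_jobs_projects_or_tasks", False)
    # All conditions must hold; the work itself may be done in person or entirely online.
    return connects_via_platform and paid_through_platform and short_tasks

# A ride obtained and paid for through a ride-share app qualifies.
ride_share = {"connects_to_customers_via_company_site_or_app": True,
              "paid_by_or_through_platform_company": True,
              "short_jobs_projects_or_tasks": True}

# Work found through an online classified ad but paid off-platform does not.
classified_ad = {"connects_to_customers_via_company_site_or_app": True,
                 "paid_by_or_through_platform_company": False,
                 "short_jobs_projects_or_tasks": True}

assert is_electronically_mediated(ride_share)
assert not is_electronically_mediated(classified_ad)
```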
Electronically mediated workers often have the ability to choose when and how much they work. Some people do electronically mediated work as their only source of income, while others do this type of work as a second job or “on the side.”
Though the measurement of electronically mediated work, like that of other emerging types of work, had been little researched, the BLS team began by reviewing the existing literature.4 The concepts used to define electronically mediated work are quite complicated, and many decisions had to be made in developing questions to identify this type of work.
In both the CPS and CWS, one person answers the survey questions about everyone living in the household. Thus, respondents provide data about themselves (self-reports) and others living with them (proxy reports). People who report about other household members may not be able to answer all questions about others’ employment arrangements. While it would be possible to have all household members report only about themselves, this would be quite expensive because interviewers would have to contact some households multiple times in order to obtain responses from all household members. Therefore, respondents need to be able to answer all questions both about themselves and others.
The CPS and the CWS use a “last week” reference period—that is, the week including the 12th of the month. Because the four new questions would follow the CPS and the CWS, it would make sense that they too focus on “last week.” However, while some people may do electronically mediated work as a full-time job, there is evidence that many do this type of work sporadically.5 Thus, focusing on “last week” might understate the number of people engaging in such work, as it would fail to capture those who regularly perform electronically mediated work but did not do so during the past week.
Therefore, BLS team members considered using a longer reference period, such as the past month or past year, but were concerned that this would confuse respondents because so many previous questions in both the CPS and CWS focus on “last week.” Further, a long reference period can sometimes be difficult for respondents because they may not remember when certain activities occurred. For example, if respondents are asked how many times they did a particular activity in the last year, they may include activities from 2 or 3 years ago. In addition, if a longer reference period were used for the new questions, the number of workers doing electronically mediated work would not be comparable with other CWS estimates. With the same reference week, the interaction of electronically mediated work with contingent work and alternative employment arrangements could be explored. For example, BLS could estimate the number of electronically mediated workers who were also contingent workers or independent contractors. To both avoid respondent confusion and keep measures on a comparable basis, BLS decided to use the reference period of “last week” for these new questions.
The CWS questions are asked of employed people.6 However, some researchers have suggested that people do not consider electronically mediated work to be a job.7 If true, people who only did electronically mediated work might be undercounted if the questions were limited to those classified as employed through answers to CPS questions.
Although keeping the same universe for the new questions would be practical, the BLS team evaluated whether the question universe should be expanded to include those who were not employed—that is, either unemployed or not in the labor force. However, the most basic of the labor force questions in the CPS asks “LAST WEEK, did you do ANY work for pay?”8 People with responses of “no” to this question (and who were not temporarily absent from a job) would be classified as unemployed or not in the labor force. In order to expand the question universe to those who were not employed, the CWS would essentially have to repeat this question or a variant of it to people who had already answered “no.” This could frustrate respondents who felt that they had already answered the question. In addition, BLS research suggests that the effect of missed informal work on total employment estimates is likely to be small.9 Given these concerns, BLS decided to restrict the universe of the new questions to the employed.
The existing questions in the CWS apply only to a person’s main job. For the relatively few people with more than one job (about 1 in 20 workers in 2017), this is the job in which they usually work the most hours. Because anecdotal evidence suggests that many people do electronically mediated work in addition to a regular job, the team investigated expanding the scope of the questions to include all reported jobs. Also, as mentioned above, some researchers have suggested that some people may not view electronically mediated work done on the side as a job, which could cause electronically mediated work to be underrepresented in measures of second jobs. One way to expand the scope of the questions would be to ask about any work done in the reference week, not just work for the main job.
BLS team members feared that respondents would be confused by a sudden shift to questions asking about any work after having answered so many questions about their main job. In addition, asking about any work, rather than the main job, would mean that data on electronically mediated work would not be on a comparable basis with the other data collected in the CWS.
Despite concerns about respondent confusion, BLS thought that it was important to expand the scope to provide more information about the relatively little-studied topic of electronically mediated work. To address concerns about comparability, information could be collected about whether electronically mediated work had been done for the main job, a second job, or additional work for pay. The additional work for pay category would hopefully capture electronically mediated work done by people who do not consider such work to be part of a job.
Some researchers were interested in distinguishing between electronically mediated work done entirely online and that done in person, speculating that their effects on the labor market might be different.10 In particular, in-person electronically mediated work would more likely affect local labor markets, while electronically mediated work done entirely online would more likely impact the global labor market and be influenced by international regulations and trends. In addition, some researchers suggested that in-person electronically mediated work was more likely to be a sole source of income because it may require a greater time commitment. By contrast, people might be more likely to do online electronically mediated work on an intermittent basis to supplement their incomes.11 For this reason, the BLS team decided to distinguish between these two types of work.
There were considerable challenges to designing a set of only four questions that would be clearly understood by respondents. One of the easiest ways to ask questions about electronically mediated work would be to ask whether respondents (or members of their household) had done work through specific companies, such as Uber, Lyft, or TaskRabbit. However, BLS survey questions, by longstanding tradition, do not use specific company names because companies can change, especially in emerging industries or fields. Companies popular at the time of initial survey development may no longer exist when the survey is fielded. BLS attempts to minimize changes to questionnaires because even small changes to question wording can affect responses and, thus, data comparability over time. In addition, respondents may focus only on the company named and omit similar companies. For example, respondents may fail to respond about ride-share companies other than Uber and Lyft if only those companies’ names were included in a question.12 Given these concerns, BLS decided the questions should describe the characteristics of the work itself but not use company names.
BLS knew it would be difficult to design four questions about electronically mediated work that would be clear to respondents without using company names. Respondents might interpret questions about finding jobs through websites or mobile apps as questions about online job search. In addition, use of websites and mobile apps is widespread, and they are used for many different reasons. Writing questions so that respondents could clearly identify when they had used websites or apps only to facilitate electronically mediated work would be a challenge.
Difficult concepts can often be clarified by including examples in questions. The risk of using examples is that respondents may focus on the example rather than on the actual question, which may lead to incorrect answers if a respondent’s experience does not align with the example chosen. For instance, a respondent who did electronically mediated chores might answer “no” to a question that included an example about electronically mediated ride sharing. Because of this danger, BLS is cautious about including examples in questions. However, BLS believed that the advantages of using examples would outweigh the disadvantages as long as the examples were chosen carefully.
After much discussion of the previously mentioned topics, the team proposed the wording of the four questions. One question asked about in-person electronically mediated work. Another asked about online electronically mediated work. Both used examples to clarify the concepts. The in-person and online questions were each followed by a question about which job this work was done for—that is, whether the work was for their main job, a second job, or additional work for pay.
Throughout the question development process, BLS actively sought feedback about the proposed new questions. BLS staff gave many presentations and briefings about the CWS to outside groups, including congressional staff, industry groups, academics, nonprofit organizations, and other government agencies. While these presentations tended to focus on the CWS as a whole, BLS efforts to add new questions to collect more information were also described.
In addition, an early draft of the new questions was circulated to many academics, industry experts, special interest groups, and other data users. The draft questions were also discussed with the Department of Labor’s Structure of Work Policy Working Group, which had emphasized the need for up-to-date data that could be used to study how Americans’ work arrangements have changed over time. Furthermore, the new questions were reviewed and cleared by OMB. The clearance process included two periods of public comment, during which BLS received suggestions from the public.
Through these outreach efforts, BLS received considerable feedback, all of which was evaluated. BLS staff made several wording changes to the questions based on specific suggestions received. Some suggestions were not feasible given the tight timeline, such as overhauling the CWS questionnaire or developing an alternative set of four questions on a different topic. Likewise, expanding the scope of the questions to cover a longer timeframe was not deemed practical.
In accordance with OMB guidelines for statistical surveys, BLS typically cognitively tests proposed new questions before they are added to surveys.13 Cognitive testing involves administering a sample questionnaire to recruited participants and then asking a series of debriefing questions.14 These debriefing questions collect information about the response process, providing insight into whether participants understand the questions as intended, have difficulty formulating their answers, and respond “correctly” given the measurement objectives. This type of testing can be valuable in ensuring that questions measure the intended concepts.
Two cognitive testing methods were used to evaluate the electronically mediated work questions—laboratory testing and online testing. For both the laboratory and online modes, the goals of the cognitive testing were as follows:
· To ensure that the proposed questions worked as intended—that is, that they maximized the number of true positives and minimized the number of false positives
· To test the wording of the draft questions
· To determine if introductory or transition language was necessary between the existing CWS questions and the electronically mediated work questions
· To determine whether interviewer instructions or help screens were necessary to explain the key concepts
BLS staff conducted 24 interviews in their Washington, DC, cognitive testing laboratory. Participants were recruited through advertisements on Craigslist.com and through flyers handed out at a DC taxi stand and a pizza restaurant. The ads targeted workers who were employed by specific companies, such as Uber, Lyft, TaskRabbit, or GrubHub, or in specific professions. The professions selected were those that include relatively large numbers of both electronically mediated workers and traditional workers in the same occupation (for example, Uber drivers and taxi drivers). People who responded to the advertisements were asked several screening questions to ensure they had relevant experience before being invited to participate in the cognitive testing.
A trained cognitive interviewer administered an abbreviated version of the CPS and the CWS, along with the four new questions. The interviewer then debriefed participants to gain insight into their response process in order to uncover any sources of error in what was reported and ways to improve the questions.
BLS also conducted 138 online interviews through the Amazon Mechanical Turk (mTurk) platform. While online interviews differ from how the CPS is conducted—that is, in person or by telephone—they allowed BLS to recruit participants in a broad variety of professions and outside the DC area. Also, since mTurk is itself an example of a platform that facilitates online electronically mediated work, online interviews allowed BLS to recruit a large number of individuals for whom the online question would be relevant.
The in-person question asked about short, in-person jobs or tasks that people find through companies that connect them with customers through a website or mobile app and also coordinate payment for the service. The in-person question performed differently in the two cognitive testing modes. The cognitive interviews conducted in the laboratory contained some false positive responses. Through the debriefing questions, BLS survey methodologists determined that 4 (out of 14) participants who said they had done in-person electronically mediated work had not actually done so. Instead, they had obtained clients through websites (such as Craigslist.com) but were not paid through those websites. Additionally, two of the three proxy responses of “yes” to the in-person question were found to have similar errors. However, most responses to the in-person question were correct, and participants seemed to understand the question as intended.
The in-person question performed better in the laboratory testing than it did in the online mTurk testing. During the mTurk testing, there were 18 (out of 57) false positive “yes” responses and 13 (out of 81) false negative “no” responses. The false positives were determined by evaluating open-ended text descriptions of jobs. Seven false positives were due to participants identifying mTurk tasks performed in the previous week—which were done entirely online—as in-person electronically mediated work. Other false positives were made by respondents who obtained clients through a website but were not paid through that site. The false negative determinations were made by having participants select from a list of electronically mediated work platforms through which they had worked during the previous week.
The online question asked about short, paid tasks done entirely online that people find through companies that maintain online lists of tasks. Very few people responded “yes” to this question in the cognitive testing interviews conducted in the laboratory. (Because the mTurk testing was planned, and mTurk is a platform through which people do electronically mediated work entirely online, BLS focused its efforts on recruiting cognitive test participants for the laboratory who were likely to have done in-person electronically mediated work.) All three “yes” responses to this question collected in the laboratory were determined through the debriefing to be false positives. These participants said “yes” either because they (or their household members) found clients online or because they did some of their work online.
The mTurk testing yielded mixed results for the online question. There were very few false positives but many false negatives (31 out of 42 “no” responses). Many participants who answered “no” did not include mTurk tasks they had done in the previous week. Most of these participants did not think the online question was intended to include mTurk tasks. This could be a result of administering the testing via mTurk; participants may have excluded their mTurk work because BLS knew they were on mTurk.
Both the in-person and online questions were followed by a “which job” question; respondents who said “yes” to either question were asked whether that work had been done for the main job, a second job, or additional work for pay. The laboratory testing did not probe specifically about answers to these two follow-up questions, although interviewers did probe when participants displayed obvious difficulty. Some participants found it hard to distinguish between a second job and additional work for pay, often because they did not think of electronically mediated work as a job. In the mTurk testing, most participants said they did electronically mediated work—particularly online work—as additional work for pay.
In the CPS, main job and second job concepts are communicated through the survey questions. However, the truncated version of the CPS interview given during the cognitive testing asked only about the main job and did not include any questions about the second job. Therefore, BLS believed that some of the confusion that occurred during testing would not occur in an actual field interview. Similarly, BLS thought that CWS respondents would understand the difference between main and second jobs if they had been administered the full CPS interview.
Although there were some participants who provided incorrect responses, both types of testing indicated that the four questions generally measured what they were intended to measure. BLS survey methodologists analyzed all participant interviews to determine why incorrect answers had occurred. They identified several issues:
In-person and online questions
· Some participants thought websites that advertised goods and services but did not facilitate payment, such as Craigslist.com, were applicable to both the in-person and online questions.
In-person question
· In both testing modes, participants who relied on the internet or mobile apps for their work thought the in-person question applied to them. Specifically, participants who found clients through social media and participants who worked for businesses that allow customers to place their orders through mobile apps or websites thought the in-person question applied to their situation. They appeared to miss the reference to “in person.”
· Many participants in the mTurk testing reported online electronically mediated work (in particular, tasks done through mTurk) as part of their answer to the in-person question.
Online question
· Several participants with data entry jobs at traditional companies believed that the online question applied to them.
· Many mTurk participants did not include their experience with mTurk as part of the online question. This may be because they were tested using mTurk and assumed that mTurk tasks should be excluded.
“Which job” questions
· Participants had some difficulty differentiating between second job and additional work for pay.
The final report on the cognitive testing made several recommendations designed to improve the questions.15 To stress the difference between in-person and online work, the report made two suggestions: (1) to add introductory, clarifying language and (2) to emphasize the words “in person” and “online” in the questions. The report also suggested revising the examples to better represent the type of work being asked about. In addition, the report suggested highlighting that BLS was interested in learning about all work, not just the main job.
The question wording was finalized based on these recommendations. However, because of time and funding constraints, BLS adopted the revised questions without additional cognitive testing.
After making changes based on the cognitive testing results and stakeholder comments, BLS finalized the question wording in July 2016. Before being asked the questions about electronically mediated work, respondents were given a short introduction:
I now have a few questions related to how the internet and mobile apps have led to new types of work arrangements. I will ask first about tasks that are done in person and then about tasks that are done entirely online.
This introduction was intended to alert respondents to the fact that the following questions would touch on the internet and mobile apps. It also aimed to signal respondents to distinguish between in-person work and work done entirely online. The hope was that this introduction would clarify what might otherwise appear to be repetitive language.
Final wording of the in-person question and follow-up “which job” question was as follows:
Q1 Some people find short, IN-PERSON tasks or jobs through companies that connect them directly with customers using a website or mobile app. These companies also coordinate payment for the service through the app or website.
For example, using your own car to drive people from one place to another, delivering something, or doing someone’s household tasks or errands.
Does this describe ANY work (you/NAME) did LAST WEEK?
Q1a Was that for (your/NAME’s) (job/(main job, (your/NAME’s) second job)) or (other) additional work for pay?
Note that names are used if the question is asked about others in the household. If respondents answer “yes” to the in-person question (Q1), they are asked the follow-up “which job” question (Q1a). People with only one job are asked whether the in-person electronically mediated work was for their job or additional work for pay. Multiple jobholders are asked whether this work was for their main job, a second job, or other additional work for pay.
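As a rough sketch of the fill pattern just described, the function below assembles the Q1a wording from two pieces of information: whether the report is a self-report or a proxy report, and whether the person holds more than one job. The function and its parameters are hypothetical illustrations, not the Census Bureau’s instrument code.

```python
# Hypothetical sketch of the Q1a/Q2a fill pattern; not the actual instrument logic.

def which_job_question(self_report: bool, name: str, multiple_jobholder: bool) -> str:
    """Assemble the 'which job' follow-up asked after a 'yes' to Q1 or Q2."""
    whose = "your" if self_report else f"{name}'s"
    if multiple_jobholder:
        # Multiple jobholders choose among main job, second job, or other additional work for pay.
        return (f"Was that for {whose} main job, {whose} second job, "
                "or other additional work for pay?")
    # Single jobholders choose between their job and additional work for pay.
    return f"Was that for {whose} job or additional work for pay?"

print(which_job_question(self_report=True, name="", multiple_jobholder=False))
# Was that for your job or additional work for pay?
print(which_job_question(self_report=False, name="Chris", multiple_jobholder=True))
# Was that for Chris's main job, Chris's second job, or other additional work for pay?
```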
As recommended in the cognitive testing report, the words “in person” were capitalized in the question. Interviewers are instructed that capitalized words are important and must be emphasized when conducting interviews. To reduce underreporting of paid activities that respondents think of as “not a job,” the question asks whether the description applies to ANY work done during the reference period. The basic CPS questions about work already emphasize any work, so respondents would have heard this emphasis in earlier questions. The words “last week” are also emphasized, as they are in other questions throughout the CWS and the CPS that refer to the reference week.
The questions about online electronically mediated work and about which job were very similar to the questions for in-person work, though with emphasis on the word “online” and with different examples:
Q2 Some people select short, ONLINE tasks or projects through companies that maintain lists that are accessed through an app or a website. These tasks are done entirely online, and the companies coordinate payment for the work.
For example, data entry, translating text, web or software development, or graphic design.
Does this describe ANY work (you/NAME) did LAST WEEK?
Q2a Was that for (your/NAME’s) (job/(main job, (your/NAME’s) second job)) or (other) additional work for pay?
As mentioned earlier, the software used to program the data collection instrument—that is, the custom-designed software used by Census Bureau interviewers to collect the data—had changed since the CWS was last collected. Because of the change, Census Bureau staff reprogrammed the instrument for the 2017 CWS, adding the four new questions to the end. Staff at both the Census Bureau and BLS performed many rounds of extensive instrument testing to ensure that CWS questions appeared on the screen as expected and that all skip-and-fill patterns were correct. In addition, the Census Bureau tested the processing system before fielding.
It is cost prohibitive to do in-person training for CPS supplements like the CWS because interviewers are based all over the country. Instead, interviewers are typically trained about supplements through 1-hour self-study materials. BLS updated and augmented the 2005 training materials to include information about the new questions on electronically mediated work. The final May 2017 self-study materials covered not only the new questions but all questions on the CWS, which collects data about a number of different topics. Reflecting the order of the questions in the survey, the information about the new questions appeared at the very end of the self-study. The training materials were provided before the fielding of the CWS, and interviewers were instructed to complete the self-study materials as part of their preparations for the month.
The CWS was fielded in May 2017. No major problems with either the existing questions or the new questions were reported by interviewers during the data collection period. Considerable time was needed to process the data. Just as the data collection instrument had to be reprogrammed, all edits had to be completely reprogrammed. In addition, supplement weights needed to be developed.
Interviews conducted by telephone from one of the Census Bureau’s three data collection centers are taped for quality assurance purposes and are retained for a short period. It is standard practice for BLS staff to monitor a handful of interviews after new CPS questions are fielded. Monitoring allows staff to hear the entire interview, including apparent respondent confusion, requests for clarification, and verbatim responses to the questions. From listening to interviews, it is often possible to determine whether the questions were easily understood by respondents, whether answers were correct, and whether breakdowns in communication occurred.
While the Census Bureau was processing the data, the BLS team monitored many interviews to assess the data quality of the new questions. To enable a qualitative analysis of how the questions worked, the team used the unprocessed data to select cases with a variety of characteristics—such as occupation, self-response versus proxy response, and multiple-jobholding status. The selected cases included both those in which respondents said “yes” to at least one of the new CWS questions and those in which respondents answered “no” to both questions. Three or four team members attended each of several monitoring sessions and recorded their observations. Team members independently noted interactions between respondents and interviewers based on predetermined guidelines and assessed the correctness of answers to the new questions. The group discussed each interview immediately after listening to it, and team members were almost always in complete agreement about their assessments of cases.
In all, the CWS team monitored about 100 interviews. It was clear that there were many false positives for both the in-person and online electronically mediated work questions. Respondents had described their main job earlier in the interview, and they often mentioned additional details about the work when answering the in-person and online questions. For most “yes” responses, it was obvious that the reported work could not have been obtained through a website or app that also coordinated payment for the work. Staff monitoring interviews observed some common patterns.
Many respondents focused on the examples rather than on the definitions of electronically mediated work given in the questions. Additionally, if respondents hesitated, interviewers sometimes repeated only the examples. Consequently, many said “yes” to the question if any of their job duties resembled any of the examples included in the questions. For example, monitors heard the following responses:
· “Yes, I drive my car to work.”
· “Yes, I sometimes use a computer at work.”
· “Yes, that describes part of what I do at work.”
· “Yes, I’m a graphic designer.”
Also, many respondents who said they did in-person electronically mediated work for their main job also said they did online electronically mediated work for that same job. It is highly unlikely that people did both electronically mediated work in person and entirely online for the same job.16
Some respondents with traditional jobs used websites or mobile apps in their work. Some of these websites and apps did not facilitate electronically mediated work, but respondents gave affirmative answers to the questions anyway. Many answered “yes” if they obtained clients or jobs using a website or mobile app even if they were not paid through that website or app. Examples of respondents in this type of situation include the following:
· A real estate agent who obtained customers through the web
· A gravel delivery person who used an app to obtain route directions
· A fast-food worker who prepared orders that customers placed through an app
Some respondents appeared to think the questions were asking about whether they used a computer in their work. A number of respondents said “yes” to the questions and listed as examples work that was clearly not electronically mediated. Examples of respondents in this type of situation include the following:
· A university lecturer who did all work online (lectures, student interactions, etc.)
· A technical support person who was connected to people to help through the internet
· A receptionist in a doctor’s office who scheduled appointments using a computer
By asking unscripted probes, interviewers can help respondents determine the response option that best fits. However, BLS staff rarely observed interviewers probing for correct answers or providing explanations to confused respondents; instead, many simply repeated the examples in the questions. In addition, interviewers sometimes could not interpret respondents’ answers. In a few cases, interviewers prompted respondents to change previously correct answers, saying things such as “but you do use a computer, don’t you?” for the in-person question. In response to these inquiries, respondents’ correct “no” answers were occasionally converted to incorrect “yes” answers.
The team observed many false “yes” answers to both the in-person and online questions. In general, both questions appear to have been too complicated. In order for a “yes” answer to be accurate, several conditions needed to be true. Many respondents did not seem to consider all of the necessary conditions and instead responded “yes” when only one of the conditions was true. While the team concluded that both questions had a high number of false positive responses, they observed no false negatives.
The CWS team then turned to examining records on the confidential microdata file. While this file does not contain as much information about each case as a taped interview, it does include answers for other questions in the CPS and the CWS, including respondents’ verbatim descriptions of job duties, employer name, industry, and occupation. In addition, the file contains information about usual work hours; whether the person worked for the government, a for-profit firm, or a nonprofit firm; and self-employment status. The file also contains CWS information about whether people were independent contractors or in other alternative employment arrangements on their main job.
The electronically mediated work questions were asked about more than 46,000 people, and there were relatively few “yes” responses—about 1,600 for the in-person question, the online question, or both. Most of these answers indicated that the work was done for a person’s main job, and BLS could obtain information about those jobs using the confidential microdata file. A quick review reinforced what had been observed in the monitoring—that many of these “yes” answers were clearly false positives. For example, the file showed that “yes” answers for the in-person question had been recorded for the following main jobs:
· Vice president of a major bank
· Manager of a fast-food restaurant
· Local police officer
· Surgeon at a large hospital
For the online question, “yes” answers were often given for people who used computers or mobile apps in their work, even though not all of them had done electronically mediated work. Many people with “yes” answers, though not all, clearly could not have done all of their work entirely online. Examples of cases with likely false positives for the online question include the following (again, these are people who said they did this work for their main job):
· Medical assistant administering medication to patients
· Hair stylist
· Railroad engineer
· Front desk clerk at a motel
BLS also examined records with “no” responses for the electronically mediated questions. A quick review reinforced the conclusions from the monitoring—that is, the vast majority of negative answers for both the in-person and online questions appeared to be correct.
Given that both the monitoring and the review of the microdata revealed that the questions had not worked as intended, BLS considered whether the data were too flawed to release. Although there were many false positives, false negatives did not seem to be a problem. Therefore, the team decided to use the verbatim information on the confidential microdata file to see whether incorrectly coded cases could be identified.
The team devised a test to determine whether false positives could be identified, first creating guidelines to help identify whether electronically mediated work had been done. For example, respondents who worked for the federal, state, or local government were unlikely to be electronically mediated workers. The team agreed that unclear cases should be assumed to be correct. (See appendix A for a complete list of the guidelines.)
Using the guidelines the team had developed, a group of five staff members evaluated 100 records with “yes” answers to the in-person question. Key information about each case was read aloud, and each of the five team members independently evaluated whether the respondent’s answer was compatible with electronically mediated work, assigning a rating of “yes,” “no,” or “maybe” for each case. Team members’ determinations were not discussed during the evaluation session. After all 100 cases had been evaluated, the ratings were compared. For a substantial number of records, the team members unanimously agreed that the “yes” answer was incorrect. They also agreed unanimously that a few cases definitely had correct answers. The test confirmed that many false positives could be identified through a recoding process.
Based on the results of the recoding test, the team decided to evaluate all records with affirmative answers to the in-person and online questions and recode erroneous answers when possible. Information is collected for both main and second jobs in the CPS, so any evaluation of answers needed to consider the job for which the electronically mediated work was done. The team devised three approaches for reviewing data that depended on respondents’ answers to the “which job” questions—that is, work done for the main job, a second job, or additional work for pay. The team also reviewed a sample of “no” answers to check for false negatives.
The vast majority of respondents who said “yes” to either the in-person or online questions reported that the electronically mediated work had been done for their main job (or their household members’ main job). Of the 912 “yes” answers for the in-person question, 826 (91 percent) were for the main job. Of the 963 “yes” answers for the online question, 917 (95 percent) were for the main job.
The CWS team reviewed each record with a “yes” answer to the in-person or online question and an answer of “main job” to the corresponding “which job” question. The review was done in a systematic fashion by groups of five team members, and the in-person and online questions were evaluated separately. As with the recoding test, key information about each case was read aloud, and each of the five team members independently evaluated whether they thought the respondent had done electronically mediated work, assigning answers of “yes,” “no,” or “maybe” for each case. Team members’ determinations were not discussed during the evaluation sessions.
Cases with four “no” answers and one “maybe” answer were assumed to be false positives, as were cases that had unanimously been assigned “no” by all five team members. Once this review was completed, the number of records with “yes” answers for the in-person question had been reduced from 826 on the main job to 184, and the number of “yes” answers to the online question had been reduced from 917 on the main job to 167. Team members believed that, while they had identified many false positives, there were likely additional false positives that could not be identified given the available data.
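A minimal sketch of this consensus rule appears below, assuming each case carries the five reviewers’ independent ratings of “yes,” “no,” or “maybe”; the data layout and function are illustrative only, not the team’s actual processing code.

```python
from collections import Counter

def recode_case(collected_answer: str, ratings: list[str]) -> str:
    """Apply the consensus rule to one case with a collected answer of 'yes' or 'no'.

    A 'yes' is treated as a false positive (recoded to 'no') only when all five
    reviewers rated it 'no', or when four rated it 'no' and one rated it 'maybe'.
    """
    counts = Counter(ratings)
    unanimous_no = counts["no"] == 5
    four_no_one_maybe = counts["no"] == 4 and counts["maybe"] == 1
    if collected_answer == "yes" and (unanimous_no or four_no_one_maybe):
        return "no"          # identified false positive
    return collected_answer  # ambiguous or supported cases keep the collected answer

print(recode_case("yes", ["no", "no", "no", "no", "maybe"]))   # -> no
print(recode_case("yes", ["no", "no", "no", "maybe", "yes"]))  # -> yes (kept)
```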
Almost all respondents who answered “yes” to either the in-person or online questions said that the work was done for the main job. However, a small number of people said this work was done for the second job. Five percent of “yes” responses for the in-person question and 2 percent for the online question were for a second job.
Because of the survey design, the CPS has less information on second jobs than on main jobs. Each month, information about job duties, employer name, occupation, and industry for the second job is collected from only about one-fourth of multiple jobholders.17 For the records with detailed information about second jobs, the team did an evaluation similar to that done for the main job, reviewing respondents’ verbatim descriptions and other information to determine whether a “yes” answer to the electronically mediated work questions should have been coded as a “no.” For the three-fourths of records with no additional information, the response provided was accepted without recoding.
As with the exercise done for the main job, the team evaluated answers for the in-person and online questions independently and identified a small number of false positives. Because so few records were evaluated, the team was not able to conclude whether “yes” answers were more likely to be correct for the second job than for the main job. After recoding the records for which information was available, the number of affirmative answers for the in-person question decreased from 50 to 48, and the number for the online question decreased from 24 to 19. It is likely that, had information about second jobs been available for the other three-fourths of multiple jobholders, more “yes” answers would have been recoded to “no.”
A small number of respondents reported that they or their household members did electronically mediated work for “additional work for pay”—4 percent of the “yes” responses for in-person work and 2 percent for online work. The confidential microdata file does not contain any information about what respondents did for additional work for pay. Because the CWS team had no additional information about these respondents, their answers were accepted and were not reviewed. In addition, answers were assumed to be correct for the very small number of respondents who said that they or their household members did electronically mediated work but did not answer the “which job” questions.
Although the monitoring suggested there was not a problem with incorrect “no” answers, BLS used the microdata to look for false negatives in two ways. First, staff looked at cases with “no” answers in occupations in which anecdotal evidence suggests there may be high numbers of electronically mediated workers. Staff members saw no evidence of a substantial problem with false negatives in these occupations.
Second, BLS identified records containing selected keywords. Keywords included businesses that commonly facilitate electronically mediated work, such as Uber, Lyft, TaskRabbit, Handy, Amazon mTurk, and Crowdflower. Words associated with electronically mediated work, such as taxi, freelance, and ride share, were also included.18 Using the verbatim descriptions of job duties, employer name, occupation, and industry, a team of five staff members evaluated each case to determine whether an incorrect answer of “no” had been recorded for the in-person and online questions. Team members identified a handful of incorrect “no” answers—9 records out of about 175 cases with the selected keywords. (Many of the correct “no” answers were taxi drivers and handymen who clearly had not done electronically mediated work.) Note that this was not a random sample. Rather, these records were chosen as being the most likely to have false negatives among all those with “no” answers. Because the number of false negatives identified was so small, the team concluded that false negatives were of little concern overall.
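The keyword screen could be approximated by something like the sketch below, which flags “no” records whose verbatim text mentions a platform name or related term so that staff can review them manually. The record fields and keyword list are illustrative assumptions, not the exact set BLS used.

```python
# Illustrative keyword screen for possible false negatives; field names and the
# keyword list are assumptions for this sketch, not the exact BLS criteria.

KEYWORDS = {"uber", "lyft", "taskrabbit", "handy", "mturk", "crowdflower",
            "taxi", "freelance", "ride share", "rideshare"}

VERBATIM_FIELDS = ("job_duties", "employer_name", "occupation", "industry")

def flag_for_review(record: dict) -> bool:
    """Flag a record with a 'no' answer if any keyword appears in its verbatim fields."""
    text = " ".join(str(record.get(field, "")) for field in VERBATIM_FIELDS).lower()
    return any(keyword in text for keyword in KEYWORDS)

no_answer_records = [
    {"job_duties": "drive passengers to destinations", "employer_name": "Uber",
     "occupation": "driver", "industry": "transportation"},
    {"job_duties": "teach high school math", "employer_name": "county school district",
     "occupation": "teacher", "industry": "education"},
]

flagged = [r for r in no_answer_records if flag_for_review(r)]  # only the first record is flagged
```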
Final results of the recoding, taking into account both the false positives and the handful of false negatives found by the team, served to lower the number of observations with “yes” answers for the in-person electronically mediated work question from 912 to 277. Recoding lowered the number of “yes” answers to the online question from 963 to 208. For both the in-person and online questions, most of the answers that were changed were for workers who had done electronically mediated work for their main job. This was partly because most cases were for the main job, and partly because BLS had information about virtually all main jobs. False positives for cases in which the electronically mediated work was done for the second job were less common, but the BLS team could only evaluate about one-fourth of those cases because of the lack of information about second jobs on the microdata file. Lacking any information about what was done as additional work for pay, the BLS team could not recode any additional-work-for-pay cases.
The recoding of in-person and online electronically mediated work was done independently; there was no attempt to ensure that “yes” answers did not occur for both questions, even though the BLS team agreed that someone was highly unlikely to do electronically mediated work both in person and entirely online for the same job. Even so, the number of cases with “yes” answers for both the in-person and online questions fell sharply—from 293 in the collected data to 23 in the recoded data.
Weighted estimates showed that the broad demographic characteristics of electronically mediated workers were similar for both the collected and recoded data. However, there were a number of differences by industry. (See table 1.)
Table 1. Electronically mediated workers by selected characteristics, recoded and collected data, May 2017

Characteristic | Recoded: Total | Recoded: In person | Recoded: Online | Collected: Total | Collected: In person | Collected: Online |
---|---|---|---|---|---|---|
Number of workers (in thousands) | 1,609 | 990 | 701 | 5,057 | 3,021 | 2,969 |
Percent of total employed | 1.0 | 0.6 | 0.5 | 3.3 | 2.0 | 1.9 |
Class of worker(1) | | | | | | |
Total | 100.0 | 100.0 | 100.0 | 100.0 | 100.0 | 100.0 |
Wage and salary workers | 62.8 | 62.4 | 62.6 | 78.1 | 72.7 | 81.9 |
Private industries | 59.1 | 58.3 | 58.9 | 66.7 | 63.6 | 69.4 |
Government | 3.8 | 4.2 | 3.8 | 11.4 | 9.2 | 12.5 |
Self-employed workers | 37.2 | 37.6 | 37.4 | 21.9 | 27.3 | 18.1 |
Self-employed workers, incorporated | 7.3 | 7.1 | 8.8 | 7.3 | 8.8 | 6.6 |
Self-employed workers, unincorporated | 29.8 | 30.5 | 28.6 | 14.6 | 18.5 | 11.5 |
Industry(1) | | | | | | |
Total | 100.0 | 100.0 | 100.0 | 100.0 | 100.0 | 100.0 |
Agriculture and related industries | 0.0 | 0.1 | 0.0 | 0.2 | 0.2 | 0.3 |
Mining, quarrying, and oil and gas extraction | 0.0 | 0.0 | 0.0 | 0.2 | 0.0 | 0.3 |
Construction | 1.2 | 1.3 | 0.9 | 4.4 | 5.2 | 3.9 |
Manufacturing | 1.1 | 0.5 | 1.9 | 6.0 | 4.6 | 6.5 |
Wholesale trade | 0.9 | 0.9 | 0.8 | 2.0 | 2.2 | 2.1 |
Retail trade | 5.9 | 5.6 | 7.1 | 9.9 | 10.9 | 8.8 |
Transportation and utilities | 21.8 | 35.0 | 1.9 | 9.8 | 13.9 | 5.1 |
Information | 4.1 | 1.4 | 7.5 | 3.0 | 1.7 | 4.2 |
Financial activities | 3.3 | 2.5 | 4.0 | 9.5 | 9.6 | 9.6 |
Professional and business services | 31.0 | 16.4 | 51.2 | 19.9 | 16.4 | 24.5 |
Education and health services | 16.3 | 19.2 | 12.9 | 18.4 | 16.9 | 19.8 |
Leisure and hospitality | 6.4 | 6.5 | 7.1 | 6.3 | 7.1 | 5.0 |
Other services | 7.2 | 10.0 | 4.2 | 5.9 | 7.5 | 4.8 |
Public administration | 0.6 | 0.6 | 0.6 | 4.4 | 3.7 | 5.0 |

Notes: (1) Refers to the sole or main job; electronically mediated work may be done for the main job, a second job, or additional work for pay. Some people did electronically mediated work both in person and online. An Excel version of this table is available at https://www.bls.gov/cps/electronically-mediated-employment.htm.

Source: Contingent Worker Supplement to the Current Population Survey, U.S. Bureau of Labor Statistics.
Most notably, 22 percent of electronically mediated workers in the recoded data were in the transportation and utilities industry on their main job, over twice the share found in the collected data (10 percent). Reflecting the relatively large share of electronically mediated workers who were ride-share drivers, the difference was particularly great for those who did this work in person—35 percent as recoded and 14 percent as collected. In addition, the share of electronically mediated workers in professional and business services was higher in the recoded data (31 percent) than in the collected data (20 percent). Many technical jobs in this industry, such as graphic design, copy editing, and computer programming, can be electronically mediated. Among electronically mediated workers who did their work entirely online, the recoded share for this industry was about double the collected share—51 percent versus 25 percent.
By class of worker—that is, whether people were wage and salary workers or self-employed—the recoded and collected data also differ somewhat. In the recoded data, 4 percent of electronically mediated workers were employed in government on their main job, compared with 11 percent in the collected data. (Although people are unlikely to do electronically mediated work for a government job, some workers employed by the government on their main job did electronically mediated work for a second job or for additional work for pay. In addition, data were not recoded for a small number of cases because there was insufficient verbatim information on the confidential microdata file.) Moreover, the share of electronically mediated workers who were self-employed workers with unincorporated businesses was 30 percent in the recoded data, twice the share in the collected data (15 percent). (Detailed estimates showing the impact of recoding on in-person and online electronically mediated work are available in appendix B.)
For both the collected and recoded data, table 2 shows the numbers and percentages of in-person and online electronically mediated workers who did this work for their main job, a second job, or additional work for pay. The share who did in-person electronically mediated work for their main job was 91 percent in the collected data, higher than the 72 percent found in the recoded data. The difference was similar for online workers—94 percent in the collected data and 78 percent in the recoded data. However, this difference reflects the fact that BLS had more information about main jobs than about other jobs or additional work for pay, and BLS could recode many cases in which work was done for the main job. As mentioned earlier, the confidential file contains information for second jobs for only about one-fourth of multiple jobholders, and contains no information about the work done for additional work for pay. Consequently, BLS could recode very few cases in which work was done for the second job. The data reflect this, showing little difference in the number of people who did electronically mediated work for their second job in the collected and recoded data. None of the cases reporting additional work for pay were recoded. Thus, there are likely to be more false positives in the recoded data among those who did electronically mediated work for a second job or as additional work for pay. While BLS is confident in estimates of the number of people who did electronically mediated work for their main job, the number of people who did this work for a second job or for additional work for pay may be overstated. Because the team could not recode as many second-job cases or any additional-work-for-pay cases, percent distributions from the “which job” questions should be viewed with caution.
Which job | Recoded | Collected | ||
---|---|---|---|---|
In person | Online | In person | Online | |
Total | 990 | 701 | 3,021 | 2,969 |
Main job | 717 | 544 | 2,746 | 2,799 |
Second job | 142 | 67 | 143 | 80 |
Additional work for pay | 120 | 85 | 120 | 85 |
Percent distribution | ||||
Total | 100.0 | 100.0 | 100.0 | 100.0 |
Main job | 72.4 | 77.6 | 90.9 | 94.3 |
Second job | 14.3 | 9.5 | 4.7 | 2.7 |
Additional work for pay | 12.1 | 12.1 | 4.0 | 2.9 |
Notes: BLS does not recommend using data from the “which job” questions as there was little or no information to recode people who did electronically mediated work for a second job or for additional work for pay. In particular, percent distributions of in-person and online electronically mediated work done for the main job, second job, or additional work for pay are likely to be misleading. Totals include a small number who did not answer the “which job” questions. An Excel version of this table is available at https://www.bls.gov/cps/electronically-mediated-employment.htm. Source: Contingent Worker Supplement to the Current Population Survey, U.S. Bureau of Labor Statistics. |
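The percent distributions in table 2 follow directly from the counts in the upper panel. The minimal sketch below (in Python, with the published counts transcribed in thousands) recomputes the shares; small differences from the published percentages, such as 9.6 versus 9.5 percent for recoded online second jobs, reflect rounding of the underlying estimates.

```python
# Recompute the percent distributions in table 2 from the published counts
# (in thousands). Totals are used as denominators because they include a
# small number of respondents who did not answer the "which job" questions.
counts = {
    # (data version, type of work): {job category: count}
    ("recoded", "in person"):   {"main": 717,  "second": 142, "additional": 120, "total": 990},
    ("recoded", "online"):      {"main": 544,  "second": 67,  "additional": 85,  "total": 701},
    ("collected", "in person"): {"main": 2746, "second": 143, "additional": 120, "total": 3021},
    ("collected", "online"):    {"main": 2799, "second": 80,  "additional": 85,  "total": 2969},
}

for (version, work_type), c in counts.items():
    shares = {job: 100 * n / c["total"] for job, n in c.items() if job != "total"}
    print(version, work_type, {job: round(s, 1) for job, s in shares.items()})
# recoded in person -> main 72.4, second 14.3, additional 12.1 (matches table 2)
```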
BLS is confident that the recoded data provide a better picture of the number and characteristics of in-person and online electronically mediated workers than do the collected data. While the confidential file provided additional detail about jobs and was used to identify clear false positives, team members did not recode ambiguous cases. In addition, there was little or no information available to recode people who did electronically mediated work for a second job or for additional work for pay, but relatively few responses were in these categories. Thus, while some false positives likely remain in the recoded data, BLS believes that measures using the recoded data more accurately represent the number and characteristics of electronically mediated workers than do measures using the collected data.
However, BLS does not recommend using data from the “which job” questions. In particular, percent distributions of in-person and online electronically mediated work done for the main job, second job, or additional work for pay are likely to be misleading.
All estimates in this section are based on recoded data. BLS believes these data to be superior because they exclude the obvious false positives in the collected data. However, because the questions did not work as intended and there was not enough information to recode all cases, the recoded data may still have limitations. In addition, some of these estimates are based on relatively few observations, so variances may be large.
In May 2017, there were 1.6 million electronically mediated workers, accounting for 1.0 percent of total employment. (See table 3.) These workers obtained short jobs or tasks through websites or mobile apps that both connected them with customers and facilitated payment for the tasks. The estimates include all people who did electronically mediated work, whether for their main job, a second job, or additional work for pay. Of all workers, 0.6 percent did electronically mediated work in person and 0.5 percent did electronically mediated work entirely online. Note that some people did electronically mediated work both in person and entirely online. This can occur when people do electronically mediated work for two different jobs.
Characteristic | Total employed | Electronically mediated workers, total | Electronically mediated workers, in person | Electronically mediated workers, online | Percent of total employed, total | Percent of total employed, in person | Percent of total employed, online |
---|---|---|---|---|---|---|---|
Total, 16 years and over | 153,331 | 1,609 | 990 | 701 | 1.0 | 0.6 | 0.5 |
Men | 81,545 | 870 | 534 | 370 | 1.1 | 0.7 | 0.5 |
Women | 71,785 | 739 | 456 | 331 | 1.0 | 0.6 | 0.5 |
Age | |||||||
16 to 24 | 19,054 | 166 | 73 | 110 | 0.9 | 0.4 | 0.6 |
25 to 54 | 98,801 | 1,146 | 718 | 488 | 1.2 | 0.7 | 0.5 |
25 to 34 | 33,991 | 401 | 239 | 184 | 1.2 | 0.7 | 0.5 |
35 to 44 | 32,065 | 355 | 223 | 146 | 1.1 | 0.7 | 0.5 |
45 to 54 | 32,745 | 390 | 257 | 157 | 1.2 | 0.8 | 0.5 |
55 and over | 35,476 | 297 | 199 | 104 | 0.8 | 0.6 | 0.3 |
55 to 64 | 26,236 | 219 | 150 | 75 | 0.8 | 0.6 | 0.3 |
65 and over | 9,240 | 77 | 49 | 29 | 0.8 | 0.5 | 0.3 |
Race and Hispanic or Latino ethnicity | |||||||
White | 120,638 | 1,200 | 692 | 589 | 1.0 | 0.6 | 0.5 |
Black or African American | 18,588 | 276 | 228 | 48 | 1.5 | 1.2 | 0.3 |
Asian | 9,110 | 93 | 45 | 49 | 1.0 | 0.5 | 0.5 |
Hispanic or Latino ethnicity | 25,525 | 265 | 183 | 94 | 1.0 | 0.7 | 0.4 |
Usual full- and part-time status(1) | |||||||
Full-time workers | 125,240 | 1,165 | 687 | 548 | 0.9 | 0.5 | 0.4 |
Part-time workers | 28,091 | 444 | 303 | 154 | 1.6 | 1.1 | 0.5 |
Class of worker(2) | |||||||
Wage and salary workers | 138,183 | 1,011 | 618 | 439 | 0.7 | 0.4 | 0.3 |
Private industries | 116,300 | 950 | 577 | 413 | 0.8 | 0.5 | 0.4 |
Government | 21,884 | 61 | 41 | 27 | 0.3 | 0.2 | 0.1 |
Self-employed workers | 15,147 | 598 | 372 | 262 | 3.9 | 2.5 | 1.7 |
Self-employed workers, incorporated | 5,575 | 118 | 70 | 61 | 2.1 | 1.3 | 1.1 |
Self-employed workers, unincorporated | 9,572 | 480 | 302 | 201 | 5.0 | 3.2 | 2.1 |
Industry(2) | |||||||
Agriculture and related industries | 2,498 | 1 | 1 | 0 | 0.0 | 0.0 | 0.0 |
Mining, quarrying, and oil and gas extraction | 775 | 0 | 0 | 0 | 0.0 | 0.0 | 0.0 |
Construction | 10,484 | 20 | 13 | 6 | 0.2 | 0.1 | 0.1 |
Manufacturing | 15,984 | 18 | 5 | 13 | 0.1 | 0.0 | 0.1 |
Wholesale trade | 3,383 | 15 | 9 | 6 | 0.4 | 0.3 | 0.2 |
Retail trade | 16,131 | 96 | 56 | 50 | 0.6 | 0.3 | 0.3 |
Transportation and utilities | 7,773 | 351 | 346 | 13 | 4.5 | 4.5 | 0.2 |
Information | 2,894 | 66 | 14 | 53 | 2.3 | 0.5 | 1.8 |
Financial activities | 10,640 | 52 | 25 | 28 | 0.5 | 0.2 | 0.3 |
Professional and business services | 18,528 | 499 | 162 | 359 | 2.7 | 0.9 | 1.9 |
Education and health services | 35,384 | 262 | 190 | 90 | 0.7 | 0.5 | 0.3 |
Leisure and hospitality | 14,244 | 104 | 64 | 50 | 0.7 | 0.5 | 0.3 |
Other services | 7,517 | 115 | 99 | 30 | 1.5 | 1.3 | 0.4 |
Public administration | 7,095 | 10 | 6 | 4 | 0.1 | 0.1 | 0.1 |
Occupation(2) | |||||||
Management, professional, and related occupations | 62,378 | 720 | 261 | 505 | 1.2 | 0.4 | 0.8 |
Management, business, and financial operations occupations | 25,866 | 234 | 117 | 130 | 0.9 | 0.5 | 0.5 |
Professional and related occupations | 36,513 | 486 | 144 | 375 | 1.3 | 0.4 | 1.0 |
Service occupations | 26,405 | 264 | 245 | 35 | 1.0 | 0.9 | 0.1 |
Sales and office occupations | 32,584 | 235 | 121 | 128 | 0.7 | 0.4 | 0.4 |
Sales and related occupations | 15,134 | 109 | 57 | 62 | 0.7 | 0.4 | 0.4 |
Office and administrative support occupations | 17,450 | 125 | 64 | 67 | 0.7 | 0.4 | 0.4 |
Natural resources, construction, and maintenance occupations | 14,104 | 39 | 20 | 19 | 0.3 | 0.1 | 0.1 |
Farming, fishing, and forestry occupations | 1,222 | 4 | 0 | 4 | 0.4 | 0.0 | 0.4 |
Construction and extraction occupations | 7,985 | 21 | 7 | 14 | 0.3 | 0.1 | 0.2 |
Installation, maintenance, and repair occupations | 4,896 | 14 | 14 | 0 | 0.3 | 0.3 | 0.0 |
Production, transportation, and material moving occupations | 17,860 | 352 | 343 | 15 | 2.0 | 1.9 | 0.1 |
Production occupations | 8,785 | 7 | 4 | 3 | 0.1 | 0.0 | 0.0 |
Transportation and material moving occupations | 9,075 | 345 | 339 | 12 | 3.8 | 3.7 | 0.1 |
Contingent worker status(2) | |||||||
Contingent workers, estimate 1 | 1,958 | 21 | 13 | 8 | 1.1 | 0.7 | 0.4 |
Contingent workers, estimate 2 | 2,511 | 79 | 62 | 22 | 3.1 | 2.5 | 0.9 |
Contingent workers, estimate 3 | 5,858 | 126 | 86 | 45 | 2.2 | 1.5 | 0.8 |
Noncontingent workers | 147,473 | 1,483 | 904 | 656 | 1.0 | 0.6 | 0.4 |
Alternative employment arrangement(2) | |||||||
Independent contractors | 10,614 | 597 | 375 | 264 | 5.6 | 3.5 | 2.5 |
On-call workers | 2,579 | 68 | 47 | 21 | 2.6 | 1.8 | 0.8 |
Temporary help agency workers | 1,356 | 46 | 35 | 11 | 3.4 | 2.6 | 0.8 |
Workers provided by contract firms | 933 | 23 | 18 | 5 | 2.4 | 1.9 | 0.5 |
Workers with traditional arrangements | 137,853 | 882 | 521 | 401 | 0.6 | 0.4 | 0.3 |
Educational attainment | |||||||
Total, 25 years and over | 134,277 | 1,443 | 917 | 592 | 1.1 | 0.7 | 0.4 |
Less than a high school diploma | 9,578 | 64 | 53 | 12 | 0.7 | 0.6 | 0.1 |
High school graduates, no college | 33,616 | 284 | 230 | 58 | 0.8 | 0.7 | 0.2 |
Some college or associate degree | 36,088 | 375 | 265 | 126 | 1.0 | 0.7 | 0.3 |
Bachelor’s degree and higher | 54,994 | 720 | 370 | 396 | 1.3 | 0.7 | 0.7 |
Bachelor’s degree only | 33,749 | 403 | 196 | 229 | 1.2 | 0.6 | 0.7 |
Advanced degree | 21,246 | 317 | 173 | 167 | 1.5 | 0.8 | 0.8 |
Notes: (1) Based on usual hours at all jobs combined. Full time is 35 hours or more per week; part time is less than 35 hours. (2) This refers to the sole or main job; electronically mediated work may be done for the main job, a second job, or other additional work for pay. These are estimates of electronically mediated workers as recoded by BLS. Some people did electronically mediated work both in person and online. Estimates for the race groups (White, Black or African American, and Asian) do not sum to totals because data are not presented for all races. People whose ethnicity is identified as Hispanic or Latino may be of any race. An Excel version of this table is available at https://www.bls.gov/cps/electronically-mediated-employment.htm. Source: Contingent Worker Supplement to the Current Population Survey, U.S. Bureau of Labor Statistics. |
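As a rough check on the headline estimates above, the sketch below recomputes the shares of total employment from the counts in table 3 and, by inclusion-exclusion, the implied number of people who did electronically mediated work both in person and online. The overlap figure is only an implication of the rounded published counts, not a number BLS publishes.

```python
# Counts in thousands, transcribed from table 3.
total_employed = 153_331
emw_total, emw_in_person, emw_online = 1_609, 990, 701

# Shares of total employment; published figures are rounded to 0.1 percent.
for label, count in [("total", emw_total), ("in person", emw_in_person), ("online", emw_online)]:
    print(f"{label}: {100 * count / total_employed:.1f} percent of the employed")
# total: 1.0, in person: 0.6, online: 0.5

# Because the total is an unduplicated count of people who did either type of
# work, inclusion-exclusion gives the implied overlap (people who did both).
print(f"did both: roughly {emw_in_person + emw_online - emw_total} thousand")
```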
Electronically mediated workers were slightly more likely to be men than women, reflecting the fact that, overall, a higher percentage of the employed were men. (See table 4.) Compared with workers overall, electronically mediated workers were more likely to be in the prime-working-age category (25 to 54) and less likely to be in the oldest age category (55 and over). They were also more likely than workers overall to work part time.19
Characteristic | Total employed | Electronically mediated workers, total | Electronically mediated workers, in person | Electronically mediated workers, online |
---|---|---|---|---|
Total, 16 years and over (in thousands) | 153,331 | 1,609 | 990 | 701 |
Percent of total | 100.0 | 100.0 | 100.0 | 100.0 |
Men | 53.2 | 54.1 | 53.9 | 52.7 |
Women | 46.8 | 45.9 | 46.1 | 47.3 |
Age | ||||
16 to 24 | 12.4 | 10.3 | 7.4 | 15.6 |
25 to 54 | 64.4 | 71.2 | 72.6 | 69.5 |
25 to 34 | 22.2 | 24.9 | 24.1 | 26.3 |
35 to 44 | 20.9 | 22.1 | 22.5 | 20.9 |
45 to 54 | 21.4 | 24.3 | 25.9 | 22.4 |
55 and over | 23.1 | 18.4 | 20.1 | 14.9 |
55 to 64 | 17.1 | 13.6 | 15.1 | 10.8 |
65 and over | 6.0 | 4.8 | 4.9 | 4.1 |
Race and Hispanic or Latino ethnicity | ||||
White | 78.7 | 74.6 | 69.9 | 84.0 |
Black or African American | 12.1 | 17.1 | 23.0 | 6.9 |
Asian | 5.9 | 5.8 | 4.6 | 7.0 |
Hispanic or Latino ethnicity | 16.6 | 16.4 | 18.5 | 13.4 |
Usual full- and part-time status(1) | ||||
Full-time workers | 81.7 | 72.4 | 69.4 | 78.1 |
Part-time workers | 18.3 | 27.6 | 30.6 | 21.9 |
Class of worker(2) | ||||
Wage and salary workers | 90.1 | 62.8 | 62.4 | 62.6 |
Private industries | 75.8 | 59.1 | 58.3 | 58.9 |
Government | 14.3 | 3.8 | 4.2 | 3.8 |
Self-employed workers | 9.9 | 37.2 | 37.6 | 37.4 |
Self-employed workers, incorporated | 3.6 | 7.3 | 7.1 | 8.8 |
Self-employed workers, unincorporated | 6.2 | 29.8 | 30.5 | 28.6 |
Industry(2) | ||||
Agriculture and related industries | 1.6 | 0.0 | 0.1 | 0.0 |
Mining, quarrying, and oil and gas extraction | 0.5 | 0.0 | 0.0 | 0.0 |
Construction | 6.8 | 1.2 | 1.3 | 0.9 |
Manufacturing | 10.4 | 1.1 | 0.5 | 1.9 |
Wholesale trade | 2.2 | 0.9 | 0.9 | 0.8 |
Retail trade | 10.5 | 5.9 | 5.6 | 7.1 |
Transportation and utilities | 5.1 | 21.8 | 35.0 | 1.9 |
Information | 1.9 | 4.1 | 1.4 | 7.5 |
Financial activities | 6.9 | 3.3 | 2.5 | 4.0 |
Professional and business services | 12.1 | 31.0 | 16.4 | 51.2 |
Education and health services | 23.1 | 16.3 | 19.2 | 12.9 |
Leisure and hospitality | 9.3 | 6.4 | 6.5 | 7.1 |
Other services | 4.9 | 7.2 | 10.0 | 4.2 |
Public administration | 4.6 | 0.6 | 0.6 | 0.6 |
Occupation(2) | ||||
Management, professional, and related occupations | 40.7 | 44.7 | 26.4 | 71.9 |
Management, business, and financial operations occupations | 16.9 | 14.5 | 11.8 | 18.5 |
Professional and related occupations | 23.8 | 30.2 | 14.5 | 53.5 |
Service occupations | 17.2 | 16.4 | 24.8 | 5.0 |
Sales and office occupations | 21.3 | 14.6 | 12.2 | 18.3 |
Sales and related occupations | 9.9 | 6.8 | 5.8 | 8.8 |
Office and administrative support occupations | 11.4 | 7.8 | 6.4 | 9.5 |
Natural resources, construction, and maintenance occupations | 9.2 | 2.4 | 2.1 | 2.7 |
Farming, fishing, and forestry occupations | 0.8 | 0.3 | 0.0 | 0.6 |
Construction and extraction occupations | 5.2 | 1.3 | 0.7 | 2.1 |
Installation, maintenance, and repair occupations | 3.2 | 0.9 | 1.4 | 0.0 |
Production, transportation, and material moving occupations | 11.6 | 21.9 | 34.6 | 2.1 |
Production occupations | 5.7 | 0.4 | 0.4 | 0.4 |
Transportation and material moving occupations | 5.9 | 21.5 | 34.2 | 1.7 |
Contingent worker status(2) | ||||
Contingent workers, estimate 1 | 1.3 | 1.3 | 1.3 | 1.2 |
Contingent workers, estimate 2 | 1.6 | 4.9 | 6.2 | 3.1 |
Contingent workers, estimate 3 | 3.8 | 7.8 | 8.7 | 6.5 |
Noncontingent workers | 96.2 | 92.2 | 91.3 | 93.5 |
Alternative employment arrangement(2) | ||||
Independent contractors | 6.9 | 37.1 | 37.8 | 37.7 |
On-call workers | 1.7 | 4.2 | 4.8 | 3.0 |
Temporary help agency workers | 0.9 | 2.8 | 3.5 | 1.5 |
Workers provided by contract firms | 0.6 | 1.4 | 1.8 | 0.6 |
Workers with traditional arrangements | 89.9 | 54.8 | 52.6 | 57.2 |
Educational attainment | ||||
Total, 25 years and over | 100.0 | 100.0 | 100.0 | 100.0 |
Less than a high school diploma | 7.1 | 4.5 | 5.7 | 2.0 |
High school graduates, no college | 25.0 | 19.7 | 25.1 | 9.8 |
Some college or associate degree | 26.9 | 26.0 | 28.9 | 21.3 |
Bachelor’s degree and higher | 41.0 | 49.9 | 40.3 | 67.0 |
Bachelor’s degree only | 25.1 | 27.9 | 21.4 | 38.7 |
Advanced degree | 15.8 | 22.0 | 18.9 | 28.3 |
Notes: (1) Based on usual hours at all jobs combined. Full time is 35 hours or more per week; part time is less than 35 hours. (2) This refers to the sole or main job; electronically mediated work may be done for the main job, a second job, or other additional work for pay. These are estimates of electronically mediated workers as recoded by BLS. Some people did electronically mediated work both in person and online. Estimates for the race groups (White, Black or African American, and Asian) do not sum to totals because data are not presented for all races. People whose ethnicity is identified as Hispanic or Latino may be of any race. An Excel version of this table is available at https://www.bls.gov/cps/electronically-mediated-employment.htm. Source: Contingent Worker Supplement to the Current Population Survey, U.S. Bureau of Labor Statistics. |
Blacks or African Americans accounted for 17 percent of electronically mediated workers, higher than their share of overall employment (12 percent). By contrast, Whites made up 75 percent of electronically mediated workers, slightly lower than their share of workers overall (79 percent). Hispanics or Latinos made up 16 percent of electronically mediated workers, and Asians accounted for 6 percent. Blacks were overrepresented among in-person electronically mediated workers (23 percent), while Whites were overrepresented among online workers (84 percent).
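One simple way to quantify the overrepresentation and underrepresentation described above is to divide a group’s share of electronically mediated workers by its share of overall employment (both taken from table 4); a ratio above 1.0 indicates overrepresentation. This is an illustrative calculation, not a published BLS measure.

```python
# Representation ratios computed from the percent shares in table 4.
shares = {
    # group: (share of all employed, share of in-person EMW, share of online EMW)
    "White": (78.7, 69.9, 84.0),
    "Black or African American": (12.1, 23.0, 6.9),
}

for group, (overall, in_person, online) in shares.items():
    print(f"{group}: in person {in_person / overall:.2f}, online {online / overall:.2f}")
# White: in person 0.89, online 1.07
# Black or African American: in person 1.90, online 0.57
```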
Educational attainment data are restricted to those age 25 and over because most people have completed their education by that age. Compared with workers overall, people who did electronically mediated work were more likely to have a bachelor’s degree or higher. This was driven by people who did their tasks entirely online; 67 percent of online electronically mediated workers had a bachelor’s degree or higher.
Self-employed workers were more likely than wage and salary workers to do electronically mediated work (4 percent versus 1 percent). (See table 3.) Five percent of self-employed workers whose businesses were unincorporated did such work, as did 2 percent of the self-employed with incorporated businesses.
By industry, workers in transportation and utilities (on their main job) were the most likely to have done electronically mediated work; 5 percent of workers in this industry did such work. Those employed in professional and business services, information, and other services were also more likely than workers overall to do electronically mediated work, at 3 percent, 2 percent, and 2 percent, respectively. Workers in transportation and utilities and in other services were more likely to do in-person work, while those in professional and business services and in information were more likely to do online work.
Workers in the four alternative employment arrangements measured in the CWS—independent contractors, temporary help agency workers, on-call workers, and workers provided by contract firms—were more likely than workers in traditional arrangements to have done electronically mediated work. Independent contractors were the most likely to do electronically mediated work—6 percent did so in May 2017, compared with 3 percent of temporary help agency workers, 3 percent of on-call workers, and 2 percent of workers provided by contract firms. By contrast, less than 1 percent of workers in traditional arrangements were electronically mediated workers.
BLS should not again attempt to collect data about electronically mediated work using the four new questions fielded in the May 2017 CWS. If BLS were to collect data about electronically mediated work in the future, the questions would need to be substantially revised. It may simply be that the concepts are too complicated for four questions to properly identify all the information BLS was attempting to measure.
BLS recognizes that a number of steps could be taken to improve future questions on all subjects, not just those concerning electronically mediated work. These steps address question development, cognitive testing, and interviewer training.
For complicated questions that are intended to measure a small portion of the population, it is important to test participants who do not fit the characteristic of interest as well as those who do. For the laboratory testing, BLS made the strategic decision to recruit people who had done in-person electronically mediated work or who were in occupations with a large number of electronically mediated workers. These laboratory participants may have been more familiar with the concepts than people in occupations with little or no electronically mediated work. For example, even if they do no electronically mediated work themselves, many taxi drivers may be familiar with such work because they know about Uber and Lyft drivers. They may be able to answer the questions more easily than people in occupations that have few electronically mediated workers, such as schoolteachers or firefighters. For the mTurk testing, all participants were at least somewhat familiar with electronically mediated work because they were using the mTurk platform. Thus, mTurk participants were almost certainly more familiar with electronically mediated work than a broader group of respondents would have been. A larger sample with a wider variety of occupations and work arrangements might have provided greater insight into the potential for false positives.
Conducting multiple rounds of cognitive testing is a recommended best practice, particularly when questions involve complicated concepts or when revisions are made during testing. Because of time and funding constraints, BLS committed to a tight timeframe and conducted only one round of cognitive testing. The questions might have been improved if, after the cognitive interview results were analyzed and the questions revised, additional interviews had been conducted to evaluate the revisions.
Interviewers clearly could have benefited from additional training. Because CPS interviewers are located all over the country, in-person classroom training would be prohibitively expensive. Less expensive ways to improve training include the following:
· Additional time for the self-study material
· Computer-based training modules with graded quizzes
· Online training sessions
· Web-based training or teleconferences
· Practice interviews
CPS interviewers have a keen sense of what might be confusing to other interviewers and to respondents. Meeting with interviewers before the questionnaire is finalized, in order to identify possible problems with the questions, could improve both the questionnaire and the training material. It could also have the added benefit of increasing interviewer engagement.
Electronically mediated work is an increasingly studied area. Since the 2017 CWS was fielded, new findings have emerged that could have been helpful in the question design process. For example, when attempting to measure “platform jobs,” Statistics Finland found that it needed to include company names in the question in order to obtain accurate responses.20 BLS should keep abreast of new research, continue to work with outside experts, and leverage the efforts of others when designing questions.
Each record was reviewed by multiple staff members. In order to determine whether responses were correct, they looked at the following pieces of information:
· Verbatim descriptions of industry, occupation, and duties on the job
· Company name
· Class of worker (whether wage and salary worker or self-employed)
· Whether the person is an independent contractor, an on-call worker, a temporary help agency worker, or a worker provided by a contract firm on the main job
· The number of hours a person usually works
Answers were not recoded if there was not enough information to determine, with a high probability, that a person was not doing electronically mediated work.
Type of work | Less likely to have done electronically mediated work | More likely to have done electronically mediated work |
---|---|---|
In-person electronically mediated work | If the person works: · In a management occupation · As a real estate agent · In sales · In manufacturing or mining · In an occupation that requires extensive infrastructure to provide the service · For the federal, state, or local government | If the person: · Is a driver or delivery person · Works in home healthcare · Does chores or other short-term work · Works in an occupation where customers typically only need a worker for a short or fixed period · Usually works few hours per week |
| If the person: · Does NOT work for a business that connects people with clients through a website or mobile app OR · Is NOT paid by or through a business’s website or mobile app that connects people with clients or customers OR · Is NOT doing in-person work | If the person: · Works for a business that connects people with clients or customers through a website or mobile app AND · Is paid by or through the business that owns the website or mobile app AND · Is doing in-person work |
Electronically mediated work that is done entirely online | If the person works: · In a management occupation · As a real estate agent · In sales · In manufacturing or mining · As a driver or delivery person · For the federal, state, or local government | If the person: · Does data entry, answers surveys, or assesses internet sites · Does copyediting, translating, or graphic design · Does data analysis or programming · Does digital marketing or social media analysis · Does online tutoring or course development · Works in an occupation that requires no face-to-face interaction · Usually works few hours per week |
| If the person: · Does NOT work for a business that connects people with clients through a website or mobile app OR · Is NOT paid by or through a business’s website or mobile app that connects people with clients or customers OR · Is NOT doing work entirely online | If the person: · Works for a business that connects people with clients or customers through a website or mobile app AND · Is paid by or through the business that owns the website or mobile app AND · Is doing work entirely online |
Source: U.S. Bureau of Labor Statistics. |
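The criteria in the table above were applied by staff members reviewing each record by hand, not by a computer program. Purely as an illustration, the sketch below shows how the in-person indicators might be encoded as an automated screen; the function name, field names, and keyword lists are hypothetical assumptions and are not part of the BLS review procedure.

```python
# Hypothetical encoding of the "more likely"/"less likely" in-person indicators
# from the table above. BLS's actual review was done manually by multiple staff
# members using confidential verbatim responses; nothing here is BLS code.

IN_PERSON_UNLIKELY = ("management", "real estate", "sales", "manufacturing",
                      "mining", "government")
IN_PERSON_LIKELY = ("driver", "delivery", "home health", "chores")

def screen_in_person(occupation_text: str, works_for_platform: bool,
                     paid_through_platform: bool, work_is_in_person: bool) -> str:
    """Return a rough indication of whether an in-person 'yes' looks plausible."""
    text = occupation_text.lower()
    # All three conditions in the right-hand column must hold for the response
    # to look like genuine in-person electronically mediated work.
    if not (works_for_platform and paid_through_platform and work_is_in_person):
        return "less likely (fails platform, payment, or in-person conditions)"
    if any(term in text for term in IN_PERSON_UNLIKELY):
        return "less likely (occupation rarely electronically mediated)"
    if any(term in text for term in IN_PERSON_LIKELY):
        return "more likely"
    return "ambiguous (would be left as reported)"

# Example: a ride-share driver paid through the app
print(screen_in_person("taxi driver / rideshare", True, True, True))  # more likely
```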
Characteristic | Total employed | Recoded electronically mediated workers, total | Recoded, in person | Recoded, online | Collected electronically mediated workers, total | Collected, in person | Collected, online |
---|---|---|---|---|---|---|---|
Total, 16 years and over | 153,331 | 1,609 | 990 | 701 | 5,057 | 3,021 | 2,969 |
Men | 81,545 | 870 | 534 | 370 | 2,650 | 1,647 | 1,500 |
Women | 71,785 | 739 | 456 | 331 | 2,407 | 1,374 | 1,469 |
Age | |||||||
16 to 24 | 19,054 | 166 | 73 | 110 | 471 | 259 | 306 |
25 to 54 | 98,801 | 1,146 | 718 | 488 | 3,489 | 2,078 | 2,042 |
25 to 34 | 33,991 | 401 | 239 | 184 | 1,170 | 700 | 738 |
35 to 44 | 32,065 | 355 | 223 | 146 | 1,207 | 696 | 687 |
45 to 54 | 32,745 | 390 | 257 | 157 | 1,112 | 682 | 617 |
55 and over | 35,476 | 297 | 199 | 104 | 1,098 | 683 | 622 |
55 to 64 | 26,236 | 219 | 150 | 75 | 808 | 481 | 459 |
65 and over | 9,240 | 77 | 49 | 29 | 290 | 202 | 163 |
Race and Hispanic or Latino ethnicity | |||||||
White | 120,638 | 1,200 | 692 | 589 | 3,983 | 2,354 | 2,398 |
Black or African American | 18,588 | 276 | 228 | 48 | 664 | 406 | 365 |
Asian | 9,110 | 93 | 45 | 49 | 279 | 180 | 135 |
Hispanic or Latino ethnicity | 25,525 | 265 | 183 | 94 | 821 | 520 | 457 |
Usual full- and part-time status(1) | |||||||
Full-time workers | 125,240 | 1,165 | 687 | 548 | 4,139 | 2,351 | 2,559 |
Part-time workers | 28,091 | 444 | 303 | 154 | 918 | 670 | 410 |
Class of worker(2) | |||||||
Wage and salary workers | 138,183 | 1,011 | 618 | 439 | 3,948 | 2,196 | 2,432 |
Private industries | 116,300 | 950 | 577 | 413 | 3,374 | 1,920 | 2,061 |
Government | 21,884 | 61 | 41 | 27 | 574 | 277 | 371 |
Self-employed workers | 15,147 | 598 | 372 | 262 | 1,109 | 824 | 538 |
Self-employed workers, incorporated | 5,575 | 118 | 70 | 61 | 370 | 265 | 196 |
Self-employed workers, unincorporated | 9,572 | 480 | 302 | 201 | 739 | 560 | 341 |
Industry(2) | |||||||
Agriculture and related industries | 2,498 | 1 | 1 | 0 | 13 | 5 | 10 |
Mining, quarrying, and oil and gas extraction | 775 | 0 | 0 | 0 | 10 | 1 | 10 |
Construction | 10,484 | 20 | 13 | 6 | 223 | 156 | 116 |
Manufacturing | 15,984 | 18 | 5 | 13 | 302 | 140 | 193 |
Wholesale trade | 3,383 | 15 | 9 | 6 | 102 | 66 | 62 |
Retail trade | 16,131 | 96 | 56 | 50 | 501 | 330 | 260 |
Transportation and utilities | 7,773 | 351 | 346 | 13 | 495 | 420 | 152 |
Information | 2,894 | 66 | 14 | 53 | 150 | 52 | 126 |
Financial activities | 10,640 | 52 | 25 | 28 | 483 | 291 | 285 |
Professional and business services | 18,528 | 499 | 162 | 359 | 1,006 | 495 | 727 |
Education and health services | 35,384 | 262 | 190 | 90 | 932 | 510 | 589 |
Leisure and hospitality | 14,244 | 104 | 64 | 50 | 317 | 215 | 150 |
Other services | 7,517 | 115 | 99 | 30 | 299 | 227 | 142 |
Public administration | 7,095 | 10 | 6 | 4 | 224 | 112 | 149 |
Occupation(2) | |||||||
Management, professional, and related occupations | 62,378 | 720 | 261 | 505 | 2,499 | 1,238 | 1,711 |
Management, business, and financial operations occupations | 25,866 | 234 | 117 | 130 | 1,207 | 667 | 761 |
Professional and related occupations | 36,513 | 486 | 144 | 375 | 1,292 | 572 | 950 |
Service occupations | 26,405 | 264 | 245 | 35 | 569 | 474 | 221 |
Sales and office occupations | 32,584 | 235 | 121 | 128 | 1,220 | 672 | 776 |
Sales and related occupations | 15,134 | 109 | 57 | 62 | 589 | 387 | 333 |
Office and administrative support occupations | 17,450 | 125 | 64 | 67 | 631 | 285 | 443 |
Natural resources, construction, and maintenance occupations | 14,104 | 39 | 20 | 19 | 225 | 154 | 122 |
Farming, fishing, and forestry occupations | 1,222 | 4 | 0 | 4 | 8 | 1 | 8 |
Construction and extraction occupations | 7,985 | 21 | 7 | 14 | 118 | 77 | 75 |
Installation, maintenance, and repair occupations | 4,896 | 14 | 14 | 0 | 99 | 76 | 39 |
Production, transportation, and material moving occupations | 17,860 | 352 | 343 | 15 | 543 | 483 | 139 |
Production occupations | 8,785 | 7 | 4 | 3 | 77 | 58 | 32 |
Transportation and material moving occupations | 9,075 | 345 | 339 | 12 | 466 | 425 | 107 |
Contingent worker status(2) | |||||||
Contingent workers, estimate 1 | 1,958 | 21 | 13 | 8 | 85 | 47 | 59 |
Contingent workers, estimate 2 | 2,511 | 79 | 62 | 22 | 158 | 100 | 95 |
Contingent workers, estimate 3 | 5,858 | 126 | 86 | 45 | 285 | 167 | 180 |
Noncontingent workers | 147,473 | 1,483 | 904 | 656 | 4,772 | 2,854 | 2,790 |
Alternative employment arrangement(2) | |||||||
Independent contractors | 10,614 | 597 | 375 | 264 | 980 | 750 | 489 |
On-call workers | 2,579 | 68 | 47 | 21 | 175 | 113 | 88 |
Temporary help agency workers | 1,356 | 46 | 35 | 11 | 63 | 40 | 43 |
Workers provided by contract firms | 933 | 23 | 18 | 5 | 55 | 38 | 20 |
Workers with traditional arrangements | 137,853 | 882 | 521 | 401 | 3,798 | 2,093 | 2,329 |
Educational attainment | |||||||
Total, 25 years and over | 134,277 | 1,443 | 917 | 592 | 4,586 | 2,761 | 2,663 |
Less than a high school diploma | 9,578 | 64 | 53 | 12 | 213 | 172 | 78 |
High school graduates, no college | 33,616 | 284 | 230 | 58 | 837 | 575 | 407 |
Some college or associate degree | 36,088 | 375 | 265 | 126 | 1,282 | 741 | 753 |
Bachelor’s degree and higher | 54,994 | 720 | 370 | 396 | 2,255 | 1,274 | 1,424 |
Bachelor’s degree only | 33,749 | 403 | 196 | 229 | 1,387 | 782 | 908 |
Advanced degree | 21,246 | 317 | 173 | 167 | 868 | 492 | 516 |
Notes: (1) Based on usual hours at all jobs combined. Full time is 35 hours or more per week; part time is less than 35 hours. (2) This refers to the sole or main job; electronically mediated work may be done for the main job, a second job, or other additional work for pay. Some people did electronically mediated work both in person and online. Estimates for the race groups (White, Black or African American, and Asian) do not sum to totals because data are not presented for all races. People whose ethnicity is identified as Hispanic or Latino may be of any race. An Excel version of this table is available at https://www.bls.gov/cps/electronically-mediated-employment.htm. Source: Contingent Worker Supplement to the Current Population Survey, U.S. Bureau of Labor Statistics. |
Current Population Survey staff, "Electronically mediated work: new questions in the Contingent Worker Supplement," Monthly Labor Review, U.S. Bureau of Labor Statistics, September 2018, https://doi.org/10.21916/mlr.2018.24
1 The Current Population Survey (CPS) is jointly sponsored by the U.S. Bureau of Labor Statistics (BLS) and the U.S. Census Bureau and is best known as the source of the national unemployment rate. Statistics from the CPS, widely used by policymakers and researchers, are among the country’s most timely economic indicators. The CPS provides extensive information about the employment and unemployment status of the population, and the survey data can be broken out by a variety of demographic characteristics, such as race, ethnicity, educational attainment, disability status, age, and gender. In addition, the CPS is a primary source of socioeconomic data about the labor force, including industry, occupation, hours of work, and earnings. In most months of the year, the monthly CPS questions are followed by supplementary questions about a particular topic. The Contingent Worker Supplement (CWS) is one such supplement. The Department of Labor’s Chief Evaluation Office sponsored the May 2017 CWS.
2 In order to limit respondent burden, surveys are often designed so that respondents are asked questions based on how they answered earlier questions. In the CPS, for example, people who said they have a job are asked a series of questions about their employment, questions that are not asked of people without jobs. The ways respondents are routed through the survey questions are referred to as “skip patterns.” Also, the wording of particular questions is often conditional on information obtained earlier in the survey. For instance, a question about a particular household member may include a “fill” of the household member’s name.
3 In addition to electronically mediated workers and platform workers, other terms used to describe people who do this type of work include the following: online gig workers, workers providing services through digital matching firms, e-lancers, sharing economy workers, on-demand economy workers, digitally matched workers, peer-to-peer economy workers, collaborative economy workers, and electronically intermediated workers.
4 For example, see Devin Fidler, “Work, interrupted: the new labor economics of platforms,” Institute for the Future, 2016, http://www.iftf.org/fileadmin/user_upload/downloads/wfi/IFTF_Work-Interrupted_FullReport.pdf; Jonathan V. Hall and Alan B. Krueger, “An analysis of the labor market for Uber’s driver-partners in the United States,” Princeton University, Industrial Relations Section, Working Paper 587, January 2015, https://dataspace.princeton.edu/jspui/handle/88435/dsp010z708z67d; Jane Dokko, Megan Mumford, and Diane Whitmore Schanzenbach, “Workers and the online gig economy,” The Hamilton Project, December 2015, http://www.hamiltonproject.org/papers/workers_and_the_online_gig_economy; Sarah A. Donovan, David H. Bradley, and Jon O. Shimabukuro, “What does the gig economy mean for workers?” Congressional Research Service, R44365, February 5, 2016, https://fas.org/sgp/crs/misc/R44365.pdf; Seth D. Harris and Alan B. Krueger, “A proposal for modernizing labor laws for twenty-first-century work: the ‘independent worker,’” Discussion Paper 2015–10, The Hamilton Project, December 2015, http://www.hamiltonproject.org/assets/files/modernizing_labor_laws_for_twenty_first_century_work_krueger_harris.pdf; Sara Horowitz and Fabio Rosati, “53 million Americans are freelancing, new survey finds,” Freelancers Union, September 4, 2014, https://blog.freelancersunion.org/2014/09/04/53million/; Rudy Telles, Jr., “Digital matching firms: a new definition in the ‘sharing economy’ space,” ESA Issue Brief no. 01-16 (Economics and Statistics Administration, June 3, 2016); and Diana Farrell and Fiona Greig, “Paychecks, paydays, and the online platform economy: big data on income volatility,” J.P. Morgan Chase Institute, February 2016, https://institute.jpmorganchase.com/content/dam/jpmc/jpmorgan-chase-and-co/institute/pdf/jpmc-institute-volatility-2-report.pdf.
5 See Hall and Krueger, “An analysis of the labor market for Uber’s driver-partners in the United States”; and Lawrence Mishel, “Uber and the labor market: Uber drivers’ compensation, wages, and the scale of Uber and the gig economy,” Economic Policy Institute, May 15, 2018, https://www.epi.org/publication/uber-and-the-labor-market-uber-drivers-compensation-wages-and-the-scale-of-uber-and-the-gig-economy/.
6 The CWS is asked of employed people who are not unpaid family workers on their main job. The survey also includes a few respondents who were not employed. These respondents are asked about their last job.
7 For example, see Katharine G. Abraham and Ashley Amaya, “Probing for informal work activity,” National Bureau of Economic Research, Working Paper 24880, August 2018, http://www.nber.org/papers/w24880; and Anat Bracha and Mary A. Burke, “Who counts as employed? Informal work, employment status, and labor market slack,” Federal Reserve Bank of Boston, Working Paper 16-29, December 2016, https://www.bostonfed.org/publications/research-department-working-paper/2016/who-counts-as-employed-informal-work-employment-status-and-labor-market-slack.aspx.
8 Respondents who have indicated that someone in the household has a farm or business are asked “LAST WEEK, did you do ANY work for either pay or profit?”
9 In order to investigate possible measurement error in the classification of labor force status in the CPS and other surveys using labor force questions similar to those in the CPS, BLS researchers investigated data on income-generating activities from the American Time Use Survey (ATUS), data that are not available in the CPS. The evaluation of the ATUS data indicated that, while there may be some misclassification of workers in surveys that use questions similar to the CPS labor force questions, the effect on the total employment estimate is likely small. See Mary Dorinda Allard and Anne E. Polivka, "Measuring labor market activity today: are the words work and job too limiting for surveys?," Monthly Labor Review, U.S. Bureau of Labor Statistics, November 2018, https://doi.org/10.21916/mlr.2018.26.
10 For example, see Ajay Agrawal, John Horton, Nicola Lacetera, and Elizabeth Lyons, “Digitization and the contract labor market: a research agenda,” in Avi Goldfarb, Shane M. Greenstein, and Catherine E. Tucker, eds., Economic Analysis of the Digital Economy (Chicago, IL: University of Chicago Press, 2015), pp. 219–50; Valerio De Stefano, “The rise of the ‘just-in-time workforce’: on-demand work, crowdwork and labor protection in the ‘gig economy,’” Comparative Labor Law and Policy Journal, vol. 37, no. 3, June 2016, pp. 471–504; Alek Felstiner, “Working the crowd: employment and labor law in the crowdsourcing industry,” Berkeley Journal of Employment and Labor Law, vol. 32, no. 1, 2011, pp. 143–203; and Gordon Burtch, Seth Carnahan, and Brad N. Greenwood, “Can you gig it? An empirical examination of the gig-economy and entrepreneurial activity,” Ross School of Business Paper no. 1308, March 2016, University of Michigan, https://papers.ssrn.com/sol3/papers.cfm?abstract_id=2744352.
11 For example, see Miriam A. Cherry, “A taxonomy of virtual work,” Georgia Law Review, vol. 45, no. 4, summer 2011, p. 968, https://heinonline.org/HOL/Page?collection=journals&handle=hein.journals/geolr45&id=1022; Karën Fort, Gilles Adda, and K. Bretonnel Cohen, “Amazon Mechanical Turk: gold mine or coal mine?” Computational Linguistics, vol. 37, no. 2, June 2011, pp. 413–20; David Martin, Benjamin V. Hanrahan, Jacki O’Neill, and Neha Gupta, “Being a turker,” in Proceedings of the 17th ACM Conference on Computer Supported Cooperative Work and Social Computing, pp. 224–35, 2014; and Kotaro Hara, Abigail Adams, Kristy Milland, Saiph Savage, Chris Callison-Burch, and Jeffrey P. Bigham, “A data-driven analysis of workers’ earnings on Amazon Mechanical Turk,” in Proceedings of the 2018 CHI Conference on Human Factors in Computer Systems, 2018.
12 An extensive list of companies involved in electronically mediated work that existed when the questions were being developed can be found in Rudy Telles, Jr., “Digital matching firms.”
13 For Office of Management and Budget standards and guidelines for statistical surveys, see https://obamawhitehouse.archives.gov/sites/default/files/omb/inforeg/statpolicy/standards_stat_surveys.pdf.
14 For Office of Management and Budget standards and guidelines for cognitive interviews, see https://obamawhitehouse.archives.gov/sites/default/files/omb/inforeg/directive2/final_addendum_to_stat_policy_dir_2.pdf.
15 For the final cognitive testing report, see appendix H of the Office of Management and Budget clearance for the 2017 CWS.
16 The data collection instrument allowed “yes” answers for both questions.
17 Households selected for the CPS are in the sample for 8 months total over a 16-month period. Detailed information about multiple jobs is collected only in the 4th and 8th months in which households are in the sample.
18 The full list of keywords follows: Clickworker, Click Worker, Crowdflower, Crowd Flower, Crowdsource, Crowd Source, Favor, Fiverr, freelance, Grubhub, Grub Hub, Handy, Instacart, Insta Cart, Lyft, Mechanical Turk, Microworkers, Micro Workers, Minijob, Mini Job, Onespace, One Space, Postmates, Post Mates, Rapidworker, rideshare, ride share, ridesharing, ride sharing, Shorttask, Short Task, Taskrabbit, Task Rabbit, taxi, Turk, Uber, Upwork, Washeo, Washio, and .com. The search was not case sensitive.
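As described in this footnote, the keyword review amounted to a case-insensitive substring search of verbatim text fields. A minimal sketch of such a screen follows; the function and variable names are illustrative, and only a subset of the full keyword list is shown.

```python
# Case-insensitive keyword screen of verbatim responses (illustrative only;
# the keyword list here is a subset of the full list in footnote 18).
KEYWORDS = ["uber", "lyft", "taskrabbit", "task rabbit", "mechanical turk",
            "rideshare", "ride share", ".com"]

def flag_verbatim(text: str) -> bool:
    """Return True if any keyword appears in the verbatim response."""
    lowered = text.lower()
    return any(keyword in lowered for keyword in KEYWORDS)

print(flag_verbatim("Drives for UBER on weekends"))   # True
print(flag_verbatim("Elementary school teacher"))     # False
```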
19 Part-time workers are defined as those who usually work less than 35 hours per week at all jobs combined.
20 Hanna Sutela, “Platform jobs are here to stay—how to measure them?” Statistics Finland, April 17, 2018, http://www.stat.fi/tietotrendit/blogit/2018/platform-jobs-are-here-to-stay-how-to-measure-them/.