Faculty research on AI and invisible labor honored for critical real-world impact
Fri 11.12.21 / Madelaine Millar
If you work for a salary or an hourly wage, you get paid for all the tasks you do during work hours—answering emails, making sure your paycheck is deposited correctly, figuring out what task you’ll complete next. You may even be getting paid to read this article right now as you work. But if you’re a crowdsourced worker on a platform like Mechanical Turk cleaning and labeling AI training data, the critical tasks that keep your work, well, working—but that aren’t the actual job of cleaning and labeling data—are unpaid. For a long time, they weren’t even measured.
This invisible labor by crowdsourced AI workers is the crux of new research by Khoury College of Computer Sciences assistant professor Saiph Savage, her doctoral student researcher (and first author) Carlos Toxtli, and Microsoft researcher Siddharth Suri. Their work was recently honored with the Impact Award at the 24th ACM Conference on Computer-Supported Cooperative Work and Social Computing (CSCW), one of the top conferences in the field of human-computer interaction. Their paper, “Quantifying the Invisible Labor in Crowd Work,” examines how much of their time crowdsource workers spend on tasks that are critical to their jobs but go unpaid.
To measure invisible labor, the team built a browser extension that labeled and measured the different tasks being performed on crowdsource platforms. One hundred workers, who varied across axes like skill level, country, language, disability, and working style, installed the extension, and the research team then collected and analyzed the data.
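The paper’s actual tooling isn’t reproduced here, but a minimal sketch suggests how such an extension might classify a worker’s time. Everything below is an illustrative assumption rather than the team’s code: the category names, the URL patterns, and the five-second sampling interval are all invented for the example.

```typescript
// Minimal sketch of a task-logging content script, NOT the researchers'
// actual extension. Categories, URL patterns, and sampling rate are
// hypothetical.

type ActivityCategory =
  | "paid_task"        // working on an accepted task (the only paid time)
  | "searching"        // browsing/filtering the task list (invisible)
  | "payments"         // checking or requesting earnings (invisible)
  | "other_invisible"; // messaging requesters, managing qualifications, etc.

interface ActivityEvent {
  category: ActivityCategory;
  url: string;
  timestampMs: number;
}

// Crude URL-based classifier; a real tool would also inspect page content
// and user interaction, not just the address bar.
function classify(url: string): ActivityCategory {
  if (url.includes("/tasks/accepted")) return "paid_task";
  if (url.includes("/earnings") || url.includes("/dashboard")) return "payments";
  if (url.includes("/tasks?")) return "searching";
  return "other_invisible";
}

const log: ActivityEvent[] = [];

function record(): void {
  log.push({
    category: classify(window.location.href),
    url: window.location.href,
    timestampMs: Date.now(),
  });
}

// Sample the worker's current activity every five seconds while the tab is open.
setInterval(record, 5000);
```

Summing the sampled intervals per category is what makes it possible to separate paid task time from the invisible remainder.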
“We found that it’s a big part of the [total] labor, invisible labor. In fact, it is one-third of all the labor that they do on the platform,” said Toxtli. “In other types of work…you have a fixed amount per hour, independent of how productive you are. For [crowdsource workers], they have to be productive all the time.”
The most common types of invisible labor were related to payments—checking earnings, requesting their pay for hours worked, etc.—and hypervigilance. Because the supply of crowdsource workers often exceeds demand, many of the tasks on offer pay very little, and snapping up well-paid tasks requires constantly watching the platform. Companies aren’t paying workers to perform this hypervigilance, but without it, workers won’t get paid at all.
All this invisible labor has a serious impact on workers’ wages. When the time put into invisible labor was accounted for, the researchers found that many crowdsource workers were making below the US federal minimum wage of $7.25 an hour.
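The arithmetic behind that drop is simple. With illustrative numbers (the $9 rate below is hypothetical, not a figure from the paper), a task rate that looks comfortably above minimum wage falls below it once the unpaid third of the workday is counted:

```typescript
// Illustrative arithmetic only; the pay rate is hypothetical.
const paidRatePerHour = 9.0;  // nominal pay per hour spent on actual tasks
const invisibleShare = 1 / 3; // the study's finding: one-third of total time is unpaid

// The same earnings must now cover ALL hours spent on the platform.
const effectiveWage = paidRatePerHour * (1 - invisibleShare);

console.log(`$${effectiveWage.toFixed(2)}/hour`); // "$6.00/hour" -- under the $7.25 minimum
```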
The Impact Award is granted to papers that demonstrate critical real-world impact, which Savage said has been the goal of this project all along.
“Our goal is that, with this type of tool, we can empower workers to go to policymakers and say ‘hey, a large number of us are impacted by invisible labor. It’s consuming 33 percent of our time, and it’s costing us this amount of money. We need to do something, change needs to happen here,’” explained Savage. “If we truly want to create change for these platforms, positive change, we need to understand what is happening inside.”
In addition to giving crowdsource workers the data they need to push for policy change, the team hopes their research can bring visibility to a part of AI that’s often overlooked, despite being critical to the industry’s success.
“They are invisible workers, because most people don’t know the AI [we] use is powered by them,” said Toxtli. Training a single algorithm takes thousands of pieces of data, each cleaned and labeled by a human being; at least for now, machine learning can’t learn without being powered by people. “They really want to be recognized.”