If you believe moderation is a high-skilled, high-stakes job that presents unique psychological risks to your workforce, you might hire all of those workers as full-time employees. But if you believe that it is a low-skill job that will someday be done primarily by algorithms, you probably would not.

Instead, you would do what Facebook, Google, YouTube, and Twitter have done, and hire companies like Accenture, Genpact, and Cognizant to do the work for you. Leave to them the messy work of finding and training human beings, and of laying them all off when the contract ends. Ask the vendors to hit some just-out-of-reach metric, and let them figure out how to get there.

At Google, contractors like these already represent a majority of its workforce. The system allows tech giants to save billions of dollars a year, while reporting record profits each quarter. Some vendors may turn out to mistreat their workers, threatening the reputation of the tech giant that hired them. But countless more stories will remain hidden behind nondisclosure agreements.

In the meantime, tens of thousands of people around the world go to work each day at an office where taking care of the individual person is always someone else’s job. Where at the highest levels, human content moderators are viewed as a speed bump on the way to an AI-powered future.

Casey Newton on moderation.

Hey y’all, remember that article from a while back about how fucking awful it is to be a Facebook moderator?

Guess what? There’s a Part II. Content warning for the full story, which deals with death, workplace harassment and negligence, PTSD, and animal and child cruelty. Also: delete your fucking Facebook.

Also, big shout-out to the graphic designers at The Verge for the subtly filthy page margins, which… eurgh.