Don’t get me wrong. I love ActivityPub and decentralized social media as much as the next disaffected Web 2.0 neckbeard. But, also… yes, this.
… Lest we forget.
This whole experience1 was wild, and someone far more articulate than me could surely make hay writing some kind of retrospective about:
- clashing online community mores
- emergent user behaviors engendered by commercial social media’s growth-at-all-costs mindset
- the “too-cool-for-you-plebes-but-oh-gods-please-acknowledge-my-greatness” dichotomy in “influencer”2 culture
… and probably many, many more!
This post has been doing the rounds today, about what it’s like to be a Facebook content moderator, and it’s absolutely damning.
It’s not necessarily even the stuff about the sorts of content that moderators are faced with, or the ridiculously inconsistent rules for dealing with it.1 It’s the little indignities of the actual outsourcer, like the underpayment and the ruthless micromanagement of workers. It’s well-known that a lack of autonomy in a workplace causes the kind of stress that’s psychologically damaging, and that that kind of stress is significantly more common in low-wage jobs.2 Compound that with a workplace where people are constantly confronted by traumatic material, and…
- Though they are bad… and also the product of Facebook’s “all things to all people” monopolistic approach to social media, which is a different-but-related issue. [↩]
- High-wage jobs tend to come with a different kind of stress, i.e. one associated with having to deal with complex problems and difficult decisions. This type of stress, while still “stressful”, is nonetheless not associated with long-term psychological damage the way stress originating from a lack of autonomy is. [↩]
I wrote the law that allows sites to be unfettered free speech marketplaces. I wrote that same law, Section 230 of the Communications Decency Act, to provide vital protections to sites that didn’t want to host the most unsavory forms of expression. The goal was to protect the unique ability of the internet to be the proverbial marketplace of ideas while ensuring that mainstream sites could reflect the ethics of society as a whole.
In general, this has been a success — with one glaring exception. I never expected that internet CEOs would fail to understand one simple principle: that an individual endorsing (or denying) the extermination of millions of people, or attacking the victims of horrific crimes or the parents of murdered children, is far more indecent than an individual posting pornography.
Ron Wyden (D-OR) on the horror he hath wrought.
More specifically, Section 230 is the portion of the Act that explicitly states social media platforms (among others) are not publishers and therefore not liable for content posted by users to their sites.
Wyden’s point is that the whole “lulz freeze peach!” crowd is fundamentally wrong; freedom of expression on social media platforms is protected not by the constitution (i.e. the First Amendment), but by legislature. Which, of course, can be changed, limited, or revoked at any time. Without Section 230, everyone who’s ever been the target of, say, a Twitter hate mob could sue Twitter for facilitating it, and the aggregate of all those lawsuits would, fundamentally, put the company out of business.
It’s worth pointing out that Wyden thinks this would be a bad thing. YMMV.
Really interesting look at errors of scale in social media design. Specifically, how things like threat models and attack patterns change as social networks get larger, and how building for “one end” of the scale can seriously fuck you up if your site happens to sit in the other.
Definitely one to sit in my “useful thoughts for a Mastodon admin” pile…
The government-thinking has a secondary appeal to executive teams [of social media sites]. If their site is a country, that makes them the ruling class. It makes the CEO the president (or dictator). And again, squinting, it can kind of feel that way. Running a company, like managing a community, is literally a power trip. You can do things your members can’t, including punishing those members. Power, even tiny power, can be addictive.
But it’s not true. None of it. Your product is not a country. You are not a government. Your CEO is not a president. And, worse, thinking that way is damaging to the community, disastrous for the company, and may just be ruining the world.
Derek Powazek on false equivalences.
I’ve said it before and I’ll say it again: The widespread conflation of private platforms and businesses with public (i.e. government) services and infrastructure is like the Original Sin of late-stage capitalism. This is what causes people to cling desperately to Twitter and Facebook, under the assumption that angrily @ing Jack Dorsey is somehow equivalent to making phone calls to political representatives. This is what causes people to say things like they “believe in” Facebook and “won’t give up on it”, won’t try out new or equivalent services, because they feel some kind of strange, pseudo-patriotism towards the platform. And this is what causes those people to think attitudes like that are somehow valorous.
Spoiler alert: a company is not a government, nor a country, nor a polity. The fact that you think it is, is a lie capitalism has taught you, because the reality is that the sorts of actions that work on governments (e.g. democracy, accountability) don’t work against corporations—who are accountable to their shareholders/board, not their consumers/product—and yet the foundational conceit of the nation-state (specifically, patriotism) is immensely profitable in the sense that it keeps consumers locked into a particular brand…
But the imperative to “connect people” lacks the one ingredient essential for being a good citizen: Treating individual human beings as sacrosanct. To Facebook, the world is not made up of individuals, but of connections between them. The billions of Facebook accounts belong not to “people” but to “users,” collections of data points connected to other collections of data points on a vast Social Network, to be targeted and monetized by computer programs.
There are certain things you do not in good conscience do to humans. To data, you can do whatever you like.
Nikhil Sonnad on social media immorality.
There’s a lot of discussion about how we need to reach out and talk to people who disagree with us – how we need to extend an olive branch and find common ground – and that’s a lovely sentiment, but in order for that to work, the other party needs to be … well, not a raging asshole. Insisting that people continue to reach out to their abusers in hopes that they will change suggests that the abuse is somehow in the victim’s hands to control.
Geraldine DeRuiter tried feeding trolls.
Obvious content warning for the article, which includes screenshots of the abusive messages on Twitter.