Tl;dr: copyright laws in general are demonstrably good for the production of creative works (the story, recounted in the article, about how this was studied is pretty interesting), but they’re also a kind of monopoly, and the current system is hot nonsense cooked up to make (very specifically) Disney billions.
For most social media users, posting and engaging with content is optional and separate to work commitments. For journalists, however, there is a growing expectation that maintaining a strong social media presence is both part of the job and a prerequisite for landing a job.
This means that for more than a decade, on top of their day jobs, journalists have effectively performed unpaid labor for social media platforms such as Facebook, where they promote their stories, themselves, and the organizations that employ them. As one study put it, it’s a case of “tweet or be sacked.”
I’ve had this conversation with people, specifically in the context of a dude I know talking shit about the mother of a kid at daycare. She’s a relatively prominent member of the press gallery here, and Dude was describing her as “she says she’s always too busy to do [random childcare thing], but apparently she’s not too busy to be on Twitter all day.”
And I’m like . . . dude. That’s her job. That is, literally, her job. She is doing her job. That’s why she’s busy. Because if she doesn’t she’s not going to have one and childcare sure as shit ain’t free.
The thought had never occurred to Dude before. And that’s not even getting into any of the boatload of gendered issues going on here.
So yeah. Dat modern journalistic career, huh?
The Facebook showdown with Murdoch’s corporate wing, a.k.a. the Australian Government, is definitely a Team No-one situation. That being said, I did like this, if nothing else, from someone supportive of it. It’s a lesson in bias checking:
- Do I really hate this law because it’s the government writing cheques to Murdoch or is it because the tech industry has conditioned me to see any regulation as inherently bad, even if only subconsciously? Or both? [↩]
Beginning in the 1980s, the World Bank and IMF (which are controlled primarily by the US and G7), imposed structural adjustment programmes across the global South, which significantly depressed wages and commodity prices (cutting them in half) and reorganized Southern economies around exports to the North. The goal was to restore Northern access to the cheap labour and resources they had enjoyed during the colonial era. It worked: during the 1980s the quantity of commodities that the South exported to the North increased, and yet their total revenues on this trade (i.e., the GDP they received for it) declined. In other words, by depressing the costs of Southern labour and commodities, the North is able to appropriate a significant quantity effectively for free.
Jason Hickel on.
One of the (many) things that make me insufferable at parties is that, whenever anyone goes on a pearl-clutching tear about how they “can’t believe” some economic injustice or other, I point out that that is the system working as intended.
Incredulity is an abrogation of responsibility. Don’t let people get away with it.
Short version: “green growth” isn’t possible. Either the planet goes, or capitalism does.
[Amazon founder Jeff Bezos] recognized that a customer coming to Amazon and looking for a book would not know that not finding it in a search meant nobody would have it. In the retail world, that signal usually meant “go shop for it in another store.” Bezos saw the benefit of having his searchers find the book that was no longer available and being provided the information that it was “out of print”. That would encourage the customer to find a replacement at Amazon rather than search other retailers for the unavailable book.
The article also points out that Amazon has an advantage over brick-and-mortar booksellers because Amazon (at least at founding) did not keep an inventory of books on-hand; instead, when customer orders came in, it passed them through to one of the two big book wholesalers (Ingram and Baker & Taylor), who did the actual fulfilment, warehousing, and so on.
The whole publishing industry is kind of end-to-end messed up, and extremely conservative in the “we do it this way because we’ve always done it this way” sense (it’s, roughly, a five-hundred-year-old industry). Most times when people talk about “disrupting” industries it’s code for “loss-leading with VC money in a way we’re pretending isn’t totally illegal” (ref. Uber et al.), but with Amazon, it really was legit the real deal. Of course, the longer-term effects have been utterly disastrous to, conservatively, all global retail everywhere. But . . . y’know.
Apparently the largest single category on GoFundMe, accounting for 13% of all fundraisers, is for monthly bills. That’s . . . a problem, according to the article, amongst others.
The Modernists believed that form should follow function. Though their buildings were as ugly as minimalism is boring, the Modernists saw beauty in them because they were cheap, and therefore egalitarian. Like today’s minimalists, they criticized ornamentation and focused only on the essentials. But like minimalism, modernism was cold and uninviting. In the end, it repelled people.
The gold standard of modernism, at least in theory, was the Pruitt-Igoe in St. Louis, conceived as an “oasis in the desert.” Architectural Digest called the proposal “the best high apartment of the year.” To manufacture equality among residents, the elevators only stopped at the 1st, 4th, 7th, and 11th floors—that would reduce congestion by forcing residents to use the stairs. To create camaraderie, those same floors had communal laundry rooms, garbage chutes, and public gathering spaces. But that paradise was a pipe dream. The world turned against the concrete towers that the Modernists had once pushed for, and local residents kept away from it due to high crime rates. So many people moved out that almost half of the buildings were boarded up by 1971, and five years later, all 33 buildings were demolished. In retrospect, the Pruitt-Igoe mirrors the rise and fall of modernism.
Modernism’s collapse is a reminder that total efficiency is for robots. It was inspired by a noble and egalitarian vision of the future whose reality was as hostile as its vision was inspiring. From it, we learn that humans want to live in a world decorated by color and pattern. A world without ornamentation is as bland as soup without spice—and humans want spice.
David Perell on.
I’m always kind of fascinated by articles like this because, confession, I love minimalism. Broad expanses of well-lit, neutral tones (“adult beige” as I once described it to my husband) with only the smallest hint of, preferably natural, ornamentation? Love it.
On the other hand, I do also loathe a lot of Modernist (and, worse, Brutalist) architecture, mostly because it always looks, uh. Cheap. Even when it’s not; most of our major national public buildings here, for example, are very brutalist, and they are definitely not cutting costs on construction. And I just don’t buy the “cheapness is egalitarian!” argument, mostly because of the “the poor don’t deserve nice things, they should be grateful to have anything at all!” undertones. So . . . yeah. There’s that.
Some additional thoughts:
- Modern “Apple minimalism” comes from the fact that Steve Jobs was very, very heavily influenced by Japanese aesthetic minimalism and, in particular, Zen art. So the Western backlash against minimalism always feels like it’s a teensy tiny bit touched by racism (see also: the Western reaction to Marie Kondo).
- The Pruitt-Igoe was actually designed by Minoru Yamasaki who, in case the name didn’t give it away, was a Japanese American architect.1
- It was technically based on (or at least cribbed heavily off the ideas of) the Plan Voisin, which was basically the same thing but in Paris. Parisians loathed the idea and it was never actually built. If you’ve ever seen High-Rise (you know, that arty Tom Hiddleston film) and/or read the book it was based on, the Plan Voisin and its descendants were its Really Real World inspiration.
- In comparison to “Apple minimalism,” its maximalist backlash is pretty much always just upcycled 19th century Orientalism. My parents live in this sort of house—all Persian rugs and antique Chinese cabinets of dubiously legal export origin—and while I don’t hate it, the aesthetic never feels entirely “comfortable” to me because of its history, in a way “Apple minimalism”—which takes style notes but doesn’t, like, wholesale steal artefacts—does not.
- Relatedly, “backlash maximalism” also reads as extremely “university-educated-upper-middle-class white ex-hippie Boomer” to me, because it’s a really common aesthetic here for people of my parents’ age and demographic. So it has unfortunate notes of inter-generational/class conflict, as well as colonialism, in a way I think maybe doesn’t register to Americans as strongly. (See also: the stereotypical Aspirational White Australian Bunnings Buddha Statue.)
- For what it’s worth, I also have traditionally hated Art Deco (mostly because it’s not Art Nouveau, which is the Obviously Far Superior “Art Something”-named aesthetic), though it’s growing on me as I get older.
- Fun fact: His most famous building? The World Trade Centre. [↩]
finger, the original .
Here’s how they actually work now: senators’ offices get “hotline” emails from leaders asking if anyone wants to filibuster a given nominee or bill. While US senators (and state-level equivalents like Wendy Davis) who engaged in “talking filibusters” sometimes had to go as far as equipping themselves with urinary catheters before all those hours of talking, today a junior staffer can simply call their party’s “cloak room” to let them know that the senator they work for intends to “place a hold” on whatever the “hotline” was about. That’s it. That’s the filibuster.
The filibuster is such a strange institution in general (with a ridiculously racist history, to boot), but the fact that it exists in its current form at all is, uh. Probably not that great, hey.