(Also is that . . . the video for “Bad Apple”? Lol. Weebs.)
Everyone go home. Machine learning is over now that humanity has discovered the . . . for it.
[Mild content warning that this is, indeed, an app that draws cartoon scribbles of penises which, while hilarious, may not be entirely safe for work.]
Modern pop-ups are.
Most of this issue, incidentally, is caused by websites desperately trying to find loopholes in privacy regulations like the GDPR; basically trying to make interfaces as annoying and confusing as (legally) possible to trick users into opting into surveillance and tracking.
Also: Newsletter sign-ups, which are sort of the same thing (basically: a way of harvesting personal information, i.e. your email address, and noting that email clients tend to have less privacy-protecting adblockers in them than web browsers do).
Why do suspension bridges have stranded cables not solid rods? The major reason is that solid rods would fail suddenly and catastrophically, whereas stranded cables fail slowly and make alarming noises while they do. We build software systems out of solid rods; they fail abruptly and completely. Most are designed to perform their tasks as fast as possible, so that when they are compromised, they perform the attacker’s tasks as fast as possible.
David Rosenthal on.
This is actually from a talk about the externalities of cryptocurrencies, which is worth watching and/or reading in full.
Sometimes I go back and read my blog posts from 2014 and marvel at how freewheeling, irreverent, and downright joyful my writing was. I don’t really write like that anymore, because social media (and the internet in general) have conditioned me to constantly fret over negative attention. So I act as my own PR firm, carefully focus-testing and bowdlerizing my prose until it’s as dry as a slice of burnt toast. Sometimes I can escape from this trap a little bit (like I’m trying to do right now), but overall I worry that my writing has gotten worse, not better, over time.
Nolan Lawson on the.
My current blog only goes back to 2014 or so, but I’ve been blogging fairly consistently since 19991 and still have an archive of all the posts . . . somewhere. And while it’s definitely Old Woman Yells At Cloud2 territory, you can absolutely trace trends in how my writing has changed over the years as various forms of online social interaction — webrings and cliques, LiveJournal, “review” sites, social media — have waxed and waned. Admittedly some of this is age-related — in 1999 I was fifteen and writing about getting to stay home from the school swimming carnival in exchange for promising Mum I’d vacuum the house3 — but . . . probably not as much as I’d like, in retrospect.
So this look at the . . . saga, from a former iOS dev, is interesting.
But the main take-home here is in the line:
I think Apple’s censorship policies are wrong and they have no grounds to be policing adult content within apps on the app store. It’s not uncommon to hear variants of this (almost always from Americans) and it’s just . . . lol wut?
Apple’s policies may indeed be nonsense but the idea that a private company doesn’t have “grounds” to police the content on the platform it maintains is, like, Peak Silicon Valley Brain. Of course it has grounds! Hell, it doesn’t just have “grounds” it has a legal obligation, under the law, to do so! And even if it didn’t, again, it’s Apple’s own platform and in as much as there is an issue here, it’s not “can Apple police content on its own platform?”1 so much as it’s “should Apple have the ability to hold a monopoly marketplace over what can be installed on an iPhone to start with?”2
This is what I mean about “Silicon Valley Brain”, which is the terminal condition of having taken too many STEM courses and not enough of everything else to realise this isn’t about “censorship.” It’s about antitrust.
The future from . . .
[T]he reason to be skeptical about [the “metaverse”] is not that people don’t want to do some or all of the stuff that Zuckerberg talks about in strained tones of wonder and whimsy. Whether in terms of speculating on cryptocurrencies or gaming with a V.R. headset or just clocking into a virtual workplace, people are absolutely already doing all those things, albeit sometimes more happily than others. It is not even the question of why anyone would entrust the design and implementation of the future to Facebook, which has made the world infinitely dumber, uglier, and worse in a number of obvious and inescapable ways, and is a miserable website to use to boot. That is a really good question, though, if only because it is just extremely difficult to imagine someone choosing to work and live inside the website that convinced their grandparents that the germ theory of disease was a hoax. But what really rankles is that it doesn’t matter. It doesn’t matter that this vision of the future is extractive, joyless, and dull; that the smug cretins who got rich off the platforms that this new decentralized movement is supposedly leaving behind are also leading the supposed successor movement doesn’t matter either. This push to transcend the world that Facebook has befouled and create a new, virtual one can be read in a sense as a response to the 10,000-page Facebook Papers leak, which demonstrated both the extent to which Zuckerberg’s own criminal indifference and Facebook’s inability to police its own sprawl have made the site a malignant metastatic force in countries around the world. Again, it doesn’t matter.
This is the taunt implicit in everything Zuckerberg does at this point in his reign. Here is a man who got unconscionably rich off the worst website that has ever existed, a website that has broken brains on a scale previously unimaginable in human history, and here is his stupendously wack vision for the future—and everyone is just going to have to deal with it. There are many things to abhor about Mark Zuckerberg and his works, but the fundamental mediocrity of it all—the lack of vision, the absence of any moral sense or shame, the inability and unwillingness not just to fix but even reckon with the dangerous and ungovernable thing he’s made—is what feels both most egregious and most of this moment. It is embarrassing and not a little enraging to realize that you are subject to the whims of an amoral and incurious capitalist posing as a visionary optimist. It is especially humiliating when the all-bestriding and inevitable figure in question is such a dim, dull nullity.
The future is.
No wonder all the Kids These Days aesthetics are about, like, escaping to the woods to be a bog witch or some imagined fantasy version of 1800s British academia or whatever . . .
With the have-nots spending more and more of their time experiencing a simulation of glorious substance through their VR headsets, the haves would have the actual glorious substance all the more to themselves. The beaches would be emptier, the streets cleaner. Best of all, the haves would be able to shed all responsibility, and guilt, for the problems of the real world. When [millionaire tech mogul Marc] Andreessen argues that we should no longer bother to “prioritize improvements in reality,” he’s letting himself off the hook. Let them eat virtual cake.
Nicholas Carr on.
Maybe an escape into virtual reality actually isn’t all that great after all? Like I’m pretty sure they’ve made some films about this . . .
The truth is that there is no such thing as the “digital world.” It is not a realm that exists apart from the so-called real world. Everything that is digital — information, exchanges, and experiences — is also physical. And yet, as infinite as we imagine the digital world to be, there is only so much physical stuff to go around. We are facing a very real future of scarcity, not abundance, and it will only be more severe if we continue to deny the true, hard costs of digital culture. We need to begin to express it in harmony with the physical world.
Christopher Butler on the.
Cyberspace is not infinite. Servers are expensive; in electricity, in land usage, in carbon footprint, in the resources it takes to build them in the first place. And the cost-benefit ratio of “cyber per chip” is getting lower, not higher . . .