

New version of Descartes: imagine that an LLM, no less hallucination-prone than unaligned, is feeding its output directly into your perceptions…
Non cogitat, ergo non est. ("It does not think, therefore it is not.")
How had I missed that the guy who was "intrigued" by crypto in 202X was also the guy who told the chatbot to make him leave his wife? I really need to pay more attention to the byline.
I'm pretty sure I have no more than three followers, actually.
Longer read than I had realized but worth every word. Very well done.
In other words, we may eventually reach a sort of wealth singularity, a point when the wealth of a few grows so exponentially that it basically reaches the point of infinity.
I actually question whether or not this has already happened. The wealthy already have access to enough money that they don't actually need to sell assets - to give anything up - in order to get credit. Just taking away Elon's money doesn't make him stop being Elon. It doesn't take away his connections, his charisma, his loyal follower base, etc. Even if he did get taken down in court, any financial consequence wouldn't actually hurt his power base nearly as much as the reputational shift (see also Orange Man). Their net worth may not be literally infinite, but I can't think of any additional power or prestige they could command if it was.
Nope. That would be more immediately concerning but less dumb than the reality.
Alright OpenAI, listen up. I've got a whole 250GB hard drive from 2007 full of the Star Wars/Transformers crossover stories I wrote at the time. I promise you it's AI-free and won't be available to train competing models. Bidding starts at seven billion dollars. I'll wait while you call the VCs.
I don't know, I think by their stated goals they did alright. They took investor money, yes, but they used it to move very quickly and break a lot of things. Now, we should probably have seen ahead of time that this was actually a bad thing and that breaking things is a bad goal, but it was the 2000s and we all thought touchscreen digital watches were pretty neat.
Easy Money author (and former TV star) Ben McKenzie's new cryptoskeptic documentary is struggling to find a distributor. Admittedly, the linked article is more a review of the film than a look at the distributor angle. Still, it looks like it's telling the true story in a way that will hopefully connect with people, and it would be a real shame if it didn't find an audience.
Given the relative caliber of those two, I think this may be considered an attempted inducement to suicide by better writer. Not that I'm complaining, mind you.
I do think Ed is overly critical of the impact that AI hype has had on the job market - not because the tools are actually good enough to replace people, but because the business idiots who influence hiring believe they are. I think Brian Merchant had a piece not long ago about how mass layoffs may not be happening, but there's a definite slowdown in hiring, particularly for the kind of junior roles we would expect to see impacted.

I think this actually strengthens his overall argument, though, because the business idiots making those decisions are responding to the thoughtless coverage that so many journalists have given the hype cycle, just as so many of the people who lost it all on FTX believed the credulous coverage of crypto. If we're going to have a dedicated professional/managerial class separate from the people who actually do things, then the work of journalists like this becomes one of their only connections to the real world, just as it's the only connection that people with real jobs have to the arcane details of finance or the deep magic that makes the tech we all rely on function. By abdicating their responsibility to actually inform people in favor of uncritically repeating the claims of people trying to sell them something, they're actively contributing to all of it, and the harms are even farther-reaching than Ed writes here.
Right? I guess maybe the incel-adjacent want to go back to the standards of medieval kings needing to have the whole court in their bedchambers on the wedding night just to make absolutely certain that the royals fucked at least once.
It's also kind of weird to see Atlas Shrugged on the list. Not because it's not dystopian; the only thing its libertarian hellscape is missing is realistic consequences in the form of bear attacks. But unlike the others, the society isn't expressly said to be awful by the narrative. Or, for Solzhenitsyn, by reality.
Your bonus points link is even dumber than you're suggesting. The first half of the tweet:
I don't want to live in the world of "Camp Of The Saints".
I don't want to live in the world of "Atlas Shrugged".
I don't want to live in the world of "The Gulag Archipelago".
I don't want to live in the world of "Nineteen Eighty-Four".
I don't want to live in the "Brave New World".
I want to live in the world of Hyperion, Ringworld, Foundation, and Dune
I donāt want bad things! I want good-ish things!
Also, I've never read Ringworld or Hyperion, but the other two stories span literal millennia and show wildly different societies over that period. Hell, showcasing that development is the entire first set of Foundation stories. Just… You can absolutely tell this sonofabitch doesn't actually read.
I mean, you could make an actual evo psych argument about the importance of being able to model the behavior of other people in order to function in a social world. But I think part of the problem is also in the language at this point. Like, anthropomorphizing computers has always been part of how we interact with them. Churning through an algorithm means it's "thinking", an unexpected shutdown means it "died", when it sends signals through a network interface it's "talking", and so on. But these GenAI chatbots (chatbots in general, really, but it's gotten worse as their ability to imitate conversation has improved) are too easy to assign actual agency and personhood to, and it would be really useful to have a similarly convenient way of talking about what they do and how they do it without that baggage.
I'm pretty sure there are some other factors he's gonna need to sort out before having kids is even an actual question. For example, finding a woman who wants to have his kids and let him fuck with their infant brains.
Also, given how we see the brain develop in cases of traumatic injury, I would expect to see that neuroplasticity route around any kind of implant under most circumstances. Nerves aren't wires and you can't just plug 'em in and wait for a software patch.
I assume that the corpos are taking my data, but I also trust that their primary interest is commercial. Or rather, commercials. I.e. selling shit or otherwise profiting. But I've seen the kind of asshole who ends up in the general public and some of those fuckers are actively evil. I don't expect Meta to target my family for our politics or go after anyone's gender identity unless some outside circumstance makes it profitable for them.
You know, I really shouldn't be surprised that the prominent photo of a sexy woman is the post that brings out the creeps and gets more "comment removed by mods" than any other thread I've seen. But somehow I was.
I've been trying for far longer than reasonable to come up with a parody of this kind of toxic misogyny and technofetishism in the form of a "but ChatGPT is better than girlfriend" joke and I can't do it. The combination of pettiness and vitriol is beyond my ability to exaggerate.
Seeing shit like this alongside the discussions of the use of image recognition and automatic targeting in the recent Ukrainian drone attacks on Russian bombers is not great.
Also something something sanitized violence something something. These people love to fantasize about the thrill of defending themselves and their ideology with physical force, but even in their propaganda they are (rightly) disgusted and terrified by the consequences that such violence has on actual people.
Well found.
Also, I love that even though the conversation almost certainly started with a comment about how everyone assumes they'd be in the king's court when the vast majority of people would have been some variant of peasant farmer for the vast majority of history, he still would have totally been the Chief Rabbi, been given the most beautiful woman, and generally been a king. I wasn't there, obviously, but either he missed the point or they all missed the point. Even when talking specifically about how you can't choose the circumstances of your birth or their consequences, he still can't imagine himself not being the king.