

Quick TL;DR of my Discord Age Verification Experience™:
Using my face multiple times didn’t work due to the AV shitting itself inside out, but setting my DOB via Family Center somehow did it
Absolute fucking clown fiesta, Jesus Christ


WD and Seagate confirm: Hard drives for 2026 sold out (because the AI datacentres have stolen them all)
idk if the bubble will pop or slowly deflate, but i'm certain that in 10 years we'll look back at the 2020s as the decade where tech stopped progressing in the way we know it - since we're diverting all our resources to ai, there's no longer any room left for anything else to grow
the 2010s crypto gpu shortage was the warning siren for this. it really hampered the growth of gpus because they permanently became so much more expensive - now the same is happening to memory, storage, and…well, gpus again! we’ve reached the point of reverse progress


Chatbots are a cognitive hazard, part infinity: AI Delusions Are Leading to Domestic Abuse, Harassment, and Stalking


Baldur Bjarnason gives his thoughts on the software job market, predicting a collapse regardless of how AI shakes out:
- If you model the impact of working LLM coding tools (big increase in productivity, little downside) where the bottlenecks are largely outside of coding, increases in coding automation mostly just reduce the need for labour. I.e. a 10x increase means you need 10x fewer coders, collapsing the job market
- If you model the impact of working LLM coding tools with no bottlenecks, then the increase in productivity massively increases the supply of undifferentiated software and the prices you can charge for any software drop through the floor, collapsing the job market
- If the models increase output but are flawed, as in they produce too many defects or have major quality issues, Akerlof’s market for lemons kicks in, bad products drive out good, the value of software in the market heads south, collapsing the job market
- If the model impact is largely fictitious, meaning this is all a scam and the perceived benefit is just a clusterfuck of cognitive hazards, then the financial bubble pop will be devastating, tech as an industry will largely be destroyed, and trust in software will be zero, collapsing the job market
I can only think of a few major offsetting forces:
- If the EU invests in replacing US software, bolstering the EU job market.
- China might have substantial unfulfilled domestic demand for software, propping up their job market.
- Companies might find that declining software quality harms their bottom line, leading to a Y2K-style investment in fixing their software stacks.
But those don’t seem likely to do more than partially offset the decline. Kind of hoping I’m missing something


The phrase “ambient AI listening in our hospital” makes me hear the “Dies Irae” in my head.
I’m hearing “Morceaux” myself.


That slopped-out “diagram” plagiarised Vincent Driessen’s “A successful Git branching model”, BTW.


Claudio Nastruzzi of The Reg chimes in on the inherent shittiness of AI writing, coining the term “semantic ablation” to describe its capacity to destroy whatever unique voice a text has.


Look at the fresh garbage LinkedIn served to me today: https://www.linkedin.com/news/story/managing-ai-is-the-next-gen-skill-7655601/
(Why am I still on that godforsaken hellsite)


I wish we could finally agree that tech bros (and MBAs!) are greedy, full of shit and ruining the planet. And then remove both groups from any place of influence.
Prohibiting the teaching of MBAs and/or massively funding the humanities would be a good start. Hell, you could fund the humanities with the cash that currently goes toward MBAs and kill two birds with one stone.


New post from Iris Meredith (titled “Carbon Dysphoria”), comparing the large-scale dysfunction of the tech industry to gender dysphoria - “definitely one of my weirder ones”, by her own admission


Former Reddit CEO wants humanity to “perish with dignity”
The fuck does a former Reddit CEO know about dignity


Stumbled across a stray blogpost that piqued my interest: A programmer’s loss of identity


Rat-adjacent coder Scott Shambaugh has continued blogging on the PR disaster that turned into an AI-generated pissy blog post.
TL;DR: Ars Technica AI-generated an article with fabricated quotes (which got taken down after backlash), and Scott reports that a quarter of the comments he read took the clanker’s side in the entire debacle.
Personally, I’m willing to take Scott at his word on that last part - between being a programmer and being a rat/rat-adjacent, chances are his circles are (were?) highly vulnerable to the LLM rot.


So AI is a parasite that takes from Wikipedia, contributes nothing in return, and in fact actively chokes it out? And you think the solution is for Wikipedia to just surrender and implement AI features?
Given how thoroughly tech bought into the AI hype, that is probably the exact “solution” he’s thinking of.
(Exactly why tech fell for the slop machines so hard, I’ll probably never know.)


some nimrod suggested skilled machinists be outfitted with pressure-sensing gloves and cameras and patiently explain each machining step so the LLMs could take their jobs
I expected a willingness from HN users to backstab the working class, but I didn’t expect something this blatantly half-baked.
10x developers, 0.1x proletariat.


“As AI enters the operating room, reports arise of botched surgeries and misidentified body parts”
Medical malpractice as a service, coming to a GP near you


The whole thing’s worth reading, but this snippet in particular deserves attention:
Tech companies have done everything they can to maximise the potential harms of generative models because in doing so they think they’re maximising their own personal benefit.


Starting this Stubsack off by linking to Pavel Samsonov’s “You can’t “AI-proof your career” with a project mindset”, a follow-on to Iris Meredith’s “Becoming an AI-proof software engineer”; the follow-on goes further into how best to safeguard one’s software career from the slop-bots.


The first issue filed is called “Hello world does not compile” so you can tell it’s off to a good start.
The comments are a hoot, at least.
The kind of person who cannot tell the difference between blindly guessing words and conscious thought.