

Stuff like this is why I say a lot of USians are politically illiterate, and I don’t mean that as an exaggeration. Actually, it might be an understatement, because with stuff like this, it’s not just that they don’t understand; it’s that they’re confidently wrong. Propaganda has flipped the script on them, but underneath, their material conditions often still reflect someone who would benefit from a socialist state. So then you get this wildness, where he can present something that would benefit them, and they’ll even agree with it, as long as he says the alternative is “communism,” the monster under the bed.
It just goes to show how sensible the concepts of communism are when they aren’t being blockaded by a knee-jerk hate response. Using myself as an example, I might have mentioned before that there was a time, prior to reading any theory, when I sort of accidentally found my way to something like communism as a thing to believe in; just from the sensibility of it, not from others telling me. Granted, I did not meander into something as complex as dialectical materialism (I still struggle sometimes to fully wrap my head around that). But it makes sense that people would lean in that direction, whether or not they have the same word for it, when the oppressive conditions out of which it developed as theory and practice are still very much a thing.

It shows the scientific roots of it: in science, two people can independently observe and record more or less the same thing, because the thing itself is not different based on who is looking at it. How it is interpreted can differ, but science strives to separate interpretation from raw empirical observation.
I find the tone kind of slapdash. I feel like the author could have condensed it into a short post about using AI agents in certain contexts, since that seems to be the crux of their argument for usefulness in programming.
I do think they have a valid point about some people in tech acting squeamish about automation when their whole field has been automation from day one. But I also think the idea of AI doing “junior developer”-level work is going to backfire massively on the industry. Seniors start out as juniors, and AI probably isn’t going to progress fast enough to replace seniors within decades (I could see it replacing some seniors, but not with the level of trust and competency that would allow it to replace all of them). AI could, however, replace a lot of juniors, effectively locking the field into a trajectory of aging itself out of existence, because it would become too hard for enough humans to get the experience needed to take over the senior roles.
Edit: I mean, it’s already the case that dated systems sometimes run on languages nobody is learning anymore. That kind of thing could get much worse.