I thought of this recently (anti-LLM content within)
The reason a lot of companies and people are obsessed with LLMs and the like is that they can solve some of their problems (or so they think). What I've noticed is that a LOT of the things they try to force an LLM to fix could be solved with relatively simple programming.
Things like better search (SEO destroyed this by design, and Kagi is about the only usable, easily accessible search engine), organization (use a database), document management, and so on; see the sketch below.
People don't fully understand how any of it works, so they shoehorn an LLM into doing the work for them (poorly), while learning nothing of value.
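For what it's worth, "use a database" really can be this simple. Here's a minimal sketch of LLM-free document search using SQLite's built-in FTS5 full-text index; the docs table and sample rows are made up for illustration:

```python
import sqlite3

# Plain full-text document search, no LLM involved.
# FTS5 ships with most standard SQLite builds.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE VIRTUAL TABLE docs USING fts5(title, body)")
conn.executemany(
    "INSERT INTO docs (title, body) VALUES (?, ?)",
    [
        ("invoice-2024-03", "March invoice for hosting services"),
        ("meeting-notes", "Discussed search quality and indexing"),
    ],
)
# Keyword search; bm25() ranks results by relevance (lower is better).
rows = conn.execute(
    "SELECT title FROM docs WHERE docs MATCH ? ORDER BY bm25(docs)",
    ("search indexing",),
).fetchall()
print(rows)  # [('meeting-notes',)]
```

Ranked keyword matching comes for free here: no model, no API key, no hallucinated citations.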
He said he clicked the source it quoted.
Maybe if Google hadn't been enshittifying search for 10 years, AI search wouldn't be useful. But I've seen the same thing. The forced Gemini summary at the top of Google often has source links that aren't anywhere on the first page of Google's own results.
And how do you know the source is accurate? Citing a source doesn't automatically make a claim true; bullshit can have sources too.
The premise of the OP is that classic programming makes AI unnecessary. A bad source surfaced by the classic Google search index isn't a problem with the AI.