When talking to other people and bringing up sources, it’s common for them to say “I don’t like that website” or “it’s not trustworthy”. On Lemmy, this is most commonly said about Reddit, where you will be questioned if you use it as a source of knowledge or show off something you did there. Wikipedia is another one.

However, the other day, a friend and I noticed something: the most discredited websites tend to be the most neutral ones. Minus its overt traditionalism, Reddit is pretty neutral and doesn’t promote a specific leaning. Wikipedia is another, since the whole point of it was to be a source of knowledge made by the people, for the people. Recently ChatGPT became something a lot of people consult, and nowadays you get a lot of ridicule for mentioning that you asked it for advice or went to it to check on something. Quora is a fourth example; it currently has a “spammy” reputation that I don’t see the basis for. I don’t know, this all seems like too big a coincidence.

Do these websites (and others) really deserve to be looked down upon as much as the people around you claim? Which ones do you have the most and least issue with, and why?

  • In general there is no “neutral” source of information. At all. Yes, including Wikipedia with its “NPOV” policy. (It even says that there’s no such thing in its own policies, so I’m not exactly saying anything new here.) Most of the sources you cite as “neutral” will actually be sources that agree, broadly, with your own cultural assumptions, assumptions you are likely not even aware of, let alone actively questioning.

    That being said, even though there is no such thing as a neutral source of information, you can still have good sources of information. Wikipedia is one such. Is it perfect? No. Because nothing is. But it is good enough for most general knowledge. It gets a bit dicey as a source when you leave the realm of western assumptions, or when you enter the realm of contentious politics. But for most things it’s just fine as a quick resource to get information from. It’s a decent encyclopedia whose ease of access isn’t matched by anybody else.

    Reddit, however, is not, because Reddit has no disciplined approach to gathering and sharing information. Wikipedia is an encyclopedia (with all the strengths and flaws that form takes on). Reddit is a lot of people talking loudly at a gigantic garden party from Hell. Over by the roses you have a bunch of people loudly expounding on the virtues of the Nazi party. Over by the fountain you’ve got another group loudly expounding on how vile and gross the Nazis were, casting glares in the direction of the roses. In the hedge maze you’ve got a bunch of people meandering around and laughing while they babble inanities. Out in the driveway you’ve got a bunch of Morris dancers practising their craft. It may be fun if you like that kind of thing, but it is absolutely not a source of reliable information unless you do so much fact-checking that you might as well skip the Reddit step and go straight to getting the facts from the places you’re using to check.

    ChatGPT, to continue using strained analogies, is that weird uncle in your family. He’s personable, bright, cheerful, and seems to know a lot of stuff. But he’s a bit off and off-putting somehow, and that’s because behind the scenes, when nobody’s looking, he’s taking a lot of hallucinogens. He does know a lot. A whole lot. But he also makes shit up from the weird distortions the drugs in his system impose on his perceptions. As a result you never know when he’s telling the truth and when he’s made up a whole fantasy world to answer your question.

    My personal experience with ChatGPT came from asking it about a singer I admire. She’s not a really big name and not a lot of people write about her. I wanted to find more of her work and thought ChatGPT could at least give me a list of albums featuring her. And it did! It gave me a dozen albums to look for. Only … none of them existed. Not a single one. ChatGPT made up a whole discography for this singer instead of saying “sorry, I don’t know”. And when I went looking for them and found they didn’t exist, I told it so, and it did its “sorry, I made a mistake, here’s the right list” thing … and that list contained half of the old entries that I’d already pointed out didn’t exist and half new entries that, you guessed it, also didn’t exist.

    And the problem is that ChatGPT is just as certain when hallucinating as it is when telling the truth. That makes it PARTICULARLY unsuited to be a source of information.