• MudMan@fedia.io · 6 days ago

    Hah. Yeah, I’ll do that as soon as you invent a way to freeze time.

    For what it’s worth, I’m pretty sure it’s less energy efficient to run a local open source LLM than to offload the task to a data center, but the flexibility and privacy are too big of a deal to ignore.

    In any case, chatbots suck at reliably finding accurate information, but they are actually pretty good at surfacing things you already know or can verify at a glance, from surprisingly little input. The fact that a piece of tech is often misused doesn’t mean it’s useless. This simplistic black-and-white stuff is so dumb, and social media is so full of it. Speaking of often-misused technology, I suppose.