Image from a based Chinese artist on Twitter @Amogha_Pasa

  • ☆ Yσɠƚԋσʂ ☆@lemmygrad.ml
    17 days ago

    A large context is still a problem right now. There's a fundamental tension here: once your network gets big enough, it takes a long time for data to propagate through it relative to the rate of computation. So you start running into hard limits on how much of its knowledge the system can effectively access at any one time. I'm sure there's still plenty of room to grow, I'm just saying it's not a given that AI systems can keep self-improving indefinitely.
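    To make the context problem concrete, here's a rough back-of-envelope sketch (not from the comment itself, just an illustration): in a standard transformer, self-attention compute grows quadratically with context length, so the cost of accessing the whole context blows up far faster than the context itself grows. The `d_model` value is an arbitrary illustrative choice.

```python
# Back-of-envelope: self-attention compute scales as O(n^2 * d), so
# doubling the context roughly quadruples the attention cost.
def attention_flops(n_tokens: int, d_model: int = 4096) -> int:
    # QK^T score matrix plus the weighted sum over values,
    # each roughly n^2 * d multiply-adds per layer pass.
    return 2 * n_tokens * n_tokens * d_model

base = attention_flops(8_000)     # 8k-token context
big = attention_flops(128_000)    # 128k-token context (16x longer)
print(f"128k context costs {big / base:.0f}x the attention compute of 8k")
# -> 128k context costs 256x the attention compute of 8k
```

    A 16x longer context costing 256x the compute is exactly the kind of hard limit the comment is gesturing at.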

    And you're completely right that China having to do more with less actually drives innovation. I also suspect the whole data centre build-out the US is rushing into is premature. This tech is still actively evolving, and we might see completely new algorithms or hardware approaches that make current-gen hardware obsolete. On the hardware side, for example, there are analog chips in development that would be a much better fit, and once that tech solidifies it could blow right past current types of chips. New software algorithms like SpikingBrain could also be served better by a different hardware architecture.
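    For a sense of why spiking models map onto different hardware, here's a minimal leaky integrate-and-fire neuron, the basic unit that spiking approaches build on (a toy sketch with made-up parameters, not SpikingBrain's actual design): computation is event-driven, so downstream work only happens when a neuron actually fires, which is what analog and neuromorphic chips exploit.

```python
# Minimal leaky integrate-and-fire (LIF) neuron: accumulate input,
# leak over time, emit a spike when the membrane potential crosses
# a threshold, then reset. Threshold/leak values are illustrative.
def lif_run(inputs, threshold=1.0, leak=0.9):
    v = 0.0          # membrane potential
    spikes = []
    for x in inputs:
        v = v * leak + x          # leaky integration of input
        if v >= threshold:
            spikes.append(1)      # event: neuron fires
            v = 0.0               # reset after the spike
        else:
            spikes.append(0)      # no event, no downstream work
    return spikes

print(lif_run([0.5, 0.5, 0.5, 0.0, 1.2]))
# -> [0, 0, 1, 0, 1]
```

    Conventional GPUs do dense synchronous matrix math either way, so they get little benefit from all those zeros; event-driven hardware only spends energy on the spikes.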

    So the US is making a huge gamble on a massive infrastructure investment that's unlikely to pay off.