• TheControlled@lemmy.world
    1 year ago

    Isn’t there some computer science hypothesis (or whatever) about how the more complex computers get, the more inefficient they must get as well?

    • SlopppyEngineer@discuss.tchncs.de
      1 year ago

      Computers haven’t become less efficient. They can still crunch numbers like crazy.

      It’s the software. Why spend a month building something when you can just download some framework that does what you want in an hour? Sure, it uses 10 times as much memory and CPU, but that’s still only a one-second delay on a modern computer, and the release deadline is approaching fast.

      Repeat that process often enough and you end up with a ridiculously bloated mess of layers upon layers of software. Just for fun, start up some old software in an emulator and play around with it — you’ll be baffled by how quickly it all runs on a modern system.
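      A toy sketch of the "layers upon layers" effect — plain Python, no real framework involved, just wrapper functions standing in for framework layers that each forward a call while doing almost nothing themselves:

```python
import timeit

def add(a, b):
    # the direct, "month of work" implementation
    return a + b

def wrap(f):
    # each "framework layer" just forwards the call, adding indirection
    def layer(a, b):
        return f(a, b)
    return layer

layered = add
for _ in range(10):  # ten layers of frameworks built on frameworks
    layered = wrap(layered)

direct = timeit.timeit(lambda: add(1, 2), number=100_000)
stacked = timeit.timeit(lambda: layered(1, 2), number=100_000)
print(f"direct: {direct:.4f}s, 10 layers: {stacked:.4f}s")
```

      Both versions compute the same answer; the stacked one just spends most of its time passing the call down through layers, which is the bloat in miniature.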