• MangoPenguin@lemmy.blahaj.zone · 12 hours ago

    Download of 6GB is wild, is that re-downloading the entire package for each one that needs an update? Shouldn’t it be more efficient to download only the changes and patch the existing files?

    At this point it seems like my desktop Linux install needs as much space and bandwidth as Windows does.

    • Ephera@lemmy.ml · 37 minutes ago

      This doesn’t work too well for rolling releases, because users will quickly get several version jumps behind.

      For example, let’s say libbanana is currently at version 1.2.1, but then releases 1.2.2, which you ship as a distro right away, but then a few days later, they’ve already released 1.2.3, which you ship, too.
      Now Agnes comes home at the weekend and runs package updates on her system, which is still on libbanana v1.2.1. At that point, she would need the diffs 1.2.1→1.2.2 and then 1.2.2→1.2.3 separately, which may have overlaps in which files changed.

      In principle, you could additionally provide the diff 1.2.1→1.2.3, but if another user, Greg, updates only every other weekend, and libbanana has celebrated its 1.3.0 release by then, you will also need the diffs 1.2.1→1.3.0, 1.2.2→1.3.0 and 1.2.3→1.3.0. So this strategy quickly explodes with the number of different diffs you might need.

      At that point, just not bothering with diffs and making users always download the new package version in full is generally preferred.
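      A back-of-the-envelope sketch of that explosion (the version list is just the thread's hypothetical libbanana example): if a user may be stuck on any older version, the repo ends up needing a diff for every pair of an older and a newer release, which grows quadratically with the number of releases.

```python
# Sketch: count how many "old -> new" diffs a repo must host so that
# every user, whatever version they last updated from, can jump to the
# latest in one diff. Version list is the hypothetical libbanana example.
from itertools import combinations

versions = ["1.2.1", "1.2.2", "1.2.3", "1.3.0"]

# Every older version paired with every newer one.
diffs = [(old, new) for old, new in combinations(versions, 2)]

print(len(diffs))  # n*(n-1)/2 diffs for n releases: 6 here
for old, new in diffs:
    print(f"{old} -> {new}")
```

      With 20 releases that is already 190 diffs for a single package, which is why rolling distros tend to just ship the full package.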

    • Olap@lemmy.world · 11 hours ago

      Patching means rebuilding. And packagers don’t really publish diffs. So it’s “use all your bandwidth” instead!

      • definitemaybe@lemmy.ca · 11 hours ago

        Which is WAY more economical.

        Rebuilding packages takes a lot of compute. Downloading mostly requires just flashing some very small lights very quickly.

        • cmnybo@discuss.tchncs.de · 9 hours ago

          If you have multiple computers, you can always set up a caching proxy so you only have to download the packages once.
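          As a concrete sketch of such a setup (assuming a Debian-like system; apt-cacher-ng is one common choice for this, and "cachebox" is a hypothetical hostname):

```shell
# On the cache host ("cachebox", hypothetical name): install and start
# apt-cacher-ng, which listens on port 3142 by default.
sudo apt install apt-cacher-ng
sudo systemctl enable --now apt-cacher-ng

# On each client: route apt through the proxy. Each package is then
# fetched from the internet once and served from the LAN afterwards.
echo 'Acquire::http::Proxy "http://cachebox:3142";' | \
  sudo tee /etc/apt/apt.conf.d/00proxy
```

          Other package managers can be pointed at a shared cache similarly (e.g. pacman by serving one machine's package cache over HTTP).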

          • SmoochyPit@lemmy.ca · 5 hours ago

            That reminds me of Chaotic AUR, though it’s an online public repo. It automatically builds popular AUR packages and lets you download the binaries.

            It sometimes builds against outdated libraries/dependencies though, so for pre-release software I’ve sometimes still had to download and compile it locally. Also, you can’t apply patches or pin an old commit, like you can with normal AUR packages.

            I’ve found it’s better to use Arch Linux’s official packages when I can, though, since they always publish binaries built with the same latest-release dependencies. I haven’t had dependency version issues with that, as long as I’ve avoided partial upgrades.

          • Ephera@lemmy.ml · 1 hour ago

            openSUSE Leap does have differential package updates. Pretty sure I once saw it on one of the Red-Hat-likes, too.

            But yeah, it makes the most sense for slow-moving, versioned releases with corporate backing.
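            For what it's worth, on openSUSE the delta handling lives in libzypp and is toggled in its config file; a quick way to check (assuming a stock Leap install, and hedging that defaults vary by release):

```shell
# Look for the delta-RPM knob in libzypp's configuration.
grep -i deltarpm /etc/zypp/zypp.conf
# The relevant option looks like:
#   download.use_deltarpm = true
```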