Most of the problems in the current internet landscape are caused by the cost of centralized servers. What is stopping us from running the fediverse on a peer-to-peer, torrent-based network? I would assume latency, but couldn’t that be solved by more aggressive pre-caching in clients? Of course interaction and authentication would still be handled centrally, but media sharing, which is the largest strain on servers, could be eased by clients sending media between each other. What am I missing? Torrenting seems like such an elegant solution.

  • taladar@sh.itjust.works · ↑44 ↓2 · 1 year ago

    Authentication and authorization cannot be handled centrally if the host actually performing the action you want to authorize cannot be trusted.

    Media sharing is mainly a legal problem. With decentralized solutions you couldn’t easily delete illegal content, and anyone hosting it would potentially be legally liable.

    • uis@lemmy.world · ↑3 ↓1 · 1 year ago

      and anyone hosting it would potentially be legally liable.

      Bullshit in many countries. Well, I know two of them: the USA and Russia. The first requires intent; the second explicitly excludes P2P from liability.

  • Endorkend@kbin.social · ↑45 ↓4 · 1 year ago

    Quite a few systems use torrent style distribution.

    Heck, even Windows uses a distributed-bandwidth system (Delivery Optimization) where you can set it to download chunks of updates from other machines on your local network.

    Technologies like BitTorrent, NoSQL databases, blockchain, AI and the like all end up being used as an invisible part of larger systems once the idiotic hype around them wanes.

  • Bobby Turkalino@lemmy.yachts · ↑34 ↓1 · 1 year ago

    Torrenting requires way more resources than people realize. It’s easy to look at your torrents’ download speeds and think “oh, that’s less than a normal download, like from Steam, so it must not take nearly as many resources” – it’s not all about bandwidth. The amount of encryption and hashing involved in torrenting is fairly CPU heavy (every ~4 MB piece has to be hashed and verified), especially if your CPU doesn’t have onboard encryption hardware (think mobile devices). The sheer number of connections involved even in just one torrent can also bog down a network like you wouldn’t believe – anyone who runs a home seedbox can attest.
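    To make the CPU cost concrete, here is a minimal Python sketch of what a client does with every piece it downloads. BitTorrent v1 uses SHA-1 per piece; the 4 MiB piece size and the function names here are illustrative, not any particular client’s API:

```python
import hashlib

PIECE_SIZE = 4 * 1024 * 1024  # 4 MiB, a common piece size for large torrents

def piece_hashes(data: bytes, piece_size: int = PIECE_SIZE) -> list[bytes]:
    """Hash every piece, as the creator of a .torrent does when building
    the metadata's piece-hash list."""
    return [
        hashlib.sha1(data[i:i + piece_size]).digest()
        for i in range(0, len(data), piece_size)
    ]

def verify_piece(piece: bytes, expected: bytes) -> bool:
    """A downloading client recomputes the SHA-1 of each received piece and
    compares it against the hash list from the .torrent metadata."""
    return hashlib.sha1(piece).digest() == expected
```

    Every downloaded byte passes through SHA-1 at least once (and again on re-check after a hash failure), which is the per-piece CPU cost the comment above describes, on top of any transport encryption.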

    • uis@lemmy.world · ↑3 · 1 year ago

      “oh, that’s less than a normal download, like from Steam, so it must not take nearly as many resources”

      For me it’s always more.

      The amount of encryption and hashing involved in torrenting is fairly CPU heavy

      It’s the same amount of encryption HTTPS requires. Hashing is completely optional and not required for operation; encryption is optional too, though other peers may require it.

      every ~4 MB piece has to be hashed and verified

      Which is the same. I’m not sure, but Steam probably verifies files at some stage too.

      The sheer number of connections involved even in just one torrent can also bog down a network like you wouldn’t believe – anyone who runs a home seedbox can attest.

      There is no difference between 4 kpkts across 500 connections and 4 kpkts on 1 connection as far as the network itself is concerned. IP and UDP don’t even have such a concept; the network is stateless. But shitty routers with small conntrack tables, on the other hand…

  • makeasnek@lemmy.ml · ↑28 · edited · 1 year ago

    The short answer is that while torrents show great promise for content distribution (as an alternative to CDNs, for example), they inherently rely on some centralized resources and don’t make sense for a lot of use cases. Most websites are a bunch of small files; torrenting is much more useful for offloading large bandwidth loads. On small files, the torrent overhead is a waste. That’s why your favorite Linux ISO has a torrent but your favorite website doesn’t.

    One major issue is the difficulty of accurately tracking the contribution of each member of the swarm. I download a file and seed it to the next person; sounds great, right? But what if the next person doesn’t come along for a long time? Do I keep that slot open for them just in case? For how long? How do I prove I actually “paid my dues”, whether that was waiting for peers or actually delivering to them? How do we track users across different swarms? Do we want a single user ID to be tracked across all content they’ve ever downloaded? When you get into the weeds with these questions, you can see how quickly torrenting becomes a poor fit for a number of use cases.

    Being somewhat centralized, by the way, is how BitTorrent solved the spam problem that plagued P2P networks before it. Instead of searching the entire network and everything it contains (and everything every spammer added to it), you rely on a trusted messenger like a torrent index to find your content. The torrent file or magnet link points to an entry in a DHT and there you go: no need to worry about trusting peers, since you are downloading a file by hash, not by name. And you know the hash is right because a trusted messenger gave it to you. Without some form of centralization (as in earlier P2P networks), your view of the network was whatever your closest peers wanted it to be, and you essentially got assigned those peers at random with no reason to trust or distrust them. You couldn’t verify that they were accurately letting you participate in the wider network. Even a 100% trustworthy peer was only as good as the other peers it was connected to. For every peer passing you bad data, you needed at least two peers to prove them wrong.
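    The “download by hash, not by name” idea can be sketched in a few lines of Python. SHA-256 here stands in for BitTorrent’s infohash scheme, and `fetch_by_hash` and the peer callables are hypothetical, not a real client API:

```python
import hashlib

def fetch_by_hash(expected_hash: str, peers) -> bytes:
    """Content addressing as BitTorrent uses it: a trusted index hands you a
    hash, and data from any untrusted peer is accepted only if it matches."""
    for fetch in peers:
        data = fetch()  # ask the next peer for the content
        if hashlib.sha256(data).hexdigest() == expected_hash:
            return data  # the peer may be anonymous; the data is provably right
    raise IOError("no peer returned content matching the trusted hash")
```

    The trust question is thereby reduced to trusting whoever gave you the hash; spam and corrupted data from peers are rejected mechanically.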

    Blockchain gets us close to solving some of these problems, as we now have technology for establishing distributed ledgers that could track things like network behavior over time, upload/download ratio, etc. This solves the “who do I trust to say this other peer is a good one?” problem: you trust the ledger. But an underlying problem with applying blockchain here is that ultimately people are just self-reporting their bandwidth. Get another peer to validate it, you say? Of course! But how do we know that peer is not the same person (how do we avoid Sybil attacks)? Until we have a solid way to do “proof of bandwidth” or “proof of network availability”, that problem will remain. Many people are working on it (proof of storage has already been solved, so perhaps this could be done in a similar way), but as of right now I know of no good working implementation that protects against Sybil attacks. Then again, if you can use blockchain or some other technology to establish a decentralized datastore for humanity, you don’t need torrents at all: you would use that other base-layer protocol for storage and retrieval.

    IPFS was intended as a decentralized replacement for much of the way the current internet works. It was supposed to be this “other protocol”, but the system is byzantinely complex and seems to have suffered from a lack of usability, good leadership, and promotion. When you have an awesome technology and nobody uses it, there are always reasons for the lack of adoption. I don’t know enough about those reasons to really cover them here, but suffice it to say they do exist. Then again, IPFS has been around for a while now (since 2015) and people use it for things, so clearly it has some utility.

    That said, if you want to code on this problem and contribute to helping solve data storage/transmission problems, there are certainly many OSS projects which could use your help.

  • onlinepersona@programming.dev · ↑26 · 1 year ago

    You’re basically talking about IPFS. I think their problem is that they gave it a local HTTP interface and the documentation is… in need of improvement.

    • Acters@lemmy.world · ↑7 · 1 year ago

      I always shy away from newer tech because of lackluster documentation and poor leadership, though the latter is rare enough. Without proper documentation, I feel like I have to read the code and make my own notes to put into their documentation platform, which is not what I want to do as a user. Contributing is nice, but doing work a core member should do, without credit, dissuades me from participating.

      • onlinepersona@programming.dev · ↑1 · 1 year ago

        I know that feeling. Curiosity often gets the better of me, though: I’m a Nix/NixOS user, and that’s amongst the worst-documented projects I’ve come across.

  • andruid@lemmy.ml · ↑11 · 1 year ago

    For me, I’ve had issues getting organizational support for using anything close to P2P, with things like “keep that botnet off my systems” being said. On the personal side, I had issues with ISPs assuming the traffic was illegal in nature and sending me bogus cease-and-desist notices.

    Agreed, though. At least WebRTC has a strong market. IPFS and other web3 things have also tried to find footholds in common use, so the fight isn’t over for sure!

    • uis@lemmy.world · ↑2 · 1 year ago

      On personal side I had issues with ISPs assuming traffic was illegal in nature and sending me bogus cease and desist notices.

      On the other hand, check whether you can sue them for bogus cease-and-desists. If you can, do it after changing ISPs.

    • xoggy@programming.dev · ↑18 · 1 year ago

      Gotta love the heavy use of buzzword technologies and no actual information on what it actually is. Then you click the “How does it work?” button and it takes you to a Google Slides deck… so much for the sleek website design.

    • blackfire@lemmy.world · ↑3 · 1 year ago

      This is the only system I’m aware of that uses torrent-based content sharing. It’s not a great system, though, as you’re essentially downloading a whole archive every time you connect, so it just grows and grows unless you set some retention policy.

  • Eggymatrix@sh.itjust.works · ↑3 · 1 year ago

    I would want neither to deal with the security issues nor to pay the data costs associated with an app being able to connect to my phone to download media.

        • henrikx@lemmy.dbzer0.com · ↑7 · 1 year ago

          I don’t see how this is much different from today’s way of doing things, where pretty much everyone is a freeloader on the centralized server. The major benefit is that it doesn’t have to be just one server anymore.

          • folkrav@lemmy.ca · ↑5 · 1 year ago

            The trackers themselves are centralized. The .torrent file you download from a private tracker has a unique private ID tied to your account, which the torrent client advertises to the tracker when it phones home to the announce URL, alongside your leech/seed metadata.
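            A rough Python sketch of how such an announce URL might be assembled. The tracker hostname, the passkey-in-path layout, and the exact parameter set are illustrative assumptions, loosely following the standard BitTorrent announce parameters:

```python
from urllib.parse import urlencode

def build_announce_url(passkey: str, info_hash: bytes, uploaded: int,
                       downloaded: int, left: int,
                       peer_id: bytes, port: int) -> str:
    """Sketch of the stats a client reports when it phones home to a private
    tracker. The passkey in the URL path ties the announce to your account."""
    params = urlencode({
        "info_hash": info_hash,    # 20-byte hash identifying the torrent
        "peer_id": peer_id,        # identifies this client instance
        "port": port,              # where other peers can reach you
        "uploaded": uploaded,      # bytes seeded: the ratio the tracker records
        "downloaded": downloaded,  # bytes leeched
        "left": left,              # 0 once you are a pure seeder
    })
    return f"https://tracker.example.org/{passkey}/announce?{params}"
```

            Because the passkey and the self-reported `uploaded`/`downloaded` counters travel together, the tracker can enforce per-account ratio rules, which is exactly the centralization being described here.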

  • Dunstabzugshaubitze@feddit.de · ↑3 · 1 year ago

    Most clients are web browsers, and to a browser a torrent is just another file served over HTTP; there is no native support.

    So that would only give us a use for torrents as a form of content-distribution platform to get the actual files closer to the client.

    In cases where we have actual non-browser clients: I like to curate what I distribute and don’t want to share anything I happen to stumble upon. Would you be willing to store, and more importantly share, everything you find on 4chan or everything that might show up in your Mastodon feed?

  • deur@feddit.nl · ↑1 · edited · 1 year ago

    I was thinking about the possibility of torrent-based clones of public git repos, but that isn’t going to pan out.