Just had this idea pop into my mind. Instead of relying on volunteers mirroring package repositories all around the world, why not utilise the BitTorrent protocol to move at least some of the load onto the users, and thus increase download speeds as well as decrease latency?

  • gnuhaut@lemmy.ml · 5 months ago

    Because HTTP is simpler, faster, easier, more reliable.

    The motivation for a lot of p2p is to make it harder to shut down, but there is no danger of that for Linux distros. The other motivation would be to save money, but Debian/Arch/etc. get more than enough bandwidth/server donations, so they’re not paying for that anyway.

  • atzanteol@sh.itjust.works · 5 months ago

    BitTorrent would likely increase latency, not lower it. The BitTorrent protocol is very inefficient for small files and large numbers of files (https://wiki.debian.org/DebTorrent - see “Problems”).

    But I think your question is more “why not use p2p to download files”, for which I think the answer is likely “because they don’t need to.” It would add complication and overhead to maintain. An FTP/HTTP server is pretty simple to set up and maintain, and the tools to do so already exist. You can also use round-robin DNS to gain some redundancy and a bit of load spreading without much effort.
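
    Round-robin DNS needs nothing special on the client side: the zone simply publishes several A/AAAA records for one name, and clients spread themselves across them. A minimal sketch of what that looks like from a client, with a made-up mirror hostname:

```python
import socket

MIRROR = "mirror.example.org"  # hypothetical name published with several A/AAAA records

def resolve_all(host, port=80):
    """Collect every address the resolver returns for this name."""
    seen, addrs = set(), []
    for family, _, _, _, sockaddr in socket.getaddrinfo(host, port, proto=socket.IPPROTO_TCP):
        if sockaddr[0] not in seen:
            seen.add(sockaddr[0])
            addrs.append((family, sockaddr))
    return addrs

def first_reachable(host, port=80, timeout=3):
    """Naive client-side failover: try each record until one accepts a connection."""
    for family, sockaddr in resolve_all(host, port):
        try:
            with socket.socket(family, socket.SOCK_STREAM) as s:
                s.settimeout(timeout)
                s.connect(sockaddr)
                return sockaddr  # this mirror answered
        except OSError:
            continue  # unreachable record, move on to the next one
    raise RuntimeError(f"no address for {host} was reachable")
```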

    • Omega_Jimes@lemmy.ca · 5 months ago

      BitTorrent is nice for getting ISOs, but I would pull my hair out if I tried to download patches with it.

  • arxdat@lemmy.ml · 5 months ago

    Metallica ruined it. They made it seem as though torrenting was evil because their content was being downloaded. Poor babies.

    • ElderWendigo@sh.itjust.works · 5 months ago

      Lars ruined Napster. BitTorrent came around some time later, after LimeWire, Soulseek, and DirectConnect. Lars might have had something to say about BitTorrent, but by that point no one was listening.

      Besides, back then, we really were using BitTorrent mostly for Linux ISOs. At the time it was more reliable than HTTP. It really sucked having to download an entire ISO again because it failed the checksum. BitTorrent alleviated that.
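
      That reliability comes from piece-level hashing: a classic .torrent stores a SHA-1 hash for every fixed-size piece, so a client only re-fetches the pieces that fail verification instead of pulling the whole ISO down again. Roughly, as a sketch rather than a real client:

```python
import hashlib

PIECE_LEN = 256 * 1024  # a common piece size; real torrents pick their own

def piece_hashes(path, piece_len=PIECE_LEN):
    """SHA-1 of each fixed-size piece, the way a v1 .torrent records them."""
    hashes = []
    with open(path, "rb") as f:
        while chunk := f.read(piece_len):
            hashes.append(hashlib.sha1(chunk).digest())
    return hashes

def pieces_to_refetch(path, expected_hashes, piece_len=PIECE_LEN):
    """Indices of corrupt pieces -- only these need downloading again."""
    return [i for i, (got, want) in enumerate(zip(piece_hashes(path, piece_len), expected_hashes))
            if got != want]
```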

  • recarsion@discuss.tchncs.de · 5 months ago

    To add to everything else mentioned, many places (schools, workplaces) don’t allow any usage of BitTorrent, even legal usage. A guy at my uni got yelled at for torrenting a Linux ISO. Not to mention that, depending on where you live, your ISP might take an interest in that activity unless you’re using a VPN.

  • makeasnek@lemmy.ml · 5 months ago (edited)

    There is an apt variant that can do this, but nobody uses it. BitTorrent isn’t great, overhead-wise, for lots of small files.

    IPFS is better for this than torrents. The question is always “how much should the client seed before they stop seeding, and how long should they attempt to seed before they give up?” I agree something like this should exist; I have no problem quickly re-donating any bandwidth I use.
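
    Those two knobs could be as simple as a ratio target plus a time budget; the numbers below are placeholders, not a recommendation:

```python
import time

class SeedPolicy:
    """One possible answer: seed until you have given back a target ratio,
    but never keep the upload slot open past a hard time budget."""

    def __init__(self, target_ratio=1.0, max_seconds=30 * 60):
        self.target_ratio = target_ratio  # e.g. give back as much as you took
        self.max_seconds = max_seconds    # e.g. give up after 30 minutes
        self.started = time.monotonic()

    def keep_seeding(self, uploaded_bytes, downloaded_bytes):
        if time.monotonic() - self.started >= self.max_seconds:
            return False                  # time budget spent, stop seeding
        if downloaded_bytes == 0:
            return True                   # nothing to compare against yet
        return uploaded_bytes / downloaded_bytes < self.target_ratio
```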

  • Rogue@feddit.uk · 5 months ago

    I suspect that if this were enabled by default there would be an uproar from people annoyed that the distro was stealing their bandwidth, and if it were opt-in then very few people would enable it.

    Windows Update uses peer-to-peer to distribute updates. It’s one of the first things I always disable.

  • Sims@lemmy.ml · 5 months ago

    Over time I’ve seen several groups tinker with p2p protocols for packages, most recently using GNUnet/IPFS for Guix packages. But I’ve never seen a working, integrated system. Weird…

  • sorrybookbroke@sh.itjust.works · 5 months ago

    That’s actually a really interesting idea. Windows even does something similar with system updates, or did at one point.

    Peer-to-peer packages would have some privacy and potential security issues, of course, but I like the thought.

  • Possibly linux@lemmy.zip · 5 months ago (edited)

    That’s what Debian does with ISOs. However, no one uses it.

    If anything, IPFS might be good for packages, as the IPFS program could be embedded into the package manager.
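
    From the package manager’s side, that embedding could look like: resolve the package to a content ID published in the signed repo metadata, ask a local IPFS gateway for it, and fall back to a plain mirror when no daemon is running. A sketch with made-up URLs (127.0.0.1:8080 is the usual local gateway default):

```python
import urllib.request

LOCAL_GATEWAY = "http://127.0.0.1:8080/ipfs/"         # typical local IPFS gateway
FALLBACK_MIRROR = "https://mirror.example.org/pool/"  # hypothetical plain HTTP mirror

def fetch_package(cid, filename, timeout=10):
    """Try the content-addressed copy first, then a conventional mirror.
    The cid would ship in signed repository metadata, like checksums do today."""
    for url in (LOCAL_GATEWAY + cid, FALLBACK_MIRROR + filename):
        try:
            with urllib.request.urlopen(url, timeout=timeout) as resp:
                return resp.read()
        except OSError:
            continue  # gateway not running or mirror unreachable -> try the next source
    raise RuntimeError(f"could not fetch {filename} from any source")
```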

  • ಠ_ಠ@infosec.pub · 5 months ago

    Some distros do this already.

    Alternative downloads

    There are several other ways to get Ubuntu including torrents, which can potentially mean a quicker download, our network installer for older systems and special configurations and links to our regional mirrors for our older (and newer) releases.

    BitTorrent is a peer-to-peer download network that sometimes enables higher download speeds and more reliable downloads of large files. You need a BitTorrent client on your computer to enable this download method.

    https://ubuntu.com/download/alternative-downloads