Basically, I’ve found a ~100 GB website that I need to have on hand so I can use its data for a research project. However, the website is really old (started in the 90s and still looks like it), and I need to download it so I don’t get fucked if it goes down at some point. Atm I’ve just had HTTrack running 24/7 at its default 25 KB/s limit, but at that rate it would take about a month and a half to finish (100 GB at 25 KB/s ≈ 46 days), which doesn’t work for my timeframe. So, what’s a reasonable bandwidth cap that won’t bother the admins or DoS the website, but also won’t take absolute ages? Oh, and I need all the images, videos, etc., so I can’t rely on archive.org.
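
For scale, here’s the back-of-envelope math (a quick Python sketch; the 100 GB figure is the site size from above, and the candidate rates are just illustrative, not recommendations):

```python
# Back-of-envelope ETA for mirroring ~100 GB at various bandwidth caps.
SITE_SIZE_BYTES = 100 * 10**9  # ~100 GB, per the post

for rate_kb_s in (25, 250, 500, 1000):  # candidate caps in KB/s
    seconds = SITE_SIZE_BYTES / (rate_kb_s * 1000)
    days = seconds / 86_400
    print(f"{rate_kb_s:>5} KB/s -> {days:5.1f} days")

#    25 KB/s ->  46.3 days
#   250 KB/s ->   4.6 days
#   500 KB/s ->   2.3 days
#  1000 KB/s ->   1.2 days
```

So even something in the 250–500 KB/s range would finish in a few days instead of a month and a half. If I’m remembering the HTTrack options right, the cap is set in bytes per second via `-A` / `--max-rate` (e.g. `-A500000` for ~500 KB/s), but double-check the man page.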

  • bobbarker4444@alien.top · 1 year ago
    Have you considered contacting the admins and seeing if they’d be able and willing to provide a dump for you?