
Help with large downloads


megaburn


I'm on very slow 26.4 kbps (~3 kB/s) dialup. To download large files I use a Linode VPS as an intermediary: from there I can either download directly with greater compression and reliable file resume, or pick files up at a local public library (300+ MB) or a PC repair shop (1+ GB). The problem is that I cannot get command-line browsers like Lynx to download from The Nexus (i.e. I cannot use the VPS as an intermediary), and none of the download managers I have tried support file resume, so anything over ~20 hours (200+ MB) is pretty much hopeless (I've tried most of the ones available for Firefox and a few standalone apps). Is there any way around this?

 

Upgrading to a premium account is only an option if I can download with "wget -c <url>". I don't care about speed, just accessibility from command-line tools.
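
 

For reference, a minimal sketch of the kind of command I mean (the URL is just a placeholder):

wget -c --tries=0 --timeout=120 "http://example.com/files/SomeLargeMod.7z"

-c appends new data to whatever partial file is already on disk, --tries=0 keeps retrying after the line drops, and --timeout=120 abandons a stalled connection so the retry can kick in.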


Is there any way around this?
Multi-threading and resumed downloads will certainly NOT work if you do not have a premium account. As for whether that particular tool will work, I don't know. Ask premium users to give it a try, or subscribe yourself for the minimum amount to check it out.

 

LHammonds


Any plans for adding resume support? Not having resume is a big waste of server resources; I know I've wasted nearly a hundred hours on failed downloads, and there are plenty of similar comments on here. Remove the cookie-based security scheme and limit it to just one file transfer per IP address.

 

I'm still searching for a download manager that will work; for some reason, everything that integrates into Firefox does not support resuming a file with the same name. That's why something as simple as "wget -c" would be ideal: it just appends new data onto the existing file.

 

Any premium users care to comment on support for wget or similar bare bones file transfer clients?


If it were limited to one file transfer at a time, this site would be down in a month... Most people come and download a bunch of mods at the same time. That is the advantage compared to other sites that don't allow it, and it partly explains this site's success.

And again, only Premium members have the privilege of pause/resume:

 

*Uncapped file downloads. Premium members can download from the file servers at the maximum speed possible while normal members are capped to downloading at 150kb/sec to reduce congestion.

 

*Premium only file server. Premium members can download from a file server available just for them that no normal members can download from. No connection saturation.

 

*Download manager/accelerator support. Premium members can download from the premium member file server using download managers and accelerators with full download pausing/resuming and multi-threading/tasking capabilities for top speeds.

 

*No advertising. Premium members will never see any advertisements on this site.

 

*Result preferences. Premium members can choose to increase the number of files or comments shown on individual pages.

 

*Full download history. Access to your full download history on the site.


I installed the Windows version of wget 1.11.4 and tried it on the following mod:

 

Armamentarium - Armor

 

I copied the file mirror URL and pasted it onto the command line, but it failed, saying "'file' is not recognized as an internal or external command", etc.

 

wget -c http://www.tesnexus.com/downloads/dl.php?fs=files&file=ArmamentariumArmor-16975.7z

 

I'm certain it has to do with login authentication and not being attached to the browser.

 

How this utility is supposed to "login" to TESNexus and maintain an authenticated session is beyond me.

 

I even tried the "unsafe method" of an embedded ID/Password in the URL.

 

wget -c http://LHammonds:*****@www.tesnexus.com/downloads/dl.php?fs=files&file=ArmamentariumArmor-16975.7z
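
 

In both attempts the URL also needs to be wrapped in quotes: the unquoted & makes the Windows shell split the line into two commands, which is most likely where the "'file' is not recognized" message comes from. If anyone wants to experiment further, something along these lines is probably as close as wget gets on its own, assuming the site cookies are first exported from the browser to a cookies.txt file (untested):

wget -c --load-cookies cookies.txt "http://www.tesnexus.com/downloads/dl.php?fs=files&file=ArmamentariumArmor-16975.7z"

--load-cookies just replays the exported cookies; whether the site's session survives being used outside the browser is another question entirely.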

 

LHammonds


Pronam, I mean on a per-file basis. Right now it's one transfer per cookie token, and that cookie has to be checked by a script. Changing that to a server-side, IP-based access control allow/deny scheme would make the entire process transparent to any client making a request for any single file from any single IP - no need for client-side cookie support or a server-side script to validate a cookie. I think resume support is being broken by that cookie validation script and by wrapping the transfer in a PHP session, confusing download managers into thinking it's a different file.

 

 

LHammonds, wget would have to get the session data from somewhere else. I did some testing with the FlashGot extension for Firefox, but the download manager 'definition' that builds the command to call wget doesn't include cookie or HTTP session data (it does for the cURL lib, though). The problem there is that FlashGot caches the download to a temp folder under a random file name, so even if it could start the download, it would not be able to restart a failed download in a new session.

 

I'll do some more testing with custom FlashGot download manager definitions; maybe I can get it to bypass the caching and write directly to the file at its save-to location. Another idea is wrapping wget in SSH instead of the local console, so I could start the download locally and have it save automatically to the VPS (this requires that the HTTP session and cookie can 'survive' being moved to another PC with a different IP).
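
 

Roughly what I have in mind for the SSH idea, assuming the session cookie can be copied out of Firefox (the cookie value and file name below are placeholders, and this is completely untested):

ssh user@vps 'wget -c --header="Cookie: PASTE_SESSION_COOKIE_HERE" "http://www.tesnexus.com/downloads/dl.php?fs=files&file=SomeMod-12345.7z"'

wget's --header option just replays the cookie verbatim, so this stands or falls on whether the session is tied to the original IP.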


The server-side script is there to make sure people are logged in before downloading, as well as ensuring people are sticking to their 150kb/sec download speed.

 

The reason file resuming doesn't work isn't the cookie system or the PHP system; it's the fact that the files are being retrieved from a non-public directory and fed to the user through PHP. The same system applies to the premium members, only their files are arranged slightly differently in order to allow download manager support.


As I said, there are other means to that end. You can both ensure people are logged in before starting a download and limit bandwidth via Apache ACLs and httpd.conf (or .htaccess) directives. For what I was suggesting before: have the download script act only as a gatekeeper that issues download permission by adding a user IP, file name, and timestamp entry to an allow list; deny all other requests to the file folders by default; then use a cron-driven bash script to cull the old allow entries every few hours or so. This should keep the setup from confusing clients and might even improve performance by taking PHP out of handling transfers. With that setup the files would have to be publicly addressable, but without an ACL allow entry any unauthorized downloader would just get a 403 Forbidden.
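
 

Very roughly, the cull job could be as simple as something like this (the file path and the one-entry-per-line "IP filename unix-timestamp" format are just assumptions for the sake of the sketch):

#!/bin/bash
# Drop allow-list entries older than six hours.
# Assumed format: one "IP filename unix-timestamp" entry per line.
ALLOW=/var/www/allowlist.txt
CUTOFF=$(( $(date +%s) - 6*3600 ))
awk -v cutoff="$CUTOFF" '$3 >= cutoff' "$ALLOW" > "$ALLOW.tmp" && mv "$ALLOW.tmp" "$ALLOW"

Run it from cron every few hours and the list stays small.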

 

Anyway, the current system seems to work well enough, and it should be possible to fix resume support. I still think it's the download script: assuming the Apache config allows for file resume, it's probably that the script is missing support for the HTTP Range header. There are some examples in the comments of the PHP fread manual page. If that's not it, then give me some more details and I'll research it some more.
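
 

A quick way to check that from the command line, for anyone who wants to try it (same example URL as earlier in the thread, and the session cookie would still be needed): ask for a small byte range and watch the response headers. A 206 Partial Content means ranges are honoured; a plain 200 OK means the script is sending the whole file from byte zero, so cancel it at that point.

wget -S --load-cookies cookies.txt --header="Range: bytes=0-99" -O /dev/null "http://www.tesnexus.com/downloads/dl.php?fs=files&file=ArmamentariumArmor-16975.7z"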

 

To be clear, I'm just trying to help here, not complain. Put it this way: would you rather have several hours of my time helping to resolve the problems with file resume, or have me just shut up and move on? At the moment I have far more free time than money, so I'm still hesitant to get a premium account, but I would like to contribute something back to the Nexus in return for the past several years.


Well I "fixed" file resume my way by copying the cookie and header data for a download started locally to my VPS, it downloaded successfully without having to login on Nexus from the VPS, then I resumed the failed local download using the VPS as a mirror. In about 5 hours I'll know if the file is valid. The test file is Marts_Mutant_Mod_1-RC41-3211.7z, it was at 57MB locally when resumed, total size is 113MB or about a 11 hour download at ~3KB/s.

 

EDIT: It works - download complete and tested. If anyone else wants to try this method of setting up a personal mirror, just let me know and I'll post some step-by-step instructions.
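
 

Until then, the rough outline, with hostnames and paths as placeholders (cookies.txt is a Netscape-format export of the tesnexus.com cookies from the browser, and the dl.php URL follows the same pattern as earlier in the thread):

# 1. Start the download in Firefox locally, then export the site cookies to cookies.txt.
# 2. Copy the cookie file to the VPS and let the VPS finish the full download:
scp cookies.txt user@vps.example.com:
ssh user@vps.example.com 'wget -c --load-cookies cookies.txt "http://www.tesnexus.com/downloads/dl.php?fs=files&file=Marts_Mutant_Mod_1-RC41-3211.7z"'
# 3. Resume the partial local file against the copy served from the VPS:
wget -c "http://vps.example.com/mirror/Marts_Mutant_Mod_1-RC41-3211.7z"

Step 3 assumes the VPS is serving the file over plain HTTP; anything that supports resume (FTP, SFTP, rsync) would do.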

