Everything posted by megaburn

  1. Well, I "fixed" file resume my own way: I copied the cookie and header data from a download started locally over to my VPS, which then downloaded the file successfully without having to log in to the Nexus, and then I resumed the failed local download using the VPS as a mirror. In about 5 hours I'll know if the file is valid. The test file is Marts_Mutant_Mod_1-RC41-3211.7z; it was at 57MB locally when resumed, and the total size is 113MB, about an 11-hour download at ~3KB/s. EDIT: It works, download complete and tested. If anyone else wants to try this method of setting up a personal mirror site, just let me know and I'll post some step-by-step instructions. The short version is sketched below.
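     A minimal sketch, assuming a Linux VPS with wget installed; the cookie name, file IDs, and hostnames below are placeholders, not the real Nexus values:

        # On the local PC: start the download in the browser, then copy the
        # request's Cookie and Referer headers out of a header-watching
        # extension and paste them into the wget call on the VPS.

        # On the VPS: fetch the file using the copied session data.
        wget --header="Cookie: sid=PASTE_COOKIE_HERE" \
             --referer="http://www.example.com/downloads/file.php?id=1234" \
             -O Marts_Mutant_Mod_1-RC41-3211.7z \
             "http://www.example.com/downloads/dl.php?id=1234"

        # Serve the VPS copy over plain HTTP (or scp it down), then resume
        # the failed local download against the mirror copy:
        wget -c "http://my-vps.example.com/mirror/Marts_Mutant_Mod_1-RC41-3211.7z"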
  2. As I said, there are other means to that end. You can both ensure people are logged in before starting a download and limit bandwidth via Apache ACLs and httpd.conf (or .htaccess) directives. For what I was suggesting before: have the download script act only as a 'gatekeeper' that issues download permissions by adding a user IP, file name, and timestamp entry to an allow list; deny all other requests to the file folders by default; then use a cron job bash script to cull the old allow entries every few hours or so (a rough sketch of that job is below). This should keep the process from confusing clients and might even improve performance by taking PHP out of handling transfers. With that setup the files would have to be publicly accessible to requests, but without an ACL allow entry any unauthorized downloader would just get 403 Forbidden.

     Anyway, the current system seems to work well enough, and it should be possible to fix resume support. I still think it's the download script: assuming the Apache config allows file resume, the script is probably missing support for the HTTP Range header. There are some examples in the comments of the PHP fread manual page. If that's not it, give me some more details and I'll research it some more. To be clear, I'm just trying to help here, not complain. Put it this way: would you rather have several hours of my time helping to resolve the problems with file resume, or have me just shut up and move on? At the moment I have far more free time than money, so I'm still hesitant to get a premium account, but I would like to contribute something back to the Nexus for the past several years.
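     For the cull job, something like this, assuming the allow list is a flat file of "IP filename unix-timestamp" lines (the path and expiry window are placeholders):

        #!/bin/bash
        # cull_allowlist.sh - run from cron every few hours, e.g.:
        #   0 */3 * * * /usr/local/bin/cull_allowlist.sh
        ALLOW=/var/www/downloads/allow.list   # placeholder path
        MAX_AGE=$((3 * 3600))                 # expire entries after 3 hours
        NOW=$(date +%s)

        # Keep only lines whose timestamp (3rd field) is still fresh.
        awk -v now="$NOW" -v max="$MAX_AGE" '(now - $3) <= max' "$ALLOW" \
            > "$ALLOW.tmp" && mv "$ALLOW.tmp" "$ALLOW"

     On the Apache side the file folders would be denied by default, with something like mod_rewrite and a RewriteMap over this file letting listed IPs through; the exact directives depend on how the server is set up.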
  3. Pronam, I mean on a per-file basis. Right now it's one transfer per cookie token, and that cookie has to be checked by a script. Changing that to a server-side, IP-based access control allow/deny scheme would make the entire process transparent to any client requesting any single file from any single IP - no need for client-side cookie support or a server-side script to validate a cookie. I think resume support is being broken by that cookie validation script and by wrapping the transfer in a PHP session, which confuses download managers into thinking it's a different file.

     LHammonds, wget would have to get the session data from something else. I did some testing with the FlashGot extension for Firefox, but the download manager 'definition' that builds the command to call wget doesn't include cookie or HTTP session data (the one for the cURL lib does, though). The problem there is that FlashGot caches the download to a temp folder under a random file name, so even if it could start the download it would not be able to restart a failed one in a new session. I'll do some more testing with custom FlashGot download manager definitions; maybe I can get it to bypass the caching and write directly to the file at its save-to location. Another idea is wrapping wget in SSH instead of the console, so I could start the download locally and have it automatically save to the VPS (this requires that the HTTP session and cookie 'survive' being moved to another PC with a different IP); a rough sketch of that is below.
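     For the SSH idea, roughly this, assuming the session cookie really does survive the IP change (the host, paths, and cookie value are placeholders):

        # Run wget on the VPS from the local console; the file is written
        # straight to the VPS disk, reusing the local browser's cookie.
        ssh user@my-vps.example.com \
            'wget -c --header="Cookie: sid=PASTE_COOKIE_HERE" \
                  -O /home/user/mirror/file.7z \
                  "http://www.example.com/downloads/dl.php?id=1234"'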
  4. Any plans for adding resume support? Not having resume is a big waste of server resources; I know I've wasted nearly a hundred hours on failed downloads, and there are plenty of similar comments on here. Remove the cookie-based security scheme and limit it to just one file transfer per IP address. I'm still searching for a download manager that will work; for some reason, nothing that integrates with Firefox supports resuming a file with the same name. That's why something as simple as "wget -c" would be ideal: it just appends the new data onto the existing file. Any premium users care to comment on support for wget or similar bare-bones file transfer clients?
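     For anyone who hasn't used it, this is the whole resume workflow (the URL is a placeholder):

        # First attempt dies partway through; the partial file stays on disk.
        wget "http://www.example.com/files/some_mod.7z"
        # -c sends an HTTP Range request and appends to the partial file.
        wget -c "http://www.example.com/files/some_mod.7z"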
  5. I'm on very slow 26.4kbps (~3KB/s) dialup. To download large files I use a Linode VPS as an intermediary: from there I can either download to home with greater compression and reliable file resume, or pick the file up at a local public library (for 300+MB files) or a PC repair shop (1+GB). The problem is that I cannot get command line browsers like Lynx to download from The Nexus (i.e. I cannot use the VPS as an intermediary), and none of the download managers seem to work with file resume, so anything over ~20 hours (200+MB) is kind of hopeless (I've tried most of the ones available for Firefox and a few standalone apps). Is there any way around this? Upgrading to a premium account is only an option if I can download with "wget -c <url>"; I don't care about speed, just accessibility with command line tools.
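     For reference, this is one way to do the VPS-to-home leg with both compression and resume; rsync here is my choice for illustration, and the host and paths are placeholders:

        # Pull the finished file down over SSH with on-the-wire compression
        # (-z), keeping partial files so the transfer can resume (--partial).
        rsync -z --partial --progress \
            user@my-vps.example.com:/home/user/mirror/some_mod.7z .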
  6. Hi, I got an email reminding me that this site exists, plus a general news update. I’ve retired from Morrowind, but I’m keeping a close eye on Oblivion’s development, so I’ll start to pay a little more attention on here. Anyway, without reading the other responses to this week’s question (I’m kind of short on time at the moment): Multiplayer: Yes; MMORPG: No.

     Multiplayer offers certain social experiences that you cannot find elsewhere, whether it’s covering your buddies in a heavy firefight while playing an FPS or cutting a swath through hordes of demons in a hack n slash click fest. It’s just fun. The Elder Scrolls, being as unique as it is, offers rather interesting potential for a deeper game play experience than single player alone, or even than other multiplayer games.

     I think the major objection to multiplayer comes from people who do not want to see the storyline cheapened with increased combat to appease the FPS multiplayer crowd. While this argument is, traditionally, accurate, I think Bethsoft could manage a strong balance. Looking at Morrowind, or even at what is known about Oblivion now, it’s clear that if a multiplayer element were added to either game - as is - the impact on the story would be negligible. The game play experience, on the other hand, would change quite a bit. Whether or not this change is an improvement depends more on the players than on the game itself. A couple of trigger-happy players jumping over from Half-Life 2 to a multiplayer Oblivion would probably turn the game play into a hack n slash bloodbath. A couple of people jumping over from playing D&D over IRC or on a MUD would probably seek to enhance the story quite a bit, or even set rules against cheapening game play with excessive combat or looting. In short, expanding and balancing game play would be a challenge, but in the end it would be an improvement.

     Or put it this way: why are there so many popular Morrowind mods that add NPC allies, friends, pets, or even “significant others”? Is it just a few geeks seeking to create a virtual social life because their real social life is lacking? I would like to think that it is because people love The Elder Scrolls and want to share their game play experience with their real life friends, but settle for NPCs because the game is single player. Then again, maybe it’s a trend of anti-social personality disorders. Multiplayer would also open up the option to run AI bots for added competition - or companionship.

     As for MMORPGs: no. It would only lead to a breakdown in the quality of the game play experience, cheapen the story, kill modding, and add yet another game to an already saturated marketplace. I think an included option for a player-run dedicated server, with the option of a persistent environment, is a good idea, but for regular multiplayer only. This would allow a small group of friends to experience the story, reset the game world for another run through the story, and then install a few mods to change the game play for yet another run through a somewhat different story.

     Another way to look at it is core game play dynamics: TES single player is generally about the game play experience; multiplayer is generally about “winning”, regardless of the game or genre; and MMORPGs tend to be about living an alternate life of sorts (hence the “Ever Crack” effect). TES has always been about the story and the game play experience. Balancing the game play so players cannot “win” should make multiplayer possible without significantly changing the experience. An MMORPG could only cheapen the experience - and (again) remove the modding aspect from the “total player freedom” core game design concept.

     Finally, there are the implications of adding multiplayer. An absolute requirement is that there can be no effect on the single player game play - except perhaps the option to turn on included AI bots as built-in “NPC friends”. Modding would probably explode, with hack n slash fans increasing combat and RPG fans focusing on new lands, “tweaking the experience”, or creating new adventures. Since everyone can run their own game (unlike an MMORPG), player groups would probably develop radically different sets of mods.

     For any Fallout fans on here: this argument also applies to Fallout. Multiplayer Fallout *could* be a major improvement, but I agree with most of the hardcore Fallout following that Bethsoft has yet to prove its ability to design a multiplayer RPG.

     This is an old but still interesting debate, one which I’ll have to continue later… Sorry for the long post; I’ve run through this argument a dozen-plus times and this is the sum total of my point. I’ll probably post another rant in a couple of weeks.

     -Chris