
Downloading and file serving changes


Dark0ne



In response to post #11086397. #11139420 is also a reply to the same post.

You can use custom categories in NMM; that's what I've done since the whole category system was added.
I think installing mods manually is something every modder should know how to do, even (or maybe especially) when using NMM. There are so many users having problems they could have avoided, or at least foreseen, if they had known how multiple mods interact with each other.
I still prefer using NMM, but that's something everyone modding their games has to decide on his/her own.

Any ideas where the new servers will be? I hope to get one in the central US (preferably Chicago, as that's the one I'm closest to). I think we did have a Chicago premium server a while back, but I didn't have premium then. I do now, though! Anyway, thanks so much for all the hard work you've put into the site, Dark0ne & Co. I (and many others on here) really appreciate it!


Not having a limit on the number of files downloaded at a time is bad; as a programmer, I'd point and laugh, except that I've made worse mistakes in my career (no, I won't say what they were... worse, remember?).

 

I will say that 5/10 files is perfectly reasonable for larger files, but for small files (where more time is spent setting up the download than actually downloading the file) it might be a bit low. It might make more sense to have a higher limit (20/40, perhaps?) for small files (<200K? 300K? 1M?) and keep the 5/10 for files above that threshold.

 

(Or, if you want to get really fancy, have files under 1M count less -- e.g., the 'cost' of a file is size/1M, capped at 1, so 100K files only count as a tenth of a file. Heck, for files above, say, 50M, you could have them count as more than one file -- e.g., cost = max(min(size/1M,1), size/50M). I think that does what I described... I need more coffee if I want to produce valid code.)
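For what it's worth, that weighting scheme does check out; here's a quick Python sketch (the 1 MB and 50 MB thresholds are just the example numbers from this post, not anything the site actually uses):

```python
def download_cost(size_bytes, small_cap=1_000_000, large_cap=50_000_000):
    """Weight a download by size: files under 1 MB count as a fraction
    of a slot, mid-sized files count as exactly one slot, and files
    over 50 MB count as more than one slot."""
    return max(min(size_bytes / small_cap, 1), size_bytes / large_cap)

print(download_cost(100_000))      # a 100 kB file -> 0.1 of a slot
print(download_cost(10_000_000))   # a 10 MB file  -> 1 slot
print(download_cost(100_000_000))  # a 100 MB file -> 2.0 slots
```

The inner `min` handles the small-file discount, and the outer `max` lets very large files override the cap, so the three size regimes fall out of one expression.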

 

Just a suggestion....


In response to post #11159677.

I don't think it's quite as straightforward as that. A server grabbing one big file is actually less strain on the network than a bunch of small files. With ten 100 kB files vs one 1 MB file you're not going to see a significant performance difference, but scale that up to the level Dark0ne is talking about: having 200 files being downloaded at a time per user causes a lot of database lookups to occur, resulting in a huge amount of random reads on the disk (random reads and random writes are generally the most time-consuming I/O operations).

Basically, it's not so much a problem of data throughput to the user (though that does play a factor) as the amount of time it takes the web servers to look up each file being downloaded. More files (not necessarily more GBs) cause the database servers and file servers to work harder.
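A crude way to see the point (the numbers below are made up purely to show the shape of the math, not measured from any real server): if every download carries a fixed lookup/seek cost on top of the transfer itself, many small files eat far more server time per byte than one big file.

```python
def server_time(n_files, total_bytes, lookup_cost=0.05, bytes_per_sec=50_000_000):
    """Toy model: each file pays a fixed per-file lookup cost,
    plus time proportional to the total bytes transferred."""
    return n_files * lookup_cost + total_bytes / bytes_per_sec

one_big    = server_time(1, 1_000_000)    # one 1 MB file
ten_small  = server_time(10, 1_000_000)   # ten 100 kB files, same total data
print(one_big, ten_small)  # the ten small files cost several times more
```

Same bytes on the wire, but the per-file overhead dominates, which is exactly why 200 tiny concurrent downloads hurt more than their combined size suggests.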

I'm in I.T. at a research group and a little bit of a storage nerd, so this stuff is very interesting to me.

I really respect you for making these blog posts. It separates you from the standard "entity" that we get used to in times like this, where they lack any sort of transparency and divulge pretty much nothing that goes on behind the scenes.

 

The queue sounds really nice, too. I honestly thought there was always one there, but mostly because I never looked (Then again the max I downloaded was like 6 - 10 concurrently - I can't even imagine tracking down 200 files to download them at once...)

 

I also respect the fact that you guys take responsibility and admit screwups or oversights. I hope you manage to get the servers sorted out!

 

Start charging them 100 dollars a mega-download. That will cure them of that real quick. And if that doesn't work, send them the Red Square White Exclamation Point note. You know, the one that reads "Really, are you that selfish?" Besides that, I am one of the people that read your notes. You are kind enough to tell us what is going on, and how things are working out. Not many people or places care to do that anymore. So thank you. If I had this many problems with my computer, I would have already mailed it to a fish. (For all you youngsters out there that means throwing it in a lake). I hope you get it worked out the way you want. Keep up the great work and again, a big thank you.

 

 

How ridiculous. While the most I think I've ever concurrently downloaded was ~6 - 10 files, I honestly thought there was a queue in place (I just never watched it closely enough to notice there wasn't; it's a set-it-and-go kind of thing). It's ridiculous to place this level of blame on the users of NMM when NMM is flawed (and knowingly so) in a way that allows them to do this. I do web development, and one of the first mindsets I take when developing the user-facing end of an application is "How might the user abuse this, intentionally or unintentionally?" That should be the mindset behind any application that interacts with a server and is operated by an end user.

 

If they don't want people downloading 200 concurrent files and essentially hammering the server, it's their responsibility to make the client not allow that (which is basically what they said). Expecting people to be decent is just as stupid as downloading 200 concurrent files, or as not putting in a queue to begin with.
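And a client-side cap of that kind really is only a few lines in most languages. A hypothetical sketch using a semaphore to hold concurrent downloads at 5 (the names here are illustrative, not NMM's actual code):

```python
import threading

MAX_CONCURRENT = 5
slots = threading.BoundedSemaphore(MAX_CONCURRENT)

def download(url, fetch):
    """Block until one of the MAX_CONCURRENT slots is free, then run
    the actual transfer; every other queued request simply waits."""
    with slots:
        return fetch(url)

# Usage (hypothetical): fire off 200 requests from worker threads;
# only 5 ever touch the server at the same time.
# threads = [threading.Thread(target=download, args=(url, http_get))
#            for url in queued_urls]
```

The user can still click "download all" on 200 files; the semaphore just turns the other 195 into a queue instead of a server hammer.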

