
Downloading a full site for offline use possible?


Vagrant0

Recommended Posts

Since the Oblivion wiki has been unreliable lately, I was wondering if anyone knew of a way to save dozens of linked pages for offline use that wouldn't get me banned from the site. Naturally I'd have to do it slowly enough not to interfere with traffic, but it seems like I have no alternative to downloading. So I was wondering if anyone knew of a good way of doing this, or if it's even possible without downloading each page individually and manually piecing them together.
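The "slow enough not to interfere with traffic" idea boils down to a crawler that fetches one page at a time, follows only same-site links, and pauses between requests. A minimal sketch in Python, using only the standard library (the function names, the 50-page cap, and the 5-second delay are illustrative assumptions, not anything the wiki publishes as policy):

```python
import time
import urllib.request
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse

class LinkExtractor(HTMLParser):
    """Collect href targets from anchor tags on one page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def extract_links(html, base_url):
    """Return absolute URLs found in the page, restricted to the same host."""
    parser = LinkExtractor()
    parser.feed(html)
    host = urlparse(base_url).netloc
    urls = []
    for href in parser.links:
        absolute = urljoin(base_url, href)
        if urlparse(absolute).netloc == host:
            urls.append(absolute)
    return urls

def mirror(start_url, page_limit=50, delay=5.0):
    """Breadth-first fetch of linked pages, with a fixed pause between requests."""
    seen, queue, pages = set(), [start_url], {}
    while queue and len(pages) < page_limit:
        url = queue.pop(0)
        if url in seen:
            continue
        seen.add(url)
        with urllib.request.urlopen(url) as resp:
            html = resp.read().decode("utf-8", errors="replace")
        pages[url] = html               # save to disk here for the CD copy
        queue.extend(extract_links(html, url))
        time.sleep(delay)               # be polite: one request per `delay` seconds
    return pages
```

This is the same shape of throttled crawl that the dedicated tools discussed below implement, just with the rate limit made explicit; saved pages would still need their links rewritten to relative paths before they browse cleanly off a CD.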

Well, no, since those either require buying them, or only let you view the pages within the application. I would prefer to just have them backed up on a CD or something, which wouldn't require any additional applications and could be distributed to others who need it when the wiki goes down and stays down. Also, since many of them lack decent controls over how they download, I would probably get banned from the wiki for excessive traffic.


Then give WinHTTrack a chance.

I used it a long time ago to back up a site for a friend of mine who had no Internet connection. I remember copying the whole folder structure to a CD (or maybe a DVD), and somehow it ran like a website; I'm not sure whether I used another tool to make it browsable from the CD, though. Anyway, I downloaded the new version at the link I provided above, and it has many configurable options: limits for maximum transfer rate, maximum connections per second, flow control, etc. Finally, it's free, so...

 

EDIT #1: It can resume an interrupted download and fetch files separately, and it has a transfer scheduler too.

 

EDIT #2: That site also has a forum if you need more support.



It doesn't work... at least not with the wiki. Can someone suggest something that actually works, so I don't end up getting banned from the wiki trying to make something like this work?


Have you asked if it's OK? If not, that's worth a shot.

 

Would it be possible to get part of it from the Google cache?

Wouldn't know who to ask, or where. I don't know how to access the Google cache, or how to download from it for my own use.


Archived

This topic is now archived and is closed to further replies.
