Angelfire Club Blog
Need assistance and ideas from fellow Angelfire members to help build and manage your website? You've come to the right place!
To join this Community Blog, you must be an Angelfire member. Just click the "Join this Community" link, and start posting immediately.

Hint: When posting, select the topic that most relates to your question (News, FrontPage, HTML Questions, etc.). This helps keep the blog organized for everyone.


Thursday, 13 May 2004
Backing Up Sites
Mood:  quizzical
Topic: HTML Questions
Does anyone know how I can go about backing up my website (i.e., the contents of the web shell: the .html files, the .css files, etc.)?

I have an idea, but it won't work for any blog or .css files...


Posted by Freemason at 2:18 AM EDT | View Comments (2)

Thursday, 13 May 2004 - 6:26 AM EDT

Name: cw

Use FTP. You'll need some software like CuteFTP.
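If you'd rather script it than click through a GUI client, something like the sketch below works with any FTP server. This is just an illustration, not an official Angelfire tool; the hostname and credentials in the commented-out example are placeholders for whatever your FTP client (CuteFTP, etc.) is configured with.

```python
# Rough sketch of an FTP backup script using Python's standard ftplib.
import os
from ftplib import FTP, error_perm

def backup_dir(ftp, remote_dir, local_dir):
    """Recursively download remote_dir into local_dir."""
    os.makedirs(local_dir, exist_ok=True)
    ftp.cwd(remote_dir)
    for name in ftp.nlst():
        if name in ('.', '..'):      # some servers list these too
            continue
        try:
            ftp.cwd(name)            # only succeeds for directories
            ftp.cwd('..')
            is_dir = True
        except error_perm:
            is_dir = False
        if is_dir:
            backup_dir(ftp, remote_dir.rstrip('/') + '/' + name,
                       os.path.join(local_dir, name))
            ftp.cwd(remote_dir)      # recursion changed the working directory
        else:
            with open(os.path.join(local_dir, name), 'wb') as f:
                ftp.retrbinary('RETR ' + name, f.write)

def backup_site(host, user, password, local_root='site-backup'):
    ftp = FTP(host)
    ftp.login(user, password)
    backup_dir(ftp, '/', local_root)
    ftp.quit()

# Example (hypothetical hostname and credentials -- substitute your own):
# backup_site('ftp.angelfire.com', 'yourname', 'yourpassword')
```

The directory test (try `cwd`, fall back to download) is a common trick because plain FTP has no portable "is this a directory?" command.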

Thursday, 13 May 2004 - 9:17 AM EDT

Name: jrp34

If you are *really careful* and configure it right, you can use 'wget'. 'wget' is a utility that reads web pages and downloads them, sort of like a mirroring utility. It saves all the files (including images) in the same directory structure as they exist on the server, so you don't have to reconstruct much. The downside is that it can easily bite off more than it can chew, so you have to be careful to limit it to your own site and use the right options.
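By way of illustration, "the right options" might look something like the snippet below: `--no-parent` keeps wget from wandering above your own directory, `--mirror` turns on recursion and timestamping, and `--convert-links`/`--page-requisites` make the local copy browsable offline. The `ANGELFIRE_URL` variable and the URL shown are placeholders, not a real address; set it to your own site before running.

```shell
# Set this to your own site's address first, e.g.:
#   export ANGELFIRE_URL="http://www.angelfire.com/dir/yoursite/"
if [ -n "${ANGELFIRE_URL:-}" ]; then
    wget --mirror          \
         --no-parent       \
         --convert-links   \
         --page-requisites \
         --wait=1          \
         "$ANGELFIRE_URL"
fi
```

`--wait=1` pauses a second between requests, which is polite to the server and makes a runaway mirror easier to catch.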

The advantage to wget is that it will save blogs, etc. that are rendered on the fly. The disadvantage is that you have to remove all the ad code and whatnot yourself after it's done.

Oh, and there is a version for Windows, but I've never used it. Linux and Mac OS X all the way. :)

JP
