
Periodic site backups

Options
  • 16-03-2005 12:57pm
    #1
    Closed Accounts Posts: 230 ✭✭


    How do you handle site backups?

    I'm looking for
    1. An FTP client that will do an incremental backup from webhost to local PC, copying only new or updated files
    2. A utility that will update a MySQL dump on my local PC from the host, handling only new or updated records
    Preferably freeware. Anybody?


Comments

  • Registered Users Posts: 7,739 ✭✭✭mneylon


    Are you hosting on windows or linux?

    If you are on Linux you could *probably* set up a cron job to scp the files to you once a day ...


  • Closed Accounts Posts: 230 ✭✭musician.ie


    I'm on Windows unfortunately. What do the experts do?

    And don't say the experts use Linux :D


  • Registered Users Posts: 1,268 ✭✭✭hostyle


    musician.ie wrote:
    What do the experts do?

    Write their own.


  • Registered Users Posts: 7,739 ✭✭✭mneylon


    musician.ie wrote:
    I'm on Windows unfortunately. What do the experts do?

    Get a dedicated server (or use linux)


  • Registered Users Posts: 6,511 ✭✭✭daymobrew


    A few years back I wrote a Perl script to recursively download files from an FTP site, commit them to a ClearCase source code repository, and then email people with the results. It ran as a cron job.

    The basic algorithm might be useful to you - you'd have to determine which files were new (could be the messy part) and download them.

    I can't help you with the db part.
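A minimal Python sketch of the decide-then-download step described above, using only the standard library's ftplib. The directory names are hypothetical, and comparing file sizes is a crude stand-in for proper change detection (timestamps would be better, but size is the most widely supported probe):

```python
import os
from ftplib import FTP

def needs_download(remote_size, local_path):
    """Decide whether a remote file should be fetched.
    A missing local copy or a size mismatch means 'new or changed'."""
    if not os.path.exists(local_path):
        return True
    return os.path.getsize(local_path) != remote_size

def mirror_dir(ftp, remote_dir, local_dir):
    """Download only new or changed files from one remote directory."""
    os.makedirs(local_dir, exist_ok=True)
    ftp.cwd(remote_dir)
    for name in ftp.nlst():
        try:
            size = ftp.size(name)  # raises for subdirectories on many servers
        except Exception:
            continue  # skip directories in this simple sketch
        local_path = os.path.join(local_dir, name)
        if needs_download(size, local_path):
            with open(local_path, "wb") as fh:
                ftp.retrbinary("RETR " + name, fh.write)
```

Run from cron (or Scheduled Tasks) after connecting with `FTP(host, user, password)`; recursing into subdirectories is the messy part mentioned above and is deliberately left out.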


  • Registered Users Posts: 7,739 ✭✭✭mneylon


    daymobrew wrote:
    The basic algorithm might be useful to you - you'd have to determine which files were new (could be the messy part) and download them.
    Rsync can do that. AFAIK there is a Windows port


  • Registered Users Posts: 6,511 ✭✭✭daymobrew


    blacknight wrote:
    Rsync can do that
    Will that need to have the rsync daemon running on the remote machine?


  • Registered Users Posts: 7,739 ✭✭✭mneylon


    I've never tried using it on windows, but on *nix both machines need to have rsync installed.


  • Closed Accounts Posts: 230 ✭✭musician.ie


    Sigh - I do miss my Linux hosting and telnet access, but I thought ASP was the way to go. I would assume my hosting company only runs whatever daemons/processes are absolutely necessary.

    To be honest, it's the MySQL databases that are the big issue. The files are scripts and HTML pages (which seldom change) and images (which don't change, though new ones arrive - my FTP client can handle that: telling it not to replace existing files means it only transfers new ones).

    But the MySQL databases? I have two sites with article databases containing all the content for each site. One already has (and the other is about to get) a phpBB forum. This is getting big on the existing site and will be huge on the new one. Using the backup tool to dump the entire DB will devour the bandwidth on my hosting plan and on my broadband account, and will take ages. But I have to have a backup - my hosting company backs up but won't guarantee data on my plan. Hence my interest in backing up incrementally. It occurs to me that I could install MySQL on my local PC, but after that it gets a bit fuzzy...

    If I configured my PC to be the server, I know I'd be hacked within the hour. I haven't the knowledge, time or inclination to go that route.

    Thanks for the ideas so far by the way.


  • Registered Users Posts: 4,003 ✭✭✭rsynnott


    For the MySQL DB, if your host allows access to the DB from outside, just write a batch file to do "mysqldump -h (host) -u (user) -p(password) (dbname) > db.sql" and put it in Scheduled Tasks. (I think ">" redirects output the same way on Windows - am somewhat rusty on Windows.)
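One way to wrap that scheduled dump in a small script that works from either cron or Windows Scheduled Tasks. Host, user and file names are placeholders, and the command is only assembled here, not run against a real server:

```python
import subprocess

def build_dump_cmd(host, user, password, dbname):
    """Assemble the mysqldump command line from the post above.
    Note mysqldump's lowercase -h/-u flags, and the password glued to -p."""
    return ["mysqldump", "-h", host, "-u", user, "-p" + password, dbname]

def dump_to_file(host, user, password, dbname, outfile):
    """Run mysqldump, sending its stdout to a file (the '>' redirection)."""
    with open(outfile, "wb") as fh:
        subprocess.run(build_dump_cmd(host, user, password, dbname),
                       stdout=fh, check=True)
```

Pointing a daily Scheduled Task at a script calling `dump_to_file(...)` reproduces the batch-file approach, though it still transfers the full dump on every run.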


  • Registered Users Posts: 4,003 ✭✭✭rsynnott


    blacknight wrote:
    I've never tried using it on windows, but on *nix both machines need to have rsync installed.

    As long as the host has an ssh daemon, it should be possible to rsync, I think.


  • Closed Accounts Posts: 230 ✭✭musician.ie


    rsynnott - I am currently doing what you suggest, except I use an app (DBTools Manager) to do this for me. The problem is that by dumping the entire database every time, I'm using a phenomenal amount of bandwidth. What I'm searching for is a way of just backing up what's changed and merging it somehow with the previous backup.

    I could probably implement it by comparing the remote and local databases, table by table with sql statements, but that's a big job. I was hoping for something at a higher level.


  • Registered Users Posts: 6,511 ✭✭✭daymobrew


    Any useful info at the MySQL site?
    http://dev.mysql.com/tech-resources/articles/index.html

    I searched for 'backup' at:
    http://dev.mysql.com/doc/mysql/en/index.html

    but I don't know enough to know if the results are useful.

    Is there a way to log changes so you know what changed since the last backup?
    [Maybe add code to your web site apps to write db change commands to a file that you can download and apply to your local copy]
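The bracketed idea above - have the web app append each write statement to a log you can download and replay locally - might look like this. Python and sqlite3 are used purely so the sketch is self-contained (the real sites are ASP/PHP against MySQL), and all names are made up:

```python
import json
import sqlite3

def execute_and_log(conn, logfile, sql, params=()):
    """Run a write statement against the live DB and append it to a change log."""
    conn.execute(sql, params)
    conn.commit()
    with open(logfile, "a") as fh:
        # Naive serialisation: statement, a tab, then JSON-encoded parameters.
        fh.write(sql + "\t" + json.dumps(list(params)) + "\n")

def replay_log(conn, logfile):
    """Apply the logged statements to a local copy, oldest first."""
    with open(logfile) as fh:
        for line in fh:
            sql, params = line.rstrip("\n").split("\t")
            conn.execute(sql, json.loads(params))
    conn.commit()
```

Only the log travels over the wire, so bandwidth scales with the number of changes rather than the database size. The serialisation is deliberately naive: it assumes parameters are JSON-friendly and statements contain no tabs.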


  • Closed Accounts Posts: 2,161 ✭✭✭steve-hosting36


    Hi Musician,

    Looks like you're hosted with us - your control panel includes a comprehensive account backup tool that will back up all content, settings and databases. Check it out, or drop a mail to support if you need any help using it.


  • Closed Accounts Posts: 230 ✭✭musician.ie


    daymobrew - thanks. I had been looking there, and there were some promising leads there. It looks like Steve has the easy answer. I didn't realise that their export tool was so powerful :o

    Thanks all.


  • Registered Users Posts: 4,003 ✭✭✭rsynnott


    musician.ie wrote:
    rsynnott - I am currently doing what you suggest, except I use an app (DBTools Manager) to do this for me. The problem is that by dumping the entire database every time, I'm using a phenomenal amount of bandwidth. What I'm searching for is a way of just backing up what's changed and merging it somehow with the previous backup.

    I could probably implement it by comparing the remote and local databases, table by table with sql statements, but that's a big job. I was hoping for something at a higher level.

    Ah, right, in that case, you could just add a timestamp column to each table (preferably at the end, so it won't upset code that relies on columns being in a particular order) and then write a script that does something along the lines of "SELECT * FROM <WHATEVER> WHERE created > <last-downloaded>"
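A self-contained sketch of that incremental pull, again using sqlite3 so it runs anywhere (a real setup would point a MySQL client at the remote host instead). The articles table, its columns, and the INSERT OR REPLACE merge (SQLite syntax; MySQL's rough equivalent is REPLACE INTO) are all illustrative assumptions:

```python
import sqlite3

def fetch_changes(remote, table, since):
    """Pull only rows newer than the last backup timestamp."""
    cur = remote.execute(
        "SELECT id, body, created FROM %s WHERE created > ?" % table, (since,))
    return cur.fetchall()

def merge_changes(local, table, rows):
    """Merge fetched rows into the local copy, replacing on key conflict."""
    local.executemany(
        "INSERT OR REPLACE INTO %s (id, body, created) VALUES (?, ?, ?)" % table,
        rows)
    local.commit()
```

One caveat to the timestamp approach: a created-time column only catches new rows; catching edits to existing rows needs the column updated on every write as well.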

