
Auto copy SQL2K DB from in-house server to ISP server daily

  • 03-11-2010 7:43pm
    #1
    Closed Accounts Posts: 333 ✭✭


    Can anyone suggest how to automatically export a SQL Server 2K database from an in-house server to an ISP hosting package once a day? We have developed a web-based tool that queries the database when it is online, but the online tool will never be able to access the in-house live database, which is updated daily by in-house users. The problem is automating the export of the database (or a subset of the data) from SQL Server 2K each day to update or overwrite various tables within the online version of the database. It could in fact just overwrite everything each time as a last resort, because the online version will not be adding or updating any records.

    The ISP package in question is just a standard Blacknight Minimus hosting package. We would be willing to convert the DB to any other database type or file format during the export if that helped proceedings.

    Any ideas?

    McGintyMcGoo


Comments

  • Registered Users Posts: 23,212 ✭✭✭✭Tom Dunne


    Could you do some sort of export (as in, a simple SELECT out to a file, or an actual data dump), FTP the resultant file(s) over, then do the opposite on the other side?

    I could tell you how to do it in Oracle :). I'm not at all familiar with SQL Server, but the concepts should be similar enough.
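
    Something like this would cover the "SELECT out to a file" half (a rough, untested C# sketch; the connection string, view name and output path are just placeholders), with the FTP and import steps still needed on top of it:

        using System;
        using System.Data.SqlClient;
        using System.IO;

        class ExportToCsv
        {
            static void Main()
            {
                // Placeholder connection string and view name for the in-house server
                string connStr = "Server=INHOUSE;Database=LiveDb;Integrated Security=SSPI;";

                using (SqlConnection conn = new SqlConnection(connStr))
                using (StreamWriter file = new StreamWriter(@"C:\exports\report.csv"))
                {
                    conn.Open();
                    SqlCommand cmd = new SqlCommand("SELECT * FROM dbo.ReportView", conn);
                    using (SqlDataReader reader = cmd.ExecuteReader())
                    {
                        while (reader.Read())
                        {
                            // Write each row out as a comma-separated line
                            string[] values = new string[reader.FieldCount];
                            for (int i = 0; i < reader.FieldCount; i++)
                                values[i] = Convert.ToString(reader[i]);
                            file.WriteLine(string.Join(",", values));
                        }
                    }
                }
            }
        }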


  • Registered Users Posts: 194 ✭✭dumb_parade


    If you can get a DB connection string for the ISP DB, then you can put together a data migration app to move updated data to the ISP DB whenever you need to.
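
    For example, something along these lines could push the data straight across each night (an untested sketch; it assumes the ISP database is also SQL Server and allows remote connections, and the connection strings and table names are made up):

        using System.Data.SqlClient;

        class PushToIsp
        {
            static void Main()
            {
                // Placeholder connection strings - the ISP one is whatever the host supplies
                string localConn = "Server=INHOUSE;Database=LiveDb;Integrated Security=SSPI;";
                string ispConn = "Server=isp.example.com;Database=WebDb;User ID=webuser;Password=secret;";

                using (SqlConnection source = new SqlConnection(localConn))
                using (SqlConnection dest = new SqlConnection(ispConn))
                {
                    source.Open();
                    dest.Open();

                    // Start the reporting table from scratch, then bulk copy the fresh data over
                    new SqlCommand("TRUNCATE TABLE dbo.ReportData", dest).ExecuteNonQuery();

                    SqlCommand select = new SqlCommand("SELECT * FROM dbo.ReportView", source);
                    using (SqlDataReader reader = select.ExecuteReader())
                    using (SqlBulkCopy bulk = new SqlBulkCopy(dest))
                    {
                        bulk.DestinationTableName = "dbo.ReportData";
                        bulk.WriteToServer(reader);
                    }
                }
            }
        }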


  • Registered Users Posts: 2,781 ✭✭✭amen


    Loads of ways to do it, but in no particular order these would be the easiest (a rough sketch of option 4 follows the list):
    1: Database replication
    2: Database Mirroring
    3: Log shipping
    4: Backup and restore
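
    For instance, option 4 on its own can be kicked off with a couple of lines on the in-house server (untested; the connection string and backup path are made up), although the restore side depends on what the ISP will let you do:

        using System.Data.SqlClient;

        class NightlyBackup
        {
            static void Main()
            {
                // Placeholder connection string for the in-house SQL Server 2000 box
                using (SqlConnection conn = new SqlConnection(
                    "Server=INHOUSE;Database=LiveDb;Integrated Security=SSPI;"))
                {
                    conn.Open();

                    // Plain T-SQL full backup; getting the .bak restored on the hosting
                    // package is a separate question and depends on what the ISP allows
                    SqlCommand cmd = new SqlCommand(
                        "BACKUP DATABASE LiveDb TO DISK = 'C:\\backups\\LiveDb.bak' WITH INIT", conn);
                    cmd.CommandTimeout = 0; // backups can take a while
                    cmd.ExecuteNonQuery();
                }
            }
        }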

    You could also use triggers to monitor the table updates.

    A silly question though: why not just have your internal people/application talk to the database on the internet and use that as the one data source?


  • Registered Users Posts: 68,317 ✭✭✭✭seamus


    amen wrote: »
    A silly question though: why not just have your internal people/application talk to the database on the internet and use that as the one data source?
    I'd be inclined to say that their current system isn't all that bad. Performing updates on a local live server and shipping them to a query-only database for the website gives you more control in the event that something goes crap.

    In the event that the web database goes tits up, you have instant access to the local backups instead of through whatever interface blacknight have provided for it.

    Although I doubt it's that critical, such a setup also simplifies the process of maintaining a backup webspace & database on another hosting company.

    They're using a small package, so it's either a small company or small site, but it never hurts to be prepared.


  • Closed Accounts Posts: 333 ✭✭McGintyMcGoo


    Thanks for the comments so far. Access from the web to the live server is a no-no. It is a very small, reporting-only website. To keep costs down, the small hosting package is being used instead of adding a new public-facing web server into our DMZ.

    I have created some report-specific views that I plan to export from the system daily. I assume that Data Transformation Services (DTS) or the bcp command will do this easily enough. That data could be exported to a local folder and then a batch file run to FTP the data file onto a server.

    That's as far as I have gotten. I'm not sure where the file will be stored on the hosting package, or how to automate the importing of that data into the online database. I have manually imported data into an online MySQL database in the past, but I'm wondering how to automate this. I should point out that I am happy to have the online database in any format, as it will only consist of (most likely) 4 simple tables. Has anyone automated the import of data from a file into a live online database using such a small web hosting package? I'm sure I could work miracles with a dedicated online server OR a server in our DMZ, but cost is prohibiting both.
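
    For the export-and-upload half, the kind of thing I have in mind is a small scheduled console job along these lines (untested; the server, view, path and FTP details are all placeholders). The import into the online database is still the part I haven't worked out:

        using System.Diagnostics;
        using System.IO;
        using System.Net;

        class ExportAndUpload
        {
            static void Main()
            {
                string dataFile = @"C:\exports\reportview.dat";

                // 1. Export the report view to a character-format file with bcp
                Process bcp = Process.Start("bcp",
                    "LiveDb.dbo.ReportView out \"" + dataFile + "\" -c -T -S INHOUSE");
                bcp.WaitForExit();

                // 2. FTP the file up to the hosting package
                FtpWebRequest request = (FtpWebRequest)WebRequest.Create(
                    "ftp://ftp.example.com/data/reportview.dat");
                request.Method = WebRequestMethods.Ftp.UploadFile;
                request.Credentials = new NetworkCredential("ftpuser", "ftppassword");

                byte[] bytes = File.ReadAllBytes(dataFile);
                using (Stream stream = request.GetRequestStream())
                {
                    stream.Write(bytes, 0, bytes.Length);
                }
                request.GetResponse().Close();
            }
        }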

    McGintyMcGoo


  • Registered Users Posts: 2,781 ✭✭✭amen


    As I said above, one of options 1-4 is the best way to do this.
    MS SQL will look after everything for you and tell you if it fails.

    Much better than messing with bcp, file copies and scheduled jobs.


  • Registered Users Posts: 23,212 ✭✭✭✭Tom Dunne


    amen wrote: »
    As I said above, one of options 1-4 is the best way to do this.
    MS SQL will look after everything for you and tell you if it fails.

    But will they work between SQL Server and MySQL?

    And what kind of secure connection would you need to set up?


  • Registered Users Posts: 2,781 ✭✭✭amen


    I am a bit confused. From reading the OP it seems he was talking about MS SQL only, in which case this will work.

    But then later he mentions "I have manually imported data into an online MySQL in the past". So if (s)he is trying to import data automatically from MS SQL to MySQL then no, my suggestions will not work.

    But if the data really is moving from MS SQL to MySQL then bcp most likely won't work either on its own (a plain character-format export might) unless you have the same data types, definitions etc.

    I would probably write a little C# (or language of choice) program to extract the data from MS SQL, convert it and then import it into MySQL.

    You could even save the data to disk so you know what was sent each day.
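
    Something roughly like this (an untested sketch using the MySQL Connector/Net provider; the connection strings, table and column names are all made up, and it assumes the hosting package allows remote MySQL connections):

        using System.Data.SqlClient;
        using MySql.Data.MySqlClient; // MySQL Connector/Net provider

        class MsSqlToMySql
        {
            static void Main()
            {
                using (SqlConnection source = new SqlConnection(
                    "Server=INHOUSE;Database=LiveDb;Integrated Security=SSPI;"))
                using (MySqlConnection dest = new MySqlConnection(
                    "Server=mysql.example.com;Database=webdb;Uid=webuser;Pwd=secret;"))
                {
                    source.Open();
                    dest.Open();

                    // Start from an empty table, then copy the view across row by row
                    new MySqlCommand("TRUNCATE TABLE report_data", dest).ExecuteNonQuery();

                    SqlCommand select = new SqlCommand(
                        "SELECT Id, Name, Total FROM dbo.ReportView", source);
                    using (SqlDataReader reader = select.ExecuteReader())
                    {
                        MySqlCommand insert = new MySqlCommand(
                            "INSERT INTO report_data (id, name, total) VALUES (@id, @name, @total)", dest);
                        insert.Parameters.Add("@id", MySqlDbType.Int32);
                        insert.Parameters.Add("@name", MySqlDbType.VarChar);
                        insert.Parameters.Add("@total", MySqlDbType.Decimal);

                        while (reader.Read())
                        {
                            insert.Parameters["@id"].Value = reader["Id"];
                            insert.Parameters["@name"].Value = reader["Name"];
                            insert.Parameters["@total"].Value = reader["Total"];
                            insert.ExecuteNonQuery();
                        }
                    }
                }
            }
        }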

    Ooh, it appears that using an MS SQL linked server (very handy feature) you can link any ODBC data source, including a MySQL database, to MS SQL. This is the best way: http://www.packtpub.com/article/mysql-linked-server-on-sql-server-2008
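
    If that pans out, then once the linked server is configured as in that article, the nightly push boils down to a bit of T-SQL run against the in-house server, e.g. from a small scheduled program (untested; the linked server name MYSQL_LINK and the table/view names are invented):

        using System.Data.SqlClient;

        class PushViaLinkedServer
        {
            static void Main()
            {
                // Runs against the in-house SQL Server, which has the MySQL linked
                // server (called MYSQL_LINK here) set up as per the article above
                using (SqlConnection conn = new SqlConnection(
                    "Server=INHOUSE;Database=LiveDb;Integrated Security=SSPI;"))
                {
                    conn.Open();

                    // Empty the remote table, then push the view across through the linked server
                    string sql =
                        "DELETE OPENQUERY(MYSQL_LINK, 'SELECT id FROM report_data'); " +
                        "INSERT OPENQUERY(MYSQL_LINK, 'SELECT id, name, total FROM report_data') " +
                        "SELECT Id, Name, Total FROM dbo.ReportView;";

                    new SqlCommand(sql, conn).ExecuteNonQuery();
                }
            }
        }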


  • Closed Accounts Posts: 910 ✭✭✭Jagera


    Do you have access to SQL DTS? I am not 100% clear on the architecture, but if you can get connectivity to both databases, then you can use that to send the data over.

    How much data are we talking here?

