Transferring Data Between Servers

New Here,
Sep 17, 2009

We have an application on our company's internal network, and the same application running on an external network for our client.


We want to try to sync the databases when transactions are made.


So if the client updates the app, we want to update our internal database, and if internal transactions are made in our application, they would update the client's db.

Unfortunately, we are unable to pass data into our internal network from the client's side, as all inbound ports are closed.


We are running SQL Server for both. We are trying to make this dynamic and not a manual process.


We can connect to our client's machine via FTP from our internal server.


We have thought of having the client's transactions write to a file, and then having our internal server retrieve the file with a scheduled task and update our internal database. A rough sketch of the capture side is below.
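For illustration, the capture side on the client's server could be a trigger that logs the key of every changed row to a small change-log table; the scheduled job would then export that table to the file (bcp can handle the export) and clear it. All table and column names here are made up:

-- One-time setup: a log of which Orders rows changed and how.
CREATE TABLE dbo.OrderChangeLog (
    LogID      INT IDENTITY(1,1) PRIMARY KEY,
    OrderID    INT      NOT NULL,
    ChangeType CHAR(1)  NOT NULL,  -- 'I' insert, 'U' update, 'D' delete
    ChangedAt  DATETIME NOT NULL DEFAULT GETDATE()
);
GO

CREATE TRIGGER dbo.trg_Orders_Capture
ON dbo.Orders
AFTER INSERT, UPDATE, DELETE
AS
BEGIN
    SET NOCOUNT ON;

    -- Inserts and updates: every row in the "inserted" pseudo-table.
    -- A matching row in "deleted" means the statement was an UPDATE.
    INSERT INTO dbo.OrderChangeLog (OrderID, ChangeType)
    SELECT i.OrderID,
           CASE WHEN EXISTS (SELECT 1 FROM deleted d WHERE d.OrderID = i.OrderID)
                THEN 'U' ELSE 'I' END
    FROM inserted i;

    -- Deletes: rows in "deleted" with no matching row in "inserted".
    INSERT INTO dbo.OrderChangeLog (OrderID, ChangeType)
    SELECT d.OrderID, 'D'
    FROM deleted d
    WHERE NOT EXISTS (SELECT 1 FROM inserted i WHERE i.OrderID = d.OrderID);
END;
GO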


Then we would push the changes back to the external server and have a script write them to the database; the apply side might look like the sketch below.
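The apply step on the receiving side could stage the file and fold it into the live table. Again the table, columns, and file path are hypothetical, and MERGE requires SQL Server 2008; on older versions you'd use separate UPDATE and INSERT statements instead:

-- One-time setup: a staging table matching the exported columns.
CREATE TABLE dbo.Orders_Staging (
    OrderID INT         NOT NULL,
    Status  VARCHAR(20) NOT NULL,
    Amount  MONEY       NOT NULL
);

-- Each run: load the file the scheduled task pulled down via FTP.
BULK INSERT dbo.Orders_Staging
FROM 'C:\sync\orders_changes.dat'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n');

-- Fold staged rows into the live table: update matches, insert new rows.
MERGE dbo.Orders AS target
USING dbo.Orders_Staging AS src
      ON target.OrderID = src.OrderID
WHEN MATCHED THEN
    UPDATE SET target.Status = src.Status,
               target.Amount = src.Amount
WHEN NOT MATCHED BY TARGET THEN
    INSERT (OrderID, Status, Amount)
    VALUES (src.OrderID, src.Status, src.Amount);

TRUNCATE TABLE dbo.Orders_Staging;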


Not sure this is the best method. If anyone has any ideas on how we should do this, please let me know.


Thanks,

Brian


New Here,
Sep 18, 2009

Hey Brian,

We do something similar, though it may not be the best way to do it. We put triggers on the tables that we want to synchronize that timestamp each row when it is created or updated. We also have a trigger that keeps a table of deleted rows (roughly the sketch below). Our database is Oracle, so the rest of this is Oracle-specific.

I have an Oracle job (PL/SQL) that kicks off automatically every so often and uses Oracle Data Pump to create dump files of the records. When it completes, another external job (cron on Unix, a scheduled task on Windows) moves the dump files. On the other servers we have Oracle jobs that periodically look for new dump files in a given directory. When one finds them, it loads them into a temporary schema and then merges the data into the real schema.

We also have a set of pages where the admin can control which tables get exported, filters to limit which rows get exported, and job scheduling and results.
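The two triggers look roughly like this; table and column names are invented for the example:

-- Stamp every created or updated row so the export job can pick up
-- only rows changed since the last run. Assumes ORDERS has a
-- LAST_MODIFIED timestamp column.
CREATE OR REPLACE TRIGGER orders_stamp_trg
BEFORE INSERT OR UPDATE ON orders
FOR EACH ROW
BEGIN
  :NEW.last_modified := SYSTIMESTAMP;
END;
/

-- One-time setup: a record of deletions, so the other side can
-- remove the same rows when it merges the dump file.
CREATE TABLE orders_deleted (
  order_id   NUMBER    NOT NULL,
  deleted_at TIMESTAMP DEFAULT SYSTIMESTAMP NOT NULL
);

CREATE OR REPLACE TRIGGER orders_delete_trg
AFTER DELETE ON orders
FOR EACH ROW
BEGIN
  INSERT INTO orders_deleted (order_id) VALUES (:OLD.order_id);
END;
/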

It's kind of lame, but we do this because we synchronize multiple servers, some of which have no connectivity at all; for those, the files have to be written to CD and sent to someone to do the updates.

-kevin
