I'm not sure I can help you because we don't have a "real" backup server, but I can describe what we do and perhaps some of it can be used in your situation.
We use a total of 12 servers around the state (one at each location plus one for development). They are all configured the same; the only differences are some of the forms & files on the development one (naturally), and the fact that the HQ server does a few more tasks than the remote office ones. Across the 11 production servers, everything is the same except for values within files that indicate the location name, IP# and such.
Since they are duplicates, any one of them can be used as a backup for any other, including one of the remote servers being used as backup for the unique tasks done on the HQ server.
We use ".INI" files that are unique for each location to specify which server is to be used for what purpose. Though there is a unique file for each location, every location's file is on every server (providing backup). All we need to do to substitute one server for another is to update these .INI files and distribute them. Since our servers are duplicates, if we had a backup server like you have we could get the necessary files from any of them to update the backup before putting it on line (though a method for identifying only the latest changes that needed to go onto the backup server would have to be developed).
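As a rough illustration of the per-location .INI idea, here is a hypothetical file and how an application might read it. The section and key names (Servers, Primary, Backup) and the server names are invented for this sketch; Tom's actual files will differ.

```python
# Hypothetical per-location .INI file naming the primary and backup
# servers; all section/key/server names here are invented examples.
import configparser

SAMPLE_INI = """
[Location]
Name=Headquarters

[Servers]
Primary=HQSRV01
Backup=REMSRV03
"""

config = configparser.ConfigParser()
config.read_string(SAMPLE_INI)

primary = config["Servers"]["Primary"]
backup = config["Servers"]["Backup"]
print(primary, backup)  # which servers this location should use
```

Swapping one server for another then amounts to editing these values and copying the updated file to every server, exactly as the post describes.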
Our user application sees that a file it uses has been updated on its server and copies the file to the user's PC. The custom agents running on the servers reload the .INI & other control files each time they run, so they always have the latest versions. Our user application is written such that if it can't reach its primary server it will automatically switch to a designated backup server. The custom agents on the servers recognize when they can't reach the HQ server, and a queue & retry process is initiated.
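The fallback and queue-and-retry behavior described above can be sketched as follows. This is not Tom's actual agent code; the `reachable()` check is a stand-in for a real connectivity test (pinging a share, opening a socket), and the queue here is just in memory.

```python
# Sketch of primary/backup fallback plus a queue-and-retry buffer.
# reachable() is a hypothetical stand-in for a real network test.
from collections import deque

def reachable(server, up_servers):
    # Stand-in: in real code this would try to contact the server.
    return server in up_servers

def pick_server(primary, backup, up_servers):
    """Use the primary server if reachable, else the designated backup."""
    if reachable(primary, up_servers):
        return primary
    return backup

retry_queue = deque()  # work held until the HQ server is reachable again

def send_or_queue(item, server, up_servers):
    if reachable(server, up_servers):
        return f"sent {item} to {server}"
    retry_queue.append((item, server))
    return "queued for retry"
```

A background loop would periodically drain `retry_queue` once the server becomes reachable again.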
Our distribution process is currently just a simple batch file that copies files from the source location (usually a series of "distribution" folders on the development server) to all of the production servers. We are developing a VB application for this distribution so there will be more intelligence and logging. Getting the files for the next release into these distribution folders is a manual process done by the developers.
What we have set up is a separate distribution folder for each target folder on the servers. For example, there is a folder for the MDFs, a folder for where executables are kept, a folder for the SERVER folder, etc. The batch file (and future program) takes the contents of a specific distribution folder and copies it to the associated folder on the servers. The distribution process also copies command files to the server's CONTROL folder to cause it to pause processing, reload the Job Management Database and restart processing. (We haven't figured out how to completely stop and reload the process for when the main service .INI file gets updated.)
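The distribution step described above (each distribution folder maps to a matching folder on every production server) could be sketched like this. The folder names, server names, and mapping are all made up for illustration; the real process is a batch file and a VB program in progress.

```python
# Illustrative equivalent of the batch-file distribution step: copy the
# contents of each distribution folder to the matching folder on every
# production server. All paths and names here are hypothetical.
import shutil
from pathlib import Path

SERVERS = ["SRV01", "SRV02"]  # stand-ins for the production servers

FOLDER_MAP = {        # distribution folder -> target folder on each server
    "mdf": "forms",
    "exe": "bin",
    "server": "SERVER",
}

def distribute(dist_root: Path, prod_root: Path) -> list:
    """Copy every file in each distribution folder to all servers."""
    copied = []
    for src_name, dst_name in FOLDER_MAP.items():
        src = dist_root / src_name
        if not src.is_dir():
            continue
        for f in src.iterdir():
            if not f.is_file():
                continue
            for server in SERVERS:
                dst = prod_root / server / dst_name
                dst.mkdir(parents=True, exist_ok=True)
                shutil.copy2(f, dst / f.name)  # preserves timestamps
                copied.append(str(dst / f.name))
    return copied
```

A real version would also drop the command files into each server's CONTROL folder to pause, reload, and restart processing, and would log what was copied.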
Our distribution process keeps a "backup" copy of everything that gets distributed: a backup of the latest distribution plus a copy of everything that has ever been put on the production servers, which lets us re-update a server if it needs it and allows for fallback if needed. Unfortunately, that archive keeps growing, so just updating a backup server like you have could be time consuming unless it was kept up-to-date and no more than one "release" behind. If it wasn't kept up-to-date, the alternative would be to do a date sort and select files of a certain age for updating. In your case, a VB application that used a date range as input could easily do the selection and copies.
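The date-based selection idea is simple enough to sketch. This is a Python illustration rather than the suggested VB application; it just walks a tree and keeps files modified on or after a cutoff time, which is what the backup update would copy.

```python
# Sketch of date-range selection: find files changed since a cutoff so
# only the recent releases need to be copied to the backup server.
from pathlib import Path

def files_changed_since(root: Path, cutoff: float) -> list:
    """Return files under root whose modification time is >= cutoff
    (cutoff is a Unix timestamp, e.g. from time.time())."""
    return [p for p in root.rglob("*")
            if p.is_file() and p.stat().st_mtime >= cutoff]
```

A range (both a start and an end date) would only need a second comparison in the filter.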
We will be installing updated servers shortly. Part of this update will include a server "instance" on the development server that will be a duplicate of what is on the production servers. This will not only allow us to pre-test how an update will operate before it is actually distributed but will provide an actual backup of what is on the production servers. Since our servers are duplicates of each other we don't need this capability for backup but it will be present.
I don't know how someone sets things up with a true backup server. For example, I don't know if it has the same IP or not. Using the same IP would work as long as only one server is attached to the network at a time, and it would make activating the backup very simple; however, it makes it impossible to load the same updates to both servers at the same time like we do. If they have different IP #s, then something like what we do would work as long as you were copying the files from a "development" server and not updating the production server directly.
If you are updating the production server directly, then perhaps you could set up a process that runs periodically and copies the associated files off of the server. MDF files are probably created on a developer's PC and then copied to the server. This copy process could be a canned batch routine (or a program) that also copies the files to another server for backup, updating the backup server when necessary.
The files that are changed when you modify the Job Management Database or the configuration are fairly easy to identify. The "Server" folder contains several "JFxxxxx.ini" files that contain the configuration stuff. That folder also contains "jfserver.jmd" which is the complete Job Management Database.
Thanks for the information, Tom. While our objective is a bit smaller in scope (we're only concerned with copying the JetForms application files) you have some good ideas for us to consider. Your last comment is especially helpful regarding the Job Management database and other .ini files that we definitely would want to "mirror" to our backup system.
Thanks again for the help.