It really depends on what "large" means and what type of OS you are running. If this is a Windows solution, I would probably just attach a tape drive to the server and run Backup Exec; it will back up fairly well in the background while the server is running.
On the Linux side, I'm not aware of many software packages that will do backups. Backup Exec has a Linux agent, but it cannot back up files while they are being accessed on Linux.
Personally, I would always do full backups in any case. Nothing is worse than doing 10 backups and losing 1. This way there is less to keep track of.
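On the Linux side, a full backup can also be done with standard tools. A minimal sketch, using temporary directories as stand-ins for the real data and backup paths (which would be site-specific):

```shell
#!/bin/sh
# Full-backup sketch with standard tools; the paths here are examples only.
set -e
SRC=$(mktemp -d)           # stands in for the data directory
DEST=$(mktemp -d)          # stands in for the backup target
echo "hello" > "$SRC/data.txt"

STAMP=$(date +%Y%m%d)
# One self-contained archive per run; -z gzips, -p preserves permissions.
tar -czpf "$DEST/full-$STAMP.tar.gz" -C "$SRC" .

# Verify the archive lists the file we just backed up.
tar -tzf "$DEST/full-$STAMP.tar.gz" | grep -q 'data.txt' && echo "backup ok"
```

Because each run produces one self-contained archive, restoring means extracting a single file rather than replaying a chain of incrementals.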
Let me get this straight. You have a database that changes (meaning you have customers who actually cause the database to change) every hundredth of a second?
I have heard some tall tales on Adobe's forums, but... please answer... If that were true, don't you have people you hire to deal with this sort of problem?
I think you overstated your problem. Please rewrite your question.
On Sun, 3 Aug 2008 02:42:39 +0000 (UTC), "Squonk64" wrote:
>Let me get this straight. You have a database that changes (meaning you have
>customers who actually cause the database to change) every hundredth of a
>second? I have heard some tall tales on Adobe's forums, but... please answer... If
>that were true, don't you have people you hire to deal with this sort of
>problem? I think you overstated your problem. Please rewrite your question.
The only database applications I know of that get that many entries
are part of the software that runs operating plants like refineries.
I used to write the documentation for refinery control software that
took in 30,000 entries every two minutes, because it polled every
controller, every sensor, in the plant.
That meant the control software had to solve 30,000 simultaneous
equations every two minutes...
Fun stuff. Huge application written in Lahey Fortran.
Wild Rose Websites www.wildrosewebsites.com
The database WILL be changing hundreds of times per minute. It does not yet; I am still developing the site. Every time a user views a submission, a record is added. This lets me sort submissions by popularity (most unique views in the last 7 days). That table alone could potentially have hundreds of records inserted every minute.
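The view-tracking scheme described above can be sketched with SQLite from Python. The table and column names here are hypothetical (the post doesn't give a schema); the point is the one-row-per-view table and the "unique views in the last 7 days" query:

```python
import sqlite3
from datetime import datetime, timedelta

# Hypothetical schema: one row inserted per submission view, as described above.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE submission_views (
        submission_id INTEGER,
        user_id       INTEGER,
        viewed_at     TEXT
    )
""")

now = datetime(2008, 8, 3)
rows = [
    (1, 101, now), (1, 102, now), (1, 101, now),   # submission 1: 2 unique viewers
    (2, 101, now), (2, 102, now), (2, 103, now),   # submission 2: 3 unique viewers
    (2, 104, now - timedelta(days=10)),            # older than 7 days, ignored
]
conn.executemany(
    "INSERT INTO submission_views VALUES (?, ?, ?)",
    [(s, u, t.isoformat()) for s, u, t in rows],
)

# Popularity = number of distinct viewers within the last 7 days.
cutoff = (now - timedelta(days=7)).isoformat()
popular = conn.execute("""
    SELECT submission_id, COUNT(DISTINCT user_id) AS unique_views
    FROM submission_views
    WHERE viewed_at >= ?
    GROUP BY submission_id
    ORDER BY unique_views DESC
""", (cutoff,)).fetchall()
print(popular)  # → [(2, 3), (1, 2)]
```

Note that `COUNT(DISTINCT user_id)` is what makes the views "unique" per user; a plain `COUNT(*)` would let repeat views inflate the ranking.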
It has been suggested to me that I should look into 'replication'. So that is what I am doing now.