4 Replies Latest reply on Aug 3, 2008 1:30 PM by AngryCloud

    Database backups

    AngryCloud Level 1
      For a very large database that is being changed at least hundreds of times per minute, which is the best method to use for daily backup:

      incremental or full?

      With incremental, if one backup goes wrong, then all of the subsequent backups that depend on it will be ruined. On the other hand, making a full backup of an extremely large database every day would cost me a lot of disk space and take a lot of time to process on the server.

      Losing a full day's worth of changes to the database would be horrible enough in itself. So risking losing any more than that is just not acceptable.
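      A common middle ground is to combine the two: a periodic full backup plus much smaller incremental backups in between, so one bad incremental only costs the changes since the last good full. As a rough sketch only, assuming the database is MySQL with binary logging enabled (the thread never says which engine is in use), a weekly full dump plus daily binary-log rotation might look something like this:

      # Rough sketch only: assumes MySQL with binary logging enabled (log-bin
      # in my.cnf), the mysqldump/mysqladmin tools installed, and credentials
      # supplied via an option file such as ~/.my.cnf. Paths are placeholders.
      import datetime
      import subprocess

      BACKUP_DIR = "/var/backups/mysql"   # hypothetical destination

      def full_backup():
          """Weekly full dump. --single-transaction gives a consistent snapshot
          of InnoDB tables without locking out the constant writes described above."""
          stamp = datetime.date.today().isoformat()
          outfile = f"{BACKUP_DIR}/full-{stamp}.sql"
          with open(outfile, "w") as out:
              subprocess.run(
                  ["mysqldump", "--single-transaction", "--flush-logs",
                   "--master-data=2", "--all-databases"],
                  stdout=out, check=True)

      def incremental_backup():
          """Daily 'incremental': rotate the binary log so the closed log files
          (every change since the last rotation) can be copied off the server."""
          subprocess.run(["mysqladmin", "flush-logs"], check=True)
          # The closed mysql-bin.NNNNNN files under the data directory would
          # then be copied to BACKUP_DIR by rsync, a tape job, etc.

      if __name__ == "__main__":
          # e.g. from cron: full backup on Sundays, incremental on other days.
          if datetime.date.today().weekday() == 6:
              full_backup()
          else:
              incremental_backup()

      Recovery would then be the most recent full dump plus a replay of the binary logs with mysqlbinlog, which keeps the exposure well under a full day of changes.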
        • 1. Re: Database backups
          Ben M Adobe Community Professional
          It really depends on what "large" means and what OS you are running on. If this is a Windows solution, I would probably just slap a tape drive on the server and run Backup Exec; it will back up in the background pretty well while the server is running.

          On the Linux side, I'm not aware of many software packages that will do backups. Backup Exec has a Linux Agent, but it cannot back up data while that data is being accessed on Linux.

          Personally, I would always do full backups in any case. There's nothing worse than doing 10 backups and losing one. This way there's less to keep track of.
          • 2. Re: Database backups
            Squonk64 Level 1
            Let me get this straight. You have a database that changes (meaning you have customers that actually cause the database to change) every hundredth of a second.

            I have heard some tall tales on Adobe's forums, but... please answer... if that were true, don't you have people you hire to deal with this sort of problem?

            I think you overstated your problem. Please rewrite your question.
            • 3. Re: Database backups
              Level 7
              On Sun, 3 Aug 2008 02:42:39 +0000 (UTC), "Squonk64"
              <webforumsuser@macromedia.com> wrote:

              >Let me get this straight. You have a database that changes (meaning you
              >have customers that actually cause the database to change) every
              >hundredth of a second.
              >
              > I have heard some tall tales on Adobe's forums, but... please answer...
              >if that were true, don't you have people you hire to deal with this
              >sort of problem?
              >
              > I think you overstated your problem. Please rewrite your question.

              The only database applications I know of that get that many entries
              are part of the software that runs operating plants like refineries.

              I used to write the documentation for refinery control software that
              took in 30,000 entries every two minutes, because it polled every
              controller, every sensor, in the plant.

              Which meant the control software had to solve 30,000 simultaneous
              equations every two minutes...

              Fun stuff. Huge application written in Lahey Fortran.

              Win
              --
              Win Day
              Wild Rose Websites www.wildrosewebsites.com
              windayNOSPAM@wildrosewebsites.com
              • 4. Re: Database backups
                AngryCloud Level 1
                The database WILL be changing hundreds of times per minute. It does not yet; I am still developing this site. Every time a user views a submission, a record is added. This allows me to sort submissions by popularity (most unique views in the last 7 days). That table alone could potentially have hundreds of records inserted every minute.
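                For what it's worth, here is a minimal, self-contained sketch of that popularity query. The table and column names (submission_views, viewer_id, viewed_at) are guesses rather than the site's real schema, and SQLite is used only so the example runs as-is; the same SQL translates to MySQL or SQL Server.

                import sqlite3

                conn = sqlite3.connect(":memory:")
                conn.executescript("""
                    -- one row per view; this is the table that grows by
                    -- hundreds of rows per minute
                    CREATE TABLE submission_views (
                        submission_id INTEGER NOT NULL,
                        viewer_id     TEXT    NOT NULL,  -- user id or IP
                        viewed_at     TEXT    NOT NULL   -- timestamp
                    );
                """)

                def record_view(submission_id, viewer_id):
                    # the insert that fires on every page view
                    conn.execute(
                        "INSERT INTO submission_views VALUES (?, ?, datetime('now'))",
                        (submission_id, viewer_id))

                def most_popular(limit=10):
                    # submissions ranked by unique viewers in the last 7 days
                    return conn.execute("""
                        SELECT submission_id,
                               COUNT(DISTINCT viewer_id) AS unique_views
                        FROM submission_views
                        WHERE viewed_at >= datetime('now', '-7 days')
                        GROUP BY submission_id
                        ORDER BY unique_views DESC
                        LIMIT ?""", (limit,)).fetchall()

                record_view(1, "10.0.0.1")
                record_view(1, "10.0.0.2")
                record_view(2, "10.0.0.1")
                print(most_popular())   # [(1, 2), (2, 1)]

                At hundreds of inserts per minute that table grows quickly, so an index on viewed_at (or a nightly roll-up table) would likely be needed to keep the 7-day query cheap.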

                It has been suggested to me that I should look into 'replication'. So that is what I am doing now.
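                Since replication came up: on MySQL of that era, replication means a second server replaying the primary's binary log, and the usual backup angle is to run the dumps against the replica so the primary never feels them. Purely as a hypothetical sketch (host names, credentials, and log coordinates are placeholders; the real coordinates would come from the --master-data line in a full dump), pointing a freshly loaded replica at the primary with the classic MySQL 5.x statements looks roughly like this:

                # Hypothetical sketch using the classic (MySQL 5.x era)
                # replication statements; run on the replica after loading
                # the full dump taken from the primary.
                import subprocess

                REPLICA_SETUP = """
                CHANGE MASTER TO
                    MASTER_HOST='db-primary.example.com',
                    MASTER_USER='repl',
                    MASTER_PASSWORD='secret',
                    MASTER_LOG_FILE='mysql-bin.000001',
                    MASTER_LOG_POS=4;
                START SLAVE;
                """

                # Feed the statements to the replica's local mysql client;
                # backups can then be taken from this replica instead of
                # the busy primary.
                subprocess.run(["mysql", "-u", "root", "-p", "-e", REPLICA_SETUP],
                               check=True)

                Worth noting that replication is not a backup by itself (an accidental DELETE replicates just as faithfully), so a dump schedule like the one discussed earlier in the thread would still be needed; the replica just gives a quiet place to run it.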