4 Replies Latest reply on Jul 10, 2013 12:50 PM by Pickory

    Phenomenal optimization technique!

    TᴀW Adobe Community Professional & MVP

      I just discovered an amazing way of optimizing a script, which I thought

      I'd share.


      I have a script that adds line numbers to an InDesign document

      (www.freelancebookdesign.com under the scripting tab).


      It works by adding a text frame alongside each InDesign frame and adds

      numbers in that frame.


      I've done quite a lot of optimization on it already, and the script

      starts at a very nice pace, but it soon slows down.


      So on a book with 100 pages, it's pretty quick. But adding line numbers

      to a 500-page book becomes very slow, because by the last 1/3 or so of

      pages the script has slowed to a crawl.


      Now, many of you will probably recognize the symptoms: with each page, a

      new text frame + contents has been created, so after 200 or so

      operations, the undo queue has become very long.


      The question then becomes: how to flush the undo queue.


      Now, I remember reading once a suggestion to do a "save as". Thing is, I

      don't want to "save as" the user's document -- they won't thank me if

      they need to undo a few steps before they ran the script!


      Of course, the script already uses a doScript call with

      UndoModes.ENTIRE_SCRIPT so it's all a single step. And we know that

      FAST_ENTIRE_SCRIPT isn't safe to use -- it's quite buggy.


      What I figured out, and am quite proud of, is to break up the loop

      that goes through those 500 pages into 10 loops of around 50 pages each

      -- and run each loop with a separate doScript (ENTIRE_SCRIPT) call. So

      we have a nested doScript.


      The thing about UndoModes.ENTIRE_SCRIPT seems to be that the undo queue

      is still written to while the script runs, and when the doScript call

      ends, all those entries are deleted and collapsed into a single step. So

      each time a doScript call finishes, even if your call involved a

      thousand steps, they will all be reduced to a single undo step -- and

      this is the equivalent of a "save as".


      And since each command seems to take progressively longer to execute the

      longer the undo queue is, by dividing the queue into 10 chunks of 50

      pages (instead of a single chunk of 500), a huge amount of time is saved.

      Every 50 iterations, the undo queue is flushed, and the script therefore

      continues at the same pace as when it was first run. (Obviously, if

      there are thousands of iterations, it is probably a good idea to add

      another level of nested doScript calls.)
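
      The pattern can be sketched like this. This is only a sketch: it runs
      only inside InDesign's ExtendScript engine, the chunk size of 50 is just
      an example, and addLineNumbersToPage is a hypothetical stand-in for
      whatever per-page work your own script does.

      ```javascript
      // Sketch of the chunked-undo pattern (InDesign ExtendScript only).
      var CHUNK_SIZE = 50; // pages per inner doScript; tune to taste

      function processAllPages(doc) {
          var total = doc.pages.length;
          for (var start = 0; start < total; start += CHUNK_SIZE) {
              // Each inner doScript collapses its undo steps into one
              // when it returns, keeping the undo queue short.
              app.doScript(
                  makeChunkWorker(doc, start, Math.min(start + CHUNK_SIZE, total)),
                  undefined, undefined,
                  UndoModes.ENTIRE_SCRIPT, "chunk");
          }
      }

      function makeChunkWorker(doc, from, to) {
          return function () {
              for (var i = from; i < to; i++) {
                  addLineNumbersToPage(doc.pages[i]); // hypothetical per-page work
              }
          };
      }

      // Wrap the whole thing so the user still sees a single undo step:
      app.doScript(function () { processAllPages(app.activeDocument); },
          undefined, undefined, UndoModes.ENTIRE_SCRIPT, "Add line numbers");
      ```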


      So, case in point: experiments with a 500-page book have shown a 3.6x

      speedup! What used to take 288 seconds now takes 80 seconds!


      I'm pretty impressed!


      Do you have a better way of dealing with undo slowness?




        • 1. Re: Phenomenal optimization technique!
          Peter Kahrel Adobe Community Professional & MVP

          Nice one, Ariel. Thanks for sharing.



          • 2. Re: Phenomenal optimization technique!
            Pickory Level 3



            That is very clever.


            Presumably it would be a doScript within a doScript to achieve the single undo.



            • 3. Re: Phenomenal optimization technique!
              TᴀW Adobe Community Professional & MVP

              Thanks. @Pickory: Yes, a nested doScript.


              Here's a test script. The script creates a new document and adds 1000

              pages, each with a text frame on it. It does this twice: the first time

              with a single doScript call, the second time with a nested doScript

              (i.e. 10 x 100 pages).


              The results I get are 48 seconds for the first run, 31 seconds for

              the second -- only 2/3 of the time it takes the first loop!


              And this is for a relatively simple operation: the more the script does,

              the more noticeable the advantage becomes (as I mentioned, my Line Number

              script took about 1/4 of the time on a long document!).




              // TEST 1: Single doScript to create 1000 pages, each with a text frame

              var myDoc = app.documents.add();
              alert("Starting Test 1");

              $.hiresTimer; // reading the timer resets it to zero
              app.doScript(main, undefined, undefined, UndoModes.ENTIRE_SCRIPT, "test");
              alert("Single doScript call took " + ($.hiresTimer / 1000000) + " seconds");

              function main(){
                   var myPage;
                   for (var i = 0; i < 1000; i++){
                       myPage = myDoc.pages.add();
                       myPage.textFrames.add();
                   }
              }


              // TEST 2: Nested doScript to create 1000 pages, each with a text frame

              myDoc = app.documents.add();
              alert("Starting Test 2");

              $.hiresTimer; // reset the timer again
              app.doScript(main2, undefined, undefined, UndoModes.ENTIRE_SCRIPT, "test");
              alert("Nested doScript version took " + ($.hiresTimer / 1000000) + " seconds.");

              function main2(){
                   for (var i = 0; i < 10; i++){
                       // Each inner call collapses its 100 pages' worth of
                       // undo entries into a single step when it returns.
                       app.doScript(nestedDoScript, undefined, undefined,
                           UndoModes.ENTIRE_SCRIPT, "The user will never see this");
                   }
              }

              function nestedDoScript(){
                   var myPage;
                   for (var j = 0; j < 100; j++){
                       myPage = myDoc.pages.add();
                       myPage.textFrames.add();
                   }
              }

              • 4. Re: Phenomenal optimization technique!
                Pickory Level 3



                That is quite an interesting find.


                Thinking out loud here.


                I wonder: what if the script called a plug-in to open an undo sequence, the script itself told InDesign not to record any undo steps, and then told the plug-in to close the undo sequence when the script had done its stuff? The plug-in would make the start and end snapshots. Does it have to be a plug-in, or could it be another script?