
    Are there best practices for optimizing ExtendScripts?

    jaloren28 Level 2

      I am relatively new to scripting, so I am wondering if there are tips or tricks for avoiding "obvious" pitfalls that cause slowdowns in a script, and/or whether there are profiling options I could use to figure this out. I have a script that works; it just runs extremely slowly. It takes about two days to parse a couple hundred files.

       

      Here's my scenario:

       

      The customer has a large documentation set. They have decided they want to convert about 100 terms into FrameMaker variables, and they want to automate the process of finding these terms and replacing them with those variables.

       

      To address this scenario, my script does the following:

      1. The user creates a two-column table in a FrameMaker document.
      2. The first column has the term they want to find.
      3. The second column has the name of the variable they want to replace the term with.
      4. In the first column, each term has a semicolon at the end.
      5. The script reads the FrameMaker table and does the following:
        1. On each row, it reads the text from both cells and concatenates them into a single string. For example, if the first cell contains the string foobar; and the second cell contains the string Variable1, then the concatenated string is foobar;Variable1.
        2. Pushes the concatenated string onto an array. From here on out, I'll call this array the VariableListArray. (A simplified sketch of this table-reading pass appears after this list.)
      6. The user provides a root directory that the script will search down from.
      7. The script recursively goes through each file and folder looking for FrameMaker files. It opens any FrameMaker file it finds, and once the file is open the script does the following:
        1. Uses a for loop to iterate through the VariableListArray.
        2. For each item in the array, call the split method with the semicolon as the delimiter. This returns a new array containing two elements: the term and the variable name. I'll call this new array the StringArray. For example, if the string is foobar;Variable1, then the new array has two elements: foobar and Variable1.
        3. Call a function that uses the Find method. This function requires three parameters: the document being searched, the term being searched for, and the name of the variable that the term will be replaced with.
        4. To call this find function, I pass the first element of the StringArray as the term to be replaced and the second element as the name of the variable that will replace it.
        5. The find function searches through the document looking for the term and, if it finds it, replaces it with the variable whose name I passed into the function.
        6. Repeat this find-and-replace for each item in the VariableListArray. (A simplified sketch of this walk-and-replace pass also appears after this list.)
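
       

      To make the logic concrete, here is a simplified sketch of the table-reading pass (step 5). It is not my actual code; the helper names buildVariableList and getCellString are just illustrative, and it assumes the two-column table is selected in the active document and that the usual Cell/GetText calls behave as documented.

      // Sketch of step 5: build VariableListArray from the selected two-column table.
      // buildVariableList and getCellString are illustrative names only.
      function buildVariableList(doc) {
          var variableListArray = [];
          var tbl = doc.SelectedTbl;                  // the two-column lookup table
          var row = tbl.FirstRowInTbl;
          while (row.ObjectValid()) {
              var termCell = row.FirstCellInRow;      // column 1: term ending in ";"
              var varCell  = termCell.NextCellInRow;  // column 2: variable name
              var term     = getCellString(termCell);
              var varName  = getCellString(varCell);
              variableListArray.push(term + varName); // e.g. "foobar;Variable1"
              row = row.NextRowInTbl;
          }
          return variableListArray;
      }

      function getCellString(cell) {
          // Concatenate the cell's text items into a plain string
          var items = cell.GetText(Constants.FTI_String);
          var s = "";
          for (var i = 0; i < items.length; i++) {
              s += items[i].sdata;
          }
          return s;
      }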
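
       

      And here is a similarly simplified sketch of steps 6 and 7: the recursive walk plus the per-document find-and-replace loop. Again, the helper names (walkFolder, processDoc, replaceTermWithVariable) are just for illustration, and I am assuming the standard SimpleOpen/Find/Clear/NewAnchoredFormattedObject calls; my real script differs in the details, but the shape of the loops is the same.

      // Sketch of steps 6-7: recurse through folders, open each FrameMaker file,
      // and run the find-and-replace loop for every entry in VariableListArray.
      function walkFolder(folder, variableListArray) {
          var entries = folder.getFiles();
          for (var i = 0; i < entries.length; i++) {
              var entry = entries[i];
              if (entry instanceof Folder) {
                  walkFolder(entry, variableListArray);          // recurse into subfolders
              } else if (/\.fm$/i.test(entry.name)) {
                  var doc = SimpleOpen(entry.fsName, false);     // open without dialogs
                  if (doc.ObjectValid()) {
                      processDoc(doc, variableListArray);
                      doc.SimpleSave(doc.Name, false);           // save changes in place
                      doc.Close(Constants.FF_CLOSE_MODIFIED);
                  }
              }
          }
      }

      function processDoc(doc, variableListArray) {
          for (var i = 0; i < variableListArray.length; i++) {
              // e.g. "foobar;Variable1" -> ["foobar", "Variable1"]
              var stringArray = variableListArray[i].split(";");
              replaceTermWithVariable(doc, stringArray[0], stringArray[1]);
          }
      }

      function replaceTermWithVariable(doc, term, varName) {
          // Set up the find properties for a plain-text search
          var findParams = AllocatePropVals(1);
          findParams[0].propIdent.num = Constants.FS_FindText;
          findParams[0].propVal.valType = Constants.FT_String;
          findParams[0].propVal.sval = term;

          // Start at the beginning of the main flow
          var loc = new TextLoc(doc.MainFlowInDoc.FirstTextFrameInFlow.FirstPgf, 0);
          var found = doc.Find(loc, findParams);
          while (found.beg.obj.ObjectValid()) {
              doc.TextSelection = found;
              doc.Clear(0);                                      // delete the found term
              // Insert the variable at the collapsed insertion point
              doc.NewAnchoredFormattedObject(Constants.FO_Var, varName, doc.TextSelection.beg);
              found = doc.Find(doc.TextSelection.beg, findParams);
          }
      }

      The top-level call is then roughly walkFolder(new Folder(rootPath), buildVariableList(app.ActiveDoc)).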

       

      I have found that, on average, this takes half an hour or longer to parse a document of 100 pages or more. If I am parsing several hundred FrameMaker files (and I am), the entire process can take several days. Note that my testing so far has all been on local files; I am not doing this testing over NFS.

       

      So, based on the logic I have described, does anyone have recommendations about how I could optimize my script to speed things up?