Acrobat does a very bad job of managing the computer's CPU and memory
when running complex scripts. It has happened to me many times that it gets
stuck when processing large files or a large number of files.
My only advice is to optimize the code as much as possible, run it
in batches (if possible), and avoid heavily nested loops that try to do
everything at once.
For example, if you want to add an annotation to a piece of text, use one
loop to look for the text you want to add the annotation to, save that
information in a variable, and then use a separate loop to actually add the
annotations themselves. That proves less likely to get stuck than doing
everything in a single loop.
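A minimal sketch of that two-pass pattern, assuming the words have already been pulled out of the document. The word arrays stand in for what Acrobat's getPageNthWord would return, and the Acrobat API calls (addAnnot, getPageNthWordQuads) are shown only in comments to indicate where pass 2 would run:

```javascript
// Two-pass sketch: pass 1 only collects word locations; pass 2 (inside
// Acrobat) would then do the annotation work as a single flat loop.
function findWordLocations(pages, target) {
  var hits = [];
  for (var p = 0; p < pages.length; p++) {
    for (var w = 0; w < pages[p].length; w++) {
      if (pages[p][w] === target) {
        hits.push({ page: p, word: w });
      }
    }
  }
  return hits;
}

// Pass 2, inside Acrobat (Acrobat JavaScript API, for illustration only):
// for (var i = 0; i < hits.length; i++) {
//   this.addAnnot({
//     page: hits[i].page,
//     type: "Highlight",
//     quads: this.getPageNthWordQuads(hits[i].page, hits[i].word)
//   });
// }
```

Keeping pass 2 flat and short means that if something does get stuck, far less work is at risk, and the search pass can even be rerun separately.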
Not a solution or even a suggestion... just observations. Where we can influence performance (and it isn't often), there is a way of looking at performance tuning. Essentially, a program will go as fast as it can until it reaches one of its limits. A limit might be one of the following:
* User. Waiting for you. Probably not the case here.
* CPU. Once it's using all of your CPU (or all of one CPU), that's it. How to test? Performance monitors.
* Disk. If a program needs to read or write disk, then it will usually stop what it is doing until that is finished, and CPU usage drops. It isn't as simple as it seems, since disk caching means many things can be served from memory if they were read or written recently. How to test? Performance monitors. A program which seems not to need disk can become mysteriously disk bound if, for example, it is writing a lot of logs.
* Memory. If there isn't enough free RAM there will be paging. That is, disk reads/writes. Now you become disk bound. How to test? Performance monitors; close down other programs especially all browsers.
* Other external resource (e.g. network).
* Inter process or inter thread communication. Where a program wastes most of its time talking to another process or another part of itself. How to test: virtually impossible to monitor, but high SYSTEM CPU is a clue. Virtually impossible to improve.
* Waiting for the system to do something. In this case the system is going to hit one of the things above.
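When none of the external limits above seem to apply, a crude way to check whether the script itself is the bottleneck is to bracket suspect sections with timestamps. This is a generic sketch (timeIt is a hypothetical helper; the console.println call in the comment is how Acrobat's JavaScript console would report it):

```javascript
// Crude timing sketch: wrap a section of work and measure how long it took.
function timeIt(label, fn) {
  var start = new Date().getTime();
  var result = fn();
  var ms = new Date().getTime() - start;
  // Inside Acrobat you could report with:
  // console.println(label + ": " + ms + " ms");
  return { result: result, ms: ms };
}

// Example: time a simple summing loop.
var timed = timeIt("sum", function () {
  var s = 0;
  for (var i = 0; i < 100000; i++) s += i;
  return s;
});
```

Comparing timings before and after a change is far more reliable than guessing which loop is the slow one.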
So gains in processing time are achieved by determining the best place to put code. For example, use the on-blur action for a process that only has to occur when a field value changes, such as testing a value and issuing a warning, rather than running a full calculation for that same action.
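As a sketch of that idea, a lightweight check like the following could run in a field's validate or on-blur event instead of a full calculation. The helper and its limits are hypothetical; the app.alert call in the comment is how Acrobat would issue the warning:

```javascript
// Quick range check: only needs to run when the field's value changes,
// unlike a calculate script that reruns on every edit to any dependency.
function checkRange(value, min, max) {
  if (value < min || value > max) {
    // Inside Acrobat the warning would be issued with:
    // app.alert("Value must be between " + min + " and " + max);
    return false;
  }
  return true;
}
```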
Whenever any field used within a calculation is modified, all calculations in the form are reprocessed. Another way to improve performance is to create a function, or a predefined process, that takes some input values, performs a specific task, and may return a result. So if I have a calculation like computing the elapsed time between 2 time values on a time sheet, rather than coding the conversion of the time strings to a number, performing the calculation, and formatting the result for each day, I could define a function that takes the 2 time values, computes the difference, and returns the formatted result. Since the function has already been syntax checked and tokenized, it will run faster on each call for each day.
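A sketch of such a helper, assuming the times come in as "HH:MM" strings (the function name, field names, and format are assumptions for illustration). Defined once, e.g. as a document-level script, each day's calculation reduces to a single call:

```javascript
// Hypothetical document-level helper: elapsed time between two "HH:MM"
// strings, returned as a formatted decimal-hours string.
function elapsedHours(startTime, endTime) {
  function toMinutes(t) {
    var parts = t.split(":");
    return Number(parts[0]) * 60 + Number(parts[1]);
  }
  var diff = toMinutes(endTime) - toMinutes(startTime);
  if (diff < 0) diff += 24 * 60; // allow shifts that cross midnight
  return (diff / 60).toFixed(2); // formatted result, e.g. "8.50"
}

// In each day's calculation script (field names are assumptions):
// event.value = elapsedHours(this.getField("Start").value,
//                            this.getField("End").value);
```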
It is also possible to combine the calculations for many fields into one larger calculation in a single form field, and it is even possible to call a user-defined function from within another user-defined function, so one eliminates the repeated initialization of the code for each field.
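For instance, one "driver" calculation could compute several results in a single pass and push them out to the other fields. The field names and the 10% tax rate here are assumptions for illustration:

```javascript
// Combined-calculation sketch: one function produces all the results, so
// only one calculate script has to initialize per update instead of three.
function computeTotals(amounts) {
  var subtotal = 0;
  for (var i = 0; i < amounts.length; i++) subtotal += amounts[i];
  var tax = subtotal * 0.1; // assumed 10% rate
  return { subtotal: subtotal, tax: tax, total: subtotal + tax };
}

// A single hidden driver field's calculate script could then distribute
// the results (Acrobat API, field names hypothetical):
// var r = computeTotals([/* row values */]);
// this.getField("Subtotal").value = r.subtotal;
// this.getField("Tax").value = r.tax;
// this.getField("Total").value = r.total;
```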
If you are having to search through the PDF for each word, then there is a lot of overhead. I have noticed with such scripts that the first use takes significantly longer than the next call, as long as Acrobat is not closed. This may be due to the computer's use of virtual memory: on the second call the code is already in RAM rather than in the swap file.
You might also get faster processing by using the Action/Batch processing of Acrobat Professional.