First, make sure you set RequestTimeout in the URL parameter to something much higher than the expected run time of the page.
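As a sketch (the 3600-second value is just an example; pick something safely above your expected run time), you can set it at the top of the page instead of in the URL:

```cfml
<!--- Allow this request to run up to one hour (3600 seconds) --->
<cfsetting requestTimeout="3600">
```

Note that the `?requestTimeout=...` URL-parameter form only takes effect when request timeouts are enabled in the ColdFusion Administrator, so the cfsetting form is usually the safer bet.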
Sounds like your browser may also be timing out, which would most likely happen even if you set RequestTimeout to something huge. I would put <cfflush> in every loop iteration and send a progress update to the browser; signaling that data is still coming in just might prevent it from timing out.
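A minimal sketch of that pattern, assuming your records come back in a query named allVersions (the query and column names here are hypothetical):

```cfml
<cfloop query="allVersions">
    <!--- ... do the real processing for this record here ... --->

    <!--- Send a progress line to the browser --->
    <cfoutput>Processed record #allVersions.currentRow# of #allVersions.recordCount#<br></cfoutput>

    <!--- Push the output buffer down the wire so the browser sees activity --->
    <cfflush>
</cfloop>
```

Flushing on every iteration is chatty; for large loops you might only flush every N rows.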
Have you tried method="head" in your HTTP calls? If you aren't interested in the body, don't bother doing a GET.
You could probably consolidate all of those queries down to a single query too, then loop using cfoutput, grouping on "Comp_key". I'm not even sure "allVersions" is necessary.
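Something along these lines, assuming a single consolidated query (the datasource, table, and column names are hypothetical):

```cfml
<!--- One query instead of a query per component; must be ordered by the group column --->
<cfquery name="allVersions" datasource="myDSN">
    SELECT Comp_key, version_label
    FROM versions
    ORDER BY Comp_key
</cfquery>

<!--- Outer cfoutput emits once per distinct Comp_key; inner cfoutput loops its rows --->
<cfoutput query="allVersions" group="Comp_key">
    <h3>Component #Comp_key#</h3>
    <cfoutput>
        #version_label#<br>
    </cfoutput>
</cfoutput>
```

The group attribute only works as expected when the query is sorted on the grouped column, hence the ORDER BY.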
If that still doesn't do it, try using maxrows and startrow in your cfoutput. Maybe add a cflocation at the end with a URL parameter passing startrow + maxrows to the next instance. Then just sit back and let it run; you can watch the URL update in the browser as the records are processed.
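A rough sketch of that chunk-and-redirect approach, assuming the page is called process.cfm and the query is named allVersions (both hypothetical):

```cfml
<!--- Which row this instance starts at; defaults to 1 on the first request --->
<cfparam name="url.startRow" default="1">
<cfset maxRows = 500>

<!--- Process only this chunk of the result set --->
<cfoutput query="allVersions" startrow="#url.startRow#" maxrows="#maxRows#">
    <!--- ... process one record ... --->
</cfoutput>

<!--- If rows remain, reload the page pointed at the next chunk --->
<cfif url.startRow + maxRows LTE allVersions.recordCount>
    <cflocation url="process.cfm?startRow=#url.startRow + maxRows#" addtoken="no">
</cfif>
```

Each request then only has to survive one chunk's worth of work before handing off, which sidesteps both the server and browser timeouts.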
Thanks - that's really helpful information.
How high is too high for the page timeout setting? I think the entire task would take about 30 minutes to run. Is it taxing on a CF server to have a page running for that long?