>>My question is, How can I determine what took the other 39,875ms?
The dump. cfdump is slow, especially when outputting complex types (structs, xml, queries). Try running the page without the dump. You'll see it loads quickly. Don't try to benchmark your page with a dump in it.
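To confirm the dump is the bottleneck, you can time the query and the dump separately with getTickCount(). A rough sketch (the datasource and query here are placeholders, not your actual code):

```cfm
<!--- Time the query and the dump independently --->
<cfset queryStart = getTickCount()>
<cfquery name="qData" datasource="myDSN">
    SELECT id, name FROM myTable
</cfquery>
<cfset queryTime = getTickCount() - queryStart>

<cfset dumpStart = getTickCount()>
<cfdump var="#qData#" metainfo="no">
<cfset dumpTime = getTickCount() - dumpStart>

<cfoutput>Query: #queryTime#ms, Dump: #dumpTime#ms</cfoutput>
```

You'll almost certainly see the dump dwarfing the query. Setting metainfo="no" also keeps cfdump from rendering the SQL and parameter metadata on top of the data itself.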
I agree with Jason on this. But, just out of curiosity, how many columns are being returned in your query? Forty seconds is not out of the question, especially if there are many columns.
The rest of the time was spent by the browser rendering the HTML sent to it. If you have metainfo="yes", the browser not only has to render the query results, it also has to render the SQL and all the parameters. If debugging is turned on, that's even more information to be rendered.
Good catches, guys.
I removed the dump and had it just output the recordcount, and the page loaded very quickly.
I then trimmed up the SQL (they were using SELECT *, so I selected just the fields needed). Originally there were about 40 fields and 164 rows. Now there are 8 fields and 164 records.
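For anyone following along, the trimmed query looked roughly like this (table and column names are illustrative, not the real schema):

```cfm
<!--- Before: SELECT * pulled ~40 columns per row --->
<!--- After: name only the columns the page actually uses --->
<cfquery name="qUsers" datasource="myDSN">
    SELECT user_id, first_name, last_name, email,
           phone, city, state, created_date
    FROM users
</cfquery>
<cfoutput>#qUsers.recordCount# records</cfoutput>
```

Less data over the wire, less for CF to buffer, and far less HTML for the browser to render if you do dump it.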
I took a look at the code CF dumped and it's rather convoluted for such a simple table-based output. Ugh.
We're getting into the Error-Handling, Logging, Auditing, and Debugging part of our framework, and I'm very keen on running things as optimally as possible.
As far as CF-based metrics go, does only Enterprise/Developer offer such functionality, or is there a built-in service I can tie into that will let me monitor performance-based values for CF?
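So far the closest built-in hook I've found is getMetricData(). I'm not certain which editions expose it, and it requires monitoring to be enabled in the ColdFusion Administrator, but as a sketch:

```cfm
<!--- Requires performance monitoring to be enabled
      in the ColdFusion Administrator --->
<cfset perf = getMetricData("PERF_MONITOR")>
<cfdump var="#perf#">
<!--- Surfaces server-level values such as average request
      time, running/queued requests, and page hits --->
```

If anyone knows whether this works on Standard, or whether the Server Monitor is the only real option, I'd appreciate it.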