Rather than worry about releasing all of that memory, I think you should instead worry about what your application is doing to require such a massive memory footprint.
No <cfquery> tag should result in 1GB of memory utilization.
Is there anything else strange that your application is doing? Lots and lots of string concatenation?
You are correct in your thinking! Unfortunately I have a client who needs query data put into an Excel spreadsheet, and yes, we're talking 50,000 to 75,000 rows of data per spreadsheet. It's what the client wants, and they're not much interested in anything else at the moment. They currently have some shell scripts running against a DB2 database that build the spreadsheets from the query; when I tried to run the same query in ColdFusion it came back with 100,000 records, and that is when I experienced the JRUN memory issue.
So the question came up as a result of my test, but it still remains: what do I need to do to get JRUN to release the memory?
Thanks for your help,
Turn off Maintain Connections ("Maintain connections across client requests") under the Advanced Settings of the datasource and see if that helps.
Actually, I tried that already. I even restarted the server after I applied the setting for that particular datasource, and the JRUN server still will not release all of that memory. Do you think I need to apply another patch from Sun for the JVM? I've Googled the daylights out of this and can't find an answer...
Thanks for the suggestion,
I'm probably not the best person to answer this, but I believe the JRUN service starts small and then takes up memory as needed. However, it doesn't really release all of that memory when a script completes, as you might expect.
There is a garbage collection process that runs periodically and will release some of the memory, but I'm not sure you will get it all back.
I was playing with uploading some huge files on a small server. It sucked up about 90% of the memory and never released it. However, CF continued to run fine.
So how are you using this query? Are you using the query results to build a CSV/Excel export file? If so, it sounds like you're buffering the entire export file contents in memory before writing it out to disk. This is generally a bad idea, especially for large files such as yours.
Are my assumptions correct here? If not, can you perhaps post some of the code that is behind this process to clarify?
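For what it's worth, the safe pattern for an export that size is to stream each row to disk as you go rather than building the whole file in a variable (in CF, appending with <cffile> inside the query loop achieves this). Here's a rough sketch of the idea in Java; the column names and the row list are made up for illustration:

```java
import java.io.BufferedWriter;
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.List;

public class StreamingCsvExport {
    // Write rows one at a time so only a single row is ever held in memory,
    // instead of concatenating 100,000 rows into one giant string first.
    static void export(Path out, List<String[]> rows) throws IOException {
        try (BufferedWriter w = Files.newBufferedWriter(out)) {
            w.write("id,name,amount"); // hypothetical header
            w.newLine();
            for (String[] row : rows) { // in real code: iterate the result set
                w.write(String.join(",", row));
                w.newLine();
            }
        } // writer flushed and closed here; nothing large is retained
    }

    public static void main(String[] args) throws IOException {
        Path out = Files.createTempFile("export", ".csv");
        export(out, List.of(new String[]{"1", "widget", "9.99"},
                            new String[]{"2", "gadget", "4.50"}));
        System.out.println(Files.readAllLines(out).size()); // 3 (header + 2 rows)
    }
}
```

The key point is that the writer's buffer is a few KB regardless of how many rows pass through it.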
... actually, I've not even gotten that far. I simply ran the query via <cfquery> (select top 100000 * from dummydata) and printed a simple "complete" at the end. I didn't even try to output a <cfdump>, and I turned off debugging. I also tried "select * from dummydata", since there are exactly 100,000 rows of data in the table. Now granted, there are 155 columns in the table, but it's a SQL Server 2000 DB and the longest piece of data is 20 characters. The query finishes OK, but like I said, JRUN's RAM usage shoots up and won't come back down. Any ideas? I haven't even gotten to using the data from the query yet!
How about paging the query using limit/offset to break the dataset up into smaller chunks? That may take the pressure off the JVM and garbage collection. Also, make sure query caching is off.
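To illustrate the paging idea, here's a minimal Java sketch with an in-memory list standing in for the table (note that SQL Server 2000 has no OFFSET clause, so there you would page with TOP plus a filter on the last key seen; the function names here are made up):

```java
import java.util.List;
import java.util.stream.Collectors;
import java.util.stream.IntStream;

public class PagedFetch {
    // Stand-in for the database table: 100,000 fake rows. In real code each
    // page would be a separate query returning only `limit` rows.
    static final List<Integer> TABLE =
        IntStream.range(0, 100_000).boxed().collect(Collectors.toList());

    // One "page" of the result set, like LIMIT/OFFSET in SQL.
    static List<Integer> fetchPage(int offset, int limit) {
        int end = Math.min(offset + limit, TABLE.size());
        return offset >= end ? List.of() : TABLE.subList(offset, end);
    }

    public static void main(String[] args) {
        int pageSize = 10_000;
        long total = 0;
        for (int offset = 0; ; offset += pageSize) {
            List<Integer> page = fetchPage(offset, pageSize);
            if (page.isEmpty()) break;
            total += page.size(); // process and discard each chunk here
        }
        System.out.println(total); // 100000
    }
}
```

Each iteration holds at most one page, so peak memory is bounded by the page size rather than the full result set.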
In article <email@example.com>, "rich.leach" wrote:
> came back fine. However, I tracked the memory usage on my server and
> noticed that jrun.exe climbed all the way up to close to 1 gb (I
> adjusted the min/max heap settings in the CF Admin to allow that) but
> JRUN never let go of all that memory, it's continuing to ping at
> close to 1 gb.
That's standard behavior for a Java application in general. Java
performs its own memory management: if you tell it there's a 1GB heap,
it will allocate that memory from the operating system as needed, but
it will then manage its own needs within that 1GB of allocated space.
If you look at the JVM memory usage directly, you'll see it isn't
using much memory after your page has run (because garbage collection
returns it to the heap), but the heap memory all belongs to Java, so
the operating system shows it as still allocated.
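You can actually watch this from inside the JVM. A small Java sketch (the byte array is just a stand-in for a big query result, and exact numbers will vary by JVM):

```java
public class HeapDemo {
    public static void main(String[] args) {
        Runtime rt = Runtime.getRuntime();

        byte[] big = new byte[50_000_000]; // ~50MB, like a big query result
        big[0] = 1;                        // touch it so it can't be optimized away
        long heapDuring = rt.totalMemory();                // heap claimed from the OS
        long usedDuring = heapDuring - rt.freeMemory();    // actually in use

        big = null;  // the "page request" finishes; the result is unreachable
        System.gc(); // advisory, but typically triggers a full collection

        long heapAfter = rt.totalMemory();
        long usedAfter = heapAfter - rt.freeMemory();

        // Used memory inside the heap drops back down, but the heap itself
        // (what the OS sees as the process footprint) stays claimed by the JVM.
        System.out.println(usedAfter < usedDuring);
    }
}
```

That gap between `totalMemory()` and used memory is exactly the "held but free" space Sean describes.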
Hope that explains it?
Sean A Corfield
An Architect's View -- http://corfield.org/
Thanks Sean, I appreciate the insight!
Rich, I've run into the same conundrum before, but Sean's point is the key: the memory is being "held" by the JVM, but most of it is available after the query runs. It's simply more efficient for the JVM to hang on to all the memory it claimed, in case you run another big query. After I installed SeeFusion to gather stats, I was able to build a nice little CF graph page that shows me allocated vs. free vs. used memory, and now I can see that anytime I run a big-pig query like that, the JVM gobbles up a ton of memory, uses it, and then frees it but keeps it claimed.
The other trick I've read about: after you're done processing your query and have built your output, run another query with the same name that produces just one row. It's supposed to ensure the original query object is released from memory. I'm not sure it really does that, but it helps me sleep more soundly.
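I can't vouch for what CF does internally, but at the JVM level that trick amounts to making the old query result unreachable so the garbage collector can reclaim it. A hypothetical Java sketch of the "overwrite the variable" idea, using a WeakReference just to observe the collection:

```java
import java.lang.ref.WeakReference;

public class OverwriteDemo {
    public static void main(String[] args) throws InterruptedException {
        byte[] bigQuery = new byte[50_000_000]; // stand-in for the 100,000-row result
        WeakReference<byte[]> watcher = new WeakReference<>(bigQuery);

        // "Re-run the query" so the variable now holds a tiny result;
        // the old 50MB array is no longer reachable from anywhere.
        bigQuery = new byte[1];

        // System.gc() is advisory, so poll a few times rather than assuming
        // a single call collects immediately.
        for (int i = 0; i < 10 && watcher.get() != null; i++) {
            System.gc();
            Thread.sleep(50);
        }
        System.out.println(watcher.get() == null); // old result reclaimed once GC ran
    }
}
```

The overwrite doesn't force a collection; it just removes the last strong reference so the next GC cycle is free to take the old object.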