JRUN not releasing memory

Contributor, Aug 03, 2007

Hi-

I installed CF8 on Windows using the standard configuration (JRUN, IIS, SQLServer) and needed to run some threshold tests against some sample data, nothing out of the ordinary - other than the size of the result set - I was requesting 100,000 records from one table, which came back fine. However, I tracked the memory usage on my server and noticed that jrun.exe climbed all the way up to close to 1 gb (I adjusted the min/max heap settings in the CF Admin to allow that) but JRUN never let go of all that memory, it's continuing to ping at close to 1 gb. Does anyone know how I go about getting JRUN/CF to automatically release the memory (besides restarting CF)?

Thanks,

Rich
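
For reference, the min/max heap settings in the CF Administrator end up as standard JVM flags on the java.args line of jvm.config (in the CF8 server configuration, {cf_root}/runtime/bin/jvm.config); a minimal sketch, with illustrative values matching the ~1 GB ceiling described above:

# illustrative values only; -Xmx is the ceiling the JVM will claim from the OS
java.args=-Xms512m -Xmx1024m -XX:MaxPermSize=192m
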
TOPICS: Advanced techniques

1 Correct answer: the reply from Contributor, Aug 06, 2007 (below).

Engaged, Aug 03, 2007

Rather than worry about releasing all of that memory, I think you should instead worry about what your application is doing to require such a massive memory footprint.

No <cfquery> tag should result in 1GB of memory utilization.

Is there anything else strange that your application is doing? Lots and lots of string concatenation?
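
If heavy string concatenation does turn out to be part of it, one standard fix is to build large strings through java.lang.StringBuilder rather than repeated & concatenation, which copies the whole string on every append; a minimal sketch (loop body illustrative):

<!--- build one large string without re-copying it on every iteration --->
<cfset sb = createObject("java", "java.lang.StringBuilder").init()>
<cfloop from="1" to="100000" index="i">
    <cfset sb.append("row #i#" & chr(10))>
</cfloop>
<cfset bigString = sb.toString()>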

Contributor, Aug 03, 2007

You are correct in your thinking! Unfortunately I have a client who needs data from a query put into an Excel spreadsheet, and yes, we're talking 50,000 to 75,000 rows of data per spreadsheet. It's what the client wants, and they're not much interested in anything else at the moment. They currently have shell scripts running against a DB2 database that build the spreadsheets from the query; when I tried to run the same query in ColdFusion, it came back with 100,000 records, and that is when I experienced the JRUN memory issue.

So the question came up as a result of my test, but it still stands: what do I need to do to get JRUN to release the memory?

Thanks for your help,

Rich


Engaged, Aug 03, 2007

Turn off Maintain Connections ("Maintain connections across client requests") under the Advanced Settings of the datasource and see if that helps.

Contributor, Aug 03, 2007

Actually, I tried that already and even restarted the server after applying the setting to that particular datasource; the JRUN server still will not release all of that memory. Do you think I need to apply another patch from Sun for the JVM? I've Googled the daylights out of this and can't find an answer...

Thanks for the suggestion,

Rich

Explorer, Aug 04, 2007

I'm probably not the best person to answer this, but I think that the jrun service starts small, then takes up memory as needed. However, it doesn't really release all that memory when a script completes, as you might expect.

There is a garbage collection process that runs every now and then and releases some of the memory, but I'm not sure you will get it all back.

I was playing with uploading some huge files on a small server. It sucked up about 90% of the memory and never released it. However, CF continued to run fine.

Engaged, Aug 05, 2007

So how are you using this query? Are you using the query results to build a CSV/excel export file? If so, it sounds like you're buffering the entire export file contents into memory before writing it out to disk. This is generally a bad idea, especially for large files such as yours.

Are my assumptions correct here? If not, can you perhaps post some of the code that is behind this process to clarify?
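
If that is indeed what's happening, the usual fix is to append each row to the export file inside the loop rather than accumulating the whole document in a variable first; a minimal CSV-flavored sketch (file path, datasource, and column names are all illustrative):

<cfquery name="export" datasource="testDSN">
    SELECT id, name FROM dummydata
</cfquery>
<!--- write the header once, then one small append per row --->
<cfset csvPath = "c:\exports\dump.csv">
<cffile action="write" file="#csvPath#" output="id,name">
<cfloop query="export">
    <cffile action="append" file="#csvPath#" output="#id#,#name#">
</cfloop>

Per-row cffile appends are simple but slow, since the file is reopened on each write; holding a java.io.BufferedWriter open across the loop is the faster variant of the same idea.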

Contributor, Aug 05, 2007

... actually, I've not even gotten that far. I simply ran the query via cfquery (select top 100000 * from dummydata) and printed a simple "complete" at the end. I didn't even try to output a cfdump, and I turned off debugging. I also tried "select * from dummydata" since there are exactly 100,000 rows of data in the table. Now granted, there are 155 columns in the table, but it's a SQL Server 2000 DB and the longest piece of data is 20 chars in length. The query finishes OK, but like I said, JRUN shoots up in RAM usage and won't give it back. Any ideas? I haven't even gotten to using the data from the query yet!
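
For anyone reproducing this, the test described amounts to nothing more than the following (datasource name illustrative):

<cfquery name="dummyTest" datasource="testDSN">
    SELECT TOP 100000 * FROM dummydata
</cfquery>
<cfoutput>complete</cfoutput>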

New Here, Aug 05, 2007

How about paging the query to break the dataset up into smaller chunks, as in the sketch below? This may take the pressure off the JVM and garbage collection. Also, make sure query caching is off.
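
One wrinkle: SQL Server 2000 has no LIMIT/OFFSET, so chunking there is usually done keyset-style with TOP; a minimal sketch, assuming dummydata has an indexed numeric id column (all names illustrative):

<cfset lastId = 0>
<cfloop condition="true">
    <cfquery name="chunk" datasource="testDSN">
        SELECT TOP 10000 *
        FROM dummydata
        WHERE id > <cfqueryparam value="#lastId#" cfsqltype="cf_sql_integer">
        ORDER BY id
    </cfquery>
    <cfif chunk.recordCount EQ 0><cfbreak></cfif>
    <!--- process/append this chunk to the export here --->
    <cfset lastId = chunk.id[chunk.recordCount]>
</cfloop>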

LEGEND, Aug 05, 2007

In article <f8vhbe$94q$1@forums.macromedia.com> "rich.leach"
<webforumsuser@macromedia.com> wrote:
> came back fine. However, I tracked the memory usage on my server and
> noticed that jrun.exe climbed all the way up to close to 1 gb (I
> adjusted the min/max heap settings in the CF Admin to allow that) but
> JRUN never let go of all that memory, it's continuing to ping at
> close to 1 gb.

That's standard behavior for a Java application in general. Java
performs its own memory management and if you tell it there's 1Gb heap
it will allocate that memory from the operating system if needed but
then it will manage its own needs within that 1Gb of allocated space.

If you look at the JVM memory usage directly, you'll see it isn't
using much memory after your page has run (because garbage collection
returns it to the heap) but the heap memory all belongs to Java so the
operating system shows it as still allocated.

Hope that explains it?

Sean A Corfield
An Architect's View -- http://corfield.org/


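Sean's distinction (memory claimed from the OS vs. memory actually in use inside the heap) is easy to see from a CF page via java.lang.Runtime; a minimal sketch, no CF-specific APIs assumed:

<cfset rt = createObject("java", "java.lang.Runtime").getRuntime()>
<cfoutput>
    Heap claimed from the OS: #int(rt.totalMemory() / 1048576)# MB<br>
    Free within that heap: #int(rt.freeMemory() / 1048576)# MB<br>
    Ceiling (-Xmx): #int(rt.maxMemory() / 1048576)# MB
</cfoutput>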

Contributor, Aug 06, 2007

Thanks Sean, I appreciate the insight!

Rich

Explorer, Oct 17, 2007

Rich, I've run into the same conundrum before, but Sean's point is the key--the memory is being "held" by the JVM but most of it is available after the query runs. It's just more efficient for the JVM to hang on to all that memory it claimed in case you run another big query. After I installed SeeFusion to gather stats, I was able to make a nice little CF graph page that shows me available vs. free vs. used memory, and now I see that anytime I run a big-pig query like that, the JVM gobbles up a ton of memory, uses it, and then frees it but keeps it claimed.

The other trick I've read is that after you're done processing your query and have built your output, you do another query of the same name that produces just one row (sketched below). It's supposed to make sure that the original query object is destroyed from memory. Not sure if it really does that, but it helps me sleep more soundly.
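
A minimal sketch of that trick, plus the more direct variant of simply dropping the reference so the garbage collector can reclaim the old result (names illustrative; whether it is actually collected any sooner is up to the JVM):

<!--- replace the big query object with a tiny result of the same name --->
<cfquery name="bigQuery" datasource="testDSN">
    SELECT TOP 1 id FROM dummydata
</cfquery>
<!--- or drop the reference entirely --->
<cfset structDelete(variables, "bigQuery")>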
