> Adam Cameron wrote:
>
> > The maximum stable heap size I've managed to get is around
> > 1.0-1.2GB, on a win32 system. On Solaris (running a 32-bit JVM),
> > about 1.4GB. It *seems* like GC doesn't actually clear out RAM
> > properly if more than that much RAM is being addressed.
>
> Yes, there is a well-known
> http://kb.adobe.com/selfservice/viewContent.do?externalId=tn_19359&sliceId=1
Not really what I was talking about. One might be able to get the CF
instance to *start* with 1.8GB allocated to the heap, but it won't
actually work. I've managed to get a server to idle for a reasonable
length of time on 1.5GB, but as soon as the thing started to ramp up, it
face-planted, once it started actually trying to *use* the higher end of
the RAM allocated to it. At 1.2GB, it'll seem to run OK for a reasonable
length of time, but eventually it starts leaking memory; at around 1GB,
it was pretty stable.

Hence my comment about it being *stable* at that allocation. Not that
"it simply won't start if more than 1.8GB is allocated to it".
My point was that your rule of thumb:

maximum heap size (Xmx) = RAM (in MB) / (2 * number of servers using the JVM)

is not a very good one. Plug 4GB RAM (so a small server) and one CF
instance into that equation. Your rule suggests I should be allocating
2GB to the heap. Which - as you yourself pointed out - won't work.
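For what it's worth, here's the arithmetic spelled out. The "stable cap"
figures are just the rough numbers reported in this thread (roughly 1.2GB
on win32, 1.4GB on 32-bit Solaris), not hard limits from any spec:

```python
# Rough stable-heap ceilings observed in this thread (MB); illustrative only.
OBSERVED_STABLE_CAP_MB = {"win32": 1200, "solaris_32bit": 1400}

def rule_of_thumb_xmx_mb(ram_mb, num_servers):
    """The quoted rule: Xmx = RAM (in MB) / (2 * number of servers using the JVM)."""
    return ram_mb / (2 * num_servers)

# The 4GB / one-CF-instance case from above:
suggested = rule_of_thumb_xmx_mb(4096, 1)
print(suggested)                                     # 2048.0 MB
print(suggested > OBSERVED_STABLE_CAP_MB["win32"])   # True: well past the stable cap
```

i.e. the rule happily recommends a heap that's nearly twice what a 32-bit
JVM on win32 will actually run stably with.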
--
Adam