BlazeDS and ColdFusion - JRun memory usage increase resulting in java.lang.OutOfMemoryError: Java heap space

I have BlazeDS running in an instance of ColdFusion 8. A Flex chat app acts as both producer and consumer, with 70 long-polling requests allowed; once that limit is reached, clients fall back to client polling with a polling interval of 2 seconds. JRun metrics logging is enabled. The server is configured with a max heap size of 1200 MB, 210 max JRun threads, and 150 max simultaneous Flash Remoting requests.

Everything can appear to cruise along for hours with little deviation from the following JRun stats.

Running Threads, Available Threads, JRun Sessions, Total Used Mem MB, Avail Mem MB
71, 101, 148, 332.864, 34.866

Then, all of a sudden, over a period of maybe a couple of hours, memory usage slowly climbs and I sometimes end up with a java.lang.OutOfMemoryError: Java heap space exception. Memory is being used but not released, and I end up with roughly the following metric data.

Running Threads, Available Threads, JRun Sessions, Total Used Mem MB, Avail Mem MB
71, 130, 195, 1015.424, 249.94

The ColdFusion monitor doesn't show any unusual memory usage, and no application exceptions are being thrown.

I can see a number of the following errors in the system.out log, but I'm not sure whether they are related; I can't find any information about them in relation to BlazeDS.

java.lang.IllegalStateException: Session is invalid at jrun.servlet.session.JRunSession.checkSessionValidity(JRunSession.java:394)

At times memory runs very high but no OutOfMemoryError is thrown; then, as activity quietens down, memory is released, but available memory can jump from, say, 480 MB to 800 MB within a 20-second period. There is no gradual release of memory.
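That sudden large release is consistent with a full GC reclaiming a big chunk of heap at once rather than memory leaking back gradually. A minimal sketch (class and method names are mine, not part of BlazeDS or JRun) that logs heap usage periodically, so those jumps can be correlated with GC activity:

```java
// Hypothetical helper: periodically snapshot JVM heap usage so sudden
// drops in used memory can be lined up with full-GC events in the logs.
public class HeapLogger {

    // Build a one-line summary of current heap usage in megabytes.
    public static String snapshot() {
        Runtime rt = Runtime.getRuntime();
        long usedMb = (rt.totalMemory() - rt.freeMemory()) / (1024 * 1024);
        long availMb = rt.freeMemory() / (1024 * 1024);
        long maxMb = rt.maxMemory() / (1024 * 1024);
        return "used=" + usedMb + "MB avail=" + availMb + "MB max=" + maxMb + "MB";
    }

    public static void main(String[] args) throws InterruptedException {
        // Log a few samples one second apart; in a real app this would
        // run on a background timer alongside the JRun metrics.
        for (int i = 0; i < 3; i++) {
            System.out.println(snapshot());
            Thread.sleep(1000);
        }
    }
}
```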

Has anybody come across anything like this before?

My services-config.xml

<properties>
    <polling-enabled>true</polling-enabled>
    <polling-interval-millis>2000</polling-interval-millis>
    <wait-interval-millis>60000</wait-interval-millis>
    <client-wait-interval-millis>1</client-wait-interval-millis>
    <max-waiting-poll-requests>70</max-waiting-poll-requests>
</properties>

------------- Reply -------------

Your best approach is going to be standard Java OutOfMemoryError analysis:

  • add -XX:+HeapDumpOnOutOfMemoryError to your JVM start-up parameters
  • watch for the error condition; it will create a binary heap dump file in the JVM's working directory (or wherever -XX:HeapDumpPath points)
  • analyze the heap dump file with a profiler, like the Eclipse Project's Memory Analyzer (MAT)
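For ColdFusion 8's bundled JRun, the JVM start-up arguments live in jvm.config; a sketch of the change (the install path and dump directory are illustrative):

```properties
# {cf_root}/runtime/bin/jvm.config (path varies by install)
# The dump is written to the JVM's working directory unless HeapDumpPath is set.
java.args=-server -Xmx1200m -XX:+HeapDumpOnOutOfMemoryError -XX:HeapDumpPath=/opt/coldfusion8/logs
```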

This is going to tell you what objects are sucking up all your memory. I think it is too early to focus on the java.lang.IllegalStateException: when the JVM is out of memory, all bets are off and you can see all sorts of error messages.

BTW: what is your session timeout policy?
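The timeout question matters because the JRunSession errors above suggest requests touching already-invalidated sessions; for a JRun web application the servlet session timeout can be set in web.xml (a sketch; the 30-minute value is only an example):

```xml
<!-- web.xml: invalidate idle sessions after 30 minutes (example value) -->
<session-config>
    <session-timeout>30</session-timeout>
</session-config>
```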

Category: java · Views: 3 · Time: 2009-10-21


