December 31, 2009
Question

More jrun using up memory problems

  • 1 reply
  • 1538 views

There seem to be a lot of problems like this, but everyone's situation is a little different, which makes it hard to find solutions.  Here is what we are dealing with.

We are taking an XML file containing product data and importing it into a SQL 2000 database.  The XML file is about 6 MB and has product data for about 4700 products.  When we wrote the import we were running ColdFusion 7 Standard updated to 7.0.2.  We run the import every night and it seemed to run fine.  It took about 45 minutes but the server didn't crash.  We recently upgraded to ColdFusion 9 Standard.  After the upgrade, when the import ran we saw lots of out-of-memory errors in the log.  The second night it ran, ColdFusion had to be restarted.

We set up a testing server, ran the import there, and watched the memory usage of jrun.  As the import ran, memory usage kept climbing until it reached the max heap size.  After that the import got very slow and eventually ColdFusion became completely unresponsive.

We changed the import so that it would do the products in smaller batches as separate requests.  This seemed to be the answer.  Between the requests the memory usage would drop down some and the whole thing was able to run without using up all the available memory.  The import also takes a lot less time now that the memory usage isn't so high.  We thought we had it fixed so we started running it on the production server.

After a couple days we noticed that our memory usage was very high on the production server.  We went back to the testing server and ran the import a few times.  After this we could see that each time it was run the memory usage would increase.  After about 4 times the memory reached the max heap size and coldfusion became unresponsive.  The memory usage never goes back down.

    This topic has been closed for replies.


    Inspiring
    December 31, 2009

    Some things to check / consider.

    1. What are your JVM settings?  What is your Maximum JVM Heap Size?  If you are using the default value, 512 MB, consider a larger value.  On a 32-bit server the upper limit is 1.8 GB.  A 64-bit server can allocate more memory to the JVM, but I haven't seen any documentation that specifies the upper limits on 64-bit.  (If anyone can point me to some info on this I'd be grateful.)

    2. Consider moving the XML to SQL import process off of ColdFusion to another tool.

    Some options:
    a. A SQL Server DTS package

    b. Use the bulk insert feature of SQL Server

    c. Use a third party tool such as Altova Mapforce
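
For option (b) on SQL Server 2000, OPENXML can shred the XML server-side inside a single INSERT. A rough sketch, where the table and element names (Products, /catalog/product) are made up for illustration and not taken from the original post:

```sql
-- Table, column, and XPath names here are illustrative only.
DECLARE @doc int
EXEC sp_xml_preparedocument @doc OUTPUT, @productXml

INSERT INTO Products (Sku, Name, Price)
SELECT Sku, Name, Price
FROM OPENXML(@doc, '/catalog/product', 2)  -- 2 = element-centric mapping
WITH (Sku varchar(50), Name varchar(200), Price money)

-- Release the handle, or the parsed tree stays in SQL Server's memory
EXEC sp_xml_removedocument @doc
```

Note that sp_xml_preparedocument keeps the parsed document in SQL Server memory until sp_xml_removedocument is called, so for a 6 MB file the remove call matters.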

    Java and JVM page in CF administration
    http://help.adobe.com/en_US/ColdFusion/9.0/Admin/WSc3ff6d0ea77859461172e0811cbf3638e6-7ffc.html#WSc3ff6d0ea77859461172e0811cbf3638e6-7feb

    "Maximum JVM heap size greater than 1.8GB will prevent ColdFusion MX from starting"
    http://go.adobe.com/kb/ts_tn_19359_en-us

    "Examples of Bulk Importing and Exporting XML Documents" (SQL Server 2000)
    http://msdn.microsoft.com/en-us/library/ms191184.aspx

    Altova Mapforce
    http://www.altova.com/mapforce.html

    "Troubleshooting a Leaky Heap in Your JVM" (blog entry)
    http://www.coldfusionmuse.com/index.cfm/2008/2/12/leaky.heap.jvm

    December 31, 2009

    The JVM settings we have are just what was there by default.

    -server -Dsun.io.useCanonCaches=false -XX:MaxPermSize=192m -XX:+UseParallelGC -Dcoldfusion.rootDir={application.home}/../ -Dcoldfusion.libPath={application.home}/../lib

    We have the max JVM heap size set to 1024 MB.  Our server is 32-bit.  Turning this up doesn't really solve the problem though.  It just means we can run for another day or two before all the memory is used up and it crashes.
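
For reference, that heap maximum corresponds to the -Xmx entry in jvm.config (the path below assumes a default CF9 server install; back up the file before editing):

```
# {cf_root}/runtime/bin/jvm.config -- path assumed for a default CF9 server install
java.args=-server -Xmx1024m -Dsun.io.useCanonCaches=false -XX:MaxPermSize=192m -XX:+UseParallelGC -Dcoldfusion.rootDir={application.home}/../ -Dcoldfusion.libPath={application.home}/../lib
```

As noted above, raising -Xmx only postpones a leak; it doesn't cure one.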

    I'd like to keep running this in ColdFusion because a lot of things need to happen for each product, like creating categories and other dependent items.  I'm really not sure how to go about creating a SQL script to import all this data.  Maybe I'll look into that some other time, but I just don't have time to learn how to do that now.

    Today, doing some more testing, I get the same problem just reading in and parsing the XML file.  I use XMLParse and XMLSearch to get an array of the products to loop over.  Just doing this seems to use up memory that is never given back.

    Inspiring
    December 31, 2009

    1. You might try invoking Java's garbage collection after running your import code. This might help you get back your memory. See the blog entry I posted previously.

    2. Can you post your import code? Someone might be able to spot a problem or advise you on performance improvements.

    3. Check your code for objects that persist for a long time, such as in the session or application scopes. This might account for the high memory usage.

    4. You might try tweaking the "entityExpansionLimit" JVM argument.

    See: http://go.adobe.com/kb/ts_tn_19125_en-us

    5. Use an alternate method to parse your XML, such as a Java library. The built-in CF9 XML features may have a memory leak; using an alternate tool could be a workaround.

    http://betterxml.riaforge.org/

    http://www.bennadel.com/index.cfm?dax=blog:1345.view (blog entry)

    Note that I have not used the Java or XML components I've linked to.
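
To illustrate option 5: a streaming parser such as StAX (which ships with Java 6, the JVM under CF9) visits one node at a time instead of building the whole document tree in memory the way XMLParse does, so memory use stays flat regardless of feed size. A minimal sketch; the element names and the inline sample XML are made up, not taken from the poster's feed:

```java
import javax.xml.stream.XMLInputFactory;
import javax.xml.stream.XMLStreamConstants;
import javax.xml.stream.XMLStreamReader;
import java.io.StringReader;

public class StreamingImport {
    public static void main(String[] args) throws Exception {
        // Tiny stand-in for the real ~6 MB feed of ~4700 products.
        String xml = "<catalog>"
                + "<product><sku>A1</sku></product>"
                + "<product><sku>B2</sku></product>"
                + "</catalog>";

        XMLInputFactory factory = XMLInputFactory.newInstance();
        XMLStreamReader reader = factory.createXMLStreamReader(new StringReader(xml));

        int count = 0;
        while (reader.hasNext()) {
            // Only the current event is held in memory, never the whole tree.
            if (reader.next() == XMLStreamConstants.START_ELEMENT
                    && "product".equals(reader.getLocalName())) {
                count++; // insert this product into SQL here, then let it go
            }
        }
        reader.close();
        System.out.println("products seen: " + count);
    }
}
```

The same classes can be instantiated from CFML with createObject("java", ...), or you can use one of the libraries linked above.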