I am doing some evaluation work and have been developing a test app
using the Java SDK. The environment is the following:
- OS: Ubuntu 10.04
- Machine: 4GB memory
- Java: Sun 184.108.40.206
- Application: PHP talking via the PHP/Java Bridge to a servlet I wrote
that wraps the Java SDK
In this first test I am processing documents that range from
30MB to 155MB in size. I am opening a doc, then going through each page
and generating a full-size JPEG image, based on the size returned from
the SDK for that page.
I am currently running into three issues with this setup. One is a
significant memory leak; the second is severe CPU usage, spiking to
100% for the duration of the processing; the third is random
OutOfMemory errors when processing certain pages. (I assume those
pages are too heavy in content? It's not a document-size issue, since
the first X pages always process fine and it always dies on page Y.)
I did initial development on my local Windows machine and appeared to
solve the memory leak issue with pointers from this post:
The main points being to call pdfdraw.destroy() after each repeated
page draw and to make sure to call .close() on the doc.
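For reference, the per-page pattern I ended up with looks roughly like
this. (FakeDoc/FakeDraw are stand-in stubs for the SDK's document and
rasterizer objects, since I can't paste the whole servlet — the point
is only the try/finally placement of destroy() and close()):

```java
// Sketch of the per-page render loop. FakeDoc/FakeDraw stand in for the
// SDK's document and rasterizer classes; only the cleanup placement matters.
public class RenderLoop {
    static class FakeDraw {
        void export(int page, String path) { /* render page to JPEG here */ }
        void destroy() { /* free the rasterizer's native memory */ }
    }
    static class FakeDoc {
        int getPageCount() { return 3; }
        void close() { /* release the document's native handle */ }
    }

    public static int renderAll(FakeDoc doc, String outDir) {
        int rendered = 0;
        try {
            for (int p = 1; p <= doc.getPageCount(); p++) {
                FakeDraw draw = new FakeDraw();  // fresh rasterizer per page
                try {
                    draw.export(p, outDir + "/page-" + p + ".jpg");
                    rendered++;
                } finally {
                    draw.destroy();              // always free native memory,
                }                                // even if export throws
            }
        } finally {
            doc.close();                         // always close the doc
        }
        return rendered;
    }

    public static void main(String[] args) {
        System.out.println(renderAll(new FakeDoc(), "/tmp")); // → 3
    }
}
```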
However when I run the same code on the server the leak persists.
Another interesting difference: on my local machine, when the leak was
present, all memory was freed once I killed tomcat. On the server the
memory persists even after killing tomcat.
Any pointers on debugging what could be going on?
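One thing I plan to add to the servlet to narrow this down: logging
Java-heap vs non-heap usage around each document. My thinking is that
if the heap stays flat while the process RSS keeps growing, the leak
is in native memory allocated by the SDK, and no amount of GC tuning
will reclaim it. A minimal sketch using the standard
java.lang.management API:

```java
import java.lang.management.ManagementFactory;
import java.lang.management.MemoryMXBean;
import java.lang.management.MemoryUsage;

public class HeapCheck {
    /** One-line summary of current JVM heap and non-heap usage, in MB. */
    public static String heapReport() {
        MemoryMXBean mem = ManagementFactory.getMemoryMXBean();
        MemoryUsage heap = mem.getHeapMemoryUsage();
        MemoryUsage nonHeap = mem.getNonHeapMemoryUsage();
        return String.format("heap used=%dMB committed=%dMB; non-heap used=%dMB",
                heap.getUsed() >> 20, heap.getCommitted() >> 20,
                nonHeap.getUsed() >> 20);
    }

    public static void main(String[] args) {
        // Call this before/after each document to see whether the Java heap
        // actually grows, or whether the growth is all native memory.
        System.out.println(heapReport());
    }
}
```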
The second issue is the CPU spiking to 100% during the whole document
processing. I am not a Java guy; while doing a bunch of reading on how
to solve an OutOfMemory error that randomly pops up and kills the
server/process, I came across a lot of content about JVM tuning of the
heap size as well as garbage collection parameters. Since the post
included above mentions the need to call .destroy() because of GC
issues, I figured maybe there is some tuning that can be done to
alleviate all these leak/performance issues I have been dealing with.
I tried the following for JAVA_OPTS:
JAVA_OPTS="-Xms1G -Xmx2G -XX:PermSize=128m -XX:MaxPermSize=128m -Xshare:off -XX:NewSize=512m -Xss1024K"
That gives the JVM a minimum of 1G and a maximum of 2G of memory, and
the NewSize value was touted to solve CPU spiking in Java programs
where GC happens too often and takes more CPU than the application
processing itself. The other params I don't fully understand, but they
were included alongside the other suggested settings in multiple places.
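What I'm planning to try next is to stop guessing and let the JVM tell
me whether GC is actually the CPU hog. Setting -Xms equal to -Xmx
avoids heap-resizing churn, and the GC-logging flags below are standard
HotSpot options on the Sun JVM that record every collection with
timestamps (the log path is just an example for my tomcat setup):

```shell
# Fixed-size heap plus GC logging; tail the log during a run to see
# how often collections happen and how long each one takes.
JAVA_OPTS="-Xms2g -Xmx2g -XX:PermSize=128m -XX:MaxPermSize=128m \
  -verbose:gc -XX:+PrintGCDetails -XX:+PrintGCTimeStamps \
  -Xloggc:/var/log/tomcat/gc.log"
```

If the log shows back-to-back full GCs during processing, that would
explain the 100% CPU; if it doesn't, the spiking is the rendering work
itself.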
Any recommendations for JVM settings related to using the Java SDK?
The third issue, the OutOfMemory errors that kill the process, I
assume happens because the content of particular pages has too much
going on and saps the memory; maybe the JVM tuning will help solve
this as well. I know the above post mentioned that tiling might
sometimes need to be done for certain page/image processing. If that
is the case, how can I programmatically determine it, without relying
on my process dying and manually flagging the doc as I am doing now?
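Along those lines, would something like the following be a sane way to
pre-flight a page? The idea: compute the raw raster size at the target
DPI (width_pts / 72 * dpi pixels wide, same for height, 4 bytes per
RGBA pixel) and fall back to tiling when it exceeds some budget. The
page dimensions in points would come from the SDK's page object; the
256MB budget is a number I made up:

```java
public class PagePreflight {
    /** Estimated bytes for an uncompressed RGBA raster of a page at the given DPI. */
    public static long estimateRasterBytes(double widthPts, double heightPts, int dpi) {
        long pxW = (long) Math.ceil(widthPts / 72.0 * dpi);
        long pxH = (long) Math.ceil(heightPts / 72.0 * dpi);
        return pxW * pxH * 4L;                   // 4 bytes per pixel (RGBA)
    }

    /** True if the page should be rendered in tiles rather than in one shot. */
    public static boolean needsTiling(double widthPts, double heightPts,
                                      int dpi, long budgetBytes) {
        return estimateRasterBytes(widthPts, heightPts, dpi) > budgetBytes;
    }

    public static void main(String[] args) {
        long budget = 256L << 20;                // 256MB budget (arbitrary)
        // US-letter page (612x792 pts) at 300 DPI — about 34MB, fits:
        System.out.println(needsTiling(612, 792, 300, budget));   // → false
        // Poster-sized page (7200x7200 pts, i.e. 100x100in) at 300 DPI — ~3.6GB:
        System.out.println(needsTiling(7200, 7200, 300, budget)); // → true
    }
}
```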
I am going to embark on another round of testing this weekend to see
if I can get to the bottom of things, but after several weeks of
dealing with these issues I figured I would try here to see if anyone
has suggestions that could help reduce these performance pains.
Thanks in advance