How do I keep memory under control during PDF-to-image conversion in Java?

Q: I'm working with the Java version of the PDFTron library for Windows,
and when I process many PDF files (around 100 files) I get the
following error:

bad allocation

The Java routine used to convert a PDF to an image (where the
error is raised) is as follows:

    private void decodeFile(InformacionPdf informacionPdf) throws CLException {
        PDFDraw draw = null;
        PDFDoc doc = null;

        // Guard clause quoted from the original post (its surrounding
        // if-condition was elided there):
        // throw new CLException(BossError.ERR_CONEXION, "Could not set the resource path. "
        //         + "Resource not found at (" + valorProperty + ")");

        try {
            doc = new PDFDoc(informacionPdf.getNombreArchivo());

            draw = new PDFDraw();
            if (informacionPdf.getDpi() > 0)
                draw.setDPI(informacionPdf.getDpi()); // body elided in the post; setDPI is the likely intent

            pdftron.PDF.Page page = doc.getPage(1);

            // Convert the first page to a JPG stream.
            // draw.export(page, jpgFileName);
        } catch (Exception ex) {
            throw new CLException(BossError.ERR_INDETERMINADO,
                    "Error generating image -->> " + ex.getMessage(), ex); // <--- failing line
        } finally {
            if (doc != null) {
                try {
                    doc.close();
                } catch (PDFNetException e) {
                    throw new CLException(BossError.ERR_INDETERMINADO,
                            "Error closing document", e);
                }
            }
        }
    }

Could you help me?
A: The most likely problem is that the Java garbage collector is not
collecting objects in time, which can lead to resource exhaustion.
You can explicitly release the memory associated with a large object
(such as PDFDraw, ElementBuilder, ElementWriter, TextExtractor, etc.)
by calling destroy() when the object is no longer in use. For example:

draw = new PDFDraw();
try {
    draw.export(page, jpgFileName);
} finally {
    draw.destroy(); // release the native memory deterministically
}
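Applied to a batch of files, the same pattern might look like the sketch below. PDFTron itself is not on the classpath here, so a hypothetical `NativeHandle` class stands in for an object like `PDFDraw`; the point is only that each iteration releases its native allocation in `finally` instead of waiting for the garbage collector.

```java
// Sketch of deterministic cleanup in a batch loop. NativeHandle is a
// hypothetical stand-in for a PDFTron object such as PDFDraw or PDFDoc;
// with the real library you would call destroy()/close() the same way.
public class BatchCleanup {
    static int liveHandles = 0; // counts unreleased (simulated) native allocations

    static class NativeHandle {
        NativeHandle() { liveHandles++; }   // simulate a native allocation
        void process(String file) { /* render the file */ }
        void destroy() { liveHandles--; }   // simulate an explicit release
    }

    static void convert(String file) {
        NativeHandle draw = new NativeHandle();
        try {
            draw.process(file);
        } finally {
            draw.destroy(); // released even if process() throws
        }
    }

    public static void main(String[] args) {
        for (int i = 0; i < 100; i++) {
            convert("file" + i + ".pdf");
        }
        System.out.println("live handles after batch: " + liveHandles);
    }
}
```

With explicit cleanup, the count of live handles stays bounded no matter how many files the loop processes, which is exactly what the `destroy()` advice buys you with the real library.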