How do I render a memory intensive (large) PDF file?

Q: How do I render a memory intensive (large) PDF file?

I use the following approach (previously suggested in this forum) to
rasterize a PDF at any resolution. Essentially, I render PDF pages in
tiles or stripes:

... in C# ....

draw.SetDPI(72);

// Split the page into 'factor' horizontal stripes and render each one separately.
int factor = 20;
double y2Factor = usingRect.y2 / factor;
for (int i = 1; i <= factor; i++)
{
    // Crop box for the current stripe, in PDF page coordinates.
    Rect cropPart = new Rect();
    cropPart.x1 = 0;
    cropPart.y1 = y2Factor * (i - 1);
    cropPart.x2 = usingRect.x2;
    cropPart.y2 = y2Factor * i;

    pg.SetCropBox(cropPart);
    Bitmap bmp = draw.GetBitmap(pg);
    // Pass the format explicitly; Bitmap.Save(filename) alone defaults to PNG.
    bmp.Save("D:\\part" + i.ToString() + ".jpg", System.Drawing.Imaging.ImageFormat.Jpeg);
    bmp.Dispose();

    System.Web.HttpContext.Current.Response.Write(
        "cropPart " + i.ToString() +
        " x1: " + cropPart.x1 + ", x2: " + cropPart.x2 +
        ", y1: " + cropPart.y1 + ", y2: " + cropPart.y2 + "<br />");
}
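
(Side note on the snippet above: as far as I can tell, SetCropBox changes the
crop box on the in-memory page itself, so after the loop I put the original
box back, roughly like this, assuming usingRect was taken from the page's
original crop box:)

pg.SetCropBox(usingRect);  // restore the page's original crop box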

Now the problem is that I run into some PDFs with very large embedded
images, where I still run out of memory. How should I deal with this
situation?
-------
A: We looked into your test file and the problem is that the PDF
contains a few huge images (some of which are 700 MB or more when
decompressed). To lower memory requirements, you can turn off image
caching as follows:

draw.SetCaching(false);  // call this on your PDFDraw instance before rendering

Disabling caching should allow you to render the file even on 32-bit
systems. If you are dealing with many large files, or files containing
huge images, you may also want to consider using the 64-bit version of
PDFNet (http://www.pdftron.com/downloads.html).
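
For reference, here is a minimal end-to-end sketch of the low-memory path,
combining SetCaching(false) with stripe rendering. The file paths, the page
number, and the stripe count are placeholders, and it uses PDFDraw.Export to
write each stripe directly to disk so that no System.Drawing.Bitmap has to be
held in memory; adapt it to your own setup:

using pdftron;
using pdftron.PDF;

// A sketch of the low-memory rendering path, assuming PDFNet has been
// licensed/initialized as appropriate for your version.
PDFNet.Initialize();
using (PDFDoc doc = new PDFDoc("C:\\in.pdf"))
{
    doc.InitSecurityHandler();
    Page pg = doc.GetPage(1);
    Rect box = pg.GetCropBox();

    using (PDFDraw draw = new PDFDraw())
    {
        draw.SetDPI(72);
        draw.SetCaching(false);   // do not cache decoded images (keeps memory low)

        int factor = 20;          // number of horizontal stripes
        double stripeHeight = (box.y2 - box.y1) / factor;
        for (int i = 0; i < factor; i++)
        {
            // Restrict rendering to the current stripe.
            Rect stripe = new Rect(box.x1, box.y1 + i * stripeHeight,
                                   box.x2, box.y1 + (i + 1) * stripeHeight);
            pg.SetCropBox(stripe);

            // Export writes the raster straight to a file, so no GDI+ Bitmap
            // is kept in memory for the stripe.
            draw.Export(pg, "C:\\strip_" + i + ".png", "PNG");
        }

        pg.SetCropBox(box);       // restore the original crop box
    }
}
PDFNet.Terminate();

Writing each stripe straight to disk, together with SetCaching(false), should
keep the peak working set close to the size of a single stripe rather than the
whole decompressed page.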