Optimize PDF with repeated images

Q: With the way that we use PDFNet, we spend our time going through
the file, marking up specific pages, and resaving the file to be used
for a different purpose. The problem we are having is that by the time
we are done, a 60 MB PDF has turned into a 1.65 GB set of 13 PDFs.
(The splitting is our choice.)

As it stands right now, my assumption is that because the major logo
image is repeated upwards of 500 times per file, the files all end up
very big. I have done some research into the optimization module, but
I cannot find it on the site; it may help with my problem.

Splitting a PDF may result in a larger total file size because fonts
and other resources may be replicated in each file. Still, this alone
should not cause such a large increase, so the most likely problem is
in your PDF processing code. How exactly are you 'marking up specific
pages and resaving the file'? Do you save the resulting file with the
SDFDoc.SaveOptions.e_remove_unused or e_linearized flag?
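For example, a full save with e_remove_unused might look like the
following minimal C++ sketch (the file names are placeholders, and the
exact Initialize/Save signatures depend on your PDFNet version and
language binding, so treat this as an outline rather than exact code):

```cpp
#include <PDF/PDFNet.h>
#include <PDF/PDFDoc.h>
#include <SDF/SDFDoc.h>

using namespace pdftron;
using namespace pdftron::PDF;

int main()
{
    PDFNet::Initialize();  // newer versions may require a license key argument
    {
        PDFDoc doc("in.pdf");
        doc.InitSecurityHandler();

        // ... mark up specific pages here ...

        // e_remove_unused drops objects that are no longer referenced.
        // Without it, a save can carry dead (orphaned) objects along,
        // inflating the output file.
        doc.Save("out.pdf", SDF::SDFDoc::e_remove_unused, NULL);
    }
    PDFNet::Terminate();
    return 0;
}
```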

'pdftron.PDF.Optimizer' can be of use before the split operation
because it can eliminate duplicate images (e.g. the logo), fonts,
and other resources. You could also use the optimizer to decrease
the image resolution (down-sample/resample), change image compression,
etc. For sample code showing how to use the PDF optimizer, please see
http://www.pdftron.com/pdfnet/samplecode.html#Optimizer.
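A minimal C++ sketch of the optimizer step, run before splitting
(again, file names are placeholders and details may vary by PDFNet
version; the Optimizer add-on module must be licensed/installed):

```cpp
#include <PDF/PDFNet.h>
#include <PDF/PDFDoc.h>
#include <PDF/Optimizer.h>
#include <SDF/SDFDoc.h>

using namespace pdftron;
using namespace pdftron::PDF;

int main()
{
    PDFNet::Initialize();
    {
        PDFDoc doc("in.pdf");
        doc.InitSecurityHandler();

        // With default settings, Optimize() deduplicates repeated
        // resources (such as the logo image) and recompresses images
        // where possible. Optimizer::ImageSettings can be passed in to
        // down-sample images or change their compression.
        Optimizer::Optimize(doc);

        // Save with e_linearized to produce a web-optimized file;
        // linearized saves also rewrite the file without dead objects.
        doc.Save("optimized.pdf", SDF::SDFDoc::e_linearized, NULL);
    }
    PDFNet::Terminate();
    return 0;
}
```

After optimizing the shared resources once, split the result into the
13 files; the per-file duplication of the logo should no longer occur.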