How do I efficiently merge huge PDF files with thousands of pages?

Q: I would like to merge huge PDF files with thousands of pages using
PDFNet.

I want to run a performance test by building a 1-million-page PDF, so my plan
was to copy the 6.4K pages from the original file 180 times. This turned out
to be fairly slow. What would be the best way to do this?

My current code is as follows:

PDFNet.Initialize();
using (PDFDoc doc = new PDFDoc(fileName)) {
  doc.InitSecurityHandler();
  using (PDFDoc new_doc = new PDFDoc()) {
    // Copy the entire page range (1..page count) from the source document.
    int from = 1, to = doc.GetPageCount();
    new_doc.InsertPages(0, doc, from, to, PDFDoc.InsertFlag.e_none);
    // pdftron.PDF.Optimizer.Optimize(new_doc);
    new_doc.Save(output_path, SDFDoc.SaveOptions.e_linearized);
  }
}

The knowledge base suggests that when working with large files you should
use temporary files. How would one do this?
----------------------
A: Because PDF linearization is a slow process, a big performance boost
would be to use SDFDoc.SaveOptions.e_remove_unused as the second parameter
in the Save() method instead of e_linearized. Ideally you would also use the
latest 64-bit version (v.5.5 - http://www.pdftron.com/pdfnet/downloads.html),
which supports unlimited file size.

Since you are copying the same page set 180 times, temporary files would not
be of much help. You will get much better performance if you import the
initial page set only once and then copy the pages within the same document
(a sketch of that variant follows the snippet below). Actually, you can avoid
that step altogether and simply replicate the original page set in the
original document 180 times, then save the document with e_incremental. This
would be very fast:

using (PDFDoc doc = new PDFDoc("my.pdf")) {
  doc.InitSecurityHandler();
  int sz = doc.GetPageCount();
  for (int i=0; i<180; ++i)
  {
    PageIterator itr = doc.GetPageIterator();
    for (int j=0; j<sz && itr.HasNext(); itr.Next(), ++j) {
       doc.PagePushBack(itr.Current());
     }
  }
  doc.Save("my.pdf",SDFDoc.SaveOptions.e_incremental);
}
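If you do need the result in a separate output file, a rough sketch of the
import-once variant mentioned above could look like this (file names and the
180 count are illustrative; the ImportPages/PagePushBack pattern follows the
standard PDFNet page-copying idiom, so treat it as a sketch rather than
tested code):

using System.Collections;
using pdftron.PDF;
using pdftron.SDF;

using (PDFDoc doc = new PDFDoc("my.pdf"))
using (PDFDoc new_doc = new PDFDoc()) {
  doc.InitSecurityHandler();

  // Collect the source pages and import them into the new document once.
  ArrayList copy_pages = new ArrayList();
  for (PageIterator itr = doc.GetPageIterator(); itr.HasNext(); itr.Next()) {
    copy_pages.Add(itr.Current());
  }
  ArrayList imported = new_doc.ImportPages(copy_pages);

  // Replicate the imported page set 180 times within the new document.
  for (int i = 0; i < 180; ++i) {
    foreach (Page p in imported) {
      new_doc.PagePushBack(p);
    }
  }
  new_doc.Save("out.pdf", SDFDoc.SaveOptions.e_remove_unused);
}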