When I try to convert a PDF to an image, I get an "out of memory" error for some PDFs. I increased the heap size, but then got the same error again for a different PDF. For now, assume I have no memory leak from other objects. What could be the reason for this OutOfMemoryError? Is it simply that the rendered image is so large (which I don't think is the case) that it consumes the heap, or does PDFBox keep a buffered image of each page in memory, contributing to the growing heap? Any insight would be appreciated.
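For scale, here is a rough back-of-envelope estimate I did of the heap needed for a single rendered page. The page size and bytes-per-pixel figures are my own assumptions, not anything read from PDFBox internals:

```java
// Back-of-envelope estimate of the heap one rendered page might need.
// Page dimensions and bytes-per-pixel are assumptions for illustration.
public class ImageMemory {
    static long approxImageBytes(double widthInches, double heightInches,
                                 int dpi, int bytesPerPixel) {
        long widthPx = Math.round(widthInches * dpi);
        long heightPx = Math.round(heightInches * dpi);
        return widthPx * heightPx * bytesPerPixel;
    }

    public static void main(String[] args) {
        // US Letter (8.5 x 11 in) at 300 DPI, grayscale (1 byte/pixel)
        System.out.println(approxImageBytes(8.5, 11, 300, 1)); // ~8.4 MB
        // Same page if composited as 4-byte ARGB first
        System.out.println(approxImageBytes(8.5, 11, 300, 4)); // ~33.7 MB
    }
}
```

So even a single grayscale page at 300 DPI is only on the order of 10 MB, which is why I doubt the final image alone explains the errors.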
Here's the link to the PDF I am trying to render: https://drive.google.com/file/d/0B_Ke2amBgdpeNFFDem5KVVVzanc/view?usp=sharing And here's the code segment:
PDFRenderer pdfRenderer = new PDFRenderer(pdDoc);
// Render the page (0-based index) at 300 DPI as a grayscale image
BufferedImage image = pdfRenderer.renderImageWithDPI(page - 1, 300, ImageType.GRAY);
// image = ImageHelper.convertImageToGrayscale(image);
ImageIOUtil.writeImage(image, "G:/Trial/tempImg.png", 300);
Please note that for this particular PDF the problem was solved by increasing the heap size, but what I want to know is whether PDFBox stores buffered images in memory and thereby contributes to heap usage.
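If PDFBox's per-document resource cache is what holds onto decoded images, one workaround I have seen suggested (untested on my side, and assuming PDFBox 2.x, where `PDDocument.setResourceCache` and `DefaultResourceCache` exist) is to install a cache that declines to store image XObjects:

```java
import java.io.IOException;
import org.apache.pdfbox.cos.COSObject;
import org.apache.pdfbox.pdmodel.DefaultResourceCache;
import org.apache.pdfbox.pdmodel.graphics.PDXObject;

// Assumes PDFBox 2.x: skip caching decoded XObjects (e.g. images) on the
// document, trading repeated decoding for a smaller heap footprint.
pdDoc.setResourceCache(new DefaultResourceCache() {
    @Override
    public void put(COSObject indirect, PDXObject xobject) throws IOException {
        // intentionally do not cache image XObjects
    }
});
```

Is this the right way to keep rendered pages from accumulating on the heap, or is the memory going somewhere else entirely?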
Here's another PDF that hits the same issue even after increasing the heap size: https://drive.google.com/file/d/0B_Ke2amBgdpedDBtaG1QcW1oYlU/view?usp=sharing With this PDF, my code takes forever to render page 44. I don't know why this is happening.