
I have a folder filled with documents that I need to upload. I already used Apache FileUpload, but it loads the entire folder into the JVM heap, so if the user has a folder with 100 MB of documents, the application throws a Java heap space error.

So is there a way in Java to read the folder and upload the files one by one, instead of loading 100 MB of documents into the JVM heap all at once?

Note: adding more MB to the JVM heap size is not an option.

The Java code:

    DiskFileUpload upload = new DiskFileUpload();

    upload.setSizeThreshold(1024 * 1024 * 1); // 1 MB in-memory threshold per item
    upload.setSizeMax(-1);

    List<FileItem> items = null;
    try {
        items = parseRequest(req, upload);
    } catch (FileUploadException e) {
        e.printStackTrace();
        msg = new String[1];
        msg[0] = e.getMessage();
        session = req.getSession(true);
        session.setAttribute("msg", msg);
        resp.sendRedirect("front/Alerts.jsp?ok=false");
        return;
    }

    protected List<FileItem> parseRequest(HttpServletRequest req, DiskFileUpload upload)
            throws IOException, FileUploadException {
        return upload.parseRequest(req);
    }
E_net4
  • Can you show the code of `parseRequest`? – dan1st Aug 19 '22 at 22:00
  • Are you sure this is where the problem is? You've set your DiskFileUpload instance to use the disk for storage. Do you later call FileItem.get()? – tgdavies Aug 19 '22 at 22:35
  • Yeah, actually this is where the code crashes. Let me explain: the program runs very well, but when the application uploads more than 10 MB in total (more than 10k docs, 100 KB each), it crashes with a Java heap space error. FileItem.get() is never called. I used these Apache classes: https://commons.apache.org/proper/commons-fileupload/apidocs/org/apache/commons/fileupload/DiskFileUpload.html and of course increasing the JVM heap size is not even close to an option :( – Luna Project Aug 19 '22 at 22:39
  • What do you see when you analyse the heap dump? – tgdavies Aug 20 '22 at 00:49

1 Answer


You are using the deprecated class DiskFileUpload, so it may be worth updating your application to use ServletFileUpload together with DiskFileItemFactory instead.

The setSizeThreshold value determines, for each item in the upload, the size below which the item is kept in memory rather than written to a temporary file on disk. With your 1 MB setSizeThreshold() call and a large number of files, many items can be held in memory at once, so that combination is likely the cause of your problem, and reducing the threshold to a low value might fix it.
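A minimal sketch of what that migration could look like, assuming Commons FileUpload 1.x on the classpath; `handleUpload` and `targetDir` are hypothetical names for your servlet's handler method and destination directory:

```java
import java.io.File;
import java.util.List;

import javax.servlet.http.HttpServletRequest;

import org.apache.commons.fileupload.FileItem;
import org.apache.commons.fileupload.disk.DiskFileItemFactory;
import org.apache.commons.fileupload.servlet.ServletFileUpload;

public class UploadHelper {

    protected void handleUpload(HttpServletRequest req, File targetDir) throws Exception {
        DiskFileItemFactory factory = new DiskFileItemFactory();
        // Keep only tiny items in memory; anything larger spools to a temp file.
        factory.setSizeThreshold(4 * 1024);
        factory.setRepository(new File(System.getProperty("java.io.tmpdir")));

        ServletFileUpload upload = new ServletFileUpload(factory);
        upload.setSizeMax(-1); // no overall size limit, as in the original code

        List<FileItem> items = upload.parseRequest(req);
        for (FileItem item : items) {
            if (!item.isFormField()) {
                // write() can move the spooled temp file into place directly,
                // instead of pulling the bytes onto the heap via get()
                item.write(new File(targetDir, item.getName()));
            }
        }
    }
}
```

Processing each FileItem with write() rather than get() keeps the file contents on disk end to end, so heap usage stays roughly bounded by the threshold times the number of items.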

DuncG