
Java ZIP file download error: troubleshooting the Out of Memory problem

Enough preamble, let's get straight to the real content!

Downloading small files or small ZIP archives in the project works fine, but downloading ZIP files larger than 2 GB throws the error: Out of Memory: Java Heap Space.

A memory overflow is a serious problem. Let's check the code:

    private void setByteArrayOutputStream(String fileName, InputStream inputStream, ZipArchiveOutputStream zous) throws IOException {
        ByteArrayOutputStream baos = new ByteArrayOutputStream();
        byte[] buffer = new byte[1024];
        int len;
        // Read the entire input stream into memory
        while ((len = inputStream.read(buffer)) != -1) {
            baos.write(buffer, 0, len);
        }
        baos.flush();
        byte[] bytes = baos.toByteArray();
        // Set the file name
        ArchiveEntry entry = new ZipArchiveEntry(fileName);
        zous.putArchiveEntry(entry);
        zous.write(bytes);
        zous.closeArchiveEntry();
        inputStream.close();
    }

Careful readers should have already spotted the mistake!

1. Cause analysis

The root cause is the use of ByteArrayOutputStream. A ByteArrayOutputStream dynamically expands its internal buffer in memory to hold the data written to it. Writing large amounts of data, especially when processing large files, therefore causes frequent memory allocation and array copying and consumes a great deal of memory. If the InputStream delivers a very large amount of data (several hundred MB or several GB), the ByteArrayOutputStream may require more memory than the heap provides, resulting in an OutOfMemoryError.

In the above code, the logic is as follows:

(1) Read data: data is read into the buffer in a loop via inputStream.read(buffer).

(2) Write to ByteArrayOutputStream: each chunk that was read is written into the ByteArrayOutputStream. If the file is very large, the ByteArrayOutputStream keeps expanding its internal array.

(3) Convert to byte array: when toByteArray() is called, a new byte array is created and all of the data is copied into it, which requires additional memory on top of the internal buffer (see the sketch below).
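To make the memory cost concrete, here is a minimal sketch (not from the original code; the 64 MB figure and the class name are made up for illustration). Buffering a whole stream this way means the internal array of the ByteArrayOutputStream keeps doubling and copying as data arrives, and toByteArray() then allocates one more full-size copy, so the peak heap usage is roughly twice the buffered size. A file larger than about 2 GB cannot be buffered like this at all, because a Java byte array cannot exceed Integer.MAX_VALUE elements.

import java.io.ByteArrayOutputStream;

public class BaosMemoryDemo {
    public static void main(String[] args) {
        ByteArrayOutputStream baos = new ByteArrayOutputStream();
        byte[] chunk = new byte[1024 * 1024]; // 1 MB of dummy data per write

        // Simulate buffering a 64 MB "file": the internal array of the
        // ByteArrayOutputStream repeatedly doubles and copies its contents.
        for (int i = 0; i < 64; i++) {
            baos.write(chunk, 0, chunk.length);
        }

        // toByteArray() allocates a second full-size copy of everything
        // written so far, on top of the internal array that already exists.
        byte[] all = baos.toByteArray();

        long usedMb = (Runtime.getRuntime().totalMemory()
                - Runtime.getRuntime().freeMemory()) / (1024 * 1024);
        System.out.println("Buffered " + all.length / (1024 * 1024)
                + " MB, approximate heap in use: " + usedMb + " MB");
    }
}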

2. Solution

Adopt streaming: read directly from the InputStream and write to the ZipArchiveOutputStream, instead of using a ByteArrayOutputStream. This avoids loading the entire file into memory.

The code is as follows:

    private void setByteArrayOutputStream(String fileName, InputStream inputStream, ZipArchiveOutputStream zous) {
        try {
            // Create the ZIP entry
            ZipArchiveEntry entry = new ZipArchiveEntry(fileName);
            zous.putArchiveEntry(entry);

            // Use a 1024-byte stream buffer and copy the input stream into
            // the ZIP output stream chunk by chunk
            byte[] buffer = new byte[1024];
            int len;
            while ((len = inputStream.read(buffer)) != -1) {
                zous.write(buffer, 0, len);
            }
            // Complete the current ZIP entry
            zous.closeArchiveEntry();
        } catch (Exception ex) {
            ex.printStackTrace();
        } finally {
            // Make sure the input stream is closed
            try {
                inputStream.close();
            } catch (Exception e) {
                e.printStackTrace();
            }
        }
    }
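For context, here is a hypothetical caller (not shown in the original article) illustrating how the streaming method can be wired to an HTTP download: the ZipArchiveOutputStream is opened directly on the servlet response's output stream, so ZIP data flows to the client without ever being held in memory in full. The class name, file paths and servlet setup are assumptions for illustration.

import java.io.FileInputStream;
import java.io.InputStream;
import java.util.List;
import javax.servlet.http.HttpServletResponse; // or jakarta.servlet, depending on your stack
import org.apache.commons.compress.archivers.zip.ZipArchiveEntry;
import org.apache.commons.compress.archivers.zip.ZipArchiveOutputStream;

public class ZipDownloadExample {

    // Hypothetical download handler: streams the given files into one ZIP
    public void downloadAsZip(List<String> filePaths, HttpServletResponse response) throws Exception {
        response.setContentType("application/zip");
        response.setHeader("Content-Disposition", "attachment; filename=\"download.zip\"");

        // The ZIP output stream writes straight to the HTTP response, so heap
        // usage stays bounded no matter how large the files are in total.
        try (ZipArchiveOutputStream zous = new ZipArchiveOutputStream(response.getOutputStream())) {
            for (String path : filePaths) {
                try (InputStream in = new FileInputStream(path)) {
                    setByteArrayOutputStream(path, in, zous);
                }
            }
            zous.finish();
        }
    }

    // Streaming copy of a single entry, same logic as the method shown above
    private void setByteArrayOutputStream(String fileName, InputStream inputStream, ZipArchiveOutputStream zous) throws Exception {
        zous.putArchiveEntry(new ZipArchiveEntry(fileName));
        byte[] buffer = new byte[1024];
        int len;
        while ((len = inputStream.read(buffer)) != -1) {
            zous.write(buffer, 0, len);
        }
        zous.closeArchiveEntry();
    }
}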

After this change, no error is reported no matter how large the downloaded file is. However, for files larger than 4 GB another error occurred while the ZIP archive was being assembled:

org.apache.commons.compress.archivers.zip.Zip64RequiredException: XXXXXXXXXXXXXXXXXXXXXXXXXXXX's size exceeds the limit of 4GByte.
	at org.apache.commons.compress.archivers.zip.ZipArchiveOutputStream.checkIfNeedsZip64(ZipArchiveOutputStream.java:651)
	at ... (further frames inside ZipArchiveOutputStream, lines 638 and 513)

Cause analysis:

This exception is typically encountered when creating a ZIP file that contains entries larger than 4 GB. The classic ZIP format limits the size of a single entry to 4 GB, so the ZIP64 extension is required when handling such files. We add the following to the code:

zous.setUseZip64(Zip64Mode.Always); // Enable ZIP64 support (constants and method signatures differ between library versions)
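setUseZip64 is typically called right after the ZipArchiveOutputStream is created, before any entries are written. A minimal sketch of the complete flow, assuming Apache Commons Compress and a made-up file name:

import java.io.BufferedOutputStream;
import java.io.FileInputStream;
import java.io.FileOutputStream;
import org.apache.commons.compress.archivers.zip.Zip64Mode;
import org.apache.commons.compress.archivers.zip.ZipArchiveEntry;
import org.apache.commons.compress.archivers.zip.ZipArchiveOutputStream;

public class Zip64Example {
    public static void main(String[] args) throws Exception {
        try (ZipArchiveOutputStream zous =
                 new ZipArchiveOutputStream(new BufferedOutputStream(new FileOutputStream("big.zip")))) {
            // Enable ZIP64 right after creating the stream, before adding
            // entries, otherwise entries > 4 GB still trigger Zip64RequiredException
            zous.setUseZip64(Zip64Mode.Always);

            zous.putArchiveEntry(new ZipArchiveEntry("huge-file.bin"));
            try (FileInputStream in = new FileInputStream("huge-file.bin")) {
                byte[] buffer = new byte[8192];
                int len;
                while ((len = in.read(buffer)) != -1) {
                    zous.write(buffer, 0, len);
                }
            }
            zous.closeArchiveEntry();
            zous.finish();
        }
    }
}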

After making the above modifications, downloading the ZIP no longer reports an error, however large the files are.

This concludes the article on troubleshooting Out of Memory errors when downloading ZIP files in Java. For more on Java ZIP download errors, please search my earlier articles or continue browsing the related articles below. I hope you will keep supporting me!