Hello James,
well, that is what I would expect BaseX to do: if you put the file into a map, it has to be held in memory, and for a large file your memory can run out. With your version 2) I assume you already use the streaming capabilities of file:write-binary (see http://docs.basex.org/wiki/Streaming_Module for more information).
So it seems to me you should either increase the memory you give your JVM or write the binary in a streaming fashion. However, what is the actual reason for putting the binary into a map? Normally (but it depends on your use case) I would try to use the streaming capabilities and store the binary as-is.
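As a rough sketch (untested, and the path and function names are just illustrative): if you bind the raw POST body instead of the form-param map, the payload does not have to be materialized in a map first and file:write-binary can stream it to disk:

```xquery
(: Sketch only: bind the request body directly rather than a
   multipart map, so BaseX does not need to keep the whole
   file in memory. Path and names are placeholders. :)
declare
  %rest:POST("{$body}")
  %rest:path("/upload-raw")
function local:upload-raw($body as xs:base64Binary) {
  file:write-binary("/tmp/upload.zip", $body)
};
```

That is essentially your version 2), just wrapped in a dedicated RESTXQ function.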
Cheers Dirk
Senacor Technologies Aktiengesellschaft - Sitz: Eschborn - Amtsgericht Frankfurt am Main - Reg.-Nr.: HRB 110482 Vorstand: Matthias Tomann (Vorsitzender), Marcus Purzer - Aufsichtsratsvorsitzender: Daniel Grözinger
On 20. Mar 2019, at 13:34, James Ball <basex-talk@jamesball.co.uk> wrote:
Hello everyone,
I’m trying to use BaseX RESTXQ to upload some large (around 300MB) Zip files using HTML forms (multipart/form-data), save them to disk and then process them.
But I am getting server errors with large files (small files work perfectly).
HTTP ERROR 500
Problem accessing /test.htm. Reason:
Server Error
Caused by:
java.lang.OutOfMemoryError: Java heap space
Powered by Jetty:// 9.4.9.v20180320 (http://eclipse.org/jetty)
It seems to happen when reading the file from the map of uploaded files:
1) Upload large file but do nothing with it - WORKS
2) Upload large file and just write the whole POST data to file - WORKS
3) Upload large file and write file from map - ERROR
This is the file writing function I’m using:
file:write-binary( "/Users/me/Desktop/delete2.zip", $files(map:keys($files)[1]) )
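In case it helps, the surrounding RESTXQ function looks roughly like this (paths and names are from my test setup, trimmed down):

```xquery
(: Simplified version of my upload endpoint: the multipart
   form field "files" is bound to a map of filename -> binary. :)
declare
  %rest:POST
  %rest:path("/upload")
  %rest:form-param("files", "{$files}")
function local:upload($files as map(*)) {
  file:write-binary(
    "/Users/me/Desktop/delete2.zip",
    $files(map:keys($files)[1])
  )
};
```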
Running BaseX 9.1
Is there something really obvious that I’m doing wrong? (There usually is :) )
Many thanks, James