Hi Christopher, hi Ben,
yes, this sounds like unwanted behavior, and I believe it should be fixable, as the command scripts I’ve been working with didn’t cause memory leaks. I’ll be glad to track down the possible issues. Could one (or both) of you send me a script that causes the problem?
Christian
PS: I would be grateful if you could additionally check if the problem persists in the latest stable snapshot.
On Mon, May 20, 2013 at 10:33 AM, Ben Companjen bencompanjen@gmail.com wrote:
I recognise your problem, and reported it, but never got back to it with more details. I used BaseX client/server 7.5 beta. My first database contained 2.7 million documents, but I created a new one from an exported subset of 700k documents. That helped lower the memory use directly after loading the DB.
Any chance you use the SQL module in your processing?
My guess was that keeping previously opened documents from a database in memory was a design choice. But running out of memory probably wasn't ;)
Ben
On 20 May 2013 04:32, Christopher.R.Ball christopher.r.ball@gmail.com wrote:
I have a BaseX command script (.bxs) that runs queries in batches (sets of 5,000 documents). As it progresses, it bogs down in speed, does not release memory between sets (even if I force it to close and reopen the database between queries), and eventually runs out of memory.
But if I break the same script into separate files that run the exact same batches, it is extremely fast and memory efficient.
Very suggestive of a memory leak . . .
I am running on BaseX 7.6.1 Beta.
Any thoughts?
Is there a way to force the script to do garbage collection?
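For reference, the failing setup is roughly of this shape (a minimal sketch, not the actual script; the database name, query, and batch bounds below are illustrative placeholders):

```
# hypothetical batch.bxs -- db name and queries are placeholders
OPEN mydb
XQUERY for $doc in collection('mydb')[position() = 1 to 5000] return base-uri($doc)
CLOSE
OPEN mydb
XQUERY for $doc in collection('mydb')[position() = 5001 to 10000] return base-uri($doc)
CLOSE
```

Even with the explicit CLOSE/OPEN between batches, memory keeps growing until the JVM runs out.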
BaseX-Talk mailing list BaseX-Talk@mailman.uni-konstanz.de https://mailman.uni-konstanz.de/mailman/listinfo/basex-talk