Thanks Christian,
just one more question about your statement:
If the OPTIMIZE command is run by the same server instance, it is absolutely safe to use. It will delay incoming requests, but none of them will be lost (see [1] for more details).
This means that I have to use the "basexclient" (client-server) script rather than "basex" (standalone) when running the OPTIMIZE commands through a cron job, right? M.
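For reference, a cron entry along these lines would route the optimization through the running server, so the same instance that handles the incoming documents also performs it. The database name, install path, and credentials below are placeholders, not details from this thread:

```shell
# Hypothetical crontab entry: every hour, connect to the running BaseX
# server via the client interface and optimize the "mydb" database.
# Adjust path, user, password, and database name to your setup.
0 * * * * /opt/basex/bin/basexclient -Uadmin -Padmin -c "OPEN mydb; OPTIMIZE"
```

Because the command goes through basexclient, the server serializes it with concurrent updates instead of touching the database files directly, which is what makes the standalone "basex" script unsafe here while the server is running.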
On 07/25/2014 09:52 AM, Christian Grün wrote:
Hi Marco,
My first question is whether running the OPTIMIZE command regularly (maybe every hour or so) without stopping the flow of incoming documents is safe, or whether it will impede the workload or introduce any risks to data integrity.
If the OPTIMIZE command is run by the same server instance, it is absolutely safe to use. It will delay incoming requests, but none of them will be lost (see [1] for more details).
Secondly, to reduce the workload on a single database (and thus the frequency of OPTIMIZE calls), I was planning to split my data by moving less important data to a different database (on the same server). Will this be an effective solution in this scenario?
Yes, that's a reasonable approach when working with large data. If you have data that is static (i.e., no longer changed), you can store it in a database that is perfectly optimized and indexed, and organize the daily updates in a second, smaller database.
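A sketch of that split, using standard BaseX commands (the database names and paths are made-up examples, not from this thread):

```shell
# One-time setup: build a database from the static data and index it fully.
# "archive", "daily", and the input paths are hypothetical.
basexclient -Uadmin -Padmin -c "CREATE DB archive /data/static; OPTIMIZE ALL"

# Ongoing: add new documents to a small, separate database.
# CHECK opens the database if it exists, or creates it otherwise.
basexclient -Uadmin -Padmin -c "CHECK daily; ADD /data/today/report.xml"
```

Since only the small "daily" database receives updates, the hourly OPTIMIZE runs stay cheap, while the large "archive" database keeps its indexes intact without ever being re-optimized. Queries can still address both databases at once, e.g. via db:open('archive') and db:open('daily') in XQuery.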
Hope this helps, Christian