I have an XML file of roughly 9 GB that I would like to process with BaseX, but I am getting an out-of-memory error even with the JVM heap set to -Xmx7G.
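(In case it matters: I pass the heap setting by exporting BASEX_JVM="-Xmx7g" before starting BaseX, since I believe the launcher scripts read that environment variable; please correct me if there is a better way to raise the heap.)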
Yesterday I was able to process the file with the query below, where $rawProductsFile is the name of the big XML file. The file consists of Products elements, and I process them one by one, appending the results to files. Today, however, the same query fails with an out-of-memory exception.
 let $res :=
   for $doc in fn:doc($rawProductsFile)//Products
   let $list :=
     for $tname in $tables
     let $rows := ($doc[fn:name() eq $tname] | $doc//table[@name eq $tname])
     let $list2 := local:processOneProd($tname, $rows, $tableMap)
     let $fw := file:append(fn:concat($tablesFolder, $tname, ".xml"), $list2)
     return $fw
   return $list
What is a good way to process a large XML file like this, handling each element one by one?
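One direction I am wondering about: would it help to create a BaseX database from the file once (for example with basex -c "CREATE DB products /path/to/file.xml", so the data lives on disk) and then query the database with db:open instead of parsing the raw file with fn:doc? A rough sketch of what I mean, where "products" is just a placeholder database name and the other names are from the query above:

 (: Sketch only: iterate over the Products elements of a pre-built
    database instead of parsing the 9 GB file with fn:doc. :)
 for $doc in db:open("products")//Products
 for $tname in $tables
 let $rows := ($doc[fn:name() eq $tname] | $doc//table[@name eq $tname])
 let $list2 := local:processOneProd($tname, $rows, $tableMap)
 return file:append(fn:concat($tablesFolder, $tname, ".xml"), $list2)

Is that the recommended approach, or is there a more memory-friendly option I am missing?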
I appreciate your help.
Erol Akarsu