I have a roughly 9 GB XML file that I would like to process with BaseX, but I am getting an out-of-memory error (with -Xmx7G set). Yesterday, I was able to process it as shown below, where $rawProductsFile is the name of the big XML file. The XML file consists of "Products" elements; the query processes them one by one and writes the result into a file.
But today it throws an out-of-memory exception.
let $res :=
  for $doc in fn:doc($rawProductsFile)//Products
  let $list :=
    for $tname in $tables
    let $rows  := ($doc[fn:name() eq $tname] | $doc//table[@name eq $tname])
    let $list2 := local:processOneProd($tname, $rows, $tableMap)
    let $fw    := file:append(fn:concat($tablesFolder, $tname, ".xml"), $list2)
    return $fw
  return $list
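For context, -Xmx7G is a plain JVM heap flag; one way to pass it when launching the standalone client directly would be something like this (the jar path and query file name are only placeholders):

    java -Xmx7g -cp basex.jar org.basex.BaseX query.xq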
What is a good way to process a large XML file like this, processing each element one by one?
I appreciate your help
Erol Akarsu
What does the stack trace look like?
Christian,
It does not show any stack trace; it only shows an "Out of Main Memory" error on the "Query Info" page. I think it is impossible to process such a large (9 GB) XML file this way.
I had to index the XML file first, and then it was processed fine.
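Roughly, the database-backed variant looks like the sketch below (the database name "products" and the file path are just placeholders): the file is loaded once with CREATE DB, and the query then uses db:open instead of fn:doc.

    basex -c "CREATE DB products /path/to/products.xml"

    (: sketch only; 'products' is a placeholder database name :)
    for $doc in db:open('products')//Products
    let $list :=
      for $tname in $tables
      let $rows  := ($doc[fn:name() eq $tname] | $doc//table[@name eq $tname])
      let $list2 := local:processOneProd($tname, $rows, $tableMap)
      return file:append(fn:concat($tablesFolder, $tname, ".xml"), $list2)
    return $list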
Erol Akarsu
Did you use debugging (-d)?
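That is, something like the following on the command line (the query file name is just an example), which should make BaseX print the full Java stack trace when the error occurs:

    basex -d query.xq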
Christian,
No, I can run it with debugging and let you know the result.
Sent from my iPhone