
Hi all,

I'm a bit confused, as I expected a MonetDB/XQuery users mailing list rather than a developer list. However, this list was linked from the MonetDB/XQuery FAQ section, so I thought it would be worth trying. Here is my problem:

I'm running some XQuery benchmarks using MonetDB together with the XQuery module. Installation went fine, and I chose a set of queries to evaluate against data generated with the xmlgen tool. Query processing works fine for 10MB and 50MB documents, but when processing larger documents (100MB and 200MB) the client reports an error. I tried to locate the source of the problem using the trace option -t, but I do not understand the listed error messages. Here is the final error message reported by the client:

  ERROR: [rename]: 29 times inserted nil due to errors at tuples "pre_size", "pre_level", "pre_prop", "pre_kind", "qn_uri", "qn_prefix".
  ERROR: [rename]: first error was:
  ERROR: rename(<tmp_351>,pre_size2): operation failed
  ERROR: interpret_unpin: [rename] bat=171,stamp=-1285 OVERWRITTEN
  ERROR: BBPdecref: tmp_253 does not have pointer fixes.
  ERROR: interpret_params: +(param 2): evaluation error.

Once again: evaluation of the same queries works fine on small documents but fails on large ones. By the way, I've installed the latest official release of both the server and the XQuery module.

Is this a bug, or are there default memory limits causing the crash? I tried to figure out what the --set option is good for, but was not able to find any documentation on the supported options.

Kind regards
Michael