
uriel katz wrote:
> i have an application with constant bulk inserts (about 50,000 rows of
> 1 KB each) every minute, and i am wondering which is the fastest and
> most efficient way to insert data. i am using the COPY command now,
> but it is kind of awkward since i need to dump my data as CSV and then
> execute COPY to insert it. is there any streaming method or a special
> API for bulk inserts?

COPY INTO is the fastest. I think it is possible to inline the tuples into the SQL stream as well; the SQL developer will answer this one.

> also, when i issue a COPY command it uses a little bit of CPU, around
> 2% (it is a quad-core setup on Windows, so i guess this means 8% of
> one core), and it keeps loading and releasing memory (i have 8 GB of
> RAM), even though it doesn't get near 2 GB. is this ok? what is it
> actually doing?

We have recently upgraded our bulk loader to utilize as many cores as possible. This is not yet in the release. A preview of the effects can be seen at http://monetdb.cwi.nl/projects/monetdb//SQL/Benchmark/TPCH/

> P.S.: are insert/selects multi-threaded?
>
> thanks for this awesome piece of software!
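The two approaches discussed above, dumping the rows as CSV for COPY INTO versus inlining the tuples into the SQL stream as a multi-row INSERT, can be sketched as follows. This is a minimal illustration: the table name `events`, its columns, and the file path are hypothetical, and the COPY INTO syntax shown follows the form documented in the MonetDB SQL manual; check your server version for the exact options. The string escaping in the INSERT branch is deliberately naive and not safe for untrusted data.

```python
import csv
import io

# Hypothetical rows for a hypothetical two-column table "events".
rows = [(1, "alpha"), (2, "beta"), (3, "gamma")]

# Approach 1: serialize the rows as CSV, then load the file with COPY INTO.
buf = io.StringIO()
writer = csv.writer(buf, lineterminator="\n")
writer.writerows(rows)
csv_text = buf.getvalue()  # would be written to /tmp/events.csv

copy_stmt = (
    "COPY {n} RECORDS INTO events "
    "FROM '/tmp/events.csv' USING DELIMITERS ',', '\\n', '\"';"
).format(n=len(rows))

# Approach 2: inline the tuples directly into the SQL stream as one
# multi-row INSERT (no intermediate file, but more parsing work server-side).
values = ", ".join("({0}, '{1}')".format(i, s) for i, s in rows)
insert_stmt = "INSERT INTO events VALUES {0};".format(values)

print(csv_text)
print(copy_stmt)
print(insert_stmt)
```

Stating the record count up front in COPY INTO lets the loader preallocate, which is part of why the CSV path tends to be the fastest route for batches of this size.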
-Uriel
_______________________________________________
MonetDB-users mailing list
MonetDB-users@lists.sourceforge.net
https://lists.sourceforge.net/lists/listinfo/monetdb-users