[MonetDB-users] Bulk load

Hello,

I am trying to pipe output from my application using bulk load. If the application produces output like this:

    CREATE TABLE test (col1 int);
    COPY 3 RECORDS INTO "test" FROM stdin USING DELIMITERS '\t';
    1
    2
    NULL
    COMMIT;

I can pipe it to MapiClient -lsql -umonetdb -Pmonetdb and it works fine. But the application does not know how many rows it will produce until it has dumped out all the records. When I tried to omit the record count, I got a failed assertion error (a bug?); in any case, it would fail when it encountered the COMMIT; at the end. COPY without a record count can load data from a file, but I do not want to dump the data into a temporary file and then load it, because I want the producing and the loading to run in parallel. Could you suggest a way to do the loading?

--
Best regards,
Andrei Martsinchyk
mailto:andrei.martsinchyk@gmail.com

Responding to myself: I found out that I could do what I needed using named pipes. Below is an example (shell script); perhaps it will be useful for someone. Note that the producer command must be quoted when assigned to a variable, otherwise the shell treats it as two words:

    #!/bin/sh
    # A command that produces data lines
    COMMAND="cat load_test.dat"
    # A name for the pipe
    FIFO_NAME=/tmp/Monet.fifo
    #
    # Create the named pipe
    mkfifo $FIFO_NAME
    # Write data to the pipe in the background
    $COMMAND > $FIFO_NAME &
    # Load data from the pipe
    MapiClient -lsql -umonetdb -Pmonetdb -s "COPY INTO loadtest FROM '$FIFO_NAME' USING DELIMITERS '|'; COMMIT;"
    # Cleanup
    rm $FIFO_NAME

2005/9/27, Andrei Martsinchyk <andrei.martsinchyk@gmail.com>:
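For anyone who wants to see the FIFO mechanics in isolation, the same producer/consumer pattern can be exercised without MonetDB at all; in this sketch a plain cat stands in for MapiClient, and the file names and sample records are made up for illustration:

```shell
#!/bin/sh
# Sketch of the FIFO pattern with plain cat standing in for MapiClient.
FIFO=/tmp/demo.$$.fifo        # $$ makes the name unique per run
mkfifo "$FIFO"

# Producer: write three '|'-delimited records in the background.
# The write blocks until a reader opens the other end of the pipe.
printf '1|a\n2|b\n3|c\n' > "$FIFO" &

# Consumer: read the records as they arrive
# (this is where MapiClient's COPY INTO would sit).
DATA=$(cat "$FIFO")
echo "$DATA"

wait                          # reap the background producer
rm "$FIFO"                    # remove the pipe when done
```

Because the producer blocks until the consumer opens the pipe, the two sides run in parallel and no temporary data file is ever written to disk.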
> Hello,
> I am trying to pipe output from my application using bulk load. If application produced output like this:
>
>     CREATE TABLE test (col1 int);
>     COPY 3 RECORDS INTO "test" FROM stdin USING DELIMITERS '\t';
>     1
>     2
>     NULL
>     COMMIT;
>
> I could pipe it to MapiClient -lsql -umonetdb -Pmonetdb, it worked OK. But application does not know how many rows it will produce, until dump out all the records. When I tried to skip record count I got failed assertion error (a bug ?). Anyway, it would fail when encountered COMMIT; at the end. At the same time COPY without record count can load data from a file, but I do not want to put data into temp file and then do load because I want them run in parallel. Could you suggest a way to do the loading ?
>
> --
> Best regards,
> Andrei Martsinchyk
> mailto:andrei.martsinchyk@gmail.com
--
Best regards,
Andrei Martsinchyk
mailto:andrei.martsinchyk@gmail.com