Questions tagged [bulk-load]

To bulk load means to import a data file in a user-specified format into a database table or view.

351 questions
5 votes, 0 answers

Firebase Firestore emulator slow for bulk import

The local Firestore emulator for Firebase seems fairly slow when dealing with bulk data. I'm trying to simulate a production batch import of ~5,000,000 documents, but after a few thousand records the server is so overwhelmed it slows to a rate of…
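Firestore commits at most 500 writes per batch, so large imports are usually chunked client-side. A minimal chunking helper as a sketch; the `google-cloud-firestore` usage in the trailing comment is an illustrative assumption, not code from the question:

```python
from itertools import islice

def chunked(iterable, size=500):
    """Yield successive lists of at most `size` items; 500 is the
    per-batch write limit enforced by Firestore."""
    it = iter(iterable)
    while True:
        chunk = list(islice(it, size))
        if not chunk:
            return
        yield chunk

# Hypothetical usage with google-cloud-firestore:
# for chunk in chunked(documents, 500):
#     batch = db.batch()
#     for doc in chunk:
#         batch.set(db.collection("items").document(), doc)
#     batch.commit()
```

Pausing between batch commits is a common way to keep the emulator from being overwhelmed.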
5 votes, 2 answers

How to write using BCP to a remote SQL Server?

I have a remote SQL Server with a hostname I am using to connect to. The BCP documentation suggests using bcp DBName.dbo.tablename in C:\test\yourfile.txt -c -T -t. However, when I try this it does not connect to DBName, as that is not a valid alias. I get native…
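The usual fix is bcp's `-S` switch, which names the remote server; without it, bcp targets the local default instance. A hedged sketch that only assembles the command line (the host name `myhost` is a placeholder):

```python
def build_bcp_command(table, infile, server):
    """Assemble a bcp import command line; -S names the remote server.
    When -S is omitted, bcp tries the local default instance, which is
    why connecting by database name alone fails."""
    return [
        "bcp", table, "in", infile,
        "-c",          # character-mode data file
        "-T",          # trusted (Windows) authentication
        "-S", server,  # e.g. "myhost" or "myhost\\SQLEXPRESS"
    ]

cmd = build_bcp_command("DBName.dbo.tablename", r"C:\test\yourfile.txt", "myhost")
# subprocess.run(cmd, check=True)  # run where the bcp utility is installed
```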
Dave Fish (93)
5 votes, 1 answer

App Engine Bulk Loader Performance

I am using the App Engine Bulk Loader (Python runtime) to bulk upload entities to the datastore. The data that I am uploading is stored in a proprietary format, so I have implemented my own connector (registered in bulkload_config.py) to convert…
Rahul (19,744)
5 votes, 1 answer

Execute COPY command in remote database using local file

I'm trying to execute the Postgres COPY command: COPY warehouse_terminal.test_table FROM 'C:\file.csv' DELIMITERS E'\t' CSV HEADER QUOTE E'\"' ESCAPE E'\\' NULL AS ''; The problem is, the database where I'm going to use this is remote, and the file…
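Server-side `COPY ... FROM 'path'` reads a file on the database host; for a local file the common alternatives are psql's client-side `\copy` or streaming with `COPY ... FROM STDIN`. A sketch of the latter (the `psycopg2` usage in the comment is an assumption, not from the question):

```python
def copy_from_stdin_sql(table, delimiter="\t"):
    """COPY ... FROM STDIN streams data over the client connection, so
    the CSV can live on the local machine instead of the database host."""
    return (
        f"COPY {table} FROM STDIN "
        f"WITH (FORMAT csv, HEADER, DELIMITER '{delimiter}')"
    )

# Hypothetical usage with psycopg2 (connection setup omitted):
# with open(r"C:\file.csv") as f, conn.cursor() as cur:
#     cur.copy_expert(copy_from_stdin_sql("warehouse_terminal.test_table"), f)
# conn.commit()
```

psql's `\copy` meta-command is the no-code equivalent: it reads the local file and issues the same FROM STDIN form under the hood.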
muffin (2,034)
5 votes, 1 answer

SSTable writer for Python Cassandra

Is there a Python variant of the SSTable writers for Cassandra? I found many Java examples, e.g. http://amilaparanawithana.blogspot.com/2012/06/bulk-loading-external-data-to-cassandra.html. Is this still something that is under consideration?
5 votes, 3 answers

R bulk upload data to MySQL database

There is the package RMySQL. How can I bulk upload lots of data to MySQL from R? I have a CSV with around 1 million lines and 80 columns. Would something like this work? dbWriteTable(con, "test2", "~/data/test2.csv") ## table from a file I fear…
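For files this size, `LOAD DATA LOCAL INFILE` is typically the fast path in MySQL, and RMySQL's `dbWriteTable` is generally reported to issue such a statement when handed a file path. A sketch (in Python, purely for illustration) of the statement being built; path and table name echo the question:

```python
def load_data_sql(path, table, delimiter=","):
    """Build a LOAD DATA LOCAL INFILE statement; LOCAL makes the MySQL
    client read the file and stream it to the server, which is usually
    far faster than row-by-row inserts."""
    return (
        f"LOAD DATA LOCAL INFILE '{path}' INTO TABLE {table} "
        f"FIELDS TERMINATED BY '{delimiter}' "
        f"IGNORE 1 LINES"
    )
```

Column order in the file must match the table; in R the equivalent statement could be sent with dbGetQuery on the open connection.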
user670186 (2,588)
4 votes, 2 answers

"Compressor detection can only be called on some xcontent bytes or compressed xcontent bytes" error when indexing a list of dictionaries

This question is related to this other one: How can I read data from a list and index specific values into Elasticsearch, using python? I have written a script to read a list ("dummy") and index it into Elasticsearch. I converted the list into a…
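The "Compressor detection" error usually means a whole list was passed where a single document body was expected; each list element needs to become its own bulk action. A sketch (the index name and the `elasticsearch-py` usage in the comment are assumptions, not from the question):

```python
def bulk_actions(docs, index="dummy-index"):
    """Wrap each dictionary in its own bulk action; passing the whole
    list as one document body is what triggers the compressor-detection
    error."""
    return [{"_index": index, "_source": doc} for doc in docs]

# Hypothetical usage with elasticsearch-py:
# from elasticsearch import Elasticsearch, helpers
# helpers.bulk(Elasticsearch("http://localhost:9200"), bulk_actions(dummy))
```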
Aizzaac (3,146)
4 votes, 2 answers

Use MySQL's LOAD DATA INFILE with node.js?

Are there any node.js MySQL drivers that support LOAD DATA INFILE (http://dev.mysql.com/doc/refman/5.0/en/load-data.html)? I have some utilities that rely heavily on bulk insertion, and can't find any node.js drivers that explicitly mention…
jhurliman (1,790)
4 votes, 0 answers

Cassandra bulk loader to load from compressed CSV files

Is there a way to use Cassandra bulk-loading solutions (e.g. COPY) to load from zipped CSV files without having to unzip them? I have 48GB compressed data in Amazon S3 that I need to load to Cassandra. The resources/disk storage aren't such that I…
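Whether the loader itself can do this depends on the tool, but client-side code can stream rows straight out of the compressed file so nothing is ever decompressed to disk. A sketch assuming gzip compression (for `.zip` archives the stdlib `zipfile` module plays the same role):

```python
import csv
import gzip

def rows_from_gzip(path):
    """Stream parsed CSV rows straight out of a .gz file; the data is
    decompressed on the fly and never written to disk."""
    with gzip.open(path, mode="rt", newline="") as fh:
        yield from csv.reader(fh)
```

The rows could then be fed to batched INSERTs through a driver (e.g. the DataStax Python driver), since `cqlsh COPY` itself expects a plain file.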
Jagrati Gogia (221)
4 votes, 2 answers

Solutions to put different values for a row key but the same timestamp in HBase?

I'm new to HBase. I'm facing a problem when bulk loading data from a text file into HBase. Assume I have the following table: Key_id | f1:c1 | f2:c2, row1 'a' 'b', row1 'x' 'y'. When I parse 2 records and put them into HBase at the same…
Bing Farm (75)
4 votes, 2 answers

What is the fastest way to load data into a Cassandra column family?

I created a Cassandra column family and I need to load data from a CSV file into it. The CSV file is about 15 GB. I am using the CQL 'COPY FROM' command, but it takes a long time to load the data. What is the…
Pedro Cunha (401)
4 votes, 2 answers

XML Schema Error: Required white space was missing

I have been searching on this for hours and cannot figure out the issue. Could someone please help me? I am getting the above error when executing a SQLXMLBulkLoad in VB.NET 2010. I have attempted changing my XML declaration, my schema…
Josh McKearin (742)
3 votes, 1 answer

SQL Server 2008 BULK INSERT causes more reads than writes. Why?

I have a huge table (a few billion rows) with a clustered index and two non-clustered indexes. A BULK INSERT operation produces 112,000 reads but only 383 writes (duration 19,948 ms). This is very confusing to me. Why do reads exceed writes? How can I…
sh1ng (2,808)
3 votes, 1 answer

Bulk API error while indexing data into Elasticsearch

I want to import some data into Elasticsearch using the Bulk API. This is the mapping I have created using the Kibana dev tools: PUT /main-news-test-data { "mappings": { "properties": { "content": { "type": "text" }, "title":…
lydal (802)
3 votes, 4 answers

AUTOINCREMENT primary key for Snowflake bulk loading

I would like to upload data into a Snowflake table that has a primary key field with AUTOINCREMENT. When I tried to upload data without the primary key field, I received the following error message: The COPY failed with…
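One commonly suggested route is to name only the data columns in the COPY and let the AUTOINCREMENT column fall back to its sequence default; Snowflake's COPY supports this with a transformation subquery over the staged file. A sketch that just builds the statement (table, column, and stage names are placeholders):

```python
def copy_without_autoinc(table, columns, stage):
    """Name only the data columns and select matching positional fields
    ($1, $2, ...) from the staged file; the omitted AUTOINCREMENT column
    then takes its sequence default."""
    cols = ", ".join(columns)
    fields = ", ".join(f"${i}" for i in range(1, len(columns) + 1))
    return f"COPY INTO {table} ({cols}) FROM (SELECT {fields} FROM {stage})"
```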