Questions tagged [bulk-load]

To bulk load means to import a data file in a user-specified format into a database table or view.

351 questions
2
votes
1 answer

Can reads occur whilst executing a bulk \copy batch of inserts

I plan to batch-insert a large volume of rows into a Postgres table using the \copy command once per minute. My benchmarks show I should be able to insert about 40k rows per second, and I plan to do this for 3 or 4 seconds each minute. Are…
Fachtna Simi
  • 437
  • 2
  • 13
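
PostgreSQL's MVCC means a COPY-based batch does not block concurrent SELECTs on the table, which is the crux of the question above. A minimal sketch of the per-minute batch using psycopg2 (the table and file names are hypothetical):

    import psycopg2

    # COPY ... FROM STDIN is the server-side equivalent of psql's \copy.
    # Readers see the table's pre-batch state until the transaction commits.
    conn = psycopg2.connect("dbname=mydb")
    with conn, conn.cursor() as cur, open("batch.csv") as f:
        cur.copy_expert("COPY events FROM STDIN WITH (FORMAT csv)", f)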
2
votes
1 answer

How to avoid PHP/WordPress memory fatal error for bulk processing of data in large file uploads?

I have a large CSV file which I am uploading to the WordPress dashboard to import taxonomy terms. I wrote a small plugin which uses the wp_insert_term() function to insert each term; however, the function caches a lot of its data in order to check…
Aurovrata
  • 2,000
  • 27
  • 45
2
votes
1 answer

Gremlin bulk load CSV date format

I am trying to upload data into AWS Neptune, but I am getting an error because of the date format. Sample format of the CSV, header: ~id, ~from, ~to, ~label, date:Date; row: e1, v1, v2, created, 2019-11-04. Can someone help me with this?
Ragul M B
  • 204
  • 3
  • 9
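
The Neptune bulk loader parses date:Date values as ISO 8601, and stray spaces after the delimiter are a common cause of parse errors. A sketch (not from the question) of writing a compliant edge file in Python:

    import csv

    # Write the header and one edge row with an ISO 8601 datetime and no
    # spaces after the commas; plain 2019-11-04 is also valid ISO 8601.
    with open("edges.csv", "w", newline="") as f:
        w = csv.writer(f)
        w.writerow(["~id", "~from", "~to", "~label", "date:Date"])
        w.writerow(["e1", "v1", "v2", "created", "2019-11-04T00:00:00Z"])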
2
votes
1 answer

Bulk load of data containing PostGIS fields into PostgreSQL through binary COPY

Summary: I have an application with a PostgreSQL + PostGIS database setup, and I am trying to load a significant number of rows into one of its tables. After some research, binary COPY seems to be the best approach, but after numerous attempts and…
Santiago
  • 21
  • 3
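
True binary COPY of PostGIS geometries requires emitting EWKB inside COPY's binary framing, which is easy to get wrong. A simpler sketch, under the assumption that text-mode COPY is fast enough: stream rows whose geometry column is hex-encoded EWKB, which a PostGIS geometry column accepts directly (table, column, and connection names are hypothetical):

    import io
    import psycopg2
    from shapely.geometry import Point
    from shapely import wkb

    rows = [(1, Point(2.0, 41.0)), (2, Point(13.4, 52.5))]
    buf = io.StringIO()
    for pk, geom in rows:
        # hex=True yields hex EWKB; srid tags the geometry with its SRID
        buf.write(f"{pk}\t{wkb.dumps(geom, hex=True, srid=4326)}\n")
    buf.seek(0)

    conn = psycopg2.connect("dbname=gisdb")
    with conn, conn.cursor() as cur:
        cur.copy_expert("COPY places (id, geom) FROM STDIN", buf)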
2
votes
1 answer

How to fix bulk loading with the MySQL Python connector

I am trying to bulk load multiple CSV files into a MySQL database that has already been set up. My main issue is with the actual execution of the loading, because it looks like everything else is set up properly. I have tried looking around on Stack…
Beagle
  • 21
  • 1
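
With mysql-connector-python the usual trap is that LOAD DATA LOCAL INFILE is rejected unless the client opts in. A minimal sketch, assuming the CSVs have a header row and the server has local_infile enabled (credentials, file, and table names are placeholders):

    import mysql.connector

    # allow_local_infile=True is required or the LOAD DATA statement fails
    conn = mysql.connector.connect(user="app", password="secret",
                                   database="mydb", allow_local_infile=True)
    cur = conn.cursor()
    for path in ["a.csv", "b.csv"]:
        cur.execute(
            f"LOAD DATA LOCAL INFILE '{path}' INTO TABLE my_table "
            "FIELDS TERMINATED BY ',' IGNORE 1 LINES")
    conn.commit()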
2
votes
0 answers

How to identify the line in my CSV file causing my bulk load MapReduce job to fail in Apache Phoenix

I'm trying to load a CSV file stored on HDFS, with about 140 billion lines, with the Apache Phoenix bulk load tool. export HADOOP_CLASSPATH=$HADOOP_CLASSPATH:/etc/hbase/conf:/etc/hadoop/conf export…
cmatic
  • 51
  • 6
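
Phoenix's bulk load tool rarely points at the offending line, so one pragmatic first step is a streaming structural check. This sketch is independent of Phoenix; at this file size you would pipe the data from hdfs dfs -cat or run the same per-line check as a distributed job rather than reading the file locally:

    import csv
    import sys

    # Flag rows whose field count differs from the header's
    with open(sys.argv[1], newline="") as f:
        reader = csv.reader(f)
        expected = len(next(reader))
        for lineno, row in enumerate(reader, start=2):
            if len(row) != expected:
                print(f"line {lineno}: {len(row)} fields, expected {expected}")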
2
votes
1 answer

SQL Server bulk load

I am working on SQL Server 2017. I need to import 20 text files into one table. Every text file has the same data types and column names. I have checked the data, and the columns are in the same order as well. I need to import into a SQL table and create a new…
Avinash Kumar
  • 41
  • 1
  • 11
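
One straightforward approach is to issue one BULK INSERT per file from a small driver script. A sketch using pyodbc (file list, table, and connection string are hypothetical, and the paths must be visible to the SQL Server service account):

    import pyodbc

    files = [f"C:\\data\\file{i}.txt" for i in range(1, 21)]
    conn = pyodbc.connect("DRIVER={ODBC Driver 17 for SQL Server};"
                          "SERVER=localhost;DATABASE=mydb;Trusted_Connection=yes")
    cur = conn.cursor()
    for path in files:
        # '\t' and '\n' are passed through to T-SQL as literal escape codes
        cur.execute(f"BULK INSERT dbo.MyTable FROM '{path}' WITH "
                    "(FIELDTERMINATOR = '\\t', ROWTERMINATOR = '\\n', FIRSTROW = 2)")
    conn.commit()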
2
votes
2 answers

PHP mysqli multi_query cannot be executed more than once

I've got an array of strings which are SQL "INSERT INTO …" queries. I'm looping through this array, combining each 50 of them into a new string, and then sending it to the MySQL database. After that I reset my temporary string and prepare it for new…
ampher911
  • 81
  • 1
  • 8
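
The multi_query pitfall is that its result sets must be drained (next_result in mysqli) before the connection accepts another query. As a contrast, here is the same 50-rows-per-batch idea in Python with executemany, which the connector rewrites into a single multi-row INSERT per batch, so there is nothing to drain (table and data are hypothetical):

    import mysql.connector

    conn = mysql.connector.connect(user="app", database="mydb")
    cur = conn.cursor()
    rows = [("a", 1), ("b", 2)]  # placeholder data
    for i in range(0, len(rows), 50):  # batches of 50, as in the question
        cur.executemany("INSERT INTO t (name, n) VALUES (%s, %s)",
                        rows[i:i + 50])
    conn.commit()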
2
votes
2 answers

Spark issues in creating HFiles: "Added a key not lexically larger than previous cell"

I am trying to create HFiles to do a bulk load into HBase, and it keeps throwing an error about the row key even though everything looks fine. I am using the following code: val df = sqlContext.read.format("com.databricks.spark.csv") …
showstealer
  • 23
  • 1
  • 3
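
The "Added a key not lexically larger than previous cell" error almost always means cells reach the HFile writer out of order: they must be sorted by the full cell key (row key, column family, qualifier), not just the row key. A PySpark sketch of the ordering step only (the question uses Scala; the HFile write itself is omitted):

    from pyspark import SparkContext

    sc = SparkContext.getOrCreate()
    # Hypothetical cells keyed by (rowkey, "cf:qualifier"); sorting on the
    # composite key is what the HFile writer requires within each partition.
    cells = sc.parallelize([(("r2", "cf:b"), "2"), (("r1", "cf:a"), "1")])
    ordered = cells.repartitionAndSortWithinPartitions(numPartitions=1)
    # With multiple partitions, the partitioner must also align with the
    # target table's region boundaries before handing cells to the writer.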
2
votes
1 answer

Redis mass insertion: protocol vs inline commands

For my task I need to load a large amount of data into Redis as quickly as possible. It looks like this article matches my case: https://redis.io/topics/mass-insert The article starts by giving an example of using multiple inline SET commands with…
greatvovan
  • 2,439
  • 23
  • 43
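
The article's conclusion is that emitting the RESP protocol directly and piping it to redis-cli --pipe beats inline commands, because the server skips free-form parsing and the client pays no per-command round trip. A Python port of the article's Ruby gen_redis_proto generator:

    import sys

    def gen_redis_proto(*args):
        # RESP framing: an array header, then one bulk string per argument
        out = f"*{len(args)}\r\n"
        for a in map(str, args):
            out += f"${len(a.encode())}\r\n{a}\r\n"
        return out

    for i in range(1000):
        sys.stdout.write(gen_redis_proto("SET", f"key:{i}", f"value:{i}"))

Run it as: python gen.py | redis-cli --pipe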
2
votes
1 answer

How to enter NULL for missing columns in a CSV?

I am trying to perform a bulk insert from a CSV file. My CSV file has 7 columns, but the table contains 8 columns. I am able to perform the bulk insert with the query below if my table has 8 columns only. BULK INSERT Table_Name FROM 'E:\file\input.csv' WITH…
Mister X
  • 3,406
  • 3
  • 31
  • 72
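
Two standard workarounds are a bcp format file that maps only the 7 present fields, or a view over the 7 matching columns so the 8th falls back to its default/NULL. A sketch of the view route via pyodbc (the view name and DSN are hypothetical; the path is the one from the question):

    import pyodbc

    conn = pyodbc.connect("DSN=mydb")  # placeholder connection
    cur = conn.cursor()
    # The view exposes only the CSV-backed columns of the base table
    cur.execute("CREATE VIEW dbo.Table_Name_Load AS "
                "SELECT col1, col2, col3, col4, col5, col6, col7 "
                "FROM dbo.Table_Name")
    # BULK INSERT into the view; the unlisted 8th column gets default/NULL
    cur.execute("BULK INSERT dbo.Table_Name_Load FROM 'E:\\file\\input.csv' "
                "WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\\n')")
    conn.commit()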
2
votes
1 answer

Unable to import data from HDFS to HBase using importtsv

I moved a tab-delimited file into HDFS and am now trying to move it to HBase. Below is my importtsv command: bin/hbase org.apache.hadoop.hbase.mapreduce.ImportTsv…
Elijah
  • 306
  • 1
  • 12
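
A frequent cause of ImportTsv failures is the -Dimporttsv.columns mapping, which must list HBASE_ROW_KEY plus one family:qualifier per TSV column, in file order. A sketch of a complete invocation driven from Python (table, columns, and paths are hypothetical):

    import subprocess

    # Equivalent to running the ImportTsv command from the shell
    subprocess.run([
        "bin/hbase", "org.apache.hadoop.hbase.mapreduce.ImportTsv",
        "-Dimporttsv.columns=HBASE_ROW_KEY,cf:name,cf:city",
        "my_table",
        "hdfs:///user/me/data.tsv",
    ], check=True)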
2
votes
2 answers

Get the time to populate a Postgres table from a CSV file

Is there any way to determine the time Postgres takes to populate a table from a CSV file? I tried using the 'time' command, but it did not work: time copy TABLE_NAME from /home/ankit/Documents/file.csv delimiter ',' csv header ; The error…
Ankit Shubham
  • 2,989
  • 2
  • 36
  • 61
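
The command fails because time is a shell utility and copy is SQL; inside psql the equivalent is \timing followed by \copy. From Python, a sketch that wraps the load in a wall-clock timer (connection details are hypothetical; the path is the one from the question):

    import time
    import psycopg2

    conn = psycopg2.connect("dbname=mydb")
    with conn, conn.cursor() as cur, open("/home/ankit/Documents/file.csv") as f:
        start = time.perf_counter()
        cur.copy_expert(
            "COPY table_name FROM STDIN WITH (FORMAT csv, HEADER, DELIMITER ',')", f)
        print(f"loaded in {time.perf_counter() - start:.2f}s")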
2
votes
0 answers

How to generate SSTable files with Hadoop MapReduce in Cassandra?

I have 500 GB of data on HDFS to transfer to a Cassandra cluster. I think the fastest way is to use the Cassandra sstableloader to bulk load SSTable files into Cassandra. Cassandra 3.x provides the client API CQLSSTableWriter to generate SSTable files,…
Yan Bo
  • 21
  • 4
2
votes
2 answers

Postgres bulk load using a control card

We have a case where we are loading data from a flat file into a Postgres table. It is a delimited file. File content: A|B|C 1.1|2016-12-20|3. I want to load only columns A, B & C, but the table has 10 columns. In Oracle, using SQL*Loader, we can…
Raja
  • 507
  • 1
  • 6
  • 24
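
Postgres has no SQL*Loader control file, but COPY takes an explicit column list, which covers this case: the unlisted columns receive their defaults or NULL. A psycopg2 sketch (table and column names are hypothetical):

    import psycopg2

    conn = psycopg2.connect("dbname=mydb")
    with conn, conn.cursor() as cur, open("data.txt") as f:
        # Only the three named columns are populated from the '|' file
        cur.copy_expert(
            "COPY target_table (col_a, col_b, col_c) "
            "FROM STDIN WITH (FORMAT csv, DELIMITER '|', HEADER)", f)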