Questions tagged [bulkinsert]

Act of inserting multiple rows into a database simultaneously.

The purpose is to speed up loading of large amounts of data into a database. Depending on the database or database drivers used, the data is generally transferred and committed to the database in large groups of rows instead of one row at a time.
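The grouped-transfer idea above can be sketched in a few lines. This is a minimal illustration using Python's built-in sqlite3 driver; the table and column names are invented for the example.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE logs (id INTEGER, message TEXT)")

rows = [(i, f"event {i}") for i in range(10_000)]

# executemany() lets the driver send the whole batch inside a single
# transaction, instead of round-tripping and committing one row at a time.
with conn:
    conn.executemany("INSERT INTO logs (id, message) VALUES (?, ?)", rows)

count = conn.execute("SELECT COUNT(*) FROM logs").fetchone()[0]
print(count)  # 10000
```

The same pattern exists in most drivers (executemany, addBatch/executeBatch, SqlBulkCopy), and server-side loaders such as BULK INSERT or LOAD DATA INFILE push the batching even further.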

Also include appropriate tags for the database and the access method used.

2493 questions
0
votes
1 answer

Bulk delete in oracle database with toad

I have a table with over 24 million log records, and now we are trying to reduce that. Due to company policy we aren't allowed to do a truncate, move or anything of that sort. The records have to be deleted from that table in one fluid go, approx 23…
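A common way to delete millions of rows without truncating is to delete in fixed-size chunks, committing after each chunk so undo/redo stays bounded. A sketch of that loop, using SQLite as a stand-in for Oracle (in Oracle the chunk limit would typically be `ROWNUM <= :n`); table and column names are invented:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE log_records (id INTEGER PRIMARY KEY, keep INTEGER)")
conn.executemany(
    "INSERT INTO log_records (keep) VALUES (?)",
    [(i % 10 == 0,) for i in range(1000)],  # keep 1 row in 10
)

CHUNK = 100
while True:
    # Delete at most CHUNK matching rows, then commit, then repeat
    # until no rows are left to delete.
    cur = conn.execute(
        "DELETE FROM log_records WHERE id IN "
        "(SELECT id FROM log_records WHERE keep = 0 LIMIT ?)",
        (CHUNK,),
    )
    conn.commit()
    if cur.rowcount == 0:
        break

remaining = conn.execute("SELECT COUNT(*) FROM log_records").fetchone()[0]
print(remaining)  # 100
```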
0
votes
1 answer

amazon redshift jdbc driver CopyManager.copyIn COPY command syntax

I am looking for an example of what command to pass to the amazon redshift jdbc driver CopyManager.copyIn method. The code comments specifically state that it is intended to accept STDIN for the inbound data that will be copied to the specified…
Joel
  • 124
  • 1
  • 8
0
votes
2 answers

MySQL Query, bulk insertion

I have bulk data for insertion into MySQL tables; let us suppose 10k at a time. What I am doing is storing the data in an XML file and then inserting it (the data is around 50K rows), which takes a lot of time. Is there any option for bulk…
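For this kind of volume, the usual fix is to batch the rows instead of issuing one INSERT per row. A sketch of chunked batching, shown here with sqlite3 and a hypothetical schema (with MySQL the same shape works via a driver's executemany, or LOAD DATA INFILE for files):

```python
import sqlite3

def insert_in_chunks(conn, rows, chunk_size=1000):
    """Insert rows in fixed-size batches, one commit per batch."""
    for start in range(0, len(rows), chunk_size):
        batch = rows[start:start + chunk_size]
        with conn:  # commits the whole batch at once
            conn.executemany(
                "INSERT INTO items (name, qty) VALUES (?, ?)", batch
            )

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE items (name TEXT, qty INTEGER)")
rows = [(f"item-{i}", i) for i in range(10_000)]
insert_in_chunks(conn, rows)
print(conn.execute("SELECT COUNT(*) FROM items").fetchone()[0])  # 10000
```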
Ashok
0
votes
2 answers

Rust: How do we run a DML on MySQL with lots of parameters?

I want to insert lots of records in batches. Some of the fields are varchars, so I want to pass that data as dynamic parameters. I am aware I could use exec_batch(), but that wouldn't be a good solution performance-wise, as it seems to send each row…
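The standard alternative to per-row execution is to expand the placeholder list to cover every row and send one multi-row INSERT with a flattened parameter list. The idea, sketched in Python with sqlite3 (table and column names invented; the same string-building works for a Rust MySQL driver):

```python
import sqlite3

def multi_row_insert(conn, rows):
    # Build "(?, ?), (?, ?), ..." — one placeholder group per row.
    row_ph = "(" + ", ".join("?" * len(rows[0])) + ")"
    sql = ("INSERT INTO users (name, email) VALUES "
           + ", ".join([row_ph] * len(rows)))
    flat = [v for row in rows for v in row]  # flatten parameters in row order
    conn.execute(sql, flat)

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, email TEXT)")
multi_row_insert(conn, [("a", "a@x"), ("b", "b@x"), ("c", "c@x")])
print(conn.execute("SELECT COUNT(*) FROM users").fetchone()[0])  # 3
```

Note that servers cap the number of bound parameters per statement, so large inputs still need to be split into chunks of rows.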
at54321
  • 8,726
  • 26
  • 46
0
votes
1 answer

How to import a CSV with a TIME column on SSMS (SQL)

I have a CSV with the following format: COL1 | COL2, e.g. 20211002 | 163136 (YYYYMMDD | HHMMSS). In SQL I am creating a table with the following format: CREATE TABLE dbo.[table] ( [COL1] DATE, [COL2] TIME, ); BULK LOAD: BULK INSERT…
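When BULK INSERT can't coerce YYYYMMDD / HHMMSS text straight into DATE and TIME columns, one workaround is to pre-convert the CSV (or load into VARCHAR staging columns and CONVERT afterwards). A sketch of the conversion step only, with the formats taken from the sample row above:

```python
from datetime import datetime

def parse_row(col1, col2):
    """Convert YYYYMMDD and HHMMSS text into ISO date/time strings."""
    d = datetime.strptime(col1, "%Y%m%d").date()
    t = datetime.strptime(col2, "%H%M%S").time()
    return d.isoformat(), t.isoformat()

print(parse_row("20211002", "163136"))  # ('2021-10-02', '16:31:36')
```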
0
votes
0 answers

DATA_SOURCE USING BULK INSERT in SQL Server does not compile

I'm trying to load data from a sample file, but I'm using a couple of parameters in the WITH clause and hence it does not compile. I'm using SSMS 18; not sure if that is an issue? Not working: BULK INSERT #data FROM 'sample.csv' WITH ( …
newbie
  • 75
  • 1
  • 8
0
votes
1 answer

Bulk inserting to MYSQL through NodeJS With Multiple Placeholders and SELECT

I'm having trouble getting the following to work. Any help would be massively appreciated var values = [ ['Product1', 1, 1], ['Product2', 1, 1], ]; dbConn.query( 'INSERT INTO tblQuoteItems…
Tom.Hendry
  • 31
  • 7
0
votes
2 answers

How to read csv and write it in a table using a query?

I want to read a CSV file and then write it to a table. The solutions that I've found online haven't helped me and I don't know how to do it. My CSV file has 2 columns like this: ID NAME 1 Sylvia 2 John ... And I want to read it and import it to…
Ulyses Evans
  • 63
  • 1
  • 8
0
votes
1 answer

How can I add additional columns during BULK INSERT without modifying source file

I have a BULK INSERT into a table but I need to add data that is not in my csv for the last column in each row of the table. For example, if my csv file contains 10 columns but the database table contains 11 columns, how can I add data for the last…
Dar W
  • 1
0
votes
1 answer

Most efficient method to import bulk JSON data from different sources in postgresql?

I need to import data from thousands of URLs, here is an example of the…
0
votes
1 answer

EF Core increasing memory usage in insert loop

I have a long running application that downloads some data and inserts them in bulks in a sql server database. I have disabled tracking and set AutoDetectChangesEnabled = false _context.ChangeTracker.QueryTrackingBehavior =…
albert
  • 1,493
  • 1
  • 15
  • 33
0
votes
2 answers

Mysql losing 11 records on insert

I download an XML file containing 1048 records, and then I successfully create a table($today) in my DB, and load the XML data into the MySQL table. I then run a second script which contains this query: INSERT INTO t1 ( modelNumber, …
Ryan
  • 14,392
  • 8
  • 62
  • 102
0
votes
1 answer

Insert 1 million rows into MySQL Server quickly

I am trying to insert 2 million rows into a MySQL database in a single commit. But I could not find a better solution for this operation. For MS SQL I found the following solution. SqlBulkCopy objbulk = new SqlBulkCopy(conn1); //assign…
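MySQL's closest analogue to SqlBulkCopy is LOAD DATA [LOCAL] INFILE: stream the rows to a CSV file, then let the server bulk-load it. A sketch that generates the file and the statement (table and column names are hypothetical, and the row count is scaled down here):

```python
import csv
import os
import tempfile

rows = [(i, f"name-{i}") for i in range(100_000)]

# Write the rows as plain CSV with '\n' line endings to match the
# LINES TERMINATED BY clause below.
fd, path = tempfile.mkstemp(suffix=".csv")
with os.fdopen(fd, "w", newline="") as f:
    csv.writer(f, lineterminator="\n").writerows(rows)

load_sql = (
    f"LOAD DATA LOCAL INFILE '{path}' INTO TABLE big_table "
    "FIELDS TERMINATED BY ',' LINES TERMINATED BY '\\n' (id, name)"
)
print(load_sql)
```

Executing `load_sql` through the MySQL client would then load all rows in one server-side pass, which is typically much faster than any client-side INSERT loop.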
0
votes
1 answer

How to index a 1 billion row CSV file with elastic search?

Imagine you had a large CSV file - let's say 1 billion rows. You want each row in the file to become a document in elastic search. You can't load the file into memory - it's too large, so it has to be streamed or chunked. The time taken is not a…
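Elasticsearch's _bulk API takes newline-delimited JSON: an action line followed by the document. The streaming part can be sketched without any client library by reading the CSV lazily and yielding payloads of N documents, so the file never has to fit in memory. The index name and chunk size here are made up:

```python
import csv
import io
import json

def bulk_payloads(lines, index="big-index", chunk=2):
    """Yield _bulk request bodies of `chunk` documents each."""
    reader = csv.DictReader(lines)
    buf = []
    for row in reader:
        buf.append(json.dumps({"index": {"_index": index}}))  # action line
        buf.append(json.dumps(row))                           # document line
        if len(buf) >= chunk * 2:
            yield "\n".join(buf) + "\n"
            buf = []
    if buf:  # flush the final partial chunk
        yield "\n".join(buf) + "\n"

sample = io.StringIO("id,name\n1,a\n2,b\n3,c\n")
payloads = list(bulk_payloads(sample))
print(len(payloads))  # 2  (two docs in the first payload, one in the second)
```

Each yielded payload would be POSTed to `/_bulk`; in practice a helper such as the official client's streaming bulk helper wraps exactly this pattern.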
TomDane
  • 1,010
  • 10
  • 25
0
votes
1 answer

Doctrine Many-To-Many bulk insert

I need to do a bulk insert of thousands of records (5k up to 20k). The scenario is User<->n:m<->Group. The list of users is obtained by a complex query with many joins. I have access to the QueryBuilder that generates the list. The simplest…
Jack Skeletron
  • 1,351
  • 13
  • 37