
I am looking for a Ruby-specific solution for inserting multiple rows with the INSERT ... VALUES syntax.

Currently I generate individual INSERT statements, which is very inefficient from the database's perspective. I found some related questions, but not exactly what I'm looking for.

Example INSERT Statements (currently):

INSERT INTO qux SELECT 1,3,'foo';
INSERT INTO qux SELECT 4,11,'bar';
INSERT INTO qux SELECT 12,19,'baz';

INSERT multiple records using ruby on rails active record

The above question uses ActiveRecord to achieve what I'm after, but I am looking for a pure Ruby approach.

how do I perform transactions with ruby mysql2

The above question uses transactions to achieve something close to what I need. However, this is still not as efficient as a single INSERT ... VALUES statement.
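
For reference, that transaction-based approach looks roughly like the sketch below when written with the mysql2 gem (the connection settings are placeholders, and the qux table matches the example):

require 'mysql2'

# Placeholder connection settings for illustration only.
client = Mysql2::Client.new(host: 'localhost', username: 'user',
                            password: 'secret', database: 'test')

rows = [[1, 3, 'foo'], [4, 11, 'bar'], [12, 19, 'baz']]

# One INSERT per row, wrapped in a single transaction.
client.query('START TRANSACTION')
rows.each do |start, stop, name|
    client.query("INSERT INTO qux VALUES (#{start}, #{stop}, '#{client.escape(name)}')")
end
client.query('COMMIT')

Even inside a transaction, each row is still its own statement and its own round trip to the server, which is why a single INSERT ... VALUES statement is preferable.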

Example data:

start,end,name
1,3,foo
4,11,bar
12,19,baz

INSERT VALUES Statement (desired):

INSERT INTO qux VALUES (1,3,'foo'),(4,11,'bar'),(12,19,'baz');

Is there a gem or existing codebase that already provides this functionality?

UPDATE: I do not use, and do not plan to use, ActiveRecord in this particular project. I interact directly with the MySQL database and would love to know of any solution to my problem. One option may be to build the functionality myself; however, I don't want to reinvent the wheel here.
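
A hand-rolled version with the mysql2 gem might look roughly like this sketch (untested; it assumes an existing Mysql2::Client named client and the qux table from the example):

rows = [[1, 3, 'foo'], [4, 11, 'bar'], [12, 19, 'baz']]

# Escape each string value and build one (v1,v2,'v3') group per row.
values_sql = rows.map do |start, stop, name|
    "(#{start.to_i},#{stop.to_i},'#{client.escape(name)}')"
end.join(',')

# Single statement, single round trip.
client.query("INSERT INTO qux VALUES #{values_sql}")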

Garren S

2 Answers

INSERT INTO SomeTable ( Col1, Col2, Col3 )
SELECT Val1, Val2, Val3 FROM SomeOtherTable
UNION
SELECT 'MyProvidedVal1', 'MyProvidedVal2', 'MyProvidedVal3'
UNION
SELECT 'MyProvidedVal4', 'MyProvidedVal5', 'MyProvidedVal6'

Source: Insert multiple rows using select
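
Generating that UNION form from Ruby data might look like the following rough sketch (assuming the mysql2 gem and an existing client connection; not part of the linked answer):

rows = [
    ['MyProvidedVal1', 'MyProvidedVal2', 'MyProvidedVal3'],
    ['MyProvidedVal4', 'MyProvidedVal5', 'MyProvidedVal6']
]

# One SELECT of literals per row, joined with UNION.
selects = rows.map do |vals|
    'SELECT ' + vals.map { |v| "'#{client.escape(v)}'" }.join(', ')
end.join(' UNION ')

client.query("INSERT INTO SomeTable (Col1, Col2, Col3) #{selects}")

Note that UNION drops duplicate rows; use UNION ALL if duplicates should be kept.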

Dennis


Okay, I came up with a solution that is a bit clumsy, but it works!

In MySQL, make a simple test table:

CREATE TABLE `aa_test` (
  `a` int(11) unsigned NOT NULL AUTO_INCREMENT,
  `b` varchar(256) DEFAULT NULL,
  `c` varchar(256) DEFAULT NULL,
  PRIMARY KEY (`a`)
) ENGINE=InnoDB AUTO_INCREMENT=48 DEFAULT CHARSET=utf8;

In your Ruby code, assemble your rows into one big honkin' array:

values = [1, 'hi', 'there', 2, 'ho', 'there', 47, 'yo', 'buddy']

Build your query dynamically, depending on how many rows you have:

query = "INSERT INTO aa_test (`a`, `b`, `c`) VALUES "
(values.size / 3).times {|ignore|
    query << '(?,?,?),'
}
query.chop! # remove final comma
client.prepare(query).execute(*values)

It works! At least, in irb. :-)

# mysql -h*** -u*** -p*** YourDatabaseHere -e'SELECT * FROM aa_test'
+----+------+-------+
| a  | b    | c     |
+----+------+-------+
|  1 | hi   | there |
|  2 | ho   | there |
| 47 | yo   | buddy |
+----+------+-------+

Of course, you have to keep track of how many fields and rows are in the values array. You can hold your column names in an array, insert them into the query, and use that array's size as the divisor for the loop.

Here's an example with a dynamic number of rows and columns from my application. You just need an array of column names:

col_names = ['Date', 'Receipt_type', 'Type', 'Name', 'Item', 'Num', 'Amount', 'Source', 'Destination', 'Statement_s']

# Build the back-ticked column list
query = 'INSERT INTO sa_general_journal ('
col_names.each do |col_name|
    query << '`' << col_name << '`,'
end
query.chop! # remove the final comma
query << ') VALUES '

# One placeholder group per row; `values` holds the flattened row data
(values.size / col_names.size).times do
    query << '('
    col_names.size.times { query << '?,' }
    query.chop!
    query << '),'
end
query.chop!

client.prepare(query).execute(*values)

Keep in mind that there are server-side limits on prepared statements. max_prepared_stmt_count caps the total number of prepared statements the server will hold, as a guard against denial-of-service attacks; if it is set to zero, you can't use prepared statements at all. Also, if you are loading BLOBs, the prepared-statement data must fit within max_allowed_packet (16 MB on my server). I was loading PDF files into BLOBs with prepared statements, and this bit me a couple of times.
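
If you hit either limit, one workaround (a rough sketch, reusing the col_names, values, and client from above) is to split the data into chunks and run one prepared statement per chunk:

row_size   = col_names.size
chunk_rows = 500 # arbitrary; pick a size that keeps each packet under max_allowed_packet

values.each_slice(row_size * chunk_rows) do |chunk|
    row_count    = chunk.size / row_size
    placeholders = Array.new(row_count, "(#{Array.new(row_size, '?').join(',')})").join(',')
    columns      = col_names.map { |c| "`#{c}`" }.join(',')
    client.prepare("INSERT INTO sa_general_journal (#{columns}) VALUES #{placeholders}").execute(*chunk)
end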

I came here looking for an answer, but had to come up with one myself. I KNOW this is an old question. (People have jumped on me for answering old questions!) But I hope this is useful to someone who gets here by searching, as I did!

Jan Steinman