
In cqlsh I'm trying to insert a record with:

INSERT INTO data 
(order_id, order_ts, transaction_id, transaction_discount, transaction_qty,
 transaction_total, product_category, product_profit, product_upc, product_name,
 product_price, product_distributor, store_id, store_name, store_state,
 store_region, id) 
VALUES ('YBC8RWE18', 1368438171000, 'LQKLVVI4E', 0, 1, 140.72,
 'Office Supplies', 12.42, 'YT8899H3357', 'Casio USB Model FX-9860GII', 140.72,
 'QR0', '2YOYWMR28Q', 'BigLots', 'AZ', 'Southwest',
 2259a88e-b62d-4625-a86e-b86d77418a34);

Looks fine, but I'm getting a NumberFormatException:

Caused by: java.lang.NumberFormatException: Zero length BigInteger
    at java.math.BigInteger.<init>(BigInteger.java:190)
    at org.apache.cassandra.serializers.DecimalSerializer.deserialize(DecimalSerializer.java:41)
    at org.apache.cassandra.serializers.DecimalSerializer.deserialize(DecimalSerializer.java:26)
    at org.apache.cassandra.db.marshal.AbstractType.compose(AbstractType.java:142)
    at org.apache.cassandra.db.marshal.DecimalType.compare(DecimalType.java:46)

Looks like DecimalSerializer.deserialize is the real issue here. If I try to surround the decimals in quotes (worth a shot, I thought) I get:

Bad Request: Invalid STRING constant (140.72) for product_price of type decimal

So that didn't help. What do I need to do to insert a decimal? Should I post up the COLUMNFAMILY def?

Here's the DESCRIBE TABLE:

CREATE TABLE data (
  id uuid,
  order_id text,
  order_ts timestamp,
  product_category text,
  product_distributor text,
  product_name text,
  product_price decimal,
  product_profit decimal,
  product_upc text,
  store_id text,
  store_name text,
  store_region text,
  store_state text,
  transaction_discount decimal, 
  transaction_id text,
  transaction_qty int,
  transaction_total decimal,
  PRIMARY KEY (id)
)

If I take off the quotes around 140.72 I get Request did not complete within rpc_timeout, and the logs show the deserialize error. If I insert just a few columns it's fine -- until I include the product_price field.
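For example, a cut-down statement like this (just a sketch, reusing the same id and price from above) is enough to trigger the timeout:

INSERT INTO data (id, product_price)
VALUES (2259a88e-b62d-4625-a86e-b86d77418a34, 140.72);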

jcollum
  • What is the **DESCRIBE TABLE data** output in cqlsh? – Lyuben Todorov Jun 05 '14 at 01:23
  • What version of cassandra are you using? I just tried it against 2.0.8 and it works fine. – mikea Jun 05 '14 at 15:17
  • @mikea 2.0.8, fresh install yesterday. `[cqlsh 4.1.1 | Cassandra 2.0.8 | CQL spec 3.1.1 | Thrift protocol 19.39.0]` – jcollum Jun 05 '14 at 15:26
  • @jcollum that's very strange because I just created your table and ran your insert without any error. Can you try inserting just that column. – mikea Jun 05 '14 at 15:28
  • I have a bunch of indexes and I'm getting `Request did not complete within rpc_timeout.` when I do what I expect to be a valid insert, is it possible that the rpc_timeout is too low out of the box? – jcollum Jun 05 '14 at 15:41
  • (tried looking for the rpc timeout in the yaml config, didn't see it) – jcollum Jun 05 '14 at 17:42
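For what it's worth, on the rpc_timeout point from the comments: a stock Cassandra 2.0 cassandra.yaml no longer has a single rpc_timeout key; the client-side timeouts are split into per-operation settings, roughly these (shown with their 2.0 defaults):

read_request_timeout_in_ms: 5000
write_request_timeout_in_ms: 2000
request_timeout_in_ms: 10000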

2 Answers


I didn't get any answers, but I did get some help.

The answer to this one isn't very satisfying. None of the errors really clued me in to what was going on, but I thought: there are really only a few moving parts here, the table def and the indexes. So I dropped the table and recreated it without indexes, which fixed the issue. Then I recreated the indexes and the issue stayed fixed. So perhaps the indexes were messed up?
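Roughly the sequence I ran, sketched from memory (the index names below are made up, and only a couple of the indexes are shown):

DROP TABLE data;

CREATE TABLE data (
  id uuid,
  order_id text,
  order_ts timestamp,
  product_category text,
  product_distributor text,
  product_name text,
  product_price decimal,
  product_profit decimal,
  product_upc text,
  store_id text,
  store_name text,
  store_region text,
  store_state text,
  transaction_discount decimal,
  transaction_id text,
  transaction_qty int,
  transaction_total decimal,
  PRIMARY KEY (id)
);

-- re-run the original INSERT here and confirm the decimal columns work
-- before adding anything back

-- then recreate the secondary indexes (hypothetical names)
CREATE INDEX data_product_price_idx ON data (product_price);
CREATE INDEX data_store_state_idx ON data (store_state);
-- ...and so on for the other indexed columns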

jcollum
  • @jcollum what indexes did you have on the table? – mikea Jun 06 '14 at 10:31
  • I have indexes on all the fields. I'm evaluating Cassandra for arbitrary aggregations, e.g. "what's the total sales for Pokemon cards in Arizona on Tuesday afternoon of last week?" – jcollum Jun 06 '14 at 17:47

I got the same exception even without any index on the table. If you look at the stack trace, in the BigInteger constructor public BigInteger(byte[] val) the exception is thrown when the passed-in byte array is empty. Maybe there is a bug in the Cassandra driver when deserializing decimal? However, the following comment from the Javadoc is interesting:

Translates a byte array containing the two's-complement binary representation of a BigInteger into a BigInteger. The input array is assumed to be in big-endian byte-order: the most significant byte is in the zeroth element.

Will the Cassandra driver guarantee that the passed byte array is big-endian? A workaround could be to use another type, such as double, instead.
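A sketch of that workaround (the table name and column subset here are just illustrative) would be to declare the money columns as double from the start:

CREATE TABLE data_dbl (
  id uuid PRIMARY KEY,
  product_name text,
  product_price double,
  product_profit double,
  transaction_total double
);

INSERT INTO data_dbl (id, product_name, product_price)
VALUES (2259a88e-b62d-4625-a86e-b86d77418a34, 'Casio USB Model FX-9860GII', 140.72);

Keep in mind that double is binary floating point, so it won't preserve exact decimal values the way decimal does.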

Qinjin