
Recently I installed CDH Hadoop (version 5.14.2), but the Hive Metastore Server couldn't be started.

The deployment has one HiveServer2 and one Hive Metastore Server.

Before starting them, I set datanucleus.autoCreateSchema to true so that Hive would create its tables by itself.

HiveServer2 then started fine, but the Hive Metastore Server failed. Here are the logs:

    ERROR   Datastore
    [main]: Error thrown executing CREATE TABLE `SERDE_PARAMS`
    (
        `SERDE_ID` BIGINT NOT NULL,
        `PARAM_KEY` VARCHAR(256) BINARY NOT NULL,
        `PARAM_VALUE` VARCHAR(32672) BINARY NULL,
        CONSTRAINT `SERDE_PARAMS_PK` PRIMARY KEY (`SERDE_ID`,`PARAM_KEY`)
    ) ENGINE=INNODB : Column length too big for column 'PARAM_VALUE' (max = 21845); use BLOB or TEXT instead
    com.mysql.jdbc.exceptions.jdbc4.MySQLSyntaxErrorException: Column length too big for column 'PARAM_VALUE' (max = 21845); use BLOB or TEXT instead
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
    at com.mysql.jdbc.Util.handleNewInstance(Util.java:425)
    at com.mysql.jdbc.Util.getInstance(Util.java:408)
    at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:944)
    at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:3976)
    at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:3912)
    at com.mysql.jdbc.MysqlIO.sendCommand(MysqlIO.java:2530)
    at com.mysql.jdbc.MysqlIO.sqlQueryDirect(MysqlIO.java:2683)
    ......

    ERROR   Schema  
    [main]: An exception was thrown while adding/validating class(es) : Column length too big for column 'PARAM_VALUE' (max = 21845); use BLOB or TEXT instead
    com.mysql.jdbc.exceptions.jdbc4.MySQLSyntaxErrorException: Column length too big for column 'PARAM_VALUE' (max = 21845); use BLOB or TEXT instead
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
    at com.mysql.jdbc.Util.handleNewInstance(Util.java:425)
    ......
    at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
    at org.apache.hadoop.util.RunJar.main(RunJar.java:136)

    WARN    Query   
    [main]: Query for candidates of org.apache.hadoop.hive.metastore.model.MPartitionColumnStatistics and subclasses resulted in no possible candidates
    An exception was thrown while adding/validating class(es) : Column length too big for column 'PARAM_VALUE' (max = 21845); use BLOB or TEXT instead
    com.mysql.jdbc.exceptions.jdbc4.MySQLSyntaxErrorException: Column length too big for column 'PARAM_VALUE' (max = 21845); use BLOB or TEXT instead
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    ......

    org.datanucleus.exceptions.NucleusDataStoreException: An exception was thrown while adding/validating class(es) : Column length too big for column 'PARAM_VALUE' (max = 21845); use BLOB or TEXT instead
    com.mysql.jdbc.exceptions.jdbc4.MySQLSyntaxErrorException: Column length too big for column 'PARAM_VALUE' (max = 21845); use BLOB or TEXT instead
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    ......
    Caused by: com.mysql.jdbc.exceptions.jdbc4.MySQLSyntaxErrorException: Column length too big for column 'PARAM_VALUE' (max = 21845); use BLOB or TEXT instead
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    ......
    ... 36 more
    Nested Throwables StackTrace:
    com.mysql.jdbc.exceptions.jdbc4.MySQLSyntaxErrorException: Column length too big for column 'PARAM_VALUE' (max = 21845); use BLOB or TEXT instead
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
    ......

I don't know how to solve "Column length too big for column 'PARAM_VALUE'". Should I add some setting in MySQL? (I installed MySQL 5.6.40 via yum.)

Thanks in advance.

HbnKing

1 Answer


MySQL has a maximum row size of 65,535 bytes (minus storage overhead). If a character set (e.g. UTF-8) uses up to 3 bytes per character, the maximum length of a VARCHAR column is 65535 / 3 ≈ 21,845 characters, assuming it is the only column in the table. This is explained at https://dev.mysql.com/doc/refman/5.6/en/column-count-limit.html
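A quick way to confirm the per-character byte width MySQL assumes for each character set (a sanity check only, not part of the fix):

    -- MAXLEN is the maximum number of bytes per character for each character set.
    SELECT CHARACTER_SET_NAME, MAXLEN
    FROM information_schema.CHARACTER_SETS
    WHERE CHARACTER_SET_NAME IN ('utf8', 'latin1');
    -- utf8   -> MAXLEN = 3  (65535 / 3 = 21845, the limit quoted in the error)
    -- latin1 -> MAXLEN = 1  (a single VARCHAR can use nearly the full 65535 bytes)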

Changing the character set to latin1, which uses at most one byte per character, resolves the problem, because with latin1 a single VARCHAR column can hold close to the full 65,535 bytes.
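As a minimal sketch of the two behaviours (the table and column names here are hypothetical, not part of the metastore schema):

    -- Fails with "Column length too big for column 'v' (max = 21845)":
    -- 32672 characters at up to 3 bytes each exceeds the 65535-byte limit.
    CREATE TABLE demo_utf8 (v VARCHAR(32672)) ENGINE=InnoDB DEFAULT CHARSET=utf8;

    -- Succeeds: 32672 characters at 1 byte each fits within the limit.
    CREATE TABLE demo_latin1 (v VARCHAR(32672)) ENGINE=InnoDB DEFAULT CHARSET=latin1;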


Connect to the Hive Metastore database (MySQL) and change its character set with ALTER DATABASE ... CHARACTER SET latin1.
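A concrete sequence, assuming the metastore database is named metastore (substitute your actual database name):

    -- Assumes the metastore database is named "metastore"; adjust to your setup.
    ALTER DATABASE metastore CHARACTER SET latin1;

    -- Verify the new default character set:
    SELECT DEFAULT_CHARACTER_SET_NAME
    FROM information_schema.SCHEMATA
    WHERE SCHEMA_NAME = 'metastore';

Note that ALTER DATABASE only changes the default for tables created afterwards; since the failing CREATE TABLE never completed, the metastore's schema creation should pick up latin1 the next time the service starts.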


Abhinav Agarwal
  • Thanks for your answer; you've got the core of the issue. I will check your answer when I meet this problem again. – HbnKing Oct 11 '18 at 04:52