
I have successfully established the JDBC connection and can execute statements like "use warehouse ...". When I try to run any SELECT statement, I get the following error:

net.snowflake.client.jdbc.SnowflakeSQLLoggedException: JDBC driver internal error: Fail to retrieve row count for first arrow chunk: null.

I can see in the Snowflake UI that my request was successful and returned the expected data.

The error occurs on this line: rs = statement.executeQuery("select TOP 1 EVENT_ID from snowflake.account_usage.login_history");

The statement was able to execute queries prior to this line and the result set was as expected. Any insight would be appreciated!
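
For reference, a minimal self-contained sketch of the flow described above; the account URL, warehouse name, and credentials are placeholders:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;
import java.util.Properties;

public class SnowflakeRepro {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        props.put("user", "MY_USER");         // placeholder credentials
        props.put("password", "MY_PASSWORD");
        // placeholder account URL
        String url = "jdbc:snowflake://myaccount.snowflakecomputing.com/";
        try (Connection conn = DriverManager.getConnection(url, props);
             Statement statement = conn.createStatement()) {
            statement.executeQuery("use warehouse MY_WH"); // statements like this succeed
            // this SELECT fails with the arrow-chunk error on affected setups
            ResultSet rs = statement.executeQuery(
                "select TOP 1 EVENT_ID from snowflake.account_usage.login_history");
            while (rs.next()) {
                System.out.println(rs.getLong("EVENT_ID"));
            }
        }
    }
}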

Sara Feraca

7 Answers


This could happen for several reasons:

  1. What JDK version are you using? JDK 16 introduced strong encapsulation of JDK internals (see JEP 396). If you're using JDK 16, try setting this at the JVM level on startup, as in the example after this list:
-Djdk.module.illegalAccess=permit

This is a workaround until we get a fix for the following Apache Arrow issue: ARROW-12747.

  2. If you use an application that connects to Snowflake via JDBC, the application might not interpret the results correctly. Try switching back to the JSON result format rather than Arrow and see if that fixes it (a sketch follows below). This can be done at session level by running:
ALTER SESSION SET JDBC_QUERY_RESULT_FORMAT='JSON'
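
For example, if the application is launched as a runnable jar (app.jar is a hypothetical name):

java -Djdk.module.illegalAccess=permit -jar app.jar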
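
A minimal sketch of the session-level workaround over plain JDBC, reusing a Statement from an existing connection (the query is the one from the question):

statement.executeQuery("ALTER SESSION SET JDBC_QUERY_RESULT_FORMAT='JSON'");
ResultSet rs = statement.executeQuery("select TOP 1 EVENT_ID from snowflake.account_usage.login_history");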
RoyalTS
Sergiu
  • Hi, can you explain what `-Djdk.module.illegalAccess=permit` does? I had this same problem in dbeaver in arch linux with java 16 openjdk connecting to snowflake. If you can explain more about option 1, and also what you mean by `JDKs above 11 might cause unexpected behavior` it would make this answer more complete and helpful. – scientific_explorer Jun 21 '21 at 10:22
  • @sergiu thank you for this answer. Its a bit problematic for those of us using the snowflake drivers for R, however, because we can't specify jvm level or alter session on startup, which means we have to downgrade our jvm version system-wide to get a connection. Can we hope that there will be progress soon on the issue of arrow compatibility? – Bob Jul 02 '21 at 17:22
  • @Bob The arrow issue has been opened directly towards Apache, so we're waiting for feedback from them. – Sergiu Jul 05 '21 at 08:01
  • Worked like a charm export JAVA_OPTS="-Djdk.module.illegalAccess=permit" – Shaounak Nasikkar Jan 26 '22 at 00:50

Using DBeaver to connect to Snowflake, I had the same issue. It is resolved by setting the session parameter in each editor window as follows: ALTER SESSION SET JDBC_QUERY_RESULT_FORMAT='JSON';

This solution can be automated by configuring bootstrap queries under Connection settings -> Initialization. With every new editor window, this session parameter will then be set during initialization.

Pulkit
  • Thank you so much! This solves the problem. I had to use the intel version of Dbeaver for almost 3 months and the query fetch was painfully slow. The silicon binary with this fix is much faster now – Siddarth Ranganathan Apr 22 '22 at 05:05
  • Has anyone solved this issue with Databricks and Dbeaver? I have tried this solution but 'ALTER SESSION SET JDBC_QUERY_RESULT_FORMAT='JSON';' does not work with Databricks – bda Jun 01 '22 at 19:35
  • This also helps if using a recent IntelliJ IDEA / DataGrip IDE to connect. When you configure the data source, changing the JVM properties for whatever reason does not work. Instead, in the "Options" tab for the Snowflake datasource, add the following string: `ALTER SESSION SET JDBC_QUERY_RESULT_FORMAT='JSON';`. – Pavel Jun 23 '22 at 07:25

You can add the following two settings to this file (macOS): /Applications/DBeaver.app/Contents/Eclipse/dbeaver.ini

-Djdk.module.illegalAccess=permit --add-opens=java.base/java.nio=ALL-UNNAMED

Information from: https://support.dbvis.com/support/solutions/articles/1000309803-snowflake-fail-to-retrieve-row-count-for-first-arrow-chunk-
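
In an Eclipse-style launcher file like dbeaver.ini, JVM options normally go on their own lines after the -vmargs marker; a sketch of how the tail of the file might look (existing options elided):

-vmargs
...
-Djdk.module.illegalAccess=permit
--add-opens=java.base/java.nio=ALL-UNNAMED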


Another alternative that worked for me on my Mac M1 is to use JDK 11:

brew install openjdk@11

Then edit /Applications/DBeaver.app/Contents/Eclipse/dbeaver.ini and change this line: ../Eclipse/jre/Contents/Home/bin/java to /opt/homebrew/opt/openjdk@11/bin/java

Restart DBeaver.

Nicolas

I hit the same problem, and was able to get it working by downgrading to Java 11 for driver version:

[net.snowflake/snowflake-jdbc "3.13.8"]
Alan Thompson
  • I too encountered the same problem OP did. Only I am connecting R to Snowflake. I have tried your solution (downgrading Java to `11.0.13` and Snowflake `.jar` to `3.13.8`. I am able to run `dbListTables(con) ` and am presented with a list of tables but when running a query e.g. `dbGetQuery(con, "select * from table limit 10")` I get the error below – Sweepy Dodo Nov 17 '21 at 14:59
  • `Note: method with signature ‘DBIConnection#character’ chosen for function ‘dbListFields’, target signature ‘JDBCConnection#character’. "JDBCConnection#ANY" would also be valid Error in .verify.JDBC.result(r, "Unable to retrieve JDBC result set", : Unable to retrieve JDBC result set JDBC ERROR: SQL compilation error: Object '"dwh_db.visitor.table"' does not exist or not authorized. Statement: SELECT * FROM "dwh_db.visitor.table" LIMIT 0` – Sweepy Dodo Nov 17 '21 at 15:00
  • 1
    Solved using a particular combination of version of `Java` and `Snowflake`'s .jar file [link](https://stackoverflow.com/questions/70007531/snowflake-through-r-rjdbc-unable-to-retrieve-jdbc-result-set) – Sweepy Dodo Nov 18 '21 at 15:55

Before executing the actual query, you need to set this:

statement.executeQuery("ALTER SESSION SET JDBC_QUERY_RESULT_FORMAT='JSON'");
Aqib Javed
Vignesh

The official solution from Snowflake is to configure an extra property in your data source configuration: https://community.snowflake.com/s/article/SAP-BW-Java-lang-NoClassDefFoundError-for-Apache-arrow

You can use this property (jdbc_query_result_format=json) in the data source properties of the application server, or as a session property in the application, like:

Statement statement = connection.createStatement();
statement.executeQuery("ALTER SESSION SET JDBC_QUERY_RESULT_FORMAT='JSON'");

This uses JSON as the result format instead of Arrow, which avoids the above error.
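
A sketch of the connection-property route over plain JDBC, assuming (as the article above suggests) that the driver accepts the session parameter as a data source property; the account URL and credentials are placeholders:

import java.sql.Connection;
import java.sql.DriverManager;
import java.util.Properties;

Properties props = new Properties();
props.put("user", "MY_USER");                  // placeholder credentials
props.put("password", "MY_PASSWORD");
props.put("JDBC_QUERY_RESULT_FORMAT", "json"); // the property named in the article
Connection connection = DriverManager.getConnection(
        "jdbc:snowflake://myaccount.snowflakecomputing.com/", props);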
saran3h

This happens on modern JVMs because Arrow wants to do low-level memory allocations. It logs to stdout that it wants the JVM to be started with --add-opens=java.base/java.nio=ALL-UNNAMED.

The fixes that work on JDK 17 are described in the Arrow docs as:

# Directly on the command line
java --add-opens=java.base/java.nio=ALL-UNNAMED -jar ...

# Indirectly via environment variables
env _JAVA_OPTIONS="--add-opens=java.base/java.nio=ALL-UNNAMED" java -jar ...

So if you are using IntelliJ or Docker, setting the env var _JAVA_OPTIONS to --add-opens=java.base/java.nio=ALL-UNNAMED is probably the best solution.
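
For example, in a Dockerfile (base image and jar name are placeholders):

FROM eclipse-temurin:17-jre
ENV _JAVA_OPTIONS="--add-opens=java.base/java.nio=ALL-UNNAMED"
COPY app.jar /app.jar
ENTRYPOINT ["java", "-jar", "/app.jar"]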

simbo1905