While reading a table from Redis, I'm getting the error below.

The code below normally works fine:

val readDF = spark.sparkContext.fromRedisKeyPattern(tableName, 5).getHash().toDS()

It works for tables with fewer than 2 million rows, but when I read a big table I get this error:

18/10/11 17:08:25 ERROR Executor: Exception in task 37.0 in stage 3.0 (TID 338)
redis.clients.jedis.exceptions.JedisConnectionException: java.net.SocketTimeoutException: Read timed out
    at redis.clients.util.RedisInputStream.ensureFill(RedisInputStream.java:202)
    at redis.clients.util.RedisInputStream.readByte(RedisInputStream.java:40)
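The stack trace points at a Jedis socket read timeout while the reply for a key is being streamed back. As a rough way to narrow it down, a standalone Jedis check against one of the large hashes with a longer socket timeout shows whether a single HGETALL is simply slow to return; the host, port, and key below are placeholders:

import redis.clients.jedis.Jedis

// Placeholder connection details; the third argument is the socket timeout in milliseconds.
val jedis = new Jedis("localhost", 6379, 60000)
// Fetch one of the big hashes directly to see whether HGETALL alone exceeds the default timeout.
val fields = jedis.hgetAll("myTable:someBigKey")
println(s"fields read: ${fields.size()}")
jedis.close()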

I also tried increasing the number of partitions from 5 to 100:

val redis = spark.sparkContext.fromRedisKeyPattern(tableName, 100).getHash().toDS()
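For context, this is roughly how the session is set up; the host, port, and key pattern are placeholders, and I'm assuming the spark.redis.timeout option (the redis.timeout key in older spark-redis versions) is the right place to raise the connection/read timeout:

import org.apache.spark.sql.SparkSession
import com.redislabs.provider.redis._   // provides fromRedisKeyPattern on SparkContext

val spark = SparkSession.builder()
  .appName("redis-read")
  .config("spark.redis.host", "localhost")   // placeholder host
  .config("spark.redis.port", "6379")
  // Timeout in milliseconds (2000 ms by default); older versions use "redis.timeout".
  .config("spark.redis.timeout", "30000")
  .getOrCreate()

import spark.implicits._

val tableName = "myTable:*"   // placeholder key pattern
val readDF = spark.sparkContext
  .fromRedisKeyPattern(tableName, 100)   // more partitions -> fewer keys per task
  .getHash()
  .toDS()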

I also changed some settings on Redis, but I don't think that's the issue. Do you know how I can solve this problem?

  • xref: https://github.com/RedisLabs/spark-redis/issues/98 – Itamar Haber Oct 14 '18 at 18:18
  • This could be (but may not be) due to the client's output buffer exceeding the limits - can you check the Redis server's log for a warning in the form of: "Client %s scheduled to be closed ASAP for overcoming of output buffer limits."? – Itamar Haber Oct 14 '18 at 18:19
