
For low-latency Spark jobs, Spark Job Server provides a persistent context option. But I'm not sure whether a persistent context holds the metadata, block locations, and other information required for query planning. As far as I understand, by default Spark reads this information from the Hive Metastore (disk I/O and network round trips).
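For concreteness, this is the kind of setup I mean (a minimal sketch against spark-jobserver's classic `SparkJob` API; the job name, table name, and query are made up). The same SparkContext serves every submission, but each run builds its own SQL context, so I'm unsure which planning metadata actually survives between runs:

```scala
import com.typesafe.config.Config
import org.apache.spark.SparkContext
import org.apache.spark.sql.hive.HiveContext
import spark.jobserver.{SparkJob, SparkJobValid, SparkJobValidation}

// Hypothetical job, submitted repeatedly to the same persistent context.
// The SparkContext is created once by the Job Server and reused across runs.
object LowLatencyQueryJob extends SparkJob {
  override def validate(sc: SparkContext, config: Config): SparkJobValidation =
    SparkJobValid

  override def runJob(sc: SparkContext, config: Config): Any = {
    // A fresh HiveContext per run presumably re-fetches catalog state from
    // the metastore; it's unclear to me what, if anything, persists here.
    val sqlContext = new HiveContext(sc)
    sqlContext.sql("SELECT count(*) FROM events WHERE dt = '2017-01-01'").collect()
  }
}
```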

Does Spark have any option for keeping all the information necessary for query planning in memory?
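The closest thing I've found is Spark's own catalog caching. A sketch of what I mean, assuming Spark 2.x (the two `spark.sql.hive.*` config keys are real Spark settings; the cache size, table name, and queries are made up), though I don't know whether this covers everything planning needs:

```scala
import org.apache.spark.sql.SparkSession

// A long-lived session that tries to keep planning metadata warm in memory.
val spark = SparkSession.builder()
  .appName("persistent-context")
  .enableHiveSupport()
  // Let Spark track partition metadata in its catalog instead of listing
  // files on every query (on by default since Spark 2.1).
  .config("spark.sql.hive.manageFilesourcePartitions", "true")
  // In-memory cache for partition file statuses, in bytes.
  .config("spark.sql.hive.filesourcePartitionFileCacheSize", 512L * 1024 * 1024)
  .getOrCreate()

// The first query pays the metastore / file-listing cost...
spark.sql("SELECT count(*) FROM events WHERE dt = '2017-01-01'").show()
// ...later queries in the same session can plan from the cached metadata.
spark.sql("SELECT count(*) FROM events WHERE dt = '2017-01-02'").show()
```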

VB_
