
I cached a table in Databricks (SQL Notebook) using

CACHE TABLE work_details AS SELECT (....)

The problem is that I can only access the cached table if I am in the same notebook. I want to use the table in a different notebook (same cluster), but it throws the error "table or view not found".

Is there any workaround for this?

EDIT:

Note: I cannot use views here because the cached table has a lot of rows and is further joined with different tables to create the final required table. If I use views instead of the cached table, the time taken to create the final table increases, which I do not want.

Ques: Why can't I cache the table again in the new notebook?
Ans: This is the solution I am using right now, but I need a workaround where I can use this table across multiple notebooks with the same performance, without having to cache it again and again.

Martin

1 Answer


When you create a cached table with CACHE TABLE <name> AS SELECT ..., what actually gets registered is a temporary view with that name.

• A Temp View is available only within the context of the notebook that created it and is a common way of sharing data inside that notebook (see the sketch after these bullets)

• A Global Temp View is available to all Notebooks running on that Databricks Cluster
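A minimal sketch of the scoping difference (sample_local_view and <source-table> are placeholder names, and the comments describe the expected behaviour):

%sql
-- Notebook A: a plain temp view (which is also what CACHE TABLE ... AS SELECT
-- registers under the hood) is scoped to this notebook's Spark session
CREATE TEMP VIEW sample_local_view AS SELECT * FROM <source-table>;

%sql
-- Notebook B on the same cluster: expected to fail with "table or view not found",
-- the same error described in the question
SELECT * FROM sample_local_view;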

Workaround:

Create a Global Temp View, which will be accessible from all notebooks running on that cluster.

%sql
CREATE GLOBAL TEMP VIEW <global-view-name> AS
SELECT * FROM <table-name>;

To access the global temp view from another notebook, use the query below:

%sql
select * from global_temp.<global-view-name>;
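
Since the main concern in the question is keeping the performance of the cached table, one possible pattern (a sketch only: the column, table, and join placeholders stand in for your own join query, and I have not benchmarked this) is to build the global temp view from the same query you were caching and then cache the view itself, so notebooks on the same cluster reuse the in-memory data:

%sql
-- Create the shared view from the same join query previously used in CACHE TABLE ... AS SELECT
CREATE OR REPLACE GLOBAL TEMP VIEW <global-view-name> AS
SELECT <columns>
FROM <table-1>
JOIN <table-2> ON <join-condition>;

%sql
-- Cache the view so downstream joins in any notebook on this cluster hit memory
CACHE TABLE global_temp.<global-view-name>;

Any other notebook attached to the same cluster can then join against global_temp.<global-view-name> just as it would against the cached work_details table.
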
Abhishek K
• The cached table itself is built from 4 joins across different tables and the number of rows is huge (80M+). The cached table is then joined with other tables to create the final table that I require. I cannot use a view instead of this cached table because it increases the time taken by a lot. (I need the final asset to finish all the joins in under 1 minute; it works fine with the cached table, but if I use views here, the time taken exceeds 3 minutes.) – Martin Aug 12 '22 at 11:38