
When I connect to a Databricks SQL warehouse, it uses the default catalog, which is hive_metastore. Is there a way to define a Unity Catalog catalog as the default?

I know I can run the query

USE CATALOG MAIN

And then the current session will use the Unity Catalog main catalog. But I am looking for a way to make it the default permanently, so that if the first query after login is

CREATE SCHEMA IF NOT EXISTS MY_SCHEMA

the schema will be created inside the main catalog.

Gilo

1 Answer


You can set the Spark configuration property spark.databricks.sql.initial.catalog.name

But Databricks recommends keeping the default catalog as hive_metastore, because changing it can break existing data operations that depend on the old default.

https://docs.databricks.com/data-governance/unity-catalog/hive-metastore.html
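As a sketch of what setting this looks like on a cluster (the catalog name main is taken from the question; verify the property against the linked docs for your runtime version):

```
# Cluster configuration > Advanced options > Spark > Spark config
spark.databricks.sql.initial.catalog.name main
```

After the cluster restarts, a session-scoped check such as SELECT current_catalog() should return main, and an unqualified CREATE SCHEMA IF NOT EXISTS MY_SCHEMA would then land in that catalog.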

Rock Pereira