
I followed the instructions in "Use delta tables in Apache Spark",

but when I try to save the tables into the lakehouse, I get the message below. I got a similar error message when following the "Lakehouse tutorial introduction" while trying to read the fact_sale table. Did I miss some permission settings?

Create database for fabric_lakehouse is not permitted using Apache Spark in Microsoft Fabric.

I checked all the settings but can't find anything relevant in the Fabric workspace.
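For reference, the tutorial's save step boils down to a saveAsTable call like this (a minimal sketch; df and the table name are placeholders):

# write a DataFrame into the attached lakehouse as a managed Delta table
df.write.mode("overwrite").format("delta").saveAsTable("my_table")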

henjiFire
  • I'm having the same issue. Full message: Error Details org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Create database for *** is not permitted using Apache Spark in Microsoft Fabric. ) Error Code InvalidTable – danjp May 31 '23 at 00:11
  • Yep, I put the same question to the learning doc GitHub, but it seems they can't answer it; they just said it looks like a tenant-level permission issue. No idea what to check next :( – henjiFire Jun 01 '23 at 09:20
  • I tried with a new free Power BI account and don't have the issue. It must be a tenant permission setting, but I just don't know where or what to check/adjust. – henjiFire Jun 01 '23 at 09:54

3 Answers


Try adding .coalesce(1) before the write. Without it, Spark writes the table from multiple partitions, and in this case the save fails with this error; coalescing to a single partition works around it.

# coalesce to a single partition, then save as a managed Delta table
df.coalesce(1).write.mode("overwrite").format("delta").saveAsTable("mytable")
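If the write goes through, a quick sanity check (standard PySpark, nothing Fabric-specific; "mytable" is the name used above):

# list registered tables and read the new one back
spark.sql("SHOW TABLES").show()
spark.read.table("mytable").show(5)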
user16217248

I ran into the same issue, strangely only when I had not saved the notebook under a custom name.

I tried the coalesce(1) workaround, but ran into a myriad of new issues.

Accordingly, my rather unfortunate recommendation is to create a new notebook, do not rename it, and proceed from there.

  • No luck with that. I get the error message even when just using the "Load to table" option on the file. Maybe it's a firewall-related setting, but I have no idea where to check – henjiFire Jun 07 '23 at 06:32

I had the same issue and resolved it by re-attaching the lakehouse. You might have multiple lakehouses within your workspace, and the notebook can run against any of them.

Click on the name of the lakehouse and remove all attached lakehouses. Once removed, click "Add lakehouse" and attach the lakehouse again; this should resolve the issue.
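Before and after re-attaching, it can help to confirm which database (lakehouse) the Spark session actually resolves to. This is a minimal sketch using the standard PySpark catalog API, not a Fabric-specific check:

# show the database the session is currently using and all databases it can see
print(spark.catalog.currentDatabase())
for db in spark.catalog.listDatabases():
    print(db.name)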


Chhaya Vishwakarma
  • I tried that, but no luck. I also tried creating a new, clean workspace and it ran into the same issue. It feels like a permission thing, as I didn't get the error when I signed up with a new account. However, I obviously can't do that with my work account. :( – henjiFire Jun 07 '23 at 06:26