I'm trying to write a Tableau .hyper file to a directory in Databricks.

However, it yields the following error:

```
The database "hyper.file:/dbfs/my_hyper.hyper" could not be created: I/O error while accessing file:/dbfs/my_hyper.hyper: SIGBUS
```

Why is this happening? I face no issues at all when writing other file types, but this issue persists with .hyper files.

Is this a permissions issue or a bug?

Please advise. I'd be happy to provide additional info.


1 Answer


Most probably this happens because DBFS doesn't support random writes (see the docs for the list of limitations), and writing a .hyper database involves random-access I/O, unlike the sequential writes used for most other file types. The workaround is to write to the local disk, e.g. /tmp/, and then copy or move the file to DBFS using the dbutils.fs.cp or dbutils.fs.mv commands (see the docs), as in the sketch below.
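A minimal sketch of that workaround, assuming the tableauhyperapi package is installed on the cluster and the code runs in a Databricks notebook where dbutils is available; the schema, table, and column names are purely illustrative:

```python
# Hypothetical example: build the extract on local disk, then copy it to DBFS.
from tableauhyperapi import (
    Connection, CreateMode, HyperProcess, Inserter,
    SchemaName, SqlType, TableDefinition, TableName, Telemetry,
)

local_path = "/tmp/my_hyper.hyper"  # local disk supports the random writes Hyper needs

with HyperProcess(telemetry=Telemetry.DO_NOT_SEND_USAGE_DATA_TO_TABLEAU) as hyper:
    with Connection(endpoint=hyper.endpoint,
                    database=local_path,
                    create_mode=CreateMode.CREATE_AND_REPLACE) as connection:
        # Illustrative one-column table; replace with your real schema
        table = TableDefinition(
            table_name=TableName("Extract", "Extract"),
            columns=[TableDefinition.Column("value", SqlType.text())],
        )
        connection.catalog.create_schema(SchemaName("Extract"))
        connection.catalog.create_table(table)
        with Inserter(connection, table) as inserter:
            inserter.add_row(["hello"])
            inserter.execute()

# The finished file is copied sequentially, which DBFS handles fine
dbutils.fs.cp(f"file:{local_path}", "dbfs:/my_hyper.hyper")
```

dbutils.fs.mv works the same way if you don't want to keep the local copy.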

Alex Ott
  • Thanks! I had already figured out that writing to `/tmp/` and similar local locations is supported. The copy and move commands you specified look useful! – The Singularity Oct 12 '21 at 08:17