Questions tagged [azure-notebooks]
78 questions
0
votes
1 answer
Connecting a Databricks notebook to App Insights in Python code
Our Databricks notebook is triggered via an ADF pipeline. I would like to add logging to my Python notebook and have that logging information viewable in App Insights. How do I achieve that in Python? Any pointers or code examples…

Codecrasher99
- 351
- 3
- 17
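One common approach (not necessarily the only one) is the opencensus-ext-azure package, whose AzureLogHandler forwards standard Python logging records to Application Insights. A minimal sketch, with the connection string left as a placeholder:

```python
# Minimal sketch: forward Python logging from a Databricks notebook to
# Application Insights via opencensus-ext-azure (install it on the cluster).
import logging
from opencensus.ext.azure.log_exporter import AzureLogHandler

logger = logging.getLogger("adf_notebook")
logger.setLevel(logging.INFO)
logger.addHandler(AzureLogHandler(
    connection_string="InstrumentationKey=<your-key>"))  # placeholder

# custom_dimensions appears as queryable properties on the trace in App Insights
logger.info("Notebook started", extra={"custom_dimensions": {"source": "adf_pipeline"}})
```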
0
votes
2 answers
One Spark session for all notebooks in Synapse
I can't find any way to start one Apache Spark session for all notebooks in one pipeline. Any ideas?

Dev
- 71
- 6
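One workaround, sketched below, is a single orchestrator notebook that the pipeline calls once; it runs the other notebooks with mssparkutils.notebook.run, which executes them on the calling notebook's Spark pool so they share one session. The notebook paths are hypothetical.

```python
# Sketch: orchestrator notebook that runs child notebooks in the caller's session.
from notebookutils import mssparkutils  # available on Synapse Spark pools

child_notebooks = ["/step1_clean", "/step2_transform"]  # hypothetical paths

for path in child_notebooks:
    result = mssparkutils.notebook.run(path, 600)  # path, timeout in seconds
    print(f"{path} returned: {result}")
```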
0
votes
0 answers
Is there a way to add custom metadata properties to Azure Databricks notebooks?
Basically, it's all in the question. In our workspaces, we have various notebook files and would like to be able to add custom metadata to enable efficient querying of them. For example, adding a property to a notebook to say "category" or "type",…

Arian Kulp
- 831
- 8
- 31
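There is no first-class metadata field that I know of, so the sketch below assumes a convention: a `# META {...}` comment near the top of each notebook, read back through the Workspace export API. The host, token, and marker are placeholders, and the cell layout of the exported source is an assumption.

```python
# Sketch: query a conventional "# META {...}" comment via the Workspace API.
import base64, json, requests

HOST = "https://<workspace>.azuredatabricks.net"  # placeholder
TOKEN = "<personal-access-token>"                 # placeholder
HEADERS = {"Authorization": f"Bearer {TOKEN}"}

def notebook_metadata(path):
    resp = requests.get(f"{HOST}/api/2.0/workspace/export",
                        headers=HEADERS,
                        params={"path": path, "format": "SOURCE"})
    source = base64.b64decode(resp.json()["content"]).decode("utf-8")
    for line in source.splitlines()[:5]:          # look for the marker near the top
        if line.startswith("# META"):
            return json.loads(line.split("# META", 1)[1])
    return {}

print(notebook_metadata("/Shared/my_notebook"))   # e.g. {"category": "ingest"}
```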
0
votes
1 answer
Can I run Databricks notebook cells on an if condition... If true, run all cells; if false, run only the bottom 5 cells
I want to combine my 2 different notebooks and add one condition on a received parameter.
If the parameter is True, run all cells of that notebook; if the parameter is False, only run the code added from the other notebook, which will be the few bottom…
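Cells cannot be skipped individually, so one common workaround, sketched below, is to split the "bottom cells" into their own notebook and branch in a small driver on a parameter passed from ADF. The widget name and workspace paths are hypothetical.

```python
# Sketch (Databricks): branch on a parameter passed from ADF via a widget.
run_all = dbutils.widgets.get("run_all").lower() == "true"      # hypothetical parameter name

if run_all:
    dbutils.notebook.run("/Shared/full_notebook", 3600)         # hypothetical path, timeout in seconds
else:
    dbutils.notebook.run("/Shared/bottom_cells_only", 3600)     # the last cells, split into their own notebook
```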
0
votes
1 answer
mssparkutils.notebook.exit in Try block
mssparkutils.notebook.exit isn't exiting properly when used in a try block; it raises an exception instead. Can someone help me understand why it isn't working inside a try block, and how to make it work?
It works if I don't use a try block.

subro
- 1,167
- 4
- 20
- 32
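A likely explanation: notebook.exit appears to stop execution by raising an internal exception, so a broad except clause swallows it. A sketch of a workaround (do_work is a hypothetical function):

```python
# Sketch: catch only the exceptions you expect, and call exit outside the try block.
from notebookutils import mssparkutils

result = None
try:
    result = do_work()             # hypothetical processing step
except ValueError as err:          # avoid a bare `except:` that also catches the exit signal
    result = f"failed: {err}"

mssparkutils.notebook.exit(result)  # exit after the try/except, not inside it
```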
0
votes
1 answer
Save a Spark dataframe to a dynamic path in ADLS using a Synapse notebook
I am trying to use a Synapse notebook with PySpark to read a bunch of parquet files and reprocess them into a different folder structure, "YYYY/MM/YYYY-MM-DD.parquet", based on the created_date of the row.
When I just use a literal path, everything…

Oblivi0n
- 3
- 2
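One way to avoid building paths by hand is to derive year/month columns and let partitionBy create the folders, sketched below with placeholder paths. Note that Spark names partition folders `col=value` and does not let you control individual file names, so a loop over distinct dates is the alternative if the exact layout is mandatory.

```python
# Sketch: derive date parts from created_date and let partitionBy build the folders.
from pyspark.sql import functions as F

df = spark.read.parquet("abfss://container@account.dfs.core.windows.net/input/")  # placeholder

df = (df
      .withColumn("yyyy", F.date_format("created_date", "yyyy"))
      .withColumn("mm", F.date_format("created_date", "MM")))

(df.write
   .mode("overwrite")
   .partitionBy("yyyy", "mm")
   .parquet("abfss://container@account.dfs.core.windows.net/output/"))  # placeholder
```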
0
votes
1 answer
Combine multiple notebooks to run in a single pipeline
I have 8 separate notebooks in Databricks, so I am currently running 8 different pipelines in ADF, one pipeline per notebook. Is there a way to run a single pipeline which runs all notebooks, or is there a way to combine all…
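One option, sketched below, is a single driver notebook that ADF calls once and that runs the eight notebooks in turn with dbutils.notebook.run; the workspace paths are hypothetical.

```python
# Sketch (Databricks driver notebook): run the other notebooks sequentially.
notebook_paths = [        # hypothetical workspace paths
    "/Shared/notebook_1",
    "/Shared/notebook_2",
    # ... remaining notebooks
]

for path in notebook_paths:
    result = dbutils.notebook.run(path, 3600)   # path, timeout in seconds
    print(f"{path} finished with: {result}")
```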
0
votes
1 answer
What is the usage of createGlobalTempView or createOrReplaceGlobalTempView in Synapse notebook?
We know the Spark pool in Synapse does not work like the Databricks cluster model. We make use of GlobalTempViews in Databricks, where they are attached to the cluster and other notebooks can access the GlobalTempViews that are defined. As they are active…

shanmukh SS
- 13
- 4
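For context, a global temp view is registered in the reserved global_temp database and is visible to anything sharing the same Spark session/application, so in Synapse it only helps when notebooks actually run in one session (for example via mssparkutils.notebook.run). A short sketch with a placeholder path:

```python
# Sketch: register a global temp view and read it from another notebook in the same session.
df = spark.read.parquet("abfss://container@account.dfs.core.windows.net/data/")  # placeholder
df.createOrReplaceGlobalTempView("customers")

# In a referenced notebook running in the same Spark session:
spark.sql("SELECT COUNT(*) FROM global_temp.customers").show()
```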
0
votes
0 answers
AMQP Connector Socket Connect Error: TimeoutError in Azure Synapse Notebook
I am trying to access RabbitMQ (hosted in AWS) from Azure Synapse via Python code in notebooks. Both servers (Azure and AWS) have been configured to access each other (the firewall setup is already completed), and I am able to run the same code…

TheNerd
- 1
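Assuming the pika client is in use, making the timeouts explicit, as in the sketch below, at least makes a blocked outbound connection fail fast, which helps separate a Synapse managed-VNet egress restriction from a slow broker; the host and credentials are placeholders.

```python
# Sketch (pika assumed): explicit timeouts for the AMQP connection attempt.
import pika

params = pika.ConnectionParameters(
    host="rabbitmq.example.com",                             # placeholder AWS host
    port=5672,
    credentials=pika.PlainCredentials("user", "password"),   # placeholders
    socket_timeout=10,
    blocked_connection_timeout=30,
)
connection = pika.BlockingConnection(params)
channel = connection.channel()
```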
0
votes
1 answer
How to rollback uncommitted changes to a Synapse Notebook?
I have made changes directly to the backing GitHub repo's JSON notebook structure that mirror what has been done in the online notebook. To verify the repo changes, I would like to revert the online changes that have not yet been committed:
How can…

WestCoastProjects
- 58,982
- 91
- 316
- 560
0
votes
0 answers
Convert Azure Synapse notebooks [in JSON] to Python or Jupyter
Local / IDE-based development allows use of the most powerful IDEs [PyCharm in particular]. How can the notebooks presently in JSON format in our git repo be converted to Jupyter/ipynb and/or Python in the local environment? We have many…

WestCoastProjects
- 58,982
- 91
- 316
- 560
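A small conversion script is one option. The sketch below assumes the Synapse notebook JSON keeps its cells under properties/cells with cell_type and source fields (verify against your repo) and writes a standard .ipynb with nbformat; file names are placeholders.

```python
# Sketch: convert one Synapse notebook JSON file to .ipynb with nbformat.
import json
import nbformat
from nbformat.v4 import new_notebook, new_code_cell, new_markdown_cell

def synapse_to_ipynb(src_path, dst_path):
    with open(src_path) as f:
        doc = json.load(f)
    cells = doc.get("properties", {}).get("cells", [])  # structure is an assumption
    nb = new_notebook()
    for cell in cells:
        source = "".join(cell.get("source", []))
        if cell.get("cell_type") == "markdown":
            nb.cells.append(new_markdown_cell(source))
        else:
            nb.cells.append(new_code_cell(source))
    with open(dst_path, "w") as f:
        nbformat.write(nb, f)

synapse_to_ipynb("my_notebook.json", "my_notebook.ipynb")  # placeholder file names
```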
0
votes
2 answers
PySpark (Azure Notebooks) - (Over)writing a CSV file in a Gen2 storage container when the dataframe has a CSV file as its source
I have an issue with loading a CSV into a dataframe, adding some rows, and overwriting the source CSV.
To make it simple, I tried to load the CSV and then overwrite it; this also does not work. After that, I thought to make it even simpler and load…

chr1s84
- 75
- 1
- 9
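The usual cause is Spark's lazy evaluation: overwriting the path the dataframe was read from truncates the source while it is still being read. Below is a sketch of the common workaround (materialise first, then overwrite); the path and new_rows_df are placeholders, and writing to a temporary path and then copying is the safer variant.

```python
# Sketch: force the read to finish (cache + count) before overwriting the source CSV.
path = "abfss://container@account.dfs.core.windows.net/data/file.csv"  # placeholder

df = spark.read.option("header", "true").csv(path)
df = df.union(new_rows_df)   # new_rows_df: hypothetical dataframe with the added rows

df.cache()
df.count()                   # materialise before touching the source

df.write.mode("overwrite").option("header", "true").csv(path)
```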
0
votes
1 answer
Increase number of rows for Synapse Analytics Notebooks error output
I'm trying to connect to a database from a PySpark Synapse Analytics notebook using a JDBC driver.
I'm getting a Py4JJavaError when running my code, but I can't see the full output of the error.
The remaining rows of the error are hidden behind "...…

datatalian
- 83
- 1
- 5
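One workaround is to catch the error and print the full message yourself rather than relying on the truncated cell output; the JDBC options below are placeholders.

```python
# Sketch: surface the full Py4JJavaError text instead of the truncated cell output.
from py4j.protocol import Py4JJavaError

try:
    df = (spark.read.format("jdbc")
          .option("url", "jdbc:sqlserver://<server>.database.windows.net:1433;database=<db>")  # placeholder
          .option("dbtable", "dbo.my_table")                                                   # placeholder
          .option("user", "<user>").option("password", "<password>")                           # placeholders
          .load())
except Py4JJavaError as e:
    print(str(e))                  # includes the Java-side stack trace
    raise
```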
0
votes
1 answer
Databricks: incorrect data while writing to a Delta location
I am facing the below issue while writing data to a Delta location: I am getting incorrect data. I am using a Python notebook in Azure Databricks.
Dataset used: /databricks-datasets/flights/
Below are the steps I performed.
Mount to blob…

Baxy
- 139
- 1
- 13
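Without the full steps it is hard to say, but "incorrect data" after a CSV-to-Delta load is often a parsing issue. The sketch below reads one of the files in that dataset with explicit header and schema inference options and writes Delta; the mount path is a placeholder.

```python
# Sketch: read the flights CSV with explicit options, then write to Delta.
df = (spark.read
      .option("header", "true")
      .option("inferSchema", "true")
      .csv("/databricks-datasets/flights/departuredelays.csv"))

(df.write
   .format("delta")
   .mode("overwrite")
   .save("/mnt/delta/flights"))  # placeholder mount path
```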
0
votes
0 answers
How to distribute the contents of a Spark dataframe over different Azure storage containers
I want to build a multi-tenant delta lake based on a database which contains the data of all tenants. I am using Azure Synapse Pipelines and Spark notebooks.
In the database there is one table which contains all tenants. Besides that, we have several…
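Assuming one storage container per tenant, a straightforward sketch is to loop over the tenant table and write each tenant's filtered slice to its own abfss container; the table, column, and account names below are hypothetical.

```python
# Sketch: write each tenant's slice of the data to its own storage container.
from pyspark.sql import functions as F

tenants = [r["tenant_id"] for r in
           spark.read.table("tenants").select("tenant_id").collect()]  # hypothetical table/column

source_df = spark.read.table("all_tenant_data")                        # hypothetical source table

for tenant_id in tenants:
    target = f"abfss://{tenant_id}@mystorageaccount.dfs.core.windows.net/delta/data"  # placeholder account
    (source_df
        .filter(F.col("tenant_id") == tenant_id)
        .write.format("delta").mode("overwrite")
        .save(target))
```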