Questions tagged [azure-synapse]

Questions about Azure Synapse Analytics, bringing together enterprise data warehousing and big data analytics.

Azure Synapse is a limitless analytics service that brings together enterprise data warehousing and Big Data analytics. It gives you the freedom to query data on your terms, using either serverless or provisioned resources—at scale. Azure Synapse brings these two worlds together with a unified experience to ingest, prepare, manage, and serve data for immediate BI and machine learning needs.

2525 questions
0 votes • 1 answer

Synapse Spark Pool Error HTTP Request failed to Authenticate

It was running fine and suddenly I am getting the error below: Error code 1 LIVY_JOB_STATE_DEAD Message [plugins.synapse-ent-tekura.sparkpoolsmall.180 WorkspaceType: CCID:] [Monitoring] Livy…
0 votes • 0 answers

How to optimally use my Spark resources in Azure Synapse when calling a notebook reference in a loop?

I'm running a data transformation in Synapse and would like to speed it up. My pool is configured as "4 vCores, 28GB of memory with dynamic executors from 1..7". My data in ADL Gen2 consists of roughly 300 directories. Every directory holds between…
Krumelur • 32,180 • 27 • 124 • 263
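For the question above about calling a notebook reference in a loop, a minimal sketch (in PySpark, under assumed names) of running the referenced notebook for several directories concurrently with mssparkutils.notebook.run inside a thread pool, so the 1..7 dynamic executors stay busy; the notebook name, parameter name, and directory listing are placeholders, not taken from the question.

    from concurrent.futures import ThreadPoolExecutor
    from notebookutils import mssparkutils  # utility bundled with Synapse Spark pools

    # Hypothetical inputs: the root of the ~300 ADLS Gen2 directories and the child notebook name.
    base_path = "abfss://container@account.dfs.core.windows.net/data"
    directories = [f.path for f in mssparkutils.fs.ls(base_path) if f.isDir]

    def run_for_directory(path):
        # Each call runs the referenced notebook as a child job on the same Spark session.
        # "TransformOneDirectory" and the "input_path" parameter are placeholders.
        return mssparkutils.notebook.run("TransformOneDirectory", 600, {"input_path": path})

    # A small pool of concurrent runs keeps the dynamic executors busy
    # without submitting all 300 directories at once.
    with ThreadPoolExecutor(max_workers=4) as pool:
        results = list(pool.map(run_for_directory, directories))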
0 votes • 1 answer

Synapse Spark exception handling - Can't write to log file

I have written PySpark code that hits a REST API, extracts the contents in XML format, and later writes them to Parquet in a data lake container. I am trying to add logging functionality where I not only write out errors but also updates of the actions/process we…
Morpheus273 • 37 • 1 • 6
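For the exception-handling question above, a minimal sketch, with assumed paths and logger names, of buffering log records with Python's logging module and persisting the whole log to the data lake once with mssparkutils.fs.put, instead of appending to a file on ADLS from inside the job.

    import io
    import logging
    from datetime import datetime, timezone
    from notebookutils import mssparkutils

    # Buffer log records in memory on the driver; ADLS Gen2 is awkward for
    # line-by-line appends, so the whole log is written once at the end.
    buffer = io.StringIO()
    handler = logging.StreamHandler(buffer)
    handler.setFormatter(logging.Formatter("%(asctime)s %(levelname)s %(message)s"))
    logger = logging.getLogger("rest_to_parquet")
    logger.setLevel(logging.INFO)
    logger.addHandler(handler)

    try:
        logger.info("Calling REST API and parsing the XML payload")
        # ... call the API, parse the XML, write the Parquet output ...
        logger.info("Wrote Parquet output to the data lake container")
    except Exception:
        logger.exception("Run failed")
        raise
    finally:
        # Hypothetical log location; one file per run keeps each write atomic.
        run_id = datetime.now(timezone.utc).strftime("%Y%m%dT%H%M%SZ")
        log_path = f"abfss://logs@account.dfs.core.windows.net/rest_to_parquet/{run_id}.log"
        mssparkutils.fs.put(log_path, buffer.getvalue(), True)  # overwrite=True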
0 votes • 1 answer

How to retrieve snapshot data using SuccessFactors Compound Employee API?

I have to retrieve employees' historical data from SuccessFactors, and I've tried to do it using the SFSF Compound Employee API, but it didn't work well. I don't have experience with API calls, so I didn't understand what the SFSF documentation…
0 votes • 2 answers

How to export ADF pipelines, dataflows, and datasets into Azure Synapse

I have 100 dataflows, 50 pipelines, and their related datasets, variables, etc. Now I want to use the Synapse service and want all my ADF pipelines and related resources in Synapse. My ADF is Git-configured. Can we export them in one go?
AzSurya Teja • 121 • 5
0 votes • 1 answer

Azure Synapse Notebook Read Variable

I have a very simple/toy pipeline where I have a PySpark notebook that has an exit value, a Set Variable activity set to the exit value, and a second notebook that is parameterized with the variable. It looks like the below. I successfully set the…
Jeff Tilton • 1,256 • 1 • 14 • 28
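For the read-variable question above, a minimal sketch of the usual exit-value pattern with hypothetical notebook, variable, and parameter names: the first notebook exits with a value, the pipeline's Set Variable activity captures it with an expression along the lines of @activity('Notebook1').output.status.Output.result.exitValue, and the second notebook declares a matching parameters cell.

    # --- Notebook1: return a value to the pipeline ---
    from notebookutils import mssparkutils

    computed_value = "42"                       # whatever Notebook1 produced
    mssparkutils.notebook.exit(computed_value)  # appears as exitValue in the activity output

    # --- In the pipeline (expression, not Python): the Set Variable activity uses e.g. ---
    #   @activity('Notebook1').output.status.Output.result.exitValue
    # and the second Notebook activity passes that variable as a base parameter.

    # --- Notebook2: a cell toggled as the "parameters" cell ---
    my_param = "default"                        # overridden at run time by the base parameter
    print(f"Received: {my_param}")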
0 votes • 1 answer

How to include a new incremented column using DENSE_RANK() in Synapse

In my Synapse, I have a schedule table that stores information about all programs for a given day. It would be helpful if I could include rank/dense_rank in the output from this table, based on the title and ordered by the event number and program…
Vivek KB • 49 • 6
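For the DENSE_RANK() question above, a minimal sketch of the window-function pattern, run here from a Spark notebook via spark.sql against a hypothetical schedule table; the PARTITION BY/ORDER BY columns are one reading of the question, and the same OVER clause works unchanged in a Synapse SQL pool query.

    from pyspark.sql import SparkSession

    spark = SparkSession.builder.getOrCreate()  # already defined in a Synapse notebook

    # Assumed table and column names; adjust the PARTITION BY / ORDER BY to the actual schema.
    ranked = spark.sql("""
        SELECT
            title,
            event_number,
            program_number,
            DENSE_RANK() OVER (
                PARTITION BY title
                ORDER BY event_number, program_number
            ) AS schedule_rank
        FROM schedule
    """)
    ranked.show()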
0 votes • 1 answer

Synapse Notebook throws timeout error while connecting to AWS RDS SQL Server

I am working in the Synapse Workspace and trying to connect to AWS RDS from the Synapse Notebook. Whenever I try to connect, it throws the below timeout error - The TCP/IP connection to the host my-host, port 1433 has failed. Error: "connect timed…
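For the timeout question above, the failure is usually networking (the RDS security group, or the workspace's managed virtual network, blocking outbound port 1433) rather than code, but for reference here is a minimal sketch of the JDBC read itself with placeholder host, database, and credentials.

    from pyspark.sql import SparkSession

    spark = SparkSession.builder.getOrCreate()  # predefined in a Synapse notebook

    # Placeholder connection details; the SQL Server JDBC driver ships with the Spark pool.
    jdbc_url = (
        "jdbc:sqlserver://my-host.rds.amazonaws.com:1433;"
        "databaseName=mydb;encrypt=true;trustServerCertificate=true;loginTimeout=30"
    )

    df = (
        spark.read.format("jdbc")
        .option("url", jdbc_url)
        .option("dbtable", "dbo.my_table")
        .option("user", "my_user")
        .option("password", "my_password")  # better: fetch from Key Vault
        .option("driver", "com.microsoft.sqlserver.jdbc.SQLServerDriver")
        .load()
    )
    df.show(5)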
0 votes • 1 answer

Why does the DevOps service principal not have the Synapse Administrator role when the Synapse workspace is created using Bicep code?

I have created an Azure Synapse workspace using Bicep code: resource synapse 'Microsoft.Synapse/workspaces@2021-06-01' = { name: synapseName location: location tags: tags identity: { type: 'SystemAssigned' } properties: { …
Vivek Jain • 71 • 1 • 11
0 votes • 1 answer

Azure Synapse Copy Data from BigQuery, Source ERROR [HY000] [Microsoft][BigQuery] (131) Unable to authenticate with Google BigQuery Storage API

I am getting this error on the Source tab, with Use query (Table, Query) set to Query, when running a Copy Data activity in the Azure Synapse pipeline: Unable to authenticate with Google BigQuery Storage API: . The strange thing is I can preview data at…
0 votes • 1 answer

Getting an error while copying data from one folder to another in Azure Data Factory

This query used to work in an Azure Data Factory pipeline but stopped working a few days ago. Nothing changed with the file names/formats etc. in Azure Blob Storage. I am getting an error on this line: SELECT * FROM OPENROWSET ( BULK…
Tahmeed • 48 • 1 • 9
0 votes • 1 answer

Trouble with GET request to Copy Data from REST API to Data Lake

I will provide some context: my pipeline makes a GET Request to a REST API (Auth type: OAuth2 Client Credential) in order to import data to the Data Lake (ADLSGen2) in parquet file format. Later, a Stored Procedure creates a View which includes…
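For the GET-request question above, the pipeline handles this through the REST linked service's OAuth2 Client Credential settings, but as a point of comparison here is a minimal Python sketch of the same flow (token endpoint, client ID/secret, and API URL are all placeholders) that can help confirm the credentials and response shape from a notebook.

    import requests

    # Placeholder endpoints and credentials; in practice read the secret from Key Vault.
    token_url = "https://login.example.com/oauth2/token"
    api_url = "https://api.example.com/v1/records"

    token_resp = requests.post(
        token_url,
        data={
            "grant_type": "client_credentials",
            "client_id": "my-client-id",
            "client_secret": "my-client-secret",
            "scope": "api://example/.default",
        },
        timeout=30,
    )
    token_resp.raise_for_status()
    access_token = token_resp.json()["access_token"]

    api_resp = requests.get(
        api_url,
        headers={"Authorization": f"Bearer {access_token}"},
        timeout=60,
    )
    api_resp.raise_for_status()
    print(api_resp.status_code, len(api_resp.content), "bytes received")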
0 votes • 1 answer

Azure Synapse Notebook folder structure not saved in GitHub

My Synapse workspace is configured with GitHub. The code is organized in folders under "NoteBook". Example: under Notebook, the Dev1 folder contains notebook1 and notebook2, and the Dev2 folder contains notebook3 and notebook4. When Synapse publishes, the…
G. Young • 41 • 1 • 7
0 votes • 1 answer

How to select characters between wildcards in Azure Data Factory expression for Switch Block

I have 2 pipelines that are currently selected based on an If condition, which works well. I now need to add a third pipeline, so I am using a Switch block instead. And instead of tablename as the deciding factor, ideally I would use logic something…
0 votes • 1 answer

Synapse serverless pool to query delta table previous versions

Can we use the Synapse serverless pool (Built-in) to query a Delta file's previous version? I am keen on a SQL statement similar to what we do in Databricks: select * from delta.`/my_dir` version as of 2. Does OPENROWSET support a "version…
QPeiran • 1,108 • 1 • 8 • 18
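For the time-travel question above, whether serverless OPENROWSET accepts a version clause is exactly what is being asked; as a point of reference, a Synapse Spark pool can read a previous Delta version with the versionAsOf reader option, as in this minimal sketch with a placeholder path.

    from pyspark.sql import SparkSession

    spark = SparkSession.builder.getOrCreate()  # predefined in a Synapse notebook

    # Placeholder ADLS Gen2 path to the Delta table directory.
    delta_path = "abfss://container@account.dfs.core.windows.net/my_dir"

    # Equivalent of the Databricks SQL shown in the question:
    #   SELECT * FROM delta.`/my_dir` VERSION AS OF 2
    df_v2 = spark.read.format("delta").option("versionAsOf", 2).load(delta_path)
    df_v2.show(5)

    # Time travel by timestamp is also supported:
    #   .option("timestampAsOf", "2023-01-01T00:00:00Z")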