Questions tagged [azure-synapse]

Questions about Azure Synapse Analytics, bringing together enterprise data warehousing and big data analytics.

Azure Synapse is a limitless analytics service that brings together enterprise data warehousing and Big Data analytics. It gives you the freedom to query data on your terms, using either serverless or provisioned resources—at scale. Azure Synapse brings these two worlds together with a unified experience to ingest, prepare, manage, and serve data for immediate BI and machine learning needs.

2525 questions
2
votes
0 answers

How to trigger a PowerBI Datamart refresh from Azure Synapse Pipelines?

Currently we are able to trigger an automated refresh of Power BI datasets with an activity in Synapse Pipelines leveraging the Power BI API. Is it possible by any chance to do the same with a Power BI Datamart? Couldn't find any reference for this…
datatalian
  • 83
  • 1
  • 5
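A minimal Python sketch of the dataset-refresh call the question says already works (the documented Power BI REST endpoint for datasets). The workspace/dataset IDs and token acquisition are placeholders, and whether an equivalent endpoint exists for Datamarts is exactly what the question is asking.

```python
# Hypothetical helper around the documented dataset-refresh endpoint; a Synapse
# Web activity would issue the same POST with a service-principal token.
import requests

def trigger_dataset_refresh(token: str, workspace_id: str, dataset_id: str) -> None:
    url = (
        "https://api.powerbi.com/v1.0/myorg/groups/"
        f"{workspace_id}/datasets/{dataset_id}/refreshes"
    )
    resp = requests.post(url, headers={"Authorization": f"Bearer {token}"})
    resp.raise_for_status()  # 202 Accepted means the refresh request was queued
```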
2
votes
0 answers

Is Azure Data Factory and Azure Synapse billing for pipelines the same?

If we want to calculate the estimated cost for Azure Synapse, is it similar to how the calculation happens in ADF? (I understand the prices may differ.) Scenario: a pipeline runs for 10 mins every hour (8 hours) and the pipeline has a Data Flow activity…
SOUser
  • 31
  • 3
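As a rough illustration of how such an estimate is usually put together (both services bill Data Flow activities by vCore-hours of cluster time), here is a back-of-the-envelope sketch for the stated scenario. The cluster size and per-vCore-hour rate are placeholders, not official prices.

```python
# Scenario from the question: a Data Flow activity runs 10 minutes every hour, 8 times a day.
runs_per_day = 8
minutes_per_run = 10
cores_per_run = 8                      # assumed General Purpose cluster size
vcore_hours = runs_per_day * cores_per_run * minutes_per_run / 60
price_per_vcore_hour = 0.274           # placeholder rate; check the Azure pricing page
print(f"{vcore_hours:.2f} vCore-hours/day -> ~${vcore_hours * price_per_vcore_hour:.2f}/day")
```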
2
votes
1 answer

Save JSON to ADLS Gen2 using the Azure Synapse Copy activity with content type application/json

I'm using a REST API source in an Azure Synapse Copy activity and trying to save the API response as JSON with content type application/json in an Azure Data Lake Gen2 storage container. After saving the JSON documents I see the content type set to…
pbj
  • 663
  • 1
  • 8
  • 19
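One hedged workaround, assuming the azure-storage-file-datalake SDK: set the Content-Type explicitly when (re)writing the file, since the Copy activity in the question does not appear to expose it. Account, container and path names are placeholders.

```python
from azure.identity import DefaultAzureCredential
from azure.storage.filedatalake import ContentSettings, DataLakeServiceClient

service = DataLakeServiceClient(
    account_url="https://<storage-account>.dfs.core.windows.net",
    credential=DefaultAzureCredential(),
)
file_client = service.get_file_system_client("raw").get_file_client("api/response.json")
file_client.upload_data(
    b'{"example": true}',
    overwrite=True,
    content_settings=ContentSettings(content_type="application/json"),  # desired header
)
```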
2
votes
0 answers

Deployment changing the integration runtime of Synapse and Data Factory artifacts to AutoResolveIntegrationRuntime

I'm running into a scenario where, when I run my deployment to different environments, the linked services resolve to AutoResolveIntegrationRuntime even though I have a SelfHostedIntegrationRuntime specified on them. How would I make them stick to…
2
votes
0 answers

Azure Synapse: Connect to Apache spark pool using a local Python script

I am working with Azure Synapse. Here, I can create a notebook, select one of the Apache Spark pools and execute the following code: %%pyspark df = spark.sql("SELECT * FROM DataBaseName") df.show(10). I have the use case where I need to be able…
The Dude
  • 3,795
  • 5
  • 29
  • 47
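A sketch of one possible route, assuming the Synapse Livy-style batch endpoint is reachable from the local machine; workspace, pool, storage paths and the API version are placeholders and may differ in your environment.

```python
import requests
from azure.identity import DefaultAzureCredential

# Token for the Synapse development endpoint.
token = DefaultAzureCredential().get_token("https://dev.azuresynapse.net/.default").token

url = (
    "https://<workspace>.dev.azuresynapse.net/livyApi/versions/"
    "2019-11-01-preview/sparkPools/<spark-pool>/batches"
)
payload = {
    "name": "local-submit-demo",
    "file": "abfss://<container>@<storage-account>.dfs.core.windows.net/jobs/job.py",
    "driverCores": 4,
    "driverMemory": "28g",
    "executorCores": 4,
    "executorMemory": "28g",
    "numExecutors": 2,
}
resp = requests.post(url, json=payload, headers={"Authorization": f"Bearer {token}"})
resp.raise_for_status()
print(resp.json().get("id"))  # batch id to poll for completion
```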
2
votes
1 answer

Continuous integration and delivery for an Azure Synapse Analytics workspace using DevOps

I am using Azure DevOps for continuous integration and delivery for an Azure Synapse Analytics workspace, where I deploy the workspace to Development, Test, PreProduction and Production environments. The pipelines use parametrized…
2
votes
3 answers

Data masking in Synapse serverless SQL pool

How can I implement data masking in a Synapse serverless SQL pool, as currently it is only implemented in a Synapse dedicated SQL pool? I am expecting to achieve masking in a serverless SQL pool.
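A hedged workaround sketch rather than true Dynamic Data Masking: expose a view that masks the sensitive columns and grant users access to the view only. The server, database and object names are placeholders; pyodbc is used here purely to run the DDL.

```python
import pyodbc

conn = pyodbc.connect(
    "Driver={ODBC Driver 17 for SQL Server};"
    "Server=<workspace>-ondemand.sql.azuresynapse.net;"
    "Database=<serverless-db>;"
    "Authentication=ActiveDirectoryInteractive;"
)
conn.execute("""
CREATE OR ALTER VIEW dbo.customers_masked AS
SELECT
    customer_id,
    LEFT(email, 2) + '****' + RIGHT(email, 4) AS email,   -- crude string mask
    '***-**-' + RIGHT(ssn, 4)                 AS ssn
FROM dbo.customers;                                       -- placeholder external table
""")
conn.commit()
```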
2
votes
1 answer

Synapse Lake database view not available in SQL Pool?

Currently exploring using Spark notebooks in Synapse for data transformation instead of data flows, but the lake db capabilities are a little confusing. I created a lake db, an external table (catalog?) and a view using a notebook in Synapse…
Fiffe
  • 209
  • 1
  • 2
  • 13
2
votes
4 answers

Synapse Analytics RenameDataFactoryResourceError when trying to publish a renamed pipeline

I'm trying to publish a renamed pipeline and I'm getting this error and don't know how to deal with it. Error code: OK Inner error code: RenameDataFactoryResourceError Message: {"code":"InternalError","message":"Internal error has…
2
votes
1 answer

Unsupported encoding: DELTA_BYTE_ARRAY when reading from Kusto using Kusto Spark connector or using Kusto export with Spark version < 3.3.0

Since last week we started getting java.lang.UnsupportedOperationException: Unsupported encoding: DELTA_BYTE_ARRAY while reading from Kusto using the Kusto Spark connector in 'Distributed' mode (the same thing happens when trying to use the export command…
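A commonly cited mitigation on Spark versions below 3.3 is to fall back to the non-vectorized Parquet reader, which can decode DELTA_BYTE_ARRAY at the cost of some read performance; the path below is a placeholder and spark is the notebook-provided session.

```python
# The vectorized Parquet reader in Spark < 3.3 does not support DELTA_BYTE_ARRAY.
spark.conf.set("spark.sql.parquet.enableVectorizedReader", "false")

df = spark.read.parquet("abfss://<container>@<account>.dfs.core.windows.net/kusto-export/")
df.show(10)
```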
2
votes
1 answer

Azure Data Lake storage account

When trying to create a new Data Lake Gen2 account, I am getting an error: "There was an error trying to validate storage account name. Please try again". I tried multiple names but it didn't work.
Jo5689
  • 27
  • 4
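The validation message is often just the naming rules: 3-24 characters, lowercase letters and digits only, and globally unique across Azure. A quick local check of the first two rules (uniqueness can only be confirmed by Azure):

```python
import re

def is_valid_storage_account_name(name: str) -> bool:
    # 3-24 characters, lowercase letters and digits only.
    return re.fullmatch(r"[a-z0-9]{3,24}", name) is not None

print(is_valid_storage_account_name("mydatalake01"))   # True
print(is_valid_storage_account_name("My-DataLake"))    # False: uppercase and hyphen
```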
2
votes
1 answer

How to convert the Delta file format to a single Parquet file only

Delta Lake is the default storage format. I understand how to convert a Parquet file to Delta. My question is: is there any way to revert it back to Parquet? Any options? What I need is a single Parquet file while writing. I do not need the extra log…
Nat
  • 47
  • 1
  • 8
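A minimal notebook sketch, assuming the notebook-provided spark session: read the Delta table and re-write it as plain Parquet, with coalesce(1) forcing a single part file. The paths are placeholders, and the _delta_log folder is simply not carried over because the output is plain Parquet.

```python
df = spark.read.format("delta").load(
    "abfss://<container>@<account>.dfs.core.windows.net/delta/my_table"
)
(
    df.coalesce(1)                       # single output part file
      .write.mode("overwrite")
      .parquet("abfss://<container>@<account>.dfs.core.windows.net/parquet/my_table")
)
```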
2
votes
2 answers

What is the equivalent of the SQL function REGEXP_EXTRACT in Azure Synapse?

I want to convert my code that I was running in Netezza (SQL) to Azure Synapse (T-SQL). I was using the built-in Netezza SQL function REGEXP_EXTRACT, but this function is not built into Azure Synapse. Here's the code I'm trying to convert -- Assume…
John E.
  • 137
  • 2
  • 10
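If a Spark pool is an option, one alternative to rewriting the expression in T-SQL is PySpark's built-in regexp_extract; the column name and pattern below are illustrative only, and spark is the notebook-provided session.

```python
from pyspark.sql.functions import regexp_extract

df = spark.createDataFrame([("order-12345",)], ["raw"])
df = df.withColumn("order_id", regexp_extract("raw", r"order-(\d+)", 1))
df.show()   # order_id = 12345
```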
2
votes
1 answer

Azure Synapse - Replace apostrophe in raw data

I have some raw JSON data which I need to insert into my SQL database. I am able to read the data using a Lookup activity but am facing difficulties with the SQL insert, as some of the fields in the JSON contain apostrophes ('). In Python, I am able…
Waleed Alfaris
  • 136
  • 1
  • 9
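A small sketch of the usual fix: T-SQL string literals escape an apostrophe by doubling it, so the value can be pre-processed before being concatenated into the INSERT statement (a parameterized insert through the ODBC driver avoids the escaping altogether); the sample value is illustrative.

```python
value = "O'Brien's data"
escaped = value.replace("'", "''")   # what a T-SQL string literal expects
print(escaped)                       # O''Brien''s data
```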
2
votes
1 answer

Drop table in lake database

As shown in the screenshot below, why can I not drop this table in my lake database? I have created this table directly from the sink in my data flow. Thanks.
Dev
  • 71
  • 6
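A minimal sketch, assuming the table can be dropped from a Synapse notebook with Spark SQL and that spark is the notebook-provided session; database and table names are placeholders, and for external tables the underlying files in the lake may remain and need to be removed separately.

```python
spark.sql("DROP TABLE IF EXISTS my_lake_db.my_table")
```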