Questions tagged [sql-data-warehouse]

108 questions
2
votes
1 answer

Panic during SQL Data Warehouse bulk copy

I'm writing data to Azure SQL Data Warehouse using the go-mssql driver. I'm getting a panic thrown at random (at least I haven't been able to reliably replicate the issue) when using the bulk-copy functionality to write some data. The error…
Thihara
  • 7,031
  • 2
  • 29
  • 56
2
votes
2 answers

Aggregate strings in Azure SQL Data Warehouse

Is there a way to aggregate strings in Azure SQL Data Warehouse similar to the string_agg function in SQL Server? I have records with strings that I want to concatenate into a single string. SELECT string_agg(string_col, ',') FROM…
Erik Shilts
  • 4,389
  • 2
  • 26
  • 51
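For reference, the `STRING_AGG` syntax the question above is after looks like the sketch below. Support in Azure SQL Data Warehouse depends on the service version (the function first shipped in SQL Server 2017); the table and column names here are assumptions.

```sql
-- Hypothetical table/columns; STRING_AGG requires a service version
-- that supports it (SQL Server 2017+, later added to SQL DW/Synapse).
SELECT category_id,
       STRING_AGG(string_col, ',') WITHIN GROUP (ORDER BY string_col) AS joined
FROM dbo.MyStrings
GROUP BY category_id;
```

The `WITHIN GROUP` clause is optional; without it the concatenation order is unspecified.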
2
votes
2 answers

Real-time streaming of data into Azure Data Warehouse from SQL Server

I'm trying to build a real-time reporting service on top of Microsoft Azure Data Warehouse. Currently I have a SQL server with about 5 TB of data. I want to stream the data to the data warehouse and use Azure DW's computation power to generate…
2
votes
1 answer

Configure Azure App Service without public URL

I am trying to deploy an Azure App Service from Visual Studio 15.2. Specifically I am trying to deploy this following service: https://github.com/Microsoft/Azure-SQL-DB-auditing-OMS-integration to ingest audit logs from SQL Data Warehouse to OMS.…
2
votes
1 answer

Why is Polybase slow for large compressed files that span 1 billion records?

What would cause Polybase performance to degrade when querying larger datasets in order to insert records into Azure Data Warehouse from Blob storage? For example, a few thousand compressed (.gz) CSV files with headers partitioned by a few hours per…
1
vote
1 answer

Updating of Fact tables

I have flat-file resources that were extracted into facts and dimensions. Some dimensions also come from database resources. The transformation process runs on an as-needed basis (when there are new/updated flat files). The problem is this: some data…
1
vote
1 answer

How slow is Slowly Changing Dimensions?

How often do dimensions change in a slowly changing scenario? I'm looking at SQL Server temporal tables for Slowly Changing Dimensions (Type 2). Some of the dimension tables update quite frequently (daily!). In that sense, they are not truly 'slowly…
RaviLobo
  • 447
  • 2
  • 10
  • 29
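A system-versioned (temporal) table of the kind the question above considers can be declared as in this minimal sketch; the dimension's name and columns are assumptions, but the `PERIOD FOR SYSTEM_TIME` / `SYSTEM_VERSIONING` syntax is the standard SQL Server form.

```sql
-- Minimal sketch of a system-versioned dimension table (hypothetical
-- names). SQL Server maintains the history table automatically on every
-- UPDATE/DELETE, which is what makes it attractive for Type 2 SCDs.
CREATE TABLE dbo.DimCustomer
(
    CustomerID int           NOT NULL PRIMARY KEY CLUSTERED,
    Name       nvarchar(100) NOT NULL,
    ValidFrom  datetime2 GENERATED ALWAYS AS ROW START NOT NULL,
    ValidTo    datetime2 GENERATED ALWAYS AS ROW END   NOT NULL,
    PERIOD FOR SYSTEM_TIME (ValidFrom, ValidTo)
)
WITH (SYSTEM_VERSIONING = ON (HISTORY_TABLE = dbo.DimCustomer_History));
```

Point-in-time queries then use `FOR SYSTEM_TIME AS OF` rather than hand-rolled effective-date joins.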
1
vote
2 answers

Azure SQL Data Warehouse: max size of varchar type

I am new to Azure and I have this field in my table in Azure SQL Data Warehouse: [AnnotationText] varchar(MAX) NULL, Based on what I read from…
kee
  • 10,969
  • 24
  • 107
  • 168
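On the question above: Azure SQL Data Warehouse did not support `varchar(MAX)`; the widest declarable string columns were `varchar(8000)` and `nvarchar(4000)`. A sketch of the adjusted DDL, with an assumed table name:

```sql
-- varchar(MAX) is rejected by Azure SQL Data Warehouse; varchar(8000)
-- is the widest supported varchar declaration (nvarchar caps at 4000).
CREATE TABLE dbo.Annotations
(
    AnnotationID   int           NOT NULL,
    AnnotationText varchar(8000) NULL
);
```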
1
vote
1 answer

Populating fact table with different sequence time

I am using the following query to populate my fact table: Select sh.isbn_l, sh.id_c, sh.id_s, sh.data, sh.quantity, b.price from Book as b inner join Sales as sh on b.isbn = sh.isbn_l The main thing is that I want to load the table from a specific time…
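One way to load from a specific time onward is to parameterize the query with a cutoff, as in this sketch. The alias `l` in the question's join is undefined and is taken here to mean `b` (the `Book` alias); the `@LoadFrom` variable and the assumption that `data` is the sale-date column are mine.

```sql
-- Sketch: load fact rows from a given point in time onward.
-- @LoadFrom and the meaning of the 'data' column are assumptions.
DECLARE @LoadFrom datetime = '2018-01-01';

SELECT sh.isbn_l, sh.id_c, sh.id_s, sh.data, sh.quantity, b.price
FROM   Book  AS b
JOIN   Sales AS sh ON b.isbn = sh.isbn_l
WHERE  sh.data >= @LoadFrom;
```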
1
vote
2 answers

Flexible running window to count entries within a time range

I have some devices in the field sending data over GSM and losing connection from time to time. As I have limited disk space, I tend to lose some data during the periods without connection, so I would like to estimate the amount of pending data to get some…
Wien.MC
  • 55
  • 1
  • 9
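For the running-window count above: T-SQL window frames cannot express a time-based `RANGE` (only row-based frames or `UNBOUNDED`), so a correlated subquery or `APPLY` is a common workaround. A sketch with assumed table and column names:

```sql
-- Sketch: for each reading, count that device's entries in the
-- preceding hour. dbo.Readings, device_id and ts are assumptions.
SELECT d.device_id,
       d.ts,
       (SELECT COUNT(*)
        FROM   dbo.Readings AS p
        WHERE  p.device_id = d.device_id
          AND  p.ts >  DATEADD(hour, -1, d.ts)
          AND  p.ts <= d.ts) AS cnt_last_hour
FROM dbo.Readings AS d;
```

This scans per row, so on large tables an indexed `(device_id, ts)` key matters.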
1
vote
1 answer

How to resolve "Rows were rejected while reading from external source(s)" in SQL Server Data Warehouse

I have to load data from a data lake into SQL Server Data Warehouse using PolyBase tables. I have created the setup for external tables, created the external table, and am trying to do select * from ext_t1, but I'm getting…
pythonUser
  • 183
  • 2
  • 7
  • 20
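The rejected-rows error above is governed by PolyBase's reject options on the external table. This sketch assumes the external data source and file format already exist; raising `REJECT_VALUE` (or switching to `REJECT_TYPE = PERCENTAGE`) lets a query tolerate a bounded number of malformed rows instead of failing, though the underlying type/format mismatches still deserve fixing.

```sql
-- Sketch: external table with an explicit reject threshold.
-- MyDataLake and MyTextFormat are assumed pre-existing objects.
CREATE EXTERNAL TABLE dbo.ext_t1
(
    id   int,
    name varchar(100)
)
WITH (
    LOCATION     = '/data/t1/',
    DATA_SOURCE  = MyDataLake,
    FILE_FORMAT  = MyTextFormat,
    REJECT_TYPE  = VALUE,
    REJECT_VALUE = 100
);
```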
1
vote
2 answers

Is there an alternative to CREATE TYPE in SQL, since CREATE TYPE is not supported in Azure SQL Data Warehouse?

I am trying to execute this query, but user-defined table types (CREATE TYPE) are not supported in Azure SQL Data Warehouse, and I want to use one in a stored procedure. CREATE TYPE DataTypeforCustomerTable AS TABLE( PersonID int, Name varchar(255), …
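One common substitute for a table type inside a procedure is a temporary table with the same shape, as in this sketch. The column list mirrors what the question shows (the remaining columns are elided there); whether this fits depends on how the type was being passed around.

```sql
-- Sketch: a temp table standing in for an unsupported table type.
-- Columns beyond Name are elided in the question and omitted here.
CREATE TABLE #DataTypeforCustomerTable
(
    PersonID int,
    Name     varchar(255)
);

INSERT INTO #DataTypeforCustomerTable (PersonID, Name)
VALUES (1, 'Example');
```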
1
vote
0 answers

Dynamically trimming data to 4000 characters while copying to ADW using Azure Data Factory

I am getting the following error while copying data from blob to Azure Data Warehouse: "errorCode": "2200", "message": "ErrorCode=FailedDbOperation,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=Error happened…
Megha Agarwal
  • 11
  • 1
  • 3
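A frequent cause of the copy failure above is a source string exceeding the sink column's width; one workaround is to trim in the copy activity's source query rather than in the sink, as in this sketch (table and column names are assumptions).

```sql
-- Sketch: trim oversized strings at the source of the copy activity
-- so they fit the sink column. Names are hypothetical.
SELECT Id,
       LEFT(AnnotationText, 4000) AS AnnotationText
FROM dbo.SourceTable;
```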
1
vote
1 answer

Strategies to prevent duplicate data in Azure SQL Data Warehouse

At the moment I am setting up an Azure SQL Data Warehouse. I am using Databricks for the ETL process with JSON files from Azure Blob Storage. What is the best practice to make sure not to import duplicate dimensions or facts into the Azure SQL Data…
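One simple guard against the duplicates the question above worries about is to insert only rows whose business key is absent from the target (`MERGE` was long unsupported in Azure SQL Data Warehouse). Table and column names in this sketch are assumptions.

```sql
-- Sketch: idempotent load of new dimension rows from a staging table.
-- dbo.DimProduct, stg.Product and ProductKey are hypothetical names.
INSERT INTO dbo.DimProduct (ProductKey, Name)
SELECT s.ProductKey, s.Name
FROM   stg.Product AS s
WHERE  NOT EXISTS (SELECT 1
                   FROM dbo.DimProduct AS d
                   WHERE d.ProductKey = s.ProductKey);
```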
1
vote
2 answers

How to add a partition boundary only when not exists in SQL Data Warehouse?

I am using Azure SQL Data Warehouse Gen 1, and I created a partitioned table like this: CREATE TABLE [dbo].[StatsPerBin1]( [Bin1] [varchar](100) NOT NULL, [TimeWindow] [datetime] NOT NULL, [Count] [int] NOT NULL, [Timestamp] [datetime] NOT NULL) WITH ( …
Lucas Yang
  • 63
  • 8
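One approach to the question above is to guard the `SPLIT RANGE` with a check against the partition catalog views, as in this sketch. Treat the exact catalog-view join as an assumption to verify against your service version, and the boundary value as a placeholder.

```sql
-- Sketch: add a partition boundary only if it does not already exist.
-- The boundary value and the catalog-view lookup are assumptions.
IF NOT EXISTS (
    SELECT 1
    FROM sys.partition_range_values AS rv
    WHERE rv.value = CAST('2019-01-01' AS datetime)
)
BEGIN
    ALTER TABLE dbo.StatsPerBin1
        SPLIT RANGE ('2019-01-01');
END;
```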