I have a Basic-tier SQL Azure DB, queried mostly via STIntersects against minimal polygons (5 points each), and I had horrible performance: an intersection check on a 50K-row table took nearly a minute, even with spatial indexes in place and hinted in the query.
Eventually, I set up a local test DB, ran the same query, and had the result in less than a second.
I was on the lowest SQL Azure tier, so I upgraded to S1 and then to S2, with remarkably better results on S2, although the query still takes over 3 seconds. The problem: a single query already peaks my S2 instance at 60% DTU utilization, which makes me suspect this isn't fit for production at all; that's a lot of money for very little performance.
https://stackoverflow.com/a/27974549 suggested a year ago that Azure SQL wasn't fit for spatial workloads because geography, being a CLR type, kills the CPU. What is your experience / recommendation with the current V12 databases? Do I really have to fall back to (worse-performing) manual lat/long calculations when deploying to Azure?
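To clarify what I mean by manual calculations: something along these lines, assuming I replaced the geography columns with hypothetical plain float columns [Lat]/[Lng] (a haversine distance check in meters):

```sql
-- Hypothetical fallback: haversine formula on plain float columns
-- [Lat]/[Lng] instead of geography; 6371000 m = mean earth radius.
DECLARE @lat float = 47.5041406, @lng float = 8.7125158;
SELECT COUNT(*)
FROM Locations
WHERE 2 * 6371000 * ASIN(SQRT(
          POWER(SIN(RADIANS([Lat] - @lat) / 2), 2)
        + COS(RADIANS(@lat)) * COS(RADIANS([Lat]))
        * POWER(SIN(RADIANS([Lng] - @lng) / 2), 2))) <= 575000;
```

That obviously only does point-in-radius, not true polygon intersection, which is why I'd rather keep the spatial types.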
Thanks, Philipp
--
Query:
DECLARE @g geography;
DECLARE @bfr geography;
-- Center point as 'POINT(lng lat)', SRID 4326 (WGS84)
SET @g = geography::STPointFromText('POINT(8.7125158 47.5041406)', 4326);
-- 575 km buffer, simplified with a 57.5 km tolerance
SET @bfr = @g.STBuffer(575000).Reduce(57500);
SELECT COUNT(*)
FROM Locations st WITH (INDEX(SPATIAL_LocationArea))
WHERE @bfr.STIntersects(st.LocationArea) = 1;
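As an aside, I'm aware of Filter(), which is documented as an index-only approximation of STIntersects() (it can return false positives but skips the exact CLR test); I haven't benchmarked it on Azure yet, but the final predicate would simply become:

```sql
-- Filter() answers from the spatial index alone, so it may return
-- false positives but avoids the exact (CPU-heavy) intersection test.
SELECT COUNT(*)
FROM Locations st WITH (INDEX(SPATIAL_LocationArea))
WHERE @bfr.Filter(st.LocationArea) = 1;
```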
Table:
CREATE TABLE [dbo].[Locations] (
[Id] INT IDENTITY (1, 1) NOT NULL,
[Location] [sys].[geography] NOT NULL,
[LocationArea] [sys].[geography] NOT NULL,
PRIMARY KEY CLUSTERED ([Id] ASC)
);
GO
CREATE SPATIAL INDEX [SPATIAL_Location]
ON [dbo].[Locations] ([Location]);
GO
CREATE SPATIAL INDEX [SPATIAL_LocationArea]
ON [dbo].[Locations] ([LocationArea]);
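Note that both indexes use all defaults. One thing I considered was explicit grid tuning; a sketch of what that would look like (the grid levels and CELLS_PER_OBJECT here are guesses, not measured values, and would need tuning against the actual data):

```sql
-- Hypothetical tuned index: explicit grid densities and a lower
-- CELLS_PER_OBJECT; the right values depend on the data distribution.
CREATE SPATIAL INDEX [SPATIAL_LocationArea_Tuned]
ON [dbo].[Locations] ([LocationArea])
USING GEOGRAPHY_GRID
WITH (GRIDS = (LEVEL_1 = HIGH, LEVEL_2 = HIGH,
               LEVEL_3 = MEDIUM, LEVEL_4 = MEDIUM),
      CELLS_PER_OBJECT = 16);
```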