
We have a requirement to provide customers with access to a staging database so that they can extract their data onto their own servers, but every table contains all customers' data. All of the tables have a 'CustomerID' column, and customers should only see rows where the CustomerID matches their own.

I am not looking for suggestions to create separate databases or views for each customer, as both approaches are high-maintenance and inefficient.

My solution has to work with:

  • 100GB database
  • 400 Tables
  • Updates every 30 minutes from the core transaction database
  • Quarterly schema changes (the application is in continuous development).

Can anyone give me a definitive answer as to why the following method is not secure or will not work?

I've set up a database user for each customer, with their CustomerID stored as an extended property on the user.
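For illustration, that per-customer setup looks roughly like this (the login, user name and ID value are placeholders, not our actual objects):

-- Hypothetical example: map a login to a database user for one customer
CREATE USER [Customer042User] FOR LOGIN [Customer042Login];
GO
-- Attach that customer's ID to the user as an extended property
EXEC sys.sp_addextendedproperty
     @name       = N'CustomerID',
     @value      = 42,
     @level0type = N'USER',
     @level0name = N'Customer042User';
GO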

I've created a view for every table that dynamically selects * from the table where the CustomerID column matches the CustomerID extended property of the logged-in user. The code looks like this and appears to work well:

CREATE VIEW [CustomerAccessDatabase].[vw_Sales]
AS SELECT * FROM [CustomerAccessDatabase].[Sales]
WHERE [Sales].[CustomerID] =
         -- Look up the CustomerID extended property attached to the current user
         (SELECT CONVERT(INT, ep.value)
          FROM sys.extended_properties AS ep
          JOIN sys.sysusers AS u
            ON ep.major_id = u.[uid]
           AND ep.name = 'CustomerID'
           AND u.[sid] = SUSER_SID()
         );
GO

To provide access to the views I've created a generic database role, 'Customer_Access_Role'. This role is granted access to all of the table views, while access to the underlying tables themselves is denied.
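Roughly, the role and its permissions look like this (object names are placeholders, and the GRANT/DENY pair is repeated for every table/view pair):

CREATE ROLE [Customer_Access_Role];
GO
-- Repeated for each of the 400 table/view pairs
GRANT SELECT ON [CustomerAccessDatabase].[vw_Sales] TO [Customer_Access_Role];
DENY  SELECT ON [CustomerAccessDatabase].[Sales]    TO [Customer_Access_Role];
GO
-- Each customer user is then added to the role
ALTER ROLE [Customer_Access_Role] ADD MEMBER [Customer042User];
GO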

To prevent users from changing their own CustomerID, I've denied execute on the extended-property procedures like so:

USE [master];
GO
DENY EXEC ON sys.sp_addextendedproperty TO [public];
GO
DENY EXEC ON sys.sp_dropextendedproperty TO [public];
GO
DENY EXEC ON sys.sp_updateextendedproperty TO [public];
GO

The end result is that I only need one database, and one set of permissions.

To add a new customer, all I would need to do is create a new user with their CustomerID as an extended property and add them to the Customer_Access_Role. That's it!

Max xaM
  • Why would you go to so much trouble? Why not just take a copy of the database and then delete all of the data that is not related to that customer? This seems like a very long-winded and risky approach to hide data you don't want them to see. – Tanner Mar 15 '16 at 10:04
  • Hi Tanner, thank you for your comment. This is a staging database that is updated every 30 minutes with potentially thousands of transactions. A small number of our larger customers have their own business intelligence teams who want to run customised intra-day reports from their own servers. They would keep a duplicate database and update it from the shared staging database. – Max xaM Mar 15 '16 at 11:11
  • I would create a separate database for these larger customers; it sounds too risky to expect that these changes would secure your database sufficiently, especially when you move forward with changes and have to remember to do this. – Tanner Mar 15 '16 at 11:58
  • Copied comment from GoldBishop's answer: I appreciate that my solution may be unusual, but it has a big scalability advantage. There are 400 tables in this database. If I have just 10 customers who want this access, that's 4000 views, each with separate users and separate permissions. If I give each customer their own database I need to create a new customised database and ETL package every time a new customer comes along, and maintain every database whenever there is a schema change. – Max xaM Mar 15 '16 at 19:17
  • I realize you are looking for a solution that works with SQL Server 2012, but Microsoft is working on a Row Level Security feature that will be in SQL Server 2016 and in SQL Azure: https://msdn.microsoft.com/library/dn765131.aspx It might be worth looking at their solution in lieu of rolling your own (a rough sketch follows this comment thread). – Tim Lentine Mar 23 '16 at 02:57
  • Thanks @Tim Lentine, I wasn't aware of this and it is an ideal solution. Unfortunately I don't see our development team changing our platform any time soon, certainly not for a lowly analyst like me! – Max xaM Mar 23 '16 at 03:13
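For reference, the Row-Level Security feature mentioned in the comments would look roughly like this on SQL Server 2016. This is only a sketch: the predicate function simply mirrors the extended-property lookup from the view above, and the [Security] schema and object names are assumptions.

-- Assumes a dedicated schema for the predicate objects
CREATE SCHEMA [Security];
GO
-- Inline table-valued predicate: returns a row only when the value passed in
-- matches the CustomerID extended property of the current database user
CREATE FUNCTION [Security].[fn_CustomerFilter] (@CustomerID INT)
RETURNS TABLE
AS
RETURN
    SELECT 1 AS AllowedFlag
    WHERE @CustomerID = (SELECT CONVERT(INT, ep.value)
                         FROM sys.extended_properties AS ep
                         WHERE ep.class = 4                      -- database principals
                           AND ep.major_id = DATABASE_PRINCIPAL_ID()
                           AND ep.name = 'CustomerID');
GO
-- Bind the predicate to a table; SQL Server then filters every query automatically
CREATE SECURITY POLICY [Security].[CustomerFilterPolicy]
ADD FILTER PREDICATE [Security].[fn_CustomerFilter]([CustomerID])
    ON [CustomerAccessDatabase].[Sales]
WITH (STATE = ON, SCHEMABINDING = OFF);
GO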

1 Answer


I am going to reiterate what everyone has already stated and sum it up.

  1. You are making your job harder than it has to be.
  2. Create a view that is just their data, and then give them security access to that view (see the sketch below).
  3. Alternatively, extract all of their data out of the "Core" database into their own database and give them the necessary access to that data.
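As a rough illustration of point 2, a per-customer view could look like this (the names and the ID are placeholders; one such view and grant would be needed per customer, per table):

-- Hypothetical per-customer view, filtered to a single hard-coded CustomerID
CREATE VIEW [CustomerAccessDatabase].[vw_Sales_Customer42]
AS SELECT * FROM [CustomerAccessDatabase].[Sales]
   WHERE [CustomerID] = 42;
GO
GRANT SELECT ON [CustomerAccessDatabase].[vw_Sales_Customer42] TO [Customer042User];
GO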
GoldBishop
  • Thanks GoldBishop. I appreciate that my solution may be unusual, but it has a big scalability advantage. There are 400 tables in this database. If I have just 10 customers who want this access, that's 4000 views, each with separate users and separate permissions. If I give each customer their own database I need to create a new customised database and ETL package every time a new customer comes along, and maintain every database whenever there is a schema change. – Max xaM Mar 15 '16 at 19:14
  • Sounds like you have yourself a serious pickle. Maybe it's time to perform a refactoring/normalization of your data. My company supports well over 50 customers across 3 instances. In each instance, we have one database for each customer. All of them are standard design, and any schema change goes through a rigorous testing model before deploying to all 50 databases. In all cases, the core logic is generic, not client-specific, and is table-driven. – GoldBishop Mar 19 '16 at 19:59
  • Respectfully, why would my solution not work for your business case? – Max xaM Mar 23 '16 at 03:17
  • This does not provide an answer to the question. To critique or request clarification from an author, leave a comment below their post. - [From Review](/review/low-quality-posts/11739789) – PVitt Mar 23 '16 at 13:15
  • @PVitt, if you read the answer, it is an answer to his question; review #2 & #3 of my bulleted points. It is not a critique but more of a suggestion for a better design implementation. – GoldBishop Mar 23 '16 at 17:32
  • @MaxxaM Long-term, you have all your eggs in one basket. If something happens to that one basket (database), your whole system is dead. I understand the design concept and it works on low-volume systems. One of our other systems is very similar to your design and, with just under 200 clients on it, it is buckling badly and requires a lot of maintenance/support. The other system is segmented based on concern and has its own problems (to be fair), but if one database goes down it is only that one client, not every one of them. – GoldBishop Mar 23 '16 at 17:39