There have been a number of new security enhancements recently to control and audit access to Azure SQL databases. These include Auditing, Dynamic Data Masking, Row-level Security, and Transparent Data Encryption. In this first part I give my assessment of Auditing and Dynamic Data Masking and how likely I am to be using them.
Azure SQL Auditing has been generally available since November, but the other security features I will be reviewing are still in preview and only available with Azure SQL Database V12.
My conclusion: Auditing provides a cost-effective, low impact method for auditing database activity and I would recommend enabling it for any database where security or compliance is a major consideration.
Auditing for Azure SQL Databases recently became generally available and provides the simplest method for tracking activity on your database. The only additional cost is the Azure table storage to log the tracked events.
Enabling and configuring Auditing from the Azure management portal is straightforward. The only real decision is which types of events to audit, from the following list: Data Access, Schema Changes, Data Changes, Security Failures, or Grant/Revoke Permissions. As you can see, these options are very coarse-grained, and there is no opportunity to apply filters before they are logged, so I would think twice before selecting ‘Data Access’, in particular. It is useful to know that these options can be reconfigured at any time and will take immediate effect for any new connections.
For storing the audit logs, you choose an Azure storage account and Auditing creates a single table called ‘SQLDBAuditLogs20140928’. I was initially confused by the date extension of the table name, but the intention is that Auditing will periodically create new tables suffixed with the current date. The fact that the events are just being recorded in an Azure table reinforces Microsoft’s positioning of Auditing as a tool for reviewing historical activity rather than monitoring current activity or alerting you to potential issues.
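To make the naming convention concrete, here is a small Python sketch. The ‘SQLDBAuditLogs’ prefix plus a YYYYMMDD suffix is my reading of the ‘SQLDBAuditLogs20140928’ table observed in the portal; the exact rollover schedule is an assumption, not documented behaviour:

```python
from datetime import date

def audit_table_name(d: date) -> str:
    """Build the audit log table name for a given date.

    Assumes the 'SQLDBAuditLogs' prefix plus a YYYYMMDD suffix,
    matching the 'SQLDBAuditLogs20140928' table seen in the portal.
    """
    return f"SQLDBAuditLogs{d.strftime('%Y%m%d')}"

print(audit_table_name(date(2014, 9, 28)))  # SQLDBAuditLogs20140928
```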
The Azure portal provides a link to download an Excel workbook which can be used to query and analyse the data in the audit logs. This comes with a predesigned set of Pivot charts and tables, based on a Power Pivot data model and a Power Query query, with drill downs to the raw data. I found that this was useful for testing my Auditing set-up, and I could imagine customising it further and deploying it to a Power BI site, but I still have questions about the best way to consume and visualise the audit data. The format and structure of the audit logs are described here.
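If you prefer to consume the raw rows outside Excel, the kind of summary the workbook’s pivot tables give you can be sketched in plain Python. Note that the column names and event categories below are placeholders of my own, not the documented log schema:

```python
from collections import Counter

# Hypothetical sample of downloaded audit log rows; the real column
# names come from the documented audit log format.
rows = [
    {"event_type": "DataAccess", "principal": "app_user"},
    {"event_type": "SchemaChanges", "principal": "dbo"},
    {"event_type": "DataAccess", "principal": "app_user"},
    {"event_type": "SecurityFailures", "principal": "unknown"},
]

def summarise(rows, key="event_type"):
    """Count audit events grouped by the given column."""
    return Counter(r[key] for r in rows)

print(summarise(rows))
```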
A final point to mention is that Auditing can only log events for database connections that use security enabled connection strings. By default, security enabled access is optional, which is useful for testing purposes but would defeat the purpose of enabling auditing on a production database unless it was made a requirement. You can always recognise a security enabled connection because the Azure SQL server name will have the suffix “database.secure.windows.net”.
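A quick sketch of that distinction in Python, useful for sanity-checking connection strings in a deployment script. It assumes the standard endpoint uses the usual ‘.database.windows.net’ form:

```python
def is_security_enabled(server: str) -> bool:
    """True when the Azure SQL server name uses the security
    enabled endpoint, recognisable by its suffix."""
    return server.endswith(".database.secure.windows.net")

def to_security_enabled(server: str) -> str:
    """Rewrite a standard server FQDN to its security enabled
    equivalent (assumes the '.database.windows.net' form)."""
    prefix = server.split(".")[0]
    return f"{prefix}.database.secure.windows.net"

print(to_security_enabled("myserver.database.windows.net"))
# myserver.database.secure.windows.net
```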
My conclusion: Dynamic data masking could provide a simple method of enforcing data access control on individual columns that contain highly sensitive information, but it would be a very blunt instrument. The effect would be that a login either has unrestricted access to all data or restricted access to all masked data. I cannot personally envisage including this in an application database design.
As with Auditing, dynamic data masking depends upon a security enabled connection, so you should enforce this as a prerequisite.
Once dynamic data masking has been enabled in the Azure Management Portal, the following configuration options are exposed: a list of privileged logins, the set of data masks, and a ‘Use Extended Restrictions’ option.
Even with data masking enabled, access to data is unrestricted for all users until one or more data masks are defined. A data mask can be applied to a column in a table or a column alias in a query. Using an alias as a masked item obviously cannot impose a blanket restriction on the underlying item of data, but this option may be favoured when the source of the underlying data item cannot be tied to an individual table column.
There is a predefined set of functions for masking data, including partial masks for credit card, social security and email values, a random number mask for numeric values, partial masking for string values, and replacement of values with data-type-specific defaults. I would have preferred to see a more flexible approach to defining custom data mask patterns.
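To make the behaviour of the partial masks concrete, here is a rough Python approximation of what a partial mask and the credit card mask do to a value. The exposed prefix/suffix lengths and the padding string are my own illustrative choices, not the service defaults:

```python
def partial_mask(value: str, prefix: int = 2, suffix: int = 2,
                 pad: str = "XXXX") -> str:
    """Approximate a 'partial' data mask: keep the first `prefix` and
    last `suffix` characters and replace the middle with `pad`."""
    if len(value) <= prefix + suffix:
        return pad
    return value[:prefix] + pad + value[-suffix:]

def credit_card_mask(number: str) -> str:
    """Approximate the credit card mask: expose only the last four digits."""
    digits = number.replace("-", "").replace(" ", "")
    return "XXXX-XXXX-XXXX-" + digits[-4:]

print(partial_mask("sensitive-value"))          # seXXXXue
print(credit_card_mask("4111 1111 1111 1234"))  # XXXX-XXXX-XXXX-1234
```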
Another concern I have is the configuration of privileged logins and data masks as static lists in the Azure management portal. These are both potentially dynamic items of application metadata: logins have to take into account application connections as well as individual users, and data masks have to take into account changing database structures. I don’t know where else this information could reside, but it feels like the wrong place.
Finally, there is no detailed description of what the ‘Use Extended Restrictions’ option does, but after experimenting it seems to prevent any database connection by non-privileged users that uses SQL Server Management Objects (SMO) or earlier equivalents. This explains why Microsoft qualify this option with a warning that it can affect application functionality.