If your SQL Server database has nullable columns that contain mostly null values, consider marking those columns as sparse. Sparse columns are ordinary columns that have been optimized for storing null values. Not all columns can be set to sparse, and a number of restrictions apply.
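As a quick sketch, making a column sparse is just a column-level option; the table and column names here are illustrative:

-- Mark an existing nullable column as sparse (the data type must be restated)
ALTER TABLE Address
ALTER COLUMN Address4 nvarchar(100) SPARSE NULL;

The same SPARSE keyword works in a CREATE TABLE column definition, subject to the restrictions mentioned above.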
Columns set to sparse require more storage space for non-null values (4 additional bytes) than the space required for identical data in a column that is not marked sparse. This means it only makes sense to use sparse when the percentage of nulls is above a certain threshold, and that threshold is different for each data type.
An excerpt from Microsoft's Estimated Space Savings by Data Type table is below. The red highlight shows how many bytes are needed for normal non-sparse columns, regardless of whether the values are null.
The blue highlight shows the bytes required for non-null values in sparse columns. Note that null values take up no space at all in sparse columns.
The final column shows what percentage of nulls is needed for a space saving of 40%. Microsoft recommends using sparse columns when savings are at least 20–40%. We can see in yellow that to save 40% of space on an int column we need 64% of its values to be null, while the corresponding percentage for datetime columns is 52%.
Determining percentage of null values in a column manually
SELECT 100.0 * SUM(CASE WHEN Address4 IS NULL THEN 1 ELSE 0 END) / COUNT(*) AS Address4NullPercent FROM Address
where Address4 is your column and Address is your table name.
Determining percentage of null values in all columns in all tables dynamically
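The original script isn't reproduced here, but a minimal sketch of the approach is below, assuming you generate one aggregate query per nullable column from the catalog views (the real script consolidates the results into a single table):

-- Build a null-percentage query for every nullable column in every table
DECLARE @sql nvarchar(max) = N'';

SELECT @sql += N'SELECT '''
    + s.name + N'.' + t.name + N''' AS TableName, '''
    + c.name + N''' AS ColumnName, '
    + N'100.0 * SUM(CASE WHEN ' + QUOTENAME(c.name)
    + N' IS NULL THEN 1 ELSE 0 END) / NULLIF(COUNT(*), 0) AS NullPercent FROM '
    + QUOTENAME(s.name) + N'.' + QUOTENAME(t.name) + N';' + NCHAR(13)
FROM sys.columns c
JOIN sys.tables t ON t.object_id = c.object_id
JOIN sys.schemas s ON s.schema_id = t.schema_id
WHERE c.is_nullable = 1;

-- Each generated SELECT returns its own result set
EXEC sp_executesql @sql;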
Note the script above might take a while to run, so to prevent performance issues for your existing prod processes it's best to run it either after hours or on a production clone. It will produce a table similar to the one below.
You'll get the biggest space savings from sparse columns when they're applied to fixed-length columns with a high null percentage in tables with large numbers of records.
The availability of the Azure Pricing API opens up a lot of possibilities for pricing analysis and comparison tools. As we know, pricing for Azure services can vary from region to region, so apps that compare prices across regions provide at least one use case for the new API.
Below we see from the pricing calculator that a standard Linux F4 VM based in North Europe costs $0.226 per hour.
And the output from the Azure Pricing API for the same VM is below.
We can see the prices match, both being $0.226 per hour (see lines 2, 4 and 18 above). Note the Azure Pricing API currently only supports USD.
The type highlighted on line 19 is Consumption, which is the equivalent of the pay-as-you-go option in the price calculator. Other possible types are Reservation and DevTestConsumption.
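As an aside, priceType is one of the filterable fields (filtering is covered later in this post), so if you only care about pay-as-you-go prices you should be able to narrow results with $filter=priceType eq 'Consumption'.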
The JSON for the same VM on Windows is very similar, but we can see that 'Windows' is appended to the productName field (line 13).
Let's take a look at what reservation prices look like. Reservation pricing, if you don't know, allows you to get a discount on your VM compute costs by buying compute time in advance in 1- or 3-year blocks. It can be a great way to reduce your VM cost footprint if you have steady and predictable workloads.
Here’s the Linux VM from above but with 1 year reserved pricing.
We can see a price of $0.1542 per hour and here’s the equivalent entry in the Azure price API.
Note the reservation term is 1 year, as shown on line 4. The price on line 5 in this case is $1351, which is the per-year cost: 8760 (hours per year) * $0.1542 = $1350.792, rounded up. The unitOfMeasure on line 19 says 1 hour but the price given is per year, so watch out for this.
Cosmos DB pricing can be confusing at the best of times, and I did find a number of entries in the price API that I couldn't link back to the pricing calculator and vice versa. This could be a consequence of the API being in preview, or more likely I just missed something on my side.
Below shows the pricing calculator for a standard provisioned Cosmos DB in North Europe with the multiple region write option set.
And the equivalent in the pricing API is below. Note the meterName and skuName fields, which are the main differentiators from single-write-region provisioned Cosmos databases.
As a final example, shown below is the pricing calculator for cool blob storage with geo-zone-redundant storage (GZRS) redundancy based in the North Europe region. We can see the estimated price is $0.2 per 10,000 write operations.
And here's the corresponding entry in the Azure price API… Notice the meterName explicitly calling out that this entry is for cool GZRS write operations. Also note the unitOfMeasure field (line 18), which shows 10K.
Price details for operations such as read, list and create container are contained in their own line items, so to get an all-in view of blob storage prices you'll need to examine multiple entries.
Let’s now look at how we can pull out only specific prices we care about.
Filtering the Azure Price API
Unfiltered, the base API URL just returns all prices for all services across all regions in batches of 100. Thankfully, a number of filters can be applied using OData filter expressions. According to Microsoft…
These are most of the fields returned in the response, and they give us a good bit of query power. Microsoft doesn't give examples of using these filters, but if you've used OData you might be able to figure things out.
Not all OData expressions are supported by the pricing API, but some examples that do work are below. Note that while the filter names are case insensitive, the values to search for are case sensitive.
Equals filter (single)
The basic syntax to match on a single filter is $filter=FILTER eq 'FILTERVALUE'. For example, to only return services for the northeurope region we can query the following URL:
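https://prices.azure.com/api/retail/prices?$filter=armRegionName eq 'northeurope'

(The base endpoint here is the documented retail prices URL; depending on your HTTP client the spaces in the filter expression may need to be URL-encoded as %20.)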
If we want to search on multiple filters we just use the and/or operators, e.g. $filter=FILTER1 eq 'FILTER1VALUE' and FILTER2 eq 'FILTER2VALUE'. For example, to only return virtual machine pricing in northeurope we can hit the following URL:
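https://prices.azure.com/api/retail/prices?$filter=serviceName eq 'Virtual Machines' and armRegionName eq 'northeurope'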
.NET 5 Release Candidate 1 (of 2) has just dropped. Prior to its full release and availability as a built-in runtime stack in Azure App Service, if you're looking to try it out you can of course deploy your .NET 5.0 apps as self-contained. This means you don't need the framework installed on the host; this is the beauty of .NET Core after all…
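As a sketch, a self-contained publish just needs a runtime identifier; the exact RID depends on your App Service plan (e.g. win-x86 for a 32-bit Windows worker process):

dotnet publish -c Release -r win-x64 --self-contained true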
If you prefer framework-dependent deployments, perhaps due to their much smaller size and quicker deploy time, you can install an extension into your App Service to enable the 5.0 RC1 runtime.
The extension option is available if you navigate to your app service and click Development Tools -> Extensions. Restart your app afterwards.
Azure Data Studio is Microsoft's cross-platform database IDE (unlike SSMS, it works on macOS and Linux). Despite its name you can connect to on-premises DBs too, just like SSMS.
Microsoft's What is Azure Data Studio page has a good overview of the different functionality available in ADS and also outlines why you might choose ADS over SSMS, but here I'm just going to show some quick examples of the charting capabilities available. These give you a nice simple way to visualize data directly in the IDE.
Clicking the chart icon on the RHS gives us a number of options for charting the results as seen below. The specific options available depend on what type of chart you select.
There are ten chart types supported, but not all will be relevant to all result sets. You get more power in Excel or dedicated charting solutions, but if your query is simple the charting in ADS can be very helpful. I find myself using it a lot in combination with GROUP BY queries; a sample of the kind of query I mean is below.
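As an illustration (the table and column names here are made up), a simple aggregate like this produces a small result set that charts nicely:

SELECT SalesLocation, COUNT(*) AS OrderCount
FROM Sales.Orders
GROUP BY SalesLocation
ORDER BY OrderCount DESC;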
A few examples of charts based on the above sales location data are below. In order, the charts are Bar, Doughnut, Line and Horizontal Bar.
Despite the lack of visual feedback, you can upload files directly into your Azure App Service via the Azure portal. This can be done via the Kudu console or in the new App Service Editor (preview).
To open the Kudu console navigate to YOUR APP SERVICE -> Development Tools -> Advanced Tools or bookmark https://yourappservice.scm.azurewebsites.net/DebugConsole. Then just drag and drop the files you want uploaded onto the relevant folder.
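If you'd rather script the upload, Kudu also exposes a REST API over the file system; a rough one-liner with curl is below (the file name is a placeholder, and the -u value is your deployment credential):

curl -X PUT -u '$yourappservice' -H "If-Match: *" --data-binary @myfile.txt "https://yourappservice.scm.azurewebsites.net/api/vfs/site/wwwroot/myfile.txt"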
To open App Service Editor navigate to YOUR APP SERVICE -> Development Tools -> App Service Editor (Preview) or bookmark https://yourappservice.scm.azurewebsites.net/dev. You can upload files by dragging them onto the Explore pane on the LHS.