Performing commands on all tables in a SQL Server database using the sp_MSforeachtable stored procedure

From time to time I need to perform an operation, such as deleting all records or disabling all triggers, on every table in a database. Scripts using cursors can do this, but the undocumented stored procedure sp_MSforeachtable, which resides in the SQL Server master database, can also be used.

Below are a few use cases for this sproc, from disabling all triggers and updating statistics to checking row counts. Note that because this stored procedure is undocumented it could be dropped or modified at any time, so it's best not to rely on it in production scripts, but I've found it very handy in ad hoc situations like prepping a DB for UAT. If you've got other examples of use, please send them on.
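Besides the command itself, sp_MSforeachtable accepts a few optional parameters; the ones I find most useful are @replacechar (to change the ? placeholder) and @whereand (appended to the internal WHERE clause to filter which tables are processed). A sketch, bearing in mind that, being undocumented, these parameters and the internal query (which aliases sysobjects as o) could change at any time:

```sql
-- Run a command against only tables whose name starts with 'tbl'.
-- @whereand is appended to sp_MSforeachtable's internal query,
-- which selects from sysobjects aliased as o (undocumented detail).
EXEC sp_MSforeachtable
    @command1 = 'PRINT ''?''',
    @whereand = 'AND o.name LIKE ''tbl%''';
```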

Disable all triggers
exec sp_MSforeachtable 'ALTER TABLE ? DISABLE TRIGGER ALL'

Enable all triggers
exec sp_MSforeachtable 'ALTER TABLE ? ENABLE TRIGGER ALL'

Disable all constraints
exec sp_msforeachtable 'ALTER TABLE ? NOCHECK CONSTRAINT all'

Enable all constraints
exec sp_msforeachtable 'ALTER TABLE ? WITH CHECK CHECK CONSTRAINT all'
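The WITH CHECK part matters: a plain CHECK CONSTRAINT all would re-enable the constraints without validating existing rows, leaving them marked untrusted so the optimizer can't rely on them. To verify nothing was left untrusted after re-enabling, a query along these lines works:

```sql
-- List any foreign key or check constraints still marked untrusted
SELECT name, 'foreign key' AS constraint_type
FROM sys.foreign_keys
WHERE is_not_trusted = 1
UNION ALL
SELECT name, 'check'
FROM sys.check_constraints
WHERE is_not_trusted = 1;
```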

Delete records in all tables
exec sp_MSForEachTable 'DELETE FROM ?'

Truncate all tables (won't work if any tables are referenced by foreign keys)
exec sp_MSForEachTable 'TRUNCATE TABLE ?'

Rows, size, space info for all tables
exec sp_MSforeachtable 'exec sp_spaceused [?]'

Print all table names
exec sp_MSforeachtable 'print ''?'''

Update statistics
exec sp_msforeachtable 'UPDATE STATISTICS ?'

Defragment all indexes
exec sp_msforeachtable 'dbcc indexdefrag(0, ''?'')'

Reseed identity columns on all tables
exec sp_msforeachtable 'DBCC CHECKIDENT (''?'', RESEED, 0)'

Reindex all tables
exec sp_msforeachtable 'dbcc dbreindex(''?'')'

Update a particular column. The column MUST EXIST ON ALL TABLES, otherwise execution will stop, without a rollback, on reaching a table without the column.
exec sp_MSforeachtable 'UPDATE ? SET Updated_by = ''DAVID'''
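If the column only exists on some tables, one way round this is to guard the update with COL_LENGTH and push the UPDATE into a nested EXEC, so that it is only compiled for tables which actually have the column. A sketch (the quote-escaping gets ugly; Updated_by and DAVID are just the sample values from above):

```sql
-- Only tables with an Updated_by column are updated; the nested EXEC
-- defers compilation of the UPDATE, so tables without the column
-- don't raise an 'Invalid column name' error.
EXEC sp_MSforeachtable
    'IF COL_LENGTH(''?'', ''Updated_by'') IS NOT NULL
         EXEC(''UPDATE ? SET Updated_by = ''''DAVID'''''')';
```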

Add a column to all tables. If the column already exists, execution will continue.
exec sp_msforeachtable 'alter table ? add flag bit not null default 0';

Remove a column from all tables. If the column doesn't exist, execution will continue. Drop any dependencies on the column first.
exec sp_msforeachtable 'alter table ? drop column [flag]'

Row count for all tables
create table #rowcount (tbl_name varchar(100), row_count int)
exec sp_MSforeachtable 'insert #rowcount select ''?'', count(*) from ?'
select * from #rowcount order by tbl_name
drop table #rowcount
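If you only need the counts, an alternative that avoids scanning every table is to read the row counts (approximate, but usually accurate) from the catalog views:

```sql
-- Approximate row counts per table from the catalog, no table scans.
-- index_id 0 = heap, 1 = clustered index.
SELECT t.name AS tbl_name, SUM(p.rows) AS row_count
FROM sys.tables AS t
JOIN sys.partitions AS p
    ON p.object_id = t.object_id
WHERE p.index_id IN (0, 1)
GROUP BY t.name
ORDER BY t.name;
```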

Number of rows updated since a certain date. This assumes all tables have an auditing date column. Tables which don't have the column will not cause execution to stop.

create table #updateddaterows (tbl_name varchar(100), row_count int)
exec sp_MSforeachtable 'insert #updateddaterows select ''?'', count(*) from ? where Updated_date > ''20160101'' having count(*) > 0'
select * from #updateddaterows order by tbl_name
drop table #updateddaterows

 

Number of rows updated by a user since a certain date.
create table #updatedbyrows (tbl_name varchar(100), row_count int)
exec sp_MSforeachtable 'insert #updatedbyrows select ''?'', count(*) from ? where Updated_by = ''12345'' and Updated_date > ''20160101'' having count(*) > 0'
select * from #updatedbyrows order by tbl_name
drop table #updatedbyrows

Related Links

Article looking in detail at the parameters sp_msforeachtable takes

Viewing SQL generated by Entity Framework when querying or calling SaveChanges()

When using Entity Framework, in an effort to track down bugs or performance problems, it's very handy to see what SQL Entity Framework is generating and sending to the DB when you run a lambda/LINQ query or indeed make a call to SaveChanges().

This is easy once you know how. Simply add this.Database.Log = x => System.Diagnostics.Debug.WriteLine(x); to the constructor of your database context class (the one that derives from DbContext).


Afterwards you should see all the queries Entity Framework is sending to the DB in your output window.

Passing userId to SQL Server delete trigger from entity framework

If you’re in an enterprise environment some form of auditing is likely to be required so you can track who inserted, updated or deleted what in your database. Using triggers to insert into audit tables is usually a good fit for this, particularly if your database can be updated from multiple sources.

Within both insert and update triggers you have access to the record inserted or updated from your web app, so you can read the updated_by column from that record and pop it into your audit_* tables. In the case of deletes, however, no user information is ever passed with the delete statement, so delete triggers don't know who specifically issued the delete (the DB connection itself is usually shared). Using a session-like feature available since SQL Server 2000 called CONTEXT_INFO, however, we can 'pass' the id of the user issuing the delete from our web app to the trigger.

Passing user info to delete trigger using CONTEXT_INFO

CONTEXT_INFO allows us to associate a limited amount of information (128 bytes, but enough for user info) with the current SQL Server connection. This info can then be read back at any stage from within the same connection. Using it to pass user info to a trigger is then quite easy:

  1. Set CONTEXT_INFO to userID logged into your web app
  2. Issue delete statement from your web app
  3. Read CONTEXT_INFO back in the delete trigger

As long as all three of the above happen on the same connection this should work fine. Setting the context is easy, but is best done from within a stored procedure to encapsulate the code dealing with CONTEXT_INFO (the binary conversion is cumbersome). SetUserContext below takes the user id as a parameter and stores it in CONTEXT_INFO.

CREATE PROCEDURE [dbo].[SetUserContext]
 @userId NVARCHAR (30)
AS
BEGIN
 SET NOCOUNT ON;

 DECLARE @context VARBINARY(128)
 SET @context = CONVERT(VARBINARY(128), @userId)

 SET CONTEXT_INFO @context
END
GO

Reading CONTEXT_INFO is best done through a function so you can call it directly from the delete trigger statement that inserts into your audit_* tables. The GetUserContext function below returns CONTEXT_INFO if it is set, or the standard suser_name() if it is not.

Checking for null and falling back on the connected DB user is important if a DB can be updated directly by an admin or support person outside the context of an app which would set CONTEXT_INFO.

CREATE FUNCTION [dbo].[GetUserContext] ()
RETURNS NVARCHAR (30)
AS
BEGIN
 
 DECLARE @idToReturn NVARCHAR(30)

 IF CONTEXT_INFO() IS NOT NULL
  SELECT @idToReturn = CONVERT(NVARCHAR (30), CONTEXT_INFO())
 ELSE
  SELECT @idToReturn = suser_name()

 RETURN @idToReturn
 
END
GO

And finally a sample delete trigger which calls the function is below:

CREATE TRIGGER manager_d
    ON DBO.MANAGER
    FOR DELETE
AS
     INSERT dbo.AUDIT_MANAGER
     (
        manager_id,
        updated_by,
        updated_date,
        mode
     )
     SELECT 
     d.manager_id, 
     dbo.GetUserContext(),
     getdate(),
     'Delete'
     FROM deleted AS d
GO

Once you have created the sproc for setting the context and the function for reading it, and have a couple of delete audit triggers in place, the database side of things is set up.

Setting CONTEXT_INFO using entity framework

In your web app you only care about setting the context, which can be done by simply calling the procedure. In Entity Framework this is easy:

public virtual void SetUserContext(string userId)
{
 var idParam = new SqlParameter("@userId", userId);
 this.Database.ExecuteSqlCommand("SetUserContext @userId", idParam);
}

Then, in an overload of the SaveChanges method, you simply call the SetUserContext sproc before calling the base SaveChanges method:

public void SaveChanges(string userId)
{
   using (var scope = Database.BeginTransaction())
   {
     SetUserContext(userId);
     base.SaveChanges();
     scope.Commit();
   }
}

Both the above functions would go in your database context class (the one which derives from DbContext).

Because Entity Framework opens a connection, executes the proc and then immediately closes the connection when executing stored procedures, the CONTEXT_INFO would be lost by the time we call SaveChanges, which is when the delete trigger runs. To ensure both the setting and the reading of the context take place on the same connection, we wrap both in the same transaction, as can be seen above.

What about connection pooling?

When a connection is being reused, the .NET SQL Server client executes the sp_reset_connection stored procedure, which resets most connection-specific state such as temp tables, cursors, transactions and, crucially for this article, CONTEXT_INFO. This has been the case since SQL Server 2005.

This means that if you're using connection pooling, which you really should be, you will not run into state-sharing issues, and thus connection pooling does not preclude you from using CONTEXT_INFO as part of your auditing solution. SQL Server 2000 users… get with the times, or just reset CONTEXT_INFO after you issue your delete.

I hope you find the above useful. Please let me know your thoughts or questions below.

Related Links

List of what gets reset when a connection is reused from the connection pool

Stack Overflow question from a user who is concerned about creating race conditions if using CONTEXT_INFO

Another article about using CONTEXT_INFO from within Entity framework

Article about SESSION_CONTEXT which is a new and improved session mechanism in SQL Server 2016 which overcomes a lot of the limitations associated with CONTEXT_INFO

Config files specified in SQL Agent being overridden by design time package configuration

In work, our DEV and UAT DB environments reside on the same box, but with their own SQL instances. We prefer to store SSIS packages on disk as they are a little (actually a lot) easier to manage that way. Since our DBs are on the same box we could, when configuring SSIS packages, just copy the packages to two separate file locations for DEV and UAT and alter the design-time package configuration to point to DEV.dtsconfig for one and UAT.dtsconfig for the other.

Rather than this, however, we attempted to keep the SSIS package in one location and override the default .dtsconfig file defined at design time by passing in the environment-specific .dtsconfig as part of the two (DEV & UAT) SQL Agent jobs set up to execute the package on schedule. This can be done from the configurations tab in the job properties.


We expected the package to take its settings from UAT.dtsConfig, as that was what was defined in the job, but as we found this was not the case. This is because, as of SQL Server 2008, SSIS packages load configurations in the following order:

  1. The utility first applies the design-time configurations.
  2. The utility then applies the run-time options that you specified on the command line when you started the utility.
  3. Finally, the utility reloads and reapplies the design-time configurations.

which meant the .dtsConfig specified in the design-time configuration was used. According to Behaviour Changes to Integration Services Features in SQL Server 2008 R2 on the MSDN site, one can use /set to change design-time settings but not the location of settings.

To get the .dtsConfig specified in the job to be the effective one, we needed to disable package configurations at design time. Not delete the existing one, just disable it. After that, the config file specified in the SQL Agent job was used.

Get all foreign keys and their cascade actions in SQL Server

Cascading deletes and updates are not for everyone, and often they can cause problems or there are better alternatives to using them. If you do use them and want to confirm what the update and delete actions are, the following query finds all foreign keys in SQL Server along with their cascade actions:

SELECT name, update_referential_action_desc, delete_referential_action_desc
FROM sys.foreign_keys
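To make the output easier to read, the referencing and referenced table names can be pulled in too:

```sql
-- Foreign keys with their tables and cascade actions
SELECT fk.name,
       OBJECT_NAME(fk.parent_object_id)     AS referencing_table,
       OBJECT_NAME(fk.referenced_object_id) AS referenced_table,
       fk.update_referential_action_desc,
       fk.delete_referential_action_desc
FROM sys.foreign_keys AS fk
ORDER BY referencing_table;
```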