How to run Entity Framework Core DB scaffolding from a .bat file in Visual Studio

It’s very handy to wrap the Entity Framework Core scaffolding command in a .bat file and run it on demand from within Visual Studio. Thankfully this is easy to do…

Installing required dependencies

Open up a command prompt window, navigate to your EF project folder and run the below commands to make sure you have the required dependencies…

dotnet tool install --global dotnet-ef
dotnet add package Microsoft.EntityFrameworkCore.Tools

There’s a good chance you’ll already have these installed.

Create a .bat file which calls the ef dbcontext scaffold command

Put a command similar to the one below in a .bat file and save it in whatever project your EF DbContext and Models are in.

dotnet ef dbcontext scaffold name="AdventureWorks" Microsoft.EntityFrameworkCore.SqlServer --force --output-dir Models --context-dir Context --context EFContext --use-database-names

In the above example we have a named connection which references a connection string in the appsettings.json file.

To understand what all the various parameters and flags mean, visit the EF Core tools reference (.NET CLI) – EF Core page.

Executing the .bat file in Visual Studio

At this point you can just double click the file in the project folder and it will scaffold.

To run it from within Visual Studio we can install the Open Command Line extension. After installing the extension we can right click the .bat file and select ‘Execute File’…

Execute File

and we’ll see the CMD window popup…

CMD popup

‘∩╗┐dotnet’ is not recognized as an internal or external command

If you get this strange-looking error when trying to run the .bat file, it has probably been saved as UTF-8 with a byte order mark (BOM), which cmd.exe doesn’t understand (the garbled characters at the start of the error are the BOM). To resolve this, open the .bat file in Notepad and save it with ANSI encoding.

Save as ANSI encoding

Example of an Entity Framework Core Database First development workflow in a team environment

If you’re using Entity Framework Core with the Database First approach you may be looking for a suggested development workflow in terms of how changes to the DB can be managed in a team environment.

In this post I’ll outline the major elements of the workflow I tend to use on most EF projects. First, here are two fundamental ground rules for using this workflow…

DB is KING… C# models and DbContext should never be customised DIRECTLY

In a true database-first approach (as opposed to a one-time scaffold from an existing DB) the DB is the owner, or start point, for the C# models which the Scaffold-DbContext command will create. This means your C# models should be completely rebuildable after any DB changes, so don’t customise the scaffolded models. The same applies to your DbContext class; don’t customise it directly.

To add custom code to your models use partial classes

Use partial classes to implement an interface

To add custom code to your DbContext, create a new one and inherit from the generated one

Override DbContext
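As a sketch of that second point, the derived context might look like the below. The names are assumptions taken from elsewhere in this post: EFContext is the scaffolded context and StaticContext is the hand-written one.

```csharp
using Microsoft.EntityFrameworkCore;

// Sketch only: assumes the scaffolded context is called EFContext.
// EFContext is never edited; all customisations live in this derived
// class, so they survive re-scaffolding.
public partial class StaticContext : EFContext
{
    public StaticContext(DbContextOptions<EFContext> options)
        : base(options)
    {
    }

    // Custom overrides, query filters, helper methods, etc. go here.
}
```

Note the constructor forwards DbContextOptions&lt;EFContext&gt; (not DbContextOptions&lt;StaticContext&gt;) because that is what the generated base constructor expects.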

Database objects are stored in a SQL Server DB Project and must be source controlled

Aside from never customising generated code, a second prerequisite or ground rule is that the database must be source controlled. We must store the definition of the DB in source control so other developers can pull it down and sync it to their local DBs.

Assuming we are using SQL Server we can create a representation of our DB in a SQL Server Database Project in Visual Studio and add it to our solution….

SQL Server Database Project

Suggested development workflow when DB schema changes are needed

OK, assuming we know never to change auto-generated EF Core code, and that our database must be source controlled, let’s look at the day-to-day flow…

– 1 –
Developer gets latest and uses the DB project to push any changes into their local DB

Before starting on any new work, the developer needs to make sure they’ve got the latest code and database project downloaded from source control.

Using the Visual Studio schema comparison tool, they then compare the DB project to their local DB to see if any other developers have made changes. If they have, the differences will be listed in the comparison results and the developer can then use the tool to push the updates (no scripts needed) to their local DB.

You don’t have to check the schema comparison files in (I prefer to), but it’s useful for each developer to at least have them saved locally so they can just double-click them to run comparisons.

Two comparison files will allow a developer to sync from their DB to the project and vice versa…

Using DB comparison tool in Visual Studio

– 2 –
Developer adds a new column to a table in their local DB

Let’s say we want to add a new column for website address to the customer table so we run the below…

ALTER TABLE [SalesLT].[Customer] ADD WebsiteAddress nvarchar(500) NULL;

– 3 –
Re-scaffold from the developer’s local DB and include the force parameter

The DB has been updated so we now need to update the C# models to match so we run something similar to below in the Package Manager Console…

Scaffold-DbContext name="AdventureWorks" Microsoft.EntityFrameworkCore.SqlServer -Force -OutputDir Models -ContextDir Context -Context EFContext -UseDatabaseNames

or we can use the .NET CLI and wrap the scaffold command in a bat file.

The connection string is referenced from a named connection in appsettings.json and will be pointing to the developer’s local DB. At this point we can’t scaffold directly from a DB project, but this functionality is supported in EF Core Power Tools if you’d like to check it out.

Note the inclusion of the -Force flag, which tells Entity Framework Core to overwrite existing files. In this case the previously generated Customer.cs file will be re-created and will now include a new string property called WebsiteAddress. The context file will also be re-created and will now include a mapping corresponding to WebsiteAddress.

Our pending items will look similar to…

Pending Changes after scaffolding

– 4 –
Developer codes as normal and repeats steps 2-4 until ready to check in.

At this point the DB and the C# models are in sync, so the developer can just continue to develop as normal. When they’re ready to check in, they move to step 5…

– 5 –
Developer runs comparison tool in Visual Studio to compare their local DB to the DB project

The developer has finished the task, but if they check in only the C# code, the next developer to do a ‘Get Latest Version’ in Visual Studio will have problems. We need to check in the C# model changes and the corresponding DB changes together…

To update the DB project we can again use the comparison tool in Visual Studio. This time the source will be the developer’s local DB and the target will be the database project on the developer’s local machine.

Note the comparison tool uses the locally downloaded version of the project. It is not possible to compare directly against the latest version that’s in source control. This means it is very important to have the latest project downloaded locally.

After running the comparison tool the output will look a bit like the image below. Click the image for a larger view in a new window.

Comparison Tool Results

We can see the tool has found one change corresponding to the WebsiteAddress column added earlier. To push this change into the project we need to select the checkbox in the middle and click Update.

After the update completes the database project will have been updated and the pending changes window will now look similar to…

Pending Changes after updating db project

Note… if we added a new object as opposed to editing an existing one the DB Project file itself would also show in the pending items.

You don’t have to do this but before checking in I like to…

1) Re-run the comparison to make sure that, after updating the DB project, there are no differences left (in this case), or only differences whose corresponding C# model code will not be checked in either.


2) Compare the local version of changed objects to the latest in source control just to make sure I can reconcile every change…

– 6 –
Check in the database project changes and model/DbContext changes TOGETHER

All that’s left is to check-in….

Final checkin

In our real apps we would of course have a lot of other file changes to check in as well, such as those made to other C# files, .cshtml files, JavaScript files, etc.

Deploying changes to promoted environments such as DEV, UAT or PROD

How you deploy to forward environments will vary depending on your processes, but for the DB part it will likely involve using the comparison tool to compare the DB project against the target DB and then building up deployment scripts.

I usually have comparison files similar to below saved and checked in to the DB project…

Because the target is a database (and not a DB project) the generate scripts button will be enabled…

Generate Scripts option

The script created should be used only as an aid and needs to be manually reviewed before execution.

Note… I recommend always using the DB project, and not an actual DB, as your single source of truth when building up rollout scripts, as this forces developers to check in their DB changes; otherwise their functionality won’t work on DEV, UAT and PROD.

Checking in DB changes is very important: DBs are rebuilt frequently, developers do stupid things to them, old redundant objects are often not dropped, and so on. Having your objects checked in means you can always roll back to a previous version and understand who made a particular change to an object and why.


I’ve used the above database first workflow in teams of 5-20 developers regularly over the last few years without major issues. The issues that do arise are usually due to developers not following the basics such as getting the latest code before starting on new tasks and checking in generated EF code and corresponding DB object changes together.

Let me know what you think…

How to add a column to all tables which don’t have it already in SQL Server

In SQL Server, if you need to add a column to all tables which don’t have it already, the script below (which I found on Stack Exchange) helps create the ALTER statements needed…

SELECT 'ALTER TABLE [' + ss.name + '].[' + st.name + '] ADD [created_date] datetime NULL;'
FROM sys.tables st
INNER JOIN sys.schemas ss ON st.[schema_id] = ss.[schema_id]
WHERE st.is_ms_shipped = 0
AND NOT EXISTS
(
 SELECT 1
 FROM sys.columns sc
 WHERE sc.[object_id] = st.[object_id]
 AND sc.[name] = 'created_date'
);

The example below shows that the created_date column is missing from three tables. The generated ALTER statements can easily be reviewed before executing them.

Add a new column to all tables that don't have it currently

We could also easily modify the script to generate DROP COLUMN statements instead, for every table in which the column does exist.
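As a sketch, the reversed variant (generating DROP statements for every table that does have the column) might look like:

```sql
-- Sketch: same pattern flipped to generate DROP COLUMN statements
-- for every table that currently has the created_date column.
SELECT 'ALTER TABLE [' + ss.name + '].[' + st.name + '] DROP COLUMN [created_date];'
FROM sys.tables st
INNER JOIN sys.schemas ss ON st.[schema_id] = ss.[schema_id]
WHERE st.is_ms_shipped = 0
AND EXISTS
(
 SELECT 1
 FROM sys.columns sc
 WHERE sc.[object_id] = st.[object_id]
 AND sc.[name] = 'created_date'
);
```

As with the ADD version, review the generated statements before running them.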

Setting audit columns in Entity Framework Core when using the database first approach

It’s a very common use case for applications to have to track when records are created or modified and which user was responsible.

The steps below show how we can easily add createdBy, createdDate, changedBy and changedDate columns to our DB and populate them using Entity Framework Core.

Add audit columns to your tables

Adding the relevant columns to your tables is easy, BUT if you have a lot of tables it’s a bit of a pain.

I found the below script on Stack Exchange the other day which helps create the ALTER statements to add a column to any table which does not already have it.

In the below example the output shows that the created_date column is missing from three tables…

Add a new column to all tables that don't have it currently

Create an interface with C# properties to match DB columns

Interface with audit related properties
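A sketch of what that interface might look like. The property names are an assumption here, mirroring the audit columns as Pascal-case C# properties; match them to whatever the scaffolder actually generates for your column names.

```csharp
using System;

// Marker interface for entities carrying audit columns.
public interface IHasAuditProperties
{
    string CreatedBy { get; set; }
    DateTime CreatedDate { get; set; }
    string ChangedBy { get; set; }
    DateTime ChangedDate { get; set; }
}
```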

We’ll use this interface to mark entities as having these audit related properties so we can have Entity Framework Core set them later on…

Re-Scaffold and have all your classes implement the new interface

We need to re-scaffold to bring our new DB columns into our models. After this we could manually add : IHasAuditProperties to each EF-generated model class, BUT this means as soon as we scaffold again we will lose these custom changes.

A better option is to use partial classes. Using the below as an example C# will combine the manually created Address partial class in the ExtendedModels folder with the partial class Entity Framework Core has generated in the Models folder. When we need to re-scaffold, our partial classes will remain so we won’t lose our changes.

Use partial classes to implement an interface
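As a sketch, assuming the scaffolded Address entity already has the audit properties generated on it, the hand-written half of the partial class needs nothing but the interface declaration:

```csharp
// Lives in the ExtendedModels folder. The compiler combines this with
// the scaffolded partial Address class in the Models folder. Because
// the scaffolded half already declares CreatedBy, CreatedDate, etc.,
// the interface is satisfied without re-declaring anything here.
public partial class Address : IHasAuditProperties
{
}
```

Both halves must be in the same namespace for the compiler to merge them.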

The partial class approach above is nice, but if you have a lot of models it can be time consuming. Another approach is to change the format of the generated models which are output during scaffolding. We can do this easily using the excellent EF Core Power Tools extension.

When using this tool, we will scaffold using a graphical rather than a command line interface. One of the options in the tool is to customise the generated code using Handlebars templates. This gives us the ability to have all our entities implement an interface or inherit from a base class…

Inherit from a base class

For full steps involved see my previous article…

Make entities inherit from a base class in Entity Framework Core DB First – Dave Callan

Override SaveChanges() to set the values for audit related properties.

Finally we can set the audit properties by overriding SaveChanges/SaveChangesAsync in our own DbContext class…

Override SaveChanges
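A sketch of what the overrides might look like. Assumptions: StaticContext derives from the generated EFContext (itself a DbContext), and the current user name comes from somewhere in your app (here just a placeholder string).

```csharp
using System;
using System.Threading;
using System.Threading.Tasks;
using Microsoft.EntityFrameworkCore;

public partial class StaticContext
{
    public override int SaveChanges()
    {
        SetAuditProperties();
        return base.SaveChanges();
    }

    public override Task<int> SaveChangesAsync(
        CancellationToken cancellationToken = default)
    {
        SetAuditProperties();
        return base.SaveChangesAsync(cancellationToken);
    }

    private void SetAuditProperties()
    {
        var now = DateTime.UtcNow;
        var user = "system"; // placeholder: resolve the real user, e.g. via IHttpContextAccessor

        // Only entities marked with the audit interface are touched.
        foreach (var entry in ChangeTracker.Entries<IHasAuditProperties>())
        {
            if (entry.State == EntityState.Added)
            {
                entry.Entity.CreatedBy = user;
                entry.Entity.CreatedDate = now;
            }

            if (entry.State == EntityState.Added || entry.State == EntityState.Modified)
            {
                entry.Entity.ChangedBy = user;
                entry.Entity.ChangedDate = now;
            }
        }
    }
}
```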

If we place the above in our scaffolded DbContext we will lose these customisations the next time we scaffold. Therefore we should put them in a manually created DbContext class which inherits from the generated one…

Override DbContext

and our Startup.cs -> ConfigureServices method would look something like below….
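A sketch of what that registration might look like, assuming the derived StaticContext takes DbContextOptions&lt;EFContext&gt; in its constructor and the named connection string lives in appsettings.json:

```csharp
public void ConfigureServices(IServiceCollection services)
{
    // Register options for the generated context; StaticContext forwards
    // DbContextOptions<EFContext> to the base constructor, so a plain
    // scoped registration is enough to make it resolvable.
    services.AddDbContext<EFContext>(options =>
        options.UseSqlServer(Configuration.GetConnectionString("AdventureWorks")));
    services.AddScoped<StaticContext>();

    services.AddControllersWithViews();
}
```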


In our code we use instances of StaticContext and not EFContext…

Products Controller

What are Proximity Placement Groups and how to create them in the Azure Portal

From an architectural point of view, it’s recommended to place any VMs which need to communicate with each other in the same region and availability zone to lower latency. Note that lower latency is a side effect, not a design goal, of using AZs.

One further level of latency optimisation possible in Azure is called Proximity Placement Groups.

What are Proximity Placement Groups?

While availability zones allow resources to be placed in the same group of physical data centres, this could still mean they’re in different data centres, perhaps many kilometres apart…

Proximity Placement Groups ensure that resources are placed in the same data centre…

Availability Zones with PPGs

By using PPGs, the absolute physical distance between VMs, which is one of the main factors impacting network latency, can be reduced.

PPGs can benefit all multi-tiered apps but are ideal for very latency-sensitive applications such as those used in banking, trading, etc. When combined with accelerated networking, sub-millisecond latency between VMs can be achieved.

The data centre associated with a PPG is allocated when the first VM is added to the group. All subsequent VMs added to the PPG will get added to the same data centre as the first VM. A subsequent request for a different virtual machine SKU may fail since it’s not available in the data centre already selected. For this reason Microsoft recommends adding the most exotic SKU to your PPG first.

Creating Proximity Placement Groups in Azure Portal

Adding PPGs in the Azure portal is easy. Search for ‘proximity placement groups’, visit the default listing page and click new.

All that’s needed is a resource group, a region and a name. Obviously your region must be the same as the one where your virtual machines are located.

You can create PPGs via ARM templates, the CLI and PowerShell too.
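As a sketch with the Azure CLI (all resource names below are placeholders):

```shell
# Create a resource group and a proximity placement group in it
az group create --name rg-demo --location westeurope

az ppg create \
  --name ppg-demo \
  --resource-group rg-demo \
  --location westeurope \
  --type Standard

# Reference the PPG when creating a VM
az vm create \
  --name vm-demo \
  --resource-group rg-demo \
  --image Win2022Datacenter \
  --ppg ppg-demo \
  --admin-username azureuser
```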

Note that you need to create the PPG first using one of the above approaches before you can place a new or existing VM into it; there is no ability to create a new PPG inline (as you might with resource groups).

When creating a new VM, the PPG section is found near the end of the page on the Advanced tab.

When editing an existing VM the PPG section is found in Settings -> Configuration. Note the virtual machine needs to be stopped/deallocated to amend its PPG config.


Create Windows VM with accelerated networking – Azure PowerShell | Microsoft Docs

0.09 ms latency using Azure Proximity Placement Groups