How to have private constructors and private setters when using Entity Framework Core Database first

Having private constructors and private setters is one of the primary ways we can enforce encapsulation in our domain model.

Entity Framework Core can work fine with both private constructors and private setters. Developers using the code first approach can just set the access modifiers on their constructors and properties as they wish.

BUT…

To achieve something similar with the DB first workflow we need to customise the code that EF generates during the scaffolding process.

Thankfully this is easily done using the EF Core Power Tools Visual Studio extension. This extension uses the Scaffolding.Handlebars library to support customising the code generated during the scaffolding process. You don’t need the extension to use the library, but it’s a bit easier with it, and EF Core Power Tools is a great extension anyway, so I’ll use it in this post.

Customising Entity Framework scaffolded models

The steps are…

Install the extension via the Extensions -> Manage Extensions window.

Right-click on the relevant project and select EF Core Power Tools -> Reverse Engineer, first select which tables you want to scaffold, and then on the subsequent screen check the option for customising the code…

Customise EF Code with EF Core Power Tools

After running the scaffolding you should see entities generated AND a CodeTemplates folder which contains all the Handlebars template files for customising entities and the DbContext…

Code templates folder

The two files you need to edit are highlighted above.

Your edits will look similar to below…

Private constructor
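What those edits look like depends on your version of the templates, but the gist is to make the generated constructor private and the property setters private. Here is a sketch; the file names and {{placeholders}} are assumptions based on the library’s default templates, so check your own CodeTemplates folder…

```handlebars
{{!-- CodeTemplates/CSharpEntityType/Partials/Constructor.hbs --}}
{{!-- was: public {{class-name}}() --}}
private {{class-name}}()
{
}

{{!-- CodeTemplates/CSharpEntityType/Partials/Properties.hbs --}}
{{!-- was: public {{property-type}} {{property-name}} { get; set; } --}}
public {{property-type}} {{property-name}} { get; private set; }
```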

Extending Entity Framework generated models with partial classes

After changing the above files and scaffolding again, all your entities should now have private constructors and private property setters. To add public constructors and public methods to your entities, simply create partial classes for each entity.

Always use partial classes to extend your EF generated models. Never edit the generated models directly as you will lose your customisations each time you re-scaffold.

For example here’s a public constructor for creating an Address instance…

while here’s a public method which allows us to update the properties of an Address instance…
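As a sketch of how the two partial classes pair up (the property names here are illustrative assumptions, not necessarily the exact Address columns from the post):

```csharp
using System;

// Models/Address.cs - generated by EF Core, never edited directly
public partial class Address
{
    private Address() { }                              // EF Core can still materialise entities via this

    public int AddressId { get; private set; }
    public string AddressLine1 { get; private set; }
    public string City { get; private set; }
    public string PostalCode { get; private set; }
}

// ExtendedModels/Address.cs - hand written, survives re-scaffolding
public partial class Address
{
    public Address(string addressLine1, string city, string postalCode)
    {
        if (string.IsNullOrWhiteSpace(addressLine1))
            throw new ArgumentException("Address line is required", nameof(addressLine1));

        AddressLine1 = addressLine1;
        City = city;
        PostalCode = postalCode;
    }

    public void UpdateAddress(string addressLine1, string city, string postalCode)
    {
        if (string.IsNullOrWhiteSpace(addressLine1))
            throw new ArgumentException("Address line is required", nameof(addressLine1));

        AddressLine1 = addressLine1;
        City = city;
        PostalCode = postalCode;
    }
}
```

With this in place, `new Address(...)` and `UpdateAddress(...)` are the only ways to create or change an address; a line like `address.City = "Cork"` will not even compile.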

Now we can see that the only way to create an address is via the non-default constructor. This means we can ensure that an invalid object is never created.

Create an address

Similarly, when we want to update an address we cannot set properties individually as all the setters are private… we must go through the one centralised UpdateAddress method.

Update an address

Summary

You’ll always get better DDD support with the Entity Framework code first workflow, but for those of us who prefer the DB first approach, combining customisation of the EF generated code with partial classes is a nice technique to move a little closer to a DDD-based app.

How to run Entity Framework Core DB scaffolding from a .bat file in Visual Studio

It’s very handy to wrap the Entity Framework Scaffolding command in a bat file and run it on demand from within Visual Studio. Thankfully this is easy to do…

Installing required dependencies

Open up a command prompt window, navigate to your EF project folder and run the below commands to make sure you have the required dependencies…

dotnet tool install --global dotnet-ef
dotnet add package Microsoft.EntityFrameworkCore.Tools

There’s a good chance you’ll already have these installed.

Create a .bat file which calls the ef dbcontext scaffold command

Put a command similar to below in a .bat file and save it in whatever project your EF DbContext and Models are in.

dotnet ef dbcontext scaffold "name=AdventureWorks" Microsoft.EntityFrameworkCore.SqlServer --force --output-dir Models --context-dir Context --context EFContext --use-database-names

In the above example we have a named connection which references a connection in the appsettings.json file.

To understand what all the various parameters and flags mean visit the EF Core tools reference (.NET CLI) – EF Core page.


Executing the .bat file in Visual Studio

At this point you can just double click the file in the project folder and it will scaffold.

To run it from within Visual Studio we can install the Open Command Line extension. After installing the extension we can right click the .bat file and select ‘Execute File’…

Execute File

and we’ll see the CMD window popup…

CMD popup

‘∩╗┐dotnet’ is not recognized as an internal or external command

If you get this strange-looking error when trying to run the .bat file, it was probably saved with UTF-8 encoding (the odd characters are the UTF-8 byte order mark), which cmd.exe doesn’t support. To resolve, open the .bat file in Notepad and save it with ANSI encoding.

Save as ANSI encoding

Example of an Entity Framework Core Database First development workflow in a team environment

If you’re using Entity Framework Core with the Database First approach you may be looking for a suggested development workflow in terms of how changes to the DB can be managed in a team environment.

In this post I’ll outline the major elements of the workflow I tend to use on most EF projects. First, here’s two fundamental ground-rules to using this workflow…

DB is KING… C# models and DbContext should never be customised DIRECTLY

In a true Database first approach (as opposed to a one-time scaffold from an existing DB), the DB is the owner, or starting point, for the C# models which the Scaffold-DbContext command will create. This means your C# models should be completely rebuildable after any DB changes, so don’t customise the scaffolded models. The same applies to your DbContext class: don’t customise it directly.

To add custom code to your models use partial classes

Use partial classes to implement an interface

To add custom code to your DbContext, create a new one and inherit from the generated one

Override DbContext
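As a sketch of the inheritance approach (names are assumptions; this assumes the scaffolded context is called EFContext and, as is the default when scaffolding with a named connection, reads its connection string in OnConfiguring):

```csharp
using Microsoft.EntityFrameworkCore;

// Context/EFContext.cs - generated by EF Core scaffolding, never edited directly.
// (Shown here as a stub; the real one contains all the DbSets and model mappings.)
public partial class EFContext : DbContext
{
    public EFContext() { }

    protected override void OnConfiguring(DbContextOptionsBuilder optionsBuilder)
    {
        // the scaffolded named connection lives here
    }
}

// StaticContext.cs - hand written, so safe from re-scaffolding
public class StaticContext : EFContext
{
    // custom behaviour goes here, e.g. overriding SaveChanges,
    // adding query filters, helper methods, etc.
}
```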

Database objects are stored in a SQL Server DB Project and must be source controlled

Aside from never customising generated code, a second prerequisite or ground rule is that the database must be source controlled. We must store the definition of the DB in source control so other developers can pull it down and sync it to their local DBs.

Assuming we are using SQL Server we can create a representation of our DB in a SQL Server Database Project in Visual Studio and add it to our solution….

SQL Server Database Project

Suggested development workflow when DB schema changes are needed

OK, assuming we know never to change auto-generated EF Core code and that our database must be source controlled, let’s look at the day-to-day flow…

– 1 –
Developer gets latest and uses the DB project to push any changes into their local DB

Before starting on some new work, the developer needs to make sure they’ve got the latest code and database project downloaded from source control.

Using the Visual Studio schema comparison tool, they then compare the DB project to their local DB to see if any other developers have made changes. If they have, the differences will be listed in the comparison results and the developer can use the tool to push the updates (no scripts needed) to their local DB.

You don’t have to check the schema comparison files into source control (I prefer to), but it’s useful for each developer to at least save them locally so they can just double-click them to run comparisons.

Two comparison files will allow a developer to sync from their DB to the project and vice versa…

Using DB comparison tool in Visual Studio

– 2 –
Developer adds a new column to a table in their local DB

Let’s say we want to add a new column for website address to the customer table so we run the below…

ALTER TABLE [SalesLT].[Customer] ADD WebsiteAddress nvarchar(500) NULL;

– 3 –
Re-scaffold from the developer’s local DB and include the -Force parameter

The DB has been updated so we now need to update the C# models to match so we run something similar to below in the Package Manager Console…

Scaffold-DbContext "name=AdventureWorks" Microsoft.EntityFrameworkCore.SqlServer -Force -OutputDir Models -ContextDir Context -Context EFContext -UseDatabaseNames

or we can use the .NET CLI and wrap the scaffold command in a bat file.

The connection string is referenced from a named connection in appsettings.json and will point to the developer’s local DB. At this point we can’t scaffold directly from a DB project, but this functionality is supported in EF Core Power Tools if you’d like to check it out.

Note the inclusion of the -Force flag, which tells Entity Framework Core to overwrite existing files. In this case the previously generated Customer.cs file will be re-created and will now include a new string property called WebsiteAddress. The context file will also be re-created and will now include a mapping for WebsiteAddress.

Our pending items will look similar to…

Pending Changes after scaffolding

– 4 –
Developer codes as normal and repeats steps 2-4 until ready to check in.

At this point the DB and the C# models are in sync so the developer can just continue to develop as normal. When they are ready to check in they move to step 5…

– 5 –
Developer runs comparison tool in Visual Studio to compare their local DB to the DB project

The developer has finished the task, but if they check in only the C# code, the next developer to do a ‘Get Latest Version’ in Visual Studio will have problems. We need to check in the C# model changes and the corresponding DB changes together…

To update the DB project we can again use the comparison tool in Visual Studio. This time the source will be the developer’s local DB and the target will be the database project on the developer’s local machine.

Note the comparison tool uses the locally downloaded version of the project. It is not possible to compare directly against the latest version that’s in source control. This means it is very important to have the latest project downloaded locally.

After running the comparison tool the output will look a bit like the image below.

Comparison Tool Results

We can see the tool has found one change corresponding to the WebsiteAddress column added earlier. To push this change into the project we need to select the checkbox in the middle and click Update.

After the update completes the database project will have been updated and the pending changes window will now look similar to…

Pending Changes after updating db project

Note… if we added a new object as opposed to editing an existing one the DB Project file itself would also show in the pending items.

You don’t have to do this but before checking in I like to…

1) Re-run the comparison to make sure that, after updating the DB project, there are no differences left (as in this case), or only differences whose corresponding C# model code will not be checked in either.

and

2) Compare the local version of changed objects to the latest in source control just to make sure I can reconcile every change…

– 6 –
Check in the database project changes and model/DbContext changes TOGETHER

All that’s left is to check-in….

Final checkin

In our real apps we would of course have a lot of other file changes to check in as well such as those made to other C# files, .CSHTML files, JavaScript files etc.

Deploying changes to promoted environments such as DEV, UAT or PROD

How you deploy to forward environments will vary depending on your processes but for the DB part it will likely involve using the comparison tool to compare the DB project against the target DB and then building up deployment scripts.

I usually have comparison files similar to below saved and checked in to the DB project…

Because the target is a database (and not a DB project) the generate scripts button will be enabled…

Generate Scripts option

The script created should be used only as an aid and needs to be manually reviewed before execution.

Note… I recommend always using the DB project, and not an actual DB, as your single source of truth when building up rollout scripts, as this forces developers to check in their DB changes; otherwise their functionality won’t work on DEV, UAT and PROD.

Checking in DB changes is very important as DBs are rebuilt frequently, developers do stupid things to them, old redundant objects are often not dropped, etc. Having your objects checked in means you can always roll back to a previous version and understand why, or by whom, a particular change to an object was made.

Summary

I’ve used the above database first workflow in teams of 5-20 developers regularly over the last few years without major issues. The issues that do arise are usually due to developers not following the basics such as getting the latest code before starting on new tasks and checking in generated EF code and corresponding DB object changes together.

Let me know what you think…

Setting audit columns in Entity Framework Core when using the database first approach

It’s a very common use case for applications to have to track when records are created or modified and which user was responsible.

The steps below show how we can easily add createdBy, createdDate, changedBy and changedDate columns to our DB and populate them using Entity Framework Core.

Add audit columns to your tables

Adding the relevant columns to your tables is easy, BUT if you have a lot of tables it’s a bit of a pain.

I found the below script on Stack Exchange the other day which helps create the ALTER statements to add a column to any table which does not already have it.

In the below example the output shows that the created_date column is missing from three tables…

Add a new column to all tables that don't have it currently
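The script itself isn’t reproduced here, but a minimal sketch of the idea in T-SQL (the column name and type are assumptions) looks something like this…

```sql
-- Generate an ALTER statement for every table missing a created_date column
SELECT 'ALTER TABLE [' + s.name + '].[' + t.name + '] ADD created_date datetime2 NULL;'
FROM sys.tables AS t
INNER JOIN sys.schemas AS s ON s.schema_id = t.schema_id
WHERE NOT EXISTS (
    SELECT 1
    FROM sys.columns AS c
    WHERE c.object_id = t.object_id
      AND c.name = 'created_date'
);
```

Run it once per audit column, review the generated statements, then execute them against the DB.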

Create an interface with C# properties to match DB columns

Interface with audit related properties

We’ll use this interface to mark entities as having these audit related properties so we can have Entity Framework Core set them later on…
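A sketch of such an interface; the exact property names and nullability are assumptions and should match your DB columns…

```csharp
using System;

// Marker interface for entities with audit columns.
// CreatedBy/CreatedDate are set on insert, ChangedBy/ChangedDate on update.
public interface IHasAuditProperties
{
    string CreatedBy { get; set; }
    DateTime CreatedDate { get; set; }
    string ChangedBy { get; set; }
    DateTime? ChangedDate { get; set; }
}
```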

Re-Scaffold and have all your classes implement the new interface

We need to re-scaffold to bring our new DB columns into our models. After this we could manually add : IHasAuditProperties in each EF generated model class BUT this means as soon as we scaffold again we will lose these custom changes.

A better option is to use partial classes. Using the below as an example, C# will combine the manually created Address partial class in the ExtendedModels folder with the partial class Entity Framework Core has generated in the Models folder. When we need to re-scaffold, our partial classes will remain, so we won’t lose our changes.

Use partial classes to implement an interface
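A sketch of the pairing; the entity and property names are illustrative assumptions…

```csharp
using System;

// The audit interface from the previous step
public interface IHasAuditProperties
{
    string CreatedBy { get; set; }
    DateTime CreatedDate { get; set; }
    string ChangedBy { get; set; }
    DateTime? ChangedDate { get; set; }
}

// Models/Address.cs - generated partial class
// (the audit properties are scaffolded from the new DB columns)
public partial class Address
{
    public string CreatedBy { get; set; }
    public DateTime CreatedDate { get; set; }
    public string ChangedBy { get; set; }
    public DateTime? ChangedDate { get; set; }
}

// ExtendedModels/Address.cs - one line per entity, survives re-scaffolding
public partial class Address : IHasAuditProperties
{
}
```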

The partial class approach above is nice, but if you have a lot of models it can be time consuming. Another approach is to change the format of the generated models which are output during scaffolding. We can do this easily using the excellent EF Core Power Tools extension.

When using this tool, we will scaffold using a graphical rather than a command line interface. One of the options in the tool is to customise the generated code using Handlebar templates. This gives us the ability to have all our entities implement an interface or inherit from a base class…

Inherit from a base class

For full steps involved see my previous article…

Make entities inherit from a base class in Entity Framework Core DB First – Dave Callan

Override SaveChanges() to set the values for audit related properties.

Finally we can set the audit properties by overriding SaveChanges/SaveChangesAsync in our custom DbContext class…

Override SaveChanges

If we place the above in our scaffolded DbContext we will lose these customisations the next time we scaffold. Therefore we should put them in a manually created DbContext class which inherits from the generated one…

Override DbContext
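A sketch of what that class might look like. EFContext is the scaffolded context and IHasAuditProperties is the interface from earlier; how you obtain the current user name is app-specific, so the field below is a placeholder assumption:

```csharp
using System;
using System.Threading;
using System.Threading.Tasks;
using Microsoft.EntityFrameworkCore;

public class StaticContext : EFContext
{
    // Placeholder - in a real app inject something like IHttpContextAccessor
    // or a current-user service to resolve the user name.
    private readonly string _currentUser = "system";

    public override int SaveChanges()
    {
        SetAuditProperties();
        return base.SaveChanges();
    }

    public override Task<int> SaveChangesAsync(CancellationToken cancellationToken = default)
    {
        SetAuditProperties();
        return base.SaveChangesAsync(cancellationToken);
    }

    private void SetAuditProperties()
    {
        // Only entities implementing the marker interface are touched
        foreach (var entry in ChangeTracker.Entries<IHasAuditProperties>())
        {
            if (entry.State == EntityState.Added)
            {
                entry.Entity.CreatedBy = _currentUser;
                entry.Entity.CreatedDate = DateTime.UtcNow;
            }
            else if (entry.State == EntityState.Modified)
            {
                entry.Entity.ChangedBy = _currentUser;
                entry.Entity.ChangedDate = DateTime.UtcNow;
            }
        }
    }
}
```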

and our Startup.cs -> ConfigureServices method would look something like below….

ConfigureServices
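Roughly like this; the registration below assumes the scaffolded EFContext resolves its named connection in OnConfiguring, so no options need to be supplied here:

```csharp
// Startup.cs
public void ConfigureServices(IServiceCollection services)
{
    services.AddControllersWithViews();

    // Register the hand written context rather than the generated EFContext;
    // anything resolving a context now gets the audit behaviour for free.
    services.AddDbContext<StaticContext>();
}
```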

In our code we use instances of StaticContext and not EFContext…

Products Controller

How to turn off query tracking on a context level in Entity Framework Core

In Entity Framework Core we’re used to setting AsNoTracking() directly on our queries, but it’s possible to set this at a global context level too. We can do this from our Startup.cs -> ConfigureServices method as shown below…

Setting query tracking behaviour globally

or we can set it on the context itself…

Setting query tracking behaviour on the context
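In code, the two options look roughly like this (the context and connection names are assumptions):

```csharp
// 1) Globally, when registering the context in Startup.cs -> ConfigureServices
services.AddDbContext<EFContext>(options =>
    options.UseSqlServer(Configuration.GetConnectionString("AdventureWorks"))
           .UseQueryTrackingBehavior(QueryTrackingBehavior.NoTracking));

// 2) On the context instance itself
context.ChangeTracker.QueryTrackingBehavior = QueryTrackingBehavior.NoTracking;

// Individual queries can still opt back in with AsTracking()
var customers = context.Customers.AsTracking().ToList();
```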

Why would we turn off query tracking for a whole DbContext?

Mostly you wouldn’t want to do this, but there are a couple of use cases where it’s useful, such as when you have a read-only site and know for sure no entities will need updating.

It could also be useful in a CQRS architecture where you have two DbContexts: one optimised for reads (e.g. a product catalogue) with tracking turned off, and one optimised for writes with it turned on.