Entity Framework 7 bulk update performance benchmarks

Entity Framework 7 supports strongly typed bulk updates and deletes. This is a welcome addition, as the built-in way of updating or deleting many records was previously very inefficient.

This post will focus on bulk updates and use BenchmarkDotNet to show the potential performance benefit of the new ExecuteUpdate method in EF 7.

Entity Framework 6 ‘Bulk updates’

First, let’s see how Entity Framework 6 handles updates …

If we wanted to set the LastUpdated field on all blogs, for example, we’d need to use code like the below, where we first have to read all the blogs into memory …
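
Something along these lines (a minimal sketch; the BloggingContext and Blog types are assumed, with Blogs as a DbSet<Blog> and LastUpdated as a DateTime column):

```csharp
using Microsoft.EntityFrameworkCore;

// Read every blog into memory, mutate each tracked entity, then let
// change tracking generate an UPDATE per modified row on SaveChanges.
using var context = new BloggingContext();

var blogs = await context.Blogs.ToListAsync();

foreach (var blog in blogs)
{
    blog.LastUpdated = DateTime.UtcNow;
}

await context.SaveChangesAsync();
```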

and assuming we were using the SQL Server provider, had kept the default batch size of 42 and were updating 500 blog records, Entity Framework would produce SQL similar to the below …
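
Each batched roundtrip is a block of parameterised single-row updates, roughly of this shape (simplified; the exact column, key and parameter names depend on the model and provider version):

```sql
SET NOCOUNT ON;
UPDATE [Blogs] SET [LastUpdated] = @p0
WHERE [BlogId] = @p1;
SELECT @@ROWCOUNT;
UPDATE [Blogs] SET [LastUpdated] = @p2
WHERE [BlogId] = @p3;
SELECT @@ROWCOUNT;
-- ... and so on, up to 42 single-row UPDATEs per batch
```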

Click on the image for a larger view in a new window …


We can see that in total (apart from transaction-related roundtrips) Entity Framework has initiated 13 roundtrips.

Apart from the SELECT statement to read the blogs into memory, there are 12 update roundtrips for the 500 records: each batched roundtrip contains 42 UPDATE statements except the last, as 500 doesn’t divide evenly by 42 (11 full batches of 42 plus a final batch of 38).

This is very inefficient: there are many roundtrips, and each row is modified by its own UPDATE statement, each of which the database must execute separately.

Entity Framework 6 bulk update workaround via plain SQL

Unfortunately, some devs won’t realise that this is how EF works and will suffer performance issues because of it.

Thankfully, as a simple workaround, most devs will just drop to plain SQL and use ExecuteSql in Entity Framework …
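
For example, something like the below, where the interpolated value is sent as a parameter (ExecuteSqlInterpolatedAsync is the EF Core 6 name for this; EF 7 also exposes it as ExecuteSqlAsync):

```csharp
using Microsoft.EntityFrameworkCore;

var lastUpdated = DateTime.UtcNow;

// Single roundtrip: EF parameterises the interpolated value.
await context.Database.ExecuteSqlInterpolatedAsync(
    $"UPDATE Blogs SET LastUpdated = {lastUpdated}");
```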

or maybe switch to ADO.NET completely …
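
which might look something like this (connectionString is assumed to point at the same database):

```csharp
using Microsoft.Data.SqlClient;

await using var connection = new SqlConnection(connectionString);
await connection.OpenAsync();

// One command, one roundtrip, one set-based UPDATE.
await using var command = new SqlCommand(
    "UPDATE Blogs SET LastUpdated = @lastUpdated", connection);
command.Parameters.AddWithValue("@lastUpdated", DateTime.UtcNow);

await command.ExecuteNonQueryAsync();
```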

or perhaps Dapper …
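
which might look roughly like this (Dapper opens the closed connection for us):

```csharp
using Dapper;
using Microsoft.Data.SqlClient;

await using var connection = new SqlConnection(connectionString);

// Dapper parameterises the anonymous object and sends one UPDATE.
await connection.ExecuteAsync(
    "UPDATE Blogs SET LastUpdated = @LastUpdated",
    new { LastUpdated = DateTime.UtcNow });
```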

with all three approaches producing a similar single UPDATE statement that updates all blogs in one roundtrip with one command.

Depending on a number of factors, such as the number of records to update, the batch size and the network latency, the performance difference between the built-in EF 6 update approach and dropping to plain SQL can be huge.

Entity Framework 7 bulk update syntax

Here’s what the EF 7 bulk update looks like …
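
In its simplest form it’s a single call on the query (using the same illustrative context as above, shown here with the async variant):

```csharp
// One strongly typed call, translated to a single set-based UPDATE.
await context.Blogs.ExecuteUpdateAsync(
    setters => setters.SetProperty(b => b.LastUpdated, DateTime.UtcNow));
```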

and the SQL it produces is below …
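
On SQL Server it’s roughly the following single set-based statement (the exact alias and parameter name will vary):

```sql
UPDATE [b]
SET [b].[LastUpdated] = @__lastUpdated_0
FROM [Blogs] AS [b]
```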

again we have a single UPDATE statement, but this time behind a strongly typed API.

Entity Framework 7 bulk update performance benchmarks

Sticking with the example above of setting a DateTime column on blog records, I ran some benchmarks updating 42, 500 and 1000 records to see how much better the new ExecuteUpdate() method performs compared to the existing EF 6 way. I’ve also included two manual SQL approaches for comparison.
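
The benchmarks were structured roughly like this (a simplified sketch rather than the exact Gist code; BloggingContext and Blog are the same illustrative types as above, and the seeding is omitted):

```csharp
using System;
using System.Threading.Tasks;
using BenchmarkDotNet.Attributes;
using Microsoft.EntityFrameworkCore;

[MemoryDiagnoser]
public class BulkUpdateBenchmarks
{
    [Params(42, 500, 1000)]
    public int NumberOfRecords { get; set; }

    private BloggingContext _context = default!;

    [IterationSetup]
    public void Setup()
    {
        _context = new BloggingContext();
        // Re-seed NumberOfRecords blog rows here before each iteration.
    }

    // The EF 6 style: load, mutate, SaveChanges (one UPDATE per row).
    [Benchmark(Baseline = true)]
    public async Task SaveChanges_RowByRow()
    {
        var blogs = await _context.Blogs.ToListAsync();
        foreach (var blog in blogs)
        {
            blog.LastUpdated = DateTime.UtcNow;
        }
        await _context.SaveChangesAsync();
    }

    // The EF 7 bulk API: one set-based UPDATE.
    [Benchmark]
    public Task ExecuteUpdate_Bulk() =>
        _context.Blogs.ExecuteUpdateAsync(
            setters => setters.SetProperty(b => b.LastUpdated, DateTime.UtcNow));
}
```

The two manual SQL variants follow the same pattern, each issuing the single UPDATE statement shown earlier.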

The benchmark results are below.
Click on the image for a larger view in a new window …


We can see, underscored in yellow, that the non-bulk approach is significantly slower than the new EF 7 bulk update API. As the number of records goes up, the difference becomes larger due to more roundtrips and more individual UPDATE statements for the DB to execute.

Note that the DB used here is on localhost, so latency is minimal. It’s very likely that the difference between the new EF bulk approach and the existing non-bulk approach would be even larger if the DB were remote to the benchmark console app.

Should I use ExecuteUpdate() or plain SQL for bulk updates in Entity Framework 7?

We can see from the above that, overall, ADO.NET appears to be the fastest in this particular run for updating many rows, but the difference is marginal and there is a lot of variance between runs.

I’d definitely stick with ExecuteUpdate() rather than dropping to plain SQL, as it’s strongly typed and refactor-safe.

GitHub Gist to re-create benchmarks

If you want to see the specific benchmarks I ran, or want to try to recreate them, the code is available as a Gist on GitHub ->

Entity Framework BulkUpdate v SaveChange v Manual SQL benchmarks (github.com)
