Monday 20 November 2023

5 - Adding Blazor

So in the previous step 4 we upgraded our application to ASP.NET Core 3.1, but we still had no actual Blazor anywhere in the system. That's deliberate - I'd strongly suggest getting the ASP.NET Core upgrade working and stable before you try this step of adding Blazor. We were still encountering bugs for a couple of weeks after the upgrade, as users found things we'd missed in testing.

As the bugs were cleared out it was time to try to add Blazor to our existing project!

Server or WebAssembly?

At this point I needed to decide if we went down the Blazor WASM or Blazor Server-side route. The great thing about Blazor is that this choice is important but not irreversible. Blazor code works on both technologies equally well, and providing you handle obtaining resources from the server in the right way, it's not difficult to swap from one to the other.

Backward Compatibility

The big issue in this choice was keeping all the existing legacy application functionality working. Our migrated application uses MVC controllers and views, and the Blazor functionality needs to work alongside these as we gradually migrate away from them.


Blazor WASM is much more involved from a security perspective, as it involves changing from Cookie-based authentication to JWT token-based authentication with IdentityServer. I tried this for a few days, but getting the legacy MVC stuff to work in that context was a problem, so for now I decided to shelve this version and try server-side.

Server-Side Test

Server-side was much easier to integrate, as we could keep our Cookie-based authentication and not have to deal with IdentityServer just yet. I've also found server-side much easier to debug than WASM, although that has now improved a lot with the Blazor WASM 3.2 RTM release.

To figure out the steps to upgrade, my approach was to create a new clean Blazor Server-side project that included authentication, and then compare the changes required. For server-side these are fairly simple. I had to replicate our _Layout.cshtml content in MainLayout.razor and _Host.cshtml so we had a layout that looked the same as our MVC layouts.

Another wrinkle is that our ASP.NET Core MVC 404 page handler no longer works. Blazor is a single-page-application model, so it handles all routes that don't map to a static file or controller. This means it's the final handler for requests and the Blazor <NotFound> component in the <Router> ends up with these. I just replicated the status page in Blazor which was easy.
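As a sketch, the replicated status page can be rendered from the <NotFound> content in App.razor - here Status404 is a hypothetical component name standing in for our replicated MVC 404 page:

```razor
<Router AppAssembly="@typeof(Program).Assembly">
    <Found Context="routeData">
        <RouteView RouteData="@routeData" DefaultLayout="@typeof(MainLayout)" />
    </Found>
    <NotFound>
        <LayoutView Layout="@typeof(MainLayout)">
            @* hypothetical component replicating the old MVC 404 status page *@
            <Status404 />
        </LayoutView>
    </NotFound>
</Router>
```

This is the shape of the ASP.NET Core 3.1 Blazor Server template's router; any route that no controller or static file claims falls through to the <NotFound> branch.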

Wednesday 9 September 2020

FAQ - Why isn't my component updating when I call StateHasChanged ?

Blazor updates the UI using a diffing algorithm, in the same way as other SPA frameworks like Angular, React and Vue.

This means that it does not re-render an entire HTML page when something changes, but re-renders individual components in as efficient a way as possible.


Let's assume we have a page structured like this, with Component A at the root, Components B and D as its children, and Component C a child of B:

    Component A
    ├── Component B
    │   └── Component C
    └── Component D

Let's assume that some change takes place in Component B which causes a re-rendering. This means that the rendering process has to call the child components as well, so Component C will also re-render.

However, re-renders flow down, not up. Component A is unaffected as it's a parent. Component D is also unaffected, since it's a child of A, not B.

How to Change This?

In most cases this is perfectly fine, but there are situations where the changes in one component need to be reflected in another. A good example here is a shopping cart model.

Let's assume the diagram above is a shopping app. Component B is the list of products. Component D is the shopping cart, maybe at the top right of the page.

A user can click 'Add to cart' next to a product in B. This should add to the cart, which should then update to show the number of items and total cost.

Since the two components are separate, we need to use some form of inter-component communication. In Blazor there are several ways to do this, and Chris Sainty's excellent blog article provides some details on how this can be done. To summarise, there are three ways:


Events

Component B can raise an event when a product within it is added to the cart. The parent Component A would then have to wire up each of these events and forward them to D (the cart) so it can update and call StateHasChanged(). This requires a lot of wiring code to make things work, resulting in tightly-coupled components. Imagine if the product list was Component C - then C would raise the event, B would have to handle it and pass it up to A, which then informs D that a new item was added.

I would argue it's not a good approach to solving this problem.
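To illustrate why this wiring gets verbose, here's a minimal sketch of the event-chaining approach (the component and type names are illustrative, and the two components would live in separate .razor files in practice):

```razor
@* Component B (ProductList.razor): exposes an EventCallback the parent must wire up *@
@foreach (var product in Products)
{
    <button @onclick="() => OnAddToCart.InvokeAsync(product)">Add @product.Name</button>
}

@code {
    [Parameter] public List<Product> Products { get; set; }
    [Parameter] public EventCallback<Product> OnAddToCart { get; set; }
}

@* Component A: handles B's event and forwards it to the cart component D *@
<ProductList Products="products" OnAddToCart="HandleAdd" />
<CartSummary @ref="cart" />

@code {
    CartSummary cart;
    List<Product> products = new List<Product>();

    // A has to know about both B's event and D's update method
    Task HandleAdd(Product p) => cart.AddItem(p);
}
```

Every extra level of nesting adds another layer of forwarding, which is exactly the tight coupling described above.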

Cascading Parameters

Blazor supports Cascading Parameters, where a value (or object) in a higher-level component can be referenced in a child component. This doesn't need any intermediate component to know about or pass along the value. In our shopping cart example, we could create a <CascadingValue Value="Cart"> in Component A. This can then be accessed in Components B, C and D if they choose, by using the [CascadingParameter] attribute.

The product page would call the Cart.AddProduct() method. The cart UI (Component D) also references the Cart via the [CascadingParameter] attribute and subscribes to an event the cart raises when it changes. The handler can then call StateHasChanged() so Component D re-renders.
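A minimal sketch of this, assuming a ShoppingCart class with an AddProduct method and a CartChanged event (names illustrative, not the article's actual code):

```razor
@* Component A: make the cart available to all descendants *@
<CascadingValue Value="Cart">
    <ProductList />
    <CartSummary />
</CascadingValue>

@code {
    ShoppingCart Cart { get; set; } = new ShoppingCart();
}

@* CartSummary (Component D): receive the cascaded cart and re-render on changes *@
<span>@Cart.Items.Count items</span>

@code {
    [CascadingParameter] ShoppingCart Cart { get; set; }

    protected override void OnInitialized()
        => Cart.CartChanged += () => InvokeAsync(StateHasChanged);
}
```

Note that neither B nor D needs A to forward anything - they each pick up the same cart instance directly.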

State Containers

State containers take the same approach as the cascading parameters discussed above, but use Dependency Injection to obtain an instance of the container instead.

In our Shopping Cart example we'd create a Cart instance and in both the product list and the cart UI component D, we'd inject it:

@inject ShoppingCart Cart;

The product list then calls Cart.AddProduct() and the cart UI component D listens for an event CartChanged, to trigger an update.
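A minimal sketch of such a container - the class shape and event name here are assumptions for illustration:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

// registered once in Startup.ConfigureServices, e.g. for Blazor Server:
//   services.AddScoped<ShoppingCart>();
public class ShoppingCart
{
    private readonly List<(string Name, decimal Price)> _items =
        new List<(string, decimal)>();

    public IReadOnlyList<(string Name, decimal Price)> Items => _items;
    public decimal Total => _items.Sum(i => i.Price);

    // the cart UI subscribes to this and calls StateHasChanged() when it fires
    public event Action CartChanged;

    public void AddProduct(string name, decimal price)
    {
        _items.Add((name, price));
        CartChanged?.Invoke();
    }
}
```

For Blazor Server you'd normally register this as scoped (one cart per user circuit) rather than singleton, otherwise all users would share one cart.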

Thursday 30 July 2020

Cool Blazor Snippets

I've started to create a set of useful Blazor component snippets I've been using on an application I've been developing. These should be useful in a stand-alone context. I'll add more as I create them.

- Hides the content in production but renders it in development mode.
- Renders a small coloured badge top right for Bootstrap breakpoints; wrapped in the DevOnly component.
- Adapted from FlightFinder - creates a greyed-out content area when a flag is set.
- Hides the content when a value is null and shows a loading screen.

Friday 22 May 2020

4 - ASP.NET to ASP.NET Core

What I thought would be a fairly straightforward change was upgrading from ASP.NET (framework) to ASP.NET Core. It actually took me a month to migrate our application.

Identity to Identity Core

I knew there would be authentication issues since ASP.NET Identity (which we used on our ASP.NET app) is a .NET Framework-only library.

Microsoft rewrote the library for .NET Standard/Core as ASP.NET Core Identity - this is also the version used by the Blazor server-side template if you opt for the built-in authentication. However, the database schema used by the Core version is modified, so you need to upgrade it. I was able to create a version of the database that included the new fields required for Core but would also work with the legacy version. This meant I could test the new app against a copy of the live security database, to ensure that when we migrated the live application I could update the database and go.

It also meant that if disaster struck we could revert the live app back to using the .NET Framework version without having to reverse the database changes. Might be worth a separate article on this alone.

DI Model

One of the key differences between ASP.NET Framework and Core is the use of the services model and Dependency Injection (DI). Our old app had a number of 'techniques' (a polite word for hacks) for getting services, so a chunk of time was required to restructure controllers, services and code to work with the new approach. It's worth it though: the DI model is much simpler and makes unit testing much easier too.
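For illustration, here is the sort of restructuring involved - the service names and the ServiceLocator class are hypothetical stand-ins for our old hacks, not real APIs:

```csharp
// Before: a hand-rolled service-locator 'technique' buried in the controller
public class LegacyEmailController : Controller
{
    private readonly IEmailService _email = ServiceLocator.Get<IEmailService>();
}

// After: the service is registered in Startup.ConfigureServices...
//   services.AddScoped<IEmailService, EmailService>();
// ...and constructor-injected, which also makes the controller trivially
// unit-testable by passing in a mock IEmailService
public class EmailController : Controller
{
    private readonly IEmailService _email;

    public EmailController(IEmailService email) => _email = email;
}
```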

JSON Serialization

Something I hadn't anticipated was that System.Text.Json is used as the Web API serializer by default in ASP.NET Core, and this caused a lot of problems with API calls from the client JavaScript.

Initially it was behaviours like camel-casing serialized names by default, e.g. a C# property "TestThis" is serialized as "testThis". This caused a number of API calls to fail, as the client-side deserialization expected the C# naming style and no longer mapped the properties.
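To see the behaviour in isolation, this is the camel-case naming policy that ASP.NET Core applies for web serialization by default:

```csharp
using System;
using System.Text.Json;

class Program
{
    static void Main()
    {
        // the web-default naming policy that tripped up our client code
        var options = new JsonSerializerOptions
        {
            PropertyNamingPolicy = JsonNamingPolicy.CamelCase
        };
        Console.WriteLine(JsonSerializer.Serialize(new { TestThis = 1 }, options));
        // {"testThis":1}
    }
}
```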

Then other differences with serialization and deserialization of JSON kept cropping up - so eventually I gave up trying to fix lots of code and switched back to Newtonsoft JSON (the AddNewtonsoftJson extension comes from the Microsoft.AspNetCore.Mvc.NewtonsoftJson package; the AddControllers() call here is the standard 3.1 setup this chains onto):

      services.AddControllers()
          .AddNewtonsoftJson(o =>
          {
              // revert to original naming when serializing
              o.SerializerSettings.ContractResolver = new DefaultContractResolver();
          });

Resources and Scripts

In the ASP.NET Core model the static resources such as CSS, JavaScript and Images are all located in a new folder wwwroot. I took the upgrade as an opportunity to move our client-side resources such as Bootstrap, JQuery etc. to be loaded using LibMan. I'm not a fan of NPM and using that would probably also mean we'd need to implement build scripts such as Gulp or WebPack to copy files around. LibMan avoids this by allowing you to specify the libraries and even specific files and their destination.
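A minimal libman.json sketch - the library versions and destination paths here are illustrative:

```json
{
  "version": "1.0",
  "defaultProvider": "cdnjs",
  "libraries": [
    {
      "library": "jquery@3.5.1",
      "destination": "wwwroot/lib/jquery",
      "files": [ "jquery.min.js" ]
    },
    {
      "library": "twitter-bootstrap@4.5.0",
      "destination": "wwwroot/lib/bootstrap"
    }
  ]
}
```

Visual Studio restores these on build, copying just the listed files straight into wwwroot with no Gulp or WebPack step needed.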
Our own scripts were located in a /Scripts folder as TypeScript, so I had to reconfigure this to output the resulting JavaScript into the wwwroot folder. However, when debugging we had an issue: the source TypeScript is only present in the /Scripts folder, which isn't part of the published application. I fixed that by amending the .csproj file with this section:
  <!-- copies TS source to wwwroot (only for Debug build) -->
  <!-- source: -->
  <Target Name="CopyTsToWwwRoot" BeforeTargets="Build" Condition="'$(Configuration)'=='Debug'">
    <Message Text="Copying scripts/ts to wwwroot" />
    <ItemGroup>
      <SourceTs Include="$(MSBuildProjectDirectory)\Scripts\**\*.ts" />
    </ItemGroup>
    <Copy SourceFiles="@(SourceTs)" DestinationFiles="@(SourceTs -> '$(MSBuildProjectDirectory)\wwwroot\scripts\%(RecursiveDir)%(Filename)%(Extension)')" />
  </Target>

This target copies the .ts files into the wwwroot folder, retaining the folder structure, so the source files are available when debugging.

The only thing I wasn't able to handle with LibMan was loading TypeScript type definitions, so I had to use NPM for these. However, they are development-only (needed to enable TypeScript compilation), so we didn't have to copy the files into wwwroot, and I was still able to avoid implementing WebPack etc.

Web Jobs

Our app runs on Azure and we also hosted a number of background WebJobs to perform various long-running tasks and process Azure queues. Although we could have retained these, I wanted to bring them into the web application using the IHostedService support in ASP.NET Core 3.1.

This is actually much better than WebJobs since the code isn't located in the AppData folder and is easier to set up and control. It shares the same IConfiguration system as the main web application as well. It's not strictly required for upgrading to using Blazor but I'd strongly recommend using this service over WebJobs.
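A sketch of the pattern, using the BackgroundService base class - the service name and work loop here are illustrative:

```csharp
using System;
using System.Threading;
using System.Threading.Tasks;
using Microsoft.Extensions.Hosting;

// replaces a continuous WebJob; registered in Startup.ConfigureServices with:
//   services.AddHostedService<QueueProcessorService>();
public class QueueProcessorService : BackgroundService
{
    protected override async Task ExecuteAsync(CancellationToken stoppingToken)
    {
        // runs inside the web app's host, sharing its IConfiguration and DI
        while (!stoppingToken.IsCancellationRequested)
        {
            // ... poll the Azure queue and process any messages ...
            await Task.Delay(TimeSpan.FromSeconds(30), stoppingToken);
        }
    }
}
```

The host starts and stops the service with the web app, so there's no separate AppData deployment to manage.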

3 - Build Process

Hopefully you have an automated build process!

Azure DevOps, GitHub and many other sources provide this as a service and once you've converted to using it you'll never go back. We use Azure DevOps for our code and CI/CD capabilities so it was pretty easy to upgrade.

Most of our libraries were written a while ago and used the "classic" build tool, which had a GUI interface and helped you create the build pipeline. However, it was really tedious to use on lots of projects, as you had to manually re-create the same process over and over for each one.

The newest hotness is the Azure Pipelines YAML support - you can now add a build file to source control and then configure a build pipeline to use this. The build pipeline is now part of the source control process and you can edit/upgrade/copy these more easily.

As most of my libraries follow a standard format I re-used the same .YML file in each project, just editing a single line to specify which project I wanted to package into a Nuget library.

Once I'd pushed the repository up to Azure DevOps, I just created a new pipeline and pointed it at the YML file in the repo.

# dotnet standard library build-test-pack-nugetpush

# This build is triggered if changes are pushed to the master branch
trigger:
- master

# Use the 'Default' agent pool (our own hosted agents)
# if you want to use Azure DevOps built-in agents, change to
#   vmImage: 'ubuntu-latest'
# or
#   vmImage: 'windows-latest'
# since this project is .NET Standard/.NET Core, either will work;
# if it still has .NET Framework, use the windows version
pool:
  name: Default

variables:
  solution: '**/*.sln'
  buildPlatform: 'Any CPU'
  buildConfiguration: 'Release'
  # this is used to signal which project is to be packaged into a Nuget package
  project: '[project-folder]/[project-name].csproj'

steps:

# ensure the latest 3.1.x SDK is loaded for the build
- task: UseDotNet@2
  displayName: 'Load SDK using 3.1.x'
  inputs:
    version: 3.1.x

# restore packages, and include any in our internal feed
- task: DotNetCoreCLI@2
  displayName: 'Restore packages'
  inputs:
    command: restore
    vstsFeed: '[internal-feed-ID]'

- task: DotNetCoreCLI@2
  displayName: 'Build solution'
  inputs:
    command: build
    projects: '$(solution)'
    arguments: '--configuration $(BuildConfiguration)'

- task: DotNetCoreCLI@2
  displayName: 'Run unit tests'
  inputs:
    command: test
    projects: '$(solution)'
    arguments: '--configuration $(BuildConfiguration)'

# this is a VSTS task I created to set environment variables from the project
# version and append the build number (it's in the marketplace)
# the old NUGET PACK used to allow wildcards e.g. 1.2.* but "dotnet pack" does not,
# e.g. if the project is version 1.2.3 this sets VERSION_BUILD to 1.2.3.[buildNo]
# Note: this uses PowerShell so won't work for ubuntu agents
- task: conficient.VersionReaderTask.version-reader-build-task.VersionReaderTask@1
  displayName: 'Generate build variables'
  inputs:
    searchPattern: '$(project)'
    buildPrefix: '.'

# package the project and version using the environment variable
- task: DotNetCoreCLI@2
  displayName: 'Pack'
  inputs:
    command: pack
    packagesToPack: '$(project)'
    versioningScheme: byEnvVar
    versionEnvVar: 'VERSION_BUILD'

# push the .nupkg packages to our internal Nuget feed
- task: DotNetCoreCLI@2
  displayName: 'Push'
  inputs:
    command: push
    packagesToPush: '$(Build.ArtifactStagingDirectory)/*.nupkg'
    nuGetFeedType: 'internal'
    publishVstsFeed: '[internal-feed-ID]'  # change this to your feed

I've commented the pipeline to help you understand what each part does.

2 - Nuget Packages

Nuget Packages

The second big issue with moving to .NET Standard is that any Nuget packages you consume need to have .NET Standard support. Any Framework-only packages have to be replaced with something equivalent.

Common packages such as Newtonsoft.JSON already do this, so in many cases you won't need to do anything. Other packages might have a .NET Standard version under a different name. Some packages just won't have one, and you'll need to find alternatives - or perhaps lobby the package owner to create a .NET Standard version.

We used iTextSharp 4.x in our application, which was a .NET Framework-only package. There was an iTextSharp 7 release that did support .NET Standard, but it was now a chargeable package and had a number of API changes from the 4.x version, which would have meant rewriting a lot of our PDF generation library to work with it.

A search led us to iTextSharp.LGPLv2.Core which was a .NET Standard port of the 4.x version - it was totally compatible with our existing code, and still under a free licence, so we decided to use this.

Not all .NET Framework functionality is present in .NET Standard 2.0, but there are often Microsoft Nuget packages to cover the gaps. An example is System.Drawing - much of the Framework version used the Windows GDI libraries, so it wasn't going to work in a portable framework like .NET Core/Standard, but the System.Drawing.Common package provides a large part of the API.

Downstream Issues

One aspect of changing the package dependencies on upstream libraries is that every downstream library then has to use the same version as well. 

In .NET Framework I often had to include packages used by upstream libraries directly in downstream projects, because transitive dependencies never really worked very well in .NET Framework - and that leads on to the mess that is Binding Redirects, trying to resolve the resulting version conflicts.

The good news is that this is pretty much fixed in .NET Standard and .NET Core - a downstream app can use an upstream library and the dependencies will be included. You only need to import the packages directly if the app uses the library directly itself.

As part of my upgrade to .NET Standard I had the very satisfying task of deleting a lot of app.config files from the libraries.

Friday 6 March 2020

1 - Migrating to .NET Standard

.NET Standard 2.0

Blazor client apps and libraries need to reference .NET Standard 2.0 code - and more recently they have been based on .NET Standard 2.1.

.NET Standard 2.0 is fine, as you can consume code targeting it from .NET Framework applications of version 4.7.2 or later, but this isn't the case for .NET Standard 2.1 - that only works with .NET Core 3.0 onward. You can see the compatibility table in Microsoft's .NET Standard documentation.

So as a migration strategy, I recommend targeting 2.0 for non-Blazor libraries during the migration process.

Migration Strategy

Existing .NET Framework applications are rarely amenable to a big-bang approach, where you upgrade everything to .NET Standard 'overnight' and the application is now .NET Standard/Core. They usually comprise multiple modules and would take several weeks or even months to update. So does this mean it's impossible?

Well, my approach during 2018 and 2019, while Blazor was under development, was to plan a migration of all our libraries to .NET Standard.

The first thing was to write all new libraries in .NET Standard 2.0 where possible. This ensures the migration workload doesn't keep growing as we go forward.

Next was to gradually migrate lower-level libraries to .NET Standard 2.0 as and when we could.


While .NET Standard 2.0 claims compatibility with .NET Framework 4.6.1, note the asterisk in the compatibility table: there are lots of problems you'll encounter if you're using anything less than .NET Framework 4.7.2 (I did!).

Migrating to .NET Standard

Our first hurdle was that we needed to modify all our own libraries from .NET Framework 4.7.2 to use .NET Standard 2.0 instead.

As .NET Standard 2.0 is very closely aligned with the Framework functionality, there should be relatively few code changes required, but anything platform specific has to be avoided.

This might sound like a simple task but remember that a .NET Standard library can only reference other .NET Standard libraries - so you have to start at the bottom of the dependency tree and work your way up. If you have a small number of packages this should not be an issue, but it can take a lot of work for larger applications.


I realised writing this post that I'm making some big assumptions here, and if these don't apply you're probably not in a good place to undergo this process.

The first is that everything is under source control. Git, AzureDevOps, VSTS, (SourceSafe even!) whatever - if you don't have a way to revert stuff you're going to have problems. I'm not expecting anyone reading this type of article not to have source control, but it's worth pointing out.

Second is that you have reasonably comprehensive unit tests in some form. If your app is quite large (mine has over fifty different libraries included in the main web app) then this is a must. You're about to make a lot of major changes and you need to know that each component you migrate to .NET Standard will still work correctly. I actually upgraded my unit tests to .NET Core 3.0 as part of the process but you can keep them on .NET Framework during the transition to give you confidence that you're not going to break things.


I decided to do a step-by-step approach by initially updating the lowest level utility libraries to .NET Standard 2.0, and leaving the higher up libraries in .NET Framework for now. This is where I ran into our first issues.

Microsoft's compatibility matrix states that .NET Framework 4.6.1 implements .NET Standard 2.0, but note the footnote:

...there are several issues with consuming .NET Standard libraries that were built for those versions from .NET Framework 4.6.1 projects. For .NET Framework projects that need to use such libraries, we recommend that you upgrade the project to target .NET Framework 4.7.2 or higher.

So you really need your application upgraded to at least .NET Framework 4.7.2 to be sure it can consume .NET Standard libraries without a lot of issues. I did encounter these issues, and upgraded our app to use 4.7.2.

So if you're not on at least 4.7.2 you need to first do an upgrade to that, and ensure everything works on this version first, before attempting the move to .NET Standard.

References and Project Formats

The other changes that are not so obvious are that .NET Standard and .NET Core use the newer, leaner "2017" project format. In addition, the project Nuget references no longer use a packages.config file but have the references embedded in the project file. Both of these changes need to be implemented to make migrating to .NET Standard possible.

References Upgrade

If you've still got a packages.config, the first step is to upgrade to PackageReference format. The latest versions of Visual Studio support this as a simple click-to-migrate operation; see Microsoft's documentation for instructions. Once this is done, the packages folder is gone, the packages.config files are removed, and the project files contain the package reference list.

Project Format Upgrade

The second step is to upgrade the project format to the 2017 format version. This is a much leaner, simpler format and is able to host most types of library, but not all. For example, a .NET Framework 4.7.2 class library can use the 2017 format, but an ASP.NET web application cannot.

However in this situation you're upgrading libraries so that should not be an issue.

Upgrading is not a simple process, but there is an excellent tool that does most of the work. It's installed as a dotnet global tool, and you run the upgrade from the command line:

dotnet migrate-2019 wizard [solutionfilename.sln] 

You can optionally choose to migrate on a project-by-project basis if you wish.

The tool converts the content of the .csproj file to the 2017 format, including the PackageReference settings. It also copies values out of the AssemblyInfo.cs file in the Properties folder. That file is retained with only a couple of remaining values, ComVisible and Guid - if you don't need these you can remove it completely.

Now the projects are upgraded you can open and edit the .csproj file by double-clicking it. You'll find the files much smaller as the new format uses an include-by-default approach for files, so it no longer needs to list each file in the project. This really helps with source control conflicts when different people work on the same library!
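For comparison, a complete 2017-format project file for a .NET Standard library can be as small as this (the package name and version are illustrative):

```xml
<Project Sdk="Microsoft.NET.Sdk">

  <PropertyGroup>
    <TargetFramework>netstandard2.0</TargetFramework>
  </PropertyGroup>

  <!-- package references now live here, not in packages.config -->
  <ItemGroup>
    <PackageReference Include="Newtonsoft.Json" Version="12.0.3" />
  </ItemGroup>

</Project>
```

All .cs files under the project folder are included automatically, which is why there's no file list to fight over in source control.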

You can also delete a lot of the cruft that gets copied over. Nate McMaster has an excellent article that covers the migration project files in much more detail.


If you had a .nuspec file in a project to define NuGet package details, this can also be dispensed with - the 2017 format supports all the values you need, and you can use dotnet pack to create a package. I will discuss the NuGet package process for build servers in a later post.
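The .nuspec values simply move into a PropertyGroup in the project file, e.g. (all values here are illustrative):

```xml
<PropertyGroup>
  <TargetFramework>netstandard2.0</TargetFramework>
  <PackageId>MyCompany.Utilities</PackageId>
  <Version>1.2.3</Version>
  <Authors>My Company</Authors>
  <Description>Shared utility library</Description>
</PropertyGroup>
```

Running dotnet pack against this project then produces the .nupkg directly, no separate .nuspec required.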
