The Blazor Blog!
The Blazor Blog is my new blog, specifically covering Blazor-related issues as I develop applications in this new technology from Microsoft.
Monday 20 November 2023
5 - Adding Blazor
Wednesday 9 September 2020
FAQ - Why isn't my component updating when I call StateHasChanged()?
Blazor updates the UI using a diffing algorithm in the same way as other SPAs like Angular, React and Vue.
This means that it does not re-render an entire HTML page when something changes, but re-renders individual components in as efficient a way as possible.
Example
Let's assume we have a page structured like this: Component A at the top, with two children, Component B and Component D, and with Component C a child of Component B.
Let's assume that some change takes place in Component B which causes a re-rendering. This means that the rendering process has to call the child components as well, so Component C will also re-render.
However, re-renders flow down, not up. Component A is unaffected as it's a parent. Component D is also unaffected, since it's a child of A, not B.
How to Change This?
In most cases this is perfectly fine, but there are situations where the changes in one component need to be reflected in another. A good example here is a shopping cart model.
Let's assume the diagram above is a shopping app. Component B is the list of products. Component D is the shopping cart, maybe at the top right of the page.
A user can click 'Add to cart' next to a product in B. This should add to the cart, which should then update to show the number of items and total cost.
Since the two components are separate, we need to use some form of inter-component communication. In Blazor there are several ways to do this, and Chris Sainty's excellent blog article provides some details on how this can be done. To summarise, there are three ways:
Events
Component B can raise an event when a product within the component is added to the cart. The parent Component A would then have to wire up to each of these events and forward them to D (the cart) so it can update, and then call StateHasChanged(). This isn't a good way since it requires wiring code to make things work, resulting in a lot of tightly-coupled code. Imagine if the product list was Component C - then we'd have to raise the event to add in C, wire up the event in B, pass the event up to A, which then informs D that a new item was added.
I would argue it's not a good approach to solving this problem.
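As a rough sketch of the wiring involved (the component names follow the example above, but the Product type and handler names are my own illustration, not from a real app):

```razor
@* ProductList.razor (Component B) - raises an event when a product is added *@
<button @onclick="() => OnProductAdded.InvokeAsync(Product)">Add to cart</button>

@code {
    [Parameter] public Product Product { get; set; }
    [Parameter] public EventCallback<Product> OnProductAdded { get; set; }
}

@* In the parent (Component A) the event then has to be wired up and
   forwarded on to the cart component, e.g.
   <ProductList Product="product" OnProductAdded="HandleProductAdded" /> *@
```

Every intermediate component needs this kind of plumbing, which is exactly the tight coupling described above.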
Cascading Parameters
Blazor supports Cascading Parameters, where a value (or object) in a higher-level component can be referenced in a child component. This doesn't need any intermediate component to know about or pass along the value. In our shopping cart example, we could create a <CascadingValue Value="Cart"> in Component A. This can then be accessed in Components B, C and D if they choose, by using the [CascadingParameter] attribute.
The product page would call the Cart.AddProduct() method. The cart UI, Component D, also references the Cart using the [CascadingParameter] attribute, and subscribes to an event raised by the cart when it changes. It can then call StateHasChanged() for Component D.
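A minimal sketch of this, assuming a ShoppingCart class with a CartChanged event (the class and event names come from the surrounding text, but the exact API shape is my assumption):

```razor
@* Component A - owns the cart and cascades it to all descendants *@
<CascadingValue Value="Cart">
    <ProductList />
    <CartSummary />
</CascadingValue>

@code {
    ShoppingCart Cart { get; } = new ShoppingCart();
}

@* CartSummary.razor (Component D) - receives the cascaded cart *@
@code {
    [CascadingParameter] public ShoppingCart Cart { get; set; }

    protected override void OnInitialized()
    {
        // re-render this component whenever the cart raises its change event
        Cart.CartChanged += (s, e) => InvokeAsync(StateHasChanged);
    }
}
```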
State Containers
State containers take the same approach as the cascading parameters discussed above, but we use Dependency Injection to obtain an instance of the state container.
In our shopping cart example, we'd create a Cart instance, and in both the product list and the cart UI (Component D) we'd inject it:
@inject ShoppingCart Cart;
The product list then calls Cart.AddProduct(), and the cart UI component D listens for a CartChanged event to trigger an update.
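A minimal sketch of such a state container (AddProduct and CartChanged are the members mentioned above; the Product type and list are illustrative):

```csharp
using System;
using System.Collections.Generic;

// A minimal shopping cart state container - a sketch, not the post's actual code
public class ShoppingCart
{
    private readonly List<Product> _items = new List<Product>();

    // raised whenever the cart contents change
    public event EventHandler CartChanged;

    public IReadOnlyList<Product> Items => _items;

    public void AddProduct(Product product)
    {
        _items.Add(product);
        CartChanged?.Invoke(this, EventArgs.Empty);
    }
}
```

It would be registered in Startup.ConfigureServices with services.AddScoped<ShoppingCart>(), so in a server-side app each user circuit gets its own cart, and any component that injects it can subscribe to CartChanged and call StateHasChanged.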
Thursday 30 July 2020
Cool Blazor Snippets
- DevOnly - Hides the content in production but renders it in development mode.
- SizeCheck - Renders a small coloured badge top right for Bootstrap breakpoints. Wrapped in the DevOnly component.
- GreyOutZone - Adapted from FlightFinder - this creates a greyed-out content area when the flag is set.
- Loading - Hides the content when a value is null and shows a loading screen.
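To give a flavour of the idea, a DevOnly component could be sketched like this (this is my guess at the shape based on the description above, not the post's actual snippet):

```razor
@* DevOnly.razor - renders its children only in the Development environment *@
@inject IWebHostEnvironment Env

@if (Env.IsDevelopment())
{
    @ChildContent
}

@code {
    [Parameter] public RenderFragment ChildContent { get; set; }
}
```

It would then be used as a wrapper, e.g. <DevOnly><p>debug info here</p></DevOnly>.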
Friday 22 May 2020
4 - ASP.NET to ASP.NET Core
Identity to Identity Core
I knew there would be authentication issues, since ASP.NET Identity (which we used on our ASP.NET app) is a .NET Framework-only library. Microsoft rewrote the library for .NET Standard/Core as ASP.NET Core Identity - this is also the version used by a Blazor server-side template if you opt for the built-in authentication. However, the database schema used by the Core version is modified, so you need to upgrade it. I was able to create a version of the database that included the new fields required for Core but would also work against the legacy version. This meant I could test the new app using a copy of the live security database, to ensure that when we migrated the live application I could update the database and go.
It also meant that if disaster struck we could revert the live app back to using the .NET Framework version without having to reverse the database changes. Might be worth a separate article on this alone.
DI Model
One of the key differences between ASP.NET Framework and Core is the use of the services model and Dependency Injection (DI). Our old app had a number of 'techniques' (polite word for hacks) for getting services, so a chunk of time was required to restructure controllers, services and code to work with this new approach. It's worth it though, as the DI model is much simpler and makes unit testing much easier too.
JSON Serialization
Something I hadn't anticipated was that System.Text.Json is used as the Web API serializer by default in ASP.NET Core, and this caused a lot of problems with API calls from the client JavaScript. Initially it was behaviours like camel-casing serialized names by default, e.g. a C# property "TestThis" is serialized as "testThis". This caused a number of API calls to fail, as the deserialization on the client wasn't mapping the properties where we had used the C# naming style.
Then other differences with serialization and deserialization of JSON kept cropping up - so eventually I gave up trying to fix lots of code, and switched to using Newtonsoft JSON:
services.AddControllers()
    .AddNewtonsoftJson(o =>
    {
        // revert to the original (Pascal-case) naming when serializing
        o.SerializerSettings.ContractResolver = new DefaultContractResolver();
    });
Resources and Scripts
<!-- copies TS source to wwwroot (only for Debug build) -->
<!-- source: https://github.com/NuGet/Home/issues/6743 -->
<Target Name="CopyTsToWwwRoot" BeforeTargets="Build" Condition="'$(Configuration)'=='Debug'">
  <Message Text="Copying scripts/ts to wwwroot" />
  <ItemGroup>
    <SourceTs Include="$(MSBuildProjectDirectory)\Scripts\**\*.ts" />
  </ItemGroup>
  <Copy SourceFiles="@(SourceTs)" DestinationFiles="@(SourceTs -> '$(MSBuildProjectDirectory)\wwwroot\scripts\%(RecursiveDir)%(Filename)%(Extension)')" />
</Target>
The only thing I wasn't able to fix with LibMan was the loading of TypeScript type definitions, so I had to use NPM to load these. However, these are development-only, needed to enable TypeScript compilation, so we didn't have to copy the files into wwwroot, and I was able to avoid implementing WebPack etc.
Web Jobs
Our app runs on Azure, and we also hosted a number of background WebJobs to perform various long-running tasks and process Azure queues. Although we could have retained these, I wanted to bring them into the web application using the IHostedService support in ASP.NET Core 3.1. This is actually much better than WebJobs, since the code isn't located in the AppData folder and is easier to set up and control. It also shares the same IConfiguration system as the main web application. It's not strictly required for upgrading to Blazor, but I'd strongly recommend using this service over WebJobs.
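A sketch of what one of these hosted services might look like (QueueProcessingService and the polling interval are made up; the actual queue logic is elided):

```csharp
using System;
using System.Threading;
using System.Threading.Tasks;
using Microsoft.Extensions.Configuration;
using Microsoft.Extensions.Hosting;

public class QueueProcessingService : BackgroundService
{
    private readonly IConfiguration _config;

    // shares the same IConfiguration as the rest of the web app
    public QueueProcessingService(IConfiguration config) => _config = config;

    protected override async Task ExecuteAsync(CancellationToken stoppingToken)
    {
        while (!stoppingToken.IsCancellationRequested)
        {
            // poll the Azure queue / run the long-running task here,
            // then wait before the next pass
            await Task.Delay(TimeSpan.FromSeconds(30), stoppingToken);
        }
    }
}
```

It's registered in Startup.ConfigureServices with services.AddHostedService<QueueProcessingService>() and starts and stops along with the web app itself.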
3 - Build Process
Azure DevOps, GitHub and many other providers offer CI/CD builds as a service, and once you've converted to using it you'll never go back. We use Azure DevOps for our code and CI/CD capabilities, so it was pretty easy to upgrade.
Most of our libraries were written a while ago and used the "classic" build tool, which had a GUI interface and helped you create the build pipeline. However, it was really tedious to use on lots of projects, as you had to manually re-create the same process over and over for each one.
The newest hotness is the Azure Pipelines YAML support - you can now add a build file to source control and then configure a build pipeline to use this. The build pipeline is now part of the source control process and you can edit/upgrade/copy these more easily.
As most of my libraries follow a standard format, I re-used the same .yml file in each project, just editing a single line to specify which project I wanted to package into a Nuget library.
Once I'd pushed the repository up to Azure DevOps, I just created a new pipeline and pointed it at the YML file in the repo.
# dotnet standard library build-test-pack-nugetpush
# This build is triggered if changes are pushed to the master branch
trigger:
- master
# Use the 'Default' agent pool (our own hosted agents)
# if you want to use Azure DevOps built-in agents, change to
# vmImage: 'ubuntu-latest'
# or
# vmImage: 'windows-latest'
# since this project is .NET Std / .NET Core, either will work
# if it still has .NET Framework, use the windows version
pool:
name: Default
variables:
solution: '**/*.sln'
buildPlatform: 'Any CPU'
buildConfiguration: 'Release'
project: '[project-folder]/[project-name].csproj'
# this is used to signal which project is to be packaged into a Nuget package
steps:
# ensure the latest 3.1.x SDK is loaded for the build
- task: UseDotNet@2
displayName: 'Load SDK using 3.1.x'
inputs:
version: 3.1.x
# restore packages, and include any in our internal feed
- task: DotNetCoreCLI@2
displayName: 'restore packages'
inputs:
command: restore
vstsFeed: '[internal-feed-ID]'
- task: DotNetCoreCLI@2
displayName: Build solution
inputs:
projects: '$(solution)'
arguments: '--configuration $(BuildConfiguration)'
- task: DotNetCoreCLI@2
displayName: Run Unit tests
inputs:
command: test
projects: '$(solution)'
arguments: '--configuration $(BuildConfiguration)'
# this is a VSTS task I created to set environment variables from the project
# version and append the build number (it's in the marketplace)
#
# the old NUGET PACK used to allow wildcards e.g. 1.2.* but "dotnet pack" does not.
# e.g. if the project is version 1.2.3 this sets VERSION_BUILD to 1.2.3.[buildNo]
# Note: this uses PowerShell so won't work for ubuntu agents
#
- task: conficient.VersionReaderTask.version-reader-build-task.VersionReaderTask@1
displayName: 'Generate build variables'
inputs:
searchPattern: '$(project)'
buildPrefix: '.'
# package the project and version using the environment variable
- task: DotNetCoreCLI@2
displayName: Pack
inputs:
command: pack
packagesToPack: '$(project)'
versioningScheme: byEnvVar
versionEnvVar: 'VERSION_BUILD'
# push the .nupkg packages to our internal Nuget feed
- task: DotNetCoreCLI@2
displayName: 'Push'
inputs:
command: 'push'
packagesToPush: '$(Build.ArtifactStagingDirectory)/*.nupkg'
nuGetFeedType: 'internal'
publishVstsFeed: '[internal-feed-ID]' # change this to your feed
# I've commented the pipeline to help you understand what each part does
2 - Nuget Packages
Nuget Packages
The second big issue with moving to .NET Standard is that any Nuget packages you consume need to have .NET Standard support. Any Framework-only packages have to be replaced with something equivalent. Common packages such as Newtonsoft.JSON already do this, so in many cases you won't need to do anything. Other packages might have a .NET Standard version with a different name. Some packages just won't, and you'll need to find alternatives - or perhaps lobby the package owner to create a .NET Standard version.
We used iTextSharp 4.x in our application, a .NET Framework-only package. There was an iTextSharp 7 release that did support .NET Standard, but it was now a chargeable package and had a number of API changes from the 4.x version, which would have meant rewriting a lot of our PDF generation library to work with it.
A search led us to iTextSharp.LGPLv2.Core which was a .NET Standard port of the 4.x version - it was totally compatible with our existing code, and still under a free licence, so we decided to use this.
Not all .NET Framework functionality is present in .NET Standard 2.0, but there are often Microsoft Nuget packages to cover these gaps. An example is System.Drawing - a lot of the Framework version used the Windows GDI libraries, so it wasn't going to work for a portable framework like .NET Core/Standard.
Downstream Issues
Friday 6 March 2020
1 - Migrating to .NET Standard
.NET Standard 2.0
.NET Standard 2.0 is fine as you can use code targeting this in .NET Framework applications of version 4.7.2 or later, but this isn't the case for .NET Standard 2.1 - it only works with .NET Core 3.0 onward. You can see the compatibility table at https://docs.microsoft.com/en-us/dotnet/standard/net-standard#net-implementation-support
However, as a migration strategy, I recommend using 2.0 for non-Blazor libraries during the migration process.
Migration Strategy
Existing applications using .NET Framework are rarely easy to migrate in a big-bang approach, where you upgrade everything to .NET Standard 'overnight' and the application is now .NET Standard/Core. Usually they comprise multiple modules and would take several weeks or even months to update. So does this mean it's impossible? Well, my approach during 2018 and 2019, while Blazor was under development, was to plan a migration to .NET Standard for all our libraries.
The first thing we did was to start straight away by writing all new libraries using .NET Standard 2.0 where possible. This ensured we didn't keep building a bigger migration workload as we went forward.
Next was to gradually migrate lower-level libraries to .NET Standard 2.0 as and when we could.
Issues
While .NET Standard 2.0 claims compatibility with .NET Framework 4.6.1, please note the asterisk in the compatibility table. There are lots of problems you'll encounter if you're using anything less than .NET Framework 4.7.2 (I did!).
Migrating to .NET Standard
Our first hurdle was that we needed to modify all our own libraries from .NET Framework 4.7.2 to use .NET Standard 2.0 instead. As .NET Standard 2.0 is very closely aligned with the Framework functionality, there should be relatively few code changes required, but anything platform-specific has to be avoided.
Assumptions
I realised writing this post that I'm making some big assumptions here, and if these don't apply you're probably not in a good place to undergo this process. The first is that everything is under source control: Git, Azure DevOps, VSTS (SourceSafe even!) - whatever - if you don't have a way to revert stuff, you're going to have problems. I don't expect anyone reading this type of article to be without source control, but it's worth pointing out.
Second is that you have reasonably comprehensive unit tests in some form. If your app is quite large (mine has over fifty different libraries included in the main web app) then this is a must. You're about to make a lot of major changes and you need to know that each component you migrate to .NET Standard will still work correctly. I actually upgraded my unit tests to .NET Core 3.0 as part of the process but you can keep them on .NET Framework during the transition to give you confidence that you're not going to break things.
Pitfalls
I decided on a step-by-step approach, initially updating the lowest-level utility libraries to .NET Standard 2.0 and leaving the higher-level libraries on .NET Framework for now. This is where I ran into our first issues. Microsoft's compatibility matrix states that .NET 4.6.1 implements .NET Standard 2.0, but note the 2 suffix:
...there are several issues with consuming .NET Standard libraries that were built for those versions from .NET Framework 4.6.1 projects. For .NET Framework projects that need to use such libraries, we recommend that you upgrade the project to target .NET Framework 4.7.2 or higher.
So you really need to have your application upgraded to at least .NET Framework 4.7.2 to be sure it will be able to consume .NET Standard libraries without a lot of issues. I did encounter these, and I upgraded our app to use 4.7.2.
So if you're not on at least 4.7.2 you need to first do an upgrade to that, and ensure everything works on this version first, before attempting the move to .NET Standard.
References and Project Formats
The other changes that are not so obvious are that .NET Standard and .NET Core use the newer, leaner "2017" project format. In addition, the project's Nuget references no longer use a packages.config file but have the references embedded in the project file. Both of these changes need to be implemented to make migrating to .NET Standard possible.
References Upgrade
If you've still got packages.config, the first step is to upgrade to Package References. The latest versions of Visual Studio support this as a simple click-to-migrate operation. See https://docs.microsoft.com/en-us/nuget/consume-packages/migrate-packages-config-to-package-reference for instructions on how to do this. Once this is done, the packages folder is gone, the packages.config files are removed, and the project files now contain the package reference list.
Project Format Upgrade
The second step is to upgrade the project format to the 2017 version. This is a much leaner, simpler format and is able to host most types of library, but not all. For example, a .NET Framework 4.7.2 class library can use the 2017 format, but an ASP.NET web application cannot. However, in this situation you're upgrading libraries, so that should not be an issue.
Upgrading is not a simple process, but there is an excellent tool https://github.com/hvanbakel/CsprojToVs2017 which does most of the work. This is installed as a dotnet extension and you upgrade using the command line.
dotnet migrate-2019 wizard [solutionfilename.sln]
You can optionally choose to migrate on a project-by-project basis if you wish.
The tool converts the content of the .csproj file to the 2017 format, including the Package References settings. It also copies values out of the AssemblyInfo.cs file in the Properties folder. That file is retained with only a couple of remaining values, ComVisible and Guid - if you don't need these, you can remove the file completely.
Now the projects are upgraded, you can open and edit the .csproj file by double-clicking it. You'll find the files much smaller, as the new format uses an include-by-default approach for files, so it no longer needs to list each file in the project. This really helps with source control conflicts when different people work on the same library!
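For comparison, a minimal 2017-format project file for a .NET Standard library looks something like this (the package reference shown is just an example):

```xml
<Project Sdk="Microsoft.NET.Sdk">

  <PropertyGroup>
    <TargetFramework>netstandard2.0</TargetFramework>
  </PropertyGroup>

  <ItemGroup>
    <PackageReference Include="Newtonsoft.Json" Version="12.0.3" />
  </ItemGroup>

</Project>
```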
You can also delete a lot of the cruft that gets copied over. Nate McMaster has an excellent article that covers the migration project files in much more detail.
Nuspec
If you had a .nuspec file in a project to define Nuget package details, this can also be dispensed with - the 2017 format contains all the values you need, and you can use dotnet pack to create a package. I will discuss the Nuget package process for build servers in a later post.