The Alchemy Bin

Gavin's Space

Image Processing With .Net Core

An article I wrote last month on ImageSharp for .Net Core.

Definitely worth a look; and if you have the time to spare, help out with the project.

Code Guru Image Processing Article

Cortana and Universal Windows Platform

My latest article has been published to Code Guru, which looks at how to start a UWP App with Cortana. This was quite a fun article to write, and I spent quite a few hours exploring the possible data returned from Cortana after completing the article.

The article can be found here…

How to Start a Universal Windows Platform (UWP) App Using Cortana

Managing Non-blocking calls on Code Guru

The next article in the theme of async programming has been added to Code Guru.

Managing Non-blocking Calls on the UI Thread with Async Await

I spent a considerable amount of time learning the async await pattern myself in the early days of its appearance. I recall a lot of pain when first trying to wield it. It’s a powerful tool, but it can go horribly wrong if misused.
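To make that concrete, here’s a sketch of the classic way it goes wrong (the event handler and GetDataAsync are hypothetical names for illustration, not from the article):

```csharp
// Misuse: blocking on a Task from the UI thread.
// .Result blocks the UI thread while the continuation inside
// GetDataAsync is waiting to resume on that same thread: deadlock.
private void LoadButton_Click(object sender, RoutedEventArgs e)
{
    var data = GetDataAsync().Result; // blocks; can deadlock
    Display(data);
}

// (Alternative) The safer shape: await all the way up,
// leaving the UI thread free to pump messages.
private async void LoadButton_Click(object sender, RoutedEventArgs e)
{
    var data = await GetDataAsync(); // UI thread stays responsive
    Display(data);
}
```

The article linked above goes into this pattern in far more detail.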

Much fun was had too, and many laughs at the high degree of failure along the way.

EF Core and SQLite on Code Guru

My first official article is now up on Code Guru on the topic of EF Core and SQLite.

I’ll be producing articles for Code Guru from time to time, helping to fill in on Peter Shaw’s regular column when needed.

You can find my latest article here…

Simple Asp.Net Core Microservice Project

Warning : This project is out of date and will not work with the latest releases of .Net Core.

This is a project which can be found on GitHub at…

I started this project as a simple way of testing how much I recalled off the top of my head when creating an Asp.Net Core application from an empty project. It is based on a project by Peter Shaw and his Microservice talk for DDD North 2015, which can be found here.

I created this project on Windows using Visual Studio 2015, but given it uses the CoreCLR, in theory you could get this running on Linux/Mac OS X.

The project is divided into a number of areas, and each service project is designed to run on the Kestrel HTTP server as an independent process.

The first set of projects are the web applications themselves. At the time of this writing there are three of them. The first is the ‘Microservices.Web’ project which deals with the web view. Making use of MVC the controllers for this project can be found in the ‘WebViews’ folder. The other two are the Web Apis and their controllers can be found in the ‘WebApis’ folder.

Logic is injected into the controllers, and this logic is found in the ‘Logic’ solution folder. DI is set up in the startup.cs files of the web application projects. You’ll see something like this…

services.AddTransient<IGetEvents, EventData>();

At the moment only the web apis have any additional logic injected into them.
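As a sketch of the consuming side (the controller and its members here are hypothetical; IGetEvents and EventData are the types registered above), the logic arrives via constructor injection:

```csharp
public class EventsController : Controller
{
    private readonly IGetEvents _getEvents;

    // Resolved by the container thanks to
    // services.AddTransient<IGetEvents, EventData>() in startup.cs.
    public EventsController(IGetEvents getEvents)
    {
        _getEvents = getEvents;
    }

    [HttpGet]
    public IEnumerable<Event> Get()
    {
        return _getEvents.GetAll();
    }
}
```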

The DBContext is in the ‘Data’ solution folder with the models project and uses Entity Framework Core. The data project also contains a startup.cs so you can run this project in isolation for updating your db with migrations. Also, it’s worth noting I’m using User Secrets to hold the connection string to my db. You’ll need to configure this at your end if you wish to run the project.
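If you haven’t used User Secrets before, what gets stored is just a small secrets.json kept outside the project tree; a sketch (the key name here is an assumption, match it to whatever the data project reads from configuration):

```json
{
  "ConnectionStrings": {
    "DefaultConnection": "Data Source=microservices.db"
  }
}
```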

How to run the Web Applications

All the projects are set to use ‘dnxcore50’ which you can see in each of their project.json files. If you haven’t installed the .Net Core and the stuff for Asp.Net Core 1 you can get started with this here…

At the time of this writing the dnx version I’m using is 1.0.0-rc1-update2 coreclr x64. If you haven’t played with AspNet Core 1 yet, you can run the web projects from a command prompt opened at the folder of the specific project using this command.

dnx web

Now, the web api projects are already setup to run from a different port to the default localhost:5000. In their project.json (I’m looking at the project.json in the ‘Microservices.ServiceOne’ project) you will find a line which looks like this…

"web": "Microsoft.AspNet.Server.Kestrel --server.urls=http://localhost:5001/"

Which sets for us the port we want this project to use. Service two is set to use 5002.

The web view project, which uses TypeScript and Gulp for minifying, simply uses jQuery ajax calls pointed at the ports set up for now; I’m thinking of a more sophisticated system for this later. It also makes use of Razor and some of the new stuff in Asp.Net Core; if you look in the ‘_Layout.cshtml’, you’ll see ‘asp-append-version’ being used to give us a hash of the minified js file. There’s a lot of new stuff like this, though at the moment the project hasn’t made use of most of it. A very useful example is the loading of different JS files depending on the environment, e.g. development or production. Though as said, this will appear in the project at some later time when the beers are plentiful.
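For the curious, the environment-based script loading mentioned above looks roughly like this in Razor (a sketch based on the default Asp.Net Core 1 template; the file names are the template’s defaults, not this project’s):

```cshtml
<environment names="Development">
    <script src="~/js/site.js"></script>
</environment>
<environment names="Staging,Production">
    <script src="~/js/site.min.js" asp-append-version="true"></script>
</environment>
```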

Having all three web applications running looks something like this below.



I’ve also navigated to localhost:5000 so you can see that logging is enabled and you can see stuff happening. You can also see things like SQL generated by LINQ to SQL and such like in the command prompt output; it’s very handy.

Word of warning, there’s a good chance most of this will stop working as I’ll probably be working on it after a number of beers at the weekend.

Creating Commands: Asp.Net Core

Using the AspNet Core is a joy in itself, but here I make a record (mainly for myself) of creating commands for use through the Command Prompt which can be consumed by a web application.

Using commands is a common activity when you are building web applications with AspNet Core. You interact with Entity Framework through a Command Prompt via the library created for such things. If you’ve used this, you’ll be familiar with the dependency in your project.json: “EntityFramework.Commands”.

You may or may not also be familiar with these lines…

"commands": {
  "web": "Microsoft.AspNet.Server.Kestrel",
  "ef": "EntityFramework.Commands"
}

Anyway, those lines allow something like this…

dnx ef database update

So let’s create one of these commands for ourselves…

Before we start, if you’re new to AspNet core, or even new to .Net Core then you can get started here as you’ll need to install the .Net Core and the AspNet core stuff for this…

Getting Started with .Net Core

Also, it’s worth noting that at the time of this writing, there are things which will be renamed in the near future. You’ll also need an Asp.Net project created, I’ve built one using the Visual Studio template for Asp.Net 5 (Asp.Net Core) web application. You can also use the Yeoman generator for Asp.Net 5 if that is your weapon of choice.

Yeoman Generator for Asp.Net 5

Now, let’s get on with building our command application which will be consumed by our Asp.Net web application. This is really simple using the .Net Core; and from a command prompt in a directory of your creation to hold the application (I’ve named my directory “TestConsoleApp.Command”), execute this…

dotnet new

This will create a basic C# console application; in your folder you’ll have a NuGet.Config file, a Program.cs, and a project.json.

Firstly, check which dependencies your project.json is referencing, as some things are in beta at the moment and changing; here’s what I have.

"dependencies": {
  "Microsoft.CSharp": "4.0.1-beta-23516",
  "System.Collections": "4.0.11-beta-23516",
  "System.Console": "4.0.0-beta-23516",
  "System.Linq": "4.0.1-beta-23516",
  "System.Threading": "4.0.11-beta-23516"
}

…and the framework is targeting “dnxcore50”.

Now, the console application generated by the .Net Core tooling should be a simple “Hello World!” app, outputting that text to the console.
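For reference, the generated Program.cs looks something like this at the time of writing (the tooling is in beta, so yours may differ slightly):

```csharp
using System;

namespace ConsoleApplication
{
    public class Program
    {
        public static void Main(string[] args)
        {
            Console.WriteLine("Hello World!");
        }
    }
}
```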

If you run…

dnu restore
dotnet build

You should find your compiled dll in bin/Debug/dnxcore50; again, names of directories may change in the near future.

Given the directory was named “TestConsoleApp.Command”, I’m looking at a dll named “TestConsoleApp.Command.dll”. This is the dll we’ll now reference in the project.json of the web application. If you’re using Visual Studio you can add your references as normal. If you are editing the project.json file of your web application directly; you can enter this into your dependencies.

"frameworks": {
  "dnxcore50": {
    "dependencies": {
      "TestConsoleApp.Command": "1.0.0-*"
    }
  }
}

Then you can add your command here in the web application’s project.json…
(In VS, if you save this document with changes, VS will do a restore of packages. If not, you can run dnu restore from the command prompt in the root of the web application’s project directory.)

"commands": {
  "test": "TestConsoleApp.Command"
}

Once done, open your command prompt at the location of your web application and run…

dnx test

Which should produce whatever message you set your console app to produce. There are of course other ways to do this; I’ve also done it by adding a reference to the assembly from outside the web application solution. However, having the console app project in the solution is, I would say, the better way to go about it, for the following reason…

When referencing the assembly I found a few things happened. Firstly, a wrap is created in the root of the web application which contains a project.json. This project.json contains a reference to the location of the assembly. Now, this is where things went a bit funny. Even though a reference is kept to its location, a copy is made of the assembly and put in the lib folder, which is also in the directory of the application. One may think this is a good idea, as we now have a copy of the assembly close to the application. That, though, is where I found a little problem. When you update your assembly, that copy does not get updated automatically at the present time. I find I have to go to that directory, delete the copy (or copy the updated assembly over) and refresh the reference, which then pulls over another copy.
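For illustration, the wrap’s project.json looks something like this (a sketch from memory; the exact paths and shape may differ at your end):

```json
{
  "version": "1.0.0-*",
  "frameworks": {
    "dnxcore50": {
      "bin": {
        "assembly": "../../lib/TestConsoleApp.Command/TestConsoleApp.Command.dll"
      }
    }
  }
}
```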

Due to the nearest-wins rule, it is the copy which is being used, and until I found that copy being kept I had a few head-scratching moments wondering why my updates weren’t being seen.

So there you have it; I find this arrangement a very nice way to add a bit extra when the situation requires it, while keeping it all in one neat place.

LIDNUG & Stephanie Locke Thur 10th March

I would like to let you all know that we have an upcoming event on LIDNUG on Thursday 10th March with Stephanie Locke, titled…

Implementing Analytics in your Applications

This is a free event held on-line through Live Meeting available to anyone who wishes to attend.

You can register for the event via Eventbrite here:


And now; a few words about Steph…

Steph Locke wants to live in a world where everyone she encounters enjoys their jobs and is awesome. Since she’s gifted with the inability to be daunted by a task, she’s using her unbounded perkiness to bring awesomeness to the people. Steph is a fiend for learning, an MVP and a highly popular presence at many community events in the UK; and that’s before we even start talking about the events Steph organises herself.

Steph can be found on twitter : @SteffLocke
and you can find her blog here :

If you are interested in joining the LIDNUG group you can find us in LinkedIn…

XAML, a Do Not, and Collection Virtualisation

The Problem

This is a problem I see often, sometimes in on-line tutorials, and in code I come across from time to time. The problem is thinking you have item virtualisation when, in fact, you do not. Or, as a beginner with XAML, you may not be aware of virtualisation yet.

Firstly though, what is Collection Virtualisation?
It’s a very simple idea. If you have a large list of data to display on your UI, a list that extends well beyond the boundaries of the UI, virtualisation is a mechanism which relieves pressure on the rendering work by not looking at items which the user cannot see. In point of fact, it actually extends a little further than what the user can see, in what can be referred to as the Realisation Window. If you have a vertical list spanning the height of your UI, this Realisation Window extends one UI height up and one UI height down.

So… back to the problem. The problem is often caused by optimisation attempts made in the XAML. These attempts are usually found when replacing a ListBox with an ItemsControl. Why might you want to do this?

This is a very valid thing to do. The ItemsControl found in all XAML frameworks is a very light control compared to its derived relatives, the ListBox, ListView etc. You may rightly choose to use a simple ItemsControl to show a list of items you do not intend to select or interact with; well, not without making considerable changes to the ItemTemplate.

However, doing so comes at a cost…

The first of which is very obvious, and that is you can no longer scroll your list of items in any direction; the ScrollViewer is not present in the template of the ItemsControl. Actually, there’s pretty much nothing in there except the ItemsPresenter needed to show the items.

Consider the following…

<ItemsControl ItemsSource="{Binding Items}">

The above is a basic setup of the ItemsControl bound to its source of ‘Items’. This will not provide you with a mechanism to scroll through your list of items.

<ScrollViewer>
    <ItemsControl ItemsSource="{Binding Items}" />
</ScrollViewer>

And this is a potential fix to this problem. (This isn’t a good way to do this, and at the end of this piece we’ll look at a bit of code that shows us where the ScrollViewer should go.)

Let’s take a closer look at the above two snippets of code. I’ve done this in a simple WPF application so I can make use of an older tool which I still like: the ‘WPF Inspector’. It can be found here…

In my application, the property the ItemsControl is bound to is an Array of Byte, and I’ve quickly populated this with random values, with a length of 500. When I run this application and attach the WPF inspector I see this curious statement at the bottom.


Our list is not being virtualised. This means the rendering engine is doing far more work than it needs to. For this application the overhead is not noticeable; but if our items were quite complicated and shown in an application which had other UI concerns to deal with, we’d very quickly see degraded performance when this list is rendered.

So what happens if we add the ScrollViewer to our visual tree as shown in the code snippet earlier?


Other than now being able to scroll the list vertically, there was no change to the state of virtualisation. Note the number of UI elements being rendered, shown next to ‘MainWindow’: 1034.

If we were to make the ItemsControl a ListBox and remove the ScrollViewer in our code like this…

<ListBox ItemsSource="{Binding Items}">

Then run the application and attach the WPF Inspector, and we’ll find two things. Firstly, the warning about our virtualisation has disappeared; and secondly, the UI element count will be much lower than the 1034 mentioned in the last image. I’m showing 102 next to MainWindow now; this may vary depending on how high your MainWindow is.

What is the first thing this shows us? It shows us that the ListBox is doing far more than what its visual appearance lets us believe. Even though dropping to an ItemsControl has removed the Selector and the UI elements making up the list box, our application performance is potentially in danger; which is the opposite effect we intended when choosing an ItemsControl over a ListBox.

However, not all is lost. We can build this functionality back into our ItemsControl with a little extra XAML.

Consider the next piece of code…

<ItemsControl ItemsSource="{Binding Items}">
    <ItemsControl.ItemsPanel>
        <ItemsPanelTemplate>
            <VirtualizingStackPanel />
        </ItemsPanelTemplate>
    </ItemsControl.ItemsPanel>
    <ItemsControl.Template>
        <ControlTemplate TargetType="ItemsControl">
            <ScrollViewer CanContentScroll="True">
                <ItemsPresenter />
            </ScrollViewer>
        </ControlTemplate>
    </ItemsControl.Template>
</ItemsControl>

Run the application and once again attach the WPF Inspector. You’ll find our changes have made an impact on the virtualisation of our list, and peace is restored to the land of XAML.

I hope this helps; when I started working with XAML some years ago, I pulled out a few hairs wondering why my UI was slower when I’d removed UI elements from the tree. If you have any questions please find me on twitter @GLanata.

Syncfusion E-Book Number 7

Source: Syncfusion E-Book Number 7

Another great title from Mr Shawty
