Coverlet and Case-Sensitivity

If you are familiar with Linux, you already know that one of its differences from Windows is file system case-sensitivity. Dev teams have to ensure casing is correct when building and testing on Linux; once it is, Windows generally just works.

As we containerize .NET Core code to run on Linux as well as Windows, we have encountered some interesting case-sensitivity issues. Recently, we realized that the open source code coverage tool for .NET Core, Coverlet, is always case-sensitive – even on Windows – with the Include parameter. Consider the following dotnet command:

dotnet test /p:CollectCoverage=true /p:Include="[J3DI*]*" /p:Exclude="[Test.J3DI*]*"  ./Test.J3DI.Domain

This command works correctly because the casing matches the path and file names on both OSs. (Cloning the Git repository, on either Linux or Windows, retains file and path casing.)

If we change the command just slightly (lower-casing the final ‘I’ of the assembly name in the Include filter, so [J3DI*] becomes [J3Di*]):

dotnet test /p:CollectCoverage=true /p:Include="[J3Di*]*" /p:Exclude="[Test.J3DI*]*"  ./Test.J3DI.Domain

The tests run, but no code coverage is recorded. Coverlet still outputs the standard coverage.json file, but it contains only an empty object – the file size is just 2 bytes in my case. Worse, Coverlet reports 100% coverage, which is very misleading:

+--------+------+--------+--------+
| Module | Line | Branch | Method |
+--------+------+--------+--------+

+---------+------+--------+--------+
|         | Line | Branch | Method |
+---------+------+--------+--------+
| Total   | 100% | 100%   | 100%   |
+---------+------+--------+--------+
| Average | NaN% | NaN%   | NaN%   |
+---------+------+--------+--------+

The remedy: double-check the casing in the file paths and in the Include / Exclude filters.
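
A quick sanity check for a build script (a bash sketch, run from the test project directory and assuming Coverlet’s default coverage.json output):

wc -c coverage.json    # a 2-byte file is just the empty object
cat coverage.json      # {}

If the file is essentially empty, the Include filter almost certainly did not match any assemblies.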

Snappy for .NET Core on Linux (AVOID!)

Snaps are very easy to install on most Linux distributions, can auto-update, etc. But don’t use Snap to install the .NET Core SDK on Linux. Although Snap allows multiple versions of a package to be installed, only one is active at a time. Why is this a problem?

The specified framework 'Microsoft.NETCore.App', version '2.0.0' was not found.
  - The following frameworks were found:
      3.0.0 at [/usr/share/dotnet/shared/Microsoft.NETCore.App]

When the .NET Core SDK 3.0 snap package is active, down-level builds don’t work – e.g., using the 3.0 SDK to build and test projects targeting 2.2, 2.1, etc. A dev organization of any size cannot simply upgrade everything to 3.0 en masse, so the newer tooling still has to be able to target older frameworks such as netcoreapp2.0.

Can this be remedied with some magical Snap tricks? Maybe. IMO, the easier route is to simply install all the SDKs you need using the official .NET Core instructions.
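
For example, on Ubuntu the official route boils down to registering the packages.microsoft.com feed and installing each SDK side by side (a sketch; adjust the config URL and package names for your distro and the SDK versions you need):

wget https://packages.microsoft.com/config/ubuntu/18.04/packages-microsoft-prod.deb
sudo dpkg -i packages-microsoft-prod.deb
sudo apt-get update
sudo apt-get install -y dotnet-sdk-2.1 dotnet-sdk-2.2 dotnet-sdk-3.0
dotnet --list-sdks    # every installed SDK is listed, and down-level builds work again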

Bitbucket Pipelines vs .NET Core

There are many things to like about Atlassian’s Bitbucket, but .NET support in Pipelines is not on the list.

Bitbucket Pipelines is billed as “Integrating CI/CD for Bitbucket Cloud” and as “trivial to set up” using a YAML file to describe the actions (quotes from Atlassian’s documentation as of 3/8/2019).

Here’s an example of a pipeline file:

pipelines:
  default:
    - step:
        caches:
          - dotnetcore
        script: # Modify the commands below to build your repository.
          - export PROJECT_NAME=hello.csproj
          - export TEST_NAME=hello-tests.csproj
          - dotnet restore
          - dotnet build $PROJECT_NAME
          - dotnet test $TEST_NAME

Pretty simple, right? Yep, nice and simple. Executing the pipeline is pretty easy, too. The problem is that this pipeline will fail, and the failure has little to do with the code.

Bitbucket Pipelines only accepts the JUnit XML format for test output. A standard .NET test project doesn’t produce JUnit XML – dotnet test writes results to the console by default (or to a TRX file with --logger trx) – so the pipeline never finds any test results.

You could fix this by using the JUnitTestLogger. Or you could just use a hosted CI/CD solution that handles .NET better. Azure DevOps, anyone?
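
For the logger route, here is a minimal sketch of the extra script steps. The JunitXml.TestLogger NuGet package (logger friendly name "junit") and Bitbucket’s convention of collecting XML reports from a test-results directory are assumptions here:

          - dotnet add $TEST_NAME package JunitXml.TestLogger
          - dotnet test $TEST_NAME --logger "junit;LogFilePath=test-results/results.xml"

With the results written as JUnit XML, Pipelines should be able to pick up and display the test run.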

Avoid Shared Assemblies with Azure Functions

Keep Azure Function assemblies monolithic, or be prepared to deal with mismatched Azure Storage assemblies.

Azure Function projects which attempt to share common code via other assemblies will fail to compile with:

CS0433: The type 'CloudBlob' exists in both 'Microsoft.Azure.Storage.Blob, Version=9.4.1.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35' and 'Microsoft.WindowsAzure.Storage, Version=9.3.1.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35'

(The error is reported against the assembly containing the Azure Functions, which references a shared assembly that in turn references an Azure Storage assembly.)

(Screenshot: AzFnTrials Solution Hierarchy)

The mismatch occurs because the Azure WebJobs extensions and the more general Azure Storage package pull in different Azure Storage versions. The nearby AzFnTrials Solution Hierarchy screenshot highlights the mismatched versions.
The easiest way to avoid this compilation error is to keep all of the code in a single assembly – e.g., move BlobHelper.cs to SharedAsmb and remove CommonLib (or remove it from the dependencies).

The downside of this approach is that sharing code gets more complicated, particularly when shared code already exists.
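
To make the structure concrete, here is a hedged sketch of how the conflict arises from the command line. FunctionsApp is a hypothetical name for the Azure Functions project; CommonLib and the package versions come from the scenario above:

dotnet add CommonLib package WindowsAzure.Storage --version 9.3.1   # shared code takes its own Azure Storage dependency
dotnet add FunctionsApp reference CommonLib/CommonLib.csproj        # Functions project references the shared assembly
dotnet build FunctionsApp                                           # the Functions/WebJobs extensions already pull in
                                                                    # Microsoft.Azure.Storage.Blob 9.4.x, so this setup
                                                                    # fails with the CS0433 error shown above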

Controlling .NET Core tool versions by global.json

If you ever need to use older versions of the .NET Core tooling, an easy option is setting the sdk version in global.json. For example, create an empty directory and execute dotnet --info. Since I’m using .NET Core 2.0, my output is:

.NET Command Line Tools (2.0.0)

Product Information:
 Version: 2.0.0
 Commit SHA-1 hash: cdcd1928c9

Runtime Environment:
 OS Name: ubuntu
 OS Version: 16.04
 OS Platform: Linux
 RID: ubuntu.16.04-x64
 Base Path: /usr/share/dotnet/sdk/2.0.0/

Microsoft .NET Core Shared Framework Host

Version : 2.0.0
 Build : e8b8861ac7faf042c87a5c2f9f2d04c98b69f28d

Now add this global.json file to the directory:

{
  "sdk": {
    "version": "1.0.0"
  }
}

and run dotnet --info again. Now the output will look like:

.NET Command Line Tools (1.0.1)

Product Information:
 Version: 1.0.1
 Commit SHA-1 hash: 005db40cd1

Runtime Environment:
 OS Name: ubuntu
 OS Version: 16.04
 OS Platform: Linux
 RID: ubuntu.16.04-x64
 Base Path: /usr/share/dotnet/sdk/1.0.1

Microsoft .NET Core Shared Framework Host

Version : 2.0.0
 Build : e8b8861ac7faf042c87a5c2f9f2d04c98b69f28d

Notice that the older 1.0.1 tools are now in use (see the version and Base Path lines), while the shared framework host remains 2.0.0. Now, use the older tools to create a new project by executing dotnet new console. Notice the TargetFramework value in the generated project file:

<Project Sdk="Microsoft.NET.Sdk">
  <PropertyGroup>
    <OutputType>Exe</OutputType>
    <TargetFramework>netcoreapp1.1</TargetFramework>
  </PropertyGroup>
</Project>

The .NET Core 1.x tooling created a console project targeting netcoreapp1.1. By comparison, .NET Core 2.x tooling creates a project targeting netcoreapp2.0.

Other tools are affected, too.  For example, dotnet build invokes MSBuild version 15.1.548.43366 for sdk version 1.0.0, and version 15.3.409.57025 for sdk version 2.0.

Beyond manual creation, this capability seems useful for automation scenarios which need to specify which tooling to use.  I was even able to force 1.0.0-preview2-1-003177 tooling to be used.  Will you use this capability?
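
For instance, a CI job could pin the tooling by writing a global.json into the build directory before building (a bash sketch; MySolution.sln is a placeholder):

echo '{ "sdk": { "version": "1.0.1" } }' > global.json
dotnet --info                  # should now report the 1.0.x command line tools
dotnet build MySolution.sln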

Is global.json irrelevant (or dead)?

If you followed our .NET Core 1.0.1 Migration Workflow post you may have noticed that global.json was deleted as part of the migration.  Why?

Previously global.json was used primarily for its projects property which specified directories containing project.json files.  In effect, this property provided the same behavior as a Visual Studio Solution file (.sln).  So, the migration process replaced global.json with a .sln file.

Less frequently used, however, was global.json’s sdk property, which specifies which SDK version to use (from $env:ProgramFiles\dotnet\sdk). While the projects property is now ignored, the sdk property is still honored.

Consider this global.json:

{
    "projects": [
        "foo", "bar", "Does Not Exist"
    ],
    "sdk": { "version": "1.0.3" }
}

Executing dotnet restore or dotnet build succeeds even though the directories listed in projects do not exist – that property is clearly ignored. Changing the sdk version number, however, does change which tooling the dotnet command uses.

 

Interested in more details? See How to use Global.json in the .NET Core Tools 1.0 world.

 

How to use global.json in the .NET Core Tools 1.0 world

UPDATE: see our post on using global.json to control .NET Core tool version.

The .NET Core Tools 1.0 release officially made the switch from using global.json and project.json to using Visual Studio Solution and Project files, respectively.  See Microsoft’s .NET Core Tools 1.0 announcement and our migration guide.

Global.json is not completely useless in this new world, however.  Importantly, you can use it to control which tooling version the dotnet command uses.

Controlling .NET Core Tooling Version

What if you need to create a .NET Core project with older tooling? Your team may still need to use project.json for a while, your builds may not have been updated yet, or you may just need it for testing purposes. Instead of creating a VM or some other heavyweight workaround, just use global.json! Add the following global.json to an empty directory:

{
    "sdk": { "version": "1.0.0-preview2-003133" }
}

NOTE: This json refers to the 1.0.0-preview2-003133 tooling. If this specific version doesn’t work on your machine, check %ProgramFiles%\dotnet\sdk to see which tooling versions are installed.
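
For reference, the installed versions are simply the directory names under the dotnet installation. On Windows (cmd):

dir "%ProgramFiles%\dotnet\sdk"

and the equivalent on Linux:

ls /usr/share/dotnet/sdk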

Now enter dotnet --help on the command line from within the same directory. Notice that the first line of output reads:

.NET Command Line Tools (1.0.0-preview2-003133)

Whereas entering dotnet --help from within a different directory (i.e., sans global.json) produces:

.NET Command Line Tools (1.0.3)

Going Back In Time – The Easy Way!

Now, go back to the directory with the global.json file and enter dotnet new -t Console.  Since global.json references the older tooling version, this command creates a project.json file, just like the early days of .NET Core!


Note also that the file dates are set to 9/20/2016 – seemingly the date of the tooling’s binaries in %ProgramFiles%\dotnet\sdk\1.0.0-preview2-003133.

Using dotnet migrate for project.json to .csproj

Read our post at .NET Core 1.0.1 Migration Workflow which explains using .NET Core CLI for migrating from the old project.json approach to the new Visual Studio Project (a.k.a., MSBuild) approach.

.NET Core Tools 1.0 Migration Workflow

If you’ve been using .NET Core for very long, you have code based on global.json and one or more project.json files. Several months (a year?) ago Microsoft announced the deprecation of this approach and a return to Visual Studio Solution (.sln) and Project (.csproj, .fsproj, .vbproj, etc.) files. This transition also changes the tooling from a directory-based to a file-based orientation. For example, attempting dotnet build with the 1.0.1 toolset yields:

MSBUILD : error MSB1003: Specify a project or solution file. The current working directory does not contain a project or solution file.

Specifying a project.json file doesn’t work either; it yields:

error MSB4025: The project file could not be loaded. Data at the root level is invalid. Line 1, position 1.

After upgrading to the 1.0 toolset, migrating code involves a few steps (a condensed command sketch follows the list):

  1. Create a VS Solution file. The migration process is much easier if you create this file first. Using a shell (cmd, powershell, bash, etc.), change to the directory containing the project’s global.json and run dotnet new sln. The result is a (relatively) empty .sln file.
  2. Migrate project.json files to VS Project files. Run dotnet migrate in the same directory as above. This command will recursively find project.json files, convert them to C# project files, and add references to them in the solution file.
  3. At this point you should be able to restore and build using the .sln. Recall that the dotnet tool is no longer directory oriented, so include the .sln file in the restore and build commands, e.g., dotnet build <project>.sln.
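
Condensed, the whole workflow looks like this (a sketch; MyRepo stands in for the directory containing global.json, and the .sln name follows from the directory name):

cd MyRepo
dotnet new sln              # step 1: creates an empty MyRepo.sln
dotnet migrate              # step 2: converts project.json files to .csproj and adds them to the solution
dotnet restore MyRepo.sln
dotnet build MyRepo.sln     # step 3: restore and build against the solution file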


.NET Core multi-project build change in tasks.json

In earlier versions of .NET Core tooling, building multiple projects was simply a matter of adding them as args in the tasks.json file.

   "tasks": [
        {
            "taskName": "build",
            "command": "dotnet",
            "args": [
                "./J3DI.Domain/",
                "./J3DI.Infrastructure.EntityFactoryFx/",
                "./J3DI.Infrastructure.RepositoryFx/",
   ...

Each directory was a child of the location with global.json, and each had its own project.json. This approach worked very well for producing multiple .NET Core libraries and their associated unit tests.

After migrating this code and changing the references to the specific .csproj files, we found that only one arg is allowed for the dotnet build task. If the args array contains more than one item, compilation fails with:

MSBUILD : error MSB1008: Only one project can be specified.

The fix is to reference the Visual Studio Solution File (.sln) in the args.

 

    "tasks": [
        {
            "taskName": "build",
            "args": [
                "${workspaceRoot}/J3DI.sln"
            ],
    ...

Good News: There’s still a way to build multiple projects by encapsulating them in a .sln file.

Bad News: Visual Studio required. (IOW, ever tried manually creating or managing a .sln?)