How To Use Mocha for Node Testing in Windows (2017)

Using Mocha on Windows is so much easier these days!  The previous post on this topic showed how to get past the Linux-specific instructions in Mocha’s Getting Started.  That approach is outdated, and using Mocha on Windows is now a breeze.  So, let’s see how easy it is:

After creating a directory for your project, use npm to create a package.json file. (You can skip this step if adding mocha to an existing project.)

npm init

You’ll be presented with several prompts. For this example, accept the defaults for all except test command, where you’ll type mocha.  Accept the resulting JSON displayed and your package.json will be written to disk.

If you started with an existing project, i.e., package.json already existed, you just need to set test to mocha in the scripts section.

Once the package.json exists, install mocha.  The --save-dev option puts mocha in the devDependencies section rather than dependencies.

npm install --save-dev mocha
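
After the install completes, your package.json should look something like this (the project name, version, and mocha version shown here are only illustrative):

```json
{
  "name": "my-project",
  "version": "1.0.0",
  "scripts": {
    "test": "mocha"
  },
  "devDependencies": {
    "mocha": "^3.2.0"
  }
}
```

The two pieces that matter are the test script set to mocha and the mocha entry under devDependencies.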

Now you’re ready to create a test!  Create a subdirectory named test and use your favorite editor to create test.js there.
[code lang="js" title="test.js"]
var assert = require('assert');
describe('Array', function() {
  describe('#indexOf()', function() {
    it('should return -1 when the value is not present', function() {
      assert.equal(-1, [1,2,3].indexOf(4));
    });
  });
});
[/code]
After saving the file, ensure all packages are up to date.  Go back to the project directory (above test) and run:

npm update --dev

The --dev option causes devDependencies to be processed, too.

That’s it!  Now just run the test:

npm test

The output will look like this:
[code lang="text"]
  Array
    #indexOf()
      ✓ should return -1 when the value is not present

  1 passing (9ms)
[/code]

Is global.json irrelevant (or dead)?

If you followed our .NET Core 1.0.1 Migration Workflow post you may have noticed that global.json was deleted as part of the migration.  Why?

Previously, global.json was used primarily for its projects property, which specified the directories containing project.json files.  In effect, this property provided the same behavior as a Visual Studio Solution file (.sln).  So, the migration process replaced global.json with a .sln file.

Less frequently used, however, was global.json’s sdk property, which specifies which SDK version to use (from $env:ProgramFiles\dotnet\sdk).  While the projects property is now ignored, the sdk property is still honored.

Consider this global.json:

    {
        "projects": [
            "foo", "bar", "Does Not Exist"
        ],
        "sdk": { "version": "1.0.3" }
    }
Executing dotnet restore or dotnet build succeeds even though the directories listed in projects do not exist.  That property is obviously ignored.  Changing the sdk version number, however, changes which tooling the dotnet command uses.


Interested in more details? See How to use Global.json in the .NET Core Tools 1.0 world.


How to use global.json in the .NET Core Tools 1.0 world

UPDATE: see our post on using global.json to control .NET Core tool version.

The .NET Core Tools 1.0 release officially made the switch from using global.json and project.json to using Visual Studio Solution and Project files, respectively.  See Microsoft’s .NET Core Tools 1.0 announcement and our migration guide.

Global.json is not completely useless in this new world, however.  Importantly, you can use it to control which tooling version the dotnet command uses.

Controlling .NET Core Tooling Version

What if you need to create a .NET Core project via older tooling?  Your team may still need to use project.json for some period of time, your builds have not been updated yet, or you may just need it for testing purposes.  Instead of creating a VM or some other heavyweight procedure, just use global.json!  Add the following global.json to an empty directory:

    {
        "sdk": { "version": "1.0.0-preview2-003133" }
    }

NOTE: This json refers to the 1.0.0-preview2-003133 tooling. If this specific version doesn’t work on your machine, check %ProgramFiles%\dotnet\sdk to see which tooling versions are installed.

Now enter dotnet --help on the command line from within the same directory.  Notice that the first line of output reads:

.NET Command Line Tools (1.0.0-preview2-003133)

Whereas entering dotnet --help from within a different directory (i.e., one sans global.json) produces:

.NET Command Line Tools (1.0.3)

Going Back In Time – The Easy Way!

Now, go back to the directory with the global.json file and enter dotnet new -t Console.  Since global.json references the older tooling version, this command creates a project.json file, just like the early days of .NET Core!
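
The whole experiment fits in a few commands (the directory name is arbitrary, and this assumes the preview2 SDK from above is installed on your machine):

```shell
mkdir time-travel && cd time-travel
# pin the older tooling for this directory tree
echo '{ "sdk": { "version": "1.0.0-preview2-003133" } }' > global.json
dotnet new -t Console    # generates project.json, not a .csproj
```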




Note also that the file dates are set to 9/20/2016 – seemingly the date of the tooling’s binaries in %ProgramFiles%\dotnet\sdk\1.0.0-preview2-003133.

Using dotnet migrate for project.json to .csproj

Read our post .NET Core 1.0.1 Migration Workflow, which explains using the .NET Core CLI to migrate from the old project.json approach to the new Visual Studio Project (a.k.a. MSBuild) approach.

.NET Core Tools 1.0 Migration Workflow

If you’ve been using .NET Core for very long, you have code based on global.json and one or more project.json files.  Several months (a year?) ago Microsoft announced deprecating this approach and a return to Visual Studio Solution (.sln) and Project (.csproj, .fsproj, .vbproj, etc.) files.  This transition also involves changing from a directory-based to a file-based orientation.  For example, attempting dotnet build with the 1.0.1 toolset yields:

MSBUILD : error MSB1003: Specify a project or solution file. The current working directory does not contain a project or solution file.

Specifying a project.json file doesn’t work either; it yields:

error MSB4025: The project file could not be loaded. Data at the root level is invalid. Line 1, position 1.

After upgrading to the toolset, migrating code involves a few steps:

  1. Create a VS Solution file.  The migration process is much easier if you create this file first.  Using a shell (cmd, powershell, bash, etc.), change to the directory containing the project’s global.json and run dotnet new sln.  The result is a (relatively) empty .sln file.
  2. Migrate project.json files to VS Project files.  Run dotnet migrate in the same directory as above.  This command will recursively find project.json files, convert them to C# project files, and add references to them in the solution file.
  3. At this point you should be able to restore and build using the .sln.  Recall that the dotnet tool is no longer directory oriented, so include the .sln file in the restore and build commands, e.g., dotnet build <project>.sln.
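
The steps above boil down to a short command sequence (the directory and solution names here are illustrative; dotnet new sln names the solution after the current directory by default):

```shell
cd MyLibrary                  # directory containing global.json
dotnet new sln                # step 1: create an empty solution file
dotnet migrate                # step 2: convert project.json files to .csproj
dotnet restore MyLibrary.sln  # step 3: restore and build via the .sln
dotnet build MyLibrary.sln
```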



.NET Core multi-project build change in tasks.json

In earlier versions of .NET Core tooling, building multiple projects was simply a matter of adding their directories as args in the tasks.json file (the directory names below are placeholders):

   "tasks": [
      {
         "taskName": "build",
         "command": "dotnet",
         "args": [ "./Lib1", "./Test.Lib1" ]
      }
   ]
Each directory was a child of the location with global.json, and each had its own project.json. This approach worked very well for producing multiple .NET Core libraries and their associated unit tests.

After migrating this code and changing the references to the specific .csproj files, we found that only one arg is allowed for the dotnet build task. If the args array contains more than one item, compilation gives:

MSBUILD : error MSB1008: Only one project can be specified.

The fix is to reference the Visual Studio Solution File (.sln) in the args.


    "tasks": [
        {
            "taskName": "build",
            "args": [ "<solution-name>.sln" ]
        }
    ]

Good News: There’s still a way to build multiple projects by encapsulating them in a .sln file.

Bad News: Visual Studio required. (IOW, have you ever tried manually creating or managing a .sln?)

.NET Core: No Sophisticated Unit Testing, Please!

In my previous post, I wrote about .NET Core’s limitation regarding directory depth.  I explained that I’m trying to create several related Domain-Driven Design packages for J3DI.  One of .NET Core’s strengths is the ability to use exactly what’s needed.  Apps don’t need the entire .NET Framework; they can specify only the packages / assemblies necessary to run.  Since I want J3DI to give developers this same option — only use what is needed — I broke the code down into several aspects.

I’ve enjoyed using Microsoft’s lightweight, cross-platform IDE, Visual Studio Code (VSCode), with this project. It has a nice command palette, good Git integration, etc. But, unfortunately, it appears that VSCode will only execute a single test project.

For context, here’s my tasks.json from the .vscode directory:

   {
      "version": "0.1.0",
      "command": "dotnet",
      "isShellCommand": true,
      "args": [],
      "tasks": [
         {
            "taskName": "build",
            "args": [
               "./J3DI.Domain",
               "./J3DI.Infrastructure.EntityFactoryFx",
               "./Test.J3DI.Common",
               "./Test.J3DI.Domain",
               "./Test.J3DI.Infrastructure.EntityFactoryFx"
            ],
            "isBuildCommand": true,
            "showOutput": "always",
            "problemMatcher": "$msCompile",
            "echoCommand": true
         },
         {
            "taskName": "test",
            "args": [
               "./Test.J3DI.Domain",
               "./Test.J3DI.Infrastructure.EntityFactoryFx"
            ],
            "isBuildCommand": false,
            "showOutput": "always",
            "problemMatcher": "$msCompile",
            "echoCommand": true
         }
      ]
   }

Notice how args for the build task includes 5 sub-directories. When I invoke this build task from VSCode’s command palette, it builds all 5 sub-directories in order.

Now look at the test task which has 2 sub-directories specified. I thought specifying both would execute the tests in each. Maybe you thought so, too. Makes sense, right? Well, that’s not what happens. When the test task is invoked from VSCode, the actual command invoked is:

running command> dotnet test ./Test.J3DI.Domain ./Test.J3DI.Infrastructure.EntityFactoryFx
error: unknown command line option: ./Test.J3DI.Infrastructure.EntityFactoryFx

(BTW, use echoCommand in the appropriate task section to capture the actual command.)

Hmmmm, maybe the build task works differently? Nope. Here’s its output:

running command> dotnet build ./J3DI.Domain ./J3DI.Infrastructure.EntityFactoryFx ./Test.J3DI.Common ./Test.J3DI.Domain ./Test.J3DI.Infrastructure.EntityFactoryFx

Ok, so it seems that dotnet build will process multiple directories, but dotnet test will only process one. To be clear, this is not a bug in VSCode — it’s just spawning the commands as per tasks.json. So I thought maybe multiple test tasks could work. I copied the test task into a new section of tasks.json, removed the first directory from the new section, and removed the second directory from the original section. Finally, I set isTestCommand for both sections.

   {
      "taskName": "test",
      "args": [ "./Test.J3DI.Domain" ],
      "isTestCommand": true
   },
   {
      "taskName": "test",
      "args": [ "./Test.J3DI.Infrastructure.EntityFactoryFx" ],
      "isTestCommand": true
   }

I hoped this was the magic incantation, but I was once again disappointed. Hopefully Microsoft will change dotnet’s test task to behave like the build task. Until then, we’re stuck using shell commands like the one shown in this stackoverflow question.
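
Until then, a simple workaround is to chain the test invocations yourself in a shell script or shell-based task (the project paths are the ones from this post; the chaining works the same for any set of test directories):

```shell
dotnet test ./Test.J3DI.Domain && dotnet test ./Test.J3DI.Infrastructure.EntityFactoryFx
```

The && ensures the second test run only executes if the first one passes.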

Try .NET Core, but keep it shallow

I’ve been building a Domain-Driven Design (DDD) framework for .NET Core.  The intent is to allow developers to use only what they need, rather than requiring an entire framework.  The project, J3DI, is available on GitHub (get it? Jedi for DDD?).

The initial layout had 3 projects under src, and 4 under test:

|   global.json
|   NuGet.config
|
+---src
|   \---J3DI.Domain
|   \---J3DI.Infrastructure.EntityFactoryFx
|   \---J3DI.Infrastructure.RepositoryFactoryFx
\---test
    \---Test.J3DI.Common
    \---Test.J3DI.Domain
    \---Test.J3DI.Infrastructure.EntityFactoryFx
    \---Test.J3DI.Infrastructure.RepositoryFactoryFx

The global.json in J3DI included these projects:

   "projects": [

Well, that was a mistake.  After building the src projects, the test projects were not able to find the necessary dependencies from within src.

error: Unable to resolve 'J3DI.Domain (>= 0.1.0)' for '.NETStandard,Version=v1.3'.

Assuming I had something wrong, I tinkered around in global.json, but couldn’t find the magical incantation of path string format.  Finally it dawned on me that dotnet might not be treating the path as having depth.

So, it turns out, .NET Core only lets you go one level down from global.json (as of versions 1.0.0 and 1.0.1).  After pulling each project up a level, effectively removing the src and test levels, I updated the global.json file.

   "projects": [

After that, dotnet got happy. Magic incantation found!

Must Have Tooling for .NET Core Development

Here’s a great set of tools for smoothing your transition to developing in .NET Core.


  • VSCode – cross platform IDE; great for coding .NET Core



REST, Versioning & Religion, Part I

Have you been tracking opinions on the “right” way to version REST APIs?  Don’t miss out on the fun!  If you’d like to get a brief overview of the matrix of possibilities, check out Troy Hunt’s Your API Versioning Is Wrong.  Although this is an oldie, it’s definitely a goodie!  Reading it will give you a brief glimpse into the religious nature of REST advocates.

In case you’re of the TLDR persuasion, here’s a quick summary:

  • REST proposes that the URL (sans query) specifies the resource.
  • Services need some manner of versioning.  IOW, it’s impossible to design and implement a perfect service which never changes.
  • Does that mean that service APIs (URLs) should be versioned?  Well, that depends on your religious views.  Does the resource really change?
  • Oh, so the representations should be versioned? Maybe. It really depends on your religious views.

So stick around to get *all the answers in Part II!

* – refers to opinions you’re likely to disagree with, possibly with religious fervor.
