CloudBlob Fails Function Indexing

When creating a new blob-triggered Azure Function, the default type binding is Stream.

[FunctionName("Function1")]
public static void Run([BlobTrigger("{name}", 
    Connection="AzureWebJobsStorage")]Stream myBlob, ...)

Several binding types are available, such as CloudBlockBlob and CloudPageBlob.  Both of these classes derive from CloudBlob, yet CloudBlob itself cannot be used as a binding type.  Although the compiler doesn’t mind, CloudBlob will fail at run-time during function indexing.

CloudBlob Fails Indexing

When the functions host indexes all the functions available, it fails if the binding type is CloudBlob.  If you know that the triggering blob will be a block or page blob, you can use CloudBlockBlob or CloudPageBlob, respectively.  For code that may be triggered by either, use ICloudBlob instead of CloudBlob.
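
For example, a function that must handle either kind of blob can bind to ICloudBlob.  Below is a minimal sketch of that idea; the container path samples-workitems, the function name, and the ILogger parameter are illustrative, not from the original code.

[FunctionName("ProcessAnyBlob")]
public static void Run(
    [BlobTrigger("samples-workitems/{name}", Connection = "AzureWebJobsStorage")] ICloudBlob myBlob,
    string name,
    ILogger log)
{
    // ICloudBlob is implemented by both CloudBlockBlob and CloudPageBlob,
    // so this binding indexes successfully and handles either blob kind.
    log.LogInformation($"Blob '{name}' triggered; type: {myBlob.Properties.BlobType}");
}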

Note that the error message is helpful in v2 of Azure Functions.  For the full list of types available for the triggering blob, see Microsoft’s documentation.

Avoid Shared Assemblies with Azure Functions

Keep Azure Function assemblies monolithic, or be prepared to deal with mismatched Azure Storage assemblies.

Azure Function projects which attempt to share common code via other assemblies will fail to compile with:

CS0433: The type 'CloudBlob' exists in both 'Microsoft.Azure.Storage.Blob, Version=9.4.1.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35' and 'Microsoft.WindowsAzure.Storage, Version=9.3.1.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35'
(reported in the Azure Functions assembly, which references a shared assembly that in turn references the Azure Storage assembly)
AzFnTrials Solution Hierarchy

The mismatch occurs because the Azure WebJobs extensions and the more general Azure Storage package reference different versions of the Azure Storage assemblies.  The nearby AzFnTrials Solution Hierarchy screenshot highlights the mismatched versions.
The easiest way to avoid this compilation error is to keep all of the code in a single assembly – e.g., move BlobHelper.cs to SharedAsmb and remove CommonLib (or remove it from dependencies).

The downside of this approach is that sharing code gets more complicated, particularly when shared code already exists.

Breaking Azure Functions with local.settings.json

Because local.settings.json is finicky!

At some point you’re going to want to add an object to the Values section of local.settings.json. Don’t! Declaring an object (or array) in Values breaks the configuration, resulting in Missing value for AzureWebJobsStorage in local.settings.json.  Notice the customObj declaration below.

{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "UseDevelopmentStorage=true",
    "AzureWebJobsDashboard": "UseDevelopmentStorage=true",
    "customKey": "customValue",
    "customObj": {
      "customObjKey": "customObjValue"
    }
  }
}

Just having customObj declared will break things.  Yes, it’s valid JSON; no, your functions won’t be able to retrieve settings from Values.

A simple key-value pair works – like customKey above. After removing customObj, retrieving the setting for customKey from Values is straightforward:

var customVal = System.Environment.GetEnvironmentVariable("customKey");

Beyond this simple case for local.settings.json, use app configs, and remember that Azure Functions v2 switched to ASP.NET Core Configuration, and does not support ConfigurationManager.  See Jon Gallant’s post for details.
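
For example, a pattern along the lines of Jon Gallant’s post builds configuration with ConfigurationBuilder.  Here is a minimal sketch of that idea; the SettingsHelper class name is my own, and it assumes the Microsoft.Extensions.Configuration packages are referenced by the function project.

using Microsoft.Azure.WebJobs;
using Microsoft.Extensions.Configuration;

public static class SettingsHelper
{
    // Call from any function that receives an ExecutionContext parameter.
    public static IConfigurationRoot BuildConfig(ExecutionContext context)
    {
        return new ConfigurationBuilder()
            .SetBasePath(context.FunctionAppDirectory)    // the function app's root folder
            .AddJsonFile("local.settings.json", optional: true, reloadOnChange: true)
            .AddEnvironmentVariables()                    // App Settings in Azure; Values locally
            .Build();
    }
}

Inside a function body, usage then looks like var config = SettingsHelper.BuildConfig(context); followed by config["customKey"].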

Controlling .NET Core tool versions by global.json

If you ever need to use older versions of .NET Core tooling, an easy option is setting the sdk version in global.json.  For example, create an empty directory and execute dotnet --info. Since I’m using .NET Core 2.0, my output is:

.NET Command Line Tools (2.0.0)

Product Information:
 Version: 2.0.0
 Commit SHA-1 hash: cdcd1928c9

Runtime Environment:
 OS Name: ubuntu
 OS Version: 16.04
 OS Platform: Linux
 RID: ubuntu.16.04-x64
 Base Path: /usr/share/dotnet/sdk/2.0.0/

Microsoft .NET Core Shared Framework Host

Version : 2.0.0
 Build : e8b8861ac7faf042c87a5c2f9f2d04c98b69f28d

Now add this global.json file to the directory:

{
  "sdk": {
    "version": "1.0.0"
  }
}

and run dotnet --info again.  Now the output will look like:

.NET Command Line Tools (1.0.1)

Product Information:
 Version: 1.0.1
 Commit SHA-1 hash: 005db40cd1

Runtime Environment:
 OS Name: ubuntu
 OS Version: 16.04
 OS Platform: Linux
 RID: ubuntu.16.04-x64
 Base Path: /usr/share/dotnet/sdk/1.0.1

Microsoft .NET Core Shared Framework Host

Version : 2.0.0
 Build : e8b8861ac7faf042c87a5c2f9f2d04c98b69f28d

The version in the first line shows that the older tools are used.  Now, use the older tools to create a new project by executing dotnet new console.  Notice the TargetFramework value in the project file:

<Project Sdk="Microsoft.NET.Sdk">
  <PropertyGroup>
    <OutputType>Exe</OutputType>
    <TargetFramework>netcoreapp1.1</TargetFramework>
  </PropertyGroup>
</Project>

The .NET Core 1.x tooling created a console project targeting NetCoreApp1.1.  By comparison, .NET Core 2.x tooling will create the project targeting NetCoreApp2.0.

Other tools are affected, too.  For example, dotnet build invokes MSBuild version 15.1.548.43366 for sdk version 1.0.0, and version 15.3.409.57025 for sdk version 2.0.

Beyond manual creation, this capability seems useful for automation scenarios which need to specify which tooling to use.  I was even able to force 1.0.0-preview2-1-003177 tooling to be used.  Will you use this capability?

How To Use Mocha for Node Testing in Windows (2017)

Using Mocha on Windows is so much easier these days!  The previous post on this topic showed how to get past the Linux-specific instructions in Mocha’s Getting Started.  That approach is outdated, and using Mocha on Windows is now a breeze.  So, let’s see how easy it is:

After creating a directory for your project, use npm to create a package.json file. (You can skip this step if you’re adding mocha to an existing project.)

npm init

You’ll be presented with several prompts. For this example, accept the defaults for everything except the test command, where you’ll type mocha.  Accept the resulting JSON displayed and your package.json will be written to disk.

If you started with an existing project, i.e., package.json already existed, you just need to set test to mocha in the scripts section.

Once the package.json exists, install mocha.  The --save-dev option puts mocha in the devDependencies section rather than dependencies.

npm install --save-dev mocha

Now you’re ready to create a test!  Create a subdirectory named test and use your favorite editor to create test.js there.

var assert = require('assert');
describe('Array', function() {
  describe('#indexOf()', function() {
    it('should return -1 when the value is not present', function() {
      assert.equal(-1, [1,2,3].indexOf(4));
    });
  });
});

After saving the file, ensure all packages are up to date.  Go back to the project directory (above test) and run:

npm update --dev

The --dev option causes devDependencies to be processed, too.

That’s it!  Now just run the test:

npm test

The output will look like this:

  Array
  #indexOf()
    ✓ should return -1 when the value is not present

 1 passing (9ms)

Is global.json irrelevant (or dead)?

If you followed our .NET Core 1.0.1 Migration Workflow post you may have noticed that global.json was deleted as part of the migration.  Why?

Previously global.json was used primarily for its projects property which specified directories containing project.json files.  In effect, this property provided the same behavior as a Visual Studio Solution file (.sln).  So, the migration process replaced global.json with a .sln file.

Less frequently used, however, was global.json’s sdk property, which specifies the SDK version to use (from $env:ProgramFiles\dotnet\sdk).  While the projects property is now ignored, the sdk property is still honored.

Consider this global.json:

{
    "projects": [
        "foo", "bar", "Does Not Exist"
    ],
    "sdk": { "version": "1.0.3" }
}
Executing dotnet restore or dotnet build succeeds even though the directories listed in projects do not exist.  That property is obviously ignored.  Changing the sdk version number, however, impacts which tooling the dotnet command uses.

Interested in more details? See How to use Global.json in the .NET Core Tools 1.0 world.

How to use global.json in the .NET Core Tools 1.0 world

UPDATE: see our post on using global.json to control .NET Core tool version.

The .NET Core Tools 1.0 release officially made the switch from using global.json and project.json to using Visual Studio Solution and Project files, respectively.  See Microsoft’s .NET Core Tools 1.0 announcement and our migration guide.

Global.json is not completely useless in this new world, however.  Importantly, you can use it to control which tooling version the dotnet command uses.

Controlling .NET Core Tooling Version

What if you need to create a .NET Core project via older tooling?  Your team may still need to use project.json for some period of time, your builds have not been updated yet, or you may just need it for testing purposes.  Instead of creating a VM or some other heavyweight procedure, just use global.json!  Add the following global.json to an empty directory:

{
    "sdk": { "version": "1.0.0-preview2-003133" }
}

NOTE: This json refers to the 1.0.0-preview2-003133 tooling. If this specific version doesn’t work on your machine, check %ProgramFiles%\dotnet\sdk to see which tooling versions are installed.

Now enter dotnet --help on the command line from within the same directory.  Notice that the first line of output reads:

.NET Command Line Tools (1.0.0-preview2-003133)

Whereas entering dotnet --help from within a different directory (i.e., sans global.json) produces:

.NET Command Line Tools (1.0.3)

Going Back In Time – The Easy Way!

Now, go back to the directory with the global.json file and enter dotnet new -t Console.  Since global.json references the older tooling version, this command creates a project.json file, just like the early days of .NET Core!

Note also that the file dates are set to 9/20/2016 – seemingly the date of the tooling’s binaries in %ProgramFiles%\dotnet\sdk\1.0.0-preview2-003133.

Using dotnet migrate for project.json to .csproj

Read our .NET Core 1.0.1 Migration Workflow post, which explains using the .NET Core CLI to migrate from the old project.json approach to the new Visual Studio Project (a.k.a., MSBuild) approach.

.NET Core Tools 1.0 Migration Workflow

If you’ve been using .NET Core for very long, you have code based on global.json and one or more project.json files.  Several months (a year?) ago Microsoft announced deprecating this approach and a return to Visual Studio Solution (.sln) and Project (.csproj, .fsproj, .vbproj, etc.) files.  This transition also involves changing from a directory-based to a file-based orientation.  For example, attempting dotnet build with the 1.0.1 toolset yields:

MSBUILD : error MSB1003: Specify a project or solution file. The current working directory does not contain a project or solution file.

Specifying a project.json file doesn’t work either; it yields:

error MSB4025: The project file could not be loaded. Data at the root level is invalid. Line 1, position 1.

After upgrading to the new toolset, migrating existing code involves a few steps:

  1. Create a VS Solution file.  The migration process is much easier if you create this file first.  Using a shell (cmd, powershell, bash, etc.), change to the directory containing the project’s global.json and run dotnet new sln.  The result is a (relatively) empty .sln file.
  2. Migrate project.json files to VS Project files.  Run dotnet migrate in the same directory as above.  This command will recursively find project.json files, convert them to C# project files, and add references to them in the solution file.
  3. At this point you should be able to restore and build using the .sln.  Recall that the dotnet tool is no longer directory oriented; include the .sln file in the restore and build commands, e.g., dotnet build <project>.sln.

.NET Core multi-project build change in tasks.json

In earlier versions of .NET Core tooling, building multiple projects was simply a matter of adding them as args in the tasks.json file.

   "tasks": [
        {
            "taskName": "build",
            "command": "dotnet",
            "args": [
                "./J3DI.Domain/",
                "./J3DI.Infrastructure.EntityFactoryFx/",
                "./J3DI.Infrastructure.RepositoryFx/",
   ...

Each directory was a child of the location with global.json, and each had its own project.json. This approach worked very well for producing multiple .NET Core libraries and their associated unit tests.

After migrating this code and changing the references to the specific .csproj files, we found that only one arg is allowed for the dotnet build task. If the args array contains more than one item, compilation fails with:

MSBUILD : error MSB1008: Only one project can be specified.

The fix is to reference the Visual Studio Solution File (.sln) in the args.

    "tasks": [
        {
            "taskName": "build",
            "args": [
                "${workspaceRoot}/J3DI.sln"
            ],
    ...

Good News: There’s still a way to build multiple projects by encapsulating them in a .sln file.

Bad News: Visual Studio is required. (IOW, ever tried manually creating or managing a .sln?)