Controlling .NET Core tool versions by global.json

If you ever need to use an older version of the .NET Core tooling, an easy option is pinning the SDK version in global.json.  For example, create an empty directory and execute dotnet --info.  Since I’m using .NET Core 2.0, my output is:

.NET Command Line Tools (2.0.0)

Product Information:
 Version: 2.0.0
 Commit SHA-1 hash: cdcd1928c9

Runtime Environment:
 OS Name: ubuntu
 OS Version: 16.04
 OS Platform: Linux
 RID: ubuntu.16.04-x64
 Base Path: /usr/share/dotnet/sdk/2.0.0/

Microsoft .NET Core Shared Framework Host

Version : 2.0.0
 Build : e8b8861ac7faf042c87a5c2f9f2d04c98b69f28d

Now add this global.json file to the directory:

  "sdk": {
    "version": "1.0.0"

and run dotnet --info again.  Now the output will look like:

.NET Command Line Tools (1.0.1)

Product Information:
 Version: 1.0.1
 Commit SHA-1 hash: 005db40cd1

Runtime Environment:
 OS Name: ubuntu
 OS Version: 16.04
 OS Platform: Linux
 RID: ubuntu.16.04-x64
 Base Path: /usr/share/dotnet/sdk/1.0.1

Microsoft .NET Core Shared Framework Host

Version : 2.0.0
 Build : e8b8861ac7faf042c87a5c2f9f2d04c98b69f28d

The Base Path line shows that the older tools are being used.  Now, use the older tools to create a new project by executing dotnet new console.  Notice the TargetFramework value in the generated project file:

<Project Sdk="Microsoft.NET.Sdk">

  <PropertyGroup>
    <OutputType>Exe</OutputType>
    <TargetFramework>netcoreapp1.1</TargetFramework>
  </PropertyGroup>

</Project>

The .NET Core 1.x tooling created a console project targeting netcoreapp1.1.  By comparison, the .NET Core 2.x tooling creates the project targeting netcoreapp2.0.

Other tools are affected, too.  For example, dotnet build invokes MSBuild version 15.1.548.43366 for SDK version 1.0.0, and version 15.3.409.57025 for SDK version 2.0.0.

Beyond manual project creation, this capability seems useful for automation scenarios that need to specify which tooling to use.  I was even able to force the 1.0.0-preview2-1-003177 tooling to be used.  Will you use this capability?
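As a sketch of that automation idea (the directory and version below are placeholders, not from the post), a build script can drop a global.json into its working directory before invoking the CLI:

```shell
# Sketch: pin the SDK for a scripted build by writing global.json first.
# /tmp/pinned-build and 1.0.0 are just example values.
dir=/tmp/pinned-build
mkdir -p "$dir" && cd "$dir"
cat > global.json <<'EOF'
{
  "sdk": {
    "version": "1.0.0"
  }
}
EOF
# dotnet --info    # would now report the 1.0.x tooling, if installed
grep '"version"' global.json
```

Any dotnet command run from (or below) that directory would then resolve the pinned SDK.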

Using dotnet migrate for project.json to .csproj

Read our post, .NET Core Tools 1.0 Migration Workflow, which explains using the .NET Core CLI to migrate from the old project.json approach to the new Visual Studio project (a.k.a. MSBuild) approach.

.NET Core Tools 1.0 Migration Workflow

If you’ve been using .NET Core for very long, you have code based on global.json and one or more project.json files.  Several months (a year?) ago, Microsoft announced the deprecation of this approach and a return to Visual Studio Solution (.sln) and Project (.csproj, .fsproj, .vbproj, etc.) files.  This transition also involves changing from a directory-based to a file-based orientation.  For example, attempting dotnet build with the 1.0.1 toolset yields:

MSBUILD : error MSB1003: Specify a project or solution file. The current working directory does not contain a project or solution file.

Specifying a project.json file doesn’t work either; it yields:

error MSB4025: The project file could not be loaded. Data at the root level is invalid. Line 1, position 1.

After upgrading to the toolset, migrating code involves a few steps:

  1. Create a VS Solution file.  The migration process is much easier if you create this file first.  Using a shell (cmd, powershell, bash, etc.), change to the directory containing the project’s global.json and run dotnet new sln.  The result is a (relatively) empty .sln file.
  2. Migrate project.json files to VS Project files.  Run dotnet migrate in the same directory as above.  This command will recursively find project.json files, convert them to C# project files, and add references to them in the solution file.
  3. At this point you should be able to restore and build using the .sln.  Recall that the dotnet tool is no longer directory oriented.  Include the .sln file in the restore and build commands, e.g., dotnet build <project>.sln.
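Put together, the workflow above looks roughly like this ("MyApp" is a made-up solution directory, and the guard skips gracefully where the dotnet CLI isn’t installed):

```shell
# Rough transcript of the three migration steps; "MyApp" is hypothetical.
if command -v dotnet >/dev/null 2>&1 && [ -d MyApp ]; then
  cd MyApp                    # the directory containing global.json
  dotnet new sln              # 1. create a (relatively) empty MyApp.sln
  dotnet migrate              # 2. recursively convert project.json files
  dotnet restore MyApp.sln    # 3. the tool is file-oriented now:
  dotnet build MyApp.sln      #    name the .sln explicitly
else
  echo "dotnet CLI or MyApp directory not present; steps shown for reference"
fi
```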



Must Have Tooling for .NET Core Development

Here’s a great set of tools for smoothing your transition to developing in .NET Core.


  • VSCode – cross-platform code editor; great for coding .NET Core



Azure 1.4 SDK and Tools

Yesterday (3/9/11) Microsoft announced updates for the Windows Azure SDK and Tools (VSCloudService.exe).  As the download page indicates, this SDK release’s primary purpose is to address stability issues.  Other improvements include:

  • Azure Management Portal – Improved responsiveness
  • Azure Connect – Multi-admin support and installation for non-English Windows
  • Azure Content Delivery Network (CDN) – Provides for delivery of secure content via HTTPS

Be aware also that the 1.4 SDK installer automatically uninstalls previous versions.  So, if you still need 1.3 for a while, hold on to its installer.

Download the SDK & tools here.

Better Management Certificates for Azure

As I wrote earlier this week, the Silverlight-based Management Portal may be the best feature of Azure’s 1.3 release.  It is a handsome UI, Azure components are far easier to access, and (most importantly) the amount of time required for common tasks is sharply reduced.

This post is only partly about the Management Portal, however.  The certificate generating code in the Azure Tools for VS 2010 is also involved.

The Problem

Although it’s nice to be able to create the necessary management certificate from within Visual Studio, the resulting certificate naming is confusing.  The following dialog shows an existing certificate and the selection for creating a new certificate.

AzVSTools - Certificate Selection

When this tool creates a certificate, it sets the Friendly Name you entered and sets Issued By to Windows Azure Tools.  No big deal, right?  Right – until you add the certificate via the Azure Portal….

Azure Mgmt - Mgmt Certs with Comments

This view of the portal shows the Management Certificates, but you can’t really tell which is which.  For example, which of the two certificates corresponds to the one with Friendly Name: Deployment Credentials in the Azure Tools dialog?  You really can’t tell unless you can distinguish them by their thumbprints or validity dates.  Why doesn’t Deployment Credentials appear in one of the fields?  Well, let’s take a quick look at the certificate in Certificate Manager (certmgr.msc).

CertMgr - Personal Certs - Windows Azure Tools-Deployment Credentials

When Azure Tools created the certificate, it set Windows Azure Tools in the Issued To and Issued By fields.  The name I provided the tool appears in the Friendly Name field.  I’m glad that I can distinguish the certificate in my local store with the friendly name, but it’s only known in my local store.  That’s the problem: Friendly Name is not part of the certificate; it’s metadata associated with the certificate, and only locally.
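A quick way to convince yourself of this: dump every field a certificate actually contains and look for a friendly name.  This sketch creates a throwaway self-signed certificate with openssl (not the Azure tools; the CN merely mimics what the tools set, and the file names are made up) and greps the full text dump:

```shell
# Demonstrate that "Friendly Name" is not stored inside a certificate.
cd /tmp
openssl req -x509 -newkey rsa:2048 -nodes -days 1 \
  -subj "/CN=Windows Azure Tools" \
  -keyout fn.key -out fn.pem 2>/dev/null
# Count occurrences of "friendly" among the certificate's fields:
openssl x509 -in fn.pem -noout -text | grep -ci friendly || true
# prints 0 -- the friendly name lives only in the local store's metadata
```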

What’s A Better Way?

Instead of using the Azure Tools to create a certificate, use the MakeCert tool.  Azure only accepts certain certificates (X.509, 2048-bit, SHA-1, etc.), so you have to provide a few specific parameters.  Here’s a sample command line:

makecert -sky exchange -r -n "CN=<CertName>" -pe -a sha1 -len 2048 -ss My "<CertFileName>"

where CertName specifies the name you want to appear in Name field in Management Certificates of the management portal, and CertFileName specifies where to store the certificate file on your local drive.
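MakeCert is Windows-only; a comparable self-signed certificate can be sketched with openssl on other platforms.  The file names and CN below are placeholders, and I use SHA-256 here rather than the SHA-1 the portal of that era expected, so substitute -sha1 if required:

```shell
# Hypothetical openssl equivalent of the makecert line above.
# Azure wanted self-signed X.509, 2048-bit certs; names are placeholders.
cd /tmp
openssl req -x509 -newkey rsa:2048 -sha256 -nodes -days 365 \
  -subj "/CN=Deployment Credentials" \
  -keyout az.key -out az.pem 2>/dev/null
# The portal takes a DER-encoded .cer upload:
openssl x509 -in az.pem -outform der -out az.cer
openssl x509 -in az.pem -noout -subject
```

The CN chosen here is what then shows up in the portal’s Name column, which is the whole point of the exercise.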

Now, when you upload the certificate to the management portal, you can easily distinguish the certificates.

Azure Mgmt - Mgmt Certs - Better Names2

Then, when you Publish from Visual Studio, simply choose the appropriate certificate from the list.

AzVSTools - Certificate Selection2

Admittedly, the Friendly Name isn’t set, but you have no trouble distinguishing between certificates in either Visual Studio or Azure’s Management Portal.

Process Explorer 14.01 Revives Single View of Indicators

You just gotta love the SysInternals team’s responsiveness!  Just last week I wrote that I didn’t like how the v14 release of Process Explorer did not include a single view of all the System Information indicators.  Yes, it’s nice to have an independent view of each indicator on its own tab, but I still want the synchronous summary view.

Say hello to Process Explorer v14.01!  In this release, the team added (revived) the Summary tab to the System Information dialog.

Process Explorer v14.01 - System Information Summary tab

Summary view, it’s good to have you back!  You’ll notice that this view is not quite the same as the original (pre-v14.x) dialog.  (See prior post for v12 screenshot).  The Summary tab really is a summary of the other three tabs:  CPU Usage History from the CPU tab; Commit & Physical Memory Histories from the Memory tab; I/O, Network and Disk Bytes History from the I/O tab.

Unfortunately, the Summary view does not offer Show one graph per CPU support, so you only get the aggregated graph.  That’s not such a big deal, though.  A nice-to-have feature, however, would be the ability to double-click a graph on the Summary tab to navigate to the appropriate detail tab.

OAuth & WRAP for Development Teams

We posted some code to CodePlex which smooths OAuth integration for development teams.

AccessViolationException During VS2010 Debugging in VBox

If you’re using VirtualBox to host a development environment for Visual Studio 2010, this info may be helpful to you.  During debug sessions, you may receive the following exception:

Access Violation Exception in VS2010 guest in VirtualBox

I resolved this problem by disabling VBox’s Nested Paging option for memory.  You’ll need to shut down the guest OS, open its settings in VBox, and turn off the Nested Paging option.

VirtualBox Guest Settings Dialog, System/Acceleration/Enable Nested Paging Turned Off

For reference, my configuration is:

Base OS: Windows 7 x64

Virtualization Host: VirtualBox 3.2.8

Guest OS: Windows Server 2008 R2, Standard

Visual Studio 2010 10.0.30319.1 RTMRel
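If you prefer scripting the change, the same toggle can be flipped with VBoxManage ("DevVM" is a made-up VM name; the guest must be powered off first, and the guard skips where VirtualBox isn’t installed):

```shell
# Command-line equivalent of the GUI change above; "DevVM" is hypothetical.
if command -v VBoxManage >/dev/null 2>&1; then
  VBoxManage modifyvm "DevVM" --nestedpaging off \
    || echo "no VM named DevVM on this host"
else
  echo "VBoxManage not found; command shown for reference"
fi
```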

Lightswitch: Initial Thoughts & Recommendations

I tinkered with the first public beta of Microsoft’s new Lightswitch application development tool.  Here are a few initial thoughts:

  1. Lightswitch makes it very easy to access data in pretty rich ways.  Business people who are not developers will be able to unlock corporate data in ways that would typically require a developer or significant knowledge of querying tools, etc.
  2. Lightswitch is easy and approachable, but there’s still some learning curve involved.  Layout items such as Vertical Stack, Horizontal Stack, Text and Picture, and so on provide rich user interface capabilities, but how long will it take users to learn how to use them properly?
  3. Data access is very easy — maybe too easy.  Just point Lightswitch to an existing data store and go!
  4. Relationships to data are automatic or easy to assign.  The features around these relations are very good — Master-Detail screens are super easy.
  5. There’s no hard deployment requirement for Lightswitch apps.  Yes, it requires Silverlight, etc., but you don’t have to deliver it into an IIS farm, etc.
  6. OOPS!  Businesses must realize that all previous points add up to the potential for major train wrecks with data.  For example, data relationship features make it easy to create Master-Detail screens, but cascading deletes are just as easy.


My recommendations:

  1. Businesses should take great care to ensure that all Lightswitch application development runs against TEST data.  Far too many companies ignore or skip this best practice, but they may suffer seriously if they aren’t careful.
  2. Protect data at the database, not in the app.  Again, this is a best practice for software design, but I wouldn’t expect business people to be in tune with it.  DBAs or data owners must protect production data and provide matching test databases (identical schema, data domains, security, etc.).
  3. Don’t prototype and expand.  People will be tempted to take a Lightswitch application’s generated code, turn it into a full-fledged Silverlight project, and then build from there.  Don’t!  Is it possible?  Probably, but it’s also possible to climb stairs to the top of the Empire State Building.  I suspect future versions of Lightswitch will reduce this gap (a la creating OCXs with VB back in the day).

Lastly, I did run into several bugs, not unexpected in an early beta.  One example the product team may be interested in: changing the screen layout in non-debug mode (Ctrl+F5 from VS2010) results in:

Error: The last change wasn’t successfully persisted. Please shut down the running application. Error: Change cannot be persisted because KittyHawk is not running under debug mode.