Snappy for .NET Core on Linux (AVOID!)

Snaps are very easy to install on most Linux OSs, are able to auto-update, etc. But don’t use Snap to install the .NET Core SDK on Linux. Although Snap allows for multiple versions of a snap package, only one is active at a time. Why is this a problem?

The specified framework 'Microsoft.NETCore.App', version '2.0.0' was not found.
  - The following frameworks were found:
      3.0.0 at [/usr/share/dotnet/shared/Microsoft.NETCore.App]

When the .NET Core SDK 3.0 snap package is active, down-level builds don’t work – e.g., using .NET Core SDK 3.0 to build for 2.2, 2.1, etc. Dev organizations of any size cannot simply upgrade everything to 3.0 en masse, so everyone’s config must support targeting netcoreapp2.0 with newer tooling.

Can this be remedied with some magical Snap tricks? Maybe. IMO, the easier route is to simply install all the SDKs you need using the official .NET Core instructions.
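
Once side-by-side SDKs are installed that way, a global.json at the repo root pins which SDK builds that tree. A minimal sketch – the version number below is illustrative; use whichever SDK you actually installed:

{
  "sdk": {
    "version": "2.2.402"
  }
}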

CloudBlob Fails Function Indexing

When creating a new blob triggered Azure Function, the default type binding is Stream.

[FunctionName("Function1")]
public static void Run([BlobTrigger("{name}", 
    Connection="AzureWebJobsStorage")]Stream myBlob, ...)

Several binding types are available, such as CloudBlockBlob, CloudPageBlob, etc.  Both of these classes are derived from CloudBlob, yet it cannot be used as a binding type.  Although the compiler doesn’t mind, CloudBlob fails at run-time during function indexing.

CloudBlob Fails Indexing

When the functions host indexes all the functions available, it fails if the binding type is CloudBlob.  If you know that the triggering blob will be a block or page blob, you can use CloudBlockBlob or CloudPageBlob, respectively.  For code that may be triggered by either, use ICloudBlob instead of CloudBlob.
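
For example, here’s a minimal sketch of an ICloudBlob binding (assuming the v2 runtime; the ProcessBlob name and samples/{name} path are illustrative):

using Microsoft.Azure.WebJobs;
using Microsoft.Extensions.Logging;
using Microsoft.WindowsAzure.Storage.Blob;

public static class BlobFunctions
{
    [FunctionName("ProcessBlob")]
    public static void Run(
        // ICloudBlob indexes correctly whether the trigger is a block or page blob
        [BlobTrigger("samples/{name}", Connection = "AzureWebJobsStorage")] ICloudBlob myBlob,
        string name, ILogger log)
    {
        log.LogInformation($"Blob {name}: {myBlob.Properties.Length} bytes");
    }
}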

Note that the error message is helpful in v2 of Azure Functions.  For the full list of types available for the triggering blob, see Microsoft’s documentation.

Breaking Azure Functions with local.settings.json

Because local.settings.json is finicky!

At some point you’re going to want to add an object to the Values section of local.settings.json. Don’t! Declaring an object (or array) in Values breaks the configuration, resulting in the error “Missing value for AzureWebJobsStorage in local.settings.json.”  Notice the customObj declaration below.

{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "UseDevelopmentStorage=true",
    "AzureWebJobsDashboard": "UseDevelopmentStorage=true",
    "customKey": "customValue",
    "customObj": {
      "customObjKey": "customObjValue"
    }
  }
}

Just having customObj declared will break things.  Yes, it’s valid JSON; no, your functions won’t be able to retrieve settings from Values.

A simple key-value pair works – like customKey above. After removing customObj, retrieving the setting for customKey from Values is straightforward:

var customVal = System.Environment.GetEnvironmentVariable("customKey");
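
If you do need the nested values locally, one workaround is to flatten them into plain key-value pairs. A sketch – the customObj__customObjKey name is illustrative (the double underscore is just a convention borrowed from ASP.NET Core environment variables; any unique key works):

"Values": {
  "AzureWebJobsStorage": "UseDevelopmentStorage=true",
  "customObj__customObjKey": "customObjValue"
}

var customObjVal = System.Environment.GetEnvironmentVariable("customObj__customObjKey");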

Beyond this simple case for local.settings.json, use application settings, and remember that Azure Functions v2 switched to ASP.NET Core configuration and does not support ConfigurationManager.  See Jon Gallant’s post for details.
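
In v2, the ASP.NET Core-style replacement looks roughly like this – a sketch, assuming your function signature includes the usual ExecutionContext parameter (named context here):

using Microsoft.Extensions.Configuration;

// Build configuration from the function app directory; local.settings.json
// exists only locally, so it's marked optional.
var config = new ConfigurationBuilder()
    .SetBasePath(context.FunctionAppDirectory)
    .AddJsonFile("local.settings.json", optional: true, reloadOnChange: true)
    .AddEnvironmentVariables()
    .Build();

var customVal = config["customKey"];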

.NET Core multi-project build change in tasks.json

In earlier versions of .NET Core tooling, building multiple projects was simply a matter of adding them as args in the tasks.json file.

   "tasks": [
        {
            "taskName": "build",
            "command": "dotnet",
            "args": [
                "./J3DI.Domain/",
                "$./J3DI.Infrastructure.EntityFactoryFx/",
                "$./J3DI.Infrastructure.RepositoryFx/",
   ...

Each directory was a child of the location with global.json, and each had its own project.json. This approach worked very well for producing multiple .NET Core libraries and their associated unit tests.

After migrating this code and changing the references to the specific .csproj files, we found that only one arg is allowed for the dotnet build task. If the args array contains more than one item, the build fails with

MSBUILD : error MSB1008: Only one project can be specified.

The fix is to reference the Visual Studio Solution File (.sln) in the args.

    "tasks": [
        {
            "taskName": "build",
            "args": [
                "${workspaceRoot}/J3DI.sln"
            ],
    ...

Good News: There’s still a way to build multiple projects by encapsulating them in a .sln file.

Bad News: Visual Studio required. (IOW, ever tried manually creating or managing a .sln?)

The Beginning of the End of OS’s

Who cares about operating systems anymore? Microsoft’s recent moves toward Linux, along with their emphasis on Azure, should make it clear that OS’s are diminishing in importance.  (cf. Red Hat on Azure, SQL Server on Linux, Bash on Windows)  Breaking with Steve Ballmer’s (misbegotten) approach to Linux, Nadella’s Microsoft realizes that Windows isn’t the center of their universe anymore (and can’t be, considering their inability to convert desktop dominance into mobile devices).

Developer sentiment is another indicator.  Fewer and fewer developers care about the OS.  OS just doesn’t matter as much in a world of Ruby, Python, Node, MEAN, etc.  This trend will accelerate as PaaS providers continue to improve their offerings.

OS’s aren’t going away, but their importance or mind-share is waning broadly.

Is JSON API a REST Anti-Pattern?

JSON API is an anti-pattern of REST (at least partially).  JSON API’s core problem is that it restricts one-third of REST’s fundamental concepts, namely the representation of resources. In the Content Negotiation section of the JSON API spec we learn:

  • Clients must pass Content-Type: application/vnd.api+json in the headers of all requests
  • Clients are not allowed to use any media type parameters
  • Servers must pass Content-Type: application/vnd.api+json in the headers of all responses
  • Servers must reject requests that specify media type parameters in Content-Type (returning 415 – Unsupported Media Type)
  • Servers must reject requests lacking an unadorned Accept header for application/vnd.api+json (returning 406 – Not Acceptable)

In other words, application/vnd.api+json is the only representation allowed.  This restriction may be temporary – the v1 spec indicates these requirements “exist to allow future versions of this specification to use media type parameters for extension negotiation and versioning.”  Will the restrictions be lifted in v1.1, v2.0, v3.0?
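
To make the restriction concrete, here’s a minimal C# sketch of the only request shape a conforming JSON API server will accept (the endpoint URL and document body are illustrative):

using System.Net.Http;
using System.Net.Http.Headers;
using System.Text;
using System.Threading.Tasks;

public static class JsonApiClient
{
    public static async Task<HttpResponseMessage> PostArticleAsync(HttpClient client)
    {
        // The one representation JSON API permits: the exact media type, no parameters
        client.DefaultRequestHeaders.Accept.Clear();
        client.DefaultRequestHeaders.Accept.ParseAdd("application/vnd.api+json");

        var body = new StringContent("{\"data\":{\"type\":\"articles\"}}", Encoding.UTF8);
        // Replace the default "text/plain; charset=utf-8" content type; even a
        // charset parameter obliges a conforming server to respond with 415
        body.Headers.ContentType = new MediaTypeHeaderValue("application/vnd.api+json");

        return await client.PostAsync("https://example.com/articles", body);
    }
}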

So What?

“Ok, so JSON API is overly restrictive on representations.  Big deal.  Why should I care?”  As always “it depends” (typical, right?).  Teams meeting the following criteria may not need to care about this issue:

  • Simple / Single Application – if the application is single purpose and the service is not expected to serve multiple clients or client types
  • JSON Only – if the application is never expected to provide media formats other than JSON
  • Simple Representations – if the application is never expected to provide different representations; IOW, if a single, parameter-free media type will always be sufficient

Outlook.com / Live.com Enabling Spammers?

Microsoft’s Outlook.com email site is all the rage this week.  With a clean, responsive interface, many are hailing it as a symbol of “the new Microsoft.”  Hope springs eternal.

I was disappointed, however, to see this error message after incorrectly typing a password:

Outlook.com's Wrong Password Error Message

Hmmm. Did I miss a change in the security world regarding email address privacy?  Hopefully Microsoft will remedy this situation quickly and use the typical (and more private) approach – “That email address and password do not match our records.”

Keeping SysInternals Up-To-Date

Ed Wilson, Microsoft Scripting Guy, has a good article on TechNet about automatically keeping your local SysInternals files up-to-date.  If you use the SysInternals tools, you know that they are updated fairly frequently – often due to suggestions from outside Microsoft.  If you aren’t using SysInternals, well, you should start.

Scripting Guy’s article is a bit long-winded (in a Spencer F. Katt way), so here’s the quick and dirty for getting started.

  1. Copy the PowerShell script, Get-Sysinternals.ps1, from the TechNet Gallery
  2. Paste the script into your favorite editor and save it to the location where you keep scripts (e.g., %UserProfile%/Scripts)
  3. Open PowerShell or PowerShell ISE as admin (otherwise the script warns: This script requires running as an elevated administrator)
  4. Before running the script, make sure you know exactly where the SysInternals tools are stored (e.g., %ProgramFiles(x86)%/SysInternals).  You’ll provide this path when you run Get-Sysinternals.ps1.  If you don’t provide a path, the script will put the SysInternals tools in %SystemRoot%/SysInternals.  Call me paranoid, but I don’t like making changes within %SystemRoot% if it can be avoided.
  5. Run the script.  For example, I run it like this:

Get-SysInternals "${env:ProgramFiles(x86)}/SysInternals"

Don’t forget the curly braces, or you’ll end up with a path like C:\Program Files(x86)\SysInternals (note the missing space b/t Files and (x86)).
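
The reason: inside a double-quoted string, PowerShell ends the variable name at the opening parenthesis, so the braces are needed to keep (x86) in the name:

# Without braces, only $env:ProgramFiles is expanded; (x86) is literal text
"$env:ProgramFiles(x86)/SysInternals"    # -> C:\Program Files(x86)/SysInternals

# With braces, the full variable name ProgramFiles(x86) is used
"${env:ProgramFiles(x86)}/SysInternals"  # -> C:\Program Files (x86)/SysInternals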

Here’s a screenshot of the output on my machine:

[Screenshot: Get-SysInternals output]

Worth noting:

  • Colorized output is a very nice touch!
    • New apps / utilities are reported in green
    • Updated apps / utilities are reported in yellow
    • Unchanged apps / utilities are reported in white
  • The script appears to rewrite your machine’s PATH environment variable in a different order!  (See the Old Path and New Path sections of the screenshot above.)  I wasn’t expecting that, and I’m not sure I like it.  That’s a pretty aggressive move.

I’m pretty satisfied with manually executing this script occasionally.  I have to admit, though, that automating it is pretty cool.  So, if you want to automate the script, check out Scripting Guy’s article.

Now, if we just had a way of keeping Get-Sysinternals.ps1 up-to-date. :)

GoDaddy! Really, it’s time for you to GO!

A few years ago I got sucked into GoDaddy’s low prices and (seemingly) cohesive capabilities across a wide technology spectrum.  For the better part of a year, however, I’ve been reminded of the maxim, “You get what you pay for.”

I finally gave GoDaddy the boot, and I’m already pleased with the results.  I wish I had enough time to recount all the GoDaddy-induced pain, but I just don’t.  A few quick points:

  • Customer Service was terrible.  Yes, I received the “we promise to respond within 24 hours” email, but I frequently had to reply back (days later) asking, “what’s the status of this issue?”  Of course I’m not privy to GoDaddy’s inner workings, but I am left with the impression that SOP is to do nothing until the customer makes a reminder contact (calls by phone, sends an email, etc.).
  • Charged more for less.  It’s still hard for me to believe, but it’s true.  I hosted 2 WordPress blogs with GoDaddy.  One was measurably faster even though it contained less content and used fewer plug-ins, etc.  Unfortunately, it was also the less important blog.  After I demonstrated (with measured results) that one was slower than the other, GoDaddy said there was nothing they could do about it – but if I were to upgrade to the next package level (Economy to Deluxe, if memory serves), performance would improve substantially.  “Sucker” is now emblazoned on my forehead.  I upgraded and continued to measure the performance of my more important (and now more powerfully hosted) blog.  There was absolutely no change (other than the amount GoDaddy charged me).  I disabled WordPress plug-ins and did everything I could to give GoDaddy’s host server the opportunity to deliver the promised performance.  Nada.
  • WordPress Upgradability – During the tenure of my blogs at GoDaddy, there have been at least a half-dozen upgrades to the WordPress platform.  I can only remember one of the dozen working correctly (2 blogs x half-dozen upgrades = ~dozen total).  GoDaddy’s Customer Support personnel blame WordPress.  But when I talk to other bloggers, their upgrade from 3.0 to 3.1 (or whatever) is flawless.  We take the same steps; their blogs (not on GoDaddy) upgrade; mine don’t.

Have I already written that much?  I really didn’t want to spend much time on this topic, but my fingers just couldn’t hold back. Frustrating!

Ok. Take a breath; think positive thoughts. I’m moving on now.  Good riddance!

Amazon MP3 Uploader, Doesn’t

Everyone’s pretty excited about Amazon’s MP3 Cloud Player with 5 GB of Cloud Drive space.  I downloaded the Android version to my HTC Incredible as soon as I could. The concept is great – buy music from any device (just about) and play it from any device (just about).  Do you really need an iTunes app anymore?

Amazon’s announcement came along with some disappointing and exciting caveats.  Disappointing: If you previously bought MP3 music from Amazon, it doesn’t show up in your Cloud Drive, and there’s no handy option on the Cloud Player site to move it there.  Exciting: You can use the Amazon MP3 Uploader tool to (wait for it…) upload MP3s to your Cloud Drive!  So, you can manually copy previously purchased music (from Amazon or elsewhere) to your Cloud Drive and use it from your various Cloud Players.

Well, in theory anyway.  I’ve tried to upload MP3 files using version 1.0.1 of the Amazon MP3 Uploader, but it doesn’t actually upload any files.  So, Exciting becomes Disappointing!  The application (which is based on Adobe AIR) either hangs or crashes.  In the past 24 hours I’ve experienced 3 hangs and 2 crashes.  The more recent crash dump was:

Problem signature:
  Problem Event Name:       APPCRASH
  Application Name:         Amazon MP3 Uploader.exe
  Application Version:      0.0.0.0
  Application Timestamp:    4ca30b7b
  Fault Module Name:        Adobe AIR.dll
  Fault Module Version:     2.6.0.19120
  Fault Module Timestamp:   4d7a8071
  Exception Code:           c0000005
  Exception Offset:         001fe2db
  OS Version:               6.1.7601.2.1.0.256.4
  Locale ID:                1033
  Additional Information 1: 0a9e
  Additional Information 2: 0a9e372d3b4ad19135b953a78882e789
  Additional Information 3: 0a9e
  Additional Information 4: 0a9e372d3b4ad19135b953a78882e789

Read our privacy statement online:
  http://go.microsoft.com/fwlink/?linkid=104288&clcid=0x0409

If the online privacy statement is not available, please read our privacy statement offline:
  C:\Windows\system32\en-US\erofflps.txt

BTW, I started out trying to upload a bunch of files, but after the first hang I’ve just tried to upload 1 or 2 files.  I’m using Windows 7 32-bit with SP1. I’ve launched the uploader app from both IE9 and Firefox 3.6.16. AIR seems to be up-to-date (see Fault Module Version).