Creating Self-Signed Certificates on Kubernetes

https://cshort.co/37Xreym Welcome to 2020. Creating self-signed TLS certificates is still hard. Five years ago I created a project on GitHub …

Snappy for .NET Core on Linux (AVOID!)

Snaps are easy to install on most Linux distributions, can auto-update, etc. But don’t use Snap to install the .NET Core SDK on Linux. Although Snap allows multiple versions of a package to be installed, only one is active at a time. Why is this a problem?

The specified framework 'Microsoft.NETCore.App', version '2.0.0' was not found.
  - The following frameworks were found:
      3.0.0 at [/usr/share/dotnet/shared/Microsoft.NETCore.App]

When the .NET Core SDK 3.0 snap package is active, down-level builds don’t work – e.g., using the .NET Core 3.0 SDK to build for 2.2, 2.1, etc. Dev organizations of any significant size cannot simply upgrade everything to 3.0 en masse, so everyone’s config must support targeting netcoreapp2.0 with newer tooling.

Can this be remedied with some magical Snap tricks? Maybe. IMO, the easier route is to simply install all the SDKs you need using the official .NET Core instructions.
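
For example, on Ubuntu with Microsoft’s package repository already registered, side-by-side installs look something like this (a sketch – exact package names vary by distro and SDK version):

# Install multiple .NET Core SDKs side by side
sudo apt-get update
sudo apt-get install -y dotnet-sdk-2.2 dotnet-sdk-3.0

# Confirm that the tooling sees every installed SDK
dotnet --list-sdks

With all SDKs visible, a global.json can pin any given repo to the SDK version it needs.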

CloudBlob Fails Function Indexing

When creating a new blob-triggered Azure Function, the default type binding is Stream.

[FunctionName("Function1")]
public static void Run([BlobTrigger("samples-workitems/{name}",
    Connection = "AzureWebJobsStorage")] Stream myBlob, ...)

Several binding types are available, such as CloudBlockBlob, CloudPageBlob, etc. Both of these classes derive from CloudBlob, yet CloudBlob itself cannot be used as a binding type. Although the compiler doesn’t mind, CloudBlob fails at run time during function indexing.

CloudBlob Fails Indexing

When the functions host indexes all the functions available, it fails if the binding type is CloudBlob.  If you know that the triggering blob will be a block or page blob, you can use CloudBlockBlob or CloudPageBlob, respectively.  For code that may be triggered by either, use ICloudBlob instead of CloudBlob.
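
Here’s a minimal sketch of that polymorphic binding; the function name and container path are placeholders:

using Microsoft.Azure.WebJobs;
using Microsoft.Extensions.Logging;
using Microsoft.Azure.Storage.Blob; // or Microsoft.WindowsAzure.Storage.Blob on older SDKs

[FunctionName("ProcessAnyBlob")]
public static void Run(
    [BlobTrigger("samples-workitems/{name}", Connection = "AzureWebJobsStorage")] ICloudBlob blob,
    ILogger log)
{
    // ICloudBlob indexes correctly and binds block and page blobs alike;
    // check BlobType when the distinction matters
    log.LogInformation($"{blob.Name} is a {blob.BlobType}");
}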

Note that the error message is helpful in v2 of Azure Functions.  For the full list of types available for the triggering blob, see Microsoft’s documentation.

Breaking Azure Functions with local.settings.json

Because local.settings.json is finicky!

At some point you’re going to want to add an object to the Values section of local.settings.json. Don’t! Declaring an object (or array) in Values breaks the configuration, resulting in the error “Missing value for AzureWebJobsStorage in local.settings.json.”  Notice the customObj declaration below.

{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "UseDevelopmentStorage=true",
    "AzureWebJobsDashboard": "UseDevelopmentStorage=true",
    "customKey": "customValue",
    "customObj": {
      "customObjKey": "customObjValue"
    }
  }
}

Just having customObj declared breaks things.  Yes, it’s valid JSON; no, your functions won’t be able to retrieve settings from Values.

A simple key-value pair works – like customKey above. After removing customObj, retrieving the setting for customKey from Values is straightforward:

var customVal = System.Environment.GetEnvironmentVariable("customKey");

Beyond this simple case for local.settings.json, use app configs, and remember that Azure Functions v2 switched to ASP.NET Core Configuration and does not support ConfigurationManager.  See Jon Gallant’s post for details.
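
For completeness, here’s a sketch of reading settings the ASP.NET Core way inside a v2 function. The ExecutionContext binding is standard; the function itself is invented for illustration:

using Microsoft.Azure.WebJobs;
using Microsoft.Extensions.Configuration;
using Microsoft.Extensions.Logging;

[FunctionName("ReadConfig")]
public static void Run(
    [TimerTrigger("0 */5 * * * *")] TimerInfo timer,
    ExecutionContext context,
    ILogger log)
{
    // Compose configuration from local.settings.json (local dev)
    // and environment variables (App Settings in Azure)
    var config = new ConfigurationBuilder()
        .SetBasePath(context.FunctionAppDirectory)
        .AddJsonFile("local.settings.json", optional: true, reloadOnChange: true)
        .AddEnvironmentVariables()
        .Build();

    // Read through the JSON file the key is "Values:customKey";
    // the Functions host also surfaces it as the environment variable "customKey"
    var customVal = config["Values:customKey"];
    log.LogInformation($"customKey = {customVal}");
}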

.NET Core multi-project build change in tasks.json

In earlier versions of .NET Core tooling, building multiple projects was simply a matter of adding them as args in the tasks.json file.

   "tasks": [
        {
            "taskName": "build",
            "command": "dotnet",
            "args": [
                "./J3DI.Domain/",
                "$./J3DI.Infrastructure.EntityFactoryFx/",
                "$./J3DI.Infrastructure.RepositoryFx/",
   ...

Each directory was a child of the location with global.json, and each had its own project.json. This approach worked very well for producing multiple .NET Core libraries and their associated unit tests.

After migrating this code and changing the references to the specific .csproj files, we found that only one arg is allowed for the dotnet build task. If the args array contains more than one item, compilation fails with:

MSBUILD : error MSB1008: Only one project can be specified.

The fix is to reference the Visual Studio Solution File (.sln) in the args.

    "tasks": [
        {
            "taskName": "build",
            "args": [
                "${workspaceRoot}/J3DI.sln"
            ],
    ...

Good News: There’s still a way to build multiple projects by encapsulating them in a .sln file.

Bad News: Visual Studio required. (IOW, have you ever tried manually creating or managing a .sln file?)

The Beginning of the End of OS’s

Who cares about operating systems anymore? Microsoft’s recent moves toward Linux, along with their emphasis on Azure, should make it clear that OS’s are diminishing in importance (cf. Red Hat on Azure, SQL Server on Linux, Bash on Windows).  Breaking with Steve Ballmer’s (misbegotten) approach to Linux, Nadella’s Microsoft realizes that Windows isn’t the center of their universe anymore (and can’t be, considering their inability to convert desktop dominance to mobile devices).

Developer sentiment is another indicator.  Fewer and fewer developers care about the OS.  OS just doesn’t matter as much in a world of Ruby, Python, Node, MEAN, etc.  This trend will accelerate as PaaS providers continue to improve their offerings.

OS’s aren’t going away, but their importance or mind-share is waning broadly.

Is JSON API a REST Anti-Pattern?

JSON API is (at least partially) an anti-pattern of REST.  Its core problem is that it restricts one of REST’s fundamental concepts, namely the representation of resources. In the Content Negotiation section of the JSON API spec we learn:

  • Clients must pass Content-Type: application/vnd.api+json in all request headers
  • Clients are not allowed to use any media type parameters
  • Servers must pass Content-Type: application/vnd.api+json in all response headers
  • Servers must reject requests containing media type parameters in Content-Type (return error code 415 – Unsupported Media Type)
  • Servers must reject requests lacking an unadorned Accept header for application/vnd.api+json (return error code 406 – Not Acceptable)
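
To illustrate, here’s a hypothetical exchange with a spec-compliant server (the /articles path is invented for the example). A media type parameter on Accept draws a 406:

GET /articles HTTP/1.1
Accept: application/vnd.api+json; version=2

HTTP/1.1 406 Not Acceptable

And a media type parameter on Content-Type draws a 415:

POST /articles HTTP/1.1
Content-Type: application/vnd.api+json; charset=utf-8

HTTP/1.1 415 Unsupported Media Type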

In other words, application/vnd.api+json is the only representation allowed.  This restriction may be temporary – the v1 spec indicates these requirements “exist to allow future versions of this specification to use media type parameters for extension negotiation and versioning.”  Will the restrictions be lifted in v1.1, v2.0, or v3.0?

So What?

“Ok, so JSON API is overly restrictive on representations.  Big deal.  Why should I care?”  As always “it depends” (typical, right?).  Teams meeting the following criteria may not need to care about this issue:

  • Simple / Single Application – the application is single-purpose; the service is not expected to serve multiple clients or client types
  • JSON Only – the application is never expected to provide media formats other than JSON
  • Simple Representations – the application is never expected to provide different representations; IOW, the single unparameterized representation will always be sufficient

Outlook.com / Live.com Enabling Spammers?

Microsoft’s Outlook.com email site is all the rage this week.  With a clean, responsive interface, many are hailing it as a symbol of “the new Microsoft.”  Hope springs eternal.

I was disappointed, however, to see this error message after incorrectly typing a password:

Outlook.com's Wrong Password Error Message

Hmmm. Did I miss a change in the security world regarding email address privacy?  Hopefully Microsoft will remedy this situation quickly and use the typical (and more private) approach – “That email address and password do not match our records.”


Keeping SysInternals Up-To-Date

Ed Wilson, Microsoft Scripting Guy, has a good article on TechNet about automatically keeping your local SysInternals files up-to-date.  If you use the SysInternals tools, you know that they are updated fairly frequently – often due to suggestions from outside Microsoft.  If you aren’t using SysInternals, well, you should start.

Scripting Guy’s article is a bit long-winded (in a Spencer F. Katt way), so here’s the quick and dirty for getting started.

  1. Copy the PowerShell script, Get-Sysinternals.ps1, from the TechNet Gallery
  2. Paste the script into your favorite editor and save it to the location where you keep scripts (e.g., %UserProfile%/Scripts)
  3. Open PowerShell or PowerShell ISE as admin (otherwise the script warns: “This script requires running as an elevated administrator”)
  4. Before running the script, make sure you know exactly where your SysInternals tools are stored (e.g., %ProgramFiles(x86)%/SysInternals).  You’ll provide this path when you run Get-Sysinternals.ps1.  If you don’t provide a path, the script will put the SysInternals tools in %SystemRoot%/SysInternals.  Call me paranoid, but I don’t like making changes within %SystemRoot% if it can be avoided.
  5. Run the script.  For example, I run it like this:

Get-Sysinternals "${env:ProgramFiles(x86)}/SysInternals"

Don’t forget the curly braces, or you’ll end up with a path like C:\Program Files(x86)\SysInternals (note the missing space between Files and (x86)).
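
A quick PowerShell illustration of why the braces matter:

# Without braces, variable-name parsing stops at '(' so only
# $env:ProgramFiles expands and "(x86)" is appended as literal text
"$env:ProgramFiles(x86)\SysInternals"    # C:\Program Files(x86)\SysInternals

# With braces, the whole name (parentheses included) is the variable name
"${env:ProgramFiles(x86)}\SysInternals"  # C:\Program Files (x86)\SysInternals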


Here’s a screenshot of the output on my machine:

[Screenshot: Get-Sysinternals output, showing colorized results and the Old Path / New Path sections]


Worth noting:

  • Colorized output is a very nice touch!
    • New apps / utilities are reported in green
    • Updated apps / utilities are reported in yellow
    • Unchanged apps / utilities are reported in white
  • The script appears to rewrite your machine’s PATH environment variable in a different order!  (See the Old Path and New Path sections of the screenshot above.)  I wasn’t expecting that, and I’m not sure I like it.  That’s a pretty aggressive move.


I’m pretty satisfied with manually executing this script occasionally, though I have to admit that automating it is pretty cool.  So, if you want to automate the script, check out Scripting Guy’s article.

Now, if we just had a way of keeping Get-Sysinternals.ps1 up-to-date. :)