CloudBlob Fails Function Indexing

When creating a new blob-triggered Azure Function, the default type binding is Stream.

public static void Run([BlobTrigger("{name}", 
    Connection="AzureWebJobsStorage")]Stream myBlob, ...)

Several other binding types are available, such as CloudBlockBlob and CloudPageBlob.  Both of these classes derive from CloudBlob, yet CloudBlob itself cannot be used as a binding type.  Although the compiler doesn’t mind, CloudBlob will fail at run time during function indexing.

CloudBlob Fails Indexing

When the functions host indexes all the functions available, it fails if the binding type is CloudBlob.  If you know that the triggering blob will be a block or page blob, you can use CloudBlockBlob or CloudPageBlob, respectively.  For code that may be triggered by either, use ICloudBlob instead of CloudBlob.

Note that the error message is helpful in v2 of Azure Functions.  For the full list of types available for the triggering blob, see Microsoft’s documentation.
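As a sketch (container path and class names are hypothetical), a trigger that must handle both block and page blobs can bind to ICloudBlob and branch on the concrete type:

```csharp
using Microsoft.Azure.WebJobs;
using Microsoft.Extensions.Logging;
using Microsoft.WindowsAzure.Storage.Blob;

public static class ProcessAnyBlob
{
    // ICloudBlob indexes successfully and covers block and page blobs alike
    public static void Run(
        [BlobTrigger("{name}", Connection = "AzureWebJobsStorage")] ICloudBlob myBlob,
        string name,
        ILogger log)
    {
        if (myBlob is CloudBlockBlob block)
            log.LogInformation($"{name} is a block blob");
        else if (myBlob is CloudPageBlob page)
            log.LogInformation($"{name} is a page blob");
    }
}
```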

Breaking Azure Functions with local.settings.json

Because local.settings.json is finicky!

At some point you’re going to want to add an object to the Values section of local.settings.json. Don’t! Declaring an object (or array) in Values breaks the configuration, resulting in “Missing value for AzureWebJobsStorage in local.settings.json.”  Notice the customObj declaration below.

  {
    "IsEncrypted": false,
    "Values": {
      "AzureWebJobsStorage": "UseDevelopmentStorage=true",
      "AzureWebJobsDashboard": "UseDevelopmentStorage=true",
      "customKey": "customValue",
      "customObj": {
        "customObjKey": "customObjValue"
      }
    }
  }

Just having customObj declared will break things.  Yes, it’s valid JSON; no, your functions won’t be able to retrieve settings from Values.

A simple key-value pair works – like customKey above. After removing customObj, retrieving the setting for customKey from Values is straightforward:

var customVal = System.Environment.GetEnvironmentVariable("customKey");

Beyond this simple case for local.settings.json, use app configs, and remember that Azure Functions v2 switched to ASP.NET Core Configuration, and does not support ConfigurationManager.  See Jon Gallant’s post for details.
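As a minimal sketch of the v2-style approach (assuming the Microsoft.Extensions.Configuration packages are referenced; class and key names here are illustrative), build an IConfiguration from environment variables – locally, the Functions host surfaces Values entries as environment variables, so the same lookup works:

```csharp
using Microsoft.Extensions.Configuration;

public static class ConfigExample
{
    public static string GetCustomKey()
    {
        // Locally, entries under Values in local.settings.json appear as
        // environment variables, so AddEnvironmentVariables() finds customKey.
        var config = new ConfigurationBuilder()
            .AddEnvironmentVariables()
            .Build();
        return config["customKey"];
    }
}
```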

.NET Core multi-project build change in tasks.json

In earlier versions of .NET Core tooling, building multiple projects was simply a matter of adding them as args in the tasks.json file.

   "tasks": [
       {
           "taskName": "build",
           "command": "dotnet",
           "args": [
               // one project directory per entry (names illustrative)
               "src/LibraryA",
               "test/LibraryA.Tests"
           ]
       }
   ]

Each directory was a child of the location with global.json, and each had its own project.json. This approach worked very well for producing multiple .NET Core libraries and their associated unit tests.

After migrating this code and changing the references to the specific .csproj files, we found that only one arg is allowed for the dotnet build task. If the args array contains more than one item, compilation fails with

MSBUILD : error MSB1008: Only one project can be specified.

The fix is to reference the Visual Studio Solution File (.sln) in the args.


    "tasks": [
        {
            "taskName": "build",
            "command": "dotnet",
            "args": [
                // single arg: the solution file (name illustrative)
                "MySolution.sln"
            ]
        }
    ]

Good News: There’s still a way to build multiple projects by encapsulating them in a .sln file.

Bad News: Visual Studio required. (IOW, ever tried manually creating or managing a .sln?)

The Beginning of the End of OS’s

Who cares about operating systems anymore? Microsoft’s recent moves toward Linux, along with their emphasis on Azure, should make it clear that OS’s are diminishing in importance.  (cf. Red Hat on Azure, SQL Server on Linux, Bash on Windows)  Breaking with Steve Ballmer’s (misbegotten) approach to Linux, Nadella’s Microsoft realizes that Windows isn’t the center of their universe anymore (and can’t be, considering their inability to convert desktop dominance to mobile devices).

Developer sentiment is another indicator.  Fewer and fewer developers care about the OS.  OS just doesn’t matter as much in a world of Ruby, Python, Node, MEAN, etc.  This trend will accelerate as PaaS providers continue to improve their offerings.

OS’s aren’t going away, but their importance or mind-share is waning broadly.

Is JSON API a REST Anti-Pattern?

JSON API is (at least partially) an anti-pattern of REST.  JSON API’s core problem is that it restricts one of the fundamental concepts of REST, namely the representation of resources. In the Content Negotiation section of the JSON API spec we learn:

  • Clients must send Content-Type: application/vnd.api+json with all requests
  • Clients are not allowed to use any media type parameters
  • Servers must send Content-Type: application/vnd.api+json with all responses
  • Servers must reject requests containing media type parameters in Content-Type (returning 415 – Unsupported Media Type)
  • Servers must reject requests lacking an unadorned Accept header for application/vnd.api+json (returning 406 – Not Acceptable)

In other words, application/vnd.api+json is the only representation allowed.  This restriction may be temporary – v1 spec indicates these requirements “exist to allow future versions of this specification to use media type parameters for extension negotiation and versioning.”  Will the restrictions be lifted in v1.1, v2.0, v3.0?
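To illustrate (host, path, and body are hypothetical), the only exchange the spec permits is a bare media type with no parameters on either side:

```http
GET /articles/1 HTTP/1.1
Host: api.example.com
Accept: application/vnd.api+json

HTTP/1.1 200 OK
Content-Type: application/vnd.api+json

{ "data": { "type": "articles", "id": "1" } }
```

Per the rules above, a parameterized request such as Accept: application/vnd.api+json; version=2 must be rejected (406), and a parameterized Content-Type must be rejected (415).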

So What?

“Ok, so JSON API is overly restrictive on representations.  Big deal.  Why should I care?”  As always “it depends” (typical, right?).  Teams meeting the following criteria may not need to care about this issue:

  • Simple / Single Application – if the application is single purpose; service is not expected to serve multiple clients, client types
  • JSON Only – if the application is never expected to provide media formats other than JSON
  • Simple Representations – if the application is never expected to provide different representations; IOW, if media type parameters will always be sufficient

Enabling Spammers?

Microsoft’s email site is all the rage this week.  With a clean, responsive interface, many are hailing it as a symbol of “the new Microsoft.”  Hope springs eternal.

I was disappointed, however, to see the site’s wrong-password error message after incorrectly typing a password.

Hmmm. Did I miss a change in the security world regarding email address privacy?  Hopefully Microsoft will remedy this situation quickly and use the typical (and more private) approach – “That email address and password do not match our records.”


Keeping SysInternals Up-To-Date

Ed Wilson, Microsoft Scripting Guy, has a good article on TechNet about automatically keeping your local SysInternals files up-to-date.  If you use the SysInternals tools, you know that they are updated fairly frequently – often due to suggestions from outside Microsoft.  If you aren’t using SysInternals, well, you should start.

Scripting Guy’s article is a bit long-winded (in a Spencer F. Katt way), so here’s the quick and dirty for getting started.

  1. Copy the PowerShell script, Get-Sysinternals.ps1, from the TechNet Gallery
  2. Paste the script into your favorite editor and save it to the location where you keep scripts (e.g., %UserProfile%/Scripts)
  3. Open PowerShell or PowerShell ISE as admin (otherwise the script provides a warning: “This script requires running as an elevated administrator”)
  4. Before running the script, make sure you know exactly where SysInternals tools are stored (e.g., %ProgramFiles(x86)%/SysInternals).  You’ll provide this path when you run get-sysinternals.ps1.  If you don’t provide a path, the script will put the SysInternals tools in %SystemRoot%/SysInternals.  Call me paranoid, but I don’t like making changes within %SystemRoot% if it can be avoided.
  5. Run the script.  For example, I run the script like this:

Get-SysInternals "${env:ProgramFiles(x86)}/SysInternals"

Don’t forget the curly braces, or you’ll end up with a path like C:\Program Files(x86)\SysInternals (note the missing space between Files and (x86))


Here’s a screenshot of the output on my machine:



Worth noting:

  • Colorized output is a very nice touch!
    • New apps / utilities are reported in green
    • Updated apps / utilities are reported in yellow
    • Unchanged apps / utilities are reported in white
  • The script appears to re-write your machine’s path environment in a different order!  (See the Old Path and New Path sections of the screenshot above) I wasn’t expecting that, and I’m not sure I like it.  That’s a pretty aggressive move.


I’m pretty satisfied with manually executing this script occasionally.  Automating it, I have to admit, is pretty cool, however.  So, if you want to automate the script, check out Scripting Guy’s article.

Now, if we just had a way of keeping Get-Sysinternals.ps1 up-to-date. :-)

GoDaddy! Really, it’s time for you to GO!

A few years ago I got sucked into GoDaddy’s low prices and (seemingly) cohesive capabilities across a wide technology spectrum.  For the better part of a year, however, I’ve been reminded of the maxim, “You get what you pay for.”

I finally gave GoDaddy the boot, and I’m already pleased with the results.  I wish I had enough time to recount all the GoDaddy-induced pain, but I just don’t.  A few quick points:

  • Customer Service was terrible.  Yes, I received the “we promise to respond within 24 hours” email, but I frequently had to reply back (days later) asking, “what’s the status of this issue?”  Of course I’m not privy to GoDaddy’s inner workings, but I am left with the impression that SOP is to do nothing until the customer makes a reminder contact (calls by phone, sends an email, etc.)
  • Charged more for less.  It’s still hard for me to believe, but it’s true.  I hosted 2 WordPress blogs with GoDaddy.  One was measurably faster even though it contained less content and used fewer plug-ins, etc.  Unfortunately, it is also the less important blog.  After demonstrating (with measured results) that one was slower than the other, GoDaddy said there was nothing they could do about it.  If I were to upgrade to the next package level (Economy to Deluxe if memory serves), performance would improve substantially.  “Sucker” is now emblazoned on my forehead.  I continued to measure the performance of my more important (and now more powerfully hosted) blog.  There was absolutely no change (other than the amount GoDaddy charged me).  I disabled WordPress plug-ins and did everything I could to give GoDaddy’s host server the opportunity to deliver the promised performance.  Nada.
  • WordPress Upgradability – During the tenure of my blogs at GoDaddy, there have been at least a half-dozen upgrades to the WordPress platform.  I can only remember one of the dozen working correctly (2 blogs x half-dozen upgrades = ~dozen total).  GoDaddy’s Customer Support personnel blame WordPress.  But when I talk to other bloggers, their upgrade from 3.0 to 3.1 (or whatever) is flawless.  We take the same steps; their blogs (not on GoDaddy) upgrade; mine don’t.


Have I already written that much?  I really didn’t want to spend much time on this topic, but my fingers just couldn’t hold back. Frustrating!

Ok. Take a breath; think positive thoughts. I’m moving on now.  Good riddance!

Amazon MP3 Uploader, Doesn’t

Everyone’s pretty excited about Amazon’s MP3 Cloud Player with 5 GB of Cloud Drive space.  I downloaded the Android version to my HTC Incredible as soon as I could. The concept is great – buy music from any device (just about) and play it from any device (just about).  Do you really need an iTunes app anymore?

Amazon’s announcement came along with some disappointing and exciting caveats.  Disappointing: If you previously bought MP3 music from Amazon, it doesn’t show up in your Cloud Drive, and there’s no handy option on the Cloud Player site to move it there.  Exciting: You can use the Amazon MP3 Uploader tool to (wait for it…) upload MP3s to your Cloud Drive!  So, you can manually copy previously purchased music (from Amazon or elsewhere) to your Cloud Drive and use it from your various Cloud Players.

Well, in theory anyway.  I’ve tried to upload MP3 files using version 1.0.1 of the Amazon MP3 Uploader, but it doesn’t actually upload any files.  So, exciting becomes Disappointing!  The application (which is based on Adobe AIR) either hangs or crashes.  In the past 24 hours I’ve experienced 3 hangs and 2 crashes.  The more recent crash dump was:

  Problem signature:
    Problem Event Name: APPCRASH
    Application Name: Amazon MP3 Uploader.exe
    Application Version:
    Application Timestamp: 4ca30b7b
    Fault Module Name: Adobe AIR.dll
    Fault Module Version:
    Fault Module Timestamp: 4d7a8071
    Exception Code: c0000005
    Exception Offset: 001fe2db
    OS Version: 6.1.7601.
    Locale ID: 1033
    Additional Information 1: 0a9e
    Additional Information 2: 0a9e372d3b4ad19135b953a78882e789
    Additional Information 3: 0a9e
    Additional Information 4: 0a9e372d3b4ad19135b953a78882e789

  Read our privacy statement online:

  If the online privacy statement is not available, please read our privacy statement offline:
    C:\Windows\system32\en-US\erofflps.txt

BTW, I started out trying to upload a bunch of files, but after the first hang I’ve just tried to upload 1 or 2 files.  I’m using Windows 7 32-bit with SP1. I’ve launched the uploader app from both IE9 and Firefox 3.6.16. AIR seems to be up-to-date (see Fault Module Version).

PowerShell Script Gets Stuck in Error State?

I wrote a PowerShell script to track the response times of a few web sites.  I know there are commercial products that crawl a site at given times and report timing details, missing links, etc., etc., etc.  But, I’m just trying to demonstrate to my hosting provider that one of my sites is consistently slower than other sites.  Ironically, the site which is consistently slower is a higher-grade hosting plan.  Yep, I pay more for the slower site than I do the other sites.  I need this site to be faster than the others, so I don’t want to just lower the plan.

Ok, enough background.  The script simply walks a small array of URLs, creates a WebRequest with caching turned off (NoCacheNoStore), executes the request, and calculates the amount of time for the response to return (a LastByte measurement).  After timing the response for each URL in the array, the script sleeps for a random amount of time (between 2 minutes and 4 hours).

The problem is that eventually one of the sites will timeout.  (Of course this is one of the primary reasons I pay a company to host for me – my customers and visitors shouldn’t experience timeouts, slow responses, etc.  But I digress.)  Once a WebException with the message “The operation has timed-out” occurs, every future WebRequest experiences the same exception.  At first I thought the script must not be clearing $error appropriately, but it is clearing $error between requests (at least twice actually).  I don’t think this exception should occur with each request.  Some of my rationale includes:

  1. $error.Clear() is called before each WebRequest is created (see the first line in the foreach loop of the Main function)
  2. The script does not attempt to re-use WebRequest; it creates a new instance each time (see the first few lines of the GetResponse function)
  3. Neither does it attempt to reuse WebResponse; it sets the variable to $null prior to executing WebRequest.GetResponse(). (see line about mid-way into the GetResponse function)

Another important point is that the PowerShell environment becomes corrupted.  If I stop the script (ctrl-c) and restart it, the time-out errors will continue.  If I close the PowerShell environment (shell or ISE), restart the environment and then restart the script, the errors will not occur (at least not until a “real” timeout occurs).

If you’re interested to educate me on the finer points of PowerShell, using WebRequest in PoSh, etc., the full script is below.  Post a comment if you have ideas or a solution.

##################################################
# Script: Track-WebResponse.ps1
# Created by: J Burnett
# History: Feb, 2011 - Created
##################################################

function GetResponse($uri)
{
    # Create WebRequest; force it to bypass cache
    $request = $null # TODO: does this help eliminate the repetitious "Operation timed out" errors
    $request = [net.WebRequest]::Create($uri)
    try {
        $request.CachePolicy = new-object System.Net.Cache.HttpRequestCachePolicy([net.cache.HttpRequestCacheLevel]::NoCacheNoStore)
    }
    catch {
        $response = "ERROR: Unable to set CachePolicy: $error[0]"
    }

    [net.WebResponse] $response = $null

    $requestStartTime = get-date
    try {
        $response = $request.GetResponse()
    }
    # Don't catch anything, just get the end time. Exceptions will be handled by the caller via $error
    finally {
        $endTime = get-date
    }

    return $response, $requestStartTime, $endTime
}


####################
function DeclareCustomTypes()
{
    $classDef = @"
public class ResponseResults
{
    public string RequestUrl;
    // Response timing data
    public System.DateTime RequestStartTime;
    public System.DateTime RequestEndTime;
    public System.TimeSpan ResponseTime;
    // Data from HttpWebResponse
    public int ContentLength;
    public string ContentType;
    public bool IsFromCache;
    // Error and other messages
    public string Message;
    // public System.Net.HttpWebResponse Response;
}
"@
    Add-Type -TypeDefinition $classDef
}


####################
Function BuildObject {
    param ($urlReqT, $respT, $requestStartTimeT, $endTimeT, $contentLength, $msg)

    $respResults = new-object ResponseResults
    $respResults.RequestUrl = $urlReqT
    # Capture request / response timing data
    $respResults.RequestStartTime = $requestStartTimeT
    $respResults.RequestEndTime = $endTimeT
    $respResults.ResponseTime = (([DateTime]$endTimeT) - $requestStartTimeT)

    # Capture data from HttpWebResponse
    $respResults.ContentLength = $respT.ContentLength
    $respResults.ContentType = $respT.ContentType
    $respResults.IsFromCache = $respT.IsFromCache

    $respResults.Message = $msg

    return $respResults
}


####################
Function TimeRequest($uri)
{
    $error.Clear()
    $resp, $requestStartTime, $endTime = GetResponse($uri)
    $msg = "Response from $uri took " + ($endTime - $requestStartTime)

    $contentLength = 0
    if ($null -ne $resp) {
        $contentLength = $resp.ContentLength
        $msg += " [ContentLength: $contentLength]"
    }
    else {
        $msg = "ERROR: $error $msg"
    }

    # TODO: $resp is never closed; an unclosed WebResponse holds its pooled
    # connection and could cause subsequent requests to time out
    $forXml = BuildObject $uri $resp $requestStartTime $endTime $contentLength $msg
    $timeForFileName = $requestStartTime.ToString("")
    ($forXml | ConvertTo-Xml).Save("./ResponseResults-$timeForFileName.xml")

    return $msg
}


####################
Function Main()
{
    DeclareCustomTypes

    $rgUri = @("", "", "") # "", "")
    while ($true) {
        foreach ($uri in $rgUri) {
            $error.clear()
            $resultMsg = TimeRequest($uri)
            write-host $resultMsg
        }

        # Sleep for 2 minutes to 4 hours
        $sleepSecs = get-random -min (2*60) -max (4*60*60)
        write-host "Sleeping for $sleepSecs seconds... `n"
        start-sleep -seconds $sleepSecs
    }
}

####################
# Globals
$requestStartTime = get-date

####################
# Main Entry point
Main