Perils of Async: Introduction

As application communications over lossy networks and “in the cloud” have grown, the necessity of performing these communications asynchronously has risen with them. Why this change has been occurring may be an interesting topic for another post, but a few simple cases demonstrate the point:

  • Web browsers make multiple, asynchronous HTTP calls per page requested. Procuring a page’s images, for example, has been an asynchronous (“out-of-band”) operation for at least a decade.
  • Many dynamic websites depend on various technologies’ (AJAX, JavaScript, jQuery, etc.) asynchronous capabilities – that’s what makes the site “dynamic.”
  • Similarly, most desktop and mobile applications use technologies to communicate asynchronously.

Previously, developing asynchronous software – whether inter-process, multi-threaded, etc. – required very talented software developers. (As you’ll see soon enough, it still does.) Many companies and other groups have put forward tools, languages, methodologies, etc. to make asynchronous development more approachable (i.e., easier for less sophisticated developers).

Everyone involved in software development – developers, managers, business leaders, quality assurance, and so on – needs to be aware, however, that these “tools” have a down-side. Keep this maxim in mind: Things that make asynchronous software development easier also make bad results (bugs!) easier. For example, all software involving some form of asynchronicity:

  • Not only has bugs (as all software does), but those bugs are much, much more difficult to track down and fix
  • Exhibits higher degrees of hardware-based flux. Consider, for example, a new mobile app that is stable and runs well on a device using a Qualcomm Snapdragon S1 or S2 (single-core) processor. Will the same app run just as well on a similar device using (dual-core) Snapdragon S3 or above? Don’t count on it – certainly don’t bet your business on it!

This series of posts, Perils of Async, aims to discuss many of the powerful .NET capabilities for asynchronous and parallel programming, and to help you avoid their perilous side!

Best Practice for Endorsing on LinkedIn

“And now for something completely different!” Yes, this is a bit off-beat for us, but we think you’ll be glad to learn a better way to endorse people on LinkedIn.

Recently, LinkedIn has been more aggressively soliciting endorsements for people in your network.  You are presented with four people from your network along with just one skill per person.


You have the option of endorsing all of them at once, or one person at a time. Regardless of which path you take, you are only able to endorse one skill per person.  We want endorsements on LinkedIn to be meaningful, so we prefer to endorse multiple skills for one person at a time.  Here’s what we do…

First, go to the person’s profile page.  From the four-person endorsement grid, you can right-click their picture and open it in a new browser tab or window. Alternatively, you can search for them or find them in your network in other ways.

Once you are on the person’s profile page, simply use your mouse to hover over the drop-down indicator to the right of the Send a message button.


Hovering will cause the drop-down menu to appear, from which you will select Endorse skills & expertise.  LinkedIn then adds an endorsement area to the top of the person’s profile page.


Within the endorsement area, you can add skills you want to endorse, remove skills you do not want to endorse, etc.  After completing the set of skills you want to endorse for that person, click the Endorse button. 

By the way, the person you have endorsed can remove your endorsement if they disagree with it for any reason. So we think it’s worthwhile to add skills you believe the person demonstrates.

azureQuery vs. Azure SDK for Node


One of our recent projects involved using JavaScript to access Windows Azure data and features.  When considering the overall design, we discussed client- or server-side execution models (where the “meat” of the code will execute).  In this post we hope to expose what we learned in this process to others.  Although quite a few JavaScript libraries exist for accessing parts of Azure, the two we’ll analyze here are azureQuery and Azure SDK for Node.

First, a little context about each library:

                   azureQuery                  Azure SDK
Publisher          David Pallmann – Neudesic   Windows Azure – Microsoft
URL                azureQuery                  Windows Azure Node.js Developer Center
Code URL           azureQuery on CodePlex      azure-sdk-for-node on GitHub
Initial Release    July, 2012                  September, 2011


Next, some characteristics of the libraries:

                                      azureQuery             Azure SDK
Execution Locale                      Client-side (browser)  Server-side (node)
Fluent (chaining) language support?   Yes                    No
Blob Storage Support?                 Yes                    Yes
Table Storage Support?                *Not Yet               Yes
Queue Storage Support?                *Not Yet               Yes
Service Bus Support?                  ^No                    Yes
Identity & Access Control?            No                     No

* As of 9/12/12, azureQuery only provides access to Windows Azure Blob Storage.
^ We are not clear whether azureQuery plans to support Service Bus integration.


The table above highlights that, in its current state, azureQuery is very limited in its support of Azure features.  Actually, that’s to be expected. azureQuery was first published in late July, 2012; Azure SDK for Node was 10 months old at that point. We expect azureQuery will deliver support for more areas of Azure, especially as the level of developer contribution improves (David Pallmann has a full-time job, after all!).


Which should you use?

So, which of these libraries should you use for projects now?  If you’re thinking, “That’s not even the right question!” you are right!  Decisions regarding which code runs client-side or server-side have a great deal more to do with application requirements, scale expectations, data change rates, etc.

However, it is pretty clear at this point that azureQuery is still in its infancy.  If your goal is to rapidly deliver a solution using Windows Azure (beyond Blobs), then you should use Azure SDK for Node.  This decision will change as azureQuery fulfills its (assumed) mission. If your solution demands client-side execution (e.g., rich visualization of changing data), then we encourage you to invest in azureQuery and contribute to its advancement.

How To Use Mocha for Node Testing in Windows

NOTE: these instructions are outdated.  Using Mocha on Windows is so much easier now!  Skip this post and head straight over to the more current post.

Even with some advice from other sites, we had trouble getting Mocha testing to work well in Windows.  Well, we could get tests to run, but the process was inefficient and error-prone from a developer perspective.  Our solutions are cross-platform, so we wanted to use a consistent mechanism for Windows and Linux (and assume Linux covers us well on Mac OS X).

Most of the suggestions from other sites worked, so we’ll just walk through them quickly.  But getting test execution via makefiles was a headache.  Aside: The “Unix Way” uses makefiles for almost everything.  The Unix Way for a developer to execute Mocha tests is to simply type ‘make test’.  Without getting into any philosophical or pragmatic rationales, this is the best way we’ve found for cross-platform Mocha testing.

Setting Up Your Environment

We’ll assume that you already have Node configured and working.  The first step is to get Mocha installed and configured. As with most Node packages, installing Mocha is as easy as

npm install mocha

Once installed, Mocha shows up in the list of packages

Mocha in Node Packages List

Now you’re ready to create tests.  Just follow the instructions, 1. 2. 3. Mocha!, on Mocha’s GitHub site.  Their instructions are written for a Unix-like environment (e.g., “$EDITOR test/test.js”), but the translation is simple:

  1. Create a test directory
  2. Use your favorite editor to copy the provided JavaScript into a test\test.js file
  3. Run Mocha

Oops! Step 3 doesn’t work on Windows like they expect it will on Linux.  Yes, you could add Mocha to PATH, but that’s not the best solution – especially since we aim to use make to run the tests.  So we’ll just skip step 3 and get help from Alex Young’s excellent post, Testing With Mocha.  You can skip the first bit about creating a package.json file and using npm to install Mocha.  (Eventually you’re going to want to leverage the power of the Node Package Manager!)

Just copy Alex’s simple, 3-line makefile to your directory (the parent of the test directory where you stored test.js).   If you were using a “typical” unix developer environment, you could simply run the tests in test.js by using make:

make test
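For reference, the makefile in question is roughly of this shape (our approximation, not Alex’s verbatim copy – the mocha path and reporter flag are illustrative):

```makefile
test:
	@./node_modules/.bin/mocha --reporter spec

.PHONY: test
```

Note that the line under the test: target must begin with a real tab character, not spaces.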

In most Windows developer environments, however, make either doesn’t exist or isn’t going to work correctly.  Others have written, for example, that Visual Studio’s nmake.exe isn’t a good proxy for make in this case.  As you search for how to remedy this situation, you’re likely to come across the question “Mocha requires make. Can’t find a make.exe that works on Windows.”  The elements of that post that proved to be helpful are:

  1. Install Cygwin
  2. Alias make to Cygwin’s make.exe ([install location]\cygwin\bin\make.exe)
  3. (Optional) Use the makefile template from Richard Turner’s response

Now you should be able to make test and see the test run, right?  Well, not so fast, professor.  Cygwin’s make.exe conforms to some White Space Persnicketiness Specification.  We thought surely we were ready to execute our tests, but kept suffering make errors about “missing separator.”

Make missing separator errors

Well, sir, that’s what we call a helpless error.  It’s an error all right, but it sure isn’t helpful.  It turns out, however, that had we been steeped in unix-land makefiles, we might know that white space matters.  That is to say that make treats a series of spaces differently than tabs.  If your favorite editor is converting tabs to spaces, then make complains of “missing separator.”  [UPDATE: some have suggested that GNU’s Make for Windows handles tabs or spaces.]

Ok, we’re almost there.  You’ll need to configure your editor to keep tabs rather than convert them to spaces.  Ideally, your editor will also support writing files with unix-style line endings, LF (\n) rather than CRLF (\r\n) typically used on Windows.  Below is an example of how to configure Notepad++.

Notepad++ Configuration to Keep Make Happy

Since make doesn’t like tabs to be converted to spaces, we change our Notepad++ settings to save the tabs rather than convert them.  Visual Studio and other developer-oriented text editors have similar settings.  In Notepad++, select Preferences from the Settings menu, and then the dialog tab for Language Menu/Tab Settings. Select Makefile in the Tab Settings listbox on the right; turn Use default value and Replace by space off.  After you save these settings, you’ll also need to replace the spaces in your makefile with tabs.  For example, we replaced the beginning spaces on Line 2 below with a beginning tab.  Just save your changes, and you should be able to make test successfully.

Notepad++ Makefile Tab Settings
If you’re also interested in configuring Notepad++ to use unix-style line endings, select Unix Format from the EOL Conversion menu item of the Edit menu.

Notepad++ End of Line Configuration


Whew! That wraps it up.  Now your developers should be able to use make test in Windows, Linux and Mac environments.  One less difference to remember between platforms means one more boost to developer productivity.

We hope this is helpful to you.  Feel free to leave comments if this works for you, if it doesn’t, etc.

Enabling Spammers?

Microsoft’s email site is all the rage this week.  With a clean, responsive interface, many are hailing it as a symbol of “the new Microsoft.”  Hope springs eternal.

I was disappointed, however, to see this error message after incorrectly typing a password:

Wrong Password Error Message

Hmmm. Did I miss a change in the security world regarding email address privacy?  Hopefully Microsoft will remedy this situation quickly and use the typical (and more private) approach – “That email address and password do not match our records.”


Windows 8 Installation Fails in VirtualBox

We wanted to tinker with the Windows 8 Release Preview, so we tried to install it on our tinkering box.  Having two tinkering boxes, we started with the trashable one — a Dell Dimension from mid-2005.  Admittedly, we attempted this installation to see if “it will set the drive on fire.”  We tend to throw crazy things at this Dimension, but it keeps on tickin’! Its specs are:

  • Hardware:
    • Dell Dimension 4700
    • CPU: Pentium 4 @ 3.0 GHz
    • RAM: 4 GB (physical)
    • Disk: 1 TB; > 750 GB free space
  • Software:
    • Arch Linux; Kernel: 3.4.6-1-ARCH #1 SMP PREEMPT
    • VirtualBox 4.1.18_OSE r78361

We created a new VM and hooked it up to the 32-bit ISO for Windows 8 Release Preview.  After starting the VM, VBox presents an error dialog stating:

VT-x/AMD-V hardware acceleration is not available on your system. Certain guests (e.g. OS/2 and QNX) require this feature and will fail to boot without it.

This dialog gives the user a choice of closing the VM or continuing.  It’s the tinker box, so we chose to continue!  Next the Windows logo appeared, so we got excited that this just might work.  But, after just a couple of minutes, the installer died and gave this message:

Your PC needs to restart.
Please hold down the power button.
Error Code: 0x0000005D

Well, that’s clearly a bad result.  We had no interest in even attempting to push past this kind of problem.  Instead, we’ll install Windows 8 on our other tinkering server (Windows 2008 R2, Hyper-V).

After some discussion, we decided that this configuration probably should not work.  So, chalk this up to “Yep, we have demonstrated that what should not work actually does not work.”  Hopefully no one else will be tempted to try Win8 on this kind of config, but maybe this will save them some time if they do.

Windows Azure Management Portal in Firefox, Moonlight on Linux


We have Arch Linux running on a 7-year-old Dell desktop. It’s an oldie, but a goodie.  The combination of Arch with LXDE makes for a good administrative machine – email, web browsing, bittorrents, etc.  We had tried using the Silverlight-based Windows Azure Management Portal on this machine – using Mono’s Moonlight as the Silverlight for Linux – but found enough hiccups that we stopped wasting our time.  When Windows Azure began offering its HTML5-based management portal, our interest in managing our Azure systems from Linux was renewed.  Here’s a brief review of our experience:

Using Firefox 13.0.1 on Arch Linux, we opened the Windows Azure Management Portal.  After signing in, we were left on what appeared to be a blank page.  On right-clicking the page, we learned that it was actually trying to use Silverlight, and the Moonlight implementation didn’t seem to be rendering correctly.  We wondered why we hadn’t been given a choice between Silverlight and HTML5 – we seem to remember being offered that choice in IE on Windows 7+.

We uninstalled Moonlight in hopes that the portal’s page code would opt for HTML5 when no Silverlight support was detected. Unfortunately, the portal’s entry page just showed the familiar “To view this content, please install Silverlight….”

Disappointed, again.  The management portal doesn’t seem to detect the lack of Silverlight support and redirect to the HTML5 version.  The user is not presented a choice of which to use. And either the Moonlight implementation or the portal’s Silverlight implementation doesn’t work correctly.

UPDATE: After tweeting that the portal wasn’t working in our config, we quickly received a response from @ScottGu saying that we need to use a different URL for the HTML5 portal. (Whether the tweet came from the real Scott Guthrie or a ghost tweeter, we don’t know). We were immediately pleased to find that the HTML5 portal worked very well in our non-Microsoft config! Kudos to Microsoft and the Windows Azure team for delivering cross-platform, cross-browser management tools – well done!

UPDATE 2: The portal link/button on the main Windows Azure site navigates to the Silverlight-based portal (which requires Silverlight).  If you want to use the HTML5-based management portal, be sure to open its URL directly.

A PowerShell Script to Assist WCF Service Hosting / Testing

When developing for WCF, I find situations in which I need to manually host the WCF services.  (“Manually” == “not from within Visual Studio”).  Sometimes I have to go back and forth between different services, etc.  The command I really wanted was to be able to “just host from here.”  So, I created a simple PowerShell script that does just that.

WCF-HostMe starts in the current directory and looks for a service to host.  It simply looks for a config file (matching *.[exe|dll].config) and assumes that the config file’s name-matching exe or dll is the service.  After formatting and building the parameter values, it launches the service using WcfSvcHost.exe.

Again, this is a simplistic approach and really only works well for development and testing purposes.  Obviously this is not a good approach for production environments.


###### WCF-HostMe.ps1
#    Hosts a WCF service using WCF Service Host (WcfSvcHost.exe) for testing purposes.
#    How it works:
#        Beginning in the current dir, recursively searches for *.exe.config or *.dll.config
#        Assuming the .config file is associated with the WCF service, launches WcfSvcHost
#            using the service's and config's paths
#    History:
#        3/16/2012, J Burnett    Created

# Find .config file
# TODO: handle multiple search results
# TODO: detect assembly is not hostable? (not WCF, WF)
$configPath = (gci . -Include *.exe.config, *.dll.config -Recurse).VersionInfo.FileName

# Build WcfSvcHost param inputs - the full paths of service & config
# (escape the regex dot and anchor at the end so only the trailing ".config" is stripped)
$servicePath = $configPath -replace '\.config$', ''
$serviceArg = " /service:""$servicePath"" "
$configArg  = " /config:""$configPath"" "

# Launch WcfSvcHost to host the service
echo ''
echo "Attempting to host $servicePath as a WCF Service..."
Start-Process WcfSvcHost.exe -ArgumentList ($serviceArg + $configArg)
echo ''

### Copyright & Disclaimer
# This software is provided "as is"; there are no warranties of any kind.  This software
# may not work correctly and/or reliably in some environments. In no event shall
# AltaModa Technologies, LLC, its personnel, associates or contributors be liable for
# ANY damages resulting from the use of this software.

Azure: Flex on IaaS, Keep PaaS Pure

I recently commented on Mary Jo Foley’s post Can Microsoft Save Windows Azure?  The key point of my post was that IaaS is good for Azure because it is good for adoption rate.

This change does raise concerns for software developers, however. On the topic of increasing IaaS in Azure, Mary Jo wrote:

This means that Microsoft will be, effectively, following in rival Amazon’s footsteps and adding more Infrastructure as a Service components to a platform that Microsoft has been touting as pure PaaS. [highlight added]

For software architects and developers, Microsoft’s PaaS approach with Azure has been a boon. Many non-developers would be surprised to know how much “infrastructure impact” creeps into system architectures and implementations. The (historically) pure PaaS Azure, however, provides us with the ability to implement highly available, scalable and performant systems with almost no infrastructure concerns.

Many consider Amazon to be the top cloud computing provider.  Amazon Web Services (AWS) provides a very good set of building blocks which (increasingly) work well together. From a developer’s perspective, however, these building blocks require too much “IaaS overhead.”   Determining which building blocks will be needed is the first hurdle. But then come the “which to use in what?” hurdles. Which distro and version of one of the Linuxes, which database type (SQL, NoSQL), how will the various systems communicate with each other, etc., etc., etc.

With Windows Azure, however, a developer can implement code on a developer workstation (using the Compute and Storage emulators).  When ready, the developer can deploy code to Azure directly from Visual Studio.  Relative to just about every other cloud provider, Azure developers start out much further down the road (= saves a lot of time).


In PaaS, Purity Matters

So how does Azure’s IaaS push impact developers?  Developers will be negatively impacted by letting IaaS pull in PaaS impurity.  In other words, if Microsoft muddies the existing Azure development platform (code interfaces), the level of “IaaS Overhead” increases.  As the level of IaaS Overhead increases, the time and cost benefits of PaaS erode.

Consider the Win32 days: Microsoft’s progressive push toward a standard set of APIs for Windows created a competitive advantage.  Although there were gaps (e.g., APIs supported on Windows NT, but not Windows 95), Win32 was far more pure than any other API at the time.  Companies that developed software for the myriad Unix systems encountered far more platform-related costs than companies developing for Windows.  If Microsoft had tried to support Sun, IBM, HP and other Unix libraries, the advantages of Win32 would have vanished.

Hopefully Microsoft plans to keep the PaaS side of Azure pure, while letting the IaaS side flex.  Otherwise, they will undermine one of their most significant competitive advantages – a pure PaaS.

Periodic Table of Cloud Computing

David Pallmann, General Manager at Neudesic, has published one of the best explanations of Cloud Computing in Windows Azure Design Patterns.  The title is a double misnomer – it isn’t just about Azure, and it is far more than Design Patterns.

Pallmann’s Periodic Table of Cloud Patterns is one of the best tools for visually capturing the various components and facets of Cloud Computing.  He uses these elements very effectively to touch on Infrastructure-as-a-Service (IaaS), Platform-as-a-Service (PaaS) and Software-as-a-Service (SaaS) before delving into Windows Azure’s PaaS offerings.

Although the slides cover Windows Azure, Pallmann’s points are easily abstracted to Cloud Computing in general.  And he provides a very solid foundation for digging into areas such as:


  • Claims-based Security
  • Service Bus (resilient queue)
  • Storage – Blobs, Tables, Queues
  • Database – SQL Azure
  • Compute Instance Types


Overall an excellent presentation for Cloud Computing and well worth reviewing!