Read our post .NET Core 1.0.1 Migration Workflow, which explains how to use the .NET Core CLI to migrate from the old project.json approach to the new Visual Studio project (a.k.a., MSBuild) approach.
If you’ve been using .NET Core for very long, you have code based on global.json and one or more project.json files. Several months (a year?) ago, Microsoft announced the deprecation of this approach and a return to Visual Studio solution (.sln) and project (.csproj, .fsproj, .vbproj, etc.) files. This transition also involves changing from a directory-based to a file-based orientation. For example, attempting dotnet build with the 1.0.1 toolset yields:
MSBUILD : error MSB1003: Specify a project or solution file. The current working directory does not contain a project or solution file.
Specifying a project.json file doesn’t work either; it yields:
error MSB4025: The project file could not be loaded. Data at the root level is invalid. Line 1, position 1.
After upgrading to the toolset, migrating code involves a few steps:
- Create a VS solution file. The migration process is much easier if you create this file first. Using a shell (cmd, PowerShell, bash, etc.), change to the directory containing the project’s global.json and run dotnet new sln. The result is a (relatively) empty .sln file.
- Migrate project.json files to VS project files. Run dotnet migrate in the same directory as above. This command recursively finds project.json files, converts them to .csproj project files, and adds references to them in the solution file.
- At this point you should be able to restore and build using the .sln. Recall that the dotnet tool is no longer directory oriented; include the .sln file in the restore and build commands, e.g., dotnet build <project>.sln.
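The steps above can be sketched as a short script. This is a minimal, hedged example: "MyApp" is a hypothetical solution name, and the commands assume the final 1.x CLI verbs (`dotnet new sln`, `dotnet migrate`) are available on your PATH.

```shell
# Hypothetical solution name; run from the directory containing global.json.
SLN="MyApp.sln"

# Only attempt the migration if the .NET Core CLI is actually installed.
if command -v dotnet >/dev/null 2>&1; then
  dotnet new sln -n MyApp   # 1. create a (relatively) empty MyApp.sln
  dotnet migrate            # 2. recursively convert project.json files to
                            #    .csproj and add them to the solution
  dotnet restore "$SLN"     # 3. restore, naming the .sln file explicitly
  dotnet build "$SLN"       #    build, naming the .sln file explicitly
fi
```

Note that the last two commands pass the .sln file explicitly, reflecting the shift from directory-based to file-based invocation.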
Here’s a great set of tools for smoothing your transition to developing in .NET Core.
- VSCode – cross platform IDE; great for coding .NET Core
- .NET Portability Analyzer – VS plugin helps you understand portability of your project
- I Can Has .NET Core – generates a GraphViz diagram of your project’s dependencies, showing supported, unsupported, and other dependencies
Yesterday (3/9/11) Microsoft announced the 1.4 update for the Windows Azure SDK and Tools (VSCloudService.exe). As the download page indicates, this SDK release’s primary purpose is to address stability issues. Other improvements include:
- Azure Management Portal – Improved responsiveness
- Azure Connect – Multi-admin support and installation for non-English Windows
- Azure Content Delivery Network (CDN) – Provides for delivery of secure content via HTTPS
Be aware also that the 1.4 SDK installer automatically uninstalls previous versions. So, if you need 1.3 for a while longer, you’ll want to keep a copy of its installer safe.
Download the SDK & tools here.
As I wrote earlier this week, the Silverlight-based Management Portal may be the best feature of Azure’s 1.3 release. It is a handsome UI, Azure components are far easier to access, and (most importantly) the amount of time required for common tasks is sharply reduced.
This post is only partly about the Management Portal, however. The certificate generating code in the Azure Tools for VS 2010 is also involved.
Although it’s nice to be able to create the necessary management certificate from within Visual Studio, the resulting certificate naming is confusing. The following dialog shows an existing certificate and the selection for creating a new certificate.
When this tool creates a certificate, it applies the Friendly Name you entered and sets Issued To and Issued By to Windows Azure Tools. No big deal, right? Right – until you add the certificate via the Azure Portal….
This view of the portal shows the Management Certificates, but you can’t really tell which is which. For example, which of the two certificates corresponds to the one with Friendly Name: Deployment Credentials in the Azure Tools dialog? You really can’t tell unless you are able to distinguish them by their thumbprints or validity dates. Why doesn’t Deployment Credentials appear in one of the fields? Well, let’s take a quick look at the certificate in Certificate Manager (certmgr.msc).
When Azure Tools created the certificate, it set Windows Azure Tools in the Issued To and Issued By fields. The name I provided the tool appears in the Friendly Name field. I’m glad that I can distinguish the certificate in my local store with the friendly name, but it’s only known in my local store. That’s the problem: Friendly Name is not part of the certificate; it’s metadata associated with the certificate, and only locally.
What’s A Better Way?
Instead of using the Azure Tools to create a certificate, use the MakeCert tool. Azure only accepts certain certificates (X.509, 2048-bit key, SHA-1, etc.), so you have to provide a few specific parameters. Here’s a sample command line:
makecert -sky exchange -r -n "CN=<CertName>" -pe -a sha1 -len 2048 -ss My "<CertFileName>"
where CertName specifies the name you want to appear in the Name field under Management Certificates in the management portal, and CertFileName specifies where to store the certificate file on your local drive.
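Here is a concrete instance of that command. The certificate name and output file below are hypothetical placeholders; pick a name you will recognize in the portal. MakeCert ships with the Windows SDK, so the command only runs where that tool is on the PATH.

```shell
CERT_NAME="AzureMgmt-Deployment"   # appears in the portal's Name field
CERT_FILE="azure-mgmt.cer"         # local file you will upload to the portal

# Self-signed, exportable, 2048-bit SHA-1 cert placed in the My (Personal) store.
if command -v makecert >/dev/null 2>&1; then
  makecert -sky exchange -r -n "CN=$CERT_NAME" \
           -pe -a sha1 -len 2048 -ss My "$CERT_FILE"
fi
```

Because the name you choose goes into the certificate's subject (CN), it travels with the certificate to the portal, unlike the local-only Friendly Name.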
Now, when you upload the certificate to the management portal, you can easily distinguish the certificates.
Then, when you Publish from Visual Studio, simply choose the appropriate certificate from the list.
Admittedly, the Friendly Name isn’t set, but you have no trouble distinguishing between certificates in either Visual Studio or Azure’s Management Portal.
You just gotta love the SysInternals team’s responsiveness! Just last week I wrote that I didn’t like how the v14 release of Process Explorer dropped the single view of all the System Information indicators. Yes, it’s nice to have an independent view of each indicator on its own tab, but I still want the consolidated summary view.
Say hello to Process Explorer v14.01! In this release, the team added (revived) the Summary tab to the System Information dialog.
Summary view, it’s good to have you back! You’ll notice that this view is not quite the same as the original (pre-v14.x) dialog. (See prior post for v12 screenshot). The Summary tab really is a summary of the other three tabs: CPU Usage History from the CPU tab; Commit & Physical Memory Histories from the Memory tab; I/O, Network and Disk Bytes History from the I/O tab.
Unfortunately, the Summary view does not offer Show one graph per CPU support, so you only get the aggregated graph. That’s not such a big deal, though. A nice-to-have feature, however, would be the ability to double-click a graph on the Summary tab to navigate to the appropriate detail tab.
We posted some code to CodePlex which smooths OAuth integration for development teams.
If you’re using VirtualBox to host a development environment for Visual Studio 2010, this info may be helpful to you. During debug sessions, you may receive the following exception:
I resolved this problem by disabling VirtualBox’s Nested Paging memory option. You’ll need to shut down the guest OS, open its settings in VirtualBox, and turn off the Nested Paging option.
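The same setting can be changed from the command line with VBoxManage, which ships with VirtualBox. This is a sketch assuming a powered-off VM named "dev-vs2010" (a hypothetical name; substitute your own).

```shell
VM="dev-vs2010"   # hypothetical VM name; the VM must be powered off

if command -v VBoxManage >/dev/null 2>&1; then
  VBoxManage modifyvm "$VM" --nestedpaging off          # disable Nested Paging
  VBoxManage showvminfo "$VM" | grep -i "nested paging" # verify the setting
fi
```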
For reference, my configuration is:
Base OS: Windows 7 x64
Virtualization Host: VirtualBox 3.2.8
Guest OS: Windows Server 2008 R2, Standard
Visual Studio 2010 10.0.30319.1 RTMRel
I tinkered with the first public beta of Microsoft’s new Lightswitch application development tool. Here are a few initial thoughts:
- Lightswitch makes it very easy to access data in pretty rich ways. Business people who are not developers will be able to unlock corporate data in ways that would typically require a developer or significant knowledge of querying tools, etc.
- Lightswitch is easy and approachable, but there’s still some learning curve involved. Layout items such as Vertical Stack, Horizontal Stack, Text and Picture, and so on provide rich user interface capabilities, but how long will it take users to learn to use them properly?
- Data access is very easy — maybe too easy. Just point Lightswitch to an existing data store and go!
- Relationships to data are automatic or easy to assign. The features around these relations are very good — Master-Detail screens are super easy.
- There’s no hard deployment requirement for Lightswitch apps. Yes, it requires Silverlight, etc., but you don’t have to deliver it into an IIS farm, etc.
- OOPS! Businesses must realize that all previous points add up to the potential for major train wrecks with data. For example, data relationship features make it easy to create Master-Detail screens, but cascading deletes are just as easy.
- Businesses should take great care to ensure that all Lightswitch application development runs against TEST data. Far too many companies ignore or skip this Best Practice, but they may suffer seriously if they aren’t careful.
- Protect data at the database, not in the app. Again, this is a best practice for software design, but I wouldn’t expect business people to be in tune with it. DBAs or data owners must protect production data and provide matching test databases (identical schema, data domains, security, etc.).
- Don’t prototype and expand. People will be tempted to take a Lightswitch application’s generated code, turn it into a full-fledged Silverlight project, and then build from there. Don’t! Is it possible? Probably, but it’s also possible to climb the stairs to the top of the Empire State Building. I suspect future versions of Lightswitch will reduce this gap (à la creating OCXs with VB back in the day).
Lastly, I did run into several bugs – not unexpected in an early beta. One example the product team may be interested in: changing the screen layout in non-debug mode (Ctrl+F5 from VS2010) results in
Error: The last change wasn’t successfully persisted. Please shut down the running application. Error: Change cannot be persisted because KittyHawk is not running under debug mode.