Sunday, 26 February 2012

NuGet & TFS

The NuGet package manager lets you easily reference "external" assemblies like Moq or Ninject in Visual Studio. But when you upload a Visual Studio project with NuGet references into TFS, the actual assemblies are not uploaded into source control.

Typically they are located in a special folder, by default called "packages". This means that when you get the sources from TFS into another folder and/or onto another machine, you cannot build the solution properly: you're missing dependencies.

So just upload the packages folder into TFS as well via the Team Explorer, you might say. That is indeed a valid solution.

Another approach I found on a blog post (...) is to execute the NuGet command-line executable to retrieve the dependencies:

  1. Navigate to the folder <mySolutionFolder>
  2. Run ...nuget.exe install <myProjectFolder>\packages.config -o Packages

This will look for a file called packages.config, check the dependencies listed there, and download any missing ones into a folder called Packages. This folder is created relative to the location where the command is executed, usually directly beneath <mySolutionFolder>.
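For reference, packages.config is just a small XML file that lists each package a project depends on. A minimal example (the package versions are illustrative):

    <?xml version="1.0" encoding="utf-8"?>
    <packages>
      <package id="Moq" version="4.0.10827" />
      <package id="Ninject" version="2.2.1.4" />
    </packages>

nuget.exe install reads these id/version pairs and downloads the matching packages into the output folder you pass with -o.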

Maybe there are other (better) solutions for working with NuGet? Let me know...


best regards,

Azure scalability - Elastic demand

I have just recently dived into Azure and cloud computing in general. One thing I presumed would come out of the box with Azure was auto-scaling. Elastic supply of computing power is one of the main drivers for the existence of cloud computing in general and Azure in particular.

Sure, you can change the number of instances for a web or worker role by changing the service configuration file through the management portal. This might be fine for some situations, but dynamically adjusting supply to current and/or predicted future demand is not something the portal offers (at least I didn't find it...). There are of course the Windows Azure Service Management CmdLets and Windows Azure Diagnostics, with which you could perhaps build your own solution.
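For reference, that instance count lives in the ServiceConfiguration.cscfg file of your cloud service. Scaling manually boils down to editing the count attribute and uploading the changed configuration through the portal (the service and role names below are made up):

    <?xml version="1.0" encoding="utf-8"?>
    <ServiceConfiguration serviceName="MyCloudService"
        xmlns="http://schemas.microsoft.com/ServiceHosting/2008/10/ServiceConfiguration">
      <Role name="MyWebRole">
        <!-- Raise or lower this number to scale the role out or in -->
        <Instances count="2" />
      </Role>
    </ServiceConfiguration>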

Maybe Microsoft didn't provide auto-scaling functionality out of the box because there could be a conflict of interest? MS might execute any provided auto-scaling logic to its own benefit? Or maybe they want to leave the Azure "eco-system" some breadcrumbs as well, like for example AzureWatch:

AzureWatch dynamically adjusts the number of compute instances dedicated to your Azure application according to real time demand. User-defined rules specify when to scale up or down, so that your application will always have enough computing power to process the workload without unnecessary over-provisioning.
 
To help you build your own solution, the MS patterns & practices group released the Microsoft Enterprise Library Autoscaling Application Block (WASABi), which "... lets you add automatic scaling behavior to your Windows Azure applications. You can choose to host the block in Windows Azure or in an on-premises application. The Autoscaling Application Block can be used without modification; it provides all of the functionality needed to define and monitor autoscaling behavior in a Windows Azure application."
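To give an idea of how WASABi works: you describe the desired behavior declaratively in a rules store, with constraint rules (hard instance boundaries) and reactive rules (responses to metrics). The sketch below is written from memory of the Enterprise Library documentation, so treat the element and attribute names as approximate and check the official docs before using it:

    <rules xmlns="http://schemas.microsoft.com/practices/2011/entlib/autoscaling/rules">
      <constraintRules>
        <!-- Hard boundaries: never fewer than 2 or more than 6 instances -->
        <rule name="Default" enabled="true" rank="1">
          <actions>
            <range min="2" max="6" target="MyWebRole" />
          </actions>
        </rule>
      </constraintRules>
      <reactiveRules>
        <!-- Add an instance when average CPU over 5 minutes exceeds 80% -->
        <rule name="ScaleUpOnHighCpu" enabled="true" rank="10">
          <when>
            <greater operand="CpuAvg5m" than="80" />
          </when>
          <actions>
            <scale target="MyWebRole" by="1" />
          </actions>
        </rule>
      </reactiveRules>
      <operands>
        <performanceCounter alias="CpuAvg5m"
            performanceCounterName="\Processor(_Total)\% Processor Time"
            aggregate="Average" timespan="00:05:00" />
      </operands>
    </rules>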

Azure boot camp material


The Windows Azure Training Kit includes a comprehensive set of technical content to help you learn how to use Windows Azure, including many slide decks. If you prefer a more condensed version of this material, I can recommend taking a look at the presentations from the Azure Boot Camp.



Saturday, 25 February 2012

Azure trivia

Ever wondered what an Azure datacenter looks like on the inside?

Via the following link you can watch a video of a Microsoft datacenter that might host the Azure platform services and your Azure-enabled applications. Many other services may run there as well, for example Bing or Hotmail.

Check out the most recent generation, where servers are stacked in sea containers.

Thanks to Maarten Balliauw for pointing me to this site.