Application Development Chronicles

Tuesday 13 March 2012

How to use an A-drive on an Azure-hosted virtual machine with non-genuine Windows on it ...?

When you publish a web/worker role to Azure and request that Remote Desktop be activated, you can afterwards connect to your VM via Remote Desktop.

You can browse through your drives ...


I still wonder how I can use the floppy drive, though :-). Also notice the remark in the bottom-right corner: "This copy of Windows is not genuine".

Best regards,


PS: According to Maarten Balliauw, this is because some Windows services on the VM that are responsible for checking this compliance have been disabled.


Sunday 26 February 2012

NuGet & TFS

The NuGet package manager lets you easily reference "external" assemblies like Moq or Ninject in Visual Studio. But when you check the Visual Studio project with the NuGet references into TFS, the actual assemblies are not added to source control.

Typically they are located in a special folder, by default called "packages". This means that when you get the sources from TFS onto another folder and/or machine, you cannot build the solution properly: you're missing the dependencies.

So check the packages folder into TFS as well via Team Explorer, you might say. That is indeed a valid solution.

Another approach I found in a blog post (...) is to use the NuGet command-line executable to retrieve the dependencies:

  1. Navigate to folder <mySolutionFolder>
  2. ...nuget.exe install <myProjectFolder>\packages.config -o Packages

This reads the packages.config file, checks the dependencies and downloads the missing ones into a folder called Packages. That folder is created relative to the directory from which you initiate the command, usually directly beneath <mySolutionFolder>.
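
To avoid having to run this by hand on every machine, you could also hook the command into the build, for example as a pre-build event on each project. A minimal sketch, assuming you keep nuget.exe in a Tools folder beneath the solution (that location is my own assumption):

  REM restore this project's NuGet dependencies into <mySolutionFolder>\Packages before building
  "$(SolutionDir)Tools\nuget.exe" install "$(ProjectDir)packages.config" -o "$(SolutionDir)Packages"

That way nobody has to remember to run the restore step manually after getting the latest sources from TFS.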

Maybe there are other (better) solutions for working with NuGet? Let me know...


Best regards,

Azure scalability - Elastic demand

I have only recently dived into Azure and cloud computing in general. One of the things I presumed would come out of the box with Azure was auto-scaling. Elastic supply of computing power is one of the main drivers for the existence of cloud computing in general and Azure in particular.

Sure, you can change the number of instances for a web or worker role by changing the service configuration file through the management portal. This might be fine for some situations, but dynamically adapting supply to current and/or predicted future demand is not in the portal (at least I didn't find it ...). There are of course the Windows Azure Service Management Cmdlets and Windows Azure Diagnostics with which you could perhaps build your own solution.
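
Just to make that idea concrete, below is a minimal sketch of what such a home-grown solution could look like: a process that periodically inspects some load indicator and adjusts the instance count. The names SimpleAutoscaler, GetCurrentLoad, GetInstanceCount and SetInstanceCount are hypothetical placeholders of my own; in a real implementation they would have to be built on top of Windows Azure Diagnostics and the Service Management API (or the Cmdlets).

  using System;
  using System.Threading;

  // Hypothetical sketch of a home-grown auto-scaler; not a working implementation.
  public class SimpleAutoscaler
  {
      private const int MinInstances = 2;
      private const int MaxInstances = 8;
      private const int LoadPerInstance = 100; // assumed capacity of one instance

      public void Run()
      {
          while (true)
          {
              int load = GetCurrentLoad();                  // e.g. a queue length or a perf counter
              int current = GetInstanceCount("MyWebRole");  // read from the service configuration

              int desired = Math.Min(MaxInstances,
                            Math.Max(MinInstances, (load / LoadPerInstance) + 1));

              if (desired != current)
              {
                  // Would upload a changed ServiceConfiguration via the Service Management API.
                  SetInstanceCount("MyWebRole", desired);
              }

              Thread.Sleep(TimeSpan.FromMinutes(5));
          }
      }

      // Placeholders to be implemented with Windows Azure Diagnostics and
      // the Service Management API / Cmdlets.
      private int GetCurrentLoad() { throw new NotImplementedException(); }
      private int GetInstanceCount(string roleName) { throw new NotImplementedException(); }
      private void SetInstanceCount(string roleName, int count) { throw new NotImplementedException(); }
  }

Commercial offerings and the PnP autoscaling block mentioned below essentially package this kind of loop, plus the rule evaluation and throttling logic, for you.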

Maybe Microsoft didn't provide auto-scaling functionality out of the box because there could be a conflict of interest: MS might execute any provided auto-scaling logic to its own benefit? Or maybe they want to give the Azure "eco-system" some breadcrumbs as well, like for example AzureWatch:

AzureWatch dynamically adjusts the number of compute instances dedicated to your Azure application according to real time demand. User-defined rules specify when to scale up or down, so that your application will always have enough computing power to process the workload without unnecessary over-provisioning.
 
To help you build your own solution, the Microsoft patterns & practices group released the Microsoft Enterprise Library Autoscaling Application Block (WASABi), which according to the documentation "... lets you add automatic scaling behavior to your Windows Azure applications. You can choose to host the block in Windows Azure or in an on-premises application. The Autoscaling Application Block can be used without modification; it provides all of the functionality needed to define and monitor autoscaling behavior in a Windows Azure application."
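
For completeness: if I read the WASABi documentation correctly, hosting the block in a worker role boils down to resolving the Autoscaler from the Enterprise Library container and starting/stopping it together with the role, roughly as sketched below. The scaling rules themselves live in a separate rules store and are not shown; treat the exact namespaces as my recollection of the documentation rather than gospel.

  using Microsoft.Practices.EnterpriseLibrary.Common.Configuration;
  using Microsoft.Practices.EnterpriseLibrary.WindowsAzure.Autoscaling;
  using Microsoft.WindowsAzure.ServiceRuntime;

  public class WorkerRole : RoleEntryPoint
  {
      private Autoscaler autoscaler;

      public override bool OnStart()
      {
          // Resolve the Autoscaler from the Enterprise Library container and start it;
          // it then evaluates the configured constraint and reactive rules on a schedule.
          autoscaler = EnterpriseLibraryContainer.Current.GetInstance<Autoscaler>();
          autoscaler.Start();
          return base.OnStart();
      }

      public override void OnStop()
      {
          autoscaler.Stop();
          base.OnStop();
      }
  }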

Azure boot camp material


The Windows Azure Training Kit includes a comprehensive set of technical content to help you learn how to use Windows Azure. It also contains many slide decks. If you prefer a more condensed version of this material, I can recommend taking a look at the presentations from the Azure Boot Camp.



Saturday 25 February 2012

Azure trivia

Ever wondered what an Azure datacenter looks like on the inside?

Via the following link you can watch a video of a Microsoft datacenter that might host the Azure platform services and your Azure-enabled applications. Many other services may actually run there as well, like for example Bing or Hotmail.

Check out the most recent generation, where the servers are housed in sea containers.

Thanks to Maarten Balliauw for pointing me to this site.

Monday 22 December 2008

VISUG Event : Entity Framework & WCF

Hello,

Kurt Claeys gave a presentation about using the Entity Framework (EF) in a distributed scenario (service-oriented with WCF, tightly coupled client-server) for Visug, the Belgian Visual Studio User Group.

His presentation was based on his personal research into the subject, so he didn't immediately give all the answers. Instead, he showed us via concrete examples which difficulties he encountered along the way.

The reasons for going N-tier and the way to achieve it were out of scope for the session, but decoupling was the key concept he used to convey this requirement.

After a very quick introduction to WCF and EF, Kurt showed us the challenges you are confronted with when going this route and how WCF, EF or something else could provide a solution: serialization of object graphs, contract sharing, … But the biggest hurdle in the current implementation seems to be change tracking.

In EF, a special mechanism called the ObjectContext keeps track of everything you do with the entity instances you retrieve through the EF infrastructure: the entity identification, the relationships, and also the changes you make to the entity instances or to the relations between them. Without the ObjectContext, the EF infrastructure cannot create the necessary insert/update/delete statements for your data store.

Now a typical service-oriented WCF service (cf. the four tenets of SOA) is stateless in nature. This means that an operation that retrieves an object graph through EF registers this action with the ObjectContext, but due to the stateless nature of the service operation, the ObjectContext doesn't stay around between calls.
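
To make that concrete: a typical (simplified) service operation looks something like the sketch below, where the ObjectContext only lives for the duration of the call. CustomerService, NorthwindEntities and Customer are purely illustrative names (an ObjectContext and an entity as generated by the EF designer), not taken from Kurt's demos.

  using System.Linq;
  using System.ServiceModel;

  [ServiceContract]
  public interface ICustomerService
  {
      [OperationContract]
      Customer GetCustomer(int customerId);
  }

  public class CustomerService : ICustomerService
  {
      public Customer GetCustomer(int customerId)
      {
          // The ObjectContext only lives for the duration of this one call.
          using (var context = new NorthwindEntities())
          {
              return context.Customers.Where(c => c.CustomerId == customerId).First();
          }
          // Once the using block ends, the context (and its change tracking) is gone;
          // the Customer crosses the wire as a plain, detached object.
      }
  }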

Kurt showed in some examples what this means for us as developers. He also tried to come up with candidate solutions to the problems, like for example re-fetching the entity and using special functionality on the ObjectContext to apply the changes. It worked, but only for a single entity instance without relations; if you wanted to go further, you had to do it yourself. Another candidate solution was to re-attach the object and apply all the changes to the re-attached object. If you had relations… you get the picture: DIY was the common factor in the candidate solutions.
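
The first candidate (re-fetch and then apply the changes) maps roughly onto the ApplyPropertyChanges method of the ObjectContext in EF v1. A sketch with the same illustrative names as above, and indeed only covering a single entity without relations:

  public void UpdateCustomer(Customer changedCustomer)
  {
      using (var context = new NorthwindEntities())
      {
          // Re-fetch the current version so an entity with the same key is tracked again.
          var original = context.Customers
                                .Where(c => c.CustomerId == changedCustomer.CustomerId)
                                .First();

          // Copy the scalar property values of the detached instance onto the tracked one.
          // Related objects are not taken care of: that part is DIY.
          context.ApplyPropertyChanges("Customers", changedCustomer);

          context.SaveChanges();
      }
  }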

There are some efforts in the EF community to support object change tracking in a distributed scenario, but some of them are no longer active and others are more research-like efforts in search of a solution.


Kurt concluded his presentation with what is coming in the next version of EF, as announced at PDC 2008.

What I remember is that if you're going to use EF in a distributed scenario, you should clearly do your homework first, recognize the potential difficulties you are about to face in your situation, and come up with a strategy to solve them.

Feel free to check the Visug site if you want to browse through the presentation. Maybe Kurt will also post his examples so you can try them out?

What are your experiences with EF?

Thanks for reading,

Best regards,

Alexander