Boston Azure Bootcamp – Saturday April 27, 2013

by Jason Haley 9. April 2013 12:38

On Saturday, April 27th, Eric Rohler and I (along with help from many other people) are organizing a local version of the Global Windows Azure Bootcamp (100+ bootcamps are being organized around the globe on that day).

The event is going to be a full day of Azure goodness.  We’ve tried to organize the topics and labs to be practical and useful for the majority of developers interested in Windows Azure – everything from Cloud Services and Web Sites to Mobile Services and even Virtual Machines … a lot to cover in one day.

If you want to attend, please sign up for the event on the Boston Azure user group’s Meetup page: http://www.meetup.com/bostonazure/events/107816882/

Here is the agenda:

Start    End      Item                                          Who
8:30     9:00     Check-in, breakfast, software install check
9:00     10:00    Welcome Keynote                               Mark Eisenberg
10:00    10:10    Quick break
10:10    11:00    Cloud Services                                Jason Haley
11:00    11:10    Quick break
11:10    12:00    Labs time!                                    All
12:00    12:30    Lunch
12:30    12:45    TBA
12:45    1:00     TBA                                           All
1:00     1:10     Quick break
1:10     2:10     Web Sites                                     Udaiappa Ramachandran
2:20     2:30     Quick break
2:30     3:30     Labs time!                                    All
3:30     3:40     Quick break
3:40     4:40     Windows Azure Mobile Services                 John Garland
4:40     4:50     Quick break
4:50     5:50     Azure Virtual Machines                        Eric Rohler
5:50     6:00     Quick break
6:00     6:30     Labs time!                                    All

Please help us spread the word!

Categories: Community | Azure

Windows Azure SDK Gotcha: Using CSPack and CSRun With a Worker Role

by Jason Haley 22. March 2012 17:28

Today I was working on converting a simple app to an Azure worker role, trying to find the minimum work necessary to get it running in the Azure compute emulator without creating an Azure cloud project – just the necessary cscfg and csdef files.

While reverse engineering a simple Azure worker role project (created using the Visual Studio cloud project type) and comparing it with a normal Visual Studio project, I ran into a snag.

I added the necessary items to convert a project to an ‘azure’ project:

  • Added references to the three Azure DLLs and set Copy Local to true for Diagnostics and StorageClient, just like the cloud project had.

[Screenshot: project references with the Copy Local settings]

  • Added the diagnostics to the app.config file
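Roughly, that config just registers the Azure trace listener so trace output flows into Windows Azure diagnostics – here is a sketch of the typical registration (the assembly version and public key token should match the Microsoft.WindowsAzure.Diagnostics assembly you referenced):

    <?xml version="1.0" encoding="utf-8" ?>
    <configuration>
      <system.diagnostics>
        <trace>
          <listeners>
            <!-- Version and PublicKeyToken should match the Microsoft.WindowsAzure.Diagnostics assembly from your SDK -->
            <add name="AzureDiagnostics"
                 type="Microsoft.WindowsAzure.Diagnostics.DiagnosticMonitorTraceListener, Microsoft.WindowsAzure.Diagnostics, Version=1.0.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35">
              <filter type="" />
            </add>
          </listeners>
        </trace>
      </system.diagnostics>
    </configuration>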

Then I created the guts of what’s in an Azure cloud project:

  • Created a ServiceConfiguration.cscfg file
    <?xml version="1.0" encoding="utf-8" ?>
    <ServiceConfiguration xmlns="http://schemas.microsoft.com/ServiceHosting/2008/10/ServiceConfiguration"
                          serviceName="ConsoleApp" osFamily="1" osVersion="*">

      <Role name="ConsoleApplication1">
        <Instances count="1"/>
        <ConfigurationSettings>
          <Setting name="Microsoft.WindowsAzure.Plugins.Diagnostics.ConnectionString" value="UseDevelopmentStorage=true" />
        </ConfigurationSettings>
      </Role>

    </ServiceConfiguration>
  • Created a ServiceDefinition.csdef file
    <?xml version="1.0" encoding="utf-8" ?>
    <ServiceDefinition xmlns="http://schemas.microsoft.com/ServiceHosting/2008/10/ServiceDefinition"
                       name="ConsoleApp">
      <WorkerRole name="ConsoleApplication1" vmsize="Small">

        <Imports>
          <Import moduleName="Diagnostics"/>
        </Imports>

      </WorkerRole>

    </ServiceDefinition>

All of those items were modeled after a WorkerRole cloud project that works fine in the Azure emulator when run from Visual Studio.
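For reference, a cspack/csrun invocation for a layout like this looks roughly like the following (the paths, output directory, and role binaries location are illustrative):

    cspack ServiceDefinition.csdef /copyOnly /out:ConsoleApp.csx /role:ConsoleApplication1;ConsoleApplication1\bin\Debug;ConsoleApplication1.dll

    csrun ConsoleApp.csx ServiceConfiguration.cscfg

The third part of the /role argument is the entry point assembly – that detail matters later in this post.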

However, when I used cspack and csrun with my newly converted ‘azure’ project, the emulator kept showing an error: “System.BadImageFormatException: Could not load file or assembly ‘xxx’ or one of its dependencies.  This assembly is built by a runtime newer than the currently loaded runtime and cannot be loaded”. Shown below:

[Screenshot: BadImageFormatException shown in the compute emulator]

Found the problem

After googling a bit, I found an answer in the Microsoft forums: Problems using csrun to start emulator.  Following Wenchao Zeng’s suggestion, I found that the WaWorkerHost.exe.config was missing, so I created it as he described.  Once I added the config file to the necessary directory, the application loaded fine in the emulator.
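In case you hit the same thing, here is roughly what that WaWorkerHost.exe.config contains (a sketch – the important part is the supportedRuntime element, which lets the worker host load the .NET 4 runtime):

    <?xml version="1.0" encoding="utf-8" ?>
    <configuration>
      <startup useLegacyV2RuntimeActivationPolicy="true">
        <!-- Without this, WaWorkerHost.exe loads the older CLR and a .NET 4 role assembly fails with BadImageFormatException -->
        <supportedRuntime version="v4.0" />
      </startup>
    </configuration>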

So that fixes it … now why was it missing?  First, I looked in the csx directory that the cloud project created (again, reverse engineering) and found that the WaWorkerHost.exe.config file was in there.  So the VS-created csx directory had it – and the cspack-created csx directory didn’t.

How should I really fix it?

I dug around in the Windows Azure SDK\v1.6 directories a little and found that the runtimes\base\x64 directory contained the same files as the csx directory where the missing config file was supposed to be … and it was missing from the SDK directory too!  So I added it, tried cspack again and … nothing.  The file didn’t make it over during packaging – so either that isn’t the directory that gets copied, or the file gets deleted (after finding the solution I tried again: the SDK directory is the one being copied, but the config file seems to get deleted).

The next step was to spend a couple of hours in Reflector looking at the build task, and in Notepad looking at the targets file used by MSBuild.  Yeah, I know – a couple of hours?  All I can say is there are a lot of parts.  With the help of Reflector, I narrowed it down to a method in the CSPack task named CreateWorkerHostConfiguration().

After analyzing the calls to that method, I tracked the logic down to some checks for a targetFrameworkVersion attribute on the NetFxEntryPoint element … which is one way to set the entry point for the role in the csdef file.  However, since I was passing the entry point in the cspack call, I didn’t think I needed to set it in the csdef file as well.

The Solution

As soon as I changed the csdef file to include the entry point (as shown below), the WaWorkerHost.exe.config file gets created by cspack and put in the proper base/x64 directory.

    <?xml version="1.0" encoding="utf-8" ?>
    <ServiceDefinition xmlns="http://schemas.microsoft.com/ServiceHosting/2008/10/ServiceDefinition"
                       name="ConsoleApp">
      <WorkerRole name="ConsoleApplication1" vmsize="Small">

        <Runtime>
          <EntryPoint>
            <NetFxEntryPoint assemblyName="ConsoleApplication1.dll" targetFrameworkVersion="v4.0"/>
          </EntryPoint>
        </Runtime>
        <Imports>
          <Import moduleName="Diagnostics"/>
        </Imports>

      </WorkerRole>

    </ServiceDefinition>

The thing that is still unknown is why the csdef generated by the Visual Studio cloud project doesn’t have the entry point in it – and yet it works fine in the emulator.  I’m not sure why adding it is necessary when using cspack, but having it in there does fix the problem.

Categories: Azure

VS 2008 CloudServiceItems.vsi

by Jason Haley 15. March 2010 18:21

Tonight I threw together a VSI to add WorkerRole and WebRole class templates for C# and VB.  These are classes that are handy to have when you are moving a web project or DLL to a cloud service.

Just download the VSI: CloudServiceItems.vsi, then install it.  The items are not signed, so you will get a warning message.  Feel free to extract the VSI (it’s just a zip file renamed) and check out the contents if you want, but there isn’t much there to look at.

When you double-click the VSI, Visual Studio will launch the installer and you’ll get the following dialog:

[Screenshot: installer dialog]

Then you’ll get a warning message about it not being signed:

[Screenshot: unsigned content warning]

Click Yes and then click Finish and the templates will be installed.

[Screenshot: installation finished]

Next time you go to the Add New Item dialog in VS (VB or C#), you’ll have the WebRole and WorkerRole templates listed in the My Templates area (as shown below).

[Screenshot: Add New Item dialog with the WebRole and WorkerRole templates under My Templates]
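For reference, the WorkerRole item adds a class along these lines – a rough sketch, the actual template content may differ a bit:

    using System.Net;
    using System.Threading;
    using Microsoft.WindowsAzure.ServiceRuntime;

    // Minimal worker role entry point: the role host calls OnStart, then Run.
    public class WorkerRole : RoleEntryPoint
    {
        public override bool OnStart()
        {
            // Set the maximum number of concurrent connections before the role starts.
            ServicePointManager.DefaultConnectionLimit = 12;
            return base.OnStart();
        }

        public override void Run()
        {
            // Do the role's work here; returning from Run causes the role instance to be recycled.
            while (true)
            {
                Thread.Sleep(10000);
            }
        }
    }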

Categories: Azure

Asp.Net Membership/Role/etc Provider scripts for SqlAzure

by Jason Haley 10. March 2010 17:04

Found this link tonight and don’t want to lose it: http://code.msdn.microsoft.com/KB2006191

These are the updated ASP.NET provider scripts for use with SQL Azure … though Session state is not supported, so it isn’t in those scripts.

Categories: Azure

Updated AzureTableQuery to Use Azure Storage Client Extensions

by Jason Haley 2. March 2010 18:35

Just updated the source code (not a download package) with change set 43982.

The check-in was to make the project work with the Azure Storage Client Extensions.  It isn’t quite complete yet, but most of the functionality is there.  Currently, Default2.aspx only supports the output-to-grid option … sorry, I haven’t had time to finish the file export.

NOTE: Default.aspx is the same code as before – only Default2.aspx uses the Azure Storage Extensions.

The good news is that almost all of the queries in the Query samples file work.  For those queries to work, you’ll need to get the Azure Storage Client Extensions code up and running first (which doesn’t take too long).

So now you can do things like (all thanks to Azure Storage Client Extensions):

Projections (and orderby)

[Screenshot: projection query with orderby]

Joins

[Screenshot: join query]

Some additional LINQ operators, like First() and FirstOrDefault()

[Screenshot: query using First() and FirstOrDefault()]

There are still a few bugs in the code, but overall it works well enough to do some general querying of your Azure tables.

Categories: Azure

Azure Storage Client Extensions + Azure Table Query

by Jason Haley 6. February 2010 18:42

I’m working on combining the Azure Storage Client Extensions functionality into Azure Table Query.  This will add the ability to create projections, do joins on tables, order results, and a couple of other things.  Below is a screenshot of one of the projection queries joining two tables together:

[Screenshot: projection query joining two tables]

Categories: Azure

Introducing: Azure Table Query Project

by Jason Haley 4. February 2010 16:01

Today I moved my new pet project out to CodePlex: Azure Table Query

[Screenshot: Azure Table Query project page on CodePlex]

This project is the result of wanting to run ad hoc LINQ queries against Azure table storage.  Last week I wrote about How To: Query Azure Log Tables with LINQPad – but that requires a separate assembly to contain the entities and context classes.  This week I’ve taken it a step further … starting with the work I did on the Query Editor in PowerCommands for Reflector.  Azure Table Query is a similar tool for Azure table storage, but it is implemented as a Web Role project.

Besides giving you the ability to run ad hoc queries against table storage, you can also export your table entities as a compressed, serialized XML file.  You do this by creating the query you want exported and then choosing the output type you want from the drop-down.

The project is still in its early stages, but here is a list of some features and limitations.

Current features

  • Ability to get the list of tables in a configured Azure table storage account
  • Ability to see properties of an entity (if there are entities in the table)
  • Ability to execute ad hoc LINQ queries against (a single table in) Azure table storage
  • Query output is a grid by default
  • Optional output currently includes XML-serialized objects (either a normal XML file or a compressed XML file)

Current limitations

  • The table storage account is configured as a setting of the cloud service (which means you can’t change it on the fly)
  • Only able to use a single table in a query
  • The LINQ syntax is limited to the same subset of operators currently supported by Azure table storage
  • No security … meaning if someone finds this page they will be able to query your table storage and export the data
  • UI could be improved

The download and information on how to use it are on the project’s site: Azure Table Query

Categories: Azure

How To: Query Azure Log Tables with LINQPad

by Jason Haley 28. January 2010 12:09

Code Download: AzureLogsWithLINQPad.zip

Recently there have been some questions in the Windows Azure forum about retrieving information from the Azure log tables, which got me thinking it would be nice to be able to just run a quick query against Azure table storage.

Normally I use the Azure Storage Explorer (or sometimes the TableBrowser site) to view data in Azure table storage … however, the filtering functionality in those tools is still a little limited.  I’ve looked at LINQPad in the past, but hadn’t taken the time to figure out how to use it with Azure table storage until today.

LINQPad (http://www.linqpad.net/) is a great utility for executing LINQ queries against a data source.  With LINQPad, you can use things like LINQ to Objects, LINQ to SQL and several other LINQ-oriented ways to query a data source … but to use it with Azure table storage you need to do a little work, which Jamie Thomson wrote up in July 2009.  His blog entry describes how to hook up Azure table storage to LINQPad: LINQPad and Azure.  Please check out his entry for background on what the rest of this entry is about … I’m basically just updating his steps.

What I want to do is use LINQPad to query the log tables generated by Windows Azure.  When Windows Azure populates the different diagnostic logs, some are transferred to tables (others go to blobs).

Here is a list of the tables Windows Azure creates and what is in them:

WADLogsTable                            Trace logs
WADDiagnosticInfrastructureLogsTable    Infrastructure logs
WADWindowsEventLogsTable                Windows Event Logs (configured manually)
WADPerformanceCountersTable             Performance Counters (configured manually)

[Screenshot: diagnostics log tables in table storage]
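As a quick reminder, trace messages only end up in WADLogsTable if the role schedules the transfer – a minimal sketch of that setup (typically done in the role’s OnStart) looks something like this:

    using System;
    using Microsoft.WindowsAzure.Diagnostics;
    using Microsoft.WindowsAzure.ServiceRuntime;

    public class WorkerRole : RoleEntryPoint
    {
        public override bool OnStart()
        {
            // Schedule the transfer of trace logs from local storage to WADLogsTable
            // in the storage account named by the diagnostics connection string setting.
            var config = DiagnosticMonitor.GetDefaultInitialConfiguration();
            config.Logs.ScheduledTransferPeriod = TimeSpan.FromMinutes(1);
            config.Logs.ScheduledTransferLogLevelFilter = LogLevel.Undefined; // no filtering
            // The setting name must match the connection string setting in your cscfg
            DiagnosticMonitor.Start("DiagnosticsConnectionString", config);

            return base.OnStart();
        }
    }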

In order to have all the DataContext classes available for querying these log tables, I put together a small solution which contains the entities and data context classes.  The link for the download is at the beginning of this entry.

Once you have the AzureLogs solution compiled, getting LINQPad set up to query these tables is pretty close to what Jamie outlined in his blog entry, but I’ll update the steps here (for the November 2009 release).

 

Step 1: Add the additional references

  • In LINQPad, go to the Query menu | Advanced Query Properties
  • Add the AzureLogs.dll
  • Add Microsoft.WindowsAzure.Diagnostics.dll and Microsoft.WindowsAzure.StorageClient.dll (if you compile the source included in the download, these will be in the bin\debug directory)
  • Add the System.Data.Services.Client.dll

[Screenshot: Advanced Query Properties – additional references]

Step 2: Add the additional namespace imports

  • Add the additional namespaces shown in the image below

[Screenshot: Advanced Query Properties – additional namespace imports]

Step 3: Set the language

  • In order to run multiple statements in LINQPad (which you’ll want to do), you need to change the Language to C# Statement(s).

[Screenshot: the Language drop-down set to C# Statement(s)]

Step 4: Write the query and Run it

If you open the query included with the download (WindowsAzureLogsQuery.linq), it will save you some time – you’ll just need to do the following, then uncomment and edit the queries already started for you.

By default, WindowsAzureLogsQuery.linq uses the local development storage account.

To point at a different storage account:

  • Edit the accountName and sharedKey values to match your storage account settings
  • Uncomment the “var account” lines that use them
  • Comment out the line that sets the account variable to the development storage

[Screenshot: the account setup section of WindowsAzureLogsQuery.linq]
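To give an idea of the shape of those statements, here is a rough sketch (WadLogsContext and its WadLogs property are hypothetical stand-ins for the context and entity classes compiled into AzureLogs.dll, and the namespaces come from the imports added in Step 2):

    // Development storage account (the default in the .linq file)
    var account = CloudStorageAccount.DevelopmentStorageAccount;

    // ... or point at a real storage account instead:
    // var accountName = "<your account name>";
    // var sharedKey = "<your account key>";
    // var account = new CloudStorageAccount(new StorageCredentialsAccountAndKey(accountName, sharedKey), true);

    // WadLogsContext is a placeholder name for the data context class in AzureLogs.dll
    var context = new WadLogsContext(account.TableEndpoint.ToString(), account.Credentials);

    // Query the trace logs from the last hour and show the results in LINQPad
    var recentLogs = from e in context.WadLogs
                     where e.EventTickCount > DateTime.UtcNow.AddHours(-1).Ticks
                     select e;

    recentLogs.Dump();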

Categories: Azure