Jason Haley

Ramblings from an Independent Consultant

What I’m using to stay up to date on Azure news

Happy New Year!

Yeah, yeah, I know - January is almost over already. You aren't tired and letting up on your resolutions yet, are you? I know that when I start a new year with a plan to make lots of changes and to focus on a year full of goals that really interest me … I start to run out of gas around the end of week 3.

Well, it is now the beginning of week 4 - time to get back to it! Reread your goals, and for those that have to do with keeping up to date and learning more about Azure, here are some resources I just recently discovered.

We have a "pandemic puppy", whose name is Duncan (he is the one in the photo above) and I now have a reason to take a drive and go on a walk every morning. The drive is key - about 30 minutes each way - it gives me around 1 hour a day when I can easily listen to podcasts or audiobooks.

Here are the Azure podcasts I'm currently listening to:

Azure Podcast
Website: http://azpodcast.azurewebsites.net/
YouTube: https://www.youtube.com/c/TheAzurePodcast/featured
Hosted by: Cynthia Kreng @cynthiaKrenG, Kendall Roden @KendallRoden, Cale Teeter @CaleTeeter, Evan Basalik @EvanBasaliK, Russell Young @youngr6 & Sujit D'Mello (couldn't find on twitter)

Azure Security Podcast
Website: https://azsecuritypodcast.net/
Hosted by Mark Simos @marksimos, Sarah Young @_sarahyo, Michael Howard @michael_howard and Gladys Rodriguez @Cyber_batgirl

Ctrl+Alt+Azure
Website: https://ctrlaltazure.com/
Hosted by Tobias Zimmergren @zimmergren and Jussi Roine @jussiroine

Azure for Executives
Website: https://azure.microsoft.com/en-us/industries/podcast/
Hosted by David Starr @ElegantCoder

I'm probably one of the last people in the tech industry to have just started listening to podcasts and am curious - do you have any favorite Azure podcasts?

Last week I also discovered two Azure "news sites" I hadn't come across before that I'm going to start looking at on a regular basis:

Azure Info Hub
Url: https://azureinfohub.azurewebsites.net
Looks to be created by: Holger Sirtl (@hsirtl - https://twitter.com/hsirtl/)

Azure Notes
Url: https://www.azurenotes.tech/
Looks to be created by Derek Martin (@thebookofdoodle - https://twitter.com/thebookofdoodle)

This month I'm experimenting with daily learning topics, where I'll spend any spare time listening, reading, etc. on content for a single topic - one of which is Azure. For example, I'll start the day listening to Azure podcasts during our commute to the beach/woods for a walk and back, then later on I'll check out any Azure news and maybe watch a YouTube video or just read Azure-related content. So far this daily topic experiment is working pretty well to help me slowly get up to date in several topics … I'll keep you posted on how it goes.

Do you have any tips on keeping up to date on Azure related topics?

Personal Update, November 2019

This year has gone by way too fast. On one hand, I have left way too many goals on the todo list and not accomplished nearly as much as I wanted – but on the other hand I have surpassed what I thought I would do … which shows the unbalanced nature of my life so far this year.

Recently I’ve been listening to Cal Newport’s Digital Minimalism audiobook, and last week I started a 30-day period without Twitter, LinkedIn, news sites or watching news on TV. The book really made me realize how often I was checking Twitter for the latest update or the news for anything that had happened since the last time I checked it – especially the week I was at Ignite. What. a. waste. of. time.

This past week, without those distractions, I’ve realized I need to structure my downtime and find more leisure activities and other learning topics to do with my time (work in progress).

And in case you are wondering how last week went - did I miss Twitter or the news sites? No, not even close. I found I was more present in conversations, conference calls and life in general.

Though, since I have been using Twitter to share my thoughts for a while now – I did have the urge to share a few thoughts or things I found interesting. Not having Twitter as a platform to do that right now is probably a motivating factor in why I am writing this blog post.

I ran 3 half marathons in new states this year (Louisiana, Rhode Island and Vermont). I have also kept my running miles up and am planning on a full (and maybe an ultra) marathon sometime next year. I started thinking about running an ultra listening to David Goggins’ Can’t Hurt Me: Master Your Mind and Defy the Odds audio book earlier this year. I’ve now listened to that book twice. Great book!

I’ve been an independent consultant now for 10 years this month.

This year has been all about getting more advanced experience with containers, Kubernetes, Azure and .NET Core. In doing so, I’ve spent way too much time working and learning – which is the main reason a lot of other goals did not get accomplished. I have given a few presentations on Azure Kubernetes Service and written a few posts for other blogs about Kubernetes (one even got posted on Kubernetes.io blog: Get started with Kubernetes (using Python)) but there is more I want to share – I just haven’t taken the time to structure it yet.

The North Boston Azure Cloud User Group is still going strong. We have still been meeting on a regular basis. The DevBoston group has not gone as well – only 2 meetings this year.

I have some ideas in mind for a couple of workshops on containers and Kubernetes that I want to organize this winter, but nothing announced yet.

Boston Code Camp #32 is next weekend, and I’m looking forward to it. I’ll be giving my Going Independent talk again for anyone interested in what it’s like to be an independent consultant.

Until next time …

Learning containers, Docker and Kubernetes

This blog has been quiet for a while, mainly because I have been busy learning containers, Docker and Kubernetes. In the past year and a half, the learning has often been side projects and presentations – but this year things have changed and I’m now using this stuff on a daily basis. I still have a lot to learn but I think it is time to start sharing some of what I have learned.

My motivation for understanding and using containers has evolved from focusing on a new deployment solution (i.e. Docker containers) to running applications in an environment that assumes everything is already containerized (i.e. Kubernetes). In between those two are new DevOps techniques that connect them and, in the end, make my life as a developer easier.

However, I’m still on the journey and have a lot more to learn.

Containers and Docker are the basics

Step one in this journey really is learning what containers are. Step two is taking the container concept and using Docker to apply it.

First, start out by pulling and running existing Docker images from Docker Hub. Donovan Brown has a nice blog post, Fun with SQL Server Containers, that requires little knowledge of how it all works and shows you the practical nature of how to use containers. It will get those gears in your head turning.

After you have played with Docker for a while, the next challenge is to go from source code to a running container. For this, you’ll need an introduction to Dockerfile.

If you use Visual Studio, the easiest way to get started is to create a new .NET Core ASP.NET project and check the “Enable Docker Support” checkbox. Then run your application.

After that, it’s really about learning what the Dockerfile does, plus some Best practices for writing Dockerfiles, so you can start customizing it when needed.
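For reference, the Dockerfile that Visual Studio generates is a multi-stage build along these lines. This is just a sketch – the image tags and the WebApp.dll name are illustrative assumptions, not the exact file from any particular project:

```dockerfile
# Build stage: use the SDK image to restore, build and publish
FROM microsoft/dotnet:2.2-sdk AS build
WORKDIR /src
COPY . .
RUN dotnet publish -c Release -o /app

# Runtime stage: copy the published output into the smaller runtime-only image
FROM microsoft/dotnet:2.2-aspnetcore-runtime
WORKDIR /app
COPY --from=build /app .
ENTRYPOINT ["dotnet", "WebApp.dll"]
```

The multi-stage split is the key idea: the final image contains only the runtime and your published output, not the SDK or your source code.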

Running in Kubernetes is the goal

With Kubernetes, there are a few options to get started:

1. Use Kubernetes with Docker Desktop – this is the quickest option, it’s free, and it runs on your local machine.

2. Use Azure Kubernetes Service - this compares to Kubernetes the Easy Way and will put your application in the cloud.

3. Kubernetes The Hard Way - this will give you the ultimate understanding of how the platform works, since you’ll need to build it all.

If your goal is like mine, running your application in Kubernetes, then I would recommend #1 and then move to #2. Eventually I’d recommend trying out #3 to get a deeper dive once you are familiar with using Kubernetes.
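To make option #1 concrete: once Kubernetes is enabled in Docker Desktop, a minimal deployment manifest is all you need to get a container running in it. The sketch below is illustrative – the names and image are placeholders, not from a specific project:

```yaml
# deployment.yaml - run 2 replicas of a containerized web app
apiVersion: apps/v1
kind: Deployment
metadata:
  name: webapp
spec:
  replicas: 2
  selector:
    matchLabels:
      app: webapp
  template:
    metadata:
      labels:
        app: webapp
    spec:
      containers:
      - name: webapp
        image: myregistry/webapp:latest
        ports:
        - containerPort: 80
```

Apply it with kubectl apply -f deployment.yaml and check the result with kubectl get pods.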

Recently I discovered the Kubernetes Learning Path, which walks you through videos, articles and hands-on practice – I highly recommend starting with it.

What’s next

Watch this blog. I’m going to start regularly posting on containers, Docker and Kubernetes. I’m working on a sort of blog series about general tips and tricks I’ve learned along the way and new things as I discover them.

If you are an application developer and learning containers, Docker or Kubernetes and have a topic you’d like to see please contact me on Twitter @haleyjason.

Web Apps 2019 from Boston Code Camp 31

It has now been a week since Boston Code Camp 31, where I presented my new Azure Web Apps 2019 talk. I originally structured the talk to introduce things that have changed or are changing in the near future in Azure Web Apps (according to announcements at Ignite last year) – which all assumed a basic knowledge of Web Apps. However, there was a good percentage of people in the crowd who didn’t have that basic knowledge … so I spent more time than planned walking through features in the tour of Web Apps. That meant I didn’t have time for the full set of demos I had wanted to go through. There were also quite a few questions – which I thought was well worth missing some of the demos.

If you attended the session and want to know more detail about the demos: I am modifying the talk to a hands on session for the Boston Area Global Azure Bootcamp (Burlington location) on April 27 – so you can come and walk through the code yourself!

The PowerPoint can be found here: AzureWebApps2019.pptx

This was the second time I’ve given this talk. The first time, I used one of my GitHub repos for the code sample, but I recently found Joonas Westlin’s GitHub repo that is more complete than mine: Azure Managed Identity demo collection – so this time I used his code. Thanks Joonas!

Here are some notes for you in case you missed it.

Newer Features

This section of the presentation highlights some of the things that have been added via the Azure Portal that are useful to know if you haven’t been in the portal recently.

Changes on App Settings blade
  • Now called Configurations
  • Has tabs for fitting on one page better
  • FTP configuration (added last year)
  • HTTP/2 Support (added last year)
  • Settings and connection string values are now hidden by default
  • Advanced Edit allows you to edit multiple settings quickly in a json format (this is new)
Custom domains and SSL Settings blade
  • HTTPS Only (added last year in two places)
    • Custom domains blade
    • SSL Settings blade
  • Minimum TLS version is now configurable (added last year)
Networking blade
  • Can now add IP Restrictions (whitelisting) for your web app
    • Supports IP v4 and v6
  • Can handle the IP Restrictions for web app and kudu site separately (this is new)
Deployment slots blade
  • Improved UX
  • Combined the Testing in production features
Deployment Center blade
  • Improved UX
  • Search and filter repositories
  • Revamped log files

Securing Web Apps

This section of the presentation was to highlight how to use two new-ish features to make your web app more secure: managed identity and VNET integration (preview).


Managed Identity
  • Identity blade in Web Apps
    • System Assigned
    • User Assigned
  • Allows Azure resources to authenticate to other resources without storing credentials
  • Deployment slots have different identities
  • Best to work with by adding to an AAD security group
New VNet Integration (Preview)
  • Does not use Point to site VPN (this is new and in preview)
  • Requires unused subnet with 32 addresses
  • App and VNet must be in the same region
Virtual Network Service Endpoints
  • Extend your VNet to Azure services
  • Available with:
    • Storage
    • SQL DB
    • Key Vault
    • SQL Data Warehouse
    • PostgreSQL
    • MySQL
    • Cosmos DB
    • Service Bus
    • Event Hubs
Azure Key Vault
  • For storing your
    • secrets
    • keys
    • certificates
  • Has IP Firewall
  • Integrates with VNet (via service endpoint)
  • Access policies
    • Manage identity permissions
      • Users
      • Managed Identities
Azure Storage
  • Encrypted at rest – can now bring your own key (this is new)
  • Soft delete (this is new)
  • Has IP Firewall
  • Integrates with VNet (via service endpoint)
  • Access control
    • Manage identity permissions
      • Users
      • Managed Identities
Azure SQL Database
  • Has IP Firewall
  • Integrates with VNet (via service endpoint)
  • Can grant SQL DB access to managed identity or AAD security group
Demo steps
  1. Walk through local development using managed identity
    1. Add local user to storage
    2. Add local user to SQL DB and client IP firewall
  2. Create a managed identity for web app
    1. Enable System Managed Identity in web app
    2. Create AAD group and add new managed identity as a member
  3. Connect web app to VNet
    1. Create VNet and subnet
    2. Enable Service endpoints on subnet
    3. Create NSG for SQL out and add to subnet
    4. Turn on VNet Integration (Preview) in Web App
  4. Connect Key Vault to VNet
    1. Configure Access policies for managed identity or AAD Group
    2. Configure VNet
  5. Connect storage to VNet
    1. Configure Access policies for managed identity or AAD Group
    2. Configure VNet
  6. Connect SQL to VNet
    1. Configure network rule and add to VNet
    2. Add AAD Group to SQL DB via sql
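Step 6.2 (“Add AAD Group to SQL DB via sql”) comes down to a couple of T-SQL statements run against the database while connected as an AAD admin – the group name and roles below are placeholders for whatever your app actually needs:

```sql
-- Create a contained database user for the AAD security group
CREATE USER [MyWebAppGroup] FROM EXTERNAL PROVIDER;

-- Grant the group the access the web app needs
ALTER ROLE db_datareader ADD MEMBER [MyWebAppGroup];
ALTER ROLE db_datawriter ADD MEMBER [MyWebAppGroup];
```

Any managed identity you add to that AAD group can then connect without stored credentials.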

Samples: https://github.com/juunas11/Joonasw.ManagedIdentityDemos

What is new in Azure App Service networking

In the security trenches of Azure SQL Database and Azure SQL Data Warehouse

Tutorial: Secure Azure SQL Database connection from App Service using a managed identity

Learn how to protect your data in Azure Storage with new features and capabilities

Manage keys, secrets, and certificates for secure apps and data with Azure Key Vault

Going Independent from Boston Code Camp 31

Last weekend was Boston Code Camp 31, where I presented my Going Independent talk to a small crowd of interested individuals. It is always a small but very interested crowd. Most attendees were full-time employees this time – which is exactly the audience I put the talk together for.

The PowerPoint can be found here: GoingIndependent2019.pptx

I have given the presentation at least a dozen times now in the past 9.5 years; here are the CliffsNotes for you in case you missed this year’s version of it.

1. Why? 

Know why you want to go independent – what is important to you? Are you sure you can’t just switch jobs and find what you are really after? How about freelancing – keeping your job plus some side work?

Next, show chart of full time employee with direct deposit (ie. stable income every 2 weeks).

Follow that up with my actual income chart as an independent consultant – which is all over the place. 0 for some months, while other months are more than 10k a month.

Then the annual chart – which is more stable but still not the same as the FTE chart shown first.

2. Are you Still interested?

Get some advice – talk to friends, family, other independent consultants, and people who could hire you. Run your ideas by them, get feedback, and spread the word about what type of work you are looking for.

Find Client #1 – usually the easiest client to find. Could be previous employers, co-workers, other independent consultants, etc.

Establish Your Company – Get an accountant and lawyer – find out the best legal entity type for your situation and set it up (LLC, S-Corp, C-Corp), open a business bank account and get a business credit card to keep everything separate from the beginning.

3. Get Started

Note on Work Expectations – FTE vs. Consultant

- What works to climb the corporate ladder doesn’t always help you when you are a consultant

  • Doing what you are asked to do without questioning it (no estimates, just getting started)
  • Throwing more time at a problem to fix it


- What does help:

  • Clarify what expectations are before starting any work
  • Learn to estimate and always track your hours
  • Managing expectations is very important to keep good relationships

Stay organized – invoice regularly, keep track of your cash flow and expenses, pay yourself and keep at least 6 months living expenses back (ie. always have F-you money so you are never in a situation where you can’t say no).

4. Finding the next client(s)

Contract or consultant? – Contract work: check job boards or recruiters; it’s a lot like being an employee with a middleman. Consultant: it’s up to you, and referrals are very important.

Network – have 2 answers to the question “What do you do?”. First, the broader answer covering what you will settle for (use this when you really need the work). Second, the answer for what you really want to do (the reason you went independent, after all).

  • Network with complementary-skilled consultants – Example: architects, database and security people are all out there networking too – so if you are a web developer, make sure you know them so they can refer work to you if they find a need for web developers, and vice versa.
  • If you are an introvert and not great at networking – have 2 groups you network with – practice groups that most likely won’t be your clientele and groups that could refer you real work. Use the practice groups to get comfortable.

Become an expert – this is like a phase 2. Once you are able to make a living doing this self employed thing – set out to take it up a level. Speak at user groups, write a blog and write open source projects – these 3 items can serve as your marketing material.

2 + 1 Rule – always have 2 small projects that pay something but are not time sensitive (or at least are flexible) and have 1 project that pays the bills. The 2 small projects can help cushion the period between the big projects.

My First Microsoft BUILD Experience and Four Key Takeaways

In May, I attended the Microsoft BUILD 2018 conference in Seattle. It was my first Microsoft BUILD conference, so I didn’t have any expectations. I’ve watched many of the past BUILD session videos on Channel9 – so I knew the presenters and content would be really good. By the end of the second day of the three-day conference, I had come to the realization that there were too many sessions that interested me to be able to attend them all. By the third day I decided to take advantage of the expo and speak with as many product groups as I could – something I wouldn’t be able to do after the conference.

Once the conference was over, I left Seattle with a few key takeaways that I’d like to share.

#1 The MVP experience rocked!

I’ve been an Azure MVP now for just over a year, so I hadn’t heard of any special MVP experience at BUILD. Of all the benefits Microsoft gives its MVPs, the most valuable is access to the product groups and, of course, the MVP Summit (which is a week full of content presented by the product groups).

Microsoft Influencer Pre-Day

I spent the Sunday before BUILD in a large conference room with around 50 other MVPs and RDs attending sessions by the product groups. It was great to get a summary of what was to come at BUILD, and it allowed me to better create my session schedule for the conference.

MVP conference badge lanyards

At BUILD the lanyards (those ribbons that go around your neck with the conference badge) were color-coded for certain groups: MVPs had white, RDs (Regional Directors) had azure blue, MSPs (Microsoft Student Partners) had purple, and regular attendees had black. I found having distinct lanyards served two purposes: I could identify other MVPs more easily (which for an introvert like myself helped start a few conversations with “What area is your MVP in?”). I also noticed that product groups at the expo picked up on me being an MVP without my having to mention it.

Fast lane and reserved section for keynotes

I heard that the lines for the keynotes would be huge, but that didn’t turn out to be a problem, since a handful of Microsoft programs (such as MVPs, RDs, and MSPs) enjoyed a fast lane into the keynotes and reserved seating upfront.

#2 If you aren’t already moving to .NET Core, it’s time to reevaluate

I’ve spent most of my career working on software in the financial industry. I don’t know about other industries, but in my experience the financial industry is not an early adopter of new software such as new versions of operating systems, frameworks, and the like. If there is an overwhelming reason to upgrade, then it will happen – but with .NET Core and .NET Standard, the industry isn’t moving very fast.

.NET Core 2.1 and 3.0 roadmap

At BUILD they announced the .NET Core 2.1 RC (with Go-Live support) and a .NET Core 3.0 roadmap. If you’re waiting for the infamous Microsoft version 3 to happen before moving to the next version, you’re in luck – it is on the way! But seriously, if you aren’t yet planning your move to .NET Core, you should start to evaluate why not. The huge performance increases alone in .NET Core 2.1 (and, I’m sure, 3.0) may make you want to reconsider – and keep in mind that some of those performance improvements will never make it to the full .NET Framework.

If you are waiting because you have a heavy dependency on desktop applications, then you’ll want to keep an eye on .NET Core 3.0, since it brings WPF, WinForms and UWP applications to .NET Core running on Windows.


#3 Containers are now everywhere and you need to know how to use them

I’ve been a huge fan of PaaS since Azure launched and over the years have moved from Cloud Services to Web Apps for many of my clients. Though there are still a few holdouts on Cloud Services – mainly due to the App Service sandbox, which prevents the usage of shared components like the registry, cryptography and GDI, etc. Most, if not all, of these problems can be resolved using containers.

I must admit, I missed the bus on Docker. I only started learning about containers this year. At the BUILD conference, there were a lot of sessions on containers. If you go to Channel9 and filter the BUILD video sessions by the keyword “container”, there are 16 sessions.

New Container Features with App Services

There were several App Service announcements about its container features and how Web App creation has changed. Now, when you create a web app, you decide if you want to use Windows, Linux or Docker – effectively promoting containers to the same level of choice as the operating system. Though if you want to use Windows containers, you’ll need to get on the private preview list or wait a couple of months.

Another new feature is the multi-container Linux Web App, which allows you to use a Docker Compose yml file or Kubernetes pod definition yml file describing multiple containers to be deployed in a single App Service Web App.

Once the Windows containers functionality comes out of preview, App Service will have a full container story.

Azure Kubernetes Service (AKS) Improvements

When you have a system that is more complicated than a one- or two-container workload, you will want to consider AKS – now officially known as Azure Kubernetes Service. During BUILD, there was an aptly titled blog post Kubernetes on Azure: Industry’s best end-to-end Kubernetes experience that covers all the new AKS features.

For me, the three big new features were:

· New Azure Portal experience – I think this makes it more approachable for people to try out.

· Ability to deploy Kubernetes nodes into existing VNETs – This makes it more practical for larger projects that seem to be the sweet spot for AKS.

· DevOps Project support – The new DevOps Project makes it easy to create a Kubernetes cluster and wires up a CI/CD pipeline for you in a matter of minutes.

#4 DevOps isn’t only for large companies

The past couple of years, I’ve been learning more about DevOps. However, it is sometimes difficult to convince customers that have smaller IT departments of its value because they are under the impression it is too complicated to get started with.

“Friends don’t let friends right-click and publish” was the running joke in the DevOps sessions I attended. Publishing files from a developer machine to a deployment environment is usually thought of as “quick-and-dirty deployment” and generally not a good idea. But many people still do it, and it often leads to unpredictable deployments that may break in the deployed environment. As we all know, what runs on the developer’s machine doesn’t always run in the deployed environment.

The definition of DevOps shown in the sessions was provided by Donovan Brown:

DevOps is the union of people, process, and products to enable continuous delivery of value to our end users.

To me that definition may sound a bit lofty for a one- or two-person IT department. However, definitions aside, the new DevOps Project resource in Azure makes it really simple to wire up a CI/CD pipeline (taking care of the “continuous delivery” part of the definition).

There were many DevOps sessions I wanted to attend, but I was only able to fit a few into my schedule. If you go to Channel9 and filter the BUILD video sessions by the keyword “DevOps”, there are 13 sessions.

Improvements to the DevOps Project in Azure
The DevOps Project now covers most of the customer scenarios I need. And the build and release steps can be modified afterward to account for the ones that aren’t covered. Once you complete the four-step wizard in the DevOps Project, you will get a nice dashboard that provides jumping-off links to the project home page in VSTS, project backlogs, users & groups, your code repo, build definitions, build logs, release definitions, the web app endpoint, the status of the web app and an Application Insights chart. This really is a single pane for your CI/CD and VSTS projects inside of the Azure Portal.


The takeaway here is: the DevOps Project wizard can now build you a “proper” CI/CD pipeline in about the same time it takes to “right-click and publish,” so there really is no excuse any more to not use it.

Closing thoughts on Microsoft BUILD 2018

This was my first BUILD conference, and I was impressed with the size and quality of the event. I really enjoyed being there when the announcements were made. Spending the last day speaking with people on the expo floor was awesome and something that can’t be done any other way. I am still trying to catch up and watch all the BUILD sessions on Channel9 that I couldn’t fit into my schedule.

Virus Scan File Uploads Using Multi-Container Web App

This month at the Microsoft Build conference, the Azure App Service team announced multi-container support for Linux Web Apps. This capability enables you to deploy multiple containers in a single Web App.

In the session PaaS and Container Innovation – What’s new with App Service, members of the App Service team showed a demo of a Web App that has three containers: Nginx, WordPress and Redis.

The multi-container capability isn’t designed to compete with Kubernetes or other orchestrators, but simply to let you easily add the one or two extra containers that help support your containerized web application (like a cache, for example).

This past weekend I was working on an Azure Functions extension that I’m planning on using to provide virus scanning for a website – when the thought crossed my mind that multi-container support would let me add virus scanning to a web app even more easily. So, I took a detour from working on my extension and worked on this sample instead.


Often web sites need the ability to upload files. However, if you have been through a secure code review or penetration testing, you’ll know that to safely provide that functionality to your users you need to scan any uploaded files for viruses. This is one of those parts of an implementation people put off until later – especially if they don’t already have a solution for the virus scanning – and then they find out implementing it isn’t as easy as it should be.

In Azure you can upload files easily to blob storage, ensure the transport is secure and even ensure the files are encrypted at rest. But scanning those saved blobs for viruses is one of those features you have to implement yourself.

A couple of options for virus scanning via an API:

  • VirusTotal – a third party API that would require passing the file out of Azure to the service
  • ClamAV – an open source anti-virus scanning server (GNU GPL v2 license)

For my scenario, I have the following constraints:

  • I need to be able to integrate the virus scanning into my codebase using C#
  • I cannot transfer the files out of the data center just to scan for viruses
  • I don’t want to have a VM running 24/7 that is only used to scan less than 100 files a month

The Solution: Linux Web App with Two Containers

After doing some research, I’ve found a way to stay within my constraints and easily add virus scanning to a Web App.

  1. Use ASP.NET Core so I can run the site in a Linux container
  2. Use a second container to run the mkodockx/docker-clamav image (using the NuGet nClam package as a client to the ClamAV server)
  3. No need for a VM since we can now run multiple containers in a single Web App


Creating the demo web app

To verify things work the way I want, I created a simple web app that uploads files and then displays the results of the virus scan. To save time, I started with the ASP.NET Core web application template, ripped the majority of the views and actions out, and then used some code from the ASP.NET Core documentation for the file upload logic: File uploads in ASP.NET Core

I put a copy of the code in GitHub if you want to see the full web site code: https://github.com/JasonHaley/file-upload-demo

Here’s the code that takes the uploaded file(s), passes them off to the ClamAV server container and returns the results:

public async Task<IActionResult> UploadFiles(List<IFormFile> files)
{
    var log = new List<ScanResult>();

    foreach (var formFile in files)
    {
        if (formFile.Length > 0)
        {
            // Connect to the ClamAV container by its service name, not an IP address
            var clam = new ClamClient("clamav-server", 3310);
            var result = await clam.SendAndScanFileAsync(formFile.OpenReadStream());

            log.Add(new ScanResult()
            {
                FileName = formFile.FileName,
                Result = result.Result.ToString(),
                Message = result.InfectedFiles?.FirstOrDefault()?.VirusName,
                RawResult = result.RawResult
            });
        }
    }

    var model = new UploadFilesViewModel();
    model.Results = log;
    return View(model);
}

The important thing to note when using the ClamClient is that the communication between the web site and the clamav-server container uses the container’s name, not an IP address.


You can follow the rest of this entry if you want to get it going yourself. To do this, you will need Docker for Windows running on your machine and a recent version of Visual Studio 2017.

Once you have the file upload logic in your ASP.NET Core Web application, you need to add the Docker support to the project.

1. Right click on the WebApp in the Solution Explorer

2. Choose Add -> Docker Support


This will add a Dockerfile to your Web project and a docker-compose project to the solution.


In your docker-compose project, open the docker-compose.yml file and add the clamav-server service, as shown below:

version: '3.4'

services:
  clamav-server:
    image: mkodockx/docker-clamav
  webapp:
    image: ${DOCKER_REGISTRY}webapp
    build:
      context: .
      dockerfile: WebApp/Dockerfile

Now run the debugger (hit F5) to start the web application. The first run will take a little while to start since it has to pull down the ClamAV image and update the virus definitions.

Once it starts you should see a file upload page like shown here:


Select a file to upload and see if it has a virus in it:


If you want to test for a virus, you can find the EICAR test string for a test file here: http://www.eicar.org/86-0-Intended-use.html
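If you would rather create the test file yourself, the standard EICAR signature is a short text string you can write to a file (the string below is the published EICAR test string; the file name is just my choice):

```shell
# Write the standard 68-byte EICAR antivirus test string to a file.
# This is harmless text that AV engines, including ClamAV, detect by convention.
printf '%s' 'X5O!P%@AP[4\PZX54(P^)7CC)7}$EICAR-STANDARD-ANTIVIRUS-TEST-FILE!$H+H*' > eicar.txt
```

Upload eicar.txt through the form and the scan result should come back as a detected virus.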

Push the Docker Image to Docker Hub

Now that the code works locally, the next step is to put the web project’s container into a repository so you can configure a Web App in Azure to use it.

For the purposes of this demo, I put my web app in docker hub at: https://hub.docker.com/r/haleyjason/file-upload-demo/

If you want to create your own image to put in docker hub, change your build to a Release build then start the application again. This will create the release images locally.

You will also need to have a Docker Hub account and create a repo to push the image to.

Once you have the Docker Hub repo ready, complete the following steps at a command line:

  1. List your Docker images to get the image id of the web app's release build
    docker images
  2. Login by using the following and entering your Docker Hub username instead of <username>
    docker login --username <username>
    Then enter your password when prompted
  3. Tag your image using its image id and your repo name
    docker tag <image id> <dockerhub account>/<docker hub repo>:<tag>
    I used something like:
    docker tag 0c98 haleyjason/file-upload-demo:latest
  4. Push the image to the repo
    docker push <dockerhub account>/<docker hub repo>

You should now have the web app container in Docker Hub.

Create the Azure Web App

The last step is to create a Web App and a Docker Compose file to connect the images.

First create a docker-compose.yml file that just connects the containers. The file contents should be similar to the following:

version: '3.4'

services:
  webapp:
    image: haleyjason/file-upload-demo
  clamav-server:
    image: mkodockx/docker-clamav

Save this file somewhere so you can upload it to the Web App in the next part.

  1. In the Azure portal, Click on the plus in the upper left corner -> Web -> Web App
  2. On the Web App blade:
    - Provide an App name
    - Select your subscription
    - Select or create a new Resource Group
    - For OS, select Docker
    - For demo purposes, stay with the Generated App Service plan
  3. Click on the Configure container menu, then the Docker Compose (Preview) tab
  4. In the Configuration text box, click the folder icon and select the docker-compose.yml file you created earlier that connects the two containers.
  5. Click OK
  6. Check the Pin to dashboard checkbox
  7. Click the Create button to get the process of creating the web app started
  8. Once the web app is ready, in the Overview blade, click on the URL in for the application

Now wait 5 – 10 minutes … the first load takes several minutes – but once it is up and running it responds normally. 

When I select a couple of files:


I now get the scanned results:



The new multi-container capability of Azure App Service Linux Web Apps seems like a promising way to host a virus scanning server alongside your web application.

Setup OWASP Juice Shop in Azure Container Instances (Part 3 of 3)

In the second part of this series we walked through using Web App for Containers as a way to get the OWASP Juice Shop Project up and running. 

In this part, I want to provide a step-by-step reference on how to get it running using Azure Container Instances.

Using the Azure Portal

1. Login to your Azure Subscription at https://portal.azure.com

2. Click on the Create Resource (plus) button in the upper left corner, select Containers, then Azure Container Instances


3. On the Create Azure Container Instances Basics blade enter values for the following:

  • Container name: unique name for your container (not the name from the container registry)
  • Select Public for the container image type
  • Container image: bkimminich/juice-shop
  • Subscription: choose your subscription
  • Resource Group: select an existing or enter a new one
  • Select a location near you
  • Click OK


4. On the Configuration blade

  • Select Linux for the OS Type
  • Select 1 for Number of cores
  • Select 1.5 GB for Memory
  • Select Yes for Public IP Address
  • Enter 3000 for the Port number
  • Click OK


5. Click OK on the Summary blade


Once the container is started, you can navigate to the instance and find the IP Address in the upper right corner of the Overview panel.


If you copy this IP address, add :3000 on the end for the port, and open it in a browser, you will see Juice Shop running.


Using the Azure CLI or Cloud Shell

If you are using the Azure CLI, you will need to do step 0 below to log in; if you are using the Cloud Shell, you will need to do step 0 to open the shell.

Azure CLI – Only

0. In a command window type the following and press enter

az login

Then open a browser, enter the code shown to you to authenticate, and click Continue

You can now close that browser window.

Cloud Shell – Only

0. Click on the Cloud Shell button in the upper right of the portal


The remaining steps are the same for both the CLI and the Cloud Shell.

1. Create a resource group using the az group create command giving it a resource name and location and hit enter
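The command from my screenshot looks something like this (the group name matches the one used in the next step; the location is my pick, use one near you):

```shell
az group create --name juiceshop-cli-demo --location eastus
```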


2. Create a new container using the az container create command giving it values for:

  • --resource-group juiceshop-cli-demo
  • --name juiceshop-cli-aci1
  • --image bkimminich/juice-shop
  • --dns-name-label juiceshop-cli-aci
  • --ports 3000
  • --ip-address public
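Assembled into one command, those values look like this (run it after the resource group from step 1 exists):

```shell
az container create \
  --resource-group juiceshop-cli-demo \
  --name juiceshop-cli-aci1 \
  --image bkimminich/juice-shop \
  --dns-name-label juiceshop-cli-aci \
  --ports 3000 \
  --ip-address public
```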


Once the container is up and running, you can use this pattern to access the site: http://<dns-name-label>.<datacenter>.azurecontainer.io:3000


That is all it takes to get the Juice Shop up and running in Azure Container Instances – just 2 commands (1 if you already have a resource group). Pretty nice.

Setup OWASP Juice Shop in Web App for Containers (Part 2 of 3)

If you want to know more about Web App for Containers, you can see Part 1 of this series for a brief feature outline or, even better, the documentation for Web App for Containers (also often referred to as App Service on Linux) for more detail.

In this part I want to provide a step-by-step reference on how to get the OWASP Juice Shop Project set up and running in Web App for Containers.

Using the Azure Portal

1. Login to your Azure subscription at https://portal.azure.com

2. Click on the Create Resource (plus) button in the upper left corner, select Web + Mobile, then Web App for Containers


3. On the Web App for Containers Create blade enter the following:

  • App Name: enter unique name for app
  • Subscription: choose your subscription
  • Resource Group: select an existing or enter a new one

4. Click on the App Service plan

  • Click on Create New
  • Enter a name for the App Service Plan
  • Select a location near to you
  • Click Ok

5. Click on configure container

  • Select Docker Hub for the Image source
  • Select Public for Repository Access
  • Enter bkimminich/juice-shop for the Image name
  • Click OK


6. Check Pin to dashboard and click Create

Once the Web App loads and the overview blade is showing, click on the url in the upper right corner of the Overview


That should launch the Juice Shop in a browser:


Using the Azure CLI or Cloud Shell

If you are using the Azure CLI, you will need to do step 0 below (with the Cloud Shell there is no need to log in)

Azure CLI - Only

0. In a command window type the following and press enter

az login


Then open a browser, enter the code shown to you to authenticate, and click Continue


You can now close that browser window.

Cloud Shell – Only

0. Click on the Cloud Shell button in the upper right of the portal


The remaining steps are the same for both the CLI and the Cloud Shell.

1. Create a resource group using the az group create command giving it a resource name and location and hit enter
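The command looks something like this (the group name and location here are my own choices):

```shell
az group create --name juiceshop-web-demo --location eastus
```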


2. Create an app service plan using the az appservice plan create command giving it values for:

  • --name
  • --resource-group (same one you just created)
  • --sku
  • --is-linux
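Put together, the command looks something like the following (the plan and group names are mine, and B1 is just one valid Linux sku):

```shell
az appservice plan create \
  --name juiceshop-web-plan \
  --resource-group juiceshop-web-demo \
  --sku B1 \
  --is-linux
```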


3. Create the web app using the az webapp create command giving it values for:

  • --resource-group (same group as above)
  • --plan (same name as plan you just created)
  • --name
  • --deployment-container-image-name NOTE: This is: bkimminich/juice-shop
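Assembled, the command looks something like this (the app name matches the URL in my example below; the group and plan names are placeholders for whatever you created in steps 1 and 2):

```shell
az webapp create \
  --resource-group juiceshop-web-demo \
  --plan juiceshop-web-plan \
  --name juiceshop-web-cli \
  --deployment-container-image-name bkimminich/juice-shop
```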


Once the app is ready you can open a browser and navigate to the first URL in the enabledHostNames section of the JSON returned. In my example it was https://juiceshop-web-cli.azurewebsites.net


That was Web App for Containers; next we can move on to setting up OWASP Juice Shop in Azure Container Instances

How to Setup OWASP Juice Shop on Azure (Part 1 of 3)

Last year when I was working on my Securing Your Web Application in Azure with a WAF talk, I was looking for a way to avoid writing my own site that exposed things like SQL injection and cross site scripting (XSS) and happened to find the Juice Shop project (I think it was Bill Wilder that introduced me to it but I’m not 100% sure).  The OWASP Juice Shop Project is a great site for testing your exploit skills on a modern web app … or in my case testing the effectiveness of a Web Application Firewall (WAF).

There are many resources on the web with more information on the Juice Shop project and how to exploit it; I'm going to focus on the two easiest and quickest ways I've found to get it running in Azure:

  • Web App for Containers
  • Azure Container Instances

For the individual walkthroughs, I want to cover both using the Azure portal and the Azure CLI in order to serve as a better reference – so to keep the length shorter I’m going to break this up into three parts:

First a little about these Azure products and their features.

Web App for Containers

Web App for Containers is similar to Web Apps and builds on the App Service platform, but there isn't feature parity between the two. The most common features of Web Apps are supported, including:

  • FTP capability
  • Deployment Slots
  • CI/CD integration
  • Application Settings (think environment variables that can be managed in the control plane)
  • Backups
  • Custom domains
  • SSL Certificates
  • Scale in/out (including autoscale)
  • Scale up/down (though not all App Service tiers are available)

Things special to Web App for Containers:

  • SSH to the container experience
  • Ability to deploy the site from a container registry

Currently only Linux containers are supported – which for the case of running Juice Shop is not a problem.

Web App for Containers seems designed for the scenario when you want to host a web site from a (Linux) container.

Azure Container Instances

Container Instances are basically Containers-as-a-Service, designed for single container workloads. However, you can run multiple containers in container groups (similar to a pod in Kubernetes).

  • Supports both Linux or Windows containers
  • Can run containerized tasks (not just long-running web sites that never return)
  • Ability to mount Azure Files shares as volumes in a container
  • Can have multiple ports (and not just 80 and 443)
  • Public IP and DNS name labels are optional
  • Using the Kubernetes Connector, ACI can serve as a host in a burst scenario to handle excess capacity and host pods in container groups

Azure Container Instances seems more of a bare container product, designed for shorter-running sites or tasks as well as extending existing Kubernetes clusters when needed.


Now that I've introduced the products, let's get to the walkthroughs of the two options. Next is Web App for Containers.