Jason Haley

Ramblings from an Independent Consultant

Virus Scan File Uploads Using Multi-Container Web App

This month at the Microsoft Build conference, the Azure App Service team announced multi-container support for Linux Web Apps. This capability enables you to deploy multiple containers in a single Web App.

In the session PaaS and Container Innovation – What’s new with App Service, members of the App Service team showed a demo of a Web App that has three containers: Nginx, WordPress and Redis.

The multi-container capability isn’t designed to compete with Kubernetes or other orchestrators; it simply makes it easy to add the extra container or two that helps support your containerized web application (a cache, for example).

This past weekend I was working on an Azure Functions extension that I’m planning to use to provide virus scanning for a website, when the thought crossed my mind that multi-container support would make it even easier to add virus scanning to a web app. So, I took a detour from working on my extension and worked on this sample instead.

Backstory

Web sites often need the ability to upload files. However, if you have been through a secure code review or penetration testing, you’ll know that to safely provide that functionality to your users you need to scan any uploaded files for viruses. This is one of those parts of an implementation people put off until later – especially if they don’t already have a virus scanning solution – and then find out implementing it isn’t as easy as it should be.

In Azure you can upload files easily to blob storage, ensure the transport is secure and even ensure the files are encrypted at rest. But scanning those saved blobs for viruses is one of those features you have to implement yourself.

A couple of options for virus scanning via an API:

  • VirusTotal – a third party API that would require passing the file out of Azure to the service
  • ClamAV – an open source anti-virus scanning server (GNU GPL v2 license)

For my scenario, I have the following constraints:

  • I need to be able to integrate the virus scanning into my codebase using C#
  • I cannot transfer the files out of the data center just to scan for viruses
  • I don’t want to have a VM running 24/7 that is only used to scan less than 100 files a month

The Solution: Linux Web App with Two Containers

After doing some research, I’ve found a way to stay within my constraints and easily add virus scanning to a Web App.

  1. Use ASP.NET Core so I can run the site in a Linux container
  2. Use a second container to run the mkodockx/docker-clamav image (and use the NuGet nClam package as a client to the ClamAV server)
  3. No need for a VM since we can now run multiple containers in a single Web App

blogimage1

Creating the demo web app

To verify things work the way I want, I created a simple web app that uploads files and then displays the results of the virus scan. To save time, I started with the ASP.NET Core Web Application template, ripped the majority of the views and actions out, and then used some code from the ASP.NET Core documentation for the file upload logic: File uploads in ASP.NET Core

I put a copy of the code in GitHub if you want to see the full web site code: https://github.com/JasonHaley/file-upload-demo

Here’s the code that takes the uploaded file(s), passes them off to the ClamAV server container and returns the results:

// Requires: using Microsoft.AspNetCore.Http; and using nClam;
[HttpPost("UploadFiles")]
public async Task<IActionResult> UploadFiles(List<IFormFile> files)
{
    var log = new List<ScanResult>();

    foreach (var formFile in files)
    {
        if (formFile.Length > 0)
        {
            // Connect to the ClamAV container by its service name (clamav-server) on the default clamd port
            var clam = new ClamClient("clamav-server", 3310);
            var result = await clam.SendAndScanFileAsync(formFile.OpenReadStream());

            log.Add(new ScanResult()
            {
                FileName = formFile.FileName,
                Result = result.Result.ToString(),
                Message = result.InfectedFiles?.FirstOrDefault()?.VirusName,
                RawResult = result.RawResult
            });
        }
    }

    var model = new UploadFilesViewModel();
    model.Results = log;

    return View(model);
}

The important thing to note about the ClamClient is that the communication between the web site and the clamav-server container uses the container’s name, not an IP address.

Walkthrough

You can follow the rest of this entry if you want to get it going yourself. To do so, you will need Docker for Windows running on your machine and a recent version of Visual Studio 2017.

Once you have the file upload logic in your ASP.NET Core Web application, you need to add the Docker support to the project.

1. Right click on the WebApp in the Solution Explorer

2. Choose Add -> Docker Support

blogimage2

This will add a Dockerfile to your Web project and a docker-compose project to the solution.

blogimage3

In your docker-compose project, open the docker-compose.yml file and add the clamav-server to the services, as shown below:

version: '3.4'

services:
  clamav-server:
    image: mkodockx/docker-clamav
  webapp:
    image: ${DOCKER_REGISTRY}webapp
    build:
      context: .
      dockerfile: WebApp/Dockerfile

Now run the debugger (hit F5) to start the web application. The first run will take a little while to start since it has to pull down the ClamAV image and update the virus definitions.

Once it starts, you should see a file upload page like the one shown here:

blogimage4

Select a file to upload and see if it has a virus in it:

blogimage5

If you want to test for a virus, you can find the EICAR test file text here: http://www.eicar.org/86-0-Intended-use.html
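If you would rather not download anything, you can also create a test file yourself from the standard EICAR signature string (it is harmless by design, but anti-virus engines are required to flag it). For example, from a shell:

# Write the standard EICAR test string to a file (your local anti-virus may quarantine it immediately)
echo 'X5O!P%@AP[4\PZX54(P^)7CC)7}$EICAR-STANDARD-ANTIVIRUS-TEST-FILE!$H+H*' > eicar.txt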

Push the Docker Image to Docker Hub

Now that the code works locally, the next step is to put the web project’s container into a repository so you can configure a Web App in Azure to use it.

For the purposes of this demo, I put my web app in docker hub at: https://hub.docker.com/r/haleyjason/file-upload-demo/

If you want to create your own image to put in Docker Hub, change your build to a Release build and then start the application again. This will create the release image locally.

You will also need to have a Docker Hub account and create a repo to push the image to.

Once you have the Docker Hub repo ready, complete the following steps at a command line:

  1. List your Docker images to get the image id and name
    docker images
  2. Log in by using the following, entering your Docker Hub username instead of <username>
    docker login --username <username>
    Then enter your password when prompted
  3. Tag your image using its image id and your repo name
    docker tag <image> <dockerhub account>/<docker hub repo>:<tag>
    I used something like:
    docker tag 0c98 haleyjason/file-upload-demo:latest
  4. Push the image to the repo
    docker push <dockerhub account>/<docker hub repo>

You should now have the web app container in Docker Hub.

Create the Azure Web App

The last step is to create a Web App and a Docker Compose file to connect the images.

First create a docker-compose.yml file that just connects the containers. The file contents should be similar to the following:


version: '3.4'

services:
  webapp:
    image: haleyjason/file-upload-demo
  clamav-server:
    image: mkodockx/docker-clamav

Save this file somewhere so you can upload it to the Web App in the next part.

  1. In the Azure portal, click on the plus in the upper left corner -> Web -> Web App
    blogimage6
  2. On the Web App blade:
    - Provide an App name
    - Select your subscription
    - Select or create a new Resource Group
    - For OS, select Docker
    - For demo purposes, stay with the Generated App Service plan
    blogimage7
  3. Click on the Configure container menu, then the Docker Compose (Preview) tab
  4. In the Configuration text box, click the folder icon and select the docker-compose.yml file you created earlier that connects the two containers.
     blogimage11
  5. Click OK
  6. Check the Pin to dashboard checkbox
  7. Click the Create button to get the process of creating the web app started
  8. Once the web app is ready, in the Overview blade, click on the URL for the application

Now wait 5 – 10 minutes … the first load takes several minutes – but once it is up and running it responds normally. 

When I select a couple of files:

blogimage9

I now get the scanned results:

blogimage10
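As an aside, if you would rather script the deployment than click through the portal, the Docker Compose configuration can also be supplied from the Azure CLI. Here is a rough, untested sketch (the resource group, plan and app names are just placeholders, and the multi-container options were still in preview at the time of writing):

az group create --name file-upload-demo-rg --location eastus
az appservice plan create --name file-upload-demo-plan --resource-group file-upload-demo-rg --sku S1 --is-linux
az webapp create --name file-upload-demo-app --resource-group file-upload-demo-rg --plan file-upload-demo-plan \
    --multicontainer-config-type COMPOSE --multicontainer-config-file docker-compose.yml

The docker-compose.yml referenced here is the same file from the previous section that points at the two images in Docker Hub.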

Conclusion

The new multi-container capability of Azure App Service Linux Web Apps seems like a promising way to host a virus scanning server alongside your web application.

Setup OWASP Juice Shop in Azure Container Instances (Part 3 of 3)

In the second part of this series we walked through using Web App for Containers as a way to get the OWASP Juice Shop Project up and running. 

In this part, I want to provide a step-by-step reference on how to get it running using Azure Container Instances.

Using the Azure Portal

1. Login to your Azure Subscription at https://portal.azure.com

2. Click on the Create Resource (plus) button in the upper left corner, select Containers, then Azure Container Instances

image

3. On the Create Azure Container Instances Basics blade enter values for the following:

  • Container name: unique name for your container (not the name from the container registry)
  • Select Public for the container image type
  • Container image: bkimminich/juice-shop
  • Subscription: choose your subscription
  • Resource Group: select an existing or enter a new one
  • Select a location near you
  • Click OK

image

4. On the Configuration blade

  • Select Linux for the OS Type
  • Select 1 for Number of cores
  • Select 1.5 GB for Memory
  • Select Yes for Public IP Address
  • Enter 3000 for the Port number
  • Click OK

image

5. Click OK on the Summary blade

image

Once the container is started you will be able to navigate to the instance and find the IP Address in the upper right corner of the Overview panel.

image

If you copy this IP address into a browser and add :3000 on the end for the port, you will get Juice Shop running.

image

Using the Azure CLI or Cloud Shell

If you are using the Azure CLI, you will need to do step 0 to log in; if you are using the Cloud Shell, you will need to do step 0 to open the shell.

Azure CLI – Only

0. In a command window type the following and press Enter

az login

Then open a browser and enter the code shown to you for authentication, then click Continue

You can now close that browser window.

Cloud Shell – Only

0. Click on the Cloud Shell button in the upper right of the portal

image

The remaining steps are the same for both the CLI and the Cloud Shell.

1. Create a resource group using the az group create command, giving it a resource group name and a location, and hit Enter

image
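For reference, the command will look something like this (the resource group name matches the one used in the next step; the location is just an example):

az group create --name juiceshop-cli-demo --location eastus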

2. Create a new container using the az container create command, giving it values for the following (the assembled command is shown after the screenshot):

  • --resource-group juiceshop-cli-demo
  • --name juiceshop-cli-aci1
  • --image bkimminich/juice-shop
  • --dns-name-label juiceshop-cli-aci
  • --ports 3000
  • --ip-address public

image
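Putting those values together, the full command looks something like this:

az container create --resource-group juiceshop-cli-demo --name juiceshop-cli-aci1 \
    --image bkimminich/juice-shop --dns-name-label juiceshop-cli-aci \
    --ports 3000 --ip-address Public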

Once the container is up and running, you can use this pattern to access the site: http://<dns-name-label>.<datacenter>.azurecontainer.io:3000

image

That is all it takes to get the Juice Shop up and running in Azure Container Instances – just 2 commands (1 if you already have a resource group).  Pretty nice.

Setup OWASP Juice Shop in Web App for Containers (Part 2 of 3)

If you want to know more about Web App for Containers, you can see Part 1 of this series for a brief feature outline, or even better the documentation for Web App for Containers (also often referred to as App Service on Linux) for more detail.

In this part I want to provide a step-by-step reference on how to get the OWASP Juice Shop Project set up and running in Web App for Containers.

Using the Azure Portal

1. Login to your Azure subscription at https://portal.azure.com

2. Click on the Create Resource (plus) button in the upper left corner, select Web + Mobile, then Web App for Containers

image

3. On the Web App for Containers Create blade enter the following:

  • App Name: enter a unique name for the app
  • Subscription: choose your subscription
  • Resource Group: select an existing or enter a new one

4. Click on the App Service plan

  • Click on Create New
  • Enter a name for the App Service Plan
  • Select a location near you
  • Click Ok

5. Click on configure container

  • Select Docker Hub for the Image source
  • Select Public for Repository Access
  • Enter bkimminich/juice-shop for the Image name
  • Click OK

image

6. Check Pin to dashboard and click Create

Once the Web App loads and the Overview blade is showing, click on the URL in the upper right corner of the Overview

image

That should launch the Juice Shop in a browser:

image

Using the Azure CLI or Cloud Shell

If you are using the Azure CLI, you will need to do step 0 below (with the Cloud Shell there is no need to log in)

Azure CLI - Only

0. In a command window type the following and press enter

az login

image

Then open a browser and enter the code shown to you for authentication, then click Continue

imageimage

You can now close that browser window.

Cloud Shell – Only

0. Click on the Cloud Shell button in the upper right of the portal

image

Ok, the remaining steps will work with both the CLI and the Cloud Shell

1. Create a resource group using the az group create command, giving it a resource group name and a location, and hit Enter

image

2. Create an app service plan using the az appservice plan create command, giving it values for the following (an example command is shown after the screenshot):

  • --name
  • --resource-group (same one you just created)
  • --sku
  • --is-linux

image
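For example (the plan and resource group names here are just placeholders – use the group you created in step 1 – and pick whichever SKU suits you):

az appservice plan create --name juiceshop-web-plan --resource-group juiceshop-web-demo --sku S1 --is-linux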

3. Create the web app using the az webapp create command, giving it values for the following (an example command is shown after the screenshot):

  • --resource-group (same group as above)
  • --plan (same name as plan you just created)
  • --name
  • --deployment-container-image-name NOTE: This is: bkimminich/juice-shop

image
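For example (the app name must be globally unique – this one matches the hostname in my output below – and the group and plan names are the placeholders from the previous steps):

az webapp create --resource-group juiceshop-web-demo --plan juiceshop-web-plan --name juiceshop-web-cli \
    --deployment-container-image-name bkimminich/juice-shop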

Once the app is ready you can open a browser and navigate to the first URL in the enabledHostNames section of the JSON returned.  In my example it was https://juiceshop-web-cli.azurewebsites.net

Next

That was Web App for Containers; now we can move on to Setup OWASP Juice Shop in Azure Container Instances

How to Setup OWASP Juice Shop on Azure (Part 1 of 3)

Last year when I was working on my Securing Your Web Application in Azure with a WAF talk, I was looking for a way to avoid writing my own site that exposed things like SQL injection and cross site scripting (XSS) and happened to find the Juice Shop project (I think it was Bill Wilder that introduced me to it but I’m not 100% sure).  The OWASP Juice Shop Project is a great site for testing your exploit skills on a modern web app … or in my case testing the effectiveness of a Web Application Firewall (WAF).

There are many resources on the web with more information on the Juice Shop project and how to exploit it; I’m going to focus on the two easiest and quickest ways I’ve found to get it running in Azure:

  • Web App for Containers
  • Azure Container Instances

For the individual walkthroughs, I want to cover both the Azure portal and the Azure CLI in order to serve as a better reference – so to keep the length shorter I’m going to break this up into three parts:

  • Part 1 – How to Setup OWASP Juice Shop on Azure (this overview)
  • Part 2 – Setup OWASP Juice Shop in Web App for Containers
  • Part 3 – Setup OWASP Juice Shop in Azure Container Instances

First a little about these Azure products and their features.

Web App for Containers

Web App for Containers is similar to Web Apps and builds on the App Service platform, but there isn’t feature parity between the two.  The most common features of Web Apps are supported, including:

  • FTP capability
  • Deployment Slots
  • CI/CD integration
  • Application Settings (think environment variables that can be managed in the control plane)
  • Backups
  • Custom domains
  • SSL Certificates
  • Scale in/out (including autoscale)
  • Scale up/down (though not all App Service tiers are available)

Things special to Web App for Containers:

  • An SSH-to-the-container experience
  • Ability to deploy the site from a container registry

Currently only Linux containers are supported – which for the case of running Juice Shop is not a problem.

Web App for Containers seems designed for the scenario when you want to host a web site from a (Linux) container.

Azure Container Instances

Container Instances are basically Containers-as-a-Service and are designed for single-container workloads.  However, you can run multiple containers in container groups (similar to a pod in Kubernetes).

  • Supports both Linux or Windows containers
  • Can run containerized tasks (it isn’t designed only for long-running processes such as web sites)
  • Ability to mount Azure Files shares as volumes in a container
  • Can have multiple ports (and not just 80 and 443)
  • Public IP and DNS name labels are optional
  • Using the Kubernetes Connector, ACI can serve as a host in a burst scenario to handle excess capacity and host pods in container groups

Azure Container Instances seems like more of a bare container product, designed for shorter-running sites or tasks as well as for extending existing Kubernetes clusters when needed.

Next

Now that I’ve introduced the products, I will provide the walkthroughs of the two different options. Next is Web App for Containers.

Talk: Securing Your Web Application in Azure with a WAF

Last night I spoke at the Boston Azure User Group. The slides are available here: Securing Your Web Application in Azure with a WAF

As I mentioned last night, this is the first edition of the talk; I plan on giving it to my user group next month (October) and again at the Boston Code Camp in November – so if you have any feedback please let me know, I’d love to improve the content.

I want to work more demos into the presentation next time … and of course get them all to work (1 demo didn’t work last night).

Here are some of the useful links mentioned last night:

Next time for the demos I’m going to look at using the OWASP Juice Shop webapp

Talk: Design Azure Web Apps and Mobile Apps

Wednesday and Thursday of this week (Sept 27 – 28) I helped out with the second round of the 70-534 Architecting Azure Solutions Event in NYC (the Boston event was Sept 14 - 15).  Microsoft is currently organizing these events in many cities in the Midwest and on the East Coast.

My talk was on Web and Mobile Apps.  There were about 40 – 50 people in the room and around 200 on the simulcast.

BTW: if you are interested in having an event streamed you should check out Nelco Media.

One of the links I mentioned for people who are studying for the exam: Exam prep for Architecting Microsoft Azure Solutions, 70-534 – it has the exam ODs linked to the corresponding landing areas in the Azure documentation.

My presentation can be downloaded on GitHub here, and here are links to the labs for the Web Apps and the Mobile Apps

Azure Help: Traffic Manager–PowerShell Script to Add Traffic Manager Probe IP Addresses

If you are using Traffic Manager to route traffic between two data centers and the endpoints are inside of a VNET, you’ll need to add network security group rules for Traffic Manager’s probe IP Addresses or your endpoints will always show as degraded.

image

Once you think about the purpose of the probe, it makes total sense why the endpoint shows as “Degraded” – if Traffic Manager can’t get to the endpoint, then it can’t decide whether it should send traffic to it or not.

So in order to fix this, you need to add the Traffic Manager probe IP Addresses to the inbound NSG rules of your VNET.  That is simple to do in the portal (or via PowerShell) but – there are 23 of them!  Who really wants to manually enter 23 rules? … and don’t forget there are 2 data centers, so that is really 46 of them.

I certainly don’t enjoy entering that many NSG rules (even though it is easy) – so I created a script to do it.

In my case, I had east and west coast data center locations.  You’ll need to plug in your subscription, resource groups, and NSG names, and maybe change the probe port and the number to start the priority at (I have 150 – 173).

You should also verify the IP Addresses have not changed from the listing on this page: https://docs.microsoft.com/en-us/azure/traffic-manager/traffic-manager-faqs under the question “What are the IP addresses from which the health checks originate?” about 3/4 of the way down the page.

The source code can be found here: https://github.com/JasonHaley/AzureHelp/blob/master/TrafficManager/AddTrafficManagerProbeRules.ps1

#Verify latest Probe IP Addresses at https://docs.microsoft.com/en-us/azure/traffic-manager/traffic-manager-faqs

$subscriptionId = "YourSubscriptionId"
$resourceGroupEast = "EastCoastResourceGroup"
$resourceGroupWest = "WestCostResourceGroup"
$nsgEast = "EastCoastNSG"
$nsgWest = "WestCoastNSG"
$probePort = 80 
$rulePriorityStart = 150
$rulePriority = $rulePriorityStart

$trafficManagerProbeIPs = @("40.68.30.66",`
                             "40.68.31.178", `
                             "137.135.80.149", `
                             "137.135.82.249", `
                             "23.96.236.252", `
                             "65.52.217.19", `
                             "40.87.147.10", `
                             "40.87.151.34", `
                             "13.75.124.254", `
                             "13.75.127.63", `
                             "52.172.155.168", `
                             "52.172.158.37", `
                             "104.215.91.84", `
                             "13.75.153.124", `
                             "13.84.222.37", `
                             "23.101.191.199", `
                             "23.96.213.12", `
                             "137.135.46.163", `
                             "137.135.47.215", `
                             "191.232.208.52", `
                             "191.232.214.62", `
                             "13.75.152.253", `
                             "104.41.187.209",`
                             "104.41.190.203")

Login-AzureRmAccount

Set-AzureRmContext -SubscriptionId $subscriptionId

$groupEast = Get-AzureRMNetworkSecurityGroup -ResourceGroupName $resourceGroupEast `
     -Name $nsgEast
$groupWest = Get-AzureRMNetworkSecurityGroup -ResourceGroupName $resourceGroupWest `
     -Name $nsgWest


For($i=0; $i -lt $trafficManagerProbeIPs.Length; $i++) {
    $ruleName = "Inbound-TMProbe" + $i.ToString() + "-Https-Allow"

    $rulePriority = $rulePriorityStart + $i

    $groupEast | Add-AzureRmNetworkSecurityRuleConfig -Name $ruleName `
        -Description "Allow Traffic Manager Probe HTTPS" `
        -Access Allow -Protocol Tcp -Direction Inbound -Priority $rulePriority `
        -SourceAddressPrefix $trafficManagerProbeIPs[$i] -SourcePortRange * `
        -DestinationAddressPrefix * -DestinationPortRange $probePort

    $groupWest | Add-AzureRmNetworkSecurityRuleConfig -Name $ruleName `
        -Description "Allow Traffic Manager Probe HTTPS" `
        -Access Allow -Protocol Tcp -Direction Inbound -Priority $rulePriority `
        -SourceAddressPrefix $trafficManagerProbeIPs[$i] -SourcePortRange * `
        -DestinationAddressPrefix * -DestinationPortRange $probePort
}
$groupEast | Set-AzureRmNetworkSecurityGroup
$groupWest | Set-AzureRmNetworkSecurityGroup

This script has saved me a lot of time.

References

Traffic Manager routing methods

Traffic Manager Frequently Asked Questions (FAQ)

Manage network security groups using PowerShell

Create network security groups using PowerShell

Add/Set/Remove NSG rules in ARM mode Azure Powershell

Azure Help: WebApps–Copying and Exporting Connection Strings and App Settings

When you create a WebApp in Azure’s App Service, you get a couple of important features with Application Settings:

  1. The ability for your developers to not know the production app setting or connection string values … or just the option to develop with settings that are not used once deployed (such as working with a local database)
  2. The ability to override values in the web.config that is deployed with the application

The first item is important for security reasons.  It makes it possible to control access to the production credentials.  The second item is really useful for changing settings for different environments – such as a staging db for a staging slot and a production db for the production slot.

I am a big fan of using Application Settings in my WebApps.

If you are not familiar with them, just look for the Application Settings menu item in your Web App (shown below).  The App settings and Connection strings sections are what I want to discuss below.

imageimage

I’ve included some reference links at the bottom of this entry for you to learn more (if you are interested).

Problem 1:  Get a list of all the App Settings and/or Connection Strings of a deployed WebApp

The Azure portal will allow you to view all the settings, but sometimes you need the values for several app settings and/or connection strings – so viewing the values in the portal isn’t enough.

To solve this problem, I have created a PowerShell script that exports the values to two csv files (if I want both AppSettings and ConnectionStrings).  Usually it is just the AppSettings that I need, but you never know.

The source code can be found here: https://github.com/JasonHaley/AzureHelp/blob/master/WebApps/ExportConnectionStringsAndAppSettings.ps1

$subscriptionId = "<subscriptionId>"
$resourceGroupSource = "<source resource group>"
$webAppsource = "<source web app name>"
$slotSource = "<source slot>"

$appSettingsFileName = "appSettings.csv"
$connectionStringsFileName = "connectionStrings.csv"

Login-AzureRmAccount

Set-AzureRmContext -SubscriptionId $subscriptionId

# Load Existing Web App settings for source and target
$webAppSource = Get-AzureRmWebAppSlot -ResourceGroupName $resourceGroupSource `
   -Name $webAppsource -Slot $slotSource

# Create csv files if file names are set
If ($appSettingsFileName -ne "") {
    $webAppsource.SiteConfig.AppSettings | Select-Object -Property Name, Value | `
       Export-Csv -Path $appSettingsFileName -NoTypeInformation
}

If ($connectionStringsFileName -ne "") {
    $webAppsource.SiteConfig.ConnectionStrings | `
       Select-Object -Property Name, Type, ConnectionString | `
       Export-Csv -Path $connectionStringsFileName -NoTypeInformation
}

The generated csv files look like this:

image
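As an aside, if you just need a quick dump of the values and have the Azure CLI handy, the same information can be listed with commands along these lines:

az webapp config appsettings list --name <web app name> --resource-group <resource group>
az webapp config connection-string list --name <web app name> --resource-group <resource group>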

Problem 2:  Copy all the App Settings and/or Connection Strings to another WebApp

Once in a while I need to move WebApps from one place to another, or lately I’ve been upgrading clients from ASEv1 instances to ASEv2 instances – which means a new build of an environment (not an upgrade).  ASE stands for App Service Environment.

If I already have the values of the AppSettings and ConnectionStrings in a deployed WebApp … it would be nice to copy the values to the new one and not have to enter them one-by-one.

The important thing to note: AppSettings and ConnectionStrings are not in source control (unless you have complete ARM templates stored … which you wouldn’t want your secrets in – so that complicates matters) … so deploying the latest code to the new WebApp is only part of the solution of spinning up a new environment.

To solve this problem, I have created a PowerShell script that will copy over all the AppSettings and/or ConnectionStrings of one WebApp to another WebApp.

The source code can be found here: https://github.com/JasonHaley/AzureHelp/blob/master/WebApps/CopyConnectionStringsAndAppSettings.ps1

$subscriptionId = "<subscriptionId>"
$resourceGroupSource = "<source resource group>"
$resourceGroupTarget = "<target resource group>"

$webAppsource = "<source web app name>"
$webAppTarget = "<target web app name>"

$slotSource = "<source slot>"
$slotTarget = "<target slot>"

Login-AzureRmAccount

Set-AzureRmContext -SubscriptionId $subscriptionId

# Load Existing Web App settings for source and target
$webAppSource = Get-AzureRmWebAppSlot -ResourceGroupName $resourceGroupSource `
    -Name $webAppsource -Slot $slotSource

# Get reference to the source Connection Strings
$connectionStringsSource = $webAppSource.SiteConfig.ConnectionStrings

# Create Hash variable for Connection Strings
$connectionStringsTarget = @{}

# Copy over all Existing Connection Strings to the Hash
ForEach($connStringSource in $connectionStringsSource) {
    $connectionStringsTarget[$connStringSource.Name] = `
         @{ Type = $connStringSource.Type.ToString(); `
            Value = $connStringSource.ConnectionString }
}

# Save Connection Strings to Target
Set-AzureRmWebAppSlot -ResourceGroupName $resourceGroupTarget -Name $webAppTarget `
    -Slot $slotTarget -ConnectionStrings $connectionStringsTarget

# Get reference to the source app settings
$appSettingsSource = $webAppSource.SiteConfig.AppSettings

# Create Hash variable for App Settings
$appSettingsTarget = @{}

# Copy over all Existing App Settings to the Hash
ForEach ($appSettingSource in $appSettingsSource) {
    $appSettingsTarget[$appSettingSource.Name] = $appSettingSource.Value
}

# Save App Settings to Target
Set-AzureRmWebAppSlot -ResourceGroupName $resourceGroupTarget -Name $webAppTarget `
   -Slot $slotTarget -AppSettings $appSettingsTarget
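The Azure CLI can do something similar for app settings. Here is a rough sketch I haven’t verified end-to-end (check az webapp config appsettings set --help for the exact JSON file format it accepts):

# Dump the source app's settings to a file, then apply them to the target app
az webapp config appsettings list --name <source web app> --resource-group <source resource group> > appsettings.json
az webapp config appsettings set --name <target web app> --resource-group <target resource group> --settings @appsettings.json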

These two scripts save me a lot of time, since in reality there are usually quite a few AppSettings to work with (when I need to work with them).

References

Using App Settings in Azure Web Apps

Windows Azure Web Sites: How Application Strings and Connection Strings Work

Azure App Service Web Config Vs Application Settings

Easily Manage Azure Web App Connection Strings using PowerShell

Using Powershell to manage Azure Web App Deployment Slots

Azure Help Series

Today I’m starting a new series of blog entries – Azure Help.  There are many people who do “Tips and Tricks” and “Lessons Learned”, which I like and find useful – the idea with this series is the same, but I needed to come up with a unique name.

I want to cover real scenarios and lessons I’ve learned over the past 5 or so years of using Azure on a day-to-day basis.

Entry List

This is a list of the blog entries I have for Azure Help.

Idea List

This is a list of ideas that I’m planning future entries about:

  • WebApps – Maintenance and Logging
  • Cloud Services – Using VSTS Release Management to Change ConfigurationSettings
  • VSTS - Add and Remove NSG to Publish a Release to VNet in Azure

Source Code

AzureHelp - a repo for code used in these Azure Help entries.

Suggestions?

Do you have a suggestion for a future entry?  If so, feel free to email or tweet me your idea … as long as it makes sense, chances are good it will make it on the list.

Ways to get involved in the local tech community (5 of 5)

Help grow the community

The local tech community is made of the people in it and the value they provide for each other.  The community isn’t just user group leaders, speakers and volunteers.  In fact, the majority of the community are individuals trying to learn, teach, socialize and help others.

With that in mind, here are some ways for you to help grow and make the community stronger (ie. more valuable) for everyone.

Attend User Groups and Be Part of the Group

I think it is obvious, but by attending user groups you have a chance to personally interact with others in the community. 
Most groups have some networking/socializing time before or after the main event.  Use this time to talk to at least one new person you don’t know.  If you already know everyone there then maybe you could bring someone with you next time and introduce them.  The idea is the more people know each other in the community the more value everyone gets from it.

Once the main presentation starts, ask questions, be present.  Often times, if you have a question – chances are someone else would also like to know … so go ahead and ask.  The more interactive the group is the more value people will get from it.  Sometimes the small tangent a speaker takes to answer a question is the most valuable part of the talk.

Provide Feedback to Leaders and Speakers

For any event or presentation, I think everyone can provide value by providing honest feedback on the event.  For the leaders/organizers it will help with planning future events.  For speakers, honest feedback is always hard to come by and I don’t know a single speaker who wouldn’t like to get some feedback on their presentation.
If you don’t feel comfortable giving feedback, try suggesting topics or speakers for future events that you would be interested in.

Continue the Conversation

Once a user group meeting is over, you can still carry on the conversation.  Meetup.com provides the ability to add comments to the event; you can also use social media (like Twitter, Instagram, etc.) or even write a blog post to keep the conversation about an event going.

Help Spread the Word

Growing a community takes time and effort.  The more people that can help spread the word about the group, the bigger the pool of possible attendees … and the bigger the community.
Personally, I use twitter a lot for ‘spreading the word’ – though I have my doubts on how effective this really is.  Another thing I do is email people and mention groups, topics and people to them.  This helps me spread the word to some people who may not be in the know but may be interested.