AWS to Azure – Making the Leap

I’m not even pretending that this is definitive or comprehensive. But, at 11 pm, here are a few notes and some helpful links and resources as a companion to a presentation I wrote earlier today.

Migrating Workflows

If you’re an AWS developer and you are thinking of exploring your options in Azure-world – here are some things to keep in mind:

  • You’ve already hit the big challenges in moving to the cloud. It’s much easier to move workloads from AWS to Azure than from on-prem to the cloud.
  • The majority of AWS functionality has an equivalent in Azure. My take: AWS had a three-year head start in the IaaS space, and that’s their strong point. Azure has a much stronger backbone and pedigree where the cloud really gets interesting – PaaS/SaaS scenarios. The feature competition between Amazon, Microsoft, and Google is going to continue accelerating – which is a very good thing for you.
  • VM conversion from EC2 is easy; PaaS/SaaS conversion is tougher. None of these mappings are truly apples-to-apples (for example, AWS Lambda -> Azure Functions).
  • Availability models are very different.
  • Project specific – know the integration points, SLAs, and underlying platform services.
  • Deployment models are better in Azure!
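To make the "most AWS functionality has an equivalent" point concrete, here's a rough cheat sheet as a minimal Python sketch. This is my own informal mapping pulled from the services discussed in this post – not an official equivalence table, and most pairs are close cousins rather than identical twins.

```python
# Informal AWS -> Azure service cheat sheet (my own mapping, not official;
# each pair is "closest counterpart", not an exact feature match).
AWS_TO_AZURE = {
    "EC2": "Virtual Machines",
    "S3": "Blob Storage",
    "Lambda": "Azure Functions",
    "Kinesis": "Event Hubs / Stream Analytics",
    "API Gateway": "API Management",
    "CloudFormation": "ARM Templates",
    "CloudTrail": "Activity Logs",
}

def azure_equivalent(aws_service: str) -> str:
    """Look up the closest Azure counterpart for an AWS service name."""
    return AWS_TO_AZURE.get(aws_service, "no direct counterpart")

if __name__ == "__main__":
    print(azure_equivalent("Lambda"))  # -> Azure Functions
```

The "no direct counterpart" default is the honest part of the sketch – some services simply don't map one-to-one.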


Migration is remarkably easy – basically you follow some simple steps using Azure Site Recovery.


Amazon AWS to Azure – General Resources

TechNet Radio Series from Microsoft:


Great Pluralsight video – I loved this. An excellent starting point for people new to PaaS architecture.



Architecture Overview

Wonderful set of reference architectures – this is a terrific link:



And a central repository for more whitepapers:


Another outstanding book – free – on Cloud Design Patterns. It’s a terrific read, and some companion reference works pair well with it:

Web Development Best Practices Poster (and see the link above for a Scalability poster as well)






Now let’s talk a little more about how some of these AWS components map into the Azure space.

Azure Functions


Blob Storage 

Blob Storage – very good walkthrough:


And more general notes on storage models in Azure:


Event Hubs


Azure Stream Analytics






API Management



ARM Templates







Azure auditing options for your custom reporting needs

Here are the five options I’ve been able to find – so far – if you need fine-grained detail on your Azure subscription usage (e.g. historically showing user access for security audits across multiple resource groups).


If you want a one-sentence recommendation – sorry, I have to stick with “It depends”. I think you get great power with the OMS option (#2), but the PowerBI option (#4) is up and coming and very robust.


  • Option 1: PowerShell client for Azure RM. See the links below for more on this.
  • Option 2: Operational Insights
  • Option 3: Azure built-in portal reporting
  • Option 4: PowerBI consuming the REST service. (See the links – this may very well be your best and most powerful option.)
  • Option 5: Other tools consuming the ARM auditing APIs/SDK/CLI. There are lots of log aggregation tools, ranging from Excel to very sophisticated third-party tooling, that consume the REST interface.


    In more detail:


    Option #1 – PowerShell

    This was what we used two years ago. Nowadays, it seems like best practice is log aggregation using Operations Management Suite (OMS). That gives you the best level of customization and fine-grained detail without having to take on PS scripting or consume REST endpoints manually.


    You can build auditing reports using ARM PowerShell, which in turn rests on the REST API we expose as part of Azure Resource Manager. A Microsoft walkthrough of setup, including deployment, is here.


    There’s a good walkthrough on installing the PowerShell client for Azure Resource Manager here. This blog goes through it in detail, including answers to questions like “who accessed my subscription in the past 60 days?” and “what access does a specific user have?”. We could extend this to show more detail points.


    There’s a walkthrough on this blog of building out auditing reports. It uses ARM PowerShell to come up with a user list on subscriptions, modules used, etc. And of course there are third-party products offering services in this space as well.


    The auditing APIs are evolving fast, per my friends on the product team – and there are some great third-party tools out there that will provide this info. For you script-based junkies, PS might be a great option. You can use PowerShell to view the Azure Activity Logs, showing all operations on the subscription and who did what. From there you can consume those APIs – fairly easily – and crunch them into something useful.


    Start with the PS cmdlet Get-AzureRmLog:
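Under the covers, Get-AzureRmLog is just querying the Activity Log REST endpoint. Here's a minimal Python sketch of the URL shape that query takes – the API version and filter syntax shown are illustrative of what I've seen, so treat it as a starting point rather than gospel:

```python
from urllib.parse import urlencode

def activity_log_url(subscription_id: str, start: str, end: str) -> str:
    """Build the Activity Log query URL that Get-AzureRmLog wraps.
    Timestamps are ISO 8601 strings; api-version shown is illustrative."""
    base = (
        f"https://management.azure.com/subscriptions/{subscription_id}"
        "/providers/microsoft.insights/eventtypes/management/values"
    )
    # OData-style filter restricting results to a time window
    flt = f"eventTimestamp ge '{start}' and eventTimestamp le '{end}'"
    return base + "?" + urlencode({"api-version": "2015-04-01", "$filter": flt})

if __name__ == "__main__":
    url = activity_log_url(
        "00000000-0000-0000-0000-000000000000",  # placeholder subscription id
        "2016-01-01T00:00:00Z",
        "2016-01-31T00:00:00Z",
    )
    print(url)
```

You'd issue that GET with a bearer token from Azure AD; the response is a JSON page of events you can crunch however you like.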



    Option 2 – Operational Insights

    On #2 above, there’s an overview here of Operational Insights. An overview page on Log Analytics is here; documentation and an FAQ are here. There isn’t much deep-dive info on Operations Management Suite (OMS) within Premier, but if you think this is a worthwhile option we can engage a PFE and even build you out a pilot on it. The audit data can also now be connected directly to OMS (as well as Event Hubs and storage accounts). For the type of reporting you are talking about, I think OMS would be the answer.



    Also worth pointing out that this covers only activities carried out through ARM. If you want to see the audit records for changes to RDFE resources (i.e. classic cloud services, etc.), then you still need to use the Operation Logs in the classic portal (or the API). This caught me out recently while trying to help a customer audit config changes to cloud services.



    Option #3 – Built in reporting in Azure

    Note that the audit data from Azure (ARM) is now available and searchable in the Azure Portal via the Activity Logs blade.





  • According to this article, there are five different types of reporting available to subscription admins OOTB.
    • Anomaly reports – Contain sign in events that we found to be anomalous. Our goal is to make you aware of such activity and enable you to be able to make a determination about whether an event is suspicious.
    • Integrated Application reports – Provides insights into how cloud applications are being used in your organization. Azure Active Directory offers integration with thousands of cloud applications.
    • Error reports – Indicate errors that may occur when provisioning accounts to external applications.
    • User-specific reports – Display device/sign in activity data for a specific user.
    • Activity logs – Contain a record of all audited events within the last 24 hours, last 7 days, or last 30 days, as well as group activity changes, and password reset and registration activity.


    Option 4 – PowerBI

There are a couple of slick ways to build out PowerBI reports directly from the REST endpoints. Some great references on this are here – this goes through the Power BI Content Pack for Azure Audit Logs. There’s a secondary article here with some snapshots. From this doc:

“In a nutshell, Azure Audit Logs is the go-to place to view all control plane events/logs from all Azure resources. It includes system and user generated events. You can also access this through the Azure Insights SDK, PowerShell, REST API and CLI. The logs are preserved for 90 days in Azure’s Event Logs store.”

Here’s the data you can gather:

  • Events by any particular resource over time
  • Which users perform what actions, how frequently and on what resources
  • Actions and events per subscription, resource group, region etc.
  • Azure Service Health (outages and maintenance) events that potentially impacted your resources
  • Alerts and AutoScale events by resource and time
  • Failures and successes of deployments and registrations


Microsoft has further documentation explaining how you can access Azure Audit Logs in the Azure Portal.


Option 5 – Other tools:

  • There’s advanced reporting available in Azure Active Directory as well, via Azure Active Directory Premium. Advanced reports help you improve access security, respond to potential threats, and get analytics on device access and application usage. There’s a walkthrough of this at this page.



I hope to add to this in the future with some great third party tooling we could recommend. Stay tuned!


Explorations with Azure – knocking out a pilot website in about two hours.

So recently I was asked to put together a pilot website for a friend who’s not super tech-savvy. His business may or may not ever get off the ground; like most of us he’s wanting to float something out there and see if it goes somewhere. An internet storefront seemed like a great place to start.

OK, so after the requisite weeks of haggling over the name, I scaffolded out a site and posted it on Azure. I’m happy to announce that the site is awesome, very responsive – and best of all, free. It’s a VERY good use case for what Azure does very well – quick ramp-up sites that may or may not see full-size production traffic.

My first step was to create a Visual Studio Online project. This was simple – see below:

Following this I opened up the project in Visual Studio 2013. I had to map the workspace and pull down the code – in this case it was empty – before I could see our old friend, Source Control Explorer.

Now, to work. I clicked File, New Project, and selected ASP.NET Web Application. I selected Application Insights and synched it up with my new account.


I used the MVC option as my main option – but also added WebForms and WebAPI in case I want to tack them on in the future. Now, I’ve got a set of code available – that I can see is ready to be checked in. Right mouse click that baby and check in the code!


Back to your Azure account. Click on New Web Site and select the Custom Create option. I entered in my desired URL and a bare-bones SQL option – and, this is important, selected the Publish From Source Control option.

Selecting that option took me to the authorize-connection screen in Azure, where I synched up my source code with the new Azure website being spun up. Select the Authorize Now option here.

Once I did this, it took me back to Visual Studio 2013 – where a publish profile was ready to go for me. I skipped through the now-familiar Profile -> Connection -> Settings -> Preview steps – they were all set up nicely by the Azure publish profile – and clicked Publish.


… and, my site was published – lickety-split. I could view it on stonefly. I also get all that cool less-is-more Azure dashboarding, showing requests/response time:

Best of all, I now have my site up and my code available in a repository that I can access anywhere. Feel free to check it out. No need to walk around with thumbdrives anymore. And there’s something about building out a demo site quickly and cheaply like this that appeals to me. Add to it that my TFSOnline account was free – yes, free (if MSFT ever changes that, I could move over to GitHub) – and the case for handling this in Azure versus going all-out and spinning it up on or the like becomes pretty compelling. This is enough for my friend to evaluate and decide on next steps. Sprinkle in some HTML and CSS, and suddenly we’ve got ourselves a going concern:

Now, I could buy a DNS address (like, I don’t know, stoneflysoftware.azu or the like) and register it with Azure. But I probably won’t. As you can see from this site, it ain’t cheap – we’re talking $56/month for basic, which is a great deal if you have 6 or more sites you’d like to host, or you need true geodistributed 99.9% HA coverage on your site. Scott Hanselman has a whole series of articles on pennypinching with Azure that I’d recommend, including this nifty video on using Azure as a CDN.