Azure Virtual Machine CPU Cores Quota

Did you know that by default your Azure subscription has a limit on the number of CPU cores you can consume for virtual machines? Until recently I didn’t. Maybe, like you, I’d only ever created one or two small virtual machines (VMs) within my MSDN subscription for testing and always ended up deleting them before coming close to the limit.

Per Azure subscription the default quota for virtual machine CPU cores is 20.

To clarify, because the relationship between virtual and physical CPU cores can get a little confusing: this initial limit relates directly to what you see in the Azure portal when sizing your VM during creation.

AzureBasicVMs

For example, using the range of basic virtual machines to the right, you would hit your default limit with:

  • 10x A2 Basics (2 cores each)
  • 5x A3 Basics (4 cores each)
  • 2.5x A4 Basics (8 cores each, if it were possible to have 0.5 of a VM)
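The arithmetic above can be expressed as a quick sanity check. A minimal sketch in Python, assuming the per-size core counts implied by the examples in this post:

```python
# Sketch: how many VMs of each Basic size fit in the default 20-core quota.
# Core counts per size are inferred from the arithmetic in this post.
QUOTA = 20
CORES_PER_SIZE = {"A2 Basic": 2, "A3 Basic": 4, "A4 Basic": 8}

for size, cores in CORES_PER_SIZE.items():
    print(f"{size}: {QUOTA / cores:g}x VMs before hitting the quota")
```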

Fortunately, this initial quota is only a soft limit and is easily lifted for your production Azure subscriptions.

Before we look at how to solve this problem it’s worth learning how to recognise when you’ve hit the VM CPU core limit, as it’s not always obvious.

Limit Symptoms Lesson 1

When creating a new VM you might find that some of the larger size options are greyed out without any obvious reason why. This is because the cores already consumed plus the cores of the new machine size would exceed your current subscription limit, hence ‘Not Available’. Example below.
AzureVMSizesGrey

The other important quirk here is that the size blade only dynamically alters its colouring and availability state based on the CPU cores currently used by running VMs. If you have already reached your limit, but the VMs are stopped and showing as de-allocated resources, you will still be able to proceed and deploy another VM exceeding your limit. This becomes clearer in lesson 2 below, where the failure occurs.

Limit Symptoms Lesson 2

AzureVMCreateNormError

If you are unlucky enough to have exceeded your limit with de-allocated resources and were able to get past the VM size selection without issue, your new VM will now be deploying on your Azure dashboard if pinned. All good, right?… Wrong! You’ll then hit this deployment failure alert from the lovely notifications bell. Example on the right.

Note: I took this screenshot from a different Azure subscription where I’d already increased my quota to 40x cores.

This could occur in the following scenario.

  • You have 2x A4 Basic virtual machines already created. 1x is running and 1x is stopped.
  • The stopped VM meant you were able to proceed in creating a third A4 Basic VM.
  • During deployment the Azure portal has now done its sums across all resources, covering both stopped and running VMs.
  • 8x cores on running VM + 8x cores on stopped VM + 8x cores on newly deployed VM. Total 24.
  • This has exceeded your limit by 4x cores.
  • Deployment failed.

In short, this is the difference in behaviour between validating your VM at creation time vs validating it at deployment time. AKA a feature!
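The two validation behaviours described above can be sketched in a few lines. This is my interpretation of what the portal appears to do, not Azure's actual implementation: the creation-time check only counts cores in running VMs, while the deployment-time check counts cores across all VMs, de-allocated or not.

```python
QUOTA = 20  # default cores-per-subscription quota

def creation_time_ok(running_cores, new_vm_cores, quota=QUOTA):
    """Portal size blade: a size is selectable if RUNNING cores + new cores fit the quota."""
    return running_cores + new_vm_cores <= quota

def deployment_time_ok(running_cores, stopped_cores, new_vm_cores, quota=QUOTA):
    """Deployment: sums cores across running AND stopped (de-allocated) VMs."""
    return running_cores + stopped_cores + new_vm_cores <= quota

# The A4 Basic scenario above: 1x running (8 cores), 1x stopped (8 cores), new VM (8 cores).
print(creation_time_ok(8, 8))        # size selectable in the portal
print(deployment_time_ok(8, 8, 8))   # deployment fails: 24 > 20
```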

Limit Symptoms Lesson 3

AzureVMCreateWithJSONError2

If deploying VMs using a JSON template you will probably be abstracted away from the size and cores consumed by the new VM, because in the default template this is simply hardcoded into the relevant JSON attribute.

Upon clicking ‘Create’ on the portal blade you will be presented with an error similar to the example on the right. This is of course a little more obvious than lesson 1, and more helpful than lesson 2 in the sense that the deployment hasn’t yet been started. Still, this doesn’t really give you much in terms of a solution unless you are already aware of the default quota.
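For illustration, the size usually sits hardcoded in the template’s hardwareProfile section, which makes the implied core count easy to miss. A minimal fragment, not a complete template, and the size name is just an example:

```json
"hardwareProfile": {
    "vmSize": "Basic_A4"
}
```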

Apparently my storage account name wasn’t correct when I took this screenshot either. Maybe another blog post is required covering where the Azure portal is case sensitive and where it isn’t! Moving on.

 

The Solution

As promised, the solution to all the frustration you’ve encountered above.

AzureHelpAndSupport

To increase the number of CPU cores available to your Azure subscription you will need to raise a support ticket with the Azure help desk… Don’t sigh!… I assure you this is not as troublesome as you might think.

Within the Help and Support section of your Azure portal there are a series of predefined menus to do exactly this. Help and Support will be on your dashboard by default, or available via the far left hand root menu.

Within the Help and Support blade click to add a New Support Request. Then follow the prompts selecting the Issue Type as ‘Quota’, your affected Subscription and the Quota Type as ‘Cores per Subscription’.

AzureIncreaseCPUQuota

Once submitted, a friendly Microsoft human will review and approve the request to make sure it’s reasonable… Requesting 1000 extra CPU cores might get rejected! For me, requesting an additional 30 cores took only hours to get approved and made available, not the 2-4 business days the expectation-managing auto reply would have you believe.

Of course I can’t promise this will always happen as quickly, so my advice would be: know your current limits and allow time to change them if you need to scale your production systems.
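Knowing your current limits can itself be scripted. A minimal sketch in Python; the sample record below is illustrative only, shaped like the per-resource usage JSON you would get from a command such as `az vm list-usage --location <region>`:

```python
# Illustrative sample only, mimicking one record of `az vm list-usage` output.
sample_usage = [
    {"name": {"value": "cores"}, "currentValue": 16, "limit": 20},
]

def cores_remaining(usage):
    """Return remaining core headroom from a list of usage records, or None if absent."""
    for item in usage:
        if item["name"]["value"] == "cores":
            return item["limit"] - item["currentValue"]
    return None

print(cores_remaining(sample_usage))  # headroom before the next deployment fails
```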

I hope this post saved you the time I lost when creating 17x VMs for a community training session.

Many thanks for reading.



Paul’s Frog Blog

Paul is a Microsoft Data Platform MVP with 10+ years’ experience working with the complete on-premises SQL Server stack in a variety of roles and industries. Now, as the Business Intelligence Consultant at Purple Frog Systems, he has turned his keyboard to big data solutions in the Microsoft cloud, specialising in Azure Data Lake Analytics, Azure Data Factory, Azure Stream Analytics, Event Hubs and IoT. Paul is also a STEM Ambassador for the networking education in schools’ programme, PASS chapter leader for the Microsoft Data Platform Group – Birmingham, and a SQL Bits, SQL Relay and SQL Saturday speaker and helper. He is currently the Stack Overflow top user for Azure Data Factory, as well as a very active member of the technical community.
Thanks for visiting.
@mrpaulandrew