Error when adding a VMware vCenter server to Virtual Machine Manager

When adding a vCenter server to Virtual Machine Manager, you might encounter the error “Could not retrieve a certificate from the FQDN server because of the error: The underlying connection was closed: An unexpected error occurred on a send.”

The root cause is a TLS incompatibility: the VMM server tries to connect using older TLS protocols and ciphers that newer vCenter versions no longer accept. To fix it, make the following registry changes (after taking a full backup, of course) to force .NET on the VMM server to use strong cryptography:

$NetRegistryPath = "HKLM:\SOFTWARE\Microsoft\.NETFramework\v4.0.30319"
New-ItemProperty -Path $NetRegistryPath -Name "SchUseStrongCrypto" -Value "1" -PropertyType DWORD -Force | Out-Null

$NetRegistryPath = "HKLM:\SOFTWARE\WOW6432Node\Microsoft\.NETFramework\v4.0.30319"
New-ItemProperty -Path $NetRegistryPath -Name "SchUseStrongCrypto" -Value "1" -PropertyType DWORD -Force | Out-Null

Then restart the Virtual Machine Manager server. You should now be able to add the vCenter server. Other errors can be related to missing updates; the latest vCenter versions are only supported with VMM 2019 Update Rollup 2 or later.
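
Before restarting, it is worth confirming the values were actually written. A minimal check, plus an optional service-only restart if a full reboot is inconvenient (SCVMMService is the usual name of the VMM service, so adjust if yours differs):

Get-ItemProperty "HKLM:\SOFTWARE\Microsoft\.NETFramework\v4.0.30319" -Name SchUseStrongCrypto
Get-ItemProperty "HKLM:\SOFTWARE\WOW6432Node\Microsoft\.NETFramework\v4.0.30319" -Name SchUseStrongCrypto

# Restarting just the VMM service is often enough to pick up the change
Restart-Service SCVMMService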

Add virtual machine host fails with error 20408

I recently had a problem adding a host to a VMM server. All the obvious things had been checked: WinRM was enabled, firewall rules were in place, the service account had admin rights, and DNS was correct.

Still, every time I attempted to add the host an error occurred:

“Error (20408) VMM could not get the specified instance Microsoft:{668f165d-4dae-bcb6-5007ff1fc2e8} of class http://schemas.microsoft.com/wbem/wsman/1/wmi/root/standardcimv2/MSFT_NetAdapterRssSettingData on the server server.fqdn. The operation failed with error NO_PARAM”

In this instance, the server was running Windows Server 2016, upgraded in place from 2012 R2. The fix was bizarre: save all the VMs, remove the vSwitches so that only the normal physical adapters remain, and then recreate the vSwitches. The configuration was identical, but clearly something behind the scenes was wrong, and recreating the vSwitches fixed it. Retrying the same job in VMM succeeded and the host was added.
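
If you want to script that rebuild, here is a rough sketch using the Hyper-V PowerShell module. The switch and adapter names are placeholders, and be sure to note your existing switch settings (management OS access, VLANs, NIC teaming) before removing anything:

# Run on the Hyper-V host after saving or shutting down the VMs
Get-VMSwitch | Select-Object Name, SwitchType, NetAdapterInterfaceDescription   # note down what exists
Remove-VMSwitch -Name "External vSwitch" -Force                                 # leaves just the physical adapter
New-VMSwitch -Name "External vSwitch" -NetAdapterName "Ethernet 2" -AllowManagementOS $true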

VMM Host not responding – WinRM Error and access is denied

If you have a Virtual Host in Virtual Machine Manager that is not responding, and forcing a manual refresh returns an error like this:

Error (2910)
VMM does not have appropriate permissions to access the resource C:\Windows\system32\qmgr.dll on the server.
Access is denied (0x80070005)

It can often be remedied by one of the following: re-installing the VMM agent, restarting the Virtual Machine Manager agent and WMI services, or restarting the virtual host. It is also worth making sure your hosts are all up to date.

Occasionally I see a host where this doesn’t work and, no matter what, it remains as “not responding” in VMM. In my experience the cause is usually a broken WinRM configuration. You can be fooled into thinking WinRM is set up correctly because “winrm /quickconfig” reports it is already configured and the WinRM service is running.

It looks like all the “winrm /quickconfig” command does is check that WinRM has been enabled; it won’t reset other possibly incorrect configurations or broken settings.

Comparing the WinRM configuration and registry of an identical working host to a “not responding” host, I have found the following commands correct the deviated settings and usually result in a host that responds to VMM again.

reg add HKLM\SOFTWARE\Microsoft\Windows\CurrentVersion\Policies\System /v LocalAccountTokenFilterPolicy /t REG_DWORD /d 1 /f

winrm set winrm/config/service/auth @{CredSSP="True"}
winrm set winrm/config/winrs @{AllowRemoteShellAccess="True"}
winrm set winrm/config/winrs @{MaxMemoryPerShellMB="2048"}
winrm set winrm/config/client @{TrustedHosts="*"}
winrm set winrm/config/client/auth @{CredSSP="True"}

Be sure to run these on the affected host in an admin command prompt.
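
To hunt for the deviation yourself, dumping the full configuration on a known-good host and on the broken host and diffing the two files is a quick way to spot it. A minimal sketch (the paths and host names are just examples):

# On each host, dump the full WinRM configuration to a text file
winrm get winrm/config > C:\temp\winrm-config.txt

# Then compare the two dumps from one machine
Compare-Object (Get-Content \\goodhost\c$\temp\winrm-config.txt) (Get-Content \\badhost\c$\temp\winrm-config.txt)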

Cannot deploy virtual machines via App Controller or Virtual Machine Manager

I found myself unable to deploy virtual machine templates via App Controller or Virtual Machine Manager. The detailed error was available when attempting the deployment via VMM and stated:

“The projected CPU utilization exceeds the CPU utilization of 0% specified at the host reserve level”

All the hosts had zero stars and I couldn’t continue. Normally this is a useful message, as it prevents me from putting too many virtual machines on my hosts and stretching them too far. In this instance, however, I knew we would be OK: we need 50 or so VMs on each host and they don’t use much CPU. It was possible to manually create more virtual machines in Hyper-V and then manage them post-deployment in VMM, but that defeats the point of having a scripted template deployment and an App Controller setup so users can deploy their own test VMs.

I found that there is a somewhat hidden option in the host reserve settings, available only in PowerShell. You can have a look at what yours are configured to use with the “Get-SCHostReserve” PowerShell command. The setting you need to change to bypass CPU reserves is the “CPUReserveOff” parameter.

Get-SCHostReserve

As you can see, my CPU reserve level is set to 0%, but when VMM evaluates the deployment, if it believes there will be less than even that 0% of CPU available, it says no.

You can change this with the Set-SCHostReserve command.

Get-SCHostReserve -VMHostGroup "your host group here" | Set-SCHostReserve -CPU -Enabled $false
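
Re-running Get-SCHostReserve afterwards lets you confirm the CPU reserve check is now disabled. A quick way to eyeball just the CPU-related properties (the wildcard is used because exact property names can differ between VMM versions):

Get-SCHostReserve -VMHostGroup "your host group here" | Format-List *CPU*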


You will still see the warning “The projected CPU utilization exceeds the CPU utilization of 0% specified at the host reserve level”, but VMM now lets you continue and deploy anyway.

 

WinRM Connection limits

Sometimes in various Microsoft products (Exchange/VMM mostly) you might reach some of the WinRM connection limits.  Personally I see it most in Virtual Machine Manager when you have many admins who are making lots of changes and deploying large numbers of services.

The symptom is sometimes an error like this: “Connecting to remote server failed with the following error message: The WS-Management service cannot process the request. This user is allowed a maximum number of 5 concurrent shells, which has been exceeded. Close existing shells or raise the quota for this user.”

It can also just show as a generic Failed/timeout job which then works when you re-try later.

1. On the offending server open a command prompt or PowerShell window with administrative privileges.
2. Type in winrm get winrm/config/winrs to view the current configuration.


The MaxConcurrentUsers and MaxShellsPerUser values will need to be increased, but don’t just add a load of zeros to the end: having sensible limits configured can stop unwanted or malicious connections from bringing a server to its knees.

To adjust the values, use the commands below, where 20 and 100 are numbers appropriate for your environment (a PowerShell alternative is sketched after the steps).

3. winrm set winrm/config/winrs @{MaxConcurrentUsers="20"}
4. winrm set winrm/config/winrs @{MaxShellsPerUser="100"}
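
If you prefer PowerShell, roughly the same change can be made through the WSMan: drive; the values are examples, as above:

Set-Item WSMan:\localhost\Shell\MaxConcurrentUsers 20
Set-Item WSMan:\localhost\Shell\MaxShellsPerUser 100

# Confirm the new values took effect
winrm get winrm/config/winrs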

Upgrading System Center Virtual Machine Manager 2012 SP1 to R2

The upgrade process is reasonably smooth but requires you to uninstall SCVMM 2012 SP1 and any hotfixes or update rollups that are in place. You will also need to uninstall the 8.0 Microsoft ADK and install the newer 8.1 version before running setup from the Virtual Machine Manager 2012 R2 media. The 8.1 ADK can be found here: http://www.microsoft.com/en-gb/download/details.aspx?id=39982

Before starting, gather and record the details of the existing implementation if they are not already documented: for example, the database server, instance name, database name and the details of the Virtual Machine Manager service account. You can view the current database configuration from within the VMM console: while in the “Administration/Settings” view, click on “General” and then double-click “Database Connection”. While it is possible to run the service as the local system account, I would recommend creating a domain user for the service to run as; with a local system service account, bare-metal Hyper-V host deployments are not possible and you may have trouble logging in from other machines.

I would also make sure at least 10GB of free space is available on the VMM server as the 8.1 ADK is quite large.
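
A quick way to confirm the free space before you start; C: is just an assumption, so check whichever volume the ADK will install to (values are reported in bytes):

Get-PSDrive C | Select-Object Used, Free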

  1. Uninstall both the VMM 2012 console and management features. Make sure you select the option to retain the database.
  2. Take a backup of your SCVMM database (a sketch follows this list).
  3. Uninstall the 8.0 ADK
  4. Install the 8.1 ADK. If you are using the online installer, a large amount of data will need to be downloaded; if this is not possible, consider running the installer ahead of the upgrade and downloading the full installation files for later offline installation.
  5. Run setup from the VMM 2012 R2 disk
  6. Simply follow the wizard entering the database and service account details you recorded earlier
  7. Once setup is complete there are a few tasks left: any driver packs will need to be removed and re-added or they may not be discovered correctly, virtual machine templates may need to be re-pointed to the correct operating-system VHD, and you will of course need to update the agents on all of your VMM-managed hosts.
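
For step 2, a minimal backup sketch using sqlcmd from the database server (or anywhere with the SQL client tools installed). The server, instance and backup path are placeholders, and VirtualManagerDB is only the default database name, so substitute the details you recorded earlier:

sqlcmd -S SQLSERVER\INSTANCE -Q "BACKUP DATABASE [VirtualManagerDB] TO DISK = N'D:\Backups\VirtualManagerDB.bak' WITH INIT"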

System Center 2012 – Inside the Private Cloud

My three favorite parts of the System Center suite are Configuration Manager, Data Protection Manager and Endpoint Protection. These three products work well at making most of the chores of running an IT environment lighter.

 

Configuration Manager & Endpoint Protection

This is, in my opinion, the flagship product of the System Center suite. Management of servers, workstations and even mobile devices is handled here, and with Service Pack 1 an impressive list of operating systems and devices is supported, including Linux and Mac OS. Mobile device management has now been brought into Configuration Manager as well. It is also within Configuration Manager that you should deploy and manage Endpoint Protection. Endpoint Protection was formerly known as Forefront Protection, and I really hope this product continues being supported and isn’t eventually dropped like other Forefront products have been, such as TMG. If you are lucky enough to have the Standard or Enterprise CAL already (and you really should if you are looking at System Center) then you might be able to save a fair bit of money by ditching your current antivirus vendor and moving to Endpoint Protection.

Typically in the past I would have used standard Windows deployment from a share or USB volume, or another vendor’s solution such as Ghost, as the Configuration Manager effort wasn’t always worth the reward. Deployments are now a lot easier, and when tied with a decent collection of drivers and task sequences it is simple to quickly cater for a new situation or model of desktop or server.

 

Data Protection Manager

In my experience no backup solution is perfect and generally each has its strengths and weaknesses. With the 2012 iteration of Data Protection Manager, the Microsoft offering is looking to have more of the former and fewer of the latter.

DPM is great at backing up Microsoft’s own products and applications, and I have been using it to back up nearly 2TB of Exchange data and a large SharePoint farm as well.

DPM offers many of the features an enterprise backup solution should, such as continuous protection, differential and incremental backups, and both disk-to-disk and disk-to-tape backups. I feel that it is only in the scheduling and retention options that DPM starts to fall down. Typically I like to keep daily data for a month, weekly data for 2-6 months, month-end data for 2 years and year-end data for even longer, but unfortunately the retention and scheduling options don’t really cater for this approach: you simply have a hard limit on how long you can retain backup data, with disk-to-disk used for short-term retention and a second schedule for long-term tape backups. This leads to me using a different product to perform end-of-month backups simply so that I can keep them for longer than the other tape backups.

Generally DPM performs very well and can run backups shockingly quickly, but it does have a tendency to occasionally mark replicas as bad or fail a snapshot, only for it to succeed later without issue. This is quite possibly a quirk of the environment I evaluated it in, but it is something which seems to happen a lot less often with other solutions. Also, on upgrading to SP1, be prepared to check the consistency of every replica and build the time taken for this into your upgrade plan.

Service Pack 1 is definitely worth the upgrade as it brings a number of feature improvements, such as support for deduplicated volumes, and is the final piece in the puzzle to getting dedupe working on cheap hardware. With Windows Server 2012 and Data Protection Manager you can use deduplicated volumes without the need to buy expensive storage solutions and licenses. The useful extra features don’t end there: Cluster Shared Volumes can now be backed up, and Hyper-V guest machines can be continuously protected even while they are being live migrated.

 

Virtual Machine Manager

Virtual Machine Manager is to Hyper-V as vCenter Server is to VMware. VMM is the only real additional software cost you will have to bear if you want to run a full clustered Hyper-V solution (God help anyone who wants to manage a large cluster of Hyper-V hosts as individual servers). The thinking obviously being that if Microsoft gives you the hypervisor for free you won’t balk at paying for the management tools, and I expect a good number of people will buy the System Center suite simply to be able to run Virtual Machine Manager. If this sounds like you, I hope you at least try the other parts of the System Center suite as they are worth a good look.

 

Orchestrator

Orchestrator is the centre point of the System Center suite and ties all of the other products together to make an intelligent workflow based automation solution.  It is based on software Microsoft acquired when it purchased Opalis back in 2009.

Orchestrator makes sense in larger environments or when a requirement for automation is present, such as in managed hosting. It will likely be of less use in smaller environments as the time taken to configure and automate tasks won’t have quite the same payback.

With Orchestrator it is possible to automate almost anything, from deploying VMs through to recovering from an error condition in a service. You can even find integration packs from various vendors which let you control and automate their products from Orchestrator.

 

Operations Manager

Operations Manager is Microsoft’s monitoring and alerting system, and in the latest version it does a lot more than peer into event logs and give you a huge list of errors. As with the rest of the System Center suite, Service Pack 1 introduces support for Linux, which enhances the appeal of Operations Manager a little, and the list of supported applications and devices seems to be constantly growing as well. No doubt pure Linux environments will be running Nagios or something similar, but for mixed or pure Microsoft environments Operations Manager is definitely one of the best out there.

 

Service Manager

Service Manager is the System Center component I have spent the least time looking at. It is hard to get excited about help desk solutions, especially when so many people spend so long logged into them. Possibly the best feature of Service Manager is the auditing and reporting. If correctly configured with Orchestrator, Service Manager can help you to identify why problems are occurring or when changes were made which could have contributed to an issue. Service Manager doesn’t feel like the kind of product people would buy on its own, but if you have already paid for the full System Center suite you would have to be silly not to at least try it, and as with all of the other solutions mentioned here, the longer you use them, the more you come to realise how powerful they are.

 

Unified installer

There is a unified installer which is great for quickly deploying the whole System Center suite, and you can read all about my experience with it here. I also urge you to click through some of the categories above for more System Center related posts.

System Centre Operations Manager Release Candidate Setup

Following on from the unified installer for the Microsoft private cloud, the System Center Operations Manager installation failed, so I decided to attempt the installation myself.

Once you have a suitable server set up with Windows 2008 R2, the first step is to install the .NET Framework 4, Report Viewer and all the required IIS role services (a scripted way of adding the roles is sketched after the list):

IIS6 Metabase Compatibility role service.
ASP.NET role service.
Windows Authentication role service.
Static Content role service.
Default Document role service.
Directory Browsing role service.
HTTP Errors role service.
HTTP Logging role service.
Request Monitor role service.
Request Filtering role service.
Static Content Compression role service.
IIS Management Console role service.
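
On Windows Server 2008 R2 all of these role services can be added in one go from an elevated PowerShell prompt. A sketch, assuming the standard Web Server (IIS) feature names map onto the list above:

Import-Module ServerManager
Add-WindowsFeature Web-Metabase, Web-Asp-Net, Web-Windows-Auth, Web-Static-Content, Web-Default-Doc, Web-Dir-Browsing, Web-Http-Errors, Web-Http-Logging, Web-Request-Monitor, Web-Filtering, Web-Stat-Compression, Web-Mgmt-Console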

Even with all the correct roles and pre-requisite software installed, the validator will still throw up a couple of problems.

 

You will need to load the IIS Manager and open the ISAPI and CGI Restrictions properties and change the deny to an allow for the ASP.NET v4.0.30319 line.

 

In addition to the above, if you installed the .NET 4 Framework before the roles you will get the error message “The ISAPI and CGI Restrictions are disabled or missing” when validating the pre-requisites, and will need to run the following command in a cmd window to resolve the problem.

%WINDIR%\Microsoft.NET\Framework64\v4.0.30319\aspnet_regiis.exe -r

If all is well, the command will run through and complete without any errors.

 

Now setup should allow you to proceed and select whether to add a management server to an existing management group or create the first management server in a new management group. I, like most people playing with the release candidate, chose the latter.

Next you are prompted to supply the SQL Server details. In my environment the failed unified installer attempt had left a default SQL instance installed, which turned out to be unsuitable: the installer complained I was missing the required “Full-Text Indexing” feature. I had used the SQL 2008 R2 Express media, which does not include this option (and, according to TechNet, is not on the list of supported SQL versions), so my next step was to remove the currently installed SQL instance and install a full-fat copy of SQL 2008 R2 Standard with the Full-Text Indexing and Reporting Services options. In a production setup you may well choose to create a new database on an existing server, but for the purposes of evaluation I chose to install a local instance of SQL, as it makes cleaning up this install later a lot easier: I can just destroy the VM.

With the advanced services options installed, the Operations Manager setup wizard will allow you to continue and adjust the configuration of the operational database (although I just left these at the defaults).

 

On clicking next you should see a similar screen for the data warehouse database. Again I accepted the defaults and continued on to choose the Reporting Services instance I installed earlier with SQL 2008 R2 Standard (don’t forget to run the Reporting Services Configuration Manager and start the SQL Server Agent service).

 

Next it is time to choose which IIS site to use for the web console.  In a production environment it would be prudent to configure a new site and setup SSL however I am sticking with the default website for now.

 

Next you are prompted to select an authentication mode for use with the web console.  I selected Mixed Authentication as it will be an entirely private deployment for evaluation/testing purposes only.

After selecting the authentication mode you are prompted to supply one or more domain accounts for the various roles to use. I created a single user for this purpose; however, it would be advisable to separate the data and management accounts so you can fine-tune the permissions they are granted.

 

After this step you can choose to opt-in (or not) to the various customer experience improvement programs.  I chose not to as this machine has no Internet access anyway.

Finally you are presented with a page full of the various configuration options set in the previous steps.  If all looks well click install, go grab a tea/coffee and you should be able to return to an installed SCOM 2012.

 

Next I will work on getting a few servers monitored and start evaluating what is available in SCOM 2012.

Evaluating the Microsoft Private Cloud with the System Centre 2012 Unified Installer

After reading a lot about Hyper-V and attending an IT Camp at Microsoft I really wanted to give the new SCCM 2012 a closer look, and at the same time get better acquainted with Hyper-V and other related upcoming Microsoft releases such as Data Protection Manager and Service Manager.

Getting all the installation files and pre-requisite software downloaded

Sign up and download all the installation files here: http://technet.microsoft.com/en-us/evalcenter/hh505660 (6.6GB). In addition you will need all of the pre-requisite software, which is listed here: http://technet.microsoft.com/en-us/library/hh751268.aspx. I went through a fair bit of trial and error to get the set-up utility to detect all of the installation files, and I recommend extracting each of the products into its own folder. Don’t put all of the products in one folder or share folders, because if the installer doesn’t recognize one of the paths/files you won’t be able to tell which one (plus it’s messy and you might end up overwriting files). I would suggest you do the same for all of the pre-requisite software. Also, don’t forget to extract all of the zip/exe/iso files, as the unified installer won’t read them otherwise; I found WinRAR invaluable for this.

 

Preparing the installation environment and servers

To get all the products installed you need at least 8 servers (physical or virtual) with a minimum of 2GB of RAM each. I set all mine up on a single Hyper-V host as it is only for testing purposes and I don’t have loads of servers spare for development/testing work. I would also suggest using a sensible naming convention or it can get pretty confusing quite quickly, and bear in mind that the server you choose to initially run the set-up on will become the Orchestrator server.

Aside from the base Windows 2008 R2 operating system, all the machines need to have a few things configured before they are ready for deployment. To avoid duplication I made sure all of the servers were in a single OU and created a policy to apply the customizations for me, rather than individually configuring the local policy on each host (a rough local-command equivalent is sketched after the list of settings).

Computer Config \ Administrative Templates \ System \ Credentials Delegation \ Allow Delegating Fresh Credentials
Set to = Enabled
Server = WSMAN/*

Computer Config \ Administrative Templates \ System \ Credentials Delegation \ Allow Delegating Fresh Credentials with NTLM only server Authentication
Set to = Enabled
Server = WSMAN/*

Computer Config \ Administrative Templates \ Windows Components \ Windows Remote Management (WinRM) \ WinRM Client \ Allow CredSSP authentication
Set to = Enabled

Computer Config \ Administrative Templates \ Windows Components \ Windows Remote Management (WinRM) \ WinRM Client \ Trusted Hosts
Set to = Enabled
TrustedHostList = *

Computer Config \ Administrative Templates \ Windows Components \ Windows Remote Management (WinRM) \ WinRM Service \ Allow Automatic Configuration of listeners
Set to = Enabled
IPv4 filter = *
IPv6 filter = *

Computer Config \ Administrative Templates \ Windows Components \ Windows Remote Management (WinRM) \ WinRM Service \ Allow CredSSP authentication
Set to = Enabled 

Computer Config \ Administrative Templates \ Network \ Network Connections \ Windows Firewall \ Standard Profile \ Windows Firewall: Protect all network connections
Set to = Disabled
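
If you are only standing up one or two lab machines and would rather not create a GPO, roughly equivalent settings can be applied locally from an elevated PowerShell prompt on each server. A sketch (the firewall policy above has no single command-line equivalent here, so adjust the firewall separately if needed):

# Enable the WinRM service and default listener
winrm quickconfig -q
# Allow CredSSP on the service (server) side and on the client side, delegating fresh credentials to any host
Enable-WSManCredSSP -Role Server -Force
Enable-WSManCredSSP -Role Client -DelegateComputer * -Force
# Trust all hosts on the WinRM client, matching the Trusted Hosts policy above
Set-Item WSMan:\localhost\Client\TrustedHosts -Value "*" -Force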

I then went round each server and ran a gpupdate to ensure they all applied the new policy before I attempted to run the unified set-up.

 

Running the unified installer

On running the unified set-up and selecting the products you are evaluating (I wanted to try all of them), you are prompted to provide a path to each of the installation files you downloaded earlier. (I bet you are glad you downloaded and extracted each of the products/pre-requisites into their own folders now.) If all is well it should be a simple exercise of browsing to and selecting each of the folder paths created earlier. Don’t be surprised if it doesn’t recognize one of the paths or files; just make sure you have the right product/version and that it is extracted. Even the ISO file for the Windows Automated Installation Kit needs to be extracted so it is just a normal folder full of files. (UNC or local path names are both OK.)

 

Once you have completed both this screen and the pre-requisites page that follows it you can select what account you wish to use as the installer account.  I created my own domain user for this but you can use any user account which has the required permissions.  Following this you can configure other options such as site name etc. and finally you are presented with an install button.  Sit back and watch the progress bars.