Windows Server AppX Installation Failures

We use Remote Desktop Services to provide our clients with SaaS access to our Windows desktop application.  An issue on one of our cloud servers running Windows Server 2019 had been plaguing me for a while.  Server performance was sluggish, and I noticed in Task Manager that wsappx was using high CPU.  That process hosts the AppX Deployment Service, which is responsible for installing UWP/Windows Store apps for each user.  Even though we don’t have the Windows Store installed on the server, certain system AppX packages are still installed for every user.  These live in the C:\Windows\SystemApps folder, and each user’s configuration for these apps is stored in that user’s %localappdata%\Packages folder.  I noticed that the Packages folder was empty for users added after a certain date.  Another symptom was that the Start menu did not function for these users.  The package installation was failing, and Windows was continually retrying it for these users.  Hence, the high CPU usage.  Running Get-AppxLog in PowerShell revealed the following error:

Error 0x800705AA: While processing the request, the system failed to register the windows.stateExtension extension due to the following error: Insufficient system resources exist to complete the requested service.
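For anyone who wants to pull the same log, Get-AppxLog is part of the built-in Appx PowerShell module.  Something along these lines surfaces the failing deployments (the filter text is just an example):

    # List recent AppX deployment log entries and keep only the ones mentioning an error.
    Get-AppxLog | Where-Object { $_.Message -match 'error|0x800705AA' } | Select-Object Time, Message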

Uh… okay.  It would be REALLY handy to know what that “insufficient resource” is! The most insufficient resource is this error message!  My ISP was stumped as well.  So, I started trying the usual things: DISM and SFC repairs, chkdsk, disabling the firewall, etc.  I eventually copied an image of the server to my local machine and ran it in Hyper-V so I could try more invasive things without affecting users on the live server.  I started uninstalling programs and even ran an in-place upgrade to Server 2022.  Nothing helped.  By now, many would have spun up a new server and called it a day.  In this case, that would disrupt the work of 200+ users, not to mention cause a bunch of support calls while those users tried to migrate to the new server.
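For completeness, the “usual things” were roughly the standard repair passes below, all run from an elevated prompt.  None of them made a difference here.

    # Standard component-store and system-file repairs.
    DISM /Online /Cleanup-Image /RestoreHealth
    sfc /scannow
    chkdsk C: /scan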

One of the behaviors was that a folder for each app would be created in the Packages folder during installation and then quickly deleted.  I needed to know whether this was a symptom of the failure or the cause of it (maybe something else was deleting the folders during installation).  So, I loaded up Process Monitor (SysInternals) to do some forensics.  I filtered for events on the Packages folder and tried to install one of the AppX packages from PowerShell.  I could see that wsappx was deleting the folder (not some other process), meaning the deletion was a symptom of the failure, not the cause.  Then, I changed the filter to match on the wsappx command line so it would capture all of that service’s events, including every file and registry access.  That was thousands of entries, so I found where the package folder was being deleted and started looking above it for clues to what caused the failure.  Lo and behold!  One nearby entry had the result INSUFFICIENT RESOURCES.  It occurred when attempting to add a value to the registry key HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Windows NT\CurrentVersion\Notifications.  I found that key in RegEdit and it took a while to open… a good sign I was on the right track.  I then tried to manually add a value to that key and got an error.  Bingo!  Basically, that registry key was full and could not accept any more values.  That led me to the following article:
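For anyone retracing these steps, this is roughly the kind of command I mean by installing an AppX package from PowerShell while Process Monitor is capturing.  The ShellExperienceHost package (the Start menu host) is just an illustrative choice; substitute whichever system app is failing for you.

    # Re-register a built-in system app for the current user while Process Monitor captures events.
    Get-AppxPackage *ShellExperienceHost* | ForEach-Object {
        Add-AppxPackage -Register "$($_.InstallLocation)\AppxManifest.xml" -DisableDevelopmentMode
    }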

Registry bloat causes slow logons or insufficient system resources error 0x800705AA in Windows 8.1 – Microsoft Support

That article points to a hotfix and a utility to clean up the registry bloat causing the problem.  Since it was designed for Windows 8.1, I did not feel comfortable installing it on my Windows Server 2019.  Fortunately, I found another utility that addresses the issue:

https://github.com/Lazy-256/clnotifications

I downloaded and unzipped clnotifications.zip, then ran the fix.  It reduced the Notifications registry key from 256K entries to 500.  AppX package installations are now succeeding, and wsappx is no longer hogging the CPU.
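A quick way to sanity-check the cleanup is to count the values under the key that was refusing new entries; before the fix that number was enormous, and afterward it should be back to something reasonable.  This is just a verification one-liner, not part of the clnotifications tool itself.

    # Count the values under the Notifications key that was full.
    (Get-Item 'HKLM:\SOFTWARE\Microsoft\Windows NT\CurrentVersion\Notifications').ValueCount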

Maybe this article will help someone else who encounters the same error.  If this isn’t your exact problem, RegistrySizeLimit or excessive firewall entries could also be the culprit.  And if you get stuck, Process Monitor can help you find where the issue is.

Head in the Cloud

With Microsoft’s unveiling of Windows Azure at the PDC last week, I’ve been trying to get my head around exactly what Azure is, or more specifically, why this “cloud computing” thing is so important to Microsoft. What is cloud computing? Heck, what is the “cloud”? According to Wikipedia:

The cloud is a metaphor for the Internet (based on how it is depicted in computer network diagrams) and is an abstraction for the complex infrastructure it conceals.

Ok, so the cloud is the Internet. Why not just call it the Internet? Evidently, it’s a more abstract, architectural way of looking at the Internet that’s not concerned with all the hardware and network protocols under the hood. You connect a router to the “Internet”. You put a service in the “cloud”. Indeed, when talking about cloud computing, people often refer to delivering “software as a service” or “software + services”. The idea is broader than that and has actually been around for a while, so what’s the big deal? “Cloud computing” is a buzz-word, to be sure: a repackaging of existing concepts sold as something new. In this case, though, the packaging could provide some benefits.

So, what is Windows Azure then? First of all, let’s state the obvious: this is web hosting from Microsoft. When it comes to hosting, you have a couple of options: inexpensive shared hosting, which is fine for static web sites and sites that need little processing power, and dedicated hosting, where you get your own machine(s) for mission-critical stuff. Azure Services is kind of a hybrid approach, where you pay for dedicated resources in Microsoft’s data centers, but it’s all virtualized and you don’t know about the hardware underneath. Windows Azure is the base platform for managing all of that, an operating system of sorts with APIs exposed to developers.

My first thought was that this would be something Microsoft could sell to ISPs, but once I realized the scale of what they are trying to do, you really do want a Microsoft, a Google, or an IBM behind it. Scalability and reliability are the name of the game. When you sign up, you get an instance in the data center, which is presumably a virtual machine running Windows Server, but I don’t know the exact details and there’s probably more to it than that. You can have as many instances as you want (or can afford) on demand, and you pay for exactly what you use and for how long you use it. Microsoft demonstrated scaling instances up or down by simply changing a setting in a config file.

How could that be useful? Imagine you’re offering a new web application and you need to easily scale up as you grow. Or maybe you’re an online retailer and you need to quadruple your web capacity during the holiday season. To get more granular, maybe you need 10 servers running your web site during peak hours but only 2 in the middle of the night. That’s the pitch, anyway.  Whether or not anyone actually needs that much flexibility remains to be seen, but it could be enough to convince some people to buy in.

So, how much is this gonna cost? Good question. Microsoft hasn’t released pricing yet, but they are going to have to be competitive with Amazon’s “Elastic Compute Cloud”. I looked at Amazon’s pricing, and since everything is à la carte down to the hour and the gigabyte, I still don’t know. As far as I can tell, a single instance running 24/7 costs about $150-$200/month. That’s comparable to dedicated hosting, but once you factor in the managed features EC2 includes (backups, etc.) that you’d have to add to a dedicated plan, EC2 may actually come out cheaper (I’m sure people will be analyzing and debating that for some time). Of course, Microsoft has services such as .NET Services and SQL Services that sit on top of Windows Azure, as well as full-blown applications like Office and CRM, so it’s really going to depend on what you need. I hope Microsoft makes it easier than Amazon to determine your costs.

With this highly scalable and supposedly reliable system, will businesses move their applications to the cloud? I think you’ll see some of that, but the basic rules for determining if an application is a good candidate for the web haven’t changed. If a company relies on their line-of-business software for daily operations, I doubt they’ll risk being shut down because their DSL or T1 line goes down. However, maybe the cloud is the ideal place for their CRM system. Microsoft realizes that businesses are not going to put everything in the cloud, so part of .NET Services is to help companies bridge the gap between their on-premises and cloud systems. For example, logging in to your local network can also log you in to the CRM app in the cloud.

Now, for the million dollar question: Can you run FoxPro applications on Azure? Maybe, but not yet. Azure will initially be for managed .NET code only, but Microsoft claims they will open it up to unmanaged code in 2009. We’ll have to wait and see if that includes VFP.

Resources: