The last day of this year’s TechEd 2010 was a busy one! In the coming weeks, you will see a couple of blog posts on some of the best sessions I attended. Unfortunately, there was simply too much content to quickly put together posts on all the great things from the show floor.
Pushing the Limits of Windows
To kick off the day, we attended a presentation called “Pushing the Limits of Windows”. The auditorium was packed for this one. Luckily, Steven Bink from www.bink.nu and I were able to snag seats in the first row.
One of the most iconic figures in the Windows scene, Mark Russinovich, shared his insights on the operating system and showed off some of his optimization and troubleshooting techniques.
So, why is Mark Russinovich pushing the limits of Windows? Because IT decision-makers and professionals need to know how far they can take the operating system and how they can adjust Windows if they have any performance issues. Mark also talked about what to do when Windows runs out of resources and how to prevent this from occurring.
Determine and troubleshoot “handle leaks”
Anybody who’s ever looked under Windows’ “hood” or read Microsoft’s technical documentation has heard the term “handle”. However, most Windows users have never encountered a handle before—and likely never will. The problem is that, in some instances when Windows runs slowly, it is due to a problem with handles. So, what exactly is a handle?
WinObj shows resources of your Windows operating system.
In very simple terms, Windows defines objects, such as “Desktop”, “Directory”, or “File”, to represent operating system resources (see above). Every time one of these resources needs to be accessed by a program, a corresponding handle for that object is created.
Whenever a process (this could be a program you run, for example) wants to interact with one of Windows’ resources, it opens a handle. The more resources a program needs, the more handles are used. To test out how many handles a process can use, Mark demonstrated a tool called testlimit to see how many handles Windows is able to … pardon the pun … handle!
Let’s try it out. Fire up testlimit, which you can find on the Windows Internals book Web site. Open up a command prompt, and launch “testlimit32” or “testlimit64”, both of which you can find in the extracted testlimit folder.
As seen above, Windows 7 is able to support a total of 16,711,662 handles. That’s a lot! It is highly unlikely that you will ever reach this limit with any application you run. However, when Windows runs slowly, it may be because a specific program was not developed correctly and is causing a “handle leak”—meaning Windows’ resources go to waste because thousands upon thousands of handles are tied up and never released.
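Conceptually, a handle leak shows up as a handle count that climbs steadily and never comes back down. As a rough illustration only (this is not Mark’s tooling; the function and its threshold are hypothetical), here is how such a check could look in Python:

```python
def looks_like_handle_leak(samples, min_growth=500):
    """Heuristic: flag a leak when the handle count never drops
    between samples and the overall growth is substantial.

    samples: handle counts for one process, taken at regular intervals.
    """
    if len(samples) < 2:
        return False
    never_drops = all(b >= a for a, b in zip(samples, samples[1:]))
    total_growth = samples[-1] - samples[0]
    return never_drops and total_growth >= min_growth

# A healthy process fluctuates; a leaky one only climbs.
print(looks_like_handle_leak([4000, 4100, 3900, 4050]))  # False
print(looks_like_handle_leak([4000, 4600, 5300, 6100]))  # True
```

The point of the heuristic: normal programs open and close handles all the time, so their counts go up *and* down; a leak is a one-way street.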
Our advice: If you suspect that your system is running slowly, and you’ve already performed most of the usual optimization steps, check whether one of your programs is either causing a handle leak or using a huge number of handles by default. How can you do that? It’s a piece of cake! Just open up Windows Task Manager by right-clicking on your taskbar and selecting “Start Task Manager”, or by pressing CTRL+SHIFT+ESC. Go to the “Processes” tab, and click on “View/Select Columns”. Check “Handles”, and hit “OK”. A new column will appear in Task Manager showing exactly how many handles are in use and by which programs.
Sort the list by the process which is using the most handles. To do so, click on “Handles”.
In the case above, Outlook 2010 uses about 4,000 handles. That is quite a lot but nothing to worry about; however, once a program reaches approximately 10,000 handles, you should take a closer look. Is the program old, or does it seem to be full of bugs? You may want to look for a newer version. Is the program something you really need on a day-to-day basis? If not, uninstall or disable it, or try to find an alternative program, perhaps one from another company.
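If you want to automate the “sort and eyeball” step, a tiny helper along these lines could do it. Everything here is illustrative: the function name and the snapshot are made up, and the 10,000-handle threshold is the rule of thumb from above.

```python
def suspicious_processes(snapshot, threshold=10_000):
    """Return (name, handle_count) pairs at or above the threshold,
    heaviest handle users first."""
    heavy = [entry for entry in snapshot if entry[1] >= threshold]
    return sorted(heavy, key=lambda entry: entry[1], reverse=True)

# Hypothetical Task Manager snapshot: (process name, handle count).
snapshot = [("OUTLOOK.EXE", 4000), ("leaky_app.exe", 15200),
            ("svchost.exe", 11000), ("notepad.exe", 60)]
print(suspicious_processes(snapshot))
# [('leaky_app.exe', 15200), ('svchost.exe', 11000)]
```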
There’s one problem here—you don’t really want (or have) to keep an eye on Task Manager 24/7. In a future blog post, we will explain in detail how to set up Performance Monitor to warn you as soon as a process reaches a certain handle limit, so you can take the appropriate action quickly.
What’s the right size for the page file?
It is one of the oldest questions and has been asked since Windows’ early days—how large should the page file be? The question is not without substance. Depending on what you do and how much memory you have built into your computer, choosing the right size is very important.
As a quick reminder of what a page file is: When your PC runs low on memory because you’re running a video editing suite, a game, and 37 Web browser tabs at the same time, for example, Windows moves data (or “pages”) out of RAM and into a file called “Pagefile.sys” on your hard disk. That way, your programs continue to run even though you’ve exhausted your physical memory.
Determining how large this file should be is quite a myth in the Windows community. According to Mark, the definitive answer is 1 GByte—and the crowd cheered and laughed. “Just kidding!” he then added, with a smirk on his face. Mark went on to explain how many magazines and Web sites have shared ridiculous advice about how large the page file needs to be; for example, one magazine recommended setting the page file to 300 MBytes if you’re doing basic tasks or 600 MBytes if you’re running more intense applications. That’s just ridiculous!
However, recent Windows versions come with a formula that adapts the virtual memory based on how much physical memory you have installed.
Using this formula, Windows 7 on a 4-GByte machine set the virtual memory to 4 GBytes and later increased it to 6 GBytes. The problem is that a simple formula like this just cannot be applied to every single user out there. It’s not one size fits all.
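Judging purely from those numbers (an initial size of 4 GBytes growing to 6 GBytes on a 4-GByte machine), the automatic heuristic behaves roughly like “initial = RAM, maximum = 1.5 × RAM”. That is my inference from the example, not the documented Windows formula, but it illustrates why one fixed rule cannot fit every workload:

```python
def default_pagefile_gb(ram_gb):
    """Rough sketch of the automatic sizing heuristic, inferred from
    the 4 GB -> 6 GB example; not the documented Windows formula."""
    initial = ram_gb          # starts at the amount of physical RAM
    maximum = ram_gb * 1.5    # can grow to roughly 1.5x RAM
    return initial, maximum

print(default_pagefile_gb(4))  # (4, 6.0)
```

Whether 1.5× your RAM is too much or far too little depends entirely on what you actually run, which is exactly Mark’s point.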
Mark continued on to share a more proper answer to this long-standing question—use Process Explorer (PE)! I will guide you through the necessary steps. Download the file from Microsoft TechNet, extract it, and double-click on “Procexp.exe”. Once PE is running, double-click on the little graph as pictured below.
This will show the many resource levels of your machine! The part we’re interested in right now is the “Commit Charge”—this is what Mark describes as the total amount of physical and virtual memory in use.
Continuing with the same case, the system has 4 GBytes of RAM, with a page file of about 4 GBytes. During the current Windows session (while PE ran), it peaked at about 3.8 GBytes of memory usage. However, as soon as I began my usual use of Windows, it peaked to about 6.2 GBytes of total memory.
So, a page file of 4 GBytes is barely enough for that. Imagine a program of yours crashes and a so-called “dump report” (with the crash information) is then created. This would just cause a whole lot of trouble as these reports can be quite large.
If your commit charge maxes out its limit, you have to increase the size of pagefile.sys. To do this, right-click on “Computer”, go to “Properties”, and click on “Advanced system settings”. Open up the “Advanced” tab, and go to “Settings” (under the “Performance” category).
Click again on “Advanced”, and hit “Change”. Uncheck “Automatically manage paging file size for all drives”.
Now, enter the value shown under “Currently allocated” as the “Initial size (MB)”. The maximum size should be determined by the “Commit Charge” peak you found earlier: take the “Peak” value and add at least a couple of GBytes, so you have a buffer just in case.
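Since the dialog expects values in MB, the arithmetic from the steps above can be sketched like this (the function is hypothetical, and the 2,048-MB buffer simply stands in for the “couple of GBytes”):

```python
def pagefile_sizes_mb(currently_allocated_mb, commit_peak_mb, buffer_mb=2048):
    """Initial size = what Windows currently allocates;
    maximum size = observed commit charge peak + a safety buffer."""
    initial = currently_allocated_mb
    maximum = commit_peak_mb + buffer_mb
    return initial, maximum

# Example from the session: ~6.2 GBytes (6,200 MB) commit peak.
print(pagefile_sizes_mb(4096, 6200))  # (4096, 8248)
```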
The Case of the Unexplained: Troubleshooting
In his second session of the day, Mark held a session called “The Case of the Unexplained: Troubleshooting” and described what troubleshooting is all about—with particular emphasis on the performance aspect—to yet another fully-packed auditorium.
“The problem is,” Mark explained, “that most applications do a poor job of reporting errors and performance issues.” Another thing that gets on his nerves is when PC and software support divisions reply to consumers’ queries with “restart the program/Windows, and see if that works.” But, you never know if the performance problem is going to occur again and what caused it in the first place. This is what we’re here for!
The first thing Mark recommended—use PE when your PC is running slowly. To troubleshoot sluggish performance, he suggested replacing Task Manager with its more advanced counterpart by clicking on “Options/Replace Task Manager”. Mark then went through the list of processes active on a machine to see what might be the primary cause of the slow system.
With PE, you can dig in: double-click on a process to see everything about it that might cause high RAM and CPU usage. Find out whether it’s a verified Windows process, and see how many resources it takes up. Mark recommends keeping PE running minimized to the notification area and regularly hovering over its tray icon to see which process is currently consuming the most CPU.
The higher the process usage, the lower your PC’s performance. You will notice stuttering audio, Web sites will feel sluggish, playing video will be choppy, and just about everything will feel like it’s running as slow as molasses. So, keep an eye on the PE icon to see what is slowing down Windows! To learn more about any process, such as “Connectify.exe”, go to PE, and double-click on the icon. On the first tab, “Image”, you will immediately see what it is and where it is located.
You will also find out if you really need it or if there’s an update available. What Mark recommended—”do a Bing (or some other type of) search [laughs], and find out what is going on with this process!”
That concludes our coverage of TechEd 2010—again, I have learned about so many interesting things that it would be impossible to put it all into a single post. In the coming weeks, I will try to flesh out all of my notes and update TuneUp Blog readers about this amazing show.