How to Use Visual Studio Performance Analyzers against Unit Tests

Having a good set of automated tests in your project has lots of well-publicized advantages. However, if you want to use those tests to help analyze a performance issue you hit a simple problem: how do you run a test under the Performance Analyzer?

The trick is that a Visual Studio test project is itself a console application, but a sort of headless one. This isn’t helpful, as Performance Analyzer wants you to set a start-up project, and without being able to write some code to invoke your tests you’re stuck. Normally a console application has a Program.cs file, but test projects generate this entry point behind the scenes at build time. This means that if you simply drop a new Program.cs file into your test project it won’t build, because you’ll get the error ‘Program has more than one entry point defined’. To allow your own Program.cs to be used you need to edit the project file and add the following line after the TargetFramework property:

<GenerateProgramFile>false</GenerateProgramFile>
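For context, here is a minimal sketch of what the test project file might look like with that line in place (the SDK and target framework shown are assumptions — keep whatever your project already uses):

```xml
<Project Sdk="Microsoft.NET.Sdk">
  <PropertyGroup>
    <TargetFramework>net8.0</TargetFramework>
    <!-- Stop the test SDK generating its own entry point,
         so our hand-written Program.cs can be the one and only Main. -->
    <GenerateProgramFile>false</GenerateProgramFile>
  </PropertyGroup>
</Project>
```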

Now your test project has an easy-to-use Console/Main/Program.cs from which you can write code to invoke your tests. All you have to do now is point Performance Analyzer at it and press Start.
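As a sketch, the Program.cs can simply construct the test class and call the test method you want to profile directly (the class and method names here are hypothetical stand-ins for your real test; calling it directly also skips any test-framework setup your test may rely on):

```csharp
// SampleTests stands in for your real test class.
public class SampleTests
{
    // Hypothetical test method - substitute the test you want to profile.
    public void HotPath_Scenario()
    {
        // Burn some CPU so there is something worth profiling.
        System.Threading.Thread.SpinWait(10_000_000);
    }
}

// Only builds once GenerateProgramFile is set to false in the project file.
public static class Program
{
    public static void Main()
    {
        // Invoke the test directly so Performance Analyzer profiles exactly this path.
        new SampleTests().HotPath_Scenario();
    }
}
```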

How to stop the CLR optimizing your variables during a debug session

I finally decided to see if there was something I could do to stop the CLR optimizing away all the variables when I want to remote debug an application. This post seems to have the answer: http://blogs.msdn.com/tom/archive/2008/05/09/getting-more-information-from-clrstack.aspx
Essentially, create a myfile.ini for your dll/exe and put the following in it:
[.NET Framework Debugging Control]
GenerateTrackingInfo=1
AllowOptimize=0
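The ini file must share the binary’s base name and sit in the same directory as it, so the CLR picks it up when the assembly loads. As a sketch, shown here as a portable shell snippet (MyApp.exe is a hypothetical name — use your own dll/exe name):

```shell
# Create MyApp.ini alongside the (hypothetical) MyApp.exe.
cat > MyApp.ini <<'EOF'
[.NET Framework Debugging Control]
GenerateTrackingInfo=1
AllowOptimize=0
EOF
```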

Remote debugging – value has been optimized

Surely one of the most annoying problems with remote debugging is, after finally getting a connection, the correct pdbs, the correct source and a breakpoint hit, you examine the values only to be faced with a big red cross saying the value has probably been optimized away and cannot be viewed. I’m not absolutely sure of the reason, but I guessed that the CLR had optimized the code because it had run before the debugger was attached. So my tip is to try to attach the debugger before the code is run. Obviously not the easiest thing to do every time, but if you can manage it then the CLR will not optimize the values away. Hopefully someone out there will provide a better answer.

Debugging when classic ASP calls into .net

Breakpoints not firing when called from COM+?
A colleague was having trouble debugging some .net code that was called from classic ASP. After getting over the initial problem of making sure that the source code matched the binary files (always a good start), the breakpoint still wasn’t firing. The problem was that the .net component had a COM wrapper hosted in COM+, nothing unusual there. However, they’d configured the application to launch a debugger as soon as the application starts. The problem here is that Visual Studio only attaches to the native x86 part of DllHost, so when you eventually get around to calling the .net code it is blissfully unaware that it should be debugged. The solution I chose was to stop all this auto-launching and simply wait until some .net code had been called. Then attach to the DllHost process, ensuring ‘Managed’ is listed against the process, and you’ll be hitting breakpoints to your heart’s content.

Remote Debugging using Visual Studio 2003

I got into one of those situations where I needed to debug a bit of software on a server but couldn’t install Visual Studio. So for the first time I stepped into the world of remote debugging. It was actually fairly painless, although I did cheat and went the not-recommended route of adding, ahem, myself to the local administrators group.
The basic steps are…
  1. Install the remote components (not full-blown Visual Studio), found in the install root of VS. A bit annoying that you have to install anything, but the client was OK with this
  2. Add your interactive user to the remote server’s "Debugger Users" group (and possibly the Administrators group, or discover the permissions necessary to access a running process)
  3. On the remote server run: msvcmon -tcpip -anyuser. Note that you can specify the permitted user instead, but if you’re not too concerned about security then -anyuser is a simple short-cut. Also read what it prints: it will stop listening after a period of time (15 minutes by default)
  4. In Visual Studio open the Processes dialog and type in the name of the server
  5. Attach to the process and off you go

Apparently you can start the remote process from Visual Studio too, but for some reason it wouldn’t let me specify the location. Since I was using Remote Desktop to control the program on the server, this wasn’t a problem.