How to stop the CLR optimizing your variables during a debug session

Finally decided to see if there was something I could do to stop the CLR optimizing away all the variables when I want to remote debug an application. It turns out there is an answer.
Essentially, create an ini file named after your dll/exe (so myfile.ini alongside myfile.dll) and put the following in it;
[.NET Framework Debugging Control]
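The section header on its own does nothing; the two keys documented on MSDN under "Making an Image Easier to Debug" go beneath it. A sketch of the complete file (key names as I believe they are documented):

```ini
[.NET Framework Debugging Control]
; Generate JIT tracking info so the debugger can map IL to native code
GenerateTrackingInfo=1
; Stop the JIT optimizing, so locals and arguments stay inspectable
AllowOptimize=0
```

The file is read when the CLR loads the assembly, so it needs to be in place before the process starts.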

Remote debugging – value has been optimized

Surely one of the most annoying problems with remote debugging: after finally getting a connection, the correct pdb’s, the correct source and a breakpoint hit, you examine the values only to be faced with a big red cross saying the value has probably been optimized away and cannot be viewed. I’m not absolutely sure of the reason, but I guessed that the CLR had optimized it because the code had run before the debugger was attached. So my tip is to try and attach the debugger before the code is run. Obviously not the easiest thing to do every time, but if you can manage it the CLR will not optimize the values away. Hopefully someone out there will provide a better answer.
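If attaching early isn’t practical, another option is to mark the specific methods you care about so the JIT leaves them alone no matter when the debugger turns up. A sketch (the class and method are hypothetical, and note that MethodImplOptions.NoOptimization only appeared in .NET Framework 2.0; NoInlining has been there since 1.0):

```csharp
using System.Runtime.CompilerServices;

public class OrderProcessor   // hypothetical class, for illustration only
{
    // NoInlining keeps the method visible on the call stack;
    // NoOptimization (.NET 2.0+) stops the JIT optimizing locals away.
    [MethodImpl(MethodImplOptions.NoInlining | MethodImplOptions.NoOptimization)]
    public decimal CalculateTotal(decimal price, int quantity)
    {
        decimal total = price * quantity; // 'total' should now show a value, not a red cross
        return total;
    }
}
```

The obvious trade-off is that you have to recompile, whereas the ini-file trick works on the binary you already have.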

Debugging when classic ASP calls into .net

Breakpoints not firing when called from COM+?
A colleague was having trouble debugging some .net code that was called from classic ASP. After getting over the initial problems of making sure that the source code matched the binary files (always a good start), the breakpoint still wasn’t firing. The .net component had a COM wrapper hosted in COM+, nothing unusual there. However, they’d configured the application to launch a debugger as soon as it starts. The problem here is that Visual Studio only attaches to the native (x86) part of DLLHost. When you eventually get around to calling the .net code, it is blissfully unaware that it should be debugging. The solution I chose was to stop all this auto-launching and simply wait until some .net code had been called. Then attach to the DLLHost process, ensuring ‘Managed’ is shown against it in the processes dialog, and you’ll be hitting breakpoints to your heart’s content.
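If you can modify the component, an alternative to attaching manually is to have the .net code summon the debugger itself: System.Diagnostics.Debugger.Launch() prompts you to pick a debugger the first time the managed code actually runs inside DLLHost. A sketch (the COM-visible class here is hypothetical, and remember to remove the call before release):

```csharp
using System.Diagnostics;

public class LegacyBridge   // hypothetical COM-visible class, for illustration only
{
    public string DoWork(string input)
    {
        // If no debugger is attached yet, pop up the debugger-selection prompt
        // so a *managed* debugger can be chosen before the interesting code runs.
        if (!Debugger.IsAttached)
        {
            Debugger.Launch();
        }
        return input.ToUpper();
    }
}
```

Because the call happens inside the managed code itself, the debugger you pick attaches in managed mode, neatly sidestepping the x86-only attach problem described above.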

Remote Debugging using Visual Studio 2003

I got into one of those situations where I needed to debug a bit of software on a server but couldn’t install Visual Studio. So for the first time I stepped into the world of remote debugging. It was actually fairly painless, although I did cheat and went the not-recommended route of adding, ahem, myself to the local administrators group.
The basic steps are…
  1. Install the remote components (not full-blown Visual Studio), found in the install root of VS. A bit annoying that you have to install anything, but the client was OK with this
  2. Add your interactive user to the remote server’s "Debugger Users" group (and possibly the administrators group, or discover the exact permissions necessary to attach to a running process)
  3. On the remote server run: msvcmon -tcpip -anyuser. Note that you can specify the correct user, but if you’re not too concerned about security then this is a simple short-cut. Also read what it says: it will stop listening after a period of time (15 minutes by default)
  4. In Visual Studio open the Processes dialog (Debug → Processes) and type in the name of the server
  5. Attach to the process and off you go

Apparently you can start the remote process off from Visual Studio too, but for some reason it wouldn’t let me specify the location. Since I was using Remote Desktop to control the program on the server it wasn’t a problem.