Odd behaviour with PowerShell Get-ChildItem and -Exclude

Discovered some strange behaviour with the file objects in PowerShell. If you pipe the results of Get-ChildItem through foreach and ask for the default $_ you'll get the file name. However, if you add an -Exclude to the gci then $_ becomes the full path!? Very odd.

PC Woes

Just as I had been smugly defending Windows as a stable OS, the inevitable happened. I started the machine and got the dreaded "cannot find system" registry corrupt error. After about 30 mins of trying to recover it I finally gave up and started the re-install of XP. "OK, I'll lose some programs and drivers but I guess that won't take too long to reinstall." No, for some reason XP setup was convinced the disk wasn't formatted, so I had to lose everything off that disk…grrr. I could have copied files off using recovery mode but my mouse finger was quicker than my brain. Anyway, the system is up and running again and thanks to this blog I managed to get all the various AMD dual-core drivers back and install the Adobe upgrade suite.
 
At the same time I had just purchased 4x1GB sticks of performance RAM; luckily I hadn't put them in, otherwise I'd be blaming them for the system crash. So I thought this would be the ideal time to install them, after all I could only lose nothing now! "Easy job", I foolishly thought. Opening up my case I suddenly realised I didn't know if I was supposed to install matched RAM in the same bank (next to each other) or in the same colour (every other slot). After a bit of hunting around I'm pretty certain it's the same colour. So I pressed on, only to discover that the nice quick-release memory clips couldn't release because they couldn't get past my graphics card! Honestly, these motherboard manufacturers, do they ever try their own kit out? Luckily, by removing the quick-release retention clip on the graphics card there was enough give in the card for me to open the memory clips and get the sticks out (and put the new ones in). "OK, let's see if that worked…booting…BIOS screen…4GB (good, was 2GB)…Windows XP…System…3GB". What, where is my extra 1GB? Then it hit me like a bus: XP is 32-bit, it's not going to use 4GB, is it. Grr, I'm so used to Windows Server I'd forgotten I run XP at home :(. Not even the /PAE switch could persuade it. Oh well, maybe I'll get around to putting Vista 64 on it soon.

Optimizing Virtual PC

I rebooted my machine at work and had a quick browse through the BIOS settings looking for ways to optimize the disk. Whilst changing those settings I spotted another setting called Hardware Virtualization. It turns out that most modern processors from AMD and Intel support a special set of instructions that allow virtual machine software to bypass parts of the emulation layer and go straight to the processor. However, on Intel processors these are typically off by default. So if you use an application like Virtual PC, turn this on in both your BIOS and the VPC settings and enjoy a small, but real, performance boost.

Constellation of Features

Back in June I attended @Media2007 (atMedia2007), where I was impressed by a talk given by Jesse James Garrett. He used the interesting phrase "a constellation of features". What I found doubly interesting about the phrase is the number of interpretations it has depending on who I talk to…
1. Like a star field, there are so many stars that you have trouble singling out anything interesting
2. Like the 'Big Dipper', 'the Archer', etc., it is supposed to show or signify something to one person, but for others it is impossible (or very hard) to recognise
3. A group of related features

In the talk the phrase was used in the context of applications like Word, where there are so many things to choose from that the user is simply bewildered and will find it difficult to focus on the task they wish to complete, i.e. (1). But I rather like the other definitions too, especially (3), which conveys the opposite message.

JavaScript: scope fun with the for loop

I thought I'd share a bit of JavaScript, or rather a little gotcha; consider the following:

function A()
{
  for (index = 0; index < 10; index++)
  {
       alert(index);
       B();
       alert(index);
  }
  alert('finish');
}

function B()
{
  for (index = 0; index < 10; index++)
  {
       // do nothing
  }
}


I ran this on Firefox/Safari and got…

0,9, finish
I was surprised on two counts: 1. Why didn't the for loop work? 2. Why the shared scope? Anyway, assuming I had accidentally created a global, I corrected it and it worked as expected…

for (var index = 0; index < 10; index++)
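
For completeness, here's the corrected version in full; with var in place each function gets its own index, so the loop in A() now runs 0 through 9 as expected:

function A()
{
  for (var index = 0; index < 10; index++)
  {
       alert(index);
       B();
       alert(index);   // no longer trampled by B()
  }
  alert('finish');
}

function B()
{
  for (var index = 0; index < 10; index++)
  {
       // do nothing - index is now local to B()
  }
}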

XML and XPath on the client via JavaScript

This weekend, "I've been mostly…", looking at how to process XML in a client browser using JavaScript. Normally I wouldn't entertain the thought, but with the recent push of AJAX and various cross-browser script libraries I thought I'd take another look. NB. This doesn't consider JSON; I'll post on that option later.

XmlDocument

The core requirement for most XML processing (ignoring SAX-style processing) is the document. AJAX has helped here and it seems that the majority of browsers implement some form of DOM, but they do suffer from inconsistencies.

HttpRequest

Probably the most reliable object is HttpRequest (XMLHttpRequest), which is a necessity for AJAX. There is a cross-browser issue but it's fairly simple to solve…

var httpRequest;
if (window.XMLHttpRequest && !window.ActiveXObject)
    httpRequest = new XMLHttpRequest();

if (window.ActiveXObject)
    httpRequest = new ActiveXObject("Msxml2.XMLHTTP");

After that, the objects seem to share the same API. There do seem to be some drawbacks with this approach though: you need a web site (or service) to get the XML from, and there is a lack of XPath support.
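
Just to show the shared API in action, here's a minimal sketch of fetching an XML document with the object created above (the URL data.xml is only a placeholder for the example):

// Assumes httpRequest was created by the cross-browser test above.
httpRequest.open("GET", "data.xml", true);   // "data.xml" is a made-up URL
httpRequest.onreadystatechange = function ()
{
    if (httpRequest.readyState == 4 && httpRequest.status == 200)
    {
        var xmlDoc = httpRequest.responseXML;     // parsed XML document
        alert(xmlDoc.documentElement.nodeName);   // e.g. show the root element
    }
};
httpRequest.send(null);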

XmlDocument revisited

"But
I don’t want to fetch my XML from a site, I created it on the client",
I hear you cry. Well most browsers support the following…
if (document.implementation && document.implementation.createDocument)…
    xmlDoc = document.implementation.createDocument("", "", null);

if (window.ActiveXObject)
    xmlDoc = new ActiveXObject("Microsoft.XMLDOM");

This is similar in nature to the previous cross-browser test. These documents have good DOM support and you can create/append elements as you'd expect. The load method seemed a bit flaky. Now, this may be my code so I'll hold off saying it doesn't work…but it didn't seem to! Creating the DOM manually seemed fine though, so it is still useful, especially for sending data back. However, the biggest issue seems to be XPath.
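
As a rough sketch of what I mean by building the DOM manually (the element names here are invented for the example, and the serialization step is yet another cross-browser difference):

// Create a document with a root element (names are made up for illustration).
var xmlDoc;
if (document.implementation && document.implementation.createDocument)
    xmlDoc = document.implementation.createDocument("", "accounts", null);
else if (window.ActiveXObject)
{
    xmlDoc = new ActiveXObject("Microsoft.XMLDOM");
    xmlDoc.appendChild(xmlDoc.createElement("accounts"));
}

// Append a child element and an attribute as you would with any DOM.
var account = xmlDoc.createElement("account");
account.setAttribute("id", "123");
xmlDoc.documentElement.appendChild(account);

// Serializing for sending back: Mozilla/Safari use XMLSerializer, MSXML exposes .xml
var xmlText = window.XMLSerializer
    ? new XMLSerializer().serializeToString(xmlDoc)
    : xmlDoc.xml;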

XPath

XPath is a very useful API for searching XML but its support in browsers is very patchy. Mozilla-based browsers offer a good implementation…

var paragraphCount = document.evaluate('count(//p)', xmlDocument, null, XPathResult.ANY_TYPE, null);

and it includes a NodeIterator for selectNodes-style enumeration. Microsoft has the more usual (I'm an MS developer) SelectNodes, so again it is fairly easy to abstract out the difference in API (see the sketch below). However…Safari. The Konqueror-based browser simply doesn't like XPath. So what to do? I considered writing my own query engine but I don't really want to waste my time re-inventing the wheel. So I had a look for a script library, but it seems that XPath libraries are still in their infancy. Of the ones out there XML for SCRIPT looked promising but still not 100%. At this point I realised that for my specific needs I could get away with a bit of DOM walking, so I left it there.
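
For what it's worth, the abstraction I had in mind is only a few lines; here's a minimal sketch (the helper name and result handling are my own invention, and it assumes the Mozilla and MSXML behaviour described above):

// Hypothetical helper: return an array of nodes matching an XPath expression.
function selectNodesXPath(doc, xpath)
{
    var results = [];
    if (doc.evaluate)                          // Mozilla-style DOM Level 3 XPath
    {
        var iterator = doc.evaluate(xpath, doc, null,
                                    XPathResult.ORDERED_NODE_ITERATOR_TYPE, null);
        var node = iterator.iterateNext();
        while (node)
        {
            results.push(node);
            node = iterator.iterateNext();
        }
    }
    else if (doc.selectNodes)                  // MSXML SelectNodes
    {
        var nodes = doc.selectNodes(xpath);
        for (var i = 0; i < nodes.length; i++)
            results.push(nodes[i]);
    }
    return results;
}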

Conclusion

My conclusion is that XML support in the browser is OK if you base your work around HttpRequest. But if you want to do XPath then look for an alternative.

OS User Perspective – money for old rope?

I was reading yet another "my OS is better than yours" thread the other day and was shaking my head at it. One particular Apple user was using OS X Leopard to throw mud at a Vista user. So it was with some interest that I read a recent Mac User magazine review of Leopard where they reviewed 10 new features and went on to say that there is really nothing new. At the same time I was attending a Windows 2008 training course and was being told about features that you can quite happily use in Windows 2003 and, more often than not, Vista. What struck me is that the interesting things in an OS are often never seen by the user; they enable other features or simply make the system faster or more robust. So, to avoid users crying "money for old rope", I think OS vendors really need to sell us on the fundamental changes, not the latest show-boating translucent flyout menu.

Exceptions or Error Codes (will it ever end?)

I joined in on a discussion today about whether services should raise exceptions or return error codes. Ron Jacobs quoted the following…

According to the .NET Framework Class Library Design Guidelines

"Exceptions are the standard mechanism for reporting errors. Applications and libraries should not use return codes to communicate errors. The use of exceptions adds to a consistent framework design and allows error reporting from members, such as constructors, that cannot have a return type. Exceptions also allow programs to handle the error or terminate as appropriate. The default behaviour is to terminate an application if it does not handle a thrown exception."

Sure enough, you don't see error codes within the framework; however, I'm not 100% convinced. The first problem I have with exceptions is one of performance. The stack walking involved in processing an exception should not be ignored. However, I'm convinced that I've read that the exception handling mechanism is going to change to reduce the performance impact…can I find any evidence of it? No, maybe I dreamt it. So I'm going to conveniently ignore performance for now and move on to my second gripe with exceptions, that of handling exceptions. To illustrate the point let me introduce the following example: you need to write a business component that will transfer funds from one account to another. The component contains a number of rules:
1. Does the client have access to account A?
2. Does the client have access to account B?
3. Are there enough funds in account A?
4. Is account A open?
…etc

The Microsoft-recommended route seems to be to create an exception for every rule. So you'd end up with something like:
1. AccountFailedAccessException(arg)
3. AccountInsufficientFundsException
4. AccountClosedException
etc

To catch these exceptions you need to write a specific catch statement for every type of exception. This is fine if you actually want to respond to each exception in a specific way, but let's say the client is only interested in "did it fail an access check?" or "some other non-access problem". This is depressingly annoying to implement since you have to catch each exception and repeat handling code, e.g.
catch (AccountFailedAccessException)
{
    DisplayAccessNotification();
}
catch (AccountInsufficientFundsException)
{
    DisplayTransferUnavailableNotification();
}
catch (AccountClosedException)
{
    DisplayTransferUnavailableNotification();
}
// ...etc
If this were an old-style error code then a switch statement would easily consume similar codes…
switch (errorCode)
{
    case AccountFailedAccess:
        DisplayAccessNotification();
        break;
    case AccountInsufficientFunds:
    case AccountClosed:
    // ...etc
        DisplayTransferUnavailableNotification();
        break;
}

Now, I've become convinced that exceptions are the way to go for (non-performance-critical) errors. The obvious solution to the above problem is to create a hierarchy of exceptions…
AccountException
    AccountFailedAccessException
    AccountNonAccessException
        AccountInsufficientFundsException
        AccountClosedException

Therefore, if you're only interested in something going wrong with the Account component you'd catch nothing but the root exception, AccountException. If you want any exception apart from access failures then you'd catch AccountNonAccessException, etc. Although this sounds good, I do concede that it is still a pain to code all those exceptions.

What happens if performance is still an issue? If there is no getting away from performance then the answer is to return a state structure/class or an enum. However, the big problem with this approach is that the client isn't required to consume the return code. The great thing about an exception is that it's in the hands of the OS; if you choose to ignore the exception then the program counter will be whipped away from you. So it isn't an easy choice for the performance freaks, but I feel the tide of change is such that if you do go the error code route then fewer clients will like you for it, and in the world of SOA it is becoming harder to ignore that unknown client. I just hope I didn't dream about the change to SEH (Structured Exception Handling) and that a new performance-oriented version is around the corner.

Architect MVP Juval Lowy and Ron Jacobs address my question

I was privileged a little while ago to be invited to listen to a number of software architects talk about SOA, and I started to wonder why SOA hasn't taken off. One idea that popped into my head was, "do I trust the service provider?". With this in mind I posted on the MSDN forum. I was happily surprised to see that…
 

Congratulations – your post has been selected for an ARCast.TV Rapid Response!

To listen to Architect MVP Juval Lowy and Ron Jacobs address your question click here

Hope this helps,

Ron Jacobs

…and it reaffirmed what I and other posters had said…which is nice 😉

Expression Design Training

My trusty RSS reader (RSS Bandit…what a terrible pun, if you don't get it I'm not explaining it) told me of some free training for the Expression range of products over at www.lynda.com, so I loaded up the Expression Design training. The training consists of a series of downloadable exercises and a set of corresponding QuickTime movies. Now, I'm not an experienced Adobe Illustrator user, so most of this vector design stuff was going to be new to me. Well, actually the tool is pretty self-explanatory if you've ever used anything with layers. The training was aimed squarely at someone who doesn't have a clue, which I guess is fine. You do learn the basics of using it and it is worth going through if you use my top tip: make use of the scroll bar to skip forward. E.g. when you see someone move a shape from A to B, fast forward, because you understand it and don't need to see them move it to C, D, E…etc!

One gripe I did have with the training is that it uses beta software and some of the sample files in the beta simply don't exist in the released version, although the samples used are so simple it doesn't really matter.

Overall it was good to go through, just don’t sit there and watch it second by second.