More Vista Sidebar questions than answers?

After my initial play with gadgets I thought I’d see how to make the background transparent. Boy, do I regret looking at that. The basic mechanism is based around the background image: set it to transparent and the desktop will show through. Sounds simple. I downloaded Paint.NET and created a PNG with 50% opacity. The gadget text took on the background of the image, but there was no opacity. Now I’m not sure if this is a problem with Paint.NET or the gadget. However, after playing around with moving the image’s canvas I did eventually get parts of the gadget to be transparent, although it has created more questions than answers…

How do you make use of a background image? The main how-to shows a new tag to declaratively set the background, <g:background/>, but this only seemed to work with absolute positioning – again, not an official line, just what I observed. The problem seems obvious: if you use an element then you’re getting involved in the document flow. I don’t understand this, since the element is described as a way of setting gadget.background.image without using code, so I thought it would be removed before rendering. Not very happy with this, I had a look around the web. One example I spotted just used the basic CSS background-image for the body. It worked a treat, so why you need the alternative version I don’t know – perhaps that will become clear later; other gadget technologies perhaps?

The next bizarre issue was the background image itself – again, it could be Paint.NET, I’m not sure. When I replaced the background image with a new one, the gadget rendered an odd mix of the old and new. When I removed the image from the folder the gadget worked fine; put the image back and the odd mix reappeared. I created a brand new image (i.e. a new name) and that worked fine. Chopped the corner off for a nice rounded edge and that worked…well, if you count an odd lightning-fork-like line going through the gadget as working! So just be careful when editing the background image; if you get odd results, just bin the file and create a new one.
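For reference, here is a rough sketch of the two routes as I understand them – the file name is invented and this is just a sketch, not official guidance:

```javascript
// Sketch only – the file name is invented and this isn't official guidance.
// Route 1: the gadget object model; as I understand it, this is roughly what
// the declarative <g:background/> element ends up setting.
function setBackgroundViaApi() {
    System.Gadget.background = "background.png";   // PNG with an alpha channel
}

// Route 2: the one that actually worked for me – plain CSS on the body,
// e.g.  body { background-image: url(background.png); }
// or the script equivalent:
function setBackgroundViaCss() {
    document.body.style.backgroundImage = "url(background.png)";
    document.body.style.backgroundRepeat = "no-repeat";
}
```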

Vista Sidebar gadget gotcha

On a train trip back from London I used my laptop – yes, I actually got something half useful done on a train journey. I had a couple of nagging questions about Vista Sidebar gadgets, so I thought I’d try a quick example. I threw together a couple of HTML and JavaScript files and voilà, had a gadget. But it didn’t seem to respond to the resize events correctly. After a number of random changes I finally realised my mistake. It seems that you must set a default size in the CSS otherwise it goes a bit crazy. I’m not sure if that is the official line, but it certainly seemed to fix the problem for me.
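For what it’s worth, a minimal sketch of the idea – the numbers are made up and, again, this isn’t the official line. The point is just that the gadget body gets an explicit default size; I did it in the CSS, but the same thing can be done from script when the gadget loads:

```javascript
// Sketch only: give the gadget body an explicit default size (values invented).
// Equivalent CSS rule:  body { width: 130px; height: 90px; }
function gadgetLoad() {
    document.body.style.width = "130px";
    document.body.style.height = "90px";
}
// Wired up from the gadget's HTML, e.g. <body onload="gadgetLoad()">
```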

@Media 2007 (atmedia2007)

First off, I wasn’t going to attend @Media this year because I wasn’t impressed with ’06. But circumstances changed and I went in place of a colleague. So here is my review.

Venue

The venue was the Islington Business Centre, a nice place and one I’ve been to before for a Photoshop seminar. It’s a bit harder to get to than the QEII Conference Centre, but it’s fun to be crushed with a load of Bankers (yes, I did check the spelling) in a tube for a while(?).

Tracks

There were two tracks, split into "design-related" and "tech-related" sessions. Hmm, yes, I could detect the differences, but the crossover was too vague for me, e.g. track 2 – Diabolical Design (design-related) followed by The Mysteries of JavaScript (tech-related) – how is that the same track? However, the real problem is that there is no indication of the level of skill these sessions are aimed at. This is especially important for the tech-related ones. Switching on a computer is tech-related, and some of the sessions were closer to that level than to skilled JavaScript development. So please, please tell us who the target audience is for a session.

Session Reviews

Beyond Ajax – Jesse James Garrett

A keynote-style presentation by someone seemingly named after famous cowboy stories – I wonder how many times he’s heard that. I really enjoyed the session; he made some excellent historical comparisons and really drove home how important the user experience of any product is. There were a couple of nice phrases and metaphors used that I certainly want to use myself, e.g.

  • Can’t live without
  • Avoid constellation of features
  • Products are people that know who they are
  • Design from the outside-in
  • Star to sail the ship by

Overall, a slow start (a bit too much ego?) but it quickly improved into an inspirational session, like a good keynote should be. 8/10

Diabolical Design: The Devil is in the Details – Jason Santa Maria

(Design Related)
"Delivering a message", was the main point of this session and how to do this via the layout and colour scheme of a site.
Talked about some of the more subtle points of designing a page together with ideas about how to where to seek inspiration from. A number of the points I recognise from good design practices but since I’m not a graphic designer web developer I found them interesting including;

  • Colour palettes from life
  • Left to right, top to bottom, big and small
  • Focal Points
  • Where is the story
  • Whitespace is good – without good whitespace use, text is "on the page and legible but not readable"
  • 66 character line
  • Grid – to provide order
  • Planning – ideas before images, sketchbooks, grey boxing
  • Strive for clarity
  • Avoid, "Product blanding" 🙂

Overall good points and well presented 7/10

High Performance Web Pages – Nate Koechley

(Technical)
I have to say I was disappointed with this because I enjoyed Nate’s session last year – one of the few sessions I did like. Although this session was good, it would probably have been easier to simply point everyone to http://developer.yahoo.net/blog/ and let us read the details. But without this session I wouldn’t have known about the site, so it was certainly worth going to. I must confess I nearly didn’t attend because I had a feeling it was going to turn out like a list I could read at any time, but hey, it was still time usefully spent – just perhaps better spent at home rather than in a seminar. The twelve rules:

  1. Make fewer HTTP requests – inc. using CSS sprites
  2. Use a content distribution network – content nearer the user
  3. Add expires headers
  4. GZip components
  5. CSS at the top (and combined)
  6. Scripts at the bottom (and combined)
  7. Avoid CSS expressions (fire too often)
  8. JS/CSS external files
  9. Reduce DNS lookups – keep-alive, 2-4 hosts max
  10. Reduce the size of scripts
  11. Avoid redirects
  12. Turn off ETags

There were also some nice tips about what was used to capture the information during the production of these tests.

Overall, disappointed: good tips but too rushed, and it felt like death-by-PowerPoint since essentially it was just reading out what was on the slides. Presentation 6/10, content 9/10

Designing for International Users: Practical Tips – Richard Ishida

(Technical?)
A really interesting session from a "W3C International" employee. To be fair it was like an hour of interesting them-and-us trivia, but it certainly took my understanding of the potential problems a step further. Some of the points included:

  • Localisation (to make a product work in a specific "culture"), Internationalisation (design a product so that it can be easily localised)
  • Beware of plurals
  • When displaying a list of languages to choose from, always show the languages in the target language. The demo of finding English from an Arabic site was compelling
  • Symbols – naughty words/gestures are always a problem, and the points on ticks and crosses were interesting too
  • Chart into – I found this one funny and relevant since some of the software I write is used in China
  • Unicode everywhere – ok this is a given but it doesn’t hurt to say it again.

Even though this was another potential read-from-a-list session (and here are the tips), the demos were fascinating.
Overall, lots of interesting demos presented with real enthusiasm and a decent amount of humour; it just lacked a few examples of what you should actually do. 9/10

Microformats, Building Blocks, and You – Tantek Celik

I enjoyed a similar session last year so I attended this one; unfortunately there wasn’t much new. I did take a couple of things away:

  • POSH – plain old semantic HTML
  • Download a Microformat button aka badge aka icon
  • Operator plug-in

Overall: if you’d never seen this session (or the like) before, 8/10; if you have, 6/10.

When Accessibility Is Not Your Problem – Joe Clark

This was always going to be a controversial topic from a controversial character. The basic premise seemed to be that rather than pander to issues and find odd solutions, let’s face up to the reality of the situation, i.e. if a device doesn’t support a page, fix the device not the page; if a guideline is being pedantic, implement its true meaning and don’t fret about it. So I had a great deal of empathy, even if I didn’t always agree with it. Let’s face it, in the real world if a client wants their customers to use a reader and your site doesn’t work with it, then I doubt saying, "make the reader manufacturer change, nothing to do with me" is really going to work. Still, like I say, the core of the arguments was sound, however…

Boy oh boy did he over-labour the points. The guy obviously has various demons and various axes to grind, but I’m afraid the overly drawn-out session was embarrassing. For example, he complained about the abbreviation specifications in the WCAG 2 guide and how in real life one man’s abbreviation is another’s word. Fair enough, and a good point. However, he proceeded to ram it home with what seemed like 5 slides and 20 mins. Ok, we get it, we do! There were a lot of people sniggering and generally complaining in whispers about the session.

Overall, good points but 40 mins too long, I just hope it was jet lag…3/10
[Edit] I’ve been talking about this session over lunch today and getting into a number of debates about who is responsible for this or that, all driven by this session. I think that shows the impact of the content, and as time passes the poor presentation will fade but the core content will remain, so I’ve decided to separate out the score: 1/10 for presentation, 6/10 for content, 10/10 for importance.

Day 2…

How to be a Creative Sponge – Jon Hicks

Nice 2nd day keynote talking about how Jon gathers information from all sorts of sources. Just a nice witty presentation and a gentle start to the day. 7/10

Bullet Proof Ajax – Jeremy Keith

(Technical)
To be honest I didn’t enjoy Jeremy’s session last year, but I thought I’d give it another go. I thought it was good; it started off going over some of the same stuff, but you have to get to a baseline so that’s fair enough. The other plus was the improved humour – a nice joke about iframes. What I thought was fantastic was that here was someone actually saying that Ajax isn’t some silver bullet and, although it certainly has its uses, you shouldn’t use it everywhere. Hurray! Jeremy was advocating the layering of behaviour, i.e. degrading gracefully just like CSS (there’s a rough sketch of the idea after the list below). Some of the other salient points:

  • Developers can control the server (in terms of performance), you can’t control the client spec’
  • Ajax isn’t for full blown applications
  • Good for small updates to a page – use of indicators such as fades to show the user what has changed
  • Problems with the dreaded back button and bookmarking – i.e. Ajax changes the state so what should those features do?

A nice summing up with, "the more complexity you put in the browser the harder it gets". 7/10

1 Web, Acid 2 and CSS 3 – Hakon Wium Lie

(Technical?)
Good presentation with a fair amount of bias towards Opera, but the points were fair. I liked the laudable concept of one web: the web should be the same regardless of the device. Opera seeks to do this by using the same core engine across the devices it supports. I also learnt about the service that uses that core to pump a binary version of the page directly to mobile phones – interesting. Although I’d have more interest in Opera if the latest version hadn’t contained such a breaking implementation of JavaScript/DOM.

He showed the Acid 2 test and how the browser community had set about the task of passing it…apart from poor old Internet Explorer.

There was a brief show of some CSS3 features implemented in Firefox and Opera. A nice look at them, and I really hope they are realised sooner rather than later. I thought the BOOM Microformat used to write a book was pretty impressive – even if print media holds little interest for me, it really shows the potential of CSS and Microformats. I also liked the multi-column and widget demos.

Overall some interesting insights if a little thin on things to take away 7/10.

The Mysteries of JavaScript-Fu – Dan Webb

(Technical)
A very witty and enthusiastic session from someone who is certainly a developer rather than a web designer. He based the session on a funny, if obviously tenuous, link to kung-fu films. Some interesting pointers and thoughts about developing JavaScript:

  • "A peasant language" 🙂
  • OnDom rather than OnLoad – see libraries
  • Use event delegation rather than multiple event handlers (see the sketch after this list)
  • Combine JS files – twice now
  • Use GZip – twice again, this stuff must be true 😉
  • Avoid checking the length in the loop when possible (interesting)
  • When showing extra UI (such as a drop down when MouseOver) create the controls JIT. Personally I’d consider having only one and moving it around.
  • Use Mac and Parallels for cross platform testing – yep couldn’t agree more
  • Firebug – how did we live without it? Though I wonder why I’ve not had trouble debugging JavaScript – perhaps no-one else uses Visual Studio for web development
  • Selenium – test tool
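
As promised above, a minimal sketch of event delegation – the list ID and class name are made up: one handler on the parent element rather than one per child.

```javascript
// Sketch of event delegation (the list ID and class name are invented).
// One click handler on the <ul> handles clicks on every <li>, present or future.
var list = document.getElementById("productList");
list.onclick = function (e) {
    e = e || window.event;                   // older IE
    var target = e.target || e.srcElement;   // older IE
    if (target && target.nodeName.toLowerCase() === "li") {
        target.className = "selected";       // act on whichever item was clicked
    }
};
```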

Overall I enjoyed it; Dan seemed to go off the presentation on occasion and talk about issues covered in later slides, but it was a pretty good end to the session. 7/10

Summary

I thought @Media 2007 was much better than 2006. The tracks were still really muddled, though, and I really do think they need to explain what level of expertise the sessions are aimed at.

[Edit] Great post with links to presentations…http://learningtheworld.eu/2007/atmedia-2007-slides/

WiX – a first look

I’ve long been unimpressed by InstallShield. Ok, that’s slightly unfair; what I should say is that I’ve been unimpressed that it is different to all my other Visual Studio projects and that it costs quite a bit to purchase. So I went looking for an alternative. I’d previously played around a bit with the installation projects that come with Visual Studio and they seemed very clumsy and unintuitive to use, so I had a look to see if they’d improved them. What I found was a project from Microsoft, if not officially a Microsoft product, called WiX, or Windows Installer XML. I downloaded the latest v3 build and ran through the v2 tutorial.

So what is it? It has the feel of a cross between an XML build script and a web page. The package of items you want to install is laid out in a hierarchical fashion loosely modelled on the resulting folders. You can then put all the usual items (such as shortcuts, DLLs, documents, etc.) into components that will be installed into those folders. I found it very easy to produce and the XML schema makes a lot of sense.

It wasn’t without hiccups; my first problem was with shortcuts. Since I was using v3 I could build the project from Visual Studio rather than using all those nasty command line instructions – how quaint ;). The errors produced were quite good, at least they pointed to the correct line (see my complaints about Mono). However, it complained that my shortcut wasn’t advertised so should have an entry in the registry. Umm, ok. ‘Advertisement’ is installer-speak for presenting information about the item to be installed, so it was my ignorance of MSI that didn’t help here. I didn’t want to play with the registry (even though I just needed to put a value where it was asking for one), but luckily there is an attribute called "Advertise" on the shortcut element, so I went ahead and set it to ‘yes’. Not really sure what I’m doing here, but hey, it worked!

The next problem was altogether harder to solve, and I think it is a bug. The build kept complaining that the Program Files folder didn’t have an entry in the remove files table. This highlights one of the problems with WiX: the compilation errors come from the compilers that produce the eventual MSI, and the error messages are, reasonably, related to that. However, WiX hides all these weird tables from the author, so the error looks very odd. Fortunately I did know a little bit about this, and only after playing with a RemoveFolder element (and "incorrectly" setting its overriding folder property) did I manage to finally build the installer. I ran it and, without any UI, the files I wanted to install were there in the correct places…better than my first few attempts with the Visual Studio installer projects.

The next stage was to provide some form of UI that you’d expect from an installer. This is where the default features of WiX come into their own. By simply adding a single UI element (with a few default UI types) and adding a reference to the WiX UI DLL, the installer suddenly had all the usual features: licence statements, custom|full|typical, etc, etc. Well, sort of. The compiler threw a fit when I first added the reference and chucked out hundreds of error messages, all seemingly about localised values. So I set the culture in the project properties, but that didn’t seem to work; foxed, I went to the web. It turns out there is a problem with saving the project culture properties and you have to unload the project, edit the project XML and reload it again. That worked, and the installer with all the nice installer wizard pages sprang into life.

I’ve yet to test creating your own UI and installing some of the slightly odd things that I need to do (e.g. COM+, Services, etc), but so far I’ve been impressed, if slightly concerned about the support for the project – but, as I’ve not yet mentioned, you can download all the source, so if a problem needs to be fixed the option is always there to fix it yourself. Hopefully I’ll have some more time to complete my testing and start using it soon.

Summary of advantages:
1. It is free!
2. The installer project can live in the Visual Studio solution like all the other assets
3. Easy to produce, XML structure simple to use and the Visual Studio support is basic but good enough
4. Since it’s XML it should be easy to automatically construct the files based on the other projects in a solution
5. You have the source code

Disadvantages:
1. v3 is a Beta product
2. You have the source code 😉
3. You have to re-write DLLs to create new UI components; I’d have preferred a separate Visual Studio template for this.
4. Where is the support forum? Again, another horrible mailing-list-only project. Come on people, this isn’t the ’80s.

CSS TreeView adapter – too much recursion

Late in 2006 a semi-skunkworks project was released called the CSS Control Adapter Toolkit. The idea was simple enough: the controls shipped in ASP.NET produce some dreadful HTML, so by cunningly re-using the framework intended to support "non-standard HTML" devices you can reroute controls to use CSS-friendly equivalents. One such control given the CSS-friendly treatment was the TreeView control. To support the various client-side features, the adapter toolkit ships a JavaScript file that you include in your site. My colleague made the switch on our site and, after a few teething problems, the CSS version of our tree views seemed to be working…guess what is coming.

We started to get reports that it wasn’t working in Firefox; the whole page seemed to lock up and wouldn’t respond to mouse clicks. I investigated the issue and soon realised that the JavaScript engine in Firefox was complaining about too much recursion. Normally I wouldn’t have been too surprised to see recursion in something processing a tree, but in this case I knew that the tree wasn’t very deep at all – at most it would be six branches – surely that isn’t enough to send the recursion stack into meltdown?

I looked at the JavaScript shipped and spotted the problem (source file). When you process a tree, the typical algorithm is to process the peer nodes one at a time, and for each node you pass its child nodes into the same function, hence the recursion. For a tree six deep that should only ever put six contexts on the stack. However, this code was passing the next sibling back into the function as well, so if you have 20 siblings then your stack is at least 20 deep, and that’s before you consider the number of children and their children, etc. So very quickly you can see that the stack could overflow. As elegant as it may seem, passing the sibling back into the function is not the right thing to do! I rewrote the code to simply enumerate over the siblings and only pass their children back into the function; unsurprisingly the engine managed to cope with six calls. If anyone has this problem and wants the code then I can post it, although it really is a very simple fix.
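In the meantime, here is a simplified sketch of the shape of the fix – the function name and per-node work are illustrative rather than the adapter’s actual code:

```javascript
// Sketch of the fix (names are illustrative, not the adapter's real code).
// Before: the function called itself for the next sibling as well as the first child,
// so 20 siblings meant a call stack at least 20 deep before any children were touched.
// After: loop over the siblings and only recurse into children, so the stack depth
// matches the depth of the tree (about six in our case).
function processTreeLevel(firstNode) {
    for (var node = firstNode; node != null; node = node.nextSibling) {
        if (node.nodeType !== 1) continue;      // skip whitespace/text nodes

        // ...per-node work (toggle expand/collapse state, fix up classes, etc.)...

        if (node.firstChild) {
            processTreeLevel(node.firstChild);  // recurse into children only
        }
    }
}
```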

Virtual PC 2004 vs 2007

I finally downloaded the VPC of Orcas (Visual Studio 9 Beta) so I can have a play with Silverlight. After what seemed like a lifetime downloading the files I finally managed to get it running on VPC 2004. However, it was a bit slow – typical Virtual PC speed…poor. Now, I know that VPC 2007 runs Vista better, so I thought I’d give it a go, and I’m not disappointed. VPC 2007 seems much faster than VPC 2004. It feels like you’re running on an old machine: capable, but you have to be a bit patient at times. Whereas VPC 2004 was like running on a PIII that was also doubling as a server for Hotmail. So my advice to all you VPC 2004 users: get 2007 now!

The death of cross-platform code?

After some recent exposure to Java and .net Mono I was pondering the various advantages of abstracting the hardware into a virtual machine that the code runs within. However, it occurred to me that recent demonstrations of both technologies were run on virtual machine software, i.e. VMware and Virtual PC/Server. With virtualization (yuck, made-up word warning) improving seemingly by the month, and with backing from the major chip manufacturers too, it seems that virtual machine software is becoming commonplace. It also seems to be making serious inroads into replacing machines for commercial use…albeit, perversely, with specialised hardware.

So back to the topic: if you make the leap of faith that these virtual machine vendors will produce high-performing products, then why worry about cross-platform at all? Why should I port my Microsoft .net application to Mono on the Mac when I can just fire up Parallels/VMware and run it on a copy of Windows on the Mac? The expense of producing and maintaining software is almost always greater than the expense of throwing hardware at a problem. I’d argue that if you spend your budget on writing quality software for one platform, and IF virtual machines perform as the vendors are suggesting they will, then you can just run the software on the same platform on different kit via VMs.

Reality check. I’m not actually convinced that VMs will run at a decent rate for at least a year yet. Once we get 4/8 cores as standard then I think the story will look far more plausible.

[Edit] I’m now very interested in Silverlight, could this be the software VM that ruins my argument? I hope so!

Who is to blame for Credit Card validation code on web sites?

By no means for the first time, I’ve just purchased something from an Internet site and got the "invalid credit card number" message. What was wrong? Yes, I’d included spaces in the number. Now, this is so rubbish for the following reasons:
1. The spaces are there to help us humans enter the correct value, so why encourage mistakes?
2. How difficult is it to remove spaces from text if that’s what is needed, really? A semi-trained monkey could manage that (see the sketch below).
3. I’m putting one of my most trusted assets in the hands of developers who can’t remove spaces, hardly comforting.
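
To labour point 2, here is a sketch of roughly all it takes – the function name is mine, not anyone’s production code:

```javascript
// Sketch only: normalise the card number instead of rejecting it.
function normaliseCardNumber(input) {
    return input.replace(/[\s-]/g, "");   // strip spaces and dashes
}

// e.g. normaliseCardNumber("4111 1111 1111 1111") gives "4111111111111111",
// which can then be validated (length, digits-only, checksum) as normal.
```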

So if it’s you who can’t write these routines for the major credit card companies or retailers, contact me. I have very competitive rates when it comes to code that removes spaces!

…and relax.

Oracle SOA briefcamp

I attended a "briefcamp" at Oracle London today. A bit of a strange one, this: it was really an overview of the SOA suite. Unfortunately the demo gods were at their very worst today and nothing really worked – a little bit worrying since this is released code, but I’ve been around computers too long to worry too much about that. As my blog probably shows, I’m very much a Microsoft-shop developer, so I have to try very hard to keep an open mind about Oracle development tools, but here goes. There are a lot of comparisons to be made between the offerings from Oracle and Microsoft, and from an architectural point of view they achieve pretty much the same thing. However, in my unbiased(?) view the .net Framework seems better than the Oracle equivalents. I would say that you can achieve what you want with Oracle, but the tools are unpolished; they have the feel of tools that are 5 years old. The demonstrator was pleased to show off the fact you could create a web service from a component in one click…really, that’s something I demand from a dev tool, not something I’d hope was there.

The biggest gap for me was in configuration. An example given was the way of avoiding the performance cost of communicating with a web service. Oracle have a special in-memory type of wrapper that allows you to call a web service without really using HTTP (et al). The problem is that if you use this wrapper you lose the portability of moving the service to a different location, i.e. the other side of a firewall. I asked how you configure it to use the real web service rather than the in-memory wrapper, thinking it would be like Microsoft’s Windows Communication Foundation; "you recompile it" was the answer…hmm, not exactly SOA in my book.

Overall, though, I’m not knocking Oracle on this. What is clear is that whether you choose Oracle or Microsoft as your preferred development platform, they will both do pretty much the same thing; it will probably come down to past preferences and political issues more than technical advantages…although I prefer Microsoft’s 😉

A fundamental difference between OSX (UNIX) and Windows

I spend most of my coding time using .net and Windows. However, I do like to experiment in the world of OSX, hence my interest in Mono. It’s pretty low-down and dirty, and really it doesn’t feel like development on Unix has moved on in the last 10 years – having to use command line arguments and the like, I mean really! 😉 This isn’t for me, so I went looking for a nice development IDE and found things like the MonoDevelop project. Then came the eureka moment for me. After reading the instructions for Linux I found a link to someone with instructions for OSX. Part of the instructions included a warning that recompiling these libraries may have an adverse effect on the system…or something like that.

There you have it: UNIX is great, no doubt, but to get into it you really have to be brave. Windows, these days, mothers us far too much, but it’s difficult to accidentally foul it up just by compiling your code. Whereas in UNIX everything is still about running scripts and making changes that make your buttocks clench. I do detect a real effort in both the UNIX variants and Windows to protect the system from the user and to provide much better installation mechanisms. So I’ll probably drag my feet with Mono until I get a lovely installer that works directly with OSX rather than via GNOME or the like.