The Robustness Principle

Joel Spolsky has written today about web standards, and how standards are wonderful things because there are so many to choose from. It’s a long article, and in it he mentions Jon Postel and the Robustness Principle, though his interpretation of the idea is different from mine. The Robustness Principle, in short, is: “Be conservative in what you do, be liberal in what you accept from others.”

The wonderful thing about standards is that there are so many to choose from.

Joel believes the problem with this principle is that if you are forgiving of mistakes when you first implement something, people who make mistakes won’t notice and won’t learn how to do things properly. This is a fair enough point, but it isn’t the way I’ve always looked at this principle.

I come from a Unix background, so most of my programming is with protocols or text, not GUIs. I’ve always thought of this principle as “Be careful that your program provides the expected output in the expected way, but if other programs give yours weird input, don’t fail catastrophically.” This is the important part. ‘Robust’, to me, has always meant ‘doesn’t break into a zillion tiny pieces at the slightest hint of a breeze’, not ‘will accept any old thing as input if it can figure out what it thinks you meant.’
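To make that concrete, here’s a minimal sketch in Python, assuming a toy line-based ‘key: value’ protocol (the format and names are mine, purely for illustration). It is conservative in what it emits (one canonical output format) and robust to what it receives: a malformed line earns a complaint on stderr, not a crash, and not a silent guess at what was meant.

    import sys

    def parse_record(line):
        """Parse one 'key: value' line from another program's output."""
        key, sep, value = line.partition(":")
        if not sep or not key.strip():
            # Weird input: refuse it clearly rather than guessing.
            raise ValueError("malformed record: %r" % line)
        return key.strip(), value.strip()

    for line in sys.stdin:
        line = line.rstrip("\n")
        if not line:
            continue  # blank lines are harmless; skip them
        try:
            key, value = parse_record(line)
        except ValueError as err:
            # Robust: complain loudly and carry on, don't fall over.
            print("skipping: %s" % err, file=sys.stderr)
            continue
        # Conservative: always emit exactly one well-formed format.
        print("%s=%s" % (key, value))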

When I was taught software engineering at Uni, we learned that if you say to a user “Input a number:”, they might not type in ‘5’, they might type ‘apple’. Just because they did something silly (because your interface is always perfect, right?), it shouldn’t crash your program. Instead, you print an error. Something like “Syntax error: ‘apple’” if you wanted to be obtuse, or perhaps “‘apple’ is not a number. Please input a number:”. We didn’t try to guess which number was meant when the user said ‘apple’.
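In Python, that classroom lesson is only a few lines (a sketch of the exercise, not any particular assignment):

    def read_number(prompt="Input a number: "):
        """Keep asking until the input parses as an integer."""
        while True:
            raw = input(prompt)
            try:
                return int(raw)
            except ValueError:
                # Report the error; don't crash, and don't guess.
                print("'%s' is not a number. Please input a number." % raw)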

At the risk of starting another pointless flamewar, this is an ideological difference I have observed between developers with a Windows background and those with a Unix background. Windows continually seems to be trying to help me by making decisions for me. Unix just does what it’s told. In theory, Windows should be the helpful one and Unix the troublesome one, but in practice it’s the reverse. Windows frequently gets in my way by trying to be helpful but doing the wrong thing, much like an over-zealous five-year-old. Unix does exactly what I tell it to do, even if ‘rm -rf /*’ destroys all of my files. But maybe that’s what I want to do?

The problem for browsers is that the erroneous input is coming from a web server on the other side of the world. There’s no mechanism for the browser to complain that the HTML it received is broken… or is there?

Perhaps if, way back in the day, browsers had displayed a little message saying “This website is broken because of: ‘blah’”, instead of trying to fix a developer’s bad code, this would have encouraged developers to fix their bad code, because their users would have a way of knowing that the badness was caused by the page, not a bug in the browser. Right now, apart from server failures, users have no way of telling the difference between a badly coded page and a browser bug.
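Strict parsers do exist, and this is roughly what they say. As a stand-in for that hypothetical strict browser, here is Python’s XML parser (XHTML, served properly, was meant to be parsed this strictly) refusing to guess at a mismatched tag:

    import xml.etree.ElementTree as ET

    broken_page = "<body><p>Unclosed paragraph<p>Another</p></body>"

    try:
        ET.fromstring(broken_page)
    except ET.ParseError as err:
        # What a strict browser might have told the user:
        print("This website is broken because of: %s" % err)

Running it prints a message pinpointing the line and column of the mismatched tag, instead of quietly inventing a closing tag on the author’s behalf.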

I tend to think that the browser developers are to blame here; they’ve made a rod for their own backs. As Joel says in his post, there is now no right answer, because pain will be felt either way. Conform too much, and existing websites that use dodgy hacks to work around IE6 and IE7 bugs (which never got fixed, and so became entrenched as the de facto standard) will break. Conform too little, and we’re stuck with the same chaos that means you have to test your site against every browser your customers are likely to use, and bunches of websites won’t work with Firefox or Safari.

Either way, it’s going to suck.
