The dinosaur that is web standards

If you’re into web design or development at all, you’ve probably run into the term “web standards” and the W3C, the body whose job it is to organize and draft these specifications. The truth is, web standards are still, by and large, just suggestions on how to do certain things, and they are not yet completely or widely followed. This is mostly an artifact of the way the web and browsers have evolved, but sometimes you have to wonder what the W3C is thinking.

Credit where credit’s due

To its credit, the W3C has done a lot of great work. Headed by Tim Berners-Lee, the creator of the web, it has done a good job of standardizing the data formats that underpin the web. Without some sort of guiding body, the web would certainly be a lot less usable than it is today. The W3C has also made decent progress in updating standards as technology and trends change, and it hasn’t neglected accessibility for the sensory-impaired, something that should be lauded.

Certainly, attempting to standardize something as widespread as HTML/XHTML across something as big as the Internet is no easy feat; working with the companies making the browsers must have been difficult. After all, companies want to distinguish their product from the others; if they all support the same “standards”, what would make one better than the other? Perhaps this was the thinking during the browser wars, when companies started introducing support for non-standard elements in order to make their browsers more appealing to users and web designers.

This is part of the reason why some pages still render differently in IE than in Firefox or Opera. But these days, it’s mainly IE6 that’s the odd one out, with Firefox and Opera (among others) supporting the W3C standards far better, if not completely.

Competition breeds innovation

While browser makers’ efforts to independently invent their own features often (especially in the past) resulted in broken web pages that only worked in certain browsers, once in a while they produced something genuinely useful, as opposed to, say, the marquee tag.

What I’m talking about is the XMLHttpRequest object, the basis for Ajax and hence for a lot of the websites and web applications you may use on a regular basis. If you’ve used Gmail, if you Digg, or if you use the new Yahoo!, you’ve benefited from Ajax and hence from this non-standard web technology. While the original implementation was an ActiveX object, its value was apparently seen by other browser makers, as Mozilla introduced support in its browsers back in 2002, with other browser makers following suit. (Interestingly, Opera only gained support recently; perhaps this is related to the fact that it follows the W3C specifications more closely than either Mozilla or IE.)
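The practical cost of this non-standard history is visible in how a page had to obtain an XMLHttpRequest object at all. Here's a minimal sketch of the kind of cross-browser wrapper pages needed; the function name and structure are my own illustration, not any particular framework's code, though the ActiveX ProgID strings are the ones IE shipped:

```javascript
// Try the native XMLHttpRequest first (Mozilla, Safari, Opera, and
// later IE7+), then fall back to Internet Explorer's ActiveX objects.
function createXHR() {
  if (typeof XMLHttpRequest !== "undefined") {
    // Native implementation available.
    return new XMLHttpRequest();
  }
  // IE5/IE6 exposed the object only through ActiveX, under more than
  // one ProgID depending on the MSXML version installed.
  var progIds = ["Msxml2.XMLHTTP", "Microsoft.XMLHTTP"];
  for (var i = 0; i < progIds.length; i++) {
    try {
      return new ActiveXObject(progIds[i]);
    } catch (e) {
      // That ProgID isn't available; try the next one.
    }
  }
  throw new Error("This browser does not support XMLHttpRequest");
}
```

Boilerplate like this is exactly what the JavaScript frameworks of the day bundled, so that page authors didn’t have to write it themselves.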

Gmail’s been around for over two years, and other Ajax websites and web applications have also been in use for over a year. However, the W3C published its first working draft only this past April, an attempt to standardize a technique already in wide use. Admittedly, it’s a valiant effort: implementations currently differ across browsers (mostly IE in one corner and the rest of the world in the other), so getting something down that everyone can agree upon is good.

Too little, too late?

However, it’s striking that it took them this long to produce even a draft. And with many JavaScript frameworks out there that already abstract the XMLHttpRequest object so that you don’t have to worry about incompatibilities, isn’t some of this work perhaps done in vain?

Thankfully, there’s been some action to counter this slow reaction time. As with most standards bodies, slowness to adapt is a key problem; in fact, it’s why the W3C was formed in the first place (ironically, it seems), to address the shortcomings of the IETF at the time. This time, however, the group that’s taken the helm is not a vendor-neutral body, but instead one comprised solely of the companies and people who make browsers – the WHATWG.

The WHATWG is not meant to be competition for the W3C, nor a replacement for it. Instead, the hope is that by fostering a good relationship among browser makers, good standards can be developed at a pace on par with development in the real world. These drafts will then be submitted to the W3C, taking a lot of the workload off it.

W3C-what?

The W3C, which one shouldn’t forget to commend for its previous efforts, still sometimes comes up with specifications that, while they look good on paper, just don’t seem like they’ll translate into something real and usable.

Take XHTML 2.0, for example. With a name like that, you’d expect it to be a successor to, and backwards compatible with, the current version of XHTML. Not so. In favour of a stricter definition of a document, backwards compatibility will not be included. All the talk of making your HTML/XHTML documents validate in order to preserve backwards compatibility and to ensure forwards compatibility seems to have gone out the proverbial window. For those who were skeptical about the usefulness of web standards, XHTML 2.0 must have seemed like the straw man they were looking for; it really underlined the separation from reality that the W3C seemed to have taken. (Even the current versions of XHTML have problems that need to be addressed in the context of delivery over the web.)

Maybe it’s time

So perhaps what we needed was the WHATWG: something to keep the W3C in line with reality, and to allow the speedy standardization of things that are still developing. So far, it has produced a few specifications, which have had an impact on the W3C. Its continued interest in text/html as a viable MIME type is a rare stance; too often, people get caught up in current trends, such as the love affair everyone seemed to be having with XML a while ago. While certain XML formats are good, they have issues when there’s a chance the XML will be hand-coded and not checked for well-formedness errors. In that case, traditional HTML served as text/html is probably better.

In conclusion, hopefully the state of web standards will be better in a few years. The W3C has a lot of good to offer, especially when it comes to accessibility, and combined with the helping hand of the WHATWG, things should progress for the better.
