Why poorly designed websites persist

I originally meant this to be part of my post, “Why Flash-based websites are bad”, but when I started typing the section, “A brief history of the problem”, things got a little too long. What can I say? I have a problem with being overly verbose, but I can’t bring myself to delete something I spent time typing. This is probably also why I’m a pack rat – as I type this, there are five or six cardboard boxes sitting in my room that should probably be thrown out. But I digress.

If you’re a web designer, you may be disenchanted every time you visit a poorly designed or poorly structured site. Why do these sites still persist, despite proper techniques for content/presentation separation having been around since 2001? Has the whole world gone mad? Fortunately not – but it is a complex problem.

The problem of poorly designed websites, in my opinion, can be traced to two important factors, both of which can be assigned varying degrees of “blame”, depending on your viewpoint. There are tons of complicating factors, but I believe these to be the two most important.

Firstly, HTML (the precursor to XHTML) was designed as a document markup language (hence the name), completely devoid of presentation information. Since the earliest users of the web were mostly from academic circles, it’s no surprise that much of the markup in HTML applies to formatting things like scientific journal papers, which typically make extensive use of headings, lists, and other semantic elements. The complete lack of presentation styles in the first version of HTML might seem like a good thing nowadays to web designers “in the know”, but unfortunately, most people don’t think that way. This resulted in later versions of HTML adding presentational elements such as font, and even in browsers implementing their own non-standard elements, all due to user demand for control over presentation. Many of these elements have now been deprecated or eliminated, but they remain in use – it’s hard to close the door once it’s been opened.
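To illustrate the difference, here is a sketch of the same heading marked up presentationally with the deprecated font element, and semantically, with the styling moved into CSS (the class name is my own, chosen for illustration):

```html
<!-- Presentational markup: the font element, deprecated in HTML 4.01 -->
<font face="Verdana" color="#cc0000" size="5">Latest News</font>

<!-- Semantic markup: the element describes what the text *is* -->
<h2 class="news-heading">Latest News</h2>

<style>
  /* Presentation lives separately, in CSS */
  .news-heading {
    font-family: Verdana, sans-serif;
    color: #cc0000;
  }
</style>
```

The second version renders much the same, but the document itself now only says “this is a heading” – what it looks like can be changed without touching the content.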

The problem lies in the interpretation of what exactly is content, and what is presentation. Content exists for the purpose of communicating information or a concept; presentation defines how that content should be shown. However, most non-technical people tend to blur the line between the two – for them, presentation is part of the communication process. This is an equally valid point of view and, in fact, probably the normal way to view things – human beings are visual creatures.

It takes a different sort of person to consciously think “This part of the document should be marked up with a list because it depicts a series of objects with some sort of logical relation among them”. Most people simply read the list as-is, but subconsciously, they are probably making the same connection. I think this has to do with the person’s MBTI (and, obviously, their areas of education), but that’s another topic altogether.
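As a sketch of that mindset: instead of faking a list visually with line breaks, semantic markup states outright that the items are related:

```html
<!-- Looks like a list on screen, but the structure is invisible to software -->
Apples<br>
Oranges<br>
Pears<br>

<!-- Semantic version: the relationship between the items is explicit -->
<ul>
  <li>Apples</li>
  <li>Oranges</li>
  <li>Pears</li>
</ul>
```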

The second factor is the unusually low barrier to entry for making a website. HTML, often called the lingua franca of the web, is a fairly easy language to learn, as far as topics in computer science go. A grade school kid who can read at a decent level can easily learn the basics and begin coding a simple web page almost immediately. This is enhanced by the availability of many WYSIWYG HTML editors that make creating web pages as easy as using Microsoft Word or PowerPoint. However, while HTML is easy to learn, like many other things it is hard to truly master, especially when using it properly in combination with CSS and JavaScript. This creates the situation where “a little knowledge is a dangerous thing”.

While it’s easy to make a web page, it’s also easy to flout web standards, accessibility, and usability, all in the name of making a page that looks “nifty” or “cool”. This is perhaps best seen in the unending multitude of MySpace pages, where users are given control to customize almost any part of their page. This has resulted in pages that are poorly designed, hard to use, do annoying things like play music, or some combination of the above – a throwback to the GeoCities of the 1990s. The problem is compounded by difficulties in properly implementing best practices, as well as the fundamentally different way design must be done on the web as opposed to other media such as print.

In reality, HTML is a language that demands almost the same level of attention to detail as a programming language. However, since programming languages are harder to learn, they present a greater barrier to entry that keeps out those not willing to learn things the right way. (If you make a syntax mistake, the program will probably not compile or run – though some would argue that there are still plenty of problems related to poor programming practices as well.) Some people would like these strict rules to apply to XHTML too – and in fact they do, if XHTML is served as XML, which is how it was intended to be served. This has created a very stratified community of “website designers”, best summed up in Roger Johansson’s article on levels of HTML knowledge.
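To make that concrete: browsers apply strict XML parsing only when the server sends the page with the MIME type application/xhtml+xml rather than text/html. Under that content type, a single well-formedness error – one unclosed tag – causes the browser to display a parse error instead of the page. A minimal well-formed example:

```html
<?xml version="1.0" encoding="utf-8"?>
<html xmlns="http://www.w3.org/1999/xhtml">
  <head><title>Strictly parsed page</title></head>
  <body>
    <!-- Delete the closing </p> and an XML parser rejects the whole document -->
    <p>Draconian error handling in action.</p>
  </body>
</html>
```

In practice, almost all “XHTML” pages are served as text/html, so browsers fall back to their forgiving HTML parsers and the strict rules never kick in.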

Furthermore, good structural HTML is important if you care about Google or any other search engine indexing your site properly. The Googlebot (the spider that crawls the web, searching for new content to add to Google’s search index) will have the easiest time indexing your site if you’ve properly separated content from presentation and used meaningful (X)HTML elements to mark up your content. Many people do not understand this – they either don’t think about it at all, or hold the false view that search engines see websites the same way people do, through a browser. Additionally, if you care at all about accessibility, you must follow web standards, which are conducive to designing a website that is accessible to people with disabilities.
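As an illustration of why this matters to a crawler (a sketch, not a description of Googlebot’s actual ranking behavior): a program reading raw markup can tell that an h1 is the page’s main topic, while a styled div tells it nothing:

```html
<!-- To software, this is just anonymous text in a box -->
<div class="big-red-text">Web Standards Guide</div>

<!-- This explicitly says: here is the most important heading on the page -->
<h1>Web Standards Guide</h1>
```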

I don’t think draconian parsing rules are the solution – for something to be the lingua franca of the web, it must be usable by all. Easy-to-use, standards-compliant XHTML editors are what is needed, though I acknowledge that the problem is difficult. Thankfully, things are improving – despite the presence of sites like MySpace, the increasing popularity of blogs has given people another outlet for creating personal websites. Most of these blogging platforms are relatively standards-compliant and in line with the spirit of web standards, and in many ways they have been at the forefront of promoting good website design. Let’s hope more sites follow suit.


  1. Until web browsers decide to stop being backwards compatible with early versions of Netscape, I can’t see the situation improving. I think browsers should abolish “Quirks Mode”, because only when people see that their sites are broken will they (hopefully) attempt to fix them.

    And don’t get me started on MySpace! ;o)

  2. I agree, in principle. In some way, I’d like to see browsers only support validated markup. However, I don’t think that’s practical, as we can’t expect everyone to fix their markup – nor would I expect these tough rules to suddenly make people switch their minds, as I mentioned in my article.

    If browsers suddenly stopped working with invalid code, then we’d suddenly find a lot of angry users out there wondering why sites like MySpace don’t work. (Maybe that would be a good thing… 😉 )

    The main problem is that while we tech people understand the need for validation, 99% of the world doesn’t – and that matters. Overall, I think we need to discourage people from writing HTML by hand (when they aren’t willing to spend the time to learn it) by offering good WYSIWYG editors, or, for social networking sites, a good site builder that doesn’t allow the user to insert their own HTML. Facebook has done this, resulting in a site with a good design that looks consistent – far better than MySpace.

