"When can we start using CSS1?" is an equally valid question, one
we've only just recently started to see the answer to. CSS1 came
out in 1996, 6 years ago, and the most popular browser (WinIE) is only
now starting to show an inkling of the start of the beginning of a
hint of something that might be described as semi-solid support for
CSS1. So let's assume it takes as long for us to be able to use XHTML:
6 years. XHTML came out in 2000. So that gives an estimated answer of: 2006.
Of course, the difference between CSS1 and XHTML is that you can't
use parts of XHTML, it's an all (text/xml) or nothing
(text/html) deal. So in practice it'll be longer, since
you won't be able to rely on backwards compatibility tricks like serving XHTML as text/html.
If by serving content with a text/html header I can
create pages in future-proof, easily parsable XHTML and have them
accessible to virtually every browser ever [released], XHTML it shall
be. Then when browser support catches up I'll be ready, and so will my site.
But you won't be ready. Your
site doesn't validate. When you switch to sending your site as
text/xml then boom, your readers will just get XML
validation errors. In any case, there's no reason to switch
legacy documents to new formats. Do you see anyone going out of their
way to change old "HTML 3.2" documents (specifically, the tag soup
mess that they claimed was HTML 3.2) to HTML 4.01 Strict? And do you
think browsers are going to stop supporting text/html any time soon?
Some of his points seem overly picky [...] I would not consider any of them to significantly dilute WaSP's message.
The message may not be diluted... I simply think it is very bad form for a group that is claiming to champion the standards to be making any mistakes at all on their site.
I checked a site Hixie mentions as sending the correct text/xml content-type header in NS4 and, as I suspected, NS4 popped up a "download" box and failed to render the page.
Which is correct behaviour, since Netscape releases prior to 6.0 didn't support XML. In fact, IE doesn't support XML either (check the site and see if it looks like a web page to you). This is why using XHTML at this time is inappropriate! You don't see people writing Web pages using XLink, or MathML, or SVG, and there is a good reason for that: Web browsers don't yet support those specs! So why do people insist on using the equally unsupported XHTML? (Of course the question Tantek would have us ask is "Why are those specs not still in CR?".)
I also checked out the W3C's XHTML home page - XHTML 1.0 Strict and a content-type header of text/html.
The W3C site is a terrible example of good standards compliance. They have trailing vertical bars in their <link> element titles, they abuse the <address> element, and worst of all, they use a table for layout purposes.
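For what it's worth, the table-for-layout complaint has a straightforward CSS alternative. A minimal sketch (the class names are invented for illustration; this is not the W3C site's actual markup):

```css
/* Two-column layout without a layout table: float the navigation
   column, and give the main content a matching left margin. */
.nav     { float: left; width: 12em; }
.content { margin-left: 13em; }
```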
So far I've had three prolonged looks of consternation, one "fucking hell", one "it's about time", one "Hixie!", one "I almost walked past you", and one "Something's different... I just can't quite put my finger on it".
The Web Standards
Project recently unveiled themselves again. This time their
message seems aimed more at Web authors and Web users than Web browser vendors.
Isn't it sad, then, to see that their own Web site has several
common errors? Sure, it validates, but that's only because the
validators are merely
checking that the documents follow the relevant DTDs, not that
they are following all the rules of the specs. (In their defence, this
site is better than most. However, given their message, they should be beyond reproach.)
Let's take a look at the www.webstandards.org markup.
The first thing we notice is that it is sent as
text/html, but it contains XHTML. While that's
technically allowed by the specs, it makes
no sense... XHTML and HTML are technically incompatible. (For example,
<br/> in XHTML is actually equivalent to
<br>> in HTML.) What really makes no sense
is that the whole point of XML (the basis of XHTML) was that it should
avoid the mistakes that tag soup created... all XHTML markup on the
web should be valid. But if you send XHTML as text/html,
then browsers will treat it as plain old HTML (i.e. tag soup) and not
complain about errors! You can see this demonstrated very well by mpt's Web log (see what
the validator makes of it). XHTML is great; but if you use it,
sending it as text/html totally defeats the point. (It's
XML. Send it as text/xml. Like this site.)
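For instance, on Apache this is one way to do it; a sketch only — the file extension and the use of .htaccess are assumptions for the example, not how any particular site is configured:

```apache
# Hypothetical .htaccess fragment: serve .xhtml files as XML, so that
# browsers use a real XML parser and report errors instead of
# sympathetically rendering tag soup.
AddType text/xml .xhtml
```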
The next thing we notice about the WaSP's markup is their CSS.
It contains mistakes such as not setting backgrounds and colours
together, setting hover rules for anchors as well as links (David
Baron has written a nice
page explaining this issue), and positioning blocks using pixels
instead of ems or percentages, which is why the site fails at unusual font sizes.
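To illustrate, here is a sketch of the patterns being recommended (the selectors and values are invented for the example):

```css
/* Always set colour and background together, so a reader's own
   colour preferences can't combine with yours into unreadable text. */
a:link, a:visited { color: #036; background: white; }

/* Apply hover effects to links and visited links specifically,
   rather than a bare a:hover, which also matches named anchors. */
a:link:hover, a:visited:hover { color: white; background: #036; }

/* Size boxes in ems so the layout scales with the font size. */
#sidebar { width: 14em; }
```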
Back to the XHTML markup, we start noticing some much more serious mistakes.
Rather inappropriate alternate text... the image doesn't say "Web
Standards Project Logo", it says "Web Standards Project".
Below that we find a paragraph of text that is left semantically
neutral (it should be wrapped in a <p>
element). Then, only a few lines lower, we find presentational classes:
<div class="padder">. That's followed by an
<h1> just above an <h3>,
skipping an entire level of headings (<h2>). We're
also faced with a bunch of empty paragraphs, which makes no semantic sense.
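By way of contrast, a hedged sketch of what structurally sound markup for those fragments might look like (the filenames and text are invented):

```html
<!-- Alternate text is what the image says, not a description of it: -->
<img src="logo.gif" alt="Web Standards Project">

<!-- Body text belongs in a paragraph element, not bare in a <div>: -->
<p>Some introductory text, semantically marked up.</p>

<!-- Heading levels nest without skipping: -->
<h1>The Web Standards Project</h1>
<h2>Our Mission</h2>

<!-- Vertical whitespace comes from CSS margins, not empty <p> elements: -->
<p style="margin-top: 2em">The next paragraph.</p>
```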
In conclusion, while the message is the right one, the messenger is
confused. Standards Compliance is a laudable goal, but it's not blind
obedience to the validator
which will bring us that; it's structurally correct markup with well
written stylesheets, using appropriate technologies at appropriate times.
Nadia points out the main reason why humanitarian eugenics wouldn't work in practice: it can't be policed effectively without going beyond what people would accept. In an ideal society, we could get everyone to buy in to the system, but in our society...
Well, I guess our hopes are all pinned on genetic engineering then.