loge.hixie.ch

Hixie's Natural Log

2008-03-26 21:00 UTC Tests that are never quite finished

Since we announced the Acid3 test a few short weeks ago, two major browser vendors have been tripping over themselves trying to pass the test, with both Opera and Safari getting very close in the last few hours.

Of course, with a test as complex as this one, I was bound to make some mistakes, which the browser vendors were very quick to tell me about. Sometimes they were mistaken, and the test was actually correct; other times I was wrong, and had to fix the test, and in one case, the spec changed.

Here's a quick overview of the more major changes I made to the test. Luckily, none of the errors were too serious.

Sub-pixel testing
It turns out that the original test accidentally required that browsers implement sub-pixel positioning and layout (and in fact the reference rendering got it wrong too, and relied on the same kind of rounding as Firefox does), which is somewhat dubious. I've changed the test to not rely on sub-pixel layout. However, it is very likely that this will be tested in Acid4, if we can get the specs to be clearer on this.
Surrogate pairs in SVG APIs
One of the submitted tests assumed that SVG APIs worked on Unicode codepoints, but the SVG spec changed to work on UTF-16 codepoints, like the rest of the DOM API, so the test was changed there. (The test changed a couple of times, because I originally got the fix wrong.)
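To make the distinction concrete, here is a plain JavaScript illustration (no SVG needed) of how a single Unicode codepoint outside the Basic Multilingual Plane becomes two UTF-16 code units, which is what DOM-style APIs count:

```javascript
// U+1D11E MUSICAL SYMBOL G CLEF lies outside the Basic Multilingual Plane,
// so in a JavaScript (UTF-16) string it occupies two code units: a
// surrogate pair.
var clef = '\uD834\uDD1E';

clef.length;         // 2: one codepoint, two UTF-16 code units
clef.charCodeAt(0);  // 0xD834, the high (lead) surrogate
clef.charCodeAt(1);  // 0xDD1E, the low (trail) surrogate
```

An API defined on codepoints would report a length of 1 for this string; one defined on UTF-16 code units, like the DOM, reports 2.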
The click() method
The test originally assumed that the click() method was reentrant, but the specs were vague on this and someone suggested making it fail if calls to it were nested, so I removed this part of the test (the spec hasn't been updated yet). I replaced it with an attribute test (the new second part of subtest 64).
The Performance Test
I made the loop counter in the performance test (a part of subtest 63) less complicated and shorter, to make it at least plausible that browsers could be fixed to pass that test quickly enough that it wouldn't always feel jerky. At the same time, I updated the test's infrastructure to report more details about pass and fail conditions and how long each subtest takes to run.
Namespace bug
Someone noticed that http://www.w3.org/1998/XML/namespace should have been http://www.w3.org/XML/1998/namespace in one of the subtests.
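For reference, the correct namespace name, with its easy-to-reverse path segments, next to the buggy one (variable names here are mine, for illustration):

```javascript
// The XML namespace name is fixed by the "Namespaces in XML" spec;
// note the segment order: XML before 1998.
var XML_NAMESPACE = 'http://www.w3.org/XML/1998/namespace'; // correct
var WRONG_ORDER   = 'http://www.w3.org/1998/XML/namespace'; // the bug
```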
Linktest timeout
I made the linktest more resilient to slow network conditions. However, the test is still going to give you major issues if you are on a network with multi-second latency, or if the acidtests.org site is being slow.

When we released Acid2, the first browser passed it in about a fortnight. Acid3 is orders of magnitude more complicated. I really didn't expect to see passing browsers this side of August, let alone within a month. I am really impressed by the level of commitment to standards that Opera and the WebKit team (and Apple in particular) are showing here.

Pingbacks: 1 2 3 4 5 6

2008-03-04 01:19 UTC Moebius

After baking it for several weeks, we have finally decided Acid3 is stable enough to announce it is ready. We'll be working on a guide and other commentary in the coming weeks and months, but it'll take a while — Acid3 is far more complex than Acid2 was.

I have to say straight up that I've been really impressed with the WebKit team. Even before the test was finished, they were actively following up every single bug the test showed. Safari 3 (the last released version) scores 39/100 with bad rendering errors. At the start of last month their nightly builds were scoring about 60/100 with serious rendering errors. Now barely a month later the nightly builds are already up to 87/100 and most of the rendering errors are fixed. That's a serious testament to their commitment to standards. (Also, I have to say, it was quite difficult to find standards compliance bugs in WebKit to use in the test. I had to go the extra mile to get WebKit to score low! This was not the case with most of the other browsers.)

Speaking of standards, and of good news from browser development teams, Microsoft's IE team announced today that they were changing their mind about their mode switch, and that bug fixes they make to their rendering engine will be applied to pages in what HTML5 calls "no-quirks" mode (what has historically been known as "standards mode"). I'd like to congratulate the IE team on this brave decision. It's the right thing for the Web.

Meanwhile, HTML5 continues to make good progress. I have a page now which shows my progress in replying to e-mails, and as you can see from the changelog tracker, checkins are as fast and furious as ever.

Pingbacks: 1 2 3 4 5 6 7 8

2008-01-23 09:31 UTC Mistakes, Sadness, Regret

There were several announcements yesterday.

First: The W3C finally got around to publishing one of the drafts of the HTML5 spec as a "first public working draft" (a misnomer if ever there was one, given the history of HTML5). This is news for one reason and one reason only: it starts the W3C patent policy clock, which means that within a few months we will know if any of the browser vendors have patents they wish to use to block progress on the Web. Still, a step forwards.

Second: The syntax for Microsoft's plans for yet another quirks mode switch in IE8 was announced. Basically, they are offering authors the option to pick a specific set of bugs, defaulting to IE7's. The idea is that with each new major IE version, they can decide to simply freeze their last set of bugs forever.

If Web authors actually use this feature, and if IE doesn't keep losing market share, then eventually this will cause serious problems for IE's competitors — instead of just having to contend with reverse-engineering IE's quirks mode and making the specs compatible with IE's standards mode, the other browser vendors are going to have to reverse-engineer every major IE browser version, and end up implementing these same bug modes themselves. It might actually be quite an effective way of dramatically increasing the costs of entering or competing in the browser market. (This is what we call "anti-competitive", or "evil".)

It will also increase the complexity of authoring by an order of magnitude. Big sites will become locked in to particular IE version numbers, unable to upgrade their content for fear of it breaking. Imagine in 18 years — only twice the current lifetime of the Web! — designers will have to learn not just HTML, but 4, 5, maybe 10 different versions of HTML, DOM, CSS, and JS, just to be able to maintain the various pages that people have written, as they move from job to job.

I mentioned all this on public-html last April, pointing out that if Microsoft was really interested in not breaking the Web they would instead send technical comments on the HTML5 spec, explaining what exactly the spec requires that is incompatible with deployed content. (Microsoft have yet to send any technical feedback on HTML5 at all, 8 months later.)

I've been shocked to see several people in the industry supporting this move, including people whom I have met personally and who I know know better, but I'm in no position to criticise people who should have known better. I'm glad to see that at least people from Opera, from Mozilla (continued), from WebKit, and even from the WaSP, and from the community as a whole (if the comments on Slashdot and other sites are representative) see this for what it is.

There are several directions we can go in from here. We could prematurely claim that our pages want IE8's new quirks mode, which would likely cause pages to change rendering in IE8 as compared to IE7. However, that's mostly a futile exercise, as either few people will do this, and it won't matter, or a lot of people will do this, and Microsoft will just change the syntax before IE8 comes out.

We could use the "edge" feature that is apparently being provided to always trigger the latest mode. However, that would be even worse: if enough people use this feature, then Microsoft will feel compelled to just make "edge" equivalent to "IE8" and introduce a new way to trigger the "latest" mode.

This is what happened with the original DOCTYPE-switched "standards mode": many people wanted their pages to use the latest standards, but found that they ran into bugs in IE6, so they worked around those bugs with hacks — which relied on simple parsing bugs — to pass content specifically to IE6. With IE7, many of the simple parsing bugs that these hacks relied on were fixed, but the bigger underlying problems that the hacks were working around weren't. The pages thus broke in IE7. Microsoft's reaction was to make "standards mode" mean "IE7" mode, and to introduce a new mode ("IE8"). The same will happen if so many people use "edge" that a significant number of pages break in IE8 or IE9.

We could encourage everyone to actually use this feature, but then we're just contributing to making the syntax legitimate, and we end up fragmenting the Web on Microsoft's terms. This is possibly the worst outcome.

Finally, we could just ignore the feature altogether, continuing to use JS compatibility libraries for the time being, the same way that everyone has been doing for years. Authors would have to support IE7 anyway, at least for the foreseeable future, so it wouldn't be an additional cost. This would make the experience on IE worse than on other browsers, but not so much worse as to turn users away from the sites themselves. Microsoft would find themselves having to support a number of rendering modes that were not being used, which is not cost-effective; they would find their market share eroding, as customers switched to browsers that actually followed standards; and they would find their developers still asking for standards compliance that was actually compliant with the specs and interoperable with other browsers.

Therefore I recommend not including the meta tag, or, if you are forced to include it, making sure it says "IE=7", even once IE8 ships. This seems to me to be the best way to show your support for an open, interoperable Web in the long term.
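For reference, the announced switch is an http-equiv meta tag. Following the advice above, a page forced to include it would pin itself to the IE7 engine (the exact syntax could still change before IE8 ships):

```html
<!-- The announced opt-in switch; if you must include it at all,
     pin it to IE7 rather than to IE8 or "edge": -->
<meta http-equiv="X-UA-Compatible" content="IE=7">
```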

It will be interesting to see whether IE8 really supports Acid2, since that test page doesn't include any of the special magic words being proposed here. Will they hard-code the URI? Will they check every page against a fingerprint and if it matches the fingerprint of the Acid2 page, trigger the IE8 quirks mode instead of the IE7 quirks mode?

Third: Well, not really an announcement, but Sjoerd Visscher pointed out something that for some reason I had never heard of, and which nobody I have spoken to about it since has ever heard of: using document.createElement() with the tag name of an unknown tag causes IE7's HTML parser to start parsing that tag differently. This piece of information makes building an HTML5 compatibility shim for IE7 far easier than had previously been assumed. This is probably the most noteworthy news of the day. I don't really know why I didn't know this before.
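A shim based on this trick might look like the following sketch; the function name and the element list are mine, purely illustrative, not from any actual library:

```javascript
// Sketch of the trick Sjoerd Visscher described: calling
// document.createElement() once per unknown element name, before the
// parser reaches the page content, makes IE7's parser treat those tags
// as known elements that can be styled and given children.
// The names below are hypothetical examples, not a definitive list.
var UNKNOWN_ELEMENTS = ['article', 'aside', 'figure', 'footer',
                        'header', 'nav', 'section'];

function primeParser(doc) {
  for (var i = 0; i < UNKNOWN_ELEMENTS.length; i++) {
    doc.createElement(UNKNOWN_ELEMENTS[i]);
  }
  return UNKNOWN_ELEMENTS.length; // how many elements were registered
}
```

In a real page this would run as early as possible, e.g. `primeParser(document)` in a script at the top of the head, so the registrations happen before the parser encounters the new elements.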

Pingbacks: 1 2 3 4 5 6 7 8 9 10 11 12

2008-01-14 09:01 UTC The competition for you to come up with the best test for Acid3

As many of you will have heard by now, I've been working on the next Acid Test. Acid Tests are a way to encourage browser vendors to focus on interoperability. With the Box Acid Test, Todd Fahrner highlighted the CSS box model, and the resulting interoperability was one of the first big successes of the movement towards having browsers properly implement Web standards. Acid2 tested a broader range of technologies, but primarily it was focused on the static processing of HTML and the static rendering of CSS.

With Acid3, we are focusing on the dynamic side of the Web. I have a work in progress which consists of a few rendering tests and 84 subtests, little functions that test specific things from script. But I'd like to have a round 100. That's where you come in. I'm announcing a competition to fill the last sixteen tests!

You have one week to submit one or more tests that fulfill all the following criteria:

  1. The test must consist of the body of a JavaScript function which returns 5 when the test passes, and which throws an exception otherwise. It doesn't matter what kind of exception.
  2. The test must compile with no syntax errors in Firefox 2, IE 7, Opera 9.25, and Safari 3. (You can use eval() to test things that are related to syntax errors, though.)
  3. The test must not crash any of Firefox 2, IE 7, Opera 9.25, and Safari 3.
  4. The test must fail (throw an exception) in either a Firefox trunk build from January 2008 or a WebKit trunk build from January 2008 (or, ideally, both). (Opera and IE are failing plenty of tests already; I don't want to add more tests that only fail in one of those. Of course, if you find something that fails in Firefox or WebKit and Opera or IE, so much the better.)
  5. The behaviour expected by the test must be justifiable using only standards that were in the Candidate Recommendation stage or better in 2004. This includes JavaScript (ECMAScript 3), many W3C specs, RFCs, etc.
  6. You must be willing to put your test into the public domain. (I don't want us to end up with any copyright problems later!)
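
To illustrate the required shape only: here is a hypothetical submission body, wrapped in a named function for clarity. (Being trivially true in every browser, this particular example would of course fail criterion 4; it just shows the return-5-or-throw convention.)

```javascript
// A hypothetical submission: the body of a function that returns 5 when
// the behaviour under test is correct, and throws an exception otherwise.
// This one checks a bit of ECMAScript 3 behaviour purely for illustration.
function exampleSubmission() {
  // --- submission body starts here ---
  var a = [3, 1, 2].sort();
  if (a.join(',') !== '1,2,3')
    throw new Error('Array.prototype.sort gave ' + a.join(','));
  return 5;
  // --- submission body ends here ---
}
```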

To write your test, you can use the test development console. In fact, I strongly recommend it. To see some example tests, you can look at the source of the current draft.

If you have a test that fulfills all the above conditions, e-mail it to me (ian@hixie.ch), along with a brief justification citing chapter and verse from the relevant specs you are using, and telling me which build of WebKit or Firefox fails the test.

I will then make the 16 final tests from the best submissions, and will put the names of the people who submitted those tests into the JS comments in the test as recognition.

Good luck, and thanks in advance!

Pingbacks: 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24 25 26 27 28 29 30 31 32 33 34 35 36 37 38 39 40 41 42 43 44 45 46 47 48 49 50 51 52 53 54 55 56 57 58 59 60 61 62 63 64 65 66 67 68 69 70 71 72 73 74 75 76 77 78

2007-12-06 12:07 UTC Evolution in the species "companies"

Evolution is evident and well-known in biological species, and can be seen quite easily in artificial environments, but evolution actually applies to any environment with moderately well-defined entities that have moderately measurable properties, where these entities appear and disappear over time.

For example, companies, as a species, are subject to evolution. They are born, they live, they die; and they each have unique properties (different ways that they operate) which determine their characteristics. Actually, the whole of the species "group of people with a common agenda" is subject to evolution, but let's look specifically at companies that try to convince people to use their products or services in the software industry, since that's what I'm familiar with.

For a long time, the software industry consisted primarily of one kind of company. First the company would be founded with an idea, and then the company would get venture capital funding, which would be used to fund software development, the fruits of which would be sold to customers, and the money raised from those sales would fund further development of more software, and the cycle continued.

These companies die when they run out of money. Since venture capital isn't infinite (investors will stop pouring money in when they realise that they are never going to see any of it again), the key is raising money from software sales. There are several ways of ensuring that a company will get money from software sales:

  1. Being the incumbent, with a monopoly position in the market.
  2. Having the best product.
  3. Having the cheapest product.
  4. Making it more expensive for existing users to switch to a competitor's product than to upgrade to the next version of yours (vendor lock-in).

Each of these strategies has a counter-strategy. For example, to counter an incumbent with a monopoly position, a company "just" has to provide a product of equal value. To counter a competitor with the best product, a company "just" has to provide a product that is better. To counter a competitor with a cheaper product, a company "just" has to lower its prices. To counter a competitor whose users would find it more expensive to switch to the company's new product than they would to upgrade to the newer version of the competition's product, the company "just" has to provide a transition path (if the cost of switching vendor would come from having to convert all existing data, for instance, a company could "just" support the competition's file formats).

Of course, that's often easier said than done. A company can't lower its prices to below its operating costs, as doing so would cause the company to eventually run out of money and die. Similarly, making the best product is significantly more difficult than making a mediocre product, and a company can run out of funds while trying.

It turns out, though, that the rules change when the competing companies are unequal. If a company has a lot of money, there are a number of tricks it can play to compete with companies with less money:

  1. Underselling the competition: selling an equivalent product below cost, or giving it away, to starve the smaller company of sales.
  2. Spreading fear, uncertainty, and doubt (FUD) about the competing product.
  3. Using vendor lock-in to make switching to the competitor's product more expensive than staying.
  4. Improving its own product only until the competition is dead, then stopping.

This is where the evolution comes in. If a big company repeatedly kills all the companies that follow the model I've described so far, then only the companies that use different models will survive.

This isn't all theoretical. Microsoft is a big company, and they've played all the tricks above. Many companies in the software industry have died because they failed to make money, and many of those died because Microsoft starved them of that money by using the tricks described above. (Sometimes the use of those tricks has been deemed illegal; other times, not. That's beside the point here.)

What's interesting are the effects that Microsoft's strategies have had from an evolutionary standpoint. When we look at the major companies competing with Microsoft today, we find that none of them actually uses the operating model I described above.

Beating Microsoft by not needing money: Firefox, Apache, and Linux-based operating systems are examples of open source software arising as ways to compete with Microsoft. Microsoft's usual strategies typically don't work with open source software. Microsoft can't undersell open source, as it costs nothing. Furthermore, since the source is independent of the company behind the source, if the company runs out of money another one can simply come along and replace it, continuing from where it left off. Thus, starving the company of money doesn't actually kill the competition; indeed, open source software actually turns this strategy against the big company. It will take a long time, but in due course Microsoft will run out of money if it doesn't make profits, whereas open source projects can continue indefinitely.

Open source is not perfectly safe against the other tactics described above, though. It is still vulnerable to vendor lock-in and FUD, and a better product can still beat it. Projects like Wine help Linux with vendor lock-in by allowing users of Microsoft Windows to switch to Linux more cheaply than if they had to replace all their software; similarly, OpenOffice.org implements file format converters to read Microsoft Office documents; and Samba implements Microsoft's networking protocols to allow a migration to a Linux-based infrastructure without requiring that users switch all their existing infrastructure at the same time.

FUD is heavily used by Microsoft against open source projects (a recent example is this FUD article against Firefox); open source as a development model can mitigate this by leveraging its community to counter such claims.

The biggest difficulty, though, is in creating and maintaining the best product. Nothing especially changed in the browser market in the years just before Firefox 1.0 was released: the market was stagnant after IE6's release, with all the alternatives (Netscape, Opera, etc) being fundamentally not good enough in comparison. Firefox 1.0 was the best product of its time, and that reason, and that reason alone, resulted in its success. All Microsoft have to do to beat Firefox is make a better product, something for which they certainly have the resources. Similarly, all the Linux OS community has to do to beat Windows is create a fundamentally better product from the end user's perspective, while ensuring that the cost of migration is lower than the cost of upgrading Windows.

Microsoft has clearly realised that open source is a new type of competitor, and they just as clearly haven't worked out how to compete with it (which is simply to make a better product). This is probably because they have spent so long as the "big company" that they have forgotten the four ways of making money, and can only remember the tricks for competing with smaller companies.

Microsoft making a superior product wouldn't kill open source, though. It would just make Microsoft money while the open source community and companies themselves developed a better product again. (Just look at open source today: Linux operating systems aren't, from the end user's perspective, fundamentally better than Windows, but Linux OS companies continue with a small but growing share of users.) Thus the technique of improving only until the competition is dead doesn't work on open source competitors. Microsoft would have to work continuously to improve its products to compete with open source.

Beating Microsoft by not allowing vendor lock-in: Google's operating practices differ from those of the typical software vendor in that the main service Google provides is completely devoid of any vendor lock-in potential. Google beat the search engines before it by being better than they were, and Google could easily lose all its users overnight if a much better search engine were to be developed. The only reason Google has a majority market share is that it is the best. Microsoft's usual strategies don't work against this kind of company: they can't lock the users into their alternative, so the users have a truly free choice as to which service to use, Microsoft's or the competition's, and they usually pick the better alternative.

Google has also learnt the zero cost trick: by charging advertisers instead of charging users, Google can get a large market share of users, which is needed to sell to advertisers. Google's search engine users don't care how many advertisers publish ads through Google, so even if Microsoft undercut Google on the advertiser side, it still wouldn't reduce the number of users on the search side. In addition, because the advertisers want to advertise on the site with the users, and because advertisers could just use both advertising systems, undercutting on the advertising side doesn't actually hurt Google. (Making advertising free on Microsoft's network would just lead to advertisers advertising on both networks, which would hurt Microsoft, since they would be footing the bill, but not Google, who would still be making money.)

The FUD strategy depends on the credibility of the company spreading it, but Google is widely trusted, so FUD doesn't work very effectively against Google either.

As noted above, though, there is a simple way in which Microsoft (or any other company) could compete with Google: make a better product. In fact, since Google's entire strategy — intentionally or not — is based on not allowing vendor lock-in, any better product which is also free would almost immediately beat Google. Naturally, Google invests heavily in making sure it continuously improves. (Note that unlike with open source, which could probably continue indefinitely under a superior competitor, it isn't clear that Google actually could survive for long once it lost its users to a competitor.)

Beating Microsoft just by being better: This brings me to my third example, Apple. Apple has died and been reborn several times, as far as I can tell, but its most recent strategy is based almost exclusively on one concept: making the best product for the user, and doing so in several different markets at once. This technique is difficult, because it requires constant high-quality development. However, it is very effective against a big company like Microsoft that has, by and large, stopped relying on quality to compete.

As a corollary, Apple has found an interesting counter-strategy to the big-company strategy of undercutting the competition until it is starved. It sells its products with very high profit margins, and sells these products in a small number of very separate markets. Thus, it doesn't have to sell many units to survive at all, and it doesn't need to sell any units in any one area so long as it sells enough in another. So, for example, Microsoft couldn't compete with the iPod by giving away the Zune: even if most users stopped buying iPods and instead just got the free, or nearly free, Zune, the net effect would simply be that Microsoft would lose lots of money while Apple waited it out with few ill effects.

Apple's switch to Intel processors enabled Boot Camp and products like VMware Fusion, which mitigate the problems of Microsoft's attempts at vendor lock-in. (Apple also plays its own mild games of vendor lock-in, to prevent users from leaving Apple products once they make the switch.)

Apple has, even more than Google, become somewhat immune to Microsoft FUD, purely on the basis of its own credibility. By almost never pre-announcing products, by repeatedly delivering products of high quality, and by a management of the media so adept as to be awe-inspiring, Apple has managed to almost completely neutralise any FUD attempt against them.

But again, there is one way that Apple is vulnerable. If Microsoft were to make a truly superior product, Apple would lose users. However, unless this strategy was applied to all of Apple's products simultaneously, it wouldn't kill Apple, it would only starve one particular part of the company. All Apple would have to do to come back is make a significantly better product again.

Conclusion: The companies that couldn't beat Microsoft have all died, and evolution has resulted in three very different types of companies that are each immune to Microsoft's strategies in their own way. Yet all are still vulnerable to the same thing: a better product.

For the end users, this is a good position for the industry to be in.

Pingbacks: 1 2 3 4