Hixie's Natural Log

2002-06-02 15:01 UTC And in the other corner...

Hyatt continues:

Yes, but at the cost of bloating the Mozilla code size. I would assert that with modern operating systems like OS X and Windows XP, the amount of testing a major subsystem of the OS receives will vastly exceed the amount of testing that a component in Mozilla receives.

I was unfortunately vague. What I meant was that the Mozilla code implementing the subsystem would receive more testing from Mozilla contributors if it were used on all platforms than Mozilla's OS wrapper code would if each platform had its own implementation.

I don't have data on how the performance of Objective C hashtables and strings compares to the Mozilla implementations, but I do know that Omniweb would have to pay the "copy tax" I mentioned in the previous blog to convert from Obj-C structures to C++.

Why would you ever be copying the hash tables out of Gecko into Objective C code? I admit I've got no experience writing a web browser on the Mac, but I don't see why there would have to be that much copying. There isn't much that needs to pass between Gecko and the embedding application -- you pass a URI in, and you get back maybe a title, a site icon, a list of alternate stylesheet names, progress information, security information, and the occasional cookie or HTTP authentication notification. Unless the cutoff has been made at a significantly lower level than I imagine, I don't see the need for huge amounts of copying.
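The narrow surface described above can be sketched as a small listener interface. This is purely illustrative -- the names (`BrowserListener`, `LoadURI`, and so on) are hypothetical and do not reflect Gecko's actual embedding API -- but it shows how little data needs to cross the boundary between the engine and the host application:

```cpp
#include <string>
#include <vector>

// Hypothetical embedder-side listener; illustrative only, not Gecko's real API.
// The engine calls back into the host application with the few pieces of
// information that actually need to cross the boundary.
struct BrowserListener {
    virtual ~BrowserListener() {}
    virtual void OnTitleChange(const std::string& title) = 0;
    virtual void OnProgress(int percent) = 0;
    virtual void OnAlternateStylesheets(const std::vector<std::string>& names) = 0;
};

// A toy "engine" entry point: the embedder passes a URI in, and gets
// notifications back. No large structures are copied across the boundary.
void LoadURI(const std::string& uri, BrowserListener& listener) {
    // ... parsing, layout, and networking would happen here ...
    listener.OnProgress(100);
    listener.OnTitleChange("Example page at " + uri);
    listener.OnAlternateStylesheets({"default", "high-contrast"});
}
```

With an interface shaped like this, the embedder only ever sees a handful of small strings and numbers per page load, regardless of what string or hashtable types the engine uses internally.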

You'd need on the order of 50-60 bugs to cover all of the "deCOMtamination" that should be taking place.

The most important ones are being done first. It is a long-term task, but it is not intractable, and I am sure it is a lot easier to fix the overzealous COM-ification of Gecko than the lack of support for the inline box model, alternate stylesheets, HTTP Link headers, et al.
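To make the point concrete, here is a rough sketch of what deCOMtamination means in practice. The types below are made up for illustration (real XPCOM uses `nsISupports`, `nsresult`, out-parameters, and `QueryInterface`, all omitted here), but the before/after contrast is the essence of it:

```cpp
// Illustrative only: a COM-style interface in the XPCOM spirit, with
// reference counting and virtual dispatch on every call, even for code
// that is only ever used internally.
struct nsISupportsLike {
    virtual ~nsISupportsLike() {}
    virtual void AddRef() = 0;
    virtual void Release() = 0;
};

struct nsIWidthLike : nsISupportsLike {
    virtual int GetWidth() = 0;  // nsresult and out-params omitted for brevity
};

class ComStyleBox : public nsIWidthLike {
    int mRefCnt = 0;
    int mWidth;
public:
    explicit ComStyleBox(int w) : mWidth(w) {}
    void AddRef() override { ++mRefCnt; }
    void Release() override { if (--mRefCnt == 0) delete this; }
    int GetWidth() override { return mWidth; }
};

// After deCOMtamination: a plain concrete class. Calls can be made
// directly (and inlined), with no refcounting ceremony for internal code.
class PlainBox {
    int mWidth;
public:
    explicit PlainBox(int w) : mWidth(w) {}
    int GetWidth() const { return mWidth; }
};
```

The behaviour is identical; what changes is that internal callers no longer pay for virtual dispatch, reference counting, and interface ceremony they never needed.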

[The lack of multithreading] is also a problem that would take months to solve at this point.

Mozilla has been working on Gecko for over four years. A few months of work is a big undertaking, but it is not unheard of.

You could also reduce page load times through judicious use of multi-threading, especially on multi-processor machines.

Multiprocessor machines would benefit a lot from good multithreading, that I cannot deny.
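A minimal sketch of the idea, in modern C++ rather than anything resembling Gecko's actual architecture: fetch a page's subresources on parallel worker threads instead of serially, which is exactly the kind of work a multiprocessor machine can overlap. Everything here (`FetchResource`, the simulated byte counter) is hypothetical.

```cpp
#include <atomic>
#include <string>
#include <thread>
#include <vector>

// Shared counter standing in for real download bookkeeping.
std::atomic<int> gBytesFetched(0);

void FetchResource(const std::string& url) {
    // A real fetcher would do network I/O here; we just simulate work
    // by counting the length of the URL as "bytes fetched".
    gBytesFetched += static_cast<int>(url.size());
}

void LoadPageResources(const std::vector<std::string>& urls) {
    std::vector<std::thread> workers;
    for (const std::string& url : urls)
        workers.emplace_back(FetchResource, url);  // one fetch per thread
    for (std::thread& t : workers)
        t.join();  // wait for all fetches before layout proceeds
}
```

On a single processor this mostly helps by overlapping network waits; on a multiprocessor machine the fetch and decode work itself can run in parallel, which is where the real page-load win would come from.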

Then again, if the Omniweb guys did use Gecko, they'd just end up duplicating Chimera, and competition is healthy, no? :)

I'd rather see Mozilla gain more contributors to help us fix these problems than see the community fracture into work on multiple proprietary rendering engines at the cost of users' freedoms.