The votes are
in, with Microsoft's Office Open XML (OOXML) format gaining
international standard status. Both Microsoft and Ecma
International jumped the gun a bit by proclaiming victory a day before
the official announcement, but the writing was on the wall since the
balloting closed on March 29. There are now two competing standards for
office document formats that have been approved by the International
Organization for Standardization (ISO): OOXML and the Open Document Format (ODF).
The most recent vote was an opportunity for the national bodies to
change their vote from September based on the outcome of the Ballot
Resolution Meeting (BRM). The September vote was relatively close but
OOXML did not pass, which led Ecma and Microsoft to try to address the
3,500 comments (1,000+ after eliminating duplicates) made by participating
countries. The comments and the Microsoft/Ecma solutions to them were
discussed during the five-day BRM in Geneva in late February.
When the BRM was announced, many wondered how that number of comments could be handled
in a week-long meeting; unfortunately, the answer is: not very well. There
was simply too much to cover, so the majority of comments—mostly substantive
issues with OOXML—didn't get discussed and were voted on en
masse. The majority of participants abstained (18) or failed to vote (4), with six
voting to accept the changes proposed by Microsoft/Ecma and four voting
against. This allowed the BRM process to complete, leaving it up to the
national bodies to decide whether to change their September votes.
The final vote was again fairly close, but a net change of seven votes from
"disapprove" to "approve" moved OOXML into approval. Twenty-four of the 32 votes from
Participating countries were for approval, which is beyond the
two-thirds majority required. Also, 86% of the Observing countries voted
to approve, which is above the 75% required. In both cases, abstentions
are not counted.
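The arithmetic behind those thresholds is easy to check; here is a minimal
sketch, using the vote counts reported above (the helper function is purely
illustrative):

    # Checking the approval thresholds described above. Abstentions
    # are excluded, so only approve/disapprove votes are counted.

    def approval_share(approve: int, disapprove: int) -> float:
        """Fraction of the counted votes cast in favor."""
        return approve / (approve + disapprove)

    # Participating countries: 24 of the 32 counted votes were in
    # favor, against a required two-thirds majority.
    p_share = approval_share(24, 32 - 24)
    print(f"P-members: {p_share:.1%} in favor (need 66.7%)")  # 75.0%
    assert p_share >= 2 / 3

    # Observing countries: 86% reported in favor, above the 75% bar.
    assert 0.86 >= 0.75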
At some level, the outcome should not be surprising. Microsoft put a huge
effort into ensuring OOXML standardization. Some would claim that they
"gamed" the system—and it's pretty clear they did—but what's less clear
is why, and what they plan to do next. Their tactics have been questionable,
which leads many to believe they have an ulterior motive.
To start with, Ecma International essentially rubber-stamped a
"specification" that Microsoft presented as ECMA-376.
Then it was introduced to ISO on the "fast-track" process, which is meant
for mature standards that have few gray areas or controversial parts.
Whatever else can be said of OOXML, nearly anyone who is not firmly in the
Microsoft camp can see that it is in no way mature, clear, or
non-controversial—it is flawed at multiple levels.
One of the most puzzling things about the process is how we have ended up
with two standards. In general, standards are supposed to be, well,
standard, allowing multiple implementations that use the standard,
but innovate in other areas. HTML and HTTP are standards, whereas Firefox, Safari,
Konqueror, Opera, and Internet Explorer all implement those
standards—some more faithfully than others—but
provide different sets of features on top. Microsoft's argument for
multiple standards is a
disingenuous one: choice.
It would seem that Microsoft wants to paint this as a VHS vs. Betamax
battle, where the consumer is able to choose the one best suited for their
needs. But, both of the video recording standards were proprietary, with
many arguing that the technically inferior choice "won". Microsoft is, of
course, no stranger to having its choices—again arguably technically
inferior and generally pushed through its near-monopoly on the desktop—come out on top.
One might be able to argue that competition between the standards is
consumer-friendly if there is a level playing field. In order for that to
happen, Microsoft would have to implement and deploy the competitive
standard—something it has clearly said it will not do. It is hard to
see how customers are going to be able to determine which of the two
formats is "better" when most of them will only be given one choice.
Many also fear that free software (and other non-Microsoft proprietary)
implementations of the standard will not be fully interoperable with the de
facto standard because of specification inadequacies or patents. Many, including ODF editor Patrick Durusau,
have called for OOXML to be passed so that it can be clarified. Setting
aside the obvious cart-before-the-horse problem, standards bodies are
notoriously slow—the "fast-track" approval of OOXML has taken more than a
year, for example—so expecting that clarifications can be
made through that process is somewhat alarming. More likely, changes will
be made in the format emitted by various Microsoft products and then
shoehorned into the standard some months or years later.
The claim that billions of documents exist in OOXML, and that it should
therefore be adopted, is particularly galling. There is no
OOXML standard yet—the final document has not yet been produced—but
that is a minor issue. The fact is that even though a form of OOXML is
available in recent Microsoft products, it is not the default and most
documents have not been stored using it. The billions of documents are
mostly stored in various versions of the proprietary DOC format that non-Microsoft users have
been struggling to read for years.
The opponents of OOXML had their own share of misbehavior during this
process. It is pretty unlikely that everyone who favored OOXML passage is
in the pay of Microsoft, for example. The doom and gloom predictions of
what will happen have sometimes been over the top as well. Free software
is not about restricting choices—if folks want to store
documents in OOXML, that is their decision.
So, what will happen to ODF? To many it looks like a truly vendor-neutral
standard—warts and all—will be shoved aside by a truly
vendor-specific one. Andy Updegrove, who has followed this process closely
and fairly objectively in his weblog, sees
things a bit differently. There is still a long way to go before OOXML
supplants ODF, if it ever does, according to Updegrove:
That answer is this: if anyone had asked me to predict in August of 2005
(the date of the initial Massachusetts decision that set the ODF ball
rolling) how far ODF might go and what impact it might have, I would never
have guessed that it would have gone so far, and had such impact, in so
short a period of time. I think it's safe to say that whatever happens
with the OOXML vote is likely to have little true impact at all on the
future success of ODF compliant products.
It is possible that Microsoft is changing its ways, but longtime Microsoft
watchers, especially those who have been harmed by their tactics in the
past, remain skeptical. One would guess Microsoft will be on its best
behavior for the next two months while objections to the approval can still
be raised. After that, we will see—over time—whether this is
yet another lock-in play or whether they wish to play fair in the
document storage arena. Every move they make will be closely scrutinized; there
are risks to reverting to their previous behaviors. But, if we end up with a
truly open standard, free of patent nonsense, and implementable by all, it
doesn't really matter whether it is OOXML or ODF.
Once upon a time, there were no usable free web browsers for the Linux
environment; the binary-only Netscape releases were all that was available.
For many, the solution to the problem was to be found in the release of the
Netscape source code; some years later, we got the Mozilla and Firefox
browsers (based on the Gecko rendering engine) from this work. The KDE
project, though, took a different route in the late 1990s, developing the
KHTML renderer to use with the Konqueror application.
A few years later, Apple surprised the world by selecting KHTML as the base
for its Safari browser, despite the fact that Gecko was more widely
deployed. What followed was essentially a fork of KHTML and some bad blood
between Apple and the KDE project. Over time, the two sides have come to a
better understanding, but KHTML and Apple's version (WebKit) have remained separate. The
existence of two KHTML forks may not last that much longer, though, and
some interesting things appear to be happening.
One of those things is that Konqueror is slowly being moved over to WebKit
as its rendering engine. The decision to go in this direction was made at
the 2007 Akademy gathering, and work has been proceeding ever since.
Current Ubuntu development releases include a preview version of Konqueror
on WebKit. Work can be expected to continue in this direction, with the
result that KHTML will slowly lose its prominence in the KDE project. The
fork, in other words, is beginning to join, with the resulting software
being called "WebKit." [Update: as can be seen in the comments, this
paragraph overstated the case somewhat. Things might end up as
described here, but that is not the case now.]
Meanwhile, it seems that people are actually starting to use Safari, to the
point that web designers are thinking that they should actually test their
sites with it. For what it's worth, Safari currently accounts for just
over 3% of visits to LWN.net - relatively small compared to Firefox (over
60%), but, when added to Konqueror's 4.5%, the combined total is half of Internet
Explorer's 15% share. One can argue that the mix of browsers used by LWN
readers is not typical of the net as a whole, but, even so, it looks like
WebKit-based browsers just might become a
significant part of the Internet's software base.
The story does not stop there, though.
When a GNOME project announces, on April 1, that it is moving over to a
major component which came from the KDE camp, one can be forgiven for not
taking it seriously. But it would appear that this announcement from the Epiphany
developers, stating that they are moving to WebKit as their sole rendering
engine, is the real thing. Epiphany, remember, is
the closest thing that GNOME has to an official web browser; it has users
who swear by its better integration with the GNOME desktop. But Epiphany
has always been based on the Gecko engine, and it seems that not a whole
lot of users have seen reasons to stick with it over Firefox, which
provides rather more functionality on the same engine. Epiphany is not a
big force in the browser arena currently.
Last year, the Epiphany developers added an abstraction layer which allowed
the browser to operate over multiple rendering engines, including WebKit.
Now they have decided to take that layer back out and to support just one
rendering engine: WebKit. The development team cites a number of reasons
for moving away from Gecko, including release-cycle mismatches, a feature
set which is driven by a competing project, and a lack of attention being
paid to the Gecko/GTK embedding API. Gecko, they have decided, is not the
best fit for Epiphany.
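For those unfamiliar with the pattern, the sketch below shows roughly what
such an abstraction layer looks like; it is purely illustrative, with
hypothetical names throughout (Epiphany itself is C code built against the
Gecko and WebKit embedding APIs, not Python):

    # An illustrative rendering-engine abstraction layer; none of
    # these names come from Epiphany's actual code.

    from abc import ABC, abstractmethod

    class RenderingEngine(ABC):
        """Engine-neutral interface the browser shell codes against."""

        @abstractmethod
        def load_url(self, url: str) -> None: ...

    class GeckoEngine(RenderingEngine):
        def load_url(self, url: str) -> None:
            print(f"[gecko] rendering {url}")

    class WebKitEngine(RenderingEngine):
        def load_url(self, url: str) -> None:
            print(f"[webkit] rendering {url}")

    class BrowserShell:
        """The UI layer never knows which engine it is driving."""

        def __init__(self, engine: RenderingEngine) -> None:
            self.engine = engine

        def open(self, url: str) -> None:
            self.engine.load_url(url)

    BrowserShell(WebKitEngine()).open("https://lwn.net/")

Dropping the layer lets the shell call WebKit's API directly, trading
engine portability for simpler code and access to engine-specific
features - which appears to be the trade the Epiphany team has made.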
WebKit, instead, was designed for embedding - the WebKit project's goals explicitly
rule out building a browser of its own - and the GNOME
API is said to work very nicely. WebKit in GNOME uses technologies like
Cairo and Pango, like many other GNOME applications. Overall, the Epiphany
team feels like WebKit is a better match for what they are trying to do -
and they suggest that a number of other GNOME projects move in that
direction as well. The initial response from other GNOME participants
appears to be positive, with the exception of some concerns about
accessibility support in WebKit - concerns which, presumably, can be addressed.
The GNOME/KDE flame wars, happily, are some years behind us. Developers
from both projects are more interested in cooperation these days, but, so
far, much of that cooperation has been around relatively small, low-level
components. An HTML rendering engine is not a small, low-level component,
though. If both projects seriously work toward the improvement of WebKit,
they will have started an era of rather higher cooperation than has been
seen in the past. If this cooperation holds together, it can only be to
the benefit of both projects, and to all other users of WebKit as well.
The Gecko engine is good code and a highly successful project. But it is
also controlled by a company (Mozilla Corporation) whose agenda, beneficial
though it may be, does not include the creation of successful competing
browsers. So it's not entirely surprising that Gecko has not proved to be
entirely suitable for groups trying to create those competing browsers.
WebKit, at the outset, looks like it is better suited to this task. The
WebKit project has expressed interest in
working with GNOME; there might just be a productive partnership in the works.
But it's worth remembering that WebKit, too, is a project developed by a
company with its own objectives, few of which make any mention of turning
2009 into the real year of the Linux desktop. For now, though,
WebKit has the look of a project with all the right attributes: real
independence, merit-based access to the source repository, no requirement
for copyright assignments, reasonable licensing, and the right goals. It
may well be positioned to become a core component in the Linux desktop.
Free operating systems differ from the proprietary variety in a number of
ways. One of the differences which is most evident to all users is in the
provision of device drivers. With free systems, device drivers are free
software, provided with the system itself. Proprietary systems tend to
provide relatively few drivers; instead, proprietary drivers are shipped
with the hardware itself and installed separately. Anybody who wonders
about which model works better would be well advised to look at the events of
March 28, when Creative Labs shut down
an outside developer who had been working to improve Creative's drivers.
Creative is, of course, a long-time manufacturer of audio hardware.
Opinions vary on the quality of that hardware, but there can be no doubt
that Creative has been successful in this market. Creative's customers
have found, though, that moving to Vista has been an unusually painful
experience, even by the standards of that particular system. It seems that
Creative's drivers have failed to provide the same level of functionality
found in previous versions, leaving customers with crippled hardware.
Strangely enough, said customers have not been entirely pleased with this
state of affairs.
Enter a developer called "Daniel_K". Daniel took the time to figure out
how the hardware worked and to patch Creative's drivers to, once again,
provide access to the full capability of the hardware. He then made those
drivers available to others. Creative hardware owners were happy about
this: somebody had finally managed to solve the problems they had been
complaining about. One would have expected Creative to be happy too; happy
customers tend to be good for business.
That's not the way of it, though. Instead, Creative removed links to the
fixed drivers from its forums and posted a public cease-and-desist letter.
According to Creative's Phil O'Shaughnessy:
By enabling our technology and IP to run on sound cards for which
it was not originally offered or intended, you are in effect,
stealing our goods. When you solicit donations for providing
packages like this, you are profiting from something that you do
not own. If we choose to develop and provide host-based processing
features with certain sound cards and not others, that is a
business decision that only we have the right to make.
There can be little doubt that Creative is operating within its legal
rights here. It has retained proprietary rights to its driver software,
and it has imposed the usual sort of "thou shalt not reverse engineer" EULA
on its users. So, while Daniel_K may (or may not) have been able to
legally reverse engineer the driver (depending on his location), he almost
certainly did not have the right to redistribute modified versions of
Creative's drivers. Asking for donations to help him continue this
activity will not have made him any friends at Creative either. When
dealing with other peoples' proprietary software in this manner, one should
not be surprised to get shutdown notices.
Creative may be on solid ground legally, but it still makes sense to look
at what is going on here. One might have attributed the driver problems to
a lack of competence at Creative, or, perhaps, to the general sort of
misery that (your editor has heard) goes along with Vista. Instead,
Creative's crippled drivers were the result of a "business decision."
Rather than allow its customers to get the most out of the hardware they
thought they owned, Creative decided to restrict that functionality,
presumably as a way of motivating those customers to buy newer, shinier,
better-supported hardware. Daniel_K, by making Creative's customers
happier, was threatening Creative's chosen business strategy.
Now consider a company whose hardware is supported by free drivers. That
company lacks the ability to use crippled drivers as a tool to "encourage"
customers to replace their hardware. Instead, that company has every
incentive to provide the best hardware possible and to ensure that said
hardware works to its fullest capability. Such a company would welcome an
outsider who made their products work better; those outsiders would be more
likely to receive job offers than cease-and-desist letters. Rather than
calling out the lawyers, this company could focus on the business of being
a hardware company.
Your editor knows which sort of company he would (and does) choose to buy
hardware from. Free drivers are not just a path toward higher-quality
support, though that is typically the result. They are not just a way to
help ensure that the kernel as a whole remains stable and debuggable. And
free drivers are not just a way to help ensure that all can learn and benefit
from the work which was done to get the hardware working. They are also a
way to avoid the threat of manipulation by hardware vendors who have
decided that providing the best value for customers is no longer a winning
business strategy. That is a sort of freedom which is worth having.