One problem which must be faced by any cooperative project is that of
quality management. If anybody can contribute to a work, how can a project
ensure that its output is up to the standards it has set for itself? A
panel session on the topic highlighted three very different approaches.
Ulrich Pöschl, a researcher at the Max Planck Institute for Chemistry, is
trying to address a number of problems with the scientific publishing
world. Publication is crucial to scientists - it is, in the end, the one
concrete result from their work which matters. But the process to
publication is long and frustrating, and can often be hampered by personal
agendas and scientific conservatism. Your editor, who, in a previous life,
actually published a paper in a refereed journal, can attest to what a
painful process it can be. There are also problems with scientific fraud
and (much more often) plain old carelessness. Scientists, in their rush to
get their work out, will often not take the time to produce work of the
needed quality. Quite a few papers are published which contribute little
and actually dilute the pool of scientific knowledge.
On the other side, scientific journals are tremendously expensive, and they
publish last year's work. There are a lot of pressures for faster - and
more open - access to scientific results. It seems that a more open
approach would benefit everybody, but only if the quality level can be
maintained.
Ulrich is a founder of a relatively new journal (Atmospheric Chemistry
and Physics) which has set out to demonstrate a
new approach to scientific publication. This journal has retained much of
the classic scientific publication process - every paper is still reviewed by
anonymous referees whose questions must be answered to the editor's
satisfaction. Where things differ is in the openness of the process.
When a paper is submitted, as long as it's not complete junk, it will be
immediately published as a "discussion paper" on the journal's web site. It is
clearly marked as an unreviewed paper, not to be taken as definitive
results at that time. While the referees are reviewing the paper, others
can post comments and questions as well. These others are limited to "registered
scientists," since the desire is to keep the conversation at a high level.
The comments become part of the permanent record stored with the paper, and
they can, at times, be cited by others in their own right. The editor will
consider outside comments when deciding whether the paper is to be accepted
and what revisions are to be required.
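As a rough way to visualize the flow just described, here is a small sketch
modeling a paper's states in this kind of process. It is purely
illustrative - the class and state names are assumptions made for the
example, not anything the journal actually runs:

    # Illustrative model of the review flow described above: a paper is
    # publicly visible as a "discussion paper" while referee review and
    # public comments proceed in parallel, and comments stay on record.
    from enum import Enum, auto

    class Status(Enum):
        SUBMITTED = auto()
        DISCUSSION = auto()   # publicly visible, clearly marked unreviewed
        ACCEPTED = auto()
        REJECTED = auto()

    class Paper:
        def __init__(self, title: str):
            self.title = title
            self.status = Status.SUBMITTED
            self.comments: list[str] = []   # permanent, citable record

        def screen(self, is_junk: bool) -> None:
            # Anything that passes a basic screen is published
            # immediately as a discussion paper.
            self.status = Status.REJECTED if is_junk else Status.DISCUSSION

        def comment(self, registered_scientist: str, text: str) -> None:
            # Only registered scientists may comment while the paper is
            # under discussion; the comments are kept with the paper.
            assert self.status is Status.DISCUSSION
            self.comments.append(f"{registered_scientist}: {text}")

        def decide(self, accept: bool) -> None:
            # The editor weighs referee reports and public comments.
            self.status = Status.ACCEPTED if accept else Status.REJECTED

    paper = Paper("Aerosol nucleation rates")
    paper.screen(is_junk=False)       # immediately visible for discussion
    paper.comment("R. Referee", "Please justify the sampling method.")
    paper.decide(accept=True)         # comments remain on the record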
After using this process for five years, Atmospheric Chemistry and Physics
has the highest level of citations in the field. Citations are important
in the scientific world: they are an indication that a given set of
research results has helped and inspired discoveries elsewhere. The high
level of citations here indicates that this publication process is
succeeding in attracting high-level papers and filtering out the less
valuable ones.
Things are at an early stage - out of approximately 7,000 scientific
journals, about five are currently publishing with this sort of technique.
Others are interested, however, and that number can be expected to grow in
the future.
Martin Haase then took the podium to talk about quality management in
Wikipedia. While Wikipedia is a useful resource, there have been a number
of well-reported problems. Some articles can be flat-out wrong, or,
sometimes, distorted to meet somebody's political goals. Maintaining and
improving Wikipedia's reputation will require getting a handle on these
quality problems.
Some measures being taken by Wikipedia are:
- Putting restrictions on anonymous access. In particular, anonymous
editors cannot create new articles.
- Getting a better handle on attribution of work. Wikipedia maintains
an article editing history now, and has lists of contributors. Some
people, it seems, have been surprised to learn this, and have
changed the style of their contributions afterward.
- A two-level reviewing process. Articles which have been heavily
reviewed and deemed to be correct can be designated as "featured"
articles. This process, however, turns out to be slow, so a new, less
rigorous "good article" designation has been created as well.
- Specific metadata about validation is being added to articles.
- There is a mechanism for creating permanent links to specific
versions of articles. These links can be used by outside sites to
link to a "known good" version of an article with no need to worry
about what subsequent changes could bring; an example of such a link is
shown below.
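As a concrete illustration of such a permanent link - the page title and
revision number below are invented for the example, though the oldid query
parameter is the mechanism MediaWiki really uses to pin a revision:

    # Illustrative only: MediaWiki permanent links select one immutable
    # revision of a page via the "oldid" query parameter.  The title and
    # revision id here are made up.
    permalink = ("https://en.wikipedia.org/w/index.php"
                 "?title=Quality_management&oldid=12345678")

Because the link names a specific revision rather than the live article, it
keeps showing the same text no matter how the article is edited afterward.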
While agreeing that improving the quality of Wikipedia articles will be a
never-ending process, Martin seems to think that the measures being taken
will move things in the right direction. He warned explicitly about
"expertism" - requiring that articles be written by experts in the field.
It can be hard for experts to write articles for people who are unfamiliar
with the field - their work tends to be jargon-heavy and written at the
wrong level. Experts also tend to cluster into schools of thought, and
expert-written articles tend to reflect the views of one school only. Limiting
contributions to experts would, in Mr. Haase's view, rob Wikipedia of much
of its usefulness.
The third panelist, Larry Sanger, disagrees. Larry was a part of the
creation of Wikipedia, but has since fallen out with that project. So,
while claiming to be a "big fan of Wikipedia," he spent much time
criticizing it. Wikipedia, he says, was meant to be the wild side of
Nupedia; it was never supposed to be the whole thing. With only half of
the original design in place, he says, it is not surprising that things
have gone wrong.
So what has gone wrong? According to Larry, the Wikipedia rules are not
enforced uniformly, leading to lots of abuses. Anonymous editing attracts
trolls and other people whose main purpose is not the creation of a
top-quality encyclopedia. The Wikipedia community is insular and hard to
join. And there is no place for academics, people who are experts in
their field. Wikipedia people may fear expertism, but Larry, instead, is
on a campaign against amateurism. This amateurism, he says, is behind many
of Wikipedia's problems, but the community will not recognize them and,
thus, will never fix them.
So Larry is going to fork Wikipedia. His project, called The Citizendium, will, he says, be very
different. It will start out very much the same, however: the same
software, and copies of all the Wikipedia articles. Those articles will
track changes to their Wikipedia equivalents until they are changed
locally, at which point they will become a hard fork. There are no plans
to fork the software. In essence, the Citizendium intends to make full use
of Wikipedia's free licensing (as is its right) to bootstrap the new site,
and only move away from Wikipedia content when and where it feels it has
something better to offer.
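To make that mechanism concrete, here is a minimal sketch of the "mirror
until locally edited" logic described above. It is an illustration only -
the names and data structures are assumptions, not a description of the
Citizendium's actual tooling:

    # Illustrative sketch: each article mirrors its Wikipedia equivalent
    # until the first local edit, which turns it into a hard fork that no
    # longer follows upstream changes.
    from dataclasses import dataclass

    @dataclass
    class Article:
        title: str
        text: str
        forked: bool = False   # set once the article is edited locally

    def sync_from_upstream(article: Article, upstream_text: str) -> None:
        """Track the Wikipedia version, but only while unforked."""
        if not article.forked:
            article.text = upstream_text

    def edit_locally(article: Article, new_text: str) -> None:
        """The first local edit turns the mirror copy into a hard fork."""
        article.text = new_text
        article.forked = True

    # The article follows upstream until the first local change:
    a = Article("Quality management", "upstream revision 1")
    sync_from_upstream(a, "upstream revision 2")   # still mirrors Wikipedia
    edit_locally(a, "locally improved text")       # hard fork from here on
    sync_from_upstream(a, "upstream revision 3")   # ignored
    assert a.text == "locally improved text"

The design point is simply that the fork is lazy: nothing diverges from
Wikipedia until somebody decides they have something better to offer.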
There will be some distinct roles for members of the Citizendium project. People who
are deemed to be sufficiently expert in a given field will be called
"editors"; regular contributors will be expected to defer to the editors in
their field of expertise. These editors will be self-selecting, but they
must publicly state their credentials. Editors can mark an article as
being "approved," indicating that, in their opinion, it has reached a
certain level of quality.
There will be no anonymous editing allowed in the Citizendium, and no
pseudonyms either. All contributors must work under their own names.
There will be a number of rules on how contributors and editors are
supposed to work, with quick expulsion from the project for those who do
not follow them. To that end, there will also be "constables," whose job
is to enforce these rules.
There are vague plans for a meeting to draft and approve the charter under which
the project operates. For now, however, the Citizendium is very much Larry
Sanger's project, with goals and processes set by him. Whether it will be
able to build a community and maintain it while keeping quality high
remains to be seen.