The keynote speaker on the second day of linux.conf.au 2007 was Christopher
Blizzard, currently with Red Hat. His topic was "relevance," and, in
particular, the relevance of the free software movement to the rest of the
world.
One way to be relevant is to create top-quality products. There was an
emphasis on the word "product," rather than "project"; Chris was talking
about making things for people. The best products, he says, are those
which genuinely change the way we live. The example he used was cellular
telephones, which have truly changed the ways in which people communicate.
Your editor, often reduced to communicating with his children via text
message, is not convinced that all these changes are for the better, but
the talk did not address this side of things.
The next slide was a marketing shot of the iPhone. Is this a product which
will change how people live? Nobody in the audience was willing to argue
that it was.
Then came Firefox - a project which Chris worked on for some years.
Firefox "makes the web less annoying," and makes a point of respecting its
users, which is important. It's still not clear that Firefox has changed
the way people live, however. Even so, Firefox had some lessons to offer:
- You can't change the web from the back end. No matter how much
good and innovative work is being done on the server side, the
software which controls the user experience will shape the web.
Firefox has been successful because it is "driving from the front,"
and influencing how the users see and work with the net.
- Going direct to users is important; you can't count on others to
distribute your software for you.
- Stick to your core values. The Mozilla project gets significant
amounts of money from its sponsors, but it is unwilling to consider
sponsorships which would require user-hostile changes.
- Have a mission. A project will only produce a great product if it has
a strong idea of what it is trying to accomplish.
How many years, asked Chris, has it been the year of the Linux desktop? Is
Linux relevant for desktop users? In general, his answer was "no." Linux
is showing up in interesting places, however: the Nokia N800, telephones,
and the One Laptop Per Child project.
OLPC, says Chris, truly is a relevant project which will be changing
lives. It has a well-defined mission - providing computing technology in a
way which furthers the education of children in the developing world - and
it is creating a product which furthers that mission. To that end, a
number of interesting innovations have been made; these include the OLPC
display (which, among other things, is readable in full sunlight), the mesh
networking feature, and the ability to power it with a hand-operated
generator. The Sugar user interface also rates high on the list; it has
tossed out much of the standard desktop metaphor in favor of a new design
aimed at the OLPC's target user base.
So, based on this, how should a project make itself relevant? Chris
offered some suggestions:
- Find an important set of clients, and work toward their needs. In the
OLPC example, the clients are developing-world children (or, perhaps,
the governments which represent them).
- Find good designers and trust them. Free software developers are
often dismissive of the need for good design, but you cannot create a
great product without it. Once you have found people who can do this
design, you must trust them, even if their work takes you in
directions which are surprising and unfamiliar.
- Make your product for other people. Doing so requires developing a
certain amount of empathy for the intended clients and getting past
the "itch scratching" mode of development.
A project which follows these guidelines, says Chris, has a good chance of
being relevant well into the future.
Cory Ondrejka, CTO of Linden Lab, has some serious programming credentials. Before joining Linden Lab in late 2000, he worked on US government projects and Nintendo games; as well as writing much of the original core code for Second Life, he also designed the Linden Scripting Language (LSL), and wrote the LSL execution engine. He talks to Glyn Moody about the background to Linden Lab's decision to take the Second Life client open source, how things will work in practice, and what's going to happen server-side.
When did Linden Lab start to think about the possibility of opening up Second Life's source code?
We've been thinking about it fairly seriously for, gosh, nearly three years now. The effort to really get there is something that got kicked off pretty early in 2006.
Was there any particular stimulus at that time?
We started looking at what our residents were doing in preparation for some speaking we did at [O'Reilly's] ETech in March. One of the things that we discovered was that a very large percentage of our residents - something on the order of 15% of people who logged in - were using the scripting language. So you start realising that there are tens of thousands of people at least, probably more like hundreds of thousands at this point, who have written code related to Second Life. And so it seems a little bit silly to not enable that creative horsepower to be applied to our code as well.
Was the decision to open the viewer's code a difficult one?
I think internally, as an organisation, buying into the idea is something that we were able to get to relatively quickly. People sometimes don't realise that the kind of work you have to do to be able to open source is exactly the same work that you're doing to close exploits and fix bugs. It's actually not a separate set of tasks in many ways.
Over 2006 there was also a very active reverse-engineering effort called libsecondlife that has something like 50 or 60 developers on their mailing list. They've been doing a very impressive job of reverse engineering the protocols and figuring out what's going on. They were finding exploits quite regularly and doing a good job of sending them to us, and saying: Hey, we found this, you guys might want to fix it.
What we found, of course, is that it doesn't really matter whether we open source or not, the exploits are going to get found - that's what has happened in all software. And so why not make it easier for folks like libsecondlife, if they're going to be poking around anyway? Let them have the code so that they're more likely be able to fix things that they find, and broaden it to a larger community of developers than just the developers who wanted to get involved in a reverse-engineering effort.
Why did you choose GNU GPLv2 license for the code?
We ended up talking about that a lot. We were basically surveying what license is still the dominant license in the open source community: it's GPLv2, and so in our minds it has a lot of legitimacy. It's also the one that gives us the most flexibility down the road, where if we want to do a dual-licensing scheme, or a more-than-dual licensing scheme, it's a lot easier to come from GPL than sort of back into it.
In fact, you already offer a commercial license, I believe?
We do. I think that for now we would be sort of surprised if a lot of people jumped on the commercial license today, but we have a lot to learn. This is a very big step: there's never been a product that was in the dominant position that then open sourced. Open source is usually used by folk who are either trying to gain market share, or projects that are very early stage. So in that sense, we're trying to be pretty careful and conservative in our decision-making process, because this is in some ways new ground. Much like three years ago, when we gave intellectual property rights back to our residents, and allowed them to own what they made, that was a very new step in this space, and so I think we're continuing the tradition of bleeding edge in our decision making.
When did you start the detailed preparatory work, and what did that entail in terms of preparing the viewer code for release?
It really got started in May and that process continued until the release. It was everything from doing external security audits, hiring additional staff, making sure that you could build it on all the platforms, and building the manifests for all the zip and tarballs we were going to distribute.
Did you have to do much in terms of making the code more legible or more modular?
I think we haven't done as much of that as we would like. Now, of course, nobody who has actually written code and then released it ever thinks the code is clean or modular enough; in fact there are pretty big changes coming down the pike to make the code better.
And that was a pretty active topic of debate: do we wait until after those changes to release the code? We decided that it made more sense to get the code out there. You can always find reasons not to open source, and ultimately it's better to let people begin getting expertise in the code even if we warn them: Hey, this part of the code is going to be changing. And what's neat is that less than 24 hours after we put the code out we've already accepted a user patch.
Could you say a little about these big changes that are coming through?
What we need is to be able not to have to update monolithically. Right now, we take down the grid, we update everybody's viewer, and everything comes back up. And obviously that's neither scalable nor testable. And so there's this long series of changes to be made to let us upgrade in a more heterogeneous way. And we are beginning to publish what those changes are going to be so that people know that they're coming and what to expect.
What are the things that you haven't been able to open source?
Well, for example, streaming textures in Second Life use the JPEG2000 compression standard, j2c, and we use a proprietary bit of code to do the decompression. Now libjpeg, which is the open source version of this, does j2c, but it's way too slow. So one of our first challenges to our user base is: Hey, go smack libjpeg around a bit, and optimise it and then we will happily swap it in.
Why do you distribute binary copies of libraries that are almost certain to be found on any GNU/Linux system -- zlib and ogg/vorbis, for example?
It just seems simpler to give people really complete sets and say: If you go through these steps you will build successfully. There are few things more frustrating than getting all excited about getting some code and you go to build it and it barfs. So we've really been trying to take steps to make sure that doesn't happen. Within about an hour and a half of us putting the code up, there was a picture up on Flickr of somebody who had compiled and made a change already.
In terms of the timing, Linden Lab's been very circumspect in talking about this move: the signals were later this year rather than at the beginning. Why is it happening now, much earlier than you originally indicated?
Linden Lab has always been probably more open than is good for us about what we're trying to do when. We have always talked about features that we're working on, and given estimates of when we were trying to release them. Like most software, we usually end up being a little bit later on those than we'd like to be. And so going forward, we're trying to do a better job of underpromising and overdelivering rather than the opposite. So if people get mad at me because I deliver stuff faster than I was going to, I think I can live with that. I like to beat expectations from here on out.
What do you hope to gain from open sourcing the viewer?
First of all, we expect to get a better viewer. We think we will do a better job of finding bugs and exploits with the Second Life community looking at the code. If you go out medium to longer term, I think we will see active feature development as the community gains expertise with the code and we continue to implement protocol changes to make it easier to implement the features. More importantly, I think we're going to be building expertise in running an open source project because this is just step one for us in terms of where we think Second Life needs to go.
Second Life is growing very rapidly at this point. We think that it is a Web-scale project, not a game-scale project. We will not be happy if at the end of the day we only have ten million users; I think we would all see that as a tremendous failure. So, if we're going to scale to Web levels, obviously we need to keep open-sourcing the pieces that make sense to open source. In order to do that, we need to build expertise at running open source projects, and being part of open source projects, and engaging the open source community. So we've taken the piece that we were first able to do that with, and we're going to learn a lot over the next couple of quarters.
Were you surprised by the large number of positive comments on the blog posting that announced the move?
There's no question that the Second Life community is the most creative, capable, intelligent community ever targeted on one project in history. To give them the ability to make the project even more their own - it does not surprise me that they're pretty psyched about that.
What are the resources that you've put in place to work with the community that you hope to build around the code?
Right now, we basically have an army of one, Rob Lanphier, who did this before. He was at RealNetworks, and spearheaded the Real server open source project Helix.
What's he going to be doing, and how will the code submissions be processed?
He is going to be helping to hire a team, because we're eventually going to need a whole team to be just managing the ingress of code. Right now, he helped set up JIRA, the project management software, which the users can register on and submit bugs and patches. They have a wiki for the open source project, and he has been pretty much managing that.
The QA team is also directly plugged in to the patch submission process so that they can pull patches in, test them on private set-ups, see what's going on. The developers will be keeping an eye on things as well. Like a lot of what Linden Lab does, it's going to be a relatively diffuse project.
You mentioned JIRA for issue tracking, what about the actual code management?
We use Subversion. There isn't yet a public Subversion repository, but we're getting there.
Will you be giving accounts on that to outside contributors?
I don't know exactly what Rob's plan is for that, but I would assume that there's going to be something like that. I expect the libsecondlife people will have a Subversion repository up before we do anything, anyway. They may host the code also -- they're pretty aggressive about doing that.
To foster external contributions, how about moving to a plug-in architecture?
I think that all of us agree that a plug-in structure on the client makes sense. It's just a matter of figuring out whether we want to leverage an existing one or re-invent the wheel.
You've indicated that you view opening up the client as a learning experience for open-sourcing the server in the right way: what additional issues will you need to address here - presumably the proprietary Havok physics engine is going to be a problem?
Certainly, there is the question of proprietary code. We may be able to do exactly what we did on the client side, where we are distributing binaries. In six months, when this [move to open up the client] is successful, it may make for very interesting conversations with folks. We can say: Hey, look, you are the leader in this sector, you should open source, here's why we did it and it worked. And I think the fact that there aren't any proof-points of that is maybe part of what scares companies from doing that. I think we're going to be a very interesting test case.
Obviously the server raises a host of security issues. We have a roadmap that we think solves those, and we're going to be sharing that roadmap sometime this quarter with the community, once we get it sufficiently refined that we're happy with it. We see a host of use-cases for servers where we need to make some pretty profound architectural changes in terms of how trust is established between user and server, between servers and each other, and servers and backend systems. But we see a path, and so it's just a matter of applying development resources to that path and moving along it.
What kind of things are you having to deal with?
In broad security terms, [it's] about code running on hostile machines. Right now, all of the simulator machines are machines that we own in our co-los. It's very different to have that code running on a machine in your garage, even though you're probably a trustworthy guy. That raises issues of trust. Once you have code running on hostile machines, it doesn't really matter whether you have the source or not: you can start doing things. And so we need to trust the simulators less, which means moving some of the things that the simulators currently do in a trusted fashion, out of them.
Does that mean centralizing certain Second Life services?
That depends. Let's say you were a large research organization and you wanted to be able, at times, to use Second Life in a more private way. You might want to control even some of the centralized services. But what you don't want is just a fragmented set of parallel universes that can't talk to each other because you then lose the benefit that makes Second Life so strong, which is the fact that all these communities can connect across traditional geographic and community boundaries. And so the secret sauce becomes how do you architect it in a way that allows both Internet and intranet use.
Do you think that these future worlds will be part of the main Second Life geography or will there be portals from them through to your world?
Well, I think the answer is "yes", because there are some use-cases where it makes sense to be part of the big world, and other cases it makes sense to be a portal away.
Presumably you've also got to deal with issues like identity as avatars move between different worlds, and the tricky one of money?
It's almost like you've read my list: you're dead-on. What's good is, unlike six and half years ago, when we got rolling on this stuff, some of these have been partially solved by the Web. There are much better exemplars today than there were six and half years ago. And so for a lot of what we're going to be doing we can use existing technologies.
What does that imply about the convergence of 3D virtual worlds with the Web?
I think that when you look at anything in problem space, there's some things that the Web does very well. Text, it does it really well; one-to-many, it does it very well; sequential solo consumption of content, it does really well. But there are some things that shared collaborative space and virtual worlds and 3D do really well: if you need place, or you need people to be consuming the content together, where the audience matters, or knowing that we're consuming at the same time matters, or when you need simultaneous interaction.
So I think it's a little odd to imagine that either of those hammers will solve all problems. Instead, what you want is to be able to take problems and move them into the correct space. If you're doing text entry, doing it in 3D is just a big pain in the butt. So there are places for the Web, and there are places for virtual worlds, and I think what you want is as much data to flow between those two as smoothly as you can.
Finally, once you've opened up the code to the client and server, what will be left for Linden Lab to make some money from?
I think that would be a little bit like implying there's no business to be had on the Web if you give away Apache. The Web has shown us where a lot of the value is: identity, transactions, search, communities. And so nothing that we've talked about requires that Linden Lab give up any of those pieces. I think the key is for us to enable growth, building a much, much bigger market, and attempt to make money where it makes sense.
Glyn Moody writes about open source and virtual worlds at opendotdotdot.
IRC (Internet Relay Chat) is a venerable protocol which allows people to
type messages at each other across the net. Your editor remembers a
fascinating day in 1991, when observers in Moscow used an IRC channel to
report on the Soviet coup attempt; it was an early example of the power the
net would come to have. In subsequent years, however, your editor has had
little time for IRC. Getting LWN together every week requires a strong
focus on getting things done, and IRC can be a real productivity killer.
Pretending that IRC does not exist has been most helpful in getting the
Real Work done.
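For readers who have never looked under the hood, the IRC protocol itself
is nothing more than CRLF-terminated lines of text over a TCP connection,
usually on port 6667. A minimal sketch (the nickname, channel, and server
names below are placeholders, not anything from this article) might look
like:

```python
import socket

def irc_lines(nick, channel, message):
    # The commands a client sends to register a nickname, join a
    # channel, and say something - each is one CRLF-terminated line.
    return [
        f"NICK {nick}\r\n",
        f"USER {nick} 0 * :{nick}\r\n",
        f"JOIN {channel}\r\n",
        f"PRIVMSG {channel} :{message}\r\n",
    ]

def send_session(host, nick, channel, message, port=6667):
    # Sending the session is just writing those lines to a socket;
    # a real client would also read and answer server PINGs.
    with socket.create_connection((host, port)) as sock:
        for line in irc_lines(nick, channel, message):
            sock.sendall(line.encode("utf-8"))
```

Every client reviewed below is, at bottom, a user interface wrapped
around this sort of exchange.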
Recently, however, your editor has had reason to wander into IRC again.
Not having done much in this area for a while, your editor lacked a favorite IRC
client - or any IRC client at all. Thus began the search for the best tool
for this particular job - and, eventually, this article.
Anybody who has investigated the topic knows that there is no shortage of
IRC clients to choose from. It would appear that free software developers
are often afflicted with this particular itch. There is no real hope of
reviewing them all, so your editor will not even try. Instead, this review
is restricted to graphical clients which appear to have a real user base
and which are under active development. Your editor also lacks access to
AOL instant messaging, MSN messaging, etc., so this review will be focused
on IRC functionality. Some clients can work with many networks; that
capability will be mentioned when appropriate, but it will not be reviewed
further. Finally, your editor has little to say about channel operator
commands, file downloads, or other such features of IRC; this article will
focus on the basics.
Gaim is a longstanding GNOME
messaging client. It does IRC, along with AIM, ICQ, MSN Messenger,
Yahoo, Jabber, Gadu-Gadu, and so on. If it's a messaging protocol, Gaim
can probably handle it. Those using it for IRC only will find that Gaim
brings a certain amount of baggage ("buddy lists" and such) which is not
useful in that context, and that some of the terminology used ("rooms")
does not quite match the IRC conventions. None of this is particularly
problematic in real use, however.
The main Gaim window is tab-oriented, with each IRC channel in its own
tab. This organization is space-efficient, but it can make it hard to
monitor more than one channel - though the color-coded tab tags help. Tabs
can be detached, however, allowing the user to fill the screen with
single-channel windows. Gaim windows use smooth scrolling, a feature your
editor got tired of back in the VT100 days; unfortunately, there appears to
be no easy way to turn it off. On the other hand, users can turn
off the insertion of cloyingly cute smiley graphics into the message stream.
Private messages result in the quiet creation of a new tab - something
which can be easy for the user to miss. In general, the handling of
private messages in IRC clients seems a little awkward.
Gaim has support for IRC servers which can authenticate nicknames with
passwords. It also has a plugin feature which can be used to extend the
client; plugins add support for additional protocols, expose more preference
options, perform encryption, and more.
Finally, on your editor's system, the Gaim client was a huge process. It
should not be that hard to create an IRC client which requires less
than a 50MB resident set, but the Gaim developers have not done that.
Running Gaim made the whole system visibly slower. Gaim also doesn't take
the hint when all of its windows are closed; one must explicitly tell it to
go away by selecting "Quit" from the "Buddies" menu in the "Buddy list"
window - something your editor found less than entirely intuitive.
Konversation is a KDE-based
client centered around IRC. Like many KDE clients, it is feature-heavy.
Like Gaim, Konversation is based on a single window with tabs. In this
case, however, there does not appear to be any way to detach the tabs into
their own windows.
One nice feature in Konversation is "remember lines," lines drawn in each
conversation window when it goes out of view. When returning to a channel,
the user knows just where to start reading to catch up on the new stuff.
This feature gets a little aggressive at times, drawing several lines
together in low-activity channels; one presumes this little glitch can be
ironed out. Konversation also has an option to suppress all of the channel
event lines (comings and goings) which tend to clutter up the display.
Konversation can handle passwords, but it required a bit more setup work
than some other clients.
Also available is a "URL catcher" tab which simply accumulates URLs posted
on subscribed channels.
Overall, Konversation comes across as a featureful and useful IRC client.
The documentation which comes with it is well-done and comprehensive; it
helped your editor get past his initial questions ("how do I make it stop
joining #kde?") quickly. Detachable tabs would make it nearly perfect.
Perhaps your editor is pushing it a bit by including ERC in this list. ERC
is an emacs-based IRC client; it can be added onto emacs 21, and it
has been bundled into the upcoming emacs 22 release. Emacs is a
reasonably graphical environment these days, but ERC offers none of the
point-and-click configuration and operation options that the other clients
reviewed here have.
ERC maintains a separate buffer for each open IRC channel. It tends to
hide those buffers, and there is no simple tab bar for switching between
them. It is a simple matter for an emacs user to configure the display as
desired, with different channels displayed in different windows or frames.
Somebody who is not familiar with the emacs way of doing things would have
a harder time of it, however.
There is a separate buffer for managing the connection with the IRC server,
and that is where private messages show up. It is probably safe to say
that very few users will keep that buffer visible, with the result that
private messages tend to go unnoticed. ERC also arguably features the
ugliest, most unreadable channel list window of any of the clients reviewed
here.
Display is highly configurable. By default, ERC is less color-happy than
most other graphical clients, a feature which your editor appreciates.
There is a full list of options for filtering users and message types,
performing text transformations, etc. And, of course, the experienced
emacs user can simply attach elisp functions to any events requiring more
elaborate handling.
There is no provision for marking the last-read text in ERC. This
functionality is easily obtained by moving point off the end of the buffer,
essentially saving the current location - but the user must remember to do so.
Overall, your editor likes the feel of working with ERC - but, then, he is
known to be sympathetic to emacs-based solutions. There is no need to
figure out how to search for specific text, for example - all of the normal
text searching functions work as expected. Saving text or a partial log is
straightforward. There is no one-line text window to type into; one simply
types into the buffer and long lines are broken naturally. And so on.
Emacs users will probably be happy with ERC; the rest of the world is
unlikely to pick up emacs to be able to use it.
XChat is a popular client with a
relatively long history. Your editor tried out the GNOME version of XChat
on several networks. Finding servers was relatively easy, since XChat
comes equipped with a long list built into it. One thing which becomes
immediately apparent, however, is that XChat grabs the channel list in a
blocking operation. The client can go completely unresponsive for several
minutes until the listing is complete - not the friendliest introduction to
the client.
The main XChat window features a tree listing of servers and open panels on
the left, and a display of one of those channels in the main pane. There
does not appear to be any way to view more than one channel's traffic at
any given time. The left pane marks channels with unread activity - with a
separate mark if the only activity is enter and leave events.
The XChat feature list is long. It has a "last read" line in each window,
though how it decides when something was read remains a bit of a mystery.
It is not directly related to expose, focus, or mouse button events. Those
who are relatively uninterested in actually reading IRC traffic can
set up window transparency and background images. There is a plugin
mechanism which can be used to set up a URL grabber window or to script the
client in Perl or Python. Moving the pointer over a correspondent's name
yields a popup with that person's name and origin information. There is no
password support, however. Unlike some other clients, XChat appears to
have relatively little support for channel operator functions.
Graphically, XChat is reasonably pleasing, with a use of color which is
not entirely excessive. Private messages are handled in a relatively
straightforward and visible way - but the dialog for selecting a user to
talk to is painful. Overall, it is a capable, easy-to-use client adequate for
the needs of a large subset of IRC users.
Once upon a time, the Mozilla client looked as if it were about to grow to
encompass the functionality of most other programs found on a typical
desktop system. The Mozilla project eventually decided to redirect its
efforts toward the more focused Firefox and Thunderbird tools, leaving the
old, comprehensive application behind. There were users who did not like
that state of affairs, and who dedicated some time to continuing its
development. The result was the SeaMonkey project.
Tucked into one corner of this tool is an IRC client.
Your editor's introduction to this tool was somewhat rocky. It offered up
Undernet as one of its connection possibilities. Your editor decided to
check it out and see what channels were available. After a long period
where the client was completely unresponsive (attempting to list information
for over 20,000 channels), it simply crashed. Note to the SeaMonkey
developers: if you must crash, please have the courtesy to do so
before making the user wait for a long network transfer.
When SeaMonkey is operating, it provides a single, tabbed window with
nicknames on the left. There is no way to have more than one channel
on-screen at a time. There is no password support. All told, the
SeaMonkey IRC client ("ChatZilla") comes across as unfinished and rough
compared to a number of the alternatives. Your editor has seen nothing
here to convince him that web browsers need to support IRC too.
Ksirc is a simple IRC client shipped with KDE; it does not appear to have a
web page dedicated to it. It offers less help than many other clients;
your editor's install of ksirc did not know about any IRC servers, for
example. Once configured, however, it operates well enough.
The bulk of the interface is done through a single window, with each
channel represented by a tab. It is possible to detach the tabs into
separate windows, making it possible to see multiple windows at once.
There is also a "ticker mode" where messages scroll by in a single-line
window, but this mode did not render properly on your editor's system. A
separate window shows the list of servers and open channels, but it does
not appear to actually be useful for much.
Your editor appreciates restraint in the use of color, but ksirc, perhaps,
takes the idea too far by default. The window is essentially
monochromatic, dense, and difficult to read. The use of color can be
configured, however, and there is a set of filters which can be used to
highlight messages with text of interest. When the automatic colorizing
mode is enabled, however, it has an unhealthy tendency to pick gray for
some of the more active users - a bit of a pain considering that the window
background is, by default, gray.
Overall, ksirc is a sufficiently capable tool for most needs. It gives the
impression of having been left behind by some of the other KDE-based IRC
clients, however, and of not getting much development attention in recent times.
A more contemporary KDE client is Kopete. This tool, perhaps, is the KDE
answer to Gaim; it appears to have support for just about any messaging
protocol one can imagine. Once again, your editor only looked at the IRC side of things.
If ksirc is dense and hard to read, Kopete is the opposite. The default
display is full of white space, divider bars, icons, smilies, and more.
As a result, it can be hard to follow a conversation for the simple reason that very
little of it actually fits into the window. Kopete supports themes,
however, and it does not take long to find a theme which makes a little
better use of screen real estate.
At the outset, Kopete's interface is a bit intimidating. The small window
that comes up seems to offer little in the way of interesting operations -
joining a channel, say. For that, one must know to right-click on the
little icon which shows up in the taskbar tray and wander through the
menus. It all works fine once one gets the hang of it, but a new user
trying to get started without having read the manual is likely to be
frustrated for a while.
It is hard to miss private messages in Kopete - the application creates a
new window and throws it at you. For the serious messaging user, there is
a whole set of options for configuring just how hard the client tries to
let you know about various sorts of events.
About the only thing that is lacking is a "last read" line. With that in
place, and with an appropriate theme, Kopete is a powerful and attractive client.
Finally, your editor tried out KVirc,
which takes a rather different approach to IRC clients. Unlike Kopete,
which leaves the first-time user trying to figure out
what to do, KVirc starts with a set of configuration windows - one of which
even displays the GPL text for approval. The user ends up with a big
window containing another for server selection. It would appear that just
about every IRC server on the planet has been put into this dialog; it's a long list.
After selecting a server (and, perhaps, entering password information), the
user encounters one of the more peculiar aspects of KVirc. Every channel
has its own window, but all of those windows are contained within the big
KVirc window. There is a background image in the big window, and the
channel windows are all translucent. It is all visually striking, but your
editor could not help wondering why the developers felt the need to
implement their own window manager. It even has options for tiling all of
the subwindows - with a choice of several different algorithms.
KVirc also has KVS, its own special-purpose scripting language "inspired by C++, sh, perl, php and
mIrc". There is a separate window for monitoring socket operations,
no end of options for playing sounds, a set of anti-spam and anti-flood
filters, and more. It's all powerful and striking, but it's hard not to
wonder whether all that brilliant development energy couldn't have gone into
something more generally useful than another IRC client.
For people who spend much of their lives in IRC, KVirc might well be the
tool of choice. It's visually striking, feature-rich, and users can script
their own bots directly within the client. For your editor's purposes,
however, KVirc is an overly heavy tool, wanting the full screen and ongoing attention.
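For all their differences in polish and presentation, every client reviewed here speaks the same simple, line-oriented protocol to the server; the scripting facilities mostly automate the construction of such lines. As a rough illustration (a minimal sketch in Python rather than KVS, with made-up nicknames and server names), the registration and keepalive exchange looks something like this:

```python
# Minimal sketch of the IRC client protocol (RFC 2812 style).
# All names and credentials below are hypothetical examples.

def login_lines(nick, user, realname, password=None):
    """Build the registration commands a client sends after connecting.

    The optional server password (PASS) must come before NICK and USER;
    this is the "password support" missing from the ChatZilla client.
    """
    lines = []
    if password:
        lines.append(f"PASS {password}")
    lines.append(f"NICK {nick}")
    lines.append(f"USER {user} 0 * :{realname}")
    return lines

def handle_line(line):
    """Answer server PINGs; return the reply to send, or None.

    A client that fails to PONG in time gets disconnected, so even
    the simplest bot script must implement this much.
    """
    if line.startswith("PING"):
        return "PONG " + line.split(" ", 1)[1]
    return None
```

A client's scripted bots are, in essence, just more elaborate versions of `handle_line()`, pattern-matching incoming messages and emitting protocol lines in response.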
Some readers will certainly note the biggest omission from this review:
BitchX. It is, beyond doubt, a powerful client, but it was left out
primarily because it is not a graphical client. Those who are
determined to remain in the curses world are
unlikely to be much interested in the other clients listed here, so there
doesn't seem to be much point in trying to compare them.
So which client will your editor use when he wishes to be grumpy with
others in real time, one line at a time? ERC probably remains at the top
of the list, but XChat is also a useful and capable client. If your
editor were a user of other messaging protocols as well, it would pretty
much come down to Gaim or Kopete, depending on one's desktop orientation.
Your editor's high-school son tends to quickly minimize windows when others
walk into the room, but he would appear to have settled on Gaim.
In the end, however, just about any of these clients is adequate for the task.
One cannot help but wonder why the free software community has produced such a large
set of IRC clients. Yes, IRC is an important communication channel, and a
well-designed client can make IRC more pleasant to work with, but it still
does not seem like there would be room for that many applications doing
essentially the same thing. One cannot fault developers for scratching an
itch and giving the result to the world. Perhaps, once they have achieved
the creation of the world-dominating IRC client, some of these developers
will move on to the creation of something truly revolutionary.
Page editor: Rebecca Sobol
Inside this week's LWN.net Weekly Edition
- Security: Chaostables for confusing nmap scans
- Kernel: The state of the Nouveau project; KHB: Recovering Device Drivers: From Sandboxing to Surviving; RCU and unloadable modules.
- Distributions: LCA: How to improve Debian security; new releases: BLAG 60000, FreeBSD 6.2, FreeSBIE-2.0, IPCop Firewall 1.4.13, Ubuntu Herd 2; Mandriva at the Solutions Linux 2007 summit
- Development: Twisted reaches the 2.5.0 milestone, new versions of Linux-HA, SQLite, wxSQLite3, libnfnetlink, bogofilter,
conntrackd, VirtualBox OSE, Apache, Plume CMS, Crystal Space, Azureus, GNOME,
GARNOME, Xfce, eispice, Linux Libertine, Varconf, swfdec, Theorur,
AbiWord, libgpod, BNF for Java.
- Press: Tivo Healthcare, Vista will boost desktop Linux, CES 2007 coverage, Linux.conf.au coverage, Sun open-sources Fortress, GPLv3 OpenSolaris, EU study on FLOSS, Groklaw looks at the BSD license, DNS Extensions, VMware review, PS3 development, Eclipse joins JCP, Flash 9 for Linux, Buy an OLPC.
- Announcements: Ardour needs sponsors, Fluendo CODECs for Linux, WiMAX proto boards, Akademy CFP, LayerOne CFP, OSCON CFP, OLS CFP, ETech 2007, KDE-NL meeting, Nepomuk-KDE Workshop, SCALE preparation, Open Source Health Care Summit, Searchme Launches Wikiseek.