
Leading items

A pair of acquisitions

By Jonathan Corbet
August 21, 2007
While much of the commercial world was watching the initial public offering of VMware stock, a competitor was carefully pushing forward a different strategy. On August 15, Citrix announced its acquisition of XenSource, the company formed to commercialize the Xen hypervisor. At $500 million, it is a pricey purchase - Citrix guesses that XenSource will bring in $50 million in revenue in 2008, but at a cost of $60-70 million. So profits from XenSource, in the near term, will be virtualized as well; perhaps the plan is to make it up in volume.

Those who fear that money cannot be made with free software might take comfort in a half-billion dollar acquisition of a free software company. Of course, XenSource is far from a pure free software operation. The kernel-level code is GPL-licensed, as is required; much of that code has recently, after a long delay, found its way into the mainline kernel. But the upper layers - the code for the management of virtualized systems - are highly proprietary. They are offered in a three-tier scheme, with the more expensive products un-crippling larger numbers of features. These products are where the revenue comes from.

This acquisition is somewhat indicative of what is happening in the virtualization market. The low-level functionality is free, and is getting steadily more capable. But the tools for the administration of virtualized systems - a task of daunting complexity for sites running large numbers of virtual guests - are generally proprietary. It is the offerings at this level which give XenSource its value despite the fact that Xen's kernel-level support is increasingly surrounded by capable and arguably better-designed alternatives. For all practical purposes, the XenSource acquisition is just the purchase of yet another proprietary software company, Xen's free software origins notwithstanding.

Perhaps more interesting is the acquisition of the ClamAV project by Sourcefire, the company behind the Snort intrusion detection system. ClamAV, a virus scanner, is a true free software project which, previously, had lacked a commercial component. Details have not been disclosed, but one assumes that the owners of ClamAV did not make out quite as well as the holders of XenSource stock. They did get jobs out of the deal, though; they will now continue their ClamAV work as Sourcefire employees.

Who the owners are is, in this case, an interesting question. Projects led by developers with commercial ambitions typically require copyright assignments for any outside contributions. With ownership of 100% of the code base, selling a project (or taking it proprietary) is a relatively straightforward operation. ClamAV, however, is not one of those projects, and all contributors retained their copyrights. So Sourcefire does not own the entire ClamAV code base (or the equally important virus signature database). What it has acquired is the copyrights held by the primary contributors - a large part of the project, but not the whole thing.

This ownership structure could be a bit of a challenge for Sourcefire going forward; part of the plan for making money from this deal involves making a commercially-licensed version of ClamAV available for vendors who wish to integrate ClamAV into their products without being bound by the GPL. To make this offering possible, Sourcefire will be digging through the code and the source code management system to weed out any code which it cannot relicense. If the developers involved have an accurate idea of how much code is involved, if they are thorough in eradicating it, and if they do not anger any outside contributors to the point that they wish to create trouble, this scheme could go well. If a misstep is made somewhere, the possibility of legal action and other unpleasant consequences is very real.

For now, the stated plan is to continue to keep the entire code base and signature database available under the GPL. Sourcefire's Mike Guiterman says that the ClamAV user community has nothing to worry about:

In this case our (Sourcefire's) track record with Snort speaks for itself. Sourcefire has never withheld or delayed a feature in Snort from the open source community. Snort releases and Sourcefire commercial releases are in lock step.

It has been pointed out, though, that there is a bit more to Sourcefire's track record than stated above. Snort releases may happen "in lock step," but anybody who has not bought a Snort rules subscription must wait 30 days for rule updates. Like Snort, ClamAV uses a frequently-updated set of rules which are compared against incoming traffic to detect threats. So it would seem that the ClamAV signature database would be very much amenable to the same commercial treatment; that is, after all, how a number of other anti-virus companies do business.

For now, though, all of the indications are that Sourcefire will not be creating a subscription service around ClamAV signature updates. Quite possibly the company feels that one reason for ClamAV's success is the presence of a wider community which can contribute those updates; putting signature updates behind a subscription gate would almost certainly cause community contributions to dry up. Rather than risk damaging the project it just bought, Sourcefire may have decided to seek revenue in other directions - for now, at least.

With sufficient care, Sourcefire should be able to keep the ClamAV community together - and, perhaps, help it to grow further. Acquisition of a free software project is almost certain to bring change, but that change need not be bad. As we head steadily toward World Domination, we may well see more of these deals. One can only hope that the companies carrying out these acquisitions understand well that, in the absence of the wider community, all they can acquire is a lump of code. Preserving the value of a project acquisition requires preserving the community that goes with it. As long as this important fact is kept in mind, acquisitions can be ultimately beneficial to the affected projects and free software as a whole.

Comments (9 posted)

Large projects and decentralized development

By Jake Edge
August 22, 2007

Development using Git, with its decentralized model, is gaining proponents for projects beyond its Linux kernel heritage. Some recent threads on the kde-core-devel mailing list have been discussing how Git might be used by some developers without disrupting the Subversion (svn) infrastructure that is used by KDE. That conversation has broadened to consider how a large project like KDE might reorganize to take advantage of Git's strengths. It does not look like KDE is really considering a switch – they converted from CVS a little over two years ago – but the discussion is useful to anyone thinking about using Git.

There are really two separate discussions taking place: the first concerns using Git without disrupting svn, while the second covers the larger issue of how to structure and use Git for a large project. The two are intertwined, as the "best practice" for a KDE-sized project is to convert incrementally. Smaller sub-projects – a particular KDE application, for example – would use Git while still committing their changes back to the svn repository. Trying to do a wholesale conversion of a project the size of KDE, with many developers, testers, translators, and users – not to mention millions of lines of code – would be something approaching impossible.

For tracking an svn repository while using Git locally, the git-svn tool is indispensable. It uses any of the svn protocols to check out a repository, optionally including branches and tags, and installs it as a Git repository. A developer then uses Git commands locally, invoking git-svn again when ready to update from, or push changes to, the svn repository. It is not a perfect fit – complaints about losing history in the conversion have been heard – but it does give Git users a way to interact with svn.

The decentralized nature of the Git development model is always a stumbling block for projects that are used to the single, central, repository model of svn and other revision control systems. Adam Treat invited a rather well-known expert on Git, with some small experience in applying it to large projects, to comment on some of the questions he and others had. Linus Torvalds, who is also a KDE user, responded, at length, with some very useful insights.

Breaking the project into sub-projects is the first step:

So I'm hoping that if you guys are seriously considering git, you'd also split up the KDE repository so that it's not one single huge one, but with multiple smaller repositories (ie kdelibs might be one, and each major app would be its own), and then using the git "submodule" support to tie it all together.

Using the git-submodule command, a project can be broken up into many pieces, each with its own Git repository. Those separate repositories can then be stitched together into a "superproject" that understands how to handle a collection of repositories. If a change affects multiple modules, it can still be handled in an atomic way:

What happens is that you do a single commit in each submodule that is atomic to that *private* copy of that submodule (and nobody will ever see it on its own, since you'd not push it out), and then in the supermodule you make *another* commit that updates the supermodule to all the changes in each submodule.

See? It's totally atomic. Anybody that updates from the supermodule will get one supermodule commit, when that in turn fetches all the submodule changes, you never have any inconsistent state.

Users of a development tree have differing needs, which Git supports by not requiring a central repository that all users must interact with. Torvalds believes that the development organization, not the tool, should determine which repositories are central:

I certainly agree that almost any project will want a "central" repository in the sense that you want to have one canonical default source base that people think of as the "primary" source base.

But that should not be a *technical* distinction, it should be a *social* one, if you see what I mean. The reason? Quite often, certain groups would know that there is a primary archive, but for various reasons would want to ignore that knowledge.

For Linux, his kernel Git tree is the center; but for a variety of other users – those following the "stable" tree or distribution kernel trees, for example – other repositories are the source. Those repositories can and do update from the main tree from time to time, but they control when, and the users of those trees don't have to care.

On the subject of mapping the current KDE practices to Git, Torvalds is, characteristically, not shy about expressing his opinion:

Hey, you can use your old model if you want to. git doesn't *force* you to change. But trust me, once you start noticing how different groups can have their own experimental branches, and can ask people to test stuff that isn't ready for mainline yet, you'll see what the big deal is all about.

Centralized _works_. It's just *inferior*.

There is a clash of development models going on and Torvalds is pushing the kernel's model. His reasons are good, though they may not convince everyone, which is why Git tries hard to avoid forcing any particular style. As he did with open source development, Torvalds is trying to lead by example, while not forcing anyone to change.

Reading the full threads, including the entire posting by Torvalds, will be very interesting to those who follow source code management issues. This culture clash – centralized and somewhat bureaucratic versus decentralized and freewheeling – will come up again and again over the next few years. Torvalds seems to think the Git model will work almost everywhere, and his track record for making smart choices is good. It will be interesting to watch.

Comments (11 posted)

Microsoft's licenses: excerpts from a conversation

By Jonathan Corbet
August 22, 2007
Microsoft recently submitted two licenses to the Open Source Initiative to be considered for approval as being truly open source. There have been a few themes which have come out of the subsequent discussion. One is that the licenses are generally seen as being compliant with the Open Source Definition, though their incompatibility with other licenses bothers a few people. Not everybody agrees that the Microsoft Permissive License (MS-PL) is truly "permissive," and some have asked for a name change. There have been some grumblings that the licenses offer no additional value in a time when the OSI is actively trying to reduce license proliferation.

But, as can be seen below, the heated part of the conversation was about a different topic: can and should the OSI judge a license based on its origin? Without further ado...

Does this submission to the OSI mean that Microsoft will:

a) Stop using the market confusing term Shared Source
b) Not place these licenses and the other, clearly non-free, non-OSD licenses in the same place, thus muddying the market further.
c) Continue its path of spreading misinformation about the nature of open source software, especially that licensed under the GPL?
d) Stop threatening with patents and oem pricing manipulation schemes to deter the use of open source software?

If not, why should the OSI approve of your efforts? That of a company who has called those who use the licenses that OSI purports to defend a communist or a cancer? Why should we see this seeking of approval as anything but yet another attack in the guise of friendliness?

-- Chris DiBona

I'm unclear how some of your questions are related to our license submissions, which is what I believe this list and the submission process are designed to facilitate. You're questioning things such as Microsoft's marketing terms, press quotes, where we put licenses on our web site, and how we work with OEMs - none of which I could find at http://opensource.org/docs/osd.
-- Bill Hilf

Be careful what you ask for. Do you really want everything RMS says about the BSD and similar licenses to be on-topic for approval of future FSF licenses? Should it be? Or should we do the right thing and restrict our review to the licenses themselves?
-- Chris Travers

Hey, I can sympathize - personally, I really don't approve of the FSF, and I'd love to see the OSI turn down the GPLv3.

Except I wouldn't, really, because then the OSI would lose every shred of credibility and quickly become irrelevant - just like it would if it failed to carefully consider the licenses submitted by Microsoft, or to approve them if they were found to adhere to the OSD.

-- Dag-Erling Smørgrav

This comes back to an old question on this list: is the OSI simply responsible for mechanically approving licenses? Or is the OSI responsible for, as it says on the web site, "maintaining the Open Source Definition for the good of the community"? In my opinion, which I acknowledge is not widely held, the good of the community does not require approving every applicable license.

That said, I personally would be in favor of approving the Microsoft licenses. I think it is overall a benefit to the community to acknowledge that code under these licenses is open source.

-- Ian Lance Taylor

OSI's role is merely to certify the licences that meet OSD criteria, and promote the concept of open source in general.
-- Rick Moen

The OSI board's anti-proliferation efforts appear to take them one step beyond certification though. It would seem to be that otherwise compliant licenses could be rejected if they simply duplicate the terms or purpose of an existing license... I would guess that a license that copied the Apache license and replaced all instances of Apache with some other abstract word would be rejected, no matter what the compatibility matrix looked like. How about a license that had exactly the same requirements as Apache, but restated them in a completely different way? From there, what's the *smallest* difference in licensing terms that would be worth adding yet another license?
-- Brian Behlendorf

I think (as I thought two years ago) that this is a case where the anti-proliferation rules should be set aside. We are dealing with an organization that has the potential of being a major player in free and open source software (and if they don't like the GPL, there are plenty of other FLOSS-producing organizations that don't like it either). If they can only bring themselves to release such software under their own particular licenses, so much the worse; but not more the worse than if they never released any FLOSS software at all.
-- John Cowan

So the question becomes, should OSI discriminate? Will a farmer let a fox into the henhouse if the fox puts on a chicken suit?

I think not. Not if he wants to have any chickens. A fox in a chicken suit is still a fox and is still planning to eat his chickens. So only a stupid farmer would reason that a fox in a chicken suit, even one made from real chicken feathers, should now be allowed to reside in his chicken coop with his tasty chickens. Farmers are supposed to consider what foxes are known to do to chickens and what a fox's motives and likely purpose might be in putting on a chicken suit and sweetly pawing on the door to the henhouse.

-- Groklaw

Over time, it will probably become obvious that MS-PL and MS-CL are merely yet more additions to the horde of insignificant/redundant licences that, nonetheless, do pass OSD muster. They aren't innovative or particularly useful, though they do have the minor excellence of brevity...

There's really nothing new, here. However, if OSI were to surrender the integrity of its certification program, that would be something new, and particularly bad. Which is easily a sufficient reason for that not to occur.

-- Rick Moen

The actual decision must wait for the recommendation from the OSI license approval committee and the vote of the board of directors.

Comments (14 posted)

Page editor: Jonathan Corbet


Copyright © 2007, Eklektix, Inc.
Comments and public postings are copyrighted by their creators.
Linux is a registered trademark of Linus Torvalds