
LWN.net Weekly Edition for March 29, 2007

The third GPLv3 draft

The original plans had called for the third draft of the GNU General Public License update to come out late last year. Needless to say, things didn't happen that way. Between trying to address concerns raised from various directions and responding to the Microsoft/Novell deal, the Free Software Foundation ended up having to slip its schedule; as a result, eight months have passed since the second draft was released. One could well argue that a major license update should not be made in a hurry, and thus the delays are not problematic. In any case, the wait is over: the new GPLv3 draft is available. In many ways, the draft resembles its predecessors; in others, it has changed significantly. This article will focus on the differences.

One area of conflict has been the anti-DRM provisions. The relatively uncontroversial language stating that GPLv3-licensed works are not "technological measures" has been reworked slightly to give it a more international focus:

No covered work shall be deemed part of an effective technological measure under any applicable law fulfilling obligations under article 11 of the WIPO copyright treaty adopted on 20 December 1996, or similar laws prohibiting or restricting circumvention of such measures.

The previous draft had been specific to the DMCA, but anti-circumvention laws are a global issue, so this change makes sense.

The "anti-tivoization" provisions have been the source of much of the disagreement over this license. The new draft changes those sections significantly - though the intent remains the same, and people who did not like the previous versions are unlikely to feel better about the new language. In previous drafts, signing keys required to convince hardware to run a given binary were deemed to be part of the source code, and thus a required part of the (required) source distribution. The drafters decided that extending the definition of "source code" in this way was not the best idea. So, instead, we now have "installation information":

"Installation Information" for a User Product means any methods, procedures, authorization keys, or other information required to install and execute modified versions of a covered work in that User Product from a modified version of its Corresponding Source. The information must suffice to ensure that the continued functioning of the modified object code is in no case prevented or interfered with solely because modification has been made.

The license goes on to say that, if GPLv3-licensed code is shipped as part of a product, the installation information must be made available as well. Actually, it's not anywhere near that simple, for a couple of reasons. The first is this concept of a "user product," which is new in this draft:

A "User Product" is either (1) a "consumer product", which means any tangible personal property which is normally used for personal, family, or household purposes, or (2) anything designed or sold for incorporation into a dwelling.

The actual requirement for the shipping of installation information is:

If you convey an object code work under this section in, or with, or specifically for use in, a User Product, and the conveying occurs as part of a transaction in which the right of possession and use of the User Product is transferred to the recipient in perpetuity or for a fixed term (regardless of how the transaction is characterized), the Corresponding Source conveyed under this section must be accompanied by the Installation Information.

One might well wonder what is going on here. In the explanation materials sent to LWN with the license draft, the FSF states:

After some discussion with committees, we discovered that the proposals in the second discussion draft would interfere with a number of existing business models that don't seem to be dangerous. We believe that this compromise will achieve the greatest success in preventing tivoization.

The nature of these innocuous business models is not spelled out. What it comes down to, though, is that gadgets intended to be sold to businesses will be exempt from the "installation information" requirements. This seems strange; it may well be businesses which would have the most use for the ability to change the code running in devices they purchase. The FSF has been saying that the right to replace the software in a device is required for true software freedom; why is that right less important for devices which are not "user products"?

This exemption could prove to be a big loophole. Many years ago, your editor bought a digital audio tape deck. The rules for DAT decks in those days specified that they must implement the "serial copy management system" - a couple of bits in the digital audio data stream which indicated whether another deck was allowed to record that stream or not. It turned out that decks intended for "professional use" were exempt, however - musicians, after all, might actually want to make copies of their work. As far as your editor could tell, the difference between "professional" and "consumer" decks (at the low end, anyway) consisted of a pair of rack-mount ears; "professional" decks were available at the local guitar shop. Anybody could get a SCMS-free deck with little trouble. The exemption for devices which are not "user products" looks similar; with this language, the FSF may well be setting us up for a flood of "business use" gadgets which happen to be available at the local big-box technology store.

The "additional terms" section has been simplified a bit. The second draft included the optional requirement that, if the covered code is used to implement a web service, the users should be able to get the source via that service. This requirement, intended to close the "web services loophole," is absent from the third draft.

The termination rules still allow any copyright holder to terminate the license if it is violated. There is a new escape clause, though:

However, if this is your first violation of this License with respect to a given copyright holder, and you cure the violation within 30 days following your receipt of the notice, then your license is automatically reinstated.

An opportunity to fix a GPL violation is consistent with how the license has been enforced so far.

The patent language has changed significantly as well. The second draft included a covenant not to enforce any relevant patents against recipients of the software; in the third draft, instead, an explicit patent license is granted. This change is apparently intended to make the patent grant language look more like that found in other licenses.

The change which will attract the most attention, though, is the language aimed at the Microsoft/Novell deal; it does not look like anything found elsewhere. It starts by broadening the definition of a "patent license" to include things like covenants not to sue, thus covering the Novell non-license. There is a clause saying that if you distribute covered code under the protection of such a license, you must arrange for all recipients - anywhere - to have the same protection. Then there's this part:

You may not convey a covered work if you are a party to an arrangement with a third party that is in the business of distributing software, under which you make payment to the third party based on the extent of your activity of conveying the work, and under which the third party grants, to any of the parties who would receive the covered work from you, a patent license (a) in connection with copies of the covered work conveyed by you, and/or copies made from those, or (b) primarily for and in connection with specific products or compilations that contain the covered work, which license does not cover, prohibits the exercise of, or is conditioned on the non-exercise of any of the rights that are specifically granted to recipients of the covered work under this License.

The FSF is still considering whether it should grandfather in deals made before this draft was released.

The restriction to deals involving software companies is strange; it will just cause the next deal to be done by way of a patent-troll corporation. The prohibition only applies if the payments are based on the number of copies distributed, meaning that the next such deal will look like a fixed-sum payment - we will never know how that sum was calculated. There are enough loopholes in this section that it seems unlikely to slow down the next patent shakedown in any significant way. If the grandfather clause is added, it will not even affect Novell, the target of this whole thing.

There is an interesting new exception in this draft:

Notwithstanding any other provision of this License, you have permission to link any covered work with a work licensed under version 2 of the Affero General Public License, and to convey the resulting combination. The terms of this License will continue to apply to your covered work but will not apply to the work with which it is linked, which will remain governed by the Affero General Public License.

The posted version of the Affero GPL is version 1; your editor was not able to find any mention of a second version anywhere. The FSF must know something the rest of us are not yet privy to.

Finally, there is explicit support for signing away the right to decide on future license changes to others:

If the Program specifies that a proxy can decide whether future versions of the GNU General Public License shall apply, that proxy's public statement of acceptance of any version is permanent authorization for you to choose that version for the Program.

There are various other tweaks - providing source by way of a network server is now officially allowed, for example. In many ways, GPLv3 is shaping up exactly as it was supposed to: it is bringing the license up to contemporary, worldwide standards and is evolving in response to input from the community. Your editor anticipates that the new anti-DRM and anti-Novell language will be the subject of significant criticism, however. These provisions are developing the complex, baroque nature of code which has been repeatedly patched far beyond its original design; that language may require some work yet.

The current plan calls for the FSF to accept comments on this draft for the next 60 days, after which the final draft will be released. One month later - around the end of June - the GPLv3 will become official. The FSF claims to be actively looking for comments, so now is the time for anybody who has remaining concerns to speak up. Regardless of whether certain high-profile projects move to GPLv3, we all will be working with code covered by this new license. It's important that we help the FSF get it right.


Beryl and Compiz: back together again?

Once upon a time, just over one year ago, the Compiz window manager hit the net. Compiz, which features fancy 3D effects, was the result of some months' worth of behind-closed-doors work at Novell. There was an enthusiastic reception, and others began to hack on the code. It didn't take long, however, before some of those others found that it was hard to get their changes back into the Compiz mainline. Eventually one of those developers, Quinn Storm, got tired of carrying an increasing collection of external patches. The result was a fork, and the Beryl project was created.

These events can be acrimonious, and the Compiz/Beryl fork was no exception. Beryl developers complained that Compiz was run as a Novell fiefdom which was uninterested in patches from the outside. On the Compiz side, Beryl's decision to relicense the code from the MIT license to the GPL meant that code could flow from Compiz to Beryl, but not in the other direction. In early 2007, a Compiz site administrator vandalized Beryl's site, an act which must surely mark a low point in relations between the two projects.

During this time, development on both sides continued, with Beryl quickly developing a reputation for bells, whistles, and an unbelievable number of configuration options. Compiz took a more conservative course, working on getting the core functionality working in a way which seemed, to its core developers, to be right. Despite all of this, the differences between the two code bases are apparently less than one might think. No major architectural changes have happened; instead, most of Beryl's additions come in the form of plugins.

Recently, though, the Beryl developers started to ponder some more sweeping changes. According to Robert Carr, the conversation went like this:

Around a month and a half ago some of us were discussing some rather radical changes to the design of beryl-core which we inherited from Compiz, this inevitably led to "We should talk to Compiz about this to keep things synced", which even more inevitably leads to "If we are going to talk to Compiz to keep our designs similar, so on, so forth, are our differences really so large that we need to be two seperate projects?".

The result was that the two projects started talking again. As of this writing, it would appear that Beryl and Compiz have come to an agreement to end the fork and join back into a single project. Should things happen this way, the results for eye-candy fans should be good. There are a few details which need to be worked out first, though.

One of those is licensing. The fact that Beryl's work is licensed under the GPL means that, for the two projects to merge, one of them must be relicensed. It looks like Beryl will be the one to give here, moving its core back to the MIT license. The number of contributors is evidently sufficiently small that this sort of change is still feasible.

Then there is the issue of how to merge the changes in the code. According to Mr. Carr, agreement has been reached on most points, at least with regard to the core changes. In the past, Compiz leader David Reveman has not been receptive to Beryl code:

With a few notable exceptions, most of the code I've seen going into what is now beryl is not high quality code that would be considered for compiz.

It seems that the situation is different now:

The technical part of the merge seems pretty straight forward from my point of view and I've got the understanding that so is also the case for the main contributors to the core of beryl.

The merge is probably helped by the Compiz project's plan to split the code into "core" and "extra" modules. Much of what is currently in Beryl will, it seems, slip into compiz-extra with little trouble.

So if licensing and code are not problems, what are the potential sticking points in this merger? It seems that there are two of them: naming and leadership. The Beryl side is pushing for a new name and structure which would enable a clean start for the entire project. Without that, they fear, one side or the other will probably get the short end of the stick. Mr. Reveman responds:

The merge is done by moving changes made to beryl into compiz or by adding alternative solutions to compiz. No changes are made to the design of compiz and 99% of the code is still code being written as part of the compiz project so I'm having a hard time to justify a name change of the core and I know that most people in the compiz community are firmly against such a name change.

From reading the discussion, one gets the sense that the leadership issues have not yet been the subject of serious discussion. Some sort of project management model will have to be worked out, or the newly merged project will run a risk of falling victim to the same tensions and forking again.

There should be an answer, though. It would be a sad day if these two projects could come together, resolve their technical and licensing differences, then drop the whole thing because they cannot agree on the name. Some great progress has been made on reunifying one of the most unpleasant forks in our community; it seems like the remaining issues must somehow be amenable to a solution.


Working with raw images on Linux

This article is part of the LWN Grumpy Editor series.
Your editor's exploration of high dynamic range (HDR) techniques inspired one comment suggesting that photographic topics should be avoided in the future if your editor wishes to avoid looking foolish. As it happens, fear of looking foolish would make this particular job almost impossible to do; when one writes for an audience that knows more than the author, occasional foolishness will inevitably result. Even for authors who are not so inherently foolish as your editor. So, foolish or not, here is a followup to the HDR article; this week's topic is working with raw files.

Most digital cameras are set to produce JPEG files; for many applications, such files are more than good enough. But most decent cameras support other formats as well, a vendor-specific raw format in particular. The raw format contains something close to what was measured by the sensor, with a minimum of processing in the camera. These files are large, unwieldy, and in a proprietary format, which argues against their use in many situations. But, by virtue of holding the original image data, raw files give the photographer a much wider range of options later on. Much of the processing normally done in the camera (white balance, histogram adjustment, etc.) can be tweaked after the fact. For this reason, people who do photography for a living often prefer to record in the raw format.

Even for the rest of us, who have no hope of earning a living that way, raw files can keep creative options open. For people who like to play with HDR techniques there is an additional advantage: cameras typically record 12 to 16 bits of data for each channel - rather more than fits into a JPEG file. That, in turn, means that the dynamic range of raw files is significantly higher - assuming, of course, that the camera has a sensor which can meaningfully record data at that resolution. The extra range can be used to increase detail in images in a number of ways, including the use of tone mapping techniques.
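
To put rough numbers on that, here is a back-of-the-envelope sketch in Python - purely illustrative, and assuming the simplification that each additional bit of meaningful sensor data buys one extra stop of range:

    import math

    # Per-channel levels in an 8-bit JPEG versus common raw bit depths.
    jpeg_levels = 2 ** 8      # 256
    raw_12_bit = 2 ** 12      # 4096
    raw_14_bit = 2 ** 14      # 16384

    # Each doubling of the linear level count is roughly one extra stop,
    # assuming the sensor actually delivers usable data at that depth.
    print(math.log2(raw_12_bit / jpeg_levels))   # 4.0 extra stops
    print(math.log2(raw_14_bit / jpeg_levels))   # 6.0 extra stops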

Raw file formats are created by camera manufacturers, who generally feel no need to document their work. They will usually sell you a tool for decrypting their raw files - but, strangely enough, Linux support is usually missing from the feature list. Fortunately, the free software world benefits from the work of Dave Coffin, who has set a task for himself:

So here is my mission: Write and maintain an ANSI C program that decodes any raw image from any digital camera on any computer running any operating system.

The result is dcraw, which comes awfully close to meeting that goal. It supports a huge list of cameras, and it does so at a high level of quality - arguably better than the vendor's tools. It is a command-line tool, aimed at batch operation or invocation from other programs; dcraw can be run from a GIMP plugin, for example. Just about anything one wants to do with a raw image file is supported by dcraw.
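
As a hypothetical illustration of that batch-oriented style (not taken from the dcraw documentation), the following Python sketch drives dcraw over a directory of Nikon NEF files; the flags used (-w for camera white balance, -T for TIFF output, -6 for 16-bit samples) are standard dcraw options, though the exact behavior may vary between dcraw versions:

    import subprocess
    from pathlib import Path

    # Convert every NEF file in the current directory to a 16-bit TIFF.
    #   -w : use the white balance recorded by the camera, when available
    #   -T : write a TIFF file instead of dcraw's default PPM output
    #   -6 : write 16-bit samples rather than 8-bit
    for raw_file in sorted(Path(".").glob("*.NEF")):
        subprocess.run(["dcraw", "-w", "-T", "-6", str(raw_file)], check=True)
        print(f"converted {raw_file} -> {raw_file.with_suffix('.tiff')}")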

The only downside is that processing raw images can be an interactive process. If one wants to make adjustments, a command-line tool can get tiresome after a while. The answer to that complaint is the UFRaw tool, which is built on dcraw. UFRaw allows adjustment of the white and black points, gamma curve, white balance and more - all with immediate visual feedback. When the desired result is achieved, it can be saved in a number of formats.

UFRaw is not perfect. It's one of those applications that thinks it's clever to remember where the last image was stored and put the next one in the same directory. Your editor, instead, expects programs to default to the directory they were started in, or, failing that, to the directory where the source file was found. It's aggravating to save a file then have to figure out where the application decided to put it. UFRaw is doubly obnoxious in this regard because it immediately exits after saving the file. The non-resizeable window is also annoying. One assumes these little difficulties can be dealt with eventually; meanwhile, the core functionality is good stuff.

What sort of results can one expect? Here are three versions of the window view photo featured in the HDR article:

[Three versions of the window view photo: the original, a UFRaw-edited version, and a tone-mapped version.]

(See this page for larger versions of the pictures).

Some quick editing with UFRaw was sufficient to bring out a fair amount of detail in the plant in the foreground - though the background lost some contrast as a result. The tone-mapped photo does better at maintaining contrast throughout the frame. The end result is not as complete as the full HDR image (visible here), but it does show that raw files contain information which can be recovered later on to improve the picture. Taking a single raw image is much easier than the full bracketed HDR technique, and it allows tone mapping techniques to be used on subjects which stubbornly refuse to stand still for a few minutes while several shots are taken.

One thing worth noting in conclusion: we should not take our ability to work with raw images for granted. Vendors like Nikon and Sony are known for encrypting their raw formats. The language they use to justify themselves will look most familiar; consider this advisory from Nikon regarding its NEF format:

As a proprietary format, Nikon secures NEF's structure and processing through various technologies. Securing this structure is intended for the photographer's benefit, and dedicated to ensuring faithful reproduction of the photographer's creative intentions through consistent performance and rendition of the images.

In other words, photographers are being locked out of their own images for their own benefit. All of the usual counterarguments apply here; photographers might just have their own idea of where their benefit lies. And what happens to those raw images a decade or two from now, when the vendor has long since ceased to support the format and, even if one can find one's single legal backup copy of the software, it refuses to run on currently available systems? Fortunately, we have dcraw, which will document the reading of these formats indefinitely.

So far, vendors' attempts to encrypt raw files have been broken in short order. Chances are that trend will continue. But there is little difference between breaking into a raw image file and turning off the copy protection bits inside a PDF file. The stage is clearly set for an ugly battle, probably involving the DMCA, when some vendor decides to turn nasty.

Photographers have been worried about this issue for a few years now; efforts like the OpenRAW project have been working, with little success, to get camera manufacturers to open up their formats. Adobe has been pushing its Digital Negative format as a standard; it would be a step in the right direction, but this format still has mechanisms for the embedding of vendor "private" information. At this time, there does not seem to be a clear solution in sight. We must deal with cameras just like we deal with many other types of hardware: we have to figure out how they work ourselves.


Page editor: Jonathan Corbet

Inside this week's LWN.net Weekly Edition

  • Security: Metasploit 3.0; New vulnerabilities in cups, evolution, firefox, mysql, ...
  • Kernel: Application-friendly kernel interfaces; Deferrable timers; Integrity management.
  • Distributions: Character encoding and the Debian Project Leader elections; pure:dyne 2.3.52, Ubuntu 7.04 Beta, YDL v5.0.1 for Apple PowerPC.
  • Development: KDE 4 gets more Hot New Stuff, new versions of Firebird, BusyBox, Metasploit Framework, Django, Silva, CLAM, Vamp Plugin API, ASCO, Covered, Layout editor, Mirth, Ember, Dia, Urwid, Canorus, GMIDImonitor, Qtpfsgui, Swfdec, Gran Paradiso, Croquet, Shed Skin, Pydev, PHP OpenID.
  • Press: Clearing up anti-GPL3 FUD, Ian Murdock: Making Solaris more like Linux, coverage of CeBIT, Decibel Hackathon, Guademy, LAC, Emulex supports RHEL5, Novell's latest announcements, Oracle and Yahoo partner, Oracle joins OIN, ripping DVDs, DMCA abuses, history of Linux and Trademarks, Open Font License revised, PHP Search Engines, making activities for the OLPC XO, chess engines, desktop distribution reviews, Gentoo and developer conflicts.
  • Announcements: Linux Foundation announces board, LQ Wiki reaches 3000 articles, ActiveState PDK 7.0, Coverity squashes 6000 bugs, Xandros to bundle Scalix, Ted Ts'o wins FSF award, Pure Data Spring School 2007, EuroPython cfp, GUADEC cfp, OO.o cfp, RAID cfp, LAC to feature live streams, Sys Admin Technical Conference, Where 2.0 speakers announced.

Copyright © 2007, Eklektix, Inc.
Comments and public postings are copyrighted by their creators.
Linux is a registered trademark of Linus Torvalds