
Leading items

Ksplice and CentOS

By Jonathan Corbet
July 26, 2011
Ksplice first announced itself in 2008 as a project for "rebootless kernel security updates" based at MIT. The students behind the project soon graduated, and so did the project itself; a company by the same name was formed to offer commercial no-reboot patching to customers who cared deeply about uptime. Ksplice Inc. also offered free update services for a number of distributions. Much of this came to an end on July 21, when Oracle announced that it had acquired Ksplice Inc. and would incorporate its services into its own Linux support offerings. A free form of ksplice might just live on, though, with support from an interesting direction.

On the same day that Oracle announced the acquisition, CentOS developer Karanbir Singh suggested that one place the CentOS community could help out would be in the creation of a ksplice update stream. CentOS updates had been available from Ksplice Inc., on a trial basis at least; the company even somewhat snidely let it be known that they were providing updates for CentOS during the first few months of 2011, when the CentOS project itself had dropped the ball on that job. Oracle-ksplice still claims to support CentOS, but there is not even a trial service available for free; anybody wanting update service for CentOS must pay for it from the beginning. (The free service for Fedora and Ubuntu appears to still be functioning, for now - but who builds a high-availability system on those distributions?)

It is hard to blame Oracle too much for this decision. Oracle has bought a company which, it believes, will make its support offerings more attractive. Making the ksplice service available for free to CentOS users, in the process making CentOS more attractive relative to commercial enterprise offerings, would tend to undercut the rationale behind the entire acquisition. While it would certainly be a nice thing for Oracle to provide a stream of ksplice updates for CentOS users, that is not something the company is obligated to do.

So if CentOS is to have an equivalent service, it will have to roll its own. There are a few challenges to be overcome to bring this idea to fruition, starting with the ksplice code itself. That code, by some strange coincidence, disappeared from the Ksplice Inc. site just before the acquisition was announced. The Internet tends not to forget, though, so copies of this code (which was released under the GPL) were quickly located. Karanbir has posted a repository containing the ksplice 0.9.9 code as a starting place; for good measure, there are also mirrors on gitorious and github.

Getting the ksplice code is the easy part; generating the update stream will prove to be somewhat harder. Ksplice works by looking at which functions are changed by a kernel patch; it then creates a kernel module which (at runtime) patches out the affected functions and replaces them with the fixed versions. Every patch must be examined with an eye toward what effects it will have on a running kernel and, perhaps, modified accordingly. If the original patch changes a data structure, the rebootless version may have to do things quite differently, sometimes to the point of creating a shadow structure containing the new information. And, naturally, each patch in the stream must take into account whatever previous patches may have been applied to the running kernel.
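Ksplice itself rewrites the machine code of the running kernel, but the control flow it achieves can be sketched in userspace. In this hypothetical illustration (the function names and the bug are invented for the example), the "live" code calls through one level of indirection, and an update atomically swaps in the fixed version without any restart:

```python
# Userspace sketch of the ksplice idea.  Real ksplice overwrites the
# first instructions of the old function inside the running kernel
# after verifying that no CPU is executing it; here the "patch" is
# just an atomic rebinding of a function reference.

def check_len_v1(length):
    """Buggy original: a negative length slips past the check."""
    return length < 256

def check_len_v2(length):
    """Fixed replacement, as a ksplice update module would supply it."""
    return 0 <= length < 256

# The entry point the running code actually calls.
check_len = check_len_v1

def apply_update():
    # ksplice first checks that it is safe to modify the running code.
    global check_len
    check_len = check_len_v2

print(check_len(-1))   # True  (bug present)
apply_update()
print(check_len(-1))   # False (patched, no reboot)
```

The hard part described above is exactly what this sketch glosses over: deciding when the swap is safe, and handling patches that change data structures rather than just code.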

Some more information on this process can be found in this article from late 2008. The point, though, is that the creation of these runtime patches is not always a simple or mechanical process; it requires real attention from somebody who understands what the original patches are doing. CentOS has not always been able to keep up with Red Hat's patch stream as it is; the creation of this new stream for kernel patches will make the task harder. It is not immediately obvious that the project will be able to sustain that extra effort. If it does work out, though, it would clearly make CentOS a more attractive distribution for a number of high-uptime use cases.

An interesting question (for those who are into license lawyering, anyway) is whether a patch in Oracle's ksplice stream constitutes a work derived from the kernel for which the source must be provided. Having access to the source for Oracle's runtime patches would obviously facilitate the process of creating CentOS patches.

Even if a credible patch stream can be created, there is another challenge to be aware of: software patents. The Ksplice Inc. developers did not hesitate to apply for patents on their work; a quick search turns up two pending applications.

The first of these has a claim reading simply:

A method comprising: identifying a portion of executable code to be updated in a running computer program; and determining whether it is safe to modify the executable code of the running computer program without having to restart the running computer program.

That is an astonishingly broad claim, even by the standards of US software patents. One should note that both of the applications listed above are exactly that: applications. Chances are that they will see modifications before an actual patent is granted - if it is granted at all. But the US patent office has not always demonstrated a great ability to filter out patents that overreach or that are clearly covered by prior art.

Once again, license lawyers could get into the game and debate whether the implied patent license in the GPL would be sufficient to protect those who are distributing and using the ksplice code. Others may want to look at Oracle's litigation history and contemplate how the company might react to a free service competing with its newly-acquired company. There are other companies holding patents in this area as well. Like it or not, this technology has a potential cloud over it.

It all adds up to a daunting set of challenges for the CentOS project if it truly chooses to offer this type of service. That said, years of watching this community have made one thing abundantly clear: one should never discount what a determined group of hackers can do if they set their minds to a task. A CentOS with no-reboot kernel updates would be an appealing option in situations where uptime needs to be maximized but there are no resources for the operation of a high-availability cluster. If the CentOS community wants this feature badly enough, it can certainly make it happen.


Desktop name collisions

July 27, 2011

This article was contributed by Nathan Willis

The GNOME and KDE development communities ran into a potentially confusing name collision recently when it was discovered that both were using "System Settings" to label the menu entry for their respective environment-configuration tools. A plan for handling the redundant names was eventually hashed out, though it shed light on a variety of other issues about system configuration on modern Linux desktops.

The debate started when Ben Cooksley, maintainer of KDE System Settings, wrote to both the GNOME desktop-devel and the KDE kde-core-devel lists with what he termed a "formal complaint" about the name change in GNOME 3's unified configuration tool, from "Control Center" to "System Settings." Cooksley argued that users would be confused by the presence of both GNOME's System Settings tool and KDE's, and that GNOME "packagers" (meaning downstream distributions) would disable the KDE tool, thus leaving mixed-environment users without a way to configure important KDE application settings. Because KDE was using the term before GNOME, he ended the complaint requesting that GNOME "immediately rename it once again to another name which is not in conflict."

Outside of the core issue, Cooksley's initial few messages were openly combative in tone, accusing the GNOME project of deliberately choosing the same name; remarks like "as KDE occupied this name first, it is ours as a result, and I will NOT be relinquishing it to satisfy your personal (selfish) desires" threatened to derail any serious discourse. A few other posters in the two threads also reacted with acrimony, but list moderators Olav Vitters (GNOME) and Ingo Klöcker (KDE) were quick to step in and warn participants to keep the discussion civil. For the most part, the discussion did calm down, and focused on the technical challenge of permitting two system configuration tools to co-exist — a challenge without an easy solution.

Configuration Junction

The root of the potential confusion is that both KDE and GNOME handle desktop-wide preferences for a range of settings that their constituent applications need: localization, mouse settings, keyboard shortcuts, preferred file-handling applications, even widget and sound themes. Many of the settings are defined in specifications, but some are unique to just one desktop environment or another.

In both cases, the name given to the settings-configuration application is generic rather than customized and unique, as it is for most desktop environment utilities. Few on either list gave much credence to the notion that KDE had "dibs" on the generic usage of the name System Settings. Generic names, after all, are by their very nature going to attract name collisions. Shaun McCance observed that "you just can't expect to own generic names across desktops."

Jeremy Bicha even pointed out that a previous name collision between the two projects happened in the other direction, with KDE duplicating the name System Monitor, which GNOME had already been using for years:

There's no evidence to believe that KDE was trying to cause a conflict then, nor is there any evidence that Gnome is doing that now. Unproven allegations like these encourage the criticized party to get defensive and start attacking back, or just not want to listen. Please look for solutions instead of conspiracies.

When on an entirely-KDE or entirely-GNOME system, the name of the other environment's configuration tool theoretically should not matter, but when users install applications from the other environment, the other tool could get pulled in as a dependency, and users are faced with two menu entries named "System Settings." As several people on the thread pointed out, simply renaming one tool or the other to "System Preferences" does not solve the problem, as in either case it is unclear which tool is associated with which environment. Niklas Hambüchen added that although "preferences" and "settings" may be two different words in the English translations of the strings, in many other languages the two tools might still end up using the same word.

GNOME uses an OnlyShowIn= key in its .desktop file to make its System Settings entry appear only in GNOME Shell and Unity, so users running KDE (but using some GNOME applications) do not see the name-colliding menu entries. But as Cooksley and Bicha pointed out, the same solution does not work for KDE, because a substantial number of KDE applications expect the KDE System Settings tool to be available in the menu, even when running under GNOME (or another environment).

McCance suggested that each configuration tool include two .desktop files (which are used by both environments to build the system menus): one for the "native" environment which would use the generic "System Settings" name, and one for the non-native environment, which would prepend "GNOME" or "KDE" to the name, for clarity. Although that approach is possible under the .desktop specification using the OnlyShowIn= and NotShowIn= keys, Cooksley said it was already too late to make the change in KDE's .desktop files because the project had already frozen for its 4.7.0 release in August.
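A sketch of what McCance's proposal might look like for KDE's tool; the file names and Exec= values here are illustrative, not the project's actual .desktop entries:

```ini
# systemsettings-native.desktop: generic name, shown only inside KDE
[Desktop Entry]
Type=Application
Name=System Settings
Exec=systemsettings
OnlyShowIn=KDE;

# systemsettings-other.desktop: qualified name, shown everywhere else
[Desktop Entry]
Type=Application
Name=KDE System Settings
Exec=systemsettings
NotShowIn=KDE;
```

Both OnlyShowIn= and NotShowIn= are standard Desktop Entry Specification keys, which is why this approach needs no changes to the specification itself.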

Several others felt that supplying two .desktop files for a single application was inelegant, and that the .desktop specification needed patching to specifically support applications that provide different names in different environments. User markg85's recommendation involves adding a NativeDE= entry along with a NameNonNative= key that would provide the alternate name.

On the kde-core-devel list, Ambroz Bizjak offered up a slightly different proposal, in which each application would include a Name= key (as they do currently), but add a Specific-KDE-Name= key for use in KDE, and a Specific-GNOME-Name= key for GNOME, etc. The debate over the difference between those two proposals (and variations of each) is currently ongoing on kde-core-devel.
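A fragment illustrating Bizjak's proposal; note that these Specific-*-Name= keys do not exist in the current .desktop specification, they are the proposal under discussion:

```ini
[Desktop Entry]
Name=System Settings
Specific-KDE-Name=System Settings
Specific-GNOME-Name=KDE System Settings
```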

KDE applications and configuration

A tangent arose in the initial discussion over tool names asking why a KDE application would depend on the external KDE System Settings tool's presence when running under GNOME. Alex Neundorf said there were many configuration issues that could only be set through KDE System Settings, such as "widget style, colors, printing, file associations etc." Cooksley added Phonon, keyboard shortcuts, date/time and localization, and theme.

Giovanni Campagna insisted that those examples should actually be classified as bugs (either in KDE or in the particular application), because the majority of the settings in question should be accessible to applications regardless of which desktop environment is running, whether through XSETTINGS, D-Bus, or other means. The KDE Wallet password-storage application mentioned by Cooksley, for example, should be used if the environment is KDE, but all KDE-based applications should follow the org.freedesktop.Secrets setting, which will direct them to gnome-keyring if the environment is GNOME. Emmanuele Bassi said that most GTK+-based applications currently do adhere to the Freedesktop standards.

Aurélien Gâteau commented that he has been patching KDE applications to do better in this regard, so that "isolated" KDE applications will more closely follow the behavior of generic Qt applications, and pick up the configuration settings set by the environment. He said that there were "very few" applications that can only be configured through a KDE Control Module (KCM) (the type of component presented in KDE System Settings); all others should be completely configurable through their own Settings menus.

The effort to standardize KDE application behavior is obviously ongoing. Later in the thread the personal finance manager KMyMoney came up as another example of an application that relies on KCM components in KDE System Settings to configure its localization settings. Ryan Rix pointed out that KMyMoney could embed the localization KCM.

As for XSETTINGS support, Frédéric Crozat commented that he had written KDE support for the specification in 2007, but that the code had yet to be merged. Gâteau added that he was under the impression that the specification was still in the draft stage, and not ready for public consumption.

KWord developer Thomas Zander said that the whole situation should be treated as a "call to action":

This shows that our system settings actually is only for KDE based applications. [...] Today we realized that Gnome apps don't use our settings and KDE apps need some KCMs that have no Gnome equivalents. And thats not something to get mad about when others work around it, I would personally see that as a call to action.


The long-term response certainly is to get out of the situation where KDE apps can't be configured without KDEs system settings application. I'll personally take a look at my app; KWord. I have to figure out if Gnome (or Windows) users can configure their locale so we don't have a default of A4 for users that want a Letter sized page.

The short-, medium-, and long-term

It is still not entirely clear what the KDE developers' plan is for 4.7.0. Cooksley concurred with McCance's proposal to use OnlyShowIn= and NotShowIn= keys as a "medium" term solution. When asked why he could not make the changes to KDE System Settings' .desktop files in Subversion and have a "short" term fix ready before August, though, he replied that as per the KDE Release Schedule, only build fixes are permitted after the freeze date.

In the medium term, it does appear that KDE will take the dual-.desktop approach, and that the discussion over additions to the .desktop specification is an attempt to find a "long" term solution. The longer that discussion continued, however, the more people began to comment that the truly long-term approach would be to obviate the need for every environment to provide its own set of system settings tools, particularly when the tools control the same underlying cross-desktop specifications. Two silos of settings are bad enough, but two tools controlling the same settings is a scenario with problems of its own.

For that problem, no one has yet drafted a proposal. But it is not only the KDE camp that recognizes the issue; in his proposal to the desktop-devel list, McCance argued that working on a shared groundwork was the best path forward, saying "if a user has to set his language in two different applications just because he happens to use applications written in two different toolkits, we have failed miserably." The good news is that the KDE and GNOME teams will both be in Berlin the second week of August for the Desktop Summit. Hopefully the long-term answer will inch a little closer to the present as a result.


IKS: Toward smarter content management systems

July 27, 2011

This article was contributed by Koen Vervloesem

Interactive Knowledge Stack (IKS) is an open source project focused on building an open and flexible technology platform for semantically enhanced Content Management Systems (CMS). Recently, the project held a workshop in Paris, "myCMS and the Web of Data", where some IKS tools were presented and where users of the IKS framework demonstrated how they used the project's semantic enhancements in their CMSes. According to the organizers, the event attracted 90 participants.

IKS is a collaboration between academia, industry, and open source developers, co-funded with €6.58 million by the European Union. The goal is to enrich content management systems with semantic content in order to let the users benefit from more intelligent extraction and linking of their information. In other words, as researcher Wernher Behrendt described it in his introduction of the workshop: "The vision of IKS is to move the CMS forwards in the domain of interactive knowledge." Anyone can participate in this vision, for instance by adding their input to the user stories page on the project's wiki.

All of the code for the various IKS projects is provided under a permissive open source license, either BSD, Apache, or MIT. This is expressly done to pave the way for commercial use of IKS. Two of the software components of the IKS stack that are already in good shape are Apache Stanbol (a Java-based software stack to provide semantic services) and VIE (Vienna IKS Editables, a solution to make RDFa encoded semantics browser-editable).

Semantic applications


In his keynote speech "From Semantic Platforms to Semantic Applications", Stéphane Croisier emphasized some problems with current semantic technology solutions. There is a lot of development happening around Linked Data, natural language processing, entity extraction, ontologies, and reasoners, all of it making big promises, but these solutions are moving slowly. Croisier investigated some of them in a so-called "one-week reality check", and he didn't like what he saw:

Many of the semantic web solutions are not ready for multi-language environments, which is especially in Europe a big problem, or they have poor scalability. Others have a steep learning curve, and this industry is also plagued a lot by fanaticism and religious wars, like we had in open source five years ago. All these factors prevent mainstream adoption of the semantic web.

But the problems are not limited to the technical level. According to Croisier, the next key challenge is improved user experience:

Current user interfaces for the semantic web are ugly and not user-friendly. One of the reasons is that the budgets go mostly to platform development, not to development of the user interface, which is probably because many semantic web projects are born in universities and have an academic approach, focusing on the technology. But it doesn't have to be this way, and if we want a breakthrough of the semantic web, we better start working on good user interfaces.

At the same time, Croisier expressed his hope that developers will move their efforts from semantic platforms to semantic applications, or in other words "migrate from the geek to the practitioner". Only geeks are excited by the platform stuff like RDF (Resource Description Framework), ontologies and REST (REpresentational State Transfer) interfaces, but the industry needs some smart content applications. However, there seems to be a barrier to overcome, as Croisier admitted: "We're all still trying to find the killer app for the semantic web."

Apache Stanbol

After Croisier's talk, a couple of early adopters showed their demos of applications built on Apache Stanbol, the open source modular software stack for semantic content management initiated by IKS. Stanbol components are meant to be accessed over RESTful interfaces to provide semantic services, and the code is written in Java and based on the OSGi modularization framework.

Stanbol has four main features to offer to applications using its services: persistence (it stores or caches semantic information and makes it searchable), lifting/enhancement (it adds semantic information to unstructured pieces of content), knowledge models and reasoning (to enhance the semantic information), and interaction (management and generation of intelligent user interfaces). If you want to take a peek at the possibilities, there's an online demo: just paste some text into the form and run the engine to look at the entities Stanbol finds. There are also some installation instructions in the documentation to run a Stanbol server yourself. Because Stanbol has a RESTful API, it's also easy to test it with a command line tool like curl.
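Because the interface is plain HTTP, a client for the enhancer can be sketched in a few lines. The base URL and the endpoint path below are assumptions about a default local install, not guaranteed details of the Stanbol API; only the request is built here, nothing is sent:

```python
# Sketch of a request to a (hypothetical local) Stanbol enhancer:
# POST plain text, ask for the extracted entities back as RDF.
import urllib.request

def build_enhance_request(text, base="http://localhost:8080"):
    """Build (but do not send) the enhancement request."""
    return urllib.request.Request(
        base + "/engines",                     # assumed default endpoint
        data=text.encode("utf-8"),
        headers={"Content-Type": "text/plain",
                 "Accept": "application/rdf+json"},
        method="POST")

req = build_enhance_request("Paris is the capital of France.")
print(req.get_method(), req.full_url)
```

Sending the request with urllib.request.urlopen(req) against a running Stanbol instance would return the enhancement results; the same call is what a curl one-liner performs.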

At the IKS workshop, some integrators showed how they integrated Stanbol into an open source CMS. For instance, the London-based company Zaizi showed an integration with the enterprise content management system Alfresco. The code for this integration is licensed under the LGPL and there's a website with some information and installation instructions. The semantic engine extracts entities from Microsoft Office, ODF, PDF, HTML, and plain text documents uploaded to Alfresco and shows the entities next to the content details. The entities can also be selected in Alfresco's interface to list all other documents classified with that entity.

Jürgen Jakobitsch from the Austrian company punkt. netServices presented its Drupal plugin to integrate Stanbol. The current version is targeted at Drupal 6, but an update for Drupal 7 is coming soon. The module enables tag recommendations as well as semi-automated semantic annotation. The open data website of the Austrian government is running this Drupal/IKS integration.

Andrea Volpini and David Riccitelli from the Italian company InsideOut10 presented WordLift, an open source plugin to enrich textual content on a WordPress blog using HTML microdata, which is easy for search engines to parse. When writing a blog post, the content is sent to Stanbol, and the entities it finds are added as Google Rich Snippets markup. The user can then select which of the found entities are relevant. It's all still quite experimental, but the developers' target is clear: spoon-feeding HTML microdata to the search engines using semantic web technologies. According to Volpini, the source code of the plugin will be published in a few weeks.

In addition, Olivier Grisel from the French open source ECM (enterprise content management) company Nuxeo presented their Semantic Entities module for the Nuxeo CMS and Juan A. Prieto presented the integration of Stanbol with the semantic CMS XIMDEX.

Vienna IKS Editables


The other key component of the IKS software stack is VIE (Vienna IKS Editables), presented at the workshop by the main developer, Henri Bergius. The idea is to "build a CMS, no forms allowed", as people don't like forms ("forms are only for communication with the government," according to Bergius). To make this possible, the CMS and some JavaScript code must agree on the content model, and this is what VIE offers: it understands RDFa, a semantically annotated version of HTML.

If you annotate your website (or CMS) with RDFa, suddenly JavaScript code can understand the meaning of your content. VIE is an MIT-licensed browser API for RDFa, bridging RDFa to JavaScript. It depends on Backbone.js and jQuery, and it reads all RDFa annotated entities as JavaScript objects on a page where the library is loaded. These objects can then be edited by the user in the browser, and changes are synchronized with the server and the Document Object Model (DOM) in the browser.

The big promise of VIE is that it is independent of the CMS: the same lines of JavaScript work on Drupal, WordPress, TYPO3, and any other CMS that provides an implementation of the Backbone.sync method. Apart from implementing that method, you only have to mark up your content with RDFa, include vie.js in your pages, and write some JavaScript code; those three tasks are all independent of the underlying CMS.
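A minimal sketch of what such RDFa markup might look like; the vocabulary and identifiers here are invented for the example, not taken from any particular CMS:

```html
<!-- VIE can read this entity as a JavaScript object and make the
     title editable in place, syncing changes back via Backbone.sync. -->
<div about="http://example.org/posts/1"
     typeof="sioc:Post"
     xmlns:sioc="http://rdfs.org/sioc/ns#"
     xmlns:dcterms="http://purl.org/dc/terms/">
  <h1 property="dcterms:title">Hello, semantic world</h1>
</div>
```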

On top of VIE, there's also VIE^2 (Vienna IKS Editable Entities), which talks to services like Stanbol and OpenCalais to find related information for your content. To show what's possible with VIE and VIE^2, the IKS developers created an online collaborative meeting tool.

Surviving after EU funding


The IKS project was started in 2009 as a 4-year EU project, but how will it survive after the project (and the funding) is done? Bertrand Delacretaz from Adobe had some advice. Apart from being a developer at Adobe, he is also a member of the board of directors of the Apache Software Foundation. Stanbol has been an Apache Incubator project since 2010, and the developers would like it to graduate to a full Apache project, preferably before the end of 2012, because that is when the IKS project (and hence the funding) stops.

There are, however, some criteria before an Apache incubator project is allowed full project status. Delacretaz gave two examples: all communication about the project has to happen on the -dev mailing list, and there have to be at least three legally independent committers (with different income sources). The latter is currently a problem for Stanbol, because too many committers get funded by the IKS project. So Delacretaz would like to see more (external) committers for Stanbol to secure its future.

In search of the killer app

At the end of the conference, the organizers announced the IKS Semantic CMS UI/X Competition. Project manager John Pereira said that the first 1.5 years of IKS were focused on infrastructure, but now the focus has shifted to the users. In the contest, the IKS project will give two awards of €40,000 to CMS developers who build "killer user experiences and user interfaces" on top of IKS technology. Anyone with an idea for a killer semantic application can enter the contest.

Of course there are some conditions. The proposed solution should reuse as many IKS components as possible, it should ideally be easy to implement, and it should focus on providing a compelling semantic experience. Ideas can be found in the list of semantic UI/X user stories. The awards will let the winners finance the development of their proposed solutions; in exchange, the deliverables have to be released under a permissive open source license. Proposals should be submitted before November 2011 (there is no online form yet; for the moment, they go to John Pereira by email); the five best will be shortlisted and invited to pitch their proposals at the J. Boye Conference in November 2011, where the two winners will be selected.

There are some striking parallels between the promises of the semantic web and the "year of the Linux desktop" meme. Since at least 2000, IT magazines and web sites have been declaring every year as the year of the Linux desktop, in the sincere hope that that year would see a breakthrough in Linux adoption by businesses and home users on desktop computers. In the same way, the press has been writing about small success stories of semantic web technology, with the expectation that a breakthrough would soon follow. However, although most of the technology under the hood is ready, it looks like we still have to wait a while for this "year of the semantic web". What the IKS workshop made clear is that there is a lot of work to do at the user-interface level. VIE looks like an interesting component for semantic web user interfaces but, as many of the speakers made clear, the whole industry is still desperately searching for that killer app.


Page editor: Jonathan Corbet

Copyright © 2011, Eklektix, Inc.
Comments and public postings are copyrighted by their creators.
Linux is a registered trademark of Linus Torvalds