
Leading items

LinuxCon: Keeping open source open

By Jake Edge
September 23, 2009

Keith Bergelt, CEO of the Open Invention Network (OIN), described the circumstances which led the company to recently purchase 22 Microsoft patents, as part of a talk at the first LinuxCon. While the circumstances surrounding that purchase were quite interesting—and indicative of Microsoft's patent strategy—he also described the mission of OIN as a protector of Linux from patent trolls. Because patents are likely to be a threat to Linux for a long time to come, organizations like OIN are needed to allow Linux development to continue with as few patent impediments as possible.

Linux Foundation (LF) executive director Jim Zemlin introduced Bergelt by noting that OIN had done a great service for the Linux industry and community by purchasing those patents, which otherwise would have gone to "non-operating" companies—essentially patent trolls. Bergelt caught wind of the sale and headed off what might have been a potent attack against Linux, Zemlin said.

OIN was started four years ago by six companies (Sony, IBM, NEC, Red Hat, Philips, and Novell) to anticipate and preempt these kinds of patent sales, Bergelt said. It is a "very unusual entity", and when he was approached to be its CEO, it took him some time to understand the "active benevolence" that is OIN's mission. The members put a "very significant amount of money" into OIN, which means that, unlike a pledge fund, the capital is available, allowing Bergelt the autonomy to make decisions about how to deploy it.

OIN licenses its patents for use by others, with the proviso that those companies not assert their patents against Linux. It is, essentially, a defensive patent pool for the entire Linux community.

He sees the mission of OIN as allowing Linux to "be beneficial, at a macro level, to economic growth", by reducing the patent threat. The most recent patents were purchased from Allied Security Trust (AST), which represents its 15 members (including three that Bergelt named: HP, Ericsson, and IBM) by buying patents, licensing them to the members, and then reselling the remaining rights on the open market. Bergelt contrasted AST and OIN, saying that the latter is not just representing the six companies who are its members, but is, instead, "representing society". In his view, "patents will continue to exist", so it is important to "ensure that they don't have a negative impact on Linux in the future".

Bergelt described Microsoft's patent suit against TomTom as being part of the software giant's "totem strategy". By getting various companies to settle patent suits over particular patents, Microsoft can erect (virtual) totem poles in Redmond, creating a "presumption of patent relevance". According to Bergelt, Microsoft tends to attack those who try to create parity with it in some area, which TomTom did. But TomTom had overextended itself with a large amount of debt from its acquisition of mapping company Tele Atlas. That made it an opportune time to put the squeeze on TomTom, which is exactly what Microsoft did.

But, Microsoft was surprised to find that TomTom had allies in the form of OIN and others. Originally, Microsoft had asked for an "astronomical" sum to settle the suit, but after TomTom joined OIN and countersued Microsoft, the settlement number became much smaller. In fact, it was small enough that it was not necessary to report the amount under Dutch securities regulations. Because the cost to defend a patent suit—even successfully—could be upwards of $14 million, the TomTom board really had no choice but to settle.

But, patent suits are generally fairly high-profile, and there are other means to attack Linux companies more quietly. One of those is to sell patents to "non-practicing" (or "non-operating") entities who have no other business besides patent litigation. These trolls do not have any products that could be the target of patent countersuits, which is a standard way of combating patent suits. Bergelt said that $20 billion has been spent this decade by multiple organizations to acquire patents for trolling.

Companies with large patent portfolios have been pressured by investors to use those patents to generate revenue. One way to do that is to sell them to trolls, which brings in money and insulates the company from actually bringing suit itself. In some cases, this has led to patent trolls attacking the customers of the company who originally held the patents, Bergelt said.

Over the last three years, OIN has been one of the three largest patent acquirers, so it could not have been an oversight that Microsoft did not approach OIN about buying these patents. The bundle of patents was expressly presented as being relevant to Linux, which has the effect of "pointing the troll in the right direction", according to Bergelt. He clearly indicated his belief that this was an attempt to attack Linux by proxy; Microsoft would have "plausible deniability" because it could claim the patents were sold to a defensive patent pool such as AST.

But, AST is required to resell the patents it acquires, after licensing them to its members, within 12 months of purchasing them. Normally it would sell them to trolls, but Bergelt was able to arrange a purchase by OIN. He noted that if you wanted to get patents to trolls, but keep your hands "clean", selling them to AST is the right way to do it. Going forward, though, there is a patent treaty forming between AST and OIN, which should help alleviate this particular problem in the future.

The Data Tern/Amphion patent suit against Red Hat, which was based on a relational database patent, was also noted by Bergelt as a successful defense of free software from a patent threat. Red Hat settled the suit on behalf of the community as a whole, rather than allow further suits against free software to be filed. Bergelt said that Data Tern/Amphion was "not anti-Linux", in contrast to Microsoft's intent, but was focused purely on the return on its investment in buying the patent.

Intellectual Ventures is an organization to keep an eye on, Bergelt said, as it has some 23,000 patents, more than any other non-practicing entity. Three weeks ago, it started selling some of its patents—to patent trolls. OIN is also approaching patent trolls to suggest that they contact OIN before suing Linux companies. In some cases, OIN has averted lawsuits by acquiring patent rights from trolls.

The 22 patents in question are listed on the OIN website, but they aren't separated from the rest of the patents that OIN has acquired. They were all issued to either Microsoft or SGI originally, though, Bergelt said, which should assist anyone wishing to study what the patents cover. He noted that they are not the OpenGL patents, as some thought, because those are believed not to read on Linux.

In addition to acquiring patents, OIN has several other projects that are meant to reduce the patent problems for Linux. Peer to patent and post-issue peer to patent are both meant to "crowdsource" the process of finding prior art for patents that are in process or those that have already been issued. The former is meant to help the Patent and Trademark Office (PTO) so that bad patents don't get issued, while the latter looks for bad patents so that they can be submitted to the PTO for re-examination.

Defensive publications are another strategy that companies can take to protect their ideas without patenting them. OIN is advocating the use of defensive publication to create prior art, so that, in the best case, patents will not be granted covering those ideas. Instead of the "negative right" that is created with a patent, defensive publication creates something that everyone can use, but no one can patent. OIN's lawyers will review defensive publication submissions for free, making any necessary changes and then adding them to the database which is used for prior art searches by the PTO.

Companies who want to patent their ideas can also use defensive publication by patenting the core idea and wrapping that core with published information. This is happening more frequently because the cost of a patent application is becoming "prohibitive". OIN is encouraging the community to use defensive publications to protect its ideas as well.

Bergelt stressed that OIN is not set up as an anti-Microsoft organization; it is focused on any entity that threatens Linux with patents. In the most recent case that entity was Microsoft, but his expectation is that "Microsoft will go through a painful transition" and will eventually join the free software community. The benefits of free software development will be too strong to resist.

In closing, both Zemlin and Bergelt mentioned the Linux Defenders project, which is a joint venture between OIN, LF, and the Software Freedom Law Center. It is the umbrella organization for the peer to patent efforts along with the defensive publication initiative, but it also seeks to counsel companies who have been approached about patents that read on Linux. Zemlin noted that the traditional approach is to get a potential victim to sign a non-disclosure agreement (NDA) before discussing the patents in question. He stressed that companies should get in touch with Linux Defenders before signing the NDA, as that seriously limits what help it can provide.

In response to questions from the audience, Bergelt noted that there is some hope for patent reforms, which may "narrow the space" for trolls to work in. Judges are starting to recognize the problem, he said, but wholesale changes are probably not in the cards. In addition, he noted that even defining "non-practicing entity" is difficult, pointing to Qualcomm as an example of a company that was not very successful at using its patents in products, but quite successful in licensing them to others.

He also sees hope at the PTO. Fewer poor patents are being issued and far fewer patents are being issued overall. Things are changing, but they will never be as good as we want them to be, he said.

Comments (37 posted)

LinuxCon: Some advice from Uncle Dirk

By Jonathan Corbet
September 23, 2009

Dirk Hohndel has been a member of our community since the earliest days. In recent years, he has helped direct Intel's (very friendly) strategy toward Linux - a job which has required, one assumes, a great deal of educational work inside the company. Dirk also spends a fair amount of time outside of Intel, advising the community on how it can work better with vendors, with customers, and with itself. His thoughtful talks on the topic are usually well worth hearing. In two separate talks on the first day of the first LinuxCon, Dirk had some fairly general thoughts on how the next steps toward world domination can be taken.

When ASUS created the netbook market, its disruptive new machines all ran Linux. The development community welcomed this news, which seemed like a validation of much of what we've been doing all these years. But it did not take very long before Microsoft was announcing that the vast majority of netbook systems were now shipping with Windows instead. How is it, Dirk asks, that Windows is able to displace Linux on systems like netbooks?

Part of the problem, certainly, was the second-rate distribution which was shipped with the early netbooks. It suffered from what Dirk calls the "three click problem." When the system is first turned on, everything looks great. But, by the time the user gets three clicks into the system, it's clear that it is an unfinished product. Obvious problems - configuration dialog boxes for applications which do not fit on the small screen, for example - are everywhere. So it does not take long for users to feel that they have not gotten what they really wanted.

But the bigger problem, says Dirk, is that the systems installed on these devices are trying to be Windows. They are trying to beat Microsoft at its own game, and that is a difficult strategy at best. If the ultimate goal of a development project is to copy somebody else, it is inevitable that the project will always be behind its target. It will never be a perfect copy, and users will know. The user's experience will always be less than it could be with the original.

An example is's attempt to copy the "ribbon" interface found in Office 2007. The effort is already two years behind, the ribbon is not that great an interface in the first place, and will not implement it as well as Microsoft did. Suffice to say that Dirk does not appear to be much impressed by this particular initiative. Similarly, attempts to copy the iPhone in mobile devices are doomed to an always-inferior existence. There has to be a better way.

That better way, says Dirk, is to move past the desktop metaphor which was never all that great an idea in the first place. People who are buying computers now are not interested in desktops, and they do not really care about the operating system they are running. What they want is to join communities. So the most important thing we should be doing, in the design of our applications and interfaces, is to better connect users with the communities they are interested in.

On the issue of design, Dirk made the claim that we have few real designers in our communities. Indeed, the processes in many communities seem to have the explicit goal of encouraging people interested in design to go elsewhere. One partial exception might be KDE; Dirk claims that KDE applications tend to be nicer because Nokia (and Trolltech before it) have put true design resources into the Qt toolkit. In general, though, we are not doing a good job of reaching out to designers, but we need those designers if we are going to create great systems.

The closing note of this talk was simple: listen to the users. And, by "users," he did not mean the people in the room, but the much wider user community that we need to reach.

Dirk's second talk filled a brief keynote slot; it was called "how to shine in a crowded field." The specific crowded field he was talking about was consumer electronics, which is packed with devices in search of customers. In this market, success is not something that just happens. There are, says Dirk, four things which are required.

The first of those is vision. There are, he says, plenty of visionaries out there, even if many of them do not see as far as they might think. We need those visionaries - just following others is, as was described above, not the way to be successful. Our community needs people who are not stuck doing things the way they have always been done.

The second requirement is competence - the ability to actually implement the visions. One of the nice things about the open source world is that competence is very much on display. We can (relatively) easily measure the competence of others, and our own competence as well. We are very free to learn from each other and quickly improve our competence.

Then there's commitment. Without commitment, developers will not see the task through to the end. And, just as importantly, users need to see that commitment. They need to know that the developers will be around, that they are serious, that they will respond to bugs, and that they will continue to carry the code forward. That said, open source makes users less dependent on the commitment of others. When a proprietary software vendor abandons a body of code, there is nothing the users can do about it. Open source software can be picked up and carried forward by others.

Finally, there is the matter of focus. Without focus, we will lose; there are simply too many distractions which can get in the way.

So how does the community do in these areas? We have visionaries, though Dirk would like to see more of them who are willing to go further off the beaten path. For competence, Dirk suggests downloading a random SourceForge project and looking at the code. That, he says, will make one question whether the open source community possesses any competence at all. Commitment, too, is on display at SourceForge - most projects there are inactive and going nowhere.

And focus, he says, is really hard. As a result, open source projects are highly susceptible to the 80/20 problem. The first 80% of the work is fun, but the task of actually finishing the job is less so, so it often doesn't happen. The result is a surfeit of 80%-done programs that have been abandoned. We have, he says, 55 bad spreadsheets out there when we could have three really good ones. If we could stick to the projects we have, rather than yielding to the temptation to start some new, shiny project, we would be in much better shape.

Another example is the nearly 300 active distribution projects out there; it would be better to have fewer choices which were more complete. Given that, one might ask why Dirk's group went off and created Moblin - yet another new distribution. His answer (to his own question) was that they studied the available distributions and couldn't find one which they thought they could carry forward to a full implementation of the vision they had for Moblin. They needed to start anew, he said, to be able to commit to reaching the end.

In conclusion, Dirk says, the recipe for standing out is relatively straightforward: listen to the users, implement the whole vision, and go someplace where others have not been.

Comments (41 posted)

Some shots from the Golden Penguin Bowl

By Jonathan Corbet
September 23, 2009

The traditional Golden Penguin Bowl made a reappearance in a new venue at LinuxCon on September 23. Gracious host Jeremy Allison led the Nerds (Jono Bacon, Joe Brockmeier, and Matt Domsch) in their victorious trivia battle against the Geeks (Greg Kroah-Hartman, Ted Ts'o, and Chris Wright). It was a grueling event requiring detailed knowledge of Arthur C. Clarke books, bad science fiction movies, archaic architectures, Rick Astley lyrics, and remote-control helicopter piloting. Here are a few photos from the event.

[photo] Our host, Jeremy Allison
[photo] The Nerds: Jono Bacon, Joe Brockmeier, and Matt Domsch
[photo] The Geeks: Greg Kroah-Hartman, Ted Ts'o, and Chris Wright
[photo] The crowd gets ruthlessly rickrolled by the Nerds and the MC
[photo] Chris Wright takes the controls; Ted Ts'o does his best to stay out of the way.
[photo] We didn't need all those parts anyway, right?
[photo] Matt Domsch achieves liftoff.

Comments (3 posted)

TomTom unveils OpenLR location-referencing format

September 23, 2009

This article was contributed by Nathan Willis

On September 8, GPS device maker and mapping service provider TomTom pulled back the curtain on what it hopes will become an industry-wide standard for location referencing and dynamic route guidance. OpenLR, as it is known, is designed to allow heterogeneous applications and services to exchange location information in a compact, map-agnostic manner, which would ease the burden of interoperability between web map services, car navigation devices, and other content systems that provide location-sensitive data such as public safety warnings. TomTom said it wants OpenLR to be a royalty-free, open specification, with a GPLv2-licensed encoder and decoder to follow shortly.

The company has long used Linux and open source software in its hardware products, which led to the famous patent lawsuit with Microsoft in February of 2009, over the VFAT filesystem. TomTom counter-sued Microsoft for patent infringement, and the two companies settled out-of-court in March. Despite its history with the open source community and development model, OpenLR is TomTom's first attempt at launching a completely new open source project of its own.

OpenLR bird's eye view

The problem OpenLR is designed to solve is the rapid exchange of location-relevant content between independent data providers, aggregators, and end-user devices. OpenLR is not a geographic coordinate system (such as World Geodetic System 84 (WGS 84)) or a markup language akin to KML or GPX. Rather, OpenLR focuses on encoding location reference points (LRPs) using a combination of coordinates and attributes, such as functional road class (FRC) and form of way (FOW), that describe the LRP in terms of its physical characteristics. Thus, an application using a map from a web-based mapping service and directions from a GPS device can decode an LRP using multiple factors and determine that both refer to the same location, even if the underlying maps use different formats or disagree slightly on coordinates.
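One way to picture that map-agnostic matching is a decoder scoring candidate road segments on its own map against a received reference, combining coordinate proximity with agreement on attributes like FRC and FOW. The code below is a purely hypothetical sketch — TomTom's decoder has not yet been released, and the dictionary keys, weights, and scoring function here are all assumptions for illustration:

```python
import math

def score(ref, cand):
    """Lower is better: coordinate distance plus attribute-mismatch penalties.

    A real decoder would use proper geodesic distance and carefully tuned
    weights; the flat 0.01 penalties here are illustrative only.
    """
    dist = math.hypot(ref["lon"] - cand["lon"], ref["lat"] - cand["lat"])
    penalty = 0.0
    if ref["frc"] != cand["frc"]:
        penalty += 0.01  # a different road class makes the match less plausible
    if ref["fow"] != cand["fow"]:
        penalty += 0.01  # likewise for a different physical type of road
    return dist + penalty

def best_match(ref, candidates):
    # pick the candidate segment on the local map with the lowest score
    return min(candidates, key=lambda c: score(ref, c))

ref = {"lon": 4.9000, "lat": 52.3700, "frc": 2, "fow": 3}
candidates = [
    # slightly off in coordinates, but the road attributes agree
    {"id": "a", "lon": 4.9001, "lat": 52.3701, "frc": 2, "fow": 3},
    # coordinates match exactly, but it is a different kind of road
    {"id": "b", "lon": 4.9000, "lat": 52.3700, "frc": 0, "fow": 1},
]
```

In this toy scenario the attribute agreement outweighs the tiny coordinate discrepancy, so candidate "a" is chosen even though "b" sits exactly on the reference coordinates — the multi-factor decoding idea the specification describes.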

In spite of the name "location reference point," an LRP as defined by OpenLR is more like what a mathematician might call a directed graph edge: it has a start and end node, a bearing (compass direction), and a length. This reflects OpenLR's underlying goal of describing travel rather than precisely pinpointing stationary objects, but the terminology could still be confusing for newcomers. FRC and FOW likewise focus attention on roads; FRC is defined as a number running from FRC 0 ("main road") and FRC 1 ("first class road") all the way down to FRC 7 ("other road"). FOW describes the physical type of road: motorway, roundabout, traffic square, and so on.
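As a concrete sketch of the description above — the class names, field names, and enum values here are illustrative assumptions, not taken from the OpenLR specification — an LRP might be modeled like this:

```python
from dataclasses import dataclass
from enum import IntEnum

class FRC(IntEnum):
    """Functional road class: FRC 0 ("main road") down to FRC 7 ("other road")."""
    MAIN_ROAD = 0
    FIRST_CLASS_ROAD = 1
    OTHER_ROAD = 7  # intermediate classes 2-6 omitted for brevity

class FOW(IntEnum):
    """Form of way: the physical type of road (values are illustrative)."""
    MOTORWAY = 1
    ROUNDABOUT = 2
    TRAFFIC_SQUARE = 3

@dataclass
class LocationReferencePoint:
    lon: float           # WGS 84 coordinates of the point where this "edge" starts
    lat: float
    bearing_deg: float   # compass direction toward the next point on the path
    length_m: float      # distance along the road network to the next LRP
    frc: FRC             # functional road class of the outgoing road
    fow: FOW             # form of way of the outgoing road

# an LRP is closer to a directed graph edge than to a single pinpointed spot:
lrp = LocationReferencePoint(lon=4.900, lat=52.370, bearing_deg=90.0,
                             length_m=1200.0, frc=FRC.MAIN_ROAD,
                             fow=FOW.MOTORWAY)
```

Note that the bearing and length describe the road leaving the point, which is what makes the "directed edge" reading more natural than the "point" one.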

The primary use case TomTom outlines for OpenLR is to describe "line locations," which it defines as the concatenation of shortest paths covering a set of LRPs. OpenLR itself does not calculate the shortest or best path between a start LRP and end LRP; it merely provides a way for the software to encode it for exchange in a bandwidth-friendly way. OpenLR is not concerned with other map elements found along the way, such as geographical features or points of interest (POIs).
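Concretely — again a hypothetical sketch, since no encoder has yet been published — a line location can be thought of as an ordered list of LRPs, where each intermediate point carries the length of the shortest path to its successor:

```python
# each LRP as (lon, lat, bearing_deg, meters_to_next); the final LRP only
# terminates the location, so its onward bearing and distance are zero
line_location = [
    (4.888, 52.372, 90.0, 1200.0),
    (4.905, 52.371, 85.0,  800.0),
    (4.917, 52.370,  0.0,    0.0),
]

# the total covered path is just the concatenation of the per-LRP segments
total_m = sum(p[3] for p in line_location)
```

A handful of such tuples is far smaller than transmitting every intermediate road point along the route, which is the "bandwidth-friendly" property described above.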

Routing between selected locations is arguably the easiest scenario to imagine; a device could request a route between two points and receive directions back from a remote server as OpenLR data. In addition, TomTom describes several cases where OpenLR might be used to propagate other information useful to travelers, such as traffic congestion data, public safety warnings, and even cooperative vehicle-to-vehicle communication — all of which share the same need for shortest-path routing information — plus applications useful to municipalities such as real-time urban traffic management and toll-road usage information.

TomTom's OpenLR Introduction [PDF] says that OpenLR is designed to be map-agnostic (meaning that OpenLR data is independent of both the map vendor and map version), communication-channel independent (so it can be transmitted just as easily by radio broadcast or over an IP network), and encoder independent (so that any device, application, or service can unambiguously decode the information sent by any other). The company has posted a more detailed description of the OpenLR data format in a white paper [PDF] available on its web site, including the byte-oriented stream format and details about how to specify each component, from coordinates (in WGS 84) to bearings and distances.

In its presentation, the company explains the value of releasing OpenLR as an open standard — better buy-in from key industry stakeholders, security against intellectual property threats, and the flexibility to expand and enhance the standard in the direction chosen by the community. TomTom has filed for a patent on the core concept in OpenLR, but says that it will publish the method used in the patent in its GPL-licensed encoder and decoder implementation. The documentation itself is published under the Creative Commons CC-BY license.

TomTom explains in the presentation that it chose the GPLv2 for OpenLR's license in order to protect free implementations from patent attack, noting that commercial services can still deploy the software. It also says that the license to use OpenLR will include a non-assertion clause. Complete details are provided in a separate license document [PDF].

Although TomTom says it will take the leadership and maintenance role in OpenLR's development, the white paper and presentation both assert that the company wants and expects the open source community to participate in expanding OpenLR, including coverage of different types of data (such as Points and Areas), support for different formatting options such as XML, integration with the GPS and Galileo positioning systems, and integration with the Transport Protocol Experts Group (TPEG) traffic and travel information standard.

The race is on

The core data covered in OpenLR's route-and-traffic exchange usage scenario can also be expressed in other, existing formats. The most widely-known is Radio Data System Traffic Message Channel (RDS-TMC), a format broadcast in a data sideband of standard FM radio transmissions. RDS-TMC is widely deployed in just a few countries, notably Germany, though it is available around Western Europe and North America. RDS-TMC traffic data itself can originate from a number of sources, including government-deployed road sensors, and the format itself is published.

Nevertheless, using RDS-TMC is problematic — particularly for free software — because it encodes the actual locations referenced via a copyrighted data set, one which is limited in size and not easily updated or corrected. A system similar in scope called AGORA-C is proprietary and commercial, relying on licensing and royalty collection, which has led to uncertain commitment from industry players. The TPEG format TomTom alluded to in its presentation is open, but TomTom regards its current location-referencing subsystem (TPEG-Loc) as unsuitable because of a lack of standardized encoding rules.

The market for location-referencing is large; free routing services from the likes of Google and Yahoo do not bring in any revenue, but in-car navigation systems (both built-in and aftermarket) are reportedly a huge and still-growing business. TomTom itself sells navigation software for platforms like the iPhone, and fee-based services for drivers to avoid speed traps and other road hazards. TomTom also owns map maker Tele Atlas, which it acquired in 2007.

Competition between TomTom and mapping rivals like Garmin and DeLorme in this space is fierce; the financial stakes are high and the number of players is low. That is the kind of situation which, as free software advocates will recognize, has prompted the strategic release of a core technology as open source many times before. OpenLR certainly meets a need in the navigation stack; open projects like OpenStreetMap cannot use alternative systems such as RDS-TMC or AGORA-C because of their licensing. Nevertheless, OpenLR's openness is no silver bullet; for it to make a substantial impact it will still have to be adopted by multiple industry players, including traffic data providers.

Of course, an active show of participation on the standard from the open source and open standards communities could go a long way in making that happen. TomTom is expected to present about OpenLR this week at the World Congress on Intelligent Transport Systems. The reaction there will say a lot about the industry's take on the technology. For the open source community's reaction, one will probably have to wait for the still-to-come source code release.

Comments (3 posted)

Page editor: Jonathan Corbet
Next page: Security>>

Copyright © 2009, Eklektix, Inc.
Comments and public postings are copyrighted by their creators.
Linux is a registered trademark of Linus Torvalds