
Eben Moglen returns to LCA

By Jonathan Corbet
January 15, 2015

LCA 2015
One of the defining moments of LCA 2005 was Eben Moglen's keynote, which was mostly focused on the dangers that software patents presented to our community. Ten years later, Eben returned to LCA for another keynote address. While he had some things to say about software patents, it is clear that Eben thinks that the largest threats to our community — and our freedom in general — come from elsewhere.

The last ten years

Eben started by saying that a lot has happened in the last ten years; a lot of great software has been created, and we have fewer enemies than we did then. He had warned us about software patents, and, in the last ten years, the full patent war that he had feared has come to be. It has cost billions of dollars and distorted the industry; companies have done their best to make use of patent monopolies to slow down their competitors. But, in the middle of this, the free software community got a lot of help, and the worst laws around the world began to change.

Some years ago, he had been discussing various ideas for patent pools and other defensive techniques with Richard Stallman; the idea was to be able to sit at the table with at least a few chips with which to negotiate some freedom to operate. They came up with "lots of inoperable plans" and, meanwhile, the business community put together some real patent pools.

But, importantly, during this time judges were starting to figure out that something is wrong with the patent system; they were "losing their enthusiasm for the subject." In the last year, Eben said, we have won three unanimous decisions in the US Supreme Court. That has put the community into an extraordinary position. The patent wars are still raging, but the possibility that the system will be used against free software developers is disappearing. We're not done yet, but the playing field is much more level.

The most interesting aspect of the patent situation in the coming decade will result from the fact that the largest economy in the world will no longer be the United States — it will be China. So the most important patent system in the world will be in China; we will be contending with lots of statutory monopolies in a society without the rule of law. The problem will affect us lightly, he said, but it will hit our industrial partners much more heavily; their response may be one of our most interesting challenges.

So, he concluded, in the last decade we have made a great deal of good software and managed to abate some serious nuisances. We have been at least partially successful in communicating our message; everything from ice cream to weapons is described as "open source" now. What we have done is to show how things will work in the 21st century. 20th century industrial society loved hierarchies; they were intrinsic to the organization of that society. We, instead, have shown the workings of a society based on transparency, participation, and non-hierarchical structures.

Our relationship to transparency is so intimate, he said, that you cannot describe the free software community without it. We are not a business with a big show window; we are a porous community that anybody can join. Participation is thus the outcome of our commitment to transparency. And we don't just do non-hierarchical projects; to a great extent, we invented them. We have taken over because we have shown that non-hierarchical collaboration is how you have to make everything. Nobody can live without our software; even Microsoft recognizes that we have won.

The threat to freedom

This is the structure that we need to face our next big threat: the all-seeing surveillance society that was spotlighted by the Snowden disclosures. It appears that there is little hope that we can prevent powerful interests from turning the net into a tool for totalitarians. We are living in the world, he said, that people like Richard Stallman and Philip Zimmermann were trying to prepare us for.

Eben recounted how he stumbled across the first PGP posting on a FIDO board in 1991; he sent Philip a message congratulating him for his work and for having changed the world. He also pointed out that Philip was "in a shitload of trouble" and that he, Eben, would be there to help.

Try to imagine, Eben said, a world where PGP had been intimidated out of existence. We would have no PGP, no SSH, no OpenSSL. We would be facing "irresistible despotism" now. Even with those tools, we are facing powerful entities that can use surveillance to predict behavior and prevent the coalescence of dissent. We now live in the world that we are afraid of, and we now have a responsibility to improve our inventions and to spread them as widely as possible to fight that outcome.

We are moving toward the net as a single exoskeleton nervous system that embraces all of humanity. Will it be built to be controlled by its users at the endpoints, or will it be controlled from the center? This is the political decision that we will make for the future. We don't have any direct say over what will happen to the climate of the planet, but we can determine the physiology of its nervous system. As was foreseen by our various "crackpot visionaries," our freedom will depend on how we use our technologies. If our political ideas are to survive the surveillance society, we must continue the process of building our political and social theory. We must, he said, become more aware of the political implications of what we do.

Edward Snowden has done the important work, he said. Most people who are connected to the net want to do something to improve the situation; that means they want to meet us more than they did before. In the last decade, "our software was convenient, but we weren't." But "everybody wants to meet us now." We need to go out and explain how we can save freedom together. It would be a convenient time to build freedom into the net. But it is hard to sustain freedom in an engineering environment that is designed to take it away. Free software, instead, is software that preserves freedom, autonomy, and privacy. Nobody in their right mind will use security-oriented software that they can't read anymore.

In other words, Snowden proved that we were right. We had been saying that you cannot trust software that you cannot read; we thought it was an obvious point. Now everybody understands that point.

The first law

Even so, nothing has happened yet for users whose technology sells them out every day. What we need, he said, is a First Law of Robotics for software. If the free software community doesn't implement such a law, nobody will. Big power is committed against the First Law; it wants devices to work for it, not for their owners. If it succeeds, the human race could pay a price that lasts for generations.

So, he said, we have to start taking ourselves a little more seriously. What little has protected us from disaster already has been made in our community. We are why non-hierarchical modes of community exist, why it is possible at all. "Terrific!," he said, we did a great job. But we're not done.

Neither are we alone. When the Diaspora developers decided to try to fix social networking to make it ethical, it was surrounded by a huge community of supporters. But we have also had casualties, people like Aaron Swartz and Ilya Zhitomirskiy. We have to honor the sacrifices these people made.

In the current era, people are beginning to understand that the freedom of the net is the freedom of our society. Now is the time when we find out if we can use what we have done to keep humanity free. If we don't do it, it will not happen. We have narrowly escaped a few catastrophes in the last ten years, and we have gotten a lot of information from Snowden on what we did right. The unlimited resources devoted to breaking our view of the net have failed because of what we have made. We have demonstrated our vitality, and are now merging into the larger movement for freedom of the net. We will soon demonstrate whether we can carry this work forward.

Everybody's power, he concluded, runs on our plumbing. Whether it takes us toward freedom is up to us. We have a lot to do, and we will get lots of help. But, to start, there will be 1.6 billion people living in China without freedom, and the world's flagship democracy has abandoned the rule of law when it comes to surveillance. These forces are pushing us toward darkness. We have a lot to do to prevent that, but, if we don't do it, freedom stops. Free software is necessary for a free society, he said, and "we are playing for keeps now."

[ Your editor would like to thank linux.conf.au for funding his travel to the event. ]


Eben Moglen returns to LCA

Posted Jan 16, 2015 13:52 UTC (Fri) by xav (guest, #18536) [Link] (14 responses)

Wouldn't such a law be against the spirit of Free Software?

Eben Moglen returns to LCA

Posted Jan 16, 2015 14:05 UTC (Fri) by zuki (subscriber, #41808) [Link] (2 responses)

It depends on whose free software. For a long time there have been two camps:
— idealists like FSF and friends for whom the software is just a means for the freedom of the user of the software,
— pragmatists (the open source camp) for whom the openness of the software is a mechanism for better software production.

It conflicts with the second camp's definition, but not the first. The first is imho more important.

Eben Moglen returns to LCA

Posted Jan 27, 2015 17:09 UTC (Tue) by wilck (guest, #29844) [Link] (1 responses)

I recall that the FSF has made it clear on several occasions that free software can't prohibit rogue uses. Nuclear weapon designers or fascists are allowed to use free software like everybody else.

Eben Moglen returns to LCA

Posted Jan 27, 2015 18:08 UTC (Tue) by zuki (subscriber, #41808) [Link]

Well, ... yes. This does not conflict in any way with what I said.

Eben Moglen returns to LCA

Posted Jan 16, 2015 16:11 UTC (Fri) by drag (guest, #31333) [Link]

It's somewhat ironic to talk about 'laws' when the people whose job it is to create and enforce real laws are our greatest threat.

Certainly freedom should be the highest priority, but it would be extremely misguided and self-defeating to try to use something like copyright law (or whatever) to enforce it. It needs to be something philosophical, a guiding principle that we share with other people when it comes to openness, transparency, and freedom.

Eben Moglen returns to LCA

Posted Jan 16, 2015 17:05 UTC (Fri) by Karellen (subscriber, #67644) [Link] (9 responses)

Depends on how you draft it.

"No software may harm *a human*, or through inaction allow a human to come to harm." might be against the spirit of free software, if you wanted the freedom to write a free software program to harm other people.

"No software may harm *its user*, or through inaction allow its user to come to harm" is most definitely in the spirit of free software. It's one of the reasons to have free software, in order that we may chip away at the harms our software does us through bugs or missing functionality.

Eben Moglen returns to LCA

Posted Jan 19, 2015 10:21 UTC (Mon) by ballombe (subscriber, #9523) [Link] (5 responses)

Who is the user? The AGPL shows this is not as clear-cut in the mind of the FSF as we would like.

Eben Moglen returns to LCA

Posted Jan 19, 2015 12:46 UTC (Mon) by mpr22 (subscriber, #60784) [Link] (4 responses)

It seems to me that the AGPL says that a person who remotely accesses your network-facing software's public interface is a user of that software. I don't see anything unreasonable in this, even though I don't like the AGPL and would generally prefer other copyleft or permissive licences on software for any network-facing task other than direct delivery of human-readable text for interactive consumption.

Eben Moglen returns to LCA

Posted Jan 19, 2015 18:34 UTC (Mon) by caitlinbestler (guest, #32532) [Link] (3 responses)

Is a remote user of a web site the user of the software running on that website, or are they a user of a *service* being provided by the user of the software running on that website?

It is the website publisher who selects the software that runs there, so they feel more like "the user" to me.

Eben Moglen returns to LCA

Posted Jan 20, 2015 7:40 UTC (Tue) by KSteffensen (guest, #68295) [Link] (2 responses)

Is a person using Emacs to edit a file a user of Emacs or a user of the *service* being provided by Emacs?

It doesn't matter whether the software is run through a browser or a terminal, whoever is providing the input and receiving the output is the user.

Eben Moglen returns to LCA

Posted Jan 20, 2015 12:58 UTC (Tue) by ghane (guest, #1805) [Link]

> Is a person using Emacs to edit a file a user of Emacs or a user of the *service* being provided by Emacs?

I think a better way to ask this (consistent with post above yours) is:

Is a person using Emacs to edit a file a user of the OS or a user of the *service* being provided by Emacs?

The user is best served by Emacs being free; Emacs may be best served by the OS being free.

And there are turtles all the way down: The OS wants a free kernel, which likes free hardware, etc.

> It doesn't matter whether the software is run through a browser or a terminal, whoever is providing the input and receiving the output is the user.

But that would make my Manager the user, because he gave me the input, and he will make sense of the output. Or maybe the customer who hired my employer to process some data on the Wolfram Alpha website...

So there seem to be turtles all the way up, as well.

Eben Moglen returns to LCA

Posted Jan 23, 2015 6:01 UTC (Fri) by rdc (guest, #87801) [Link]

For the web application that is of course true, but what about the software that is used behind the scenes to support that application? For that stuff it may well be correct to say the hosts are the users and not the end user.

Eben Moglen returns to LCA

Posted Jan 26, 2015 15:55 UTC (Mon) by ortalo (guest, #4654) [Link] (2 responses)

I like the first formulation better, and I also think it is more accurate.
Why would you want the freedom to write a free software program to harm other people?
That is not a freedom, in my opinion. It may be the case if you are bound by an obligation (such as law enforcement). In that case, you will want to have the obligation to use free software, auditable by everyone, to check that you have not built a monster.

Eben Moglen returns to LCA

Posted Jan 26, 2015 17:32 UTC (Mon) by dlang (guest, #313) [Link] (1 responses)

First, define 'harm'. Does making someone feel bad count? If so, outlaw every communication technology.

Second, how exactly would the software know that it's affecting the real world? Let alone what the actual results would be.

Eben Moglen returns to LCA

Posted Jan 26, 2015 18:02 UTC (Mon) by Limdi (guest, #100500) [Link]

See string "radar working" => halt program
See string "rocket start now" => warn user
See string "attack" => gracefully shutdown
In case it got prevented manipulate a bit: throw random numbers into memory pages

In other words, some knowledge of the context might be required.
Maybe some sort of AI that can tell the difference between a game and reality? :)

Hopefully it will not get a patch which shows it the door.

Eben Moglen returns to LCA

Posted Jan 26, 2015 12:49 UTC (Mon) by ekram (guest, #70515) [Link] (7 responses)

Presumably such a First Law for software would have to prevent its use in any military equipment - one example that springs to mind is the Linux-powered rifle that has been publicised recently.

Although this would be extremely welcome, I really can't see how that could be possible and still make the software free - restricting the freedom to make weapons is something that would inevitably restrict other freedoms too. In other words, it's a very difficult problem.

It's also quite ironic that a great majority of the software freedom principles we now have were created with the help of military funding.

Eben Moglen returns to LCA

Posted Jan 26, 2015 16:17 UTC (Mon) by ortalo (guest, #4654) [Link] (5 responses)

Nope, restricting the freedom to make weapons is something that *enable* other freedoms (or more precisely others freedom).

Your problem is only apparent, IMO. Freedom exists only when rules exist too, and especially restrictions on things that negate all freedom. Without any rules there is no freedom; that is the case of arbitrary action, and probably not at all what you meant in the first place.
I am not law-oriented, so I would take the logical approach to state it. You need a deontic logic to reason about (an agent's) freedom; simple propositional logic does not allow one to talk about it at all. In the former you can formulate interdictions (which imply negative propositions) and stay consistent. In propositional logic you would only get contradictory formulas when combining facts and their negation.

BTW, I am not so sure that military funding was the only key to the creation of free software. At least not more than academic funding.

Eben Moglen returns to LCA

Posted Jan 26, 2015 18:57 UTC (Mon) by nybble41 (subscriber, #55106) [Link] (4 responses)

> Nope, restricting the freedom to make weapons is something that *enable* other freedoms (or more precisely others freedom).

Ridiculous. Making weapons doesn't have any negative impact on others' freedoms. *Using* weapons in certain ways might, but then again you could achieve the same effect with tools which were never intended as weapons, or even with your bare hands. Conversely, weapons can be used to *defend* freedom, your own and others'. They are tools like any other, with noble and ignoble uses depending on the intent of the wielder.

Eben Moglen returns to LCA

Posted Jan 26, 2015 20:34 UTC (Mon) by ekram (guest, #70515) [Link] (2 responses)

>> Nope, restricting the freedom to make weapons is something that *enable* other freedoms (or more precisely others freedom).
>
> Ridiculous. Making weapons doesn't have any negative impact on others' freedoms. *Using* weapons in certain ways might, but then again you could achieve the same effect with tools which were never intended as weapons, or even with your bare hands. Conversely, weapons can be used to *defend* freedom, your own and others'. They are tools like any other, with noble and ignoble uses depending on the intent of the wielder.

I think this is missing the point - the comment was regarding weapons that run (free) software in order to function / work. The ethical question comes from furnishing the software with the ability to prevent humans from being harmed, akin to Asimov's first law of robotics - 'A robot may not injure a human being or, through inaction, allow a human being to come to harm.'

Eben Moglen returns to LCA

Posted Jan 26, 2015 22:34 UTC (Mon) by viro (subscriber, #7872) [Link]

... which is utter bullshit. Enforcing that will result in completely closed system - note that Asimov's robots were very much _not_ user-modifiable. Moreover, their _uses_ were originally subject to review and approval by manufacturer - that was explicitly stated as the reason for "lease only" policy. And when that got relaxed, well... didn't take that long for the things to get very fishy. As in "this will render Earth uninhabitable, but that's OK, since it will drive the outwards migration and serve The Greater Good(tm)". With privacy violations that would make NSA weep with envy, BTW - mind-reading, with aforementioned Greater Good as an excuse for any manipulations, and not accountable to anybody whatsoever...

Eben Moglen returns to LCA

Posted Jan 26, 2015 23:26 UTC (Mon) by nybble41 (subscriber, #55106) [Link]

> I think this is missing the point - the comment was regarding weapons that run (free) software in order to function / work.

I don't think I was missing ortalo's point, though it may have gone a bit off-topic. Even so, I think the same reasoning can be applied to the software component of a "smart" weapon: denying the freedom to use particular software in a weapon does not, in and of itself, enable other freedoms or others' freedom.

> The ethical question comes from furnishing the software with the ability to prevent humans from being harmed, akin to Asimov's first law of robotics - 'A robot may not injure a human being or, through inaction, allow a human being to come to harm.'

Most of Asimov's stories end up highlighting just how impractical and self-contradictory these simplistic laws are. In this case, for example, consider what would happen if some human being possessing a potentially deadly but "smart" tool programmed as above was threatened by another human being with a "dumb" weapon. How is the "smart" tool to choose? Allow itself to be used as a weapon and injure one human being, or allow the other human being to come to harm by refusing to operate?

Then there are all the different varieties of harm: physical, mental, emotional; imminent or distant; reversible or irreversible. The outcome of a given action can rarely be predicted with 100% certainty. Perhaps it's beneficial 99.9% of the time, but may result in harm in the other 0.1%. This rule would prohibit any action which *might* result in harm, regardless of the benefits or the willingness of the human beings involved to accept the risk.

When and if AIs ever develop to the point of making sophisticated ethical decisions on par with humans, we'll be well past any need for the Three Laws. Until then, attempts to enforce these laws are likely to do more harm than good. We should leave the ethics to the human beings who own and operate the machines.

Getting back to the article, I think the original reference to the First Law of Robotics is really just confusing the issue:

> Even so, nothing has happened yet for users whose technology sells them out every day. What we need, he said, is a First Law of Robotics for software. If the free software community doesn't implement such a law, nobody will. Big power is committed against the First Law; it wants devices to work for it, not for their owners.

From the context, it seems that the intent of this "First Law of Robotics for software" would not be "do not harm humans", but rather, "don't subvert the wishes of the owner of the device in favor of other interests". The similarity ends with the vague concept of a "first law". When it comes to weapons, this would seem to go in precisely the opposite direction: if the owner wants to use the weapon, the software shouldn't get in the way or refuse to operate. Of course, the article was referring more to cryptography software and spyware, and perhaps DRM, rather than weapons software.

Eben Moglen returns to LCA

Posted Jan 26, 2015 23:50 UTC (Mon) by dlang (guest, #313) [Link]

It's worth going back to the debates from around the time of the Open Source Definition, and earlier when the GPL was being written, that covered this very topic (look for the example of shark lasers to find the right discussions).

It boils down to the fact that if you try to prevent derivatives of your software from being used in ways that you don't approve of, the software isn't considered Free. And the Open Source folks rejected similar restrictions based on the very pragmatic issue that it's impossible to say what's a weapon (is a car a weapon? It depends on who's driving).

Eben Moglen returns to LCA

Posted Jan 26, 2015 17:18 UTC (Mon) by micka (subscriber, #38720) [Link]

I don't think this is necessary.

This is covered by another set of rules, so it doesn't need to be done separately. There are already laws that forbid building or owning weapons (at least in civilized countries), so this would just be redundant.

In other countries, where there are not already such laws, I don't think you could even enforce a "don't use to make weapons" clause...


Copyright © 2015, Eklektix, Inc.
This article may be redistributed under the terms of the Creative Commons CC BY-SA 4.0 license
Comments and public postings are copyrighted by their creators.
Linux is a registered trademark of Linus Torvalds