Toward healthy paranoia
Some of the recent leaks have made it clear that the NSA has worked actively to insert weaknesses into both cryptographic standards and products sold by vendors. There is, for example, some evidence that the NSA has inserted weaknesses into some random-number generation standards, to the point that the US National Institute of Standards and Technology has felt the need to reopen the public comment period for the 800-90A/B/C random number standards, in which there is little confidence at this point. While no compromised commercial products have yet been named, it seems increasingly clear that such products must exist.
It is tempting to believe that the inherent protections that come with free software — open development processes and code review — can protect us from this kind of attack. And to an extent, that must be true. But it behooves us to remember just how extensively free software is used in almost every setting from deeply embedded systems to network routers to supercomputers. How can such a software system not be a target for those bent on increasing the surveillance state? Given the resources available to those who would compromise our systems, how good are our defenses?
In that context, this warning from Poul-Henning Kamp is worth reading:
To an intelligence agency, a well-thought-out weakness can easily be worth a cover identity and five years of salary to a top-notch programmer. Anybody who puts in five good years on an open source project can get away with inserting a patch that "on further inspection might not be optimal."
Given the potential payoff from the insertion of a vulnerability into a widely used free software project, it seems inevitable that attempts have been made to do just that. And, it should be noted, the NSA is far from the only agency that would have an interest in compromising free software. There is no shortage of well-funded intelligence agencies worldwide, many of which operate with even less oversight than the NSA does. Even if the NSA has never caused the submission of a suspect patch to a free software project, some other agency almost certainly has.
Some concerns about this kind of compromise have already been expressed; see the various discussions (example) about the use of Intel's RDRAND instruction to add entropy to the kernel's pool of random data, for example (see also Linus responding to those concerns in typical Linus style). This lengthy Google+ discussion on random-number generation is worth reading; along with a lot of details on how that process works, it covers other concerns — like whether the NSA has forced companies like Red Hat to put backdoors into their Linux distributions. As people think through the implications of all that has been going on, expect a lot more questions to be raised about the security of our software.
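The heart of the RDRAND argument is easy to demonstrate: as long as hardware-generated bytes are mixed into the pool rather than used directly, even a fully backdoored generator cannot remove entropy that is already there. Below is a minimal Python sketch of that reasoning; it is an illustration only (the kernel's actual mixing is more elaborate than a bare XOR), and it assumes the hardware output is generated independently of the pool contents:

    import os

    def mix_into_pool(pool: bytes, hw_bytes: bytes) -> bytes:
        # XOR hardware-RNG output into an existing entropy pool. Even if
        # hw_bytes is fully attacker-known (a hypothetical backdoored
        # RDRAND), XOR cannot reduce the entropy already in the pool,
        # provided hw_bytes does not depend on the pool contents.
        return bytes(p ^ h for p, h in zip(pool, hw_bytes))

    pool = os.urandom(32)   # stand-in for the kernel's input pool
    backdoored = bytes(32)  # worst case: hardware returns all zeros
    assert mix_into_pool(pool, backdoored) == pool  # entropy untouched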
Predicting an increased level of concern about security is easy; figuring out how to respond is rather harder. Perhaps the best advice comes from The Hitchhiker's Guide to the Galaxy: don't panic. Beyond anything else, we need to resist any temptation to engage in witch hunts. While it is entirely possible that somebody — perhaps even a trusted community figure — has deliberately inserted a vulnerability into a free software project, the simple truth remains that most bugs are simply bugs. If developers start to come under suspicion for having made a mistake, we could find ourselves driving some of our best contributors out of the community, leaving us weaker than before.
That said, we do need to start looking at our code more closely. We have a huge attack surface — everything from the kernel to libraries to network service daemons to applications like web browsers — and, with no external assistance at all, we succeed in adding far too many security bugs across that entire surface. There is clearly a market for the location and exploitation of those bugs, and there is quite a bit of evidence that governments are major buyers in that market. It is time that we got better at reviewing our code and reduced the supply of raw materials to the market for exploitable vulnerabilities.
Much of our existing code base needs to be looked at again, and quite a bit of it is past due for replacement. The OpenSSL code is an obvious target, for example; it is also widely held to be incomprehensible and unmaintainable, making auditing it for security problems that much harder. There are projects out there that are intended to replace OpenSSL (see Selene, for example), but the job is not trivial. Projects like this could really use more attention, more contributors, and more auditors.
Another challenge is the proliferation of systems running old software. Enterprise Linux distributions are at least supported with security updates, but old, undisclosed vulnerabilities can persist there for a long time. Old handsets (for values of "old" that are often less than one year) that no longer receive updates are nearly impossible to fix. Far worse, though, are the millions of old Linux-based routers. Those devices tend to be deployed and forgotten about; there is usually no mechanism for distributing updates even if the owners are aware of the need to apply them. Even projects like OpenWRT tend to ignore the security update problem. Given that spy agencies are understandably interested in attacking routers, we should really be paying more attention to the security of this kind of system.
While many in the community have long believed that a great deal of surveillance was going on, the current revelations have still proved to be shocking, and they have severely undermined trust in our communications systems. Future disclosures, including, one might predict, disclosures of activities by agencies that are in no way allied with the US, will make the problem even worse. The degree of corporate collaboration in this activity is not yet understood, but even now there is, unsurprisingly, a great deal of suspicion that closed security-relevant products may have been compromised. There is not a lot of reason to trust what vendors are saying (or not saying) about their products at this point.
This setting provides a great opportunity for free software to further establish itself as a secure alternative. The maker of a closed product can never effectively respond to suspicions about that product's integrity; free software, at least, can be inspected for vulnerabilities. But to take advantage of this opening, and, incidentally, help to make the world a more free place, we need to ensure that we have our own act together. And that may well require that we find a way to become a bit more paranoid while not wrecking the openness that makes our communities work.
Index entries for this article:
Security: Backdoors
Posted Sep 12, 2013 5:16 UTC (Thu) by mathstuf (subscriber, #69389)
Posted Sep 12, 2013 7:45 UTC (Thu) by khim (subscriber, #9252) (49 responses)
Posted Sep 12, 2013 8:20 UTC (Thu) by zmower (subscriber, #3005) (47 responses)
Posted Sep 12, 2013 9:00 UTC (Thu) by khim (subscriber, #9252) (46 responses)
The website may say whatever it wants (and old versions indeed used GPLv2 and LGPLv2.1), but current versions of GnuTLS use GPLv3 and LGPLv3, which means it's not suitable for embedding in the core firmware image, which in turn means it's not a valid replacement for OpenSSL for most projects.
Posted Sep 12, 2013 14:10 UTC (Thu) by meuh (guest, #22042) (22 responses)
Once it is embedded in the core firmware image, who cares whether it's OpenSSL, GnuTLS, Selene, etc.: it will never be updated, so there's no benefit from a security point of view. But with GPLv3/LGPLv3 GnuTLS, it would be easier for users to create a new core firmware image with their own, updated GnuTLS. The vendor/OEM/ODM would have to release the tools needed to build a new core firmware image and allow users to install it on their own hardware.
Posted Sep 12, 2013 17:14 UTC (Thu) by khim (subscriber, #9252) (21 responses)
You mean there is no difference between incomprehensible, unmaintainable code and code that is easy to read and understand? Looks like a novel idea to me… You mean it's somehow easier to replace OpenSSL with GPLv3/LGPLv3 GnuTLS than to replace it with Selene or just fix bugs in OpenSSL? Perhaps, but I'm not at all sure. I kind of understand what you are saying: in some parallel universe where GPLv3 is wholeheartedly embraced by vendors and where GnuTLS is used by everyone, people are observing huge benefits from the anti-Tivoization clause. But in our universe the situation is radically different: GnuTLS is rarely (if ever) used and it's not part of the solution at all. Instead of trying to provide the tools needed to build a new core firmware image and allowing users to install it on their own hardware, vendors are busy doing completely different work: they are looking for FSF-owned and GPLv3/LGPLv3-licensed code and ripping it out as fast as they can. People suffer because they cannot actually use perfectly valid code, yes. I'm not sure how that can be portrayed as some kind of win for free software, sorry. I agree with Rob Landley: copyleft is dying and the one thing to blame is GPLv3. It was presented as a straightening-out of GPLv2, and I even believed for some time that this “I am altering the deal. Pray I don't alter it any further.” plot would actually work, but RMS was too arrogant: he never even said the second part! Instead he “altered the deal” and promised that he'd alter it further if any new loopholes were found. Vendors rebelled and we've lost even what GPLv2/LGPLv2 gave us. The whole story has justified RMS's “GNU/Linux” name, but it's a sad justification: Linux went on to become “nonGNU/Linux”, its “nonGNU/Linux” variants are thriving, but the GNU part is dying along with the GNU/Linux combination. Now even an article which specifically discusses the problems of OpenSSL ignores the existence of the GPLv3/LGPLv3-licensed GnuTLS. It exists, but it's not part of the solution: people just ignore its existence when discussing the problem in question and potential outcomes. Do you really want to imply that this outcome is somehow the best one for someone? I fail to see how.
Posted Sep 12, 2013 17:53 UTC (Thu) by hummassa (subscriber, #307)
I am not panicking. GNU variants are not dying, and even L?GPLv3 variants of software are alive and well. Now, people may (or may not) perceive that the nonGNU variants of stuff are more vulnerable to this sort of shenanigans. I am quite sure they are, and I am quite content with using my (mostly) GPLd OS.
Posted Sep 12, 2013 18:00 UTC (Thu) by pizza (subscriber, #46) (15 responses)
I agree that copyleft is out of favor now, but IMO the GPLv3 had little to do with it. It's simply easier/less risky/cheaper to "just take" BSD or Apache-licensed software.
Too many big players have been burned by GPL compliance issues (usually of their own creation -- "GPL == Public Domain, right?") and rather than do the right thing and fix their internal processes, they'd rather avoid that entirely, because it's cheaper in the short term, and that's all that counts.
FWIW I've been on both sides of this situation.
Posted Sep 12, 2013 21:51 UTC (Thu) by khim (subscriber, #9252) (14 responses)
Sure. But that was always the case. What's new is the fact that collaborative projects are being released under non-GPL licenses. Think five, seven, ten years ago. What large FOSS projects existed back then? Apache, GCC, JVM, Linux, Mono, MySQL, OpenOffice.org, PostgreSQL, etc. Some projects were GPLed, some were not, but when people said that the GPL was the big edge of Linux over the BSDs they were genuine: it really felt as if the GPL kept projects together and that only people who wanted to create their own proprietary forks participated in Apache/BSD-licensed projects (which, naturally, stifled their growth because proprietary extensions went away when firms went away). Today even projects which are explicitly developed by large coalitions are not using the GPL. Think Android, LLVM, Tizen, etc. Old non-GPLed projects are raised to the forefront (think EFL) and large projects are relicensed (think AOO). It's true that GPLv3 was not the sole factor in this development, but the “GPLv3 scare” sure as hell made it more acute. I know that some companies have a “no GPLv3” rule and an even larger number have a “no GPLv3 if there is a sensible choice” rule. The sad truth is that I just don't know what can be done at this point: the “GPLv3 scare” arose not because the license was changed, but because said change felt like an “I am altering the bargain, pray I don't alter it any further”-style change. Even if the FSF suddenly decided to go back to GPLv2, it wouldn't change anything.
Posted Sep 12, 2013 22:17 UTC (Thu) by pizza (subscriber, #46) (7 responses)
>Today even projects which are explicitly developed by large coalitions are not using GPL. Think Android, LLVM, Tizen, etc. Old non-GPLed projects are raised to the forefront (think EFL) and large projects are relicensed (think AOO).
Those may be coalitions technically, but in every case there is a single mammoth player driving/steering/gatekeeping most of the work.
Android in particular is an interesting case as it's responsible for the vast majority of the anti-GPL push. Far more often than not the various device makers don't give anything back, layering their own crap on top of the core AOSP and keeping their changes private in an attempt to differentiate themselves, creating incompatibilities deeper into the system. We're *still* fighting binary-only driver (kernel and userspace) messes, and that doesn't even touch on the higher-level stuff.
If not for Google's continued patronage (and its driving of most new development), Android would have completely fragmented by now into a mess that would make the BSD wars look like a schoolyard spat. Google recognizes and embraces this, and is attempting to layer *another* set of (completely proprietary) APIs on top of "Open" Android.
I fear that the folks leading this mad rush towards Apache/BSD have forgotten the lessons learned from the BSD wars, and that we're heading towards another BSD bubble/cliff. When that (inevitably?) happens, the pendulum will swing back towards favoring Copyleft.
Why do I strongly come down on the side of copyleft? Because, simply, if you don't have source code (and the means to run/install/use it) then you don't have jack.
Posted Sep 13, 2013 7:33 UTC (Fri) by khim (subscriber, #9252) (6 responses)
The BSD bubble/cliff was just a sideshow. The real winner of the Unix wars was Windows—and the only reason for that was the fact that AT&T stopped developing UNIX. Yes, if Google stops developing Android then someone else may pick up the slack. But I doubt it'll be copyleft. More likely it'll be Tizen or even Windows (again). Maybe. But this would need to be a different, non-FSF-driven copyleft. Maybe it'll be CC-BY-SA, or maybe it'll be something else. But the GPL… it has the perception of “spoiled goods” and I don't see how that trend can be reversed. This affects the FSF first (things like GnuTLS or GUILE), but all other GPL projects are affected, too.
Posted Sep 16, 2013 15:53 UTC (Mon) by nix (subscriber, #2304) (5 responses)
Posted Sep 16, 2013 17:15 UTC (Mon) by hummassa (subscriber, #307) (4 responses)
I suspect (and I have probably mentioned this before) that we'll see a real GPL comeback (v3 and all, or even v4 if needed) some years from now, once NSA-like surveillance takes an unbearable toll on the economy or on social structures. We'll see.
Posted Sep 17, 2013 9:26 UTC (Tue) by renox (guest, #23785) (3 responses)
Apparently, you didn't look very thoroughly. The 'anti-tivoization' clause of the GPLv3 is quite controversial; for example, Linus Torvalds doesn't like it. Would you claim that he "doesn't like RMS"?
Posted Sep 17, 2013 12:39 UTC (Tue) by hummassa (subscriber, #307) (2 responses)
For instance, Linus says that the GPL and the LGPL are more or less the same, because linking does not make a derivative work, and AFAICT he is absolutely right.
OTOH, the "tivoization" stance was a pragmatic one -- TiVo was one of the first all-out mass-consumer devices, and the use of a locking bootloader killed a possibly thriving modification market. So, while Linus was right (because TiVo could, at any moment, have chosen to go with some BSD or even with XNU), he was also being anti-RMS and anti-software-freedom.
One can take the "I don't like RMS" position for pragmatic reasons. That does not make it good for software freedom. Apple does not like RMS because Apple likes to maintain control, and RMS/FSF/GNU/GPL is all about relinquishing control downstream AND guaranteeing that control stays downstream.
Posted Sep 17, 2013 19:13 UTC (Tue) by dlang (guest, #313) (1 response)
Posted Sep 18, 2013 0:14 UTC (Wed) by hummassa (subscriber, #307)
Posted Sep 12, 2013 23:17 UTC (Thu) by Cyberax (✭ supporter ✭, #52523) (1 response)
Example: Samba in Mac OS X. Apple is developing their own SMB client from scratch just not to use Samba.
Posted Sep 12, 2013 23:25 UTC (Thu) by pizza (subscriber, #46)
Google really isn't much better. Those two entities are responsible for the vast majority of the anti-copyleft movement, if only because their funding made it possible for everyone else to tag along.
Posted Sep 19, 2013 17:18 UTC (Thu) by Wol (subscriber, #4433) (3 responses)
But to anyone who bothers to understand RMS, it's pretty clear the FSF is not altering the bargain. GPLv3 is pretty much ALL BUGFIXES. Okay, we may disagree, but the anti-Tivoisation clause simply prevents the manufacturer from reserving to themselves the right to update the software. If it's in ROM (or PROM) and can't be updated, that's fine. Another bugfix, for a bug I didn't even realise existed: if you put a GPLv2'd BINARY on your website, then even if you put the source right next to it you trigger the "make the source available for three years" clause. You would have to FORCE people to download the source. v3 now means that if they don't get the source at the same time, they have no right to come back later. The rest of it is in the same vein. Cheers,
Posted Sep 19, 2013 20:02 UTC (Thu) by khim (subscriber, #9252) (2 responses)
You are preaching to the choir here. Yes, that's what I said seven years ago. Sure, for RMS it's just a bugfix, since he clearly considers GPLv2 a weapon in the fight for software freedom and clearly stated: Change is unlikely to cease once GPLv3 is released. If new threats to users' freedom develop, we will have to develop GPL version 4.
But I also said back then that this is dangerous ground to play on—exactly because others perceived (and still perceive!) GPLv2 differently. In effect, the switch from GPLv2 to GPLv3 highlighted the difference between the “Free Software” and “Open Source” folks. For the “Free Software” camp it was just a bunch of bugfixes, but for the “Open Source” folks it was a fundamental change of the status quo. As Linus put it: To me, the GPL really boils down to “I give out code, I want you to do the same.” The thing that makes me not want to use the GPLv3 in its current form is that it really tries to move more toward the “software freedom” goals. Stallman expected that people would embrace “Free Software” and go with GPLv3, but most embraced “Open Source” and rejected it. Linus rejected it outright; some other guys did so later (for example, GnuTLS parted ways with the FSF and went back to GPLv2). In effect the FSF showed us that RMS is right once again and that most “Open Source” folks are not ready to join the Church of Emacs—Saint IGNUcius. Sure, but FOSS is increasingly used in places where such lock-down is expected and sometimes needed (==mandated by law). Mobile phones are locked because carriers want to sell a simple thing like tethering support for $$, car software is locked because of the fear that someone will alter it and car manufacturers will be held responsible for the deaths of people despite all these NO WARRANTY claims, and so on. Apple just wants to control both the software and the hardware on the devices it sells. For all of them the new, altered bargain is totally unacceptable. Why? GPLv2 clearly gives you another choice: Accompany it with the complete corresponding machine-readable source code, which must be distributed under the terms of Sections 1 and 2 above on a medium customarily used for software interchange. In today's world the web is a medium customarily used for software interchange, and it's pretty clear that if source and binary are near each other on the same server one accompanies the other. It's the same as when you offer a Debian DVD with binaries and a Debian DVD with sources—it's clearly up to the recipient to decide if s/he wants to grab the second DVD as well; you are not required to see if s/he'll actually take it. GPLv3 clarifies things like the use of torrents for software distribution, and makes it clear that you do not lose your rights if you fix accidental violations fast enough, that's true, but these minor improvements are overshadowed by much, much larger changes in other places.
Posted Sep 26, 2013 17:34 UTC (Thu) by nix (subscriber, #2304) (1 response)
Posted Sep 26, 2013 17:50 UTC (Thu) by khim (subscriber, #9252)
Well, yeah, that's true, but it's not that important. The important fact is that it stopped participating in the FSF's jihad and instead has chosen to stay relevant. Only time will tell if they'll succeed or if it's a case of “too little, too late”. The important fact is not that it can be used in conjunction with GPLv2 programs; the important fact is that now it can be used by various vendors and linked with GPLv2, Apache, or even proprietary programs.
Posted Sep 13, 2013 10:40 UTC (Fri) by meuh (guest, #22042) (3 responses)
What matters is how one can address the current shortcomings / vulnerabilities, how one can fix them, and how one can deploy a fixed version. If it's not possible to upgrade, to fix, to change, it's doomed, vulnerable, and dangerous, whatever the quality of the initial code base. GPLv3 / LGPLv3 GnuTLS would allow users to do this, while BSD OpenSSL doesn't. In the BSD case, you have to rely on the device vendor / OEM / ODM / manufacturer to provide an update. But they could be totally unaware of the security concerns, or unwilling to provide a fix for old devices. YMMV.
Posted Sep 13, 2013 11:37 UTC (Fri) by hummassa (subscriber, #307) (1 response)
In a post-Snowden day and age, I can't understand just *how* people can justify anti-GPLv3 (and anti-copyleft) sentiment.
Allowing tivoization is allowing spying, forever.
Allowing proprietarization (XNU was "open source" until OSX 10.3? 10.4?) is allowing spying, forever.
Only copyleft cures that. Only copyleft empowers users against that kind of shit.
Copyleft is not really good for behemoths like Google, Facebook, Apple and Microsoft -- the Big Four, nominated for their haemorrhaging of your private intel to the NSA. It's only good for each of their two billion users/assets/products. Copyleft lowers the cost of change; enterprises loathe lowering that particular cost.
IMNSHO, individuals who choose "liberal licensing" are helping big companies plot the end of privacy, and maybe of any semblance of democracy present in the world today. Small enterprises that choose "liberal licensing" are just paving the way for the big enterprises to come and crush them, and it serves them right.
Posted Sep 13, 2013 14:13 UTC (Fri) by danieldk (subscriber, #27876)
Not that it matters much to the discussion, XNU is still open source:
http://www.opensource.apple.com/release/mac-os-x-1084/
> individuals who choose "liberal licensing" are helping big companies to plot the end of privacy
The world is not black and white. There are plenty of companies (big and small) who produce proprietary software, like liberal licensing for that reason, but are not plotting 'the end of privacy'.
And personally, I don't care if someone uses my code in a proprietary product, as long as credit is given where credit is due. I would mind if they used it for sharks with lasers, but it's pretty hard (and non-FLOSS) to craft a license to prevent uses that I consider immoral.
Posted Sep 13, 2013 15:02 UTC (Fri) by khim (subscriber, #9252)
That's a solution to a totally different problem. People who know or care about GPLv3/LGPLv3 are such a tiny minority that they can be easily ignored. And they don't need any protection anyway (since they agree to lose the protection offered by the vendor right from the start). The other 97% of users must be protected somehow, and GPLv3 does not help there; in fact it makes the situation worse. Think about it: there are many errors in FOSS and elsewhere, but the most security-problematic part of the computer still usually resides between chair and keyboard - and GPLv3/LGPLv3 makes this problem more acute, not less. This should (and probably will) be done by making sure vendors, OEMs, ODMs, and manufacturers are not shirking liability. Note that all licenses try to disclaim liability, but they are still limited by law: if the law says the vendor must do something, it must do it even if the license says something different. Of course this will mean that vendors will want to lock down the device to reduce the attack surface, and GPLv3/LGPLv3 is incompatible with that. OTA updates were invented long ago. Windows Update was launched 18 years ago, and Linux distributions have offered the same service for about as long. Yup. GPLv3/LGPLv3 make life better for the 2-3% of geeks and worse for everyone else. Why do you think vendors should pick this nonsolution to a nonproblem?
Posted Sep 12, 2013 16:46 UTC (Thu) by ScottMinster (subscriber, #67541) (20 responses)
Though I'm not quite sure how it applies to software burned into a ROM that can never be changed.
Posted Sep 12, 2013 19:49 UTC (Thu) by lsl (subscriber, #86508) (17 responses)
Posted Sep 12, 2013 22:09 UTC (Thu) by khim (subscriber, #9252) (16 responses)
Or when you include fixes for hardware flaws in software. You may be surprised, but in the embedded world that's not an abnormal case; it's a typical situation that you include some software safeguards against hardware faults. This means that you need to make it hard to use unauthorized software, and if the user circumvents this restriction somehow (most likely by removing some kind of seal) - well, it's his/her choice; you can righteously say that if the user circumvented your protection then s/he's on his/her own and the warranty is null and void. GPLv3 breaks this scheme apart: you are forced to offer some way to replace some components of your system (often central components), and even if you disclaim the warranty for such a case (GPLv3 explicitly allows that) you are still faced with the support-call burden. But this change itself was not the biggest problem with GPLv3: the GPLv3 requirements are not all that onerous per se. The biggest problem with GPLv3 was the “I am altering the bargain, pray I don't alter it any further” perception. Which basically poisoned the GPL well.
Posted Sep 12, 2013 23:36 UTC (Thu) by pizza (subscriber, #46) (15 responses)
One other comment that I think bears mentioning.
All previously-distributed GPLv2 software is still GPLv2; it doesn't magically stop working or relicense itself.
So at the end of the day, folks who take the "oh noes, the people who write software we get to use and modify for free may, at some point in the future, change the terms of new software they write after that date!" attitude come off as if they are complaining that their free pony doesn't come in the color they want.
Never mind that the alternative licensing schemes (e.g. BSD, Apache) could just as easily end up in the same situation, with future releases licensed under different terms. And historically, that's precisely what tends to happen as the tragedy of the commons unfolds yet again.
Honestly, I just don't get it. :)
Posted Sep 13, 2013 7:17 UTC (Fri) by khim (subscriber, #9252) (14 responses)
Sure—but that's exactly the point. I'll try to explain. Just what is (or maybe I should say “was”?) the advantage of the copyleft scheme? To the vendor, I mean, not to the end user? The answer is simple: the sole advantage is the unchangeable tit-for-tat bargain. It's as simple as that. From the vendor's POV the Apache or BSD license is clearly superior: you can use it without worrying about compliance (even if you forget to do anything, the remedy is to add a couple of sentences to the manual), it's easy and simple, no added costs. But, as you've noted, an Apache or BSD creation can always go extinct: its creator is free to add additional restrictions at any time, which, indeed, leaves you stranded. Now, historically the GPL offered an alternative—sure, you are saddled with some additional hassles if you use copylefted software, but in exchange you get a bargain which cannot be altered: no matter what happens, the next version will carry the same terms and can be used under the same rules as the previous one. With the GPL the vendor is bound by some strict rules, yet everyone else must follow them as well! That's the only selling point of the GPL, really (vendors don't care about users' freedom, thus that part flies right over their heads). One cannot do the same with the GPL because of the you may not impose any further restrictions on the recipients' exercise of the rights granted herein requirement. Well… one cannot alter the bargain unless they are the FSF, as it turned out. The FSF can do that (and everyone knew they could), but as long as that ability was theoretical vendors were ready to ignore it (after all, there are a bazillion other things which can go wrong for them; what's one purely hypothetical scare compared to all that?). This was a powerful offer, and vendors were ready to tolerate the problems the GPL creates in exchange for that powerful promise. Except now this offer is null and void: the FSF altered the bargain and, even worse, explicitly promised to alter it further (how else can you interpret Change is unlikely to cease once GPLv3 is released. If new threats to users' freedom develop, we will have to develop GPL version 4.?). This effectively killed the one and only advantage of copyleft: it's no longer possible to guarantee that new versions will carry the same terms. In fact it's now known that the terms will be changed if new threats to users' freedom develop. Just what is the advantage of the GPL over BSD if its terms can be changed at any point anyway? Not “in the color they want” but “in the same beige color it came in yesterday”—and they chose this particular stud farm in the first place because the farmer promised them that all the follow-up batches of free ponies would come in the exact same beige color! They could have chosen black ponies or white ones or even rainbow ones, but they chose dull beige because it was not clear if black ponies or rainbow ponies would be available tomorrow. But now instead of beige ponies you are getting brown ponies! WTF???!!! What happened to the bargain which attracted them to beige ponies in the first place? Note how Linux (which is also developed under GPL terms, remember?) is not affected by this strong copyleft backlash. Why? Because it didn't alter the bargain! When the FSF unleashed its “weapon of mass opinion” Linus explicitly refused to participate. I think this is what saved Linux.
MINIX is in a state similar to where LLVM was back when GPLv3 happened, but companies don't see the need to spend their money on it because Linux (with its unchanged bargain) is “good enough”.
Posted Sep 13, 2013 10:38 UTC (Fri) by hummassa (subscriber, #307)
the FSF/GNU can alter the GPL at their whim.
Which is, first, a false premise (they have bylaws and promises and goals, and they can't just *change* the licenses in a new version; they have to *justifiably* change them), and second, and more important, a moot premise (people can continue to develop GPLv2, LGPLv2, and LGPLv3 software without any outside change to their licensing; for that, you only need to drop the "vX or later" from your license grant).
Posted Sep 13, 2013 15:08 UTC (Fri) by wookey (guest, #5501) (12 responses)
You are quite right that big corps don't like GPLv3, but in my experience that's almost entirely due to patent paranoia rather than anything to do with potential changes in new licences, because GPLv3 is a lot clearer than v2 was about the fact that they really do have to agree not to assert patent claims on 'anything derived from this'. So this problem is actually a side effect of the whole software-patent disaster rather than having much to do with changes trying to retain practical user freedoms.
The growing realisation that you can't trust _any_ code you (or someone) can't examine, rebuild and replace, especially if it came from the US, could make a real difference, and shift things back towards copyleft. We shall see. I'd like to think so, but then I already thought copyleft mattered.
Posted Sep 13, 2013 15:46 UTC (Fri) by khim (subscriber, #9252) (11 responses)
Huh? The GPLv2 actually only discusses two possibilities: The idea that you can just omit “or any later version” and thus make sure the new ideas of GPLv3 will not affect you was Linus's invention, not the FSF's—and others were not so enlightened. The anti-GPLv3 movement started at the end of the distribution tree: vendors revolted, and only after that happened did developers decide to forget about it, since the GPL was no longer acceptable. But by now a lot of developers see the writing on the wall: if you want to create code which will be used by real people to solve real problems, you should not pick the GPLv3. Some old projects get away with relicensing because they are important enough (think Samba), but how many popular GPLv3-licensed projects can you name which gained popularity as GPLv3 projects, rather than first gaining popularity as GPLv2 projects and then changing the bargain? Of course the first case of an "altered bargain" was the AGPL, not GPLv3, but there the situation was different: some projects adopted it, most ignored it, but there was no bargain alteration; few projects (if any) went from GPLv2 to AGPLv1. What if you relied on the “no further restrictions” clause and hoped to receive similar terms for newer versions of software which is now only available under the GPLv3 license (this is what happened with GCC and Samba)? Should you wait till glibc is relicensed under GPLv3 and then suddenly start looking for a replacement, or should you just abandon it and go with something like Bionic? Well, maybe, but it's the same case in the end: the bargain was altered, and both people who don't like the new bargain and people who don't want to deal with the next unexpected change in the bargain are revolting.
Posted Sep 13, 2013 18:36 UTC (Fri) by hummassa (subscriber, #307)
Actually, section 9 of the GPLv2 discusses two alternatives to the (obvious, IMNSHO) option where you accept the terms of the license you are reading at the moment you make copies of a GPLd work. Section 14 of the GPLv3 discusses the same two alternatives.
The aforementioned sections run along these lines:
* these are the terms of this license; but if someone licenses the work as "GPLvX or later", it means that you have the option (not the obligation) of taking it under the terms of later GPLs; and if someone licenses some work as "GPL", it means that you have the option (again, not any obligation) of taking it under the terms of any version of the GPL that you care for.
Posted Sep 19, 2013 17:26 UTC (Thu) by Wol (subscriber, #4433) (9 responses)
Actually, the GPL (v2 or otherwise) ITSELF discusses NONE of this. There is a load of blurb - which is not part of the GPL itself - which discusses this. There is a LOT of rubbish in the comments on this article, and it's pretty much all down to the fact that far too many people don't actually understand how copyright works. Cheers,
Posted Sep 19, 2013 20:22 UTC (Thu) by khim (subscriber, #9252) (8 responses)
Sorry, but it does: Discussion of “later versions” is very much part of GPLv2—but Linus was apparently the first who studied it enough to understand what it might mean in the future, found that GPLv2 is not like the MPL (which explicitly says that You may also choose to use such Covered Code under the terms of any subsequent version of the License published by Netscape) or CC-BY-SA (which says You may distribute, publicly display, publicly perform, or publicly digitally perform a Derivative Work only under the terms of this License, a later version of this License with the same License Elements as this License, or a Creative Commons iCommons license that contains the same License Elements as this License (e.g. Attribution-ShareAlike 2.0 Japan)), and consciously decided to reject this possibility. P.S. It'll be interesting to hear the outcry from Creative Commons users when these licenses alter their bargain enough—this should be an even bigger fiasco than the GPLv2-to-GPLv3 transition, because with GPLv2 you at least have Linus's option of rejecting said transition. No such luck with Creative Commons. Well, sure. The fact that people actually distribute anything under Creative Commons and never even discuss the fact that by doing so they give complete power over their creations to some random guys over there, yet discuss the GPLv2-to-GPLv3 transition to death, is a large enough clue. Remember how the FSF unilaterally shifted the whole of Wikipedia to CC-BY-SA 3.0 without asking its authors about anything? Well, the folks behind Creative Commons can easily do a similar trick with the whole corpus of Creative Commons-licensed art.
Posted Sep 19, 2013 23:44 UTC (Thu) by lsl (subscriber, #86508) (7 responses)
The FSF? Unilaterally? Is it even possible to make a more biased account of that story? As stated in your link, 17,000 Wikimedia/Wikipedia authors voted on it, with 75% in favour of the license change. The FSF was just an accomplice in that plot originating in the Wikimedia communities, as it had the power to allow licensing changes through the 'or later' clause.
Posted Sep 20, 2013 1:29 UTC (Fri) by khim (subscriber, #9252) (6 responses)
Right. A small percentage of Wikipedia copyright holders held a vote, and even among them about 10% (that's over a thousand copyright holders, remember?) voted against said change. Without the FSF's power to unilaterally change the license they faced a lengthy (probably multi-year) process with an uncertain outcome. They convinced the FSF to apply its power and bam: the opinion of over a thousand people about how their work can be used went to the wolves. Indeed. Note that the votes, opinions, and all the other stuff were only needed to convince the FSF; the FSF had no need for any votes to change the license. It could have made the change even if the vote showed that two guys were for it and 1698 against. P.S. Note that I'm not saying that this change was bad. I'm saying that it was made against the express wishes of a sizable chunk of copyright holders. Recall how long it took for Dell to overcome Icahn's opposition - and that is a case where decisions are supposed to be made by vote, while Icahn only had 6% of the voting power. With copyright you are not supposed to vote. Indeed, when nine lines were included in Android against Sun's (and Oracle's) wishes they raised a racket to the sky, won the argument, and only failed to convince the court that the incorrectly appropriated nine lines were worth billions. Yet the FSF or CC can do such relicensing twice per day (if that's their wish) and nobody can say anything at all.
Posted Sep 20, 2013 15:15 UTC (Fri) by nybble41 (subscriber, #55106) (5 responses)
You can't claim that the contributors didn't have a say in the license. They agreed to any changes the FSF might make when they agreed to the "or later" clause in the original license. This is a bit like complaining that the code you published under the GPL is being used to guide nuclear missiles; it was licensed for use by anyone, for any purpose, and you can't take that back later. In the same way, if you choose to license your work with an "or later" clause you give up control over how the work may be licensed in the future to whichever organization publishes the license. If that isn't what you want, don't license the work under an "or later" clause in the first place.
Of course, there is no requirement for the project to include contributions without such a clause, and doing so may seriously complicate project management down the road. If no compromise can be reached they may simply have to do without your contribution.
Posted Sep 20, 2013 16:31 UTC (Fri) by khim (subscriber, #9252) (2 responses)
They didn't. Well, sure. They accepted that the Free Software Foundation may publish new, revised versions of the GNU Free Documentation License from time to time. And they were promised that such new versions will be similar in spirit to the present version, but may differ in detail to address new problems or concerns.
Sure. But this is not what transpired here. Instead of receiving a new, similar-in-spirit version of the license, they were transferred en masse from one Suzerain to another. That's a fundamental violation of the principle The vassal of my vassal is not my vassal. This may or may not be legal, but I know it's not something I would like. That's what I try to do lately, yes. I was significantly more forgiving of these clauses in the past, but after the GPLv3 and GFDLv1.3 abuses of power by the FSF it has become more and more clear to me that the ability “to bugfix the license” is not a good enough reason to give this much power to a third party. Instead it's more honest to use the BSD (or maybe Apache) license and give equal powers to everyone. The situation is not all that dissimilar to the problem of Canonical's copyright assignment.
Posted Sep 20, 2013 18:00 UTC (Fri) by nybble41 (subscriber, #55106) (1 response)
Good, we're in agreement then. The FSF obviously felt the proposed Creative Commons license was "similar in spirit" to the GFDL, while resolving a number of issues which arose because what it was being applied to was not, in fact, documentation. They weren't breaking any promises, just using their position as the authoritative publisher of new versions of the GFDL to resolve real problems and concerns relating to the GFDL in the context of WikiMedia.
> That's fundamental violation of principle The vassal of my vassal is not my vassal.
There are no "vassals" here, only free individuals choosing licenses for their contributions without thinking through all the possible consequences.
Posted Sep 20, 2013 21:05 UTC (Fri) by khim (subscriber, #9252)
Few questions: No. They were abusing their position as the authoritative publisher of new versions of the GFDL to loan certain GFDL-licensed works to another fiefdom. They most definitely don't feel that CC-BY-SA is “similar in spirit” enough to give a free pass to all GFDL users; they only exchanged parts of their congregation, not all of them. It's idle talk. We can agree that individuals chose licenses without thinking too much about the consequences, but there is also the fact that the FSF treated these “free individuals” as serfs who have no power over their own creations because they once signed them away by choosing to license their work under the “GNU Free Documentation License, Version 1.[012] or any later version”. Their wishes were irrelevant, their intents were ignored; the new license was created to fulfill the Wikimedia Foundation's request, not to satisfy a unanimous resolution of Wikipedia authors. Indeed, with a unanimous resolution they could have switched to any license of their choosing without the FSF's involvement. Note that CC-BY-SA is even more devious than the GFDL: it embeds the ability to use “a later version of this License” in the text of the license itself. Even if you distribute something under CC-BY-SA 2.5 or CC-BY-SA 3.0, one may use the text of CC-BY-SA 10.0 (which can include anything the Creative Commons Corporation wants to include in it), and you cannot prevent that hijacking by omitting the “or later” text from the license grant.
Posted Sep 20, 2013 16:42 UTC (Fri) by khim (subscriber, #9252) (1 response)
If I decide to participate in some project then I should accept the license they are using. “Or later” clauses, CLAs, and all that. Doing anything else is sheer insanity. If my contribution is large enough I may decide to create a separate project (which may or may not be pulled as a third-party component into the other one); if it's not large enough then I should accept the offer I was given, but I should not submit pieces under a different license. Heck, even the FSF explicitly says We recommend you use this license for any Perl 4 or Perl 5 package you write, to promote coherence and uniformity in Perl programming even when they say We urge you to avoid using it about its first half (the Artistic 1.0 license).
Posted Sep 20, 2013 18:06 UTC (Fri) by nybble41 (subscriber, #55106)
Obviously. The project can't accept contributions without "or later" from just one contributor; it's all or nothing. The question was whether or not to participate in the first place given a project which has adopted an "or later" clause. If enough people choose not to participate in projects with such a clause, the projects will most likely drop it in order to regain contributors.
Posted Sep 14, 2013 12:49 UTC (Sat) by dd9jn (✭ supporter ✭, #4459) (1 response)
Posted Sep 15, 2013 11:05 UTC (Sun) by mpr22 (subscriber, #60784)
"Hit by a bus" is not a problem. Either they're still legally competent afterwards, in which case they can reinstate the license; they're dead, in which case their heirs are legally competent to reinstate the license; or they're alive but no longer legally competent, in which case whoever the law has authorized to act on their behalf is legally competent to reinstate the license. More awkward is where they've divested themselves of their electronic devices and moved to, say, Mongolia or Nunavut.
Posted Sep 16, 2013 15:50 UTC (Mon) by nix (subscriber, #2304) (1 response)
Posted Sep 17, 2013 23:59 UTC (Tue) by khim (subscriber, #9252)
Mea culpa. Indeed, it looks like GnuTLS dropped its FSF ties and, indeed, went to LGPLv2.1 (although that particular change was not advertised at all and is not even mentioned in the NEWS file, and thus I missed it). That makes it even stranger that it's not mentioned in this article when the half-dead Selene is.
Posted Sep 13, 2013 19:20 UTC (Fri) by juliank (guest, #45896)
A reason for choosing GPLv2 was that if I license stuff under the GPL, everybody else must distribute it under that license, and cannot add further restrictions and thus prevent me from incorporating their changes.
Posted Sep 12, 2013 8:20 UTC (Thu) by ortalo (guest, #4654) (1 response)
[1] firewalls and antivirus... ahaha!
Posted Sep 12, 2013 16:17 UTC (Thu) by wtanksleyjr (subscriber, #74601)
The effects of unhealthy paranoia can be subtle and deadly. One of them is that we tend to add complexity to obviously security-related systems thinking that we're adding security.
An example is the FIPS definition of randomness, which includes a health check that specifies the number of consecutive zero bytes that can come from a random number source. That actually decreases randomness slightly, and that hurts healthy applications like feeding into Linux's pool. A worse example is how this gets implemented: many vendors, including Intel, add other health measures, and those measures remove more and more randomness from the stream. How much randomness is in the final output? We don't know, because it's masked by AES so that it looks high-entropy. But someone who knows the secrets used to mask it (and knows what situations can reset them) would be able to see the true entropy.
Adding those health measures makes perfect sense -- but they actually throw away genuine randomness.
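To make that concrete, here is a rough Python simulation of a FIPS 140-2-style continuous test (reject any 16-bit block that equals its predecessor) applied to a genuinely random stream; this is a sketch of the idea, not any vendor's actual implementation. The discarded bytes are real randomness thrown away, and the output now carries a small bias of its own: a block never equals its predecessor.

    import os

    BLOCK = 2  # 16-bit blocks, as in the FIPS 140-2 continuous RNG test

    def continuous_test_filter(stream: bytes) -> bytes:
        # A truly random source repeats a 16-bit block with probability
        # 2**-16; dropping those repeats discards genuine randomness.
        out = bytearray()
        prev = None
        for i in range(0, len(stream), BLOCK):
            block = stream[i:i + BLOCK]
            if block != prev:
                out += block
            prev = block
        return bytes(out)

    raw = os.urandom(2_000_000)
    filtered = continuous_test_filter(raw)
    print("discarded", len(raw) - len(filtered), "bytes of genuine randomness")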
Another example is the distinction between /dev/random and /dev/urandom. So long as the state of the pool is initialized, mixed, and extracted properly (which is a source of possible problems), the randomness is simply cryptographically valid. Keeping a count of entropy after initialization adds a layer of complexity that simply doesn't justify itself, and opens itself to attacks including DoS and timing.
Posted Sep 12, 2013 8:42 UTC (Thu) by gerdesj (subscriber, #5446) (1 response)
In any large organisation it can be tricky keeping tabs on what is installed where and at what version etc and I doubt that the NSA and co. are any better at it than anyone else. Add to that the possibility that you have to worry about your own "backdooring" and you have a horrendous asset management and security auditing task.
I wonder if the teeth arms of our security services have considered this, if the supposition that they have compromised these devices is true.
If there were back doors, then you can pretty much guarantee that persons other than those who hold the keys will know how to abuse them.
Cheers
Posted Sep 13, 2013 9:05 UTC (Fri) by ortalo (guest, #4654)
Personally, I would even question the situation further: it may be that, currently, skilled (and possibly not-so-high-morality) people are most prominently recruited into highly-paid jobs, while the people working on defence are most frequently chosen from among low-grade computer people (possibly with not-so-low morality) in cheaper jobs.
PS: Cyberdefense weapons proliferation... Note I know how to be paranoid too!
Posted Sep 12, 2013 9:19 UTC (Thu) by mjthayer (guest, #39183)
Posted Sep 12, 2013 12:15 UTC (Thu) by renox (guest, #23785) (5 responses)
I don't expect this to change, so don't fool yourself into thinking that our software is secure: it may be secure against random crackers, but against governments??
Posted Sep 12, 2013 21:09 UTC (Thu) by anselm (subscriber, #2796) (4 responses)
What makes you think that the situation in the proprietary software world is in any way different?
At least in the FOSS community, projects – unlike proprietary software vendors – have nothing to gain by trying to keep security issues secret and unfixed.
Posted Sep 13, 2013 7:43 UTC (Fri) by renox (guest, #23785) (3 responses)
Ah, I see: what I wrote was ambiguous. To clarify: I think that it is the same in the proprietary software world as in the FOSS community: security isn't really a concern, just an afterthought.
Proof that the FOSS doesn't really care about security:
I'm sure that spender could come up with a list of a hundred items.
Posted Sep 13, 2013 10:42 UTC (Fri) by hummassa (subscriber, #307) (2 responses)
Posted Sep 13, 2013 11:42 UTC (Fri) by spender (guest, #23067) (1 response)
Say what you will about the intentions for fixing vulnerabilities in the proprietary world. I find it to be the same for the Linux kernel really.
What's undeniable though is the dramatic change Microsoft has made in their development processes (SDL) and entire approach to security (EMET, etc). In his now-famous memo (http://www.wired.com/techbiz/media/news/2002/01/49826) Bill Gates identified security as a systemic threat to his business.
Contrast this to the Linux kernel, which is still very much in an old mindset. Even the Linux kernel's security pride and joy, its ability to publish timely fixes in response to submitted reports, is rendered ineffective by upstream's inability and unwillingness to communicate the importance of those fixes. In the space of any other commercial product based on Linux (Android, NASes, etc), you also have the problem of those fixes just not getting out to the users at all.
-Brad
Posted Sep 13, 2013 13:17 UTC (Fri) by anselm (subscriber, #2796)
I agree that the Linux kernel development community could do a lot to improve their handling of security issues. That does not detract from the observation that most FOSS projects are much more open about security than most proprietary vendors. The Apache web server project, for example, seems to do a decent job of dealing with security issues and their fixes.
Microsoft may be better than they used to be but they often still need extensive prodding before acknowledging, let alone fixing, security issues. In many cases it requires an active exploit out in the wild to get most vendors to do anything, mostly because the act of having to publish patches at all means bad PR (for having been vulnerable in the first place). It is also difficult to get customers to install the patches, and there is a chance of introducing new bugs when patching existing ones, which is why after-market upgrades are often viewed as a bad idea, and are restricted to the most egregious problems. For a vendor it often pays to sit on problems that are not being actively exploited, where in the FOSS community (with the possible exception of some projects like the Linux kernel) proactive fixing of even theoretical security issues is generally welcomed.
Posted Sep 12, 2013 13:57 UTC (Thu) by lambda (subscriber, #40735) (1 response)
Poul-Henning Kamp's take on the value of cryptography is a bit overly pessimistic:
He forgets that one of the big worries that this whole incident has brought up is dragnet-style surveillance: tapping everyone's communication, putting it into a big database, and then allowing analysts to search through it with only the most basic of protections (they need to select in a dropdown why they think that one of the parties is foreign).
His "Mr. and Mrs. Smith" absolutely can benefit from better encryption; encryption that is more ubiquitous, stronger, and easier to use. Merely raising the bar from "tapping everyone's communication" to "needing to target particular people with a proper warrant" would help immensely.
Of course, he's also right in that more encryption alone can't solve the problem. We need proper political solutions that put good protections against abuse in place. But we shouldn't rely on just that approach. If we attack it on two fronts, getting more and better encryption more widely deployed as well as applying political solutions, we are far more likely to achieve our privacy goals.
Posted Sep 13, 2013 9:16 UTC (Fri) by ortalo (guest, #4654)
However, in my humble (really!) opinion, everyone seems to forget here that modern computer communication technology has definitely changed the situation. The key issue now is not really how to prevent a government from monitoring its citizens' communications: that's technically feasible, and law enforcement necessitates it.
To summarize: how do we log access to log files?
Posted Sep 12, 2013 20:33 UTC (Thu) by zooko (guest, #2589)
This is the reason why my startup launched our first product a month ago:
https://leastauthority.com/blog/least-authority-announces...
I was sad when LWN didn't see it as sufficiently relevant to your readership to include mention of it in LWN!
"And that may well require that we find a way to become a bit more paranoid while not wrecking the openness that makes our communities work."
I believe healthy paranoia can be sustainable and productive. There are a lot of techniques that are good for both security reasons and normal old software engineering reasons.
In the Tahoe-LAFS project, we have always required code review by someone other than the author, full test coverage on changes, and modularity of the codebase. The same practices are used in Twisted Python and Mozilla Firefox. Obviously this doesn't make it impossible for a bad actor to hide a sufficiently clever vulnerability in their contribution, but it really raises the bar, and of course it also helps with the (much more common) innocent mistakes.
Posted Sep 12, 2013 22:44 UTC (Thu) by zooko (guest, #2589)
If I needed an SSL implementation right now I would probably look at Botan first (http://botan.randombit.net/). It is mature, the documentation seems good, and the author — Jack Lloyd — seems knowledgeable and has helped out my project (Tahoe-LAFS) in the past. I also might look into yassl (http://www.yassl.com/yaSSL/Home.html) or polarssl (https://polarssl.org/).
However, people sometimes reach for "SSL" when what they really need is a crypto library, not an SSL library! For crypto, depending on your specific requirements, I would recommend nacl (http://nacl.cr.yp.to/), Crypto++ (http://www.cryptopp.com/), Botan, or my own pycryptopp (if you are writing in Python and you happen to need only the three specific algorithms that pycryptopp happens to provide).
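To illustrate the "what you may actually want is a crypto library" point, here is what authenticated secret-key encryption looks like through PyNaCl, a Python binding to nacl; the binding name and API shown here are an assumption on the editor's part, so check the project's own documentation before relying on them:

    # pip install pynacl  (PyNaCl wraps djb's NaCl/libsodium)
    import nacl.secret
    import nacl.utils

    key = nacl.utils.random(nacl.secret.SecretBox.KEY_SIZE)  # 32-byte key
    box = nacl.secret.SecretBox(key)

    ciphertext = box.encrypt(b"attack at dawn")  # random nonce is prepended
    plaintext = box.decrypt(ciphertext)          # raises CryptoError if tampered
    assert plaintext == b"attack at dawn"

Note the design: the library chooses the cipher, generates the nonce, and authenticates the message for you, which removes most of the classic footguns that lower-level crypto APIs leave exposed.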
Posted Sep 13, 2013 0:14 UTC (Fri) by giraffedata (guest, #1954) (6 responses)
How?
Posted Sep 13, 2013 11:22 UTC (Fri)
by hummassa (subscriber, #307)
[Link] (5 responses)
Posted Sep 14, 2013 16:35 UTC (Sat)
by giraffedata (guest, #1954)
[Link] (4 responses)
Are we talking about ordinary public standards?
Posted Sep 14, 2013 17:15 UTC (Sat)
by khim (subscriber, #9252)
[Link] (3 responses)
The same way it always did. Yup. Note that more than a quarter-century ago NSA involvement made DES stronger, not weaker (although with a shorter key size which, as you can guess, is a weakness, but one that can hardly be called a “hidden weakness”).
Posted Sep 15, 2013 19:20 UTC (Sun)
by giraffedata (guest, #1954)
[Link] (2 responses)
If the NSA activity we're talking about is anything like what is described in the Wikipedia article on DES, then "has inserted weaknesses" is entirely inappropriate wording. The NSA's involvement in DES, according to the article was:
The US government wanted to establish a standard for encrypting US government data. It sought proposals, via the National Bureau of Standards, from the public and consulted with the NSA to evaluate them. IBM submitted a proposal and consulted with the NSA in developing it. The NSA suggested two changes to IBM's initial proposal. One was a reworking of the S-boxes, which IBM's encryption experts analyzed, found to be good, and accepted. The other was to reduce the key length from 64 bits to 48. IBM rejected that. The NSA then proposed 54 bits and IBM found that to be better than 64 and accepted it. IBM made the resulting proposal to NBS, and NBS accepted the proposal as a standard for encrypting US government data. Some time later, public standards bodies including ANSI and ISO adopted the same standard. The article doesn't describe the process by which ANSI and ISO adopted it, but I see no evidence that the NSA was involved.
Posted Sep 16, 2013 0:07 UTC (Mon)
by khim (subscriber, #9252)
[Link] (1 responses)
Really? When you make a standard 256 times weaker than it could otherwise be, that's not “inserting weaknesses”? What do you call said process, then? Note that in the DES story the quite visible change made the standard weaker and the opaque change didn't, but that does not change the principal position: the standard was changed at the NSA's request, and nobody outside the NSA had any idea why said request was made in the first place.
Right. 48 bits instead of 64 means it's 65536 times easier to crack. The NSA, of course, proposed 56 bits, not 54, and, more importantly, IBM never agreed and never claimed 56 bits are better than 64—that's an absurd claim. Of course a 56-bit cipher is weaker than a 64-bit one. 256 times weaker (if there are no other substantial differences). But IBM decided that it was better to accept the 56-bit compromise than to insist on 64 bits and see their proposal thrown out.
Why would the NSA need to be involved? The deed was done much earlier—when 64 bits were replaced with “good enough” 56 bits and the S-boxes were altered. It does not look like the S-box changes were nefarious (we still don't really know), but the problem with the change from 64 bits to 56 bits is self-evident to anyone who knows how cryptography works.
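The factors quoted above follow from each dropped key bit halving the brute-force search space; a two-line check in Python:

    # Each bit removed from the key halves the attacker's search space.
    print(2 ** (64 - 56))  # 256:   56-bit key vs. 64-bit key
    print(2 ** (64 - 48))  # 65536: 48-bit key vs. 64-bit key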
Posted Sep 17, 2013 15:57 UTC (Tue)
by giraffedata (guest, #1954)
[Link]
That would be, but the NSA did not “make” the standard anything.
"Make" or "insert" is highly misleading terminology when you're talking about influence this small. My government makes me pay taxes; my insurance agent doesn't make me buy life insurance. I insert a post in my blog; I don't insert a story about me in LWN by sending a press release.
Yes, IBM did. You're taking too narrow a view of "better," one that just means harder to crack. There are costs associated with longer keys, and IBM had to consider them all. IBM found that the added security did not justify the cost of the extra 8 bits. My understanding of the story is that what the NSA convinced IBM of, between IBM's initial and final proposals, was how hard 56 bits was to crack, and that changed the balance in IBM's opinion.
Posted Sep 13, 2013 14:40 UTC (Fri)
by cesarb (subscriber, #6266)
[Link] (1 responses)
1. More emphasis on repeatable builds (https://lwn.net/Articles/555761/, https://lwn.net/Articles/564263/).
If building a package from its source code is repeatable, several independent services, located in separate countries and with different owners, could recompile each package from the source code and publish a signed list with the checksums of the resulting packages.
When installing a package, the package manager could then download the list of the checksums of all the distribution's packages from several of these services. Any mismatch with the checksum of the downloaded package would be cause for suspicion, and the downloaded package would be quarantined for later examination by security researchers.
This would give a strong assurance that the package has been compiled from the published source code, and has not been tampered with after being compiled. The end user would not have the overhead of recompiling everything just to check that it has not been tampered with. And since anyone could be running one of these services in private, even if all public services were compromised at the same time, it could be detected.
The same would also need to be done with the installation media; its build should also be repeatable and also be verified by that kind of service.
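A rough sketch of the client side of such a scheme, in Python; the notary URLs and the checksum-list format here are invented for illustration, and a real implementation would also verify the signatures on the lists:

    import hashlib
    import urllib.request

    # Hypothetical independent rebuilder services ("notaries").
    NOTARIES = [
        "https://rebuilder-a.example.org/checksums.txt",
        "https://rebuilder-b.example.net/checksums.txt",
    ]

    def sha256_of(path):
        h = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(65536), b""):
                h.update(chunk)
        return h.hexdigest()

    def notary_digest(url, package):
        # Assumed format: one "package-name<two spaces>hex-digest" per line.
        with urllib.request.urlopen(url) as resp:
            for line in resp.read().decode().splitlines():
                name, _, digest = line.partition("  ")
                if name == package:
                    return digest.strip()
        return None  # this notary has not (yet) rebuilt the package

    def verify(package, path):
        local = sha256_of(path)
        for url in NOTARIES:
            remote = notary_digest(url, package)
            if remote is not None and remote != local:
                print("MISMATCH via %s: quarantining %s" % (url, path))
                return False
        return True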
2. More static checking.
The ideal would be to be able to use automated provers (see for instance http://www.dwheeler.com/formal_methods/) to prove the absence of whole classes of defects in the source code. This is a hard problem; however, even weaker tools could help.
For instance, imagine how useful it would be to be able to add a notation (even if it is in a separate tool-specific file) saying "this field of this struct must be protected by this spinlock in the same struct, unless the struct has just been allocated and thus there are no other references to it" (or even more complicated conditions), and have the tool automatically verify it for you after every change you make. Even if the tool did nothing else, it would still be useful.
That is only one example; there are many other possibilities. There can be several tools focusing on different classes of defect. And even small things like -Werror can help.
This is one area where free software can shine. Since the source code is available for everyone to study, anyone can develop static checking approaches for it, without waiting for the original maintainers (especially if the tool allows any annotations it needs to be placed in separate files).
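To make the idea concrete, here is a toy checker for Python code using the standard ast module; the rule table and the "this attribute may only be assigned while holding this lock attribute" convention are invented for the example, and a real tool for kernel C would of course be far more involved:

    import ast
    import sys

    # Invented annotation: attribute name -> lock attribute that must be
    # held (a real tool would read this from a separate tool-specific file).
    RULES = {"value": "lock"}

    class LockChecker(ast.NodeVisitor):
        def __init__(self):
            self.held = []       # lock attributes currently held, as a stack
            self.problems = []

        def visit_With(self, node):
            # e.g. "with self.lock:" records that "lock" is held in the body.
            entered = [item.context_expr.attr
                       for item in node.items
                       if isinstance(item.context_expr, ast.Attribute)]
            self.held.extend(entered)
            self.generic_visit(node)
            for _ in entered:
                self.held.pop()

        def visit_Assign(self, node):
            for target in node.targets:
                if isinstance(target, ast.Attribute):
                    needed = RULES.get(target.attr)
                    if needed and needed not in self.held:
                        self.problems.append(
                            "line %d: write to .%s without holding .%s"
                            % (node.lineno, target.attr, needed))
            self.generic_visit(node)

    checker = LockChecker()
    checker.visit(ast.parse(open(sys.argv[1]).read()))
    for problem in checker.problems:
        print(problem)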
Posted Sep 21, 2013 20:00 UTC (Sat)
by cesarb (subscriber, #6266)
[Link]
Posted Sep 13, 2013 15:32 UTC (Fri)
by Baylink (guest, #755)
[Link]
https://www.schneier.com/blog/archives/2006/01/countering...
Posted Sep 18, 2013 13:08 UTC (Wed)
by glaesera (guest, #91429)
[Link]
selene > openssl?
It's funny that GnuTLS (which is alive and well) is not even mentioned. Apparently GPLv3 succeeded, but not in the way its creators hoped: they tried to strengthen the old GPLv2 "liberty or death" principle in it and have gotten the second choice. Which is quite a pity.
When Free Software principles meet security ...
Being embedded in the core image firmware, who cares whether it's OpenSSL, GnuTLS, Selene, etc.: it will never be updated, so there's no benefit from a security point of view.
But with GPLv3/LGPLv3 GnuTLS, it would be easier for users to create a new core image firmware with their own, updated GnuTLS.
Vendors/OEMs/ODMs would have to release the tools needed to build a new core image firmware and allow users to install it on their own hardware.
Toward healthy paranoia
I agree that copyleft is out of favor now, but IMO the GPLv3 had little to do with it. It's simply easier/less risky/cheaper to "just take" BSD- or Apache-licensed software.
Toward healthy paranoia
I fear that the folks leading this mad rush towards Apache/BSD have forgotten the lessons learned from the BSD wars, and that we're heading towards another BSD bubble/cliff.
When that (inevitably?) happens, the pendulum will swing back towards favoring Copyleft.
Toward healthy paranoia
But this will need to be different, non-FSF-driven copyleft.
Ah, I love the smell of argument-by-assertion in the morning.
Toward healthy paranoia
because said change felt like an “I am altering the bargain, pray I don't alter it any further”-style change.
Wol
Toward healthy paranoia
But to anyone who bothers to understand RMS, it's pretty clear the FSF is not altering the bargain. GPLv3 is pretty much ALL BUGFIXES.
Okay, we may disagree, but the anti-Tivoisation clause simply prevents the manufacturer from reserving to itself the right to update the software.
Another bugfix, for a bug I didn't even realise existed: if you put a GPLv2'd BINARY on your website, then even if you put the source right next to it you trigger the "make the source available for three years" clause. You have to FORCE people to download the source.
Toward healthy paranoia
GnuTLS has gone back to (L)GPLv2+, not GPLv2.
The license change is because a number of projects couldn't use newer versions of GnuTLS because they were using GPLv2 only. i.e., this is allowing for those annoying projects, not becoming one of them.
Toward healthy paranoia
GnuTLS is rarely (if ever) used and it's not part of the solution at all.
The solution is GPLv3/LGPLv3, to give users the right to fix the software running on their hardware. Or how we (as citizens, a nation, etc.) ensure that vendors, OEMs, ODMs, and manufacturers meet security criteria throughout the life cycle of a device: it's a political matter; some pressure has to be put on the providers to keep devices safe.
You mean there is no difference between incomprehensible, unmaintainable code and easy-to-read, understandable code?
Over time, each piece of code will have its own set of vulnerabilities. There's no perfect code. So if the code, whatever it is, is burned into a ROM, its quality doesn't really matter to me. As soon as it's burned, it's already weak.
Toward healthy paranoia
The solution is GPLv3/LGPLv3, to give users the right to fix the software running on their hardware.
Or how we (as citizens, a nation, etc.) ensure that vendors, OEMs, ODMs, and manufacturers meet security criteria throughout the life cycle of a device: it's a political matter; some pressure has to be put on the providers to keep devices safe.
What matters is how one can address the current shortcomings and vulnerabilities, how one can fix them, and how one can deploy a fixed version.
GPLv3/LGPLv3 GnuTLS would allow users to do this, while BSD-licensed OpenSSL doesn't.
Toward healthy paranoia
Well, unless you build your device to specifically deny full access to the owner.
Toward healthy paranoia
Nevermind that the alternative licensing schemes (e.g. BSD, Apache) could just as easily end up in the same situation, with future releases licensed under different terms.
Honestly, I just don't get it. :)
So at the end of the day, folks who take the "oh noes, the people who write software we get to use and modify for free may, at some point in the future, change the terms of new software they write after that date!" attitude come off as if they are complaining that their free pony doesn't come in the color they want.
Toward healthy paranoia
Toward healthy paranoia
Toward healthy paranoia
Nobody makes people use GPLvX _or later_, which is required for your argument to make any sense.
1. If the Program specifies a version number of this License which applies to it and "any later version", you have the option of following the terms and conditions either of that version or of any later version published by the Free Software Foundation.
2. If the Program does not specify a version number of this License, you may choose any version ever published by the Free Software Foundation.
If you are happy with GPLv2 then use that; if you are OK with GPLv3 then use that.
You are quite right that big corps don't like GPLv3, but in my experience that's almost entirely due to patent-paranoia, rather than anything to do with potential changes in new licences.
Toward healthy paranoia
The idea that you can just omit “or any later version” and thus make sure the new ideas of GPLv3 will not affect you was Linus's invention, not the FSF's—and others were not so enlightened.
Wol
Toward healthy paranoia
Actually, the GPL (v2 or otherwise) ITSELF discusses NONE of this.
9. The Free Software Foundation may publish revised and/or new versions of the General Public License from time to time. Such new versions will be similar in spirit to the present version, but may differ in detail to address new problems or concerns.
Each version is given a distinguishing version number. If the Program specifies a version number of this License which applies to it and "any later version", you have the option of following the terms and conditions either of that version or of any later version published by the Free Software Foundation. If the Program does not specify a version number of this License, you may choose any version ever published by the Free Software Foundation.
There is a LOT of rubbish in the comments on this article, and it's pretty much all down to the fact that far too many people don't actually understand how copyright works.
Toward healthy paranoia
As stated in your link, 17,000 Wikimedia/Wikipedia authors voted on it, with 75% in favour of the license change.
The FSF was just an accomplice in that plot, originating in the Wikimedia communities, as it had the power to allow licensing changes through the 'or later' clause.
Toward healthy paranoia
You can't claim that the contributors didn't have a say in the license.
They agreed to any changes the FSF might make when they agreed to the "or later" clause in the original license.
In the same way, if you choose to license your work with an "or later" clause you give up control over how the work may be licensed in the future to whichever organization publishes the license.
If that isn't what you want, don't license the work under an "or later" clause in the first place.
Toward healthy paranoia
The FSF obviously felt the proposed Creative Commons license was "similar in spirit" to the GFDL, while resolving a number of issues which arose because what it was being applied to was not, in fact, documentation.
1. How could the FSF know whether CC-BY-SA 10.0 will have any resemblance to the GFDL at all? They allowed relicensing from GFDL 1.2 to CC-BY-SA 10.0, after all.
2. Their promise quite explicitly said any later version published by the Free Software Foundation - is it fair to abuse this permission to switch to some other license not published by the Free Software Foundation?
3. If CC-BY-SA is “similar in spirit” and actually resembles the GFDL, then why does the FSF say (quite explicitly) that we do not want to grant people this permission for any and all works released under the FDL?
They weren't breaking any promises, just using their position as the authoritative publisher of new versions of the GFDL to resolve real problems and concerns relating to the GFDL in the context of WikiMedia.
There are no "vassals" here, only free individuals choosing licenses for their contributions without thinking through all the possible consequences.
Toward healthy paranoia
Of course, there is no requirement for the project to include contributions without such a clause, and doing so may seriously complicate project management down the road. If no compromise can be reached they may simply have to do without your contribution.
Unwelcome title
Such a title has the potential to prevent us from reaching practical security based on actual open-source assets (such as public reviews, objectivity about weaknesses, and verifiable expertise of reviewers), while emphasizing unrealistic efforts toward (impossible) perfect security.
As you suggest very cleverly in the rest of the article (which, as usual, I like much more), I do think too that we need to encourage people to invest more effort in that area (and certainly differently from the way investments are made today [1]) to reach another security level in open source with adequate confidence (i.e., limited, of course, because unlimited confidence is for human lovers, not for computers; yet ;-).
But the motivation for that should not be paranoia IMHO, but the hope that such a property would offer new potential for computer usefulness (and for the general public) and also be fun - because that's what fuels open source work.
Toward healthy paranoia
That would explain a lot of things; but it shapes a frightening next decade for computer security, when both governments and big companies need to justify the high wages spent on this misleadingly named "cyberdefense".
Toward healthy paranoia
For example:
- most of the tools are written in 'unsafe' languages.
- the OS architecture isn't especially based on security.
Fat chance, it isn't even *designed* the right way!
Toward healthy paranoia
security has not really been a concern in the FOSS community (just as in the proprietary software world): it's features first, and then security is bolted on afterwards, which of course doesn't work.
Toward healthy paranoia
- C instead of 'safe by default' languages such as Ada (for example).
- the X design: any application can snoop on other applications
...
Toward healthy paranoia
Say what you will about the intentions for fixing vulnerabilities in the proprietary world. I find it to be the same for the Linux kernel really.
What's undeniable though is the dramatic change Microsoft has made in their development processes […]
Toward healthy paranoia
There will also always be a role for encryption, for human-rights activists, diplomats, spies, and other "professionals." But for Mr. and Mrs. Smith, the solution can only come from politics that respect a basic human right to privacy—an encryption arms race will not work.
Toward healthy paranoia
The key issue is: how do we control the government in its usage. I would *really* like to know when a government agent and/or a big company and/or a neighbour and/or a manager performs an operation on files related to me. Full stop. *Only* individuals acting on their own should have a right to privacy; the others should be monitored in the first place.
Toward healthy paranoia
There is, for example, some evidence that the NSA has inserted weaknesses into some random-number generation standards,
NSA inserting weaknesses into standards
No, I meant how does the NSA insert something into a standard?
Toward healthy paranoia
A bug is a bug, I want to agree with this, but no open-source hobby programmer would say 'I put that bug there deliberately' when it has been found, like the NSA did.
If they really spend millions of dollars to weaken cryptography technologies, which I don't believe, then the whole organisation should be completely abolished, because then they would be producing insecurity instead of security.
For everyone who has a healthy sense of paranoia already, I recommend this:
https://www.eff.org/https-everywhere
https://www.eff.org/https-everywhere/faq