
Gräßlin: FLOSS after Prism: Privacy by Default

On his blog, KDE hacker Martin Gräßlin issues a call to action for free software developers to have their projects default to privacy-preserving operation. "With informational self-determination every user has to be always aware of which data is sent to where. By default no application may send data to any service without the users consent. Of course it doesn't make sense to ask the user each time a software wants to connect to the Internet. We need to find a balance between a good usability and still protecting the most important private data. Therefore I suggest that the FLOSS community designs a new specification which applications can use to tell in machine readable way with which services they interact and which data is submitted to the service. Also such a specification should include ways on how users can easily tell that they don't want to use this service any more."
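
A purely hypothetical sketch, in Python, of the kind of machine-readable declaration the post argues for; every field name below is invented for illustration and comes from no existing specification:

# Hypothetical per-application service manifest; all names here are illustrative only.
SERVICE_MANIFEST = {
    "application": "example-weather-applet",
    "services": [
        {
            "endpoint": "https://weather.example.org/api",
            "purpose": "fetch the local forecast",
            "data_sent": ["approximate location"],
            "opt_out": "Settings -> Weather -> disable online forecast",
        },
    ],
}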


Gräßlin: FLOSS after Prism: Privacy by Default

Posted Aug 9, 2013 0:16 UTC (Fri) by Company (guest, #57006) [Link] (3 responses)

I don't want privacy by default. I want public by default.

I want privacy where it matters and then I want it well thought out and working. Privacy is hard, not even the Tor guys got it right...

Gräßlin: FLOSS after Prism: Privacy by Default

Posted Aug 9, 2013 0:50 UTC (Fri) by nybble41 (subscriber, #55106) [Link] (1 response)

> I want privacy where it matters

If the policy is not privacy by default then any private communications will stand out and draw suspicion. Moreover, once something is made public there is no way to take it back. Unless you've specifically indicated that you want something to be shared, it's safer to assume that it's supposed to remain private.

Gräßlin: FLOSS after Prism: Privacy by Default

Posted Aug 9, 2013 7:06 UTC (Fri) by mmarq (guest, #2332) [Link]

And no software, no matter the genius of those who code it, can decide wisely... or at all... what is unimportant and what is private.

The hell I would accept any software making that decision for me.

OTOH, the "security psychosis" is driving many users out of FOSS... for me, Firefox's warnings and page blocking were the last straw (Mozilla, go F... your own patience).

I'm "always" ultra safe... what is important doesn't go near the internet, not even at a galaxy's distance... if it must go, then it's not important, or I'm out of it altogether.

Gräßlin: FLOSS after Prism: Privacy by Default

Posted Aug 9, 2013 14:54 UTC (Fri) by drag (guest, #31333) [Link]

> I don't want privacy by default. I want public by default.

Actually I would vastly prefer private by default and then public only when it matters.

> I want privacy where it matters and then I want it well thought out and working. Privacy is hard, not even the Tor guys got it right...

I think that maybe privacy and anonymity are being conflated here? Tor is aiming at approximating both, which is going to be very hard.

I mean, I can have a private life and have privacy in my house due to the barriers I have created to prevent observation by others (i.e. walls and window blinds), but a great many people know who I am and where I live and such things. So I have privacy even though I am not anonymous in my comings and goings. It would be much more difficult to live my life in complete anonymity.

And likewise, it's very easy to encrypt stuff so that only you and the server (and by extension the people running the server) really know what was sent. But it's virtually impossible for me to hide the fact that I was communicating with the server.

I therefore think the KDE guy said stuff that was pretty common sense, even though it's easy to nitpick specific things he said and idioms he used. It's important to be able to decide what gets sent out and what does not. The only way to maintain privacy on the internet is to simply not send data out over it, and if you do need to communicate, to have everything properly authenticated and encrypted and to identify accurately who you are communicating with.

Gräßlin: FLOSS after Prism: Privacy by Default

Posted Aug 9, 2013 0:56 UTC (Fri) by euske (guest, #9300) [Link] (3 responses)

This is all nice and cozy, but I feel that today's major culprit in privacy invasion is the Web application, so limiting desktop apps might not have a big impact.

What's really scary is not covert data collection, but the fact that many users *willingly* submit their information to these apps.

Gräßlin: FLOSS after Prism: Privacy by Default

Posted Aug 9, 2013 5:06 UTC (Fri) by maxiaojun (guest, #91482) [Link]

Exactly. Even if some niche Web applications claim that they preserve user privacy, how can one verify that?

Gräßlin: FLOSS after Prism: Privacy by Default

Posted Aug 9, 2013 17:03 UTC (Fri) by shmerl (guest, #65921) [Link] (1 response)

Privacy concerns will reduce the usage of web applications, which can rarely handle end-to-end encryption properly.

Gräßlin: FLOSS after Prism: Privacy by Default

Posted Aug 10, 2013 5:58 UTC (Sat) by maxiaojun (guest, #91482) [Link]

Web applications make Linux much more usable. When it comes to native desktop applications, Linux is certainly the worst among the big three.

Yes, Linux is good for servers. But I find that "server" here mostly means the public Web server (the one that hosts Web applications). For most organizations the client side is still Windows/Office, and Linux servers are not that prominent in that case.

The conclusion is that dismissing Web applications is ridiculous. Even MSFT is trying to follow this trend now.

PRISM is not a surprise at the end of the day. I believe that any government in the world wants the same surveillance power if it can get it. And I suspect that some of them already have it. For example, is it news that the German government uses spyware?

http://www.nytimes.com/2011/10/15/world/europe/uproar-in-...

Web-(developer)-oriented Crypto Algorithm

Posted Aug 10, 2013 13:40 UTC (Sat) by martin_vahi (guest, #92302) [Link] (13 responses)

Dear lwn.net readers,

The purpose of this post is to advertise a one-time pad crypto algorithm derivative that has been specifically designed for web programming.

The advertised algorithm differs from the one-time pad in that the one-time pad works with bits, while the derived algorithm uses single Unicode characters instead of single bits.

The specification resides at:
http://longterm.softf1.com/specifications/txor/index.html

Yours sincerely,
Martin.Vahi@softf1.com
Just another freelance software developer

P.S. Comments are also welcome. :-)

Web-(developer)-oriented Crypto Algorithm

Posted Aug 10, 2013 18:23 UTC (Sat) by rmayr (subscriber, #16880) [Link] (4 responses)

I am not sure if I see the applicability to web programming. How do you suggest web apps do the key management if no high-bandwidth out-of-band channel exists between the web server and the browser (or two browsers)?

An idea: If one does it, then one Should at Least try to do it Well

Posted Aug 12, 2013 9:46 UTC (Mon) by martin_vahi (guest, #92302) [Link] (3 responses)

The short answer is: TXOR and one-time pad are symmetric-key algorithms and therefore do not address the scenario that You described.

The motivation for rejecting public-key cryptography, regardless of crypto algorithm, is as follows:

Claim_1)

ciphertext = encryption(part_1_of_the_keypair, cleartext) = function_1(cleartext)
cleartext = decryption(part_2_of_the_keypair, ciphertext) = function_2(ciphertext)

That is to say:

cleartext = function_2(ciphertext)
cleartext = function_2(function_1(cleartext))

id est

There is a bijection between cleartext and ciphertext, and function_2 is the inverse function of function_1.

Claim_2)

In the case of all public-key cryptography algorithms, one of the parts of the key pair is public. As the encryption/decryption algorithm is also public, one of the two functions, function_1 or function_2, is always public.

It is only a matter of mathematical skill to construct function_1 or function_2 if it is firmly known that the publicly known function definitely has an inverse function.

Claim_3)

The fact that I'm dumb enough to fail at the task does not mean that others, professional mathematicians who work on it as part of their day job, maybe (but not only) at the NSA, can't solve it.

An idea: If one does it, then one Should at Least try to do it Well

Posted Aug 12, 2013 10:10 UTC (Mon) by dlang (guest, #313) [Link]

You seem to be missing the reason that public-key algorithms have been used, namely the difficulty of distributing and managing symmetric keys (including one-time pads).

Public-key algorithms work when function1 and function2 are both well known, but it is impractical to derive one key when you know the other. If you are not willing to trust that, it's trivial today to use symmetric keys (if you solve the key distribution problem).

An idea: If one does it, then one Should at Least try to do it Well

Posted Aug 12, 2013 10:23 UTC (Mon) by rmayr (subscriber, #16880) [Link] (1 response)

I understand the difference between symmetric and asymmetric cryptography, and that is the reason for my question. The OTP is the most difficult option in terms of key management. Not only do you need to keep the key secret during the exchange between the involved parties (in contrast to asymmetric crypto), but it is also going to have to be as long as the message itself, and it must never be re-used.

That is what confuses me: you are talking about web programming as the application area for your character-based OTP combination operator, but in web programming I see no way to realistically do the key management for anything remotely OTP-like (hence most/all claims of OTP for web applications are snake oil).

If you intend to reject asymmetric crypto, then I'd love to hear a better option for it (as we have known for quite a while that e.g. DH and RSA will be susceptible to quantum algorithms once we get a sufficient number of qubits in a stable configuration).

Btw, asymmetric crypto is not inherently less secure than symmetric crypto, as the decryption operation is always the inverse of encryption (we are talking about lossless encryption, I assume ;-) ). It has just been studied a lot longer.

Rene

An idea: If one does it, then one Should at Least try to do it Well

Posted Aug 12, 2013 10:24 UTC (Mon) by rmayr (subscriber, #16880) [Link]

To clarify my btw, symmetric crypto has been studied for a lot longer than asymmetric. My original comment might be read the other way, sorry about that.

Web-(developer)-oriented Crypto Algorithm

Posted Aug 10, 2013 18:55 UTC (Sat) by johill (subscriber, #25196) [Link]

so unicode characters aren't stored as bits? ....

Web-(developer)-oriented Crypto Algorithm

Posted Aug 10, 2013 18:59 UTC (Sat) by dps (guest, #5725) [Link]

The proposal would seem to be about twice as expensive as xor for no increase in security given a genuinely random pad that is used only once. Generating, distributing and destroying pads could be a major problem.

c = m xor KDF(key), where KDF is a key derivation function that expands key to the length of m, has already been proposed. One advantage of using xor is that the same code, or discrete logic, can be used for both encryption and decryption. The security of this construction is usually limited by the key size.
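
A minimal sketch of that construction, using SHAKE-256 from Python's hashlib as the key-expanding function (the choice of XOF is an assumption made purely for illustration; a real design would also need a nonce and authentication):

import hashlib

def xor_with_kdf(key: bytes, message: bytes) -> bytes:
    # Expand the key to the length of the message with an XOF, then XOR.
    # Applying the same function to the ciphertext decrypts it.
    stream = hashlib.shake_256(key).digest(len(message))
    return bytes(m ^ s for m, s in zip(message, stream))

ciphertext = xor_with_kdf(b"secret key", b"attack at dawn")
assert xor_with_kdf(b"secret key", ciphertext) == b"attack at dawn"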

Web-(developer)-oriented Crypto Algorithm

Posted Aug 10, 2013 20:15 UTC (Sat) by geofft (subscriber, #59789) [Link]

You claim that the advantage of TXOR is that it's "computationally efficient to use within text processing oriented scripting languages". But this sort of processing goes back into arithmetic on the character values, and your algorithm has three arithmetic operations (add, subtract, and mod), whereas XOR only has one (XOR). So I expect it to be less computationally efficient than XOR.

Here's some quick anecdata in line with that, in Python 3:

>>> import random, time
>>> str1 = ''.join(chr(random.randint(0, 0x10ffff)) for i in range(4096))
>>> str2 = ''.join(chr(random.randint(0, 0x10ffff)) for i in range(4096))
>>> def xor(a, b):
...   return chr((ord(a) ^ ord(b)) % 0x110000)  # actual XOR; the mod keeps the result in chr()'s valid range
...
>>> def txor(a, b):
...   return chr((ord(b) - ord(a) + 0x110000) % 0x110000)
... 
>>> time.clock()
66.14
>>> for i in range(1000):
...   _ = ''.join(xor(c1, c2) for (c1, c2) in zip(str1, str2))
... 
>>> time.clock()
69.41
>>> for i in range(1000):
...   _ = ''.join(txor(c1, c2) for (c1, c2) in zip(str1, str2))
... 
>>> time.clock()
73.12

So 3.27 seconds for the XOR version, and 3.71 for the TXOR version. (I ran this a few times to avoid cold-cache effects and got the same values to within a few hundredths of a second.)

Besides, any serious use of encryption should be passing off the string as a binary string to an existing encryption library using an existing encryption mode, which should be doing all the XORs in C. There's generally no good reason to be implementing this yourself in a high-level language, and definitely no good reason to be using one-time pads directly.
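
As a sketch of "hand the bytes to an existing library", using the third-party cryptography package (the library is my choice for illustration, not one named in the thread); the string is encoded to UTF-8 and the real cipher work happens in C:

from cryptography.fernet import Fernet

key = Fernet.generate_key()                   # symmetric key, to be distributed out of band
f = Fernet(key)
token = f.encrypt("grüße".encode("utf-8"))    # authenticated encryption of the UTF-8 bytes
assert f.decrypt(token).decode("utf-8") == "grüße"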

What the TXOR is Good for

Posted Aug 12, 2013 9:19 UTC (Mon) by martin_vahi (guest, #92302) [Link] (4 responses)

Thank You for Your questions and Your answers!

The main problem that the TXOR solves is to allow a one-time-pad-like algorithm to be used without having to treat strings as bitstreams. Writing code that translates decrypted bitstreams back to strings takes a considerable amount of labour, and the only, or at least the main, benefit of the TXOR is that it allows skipping the writing of the function decrypted_bitstream_2_programminglanguage_specific_string(..).

It seems to be a trend that UTF-8 is the preferred format on the server side, including databases, while UTF-16 is the default in JavaScript. UTF-8 and UTF-16 are different binary formats that both use varying numbers of bytes to encode Unicode characters. Part of the computational efficiency of the TXOR comes from avoiding the encoding-format-specific book-keeping that would run within decrypted_bitstream_2_programminglanguage_specific_string(..).

Given the task of writing the code that converts decrypted bitstreams back to strings in the programming-language-specific/supported encoding, one might also say that the main benefit of the TXOR is to allow a one-time-pad-like algorithm to be used in situations where the amount of available software development resources is very limited. Big corporations and the NSA do not benefit from the TXOR that much, because they have loads of resources to write practically any extensive software that they please, but small software shops have to achieve the same result with a considerably smaller amount of resources.
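
To make the character-wise idea concrete, a minimal sketch based only on the combination operator shown in the timing example earlier in the thread (it is not taken from the actual TXOR specification): the pad is a list of random code points, and encryption and decryption are modular addition and subtraction of code points, so no byte-level encoding is touched.

import secrets

UNICODE_RANGE = 0x110000  # number of Unicode code points

def make_pad(length):
    # The pad must be truly random, as long as the message, and never reused.
    return [secrets.randbelow(UNICODE_RANGE) for _ in range(length)]

def encrypt(message, pad):
    # Note: the resulting string may contain lone surrogates, which plain UTF-8 encoders reject.
    return ''.join(chr((ord(c) + k) % UNICODE_RANGE) for c, k in zip(message, pad))

def decrypt(ciphertext, pad):
    return ''.join(chr((ord(c) - k) % UNICODE_RANGE) for c, k in zip(ciphertext, pad))

pad = make_pad(len("grüße aus Tallinn"))
assert decrypt(encrypt("grüße aus Tallinn", pad), pad) == "grüße aus Tallinn"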

Thank You all again for
Your answers, questions and
for reading my comment. :-)

What the TXOR is Good for

Posted Aug 12, 2013 10:14 UTC (Mon) by dlang (guest, #313) [Link] (3 responses)

if you never have to interact with anything but your own software, you never have to convert strings from one representation to another (at which point TXOR doesn't save you anything)

But as soon as you need to deal with other software, you have to agree on a representation to use (at that point TXOR doesn't avoid needing to convert the data)

Java is stuck using UTF-16 for legacy reasons, but everyone else has recognized that UTF-8 is much more efficient as a practical matter.
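
As a side note on how small that conversion step usually is, a sketch (mine, not from the comment above) of going from decrypted bytes back to a language-level string and vice versa, with the standard library doing all the encoding book-keeping:

# Decrypted bytes -> Python string: the codec handles the multi-byte book-keeping.
decrypted = bytes([0x67, 0x72, 0xc3, 0xbc, 0xc3, 0x9f, 0x65])
text = decrypted.decode("utf-8")        # 'grüße'

# And back again before handing the data to anything that expects UTF-8.
assert text.encode("utf-8") == decrypted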

Why raw Bitstreams are Shaky

Posted Aug 12, 2013 16:09 UTC (Mon) by martin_vahi (guest, #92302) [Link] (2 responses)

Thank You for Your answer.

What about conversion between the UTF-8 (PHP, databases) and the UTF-16 (JavaScript)?

How can one make sure that the decrypted_bitstream_2_string will not become yet another Y2K/UNIX-2038-style problem after a Unicode standard update?

An attempt to write bitstream2string in one of the answers at http://stackoverflow.com/questions/1240408/reading-bytes-from-a-javascript-string contains some Unicode-specific constants that probably have to be updated after a Unicode standard update.

Such library updates are not always possible, either due to the limited amount of development time or through mere lack of (legal) access to clients' servers.

Why raw Bitstreams are Shaky

Posted Aug 12, 2013 21:17 UTC (Mon) by dlang (guest, #313) [Link] (1 response)

if you have javascript talking directly to a database, something is wrong.

you should have javascript talking to a process on your server, and that process on your server will talk to a database.

That server process is going to have to convert the data from one format to another at some point, TXOR doesn't eliminate the need to do that.

If the UTF-16 specification changes in a way that's incompatible with the existing deployed data, the updated specification is going to be ignored. The existing data is not going to be thrown away just because someone wants to change a spec. So your worries about how to handle such a problem are meaningless.

Why raw Bitstreams are Shaky

Posted Aug 23, 2013 2:03 UTC (Fri) by elanthis (guest, #6227) [Link]

> if you have javascript talking directly to a database, something is wrong.

I take it you're not a NodeJS fan. :)

Gräßlin: FLOSS after Prism: Privacy by Default

Posted Aug 11, 2013 4:24 UTC (Sun) by Richard_J_Neill (subscriber, #23093) [Link] (13 responses)

I think we need to define more clearly who the adversary is (aka "Eve" in the Alice & Bob illustration). I don't think that it's really corporations who are the problem - whatever the sins of Facebook, Google, Doubleclick, etc. may be, it's always possible for the consumer to opt out. The problem is clearly government and the (regrettably, usually legal) actions of the "security" services. These, we cannot opt out of, though they are far more of a threat. Also, the metadata is as much a problem as the data (the EFF explain it very lucidly here: https://www.eff.org/deeplinks/2013/06/why-metadata-matters ).

For example, I might (with reservations) be perfectly happy to trust Facebook not to be evil with a subset of my personal data. But I don't trust my ISP to carry the data back and forth without leaking it (or the metadata, if it's encrypted), I distrust the legal process which could subpoena it (from the ISP or FB), and I very much distrust the governments and agencies involved.

So, I think we actually need to fix the infrastructure. For example, Firefox should always embed Tor (and make it easy to run an intermediate node); Thunderbird should include enigmail; part of the setup for every Linux distro should include crypto; we need a genuinely trustworthy SSL certificate root (and perhaps a body such as the EFF to allocate free SSL certificates to every Linux installation: for example, when I set up apache, it should be able to get an SSL cert automatically, much in the same way that openssh-server generates a key on first run).

Also we need a solution for *routing*. I don't know how this could be done... for example, how do I make a DNS request without the DNS server's administrator knowing; or how do I send an email to a friend without leaking the metadata of the fact of that communication: who spoke to whom and when. [Has this been solved in any of the bit-torrent protocols?]

On the up-side, most ADSL users now have >10 Mbit/s connections... that means that, most of the time, we could tolerate a 10x slowdown in data speed (especially if combined with pervasive, transparent use of rsync and compression). So Tor-by-default might be a good way to go.
Also, much of the world has smartphones now... so Android could implement encrypted, off-the-record messaging by default.

In my view, what we need now is leadership and co-ordination. While I'd like to hope we can defeat the NSA at the ballot box, we will probably have to do it in source-code: not just for ourselves, but for our less-technical friends and family, even those in the Windows world.

Gräßlin: FLOSS after Prism: Privacy by Default

Posted Aug 11, 2013 7:19 UTC (Sun) by maxiaojun (guest, #91482) [Link] (6 responses)

Are you proposing a totally anonymous Internet? If so, I guess it would be a paradise for terrorists, drug dealers, pornographers, ...

If FLOSS people really love anonymity, why don't they try this idea in FOSS communities first?

Why does LWN commenting require an ID?
Why does bug reporting require registration and an e-mail address?
Why do mailing lists require an e-mail address?
Why do IRC channels require a nick, sometimes even a registered one?
...

On the other hand, FOSS communities' communication archives are generally publicly accessible; isn't this a major source of privacy leaks?

Gräßlin: FLOSS after Prism: Privacy by Default

Posted Aug 11, 2013 15:51 UTC (Sun) by Richard_J_Neill (subscriber, #23093) [Link]

> Are you proposing a totally anonymous Internet? If so, I guess it would
> be a paradise for terrorists, drug dealers, pornographers, ...

Not quite: I'm proposing a totally pseudonymous Internet.

The problem with computers is that we don't get shades of grey. Everything is binary. Either we build a fully encrypted network (and yes, this will enable some of the bad guys), OR we permit a complete surveillance state.

The lessons of history teach us how fragile democracies are, how easily they fall, and how little we should trust the "guardians" with power. Personally, I'll take the risk of terrorism over the risk of tyranny.

Off-topic: might I also suggest that the way many terrorists are created is by disenfranchisement. If governments were denied the illusion that surveillance is a magic solution to terrorism, it might just possibly encourage our politicians to seek proper solutions to the issues.

Gräßlin: FLOSS after Prism: Privacy by Default

Posted Aug 12, 2013 7:46 UTC (Mon) by jezuch (subscriber, #52988) [Link]

> Are you proposing a totally anonymous Internet? If so, I guess it would be a paradise for terrorists, drug dealers, pornographers, ...

Ah, the usual collection of boogeymen. I'm not going to let the 0,01% of wrongdoers spoil the Internet for the remaining 99,99%.

Gräßlin: FLOSS after Prism: Privacy by Default

Posted Aug 12, 2013 10:49 UTC (Mon) by jwakely (subscriber, #60262) [Link] (1 response)

> Are you proposing a totally anonymous Internet? If so, I guess it would be a paradise for terrorists, drug dealers, pornographers, ...

Note that terrorists, drug dealers, pornographers etc. can also use public transport, allowing them to travel anonymously from home to their place of business to freely conduct terrorism, drug dealing or pornography. They are also allowed to gather in public places to form their nefarious plots and schemes. They also buy food in shops, without having to identify themselves. We even allow them to view inspirational movies such as The Rock, Blow and Boogie Nights.

We must make every home a prison.

Gräßlin: FLOSS after Prism: Privacy by Default

Posted Aug 12, 2013 10:52 UTC (Mon) by Cyberax (✭ supporter ✭, #52523) [Link]

>We must make every home a prison.
Don't give them ideas.

By now it's pretty clear that the US (and UK) governments use "1984" and "Fahrenheit 451" as instruction manuals.

Gräßlin: FLOSS after Prism: Privacy by Default

Posted Aug 13, 2013 13:30 UTC (Tue) by Seegras (guest, #20463) [Link]

> Are you proposing a totally anonymous Internet?

Anonymity is actually a cornerstone of democracy. Because, you know, if it does not exist you could be blackmailed or strong-armed by somebody who wants you to vote his way. Or, even more insidious, someone could find out how you're likely to vote and try to suppress your vote by other means, as happened with black communities in the USA that were prevented from voting.

Anyone being against anonymity is likely an enemy of democracy.

Gräßlin: FLOSS after Prism: Privacy by Default

Posted Aug 23, 2013 2:13 UTC (Fri) by elanthis (guest, #6227) [Link]

> Why does bug reporting require registration and an e-mail address?

Actually, this is a severe usability problem. Almost every time a user wants to file a bug they have to (a) find the bug reporting site, (b) find and click "sign up", (c) fill in three pages of info, (d) wait for an email that does not always show up instantly, (e) go to a confirmation or login page, (f) finally start filling in a rigid form designed by people who think every user is an experienced QA lead, and then (g) potentially get stuck receiving email updates for the next 5 years on a bug that is apparently too contentious or difficult to just fix.

The vast majority of people never even get to step (a). A very large number of those who get past that stop somewhere in (b)-(e). (g) and the pain of going through the prior steps train more people not to even bother with (a). The system is broken.

I've been very happy with an open bug report form using Akismet and some other strategies to eliminate spam. The form has an _optional_ sign up field offering Google or Facebook or OpenID login. The bug report entry is just a big text field.

If I get a low-quality report with no contact email, I just hit the big easy red "Ignore" button on the admin side. The workflow is incredibly simple. I get a number of bug reports I'm quite certain I never would have gotten otherwise. For larger projects, we have trained QA teams who filter the bugs from users before passing them to devs, so low-quality reports never waste a minute of engineering time.

The best systems collect as much information as possible (I realize the irony, given the nature of this thread) and then filter things down to the relevant kernels. Even if 90% of the bugs received are too low-quality to spend time trying to fix, trends in what kinds of bugs are received, the categories (selected by the triage team) and keywords, and so on all help build up very real and useful data that doesn't exist when only the most diehard and desperate users are filing bugs.

Gräßlin: FLOSS after Prism: Privacy by Default

Posted Aug 11, 2013 8:41 UTC (Sun) by mgraesslin (guest, #78959) [Link]

See my second blog post to that topic: http://blog.martin-graesslin.com/blog/2013/08/floss-after...

I think you will find some of your ideas covered :-)

Gräßlin: FLOSS after Prism: Privacy by Default

Posted Aug 11, 2013 14:29 UTC (Sun) by raven667 (subscriber, #5198) [Link] (4 responses)

> I don't think that it's really corporations who are the problem - whatever the sins of Facebook, Google, Doubleclick, etc may be, it's always possible for the consumer to opt out. The problem is clearly government

I think those are one and the same: data collected by private companies is available to the government spooks, and the problems with collecting this data are just as bad when private companies do it as when the government does.

It's not a matter of opting out of modern living, like some sort of electronic Amish; it's a matter of having clearly articulated standards of behaviour (laws) and then enforcing those standards with oversight (regulation). This can be accomplished if the populace is willing to work the levers of power (democracy) to make it happen; otherwise whoever else is working the levers of power (plutocrats) will have their way and no useful standard of behaviour will be enforced.

Personally I'd like to see privacy laws such that it would be illegal to hold personally identifying information from, say, web server logs for more than two weeks. You could anonymize your stats and roll them up into reports, but not keep the full-resolution data. Same for phone companies: do they even need to record call history for billing purposes any more? Many plans are flat-rate, unlimited calling, so we could require that call history not be recorded at all, or be destroyed at the end of the billing period and not be shared with outside parties or used for any other purpose.
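
As a sketch of what "anonymize your stats" could look like in practice (an illustration only, using a made-up combined-format log line), the client IP is replaced by a salted hash so per-visitor counts still aggregate while the raw address is gone:

import hashlib
import re
import secrets

SALT = secrets.token_bytes(16)  # rotate the salt when the retention window rolls over

def anonymize(line):
    # Replace the leading client IP of a combined-format log line with a short salted hash.
    def repl(match):
        return hashlib.sha256(SALT + match.group(0).encode()).hexdigest()[:12]
    return re.sub(r"^\d{1,3}(?:\.\d{1,3}){3}", repl, line)

print(anonymize('192.0.2.7 - - [11/Aug/2013:14:29:01 +0000] "GET / HTTP/1.1" 200 512'))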

> I might (with reservations) be perfectly happy to trust Facebook

Without transparency and oversight that "trust" is blind and very asymmetric, they can make a lot of informed guesses about you as a person but you have no idea how they are judging you.

> Also we need a solution for *routing*. I don't know how this could be done... for example, how do I make a DNS request without the DNS server's administrator knowing; or how do I send an email to a friend without leaking the metadata of the fact of that communication: who spoke to whom and when. [Has this been solved in any of the bit-torrent protocols?]

I don't think this is the path to go down; at best it might be useful for those people who have a high tolerance for operational security, but it's probably fundamentally impossible to communicate the way we normally do without leaving a lot of metadata for traffic analysis.

> In my view, what we need now is leadership and co-ordination. While I'd like to hope we can defeat the NSA at the ballot box, we will probably have to do it in source-code: not just for ourselves, but for our less-technical friends and family, even those in the Windows world.

I think that if you don't fix this stuff at the ballot box then, while you might have a few cypherpunks patting themselves on the back about how clever they are, the rest of the population is just going to be herded into the wood chipper, metaphorically speaking. Hopefully just metaphorically, the tools the NSA wields and the data that Google and Facebook have, should not be handed on a silver platter to the next Stalin, as if that kind of great evil can't happen again or can't happen here.

The only way to keep data safe is to not have it at all.

Gräßlin: FLOSS after Prism: Privacy by Default

Posted Aug 11, 2013 15:35 UTC (Sun) by NAR (subscriber, #1313) [Link]

"Same for phone companies, do they even need to record call history for billing purposes any more, many plans are flat rate, unlimited calling, so we could require that call history not be recorded at all, or be destroyed at the end of the billing period and not be shared with outside parties or used for any other purpose."

A couple of years ago a group of people started randomly killing minority (Roma) persons in Hungary. They went to a secluded house, set it on fire, then shot the fleeing inhabitants (including a four-year-old boy). Part of the evidence against them was mobile phone call history, because the same SIM cards were used where the murders were committed. I don't know how key this information was, but it was definitely used to find the criminals.

I know there's a cultural difference between the US and Europe (especially Eastern Europe) about privacy - but maybe having that information is not that bad. If you don't trust the handling of that information, why do you think they wouldn't keep the data even if you required them not to keep it?

Gräßlin: FLOSS after Prism: Privacy by Default

Posted Aug 11, 2013 16:12 UTC (Sun) by Richard_J_Neill (subscriber, #23093) [Link] (2 responses)

> I think that if you don't fix this stuff at the ballot box then, while
> you might have a few cypherpunks patting themselves on the back about
> how clever they are, the rest of the population is just going to be
> herded into the wood chipper, metaphorically speaking.

I absolutely agree with you, and with your proposals for strong privacy laws. But sadly, the majority of people do not truly appreciate the dangers, and for those that do, there is no clear choice of political party that will really stand for freedom (and have the nerve to take on the national-security apparatus, and that has a chance at the election). By the time privacy truly hits the political headlines, it will be too late.

Obama just said: "It's true, we have significant capabilities. What's also true is that we show a restraint that many other governments around the world refuse to show..."

I don't want a world where businesses usually obey the civilian laws, but everything can be monitored the instant that the NSA choose to refrain from restraint.

Therefore, I think that we need a technical solution. If this is done properly, we can protect the entire World, not just the privacy-aware techies in the West. The majority of the Internet runs on FOSS. So, let's bake strong privacy into the Linux kernel, Apache, Firefox, BIND, etc, and let's also make it really *easy* for non-experts to harden their systems.

Gräßlin: FLOSS after Prism: Privacy by Default

Posted Aug 12, 2013 7:59 UTC (Mon) by jezuch (subscriber, #52988) [Link]

> Obama just said: "It's true, we have significant capabilities. What's also true is that we show a restraint that many other governments around the world refuse to show..."

That's... disastrous. This is very much subject to Murphy's Law, as originally formulated by Murphy: if there are two possible ways to do something and one of them leads to disaster, sooner or later someone will do it the wrong way. The corollary is that an existing capability begs to be used. If there is a capability that can be used to catastrophic ends, sooner or later (rather sooner than later) someone, authorized or not, will use it that way. Restraint be damned...

Gräßlin: FLOSS after Prism: Privacy by Default

Posted Aug 14, 2013 11:43 UTC (Wed) by Arker (guest, #14205) [Link]

"there is no clear choice of political party that will really stand for freedom (and have the nerve to take on the national-security apparatus, and that has a chance at the election)"

There is the LP, which will clearly stand for freedom and has the nerve, and it already has a history of clearing the massive ballot-access hurdles, so for a third party it's the obvious choice.

There is a developing left-right coalition that just needs to coalesce fully to be able to displace the current duopoly. Neither the traditional left nor the traditional right side of that coalition can accept the LP platform whole (the lefties fail economics and the righties fail reproductive rights), but both sides are ever so slowly coming to accept that we have to shut down the national insecurity state first, and there is no point in arguing the rest until after that is done.

Gräßlin: FLOSS after Prism: Privacy by Default

Posted Aug 12, 2013 7:09 UTC (Mon) by zarrro (guest, #54749) [Link]

A lot of people are talking about the right technical solution that will protect the privacy.

This is possible only in theory, not in reality. Why?

Quote [ http://queue.acm.org/detail.cfm?id=2508864 ]:

INCONVENIENT FACT #1 ABOUT PRIVACY
POLITICS TRUMPS CRYPTOGRAPHY
Nation-states have police forces with guns. Cryptographers and the IETF (Internet Engineering Task Force) do not.

The whole article is worth reading BTW, but this is the simple truth. No matter how sophisticated the technology is, there are way too many points on the information highway which are controlled by people with guns.

IMHO, the only way to actually fix this is not to have more privacy. It is to have less :)
This is technically possible, and it can have a far greater effect.
Something like: instead of having one Facebook account where you want to control privacy, you have five where you do not. The more meaningless data there is out there, the faster all the fancy statistics will break. And without those statistics, all the gathered information is quite useless :)


Copyright © 2013, Eklektix, Inc.
Comments and public postings are copyrighted by their creators.
Linux is a registered trademark of Linus Torvalds