2023 PSF annual impact report
Posted May 9, 2024 14:30 UTC (Thu)
by pizza (subscriber, #46)
In reply to: 2023 PSF annual impact report by mb
Parent article: 2023 PSF annual impact report
According to you, personal trust isn't, um, trustworthy either:
"People used to be married to people who betrayed them"
Trust is, until it isn't.
The best you can _ever_ do is have some recourse after the fact, and hope it either acts as a sufficient deterrent or can compensate you for your damages/loss. Decry this principle all you like, but it is the basis [1] of every legal system out there.
[1] Granted, the true basis of _every_ system is the explicit threat of force against those that don't comply with (or otherwise violate) the rules.
Posted May 9, 2024 14:57 UTC (Thu)
by mb (subscriber, #50428)
[Link] (16 responses)
Yes. Trust is never absolute.
I just wanted to say that an "authorization" from government X (place your favorite distrust country here) is useless for me.
Improving trust by asking an untrusted third party is not going to work.
Requiring such authorization just introduces huge barriers into projects for no good reason.
I would probably also have fallen for Jia Tan, if they had attacked me. But no level of government authorization could have prevented it.
That is not how trust works. At all.
I would not trust such authorization from most countries in the world.
Requiring such things makes the situation *worse*. Here is my certificate from government X. How dare you don't trust me! It's written *here* that I am trustworthy.
Posted May 10, 2024 15:50 UTC (Fri)
by kleptog (subscriber, #1183)
[Link] (15 responses)
That makes no sense. Governments do not issue statements of trustworthiness. They issue proofs of identity, with which you can do what you like. You not trusting the proof of identity is orthogonal to whether you trust someone. Trust is also relative: I trust most people not to want to kill me, but the set of people I would trust to pay back a €1000 loan is much smaller.
There are places where the fact you have passport X makes you (somehow) more trustworthy than someone with passport Y, but on an individual level that makes no sense whatsoever. Context matters.
Posted May 10, 2024 16:08 UTC (Fri)
by mb (subscriber, #50428)
[Link]
Next time please read the full text and not only the last sarcastic sentence. Ok? :)
Posted May 10, 2024 20:01 UTC (Fri)
by pizza (subscriber, #46)
[Link] (13 responses)
Yes they do; it's called a security clearance.
But that's another matter entirely.
Posted May 10, 2024 21:55 UTC (Fri)
by kleptog (subscriber, #1183)
[Link] (12 responses)
> Yes they do; it's called a security clearance.
I don't know about all jurisdictions, but at least here what such a clearance means is "we did a bunch of research on someone and didn't find any red flags". And then there are laws that say certain information can be shared with such people. That doesn't mean those people are actually trustworthy, just that from a risk management perspective the risk is low.
So I guess you could say they issue "there's a 99% chance this person is trustworthy, and we can lock them up if they break trust" certificates. Which, from a government's point of view, is good enough for their purposes. It's of no use whatsoever for open-source projects, though.
Posted May 11, 2024 14:28 UTC (Sat)
by pizza (subscriber, #46)
[Link] (11 responses)
Why is that of "no use whatsoever" for open source projects? I mean, that's the same principle F/OSS licences and all other legal constructs (and I'd argue nearly all human interactions) are based on -- folks who violate the rules get punished by the state, either directly or through other enforcement. (And, I might add, this is the ultimate goal of all "Real names" policies. If their proponents say otherwise, they're either lying or blithering idiots.)
I mean, that "99% trustworthy, we'll lock them up if they break trust" is good enough for folks that deal with actual life-and-death situations -- Are you seriously saying that F/OSS development should be held to a _higher_ standard of trust than a doctor or military general?
Posted May 11, 2024 14:50 UTC (Sat)
by mb (subscriber, #50428)
[Link] (9 responses)
Because it's a ridiculous process.
Not even my employer, for whom I develop safety-critical software, requires such nonsense. I have not shown any state authorization document to them. I could have sent anybody under my name.
If an Open Source project requires any sort of state-based authorization, then I'd rather not contribute than go through this nonsense. And I bet I'm not the only one.
So you are effectively reducing the number of people working on these things, and you are making things worse by adding this process.
Posted May 11, 2024 16:26 UTC (Sat)
by pizza (subscriber, #46)
[Link] (8 responses)
That is, IMO, completely fair. And I also completely agree with you.
It's a ridiculous amount of process that _still_ won't guarantee that someone can be "trusted" even in the short term.
...Which is why any proposal along the lines of "developer trustworthiness" should be jettisoned with extreme prejudice -- Frankly, even entirely trustworthy well-intentioned people still make mistakes with potentially disastrous consequences (see: log4j debacle) so we have to be able to deal with those messes regardless.
Instead, we need to focus on (early) detection, containment, and (*always* after-the-fact) cleanup.
...But keep in mind that one facet of post-facto cleanup is using the legal system to punish ne'er-do-wells, which isn't possible without tying pseudonyms to real-world identities, which in turn currently requires a _lot_ of work so is only done for particularly egregious acts (eg where death, serious injury, or very large monetary losses occurred). Having some sort of cross-jurisdiction-verifiable [1] identification requirement would make that much easier, and thus make it possible to go after lower-level offenders (and gain the resulting deterrent effects [2]). Again, this sort of thing is a core precept of both civil and criminal law.
Of course, when the same entity that carries out the punishment also gets to define what is and isn't a punishable offence, there is a significant (and oft-demonstrated) potential for abuse. So there are clearly pros and cons, but ultimately each society has to debate those and determine for themselves how they will balance those opposing principles.
[1] And by that I mean actually *verifiable*, not "send us an easily-photoshopped image of a physical ID card"
[2] A good example of this is how Hollywood has evolved its efforts to combat "piracy"; I personally know several folks who stopped routinely pirating everything once their ISP sent them "do this again and you'll get disconnected, and oh, there's no competition so good luck getting online with a different provider" letters.
Posted May 11, 2024 17:05 UTC (Sat)
by mb (subscriber, #50428)
[Link] (7 responses)
Punishment gets us nowhere.
Does it reduce the effects of the attack?
No.
Does it ensure such crimes happen less?
No. There is no deterrence for crimes above a certain steal-bubblegum-threshold.
Does it reduce the possibility of the perpetrator doing it again?
No. In some countries criminals in prisons even get *more* criminal.
Punishment is hard and expensive to do. Especially if you don't even live in the country of the perpetrator.
Some countries' criminal laws are not even based on punishment as such.
Yes, I would personally also like to know who Jia Tan really is. But what would we do with this information? I can't think of anything good. If he was Chinese, I could immediately see how stupid people would start to generalize and draw stupid conclusions. That would be bad. Especially as we have such people in governments these days.
And then, what do you get? Nothing.
But it could have serious drawbacks for a society to know it.
It would not improve things to know who Jia Tan is. Except for me personally knowing and having a "good" feeling about my prejudices being "right".
Posted May 11, 2024 17:32 UTC (Sat)
by pizza (subscriber, #46)
[Link] (4 responses)
You are correct -- except that we're currently nowhere near that bare minimal threshold.
Posted May 11, 2024 17:42 UTC (Sat)
by mb (subscriber, #50428)
[Link] (3 responses)
There is no deterrence *above* the threshold. That sounds counterintuitive at first, but it actually isn't: people don't think about the possible legal consequences before committing a big crime, because they expect not to be caught in the first place.
Posted May 12, 2024 14:18 UTC (Sun)
by pizza (subscriber, #46)
[Link] (2 responses)
We are saying the same thing, from opposite perspectives.
You can't have punishment without first getting *caught*, and since the odds of getting caught are so small, any potential punishment has no deterrent effect.
However, it's been repeatedly demonstrated that requiring "real names" [1] considerably increases the odds of getting caught and therefore punished.
[1] Even minimally verified
Posted May 13, 2024 5:15 UTC (Mon)
by LtWorf (subscriber, #124958)
[Link] (1 responses)
Posted May 13, 2024 14:17 UTC (Mon)
by pizza (subscriber, #46)
[Link]
As I've repeatedly said (in other threads, in this thread, and even in the message you're replying to) "real names" have to be at least "minimally verified" to have even the possibility of a positive outcome.
(I've also said that you need a much stronger standard -- ie a way to (1) authenticate the credentials themselves, and (2) ensure the credentials match the person presenting them. These are inherently political/jurisdictional issues, not technical.)
Posted May 12, 2024 17:52 UTC (Sun)
by farnz (subscriber, #17727)
[Link] (1 responses)
> Punishment gets us nowhere.
> Does it reduce the effects of the attack?
> No.
> Does it ensure such crimes happen less?
> No. There is no deterrence for crimes above a certain steal-bubblegum-threshold.
That last line is contradictory to what I know of criminology; increasing the punishment does increase the deterrent effect, as long as the chances of getting caught are high enough. The problem comes when you're not increasing the chances of getting caught, and are attempting to deter purely with high penalties if caught.
First, you have people who, for some reason, do not have the ability to engage in causal reasoning. These people are rare, but they do exist.
More significantly, the punishment's effect on deterrence scales with the perceived chance of getting caught to begin with. If you consider your chances of getting caught to be near-zero, no amount of punishment will have a deterrent effect; what's the difference between a loud "NO!" and life in prison if you don't think either will happen?
To put it differently, when they're considering breaking the rules, people multiply their perceived cost of punishment by the perceived chance of being caught; if the resulting number is small enough compared to the perceived benefit of breaking the rules, then they'll break the rules. And there's a mental "clamp" on the range for everything "perceived", so you can't just increase the punishment further to get a bigger deterrent; the only option once the cost of punishment reaches people's "basically too big to get bigger" is to increase the chance of being caught, or reduce the benefit of breaking the rules.
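To put toy numbers on that, here is a minimal sketch of the "perceived cost times perceived chance vs. perceived benefit" calculation (my own illustration -- the function name, figures, and clamp bound are invented for the example, not taken from any criminology source):

    # Sketch: deterrence as perceived cost x perceived catch-chance vs. perceived benefit.
    def will_break_rules(punishment, catch_chance, benefit, perception_cap=10.0):
        # Perceived values are clamped: past some point, "even worse" stops registering.
        perceived_cost = min(punishment, perception_cap)
        perceived_benefit = min(benefit, perception_cap)
        return perceived_benefit > perceived_cost * catch_chance

    # Near-zero odds of being caught: even a huge punishment deters nothing.
    will_break_rules(punishment=1000, catch_chance=0.001, benefit=2)   # True
    # Raising the odds of being caught flips the outcome where raising punishment cannot.
    will_break_rules(punishment=1000, catch_chance=0.5, benefit=2)     # False

Once the punishment hits the perception cap, the only levers left are the chance of being caught and the benefit of breaking the rules, which is exactly the point above.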
Posted May 12, 2024 23:17 UTC (Sun)
by Wol (subscriber, #4433)
[Link]
You've clearly not watched all these programs about the police :-)
I think what you say is true of the older generation, but so many kids these days seem to have brains addled by drugs (or drink) that they don't have a clue what they're doing ...
And for big crimes, people don't seem to think about the consequences of getting caught at all. Many crimes are "spur of the moment" things - and the bigger ones are often fuelled by anger (as I said, driven by drink or drugs ...).
Cheers,
Wol
Posted May 11, 2024 22:01 UTC (Sat)
by mpr22 (subscriber, #60784)
[Link]