"Critical" projects and volunteer maintainers
Over the last five decades or so, free and open-source software (FOSS) has gone from an almost unknown quantity available to only the most technically savvy to underpinning much of the infrastructure we rely on today. Much like software itself, FOSS is "eating the world". But that has changed—is changing—the role of the maintainers of all of that code; when "critical" infrastructure uses code from a FOSS project, suddenly, and perhaps without warning, that code itself becomes critical. But many maintainers of that software are volunteers who did not set out to become beholden to the needs of large companies and organizations when they released their code; they were just scratching their own itch—and now lots of others are clamoring for theirs to be scratched as well.
The supply-chain security problem is clearly a serious one that needs to be addressed. The Log4j incident provides a recent example of how a security vulnerability in a fairly small component can ripple out across the internet by way of dependency chains. Some projects depended directly on Log4j, but many others became vulnerable because they were using some other library or package that depended on Log4j—directly or indirectly.
Some of the places where dependency chains are often lengthy, and thus more vulnerable to the intentional injection of malware, are various language-specific repositories of packages. Sites like the Python Package Index (PyPI) provide a huge palette of components that can be used by applications or other libraries. The pip tool that comes with Python will happily install PyPI packages along with all of their dependencies, recursively. Many other languages have similar repositories and tooling.
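As a rough illustration (not drawn from the article), a few lines of Python can walk the requirements that installed packages declare, using only the standard library's importlib.metadata; the starting package name here is just an example:

    # Sketch: recursively print the declared dependencies of an installed
    # package. "requests" is only an example starting point.
    import re
    from importlib.metadata import requires, PackageNotFoundError

    def walk(package, depth=0, seen=None):
        seen = set() if seen is None else seen
        key = package.lower().replace("_", "-")
        if key in seen:
            return
        seen.add(key)
        print("  " * depth + package)
        try:
            reqs = requires(package) or []
        except PackageNotFoundError:
            return  # declared somewhere, but not installed here
        for req in reqs:
            if "extra ==" in req:   # skip optional extras for brevity
                continue
            name = re.match(r"[A-Za-z0-9._-]+", req).group(0)
            walk(name, depth + 1, seen)

    walk("requests")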
Critical components
There are multiple efforts these days to identify the most critical dependencies and to provide assistance to those projects so that they do not end up in the position of a pre-Heartbleed OpenSSL—or represent that one project in the classic xkcd. For example, the Open Source Security Foundation (OpenSSF) has its Alpha-Omega project that is identifying projects needing assistance with their security. PyPI has also been identifying its packages that have been downloaded the most over the last six months based on its public data sets; those that are in the top 1% are deemed "critical". Roughly 3500 projects have been identified in this manner and the maintainers of those projects are being offered a free security key to help them set up two-factor authentication (2FA) for their PyPI accounts.
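Those download numbers come from PyPI's public data sets on BigQuery. A hedged sketch of the sort of query involved (assuming the public bigquery-public-data.pypi.file_downloads table and the google-cloud-bigquery client; this illustrates the idea rather than reproducing the query PyPI actually runs, and scanning the table is not free):

    # Sketch: rank projects by downloads over roughly the last six months
    # using PyPI's public BigQuery data set. Requires Google Cloud credentials.
    from google.cloud import bigquery

    QUERY = """
    SELECT file.project AS project, COUNT(*) AS downloads
    FROM `bigquery-public-data.pypi.file_downloads`
    WHERE timestamp > TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 180 DAY)
    GROUP BY project
    ORDER BY downloads DESC
    LIMIT 4000
    """

    client = bigquery.Client()
    for row in client.query(QUERY).result():
        print(row.project, row.downloads)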
Authentication using 2FA is not currently required for any packages, but PyPI plans to require it for maintainers of critical projects "in the coming months". Once that goes into effect, maintainers who have not enabled 2FA (using a security key or time-based one-time password (TOTP) application) will presumably not be able to make changes, such as updating the package. That, of course, has its own risk, in that a critical package may not be able to get the update it needs for some serious vulnerability because its maintainers failed to sign up for 2FA.
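For those unfamiliar with TOTP, the six-digit code such an application displays is simply an HMAC over the current 30-second time step, truncated as described in RFC 6238; a minimal sketch, with a made-up seed, looks like this:

    import base64
    import hashlib
    import hmac
    import struct
    import time

    def totp(secret_b32, period=30, digits=6):
        # The base32 seed is what the provisioning QR code contains.
        key = base64.b32decode(secret_b32.upper() + "=" * (-len(secret_b32) % 8))
        counter = int(time.time()) // period          # current 30-second step
        digest = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
        offset = digest[-1] & 0x0F                    # RFC 4226 dynamic truncation
        code = int.from_bytes(digest[offset:offset + 4], "big") & 0x7FFFFFFF
        return str(code % (10 ** digits)).zfill(digits)

    print(totp("JBSWY3DPEHPK3PXP"))  # made-up example seed, not a real account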
On July 8, Skip Montanaro posted a message to the Python discussion forum noting that a defunct project of his, lockfile, had been identified as critical. The project had been marked as deprecated at the top of its README (with alternatives listed) and has not seen any releases since 2015. He wondered why it was considered critical and asked: "What should I do to get rid of this designation?"
Donald Stufft said that the package is being downloaded roughly 10 million times per month. Dustin Ingram pointed to the FAQ in the security-key giveaway announcement that says "once the project has been designated as critical it retains that designation indefinitely", so lockfile will be considered critical henceforth. The lockfile module is part of the OpenStack project; the README file for lockfile suggests contacting the openstack-dev mailing list for assistance in moving away from it.
It turns out that "no OpenStack projects declare direct dependencies on lockfile since
May 2015
", according
to "fungi", who is a system administrator for OpenStack. But
lockfile is still used by parts of the OpenStack project.
In a perfect demonstration of the insidious nature of dependency chains,
fungi tracked down its use by the project:
I've found that some OpenStack projects depend on ansible-runner, which in turn depends on python-daemon, which itself declares a dependency on lockfile. I'll need to confer with other contributors on a way forward, but probably it's to either help python-daemon maintainers replace their use of lockfile, or help ansible-runner maintainers replace their use of python-daemon.
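Tracing a chain like that by hand is tedious. As a rough sketch (not a tool fungi is known to have used), the reverse question—which installed packages declare a direct dependency on lockfile?—can be answered from within a Python environment using the standard library:

    # Sketch: list installed distributions that declare a direct dependency
    # on a given package (here, lockfile), using importlib.metadata.
    import re
    from importlib.metadata import distributions

    def reverse_dependencies(target):
        target = target.lower().replace("_", "-")
        dependents = set()
        for dist in distributions():
            for req in dist.requires or []:
                name = re.match(r"[A-Za-z0-9._-]+", req).group(0)
                if name.lower().replace("_", "-") == target:
                    dependents.add(dist.metadata["Name"])
        return sorted(dependents)

    print(reverse_dependencies("lockfile"))  # e.g. ['python-daemon'] if installed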
So most or all of the downloads of this "critical" PyPI project are probably for continuous-integration testing of OpenStack, and the components that use lockfile should likely have replaced it with something else roughly seven years ago. Hugo van Kemenade suggested encouraging people to stop using it; "if you're still in a position to release, emit a DeprecationWarning on import suggesting the replacements. Or something noisier like a UserWarning." Paul Moore noted that marking it as deprecated did not work, nor did ceasing releases in 2015; "I'm not at all sure 'tell people not to use it' is a viable strategy for getting marked as 'not critical'."
Opinions
On July 9, Armin Ronacher posted his thoughts about PyPI's 2FA requirement; that post was extensively discussed here at LWN, at Hacker News, and elsewhere. Ronacher makes it clear that he does not see 2FA as an unreasonable burden for maintainers of PyPI projects, but he does wonder where it all leads. For one thing, it is, apparently, only critical packages at PyPI that will be required to have 2FA set up, so "clearly the index [PyPI] considers it burdensome enough to not enforce it for everybody".
That creates something of a double standard. As Ronacher put it, he did not set out to create a critical package; that was something that happened organically. But the kinds of problems that can be prevented through 2FA, such as a malicious actor logging into PyPI with stolen credentials, can happen with any package, not just popular ones. "In theory that type of protection really should apply to every package."
But there is also a question of what else might be required down the road. When the projects at PyPI today were created, there was no mention of 2FA; other requirements could well be added over time.
There is a hypothetical future where the rules tighten. One could imagine that an index would like to enforce cryptographic signing of newly released packages. Or the index wants to enable reclaiming of critical packages if the author does not respond or do bad things with the package. For instance a critical package being unpublished is a problem for the ecosystem. One could imagine a situation where in that case the Index maintainers take over the record of that package on the index to undo the damage. Likewise it's more than imaginable that an index of the future will require packages to enforce a minimum standard for critical packages such as a certain SLO [service level objective] for responding to critical incoming requests (security, trademark laws etc.).
Some of those requirements make perfect sense from a security standpoint; in fact, some should perhaps be in place already. But there is now an ongoing discussion about disallowing projects from being deleted from PyPI. Obviously deleting a project that other projects rely on is kind of an antisocial act, but it does seem like something the author (and probably copyright holder) should be allowed to do. It can lead to chaos like the famous left-pad fiasco, however.
The recent 2FA push from PyPI led a maintainer to accidentally remove all of the old releases of the atomicwrites package. As noted by Stufft in the PyPI deletion discussion linked above, he restored the atomicwrites releases at the request of the maintainer, but "it took about an hour to restore 35 files". Finding a way to head off those kinds of mistakes would be useful in addition to preventing downstream chaos when a maintainer deletes their project.
What I like about the cargo-vet approach is that it separates the concerns of running an index from vetting. It also means that in theory that multiple competing indexes could be provided and vetting can still be done. Most importantly it puts the friction of the vetting to the community that most cares about this: commercial users. Instead of Open Source maintainers having to jump through more hoops, the vetting can be outsourced to others. Trusted "Notaries" could appear that provide vetting for the most common library versions and won't approve of a new release until it undergoes some vetting.
Reaction
Django developer James Bennett had a sharply worded reply to Ronacher on July 11 (which was also discussed at Hacker News and no doubt elsewhere). In much of it, Bennett seems to be reacting to the arguments that others are making, rather than those that Ronacher made. But Bennett's main complaint with Ronacher is that he thinks the cargo vet approach is flawed and that those who release FOSS have a responsibility to users in an "ethical and social sense", even though any legal responsibility has been disclaimed in the license. "Yeah, if you publish open-source code you do have some responsibilities, whether you want them or not."
Bennett's list of responsibilities for a FOSS maintainer seems generally reasonable, "because what they really boil down to is the basic societal expectation of 'don't be an asshole'". But he is raising a strawman here, since Ronacher never argued that maintainers should be (allowed to be) assholes. Ronacher simply wondered what other requirements might be imposed on maintainers over time; some of those he mentioned (e.g. a service level objective) would be quite onerous for a volunteer maintainer.
Bennett's weakest argument seems to be that Ronacher owes more to his users than he might voluntarily choose to give because his work on FOSS has opened various doors for him. It is a fairly strange argument, in truth. Overall, Bennett seems to be addressing lots of things that Ronacher did not say, or even imply. The heart of what Ronacher was doing was trying to figure out where the boundaries are, not claiming that they had already been crossed.
It seems vanishingly unlikely that PyPI will be establishing two-day security-fix timelines, for example, on its critical projects, but there are surely lots of companies and other organizations out there that wish it would. There is a general tendency for all humans (and their constructs like companies) to shirk responsibilities if they can find another to pin them on. Companies and organizations that are shipping software that is dependent on the FOSS supply chain need to be deeply involved in ensuring that the code is secure.
Doing that work will cost a lot of money and take a lot of time. We are seeing efforts to do that work, and the PyPI 2FA requirement is one of those pieces. It is hardly a panacea, but it is a useful step.
As Luis Villa noted last year, FOSS maintainers are being asked to do more and more things; often they are being asked to do so without any compensation, though perhaps "doors opening" counts to a limited extent. As more critical projects are identified, it is likely we will see more conflicts of this nature. What happens when a maintainer does not want to follow the recommendations of OpenSSF (or some other similar effort) on changes? Forks are generally seen as a hostile move, but one suspects that may ultimately happen for projects that find themselves at odds with sponsoring organizations. That is a rather different world than the one FOSS grew up in.
Posted Jul 13, 2022 22:22 UTC (Wed)
by jafd (subscriber, #129642)
[Link]
If a company cares about the security of its supply chain so much, it should create paid positions for people to take care of it. And no, enlisting volunteers to do it all for free doesn't count.
Oh and also they should reckon with the possibility that the maintainer will refuse even the money and will not give a crap about imaginary obligations somebody on the internets thinks they have.
Posted Jul 13, 2022 22:38 UTC (Wed)
by logang (subscriber, #127618)
[Link] (14 responses)
To this end, I think it would be better if publishers of libraries indicated the level of support they will provide in the metadata of their packages: anywhere from "random fixes maybe when I feel like it" to "guaranteed to fix a security issue within 2 days". That would make for a better plan than just forcing developers to become critical maintainers because of the poor choices of other developers.
Rules would be in place such that a package can't have a dependency on any package that has a lower level support guarantee than the one it declares.
Developers adding packages to their project would have some way to declare the support they are expecting. If they're doing some hobby project and don't care, then they can have access to everything. If they are building something real that they may need to support then they are limited to packages that have declared a certain level of support; or they have to contribute to (or adopt) a library and improve it before they can use it. I'd imagine large companies could then have internal policies to prevent their developers from using stuff that nobody has declared support for.
There'd have to be systems in place for maintainers to bow out: dropping the maintenance level for a package would have to create warnings for anything that depends on it which would essentially just be a call for help. Also methods for users to say that a package isn't really maintained to the level it claims.
All this would be voluntary and opt-in but at least there would be some indication of the support intentions of the original publisher.
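A rough sketch of the dependency rule described above, with entirely hypothetical support levels and metadata (nothing like this exists in PyPI today):

    # Hypothetical support levels and dependency data, purely for illustration.
    LEVELS = {"none": 0, "best-effort": 1, "security-fix-2-days": 2}

    declared = {"myapp": "security-fix-2-days", "somelib": "best-effort"}
    dependencies = {"myapp": ["somelib"], "somelib": []}

    def check(package):
        # Warn when a dependency declares a weaker support level than its user.
        required = LEVELS[declared[package]]
        for dep in dependencies.get(package, []):
            dep_level = LEVELS.get(declared.get(dep, "none"), 0)
            if dep_level < required:
                print(f"warning: {package} ({declared[package]}) depends on "
                      f"{dep} ({declared.get(dep, 'none')})")
            check(dep)

    check("myapp")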
I haven't read too much about the cargo-vet plan, but it seems to me the most important thing is whether there are active maintainers willing to support their work (which can change at any time), not whether someone has reviewed the code.
Posted Jul 14, 2022 6:46 UTC (Thu)
by NYKevin (subscriber, #129325)
[Link] (12 responses)
* "License? Eh, it's only a couple of lines,** nobody will ever know."
Posted Jul 14, 2022 9:06 UTC (Thu)
by Sesse (subscriber, #53779)
[Link] (11 responses)
Posted Jul 14, 2022 16:39 UTC (Thu)
by hkario (subscriber, #94864)
[Link] (10 responses)
Posted Jul 15, 2022 15:50 UTC (Fri)
by salimma (subscriber, #34460)
[Link] (9 responses)
Posted Jul 15, 2022 16:34 UTC (Fri)
by Sesse (subscriber, #53779)
[Link] (8 responses)
Posted Jul 15, 2022 16:42 UTC (Fri)
by hkario (subscriber, #94864)
[Link] (7 responses)
Posted Jul 15, 2022 17:49 UTC (Fri)
by Sesse (subscriber, #53779)
[Link] (6 responses)
Posted Jul 15, 2022 18:05 UTC (Fri)
by hkario (subscriber, #94864)
[Link] (5 responses)
And even if you do change the compiler, all that's needed to make the code compile again is to make it use an older standard explicitly or ignore some warnings, not completely rewrite large pieces of code...
Posted Jul 15, 2022 18:13 UTC (Fri)
by Sesse (subscriber, #53779)
[Link] (1 responses)
More to the point, do you have real-world examples of OSS projects vendoring 1000+ C++ libraries?
Posted Jul 15, 2022 19:47 UTC (Fri)
by Wol (subscriber, #4433)
[Link]
That experience says compiler upgrades break working code ...
Cheers,
Wol
Posted Jul 15, 2022 20:56 UTC (Fri)
by bartoc (guest, #124262)
[Link] (2 responses)
The problem is that each time a vendored library updates you need to do a bunch of work to ensure your rewritten build system keeps up with any changes in upstream's build system.
You can mitigate this by just like, writing a bash script that downloads and builds all your dependencies and plops them into somewhere where your buildsystem can find them. Then again .... how do you think linux package managers got started :D?
Posted Jul 15, 2022 23:37 UTC (Fri)
by rgmoore (✭ supporter ✭, #75)
[Link] (1 responses)
This assumes you intend to keep in sync with upstream. More frequently, projects vendor libraries precisely because they want to diverge from upstream, and copy/paste/edit is easier in the short term than trying to maintain their changes as a patch. When upstream changes, they don't adopt those changes because they aren't relevant, and the vendored library gradually diverges from upstream.
Posted Jul 20, 2022 9:58 UTC (Wed)
by bartoc (guest, #124262)
[Link]
To be honest I think all these methods can be fine sometimes, esp the submodule thing is basically a poor-man's package manager and if you share a buildsystem with upstream (or use meson's cmake support thing, I guess) it can be fine.
Hell even the gnulib thing of extracting particular (possible differently licensed) components can work, they do keep things up to date ... most of the time. Ofc a lot of gnulib is much less useful than you would like.
I do kinda wish when people vendored stuff they were better about changing the names of functions, sometimes the vendored library doesn't "escape" across ABI boundaries and can be used with the upstream version just fine, if you change all the symbol names (or dynamically link). Actually have any linkers had support for doing this when static linking, like you can specify that undef symbols should be resolved from different archives for different sets of objects?
Posted Jul 16, 2022 21:52 UTC (Sat)
by marcH (subscriber, #57642)
[Link]
This is already required in some big (hence slow?) companies and there's also progress at the regulatory level, search "Software Bill Of Materials".
Security catastrophes and other scandals move things in the right direction... very slowly.
Posted Jul 13, 2022 22:44 UTC (Wed)
by Paf (subscriber, #91811)
[Link] (13 responses)
Many, many organizations - I'm sure the vast majority - with dependencies on FOSS don't have anything like the money to pay for maintenance of all their dependencies.
These are deep, deep stacks with many pieces. Most companies involved have no realistic prospect of paying or wanting to pay - that's part of the point of FOSS, I think? We focus on free as in freedom but free as in beer is part of the idea too (for some of us anyway?). Perhaps then it's just those that want to ask for stuff that should pay? That's more reasonable.
But it's all very hard. It seems certain only the largest companies and those with very tight needs for a particular project will ever help pay?
Posted Jul 13, 2022 23:48 UTC (Wed)
by rgmoore (✭ supporter ✭, #75)
[Link] (8 responses)
They may not have the money to pay the full maintenance cost of all their dependencies, but they shouldn't need to. After all, the chances are they aren't the only ones using it. That's pretty much guaranteed if it makes it into the top 1% of downloads. What might help is if the package archive created something like a tip jar. Users could donate money to specific projects if they know those projects are important to them, or they could just donate some money to the service as a whole and it would automatically be apportioned among the projects they download, with maybe a little bit diverted to the archive to keep the whole thing running. If every company donated a nominal amount each month - maybe equal to the coffee budget for their programming team - it would add up.
You could even set it up so the amount the maintainer received was related to their promised service level. Projects dumped on the server with no promise of future support would get nothing. Ones that promise only that the maintainer will get to it when they have the time would get a little bit, and ones that can promise rapid turnaround on bug fixes would get a lot, at least by comparison. It would let developers who don't want to make any promises do so, while developers who hope to make some money from their projects would have an incentive to promise (and deliver) better response. It would even give companies an incentive to open source their own libraries in hopes of recovering some of their development and maintenance costs.
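A back-of-the-envelope sketch of that apportionment, with invented numbers and project names:

    # Invented numbers: split a monthly donation across downloaded projects in
    # proportion to download counts, with a small cut for the archive itself.
    downloads = {"somelib": 120_000, "otherlib": 30_000, "tinylib": 5_000}
    donation = 500.00       # the team's monthly "coffee budget"
    archive_cut = 0.05      # share kept to run the index

    pool = donation * (1 - archive_cut)
    total = sum(downloads.values())
    for project, count in sorted(downloads.items(), key=lambda kv: -kv[1]):
        print(f"{project}: {pool * count / total:.2f}")
    print(f"archive: {donation * archive_cut:.2f}")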
Posted Jul 14, 2022 3:43 UTC (Thu)
by dasyatidprime (guest, #102159)
[Link] (1 responses)
Posted Feb 23, 2024 14:24 UTC (Fri)
by LtWorf (subscriber, #124958)
[Link]
It seems that even critical pypi projects would get 0€, and things in the top 50 downloads would get a few hundred per year.
Didn't seem worth it to me to invest the time to enroll.
Posted Jul 14, 2022 7:32 UTC (Thu)
by himi (subscriber, #340)
[Link]
Isn't this kind of model opening up the "tip jar" operator to some level of responsibility for actually /tracking/ the responsiveness of projects? Quite apart from raising questions of where the money goes, as well as a whole host of other questions and challenges that are probably going to make it all horribly impractical.
Running something like this really doesn't seem like it would be viable for any but the largest and most well resourced archive maintainers, or for many other organisations that might be interested in supporting maintainers and developers. And the organisations that /could/ do it might not be ones that the community would trust very much with such a role, like Google or Microsoft (or even Mozilla or the FSF).
Clearly /something/ needs to be done to ensure better resourcing for open source maintainers and developers, but I'm not sure this model would be very practically useful.
Posted Jul 15, 2022 18:49 UTC (Fri)
by samroberts (subscriber, #46749)
[Link] (4 responses)
I've some 10 or 20 year old projects that are moderately widely used, written in languages I don't use anymore. I'm not maintaining them, and a tip jar won't entice me to do so; I have a job, and at least for ruby, the modern package tooling is so different now, I don't know how to use it.
I spent countless enjoyable hours on them, but those days are past. A couple have enough interest that I've managed to transfer maintenance to a couple other people. One should never be used again, but gets 3 million downloads a week... (no kidding). I've no idea how to get the popular upstream projects that have it as a many-levels-down dependency to get rid of it, and I've tried. I could just add a deprecation warning to it, so every consumer would see the warning, but that seems pretty hostile, and deleting it seems even more hostile. I really wish there were better deprecation mechanisms, and softer crowbars to get downstream consumers to move on.
I suspect that the author of the lockfile project mentioned in the article is in a similar position, unable to convince people to stop using the project, and no amount of money is going to make it easier.
Posted Jul 15, 2022 21:10 UTC (Fri)
by rgmoore (✭ supporter ✭, #75)
[Link]
Remuneration isn't the solution to every problem, but it can certainly help with specific situations. For example, no amount of money might entice you to revisit your old projects, but it might help attract a new maintainer. That might be because they're a user of your project or a complete mercenary who's doing it only for the tip money. Either way, having a new maintainer would be a good thing. Providing a reward for taking over unmaintained projects that others depend on is a potential benefit of having some kind of money available.
Posted Jul 16, 2022 3:19 UTC (Sat)
by pabs (subscriber, #43278)
[Link]
Posted Jul 16, 2022 20:09 UTC (Sat)
by NYKevin (subscriber, #129325)
[Link] (1 responses)
Posted Jul 22, 2022 15:32 UTC (Fri)
by deltragon (guest, #159552)
[Link]
Posted Jul 14, 2022 11:09 UTC (Thu)
by kleptog (subscriber, #1183)
[Link] (3 responses)
If it were as simple as donating some money to an entity with a list of packages to distribute it to, it might have a chance of happening.
(Of course, as soon as you start talking about money you get the discussion of how to divide it. Should big projects get more than small projects? Should active projects get more than idle feature complete ones? You'd probably have to provide options there too.)
What I try to do is to, if I find a bug, try to write a useful bug report and if possible include a patch. I feel it's the least I can do.
Posted Jul 14, 2022 12:45 UTC (Thu)
by kpfleming (subscriber, #23250)
[Link] (2 responses)
Posted Jul 14, 2022 14:22 UTC (Thu)
by kleptog (subscriber, #1183)
[Link] (1 responses)
The pricing is clearly aimed at very large businesses, not your run-of-the-mill SME.
Posted Aug 1, 2022 20:11 UTC (Mon)
by Wol (subscriber, #4433)
[Link]
Other companies do the same thing.
But the deal is, if I have a problem with another project, I talk to my developer. Hopefully the developer of the other project is also part of the association, and they fix my problem for me.
You won't then need that many companies involved and - certainly for companies and coders that join - the companies get good support and the coders eat :-)
Cheers,
Wol
Posted Jul 13, 2022 22:56 UTC (Wed)
by david.a.wheeler (subscriber, #72896)
[Link] (6 responses)
Many organizations are encouraging or requiring that at least some OSS projects use 2FA/MFA tokens. I suspect that eventually 2FA/MFA tokens will simply be a requirement for all major forges (like GitHub) and for all major package repositories (like PyPI, npm, and RubyGems).
There's a reason this is happening. Unfortunately, attackers are increasingly focusing on taking over OSS developers' accounts - and they're succeeding. The attackers then change the project's source code and/or deployed packages, potentially resulting in devastating impacts on OSS users. According to "Towards Measuring Supply Chain Attacks on Package Managers for Interpreted Languages", account compromise is the second most common supply chain attack (after typosquatting) on OSS packages for dynamic programming languages like JavaScript, Python, and Ruby. Attackers directly focus on account takeover. For example, eslint-scope, a package with millions of weekly downloads in npm, was compromised to steal credentials from developers.
The easy solution, one that counters these attacks in almost all cases, is requiring 2FA/MFA tokens. Nobody wants to add requirements. But users presume that the OSS is being provided by the OSS developers, not by an attacker who broke into the OSS developer's account via a broken password or SIM swapping.
Here are some of the many organizations supporting OSS that are encouraging or requiring 2FA/MFA tokens (in some cases at least):
Nobody is forcing anyone to use these forges and package repositories. I think it's reasonable for them to set some basic requirements as long as they have a good reason, aren't too hard to implement, and don't harshly impede other goals like privacy. I think this meets the bar.
Posted Jul 14, 2022 9:37 UTC (Thu)
by roc (subscriber, #30627)
[Link] (5 responses)
Posted Jul 14, 2022 11:34 UTC (Thu)
by hmh (subscriber, #3838)
[Link] (2 responses)
2FA apps running in a (hopefully separate) device are almost always accepted as well, at least for the forges.
So, your typical floss developer, even in areas where hardware tokens are either expensive or hard to acquire, can trivially get past that bar.
There really isn't any excuse for not mandating 2FA other than credential recovery procedures, as long as it is an interoperable one that is not going to cause any sort of vendor lock-in.
The recovery procedures are indeed a concern, though.
Posted Jul 14, 2022 15:15 UTC (Thu)
by tialaramex (subscriber, #21167)
[Link]
TOTP can be phished; it's maybe going to be a bit harder to phish the sort of person who writes a Python package than my mother, but on the other hand it's probably a spear phishing exercise and so the attacker is putting real work in, not just spewing out a million bogus emails claiming to be from FedEx or whoever. This is why FIDO is so important that it's worth spending money (not very much money in the larger scheme of things, but spending money is annoyingly difficult in the sort of organisations that have money) to give people FIDO authenticators rather than just point them at TOTP.
The "cheapest" viable attacks on FIDO appear to be: Persuade the user to give you their physical token, e.g. call them and insist it needs replacing, send a "courier" to pick it up (this sounds like a good way to get arrested; Police may not understand Internet crimes, but they have seen this trick done with credit cards for decades). Persuade the user to install malware which can request suitable login credentials and send them to you, and then persuade the user to physically authorise that authentication. Both ideas are something you could imagine a state actor pulling off but aren't very practical for small time crooks, bored teenagers and suchlike real world attackers.
The difficult-to-acquire thing is probably given a misleading impression by the fact these tend to be US projects. Americans have a much easier time sending physical objects to a handful of countries and those are listed. But I assure you that a hacker living in Moscow right now, regardless of how they feel about their country's invasion of Ukraine, can get themselves a FIDO authenticator if they want one for example.
Posted Jul 14, 2022 19:34 UTC (Thu)
by roc (subscriber, #30627)
[Link]
Posted Jul 15, 2022 19:21 UTC (Fri)
by mathstuf (subscriber, #69389)
[Link]
Is that something that you can set on your account or is it more that you only use `sk-` SSH keys?
If it is via token-anchored keys, how do you manage backup keys (I have…a few)? Do you just have N SSH keys registered or is there a way to make N keys unlock a single SSH keypair? (I really want to avoid "add new token's key" to every service if/when I get new ones because I prefer to manage a keypair per service as well and making that an NxM matrix sounds…horrendous[1].) I'm also worried about exhausting the key storage on the token (IIRC, an older method used up "slots" on the key, but docs about `sk-` keys don't mention hardware limits).
[1] You also end up with a O(n) configuration with `IdentitiesOnly` and leaks how many backups there are directly in the configuration.
Posted Jul 17, 2022 3:30 UTC (Sun)
by alison (subscriber, #63752)
[Link]
Posted Jul 14, 2022 3:54 UTC (Thu)
by milesrout (subscriber, #126894)
[Link] (4 responses)
The starting point has to be along the lines of "Hi, I use your software. Is there some way I can help you to ensure that what shows up on PyPI is actually from you? Can I provide you with a USB security key and help you set up 2FA and cryptographic signing of packages? Is there some way I can help you to ensure that you respond to bug reports? I am willing to pay you money for your services. If you are or are not interested, please let me know." Until you are engaged in a contract for work, there are no obligations whatsoever on you for the quality of things you put out there for free on the internet covered in GIGANTIC disclaimers, nor any obligations to maintain the software, or respond to bug reports. Nor are there any obligations on you to keep a clean "supply chain" or to ensure that your account is difficult for others to access. If you want to publish your password openly on the internet, or use "hunter2" as your password, that is your prerogative, no matter how popular your code has become, until and unless you sign a contract providing otherwise.
Posted Jul 14, 2022 10:51 UTC (Thu)
by willy (subscriber, #9762)
[Link] (2 responses)
Posted Jul 14, 2022 12:45 UTC (Thu)
by farnz (subscriber, #17727)
[Link]
That assumes that the maintainer of libfoo bothers to do that - you may well find that you have to do the same work tracking down all of their transitive dependencies, because they're only certifying their own code, and not the code they depend upon.
Without careful communication, you can end up in two "bad" places:
Supply chain security is not a trivial problem when you have a commercial relationship with your suppliers - it's even harder when they're volunteers.
Posted Jul 14, 2022 23:18 UTC (Thu)
by jafd (subscriber, #129642)
[Link]
Posted Jul 21, 2022 5:45 UTC (Thu)
by brunowolff (guest, #71160)
[Link]
Posted Jul 14, 2022 12:50 UTC (Thu)
by kpfleming (subscriber, #23250)
[Link]
It's basically the same situation as all of the other 'dominant' open source forges and repositories; all of the tools can talk to *any* forge or repository, but users don't want to have to take time to configure them so they just use the default ones.
Posted Jul 14, 2022 13:22 UTC (Thu)
by fung1 (subscriber, #144307)
[Link]
The current transitive dependency OpenStack has on lockfile (because of the Ansible project's ansible-runner dependency chain) is really only for a fairly small corner of the hundreds of individual projects which make up OpenStack, so PyPI's download count for lockfile is probably not primarily due to OpenStack's automated testing and more likely due to the general popularity of the python-daemon project (which depends on lockfile). Further, the OpenStack community's CI relies heavily on local caches and mirrors, so doesn't actually pull directly from PyPI very often.
Posted Jul 14, 2022 13:28 UTC (Thu)
by martin.langhoff (subscriber, #61417)
[Link] (9 responses)
It's a romantic ideal; it worked while OSS was on the margins, but it's impractical in real life in the mainstream.
As a developer using a library, you don't want a single "solo maintainer" dependency. In the best days, you are absorbing code changes without peer review. On a bad day, the maintainer can go rogue, have health issues, have credentials stolen...
Let's go back to larger "standard libraries" and frameworks, team maintained. Each part of them can be small, but the umbrella effort is larger, involves more people, can have 'peer review' rules and it's a more reasonable target for corp funding.
Posted Jul 14, 2022 13:50 UTC (Thu)
by cyperpunks (subscriber, #39406)
[Link] (8 responses)
I would argue such a community exists: it goes under the name of Red Hat & Fedora, Debian & Ubuntu, openSUSE & SLES, etc.
No end user should ever need to use pypi and similar repositories.
In fact it could be argued such repositories do more damage than good to software development; it's not a sustainable way forward.
Posted Jul 14, 2022 15:07 UTC (Thu)
by rgmoore (✭ supporter ✭, #75)
[Link]
Which does nothing for people who run Windows or MacOS. Even on Linux, it doesn't help very much for bleeding edge libraries under rapid development, which may be too new to have gotten attention from the distribution and may be way out of date if they have. Personally, I think the approach to developing that depends on those unstable, bleeding edge libraries is deeply flawed. Obviously there are plenty of developers who disagree, and the languages have decided to side with those developers.
Posted Jul 14, 2022 15:20 UTC (Thu)
by NightMonkey (subscriber, #23051)
[Link] (6 responses)
No longer like "herding cats", but "herding cats that herd cats"! :D
Posted Jul 14, 2022 17:20 UTC (Thu)
by Cyberax (✭ supporter ✭, #52523)
[Link] (2 responses)
Posted Aug 2, 2022 12:10 UTC (Tue)
by immibis (subscriber, #105511)
[Link] (1 responses)
Posted Aug 14, 2022 12:11 UTC (Sun)
by flussence (guest, #85566)
[Link]
Posted Jul 14, 2022 18:10 UTC (Thu)
by atnot (subscriber, #124910)
[Link]
Distros are limited in the amount of new packages they can review. The queue for new packages in most distros is hundreds to thousands of packages long. As such, they usually require that a package be of a certain age and popularity or depended upon by something that is.
So yes, you can say, "why would I use pypi/cpan/etc. when all of the best libraries are in my distro already". But that's ignoring that the only reason that rich ecosystem of libraries exists the first place is because enough people were willing to install it in other ways that it became worth packaging. And that popularity was only possible because installing it those ways was easy for users[1].
[1] It's also worth considering that the only reason your distro can afford to maintain those tens of thousands of python/perl/etc packages in the first place is because those packages exist on an index, in a format standardized enough that you can just use automated tools to repackage it with little extra work, instead of as disorganized tarballs on Some Guy's personal website.
Posted Jul 14, 2022 23:51 UTC (Thu)
by rgmoore (✭ supporter ✭, #75)
[Link] (1 responses)
I would argue that CPAN, at least, is not an attempt to reinvent distributions, since it was created around the same time as the first Linux distributions. It has been in service since 1995, and distributions have changed a lot about the way they work since then. Not to mention that any system designed to distribute libraries in 1995 that restricted itself to Linux just wasn't fit for purpose. I would argue it isn't fit for purpose today, since many developers today aren't working on Linux. Leaving out all developers who work on non-Linux OSes wasn't sensible in 1995, and it still isn't today.
Posted Jul 15, 2022 15:15 UTC (Fri)
by NightMonkey (subscriber, #23051)
[Link]
Posted Jul 24, 2022 0:55 UTC (Sun)
by developer122 (guest, #152928)
[Link] (3 responses)
I also do like the idea of 3rd party notaries, especially in that new versions of a package would be published to the repo but not automatically taken as the (latest) version to install until one or more of one's vetting notaries of choice have examined it.
Posted Aug 3, 2022 17:09 UTC (Wed)
by LtWorf (subscriber, #124958)
[Link] (2 responses)
I maintain one such package in debian and basically the original author stopped maintaining it and I can't make updates because I don't have the permissions. Pypi has no way to ever reclaim or remove it.
Also, today I found the module "typing-inspect". It's marked as alpha by the author. It has about 800000 daily downloads. It seems people do not read the flags that already exist when adding a dependency, so I don't think one more flag would change anything (except add hassle to people publishing libraries).
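As a rough sketch, a tool could surface such flags by reading the trove classifiers from a package's installed metadata (the package name is just the example from this comment; what it prints depends on what the author declared):

    # Sketch: read the "Development Status" trove classifiers of an
    # installed distribution via importlib.metadata.
    from importlib.metadata import metadata, PackageNotFoundError

    def development_status(dist_name):
        try:
            md = metadata(dist_name)
        except PackageNotFoundError:
            return []
        return [c for c in (md.get_all("Classifier") or [])
                if c.startswith("Development Status")]

    print(development_status("typing_inspect"))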
I'd be in favour of signing uploads, but it seems that the world has decided that learning to use gpg (or a graphical frontend) is hard, before even trying.
In my experience, people tend to pin dependencies. It seems typedload's recent versions are mostly ignored, while an older and much slower version from 2019 is the most downloaded by far.
Posted Aug 3, 2022 18:04 UTC (Wed)
by mathstuf (subscriber, #69389)
[Link]
It seems to me based on my reading of the description that it is perma-alpha:
> The “typing_inspect” module defines experimental API for runtime inspection of types defined in the standard “typing” module.
Posted Aug 4, 2022 11:37 UTC (Thu)
by kleptog (subscriber, #1183)
[Link]
While people may not read the flags on a project, people are using tools like safety to warn them about security issues. If such tools could be configured to also warn about abandoned packages, unauthenticated packages, and the like, it would at least raise some visibility. NPM, for example, notifies when projects are seeking donations (though no idea how well that works).
Businesses are taking supply chain risks more and more seriously these days, so increasing awareness might be just enough to get some businesses to devote resources into fixing the problem.
Posted Aug 1, 2022 13:01 UTC (Mon)
by LtWorf (subscriber, #124958)
[Link] (8 responses)
Today I feel even more sorry because I'm one of those people.
I want to add that google ships the keys to a very limited set of countries (I'm currently in one of them), and although they are free of charge, they still require a credit card number to be entered before the order is accepted.
Posted Aug 1, 2022 13:54 UTC (Mon)
by atnot (subscriber, #124910)
[Link] (7 responses)
Other issues aside, I am continuously annoyed by how often US tech companies just assume that everyone must own a credit card. They are not nearly as prevalent or necessary in the rest of the world. I have personally never owned or had the desire to own one, does this really disqualify me from having a popular PyPI library now?
Posted Aug 1, 2022 15:19 UTC (Mon)
by rahulsundaram (subscriber, #21946)
[Link] (2 responses)
No, most places which require it accept a debit card just fine.
Posted Aug 1, 2022 16:09 UTC (Mon)
by atnot (subscriber, #124910)
[Link]
Posted Aug 3, 2022 13:17 UTC (Wed)
by LtWorf (subscriber, #124958)
[Link]
Posted Aug 3, 2022 15:03 UTC (Wed)
by LtWorf (subscriber, #124958)
[Link] (3 responses)
It disqualifies you from receiving the USB tokens from google. They are only shipped to a very very small list of countries (Austria, Belgium, Canada, France, Germany, Italy, Japan, Spain, Switzerland, UK, USA).
Everyone else must set up 2FA with an authenticator app on the phone.
I plan to try and save the seed and use oathtool to generate the code.
Posted Aug 3, 2022 18:01 UTC (Wed)
by mathstuf (subscriber, #69389)
[Link]
This is exactly what I do. My phone has precisely zero OTP secrets (other than the SIM card for the SMS-based ones…which is at least better than nothing).
Posted Aug 4, 2022 3:37 UTC (Thu)
by pabs (subscriber, #43278)
[Link]
https://github.com/tadfisher/pass-otp
https://github.com/browserpass/
Posted Aug 4, 2022 6:12 UTC (Thu)
by jem (subscriber, #24231)
[Link]
Oh well, there are always competing security keys like the YubiKey.
"Critical" projects and volunteer maintainers
"Critical" projects and volunteer maintainers
"Critical" projects and volunteer maintainers
** Do not do this. This is not how copyright law works.
"Critical" projects and volunteer maintainers
"Critical" projects and volunteer maintainers
"Critical" projects and volunteer maintainers
"Critical" projects and volunteer maintainers
"Critical" projects and volunteer maintainers
"Critical" projects and volunteer maintainers
"Critical" projects and volunteer maintainers
And even if you do change the compiler, all that's needed to make the compile again, is make it use older standard explicitly or ignore some warnings, not completely rewrite large pieces of code...
"Critical" projects and volunteer maintainers
"Critical" projects and volunteer maintainers
Wol
"Critical" projects and volunteer maintainers
"Critical" projects and volunteer maintainers
The problem is that each time a vendored library updates you need to do a bunch of work to ensure your rewritten build system keeps up with any changes in upstream's build system.
"Critical" projects and volunteer maintainers
"Critical" projects and volunteer maintainers
"Critical" projects and volunteer maintainers
"Critical" projects and volunteer maintainers
many many organizations - I’m sure the vast majority - with dependencies on FOSS don’t have anything like the money to pay for maintenance of all their dependencies.
"Critical" projects and volunteer maintainers
"Critical" projects and volunteer maintainers
"Critical" projects and volunteer maintainers
"Critical" projects and volunteer maintainers
I spent countless enjoyable hours on them, but those days are past. A couple have enough interest that I've managed to transfer maintenance to a couple other people. One should never be used again, but gets a 3million downloads a week... (no kidding). I've no idea how to get the popular upstream projects that have it as a many levels down dependency to get rid of it, and I've tried. I could just add a deprecation warning to it, so every consumer would see the warning, but that seems pretty hostile, and deleting it seems even more hostile. I really wish there was better deprecation mechanisms, and softer crowbars to get downstream consumers to move on.
"Critical" projects and volunteer maintainers
"Critical" projects and volunteer maintainers
"Critical" projects and volunteer maintainers
"Critical" projects and volunteer maintainers
"Critical" projects and volunteer maintainers
"Critical" projects and volunteer maintainers
"Critical" projects and volunteer maintainers
"Critical" projects and volunteer maintainers
Wol
Many organizations are moving to 2FA/MFA
Many organizations are moving to 2FA/MFA
Many organizations are moving to 2FA/MFA
Many organizations are moving to 2FA/MFA
Many organizations are moving to 2FA/MFA
Many organizations are moving to 2FA/MFA
Many organizations are moving to 2FA/MFA
"Critical" projects and volunteer maintainers
"Critical" projects and volunteer maintainers
"Critical" projects and volunteer maintainers
"Critical" projects and volunteer maintainers
"Critical" projects and volunteer maintainers
"Critical" projects and volunteer maintainers
"Critical" projects and volunteer maintainers
"Critical" projects and volunteer maintainers
"Critical" projects and volunteer maintainers
"Critical" projects and volunteer maintainers
I would argue such community exists: it's goes under the name of Red Hat & Fedora, Debian & Ubuntu, openSUSE & SLES etc.
"Critical" projects and volunteer maintainers
"Critical" projects and volunteer maintainers
"Critical" projects and volunteer maintainers
"Critical" projects and volunteer maintainers
"Critical" projects and volunteer maintainers
"Critical" projects and volunteer maintainers
"Critical" projects and volunteer maintainers
"Critical" projects and volunteer maintainers
"Critical" projects and volunteer maintainers
"Critical" projects and volunteer maintainers
While people may not read the flags on a project, people are using tools like safety to warn them about security issues. If such tools could be configured to also warn about abandoned packages, unauthenticated packages and the like it would at least raise some visibility. NPM for example notifies when projects are seeking donations for example (though no idea how well that works).
"Critical" projects and volunteer maintainers
"Critical" projects and volunteer maintainers
"Critical" projects and volunteer maintainers
"Critical" projects and volunteer maintainers
"Critical" projects and volunteer maintainers
"Critical" projects and volunteer maintainers
"Critical" projects and volunteer maintainers
"Critical" projects and volunteer maintainers
"Critical" projects and volunteer maintainers
https://github.com/browserpass/
"Critical" projects and volunteer maintainers