
Soller: Real hardware breakthroughs, and focusing on rustc

Posted Dec 1, 2019 15:44 UTC (Sun) by mcatanzaro (subscriber, #93033)
In reply to: Soller: Real hardware breakthroughs, and focusing on rustc by mjg59
Parent article: Soller: Real hardware breakthroughs, and focusing on rustc

Hi Matthew! It seems we have started a nice little debate. I'm impressed by the thoughtful, intelligent replies of our fellow LWN commenters.

If you don't personally feel that bundling libraries on this magnitude is a technical problem, that's fine. But as you are no doubt well aware, the status quo in Linux distribution communities is to minimize the use of bundled dependencies as far as possible. If we are to start rewriting existing C and C++ code in Rust, most applications will eventually be linking to a couple dozen shared libraries written in Rust, each such library statically linking hundreds of Rust dependencies, most of which will be duplicated among the other shared libraries used by the application. Perhaps you're fine with this -- perhaps it is the future of Linux distributions -- but for the purposes of my argument it's sufficient to recognize that a lot of important developers don't want to see this happen, and those developers are not going to be using Rust.
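
To make that linking model concrete, here is a minimal, hypothetical sketch of how such a Rust shared library is typically built today (the crate and symbol names are made up for illustration). Declaring crate-type = ["cdylib"] in Cargo.toml produces a single .so exposing a C ABI, with every Rust crate it depends on statically linked into that object:

    // lib.rs of a hypothetical "libexample" crate built as a cdylib.
    // Cargo.toml would declare:
    //
    //   [lib]
    //   crate-type = ["cdylib"]
    //
    // Every crate listed under [dependencies] (and its transitive
    // dependencies) ends up statically linked into libexample.so;
    // only the extern "C" surface below is visible to C/C++ callers.

    use std::ffi::CStr;
    use std::os::raw::{c_char, c_int};

    /// Returns the number of characters in a NUL-terminated UTF-8
    /// string, or -1 on NULL or invalid input. Purely illustrative.
    #[no_mangle]
    pub extern "C" fn example_utf8_len(s: *const c_char) -> c_int {
        if s.is_null() {
            return -1;
        }
        // SAFETY: the caller promises `s` points to a NUL-terminated buffer.
        let bytes = unsafe { CStr::from_ptr(s) }.to_bytes();
        match std::str::from_utf8(bytes) {
            Ok(text) => text.chars().count() as c_int,
            Err(_) => -1,
        }
    }

A C or C++ application links against the resulting libexample.so like any other shared library; the point is that when a couple dozen such libraries each do this, the common Rust dependencies behind their C ABIs are duplicated inside every one of those objects.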

I think Rust needs those developers. Rust currently proposes to upend two well-established norms: (a) the use of memory unsafe languages in systems programming, and (b) current aversion to use of bundled dependencies, especially for system libraries. Challenging either norm, on its own, would be tremendously difficult. Challenging both at the same time seems like a strategic mistake. The Rust community loses essential allies, and we wind up with more and more code written in unsafe languages.

Currently, Rust seems like the only memory-safe systems programming language that's generated enough interest among Linux developers to have serious long-term potential to upend C and C++. I see potential to have a future, perhaps 30 years from now, where distributing libraries written in C or C++ is considered malpractice, and where modern code is all written in Rust (or future memory-safe systems languages). But I don't see that happening without more acceptance and buy-in from the existing Linux development community, and Rust doesn't seem to be on track to achieve it. The Rust community and the Linux distribution community are separate silos with separate goals, and it's hard to see outside one's own silo. If the Rust community were to better understand and address the concerns of the Linux distribution community, the potential exists to significantly increase adoption and acceptance of Rust.



Soller: Real hardware breakthroughs, and focusing on rustc

Posted Dec 1, 2019 20:38 UTC (Sun) by roc (subscriber, #30627) [Link]

I think you're mostly right except for "Rust needs those developers". Rust is already successful without those developers, and the Rust community knows it ... which means they won't readily make significant concessions to better fit into the classic Linux distro shared library model.

If you specifically mean "Rust needs those developers if it's to completely usurp C/C++", then sure. Maybe enough Rust people are inspired by that to make some concessions.

Soller: Real hardware breakthroughs, and focusing on rustc

Posted Dec 1, 2019 20:57 UTC (Sun) by mjg59 (subscriber, #23239) [Link] (21 responses)

Is this really true? Plenty of developers are moving towards using containerised distribution mechanisms - even with the frameworks provided by Flatpak, apps are likely to ship with more bundled code than in the traditional distribution workflow. The Rust community and the distribution community may be separate silos, but I'm not sure that's going to have a significant impact on how well adopted Rust ends up being.

Soller: Real hardware breakthroughs, and focusing on rustc

Posted Dec 1, 2019 22:40 UTC (Sun) by rodgerd (guest, #58896) [Link]

Even outside of containers, we see things like Python venv being used to render runtimes independent of the distribution's view of the world. I doubt there are many Python or Ruby (for example) developers who rely on the Debian stable or RHEL versions of those interpreters and libraries.

Soller: Real hardware breakthroughs, and focusing on rustc

Posted Dec 1, 2019 23:57 UTC (Sun) by mcatanzaro (subscriber, #93033) [Link] (19 responses)

What software goes into the container? Let's say the user application (which could be Rust, yes), plus a distro runtime, either a traditional Linux distro or something that looks very similar (in the case of flatpaks, that would be freedesktop-sdk or a derived runtime, which represents a careful balancing act between sharing the most important common dependencies and avoiding stuffing too much into the runtime). And what do you use to run a container? You can't have a container without a host system, and the host system is going to be a traditional distro written in C and C++. Do the environments where we run our containers not matter? Do our desktop systems not matter?

Let's say Rust is currently poised to be successful in the user application space. It's not poised to be successful at replacing the C and C++ used to power our operating systems. And that's a shame, because we have a lot of unsafe C and C++ code that needs to be replaced over the coming decades.

Soller: Real hardware breakthroughs, and focusing on rustc

Posted Dec 2, 2019 0:06 UTC (Mon) by mjg59 (subscriber, #23239) [Link] (18 responses)

The majority of the code that's exposed to untrusted input lives in the containers, so from that perspective I care a *lot* more about the code inside there than I do the code that's in the distribution.

Soller: Real hardware breakthroughs, and focusing on rustc

Posted Dec 2, 2019 0:15 UTC (Mon) by mcatanzaro (subscriber, #93033) [Link] (17 responses)

And the distribution inside the container? Does your containerized flatpak not use its runtime at all? The application bundles *everything*, without using the runtime-provided GTK or Qt, GLib, WebKit, libcurl or libsoup, libxml2, GnuTLS or OpenSSL?

Unless your application code never provides any untrusted input to the runtime -- which seems very unlikely -- the runtime is security-sensitive too, and it is all C and C++. 40 years from now, this will hopefully no longer be the case. We'll still have distributions, but they will use a safe systems programming language instead. It could be Rust, but it seems Rust does not want to target this space, so I suppose more likely it will be something else instead.

Soller: Real hardware breakthroughs, and focusing on rustc

Posted Dec 2, 2019 0:21 UTC (Mon) by mcatanzaro (subscriber, #93033) [Link]

I forgot to mention GStreamer in my list. That community is trying the hardest to find a place for Rust in Linux systems. It is security-critical, and has strict API/ABI stability requirements. I'm curious what its future will be.

Soller: Real hardware breakthroughs, and focusing on rustc

Posted Dec 2, 2019 0:30 UTC (Mon) by mjg59 (subscriber, #23239) [Link] (15 responses)

If distributions don't provide what application developers want, they'll just include those components in the containers rather than depending on the runtime. The argument that nobody will ever develop Rust implementations of any of these components seems to depend on these components being maintained by people who are primarily concerned with what Linux distributions want, which seems like an unsupported assertion.

Basically: If nobody cares about the shared libraries a distribution ships, why should a distribution care?

Soller: Real hardware breakthroughs, and focusing on rustc

Posted Dec 2, 2019 9:57 UTC (Mon) by nim-nim (subscriber, #34454) [Link] (14 responses)

Devs that provide piles of half-baked, half-checked, half-fixed third-party code in containers will be hard-pressed to find a market among anyone who values not losing his data.

The user-side part of the equation has been learning what containerized apps mean over the last few years, and it's not impressed. They're a lot worse than downloading random exe files from random web sites on the internet (worse because containers make it possible to pile even more crap third-party code into the end result).

Soller: Real hardware breakthroughs, and focusing on rustc

Posted Dec 2, 2019 12:00 UTC (Mon) by pizza (subscriber, #46) [Link] (12 responses)

> Devs that provide piles of half-baked, half-checked, half-fixed third-party code in containers will be hard-pressed to find a market among anyone who values not losing his data.

This is a very naive belief.

Look no further than the rise of "wget http://some/random/script.sh | sudo bash" build scripts. Also, proprietary software is far, far worse, and manages to do just fine in the market.

Soller: Real hardware breakthroughs, and focusing on rustc

Posted Dec 2, 2019 14:16 UTC (Mon) by NAR (subscriber, #1313) [Link] (10 responses)

> Look no further than the rise of "wget http://some/random/script.sh | sudo bash" build scripts.

And this is the problem with the distribution landscape. Most developers won't bother learning the intricacies of creating packages for N distributions (especially if they are working on, e.g., a Mac). There's also quite a big chance that at least one of their dependencies is not packaged, so either they package those too (even more work) or bundle the dependencies (which is a big no-no in the eyes of distribution people). But I'm starting to see "Installation Instructions" like "docker pull ...", so even the "wget ... | bash" kind of instructions can be obsoleted. If distributions want to stay relevant, they might need to invent some automatic mechanism that takes a package from npmjs.com, pypi.org, cpan.org, hex.pm, etc. and creates parallel-installable distribution packages.

Soller: Real hardware breakthroughs, and focusing on rustc

Posted Dec 2, 2019 17:03 UTC (Mon) by mcatanzaro (subscriber, #93033) [Link] (9 responses)

"if distributions want to stay relevant" just doesn't make any sense, and never did. How are you going to use npmjs.com, pypi.org, cpan.org, or hex.pm without a computer operating system to access it from? If not a Linux distribution, what else?

Expecting third-party software developers to package software for Linux distributions doesn't make a lot of sense either. They can if they want to, targeting the biggest and most important distros, but most will probably prefer to distribute containerized software that works everywhere using docker or flatpak or similar. Nothing wrong with that. It doesn't mean distros are no longer relevant; it just means there are nowadays better ways to distribute applications to users. Your application is still 100% useless if the user doesn't have a distro to run it on.

I see no problem with "the distribution landscape." It works well for building the core OS and for distributing popular open source applications. It was never a good way for distributing third-party software, and that's fine.

Soller: Real hardware breakthroughs, and focusing on rustc

Posted Dec 2, 2019 18:06 UTC (Mon) by rahulsundaram (subscriber, #21946) [Link]

>"if distributions want to stay relevant" just doesn't make any sense, and never did. How are you going to use npmjs.com, pypi.org, cpan.org, or hex.pm without a computer operating system to access it from? If not a Linux distribution, what else?

If distributions are reduced to only the core bits and everything else is managed by per-language package management systems, distributions have far less relevance than they used to have, and therefore their packaging policies don't have as much influence as they used to, either. This may very well be the better role for distributions, but historically distributions have attempted to package up the whole world of open source software and were pretty much the only way regular users would get software installed on their systems. This isn't the case any longer.

Soller: Real hardware breakthroughs, and focusing on rustc

Posted Dec 2, 2019 18:28 UTC (Mon) by mjg59 (subscriber, #23239) [Link] (7 responses)

If all a distribution is providing is the infrastructure to run containers, then the number of shared libraries that the distribution needs to ship is pretty small. Why prioritise that case over the far more common case?

Soller: Real hardware breakthroughs, and focusing on rustc

Posted Dec 2, 2019 18:41 UTC (Mon) by pizza (subscriber, #46) [Link] (4 responses)

...and what exactly is supposed to go into those containers? Will everyone be compiling every dependency themselves from scratch?

(Or will they just download some customizable container template containing some precompiled libraries from a third party?)

Soller: Real hardware breakthroughs, and focusing on rustc

Posted Dec 2, 2019 18:48 UTC (Mon) by mjg59 (subscriber, #23239) [Link] (1 responses)

Someone still has to provide shared objects that existing code depends on somehow, but that's pretty orthogonal to why distributions have traditionally preferred shared objects - the benefits of them are already gone if everybody's including copies of them in containers.

Soller: Real hardware breakthroughs, and focusing on rustc

Posted Dec 4, 2019 8:08 UTC (Wed) by nim-nim (subscriber, #34454) [Link]

That’s a pie-in-the-sky argument. You don’t want to depend on distributions, so you assume someone else will pick up the flag (and, moreover, you assume that that someone else, when confronted with the same issues distributions face today, will make different choices, because we all know distribution choices are bad, say the devs who have never tried to integrate at scale).

Meanwhile, real-world containers in Docker and other public stores have been publicly audited by security researchers, and that research found, first, that those containers did rely on distributions for their content and, second, that the more they tried to replace the distribution layer with custom dev-friendly ways of doing things, the less up to date and secure the result ended up.

What may work technically is large-scale, out-of-band container content inspection by Microsoft (GitHub) or Google (Go), where Microsoft, Google, or another internet giant orders devs to fix their open source code if they want to continue being listed in the audited container store.

I doubt devs will like being ordered around that way much more than being told politely by distributions to fix things because they are becoming unpackageable.

And that’s nothing more than a capture of the free software commons by commercial entities, taking advantage of the lack of nurturing of those commons by the people who benefit from them today.

Soller: Real hardware breakthroughs, and focusing on rustc

Posted Dec 5, 2019 6:37 UTC (Thu) by sionescu (subscriber, #59410) [Link] (1 responses)

Yes, eventually everyone should compile their own stuff. Most task containers on Borg contain exactly one statically-linked binary in addition to a very thin runtime mounted read-only from the host.
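
For illustration only (this is not a description of Google's internal tooling): a minimal sketch of the "one statically linked binary" model using standard Rust tooling, with a hypothetical, trivially small program.

    // main.rs of a hypothetical, trivially small service binary.
    // Building it against the musl target produces a fully static
    // executable (for pure-Rust dependencies), e.g.:
    //
    //   rustup target add x86_64-unknown-linux-musl
    //   cargo build --release --target x86_64-unknown-linux-musl
    //
    // The resulting binary can then be copied into an otherwise
    // empty container image and run without any distribution
    // userspace beyond whatever thin runtime the host mounts in.

    fn main() {
        println!("hello from a statically linked binary");
    }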

Soller: Real hardware breakthroughs, and focusing on rustc

Posted Dec 5, 2019 21:31 UTC (Thu) by nim-nim (subscriber, #34454) [Link]

And Google manages that because they have a *centralized* organization with a *single* *unified* SCM tree.

So, good model for a cloud giant.

Atrocious model for everyone *not* a cloud giant.

Soller: Real hardware breakthroughs, and focusing on rustc

Posted Dec 2, 2019 21:47 UTC (Mon) by mcatanzaro (subscriber, #93033) [Link]

I suppose that's a fine point. Every distribution still needs a fairly large number of shared libraries, many of which are security-critical, but if you don't care about them then it's reasonable to not prioritize them.

Of course, the shared libraries that distributions need to ship are 90% of what I care about. So you lose me, and other developers working in this space. We wind up with "Rust is cool for application development, but don't use it for systems programming." If that's the goal, then OK, but I suspect that's not really the goal of Rust.

Soller: Real hardware breakthroughs, and focusing on rustc

Posted Dec 3, 2019 20:10 UTC (Tue) by Wol (subscriber, #4433) [Link]

> If all a distribution is providing is the infrastructure to run containers, then the number of shared libraries that the distribution needs to ship is pretty small. Why prioritise that case over the far more common case?

And that imho is exactly the attitude that led to the failure of the LSB :-(

It was focussed too much on allowing the distros to tell applications what they provided - which is sort-of okay for Open Source applications, but I tried to push it (unsuccessfully) towards a way where apps (quite possibly closed-source) could tell the distro what they needed. I really ought to try to get WordPerfect 6 running under Wine again, but it would have been nice for WordPerfect for Linux 8 to have an lsb "requires" file I could pass to emerge, or rpm, or whatever, and it would provide the prerequisites for me.

Oh well ...

Cheers,
Wol

Soller: Real hardware breakthroughs, and focusing on rustc

Posted Dec 2, 2019 15:27 UTC (Mon) by nim-nim (subscriber, #34454) [Link]

The "wget http://some/random/script.sh | sudo bash" rises in dev circles. User side, awareness is rising about what that implies in security and robustness. You only need to be burnt once to refuse to play anymore (that the same phenomenon that caused Windows to be blacklisted as a dev platform in the 90's: too many system failures dues to atrocious deployment habits).

Proprietary software is not intrinsically worse. The level of atrocity is limited by the commercial requirement to support the result.

Container-ish distribution and vendoring are proprietary software habits at their worst, without any support requirement.

Soller: Real hardware breakthroughs, and focusing on rustc

Posted Dec 4, 2019 7:13 UTC (Wed) by patrakov (subscriber, #97174) [Link]

<troll>
Distributions that provide piles of half-baked, half-checked, half-fixed third-party code in the form of never-updated shared libraries and other packages will be hard-pressed to find a market among anyone who values not losing his data.
</troll>

Soller: Real hardware breakthroughs, and focusing on rustc

Posted Dec 2, 2019 0:05 UTC (Mon) by rahulsundaram (subscriber, #21946) [Link] (12 responses)

>If the Rust community were to better understand and address the concerns of the Linux distribution community, the potential exists to significantly increase adoption and acceptance of Rust.

Rust has no real incentive to play by distribution rules. Firefox's usage of Rust, and the increasingly widespread adoption that has followed, will not be hampered by distributions.

Distributions don't have the strong influence they used to have in guiding upstreams towards certain behaviour. Upstream projects like Rust will simply bypass distributions, aided by widespread use of containers and tooling like Cargo. Distributions are now on the defensive, figuring out how to stay relevant. That's the new reality.

Soller: Real hardware breakthroughs, and focusing on rustc

Posted Dec 2, 2019 3:29 UTC (Mon) by pizza (subscriber, #46) [Link] (11 responses)

> That's the new reality.

And what happens when this brave new reality encounters its inevitable "zlib" moment?

The Rust ecosystem has some very serious tunnel vision -- well-maintained open-source projects (a la Firefox) are the exception, not the rule.

It's all fine and dandy to say that distributions are irrelevant or obsolete, but that doesn't make their experience and advice on sustainability/maintainability wrong.

Soller: Real hardware breakthroughs, and focusing on rustc

Posted Dec 2, 2019 3:44 UTC (Mon) by rahulsundaram (subscriber, #21946) [Link] (7 responses)

> It's all fine and dandy to say that distributions are irrelevant or obsolete, but that doesn't make their experience and advice on sustainability/maintainability wrong.

Maybe, but it is clear that the power centre has shifted away from distributions (and I say that as someone who has been involved in distribution maintenance for at least a decade). It is up to distributions to convince the language ecosystems why that experience is valid or useful.

Soller: Real hardware breakthroughs, and focusing on rustc

Posted Dec 2, 2019 10:11 UTC (Mon) by nim-nim (subscriber, #34454) [Link] (6 responses)

I think devs are so drunk on their new tooling's ability to bypass free software distributions that they forget users do require a level of stability.

So either those devs work with free software distributions to provide this stability, or the market for their app segment will be captured by non-free-software third parties, that perform this distribution role.

This is happening right now with Cloudera, OpenShift, etc.

The end result of not working with distributions, and of producing things end users cannot use with any level of security, is not users adopting the dev model; it's users giving up on distributions *and* upstream binaries, and switching to proprietarized, intermediated product sources.

And, practically, that also means devs having to follow the whims of the proprietary intermediaries if they want any influence on how their code is actually used. Do you really think they will like that better? Even if the proprietary intermediaries provide them with free-as-in-beer SDKs?

Soller: Real hardware breakthroughs, and focusing on rustc

Posted Dec 2, 2019 11:05 UTC (Mon) by NAR (subscriber, #1313) [Link] (2 responses)

> I think devs are so drunk on their new tooling's ability to bypass free software distributions that they forget users do require a level of stability.

Actually, providing stability is a reason to bypass distributions. It's more than annoying when installing or upgrading an unrelated application also upgrades a common dependency to an incompatible new version...

Soller: Real hardware breakthroughs, and focusing on rustc

Posted Dec 3, 2019 10:56 UTC (Tue) by roblucid (guest, #48964) [Link]

This argument depends on an application being broken by requiring an obsolete (or new) library implementation for some reason (probably excessive bundling creating a dependency on implementation details rather than on the API).

Library updates should support the API; incompatible API changes require a new package, which may provide a legacy API or support co-existence with an installation of the old library by changing filenames or directories. Shared libraries actually do permit applications to choose different implementations if required.

Rather than building 'just in case' silos, fix the bugs and write competent software. Bad security fixes that break things are bugs, regressions which ought to be fixed, and the sysadmin is the only one who can decide the right mitigation in the deployment.

The ability to secretly rely on vulnerabilities IS NOT a benefit to the end user

Soller: Real hardware breakthroughs, and focusing on rustc

Posted Dec 3, 2019 13:52 UTC (Tue) by farnz (subscriber, #17727) [Link]

To expand on that, there are three groups involved here, not two:

  1. Users, who don't care about the technical stuff as long as the computer does what they need it to do. Any change, even a security upgrade, is unpopular with users, as there's always the risk that the computer will stop doing what they need it to do on a change; users, however, do want some changes, because they see what the developers can make computers do, and decide they want that.
  2. Developers, who find new ways to make the computer do useful stuff. Developers are keen on some forms of change, because changing things is how you find a new way to make the computer do useful stuff.
  3. Operations, who keep the computer restricted to only doing what the users want it to do, and not doing things that the users don't want. Operations like different forms of change than developers do, as they want changes that reduce the chance of the computer doing things the users don't want it to do.

Distributions that survive end up being a compromise between the three. They update fast enough to keep developers happy; they're good enough at stopping things that you don't want to have happen to keep operations happy; they make the computer do enough useful things that users are happy. But note that distributions are not essential in this set - they just happen to be one form of compromise between the three groups that has historically worked out well enough to survive. Containers are another - especially combined with CI - where you build a complete FS image of the "application" and run it; you regularly update that image, and all is good.

Basically, things go wrong when the desires of the three groups are unbalanced - if a distribution becomes "this is what operations want", then users and developers leave it to ossify; if a distribution becomes "this is what users want", it loses developers and operations; if it becomes "this is what developers want", it loses users and operations. And as every real individual is a combination of users, developers and operations to various degrees, the result of such ossification is a gradual loss of people to work on the distribution.

As soon as distributions see themselves as "in opposition" to developers (or operations, or users), they're at risk - this is about finding a balance between developers' love of using the best technologies they know about and operations' preference for not bringing in tech for its own sake, a balance that results in users getting value from the distribution.

Soller: Real hardware breakthroughs, and focusing on rustc

Posted Dec 2, 2019 14:48 UTC (Mon) by walters (subscriber, #7396) [Link] (2 responses)

Er, what? How is OpenShift not FOSS?

I know the meaning of the individual words in your message, but the way you've combined them isn't making much sense to me... ("proprietarized intermediated product sources"? Really?)

Soller: Real hardware breakthroughs, and focusing on rustc

Posted Dec 4, 2019 8:57 UTC (Wed) by nim-nim (subscriber, #34454) [Link] (1 responses)

Well the proof is in the pudding.

The difference between effective open-sourcing and over-the-wall code dumps that others cannot use is the existence of things like Oracle Linux.

I haven’t seen the equivalent on the OpenShift side, but I may not have looked hard enough.

Soller: Real hardware breakthroughs, and focusing on rustc

Posted Dec 4, 2019 9:47 UTC (Wed) by zdzichu (subscriber, #17118) [Link]

In theory, OpenShift comes from the community: https://www.okd.io
In practice, CentOS built images once and did not provide any updates for them.

Soller: Real hardware breakthroughs, and focusing on rustc

Posted Dec 2, 2019 22:42 UTC (Mon) by rodgerd (guest, #58896) [Link] (2 responses)

As opposed to the perfect record of distributions which, I don't know, single-handedly wreck a chunk of the OpenSSH keyspace with hamfisted patches?

Soller: Real hardware breakthroughs, and focusing on rustc

Posted Dec 2, 2019 22:52 UTC (Mon) by pizza (subscriber, #46) [Link]

Using that reasoning, Rust isn't perfect so therefore everything else it has to offer is by definition worthless.

Soller: Real hardware breakthroughs, and focusing on rustc

Posted Dec 3, 2019 6:36 UTC (Tue) by tzafrir (subscriber, #11501) [Link]

Recall that in this case the library in question was partially hostile and needed patching to work properly on said platform.

So how will such a case work with bundled libraries? Nobody fixes problematic libraries down the stack? Developers do occasionally keep patched versions of such libraries? But then it is each developer on their own, right?

The same problem does not go away with bundled libraries. It may even get worse, because each developer/project is on its own in tackling those issues.

Soller: Real hardware breakthroughs, and focusing on rustc

Posted Dec 2, 2019 9:40 UTC (Mon) by joib (subscriber, #8541) [Link] (4 responses)

I think it's a mistake to assume very many developers choose their tools based on what distros currently think is convenient to package.

The current distro model is *a* solution (certainly not the only possible solution, and not necessarily the "best" one either, depending on how you define "best") to the problem of how to handle distribution of mostly C code, and it was largely set in stone 25 years ago.

If one looks at where the world is going, it seems that more and more we're seeing applications bundling their own dependencies, be it in the form of container images, language-specific build/dependency management systems with static linking like Rust or Go, or per-application virtual environments for dynamic languages like Python/Ruby/Javascript/Julia/etc. And things like app stores where applications can rely on a fairly limited system interface, bundling everything else.

I think the onus is on distros to adapt to the world as it is, not wishing for a bygone age. Otherwise they'll just be relegated to the position of an increasingly idiosyncratic way of providing the basic low-level OS while people get the actual applications they want to run from somewhere else. I also think that distros can have a very useful role to play here, e.g. keeping track of security updates for all these apps with bundled dependencies etc.

Soller: Real hardware breakthroughs, and focusing on rustc

Posted Dec 2, 2019 15:33 UTC (Mon) by nim-nim (subscriber, #34454) [Link] (3 responses)

I think you’re mistaking “devs want to play with their language-of-choice code downloader” for “users want to master npm, crates, Go modules, etc. just to get anything done”.

Soller: Real hardware breakthroughs, and focusing on rustc

Posted Dec 2, 2019 17:18 UTC (Mon) by joib (subscriber, #8541) [Link] (2 responses)

No, I don't think users want to learn 57 different packaging systems. But if the alternative is to not use said software at all, because the distro refused to include it in their repository since it was made with a tech stack the distro maintainers don't accept, then learning those systems is what they'll do.

Providing a single coherent interface to manage packages, QA, integration testing, vetting of packages against the distro policy, and security updates (as previously mentioned) are all certainly valuable things that distros do. I'd like to see distros continuing this valuable work, rather than being relegated to the role of an increasingly irrelevant provider of the low-level OS. And to do this, I think distros need to better accommodate how software is developed and deployed today, rather than wishing for a bygone era where gcc, make and shell were all that was needed.

Soller: Real hardware breakthroughs, and focusing on rustc

Posted Dec 5, 2019 14:46 UTC (Thu) by nim-nim (subscriber, #34454) [Link] (1 responses)

Community distributions accept pretty much everything; they are *community* distributions, and anyone who cares about anything can get his pet thing added to the distribution just by joining, if he can’t find anyone else to do it for him.

Distributions, however, have minimal standards to retain the trust of their userbase.

Anything that needs too much work to attain those standards will get kicked out of distros, just because no one wants to work on it.

That’s why alternatives to gcc, make and shell get no traction. Dev-only tech that only helps devs and makes everyone except devs miserable won’t be adopted by anyone except devs. If those devs want their stuff to get into distributions, they can do the work themselves, or make it easier for others to do this work.

If those devs don’t want to do the work, and don't want to help others do it, they can stop moaning that distributions are unfriendly, and prepare for a world where the thing they deploy on is controlled by Apple, Google or Microsoft, and where they need to construct all the parts on top of that proprietary baseline in Linux From Scratch mode, depending on proprietary cloud services.

Devs can certainly kill distros. So far, they haven’t demonstrated any working plan once this is done.

Soller: Real hardware breakthroughs, and focusing on rustc

Posted Dec 5, 2019 15:11 UTC (Thu) by pizza (subscriber, #46) [Link]

> If those devs want their stuff to get into distributions, they can do the work themselves, or make it easier for others to do this work.

That's a key point -- Traditional distributions do a lot of work integrating a large pile of unrelated code into a mostly-cohesive (and usable, and maintainable) whole. Application (and container) developers rely heavily on that work, allowing themselves to focus on the stuff they care about (i.e. their application).

If those distributions go away, that low-level systems development and integration work still needs to get done by _someone_, and to be blunt, application developers have shown themselves, even when willing, to be generally incapable of doing this work.

Oddly enough, the developers that are both willing and capable seem to recognize the value (and time/effort savings) that distributions bring to the table -- because without the distros, the developers would have a lot more work on their hands.

