
New leadership for Asahi Linux

The Asahi Linux project, which is working to support Linux on Apple silicon, has announced the resignation of Hector "marcan" Martin as its lead, and his replacement by a seven-person committee. "Today's news is bittersweet. We are grateful to marcan for kicking off this project and tirelessly working on it these past years. Our community will miss him. Still, with your support, the project has a bright future to come". Martin has explained his reasons for leaving at length in this blog post.


Some maintainers still behave like Linus used to

Posted Feb 13, 2025 16:10 UTC (Thu) by kragil (guest, #34373) [Link] (7 responses)

That toxic behavior has consequences, fact of life.

Some maintainers still behave like Linus used to

Posted Feb 13, 2025 17:51 UTC (Thu) by Lennie (subscriber, #49641) [Link] (2 responses)

"Suffice it to say, I ended up traveling for most of the year, all the while having to handle various abusers and stalkers who harassed and attacked me and my family (and continue to do so)."

That does not seem normal (or at least it should not be).

Some maintainers still behave like Linus used to

Posted Feb 16, 2025 8:59 UTC (Sun) by ssmith32 (subscriber, #72404) [Link] (1 response)

This does seem like a major problem, and it would definitely make anyone a bit touchy. But the thing is, the blog doesn't make clear whether it is at all related to what was going on with LKML, the maintainers, and their followers.

Or whether it was just an unrelated, tragic coincidence.

It seems unlikely to me that online discussions about the Linux kernel could result in stalking, no matter how harsh or toxic.

So I'm assuming the latter?

And I get why details wouldn't be shared, but it would be healthy for the community to know if LKML discussions triggered stalking or not.

It was a curious paragraph.

Some maintainers still behave like Linus used to

Posted Feb 16, 2025 10:41 UTC (Sun) by ck (subscriber, #158559) [Link]

Not only is it entirely believable, it's also not limited to the LKML.
I am not a well-known individual in the FOSS world, but speaking out against someone else's favourite project got me to the point where I had to roll over my online identity more than once to shake the trolls.
I would caution anyone who gets into Open Source against using their real name or photo in association with their work. FOSS has caught up with the rest of the internet, and it's not a safe space.

Some maintainers still behave like Linus used to

Posted Feb 13, 2025 21:29 UTC (Thu) by milesrout (subscriber, #126894) [Link] (3 responses)

Are you referring to when they try to use social media brigading to get what they want? I don't remember Linus ever doing that.

Some maintainers still behave like Linus used to

Posted Feb 14, 2025 8:44 UTC (Fri) by kragil (guest, #34373) [Link]

Probably not, but I'm also very sure that Mastodon can't develop any noticeable force tbh. Especially compared to Linus ranting on LKML and getting multiplied by every tech outlet on the planet.

Some maintainers still behave like Linus used to

Posted Feb 14, 2025 15:19 UTC (Fri) by marcan (guest, #103032) [Link] (1 responses)

I didn't do that either. "Brigading" has a very specific meaning, and both Linus and David Airlie used the term incorrectly. Ranting on social media is not automatically brigading (and Linus and other people on that thread have done their share of ranting on social media too).

Some maintainers still behave like Linus used to

Posted Feb 14, 2025 17:46 UTC (Fri) by jhe (subscriber, #164815) [Link]

Yes, it is not brigading, but that doesn't make it less abusive. And as for "Linus does it, too": people tried to have him removed from the project because of that. You should aim higher than that.

A lot of good stuff in there

Posted Feb 13, 2025 16:14 UTC (Thu) by koverstreet (✭ supporter ✭, #4296) [Link] (181 responses)

and I really hate to see us bleeding good people.

A lot of good stuff in there

Posted Feb 13, 2025 16:38 UTC (Thu) by mb (subscriber, #50428) [Link] (170 responses)

It might come as a surprise to you, but people don't like to be yelled at or to have their progress blocked for no good reason.
Yet the kernel community is full of people yelling at other people and blocking other people.

I think it's time for *all* good non-abusive people to leave the Linux project and start a new kernel project based on modern development technologies.

A lot of good stuff in there

Posted Feb 13, 2025 16:58 UTC (Thu) by alx.manpages (subscriber, #145117) [Link] (13 responses)

It might come as a surprise to you, but people don't like being accused of either (a) being as stupid as AI or (b) using AI to interact with people. Especially in a context where we're supposed to be technical people who know what we're doing.

You're right now accusing Kent of being an asshole, without explicitly saying so. I find that unacceptable. I was on the verge of explicitly calling you that a few hours ago. You certainly earned it. I preferred to respond a few hours later, once I had calmed down.

I don't like being yelled at. Yet I have deserved it more than once, and I appreciate that people have yelled at me. Maybe it wasn't the nicest way they could have told me, but I learnt from them anyway. People aren't perfect, and they may express themselves in different ways. For example, Theo de Raadt has yelled at me repeatedly, and I appreciate it; he made me see his points, and he was right most of the time (not always, but that's another story).

Whoever is free of sin should cast the first stone.

A lot of good stuff in there

Posted Feb 13, 2025 17:06 UTC (Thu) by mb (subscriber, #50428) [Link] (8 responses)

>It might come as a surprise to you, but people don't like being accused of either (a) being as stupid as AI or (b) using AI to interact with people.

Pardon?

>You're right now accusing Kent of being an asshole

... Wow.
How?
I have never said or even implied that.

> I was on the verge of explicitly calling you that a few hours ago.

You seem to be a very lovely person.

A lot of good stuff in there

Posted Feb 13, 2025 17:06 UTC (Thu) by corbet (editor, #1) [Link] (1 responses)

Maybe this is a good point - if not past it - to stop this particular sub-discussion.

A lot of good stuff in there

Posted Feb 14, 2025 1:39 UTC (Fri) by jmalcolm (subscriber, #8876) [Link]

Gold star moderation right there

A lot of good stuff in there

Posted Feb 13, 2025 17:09 UTC (Thu) by alx.manpages (subscriber, #145117) [Link] (5 responses)

> >It might come as a surprise to you, but people don't like being accused of either (a) being as stupid as AI or (b) using AI to interact with people.
>
> Pardon?

I'm referring to <https://lwn.net/Articles/1009296/>.

> I have never said or even implied that.

Then I misunderstood. I'm very sorry for that. Given the context (link above, and Kent's recent involvement in heated discussions), that's how I interpreted it. I'm happy that it was a misunderstanding.

A lot of good stuff in there

Posted Feb 13, 2025 17:26 UTC (Thu) by mb (subscriber, #50428) [Link] (4 responses)

>I'm referring to <https://lwn.net/Articles/1009296/>.

Ok, I guess I should have written this more elaborately yesterday. So I'll try to do that now:

I read a couple of your comments and basically all of them had the "typical" AI structure to me. That might have been a coincidence or not. I didn't know, of course.
I just wanted to let you know. Give some feedback.
I think letting you know is useful either way, whether you copy&pasted something from an AI or not.

I'm not reading AI articles or AI comments or articles and comments that *look* like it.
That's all. Really.

I'm very sorry that this has apparently upset you a lot.

> Given the context (link above, and Kent's recent involvement in heated discussions),

Of course I implicitly referred to these issues.
But I would never call him names or shout at him, because that is the behavior that I'm actually criticizing.

I fundamentally disagree with you that being yelled at is a good thing or is even remotely acceptable.
I have been yelled at by the same people who yelled at you. It might be acceptable to you, but it's not acceptable to me.

A lot of good stuff in there

Posted Feb 13, 2025 22:01 UTC (Thu) by alx.manpages (subscriber, #145117) [Link] (3 responses)

> Ok, I guess I should have written this more elaborately yesterday. So I'll try to do that now:

Thanks!

> I read a couple of your comments and basically all of them had the "typical" AI structure to me.

I think we should not accuse humans of looking like AIs, at least in contexts where we can assume there's a real person behind the username. If we start having to prove that we're humans, we'll be in trouble.

Of course, if you're talking to a random reddit new user, you might very well do that, but I think we shouldn't do that here.

> That's all. Really.
>
> I'm very sorry that this has apparently upset you a lot.

Apologies accepted. It was just a misunderstanding. I'm happy now. :)

Please accept mine for the heated response from earlier today.

> I fundamentally disagree with you that being yelled at is a good thing

I don't think it's always the best thing, but for example, today it was useful, I think. If I hadn't yelled at you, you wouldn't have explained your point, and I would have continued thinking you're an asshole, and you'd have continued thinking I might be a chatbot. Me yelling at you has been beneficial in this case, I think. Things aren't black and white, but grey.

Having said that, I think yelling is usually not useful, and I personally don't use it often (you might have gotten a bad impression earlier today, but it's rather unusual for me to yell). I just understand that some people have different thresholds for yelling (and for being yelled at), and I accept that.

Cheers,
Alex

A lot of good stuff in there

Posted Feb 15, 2025 23:24 UTC (Sat) by alx.manpages (subscriber, #145117) [Link] (2 responses)

A lot of good stuff in there

Posted Feb 15, 2025 23:40 UTC (Sat) by mb (subscriber, #50428) [Link] (1 response)

Wow. What a toxic person you are.

Enough

Posted Feb 16, 2025 0:10 UTC (Sun) by corbet (editor, #1) [Link]

Please, stop here, enough name calling.

A lot of good stuff in there

Posted Feb 13, 2025 17:07 UTC (Thu) by koverstreet (✭ supporter ✭, #4296) [Link] (3 responses)

Can both of you calm down? Going all "circular firing squad" isn't the answer here.

I didn't feel I was being called an asshole, so let's just let that go.

We've got some real and legitimate frustrations about the way things are operating, but pulling a BSD isn't the answer either. We need to get better at working together, and giving up and leaving isn't the answer either.

We do need leadership that is a bit less out of touch - Christoph's been a problem for a long time, and him getting a pass while Hector got chewed out wasn't good.

But it'll also help if we can avoid venting too much at the maintainers who have been around the longest and done the most, so - perhaps a more respectful attitude all around.

A lot of good stuff in there

Posted Feb 13, 2025 17:34 UTC (Thu) by mb (subscriber, #50428) [Link]

Thanks for your answer. I fully agree with what you just said.

A lot of good stuff in there

Posted Feb 14, 2025 2:05 UTC (Fri) by jmalcolm (subscriber, #8876) [Link]

The best way to admonish Christoph is to merge the code.

Thank you for the filesystem by the way. I wish it was a bit easier to use on a root partition but I know we will get there.

A lot of good stuff in there

Posted Feb 14, 2025 6:32 UTC (Fri) by joib (subscriber, #8541) [Link]

> We've got some real and legitimate frustrations about the way things are operating, but pulling a BSD isn't the answer either. We need to get better at working together, and giving up and leaving isn't the answer either.

I think a large part of the reason for the dominance of Linux in so many areas today is that disparate interests have been able to collaborate, saving a lot of wheel-reinvention and building up momentum. Whether you attribute this to the copyleft license, Linus's personal leadership, some kind of "development culture", or whatever, the BSD-style habit of solving disagreements by taking your ball and going home is surely one of the big reasons why the *BSDs are relatively marginal today, and Linux isn't.

I'm not a kernel developer so I don't have a part in this fight, but it seems that Linus needs to show some leadership here: either overruling Christoph, or declaring the RfL experiment failed so the people involved in it don't need to waste more effort on a futile endeavor.

A lot of good stuff in there

Posted Feb 13, 2025 18:31 UTC (Thu) by ferringb (subscriber, #20752) [Link] (153 responses)

> I think it's time for *all* good non-abusive people to leave the Linux project and start a new kernel project based on modern development technologies.

What would that even actually look like, let alone the 'how'? That's not a "this is stupid", it's an actual "htf would that pig gain wings" question.

Frankly trying to deal w/ kernel code makes me want to drink a liter of whisky, but there's a ton of momentum and knowledge bound up in the current culture, and a *lot* of that code is fairly brutal to try and learn - both the inherent complexity, and just stupid shit like documentation being nonexistent. That's a lot to displace with an alternative.

All of this I know you know, hence wondering what realistic alternative there could even be at this stage, sans something like FANG funding a fork, which has its own issues.

A lot of good stuff in there

Posted Feb 13, 2025 19:02 UTC (Thu) by mb (subscriber, #50428) [Link] (152 responses)

Split-offs have been done in the past for non-technical and for technical reasons.
Think about x.org or LibreOffice for just two examples of big project split-offs.

Big rewrites have also happened frequently. Linux itself is such a project. Or, for a smaller but more recent example: coreutils/uutils.

I have deliberately not detailed the "how". But I would not be surprised at all if at some point a project suddenly pops up that does it *somehow*.

New operating systems pop up and vanish all the time. They are laughed at, maybe. Including Linux in 1991.

>Frankly trying to deal w/ kernel code makes me want to drink a liter of whisky

Yes, but that doesn't mean it has to be like that.
I think a huge part of that is due to the programming language that some of the maintainers love so much.

>hence wondering what realistic alternative there could even be

Would it have looked realistic to talk about an XFree86 fork in the early 2000s? Or OO.org? Not to me. It still eventually happened somehow, though.
Lots of once-big projects have been completely forgotten and replaced by now.
I think it's foolish to think that the current Linux is the only viable way forward.

A lot of good stuff in there

Posted Feb 13, 2025 20:12 UTC (Thu) by ferringb (subscriber, #20752) [Link] (150 responses)

> Think about x.org or LibreOffice for just two examples of big project split-offs.

Possibly debating for the sake of debating, but my recollection from then is that both communities were basically disgruntled and then the owners made a forced error which kicked it off.

XFree86 did the license crap; Oracle abandoned OOo but kept the reins. My point there is that both communities - from what I recall - were ripe for bailing, and the owners basically forced it. Akin to redis and valkey, or terraform and opentofu. Your recollection of those events - specifically your vantage - isn't mine; I was just on the corp FOSS/distro side for that. IE, I could be talking out of my ass.

> I think a huge part of that is due to the programming language that some of the maintainers love so much.

There is a deep irony in the complexity of kernel tooling, macro and attribute usage to try and make C safer and more streamlined to avoid gotchas, contrasted against the opposition to a language that fundamentally eliminates half that crap while providing a stronger framework for building even more crazed safety/robustness tooling. Baking RAII, state progression, all of that into typing and ownership is a no-brainer. It's not even "chuck it all and rewrite"; it's a pretty smooth onramp that yields gains as things go. A bit more complex than jumping between shitty C standards, but that's how I view it.

Either way, even if it's doomed to fail, I'd love to see a space where R4L (and other kernel culture/process improvements) seems like a collaborative effort across the board. Including, in particular, a reset of what's considered acceptable behavior as a cultural norm.

No argument with any of your other points, even if I disagree w/ the manpages/coreutils maintainer's view of "I can do C safely" vs my view of "I make mistakes, I want tooling that prevents the bulk of it". It's a bit blunt in phrasing, but again, I come at this from a tooling angle.

A lot of good stuff in there

Posted Feb 13, 2025 23:00 UTC (Thu) by butlerm (subscriber, #13312) [Link] (149 responses)

A Rust-based kernel, and indeed an entire Rust-based or Rust-dominated operating system, is an interesting project that is well worth doing, including automated, large-scale code translation from C / C++ and possibly other languages into Rust.

That said, it is probably always going to be an uphill battle to persuade a large development community to abandon, or even gradually abandon, an existing well-established language ecosystem for an entirely new one, no matter what the advantages are.

And this is especially the case because it is far from certain that C compilers, optimizers, and linkers will not be developed that remove the vast majority if not all of C's and C++'s undefined behaviors without a performance penalty - not only without a performance penalty but with a performance increase. There are people working on that right now who have been mentioned here on LWN.net, and from what I can see they have been making pretty good progress.

Furthermore it is far from established that Rust would be the best memory safe target language to convert something as large as the Linux kernel code base to anyway. There are much better-known languages, probably easier to learn than either Rust or contemporary C++, that may be better choices for that. A Pascal derivative like Modula-3 with the appropriate improvements is a good example.

I can think of a number of other possibilities that could work including appropriately statically compiled, linked, and optimized C# or other languages from the .NET world. Perhaps someone should try that as an experiment as well. Or something like Ada with the appropriate extensions perhaps. Or a new language designed and developed for the purpose. Ada is forty years old after all. The Go language with the appropriate extensions comes to mind as well.

A lot of good stuff in there

Posted Feb 13, 2025 23:30 UTC (Thu) by khim (subscriber, #9252) [Link] (34 responses)

> And this is especially the case because it is far from certain that C compilers, optimizers, and linkers will not be developed that remove the vast majority if not all of C's and C++'s undefined behaviors without a performance penalty

We can be 100% sure that it'll never happen. It's impossible to eliminate all undefined C and C++ behaviors. Mathematically impossible.

And eliminating the majority of them, while possible, is not happening either.

That, essentially, means that C will eventually be replaced. Some people, like marcan, just couldn't accept the fact that, ultimately, a significant percentage of C developers will be replaced along with it. That's the usual way such things happen, which is sad, but also, unfortunately, inevitable.

> There are people working on that right now that have been mentioned here on LWN.net and from what I can see they have been making pretty good progress.

Where can we read about said progress?

> Furthermore it is far from established that Rust would be the best memory safe target language to convert something as large as the Linux kernel code base to anyway.

The core thing, an affine type system, would be there; it's simply inevitable: we don't have any realistic alternative.

But yes, Rust may not be the best choice… only we don't have any other, right now.

> A Pascal derivative like Modula-3 with the appropriate improvements is a good example.

Modula-3 is not memory safe, thus it's a non-starter. Ada may qualify; it got a way to be memory safe around five years ago. But I seriously doubt it would be better accepted than Rust. Same is true for Swift.

What other alternatives are there?

> I can think of a number of other possibilities that could work including appropriately statically compiled, linked, and optimized C# or other languages from the .NET world.

100% non-starter. These languages base their memory safety on an entirely different foundation, tracing GC. Not something you may bring into a kernel (at least not into the Linux kernel).

> The Go language with the appropriate extensions comes to mind as well.

Is this an attempt to write a post with an AI generator or something?

To include so many “nonstarter” ideas in one plausible-sounding post… I have no idea how to achieve that without AI.

A lot of good stuff in there

Posted Feb 13, 2025 23:51 UTC (Thu) by koverstreet (✭ supporter ✭, #4296) [Link] (4 responses)

> Same is true for Swift.

Automatic memory management, and it's still pretty tied to Apple.

A lot of good stuff in there

Posted Feb 13, 2025 23:56 UTC (Thu) by khim (subscriber, #9252) [Link] (3 responses)

> it's still pretty tied to Apple.

That's what I meant: while Rust is having acceptance issues… the chances that Linux developers would accept something that's under the full control of Apple are much smaller.

> Automatic memory management.

Sure, that's less interesting than affine types – but it's still memory safety without tracing GC and thus, at least in theory, compatible with the needs of a kernel in the current “thou shalt not use memory-unsafe languages” era.

But the previous item makes it a non-starter.

A lot of good stuff in there

Posted Feb 14, 2025 14:04 UTC (Fri) by emk (subscriber, #1128) [Link] (2 responses)

As someone who has worked on both kernels and garbage-collected language implementations, they are two great tastes that do not go great together. It's not theoretically impossible, but it's Hard Mode on several axes at once. And you generally need to make compromises around the kernel's abilities, or bypass the GC for long stretches of code.

A lot of good stuff in there

Posted Feb 14, 2025 14:06 UTC (Fri) by koverstreet (✭ supporter ✭, #4296) [Link] (1 response)

There's some good stories about how the Lisp people pulled it off back in the day - but yeah, not something we want to do now.

A lot of good stuff in there

Posted Feb 14, 2025 17:35 UTC (Fri) by Cyberax (✭ supporter ✭, #52523) [Link]

If you're interested, there's the Singularity OS from Microsoft, which pulled this off in modern times. It's a pretty fascinating experiment, and they managed to tame the system-wide GC quite a bit.

Thanks for making me read (was A lot of good stuff in there)

Posted Feb 14, 2025 0:20 UTC (Fri) by dskoll (subscriber, #1630) [Link] (2 responses)

I don't know Rust at all and I'm not a kernel programmer, but khim's comment above made me go and research what "affine types" are, and it led me down quite an enjoyable little rabbit hole, to the point where I finally get (I think) what the big deal about Rust is.

So thank you!
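
For anyone taking the same detour: here is a minimal sketch of what "affine" means in practice in Rust. A non-Copy value may be used at most once, so the compiler rejects any use after a move. The Buffer type and consume function below are purely illustrative, not taken from any project discussed here.

    // `Buffer` owns its data and is not `Copy`, so each value is "affine":
    // it can be moved (consumed) at most once.
    struct Buffer {
        data: Vec<u8>,
    }

    // Takes ownership of `buf`; the caller cannot use it afterwards.
    fn consume(buf: Buffer) -> usize {
        buf.data.len()
    }

    fn main() {
        let buf = Buffer { data: vec![1, 2, 3] };
        let n = consume(buf); // `buf` is moved into `consume` here
        println!("{n} bytes");
        // consume(buf);      // a second use would fail to compile:
        //                    // "use of moved value: `buf`"
    }

That at-most-once rule is what lets the compiler insert exactly one destructor call for each value and reject use-after-free at compile time, with no runtime collector involved.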

Thanks for making me read (was A lot of good stuff in there)

Posted Feb 14, 2025 8:29 UTC (Fri) by 4761 (guest, #165801) [Link] (1 response)

I recently stumbled upon this blog post. I can highly recommend it, as it goes into further detail on this! (If you haven't come across it)

Thanks for making me read (was A lot of good stuff in there)

Posted Feb 15, 2025 2:31 UTC (Sat) by ebiederm (subscriber, #35028) [Link]

For things that have been achieved in the real world, I recommend looking at Ada SPARK. SPARK has already reached the point where hardware security bugs are easier to find than the software security bugs, in part because it has the facilities to verify that manual memory management is being done correctly.

https://blog.adacore.com/when-formal-verification-with-sp...

For what is possible I recommend looking at separation logic. One of the original papers includes a memory safety proof of an algorithm that reverses a linked list and manually manages memory.

The references in the Wikipedia page are a good starting place to find out more about Separation Logic. https://en.m.wikipedia.org/wiki/Separation_logic

Last I looked, it is still an open question how to convert separation logic into a type system for a language that manually manages memory.

Still, separation logic gets used in lots of interesting proofs of manual memory management. So any serious solution to manual memory management and security is going to need to interact with separation logic at some point.

The key property of separation logic is being able to show that one piece of code and data does not interact with another piece of code and data. That property is needed for any kind of reasoning that scales to the millions of lines of code in current code bases.

If the solution to showing separation is sufficiently granular, a proof that "free" and the data passed to it do not interact with anything else in the code base is enough to show that calling "free" is memory safe. Which potentially allows machine-checked memory safety for all of the existing data structures in the kernel.
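
For readers new to the notation, here is a rough sketch of the machinery being described, in standard separation-logic notation rather than anything specific to the kernel work discussed here. The separating conjunction P * Q says P and Q hold on disjoint pieces of the heap, and the frame rule is what lets a small local proof about "free" ignore the rest of the code base:

    \frac{\{P\}\; C\; \{Q\}}{\{P * R\}\; C\; \{Q * R\}}
    \qquad \text{(provided } C \text{ modifies no variable free in } R\text{)}

    \{\, x \mapsto v \,\}\ \mathtt{free}(x)\ \{\, \mathsf{emp} \,\}

Here x ↦ v says x points to one owned cell holding v, and emp says the local heap is empty afterwards; because the frame R is carried along untouched, the same small proof about free stays valid no matter how large the surrounding program is.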

A lot of good stuff in there

Posted Feb 14, 2025 3:04 UTC (Fri) by butlerm (subscriber, #13312) [Link] (25 responses)

Not sure what happened to two or three replies I posted here, but generally speaking insulting others is not the best way to win friends and influence people. As far as Rice's Theorem goes that is trivially solvable with extensions to the language - even #pragma style - that specify what happens if a function exceeds limits in time or space as a function of its arguments and the language visible machine state.

As far as progress goes, I believe Google is your friend. Articles on the subject have appeared here and on a couple of other sites within the past month or two. As far as Modula-3's memory safety is concerned, same answer as with C - specify exact implementation defined behavior so code that is memory safe can be written and demonstrated to be memory safe. As far as all the suggested languages are concerned I suggested that they would be worth trying as experiments - or even research projects. I can't quite see the great harm that would come from that.

As far as languages that normally use a tracing garbage collector are concerned, I am not sure where you get the idea that they must use a tracing garbage collection instead of something like an internal implementation of reference counting with extensions to break cycles, or why they would require embarrassing pauses, or have performance characteristics worse than RCU or a slab allocator for that matter.

The languages I mentioned are just the beginning. With a little bit of software engineering you could translate the entire Linux code base into an appropriate dialect of Python and with appropriate compiler technology compile it to have performance greater than that exhibited by the C code base right now. I am pretty sure that would also be possible with languages like Fortran, Java, and Eiffel with the appropriate extensions.

Not to mention assembly language with automated translation of assembly code from one CPU architecture to another. With a little creativity one could write and maintain the entire kernel in a close derivative of 68000 assembly language and automatically cross-assemble it into every target architecture the Linux kernel supports - *and* check for and maintain all the safety guarantees expected of a modern system level programming language. That is probably not the best way to maintain a kernel, but it would be hard to beat a kernel written in assembly language for performance. Netware 386 was written in assembly language for a reason.

None of this stuff is impossible or has been shown to be impossible. Anyone who thinks it is simply lacks imagination or is too dedicated to one possible solution to carefully consider any of the others.

A lot of good stuff in there

Posted Feb 14, 2025 4:12 UTC (Fri) by Cyberax (✭ supporter ✭, #52523) [Link]

> specify exact implementation defined behavior so code that is memory safe can be written and demonstrated to be memory safe.

Erm... What?

A lot of good stuff in there

Posted Feb 14, 2025 8:30 UTC (Fri) by Wol (subscriber, #4433) [Link] (1 responses)

> That is probably not the best way to maintain a kernel, but it would be hard to beat a kernel written in assembly language for performance. Netware 386 was written in assembly language for a reason.

???

Forth, anyone?

Cheers,
Wol

A lot of good stuff in there

Posted Feb 14, 2025 11:51 UTC (Fri) by khim (subscriber, #9252) [Link]

Every layer of abstraction adds overhead. Forth is not an exception. It's slower and less efficient than assembler… but also more compact!

That's a remarkable, but not unique, quality. Microsoft allegedly used some variant of p-Code in their early software for the same reason.

A lot of good stuff in there

Posted Feb 14, 2025 8:54 UTC (Fri) by taladar (subscriber, #68407) [Link]

Anyone who thinks "if we just change language X enough it could have all the properties of better language Y too" doesn't understand that that is just a worse way of adopting a new language since it requires just as much rewriting of code.

A lot of good stuff in there

Posted Feb 14, 2025 10:33 UTC (Fri) by khim (subscriber, #9252) [Link] (20 responses)

> Not sure what happened to two or three replies I posted here, but generally speaking insulting others is not the best way to win friends and influence people.

Who told you I'm here “to win friends and influence people”? I'm here to learn… if others have anything worthwhile to teach me. It happens.

> As far as Rice's Theorem goes that is trivially solvable with extensions to the language - even #pragma style - that specify what happens if a function exceeds limits in time or space as a function of its arguments and the language visible machine state.

This phrase just means that you are an idiot or you think I'm an idiot… is it an insult? You can treat it as one, if you want, but that's just the fact.

Rice's Theorem couldn't be “fixed” in the idiotic way that you propose because it, fundamentally, means that it's impossible to transform a program in any way (except purely syntactically) and then ask “does this program still perform in the exact same way or not”. The question of how big a program can be before that question becomes unanswerable in practice is interesting, but only in a purely theoretical sense: verifiers start to require more power and time than we may ever have very quickly, long before the size of the program reaches anything resembling an OS kernel, even a primitive one (like MS-DOS).
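
For reference, the textbook statement being leaned on here (a sketch, not the commenter's own formulation), writing \varphi_e for the partial function computed by program e:

    \text{For every non-trivial semantic property } P \text{ of partial computable functions,}
    \text{the set } \{\, e \mid \varphi_e \in P \,\} \text{ is undecidable.}

"Non-trivial" means some programs have the property and some do not; purely syntactic properties of the program text are not covered, which is why the parenthetical above matters.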

> I am not sure where you get the idea that they must use a tracing garbage collection instead of something like an internal implementation of reference counting with extensions to break cycles

Because “internal implementation of reference counting with extensions to break cycles” is simply one way to implement tracing GC?

The fundamental property of tracing GC is that it insists that it should know everything about all references from and to all objects anywhere in the system. “Extensions to break cycles” share that property and thus split the whole world in two: the part handled by the GC and the part that's not.

And then we are back to square one, because someone, somehow, has to write and support that “world that's not handled by GC”.
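
As a concrete illustration of what "reference counting with extensions to break cycles" looks like in practice, here is a minimal Rust sketch using the standard library's Rc and Weak. The weak parent link is the "extension" that keeps the parent/child cycle from leaking; whether that still counts as a flavor of GC is exactly what is being argued above, so treat this only as a picture of the mechanism:

    use std::cell::RefCell;
    use std::rc::{Rc, Weak};

    // Children are owned via strong `Rc` edges; the back-edge to the parent is
    // `Weak`, so the parent/child cycle does not keep both nodes alive forever.
    struct Node {
        name: String,
        parent: RefCell<Weak<Node>>,
        children: RefCell<Vec<Rc<Node>>>,
    }

    fn main() {
        let parent = Rc::new(Node {
            name: "parent".into(),
            parent: RefCell::new(Weak::new()),
            children: RefCell::new(Vec::new()),
        });
        let child = Rc::new(Node {
            name: "child".into(),
            parent: RefCell::new(Rc::downgrade(&parent)),
            children: RefCell::new(Vec::new()),
        });
        parent.children.borrow_mut().push(Rc::clone(&child));

        // The weak link can still be upgraded while the parent is alive...
        assert!(child.parent.borrow().upgrade().is_some());

        // ...but it does not count as ownership, so dropping `parent` at the
        // end of main really does free the whole structure.
        println!("{}: strong={}, weak={}",
                 parent.name, Rc::strong_count(&parent), Rc::weak_count(&parent));
    }

The cost is that somebody has to decide, by hand, which edges are weak; that hand-written, uncollected part is roughly the "world that's not handled by GC" described above.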

> The languages I mentioned are just the beginning.

I would rather say they are the end. Of the discussion, not of Rust or Rust for Linux, of course. They show that you don't understand why Rust exists, why Rust for Linux exists, and what problem they are solving.

Because they are not trying to solve any actual technical problem; rather, their existence and popularity have shown us that we hit something that's impossible to solve by technical means and it's time to “redefine the task”. In particular, it's impossible to create a language for the “we code for hardware” guys which would both produce optimal (or close to optimal) results and not turn the program into a pile of goo when certain conditions, hard to test for and detect, are violated.

The solution to that social problem is social and very well known, too: the one-funeral-at-a-time approach works reliably… others… not so much.

Rust for Linux is an interesting experiment to see if that problem can be solved without throwing out the work of thousands of people and starting from scratch.

It's very interesting and it would be cool to see if Linus could pull off a third revolution (after Linux itself and then Git), but for all his successes in years past I wouldn't expect him to mass-convert Sauls into Pauls… that's not how humans work. Some people would be converted… but most wouldn't.

Would that be enough to save Rust for Linux (and, by extension, Linux)? I have no idea, really. But I know that expecting technical solutions to social problems is foolish, and expecting to fix a social problem by decree from the top simply wouldn't work.

> With a little bit of software engineering you could translate the entire Linux code base into an appropriate dialect of Python and with appropriate compiler technology compile it to have performance greater than that exhibited by the C code base right now.

Yet another idiotic proclamation not justified by anything. I guess if you define “appropriate dialect of Python” as “something that looks like a Python program from the outside but can be syntactically translated to C” (because a purely syntactic change sidesteps Rice's theorem) then it may even be true, but I doubt that's what you meant.

> Not to mention assembly language with automated translation of assembly code from one CPU architecture to another.

These things exist, but you lose anywhere between 50% and 90% of performance. Precisely because you have to spend a lot of resources emulating behavior that's not needed at all – but it's not possible to detect that and skip emulating it (back to Rice's theorem, again).

The more efficient and tricky your assembly program, the larger the slowdown. To faithfully emulate all aspects of a 3.58 MHz SNES, and make all the assembly programs with all the nasty tricks their authors invented work, one needs a 3 GHz CPU.

Hardly a way to produce an OS that real people would use for real work.

> With a little creativity one could write and maintain the entire kernel in a close derivative of 68000 assembly language and automatically cross-assemble it into every target architecture the Linux kernel supports - *and* check for and maintain all the safety guarantees expected of a modern system level programming language.

Another iAPX 432? It would work exactly as “efficiently” (that is: very inefficiently) and would achieve exactly the same level of success as the original.

Possible? Maybe. Feasible? Most likely no. The majority of the Linux kernel code is in drivers, and a 3x-10x slowdown and drivers don't mix well: hardware loses patience and state, and then nothing works.

> That is probably not the best way to maintain a kernel, but it would be hard to beat a kernel written in assembly language for performance.

Transpiled from one CPU architecture and thus at least 3x to 10x slower? Easy. Not all languages would be able to beat it, but plenty would.

> Netware 386 was written in assembly language for a reason.

Sure. And that reason was: it had no need to support anything but the 386, and no need to do anything but file and printer sharing.

When people started demanding more things from their network OSes, it died extremely quickly.

There are lots of OSes written in assembly language that are still in use. You never hear about them because, these days, they need emulators and thus are much slower than anything else.

> Anyone who thinks it is simply lacks imagination or is too dedicated to one possible solution to carefully consider any of the others.

And anyone who tells tales that are not justified by history, experience or even math is just a buffoon. And you sound like someone who has never done any work in any relevant area, be it binary translation, language development or OS kernel development… what makes you sure you can teach anything worthwhile to anyone who has done such work?

Intel 432

Posted Feb 14, 2025 14:32 UTC (Fri) by james (subscriber, #1325) [Link] (19 responses)

Another iAPX 432? It would work exactly as “efficiently” (that is: very inefficiently) and would achieve exactly the same level of success as the original.
Bob Colwell, later to be senior architect on the P6 (= Pentium Pro) and Pentium 4, published postgrad research on why the 432 failed. He was insistent that a lot of people had taken the wrong lessons: Bob said that nearly all the performance problems were due to the space limitations of the existing manufacturing processes (leading to no memory caches and no general-purpose registers, for example), "part of the problem with the 432 was these guys just weren't paying attention to performance", and this anecdote:
Eventually, I said "Okay, who are you? You know way too much about this technology." He tells me "I'm the leader of the compiler team." And I said "In that case I probably just fatally offended you." He said "No, not at all because I know we generate bad code and I don't care." He said "We don't like the 432 hardware team." And I thought "Oh my God, there is no hope that this project is going to work when you have the two main casts killing each other." He said "That hardware team never listened to us compiler folks. At some point we decided that we'd live up to the letter of the contract but beyond that? No."
Bob thought that 90+% of the performance issues were down to Intel mis-steps, and with a newer manufacturing process, more attention to performance, a redesigned ISA, and a competent compiler, there could be a competitive part.

Incidentally, on Netware performance, he visited them in "1992 or so" and reported:

And the answer came back, "We don't care what it is, doesn't have to go faster. We're satisfied with the 486." And I thought okay, you're doomed.

Intel 432

Posted Feb 14, 2025 16:41 UTC (Fri) by branden (guest, #7029) [Link] (18 responses)

Just wanted to add that yes, everyone should read Colwell's oral history interview. Preferably from start to finish. As a fan of the Ada language, I think I stumbled across it myself while trawling the Web for the story of how the i432 went wrong.

I found this part to be a major takeaway:

"...if some human mind created something then your human mind can understand it. You should always assume that, because if you assume it, it throws away all the doubts that are otherwise going to bother you, and it's going to free you to just concentrate on 'what am I seeing, how is it working, what should I do about it, what am I trying to learn from this'. Never, ever think that you're not smart enough; that's all nonsense." -- Robert P. Colwell

Fortunately for cowboys and rock stars in kernel programming as elsewhere--and the managers who enable them--too few people hear or heed this advice. In these circles, it is believed that one's odds of professional advancement improve as one more convincingly illustrates that one's colleagues are intellectually hamstrung. To put it nicely.

I'm no fan of Intel as a firm but have found much to admire architecturally in their failed experiments, like the 432, the i960, and, yes, the notorious IA64 (Itanium)--I gather that much of what the last demanded of compilers, drawing furious opposition that labeled the machine dead on arrival, is now done by compilers as a matter of routine. If that's correct, it's not too different from the Ada/i432 situation.

Observing the tenor of LKML's most heated disputes, we might say that only a second-rate hacker learns from his own mistakes, let alone draws a lesson from anyone else's. A consensus that certain mistakes didn't happen, or won't be aired before an unvetted audience, may be a defining trait of broligarchy.

Intel 432

Posted Feb 14, 2025 17:20 UTC (Fri) by khim (subscriber, #9252) [Link] (6 responses)

> I gather that much of what the last demanded of compilers, drawing furious opposition that labeled the machine dead on arrival, is now done by compilers as a matter of routine

Nope. Not even remotely close. VLIW was built on the premise that scheduling could be done in the compiler. In advance. Before anything is known about actual availability of data.

That's what crashed and burned the Pentium 4 and the Itanic, and even for GPUs it proved not to be feasible once the pipelines there became diverse enough.

And no, compilers couldn't do that. They can kinda-sorta-maybe a tiny bit simulate that with PGO and other tricks (used by BOLT, etc.), but these are still very far from the precision needed to make VLIW viable.

> If that's correct, it's not too different from the Ada/i432 situation.

It's only marginally different. The i432 created an architecture which couldn't be fast, and the compiler was supposed to, somehow, make it fast. And the Ada team just completely bombed that and didn't even try.

With the Itanic the compiler team actually tried. And they even had some achievements… but they could never approach the efficiency of a hardware scheduler, because the hardware scheduler is much more primitive but has much more precise data. When an access to memory (main RAM, not L2 or L3 cache) takes 500 CPU cycles… mistakes in scheduling are extremely costly, and compilers couldn't prevent them.

> I found this part to be a major takeaway

Yet you apply it entirely wrongly. Instead of trying to understand how things work (and yes, everything that a human invented can be understood by another human) you just look at things, pat yourself on the back telling yourself that you can understand things… and then, without actually taking the time and effort to understand them (hey, that's a long and complicated process, let's skip it)… you produce the output “as if” you actually understood them.

That's, essentially, what LLMs are doing: they can't understand what they are saying and their goal is not to understand anything but to create a sequence of words that a human reader would perceive as having some meaning… the same means produce the same results, only in the case of LLMs we are permitted to call that sequence of words “hallucinations”, but if a human does the same then it's “opinion”… and we have to respect it — but why?

Intel 432

Posted Feb 14, 2025 18:21 UTC (Fri) by branden (guest, #7029) [Link] (5 responses)

>> I found this part to be a major takeaway

> Yet you apply it entirely wrongly.

Uhh, thanks for the insult. The earlier part of your response had utility, though you offer more than baseline reasons to not swallow your claims uncritically. In practically every comment you post to LWN, you wear your motivated reasoning on your sleeve.

> Instead of trying to understand how things work (and yes, everything that a human invented can be understood by another human) you just look at things, pat yourself on the back telling yourself that you can understand things… and then, without actually taking the time and effort to understand them (hey, that's a long and complicated process, let's skip it)… you produce the output "as if" you actually understood them.

Yup, you got me, that's what I do all the time, in every case. You've discovered a completely general truth!

Fortunately it's one that applies only to other people (or maybe just me)--not to you.

> That's, essentially, what LLMs are doing: they can't understand what they are saying and their goal is not to understand anything but to create a sequence of words that a human reader would perceive as having some meaning… the same means produce the same results, only in the case of LLMs we are permitted to call that sequence of words "hallucinations", but if a human does the same then it's "opinion"… and we have to respect it — but why?

I'm disinclined to think that LLMs are equivalent to what is (or used to be?) called "general intelligence". Perhaps inadvertently, you've supplied a reason to reconsider my presupposition. We can be confident that even if an LLM proves equivalent to a general intelligence (however defined), your own personal state of evolution will prove so advanced that it can't simulate you. You've got the spotlight--keep shreddin', bro!

Intel 432

Posted Feb 14, 2025 18:58 UTC (Fri) by khim (subscriber, #9252) [Link] (4 responses)

> Fortunately it's one that applies only to other people (or maybe just me)--not to you.

If that was an attempt at sarcasm then you failed. Of course it doesn't apply “only to other people”… the human brain is hard-wired to skip thinking as much as it can (it's a costly, very power-consuming process, after all) and I'm not an exception.

But I make conscious effort to double check what I write by looking at the various sources (including, but not limited, to Wikipedia) and don't demand “respect to my opinion”. My “opinions”, like everyone's are worth nothing. And if my facts or my reasoning are incorrect then they can be shown to be wrong by referencing to the appropriate contradiction in them.

But please don't tell tales about how I should tolerate nonsense because… what would it give me? Friends? Why would I need friends on some remote forum, be it LWN or Reddit? I have them in “real life”™.

> We can be confident that even if an LLM proves equivalent to a general intelligence (however defined), your own personal state of evolution will prove so advanced that it can't simulate you.

LLMs would never be able to “prove equivalent to a general intelligence”. What they have achieved, at this point, is the ability to regurgitate things that were said by others. And it's fascinating (and a tiny bit scary) how often people do the same.

But it's unclear when, and if, LLMs would get something that would teach them to do something more.

Most likely that moment is still far away, most likely a decade or more away… and when it comes they wouldn't be called LLMs.

> In practically every comment you post to LWN, you wear your motivated reasoning on your sleeve.

Yes. That's the whole point: you may see it and you may challenge it. If you can.

The whole idea is not to convince someone to “respect my opinion”, but to either accept my reasoning or show me where they are wrong.

To bro or not to bro

Posted Feb 14, 2025 20:17 UTC (Fri) by branden (guest, #7029) [Link] (3 responses)

I have no motivation to rebut the LLM stuff. Like other disruptive technologies, it's massively overhyped, I suppose because unhinged promotion is the most reliable route to obtaining VC money. Speculators invariably demand ludicrous returns.

> If that was an attempt at sarcasm then you failed.

More like a transparent attempt to flatter your gargantuan ego, since the tenor of your remarks over several years on LWN is that this is a concession you demand in conversation--albeit one that you would seem to prefer not to be gauchely and overtly labelled as such.

> Of course it doesn't apply “only to other people”… the human brain is hard-wired to skip thinking as much as it can (it's a costly, very power-consuming process, after all) and I'm not an exception.

This sounds like the sort of trivial observation one might find at LessWrong, where the admission of universal limitations is a form of currency accepted as a proxy for personal humility. I don't buy it.

> But I make conscious effort to double check what I write by looking at the various sources (including, but not limited, to Wikipedia)

Good for you.

> and don't demand “respect to my opinion”. My “opinions”, like everyone's are worth nothing.

Your model of opinion utility is useless. You might as well call opinions "noise" in the information-theoretic sense; that would make your epistemology more clear to your interlocutors.

I offer, without expecting you to accept, an alternative model. Opinions are a form of knowledge the strength of which derives from the expertise and reliability of the authority uttering them. Expert opinions are valued as evidence in courtrooms because a vetting process is applied to experts. For academic expert witnesses, that process involves verification (well, recitation, at least) of their credentialing by accredited authorities on top of personally authored, peer-reviewed, topically applicable research. That is not the standard we apply to Internet discussion forums or chat around the water cooler in the office; nevertheless we similarly apply weight to a person's claims depending on what is known about their relevant expertise and their past record of reliability.

For example, I found myself reading Robert P. Colwell's oral history interview closely and attentively despite an active disinterest in x86 architecture because (1) he had evident, relevant expertise in the field of computer architecture generally, in which I'm intensely interested and professionally adjacent; and (2) he reached conclusions that seemed consistent with my own <mumble> years of experience in software engineering; and that, bluntly, I was inclined (or biased) to believe already.

I've had numerous encounters with people like you who punk on others for not knowing things. You might concede, in a LessWrong sense, that we are all creatures that wallow constantly in ignorance. Nevertheless when correcting others you do it with an excess of derogation, awarding to yourself the privilege of expressing contempt based solely (or sufficiently) on your "conscious effort to double check what you write by looking at the various sources (including, but not limited, to Wikipedia". Well, bully for you. You checked Wikipedia. Maybe something else.

The contemptible thing about your comment was that you took the statement from Colwell that I quoted, claimed to agree with it, applied it with ludicrous specificity to a tentative claim I made about computer architecture and the history of compiler development, then proffered your (purportedly) superior and conflicting knowledge. Thus, rather than upholding Colwell's perspective as you claimed to do, you undermined it, by counseling others that their minds are inferior to yours--the opposite of Colwell's lesson.

> [I] don't demand “respect to my opinion” ... please don't tell tales about how I should tolerate nonsense

Who said you should? And incorrect statements are not always "nonsense". As noted above, your model confuses simple ignorance with carelessness, both of these with malicious deception, and all of the foregoing with noise in an information channel.

It's not the world's responsibility to give you a raw, noise-free information channel to inerrant truth. (Beware those offering one: they're building a cult.) It's your responsibility to identify imperfect paths to truths that will suffice. Your policy of nondiscrimination among the varieties of imperfect paths to knowledge leads you, apparently, to an intolerant tone and arrogant modes of expression. Alternatively, you already preferred that tone and mode, and developed (or borrowed) an epistemology to justify them.

> what would it give me? Friends? Why would I need friends on some remote forum, be it LWN or Reddit ? I have them in “real life”™.

Good for you for having friends in real life. The classes of relationship for which you seem to have no use, and yet which a forum like LWN comments (which has a narrowly focused audience, demographically and topically) depends, are known as "peers" or "colleagues". (That in turn doesn't mean people have to sickly sweet to each other--first, mandating such encourages belittling rhetoric to take on more baroque forms, crafted carefully to express contempt yet escape censure; second, engineers argue, sometimes passionately, about things all the time. Sometimes we learn new facts that sprout only from contested ground.)

Among peers and colleagues, just as among friends, it is okay to be mistaken or ignorant. What is valuable is to be able to offer, and accept, correction gracefully. Another valuable skill is the capacity to argue an issue, even a contentious one, without unnecessarily involving one's ego.

Some people will be unable or unwilling to associate with you on those bases; sometimes, it will be due to their own prejudices or defects. Other times, it will be solely down to you being supercilious and overbearing. One of these, you can control.

> That's the whole point: you may see it and you may challenge it. If you can. ... The whole idea is not to convince someone to “respect my opinion”, but to either accept my reasoning or show me where they are wrong.

Your point, and idea, need to be larger wholes to make communication with you more than minimally productive.

I concede that minimal productivity might be your objective.

(Meanwhile I'll count down to a scolding by one of the editors. :-/ )

Countdown reached

Posted Feb 14, 2025 20:32 UTC (Fri) by corbet (editor, #1) [Link] (1 responses)

As you can imagine, we will indeed ask that the folks involved in this subthread put an end to it.

It's sad, honestly, that we have to keep asking this.

Countdown reached

Posted Feb 14, 2025 21:14 UTC (Fri) by khim (subscriber, #9252) [Link]

I posted my answer before I saw this request. Sorry about that. I'll stop here.

To bro or not to bro

Posted Feb 14, 2025 21:13 UTC (Fri) by khim (subscriber, #9252) [Link]

> (2) he reached conclusions that seemed consistent with my own <mumble> years of experience in software engineering; and that, bluntly, I was inclined (or biased) to believe already.

That means that instead of trying to fight confirmation bias, you are embracing it.

That's also normal for humans and hard to fight, but that's also what we are supposed to fight if we want to create predictions that are even remotely close to what would actually happen.

One simple yet efficient way to combat it is to compare your past predictions to what has actually happened. This approach has revealed that my picture of the world has a few definitely bad spots — one example is Apple. Probably because my view of that company is similar to RMS's view (Remember? Steve Jobs, the pioneer of the computer as a jail made cool, designed to sever fools from their freedom, has died), while the majority of Apple users feel comfortable in that jail — and there are enough of them to ensure that Apple is able to do things that any other company couldn't.

That's an interesting phenomenon, but it's, thankfully, limited: while Apple continues to be able to squeeze a lot of money from Apple users and force them to do things that human beings are not supposed to acquiesce to… this world is still limited and it looks like it will be dismantled at some point, anyway.

There are some other blind spots where my predictions are not performing well, but these are mostly related to timing, and timing is quite hard to predict (e.g. Google killed ChromeOS development to, finally, make sure it only has one platform across laptops, tablets and phones — but that happened in 2024… not a decade before that, as was my expectation… Why was Google pouring money into something that would never fly? I have no idea, the move was obvious… but something made it hard to do when it made sense to do).

But can you present similar predictions of yours that were, or were not, borne out? Do you even look at what you predicted 10 or 20 years ago?

> Thus, rather than upholding Colwell's perspective as you claimed to do, you undermined it, by counseling others that their minds are inferior to yours--the opposite of Colwell's lesson.

I think we are seeing two entirely different lessons there. You look at the "Never, ever think that you're not smart enough; that's all nonsense" part, pat yourself on the back and tell yourself "hey, I shouldn't think I'm not smart enough and they all should respect me because I'm smart enough"!

And I look at the part that you also cited, yet ignored: it throws away all the doubts that are otherwise going to bother you, and it's going to free you to just concentrate on "what am I seeing, how is it working, what should I do about it, what am I trying to learn from this"… these are important things; respect is earned, not given — but you have the capability to earn it! And then he continues to the part where he gives that respect to others: "He's still better at it than I'll ever be. I mean, I watched him do what I consider near miracles, like walking up to a refrigerator and telling you that the bearings and the compressor are bad, walking up to a car, telling you that there's something wrong with the carburetor just by listening to it."

And I'm ready to do the same if you are actually better at doing such things: take a look at something where you are an expert and tell me what's wrong and what's going to happen… and then you can be respected for your abilities and knowledge.

But… where is that area? Where have you "walked up to something" and amazed someone with your ability to draw the correct conclusion from what little you could observe there?

Intel 432

Posted Feb 14, 2025 17:25 UTC (Fri) by jem (subscriber, #24231) [Link] (1 responses)

>I'm no fan of Intel as a firm but have found much to admire architecturally in their failed experiments, like the 432, the i960, and, yes, the notorious IA64 (Itanium)

What was wrong with the i960? Wikipedia says it was a "best-selling CPU" in its segment and calls it a "success". Maybe you are confusing it with the i860, which "never achieved commercial success."

Intel 432

Posted Feb 14, 2025 17:38 UTC (Fri) by khim (subscriber, #9252) [Link]

Please read the Wikipedia article more carefully: the i960 was essentially a RISC with pieces of the iAPX 432 bolted on top (in the “object oriented” BiiN version).

The “base” version (without the BiiN silliness) was a success, while all that “object oriented” craziness was only used in some government-related projects (and while the details are classified, we can assume that it was mostly for buzzword compliance, because there was no successor).

Itanium and compiler changes

Posted Feb 14, 2025 17:29 UTC (Fri) by farnz (subscriber, #17727) [Link] (8 responses)

> the notorious IA64 (Itanium)--I gather that much of what the last demanded of compilers, drawing furious opposition that labeled the machine dead on arrival, is now done by compilers as a matter of routine.

What made the Itanium dead on arrival is not what it demanded of compilers, but that to succeed it required the compilers to do things for Itanium but not for other architectures. Itanium could not perform adequately unless the compiler scheduled code - but to perform better than the Pentium Pro, POWER3 and other out-of-order machines, you needed to ensure that the compiler did not use the scheduling information to do a better job of codegen for those systems.

But compiler developers aren't drooling idiots, and used the analyses they did that were required to make Itanium perform adequately to help non-Itanium CPUs. As a result, instead of Itanium outperforming Pentium III, as analyses of compiler output from the early 1990s would suggest, Pentium III tended to outperform Itanium because the compiler improvements needed to make Itanium perform well also improved P6's performance.

Itanium and compiler changes

Posted Feb 14, 2025 18:27 UTC (Fri) by branden (guest, #7029) [Link] (7 responses)

> But compiler developers aren't drooling idiots, and used the analyses they did that were required to make Itanium perform adequately to help non-Itanium CPUs. As a result, instead of Itanium outperforming Pentium III, as analyses of compiler output from the early 1990s would suggest, Pentium III tended to outperform Itanium because the compiler improvements needed to make Itanium perform well also improved P6's performance.

That summary is consistent with other accounts I've read. But then that would make the Itanium, rather than a stupid folly, more like the right mistake to make at the right time. Yet that is a far more generous characterization than it normally gets.

I suggest that redemptive readings of architectures that "failed in the market" are more instructive to engineers than x86 (or even ARM) partisanship.

Itanium and compiler changes

Posted Feb 14, 2025 18:45 UTC (Fri) by farnz (subscriber, #17727) [Link] (6 responses)

No, it doesn't, because the compiler changes were made before Itanium needed them, precisely because they helped other architectures, too.

What Itanium demanded was that you regress compilers for x86, PowerPC etc so that they were back at the level that they were in 1993 or so when the project that became Itanium showed that hand-written EPIC code could compete with compiled output for OoOE machines.

Combine that with the 1998 chip releasing in 2001, having had to be ported forwards a process and cut down to size because it was (quite literally) too big to fit on the maximum possible chip size for 180 nm, let alone the 250 nm it was supposed to come out on, and it was clearly a folly - they literally couldn't build the thing they promised for 1998, it never performed acceptably compared to other CPU designs, HP had to continue PA-RISC for several years because Itanium wasn't good enough to replace it, and the x86-32 compatibility never performed as well as promised.

Itanium and compiler changes

Posted Feb 14, 2025 19:24 UTC (Fri) by khim (subscriber, #9252) [Link] (2 responses)

Note that Itanium wasn't designed by idiots. Like the Transputer, it was designed for a world where single-threaded core development had “hit the wall” near 100MHz and thus new ways of faster execution were needed.

In that imaginary world of slow CPUs and fast memory access, VLIW made perfect sense and was, in fact, one of the most promising designs.

But after the Athlon hit 1GHz at the end of the 20th century… it became obvious that the Itanic simply made no sense in a world of fast CPUs and slow memory… but Intel had to push it, for marketing reasons, even though it was doomed to fail and it was obvious that it had no future.

Itanium and compiler changes

Posted Feb 15, 2025 16:37 UTC (Sat) by farnz (subscriber, #17727) [Link] (1 responses)

It's notable that Intel bet twice in a row on technology futures that didn't happen.

Itanium was a bet that it would be hard to scale clock frequency, but that it would be trivial to go wider (both on external buses and internally). As a bet to take in 1994, that wasn't a bad choice; the failure at Intel was not cancelling Itanium in 1998 when it became clear that Merced would not fit into 250 nm, that process scaling wasn't going to allow it to fit into the next couple of nodes either, and that once you'd trimmed it down to fit the next node, it wasn't going to perform well.

Then, Pentium 4 was a bet that it would be hard to scale logic density, but that clock frequency would scale to at least 10 GHz. Again, wrong with hindsight, but at least this time the early P4 releases were reasonable CPUs; it's just that it didn't scale out as far as intended.

Itanium and compiler changes

Posted Feb 15, 2025 17:40 UTC (Sat) by khim (subscriber, #9252) [Link]

> It's notable that Intel bet twice in a row on technology futures that didn't happen.

What's notable is that both times the bets sounded perfectly reasonable. The Cray-1 reached 80MHz in the year 1975, and the PA-7100, Pentium, POWER2, SuperSPARC… all the fastest CPUs for almost two decades topped out at around that clock speed!

Assuming that trend would continue was natural.

Then, suddenly, during the Itanium fiasco, the 100MHz barrier was broken and clock speeds skyrocketed… assuming that this trend would continue wasn't too crazy, either!

> Again, wrong with hindsight, but at least this time the early P4 releases were reasonable CPUs; it's just that it didn't scale out as far as intended.

More importantly: when Intel realized that the P4 story was a bust, too, it quickly turned around and went with P6 descendants… Tejas was cancelled… but for some reason Intel kept Itanium on life support for years.

Itanium and compiler changes

Posted Feb 14, 2025 20:21 UTC (Fri) by branden (guest, #7029) [Link] (2 responses)

[earlier]
> > But compiler developers aren't drooling idiots, and used the analyses they did that were required to make Itanium perform adequately to help non-Itanium CPUs.

> No, it doesn't, because the compiler changes were made before Itanium needed them, precisely because they helped other architectures, too.

I'll confess to being the sort of inferior mind that khim frequently laments, because I'm having a hard time inferring a coherent chronology from the foregoing.

This drooling idiot's going to need a good resource for the struggles of ISA and compiler development in the '90s. Anyone got any references to recommend?

Itanium and compiler changes

Posted Feb 14, 2025 22:16 UTC (Fri) by malmedal (subscriber, #56172) [Link]

> Anyone got any references to recommend?

Search for EPIC. It's a frequently recurring topic on Usenet's comp.arch; you can try reading the archives or asking.

I don't think the compilers ever scheduled Itanium's instruction-bundles very well. Itanium had superior floating point performance, but I believe this was because the FPU was register-based. x87 used a stack-based instruction set, which did not play very well with superscalar scheduling.

Itanium and compiler changes

Posted Feb 15, 2025 16:46 UTC (Sat) by farnz (subscriber, #17727) [Link]

I'd suggest looking at the things that came out of IBM research - Static Single Assignment (SSA) and so on - along with looking deep into how EPIC was supposed to work.

You'll need a good understanding, also, of out-of-order execution (e.g. Tomasulo's algorithm) and register renaming, to understand why compiler developers were doing more instruction scheduling for IBM POWER CPUs even without Itanium's deep need for explicit scheduling.

The core, though, was that EPIC relied on the instruction stream being pre-scheduled in software with explicit dependency chains and speculation, making the core simpler. Out-of-order execution (OoOE) infers the dependency chains and speculation based on the instructions in the "instruction window"; being able to schedule instructions the way EPIC requires allows OoOE to see more opportunities for parallel execution, since it can execute instructions in parallel wherever there isn't a dependency chain.

A lot of good stuff in there

Posted Feb 13, 2025 23:36 UTC (Thu) by willy (subscriber, #9762) [Link]

Go needs a garbage collector. Modula-3 doesn't have a union type, and has a very sad compiler story.

Rust is actually a good choice, stop trying to throw out alternatives.

A lot of good stuff in there

Posted Feb 14, 2025 0:41 UTC (Fri) by rolexhamster (guest, #158445) [Link] (102 responses)

    ... because it is far from certain that C compilers, optimizers, and linkers will not be developed that remove the vast majority if not all of C's and C++'s undefined behaviors without a performance penalty - not only without a performance penalty but with a performance increase. There are people working on that right now that have been mentioned here on LWN.net and from what I can see they have been making pretty good progress.

Speaking as someone who has developed in C and C++ for 25+ years, I can say that the above statements are demonstrably false.

There is no way to fix either C or C++ without breaking backwards compatibility. At that point it's more productive (and carries far less technical debt going forward) to use a new language that has been developed from scratch with safety explicitly in mind.

All the diagnostic tooling (in conjunction with so-called restricted language subsets) is not sufficient to provide proper safety guarantees in C and C++. It is like a band-aid that covers up symptoms rather than addressing the root cause. In less charitable terms, it is equivalent to putting lipstick on a pig; there is still a (very ugly) pig underneath.

The entire resistance to iteratively incorporating Rust into the Linux kernel is downright bizarre and counter-productive. It's as if certain maintainers are too set in their ways to learn a very useful new language, and are happy with half-measures and wishful thinking. This eventually leads to obsolescence.

A lot of good stuff in there

Posted Feb 14, 2025 4:22 UTC (Fri) by wtarreau (subscriber, #51152) [Link] (100 responses)

> The entire resistance to iteratively incorporate Rust in the Linux kernel is downright bizarre and counter-productive. It's as if certain maintainers are too set in their ways to learn a very useful new language, and are happy with half-measures and wishful thinking. This eventually leads to obsolescence.

I think the root cause of the problem is precisely what is written above: there are people who consider that others do not *want* to learn their pet language, despite the fact that just about everyone agrees it's probably one of the most difficult languages around to learn. I tried. I failed. Too many cryptic symbols, the code is not pronounceable, etc. That simply doesn't work for me. As I said to Miguel during a discussion some time ago, it makes me feel like I'm trying to code with smileys. The effort is just too much for me. And I think that's actually the problem a number of other maintainers are facing. If you want to impose a very difficult language, it's normal to face resistance. But the resistance is not necessarily against the introduction of *a* new language, but against a perceived inability to learn *this one*. And this must be respected, and addressed in a way that doesn't require all these people to even have to parse it if they can't. Systematically accusing people of deliberately refusing to do something they cannot do is particularly harsh; it's like attacking people for a disability.

In an era where everything wants to be "inclusive", maybe start by making Rust inclusive? BPF used to be bytecode only, used only in tcpdump; it eventually got a compiler and verifier that now turn C into safe code. Maybe you need a similar C-to-Rust compiler that would allow developers to write their code using a much simpler C syntax and turn it into Rust, rejecting whatever is not proven safe? Just like with eBPF, it would put the effort on the developer, but using a language they at least understand.

A lot of good stuff in there

Posted Feb 14, 2025 9:10 UTC (Fri) by hunger (subscriber, #36242) [Link] (1 responses)

What are you actually struggling with? "Programming with smileys" sounds like it is the syntax. Would a translation layer that looks more like C help? Such a layer would contain non-C parts, to express all the language features just not available in C, so I doubt that would help much. You would still have to deal with lifetimes, even if the syntax for that would look different.

What would help you?

If you kernel guys continue with Rust, you as a community will need to come to grips with the hard part of introducing a new language: you will need to work with people who approach problems in a different way than you are used to. You will need to develop a basic understanding of the "Rust way" in addition to "the C way". If you do not manage that, then you will continue to burn each other out. If you do pull this off, I think all of you can benefit a lot from the fresh wind this will bring.

The Rust devs start with a slight advantage here: many of them know C already. They also have a disadvantage, though: they "got the Rust way" and think it is superior, so they will be impatient, waiting for the old C guard to catch up to them. That won't happen: you will need to find a common understanding somehow, and it will lie neither on the C nor on the Rust side.

IMHO this Rust for Linux thing is a much more interesting social experiment than a technical one.

A lot of good stuff in there

Posted Feb 14, 2025 15:02 UTC (Fri) by Wol (subscriber, #4433) [Link]

> You will need to work with people approaching problems in different way than what you are used to.

Which is hard. Which is why when I mention databases everyone groans :-)

But I just see things completely differently to most other people. I see objects, which contain attributes and relations. People "classically trained" in databases see relations, attributes and tuples, out of which they build ... well I'm not sure ...

And never (well hopefully not) the twain shall meet ...

Vi/Emacs. Declarative/Procedural. MultiValue/Relational. Rust/C. Just try not to pick a fight with the other side. Try to see things through their eyes. And remember - THEIR BRAINS MAY NOT BE WIRED THE SAME AS YOURS!

Cheers,
Wol

A lot of good stuff in there

Posted Feb 14, 2025 12:02 UTC (Fri) by taladar (subscriber, #68407) [Link] (4 responses)

As someone who has learned many programming languages as a hobby in the past I find the view that syntax is even close to the most difficult thing to learn about a new language outright bizarre.

Syntax might be hard to figure out initially, especially in languages like Haskell that use a lot of operators instead of named functions you can search for in a general purpose search engine (but Haskell has Hoogle to help with that), it might be hard to break some habits when switching languages for a bit, sure. But those are all minor problems compared to semantic changes between languages in terms of actual learning effort.

A lot of good stuff in there

Posted Feb 14, 2025 23:41 UTC (Fri) by khim (subscriber, #9252) [Link] (2 responses)

> As someone who has learned many programming languages as a hobby in the past I find the view that syntax is even close to the most difficult thing to learn about a new language outright bizarre.

It's not as bizarre as you think if you view the whole thing from the mindset of someone who thinks all these fancy languages and other modern tools exist only to emit machine code in a certain sequence.

For them, any language is only a thin (or thick) veil that hides their beloved machine code from them.

In that mindset syntax is, indeed, the biggest obstacle, and things besides syntax simply don't exist.

And that's the second part that puts Rust developers and kernel developers into an unsolvable bind: while some kernel developers have come to accept the fact that what they want (a compiler that makes it possible to predictably emit efficient machine code while ignoring any and all rules) couldn't exist and won't exist… a significant part still wants that mythical tool and won't agree to deal with anything else.

> But those are all minor problems compared to semantic changes between languages in terms of actual learning effort.

But they don't see it like that! Machine code is machine code… what “semantic changes” are you talking about?

A lot of good stuff in there

Posted Feb 15, 2025 22:14 UTC (Sat) by kleptog (subscriber, #1183) [Link] (1 responses)

Some of this thread sounds very similar to the comments by one of the creators of Go, that they were hoping to attract people that wrote C but ended up attracting way more people from more managed languages.

It's like there's something about being an ingrained C programmer that makes it harder to switch to something higher level.

Like the comments related to it being hard to make complicated data structures in safe Rust. If you're coming from a language where the only data structures are arrays, maps and objects, you don't understand how this can be a problem.

A lot of good stuff in there

Posted Feb 16, 2025 11:20 UTC (Sun) by khim (subscriber, #9252) [Link]

> Some of this thread sounds very similar to the comments by one of the creators of Go, that they were hoping to attract people that wrote C but ended up attracting way more people from more managed languages.

That wasn't C, that was C++.

> It's like there's something about being an ingrained C programmer that makes it harder to switch to something higher level.

Not really. Rust attracted C++ developers when that wasn't in the initial plan at all.

It's just that people use languages differently… and very often not in the way their creators intended these languages to be used!

Rob Pike (and his team) saw C++ as a hugely complex, convoluted beast… and “solved” that problem by creating a simpler language… one that moved error detection from compile time to runtime to achieve that… of course C++ developers would shun it! But for Python and Ruby developers… who were fed up with type-mismatch runtime errors… it was a perfect fit.

Similarly with Rust: what was initially a tiny side-story, not even a central part of the language design… solved issues that C++ (and most other languages) have – and that made it popular among people who were fed up with endless fights against invalidated pointers and iterators… in all languages… but not among the ones who don't see anything wrong with what they are doing and think that the fact that certain error-handling paths contain mistakes is not a good enough reason to raise all that racket about “Rewrite it in Rust”.

> Like the comments related to it being hard to make complicated data structures in safe Rust. If you're coming from a language where the only data structures are arrays, maps and objects, you don't understand how this can be a problem.

Believe me, C++ has more data structures than you can list. And Rust can implement all the data structures you may ever imagine. There are [intrusive lists](https://docs.rs/intrusive-collections/latest/intrusive_collections/), [arenas](https://docs.rs/bumpalo/latest/bumpalo/), [self-referential data structures](https://docs.rs/ouroboros/latest/ouroboros/) and many other things… No, the problem is different, this time, again.
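To make the first half of that concrete, here is a rough sketch of an intrusive linked list in Rust, following the pattern documented for the intrusive-collections crate linked above (the item names are taken from that crate's documentation as I recall it, so treat this as an illustration rather than copy-paste-ready code):

    use intrusive_collections::{intrusive_adapter, LinkedList, LinkedListLink};

    // The link lives inside the element itself, much like `list_head` in kernel C.
    struct Frame {
        link: LinkedListLink,
        value: u64,
    }

    // The adapter tells the list how to get from an element to its embedded link.
    intrusive_adapter!(FrameAdapter = Box<Frame>: Frame { link: LinkedListLink });

    fn main() {
        let mut list = LinkedList::new(FrameAdapter::new());
        list.push_back(Box::new(Frame { link: LinkedListLink::new(), value: 1 }));
        list.push_back(Box::new(Frame { link: LinkedListLink::new(), value: 2 }));
        for frame in list.iter() {
            println!("{}", frame.value);
        }
    }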

The problem lies on the C side, not the Rust side, this time. In most languages (be it C++, Ada, Java or even Go) it's relatively easy to separate data structures from the “business logic” – but C (especially “C used as a portable assembler”) doesn't give you that luxury. That's why the natural inclination of a C user is not to import an implementation of some data structure, but to reimplement it, in ad-hoc fashion. Hey, that could be more efficient, and I wouldn't have to rely on someone else's work!

> It's like there's something about being an ingrained C programmer that makes it harder to switch to something higher level.

The “we code for the hardware” mindset. “C as a portable assembler” is really as high as you can push it. And even that starts falling apart with modern compilers. A higher-level language means that you have to give up that control and, in particular, start using data structures prepared by someone else. That's flat-out unacceptable to these guys (the hardware doesn't have any pre-canned structures, so their code shouldn't have them either), even if it's easy to see that the alternative doesn't work: most bugs that fuzzers and other tools find in the kernel are precisely in implementations of ad-hoc data structures, and getting them all right by hand isn't practical in any language.

Even if you use WUFFS, which in theory makes it possible to write ad-hoc implementations… you quickly find out that writing all these ad-hoc implementations and all these ad-hoc correctness annotations is too painful not to reuse code.

A lot of good stuff in there

Posted Feb 15, 2025 11:07 UTC (Sat) by jengelh (guest, #33263) [Link]

>Syntax might be hard to figure out initially, especially in languages like Haskell that use a lot of operators instead of named functions [... and there is] Hoogle to help with that

Please. I don't know of anyone, ever, who has produced ACME::EyeDrops-style code *and* kept maintaining it after the "syntax might be hard initially" part.

A lot of good stuff in there

Posted Feb 14, 2025 21:25 UTC (Fri) by roc (subscriber, #30627) [Link] (13 responses)

That some experienced kernel developers have tried and failed to learn Rust seems very strange to me. If we could figure out the right explanation(s) for that, we'd learn a lot more about what R4L needs to be more successful.

It seems strange to me because I know a variety of people who have learned Rust and none of them found it particularly hard to get into:
-- Me. I was "Rust-adjacent" from the beginning of Rust (in fact I was one of the people who first reviewed Graydon's proposal, when it was a very different language). But from first trying to write Rust code to being reasonably productive was a few days max. But OK, I had a CS PhD, lots of experience with C++, some experience with other weird languages.
-- My Pernosco co-founder. No PhD, lots of C++ experience but not other languages. With Rust, had a similar experience to me.
-- My son, an undergrad CS student. A quite small amount of experience with JS, Python, Java and C. Picked up Rust pretty easily.
-- Various people I worked with at Mozilla and Google who ended up on a project where they needed to use Rust. C++ experience, some like Rust more than others, but none "failed to learn Rust" or anything close to that. AFAICT they all picked it up pretty easily.
(This is all "can read Rust code and write code to solve problems", not "have mastered everything".)

Possible explanations? Could it be that all these people are smarter than the average kernel maintainer? I don't believe that for a second.

Maybe some kernel maintainers subconsciously (or consciously) don't want to use or like Rust, and that sabotages their efforts, possibly without them being aware of it?

Maybe for some kernel maintainers who have written nothing but C for decades, their minds get locked into certain ways of reading and writing code, and it's really hard to break out of that? Sounds weird, but this corresponds most closely to what you described in your comment.

I can't think of any other possible explanations right now.

A lot of good stuff in there

Posted Feb 15, 2025 0:20 UTC (Sat) by khim (subscriber, #9252) [Link] (12 responses)

> Maybe for some kernel maintainers who write nothing but C for decades, their minds get locked into certain ways of reading and writing code, and it's really hard to break out of that? Sounds weird, but this corresponds most closely to what you described in your comment.

No, it's not weird and, in fact, our discussions with Wol have shown the problem extremely acutely.

These developers don't think in terms of C. For them C is just a fancy way to generate machine code, and they think in terms of machine code, which is then back-converted, in their minds, into C before the code is typed.

Sure, C has a significantly different syntax, but it was initially created for that very task and thus such use doesn't feel unnatural, at first. You have to remember that C wasn't, actually, a high-level programming language and maybe not even a language at all… it was a means to write machine code for different machines from one “macro package”.

It was later turned into something resembling a high-level language by different people, members of the C (and, later, C++) committees… but for these developers who still think in terms of machine code… nothing has changed (except for the “evil compilers” that require more and more effort to stop them from breaking their “beautiful machine code”).

> Could it be that all these people are smarter than the average kernel maintainer? I don't believe that for a second.

They are not smarter, but they have a different background. For them Rust actually exists as a thing in itself: it's something separate from the machine code that comes out of the compiler. It has separate rules, separate properties; it's something you deal with not by tracking the path from “yet another weird syntax” to the machine code, but as something that can be considered apart from the machine.

And that's exactly where the “one funeral at a time” vibe lives: to understand Rust you first have to understand that it doesn't try to invent yet another way to represent machine code in your program; instead, it gives you a way to represent your intent (and then the compiler has the freedom to transform said intent into machine code – and you can, mostly, simply ignore the details of that process).

That's not something you can actually learn, that's something you have to accept… and it's easier to accept for someone with JS, Python or Java experience than for someone who first wrote machine code with PEEK/POKE in BASIC, then “graduated” to assembler and “got a degree” in C.

They try to apply the exact same rules to Rust… and they simply don't fit: Rust, especially safe Rust, is very limited and doesn't allow them to express many things that were expressible even in C… that's an awful experience for someone who thinks in assembler.

P.S. The irony here is, of course, that the last real programmer had the exact same trouble trying to accept assembler. It, too, robbed him of some of the skills that he felt were quite valuable. Kernel developers haven't lived in the era of drum memory, thus they never developed those skills, and it wasn't hard for them to live with such a “sacrifice”. But Rust taxes their abilities in a very similar way: it makes something that they feel is supposed to be in their toolbelt inaccessible… and that causes this strong feeling of rejection.

A lot of good stuff in there

Posted Feb 15, 2025 4:45 UTC (Sat) by ebiederm (subscriber, #35028) [Link] (8 responses)

There is a different issue for me.

I came to the conclusion very quickly that for the kinds of code I typically write, I cannot express it in safe Rust.

Not that we are talking much of a challenge. I have to think hard to find any of the data structures from my intro to data structures and algorithms course that can be implemented in safe Rust.

At which point I do my research and I see that Rust dropped the ball: it is possible to machine-check the implementation of those algorithms, but Rust just doesn't give me the tools I need.

I do a bit more research and realize that C has a lot of accidental/unnecessary complexity, and C++ is off the charts. Rust isn't as bad as C++, but there remains a lot of unnecessary complexity. Which all matters, because we want to build secure software. The kind where we can stand a server up on the Internet and the only thing left to worry about is what happens when the uptime counter wraps, because there haven't been any security issues found in the software for years. That completely matters.

The more complex the foundation is the harder it is to analyze.

Which means Rust really isn't a programming language I desire to use. It complains because it can not understand my safely written code, and it complicates analysis of that code.

Rust seems an incremental step forward. But we are talking an incremental step forward from code that has been good enough for the last 25 years to exceed the reliability of bleeding edge hardware. I have stories.

That is, C really is good enough to get the job done.

I have seen people make the case that Rust can get the job done too. I haven't seen much support for the notion that Rust does the job better. I am pretty certain that if I were using Rust I would find it an over-complicated claptrap that I would have to continually fight to keep out of my way.

A lot of good stuff in there

Posted Feb 15, 2025 5:28 UTC (Sat) by Cyberax (✭ supporter ✭, #52523) [Link] (4 responses)

> The kind where we can stand a server up on the Internet and start having to worry what happens when the uptime counter wraps because there hasn't been any security issues found in the software for years. Completely matters.
> That is C really is good enough to get the job done.

I struggle to understand how you can reconcile these two statements. C has shown again and again and again and again and again and again that it's NOT possible to write secure non-trivial software in it. Pretty much every large network-facing product in C has seen its share of critical security issues.

> Rust seems an incremental step forward. But we are talking an incremental step forward from code that has been good enough for the last 25 years to exceed the reliability of bleeding edge hardware. I have stories.

What "reliability"?

A lot of good stuff in there

Posted Feb 15, 2025 13:08 UTC (Sat) by ebiederm (subscriber, #35028) [Link] (3 responses)

What I want in something new:
> The kind where we can stand a server up on the Internet and start having to worry what happens when the uptime counter wraps because there hasn't been any security issues found in the software for years. Completely matters.

I do not see Rust making it possible to do the above.

The Rust type system does not allow modelling what needs to be modelled to allow for data structures to be written in safe Rust. For me it is like working with epicycles in an Earth centered solar system model, when what we need are ellipses in a sun centered model.

It is a lot easier to show an ellipse is correct because the model is simpler.

Although honestly I don't think Rust has managed to be even as good as epicycles were in astronomy. Rust just says: it is unsafe, I give up.

If Rust only gave up on the implementation details of tricky things like RCU, and the fine details of spinlock implementations, sure. That stuff is hard. But intro-to-data-structures 101? That seems to push pretty much every piece of code I write into unsafe land.

So my first statement was about where I would like an operating system to be, and how I don't see Rust making it easy to get there.

My second statement ( "C is good enough") is about practical utility. In the common cases. In the cases where nobody cares about security.

Try standing up to a boss who has a make-or-break-the-company feature they want to ship, and telling them that what you are building on is implemented insecurely and it will be another year before it has a proper foundation.

You say:
> C has shown again and again and again and again and again and again that it's NOT possible to write secure non-trivial software in it. Pretty much every large network-facing product in C has seen its share of critical security issues.

I will point out that every time it is some silly mistake. An off-by-one error, or not getting error handling quite right. Generally in code paths that don't matter for what people are trying to do. Almost always a localized and simple fix will do.

Which is to say, it isn't the hard stuff that C gets wrong; C simply does not give enough support to prevent errors in the easy stuff.

Telling people that what they have been doing for their entire career is hopelessly broken is not a selling point, especially when they know better. They know that what they have been doing gets the job done.

To actually achieve secure software I can leave on the Internet for decades without updating, I am pretty certain we will need support for statically verifying assertions about the code, as only machine validation can be thorough and patient enough to check every little corner case.

So far Rust makes some headway in that department, but then its type system sees an unsafe and bows out. I haven't seen anything in Rust beyond its type system that can be used to catch the rare mistakes conscientious people make.

A lot of good stuff in there

Posted Feb 15, 2025 13:26 UTC (Sat) by mb (subscriber, #50428) [Link]

>RCU... spinlock... That seems to push pretty much every pice of code I write

Nobody claims that you can implement RCU or spinlocks in safe Rust.

>I will point out that every time it is some silly mistake.

In Rust most of these silly mistakes are impossible to do. That's the point.

>Almost always a localized and simple fix will do.

Not having to do a fix, because nothing is broken, will do better.

>Telling people what they have been doing for their entire career is hopelessly broken, is not a selling point

Sure. People tend to not like changes.

>They know what they have been doing gets the job done.

... with a steady stream of security issues.
If you can accept that then yes, C gets the job done.

>but then it's type system sees an unsafe and bows out.

unsafe blocks don't disable any of the type, borrow and safety checks.
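A tiny sketch (mine, not from the thread) of what that means in practice: the only new ability the block below grants is the raw-pointer dereference; everything else is checked exactly as before.

    fn main() {
        let x: i32 = 7;
        let p = &x as *const i32;

        // `unsafe` grants the raw-pointer dereference...
        let v = unsafe { *p };

        // ...but type checking, borrow checking and the other safety checks still
        // apply inside the block; for example this would still be a type error:
        // let s: &str = unsafe { *p };

        println!("{}", v);
    }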

>I haven't seen anything in Rust beyond it's type system that can be used to catch
>the rare mistakes conscientious people make.

Sure. But that's just because you didn't look. Not because they don't exist.

A lot of good stuff in there

Posted Feb 15, 2025 21:43 UTC (Sat) by Cyberax (✭ supporter ✭, #52523) [Link]

> The Rust type system does not allow modelling what needs to be modelled to allow for data structures to be written in safe Rust.

Which data structures Rust doesn't allow to be modeled (perhaps with a bit of `unsafe` inside the implementation)?

> My second statement ( "C is good enough") is about practical utility. In the common cases. In the cases where nobody cares about security.

No. C is not "good enough". It's downright dangerous. At this point, merely using C for new projects should be considered bordering on criminal negligence.

> I will point out that every time it is some silly mistake. An off by one error, or not getting error handling quite right. Overall code paths that don't matter for what people are trying to do. Almost always a localized and simple fix will do.

Yeah. And that's how we get airplane crashes and nuclear reactor blowups. This attitude is just distilled nonsense and sheer arrogant ignorance.

The single greatest advance in safety science was the recognition that humans can't help but make mistakes, and that measures should be taken to make those mistakes impossible: LOTOs, automatic safety interlocks, etc. So every other engineering discipline (including ones that are much younger than Computer Science) has adopted tools that reduce the possibility and the impact of "silly mistakes". Even non-engineering disciplines are doing that.

The latest advance that drastically reduced the death rate in surgeries was not some kind of tricorder powered by AI, but simple checklists: https://en.wikipedia.org/wiki/WHO_Surgical_Safety_Checklist

A lot of good stuff in there

Posted Feb 16, 2025 22:28 UTC (Sun) by mathstuf (subscriber, #69389) [Link]

> To actually achieve secure software I can leave on the Internet for decades without updating, I am pretty certain that will require supporting statically verifying assertions about the code. As only machine validation can be thorough and patient enough to check every little corner case.

Do you have any existing examples of this actually happening in any language, nevermind C? Maybe Erlang-based deployments at telcos (depending on your definition of "update")? But I can't think of any C program that has been "on the Internet for decades without updating" while also being "secure". Unless maybe you're counting `/bin/true` as "on the Internet"?

And I agree that machine validation is required for further trust in software. But Rust is not only more easily tooled (because its source just has more information than C does), but *has* better tooling in almost every dimension that matters (sure, UBSan is "better" for C than Miri is for Rust given Miri's limitations, but its need for such a tool is also *far far* lower).

A lot of good stuff in there

Posted Feb 15, 2025 11:58 UTC (Sat) by khim (subscriber, #9252) [Link]

> There is a different issue for me.

How is it different?

> I have to think hard to find any of the data structures from my intro to data structures and algorithms that Rust can be implemented in safe Rust.

Of course they can all be implemented in safe Rust. Just put all your data into an array and use indexes. That's how we studied them when I was in school, which only had access to some crazy micros with BASIC in ROM.

What you cannot implement in safe Rust is something that gives you the exact same beautiful and clear machine code that exists in your head.

That's a different issue.
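Purely as an illustration of that “array plus indexes” approach (my sketch, not code from the thread): a singly-linked list whose nodes live in a Vec and whose links are indices, so no unsafe is needed anywhere.

    // Nodes are stored in a Vec that acts as an arena; "pointers" are indices.
    struct Node {
        value: i32,
        next: Option<usize>,
    }

    struct List {
        nodes: Vec<Node>,
        head: Option<usize>,
    }

    impl List {
        fn new() -> Self {
            List { nodes: Vec::new(), head: None }
        }

        // Push to the front: append the node to the arena and repoint `head`.
        fn push_front(&mut self, value: i32) {
            let idx = self.nodes.len();
            self.nodes.push(Node { value, next: self.head });
            self.head = Some(idx);
        }

        // Walk the list by following indices instead of pointers.
        fn iter(&self) -> impl Iterator<Item = i32> + '_ {
            let mut cur = self.head;
            std::iter::from_fn(move || {
                let idx = cur?;
                cur = self.nodes[idx].next;
                Some(self.nodes[idx].value)
            })
        }
    }

    fn main() {
        let mut list = List::new();
        list.push_front(3);
        list.push_front(2);
        list.push_front(1);
        for v in list.iter() {
            println!("{}", v); // prints 1, 2, 3
        }
    }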

> Rust just doesn't give me the tools I need.

Are you sure? Why do you implement the exact same data structures again and again? What's the point? What happened to “don't repeat yourself” and code reuse?

> Which all matters, because we want to build secure software.

Yes, less complex languages are desirable… but only if they are expressive enough to provide safety guarantees. C and even C++ don't provide them. Rust does.

It's as simple as that.

After all, BASIC is arguably simpler than C, and yet the kernel is not written in QB64.

> The kind where we can stand a server up on the Internet and start having to worry what happens when the uptime counter wraps because there hasn't been any security issues found in the software for years.

As long as you are using Linux that's impossible, because security issues in the kernel alone are found more often than we would like.

> The more complex the foundation is the harder it is to analyze.

Only if “all other things are the same”. In the case of C and Rust, they are not.

> Which means Rust really isn't a programming language I desire to use. It complains because it can not understand my safely written code, and it complicates analysis of that code.

Yup. That's what FORTRAN and COBOL programmers were telling themselves half a century ago, when the structured-programming debates raged. And then they were fired and replaced with Pascal and C programmers, and their ability to juggle complex GOTO-filled programs didn't matter one jot.

COBOL programmers got some vindication years later, when banks needed to do something about their decades-old systems. FORTRAN programmers mostly had to leave programming completely. I interviewed a few when they tried to return to IT years later; they were even willing to learn “new ways of doing things”, but my boss rejected them.

Ironically enough, over time the strict “no goto ever” restriction was relaxed a bit, with MISRA C 2012 allowing forward gotos, and exceptions being nothing more than interprocedural forward gotos… but that happened years later. Initially, the rejection of unstructured control flow was enforced with almost religious fervor.

> That is C really is good enough to get the job done.

Just like FORTRAN and PL/I were “good enough” half a century ago. They are still with us, but no new projects are written in them. The same will happen to C.

The interesting questions are not whether C will be replaced, but what it will be replaced with – and whether Linux will survive that replacement or not.

Because when all that heat was generated in the 1970s… no one could predict that it would be C that won the race in the end. Not only was the reigning king, Pascal, much more popular; even Ada was, briefly, more popular than C!

Then a lawsuit happened that changed the course of history… something like that may still happen in the future.

For example, if Apple were either broken up or collapsed (which would free Swift from its shackles), then we might end up with a future where Swift, and not Rust, replaces C.

Currently Rust's tremendous advantage lies in the fact that it was developed at Mozilla, which is no longer financially capable of supporting and controlling it, and thus it's the first contender for the C/C++ replacement.

But just as the AT&T breakup suddenly propelled C and C++ to stardom… something like that could still happen for Swift, or, heck, even Carbon.

Language development is a fascinating area where technical capabilities determine the losers while social dynamics determine the winners.

That's how we can say that C is doomed (it's simply not technically good enough to survive), while saying that Rust will replace it… looks more and more likely. But then, in the middle of the 1980s, it looked as if Pascal would replace FORTRAN and COBOL… OSes (e.g. Apple's Lisa and classic MacOS) and programs (most early Microsoft programs for the PC) were written in Pascal – yet the ultimate winners were C and C++ from AT&T.

A lot of good stuff in there

Posted Feb 15, 2025 17:53 UTC (Sat) by excors (subscriber, #95769) [Link]

> I have to think hard to find any of the data structures from my intro to data structures and algorithms that Rust can be implemented in safe Rust.

I'd agree that Rust is a poor language for introductory computer science. E.g. it doesn't closely match the pseudocode of CLRS (where much of the book is dedicated to complex and subtle mathematical proofs of correctness, which is antithetical to Rust's idea of mechanically-verified correctness; and the book doesn't care about concepts like ownership because it never deallocates anything, effectively assuming you have a GC). Java or Python would be a much better fit while you're working through a textbook like that. For recursive data structures (singly-linked lists, trees), Rust is quite cumbersome - you'd be better off with ML or Haskell. For learning how CPU hardware works, you'd want an assembly language (or maybe C-as-portable-assembly, keeping away from its rough edges).

Rust isn't meant for computer science, it's meant for software engineering. It's for when you've already learned the CS and you're trying to apply it in real-world codebases, where scalability and robustness and concurrency and non-asymptotic performance are much more serious challenges than how to implement yet another chained hash table from scratch. You'll just use std::collections::HashMap. In the very rare case you need a data structure that nobody else has implemented and published as a crate, then you can make use of your CS education and write it yourself; Rust probably makes it harder to get it right, but you only need to do it once, and then you can go back to the real software engineering that Rust makes a lot easier.

If you aren't doing software engineering, that's fine - there's plenty of programming (scripting, research, scientific computing, etc) where you'd probably get little benefit from Rust, and other languages will be better. But OS kernels and internet-facing applications are exactly where Rust's benefits are needed.

A lot of good stuff in there

Posted Feb 15, 2025 20:08 UTC (Sat) by roc (subscriber, #30627) [Link]

You must have a very unusual job or hobby if most of your programming involves implementing new data structures that can't be expressed in safe Rust, especially given the large number of good-quality Rust libraries that already implement a huge variety of data structures.

A lot of good stuff in there

Posted Feb 15, 2025 8:55 UTC (Sat) by wtarreau (subscriber, #51152) [Link]

I think you summarized my experience pretty well, I'm glad I don't feel that strange and that some wise people understand these differences and difficulties. Thank you.

A lot of good stuff in there

Posted Feb 15, 2025 11:29 UTC (Sat) by Wol (subscriber, #4433) [Link] (1 responses)

> No, it's now weird and, in fact, our discussions with Wol have shown the problem extremely acutely.

Hmm... interesting ...

Okay, I'm not a university-trained programmer, and I was already an experienced programmer when I first met C, but I do bang on about how people think and how your first experiences shape you. I very much treat programming like a school maths problem; my first real language was FORmula TRANslation, which encourages you to think that way, and so I've found Guile/Scheme and Forth very tricky, while I've always programmed C like it's Fortran. Like I'm answering a maths question.

And Pick/DataBASIC. While they were designed in lock-step (although the language came later), they are designed to store and manipulate objects, not rows. Again, a very different way of thinking.

It's always easy to extend your worldview to include similar views. Changing your worldview is a lot harder. It's quite likely kernel programmers think differently at a fundamental level, and Rust is maybe one step too far. I wonder how I'll fare when I really start digging in to it.

Cheers,
Wol

A lot of good stuff in there

Posted Feb 15, 2025 12:31 UTC (Sat) by khim (subscriber, #9252) [Link]

> I wonder how I'll fare when I really start digging in to it.

Not as hard as it may look from all these cries about a “steep learning curve” and “ugly syntax” (which I still perceive as ugly, ironically enough).

The story typically goes like this:

  • Alice tries to write JavaScript in Rust and finds it hard and pointless.
  • Bob tries to write C in Rust and finds it really hard, unsafe and pointless.
  • Cecil tries to write Python in Rust and finds it somewhat hard, but strange.
  • David tries to write C++ in Rust and finds that it's possible to bend Rust into a pretzel to imitate C++, but it's hard to do… even if the safety sounds nice.
  • Esmeralda learns to write Rust in Rust easily and has no idea what Alice, Bob, Cecil and David complain about.

> And so I've found Guile/Scheme and Forth very tricky, while I've always programmed C like it's Fortran

I couldn't say that Guile/Scheme and Forth are super tricky, but they just have never “clicked” for me. I can force myself into the mindset needed to use them, but it's hard for me to think in these languages. Maybe if I had ever used them professionally I would have learned to do that, but since I never went beyond toy programs with them… I guess I never needed to.

But Rust… is actually a very easy and simple language – but only if you accept its desire to form all data structures into trees (with some escape hatches for more complex cases in the form of pre-made data structures like HashMap or BTreeSet).

After you stop trying to bend Rust to your will (possible, but very hard, and the compiler fights you tooth and nail) it's actually a pretty simple language… with the sole exception being, ironically enough, not the famed borrow checker, but its type checker, which is some kind of Prolog (as expected) but with a cut operator added in very strange places (half-expected, but still very jarring).

The hard part is to stop trying to write some other language in Rust, be it C++ or JavaScript. But for some reason dropping JavaScript habits is easier than dropping “we code for the hardware” habits.

> It's quite likely kernel programmers think differently at a fundamental level, and Rust is maybe one step too far.

Low-level programming in Rust requires thinking on two levels. The hardware is still there, its needs still have to be obeyed… but now you cannot just back-translate assembler code into a high-level language; you have to express the hardware's restrictions to Rust… and this just feels wrong: I'm programming the hardware, why the heck is another agent even needed?

But Rust is complex enough that trying to understand how your code will be represented in machine code… just doesn't work. Sometimes you guess right, sometimes you guess wrong, and for Rust to be usable you really need to delegate the work with the hardware to it! But that's precisely what many experienced kernel developers and, ironically enough, especially maintainers, don't want to do!

Once you accept that two-step process, learning Rust is not too hard… but as long as you are trying to sidestep Rust and convince it to produce the machine code that you envision in your head… you will have trouble.

A lot of good stuff in there

Posted Feb 14, 2025 21:49 UTC (Fri) by mb (subscriber, #50428) [Link] (78 responses)

> their pet language

Please stop that nonsense.

> code is not pronounceable

How do you pronounce

int foo(void) { return 0; }

and why is this worse than the equivalent Rust syntax?

Rust syntax very much *is* pronounceable. If you don't know how to pronounce it, then it's rather that you probably didn't learn Rust.
Even those fancy lifetime annotations and generics can be pronounced.
Yes, in C you don't have to learn that, because C lacks these features.
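For comparison, a rough sketch (mine, not from the thread) of the same declaration in Rust, plus one using the features C lacks; the second function is the `longest` example from the Rust book and reads aloud as “longest, generic over lifetime tick-a, takes two string slices living at least tick-a, returns a string slice living at least tick-a”.

    // The C declaration above, written in Rust:
    fn foo() -> i32 {
        0
    }

    // Lifetimes and generics are extra syntax, but they can be read out loud too:
    fn longest<'a>(x: &'a str, y: &'a str) -> &'a str {
        if x.len() > y.len() { x } else { y }
    }

    fn main() {
        println!("{} {}", foo(), longest("kernel", "rust"));
    }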

>At an era where everything wants to be "inclusive",

Here we go...

>Maybe you need a similar C-to-Rust compiler that will allow developers to write their code using a much simpler C syntax and turn it to Rust

Really. Please read some basics about Rust.
You didn't even read the beginners' book; that's obvious.
And that's perfectly fine, unless you then take shots at Rust.

The thing you are requesting is impossible and does not make any sense.
It is not possible to express the majority of Rust safety details in C. Therefore, it's impossible in general to translate C to safe Rust.

It's like requesting a machine code to C++ decompiler. That is impossible in general for obvious reasons.
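A small sketch of why the information gap matters (my example, using a hypothetical `pick` helper): a safe-Rust signature has to state which argument the returned reference borrows from, and that information simply isn't present in a C prototype like `char *pick(char *a, char *b)`.

    // The C prototype says nothing about whose storage the result points into;
    // the Rust signature has to commit to an answer, so a mechanical C-to-Rust
    // translator would have to invent facts that aren't in the C source.
    fn pick<'a>(a: &'a str, b: &'a str, use_a: bool) -> &'a str {
        if use_a { a } else { b }
    }

    fn main() {
        println!("{}", pick("left", "right", true));
    }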

A lot of good stuff in there

Posted Feb 15, 2025 8:53 UTC (Sat) by wtarreau (subscriber, #51152) [Link] (77 responses)

The simple fact that you flatly reject the difficulties I expressed means it will always be difficult for several people to agree on something around the adoption of the language. It's not pleasant for me to explain why I find it difficult to learn something, yet you're turning these explanations into "WTF?". It's as if someone went to a psychologist to describe their problem and the person said "I don't have time to waste on such stupidities". Please try to read about the difficulties some people face instead of deciding they're making them up.

I began with BASIC in '82, when I was almost 8 years old. I did a lot of BASIC since it was the only language I had access to, and when you're a kid you have plenty of time available. At 10 I got a mostly-PC-compatible machine, then at 12 I got an update to DOS 2.11, which came with the "DEBUG" utility that allowed me to start with ASM. I wrote many ".COM" programs entirely in DEBUG; I had no source code as such, I could read it using the "u" command to see the disassembled version, and "a" to assemble new instructions. I got used to reserving space for forward jumps and later patching, and I even got used to writing self-modifying code. That was quite efficient. Then I discovered Turbo Pascal, still used jointly with asm (particularly for stuff that I needed to run fast using self-modifying code), and wrote my own assembler that supported symbols and named labels. A good improvement. I only learned C at 20, on UNIX. I didn't and still don't like the language much; it has a number of deficiencies and usability issues (UB, unexpected operator precedence, an inconvenient stdlib particularly for string processing, etc.). But for me C has always just been a way to write portable, high-level asm code that builds and runs (almost) similarly on all machines, with as few asm() statements as possible. I had to stop writing self-modifying code due to non-writable code segments, while in parallel CPUs got larger decode units and efficient branch prediction, making it less useful anyway. What I'm thinking about when I write C is the code it will produce, and the portability of the types involved. I constantly look at the resulting asm when I write C, on at least x86 and arm, just to glance at what it does (i.e. whether or not I've sufficiently helped the compiler understand what I'm doing).

When I tried Rust, I found that it had nothing in common with these other languages. It cares about concepts that are totally counter-intuitive to me. For example, why does it care about who owns a pointer to a location? In C terms a pointer is just an integer, an offset from NULL to a memory location, and a memory location is just a bunch of cells made of transistors (SRAM) or capacitors (DRAM); I have a hard time imagining why a compiler would want to prevent me from using them, since they're in my address space, and if I decide that from an algorithmic point of view I want to use them this or that way, I don't see why the compiler should stop me. In addition, the language makes learning very difficult for me because it uses characters/operators/signs that have different meanings from those of the languages above. My mental parsing always stalls when I see the "'" (single quote) character (I've already forgotten what it's about), as in other languages it was used to delimit a char or a string, for example. The fact that a variable declaration is in fact a constant if you don't write "mut" also feels very strange to me. Generally speaking I find the syntax difficult and poorly expressive. You can disagree because you know the language and managed to learn it. But that's the way I feel about it. I have tried some online howtos, like "30mn to rust" etc. I get lost very quickly: too many differences from what I'm used to, too difficult for me, and for no perceived value except maybe making it harder to write programs. I feel like I have to lock-pick a door to enter a jail. That feels very strange to me.

I suspect that there are different expectations from different people (maybe generations, BTW). There seem to be those who don't care much about the underlying hardware and want to express their thoughts in a program so that the compiler does its best to represent them in an executable, and those who think about how they will use the available hardware to best implement an idea they're having, and feel that the compiler should not stop them from experimenting with that idea, because their ideas are often sparked by perceiving an opportunity (e.g. you read the description of a CPU instruction you had never heard of and suddenly you figure out what you could do with it, and a new idea is born). Neither is right or wrong, these are completely different approaches, and it's possible that some languages are more suitable for those with the first approach and others are more suited to those with the second.

And for the same reason some Rust developers can be shocked to see what C permits and the risks that come with it, some C developers might be shocked to see what Rust tries to prevent and the difficulties that come with it. The goals are just not the same.

I hope this gives you more background about *my* difficulties with the language, which may or may not match others'. I'm expressing this to try to help the two camps listen a bit more to each other without systematically considering there is bad faith on the other side, because these attitudes have only resulted in heated debates and people quitting, which is bad for everyone. Let's just accept that everyone's brain is not the same, and let's not make fun of the ones different from ours.

A lot of good stuff in there

Posted Feb 15, 2025 10:01 UTC (Sat) by mb (subscriber, #50428) [Link] (74 responses)

>instead of deciding they're making them up.

I have not said that. I agree that your problems with Rust are very real.

>For example, why does it care about who owns a pointer to a location since a pointer is just an integer, in terms of C

Because a pointer *isn't* just an integer. Even in C.
https://doc.rust-lang.org/std/ptr/index.html#provenance
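
A tiny sketch of what that means in practice (just an illustration, not from the linked docs):

fn main() {
    let a = [1u8, 2, 3];
    let b = [4u8, 5, 6];

    let pa = a.as_ptr();
    let pb = b.as_ptr();

    // As integers these are just two addresses...
    println!("a at {:p}, b at {:p}", pa, pb);

    // ...but each pointer also carries provenance: pa may only be used to
    // access a's allocation. Even if some offset from pa happened to equal
    // pb numerically, dereferencing it would be undefined behaviour -- in
    // Rust *and* in C; the rule is just rarely spelled out in C.
    unsafe {
        println!("a[2] = {}", *pa.add(2)); // fine: stays inside a
    }
}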

>"'" (single quote) character (which I already forgot what it's about)
>as in other languages it was used to delimit either a char or a string for example

In Rust it *is* used to delimit a char. I'm not sure what your point is.

>The fact that a variable declaration is in fact a constant if you don't write "mut" also feels very strange to me

It's just a sane default, because most variables are in fact not mutable.
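
A two-line illustration of that default:

fn main() {
    let x = 1;     // immutable by default; `x += 1` here would not compile
    let mut y = 1; // mutability is opted into explicitly
    y += 1;
    println!("{x} {y}");
}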

>Generally speaking I find the syntax difficult and poorly expressive

Ok. But that's really strange, because a lot of things in the syntax are really the same as in C.
I guess you are locked into C then.

>I have tried some online howtos, like "30mn to rust" etc

See this book:
https://doc.rust-lang.org/book/

It takes you baby step by baby step through all those new things. And you will be able to write basic Rust programs after a couple dozen pages.

>I suspect that there are different expectations from different people (maybe generations BTW)

I also did all the things you mentioned. I also started with Basic and so on.
I also have learnt many languages with very different syntax and look and feel. That's why I really don't see why Rust would be so different here. In fact it's very similar in look and feel to C. It's much closer to C than say Perl or Python or even C++.

You seem to be complaining that Rust added things to the syntax to be able to express new things.
Well. That's not fixable then and you won't ever be able to learn new languages.
C lacks the ability to express certain fundamental things. The only way forward is to add syntax for these things. Just like C did, when they changed their code from ASM to C. All those fancy for-loops there. I want my JMP back!

>and those who think how they will use the available hardware to best implement an idea...
>and it's possible that some languages are more suitable for the ones

Yes. C is completely unsuitable for that.

>Rust developers can be shocked to see what C permits and the risks that come with it

C does not permit more things than Rust.
That is just not true.
It *does* "permit" more UB all over the place, though. But that is not a useful feature. It's a trap.

>help the two camps listen a bit more to each other without systematically considering there is bad faith on the other side

Yes. You are in one of these two camps, as am I.
I am not assuming bad faith in most people.
I am assuming a lack of knowledge most of the time.

A lot of good stuff in there

Posted Feb 15, 2025 11:08 UTC (Sat) by jem (subscriber, #24231) [Link] (7 responses)

>>"'" (single quote) character (which I already forgot what it's about)
>>as in other languages it was used to delimit either a char or a string for example

>In Rust it *is* used to delimit a char. I'm not sure what your point is.

I think he means the quote character that is used to declare lifetimes.

A lot of good stuff in there

Posted Feb 15, 2025 11:18 UTC (Sat) by zdzichu (subscriber, #17118) [Link] (6 responses)

No, single quote is used to label loops! (https://doc.rust-lang.org/book/ch03-05-control-flow.html#...)
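
Both uses side by side (a throwaway sketch):

fn first<'a>(items: &'a [u32]) -> Option<&'a u32> {
    items.first() // 'a names a lifetime here...
}

fn main() {
    'outer: for i in 0..3 {   // ...while 'outer here is a loop label
        for j in 0..3 {
            if i * j == 2 {
                break 'outer;
            }
        }
    }
    println!("{:?}", first(&[1, 2, 3]));
}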

See the problem?

A lot of good stuff in there

Posted Feb 15, 2025 15:24 UTC (Sat) by mbunkus (subscriber, #87248) [Link]

Even in C there are characters that serve multiple, vastly different purposes: (. It starts a cast operation, or a list of function parameters (both in declarations & calls), and it controls order of mathematical operations. Or *, which can be used in a variable declaration to make something a pointer, or in a statement to dereference a pointer, or to multiply two numbers in mathematical context, or together with / it's used for starting & ending comments etc.

No, I do not see a problem with characters serving different purposes simultaneously.

A lot of good stuff in there

Posted Feb 15, 2025 20:32 UTC (Sat) by excors (subscriber, #95769) [Link] (4 responses)

I had always assumed the 'a syntax came from ML, where 'a is pronounced "alpha", 'b is "beta", 'c is obviously "gamma", etc (https://www.cs.cmu.edu/~rwh/isml/book.pdf page 68). And I assume that's originally because ML was designed by people who think primarily in maths and TeX, and then have to translate into ASCII; 'a is the closest they could get visually to α. (Fortunately they weren't complete monsters: they translated λ to fn, not 'l, when implementing lambda calculus.)

In ML it's a type variable, representing an unspecified type that will be determined via type inference when the value is instantiated (I think). Basically the same as T in C++ templates. In Rust it's a similar idea, but just for the lifetime component of a type. The Rust compiler was originally written in OCaml, so they would have been familiar with that syntax.

However, it turns out ML had nothing to do with it. Until Rust 0.6 they used "&a/foo" instead of "&'a foo", but they weren't happy with that: https://smallcultfollowing.com/babysteps/blog/2012/12/30/... . That blog post suggested several options including "&{a}". Someone on Reddit suggested "&{'a}", to make the syntax less ambiguous. Graydon Hoare thought the braces were ugly (https://web.archive.org/web/20140716163946/https://mail.m...), so they settled on "&'a".

The use of the same syntax for loop labels is not a coincidence: the original idea was that loop labels were actually lifetimes, and you could write " 'a: { let x: &'a T = ...; }" to explicitly tie a variable's lifetime to a block (https://web.archive.org/web/20140716182842/https://mail.m...). That didn't happen, so now the loop labels are only used for control flow and exist in a different namespace to lifetimes.

A lot of good stuff in there

Posted Feb 15, 2025 20:40 UTC (Sat) by mb (subscriber, #50428) [Link] (3 responses)

Thanks for the nice summary.

I'd like to add (for the people who don't know Rust, yet) that lifetime names are not restricted to single characters.
They are single characters starting at 'a by convention, but sometimes it is useful to use more descriptive names such as 'ctx or even longer names.
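
For example (a minimal sketch with a made-up Parser type):

struct Parser<'ctx> {
    // 'ctx reads better than 'a once several borrows are in play
    source: &'ctx str,
}

fn main() {
    let text = String::from("hello");
    let parser = Parser { source: &text };
    println!("{}", parser.source);
}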

Lifetime names

Posted Feb 15, 2025 20:57 UTC (Sat) by farnz (subscriber, #17727) [Link] (2 responses)

FWIW, I encourage people who are "fighting the borrow checker" and can't just stick to owned copies of data to use long and explicit lifetime names, and then to work out how to elide them. While you can write fn foo(&mut self, name: &str) -> &str, it's often easier to reason about what's going on when you're hazy on the elision rules if you write that as fn foo<'this, 'name>(&'this mut self, name: &'name str) -> &'this str, because it makes it much clearer where the surprising behaviour is coming from.

And once you've got a good grip on how lifetimes work in your code, it's then a lot easier to apply the elision rules to remove useless lifetimes and leave your code clear to future readers. It is, of course, easier to start out without lifetimes at all, and just use owned copies (via Clone::clone()) whenever the borrow checker argues with you, but it's not always possible to do so.
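
To make that concrete, here is a sketch with a made-up Registry type, showing the same method with elided and with fully spelled-out lifetimes (both compile and mean the same thing):

struct Registry {
    last: String,
}

impl Registry {
    // Elided form: the rules tie the returned borrow to &mut self.
    fn remember(&mut self, name: &str) -> &str {
        self.last = name.to_string();
        &self.last
    }

    // Fully written out, equivalent form: it is now visible that the
    // result borrows from self, not from `name`.
    fn remember_explicit<'this, 'name>(&'this mut self, name: &'name str) -> &'this str {
        self.last = name.to_string();
        &self.last
    }
}

fn main() {
    let mut r = Registry { last: String::new() };
    println!("{}", r.remember("alice"));
    println!("{}", r.remember_explicit("bob"));
}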

Lifetime names

Posted Feb 17, 2025 12:17 UTC (Mon) by taladar (subscriber, #68407) [Link] (1 responses)

Not necessarily specific to lifetimes but I have also noticed when programming Haskell that single letter naming appeared a lot more in abstract code where you don't really have a useful name to give to a variable or function (e.g. f for the function passed into the higher order map function).

Over time I noticed that more abstract code is often actually easier to reason about because the more abstract some piece of code is, the fewer operations can be applied to it (e.g. you can't write a literal to produce values out of thin air, you can only pass on values you already have, can't call just any function, just the ones in the traits (or in Haskell typeclasses) specified in the constraints).

Concrete versus abstract

Posted Feb 17, 2025 12:37 UTC (Mon) by farnz (subscriber, #17727) [Link]

I also find similar - but I find that it's often easier to dig myself out of a mess if I start by thinking about things in deeply concrete terms (with long clear names), and move to the abstract once I understand the concrete sphere. So starting out with lifetimes like 'closure_changing_id and 'id_source, put in very explicit 'id_source: 'closure_changing_id style of annotations (rather than relying on lifetime variance), and then gradually trim back down to a minimal setup once I've understood what was going on.

I might, in that process, end up with short names, I might not - it will depend where I get to as I dig myself out of the mess. But it's similar to how I'd dig myself out of a mess with functions named "a", "b", and data items named "ud_1", "ud_2" etc - make the names excessively verbose, and shrink them to a sane size once I've understood WTF is happening here.

A lot of good stuff in there

Posted Feb 15, 2025 11:19 UTC (Sat) by jengelh (guest, #33263) [Link] (40 responses)

>>"'" (single quote) character (which I already forgot what it's about)
>>as in other languages it was used to delimit either a char or a string for example
>
>In Rust it *is* used to delimit a char. I'm not sure what your point is.

This use:

fn from_segments(segments: &[&Segment<'_>]) -> Option<Version> {

(it also complicates colorization rules for an editor if it can no longer expect that every opening ' has a symmetric closing '.)

A lot of good stuff in there

Posted Feb 15, 2025 12:16 UTC (Sat) by mb (subscriber, #50428) [Link] (29 responses)

We are fundamentally limited by what characters there are on the keyboard.
I really don't see how one could mistake 'a for 'a', because they come in completely different places of the code (type position vs. value position).
And in the vast majority of cases you don't even have to write lifetimes ('a) down at all. Your example is in between having to write it down and not having to write it down, because it already elides the name, but not the whole '_ syntax. The advantage of '_ is that it allows you not to have to write down the lifetime declaration. So (at least) one less ' character in the code.
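
Concretely, with a toy Segment type (a sketch, not the code from the quoted example):

struct Segment<'a> {
    text: &'a str,
}

// '_ says "there is a borrow with some lifetime here, but it doesn't
// need a name in this signature":
fn describe(seg: &Segment<'_>) -> String {
    format!("segment: {}", seg.text)
}

fn main() {
    let s = String::from("abc");
    let seg = Segment { text: &s };
    println!("{}", describe(&seg));
}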

Would it have been better to use `a instead of 'a? I don't know. Then probably the next person would complain that it's hard to distinguish ` from ' or that it's harder to type on certain keyboard layouts.

Yes, you can write complicated looking code in Rust, but the common case is rather simple due to a set of default rules in lifetime and type inference.

The same thing applies to C. Complicated, confusing and clever macros, anyone? Or the way you read types in C? Which way do you read, left to right or right to left, where do you start and where do you flip directions?
http://unixwiz.net/techtips/reading-cdecl.html

>it also complicates colorization rules for an editor if it can no longer expect that every opening ' has a symmetric closing

Sure. Rust is not trivial to parse.

A lot of good stuff in there

Posted Feb 15, 2025 14:52 UTC (Sat) by jengelh (guest, #33263) [Link] (28 responses)

>Would it have been better to use `a instead of 'a

That is another character where a handful of other languages (sh, perl) and documentation-related formats have popularized symmetric use. So, personally, no, I would not use `.
The goal is not to make use of every symbol that's on the keyboard (hey, remember Perl?) and end up looking like a mathematician's scribblings, but to make a program that's approachable to others, as well as to oneself 10 years down the line. C# and Python's use of keywords appears like a nice idea, e.g. "f(ref int x)" and "for i in stuff", respectively.

A lot of good stuff in there

Posted Feb 15, 2025 15:19 UTC (Sat) by mb (subscriber, #50428) [Link] (26 responses)

Now we know what you would not do.
So..., how would you spell out lifetime annotations?

A lot of good stuff in there

Posted Feb 15, 2025 16:16 UTC (Sat) by wtarreau (subscriber, #51152) [Link] (25 responses)

> Now we know what you would not do.
> So..., how would you spell out lifetime annotations?

I guess one of the difficulties is that when you try to learn the language, you have to face both new concepts and new syntaxes at the same time. It's hard to get a clear idea of what you're doing when you're blindly copy-pasting stuff that you haven't developed reflexes for yet, and that definitely does not help you get up to speed with it.

And clearly, regarding some of the concepts, I spent two hours extending a hello world program to append a numerical argument passed on the command line, i.e. the equivalent of printf("hello world: %d\n", argc>1?atoi(argv[1]):0). I just gave up, being constantly told by the compiler that I was doing bad stuff. It did encourage me to try different things, which is great, but each thing I tried didn't work and at some point I was going in loops. It's quite discouraging, because I spend my time telling gcc to shut up when it doesn't know, and here I felt that the compiler was thousands of times more rigid and extremist. I can hardly see a use case where this could bring me anything except pain.

A lot of good stuff in there

Posted Feb 15, 2025 16:28 UTC (Sat) by intelfx (subscriber, #130118) [Link]

> I spent two hours extending a hello world program to append a numerical argument passed on the command line, i.e. the equivalent of printf("hello world: %d\n", argc>1?atoi(argv[1]):0). I just gave up, being constantly told by the compiler that I was doing bad stuff. It did encourage me to try different things, which is great, but each thing I tried didn't work and at some point I was going in loops. It's quite discouraging, because I spend my time telling gcc to shut up when it doesn't know, and here I felt that the compiler was thousands of times more rigid and extremist.

Yes, that is expected. You need to learn the language, which does not end at learning the syntax. This means aligning your mental model with the language, learning some new habits, and unlearning some of the old ones.

There would be no point in Rust if it was just a clone of C with a more inscrutable syntax. Rust is valuable *precisely* because it represents more than just C with an inscrutable syntax.

> I can hardly see a use case where this could bring me anything except pain.

This does not mean such a use case does not exist. This is precisely the "hubris trap" that so many high-profile Linux developers and maintainers are falling into.

A lot of good stuff in there

Posted Feb 15, 2025 16:30 UTC (Sat) by mb (subscriber, #50428) [Link] (16 responses)

>I can hardly see a use case where this could bring me anything except pain.

Rust forces you to not code the same bug as in your C code. The one where argv[1] is not a numeric string.
To me that is a good thing.

A lot of good stuff in there

Posted Feb 15, 2025 16:40 UTC (Sat) by wtarreau (subscriber, #51152) [Link] (15 responses)

> Rust forces you to not code the same bug as in your C code. The one where argv[1] is not a numeric string.
> To me that is a good thing.

The thing is, there are plenty of cases where I *know* it's valid, e.g. because it has been validated a few lines before one way or another, or because it's guaranteed by contract in an API or anything. Instead I feel like the extra difficulty diverts me from doing the thing I was trying to do, and that constantly working around the compiler has serious chances of making me introduce bugs the same way I occasionally introduce some by trying to shut up an inappropriate gcc warning by rewriting code differently and making a mistake while the initial one was correct. I have strong doubts about the validity of all rust code in the wild in 10 years. Sure we'll see fewer overflows, but control bugs are very present as well and I suspect will be harder to spot (and sometimes even fix).

A lot of good stuff in there

Posted Feb 15, 2025 16:56 UTC (Sat) by mb (subscriber, #50428) [Link]

>The thing is, there are plenty of cases where I *know* it's valid, e.g. because it has been validated a few lines before one way or another,

Sure. And then you just have to tell the compiler about that knowledge. It's often as simple as calling unwrap(). Or sometimes even simpler by throwing in a question mark at the end.
And if you were wrong with that assumption, it won't UB on you like C does.

In C you would also at least have to add a comment about where your assumption that an error can't happen comes from. In Rust you can just write that comment into code. For example with expect("The caller shall handle this").
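
A small sketch of the three spellings (the messages are placeholders):

use std::num::ParseIntError;

fn parse_checked(s: &str) -> Result<u32, ParseIntError> {
    let n: u32 = s.parse()?; // `?` hands the error back to the caller
    Ok(n)
}

fn main() {
    // unwrap(): "I know this can't fail here"; panics cleanly if it does
    let a: u32 = "42".parse().unwrap();

    // expect(): same, but the assumption is written down in the message
    let b: u32 = "7".parse().expect("validated a few lines above");

    println!("{} {} {:?}", a, b, parse_checked("13"));
}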

>and that constantly working around the compiler has serious chances of making me introduce bugs

This "constantly working around the compiler" is just because you didn't learn the language.

Everybody who knows Rust knows that the compiler is extremely helpful when dealing with errors.
It often suggests what exactly to change.
It explains exactly what is wrong and often even provides a link to an article about the error with examples of the coding error and examples how to fix it.

> I have strong doubts about the validity of all rust code

Based on what? Your non-existent Rust experience?

A lot of good stuff in there

Posted Feb 15, 2025 17:27 UTC (Sat) by intelfx (subscriber, #130118) [Link] (13 responses)

> The thing is, there are plenty of cases where I *know* it's valid, e.g. because it has been validated a few lines before one way or another, or because it's guaranteed by contract in an API or anything

Sure. Then someday preconditions change, API contracts get violated (accidentally, or perhaps maliciously), and the CVE database grows a new entry.

If Rust forces you to code defensively, then that is a *very good thing*. That's the entire point.

A lot of good stuff in there

Posted Feb 15, 2025 18:52 UTC (Sat) by wtarreau (subscriber, #51152) [Link] (12 responses)

> If Rust forces you to code defensively, then that is a *very good thing*. That's the entire point.

But are you sure that it's not C that forces you to code defensively instead, given that you have no safety belt and you're on your own. If on the opposite I say "it compiled so it's safe", I quickly learn not to care anymore about defensive approaches.

A lot of good stuff in there

Posted Feb 15, 2025 19:00 UTC (Sat) by mb (subscriber, #50428) [Link]

Why do you need additional defensive approaches, if the compiler proved that something is safe?

A lot of good stuff in there

Posted Feb 15, 2025 19:19 UTC (Sat) by intelfx (subscriber, #130118) [Link] (9 responses)

Well, I guess it's on the programmer to not become complacent towards other classes of bugs that Rust does not inherently protect from (e.g., logic bugs). Yes, we're seeing this in Rust with unwrap() proliferation. But you can write bad code in any language, that's hardly news.

I would argue that if we are honestly trying to say "C keeps us on our toes by virtue of being such a mess", it's not a good picture either way.

A lot of good stuff in there

Posted Feb 16, 2025 22:22 UTC (Sun) by mathstuf (subscriber, #69389) [Link] (8 responses)

> I would argue that if we are honestly trying to say "C keeps us on our toes by virtue of being such a mess", it's not a good picture either way.

I understand the argument in the context of automating life-risky processes[1]. But the difference here is that Rust *isn't* guaranteeing "all bugs are gone". It is guaranteeing "the compiler will tell you when your code has *a class of problems*" so that you *can* focus on the logic bugs rather than having to think about "is this index calculation going to blow us up later?" Anything that has software for life-risky bits better have some level of logic bug detection (e.g., comprehensive test suite, formal verification, etc.), but this is needed *regardless* of the language unless one is actually doing their coding in Idris or something.

[1] A self-driving car that is wrong 50% of the time keeps the driver "in the loop" more effectively than a 75% accurate self-driving car, but once you hit some threshold, there's a bad overlap between human complacency with how accurate it *usually* is and hitting the gap in the AI behavior.

A lot of good stuff in there

Posted Feb 16, 2025 22:33 UTC (Sun) by intelfx (subscriber, #130118) [Link] (1 responses)

> Anything that has software for life-risky bits better have some level of logic bug detection (e.g., comprehensive test suite, formal verification, etc.), but this is needed *regardless* of the language unless one is actually doing their coding in Idris or something.

That seems to align with what I was trying to say? If the only thing that keeps the codebase working is "bleed-over" of attention from memory correctness to logic correctness, that's not a sustainable practice either way (== we should instead be using tests and other stuff for logic correctness).

A lot of good stuff in there

Posted Feb 16, 2025 23:07 UTC (Sun) by mathstuf (subscriber, #69389) [Link]

Yes, I was not disagreeing with you. Sorry, should have been clearer that I was expanding on it.

A lot of good stuff in there

Posted Feb 16, 2025 22:58 UTC (Sun) by jengelh (guest, #33263) [Link] (5 responses)

>A self-driving car that is wrong 50% of the time keeps the driver "in the loop" more effectively

A mechanism with 50% error rate is a mechanism that quickly gets disabled by the user. ("Fine, I'll do it myself" is effectively keeping the user in the loop, I give you that.)

A lot of good stuff in there

Posted Feb 17, 2025 8:29 UTC (Mon) by mathstuf (subscriber, #69389) [Link] (4 responses)

Yes. I'm aware. I'm trying to clarify that there's a worrying gap where humans feel that they don't need to pay attention because it is "good enough" but ends up failing in the corner cases that aren't all that uncommon. Manufacturers try to absolve themselves of any culpability by saying "but we alerted the user" without considering the "we trained them to be able to ignore what the car is doing" situation built up over time.

A lot of good stuff in there

Posted Feb 17, 2025 8:36 UTC (Mon) by mathstuf (subscriber, #69389) [Link] (3 responses)

Actually, I have relevant experience here. My latest vehicle has some of these fancy new tech feature things in it. One is that it makes a noise when a traffic camera that's in its maps is coming up. However, all it does is play a noise, and a small icon appears on the navigation part of the map. The distraction of trying to figure out what the car was telling me, and only noticing the icon after multiple instances, definitely felt less safe than it needed to be. Most alerts show up right in front of the driver, but those that don't are probably best left alone.

A lot of good stuff in there

Posted Feb 17, 2025 10:42 UTC (Mon) by Wol (subscriber, #4433) [Link] (2 responses)

The other worry is "the car knows best". My car is most definitely NOT considered a "self-driving" car, yet whenever I engage cruise control the car takes control of the speed, and completely ignores the saying "it's a limit not a target", and police advice "the speed limit is not a statement that that speed is safe".

It will (less so now) select illegal speeds, accelerate inappropriately, do all sorts of things. I tend to refer to it as a "granny racer" given that it tries to go as fast as possible at every opportunity, yet is excessively cautious at other times. It will accelerate, and then when it gets itself into trouble it will scream at me to brake ...

Cheers,
Wol

A lot of good stuff in there

Posted Feb 17, 2025 11:18 UTC (Mon) by mathstuf (subscriber, #69389) [Link] (1 responses)

The cruise control in my '89 Cherokee was like that: race up a hill, brake going down (I never used it because of that). Fair for something that cannot sense anything beyond its direct measurement sensors. I find it much better to go a *little* fast down the hill and lose some speed on small hills. I also had a good sense of how hard to press the accelerator to avoid unnecessary shifting. It is a lot better with adaptive cruise control today which can keep a set distance with the car in front at least. It still wants to race up/brake down to some extent, but at least it will do so within the constraints of traffic.

A lot of good stuff in there

Posted Feb 17, 2025 12:58 UTC (Mon) by Wol (subscriber, #4433) [Link]

Yup. Our previous car was "set the speed and that's what it goes at". Friends have got cars with adaptive control, which slows down to suit conditions. My car has got predictive control, which can be simply translated as "do the speed limit at every possible opportunity". Imho that's blankety-blank dangerous! It could even be considered illegal, seeing as there's a requirement on the driver to "be in control at all times", which they're clearly not if the car is programmed to accelerate unexpectedly with no driver input whatsoever.

And it breaks a whole bunch of safe UI guidelines as well, such as the one that says a driver must be able to *override* the acceleration!

Cheers,
Wol

A lot of good stuff in there

Posted Feb 16, 2025 11:41 UTC (Sun) by khim (subscriber, #9252) [Link]

> If on the opposite I say "it compiled so it's safe", I quickly learn not to care anymore about defensive approaches.

Not possible. Not even remotely close. If you proved to the compiler that something is safe then you have proved to yourself that it's safe, as well.

Precisely because the compiler is dumb and doesn't understand many “subtle” ideas – you have to “dumb down” the proof of correctness that you have in your head to a level that the compiler can understand.

But the compiler is also tireless and persistent. You can't convince it to like your code by writing a long but incorrect explanation – as may happen with a human reviewer.

> But are you sure that it's not C that forces you to code defensively instead, given that you have no safety belt and you're on your own.

Nah. The most “defensively programmed” code that I've seen was in Java or Python. Often “uselessly defensively programmed”. Because there you don't need to prove anything to anyone: any nonsense that you may write will be “memory safe” (by virtue of the virtual machine that decouples you from the hardware), and then you have to program defensively and check everything 10 times, because nothing (except these redundant checks) protects the integrity of your code from bugs.

A lot of good stuff in there

Posted Feb 15, 2025 19:57 UTC (Sat) by dralley (subscriber, #143766) [Link] (6 responses)

> I spent two hours extending a hello world program to append a numerical argument passed on the command line, i.e. the equivalent of printf("hello world: %d\n", argc>1?atoi(argv[1]):0). I just gave up, being constantly told by the compiler that I was doing bad stuff.

Even without an existing understanding of Rust, that probably should not have taken 2 hours to do. I'm not sure what it was exactly that you were struggling with but it wasn't a "Rust problem". This was just as easy for me to write as your C code likely was for you.

> use std::env::args;
>
> fn main() {
>     let arg = args().nth(1).expect("must provide an argument");
>     let number: u32 = arg.parse().expect("argument must be a number");
>
>     println!("hello world: {}", if number > 1 { number } else { 0 });
> }

A lot of good stuff in there

Posted Feb 15, 2025 21:19 UTC (Sat) by dralley (subscriber, #143766) [Link] (1 responses)

I misread the C and as a result wrote code that does something slightly different. Here is Rust that actually does what your C does.

> use std::env::args;
>
> fn main() {
>     let number: u32 = if args().len() > 1 {
>         args().nth(1).unwrap().parse().expect("argument must be a number")
>     } else {
>         0
>     };
>
>     println!("hello world: {}", number);
> }

There's nothing complex going on here, and Rust doesn't make it any more difficult than C. It's marginally more verbose, but only because Rust forces you to make potential failure points more explicit.

A lot of good stuff in there

Posted Feb 16, 2025 1:02 UTC (Sun) by MrWim (subscriber, #47432) [Link]

Using match to avoid one unwrap:
use std::env::args;

fn main() {
    let number: u32 = match args().nth(1) {
        None => 0,
        Some(x) => x.parse().expect("nan"),
    };
    println!("Hello, world! {number}");
}

A lot of good stuff in there

Posted Feb 16, 2025 10:38 UTC (Sun) by adobriyan (subscriber, #30858) [Link] (3 responses)

In some sense this specific problem is a Rust problem -- doubling down on Iterators at main() time.

Had they made main() be main(arg0: &[u8], arg: &[&[u8]]) or equivalent, it would have been more obvious what to do.

A lot of good stuff in there

Posted Feb 16, 2025 14:19 UTC (Sun) by khim (subscriber, #9252) [Link]

> Had they made main() be main(arg0: &[u8], arg: &[&[u8]]) or equivalent, it would have been more obvious what to do

Sure, but why would they do something that doesn't work correctly? On POSIX there is no guarantee that arg0 exists, and a Windows program doesn't even receive a list of command-line arguments, but one single array of UCS-2 characters (no, not UTF-16 as people often think)!

In essence that's an example of what Rust does: instead of “easy” it usually picks “correct”.

A lot of good stuff in there

Posted Feb 16, 2025 14:29 UTC (Sun) by intelfx (subscriber, #130118) [Link]

> Had they made main() be main(arg0: &[u8], arg: &[&[u8]]) or equivalent, it would have been more obvious what to do.

Yes. "For every complex problem, there's a solution that is simple, neat, and wrong."

argv is a pointer to global, mutable data. Attempting to represent it as a Rust reference is completely incorrect with respect to Rust aliasing semantics. The Rust standard library goes through some contortions to wrap argc/argv into a memory-safe abstraction, and an iterator is more-or-less the best way one can do it. Google "rust why args is an iterator" for details.
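
In practice, if slice-style access is what one wants, it's one collect() away (a minimal sketch):

use std::env;

fn main() {
    // args() is an iterator; collecting it once gives indexed access
    let args: Vec<String> = env::args().collect();
    let n: u32 = args
        .get(1)
        .map(|s| s.parse().expect("argument must be a number"))
        .unwrap_or(0);
    println!("hello world: {}", n);
}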

A lot of good stuff in there

Posted Feb 16, 2025 14:52 UTC (Sun) by intelfx (subscriber, #130118) [Link]

The cynic in me wants to say that all of this confusion stems precisely from this fallacious premeditated state of mind that "Rust is worthless." In other words, you are going from the answer to the question. You have this premeditated answer that "Rust is worthless," and you try to imagine the rest of the world that would fit the answer.

Indeed, if Rust is worthless, then you can just compare it piece-by-piece to C and every idiom where C is "easier" than Rust means that C is the "winner," because Rust has no added value (by postulate) and therefore the "easier" thing wins.

However, this is a fallacy. If Rust had been just a clone of C with a worse syntax, then it would indeed be worthless, but that's not the case. Rust is valuable precisely because it is *not* a clone of C with a worse syntax. It is a different language, built on different concepts and abstractions, chosen for their *value*, and those concepts and abstractions necessitate different idioms to realize that value.

A lot of good stuff in there

Posted Feb 15, 2025 15:30 UTC (Sat) by mbunkus (subscriber, #87248) [Link]

It's a matter of balancing terseness with easy-to-read code. You could easily extend your argument to replace all mathematical operators with function names, ban array subscript operators in favor of the same, and use "begin" & "end" instead of curly braces for scopes.

No matter where a language ends up on the spectrum, you certainly cannot satisfy everyone. Just a couple of hours ago someone complained about having to write "mut" for mutable variables in Rust, as they found it too… I don't know, tedious, I guess.

A lot of good stuff in there

Posted Feb 15, 2025 15:59 UTC (Sat) by wtarreau (subscriber, #51152) [Link] (5 responses)

> fn from_segments(segments: &[&Segment<'_>]) -> Option<Version> {

Yeah exactly that thing. I'm sorry, but in this line for me there are far too many characters that I cannot map to something I understand. That's what I meant by "coding with smileys". At some point it becomes too hard for me to figure out which ones work together or individually and what designates what. I just can't mentally parse such strings. It reminds me a bit of when I had fun with the Brainfuck language a long time ago, for those who know it. And yes, it troubles me that this character doesn't have a corresponding closing one.

A lot of good stuff in there

Posted Feb 15, 2025 16:34 UTC (Sat) by intelfx (subscriber, #130118) [Link] (2 responses)

> Yeah exactly that thing. I'm sorry, but in this line for me there are far too many characters that I cannot map to something I understand.

Then you need to spend effort to *understand* it, and only then return to critique.

Put it more bluntly: the fact that the language in question has concepts that you do not understand (some of which come with their own syntax) is not a language problem, it is a you problem.

C is not the pinnacle of programming languages, and therefore it is unreasonable to assume that a person who is proficient in C is automatically proficient in everything else (from which it follows that if "something else" gives you trouble, then it must be a problem with "something else," because you assume that you are proficient by default). If you disown this fallacious assumption, everything else falls into place.

A lot of good stuff in there

Posted Feb 15, 2025 16:57 UTC (Sat) by Wol (subscriber, #4433) [Link] (1 responses)

> Put it more bluntly: the fact that the language in question has concepts that you do not understand (some of which come with their own syntax) is not a language problem, it is a you problem.

Sounds like you've never worked designing a fail-safe system ...

Okay, when you're dealing with complex systems you can easily find yourself in a lose-lose situation, which is almost certainly the case here, but to say it's wtarreau's problem is just plain rubbish. "Which way is your brain wired?".

This must have been the '80s or '90s, a magazine article. The writer was working on a GUI program, and went to see - wonder of wonders - how the users were actually using it! And one user, demonstrating something, made a mistake and swore "I always get that wrong, what's wrong with me!"

But the lightbulb is that that bit was NOT CONSISTENT. Nine places in ten, what the user did was the RIGHT thing. That one place was badly designed, and here it was the WRONG thing to do. And this is why saying "syntax shouldn't be a problem" is bullshit. If your brain is programmed to think that you delimit strings with ", SQL is a damn nightmare! I'm seriously NOT used to using ' as a delimiter - I think FORTRAN used ", DataBASIC is happy with " (actually, it doesn't care, ", ', \, whatever ...) - being forced to use ' just grates on every level!

It's like forcing an emacs power user to use vi - NOTHING WORKS. And you can't (easily) reprogram them, because everywhere else is reinforcing the emacs bindings, so vi is the odd one out, and it just grates - *all* the time.

There's quite a few places like this in programming languages. They have different heritages, they use the same syntax to mean different semantics, and are easy for people from the same heritage to learn while being a nightmare for others. And that's why I said it's a lose-lose situation. If 90% of your work is with C-style languages, every time you use a language with a different style you have to learn it from scratch. AGAIN. And AGAIN. Because your normal life is DE-programming all the new stuff you've learnt.

Cheers,
Wol

A lot of good stuff in there

Posted Feb 15, 2025 17:03 UTC (Sat) by mb (subscriber, #50428) [Link]

Sure. And that is why Rust heavily adopts C syntax style where possible.

A lot of good stuff in there

Posted Feb 18, 2025 5:11 UTC (Tue) by raven667 (subscriber, #5198) [Link] (1 responses)

tldr; a rant (maybe I should put down the booze and go to bed ;-)

I don't know Rust either and am not a C developer, but I thank you for taking the time to be open and vulnerable (in that by admitting you don't understand you open yourself to critique in the nature of "you 'just' need to get good") about the learning process. You can often get the gist skimming a language you don't know; e.g. the other day I was curious how Oxidized network device backup handled a few platforms, and I could follow along well enough even though I couldn't do a variable assignment in Ruby without checking the documentation. That's not really true of Rust: I've seen a number of code examples posted of various things and I don't get the gist at all. Even though I do believe the people who are very passionate about the benefits of Rust, I'd need to spend a _lot_ more focused attention to be able to even skim-read it, let alone write in it.

Not every person is willing to learn something radically new that makes them a beginner again where they put themselves in a position to need help from or get critiqued for beginner mistakes by less senior people who have less experience in their area of expertise, when they can continue to be the trusted expert that other people look up to. Someone a long time ago noted that progress happens "one funeral at a time", but many people do learn and grow with the changing world; it's *notable* when they don't.

I said in another thread that what is needed is kernel-focused, targeted training to on-board seasoned C developers who are thinking in machine or assembly language and in how the underlying hardware works, because the mental models are quite different, and the kinds of tasks/operations needed by an application developer using the standard library and by an OS kernel developer using the kernel-internal library are vastly different. Kernel developers are going to need to understand how to design and audit an unsafe block to bang on memory and implement something other code can use, in a way that an application developer can just abstract away by using a standard library function or some common third-party crate. This training would be best delivered where everyone can see it, on the mailing list or some other mechanism that puts it where the people are, because expecting busy people to get hyped enough for totally self-directed training is not a reasonable assumption; you have to meet them where they are if you want to effectively advocate for change. Loudly proclaiming "I'm right, you're wrong, get with the program!" *even* and *especially* when you ARE right and they ARE wrong isn't an effective mechanism to get them to understand why and then advocate for the change themselves, especially when you don't give people time to _understand_ and just want them to believe you.

Like, if you want to convince wtarreau that Rust is the best thing since beer (and given the number of thoughtful competent people who are passionate about it, it probably is) then you need to *show* them, and understand that they are a _beginner_ at Rust but not at computing, and make your examples accordingly. If you get feedback that something was difficult to understand then you should *believe* them and not argue with them about their own experience (that is entirely foolish), but instead interrogate why it might be difficult for someone with a different background, empathize with them and try to learn from their experience. Just saying the same things again, louder, as if they didn't hear you the first time is insulting and not effective: "Oh now that you implied I must be stupid if I didn't understand this right away I totally get it, thanks Internet person!" said no one ever.

A lot of good stuff in there

Posted Feb 18, 2025 13:58 UTC (Tue) by kleptog (subscriber, #1183) [Link]

> Not every person is willing to learn something radically new that makes them a beginner again where they put themselves in a position to need help from or get critiqued for beginner mistakes by less senior people who have less experience in their area of expertise, when they can continue to be the trusted expert that other people look up to.

But that's one of the fundamental features of teamwork: no-one understands the whole of the kernel in its entirety. Any time a kernel maintainer is interacting with some other part of the kernel they're the "beginner that needs help". This is *normal* and *expected* and if you can't do that, it's going to make working on any large project a challenge.

As you noted, other threads here did clarify the situation somewhat: it's not the syntax (which is C-like with additions) nor even the standard library (which isn't used in the kernel anyway) but the fact that programming Rust requires the programmer to state their intent rather than simply be a "high level assembler". That's not a question of some training, that's like asking a chemist to learn biology. You're working at a completely different level of abstraction.

Not really sure if there is an easy solution here.

A lot of good stuff in there

Posted Feb 15, 2025 17:11 UTC (Sat) by dskoll (subscriber, #1630) [Link] (3 responses)

fn from_segments(segments: &[&Segment<'_>]) -> Option<Version> {

Wow. And all the criticism about Perl being line noise... huh...

A lot of good stuff in there

Posted Feb 15, 2025 17:37 UTC (Sat) by Phantom_Hoover (subscriber, #167627) [Link]

Given that Rust’s rival here is C, a language with function pointer syntax that people have earnestly tried to explain by telling you to read it in a spiral, the nested rich type syntax doesn’t look so bad.

A lot of good stuff in there

Posted Feb 15, 2025 18:09 UTC (Sat) by mbunkus (subscriber, #87248) [Link]

As a C++ programmer for over 30 years I find that pretty easy to read, to be quite honest. The only thing that C++ definitely doesn't have here are lifetimes; everything else even uses the same sigils for the same (references, function return types) or corresponding concepts (template parameters). I am at an "advanced beginner" level in Rust, yes, but this type of syntax is very, very close to what's already out there in other languages.

A lot of good stuff in there

Posted Feb 16, 2025 0:40 UTC (Sun) by himi (subscriber, #340) [Link]

> fn from_segments(segments: &[&Segment<'_>]) -> Option<Version> {

Compared to something like Python (sans type annotations) there's definitely a lot of syntactical complexity, but the only things that aren't in C syntax are the bits associated with the angle brackets, and the only thing that's not in C++ is the lifetime annotation. And importantly, the syntactic elements that are used are conceptually very close to what they mean in C/C++ - & means a reference to something, Foo<Bar> means a generic type Foo containing a Bar, both of which match the usage from C/C++, everything else is pretty much a one-to-one match.
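
For comparison, a self-contained toy version of that signature, with stand-in Segment and Version types (not the real crate's):

struct Segment<'a> {
    text: &'a str, // borrowed text, hence the lifetime parameter
}

#[derive(Debug)]
struct Version {
    major: u32,
}

// &[...] is a slice of references, Segment<'_> has an elided lifetime,
// Option<Version> is "a Version or nothing":
fn from_segments(segments: &[&Segment<'_>]) -> Option<Version> {
    segments
        .first()
        .and_then(|seg| seg.text.parse().ok())
        .map(|major| Version { major })
}

fn main() {
    let s = Segment { text: "3" };
    println!("{:?}", from_segments(&[&s]));
}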

The structure of the function definition is obviously different, but if you can't adjust to that change ("oh, the return type is at the end of the signature instead of in front of it") then I think it's probably reasonable to say the problem isn't the programming language so much as the programmer . . .

A lot of good stuff in there

Posted Feb 15, 2025 11:58 UTC (Sat) by Wol (subscriber, #4433) [Link] (24 responses)

> I also did all the things you mentioned. I also started with Basic and so on.
> I also have learnt many languages with very different syntax and look and feel. That's why I really don't see why Rust would be so different here. In fact it's very similar in look and feel to C. It's much closer to C than say Perl or Python or even C++.

I notice you didn't mention machine code there ... a fundamental part of wtarreau's experience. How close is Rust to machine code? How close is your brain's wiring to wtarreau's? Which computer (or mathematical) language does your brain think in? (As opposed to speaking in: when I'm in France, I'm translating English to French; when I'm in Germany, I'm (mostly) thinking in German.)

Cheers,
Wol

A lot of good stuff in there

Posted Feb 15, 2025 12:16 UTC (Sat) by pizza (subscriber, #46) [Link] (16 responses)

> I notice you didn't mention machine code there ... a fundamental part of wtarreau's experience. How close is Rust to machine code?

There's another aspect there -- how well does machine code translate _back_ into Rust?

When you're debugging hardware or trying to analyze a random (optimized-to-the-moon with no debug symbols and _definitely_ no corresponding source code) binary, it turns out that C much more closely represents what the hardware sees and acts upon.

A lot of good stuff in there

Posted Feb 16, 2025 1:10 UTC (Sun) by himi (subscriber, #340) [Link] (15 responses)

> When you're debugging hardware or trying to analyze a random (optimized-to-the-moon with no debug symbols and _definitely_ no corresponding source code) binary, it turns out that C much more closely represents what the hardware sees and acts upon.

For highly optimised machine code with no corresponding debugging information or source code, there's very little chance that back-translating to /any/ higher level language will give you the original human-written code - optimising compilers just don't work that way. Any non-trivial code will be transformed quite drastically from the original source - it'll be a lot simpler in some ways (because the compiler optimised away a bunch of unnecessary stuff) as well as a lot more complex (because the compiler was doing stuff like unrolling loops or doing automated vectorising and what not); it may even be significantly different structurally (reordering blocks is allowed as long as it doesn't change the behaviour within the constraints of the language spec). Occasionally it just won't even be there at all! Back-translation won't undo those changes, it'll just give you one possible higher level language implementation of whatever came out of the final optimisation pass.

If the target is C that implementation may resemble something that a human would write, though it's probably not going to be anywhere near idiomatic human written C code; if you're targeting Rust it'll probably be /very/ different to what a human would write. But in either case the result should be human /readable/ (if the back-translation is of a similar quality), which is surely what matters in this scenario?

And you'll probably get Rust code that's far simpler than what a human would write, though almost certainly a lot more verbose - it'll be missing all the bits of the language that make it more human friendly to write. I imagine the result would be a lot easier for a C programmer to read, in fact.

A lot of good stuff in there

Posted Feb 16, 2025 10:46 UTC (Sun) by khim (subscriber, #9252) [Link] (10 responses)

> For highly optimised machine code with no corresponding debugging information or source code, there's very little chance that back-translating to /any/ higher level language will give you the original human-written code - optimising compilers just don't work that way.

They do, if you add enough optimization-disabling options and write your code in close enough to machine code fashion.

> If the target is C that implementation may resemble something that a human would write

And now we go back to “insults”. People who use C like a portable assembler are humans, too!

That's why I refuse to bend to the demands of SJWs: if someone wants to be offended – said someone would find a way to be offended, no matter how much effort I put into walking on eggshells… thus I don't even try: if you don't want to talk to me – you can, it's always your choice.

> But in either case the result should be human /readable/ (if the back-translation is of a similar quality), which is surely what matters in this scenario?

No. The “we code for the hardware” folks quite literally learn to write C in a fashion that makes it possible for them to easily go from C to assembler and back. They don't just write arbitrary C code.

Is it possible to write Rust in such fashion?

Surprisingly enough yes – but that would be highly non-idiomatic Rust full of unsafe. Going from C to this kind of Rust wouldn't help anyone.

> I imagine the result would be a lot easier for a C programmer to read, in fact.

Maybe, but that's not what the “we code for the hardware” crowd wants. They want to continue to write code in their portable assembler – and the new folks are not cooperating!

They try to bring in C++ or Rust, or even just switch to a different style in C, which would tear the code away from the machine code output and bring it closer to something that a human would write (in your words)… but for a decade or two the “we code for the hardware” guys have managed to browbeat the new guys into writing the type of code that they like (remember that Linux developers insist on rewriting all the code in drivers “to match Linux standards”).

This means that the new generation, en masse, doesn't even want to touch such code: it's written in a language that they, supposedly, “know”… but it doesn't look even remotely close to what they would write! And learning to write code that's more verbose and less understandable from their POV (because it's closer to their intent… and farther from machine code)… why would they want that?

That's where the source of the drama lies – and explains why Linux maintainers are almost all “old guys” who started working with Linux decades ago… the new generations want to “throw away” precisely the things that the “old guys” find valuable and important!

Then Rust arrives… and the very first thing it does… is deepen that schism: now the code that the “old guys” want is not just no longer what the “new guys” want to learn to write – it's now, very explicitly, marked as unsafe… something that you have to avoid, not something that you may want to embrace!

See where the core of that drama lies? It's much deeper than “old C guys don't want to learn Rust” or “new Rust guys don't want to respect old C guys”.

Rust has only just exposed it; it hasn't caused it.

And Linux written in C is their “last stand”, really. None of the more modern languages (be it C++, Rust, or Haskell with Python) can be turned into a form that would make their use as a “portable assembler” feasible.

That's what makes the whole story so bitter: when on one side you have guys who know, with 100% certainty, that the future belongs to them (the future may not be Rust, it could be Ada or Swift… but no matter what it is, the “we code for the hardware” approach won't be used) – and on the other side are guys “defending this tiny foothold, their homeland… that has been under attack for 30 years”… can you really imagine civil talks and mutual understanding in such a situation?

I am actually impressed by how well Linus and his underlings handle the situation: only two high-level resignations so far… that's far fewer than I had expected. But the real drama will happen later: when people like Theodore Ts'o and Christoph Hellwig realize that for all their resistance and open and covert sabotage… they couldn't stop the ocean from swallowing their tiny island… then we will see high-level resignations from the other side… and these are actually more threatening: when the “ocean” loses members it's not critical… it's the “ocean”… other people will come… it's cynical, but true… but when the “island” loses members… it can crumble and collapse! That's why Linus hasn't done anything to Christoph Hellwig, BTW: the loss of marcan may be unfortunate, but it isn't critical, for sure… the loss of hch wouldn't be critical, per se, but it may trigger a mass exodus of “old guys” too early, before the “ocean guys” are ready to pick up the slack.

A lot of good stuff in there

Posted Feb 16, 2025 13:37 UTC (Sun) by pizza (subscriber, #46) [Link] (6 responses)

> That's where the source of the drama lies – and explains why Linux maintainers are almost all “old guys” who started working with Linux decades ago… the new generations want to “throw away” precisely the things that the “old guys” find valuable and important!

I think you hit the nail on the head; thank you for writing all that up.

But over the course of my career, I've noticed that fewer and fewer (both proportionally and in absolute terms) know or even care about how the hardware actually works. Not just to make things work well/fast, but to be able to *debug* what's going on should something inevitably go wrong.

Even thirty years ago, these skills and interest were relatively rare, but today they're actively dumped on even as they're depended upon more than ever. Core infrastructure isn't sexy, but necessary -- It's the "I don't care about the plight of farmers, I get my food from the supermarket" attitude all over again.

A lot of good stuff in there

Posted Feb 16, 2025 14:12 UTC (Sun) by khim (subscriber, #9252) [Link] (3 responses)

> I've noticed that fewer and fewer (both proportionally and in absolute terms) know or even care about how the hardware actually works.

It's definitely true about “proportionally”, but I'm not sure it's true for “absolute terms”. There are enough people who want to know how the hardware actually works – or we wouldn't be getting so many emulators on /r/rust.

We had no trouble finding students who wanted to poke at bytecode and machine code generation for an internship, that's for sure!

They just don't want to think and worry about “how the hardware actually works” in every line of code they write!

> Not just to make things work well/fast, but to be able to *debug* what's going on should something inevitably go wrong.

But isn't it the same with other human endeavors? Once upon a time every car driver was a car mechanic, too. And early plane pilots were sure a plane without an open cockpit would never work, because one had to “feel” the air to successfully maneuver the plane!

Today… car mechanics and car designers and people who know how to build planes still exist… but most drivers and pilots don't really care about “how all that actually works”.

Why should software development be any different?

> Core infrastructure isn't sexy, but necessary -- It's the "I don't care about the plight of farmers, I get my food from the supermarket" attitude all over again

True. That describes the majority. But the trouble for Linux (and for “old timers” in general) comes from the other direction: it's one thing to “lift the lid” and understand what is happening when your program suddenly becomes 20x slower for no good reason (just a very recent experience where a bad interaction of SSE and AVX caused us precisely that… we certainly needed to poke at the generated code to see what exactly had changed and what exactly made everything so slow… filed the bug against clang, too… it should be included in version 21 – and that was done by former students) – and a completely different thing to write every line of code with a full understanding of what it will produce at the machine code level.

The first one is fun and interesting and important… the second one… no one from the “new generations” wants to do that! Every time someone like wtarreau asks “how am I supposed to translate this machine code to Rust”, the answer is always “well, there's unsafe and intrinsics and asm and other such tools for these exotic cases where that's needed”… they entirely miss the point wtarreau is making: someone may want to know that for every piece of code that they write!

But, ultimately, the Rust guys are right: writing code in a way that lets you always tell precisely what is generated from this or that line of source code was natural when compilers were extremely dumb and computers were extremely slow… today, even if the generated code is not always the best… does it really matter, if we don't have enough people to write anything else? Even the ones who do care about the generated machine code naturally assume that the work of writing correct code and the work of looking at what is happening at the machine code level are separate tasks; you don't do them simultaneously!

A lot of good stuff in there

Posted Feb 16, 2025 20:27 UTC (Sun) by wtarreau (subscriber, #51152) [Link] (2 responses)

I think you nailed the whole thing right, khim.

And it's true that as time passes and machines improve, some older optimizations are no longer relevant. For example, I took extreme care to avoid generating bsf/bsr on Atoms because they were slow as hell there, while a hand-written version was much faster, to the point of being visible in the end user's code. Nowadays first-gen Atoms have disappeared and that distinction can disappear as well. Similarly, there's a lot of legacy code around to deal with each generation of compiler misbehavior, like gcc 4.x doing stupid things with __builtin_expect(x, 1), turning x into an int and explicitly comparing it to the value 1! Some of the complexity of old code does come from the accumulation of all such bad stuff, and many of us have been happy to fix performance trouble caused by a single stupidity in a compiler or CPU that was costing 2-3% of total performance. And code cleanups over time allow us to get rid of that stuff, which can sometimes look like pieces of art, by the way, just no longer relevant art.

Some of us know that such problems are recurring and continue to want to be able to provide solutions to them. Nowadays, with massively multi-core CPUs, we're seeing extreme latencies caused by cache-line sharing, which happens very easily if you're not careful. I've gotten used to organizing my structs to avoid accidental sharing between areas that are supposed to be accessible from other threads and those which aren't, for example. And that's just one example. Sometimes you realize that a function call significantly degrades performance just because pushing the return pointer onto the stack forces cache writes: in parallel you're holding a lock that another CPU is trying to acquire, and since that lock is in the same cache line as the data you're manipulating, every time the other CPU wants to read the lock's state it causes a cache-line flush which, due to TSO, also forces the stack to be written. I'd feel terribly useless if I couldn't easily tweak all that when such problems come up.
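
A rough sketch of the struct-layout idea (shown in Rust here, though the concern is identical in C; the field names and the 64-byte line size are purely illustrative, not taken from any real code):

    use std::sync::atomic::{AtomicU64, Ordering};

    // Force a field onto its own cache line so a hot writer on one core
    // doesn't keep invalidating the line another core is spinning on.
    // The 64-byte line size is an assumption; real hardware varies.
    #[repr(align(64))]
    struct Padded<T>(T);

    struct Shared {
        hot_counter: Padded<AtomicU64>, // written constantly by one thread
        lock_word: Padded<AtomicU64>,   // polled by the other threads
    }

    fn main() {
        let s = Shared {
            hot_counter: Padded(AtomicU64::new(0)),
            lock_word: Padded(AtomicU64::new(0)),
        };
        s.hot_counter.0.fetch_add(1, Ordering::Relaxed);
        assert_eq!(s.lock_word.0.load(Ordering::Relaxed), 0);
    }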

And at the same time I totally understand that the vast majority of developers don't want to care about this. A lot of code doesn't need to be finely tuned, nor to perform under specially stressful conditions. But what I like in programming is precisely getting the most from the hardware and making sure my code scales as far as possible. Just as I wouldn't develop a web browser in C myself and would rather have someone do it in a more suitable language, there would probably be specific points on the critical path where that person would prefer to rely on low-level optimizations, and on someone doing my job in their preferred language. I think there are jobs for everyone, and when all developers realize that this should be complementary instead of a competition, we'll have made great progress!

A lot of good stuff in there

Posted Feb 17, 2025 12:58 UTC (Mon) by taladar (subscriber, #68407) [Link] (1 responses)

If you enjoy that sort of micro-optimization, could you maybe scratch that itch by providing tooling that detects those situations or writing compiler code that avoids them/optimizes them instead of writing code manually that does all those things? Or would you consider that less enjoyable?

A lot of good stuff in there

Posted Feb 17, 2025 19:40 UTC (Mon) by wtarreau (subscriber, #51152) [Link]

> If you enjoy that sort of micro-optimization, could you maybe scratch that itch by providing tooling (...)

There's no single rule for this; everyone has their own methods and intuitions depending on what they see and based on their experience. There's no magic, it's just called "development". Tools to help with this already exist and are widely used. "perf" is one of them; there's a reason it shows by default the instructions where you're spending time and also exposes hardware performance counters to figure out whether you're waiting for data too often, etc., and users of such tools expect to retrofit changes into their code to change the machine code's behavior and the execution pattern. It's not uncommon to gain several percentage points on a whole program's performance using perf when you're facing a scalability issue.

Another very popular tool is godbolt.org. Similarly, it shows you the resulting asm for your code, and allows you to try many compiler flavors, versions and architectures to verify if your workaround for a limitation is portable enough. It's precisely because a lot of developers care about this that such tools exist and are popular.

The C language is quite terrible for many reasons, and C compilers are particularly stubborn and will do everything they can not to help you. However, once you've figured out that you are instead the one supposed to help them produce better code, by giving them some hints about what you're trying to do, they turn out to show quite reproducible behavior, with the worst case being optimizations that simply degrade to the default case. In this situation it's often worth investing time to help them, even if only on certain platforms, when only some of them permit a given optimization. The gains are sometimes high enough to reduce the number of machines in production, and at that point it starts to count.

A lot of good stuff in there

Posted Feb 16, 2025 15:54 UTC (Sun) by dralley (subscriber, #143766) [Link] (1 responses)

C is not "how the hardware actually works". Even assembly is not exactly "how the hardware actually works" (but it's a lot closer than C).

C represents a very simplified model of how *some* hardware works. There has also been a lot of convergent evolution such that hardware is designed to work in such a way that C works well with it, because C and C-like languages are so important.

A lot of good stuff in there

Posted Feb 16, 2025 17:33 UTC (Sun) by pizza (subscriber, #46) [Link]

> C is not "how the hardware actually works". Even assembly is not exactly "how the hardware actually works" (but it's a lot closer than C).

Please read and respond to what I actually wrote.

Perhaps this is enough

Posted Feb 16, 2025 15:20 UTC (Sun) by corbet (editor, #1) [Link] (1 responses)

You are using "SJW" like a slur. Do not do that here.

You are (at great volume) doing something similar with this strawman you have created wherein kernel developers really want to be working in assembly. I don't know any kernel developers like that; this image is not really helpful to the discussion.

There are times when you have to know precisely what the hardware is doing: poking an MMIO region, configuring page tables, implementing lockless algorithms. Happily, Rust enables all of that, and everybody knows it. Beyond that, kernel developers understand the problems of premature optimization as well as anybody else.

Khim, you have posted 26 times (at last count) on this article; people are broadly tuning you out. Can I ask you, please, again, to give it a rest?

Perhaps this is enough

Posted Feb 16, 2025 15:27 UTC (Sun) by khim (subscriber, #9252) [Link]

> Can I ask you, please, again, to give it a rest?

Fine with me. It's not as if I can change the future; I can only show my vision of it and help someone prepare for it better… and I hope that those who wanted to do so saw something interesting for them. But I'm not the “great savior”; the ones who want to ignore the future may continue to do so until it arrives.

A lot of good stuff in there

Posted Feb 16, 2025 19:09 UTC (Sun) by branden (guest, #7029) [Link]

> I am actually impressed by how well Linus and his underlings handle the situation: only two high-level resignations so far…

Is anybody else reminded of "Only two remote holes in the default install, in a heck of a long time!"?

A lot of good stuff in there

Posted Feb 16, 2025 13:18 UTC (Sun) by pizza (subscriber, #46) [Link] (3 responses)

> For highly optimised machine code with no corresponding debugging information or source code, there's very little chance that back-translating to /any/ higher level language will give you the original human-written code

I'm not talking about "give the original human-written code" -- I'm talking about *what the machine sees and operates on*.

Because that is what the hardware is _actually_ acting upon, not what humans _think_ or _intend_ for the hardware to act upon.

Do you see the difference?

A lot of good stuff in there

Posted Feb 16, 2025 14:46 UTC (Sun) by khim (subscriber, #9252) [Link] (2 responses)

> Do you see the difference?

Well… sort of.

> what humans _think_ or _intend_ for the hardware to act upon.

But that's the only thing you care about 99.9% of the time! If your intent is correctly expressed… and the compiler accepted it… and correctly compiled it… then everything should “just work”.

> Because that is what the hardware is _actually_ acting upon

That should only matter when there's a bug somewhere, shouldn't it? It could be a bug in your understanding of the language rules, or maybe a bug in your description of the hardware requirements, or even a bug in the compiler (those exist too, sure)… but that's the rare exception, not the rule!

> I'm not talking about "give the original human-written code" -- I'm talking about *what the machine sees and operates on*.

But that can be drastically different, depending on the exact version of the compiler, what library you used, what optimizations were enabled and so on… why would you even care about that, if things work as expected?

Basically: what the “old school” uses as the basis, as the beginning of understanding of a language… the new generation puts at the very end… something to study and understand after you already know the language “inside out” and can use it; then, finally… it's time to “open the lid” and see how it interacts with the hardware.

But the thing that's really scary and… not sure what word is best… maybe “sickening” about that whole story: the Rust guys are not the weirdos… the Linux maintainers are!

Just think about it: ever since ALGOL 58, programming languages were developed entirely independently from the machine code, and then the hardware was made to better accommodate these high-level ideas; thus the question of “how would this high-level construct be implemented in hardware” was never the central question but more of a… detail of implementation, I guess.

Then the microprocessor revolution happened… and the Unix revolution happened… for two decades people had to deal with machine code and assembler – not because they liked it, but because there was no choice: their pitifully underpowered systems couldn't handle anything like ALGOL or LISP – and that gave us a generation or two of developers who think that knowing how their C code is translated into machine code is important… but that whole thing is a quirk of history!

Neither the people who originally developed computers and computer languages thought it was important (or should drive anything), nor do the people following after them think in that vein (computers are powerful enough that “opening the lid” on the machine code is a rare debugging tool and not something you deal with all the time)!

For a long time the “we code for the hardware” guys were needed because tiny microcontrollers couldn't be programmed in anything but assembler… but today, when your charger has a more powerful CPU than Apollo 11 did… that final bastion is crumbling, too!

A lot of good stuff in there

Posted Feb 16, 2025 14:59 UTC (Sun) by pizza (subscriber, #46) [Link] (1 responses)

> That should only matter when there's a bug somewhere, shouldn't it? It could be a bug in your understanding of the language rules, or maybe a bug in your description of the hardware requirements, or even a bug in the compiler (those exist too, sure)… but that's the rare exception, not the rule!

Not just bugs; it's the overwhelming norm when you *don't* have the source code and are trying to figure out WTF the binary is doing.

But even if it were just bugs, that doesn't change the need for that basic capability.

A lot of good stuff in there

Posted Feb 16, 2025 15:08 UTC (Sun) by khim (subscriber, #9252) [Link]

> Not just bugs; it's the overwhelming norm when you *don't* have the source code and are trying to figure out WTF the binary is doing.

Why would the relationship between machine code and C and/or Rust even matter in that case? You have the machine code, it works in a certain way, why involve high-level languages at all? Especially if said code could be hand-written assembly…

> But even if it were just bugs, that doesn't change the need for that basic capability.

It changes the direction: instead of trying to imagine what the source code that produced that output might have looked like – you can just look.

That essentially turns what you perceive as a “basic capability” into a “parlor trick that can be used to amuse people, but has no relevance to anything”.

P.S. I actually need to debug programs that I have no sources for pretty often at my $DAYJOB, but since they can be written in Java, C#, Go, or maybe even some homegrown language with a homegrown compiler… and are rarely written in C… I never had the luxury of back-translating low-level machine code into high-level code… maybe that's what made it easy for me to accept the impossibility of going back to C++ or Rust from the machine code. It's hard to mourn the loss of something you never had in the first place.

A lot of good stuff in there

Posted Feb 15, 2025 12:23 UTC (Sat) by mb (subscriber, #50428) [Link]

>I notice you didn't mention machine code there

I learnt to read and write assembly code alongside Basic.

>How close is Rust to machine code?

It's as close or as far away as C is.
Programming in C with machine code in mind is a myth.

C and Rust are both languages with a virtual abstract machine model.
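
(A small sketch of what that abstract machine means in practice: the language, not the CPU, decides what an overflowing add does, and with the default build settings the answer even differs between debug and release builds. Illustrative only.)

    // The abstract machine defines the meaning of this program; "what the
    // hardware does" with an overflowing add is not the question.
    fn main() {
        let x: u8 = 250;
        // let y = x + 10;          // with default settings: panics in a debug
        //                          // build, wraps in a release build
        let y = x.wrapping_add(10); // always wraps, by definition
        let z = x.checked_add(10);  // always detects the overflow
        println!("{y} {z:?}");
    }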

>When I'm in Germany, I'm (mostly) thinking in German

There's another option: to not think in a language at all.

A lot of good stuff in there

Posted Feb 15, 2025 14:24 UTC (Sat) by khim (subscriber, #9252) [Link] (5 responses)

> I notice you didn't mention machine code there ... a fundamental part of wtarreau's experience. How close is Rust to machine code?

Why does it even matter? The machine code generated by Rust is “good enough”. That's it. It's the same as with [modern] C.

> How close is your brain's wiring to wtarreau's?

Not too different, I suspect. I also passed through all these stages (except MS-DOS 2.11; I think the oldest one I used in the 20th century was MS-DOS 3.20, although I played with MS-DOS 2.11 in an emulator in the 21st century), but I haven't stopped there. When C passed the point where it was possible to pretend that you were dealing with a portable assembler, I embraced the duality: first you explain to the compiler what your program does – and then, perhaps, you check what the compiler actually produced.

The second step is very much optional, not critical. Most of the time I wouldn't cry if the compiler misunderstood me and generated suboptimal code.

> Which computer (or mathematical) language does your brain think in? (As oppose to speak in, like when I'm in France, I'm translating English to French. When I'm in Germany, I'm (mostly) thinking in German.)

Good analogy. Essentially: the time has come to stop thinking about our programs only in terms of machine code, or even primarily in terms of machine code. Again, to understand the whole context it's best to reread the story of Mel, the Real Programmer. Please do. Then you will understand what this whole schism between Rust “believers” and C “diehards” is all about.

Or at least that critical tidbit:

Mel loved the RPC-4000 because he could optimize his code: that is, locate instructions on the drum so that just as one finished its job, the next would be just arriving at the "read head" and available for immediate execution. There was a program to do that job, an "optimizing assembler", but Mel refused to use it.

"You never know where its going to put things", he explained, "so you'd have to use separate constants".

It was a long time before I understood that remark. Since Mel knew the numerical value of every operation code, and assigned his own drum addresses, every instruction he wrote could also be considered a numerical constant. He could pick up an earlier "add" instruction, say, and multiply by it, if it had the right numeric value.

I have no idea whether wtarreau or mb ever did tricks similar to what is described there… but I did. My first ever programmable device was a calculator handed down from my sister (she was already in college when I was in primary school… a 10-year difference). And I did all the tricks that the “story of Mel” talks about. The calculator had no drum, of course, but with 98 bytes (not kilobytes, not megabytes, but bytes) of program memory… you did what you could.

And then, on a real computer… with a whopping 16KiB of memory… I could finally forget about all that silliness.

Of course I can still recognize these tricks, and for some time afterward somewhat similar tricks were still employed, as “Jumping into the middle of an instruction is not as strange as it sounds” tells us… but today we no longer care about these things. Most of the time.

And Rust proponents simply assume that people have stopped thinking about these tricks. Completely.

For tricky cases, when, once in a blue moon, you may jump into the middle of an instruction or do something not expressible in a high-level language… sure. There's asm! for that.
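
To be concrete, that escape hatch looks roughly like this: an x86_64-only sketch that reads the timestamp counter, something with no portable high-level equivalent (illustrative, not from any real project):

    use std::arch::asm;

    // Read the CPU timestamp counter directly. x86_64 only; a sketch of
    // what dropping down to asm! looks like, not a recommendation.
    #[cfg(target_arch = "x86_64")]
    fn rdtsc() -> u64 {
        let lo: u32;
        let hi: u32;
        unsafe {
            asm!("rdtsc", out("eax") lo, out("edx") hi, options(nomem, nostack));
        }
        ((hi as u64) << 32) | lo as u64
    }

    #[cfg(target_arch = "x86_64")]
    fn main() {
        println!("tsc = {}", rdtsc());
    }

    #[cfg(not(target_arch = "x86_64"))]
    fn main() {}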

But most of the time? No. C was already unsuitable for that, why would anyone even try that with Rust?

But the trick is that not everyone believes that C is no longer suitable for that. If you add enough “disable optimizations” flags (which, to the guys who think like wtarreau, sound more like “stop breaking my code, damn it” flags) you may still pretend that you can write “portable assembler” in C… but how long will that keep working?

It doesn't work with Rust, at least: Rust doesn't offer any such optimization flags – precisely to discourage that “thinking in terms of machine code” (again: you want machine code – there's asm! for you). There are some flags, but they don't change the language and don't make illegal constructs legal.

Rust users assume it doesn't matter: if you stop thinking about your program in terms of machine code and accept the fact that "What The Hardware Does" is not What Your Program Does… then going from C to Rust is not too hard… but that's precisely the step that people are struggling with.

Ultimately, though, the Rust people are right: first there will be a mandate to only use safe languages, then all these flags that make it possible to keep pretending that pointers are just integers will go away… it will take a decade, or maybe two, but eventually C will turn into another Fortran, pure legacy… OS kernels will be rewritten and people will accept that… and if you accept and embrace that future, then learning to think in terms of a high-level language will be needed anyway… why delay the inevitable?

But the “we code for the hardware” developers continue to fight that future, tooth and nail! They just will not accept it! Especially the ones who are kernel maintainers: their jobs are more than secure, more or less guaranteed for the next 10 years or so… why should they do anything to accommodate these pesky newcomers? It's very explicit, isn't it: telling people that what they have been doing for their entire career is hopelessly broken is not a selling point, especially when they know better. They know what they have been doing gets the job done.

P.S. It's also very funny how they say that Rust is not acceptable because it breaks their habits and then turn around and celebrate an extra-strong Rust (for reference: SPARK picked up its pointer-safety story from Rust and thus, of course, inherited all the restrictions and limitations… in fact it's stricter than Rust… significantly stricter) – but I guess when someone is in the “they are coming to take my job” position, simple logic takes a back seat and anything that may delay the inevitable becomes fair game…

P.P.S. The real question about memory-safe language adoption will be the question of liability. Once the usual “no liability” disclaimers are deemed illegal… insurance companies will kill C very quickly, simply by offering sharply different prices for things written in C and things written in other, safer, languages. Whether this will push Rust to be adopted, or whether some other language will be deemed more suitable, remains to be seen, though. It would be funny if Ada saw a resurgence and, instead of Rust, people were forced to adopt something even stricter and even more limited. But my current bet is still on Rust, because it strikes a better balance between flexibility and strictness.

A lot of good stuff in there

Posted Feb 15, 2025 16:35 UTC (Sat) by wtarreau (subscriber, #51152) [Link] (4 responses)

> and if you accept and embrace that future, then learning to think in terms of a high-level language will be needed anyway… why delay the inevitable?

It's not on purpose. I mean, the more I read the articles, the more I feel like there are different jobs, and some people don't understand why others just don't want to switch jobs. It's exactly like here in Europe, where you're considered to have failed if you do technical work all your life instead of becoming a manager managing people despite having zero skills for that.

I think that there are people who are at ease with thinking at low levels, the same way other people are at ease in mechanics and a lot of similar stuff, and there are people more at ease at higher levels and relying on tools to abstract the underlying levels. Both are equally fine and needed. But just like no python developer is asked to learn to program PIC or AVR in assembly, python developers do not expect those programming the microcontroller in their alarm clock to learn python package dependencies. I don't see why there isn't the same type of cross-respect between people doing lower-level stuff in C and those doing higher-level stuff in Rust.

> The real question about memory-safe language adoption will be the question of liability. Once the usual “no liability” disclaimers are deemed illegal… insurance companies will kill C very quickly, simply by offering sharply different prices for things written in C and things written in other, safer, languages.

Most of the work being done in C these days is only extending stuff that already exists; it's not about writing entirely new stuff. That makes dropping the language quite hard, as is currently seen in the kernel by the way.

A lot of good stuff in there

Posted Feb 15, 2025 16:45 UTC (Sat) by mb (subscriber, #50428) [Link] (1 responses)

> That makes dropping the language quite hard

Yes.
But that's not what is happening in most projects.

The common approach is that new code shall be written in a safe language.
The old C code is well tested and there are not many bugs. It can stay as-is for the next 3 decades.
Bugs and security issues primarily happen in new code.

A lot of good stuff in there

Posted Feb 15, 2025 17:06 UTC (Sat) by khim (subscriber, #9252) [Link]

> The old C code is well tested and there are not many bugs. It can stay as-is for the next 3 decades.

This only works with code that's not supposed to change. But the Linux kernel does change. And quite actively, at that.

So the only viable alternatives are to rewrite it in some other, safer, language – or to replace it.

Replacement will happen if the Rust rewrite doesn't take hold: there will be fewer and fewer maintainers, doing anything will become harder and harder and, at some point, one of the numerous projects aiming to replace Linux will be able to catch up.

Of course, for that to happen Rust for Linux has to fail first… and despite the occasional drama we don't see it failing yet.

A lot of good stuff in there

Posted Feb 15, 2025 17:02 UTC (Sat) by khim (subscriber, #9252) [Link] (1 responses)

> But just like no python developer is asked to learn to program PIC or AVR in assembly

Are you sure? I rather suspect that a significant percentage of CircuitPython users go on to “program PIC or AVR in assembly”.

They just start with Python because it's easier to learn.

> python developers do not expect those programming the microcontroller in their alarm clock to learn python package dependencies

Why not? That's a valuable skill if you are a Python user.

> I don't see why there isn't the same type of cross-respect between people doing lower-level stuff in C and those doing higher-level stuff in Rust.

It's easy to see why: we are, quite literally, talking about the ones who will be retired and the ones who will replace them. They don't “work on different levels” like users of MicroPython and users of full-blown CPython.

> That makes dropping the language quite hard, as is currently seen in the kernel by the way.

Not really. “Dropping the language” is only hard when you are not willing to drop the whole package.

But give companies enough incentive – and they will be ready to drop the whole thing and rewrite it from scratch.

The interesting question is not whether programs written in C will be replaced by programs written in memory-safe languages (I rather suspect that a few years down the road the pressure to stop using non-memory-safe languages will be too hard to ignore) but how exactly that will happen: will the Linux kernel be rewritten in Rust, or will it be replaced by something else? And if it isn't rewritten in Rust but replaced – will the replacement use Rust or something else (Ada? Swift?)?

Those are the real questions. The big issue that hurts people like marcan is that technical changes destined to happen in a “one funeral at a time” fashion are very hard to expedite. You cannot pressure the majority, which means that the initial stages of replacement happen very slowly. The mass exodus and resignation of the “old timers” has to wait until their replacements are numerous enough; otherwise you risk destroying the whole thing, if they leave before someone is ready to replace them.

A lot of good stuff in there

Posted Feb 15, 2025 21:12 UTC (Sat) by pizza (subscriber, #46) [Link]

>> But just like no python developer is asked to learn to program PIC or AVR in assembly
> Are you sure? I rather suspect that a significant percentage of CircuitPython users go on to “program PIC or AVR in assembly”.

CircuitPython is a tiny minority of the overall Python ecosystem.

In fact, I'd wager the $7 I have in my pocket right now that proportionally far more C developers are likely to write PIC or AVR asm than CircuitPython users.

Insofar as directly writing PIC or AVR assembly is necessary to begin with.

A lot of good stuff in there

Posted Feb 15, 2025 20:02 UTC (Sat) by roc (subscriber, #30627) [Link] (1 responses)

My story started pretty similarly. Began with BASIC, learned Z80 and 8086 machine code, Turbo Pascal, then Turbo C and C++. Spent decades writing C++ in increasingly larger projects. Picked up some Prolog, Haskell, ML, Java, Python and JS along the way. Was delighted to adopt Rust because it fixed the glaring issues with C and C++. Less delighted lately when I changed jobs and had to resume using C++ and use Go.

When I write C++ or Rust I often think about the code that will be produced. This still matters at times, and I have side projects (rr, Pernosco) that require deep understanding of binary code. Last week I wrote code to decode some AArch64 instructions! But when working with large systems you have to think about a lot of things, and "producing the machine code I want" is usually not the most important thing.

> why does it care about who owns a pointer to a location

Every C programmer working on non-toy software has to care about who owns the memory each pointer points to. If you don't, you drown in memory leaks, crashes, and security vulnerabilities. Rust's premise is, given the importance of this and that you have to think about it all the time, the compiler should care about it too so it can make sure you get it right.
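
A toy sketch of that premise, with purely illustrative types and names:

    // Ownership is part of the signature, so "who frees this" is checked by
    // the compiler rather than by convention.
    struct Buffer(Vec<u8>);

    fn consume(buf: Buffer) {          // takes ownership from the caller
        println!("freeing {} bytes", buf.0.len());
    }                                  // buf dropped (freed) here, exactly once

    fn inspect(buf: &Buffer) {         // borrows; the caller keeps ownership
        println!("peeking at {} bytes", buf.0.len());
    }

    fn main() {
        let b = Buffer(vec![0u8; 16]);
        inspect(&b);                   // fine: just a borrow
        consume(b);                    // ownership moves into consume()
        // inspect(&b);                // would not compile: b was moved above
    }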

A lot of good stuff in there

Posted Feb 15, 2025 21:21 UTC (Sat) by pizza (subscriber, #46) [Link]

> Every C programmer working on non-toy software has to care about who owns the memory each pointer points to.

The destination of a pointer isn't always memory. Heck, I'd go so far as to say it rarely is.


A lot of good stuff in there

Posted Feb 14, 2025 21:11 UTC (Fri) by ferringb (subscriber, #20752) [Link]

> There is no way to fix either C and C++ without breaking backwards compatibility. At that point it's more productive (and with far less technical debt going forward) to use a new language that has been developed from scratch with safety explicitly in mind.

Well phrased, and exactly my experience.

A lot of good stuff in there

Posted Feb 14, 2025 2:53 UTC (Fri) by jmalcolm (subscriber, #8876) [Link] (2 responses)

I do not think that we need everybody to abandon Linux. In my view, free software does not have to be this kind of zero sum game.

All we need is to build a vibrant community, including talented contributors, around something new.

Anecdotally, it seems that about 70% of all new kernel initiatives are written in Rust these days. Statistically, it feels like the next "Linux" is likely to be written in Rust. Perhaps it is Redox. Perhaps not.

Microsoft apparently wrote an entire OS in C# and that was before they added a bunch of features that would make that easier. I love C# but it is not an obvious choice to write a kernel in. It is maybe a better choice than a lot of people think though. It can compile to completely native code. It gives you low-level memory control and precise bit layout control if you want it. The garbage collector can be completely avoided if needed. And anywhere you have the luxury of forgoing such low-level control, it is an excellent, safe, and productive language with a crazy comprehensive standard library. It would be interesting to see a C# OS where the VES/CLR (.NET version of the JVM) was a feature of the kernel or micro-kernel.

A lot of good stuff in there

Posted Feb 14, 2025 15:22 UTC (Fri) by excors (subscriber, #95769) [Link]

> Microsoft apparently wrote an entire OS in C# and that was before they added a bunch of features that would make that easier.

Their original attempt wasn't quite C#:

> Singularity is written in Sing#, which is an extension to the Spec# language developed in Microsoft Research. Spec# itself is an extension to Microsoft’s C# language that provides constructs (pre- and post-conditions and object invariants) for specifying program behavior. ... Sing# extends this language with support for channels and low-level constructs necessary for system code.
("An Overview of the Singularity Project" - https://www.microsoft.com/en-us/research/wp-content/uploa...)

and it wasn't the entire OS:

> Counting lines of code, over 90% of the Singularity kernel is written in Sing#. While most of the kernel is type-safe Sing#, a significant portion of the kernel code is written in the unsafe variant of the language. The most significant unsafe code is the garbage collector, which accounts for 48% of the unsafe code in Singularity. Other major sources of unsafe Sing# code include the memory management and I/O access subsystems. Singularity includes small pockets of assembly language code in the same places it would be used in a kernel written in C or C++, for example, the thread context switch, interrupt vectors, etc. Approximately 6% of the Singularity kernel is written in C++, consisting primarily of the kernel debugger and low-level system initialization code.
("Singularity: Rethinking the Software Stack" - https://www.microsoft.com/en-us/research/wp-content/uploa...)

(I think unsafe C# is basically normal C# plus raw pointers, and the compiler won't try to verify correctness. So it's quite similar to unsafe Rust.)

Singularity was succeeded by Midori which reportedly used "vanilla C#" instead of Sing#, but I guess it would still have a similar proportion of native code. There's an interesting series of posts about Midori at https://joeduffyblog.com/2015/11/03/blogging-about-midori/

The key idea of Singularity was that process isolation could be enforced via the high-level language's type safety and memory safety rules, instead of using MMU hardware, and that let them achieve the low-overhead IPC needed for a microkernel. But I think that idea is seriously undermined by Spectre; it turns out you do need support from hardware to provide security boundaries between processes, and that removes one of the main benefits of writing the whole system in a managed language.

A lot of good stuff in there

Posted Feb 15, 2025 0:25 UTC (Sat) by khim (subscriber, #9252) [Link]

> Statistically, it feels like the next "Linux" is likely to be written in Rust.

Yes, in 20 years OS kernels will all be written in memory-safe languages. But I'm not 100% sure they will all be in Rust and, more importantly for the current discussion, the question is whether Linux will be among them.

A lot of good stuff in there

Posted Feb 14, 2025 12:48 UTC (Fri) by smurf (subscriber, #17840) [Link] (3 responses)

> automated, large scale code translation from C / C++ into Rust.

Can I please have some of the drugs you're on?

Seriously. The whole point of Rust is that it enforces guarantees which you can't even express in C (memory safety, lifetimes, locking rules, no NULL pointers, …), so where would your autotranslator get those from? Thin air?

… well you could wrap the whole kernel in a big "unsafe" block and then rewrite the resulting mess to safe Rust piece by piece, but if you have to do that anyway you might as well start with the original C version and save everybody the mountain of bugs that such autotranslation would invariably introduce.
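
To make the kind of guarantee I mean concrete, here is a toy sketch (illustrative only): the data lives inside the lock, and "maybe absent" is spelled out in the type instead of hiding behind a NULL pointer.

    use std::sync::Mutex;

    // The counter lives *inside* the lock: there is no way to touch it
    // without taking the Mutex first. Names are illustrative.
    struct Counter {
        inner: Mutex<u64>,
    }

    impl Counter {
        fn bump(&self) {
            let mut guard = self.inner.lock().unwrap(); // must lock to get access
            *guard += 1;
        }                                               // lock released here
    }

    // "No NULL pointers": absence is spelled out in the type, and the caller
    // is forced to handle the None case before using the reference.
    fn find_even(xs: &[u64]) -> Option<&u64> {
        xs.iter().find(|&&v| v % 2 == 0)
    }

    fn main() {
        let c = Counter { inner: Mutex::new(0) };
        c.bump();
        println!("{:?}", find_even(&[1, 3, 4]));
    }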

Automated translation from C to Rust

Posted Feb 14, 2025 13:52 UTC (Fri) by farnz (subscriber, #17727) [Link]

Automated large-scale code translation from C to Rust is likely to look a lot like C2Rust with a set of rewrite rules that target obvious deficiencies in the output Rust (in the same way that cargo clippy --fix has rewrite rules that target obvious deficiencies in human-written Rust).

As Rust code goes, the output of C2Rust is pretty awful today. The open question is whether it's going to be easier to manually rewrite all the C that benefits from rewriting into good Rust, or to put together Rust → Rust rewrite rules that fix the faults in C2Rust's output; if fixing C2Rust's output is easier than rewriting C into Rust, then people will prefer automatic translation.

Note, too, that there's the possibility of semi-automated rewriting of C2Rust's output - have the human identify useful properties (e.g. "this function's return value is an OwnedFd, not a libc::c_int") and have the rewrite tool do conversions of code based on what you've just told it (in this case, changing the return type to OwnedFd, then adjusting all function parameters that "obviously" borrow to BorrowedFd and those that look like they consume it to OwnedFd, leaving unclear cases as RawFd for the human to fix).
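
As a rough sketch of the kind of signature change meant here (Unix-only toy code; the rewrite tooling itself is hypothetical):

    use std::fs::File;
    use std::os::fd::{AsFd, BorrowedFd, OwnedFd, RawFd};

    // What mechanical translation from C tends to give you: a bare integer,
    // with ownership of the descriptor left to convention.
    fn open_raw_style() -> RawFd {
        -1 // placeholder
    }

    // After telling a rewrite tool "this returns an OwnedFd": the caller now
    // owns the descriptor, and it is closed automatically when dropped.
    fn open_owned_style() -> std::io::Result<OwnedFd> {
        Ok(OwnedFd::from(File::open("/dev/null")?))
    }

    // A parameter that merely uses the descriptor becomes a BorrowedFd,
    // making the "does not close it" contract explicit in the signature.
    fn use_fd(fd: BorrowedFd<'_>) {
        let _ = fd;
    }

    fn main() -> std::io::Result<()> {
        let _ = open_raw_style();
        let fd = open_owned_style()?;
        use_fd(fd.as_fd());
        Ok(())
    }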

Of course, none of this will happen unless interested parties make it happen.

A lot of good stuff in there

Posted Feb 14, 2025 13:57 UTC (Fri) by butlerm (subscriber, #13312) [Link] (1 responses)

First, obviously one would not use automatic translation of anything without careful manual review and testing to determine whether the translation introduced any flaws at all. The translation would have to be perfect – or rather correct – in every way that matters. Otherwise, in any serious application people might suffer, be injured, or even die. None of this “commit a fix and hope for the best, and release a new version next week if it doesn't work”. That is not engineering at all; it is an embarrassment.

Second, you do not seem to have a great deal of confidence in the potential for semantic analysis of source code – code that necessarily follows strict correctness constraints like the ones used in the Linux kernel – to identify the properties that you express explicitly in Rust and which, in my opinion, ought to be expressible in any modern systems programming language.

The constraints required for the Linux kernel to function correctly could themselves be expressed in a model or meta-language designed for expressing such constraints, and that model could then be used to read between the lines, so to speak, about what is actually going on with locking and a considerable number of other things in any given piece of source code.

Computer science is a rather new field compared to something like physics, chemistry, or materials science and in some ways is still in the stone age. Computer Automated Software Engineering isn't used nearly as much as it could be, and for some reason people do a lot less of it than they did as recently as three decades ago. Not sure why that is, but solving problems like the ones the Linux kernel is facing seems like it would benefit from a lot more of it.

A lot of good stuff in there

Posted Feb 14, 2025 17:33 UTC (Fri) by Cyberax (✭ supporter ✭, #52523) [Link]

> Computer science is a rather new field compared to something like physics, chemistry, or materials science and in some ways is still in the stone age.

Real computer science is now more than 70 years old. That's about the same age as modern electrodynamics was in 1940 (Maxwell's equations were formalized in 1861). There are really no excuses about "it's a new field" anymore.

By now, applied programming is a well-established area. We have a set of best practices and things to avoid, and we will likely even be getting government regulations to establish consumer safety.

A lot of good stuff in there

Posted Feb 16, 2025 20:20 UTC (Sun) by timrichardson (subscriber, #72836) [Link] (2 responses)

I wonder: if in the future we have technology that can accurately rewrite old C/C++ in Rust, would we also, at the same time, have technology that can rewrite it in modern, safe C and C++?

A lot of good stuff in there

Posted Feb 17, 2025 0:17 UTC (Mon) by butlerm (subscriber, #13312) [Link]

As far as I know it is not there yet in either case, but it seems to me to be within the realm of possibility, and for a code base the size of the Linux kernel it might best be done with a static semantic-analysis and translation program custom-tailored to the job - one which has the behavior necessary for translating or rewriting Linux kernel source code hard-coded if necessary. That would probably be less work than making a general-purpose tool, even though the latter would of course have more general utility for other substantial C and C++ code bases.

A lot of good stuff in there

Posted Feb 17, 2025 13:38 UTC (Mon) by taladar (subscriber, #68407) [Link]

If we are ever able to treat code at a level that automated, I doubt we would want to use any of the existing programming languages as what would effectively be a new intermediate language that basically nobody other than the developers of the automatic systems looks at. The trade-offs for such an intermediate language would be very different from those made for a programming language used by humans.

A lot of good stuff in there

Posted Feb 14, 2025 2:21 UTC (Fri) by jmalcolm (subscriber, #8876) [Link]

In my view, the ambition of the RedoxOS team is to be the "total rewrite" that you are talking about - not only of Linux, but of GNU as well.

From the Redox web page, they are aiming to "be a complete alternative to Linux and BSD" that offers "source compatibility with Linux/BSD programs".

"complete alternative" sounds like a thrown gauntlet to me.

That said, while Linux source code compatibility is a goal for Redox, it actually seems more like a rewrite of Minix. If Redox succeeds, it will be the ultimate vindication for Andrew Tanenbaum and his famous "Linux is obsolete" flame-war with Linus. For years we have all concluded that Linus won that one. Who knows, maybe a microkernel will win in the end.

A lot of good stuff in there

Posted Feb 14, 2025 1:36 UTC (Fri) by jmalcolm (subscriber, #8876) [Link]

The RedoxOS team is there for you.

A lot of good stuff in there

Posted Feb 14, 2025 12:32 UTC (Fri) by pizza (subscriber, #46) [Link]

> It might come as a surprise to you, but people don't like to be yelled at or be blocked in progress for no good reason.

Those performing the yelling or blocking usually have a _very_ different view of what constitutes a "good reason" versus the recipient.

(That doesn't make either party necessarily correct, but IME the recipient is _usually_ "wrong")

A lot of good stuff in there

Posted Feb 13, 2025 17:03 UTC (Thu) by dobbelj (guest, #112849) [Link] (9 responses)

People who weaponize social media to get their way aren't 'good people'. Any community is better off without those kinds of people.

There's plenty of blame to go around here

Posted Feb 13, 2025 18:10 UTC (Thu) by dralley (subscriber, #143766) [Link] (7 responses)

I didn't think Marcan's Mastodon post was helpful to the situation, but I can't get on board with singling him out either.

There has been and continues to be a lot of bad behavior in kernel-land, and it was IMO the wrong move for leaders to step in and chastise him without actually addressing the issue at the root of this kerfuffle.

There's plenty of blame to go around here

Posted Feb 13, 2025 20:41 UTC (Thu) by ferringb (subscriber, #20752) [Link] (6 responses)

https://archive.is/uLiWX is the relevant post. And yep, someone snapped and didn't deliver the perfect response for how to address it. Turns out devs are human; I'm sympathetic to his post, even if I recognize it's counterproductive (which he clearly realized, too).

As to the patch that kicked it off - https://lwn.net/ml/all/20250108122825.136021-3-abdiel.jan... . Even if you don't know Rust, you should be able to read it.

Either introducing a centralized abstraction that drivers use, or (Hellwig's directive) having every driver try to muddle this out for itself... one of those is good engineering. The other I lack an explanation for, especially when the person arguing for it doesn't have to fricking maintain it and the R4L contract is "Rust can be broken basically at will".

I really wish people would read the patches or pull up stuff like the above; it's easy to say "dude's pulling drama" until you start pulling at the technical side and going "wait, wtf?".

There's plenty of blame to go around here

Posted Feb 14, 2025 8:49 UTC (Fri) by sima (subscriber, #160698) [Link] (5 responses)

This wasn't the post that spurred my reply. There was a separate one backhandedly calling out a bunch of kernel maintainers as religiously motivated sabotage, one of them working in the drm subsystem. That's why I stepped in; it impacted drm pretty clearly. The relevant archive.is link is this:

https://archive.is/rESxe

There's plenty of blame to go around here

Posted Feb 14, 2025 9:14 UTC (Fri) by dottedmag (subscriber, #18590) [Link] (2 responses)

Heh, that's an interpretation I have never ever thought about: I thought he was implying that his list of uncooperative maintainers contains people with names like Christian and Christopher.

I'm sorry, but calling it «religiously motivated» is applying optics from a different country, epoch or society, like calling someone out for slavery if they mention Slavic people.

There's plenty of blame to go around here

Posted Feb 14, 2025 14:55 UTC (Fri) by Wol (subscriber, #4433) [Link]

> I'm sorry, but calling it «religiously motivated» is applying optics from a different country, epoch or society, like calling someone out for slavery if they mention Slavic people.

Don't confuse religion with Religion. Don't confuse "a way that works" with "the one true way". And kernel maintainers who believe in "The One True Way of Kernighan and Ritchie" are definitely Religious with a very big R.

On the other hand, there's probably a lot of them who believe in the Standard of C (not sure if it's C89, C11, C18, Gnu or Brian) "because it works for me". And they're religious with a small "r". If a Rustacean moved in next door, they wouldn't care.

I'm somewhat Religious about databases as other people will very definitely attest - Relational/SQL is the work of the devil :-) - not really, it does have SOME good points. The *maths* is great ... not much else, though.

Cheers,
Wol

There's plenty of blame to go around here

Posted Feb 14, 2025 15:23 UTC (Fri) by marcan (guest, #103032) [Link]

> I thought he was implying that his list of uncooperative maintainers contains people with names like Christian and Christopher.

That is exactly what I implied. I started collecting a (private, as I stated in the post, "not for public consumption") list of frustrating kernel maintainers, and was amused by the fact that the first 3 were names like that. Then I made an (admittedly tasteless) joke about it. I thought it was clear it was a joke (triple question marks...), and then edited in the /s when some people didn't get the intent.

There's plenty of blame to go around here

Posted Feb 14, 2025 10:09 UTC (Fri) by intelfx (subscriber, #130118) [Link]

> There was a separate one backhandedly calling out a bunch of kernel maintainers as religiously motivated sabotage, one of them working in the drm subsystem.

Really? You may or may not consider the general act of calling someone out on social-adjacent media as "conduct unbecoming" (that's not a question that I'm going to opine on in this message), but seriously considering this kind of thick sarcasm as some sort of religiously motivated hate speech is probably the most disingenuously uncharitable interpretation of someone's words I have *ever* seen.

There's plenty of blame to go around here

Posted Feb 14, 2025 13:45 UTC (Fri) by valderman (subscriber, #56479) [Link]

Even without the /s, the "oh no it's a christian conspiracy" bit is obviously a joke, and English is not even my first language. That you're seriously trying to use the post WITH the /s added as proof that Hector is out there persecuting christian kernel developers does not reflect well on you at all.

I don't think Hector's Mastodon posts were terribly constructive, but it's pretty clear that this is not the "toxic outsider baselessly attacks innocent kernel maintainers" situation that you and some others are trying to make it out to be.

A lot of good stuff in there

Posted Feb 14, 2025 22:59 UTC (Fri) by raof (subscriber, #57409) [Link]

I'm not sure. My memory is that the kernel code of conduct came after a long pressure campaign with a significant social media component, and I think that has been an improvement.

A core subsystem maintainer made it clear he was unilaterally vetoing Rust in the kernel¹ for non-technical² reasons. That *has* to be resolved, but I don't think the kernel community really has a way to resolve that within the normal processes.

Absent this sort of external pressure, it seems what people expected to happen was that Linus would eventually ignore the NAK and merge the patches, and everyone would try to ignore the fact that there's a core maintainer working directly against an agreed project goal. That's a terrible state for the community to be in!

¹: as most non-trivial uses of Rust would need to go through his subsystem

²: or, at least, no technical arguments that weren't already raised and considered before merging the initial Rust support

Big picture

Posted Feb 13, 2025 17:01 UTC (Thu) by Phantom_Hoover (subscriber, #167627) [Link] (33 responses)

Marcan’s one of those people whose grievances all look individually reasonable, but when you add them all up it suddenly paints an unreasonable picture of a world that consists of one flawed but decent person trying to do the right thing, and a legion of bastards, abusers and two-faced snakes arrayed against them. I hope he can find a hobby that’s less personally stressful, and I hope Asahi can find leadership that handles interpersonal friction better.

Big picture

Posted Feb 13, 2025 18:44 UTC (Thu) by proski (subscriber, #104) [Link] (29 responses)

I would normally agree, but the Asahi project is uniquely positioned to be difficult to maintain.

Asahi runs on expensive hardware that is marketed for its features. It's no surprise that some users feel entitled to have all features work for the money they paid for the hardware. (In practice, the limitations can be mitigated by using different cables or by adding inexpensive hardware such as a USB headset.)

The Rust on Linux is also a known pain point. We know about it from other LWN stories.

Big picture

Posted Feb 13, 2025 20:39 UTC (Thu) by Phantom_Hoover (subscriber, #167627) [Link] (28 responses)

That’s kind of my point though: a lot of people handle the same or similar problems without feeling personally aggrieved to the extent Marcan has been. I mean, half of his examples of ‘entitled users’ are literally just straightforward questions about missing features! He says he was getting into exhausting conflicts back in the Nintendo homebrew scene as well, and eventually it becomes more likely that this is happening because of his own personality than because of everyone else’s.

Big picture

Posted Feb 13, 2025 20:47 UTC (Thu) by koverstreet (✭ supporter ✭, #4296) [Link] (27 responses)

He also sounds like someone who's been overworked and stressed out from being highly invested for a long time without a lot of real help.

I think we can show those people some understanding and appreciation for the work they've done, instead of just dismissing their grievances. If things get to the point where they quit, we all lose.

And the kernel community is not always a particularly welcoming place, some of his grievances do sound quite real.

A substantial fraction of the comments I see any time something like this comes up are just wanting to sort things into good vs. bad, winners vs. losers; "oh he screwed up/acted out therefore it's fine for him to leave or be shown the door", and I find those particularly unhelpful.

Big picture

Posted Feb 14, 2025 10:26 UTC (Fri) by Phantom_Hoover (subscriber, #167627) [Link] (26 responses)

The thing is, when you extend the same charity to all the countless people Marcan accuses of behaving awfully, the picture gets a lot more ambivalent.

Big picture

Posted Feb 14, 2025 12:40 UTC (Fri) by dralley (subscriber, #143766) [Link] (25 responses)

This is more or less where I'm at. It feels like just about everyone shares some level of culpability here. Frustrated and overburdened people get touchy, defensive and self-centered and take their frustrations out on other frustrated people.

The thing is, on a broader level marcan is still entirely correct that remaining insular and refusing to cooperate with new people is just going to result in a death spiral. The kernel needs a pipeline of new dedicated long term contributors to survive, but the culture and the process seems to do a good job of scaring them away or burning them out.

Big picture

Posted Feb 14, 2025 13:19 UTC (Fri) by pizza (subscriber, #46) [Link] (20 responses)

> The thing is, on a broader level marcan is still entirely correct that remaining insular and refusing to cooperate with new people is just going to result in a death spiral. The kernel needs a pipeline of new dedicated long term contributors to survive, but the culture and the process seems to do a good job of scaring them away or burning them out.

I've brought this up before here.

In nearly every profession or field of study, the expectation is that the new folks must learn and understand the whats, hows, and (most importantly) whys of the way things are being done.

Some professions require literally a decade (or more) of study and apprenticeship. Others may have harsh and gruelling training regimes -- and these traits (and the emergent cultures) are known in advance by those considering these professions.

"But it's too haaaaaard" simply does not fly. Do you want excellence, or not?

It turns out that folks are not actually interchangeable; some lack the physicality, skills and/or temperament to succeed. Not everyone has what it takes to be a doctor or pilot. Not everyone is going to keep up with one of the most successful (and important, and performance-critical) distributed software engineering efforts of all time.

And that's okay.

(BTW, this isn't to say that improvements aren't possible -- just that they happen slowly and incrementally, unless forced by a major externality. But "bend over backwards to accommodate the new folks" is rarely a good reason.)

Big picture

Posted Feb 14, 2025 13:57 UTC (Fri) by kleptog (subscriber, #1183) [Link] (1 responses)

> Some professions require literally a decade (or more) of study and apprenticeship. Others may have harsh and gruelling training regimes -- and these traits (and the emergent cultures) are known in advance by those considering these professions.

We used to also regularly beat children with the idea it would help them grow up into better adults/build character/that kind of thing. That idea is considered ridiculous these days, but I'm sure many people still believe it.

Sure, training for a profession doesn't have to be easy, but it also doesn't have to be harder than necessary.

Big picture

Posted Feb 14, 2025 15:20 UTC (Fri) by Wol (subscriber, #4433) [Link]

> Sure, training for a profession doesn't have to be easy, but it also doesn't have to be harder than necessary.

And as someone with medical qualifications (no I am not medically qualified, nor trained), it also should not involve filling trainees with propaganda that bears little relationship to reality. All this training can seriously hinder the spread of good practice.

Take it from someone who has been at the wrong end of arrogant doctors, doctors who are well meaning but ignorant, doctors who can't do a good job because they can't communicate ... and all of whom are people who would almost certainly be horrified if they realised the harm they'd done.

And most of it is down to the Religious Dogma (or Politics - equally bad) instilled in professions with long apprenticeships. But we see that everywhere in life: the powerful want to hang on to power.

Cheers,
Wol

Big picture

Posted Feb 14, 2025 14:08 UTC (Fri) by dralley (subscriber, #143766) [Link] (14 responses)

> In nearly every profession or field of study, the expectation is that the new folks must learn and understand the whats, hows, and (most importantly) whys of the way things are being done.

> Some professions require literally a decade (or more) of study and apprenticeship. Others may have harsh and gruelling training regimes -- and these traits (and the emergent cultures) are known in advance by those considering these professions.

In nearly every profession or field of study, there is a reasonably large body of "documentation" about how and why things work the way they do; it's not all kept as oral history dictated by the elites. Or else those elites at least regularly undertake some form of mentorship or "advisor" relationship to help the next generation.

As was discovered last summer, even being asked to document or explain subtle details of the existing behavior was apparently too much for some maintainers, even though some of those maintainers couldn't even agree with each other about those same subtle details. Some people seem like they just want to be left alone in their sandbox and never asked to explain anything by anyone "beneath" them.

Those kinds of people certainly exist in all fields, but nobody likes working with them very much.

By the way, none of this should be "bending over backwards". The lack of documentation on API semantics is an actual problem with practical consequences.

Big picture

Posted Feb 14, 2025 14:40 UTC (Fri) by koverstreet (✭ supporter ✭, #4296) [Link] (10 responses)

Ask NASA why they don't just build more Saturn Vs.

And they document more than anyone.

These problems are not unique to our profession.

Big picture

Posted Feb 14, 2025 14:48 UTC (Fri) by dralley (subscriber, #143766) [Link] (2 responses)

Of course not; I don't claim that the problem is unique to programming or to the kernel, nor that documentation would be sufficient by itself.

Just that having low-bus-factor elites that don't document things *and* won't help mentor new developers / maintainers is kind of a problem for the long-term health of any project. Responding with "RTFM" (while having a manual that actually answered the most basic questions one could have) would be an improvement on the status quo in some cases.

Big picture

Posted Feb 14, 2025 14:52 UTC (Fri) by koverstreet (✭ supporter ✭, #4296) [Link] (1 responses)

> Just that having low-bus-factor elites that don't document things *and* won't help mentor new developers / maintainers is kind of a problem for the long-term health of any project. Responding with "RTFM" (while having a manual that actually answered the most basic questions one could have) would be an improvement on the status quo in some cases.

Oh, that I'd agree with.

I spend most of my available time on IRC, and I actively tell new people working on the code "ask me if you're blocked on something, it's my job to get you unblocked". There does have to be a way for new people to get involved and learn.

And maintainers need mentorship, too...

Big picture

Posted Feb 14, 2025 16:50 UTC (Fri) by branden (guest, #7029) [Link]

'I actively tell new people working on the code "ask me if you're blocked on something, it's my job to get you unblocked".'

That is the essence of good engineering management. I laud you for recognizing it and practicing it.

Big picture

Posted Feb 14, 2025 16:02 UTC (Fri) by excors (subscriber, #95769) [Link] (1 responses)

> Ask NASA why they don't just build more Saturn Vs.

Probably because it doesn't provide the capabilities that anyone has wanted since the 1970s (it's no good for launching satellites or reaching the ISS or Mars), and it cost >10x more to build than a modern rocket using modern tools and materials, and its safety was much lower than would be acceptable nowadays.

Big picture

Posted Feb 14, 2025 16:52 UTC (Fri) by branden (guest, #7029) [Link]

"...and its safety was much lower than would be acceptable nowadays."

Vegas is accepting wagers on how well this comment ages.

Big picture

Posted Feb 14, 2025 20:39 UTC (Fri) by ggreen199 (subscriber, #53396) [Link] (4 responses)

One of my first supervisors worked at Boeing on the design of the first stage of Saturn V. Near the end of his time in the space division, he found out that the 2 remaining sets of design documents for the stage were to be thrown out. He took it upon himself to take 1 set home and store it in his garage. Many years later in the 90's, he got a frantic call from a higher up asking if it was true he had the __only__ remaining set of the design docs stored somewhere. I assume this was in the preliminary stages of designing the SLS or predecessors. He said yes, and they were ecstatic and retrieved the documents from him.

Would I be surprised if, after they used the documents for whatever reason they wanted them, they threw them out again? No, I would not be surprised. One constant I have learned from a long career is that long-term planning or retention does not seem to be a concern of any organization I have been associated with.

Big picture

Posted Feb 14, 2025 22:23 UTC (Fri) by excors (subscriber, #95769) [Link] (1 responses)

Apparently they didn't keep all the documentation, but they also didn't ignore long term planning - there were active efforts to preserve knowledge for components they expected to be useful in the future:

> The longstanding story that NASA lost or destroyed the Saturn 5 plans quickly falls to pieces when one learns about the F-1 Production Knowledge Retention Program. This was a project at Rocketdyne, the company that built the F-1 engine, to preserve as much technical documentation and knowledge about the engine as was possible. According to an inventory of records, this produced twenty volumes of material on topics such as the engine’s injector ring set, valves, engine assembly, and checkout and thermal insulation and electrical cables, among others.
>
> But the project went beyond simply preserving documentation. Rocketdyne actually sought to preserve the knowledge inside the heads of the people who designed and manufactured the engines. They conducted tape-recorded interviews with them, asking about parts that were difficult to produce and manufacturing tricks that they had learned in the process of building multiple engines.
(https://www.thespacereview.com/article/588/1)

Engines are probably the most complex part of a rocket, and they can often be reused in new rocket designs, so they're worth preserving. A lot of the rest of a rocket probably isn't worth it; the original blueprints depend on 1960s components that are no longer available, they use 1960s materials that are inferior to modern ones, the tooling is too bulky to store unused for decades, the launch pads and assembly buildings have been repurposed, etc, and most parts aren't that hard to design anyway (compared to engines), so even if you had perfect documentation it would probably be cheaper to redesign the rocket from scratch.

So I think the reason they didn't keep perfect documentation is because they knew it wasn't going to be needed, not because their engineers just didn't want to bother writing it down.

Big picture

Posted Feb 15, 2025 22:14 UTC (Sat) by ggreen199 (subscriber, #53396) [Link]

Sure, but the engines were not a Boeing product, so that doesn't apply to what I said. I was speaking of the Saturn V first stage. My comment about lack of planning and retention is not just based on the story of the first stage design, but on many, many other examples I witnessed in a long career.

While materials and tooling do change, loading, dynamic response to flight regimes, etc. are of course still relevant. And there are many more aspects to a design than the two I cited.

Big picture

Posted Feb 14, 2025 22:24 UTC (Fri) by pizza (subscriber, #46) [Link] (1 responses)

> Would I be surprised if, after they used the documents for whatever reason they wanted them, they threw them out again? No, I would not be surprised. One constant I have learned from a long career is that long-term planning or retention does not seem to be a concern of any organization I have been associated with.

In my experience, the opposite of "retention" is deliberately practiced, with documentation being automatically shredded as a matter of routine, solely because doing so means you can't ever be accused of willfully destroying evidence in hypothetical future lawsuits.

This pathology is combined with a propensity to cull experienced technical staff, outsource work, and never promote from within (the excuses invariably distill down to "too expensive") to produce what is effectively a sort of institutional anti-memory.

(...I've seen this occur while a product is still being actively produced and sold, in businesses with product support cycles that are measured in decades.)

tl;dr: "Retention" has guaranteed costs, with completely unquantifiable benefits at some point in the distant future (where "distant" is anything beyond the current fiscal year).

Big picture

Posted Feb 16, 2025 22:13 UTC (Sun) by mathstuf (subscriber, #69389) [Link]

> (where "distant" is anything beyond the current fiscal year)

Pssh. We're reliant on the *quarterly* reports these days.

Big picture

Posted Feb 15, 2025 16:02 UTC (Sat) by hallyn (subscriber, #22558) [Link] (2 responses)

Sorry, what was discovered last summer? Link would be greatly appreciated, thanks.

Big picture

Posted Feb 15, 2025 16:59 UTC (Sat) by Wol (subscriber, #4433) [Link]

That was that explosion where the Rust people asked for documentation for a C API. Because as far as they could tell, even other C consumers of the API couldn't agree on how it was supposed to be used.

Cheers,
Wol

Big picture

Posted Feb 16, 2025 22:32 UTC (Sun) by mathstuf (subscriber, #69389) [Link]

Big picture

Posted Feb 14, 2025 21:35 UTC (Fri) by roc (subscriber, #30627) [Link] (2 responses)

> In nearly every profession or field of study, the expectation is that the new folks must learn and understand the whats, hows, and (most importantly) whys of the way things are being done.

You make a good point.

But in nearly every profession or field of study, there is also an expectation of continuing education --- that old folks must continue to learn and understand new ways of doing things. This is more true the more tech-adjacent the field.

Yet we see many examples of kernel maintainers who explicitly deny being subject to that expectation. For them, "but it's too haaaaaard" DOES fly.

Big picture

Posted Feb 14, 2025 22:40 UTC (Fri) by pizza (subscriber, #46) [Link] (1 responses)

> But in nearly every profession or field of study, there is also an expectation of continuing education --- that old folks must continue to learn and understand new ways of doing things. This is more true the more tech-adjacent the field.

Sure. But it's a more general requirement ("X hours of continuing education a year") rather than "you will all learn and immediately adopt/incorporate THIS thing, or else you're out." Even in highly tech-adjacent (and/or highly regulated) fields.

> Yet we see many examples of kernel maintainers who explicitly deny being subject to that expectation. For them, "but it's too haaaaaard" DOES fly.

Is R4L an officially-blessed mainline feature? Or is it still considered an experiment?

"Let other folks who care undertake and maintain this experiment, I'm already working more than full time" is a perfectly rational attitude to take. (and how most Linux features have been, and still are, developed) Given the amount of technical churn that's already taken place, it's hard to argue that it's not a reasonably justifiable position.

Is Linux a C project? Is Linux a Rust project only if you have certain hardware or want specific features? Or is Linux a Rust project that consists mostly of (purely legacy) C? One way or another, it's well past the point where Torvalds needs to make a decision.

Big picture

Posted Feb 15, 2025 3:35 UTC (Sat) by dralley (subscriber, #143766) [Link]

> Is R4L an officially-blessed mainline feature? Or is it still considered an experiment?

It's an officially blessed experiment, although at this point it's not much of an "experiment" anymore and it would be pretty surprising to see it completely rolled back.

The bigger question is whether it remains allowed for drivers only as it is currently, or if it is eventually allowed into the core kernel.

> "Let other folks who care undertake and maintain this experiment, I'm already working more than full time" is a perfectly rational attitude to take. (and how most Linux features have been, and still are, developed) Given the amount of technical churn that's already taken place, it's hard to argue that it's not a reasonably justifiable position.

That would be a very justifiable position, but it's not Christoph's position. Christoph's position is substantially less justifiable.

Big picture

Posted Feb 14, 2025 13:50 UTC (Fri) by koverstreet (✭ supporter ✭, #4296) [Link] (3 responses)

> The thing is, on a broader level marcan is still entirely correct that remaining insular and refusing to cooperate with new people is just going to result in a death spiral. The kernel needs a pipeline of new dedicated long term contributors to survive, but the culture and the process seems to do a good job of scaring them away or burning them out.

Yeah, this is a really important point.

We've got a real problem with overprofessionalism (cf. elite overproduction in society at large); this is where some of my beef with the CoC and the committee's approach comes from.

It seems they want the kernel to be an emotionally safer place for maintainers, but there's a cost. Where do new engineers come from, the ones who really drive things decades down the road?

They start out as engaged, hot-headed young people who are interested in technology, of course. Seeing comments on Phoronix and elsewhere gives me fond memories of where I was 30 years ago, and I find more and more that there's a lot to be had in those interactions if you show a bit of patience and empathy.

Along with Ted's "thin blue line" mentality, I see a lot of ways in which the kernel community is becoming more insular and closed off when we need to be engaging with the outside world.

Big picture

Posted Feb 14, 2025 16:25 UTC (Fri) by nim-nim (subscriber, #34454) [Link] (1 responses)

I think what people wrote on both sides of the dispute is awful, but expecting people to be brilliant technically and brilliant emotionally is setting the bar sky high. Something has to give. The situation is sorely lacking some third party to help both sides accept the risks of not doing things exactly their way.

Superheroes can afford to stick to my way or the highway. Normal human beings can not.

Big picture

Posted Feb 14, 2025 17:01 UTC (Fri) by koverstreet (✭ supporter ✭, #4296) [Link]

Well, that's where leadership is supposed to come in. A couple of decades of experience should, in theory, give you time to acquire a degree of competency in both.

Big picture

Posted Feb 14, 2025 17:53 UTC (Fri) by branden (guest, #7029) [Link]

Permit me to attempt to rise to the challenge of the "Big picture" subject. The reader may want to brew a coffee for this one. (Or skip it.)

"We've got a real problem with overprofessionalism (c.f. elite overproduction in society at large); this where some of my beef with the CoC and the committee's approach comes from."

I don't think you're wrong, but I think your statement is easily misread. Let me attempt my own interpretation of it, with which you will not necessarily agree.

Our society (I'll speak here mainly of the U.S.--problems are similar though less extreme elsewhere) overproduces people credentialed for management. Unlike many, I don't claim that our ratios of engineering to social science to management to liberal arts graduates are out of whack, for the simple reason that in much U.S. employment, a bachelor's degree in _any discipline of study_ is regarded as a qualifying criterion for a management role--and often does most of the lifting of "sufficiency" for such a position.

And the reason people seek out these management roles after graduating college is that in many or most sectors, they're the only ones that pay a true living wage or offer a plausible path to one. Everybody else is tied to the federal minimum wage (or compensated by some meager increment above it), which hasn't approached a living wage in the memory of most of the workforce.

Understandably, every kid's parents push hard to get as many of their offspring as possible into the college prep/future management track, even if they have no temperament for or interest in knowledge production via scholarly methods (the reason universities exist). We end up with more "managers" than we need, but there's a tacit agreement in business leadership not to proletarianize the bulk of them (say, by sectors of the economy proclaiming, "okay, that's enough, no first-line managers without master's degrees"), because that risks upsetting the political equilibrium upon which the existing systems of rent extraction depend. The end result is a sloshy mass of managers without much real managing to do, so they become mandarins or commissars - the latter a feature of the Soviet system but, being too good an idea to let die with communism, now a means of achieving government "efficiency".

The result is that we have a lot of superfluous people applying their management training--or what passed for it--in places and to situations where worker self-management was adequate, or should have been permitted to develop organically, from the bottom up rather than top down. We see repeated instances of two kinds of problem: (A) heavy-handed CoC enforcers, often drafted from outside the communities they serve--because they're "professionals"--decreeing expulsions of significant (but not top-tier) contributors and drawing backlash, not because there wasn't a problem, but because the instincts of that community respond appropriately to sledgehammer tactics applied by people in mallet-shaped roles who have no other function and can contribute nothing else. And (B) an elite class of special contributors against whom the sledgehammer will never be swung no matter what. Sure, we can talk them into going to "sensitivity training" once, maybe. After that they'll have rediscovered their indispensability, knowing just as their employers do that they can find greener pa$ture$ elsewhere. Given the choice of retaining one of the marquee people or a faceless functionary, the outcome is obvious. That dynamic creates intense competition for one of those coveted untouchable spots, which in turn promotes bunker mentalities, rivalries, territoriality, and personal attachment to work product (like kernel subsystems) with which one is publicly associated in the minds of the community.

In summary, many projects seem to have drifted into a place where all of an immature developer's worst inclinations are seen to be indulged--if you're one of the "right" developers. A project survives and develops successfully in spite of these perverse incentives, not because of them. It's good that we have so many basically honorable and decent people attached to FLOSS projects, and a deep shame that the conventional wisdom is that they need to be managed "better" with approaches that will actively harm them.

Management is often a necessary function. But as with many products and services, buying from the seller employing the highest-pressure tactics or pushing the least rational arguments ("everybody else is doing it!") often leaves one underwhelmed and experiencing remorse. (But we can talk you out of saying so. C'mon, you can't just be disparaging managers like that. Do you want people to think you're a Marxist?)

I'll leave you with a lengthy quote from a historian friend of mine that helps show how we got here.

"Whereas in previous models of corporate governance, large shareholders (often from founding families like Ford) appointed fellow members of the owner class, the old school bourgeoisie/owner class, to internal office within these companies. In other words, one amassed stock (by inheritance or profit in another company) and then ascended to leadership. But the commissar class offered a concurrent competing model whereby one was appointed to leadership by other members of this class who managed funds and held voting proxies and, from that position, compensated oneself with shares and stock options.

"This ascendant class based these appointments on an expanding academic discipline, “business administration,” the skill at and understanding of the management of people based on the ability to analyze and manipulate mass psychology through statistical analysis. Whether this actually made any company more efficient is entirely debatable. The point is that the commissar class could justify its amplification of its own power by using a meritocratic discourse of expertise, not in what the company made or did but in psychological manipulation. People who ran companies, according to the logic of the commissars, didn’t run them because they were rich or because they understood the industry and had risen through its ranks but because they were masters of an arcane science McNamara and his ilk had helped to create.

"It was therefore perfectly logical that the commissars would endow business schools with funds to make more commissars. And as the austerity programs the commissars championed went into effect, these schools came to exercise an outside influence on university cultures as they expanded financially while the rest of the universities contracted. Logically, of course, the way to save other parts of the universities was to make them more closely resemble the business schools that produced the commissars or, conversely, to reassure the commissars by withdrawing into various forms of immaterialism so as not to produce graduates who might make competing meritocratic claims on the basis of specific, disciplinary knowledge as opposed to the meta-science of management. In other words, the postmodern turn and the rise of the business school were of a piece with one another, both driven by austerity, the ascendance of the commissars, Soviet subversion and immaterialism. By immaterialism, I mean that management was increasingly a science of psychological analysis and manipulation while humanities and social science scholarship relocated from describing the physical world to describing people’s thoughts about that world. The idea that reality is a social construction is one equally championed by the business schools and postmodernists who seized control of humanities and social science scholarship.

"These processes were already underway when the Eastern European dictatorships that had helped create them collapsed one after another. But without the worry of the Soviet Bloc as competition and with the removal of any serious political alternative, commissar-driven austerity could accelerate, as it rapidly did in the 1990s, raising tuition fees, ensuring that those working class people who did rise through the university system would be heavily indebted and thereby more controllable should they attempt to join this ascendant class.

I concede that this sort of analysis wanders far from LWN's editorial concerns. I realize that we're all here to hack, not to understand why firms like Intel or trade associations like the Linux Foundation operate the way they do.

Big picture

Posted Feb 14, 2025 18:56 UTC (Fri) by branden (guest, #7029) [Link] (2 responses)

Walk me through your reasoning here.

> Marcan’s one of those people whose grievances all look individually reasonable,

Okay, so we have a non-trivial set of grievances each of which has merit in isolation. 1+1+1+1+1...

> but when you add them all up it suddenly [suggests an implausible, or at least deeply distressing, result].

equals -1.

This process resembles nothing so much as shooting the messenger, or fallaciously rejecting a valid deductive conclusion because you don't like it. That's the opposite of a reasoning process.

When diagnosing this sort of surprising result, you should work harder to understand what's going on. Maybe you're right to reject the conclusion you reach (even once deflated of the hot gas you pumped into it when writing). A couple of approaches come to mind.

1. You noted a suddenness to the process. Okay. Delete marcan's grievances individually, one by one, until you reach a different conclusion that follows from them and yet doesn't surprise you. What is that conclusion?

2. It is sometimes the case that a person makes some decision premised on grounds that they don't disclose, then retrospectively cobbles together a "case" for it that bears little resemblance to the reasoning process they actually used. This comes up frequently in employment law cases, where a finding of a tortious or unlawful firing of an employee is made because the employer offered "shifting rationales" for termination over time (with the biggest differences typically showing up once the matter gets litigated and the employer retains professional legal advice). If you think that is the case with marcan, then the burden is on you to establish that he had some other basis, one the community would be less likely to accept as legitimate, for his decision. (Like, say, "too many Linux kernel people casually inspect their own navels and reach the conclusion they're without sin".) Such a claim will demand evidence. If that exists only locked up inside marcan's scheming mind, then you have no objective evidence to offer.

In sum, the onus is on you to explain why 1+1+1+1+1 doesn't equal 5 here.

Big picture

Posted Feb 14, 2025 22:01 UTC (Fri) by Phantom_Hoover (subscriber, #167627) [Link]

If you want a silly mathematical analogy: if someone says they lost one bet on a coin flip, you accept that and commiserate. If someone says they’ve lost ten coin flips in a row, you start to suspect there’s more to the story.
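(Back-of-the-envelope, assuming a fair coin and independent flips: the probability of losing ten in a row is (1/2)^10 = 1/1024, roughly 0.1% - rare enough that a second explanation starts to look plausible.)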

If you want the harsher aphorism version: "If you run into an asshole in the morning, you ran into an asshole. If you run into assholes all day, you're the asshole."

Big picture

Posted Feb 15, 2025 6:16 UTC (Sat) by milesrout (subscriber, #126894) [Link]

If you meet pricks here and there, they probably are pricks. If everyone you meet is a prick, maybe it is actually you that is the problem.

Hector unfortunately seems to come into conflict with just about everyone. He always seems to be able to come up with a reason for having these conflicts but at the end of the day you have to sit back, look at the big picture, and ask if that isn't just "system 2" trying to make up for some serious "system 1" issues.

Not followed it closely... but was just wondering.

Posted Feb 13, 2025 18:23 UTC (Thu) by Lennie (subscriber, #49641) [Link]

I see some people having really unreasonable expectations on how well or how quickly devices are supported.

Would it be a good idea to have a flag in the documentation for devices which don't have hardware manufacturer co-operation?

To make it clear: this device will not be supported at release time, it will take longer before it's supported, etc. because the manufacturer gives us no information to work with/does not help fund the development, etc.

Slightly OT, one more resignation from one more project caused by that LKML thread

Posted Feb 15, 2025 14:42 UTC (Sat) by rsidd (subscriber, #2582) [Link] (37 responses)

Karol Herbst resigned from nouveau citing, specifically, Ted Ts'o's "thin blue line" comment. Earlier Ts'o's behaviour at a conference caused a R4L maintainer, Wedson Almeida Filho, to resign.

Nouveau is planned to be superseded by Nova, which is Rust-based.

Rust is not going anywhere, and real-world hardware already depends on Rust in the kernel. (That's Apple M* so far, but there will be more in the wider world, not just Nova.) I'm just astonished that Linus has let things get to this pass, after initially offering his support to R4L. If it is not resolved, a Rust-friendly kernel fork seems inevitable.

Slightly OT, one more resignation from one more project caused by that LKML thread

Posted Feb 15, 2025 16:40 UTC (Sat) by Phantom_Hoover (subscriber, #167627) [Link] (13 responses)

The idea you can identify and condemn someone’s political beliefs from their use of a 70 year old stock phrase, or that this is acceptable behaviour on the kernel mailing list, is liable to lose more people in the long term if widely accepted. The kernel needs to be inclusive of people who don’t think and talk exclusively within a very narrow vision of what’s politically acceptable.

Slightly OT, one more resignation from one more project caused by that LKML thread

Posted Feb 15, 2025 17:04 UTC (Sat) by rsidd (subscriber, #2582) [Link] (12 responses)

Phrases can have different meanings now than they did 70 years ago. A certain religious symbol, one that has existed for millennia in Asia and is still widespread where I live, is nevertheless inappropriate in the western context unless its non-European symbolism in that context is made absolutely clear. And this was not the first resignation prompted by his behaviour.

The bigger point is that these particular, supposedly indispensable, maintainers are growing old too. If they really are indispensable, the kernel is in trouble. As many have noted, young programmers just aren't interested in programming in C. And there is no good reason for them to be.

So if this continues, I see only two possibilities: (1) A fork emerges that is explicitly rust-friendly and, over time, takes over the mindshare of linux developers so that it becomes the new non-Linus upstream. (2) An alternative like redox takes over.

Ok, there's a third (3): the dream of a general-purpose free/libre OS is dead. But I don't think that will happen.

Slightly OT, one more resignation from one more project caused by that LKML thread

Posted Feb 15, 2025 17:28 UTC (Sat) by Phantom_Hoover (subscriber, #167627) [Link] (11 responses)

You should be at least a little embarrassed to compare use of the phrase ‘thin blue line’ in 2025 to displaying a swastika. Apparently Ted Ts’o has to treat everyone else like they’re made of glass, but LWN commenters can cast whatever aspersions they like towards him.

Slightly OT, one more resignation from one more project caused by that LKML thread

Posted Feb 15, 2025 18:12 UTC (Sat) by Wol (subscriber, #4433) [Link] (10 responses)

And what is the thin blue line? I've heard of the thin RED line ...

Don't explain if you don't want to, but just remember that cultural references don't always translate very well ...

Cheers,
Wol

Slightly OT, one more resignation from one more project caused by that LKML thread

Posted Feb 15, 2025 18:27 UTC (Sat) by farnz (subscriber, #17727) [Link] (7 responses)

Historically, it was meant to represent the US's police as the thin blue line between the lawlessness caused by allowing slaves to escape and act like people, and civilization where people acted lawfully.

As a consequence of the history, whether you see it as problematic or not depends very strongly on whether you would have been a person or a slave on the "lawful" side of the line.

Slightly OT, one more resignation from one more project caused by that LKML thread

Posted Feb 15, 2025 18:39 UTC (Sat) by Phantom_Hoover (subscriber, #167627) [Link] (6 responses)

This claim is entirely untrue: https://en.wikipedia.org/wiki/Thin_blue_line#History

Politically incendiary disinformation like this is a lot more socially and institutionally toxic than grumpy maintainers.

Slightly OT, one more resignation from one more project caused by that LKML thread

Posted Feb 15, 2025 18:46 UTC (Sat) by farnz (subscriber, #17727) [Link] (5 responses)

Given that the Wikipedia link you provide has citations about it being used to refer to exactly that, including police silence about racist attacks on non-white US citizens in the 1970s, calling it "entirely untrue" is stretching the words "untrue" and "entirely" to their limits.

Whether you like it or not, historically, the US police forces have had problems with racism, and terms like this were popular within US policing to justify that racist behaviour. Ignoring that history means omitting a lot of significant context, just as ignoring a Hitler supporter using a swastika on the basis that it's a symbol of peace (as it is in many Asian cultures) is ignoring significant context.

Slightly OT, one more resignation from one more project caused by that LKML thread

Posted Feb 15, 2025 18:54 UTC (Sat) by Phantom_Hoover (subscriber, #167627) [Link] (4 responses)

> Given that the Wikipedia link you provide has citations about it being used to refer to exactly that

You're going to find it very hard to find citations supporting your claim that the phrase "was meant to represent the US's police as the thin blue line between the lawlessness caused by allowing slaves to escape and act like people", because the article clearly says the first recorded use of the term to refer to police was in 1922, a full 57 years after the abolition of slavery in the United States. Please do not keep spreading this falsehood.

Slightly OT, one more resignation from one more project caused by that LKML thread

Posted Feb 15, 2025 20:49 UTC (Sat) by farnz (subscriber, #17727) [Link] (3 responses)

It was used in that sense in the 1970s and 1990s, and that sense has overtaken the original meaning by a significant margin. Further, it was referencing back to the history of the US, before slavery was abolished - and there were still plenty of people in the 1970s who wanted to go back to the era of slavery, where people knew their place.

Just as someone spraying a swastika on a synagogue in Germany cannot justify it by reference to its older meaning, so too is it unreasonable to say that this phrase can't have evolved over time.

Slightly OT, one more resignation from one more project caused by that LKML thread

Posted Feb 15, 2025 21:34 UTC (Sat) by Phantom_Hoover (subscriber, #167627) [Link] (2 responses)

You claimed that the phrase ‘thin blue line’ originated during slavery. This is false, and I am once again asking you to acknowledge that rather than moving the goalposts. Ted Ts’o is a real person and he deserves better than to have his name dragged through the mud by mendacious comparisons of the use of a common turn of phrase with slavery and Nazism.

Slightly OT, one more resignation from one more project caused by that LKML thread

Posted Feb 16, 2025 19:02 UTC (Sun) by branden (guest, #7029) [Link] (1 responses)

> You claimed that the phrase ‘thin blue line’ originated during slavery.

No, he didn't. He said:

"Historically, it was meant to represent the US's police as the thin blue line between the lawlessness caused by allowing slaves to escape and act like people, and civilization where people acted lawfully."

He said "historically". Usage that was current when Richard Nixon was President is "historical", as was usage prior to the passage of the 13th Amendment. If you don't believe me, ask someone born after 1990. That Jefferson Davis may not have been familiar with phrase is not an argument against the imprecise claim he made, even if it would eviscerate the much more specific one you're putting into farnz's mouth.

The reframing of history to contrive a prelapsarian golden age is a frequent practice of reactionary movements. See, for example, _The Way We Never Were_, Stephanie Coontz, Basic Books, 1993.

Slightly OT, one more resignation from one more project caused by that LKML thread

Posted Feb 16, 2025 19:50 UTC (Sun) by Phantom_Hoover (subscriber, #167627) [Link]

Please tilt at the windmills of a ‘prelapsarian golden age’ elsewhere. They have nothing to do with the existing conversation.

Slightly OT, one more resignation from one more project caused by that LKML thread

Posted Feb 15, 2025 18:46 UTC (Sat) by wtarreau (subscriber, #51152) [Link]

I was going to ask that as well, because I also saw that sentence highlighted out of its context in the LKML thread, and didn't understand why, given that I lacked the reference.

Slightly OT, one more resignation from one more project caused by that LKML thread

Posted Feb 15, 2025 18:47 UTC (Sat) by Phantom_Hoover (subscriber, #167627) [Link]

The thin blue line refers to the police (who wear blue uniforms in a lot of places, as opposed to the red military uniforms of the original 'thin red line'). You're from somewhere in Europe IIRC, so you may not be aware of just how polarised and toxic everything involving police became in US political discourse over the course of around 2020 to 2022. I think it's quite likely Ted isn't either, as a lot of normal people who aren't very politically involved didn't keep track of the lightning fast changes to 'acceptable language' which were flying around at that time.

Slightly OT, one more resignation from one more project caused by that LKML thread

Posted Feb 17, 2025 13:09 UTC (Mon) by nim-nim (subscriber, #34454) [Link] (22 responses)

“Thin blue line” is internal US politics. You may like them or not, but giving any credence to them is inviting every kernel contributor to add the politics of his own country to the mix.

What’s more disturbing in the message is a maintainer who justifies his own pushing back by the fact that the people being pushed back then leave. A completely self-destructive, circular, logic-101 argument.

Slightly OT, one more resignation from one more project caused by that LKML thread

Posted Feb 17, 2025 15:12 UTC (Mon) by corbet (editor, #1) [Link] (2 responses)

I believe your second paragraph is a complete misreading of what the original message said. It is all too common for contributors to disappear once they get their code upstream, leaving maintainers with the responsibility for it. It's not surprising that maintainers respond by at least wanting that code to be in good shape.

Slightly OT, one more resignation from one more project caused by that LKML thread

Posted Feb 17, 2025 16:12 UTC (Mon) by nim-nim (subscriber, #34454) [Link]

I know what the original intent was (it was documented on LWN, IIRC) and I know what was written in the aforementioned message. It has morphed from "the maintainer needs to take care that he accepts good code" into "sod new contributors; if I have the time I will look into proposed changes, and until I have convinced myself the code is near perfect, sod them all, they will leave anyway; and if the code is near perfect, sod them all, I don’t need them then".

Slightly OT, one more resignation from one more project caused by that LKML thread

Posted Feb 17, 2025 18:31 UTC (Mon) by Phantom_Hoover (subscriber, #167627) [Link]

Amidst all the smoke and noise I think this gets at a very real and interesting source of tension in the kernel. From what I’ve seen in LWN coverage, there’s a slow crisis of maintainer numbers and retention in the kernel, even as contributor numbers grow cheerfully. The thing is, there’s a real conflict of interest between maintainers and contributors, where the former take on the maintenance burden of the latter in exchange for… their awesome power over Linux kernel subsystems? The wealth and social prestige that comes with the role? The incentives are bad and the pressure is high; it’s a role designed to squeeze people out. And the natural response to increased pressure on maintainers is that they’ll become more conservative and likely to reject contributions, because that’s just going to increase the pressure on them further. A lot of people seem to see that last symptom and blame it all on the maintainers being grumpy naysayers by personal temperament, which they can solve by social media shame campaigns to put even more pressure on the shrinking pool of dedicated maintainers, and I don’t see how that leads to anything but ever deeper dysfunction.

Slightly OT, one more resignation from one more project caused by that LKML thread

Posted Feb 17, 2025 18:08 UTC (Mon) by Phantom_Hoover (subscriber, #167627) [Link] (14 responses)

> “thin blue line“ is internal US politics.

I first encountered the phrase ‘thin blue line’ as a schoolchild in Scotland reading the biography of a South African playwright. It’s been a widely understood metaphor for around a century.

Slightly OT, one more resignation from one more project caused by that LKML thread

Posted Feb 17, 2025 22:37 UTC (Mon) by micka (subscriber, #38720) [Link] (13 responses)

I just had to look it up and am still confused about what it means (generally and in the context of the kernel). So I don’t think it’s that widely understood.

Slightly OT, one more resignation from one more project caused by that LKML thread

Posted Feb 18, 2025 8:04 UTC (Tue) by mathstuf (subscriber, #69389) [Link] (12 responses)

My understanding (from US cultural osmosis) is that the police view themselves as a (the?) linchpin in keeping society civil and not descending into unrest. They view their role as so essential that even if one of their own ranks is doing something completely antithetical to "keeping society civil" ("bad apples"), there is no need to introspect or fix themselves as a whole (at this moment at least, but I get the sense that there's mostly no dependence on time). Very much a "fix the symptoms (sometimes), don't ask about the root cause" kind of thing.

Applying *this* meaning to kernel maintenance is certainly quite extreme. IIUC, tytso is in the US and unless he lives even further under a rock than I do, is at least tangentially aware of this meaning.

Slightly OT, one more resignation from one more project caused by that LKML thread

Posted Feb 18, 2025 10:24 UTC (Tue) by paulj (subscriber, #341) [Link] (4 responses)

I don't know Ted Ts'o's background, but if he wasn't US born and raised, then - given that his wiki page lists a Chinese name - it is quite possible he grew up in a part of Asia where the English-speaking culture (as might be found in some schools, etc.) was British influenced - not US influenced. As such he might have a more British-influenced view of "thin blue line", in terms of usage in current affairs and news media that he was exposed to when younger. See my other comment.

Projecting all kinds of political views onto people, on the basis of a short phrase that means many things to different people... may not be entirely reasonable, to put it politely.

Slightly OT, one more resignation from one more project caused by that LKML thread

Posted Feb 18, 2025 11:20 UTC (Tue) by mathstuf (subscriber, #69389) [Link] (3 responses)

It's really been a newer thing in the US. I don't remember seeing any "blue line" US flags before the Black Lives Matter movement. I don't think I could have told you what "thin blue line" meant beyond a geometric object before 2019.

> Projecting all kinds of political views onto people, on the basis of a short phrase that means many things to different people... may not be entirely reasonable, to put it politely.

I agree and I don't think I was? I can see how my last paragraph could be read that way, but they're under conditionals. I don't know tytso enough to ascribe any specific views to him. Maybe he's unaware of these things, but given that they've reached even to me (who hasn't had TV service since 2007), I tend to assume there's at least some awareness of such things among those (and maybe I'm even completely wrong about him being US-based in any recent timeframe).

Slightly OT, one more resignation from one more project caused by that LKML thread

Posted Feb 18, 2025 11:27 UTC (Tue) by paulj (subscriber, #341) [Link]

> I can see how my last paragraph could be read that way, but they're under conditionals. I don't know tytso enough to ascribe any specific views to him.

Ah, OK. Sure, I accept that. Your comment was in the wider context of where at least /some/ _others_ are projecting their own political takes on "thin blue line" onto Ts'o - so, let me clarify that my own comment on reasonableness was about those others, not yourself. ;)

Thanks!

Slightly OT, one more resignation from one more project caused by that LKML thread

Posted Feb 18, 2025 11:35 UTC (Tue) by Phantom_Hoover (subscriber, #167627) [Link] (1 responses)

If you’ve been familiar with a phrase for decades you might reasonably be unwilling to drop all usage of it just because some political movement has recently started using it as a rallying cry. It starts to feel like you’re being led around by the nose not just by your own side, but by people whose politics you completely oppose, just through their choice of language and symbols.

Slightly OT, one more resignation from one more project caused by that LKML thread

Posted Feb 18, 2025 11:52 UTC (Tue) by mathstuf (subscriber, #69389) [Link]

As I said elsewhere, I've been completely unaware of the phrase beyond its attachment to the situations around Black Lives Matter. I also am unaware of tytso's personal familiarity with the phrase.

As for such phenomena in general, sometimes things just get "taken" like this overall. The Hindu religious symbol is still used for its purposes in its homelands and by those who live elsewhere. But outside of that context, it is definitely poisoned to a large extent (regardless of its chirality). Is "thin blue line" poisoned that much? I don't think so, but there's a continuum between "pure good" and "pure evil" upon which things can lie. As another example, see the 4chan adoption of a certain frog character which the original author has tried to "take back" but has largely (AFAIK) failed to do so.

Such things can move across the spectrum over time and it can differ between different cultures and people. For example, see curse word usage acceptability in Australia and the US. Just because one c-word is OK to use in Australia doesn't mean they'd get away with it in the US. One can make mistakes and accept being wrong about when and where to use such terms, but continued usage after being informed of the connotations without disclaimers of some kind is the kind of gap dog whistles fit nicely into.

Slightly OT, one more resignation from one more project caused by that LKML thread

Posted Feb 18, 2025 10:31 UTC (Tue) by Phantom_Hoover (subscriber, #167627) [Link] (6 responses)

There’s a really basic reading comprehension problem here: Ts’o’s use of the phrase to express the role of a Linux kernel maintainer *does not mean he thinks it applies to the role of police in society*. It tells you literally nothing about his opinion on policing policy! If I tell you I’m a crusader for memory-safe code in the kernel it does not mean I endorse religious war in the Middle East. If I tell you I have a five year plan for Rust in Linux, I am not calling for liquidation of the kulaks. When Ted says kernel maintainers are the thin blue line between the kernel and chaos it does not mean he thinks Blue Lives Matter.

This obsession with language games, this belief that you not only have to have the correct politics but you have to express it in exactly the right way in all avenues of your life lest you be called out by someone purity policing, is an absolutely toxic dead end. You will lose more people in the long run to the alienation and dysfunction that comes with trying to enforce it than you will keep by pandering to those who demand it.

Slightly OT, one more resignation from one more project caused by that LKML thread

Posted Feb 18, 2025 11:24 UTC (Tue) by mathstuf (subscriber, #69389) [Link] (4 responses)

> This obsession with language games, this belief that you not only have to have the correct politics but you have to express it in exactly the right way in all avenues of your life lest you be called out by someone purity policing, is an absolutely toxic dead end. You will lose more people in the long run to the alienation and dysfunction that comes with trying to enforce it than you will keep by pandering to those who demand it.

I would agree if it were not for the existence of "dog whistles". Everyone can say "but it wasn't meant that way!", but when, e.g., US militia movements see some "innocent phrase" as enablement, you might want to consider the additional ramifications of using such words carelessly. I don't know if tytso means it in that way, but it is not, IMO, a phrase to be tossed around carelessly.

Slightly OT, one more resignation from one more project caused by that LKML thread

Posted Feb 18, 2025 11:57 UTC (Tue) by Phantom_Hoover (subscriber, #167627) [Link] (3 responses)

You’ve acknowledged that you’d never heard the phrase before 2019, so it comes across as rather arrogant to lecture those of us who had about its role as a fundamentally harmful ‘dog whistle’ for the far right. The whole concept of ‘dog whistles’ starts out with a valid observation and immediately becomes a disaster when over-applied, creating a paranoid and inquisitorial atmosphere where everyone’s under constant suspicion of being a fascist. And the actual fascists know that, and delight in being able to sow paranoia among liberals by flaunting ubiquitous phrases and symbols at random! Better to tune out the noise altogether.

Slightly OT, one more resignation from one more project caused by that LKML thread

Posted Feb 18, 2025 12:15 UTC (Tue) by mathstuf (subscriber, #69389) [Link] (2 responses)

I added clarification of what the US meaning of "thin blue line" means to me (an American) to someone who has never heard of it before. I'm not here to tell you that "you can't use it", I'm here to say that it has meanings in the US that one might not want to convey. One can continue to use it however they feel (freedom of speech is a thing after all), but continued usage of it in contexts where it can be problematic without further clarification of actual intent may bring consideration from others about future interactions (one is not free of consequences of exercising their freedom of speech).

Anyways, we're far afield of LWN topics. I'll stop here at least.

Slightly OT, one more resignation from one more project caused by that LKML thread

Posted Feb 18, 2025 12:25 UTC (Tue) by Phantom_Hoover (subscriber, #167627) [Link] (1 responses)

> one is not free of consequences of exercising their freedom of speech

Yes — for instance, if you exercise your freedom of speech to create a purity culture where everyone needs to constantly signal their adherence to the Right Politics, the consequence might be alienating everyone outside of a small elite constituency, and a bunch of absolute maniacs might win power and start trashing your country. Just hypothetically, of course. I’m probably worrying about nothing.

Slightly OT, one more resignation from one more project caused by that LKML thread

Posted Feb 18, 2025 13:27 UTC (Tue) by jzb (editor, #7867) [Link]

As mathstuf recognized, this thread has gone far afield. I had hoped that comment would stop the thread there, but alas. We've gone way off topic, please end the thread here - and elsewhere.

Reading comprehension is a biggie

Posted Feb 18, 2025 12:54 UTC (Tue) by sdalley (subscriber, #18550) [Link]

Well said!

The fact is, one cannot simply pick apart the expressions people (e.g. kernel maintainers) use, as if one were deciphering some odd C preprocessor macro, and pronounce "that clearly means they're anti-Rust" or something. English isn't like that; it has context and "global state" all over the place!

English (and other) language comprehension requires:
* reading the full context;
* putting yourself in the shoes of the writer -- what might their general circumstances be? What do they probably know that you don't? They clearly have a reason for saying what they do;
* not making emotional assumptions about what they *might* mean;
* asking what their overall point is;
* when in doubt, assuming good faith.

Re-reading (for example) the whole of Ted's email https://lore.kernel.org/lkml/20250208204416.GL1130956@mit... in that light, one realizes there's no reason to be contentious.

Slightly OT, one more resignation from one more project caused by that LKML thread

Posted Feb 18, 2025 10:18 UTC (Tue) by paulj (subscriber, #341) [Link] (3 responses)

"Thin blue line" has been used in the UK - and hence a known usage across the Celtic Isles - to refer to the police generally for a good while. When you say "Thin blue line", the primary things I think of are UK bobbies in their old hat/helmet things lining up against CND protesters in the 60s, striking miners and poll tax protesters in the 80s; and - of course - Rowan Atkinson's '95-'96 TV comedy series "The Thin Blue Line".

Not all the world is America.

Slightly OT, one more resignation from one more project caused by that LKML thread

Posted Feb 20, 2025 11:02 UTC (Thu) by dvdeug (guest, #10998) [Link] (2 responses)

Ask the Northern Irish Catholics what they think about your Thin Blue Line. Not all the world is America, but even you mention how the police stand against striking miners but not Lord Robens. You don't have to be American to understand how the Thin Blue Line is problematic.

Slightly OT, one more resignation from one more project caused by that LKML thread

Posted Feb 20, 2025 11:37 UTC (Thu) by paulj (subscriber, #341) [Link]

The RUC (like the RIC before them) were _not_ a normal police force, by any means. Even today, the PSNI retains a strong whiff of paramilitarism to it. The RUC were also not the only security force in the wee north of Ireland. There was also the UDR, in addition to the regular British army. I wouldn't think of the RUC and the troubles when someone mentions "The thin blue line", given the paramilitary-colonial-enforcement nature of them, which sets them apart from policing generally. Also, the RUC (like the RIC before them) wore *dark green* uniforms, _not_ blue.

I'm not from the north myself, yet I have had British army soldiers point rifles at me at checkpoints. I grew up with people whose families fled the north because of the troubles, and who also lost close family in the troubles.

You should also note the examples I gave are *not* at all widely considered "positive" examples of policing in the UK!

Slightly OT, one more resignation from one more project caused by that LKML thread

Posted Feb 20, 2025 12:04 UTC (Thu) by Phantom_Hoover (subscriber, #167627) [Link]

My entire family are Northern Irish Catholics, many of them grew up in West Belfast at the height of the troubles, and I can promise you none of them would be bothered to take offence at a programmer figuratively calling his technical role the ‘thin blue line’. Sensitivity scolds love to ask us to *imagine* how hurtful random phrases are to Marginalised People, because if you go and actually ask the Marginalised People they’re generally boring normies with functional levels of emotional resilience.

I did get a great laugh out of my relatives when I told them about the middle class student activists at my university calling for police abolition, who were pointing to the IRA’s role in West Belfast as a model of non-coercive community driven policing.

