
Comparing Rust to Carbon

By Daroc Alden
September 16, 2025

RustConf

Safe, ergonomic interoperability between Rust and C/C++ was a popular topic at RustConf 2025 in Seattle, Washington. Chandler Carruth gave a presentation about the different approaches to interoperability in Rust and Carbon, the experimental "(C++)++" language. His ultimate conclusion was that while Rust's ability to interface with other languages is expanding over time, it wouldn't offer a complete solution to C++ interoperability anytime soon — and so there is room for Carbon to take a different approach to incrementally upgrading existing C++ projects. His slides are available for readers wishing to study his example code in more detail.

Many of the audience members seemed aware of Carbon, and so Carruth spent relatively little time explaining the motivation for the language. In short, Carbon is a project to create an alternative front-end for C++ that cuts out some of the language's more obscure syntax and enables better annotations for compiler-checked memory safety. Carbon is intended to be completely compatible with C++, so that existing C++ projects can be rewritten into Carbon on a file-by-file basis, ideally without changing the compiler or build system at all. Carbon is not yet usable — the contributors to the project are working on fleshing out some of the more complex details of the language, for reasons that Carruth's talk made clear.

[Chandler Carruth]

"It's always a little exciting to talk about a non-Rust programming language at RustConf," Carruth began, to general laughter. He has worked in C++ for many years, and has been working on Carbon since the project started in 2020. Currently, he is paid for his work on Carbon as part of Google's languages and compilers team. He briefly showed some research from Google indicating that the majority of the security vulnerabilities it deals with could have been prevented by memory-safe languages, but he didn't spend too long on it because he expected the audience of RustConf to be well-aware of the benefits of memory safety.

The thing is, there is a lot of existing software in the world written in C and C++. There is no magic wand to make that software go away. Migrating any of it to memory-safe languages will require those languages to integrate with the rest of the existing software ecosystem, he said. Interoperability is not just nice to have — it's a key part of what makes adopting memory-safe languages work.

Rust already has several tools to make interoperating with C/C++ code feasible. Carruth listed Rust's native foreign-function interface, bindgen and cbindgen, the cxx crate, and Google's own Crubit. But he claimed that none of these are really good solutions for existing C++ software. He defined software as existing on a spectrum between "greenfield" (new code, not tightly coupled to C++, with strong abstraction boundaries) and "brownfield" (tightly coupled to existing C++, with a large API surface). Greenfield software is relatively easy to port to Rust — it can be moved one module at a time, using existing binding tools. Brownfield software is a lot harder, because it can't be easily decomposed, and so the interface between code written in C++ and code written in Rust has to be a lot more complex and bidirectional.

The question, Carruth said, is whether Rust can ever close the gap. He doesn't think so — or, at least, not soon and not without a monumental effort. But Rust is not the only approach to memory safety. Ideally, existing C++ code could be made memory-safe in place. Lots of people have tried that, but "the C++ committee is probably not going to do it". There's no way to successfully add memory safety to C++ as it is, he said.

There are several languages that have managed a transition away from a base language into a more capable, flexible successor language, though: TypeScript is an evolution of JavaScript, Swift is an evolution of Objective-C, and C++ itself is an evolution of C. Carruth thinks that Carbon could be a similar evolution of C++ — a path to incremental migration toward a memory-safe language, prioritizing the most entrenched brownfield software. Rust is coming at the problem of memory safety from the greenfield direction, he said, and Carbon is coming at it from the other side. That makes Rust and Carbon quite different languages.

A closer look

The real focus of his talk was on showing where those differences are, and where he thinks each language can learn from the other. The syntaxes of Rust and Carbon are "not wildly different"; the differences he wanted to focus on were more abstract. For example, in Rust, a compilation unit is an entire crate, potentially composed of several modules. Therefore, it's allowed for modules to reference each other in a cyclic way, and that just works. That isn't something Carbon can support because "existing C++ code is often oddly dependent" on the ability to compile individual files separately. So, Carbon inherits C++'s model, complete with forward declarations, (optional) separate header files, and more complexity in the linker. This makes Carbon's model more complex, but that complexity doesn't come from nowhere — "it comes from C++".
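
To see that difference concretely, here is a minimal sketch (not from the talk) of two Rust modules in one crate that refer to each other; because the whole crate is a single compilation unit, the cycle resolves without forward declarations or header files:

    // lib.rs: two modules in one crate that depend on each other.
    pub mod shapes {
        pub struct Circle {
            pub radius: f64,
        }

        pub fn describe(c: &Circle) -> String {
            // Calls into the sibling module...
            crate::render::label("circle", c.radius)
        }
    }

    pub mod render {
        pub fn label(kind: &str, size: f64) -> String {
            format!("{kind} ({size})")
        }

        // ...which can refer back to the first module's types.
        pub fn area(c: &crate::shapes::Circle) -> f64 {
            std::f64::consts::PI * c.radius * c.radius
        }
    }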

Another example is the difference between traits and classes. Rust traits and Carbon classes are not that different, syntactically — Carbon just writes methods inside the struct definition, while Rust writes them separately — but they have major conceptual differences. Carbon has to handle inheritance, virtual functions, protected fields, and so on. "This stuff is complexity that Rust just doesn't have and doesn't have to deal with." Carbon wants to meet C++ APIs where they are, he said. One can even inherit across the C++/Carbon boundary.
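
As a rough Rust-side sketch of that contrast (not taken from the slides), the type, its inherent methods, and its trait implementations are three separate items, and there is no inheritance hierarchy to model:

    // The data definition carries no methods of its own.
    pub struct Circle {
        pub radius: f64,
    }

    // Inherent methods live in a separate impl block.
    impl Circle {
        pub fn new(radius: f64) -> Self {
            Circle { radius }
        }
    }

    // Traits play the role of interfaces; there are no virtual functions,
    // protected fields, or base classes to deal with.
    pub trait Shape {
        fn area(&self) -> f64;
    }

    impl Shape for Circle {
        fn area(&self) -> f64 {
            std::f64::consts::PI * self.radius * self.radius
        }
    }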

This sort of difference is pervasive, he said, and comes up in all parts of the language. Operator overloading, generics, and type conversions are all more complex in Carbon. Why do it this way? Why is all of this additional complexity worth it? To explain, he showed an example of a hypothetical but not unusual C++ API:

    int EVP_AEAD_CTX_seal_scatter(
        const EVP_AEAD_CTX *ctx,
        std::span<uint8_t> out,
        std::span<uint8_t> out_tag,
        size_t *out_tag_len,
        std::span<const uint8_t> nonce,
        std::span<const uint8_t> in,
        std::span<const uint8_t> extra_in,
        std::span<const uint8_t> ad);

The example was adapted from a real function in the BoringSSL cryptography library. Each std::span is a combination of a pointer and a length. The main problem with faithfully representing it in Rust is actually not visible in the code itself; the documentation for this function explains that out must either be the same pointer as in, or a completely non-overlapping piece of memory. When the pointers are the same, the function encrypts a given plaintext input buffer in place. Otherwise, the encrypted output is written to the output buffer without disturbing the input buffer. None of the other pointers are supposed to alias.

Carbon is still a work in progress, but the current plan for expressing APIs like this in a machine-checkable way is to use "alias sets". These would be annotations showing which pointers are permitted to alias each other, and which ones aren't. The resulting Carbon code might look like this:

	fn EVP_AEAD_CTX_seal_scatter[^inout](
	    ctx: const EVP_AEAD_CTX ^*,
	    out: slice(u8 ^inout),
	    out_tag: slice(u8 ^),
	    out_tag_len: u64 ^*,
	    nonce: slice(const u8 ^),
	    input: slice(const u8 ^inout),
	    extra_input: slice(const u8 ^),
	    ad: slice(const u8 ^)) -> i32;

Here inout is a name given to a particular alias set, used to annotate out and input. None of the other pointers in the function signature have an alias set specified, so the compiler would ensure that they can't alias.

Trying to represent this API in Rust just doesn't work. The language never lets mutable references alias each other, so you end up needing two separate wrapper functions with different signatures: one for the in-place case and one for the copying case. Rewriting the module that contains this function in Rust would become a complex process, intermingling the simple translation of the actual code with the refactoring of the interface.
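
A hypothetical sketch of that split (these wrappers are illustrative, not from the talk): because mutable slices can never alias in Rust, the single C++ entry point turns into one function per aliasing pattern:

    // Illustrative wrapper signatures only; bodies, the tag-length output,
    // and the extra_in parameter are omitted.

    /// In-place case: `buf` is both the plaintext input and the ciphertext
    /// output, so a single mutable slice stands in for the aliasing pair.
    pub fn seal_in_place(buf: &mut [u8], tag: &mut [u8],
                         nonce: &[u8], ad: &[u8]) -> Result<usize, ()> {
        // ... call the C function with `buf` passed as both `in` and `out` ...
        unimplemented!()
    }

    /// Copying case: `input` and `out` are distinct buffers, and Rust's
    /// &/&mut rules guarantee that they cannot overlap.
    pub fn seal_scatter(out: &mut [u8], tag: &mut [u8], nonce: &[u8],
                        input: &[u8], ad: &[u8]) -> Result<usize, ()> {
        // ... call the C function with separate `in` and `out` pointers ...
        unimplemented!()
    }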

The power of Carbon for interoperability, Carruth said, is that it lets you decouple these things and do them as small, separate steps. He showed another example of a C++ program that was actually memory-safe, but that wasn't compatible with Rust's lifetime analysis. No computerized analysis of memory safety can ever be perfect, so Carbon presumably won't be able to do much better here — but in Carbon, patterns that the compiler cannot prove to be memory safe can be turned into a warning instead of an error.
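
Carruth's example isn't reproduced here, but a well-known illustration of the same category in Rust itself is the "get or insert" pattern, which is memory-safe in practice yet rejected by the current borrow checker:

    use std::collections::HashMap;

    // Rejected today with "cannot borrow `*map` as mutable because it is
    // also borrowed as immutable": the early return keeps the shared borrow
    // alive for the whole function, even though it can never overlap with
    // the later mutation at run time.
    fn get_or_insert(map: &mut HashMap<u32, String>, key: u32) -> &String {
        if let Some(value) = map.get(&key) {
            return value;
        }
        map.insert(key, String::from("default"));
        map.get(&key).unwrap()
    }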

This focus on meeting C++ where it is makes Carbon a different language. It ends up being specially tailored to interoperability and gradual migration, which isn't free. This makes the language more complex than it could be otherwise, and Carruth doesn't think that's the right tradeoff for every language. But if the goal is to have memory-safe software throughout the software ecosystem, he thinks that there needs to be room for Rust and Carbon both. This isn't a competition between languages; it's two different languages working together to cover the widely divergent needs of different projects.


Index entries for this article
Conference: RustConf/2025



Hard truth

Posted Sep 16, 2025 17:10 UTC (Tue) by jbills (subscriber, #161176) [Link] (42 responses)

> "the C++ committee is probably not going to do it"

Don't let the C++ people hear you. They are adamant that profiles are the only viable solution for safety for C++.

Hard truth

Posted Sep 16, 2025 22:21 UTC (Tue) by willy (subscriber, #9762) [Link] (41 responses)

Hard truth

Posted Sep 16, 2025 22:39 UTC (Tue) by Wol (subscriber, #4433) [Link] (40 responses)

Apparently the C++ guys don't want memory-safe functions to be only allowed to call other memory-safe functions. Which is absolutely fundamental to Rust.

But isn't that just a fundamental requirement of safety? Aiui, the biggest problem with profiles is that if you mix code combining different profiles, you lose the guarantees that profiles are meant to provide!?

So unless you say "there's only one profile", profiles don't work. So the STL will be forced to dictate a single profile. Which means all other programs will have to use the same profile. Which promptly means the C++ guys will - in practice - be banning safe functions from calling non-safe ones ... exactly what they say they *don't* want to do.

Cheers,
Wol

Hard truth

Posted Sep 16, 2025 23:32 UTC (Tue) by pizza (subscriber, #46) [Link] (37 responses)

> Apparently the C++ guys don't want memory-safe functions to be only allowed to call other memory-safe functions. Which is absolutely fundamental to Rust.

Uh... Rust is perfectly happy to call into non-safe things. And vice versa.

Hard truth

Posted Sep 17, 2025 0:28 UTC (Wed) by NYKevin (subscriber, #129325) [Link] (36 responses)

If we're a bit charitable, I think the point is that calling an unsafe function from a safe function is almost (but not quite) as annoying as calling a Java function that throws a checked exception. This is, of course, intentional - you're not supposed to call unsafe functions if you can reasonably avoid it, and Rust tones down the annoyance just enough to make it bearable in cases where you really do need unsafe.

But C++ is not Rust. C++ has a long tail of not-provably-safe legacy code to care about. If any call into such code were just as annoying as calling unsafe code from Rust, then C++ developers would quickly figure out how to turn off safety profiles and do so.

I'm not saying safety profiles are viable - like most sensible people who have looked at safety profiles, I think they're probably doomed. But I'm not sure there was ever a good alternative. The standards committee was really backed into a corner with this one.

Should C++ be deprecated?

Posted Sep 17, 2025 5:38 UTC (Wed) by DemiMarie (subscriber, #164188) [Link] (32 responses)

I wonder if it is time to outright deprecate C++ and tell people they need to rewrite their code. Yes, it will be a giant mountain of work, but it’s the only pure-software solution I know of. CHERI can solve memory unsafety in hardware, but that hardware isn’t available yet.

Should C++ be deprecated?

Posted Sep 17, 2025 7:11 UTC (Wed) by viro (subscriber, #7872) [Link] (23 responses)

Re telling people that they need to do arseloads of work on your say-so: "I can call spirits from the vasty deep". Remember the response to that?

You are presuming an authority that simply does not exist, neither in yourself, nor in anybody else.

Should C++ be deprecated?

Posted Sep 17, 2025 8:55 UTC (Wed) by smurf (subscriber, #17840) [Link] (22 responses)

Let's face it, C and C++ are not memory safe and cannot be convinced to become so. So yes they should be "deprecated".
That doesn't mean that anybody is forced to rewrite anything. Deprecated software (and hardware) can and does live on for ages.
However if you ever need to change more than three lines of that code, that deprecation tag does help when managers insist on putting yet another band-aid on that aging library instead of rewriting it in something sane – which, more often than not, will save a heap of time in the long run.

Should C++ be deprecated?

Posted Sep 17, 2025 9:40 UTC (Wed) by smurf (subscriber, #17840) [Link] (20 responses)

In any case, it already has been deprecated by a variety of people and organizations, e.g. CISA:

The development of new product lines for use in service of critical infrastructure or NCFs (national critical functions) in a memory-unsafe language (e.g., C or C++) … is dangerous and significantly elevates risk to national security, national economic security, and national public health and safety.

– CISA, Product Security Bad Practices

Should C++ be deprecated?

Posted Sep 17, 2025 17:43 UTC (Wed) by hmh (subscriber, #3838) [Link] (19 responses)

But what does CISA have to say about languages where one almost always ends up with large dependency sets and a massive SBOM?

Complain whatever you will about C and C++, but dependency hell is rare in the C and C++ ecosystems, often because projects rely on one of the *few* popular includes-even-the-kitchen-sink frameworks, and those are well maintained and well gatekept (e.g. glib, Qt).

So, yeah, maybe C and C++ should be deprecated in general, but IMO dependency-hell should never be acceptable in any security-sensitive context, it is far worse a problem than the use of well-written C or C++.

Should C++ be deprecated?

Posted Sep 17, 2025 19:32 UTC (Wed) by NYKevin (subscriber, #129325) [Link] (8 responses)

Rust does not force the use of huge dependencies. It does not even force you to have a memory allocator. The huge deps exist because developers (and by extension, their users and other stakeholders) like the benefits of those deps and choose to take them rather than reinventing the wheel.

Should C++ be deprecated?

Posted Sep 18, 2025 13:52 UTC (Thu) by LtWorf (subscriber, #124958) [Link] (7 responses)

This is kinda meaningless.

If a language doesn't have a big standard library or a few well established kinda big libraries, you will inevitably end up with a lot of dependencies.

Should C++ be deprecated?

Posted Sep 19, 2025 3:52 UTC (Fri) by NYKevin (subscriber, #129325) [Link] (6 responses)

> If a language doesn't have a big standard library

What does that have to do with anything? None of C, C++, or Rust have big standard libraries. Well, I suppose you could make an argument for C++ being kinda big... but it's still a far cry from (e.g.) Python (and a fair amount of its functionality is also in Rust's stdlib).

> or a few well established kinda big libraries,

I contend that this is not a Rust problem, it is a "developers in general don't like big libraries" problem. Rust is perfectly capable of supporting large libraries. Developers choose not to produce or use them, because Cargo makes it relatively painless to produce and use small libraries instead. We see similar behavior in basically every programming environment whose build and packaging system makes it sufficiently convenient (Go, JavaScript, etc., but not Java, C, C++, etc.). In the case of Python, which has a deeply flawed but still sorta kinda mostly usable packaging system, we see a mixture of large and small libraries.

Now, you might argue that this does not matter either, that it's still a language with a lot of deps. But think about the implications of this for a moment. You would be arguing, essentially, that a "good" language ought to have a "bad" build system in order to force developers to do things the way you would prefer instead of the way that they would prefer. My broader point is that the market does not answer to you and your preferences. If you want big libraries, you can write them yourself, commission them from somebody else, or find an employer who uses a monorepo.

Should C++ be deprecated?

Posted Sep 19, 2025 11:02 UTC (Fri) by excors (subscriber, #95769) [Link] (1 responses)

> You would be arguing, essentially, that a "good" language ought to have a "bad" build system in order to force developers to do things the way you would prefer instead of the way that they would prefer.

That sounds like Rust's basic design philosophy - it's an opinionated language, and its opinions are sometimes not the traditionally popular ones.

E.g. it's "bad" at writing code that needs lots of raw pointers (like code that has complicated ownership but can't afford the run-time overhead of Rc+RefCell). The syntax is uglier than in C, the aliasing rules are more complicated and less clearly defined, performance may be worse (since Rust pointers don't have type-based strict aliasing), the documentation is full of scary warnings, the community will shun you for using too much `unsafe`, etc.

Programmers coming from C often want to write code that way. One of the first data structures they may try to implement is a linked list, which C is really good at and Rust is really bad at. But Rust is bad at that because Rust's designers would prefer you don't write code that way. They want there to be a significant amount of friction when using raw pointers, because you shouldn't be using raw pointers. It might upset those C programmers in the short term, but it's for their own good, and in the long term they'll come to appreciate it.

I think it would be entirely consistent with that philosophy for the language designers to decide that massive dependency trees are dangerous, no matter how much programmers from other languages seem to prefer working that way, and to design the language and tools in a way that makes that less convenient. Require Cargo.toml to have an explicit allowlist of the owners of all transitive dependencies, so developers are forced to be aware of how many random GitHub users they're trusting and are more attracted towards self-contained groups of crates with shared maintainership, or whatever. Don't do anything as accidentally terrible as C's build systems, but still do something to drive users towards what the language designers have decided is best practice, as they have with many other parts of the language.

In this case Rust didn't make that decision, but I think they could have (and maybe should have).

Forcing reduced size dependency trees

Posted Sep 19, 2025 12:25 UTC (Fri) by farnz (subscriber, #17727) [Link]

Arguably, that's what tools like cargo vet are for. An interested organisation (say Debian, or the FSF, or Google, or CENELEC) can set up a URL that lets you grab their current approved list of dependencies, along with their audit criteria, and then tell you things like "if you want this to be in the main archive, you need to meet 'debian-main' criteria for dependencies" or "we require that new dependencies for the Chrome build system meet our 'safe-to-deploy' audit criteria".

This tames the "massive dependency tree" by requiring that you either audit your dependencies yourself (and publish an audits.toml that documents this audit), or that you import someone else's audit of your dependencies. It still allows people who don't care to have a massive dependency tree, of course.

Should C++ be deprecated?

Posted Sep 22, 2025 22:09 UTC (Mon) by marcH (subscriber, #57642) [Link] (3 responses)

> But think about the implications of this for a moment. You would be arguing, essentially, that a "good" language ought to have a "bad" build system in order to force developers to do things the way you would prefer instead of the way that they would prefer.

That sounds like: a "good" language ought to make using raw pointers a "bad "experience in order to force developers to do things the (memory-safe) way you would prefer instead of the (unsafe) way that they would prefer.

Could not resist sorry (and thanks to excors https://lwn.net/Articles/1038755/)

In this day and age of massive supply chain attacks, things like "cargo vet" are critical. I have no idea whether "cargo vet" is the best solution and I don't even have a strong opinion on "massive dependency trees". But for sure there has to be _some_ sort of SBOM constraints to force most developers not to do things the way they prefer, which is: let AI write some code that imports random, orphaned open-source libraries and go home sooner.

(I hope no one replies with "Just train, police and manage your developers" which is the "mythical workplace" argument)

Should C++ be deprecated?

Posted Sep 23, 2025 8:11 UTC (Tue) by taladar (subscriber, #68407) [Link] (1 responses)

Your supply chain argument (whether through AI or otherwise) doesn't really work since a large dependency project with lots of committers is, if anything, more vulnerable to someone slipping in some random code in a place that none of the maintainers know very well, than a bunch of small dependencies.

As for unmaintained dependencies, that is why we have the RUSTSEC announcements about unmaintained libraries, along with cargo-deny or similar tooling. Of course our method of detecting when a dependency is unmaintained could be improved, but that is inherently still better than pretending a large dependency is maintained when really the code base is 50% maintained and 50% code nobody has looked at for years.

Should C++ be deprecated?

Posted Sep 23, 2025 15:16 UTC (Tue) by marcH (subscriber, #57642) [Link]

Did you click reply on the wrong comment? I reread my comment and I can't find anything looking like "approach A is more vulnerable to supply chain attack than B" (as you just affirmed without any substantiation).

I only wrote that supply chain attacks are intense and not treated seriously enough yet. IMHO, today's most important question is not where they are most likely to come from. It's what the best defense is. Ideally, that defense would be effective wherever they come from.

Should C++ be deprecated?

Posted Sep 23, 2025 15:53 UTC (Tue) by farnz (subscriber, #17727) [Link]

In terms of SBOM constraints and the Rust ecosystem, I see one essential tool, and two things competing for "long term direction".

The essential tool is cargo deny, which gives you three vital features (plus checking SPDX licensing tags):

  1. You can block known vulnerable or unmaintained dependencies, so that you're not accidentally using something that's definitely bad, or that isn't being looked after.
  2. You can ban specific dependencies, or certain versions of dependencies, so that you can stop people pulling in multiple libraries for the same task, or ban versions that you know don't work well with your codebase. This can be done as an allowlist of things you're letting in, or a denylist of things you do not want.
  3. You can check that all dependencies come from a known-good source, rather than letting people point you at a random hosting site. This can also be used to prevent people pointing you at a random repo on a big hosting site like GitLab or BitBucket.

On top of that, you also want some functionality to at least let you distinguish "the code in this dependency has been audited by a trustworthy party" from "we're using this because it works, and we need to audit it before release". cargo vet does that with explicitly configured lists of trusted audits (and no transitive trust), cargo crev does that via a web of trust setup.

I have no particular bias towards either tool; both look like they could be made to work, and which one ends up preferred depends on details of what you're doing and how you determine who to trust.

Should C++ be deprecated?

Posted Sep 18, 2025 7:09 UTC (Thu) by taladar (subscriber, #68407) [Link] (9 responses)

Let's be honest. Dependency counts are just smaller, and individual dependencies often huge, in C and C++ because their build systems make it painful to create new projects and to include many projects, not because it is a good idea to put everything into one huge dependency.

Should C++ be deprecated?

Posted Sep 18, 2025 13:55 UTC (Thu) by LtWorf (subscriber, #124958) [Link] (8 responses)

You are right that this is why there are fewer libraries in use. The end result, though, is that the libraries that are in use tend to be better known, better tested, and easier to vet.

Should C++ be deprecated?

Posted Sep 18, 2025 15:11 UTC (Thu) by farnz (subscriber, #17727) [Link] (7 responses)

Fewer libraries in use does not imply better tested or easier to vet - IME, it's quite the opposite.

What you care about is that each functional component you use is well-tested and easy to vet. If each functional component is in its own library, this is relatively simple; you can see the tests and the changes to the component, and confirm that you're happy about the way they're maintained.

With big libraries that bundle multiple functional components together (such as Qt, which has 12 "Essentials" and 47 "Add-ons" that make up Qt), it gets harder; the library as a whole may be well-tested and well maintained, but that's of no use to you if the bits you care about are untested and barely maintained beyond regular updates to reformat to current house style.

Smaller libraries tend to have fewer functional components to them; it's thus more likely that if you generalise from the state of the library as a whole to the state of the component you care about, you'll get it right. This is trivially true for libraries with a single component (the state of the library and the component are the same thing), and tends to remain true when they have a small number of interesting components.

And my experience is that people vetting that a library is well-tested and functional tend not to do deep-dives into sub-components; they will assume that Qt is a single thing, and therefore if Qt GUI is well-tested and good quality code, the chances are high that Qt CoAP is as good, despite the fact that they're separate components, and there's quite possibly no overlap between the engineers who work on Qt GUI and Qt CoAP.

Should C++ be deprecated?

Posted Sep 18, 2025 15:46 UTC (Thu) by pizza (subscriber, #46) [Link] (3 responses)

> What you care about is that each functional component you use is well-tested and easy to vet.

You're focusing solely on the technical side of things.

One's regulatory/compliance/etc burden grows linearly with the number of unique components, and the effort due to component complexity is usually dwarfed by a large fixed baseline overhead.

(As perhaps an example of this: $dayjob-1 required separate paperwork and mfg/batch tracking for each unique type/size of *screw*. Because when placed into a 3T magnetic field with >1MW gradient pulses... even a tiny screw will become a deadly projectile.)

Should C++ be deprecated?

Posted Sep 18, 2025 16:18 UTC (Thu) by farnz (subscriber, #17727) [Link] (2 responses)

IME, regulated spheres don't care whether you get 100 functional components from one library, or whether you use 100 libraries each with one component - they want you to do the compliance burden for each component you use, not for each supplier.

You don't get to avoid doing the paperwork for each unique type/size of screw by saying "they're all from The Phillips Screw Company"; you have to do paperwork for each unique type/size anyway. Something similar, IME, applies in software - just because it's all "Qt" doesn't mean that you can avoid doing the paperwork for each component you use.

Should C++ be deprecated?

Posted Sep 22, 2025 23:05 UTC (Mon) by marcH (subscriber, #57642) [Link] (1 responses)

> IME, regulated spheres don't care whether you get 100 functional components from one library, or whether you use 100 libraries each with one component - they want you to do the compliance burden for each component you use, not for each supplier.

Not sure what the exact extent of "regulated spheres" is, but here at $BIGCORP there is definitely some amount of per-supplier work. How could the compliance process not care about the supplier at all?

> You don't get to avoid doing the paperwork for each unique type/size of screw by saying "they're all from The Phillips Screw Company"; you have to do paperwork for each unique type/size anyway

You can at least copy/paste the supplier information; that's much less work than researching 100 different suppliers.

Should C++ be deprecated?

Posted Sep 23, 2025 8:18 UTC (Tue) by farnz (subscriber, #17727) [Link]

For some regulated processes, the compliance process does not care about the supplier at all. Instead, we have to show that every line of code available to the build system is (a) approved by a named employee of the company using the code, and (b) that there is a process to ensure that no changes are made to that code without a named employee of the company using the code approving it. This implies that when we update a dependency, we're having to take a diff between the two versions, and audit the changes line-by-line as well as in context, just as you do for new code from within the company.

And 100 small libraries does not have to imply 100 suppliers - Qt, for example, is 59 libraries from one supplier. And because it's 59 libraries, instead of having to review all of Qt if we pull it into the regulated system, we only have to review the Qt libraries we use - maybe 2 or 3, instead of 59.

Should C++ be deprecated?

Posted Sep 19, 2025 0:51 UTC (Fri) by mathstuf (subscriber, #69389) [Link] (2 responses)

Another thing is that larger projects require non-linearly larger software process and infrastructure. Testing a focused 1k line library is fairly trivial. Testing 10 of them is still reasonable. On the other hand, figuring out testing for multi-million-line projects is a completely separate endeavour because you have, all at once:

- a large test suite
- that takes a long time to run
- lots of interconnected bits, so a small change can affect oodles of tests
- desire to sequence contributions without completely linearizing them (merge trains)

So you end up with things like coverage-based test selection, CI sharding, notification rules (CODEOWNERS), cache management, machine wrangling, etc. at the directory level instead of the project level, where the forges tend to be *way* more focused.

Of course, if you want to add some piece to your software process, applying it to a single project repo is *way* easier than applying it to dozens of them. But I feel that software process upcycles are a slim margin in the overall churn a software project sees (whether monolithic or separate, monorepo or multirepo).

Large libraries versus small ones

Posted Sep 19, 2025 10:42 UTC (Fri) by farnz (subscriber, #17727) [Link] (1 responses)

How you perceive that depends on where in the chain you are, too.

As a downstream consumer, if I need to vet 10M lines of code (LOC), I need to vet 10M LOC; it doesn't particularly help me if those 10M LOC are in 2 libraries of 5M LOC each, nor does it help me if they're in 10,000 libraries of 1k LOC each. I still have to vet the lot, and confirm that all 10M LOC are tested to my standards (whatever those are).

My upstreams, however, benefit from splitting into smaller libraries, for all the reasons you state; it's rare for anyone to make a single change that affects all 10M LOC in one go, and thus you want to get all the gains of being in smaller libraries.

Qt is a great example here; it's split into many smaller pieces that are independent, precisely because of the pain you point out. That also means that if I use Qt in a project, I'm not auditing "one library", I'm auditing the N subsets of Qt that I use.

The bigger deal is sharing audits among groups; things like cargo vet and crev help with the technical side of this, but the social side is a much harder nut to crack.

Large libraries versus small ones

Posted Sep 19, 2025 11:17 UTC (Fri) by smurf (subscriber, #17840) [Link]

> Qt is a great example here; it's split into many smaller pieces that are independent

For some value of "independent", anyway.

While you can just grab the pieces you want (within limits), *updating* just the pieces that need new fun[ctions] and leaving the rest to their 10-year-old splendor ('cause that's when you vetted them, and if it ain't broken …) is not going to cut it. (Consider libboost as an extreme example of this.)

Of course, dependency heck isn't limited to Qt or Boost … but truly independent libraries tend to be more explicit about which versions of their dependencies they require than a more-or-less-explicit "get me whichever version of libfoo that was current as of 2025-09".

Should C++ be deprecated?

Posted Sep 17, 2025 14:04 UTC (Wed) by nim-nim (subscriber, #34454) [Link]

That does mean that when organisations weigh new software investments they are less likely to reinvest in old C++ codebases.

The whole point of creating “safer” C++ profiles/derivatives is to convince CEOs to pour more money into the C++ sink, because the proposal says new parts will be written in the “safe” derivative and the financier is strongly encouraged to believe that will eventually make the whole safe (eventually as in when pigs fly, but no one who wants to keep his job will say so openly).

Should C++ be deprecated?

Posted Sep 17, 2025 9:53 UTC (Wed) by farnz (subscriber, #17727) [Link] (7 responses)

How do you deprecate C++, though?

Inside a company, it's manageable, as long as you can get enough engineers who will do the same quality work or better in other languages - the mandate is "no new C++, or you get fired", and you're done.

But outside companies, you can't do that - how do you stop me writing code in private in C++? How do you stop me sharing C++ code with other people? How do you stop other people seeing my C++ code and reusing it?

If safety turns out to matter to people, I'd expect that C++ is on the Fortran road; it's not going to die out completely, but it's going to be less and less used outside of a few niches.

Should C++ be deprecated?

Posted Sep 17, 2025 10:11 UTC (Wed) by Wol (subscriber, #4433) [Link] (2 responses)

> How do you deprecate C++, though?

If companies and governments start banning it, then Universities will stop teaching it.

If C++ featuring high on your CV is a turn-off rather than a turn-on, students won't choose to learn it.

Once that happens, programmers will stop writing personal C++ because they aren't using it as their main language at work.

And C++ will join the legacy languages such as Cobol and Fortran, where maintenance programmers are paid fortunes (we wish) to keep old code running.

Cheers,
Wol

Should C++ be deprecated?

Posted Sep 18, 2025 14:10 UTC (Thu) by LtWorf (subscriber, #124958) [Link] (1 responses)

Do you have any good replacement for Qt then? Because until you do I don't think that's going to happen.

And Qt is just an example out of many.

Should C++ be deprecated?

Posted Sep 18, 2025 14:55 UTC (Thu) by smurf (subscriber, #17840) [Link]

https://blog.logrocket.com/state-rust-gui-libraries/ lists 11 GUI libraries … presumably one of them will serve whatever your particular usecase is.

Frankly, Qt is an arcane and very-difficult-to-debug mess (and so is GTK). Proof: start just about any nontrivial program in the console and watch the warnings scroll by.

That being said, bindings for Rust/Qt do exist. Assuming they're written well, they can go quite some way towards safe-ing your GUI code. You don't need to replace 100% of your code all at once, after all. A hundred 1%-sized steps work just as well.

Should C++ be deprecated?

Posted Sep 17, 2025 10:45 UTC (Wed) by excors (subscriber, #95769) [Link] (3 responses)

Deprecating doesn't mean banning. It means saying "we think you shouldn't use this, and we're probably going to put less effort into supporting this in the future".

Anybody can say that, and you're always free to ignore them. It only really matters when the person saying it is a trusted expert and/or has some power over you. (If they're an expert, you can assume they've got good reasons for saying you shouldn't use that feature, and you don't need to waste time working through the whole rationale yourself to come to the same conclusion. If they have power, e.g. they can influence whether new compilers are going to be compatible with your old software, then their rationale doesn't matter and you should consider following their advice now to avoid some compatibility pain in the future. But you can still choose to ignore them, and accept the consequences.)

Standards bodies have both expertise and power, so it matters when they say something is deprecated. But things can also be deprecated by community consensus, or by a company's policies, or by an individual developer, etc. Any of them can say "we think you shouldn't use C++, and we're going to stop writing specifications/tools/documentation/etc for C++ and stop using C++ libraries in our applications and stop contributing to C++ projects", and it doesn't matter that not everyone will agree - it all contributes to a gradual shift away from C++.

Should C++ be deprecated?

Posted Sep 17, 2025 12:41 UTC (Wed) by farnz (subscriber, #17727) [Link] (2 responses)

But to deprecate C++ globally means getting most of the trusted experts worldwide to say that C++ is deprecated; I don't see a path through to that in the near future, because the people we'd need to deprecate C++ in any significant fashion are currently backing C++.

In the long run, things like the EU's Cyber Resilience Act are going to push in this sort of general direction, by stopping commercial entities from treating security as an externality, but that's a very slow process.

Should C++ be deprecated?

Posted Sep 17, 2025 16:26 UTC (Wed) by smurf (subscriber, #17840) [Link] (1 responses)

> to deprecate C++ globally means getting most of the trusted experts worldwide to say that C++ is deprecated

Not necessarily. If the government entity responsible for the standards organization that certifies your certified-and-thus-expensive access control system, esp. its compliance with regulations and whatnot, says "C++ is deprecated", this directly translates to requiring extra justification/scrutiny when you renew said certification, the number of C++ experts who say that C++ is fine notwithstanding.

Should C++ be deprecated?

Posted Sep 17, 2025 16:38 UTC (Wed) by farnz (subscriber, #17727) [Link]

Most of the standards I'm aware of don't care about language in use - they won't ever deprecate C++ as a result. Instead, they have the notion of a "qualified compiler", and if your compiler is qualified and you meet the caveats of that qualification, then you can do your certification at source level, instead of binary level.

Ferrocene is an example of a qualified compiler; you'd use the Project Documents to determine whether you're meeting the qualification requirements for this compiler; in this case, there's a set of constraints in the Safety Manual which tell you what the caveats are for Ferrocene.

You might see the qualification caveats for your C++ compiler get gradually more stringent, which might have the effect of making you deprecate C++ (especially if they start to conflict with "custom and practice" in the wider C++ community), but that's the most certification is likely to lead to.

Hard truth

Posted Sep 17, 2025 6:50 UTC (Wed) by Wol (subscriber, #4433) [Link] (1 responses)

Umm ...

Then could you have weak and strong profiles - preferably with a pragma that says which applies to which module?

So a weak pragma will simply check that module, and say "this code does nothing silly, can't vouch for what it calls". A strong pragma would be your Java checked exception - "this code does nothing silly, nor do the routines it calls".

It would take a long time to percolate through, but then places could have a policy that every time you modify code, you apply the weak pragma, make that work, and then see if the strong pragma works too. So it's basically niggling you forward one module at a time?

Cheers,
Wol

Hard truth

Posted Sep 18, 2025 1:55 UTC (Thu) by mathstuf (subscriber, #69389) [Link]

> So it's basically niggling you forward one module at a time?

First, one needs to migrate to modules for this to be applicable.

Hard truth

Posted Sep 17, 2025 9:41 UTC (Wed) by farnz (subscriber, #17727) [Link]

There is a solution to that which has already been rejected, unfortunately, since it also acts as "add an annotation to make it possible to make a call to legacy code".

You'd have a special type of block that goes around items, saying that everything inside this block, including imported modules, can be used from safe code without an annotation at the point of use, even though it is neither checked nor safe. Because it covers everything inside the block, it covers everything brought in by a #include; because it's a special type of block with its own marker, it's possible to write tooling around the use of these blocks that requires review by senior developers.

Hard truth

Posted Sep 17, 2025 8:24 UTC (Wed) by danielthompson (subscriber, #97243) [Link] (1 responses)

> Apparently the C++ guys don't want memory-safe functions to be only
> allowed to call other memory-safe functions. Which is absolutely
> fundamental to Rust.

Curiously, Rust moved even further on this for the 2024 edition: it is no longer assumed that unsafe functions should have unsafe bodies. At this point it is only an on-by-default warning, but even within an unsafe function the tools now strongly encourage the use of the unsafe keyword before accessing the superpowers.

https://doc.rust-lang.org/nightly/rustc/lints/listing/all...
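
A minimal sketch of what the lint asks for (the function is made up):

    // In the 2024 edition the body of an unsafe fn is treated like safe
    // code, so the raw-pointer dereference needs its own unsafe block or
    // the unsafe_op_in_unsafe_fn lint fires.
    unsafe fn read_byte(ptr: *const u8) -> u8 {
        // SAFETY: the caller promises `ptr` is non-null, aligned, and
        // points to a live byte.
        unsafe { *ptr }
    }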

Hard truth

Posted Sep 17, 2025 22:55 UTC (Wed) by NYKevin (subscriber, #129325) [Link]

This is because it never made much semantic sense to allow unsafe functions to do unsafe things in their bodies in the first place. The unsafe keyword has two entirely different meanings:

- As a specifier on the function signature, it is part of the API, and signals that the function has nontrivial safety preconditions which are not enforced by the compiler. By convention, these preconditions are listed in the function's doc comment under the "Safety" header.
- As a block of code, it represents an acknowledgement that one or more operations inside of that block have safety preconditions which are not enforced, and a promise by the developer to uphold those preconditions manually. By convention, the block is commented with a SAFETY comment explaining why the applicable preconditions hold at that point in execution.

Because the compiler does not know what precondition the programmer is promising to uphold with any given unsafe block, nor the exact preconditions any given unsafe function or other unsafe operation* might require, it has no way of checking one against the other. It is therefore not unheard of for programmers to deliberately make all unsafe blocks as small as possible. If only one operation is wrapped in unsafe, then you only have to worry about the preconditions of that one operation, and will not accidentally make unrelated promises about constructs you wrongly believed were safe. Fortunately, Rust treats unsafe blocks (and blocks in general) as expressions, so you can pinpoint the unsafe operation you want to allow and leave the rest of the expression in safe code, if you so desire.

By allowing unsafe functions to do unsafe things in their bodies without a separate unsafe block, older Rust editions inappropriately conflated these two meanings. More importantly, they made it much harder to minimize the amount of code that falls within an unsafe block.

* To be pedantically correct, the compiler knows full well that a raw pointer needs to point at a live allocation of the correct type, if you want to dereference it, and the compiler has similar knowledge of most of the other unsafe superpowers. But the compiler does not know about all the state surrounding that raw pointer, so (for example) it does not know that the pointer is always valid for reads when some flag is set, or any more complicated variation of that pattern. You can write a wrapper which enforces such an invariant in safe code, but the implementation of that wrapper ultimately still needs to use unsafe internally. There is no workaround for this, because the whole point of unsafe is to function as an escape hatch for things the compiler does not understand.
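
A short sketch of those two conventions together (types and names are made up): the unsafe fn documents its precondition under a "Safety" header, and the safe caller keeps the unsafe block as small as possible, using it as an expression around just the one unchecked call.

    /// Returns the first element without bounds checking.
    ///
    /// # Safety
    /// The caller must guarantee that `data` is non-empty.
    unsafe fn first_unchecked(data: &[u8]) -> u8 {
        // SAFETY: forwarded to the caller via the contract above.
        unsafe { *data.get_unchecked(0) }
    }

    fn first_or_zero(data: &[u8]) -> u8 {
        if data.is_empty() {
            return 0;
        }
        // SAFETY: the emptiness check above satisfies the precondition of
        // `first_unchecked`; only this one call sits inside the block.
        unsafe { first_unchecked(data) }
    }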

Carbon is probably a fine solution for Google

Posted Sep 16, 2025 20:15 UTC (Tue) by warrax (subscriber, #103205) [Link]

I think they've always marketed it as that and nothing more.

Leaky Interoperability

Posted Sep 17, 2025 1:58 UTC (Wed) by hmanning77 (subscriber, #160992) [Link] (4 responses)

I'm really curious how Carbon will play out. As mentioned in the talk, C++ already provides the same sort of evolutionary pathway for C code, but in practice I haven't seen that work out. Sometimes people will take the time to wrap the C API in std::unique_ptrs and std::vectors, but just as often the use of C types and functions starts leaking out into the C++ code instead. It's fine, in theory, to say that we should just enforce better standards. In practice people have lots to do and little time available. Perhaps Carbon will find ways to discourage the temptation to write "just a little C++ to make this work"?

Leaky Interoperability

Posted Sep 18, 2025 7:16 UTC (Thu) by taladar (subscriber, #68407) [Link] (2 responses)

Personally I think the "C++ is the successor of C" idea was only ever marketing and nobody really treated it that way in practice, so in essence it is just failed marketing.

Leaky Interoperability

Posted Sep 18, 2025 14:56 UTC (Thu) by smurf (subscriber, #17840) [Link] (1 responses)

Well, once upon a time there was this thing called "cfront" which transcoded C++ to C …

Leaky Interoperability

Posted Sep 18, 2025 18:38 UTC (Thu) by ejr (subscriber, #51652) [Link]

And then came the battle over template instantiation methods and the ODR...

Leaky Interoperability

Posted Sep 22, 2025 23:19 UTC (Mon) by marcH (subscriber, #57642) [Link]

> It's fine, in theory, to say that we should just enforce better standards. In practice people have lots to do and little time available. Perhaps Carbon will find ways to discourage temptation to write "just a little C++ to make this work"?

Yes, it's all about enforcement. One of the "easiest" ways is to tie bonuses to the "little C++" percentage. You can also block releases until that percentage falls under some target thresholds - exactly like any other quality metric. You can also inflict more mandatory review, test coverage, process overhead and what not on that percentage - making life with the "little C++" miserable.

There are plenty of ways - use your imagination. But they all require a strong, top-down push from management. That push exists in some technical enough companies. That safety push could be enough to make Carbon successful - exactly like it's been making Rust successful.

Who employs the speaker BTW? :-)

Any crossover with Sutter's cpp2?

Posted Sep 17, 2025 15:31 UTC (Wed) by Karellen (subscriber, #67644) [Link] (1 responses)

I wonder if any inspiration or ideas were taken from Herb Sutter's cpp2 alternate C++ syntax/language?

They look to be working in a similar conceptual space, to a certain degree.

Any crossover with Sutter's cpp2?

Posted Sep 17, 2025 21:58 UTC (Wed) by ajb (subscriber, #9694) [Link]

That reminds me of a similar proposal years ago from Ben Werther and Damian Conway: https://dl.acm.org/doi/pdf/10.1145/240964.240981
Well, similar in that they are both new syntax bindings for C++. Not sure if the actual syntaxes bear any resemblance.


Copyright © 2025, Eklektix, Inc.
This article may be redistributed under the terms of the Creative Commons CC BY-SA 4.0 license
Comments and public postings are copyrighted by their creators.
Linux is a registered trademark of Linus Torvalds