Should C++ be deprecated?
Posted Sep 18, 2025 15:11 UTC (Thu) by farnz (subscriber, #17727)
In reply to: Should C++ be deprecated? by LtWorf
Parent article: Comparing Rust to Carbon
Fewer libraries in use does not imply better tested or easier to vet - IME, it's quite the opposite.
What you care about is that each functional component you use is well-tested and easy to vet. If each functional component is in its own library, this is relatively simple; you can see the tests and the changes to the component, and confirm that you're happy about the way they're maintained.
With big libraries that bundle multiple functional components together (such as Qt, which is made up of 12 "Essentials" and 47 "Add-ons"), it gets harder; the library as a whole may be well-tested and well-maintained, but that's of no use to you if the bits you care about are untested and barely maintained beyond regular updates to reformat to the current house style.
Smaller libraries tend to have fewer functional components to them; it's thus more likely that if you generalise from the state of the library as a whole to the state of the component you care about, you'll get it right. This is trivially true for libraries with a single component (the state of the library and the component are the same thing), and tends to remain true when they have a small number of interesting components.
And my experience is that people vetting that a library is well-tested and functional tend not to do deep-dives into sub-components; they will assume that Qt is a single thing, and therefore if Qt GUI is well-tested and good quality code, the chances are high that Qt CoAP is as good, despite the fact that they're separate components, and there's quite possibly no overlap between the engineers who work on Qt GUI and Qt CoAP.
Posted Sep 18, 2025 15:46 UTC (Thu) by pizza (subscriber, #46) [Link] (3 responses)
You're focusing solely on the technical side of things.
One's regulatory/compliance/etc. burden grows linearly with the number of unique components, and the effort that scales with a component's complexity is usually dwarfed by a large fixed per-component overhead.
(As a perhaps extreme example of this: $dayjob-1 required separate paperwork and mfg/batch tracking for each unique type/size of *screw*, because when placed into a 3T magnetic field with >1MW gradient pulses... even a tiny screw will become a deadly projectile.)
Posted Sep 18, 2025 16:18 UTC (Thu) by farnz (subscriber, #17727) [Link] (2 responses)
IME, regulated spheres don't care whether you get 100 functional components from one library, or whether you use 100 libraries each with one component - they want you to carry the compliance burden for each component you use, not for each supplier.

You don't get to avoid doing the paperwork for each unique type/size of screw by saying "they're all from The Phillips Screw Company"; you have to do the paperwork for each unique type/size anyway. Something similar, IME, applies in software - just because it's all "Qt" doesn't mean that you can avoid doing the paperwork for each component you use.
Posted Sep 22, 2025 23:05 UTC (Mon) by marcH (subscriber, #57642) [Link] (1 responses)
Not sure what the exact extent of "regulated spheres" is, but here at $BIGCORP there is definitely some amount of per-supplier work. How could the compliance process not care about the supplier at all?
> You don't get to avoid doing the paperwork for each unique type/size of screw by saying "they're all from The Phillips Screw Company"; you have to do paperwork for each unique type/size anyway
You can at least copy/paste the supplier information; that's much less work than researching 100 different suppliers.
Posted Sep 23, 2025 8:18 UTC (Tue) by farnz (subscriber, #17727) [Link]
For some regulated processes, the compliance process does not care about the supplier at all. Instead, we have to show that every line of code available to the build system is (a) approved by a named employee of the company using the code, and (b) covered by a process that ensures no changes are made to that code without a named employee of the company approving them. This implies that when we update a dependency, we're having to take a diff between the two versions, and audit the changes line-by-line as well as in context, just as you do for new code from within the company.

And 100 small libraries does not have to imply 100 suppliers - Qt, for example, is 59 libraries from one supplier. And because it's 59 libraries, instead of having to review all of Qt if we pull it into the regulated system, we only have to review the Qt libraries we use - maybe 2 or 3, instead of 59.
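Under a regime like the one described here, vetting a dependency bump mechanically reduces to reviewing the diff between the already-approved tree and the candidate tree. A minimal shell sketch of that step, with placeholder paths and file contents (the library name and versions are invented):

```shell
# Produce the diff that a named reviewer must approve before the bump lands.
# Directories and file contents here are stand-ins for real vendored trees.
mkdir -p /tmp/vet/libfoo-1.4.2 /tmp/vet/libfoo-1.5.0
printf 'int add(int a, int b) { return a + b; }\n' > /tmp/vet/libfoo-1.4.2/foo.c
printf 'long add(int a, int b) { return (long)a + b; }\n' > /tmp/vet/libfoo-1.5.0/foo.c

# diff exits non-zero when the trees differ, so guard it for `set -e` shells.
diff -ru /tmp/vet/libfoo-1.4.2 /tmp/vet/libfoo-1.5.0 > /tmp/vet/libfoo-bump.diff || true

# The size of this diff, not the size of the library, is the audit burden
# for the update.
wc -l < /tmp/vet/libfoo-bump.diff
```

The point of the sketch: once a version is approved, later approvals scale with the size of the change, which is what makes small, slowly-changing libraries cheap to keep in a regulated build.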
Posted Sep 19, 2025 0:51 UTC (Fri) by mathstuf (subscriber, #69389) [Link] (2 responses)
A big monolithic project tends to mean:

- a large test suite that takes a long time to run
- lots of interconnected bits, so a small change can affect oodles of tests
- a desire to sequence contributions without completely linearizing them (merge trains)

So you end up with things like coverage-based test selection, CI sharding, notification rules (CODEOWNERS), cache management, machine wrangling, etc. at the directory level instead of the project level, where the forges tend to be *way* more focused.

Of course, if you want to add some piece to your software process, applying it to a single project repo is *way* easier than applying it to dozens of them. But I feel that software process upcycles are a slim margin in the overall churn a software project sees (whether monolithic or separate, monorepo or multirepo).
Large libraries versus small ones
Posted Sep 19, 2025 10:42 UTC (Fri) by farnz (subscriber, #17727) [Link] (1 responses)
How you perceive that depends on where in the chain you are, too.

As a downstream consumer, if I need to vet 10M lines of code (LOC), I need to vet 10M LOC; it doesn't particularly help me if those 10M LOC are in 2 libraries of 5M LOC each, nor does it help me if they're in 10,000 libraries of 1k LOC each. I still have to vet the lot, and confirm that all 10M LOC are tested to my standards (whatever those are).
My upstreams, however, benefit from splitting into smaller libraries, for all the reasons you state; it's rare for anyone to make a single change that affects all 10M LOC in one go, and thus you want to get all the gains of being in smaller libraries.
Qt is a great example here; it's split into many smaller pieces that are independent, precisely because of the pain you point out. That also means that if I use Qt in a project, I'm not auditing "one library", I'm auditing the N subsets of Qt that I use.
The bigger deal is sharing audits among groups; things like cargo vet and crev help with the technical side of this, but the social side is a much harder nut to crack.
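For reference, cargo vet records shareable audits in a supply-chain/audits.toml file, and a "delta" audit certifies only the diff between two already-reviewed versions. A minimal sketch of the format (the crate name, auditor, and version numbers here are invented):

```toml
# supply-chain/audits.toml (sketch; names and versions are hypothetical)

[[audits.some-crate]]
who = "Jane Hacker <jane@example.com>"
criteria = "safe-to-deploy"
version = "1.4.2"
notes = "Full audit of the initial import."

[[audits.some-crate]]
who = "Jane Hacker <jane@example.com>"
criteria = "safe-to-deploy"
delta = "1.4.2 -> 1.5.0"
notes = "Reviewed only the diff for the version bump."
```

Other projects can then import these audits instead of re-reviewing the same code; the files are easy to exchange, but deciding whose certifications to trust is exactly the social problem mentioned above.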
Large libraries versus small ones
Posted Sep 19, 2025 11:17 UTC (Fri) by smurf (subscriber, #17840) [Link]
For some value of "independent", anyway.
While you can just grab the pieces you want (within limits), *updating* just the pieces that need new functionality and leaving the rest to their 10-year-old splendor ('cause that's when you vetted them, and if it ain't broken …) is not going to cut it. (Consider libboost as an extreme example of this.)
Of course, dependency heck isn't limited to Qt or Boost … but truly independent libraries tend to be more explicit about which versions of their dependencies they require than a more-or-less-explicit "get me whichever version of libfoo that was current as of 2025-09".