Undefined Behaviour as usual
Posted Feb 23, 2024 17:34 UTC (Fri)
by flussence (guest, #85566)
In reply to: Undefined Behaviour as usual by tialaramex
Parent article: Stenberg: DISPUTED, not REJECTED
This describes a fantasy world where C compilers (and perhaps all software) are made by insane villains and actively abuse people for doing things outside what a written standard specifies, and to be blunt, it's just "free speech" advocacy with different inflection. I for one am glad the tech culture of 40 years ago has been largely stomped out by more reasonable people.
Posted Feb 23, 2024 17:37 UTC (Fri)
by mb (subscriber, #50428)
[Link] (11 responses)
Compilers exploiting UB happens all the time. It is the base of all optimizations.
> for doing things outside what a written standard specifies,
UB is by the very definition of UB outside of what the standard specifies.
Posted Feb 23, 2024 18:44 UTC (Fri)
by fw (subscriber, #26023)
[Link] (1 responses)
> Compilers exploiting UB happens all the time. It is the base of all optimizations.
Not really, Java and C# do not have undefined behavior, and yet there are optimizing compilers.
Even for C, it's a rather extreme position to say that register allocation (probably among the top three optimizations to implement in a compiler for current architectures) depends on undefined behavior. For others like constant propagation it's a bit of a stretch, too.
Posted Feb 23, 2024 22:18 UTC (Fri)
by khim (subscriber, #9252)
[Link]
> Not really, Java and C# do not have undefined behavior, and yet there are optimizing compilers.
Java and C# absolutely do have undefined behavior. It's just handled the way Rust handles it: the "safe" language guarantees absence of UB by the compiler, while the "unsafe" part allows one to write programs with UB. Java forces you to write these parts in an entirely different language using JNI, while C# has an unsafe subset, similarly to Rust, but in both cases UB still, very much, forms the basis for all optimizations.
Of course it does: everything in C depends on the absence of undefined behavior. Simply because it's permitted to convert a pointer to function into a pointer to char *. This may break your program if the compiler does register allocation in a different way. And it's not even a theoretical issue! Back in the MS-DOS era register variables could only use si and di (and they weren't used for anything else), and thus it was possible to write code that would [ab]use that in its signal handler. That one may be exploited by finding and changing constants in the compiled code. And that, too, was used back when compilers weren't smart enough to break such tricks.
Posted Feb 24, 2024 12:39 UTC (Sat)
by vegard (subscriber, #52330)
[Link] (7 responses)
> Compilers exploiting UB happens all the time. It is the base of most optimizations.
The first part is true, but the second seems trivially false. Constant propagation does not in any way relate to or rely on UB, yet it is an optimization. Same with tail call optimizations, inlining, even register allocation, just to name a few.
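A minimal C sketch of that point (the function and values are invented for illustration): constant propagation needs no assumptions about UB at all, the compiler only has to track values it already knows.

    int scaled_limit(void)
    {
        int factor = 4;            /* value known at compile time           */
        int limit = factor * 10;   /* compiler can propagate the 4 and fold */
        return limit;              /* ...and simply emit "return 40"        */
    }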
Posted Feb 24, 2024 12:47 UTC (Sat)
by mb (subscriber, #50428)
[Link] (6 responses)
And let me even add something: UB is required for connecting the virtual machine models of the compiler to the real world. Otherwise the virtual machine model would have to *be* the actual machine model. And even then it would still include UB, because actual machines have UB.
Posted Feb 24, 2024 13:21 UTC (Sat)
by vegard (subscriber, #52330)
[Link] (5 responses)
Posted Feb 24, 2024 13:49 UTC (Sat)
by mb (subscriber, #50428)
[Link] (4 responses)
If you are going to rewrite and recompile your Rust program each time the input changes, then the compiler is part of your program flow. The compiler is the "unsafe-block" in this case, which provides the input data. Without it, the Rust program can't process anything. It would be static.
Yes, we can have fully safe languages like Rust. But they must *always* interface to an unsafe part. Otherwise they can't produce results. Safe-Rust alone is useless. General purpose languages will always have an interface to unsafe code.
The real world is unsafe. The real world has UB.
Posted Feb 25, 2024 0:55 UTC (Sun)
by tialaramex (subscriber, #21167)
[Link]
Generality is of course a price we cannot often afford. You wouldn't write the WUFFS compiler in WUFFS (the current transpiler is in Go) or an operating system, or indeed a web browser but the point is that our industry got into the habit of using chainsaws everywhere because they're so powerful, rather than using the right tool for the job.
Posted Feb 26, 2024 7:50 UTC (Mon)
by NYKevin (subscriber, #129325)
[Link] (2 responses)
* "Safe" code (where the compiler will not let you write code that contains UB).
But then this is just a matter of definitions - You can set up your build system in such a way that it will refuse to compile unsafe code without a valid proof of soundness. Then you can consider the proof to be part of the source code, and now your unsafe language, plus the theorem proving language, together function as a safe language.
Formally verified microkernels already exist. The sticking point, to my understanding, is the lack (or at least incompleteness) of a verified-or-safe userland.
Posted Feb 26, 2024 10:02 UTC (Mon)
by farnz (subscriber, #17727)
[Link] (1 responses)
The other sticking point is the difficulty of formal verification using current techniques. Formally verifying seL4 took about 20 person-years to verify code that took 2 person-years to write.
Posted Feb 26, 2024 10:12 UTC (Mon)
by NYKevin (subscriber, #129325)
[Link]
The question is, is that good enough? Most formal verification wants to prove more than "mere" type safety or soundness, and that tends to be hard because the property you're trying to prove is complex and highly specific to the individual application. But if you just want to prove a lack of UB, that's probably more feasible.
* There are soundness bugs in the current implementation of Rust. Most of them are rather hard to trigger accidentally, but they do exist.
Posted Feb 29, 2024 9:01 UTC (Thu)
by anton (subscriber, #25547)
[Link]
> Compilers exploiting UB happens all the time. It is the base of all optimizations.
Nonsense! Compilers don't need to assume that the program does not exercise undefined behaviour in order to, e.g., optimize 1+1 into 2. Actually, assuming that the most troublesome undefined behaviours (e.g., signed overflow or whatever is behind strict aliasing) do not happen has little performance impact.
Posted Feb 23, 2024 19:34 UTC (Fri)
by geofft (subscriber, #59789)
[Link] (29 responses)
I think the traditional description of undefined behavior as "demons fly out of your nose" has done a disservice to understanding it. (After all, if the generated code had access to either demons or your nose, there would be a security vulnerability somewhere in granting that power to your userspace account in the first place. :) )
The point of undefined behavior is not that the compiler is allowed to be lawful-evil about how it interprets your code, and so you have to be paranoid about what it might do. The point is that an optimizing compiler is permitted to assume that you are writing reasonable code that does not require the compiler to be paranoid about what you meant, and so it can make reasonable optimizations on real-world code. And every compiler that people actually use is optimizing. (There is a loose conceptual connection here with speculative execution vulnerabilities: you can avoid them with a non-speculating CPU, but nobody seems to be buying those.)
The code behind CVE-2023-52071 is actually a pretty good example of this:

    WCHAR prefix[3] = {0};
    if (something) {
        DEBUGASSERT(prefix[3] == L'\0');
        more stuff;
    }

While it's pretty obvious to a human reader in context that this is just a typo, it's probably harder for a compiler to distinguish this from, say,

    #ifdef LINUX
    #define NUMBER_OF_THINGS 10
    #else
    #define NUMBER_OF_THINGS 5
    #endif

    thing_t things[NUMBER_OF_THINGS] = {0};
    if (some_function_in_another_file_that_only_succeeds_on_linux()) {
        things[7] = something;
        more stuff;
    }

which, as long as some_function_in_another_file_that_only_succeeds_on_linux() does what it says, never actually invokes undefined behavior. The compiler can notice that the assignment is undefined in the non-Linux case, and instead of doing something villainous, it can do something useful, i.e., assume that the assignment statement cannot be reached and dead-code-eliminate it and everything after it until the closing brace - and then dead-code-eliminate the function call because there's nothing left.
However, in the actual case in the curl source code, the dead-code elimination is actually pretty bad! You do really want that code to execute; the coder's intention was not that the block was skippable. The compiler can do the exact same "useful" action and get you a pretty negative result: the curl command produces no output (I think), but it's returning success anyway. It's not far-fetched to imagine that in turn leading to unexpected data loss. The compiler does not need to be actively evil to cause a real problem.
(Note that what's happening isn't that the compiler is doing something in response to undefined behavior being invoked. The compiler is simply not doing something on the assumption that undefined behavior is never invoked; specifically, it just doesn't compile the block. No real-world compiler has any interest in inserting code to do something weird that it wouldn't otherwise insert. But even so, optimizing out something that shouldn't have been optimized can cause problems - impact not intent and all that.)
Signed overflow being undefined behavior is a little bit silly because a well-intentioned optimizing compiler will only use that optimization for one purpose: to emit whatever the underlying CPU's most efficient arithmetic instructions are to handle the non-overflowing case. On essentially every modern CPU, that's two's-complement wrapping operations, but the historic existence of other CPUs means that the standard wanted to allow optimizing compilers to have a chance on those platforms too. Today it would be reasonable to make it no longer undefined behavior. All the other types of undefined behavior are undefined because there are reasonable optimizations that users actually want their compilers to do. Strict aliasing means that a loop that reads an array behind a pointer doesn't have to reread the pointer each time through, just in case something else in the loop changed it. Data races are undefined so that compilers don't have to use atomic operations or lock prefixes for everything. Buffer overflows are undefined so that there aren't bounds checks inserted everywhere. And so forth.
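As a small illustration of the signed-overflow case (a sketch, not taken from the comment; the function name is invented): because a conforming program never overflows a signed int, an optimizing compiler is allowed to fold this whole test to a constant.

    int always_true(int x)
    {
        /* If signed overflow cannot happen, x + 1 > x holds for every x,
           so GCC and Clang at -O2 typically compile this to "return 1". */
        return x + 1 > x;
    }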
Posted Feb 23, 2024 20:04 UTC (Fri)
by Wol (subscriber, #4433)
[Link] (3 responses)
And this is an untrue generalisation :-)
Admittedly I don't use it much, but a lot of people make great use of it - I guess many of the DataBASIC compilers are not optimising. I know OpenQM isn't. The guy who wrote it is documented as saying the extra complexity involved wasn't worth the candle.
Okay, it's probably still vulnerable to optimisation, because DataBASIC compiles to high-level p-code, which is then processed by an interpreter written in C ... but that ain't an optimising compiler.
Cheers,
Wol
Posted Feb 23, 2024 20:37 UTC (Fri)
by geofft (subscriber, #59789)
[Link] (1 responses)
Posted Feb 24, 2024 5:12 UTC (Sat)
by willy (subscriber, #9762)
[Link]
Posted Feb 24, 2024 7:53 UTC (Sat)
by jem (subscriber, #24231)
[Link]
Or, if the definition of a non-optimizing compiler is that the binary code is a series of fragments that can clearly be used to identify the corresponding parts of the source code, how on earth are you going to formalize this definition?
Posted Feb 23, 2024 21:26 UTC (Fri)
by tialaramex (subscriber, #21167)
[Link] (1 responses)
This does seem to be the expectation of many C++ programmers and I'd assume also C programmers.
It's wrong though. Here's a very easy example, the compiler just constant folded your arithmetic overflow out of existence... https://godbolt.org/z/v4evh3eEG
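The godbolt snippet isn't reproduced here, but the shape of the example is roughly this (names and constants are invented): the multiplication overflows a signed int, yet the compiler folds it at compile time and quietly emits the wrapped constant, so the "overflow" never happens at run time.

    int folded(void)
    {
        int big = 2000000000;
        /* 2000000000 * 2 exceeds INT_MAX; at -O2 the expression is usually
           constant-folded to the two's-complement result (-294967296),
           with no trap and no visible overflow at run time. */
        return big * 2;
    }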
Posted Feb 26, 2024 14:01 UTC (Mon)
by error27 (subscriber, #8346)
[Link]
Posted Feb 23, 2024 22:44 UTC (Fri)
by khim (subscriber, #9252)
[Link]
> Signed overflow being undefined behavior is a little bit silly because a well-intentioned optimizing compiler will only use that optimization for one purpose: to emit whatever the underlying CPU's most efficient arithmetic instructions are to handle the non-overflowing case.
Sigh. I wonder if the "we code for the hardware" guys would ever learn that a "well-intentioned optimizing compiler" is an oxymoron; it simply couldn't exist and doesn't exist. A compiler couldn't have intentions, good or ill. That's simply the basis of the whole compiler theory. That discussion happened more than a decade ago and it's still relevant. And if you think that gcc has, suddenly, become "well-intentioned" simply because GCC12 or GCC13 don't turn that particular example into a pile of goo then you are sorely mistaken: it's only because they have learned to [ab]use SIMD instructions in -O2 mode. Take that away and we are back to square one.
At this point we should stop pretending C and C++ are salvageable, because the trouble with them is social, not technical: even after decades of discussions the "we code for the hardware" crowd is not ready to give up on their dream of a "well-intentioned" compiler, while compiler makers are not even trying to discuss changes in the language which may help these people to produce working programs.
Posted Feb 24, 2024 1:08 UTC (Sat)
by tialaramex (subscriber, #21167)
[Link] (20 responses)
More generally though over the past say five years I've become increasingly comfortable with the "demons fly out of your nose" characterisation despite the fact that yes, technically that specifically won't happen (because demons aren't real). The characterisation is appropriate because it inculcates the appropriate level of caution, whereas the "It will never do anything unreasonable" guidance you prefer reassures C and C++ programmers that they're safe enough when in fact they're in constant danger and _should_ be extremely cautious.
There's an Alisdair Meredith talk which I can't find right now where Alisdair confidently explains that your C++ program cannot delete all the files on a customer's hard disk unless you wrote code to delete all their files - He argues that while sure as a result of UB it might unexpectedly branch to code you wrote that shouldn't normally run, or pass different parameters than you expected; it cannot ever just do arbitrary stuff. This is of course completely untrue, I'm guessing every LWN reader can see why -- but it does make it easier to feel relaxed about having mountains of crappy C++ code. Sure it has "Undefined Behaviour" but that probably just means it will give the wrong answers for certain inputs, right?
If every C and C++ programmer had the "Ralph in danger" meme image on a poster over their desk I'd feel like at least we're on the same page about the situation and they've just got a different appetite for risk. But that's not the world we have.
Posted Feb 24, 2024 1:23 UTC (Sat)
by pizza (subscriber, #46)
[Link] (19 responses)
No, it is categorically true, because "undefined" does not mean "arbitrary and unbounded"
Using your logic, triggering UB means the computer could respond by literally exploding. Ditto if your computer gets hit with a cosmic ray.
If you argue that "no, the computer can't do that because nodody built explosives into it", why can't that argument also be applied to UB arbitrarily deleting your files instead? Sure, both are _possible_ in the sense that "anything is possible" but you're far more likely to have your car hit by a train, an airplane, and a piece of the International Space Station... simultaneously.
Posted Feb 24, 2024 2:03 UTC (Sat)
by tialaramex (subscriber, #21167)
[Link] (18 responses)
For the file deletion situation the usual way this comes up is that bad guys hijack a program (whatever its purpose may have been) to execute arbitrary code (not something it was intended to do, but ultimately not hard to achieve in some UB scenarios as numerous incidents have demonstrated). Then they do whatever they like, which in some cases may include deleting your files (perhaps after having preserved an encrypted "backup" they can sell to you).
Posted Feb 24, 2024 3:12 UTC (Sat)
by pizza (subscriber, #46)
[Link] (17 responses)
Seriously? Calling that the consequence of "undefined behaviour" is beyond farcical, as the _computer operator_ is *deliberately choosing* to delete files.
Just because the operator is unauthorized doesn't make them not the operator.
And "undefined behaviour" is not a requirement for, nor does it necessarily lead to, arbitrary command execution.
Posted Feb 25, 2024 18:28 UTC (Sun)
by tialaramex (subscriber, #21167)
[Link] (16 responses)
Emotionally it's satisfying to insist that you're right and Mother Nature is wrong. But pragmatically the problem is that Mother Nature doesn't care how you feel about it.
And it's going to keep happening until you stop doing the thing that doesn't work, even though you find that emotionally unsatisfying as an outcome.
Posted Feb 25, 2024 23:39 UTC (Sun)
by pizza (subscriber, #46)
[Link] (15 responses)
No, Alisdair and I both claim it can't happen *unless someone intentionally writes code to make it happen*
...It won't happen by pure happenstance. (Which even your contrived script kiddie example demonstrates)
> But pragmatically the problem is that Mother Nature doesn't care how you feel about it
Uh.. there is nothing "natural" about computer software or even computer hardware; they cannot operate in ways that exceed what they were designed to do. But that's neither here nor there; "Mother Nature" doesn't respond to unexpected stimulus in arbitrary ways either; nature has rules that govern how it functions. (Granted, we don't understand many/most of them, but that doesn't mean they don't exist.)
For example, reading an uninitialized memory cell yields "undefined" results. However, in reality (sorry, "Nature") the value will either be 1 or 0. It literally cannot be anything else, because the computer can only register a 1 or a 0 in response to that input -- you won't ever get a value of "0.661233123413" or "blue". So yes, it is "undefined" but it is *bounded*. What happens in response to that? That depends on what that value is used for in the larger system.
Going back to the curl not-a-CVE, when the worst possible outcome is that the user gets access to one byte of data they already had access to, there is no path from that read to "nuke your filesystem" unless curl is being used within a system already designed to nuke your filesystem (or the OS or runtime or whatever was intentionally designed to nuke your filesystem) if you read-out-of-bounds.
Another way of looking at this is that sure, the contents of that extra byte is technically undefined, but so is every other byte in the HTTP response from the server -- including whether or not you get one at all. Similarly, what the server does as a result of you making that request is also undefined and largely outside your control. It could trigger thermonuclear war for all you know. But it won't trigger global thermonuclear war unless someone deliberately gave it those capabilities. In other words, undefined, but *bounded*.
Posted Feb 26, 2024 6:40 UTC (Mon)
by mb (subscriber, #50428)
[Link] (11 responses)
That is not true.
https://en.wikipedia.org/wiki/Hardware_random_number_gene...
The rest of your post is also largely not true. But that has been explained often enough to you, so I won't repeat.
Posted Feb 26, 2024 9:10 UTC (Mon)
by geert (subscriber, #98403)
[Link] (1 responses)
Posted Feb 26, 2024 9:16 UTC (Mon)
by mb (subscriber, #50428)
[Link]
Unless the hardware access is correctly marked as unsafe volatile memory access. Which of course is necessary for every access to the real world (a.k.a. hardware) outside of the language's virtual machine model.
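In C terms, a rough sketch of what "correctly marked" means (the register address is made up): the volatile qualifier tells the compiler the location can change outside the program, so every read really happens and nothing may be assumed about the value it returns.

    #include <stdint.h>

    /* Hypothetical memory-mapped RNG data register; the address is invented. */
    #define RNG_DATA ((volatile uint32_t *)0x40080000u)

    uint32_t read_hw_random(void)
    {
        /* volatile: the load cannot be cached or reordered away, and two
           successive reads cannot be assumed to return the same value. */
        return *RNG_DATA;
    }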
Posted Feb 27, 2024 20:24 UTC (Tue)
by pizza (subscriber, #46)
[Link] (8 responses)
A TRNG does not respond "arbitrarily"; it still can only operate within its design constraints, which of course include the characteristics of the materials it was constructed from. And, while any given read of a TRNG is "undefined", the value is bounded, with each discrete value being equally probable as long as it is used within its designed operating conditions. [1]
It will always return a value between 0.0 and 1.0. [2] It cannot return "Fred" or kill your cat unless you put it into a box with a vial of poison.
...And the physical phenomenon that the RNG is measuring also has to have bounds, or you'd not be able to detect it -- Certain stimuli can make these events more likely (yay, Fission!) but that's just a change in probabilities [3] The point being, they don't respond "arbitrarily". Your Pb isn't going to turn into Au because a butterfly flapped its wings halfway across the world. Either an atom decays or it doesn't. Either an electron crosses the P-N junction or it doesn't.
[1] Several $dayjobs ago, I helped design a TRNG, so I have a decent idea how they work... and when they fail.
[2] Strictly speaking it should also be able to return "Failure"
[3] Which is itself a predictable physical property of the materials, and is taken into account in the TRNG design.
Posted Feb 27, 2024 20:46 UTC (Tue)
by mb (subscriber, #50428)
[Link] (7 responses)
Posted Feb 27, 2024 22:12 UTC (Tue)
by pizza (subscriber, #46)
[Link] (6 responses)
How exactly does a high school physics experiment support your claim that "Mother Nature doesn't respond to unexpected stimulus in arbitrary ways either; nature has rules that govern how it functions" is "not true"? [1]
...This experiment shows that we are still trying to figure out what those rules are, not that they don't exist!
It certainly doesn't change the fact that while any given observation is unpredictable, the probabilities are not. (E.g. you can't predict the decay of an individual atom, but you can accurately predict the overall _rate_ of decay of a mole of them.)
[1] https://lwn.net/Articles/963598/
Posted Feb 28, 2024 10:20 UTC (Wed)
by Wol (subscriber, #4433)
[Link] (5 responses)
That's all, folks.
Cheers,
Wol
Posted Feb 28, 2024 12:19 UTC (Wed)
by Wol (subscriber, #4433)
[Link]
Cheers,
Wol
Posted Feb 29, 2024 14:24 UTC (Thu)
by pizza (subscriber, #46)
[Link] (2 responses)
Yeah, so? That doesn't demonstrate that Nature behaves arbitrarily and doesn't follow rules; it demonstrates that Nature's rules are a lot more complicated than we previously understood.
Posted Feb 29, 2024 18:29 UTC (Thu)
by mb (subscriber, #50428)
[Link]
https://en.wikipedia.org/wiki/Laplace%27s_demon
Laplace's demon is wrong.
The thinking that we just don't know all the rules is wrong.
Nature has inherent randomness and undefined behavior.
Posted Feb 29, 2024 21:08 UTC (Thu)
by Wol (subscriber, #4433)
[Link]
We have classical physics, where everything follows rules and is deterministic.
We have relativity, which iirc is the same.
And then we have quantum, where things happen at the micro level, but the rules only work at the macro level - we have no idea what (if any at all) the deterministic rules are. Especially as the main thing behind quantum seems to be the making of something out of nothing - if we have nothing there to start with, how can there be anything there to apply deterministic rules TO!?
So if quantum is basically nothing, surely it's reasonable to assume the quantum rules are nothing, too :-)
Cheers,
Wol
Posted Feb 29, 2024 21:02 UTC (Thu)
by rschroev (subscriber, #4164)
[Link]
Posted Feb 26, 2024 7:40 UTC (Mon)
by jem (subscriber, #24231)
[Link] (2 responses)
I can easily imagine a memory technology where reading an uninitialized memory cell produces the value 1, and could on the next read (still uninitialized) produce the value 0. If you repeat the process a sufficient number of times you could end up with a mean value of 0.661233123413.
Posted Feb 26, 2024 11:20 UTC (Mon)
by mpr22 (subscriber, #60784)
[Link]
Mmm, delicious sparkling bits. (The client was doing something ill-advised – can't remember whether it was power down or just hard reset – to the system while MLC NOR Flash was being programmed, which is admittedly something a bit worse than just "uninitialized memory".)
Posted Feb 26, 2024 15:04 UTC (Mon)
by farnz (subscriber, #17727)
[Link]
Ordinary DRAM can work that way; the capacitor in the DRAM cell is in one of three states; 0, intermediate, and 1. Intermediate is read out as either 0 or 1, but whether intermediate can stay at intermediate, or ends up forced to 0 or 1 on read depends on details of the DRAM design. Some DRAM designs will force the cell to 0 or 1 state on the first read or refresh, others have a more stochastic process where the cell can stay at intermediate until it's written (which sets the state to either 0 or 1, unconditionally), but may stabilise randomly.
And because this is outside the DRAM specifications (normally - you can get slightly more expensive DRAM ICs which guarantee read stability even without writes), different batches of the same IC may have different behaviours. In practice, you need special cases to observe this, since every refresh cycle counts as a read for the purposes of stabilizing the value, and any cell that's been written to is also stable.
As a result, you need to be depending on reads being stable even if the cell hasn't been written yet, and be reading from the DRAM shortly after it's been powered up, before there's been enough refresh cycles to stabilize its value, and using DRAM that can take several read or refresh cycles to stabilize the cell values. The first is almost certainly a lurking bug in your code (it was in the case I hit, it just took a long time to find and fix it, and the "quick" fix was to buy more expensive DRAM that guaranteed stability while we hunted down the software bug), the second pretty much requires you to be running code directly from flash or ROM, not booting the way a PC does (since the boot sequence takes long enough that you've had many 64ms or shorter refresh cycles during boot), and the third requires you to be unlucky with the specific DRAM ICs you buy.
Posted Feb 24, 2024 11:28 UTC (Sat)
by hsivonen (subscriber, #91034)
[Link]
> Signed overflow being undefined behavior is a little bit silly because a well-intentioned optimizing compiler will only use that optimization for one purpose: to emit whatever the underlying CPU's most efficient arithmetic instructions are to handle the non-overflowing case.
This is not accurate. Compilers use the signed integer overflow UB for assuming that it doesn't happen, which permits mathematical reasoning that's valid in the non-modular integer domain.
Google uses int for loop counters, and they seem to want the optimizations that arise from signed overflow being UB and therefore assumed by the compiler not to happen.
https://google.github.io/styleguide/cppguide.html#Integer...
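A rough sketch of the kind of optimization the style guide is alluding to (the function is invented): with a signed 32-bit counter the compiler may assume the index never wraps, so on a 64-bit target it can widen it once and use straight pointer arithmetic; with an unsigned counter it has to preserve the possibility of wrap-around to zero.

    long sum(const int *a, int n)
    {
        long total = 0;
        /* "int i" cannot overflow in a conforming program, so the compiler
           may treat the loop as running exactly n times, widen i to 64 bits
           once, and vectorize or strength-reduce freely. */
        for (int i = 0; i < n; i++)
            total += a[i];
        return total;
    }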
Posted Feb 24, 2024 18:26 UTC (Sat)
by faramir (subscriber, #2327)
[Link]
https://www.reddit.com/r/roguelikedev/comments/ytlw2f/a_b...
"Then people first started using the #pragma directive in C, its behaviour was implementation-defined. Version 1.34 of gcc, released around 1988/89, mischievously defined the behaviour as "try running hack, then rogue, then Emacs' version of Towers of Hanoi". i couldn't find 1.34 in the gcc archives, but gcc-1.35.tar.bz2 concedes that the directive might in fact be useful:"