Java/C# vs Rust
Posted Aug 17, 2024 3:45 UTC (Sat)
by Cyberax (✭ supporter ✭, #52523)
In reply to: Java/C# vs Rust by khim
Parent article: Standards for use of unsafe Rust in the kernel
gcc was not much better. It did not support member templates until 2.95, released in 1999. In 1995 it did not even support type deduction properly, relying instead on "guiding declarations": https://support.sas.com/documentation/onlinedoc/ccompiler... - you had to match the types explicitly, as the compiler couldn't deduce a common type.
> Um. That's how Turbo Pascal 4+ did it starting in 1987, and how Ada did from day one in 1983.
They don't have to do monomorphisation, which requires keeping pretty much the whole program in RAM.
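To make the distinction concrete, here is a minimal sketch in today's Rust (illustrative only; the function names are made up and nothing like this existed in 1995). A generic function is monomorphised into a separate copy per concrete type, which is why the compiler wants every instantiation in memory at once, while dynamic dispatch compiles one shared body, closer to what the Turbo Pascal and Ada implementations could get away with:

    use std::fmt::Display;

    // Monomorphised: the compiler emits one specialised copy of this
    // function for every concrete T it is ever called with.
    fn show_static<T: Display>(value: T) {
        println!("{value}");
    }

    // Dynamic dispatch: a single compiled body serves every Display type,
    // at the cost of an indirect call through a vtable.
    fn show_dynamic(value: &dyn Display) {
        println!("{value}");
    }

    fn main() {
        show_static(42);        // instantiates show_static::<i32>
        show_static("hello");   // instantiates show_static::<&str>
        show_dynamic(&42);      // the same machine code serves
        show_dynamic(&"hello"); // both of these calls
    }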
Posted Aug 17, 2024 4:13 UTC (Sat)
by khim (subscriber, #9252)
> They don't have to do monomorphisation, which requires keeping pretty much the whole program in RAM.

Which means that it would have been Rust with Extended Pascal/Ada-like generics (which would, most likely, have evolved into Swift-like generics later). I think we are arguing about different things: you say that Rust in exactly the form it took in 2015 couldn't have existed in 1995, while I say that a Rust with all the properties that matter for safety was easy to create with 1995 technology.

It wouldn't have been competitive with what we have today, but it would have been about as fast as Java was in 1995 (Java wasn't a speed demon back then by any means), and it could have evolved, over time, into a safe language usable for low-level things like the Linux kernel, too. But Java had better marketing, and it also promoted the write once, run anywhere myth, so it was chosen. And we had to wait 20 years for something safer than what we had in 1981.
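For the record, "Swift-like generics" here means generics that are type-checked once, against declared bounds, at the definition site, rather than re-checked at every instantiation the way C++ templates are. Rust ended up with the same property; a minimal sketch (illustrative only, in modern Rust syntax):

    use std::ops::Add;

    // The body is checked once, against the declared bound, before any
    // caller exists -- Ada, Swift, and Rust generics share this property,
    // while C++ templates are only checked when instantiated.
    fn sum_pair<T: Add<Output = T>>(a: T, b: T) -> T {
        a + b
    }

    fn main() {
        println!("{}", sum_pair(1, 2));     // T = i32
        println!("{}", sum_pair(0.5, 1.5)); // T = f64
        // sum_pair("a", "b") would be rejected at the call site:
        // &str does not implement Add.
    }

The commented-out line is the point: the error surfaces against the declared bound, not as a page of instantiation backtraces.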
Posted Aug 17, 2024 10:28 UTC (Sat)
by ralfj (subscriber, #172874)
Rust is building on a bunch of academic Programming Languages work that just wasn't done yet in the 90s. For instance, it has taken a lot of good ideas from Cyclone.

So independent of computing resources, I think it's unlikely something like Rust could have happened in 1995.
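One concrete piece of that lineage: Cyclone's region annotations survive in Rust as lifetimes, which let the compiler prove that a returned reference cannot outlive the data it borrows from. A minimal sketch in modern Rust (illustrative only; the function is made up):

    // Cyclone attached region names to pointers; Rust's lifetimes play the
    // same role. The 'a here says the result borrows from `x`, so the
    // compiler rejects any caller that frees the source string too early.
    fn first_word<'a>(x: &'a str) -> &'a str {
        x.split_whitespace().next().unwrap_or("")
    }

    fn main() {
        let s = String::from("hello world");
        let w = first_word(&s);
        println!("{w}"); // fine: `s` is still alive here
    }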
Posted Aug 17, 2024 12:04 UTC (Sat)
by khim (subscriber, #9252)
> So independent of computing resources, I think it's unlikely something like Rust could have happened in 1995.

Oh, sure. Rust in 1995 could only have been a reality if the IT industry had picked the "write code so simple there are obviously no bugs in it" way of resolving the software crisis. In reality the crisis was resolved via "write code so complex that there are no obvious bugs in it", which, of course, made the creation of Rust in 1995 impossible. It's ironic that around that time (in 1996, to be exact) Hoare wrote his article expressing satisfaction that this approach was seemingly working.

And, of course, when everyone was too busy piling layer upon layer of scotch tape and baling wire, there weren't enough people left to do the research that could have given us Rust in 1995. We needed two more decades to realize that while the "pile up layers of scotch tape and baling wire until our creations stop crashing every few hours" approach can produce something useful, it does not work in the face of an adversary.

Thus yes, Rust wasn't a possibility in 1995, but not because of hardware; social issues prevented its creation. Everyone was too busy inventing snake-oil solutions that would magically make programs correct even when the people writing them had no idea what they were doing. But hardware? Hardware was perfectly "kinda ready" for it: Java was insanely heavy and slow by the standards of 1995, and Rust would have been a pig, too, but Rust could have become fast once computers advanced enough for more optimizations to become possible, just as happened with Java.

That's why, yes, the C#/Java craze was a gigantic waste of time and resources, but also, probably, an inevitable one. The world needed those trillion-dollar losses to admit that this was the wrong turn; before that happened (as even Hoare himself noted) it looked as if enough layers of scotch tape and baling wire might fix everything.
Posted Aug 17, 2024 17:22 UTC (Sat)
by Cyberax (✭ supporter ✭, #52523)
I disagree. Java in particular has shown that large and complicated software can be written in memory-safe languages. This was not at all a given in the 90s.
And of course, the Java ecosystem struggled a lot to formulate best practices.
If anyone wants to be amazed by over-engineering, just look at the EJB 1.0 standard. But even with all of its over-engineering, EJB containers like WebLogic or JBoss pioneered some of the best practices that we use even now: artifact-based deployments, monitoring and metrics, centralized logging, and even a notion of proto-containers (WARs and EARs). All starting back in 1998-1999.
Over time, the bad parts were discarded, and good parts were promoted. It provided a great learning experience for the whole industry. Would it have been better if the industry had magically gotten all this foreknowledge back in 1999 and avoided painful diversions into AbstractFactoryFactoryBean territory? Sure. Could it have happened this way? Not a chance.
Posted Aug 18, 2024 8:43 UTC (Sun)
by khim (subscriber, #9252)
> Over time, the bad parts were discarded, and good parts were promoted.

And it's just a sheer coincidence that the "bad parts" are exclusive to Java but the "good parts" are not? Really?

> EJB containers like WebLogic or JBoss pioneered some of the best practices that we use even now

Pioneered? In what sense? They raised hype around things invented by others, that's all. Syslog doesn't need Java, and chroot was invented before it, too. IBM did remote monitoring for decades before Java was ever invented, and Google's Borg never needed or used Java (it supported it, though), with Sawzall existing outside of Java, too. Lisp machines already existed, and they had shown all of that before Java was even dreamed of.

I couldn't name a single goal that Java set before itself and delivered better than some other language. Even cross-platform development today is mostly happening in JavaScript/TypeScript, not in Java. C#/Java failed on all the goals they set out to deliver (the initial goal was to replace C/C++, remember? The JavaStation was supposed to run only Java applications, and Avalon was supposed to replace Win32, not complement it). C#/Java were a pure waste of time and extremely disruptive for the companies that embraced them: Sun is dead and Microsoft has lost much of its influence, because the time they wasted on unattainable goals was used by others to leapfrog them. If you recall the behaviour of the "old" Microsoft, then this result is 100% positive, and it's true that only C#/Java could have achieved it, but somehow I seriously doubt it was intended.

It would be really funny if Rust did, eventually, replace C/C++, because its developers have never embraced that as a goal. There are lots of jokes about "rewrite it in Rust", and some fanboys even say that Rust has to replace C/C++, but the actual developers are realists and have always designed Rust for perpetual coexistence with C/C++. On the contrary: Java and C# were designed for a world where everything (except some low-level components) is managed code, all development happens within the confines of the JVM/CLR (basically a commercialization of the Lisp machine concept), and all software is rewritten in managed code. That vision failed utterly and miserably and only consumed countless resources.

You may point to the success of Android, but if you read interviews with Andy Rubin, you'll see that Android embraced Java not because of its wonderful properties, but simply because when Android was made there were lots of Java developers. If the Java detour had never happened, he would have picked something else (Apple picked Objective-C because macOS uses it, and it worked well for them). Ultimately the only thing the C#/Java detour taught us is that a world without managed code is viable, while a world with only managed code is not. Anyone with two brain cells could have predicted that on day one, but Google has made a third attempt, which is, as expected, falling apart before our eyes.

> Could it have happened this way?

Not in a world obsessed with the attempt to make "write code so complex that there are no obvious bugs in it" work, sure. Java is very much a symptom, not the disease. The disease is the naïve belief that you can replace competence with tools. Planes that are losing doors and AbstractFactoryFactoryBeans come from the same root. And yes, when you travel that road, C#/Java happens naturally. I just wish we had stopped and realized that this is the wrong road to travel before so many resources were wasted.
Posted Aug 18, 2024 12:27 UTC (Sun)
by pizza (subscriber, #46)
> I couldn't name a single goal that Java set before itself and delivered better than some other language.

A *single* goal, no. But these goals didn't exist in isolation, and until Java came along, nothing else put all of those "single goals" together in a way that was both accessible at the entry level (in part due to running on commodity systems) and useful at the high end.
Posted Aug 18, 2024 12:47 UTC (Sun)
by khim (subscriber, #9252)
> But these goals didn't exist in isolation, and until Java came along, nothing else put all of those "single goals" together

It looks like an attempt to draw a target around the spot where the arrow hit. Of course, if you spend enough time reframing Java's achievements, you can always find a way to define them such that Java did something first. But the premise was to achieve something quite concrete; billions (if not trillions!) of dollars were spent trying to deliver it, and none of the achievements that may last needed these things.

Sure, Java has shown that you can develop things in managed code, but Lisp machines did that before Java. Java has shown that you can write portable code, but the AS/400 did that before. None of the achievements Java can show are new, and the ones that are new are completely unrelated to the things Java was supposed to achieve.

It's like those solar and wind power stations or electric cars: sure, they advanced certain narrow fields significantly, but is the damage they did to the world economy (and, ironically, to the world ecology) worth it? That question is debatable, but the fact remains that the original promise of "self-sustainable development" wasn't achieved and won't be achieved on this path. I suspect it will be "achieved", in the end, via sleight of hand, when nuclear power is declared "green" too, and then everyone will be happy about how well "green" power works while completely forgetting decades of investment into a dead end.
Posted Aug 18, 2024 12:58 UTC (Sun)
by pizza (subscriber, #46)
> None of the achievements Java can show are new, and the ones that are new are completely unrelated to the things Java was supposed to achieve.

Congratulations, you just demonstrated my point.
Lisp machines and AS/400s were about as far removed from commodity systems as it could get, and were effectively unobtanium for mere mortals.
...Unlike Java, which you could obtain for low-to-zero cost and could run on the systems you already had.
Like it or not, Java's particular combination of features and accessibility changed the entire trajectory of the industry.
Posted Aug 18, 2024 13:35 UTC (Sun)
by khim (subscriber, #9252)
> Lisp machines and AS/400s were about as far removed from commodity systems as it could get, and were effectively unobtanium for mere mortals.

And how is that a bad thing? They were solving problems that either are not needed by mere mortals or are not solved by Java. Java managed to [ab]use managed code but failed to achieve the thing that managed code is really good for: forward compatibility. While the AS/400 doesn't even give you the ability to execute native code, it's not uncommon for a Java program to ship with its own version of the JRE because it may misbehave with any other version, and conversion between .NET 1.x, 2.x, and .NET Core is pretty damn non-trivial.

Thus, in the end, C# and Java achieved the intermediate goals those exotic systems reached in pursuit of worthwhile ends, yet failed to achieve anything worthwhile outside of the hype train. Isn't that my original point? Java diverted the industry, made it waste trillions of dollars on a mirage that never materialized, sent it into a dead end, and now we will need to spend more trillions of dollars to undo all that damage. Hardly something to celebrate.
Posted Aug 18, 2024 13:58 UTC (Sun)
by pizza (subscriber, #46)
...So billions of lines of inherently memory-safe code deployed onto commodity systems never happened?
Posted Aug 18, 2024 14:55 UTC (Sun)
by khim (subscriber, #9252)
> ...So billions of lines of inherently memory-safe code deployed onto commodity systems never happened?

Of course they happened! Billions of lines of code already written in various memory-safe languages (from COBOL and Clarion to Visual Basic and FoxPro) were, with great pain, rewritten in two other, more resource-hungry memory-safe languages. Yes, those older languages never used "managed code" or "tracing garbage collection", but they were perfectly memory safe and they worked perfectly fine.

No matter how you look at it, the whole thing looks like a net negative to me: we haven't gotten any tangible benefit from that rewrite (although some industries got rich, that's true, but that's like burning the house to heat a stew), code that before this "grand revolution" was written in memory-safe languages is still written in memory-safe languages (only now with lots more unneeded complexity), and code written in non-memory-safe languages continues to use non-memory-safe languages.
Posted Aug 19, 2024 12:48 UTC (Mon)
by pizza (subscriber, #46)
Wait, _rewrite_? Surely you jest.
New stuff only rarely replaces the old stuff; instead it's layered on top. It's turtles all the way down.
And again, it is a simple FACT that Java is vastly more approachable than the stuff it supplanted, and was useful for nearly everything, from deeply embedded stuff [1] to teaching/toy problems [2] to desktop applications [3] to enterprise consultant wet dreams -- all from the same base tooling. That was a *HUGE* change over the former status quo.
Sure, many of its use cases have since been better served with newer stuff. So what? Isn't that the fate of all technology?
[1] Multiple generations of ARM processors ran the Java bytecode natively
[2] Completely supplanting Pascal in introductory programming courses
[3] Including browser applets. Which I don't miss.
Posted Aug 19, 2024 13:39 UTC (Mon)
by khim (subscriber, #9252)
> New stuff only rarely replaces the old stuff; instead it's layered on top. It's turtles all the way down.
> And again, it is a simple FACT that Java is vastly more approachable than the stuff it supplanted

Can you stop contradicting yourself for at least two adjacent sentences?

> [1] Multiple generations of ARM processors ran the Java bytecode natively

Nope. A few ARMv5 CPUs had the ability to run a small subset of Java bytecode. It was used, basically, to run a few games on some phones and for nothing else. Starting from ARMv6, only the "null" implementation of Jazelle is supported. So that's another example of pointless waste (thankfully very limited compared to the damage caused by the larger C#/Java craziness).

> [2] Completely supplanting Pascal in introductory programming courses

Yeah. And also Scheme in some courses. Another negative.

> That was a *HUGE* change over the former status quo.

Where do you see me objecting? Sure, C#/Java caused lots of changes. Almost all of them negative. But you are arguing as if I'm objecting to the magnitude of the change. I'm not! C#/Java caused an absolutely huge negative change. There were also some minuscule positive changes, sure, but compared to the problems the C#/Java craze caused they are hard to even notice.

> Sure, many of its use cases have since been better served with newer stuff.

That's not important. What is important is that almost all use cases are better served by older stuff.

> So what? Isn't that the fate of all technology?

Sure. But C#/Java is different: that's the rare case where a bad technology was replaced with a worse one. I wanted to say it's the only such change, but nope, there are many others like it: solar and wind power plants, electric cars, etc. That has only started happening recently, about a quarter century ago. But, sadly, it's not limited to IT and not limited to C#/Java.

True. But the fact that this was achieved through a temporary disconnect between the feasibility of the technology and the availability of funding doesn't make the change good, and we will pay for that stupidity, and, it looks like, rather sooner than later. Microsoft and Sun have already paid the price, but I doubt it will be limited to that.
Maybe this is enough?
Posted Aug 19, 2024 14:17 UTC (Mon)
by corbet (editor, #1)
So this has gone on for quite some time; I don't think any minds will be changed at this point. Maybe time to wind it down?

Thank you.