LOL
Posted Jan 8, 2009 3:34 UTC (Thu) by khim (subscriber, #9252)
Parent article: Quotes of the week
I will just note wryly that it used to be that I could compile 0.9x kernels on a 40 MHz 386 machine in 10 minutes. Some 15 years later, it still takes roughly the same amount of time to compile a kernel, even though computers have gotten vastly faster since then. Something seems wrong with that....

Actually it feels exactly right to me: computers are doing the same perceived work in the same time. It just means software and hardware are in sync: new features are not added faster than hardware can cope, yet new features are added so hardware does not sit idle...
Posted Jan 8, 2009 12:12 UTC (Thu)
by ahoogerhuis (guest, #4041)
[Link]
-A
Posted Jan 8, 2009 12:45 UTC (Thu)
by smitty_one_each (subscriber, #28989)
[Link]
Posted Jan 9, 2009 1:50 UTC (Fri)
by giraffedata (guest, #1954)
[Link] (4 responses)
I don't think it happens that way. Hardware makers don't make faster hardware without a pre-existing need for it.
What has happened is that hardware has advanced to meet the needs of the new features, which people want more than a faster compile.
It reminds me of an observation a traffic engineer once made. He found that the optimum speed, from a user utility point of view, on a California freeway was 35 miles per hour. (People claimed they hated driving that slow, but continued getting on the freeway until the speed went below 35). He noted that when he added a new lane to the freeway, the speed remained 35. More people used the freeway.
Posted Jan 9, 2009 3:15 UTC (Fri)
by pr1268 (guest, #24648)
[Link]
It reminds me of an observation a traffic engineer once made. He found that the optimum speed, from a user utility point of view, on a California freeway was 35 miles per hour. (People claimed they hated driving that slow, but continued getting on the freeway until the speed went below 35). He noted that when he added a new lane to the freeway, the speed remained 35. More people used the freeway.

Quoting the late Johnny Carson with respect to highway speed laws: "55 miles per hour? We Californians are changing tires at 55!"
Posted Jan 9, 2009 16:26 UTC (Fri)
by rgmoore (✭ supporter ✭, #75)
[Link] (2 responses)
I think that it works both ways. Some applications (scientific computing, rendering, and high-end games are good examples) have an insatiable demand for computing power. Those applications are always going to give processor manufacturers a market for improved performance, and they're going to do their best to fill it. The economics of processor design means that there's a trickle-down effect, and those high-performance designs will eventually work their way into cheaper and cheaper computers for the rest of the market.
But it's also important to remember that one of the markets for high end computers is software developers who want fast compile times. The developers then wind up targeting their own high-end systems when they design their software. They add cool features that take advantage of their faster machines, and they focus more on development speed than efficiency under the assumption of increasing power. That puts users on the perpetual hardware upgrade cycle.
I think he was misinterpreting his findings. That looks to me like a classic substitution effect. People compare how long it will take them to get to their destination using different modes of transportation. A freeway is a better alternative as long as it's faster than surface streets, on which traffic apparently moved at about 35 mph. It's not that drivers really think 35 mph is OK and are lying to the people who ask them about it. It's just that they need to get where they're going, and there isn't an available alternative that will get them there any faster.
Posted Jan 9, 2009 19:18 UTC (Fri)
by giraffedata (guest, #1954)
[Link] (1 responses)
I must have explained it poorly, because that's just what he said. Except that he knows drivers also substitute trips at less convenient times and trip forbearance for a 30 mph freeway trip.
And his only point about the disparity between people claiming to hate driving 35 mph and what actually happens is that while they hate driving 35 mph, they like it enough to do it, which is all that matters. There are apparently enough people whose cutoff point of hating the freeway enough not to use it is right about 35 that adding new lanes doesn't significantly increase the speed.
I see the same thing in computers. Users accept performance that, to me, is maddeningly slow, so as computers get faster, application developers put in more features and keep it down at that speed.
Actually, I just remembered I quoted the wrong number. 35 mph is what he said yields the greatest freeway capacity, based on the following distance drivers feel comfortable with at various speeds. I can't remember what the figure was for when people stop getting on the freeway. Probably 25. So as more people start using a 70 mph freeway, it absorbs the traffic and slows down steadily until it gets to 35, then crashes to 25, and the number of people getting on stabilizes there.
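(A minimal sketch of that capacity argument, using the textbook Greenshields flow model rather than the engineer's actual data; the free-flow speed and jam density below are assumed round numbers, chosen only so the peak lands at 35 mph.)

# Toy illustration: under the Greenshields model, average speed drops
# linearly from the free-flow speed as the road fills up, and flow
# (cars per hour past a point) peaks at half the free-flow speed.

FREE_FLOW_MPH = 70.0   # assumed speed of an empty freeway lane
JAM_DENSITY = 200.0    # assumed cars per mile in stopped traffic

def flow_at_speed(speed_mph):
    """Cars per hour per lane at a given average speed."""
    # Invert v = v_f * (1 - k / k_j) to get density k from speed v.
    density = JAM_DENSITY * (1.0 - speed_mph / FREE_FLOW_MPH)
    return density * speed_mph    # flow = density * speed

for mph in range(10, 71, 5):
    print("%2d mph: %4.0f cars/hour/lane" % (mph, flow_at_speed(mph)))
# The peak lands at FREE_FLOW_MPH / 2 = 35 mph, which is why an added
# lane simply fills up until speeds sag back to about that point.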
Posted Jan 9, 2009 20:12 UTC (Fri)
by dlang (guest, #313)
[Link]
From a freeway builder's point of view it's most efficient to only spend enough money to get freeway speeds up to 35 mph, as that results in the most cars per $$ spent, but that's not what the users of the freeway want.
Posted Jan 15, 2009 12:15 UTC (Thu)
by forthy (guest, #1525)
[Link] (2 responses)
I agree that something's wrong here. Compare e.g. my Forth system. 20 years ago, it took about one minute to compile on my Atari ST (8 MHz 68k); that was after I managed to speed up the compiler by a factor of 10 (because 10 minutes was unbearably slow). Now it takes 0.3 seconds on a 2 GHz Athlon64, producing about 500k. That's a factor of 200. The total size of the binary expanded by a factor of 4, and there are really more features in it than there were 20 years ago. Compiling the part I used to have on the Atari takes a fourth of the total time, so the overall ratio looks reasonable (a factor of 800 between Atari and Athlon64).

One reason the Linux kernel takes longer today is that GCC became slower over time. The GCC maintainers tell me that this is because it optimizes better. For my own C programs, I still get the best results from 2.95.x (which was the last GCC that compiled reasonably "fast", and which is already dog slow compared to a Forth compiler). So maybe it's not something wrong with the Linux kernel, but with GCC. After all, the size of the Linux kernel and its features (in terms of the amount of supported hardware) increased a lot more since 0.9x than my Forth system did - it was already fully featured 20 years ago, just lacking things nobody would have thought of back then (like UTF-8 support or an X-based GUI).
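(Spelling out the arithmetic behind those factors, using only the numbers quoted above:)

# Back-of-the-envelope check of the compile-time ratios in the comment.
atari_total_s = 60.0      # one minute on the 8 MHz Atari ST
athlon_total_s = 0.3      # the grown system on the 2 GHz Athlon64
old_part_fraction = 0.25  # the Atari-era subset is a fourth of today's build

overall = atari_total_s / athlon_total_s                               # 200x
same_features = atari_total_s / (athlon_total_s * old_part_fraction)   # 800x
print("overall speedup:      %.0fx" % overall)
print("same-feature speedup: %.0fx" % same_features)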
Posted Jan 15, 2009 18:39 UTC (Thu)
by dfsmith (guest, #20302)
[Link] (1 responses)
Posted Jan 16, 2009 18:04 UTC (Fri)
by jch (guest, #51929)
[Link]
What takes time is the optimiser, which takes ages in recent GCC releases.
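(A rough way to see that for yourself: compile the same large C file at several optimization levels and compare wall-clock times. The source file name below is a placeholder; -O0 through -O3 are standard GCC flags.)

import subprocess
import time

SOURCE = "big_translation_unit.c"   # placeholder: any large C file

for opt in ("-O0", "-O1", "-O2", "-O3"):
    start = time.time()
    subprocess.check_call(["gcc", opt, "-c", SOURCE, "-o", "/dev/null"])
    print("%s: %.2f s" % (opt, time.time() - start))
# If the optimiser is the bottleneck, the higher -O levels should take
# noticeably longer than -O0 on the same input.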