Ushering out strlcpy()
Posted Aug 29, 2022 17:07 UTC (Mon) by wtarreau (subscriber, #51152)
In reply to: Ushering out strlcpy() by Wol
Parent article: Ushering out strlcpy()
But you wrote the correct term: "programmer". That person got some training to learn that the machine they were going to program uses discrete numbers, doesn't have infinite memory, and so on. Otherwise it's just a human. Programmers know that no machine can accurately represent Pi. Programmers know that it's extremely dangerous to store numeric identifiers in Javascript because by default they're stored as floats, and if their absolute value is larger than 2^53 they will lose precision and possibly confuse some people, sessions, digital signatures or whatever.

Nowadays a lot of people think that after having hacked in a text area on an online training site they are programmers. And the day they write code like: if (a < b) do_something(); else if (a == b) do_something(); else if (a > b) do_something_else();, they don't understand why there are entire classes of values of a and b for which no function is called. But this is taught in computer courses, which remain an absolute necessity when writing programs aimed at running on real computers. Otherwise, as you're saying, some bugs can put you in jail (or even put someone else in jail).
Processing protocols requires handling them using the types they were designed with. Most of the time the common type is the byte, and protocol elements are just sequences of bytes. In that case it makes sense to be able to handle them as naturally as possible. That's why C strings are heavily used in that area even though they don't shine in efficiency for this specific purpose; they're wonderful for many other cases.
And I really *do* want programmers to care about the machine they're writing code for. The languages can help, the compilers as well, and in that regard C is not your ally with its absurd precedence, annoying shifts and unclear type promotion, but it remains extremely reliable, so much so that it still runs almost the entirety of the internet's core services and operating systems, many of which would probably not exist if the only choices back then had been much more lenient in terms of resource usage, or too constraining in terms of development. Some will say it's not user-friendly; others will say it is user-friendly, it's just selective about who its friends are. My personal feeling is that it could be way better, but I'm not willing to deal with the caprices of another language that constantly puts itself in my way when I'm trying to do something simple for the machine. When I can't do something in C I do it in asm, and I grumble that the interface between the two is ugly and not very transparent (though it improved lately in gcc, which now allows asm blocks to return flags).
> amount of poor design / fire-fighting fixes / technical debt I see is horrendous
That's true, and sadly it seems to accelerate with newer languages, where developers seem to prefer to throw everything away and restart again and again, without rethinking the design. When your language of choice requires a bit of thinking, you become very stingy about your code and you prefer to be certain to get the design right. You end up drawing stuff on paper before writing code, and that significantly helps keep code running well for decades with an acceptable level of maintenance.
