Python discusses deprecations
Feature deprecations are often controversial, but many projects find it necessary, or desirable, to lose some of the baggage that has accreted over time. A mid-November request to get rid of three Python standard library modules provides a case in point. It was initially greeted as a good idea since the modules had been officially deprecated starting with Python 3.6; there are better ways to accomplish their tasks now. But, of course, removing a module breaks any project that uses it, at least without the project making some, perhaps even trivial, changes. The cost of that is not insignificant, and the value in doing so is not always clear, which led to a higher-level conversation about deprecations.
Module removal
Victor Stinner posted a notice to the python-dev mailing list saying that he wanted to remove asyncore, asynchat, and smtpd from the standard library for Python 3.11 (due in October 2022). His justification noted that there are better alternatives (asyncio for the first two and aiosmtpd, available on PyPI, for the last). As can be seen in the links, all of the modules suggested for removal are documented to be deprecated and have been since Python 3.6 was released in 2016. But Stinner also noted that the DeprecationWarning for those modules was only emitted at run time starting with Python 3.10, which drew an objection from Petr Viktorin:
According to the policy, the warning should be there for *at least* two releases. (That's a minimum, for removing entire modules it might make sense to give people even more time.)
Viktorin is referring to PEP 387 ("Backwards Compatibility Policy"), which says that features cannot be removed "without notice between any two consecutive releases". Stinner interpreted that to mean the documentation notice (for 4 releases) was sufficient, but Viktorin and others did not agree. As can be seen in the bug tracking the removal, which goes back to 2016, Stinner added a note about the removal plan and, after getting several approvals from other core developers, merged that change on November 15.
But Brett Cannon thought that the removal was not in keeping with the PEP, thus premature. He noted that the steering council (which he is a member of) can grant exceptions, so Stinner opened an issue with the council to resolve the question. On December 6, the council issued its ruling that the removals should not be done for Python 3.11, so Stinner duly reverted the change.
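To make the 3.10 change concrete (this sketch is ours, not something from the thread): per the article, using one of the affected modules only started producing a runtime DeprecationWarning in Python 3.10, and that warning is easy to miss because, by default, DeprecationWarning is only displayed for code running directly in __main__. A project that wants to catch such uses early can tighten the warnings filter, either in code as below or with "python -W error::DeprecationWarning":

    import warnings

    # Turn every DeprecationWarning into an error so that deprecated imports
    # and calls fail loudly in the test suite instead of being silently ignored.
    warnings.simplefilter("error", DeprecationWarning)

    import asyncore  # emits a DeprecationWarning on Python 3.10 and later,
                     # which the filter above turns into an exception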
Wider context
During that discussion, Viktorin started another thread about the more general question of deprecations for the language. It was motivated, in part, by rebuilds the Fedora project was doing using the alpha versions of Python 3.11. He pointed to a few examples of problems that arise from names that have been removed from standard library modules under the deprecation policy. He wondered:
Are these changes necessary? Does it really cost us that much in maintainer effort to keep a well-tested backwards compatibility alias name, or a function that has a better alternative?
I think that rather than helping our users, changes like these are making Python projects painful to maintain. If we remove them to make Python easier for us to develop, is it now actually that much easier to [maintain]?
He also said that his complaint was that the PEP guideline was meant as a minimum of two releases, but it is often treated as "wait two releases and remove". He was not talking about "code that's buggy, insecure, or genuinely hard to maintain", but that routinely removing names of various sorts makes it harder for Python-based projects to keep up. He asked if there might be a need for an alternative formulation:
If deprecation now means "we've come up with a new way to do things, and you have two years to switch", can we have something else that means "there's now a better way to do things; the old way is a bit worse but continues to work as before"?
Stinner pointed out that all of the examples Viktorin cited had been deprecated (and emitted a warning) for far longer than two cycles (the most recent was deprecated in Python 3.5). Viktorin acknowledged that, but said: "as far as I know, they haven't really caused problems in all that time".
Stinner described some of the problems that arise from leaving deprecated functions in the code base. They cause users and developers to do some extra thinking when they are encountered: "Why is it still there? What is its purpose? Is there a better alternative?". The answers to those questions are sometimes only found in the bug tracker or Git repository. Beyond that, some deprecations are being done to prod developers into using better alternatives; as Stinner mentioned for the removals he wanted to do, deprecated code is effectively unmaintained:
Open issues in asyncore, asynchat and smtpd have been closed as "wont fix" because these modules are deprecated. These modules are basically no longer maintained.
But Viktorin said that he is not arguing against the removal "if something's an attractive-looking trap", but there are other types of changes that basically boil down to the core Python developers enforcing their opinions on names and functionality:
Python users want to write commits that either bring value, or that are fun. Mass-replacing "failUnless" with "assertTrue" just because someone decided it's a better name is neither. Same with a forced move to the latest version of a function, if you don't use the bells and whistles it added.
Stinner had said that those who are using deprecated functions are building up technical debt: "An application using multiple deprecated functions will break with a future Python version." But Viktorin said that, in some cases, that is the fault of the core developers:
But "will break with a future Python version" just means that people's code breaks because *we break it*. If we stopped doing that (in the simple cases of name aliases or functions that are older but not dangerous), then their code wouldn't break.
Christopher Barker said that keeping old names around "forever" was not a good solution, but that name changes like the one for failUnless should not be made simply because someone liked the new name better. Removing the older versions of the names does have beneficial effects, Serhiy Storchaka said, since typos that land on the older valid names can persist in a code base without being noticed:
It is so easy to make a typo and write assertEquals instead of assertEqual or assertRaisesRegexp instead of assertRaisesRegex. Tests are passed, and warnings are ignored. Then I run tests with -Werror to test new warnings and get a lot of unrelated failures because of PRs merged at last half-year. I am very glad that these long time ago deprecated aliases are finally removed.
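The failure mode Storchaka describes is easy to reproduce (a minimal sketch, not code from the thread): the deprecated alias assertEquals, with a trailing "s", quietly keeps working as long as DeprecationWarning is ignored. Running the tests with something like "python -W error::DeprecationWarning -m unittest" is what finally surfaces it, and once the alias is removed, as it has been in the Python 3.11 development tree per Storchaka's comment, the same call fails with an AttributeError instead:

    import unittest

    class ExampleTest(unittest.TestCase):
        def test_addition(self):
            # Typo'd alias: assertEqual() is the intended spelling. The test
            # still passes, emitting only an easily-ignored DeprecationWarning.
            self.assertEquals(2 + 2, 4)

    if __name__ == "__main__":
        unittest.main()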
Increasing the window
Some kind of tool that helped maintainers make these simple substitution changes would be a nice addition, Stephen J. Turnbull said. Barker agreed but noted that a big problem is trying to support applications across multiple versions of the language, some of which have the new feature and some of which have to use the older name.
But do we need to support running the same code on 3.5 to 3.10? I don't think so. If you can't upgrade Python to a supported version, you probably shouldn't upgrade your code or libraries.
Which is a thought — maybe the policy should be that we remove things when the new way is usable in all supported versions of Python. So as of today (if I'm correct) anything needed in 3.5 can be dropped.
That idea, in a somewhat different guise, was suggested by Eric V. Smith, who was also in favor of tools to assist maintainers:
[...] I think a useful stance is "we won't remove anything that would make it hard to support a single code base across all supported python versions". We'd need to define "hard", maybe "no hasattr calls" would be part of it.
Reliable tools to make the migration between versions would help, too.
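The kind of "transition code" Smith's "no hasattr calls" criterion is aimed at looks something like this (an illustrative sketch, not something from the thread): a code base that has to straddle versions on both sides of a change ends up probing for the new spelling and falling back to the old one, here around asyncio.run(), which only appeared in Python 3.7:

    import asyncio

    if hasattr(asyncio, "run"):            # Python 3.7 and later
        run = asyncio.run
    else:                                   # fallback for 3.6
        def run(coro):
            loop = asyncio.get_event_loop()
            try:
                return loop.run_until_complete(coro)
            finally:
                loop.close()

    async def main():
        print("hello")

    run(main())

Under the policy being proposed, a project could simply wait until every Python version it supports has asyncio.run() and never write the shim at all.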
Cannon restated Smith and Barker's ideas in a more concrete form:
I think Eric was suggesting more along the lines of PEP 387 saying that deprecations should last as long as there is a supported version of Python that lacks the deprecation. So for something that's deprecated in 3.10, we wouldn't remove it until 3.10 is the oldest Python version we support. That would be October 2025 when Python 3.9 reaches EOL and Python 3.13 comes out as at that point you could safely rely on the non-deprecated solution across all supported Python versions (or if you want a full year of overlap, October 2026 and Python 3.14).
I think the key point with that approach is if you wanted to maximize your support across supported versions, this would mean there wouldn't be transition code except when the SC [steering council] approves of a shorter deprecation. So a project would simply rely on the deprecated approach until they started work towards Python 3.13, at which point they drop support for the deprecated approach and cleanly switch over to the new approach as all versions of Python at that point will support the new approach as well.
That is a direction Viktorin would like to see; if the PEP were to change along those lines (as Cannon suggested might be a path forward), he would rather see some recognition that there is a cost to these kinds of changes, while also retaining some flexibility:
I'm not looking for a contract, rather a best practice. I think we should see Python's benign warts as nice gestures to the users: signs that we're letting them focus on issues that matter to them, rather than forcing them to join a quest for perfection. If a wart turns out to be a tumor, we should be able to remove it after the 2 years of warnings (or less with an exception). That's fine as a contract. But I don't like "spring cleaning" -- removing everything the contract allows us to remove.
Ensuring more perfect code should be a job for linters, not the interpreter/stdlib.
Terry Reedy noted that the change to yearly releases also shortened the minimum deprecation period from three years to two, so the idea of waiting until a replacement is available in all supported Python versions makes sense. Looking to the future:
Python is nearly 30 years old. I am really glad it is not burdened with 30 years of old names. I expect someone reading this may write some version of Python 50 years from now. I would not want [them] to have to read about names deprecated 60 years before such a time.
Jeremiah Paige mentioned a tool that might be used to help maintainers going forward: pyupgrade. It is a tool to upgrade code to newer versions of Python, but does not cover some of the cases that Viktorin mentioned. That could presumably be fixed with code contributions to the project. Stinner suggested a collaboration between the Python core developers and the pyupgrade developers to ensure that deprecations are added to the tool as things are being removed from the language.
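As a rough sketch of what such a tool does (this is a toy, not pyupgrade itself): a checker only needs to walk the syntax tree and flag deprecated spellings so that they can be mechanically replaced with the modern ones; a tool like pyupgrade goes further and rewrites the files in place. The alias table below uses renames mentioned in the discussion:

    import ast
    import sys

    # A few of the renamed unittest methods mentioned in the thread; a real
    # tool would carry a much longer table of deprecations.
    ALIASES = {
        "failUnless": "assertTrue",
        "assertEquals": "assertEqual",
        "assertRaisesRegexp": "assertRaisesRegex",
    }

    def report(path):
        tree = ast.parse(open(path).read(), filename=path)
        for node in ast.walk(tree):
            if (isinstance(node, ast.Call)
                    and isinstance(node.func, ast.Attribute)
                    and node.func.attr in ALIASES):
                print("%s:%d: use %s() instead of %s()"
                      % (path, node.lineno,
                         ALIASES[node.func.attr], node.func.attr))

    if __name__ == "__main__":
        for path in sys.argv[1:]:
            report(path)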
The deprecation problem is something that crops up for Python and other projects with some frequency. There is a desire to remove old cruft (or "dead batteries") from the language, but there is obviously a balance to be struck. Other projects, the Linux kernel being a famous example, generally never deprecate anything unless it can be shown that it is no longer in use—or actively harmful. While there was some agreement with Viktorin in the thread, there were others who seemed intent on removing things as soon as reasonably possible. Whether that turns out to be longer than the minimum of two releases remains to be seen.
Index entries for this article:
Python: Deprecation
Posted Dec 8, 2021 23:11 UTC (Wed)
by dvdeug (guest, #10998)
[Link] (29 responses)
Whereas I like the idea that I can take code from 4.4BSD and compile it for modern systems with (at worst) a minimum of changes. If there's an old Python program you want to run, something that does what you want that was tossed onto Sourceforge many years ago and never updated, the idea that the only way to run it is by loading an ancient version of Debian in a VM is not a pleasing one.
Posted Dec 9, 2021 0:13 UTC (Thu)
by gtb (guest, #3978)
[Link] (3 responses)
How do you feel about building yourself an ancient version of Python, installing it as `python2` (or `python1` even), and running the ancient Python code under it? Ancient Python interpreters are written in ANSI C, which all modern C compilers still support. So why wouldn't that work? In other words, I question the premise of your complaint that "loading an ancient version of Debian in a VM" is "the only way".
Posted Dec 9, 2021 0:23 UTC (Thu)
by Cyberax (✭ supporter ✭, #52523)
[Link] (1 responses)
And then struggle with module paths, because Python modules in /usr will not be compatible with older Python. Or your PYTHONPATH env variable might interfere.
Running ancient Debian in a VM/container is actually a pretty good way to get a working environment.
Posted Dec 9, 2021 2:55 UTC (Thu)
by NYKevin (subscriber, #129325)
[Link]
In short: It magicks up a shell environment in which PYTHONPATH etc. all point to an isolated set of directories which have nothing to do with the system Python. Then you can install whatever you want into that environment, and not have to worry about conflicts with anybody else. It is *not* a full-blown container or VM, or anything nearly as heavy as spinning up a complete instance of Debian.
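For those unfamiliar with the mechanism being described (the comment appears to be about virtualenv and its relatives; the snippet below uses the standard-library venv module and is only an illustration): the environment gets its own interpreter stub and its own site-packages directory, so nothing leaks in from, or out to, the system installation. Note that venv can only create environments for the interpreter it runs under, whereas virtualenv can also target other interpreters installed on the system.

    import subprocess
    import venv

    venv.create("sandbox", with_pip=True)   # build an isolated environment in ./sandbox

    # The environment's interpreter resolves its own prefix and site-packages,
    # independent of the system Python (the path shown is for POSIX systems).
    subprocess.run(["sandbox/bin/python", "-c",
                    "import sys; print(sys.prefix)"], check=True)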
Posted Dec 9, 2021 2:52 UTC (Thu)
by dvdeug (guest, #10998)
[Link]
Also, with deprecations every two releases, python2 is not enough; you may need various versions of python 3.
Yes, there's other solutions. But in many cases, loading an ancient version of Debian in a VM will be the easiest solution.
Posted Dec 9, 2021 1:35 UTC (Thu)
by himi (subscriber, #340)
[Link] (24 responses)
Also, "a minimum of changes" is a pretty ill-defined metric - there aren't that many changes required to bring older python3 code up to current with newer python3, which is what's being discussed here (i.e. not the python2-python3 transition, which is definitely harder).
There's definitely value to being able to use old code reasonably easily, but having to tweak it to make it run shouldn't be that much of an issue. In many cases the best, safest approach would be to rewrite the code entirely to run in the newer environment - that's obviously more work but at least you'll know that it's not going to do really unexpected things.
Posted Dec 9, 2021 2:30 UTC (Thu)
by dvdeug (guest, #10998)
[Link] (23 responses)
A minimum of changes is ill-defined, but BSD4.4 is from 1993 and parts of it date back to 1977. I'm more sympathetic to changes from that than from changes to code from 2008. The less stability with Python 3.0 is kept, the less I can trust that Python code will be reusable at any time in the future.
When a regular user asked me why a program they were using didn't work on their new version of Windows, I should have shrugged and told them they should have been prepared to rewrite it every time they upgraded. I think it was a 16 bit program that didn't run on 64-bit Windows, but whatever the problem was, a specialized program that did the job they needed stopped working and they had to find a replacement. There are times when things must be upgraded, but there's code that just works and as long as it works for at least one user, it's frustrating to break it, and for most users, even programmers, rewriting the code entirely will not be a realistic solution.
Posted Dec 9, 2021 3:08 UTC (Thu)
by NYKevin (subscriber, #129325)
[Link] (22 responses)
Posted Dec 9, 2021 8:31 UTC (Thu)
by eru (subscriber, #2753)
[Link] (13 responses)
>Does 16-bit code run on x86_64 (say) Debian? Because I would be pretty surprised if that were the case...
If it is a MS-DOS program, there are several ways to make that happen: DOSEMU, DOSBOX, VirtualBox, probably also QEMU could be made to load MS-DOS (the last one I have not tried myself). Years ago, I made hacks for running MS-DOS based cross compilers transparently on 64-bit Windows and Linux; these were needed for maintaining very old products, and could not be replaced.
Posted Dec 9, 2021 13:01 UTC (Thu)
by Wol (subscriber, #4433)
[Link]
That was one of IBM's selling points - every upgrade came with a VM for running earlier code - and that VM would run an earlier VM - turtles all the way down.
So your S360 code would be running on an S360 emulator, which would be running on an S370 emulator, which would be running ... on a z800 emulator which would be running on z9000 hardware.
It would be nice if we had something as reliable for x86 ... given the way hardware speed has increased, chances are your program is still running faster on top of the stack of emulators, than it would have done on the original hardware.
Cheers,
Wol
Posted Dec 9, 2021 18:16 UTC (Thu)
by nybble41 (subscriber, #55106)
[Link] (9 responses)
Posted Dec 9, 2021 18:23 UTC (Thu)
by Wol (subscriber, #4433)
[Link] (3 responses)
Cheers,
Wol
Posted Dec 9, 2021 18:45 UTC (Thu)
by NYKevin (subscriber, #129325)
[Link] (2 responses)
Posted Dec 9, 2021 19:15 UTC (Thu)
by Wol (subscriber, #4433)
[Link] (1 responses)
So why am I still running XP so I can carry on playing MS Age Of Empires :-) (although I haven't tried to install it on a later Windows recently, so it might work now ...)
Cheers,
Wol
Posted Jan 3, 2022 10:53 UTC (Mon)
by cpitrat (subscriber, #116459)
[Link]
Posted Dec 11, 2021 4:46 UTC (Sat)
by bartoc (guest, #124262)
[Link] (4 responses)
By the time 64-bit windows came out I think straight emulation was sufficient to run 16-bit apps, and that's a much simpler approach as far as the kernel and userland are concerned.
Posted Dec 11, 2021 15:49 UTC (Sat)
by nix (subscriber, #2304)
[Link] (1 responses)
The processor is capable of it (what doesn't work is real mode, "unreal" mode (i.e. real mode with nonstandard selectors), and v8086 mode). But it does take more work to implement this, mostly due to having to mess about with LDTs some more in ways you don't have to do if all you support is 32-bit protected mode programs. I can entirely believe that MS didn't want to do that.
Posted Dec 14, 2021 0:04 UTC (Tue)
by khim (subscriber, #9252)
[Link]
That's the most likely reason. Note that even pure 32-bit programs which use NtSetLdtEntries wouldn't work on Windows x64, whereas sys_modify_ldt works just fine on the x86-64 Linux kernel.
Posted Dec 13, 2021 3:42 UTC (Mon)
by nybble41 (subscriber, #55106)
[Link] (1 responses)
I've tried it both ways, and I would say that it's easier to work with 16-bit apps running "rootless" via otvdm (even if the CPU is emulated) than inside a DOSBox frame running Windows 3.11. I never was able to get the emulated mouse working reliably, for example, though that probably had something to do with the fact that DOSBox was running on a remote system accessed via Remote Desktop, so it was turning relative mouse events into absolute "tablet" events and then back to mouse events. In any case, while the app did run in DOSBox it was basically unusable for my purpose. With otvdm all the inputs worked normally, the window size wasn't limited, and I could easily access files on both local and mapped network drives (albeit with truncated names).
Any modern 64-bit CPU should be fully capable of emulating the 16-bit environment the app was originally designed for at better than native speed, even without hardware support, so that isn't much of a handicap.
Posted Dec 13, 2021 6:01 UTC (Mon)
by raven667 (subscriber, #5198)
[Link]
Posted Dec 9, 2021 18:43 UTC (Thu)
by NYKevin (subscriber, #129325)
[Link] (1 responses)
Posted Dec 13, 2021 13:06 UTC (Mon)
by eru (subscriber, #2753)
[Link]
Posted Dec 9, 2021 19:48 UTC (Thu)
by dvdeug (guest, #10998)
[Link] (4 responses)
More to the point, so what? The fact is that a working program in use stopped working, frustrating the user, and forcing them to find a replacement. Why does whether Debian does the same thing matter?
Posted Dec 9, 2021 23:02 UTC (Thu)
by NYKevin (subscriber, #129325)
[Link] (2 responses)
The average user does not know how to operate a compiler, so from their perspective, source compatibility is entirely worthless.
> More to the point, so what? The fact is that a working program in use stopped working, frustrating the user, and forcing them to find a replacement. Why does whether Debian does the same thing matter?
So what to your so what. The user wanted to keep using the same software indefinitely, but maintaining existing software (fixing security vulnerabilities, supporting new hardware, etc.) is expensive. Somebody has to pay for that work, and right now, it would seem that nobody is willing to do so, including the user in question. So why should I care if their software stopped working abruptly? More bluntly, what's in it for me?
Posted Dec 10, 2021 3:32 UTC (Fri)
by dvdeug (guest, #10998)
[Link] (1 responses)
Which is a reason the average user doesn't use Linux. In any case, it's not entirely worthless; learning "./configure; make; make install" isn't that hard, and it makes it much easier to find someone who can make the program work.
> So why should I care if their software stopped working abruptly? More bluntly, what's in it for me?
So much for even dreams of "the year of Linux on the desktop". So much for dreams of free software taking over. Such an attitude will even drive away geek users; there's at least one user commenting on this article pointing out this is why they're using Go instead of Python, and comments on this thread have said that Python 2 was ANSI C and thus should still compile; surely the fact that ANSI C was stable and universal was part of the reason Python 2 was written in it.
At a more values-based level, this feels like digital vandalism. Instead of making a world where things work and someone can use a tool until something better comes along, it's making a world where planned obsolescence is built in, where code has to be rewritten regularly, not because it wasn't good, but because people decided to make some arbitrary changes, where everyone has to learn everything anew on a regular basis.
Posted Dec 10, 2021 4:35 UTC (Fri)
by NYKevin (subscriber, #129325)
[Link]
That's capitalism.
> Such an attitude will even drive away geek users; there's at least one user commenting on this article pointing out this is why they're using Go instead of Python,
Exactly. Vote with your wallet. Don't complain about X failing to meet your needs when you could just use Y instead of X.
> comments on this thread have said that Python 2 was ANSI C and thus should still compile; surely the fact that ANSI C was stable and universal was part of the reason Python 2 was written in it.
Sounds to me like Guido was already practicing this even back then.
> At a more values-based level, this feels like digital vandalism.
Nothing has been destroyed, at least on the FOSS side of things. All of the code still exists, is still publicly accessible, and can still be used, if you can only get it to run. Characterizing that as "vandalism" is ridiculous. If you can't get the code to run, that's your problem. This is the social contract on which all FOSS code has been founded ever since the FSF stuck a "THERE IS NO WARRANTY" disclaimer on the GPL.
> where code has to be rewritten regularly, not because it wasn't good, but because people decided to make some arbitrary changes, where everyone has to learn everything anew on a regular basis.
Unfortunately, we live in a society. If those changes were completely worthless, they would hopefully not be made. In most cases, they at least accomplish *something,* even if you personally think that thing is not worthwhile. You're always free to fork and revert, if you really think it's bad enough.
Posted Dec 15, 2021 18:03 UTC (Wed)
by anton (subscriber, #25547)
[Link]
> Linux has been more focused on source-code compatibility than binary compatibility
It seems to me that Linux (the kernel) focusses on binary compatibility for the most part. Given that it does not include a compiler, there is no way for it to provide source-code compatibility.
That said, the Linux developers committed digital vandalism on a.out binaries on the switch to Linux-5.1, based on the excuse that core dumping of running a.out binaries was broken and nobody volunteered to fix it. Given that many systems (mine included) have a core file size limit of 0, core dumping is not an important functionality, certainly much less important than running a binary. Siegfried Pammer heroically implemented a partial user-space solution for this problem, but it's surprisingly hard to actually do that and the result still does not work as well as the kernel did before Linux-5.1.
Posted Dec 19, 2021 23:09 UTC (Sun)
by plugwash (subscriber, #29694)
[Link] (2 responses)
16 bit protected mode code (aka win16) can be run on an x86-64 system without emulation, MS simply couldn't be bothered doing the work to get the "WoW" and "WoW64" compatibility layers to work together. Wine on the other hand *does* support 16 bit code while running on a 64-bit system (though the wine process is 32-bit)
16 bit real mode code (aka DOS) on a 64-bit OS on the other hand does require either emulation or use of the virtualization features.
Posted Dec 20, 2021 6:00 UTC (Mon)
by Cyberax (✭ supporter ✭, #52523)
[Link] (1 responses)
Posted Dec 20, 2021 9:41 UTC (Mon)
by geert (subscriber, #98403)
[Link]
Posted Dec 9, 2021 0:19 UTC (Thu)
by IanKelling (subscriber, #89418)
[Link] (1 responses)
Indeed. If the function/class/whatever were replaced with a raise DeprecatedError("explanation") it would be a lot less annoying to figure out what to do.
Posted Dec 9, 2021 10:28 UTC (Thu)
by kleptog (subscriber, #1183)
[Link]
Posted Dec 9, 2021 0:41 UTC (Thu)
by willy (subscriber, #9762)
[Link] (6 responses)
Source code here: https://commons.m.wikimedia.org/wiki/User:Jorge_Stolfi/ma...
It was written about twelve years ago and is now completely illegible to a modern Python version. I don't know what the goals of the Python project are, but it's not creating a legacy of code that will work for a reasonable length of time.
Posted Dec 9, 2021 17:30 UTC (Thu)
by pj (subscriber, #4506)
[Link] (5 responses)
Posted Dec 9, 2021 19:17 UTC (Thu)
by NYKevin (subscriber, #129325)
[Link] (4 responses)
* {buc,ovf}_has_{key,value} are supposed to be boolean, but they are initialized with zero instead of False in some places, and in others they are initialized with objects which appear to be mutable (e.g. op.keys, which is a list in Python 2, and a read-only "view" in Python 3). This gives them sometimes-pass-by-reference-sometimes-pass-by-value semantics, which is probably not intentional?
* There's a fair amount of backslash-escaped newlines inside of parentheses. You don't need to escape those newlines at all.
* Assuming this is Python 2, it is using classic classes, which don't exist in Python 3. Fortunately, it generally doesn't matter unless you're doing some sort of inheritance and/or polymorphism, which this code isn't doing.
* The function dts() is exactly equivalent to the builtin function repr()... unless you pass a tuple with len() != 1, in which case it just blows up in your face. Fortunately, it's only used on integers.
I suppose you could fix all of the above in refactoring, but IMHO it might be better to just do a ground-up rewrite at this point.
(As far as helping fix it, you should talk to the person whose user page it is under. According to their contributions, they are still an active Wikipedia user, as of November 2021, so you can probably reach out to them via their talk page or by other means.)
Posted Dec 14, 2021 0:19 UTC (Tue)
by khim (subscriber, #9252)
[Link] (3 responses)
And, of course, it's better to use something else than python because if you would make that mistake again then chances that you would be able to run the code which you would write today in year 2033 (exactly 12 years from today — same as the code we are discussing here) are practically zero.
I sometimes wonder if that planned obsolescence is perceived as a good thing in certain circles. E.g. scientists: because python is easily downloadable they can supply code to reviewers and even safely publish it… but in 5-10 years when your work would become well-known nobody but you would be able to do work on it because your old, published, code wouldn't be compatible with then-current version of python and only you would have properly adjusted version of code.
Posted Dec 14, 2021 1:13 UTC (Tue)
by anselm (subscriber, #2796)
[Link] (1 responses)
That presumes that Python 12 years from now will be very different from Python today. But even Python today is not that different from Python 12 years ago. There were some important but backwards-incompatible cleanups but otherwise the language is still pretty much the same. Moving reasonably-written code forward really isn't rocket science, and can be at least partly automated. But if you wrote very bad Python code 12 years ago – like it seems the Wikipedia guy did – then you may have a bit of a problem. OTOH, who is to say what today's other fashionable programming languages will look like 12 years from now?
I've been writing Python code professionally for a lot longer than 12 years and I'm reasonably confident that the code I write today will still be working 12 years from now. The str-vs.-bytes thing which was the major new thing in 3.x was a bit of a hassle (albeit important) but that's sorted now. Even the introduction of major new language features like match (in 3.10) doesn't break existing code using match as a variable name, so the Python developers seem to have figured this out. Things are better now than they used to be, and are only likely to improve further as the language keeps maturing.
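That claim about match is easy to check (a small illustration, not anselm's code): because match is a "soft keyword" in 3.10 and later, the parser only treats it specially at the start of a match statement, so the same module can keep using it as an ordinary name:

    import re

    match = re.match(r"(\d+)", "42abc")   # "match" as a plain variable, as before 3.10
    print(bool(match))

    value = 3
    match value:                          # and the new 3.10 statement, in the same file
        case 1 | 2:
            print("small")
        case _:
            print("something else")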
Posted Dec 14, 2021 19:14 UTC (Tue)
by tbird20d (subscriber, #1901)
[Link]
Posted Dec 14, 2021 6:46 UTC (Tue)
by fenncruz (subscriber, #81417)
[Link]
You think python itself is a mess? You've not seen python written by scientists, where you're lucky if you (as the code author) get the same answer twice. But you would never know whether you would, because no one teaches about writing unit tests (or code testing in general).
Posted Dec 9, 2021 1:58 UTC (Thu)
by pabs (subscriber, #43278)
[Link]
OTOH I'd like to see the Python features with sharp edges deprecated, e.g. pickle, os.system, os.popen, subprocess.*(shell=True)
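To make the "sharp edges" point concrete (an illustrative sketch, not pabs's code): with shell=True, anything that sneaks into the command string is interpreted as shell syntax, while passing an argument list keeps the input inert:

    import subprocess

    filename = "notes.txt; rm -rf ~"   # hostile input

    # Dangerous: the whole string goes to /bin/sh, so the "; rm -rf ~" part runs too.
    # subprocess.run("wc -l " + filename, shell=True)

    # Safer: no shell is involved; the filename is just one argument to wc.
    subprocess.run(["wc", "-l", filename], check=False)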
Posted Dec 9, 2021 9:13 UTC (Thu)
by taladar (subscriber, #68407)
[Link]
The main point should be to allow compiling and/or running the same code (possibly with some compile/runtime checks for feature support if absolutely necessary) on the range of currently supported operating systems with their included versions of the language compiler/runtime, so e.g. anything from the oldest supported RHEL version to the latest released stable Fedora, Debian or Ubuntu version.
That is a problem you actually encounter in practice when you have to support software and/or the servers it runs on, some people in your userbase want to run it on LTS and some on bleeding edge systems and ideally you don't want to have half a dozen different branches for that.
Posted Dec 9, 2021 13:10 UTC (Thu)
by agateau (subscriber, #57569)
[Link]
Unfortunately the transition from Python 2 to 3 was so painful that people do not want to live through it again (assuming it's now done!), but this is really what a major version bump is for. Python N should keep all its symbols, with some marked as deprecated. Then when Python N+1 comes, the deprecated symbols can be removed. Old projects relying on deprecated symbols would continue to work as long as they depend on the right major version of Python.
Posted Dec 9, 2021 14:35 UTC (Thu)
by sapphirepaw (guest, #121340)
[Link]
Or I could abandon the system Python, creating a delicate mess that would encourage leaving 3.8 running for 5+ years, maybe without even patch updates. That's not great, either.
Consequently, I'm actively migrating things from Python to Go. The only major pain point they've caused in 10 years has been modules, and that was a clear improvement that experienced rapid uptake in the ecosystem.
Posted Dec 11, 2021 22:14 UTC (Sat)
by ibukanov (subscriber, #3942)
[Link] (18 responses)
I do not remember similar articles about JavaScript, PHP, Perl, Node, shell scripts. Somehow those interpreted languages managed not to annoy users with breaking changes. Surely the first 3 tried some big incompatible changes like ES4, PHP 6, Perl 6. But those were either abandoned or forked into a separate language.
Posted Dec 12, 2021 12:00 UTC (Sun)
by fman (subscriber, #121579)
[Link] (1 responses)
Indeed! I don't care one bit about perceived advantages in wholesale replacements of stuff like asyncore.
What i *do* care about are (maybe years) old work being flushed down the toilet by some adolescent quest for perfection and having to re-acquaint myself with the module in question for no apparent gain overall.
Posted Dec 12, 2021 12:58 UTC (Sun)
by Wol (subscriber, #4433)
[Link]
Cheers,
Wol
Posted Dec 12, 2021 13:34 UTC (Sun)
by pizza (subscriber, #46)
[Link] (12 responses)
Python is the new Java. Except for the backwards-compatible part.
> I do not remember similar articles about JavaScript, PHP, Perl, Node, shell scripts.
You won't see this with Perl, because it takes backwards compatibility *very* seriously:
'Requiring end-user programmers to change just a few language constructs, even language constructs which no well-educated developer would ever intentionally use is tantamount to saying "you should not upgrade to a new release of Perl unless you have 100% test coverage and can do a full manual audit of your codebase."'
"Historically, we've held ourselves to a far higher standard than backward-compatibility -- bugward-compatibility. Any accident of implementation or unintentional side-effect of running some bit of code has been considered to be a feature of the language to be defended with the same zeal as any other feature or functionality. No matter how frustrating these unintentional features may be to us as we continue to improve Perl, these unintentional features often deserve our protection. It is very important that existing software written in Perl continue to work correctly. If end-user developers have adopted a bug as a feature, we need to treat it as such."
Posted Dec 12, 2021 23:42 UTC (Sun)
by pebolle (guest, #35204)
[Link] (11 responses)
And somehow the people subscribing to this policy still managed to set in motion the breathtaking train wreck that perl6 turned out to be. By now no-one uses perl6 and any hope of a future for perl5 is best described by the perl7 farce. (Did lwn.net ever cover that? It's hilarious.)
The python2 to python3 process was pretty embarrassing, but perl5 to perl6 is a lesson in how not to do such things, of a purity to behold. Did anything similar ever turn out worse?
(For the record, I actually like perl5, but even I can see where it heading...)
Posted Dec 13, 2021 14:50 UTC (Mon)
by nix (subscriber, #2304)
[Link] (2 responses)
I dunno. After the usual pile of weird and impractical suggestions (because that's how development of almost anything *works*, it's just that this is in the open), it's converged on something reasonable: you get Perl 5 unless you say 'use 7'. i.e., the backward compatibility policy is unchanged, which means that you can expect everyone to upgrade to Perl 7 sooner or later, replacing Perl 5 completely, without worrying about things breaking or having to do massive audits of code nobody's looked at in twenty years and whose sole maintainer died of oxygen abuse ten years ago.
Posted Dec 13, 2021 16:15 UTC (Mon)
by pebolle (guest, #35204)
[Link] (1 responses)
I seem to remember that at one point some people suggested yet another split (i.e., a third version of Perl).
> it's converged on something reasonable: you get Perl 5 unless you say 'use 7'. i.e., the backward compatibility policy is unchanged,
I remember that too, and I also found it reasonable. But did anything actually happen? Has a decision been made to do a v7.0.0 instead of, say, v5.40.0? Nothing like that was mentioned on lwn.net.
Posted Dec 13, 2021 17:59 UTC (Mon)
by nix (subscriber, #2304)
[Link]
Posted Dec 13, 2021 15:06 UTC (Mon)
by rahulsundaram (subscriber, #21946)
[Link] (7 responses)
Perl 6 and PHP 6 (https://lwn.net/Articles/379909/) didn't impact users as much because they stepped past that into Perl 7 and PHP 7. Python 3 however did have a major impact for users who had to do a difficult transition. IMO, the story with Python was worse as a result.
Posted Dec 13, 2021 16:42 UTC (Mon)
by pebolle (guest, #35204)
[Link] (6 responses)
Has that ever materialized?
> IMO, the story with Python was worse as a result.
Of course this is a subject one can debate forever (especially on the web). But I'd say Perl 6 promised more, took forever, split the developer community, and delivered a "sister" language (and a compiler and a VM that are only used for Perl 6). They even had to rename it to Raku because having Perl 5 and 6 simultaneously hurt both languages. So I'd say Perl took a pretty bad hit.
Anyhow, both Perl 5 and 6 and Python 2 and 3 ended up incompatible. It never ceases to amaze me!
Posted Dec 13, 2021 16:45 UTC (Mon)
by rahulsundaram (subscriber, #21946)
[Link] (1 responses)
Most recent status update I am aware of is https://gist.github.com/Grinnz/be5db6b1d54b22d8e21c975d68...
Posted Dec 13, 2021 16:52 UTC (Mon)
by pebolle (guest, #35204)
[Link]
Thanks.
I stumbled on that too when looking into this, this (Northern Hemisphere) summer. My impression was and is that Perl 7 petered out. We'll see.
Posted Dec 14, 2021 0:35 UTC (Tue)
by khim (subscriber, #9252)
[Link] (3 responses)
> It never ceases to amaze me!
It's because for language developer backward compatibility is a pain, often an embarrassment, but for language users it's a boon… but language is developed by language developers, not by users. Still… python popularity never ceases to amaze me: so many were burned by it — yet people still continue to use it… why? What does it have that other, more stable languages, doesn't have? I guess that would remain mystery for the foreseeable future.
Posted Dec 14, 2021 9:26 UTC (Tue)
by rahulsundaram (subscriber, #21946)
[Link]
Languages don't thrive in a vacuum. There is a whole host of libraries that work very well, and lots of components that you might want to work with provide language bindings; Python is among those that tend to come with good support (docs etc). The incompatibility was a hassle and I do strongly feel that it could have been handled better, even if the outcome after the changes was better code, but the popularity itself isn't a mystery at all.
Posted Dec 15, 2021 19:17 UTC (Wed)
by xyz (subscriber, #504)
[Link] (1 responses)
Do you know any other language that has not burnt users one way or another?
> What does it have that other, more stable languages, doesn't have?
Very good modules/packages that you can use without reinventing the wheel.
It is a very nice language and it has a huge ecosystem.
Posted Dec 24, 2021 11:01 UTC (Fri)
by bartoc (guest, #124262)
[Link]
I'm sure the windows story for ruby and perl5 (and even bash, although that's harder as windows can't do fork() _or_ exec()) could be improved with some dedication and just ... clearing out the cobwebs, but there are a lot of cobwebs.
Despite the general shortcomings of windows as a platform, a good percentage of developers need their software to run there.
Posted Dec 20, 2021 21:10 UTC (Mon)
by flussence (guest, #85566)
[Link] (2 responses)
Coincidentally: I tried running an old internal PHP 5 app on PHP 8.0 this month (not 8.1, because several pecl extensions it relies on are abandonware). It worked for a few page loads and then exploded, because a cache-control function suddenly noticed that the caller had been passing it an iso8601 string instead of an integer timestamp. Doesn't matter that every "best practice" the language had at the time was followed, nor does it matter that the codebase was fuzz-tested several times, this error went silently unnoticed for a decade and a half until PHP added real type checking.
The subject of PHP being an atrocious language where nothing works consistently has been thoroughly flogged to death at this point; nobody's complaining because nobody uses it any more or are too ashamed to admit that they do. It may not break backwards compatibility in the same way Python does, but that's only because it's about as thought out and sight-readable as a DNA sequence - and it's become a *cultural expectation* that things in it are just breaking constantly and inexplicably. At one point the core developers genuinely couldn't understand why they were in the news - for admitting they didn't run the language's test suite before cutting releases because it was too noisy.
I'm not going to write at length for the other examples but they aren't much better:
Shell scripts have entire online wikis and linting programs dedicated to how *not* to write them - none of which seem to get used on the code running on everyone's distros; Javascript has supplanted Old Unix as a de-facto open-in-marketing-only platform that's actually owned by competing predatory corporations; and Perl's decline was more of a long-term cultural rot of slashdotter-style elitism from within than anything to do with Raku (which swung hard into the opposite problem, uncritically platforming awful people in official spaces in the illucid pursuit of SEO).
Everything is in fact awful. You hold up the buildings that aren't on fire as role models while missing the point that they're already burned out.
Posted Dec 21, 2021 0:19 UTC (Tue)
by pebolle (guest, #35204)
[Link] (1 responses)
Sometimes I wish I could construct sentences like that. But most of the times I'm glad I can't.
Posted Dec 31, 2021 1:31 UTC (Fri)
by flussence (guest, #85566)
[Link]
Posted Dec 16, 2021 18:22 UTC (Thu)
by LyonJE (guest, #139567)
[Link]
The world-at-large has huge volumes of older material that really does not need tweaking and updating and adjusting for the "latest" flavours, even as often as every year or two. I had enough of that pain with PHP breaking compatibility repeatedly for most of its lifetime.
I would be gravely disappointed to see Python shift away from a defensive, conservative, long-term stable approach in favour of a tiny percentage of the world's Python dev community wanting everything to be super clean and shiny.
This isn't critical functionality -- this is nomenclature and aesthetics, but which can break something that would otherwise be perfectly compatible and safe. Not every ".py" file is part of a sophisticated and/or big project -- there are millions of important short scripts, plug-ins, and all sorts.
I'm newish to Python, but not to software engineering, not by far :) So, my tuppence, anyway!
Je.