So much wasted energy
Posted Sep 9, 2025 15:51 UTC (Tue) by Karellen (subscriber, #67644)
In reply to: So much wasted energy by Cyberax
Parent article: npm debug and chalk packages compromised (Aikido)
Any update should only download the changes since the last time you grabbed one - especially if you're only tracking `main` - and checking out by commit id should be pretty secure. Also, git is perfectly capable of working from a clone that's "local" to your organisation/subnet (updated every few hours or so): have all your client tools clone the repos you care about from that, and you can still be confident that if you've got a specific git commit, it's the one you meant to get.
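A minimal sketch of that kind of setup, assuming a made-up upstream URL, a hypothetical mirror path like /srv/git-mirrors, and a placeholder commit id:

```sh
# On the organisation-local mirror host: make a bare mirror clone once...
git clone --mirror https://github.com/example-org/example-repo.git \
    /srv/git-mirrors/example-repo.git

# ...then refresh it every few hours (e.g. from cron). Fetching only
# transfers objects new since the last update, not a whole fresh tarball.
git --git-dir=/srv/git-mirrors/example-repo.git remote update --prune

# On each client: clone from the local mirror instead of upstream...
git clone /srv/git-mirrors/example-repo.git example-repo

# ...and check out the exact commit you meant to get. The commit id names
# the full tree and history, so you get the content that id refers to.
git -C example-repo fetch origin
git -C example-repo checkout --detach <commit-id>   # placeholder
git -C example-repo rev-parse --verify HEAD
```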
Every client downloading a whole new tarball, all from the upstream hosting provider, every single time, is absolutely bonkers.
Posted Sep 9, 2025 16:09 UTC (Tue) by dskoll (subscriber, #1630)
> Every client downloading a whole new tarball, all from the upstream hosting provider, every single time, is absolutely bonkers.
It absolutely is. But so many build systems, especially in the embedded world, just externalize costs and hammer the upstream provider. They just don't care.
For a package I wrote and maintain (RP-PPPoE), I had to put the download link behind a form that asks you to verify you're human before letting you download. Something like 95% of the traffic to my server was downloads of the same tarball, over and over and over again.
