
Improving .deb

Posted Jun 26, 2019 2:50 UTC (Wed) by fest3er (guest, #60379)
In reply to: Improving .deb by brunowolff
Parent article: Improving .deb

I was just thinking something like this: use a filesystem that performs compression. It sounds like SquashFS might work nicely, if it achieves an acceptable level of compression. Ideally, there'd be little need to pre-decompress the pkg; just loop-mount it and use rsync to install the files.
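
A minimal sketch of that workflow, assuming the package really is a SquashFS image (the file name is hypothetical, and this needs root):

    import subprocess
    import tempfile

    pkg = "pkg.squashfs"  # hypothetical package file

    with tempfile.TemporaryDirectory() as mnt:
        # Loop-mount the compressed image read-only; nothing is pre-decompressed.
        subprocess.run(["mount", "-o", "loop,ro", pkg, mnt], check=True)
        try:
            # The trailing "/" tells rsync to copy the mount point's *contents*.
            subprocess.run(["rsync", "-a", mnt + "/", "/"], check=True)
        finally:
            subprocess.run(["umount", mnt], check=True)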

As I understand, uncompressing certain .xz archives (perhaps large archives) *can* require a lot of RAM.

Haiku has created what sounds like a novel approach to packages. For most user packages, there's no need to unpack and install files: as I understand it, the pkg file is simply put where it belongs, and once there, its contents become available to the system. To remove the pkg, delete the pkg file. I've no idea how they made it work (perhaps some form of FS union).
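
(Haiku's actual mechanism is its own packagefs, not anything Linux ships, but purely to illustrate the FS-union guess, here is a rough sketch using loop mounts plus Linux overlayfs; all paths are hypothetical and this needs root:

    import subprocess

    pkgs = ["/packages/foo.squashfs", "/packages/bar.squashfs"]  # hypothetical
    layers = []

    for i, pkg in enumerate(pkgs):
        mnt = f"/mnt/pkg{i}"
        subprocess.run(["mkdir", "-p", mnt], check=True)
        # Each package stays a single file, mounted read-only in place.
        subprocess.run(["mount", "-o", "loop,ro", pkg, mnt], check=True)
        layers.append(mnt)

    # Union all the package trees into one read-only view; "removing" a
    # package is just remounting the overlay without its layer.
    subprocess.run([
        "mount", "-t", "overlay", "overlay",
        "-o", "lowerdir=" + ":".join(layers),
        "/system",  # hypothetical merged mount point
    ], check=True)

)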



Improving .deb

Posted Jun 26, 2019 10:32 UTC (Wed) by excors (subscriber, #95769)

> As I understand, uncompressing certain .xz archives (perhaps large archives) *can* require a lot of RAM

I don't believe it depends on the archive size, just on the dictionary size that was chosen when compressing, because the decompressor has to construct that dictionary in RAM. The man page says the default compression preset ("xz -6") uses an 8 MiB dictionary and the most expensive preset ("xz -9") uses 64 MiB, though with custom settings the format supports up to 1.5 GiB. Compression takes roughly 10x more RAM than decompression.

(For comparison, zlib (and hence gzip etc.) uses at most a 32 KiB dictionary, which is partly why modern algorithms can perform so much better.)
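
You can see the dictionary/memory relationship with Python's lzma module: the decompressor's memory requirement follows the dict_size recorded in the .xz headers, not the input size, and a memlimit below that makes decompression fail up front. A small sketch:

    import lzma

    data = b"example payload " * 100_000  # small, highly compressible input

    # Compress with the 64 MiB dictionary that "xz -9" would use.
    filters = [{"id": lzma.FILTER_LZMA2, "dict_size": 64 * 1024 * 1024}]
    blob = lzma.compress(data, format=lzma.FORMAT_XZ, filters=filters)

    # A decompressor capped at 8 MiB refuses the stream, however small it is,
    # because it would have to allocate the full 64 MiB dictionary.
    dec = lzma.LZMADecompressor(memlimit=8 * 1024 * 1024)
    try:
        dec.decompress(blob)
    except lzma.LZMAError as exc:
        print("decompression refused:", exc)

    # With no limit it decompresses fine.
    assert lzma.decompress(blob) == data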

