The Linux "copy problem"
Posted May 30, 2019 16:17 UTC (Thu) by desbma (guest, #118820)
In reply to: The Linux "copy problem" by KAMiKAZOW
Parent article: The Linux "copy problem"
For example zlib, one of the most widely used software libraries in the world, has several forks (Intel, Cloudflare, zlib-ng...) with optimizations that improve compression/decompression speed.
Yet the changes have never been merged back into zlib, and everybody still uses the historic version and happily wastes CPU cycles (including when your browser decompresses this very page).
Posted Jun 2, 2019 3:07 UTC (Sun) by scientes (guest, #83068) (3 responses)
Compression is disabled for HTTPS sites due to various attacks (such as BREACH) that exploit the information leaked by the compressed size.
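A minimal shell sketch of the underlying size leak (the idea behind CRIME/BREACH-style attacks); the cookie values below are made up purely for illustration:

```shell
# When attacker-controlled data is compressed together with a secret, a guess
# that matches the secret compresses better (DEFLATE emits a back-reference
# instead of literals), so the compressed size alone leaks information.
secret="sessionid=SuperSecretToken123"
right=$(printf '%s\n%s\n' "$secret" "sessionid=SuperSecretToken123" | gzip -c | wc -c)
wrong=$(printf '%s\n%s\n' "$secret" "sessionid=Xq4NotTheToken9Zb" | gzip -c | wc -c)
echo "matching guess: $right bytes, wrong guess: $wrong bytes"
```

Repeating the guess and measuring sizes is how the real attacks recover a secret byte by byte over many requests.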
Posted Jun 2, 2019 10:27 UTC (Sun) by desbma (guest, #118820) (2 responses)

curl -v --compressed 'https://lwn.net/' > /dev/null 2>&1 | grep gzip
> Accept-Encoding: deflate, gzip
< Content-Encoding: gzip
Posted Jun 2, 2019 12:20 UTC (Sun) by Jandar (subscriber, #85683) (1 response)
This command obviously produces no output: "> /dev/null 2>&1" first sends stdout to /dev/null and then points stderr at the same place, so nothing ever reaches the pipe.

$ curl -v --compressed 'https://lwn.net/' > /dev/null 2>&1 | wc
      0       0       0

Perhaps you meant: curl -v --compressed 'https://lwn.net/' 2>&1 > /dev/null | grep gzip
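A minimal sketch of why the redirection order matters, using a hypothetical helper function "both" that writes one line to each stream:

```shell
# In a pipeline, stdout is connected to the pipe first; redirections are
# then processed left to right.
both() { echo out; echo err >&2; }   # hypothetical helper: writes to both streams

# stdout -> /dev/null, then stderr -> wherever stdout now points (/dev/null):
# the pipe sees nothing.
a=$(both > /dev/null 2>&1 | wc -l)

# stderr -> wherever stdout currently points (the pipe), then stdout -> /dev/null:
# the pipe sees only stderr.
b=$(both 2>&1 > /dev/null | wc -l)

echo "first form: $a line(s), second form: $b line(s)"   # prints 0 and 1
```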
Posted Jun 2, 2019 12:45 UTC (Sun) by desbma (guest, #118820)
curl -v --compressed 'https://lwn.net/' -o /dev/null 2>&1 | grep gzip 
also works 
     