7 comments

  • lesserknowndan 9 hours ago

    PNG and JPG files are already compressed...

    • fjfaase 9 hours ago

      Indeed, a PNG file is essentially a kind of ZIP file tailored for images: it is lossless in the sense that no information from the original image is lost.

      A JPG file is a kind of ZIP that discards some information from the original image to achieve even higher compression rates.

      Applying another compression algorithm to already-compressed files usually does not reduce the file size, and may even make it larger. A fun fact: when fed absolutely random data, ZIP produces a larger file, because it adds some information to record that the data could not be compressed any further.
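      You can see this for yourself with a quick sketch using gzip, which uses the same DEFLATE algorithm ZIP typically uses (the file names here are illustrative):

```shell
# Sketch: gzip (DEFLATE, the same algorithm ZIP typically uses) cannot
# shrink random data; the output ends up slightly larger due to header
# and block overhead.
head -c 1048576 /dev/urandom > random.bin   # 1 MiB of random bytes
gzip -kf random.bin                         # keep the original, write random.bin.gz
wc -c random.bin random.bin.gz              # compare the two sizes
```

      The same effect applies to PNG and JPG inputs, since they are already DEFLATE- or DCT-compressed internally.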

      • FerkiHN 8 hours ago

        I asked the wrong question. I wanted to know an efficient method for compressing many photos into a single archive to reduce the overall size, since the color bytes repeat across photos. Maybe there is some special compression format, because the usual ZIP file doesn't work.

        • fjfaase 8 hours ago

          If the images are rather similar, you might try making a movie of them, or an animated GIF.
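          As a sketch of the animated-GIF idea (assuming ImageMagick is installed; the frames here are dummy placeholders), `-layers optimize` lets the encoder store only the differences between consecutive frames:

```shell
# Sketch, assuming ImageMagick's convert is available: generate three
# near-identical dummy frames, then combine them into an animated GIF.
# -layers optimize stores only frame-to-frame differences.
for i in 1 2 3; do
  convert -size 64x64 xc:gray "frame$i.png"
done
convert -delay 50 frame1.png frame2.png frame3.png -layers optimize similar.gif
```

          A video codec would exploit the same inter-frame redundancy far more aggressively than GIF does.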

    • FerkiHN 8 hours ago

      I asked the wrong question. I wanted to know an efficient method for compressing many photos into a single archive to reduce the overall size, since the color bytes repeat across photos. Maybe there is some special compression format, because the usual ZIP file doesn't work.

      • Bender 7 hours ago

        If anything could get some level of compression out of already-compressed images, I would expect it to be lrzip [1], provided the archive is large enough; as far as I know, there really isn't anything designed specifically for your use case. So maybe put a large number of images into a .tar file and then run lrzip on that .tar file. It would need to be a very large archive. In the past I gained about 3% to 5% on large image archives using 7-Zip, but it was really slow and not worth it for me, which says something, as I am very patient.

        There is potentially some compression magic that could be performed by transforming images with ImageMagick [2] or GraphicsMagick and then compressing them, but that gets into the territory of reducing image quality: not compressing them with those tools directly, but rather confining color palettes, bit depth, and a few other variables to make the images more batch-compressible, for marginal gains. If you are not concerned about image quality, then those tools can absolutely shrink images without any trickery: resizing, adding smoothing or blurring, etc.

            for Derp in *.jpg; do convert "$Derp" -resize 30% "resized-$Derp"; done
        
        This is a topic people could debate until the end of the universe, so instead give lrzip a shot if you have the time and CPU resources. If you get more than 5%, please let us know. Otherwise, one could use ImageMagick to batch-resize, blur, or otherwise transform the image files, at the risk of losing image quality.
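        As a sketch of the palette/depth idea mentioned above (assuming ImageMagick; the file names are made up), the `-colors` operator quantizes an image down to a fixed palette:

```shell
# Sketch, assuming ImageMagick: quantize an image to a 64-color palette
# at 8-bit depth, so identical color values recur more often and a
# later archive-level compressor has more repetition to exploit.
convert -size 128x128 gradient:blue-red sample.png        # dummy input image
convert sample.png -colors 64 -depth 8 sample-quantized.png
```

        This is lossy, of course, which is exactly the quality trade-off described above.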

        [1] - https://wiki.archlinux.org/title/Lrzip

        [2] - https://imagemagick.org/index.php

      • beAbU 7 hours ago

        So like HEVC/H.265?

        Compression works because the data is structured and ordered. A folder of photos is neither. When you do structure and order your photos, you have, well, a video.