PNG compression levels

I need to save some files as PNG.  Never having used PNG 
before, I'm a bit unsure about the meaning of compression 
level.

Is it true that PNG is completely lossless regardless of the 
compression level, and that compression level only determines 
how fast or slow a file will be to save and open?   Is that 
the only consideration for PNG compression levels?


-- 
Joe
http://www.joekaz.net/
http://www.cafeshops.com/joekaz


Posted by Joe, 11/26/2003 9:27:57 PM

Joe <nospam@joekaz.net> wrote:
> I need to save some files as PNG.  Never having used PNG 
> before, I'm a bit unsure about the meaning of compression 
> level.
> 
> Is it true that PNG is completely lossless regardless of the 
> compression level, and that compression level only determines 
> how fast or slow a file will be to save and open?   Is that 
> the only consideration for PNG compression levels?

Yes, yes and yes.

Of course it takes a tad longer to compress an image with the higher
compression levels, but recent computers really are fast enough
to always compress using the highest compression level. You won't notice
the difference in time.
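
A quick way to check this for yourself is sketched below.  It assumes
Python with the Pillow library installed (neither is mentioned in this
thread, and "input.png" is just a placeholder): saving the same image
at several compress_level settings should give back identical pixels
every time, with only the file size and save time changing.

import time
from PIL import Image, ImageChops   # third-party: pip install Pillow

img = Image.open("input.png").convert("RGB")   # placeholder source image

for level in (1, 6, 9):                        # low, default, highest
    t0 = time.time()
    img.save(f"out_level{level}.png", compress_level=level)
    elapsed = time.time() - t0

    reopened = Image.open(f"out_level{level}.png").convert("RGB")
    # difference() is all zeros (getbbox() returns None) only if every
    # pixel matches, i.e. the round trip was lossless at this level.
    lossless = ImageChops.difference(img, reopened).getbbox() is None
    print(f"level {level}: lossless={lossless}, save took {elapsed:.3f}s")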

Bye,
       Simon
-- 
      Simon.Budig@unix-ag.org       http://www.home.unix-ag.org/simon/
Posted by Simon, 11/26/2003 9:38:41 PM

In <3fc52b6f@si-nic.hrz.uni-siegen.de>, Simon Budig wrote:

>> Is it true that PNG is completely lossless regardless of the 
>> compression level, and that compression level only determines 
>> how fast or slow a file will be to save and open?   Is that 
>> the only consideration for PNG compression levels?
> 
> Yes, yes and yes.

I want to add that *opening* a PNG with high(est) compression level isn't
slower than one with low(est) compression level.  It just affects the
speed of the compression.
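
To put a number on that, here is a rough timing sketch under the same
assumptions as the snippet above (Python plus Pillow, reading the files
that snippet wrote): decode time should come out roughly the same no
matter which compression level the file was written with.

import time
from PIL import Image

for level in (1, 6, 9):
    t0 = time.time()
    Image.open(f"out_level{level}.png").load()   # .load() forces a full decode
    print(f"level {level}: decode took {time.time() - t0:.3f}s")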

Ciao,
	Marc 'BlackJack' Rintsch
Posted by Marc, 11/28/2003 10:48:39 PM