Forcing an optimized palette when converting a 24-bit bitmap to 8-bit using GDI+


Post by Peter Olco » Thu, 04 Mar 2010 06:42:31


I want to save 24-bit BMP images as 8-bit PNG files when
the number of unique colors is <= 256. I don't want Windows to
automatically screw up the colors and substitute its own
selections. I want to keep the original 256 unique colors,
exactly as they are in the original.

How do I force Windows not to screw up the colors? And why
isn't NOT screwing up the colors the default?
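[For reference: when a 24-bit image really does contain at most 256 unique colors, the exact conversion being asked for is simple in principle: build the palette from the image itself instead of letting the encoder choose one. A minimal, platform-neutral C++ sketch of that idea; the `RGB` and `Indexed` types are illustrative, not GDI+ types:]

```cpp
#include <cstdint>
#include <map>
#include <stdexcept>
#include <tuple>
#include <vector>

struct RGB { uint8_t r, g, b; };
inline bool operator==(RGB a, RGB b) { return a.r == b.r && a.g == b.g && a.b == b.b; }
inline bool operator<(RGB a, RGB b) {
    return std::tie(a.r, a.g, a.b) < std::tie(b.r, b.g, b.b);
}

struct Indexed {
    std::vector<RGB> palette;      // at most 256 entries, taken from the image itself
    std::vector<uint8_t> indices;  // one palette index per pixel
};

// Build an exact palette from the pixels: every unique color gets its own
// entry, in order of first appearance. No quantization, no substitution.
Indexed palettize(const std::vector<RGB>& pixels) {
    std::map<RGB, uint8_t> lookup;
    Indexed out;
    for (const RGB& px : pixels) {
        auto it = lookup.find(px);
        if (it == lookup.end()) {
            if (out.palette.size() == 256)
                throw std::runtime_error("more than 256 unique colors");
            it = lookup.emplace(px, static_cast<uint8_t>(out.palette.size())).first;
            out.palette.push_back(px);
        }
        out.indices.push_back(it->second);
    }
    return out;
}

// Expand back to 24-bit; with an exact palette this round-trips losslessly.
std::vector<RGB> depalettize(const Indexed& img) {
    std::vector<RGB> out;
    for (uint8_t i : img.indices) out.push_back(img.palette.at(i));
    return out;
}
```

Because the palette is taken from the image itself, `depalettize(palettize(img))` reproduces every pixel bit-for-bit, which is exactly the behavior the poster wants from the encoder.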
 
 
 

Forcing an optimized palette when converting a 24-bit bitmap to 8-bit using GDI+

Post by David Chin » Thu, 04 Mar 2010 11:37:10


How is it "screwing up"? What code are you using to save the PNGs?

-- David

 
 
 

Forcing an optimized palette when converting a 24-bit bitmap to 8-bit using GDI+

Post by Peter Olco » Thu, 04 Mar 2010 11:45:34


I created a BMP file that has exactly 256 unique colors, then
created a GIF file from it using an optimized palette in an
image editor. The two files were tested and shown to have
identical pixels. When I had GDI+ do the same conversion, the
resulting files did not have identical pixels. I researched this
and found that others have had this same problem.
 
 
 

Forcing an optimized palette when converting a 24-bit bitmap to 8-bit using GDI+

Post by Joseph M. » Thu, 04 Mar 2010 23:51:41

Note the following:

"Pixels" as stored in the file are not pixel *values*. They are pixel *indexes*. So I
can store my file with a color map of

RGB(255, 0, 0)
RGB(0, 255, 0)
RGB(0, 0, 255)

and my pixels might be 0, 1, 2, 0, 1, 2

But if I store the color map as

RGB(0, 0, 255)
RGB(255, 0, 0)
RGB(0, 255, 0)

then my file shows the pixel values as
1, 2, 0, 1, 2, 0
but they are the same image. So you have to say what you mean by "pixels". Are you
talking about the rendering or the file contents? File contents do not have to be
identical to get identical renderings.
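[The index-versus-value distinction above can be checked mechanically. A small illustrative C++ decode helper, showing that two files with different color maps and different stored indexes still render to the identical image:]

```cpp
#include <cstdint>
#include <vector>

struct RGB { uint8_t r, g, b; };
inline bool operator==(RGB a, RGB b) { return a.r == b.r && a.g == b.g && a.b == b.b; }

// Decode an indexed image: each stored "pixel" is an index that selects
// an entry in the color map, not a color value in its own right.
std::vector<RGB> decode(const std::vector<RGB>& colorMap,
                        const std::vector<uint8_t>& indices) {
    std::vector<RGB> out;
    for (uint8_t i : indices) out.push_back(colorMap.at(i));
    return out;
}
```

Feeding in the two color maps and index sequences from the post above, the stored bytes differ but the decoded pixels are identical, which is why a byte-level file comparison says nothing by itself.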

Next, you have to deal with color rendering. If the color map is stored in the file, you
should *see* the same pixels as you stored. If you don't store the color map, one will be
assumed for you.

So your question is not clear.

You would have to write a little program that extracted the color map and the pixel
indices and indicated what was going on. Or look at my Image Comparator program. I know
it gives me the color map for GIF files; try it for PNG files and see what it says. If
GDI+ is doing something odd to the colors, that is probably a bug. But if you are looking
at raw file bits, you have no idea what the pixels mean without the color map as well.
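[Such a color-map extractor is only a few lines for GIF: per the GIF89a layout, bit 7 of the packed byte at offset 10 says whether a global color table follows the 13-byte header, and the table holds 2^((packed & 7) + 1) RGB triples. A sketch, assuming the file is already in memory:]

```cpp
#include <cstdint>
#include <cstring>
#include <stdexcept>
#include <vector>

struct RGB { uint8_t r, g, b; };
inline bool operator==(RGB a, RGB b) { return a.r == b.r && a.g == b.g && a.b == b.b; }

// Extract the global color table from raw GIF bytes (header layout per the
// GIF89a specification: 6-byte signature + 7-byte logical screen descriptor).
std::vector<RGB> gifGlobalColorTable(const std::vector<uint8_t>& gif) {
    if (gif.size() < 13 || std::memcmp(gif.data(), "GIF", 3) != 0)
        throw std::runtime_error("not a GIF");
    uint8_t packed = gif[10];                    // packed fields of the screen descriptor
    if ((packed & 0x80) == 0) return {};         // no global color table present
    size_t count = 1u << ((packed & 0x07) + 1);  // 2..256 entries
    if (gif.size() < 13 + 3 * count)
        throw std::runtime_error("truncated GIF");
    std::vector<RGB> table;
    for (size_t i = 0; i < count; ++i)
        table.push_back({gif[13 + 3*i], gif[13 + 3*i + 1], gif[13 + 3*i + 2]});
    return table;
}
```

Dumping this table for the hand-converted GIF and the GDI+-written GIF would show immediately whether GDI+ altered the palette entries or merely reordered them.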
joe







Joseph M. Newcomer [MVP]
email: XXXX@XXXXX.COM
Web: http://www.yqcomputer.com/
MVP Tips: http://www.yqcomputer.com/
 
 
 

Forcing an optimized palette when converting a 24-bit bitmap to 8-bit using GDI+

Post by Peter Olco » Fri, 05 Mar 2010 00:45:06

"Joseph M. Newcomer" < XXXX@XXXXX.COM > wrote in
message news: XXXX@XXXXX.COM ...

The original
RGB(255, 0, 0)
RGB(0, 255, 0)
RGB(0, 0, 255)

becomes
RGB(253, 0, 0)
RGB(0, 251, 0)
RGB(0, 0, 252)

How can I get Windows to quit screwing this up? The problem
is that, for legacy reasons, it wants to make sure that
certain colors are available even if they are not needed. It
apparently keeps at least 20 colors for itself, merging the
colors you specify when it needs to make room for its own.



 
 
 

Forcing an optimized palette when converting a 24-bit bitmap to 8-bit using GDI+

Post by David Chin » Fri, 05 Mar 2010 01:51:06


The business about reserving 20 colors for itself is only true if your
display is set to 256-color mode. It involves hardware palettes, which are not
used when your display is set to 16-, 24-, or 32-bit color, which is almost
always the case these days. Show the code you use to display this, and also
tell us how you are determining the real color of the pixel that you say is
wrong.

-- David
 
 
 

Forcing an optimized palette when converting a 24-bit bitmap to 8-bit using GDI+

Post by Peter Olco » Fri, 05 Mar 2010 02:19:50


I have a 24-bit bitmap file that I carefully constructed to
have exactly 256 unique colors.
I manually convert this file to several 8-bit indexed file
formats and use BeyondCompare to show that the pixels are
identical.
I open the same 24-bit, 256-color bitmap file using either
GDI+ or CImage and then save it as a GIF.
BeyondCompare now shows that the shades of the pixels
have changed. Other people have reported this same problem.
 
 
 

Forcing an optimized palette when converting a 24-bit bitmap to 8-bit using GDI+

Post by David Chin » Fri, 05 Mar 2010 06:49:37


I thought you had drawn it on the screen, done an Alt+PrtSc to copy it to the
clipboard, and pasted it into a graphics program in order to see what color the
pixels were! Given that the screen has nothing to do with it, what I said about
the 20-color reserve doesn't hold. But what you said isn't true either.
This article says GDI+ saves the GIF image using the web-safe palette:

http://www.yqcomputer.com/

-- David
 
 
 

Forcing an optimized palette when converting a 24-bit bitmap to 8-bit using GDI+

Post by Joseph M. » Fri, 05 Mar 2010 14:03:14

Note also that my ColorPicker program lets you examine the actual pixels on the screen and see
what colors they are. Download it from my MVP Tips site. I use it all the time.
joe



