16 bit vs 256 color palette
I read some posts in the archives that talk about giving a sprite its own 256-color palette and getting the appearance of 16-bit with half the file size for the sprites. It was mentioned by Pavlina (Dexterity) in a couple of his posts. I cannot find any details on this method.
Can someone elaborate on this method, giving an overview of how to implement it?
Also, what is the industry standard for graphics these days in shareware level games (16 bit or 32 bit, or this 256 color palette thing)?
We would like to upgrade our graphics from 256 colors, and DirectX seems to support 16 bit well, but all the graphics files are 24 bit. When I save from a 24-bit file format to 16-bit Targa, too much gets lost. It converts better with GIF.
I would just like to see some general discussion on what everyone is using for their games for graphics in their 2D games as well as some more info on this 256 color palette per image method.
10-29-2004, 09:38 AM
It's cheaper if you're doing software rendering, but if you're uploading the textures to a video card, that card is going to expand the texture to its full 32-bit representation regardless.
10-29-2004, 09:50 AM
Besides what EpicBoy says, you can also save disk space if your sprites are stored as 256- or even 16-color images on disk and converted to 16 bits when loading. 256 colors are too few for the whole screen, but not necessarily for an individual sprite.
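A minimal sketch of that load-time conversion (the function names are mine, not from any specific engine): look each 8-bit index up in the sprite's palette and pack the result into a 16bpp R5G6B5 pixel.

```c
#include <stdint.h>
#include <stddef.h>

/* Pack one 24-bit palette entry into 16bpp R5G6B5. */
static uint16_t rgb_to_565(uint8_t r, uint8_t g, uint8_t b)
{
    return (uint16_t)(((r >> 3) << 11) | ((g >> 2) << 5) | (b >> 3));
}

/* Expand an 8-bit indexed sprite into a 16bpp buffer using its own
 * 256-entry {r,g,b} palette. */
void expand_sprite_565(const uint8_t *indices, size_t count,
                       const uint8_t palette[256][3], uint16_t *out)
{
    for (size_t i = 0; i < count; i++) {
        const uint8_t *c = palette[indices[i]];
        out[i] = rgb_to_565(c[0], c[1], c[2]);
    }
}
```

This trades a one-time loop at load time for half the disk footprint, which is the point of the per-sprite palette trick.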
For our DX games, we use 24bit PNGs, as they support full 8bit alpha.
For our web games, everything is saved as GIFs (Java 1.1 does not support PNGs :(), but once loaded, they are converted to 32 bit for our software renderer. Runtime memory is cheap - bandwidth isn't. All you have to do is read the data from the loaded texture and rebuild a full 32-bit image in memory to upload to the card as normal.
This way, each individual texture uses 256 or less colours (sometimes only 16 or 32), but the whole game uses several thousand exact 24bit colours (and 1 bit alpha from the gifs).
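A hedged sketch of that GIF-style expansion (the helper name is hypothetical): each index either maps to the GIF's single transparent index, giving alpha 0, or becomes an opaque 32-bit ARGB pixel from the palette, which is exactly the 1-bit alpha described above.

```c
#include <stdint.h>

/* Expand one indexed pixel to 32-bit ARGB. The GIF's single transparent
 * index becomes alpha 0; every other entry gets full alpha (1-bit alpha). */
uint32_t index_to_argb(uint8_t idx, const uint8_t pal[256][3], int transparent)
{
    if ((int)idx == transparent)
        return 0; /* fully transparent pixel */
    return 0xFF000000u |
           ((uint32_t)pal[idx][0] << 16) |  /* red   */
           ((uint32_t)pal[idx][1] << 8)  |  /* green */
           (uint32_t)pal[idx][2];           /* blue  */
}
```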
10-29-2004, 09:59 AM
TJM: the 256-color thing is simple: instead of using 16bpp colors, you use 256-color sprites and save half the space (before compression). With good lossless compression you can easily get another 50% reduction (or more if your image is partially transparent).
What I used in my 2D game was RLE-encoded 256-color sprites (I used 256 colors per character, and every character had around 60 different sprites). After the RLE pass I applied a second compression stage (I don't remember the method at the moment, but it was similar to LHARC if I remember correctly). Overall the image size was very small, with fully lossless compression.
One clarification: I RLE-encoded only the transparency, not the color data, since the color data differs from pixel to pixel.
My method was described somewhere on the Dexterity forums.
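The poster's exact format isn't spelled out here, so this is only a guess at one way to RLE the transparency while leaving the color data raw; the packet layout and names are my own.

```c
#include <stdint.h>
#include <stddef.h>

/* Encode one sprite row. Runs of the transparent index collapse into a
 * single skip count; opaque pixels are copied verbatim, since color data
 * that differs from pixel to pixel compresses poorly under RLE.
 * Output format per packet: [skip][count][count raw index bytes].
 * Returns the number of bytes written to 'out'. */
size_t rle_encode_row(const uint8_t *row, size_t width,
                      uint8_t transparent, uint8_t *out)
{
    size_t i = 0, w = 0;
    while (i < width) {
        uint8_t skip = 0, count = 0;
        while (i < width && row[i] == transparent && skip < 255) { skip++; i++; }
        size_t start = i;
        while (i < width && row[i] != transparent && count < 255) { count++; i++; }
        out[w++] = skip;
        out[w++] = count;
        for (size_t j = 0; j < count; j++)
            out[w++] = row[start + j];
    }
    return w;
}
```

A blitter reading this format skips `skip` destination pixels, then copies `count` bytes, repeating until the row is consumed - transparent runs cost two bytes instead of one byte per pixel.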
10-29-2004, 10:10 AM
Depends on the target platform....
If you are doing software rasterization, use 256 color. Anything with hardware go 16 or 32 bit.
256-color palettized will almost always look better than 16-bit color, because the palette entries themselves are full 32-bit values. You get more color depth per entry, but only 256 entries to choose from.
If you are doing software rasterization and really want to pack things down, consider 16 colors (4 bits!) for some items. Those sprites are absolutely tiny (good for downloads and cache coherency) and can look as good as 256-color or even 32-bit color, as long as the total number of colors needed is fairly low.
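For the 16-color case, a sprite stores two 4-bit indices per byte, so there's an extra unpacking step before the palette lookup. A small sketch, assuming high-nibble-first packing (the helper name is hypothetical):

```c
#include <stdint.h>
#include <stddef.h>

/* Unpack a 4bpp sprite (two pixels per byte, high nibble first) into
 * one index per byte, ready for an ordinary palette lookup. */
void unpack_4bpp(const uint8_t *packed, size_t pixel_count, uint8_t *out)
{
    for (size_t i = 0; i < pixel_count; i++) {
        uint8_t byte = packed[i / 2];
        out[i] = (i & 1) ? (byte & 0x0F) : (uint8_t)(byte >> 4);
    }
}
```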
If you do use palettized formats, be sure to thoroughly research a good palettizer. Getting the *right* colors will have a big impact on the visual quality of your game.
Thank you all for the advice and info.
10-29-2004, 10:26 PM
Read up on CreateDIBSection; this API may come in handy when loading 256-color images and preparing the graphics into 16-bit (or other) surfaces before blitting.
Powered by vBulletin™ Version 4.1.3 Copyright © 2013 vBulletin Solutions, Inc. All rights reserved.