I've got some textures from an old computer game. They are stored in one huge file with a simple run-length encoding. I know each image's width and height, and the images also contain some fully transparent parts.
One texture is structured as follows:
- a color map: 256 colors, 4 bytes per color — one value each for b, g, r, and one zero value
- texel data: in general each byte corresponds to a color in the color map, but transparent texels are compressed with a run-length encoding:
- the first value indicates how many transparent texels the row starts with
- the following value indicates how many of the following bytes are palette indices (a value of 5 means you have to map the next 5 bytes to the color map); the value after those 5 bytes can be another transparent-texel count, which in turn is followed by another colored-texel count, and so on
- but if the value is 0xFE, there are no more colored texels in this row; if the row contains fewer texels than the image width, the rest of the texels are transparent
- if the value is 0xFF, the end of the image (the last row) is reached; if there are fewer rows than the image height, the rest of the texels are transparent
For example, a 4x4 texture could look like this:
02 01 99 FE (two transparent pixels, color 99; one transparent pixel to fill the width)
01 02 98 99 FE (one transparent pixel, color 98, color 99; one transparent pixel to fill the width)
00 01 99 01 02 98 99 FE (color 99, one transparent pixel, color 98, color 99)
02 02 99 98 FF (two transparent pixels, color 99, color 98)
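For reference, a decoder for this layout can be sketched in a few lines of Python. This is a minimal sketch assuming exactly the structure described above (256 four-byte BGR0 palette entries followed by the RLE texel rows); `decode_texture` is a hypothetical helper name, not anything from the game.

```python
def decode_texture(data, width, height):
    # Decode one palette + RLE texture into flat RGBA bytes.
    # Assumed layout: 256 * 4-byte BGR0 palette entries, then the
    # run-length encoded texel rows described above.
    palette = [data[i * 4 : i * 4 + 4] for i in range(256)]
    pixels = bytearray(width * height * 4)  # zero alpha = transparent
    pos = 1024  # texel data starts right after the palette
    row = 0
    while row < height:
        x = 0
        transparent_run = True  # each row starts with a transparent count
        while True:
            v = data[pos]; pos += 1
            if v == 0xFF:        # end of image; remaining rows stay transparent
                return bytes(pixels)
            if v == 0xFE:        # rest of this row stays transparent
                break
            if transparent_run:
                x += v           # skip v transparent texels
            else:
                for _ in range(v):  # v literal palette indices follow
                    b, g, r, _zero = palette[data[pos]]; pos += 1
                    off = (row * width + x) * 4
                    pixels[off:off + 4] = bytes((r, g, b, 255))
                    x += 1
            transparent_run = not transparent_run
        row += 1
    return bytes(pixels)
```

Running this on the 4x4 example above reproduces the rows as listed, with alpha 0 wherever a texel was skipped.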
Because this is a very rudimentary compression scheme, maybe somebody knows whether it has a specific name? And most importantly: is there a way to upload this "compressed" data to OpenGL directly? I know that for this I would have to specify some encoding for the data in OpenGL.
I've already written an algorithm to convert this data to plain RGBA data. But that takes much more graphics memory than the game actually uses (about 30% of each image is transparent, which could stay run-length encoded instead). So if the game is not converting the image to full RGBA, I'd like to find out how it avoids that.
Can anybody give me some help?
CodePudding user response:
This is just a paletted image that uses RLE encoding for empty spaces. There's no specific name for that. It's like GIF, only not as good, but probably easier to decompress.
I've already written an algorithm to convert this data to plain RGBA data.
Then you're done. Upload that to an OpenGL texture.
So if the game is not converting the image to full RGBA, I'd like to find out how it avoids that.
You can't.
While you could implement a palette in a shader by using either two textures or a texture and a UBO/SSBO for the palette, you can't implement the run-length encoding scheme in a shader.
RLE is fine for data storage and bulk decompression, but it is terrible at random access. And random access is precisely how textures work. There's no real way to map a texture coordinate to a memory address containing the data for the corresponding texel. And if you can't do that, you can't access the texel.
Actual compressed texture formats are designed in a way that you can go directly from a texture coordinate to an exact memory address for the block of data containing that texel. RLE isn't like that.
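The random-access point is easy to see with a small comparison. This is a sketch, not the game's code, and it represents each decoded row as a hypothetical list of `(transparent_count, [indices])` runs: with a plain index buffer, texel (x, y) sits at a computable offset, while with RLE you must walk every run before it.

```python
def texel_from_indices(indices, width, x, y):
    # Uncompressed indexed image: constant-time random access —
    # exactly the addressing pattern texture sampling needs.
    return indices[y * width + x]

def texel_from_rle(rows, width, x, y):
    # rows[y] is a list of (transparent_count, [palette indices]) runs.
    # To find one texel you must scan every run before it in the row;
    # there is no formula mapping (x, y) to a byte offset.
    cursor = 0
    for skip, idxs in rows[y]:
        cursor += skip
        if x < cursor:
            return None               # inside a transparent run
        if x < cursor + len(idxs):
            return idxs[x - cursor]   # inside a literal run
        cursor += len(idxs)
    return None                       # past the last run: transparent
```

Block-compressed formats (S3TC, BPTC, etc.) keep the `texel_from_indices` property by using fixed-size blocks, which is why GPUs can sample them directly and why arbitrary RLE data can't be sampled the same way.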