Author Topic: 8 bit palette textures  (Read 8131 times)

ChaosControl

  • Posts: 741
8 bit palette textures
« on: 2006-02-08 07:23:23 »
Yeah, I know this is a n00b question, but does anyone even know what exactly it does?

As far as I know, no one has had anything but failure with 8-bit palette textures, so I'd like to know what it does...

Anyone have a screenie of FF7 or FF8 with 8-bit palette textures enabled?

-gone away-

  • Posts: 385
8 bit palette textures
« Reply #1 on: 2006-02-08 11:20:14 »
It's just how many colours the textures go up to, so 8-bit will only go up to 256. The Voodoo cards (3 & 4) I think used to only go up to 16-bit, and then around the time T&L cards started coming out, graphics cards started going up to 32-bit.

So a card that can handle 32-bit can also handle anything before it, although it's sometimes disabled for performance optimisation, as 32-bit is also quicker (don't know why... just is), so by forcing the game to use 32-bit you increase the overall FPS. So basically it's just how many colours are used to display the textures.
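As a quick arithmetic check of the colour counts above, a small Python sketch of the 2^bits relationship (the "32-bit" interpretation as 24 bits of colour plus 8 bits of alpha is an assumption on my part):

```python
# Number of distinct values representable at each bit depth.
# Assumption: "32-bit" means 24 bits of colour + 8 bits of alpha,
# so the colour count for 32-bit mode is really 2**24.
for bits, label in [(8, "8-bit palette"),
                    (16, "16-bit colour"),
                    (24, "24-bit colour (in 32-bit RGBA)")]:
    print(f"{label}: {2 ** bits} colours")
```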

The most noticeable difference between a 32-bit and an 8-bit screenshot will be where the colours blend together (e.g. in the menu when it goes from light blue to dark blue), and also transparencies, where the foreground blends with the background. However, unlike what everyone seems to think, none of that should cause the transparencies to go black!!! The fact that they're black means that the card is not even trying to process the transparencies, or is doing it completely wrong. Also, the fact that the rest of the textures are displayed correctly means that your card can handle the 8-bit textures... otherwise nothing would be displayed at all, except maybe vertex colours.

Textures are made up of several layers; one of these (the alpha channel, I think) is used to tell the gfx card which parts of the texture are transparent and just how see-through they are meant to be. If the card fails to do this, then the transparent areas will appear as either black or blue, depending on what it's been programmed to see through. Games as old as FF possibly expect only one layer per texture, and as a result make every pure black pixel transparent.
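A rough Python sketch of that last point: with no alpha layer, a loader can fall back to treating every pure-black pixel as fully transparent (colour keying). The function name and data layout are illustrative, not from any actual FF7/FF8 code:

```python
def add_colour_key_alpha(rgb_pixels):
    """Convert a list of (r, g, b) pixels to (r, g, b, a) tuples,
    making pure black (0, 0, 0) fully transparent (alpha 0) and
    everything else fully opaque (alpha 255)."""
    rgba = []
    for (r, g, b) in rgb_pixels:
        alpha = 0 if (r, g, b) == (0, 0, 0) else 255
        rgba.append((r, g, b, alpha))
    return rgba

# Note that (0, 0, 1) is "almost black" but stays opaque -- only
# exact pure black is keyed out, which is why colour keying can
# leave ugly dark fringes around sprites.
pixels = [(0, 0, 0), (255, 128, 0), (0, 0, 1)]
print(add_colour_key_alpha(pixels))
```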

Correct me if I'm wrong about any of that... I'm not a perfect machine of unmistakable accuracy, as much as I wish I was.

ChaosControl

  • Posts: 741
8 bit palette textures
« Reply #2 on: 2006-02-08 11:47:26 »
Quote from: smithie

Correct me if I'm wrong about any of that... I'm not a perfect machine of unmistakable accuracy, as much as I wish I was.

You like t3h borg don't you ;P

Anyway, in FF8 I used to have this problem: the menu colours would look awful and not blend smoothly into each other. Now that 8-bit etc. works, it's gone and everything looks way better! Just wanted to know why, actually :P

mirex

  • Posts: 1645
    • http://mirex.mypage.sk
8 bit palette textures
« Reply #3 on: 2006-02-08 11:51:44 »
Do you want to know about 8-bit palette textures in general, or in relation to FF7 / FF8?

In my opinion, FF with 8-bit textures enabled should look exactly the same as, or very similar to, truecolor mode.

ChaosControl

  • Posts: 741
8 bit palette textures
« Reply #4 on: 2006-02-08 11:52:47 »
Well, now that you mention it, I'm kinda interested in the entire topic, so if anyone has anything useful to say about it, feel free to post it here.

-gone away-

  • Posts: 385
8 bit palette textures
« Reply #5 on: 2006-02-08 12:09:43 »
Quote from: chaoscontrol
You like t3h borg don't you ;P

Actually, I hate Star Trek... too boring for me.

Quote from: chaoscontrol

Anyway, in FF8 I used to have this problem: the menu colours would look awful and not blend smoothly into each other. Now that 8-bit etc. works, it's gone and everything looks way better! Just wanted to know why, actually :P

There was recently a topic on a forum about how particular cards running in 32-bit mode mess up (i.e. don't look as good with) 8-bit textures in Quake 2, and there is actually a fix for it. (There's a test for your 8-bit textures... play Quake 2.) Maybe your card has the same problem. On mine the game looks normal, with an ATI Radeon 9600.

If you want to see what the problem I'm referring to looked like, set your gamma up really high and it will give you a similar effect. It will make the different shades of colour very distinct and very grainy.

Side note: OpenGL is/was limited to 16-bit textures; DX9 isn't. As far as I know, Quake 3: Arena was the first to break this rule, supporting 32-bit colours in an OpenGL program. In my opinion OpenGL runs smoother and faster, however it is harder to program for.

Cyberman

  • Posts: 1572
8 bit palette textures
« Reply #6 on: 2006-02-08 15:50:01 »
Actually DirectX borrows most of its useful API ideas from OpenGL, as originally DirectX... was horrible to program for.  That's why many refused to touch it till the last 8 years.  When first introduced people were... well, pissed (see circa 1992/3).  The current interface, as I said, is quite OpenGL-ish.

Now for 8-bit paletted textures.  Those are not supported in most modern hardware.  OpenGL has ALWAYS supported paletted textures.  <-- READ THE SPEC, look under Rasterization.  However, that doesn't mean the card is required to support it in hardware, and it doesn't mean the implementation is even required to support it.  There is no 'minimum' OpenGL configuration per se, just that it support the interface; basically it can ignore 8-bit paletted textures and still be OpenGL 1.1 - 1.5 compliant.

In any case, you could look at the PlayStation to better understand paletted textures.  An example is FF7's battle scenes: these are textured polygons using an 8-bit paletted texture.  TIM files store indexes and palettes into the PS1's video memory, and these are used to rasterize to a 16-bit RGB display.

8-bit palettes are slow on modern hardware because they are not supported in the hardware, and instead have to be decoded to 32-bit textures in software, THEN used in rasterization.
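A minimal Python sketch of what that software decode step amounts to (illustrative only, not actual driver code): each 8-bit index is replaced by the 32-bit RGBA palette entry it points at, producing a plain truecolour texture the hardware can use directly:

```python
def expand_paletted_texture(indices, palette):
    """Expand an 8-bit paletted texture to 32-bit RGBA in software.

    indices: bytes of 8-bit palette indices, one per texel.
    palette: a list of up to 256 (r, g, b, a) tuples.
    Returns a flat list of bytes, 4 per texel (R, G, B, A)."""
    out = []
    for i in indices:          # iterating bytes yields ints 0-255
        r, g, b, a = palette[i]
        out.extend((r, g, b, a))
    return out

# Hypothetical 256-entry greyscale palette and a tiny 3-texel texture.
palette = [(i, i, i, 255) for i in range(256)]
texels = bytes([0, 128, 255])
print(expand_paletted_texture(texels, palette))
```

Note the cost: every texel is touched on the CPU before upload, which is exactly why this path is slow compared to hardware-native formats.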

Cyb

Micky

  • Posts: 300
8 bit palette textures
« Reply #7 on: 2006-02-08 18:06:13 »
The problem with paletted textures is that to get the colour for a texel you first have to look up an index in the texture, and then look up this index in a colour table (= palette). This requires two reads for point sampling, and many more reads for filtered sampling. Additionally, these reads can be spread out over memory, defeating some of the caching schemes.
One of the original reasons to have them was to save memory, but as memory becomes cheaper the pressure to use them is reduced, especially as texture compression schemes like S3TC (= DXT1-5) have a similar effect. That's why they're not supported on many modern cards.
You can do some effects with palettes, for example changing the colour of an object by rendering it with different palettes, but you can do similar effects with a pixel shader nowadays.
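That palette-swap trick can be sketched in Python like this (names and data are made up for illustration): the same index data, rendered through two different palettes, yields two recolourings of one sprite without touching the sprite itself:

```python
def apply_palette(indices, palette):
    """Resolve each palette index to its colour entry."""
    return [palette[i] for i in indices]

# One shared sprite (index data), two hypothetical palettes.
sprite = [0, 1, 1, 0]
red_palette = [(0, 0, 0), (255, 0, 0)]    # index 1 -> red
blue_palette = [(0, 0, 0), (0, 0, 255)]   # index 1 -> blue

print(apply_palette(sprite, red_palette))
print(apply_palette(sprite, blue_palette))
```

This is the classic way old games got, say, differently coloured enemy variants for free; as the post says, a pixel shader can achieve the same effect on modern hardware.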

-gone away-

  • Posts: 385
8 bit palette textures
« Reply #8 on: 2006-02-08 20:13:07 »
Quote from: Cyberman
Actually DirectX borrows most of its useful API ideas from OpenGL, as originally DirectX... was horrible to program for.  That's why many refused to touch it till the last 8 years.  When first introduced people were... well, pissed (see circa 1992/3).  The current interface, as I said, is quite OpenGL-ish.

I must have another look at OpenGL; the version of OpenGL I was looking at was very old compared to the DX version I was looking at. It looked... uh, messed up & difficult, so I haven't really looked at it since that first encounter.

Quote
8-bit palettes are slow on modern hardware because they are not supported in the hardware, and instead have to be decoded to 32-bit textures in software, THEN used in rasterization.

Cheers, that's what I wanted to know. I assumed that a 16-bit palette would be backwards compatible in a way; what I thought was that by providing support for it, you shouldn't need to support 8-bit.