Talk:Humongous Entertainment/Progress/16bits Support
Latest revision as of 05:13, 1 January 2009
Adding Freddi 5 Screenshot
It says "I.e. if you'll take a look at current 16bits games" on the main page. Should I upload a picture (or two) to show how ScummVM handles it now? -Clone2727 21:59, 23 January 2007 (UTC)
Mixed 8- and 16-bit graphics?
Is it absolutely necessary for the backend to be able to handle mixed 8- and 16-bit graphics at the same time? It seems to me that it would be easier/cleaner to have one (mandatory) 8-bit mode and one (optional) 16-bit mode. Possibly two separate classes. Perhaps mixed mode could be handled by a platform-independent layer in between? We might need a layer in between anyway to convert between different 16-bit colour formats.
Things could get hairy if you need to find the palette index for an on-screen colour, but that should be solvable. Eriktorbjorn 22:04, 29 December 2008 (CET)
No. HE games requiring 16-bit colour use only 16-bit palettes, and convert 8-bit graphics internally. Kirben