Wintershade
2011-May-03 08:48 UTC
[Wine] Switching to 16-bit colour depth (from 24) - is it possible?
Hello. I'm having a bit of trouble running some old games, like Planescape: Torment (http://appdb.winehq.org/objectManager.php?sClass=version&iId=22660&iTestingId=61130) (the GOG.com version). When I try to start the game, it complains about the desktop colour depth (my default is 24-bit, and it requires 16-bit). The game suggests I either run it in fullscreen (which I am) or switch to 16-bit colour depth, which is quite an inconvenience if I have to edit xorg.conf and restart X every time I want to play the game, then edit xorg.conf back and restart X again.

Is it possible to use (or emulate?) 16-bit colours inside, e.g., a Wine virtual desktop? Or to use some OpenGL extension which would provide this effect?

Thanks in advance :)
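For context, the xorg.conf edit being avoided here is usually just changing one line in the Screen section (the Identifier value below is a placeholder; it varies per setup):

```
Section "Screen"
    Identifier "Screen0"     # name varies per configuration
    DefaultDepth 16          # was 24; X must be restarted for this to apply
EndSection
```

This is a sketch of the typical change, not a complete configuration; the rest of the Screen section (Device, Monitor, SubSection "Display") stays as it was.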
oiaohm
2011-May-03 10:16 UTC
[Wine] Re: Switching to 16-bit colour depth (from 24) - is it possible?
The answer is kinda no. Some applications fall for the virtual desktop window option and work, but if they don't fall for it, you are kinda stuffed. X11 has its limitations here. I have not looked at Xephyr or other nested X11 servers. Basically, Wine does not emulate what the screen is.
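The Xephyr route mentioned above is worth trying, though: Xephyr is a nested X server that runs as a window inside the existing session and can be given its own colour depth. A minimal sketch (the game path is a placeholder, and whether the game accepts the nested display depends on the application):

```shell
# Start a nested X server as display :1, 1024x768 at 16-bit depth
Xephyr :1 -screen 1024x768x16 &

# Point Wine at the nested display instead of the real one
DISPLAY=:1 wine "C:\\Games\\Torment\\Torment.exe"
```

Inside the Xephyr window the root depth really is 16-bit, so the depth check should pass without touching xorg.conf; the trade-off is that GPU acceleration inside the nested server may be limited.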
James_Huk
2011-May-05 07:19 UTC
[Wine] Re: Switching to 16-bit colour depth (from 24) - is it possible?
@vitamin:

> And what does xinit do? It starts another X server.

OK, I messed up a bit. I know that it starts a separate X session; what I meant to say is that by using xinit we can easily start another session without messing with anything in xorg.conf, then use that session, switch back, or whatever. So yes, we have to start a separate X session, but we can do that with one simple command ;]

> What all are you running in 16-bit mode that uses GPU acceleration?

Mostly OpenGL games (native Linux ports, and through Wine), but especially 2D games through Wine with "DirectDrawRenderer" set to opengl; older strategy games tend to get a pretty huge performance boost that way. I've never had any problems with 3D acceleration at 16-bit depth. My hardware: a GF9800GT (newest drivers from the Nvidia site + Debian Stable/Testing) and a GF9100M (newest drivers from the Nvidia site + Debian Stable/Testing). I use 16-bit depth much more often on the latter, of course, because it is too slow to run most games at 24-bit depth.
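The "one simple command" approach described above can be sketched like this (the client path and VT number are examples; xinit requires an absolute path to the client program):

```shell
# Start a second X server on display :1, virtual terminal 8,
# with a 16-bit root depth, running the given client.
# No change to xorg.conf is needed; the original 24-bit session
# on :0 keeps running and you can switch back with Ctrl+Alt+F7.
xinit /usr/bin/wine /path/to/game.exe -- :1 -depth 16 vt8
```

When the client (here, Wine) exits, the second server shuts down and you are back where you started, which is exactly the convenience being argued for in this post.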