First post, by duralisis
So I'm planning on getting a new laptop, partly to play older games and partly to replace a netbook, and most of them come with Intel HD Graphics 4000 built in (Ivy Bridge based Core i3). The specs are still low enough that I can reasonably dual boot XP, and XP drivers are easily available. That sounds good, but I'm concerned about whether the HD 4000 keeps the same game compatibility I've seen with older Intel graphics, like the i945GM, HD 2000, etc.
All of those previous Intel graphics have very good image quality and support for older games, especially when it comes to 16-bit dithering. But I'd like to know whether Intel has done what both AMD and NVIDIA did in their later generations (the 8800GTX onward and the HD2400 onward) and eliminated palettized texture support, or support for the texture formats that 16-bit colour games rely on, which is what makes them render blotchy on newer cards.
For examples of what I'm talking about, see Thief 2, System Shock 2, Blood 2, Shogo, or really any 16-bit rendered game: blotchy textures and no dithering on the newer cards. Supposedly any DX10-class card had these changes made, with the fixed-function bits removed from the architecture, etc.
So how do these older games look on HD 4000 level graphics in comparison?
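If anyone with an HD 4000 wants to check directly, below is a minimal sketch of my own (not from any of these games) that asks the Direct3D 9 runtime whether the installed driver still reports the palettized (P8) and 16-bit texture formats these titles lean on. It assumes a Windows build with the D3D9 headers and d3d9.lib available, and the X8R8G8B8 desktop mode is just an illustrative choice; the old games themselves use D3D6/7, OpenGL or Glide, so treat this only as a rough proxy for what the driver exposes.

    // Sketch: query the D3D9 driver for palettized and 16-bit texture formats.
    // Build (MSVC): cl checkfmt.cpp d3d9.lib
    #include <windows.h>
    #include <d3d9.h>
    #include <cstdio>

    static void report(IDirect3D9 *d3d, D3DFORMAT fmt, const char *name)
    {
        // Ask whether a plain texture of this format can be created on the
        // primary adapter with an X8R8G8B8 desktop (illustrative assumption).
        HRESULT hr = d3d->CheckDeviceFormat(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL,
                                            D3DFMT_X8R8G8B8, 0,
                                            D3DRTYPE_TEXTURE, fmt);
        printf("%s: %s\n", name, SUCCEEDED(hr) ? "supported" : "NOT supported");
    }

    int main()
    {
        IDirect3D9 *d3d = Direct3DCreate9(D3D_SDK_VERSION);
        if (!d3d) { printf("Direct3D 9 not available\n"); return 1; }

        report(d3d, D3DFMT_P8,       "P8 (palettized)");
        report(d3d, D3DFMT_R5G6B5,   "R5G6B5 (16-bit)");
        report(d3d, D3DFMT_A1R5G5B5, "A1R5G5B5 (16-bit + alpha)");

        d3d->Release();
        return 0;
    }

On the DX10-era cards mentioned above I'd expect P8 to come back as NOT supported, which matches the blotchy results people see; I'd be curious what the HD 4000 driver reports.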
Need a reference? See the last post in this thread about the 8800GT:
https://forums.geforce.com/default/topic/3675 … 636443/#2636443