mirh wrote:I thought the standard was releasing the game and blacklist the bad drivers.
Problem is, *all* drivers are bad when it comes to OpenGL, which means you have a lot of trouble even getting as far as a release-worthy game.
mirh wrote:That's really not something I accept.
I spent 2 days "analyzing" that game. And I can tell you most of the problems are with textures (which require you to create a cache folder) and with the 64-bit version which, even though it should be faster, had developer logging enabled by default.
For the rest I had no problems.
It took a number of patches, to both the game itself and to the various display drivers, before it worked properly.
The problem was that Rage used some extensions that had never been exercised before in production drivers. AMD needed no fewer than three hotfix releases before its drivers worked properly with the game.
mirh wrote:I thought after reading your very nice article that the point had never been timing.
Not necessarily, but if you're not the first, you have to do it a lot better than whoever was first.
mirh wrote:So there is an excuse for that? :\
Excuse for what? For OpenGL not integrating ES and mainstream OpenGL properly once the baseline of hardware allowed it? Nope, no excuse for that.
mirh wrote:And why DX can claim to be the same but different (even though it's still a subset), while GL ES can not?
I have a D3D11 engine whose entire codebase can be used on regular Windows, WinRT and Windows Phone without any changes to code or content. The same texture formats and shader code are supported everywhere.
OpenGL ES does not offer any functionality to load textures from files/streams/whatever, so you need OS-specific code for that. Also, OpenGL ES has a slightly different dialect of GLSL, which means you can't compile the same shaders for regular OpenGL and ES without changes/hacks.
There's no excuse for that.
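To make the shader split concrete, here is a minimal sketch. The two fragment shaders are hypothetical examples, not taken from any particular codebase; their bodies are identical, but desktop GLSL and GLSL ES need different `#version` lines, and GLSL ES additionally requires a default float precision in fragment shaders:

```python
# Sketch of the GLSL dialect split. Same trivial fragment shader, but the
# boilerplate differs per target -- exactly the kind of per-platform
# patching you end up doing at build or load time.
DESKTOP_FS = """#version 330
in vec2 uv;
out vec4 fragColor;
uniform sampler2D tex;
void main() { fragColor = texture(tex, uv); }
"""

ES_FS = """#version 300 es
precision mediump float;
in vec2 uv;
out vec4 fragColor;
uniform sampler2D tex;
void main() { fragColor = texture(tex, uv); }
"""

# Different version directives...
assert DESKTOP_FS.splitlines()[0] != ES_FS.splitlines()[0]
# ...and ES demands a default precision that desktop GLSL doesn't use here.
assert "precision mediump float;" in ES_FS
assert "precision" not in DESKTOP_FS
```

The shader bodies being identical is the frustrating part: the incompatibility is pure boilerplate, yet it still forces either two sets of sources or a preprocessing step.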
mirh wrote:Also, OpenGL ES has always been the golden standard in mobile exactly for the same reasons DX become a thing in desktop.
It was the best in its time, and it has always held its ground.
It wasn't the best, it was the ONLY option. It never had any competition until Apple introduced Metal. D3D is not really a competitor, because there's no mobile platform where you can choose between the two.
mirh wrote:Then you said it's not the API itself to suck, but the surrounding environment
Yes, the API itself would have worked, if it had a decent ecosystem to support it. As I discussed earlier (proper driver model, driver validation, good SDK, mature drivers etc).
But it didn't. I never said the API design was the problem.
mirh wrote:Then again you repeat it's the API itself that sucks and that people only care of Windows
No, I did not say the API itself sucks. Reading comprehension issues? Nothing I'm saying is about the API; it's all about the surrounding ecosystem.
mirh wrote:Then I'd say the API looks good, and you'd tell it's the environment. I'd tell you the environment changed and you again would tell me it's the API that sucks.
No, I'm saying the environment has not changed. Therefore, the ecosystem is going to suck as much as the OpenGL one did, because it's the same ecosystem, just a new name.
mirh wrote:Surprise: the power of open extensions.
Yes, extension hell, see above with Rage... One more reason why the OpenGL ecosystem sucks. No standardization, no validation.
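To illustrate the extension-hell point: the classic way to detect extensions is to query one big space-separated string (`glGetString(GL_EXTENSIONS)`; core GL 3.x later added `glGetStringi` per-extension queries) and search it, then branch into vendor-specific code paths. A minimal Python sketch of that logic, with a hypothetical sample extension string (`GL_AMD_pinned_memory` is the real vendor extension id Tech 5 reportedly relied on):

```python
def has_extension(extension_string: str, name: str) -> bool:
    """Exact token match against a space-separated extension list.

    Plain substring search is a classic bug here: searching for
    'GL_ARB_texture' would falsely match 'GL_ARB_texture_float'.
    """
    return name in extension_string.split()

# Hypothetical sample data standing in for glGetString(GL_EXTENSIONS).
exts = "GL_ARB_texture_float GL_EXT_texture_compression_s3tc GL_AMD_pinned_memory"

assert has_extension(exts, "GL_AMD_pinned_memory")   # take the AMD fast path
assert not has_extension(exts, "GL_ARB_texture")     # exact match, no prefix hit
```

Every such check multiplies the number of code paths you ship, and without central validation each vendor-specific path is only as tested as that vendor's driver.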
mirh wrote:And it's not like PS2 is your usual emulator, you know.
It's ancient technology, all fixed-point, and far, FAR more primitive than today's GPUs. If you have problems emulating year-2000 console technology on 2015 hardware, you're doing something horribly wrong.
mirh wrote:And they are not surprised you understand nothing of the real problem (whom was quite simplified there actually)
I understand the problem perfectly well: they're emulating a year-2000 console on 2015 hardware.
mirh wrote:And, besides 16 bits operations aren't equivalent to 8 bits due to different rounding (that's another of the ps2 uniquenesses)
Look, why don't you stop pretending you know what you're talking about, okay? If all you know is '8-bit' and '16-bit', then apparently you have no clue about the differences between floating point and fixed point, the existence of other formats such as 2:10:10:10 ARGB, or the fact that even PS1.x shaders need at least 10 to 14 bits of precision.
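On the rounding point specifically, a minimal sketch (hypothetical values, generic fixed-point math rather than the PS2's actual pipeline) of why the same multiplication carried out in 8-bit and 16-bit fixed point does not round to the same result:

```python
def fixed_mul(a_real: float, b_real: float, frac_bits: int) -> float:
    """Multiply two reals in n.frac_bits fixed point, return the real result."""
    scale = 1 << frac_bits
    a = round(a_real * scale)        # quantize inputs to frac_bits fractional bits
    b = round(b_real * scale)
    product = (a * b) >> frac_bits   # fixed-point multiply truncates low bits
    return product / scale

x, y = 0.3, 0.7
r8 = fixed_mul(x, y, 8)    # 8 fractional bits
r16 = fixed_mul(x, y, 16)  # 16 fractional bits

# The exact answer is 0.21; each precision lands somewhere slightly different.
assert r8 != r16
```

So an emulator that naively runs the narrow-precision math at a wider precision (or in floating point) gets subtly different low bits, which is exactly the kind of mismatch that breaks pixel-exact behavior.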
I am an expert on graphics technology (it's why they pay me the big bucks), and my company contributed to DX12 development in the early-access program. Yes, DX12 has been around a lot longer than you've heard of it, and it's actually a finished product, whereas Vulkan was only *started* a few months ago, when AMD finally donated the Mantle documentation.
So stop being a jerk and insulting me by claiming I wouldn't even know how a PS2 works. I know that perfectly well, and I could probably help the emulator developers get their code running on D3D and make it perform well.