VOGONS



Windows and portability


Reply 20 of 41, by Expack3

mirh wrote:

I really can't be very precise.
Though I know texture_barrier and DSA are very useful.
More or less, modern GPUs, when computing transparencies, can only account for values between 0 and 1. PS2 hardware, on the other hand, could go beyond this.

And no, it's not only hdr-colclip.

I was precise for Scali and specifically asked him about the hdr-colclip and the "impossible blend". He responded to both counts. I suggest you re-read that part of the conversation.

I also suggest you start doing research on the person you're debating. He has a blog, you know. (I didn't link it because it isn't hard to find.)

Reply 21 of 41, by Scali


Taking this:


(Color1 - Color2) * Coefficient + Color3

The issues we have with this are as follows:
1. In the PS2, the coefficient factor could range from [0;2] (fortunately this almost never happens)
2. If Color3 and Color1 are the same source, the equation will be:

Color1 * (1 + Coefficient) - Color2 * Coefficient

which will result in the first half of this equation always being larger than 1. This is a problem because the GPU is limited to 1. This is why this type of blending is impossible on the fixed function unit of a PC's GPU.

Without knowing the specifics of the PS2 hardware, I'd guess that scaling the coefficient by 0.5 beforehand is the best option (assuming there's a valid reason why they don't just use FP16 alphablend in the first place...?).
Then it will not saturate until you scale it back, which should yield the same results (and if I overlooked something here, you could also apply bias-values to avoid saturation above and below).
I'm surprised they don't understand that. The PS2 is not a floating-point device in the first place but fixed-point; the values in the range 0..1 are scaled anyway (they are probably just 8-bit internally or so, so you'd just have 0..255). And so is a range of 0..2. Scaling them down to ranges of 0..0.5 and 0..1 respectively doesn't really change anything, as long as you have enough bits in your fixed-point representation. Which you should, since we're talking about modern GPUs with floating-point pipelines.
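
To make the saturation issue concrete, here is a minimal C sketch of the blend above and of the scaling workaround (illustrative values and clamping rules only, not PCSX2/GSdx code; the scaled variant assumes the coefficient stays at or below 1, the common case noted in the quote):

    #include <stdio.h>

    static float clamp01(float x) { return x < 0.0f ? 0.0f : (x > 1.0f ? 1.0f : x); }

    /* Blend unit that saturates every intermediate term to [0,1], as described above */
    static float blend_clamped(float c1, float c2, float k)
    {
        return clamp01(clamp01(c1 * (1.0f + k)) - clamp01(c2 * k));
    }

    /* Same blend on values pre-scaled by 0.5: nothing saturates mid-way
       (assuming k <= 1), and the result is rescaled once at the end */
    static float blend_scaled(float c1, float c2, float k)
    {
        float half = (0.5f * c1) * (1.0f + k) - (0.5f * c2) * k;
        return clamp01(2.0f * half);
    }

    int main(void)
    {
        float c1 = 0.9f, c2 = 0.3f, k = 1.0f;   /* illustrative values; k is [0,2] on the PS2 */
        printf("clamped: %f\n", blend_clamped(c1, c2, k)); /* 1.0 - 0.3 = 0.70: too dark */
        printf("scaled : %f\n", blend_scaled(c1, c2, k));  /* 0.9*2 - 0.3 = 1.5 -> clamps to 1.0 */
        return 0;
    }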

The approach they suggest seems overly convoluted, quite inefficient, and requires special hardware features that you don't really need at all (they want to use read-modify-write on textures to speed it up... something that you'd probably do with compute shaders and/or UAVs in DX11).

This part sounds wrong though:

There is a catch however. Fragment shaders (like any CPU) are relatively slow. In order to compensate for this the fragments are executed out-of-order.

That's not the reason. The reason is that it's a triangle rasterizer, and it's an embarrassingly parallel problem. 'Out-of-order' implies some kind of active reordering of the input, which is not the same as doing operations in parallel but still in the order the data comes in.
Transparency is a special-case. Modern hardware (as supported in DX11.3/DX12_1) actually has rasterizer ordered views to make the fragments execute in a known order. Which is something I blogged about earlier, since only Intel and nVidia support this currently. AMD hardware does not.
But for emulating a PS2, which is decade-old technology, all that seems way overkill.

http://scalibq.wordpress.com/just-keeping-it- … ro-programming/

Reply 22 of 41, by Stiletto


Guys, you're WAY off topic, so we're splitting this thread and putting this discussion over in Milliways.

"I see a little silhouette-o of a man, Scaramouche, Scaramouche, will you
do the Fandango!" - Queen

Stiletto

Reply 23 of 41, by Jorpho

HunterZ wrote:

Microsoft created a whole DOS emulator (NTVDM) for 32-bit Windows NT/2K/XP. It can't be that much harder to create a 16-bit emulation layer (for Win9x apps if not DOS ones).

About as hard as writing DOSBox, I reckon. I suspect the biggest concern would be maintaining security. You may recall EMS support was patched out of the NTVDM at some point during XP's lifetime because of a vulnerability.

Ironically, I think Wine is able to run 16-bit Windows apps on 64-bit Linux.

That relies on a bit of trickery with the kernel that was also nearly disabled in recent releases. I have been led to believe that they are trying to move towards DOSBox for executing 16-bit code.

Reply 24 of 41, by Scali

Jorpho wrote:

About as hard as writing DOSBox, I reckon.

DOSBox is a conventional emulator and just interprets x86 code. NTVDM does not interpret x86 code; it uses v86 mode.
Microsoft could have used virtualization, with their Hyper-V technology. This way they could host a VM that runs in 32-bit mode and supports v86 processes, so basically they could run NTVDM inside that VM. The VM could then talk to the rest of the system via the Hyper-V layer.
You actually *can* do this anyway in Windows 7, using the built-in XP mode. It's not completely seamless, but it does allow you to run 16-bit DOS/Windows processes inside a 64-bit Windows environment.
With other versions of Windows, you could use Hyper-V/VirtualPC to manually install Windows XP to do the same.

http://scalibq.wordpress.com/just-keeping-it- … ro-programming/

Reply 25 of 41, by HunterZ

Stiletto wrote:

Guys, you're WAY off topic, so we're splitting this thread and putting this discussion over in Milliways.

Thanks, I was thinking of doing it myself but haven't had time.

Not sure what I think about merging it with a 2-year old thread, though, especially since people are replying to the older posts too now 🤣.

Reply 26 of 41, by Stiletto

HunterZ wrote:
Stiletto wrote:

Guys, you're WAY off topic, so we're splitting this thread and putting this discussion over in Milliways.

Thanks, I was thinking of doing it myself but haven't had time.

Not sure what I think about merging it with a 2-year old thread, though, especially since people are replying to the older posts too now 🤣.

🤣 actually, EVERY post in this new thread came from the "indirect sound" thread. I guess I looked for where things started to go awry, and it was definitely with Scali quoting mr_bigmouth_502, which had been a two-year-old post, after mirh had bumped the thread for other reasons (to comment on new IndirectSound-related discoveries). 😁 So yeah, I looked for the "natural split point" where things started going off course, and I felt it was mostly mr_bigmouth_502, HunterZ and idspispopd commenting on the general state of affairs, so those came over too 😉 😜 🤣

"I see a little silhouette-o of a man, Scaramouche, Scaramouche, will you
do the Fandango!" - Queen

Stiletto

Reply 27 of 41, by mirh

Scali wrote:
mirh wrote:

I meant native as in you can get your project to build on a given platform with a couple of clicks. After all it's what modern game engines are about, isn't it?

The problem is not building it, but getting code that actually works properly and performs well on a wide range of systems.

You say?
I thought the standard was releasing the game and blacklisting the bad drivers.

Scali wrote:

Which is why nobody develops games for OpenGL on Windows... it simply doesn't work. Just look at the huge issues with Rage.

That's really not something I accept.
I spent 2 days "analyzing" that game, and I can tell you most of the problems are with textures (which require you to create a cache folder) and with the 64-bit version, which, even though it should be faster, had developer logging enabled by default.
For the remainder I had no problem.

Scali wrote:
mirh wrote:

Is it the same this time? I don't think so.

I think it is. Once again, DX12 is already there, and Vulkan isn't.

Even assuming that 4 months are a big deal and that nobody is already working on it under NDA (they are, btw),
I thought after reading your very nice article that the point had never been timing.

Scali wrote:
mirh wrote:

They were pretty limited back then, you know?

That's no excuse. OpenGL has been around since 1992. Older versions of OpenGL ran on equally primitive hardware. There's no reason why they'd have a slightly different API and shader language targeting hardware with the same capabilities. Especially for revisions 2 and newer of OpenGL, which target shader-hardware only.

Scali wrote:

That's not the point (and WP supports DX9-class hardware with the DX11 API... as does the regular Windows version of course, since they're the same API).
Microsoft used to have its own mobile API as well, before WP8. But they made it into one API once the hardware was capable enough. OpenGL failed to do that.

So there is an excuse for that? :\

And why can DX claim to be the same but different (even though it's still a subset), while GL ES cannot?
Also, OpenGL ES has always been the gold standard on mobile for exactly the same reasons DX became a thing on desktop.
It was the best at the time and it always held its ground. So much so that two years ago nobody was planning to scrap it, because there was no reason to.

Scali wrote:
mirh wrote:

Every major engine and hardware vendor already confirmed their endorsement

Seen that all before, with OpenGL.

Oh... so you said: OpenGL doesn't have enough support, and Vulkan won't either because the same parties are involved.
I asked you: when has the problem ever been the parties?
And you replied: always, because everybody focuses entirely on Windows/D3D and, like OpenGL, it [the API, right?] is still just a poor substitute for Direct3D.
Then you said it's not the API itself that sucks, but the surrounding environment.
So I told you that every major party has already jumped on the Linux bandwagon (the Windows part of the problem) and that the situation is pretty different nowadays than in the 90s...
And you say, no, it's the same.

Scali wrote:

Again, seen it all before with OpenGL.
It's not going to happen. Vulkan is too late, it's going to suck, and nobody is going to care because we have Windows and DX12.
For OpenGL/Vulkan it's not enough to just exist. It needs to be better than DX12 and easier to use. That's just never going to happen, because Microsoft is just much larger and better organized. They get their APIs on the market faster, and have much better contacts with the IHVs for developing drivers and such. Microsoft also provides much better tools, such as the SDK and debugging/profiling tools.

Then again you repeat that it's the API itself that sucks and that people only care about Windows,

and here I'd point you to mobile again, and you would tell me OpenGL ES sucked so even Vulkan is going to suck [fast-forward] because they are the same parties.
Then I'd say the API looks good, and you'd tell me it's the environment. I'd tell you the environment has changed and you again would tell me it's the API that sucks.

Christ, what really sucks is circular reasoning. And wasting an hour piecing together everybody else's logical connections.
And btw, that first-person plural in front of "have" doesn't really look like a neutral indicator of your stance.

Scali wrote:

Those are VERY new extensions to OpenGL, circa 2014. Even if you were to compare to DX11, that's an API from 2009. So the only fair comparison would be against DX12. But I doubt those emulator makers use anything newer than DX9.

Surprise: the power of open extensions.

Scali wrote:

Which doesn't matter, because you can still do everything a PS2 can do, and more. You just need to know what you're doing.
In short, they are using fixed-function framebuffer blending, when they should be using render-to-texture (or be clever with using a higher-precision framebuffer than what the PS2 has, and scale the values to avoid saturation).

Which they are using, by the way. And it's not like PS2 is your usual emulator, you know.
It's perhaps the most arcane console architecture ever, and it's very, very different from a normal computer.
The problem is not with rendering effects, I believe; it's that they need to go back and forth between CPU and GPU, and that's extremely costly.
But I didn't read the esoteric manual, so I'll gloss over it.

Scali wrote:

Then it will not saturate until you scale it back, which should yield the same results (and if I overlooked something here, you could also apply bias-values to avoid saturation above and below). I'm surprised they don't understand that

And they are not surprised you understand nothing of the real problem (which was quite simplified there, actually).
GSdx is bandwidth-limited, and using a 16-bit framebuffer instead of an 8-bit one wouldn't really help with frame rates.
And besides, 16-bit operations aren't equivalent to 8-bit ones due to different rounding (that's another of the PS2's peculiarities).
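
As a rough illustration of that rounding point (with made-up values and hypothetical rounding rules, not the GS's exact ones), truncating after each 8-bit step can land one LSB away from doing the same sums at higher precision and rounding once:

    #include <stdint.h>
    #include <stdio.h>

    int main(void)
    {
        uint8_t c1 = 200, c2 = 100, a = 77;              /* made-up source, dest and alpha values */

        /* 8-bit path: truncate each product back to 8 bits immediately */
        uint8_t p1  = (uint8_t)((c1 * a) >> 8);
        uint8_t p2  = (uint8_t)((c2 * (255 - a)) >> 8);
        uint8_t low = (uint8_t)(p1 + p2);

        /* wider path: keep the intermediate at full precision, round once at the end */
        uint32_t wide = (uint32_t)c1 * a + (uint32_t)c2 * (255 - a);
        uint8_t high  = (uint8_t)((wide + 128) >> 8);

        /* prints 129 vs 130 for these inputs: same math, one LSB apart */
        printf("8-bit path: %u, wide path: %u\n", (unsigned)low, (unsigned)high);
        return 0;
    }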

Jorpho wrote:

Ironically, I think Wine is able to run 16-bit Windows apps on 64-bit Linux.

That relies on a bit of trickery with the kernel that was also nearly disabled in recent releases. I have been led to believe that they are trying to move towards DOSBox for executing 16-bit code.

It was disabled by default actually.
But it's nothing more than a /proc setting away.

Stiletto wrote:

Guys, you're WAY off topic, so we're splitting this thread and putting this discussion over in Milliways.

And I deeply thank you.
I was so sorry for the OT, yet still intrigued by the discussion.

pcgamingwiki.com

Reply 28 of 41, by HunterZ

Stiletto wrote:

🤣 actually, EVERY post in this new thread came from the "indirect sound" thread.

Ha! Good deal then.

I felt it was mostly mr_bigmouth_502, HunterZ and idspispopd commenting on the general state of affairs, so those came over too 😉 😜 🤣

Yeah, I have a tendency to do that, which is why I was hoping to get around to doing the splitting if nobody else did. Thanks for helping to assuage my guilt 😀

Reply 29 of 41, by Scali

mirh wrote:

I thought the standard was releasing the game and blacklisting the bad drivers.

Problem is, *all* drivers are bad when it comes to OpenGL. Which means you have lots of trouble even getting as far as a release-worthy game.

mirh wrote:

That's really not something I accept.
I spent 2 days "analyzing" that game, and I can tell you most of the problems are with textures (which require you to create a cache folder) and with the 64-bit version, which, even though it should be faster, had developer logging enabled by default.
For the remainder I had no problem.

There have been a number of patches to both the game itself, and to the various display drivers, before it worked properly.
The problem was that Rage used some extensions that had not been used before in production drivers. AMD needed no less than 3 hotfix releases before their drivers worked properly with the game.

mirh wrote:

I thought after reading your very nice article that the point had never been timing.

Not necessarily, but if you're not the first, you have to do it a lot better than whoever was first.

mirh wrote:

So there is an excuse for that? :\

Excuse for what? For OpenGL not integrating ES and mainstream OpenGL properly once the baseline of hardware allowed it? Nope, no excuse for that.

mirh wrote:

And why can DX claim to be the same but different (even though it's still a subset), while GL ES cannot?

I have a D3D11 engine, where the whole codebase can be used on regular Windows, WinRT and WP without any changes to code or content. The same texture formats and shader code are supported.
OpenGL ES does not offer any functionality to load textures from files/streams/whatever, so you need OS-specific code for that. Also, OpenGL ES has a slightly different dialect of GLSL, which means you can't compile the same shaders for regular OpenGL and ES without changes/hacks.
There's no excuse for that.
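
One common way around the dialect split (a sketch only, not necessarily what Scali's engine or anyone else does) is to keep a single shader body and prepend a per-dialect preamble, since desktop GLSL and GLSL ES want different #version lines and ES requires precision qualifiers:

    /* GL headers/loaders differ per platform; assume something like GLAD or
       <GLES3/gl3.h> has made the GL entry points available. */
    #include <GLES3/gl3.h>

    /* Compile one shared fragment-shader body for either desktop GL or GL ES by
       prepending the dialect-specific preamble. */
    static GLuint compile_fragment(const char *body, int targeting_es)
    {
        const GLchar *sources[2];
        sources[0] = targeting_es
            ? "#version 300 es\nprecision mediump float;\n"
            : "#version 330 core\n";
        sources[1] = body;

        GLuint shader = glCreateShader(GL_FRAGMENT_SHADER);
        glShaderSource(shader, 2, sources, NULL);   /* the strings are concatenated by the driver */
        glCompileShader(shader);
        return shader;                              /* real code would check GL_COMPILE_STATUS */
    }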

mirh wrote:

Also, OpenGL ES has always been the gold standard on mobile for exactly the same reasons DX became a thing on desktop.
It was the best at the time and it always held its ground.

It wasn't the best, it was the ONLY option. It never had competition until Apple introduced its Metal. D3D is not really a competitor because there's no mobile platform where you have the choice between both.

mirh wrote:

Then you said it's not the API itself that sucks, but the surrounding environment

Yes, the API itself would have worked, if it had a decent ecosystem to support it. As I discussed earlier (proper driver model, driver validation, good SDK, mature drivers etc).
But it didn't. I never said the API design was the problem.

mirh wrote:

Then again you repeat that it's the API itself that sucks and that people only care about Windows

No, I did not say the API itself sucks. Reading comprehension issues? Everything I'm saying is about the whole ecosystem, not the API.

mirh wrote:

Then I'd say the API looks good, and you'd tell me it's the environment. I'd tell you the environment has changed and you again would tell me it's the API that sucks.

No, I'm saying the environment has not changed. Therefore, the ecosystem is going to suck as much as the OpenGL one did, because it's the same ecosystem, just a new name.

mirh wrote:

Surprise: the power of open extensions.

Yes, extension hell, see above with Rage... One more reason why the OpenGL ecosystem sucks. No standardization, no validation.

mirh wrote:

And it's not like PS2 is your usual emulator, you know.

It's ancient technology, all fixed-point, and far FAR more primitive than today's GPUs. If you have problems emulating 2000 console technology on 2015 hardware, you're doing something horribly wrong.

mirh wrote:

And they are not surprised you understand nothing of the real problem (which was quite simplified there, actually)

I understand the problem perfectly fine: they're emulating a 2000 console on 2015 hardware.

mirh wrote:

And besides, 16-bit operations aren't equivalent to 8-bit ones due to different rounding (that's another of the PS2's peculiarities)

Look, why don't you stop pretending you know what you're talking about, okay (if all you know is '8-bit' and '16-bit', then apparently you have no clue about the differences between floating point and fixed point, the existence of other formats such as 2:10:10:10 ARGB, the fact that even PS1.x shaders will need 10 to 14 bit precision at least etc)?
I am an expert on graphics technology (it's why they pay me the big bucks), and my company has contributed to the DX12 development in the early access program (yes, DX12 has been around a lot longer than you've heard of it, and it's actually a finished product... where Vulkan was only *started* a few months ago, when AMD finally donated the Mantle documentation).
So stop being a jerk and insulting me by claiming I wouldn't even know how a PS2 works. I know that perfectly fine, and I could probably help the emulator developers to get their code working on D3D and make it perform well.

http://scalibq.wordpress.com/just-keeping-it- … ro-programming/

Reply 30 of 41, by Kreshna Aryaguna Nurzaman


Well, I'm not sure what to say regarding Direct3D vs OpenGL issues. In theory, it seems OpenGL is more 'chaotic' while Direct3D is more 'standardized', but in practice, my experience is the opposite. An example is Crimson Skies, which is a Direct3D game. It just doesn't work properly with modern Radeons --the player's aircraft turns into a black silhouette when AA is activated. On nVidia GPUs, CSAA causes the intro screen to freeze. Requiem: Avenging Angels, which is a Direct3D game, also refuses to run on DirectX 9 --it seems the game is only happy with DirectX 6.

On the other hand, I haven't found any old OpenGL games that refuse to run on modern GPUs. GLQuake, Call of Duty, American McGee's Alice, Kingpin: Life of Crime, Medal of Honor: Allied Assault, Return to Castle Wolfenstein; all just run fine.

Never thought this thread would be that long, but now, for something different.....
Kreshna Aryaguna Nurzaman.

Reply 31 of 41, by Jorpho

Kreshna Aryaguna Nurzaman wrote:

On the other hand, I haven't found any old OpenGL games that refuse to run on modern GPUs. GLQuake, Call of Duty, American McGee's Alice, Kingpin: Life of Crime, Medal of Honor: Allied Assault, Return to Castle Wolfenstein; all just run fine.

As I recall, Anachronox (using the Quake 2 engine) bombs pretty badly unless you do something to disable extensions – unless an exception has since been built into the drivers.

Reply 32 of 41, by Scali

Jorpho wrote:
Kreshna Aryaguna Nurzaman wrote:

On the other hand, I haven't found any old OpenGL games that refuse to run on modern GPUs. GLQuake, Call of Duty, American McGee's Alice, Kingpin: Life of Crime, Medal of Honor: Allied Assault, Return to Castle Wolfenstein; all just run fine.

As I recall, Anachronox (using the Quake 2 engine) bombs pretty badly unless you do something to disable extensions – unless an exception has since been built into the drivers.

Yes, that's the thing... There's only a handful of OpenGL games/engines out there, so drivers are often written to make sure these games work (having very little, if anything at all, to do with the OpenGL standard as it is specified). So it is sugarcoated.
But in the demoscene, it's way different. People write their own engines, and because it's such a niche, drivers are not usually tweaked to make demos run. So you get to see the raw, unedited state of OpenGL.
A lot of OpenGL demos only work on the brand/type of card they were developed for, and quite often they even break on that after some driver updates.

http://scalibq.wordpress.com/just-keeping-it- … ro-programming/

Reply 33 of 41, by mirh

Scali wrote:

Problem is, *all* drivers are bad when it comes to OpenGL. Which means you have lots of trouble even getting as far as a release-worthy game.

Afaik Nvidia is often regarded as being almost perfect.

Scali wrote:

There have been a number of patches to both the game itself, and to the various display drivers, before it worked properly.
The problem was that Rage used some extensions that had not been used before in production drivers. AMD needed no less than 3 hotfix releases before their drivers worked properly with the game.

Fair enough.
Though it's not like that's the only time similar disastrous results could be seen. OGL or not OGL.

Scali wrote:

All the openGL ES explanation and all

OK, you really have a point there.
Even so, it's funny how its incredible diffusion would still make it more sensible for a developer to just develop for it regardless.

Scali wrote:
mirh wrote:

Also, OpenGL ES has always been the gold standard on mobile for exactly the same reasons DX became a thing on desktop.
It was the best at the time and it always held its ground.

It wasn't the best, it was the ONLY option. It never had competition until Apple introduced its Metal. D3D is not really a competitor because there's no mobile platform where you have the choice between both.

Cmon.. I don't want to start a semantic debate now.
Obviously if you are the only one you can be both the worst and the best.
Though considering assembly is always technically an option, usually people stick with the latter definition in these cases.

And anyway, kudos to Apple for having developed its own proprietary API that-works-only-with-its-devices before everybody else.
Now, thanks, but I'd rather wait for DX12 or Vulkan on mobile phones before saying OGL ES has been surpassed.

Scali wrote:

Yes, the API itself would have worked, if it had a decent ecosystem to support it. As I discussed earlier (proper driver model, driver validation, good SDK, mature drivers etc).

Things we can say at the moment:

  • proper driver model: ✖️
  • driver validation: sort of. Basically, the problems are now all in developers' hands. And AFAIK they even have validation tools besides.
  • good SDK: ✖️
  • mature drivers: ✔️ (ie: none)

So that's your currently known ecosystem.

Scali wrote:
mirh wrote:

Surprise: the power of open extensions.

Yes, extension hell, see above with Rage... One more reason why the OpenGL ecosystem sucks. No standardization, no validation.

Except it worked flawlessly with nVidia, and it was just ATI being too lazy to ever check those.
Also, I didn't know RAGE required off-the-tree/unofficial extensions.

Scali wrote:
mirh wrote:

And besides, 16-bit operations aren't equivalent to 8-bit ones due to different rounding (that's another of the PS2's peculiarities)

Look, why don't you stop pretending you know what you're talking about, okay (if all you know is '8-bit' and '16-bit', then apparently you have no clue about the differences between floating point and fixed point, the existence of other formats such as 2:10:10:10 ARGB, the fact that even PS1.x shaders will need 10 to 14 bit precision at least etc)?

Look. Those weren't my words. I took them directly from the mouth of the main graphics dev there.
And after the tens of tomes written on the subject, I guess you won't believe it, but it's not stupid.
Yes, I know it's an argument ad hominem, but from what I can see, if it wasn't for him the project would be almost dead.

Scali wrote:

... yes, DX12 has been around a lot longer than you've heard of it, and it's actually a finished product... where Vulkan was only *started* a few months ago, when AMD finally donated the Mantle documentation

Little hint here: perhaps the Mantle documentation sped things up.

Scali wrote:

So stop being a jerk and insulting me by claiming I wouldn't even know how a PS2 works. I know that perfectly fine, and I could probably help the emulator developers to get their code working on D3D and make it perform well.

efnet.xs4all.nl:6667
#pcsx2dev

Kreshna Aryaguna Nurzaman wrote:

Well, I'm not sure what to say regarding Direct3D vs OpenGL issues. In theory, it seems OpenGL is more 'chaotic' while Direct3D is more 'standardized', but in practice, my experience is the opposite. An example is Crimson Skies, which is a Direct3D game. It just doesn't work properly with modern Radeons --the player's aircraft turns into a black silhouette when AA is activated. On nVidia GPUs, CSAA causes the intro screen to freeze. Requiem: Avenging Angels, which is a Direct3D game, also refuses to run on DirectX 9 --it seems the game is only happy with DirectX 6.

On the other hand, I haven't found any old OpenGL games that refuse to run on modern GPUs. GLQuake, Call of Duty, American McGee's Alice, Kingpin: Life of Crime, Medal of Honor: Allied Assault, Return to Castle Wolfenstein; all just run fine.

Lol.
I was working on a post regarding exactly those Crimson Skies issues.
And another about ATI OGL implementation problems (yes, problems with OpenGL, exactly those). Soon™.

Last edited by mirh on 2018-08-24, 01:49. Edited 1 time in total.

pcgamingwiki.com

Reply 34 of 41, by Scali

mirh wrote:

Afaik Nvidia is often regarded as being almost perfect.

No, nVidia is generally taken as the de-facto standard implementation, because they are generally the first to implement things. That does not imply that they actually follow specs best. Just that people accept nVidia as the standard, lacking any way to validate the implementation.
With D3D, Microsoft actually provides a reference rasterizer implementation. If you have any doubts on how something is supposed to work, you can always run your code against the reference rasterizer, and compare it to what the driver does. The reference rasterizer is always right, by definition (which also makes it much easier for IHV's to make sure their drivers do what they're supposed to).
With OpenGL, there is no such thing.
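
For reference, switching a D3D11 application onto that reference rasterizer is a one-argument change at device creation (a minimal sketch; the REF device ships with the SDK developer runtime, not with end-user systems):

    /* Create the reference rasterizer instead of a hardware device, so output
       can be compared against the spec-defined behaviour described above. */
    #include <d3d11.h>

    static HRESULT create_ref_device(ID3D11Device **dev, ID3D11DeviceContext **ctx)
    {
        return D3D11CreateDevice(NULL,                      /* default adapter              */
                                 D3D_DRIVER_TYPE_REFERENCE, /* software reference rasterizer */
                                 NULL, 0,                   /* no software module, no flags  */
                                 NULL, 0,                   /* default feature levels        */
                                 D3D11_SDK_VERSION,
                                 dev, NULL, ctx);
    }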

mirh wrote:

Fair enough.
Though it's not like that's the only time similar disastrous results could be seen. OGL or not OGL.

Yes, there are so many games coming out, and so many different drivers and configurations, that even in D3D games, bugs do occur at times.
But with OpenGL there are far more bugs and issues because, as I said, many things are just far less mature than with D3D.
In the case of Rage, literally nobody had used that particular extension in practice yet, because so few people develop for OpenGL.

mirh wrote:

Cmon.. I don't want to start a semantic debate now.
Obviously if you are the only one you can be both the worst and the best.

The semantics are important here. People use OpenGL ES because they have no choice. The question is, would they still use it if they had the choice?
On Windows, we know the answer to that: 99% of all games and other software are using D3D. Even software like 3dsmax, Inventor and Autocad use D3D by default now, while these were the classic OpenGL market.
Read this for example: http://forums.autodesk.com/autodesk/attachmen … olution_788.pdf

When we use OpenGL, we have found over the past many years (and still today) that we need to invest in a large, significant amount of QA that simply verifies that the OpenGL graphics driver supports the OpenGL API on the level that we use (which is actually rather dated, to be consistent with OpenGL GDI Generic, from circa 1997). In spite of the fact that we do not use any new fancy OpenGL extensions and use OpenGL almost on the level of 1997 graphics HW technology, we routinely encounter OpenGL graphics drivers that do not work correctly and as a result, we have our extensive OpenGL graphics HW certification process which involves a serious amount of testing effort on our part that does not actually test Inventor, it merely tests the OpenGL graphics driver. In fact, we currently have 44 (and counting) OpenGL "Workarounds" that we can invoke in our OpenGL graphics layer to "workaround" various OpenGL graphics driver problems.

The opposite situation exists for Direct3D. If a Direct3D graphics driver is Microsoft WHQL (Windows Hardware Quality Lab) certified, then it works correctly as far as Direct3D itself is concerned. This is the purpose of the WHQL testing and certification suite at Microsoft, to enforce compliance with the Direct3D specification. No such process exists in the graphics community for OpenGL and the quality of OpenGL graphics drivers suffers greatly as a result. With Direct3D, our QA team can focus on testing _our_ code and finding defects in _our_ graphics code, instead of having to spend all their time just verifying that the graphics HW vendors have done their job correctly to produce an OpenGL graphics driver that actually works.

I'm not making this up, that's how it is.

mirh wrote:

Except it worked flawlessly with nVidia, and it was just ATI being too lazy to ever check those.

It didn't work flawlessly with nVidia either. It was just not as completely unplayable as AMD.

mirh wrote:

Also, I didn't know RAGE required off-the-tree/unofficial extensions.

They were official, they just had never been used, because well, nobody cares about OpenGL. That's the lack of maturity I'm talking about. On paper, you can fool anyone into thinking OpenGL can do everything that D3D can. In practice however, a lot of stuff simply doesn't work properly on at least one vendor's drivers. Which means you can't really release software like that... unless you're a big name like Carmack. He can release anything, and he knows it's important enough to the IHV's to make it work.

http://scalibq.wordpress.com/just-keeping-it- … ro-programming/

Reply 35 of 41, by Kreshna Aryaguna Nurzaman

Jorpho wrote:
Kreshna Aryaguna Nurzaman wrote:

On the other hand, I haven't found any old OpenGL games that refuse to run on modern GPUs. GLQuake, Call of Duty, American McGee's Alice, Kingpin: Life of Crime, Medal of Honor: Allied Assault, Return to Castle Wolfenstein; all just run fine.

As I recall, Anachronox (using the Quake 2 engine) bombs pretty badly unless you do something to disable extensions – unless an exception has since been built into the drivers.

Hmm... I have yet to encounter such a problem, though.

Scali wrote:

Yes, that's the thing... There's only a handful of OpenGL games/engines out there, so drivers are often written to make sure these games work (having very little, if anything at all, to do with the OpenGL standard as it is specified). So it is sugarcoated.
But in the demoscene, it's way different. People write their own engines, and because it's such a niche, drivers are not usually tweaked to make demos run. So you get to see the raw, unedited state of OpenGL.
A lot of OpenGL demos only work on the brand/type of card they were developed for, and quite often they even break on that after some driver updates.

So at the end, does it even matter? In theory, Direct3D is more standardized, while OpenGL is more 'chaotic' where each vendor can have their own extension. But in practice, it's OpenGL games that are more 'unified' --running flawlessly regardless of vendors and models. Yes, it's probably because there's only a handful of OpenGL engines, while lots of OpenGL games actually run on the same basic engine over and over again. But if anything, it shows OpenGL game developers tend to 'play it safe' instead of supporting vendor-exclusive extensions that would break their games on unsupported vendors. Of course the demoscene is a different story, but I fail to see why it matters to games, since OpenGL game developers want to sell their games to the widest audience possible --unlike demoscene developers.

On the other hand, old Direct3D games tend to cause headaches on newer-generation GPUs, even though Direct3D is supposed to be a more unified API. Besides, according to the blog you've mentioned, Direct3D has evolved to be more 'OpenGL-like', where Microsoft had compromised with different vendors who wanted to insert their own vendor-specific features. Quoted from the blog:

Had I been working on Direct3D at the time I would have stridently resisted letting the 3D chip OEM’s lead Microsoft around by the nose chasing OpenGL features instead of focusing on enabling game developers and a consistent quality consumer experience. I would have opposed introducing shader support in favor of trying to keep the Direct3D driver layer as vertically integrated as possible to ensure feature conformity among hardware vendors.

The way I see it, both OpenGL and Direct3D have become a mess, even though the latter was originally designed to be a more standardized, more unified API.

However, in practice, it's OpenGL games that tend to run more flawlessly across different GPU models and different GPU generations. Probably not because OpenGL is a better standard, but because those games are basically using the same Quake engines again and again. But in the end, does it really matter? American McGee's Alice uses the Quake 3 Arena engine, yet its gameplay and its 'look and feel' are vastly different from Quake 3 Arena itself.

Now let's consider Crimson Skies --a Direct3D game-- where enabling AA on a modern Radeon turns the player's aircraft into a black silhouette, while enabling CSAA on a modern GeForce crashes the game menu. As a user, I frankly have more headaches with such Direct3D games than with typical OpenGL games --and I believe that's what matters to gamers. How many average gamers actually care about running demoscene productions on their computers anyway?

Never thought this thread would be that long, but now, for something different.....
Kreshna Aryaguna Nurzaman.

Reply 36 of 41, by Scali

Kreshna Aryaguna Nurzaman wrote:

So at the end, does it even matter? In theory, Direct3D is more standardized, while OpenGL is more 'chaotic' where each vendor can have their own extension. But in practice, it's OpenGL games that are more 'unified' --running flawlessly regardless of vendors and models.

The sample space of OpenGL games is so small that you really can't make any useful comparisons at all.
Anyway, read what Autodesk said about OpenGL above. They abandoned OpenGL in favour of D3D because OpenGL was a nightmare to support. And this is professional graphics software, the birthplace of OpenGL.

Kreshna Aryaguna Nurzaman wrote:

But if anything, it shows OpenGL game developers tend to 'play it safe' instead of supporting vendor-exclusive extensions that would break their games on unsupported vendors.

I think things like Rage and Doom 3 prove exactly the opposite: In order to make any kind of up-to-date game, you often NEED to use new extensions (they don't necessarily have to be vendor-exclusive), because OpenGL is generally behind the curve. Note that there is a huge difference between OpenGL on paper and OpenGL in the real world.
For example, OpenGL 4.5 has been specced out more than a year ago. AMD's drivers still do not implement it. So you cannot actually target OpenGL 4.5 yet in real-world applications, unless you want to be nVidia-only.

It only works because once a big game such as Rage is released, the vendors HAVE to make it work on their drivers, to save their reputation and sales.

There should be no such thing as "unsupported vendors", we're talking about a *standard* here.

Kreshna Aryaguna Nurzaman wrote:

Of course the demoscene is a different story, but I fail to see why it matters to games

That's because you're moving the goalposts from OpenGL as a generic programming environment to (big name) games only.

Kreshna Aryaguna Nurzaman wrote:

On the other hand, old Direct3D games tend to cause headaches on newer-generation GPUs

They do?
If you cherry-pick perhaps. There are so many D3D games out there, and I bet that 90% of them work fine. I know I have quite a large collection of games, and old stuff like Half-Life 2, Far Cry, Max Payne etc, they work just fine. I'd have to look hard through my collection to find something that doesn't work I guess.
My own DirectX code also still works fine even on Windows 10. And we're talking stuff that goes back even to very early versions.

Kreshna Aryaguna Nurzaman wrote:

Besides, according to the blog you've mentioned, Direct3D has evolved to be more 'OpenGL-like', where Microsoft had compromised with different vendors who wanted to insert their own vendor-specific features.

At the API level perhaps, but I'm talking about the ecosystem. We've already covered the differences.
Besides, that part is talking about OLD versions, around D3D6/7. Since then, D3D has moved away from OpenGL again, with OpenGL trying to add new D3D features as extensions, which eventually led to OpenGL being reinvented into Vulkan.

Kreshna Aryaguna Nurzaman wrote:

The way I see it, both OpenGL and Direct3D have become a mess, even though the latter was originally designed to be a more standardized, more unified API.

Then you're missing an obvious point:
Direct3D does not 'evolve' like OpenGL does.
OpenGL is one interface, that is extended for every version.
Direct3D uses the COM programming model, and introduces a NEW interface for every major version. This means that DX12 has literally NOTHING to do with DX11, etc.
So things don't become a mess because Microsoft doesn't keep bolting on new features to the same old interface.

Anyway, I'm talking as a developer, not as a gamer. You see, I have to *write* the software. Gamers only have to *use* it. You can only use the game once it's already been written, and the driver bugs have already been patched. In the end it may work.
However, if you have to *write* the game, and the drivers don't work, it's pretty much impossible. A proper standard should make sure that you can write software for that standard. OpenGL failed horribly in that respect. And I don't see a lot of changes in the ecosystem with Vulkan, so I don't think it will fare much better. It looks like they will make the same mistakes over and over again.

http://scalibq.wordpress.com/just-keeping-it- … ro-programming/

Reply 37 of 41, by mirh

Kreshna Aryaguna Nurzaman wrote:

So at the end, does it even matter? In theory, Direct3D is more standardized, while OpenGL is more 'chaotic' where each vendor can have their own extension. But in practice, it's OpenGL games that are more 'unified' --running flawlessly regardless of vendors and models. Yes, it's probably because there's only a handful of OpenGL engines, while lots of OpenGL games actually run on the same basic engine over and over again. But if anything, it shows OpenGL game developers tend to 'play it safe' instead of supporting vendor-exclusive extensions that would break their games on unsupported vendors. Of course the demoscene is a different story, but I fail to see why it matters to games, since OpenGL game developers want to sell their games to the widest audience possible --unlike demoscene developers.

No, please, this is just incoherent talking.
Older games are broken because the Z-buffer is treated in a different way on newer hardware, be it because it's more efficient this way or because it would be unsupported otherwise.
OpenGL games, on the other hand, all have a compatibility profile (aka atiogl.xml), and I remember I sometimes had to downgrade drivers or DLLs due to this (these exist even for some DX games, though).
Couldn't say for Nvidia.

Scali wrote:

I think things like Rage and Doom 3 prove exactly the opposite: In order to make any kind of up-to-date game, you often NEED to use new extensions (they don't necessarily have to be vendor-exclusive), because OpenGL is generally behind the curve.

You make it sound as if it were rocket science. It's just a driver matter.

Scali wrote:

For example, OpenGL 4.5 has been specced out more than a year ago. AMD's drivers still do not implement it. So you cannot actually target OpenGL 4.5 yet in real-world applications, unless you want to be nVidia-only.

AMD does support it, and most of the time you only require a couple of extensions (which, if everything goes well, could be supported even by 10-year-old hardware)

Scali wrote:

It only works because once a big game such as Rage is released, the vendors HAVE to make it work on their drivers, to save their reputation and sales.

Looking to the future ( ͡° ͜ʖ ͡°)

Last edited by mirh on 2017-06-28, 14:40. Edited 1 time in total.

pcgamingwiki.com

Reply 38 of 41, by Scali

mirh wrote:

You make it sound as if it were rocket science. It's just a driver matter.

Writing graphics drivers IS rocket science (why do you think there are so many bugs in them? Games/graphics drivers have to be some of the most buggy software out there). Especially a convoluted API like OpenGL... and no proper specs or reference.

mirh wrote:

AMD does support it, and most of the time you only require a couple of extensions (which, if everything goes well, could be supported even by 10-year-old hardware)

Uhhh, the screenshot in that thread clearly shows OpenGL 4.4, not 4.5.
Yes, they support some of the 4.5 functionality as extensions to 4.4. But not everything. Besides, the driver obviously doesn't report version 4.5 to the application, so it simply isn't going to work that way. You'd have to manually query for all extensions you need to use, and pray AMD supports them all. See how OpenGL is broken for development in practice?
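
A sketch of the manual fallback being described here (it assumes a GL 3.0+ context and that a loader has already set up the entry points; GL_ARB_direct_state_access is just one example of a 4.5-level feature):

    /* Check the reported context version first, then fall back to probing the
       extension list one entry at a time. */
    #include <string.h>
    #include <GL/glcorearb.h>

    static int has_extension(const char *name)
    {
        GLint i, count = 0;
        glGetIntegerv(GL_NUM_EXTENSIONS, &count);
        for (i = 0; i < count; i++)
            if (strcmp((const char *)glGetStringi(GL_EXTENSIONS, (GLuint)i), name) == 0)
                return 1;
        return 0;
    }

    static int can_use_dsa(void)
    {
        GLint major = 0, minor = 0;
        glGetIntegerv(GL_MAJOR_VERSION, &major);
        glGetIntegerv(GL_MINOR_VERSION, &minor);
        if (major > 4 || (major == 4 && minor >= 5))
            return 1;                                        /* core since GL 4.5 */
        return has_extension("GL_ARB_direct_state_access");  /* otherwise hope the extension is exposed */
    }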

mirh wrote:

Looking to the future ( ͡° ͜ʖ ͡°)

They make *Rage* work, not OpenGL. These hotfix drivers do just enough to make the specific game work, but that is no guarantee that the entire driver actually works according to the full spec. It just works well enough to run Rage.

http://scalibq.wordpress.com/just-keeping-it- … ro-programming/

Reply 39 of 41, by HunterZ


What I want to know is: if Direct3D is such a panacea of standardization, why do I keep seeing either nVidia or AMD pay developers to make their cutting-edge Direct3D game run like crap on the other company's hardware? If Direct3D is that much better than OpenGL, then it seems that newer features should be well understood and supported by the hardware vendors by the time the software companies get around to using them in big-budget titles.

I am curious to see what Vulkan brings to the table. Just because it's being made by the same players that produced OpenGL doesn't mean they won't take lessons learned from all existing APIs. The fact is that even if it were technically feasible/sensible, Microsoft would legally block any attempt by the hardware vendors to implement a literal Direct3D API in Linux (and even if they didn't, they'd just screw with the API in cat-and-mouse fashion to intentionally break Linux support), so we're stuck with OpenGL as the common interface to GPUs until something like Vulkan comes along (I'm discounting Mantle because vendor-specific APIs should be killed with fire, with Direct3D being a lesser example because it is always going to be Microsoft-only thanks to Microsoft, but at least it's neutral ground among the hardware vendors).