VOGONS


Video cards and 2d acceleration


First post, by 386SX

Rank: l33t

Hi all,

what do you think were the best video cards of the PCI/AGP years that actually sped up Windows 9x/Me/.. the most in typical 2D 'windows' usage? I'm not only talking about RAMDAC speed and VGA analog converter quality, but also about the acceleration itself that the OS would benefit from.

I really had a great impression using the Matrox G200/G400, but only compared to the older cards.

When do you think these advantages were not visible anymore?

Thanks

Reply 1 of 21, by Scali

Rank: l33t

386SX wrote:

When do you think these advantages were not visible anymore?

I'd say in the Vista era.
Namely, because of the new driver model, the whole GDI rendering path had to be redone as well, and the first version of WDDM, used by Vista, did not support GDI acceleration features. So everything was 100% software.
Nobody even noticed.
In Windows 7, GDI acceleration was reintroduced under the new driver model. Again, nobody noticed.
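
To make that concrete, here's a minimal Win32/GDI sketch (sizes arbitrary, just my own illustration) of the kind of call this affects: under XP's XPDM a blit like this could be handed to the card's 2D engine, while under Vista's WDDM 1.0 the same call was rasterized by the CPU into the window's surface.

```cpp
// Minimal GDI blit sketch (illustrative; the 256x256 size is arbitrary).
#include <windows.h>

int main()
{
    HDC screen = GetDC(NULL);                        // DC for the whole screen
    HDC mem    = CreateCompatibleDC(screen);         // offscreen (memory) DC
    HBITMAP bmp = CreateCompatibleBitmap(screen, 256, 256);
    HGDIOBJ old = SelectObject(mem, bmp);

    // Fill the offscreen bitmap, then blit it to the screen. Under XP's
    // driver model the blit could go to the 2D engine; under Vista's
    // WDDM 1.0 it was drawn by the CPU.
    PatBlt(mem, 0, 0, 256, 256, WHITENESS);
    BitBlt(screen, 0, 0, 256, 256, mem, 0, 0, SRCCOPY);

    SelectObject(mem, old);
    DeleteObject(bmp);
    DeleteDC(mem);
    ReleaseDC(NULL, screen);
    return 0;
}
```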

But yes, I think the Matrox cards would probably be the best Windows accelerators ever made.

http://scalibq.wordpress.com/just-keeping-it- … ro-programming/

Reply 2 of 21, by 386SX

Rank: l33t

I remember reading somewhere about the lack of GDI acceleration on newer OSes, but seeing how smooth Windows has become in its latest versions, it's impossible to think it's not using some acceleration. 😉
The problem is that the current notion of "smoothness" is probably overused and pointless, and pushes the CPU/GPU hard in that quest. Like a game that "must" run at 200 fps when you would not see the difference.

Reply 3 of 21, by swaaye

Rank: l33t++

Tom's Hardware did some testing on 2D acceleration with recent cards a few years ago. This was actually instigated by the noticeably low performance of some Radeon 5000 cards. Starting with GeForce 8 and Radeon HD, GUI acceleration is not performed with dedicated 2D hardware but instead with the 3D shader ALUs, and apparently ATI had been slacking on their implementation.

http://www.tomshardware.com/reviews/2d-windows-gdi,2547.html

G400-era hardware is missing some GDI functions and is certainly not the fastest. The first Radeon and GeForce cards are probably already faster.

GUI acceleration from Vista onward is mysterious, but it is there, as the results of that article show. I doubt it is actually worse than XP, considering GDI most definitely has problems that WDDM from Vista onward does not have.

Reply 4 of 21, by Scali

Rank: l33t
386SX wrote:

I remember reading somewhere about the lack of GDI acceleration on newer OSes, but seeing how smooth Windows has become in its latest versions, it's impossible to think it's not using some acceleration. 😉

Vista and newer render differently.
Aero runs on top of Direct3D 9. Each application window is basically a texture, and the 3D hardware (z-buffering) solves the clipping for overlapping windows.
Old versions of Windows solved the z-ordering by redrawing overlapping items, using rectangles to mask out the overlapping areas. (That is why you got that weird effect when an application was not responding and you dragged other windows over it: the uncovered areas were marked for redrawing, but the application never responded to the repaint messages, so they just showed whatever was last drawn there.) This was a lot more drawing-intensive. But when this system was developed, video memory was very scarce, so double-buffering was expensive anyway, and masked blitting hardware, let alone z-buffering, was not available yet.

But Vista renders all the GDI content inside these textures with the CPU, instead of using hardware functions.
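
For illustration, here's a minimal sketch of that classic repaint cycle (a plain Win32 window; the class name and window size are arbitrary): uncovering part of a window invalidates that region, Windows posts WM_PAINT, and BeginPaint hands the application only the invalid rectangle. A hung application never reaches this handler, which is exactly why the uncovered area keeps showing stale pixels.

```cpp
#include <windows.h>

LRESULT CALLBACK WndProc(HWND hwnd, UINT msg, WPARAM wParam, LPARAM lParam)
{
    switch (msg)
    {
    case WM_PAINT:
    {
        PAINTSTRUCT ps;
        HDC dc = BeginPaint(hwnd, &ps);   // ps.rcPaint = the invalid region
        FillRect(dc, &ps.rcPaint, (HBRUSH)(COLOR_WINDOW + 1));
        EndPaint(hwnd, &ps);              // marks the region valid again
        return 0;
    }
    case WM_DESTROY:
        PostQuitMessage(0);
        return 0;
    }
    return DefWindowProc(hwnd, msg, wParam, lParam);
}

int WINAPI WinMain(HINSTANCE inst, HINSTANCE, LPSTR, int show)
{
    WNDCLASS wc = {};
    wc.lpfnWndProc   = WndProc;
    wc.hInstance     = inst;
    wc.hCursor       = LoadCursor(NULL, IDC_ARROW);
    wc.lpszClassName = TEXT("RepaintDemo");
    RegisterClass(&wc);

    HWND hwnd = CreateWindow(TEXT("RepaintDemo"), TEXT("GDI repaint demo"),
                             WS_OVERLAPPEDWINDOW, CW_USEDEFAULT, CW_USEDEFAULT,
                             640, 480, NULL, NULL, inst, NULL);
    ShowWindow(hwnd, show);

    MSG msg;
    while (GetMessage(&msg, NULL, 0, 0) > 0)
        DispatchMessage(&msg);
    return 0;
}
```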

http://scalibq.wordpress.com/just-keeping-it- … ro-programming/

Reply 5 of 21, by swaaye

Rank: l33t++

The 3D aspects and apparent lack of GDI acceleration of Vista make me wonder exactly what was being measured and causing bottlenecks in that Tom's article. You see hardware like GMA 950 and GF7 IGP keeping up with monster cards in some cases.

It would be interesting to see this testing done on Win8.

Reply 6 of 21, by Scali

Rank: l33t
swaaye wrote:

The 3D aspects and apparent lack of GDI acceleration of Vista make me wonder exactly what was being measured and causing bottlenecks in that Tom's article. You see hardware like GMA 950 and GF7 IGP keeping up with monster cards in some cases.

Part of the problem is that 2D isn't meant for performance.
High-end cards will clock themselves down in 'desktop' mode to save power and keep the noise down.
So indeed, what are you measuring? Basically all you're measuring is the performance that the driver-developers considered 'good enough' for regular desktop usage, to keep the GPU power usage down as far as possible.
If you artificially lock your GPU on max performance, you will probably get much better benchmark results.
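
As a rough illustration of the point (my own sketch, not the article's actual methodology; the bitmap size and iteration count are arbitrary), a naive GDI benchmark is little more than timing a burst of blits, so what it reports depends entirely on which power state the driver picks for 'desktop' work:

```cpp
#include <windows.h>
#include <stdio.h>

int main()
{
    const int W = 1024, H = 768, N = 1000;    // arbitrary size / iteration count

    HDC screen = GetDC(NULL);
    HDC src = CreateCompatibleDC(screen);
    HDC dst = CreateCompatibleDC(screen);
    HBITMAP srcBmp = CreateCompatibleBitmap(screen, W, H);
    HBITMAP dstBmp = CreateCompatibleBitmap(screen, W, H);
    SelectObject(src, srcBmp);
    SelectObject(dst, dstBmp);

    LARGE_INTEGER freq, t0, t1;
    QueryPerformanceFrequency(&freq);
    QueryPerformanceCounter(&t0);
    for (int i = 0; i < N; ++i)
        BitBlt(dst, 0, 0, W, H, src, 0, 0, SRCCOPY);
    GdiFlush();                                // flush any batched GDI calls
    QueryPerformanceCounter(&t1);

    double secs = (double)(t1.QuadPart - t0.QuadPart) / (double)freq.QuadPart;
    printf("%d blits of %dx%d: %.3f s (%.1f blits/s)\n", N, W, H, secs, N / secs);

    DeleteDC(src);
    DeleteDC(dst);
    DeleteObject(srcBmp);
    DeleteObject(dstBmp);
    ReleaseDC(NULL, screen);
    return 0;
}
```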

http://scalibq.wordpress.com/just-keeping-it- … ro-programming/

Reply 7 of 21, by 386SX

Rank: l33t

Interesting facts I didn't know, thanks! Considering how many effects are "expected" by people in the latest OSes (smartphone OSes too), I always think the older things were better. The amount of useless effects that requires an octa-core with a desktop-class GPU in a phone makes me want to go back to DOS.

Reply 8 of 21, by smeezekitty

Rank: Oldbie
386SX wrote:

Interesting facts I didn't know, thanks! Considering how many effects are "expected" by people in the latest OSes (smartphone OSes too), I always think the older things were better. The amount of useless effects that requires an octa-core with a desktop-class GPU in a phone makes me want to go back to DOS.

Well a lot of things ARE regressing. That said, I don't know if using the 3D core is necessarily a bad idea.
Redrawing at every window switch is a fairly ugly affair in older OSes.

Although I am quite sure that Windows Vista/7 also supports full redraws in software (since it supports a standard VESA driver with no 3D).

Reply 9 of 21, by Scali

Rank: l33t
smeezekitty wrote:

Well a lot of things ARE regressing. That said, I don't know if using the 3D core is necessarily a bad idea.
Redrawing at every window switch is a fairly ugly affair in older OSes.

If you have a GPU and enough video memory, it's more efficient to use it.

http://scalibq.wordpress.com/just-keeping-it- … ro-programming/

Reply 10 of 21, by swaaye

Rank: l33t++

Phones and such have gone back to dedicated 2D hardware to save power. Dedicated hardware does it most power efficiently. Sometimes they call this hardware a CGPU.

And Windows 8 toned down the visual sparklies. I like that a lot. They had to lighten things up so it could run on gimpy tablets and not wolf down the battery by powering up the GPU excessively. I've seen a lot of whining from people who apparently miss Aero though. Can't please everyone.

Reply 11 of 21, by smeezekitty

Rank: Oldbie
swaaye wrote:

And Windows 8 toned down the visual sparklies. I like that a lot. They had to lighten things up so it could run on gimpy tablets and not wolf down the battery by powering up the GPU excessively. I've seen a lot of whining from people who apparently miss Aero though. Can't please everyone.

It is honestly one of the ugliest things I have ever seen. FLAT looks like Windows 2.0/3.1.
At least W10 brings back the Start menu, but I still prefer Vista/7.

You actually CAN please almost everyone in UI design by providing lots of user options and customization features.

Reply 12 of 21, by alexanrs

Rank: l33t
smeezekitty wrote:

It is honestly one of the ugliest things I have ever seen. FLAT looks like Windows 2.0/3.1.
At least W10 brings back the Start menu, but I still prefer Vista/7.

You actually CAN please almost everyone in UI design by providing lots of user options and customization features.

I actually like the flat design... Windows 8/8.1 are the first ones since Windows XP (god... I hate Luna) that I can use with the default theme and not be bothered. Aero was too overdone. Then again, I'm a Windows Phone user, and I love squares xD. But I agree, the more choice you give, the more people you please.

The interesting thing about Windows 8 is that, while it toned down the effects, the effects it does have seem to work just fine, and quite smoothly, even with the generic SVGA driver.

Reply 14 of 21, by Scali

Rank: l33t
mr_bigmouth_502 wrote:

Wait, Windoze didn't have hardware compositing until 7? Lol. 🤣

They had hardware compositing in Vista, just not hardware-accelerated GDI-rendering.
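
You can even check the two things separately at runtime: DwmIsCompositionEnabled() from dwmapi.h (Vista and later) reports whether the Direct3D compositor is active, but says nothing about how the GDI content inside each window texture is rendered. A minimal sketch (link with dwmapi.lib):

```cpp
#include <windows.h>
#include <dwmapi.h>
#include <stdio.h>

#pragma comment(lib, "dwmapi.lib")

int main()
{
    BOOL composited = FALSE;
    // Note: from Windows 8 onward the compositor is always on, so this
    // always reports TRUE there; on Vista/7 it reflects the Aero setting.
    if (SUCCEEDED(DwmIsCompositionEnabled(&composited)))
        printf("DWM composition: %s\n", composited ? "on" : "off");
    else
        printf("DWM not available (pre-Vista Windows?)\n");
    return 0;
}
```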

http://scalibq.wordpress.com/just-keeping-it- … ro-programming/

Reply 15 of 21, by obobskivich

Rank: l33t

From a purely observational perspective, R300 seems to perform a bit better than NV30 or NV40. None are "bad" but there is a noticeable, albeit subtle, difference when they're compared side-by-side. In terms of analog output quality, anything much older runs the risk of bad filter implementation, which can have a substantial impact on image quality. I haven't noticed any significant improvement beyond those with R500, R700, etc, so the "around Vista" timeframe seems pretty reasonable. 😀

Scali wrote:

High-end cards will clock themselves down in 'desktop' mode to save power and keep the noise down.
So indeed, what are you measuring? Basically all you're measuring is the performance that the driver-developers considered 'good enough' for regular desktop usage, to keep the GPU power usage down as far as possible.

This is highly variable across different generations of hardware - modern cards (e.g. Kepler, Maxwell, GCN) don't really have a "desktop" vs "gaming" mode dichotomy, instead they dynamically adjust their clocks and resources in response to load, much like modern CPUs. Of course earlier cards with power management features, like GeForce 7, aren't as sophisticated. IME I have not noticed any performance differences between different clock tiers on earlier GeForce cards (FX, 6, 7), even running Aero Glass in Vista, and would agree with your assessment that "Vista-ish era" pretty much equalized everything. That doesn't mean a benchmark couldn't "see" such differences if they exist, but I'm certainly not noticing them in daily usage.

Reply 16 of 21, by 386SX

Rank: l33t

I had the same feeling with the R300-based cards. It's difficult to see differences between such advanced cards, even less so when they are connected through DVI. But I could surely see big differences in the late '90s, when going from an S3 Virge to a Mystique or a Riva128 meant huge differences in quality and speed. Some days ago I retested the G400, and its 2D quality, even more than its speed, really is something. The saturation of its default colors is similar to the Vibrance option on GeForce2 cards.

Reply 17 of 21, by swaaye

Rank: l33t++

R300 cards got complaints about VGA interference patterns. I remember seeing it myself. Sometimes you could see faint scrolling lines. R300 boards also tended to die, or needed to be underclocked to remain stable, after years of use.

We had a thread a few months ago about VGA interference on various cards.

On the topic of overall VGA signal quality... it varies a lot, but there are many excellent cards out there. Even early NVIDIA cards can be great, but they were at the mercy of their cost-cutting board vendors.

Reply 18 of 21, by smeezekitty

Rank: Oldbie
386SX wrote:

I had the same feeling with the R300-based cards. It's difficult to see differences between such advanced cards, even less so when they are connected through DVI. But I could surely see big differences in the late '90s, when going from an S3 Virge to a Mystique or a Riva128 meant huge differences in quality and speed. Some days ago I retested the G400, and its 2D quality, even more than its speed, really is something. The saturation of its default colors is similar to the Vibrance option on GeForce2 cards.

The Mystique really wasn't that great as Matrox cards go. It had good VGA quality but limited acceleration capabilities.

Reply 19 of 21, by obobskivich

Rank: l33t
swaaye wrote:

R300 cards got complaints about VGA interference patterns. I remember seeing it myself. Sometimes you could see faint scrolling lines. R300 boards also tended to die, or needed to be underclocked to remain stable, after years of use.

We had a thread a few months ago about VGA interference on various cards.

On the topic of overall VGA signal quality... it varies a lot, but there are many excellent cards out there. Even early NVIDIA cards can be great, but they were at the mercy of their cost-cutting board vendors.

Oh yeah; I didn't mean to say R300 was the best ever. My 9600 and 9700 both died prematurely, the 9700 had interference issues and artefacts near the end, etc. GeForce FX seems much more "stable," especially long-term, even if it's slightly slower in 2D when compared side-by-side.

And yeah, VGA quality is highly variable on the early nV cards. ATi apparently didn't have this issue early on because they OEM'd their own hardware and didn't go cheap-as-possible on the filter (and I've always wondered whether this is the case for the later 3dfx-branded boards). I vaguely remember reading something to the effect that nVidia finally put their foot down with GeForce FX after years of bad press from GeForce 2/3/4 cards having dodgy VGA outputs, and indeed I've never seen an FX with junk VGA (again, it may not measure as perfectly as something like Parhelia, Quadro FX, Wildcat, etc., but subjectively they all pass muster).

DVI signal quality is also worth mentioning when talking about older cards - even GeForce FX and Radeon 9 cards can fail compliance there. At lower resolutions this usually doesn't matter, but at higher resolutions (near the bandwidth limit), and especially over long cable runs, there can be problems. Ideally, if you're getting a card from this era, it should use an external TMDS transmitter (the built-in one on the GeForce FX and others is the source of problems), and a quality one at that (e.g. Sil164, 172, etc). I've probably posted this before, but here's an article with measurement examples: http://www.extremetech.com/electronics/55254- … liance-shootout