First post, by squiggly
Man...this is going to cause me to rethink a whole bunch of shit.
wrote:amazing!
shocking!
Sh*t, that has to be a tremendous amount of work!
I'm pressing F for respect! 🤣
Socket 775 - ASRock 4CoreDual-VSTA, Pentium E6500K, 4GB RAM, Radeon 9800XT, ESS Solo-1, Win 98/XP
Socket A - Chaintech CT-7AIA, AMD Athlon XP 2400+, 1GB RAM, Radeon 9600XT, ESS ES1869F, Win 98
Sure, there's a bombshell in there, I guess. The effect of increasing CPU overhead, etc.
A great comparison for showcasing and measuring the performance aspect. The rest of the case for picking a driver comes down to image quality ('speed optimizations') and various bugs in games.
It's nice to see these results in comparative charts like this. I reported similar findings when trying to find the fastest driver with a POD100 + GF2MX and with a K6-III + GF2/GF4/FX. There are so many cards and system configurations that I recommend trying every driver version for your specific combination of CPU, OS, graphics card, and desired game resolution.
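If you do end up trying every driver version, tabulating the results is easy to script. Here's a minimal sketch; the driver versions and FPS numbers below are made-up placeholders, not real measurements:

```python
# Rank driver versions by average FPS across a set of games.
# All numbers are placeholders - substitute your own benchmark results.
results = {
    "6.31":  {"Quake2": 92.0, "Unreal": 58.0, "3DMark99": 70.0},
    "12.41": {"Quake2": 90.5, "Unreal": 57.0, "3DMark99": 69.0},
    "45.23": {"Quake2": 84.0, "Unreal": 51.5, "3DMark99": 63.0},
}

def average_fps(games):
    """Mean FPS over all benchmarked games for one driver."""
    return sum(games.values()) / len(games)

# Sort drivers from fastest to slowest by average FPS.
ranked = sorted(results.items(), key=lambda kv: average_fps(kv[1]), reverse=True)
for driver, games in ranked:
    print(f"{driver:>6}: {average_fps(games):.1f} avg FPS")
```

With placeholder data like the above, the oldest driver comes out on top, matching the trend discussed in the thread.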
One of the more consistent trends from Phil's video seems to be that the oldest supported driver is the fastest, although there are some exceptions. Often the next one or two driver versions are equally fast (sometimes faster) before performance drops off with increasing driver revision. Aside from bug fixes, I can't help but wonder whether there are any specific features a user might be missing out on by sticking with the older, faster drivers?
Plan your life wisely, you'll be dead before you know it.
It just confirms the general consensus that the first drivers are fastest in most cases, since no performance-costing bug fixes have been applied yet. Later optimizations and compatibility tweaks don't really compensate for all the performance lost.
This is well known for... at least 15 years now?
Crank up the AA/AF filtering and you start to get some return on your GPU power with the newer chips. Maxed-out resolutions and anti-aliasing remain the main reason to pair a faster GPU with a heavily CPU-bottlenecked system.
Very nice... I need to get me an open bench test case.
wrote:This is well known for... at least 15 years now?
That a gf4ti is slower than a gf2gts with same drivers? The bombshell is not the driver difference, but the cards.
wrote:wrote:This is well known for... at least 15 years now?
That a gf4ti is slower than a gf2gts with same drivers? The bombshell is not the driver difference, but the cards.
As far as I can see, his benchmarks only include old DX6/DX7 games on a heavily CPU-bottlenecked system (still more than overkill for these DX levels). What did you expect? A GF4 stomping its competition on a system like that?
The minor top-speed differences between the GPUs sound unusual and may be the result of inaccurate measurement or system-related factors. Even if confirmed, it doesn't mean this guy has revealed a sensational discovery: later GPUs were heavily modified and optimized for their respective DX level, probably at the cost of losing a few percent on older DX levels, which seems negligible since the contemporary systems those cards were targeted at already had more than enough performance.
The gf4/fx were substantially slower than a gf2 in several benchmarks, I didn't really expect that, no.
I like these old VGA charts from Tom's Hardware... they also compare an Athlon 1000 vs. an Athlon XP 2700+ with the same VGA cards.
http://www.tomshardware.com/reviews/vga-charts-ii,579-7.html
Requests are also possible... /msg kixs
I am not surprised but that said there is a lot of performance that is lost all the way up to the FX series and it shows that Nvidia's drivers are shit. Doesn't help that in all of that there were performance nerfs to drive sales of newer cards much like how they are doing now.
On a far away planet reading your posts in the year 10,191.
wrote:I am not surprised but that said there is a lot of performance that is lost all the way up to the FX series and it shows that Nvidia's drivers are shit. Doesn't help that in all of that there were performance nerfs to drive sales of newer cards much like how they are doing now.
It is probably not that there were intentional driver nerfs. They just didn't test the older cards nearly as much as they did the newer cards with newer drivers.
The newer cards were meant for higher DX levels as well, so they weren't really messing with optimizing for older DX versions at that point.
Take a look at the OpenGL results. That is much more what I would expect when comparing the different cards.
The system used for benching was also very CPU limited with the newer cards.
A more accurate test would be benching using something like a PIII-1000 or even faster.
Take the CPU and the RAM throughput out of the equation and then you will see the real performance difference between the video cards.
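That CPU-limit argument can be expressed with a simple toy model: the observed frame rate is roughly the minimum of what the CPU and the GPU can each deliver. A back-of-the-envelope sketch (the FPS figures are invented for illustration, and the model ignores driver overhead and RAM throughput):

```python
# Toy bottleneck model: observed FPS ~= min(CPU-limited FPS, GPU-limited FPS).
def observed_fps(cpu_fps, gpu_fps):
    """Frame rate is capped by whichever component is slower."""
    return min(cpu_fps, gpu_fps)

cpu_fps = 75.0  # illustrative: frames per second the CPU can prepare
cards = {"GF2 GTS": 90.0, "GF4 Ti": 160.0, "FX": 200.0}  # illustrative GPU limits

for card, gpu_fps in cards.items():
    print(f"{card}: {observed_fps(cpu_fps, gpu_fps):.0f} FPS")
```

With a 75 FPS CPU ceiling, all three cards land on the same number, so the faster GPUs can't show any advantage until the CPU limit is lifted, which is why a faster test CPU would separate the cards.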
You are most likely right. Later drivers most likely expect a faster CPU to be paired with them, and the systems used to test those drivers probably had faster CPUs as well. It might not be possible, but it would be nice to see GPU and CPU load percentages during a test like this.
Comments on the YT video say he tried much faster CPUs and got the same result. I wish he had tested with a 3 GHz P4, though, just to be sure.
Also, I would have wanted driver version 82.69 to be tested; I have read that many bugs were fixed in that version, and it would have been nice to see whether the upward performance trend continued after the last driver version Phil tested.
A company known for hellishly short release cycles under fierce competition wouldn't sit around optimizing for obsolete titles, nor prioritizing regressions in them, so you'd expect a notable decline in driver quality relative to a static reference.
wrote:It is probably not that there were intentional driver nerfs. They just didn't test the older cards nearly as much as they did th […]
wrote:I am not surprised but that said there is a lot of performance that is lost all the way up to the FX series and it shows that Nvidia's drivers are shit. Doesn't help that in all of that there were performance nerfs to drive sales of newer cards much like how they are doing now.
It is probably not that there were intentional driver nerfs. They just didn't test the older cards nearly as much as they did the newer cards with newer drivers.
The newer cards were meant for higher DX levels as well, so they weren't really messing with optimizing for older DX versions at that point.
Take a look at the OpenGL results. That is much more what I would expect when comparing the different cards.
The system used for benching was also very CPU limited with the newer cards.
A more accurate test would be benching using something like a PIII-1000 or even faster.
Take the CPU and the RAM throughput out of the equation and then you will see the real performance difference between the video cards.
I agree that the CPU he used is a bottleneck, but when that happens, performance should level off, with most of the cards performing the same, not necessarily worse. However, a lot of us tend to forget (myself included) that Nvidia likes to make up for their cards' weaknesses by forcing the CPU to compensate.
On a far away planet reading your posts in the year 10,191.