VOGONS


"Fake AGP" slots?


Reply 20 of 43, by 2mg

The Serpent Rider wrote:

That actually proves my point:

8800GTS 320MB (not enough VRAM) = bad performance.
X1900XTX 512MB (enough VRAM) = good performance.

So, if you're low on VRAM, more transfers happen over a slow PCIe x4 link and it chokes the FPS?

I mean, the GTX680 isn't really low on VRAM, even running those modern games, yet except for BF, every PCIe x4 graph shows a 25-40% drop in FPS, which can mean the difference between a smooth 60FPS experience and an unstable 50FPS.
And even the X1900XTX loses in SPECviewperf and most of the other benchmarks in Tom's article.

Basically, PCIe x4 will work "fine" IF your VRAM and the game's needs align, which seems inconsistent at best?
Please correct me if I'm wrong on this, I'm really curious.

Just read VIA chipset specifications. My opinion can't change them.

I know, it's the ASRock manuals that trigger me.
ASUS has the same AGP+PCIe setup and doesn't mention GPUs in the manual at all, so I'm kinda wary of ASRock's shenanigans/misleading claims...
I could swear (my head is full now) I saw a PT880 board that had "AGP 3 8x, PCI Extreme" and "PCI Express Lite" written on it, like what the hell?

I see that the ULi M1695/M1567 and PT880 Ultra/Pro (only the VT8237R+ fixes the SATA issues AFAIK) are the only proper AGP+PCIe implementations (the PT880 has x4 PCIe, ULi can do two PCIe x8 links too), but from what I see of boards from that era, it's like the Wild West, hence my panicking about what is AGP and what's "fake".

Perfectly suitable performance - 100fps+. And that's with cards which are much more powerful than what can realistically be used on a VIA motherboard, with a much weaker CPU.

Again, if you're hovering around 60FPS, those kinds of % drops in FPS could mean sub-60FPS performance.
I mean, it also proves my point - the 8800 falls apart at PCIe x4 while it would've rocked at x16, and that's with two tested games that are old-ish for those GPUs.

Reply 21 of 43, by timsdf

2mg wrote on 2022-12-04, 09:10:
The Serpent Rider wrote:

That actually proves my point:

8800GTS 320MB (not enough VRAM) = bad performance.
X1900XTX 512MB (enough VRAM) = good performance.

So, if you're low on VRAM, more transfers happen over a slow PCIe x4 link and it chokes the FPS?

Again, if you're hovering around 60FPS, those kinds of % drops in FPS could mean sub-60FPS performance.
I mean, it also proves my point - the 8800 falls apart at PCIe x4 while it would've rocked at x16, and that's with two tested games that are old-ish for those GPUs.

I would not stress over this in 2022. 640MB variants are common, and someone could test the differences between them. I sometimes use a G92 8800GTS 512MB on a 775Dual-VSTA; from what I have gathered it has ~15% lower performance for some reason (a 3DMark06 score of 10k vs 12k on similar PCIe x16 systems), but it works fine for what I use it for.

The main reason a lot of people use the 775Dual-VSTA and 4CoreDual-VSTA is that they are cheap, plentiful and mostly free of the capacitor plague. The older 775Dual-880Pro model has more capacitor issues. Running a P4 probably doesn't help either, compared to the TDP of later LGA 775 Pentiums.

Reply 22 of 43, by 2mg

timsdf wrote on 2022-12-04, 12:12:

I would not stress over this in 2022. 640MB variants are common, and someone could test the differences between them. I sometimes use a G92 8800GTS 512MB on a 775Dual-VSTA; from what I have gathered it has ~15% lower performance for some reason (a 3DMark06 score of 10k vs 12k on similar PCIe x16 systems), but it works fine for what I use it for.

The main reason a lot of people use the 775Dual-VSTA and 4CoreDual-VSTA is that they are cheap, plentiful and mostly free of the capacitor plague. The older 775Dual-880Pro model has more capacitor issues. Running a P4 probably doesn't help either, compared to the TDP of later LGA 775 Pentiums.

Yeah, but that's what I don't fully understand: these are SATA/USB/PCIe systems with modern features compared to P3 or socket 478 P4 systems.
Yet most of them do not have drivers for W98 (or any non-NT OS).
So it seems you are at the mercy of your final build either deciding to play along or breaking stuff completely - in W98, that is.

Otherwise, running these mobos in WXP seems pointless; you can go straight to the final 775 stuff with Core2 Quad/Extreme support, DDR3 and PCIe 2.0 x16, and you only lose the AGP port, which was anyway being filled with GPUs that were better off on PCIe (both for bandwidth and to avoid AGP bridges), right?
Heck, XP officially works on Sandy/Ivy Bridge and a GTX960 (or more if you "unofficial" it).

640MB variants are common

Yeah, the insane prices are also common... I kinda don't wanna put something newer than the GF7000 series in the PCIe slot of a 775Dual-880Pro, since it doesn't do Core 2 CPUs, meaning a Presler/Cedar Mill tops it out, aka "478 v2.0" performance levels; they'd probably bottleneck an 8800, which would already be gimped by PCIe x4...

Reply 23 of 43, by timsdf

2mg wrote on 2022-12-04, 12:56:

Yet most of them do not have drivers for W98 (or any non-NT OS).
So it seems you are at the mercy of your final build either deciding to play along or breaking stuff completely - in W98, that is.

Otherwise, running these mobos in WXP seems pointless; you can go straight to the final 775 stuff with Core2 Quad/Extreme support, DDR3 and PCIe 2.0 x16, and you only lose the AGP port, which was anyway being filled with GPUs that were better off on PCIe (both for bandwidth and to avoid AGP bridges), right?
Heck, XP officially works on Sandy/Ivy Bridge and a GTX960 (or more if you "unofficial" it).

Yeah, the insane prices are also common... I kinda don't wanna put something newer than the GF7000 series in the PCIe slot of a 775Dual-880Pro, since it doesn't do Core 2 CPUs, meaning a Presler/Cedar Mill tops it out, aka "478 v2.0" performance levels; they'd probably bottleneck an 8800, which would already be gimped by PCIe x4...

The ASRock website does not claim Win98 support, but the VIA PT880 Pro/Ultra chipset has VIA Hyperion drivers with Win98 compatibility. They work fine; the SATA2 model does not have drivers but otherwise works fine too.

I like having one computer platform that supports Win98 and XP, with options for any video card from 2000-2011. That basically covers anything I'm going to do with said PC.

The price conversation is always difficult; it depends on your patience, luck and local area. I don't pay more than 20€ for old PC parts. eBay prices are funny. 8800 cards have been very common and could be had for free locally here. Never owned 6000 or 7000 series GPUs.

GPU list I have used in one of these boards:
voodoo2 pci
GF3 ti
ti 4400, ti 4600
FX5900 ultra
Ati 9600, 9700pro, 9800pro
8800GTX, 8800GTS

Reply 24 of 43, by The Serpent Rider


SPECviewperf and most of the other benchmarks in Tom's article.

That's a synthetic benchmark - not relevant.

yet except for BF, every PCIe x4 graph shows a 25-40% drop in FPS

These cards are much newer and were tested on an Ivy Bridge platform, with games which are also way too new for any potential retro setup.

I must be some kind of standard: the anonymous gangbanger of the 21st century.

Reply 25 of 43, by rasz_pl


Speaking of FPS vs PCIe bandwidth, I just bought a new old GTX 1660 SC (finally a reasonable $100 low-end GPU) the other day. Connected it to my ~i7-4790 Xeon setup using a PCIe 1.0 x1 extender and ran a modern game, Mount & Blade II: Bannerlord. Flat 30 FPS no matter the low/max graphics settings. Changed to PCIe 2.0 x1 and same 30fps. Switched to 2.0 x16 and max settings average 100fps, max 160fps, 1% min 57fps. Funny irrelevant anecdote.

Open Source AT&T Globalyst/NCR/FIC 486-GAC-2 proprietary Cache Module reproduction

Reply 26 of 43, by 2mg

The Serpent Rider wrote on 2022-12-04, 15:08:

That's a synthetic benchmark - not relevant.

All the benchmarks make both cards fail more or less, and one game kills 8800.

I'm not here to contradict you, but while knowing that your narrow link will bite you someday is fine, knowing which combination of game and GPU (VRAM) will trigger it is impossible, hence I still see PCIe x4 as a trap.

These cards are much newer and were tested on an Ivy Bridge platform, with games which are also way too new for any potential retro setup.

I still don't get that; is that article a repeat of the 8800 situation? Because the 680 is a beefy card.

rasz_pl wrote on 2022-12-04, 16:48:

...
Switched to 2.0 x16 and max settings average 100fps, max 160fps, 1% min 57fps. Funny irrelevant anecdote.

That's what I'm trying to find out from The Serpent Rider, he's a bit more savvy about this.

timsdf wrote on 2022-12-04, 14:31:

The ASRock website does not claim Win98 support, but the VIA PT880 Pro/Ultra chipset has VIA Hyperion drivers with Win98 compatibility. They work fine; the SATA2 model does not have drivers but otherwise works fine too.

I didn't know that. I mean, I know you can go to Intel, for example, for a chipset driver for your mobo, but if the board manufacturer killed or never provided the support in the first place, chances are the part manufacturer doesn't have it either.
And "driverless" or "lacks one specific driver" builds have some horror stories.
Plus a bunch of "unknown devices" which are 99% of the time harmless, but if one decides to act up (or is quietly killing the system)...
Bit bummed out now knowing these mobos actually work; my criterion was always to check whether there are drivers on the net for that mobo/OS.

I like having one computer platform that supports Win98 and XP, with options for any video card from 2000-2011. That basically covers anything I'm going to do with said PC.

Sure, but you can only fit so much in one PC. Even not counting period-correct builds, there is still a limit, especially if you want more than one GPU driver.
Or, god forbid, a disk fails (yes, I know, RAID/disk cloning).
Sandy/Ivy + GTX 700/900 sounds delicious and covers 2000-2013 and more, and you can even "unofficial" it up to Haswell...
I just kinda feel better knowing that if one era's build fails, it's just that one. OTOH those will probably fail sooner than a newer multi-era system, eh, I dunno.

The price conversation is always difficult; it depends on your patience, luck and local area. I don't pay more than 20€ for old PC parts. eBay prices are funny. 8800 cards have been very common and could be had for free locally here. Never owned 6000 or 7000 series GPUs.

Yeah, patience I have; luck, the local area, and gold-nugget pricing are what's killing me softly.
Also, how something can suddenly become "retro/vintage" is crap, and I'm not talking in a "time flies" sense.
Half of the things considered retro in the tech world are e-waste, not in a bad sense ("scavenging" has a charm to it), but yeah, eBayou prices...
Phil makes a Conroe build and suddenly everyone needs one; okay, supply/demand, but you ain't convincing me it's now worth up to $500.

GPU list I have used in one of these boards:
...

See, that's what doesn't fit in one huge "all OS" system. Still like the idea, but maybe a few chunks are better?

The main reason a lot of people use the 775Dual-VSTA and 4CoreDual-VSTA is that they are cheap, plentiful and mostly free of the capacitor plague. The older 775Dual-880Pro model has more capacitor issues.

But the 775Dual-VSTA and 4CoreDual-VSTA have that PCIe x4 slot, which kinda loses its appeal by the end of the 2000s/early 2010s, unless you have a beefy 2010s GPU that can take the PCIe x4 hit.
Also, I have an incoming 775Dual-880Pro, thanks for ruining it.
Did the cap plague affect the early 2000s, like early socket 478 and late socket 370 boards? Dumb question, but did it also affect AMD mobos?

Reply 27 of 43, by pentiumspeed

2mg wrote on 2022-12-04, 18:25:
The Serpent Rider wrote on 2022-12-04, 15:08:

That's a synthetic benchmark - not relevant.

All the benchmarks make both cards fail more or less, and one game kills 8800.

I'm not here to contradict you, but while knowing that your narrow link will bite you someday is fine, knowing which combination of game and GPU (VRAM) will trigger it is impossible, hence I still see PCIe x4 as a trap.

These cards are much newer and were tested on an Ivy Bridge platform, with games which are also way too new for any potential retro setup.

I still don't get that; is that article a repeat of the 8800 situation? Because the 680 is a beefy card.

rasz_pl wrote on 2022-12-04, 16:48:

...
Switched to 2.0 x16 and max settings average 100fps, max 160fps, 1% min 57fps. Funny irrelevant anecdote.

That's what I'm trying to find out from The Serpent Rider, he's a bit more savvy about this.

Yes I can explain. PCIe is about the number of lanes: the more lanes a card gets (x1, x4, x8, x16), the higher the bandwidth. How the lanes are allocated depends on the Intel or AMD CPU; you might only get x1, or x4/x4/x8, or x16. Intel desktop CPUs give you x16 total that you can use for one card or divide among cards, and only the very newest CPUs were raised to x20 or x24 thanks to AMD competition. The Xeon E3 series starting with Ivy Bridge has x20. And there are exceptions on Xeon E5 and later; some low-end E5s are cut down on lanes to segment the Xeon line. That's Intel.

One more thing: if you add an extra PCIe card, it can steal lanes from the GPU, which means the GPU drops to x8. You have to move that card to another PCIe slot that comes off the southbridge instead, so your GPU keeps its x16 lanes. That is detailed in the motherboard's manual, or in the technical manual for an OEM computer.

PCIe 1.0 is the first generation and sets the baseline bandwidth; 2.0 doubles it over 1.0, 3.0 doubles it again over 2.0, and so on.

Here's the wiki on PCIe stuff.

https://en.wikipedia.org/wiki/PCI_Express
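As a rough illustration of that scaling (a minimal sketch in Python; the per-lane figures are the usual approximate numbers from the PCIe specs, not something measured in this thread):

# Approximate one-direction PCIe bandwidth: doubles with each version, scales with lane count.
# Per-lane figures: 1.0 = 2.5 GT/s with 8b/10b encoding (~250 MB/s),
# 2.0 = 5 GT/s with 8b/10b (~500 MB/s), 3.0 = 8 GT/s with 128b/130b (~985 MB/s).
PER_LANE_MB_S = {"1.0": 250, "2.0": 500, "3.0": 985}

def pcie_bandwidth_mb_s(version, lanes):
    """Approximate one-direction bandwidth in MB/s for a PCIe link."""
    return PER_LANE_MB_S[version] * lanes

for version in ("1.0", "2.0", "3.0"):
    for lanes in (1, 4, 16):
        print(f"PCIe {version} x{lanes}: ~{pcie_bandwidth_mb_s(version, lanes)} MB/s")
# e.g. PCIe 1.0 x4 works out to roughly 1000 MB/s, while PCIe 2.0 x16 is roughly 8000 MB/s.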

One more thing: even if you have a PCIe 3.0 board, most of the time the GPU starts at the PCIe 1.0 link speed for power management; you have to load up the GPU by running something before it will jump to 2.0 or higher.
If the GPU card is a 1.0 card, it stays there. If the GPU card is a 2.0 card, it will start at 1.0 and then, under load, go up to 2.0, even if the motherboard is 3.0.
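One way to watch that link-speed behavior yourself (a small sketch, assuming a reasonably modern NVIDIA card and the nvidia-ml-py "pynvml" Python bindings; GPU-Z shows the same information on Windows):

# Query the current vs. maximum PCIe link generation and width via NVML.
# Run it once at idle and again while a 3D load is running to see the link speed jump.
import pynvml

pynvml.nvmlInit()
gpu = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system

cur_gen = pynvml.nvmlDeviceGetCurrPcieLinkGeneration(gpu)  # e.g. 1 at idle
max_gen = pynvml.nvmlDeviceGetMaxPcieLinkGeneration(gpu)   # e.g. 3
cur_width = pynvml.nvmlDeviceGetCurrPcieLinkWidth(gpu)
max_width = pynvml.nvmlDeviceGetMaxPcieLinkWidth(gpu)

print(f"PCIe link: gen {cur_gen} of {max_gen}, x{cur_width} of x{max_width}")
pynvml.nvmlShutdown()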
Cheers,

Great Northern aka Canada.

Reply 28 of 43, by Horun


Agree, timsdf! I picked up an 8800GTS 320MB for next to nothing not long ago. Prices in my area or on eBay for most any 8800 are very reasonable... and they perform quite well for the price.

Hate posting a reply and then have to edit it because it made no sense 😁 First computer was an IBM 3270 workstation with CGA monitor. Stuff: https://archive.org/details/@horun

Reply 29 of 43, by 2mg

pentiumspeed wrote on 2022-12-04, 19:26:

Yes I can explain.

Thanks, but that part I'm familiar with. It's the starvation of GPU performance shown here: https://www.tomshardware.com/reviews/pci-expr … sis,1572-8.html which was explained as the 8800 not having enough VRAM, so I'm guessing it has to transfer the needed textures more frequently and that chokes the PCIe x4 link. But that shouldn't be happening here https://www.techpowerup.com/review/intel-ivy- … -scaling/8.html and it almost isn't happening here https://www.techpowerup.com/review/intel-ivy- … -scaling/6.html so I dunno what's going on.
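A back-of-envelope way to look at that VRAM-spill theory (a sketch only; the per-frame numbers are made-up guesses for illustration, not figures from the linked articles):

# If the working set no longer fits in VRAM and some textures must cross the bus
# every frame, the bus itself puts a ceiling on the frame rate.
def bus_limited_fps(spill_mb_per_frame, link_mb_per_s):
    """Upper bound on FPS if spill_mb_per_frame must cross the link each frame."""
    return link_mb_per_s / spill_mb_per_frame

pcie10_x4 = 4 * 250    # ~1000 MB/s, PCIe 1.0 x4 (the VIA/ULi boards)
pcie10_x16 = 16 * 250  # ~4000 MB/s

for spill in (10, 50, 100):  # MB re-fetched per frame (illustrative guesses)
    print(f"{spill:3d} MB/frame -> x4 cap ~{bus_limited_fps(spill, pcie10_x4):.0f} fps,"
          f" x16 cap ~{bus_limited_fps(spill, pcie10_x16):.0f} fps")
# With little or no spill the bus barely matters; once data has to stream every
# frame, an x4 link becomes the ceiling long before an x16 link does.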

This all started with me asking when PCIe x4 will choke (for games and GPUs from around 2004-2008).

Reply 30 of 43, by pentiumspeed


This depends on how the 3D work is split between the CPU and the GPU (or both), which changes the overall performance; not surprising.

Secondly, not all GPUs come with a full x16 connection; some newer low-to-mid-end cards start at x8. There are true x1 GPU cards out there too, but those are for low-end use.

Cheers,


Reply 31 of 43, by Horun

2mg wrote on 2022-12-04, 23:11:

Thanks, but that part I'm familiar with. It's the starvation of GPU performance shown here: https://www.tomshardware.com/reviews/pci-expr … sis,1572-8.html which was explained as the 8800 not having enough VRAM, so I'm guessing it has to transfer the needed textures more frequently and that chokes the PCIe x4 link. But that shouldn't be happening here https://www.techpowerup.com/review/intel-ivy- … -scaling/8.html and it almost isn't happening here https://www.techpowerup.com/review/intel-ivy- … -scaling/6.html so I dunno what's going on.

This all started with me asking when PCIe x4 will choke (for games and GPUs from around 2004-2008).

At Tom's Hardware, if you read how they did the PCIe x1, x4, etc. tests, they taped off part of the video card's connector, and only a few select boards could even work that way (leading me to think their method was an improper way to get real-world results).
Also, looking at Tom's, it appears that all the games but one came within 80% or better at x4 versus x8; the exception was COD 2 on the nVidia 8800GTS.
Since the 8800GT came out around the same time and was about 15% faster than an 8800GTS, I'm not sure why Tom's used the 8800GTS to compare. In fact, I would have preferred the 8800 GT 512MB vs the HD 2900 XT 512MB for those comparisons...


Reply 32 of 43, by 2mg

Horun wrote on 2022-12-05, 03:34:

At Tom's Hardware, if you read how they did the PCIe x1, x4, etc. tests, they taped off part of the video card's connector, and only a few select boards could even work that way (leading me to think their method was an improper way to get real-world results).

AFAIK the GTX680 could possibly need to draw more power through the PCIe x16 slot than x1-x4 can provide; that might've been the case...
And the games might've had way more textures and had to load them more often, even with the 680's big VRAM.
Dunno how to interpret that article's results, or whether it proves the VRAM point.

Also, looking at Tom's, it appears that all the games but one came within 80% or better at x4 versus x8; the exception was COD 2 on the nVidia 8800GTS.
Since the 8800GT came out around the same time and was about 15% faster than an 8800GTS, I'm not sure why Tom's used the 8800GTS to compare. In fact, I would have preferred the 8800 GT 512MB vs the HD 2900 XT 512MB for those comparisons...

I'm not sure how the 8800 lineup stacks up, but the 8800 GTS (G80) came with 320-640MB, while the 8800 GTS (G92) has 512MB and seems faster than the 8800GT; maybe it was one of those "Nvidia has a bunch of misleading GPUs" things. Also, the 8800 GT has 256, 512 and 1024MB versions...

Still, if (lower) VRAM was the limiting factor on PCIe x4, as I said, that period seems too wild for anyone to have known which GPU would be gimped by PCIe x4 unless you saw benchmarks before buying one. It proves the point that more VRAM fares better on PCIe x4, but also that it's a trap variable in what's going on - we might be smarter now, but in 2007 when this was done, there's probably no way you could've known this.

Reply 33 of 43, by The Serpent Rider


the GTX680 could possibly need to draw more power through the PCIe x16 slot than x1-x4 can provide

That's not how it works. Once again, read the PCIe specs.


Reply 34 of 43, by 2mg

The Serpent Rider wrote on 2022-12-06, 11:40:

That's not how it works. Once again, read the PCIe specs.

I know, I said "maybe"; I've read somewhere that it was an issue, maybe exclusively with these PCIe x4 + AGP combos. I probably mixed it up with the issue of fake AGP not being able to provide enough power (since it was really a hack of a PCI slot with extra wiring for power).

I'd still like your explanation of why VRAM matters for PCIe x4, and why the GTX680 chokes at PCIe x4 despite its huge amount of VRAM.
Again, genuinely curious.

Reply 35 of 43, by swaaye


The relatively large GeForce performance drop at 4x is curious. There was some strange behavior with the 320MB GeForce 8 cards that may have been some kind of driver memory management issue. I think it was eventually fixed. I think it's pulling texture data across the bus when it shouldn't be necessary to do so.

The drop at PCIe 1x for both cards is probably demonstrating a bottleneck with geometry transfer. This is even more pronounced with conventional PCI GeForce/Radeon cards.

I'm not sure when games started to do heavy texture streaming during play but old games tended to avoid doing it. I don't think you'd want to play Rage on PCIe 1.0 x4 for example.

Reply 36 of 43, by pentiumspeed


PS: Power is the rate of doing work (energy per second) and is expressed in a single unit, the watt. For example, you might have a 200kW engine in your car while you have a 200W video card at max utilization. And remember that the wattage changes all the time: at cruise you use around 50kW; same idea with a GPU, which draws fewer watts when lightly loaded.

The thinking that power draw limits performance here is incorrect. Totally untrue. If external power is not connected, the GPU will either not power up at all or it will report an error.

In simple words, bandwidth and GPU type affect performance. And how each game handles the rendering data sent to the GPU gives different results: some are efficient, some are very heavy on bandwidth and heavy on GPU processing. I.e. a low-end GPU gives less performance, a mid-range GPU gives mid performance, and a high-end GPU gives all the eye candy.

BUT this also depends on how old or recent the game is and how easy or hard it is on a given GPU generation: a GTX 285 may do poorly while a GTX 1080 does better in one game, while in other games the GTX 285 does better and the GTX 1080 doesn't reach its full potential.

And once again, PCIe bandwidth per lane doubles with each version (1.0, 2.0, and so on). The other dimension is lane count: x1 is one unit of bandwidth, while x16 is the full width, 16 units of bandwidth.

This is why good review sites keep a database of several games and different synthetic benchmarks when a product is reviewed, to give you the best idea of what is going on.

Cheers,


Reply 37 of 43, by 2mg

swaaye wrote on 2022-12-10, 14:35:

The relatively large GeForce performance drop at 4x is curious. There was some strange behavior with the 320MB GeForce 8 cards that may have been some kind of driver memory management issue. I think it was eventually fixed. I think it's pulling texture data across the bus when it shouldn't be necessary to do so.

So the 8800 320MB doesn't have a PCIe x4 performance drop due to lower VRAM compared to that ATi GPU on the linked Tom's Hardware page?
Serpent Rider explained it as being due to the 8800's lower VRAM.

pentiumspeed wrote on 2022-12-10, 17:31:

And once again, PCIe bandwidth per lane doubles with each version (1.0, 2.0, and so on). The other dimension is lane count: x1 is one unit of bandwidth, while x16 is the full width, 16 units of bandwidth.
Cheers,

See above.

Reply 38 of 43, by pentiumspeed


No again.

It does not have to do with PCIe. x4 really bottlenecks any mid-end to high-end PCIe GPU. If a game is not affected by fewer lanes, then that game is simply not using all of the GPU, period.

Back in the day, people were doing fine with 256MB mid- and high-end GPUs, and the 8800 GTS 320MB has slightly more performance and a memory bus 64 bits wider (320 bits vs the 256 bits of the 256MB cards), while the 8800 GTX is 384 bits at 768MB and is top of the line.

The 8800 GTS (upper mid-end) and 8800 GTX (top of the line) are close together, so that result does not make sense. You should notice a loss at x4 lanes if you are maxing out either of these GPUs, even the 8800 GTS. It depends on how hard or light each game is on the GPU, plus the settings used in each game.

Remember, x16 lanes means more bandwidth than x4 lanes.

That review is showing what each game does with the GPU by crippling the GPU through restricted bandwidth.

Cheers,


Reply 39 of 43, by swaaye

2mg wrote on 2022-12-10, 23:19:

So the 8800 320MB doesn't have a PCIe x4 performance drop due to lower VRAM compared to that ATi GPU on the linked Tom's Hardware page?
Serpent Rider explained it as being due to the 8800's lower VRAM.

To be sure, you would probably need to test an 8800 GTS 640MB. I think it is slower because of the smaller memory, but through something a bit more elaborate: the G80 chip, or the driver of the time, wasn't good at working with only 320MB, and the slow bus makes it even more obvious. There were various discussions about curious behavior with the card.

Nvidia admitted to some kind of slowdown issue eventually too.
https://forum.beyond3d.com/threads/nvidia-wil … eptember.38720/

I would probably avoid 2007 and 2008 drivers for G80+.