VOGONS



The frustrations of the GPU market


Reply 140 of 191, by Meatball

Rank: Oldbie
Geri wrote on 2022-10-13, 19:00:

GPUs are dinosaurs of the past; it makes no sense for the average person to buy a graphics card. Graphics cards are a fine topic for a retro forum like this, but they're not something a normal person cares about anymore. Looking at the new generation of IGPs, an integrated AMD or Intel part, or even a Vivante/S3/Mali/PowerVR-based Chinese/Korean/Russian/British/Taiwanese IGP clone, will perform very well in PC and mobile systems. On modern PC systems, even software rendering, such as llvmpipe, offers enough 2D and 3D performance for the average user.

Buying graphics cards does still make sense for those who have very old systems or whose main hobby is gaming. But that means the demand for graphics cards is very small compared to the late 90s/early 2000s, when around 200 million graphics cards had to be sold every year to keep up with market demand. That makes for a shrinking market in which fewer and fewer people have to cover the growing development costs, which means very expensive cards and lower production numbers overall.

As practically everyone moves toward integration and low power consumption, including manufacturers, businesses, and end users, it's a dead end in the long term for companies like Nvidia and their multi-hundred-watt graphics cards. Right now there is no task a GeForce 4xxx can do that the IGP of a random $200-300 Android phone or laptop can't, unless you ramp the resolution up to sizes a normie can't even tell apart.

A GeForce4 Ti4200 was a $200 card in 2002; factoring in inflation, that would be about $330 today. An RTX 3060 sells for $360. As you say, any card today would destroy cards from back then, but the performance-to-price ratio seems to have stayed relatively constant, even if volume has dropped. If you were referring to the halo models, the kind of people who would buy an RTX 4090 today are the same people who would have bought a Ti4600, and there were never many of those people (or products) relative to the Ti4200-type crowd (another constant).
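If anyone wants to rerun the comparison with their own cards, it's a few lines of Python. A minimal sketch, assuming a roughly 1.65x cumulative US CPI multiplier for 2002 to 2022 (an approximation; substitute your own factor):

# Back-of-the-envelope launch-MSRP inflation adjustment.
CPI_2002_TO_2022 = 1.65  # approximate cumulative US CPI inflation, 2002 -> 2022

def in_2022_dollars(msrp_2002: float) -> float:
    """Scale a 2002 launch MSRP into approximate 2022 dollars."""
    return msrp_2002 * CPI_2002_TO_2022

print(f"Ti4200: $200 (2002) -> about ${in_2022_dollars(200):.0f} (2022)")  # ~$330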

Reply 141 of 191, by BitWrangler

Rank: l33t++

A 1987 IBM 8514 card would cost about $3600 in 2022 dollars. But the equivalent of that today is probably some Quadro or FirePro, etc.

Unicorn herding operations are proceeding, but all the totes of hens teeth and barrels of rocking horse poop give them plenty of hiding spots.

Reply 142 of 191, by TrashPanda

Rank: l33t
Geri wrote on 2022-10-13, 19:00:

GPUs are dinosaurs of the past; it makes no sense for the average person to buy a graphics card. Graphics cards are a fine topic for a retro forum like this, but they're not something a normal person cares about anymore. Looking at the new generation of IGPs, an integrated AMD or Intel part, or even a Vivante/S3/Mali/PowerVR-based Chinese/Korean/Russian/British/Taiwanese IGP clone, will perform very well in PC and mobile systems. On modern PC systems, even software rendering, such as llvmpipe, offers enough 2D and 3D performance for the average user.

Buying graphics cards does still make sense for those who have very old systems or whose main hobby is gaming. But that means the demand for graphics cards is very small compared to the late 90s/early 2000s, when around 200 million graphics cards had to be sold every year to keep up with market demand. That makes for a shrinking market in which fewer and fewer people have to cover the growing development costs, which means very expensive cards and lower production numbers overall.

As practically everyone moves toward integration and low power consumption, including manufacturers, businesses, and end users, it's a dead end in the long term for companies like Nvidia and their multi-hundred-watt graphics cards. Right now there is no task a GeForce 4xxx can do that the IGP of a random $200-300 Android phone or laptop can't, unless you ramp the resolution up to sizes a normie can't even tell apart.

What world do you live in, exactly?

It sure as heck isn't the same world as mine, or that of the hundreds of thousands of other PC users who actually require a discrete GPU.

Even laptops have discrete GPUs and resort to the iGPU only in base trash-tier business models. iGPUs have come a long way thanks to AMD, but they were rather basic affairs until very recently, and they have limits that even the newest iGPUs hit hard. Integration can only go so far: you run up against both thermal and power-consumption limits and are forced to sacrifice one or the other to stay within them, and you are also fighting die-space limitations. So while iGPUs can handle the basic stuff, they are not the solution for modern computers.

I'm going to ignore the bit about Android phones and cheap iGPUs being comparable to modern RT/AI-capable GPUs. That's just bullshit, and we both know the RT/AI GPU can handle things the iGPU and the Android phone can't. (And yes, even the basic 4000-series GPUs have both RT and AI cores.)

Modern GPUs don't just push pixels anymore.

Reply 143 of 191, by RandomStranger

Rank: Oldbie
Meatball wrote on 2022-10-14, 02:51:

A GeForce4 Ti4200 was a $200 card in 2002; factoring in inflation, that would be about $330 today. An RTX 3060 sells for $360. As you say, any card today would destroy cards from back then, but the performance-to-price ratio seems to have stayed relatively constant, even if volume has dropped. If you were referring to the halo models, the kind of people who would buy an RTX 4090 today are the same people who would have bought a Ti4600, and there were never many of those people (or products) relative to the Ti4200-type crowd (another constant).

The RTX3060 is not the modern equivalent of the Ti4200. It'd be closer to the RTX3070, with the RTX3080 being the Ti4400 and the RTX3090 the equivalent of the Ti4600. The 3060 (and the RTX3050) has no good equivalent in that generation, since the MX cards are more similar to the GTX variants kept on as budget offerings (1660, 1650). It would be easier to compare the FX series or the GeForce 2xx, which also had messy product lineups. The latter especially is very similar in that regard, with newly developed Tesla 2.0 products for the high end and a mix of Tesla 2.0 and rebranded Tesla 1.0 cards for the low end.

The thing is, the product lineup has changed a lot since the 90s and early 2000s, with a huge leap in price far beyond inflation. The Ti4600 launched at $400, which would be around $620 adjusted for inflation, while the RTX3090 launched at $1500. $1500 is not a consumer-grade price; it's an enterprise-grade price.


Reply 144 of 191, by BEEN_Nath_58

Rank: l33t
RandomStranger wrote on 2022-10-14, 07:05:
Meatball wrote on 2022-10-14, 02:51:

A GeForce4 Ti4200 was a $200 card in 2002; factoring in inflation, that would be about $330 today. An RTX 3060 sells for $360. As you say, any card today would destroy cards from back then, but the performance-to-price ratio seems to have stayed relatively constant, even if volume has dropped. If you were referring to the halo models, the kind of people who would buy an RTX 4090 today are the same people who would have bought a Ti4600, and there were never many of those people (or products) relative to the Ti4200-type crowd (another constant).

The RTX3060 is not the modern equivalent of the Ti4200. It'd be closer to the RTX3070, with the RTX3080 being the Ti4400 and the RTX3090 the equivalent of the Ti4600. The 3060 (and the RTX3050) has no good equivalent in that generation, since the MX cards are more similar to the GTX variants kept on as budget offerings (1660, 1650). It would be easier to compare the FX series or the GeForce 2xx, which also had messy product lineups. The latter especially is very similar in that regard, with newly developed Tesla 2.0 products for the high end and a mix of Tesla 2.0 and rebranded Tesla 1.0 cards for the low end.

The thing is, the product lineup has changed a lot since the 90s and early 2000s, with a huge leap in price far beyond inflation. The Ti4600 launched at $400, which would be around $620 adjusted for inflation, while the RTX3090 launched at $1500. $1500 is not a consumer-grade price; it's an enterprise-grade price.

I'm not sure whether China is developing a similarly capable GPU with a significantly lower price tag; that would be a blessing for everyone. Even so, Nvidia, AMD, and Intel won't be shaken, since they are too far ahead on features.

Unfortunately, $1500 is three quarters of what an average professor earns in a month in India, and the price here is actually close to $2000, so I can't imagine these cards becoming mainstream here at all. In the last 3 years, I know of just one person with a 2080 Ti and another with a 3080, and nobody else has touched those two series of cards.

previously known as Discrete_BOB_058

Reply 145 of 191, by The Serpent Rider

Rank: l33t++

The GeForce4 Ti 4200 was just an underclocked 4600; both shared the identical high-end chip. So no, customers today are paying for less even with inflation included. Although it's arguable that Nvidia and the other companies back then simply couldn't afford chips with big die sizes, due to manufacturing drawbacks.

I must be some kind of standard: the anonymous gangbanger of the 21st century.

Reply 146 of 191, by RandomStranger

Rank: Oldbie

Up until the GeForce4, the GPU lineup was simple and clean: Nvidia manufactured one or two GPUs and played with clocks and the memory bus. All the TNT2 cards, for example, were essentially the same; the Pro and the vanilla were just underclocked Ultras, and the M64 and the Vanta were underclocked Ultras with a 64-bit bus.


Reply 147 of 191, by BEEN_Nath_58

Rank: l33t
The Serpent Rider wrote on 2022-10-14, 08:32:

The GeForce4 Ti 4200 was just an underclocked 4600; both shared the identical high-end chip. So no, customers today are paying for less even with inflation included. Although it's arguable that Nvidia and the other companies back then simply couldn't afford chips with big die sizes, due to manufacturing drawbacks.

Maybe we'll see pre-builts becoming the norm again. I don't favour it, but clearly a single component is driving things that way far more than in the last few years.

previously known as Discrete_BOB_058

Reply 148 of 191, by Geri

Rank: Member
TrashPanda wrote on 2022-10-14, 04:42:

It sure as heck isn't the same world as mine, or that of the hundreds of thousands of other PC users who actually require a discrete GPU.

You are indeed living in an echo chamber made up of the hundreds of thousands of PC users who require a discrete GPU (your number), and you are projecting your views onto 8 billion people, even though those hundreds of thousands (again, your number) are less than 0.01% of the population. Your market demand is barely more relevant than that of your local Harry Potter fan club on Facebook.

TrashPanda wrote on 2022-10-14, 04:42:

we both know the RT/AI GPU can handle things the iGPU and the Android phone can't. (And yes, even the basic 4000-series GPUs have both RT and AI cores.) Modern GPUs don't just push pixels anymore.

A ray-traced picture doesn't have pixels? Do you even understand what you are talking about?

It takes just a few minutes to do real research with the help of internet search engines:
https://www.jonpeddie.com/images/uploads/AIB-PR-Q219-002.PNG
https://www.displaydaily.com/images/2015/Febr … attach_rate.jpg

Last edited by Geri on 2022-10-14, 10:52. Edited 2 times in total.

TitaniumGL the OpenGL to D3D wrapper:
http://users.atw.hu/titaniumgl/index.html

Reply 149 of 191, by Geri

Rank: Member
Meatball wrote on 2022-10-14, 02:51:

A GeForce4 Ti4200 was a $200 card in 2002; factoring in inflation, that would be about $330 today.

Your example is flawed. As others have already explained, a GeForce Ti4200 was actually a high-end card; it's comparable to a $1000-ish card of today.

The price of high-end gaming graphics cards was indeed around $200 in the late 90s and early 2000s. As graphics card sales started to shrink, the prices of the gaming and midrange models also started creeping upward after the release of the 8800 GT. After that, Intel came out with its DX10 IGP, and AMD started putting its newly acquired Radeon into its CPUs.

TitaniumGL the OpenGL to D3D wrapper:
http://users.atw.hu/titaniumgl/index.html

Reply 150 of 191, by havli

Rank: Oldbie

https://www.anandtech.com/show/601/3

With very fast memory, high yield chips, and a minimum of 64MB of SDRAM on the cards, how expensive do you think the new GeForce2 Ultra will be? It’s definitely not going to be at the $400 mark the 64MB GeForce2 GTS cards began selling at, nope, you’re looking at around $500 for a GeForce2 Ultra.

So... not really $200 for a high-end card...

HW museum.cz - my collection of PC hardware

Reply 151 of 191, by gaffa2002

Rank: Member

Another factor not being taken into consideration is that the practical gains from using high-end cards are becoming less and less noticeable.
In the 90s/2000s you could clearly see the difference between 320x200 and 640x480, and on top of that the color depth increased too. It was a night-and-day difference for both the enthusiast and the average Joe, and not just in graphics but in usability as well: UIs changed from simple command lines or square gray buttons to pretty windows that shrink and grow in front of a beautiful animated wallpaper, not to mention the smooth movement of the mouse cursor.
Nowadays, all those impressive changes have reached a peak. Resolutions are so high that there is very little visible difference between 2K and 4K, and even less at higher resolutions. The same goes for framerates: between 15 and 30 fps there is a huge difference, likewise from 30 to 60, and maybe a little from 60 to 120, but as the framerate climbs, the difference gets less and less noticeable.
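Rough numbers behind that intuition, as a quick Python sketch (reading "2K" as 1080p, the way it is used later in the thread):

# Pixel counts for the resolution comparison.
for name, w, h in [("2K (1080p)", 1920, 1080), ("4K", 3840, 2160)]:
    print(f"{name}: {w * h / 1e6:.1f} Mpx")  # 2.1 Mpx vs 8.3 Mpx: about 4x the pixels

# Absolute per-frame time saved by each doubling of the framerate.
for lo, hi in [(15, 30), (30, 60), (60, 120)]:
    saved_ms = 1000 / lo - 1000 / hi
    print(f"{lo} -> {hi} fps: frame time drops by {saved_ms:.1f} ms")
# 33.3 ms, then 16.7 ms, then 8.3 ms: each doubling buys half the absolute
# improvement of the previous one, which is why 60 -> 120 feels so much
# smaller than 15 -> 30.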
In summary: high-end GPUs in the past were night and day compared to the cheaper options, and everyone could notice the difference. High-end GPUs today are for people who are bothered by tiny details (tiny visually speaking, that is, because in computing terms those details require exponentially more horsepower). Integrated graphics today can do almost anything the average user wants, and even among people who use their PCs for gaming, a mid-range GPU is more than enough for the vast majority.
My guess is that discrete GPUs, especially the very high-end ones, will tend to become more and more expensive simply because of supply and demand.
I must say I kind of enjoy this technological peak, because it means computer hardware will have a longer lifespan. The industry could focus on making existing devices sturdier and more accessible instead of reinventing the wheel again and again. And in gaming there will be more room for creativity, with less pressure to make every title an interactive tech demo.

LO-RES, HI-FUN

My DOS/ Win98 PC specs

EP-7KXA Motherboard
Athlon Thunderbird 750mhz
256Mb PC100 RAM
Geforce 4 MX440 64MB AGP (128 bit)
Sound Blaster AWE 64 CT4500 (ISA)
32GB HDD

Reply 152 of 191, by Meatball

Rank: Oldbie
Geri wrote on 2022-10-14, 10:45:
Meatball wrote on 2022-10-14, 02:51:

A GeForce4 Ti4200 was a $200 card in 2002; factoring in inflation, that would be about $330 today.

Your example is flawed. As others have already explained, a GeForce Ti4200 was actually a high-end card; it's comparable to a $1000-ish card of today.

The price of high-end gaming graphics cards was indeed around $200 in the late 90s and early 2000s. As graphics card sales started to shrink, the prices of the gaming and midrange models also started creeping upward after the release of the 8800 GT. After that, Intel came out with its DX10 IGP, and AMD started putting its newly acquired Radeon into its CPUs.

I don't disagree with the masses and lower-end GPUs getting the job done (adequately, give or take). Maybe I missed your point? I'm reading your take as saying the value is no longer there.

It doesn't matter whether the Ti4200 was actually a higher-end card underclocked; it was targeted and marketed at the mid-range as a response to the 8500LE.

We'll go even lower then, using your example of people who don't care about graphics. Take the MX cards of the GeForce4 line: the MX420 had a suggested price of $100 at launch, and inflation makes that $165. A GTX 1630 lists at a suggested $200.

And a Celeron G6900 with an integrated GPU listed for $42.

You get a lot more for your money today, even with money’s continuing devaluation.

Reply 154 of 191, by Geri

Rank: Member
Meatball wrote on 2022-10-14, 14:14:

Maybe I missed your point? I'm reading your take as saying the value is no longer there.

There are far more differences than just the price of the cards; let's consider the following.

In the Ti4200 era, if you wanted to play games, you had to buy a graphics card: the integrated Intel 8xx, VIA ProSavage, and SiS 300 chips found on motherboards either ran newer video games at around 3 fps on the lowest settings or could not start them at all. More likely, your motherboard had no graphics chip at all, so you had to buy a card. If you wanted to play modern video games, you bought a GeForce4 Titanium, and that was that.

Now, if you take a random game and pull the settings down to low, you will get around 30 fps from your IGP. For the overwhelming majority of people, that is enough. Two decades ago, your typical integrated chip delivered a tenth of this.

What this means is an order of magnitude fewer buyers for video cards, which means very expensive cards, as fewer people have to pay for the development; this is especially true since developing modern drivers for new APIs, and the GPU itself, has become more complicated.

And another thing: people who want more than integrated graphics can offer (the gamers), and who are willing to spend money on it, don't want 20-30% over the integrated chip. They want something significantly better. So Nvidia tries to outrun integrated graphics with all its might, using multi-hundred-watt monstrosities with 10-20 GB of VRAM and brutal cooling, because otherwise no one would buy its hardware.

Of course, integrated graphics also get a little faster with every generation, biting bigger and bigger chunks out of Nvidia's market.
It's not just "oh, the video cards are so expensive"; the question is when Nvidia will hit the tipping point where it can no longer compete with the 3D performance of integrated graphics even at a 400-ish-watt TDP, because it has nothing that can realistically compete with integrated graphics in the long term. That's why they desperately tried to build SoCs for TVs, phones, and tablets, failed, and ended up as a supplier for a few consoles instead. That's why they tried to buy ARM, where they failed again. That's why they jumped onto RISC-V, but they seem to be failing to deliver there too. If they can't come up with something within a few years, the vultures may start circling Nvidia. There is simply nowhere left to go with graphics cards.

TitaniumGL the OpenGL to D3D wrapper:
http://users.atw.hu/titaniumgl/index.html

Reply 155 of 191, by gaffa2002

Rank: Member
Meatball wrote on 2022-10-14, 14:14:

I don't disagree with the masses and lower-end GPUs getting the job done (adequately, give or take). Maybe I missed your point? I'm reading your take as saying the value is no longer there.

It doesn't matter whether the Ti4200 was actually a higher-end card underclocked; it was targeted and marketed at the mid-range as a response to the 8500LE.

We'll go even lower then, using your example of people who don't care about graphics. Take the MX cards of the GeForce4 line: the MX420 had a suggested price of $100 at launch, and inflation makes that $165. A GTX 1630 lists at a suggested $200.

And a Celeron G6900 with an integrated GPU listed for $42.

You get a lot more for your money today, even with money’s continuing devaluation.

At the time of the MX420, many motherboards came with integrated video, and those were sometimes cheaper than motherboards without it; that's what the "people who don't care about graphics" bought back then.
This may vary drastically between regions, but in my experience the people who bought the MX series at the time were people interested in gaming who couldn't afford higher-end cards: people who today would buy something like a 1650 at least.
In general, I agree with you that you get more for your money, but that seems to apply only up to mid-range GPUs. For people who must have the latest GPU, I don't see prices dropping, because of the lower demand AND the now much longer life cycle of older GPUs.
Only time will tell, but I agree with Geri and believe discrete GPUs will become a niche product. That will happen when integrated GPUs reach the point of running games in some kind of sweet spot for the general public.
For me this is a good thing, because it means more people will have access to good gaming PCs.

LO-RES, HI-FUN

My DOS/ Win98 PC specs

EP-7KXA Motherboard
Athlon Thunderbird 750mhz
256Mb PC100 RAM
Geforce 4 MX440 64MB AGP (128 bit)
Sound Blaster AWE 64 CT4500 (ISA)
32GB HDD

Reply 156 of 191, by Meatball

Rank: Oldbie
gaffa2002 wrote on 2022-10-14, 14:52:
Meatball wrote on 2022-10-14, 14:14:

I don't disagree with the masses and lower-end GPUs getting the job done (adequately, give or take). Maybe I missed your point? I'm reading your take as saying the value is no longer there.

It doesn't matter whether the Ti4200 was actually a higher-end card underclocked; it was targeted and marketed at the mid-range as a response to the 8500LE.

We'll go even lower then, using your example of people who don't care about graphics. Take the MX cards of the GeForce4 line: the MX420 had a suggested price of $100 at launch, and inflation makes that $165. A GTX 1630 lists at a suggested $200.

And a Celeron G6900 with an integrated GPU listed for $42.

You get a lot more for your money today, even with money’s continuing devaluation.

At the time of the MX420, many motherboards came with integrated video, and those were sometimes cheaper than motherboards without it; that's what the "people who don't care about graphics" bought back then.
This may vary drastically between regions, but in my experience the people who bought the MX series at the time were people interested in gaming who couldn't afford higher-end cards: people who today would buy something like a 1650 at least.
In general, I agree with you that you get more for your money, but that seems to apply only up to mid-range GPUs. For people who must have the latest GPU, I don't see prices dropping, because of the lower demand AND the now much longer life cycle of older GPUs.
Only time will tell, but I really believe discrete GPUs will become a niche product, and that will happen when integrated GPUs reach the point of running games in a "sweet spot" for the general public. For me this is a good thing, because it means more people will have access to good PCs.

I didn't want to bring up the 1650, because it wasn't the bottom rung and it was re-launched in 2020 (from 2019). But since you mentioned it: it was $150, and inflation as of 2020 would put the MX at $144.

Reply 157 of 191, by BitWrangler

Rank: l33t++

Desktops looked dead around 2018. Then a surge of interest from the under-21s in games like Fortnite, Apex Legends, etc. kept them on life support, and then the work-from-home, nothing-to-do-at-home COVID situation boosted desktop interest further. I don't know whether their fortunes are now permanently revived to a steady portion of the market, or whether we're going to see the desktop market go bye-bye again... and that is where non-integrated GPUs fit.

Unicorn herding operations are proceeding, but all the totes of hens teeth and barrels of rocking horse poop give them plenty of hiding spots.

Reply 158 of 191, by gaffa2002

Rank: Member
Meatball wrote on 2022-10-14, 15:02:

I didn't want to bring up the 1650, because it wasn't the bottom rung and it was re-launched in 2020 (from 2019). But since you mentioned it: it was $150, and inflation as of 2020 would put the MX at $144.

I agree with you that a GPU today is much more cost-effective than back then. My only point was that this will not remain true for long, and for a totally different reason: back in the day, if you bought an ultra-high-end GPU, it would be outdated in a couple of years, outdated in the sense that you couldn't even run games on low anymore. That has changed over the years: you can now get away with a GPU for 5 to 10 years without it becoming useless, not because high-end GPUs aren't getting faster, but because the gains from a faster GPU are becoming less and less noticeable/relevant. That is, why buy a GPU 4x more powerful than what I already have just to move from 2K to 4K resolution?
If this trend continues, there will come a point where integrated graphics are enough for the vast majority of people (today, that position belongs to mid-range GPUs, IMHO).

LO-RES, HI-FUN

My DOS/ Win98 PC specs

EP-7KXA Motherboard
Athlon Thunderbird 750mhz
256Mb PC100 RAM
Geforce 4 MX440 64MB AGP (128 bit)
Sound Blaster AWE 64 CT4500 (ISA)
32GB HDD

Reply 159 of 191, by swaaye

Rank: l33t++
BitWrangler wrote on 2022-10-14, 15:04:

Desktops looked dead around 2018. Then a surge of interest from the under-21s in games like Fortnite, Apex Legends, etc. kept them on life support, and then the work-from-home, nothing-to-do-at-home COVID situation boosted desktop interest further. I don't know whether their fortunes are now permanently revived to a steady portion of the market, or whether we're going to see the desktop market go bye-bye again... and that is where non-integrated GPUs fit.

Intel, AMD and Nvidia are having a relatively awful year because demand has cratered.

Last edited by swaaye on 2022-10-14, 15:36. Edited 1 time in total.