VOGONS


First post, by DosFreak

Rank: l33t++

https://www.makeuseof.com/should-gamers-use-m … graphics-cards/

How To Ask Questions The Smart Way
Make your games work offline

Reply 1 of 27, by Jo22

Rank: l33t++

I don't even know where to start. 😨

"Time, it seems, doesn't flow. For some it's fast, for some it's slow.
In what to one race is no time at all, another race can rise and fall..." - The Minstrel

//My video channel//

Reply 2 of 27, by leileilol

Rank: l33t++

it's true. here's john carmack in 1995 coding Quake 1 in hd!!!

Attachments

  • joncamrack.jpg (22.39 KiB)

long live PCem

Reply 3 of 27, by gaffa2002

Rank: Member

Not sure if that was what you meant, but...

In the end, it suggests that instead of buying better/more GPUs, you buy better peripherals, like a huge 4K/8K monitor.
Thing is, those will be useless if you don't have a monster GPU. In fact, a bigger screen will make your current GPU struggle even more and make the low resolution look even uglier.
In other words, very bad advice, as you will spend more money just to make your performance problem worse.
Or did I miss something?
Edit: I might have got it. Is it the last line, about the meal?

Last edited by gaffa2002 on 2022-06-29, 21:55. Edited 1 time in total.

LO-RES, HI-FUN

My DOS/ Win98 PC specs

EP-7KXA Motherboard
Athlon Thunderbird 750 MHz
256 MB PC100 RAM
GeForce4 MX440 64 MB AGP (128-bit)
Sound Blaster AWE64 CT4500 (ISA)
32 GB HDD

Reply 4 of 27, by Gmlb256

Rank: l33t

At least 1920x1080 shouldn't be an issue on 4K and 8K Ultra HD 16:9 monitors. 4K and 8K UHD are two and four times the resolution of 1920x1080, respectively.

VIA C3 Nehemiah 1.2A @ 1.46 GHz | ASUS P2-99 | 256 MB PC133 SDRAM | GeForce3 Ti 200 64 MB | Voodoo2 12 MB | SBLive! | AWE64 | SBPro2 | GUS

Reply 6 of 27, by gaffa2002

Rank: Member
Gmlb256 wrote on 2022-06-29, 20:15:

At least 1920x1080 shouldn't be an issue on 4K and 8K Ultra HD 16:9 monitors. 4K and 8K UHD are two and four times the resolution of 1920x1080, respectively.

The resolution itself will not be the problem (aside, of course, from the waste of using four pixels to draw the exact same color instead of one, just to fill the whole screen); the problem is the bigger screen size, as the jagged pixels will be more visible.


Reply 8 of 27, by luckybob

Rank: l33t
Attachments

  • image-asset.png (238.35 KiB)

Everything looks okay to me.

I see nothing wrong here.

It is a mistake to think you can solve any major problems just with potatoes.

Reply 9 of 27, by keenmaster486

Rank: l33t

Was that article written by GPT-3?

Attachments

  • Screenshot from 2022-07-02 13-54-28.png (145.58 KiB)

World's foremost 486 enjoyer.

Reply 10 of 27, by gaffa2002

Rank: Member

Did anyone actually find what is wrong with the article? Because I didn't find anything wrong with it.
The only thing I disagreed with was its suggestion to buy a larger monitor (which is a personal opinion, not a fact).
Can someone please point it out? Or is it like The Emperor's New Clothes story?

LO-RES, HI-FUN

My DOS/ Win98 PC specs

EP-7KXA Motherboard
Athlon Thunderbird 750mhz
256Mb PC100 RAM
Geforce 4 MX440 64MB AGP (128 bit)
Sound Blaster AWE 64 CT4500 (ISA)
32GB HDD

Reply 11 of 27, by TrashPanda

Rank: l33t
gaffa2002 wrote on 2022-07-02, 20:01:

Did anyone actually find what is wrong with the article? Because I didn't find anything wrong with it.
The only thing I disagreed with was its suggestion to buy a larger monitor (which is a personal opinion, not a fact).
Can someone please point it out? Or is it like The Emperor's New Clothes story?

Everything between the start and end of the article. It's clearly written by a machine that is drawing its misinformation from outside sources; no human would write that junk.

Reply 12 of 27, by DosFreak

Rank: l33t++

Maybe this will help.

In the mid-2000s, gamers usually envisioned a top-end Intel processor with at least 16GB of RAM and a whopping four premium GPUs when talking about the ultimate gaming machine. These cards run under Nvidia SLI or AMD CrossFire technology.

However, almost 20 years later

https://arstechnica.com/gadgets/2005/04/syste … guide-200504/4/
https://arstechnica.com/gadgets/2005/04/syste … guide-200504/3/

Instead, it's all about expensive GPUs, cryptocurrency, and 4K displays

See article title.

So, what happened? Are multi-GPU setups still worth it for gamers?

They've never been "worth" it. It's a nice-to-have if you have money to spend. If we are talking 4K, then definitely not "worth" it.

However, from the mid-2000s up until the mid-2010s, gaming development was outpacing hardware capacity—meaning even top-end cards struggled to provide high-FPS 4K gaming.

That's why many gamers built computers with two or more GPUs

lolwut

For example, high-FPS 4K gaming requires at least a 3060 Ti or 2080 Super. However, in a video posted on YouTube, DudeRandom84 was able to run Grand Theft Auto V on ultra settings in 4K. This was way back in 2017, about one year before Nvidia launched its RTX GPUs.

What is high-FPS 4K gaming?
Wonderful, let's go by a YouTube video made by DudeRandom84. That isn't lazy at all. Also, GTA V came out on PC in 2015.

DudeRandom84 used two Nvidia GTX 1080 Ti GPUs linked via SLI and powered by an overclocked Intel Core i7-7700K.

It's 2022. Not 2016 (video card) or 2017 (CPU).
What is the point of this again?

For example, high-FPS 4K gaming requires at least a 3060 Ti or 2080 Super

Another advantage multiple GPUs deliver is the availability of a backup card

That's a ton of money for a backup card. Just grab an old one from your closet.

Furthermore, multi-GPU setups are far more useful in professional use.

Then you wouldn't be buying a gaming card. See article title.

Before the 2020 pandemic ravaged the world, most GPUs had reasonable prices. For example, the Nvidia GTX 1080 Ti had a $699 SRP. But, if you get two of those cards, you must shell out $1,398.

No.

So, if you were planning on getting two RTX 3090s, you have to shell out almost $3,000.

A "gamer" could, not very many would.

GPUs are some of the most power-hungry elements of any computer. If you're running an RTX 3090 Ti, the GPU has a 450-watt TDP—this is more powerful than what some PSUs can deliver.

I thought we were talking about "gamers" here. Whatever that means. They should already have a sufficient PSU.

This means you must invest in an air conditioning or heat exchanger unit, or you risk overheating your body while gaming.

Wut?

Say you've finally set up two RTX 3090s on your computer, and you're now rearing to test it with your games. For example, Grand Theft Auto V runs smoothly with it since it has in-game support.

I thought GTA V max was DX11? (I haven't verified) and the 3090 doesn't support SLI?

Furthermore, there are times that titles with multi-GPU support end up with poor performance, like frame drops and stuttering, due to poor driver implementation

Most of the time

If you have an unlimited budget, you can install a second (or third, or fourth) GPU to maximize the potential power of your system. But that's just it—potential. No game, current or past, uses that amount of horsepower. So, unless you're using your monster of a PC for work, you don't really need another GPU to game at the best possible quality.

Instead of putting up $2,000 for a second card, why not get better peripherals? For that money, you can get a massive 4K or 8K display

So the article recommends "gamers" buy an 8k display with a single 3090? How about no unless those gamers don't care about "high-FPS" gaming.


Reply 13 of 27, by ZellSF

Rank: l33t
gaffa2002 wrote on 2022-06-29, 21:19:
Gmlb256 wrote on 2022-06-29, 20:15:

At least 1920x1080 shouldn't be an issue on 4K and 8K Ultra HD 16:9 monitors. 4K and 8K UHD are two and four times the resolution of 1920x1080, respectively.

The resolution itself will not be the problem (aside, of course, from the waste of using four pixels to draw the exact same color instead of one, just to fill the whole screen); the problem is the bigger screen size, as the jagged pixels will be more visible.

For modern games, more advanced scaling algorithms that help with aliasing are increasingly used. Integer scaling is also the worst for aliasing; there are better scaling algorithms. Sadly, the only way to get the best ones is a bit hacky (Magpie).
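To illustrate why integer scaling keeps jaggies intact, here is a toy sketch (hypothetical code, not taken from Magpie or any other tool): nearest-neighbour integer scaling just replicates each source pixel into a block, so edges stay sharp but every jagged pixel is reproduced at the larger size.

```python
# Toy sketch of nearest-neighbour integer scaling (hypothetical, not any
# real tool's code): each source pixel becomes a factor x factor block,
# so nothing is blurred -- and nothing jagged is smoothed, either.
def integer_scale(img, factor):
    out = []
    for row in img:
        # Widen the row: each pixel repeated `factor` times...
        wide = [p for p in row for _ in range(factor)]
        # ...then repeat the widened row `factor` times vertically.
        out.extend([list(wide) for _ in range(factor)])
    return out

print(integer_scale([[1, 2], [3, 4]], 2))
# -> [[1, 1, 2, 2], [1, 1, 2, 2], [3, 3, 4, 4], [3, 3, 4, 4]]
```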

DNSDies wrote on 2022-06-30, 16:49:

high refresh rate monitors are a blurse as well.
Yes, you get buttery smooth motion, but your GPU actually needs to be able to render those extra frames.
Or you can just play older games at 144hz!

For non variable refresh rate content, a higher refresh rate monitor should be able to deliver smoother frame pacing for the frames your GPU can output, even if it doesn't match the refresh rate of the screen.

There's no drawback to a good high refresh rate monitor (except the price, of course). Which is why I think it's one of the most important things to have in a monitor.
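To put rough numbers on the frame-pacing point, here is a toy model (my own simplification, assuming a fixed-refresh display where every finished frame waits for the next refresh tick): on-screen frame times are quantized to the refresh interval, and a faster refresh gives a finer grid, so pacing stays closer to the GPU's real cadence.

```python
import math

# Toy model: a GPU finishing a frame every 21 ms (~48 fps) shown on a
# fixed-refresh display. Each frame appears at the next refresh tick, so
# on-screen frame times snap to multiples of the refresh interval. A
# higher refresh rate means a finer grid and more even pacing.
def onscreen_frame_times(frame_ms, refresh_hz, n=10):
    tick = 1000.0 / refresh_hz
    # Time at which each frame actually reaches the screen.
    shown = [math.ceil(i * frame_ms / tick) * tick for i in range(n + 1)]
    return [round(b - a, 2) for a, b in zip(shown, shown[1:])]

print(onscreen_frame_times(21, 60))   # swings between 16.67 and 33.33 ms
print(onscreen_frame_times(21, 144))  # stays close to the real 21 ms cadence
```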

Reply 14 of 27, by gaffa2002

Rank: Member

Oh, OK. I thought there was something more blatant or something funny to find.
Personally, I don't think the article says anything "wrong"; in fact, I don't think the article says anything at all.
The text is soulless and generic, and the point is never made clear.
Does it imply that multiple GPUs were a good option in the past? Or just that they are even less useful now than they were back then?
What does it mean by "gamers" exactly (a hardcore guy, or just the average joe who likes video games)?
What does it mean by "professional use"? A guy who needs different MS Office applications open on multiple monitors? Or someone who works with 3D modeling using software specifically tailored for multiple GPUs?
As for pricing, this is very broad and region-dependent, so it's very hard to verify its accuracy.
Basically, it's a whole wall of text that says: "Don't buy multiple GPUs and save the money to buy other stuff you may like. Unless, of course, multiple GPUs are useful for you."
As some people already pointed out, auto-generated crap.

Last edited by gaffa2002 on 2022-07-02, 23:58. Edited 1 time in total.


Reply 15 of 27, by gaffa2002

Rank: Member
ZellSF wrote on 2022-07-02, 22:14:
gaffa2002 wrote on 2022-06-29, 21:19:
Gmlb256 wrote on 2022-06-29, 20:15:

At least 1920x1080 shouldn't be an issue on 4K and 8K Ultra HD 16:9 monitors. 4K and 8K UHD are two and four times the resolution of 1920x1080, respectively.

The resolution itself will not be the problem (aside, of course, from the waste of using four pixels to draw the exact same color instead of one, just to fill the whole screen); the problem is the bigger screen size, as the jagged pixels will be more visible.

For modern games, more advanced scaling algorithms that help with aliasing are increasingly used. Integer scaling is also the worst for aliasing; there are better scaling algorithms. Sadly, the only way to get the best ones is a bit hacky (Magpie).

DNSDies wrote on 2022-06-30, 16:49:

high refresh rate monitors are a blurse as well.
Yes, you get buttery smooth motion, but your GPU actually needs to be able to render those extra frames.
Or you can just play older games at 144hz!

For non variable refresh rate content, a higher refresh rate monitor should be able to deliver smoother frame pacing for the frames your GPU can output, even if it doesn't match the refresh rate of the screen.

There's no drawback to a good high refresh rate monitor (except the price, of course). Which is why I think it's one of the most important things to have in a monitor.

There is no drawback, but there are no substantial advantages, either. At least not enough to justify replacing a working 1080p/60hz screen if you already have one in my opinion.
Now if we are talking about buying your first monitor or replacing a defective one, then buying a 4k/144hz screen won’t hurt and will be future-proof.


Reply 16 of 27, by Shreddoc

Rank: Oldbie

Sure is gonna be a blast navigating the next several decades of infotech, as the interminable "toddler years" of AI and autogen content slowly fills up all available digital space with programmed mediocrity.

Reply 17 of 27, by ZellSF

Rank: l33t
gaffa2002 wrote on 2022-07-02, 23:47:

There is no drawback, but there are no substantial advantages, either.

I mean, if someone told me a huge improvement to frame pacing wasn't a substantial advantage by itself, I would say they shouldn't bother with a 120hz monitor even if their GPU could manage it. They're clearly not good at spotting motion problems.

Unless you're talking about the caveat about variable refresh rate content, but I've run into a lot of random games that just don't play well with VRR (not to mention how helpful it is for video content).

gaffa2002 wrote on 2022-07-02, 23:47:

There is no drawback, but there are no substantial advantages, either. At least not enough to justify replacing a working 1080p/60hz screen if you already have one in my opinion.
Now if we are talking about buying your first monitor or replacing a defective one, then buying a 4k/144hz screen won’t hurt and will be future-proof.

Well if you're buying a high refresh rate monitor now, you won't just get the frame pacing advantage I mentioned, but also VRR support, since practically all high refresh rate monitors have that now. It's a very noticeable upgrade, more so than if you put that money into a single GPU upgrade. And it can last several GPU upgrades.

IMO, it's one of the more sensible PC upgrades you can do today. Even while still working under the assumption that only demanding games will be played and the actual high refresh rate capability won't be used.

(yes I'm assuming the 1080p/60hz display isn't VRR, because I don't think 1080p/60hz displays with VRR were ever common)

Reply 18 of 27, by gaffa2002

Rank: Member
ZellSF wrote on 2022-07-03, 00:30:

I mean, if someone told me a huge improvement to frame pacing wasn't a substantial advantage by itself, I would say they shouldn't bother with a 120hz monitor even if their GPU could manage it. They're clearly not good at spotting motion problems

Honest question… how can a 120 Hz monitor improve the frame pacing of a game running at a fixed 60 fps with vsync enabled?
AFAIK, higher refresh rates only make sense if whatever game you're running can constantly hit a framerate at least as high as that refresh rate.
The only other reason, IMO, is to reduce screen tearing if you like running your games without any fps limit.
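For reference, a toy back-of-the-envelope (my own hypothetical numbers, assuming simple double-buffered vsync) of the one other case I can see mattering: a frame that slightly misses a refresh deadline is held until the next tick, and a faster tick makes that penalty much smaller.

```python
import math

# With vsync on a fixed-refresh display, a frame that isn't ready at a
# refresh tick is held until the next one. A 17 ms frame therefore costs
# a full extra ~16.67 ms on a 60 Hz panel, but only slips to the next
# ~8.33 ms tick on a 120 Hz panel. (Toy model, double buffering assumed.)
def displayed_ms(render_ms, refresh_hz):
    tick = 1000.0 / refresh_hz
    return math.ceil(render_ms / tick) * tick

print(round(displayed_ms(17, 60), 2))   # 33.33 -> effectively 30 fps
print(round(displayed_ms(17, 120), 2))  # 25.0  -> far smaller penalty
```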


Reply 19 of 27, by Kreshna Aryaguna Nurzaman

Rank: l33t

However, almost 20 years later, people no longer talk about this kind of raw power. Instead, it's all about expensive GPUs, cryptocurrency, and 4K displays.

This is most depressing...

leileilol wrote on 2022-06-29, 19:21:

it's true. here's john carmack in 1995 coding Quake 1 in hd!!!

I vaguely remember that 16:9 CRT monitor. What's the brand and model?

Never thought this thread would be that long, but now, for something different.....
Kreshna Aryaguna Nurzaman.