VOGONS

First post, by infiniteclouds

Rank: Oldbie

It seems many of the games I play from 2016 onward have some of the worst, most horrific aliasing imaginable. Sure, older games benefited significantly from AA, but nowadays a scene without AA isn't just 'jaggies', it is a shimmering, vomit-inducing, shim, shim, shimmery eye massacre. With deferred shading/rendering it seems like the AA options aren't as good, either: they either 1) blur the crap out of everything or 2) rely on higher resolutions or downsampling/scaling.

I'm still using my 4GB GTX 760 from 2014; it was $310 at the time. Before that it was a GTX 275 CO-OP PhysX edition for $350 in 2009, and before that a $320 7900GT in 2006, which had to be replaced by an 8800GTS when it died just over a year later, costing me the most I ever spent on a GPU: $380. This was the general price range I felt comfortable investing in a graphics card, and I always felt like it was a huge leap when I upgraded 3-4 years later. Is it my imagination, or are they expecting significantly more money for the same leaps in performance, or even smaller ones? ATI's reveal at CES was hugely disappointing to me, because I've been wanting to jump ship from NVIDIA for a while but I don't feel like they're giving me a better alternative, either.

The asking prices for cards that are already several years old, and for newer cards that are drawing comparisons to them (am I supposed to be excited when a 2019 card touts 20% better performance over a card from 2017?), seem ridiculous.

I'd like to hear other people's thoughts, since I admit I'm pretty ignorant; I've been out of the loop, having not bought hardware in 5 years now, and with DRM the way it is I feel like moving back to console, at least until they go all digital as well.

Reply 1 of 54, by keenmaster486

Rank: l33t

No resolution is a gimmick if your screen is big enough.

There are probably calculations you can do to determine just how much detail your eyes can actually resolve at a certain distance from a screen of a certain size.
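A rough sketch of that calculation in Python (the function name and the ~60 pixels-per-degree figure usually quoted for 20/20 vision are illustrative assumptions, not anything from this thread):

import math

def pixels_per_degree(h_res, v_res, diagonal_in, distance_in):
    """Roughly how many pixels fit into one degree of the viewer's field of view."""
    ppi = math.hypot(h_res, v_res) / diagonal_in            # pixel density of the panel
    pixel_in = 1 / ppi                                      # width of one pixel in inches
    # angle subtended by a single pixel at the given viewing distance
    pixel_deg = math.degrees(2 * math.atan(pixel_in / (2 * distance_in)))
    return 1 / pixel_deg

# Example: a 27" 4K panel viewed from about 24 inches away
print(round(pixels_per_degree(3840, 2160, 27, 24)))   # ~68 px/deg
# ~60 px/deg (one pixel per arcminute) is the commonly cited 20/20 acuity threshold,
# so at that distance a 27" 4K screen sits right around the limit of what the eye resolves.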

World's foremost 486 enjoyer.

Reply 2 of 54, by Unknown_K

Rank: Oldbie

$300 used to be what new high end cards cost back in the day (Voodoo 2 comes to mind). I snagged a RX-580 4GB card over a year ago, before the mining craze made them super expensive, and that is what I still use for my main gaming machine (which sits idle most of the time while I play old games on much older systems), and I only play at 1080p.

There isn't really any graphics card out there you can buy that will run new games at 60fps+ with all the settings maxed on a 4K monitor. Die shrinks are yielding little speed improvement in both GPU and CPU designs, so power usage is creeping up again for small gains in performance. Nvidia, with RTX, is trying to bring something new to the game, which will take a few generations to become usable and be adopted by game designers.

Most AAA games are designed for consoles and then ported to the PC, so some don't even consider using better textures or making proper use of high-end GPUs.

You mentioned the 8800GTS; that was the Tesla core they milked with die shrinks from 90nm all the way down to the 40nm GT 300 series.

Collector of old computers, hardware, and software

Reply 4 of 54, by infiniteclouds

Rank: Oldbie

I can tell the difference between 1440p and 1080p... but it is not substantial. On the other hand, if a game has horrendous aliasing like so many new games do, the options seem to come down to crappy built-in AA that blurs everything, or dialing up that "Scaling" percentage that usually goes up to 200%. I imagine increasing the actual resolution to 4K at 100% scaling has the same effect.

Reply 7 of 54, by snorg

Rank: Oldbie
Unknown_K wrote:
$300 used to be what new high end cards cost back in the day (Voodoo 2 comes to mind). I snagged a RX-580 4GB card over a year a […]

How long ago was the Voodoo 2 released? I'm going to say it was before 2000. But even if it came out in 2000, with inflation that $300 card is suddenly about $430. If the Voodoo 2 came out in 1997 or so, then it might be closer to $500. Suddenly we are getting closer to those extortion-level prices.

If you don't believe me, pop in your numbers here: https://westegg.com/inflation/
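For anyone who would rather see the formula than the calculator, the adjustment is just compound growth. A minimal sketch (the ~2.2% average rate is an illustrative assumption; the Voodoo 2 actually shipped in early 1998):

def adjust_for_inflation(amount, avg_annual_rate, years):
    """Compound a historical price forward by an assumed average annual inflation rate."""
    return amount * (1 + avg_annual_rate) ** years

# $300 in 1998 carried forward 21 years at an assumed ~2.2% average US inflation
print(round(adjust_for_inflation(300, 0.022, 21)))   # ~474, right in the $430-$500 ballpark above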

Reply 8 of 54, by BeginnerGuy

Rank: Oldbie

I'm not sure I understand what you're after here, OP.

1. The prices are outrageous, that's a fact. 4K is still not viable on affordable PC hardware; the middle ground is roughly an RX 580 8GB and a 1440p monitor. This has nothing to do with the viability of 4K though, and more to do with extreme price gouging on GPUs. Top end cards can now all handle 4K@60FPS, and even "lesser" cards like my Vega 64 handle it perfectly fine if you just turn a few minor detail settings down (and I never notice the difference anyway).

2. 4K itself isn't a gimmick. Also, my 4K monitor is 27" and I can still see aliasing, and I still turn on AA in games that can take it, such as GTA:V (I turn down other advanced settings to offset the performance hit); otherwise small distant objects still have that shimmering, jaggy effect. So people who tell you 4K is only useful on a larger monitor either haven't looked for themselves, or perhaps sit far from their monitor. I wear thick glasses, so I wouldn't say I have the greatest eyesight either. When 4K is affordable at sane prices and available with high refresh rates, all of the "too many pixels for the eye to see" or "useless on monitors under 32 inches" arguments will vanish from the internet.

I would personally enjoy seeing 5k or 8k at 27", but I know I won't have a single strand of hair left that isn't gray by then 😜

Sup. I like computers. Are you a computer?

Reply 9 of 54, by leileilol

Rank: l33t++

4K can be useful for moire-killing situations with CRT shaders at least, which could benefit from the pixel density... the overall bar to get decent 4K at a refresh of at least 60Hz is still way too high, though.

long live PCem

Reply 10 of 54, by ZellSF

Rank: l33t

I find 4K useless for extra details or sharpness on the desktop or in games. 1440p is enough (1080p is a bit too blurry though). I do however find it better looking when scaling lower resolution content.

I wouldn't personally recommend paying the $500 premium 4K costs for computer monitors right now (if you buy a 120Hz+ monitor, which you should: 120Hz+ is more important than the difference between 1440p and 4K), but it is a nice feature that will be worth it when it's cheaper.

I always find it a bit weird when people say you shouldn't buy a 4K monitor because your computer can't handle it for the latest games: it doesn't have to. Can't play a game at a comfortable framerate? Just lower the resolution. 1800p will still look better on a 4K monitor than on a 1440p monitor, for example. Plus you're going to be using your monitor for way more things than just playing the latest AAA games.

Reply 11 of 54, by dr_st

Rank: l33t
infiniteclouds wrote:

It seems many of the games I play from 2016 onward have some of the worst, most horrific aliasing imaginable. Sure, older games benefited significantly from AA, but nowadays a scene without AA isn't just 'jaggies', it is a shimmering, vomit-inducing, shim, shim, shimmery eye massacre.

Could you post some examples?

I have always held the opinion that the ROI for using AA is pretty low, given its high computational cost, since most of the "terrible jaggies" people tend to post in AA-vs-no-AA images were rather mild, of the kind that seemed to only be able to bother someone if they compulsively focused on them (rather than enjoying the game).

I just recently bought a 4K monitor, with the following ideas behind it:

  • It gives me a whole lot of desktop space (and Windows 10 is very nice in terms of automatically splitting windows to halves and quarters of the screen)
  • Games that my GPU can handle at 4K I can play at 4K
  • Games that the GPU cannot handle at 4K I can drop down to FHD, which is at least an integer divisor of the native resolution (even though it is still not as smooth as native FHD due to sub-pixel rendering)

https://cloakedthargoid.wordpress.com/ - Random content on hardware, software, games and toys

Reply 12 of 54, by gerwin

Rank: l33t

I can see the issues with 4K screens that other people pointed out. There is this legacy of a somewhat standard PPI (around 95) that I cannot let go of yet. But what I do want to add is that at this classic 95 PPI, larger bodies of text are kinda primitive: the text smoothing options are not that great, and in my opinion crisp text without any smoothing still reads better on such screens. When I compare this with an iPad mini, that device can render text in digital books and on websites very nicely.
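For reference, the "classic" density gerwin mentions falls straight out of the resolution and the diagonal; a quick sketch in Python (the specific monitor sizes are just illustrative):

import math

def ppi(h_res, v_res, diagonal_in):
    """Pixel density of a panel from its resolution and diagonal size in inches."""
    return math.hypot(h_res, v_res) / diagonal_in

# The legacy desktop formats all land near the ~95 PPI gerwin describes...
print(round(ppi(1920, 1200, 24)))   # ~94
print(round(ppi(1920, 1080, 24)))   # ~92
print(round(ppi(1680, 1050, 22)))   # ~90
# ...while a 27" 4K panel is a different world entirely
print(round(ppi(3840, 2160, 27)))   # ~163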

--> ISA Soundcard Overview // Doom MBF 2.04 // SetMul

Reply 13 of 54, by dr_st

Rank: l33t
ZellSF wrote:

I wouldn't personally recommend paying the $500 premium 4K costs for computer monitors right now (if you buy a 120Hz+ monitor, which you should: 120Hz+ is more important than the difference between 1440p and 4K), but it is a nice feature that will be worth it when it's cheaper.

I would pay almost any premium, but alas I also wanted a 32" monitor, and could not find any 32" 4K IPS with 120Hz+. Since I don't game that much, and definitely not competitively, I settled for 60Hz; one day, though, I want to check if 120Hz really helps in day-to-day tasks and basic gaming as well.

ZellSF wrote:

I always find it a bit weird when people say you shouldn't buy a 4K monitor because your computer can't handle it for the latest games: it doesn't have to. Can't play a game at a comfortable framerate? Just lower the resolution. 1800p will still look better on a 4K monitor than on a 1440p monitor, for example. Plus you're going to be using your monitor for way more things than just playing the latest AAA games.

My thoughts exactly.

https://cloakedthargoid.wordpress.com/ - Random content on hardware, software, games and toys

Reply 14 of 54, by BeginnerGuy

Rank: Oldbie
ZellSF wrote:

I always find it a bit weird when people say you shouldn't buy a 4K monitor because your computer can't handle it for the latest games: it doesn't have to. Can't play a game at a comfortable framerate? Just lower the resolution. 1800p will still look better on a 4K monitor than on a 1440p monitor, for example. Plus you're going to be using your monitor for way more things than just playing the latest AAA games.

Games have always been a benchmark for these kinds of things just as much as productivity. Games also take us to the limits of small objects that can prove what the eye can and cannot see, hence why I explained that I can see jagged edges at 4K in games and still turn on AA. What you see in a game also wouldn't be any different from editing photos or doing digital art. Finally, games were mentioned because the OP's post was wide open to interpretation, leaning into hardware costs and discussing aliasing in games, which made it obvious games were in the equation. Who doesn't play games? As far as desktop use is concerned, mainstream operating systems seem to still be forcing scaling rather than updating to a smarter UI and higher-res fonts, which leaves 1440p and 4K looking similar. 4K looks absolutely beautiful for text editing using Debian (which comes with way nicer fonts IMO). I don't think I'd feel giddy about jumping above 4K at 27" for productivity, but for gaming, if I had the hardware, I'd jump right on 5K or 8K. YMMV.

I have to say though, your suggestion of 1800P is harshly criticized because monitors have to interpolate between pixels. 1080P scales evenly to 4K (3840 % 1920 == 0) where 1800P does not. If you put a 1440P panel beside a 4K panel running at 1800P, you 'may' change your mind. However, "smarter" games that allow resolution scaling do indeed help that problem somewhat (UE4 games like Dragon Quest XI). I'm ignoring screen technologies and refresh rates, since they'll improve for 4K and prices will inevitably drop.
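To make the divisibility point concrete, here is a minimal sketch of the check being described (the helper name is made up for illustration):

def integer_scale_factor(native, target):
    """Return the whole-number scale factor if `target` maps onto `native` in even pixel blocks, else None."""
    nx, ny = native
    tx, ty = target
    if nx % tx == 0 and ny % ty == 0 and nx // tx == ny // ty:
        return nx // tx
    return None

print(integer_scale_factor((3840, 2160), (1920, 1080)))   # 2    -> each 1080p pixel becomes a clean 2x2 block
print(integer_scale_factor((3840, 2160), (3200, 1800)))   # None -> 1800p needs interpolation on a 4K panel
print(integer_scale_factor((3840, 2160), (2560, 1440)))   # None -> so does 1440p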

Sup. I like computers. Are you a computer?

Reply 15 of 54, by dr_st

Rank: l33t
BeginnerGuy wrote:

As far as desktop use is concerned, mainstream operating systems seem to still be forcing scaling rather than updating to a smarter UI and higher-res fonts, which leaves 1440p and 4K looking similar. 4K looks absolutely beautiful for text editing using Debian (which comes with way nicer fonts IMO).

Windows is improving its scaling, Linux probably as well, and MacOS has had this problem solved for years, or so I've been told.

BeginnerGuy wrote:

I have to say though, your suggestion of 1800P is harshly criticized because monitors have to interpolate between pixels. 1080P scales evenly to 4K (3840 % 1920 == 0) where 1800P does not.

Obviously it was just a typo and he meant 1080P. There isn't even a common 1800P resolution.
Edit: I'm not so sure, actually.

Last edited by dr_st on 2019-01-19, 20:08. Edited 1 time in total.

https://cloakedthargoid.wordpress.com/ - Random content on hardware, software, games and toys

Reply 16 of 54, by Scali

Rank: l33t

I've had a 4k monitor for 4 years now I think, a Dell P2415Q.
The image quality is excellent, and it works great with Windows 10 (Windows 7 and earlier have trouble rendering at higher DPI).
Originally I had a GTX 460 video card, which could only output 4K over DVI, and the monitor has DP and HDMI but not DVI, so at first I was stuck at 1080p resolutions.
I upgraded to a GTX 970 to fix that. Gaming at 4K is only possible with older games or at very low detail; as others said, I prefer to run games with more detail but at 1440p or even 1080p resolution.
My next video card upgrade should solve that, though. But I don't game much, so I'm not bothered. Having a 4K desktop is awesome. At 1080p I would often maximize application windows to get enough room on screen; with 4K you rarely need to, which makes it much easier to multitask. Just make your application windows as large as you need and have them (partly) overlap on screen. Much more productive.
At work we have just ordered a few 4k screens as well, for that same reason. Great for development work with Visual Studio and such.

http://scalibq.wordpress.com/just-keeping-it- … ro-programming/

Reply 17 of 54, by ZellSF

Rank: l33t
dr_st wrote:

BeginnerGuy wrote:

I have to say though, your suggestion of 1800P is harshly criticized because monitors have to interpolate between pixels. 1080P scales evenly to 4K (3840 % 1920 == 0) where 1800P does not.

Obviously it was just a typo and he meant 1080P. There isn't even a common 1800P resolution.
Edit: I'm not so sure, actually.

Not a typo. There's no reason to stick to common resolutions on LCDs.

BeginnerGuy wrote:

Finally, games were mentioned because the OP's post was wide open to interpretation, leaning into hardware costs and discussing aliasing in games, which made it obvious games were in the equation. Who doesn't play games?

There's more to games than the latest AAA titles though.

BeginnerGuy wrote:

I have to say though, your suggestion of 1800P is harshly criticized because monitors have to interpolate between pixels. 1080P scales evenly to 4K (3840 % 1920 == 0) where 1800P does not. If you put a 1440P panel beside a 4K panel running at 1800P, you 'may' change your mind. However, "smarter" games that allow resolution scaling do indeed help that problem somewhat (UE4 games like Dragon Quest XI). I'm ignoring screen technologies and refresh rates, since they'll improve for 4K and prices will inevitably drop.

At 1440p you're discarding 360p worth of image information. Maybe for a small antialiasing benefit, but you're still losing part of the original picture. At 2160p you're showing all the details in the original picture and adding some interpolated information.

Also the clean scaling factor of 1080p to 4K is for the most part irrelevant. For interpolated scaling it doesn't make a perceptible difference and for most 3D games you want some sort of interpolated scaling unless you really like aliasing.

Reply 18 of 54, by BeginnerGuy

Rank: Oldbie
ZellSF wrote:

Not a typo. There's no reason to stick to common resolutions on LCDs.

Why is that? I prefer native 1440 over interpolated 1800P, as it looks superior. I can't back my claim up until I do a blind test, but it's what I do in my own office. There IS a reason, unless you can show me some technological reason why newer LCD panels scale images perfectly; as far as I know, there is no such monitor. I don't think interpolation on top of interpolation (anti-aliasing) is remotely optimal.

ZellSF wrote:

There's more to games than the latest AAA titles though.

What does it have to do with AAA titles? I can see jaggies in the distance in Borderlands 2 as well (granted, lower texture resolutions), a much older game that can pull 4K on a 280X with ambient occlusion off. I can see jagged edges in a screenshot of a game at 4K. I can see jagged edges in certain 4K photographs. The point is that aliasing is still visible if you look for it; it doesn't matter if it's an indie game from 1982 if the object is small or detailed enough.

ZellSF wrote:

Also the clean scaling factor of 1080p to 4K is for the most part irrelevant. For interpolated scaling it doesn't make a perceptible difference and for most 3D games you want some sort of interpolated scaling unless you really like aliasing.

I perceive the difference, which goes right back to why I said YMMV. I can see it, many others can see it. Also, if the resolution of the monitor and the textures were high enough, why would you need interpolated scaling? The point of the highest resolutions is to make aliasing imperceptible, is it not?

Not trying to be argumentative here, just that this same debate has raged for years now on tech forums, yet it's merely a subjective topic. One cannot present his opinion on the look of an interpolated vs. native image as gospel.

Sup. I like computers. Are you a computer?

Reply 19 of 54, by ZellSF

Rank: l33t
BeginnerGuy wrote:
ZellSF wrote:

Not a typo. There's no reason to stick to common resolutions on LCDs.

Why is that? I prefer native 1440 over interpolated 1800P,

There is a reason to stick to native resolution, but no reason to stick to other common resolutions, is what I meant.

At any rate, an 1800p image scaled to 2160p still has more detail than a native 1440p image. Not saying you're wrong to prefer a lower-resolution image, of course.

BeginnerGuy wrote:

What does it have to do with AAA titles?

You can't play the latest AAA titles at 4K, which is why people are saying 4K is stupid, but there's more to games than that, as you mention: older games and indie titles you can easily play at 4K. Yes, I'll admit I lost the thread of the conversation here; we might have been replying to different things.

BeginnerGuy wrote:

I perceive the difference, which goes right back to why I said YMMV. I can see it, many others can see it. Also, if the resolution of the monitor and the textures were high enough, why would you need interpolated scaling? The point of the highest resolutions is to make aliasing imperceptible, is it not?

Because your hardware isn't good enough, or your software doesn't support the higher resolution. Once you're not going for native resolution, what's important is going as high in resolution as you can while not sacrificing anything else.

I'm hoping you're misunderstanding what you're replying to here, though: are you saying you could see the difference between an interpolated 1080p>2160p scale and an interpolated 1200p>2160p scale, and that the 1080p>2160p scale would be cleaner because of the better scaling ratio? I don't believe you (as for the many others, it's the first time I've heard anyone claim this, and I really like discussions like this).

The only reason to stick to perfect scaling ratios is integer scaling (which is not what I'm talking about above), which is nice when you have pixel-art based games and awful when you have low-res (1080p) 3D rendered games.