VOGONS

Is anyone excited for Big Navi?

  • This topic is locked. You cannot reply or edit posts.

Reply 80 of 264, by ZellSF

Rank: l33t
darry wrote on 2020-09-26, 19:41:

The point that I am trying to make, or more accurately the question I have is just how noticeable the jump to 8K will be. I have my doubts about the "very noticeable" aspect

OK, here's a screenshot of GZDoom running at 1920x1080:

[Attached image: Screenshot_Doom_20200926_221118.png]

You might not see anything wrong, but that's fine; here's the same scene rendered at 3840x2160 and downsampled to 1920x1080:

[Attached image: Screenshot_Doom_20200926_2211122.png]

Now you should see that the rings under the closest UFO are much more defined at 3840x2160, and the one behind that is suddenly visible.

Now these shots are 1920x1080 for easier comparison, but the clarity of the rings looks exactly the same at 4K native. Notice how the rings below the farthest UFO are still missing at 3840x2160. Are you saying that's not very noticeable? Obviously, this gets fixed when rendering at 8K.
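For anyone wondering how detail can go completely missing at a lower rendering resolution (rather than just looking jagged), here is a minimal sketch of the sampling math in Python, assuming a made-up one-dimensional scene; the numbers are illustrative, not measured from GZDoom:

    # A thin opaque feature (think one ring of the UFO sprite) covering
    # x in [10.1, 10.35), narrower than a pixel, on an empty background.
    def scene(x):
        return 1.0 if 10.1 <= x < 10.35 else 0.0

    W = 16  # toy horizontal resolution, in pixels

    # Native rendering: one point sample per pixel, at the pixel center.
    native = [scene(i + 0.5) for i in range(W)]

    # 2x supersampling: two samples per pixel, averaged on downsampling.
    ssaa2x = [(scene(i + 0.25) + scene(i + 0.75)) / 2 for i in range(W)]

    print("native:", native)  # all zeros: no sample ever lands on the feature
    print("2xSSAA:", ssaa2x)  # pixel 10 comes out 0.5: partial coverage

At native resolution the feature vanishes entirely; more samples per output pixel (whether from a higher native resolution or from downsampling) turn sub-pixel features into visible partial coverage.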

darry wrote on 2020-09-26, 17:25:

The point that I am trying to make, or more accurately the question I have is just how noticeable the jump to 8K will be. I have my doubts about the "very noticeable" aspect, especially if rendering is done at 8K only to have the results downsampled for display on a lower resolution display (effectively anti-aliasing, which is great, but also less and less useful as resolution increases).

As shown above, it is not only anti-aliasing. You're right that it becomes less useful as resolution increases, but the point where it stops being useful isn't 4K. Try playing Alien Isolation at 4K and then try telling me with a straight face that you didn't see any aliasing.

Alien Isolation is interesting because it has temporal anti-aliasing available as a mod. If you use that, you won't get any aliasing at 4K... however, sparks in the game will be practically invisible. Fine detail being removed by post-processing and temporal anti-aliasing (which a lot of modern games use) is another issue that downsampling from a higher resolution greatly remedies.
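The spark problem falls out of how temporal AA accumulates frames. A toy Python sketch of a generic exponential-history blend (the 0.1 current-frame weight is an assumption for illustration, not the actual algorithm in the Alien Isolation mod):

    # Each displayed frame blends a small fraction of the current frame
    # into the accumulated history.
    alpha = 0.1    # current-frame weight; assumed typical value
    history = 0.0
    frames = [0.0, 0.0, 1.0, 0.0, 0.0]   # a spark that lasts one frame
    for brightness in frames:
        history = alpha * brightness + (1 - alpha) * history
        print(f"displayed: {history:.3f}")
    # The spark peaks at ~0.1 of its true intensity, then decays:
    # effectively invisible, exactly the trade-off described above.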

darry wrote on 2020-09-26, 19:41:

I am not against progress, but I feel this is starting to look like the, IMHO largely pointless, megapixel race in the digital camera world. There is more to digital camera image quality than megapixels (dynamic range, noise performance, autofocus performance and accuracy, etc.) and there is more to video game quality than rendering/display resolution (display and processing pipeline color depth, light/shadow rendering algorithms, texture resolution, frame rate, 3D model detail, etc.).

EDIT: Is increasing resolution beyond 4K the most "bang for the buck" path towards a better visual quality gaming experience?

No, but several of the things you listed require the developers' involvement, and they will never spend resources on high-end graphics that most people won't have the hardware for. Pushing resolution is often the only thing you CAN do at the high end of visual fidelity. And it can still give notable improvements even at 8K.

darry wrote on 2020-09-26, 19:41:

In other words, considering that there are diminishing returns at each jump in resolution (320x200 or less --> SD --> 2K --> 4K --> 8K), I am seriously questioning the value that 8K rendering and/or display will bring to me and others at normal monitor or TV viewing distances, versus the computational/financial/environmental cost.

If any of those are concerns, then sticking to 1280x720 is probably the most sensible thing. You probably can't go lower because it will make text unreadable in a lot of modern titles. It also makes more sense to get a console, though obviously game and controller preferences might remove that option.

Reply 81 of 264, by darry

Rank: l33t++
ZellSF wrote on 2020-09-26, 21:34:
OK, here's a screenshot of GZDoom running at 1920x1080: Screenshot_Doom_20200926_221118.png You might not see anything wrong, bu […]

I am anything but an expert, but I find it odd that a visual element the size of those rings is not visible when rendered at 2K, but suddenly becomes apparent at 4K, even when down-sampled to 2K. Could issues with the rendering engine/pipeline (i.e. lack of precision or some form of rounding) be at play here, causing an anomalous lack of detail that oversampling (rendering at 4K and down-sampling) compensates for? Nothing else in the scene really benefits from the 4K rendering, at least to my eyes.

Reply 82 of 264, by Bruninho

Rank: Oldbie

Here in Brazil, RTX 3080 prices will start at R$ 5200 (approx. US$ 940) and the RTX 3090 will start at R$ 12000 (approx. US$ 2160). And these GPUs will require a 750W PSU.

Excitement? ZERO. Absolutely ZERO. For the price of an RTX 3090 I could buy a 2020 16-inch MacBook Pro or the latest Dell XPS 15. *rolleyes*

I seem to have made the right decision a few years ago.

"Design isn't just what it looks like and feels like. Design is how it works."
JOBS, Steve.
READ: Right to Repair sucks and is illegal!

Reply 83 of 264, by leileilol

Rank: l33t++

video cards

It's always the "more RAM = uncompressed = quality" cycle, as well as the higher-res/FSAA cycle of the past 20 years, and now the new raytraced doodads pretending buffered/texture rendering + shaders never happened.

Last edited by leileilol on 2020-09-27, 02:37. Edited 2 times in total.

apsosig.png
long live PCem

Reply 84 of 264, by darry

Rank: l33t++

I think Nvidia could be pricing themselves out of any significant market share unless they can come up with a decent mid-range product with a sane power envelope and competitive pricing. The highly motivated and/or rich will buy up the high end, but the masses want bang for the buck, and Nvidia may or may not be the one to provide that. We will just have to wait and see.

Reply 86 of 264, by darry

Rank: l33t++
kolderman wrote on 2020-09-27, 03:15:

You mean like the inevitable 3070 and 3060?

Yup.

Their performance/power envelope/price versus the competition will determine whether team green wins this one or not.

EDIT: What I meant to say is that AMD might still win this one even if they don't have the upper hand in the ultra-high-end category. At this point, the only RTX 30x0-based cards are models I would not care for even if they were half the price. 320W+ TDP is simply unacceptable to me. I strongly suspect that I am not the only one with that opinion (and the opinion that those cards are overpriced).

Reply 87 of 264, by ZellSF

Rank: l33t
darry wrote on 2020-09-27, 01:24:

I am anything but an expert, but I find it odd that a visual element the size of those rings is not visible when rendered at 2K, but suddenly becomes apparent at 4K, even when down-sampled to 2K. Could issues with the rendering engine/pipeline (i.e. lack of precision or some form of rounding) be at play here, causing an anomalous lack of detail that oversampling (rendering at 4K and down-sampling) compensates for? Nothing else in the scene really benefits from the 4K rendering, at least to my eyes.

Then you might actually need new eyes, yes, since most people cycling between those images would probably be able to tell the fence is better in the 4K downsampled shot. Of course, that's actually the same thing (transparent textures); it was just an example to show that anti-aliasing isn't the only benefit of a higher rendering resolution.

Another example of that:

[Attached image: Valkyria_2019_04_12_20_25_12_108.jpg]
[Attached image: Valkyria_2019_04_12_20_21_15_023.jpg]

darry wrote on 2020-09-27, 03:28:

Yup.

Their performance/power envelope/price versus the competition will determine whether team green wins this one or not.

EDIT: What I meant to say is that AMD might still win this one even if they don't have the upper hand in the ultra-high-end category. At this point, the only RTX 30x0-based cards are models I would not care for even if they were half the price. 320W+ TDP is simply unacceptable to me. I strongly suspect that I am not the only one with that opinion (and the opinion that those cards are overpriced).

That's an odd thing to be complaining about now, since they haven't raised the price since the 2080. The 2080 was when they increased the price by $100 over the 1080. The 1080 was also $50 over the 980.

Is it still excessive? Sure, but that's the high end of all hobbies for you.

Reply 89 of 264, by darry

Rank: l33t++
ZellSF wrote on 2020-09-26, 21:34:
OK, here's a screenshot of GZDoom running at 1920x1080: Screenshot_Doom_20200926_221118.png You might not see anything wrong, bu […]

I agree that the fence is different between the GZDoom renders. I initially attributed that to lighting differences between the two scenes, so I did not pay much attention to it. IMHO, those scenes are still quite similar, and I still have my doubts about potential rendering precision issues in this one.

Your Valkyria example shows much more of a difference and, frankly, it looks unnecessarily soft when rendered at 1080p.
If I understand correctly, this is what you mean by post-processing (and possibly rendering shortcuts) done by devs to the detriment of image detail, which can be compensated for by rendering at 4K or higher.

Having to compensate for rendering engine choices/limitations by rendering at higher resolutions is, IMHO, something we should not have to do. It is something of a brute-force approach (but the only one end-users have, as you say) to compensate for what is arguably a "bad choice" by devs.

So, according to my (possibly erroneous) interpretation, the point is not so much that ever higher resolutions (rendering and/or display) are significantly better in and of themselves, but that the precision at which things are rendered for a given output resolution actually makes rendering at a higher resolution advantageous.

If that is so, I find it counter-productive, because, IMHO, it would be more efficient and cost-effective to render in a way that is optimized for best quality at the actual display resolution, whatever it may be, rather than playing these "games" with rendering and post-processing.

EDIT: Then again, if you are the hardware maker, you can't sell 4K and 8K to people if you don't find ways to make them look better. For a game dev, being able to say their game runs fast at 1080p with all eye candy enabled on hardware X (because elements are possibly rendered with sub-optimal precision) makes them look good too. Incidentally, wasn't rendering with relatively low precision compared to display resolution something that was pioneered in the world of game consoles and possibly migrated to PCs?

As for the pricing comment, I was not specifically referring to the RTX 30x0 generation, but to upper-midrange to high-end pricing in general over the last few years.

Reply 90 of 264, by ZellSF

Rank: l33t
darry wrote on 2020-09-27, 15:00:

If that is so, I find it counter-productive, because, IMHO, it would be more efficient and cost-effective to render in a way that is optimized for best quality at the actual display resolution, whatever it may be, rather than playing these "games" with rendering and post-processing.

If every game developer made their games technological marvels, then yes, that would be more efficient. But when you factor in the cost of making that come true? Probably not.

Reply 91 of 264, by Shagittarius

Rank: Oldbie

See, here's the thing: to me, spending money on anything Mac is a horrible waste of money. I guess it's just down to preference.

I still haven't been able to confirm my 3090 purchase; the closest I've gotten so far is that one outlet put a hold on the amount from my bank account, but they haven't charged me or shipped yet. I accidentally ordered two 3090s, but they don't have weekend phone service, so I'm hoping to clear things up with them on Monday; otherwise I guess I will just have to run dual 3090s...

Reply 92 of 264, by Jo22

Rank: l33t++

I'm not into modern PC gaming anymore, I'm afraid. 😢
Graphics are too realistic to relax. I'd rather play an Atari VCS game like Sky Jinks.
Or SkiFree on a 286.

I was excited for tessellation in DirectX 11 and the GTX 750 Ti, though, since it had sane requirements for once (~350W PSU, XP; tessellation needed Windows 7).

This 3000 thing is just another meaningless card to me.
In general, I wish the video industry would settle on a common baseline configuration without compromises in quality.
Say 1440p only, but fully working (anti-aliasing, tessellation, 120 fps minimum).

Same for video codecs. Instead of going higher and higher in resolution, settle for something good and get rid of artifacts and blur. The improvements made in compression should not be used to save bandwidth, but to enhance image quality at the same bandwidth.

But these are just my two cents.
As things progress... maybe we will someday see expensive downscalers that improve quality by turning that pixel pulp into something friendly to the eye? 😉

"Time, it seems, doesn't flow. For some it's fast, for some it's slow.
In what to one race is no time at all, another race can rise and fall..." - The Minstrel

//My video channel//

Reply 93 of 264, by Bruninho

Rank: Oldbie

About the overpriced cards...

2060: USD 299 (BRL 2999)
2070: USD 399 (BRL 4199)
2080: USD 699 (BRL 6299)

3070: USD 499 (BRL 3999)
3080: USD 699 (BRL 5599)
3090: USD 1499 (BRL 11999)
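Those BRL figures are far more than a straight currency conversion. A quick Python check, taking the ~5.5 BRL/USD rate implied by the R$ 5200 ≈ US$ 940 figure quoted earlier in the thread:

    # Ratio of Brazilian retail price to US list price, per card
    # (figures from the list above).
    prices = {  # card: (USD list, BRL retail)
        "2060": (299, 2999), "2070": (399, 4199), "2080": (699, 6299),
        "3070": (499, 3999), "3080": (699, 5599), "3090": (1499, 11999),
    }
    for card, (usd, brl) in prices.items():
        print(f"{card}: {brl / usd:.1f} BRL per USD of list price")
    # Prints ratios from about 8.0 (30-series) to 10.5 (2070), all well
    # above the ~5.5 exchange rate, so a large share of the local price
    # is import tax and markup.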

I would NEVER pay such prices even if I were rich enough to. The last GPU I bought, five years ago, was BRL 349 (!!!).

For BRL 2999 you can buy a reasonably good notebook. A 1st-gen Dell G5 gaming laptop was BRL 4999 and had an i7, 16GB RAM, a 1TB SSD and a mobile Nvidia 1060 GPU. It could run rFactor 2 and F1 2018.

Now you can see why I complain “too much”? It's not a matter of “poor” or “rich”, “left”, “right” or “communist”, as commented in another thread. It's just the FACT that they are charging abusive prices. With these prices I'd rather buy the new iPhones or iPads. Heck, if I were a console fan, I could buy a PlayStation. A (censored) expensive GPU is nothing without a (censored) expensive computer to install it in. And in one or two years we'll see a new thread about 4XXX-series GPUs... *rolleyes*

"Design isn't just what it looks like and feels like. Design is how it works."
JOBS, Steve.
READ: Right to Repair sucks and is illegal!

Reply 94 of 264, by Bruninho

Rank: Oldbie
subhuman@xgtx wrote on 2020-09-27, 13:27:

With the rise of $800 cards that are only good for playing games, suddenly the idea of spending 2k on a non-upgradable MacBook Pro doesn't sound as outrageous as it once did

Agreed. And if you don't want a Mac, in the same price range you can find a powerful Dell XPS 15 with many other features.

"Design isn't just what it looks like and feels like. Design is how it works."
JOBS, Steve.
READ: Right to Repair sucks and is illegal!

Reply 95 of 264, by Shagittarius

Rank: Oldbie
Bruninho wrote on 2020-09-27, 16:36:

Agreed. And if you don't want a Mac, in the same price range you can find a powerful Dell XPS 15 with many other features.

I'm buying 2 3090s. Man I love the free market.

Reply 96 of 264, by darry

Rank: l33t++
Shagittarius wrote on 2020-09-27, 21:07:

I'm buying 2 3090s. Man I love the free market.

I hope you enjoy them and have no issues.

And with 700W of combined TDP, plus that of the CPU(s), you will not be cold this winter thanks to your high-tech space heater/PC. 😉

Seriously, what CPU(s) will you be using with those cards?

Reply 98 of 264, by Shagittarius

Rank: Oldbie
darry wrote on 2020-09-27, 21:36:
I hope you enjoy them and have no issues. […]

Just a 9900K. I've got a 1200W PSU that I'm hoping will do the trick without having to resort to dual PSUs.
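As a back-of-the-envelope check, that 1200W unit looks workable but not roomy. A rough Python sketch, where every figure is an assumption: 350W per 3090 (from the "700W combined" above), the 9900K at its 95W rated TDP even though sustained draw can run higher, and 100W for everything else:

    # Rough PSU headroom estimate; every figure here is an assumption.
    gpu_w  = 2 * 350   # two RTX 3090s at ~350 W each
    cpu_w  = 95        # i9-9900K rated TDP; real sustained draw can exceed this
    rest_w = 100       # motherboard, RAM, drives, fans (rough allowance)
    total_w = gpu_w + cpu_w + rest_w

    psu_w = 1200
    print(f"estimated load: {total_w} W of {psu_w} W ({total_w / psu_w:.0%})")
    # ~895 W, about 75% of the PSU's rating: plausible, but with limited
    # margin for transient load spikes.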

Reply 99 of 264, by Standard Def Steve

Rank: l33t++
Shagittarius wrote on 2020-09-27, 21:07:

I'm buying 2 3090s. Man I love the free market.

That kind of setup just might hit 60 fps in Flight Simulator @ 2160p! 😜

P6 chip. Triple the speed of the Pentium.
Tualatin: PIII-S @ 1628MHz | QDI Advance 12T | 2GB DDR-310 | 6800GT | X-Fi | 500GB HDD | 3DMark01: 14,059
Dothan: PM @ 2.9GHz | MSI Speedster FA4 | 2GB DDR2-580 | GTX 750Ti | X-Fi | 500GB SSD | 3DMark01: 43,190