VOGONS


Advice on purchasing a gaming videocard


Reply 40 of 83, by sgt76

User metadata
Rank Oldbie

OIC.... well if the drivers bother you.... I'd pick up a GTX670 or a nice, used GTX570 or 580. Preferably Asus or MSI.

Reply 41 of 83, by PowerPie5000

User metadata
Rank Oldbie

Why are people complaining about AMD's legacy status? Are you really going to keep the same GPU for more than 5 years if you want to play the latest games? They still release driver updates for cards as old as the HD2xxx series, and they've been around for 7 years! They have to eventually put some of the older cards to sleep so they can concentrate more on driver improvements for the more recent cards.

Nvidia also puts their cards into legacy status eventually, so it's not just an AMD thing. I plan on keeping my 7950 for the next year or two.

Reply 42 of 83, by eL_PuSHeR

User metadata
Rank l33t++

@Powerpie5000

My Radeon HD3870 runs flawlessly but it's in legacy status. No driver updates since June 2012.

Reply 43 of 83, by sunaiac

User metadata
Rank Oldbie

What would it need an update for?

R9 3900X/X470 Taichi/32GB 3600CL15/5700XT AE/Marantz PM7005
i7 980X/R9 290X/X-Fi titanium | FX-57/X1950XTX/Audigy 2ZS
Athlon 1000T Slot A/GeForce 3/AWE64G | K5 PR 200/ET6000/AWE32
Ppro 200 1M/Voodoo 3 2000/AWE 32 | iDX4 100/S3 864 VLB/SB16

Reply 44 of 83, by eL_PuSHeR

User metadata
Rank l33t++

Mainly because drivers are not bug-free. In fact they have some regressions.

For instance GPU Caps Viewer says my Radeon card supports OpenGL 3.3 but most OpenGL 3.x tests fail with an unknown error. I tested it on a GF 9 that I have at work and every OpenGL test worked properly.

EDIT: Funny. Those tests are working now without changing anything. I love AMD... 😁

Honestly. I don't wanna sound lame. AMD cards are great and powerful but I want an nVidia just for the sake of change.

Reply 45 of 83, by BigBodZod

User metadata
Rank Oldbie
eL_PuSHeR wrote:

Mainly because drivers are not bug-free. In fact they have some regressions.

For instance GPU Caps Viewer says my Radeon card supports OpenGL 3.3 but most OpenGL 3.x tests fail with an unknown error. I tested it on a GF 9 that I have at work and every OpenGL test worked properly.

EDIT: Funny. Those tests are working now without changing anything. I love AMD... 😁

Honestly. I don't wanna sound lame. AMD cards are great and powerful but I want an nVidia just for the sake of change.

Nothing wrong with change nor testing out a competing GPU.

Interesting that nVidia finally allowed AMD to license the SLI tech; I wonder if the same limitations still apply to their PhysX tech too?

No matter where you go, there you are...

Reply 46 of 83, by sunaiac

User metadata
Rank Oldbie
eL_PuSHeR wrote:

Honestly. I don't wanna sound lame. AMD cards are great and powerful but I want an nVidia just for the sake of change.

Well I do understand perfectly, I'm changing every generation 😁
Thing is, this is the worst time to go nVidia, whereas last gen they were very good with the GTX 5xx.

BigBod: what SLI licensing are you talking about?

R9 3900X/X470 Taichi/32GB 3600CL15/5700XT AE/Marantz PM7005
i7 980X/R9 290X/X-Fi titanium | FX-57/X1950XTX/Audigy 2ZS
Athlon 1000T Slot A/GeForce 3/AWE64G | K5 PR 200/ET6000/AWE32
Ppro 200 1M/Voodoo 3 2000/AWE 32 | iDX4 100/S3 864 VLB/SB16

Reply 47 of 83, by PowerPie5000

User metadata
Rank Oldbie

I usually just go with whatever has the best performance for the price without it being too expensive. The Radeon 7950 fit the bill this time for my main gaming PC, and then I got the GTX 660 Ti for our living room PC (just so I could check out the competition 🤣).

Reply 48 of 83, by eL_PuSHeR

User metadata
Rank l33t++

I agree that it's a nice idea to have the best of both worlds for comparison's sake.

Reply 49 of 83, by BigBodZod

User metadata
Rank Oldbie
sunaiac wrote:
eL_PuSHeR wrote:

Honestly. I don't wanna sound lame. AMD cards are great and powerful but I want an nVidia just for the sake of change.

Well I do understand perfectly, I'm changing every generation 😁
Thing is, this is the worst time to go nVidia, whereas last gen they were very good with the GTX 5xx.

BigBod : what SLI licencing are you talking about ?

nVidia now allows AMD-based motherboards to support both Crossfire and SLI 😉

For instance, my Gigabyte GA-990FXZ-UD5 mobo supports both Crossfire and SLI, where previous chipsets would only support Crossfire.

There was a time when you could only get some Intel-based boards that supported both Crossfire and SLI.

No matter where you go, there you are...

Reply 50 of 83, by mr_bigmouth_502

User metadata
Rank Oldbie

I myself am somewhat of an ATI/AMD fanboy. The reason is that, in my experience, I've seen many Nvidia cards burn out and fail, while I've only ever seen fewer than five failed ATI/AMD cards. AMD also tends to give their older cards better driver support.

I'm not saying all Nvidia cards are bad; some of their older models (before the GeForce FX series) are actually quite reliable, and their cards deliver decent performance when they're actually operational. But like I said above, I've seen too many of them burn out to really trust them.

Reply 51 of 83, by shamino

User metadata
Rank l33t

Unfortunately, ATI doesn't sell their own cards anymore so the build quality advantage (which I think they used to hold) is gone.
It seems many people had problems with ATI 10 years ago but like them today. My experience has been the other way around.
I was using a 2nd hand ATI Radeon 9800 Pro a few years ago in my last AGP system. I played Morrowind on it nearly daily. That's one of the best video cards I've ever owned. It was completely flawless for me. I was so happy with it that when I put together my modern "express" based PC a couple years ago, I decided to try another ATI card.
I didn't need anything fancy (yet), so I got a cheap HD4350. It was still faster than the 9800 Pro and that's all the speed I needed.
That card had problems with random flickering textures and text. I tried multiple driver versions, but that didn't fix it. It was one of those stupid factory overclocked cards (hard to avoid sometimes), so I tried using an overclocker utility to un-overclock it. That didn't fix it either and it screwed up dual monitor support.
This may have been an ATI problem or a Gigabyte problem (they made the card, and from repeated experiences with that brand I don't think they ever test anything). I'd have been happier if I could actually buy a card made by ATI.
I read of many people having the same flickering problems on much more expensive ATI based cards, so it started to sound like an ATI chip or driver problem. When it came time to buy my "real" video card, I changed my plans and went back to nVidia.

In terms of performance, the HD4350 exceeded my expectations.

My opinion of nVidia as an overall chipmaker has declined in the last 10 years, yet I've never had a bad experience with their GPUs. Some run too hot, but tweaking the fan in RivaTuner fixes that.
Many higher end cards are allowed by the manufacturer to run too hot, and they have no overheat protections whatsoever. I think it's intentional (planned obsolescence), and that's something I resent about the video card industry in general. I don't get the feeling that I'm buying a quality item with any video card, it's all built to be thrown away.

I wish there was a brand of gaming-level video cards that could be relied upon for its long-term build quality. As I appreciate things that last, I would pay more for a brand that I knew was well built and would be virtually certain to still work 5-10 years from now. Unfortunately the whole gaming side of the industry treats video cards like throwaway items, and builds them accordingly.

Reply 52 of 83, by d1stortion

User metadata
Rank Oldbie
shamino wrote:

Unfortunately the whole gaming side of the industry treats video cards like throwaway items, and builds them accordingly.

Who's to blame them if consumers are ready to buy a new card every year or two?

Reply 53 of 83, by PowerPie5000

User metadata
Rank Oldbie
d1stortion wrote:
shamino wrote:

Unfortunately the whole gaming side of the industry treats video cards like throwaway items, and builds them accordingly.

Who's to blame them if consumers are ready to buy a new card every year or two?

I usually replace them every couple of years... The GPU I had for the least amount of time was the 1GB Radeon 6870 I had before getting my 3GB Radeon 7950 (I had the 6870 for about a year) 🤣.

Reply 54 of 83, by bucket

User metadata
Rank Member
eL_PuSHeR wrote:

Also, I didn't like at all AMD putting a lot of hardware into "legacy status", including mine, which is still working quite well. I think you can even install W7 on a GF6 card.

I think I am going to purchase a Gigabyte card (my board is also Gigabyte). There are other brands I don't know (Zotac...)

You can install Win7 on a NeoMagic 4MB card if you use the Classic theme. Aero requires DX9, but anyway I think the point is moot. Both AMD and nVidia make great cards these days. I don't know of any serious common problems with either chipset. Just go with the best value you can find.

I recommend EVGA or BFG or even XFX as video card brands. Of course, you can't go wrong with AMD or nVidia themselves...

Reply 55 of 83, by shamino

User metadata
Rank l33t
PowerPie5000 wrote:
d1stortion wrote:
shamino wrote:

Unfortunately the whole gaming side of the industry treats video cards like throwaway items, and builds them accordingly.

Who's to blame them if consumers are ready to buy a new card every year or two?

I usually replace them every couple of years... The GPU I had for the least amount of time was the 1GB Radeon 6870 I had before getting my 3GB Radeon 7950 (I had the 6870 for about a year) 🤣.

I don't buy new video cards very often, but even when I do, I still want the old one to work. I'm grateful that the old ATI 9800 Pro is still working in my nForce2 system.
Gaming aside, I need my computer to be reliable. It's not solely a game machine; it has important functions as well, and I don't like hardware that's just going to break at random. Unfortunately, that's the build quality standard you're stuck with on midgrade and higher gaming cards.

That was actually one of the reasons I decided to buy an HD4350 first, before upgrading later. I figured I'd need an emergency backup when the higher tier card craps out on me. I outmaneuvered myself though - I picked the 4350 because I expected to be buying a higher end ATI later, and so they'd be easy to swap. Instead I bought an nVidia, so if disaster strikes the swap won't be so quick and easy anymore.

Reply 56 of 83, by sliderider

User metadata
Rank l33t++

Well, according to the nVidia website, there are only going to be a few more driver revisions before GeForce 6 is relegated to legacy status, which means GeForce 7 can't be far behind, seeing as it is pretty much an extension of the GeForce 6 architecture with some refinements. The real issue for nVidia users is going to come when GeForce 8 goes legacy, because the GeForce 9 series is based on GeForce 8, and so is the bottom end of the 200 series. That's going to be a whole lot of nVidia owners left out in the cold with no driver updates to look forward to.

Video cards usually go legacy at points where major architectural changes are made. For AMD, R420 was a major step up from R300, so they drew a line there, with Win 9x support ending after R300. Moving from X1x00 to HD2K was also another major architectural change, so X1x00 loses support after Vista. HD2K, 3K, and 4K will be the next to lose support, since they are currently clumped together under the same support scheme. They are already nearing the end of the road, since AMD does not consider them capable of further optimization, though issues with specific games are still being worked on as they arise. That support will likely end inside of a year, and they will go to full legacy status. At this point, if you are an AMD fan and haven't upgraded to at least an HD5K card, then you really need to consider it.

Reply 57 of 83, by eL_PuSHeR

User metadata
Rank l33t++

I have received the card, but I am in doubt. The card has two separate connectors for external power. Do I need to plug in both? If so, my PSU is lacking one of them, and I am thinking of purchasing a more powerful PSU this afternoon to be on the safe side. The manual isn't clear. I have also checked the installation guide on the CD and that part isn't even covered. See attached image.

Reply 58 of 83, by d1stortion

User metadata
Rank Oldbie

Yes, you do. They aren't there for no reason. If there is a 2x 4-pin Molex -> 6-pin PCIe adapter in the box (which there should be), you may try it that way at your own risk.

Reply 59 of 83, by eL_PuSHeR

User metadata
Rank l33t++

I have ordered a new PSU. It has 750 Watts and provides up to 35A on the +12V line. Too bad it won't arrive until next Tuesday. The power connectors on the card have eight and six pins respectively, but the diagram sucks (it seems to show two 4-pin Molex connectors being plugged in). I think I will connect two 6-pin ones.
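
As a rough sanity check (a sketch that assumes the standard PCI Express power limits of 75 W from the slot, 75 W per 6-pin plug and 150 W per 8-pin plug, rather than figures from this specific card's manual), the +12V budget works out roughly like this:

```python
# Rough +12V power-budget check for the new PSU.
# Assumed per-connector limits come from the PCIe spec, not from the card's manual:
# slot = 75 W, 6-pin plug = 75 W, 8-pin plug = 150 W.

rail_amps = 35                       # PSU's +12V rating in amps
rail_watts = rail_amps * 12          # 35 A * 12 V = 420 W available on +12V

card_worst_case = 75 + 75 + 150      # slot + 6-pin + 8-pin = 300 W maximum draw

headroom = rail_watts - card_worst_case   # ~120 W left for CPU, fans, drives, etc.
print(f"+12V capacity: {rail_watts} W, card worst case: {card_worst_case} W, "
      f"headroom: {headroom} W")
```

So a 35A +12V rail should cover the card even at its full connector budget, with roughly 120 W of headroom for the rest of the system, assuming a single rail.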

I think the ones on the PSU are labeled PCI-E. My current card only has one 4-pin Molex.

As always, any advice would be greatly appreciated. It's like I am shooting in the dark here. However, I think the connector sockets are keyed (rounded/square pins) to avoid mistakes when connecting.