VOGONS


HD 2900XT


Reply 20 of 40, by Scraphoarder

Rank: Member
RacoonRider wrote:

Time is a funny thing. You guys here are discussing DirectX12, while my best modern videocard only supports DirectX10 😀

Hehe 😊 The last game I played on my PC was Half-Life 2 after its release, and I didn't finish it. My rig was already outdated at that time, so I guess I have some knowledge to catch up on since then. I only have some cheap HD5xxx today in my media center that I never used for gaming. It runs Win8.1, so it's probably DX11?

Reply 21 of 40, by GeorgeMan

Rank: Oldbie

I'm happy I bought HIS 9600Pro when ATI had 40%, Club3D X1800XT when ATI had 46,5%, Sapphire HD3870 when AMD/ATI had 37% and finally Club3D HD7870XT (Tahiti based) when AMD had ~39%.

I'm also happy I bought ASUS 6600GT when Nvidia had 41,5%. 😊

Retro1: Athlon XP 3200+ @Arctic cooler | ASUS A7V600 | Radeon 9800XXL 128MB | SB Audigy 2 ZS | 160GB IDE HDD | Win98SE & XP
Retro2: under construction with a PIII 933 or a Tualatin Celeron 1200 and a GF2 GTS 32MB

Reply 22 of 40, by Scali

Rank: l33t
NJRoadfan wrote:

At most the non-AIW 9000/X000 cards came with video output.

Then perhaps the X1x00 were the first to have the T200 chip as standard.

NJRoadfan wrote:

You had to buy an All-in-Wonder to get full VIVO in that era. The Theater 200 is an oldie but a goodie; it's one of the best analog video capture chips out there. That's why I'm curious if ATI changed how the T200 chip interfaced with the PCI bus with the HD2900. ATI/AMD stated that the older AIW cards will never get video capture capability in Vista and above, because the WDDM driver architecture was incompatible with how they set up/connected the T200 chip on the older cards.

Well, I don't have official drivers for Vista and up either. I did manage to hack up something that installs on my X1800/X1900 cards, and they do capture *something*, but the resolution is wrong and deinterlacing doesn't work properly, IIRC.
I think it's just a software issue, really. If it works in XP, there's no reason why it can't work the same in any other OS.

http://scalibq.wordpress.com/just-keeping-it- … ro-programming/

Reply 23 of 40, by Scali

Rank: l33t
SPBHM wrote:

interesting graph I thought I should add

http://i.imgur.com/bNqJYgA.png

I think that illustrates what I was saying on ATi's side: their market share was in free fall by the time the AMD merger happened. At the introduction of the 2900 it seems to have been at an all-time low, and they didn't really start to recover until much later.
I wonder if they would have gone bankrupt before the 2900 was even launched, had it not been for AMD. Let alone having enough money to develop the 3x00 and 4x00 series, until sales finally started to pick up somewhat again (and, more importantly, with healthy profit margins: the 2x00 was very expensive to build, with its huge die and wide memory bus, while the 3x00 and 4x00 focused on smaller dies and lower production costs).

They are currently in even more of a free fall than back in the ATi/AMD merger era.
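That cost argument is essentially geometry plus yield: a big die both fits fewer times on a wafer and is more likely to catch a defect. A toy Python sketch of the standard first-order model (the defect density here is an assumed illustrative number, not foundry data; the die areas are rough public figures, ~420 mm² for R600 vs ~192 mm² for RV670):

```python
import math

def dies_per_wafer(die_area_mm2, wafer_diameter_mm=300):
    """First-order approximation of gross dies per wafer (ignores scribe lines)."""
    radius = wafer_diameter_mm / 2
    return int(math.pi * radius ** 2 / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

def yielded_dies(die_area_mm2, defect_density_per_cm2=0.2):
    """Poisson yield model: the same defect density hurts big dies much more."""
    yield_rate = math.exp(-defect_density_per_cm2 * die_area_mm2 / 100)
    return dies_per_wafer(die_area_mm2) * yield_rate

r600 = yielded_dies(420)    # R600-class die, ~420 mm^2
rv670 = yielded_dies(192)   # RV670-class die, ~192 mm^2
print(f"R600-class: {r600:.0f} good dies/wafer, RV670-class: {rv670:.0f}")
# The smaller die yields several times more good dies per wafer, before even
# counting the cheaper board needed for a 256-bit vs 512-bit memory bus.
```

With any plausible defect density the small die wins on both counts, which is the whole 3x00/4x00 strategy in one formula.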

Last edited by Scali on 2015-07-13, 09:49. Edited 2 times in total.

http://scalibq.wordpress.com/just-keeping-it- … ro-programming/

Reply 24 of 40, by Scali

Rank: l33t
swaaye wrote:

What strikes me about this is how the seemingly popular 4800 series didn't put much of a dent in NV's marketshare. They even priced those cards in an obvious attempt to grab marketshare.

I have found over the years that AMD has about as much of a 'reality distortion field' as Apple does.
What AMD says, and what AMD fans on the internet say, does not necessarily correlate with reality in any way.
If you were to read forum threads of that era, then 4x00 was the best thing since sliced bread. But it's mostly a few vocal AMD fans, and not regular customers actually going out and buying the cards.

The same can be said about Mantle...
The myth that AMD developed Mantle to 'save PC gaming' and push MS to develop DX12 can be found everywhere... except if you pay closer attention, the information always comes either directly from AMD, or from companies in the Gaming Evolved/Mantle program, such as Dice or Oxide Games.

And as you know, Microsoft always releases a major new version of DX with a new OS; in this case, that is Windows 10.
So AMD's claim effectively amounts to "We pushed MS to release Windows 10 sooner".
Now, is that likely? Or is it more likely that MS was already working on DX12 anyway, scheduled to be released with Windows 10 as usual, and that AMD launched a pre-emptive strike: doing their own 'DX12-lite' based on what was in development at the time, and releasing it as Mantle, because they knew Windows 10 was still a ways off? Hoping to create a bit of vendor lock-in by also implying there was a link between Mantle and the consoles? Perhaps knowing that Intel and nVidia had added extra rendering features to DX12 which AMD couldn't implement in their own GPUs before DX12/Windows 10 shipped?
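For what it's worth, the one technical idea Mantle and DX12 clearly share is the 'thin API' model: per-draw driver work moves out of the frame loop into command buffers that are recorded once and cheaply resubmitted. A toy Python sketch of that idea (nothing here is real Mantle or D3D12 API code; all names and costs are made up):

```python
def validate(state):
    # Stand-in for the per-draw validation/translation work a DX11-era
    # driver performs inside every draw call.
    return sum(hash((key, value)) & 0xFF for key, value in sorted(state.items()))

def draw_immediate(draws):
    # Immediate-context style: validation cost is paid on every draw, every frame.
    return sum(validate(state) for state in draws)

def record_command_buffer(draws):
    # Mantle/DX12 style: validation is paid once, when the buffer is recorded.
    return [validate(state) for state in draws]

def replay(command_buffer):
    # Replaying a recorded buffer is just submission; no per-draw validation.
    return sum(command_buffer)

draws = [{"shader": i % 4, "texture": i % 8} for i in range(10000)]
command_buffer = record_command_buffer(draws)           # recorded once
assert replay(command_buffer) == draw_immediate(draws)  # same result per frame
```

The rendered result is identical; what changes is where the CPU cost lands, which is exactly the 'draw-call overhead' argument both camps were making.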

http://scalibq.wordpress.com/just-keeping-it- … ro-programming/

Reply 25 of 40, by Skyscraper

Rank: l33t
Scali wrote:

I have found over the years that AMD has about as much of a 'reality distortion field' as Apple does. What AMD says, and what AMD fans on the internet say, does not necessarily correlate with reality in any way. […]

The HD4870 might not have been as great a success as AMD would like to claim, but it was a great card, especially for tweakers. I really liked the Volterra digital VRMs that let you change the voltage on the fly from the command line. I have lots of GTX 2xx cards as well; the first generation runs too hot and uses too much power, but the second generation is good.

New PC: i9 12900K @5GHz all cores @1.2v. MSI PRO Z690-A. 32GB DDR4 3600 CL14. 3070Ti.
Old PC: Dual Xeon X5690@4.6GHz, EVGA SR-2, 48GB DDR3R@2000MHz, Intel X25-M. GTX 980ti.
Older PC: K6-3+ 400@600MHz, PC-Chips M577, 256MB SDRAM, AWE64, Voodoo Banshee.

Reply 26 of 40, by Scali

Rank: l33t
Skyscraper wrote:

The HD4870 might not have been as great a success as AMD would like to claim but it was a great card, especially for tweakers.

Sure, I never claimed otherwise.
Also, they supported DX10.1 with those cards, while nVidia was still stuck at DX10.0.
It just didn't hurt nVidia sales apparently.

http://scalibq.wordpress.com/just-keeping-it- … ro-programming/

Reply 27 of 40, by PhilsComputerLab

Rank: l33t++

The Radeon 9800 and HD4850 were the only ATI cards I've owned, I believe. At the time, both offered tremendous value. The rest of the time I was happy staying with NV.

YouTube, Facebook, Website

Reply 28 of 40, by Scali

Rank: l33t
philscomputerlab wrote:

The Radeon 9800 and HD4850 were my only ATI cards I believe.

I've had quite a few...
Radeon 8500
Radeon 9600Pro 128MB
Radeon 9600XT 256MB
Radeon X1800XT
Radeon X1900XTX
Radeon 5770

http://scalibq.wordpress.com/just-keeping-it- … ro-programming/

Reply 30 of 40, by RacoonRider

Rank: Oldbie
Scali wrote:

I've had quite a few... Radeon 8500, Radeon 9600Pro 128MB, Radeon 9600XT 256MB, Radeon X1800XT, Radeon X1900XTX, Radeon 5770

Same here:

Radeon 8500
Radeon 9600XT
Radeon x1800GTO
Radeon 5670
Radeon 4870X2 (second-hand)

Plus the cards I got as retro gear: 9800 Pros, a 9600Pro, a 9550, a 7000 and several old boring Rages. Still looking for an 8500, btw.

I always picked ATi for GPUs and Intel for CPUs. I can't call myself a fanboy of either camp; it just so happened that I never had a new Nvidia GPU.

Reply 31 of 40, by Scali

Rank: l33t
RacoonRider wrote:

I always picked ATi for GPU and intel for CPU. I can't call myself a fanboy of any camp, it just so happened that I never had a new Nvidia GPU.

For me, the 'gaps' in that list are filled with nVidia cards.
Prior to the Radeon 8500, I had a GeForce2 GTS, which was my first nVidia card. Before that, I had a few Matrox, Cirrus Logic and Paradise cards.
I also have an Apocalypse 3Dx, which was my first 3D accelerator. I tried the Kyro II because I quite like the PowerVR approach, but driver issues made it very impractical, so I returned it and got the GF2 GTS instead.

http://scalibq.wordpress.com/just-keeping-it- … ro-programming/

Reply 32 of 40, by ODwilly

Rank: l33t
Skyscraper wrote:

The HD4870 might not have been as great of an success as AMD would like to claim but it was a great card, especially for tweakers. […]

In my experience with the HD4xxx series, even the DDR2 memory-starved PCIe 4650 does well on the modern internet and can handle 1080p like a champ. I have a single-core AGP HIS HD4670 system in the works (80% done) that will be benchmarked on STALKER and Far Cry 2 at the least soon 😀

Main pc: Asus ROG 17. R9 5900HX, RTX 3070m, 16gb ddr4 3200, 1tb NVME.
Retro PC: Soyo P4S Dragon, 3gb ddr 266, 120gb Maxtor, Geforce Fx 5950 Ultra, SB Live! 5.1

Reply 33 of 40, by candle_86

Rank: l33t

Well, a big part of why the HD48xx didn't sell like AMD hoped is rather simple: the HD4850 was in the same segment as the 8800GTX/8800GTS 512/9800GTX, and it showed. A lot of people had just recently spent 200-250 on a 9800GTX, or had an 8800GT, and the HD4850 wouldn't have been a large upgrade. And while the HD4870 was faster than the GTX 260, it wasn't marketed as well as the 4850, and nVidia quickly price-matched it with a revised GT200 core, the GTX 260 216, which was marginally faster for the same price.

But really, their biggest problem was a failure to market these GPUs: your average Joe knows what Nvidia graphics are, but a lot of Joes still have no idea who AMD is.

Reply 34 of 40, by Skyscraper

Rank: l33t
candle_86 wrote:

well a big part of why the HD48xx didn't sell like AMD hoped was rather simple, the HD4850 was in the same segment as the 8800GTX/8800GTS 512/9800GTX and it showed […]

I think that's perhaps true for the US, but here in Europe the HD 4850 and HD 4870 sold rather well, and AMD/ATI's standing with Joe the Plumber was as good as Nvidia's (back then). Even the HD 38x0 sold well in Europe, because you bought up most of the GeForce 8800 GT/GTS 512 cards, so we didn't get any. Later, when we did get them, the GeForce 8800 GTS 512 was the big seller.

When it comes to the HD 4870, the most popular version here seems to have been the 1GB version with a third-party cooler and a factory overclock; I see lots of those on the local auction site.

New PC: i9 12900K @5GHz all cores @1.2v. MSI PRO Z690-A. 32GB DDR4 3600 CL14. 3070Ti.
Old PC: Dual Xeon X5690@4.6GHz, EVGA SR-2, 48GB DDR3R@2000MHz, Intel X25-M. GTX 980ti.
Older PC: K6-3+ 400@600MHz, PC-Chips M577, 256MB SDRAM, AWE64, Voodoo Banshee.

Reply 35 of 40, by Scali

Rank: l33t

Our projection machine at work has a Crossfire 4850 setup. Sadly, AMD dropped support for it quite quickly, and there are no official drivers for Windows 8. We could sort of use the Windows 7 drivers, but they had some visual bugs in some cases.
I had the same with the X1800/X1900 earlier, which never got official Windows 7 drivers, so you had to use unofficial Vista drivers in Win7.

http://scalibq.wordpress.com/just-keeping-it- … ro-programming/

Reply 36 of 40, by sgt76

Rank: Oldbie

My longest-lasting main rig setup happens to be my current 7970s in CF. I bought them in mid-2012 to play The Witcher 2 at ultra settings, and they're still doing duty with The Witcher 3 on a mix of high/ultra. Of course, the current beta drivers don't allow CF without incurring artifacts, but this should improve over time.

On balance, I've probably owned more Radeons than nVidias:
Sapphire 7970 dual-x (crossfire)
Msi 5870 twin frozr
6850 vapor-x (crossfire)
4650 (agp)
3870 (trifire setup)
3850 (agp)
X800xt (agp)
X1950 pro
9800 pro
8500
7500
rage 128 pro (ok, not a radeon)

But I don't restrict myself to ATi/AMD and have had great experiences with nVidia as well. Some of the memorable cards I've owned:
gtx460 (still in my 2ndary phenom 955 rig)
Gts250
8800gt
6800gs
4600ti
geforce 3 ti 200
tnt2

I'm gonna stretch my 7970s as far as I can and then see what to buy. Doesn't matter, green or red.

Reply 37 of 40, by swaaye

Rank: l33t++
Scali wrote:

Our projection machine at work has a Crossfire 4850 setup. Sadly AMD dropped support for it quite quickly, and there are no official drivers for Windows 8. […]

There is a driver on Windows Update that seems to work ok for the 3000-4000 cards. It even works with the DRMed Windows 8 Netflix app. Don't know about 2000 or any X1xxx support.

Long term driver support has been quite disappointing from AMD.

Reply 38 of 40, by RacoonRider

Rank: Oldbie
swaaye wrote:

Long term driver support has been quite disappointing from AMD.

Hehe, my laptop has an HD5450, and I did not untick auto-update after we bought it. Guess what? A year ago, after another silent update I didn't manage to keep track of (not a gaming machine anyway), Catalyst started telling me that my card was too old and no longer supported by AMD. Why the heck did you update the driver then? 😁

Reply 39 of 40, by meljor

Rank: Oldbie

I've had both ATi/AMD and nVidia cards, and I simply buy what's best for my budget. I found that nVidia was a bit limited every time a new generation of games came out, and that way, IMO, the ATi cards "lasted longer".
For example: the 7900GTX was about equal to the X1900XTX when compared at launch. Much later, the X1900XTX was still going strong in a lot of games where the 7900GTX had become too slow for my needs.

Everything changed when the 8800 series came out. It is amazing what those cards can do (even today, but at low settings). Same thing with the 5870, a very strong card.

Today I am running an AMD R290; tomorrow, I don't know... cards have been pretty even the last few years, and it all depends on the games you play. I go for the most fps within my budget (as always).

The last 2 cards I bought used, as I don't play games as much as I did and I am happy with 1080p. I got a used GTX660 for cheap, and since a lot of miners gave up, I got the R290 for less than half price (it was 2 months old when I bought it).

asus tx97-e, 233mmx, voodoo1, s3 virge ,sb16
asus p5a, k6-3+ @ 550mhz, voodoo2 12mb sli, gf2 gts, awe32
asus p3b-f, p3-700, voodoo3 3500TV agp, awe64
asus tusl2-c, p3-S 1,4ghz, voodoo5 5500, live!
asus a7n8x DL, barton cpu, 6800ultra, Voodoo3 pci, audigy1