Reply 20 of 39, by leileilol
wrote:so maybe i feel a bit offended too that a RED is eaten alive by something green 😁.
Look on the bright side - Pine green doesn't kill your eyes like the competing neon puke green.
wrote:The only problems I had with Catalyst drivers were for Windows 9x and none when it came to 2000/XP/Vista/7
The name change is really a marketing thing, note that AMD kept the brand name going for 4+ years as they built the GPU technology up to where it is now.
Their next step is the same on-die melding of CPU+GPU silicon for the mainstream computing arena, this is why they are making the name change for this technology that will be introduced later on this year 😉
Exactly -- the point of ATI's acquisition was this "Fusion" thing (previously a codename, now a marketing gimmick, the so-called "APU"). They hold a unique position in the market against Intel and NVIDIA, and it'll be interesting to see their next moves in the coming years.
wrote:so maybe i feel a bit offended too that a RED is eaten alive by something green 😁.
Look on the bright side - Pine green doesn't kill your eyes like the competing neon puke green.
🤣 but neither looks as nice as ruby red, especially here on VOGONS!
...but really, the thread was intended to be about hunting down "legendary" items...
wrote:...but really, the thread was intended to be about hunting down "legendary" items...
Then they should be in ORANGE 😜
No matter where you go, there you are...
wrote:The ATi name is a much more valuable brand than AMD.
The AMD brand doesn't have the 'worst drivers ever' reputation from the '90s days 😀
I don't think it really matters whether it's ATi or AMD; Radeon is a much better recognized name than either of the two.
Never thought this thread would be that long, but now, for something different.....
Kreshna Aryaguna Nurzaman.
The only problems I had with Catalyst drivers were for Windows 9x and none when it came to 2000/XP/Vista/7
I've owned both ATi and nVidia cards and didn't have many issues with either the hardware or driver packages.
The name change is really a marketing thing, note that AMD kept the brand name going for 4+ years as they built the GPU technology up to where it is now.
Their next step is the same on-die melding of CPU+GPU silicon for the mainstream computing arena, this is why they are making the name change for this technology that will be introduced later on this year 😉
Say what you will, but this is where most new computer hardware is sold, not to high-end gamers nor to the budget-minded crowd.
They already do this with integrated graphics, and integrated graphics are barely usable. Unless they're going to integrate a GPU with some real power, I don't see this melding of CPU and GPU going very far.
wrote:Unless they're going to integrate a GPU with some real power,
They are. At least they said they will be doing this starting with "Bobcat".
Actually, I too "didn't care" when AMD bought up ATI. But now the name ATI itself is being dissolved into it.
ATI stood (and still stands) for advanced high-end graphics cards, and absorbing the name into AMD is unnecessary. AMD itself stands for catch-up and, to put it crudely, the second-class CPU category (sorry, AMD lovers, I've got nothing against AMD).
I'm not an avid fan of ATI, and apart from the occasional ATI purchase, I usually have nvidia stuff in my system. (My current ASUS notebook has ATI in it, btw.) But I still feel bad about one of the greatest names being no more.
Believe it or not, a name has a lot of things behind it.
Intel is the current market leader, but no serious gamer will even turn his head towards Intel integrated graphics, or even the standalone Intel graphics cards.
By removing the name, AMD is no longer burdened with delivering the high performance expected from the ATI brand.
AMD has been playing catch-up with Intel. Yes, it's still alive, and that's about it.
Again, I'm not against AMD. I'm just against the corporate mentality.
More often than not, a big company buys something not to enhance it but to remove the competition. And once the competition is gone, they are no longer bound to work hard at improving and bringing out the best. Aureal died at the hands of Creative.
There is nothing wrong with leaving the name as it is. I don't know where the "AMD is more widely known" poll came from.
It's just like removing the name Mercedes-Benz and putting the name "General Motors" on each car. Maybe a General Motors SL320? Hehehe...
It's just a matter of time, till the AMD graphics matches the performance of Intel Graphics! 😁
From what I read on Wiki about Fusion, I'm not liking it. It looks like it's going to be another 3dfx Rampage, where they keep adding features and pushing back release dates until the product is no longer viable. One of the older specs called for an RV710-class GPU to be integrated, which isn't very powerful at all. They now want to use a DX11-class GPU, but how powerful will it be if they only wanted to go with the RV710 originally? Integrating an HD5450 isn't going to result in many sales.
Heat and power consumption are going to severely limit what they can integrate with the CPU. My HD4850 requires one 6-pin auxiliary connector, and anything more powerful requires two. How are you seriously going to draw enough power through the CPU socket to feed both a high-end CPU and GPU at the same time? You'd have to draw 600-700 watts through the CPU pins alone, which isn't going to happen, I think.
Intel makes dedicated graphics cards? Never knew that 🤣!
Anyway, over the years I've been more pleased with ATI (sorry, AMD) than with NVIDIA
wrote:It's just a matter of time, till the AMD graphics matches the performance of Intel Graphics! 😁
what do you mean 😕
This is the heyday of GPUs as opposed to traditional CPUs. Current GPUs have a tremendous advantage over current CPUs for parallel computing. They will surely keep pushing GPU technology forward. Intel, on the other hand, doesn't have what NVIDIA or AMD/ATI has.
wrote:From what I read on Wiki about Fusion I'm not liking it. It looks like it's going to be another 3dfx Rampage where they keep adding features and pushing back release dates until the product is no longer viable. […]
You forgot about the much more mainstream mobile market, such as netbooks and especially tablets.
The most advanced performance will obviously come in a separate package. Don't expect 7x GTX 480 performance (which you can have in a single 3U rack case today, btw) in a single tiny chip just yet...
New AMD Radeon HD 6000 cards out next month!
A complete refresh is coming from top to bottom!
Btw, are there still a lot of boxed 9700 Pros left in Canada, AC? Or in your backyard maybe ... ?
I doubt it. Canadians are cheap. We couldn't afford the 9700 Pros to begin with, and a lot of people here buy OEM parts to save money. I think most of my friends had Radeon 9000s back then.
"Will the highways on the internets become more few?" -Gee Dubya
V'Ger XT|Upgraded AT|Ultimate 386|Super VL/EISA 486|SMP VL/EISA Pentium
wrote:New AMD Radeon HD 6000 cards out next month!
A complete refresh is coming from top to bottom!
From what I heard the changes from HD5000 will likely be minimal. A little less power consumption and a little lower temperatures but performance gains won't be a huge step up. It'll be kind of like the update from HD2000 to HD3000.
wrote:Intel makes dedicated graphics cards? Never knew that 🤣!
Anyway, over the years I've been more pleased with ATI (sorry, AMD) than with NVIDIA
*cough* i740 *cough*
wrote:Intel makes dedicated graphics cards? Never knew that 🤣!
Anyway, over the years I've been more pleased with ATI (sorry, AMD) than with NVIDIA
*cough* i740 *cough*
Yah ok ok! But the i740 was from the last millennium 😉
I thought he meant Intel was making them now 😀
Larrabee was supposed to come in both integrated and non-integrated forms, but those plans were scrapped
I wonder if this decision will turn out as poor as their driver support team.
wrote:From what I heard the changes from HD5000 will likely be minimal. A little less power consumption and a little lower temperatures but performance gains won't be a huge step up. It'll be kind of like the update from HD2000 to HD3000.
Rumours say it will be targeted against Nvidia's 460 cards...
The 6770 could very well be the next Ti4200 / 6600GT / 8800GT / 4850 value card for the masses!
wrote:From what I heard the changes from HD5000 will likely be minimal. A little less power consumption and a little lower temperatures but performance gains won't be a huge step up. It'll be kind of like the update from HD2000 to HD3000.
Rumours say it will be targeted against Nvidia's 460 cards...
The 6770 could very well be the next Ti4200 / 6600GT / 8800GT / 4850 value card for the masses!
I don't know about that, unless the price comes down substantially. Even a 10% performance improvement over the HD5770 wouldn't put it in the HD5800 class, which is what you really need if you're serious about playing games with all the bells and whistles turned on. Maybe if they turned on a few more shader and rendering units over what the HD5770 has, but a simple speed bump isn't going to launch it into the stratosphere.
I'll just wait and see what the first batch of reviews says about them.
At least it'll mean prices of the older cards will come down 😀