VOGONS



Comeback for AMD? [Polaris]


Reply 40 of 170, by PhilsComputerLab

User metadata
Rank l33t++

Nice!

Eager to see the first reviews of the 1060 😀

@ Deep Thought

You make a good point regarding noise and temperatures. Like with the RX 480, it seems a proper dual fan cooler will improve both.

YouTube, Facebook, Website

Reply 41 of 170, by Tiger433

User metadata
Rank Member

Not a comeback, rather the beginning of the end for AMD. They released the RX 480 overpriced in Poland, and the card draws too much power from the PCIe slot; even their FX processors were burning motherboards' VRMs. I don't believe in AMD or their Zen processor. They'll probably do with it what they did with the RX 480, and then some company will buy AMD.

W7 "retro" PC: ASUS P8H77-V, Intel i3 3240, 8 GB DDR3 1333, HD6850, 2 x 500 GB HDD
Retro 98SE PC: MSI MS-6511, AMD Athlon XP 2000+, 512 MB RAM, ATI Rage 128, 80GB HDD
My Youtube channel

Reply 42 of 170, by F2bnp

User metadata
Rank l33t
Scali wrote:

I'm not insulting you. You obviously just copied that rhetoric from elsewhere. I've seen it surface numerous times. It's all part of the AMD propaganda machine.

Scali wrote:

Anyway, the problem you're now facing is that you've mentioned a number of common points often brought forward by AMD fanboys. So I'm not sure if that's just coincidence, or if you're part of that group of AMD people that spread this nonsense across the web, like the guy that posted pretty much the exact same things on my blog some weeks ago.

I gotta start with this, because this is just ridiculous. I don't keep up with your blog, so I have no idea what you're talking about. But I certainly don't take kindly to you making this sort of accusation.
I'm facing a problem here? Get a grip, man. I've been a member here for many years and have contributed multiple times. Had you actually taken the time to see who I am before making that sort of accusation, you'd understand that I love all things tech, and I'm certainly not in any fanboy club or on a secret AMD payroll.

You seem to be incapable of discussing things without trying to insult and belittle the other person in some way. You had some interesting points in your post and I had some things to say on them, but I won't even bother, since you're such an asshole to everybody who holds a different view.

ODwilly wrote:
F2bnp wrote:

Waiting for custom cards to jump to the RX 480. The card is great value!

This ^. Kinda reminds me of the founders edition nvidia 1080 cards. Just poorly designed. Just wait for Gigabyte, MSI and Sapphire to kick out some good quality cards, possibly with an extra 6-pin or an 8-pin power connector.

At least they didn't charge an extra $75-100 for that "premium" reference cooler. 😵 Yeah, it's utter shit, but they are not charging you extra for it. How hard is it to cool a 150-160W chip though? AMD fucked up once again on this end.

Deep Thought wrote:

I get the aesthetic appeal of the small GPU - especially single-slot - but I'd rather have a card that stays cool and quiet. You can't tell what size the card is once it's in a case.
However you can tell when a system has a GPU with a small heatsink and fan inside because you'll hear it.
The larger a fan is, the quieter it will be and the lower-pitched any noise it makes will be when moving the same amount of air.
The more fans you have, the quieter the GPU will be when shifting the same amount of air.
The more surface area the heatsink has, the more efficiently it can be cooled.

Very good point. I remember buying the 4850 in July 2008, installing it on my PC that evening and playing until late at night. It was about 1am when I realized just how noisy that cooler was 🤣 . At least, I got a cheap Scythe Musashi from a friend not long after and my ears could finally relax.

Reply 43 of 170, by Scali

User metadata
Rank l33t
F2bnp wrote:

I gotta start with this, because this is just ridiculous. I don't keep up with your blog, so I have no idea what you're talking about. But I certainly don't take kindly to you making these sort of accusations.

I meant to say that *even* on my blog these things have been posted by AMD fanboys. They're also posted on all popular tech forums out there, such as Anandtech, TechReport, Tomshardware, Neogaf etc.
You probably picked them up from one of these sites.

You also have to understand that I am a software engineer, and I was actually part of the DX12 Early Access program. So I was actually involved somewhat in the development of the standard. You must understand that to someone like me, these outrageous fanboy claims are deeply insulting in how they completely distort reality, and the work that people like myself have put into developing the standard, and working with the GPU designers.

F2bnp wrote:

I'm facing a problem here?

The 'problem' being that I'm not sure whether you're genuinely questioning the things you've posted here, and are genuinely asking me for an alternative view on the situation, or whether you've already made up your mind. So I'm not sure whether it's even worth trying to engage in a serious, meaningful discussion.
The guy who posted on my blog was completely unreasonable anyway, and I don't care for something like that again.
But since you haven't actually bothered to 'argue' the things I put forward, I'll give you the benefit of the doubt, and I apologize if you felt I treated you too harshly for posting these AMD fanboy crackpot theories here. I hope you also understand where I'm coming from. Read some of the comments here for example: https://scalibq.wordpress.com/2016/05/17/nvid … -is-directx-12/
And note that this was not the first time. The guy first did the same for weeks on another forum. When I stopped posting there, he came to my own blog and continued.

http://scalibq.wordpress.com/just-keeping-it- … ro-programming/

Reply 44 of 170, by swaaye

User metadata
Rank l33t++

Yeah GPUs and CPUs really attract some crazy people with strange thinking processes and personal agendas. Armchair engineers with mental illness run amuck. It's something that is never going to go away. I'd like to sit in on discussions at AMD and NV on strategies to manage and harness the crazy. 😀

Reply 45 of 170, by Scali

User metadata
Rank l33t

Now there are claims of users getting motherboard failures with their new RX480 cards: http://wccftech.com/amd-rx-480-killing-pcie-slots/

http://scalibq.wordpress.com/just-keeping-it- … ro-programming/

Reply 46 of 170, by Scali

User metadata
Rank l33t
swaaye wrote:

Yeah GPUs and CPUs really attract some crazy people with strange thinking processes and personal agendas. Armchair engineers with mental illness run amuck. It's something that is never going to go away. I'd like to sit in on discussions at AMD and NV on strategies to manage and harness the crazy. 😀

You'd almost think that's the demographic that AMD had in mind when they designed this campaign: http://www.forbes.com/sites/jasonevangelho/20 … n/#229256ce1c8d
😀

http://scalibq.wordpress.com/just-keeping-it- … ro-programming/

Reply 47 of 170, by clueless1

User metadata
Rank l33t
Scali wrote:

Now there are claims of users getting motherboard failures with their new RX480 cards: http://wccftech.com/amd-rx-480-killing-pcie-slots/

If these claims turn out to be true, I smell a recall.

The more I learn, the more I realize how much I don't know.
OPL3 FM vs. Roland MT-32 vs. General MIDI DOS Game Comparison
Let's benchmark our systems with cache disabled
DOS PCI Graphics Card Benchmarks

Reply 48 of 170, by swaaye

User metadata
Rank l33t++

It seems inconceivable that they could engineer such a complex GPU and then screw up the power regulation of the board. Could be another sign of the board being pushed beyond its originally intended power consumption.

Reply 49 of 170, by Scali

User metadata
Rank l33t
swaaye wrote:

It seems inconceivable that they could engineer such a complex GPU and then screw up the power regulation of the board. Could be another sign of the board being pushed beyond its originally intended power consumption.

Yup... that must be the case.
They designed it as a card with a single 6-pin power connector. So you know you can get 75W from the PCI-e slot, and another 75W from the connector.
No matter how you do the power regulation, you can't handle a draw of more than 150W that way. You'd draw too much from the slot, from the connector or both. So the card just draws more than it was designed for, that's the only logical conclusion.

If you're aiming for 150W, you shouldn't put a 6-pin on there in the first place. If you look at the GTX970 for example, it has a 145W TDP, yet nVidia opted for an 8-pin connector, to be on the safe side (and to make overclocking easier I suppose).
So my guess is that AMD didn't even design the card for 150W initially, but something considerably less... Then ended up having to push it to the limits, and beyond, to reach competitive levels. This could be because the 14 nm FinFET process is far less mature than AMD had hoped for.
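The budget arithmetic above can be sketched in a few lines (a rough illustration only; the wattages are the per-source limits from the PCI Express spec, and the helper function is hypothetical):

```python
# Power each source may supply, in watts, per the PCI Express spec:
# the x16 slot gives up to 75 W, a 6-pin plug 75 W, an 8-pin plug 150 W.
CONNECTOR_WATTS = {"slot": 75, "6-pin": 75, "8-pin": 150}

def board_power_budget(aux_connectors):
    """Total power a card may draw in spec: the slot plus its auxiliary plugs."""
    return CONNECTOR_WATTS["slot"] + sum(CONNECTOR_WATTS[c] for c in aux_connectors)

print(board_power_budget(["6-pin"]))           # RX 480-style: 150 W ceiling
print(board_power_budget(["8-pin"]))           # GTX 970-style: 225 W of headroom
print(board_power_budget(["6-pin", "6-pin"]))  # 'oldskool' 2x6-pin: also 225 W
```

With a single 6-pin there is simply no in-spec way to draw more than 150 W, which is why any excess has to come out of the slot or the plug.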

http://scalibq.wordpress.com/just-keeping-it- … ro-programming/

Reply 50 of 170, by Scali

User metadata
Rank l33t

[image: 3ipZA6n.png]

For some reason none of the reviews I saw went into the supported feature levels on the RX 480. This screenshot is the ultimate proof that it's still only DX12_0. Not sure why this isn't in every review. If AMD isn't specific about which features it supports, wouldn't the first thing a reviewer does be to grab a tool and find out exactly what we have here?

Edit: looks like it may have come from this Italian review: http://www.bitsandchips.it/recensioni/9-hardw … 480-8gb?start=5

http://scalibq.wordpress.com/just-keeping-it- … ro-programming/

Reply 51 of 170, by Standard Def Steve

User metadata
Rank Oldbie

Yeah, I think AMD will bounce back. You know, sometimes I get the feeling that they're the only guys who truly care about enthusiasts. It's all in the little things they do for us: Freesync, ECC memory support in AM3+ CPUs, Mantle, which jump started the low-level API movement...I could go on and on. Now if only they had some cash.

The 8GB RX480 is one hell of a card for $240, and I wouldn't be surprised if AMD's AIB partners correct the PCIe power consumption problem. I'm also hearing good stuff about Zen. Man I hope they come through and light a fire under Intel's ass. They've been lazy as hell since they released Sandy Bridge.

Oh, and before anybody starts calling me names...you'll find a 4930K and GTX 970 inside my computer. Still, I have nothing but respect for AMD. They've done a shitload of stuff for us, and all on a tiny R&D budget. If it wasn't for them I'd probably be running some highly incompatible Windows 7 IA64 edition on an overpriced Itanium.

94 MHz NEC VR4300 | SGI Reality CoPro | 8MB RDRAM | Each game gets its own SSD - nooice!

Reply 52 of 170, by snorg

User metadata
Rank Oldbie

I think AMD will correct the power issue with some kind of BIOS update. The 3rd party sellers will probably put in an 8-pin power connector and some other tweaks.

I'm going to go out on a limb and say that Zen is going to be the turnaround point for AMD.

I have both AMD and Intel and Nvidia gear in my systems. I plan on buying a 480 when some of the kinks get worked out. I don't really want to spend $400-$500 on a graphics card. Intel has gotten lazy as f@ck and needs to be taken down a peg.

Reply 53 of 170, by Jade Falcon

User metadata
Rank BANNED
Scali wrote:

If you're aiming for 150W, you shouldn't put a 6-pin on there in the first place. If you look at the GTX970 for example, it has a 145W TDP, yet nVidia opted for an 8-pin connector, to be on the safe side (and to make overclocking easier I suppose).
So my guess is that AMD didn't even design the card for 150W initially, but something considerably less... Then ended up having to push it to the limits, and beyond, to reach competitive levels. This could be because the 14 nm FinFET process is far less mature than AMD had hoped for.

8-pin? Yeah, two more grounds will certainly help a lot. Oh, and most low-end and mid-range PSUs piggyback those extra 2 grounds. Two 6-pins is what you should be referring to.

Reply 54 of 170, by swaaye

User metadata
Rank l33t++
Scali wrote:

[image: 3ipZA6n.png]

For some reason none of the reviews I saw went into the supported feature levels on the RX 480. This screenshot is the ultimate proof that it's still only DX12_0. Not sure why this isn't in every review. If AMD isn't specific about which features it supports, wouldn't the first thing a reviewer does be to grab a tool and find out exactly what we have here?

Edit: looks like it may have come from this Italian review: http://www.bitsandchips.it/recensioni/9-hardw … 480-8gb?start=5

So it's essentially identical to the prior GCN chips? It's a curious thing....

The Wikipedia page has been updated too.
https://en.wikipedia.org/wiki/Feature_levels_in_Direct3D

Maybe Intel should build a real graphics card. Looks like they are on top of DirectX spec over there.

Reply 55 of 170, by Scali

User metadata
Rank l33t
Jade Falcon wrote:

8-pin? Yeah, two more grounds will certainly help a lot.

Nope. See here: http://www.playtool.com/pages/psuconnectors/connectors.html
Yes, in theory it's just 'two extra ground pins'... However, the point is:
1) With these extra pins, the card will detect that the proper connector is connected.
2) The PSU needs to be designed to deliver 150W through the 8-pin connector rather than 75W. So it will be connected to a different kind of rail, and the connector may also have thicker wiring than a 6-pin connector would have, to be able to deliver the 150W.

2x 6-pin is the 'oldskool' way of delivering 150W to a card. 8-pin is the new way.

http://scalibq.wordpress.com/just-keeping-it- … ro-programming/

Reply 56 of 170, by Scali

User metadata
Rank l33t
swaaye wrote:

So it's essentially identical to the prior GCN chips? It's a curious thing....

Yup... it appears to be little more than a 14nm shrink of the previous GCN.
They've made some small changes to the architecture, but they didn't make it up-to-date where it matters.
I'm not too surprised though. I already said back when Maxwell was launched that adding CR and ROV to an existing architecture is not that simple. They change how the rasterizer works at a fundamental level. This requires a proper redesign.
Feels like tessellation all over again: that also required a rigorous redesign to avoid the obvious bottlenecks, as nVidia's PolyMorph engine did.

swaaye wrote:

Maybe Intel should build a real graphics card. Looks like they are on top of DirectX spec over there.

During my time in the DX12 Early Access program, it was mostly nVidia and Intel devs that were active. I rarely ever saw AMD people engaging in any conversation. AMD also took ages to deliver a working driver for DX12. nVidia and Intel had them ready months earlier.
AMD totally dropped the ball here. Intel would probably be able to make a pretty decent card. They should have done it 2-4 years ago though, when they were the only ones around with 22 and 14 nm technology.

Last edited by Scali on 2016-07-02, 08:15. Edited 1 time in total.

http://scalibq.wordpress.com/just-keeping-it- … ro-programming/

Reply 57 of 170, by Scali

User metadata
Rank l33t
snorg wrote:

I think AMD will correct the power issue with some kind of bios update.

The only way to correct it in the BIOS is to downclock it. The card has to be prevented from drawing more than 150W.
Problem there is that it will give lower performance, and people bought these cards based on a projected performance level.
They could file a class action against AMD if it can't deliver on these promises.

So the proper way to handle it is to fix the card design, and swap the cards of all the customers with this new fixed card.
Which still would be a problem I suppose... since some people may have specifically bought it because it has one 6-pin connector. The redesigned card would require an 8-pin connector (or 2x6-pin), so they might need a new PSU to connect that properly.

http://scalibq.wordpress.com/just-keeping-it- … ro-programming/

Reply 58 of 170, by PhilsComputerLab

User metadata
Rank l33t++

I studied one deck of slides about Polaris. This is the small print about the 2.8x improvement in perf / watt:

Slide 28: Polaris 11 2.8x performance per watt: Testing conducted by AMD Performance Labs as of May 10, 2016 on the AMD Radeon RX470 (110W) and AMD Radeon R9 270X (180W), on a test system comprising i7 5960x@3.0 GHz, 16 GB memory, AMD Radeon Software driver 16.20 and Windows 10. Using 3DMark Fire Strike preset 1080p the scores were 9090 and 5787 respectively. Using Ashes of the Singularity 1080p High, the scores were 46 fps and 28.1 fps respectively. Using Hitman 1080p High, the scores were 60 fps and 27.6 fps respectively. Using Overwatch 1080p Max settings, the scores were 121 fps and 76 fps respectively. Using Performance/Board power, the resulting average across the 4 different titles was a perf per watt of 2.8 vs the Radeon R9 270X. Test results are not average and may vary. RX-6

Sounds all good, but I'm not sure what they mean by "Test results are not average and may vary"...

Was the R9 270X an especially "bad" perf / watt card?
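Out of curiosity, the headline figure can be roughly reproduced from the numbers in that small print (scores and board powers taken verbatim from the footnote; this is just the plain average of per-title perf/watt ratios, which may not be exactly how AMD averaged them):

```python
# Per-title scores from the slide 28 footnote: (RX 470 result, R9 270X result)
SCORES = {
    "3DMark Fire Strike": (9090, 5787),
    "Ashes of the Singularity": (46, 28.1),
    "Hitman": (60, 27.6),
    "Overwatch": (121, 76),
}
RX470_WATTS, R9_270X_WATTS = 110, 180  # board powers quoted in the footnote

# Perf/watt ratio per title, then the plain average across the four titles.
ratios = [(new / RX470_WATTS) / (old / R9_270X_WATTS)
          for new, old in SCORES.values()]
print(round(sum(ratios) / len(ratios), 2))  # ~2.85, close to the claimed 2.8x
```

So the 2.8x claim does follow from the quoted numbers, though much of it comes from the 270X's far higher board power rather than from raw performance.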

@ Jade

The cards can indeed detect whether you plug in a 6-pin or 8-pin. On a GTX 480 for example, the machine won't turn on if you don't have the 8-pin plugged in 😀

YouTube, Facebook, Website

Reply 59 of 170, by archsan

User metadata
Rank Oldbie

Intel... i740 comeback. Now that's interesting.

"Any sufficiently advanced technology is indistinguishable from magic."—Arthur C. Clarke
"No way. Installing the drivers on these things always gives me a headache."—Guybrush Threepwood (on cutting-edge voodoo technology)