VOGONS

AMD Ryzen CPUs dominate TechReport's May 2017 System Guide


  • This topic is locked. You cannot reply or edit posts.

Reply 20 of 54, by Kreshna Aryaguna Nurzaman

User metadata
Rank l33t
havli wrote:
Kreshna Aryaguna Nurzaman wrote:

Ah, but for gaming, I'd rather spend my money on GPU instead of some expensive Intel CPU.

If CPU is the bottleneck, then more expensive GPU won't help. And the fact is CPU is the bottleneck quite often, especially when running hi-refresh LCD (120+ Hz).

If the CPU is the bottleneck. If. Sorry, not in my case: I'm an AA fanatic, and when you max out AA, it's the GPU that becomes the bottleneck.
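As a toy model of the bottleneck argument (the millisecond figures are made up for illustration, not measurements):

```python
# Minimal bottleneck model: each frame needs both CPU work and GPU work,
# and the slower of the two stages caps the achievable frame rate.
def max_fps(cpu_ms_per_frame: float, gpu_ms_per_frame: float) -> float:
    """Achievable FPS is limited by whichever stage takes longer per frame."""
    return 1000.0 / max(cpu_ms_per_frame, gpu_ms_per_frame)

# CPU-bound case (high-refresh target): a faster GPU changes nothing.
print(max_fps(cpu_ms_per_frame=10.0, gpu_ms_per_frame=5.0))   # 100 FPS

# GPU-bound case (heavy AA): a faster CPU changes nothing.
print(max_fps(cpu_ms_per_frame=4.0, gpu_ms_per_frame=12.0))   # ~83 FPS
```

Both posts are right under their own assumption; the model just makes explicit which stage the upgrade money actually moves.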

Never thought this thread would be that long, but now, for something different.....
Kreshna Aryaguna Nurzaman.

Reply 21 of 54, by dexvx

User metadata
Rank Oldbie
Scali wrote:

They point out that AMD's solution results in 'inconsistent performance'.
I don't know if that is true or not, but you at the very least can't say that from Intel's multi-chip module CPUs from the past.
In fact, the first Pentium D (8xx) was a single die; they weren't MCM until the second generation (9xx), and performance didn't suffer one iota from it.

The Core2 Quad was a bit of a strange beast, since each dual-core had its own shared L2 cache. So there was some inconsistency in caching. This did not amount to problems in practice, though. There was good cache synchronization, and OS schedulers would try to keep threads on the same cores anyway, so by default on the same die/cache as well.

MCM isn't necessarily bad, just like single-die isn't necessarily good. It all depends on the execution.

Several points:

1. The main performance degradation comes when an application has threads spawned on cores from separate dies. This is partially mitigated by AMD, because Epyc (and I assume ThreadRipper) is basically seen as a system with 4 or 2 NUMA nodes, respectively. So the OS should try to schedule apps on the same node. Of course, if an app requires more cores than a node has, there may be some problems.
2. Pentium-D 8xx (Smithfield) was most definitely an MCM solution. It was two Prescott dies joined via the FSB.

http://techreport.com/review/8285/intel-penti … n-840-processor
3. Core2 Quad was basically two Core2 Duos stitched together via a similar MCM to Smithfield/Presler. Both sets of dual cores communicate via the FSB.

https://www.bit-tech.net/reviews/tech/cpus/in … treme_qx6700/2/
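The NUMA point in (1) can be sketched roughly; the node-to-core mapping below is hypothetical, for a 16-core part seen as two NUMA nodes (real mappings come from the OS topology, e.g. /sys/devices/system/node/ on Linux):

```python
# Hypothetical topology: 16 cores split across two dies / NUMA nodes.
NODE_CORES = {
    0: set(range(0, 8)),    # die 0
    1: set(range(8, 16)),   # die 1
}

def node_of(core: int) -> int:
    """Return the NUMA node a given core belongs to."""
    for node, cores in NODE_CORES.items():
        if core in cores:
            return node
    raise ValueError(f"unknown core {core}")

def crosses_nodes(cores: set) -> bool:
    """True if a set of cores spans more than one die (the slow case the
    scheduler tries to avoid)."""
    return len({node_of(c) for c in cores}) > 1

print(crosses_nodes({0, 3, 7}))    # False: all threads stay on die 0
print(crosses_nodes({5, 12}))      # True: traffic crosses the die interconnect
```

This is the scheduler's whole job here: as long as an app's thread set fits inside one node, cross-die latency never enters the picture.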

Reply 22 of 54, by dexvx

User metadata
Rank Oldbie
Kreshna Aryaguna Nurzaman wrote:
havli wrote:
Kreshna Aryaguna Nurzaman wrote:

Ah, but for gaming, I'd rather spend my money on GPU instead of some expensive Intel CPU.

If CPU is the bottleneck, then more expensive GPU won't help. And the fact is CPU is the bottleneck quite often, especially when running hi-refresh LCD (120+ Hz).

If the CPU is the bottleneck. If. Sorry, not in my case: I'm an AA fanatic, and when you max out AA, it's the GPU that becomes the bottleneck.

Even a GTX 980 Ti (roughly the equivalent of a GTX 1070) is bottlenecked by the CPU at 1080p with Ultra settings (MSAA active in most games). If you have something like GTX 1080 Ti SLI, you would need at least a highly clocked 4C/8T+ CPU.

Below are some CPU benchmarks with overclocked (4.5GHz) i5s and i7s (not Ryzen).

http://imgur.com/a/WZlsM

Reply 23 of 54, by Scali

User metadata
Rank l33t
dexvx wrote:

2. Pentium-D 8xx (Smithfield) was most definitely a MCM solution. It was two Prescott dies together via FSB.

No it's not.
Smithfield is two Prescotts on a single die (so one chip).
https://en.wikipedia.org/wiki/Pentium_D

All Smithfield processors were made of two 90 nm Prescott cores, next to each other on a single die with 1 MB of Level 2 (L2) cache per core.

It's Presler that's actually two separate dies (so MCM).

Presler introduced the 'multi-chip module', or MCM, which consisted of two single-core dies placed next to each other on the same substrate package.

See here, left is Presler, right is Smithfield:
[image: presler_and_smithfield.jpg]

3. Core2 Quad was basically two Core2 Duos stitched together via a similar MCM to Smithfield/Presler. Both sets of dual cores communicate via the FSB.

Was that a 'correction' to something I said? Because it isn't.
The FSB protocol is designed for multi-CPU, and cache synchronization/snooping is part of it. You still need to initiate it from the two specific dies though, via the cache controller. The FSB is just a bus. As evidenced by the fact that both Pentium 4 and Core2 can be used on the same motherboards, with the same sockets, same chipset, and of course same FSB, yet the Core2 extracts a lot more performance from that same FSB by more efficient use of that same bus.

http://scalibq.wordpress.com/just-keeping-it- … ro-programming/

Reply 25 of 54, by carlostex

User metadata
Rank l33t

It seems to me that a small company like AMD, whose profits aren't even comparable to Intel's R&D budget, can't do anything to please everybody. No matter how well they deliver, they'll always be shit to some people. This was the case in the early 2000s, it is today, and it will be in the future.

Reply 26 of 54, by Scali

User metadata
Rank l33t
carlostex wrote:

It seems to me that a small company like AMD, whose profits aren't even comparable to Intel's R&D budget, can't do anything to please everybody. No matter how well they deliver, they'll always be shit to some people. This was the case in the early 2000s, it is today, and it will be in the future.

I think a lot of people just get carried away every time AMD releases a competitive product.
When I see a 16-core CPU beating a 10-core CPU, I don't see technical superiority. I fully expect this performance, wouldn't even need a benchmark really (if anything, having 6 extra cores, and getting only 41% more performance means you're 19% off a linear scale). I just see that AMD has undercut prices (because price is the only reason why they're making this rather strange comparison in the first place). Which is nice, but it's not a technical feat. AMD has always undercut Intel's prices. It has just been a very long time since they have done that this far up the market.
And Intel has a very strong brand, so it's not realistic to expect Intel to come down to AMD's price levels. They don't need to.
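The scaling arithmetic can be checked in a couple of lines (the 41% figure is the one quoted above; the "19% off" is in percentage points relative to linear scaling):

```python
base_cores, big_cores = 10, 16
measured_speedup = 1.41                  # 41% more performance, as quoted

ideal_speedup = big_cores / base_cores   # linear scaling would give 1.60
shortfall = ideal_speedup - measured_speedup

print(f"ideal gain: +{(ideal_speedup - 1) * 100:.0f}%")   # +60%
print(f"shortfall: {shortfall * 100:.0f} points")         # 19 points off linear
```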

Heck, if you look at the car market, there are just a few 'premium' brands, say Mercedes, BMW, Audi... They can simply sell their cars for much more than the competition, because of their brand recognition. Other companies even try to 'invent' new brands to try and circumvent that, such as Lexus, Infiniti, Acura, DS.
Nobody ever expects a Mercedes to be price-competitive with say a Toyota. Even if the Toyota is actually a better car. It just doesn't work that way. People pay extra for the brand. It's the same with Intel, especially in the server market. Intel has a big reputation for service, reliability etc. Charging premium prices is more or less what the customers expect anyway, and finding the sweet-spot to maximize profits is most probably not in matching AMD's prices.

Not everything is technology.
I personally am mostly interested in technology, so I try to keep price out of the equation, it only messes up comparisons. Price is just a sticker that the vendor puts on the product. It doesn't say anything about the technological value.
Speaking of non-technical things... Am I the only one who thinks 'Threadripper' is a stupid name? Sounds like some 12-year old coined that. It doesn't sound very well-suited to the professional workstation/server market it's targeted at.

http://scalibq.wordpress.com/just-keeping-it- … ro-programming/

Reply 27 of 54, by Tertz

User metadata
Rank Oldbie
Scali wrote:

I personally am mostly interested in technology

Most people, though, are interested in practical use: they try to get what they need for the minimum price, not abstract "technology" or other marketing BS.
AMD was very competitive in the low segment with the FX-83xx line for the last several years. Since 2017, with Ryzen, it has become competitive up to the high-end i7 sector (for home use, at least). And it will stay that way until Intel sells an 8-core CPU with a similar price and TDP.

Price is just a sticker that the vendor puts on the product.

It's just your money.

DOSBox CPU Benchmark
Yamaha YMF7x4 Guide

Reply 28 of 54, by appiah4

User metadata
Rank l33t++
Scali wrote:
carlostex wrote:

It seems to me that a small company like AMD, whose profits aren't even comparable to Intel's R&D budget, can't do anything to please everybody. No matter how well they deliver, they'll always be shit to some people. This was the case in the early 2000s, it is today, and it will be in the future.

I think a lot of people just get carried away every time AMD releases a competitive product.
When I see a 16-core CPU beating a 10-core CPU, I don't see technical superiority. I fully expect this performance, wouldn't even need a benchmark really (if anything, having 6 extra cores, and getting only 41% more performance means you're 19% off a linear scale). I just see that AMD has undercut prices (because price is the only reason why they're making this rather strange comparison in the first place). Which is nice, but it's not a technical feat. AMD has always undercut Intel's prices. It has just been a very long time since they have done that this far up the market.
And Intel has a very strong brand, so it's not realistic to expect Intel to come down to AMD's price levels. They don't need to.

Heck, if you look at the car market, there are just a few 'premium' brands, say Mercedes, BMW, Audi... They can simply sell their cars for much more than the competition, because of their brand recognition. Other companies even try to 'invent' new brands to try and circumvent that, such as Lexus, Infiniti, Acura, DS.
Nobody ever expects a Mercedes to be price-competitive with say a Toyota. Even if the Toyota is actually a better car. It just doesn't work that way. People pay extra for the brand. It's the same with Intel, especially in the server market. Intel has a big reputation for service, reliability etc. Charging premium prices is more or less what the customers expect anyway, and finding the sweet-spot to maximize profits is most probably not in matching AMD's prices.

Not everything is technology.
I personally am mostly interested in technology, so I try to keep price out of the equation, it only messes up comparisons. Price is just a sticker that the vendor puts on the product. It doesn't say anything about the technological value.
Speaking of non-technical things... Am I the only one who thinks 'Threadripper' is a stupid name? Sounds like some 12-year old coined that. It doesn't sound very well-suited to the professional workstation/server market it's targeted at.

This was the funniest thing I read in a while.

Retronautics: A digital gallery of my retro computers, hardware and projects.

Reply 29 of 54, by Scali

User metadata
Rank l33t
Tertz wrote:

Most people, though, are interested in practical use: they try to get what they need for the minimum price, not abstract "technology" or other marketing BS.

You mean consumers. I'm a professional: a software architect, and I care about the technology side. Hardware is what I develop products with.

Tertz wrote:

It's just your money.

Not mine. The customer pays for it.
It's mostly irrelevant, because the software systems we develop are orders of magnitude more expensive than the hardware they run on.
So whether a CPU is $1000 or $2000 is irrelevant for the total cost of the system.

http://scalibq.wordpress.com/just-keeping-it- … ro-programming/

Reply 30 of 54, by bristlehog

User metadata
Rank Oldbie
carlostex wrote:

It seems to me that a small company like AMD, whose profits aren't even comparable to Intel's R&D budget, can't do anything to please everybody. No matter how well they deliver, they'll always be shit to some people. This was the case in the early 2000s, it is today, and it will be in the future.

They don't need to please everybody. They have to make a niche product. But what's the niche for Ryzens? I have heard that they are good for video editing and live streaming, while losing to similarly priced Intel solutions in gaming.

Hardware comparisons and game system requirements: https://technical.city

Reply 31 of 54, by Scali

User metadata
Rank l33t
bristlehog wrote:

They don't need to please everybody. They have to make a niche product. But what's the niche for Ryzens? I have heard that they are good for video editing and live streaming, while losing to similarly priced Intel solutions in gaming.

I guess the same as before, with Bulldozer: try to convince people that moar coars is the future.
I suppose the actual performance doesn't really matter... once someone has convinced themselves that they need to buy CPU X, they're probably not actually going to ever compare their daily usage on CPU Y to see if they really made the right choice considering cores vs single-threaded performance.
I think the only 'tangible' bit of performance these days is gaming: framerate is directly apparent. But if you already settled for less gaming performance with your purchase, are you really going to ever verify that your CPU actually gives you the better video editing performance, multitasking or whatever for your specific needs? I don't think so. So most people probably will never ever know whether they made the right choice or not.

http://scalibq.wordpress.com/just-keeping-it- … ro-programming/

Reply 32 of 54, by appiah4

User metadata
Rank l33t++
Scali wrote:
bristlehog wrote:

They don't need to please everybody. They have to make a niche product. But what's the niche for Ryzens? I have heard that they are good for video editing and live streaming, while losing to similarly priced Intel solutions in gaming.

I guess the same as before, with Bulldozer: try to convince people that moar coars is the future.
I suppose the actual performance doesn't really matter... once someone has convinced themselves that they need to buy CPU X, they're probably not actually going to ever compare their daily usage on CPU Y to see if they really made the right choice considering cores vs single-threaded performance.
I think the only 'tangible' bit of performance these days is gaming: framerate is directly apparent. But if you already settled for less gaming performance with your purchase, are you really going to ever verify that your CPU actually gives you the better video editing performance, multitasking or whatever for your specific needs? I don't think so. So most people probably will never ever know whether they made the right choice or not.

I guess the same as before, with Kabylake: try to convince people that moar IPC is the future.
I suppose the actual performance doesn't really matter... once someone has convinced themselves that they need to buy CPU X, they're probably not actually going to ever compare their daily usage on CPU Y to see if they really made the right choice considering single-threaded performance vs cores.
I think the only 'tangible' bit of performance these days is productivity: render times are directly apparent. But if you already settled for less productivity with your purchase, are you really going to ever verify that your CPU actually gives you the marginally better framerate performance, compression speed or whatever for your specific needs? I don't think so. So most people probably will never ever know whether they made the right choice or not.

Retronautics: A digital gallery of my retro computers, hardware and projects.

Reply 33 of 54, by Scali

User metadata
Rank l33t
appiah4 wrote:

I guess the same as before, with Kabylake: try to convince people that moar IPC is the future.

Firstly, 'moar IPC' is wrong.
Intel offers more single-threaded performance, which is a balance of good IPC and high clockspeeds.
And single-threaded performance is always the future, never heard of Amdahl's law?
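As a rough sketch of what Amdahl's law implies (the 10% serial fraction is an assumed example, not a measured workload):

```python
def amdahl_speedup(serial_fraction: float, cores: int) -> float:
    """Overall speedup on `cores` cores when `serial_fraction` of the work
    cannot be parallelized (Amdahl's law)."""
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / cores)

# Even with only 10% serial work, piling on cores hits a wall fast:
for n in (4, 16, 64):
    print(n, round(amdahl_speedup(0.10, n), 2))
# The ceiling as cores -> infinity is 1 / 0.10 = 10x, so past a point
# only faster single-threaded execution of the serial part helps.
```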

Secondly, Intel lets you have your cake and eat it too; they already sell 28-core, 56-thread CPUs if you want 'moar coars': http://ark.intel.com/products/120496/Intel-Xe … -Cache-2_50-GHz
And a 32-core, 64-thread version is coming later: https://www.techpowerup.com/228023/intels-sky … ores-64-threads

So it's not that Intel only offers more single-threaded performance; they offer the same high core counts as AMD as well.

appiah4 wrote:

I think the only 'tangible' bit of performance these days is productivity: render times are directly apparent.

They aren't, because unlike games, where everyone plays the same game and reviews will tell you that game A on CPU B with settings C gets X FPS, render times depend a lot on your source material, choice of filters, choice of codecs, and various other factors. So there is no way to directly compare your render times with any kind of reference.
Which is why I said that the only way to find out is to actually test your specific workflow on both types of machines. Which nobody ever does.
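Timing your own workflow is only a few lines; `render_job` below is a hypothetical stand-in for whatever your real export/encode/render step does:

```python
import time

def time_workflow(job, repeats: int = 3) -> float:
    """Run `job` several times and return the best wall-clock time in
    seconds. Best-of-N filters out background-noise outliers."""
    best = float("inf")
    for _ in range(repeats):
        start = time.perf_counter()
        job()
        best = min(best, time.perf_counter() - start)
    return best

# Stand-in workload; swap in your actual pipeline step to compare machines.
def render_job():
    sum(i * i for i in range(200_000))

print(f"best of 3: {time_workflow(render_job):.4f}s")
```

Run the same script on both candidate machines and the comparison is at least about your workload rather than someone else's benchmark.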

Way to miss the point. Don't you love it when people bring a knife to a gunfight?

http://scalibq.wordpress.com/just-keeping-it- … ro-programming/

Reply 34 of 54, by appiah4

User metadata
Rank l33t++

I would say you are the one who missed the point entirely, but seeing as your own faux logic can't make you understand that, I don't see the point in trying further.

Retronautics: A digital gallery of my retro computers, hardware and projects.

Reply 35 of 54, by Tetrium

User metadata
Rank l33t++
appiah4 wrote:

I would say you are the one who missed the point entirely, but seeing as your own faux logic can't make you understand that, I don't see the point in trying further.

It's pointless. He also still hasn't answered my question about how much Intel is paying him 😢

Whats missing in your collections?
My retro rigs (old topic)
Interesting Vogons threads (links to Vogonswiki)
Report spammers here!

Reply 37 of 54, by Standard Def Steve

User metadata
Rank Oldbie
Dreamer_of_the_past wrote:

Scali,
Sorry bud, I hate to disappoint you, but right now it's Ryzen time and everyone knows it. 🤣 Dude, even most of those popular YouTube bloggers turned Intel down.

Hell yeah it is! I can't wait to see how Threadripper performs.
Mainstream Ryzen is held back by its dual-channel RAM and responds quite nicely to higher memory clocks. It'll be interesting to see how much of a difference Threadripper's quad-channel IMC makes.
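As a back-of-the-envelope sketch of why the extra channels should matter (DDR4-3200 is an assumed speed here, not an official spec for either platform):

```python
def peak_bandwidth_gbs(channels: int, mt_per_s: int, bus_bytes: int = 8) -> float:
    """Theoretical peak memory bandwidth in GB/s:
    channels x transfers/s x 8 bytes per 64-bit channel."""
    return channels * mt_per_s * bus_bytes / 1000.0

print(peak_bandwidth_gbs(2, 3200))  # dual channel:   51.2 GB/s
print(peak_bandwidth_gbs(4, 3200))  # quad channel:  102.4 GB/s
```

Theoretical peak doubles with the channel count; how much of that real workloads see is exactly what the Threadripper reviews will show.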

94 MHz NEC VR4300 | SGI Reality CoPro | 8MB RDRAM | Each game gets its own SSD - nooice!

Reply 38 of 54, by Scali

User metadata
Rank l33t
Dreamer_of_the_past wrote:

Scali,
Sorry bud, I hate to disappoint you, but right now it's Ryzen time and everyone knows it. 🤣 Dude, even most of those popular YouTube bloggers turned Intel down.

Ah yes, has it come to this? Random YouTube bloggers as 'industry experts'?

http://scalibq.wordpress.com/just-keeping-it- … ro-programming/

Reply 39 of 54, by Dreamer_of_the_past

User metadata
Rank Oldbie
Scali wrote:

Ah yes, has it come to this? Random YouTube bloggers as 'industry experts'?

Not sure what you're talking about. Go to any technology website; everyone is raving about Ryzen and giving these processors 5 out of 5 stars. Are you one of those immature folks who say "look, the i7 7700K gives 5 more FPS in games!"? But who in their right mind needs those 5 extra FPS in a few games when Ryzen processors beat that CPU in other workloads by 40%-50%? Who in their right mind would buy an Intel CPU right now when there are monsters like the Ryzen 5 1600 for just $200 and the Ryzen 7 1700 for $270?