VOGONS

AMD drops the mic

Reply 40 of 279, by SaxxonPike

User metadata
Rank: Member

I want this to work out for AMD so that there's good competition again. Might wake the former trend of computing improvements up again. A better AMD means a better Intel. We all win.

Sound device guides:
Sound Blaster
Aztech
OPL3-SA

Reply 41 of 279, by Scali

User metadata
Rank: l33t
SaxxonPike wrote:

I want this to work out for AMD so that there's good competition again. Might wake the former trend of computing improvements up again. A better AMD means a better Intel. We all win.

I never bought into that theory.
Why not? Because the market is saturated with Intel. Most people (regardless of whether they use AMD or Intel CPUs currently) have only one upgrade path, and that leads to Intel (AMD hasn't improved performance much in the past ~5 years, so unless you haven't upgraded in a long time, you probably already are at or near the top of AMD's offerings, and Intel is the only one offering a significant upgrade).
When Intel stops offering faster/better/newer CPUs, people will stop upgrading, meaning that Intel's sales would drop to near-0. CPUs don't really wear out or break down, so there is little to no demand for new CPUs outside of upgrades.
So Intel is mostly competing against itself. I would say the market is stagnating because people don't need more than a quadcore at ~3 GHz, not because nothing better exists: Intel does offer much faster/more powerful CPUs. Yes, they're expensive, but they're there. They would be less expensive if the demand were there.

AMD can come out with an octocore CPU, but as long as quadcores remain good enough, I don't think it will have much of an impact. How many people would upgrade to an octocore just because they can say they have an octocore? I think most people would be more interested in the quadcores dropping prices if octocores become mainstream. Which would still be stagnation.

So it's a bit of a chicken-and-egg problem. Software doesn't demand faster CPUs, hence consumers don't demand faster CPUs. Until more demanding software arrives, I don't think octocores will move to the mainstream soon. And as such, their prices will remain relatively high.

http://scalibq.wordpress.com/just-keeping-it- … ro-programming/

Reply 42 of 279, by shiva2004

User metadata
Rank: Member
Scali wrote:
SaxxonPike wrote:

I want this to work out for AMD so that there's good competition again. Might wake the former trend of computing improvements up again. A better AMD means a better Intel. We all win.

So it's a bit of a chicken-and-egg problem. Software doesn't demand faster CPUs, hence consumers don't demand faster CPUs. Until more demanding software arrives, I don't think octocores will move to the mainstream soon. And as such, their prices will remain relatively high.

Err... with all due respect, do you live on the same planet as the rest of us? Software keeps demanding faster CPUs ALL the time. Perhaps not all applications, perhaps not the ones you use, but the trend is clear.

Reply 43 of 279, by Scali

User metadata
Rank: l33t
shiva2004 wrote:

Err... with all due respect, do you live on the same planet as the rest of us? Software keeps demanding faster CPUs ALL the time. Perhaps not all applications, perhaps not the ones you use, but the trend is clear.

I'm talking about mainstream computer usage.
Sure, there's servers, workstations and that sort of thing, but that's a niche, filled by Xeons and Opterons and such.
Mainstream users only need a quadcore to play the latest games, surf the internet, watch Netflix or YouTube in 4k and whatnot. Stuff like MS Office requires even less powerful systems.

Also, see what I did there? I named actual applications.
If you claim there is a 'trend', then please name a few (mainstream) applications that show this trend.
Because I can't think of any. The only reasonably mainstream thing pushing CPUs in recent years has been gaming. And that too stagnated a few years ago, around the same time the Core i7 settled at quadcores at about 3 GHz. Somewhere around there is the current sweet spot.
Heck, I still haven't upgraded my Core i7 860, because there really is no need. I can still play the latest games such as DOOM without a problem. I just upgraded my GPU a few times over the years.

http://scalibq.wordpress.com/just-keeping-it- … ro-programming/

Reply 44 of 279, by leileilol

User metadata
Rank: l33t++

Streaming videogames is a popular, high-demand use though, and I've seen friends upgrade to i7's (from i5's of the same generation) just so the video encoder can behave and play well with games for live >720p broadcasting.

apsosig.png
long live PCem

Reply 45 of 279, by Scali

User metadata
Rank: l33t
leileilol wrote:

Streaming videogames is a popular, high-demand use though, and I've seen friends upgrade to i7's (from i5's of the same generation) just so the video encoder can behave and play well with games for live >720p broadcasting.

Yea, but not octocores?
Besides, don't they offload that to the GPU these days? I know nVidia started integrating such features in their drivers some time ago.
Aside from that I'd say it's very much niche.
In order to get octocores into the mainstream, there needs to be a 'killer app' that requires it. I haven't seen one.

http://scalibq.wordpress.com/just-keeping-it- … ro-programming/

Reply 46 of 279, by badmojo

User metadata
Rank: l33t
Scali wrote:

I still haven't upgraded my Core i7 860, because there really is no need. I can still play the latest games such as DOOM without a problem. I just upgraded my GPU a few times over the years.

Yep, same boat here, and me with an i5 that I've had for... 6 years? I can't even remember when I bought it. I've upgraded the GPU 3 times since buying that CPU, and occasionally I wonder what I'm missing out on with regard to SATA3 / a more modern motherboard, but a CPU upgrade just doesn't interest me like it used to.

Life? Don't talk to me about life.

Reply 47 of 279, by snorg

User metadata
Rank: Oldbie

Man, Intel is going to get spanked hard:

http://www.forbes.com/sites/antonyleather/201 … l/#5f192e7b6e7c

I know what is on my list for my next upgrade. If an 8 core part is just over $300, then there is a good chance a 16 core part will come in at $600 and the 32 core part at $1000 or thereabouts. Can't wait.

Reply 48 of 279, by DosFreak

User metadata
Rank: l33t++

Adblocked. If it's more marketing but no price, benchmarks or stability tests then I'm not interested.

How To Ask Questions The Smart Way
Make your games work offline

Reply 49 of 279, by Scali

User metadata
Rank: l33t
DosFreak wrote:

Adblocked. If it's more marketing but no price, benchmarks or stability tests then I'm not interested.

This time it's prices, but no benchmarks or stability tests.
So far the entire internet is basing everything on a single vague test that AMD themselves performed... Gotta love how this always goes down with fanboys.

http://scalibq.wordpress.com/just-keeping-it- … ro-programming/

Reply 50 of 279, by m1919

User metadata
Rank: Member
snorg wrote:
eL_PuSHeR wrote:

What do you need so many cores for? Aside from Ashes of the Singularity, no other game uses more than 4 cores.

AMD Ryzen seems promising but there is also too much hype floating around. Let's hope it all goes well for AMD. Competition is good for end users.

I do computer graphics as a hobby. I would kill for a 32 or 64 core system that cost $2000 instead of $10,000-$20,000.

^

I'm also into this.

And, I'm a hardware enthusiast... so more cores just because I can is totally on the agenda if it comes at a decent price. Now if AMD can give me a dual-socket enthusiast/workstation-grade board a la Quadfather, but not shitty... I would be all over that.

Also, there are games that utilize more than 4 cores; pretty much every Battlefield game released after Bad Company 2 will use more than 4 cores. How efficiently they use them? No clue. Similarly, Doom seems to make use of as many cores as you can throw at it.

Scali wrote:
DosFreak wrote:

Adblocked. If it's more marketing but no price, benchmarks or stability tests then I'm not interested.

This time it's prices, but no benchmarks or stability tests.
So far the entire internet is basing everything on a single vague test that AMD themselves performed... Gotta love how this always goes down with fanboys.

I'm not a fanboy of either brand; if AMD actually delivers with Zen, I would totally consider switching sides if the price was right and they were offering what I want in a platform.

Crimson Tide - EVGA 1000P2; ASUS Z10PE-D8 WS; 2x E5-2697 v3 14C 3.8 GHz on all cores (All core hack); 64GB Samsung DDR4-2133 ECC
EVGA 1080 Ti FTW3; EVGA 750 Ti SC; Sound Blaster Z

Reply 51 of 279, by Skyscraper

User metadata
Rank: l33t

I'm waiting for a reason to upgrade from the system in my signature.

I do not think the first-generation Zen CPUs will be fast enough for me to consider an upgrade, but with luck they will help kickstart development again. The fat lazy cat Intel has been sitting on its ass long enough.

I know Intel has been doing stuff like improving efficiency, tweaking their integrated GPUs and building phat Xeons, but mainstream CPU performance has been more or less at a standstill for 6 years!

New PC: i9 12900K @5GHz all cores @1.2v. MSI PRO Z690-A. 32GB DDR4 3600 CL14. 3070Ti.
Old PC: Dual Xeon X5690@4.6GHz, EVGA SR-2, 48GB DDR3R@2000MHz, Intel X25-M. GTX 980ti.
Older PC: K6-3+ 400@600MHz, PC-Chips M577, 256MB SDRAM, AWE64, Voodoo Banshee.

Reply 52 of 279, by Scali

User metadata
Rank: l33t
m1919 wrote:

I'm not a fanboy of either brand; if AMD actually delivers with Zen, I would totally consider switching sides if the price was right and they were offering what I want in a platform.

I think this is pretty obvious, and would go for every rational person.
Point remains, we have no idea if AMD 'delivers' with Zen, because no independent reviewers have actually reviewed these systems yet.
So far it's just wild speculation by fanboys. Which I've gotten very tired of.
Wake me up when the real reviews are here... No wait, only wake me up if the real reviews aren't spectacularly worse than AMD's claims were.

http://scalibq.wordpress.com/just-keeping-it- … ro-programming/

Reply 53 of 279, by Scali

User metadata
Rank: l33t
Skyscraper wrote:

The fat lazy cat Intel has been sitting on its ass long enough.

I know Intel has been doing stuff like improving efficiency, tweaking their integrated GPUs and building phat Xeons, but mainstream CPU performance has been more or less at a standstill for 6 years!

Mainstream is just a pricetag. These CPUs are only 'Xeon' because that's how Intel has calculated that they can maximize their profit. Flooding the mainstream with cheap 8+ core CPUs was apparently not the most profitable way forward, in a market with no competition. If competition arrives, that will change. But Intel has not been 'sitting on its ass', because otherwise all those Xeons wouldn't exist. Or their mature 14 nm process, or their 10 nm... and they're now building a 7 nm fab.

http://scalibq.wordpress.com/just-keeping-it- … ro-programming/

Reply 54 of 279, by Skyscraper

User metadata
Rank: l33t
Scali wrote:
Skyscraper wrote:

The fat lazy cat Intel has been sitting on its ass long enough.

I know Intel has been doing stuff like improving efficiency, tweaking their integrated GPUs and building phat Xeons, but mainstream CPU performance has been more or less at a standstill for 6 years!

Mainstream is just a pricetag. These CPUs are only 'Xeon' because that's how Intel has calculated that they can maximize their profit. Flooding the mainstream with cheap 8+ core CPUs was apparently not the most profitable way forward, in a market with no competition. If competition arrives, that will change. But Intel has not been 'sitting on its ass', because otherwise all those Xeons wouldn't exist. Or their mature 14 nm process, or their 10 nm... and they're now building a 7 nm fab.

As you might or should suspect, I'm fully aware of this. The fat cat is providing the market with as little as it can get away with, at as high a price as it can get away with. This is how you make money when you have cornered a market and no longer have any competition.

When development and new products are primarily focused on the server market and on efficient CPUs for laptops, while the sub-$2000 desktop market has been stuck with 4-core i7 CPUs for ages, that takes away some of the pressure on developers to code their games to actually use those nice 6+ core CPUs.

I know of the problems with multithreaded coding, but it IS the future; the sooner we start finding ways to work around these issues, the better.

Last edited by Skyscraper on 2017-02-11, 11:13. Edited 1 time in total.

New PC: i9 12900K @5GHz all cores @1.2v. MSI PRO Z690-A. 32GB DDR4 3600 CL14. 3070Ti.
Old PC: Dual Xeon X5690@4.6GHz, EVGA SR-2, 48GB DDR3R@2000MHz, Intel X25-M. GTX 980ti.
Older PC: K6-3+ 400@600MHz, PC-Chips M577, 256MB SDRAM, AWE64, Voodoo Banshee.

Reply 55 of 279, by Scali

User metadata
Rank: l33t
Skyscraper wrote:

As you might or should suspect, I'm fully aware of this, but when development is primarily focused on the server market and efficient CPUs for laptops

This very statement tells me you didn't get my point at all.
I repeat 'mainstream is just a price tag'.
CPU development is CPU development. The days of separate CPU architectures for different market segments are long behind us.
Intel and AMD use the same basic architectures for all market segments... mobile, desktop and server/workstation.
So by default the technology that they develop for one market segment, can and will also be used for others. So the only 'focus' they have, is that they develop a single x86 architecture. In many cases, a Xeon or Opteron is little more than a rebadged desktop or mobile CPU.

Skyscraper wrote:

I know of the problems with multithreaded coding, but it IS the future; the sooner we start finding ways to work around these issues, the better.

This is utter nonsense, with all due respect.
I keep hearing this from people who have no idea about (multithreaded) software development.
Firstly, when you optimize code for multiple threads, you don't pick a specific number of threads. You redesign your code to use a parallel algorithm, and that algorithm can be scaled up to any number of threads.
Secondly, just because the mainstream doesn't own anything with more than 4 threads (which isn't even true, since the Core i7 is a very popular option, and it offers 8 threads), doesn't mean you should stop there as a developer. Just like game developers generally develop their games for future/extreme high-end GPUs, with the latest technology (eg, Crysis being released only shortly after Vista/DX10 and the first wave of DX10-capable hardware), they can and will do the same with multithreaded CPUs. They can just get a Xeon with 22 cores and see how their code scales to that.
They don't need to wait for these CPUs to become mainstream.
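
To make that concrete, here's a minimal illustrative sketch (mine, not from the original post): a data-parallel sum written against a worker-count parameter, so the very same algorithm runs unchanged on 2 threads or 22.

```python
from concurrent.futures import ThreadPoolExecutor

def parallel_sum(data, n_workers):
    """Split data into one chunk per worker and sum the partial results.

    The algorithm is parameterized over n_workers rather than hard-coded
    to a thread count, so the same code scales from a dual-core up to a
    many-core Xeon.
    """
    chunk = max(1, (len(data) + n_workers - 1) // n_workers)
    parts = [data[i:i + chunk] for i in range(0, len(data), chunk)]
    with ThreadPoolExecutor(max_workers=n_workers) as pool:
        return sum(pool.map(sum, parts))
```

The result is identical whatever the worker count; only the degree of parallelism changes.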

Aside from that, come on, how long are people going to continue the 'the multithreading era is about to begin' nonsense? Multi-core CPUs have been mainstream for well over a decade now. And before that, multithreading/multi-CPU has been studied for decades on mainframes and supercomputers. It's hardly a new area of software development. We've been in the multithreading era for ages. This is what you get. See Amdahl's Law.
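
Amdahl's Law itself is one line of arithmetic; a small illustrative sketch (mine, not from the original post):

```python
def amdahl_speedup(parallel_fraction, cores):
    """Amdahl's Law: the serial fraction caps the speedup, no matter
    how many cores you add."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / cores)

# With 90% of the work parallelizable, 8 cores give roughly 4.7x,
# and even an unbounded core count can never exceed 10x (1 / 0.1).
```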

http://scalibq.wordpress.com/just-keeping-it- … ro-programming/

Reply 56 of 279, by Skyscraper

User metadata
Rank: l33t
Scali wrote:
Skyscraper wrote:

As you might or should suspect, I'm fully aware of this, but when development is primarily focused on the server market and efficient CPUs for laptops

This very statement tells me you didn't get my point at all.
I repeat 'mainstream is just a price tag'.
CPU development is CPU development. The days of separate CPU architectures for different market segments are long behind us.
Intel and AMD use the same basic architectures for all market segments... mobile, desktop and server/workstation.
So by default the technology that they develop for one market segment, can and will also be used for others. So the only 'focus' they have, is that they develop a single x86 architecture. In many cases, a Xeon or Opteron is little more than a rebadged desktop or mobile CPU.

Skyscraper wrote:

I know of the problems with multithreaded coding, but it IS the future; the sooner we start finding ways to work around these issues, the better.

This is utter nonsense, with all due respect.
I keep hearing this from people who have no idea about (multithreaded) software development.
Firstly, when you optimize code for multiple threads, you don't pick a specific number of threads. You redesign your code to use a parallel algorithm, and that algorithm can be scaled up to any number of threads.
Secondly, just because the mainstream doesn't own anything with more than 4 threads (which isn't even true, since the Core i7 is a very popular option, and it offers 8 threads), doesn't mean you should stop there as a developer. Just like game developers generally develop their games for future/extreme high-end GPUs, with the latest technology (eg, Crysis being released only shortly after Vista/DX10 and the first wave of DX10-capable hardware), they can and will do the same with multithreaded CPUs. They can just get a Xeon with 22 cores and see how their code scales to that.
They don't need to wait for these CPUs to become mainstream.

Aside from that, come on, how long are people going to continue the 'the multithreading era is about to begin' nonsense? Multi-core CPUs have been mainstream for well over a decade now. And before that, multithreading/multi-CPU has been studied for decades on mainframes and supercomputers. It's hardly a new area of software development. We've been in the multithreading era for ages. This is what you get. See Amdahl's Law.

Well, I think competition drives development; two actors will sprout more ideas than one, and so on. The mainstream market will get better stuff for less money, which in turn will push software development.

When it comes to coding (I admit I'm not a coder of that sort, even if I do code for a living, in totally unrelated languages on totally unrelated platforms), I at least know about the limitations (with one main thread acting as the limiting factor, and so on). Just because we have not yet thought of a way to at least partly get around the problems does not mean there isn't one, just that we have not figured it out yet.

If we did not try to achieve the unachievable, humans would not be where we are today; that is at least my point of view.

New PC: i9 12900K @5GHz all cores @1.2v. MSI PRO Z690-A. 32GB DDR4 3600 CL14. 3070Ti.
Old PC: Dual Xeon X5690@4.6GHz, EVGA SR-2, 48GB DDR3R@2000MHz, Intel X25-M. GTX 980ti.
Older PC: K6-3+ 400@600MHz, PC-Chips M577, 256MB SDRAM, AWE64, Voodoo Banshee.

Reply 57 of 279, by Scali

User metadata
Rank: l33t
Skyscraper wrote:

Well, I think competition drives development; two actors will sprout more ideas than one, and so on. The mainstream market will get better stuff for less money, which in turn will push software development.

Perhaps... But we are not in such a situation. AMD has historically been cloning Intel CPUs. Initially they cloned them directly; later they made their own architectures, with very strong resemblances to Intel's. Only in the Athlon era did AMD get successful enough to actually drive some development of their own. They came up with a 64-bit extension to the x86 instruction set, which obviously was the better choice for the short-term-oriented x86 market (Intel wanted a long-term solution with a new, non-x86 64-bit architecture, which would probably have been the better long-term solution, but we will never know), and they integrated the memory controller into the CPU.
Then their next step was the fatally flawed Bulldozer architecture, and Zen is basically AMD being back at square one: they are pretty much cloning the Intel Core approach. Cores with relatively short pipelines, focused on maximum IPC, combined with simultaneous multithreading.

Skyscraper wrote:

Just because we can not think of any way to get around them so far does not mean that there isn't one, just that we have not figured it out yet.

But we did figure them out, and that's what you get. Biggest problem in most cases is not the CPU technology or the software technology, but the lack of skill in most people developing the actual software.

http://scalibq.wordpress.com/just-keeping-it- … ro-programming/

Reply 58 of 279, by Skyscraper

User metadata
Rank: l33t
Scali wrote:
Skyscraper wrote:

Well, I think competition drives development; two actors will sprout more ideas than one, and so on. The mainstream market will get better stuff for less money, which in turn will push software development.

Perhaps... But we are not in such a situation. AMD has historically been cloning Intel CPUs. Initially they cloned them directly; later they made their own architectures, with very strong resemblances to Intel's. Only in the Athlon era did AMD get successful enough to actually drive some development of their own. They came up with a 64-bit extension to the x86 instruction set, which obviously was the better choice for the short-term-oriented x86 market (Intel wanted a long-term solution with a new, non-x86 64-bit architecture, which would probably have been the better long-term solution, but we will never know), and they integrated the memory controller into the CPU.
Then their next step was the fatally flawed Bulldozer architecture, and Zen is basically AMD being back at square one: they are pretty much cloning the Intel Core approach. Cores with relatively short pipelines, focused on maximum IPC, combined with simultaneous multithreading.

Skyscraper wrote:

Just because we can not think of any way to get around them so far does not mean that there isn't one, just that we have not figured it out yet.

But we did figure them out, and that's what you get. Biggest problem in most cases is not the CPU technology or the software technology, but the lack of skill in most people developing the actual software.

Software development tools will improve somewhat in time, but I agree that there are limits to what humans can achieve, so progress will perhaps be painfully slow.

What I mean by "get around the problems" is more thinking outside the box. Perhaps programs/games need to work more like networks with a huge number of independent parts, so that each part doesn't get that complicated. MMORPGs achieve huge complexity by using many human minds as players.

A million birds can fly in formation, reacting to outside input as if they were one individual bird, with the help of a single simple rule, one line of code. Not that we will solve our issue with one simple rule, but some revolutionary out-of-the-box breakthrough is probably needed to speed up progress in multithreaded coding enough for my liking.
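
Purely as a toy illustration of that flocking idea (hypothetical, not from the post): a single local rule, "nudge yourself toward the flock's centre", makes the whole group converge with no central coordinator.

```python
def step(positions, pull=0.1):
    """One update: every bird moves a little toward the flock's centre.

    No bird coordinates with the whole; coherent group motion emerges
    from repeatedly applying this one local rule.
    """
    centre = sum(positions) / len(positions)
    return [p + pull * (centre - p) for p in positions]

# Start three 'birds' spread out on a line and iterate the rule.
flock = [0.0, 4.0, 10.0]
for _ in range(50):
    flock = step(flock)
# The flock tightens around its (unchanged) centre of mass.
```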

I admit that I'm a dreamer, but I do not mind being one.

New PC: i9 12900K @5GHz all cores @1.2v. MSI PRO Z690-A. 32GB DDR4 3600 CL14. 3070Ti.
Old PC: Dual Xeon X5690@4.6GHz, EVGA SR-2, 48GB DDR3R@2000MHz, Intel X25-M. GTX 980ti.
Older PC: K6-3+ 400@600MHz, PC-Chips M577, 256MB SDRAM, AWE64, Voodoo Banshee.

Reply 59 of 279, by gdjacobs

User metadata
Rank: l33t++
Scali wrote:

This is utter nonsense, with all due respect.
I keep hearing this from people who have no idea about (multithreaded) software development.
Firstly, when you optimize code for multiple threads, you don't pick a specific number of threads. You redesign your code to use a parallel algorithm, and that algorithm can be scaled up to any number of threads.

I'm not a game programmer, so I'm not really aware of how guys like Tim Sweeney are evolving their approach to increasingly parallel hardware, but I am familiar with numerical programming on dozens or hundreds of cores. In general I can say that parallel programming is not straightforward or cheap, and any opportunity to use knowledge of the target platform to simplify the task of adapting software to a multi-core environment will be embraced by developers, especially if doing so (assuming small core counts) has little to no downside within the economic lifetime of your product.

The most versatile approach to multiprocess computing is to partition the problem in a fine-grained fashion amongst your processors, but this is very expensive in terms of engineering. Instead, if your network code, audio code, or some other subsystem does not lend itself to parallelism, if you don't achieve worthwhile gains by partitioning, or if it's not worth the cost in developer time, it can be spun off on a thread in coarse fashion for the scheduler to load-balance. On a processor with a few cores, large elements such as graphics remain unitary within their respective threads, but have a core fully dedicated to their execution. Housekeeping threads and the OS share what's left.

This is useful for small core counts, as it allows the available resources to be utilized while avoiding the significant cost of completely re-engineering the game engine, potentially down to its inner loops. The approach runs out of steam with large core counts, due to the limited number of application elements to be spun off and the significant difference in compute time required by the different threads (compare network code vs. scene setup and render).
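
A minimal sketch of that coarse-grained decomposition (the subsystem names and their work are hypothetical stand-ins): each large element is spun off whole on its own thread for the scheduler to balance, with no fine-grained partitioning inside any subsystem.

```python
import threading

def run_frame():
    """Coarse-grained decomposition: one thread per subsystem.

    Each subsystem stays unitary within its thread; only the top
    level is parallel, so nothing inside a subsystem is re-engineered.
    """
    results = {}

    def audio():
        results['audio'] = 'mixed one buffer'    # stand-in for real work

    def network():
        results['network'] = 'polled sockets'    # stand-in for real work

    def logic():
        results['logic'] = 'updated entities'    # stand-in for real work

    threads = [threading.Thread(target=f) for f in (audio, network, logic)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return results
```

Note how this scales only as far as there are subsystems to spin off, which is exactly why the approach runs out of steam at large core counts.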

All hail the Great Capacitor Brand Finder