VOGONS

To end the AMD v. Intel debate.

This topic is locked. You cannot reply or edit posts.

Reply 140 of 181, by Scali

Rank: l33t
appiah4 wrote:

Yet another case of Intel choosing to fuck over a market they had monopoly on for personal gain and felt they could dictate whatever costs they whimsically willed onto the customers. But then, who would be surprised.

RAMBUS, not Intel.

http://scalibq.wordpress.com/just-keeping-it- … ro-programming/

Reply 141 of 181, by appiah4

Rank: l33t++
Scali wrote:
appiah4 wrote:

Yet another case of Intel choosing to fuck over a market they had monopoly on for personal gain and felt they could dictate whatever costs they whimsically willed onto the customers. But then, who would be surprised.

RAMBUS, not Intel.

Oh.. So Rambus made the deal with themselves? Wait, what?

Intel whored themselves at the customers' expense. Intel and Rambus went to bed together because Intel would actually get paid by Rambus if RDRAM succeeded.

Even you can't sugarcoat the truth here. Going with RDRAM was not a choice made on any technical merit RDRAM had (not that it had any). You can argue about RDRAM's 800MHz bus all day long, but it transferred 16-bit chunks vs DDR's 64-bit chunks, so at the end of the day its supposed speed advantage was never a thing. Not to mention how much its latencies sucked ass.

Here's a rough memory bandwidth situation between SD/DD/RD RAM:

Single Channel SDRAM (PC133) - 1.06GB/sec
Single Channel DDR SDRAM (PC2100) - 2.1GB/sec
Dual Channel DDR SDRAM (PC2100) - 4.2GB/sec
Single Channel RDRAM (PC800) - 1.6GB/sec
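Those figures follow directly from bus width times effective transfer rate. A quick sketch of the arithmetic (a hedged illustration, assuming 64-bit SDR/DDR buses, a 16-bit RDRAM channel, and peak theoretical rates with 1 GB = 10^9 bytes):

```python
def peak_bandwidth_gb_s(clock_mhz, transfers_per_clock, bus_bits, channels=1):
    """Peak theoretical bandwidth in GB/s (1 GB = 10^9 bytes)."""
    return clock_mhz * 1e6 * transfers_per_clock * (bus_bits / 8) * channels / 1e9

print(peak_bandwidth_gb_s(133, 1, 64))     # PC133 SDRAM, single channel -> ~1.06
print(peak_bandwidth_gb_s(133, 2, 64))     # PC2100 DDR, single channel  -> ~2.1
print(peak_bandwidth_gb_s(133, 2, 64, 2))  # PC2100 DDR, dual channel    -> ~4.3
print(peak_bandwidth_gb_s(400, 2, 16))     # PC800 RDRAM, single channel -> 1.6
```

The marketed figures (1.06, 2.1, 4.2, 1.6 GB/s) are these same products, rounded.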

Yeah..

Intel is a corporate whore. Through and through.

Last edited by appiah4 on 2019-12-19, 09:48. Edited 1 time in total.

Retronautics: A digital gallery of my retro computers, hardware and projects.

Reply 142 of 181, by Scali

Rank: l33t
appiah4 wrote:

Oh.. So Rambus made the deal with themselves?

RAMBUS had very strict conditions for the use of their technology.
Intel needed RAMBUS as it was the only memory technology to deliver the required bandwidth.
As I say: it could have been very different if there had not been DDR as a low-cost alternative, and RAMBUS would have become the dominant memory technology.

appiah4 wrote:

Intel whored themselves at the customers' expense.

'The customer' does not really apply, since RAMBUS was mainly aimed at high-end configurations, where price wasn't such a big deal anyway.
For other customers, there was still the SDR option from Intel, and third-party chipsets also offered DDR.

I think it's pretty devious to try and pose early Pentium 4 machines as 'customer' systems, aimed specifically at budget-conscious customers.

appiah4 wrote:

Here's a rough memory bandwidth situation between SD/DD/RD RAM:

Single Channel SDRAM (PC133) - 1.06GB/sec
Single Channel DDR SDRAM (PC2100) - 2.1GB/sec
Dual Channel DDR SDRAM (PC2100) - 4.2GB/sec
Single Channel RDRAM (PC800) - 1.6GB/sec

This is pretty devious as well.
RDRAM is different because it's a *bus* system, which works by having relatively small (16-bit) data channels at high clockspeeds. This makes it easier to combine multiple channels.
Therefore the standard configuration for RDRAM was dual channel (making it 32-bit wide, like SDR and DDR), and the minimum speed was 3.2 GB/s, as I said before.
Dual channel DDR solutions did not, however, exist for Pentium 4 in the era of RDRAM; they were introduced later (when the P4's FSB was also significantly upgraded). Which means that on a Pentium 4 system, DDR PC2100 was 2.1 GB/s, as I said before (as it was on contemporary Athlons, by the way).
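A sketch of the dual-channel arithmetic being argued here (a hedged illustration of peak theoretical rates only, taking PC800 as 800 MT/s on a 16-bit, i.e. 2-byte, channel):

```python
# Two 16-bit RDRAM channels at 800 MT/s: channels * bytes/transfer * MT/s
pc800_dual = 2 * 2 * 800 * 1e6 / 1e9
print(pc800_dual)                       # 3.2 GB/s

# Later PC1066 parts (1066 MT/s), still dual channel:
pc1066_dual = 2 * 2 * 1066 * 1e6 / 1e9
print(pc1066_dual)                      # ~4.26 GB/s

# Single-channel DDR PC2100 (64-bit bus, 266 MT/s) for comparison:
pc2100 = 8 * 266 * 1e6 / 1e9
print(pc2100)                           # ~2.13 GB/s
```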

Again, quite devious.

Last edited by Scali on 2019-12-19, 10:22. Edited 4 times in total.

http://scalibq.wordpress.com/just-keeping-it- … ro-programming/

Reply 143 of 181, by appiah4

Rank: l33t++

If you are joking with the above, it's not funny. If it was serious, then I had a good laugh, thanks.

Last edited by appiah4 on 2019-12-19, 13:49. Edited 1 time in total.

Retronautics: A digital gallery of my retro computers, hardware and projects.

Reply 144 of 181, by The Serpent Rider

Rank: l33t++

RDRAM ran at 800 MHz, 16-bit, effectively delivering 3.2 GB/s.

You're wrong. 3.2 GB/s is quoted for 32-bit operation, i.e. two modules in dual channel, or one 32-bit module.

DDR dual channel was originally 266 MHz, which delivered only 2.1 GB/s.

Dual channel DDR266 equals 4.2 GB/s, practically the same bandwidth as RDRAM PC1066 provides, but without the high latency. Intel had a better alternative from the start.

Last edited by The Serpent Rider on 2019-12-19, 10:32. Edited 1 time in total.

I must be some kind of standard: the anonymous gangbanger of the 21st century.

Reply 145 of 181, by Scali

Rank: l33t
The Serpent Rider wrote:

You're wrong. 3.2 GB/s is quoted for 32-bit operation, i.e. two modules in dual channel.

Exactly, that's what I said.
Dual channel was the default configuration for RDRAM on Pentium 4. There were no single channel chipsets.

The Serpent Rider wrote:

Dual channel DDR266 equals 4.2 GB/s, practically the same bandwidth as RDRAM PC1066 provides, but without the high latency.

That is correct, I meant single channel there, but 32-bit, like the 32-bit dual channel RDRAM.
Dual channel DDR didn't arrive until years later (around 2004, I believe). This is because it required a 64-bit interface, which had some technological challenges that had to be overcome first. Exactly the challenges that RDRAM's serial bus nature provided solutions for, years earlier (it's easier to scale in clockspeed than in parallel data lines, which is ironically the same principle that DDR applies over SDR: it performs two data transfers per clock cycle, so effectively you get twice the data rate over the same 32-bit interface).
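The width-versus-clock trade-off described here is just multiplication; a small idealized sketch (ignoring protocol overhead and latency):

```python
# A narrow bus must trade width for transfer rate to match a wide one:
# required_rate = (wide_bits / narrow_bits) * wide_rate.
wide_bits, wide_mts = 64, 133   # e.g. PC133 SDR: 64-bit bus at 133 MT/s
narrow_bits = 16                # one RDRAM channel
required_mts = wide_bits / narrow_bits * wide_mts
print(required_mts)             # 532.0 MT/s -- why RDRAM channels clock so high

# DDR pulls the other lever: two transfers per clock on the same width,
# so at the same clock and width the data rate simply doubles.
ddr_rate = wide_bits / 8 * wide_mts * 2 * 1e6 / 1e9   # GB/s, both clock edges
sdr_rate = wide_bits / 8 * wide_mts * 1 * 1e6 / 1e9   # GB/s, one edge
print(ddr_rate / sdr_rate)      # 2.0
```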

So you're trying to argue that an alternative that didn't become available until 2004 should have been used by Intel in 2000, when Pentium 4 and RDRAM were introduced? Even PC1066 only arrived in 2002, about 2 years before dual channel DDR.
Intel clearly did NOT have a better alternative from the start. That's the point.
Trying to rewrite history by not taking the timeline into account is devious.
By that logic you could also argue that DDR was pointless, because dual channel SDR would be as fast as single-channel DDR anyway, and a quad channel SDR solution could compete with dual channel DDR etc.

Last edited by Scali on 2019-12-19, 15:36. Edited 1 time in total.

http://scalibq.wordpress.com/just-keeping-it- … ro-programming/

Reply 148 of 181, by Bruninho

Rank: Oldbie
Firtasik wrote:

Let me get this straight:

intel bad*
amd good

*nvidia bad too

🤣

I have experienced both AMD and NVIDIA GPUs, and I can say NVIDIA has been the better experience for me.

As for CPUs, Intel has the upper hand; I've always used their processors. AMD has a lot of limitations, plus I have never seen a really good Hackintosh running on an AMD computer. People keep saying that ARM processors are the future, but having experienced a few of them available on the market, I highly doubt it. x86 will still be the standard for decades to come.

"Design isn't just what it looks like and feels like. Design is how it works."
JOBS, Steve.
READ: Right to Repair sucks and is illegal!

Reply 150 of 181, by SirNickity

Rank: Oldbie
Scali wrote:

It's easy to say now that RDRAM was a mistake. But fact of the matter is that it had excellent performance, better than DDR (which explains why it was also used by Sony for the PlayStation 2 for example). The main issue was cost. [...] I think technically RDRAM was a good choice at the time, for the Pentium 4 platform. It delivered the bandwidth that a deeply pipelined CPU at high clockspeeds like the Pentium 4 required. Latency wasn't an issue for the P4 design, the huge caches dealt with that.

Alright, so, TBH, I haven't looked at this from a technical standpoint since I bought my Northwood, new. But, I seem to remember watching the benchmarks, and they showed the latency vs. bandwidth bargain was not panning out very well in favor of RDRAM. I spent the better part of a year frustrated that Intel had gotten into that exclusivity agreement, as the world just sat and waited out the clock for a DDR chipset... I could be wrong, and I'm curious to try some comparisons on my own 423 vs 478 systems, but I have some thermal issues to solve on my 423 box first. And, I don't think I have any of the first-gen DDR boards... I've been trying to track down the Epox board I had back then. It wouldn't be a fair comparison to stack it up to the i865 or i875 boards I have now.

Reply 151 of 181, by SirNickity

Rank: Oldbie
appiah4 wrote:

Oh.. So Rambus made the deal with themselves? Wait, what?

Intel whored themselves at the customers' expense. Intel and Rambus went to bed together because Intel would actually get paid by Rambus if RDRAM succeeded.

The level of irate vitriol, and corresponding lack of reasoning that you display here suggests that either the entire past, present, and future employee population of Intel has slept with your wife and killed your dog, or vice versa.

RAMBUS thought they had the technology that would enable the P4's need for memory bandwidth. They weren't going to give it away for free. Intel made some (in hindsight) bone-headed business deals to gain access to that technology. It seems more like a matter of course that Intel would benefit if the arrangement worked out, as compensation for agreeing to the exclusivity term. Negotiation, tit for tat. That's business, man.

AMD's been milking Intel for AMD64, the way Intel milked AMD for x86, and so on and so forth. There are no angels. It would be great if everything were open and patent-free, but it takes a significant investment to develop technology worth using, and usually somebody has a bill or two to pay that prevents them from dedicating their life, pro-bono.

Reply 152 of 181, by SPBHM

Rank: Oldbie
Scali wrote:
The Serpent Rider wrote:

You're wrong. 3.2 GB/s is quoted for 32-bit operation, i.e. two modules in dual channel.

Exactly, that's what I said.
Dual channel was the default configuration for RDRAM on Pentium 4. There were no single channel chipsets.

The Serpent Rider wrote:

Dual channel DDR266 equals 4.2 GB/s, practically the same bandwidth as RDRAM PC1066 provides, but without the high latency.

That is correct, I meant single channel there, but 32-bit, like the 32-bit dual channel RDRAM.
Dual channel DDR didn't arrive until years later (around 2004, I believe). This is because it required a 64-bit interface, which had some technological challenges that had to be overcome first. Exactly the challenges that RDRAM's serial bus nature provided solutions for, years earlier (it's easier to scale in clockspeed than in parallel data lines, which is ironically the same principle that DDR applies over SDR: it performs two data transfers per clock cycle, so effectively you get twice the data rate over the same 32-bit interface).

So you're trying to argue that an alternative that didn't become available until 2004 should have been used by Intel in 2000, when Pentium 4 and RDRAM were introduced? Even PC1066 only arrived in 2002, about 2 years before dual channel DDR.
Intel clearly did NOT have a better alternative from the start. That's the point.
Trying to rewrite history by not taking the timeline into account is devious.
By that logic you could also argue that DDR was pointless, because dual channel SDR would be as fast as single-channel DDR anyway, and a quad channel SDR solution could compete with dual channel DDR etc.

the rdram saga is weirder when you consider the early use on P3,

also Intel had dual channel DDR400 (875 and 865 chipsets) around mid 2003, with single channel up to DDR333 the previous year,
Nvidia had the nForce (1 and 2) with dual-channel DDR in 2001/2002

Reply 153 of 181, by Scali

Rank: l33t
SPBHM wrote:

the rdram saga is weirder when you consider the early use on P3,

Not really.
As I said, Intel needed RDRAM for the bandwidth on Pentium 4.
So it was important for Intel to get RDRAM standardized and in as many products as possible, to get demand up, price down, and make it into a commodity (there is some irony here: Intel had to give out x86 licenses to third parties because of the huge demand and importance of x86. RAMBUS could have gone the same way).

Sure, it didn't improve performance on P3, but there was a clear and obvious objective, which fits perfectly with the above story.

http://scalibq.wordpress.com/just-keeping-it- … ro-programming/

Reply 154 of 181, by rmay635703

Rank: Oldbie
[Attachments: F0E2913E-FC92-493F-9C2A-5CE865692EF2.jpeg (28.7 KiB), E2662140-59B4-41BC-8206-32DCD4422AD3.jpeg (82.67 KiB), 61AD245C-4DEB-4EB5-8184-0A4D7D14CAFF.jpeg (109.72 KiB); file license: fair use/fair dealing exception]

Reply 155 of 181, by SPBHM

Rank: Oldbie
Scali wrote:
SPBHM wrote:

the rdram saga is weirder when you consider the early use on P3,

Not really.
As I said, Intel needed RDRAM for the bandwidth on Pentium 4.
So it was important for Intel to get RDRAM standardized and in as many products as possible, to get demand up, price down, and make it into a commodity (there is some irony here: Intel had to give out x86 licenses to third parties because of the huge demand and importance of x86. RAMBUS could have gone the same way).

Sure, it didn't improve performance on P3, but there was a clear and obvious objective, which fits perfectly with the above story.

the P3 RDRAM just helped people to see it as unnecessarily expensive for no benefit, and I don't think it generated much of a demand; SDRAM chipsets continued to dominate, and the P4 I think gained a lot of traction when they gave up on being RDRAM exclusive in 2001

Reply 156 of 181, by appiah4

Rank: l33t++
SirNickity wrote:
appiah4 wrote:

Oh.. So Rambus made the deal with themselves? Wait, what?

Intel whored themselves at the customers' expense. Intel and Rambus went to bed together because Intel would actually get paid by Rambus if RDRAM succeeded.

The level of irate vitriol, and corresponding lack of reasoning that you display here suggests that either the entire past, present, and future employee population of Intel has slept with your wife and killed your dog, or vice versa.

RAMBUS thought they had the technology that would enable the P4's need for memory bandwidth. They weren't going to give it away for free. Intel made some (in hindsight) bone-headed business deals to gain access to that technology. It seems more like a matter of course that Intel would benefit if the arrangement worked out, as compensation for agreeing to the exclusivity term. Negotiation, tit for tat. That's business, man.

AMD's been milking Intel for AMD64, the way Intel milked AMD for x86, and so on and so forth. There are no angels. It would be great if everything were open and patent-free, but it takes a significant investment to develop technology worth using, and usually somebody has a bill or two to pay that prevents them from dedicating their life, pro-bono.

God forbid I may just be mad at Intel as a paying customer of their products for them being the anti-competitive, anti-consumer and anti-innovation evil cunts they are.

And I'm not sure what the vice-versa here is; my dog slept with the entire population of Intel's employees and my wife murdered them? Not a bad fantasy overall.

Retronautics: A digital gallery of my retro computers, hardware and projects.

Reply 157 of 181, by ODwilly

Rank: l33t

To add to the RAMBUS vs DDR discussion, I just wanted to mention that my SiS 645 based Soyo 478 motherboard was released in Q4 of 2001 and utilized dual channel DDR266/333. It was also under half the cost of a family friend's RAMBUS Dell from a bit earlier, as well as being much faster.

Main pc: Asus ROG 17. R9 5900HX, RTX 3070m, 16gb ddr4 3200, 1tb NVME.
Retro PC: Soyo P4S Dragon, 3gb ddr 266, 120gb Maxtor, Geforce Fx 5950 Ultra, SB Live! 5.1

Reply 158 of 181, by Scali

Rank: l33t
ODwilly wrote:

To add to the RAMBUS vs DDR discussion, I just wanted to mention that my SiS 645 based Soyo 478 motherboard was released in Q4 of 2001 and utilized dual channel DDR266/333. It was also under half the cost of a family friend's RAMBUS Dell from a bit earlier, as well as being much faster.

Yes, that's what I said: third-party DDR chipsets were available. You didn't *have* to buy an Intel chipset, so you weren't necessarily stuck to RDRAM or SDR, if you wanted a Pentium 4.

http://scalibq.wordpress.com/just-keeping-it- … ro-programming/

Reply 159 of 181, by Scali

Rank: l33t
SPBHM wrote:

the P3 RDRAM just helped people to see it as unnecessarily expensive for no benefit, and I don't think it generated much of a demand

That's not the point, is it? Even if a strategy failed, it was still a strategy.
I also don't think consumers should be involved. RDRAM for P3 was a high-end solution (they even had a dual-channel RDRAM solution for multi-socket Xeon systems), aimed at workstations and servers.

SPBHM wrote:

SDRAM chipsets continued to dominate, and the P4 I think gained a lot of traction when they gave up on being RDRAM exclusive in 2001

I don't think RDRAM alone is the reason for that.
That's just the pattern you see in general: When a new CPU is introduced, it is expensive. Over time, prices drop as manufacturing matures, demand rises, and more/cheaper variations of the chips become available. If you look at the original Pentium, the PII and the PIII, you see the same pattern: expensive at introduction, prices dropping after a year or so, eventually becoming mainstream.

But again, as I say, a failed strategy is still a strategy. Had RDRAM gained more traction in the beginning, then things could have looked a lot different. As it stands, there was never even an RDRAM chipset for the Pentium 4 models with 800 MT/s or faster FSB. And there was never support for RDRAM faster than PC1066 (even though much faster memory modules existed, and even newer standards, like XDR, which was used in the PlayStation 3). So the performance of RDRAM was 'frozen in time', allowing DDR to eventually overtake it.
If Intel continued to commit to RDRAM, then there would have been faster RDRAM chipsets as well, and they may have remained the performance leader.

Heck, the Pentium 4 as a whole was a failed strategy. They wanted to push clockspeeds to extremes, targeting 10 GHz, but fell well short. The Core architecture was a different strategy, and aside from targeting much lower clockspeeds, they also changed the memory and caching around. Since Core, Intel focuses on very low-latency memory access, and the caches are so large and smart that memory bandwidth is not that much of an issue anymore.
As discussed earlier, Intel still has an advantage in gaming, and a big part of that appears to be their super low-latency memory.
So it was an entirely different strategy from the Pentium 4, where the cache was tuned to hide the latencies of RDRAM, and where extremely low-latency memory didn't really make much of a difference.

http://scalibq.wordpress.com/just-keeping-it- … ro-programming/