VOGONS


First post, by observer

Rank: Newbie

Earlier today I wondered what the measured differences were between my dual P3 833 MHz with SDRAM (Tyan S1834D/VIA Apollo 133A) and my single 1 GHz with RDRAM (Dell OEM i820). The 1 GHz system feels faster, but can hit a wall if the single CPU maxes out. The dual-CPU system feels slower, but it takes a lot to max out both CPUs.

I figured there must be a substantial difference in performance from the memory. At least in SiSoft Sandra (same version on both, Windows XP SP3 on both), the difference was quite minimal per the screenshots. If anyone has a P3 RDRAM system, let me know if you see a similar value. I could have popped out one of the 866s and put it into the Slot 1 1 GHz system, but I was too lazy. In any case it would only have further harmed the thesis.

I'll do some testing with games at some point, but figured I'd share at least this much so far.

Reply 1 of 17, by mockingbird

Rank: Oldbie

I reckon you'd see a much better result on a Pentium 4 with an 850 RDRAM subsystem as opposed to an 845 SDRAM subsystem... I actually have both boards to test this on.

Reply 2 of 17, by ElectroSoldier

Rank: Oldbie

The Pentium III's requirement from RAM was low latency, not high bandwidth.

Reply 3 of 17, by Matth79

Rank: Oldbie

133 SDRAM is nicely balanced for a 133 FSB and has good latencies. RAMBUS/RDRAM was a car crash: terrible latencies, and so hot it needed heat spreaders that weren't just a fashion statement. Intel shackled themselves to it for that generation of chipsets, and it was a wrong turn - but not the only wrong turn Intel made.

Reply 4 of 17, by observer

Rank: Newbie

A Willamette 1.5 GHz with 1 GB of RDRAM crushes it in synthetic tests. That system does feel very snappy... Windows XP takes the longest to boot on it, unfortunately; I'll have to reinstall it. Sadly, the Intel desktop board has no overclocking options to speak of, and it can only go up to 1.8 GHz (unless I try a dodgy Socket 423 to 478 adapter).

Reply 5 of 17, by agent_x007

Rank: Oldbie

A Pentium III with a 133 MHz FSB has a maximum bus bandwidth of... a single stick of 133 MHz SDR SDRAM.
Throwing RDRAM at the problem can't fix this, because the effective memory speed is much higher than the CPU bus can handle. It can only help with Fast Writes speeds, since those can happen between the AGP GPU and RDRAM directly (i.e. no CPU involvement).
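
As a rough sketch of the arithmetic behind that point (nominal peak figures only, ignoring protocol and refresh overhead; the numbers below are taken from the standard published specs, not measurements):

```python
# Back-of-envelope peak bandwidth figures (nominal, theoretical maxima).

def peak_bandwidth_mb_s(clock_mhz, bytes_per_transfer, transfers_per_clock=1, channels=1):
    """Peak bandwidth in MB/s for a simple clocked bus."""
    return clock_mhz * bytes_per_transfer * transfers_per_clock * channels

# Pentium III front-side bus: 133 MHz, 64-bit (8-byte) wide, one transfer per clock.
p3_fsb = peak_bandwidth_mb_s(133, 8)                        # ~1066 MB/s

# PC133 SDR SDRAM: 133 MHz, 64-bit wide, one transfer per clock.
pc133 = peak_bandwidth_mb_s(133, 8)                         # ~1066 MB/s

# PC800 RDRAM: 400 MHz, 16-bit (2-byte) wide, double-pumped; the i820 runs one channel.
pc800 = peak_bandwidth_mb_s(400, 2, transfers_per_clock=2)  # ~1600 MB/s

print(f"P3 FSB:      {p3_fsb:.0f} MB/s")
print(f"PC133 SDRAM: {pc133:.0f} MB/s")
print(f"PC800 RDRAM: {pc800:.0f} MB/s")
# The FSB tops out at the same ~1.06 GB/s as PC133, so the extra ~0.5 GB/s
# a PC800 channel can deliver never reaches the CPU on a P3 platform.
```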

Reply 6 of 17, by dionb

Rank: l33t++
Matth79 wrote on Yesterday, 01:13:

133 SDRAM is nicely balanced for a 133 FSB and has good latencies. RAMBUS/RDRAM was a car crash: terrible latencies, and so hot it needed heat spreaders that weren't just a fashion statement. Intel shackled themselves to it for that generation of chipsets, and it was a wrong turn - but not the only wrong turn Intel made.

That's simply not true. RDRAM was a strategic and marketing blunder alright, being touted as revolutionary while not offering any convincing advantages, at least for P3 systems. It was idiotically expensive and the industry really hated its proprietary nature. But strip all that non-technical stuff away and you're left with very decent performance, coming close to, equaling, or in some cases even beating the best that i440BX and i815 SDRAM can offer, depending on the benchmark. Take a look at this set of benchmarks by VooDooMan:
Fastest Tualatin Chipset / Best Pentium III Motherboard

In general the SiS635T and an overclocked i440BX rule the roost, but the i820 is generally only a few points behind, and sometimes (with an FX5200 actually using AGP 4x, for example) it pulls ahead of its SDR SDRAM cousins.

The only utterly abysmal performance comes if you add the buggy MTH into the mix to combine RDRAM latency with (PC100) SDRAM bandwidth.

Was it worth the extra money? Of course not. Nor was the concept of proprietary system memory. But technically it's not deserving of the hate either, even on the P3.

Reply 7 of 17, by konc

Rank: l33t

RDRAM had a significant performance advantage only for a very specific and short period: the i850/Pentium 4 dual-channel era, before it was matched by the i845 with DDR a few months later. So with all the drawbacks and, most importantly, the triple or even quadruple price, it's no surprise it didn't last long. Even Intel realized that soon enough.

Reply 8 of 17, by dionb

Rank: l33t++
konc wrote on Yesterday, 08:30:

RDRAM had a significant performance advantage only for a very specific and short period: the i850/Pentium 4 dual-channel era, before it was matched by the i845 with DDR a few months later. So with all the drawbacks and, most importantly, the triple or even quadruple price, it's no surprise it didn't last long. Even Intel realized that soon enough.

Even when the i845GE came along, the i850E still had a performance advantage - the i850E with dual-channel PC1066 RDRAM outperformed single-channel PC2700 DDR1; the i875P dual-channel DDR chipset was the first to actually beat the i850E, over a year later. That was not because RDRAM couldn't go faster; it was because Intel had admitted commercial defeat and stopped developing RDRAM chipsets.
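
For anyone who wants the arithmetic behind that, here is a quick sketch with nominal peak figures (theoretical maxima only; the chipset/memory pairings are the ones mentioned above, and the numbers are assumed from the published specs):

```python
# Nominal peak bandwidths behind the i850E vs. i845GE comparison (in MB/s).

def peak_mb_s(mega_transfers_per_s, bytes_wide, channels=1):
    """Peak bandwidth = transfer rate x bus width x number of channels."""
    return mega_transfers_per_s * bytes_wide * channels

pc1066_dual   = peak_mb_s(1066, 2, channels=2)  # i850E: two 16-bit RDRAM channels at 1066 MT/s -> ~4266
pc2700_single = peak_mb_s(333, 8)               # i845GE: one 64-bit DDR333 channel -> ~2666 ("PC2700")
ddr400_dual   = peak_mb_s(400, 8, channels=2)   # i875P: two 64-bit DDR400 channels -> ~6400
p4_fsb_533    = peak_mb_s(533, 8)               # 533 MT/s (quad-pumped 133 MHz) P4 bus -> ~4266

print(pc1066_dual, pc2700_single, ddr400_dual, p4_fsb_533)
# Dual-channel PC1066 exactly saturates the 533 MT/s P4 bus, while a single DDR333
# channel falls well short - which is why the i850E kept its lead until dual-channel
# DDR chipsets arrived.
```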

I stand by my statement that RDRAM was a commercial and strategic disaster for Intel, but not a technical one - particularly not in the case of the P4, whose bus/memory controller design was specifically tailored to take maximum advantage of RDRAM's benefits and suffer least from its drawbacks.

Reply 9 of 17, by konc

Rank: l33t
dionb wrote on Yesterday, 10:11:
konc wrote on Yesterday, 08:30:

RDRAM had a significant performance advantage only for a very specific and short period: the i850/Pentium 4 dual-channel era, before it was matched by the i845 with DDR a few months later. So with all the drawbacks and, most importantly, the triple or even quadruple price, it's no surprise it didn't last long. Even Intel realized that soon enough.

Even when the i845GE came along, the i850E still had a performance advantage - the i850E with dual-channel PC1066 RDRAM outperformed single-channel PC2700 DDR1;

That's why I used the word "significant": large enough to justify the price difference (even if only for a few) and not just marginally faster. I agree with everything you wrote, though.

Reply 10 of 17, by myne

Rank: Oldbie

There's a reason DDR won.

Intel's attempt to corral the industry into proprietary RAM was one, but the other was obviously that DDR quickly caught up in speed.

Plus, it was a proven technology: it is SDRAM that transfers at twice the reference clock rate, an idea already proven in AGP and arguably, in a way, as far back as the DX2-66.

As for why SDR couldn't simply operate faster, I think history shows there is a real aversion to sending motherboard clock signals much above 100 MHz, at least over long distances. I assume interference gets problematic, probably requiring far more extensive testing and validation. I'm not sure whether modern RAM is genuinely DDR, or whether it has an internal clock multiplier from the reference, but either way it is what it's called.

Also, the P6 architecture had a generous enough cache to mask SDR's limitations, and since the link to the chipset was still a shared SDR link, there wasn't really much point in doubling the link to the RAM.

Reply 11 of 17, by observer

Rank: Newbie

It's wonderful to see the SDRAM vs RDRAM debate still going all these years later!
I ordered an i850E board to try the PC1066 RDRAM I have and a 3.06 GHz P4.

It'll probably replace the Intel desktop board listed above.

Reply 12 of 17, by dionb

Rank: l33t++
myne wrote on Yesterday, 12:31:

There's a reason DDR won.

Intel's attempt to corral the industry into proprietary RAM was one, but the other was obviously that DDR quickly caught up in speed.

Plus, it was a proven technology: it is SDRAM that transfers at twice the reference clock rate, an idea already proven in AGP and arguably, in a way, as far back as the DX2-66.

RDRAM was also proven - and used the same double-pumping you are referring to here well before AGP. There were products shipping with it as early as late 1995, early 1996 - things like Chromatic's mPACT series of DSPs and VGA cards based on the Cirrus Logic GD-546x chips.

The problem wasn't technical, it was economic and political - the industry really, really didn't want to pay royalties on every device that used an RDRAM interface, or to have no influence on the development of the specs of that interface. Plus, the partnership with Intel would have given Intel even more leverage over the industry than they already had. And the customer didn't want to pay more, so it was easy for VIA in particular to undercut Intel, to the point that it forced them first to mess around with the MTH, then to offer mid-range SDR SDRAM fallbacks (i815, i845), and finally to drop RDRAM for DDR SDRAM.

[...]

Also, the P6 architecture had a generous enough cache to mask SDR's limitations, and since the link to the chipset was still a shared SDR link, there wasn't really much point in doubling the link to the RAM.

SDR's limitations were also the limitations of the P2/P3 FSB, so they were a perfect match. Conversely, the P4 had been designed around much higher bandwidths in general and interfacing with dual-channel RDRAM in particular - with a hugely deep pipeline that imposed a horrible cost on cache misses - so if anything it was the P4 that needed cache far more than the P3. The performance of Celerons highlighted that: a Mendocino or Coppermine Celeron clocked to P3 bus speeds was competitive to within a few percentage points despite having only half the cache, whereas P4 Celerons already had the same bus speeds as Willamette and early Northwood CPUs but their smaller caches hurt performance far more.

Reply 13 of 17, by myne

Rank: Oldbie

I'm confused.
You seem to be reiterating my points in a way that seems like a correction.

Reply 14 of 17, by mockingbird

Rank: Oldbie

RDRAM is like One Flew Over the Cuckoo's Nest... When you're young and stupidly liberal, you root for Jack Nicholson's character. When you get older, you see that Nurse Ratched was justified and Jack was the asshole.

Yes, Intel and RDRAM tried to monopolize the market, but we all would have been better off today had they succeeded.

The memory conglomerate (or whatever they call themselves - JEDEC?) proceeded to deceive the public with false advertising for decades to boost sales. DDR is the same speed as SDR, but it transfers data on both the rising and falling edges, so we call it DDR-266 (read: 133 MHz). Fast forward to DDR2: lithography improves, we can make RAM faster regardless of the technology used in the RAM, and now we deceive you further by quoting its speed in the thousands (you're not buying puny 133 MHz SDRAM anymore, now you're buying PC2-4200)... The review sites further gaslit you by telling you that PC2-4200 is actually only 533 MHz (also a lie - the bus clock is 266 MHz and the "MT/s" is 533), and that the PC2-4200 number is there so that stupid people could subsidize you, the professional - but the RAM chips' internal clock is still only 133 MHz (and you are actually also stupid and are subsidizing the snob technocrats with their mansions in Silicon Valley and the Taiwan elite).

Now I KNOW that it is still a lot faster, but the industry shouldn't confuse people with invented nomenclature. RAM should be advertised by its internal clock speed, and to hell with the other improvements in other subsystems that further enhance performance.
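
For reference, a small sketch of how the numbers in that DDR2-533 / PC2-4200 example map onto each other (nominal figures for a standard 64-bit module, as an illustration rather than a spec quote):

```python
# Relationship between the various "speeds" quoted for a DDR2-533 module.

core_clock_mhz = 133                   # internal DRAM array clock
io_clock_mhz   = core_clock_mhz * 2    # DDR2 runs its I/O bus at twice the core clock -> 266 MHz
transfer_rate  = io_clock_mhz * 2      # data moves on both clock edges -> 533 MT/s
peak_mb_s      = transfer_rate * 8     # 64-bit (8-byte) wide module -> ~4266 MB/s

print(core_clock_mhz, io_clock_mhz, transfer_rate, peak_mb_s)
# Marketing rounds ~4266 MB/s to "PC2-4200", reviewers quote "533 MHz",
# and the DRAM core underneath is still running at 133 MHz.
```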

Another thing to consider is that GDDR and GDDR2 have insanely high failure rates (GPU vendors needed fast RAM for their cutting-edge GPU silicon and the technology just wasn't there, so reputable memory manufacturers pushed it and sold inherently faulty chips - the GeForce 2 GTS is a prime example of this, but it extends all the way to the GeForce FX, until GDDR3 thankfully arrived and solved the issue) and are responsible for the vast majority of failures in retro GPUs. Had RDRAM succeeded, there would be a lot more retro GPUs on the market today.

Public technology consortiums have run their course and R&D should be returned to the private sector.

[/rant]

Reply 15 of 17, by douglar

Rank: l33t
mockingbird wrote on Yesterday, 20:00:

Yes, Intel and RDRAM tried to monopolize the market, but we all would have been better off today had they succeeded.

Public technology consortiums have run their course and R&D should be returned to the private sector.

Really? Tell me more about that ..

Reply 16 of 17, by mockingbird

Rank: Oldbie
douglar wrote on Today, 03:08:

Really? Tell me more about that ..

Cliff notes:

1) When we're young we're egged on to "fight the establishment"; when we're older, we realize it was the establishment all along trying to get us to destroy ourselves. All those anti-Microsoft fanatics in the late 90s, the "Bill Gates is the devil" anti-Internet Explorer icons next to link exchange banners on webpages -- so are we better off now that Internet Explorer is dead and the technocrats at Google have basically turned the web browser into an operating system, forcing everyone to kowtow to their whimsical inventions of new "features" that make your browser slower (while they discontinue support for "old" operating systems)?

2) Early GDDR (and GDDR2, which was really GDDR and not based on DDR2) was absolute shite and caused the premature death of millions of GPUs. Had RAMBUS played a role in the GPU RAM market, we'd have been better off. Instead we allowed script-kiddie, BitBoy-style silicon wunderkind "theoreticians" to produce our chips. The graveyard that is the scrap heap of 2000s hardware tells the story by itself. We allowed the industry to make PCs a disposable product, and DDR was a part of that. Precious metals, rare earth minerals, and noble gases (helium) cost money, and countries are now entering into trade embargoes because everyone's fighting over the same thing. The solution? Sell you a miniature PC for a couple of hundred dollars that can't do diddly squat.

3) Just because something is "public" and "open source" doesn't mean it's better. Twenty years of Ubuntu, and the Linux desktop is still backwards and a joke. People would rather be subjected to arbitrary system restarts, advertisements, and "telemetry" invasions of privacy from Microsoft than be forced to use Linux. Give me a Windows XP/2000-style operating system with all the new hardware features of Windows 11 and I don't need anything else.

Reply 17 of 17, by dionb

Rank: l33t++
mockingbird wrote on Today, 03:58:
douglar wrote on Today, 03:08:

Really? Tell me more about that ..

Cliff notes:

1) When we're young we're egged on to "fight the establishment"; when we're older, we realize it was the establishment all along trying to get us to destroy ourselves. All those anti-Microsoft fanatics in the late 90s, the "Bill Gates is the devil" anti-Internet Explorer icons next to link exchange banners on webpages -- so are we better off now that Internet Explorer is dead and the technocrats at Google have basically turned the web browser into an operating system, forcing everyone to kowtow to their whimsical inventions of new "features" that make your browser slower (while they discontinue support for "old" operating systems)?

Isn't that flipping it around? You don't get much more 'establishment' than international standards organizations like JEDEC. SDRAM and its derivatives were the establishment; Rambus was the hot new upstart with bags of venture capital and a get-rich-quick-or-die-trying mentality.

The big thing that put everyone except Intel off Rambus was the way they behaved like patent trolls, not just charging royalties for using their interface but egregiously litigating against SDRAM itself in a pretty transparent (and ultimately unsuccessful) attempt to sink their competition. That made a lot of companies not want to do business with them, regardless of the technical or even commercial merits of their products. That wasn't the result of young people "fighting the establishment"; it was a bunch of old grey men in suits concluding Rambus were arseholes and saying: NO.

2) Early GDDR (and GDDR2, which was really GDDR and not based on DDR2) was absolute shite and caused the premature death of millions of GPUs. Had RAMBUS played a role in the GPU RAM market, we'd have been better off. Instead we allowed script-kiddie, BitBoy-style silicon wunderkind "theoreticians" to produce our chips. The graveyard that is the scrap heap of 2000s hardware tells the story by itself. We allowed the industry to make PCs a disposable product, and DDR was a part of that. Precious metals, rare earth minerals, and noble gases (helium) cost money, and countries are now entering into trade embargoes because everyone's fighting over the same thing. The solution? Sell you a miniature PC for a couple of hundred dollars that can't do diddly squat.

RDRAM was used in graphics cards years before it ever came into use for system memory; in fact, those were its first applications. Cirrus Logic and Chromatic first tried it out in 1995/1996 and later moved exclusively to it. It worked fine, but pricing was not competitive and the big players were very wary of vendor lock-in, so they stuck to cheaper solutions.

3) Just because something is "public" and "open source" doesn't mean it's better. Twenty years of Ubuntu, and the Linux desktop is still backwards and a joke. People would rather be subjected to arbitrary system restarts, advertisements, and "telemetry" invasions of privacy from Microsoft than be forced to use Linux. Give me a Windows XP/2000-style operating system with all the new hardware features of Windows 11 and I don't need anything else.

Decisions on computer technology are made in corporate boardrooms, not - despite what many on Reddit seem to think - in the bedrooms of geeks, young or old. Technical quality is only one thing choices are based on. Price is always a very important factor, but logistics, in particular stability of supply, is as important if not more so. With just-in-time manufacturing, you need to be absolutely sure you can get the parts you need for the price you are prepared to pay, and you need a plan B in case your main supplier has a problem. Take a look at what happened to supply chains in 2022 for an example of what can go wrong. That is why the industry tends towards open standards: they make it possible to source parts from multiple suppliers with a minimum of fuss. Rambus' corporate behaviour actively fragmented the memory market and limited supplier choice. Intel was big enough that it saw this as an opportunity to monopolize the memory market, with potential benefits that outweighed the downsides for them. That just pissed off other companies even more.
By casting legal doubt over SDRAM patents, Rambus also gave companies a vested interest in making it lose: if Rambus won, anybody with SDRAM in their devices could have been liable to pay royalties to Rambus. Despite making noises that they wouldn't collect, Rambus never gave a hard guarantee not to, so the legal department of pretty much every company out there advised their board that anything to do with Rambus would be a big, big risk. So - with the notable exception of SiS - no one else touched it with a barge pole after 2000.