VOGONS


Reply 40 of 54, by slivercr

User metadata
Rank Member
canthearu wrote on 2020-12-19, 13:52:

I currently use my Core 2 Quad box as an overpowered Windows XP machine as well.

But, it certainly has some legs, and would still do a surprising amount of the stuff I need to do today with OK performance. Running Vista or Windows 7, or even Windows 10, really isn't bad at all on it. Likewise, running Windows XP really isn't a waste either, as there are plenty of newer computers about to run my later software.

As for the Pentium 4 .... bleugh. It was a pretty terrible, marketing-focused design. However, for building retro computers with them, the Northwood and Cedar Mill versions of the P4 are not too bad. The Prescott versions are quite awful, and the original Willamette versions are fun because of their uniqueness and rarity. I always stuck with the Athlon CPUs at the time, switching back to Intel when the Core 2 Quad Q6600 came out.

I have a spare C2Q PC in the office that runs W10; it's paired with a 750 Ti that helps with video decoding. It's still a very capable machine and W10 runs very well. It dual-boots into XP and let me tell you, it's a great XP gaming machine!

Concerning the Pentium 4 being "marketing oriented", I'll continue to play devil's advocate here and say that not all the fault lies with Intel; in general, the consumer is fairly ignorant and equates frequency with performance 🤷🏽
You could argue that the Athlon XP was more "marketing oriented" than the P4, considering the performance rating they came up with. (N.B.: I used AMD exclusively from when the first Athlon dropped until Sandy Bridge.)

Outrigger: an ongoing adventure with the OR840
QuForce FX 5800: turn your Quadro into a GeForce

Reply 41 of 54, by Srandista

User metadata
Rank Oldbie

AMD had to do something (the PR rating), otherwise they wouldn't have been able to compete with the raw frequency of the P4, even though that number didn't necessarily make Intel faster. Unfortunately, for most people, it always comes down to one simple number (GHz, MPix, GB of VRAM, etc.). I really can't blame AMD for this marketing stunt and the introduction of the PR rating.

Socket 775 - ASRock 4CoreDual-VSTA, Pentium E6500K, 4GB RAM, Radeon 9800XT, ESS Solo-1, Win 98/XP
Socket A - Chaintech CT-7AIA, AMD Athlon XP 2400+, 1GB RAM, Radeon 9600XT, ESS ES1869F, Win 98

Reply 42 of 54, by slivercr

User metadata
Rank Member
Srandista wrote on 2020-12-19, 21:19:

AMD had to do something (the PR rating), otherwise they wouldn't have been able to compete with the raw frequency of the P4, even though that number didn't necessarily make Intel faster. Unfortunately, for most people, it always comes down to one simple number (GHz, MPix, GB of VRAM, etc.). I really can't blame AMD for this marketing stunt and the introduction of the PR rating.

Agreed. And that's where Intel's marketing team had its field day.

If you ask some people on the internet though, Intel's marketing team was involved in the design of the Pentium 4 🤷🏽
After they tell you that, they will then suggest a clock-for-clock comparison between the Athlon and the Pentium 4 and tell you that's proof that the P4 was garbage.

Outrigger: an ongoing adventure with the OR840
QuForce FX 5800: turn your Quadro into a GeForce

Reply 43 of 54, by canthearu

User metadata
Rank Oldbie
slivercr wrote on 2020-12-19, 16:44:

Concerning the Pentium 4 being "marketing oriented", I'll continue to play devil's advocate here and say that not all the fault lies with Intel; in general, the consumer is fairly ignorant and equates frequency with performance 🤷🏽
You could argue that the Athlon XP was more "marketing oriented" than the P4, considering the performance rating they came up with. (N.B.: I used AMD exclusively from when the first Athlon dropped until Sandy Bridge.)

True, Intel saw an opportunity during the race to 1 GHz, and both engineering and marketing bought into the whole moar MHz = greatness thing.

There were likely some voices in the background asking how Intel was going to manage the heat generation from these CPUs as the clock speeds got higher, but Intel must have convinced itself that they would be able to fix these problems as they raced towards the mythical 10 GHz. Of course, it all came crashing down at about 3.5 GHz.

Don't forget that the P4 was also developed in those arrogant times when Intel thought they could force Rambus onto everyone. They seriously underestimated the needs of the market, and seriously under-engineered their chipsets at the time.

Reply 44 of 54, by Warlord

User metadata
Rank l33t

"Revelations from former members of the P4's design team, as well as my own off-the-record conversations with Intel folks, all indicate that the P4's design was the result of a marketing-driven focus on clock speeds at the expense of actual performance and scalability. "
source
https://arstechnica.com/features/2004/07/pentium-2/

AMD beat Intel to 1 GHz though, they beat them to x86-64, they beat them, and to this day AMD holds the Guinness World Record for "Highest Frequency of a Computer Processor".

Reply 45 of 54, by slivercr

User metadata
Rank Member
Warlord wrote on 2020-12-20, 02:38:
"Revelations from former members of the P4's design team, as well as my own off-the-record conversations with Intel folks, all i […]
Show full quote

"Revelations from former members of the P4's design team, as well as my own off-the-record conversations with Intel folks, all indicate that the P4's design was the result of a marketing-driven focus on clock speeds at the expense of actual performance and scalability. "
source
https://arstechnica.com/features/2004/07/pentium-2/

AMD beat Intel to 1 GHz though, they beat them to x86-64, they beat them, and to this day AMD holds the Guinness World Record for "Highest Frequency of a Computer Processor".

I stand corrected.
(still think the clock for clock comparison is out of line, though)

Outrigger: an ongoing adventure with the OR840
QuForce FX 5800: turn your Quadro into a GeForce

Reply 46 of 54, by dr_st

User metadata
Rank l33t

Architecture deficiencies aside, there was a brief period, after the introduction of the Pentium 4 HT 3 GHz+ CPUs and before the Athlon 64, when the fastest desktop CPUs on the market were Pentium 4s. If you bought a system during that time, it is quite possible that the P4 was a good buy. The chipsets were also a bit better, as far as I recall.

Maybe the entire tangent of "Pentium 4 sucks" would not have happened if I had used the Athlon XP, not the Pentium 4, to define the start of the "WinXP era". Alas, a P4 is what I had at the time, so that's what I remember.

https://cloakedthargoid.wordpress.com/ - Random content on hardware, software, games and toys

Reply 47 of 54, by Warlord

User metadata
Rank l33t

It's fine, it would have been the same if someone said the Pentium Pro or Pentium MMX was the start of the 9x era, so that's what 9x should run on. Which is the same thing. I could just as well have said that a Tualatin or a 1 GHz PIII runs 9x like a champ, though. Then the person could be like, nah, a 1 GHz PIII is too fast, it's more of a Win 2000 CPU 🤣

In the mid-2000s, back in the day when I was working for a small computer shop, pretty much the month the C2D came out we quit shipping XP boxes with P4s and immediately switched to shipping them with C2Ds. Vista was also a total disaster in its infancy and nobody wanted Vista at all. As most of our customers were business users, they didn't stop running XP until Microsoft started threatening to kill it off and making people switch to 7 - only then did that happen. That was well after Sandy Bridge came out and Intel had moved on from the Nehalem architecture. So that's from my perspective. I don't know what it was like in other countries, but I know in the US people hated Vista, and Windows 8 to a much larger degree, so much that bigger tech companies like Dell offered XP downgrades on new hardware as late as whenever Intel stopped releasing drivers for new hardware. Probably circa 2013.

As far as anyone saying, well, but PCI Express isn't really an XP thing: PCI-E came out on P4 boards with the 915 chipset and ICH6, and SATA as early as ICH5. So if the P4 is an XP thing then certainly PCI-E, SATA and everything else that shipped on P4 boards is a thing, which is pretty much the same stuff that shipped on C2D boards.

Reply 48 of 54, by dr_st

User metadata
Rank l33t

Any Core 2 system has pretty much 100% compatibility with XP, Vista and Win7. Performance-wise, though, lower-end Core 2 systems will be more comfortable with XP, while high-end ones will be equally comfortable with all of them.

If I didn't have other XP computers, I would probably dual-boot XP and Win7 on this one to achieve maximum compatibility.

https://cloakedthargoid.wordpress.com/ - Random content on hardware, software, games and toys

Reply 49 of 54, by slivercr

User metadata
Rank Member

@Warlord, thanks for the link, the whole article (including part 1) was a nice read. While I still stand by most of my opinions (I don't think the P4 is a failed architecture, just a different paradigm), I will revise my statement about it as follows:

The P4 was really just a product of its time, engineered to exploit the public's misconception that MHz = performance.

From 2000 onward Intel sure had a number of blunders (talking about the consumer market): being second to 1 GHz, recalling the 1.13 GHz P3, the Rambus alliance, being second to x86-64, being second to "real" multi-cores... all of this in a span of ~4 years or so. From what I remember of tech sites back then, the general mood towards Intel started to change when people began playing with the Pentium M, around 2003 or so (I've always wanted a 479-to-478 adapter, but could never get my hands on one).

We're close to 4 years since Ryzen was released, and I keep wondering what Intel has up its sleeve. Back then we at least had the Pentium M to hint at the new direction, but this time around 🤷🏽

Outrigger: an ongoing adventure with the OR840
QuForce FX 5800: turn your Quadro into a GeForce

Reply 50 of 54, by The Serpent Rider

User metadata
Rank l33t++
canthearu wrote:

True, Intel saw an opportunity during the race to 1 GHz, and both engineering and marketing bought into the whole moar MHz = greatness thing.

As I mentioned earlier, Intel certainly took some notes from the K6 CPUs (and later from 3DNow). AMD did the same trick when they realized that the K5 couldn't scale much. And they did it again much later with the FX series, but the manufacturing process wasn't on their side that time. Still, they officially had a 5 GHz CPU at that point and Intel didn't.

In the end, Intel hugely overestimated their manufacturing potential, thinking they could do their spin on the "K6" approach much better than AMD, with bold promises of 10 GHz on NetBurst.

I must be some kind of standard: the anonymous gangbanger of the 21st century.

Reply 51 of 54, by slivercr

User metadata
Rank Member
The Serpent Rider wrote on 2020-12-20, 17:15:

As I mentioned earlier, Intel certainly took some notes from the K6 CPUs (and later from 3DNow). AMD did the same trick when they realized that the K5 couldn't scale much. And they did it again much later with the FX series, but the manufacturing process wasn't on their side that time. Still, they officially had a 5 GHz CPU at that point and Intel didn't.

In the end, Intel hugely overestimated their manufacturing potential, thinking they could do their spin on the "K6" approach much better than AMD, with bold promises of 10 GHz on NetBurst.

Since I'm getting educated in architecture lore in this thread, I might as well ask why you insist 3DNow was an influence on SSE? (Or why the P4 was influenced by the K6, for that matter, when the article provided by Warlord is a credible source stating that the longer pipeline was due to marketing considerations of wanting to achieve higher frequencies, not Intel wanting to mimic the K6 line.)

If we talk about SIMD extensions, MMX (1996) was first (in the consumer market) and it surely influenced 3DNow (1998), which was an extension of it. SSE (1999) was completely incompatible with 3DNow (aren't all of 3DNow's extensions deprecated by now?). It seems far-fetched (to me at least) that it was a reactionary measure that took only 1 year to develop. I would think it was developed in conjunction with the P3, which is the CPU it debuted in.

So I'm wondering, is there another nice Ars Technica article or reference you can point me to where it says Intel engineers looked at 3DNow and rewrote it for their CPUs?

Outrigger: an ongoing adventure with the OR840
QuForce FX 5800: turn your Quadro into a GeForce

Reply 52 of 54, by The Serpent Rider

User metadata
Rank l33t++

that the longer pipeline was due to marketing considerations of wanting to achieve higher frequencies

Which was the initial goal of the AMD K6. The K5 just couldn't compete with the Pentium in the MHz race.

that it was a reactionary measure that took only 1 year to develop

3DNow was an attempt to improve the weak floating-point performance of the K6. MMX was a different story - an integer instruction set, which originally couldn't be executed simultaneously with FPU code on the Pentium MMX.
The Pentium 4 also relied quite heavily on SIMD (SSE2 and later SSE3) to improve performance in FPU scenarios, which is where it sucks the most.
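Roughly, here's what that reliance on packed SSE2 math looks like in C with the standard <emmintrin.h> intrinsics (just a quick illustrative sketch of the general technique, the function names are made up and it has nothing to do with actual P4-era code). The packed loop handles two doubles per arithmetic instruction instead of going element by element through the scalar FP path:

#include <emmintrin.h>  /* SSE2 intrinsics */

/* Scalar version: one multiply-add per element, the kind of loop that
 * leans on the P4's comparatively weak scalar/x87 FP throughput. */
static void scale_add_scalar(double *dst, const double *a, const double *b,
                             double k, int n)
{
    for (int i = 0; i < n; i++)
        dst[i] = a[i] * k + b[i];
}

/* SSE2 version: two packed doubles per 128-bit register, so roughly half
 * the arithmetic instructions for the same amount of work. */
static void scale_add_sse2(double *dst, const double *a, const double *b,
                           double k, int n)
{
    __m128d vk = _mm_set1_pd(k);              /* broadcast k into both lanes */
    int i = 0;
    for (; i + 2 <= n; i += 2) {
        __m128d va = _mm_loadu_pd(a + i);     /* load two doubles */
        __m128d vb = _mm_loadu_pd(b + i);
        __m128d vr = _mm_add_pd(_mm_mul_pd(va, vk), vb);
        _mm_storeu_pd(dst + i, vr);           /* store two results */
    }
    for (; i < n; i++)                        /* scalar tail for odd n */
        dst[i] = a[i] * k + b[i];
}

Getting compilers and developers to emit that second kind of loop was exactly what NetBurst was counting on to hide its weak FPU.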

I must be some kind of standard: the anonymous gangbanger of the 21st century.

Reply 53 of 54, by slivercr

User metadata
Rank Member
The Serpent Rider wrote on 2020-12-21, 02:58:

Which was the initial goal of the AMD K6. The K5 just couldn't compete with the Pentium in the MHz race.

Sure, I'm not questioning AMD's goal with the K6. I'm questioning your claim that Intel wanted to mimic the K6, when the decision to boost frequencies was, per the article above, a marketing decision (which I presumed was more likely linked to the K7 hitting 1 GHz before Intel did - EDIT: scratch that as unlikely, since the P4's R&D had to start a couple of years prior to release, obviously).

The Serpent Rider wrote on 2020-12-21, 02:58:

3DNow was an attempt to improve the weak floating-point performance of the K6. MMX was a different story - an integer instruction set, which originally couldn't be executed simultaneously with FPU code on the Pentium MMX.

As I'm sure you know, this was fixed with SSE. Again, I'm not questioning 3DNow's purpose or the P4's reliance on instruction sets; I'm questioning your claim that Intel took inspiration from 3DNow when it would seem 3DNow's biggest improvement over MMX (which you mention above) was already being worked on for SSE.

I'm wondering if there are references, or if we're just trading presumptions back and forth.

Outrigger: an ongoing adventure with the OR840
QuForce FX 5800: turn your Quadro into a GeForce

Reply 54 of 54, by The Serpent Rider

User metadata
Rank l33t++

I'm wondering if there are references, or if we're just trading presumptions back and forth.

That's mostly presumption, but let's be real here - Intel does not exist in a vacuum. And they had already had a lawsuit from Cyrix before, which they settled with a cross-licensing agreement.

I must be some kind of standard: the anonymous gangbanger of the 21st century.