VOGONS


Reply 20 of 53, by m1so

Rank Member

I think many people forget that the chips that go into their "gaming powerhorses" are made by actual working people. Bulldozer might not have been the best architecture, but it definitely wasn't geared toward ramping up core counts and clock speeds for marketing purposes, as some people say. The Pentium 4 wasn't the abomination many people say either (http://www.hardwarezone.com/articles/view.php?id=845). There is a running myth that Athlon XPs were always faster, but the top benchmarks from the period almost always favoured the Northwood P4. Now, the P4 was slower PER CLOCK, but I don't care HOW the power is made as long as it delivers.

Reply 21 of 53, by carlostex

Rank l33t
m1so wrote:

I think many people forget that the chips that go into their "gaming powerhorses" are made by actual working people. Bulldozer might not have been the best architecture, but it definitely wasn't geared toward ramping up core counts and clock speeds for marketing purposes, as some people say. The Pentium 4 wasn't the abomination many people say either (http://www.hardwarezone.com/articles/view.php?id=845). There is a running myth that Athlon XPs were always faster, but the top benchmarks from the period almost always favoured the Northwood P4. Now, the P4 was slower PER CLOCK, but I don't care HOW the power is made as long as it delivers.

This was exactly the timeframe in which Intel did the most damage with their unfair practices. It's funny how that article is mostly benchmarks that don't give a true representation of the performance of non-Intel CPUs. The Pentium 4 was not an abomination, especially Northwood. But not caring how the power is made? It's like accepting a 100-metre race at the Olympics where Usain Bolt has to run in lead shoes while the other competitors run with super-light equipment. Usain Bolt then goes down in history as a mediocre athlete, when in reality he was the best of his time.

Reply 22 of 53, by alexanrs

Rank l33t
m1so wrote:

Bulldozer might not have been the best architecture, but it definitely wasn't geared toward ramping up core counts and clock speeds for marketing purposes, as some people say.

Agreed, it wasn't designed for that, but it ended up having to so Bulldozer could keep up. They basically made HyperThreading's opposite (multiple cores sharing pipeline resources, etc.) without the software to support it (like a compiler that outputs code optimized for it, or a deal with MS to ship a good Day 1 patch so Windows could make better use of the processor's power), in an architecture that didn't quite catch up.

m1so wrote:

The Pentium 4 wasn't the abomination many people say either (http://www.hardwarezone.com/articles/view.php?id=845). There is a running myth that Athlon XPs were always faster, but the top benchmarks from the period almost always favoured the Northwood P4. Now, the P4 was slower PER CLOCK, but I don't care HOW the power is made as long as it delivers.

The P4 was just horribly short-sighted. Intel believed they could ramp up the clock to eternity, but when it turned out there were far too many electrical and thermal issues in doing so, they realized the architecture was doomed. After all, even with Intel's unfair practices, AMD was still delivering comparable-to-superior performance in real-life applications at much lower clocks; all AMD had to do was keep raising their clocks until no amount of compiler optimization could hide their technical superiority back then.

Reply 23 of 53, by devius

Rank Oldbie
m1so wrote:

As for the "max out" gamers, I wonder how they would fare in the Quake/Duke 3D era, when the games already offered 1600x1200 VESA modes that would require a supercomputer to play in the years that they were released.

Funny you should say that. I remember playing Descent for the very first time and being blown away by the option to run it at 800x600!! That was a massively high resolution back in 1995. Most games up to that point used a fixed resolution of 320x200 or 320x240, so being able to run a game at the same high resolution you would use for Windows 3.1 was just insane 🤣 Of course, my PC at the time would only run it at like 5fps, so it wasn't really useful, but still, seeing a game rendered at such a high resolution was impressive!

Reply 24 of 53, by smeezekitty

Rank Oldbie

The Q9550 is still a respectable CPU. If you overclock it you can still put it well above a Haswell i3, even in not-so-well-threaded applications, I think. You can probably run 2013/2014 games just fine if you put in something like a 750 Ti (or AMD equivalent) and don't have some sort of OCD about setting everything to Ultra.

Yeah, it's pretty decent, it seems. I have been playing Portal 2 on high and Metro 2033 on low (picked it up free with the Humble Bundle!).

I don't care much about settings. I just like to be able to run at 1920x1080 because the monitor's scaler is ugly. I have to run Metro at 1600x900 because of my weak GPU, but I hope to upgrade that at Christmas. The CPU stays below 60% except in extremely particle-heavy areas.

On a side note, it has a Doom/Quake feel to it, albeit with much better graphics.

I am on a G41 board, so I don't know how much further the FSB can go.

It runs all the modern games fine with my ATI R290 crossfire setup.

Wow that is a lot of GPU for that CPU!

I'm with the vets on this one; I'm in favour of mid-90's PC hardware.

90s HW seems like the most fun

AMD's big mistake was to assume Intel would follow their SSE5 instruction set proposal. If Intel had accepted, they would have had to send their designs back to the drawing board and/or face a performance penalty, because their microarchitecture was optimized for Intel's own SSE. So Intel rejected SSE5 and instead proposed AVX. The funny thing is that AVX is nothing but SSE with 256-bit registers; it's basically just an SSE upgrade, not the radically new instructions AMD proposed with SSE5. Meanwhile, the FMA instruction debacle provided nothing but more delays for AMD.

I didn't know that. But banking on a competitor's move seems awfully risky.
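To illustrate the "SSE with 256-bit registers" point: in the Intel intrinsics, the AVX packed-float add is the same operation as the SSE one, just widened from a 128-bit XMM register to a 256-bit YMM register. A minimal sketch, assuming a GCC/Clang toolchain and an AVX-capable CPU (compile with -mavx):

#include <stdio.h>
#include <immintrin.h>  /* SSE and AVX intrinsics */

int main(void)
{
    float a[8] = {1, 2, 3, 4, 5, 6, 7, 8};
    float b[8] = {8, 7, 6, 5, 4, 3, 2, 1};
    float r[8];

    /* SSE: one add processes a 128-bit XMM register = 4 packed floats */
    __m128 x = _mm_add_ps(_mm_loadu_ps(a), _mm_loadu_ps(b));
    _mm_storeu_ps(r, x);
    printf("SSE lanes: %g %g %g %g\n", r[0], r[1], r[2], r[3]);

    /* AVX: the same operation on a 256-bit YMM register = 8 packed floats */
    __m256 y = _mm256_add_ps(_mm256_loadu_ps(a), _mm256_loadu_ps(b));
    _mm256_storeu_ps(r, y);
    printf("AVX lanes: %g %g %g %g %g %g %g %g\n",
           r[0], r[1], r[2], r[3], r[4], r[5], r[6], r[7]);

    return 0;
}

Every lane comes out as 9 in both cases; AVX simply processes twice as many lanes per instruction.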


Then there's another problem: AMD still has not recognized, to this day, the importance of an in-house compiler. One of the great weapons in Intel's success is their compiler. Intel can ensure their own processors run properly optimized binaries, while at the same time crippling AMD (or VIA, or any other vendor) performance by making their CPUs run non-optimal code, even though the non-Intel CPU could run the optimal code.

Example:
1. The compiler checks that the CPU vendor string is "GenuineIntel", so it runs the most optimized code path, using the best SIMD instructions available for that CPU.
2. The compiler checks that the CPU vendor string is NOT "GenuineIntel", so it runs another code path using a lower SSE instruction set, or even plain 386 code, even if the top SIMD instructions ARE available on the non-Intel CPU.

This is called unfair CPU dispatching.

Agner Fog even wrote a CPU ID string manipulation tool for VIA CPUs, which allowed changing the CPU vendor string at will. When the VIA CPU's string was changed to "GenuineIntel" there was a significant performance gain. Changing the string to something else would lose the performance again, and changing it to "AuthenticAMD" would make it even slower.

This is why I like open compilers like GCC which don't play in the CPU wars.

I wish that people just wouldn't use Intel's crap compiler.
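To make the dispatch mechanism concrete, here is a minimal C sketch of vendor-string dispatching, using GCC/Clang's <cpuid.h> helper to read CPUID leaf 0 (the two code-path functions are hypothetical stand-ins, not Intel's actual compiler internals):

#include <stdio.h>
#include <string.h>
#include <cpuid.h>  /* GCC/Clang helper for the x86 CPUID instruction */

/* Read the 12-character vendor string ("GenuineIntel", "AuthenticAMD",
   "CentaurHauls" for VIA, ...). CPUID leaf 0 returns it in EBX:EDX:ECX. */
static void get_vendor(char vendor[13])
{
    unsigned int eax, ebx, ecx, edx;
    __get_cpuid(0, &eax, &ebx, &ecx, &edx);
    memcpy(vendor + 0, &ebx, 4);
    memcpy(vendor + 4, &edx, 4);
    memcpy(vendor + 8, &ecx, 4);
    vendor[12] = '\0';
}

/* Hypothetical stand-ins for the two compiled code paths. */
static void simd_optimized_path(void) { puts("best SIMD code path"); }
static void baseline_386_path(void)   { puts("generic x86 code path"); }

int main(void)
{
    char vendor[13];
    get_vendor(vendor);

    /* "Unfair" dispatch: branch on who made the CPU rather than on the
       feature bits CPUID leaf 1 actually reports. A non-Intel CPU with
       the same SIMD support still gets sent down the slow path. */
    if (strcmp(vendor, "GenuineIntel") == 0)
        simd_optimized_path();
    else
        baseline_386_path();

    return 0;
}

A fair dispatcher would test the feature flags CPUID leaf 1 reports (SSE2, SSE3 and so on) rather than the vendor string, which is exactly why rewriting the string on a VIA CPU, as in Agner Fog's experiment above, changed performance at all.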

I think many people forget that the chips that go into their "gaming powerhorses" are made by actual working people.

?
I am not sure how that is relevant.

The Pentium 4 wasn't the abomination many people say either

When it was both slower than the competitor per clock and slower than the PREVIOUS GENERATION per clock, it pretty much was a failure. If the P3 had just been scaled up in clock speed, it would have been better.

Long pipeline = poor design.

The P4 was just horribly short-sighted. Intel believed they could ramp up the clock to eternity, but when it turned out there were far too many electrical and thermal issues in doing so, they realized the architecture was doomed. After all, even with Intel's unfair practices, AMD was still delivering comparable-to-superior performance in real-life applications at much lower clocks; all AMD had to do was keep raising their clocks until no amount of compiler optimization could hide their technical superiority back then.

Pretty much this. It wouldn't have been so bad if they hadn't gone another 5 years without coming up with something better.

Reply 25 of 53, by alexanrs

Rank l33t

Pretty much this. It wouldn't have been so bad if they hadn't gone another 5 years without coming up with something better.

To be fair, they had the Pentium M; they just never released an actual desktop version.

Reply 26 of 53, by leileilol

Rank l33t++

The sixth (1995-99). I'm still torn on whether all that trouble of trial and error for the perfect audiovisual setup was aggravating or exciting. Pixel Shader 2 really killed that off, leaving things to just deciding between two hype machines in the CPU and GPU departments, and for me, nvidia shooting itself in the foot with the FX and Intel's price gouging and marchitecture made that one easy. 😀

long live PCem

Reply 27 of 53, by F2bnp

Rank l33t

Not to worry, Bulldozer is going away next year. Carrizo will be the last product to come out of the Bulldozer architecture. 😀

As far as the question of the topic goes, I'm kinda torn between the late 90's and the late 00's. Early Pixel Shader 1.1 and 2.0 stuff is exciting, but I'm not too fond of the CPUs, be it Pentium 4 or Athlon XP/64. So I love stuff like Morrowind or Far Cry, but certainly not the whole package that comes with them. I think the Core 2 Duo line was simply amazing, and 2006 also saw the introduction of the 8800GTX: a very exciting year. Not to mention Crysis and a ton of other impressive games the following year.
I'd probably go for the late 90's, however; I mean, the fact that I'm here discussing these things every day has to account for something, right? It was certainly exciting, and the swiftly shifting market forced you to upgrade very, very often. But that's part of what makes it special to all of us: we saw progress year to year!

Reply 28 of 53, by HighTreason

Rank Oldbie

Favorite generation: it's between two. First I'd pick the early Athlon era: desktop hardware was absolutely terrible and workstation/server hardware was awesome. It was a fun time watching AMD and Intel fight it out, watching the 3D graphics market mature and decent 3D accelerators start appearing. Despite that, I think it would be the 486 overall - the 486 went from a lowly 16-50MHz all the way to 133MHz (unofficially 160MHz) and things changed drastically; early 486 motherboards had ISA slots (some still 8-bit), later evolved to use VL-Bus, and eventually got PCI and even a Pentium chip in the form of the OverDrive. The software of the 486 era was also brilliant: the games kicked serious ass, the software had intuitive interfaces (at least, most of the time) and the operating system was efficient. This carried over into the Pentium era somewhat.

I think one of the best things about this era was the fact that you could upgrade literally everything: the CPU, cache, RAM (often choosing between two types of memory), drive controller, graphics card (you could upgrade its RAM), sound card (wavetable add-on) and a whole lot else.

Least favorite: mostly putting this here because people were talking about the Pentium 4. I'll stick up for the P4: early P4 systems were slow, hot and overpriced, but the Athlon systems of the era weren't much better - they were cheap, but they broke down quickly and were not stable, and that was assuming you could run one without it catching fire. The later P4s though, probably around the time the 2800 was around, were actually very good. I briefly owned an Athlon 64 X2, but it was garbage and I got shot of it only a couple of weeks into using it. I built a Pentium D system to replace it and the differences were unimaginable: the DDR2 RAM left the aging DDR-400 my X2 used for dead, the processor was stable, and I could render videos in literally a fraction of the time it took on the Athlon 64, even though I was using a crappy FireGL due to it being my first PCI-E system and that being the only card I could get. The Core 2 era, on the other hand, was bullshit. One only has to read the errata document for them to see that the processor is almost completely broken. Let's render a video... Oh, wait, no, that's going to take a long time, we're back in Athlon 64 territory; why is it slower than my Pentium D? Oh, and it's crashed now? Great. Let's upgrade to Windows 7 x64! Yay! Oh, hold on, this software is 32-bit, but that's what the WoW layer is for, right? Sure. Let's run that... Oh, crap, the WoW layer works, but the CPU doesn't, and that's a total lockup of the system... Ah well, at least it... Oh, hang on, under the same cooling system the Pentium D used there's a 20-degree increase in temperature on a CPU running at a 600MHz lower clock speed. Garbage. Easy to see why the server world used Opterons like crazy back then.

My Youtube - My Let's Plays - SoundCloud - My FTP (Drivers and more)

Reply 29 of 53, by Zenn

Rank Newbie
HighTreason wrote:

The Core 2 era, on the other hand, was bullshit. One only has to read the errata document for them to see that the processor is almost completely broken. Let's render a video... Oh, wait, no, that's going to take a long time, we're back in Athlon 64 territory; why is it slower than my Pentium D? Oh, and it's crashed now? Great. Let's upgrade to Windows 7 x64! Yay! Oh, hold on, this software is 32-bit, but that's what the WoW layer is for, right? Sure. Let's run that... Oh, crap, the WoW layer works, but the CPU doesn't, and that's a total lockup of the system... Ah well, at least it... Oh, hang on, under the same cooling system the Pentium D used there's a 20-degree increase in temperature on a CPU running at a 600MHz lower clock speed. Garbage. Easy to see why the server world used Opterons like crazy back then.

You've got to be kidding, right? Comparing Core 2 to Pentium D, the Core 2 was faster at half the clock speed and ran much cooler as well. Show me a review that says a Core 2 was slower AND 20 degrees hotter than a Pentium D?

Reply 30 of 53, by smeezekitty

Rank Oldbie
HighTreason wrote:

Favorite generation: it's between two. First I'd pick the early Athlon era: desktop hardware was absolutely terrible and workstation/server hardware was awesome. It was a fun time watching AMD and Intel fight it out, watching the 3D graphics market mature and decent 3D accelerators start appearing. Despite that, I think it would be the 486 overall - the 486 went from a lowly 16-50MHz all the way to 133MHz (unofficially 160MHz) and things changed drastically; early 486 motherboards had ISA slots (some still 8-bit), later evolved to use VL-Bus, and eventually got PCI and even a Pentium chip in the form of the OverDrive. The software of the 486 era was also brilliant: the games kicked serious ass, the software had intuitive interfaces (at least, most of the time) and the operating system was efficient. This carried over into the Pentium era somewhat.

I think one of the best things about this era was the fact that you could upgrade literally everything: the CPU, cache, RAM (often choosing between two types of memory), drive controller, graphics card (you could upgrade its RAM), sound card (wavetable add-on) and a whole lot else.

I agree about the 486. It's fun to have lots of upgrade options, as you said. And (well-supported) ISA slots next to PCI slots on the newer boards.

Least favorite: mostly putting this here because people were talking about the Pentium 4. I'll stick up for the P4: early P4 systems were slow, hot and overpriced, but the Athlon systems of the era weren't much better - they were cheap, but they broke down quickly and were not stable, and that was assuming you could run one without it catching fire.

Ok. I'll give you that. 2001-2005 was kind of a lousy time for CPUs.

The later P4s though, probably around the time the 2800 was around, were actually very good.

"very good"

In what regard?

The Core 2 era, on the other hand, was bullshit. One only has to read the errata document for them to see that the processor is almost completely broken. Let's render a video... Oh, wait, no, that's going to take a long time, we're back in Athlon 64 territory; why is it slower than my Pentium D?

I am totally puzzled by this! There was some old P4-optimized software that was tuned for the long pipeline and high clock speed, and it sucked on pretty much anything other than a P4. But by "sucked", the worst of it was that it didn't gain as much performance as other software, not that it would actually run slower.

Oh, and it's crashed now? Great.

Lots of Core 2 era hardware is still in use even now. Why isn't this crashing problem prevalent?

Let's upgrade to Windows 7 x64! Yay! Oh, hold on, this software is 32-bit, but that's what the WoW layer is for, right? Sure. Let's run that... Oh, crap, the WoW layer works, but the CPU doesn't, and that's a total lockup of the system... Ah well, at least it... Oh, hang on, under the same cooling system the Pentium D used there's a 20-degree increase in temperature on a CPU running at a 600MHz lower clock speed. Garbage. Easy to see why the server world used Opterons like crazy back then.

Again this doesn't seem to make sense unless you had a bad chip.

Using a C2D E6700 before and now a Q9550, system crashes are *very* rare. I also upgraded from a Celeron D (a P4 with next to no cache 😵) to the Core 2 Duo, and I saw nearly a 20°C DECREASE in temperature with the same heatsink.

Now with the C2Q in it, I am seeing about the same temperatures as with the original CPU. So four cores that are individually faster than the single core, running at the same temperature, seems like a fair trade to me.

You've got to be kidding, right? Comparing Core 2 to Pentium D, the Core 2 was faster at half the clock speed and ran much cooler as well. Show me a review that says a Core 2 was slower AND 20 degrees hotter than a Pentium D?

^ this

He must have had a bad chip or something.

Reply 31 of 53, by carlostex

Rank l33t

I guess everybody had different experiences.

Someone said early Pentium 4's were bad but the Athlons weren't much better. It's true that Athlons ran hot, but they had a higher temperature tolerance than the Willamette Pentium 4's. I had a Palomino Athlon XP 1700+ back in the day: hot little f***er, but stable as a motherf***er. It was running on a Soltek SL-75KAV, and it was the finest computer I had, with my AMD 386DX-40 coming in a close second. My neighbour assembled the exact same machine, because we bought them at the same place, and he recalls that computer being one of the best systems he ever had too.

I remember my uncle had bought a Pentium 4 1800 Willamette, so I tried my favorite game at the time, Operation Flashpoint, on his computer. I was surprised to see how much slower and jerkier the game ran than on my Athlon XP. The first Pentium 4 that felt faster and snappier than my system was a friend's 2.4GHz Northwood.

A couple of years later my brother got a 3.0GHz Northwood with an Asus P800 motherboard. Some time later I got a Socket 754 Athlon 64 3000+. Even though I was running that CPU on a crappy ASRock motherboard, I left my brother's system eating dust.

Today I'm still rocking my Penryn Core 2 Duo, which has been running at 3.6GHz for almost 6 years. It has easily been the best Intel system I've had in years. I plan to upgrade in the second half of 2015.

Reply 32 of 53, by HighTreason

Rank Oldbie

I'll give the Core 2 just one thing: it was good at running games. Past that, it's broken.

I think a bad chip is out of the question. My laptop uses an original Core chip; that works for what it does, but it's slow. I have a small desktop with a Core 2 Duo E-something; that one is garbage and will not run stably. I used to use it for streaming to Twitch, but it was very slow and very unstable, as it turns out compression is a weak point for those chips. The motherboard for my Pentium D failed and I had to drop in a Core 2 Q8600 on a new motherboard... Not good; it takes longer to render video, which is the sole purpose of that machine. All of them crash rather often. There has to be some reason the server/workstation world stayed away from that architecture; I suspect the poor performance, instability and severely flawed cache architecture are contributing factors, as Core 2 chips really don't perform well in that environment.

As for the Athlon chips, the main issue I noticed was that when they started putting them in sockets, some motherboards failed to supply enough power for the faster chips. This wasn't inherently the processor's fault, but the majority of motherboards from that time really weren't great, and for how popular the Athlon supposedly was, there certainly appear to be a lot more P4 systems out there.

My Youtube - My Let's Plays - SoundCloud - My FTP (Drivers and more)

Reply 33 of 53, by Standard Def Steve

Rank Oldbie

Socket 939, baby! I built my first "enthusiast" machine on that platform and still use it as an HTPC for the basement. It's actually quite amazing how fast the old rig is on today's internet. The dual-core Opteron 185 @ 3GHz still has enough power to handle things like 1080p YouTube and 34Mb/s MKVs in software (the G80-based 8800GTS does not support hardware H.264 decoding, so it's all on the CPU).

On most of the sites I visit, there is not much of a difference between that old 3GHz S939 machine and my main rig (a 4930K @ 4.3GHz). Of course, a 25Mb/s pipe may not be fast enough to really show the difference between the two computers. I used that Opteron as my main computer until 2013!

As for the P4: whenever I plug in my 3.73GHz P4, which is only a year older than the S939 machine, performance drops off a cliff. Scrolling in Firefox frequently dips below 30fps @ 2560x1440 no matter what video card is used. Software decoding of 1080p Flash video is much too slow to be enjoyable. Even navigating Windows (7) Explorer just doesn't have quite the "snap" that the S939, C2D, and i7 units have. I don't have a Pentium D, so I'm not sure how much of a difference adding a 2nd Netburst core would make. As it is now, though, I must say that even with a 1066MHz FSB and clocked sky-high, the Pentium 4 is not much of a performer compared to just about any other CPU--Intel or AMD--even when they're from roughly the same time frame and clocked much lower.

I also used a 1.8 Willamette as my main rig from 2002-2006, but was never happy with its performance. That was partly my fault for running it on a PC133 i845 board, but for something running at a whopping 1800MHz, I always felt it should've performed much better. Going from that Willamette system to the Opteron 185 back in 2006 actually felt like a bigger upgrade than my more recent switch to a 4930K. The downright massive improvement in performance was just astonishing! Oh sure, the 4930K is a lot faster than the Opty in CPU-intensive apps and games (around 11x faster at HQ x264 encoding; I don't even know how that's possible), but when it comes to general-purpose/web usage, there isn't much of a difference between the two. Especially now that the Opteron system has a 250GB 840 Evo in it!

94 MHz NEC VR4300 | SGI Reality CoPro | 8MB RDRAM | Each game gets its own SSD - nooice!

Reply 34 of 53, by smeezekitty

Rank Oldbie
HighTreason wrote:

I think a bad chip is out of the question. My laptop uses an original Core chip; that works for what it does, but it's slow. I have a small desktop with a Core 2 Duo E-something; that one is garbage and will not run stably. I used to use it for streaming to Twitch, but it was very slow and very unstable, as it turns out compression is a weak point for those chips. The motherboard for my Pentium D failed and I had to drop in a Core 2 Q8600 on a new motherboard... Not good; it takes longer to render video, which is the sole purpose of that machine. All of them crash rather often. There has to be some reason the server/workstation world stayed away from that architecture; I suspect the poor performance, instability and severely flawed cache architecture are contributing factors, as Core 2 chips really don't perform well in that environment.

I am very genuinely confused by your results. I have not seen any of this, and neither have many thousands of other C2D owners. The only time I see instability is with a dodgy driver or if I overclock too far. There were certainly workstations and servers with Core 2 processors, but they were rebranded as Xeon.

Yes, there is a big list of errata. But did you look at the errata list for the Pentium 4? It is almost just as long. CPU bugs do exist, but OSes are updated to work around them as much as possible.

Socket 939, baby! I built my first "enthusiast" machine on that platform and still use it as an HTPC for the basement. It's actually quite amazing how fast the old rig is on today's internet. The dual-core Opteron 185 @ 3GHz still has enough power to handle things like 1080p YouTube and 34Mb/s MKVs in software (the G80-based 8800GTS does not support hardware H.264 decoding, so it's all on the CPU).

I don't have much experience with Socket 939. I have a 939 board in a box, but I don't have a CPU to test it with. I do have an AM2 board with an Athlon X2 on it, and it seems pretty good.

Reply 35 of 53, by Skyscraper

Rank l33t

The discussion of the performance difference between Pentium D and Core 2 Duo is interesting.

I have always found the Core 2 Duo faster in games, but not always in other tasks. It depends on what type of Core 2 Duo you compare the Pentium D with. The Pentium D was very cheap after the Core 2 Duo release, and a Pentium D 940/960 beats the low-end Core 2 Duos like the E2*** and E4*** in many tasks. The Core 2 Duo E6600 and faster, with 4MB cache and a 1066MHz FSB, left the Pentium D behind performance-wise in just about everything.


New PC: i9 12900K @5GHz all cores @1.2v. MSI PRO Z690-A. 32GB DDR4 3600 CL14. 3070Ti.
Old PC: Dual Xeon X5690@4.6GHz, EVGA SR-2, 48GB DDR3R@2000MHz, Intel X25-M. GTX 980ti.
Older PC: K6-3+ 400@600MHz, PC-Chips M577, 256MB SDRAM, AWE64, Voodoo Banshee.

Reply 36 of 53, by HighTreason

Rank Oldbie
smeezekitty wrote:

I am very genuinely confused by your results. I have not seen any of this, and neither have many thousands of other C2D owners. The only time I see instability is with a dodgy driver or if I overclock too far. There were certainly workstations and servers with Core 2 processors, but they were rebranded as Xeon.

Yes, there is a big list of errata. But did you look at the errata list for the Pentium 4? It is almost just as long. CPU bugs do exist, but OSes are updated to work around them as much as possible.

Nobody used them back then, absolutely nobody. I know of only a single business within a 20-mile radius that used Core 2 processors, and they had nothing but trouble with them... The length of the errata document has nothing to do with it; it's that there are serious flaws regarding 32-bit code running on the processor in 64-bit mode, and serious issues with multi-threaded applications crossing cache boundaries on quad-core processors, causing them to completely lock up. There was also a serious thermal issue where the diode would shut off if the computer went into standby and would not come back on once the system was resumed, which was an absolute disaster on a processor that had thermal problems - good thing they have throttling.

As stated above, the Core 2 was brilliant at gaming, but I'm not a gamer; my cut-off point for games (other than The Sims) is somewhere in the mid-1990s. In a workstation environment the performance and stability simply aren't there, and I can only report what I see: my Core 2 Quad Q8400 simply takes much longer to render 720p30 Xvid videos than the Pentium D "QKDH" ever did, even using OpenCL and an updated version of Vegas.

My Youtube - My Let's Plays - SoundCloud - My FTP (Drivers and more)

Reply 37 of 53, by alexanrs

Rank l33t
HighTreason wrote:

Nobody used them back then, absolutely nobody. I know of only a single business within a 20-mile radius that used Core 2 processors, and they had nothing but trouble with them... The length of the errata document has nothing to do with it; it's that there are serious flaws regarding 32-bit code running on the processor in 64-bit mode, and serious issues with multi-threaded applications crossing cache boundaries on quad-core processors, causing them to completely lock up. There was also a serious thermal issue where the diode would shut off if the computer went into standby and would not come back on once the system was resumed, which was an absolute disaster on a processor that had thermal problems - good thing they have throttling.

Is this on the first series? Like the E2XXX and E4XXX? I have a Pentium D 3.00GHz that I upgraded to a C2D E6700 2.66, and there wasn't a single task where the PC didn't feel faster, be it gaming or university assignments using MATLAB, on both 32 and 64-bit systems. (I used to have Windows x64 installed, but then I found out the crappy VIA chipset could not address over 4GB, so I reverted to 32-bit on that machine.)
Also, at my dad's job there is a C2D domain "server" (a desktop PC used as a server) which also doubles as a file server. Its HDD sometimes screams for help (no RAID =/), but the PC has been rock solid for years.

HighTreason wrote:

As stated above, the Core 2 was brilliant at gaming, but I'm not a gamer; my cut-off point for games (other than The Sims) is somewhere in the mid-1990s. In a workstation environment the performance and stability simply aren't there, and I can only report what I see: my Core 2 Quad Q8400 simply takes much longer to render 720p30 Xvid videos than the Pentium D "QKDH" ever did, even using OpenCL and an updated version of Vegas.

Netburst-optimized codecs, maybe? I cannot think of any other explanation, especially because you also had performance issues with the Athlon 64, which SHOULD match the P4 in raw processing power.
I'd love it if you could get numbers... if you have the disposition, of course. If you still own the Pentium D chip, any C2D/C2Q-compatible motherboard should accept it, and you could try rendering the exact same project on each processor and compare the times.

Reply 38 of 53, by HighTreason

Rank Oldbie

The Core Duo (recalled Dell laptop) is a T2300. I tend to ignore that one, given the status of the machine it's in... Plus the Dell logo kinda says it all.

The Core 2 Duo is an E8200; briefly it had an E45 in, but I never even tested that one past it overheating with the Pentium's heatsink on top - the Pentium used liquid cooling, and the E45 died in the process. This was built back when the Pentium D was stable, and it was noticeably slower when rendering/encoding. Usually Xvid, but I noted it with H.264 and VP6 too, in both Vegas and FMLE. Some of that may be down to the Intel motherboard it runs on, though the capture board does some of the encoding onboard in that one, where the Intensity in the Pentium D didn't. I can't find the model number on it now, but I know they used it in the RM One.

The Core 2 Quad is a Q8400; it seems almost identical to the E8200 overall, maybe marginally faster, and has more RAM. Same chipset (P35), which I'm not a huge fan of, on an MSI motherboard that I don't know for sure the model # of... P35 Neo 2 gamer-sausage-fest type crap that I don't like, with silly LEDs all over it.

The Pentium D is... QKDH; that's all I know, because it's an ES chip. I had it before they were released, and it wasn't cheap. It does perform somewhat differently than the retail models, and HyperThreading can be switched on (it's broken though, so don't do that). It runs at 2.80GHz and most software detects it as a 920, so I assume that's what it most closely resembles. Even in PassMark it came back with a higher score in compression and encoding. That thing lasted 9 years, and I'm still trying to repair the motherboard. The processor will not work in the MSI motherboard; the board simply will not POST despite claiming to support the chip - could be due to it being an ES, I suppose, or the P35 chipset is a bigger waste of space than I thought it was.

When I had the Athlon 64 I used to encode in MPEG-2, pretty much just DVD - even 720x576 resolution - and it was not good at it. That one used to burn up its RAM every week, ruined its GPU twice, and often one core would halt... This would persist even if you shut the machine off and started it again: Task Manager reported that one core was at 100% load, and assigning threads to it would instantly crash whatever program it was; sometimes a vital part of the operating system would be put there, or both cores would lock up. That was another ES and is marked "Athlon 64 X2 3200+", which we probably both know does not exist... It runs at 2GHz (most of the time; I've seen it report 1.8 on numerous occasions) and has a 512K cache, so I guess it would be the 3800+, and it benchmarks identically... I did use a retail 4200+ and it appears to have fixed a few of the problems, but it simply does not have the power of the Pentium in the real world, though I was confused because it yielded a higher score in most tests. The stability was a turn-off though, and the FX-53/55? I briefly swapped it for was worse. I didn't like the Athlon 64 chips on the 940 platform (as this socket is marked; it seems to be 939 with an extra pin - actually, it looks like the socket the Opteron went into), though the 754 chips were a decent budget option despite the ancient memory they needed, which was very expensive at the time. I remember, though, how my dual P3 system could actually outrun it sometimes. I can let so much of it go due to it being unfinished, but as I said, the faster model of the retail version was no better, and they both spat out videos with serious problems.

I know a worse processor than the 64... the Phenom X3. Now that was a real joke, but I'm not going there. This is why I rate that whole generation as awful.

As I say, I can only go on what happened when I used these things and what was relayed to me by people in the only nearby similar environment that was using them. I suppose it'll all be irrelevant soon, as I'll either move to an i7 or an E3, which smokes everything pretty well.

My Youtube - My Let's Plays - SoundCloud - My FTP (Drivers and more)

Reply 39 of 53, by alexanrs

Rank l33t

Yeah, maybe the mobo had something to do with it as well. Before Nehalem integrated the memory controller, the motherboard/chipset had a greater impact on performance (I hate you, VIA!!)... and more often than not these "super-gaming-sparkly-mobos" try to squeeze out too much gaming performance by cutting corners, and I can see that hurting stability, compatibility (like you not being able to put your Pentium D in it), and perhaps even performance in non-gaming-oriented tasks.

BTW, did you OC that Pentium D? My Pentium D 925 worked just fine with crappy air cooling (it wasn't the Intel cooler, but rather something even crappier, noisier and without speed control) and I don't recall it having thermal issues.

About the Athlon 64... I only have experience with the single-core variant on Socket 754, so I can't say anything about the X2 issues... also, most of what I did back then was gaming. It served me well, though.

BTW, since it looks like you do a lot of encoding, is the C2Q the latest machine you have, or do you have Nehalem+ stuff too? I'm asking because if I were facing half of the issues you are, I would've gone insane by now.