Reply 20 of 42, by Snover
Now that will happen. Me, I'd be infinitely happy with a single-processor Athlon64 (754 or 939, leaning toward 939 now that they're getting cheaper) board with a PCI-X slot.
Yes, it’s my fault.
oh man... 939 all the way
Dual core will never be seen on Socket 754.
I'd settle for a dual core Socket 939 with two PCI-E x16 slots, how about you?
The Tyan Thunder S2895 will have two CK8-04 (nForce4) chips and support SLI with two PCI-E x16 slots, but the Thunder boards usually go for more than $400. Slap in a couple of dual core Opterons and you'll have your quad-CPU box with two PCI-E x16 slots.
Now you're talking 😜
Of course at that point you'd have to get a PC Power and Cooling full tower, a 6U rackmount, or something larger... because you'd need two PSUs for something like that.
I think a single Socket 939 with a dual core chip would be enough for hardcore gamers, though. How many games could take advantage of more than 2 cores?
That's one of my worries about the upcoming multi-core CPUs. Supposedly their speeds will be lower than or equal to current processor speeds due to the increased heat. That's one of the main reasons why we haven't seen the massive speed increases we've seen in the past: heat. Unfortunately, not many games take advantage of dual-processing/HT/multi-core, and I don't see them doing so in the future either.
Seriously, do you really expect game developers to take advantage of multi-core?
What I am glad to see is, finally, the increase in FSB/memory speeds. Maybe with this speed issue we may eventually see our FSB/memory speeds catch up with processing speeds, which would be really cool.
That's another reason why multi-socket configs won't have the greatest advantage... It's called VapoChill. Potentially one 4-core CPU would be better than anything else...
Yeah, but clock speed isn't everything for memory... higher MHz usually means increased latency.
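The MHz-vs-latency tradeoff is easy to see with a bit of arithmetic: absolute CAS latency is CAS cycles times the cycle time, so a higher-clocked module with a looser CAS rating can actually be slower to respond. A minimal sketch, using illustrative DDR400/DDR500 CAS figures rather than any particular module's specs:

```python
# Back-of-the-envelope look at why higher memory MHz can mean worse
# absolute latency. CAS figures here are illustrative, not measured.

def cas_latency_ns(mem_clock_mhz, cas_cycles):
    """Absolute CAS latency in nanoseconds: cycles * cycle time."""
    cycle_time_ns = 1000.0 / mem_clock_mhz
    return cas_cycles * cycle_time_ns

# DDR400 runs its memory clock at 200 MHz, DDR500 at 250 MHz.
ddr400_cl2 = cas_latency_ns(200, 2)  # 2 cycles * 5 ns = 10 ns
ddr500_cl3 = cas_latency_ns(250, 3)  # 3 cycles * 4 ns = 12 ns

print(f"DDR400 CL2: {ddr400_cl2:.0f} ns")
print(f"DDR500 CL3: {ddr500_cl3:.0f} ns")
```

So the "faster" DDR500 module responds 2 ns later than the DDR400 one in this sketch, even though its peak bandwidth is higher.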
wrote: 2 processors and 2 graphics cards.... How much ass would that kick?
Not sure how much ass, but it sure would cost a lot 😀
Keep in mind, though, that a big disadvantage that a multi core Opteron has against a system with discrete Opterons is that the cores share the memory controller and the same bank of system RAM. Two separate Opterons each have their own memory controllers and potentially their own banks of system RAM.
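The point above can be sketched with simple arithmetic: two cores behind one on-die memory controller split that controller's bandwidth, while two discrete Opterons each bring their own controller (and potentially their own RAM bank). The bandwidth figure below is an assumption for illustration (dual-channel DDR400 at roughly 2 x 3.2 GB/s per controller), not a benchmark:

```python
# Back-of-the-envelope comparison of per-core peak memory bandwidth:
# a dual-core Opteron shares one controller, two single-core Opterons
# each have their own. Figures are illustrative assumptions.

CONTROLLER_BW_GBS = 6.4  # assumed: dual-channel DDR400, ~2 x 3.2 GB/s

def per_core_bandwidth(cores, controllers, bw_per_controller=CONTROLLER_BW_GBS):
    """Peak memory bandwidth available to each core, in GB/s."""
    return controllers * bw_per_controller / cores

shared = per_core_bandwidth(cores=2, controllers=1)    # dual-core, one socket
discrete = per_core_bandwidth(cores=2, controllers=2)  # two single-core sockets

print(f"Dual-core Opteron, shared controller: {shared:.1f} GB/s per core")
print(f"Two discrete Opterons: {discrete:.1f} GB/s per core")
```

In practice NUMA effects, interleaving, and HyperTransport traffic complicate the picture, but the peak-bandwidth split is the gist of the disadvantage.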
Which is also why the Opteron architecture kicks Xeon's ass 😀
In addition to the Opteron's HyperTransport, increased number of general-purpose registers, wider 64-bit registers, etc...
Speaking of historic hardware, I'd have to say that the Opteron is one scandalous little processor... outdoing Mac on its home turf of content creation in even the most holy of apps: Photoshop. Beats Xeon too.
http://www.barefeats.com/g5op.html
http://www.gamepc.com/labs/view_content.asp?i … cookie%5Ftest=1
But you've probably seen that before.
Need I mention the Star Wars movies? 😜. Nothing like Opterons for Blade Servers for your new renderfarm or compilefarm.
American Micro Devices
Assembled in Malaysia. hahahaha.
Actually, the "A" in AMD stands for "Advanced" 😀
And Pixar is using G5 clusters for rendering.
Ahh, another reason not to like Disney or their minions. Excellent!
The rendering for PRMan is done on Apple G5s, but it isn't the only RenderMan-compliant implementation. Most of the rendering done with RenderMan isn't done by Pixar, but by their customers. And anyone going out TODAY looking for hardware for a new renderfarm would be a major dumbass to choose G5s, based on hours per frame, power consumption, cluster density, and multiple-vendor support. Pixar uses Apple because of a longstanding relationship which makes for good business deals, not because of architectural superiority. Besides, it's possible, though unlikely, that Apple may one day use Opterons. They switched from 680x0 to PowerPC at one point, because Motorola just didn't cut it. Of course a lot of Motorola chips ride their merry way into cash registers, vacuum cleaners, sex toys (Sybian :p), so what's the inclination to make faster chips?
Anyways... sorry for ranting. I hate Apple, if only because of all the dumbasses who incessantly made shitty 'music' at school with SimpleText's voice.
The G5 is not a Motorola chip. It is an IBM chip, based on the Power 4, as in several of the Blue machines (Deep Blue, etc.). Don't dismiss the G5 so readily. When applications are optimized for it, the G5 is impressive.
I do believe the fact that the Power 4 and 5 based chips used in Apple's machines weren't made by Motorola was my point. 😅
ribbon13: Ever used FruityLoops Studio? 😀
*Nervously looks around*
...
Maybe.
Sound Forge
Cakewalk
Cool Edit Pro
Why?
Oh, well, it has a vocoder & speech synth 😉
If any of my old classmates break into my house just to start making music with Fruity Loops, saying "Poo poo poo poo poo" to the tune of "In the Hall of the Mountain King", I will know whom to blame, Snover.
😜
Staying on the subject of hardware: my New Year's resolution is to buy my girlfriend a Tyan Thunder K8WEX S2895 as an anniversary or birthday present.
Um, I'm not your girlfriend, nor am I a girl, but could you buy one for me, too? ^__^ It'd be super swell if you could throw in a 3ware SATA 4-channel 9500S-4LP controller and 4 Seagate 7200.8 400GB ST3400832AS drives too, as well as a couple Opteron 250s and a ton of high-quality ECC RAM.
Seriously. I'll be your slaaaaaaave.