VOGONS


Reply 20 of 126, by HighTreason

Rank Oldbie

A point of interest: even today, most movies only run at 24fps because other rates are simply considered to "not look right" - try playing with a video camera's frame rate settings and see the effect for yourself.

More on topic: 775 is still usable for the time being, but I would definitely think it wise to consider moving away from it within the next 12-18 months at the latest.

My Youtube - My Let's Plays - SoundCloud - My FTP (Drivers and more)

Reply 21 of 126, by smeezekitty

Rank Oldbie

I have already disagreed with the sentiment of higher framerates "not looking right". They only don't look right because most people are used to the archaic framerate.
Personally I get tired of judder and flicker.

-edit-

1024th post!!

Reply 22 of 126, by HighTreason

Rank Oldbie

Personally I just wish they'd pick one and stick to it. It annoys me to no end that there is still a "PAL/NTSC/SECAM" divide (and a million subsets of parameters for each, frame rate included) when it should just be "High Definition Digital TV", the same the world over; there is no reason to have that incompatibility anymore outside of extending DRM bullshit.

My Youtube - My Let's Plays - SoundCloud - My FTP (Drivers and more)

Reply 23 of 126, by candle_86

Rank l33t

Nah, that's like saying we Americans should just switch to metric; people like being different, and so do countries 😁

Reply 24 of 126, by dr_st

Rank l33t
obobskivich wrote:

Nothing I've ever noticed - maybe I'm doing it wrong though.

I wouldn't say you're doing it wrong, but perhaps you are not subjecting it to high loads. Also, it is more obvious when you are comparing to other systems. I may not have considered my C2Ds slow if I didn't have a C2Q and a couple more modern systems to compare to. The C2D is still usable, just not very fun to use. 😀

havli wrote:

Let me put it this way - if your screen is 30cm wide, an object travels at constant speed from one end of the screen to the other in 1 second (@ 30fps). Then every frame moves the object by 1cm. The same process on a 60cm wide screen will result in a 2cm step per frame. Of course this is much less smooth.

Well, the object itself will also be twice as big, so the relative smoothness will be the same. Plus, the bigger the screen is, the farther you are likely to sit from it.
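
A small sketch of the step arithmetic being discussed (illustrative only, using the 30cm/60cm example values from the quote):

# Per-frame step for an object crossing the screen in a fixed time:
# step = screen_width / (crossing_time * fps)
def step_per_frame(screen_width_cm, crossing_time_s, fps):
    return screen_width_cm / (crossing_time_s * fps)

for width_cm in (30, 60):
    step = step_per_frame(width_cm, 1.0, 30)
    # The absolute step doubles with screen size, but as a fraction of the
    # screen (and of the object, which scales with it) it stays the same.
    print(f"{width_cm} cm screen: {step:.1f} cm/frame = {step / width_cm:.1%} of the width")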

alexanrs wrote:

Anyway, I have a 120Hz monitor and I do notice when something messes up my configurations and drops me back to 60Hz while in the desktop (the mouse pointer movement looks a bit less smooth), though I'm not sure I'd notice anything in games;

You are not the only one. I think there is enough evidence to conclude that people do notice, quite easily, the difference between 60FPS and 120FPS. But with the possible exception of competitive fast-paced gaming, I don't believe it actually affects the quality of gameplay.

https://cloakedthargoid.wordpress.com/ - Random content on hardware, software, games and toys

Reply 25 of 126, by smeezekitty

Rank Oldbie
candle_86 wrote:

Nah, that's like saying we Americans should just switch to metric; people like being different, and so do countries 😁

We should. Metric is easy to work with and just makes sense (for the most part, at least).

Being different just to be different only causes headaches.

Reply 26 of 126, by obobskivich

Rank l33t
havli wrote:

My laptop has a similar CPU (c2d T8100) and it is very hard to use because of the slow CPU. Web browsing is sluggish, video playback results in high CPU load (especially online video - flash), Java programming is a pain as well. I guess I'm too spoiled by the i5 performance. 🤣
Multitasking is indeed too demanding for c2d to handle.

My C2D system has no trouble with video - usually under 5% CPU load. It has GPU offload though (it has Quadro FX 1700). I've run into no problems with multi-tasking either, but I guess it depends on what specifically you're doing. I have faster computers available as well, but don't really notice any significant differences in "daily use" kinds of tasks. Again, that's all subjective - my daily use tasks may be "lightweight work" for some folks, and extremely-heavy for others, just as your daily use tasks may be very heavy or very light compared to others.

On Flash running on CPUs, my dual Xeon 2.8GHz (those are NetBurst chips with HT) could wrestle through 1080p on YouTube - it put all four logical processors at 100%, but it was not choppy as long as that's all it was doing. Opening another browser tab, trying to run some other application, etc was asking too much. Anything newer than that system has GPU offload, and thus has no problems with video/multimedia content (and I haven't had a mind to test any of the newer machines with an older GPU, so I have no idea how they'd fare in terms of SW-rendering). I know a lot of integrated/embedded solutions for Core 2/LGA 775 do not have GPU offload capabilities, for example the Intel GMA graphics (excepting the 4500HD) lack a lot of video decoding features, as do the older nVidia and ATi IGPs. Newer IGPs tend to have much better support for video decoding/multimedia stuff, which is a platform advantage for newer systems (like modern Core i* series).

jesolo wrote:

Just as a matter of interest.

How fast do you want the frame rate of your games to be?
Standard film is 24 fps, PAL tv is 25 fps and NTSC tv is 29.97fps.
I'd always thought that so long as the frame rate of your game doesn't drop below 30fps, the graphics should be smooth enough? Or, am I missing something here? Is it because the higher the resolution, the higher the frame rate should be?

Something to add to what's already been said about film and TV content: filmed content is temporally anti-aliased as it is recorded. That is, the camera is actually exposing the film (or, if it's digital, the CCD) for 1/24 or 1/30 or 1/48 or whatever it's set to. That means you're actually capturing everything in the camera's FOV for that period of time. Frames generated by a computer game generally do not have that - they're essentially an instantaneous snapshot. So the frame-rates are not directly comparable there, because there's no "blurring" (and I don't like to use this word because I think most people in the gaming world associate it with LCD ghosting or some other "bad" artifact). Beyond3D has a nice article on this topic, which also includes some 3dfx-related stuff, if you're interested: http://www.beyond3d.com/content/articles/66/
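
A minimal sketch of that difference (my own illustration, not from the article or the post): a film frame integrates the scene over the exposure time, while a game frame is a single instant.

# Instantaneous sample, like a rendered game frame.
def game_frame(position, t):
    return position(t)

# Average of many sub-samples across the exposure window, like a film exposure
# (temporal anti-aliasing / motion blur).
def film_frame(position, t, exposure_s=1/48, samples=32):
    return sum(position(t + i * exposure_s / samples) for i in range(samples)) / samples

position = lambda t: 100.0 * t            # object moving at 100 units/second
print(game_frame(position, 0.5))          # 50.0  - razor-sharp instant
print(film_frame(position, 0.5))          # ~51.0 - smeared across the exposure window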

Generally speaking (meaning "textbook average"), 25-30 FPS is considered to produce the illusion of motion for film and CG content (which is why a lot of filmed content runs in that range). Some people will accept/tolerate lower values, some people will insist upon higher values. The content being reproduced also has a bearing on this - oftentimes people seem to be okay with fairly low frame-rates in RTS and RPG games, whereas in shooters they tend to be less okay. That discussion can become very charged very quickly, because it usually ends up in an argument about perception (e.g. "don't tell me what I see"). For a video-game, it's probably fair to say 30 FPS is a good "floor" for performance. In other words, if the minimum frame-rate is never worse than 30 FPS (e.g. ~33ms per frame at absolute worst), that's a good starting point. Of course some people will still insist upon higher rates, and some people will accept lower rates.
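
The frame-time conversion behind that floor, as a quick sketch (common fps values only, nothing from a specific benchmark):

# Frame time in milliseconds is just 1000 / fps.
for fps in (24, 30, 60, 120):
    print(f"{fps:>3} fps -> {1000 / fps:5.1f} ms per frame")
# 30 fps works out to ~33.3 ms per frame - the "absolute worst" case mentioned above.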

Finally, remember that FPS is a rate over time, just like miles per hour. It isn't an instantaneous value. So you can have "60 fps AVG" in a benchmark, but that can be the result of sections running at 5 FPS and other sections running at 200 FPS, with the total averaging out to 60. That's one issue with published reviews/benchmarks, especially older ones, that look at averages when discussing performance. The drops in frame-rate can come from a variety of sources, including "bad performance" on the computer (e.g. the CPU isn't up to a certain level/map/whatever), as well as background tasks on your machine drawing resources off (and remember that a lot of reviews run fairly "stripped" operating environments vs what most people have going on in the real world). More recently a lot of reviews seem to be paying attention to this, and including min/max/avg frame-rates as opposed to just a simple average, or they'll include a plotted frame-time chart for the benchmark (which imho is usually more data than is necessary unless you're trying to track a specific stutter artifact).
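
A sketch of why a single average can mislead (made-up frame times, not real benchmark data):

# A benchmark "average fps" is total frames / total time, so a slow section can
# hide inside a fast one.
slow = [1000 / 5] * 13      # 13 frames rendered at 5 fps (200 ms each)
fast = [1000 / 200] * 200   # 200 frames rendered at 200 fps (5 ms each)
frame_times_ms = slow + fast
avg_fps = len(frame_times_ms) / (sum(frame_times_ms) / 1000)
min_fps = 1000 / max(frame_times_ms)
max_fps = 1000 / min(frame_times_ms)
print(f"avg {avg_fps:.0f} fps, min {min_fps:.0f} fps, max {max_fps:.0f} fps")
# -> avg ~59 fps even though part of the run crawled at 5 fps; min/max (or a
#    frame-time plot) exposes what the average hides.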

smeezekitty wrote:

At 45 FPS, it may not be the framerate but rather frametime variance that is bothering you. Unfortunately CPU bottlenecks are more likely to cause frame rate variances.



+1. Especially if you have vsync enabled at 45 FPS - it's likely a stutter pattern where the machine is dancing between 16ms and 32ms frames to hit that 45 FPS average. It can look choppy as a result, even though it's a "good average." This is one case where a more detailed frame-time plot could be useful in figuring out what's going on.
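
A sketch of that stutter pattern (my own numbers, assuming a 60Hz display with vsync): frames land on one or two refresh intervals, and a short-short-long repeat averages out to exactly 45 FPS.

refresh_ms = 1000 / 60                                 # ~16.7 ms per refresh at 60 Hz
pattern = [refresh_ms, refresh_ms, 2 * refresh_ms]     # short, short, long, ...
frame_times_ms = pattern * 20
avg_fps = len(frame_times_ms) / (sum(frame_times_ms) / 1000)
print(f"average: {avg_fps:.1f} fps")                   # 45.0, but every third frame takes twice as long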

Since input lag has been mentioned as well - it *can* be related to frame-rate, but it may also just be the engine's input caller that's not doing a good job. RAGE is a really bad offender in this respect (it's the worst I've ever seen measured) - in some cases the input caller can take something like 150-200ms to respond, so even if it were running at 900 FPS, it would still feel "chuggy" because there's a fairly significant latency between buttons being pushed and the commands getting into the engine and going to the renderer to produce an output on the screen. In a lot of cases this input lag value is not static - it can move in response to what else the engine is having to compute. For example if there's a lot of physics or collision in a specific scene, that can worsen things. And that's usually at the worst possible time, because usually the more complex scenes are the "action packed parts" of the game.

Running at stupendously high frame rates over refresh rate can also produce problems in some games. In some cases, the game's internal timing expects something in the 25-60 FPS range, and there may be stability issues or otherwise unpredictable behavior. In other cases, if the frame-rate is substantially higher than refresh rate (I'm talking like 1200 FPS into a 60Hz monitor 🤣), some games appear to exhibit something like the soap opera effect.

And as with anything to do with perception - different people will perceive, experience, etc all of this to different extents. Some people's only interaction with this kind of thing will be their game becoming unstable because the frame-rate is too high, others are very particular about frame-rate and monitor refresh rate.

Some more articles if anyone is interested:
http://www.anandtech.com/show/6857/amd-stutte … r-roadmap-fraps
http://www.eurogamer.net/articles/digitalfoun … -factor-article
http://www.eurogamer.net/articles/digitalfoun … -article?page=3
http://timothylottes.blogspot.com/2013/04/rep … f-light-of.html
http://www.anandtech.com/show/2803

dr_st wrote:

I wouldn't say you're doing it wrong, but perhaps you are not subjecting it to high loads. Also, it is more obvious when you are comparing to other systems. I may not have considered my C2Ds slow if I didn't have a C2Q and a couple more modern systems to compare to. The C2D is still usable, just not very fun to use. 😀



Comparatively speaking, it's probably roughly in the middle of machines that I own - I don't begrudge using it for non-gaming tasks (like writing this post), but I'd agree that it's likely out of the question for running newer DX11 games. As far as "subjecting it to high loads" - that's why I said I think there should be some qualification of exactly what we're declaring LGA 775 systems "obsolete" for. In terms of running GTA 5 or some other heavy console port, I would guess most LGA 775 systems probably aren't up to that task (especially if we're including Pentium 4, Pentium D, Celeron D, etc). But if we're just wanting a machine to install Windows Vista/7 onto and browse the web (but even this is kind of loaded - Vogons for example is a significantly lighter site than YouTube or Hulu), watch a DVD, etc I don't think there's any problem with that as long as you have sufficient memory (2GB should probably be considered a minimum these days, and 4GB+ is ideal), and a GPU that can assist with multimedia content. 😀

alexanrs wrote:

Anyway, I have a 120Hz monitor and I do notice when something messes up my configurations and drops me back to 60Hz while in the desktop (the mouse pointer moviment looks a bit less smooth), though I'm not sure I'd notice anything in games;

You are not the only one. I think there is enough evidence to conclude that people do notice, quite easily, the difference between 60FPS and 120FPS. But with the possible exception of competitive fast-paced gaming, I don't believe it actually affects the quality of gameplay.

My suspicion is that it's more of a gradient from people who don't notice (or genuinely don't care), to people who notice a little bit, to people who notice more significantly, to people who are very sensitive to it. Personally I find 120-144Hz gaming to be somewhat "sharper" or "snappier" feeling, but it's more of a "luxury feature" to me. Just like having your monitor's color temperature set up properly. I can live without it, but having it set "right" does improve the subjective experience. That probably puts me somewhere in the middle of that gradient - I notice it, but it's not a big deal to me. That isn't a universal experience though - some folks are more sensitive to some things. I would assume that it's also probably something that can be trained, so highly competitive/practiced gamers (or people who otherwise have very honed reflexes) probably are more sensitive to this kind of thing than more casual users (like me). 😊

Reply 27 of 126, by Standard Def Steve

Rank Oldbie

I currently don't have a permanent 775 setup--the closest thing I have is a s939 Opteron 185 @ 3.0GHz (or 3.13GHz, when I'm in benchmark smashing mode). In most benchmarks it performs just like a C2D E6600 (or an e6700, when I'm in 3.13GHz benchmark smashing mode 😜).

It's a little too weak for current games, but for general purpose stuff it just flies. It draws most websites up nearly as quickly as my 4.3GHz i7-4930k. It handles the new 60fps 1080p content on YouTube just fine. It handles Netflix HD just fine. Actually, it's quite remarkable how little difference there is between this machine and the i7 when it comes to Internet activity and other non-gaming tasks. The SSD and GTX 560 help it stay young. 😀

94 MHz NEC VR4300 | SGI Reality CoPro | 8MB RDRAM | Each game gets its own SSD - nooice!

Reply 28 of 126, by shamino

Rank l33t

I don't own any LGA775 machine except a useless Dell GX280 SFF board which only supports P4 Prescotts. It's a tiny toaster that consumes over 100W idle, has an overworked and noisy fan, and offers minimal expansion - basically one connector of each type. I have no interest in using that machine and have frankly failed to find any redeeming reason for it to exist.

My nephew has a proper mid-tower sized Dell LGA775 Prescott system which is far better. He plays games on it. They're not the most demanding games though - mainly Minecraft. Videos work fine as far as I know, but it's assisted by a GT240 card. I'd like to upgrade him to a Core2 sometime, but don't see any need for more than that.

I have a socket-940 Opteron board that I use as a file server. I've thought about reconfiguring it to support everyday desktop usage, because it uses half the power of my main desktop and I think it would do just fine for everyday tasks. It would make sense in the summer months. I don't think I'd really need to run the main desktop except to play modern games.
Undervolting on the K8 works really well. They can be made to draw very little at idle, mine only needs to spin the CPU fan intermittently. I expect the same should be true of Core2 chips.

Reply 29 of 126, by obobskivich

Rank l33t
shamino wrote:

Undervolting on the K8 works really well. They can be made to draw very little at idle, mine only needs to spin the CPU fan intermittently. I expect the same should be true of Core2 chips.

Both of my 939 chips play nice with this, even with a fairly basic heatsink. On 775, my 6550 actually doesn't have a CPU fan - it just has a case intake fan and a big CPU heatsink with a "wind tunnel." Works fine. My C2Q's motherboard supports the intermittent fan thing, but the CPU itself never gets cool enough to actually trip that, at least once it's up and running. That is, it may delay turning the fan on for a little while on boot-up, but once the fan starts it's generally going to stay on. The advantage though, is LGA 775 can generally mount very modern sinks - my C2Q has a big heatpipe affair with a 120mm fan, so unless you run the fan up to 100% (which I've never found to be required), you really don't hear much from it (and unfortunately the same cannot be said for the Intel heatsink). Neither of them really seems to heat up the room, for whatever that's worth. 😊

Reply 30 of 126, by Skyscraper

Rank l33t

I want to investigate what happens when you add high speed DDR3 memory to the mix so I bought a Gigabyte P35 Combo board.

[Stolen picture showing Gigabyte EP35C-DS3R]

I have already bought an untested MSI P35 Combo DDR2/DDR3 board, but both DDR2 channels and one DDR3 channel are dead, so that board didn't help much. Truth be told I have only heard negative things about all socket 775 DDR2/DDR3 "combo boards", but I'm betting everybody else is wrong... The board I bought was boxed, which in this case meant 1x motherboard, 1x IDE cable and 1x box. This board has lived all its life in a low power system running a low end Wolfdale 3M dual core, so the board seems to be in perfect shape, which is good since I paid unreasonably much for it.

So far I can report that the northbridge overheats to the point of system freeze if it doesn't get any air flowing over its heatsink; this is with 4 GB DDR3 and an E8400 Wolfdale at stock speed. With some airflow over the heatsink it doesn't even get warm.

New PC: i9 12900K @5GHz all cores @1.2v. MSI PRO Z690-A. 32GB DDR4 3600 CL14. 3070Ti.
Old PC: Dual Xeon X5690@4.6GHz, EVGA SR-2, 48GB DDR3R@2000MHz, Intel X25-M. GTX 980ti.
Older PC: K6-3+ 400@600MHz, PC-Chips M577, 256MB SDRAM, AWE64, Voodoo Banshee.

Reply 31 of 126, by Putas

Rank Oldbie

I have been using this for almost four years now: Asus P5E3 Premium, Q9300@3.25 GHz (433*7.5), 16 GB DDR3
Shadow of Mordor benchmark, 1920x1080, very high quality without motion blur and DOF, R9 270 1000/1470:
average 75.13 fps
minimum 25.71 fps
And that is still enough for me. I have never been stuck with one CPU for this long, but I cannot be bothered to upgrade yet. Prices of the newer i5s are rather high for my taste, and AMD maybe does not even offer an all-around upgrade. Damn Bulldozer - I have never said a bad thing about it until now, but after all these years I am sure it would have been better to just die-shrink the Phenom. Another reason to keep the old platform is that I still use IDE from time to time, and with mobos nowadays even PCI slots are getting scarce.

Reply 32 of 126, by RacoonRider

Rank Oldbie

I don't like modern games much; the most modern thing I played on my Xeon X3230 / Radeon HD 4870 X2 was Risen 2. I had to turn tessellation off and tune AA/AS down to 8x to get a decent framerate. The only difference I noticed was that Patty's curves became a little more rectangular 😁 So as far as I'm concerned, C2D is still a good performer for modern games. As I see it, the C2D is the longest-lasting CPU family in the whole history of the PC.

Still, I think I'm not the only one here who likes older graphics more. The modern sharpness only makes me a little dizzy.

Reply 33 of 126, by RacoonRider

Rank Oldbie
Skyscraper wrote:

I want to investigate what happens when you add high speed DDR3 memory to the mix so I bought a Gigabyte P35 Combo board.

Just FYI, such boards are known to work with DDR3 only with DDR2 already installed.

Reply 34 of 126, by BX300A

Rank Newbie

Here's a guy who compared his Q6600 to an i7-4770K, both systems using GTX 570 graphics.

http://www.legitreviews.com/upgrading-from-in … e-i7-4770k_2247

Going back in the thread a bit regarding input latencies etc, in the music software world 20 milliseconds is considered very high latency. Between hitting a key and hearing a sound, 20ms feels sluggish. 5ms to me feels instant. I'm not as sensitive in video games, but some games definitely feel like there's a rubber band between the keys and the game. Bioshock always felt like this, regardless of frame rate. I'd rather have 60fps and 20ms latency than 120fps and 100ms (for example).
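
A rough sketch of where those millisecond figures come from in audio software (typical buffer sizes and a typical sample rate, not anything from the post): latency is roughly the buffer size divided by the sample rate.

sample_rate = 48000                       # Hz, a common audio interface setting
for buffer_frames in (64, 256, 1024):
    latency_ms = 1000 * buffer_frames / sample_rate
    print(f"{buffer_frames:>4}-sample buffer -> {latency_ms:4.1f} ms")
# 256 samples is ~5.3 ms (feels instant); 1024 samples is ~21 ms (the sluggish range).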

386DX40, Amiga 600, Pentium 75, Celeron 300A, Pentium III-S 1.4, Athlon XP2400+, Pentium 4 I do not care for, Pentium M 780, Core 2 Q6600, i7 3770K

Reply 35 of 126, by GeorgeMan

Rank Oldbie
RacoonRider wrote:
Skyscraper wrote:

I want to investigate what happens when you add high speed DDR3 memory to the mix so I bought a Gigabyte P35 Combo board.

Just FYI, such boards are known to work with DDR3 only with DDR2 already installed.

I've not seen a single regular desktop board that can work with both types of RAM simultaneously.
Not even the super hybrids from ASRock (with DDR1/DDR2, PCIe/AGP and even a CPU upgrade slot for a different socket!)

1. Athlon XP 3200+ | ASUS A7V600 | Radeon 9500 @ Pro | SB Audigy 2 ZS | 80GB IDE, 500GB SSD IDE2Sata, 2x1TB HDDs | Win 98SE, XP, Vista
2. Pentium MMX 266| Qdi Titanium IIIB | Hercules graphics & Amber monitor | 1 + 10GB HDDs | DOS 6.22, Win 3.1, 95C

Reply 36 of 126, by RacoonRider

Rank Oldbie
GeorgeMan wrote:
RacoonRider wrote:
Skyscraper wrote:

I want to investigate what happens when you add high speed DDR3 memory to the mix so I bought a Gigabyte P35 Combo board.

Just FYI, such boards are known to work with DDR3 only with DDR2 already installed.

I've not seen a single regular desktop board that can work with both types of ram simultaneously.
Not even the super hybrids from asrock (with DDR1/DDR2, pci-e/agp and even CPU upgrade slot to a different socket!)

The MSI G41M-P33 Combo, for one, has been repeatedly reported to show such behaviour.

Reply 37 of 126, by Sammy

Rank Oldbie

I have a Core 2 Quad Q9550 with a GTX 570 Ti.

Web browsing and watching videos are no problem; even Battlefield 3 at high details runs at 1920x1080 with 30-50 fps.

But I mostly play older games or adventures, so I think I will stay happy with this machine for many years.

Reply 38 of 126, by calvin

Rank Member

Games aren't really too CPU dependent. I'm running an E6850 (on a DQ45CB, with 5 GB of DDR2) paired with a Radeon HD 6670, which does all the work in games.

2xP2 450, 512 MB SDR, GeForce DDR, Asus P2B-D, Windows 2000
P3 866, 512 MB RDRAM, Radeon X1650, Dell Dimension XPS B866, Windows 7
M2 @ 250 MHz, 64 MB SDE, SiS5598, Compaq Presario 2286, Windows 98

Reply 39 of 126, by Skyscraper

Rank l33t
calvin wrote:

Games aren't really too CPU dependent. I'm running an E6850 (on a DQ45CB, with 5 GB of DDR2) paired with a Radeon HD 6670, which does all the work in games.

If that isn't 2x2GB + 2x512MB, then remove the third wheel (the 1GB stick) and double your memory bandwidth!

New PC: i9 12900K @5GHz all cores @1.2v. MSI PRO Z690-A. 32GB DDR4 3600 CL14. 3070Ti.
Old PC: Dual Xeon X5690@4.6GHz, EVGA SR-2, 48GB DDR3R@2000MHz, Intel X25-M. GTX 980ti.
Older PC: K6-3+ 400@600MHz, PC-Chips M577, 256MB SDRAM, AWE64, Voodoo Banshee.