havli wrote:
My laptop has a similar CPU (C2D T8100) and it is very hard to use because of the slow CPU. Web browsing is sluggish, video playback results in high CPU load (especially online video - Flash), and Java programming is a pain as well. I guess I'm too spoiled by the i5 performance. 🤣
Multitasking is indeed too demanding for the C2D to handle.
My C2D system has no trouble with video - usually under 5% CPU load - though it does have GPU offload (a Quadro FX 1700). I've run into no problems with multi-tasking either, but I guess it depends on what specifically you're doing. I have faster computers available as well, but don't really notice any significant differences in "daily use" kinds of tasks. Again, that's all subjective - my daily use tasks may be "lightweight work" for some folks, and extremely heavy for others, just as your daily use tasks may be very heavy or very light compared to others.
On Flash running on CPUs, my dual Xeon 2.8GHz (those are NetBurst chips with HT) could wrestle through 1080p on YouTube - it put all four logical processors at 100%, but it was not choppy as long as that was all it was doing. Opening another browser tab, trying to run some other application, etc was asking too much. Anything newer than that system has GPU offload, and thus has no problems with video/multimedia content (and I haven't had a mind to test any of the newer machines with an older GPU, so I have no idea how they'd fare in terms of SW rendering). I know a lot of integrated/embedded solutions for Core 2/LGA 775 do not have GPU offload capabilities - for example, the Intel GMA graphics (excepting the 4500HD) lack a lot of video decoding features, as do the older nVidia and ATi IGPs. Newer IGPs tend to have much better support for video decoding/multimedia stuff, which is a platform advantage for newer systems (like the modern Core i* series).
jesolo wrote:
Just as a matter of interest.
How fast do you want the frame rate of your games to be?
Standard film is 24 fps, PAL TV is 25 fps, and NTSC TV is 29.97 fps.
I'd always thought that so long as the frame rate of your game doesn't drop below 30fps, the graphics should be smooth enough? Or, am I missing something here? Is it because the higher the resolution, the higher the frame rate should be?
Something to add to what's already been said about film and TV content: filmed content is temporally anti-aliased as it is recorded. That is, the camera is actually exposing the film (or, if it's digital, the CCD) for 1/24 or 1/30 or 1/48 of a second or whatever it's set to. That means you're actually capturing everything in the camera's FOV for that period of time. Frames generated by a computer game generally do not have that - they're essentially an instantaneous snapshot. So the frame-rates are not directly comparable, because there's no "blurring" (and I don't like to use this word because I think most people in the gaming world associate it with LCD ghosting or some other "bad" artifact). Beyond3D has a nice article on this topic, which also includes some 3dfx-related stuff, if you're interested: http://www.beyond3d.com/content/articles/66/
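If it helps make the difference concrete, here's a rough sketch in Python (purely illustrative - render_scene() is a made-up stand-in, not any real API). A camera effectively averages many instants across its shutter interval; a game typically renders exactly one:
[code]
# Purely illustrative: "camera-style" frame vs. game-style snapshot.
# render_scene(t) is a hypothetical function returning pixel values at instant t.

def camera_frame(render_scene, t_start, shutter=1/48, samples=8):
    """Average several snapshots spread across the shutter interval."""
    accum = None
    for i in range(samples):
        t = t_start + shutter * (i / samples)  # instants within the exposure
        frame = render_scene(t)                # one instantaneous snapshot
        accum = frame if accum is None else [a + p for a, p in zip(accum, frame)]
    return [a / samples for a in accum]        # temporally anti-aliased result

def game_frame(render_scene, t):
    """A game usually just renders the single instant t - no exposure window."""
    return render_scene(t)
[/code]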
Generally speaking (meaning "textbook average"), 25-30 FPS is considered enough to produce the illusion of motion for film and CG content (which is why a lot of filmed content runs in that range). Some people will accept/tolerate lower values, some people will insist upon higher values. The content being reproduced also has a bearing on this - oftentimes people seem to be okay with fairly low frame-rates in RTS and RPG games, whereas in shooters they tend to be less okay. That discussion can become very charged very quickly, because it usually ends up in an argument about perception (e.g. "don't tell me what I see"). For a video game, it's probably fair to say 30 FPS is a good "floor" for performance. In other words, if the minimum frame-rate is never worse than 30 FPS (i.e. ~33ms per frame at absolute worst), that's a good starting point. Of course some people will still insist upon higher rates, and some people will accept lower rates.
Finally, remember that FPS is a rate over time, just like miles per hour. It isn't an instantaneous value. So you can have "60 FPS AVG" in a benchmark, but that can be the result of some sections running at 5 FPS and other sections running at 200 FPS, with the average working out to 60. That's one issue with published reviews/benchmarks, especially older ones, that look at averages when discussing performance. The drops in frame-rate can come from a variety of sources, including "bad performance" on the computer (e.g. the CPU isn't up to a certain level/map/whatever), as well as background tasks on your machine drawing resources away (and remember that a lot of reviews run fairly "stripped" operating environments vs. what most people have running in the real world). More recently a lot of reviews seem to be paying attention to this, and including min/max/avg frame-rates as opposed to just a simple average, or they'll include a plotted frame-time chart for the benchmark (which imho is usually more data than necessary unless you're trying to track down a specific stutter artifact).
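A quick toy example of how an average can hide that (Python; the frame times are made up specifically to force the numbers):
[code]
# Toy numbers chosen so the *average* is exactly 60 FPS even though
# some frames run at 5 FPS and others at 200 FPS.
frame_times = [1 / 5] * 7 + [1 / 200] * 110   # seconds per frame

avg_fps = len(frame_times) / sum(frame_times)
min_fps = 1 / max(frame_times)
max_fps = 1 / min(frame_times)

print(f"avg: {avg_fps:.0f} FPS, min: {min_fps:.0f} FPS, max: {max_fps:.0f} FPS")
# -> avg: 60 FPS, min: 5 FPS, max: 200 FPS
[/code]
A review that reports only the first number looks perfectly smooth; the min value is what you'd actually feel.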
smeezekitty wrote:
At 45 FPS, it may not be the framerate but rather frametime variance that is bothering you. Unfortunately CPU bottlenecks are more likely to cause frame rate variances.
+1. Especially if you have vsync enabled at 45 FPS - it's likely a stutter pattern where the machine is dancing between ~16.7ms and ~33.3ms frames to hit that 45 FPS average. It can look choppy as a result, even though it's a "good average." This is one case where a more detailed frame-time plot could be useful in figuring out what's going on.
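For illustration, here's what that cadence looks like numerically (assuming a 60Hz refresh, so vsync'd frames are quantized to multiples of ~16.7ms; numbers are illustrative):
[code]
# A vsync'd "45 FPS average" on a 60Hz display: two one-refresh frames
# (~16.7ms) followed by one two-refresh frame (~33.3ms), repeating.
refresh = 1 / 60
pattern = [refresh, refresh, 2 * refresh]
frame_times = pattern * 40                    # 120 frames of this cadence

avg_fps = len(frame_times) / sum(frame_times)
print(f"average: {avg_fps:.1f} FPS")          # -> 45.0, but every third frame judders
[/code]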
Since input lag has been mentioned as well - it *can* be related to frame-rate, but it may also just be the engine's input caller not doing a good job. RAGE is a really bad offender in this respect (it's the worst I've ever seen measured) - in some cases the input caller can take something like 150-200ms to respond, so even if it were running at 900 FPS, it would still feel "chuggy" because there's a significant latency between buttons being pushed and the commands getting into the engine and on to the renderer to produce an output on screen. In a lot of cases this input lag value is not static - it can move in response to what else the engine is having to compute. For example, if there's a lot of physics or collision in a specific scene, that can worsen things. And that usually happens at the worst possible time, because the more complex scenes tend to be the "action-packed" parts of the game.
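As a back-of-the-envelope model (numbers are hypothetical, loosely based on the RAGE-style worst case above): latency is additive, so a slow input path dominates no matter how fast the renderer is:
[code]
# Toy latency model: button-to-screen = input handling + render time.
input_handling_ms = 150        # hypothetical worst-case input-caller delay
render_ms = 1000 / 900         # ~1.1ms per frame at 900 FPS

print(f"~{input_handling_ms + render_ms:.0f} ms button-to-screen")  # -> ~151 ms
[/code]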
Running at stupendously high frame rates over refresh rate can also produce problems in some games. In some cases, the game's internal timing expects something in the 25-60 FPS range, and there may be stability issues or otherwise unpredictable behavior. In other cases, if the frame-rate is substantially higher than refresh rate (I'm talking like 1200 FPS into a 60Hz monitor 🤣), some games appear to exhibit something like the soap opera effect.
And as with anything to do with perception - different people will perceive, experience, etc all of this to different extents. Some people's only interaction with this kind of thing will be their game becoming unstable because the frame-rate is too high, others are very particular about frame-rate and monitor refresh rate.
Some more articles if anyone is interested:
http://www.anandtech.com/show/6857/amd-stutte … r-roadmap-fraps
http://www.eurogamer.net/articles/digitalfoun … -factor-article
http://www.eurogamer.net/articles/digitalfoun … -article?page=3
http://timothylottes.blogspot.com/2013/04/rep … f-light-of.html
http://www.anandtech.com/show/2803
dr_st wrote:I wouldn't say you're doing it wrong, but perhaps you are not subjecting it to high loads. Also, it is more obvious when you are comparing to other systems. I may not have considered my C2Ds slow if I didn't have a C2Q and a couple more modern systems to compare to. The C2D is still usable, just not very fun to use. 😀
Comparatively speaking, it's probably roughly in the middle of the machines I own - I don't begrudge using it for non-gaming tasks (like writing this post), but I'd agree that it's likely out of the question for running newer DX11 games. As far as "subjecting it to high loads" - that's why I said I think there should be some qualification of exactly what we're declaring LGA 775 systems "obsolete" for. In terms of running GTA 5 or some other heavy console port, I would guess most LGA 775 systems probably aren't up to the task (especially if we're including Pentium 4, Pentium D, Celeron D, etc). But if we just want a machine to install Windows Vista/7 onto, browse the web (even this is kind of loaded - Vogons, for example, is a significantly lighter site than YouTube or Hulu), watch a DVD, etc, I don't think there's any problem with that, as long as you have sufficient memory (2GB should probably be considered a minimum these days, and 4GB+ is ideal) and a GPU that can assist with multimedia content. 😀
alexanrs wrote:Anyway, I have a 120Hz monitor and I do notice when something messes up my configurations and drops me back to 60Hz while in the desktop (the mouse pointer movement looks a bit less smooth), though I'm not sure I'd notice anything in games;
You are not the only one. I think there is enough evidence to conclude that people do notice, quite easily, the difference between 60FPS and 120FPS. But with the possible exception of competitive fast-paced gaming, I don't believe it actually affects the quality of gameplay.
My suspicion is that it's more of a gradient, from people who don't notice (or genuinely don't care), to people who notice a little bit, to people who notice more significantly, to people who are very sensitive to it. Personally I find 120-144Hz gaming to feel somewhat "sharper" or "snappier", but it's more of a "luxury feature" to me - just like having your monitor's color temperature set up properly. I can live without it, but having it set "right" does improve the subjective experience. That probably puts me somewhere in the middle of that gradient - I notice it, but it's not a big deal to me. That isn't a universal experience though - some folks are more sensitive than others. I would assume it's also something that can be trained, so highly competitive/practiced gamers (or people who otherwise have very honed reflexes) are probably more sensitive to this kind of thing than more casual users (like me). 😊