VOGONS



How much computer is enough?


Reply 20 of 39, by smeezekitty

Rank: Oldbie

Enough for what? If you are interested in modern games, look at their "recommended requirements" section (and be ready for games to want more within three years, even if they play the same as they would on a $5 calculator - it's marketing, nothing personal). If it's for office work and the Internet, then look at the requirements of those applications - preferably older versions, since they generally have everything you need at home but with much lower requirements.
For home use I'd buy a PC from around $300 and a monitor for about the same. Buying new hardware any cheaper is simply a problem. If you want a "modern" PC, it starts from about $1000, where every component is close to average prices. For office work and "a little gaming", from $500. Get a price list from a local computer shop and fit the hardware to your preferred budget.
You may also find useful articles at review sites such as Tom's Hardware - there are many similar ones, in different languages.

It's almost strictly a matter of more GPU. GPUs are still growing because they CAN keep adding cores, since graphics rendering is highly parallelizable.

Reply 22 of 39, by luckybob

Rank: l33t

You might think you have plenty of power in your desktop, but look at the latest server chips coming in 2017: http://www.extremetech.com/computing/206659-f … memory-channels

The notion of "enough" doesn't exist in the server world.

It is a mistake to think you can solve any major problems just with potatoes.

Reply 24 of 39, by jesolo

Rank: l33t

I remember that in the late 80's (when I started to be exposed to PC's) and the 90's, if you bought a computer, within 12 months (sometimes even less), your computer was out of date as it was unable to run the latest and greatest software/games (bear in mind the astronomical cost of PC's back then and that one wasn't always able to buy the fastest hardware).
Back then, the tempo at which games and software were developed and kept on pushing the boundaries was astonishing.

Believe it or not, my main PC is still an Intel Core 2 Duo (E8400) with 4 GB of RAM and an AMD/ATI Radeon HD3870, running on an Asus P5QL Pro.
Granted, should I even try to run the latest games on it, I would be in trouble. However, due to family commitments, I haven't had the time to play many games either, but I might consider upgrading again in the near future, should the need really arise.
Having said that, my current PC still runs Windows 7 perfectly, I can perform all my daily tasks on it, and I can even play some fairly recent games on it (although maybe not at their maximum settings).

What I have picked up in recent years is that hardware has advanced at such a pace that software hasn't necessarily kept up (compared to how it was in the late 80's and 90's).
Game developers are obviously trying to push the boundaries in terms of graphics detail, etc., and here I do agree that GPU's can still advance a bit. But I also feel that some games run slow on high end systems because of pure sloppy programming and non-optimization of code (not that I'm an expert in coding). This, in itself, is due to various factors, and on many occasions it's because of time constraints to release the game as quickly as possible.

I'm starting to get the feeling that we might be approaching a point where graphics detail will become "photo realistic" (if not already).
So, when you've reached that point, what's next? Virtual Reality, Holodecks 🤣.

Reply 25 of 39, by obobskivich

Rank: l33t
snorg wrote:

Yeah I think you're right. Most resource intensive stuff these days seems to be centered around UI enhancements and games. In 10 years you won't really "need" a 50 core desktop with 1 terabyte of RAM to run Word 2025 but you will want it so that Cortana doesn't stutter at you when she asks you how you want to compose your memo. 😉

🤣

gerwin wrote:

Progress...
Intel Pentium Pro, Date released 1995, Maximum memory: 64 GB
Intel Core i7-4790 Processor, Date released 2014, Max Memory Size: 32 GB

Pentium Pro supports 36-bit memory addressing via PAE; however, AFAIK there is no memory controller for the PPro platform that can handle anything approaching 64GB of memory (450GX supports 8GB), and at least within Windows NT, running over 4GB on a 32-bit OS requires applications to be explicitly PAE aware and manage their own memory above the 4GB line. It's nowhere near as clean/versatile as the Core i7-4790, which is regularly installed in systems with 32GB of memory.
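For context (an illustrative sketch, not something from this thread): on 32-bit Windows, a "PAE aware" application typically reached physical memory beyond its 4GB virtual limit through the Address Windowing Extensions (AWE) API - allocating physical pages itself and mapping them in and out of a window in its own address space. A minimal sketch, assuming Windows with the SeLockMemoryPrivilege granted; error handling and privilege setup are mostly omitted:

```c
/* Illustrative AWE usage: a 32-bit process allocates physical pages and
 * maps them into a reserved window before touching them. This is roughly
 * the kind of manual bookkeeping "PAE aware" applications had to do to
 * use RAM above the 4GB line. Privilege setup is omitted. */
#include <windows.h>
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

int main(void)
{
    SYSTEM_INFO si;
    GetSystemInfo(&si);

    /* Ask the OS for 256 physical pages (the count is arbitrary here). */
    ULONG_PTR num_pages = 256;
    ULONG_PTR *pfns = malloc(num_pages * sizeof(ULONG_PTR));
    if (!AllocateUserPhysicalPages(GetCurrentProcess(), &num_pages, pfns)) {
        fprintf(stderr, "AllocateUserPhysicalPages failed: %lu\n", GetLastError());
        return 1;
    }

    /* Reserve a window in the 32-bit virtual address space... */
    void *window = VirtualAlloc(NULL, num_pages * si.dwPageSize,
                                MEM_RESERVE | MEM_PHYSICAL, PAGE_READWRITE);

    /* ...and map the physical pages into it before using the memory. */
    if (!MapUserPhysicalPages(window, num_pages, pfns)) {
        fprintf(stderr, "MapUserPhysicalPages failed: %lu\n", GetLastError());
        return 1;
    }
    memset(window, 0xAB, num_pages * si.dwPageSize);

    /* Unmap (NULL page array), then release the physical pages and window. */
    MapUserPhysicalPages(window, num_pages, NULL);
    FreeUserPhysicalPages(GetCurrentProcess(), &num_pages, pfns);
    VirtualFree(window, 0, MEM_RELEASE);
    free(pfns);
    return 0;
}
```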

You're also comparing a top-tier server processor to a consumer part. If you look at modern Xeon processors their maximum supported memory (which includes their internal memory controller) is significantly larger (768GB on Xeon E5, 1.5TB on Xeon E7).

jesolo wrote:

Game developers are obviously trying to push the boundaries in terms of graphics detail, etc., and here I do agree that GPU's can still advance a bit. But I also feel that some games run slow on high end systems because of pure sloppy programming and non-optimization of code (not that I'm an expert in coding). This, in itself, is due to various factors, and on many occasions it's because of time constraints to release the game as quickly as possible.

And a lot of that is also the result of bad/inefficient ports from consoles (especially from non-x86 consoles).

I'm starting to get the feeling that we might be approaching a point where graphics detail will become "photo realistic" (if not already).
So, when you've reached that point, what's next? Virtual Reality, Holodecks 🤣.

"They" have been talking photorealistic graphics for as long as I can remember. Modern CGI for movies can generally "get there" but games are still a ways away. I think VR has become a bigger goal mostly because it offers a different UI/UX experience with existing hardware/software capabilities (and thus far it's been able to generate a lot of hype and do very well at making some people ridiculously wealthy just from selling the idea).

Reply 26 of 39, by smeezekitty

Rank: Oldbie

Pentium Pro supports 36-bit memory addressing via PAE; however, AFAIK there is no memory controller for the PPro platform that can handle anything approaching 64GB of memory (450GX supports 8GB), and at least within Windows NT, running over 4GB on a 32-bit OS requires applications to be explicitly PAE aware and manage their own memory above the 4GB line. It's nowhere near as clean/versatile as the Core i7-4790, which is regularly installed in systems with 32GB of memory.

That's true. For example, a 386 could in theory support 4 GB of RAM, but no chipset would let a standard 386 get anywhere near that.
The big thing is that modern chips have the memory controller built in.

Reply 27 of 39, by gerwin

Rank: l33t
obobskivich wrote:
gerwin wrote:

Progress...
Intel Pentium Pro, Date released 1995, Maximum memory: 64 GB
Intel Core i7-4790 Processor, Date released 2014, Max Memory Size: 32 GB

Pentium Pro supports 36-bit memory addressing via PAE; however, AFAIK there is no memory controller for the PPro platform that can handle anything approaching 64GB of memory (450GX supports 8GB), and at least within Windows NT, running over 4GB on a 32-bit OS requires applications to be explicitly PAE aware and manage their own memory above the 4GB line. It's nowhere near as clean/versatile as the Core i7-4790, which is regularly installed in systems with 32GB of memory.

You're also comparing a top-tier server processor to a consumer part. If you look at modern Xeon processors their maximum supported memory (which includes their internal memory controller) is significantly larger (768GB on Xeon E5, 1.5TB on Xeon E7).

I am aware of that; it is probably a feature at best used in a few supercomputers. Regardless, I found it a funny observation; in a strange way it seems to contradict Moore's law for two items 19 years apart.
Indeed, Intel has their Xeon line as well... I forgot about them.

Now seriously: I am surprised that, on the one hand, everyone jumped on the x64 train even before it was really necessary, whilst at the same time marketing concerns severely restrict memory expansion on motherboards and CPUs. And most people don't seem to know or care about that.
In 1999, with an i440BX mainboard, one could expand from the factory-equipped 32MB to an astronomical 1000MB. In 2011, the default motherboards we were offered came with 8GB and could hold at most 16GB. I opted for boards with a 32GB limit instead.

--> ISA Soundcard Overview // Doom MBF 2.04 // SetMul

Reply 28 of 39, by gerwin

Rank: l33t
obobskivich wrote:
jesolo wrote:

Game developers are obviously trying to push the boundaries in terms of graphics detail, etc., and here I do agree that GPU's can still advance a bit. But I also feel that some games run slow on high end systems because of pure sloppy programming and non-optimization of code (not that I'm an expert in coding). This, in itself, is due to various factors, and on many occasions it's because of time constraints to release the game as quickly as possible.

And a lot of that is also the result of bad/inefficient ports from consoles (especially from non-x86 consoles).

I noticed that a certain flight sim has six models for each aircraft, with an increasing number of polygons: each one is to be used at the appropriate distance. On the other hand, most freeware planes for this game contain only one model, and therefore cause lag. Since then, I always wonder which games have been provided with model scaling according to distance (= good practice), and where the devs have just been lazy and let the hardware struggle to render 'invisible' details.
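(For illustration only - a hypothetical sketch, not taken from any particular sim: distance-based model selection usually boils down to walking a small table of LOD thresholds each frame and picking the cheapest mesh that still looks right at the current distance.)

```c
/* Hypothetical level-of-detail (LOD) selection. The thresholds and mesh
 * handles are made-up illustration values; a real engine would build the
 * table from the aircraft's six models, highest detail first. */
#include <stddef.h>

typedef struct {
    float max_distance;   /* use this mesh up to this camera distance */
    void *mesh;           /* handle to a pre-built model */
} LodLevel;

const void *select_lod(const LodLevel *levels, size_t count, float distance)
{
    for (size_t i = 0; i < count; i++) {
        if (distance <= levels[i].max_distance)
            return levels[i].mesh;   /* first level whose range covers us */
    }
    /* Beyond the last threshold: fall back to the coarsest model. */
    return levels[count - 1].mesh;
}
```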

--> ISA Soundcard Overview // Doom MBF 2.04 // SetMul

Reply 29 of 39, by KT7AGuy

Rank: Oldbie
Lo Wang wrote:

You'd be hard pressed to find a more wicked, self-centered, self-entitled, narcissistic, proud, shallow, vain, and just downright stupid generation, who are more than happy to voluntarily broadcast their pathetic, mundane existences and every single detail thereof to this dying world and to the NSA.

“The children now love luxury. They have bad manners, contempt for authority; they show disrespect for elders and love chatter in place of exercise.” -Socrates

🤣

(I mostly agree with you about this.)

Reply 30 of 39, by smeezekitty

Rank: Oldbie

Now seriously: I am surprised that, on the one hand, everyone jumped on the x64 train even before it was really necessary, whilst at the same time marketing concerns severely restrict memory expansion on motherboards and CPUs. And most people don't seem to know or care about that.
In 1999, with an i440BX mainboard, one could expand from the factory-equipped 32MB to an astronomical 1000MB. In 2011, the default motherboards we were offered came with 8GB and could hold at most 16GB. I opted for boards with a 32GB limit instead.

I agree. I don't really care for 64 bits but accept it as a necessity with growing memory requirements. I don't understand all the adopters in ~2008-2010.
64-bit totally breaks compatibility with 16-bit software and breaks some 32-bit software. It actually uses MORE RAM, so using a 64-bit OS with less than 4 GB of RAM makes no sense.
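(To put a rough number on the "uses MORE RAM" part: pointers double in size under a 64-bit ABI, so pointer-heavy structures grow. A tiny sketch - compile the same file as 32-bit and as 64-bit and compare:)

```c
/* As 32-bit code this node is typically 12 bytes; as 64-bit it grows to
 * 24 bytes (two 8-byte pointers plus alignment padding), which is where
 * much of the extra memory use of a 64-bit OS and its programs comes from. */
#include <stdio.h>

struct node {
    struct node *next;
    struct node *prev;
    int          value;
};

int main(void)
{
    printf("pointer: %zu bytes, node: %zu bytes\n",
           sizeof(void *), sizeof(struct node));
    return 0;
}
```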

I noticed that a certain flight sim has six models for each aircraft, with an increasing number of polygons: each one is to be used at the appropriate distance. On the other hand, most freeware planes for this game contain only one model, and therefore cause lag. Since then, I always wonder which games have been provided with model scaling according to distance (= good practice), and where the devs have just been lazy and let the hardware struggle to render 'invisible' details.

Probably at the cost of installed disk size.

“The children now love luxury. They have bad manners, contempt for authority; they show disrespect for elders and love chatter in place of exercise.” -Socrates

When the authority is corrupt, you can hardly blame them. The fact that you pulled up such a quote from Socrates shows that the argument has been around for a long time.
It seems that one generation always tries to condemn the next because they do things differently, whether that is art, music, mannerisms or whatever else.
It's kind of funny that the generation you are singling out will almost certainly say something similar about their children and grandchildren. While I agree some changes have definitely been for the worse,
people unreasonably and venomously oppose the natural drift of society.

That said, this is off topic.

Reply 31 of 39, by obobskivich

Rank: l33t
gerwin wrote:

I am aware of that; it is probably a feature at best used in a few supercomputers. Regardless, I found it a funny observation; in a strange way it seems to contradict Moore's law for two items 19 years apart.

Even in a "supercomputer" (e.g. a Beowulf cluster) you wouldn't actually have a single PPro system addressing 64GB of RAM - you may have 64GB (or more) across all the nodes, but each node will still be limited to at most 8GB (with 450GX), and more realistically it will be less than that (e.g. ASCI Red, which had something like 128MB per compute node). It doesn't contradict Moore's Law either - Moore's Law says nothing about RAM or computational ability. It just says that every two years the number of transistors will double.
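(Back-of-the-envelope, just to spell out what that prediction would mean here: 19 years at one doubling every two years is about 9.5 doublings, i.e. roughly a 2^9.5 ≈ 700x increase in transistor count - it says nothing about how much RAM a given platform chooses to support.)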

Now seriously: I am surprised that, on the one hand, everyone jumped on the x64 train even before it was really necessary, whilst at the same time marketing concerns severely restrict memory expansion on motherboards and CPUs. And most people don't seem to know or care about that.
In 1999, with an i440BX mainboard, one could expand from the factory-equipped 32MB to an astronomical 1000MB. In 2011, the default motherboards we were offered came with 8GB and could hold at most 16GB. I opted for boards with a 32GB limit instead.

I'm not sure what the argument about marketing and "x64" has to do with anything, but as far as the memory limits/sizes - I wouldn't expect it to scale linearly over time, and it isn't surprising that it hasn't. Application needs have not scaled linearly, nor have they even kept up with hardware. For a consumer machine, I can't imagine 32GB (let alone more) being necessary, especially when most applications are still Win32. By contrast, being back in 1999 with 32MB would've been a constrained experience to say the least. I think somewhere in the middle of that, chronologically, we hit a "break point" where price:capacity outpaced demand, at least for consumers.

smeezekitty wrote:

I agree. I don't really care for 64 bits but accept it as a necessity with growing memory requirements. I don't understand all the adopters in ~2008-2010.
64-bit totally breaks compatibility with 16-bit software and breaks some 32-bit software. It actually uses MORE RAM, so using a 64-bit OS with less than 4 GB of RAM makes no sense.

I don't know about modern x86-64 CPUs, but don't the original Athlon64/P4-64bit systems at least support 16-bit mode applications under 98SE or XP? IOW, if you have a 64-bit Prescott on an i865, and load Win98, will it actually not run 16-bit applications where a Northwood would?

Reply 32 of 39, by luckybob

Rank: l33t

I was one of those people that got on the x64 bandwagon at the first chance. I was always annoyed at my P4 for only having 3.5GB of RAM available when I used 4x1GB sticks! Keep in mind, for a very long time I've been using last year's server-grade hardware as my current system.

Also, I don't see 16-bit incompatibility as an issue for x64 systems. I'd venture to say that the people that need or want it are a fraction of a fraction of a percent. And the situation is easily remedied by DOSBox/Virtual PC/etc.

It is a mistake to think you can solve any major problems just with potatoes.

Reply 33 of 39, by smeezekitty

Rank: Oldbie

I don't know about modern x86-64 CPUs, but don't the original Athlon64/P4-64bit systems at least support 16-bit mode applications under 98SE or XP? IOW, if you have a 64-bit Prescott on an i865, and load Win98, will it actually not run 16-bit applications where a Northwood would?

Even a modern i7 still supports V86 mode. There is no problem with 16-bit apps on a 64-bit processor as long as the CPU is in 16- or 32-bit mode. It just doesn't work in long mode (i.e. with a 64-bit OS).
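(Small aside, as a sketch assuming GCC/Clang's <cpuid.h>: whether the CPU offers long mode at all is reported by CPUID leaf 0x80000001, EDX bit 29 - which mode the machine actually runs in is up to the OS.)

```c
/* Query the long-mode (LM) capability bit. This only says the CPU *can*
 * enter 64-bit long mode, not that the running OS uses it; V86 and 16-bit
 * code keep working whenever the CPU is not in long mode. */
#include <stdio.h>
#include <cpuid.h>   /* GCC/Clang helper; MSVC would use __cpuid instead */

int main(void)
{
    unsigned int eax, ebx, ecx, edx;
    if (__get_cpuid(0x80000001, &eax, &ebx, &ecx, &edx))
        printf("Long mode (x86-64) supported: %s\n",
               (edx & (1u << 29)) ? "yes" : "no");
    else
        printf("Extended CPUID leaf not available on this CPU\n");
    return 0;
}
```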

Reply 34 of 39, by obobskivich

Rank: l33t
smeezekitty wrote:

Even a modern i7 still supports V86 mode. There is no problem with 16-bit apps on a 64-bit processor as long as the CPU is in 16- or 32-bit mode. It just doesn't work in long mode (i.e. with a 64-bit OS).

Oh I see. I was wondering if that wasn't the case (since I know that's a limit in Windows x64 editions), but then wondered if maybe "they" had removed that feature on newer CPUs assuming nobody uses it. 😵

Reply 35 of 39, by meisterister

Rank: Newbie

I'd venture to say that the people that need or want it are us.

FTFY.

I suppose that there may also be some industrial control systems that need 16-bit compatibility, but they really shouldn't be running a modern, 64-bit OS anyway (not having pre-emptive multitasking is a plus for that sort of thing).

On topic: Basically the only force pushing for faster hardware in the consumer space is the Internet, since the typical modern site is a bloated, JS-ridden POS. If more pages were static/small/easy to render, then I'd likely just stick with my dual Katmai build and call it a day.

The sad fact of the matter is that the most commonly used non-internet software hasn't actually advanced in a decade or more. If you gave me Office 95 and Office 2013, I couldn't find a functional difference between the two except that the new version dumped menus and LIKES TO USE CAPS A LOT. There's also barely any functional difference between Windows 2000 and Windows 8. They are both compatible with the same general API, both can support multiprocessor systems, both offer pre-emptive multitasking and memory protection, both provide support for hardware acceleration in games, both provide a TCP/IP stack, and both can run on the same hardware (except for the artificial restriction that Windows 8 places on instruction sets, but let's not get into that).

The average person hit "good enough" in 1989 with the 486. Virtually every change since then has been "Previous version but bigger and maybe with a new UI" on the software side or "Wider, deeper, and smaller" on the hardware side.

However, when we consider the high performance, server, scientific computing, and arguably gaming side of things, we will never approach "enough." As soon as GPGPUs advance a generation, some researcher can find a way of applying the additional FLOPS to making things better. In fact, neural networks basically died from the 1980s until fairly recently because we didn't have the computing power to see them come to fruition properly. Now we do, and they are awesome! Servers can always use more cores by the nature of most server software. Gamers can always use higher resolutions and frame rates. Look at the move to 4k, a resolution that can still cripple today's most expensive graphics cards. With VR and other such technologies, we can readily take advantage of even more computing muscle to emulate a virtual world. There's also the fact that game AIs can always get better, but I'll just leave it at that...

Dual Katmai Pentium III (450 and 600MHz), 512ish MB RAM, 40 GB HDD, ATI Rage 128 | K6-2 400MHz / Pentium MMX 166, 80MB RAM, ~2GB Quantum Bigfoot, Awful integrated S3 graphics.

Reply 36 of 39, by tayyare

Rank: Oldbie
jesolo wrote:

I remember that in the late 80's (when I started to be exposed to PC's) and the 90's, if you bought a computer, within 12 months (sometimes even less), your computer was out of date as it was unable to run the latest and greatest software/games (bear in mind the astronomical cost of PC's back then and that one wasn't always able to buy the fastest hardware).
Back then, the tempo at which games and software were developed and kept on pushing the boundaries was astonishing.

Believe it or not, my main PC is still an Intel Core 2 Duo (E8400) with 4 GB of RAM and an AMD/ATI Radeon HD3870, running on an Asus P5QL Pro.
Granted, should I even try to run the latest games on it, I would be in trouble. However, due to family commitments, I haven't had the time to play many games either, but I might consider upgrading again in the near future, should the need really arise.
Having said that, my current PC still runs Windows 7 perfectly, I can perform all my daily tasks on it, and I can even play some fairly recent games on it (although maybe not at their maximum settings).

What I have picked up in recent years is that hardware has advanced at such a pace that software hasn't necessarily kept up (compared to how it was in the late 80's and 90's).
Game developers are obviously trying to push the boundaries in terms of graphics detail, etc., and here I do agree that GPU's can still advance a bit. But I also feel that some games run slow on high end systems because of pure sloppy programming and non-optimization of code (not that I'm an expert in coding). This, in itself, is due to various factors, and on many occasions it's because of time constraints to release the game as quickly as possible.

I'm starting to get the feeling that we might be approaching a point where graphics detail will become "photo realistic" (if not already).
So, when you've reached that point, what's next? Virtual Reality, Holodecks 🤣.

Almost exactly the same story here. My daily machine is a Core 2 Quad on an Asus P5Q Premium with 4 GB RAM, which I put together in 2009. I upgraded the GTS 250 display card to a GTX 560 (1.5-2 years ago) and changed XP to Windows 7 32-bit (in 2013). I also upgraded the 500GB RAID1 arrays to 1TB ones, not because I need to expand my storage, but because I need to replace my HDDs before they die, and 1TB ones were cheap enough.

I have no short/mid term upgrade plans for it, since I'm really happy with it. It does everything for me (photography, personal databases, office work, light engineering stuff, family movies, etc.), including games even from 2013, without a glitch. It is a funny thing for me, since I was an avid upgrader; I don't even remember exactly how many motherboards or display cards I changed during the late 486 / Pentium / Pentium II era.

GA-6VTXE PIII 1.4+512MB
Geforce4 Ti 4200 64MB
Diamond Monster 3D 12MB SLI
SB AWE64 PNP+32MB
120GB IDE Samsung/80GB IDE Seagate/146GB SCSI Compaq/73GB SCSI IBM
Adaptec AHA29160
3com 3C905B-TX
Gotek+CF Reader
MSDOS 6.22+Win 3.11/95 OSR2.1/98SE/ME/2000

Reply 37 of 39, by Jorpho

Rank: l33t++
jesolo wrote:

Game developers are obviously trying to push the boundaries in terms of graphics detail, etc., and here I do agree that GPU's can still advance a bit. But I also feel that some games run slow on high end systems because of pure sloppy programming and non-optimization of code (not that I'm an expert in coding). This, in itself, is due to various factors, and on many occasions it's because of time constraints to release the game as quickly as possible.

I'm starting to get the feeling that we might be approaching a point where graphics detail will become "photo realistic" (if not already).
So, when you've reached that point, what's next? Virtual Reality, Holodecks 🤣.

It occurs to me that no matter how advanced graphics become, the proportion of people interested in playing games isn't going to get any bigger – and we've probably already gone past the point where putting more resources into better graphics can be expected to result in a comparable increase in sales. That in turn suggests that we'll only get better graphics when developers don't need to put as many resources into them, which I suppose means an increasing reliance on pre-constructed engines, which in turn means decreased variation in visuals.

Reply 38 of 39, by Blurredman

Rank: Member
TELEPACMAN wrote:

the assembler puts together the most outdated parts he has around and, voila, it is OK. It's fine for the average user.

Too right!! I worked in a computer shop and that is true.

http://blurredmanswebsite.ddns.net/ 😊

Reply 39 of 39, by sliderider

Rank: l33t++

I think once dual-core CPUs managed to run at more than 3 GHz, that was more power than most people would need for a long time. Even a $300 entry-level machine is more than anyone but a hardcore gamer would really need now. Core i3 sells like mad at the bottom end.