VOGONS

Where does this all end?

First post, by snorg

Rank: Oldbie

What is the logical endpoint of this computing revolution we are all collectively participating in?

I think the current state of the art is chips with 28nm features. How much smaller can things get? 10 nanometers? 5, or even 1? Do we go to 3D chips from there? Chips made of exotic materials, like diamond (exotic in the sense that we don't make chips out of it currently)?

How much power would a typical laptop have 25 years from now? A supercomputer? If we go by past trends, laptops 25 years from now will be equivalent to supercomputers in the Top 500 now; I hate to think what a supercomputer of that era would be like. An exaflop system? What would you use that for? Climate modeling? Drug research? Can we build the next order of magnitude beyond that, or are we reaching hard limits? Would there ever be such a thing as a yottaflop (yottascale?) system? What would we even do with that much computing power?
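Just to put some rough numbers on the "go by past trends" idea, here is a back-of-envelope sketch in Python; the starting figure and the doubling periods are assumptions for illustration, not measurements:

```python
# Back-of-envelope extrapolation: where does a ~100-gigaflop laptop (circa 2012)
# land in 25 years if peak performance keeps doubling on a fixed cadence?
# Both the starting figure and the doubling periods are assumptions.

START_FLOPS = 1e11  # ~100 GFLOPS: a rough figure for a 2012 laptop with a decent GPU

def extrapolate(start_flops, years, doubling_years):
    """Project peak FLOPS forward, assuming one doubling every `doubling_years`."""
    return start_flops * 2 ** (years / doubling_years)

for doubling in (1.5, 2.0, 3.0):
    projected = extrapolate(START_FLOPS, 25, doubling)
    print(f"doubling every {doubling} yr -> ~{projected:.1e} FLOPS in 25 years")
```

With a 1.5-year doubling period that lands around 10 petaflops (roughly a top-10 supercomputer of 2012); stretch the doubling to 3 years and it is only about 30 teraflops, so the answer swings by orders of magnitude depending on the assumption.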

I've just been thinking how many changes we've seen since I first got into computers in the early 80s, and where things might be going. Certainly we are in for some strange times ahead. What do you all think?

Reply 1 of 47, by Sune Salminen

Rank: Member

All that computing power will be used to stream content to consumers. Powerful CPUs may be made of diamond in the future but you will never see one. You probably won't even be able to buy a motherboard, if there is such a thing.
A typical laptop will be less powerful than your average smartphone is today. It doesn't need to be powerful - your future PC is a television with a keyboard attached to it and a fast internet connection.
In a few decades computers as we know them now will no longer be used in private homes. There's your logical endpoint.
Instead we'll be sitting at a display, streaming our content, games, movies, whatever, with all our personal documents stored in "The Cloud". There will be no more tinkering, hacking and no abandonware sites! Enjoy your vast computing power while you have it.

Last edited by Sune Salminen on 2012-09-03, 03:43. Edited 4 times in total.

Reply 2 of 47, by VileR

Rank: l33t
Sune Salminen wrote:

Instead we'll be sitting at a display, streaming our content, games, movies, whatever, with all our personal documents stored in "The Cloud". There will be no more tinkering, hacking and no abandonware sites!

Quite a bleak vision there, but one that would undoubtedly happen if the big players have their way. More like a 4-decade regression: the personal computer being downgraded once again to a dumb terminal.

"The cloud" is for people who think it's raining when they're really being royally pissed upon.

[ WEB ] - [ BLOG ] - [ TUBE ] - [ CODE ]

Reply 3 of 47, by snorg

Rank: Oldbie
Sune Salminen wrote:

powerful - your future PC is a television with a fast internet connection.
In a few decades computers as we know them now will no longer be used in private homes. There's your logical endpoint.
Instead we'll be sitting at a display, streaming our content, games, movies, whatever, with all our personal documents stored in "The Cloud". There will be no more tinkering, hacking and no abandonware sites! Enjoy your vast computing power while you have it.

So you're saying this is the future?


Attachments

  • idiocracy.jpeg (11.17 KiB, fair use/fair dealing exception)

Reply 4 of 47, by Sune Salminen

Rank: Member

Yes, that's totally it.

It already exists; imagine these three combined into one:
http://www.netflix.com/
http://www.onlive.com/
https://www.icloud.com/

Last edited by Sune Salminen on 2012-09-03, 03:46. Edited 1 time in total.

Reply 5 of 47, by snorg

Rank: Oldbie

I certainly hope that is not the case; it would definitely be horrible.
The whole point of computers is that they are supposed to be a tool for amplifying creativity, yet it looks more and more like they're going to be used to market to us and to implement an omnipresent surveillance state.

Reply 6 of 47, by Sune Salminen

Rank: Member

Creativity tools like today's audio/video production and editing software, or even developer tools like Apple's Xcode, will be available on a rental basis from the cloud... and... haha, I just thought of this... of course, as a "Content Creator" you must first acquire some kind of license to legally produce "certified" content.

Reply 8 of 47, by MaxWar

Rank: Oldbie

I think Sune Salminen's vision has some truth to it, in that there will be a large market for this type of "streamed computing"; for the more mainstream applications it will be a convenient solution for the media companies. They can easily distribute their products while ensuring strict control over the users, which is just what they want. And of course most users will be content.

But I totally do not believe it will come to completely displace standalone machines and computing. It is completely impractical for many applications, so there will always be a demand for such machines. Competition will ensure they are still there, provided any competition still exists, of course...

Anyway, I thought the point of this thread was maybe more about hardware advancement. It's hard to predict, and honestly I have not been following development news a lot lately, but in the short term we should expect magnetic storage to be more or less fully replaced by solid-state technologies; this seems rather obvious.

As far as conventional computer architectures go, it seems the rise of the GHz has slowed considerably in recent years in favor of miniaturization and lower power consumption; this allows more cores in less space while maintaining tolerable heat.

Beyond that, what's next? Photonic computing?

And even further beyond? Unless humanity self-destructs before that (which, honestly, I increasingly think is likely), I can envision computers becoming more organic. As nanotechnology progresses it will, at some point (and it has already started), become more and more inspired by or even entwined with biochemistry.

Sure is a fun exercise to try to guess. But in the end I believe we are in for many surprises, pleasant or not.

FM sound card comparison on a Grand Scale!!
The Grand OPL3 Comparison Run.

Reply 9 of 47, by snorg

Rank: Oldbie
MaxWar wrote:

I think Sune Salminen's vision has some truth to it, in that there will be a large market for this type of "streamed computing"; for the more mainstream applications it will be a convenient solution for the media companies. They can easily distribute their products while ensuring strict control over the users, which is just what they want. And of course most users will be content.

But I totally do not believe it will come to completely displace standalone machines and computing. It is completely impractical for many applications, so there will always be a demand for such machines. Competition will ensure they are still there, provided any competition still exists, of course...

Anyway, I thought the point of this thread was maybe more about hardware advancement. It's hard to predict, and honestly I have not been following development news a lot lately, but in the short term we should expect magnetic storage to be more or less fully replaced by solid-state technologies; this seems rather obvious.

As far as conventional computer architectures go, it seems the rise of the GHz has slowed considerably in recent years in favor of miniaturization and lower power consumption; this allows more cores in less space while maintaining tolerable heat.

Beyond that, what's next? Photonic computing?

And even further beyond? Unless humanity self-destructs before that (which, honestly, I increasingly think is likely), I can envision computers becoming more organic. As nanotechnology progresses it will, at some point (and it has already started), become more and more inspired by or even entwined with biochemistry.

Sure is a fun exercise to try to guess. But in the end I believe we are in for many surprises, pleasant or not.

Well, before things took a decidedly dystopian slant, yeah, I was thinking more about what types of advancements we might see. Barring major breakthroughs (quantum computing, rod logic like in Diamond Age), I think we are going to top out with exascale systems on the supercomputer side (in my lifetime, anyway), and your typical desktop will probably do somewhere between one and tens of petaflops. I think it will be common to have a mobile device (phone/PDA) that can do teraflops. We may get there in the next ten years for mobile devices, particularly if you count GPU advances. They seem to be moving at the speed (or even faster) at which home systems were developing from 1975-1985.

I don't know that DNA or biocomputing will be practical unless you need to store lots of data that doesn't need to be accessed very quickly. (LOL maybe we are all part of a distributed computing system and the universe is an enormous Turing machine)

Photonics will be used in networking and chip-level interconnects, I'm pretty sure of that.

What are the unintended consequences of all this going to be? Maybe the internet will turn into some kind of hive mind; perhaps we are going to merge more with our machines? I'm sure the guys at Xerox and IBM didn't foresee huge numbers of programming/IS/IT jobs heading overseas when they were originally inventing all this stuff. Or streaming hardcore porn, for that matter.

Reply 10 of 47, by snorg

Rank: Oldbie

Forgot to mention:

I think you'll still be able to get something like the desktops or laptops we have now, if you want one. There will probably still be relatively high-end systems for research, engineering, development, etc. Even if not, with all those unemployed/retired/underemployed engineers and hackers doing cool open source projects, I would be very surprised if you couldn't get some sort of open system (Raspberry Pi), a retro machine upgraded way beyond its original design spec (think C64, Amiga, etc.), or some combination.
There will probably always be some version of Linux that is free.

We may see a resurgence of homebrew-type systems, only with much more modern gear: 32- and 64-bit processors in an FPGA, that type of thing.

Of course, you may be considered an eccentric (borderline criminal?) for not having an iWhatever or Googlebox/pad. The pendulum is swinging very hard back toward clamping down and closed systems; maybe we need some sort of authentic counter-culture movement?

Reply 13 of 47, by GXL750

Rank: Member

With the takeoff of cloud computing and the popularity of devices such as the iPad, I see the terminal-and-mainframe concept as having made a comeback, though in a much refined manner and with a broader purpose. However, I don't see the personal computer ever going away; it will have reduced market share at most. There are still countless reasons to have a full OS and offline local storage.

I don't think the emerging trends in computing of the past several years are pushing the PC away so much as complementing it.

As for the technology itself... it's a wonder. I remember in the early 2000s it was becoming pretty obvious we had hit a plateau for clock speeds, and as a result we now have multi-core chips designed with an increased emphasis on efficiency. It's funny to think that the power provided by top-tier supercomputer technology used by governments, scientists, etc. will, in a decade or two, be matched by a kid's game console.

Reply 14 of 47, by Filosofia

Rank: Member

Bigger, Better, Faster, More.

I think evolution and progress are not the same thing; this applies to tech as well.
For instance, how many mouse buttons do you really need? Except for specific apps, I'm happy with the traditional two buttons and the scroll wheel.

Screen size is a very good example: did we start to make content designed for bigger screens, or was it the other way around? If we had software that needed a bigger screen (say you work as an art director or with CAD) and we built the bigger screen to optimize those tasks, that is one thing. And if we want to play games on a monster screen, that's fine too; we're still optimizing that task (of entertaining ourselves).
BUT when we are forced to have a 40" screen because that is the only way we can access some content, and everyone else has one "just because"...

BGWG as in Boogie Woogie.

Reply 15 of 47, by The Gecko

Rank: Newbie

Two words:
Massive Parallelism

At least on the processing/hardware side of things, and especially on the enterprise side. The desktop/interactive side will probably require a software mini-revolution to really scale to this sort of thing. Parallelism in enterprise number crunching or request handling is one thing, but all the division of labour and synchronisation you'd need for interactive desktop applications is a royal pain.
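To make that contrast concrete, here is a minimal Python sketch (the workload is invented purely for illustration): the batch case parallelises with a plain scatter/gather and no coordination between workers, while even a toy "interactive" shared document forces every update through a lock.

```python
# Minimal sketch: embarrassingly parallel number crunching vs. shared
# interactive state. The workload is made up purely for illustration.
from multiprocessing import Pool
from threading import Thread, Lock

def crunch(n):
    # Independent work item: no shared state, so workers never coordinate.
    return sum(i * i for i in range(n))

def batch_demo():
    # Enterprise-style number crunching: scatter the inputs, gather the results.
    with Pool(processes=4) as pool:
        results = pool.map(crunch, [100_000] * 8)
    print("batch results:", len(results))

def interactive_demo():
    # A toy "document" touched by several threads: every update must take the
    # lock, which is exactly the division-of-labour and synchronisation pain.
    document = {"edits": 0}
    lock = Lock()

    def edit():
        for _ in range(10_000):
            with lock:
                document["edits"] += 1

    threads = [Thread(target=edit) for _ in range(4)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    print("interactive edits:", document["edits"])

if __name__ == "__main__":
    batch_demo()
    interactive_demo()
```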

But yeah, barring a major breakthrough in a game-changing technology: smaller, cheaper, faster, leaner, more storage. We might see a shift to fast solid-state media in a big way as production becomes cheaper. Once you hit a certain threshold, the storage tradeoff compared to spinning disks becomes less pronounced. Most home users are going to get more value out of near-instant seek times and massive throughput than they will out of 3 TB (and growing!) storage capacities.

I take a less bleak view of the cloud. Cloud apps are popular not because they're somehow better than desktop apps, or because they're being forced on people, but because we live in a networked world with a mess of different devices in different locations. The selling point is that suddenly all your devices can use a common interface (the web) to share data and sync it all in real time. That's an actual, viable value proposition, not necessarily (just) a power grab.

In some ways, I'm actually hopeful that the cloud will solve some problems. Since the heavy lifting gets done on the backend, we can use this to help close the digital divide. When a $30-$50 SOC based computer can run a web browser off shared community wifi, and send output to a cheap television via HDMI or something, then you've just given a huge number of people access to a vast wealth of information, communication/collaboration channels, things like word processors, education resources (Khan Academy?), community resources, and so forth.

I also think using Netflix as an example is a bit disingenuous, since you're talking about delivered content, not a computing platform. Television/movie content has always been something external that you have to bring in (either over a wire or on physical media) from an outside source to view. Netflix just moved it off the wire owned by your local cable co. and onto the common wire you use for all general-purpose data.

The worrying trend I see is the "walled garden" approach Apple takes to software. Apple sells you the device, and is the sole source for software. Approval, sales, authorisation - all Apple. They've got a lockdown on their mobile devices now, and while their proper computers remain open, they've made moves in this direction with their desktop app store. I think Microsoft is making similar moves with Windows 8. This is particularly dangerous, because they have a vested interest in suppressing objectionable software, disruptive software, and software in a competing position to their own. I say Apple here because they're the most visible offender for this right now, but they're certainly not the only ones who want this kind of software ecosystem.

Some of computing's important advances have come from disruptive software that would never have made it through the Apple Filter. Case in point, and it cuts pretty close to home on this forum, system emulation software.

If all else fails, use fire.

Reply 16 of 47, by Dominus

Rank: DOSBox Moderator

In Apple's defense, they seem to mean well but make dumb decisions.
1. Slapping the App Store license onto whatever app gets released makes it impossible for a lot of open source software to be on the App Store, because its license (GPL) doesn't allow another license on top.
2. Prohibiting running executables other than the app itself, which made the DOSBox port to iOS impossible and got it thrown out of the store a couple of times. Of course, 1. applied there too, so it would have vanished anyway. But understand that this is done for security reasons, which is great on the one hand (iOS has much less malicious software than Android) but too restrictive on the other. DOSBox IS potentially dangerous, as you can mount the root and just start deleting stuff 😉
Other system emulators are still on the App Store.

And I'd like to hear more about computing advances through disruptive software...

Windows 3.1x guide for DOSBox
60 seconds guide to DOSBox
DOSBox SVN snapshot for macOS (10.4-11.x ppc/intel 32/64bit) notarized for gatekeeper

Reply 17 of 47, by Joey_sw

Rank: Oldbie

also this: http://www.tomshardware.com/news/microsoft-pc … eger,17381.html

Somehow, I get this "PC User License" vibe,
that you must pay a license fee to MS (periodically) just to (continue) using your computers, somewhere in the future...

-fffuuu

Reply 18 of 47, by The Gecko

Rank: Newbie
Dominus wrote:

In Apple's defense, they seem to mean well but make dumb decisions.

That's what worries me, though. If the platform is locked down, then you're subject to the decisions of the owner company. If they make bad decisions, you suffer. Even if they make decent decisions that just happen to not align with your needs, you suffer.

I'm not actually trying to paint Apple as a bad guy here - they are, at this point, largely benign in terms of App Store policies. They're just one of the largest (possibly the largest) users of this software model in the consumer computing market.

Dominus wrote:

And I'd like to hear more about computing advances through disruptive software...

To clarify, perhaps I should have said disruptive technology. I don't necessarily want to limit it to software, although the software running the devices often plays a part.

Obviously, these are all speculative "what if" scenarios, since we'll never know what would have happened if Napster & friends hadn't been created:

- Napster (and its spinoffs). Completely illegal, but it really got people away from the idea that "the media is the music". I'm convinced that the portable music player market couldn't have been as successful as it was without easy (near effortless) access to an extensive library of digital music. I see it as the spark that lit the 'online digital media' fire. It never would have passed muster in a managed software environment.

- DVRs. You know what really sucked? Having to watch a show when a network decided to air it, regardless of what you might otherwise be doing, or not watch it at all. You could do timed recordings with VCRs, but the quality was shit and degraded every time you used it. The ease of use was also terrible. I'm not sure you can even find a mainstream cable provider who doesn't offer DVR boxes now, but when they were a new thing there was a lot of cable-company resistance to them. They messed with the almighty ratings, they messed with prime timeslots, and advertisers were afraid people would just skip ads. Given how they fought DVRs (there are still some court cases ongoing regarding ad skipping), I don't think this is something the networks would have rolled out on their own. More likely, we'd have some form of provider-run "on demand" service.

- Cleanroom reverse engineering of IBM Personal Computers started the IBM Compatible PC industry and commoditised personal computers. I'd think there would be little argument that IBM would not have authorised this.

- Crypto libraries implementing SSL, TLS and the like. They're free, ubiquitous. The digital economy is based on them - transactions need to be secure. You'd be a fool to conduct banking or purchases without them. Public-key cryptography makes the digital economy possible - that's not an exaggeration. I also shudder to think of the state of computer security if remote administration, authentication, etc. were all still done in cleartext. The US tried to restrict the export of such technologies at first, resulting in old browsers only being capable of useless short-key 40-bit systems (from days to seconds to brute force, depending on computing power).
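For a sense of scale on that "days to seconds" remark, a 40-bit keyspace is only about 1.1 trillion keys; the guess rates in this sketch are assumed round numbers, not benchmarks:

```python
# Rough arithmetic behind the "days to seconds" remark about 40-bit export
# crypto. The guess rates are illustrative assumptions, not benchmarks.
KEYSPACE = 2 ** 40  # ~1.1e12 possible keys

rates = {
    "one 1990s desktop (~1e6 keys/s)": 1e6,
    "small cluster (~1e9 keys/s)": 1e9,
    "large distributed effort (~1e11 keys/s)": 1e11,
}

for label, keys_per_second in rates.items():
    worst_case_seconds = KEYSPACE / keys_per_second
    print(f"{label}: ~{worst_case_seconds / 86400:.2f} days "
          f"({worst_case_seconds:,.0f} s) to try every key")
```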

If all else fails, use fire.

Reply 19 of 47, by Mau1wurf1977

Rank: l33t++

While computers have gotten faster and faster, IMO the software hasn't changed much. What most people do with Windows 7 and Office 2013 isn't much different from what they did with Windows 95/98 and Office 97. Productivity has maybe increased a little bit, but not much. Most tasks are still manual and repetitive.

So IMO software is really what is holding things back.

What I hope for is something close to AI: technology you can communicate with and give tasks that are easy for a computer to do.

Things we need to do manually at the moment, like setting up backup schedules, sync relationships and privacy settings, could just be communicated verbally and completed in no time.

Things like "I have 300 TV channels, can you see if there are any documentaries on?" are easily done on various TV sites, but this is still a manual task IMO, and if you have many things like this you constantly search and look for things rather than doing them.
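As a purely hypothetical sketch of the kind of lookup being described (the programme-guide data and field names below are made up), the filtering itself is trivial once the listings are machine-readable; the tedium is that today you do the equivalent by hand on each site:

```python
# Hypothetical sketch: scan an electronic programme guide for documentaries
# instead of browsing 300 channels by hand. The guide data and field names
# here are entirely made up for illustration.
from datetime import datetime

programme_guide = [
    {"channel": 12, "title": "Deep Sea Worlds", "genre": "documentary",
     "starts": datetime(2012, 9, 3, 20, 0)},
    {"channel": 45, "title": "Quiz Night Live", "genre": "game show",
     "starts": datetime(2012, 9, 3, 20, 30)},
    {"channel": 203, "title": "The Transistor Story", "genre": "documentary",
     "starts": datetime(2012, 9, 3, 21, 0)},
]

def documentaries_on(guide, after):
    """Return upcoming documentary listings, sorted by start time."""
    hits = [p for p in guide if p["genre"] == "documentary" and p["starts"] >= after]
    return sorted(hits, key=lambda p: p["starts"])

for p in documentaries_on(programme_guide, datetime(2012, 9, 3, 19, 0)):
    print(f'{p["starts"]:%H:%M}  ch{p["channel"]:>3}  {p["title"]}')
```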

So that's my vision / dream 😀

My website with reviews, demos, drivers, tutorials and more...
My YouTube channel