
First post, by 386SX

Rank: l33t

Hi,
I would like to know which movies you have seen where the actors used old (or older) computers to do tasks impossible for that hardware period/generation, obviously just for the usual 70s, 80s and 90s technological wow effect, where everything was possible even with some 1 MHz 6502 CPU. 😁

Right now I can only remember Tron Legacy (a good movie), where an 80s i386 Linux-based server has been running the immense complexity of the "Grid" virtual world for twenty years, and is also capable of transferring data from its internal hard disk directly to a Nokia phone's SD card. 😁

Reply 1 of 23, by Gemini000

Rank: l33t

Uh... Tron Legacy's server room is supposed to be a modern bank of servers. In fact, you see a shot in the movie of when Flynn creates the new grid where all you've got is the floor which stretches for infinity and a vast void within which to make a new world. He even mentions how Tron was created by Alan for the "old system", which would be the server bank from the first Tron movie. :B

I do question how he managed to get that massive laser rig to fit into that tiny office though. :P

WarGames has some questionable computing power for sure. The speech synthesis thing, not to mention the AI and the use of modems back at a time when modems were still EXTREMELY rare. The Last Starfighter isn't a good example of computer impossibilities but arcade impossibilities, as the type of 3D gaming going on in the arcade machine in that movie is MUCH more elaborate than was possible at the time.

If you wanna go the other direction though, Star Trek Nemesis has way too many low-resolution LCD panels which look like crap, even when compared with the ST:TNG television series itself. Guess financial waste finally caught up with the Starfleet ship techs. :P

--- Kris Asick (Gemini)
--- Pixelmusement Website: www.pixelships.com
--- Ancient DOS Games Webshow: www.pixelships.com/adg

Reply 3 of 23, by brassicGamer

Rank: Oldbie

They used a Toshiba Tecra 740CDT in Iron Man to make the improvised Iron Man suit in the bunker in the middle of the desert.

Check out my blog and YouTube channel for thoughts, articles, system profiles, and tips.

Reply 5 of 23, by ahendricks18

Rank: Member

Ghostbusters (2?) had a cool looking old machine. Can't find the picture on the web right now though.

Main: AMD FX-6300 six-core 3.5 GHz (OC 4 GHz)
16 GB DDR3, Nvidia GeForce GT 740 4 GB graphics card, running Win7 Ultimate x64
Linux: AMD Athlon 64 4000+, 1.5 GB DDR, Nvidia Quadro FX 1700, running Debian Jessie 8.4.0

Reply 6 of 23, by 386SX

Rank: l33t
Gemini000 wrote:

Uh... Tron Legacy's server room is supposed to be a modern bank of servers. In fact, you see a shot in the movie of when Flynn creates the new grid where all you've got is the floor which stretches for infinity and a vast void within which to make a new world. He even mentions how Tron was created by Alan for the "old system", which would be the server bank from the first Tron movie. :B

I do question how he managed to get that massive laser rig to fit into that tiny office though. 😜

WarGames has some questionable computing power for sure. The speech synthesis thing, not to mention the AI and the use of modems back at a time when modems were still EXTREMELY rare. The Last Starfighter isn't a good example of computer impossibilities but arcade impossibilities, as the type of 3D gaming going on in the arcade machine in that movie is MUCH more elaborate than was possible at the time.

If you wanna go the other direction though, Star Trek Nemesis has way too many low-resolution LCD panels which look like crap, even when compared with the ST:TNG television series itself. Guess financial waste finally caught up with the Starfleet ship techs. 😜

But in Tron Legacy, wasn't the hardware supposed to have been built and then left to gather dust before Flynn's last login in the 80s? So I would expect the server to be 386-based, considering that nobody else entered or upgraded that room for twenty years. (Not to mention the capacitive touchscreen, which is beyond discussion 😁)


Reply 7 of 23, by 386SX

Rank: l33t

The thing I like a bit more about Tron Legacy than the other tech movies is that Sam actually uses a realistic command-line interface, and that you can actually hear the old hard drives spinning up in the old servers while he is at the desk. 😉

Reply 8 of 23, by 386SX

Rank: l33t
Matth79 wrote:

The Last Starfighter has an excuse ... the arcade machine is an alien recruitment test.

PS. Didn't Terminator have 6502 assembly code?

You mean when they show the interface the Terminator actually sees through his eyes? I seem to remember some assembly instructions...

Reply 9 of 23, by Interlace

Rank: Newbie

Basically... 90% of "Hackers" from 1995. The only real(istic) things in that movie were Angelina Jolie's nipples... still an awesome movie though 😁


Reply 10 of 23, by Gemini000

Rank: l33t
386SX wrote:

But in Tron Legacy, wasn't the hardware supposed to have been built and then left to gather dust before Flynn's last login in the 80s? So I would expect the server to be 386-based, considering that nobody else entered or upgraded that room for twenty years. (Not to mention the capacitive touchscreen, which is beyond discussion :D)

Interesting point... The tech in the Tron movies seems to be slightly more advanced than real-world tech of the era in which these things take place, so... I think we can give it the benefit of the doubt there. :B

Also, Sam's phone has clearly been hacked so it can connect to just about anything he can jam a cable into. ;D

BTW: If they make a new Tron movie at some point I really hope they do it in such a way that we get to see BOTH systems in the same movie, the old one and the new one! :)

--- Kris Asick (Gemini)
--- Pixelmusement Website: www.pixelships.com
--- Ancient DOS Games Webshow: www.pixelships.com/adg

Reply 11 of 23, by 386SX

Rank: l33t
Gemini000 wrote:
386SX wrote:

But in Tron Legacy, wasn't the hardware supposed to have been built and then left to gather dust before Flynn's last login in the 80s? So I would expect the server to be 386-based, considering that nobody else entered or upgraded that room for twenty years. (Not to mention the capacitive touchscreen, which is beyond discussion 😁)

Interesting point... The tech in the Tron movies seems to be slightly more advanced than real-world tech of the era in which these things take place, so... I think we can give it the benefit of the doubt there. :B

Also, Sam's phone has clearly been hacked so it can connect to just about anything he can jam a cable into. ;D

BTW: If they make a new Tron movie at some point I really hope they do it in such a way that we get to see BOTH systems in the same movie, the old one and the new one! 😀

I saw the first movie after having seen Legacy, and I can understand why it is certainly a cult movie: the whole world is virtual, and it was probably one of the first movies with a similar idea.
But in the first one too, the computing power was a bit "too much"... 😁

Reply 12 of 23, by Gemini000

Rank: l33t
386SX wrote:

I saw the first movie after having seen Legacy, and I can understand why it is certainly a cult movie: the whole world is virtual, and it was probably one of the first movies with a similar idea.
But in the first one too, the computing power was a bit "too much"... :D

Well, yeah. :D

AFAIK, Tron was the very first movie to combine computer animation and live action together. The Last Starfighter was the first movie to use computer animation to represent real-life objects and in fact, they had the technology to make The Last Starfighter look even MORE real, but not the time, so they had to cut down on the polygons and everything to be able to actually finish it. :P

--- Kris Asick (Gemini)
--- Pixelmusement Website: www.pixelships.com
--- Ancient DOS Games Webshow: www.pixelships.com/adg

Reply 13 of 23, by Kerr Avon

Rank: Oldbie

Bender, from Futurama, has a 6502 for a brain. No wonder he's so rotten!*

6502.jpg

http://jledger.proboards.com/thread/3194/benders-6502-brain

* I was on the Spectrum side of the Spectrum/C64 wars, so I prefer the Z80A to the 6502!

Reply 14 of 23, by sliderider

Rank: l33t++
Kerr Avon wrote:

Bender, from Futurama, has a 6502 for a brain. No wonder he's so rotten!*

6502.jpg

http://jledger.proboards.com/thread/3194/benders-6502-brain

* I was on the Spectrum side of the Spectrum/C64 wars, so I prefer the Z80A to the 6502!

If it wasn't for the 6502, you might not be sitting in front of a computer right now. MOS Technology created a masking process for the 6502 that dramatically boosted the yield of working chips from a single wafer, pushing CPU prices down and forcing other CPU makers to do the same. The Intel x86 line might never have become affordable enough to become the standard if it weren't for the downward price pressure applied to the entire industry by the 6502.

Reply 15 of 23, by Kreshna Aryaguna Nurzaman

Rank: l33t
Gemini000 wrote:

If you wanna go the other direction though, Star Trek Nemesis has way too many low-resolution LCD panels which look like crap, even when compared with the ST:TNG television series itself. Guess financial waste finally caught up with the Starfleet ship techs. 😜

...and Babylon 5 has CRT monitors, even though it's supposed to take place in our future. And yes, the aliens use CRTs too. What happened to LCDs in the future? Maybe the Vorlons find LCDs offensive?

Never thought this thread would be that long, but now, for something different.....
Kreshna Aryaguna Nurzaman.

Reply 16 of 23, by RacoonRider

Rank: Oldbie

Star Trek: The Original Series. The computer can do tasks of infinite complexity and accepts speech commands that only a human could understand, yet it has no video output.

Reply 17 of 23, by Tertz

Rank: Oldbie

Old-school magic 😀: Weird Science (1985), Memotech MTX512
http://www.youtube.com/watch?v=B8dldLG_ZhI

Klondike:
http://starringthecomputer.com/computers.html

DOSBox CPU Benchmark
Yamaha YMF7x4 Guide

Reply 18 of 23, by Kerr Avon

Rank: Oldbie
sliderider wrote:
Kerr Avon wrote:

Bender, from Futurama, has a 6502 for a brain. No wonder he's so rotten!*

6502.jpg

http://jledger.proboards.com/thread/3194/benders-6502-brain

* I was on the Spectrum side of the Spectrum/C64 wars, so I prefer the Z80A to the 6502!

If it wasn't for the 6502, you might not be sitting in front of a computer right now. MOS Technology created a masking process for the 6502 that dramatically boosted the yield of working chips from a single wafer, pushing CPU prices down and forcing other CPU makers to do the same. The Intel x86 line might never have become affordable enough to become the standard if it weren't for the downward price pressure applied to the entire industry by the 6502.

I didn't know that! I hope it was clear from my post, though, that I wasn't really knocking the 6502. It's a great CPU, it's just that I was a part of the Spectrum vs C64 playground arguments*, so I'm contractually bound to (jokingly) knock the C64 and anything related to it at every opportunity.

* At least the less serious, I-wish-I-also-owned-the-other-machine part of it, unlike the more vicious (and immature) format wars you get nowadays. Even though nowadays the format is much less important than it ever was, as most of the machines now have similar hardware (well, the PC, PS4, and Xbox One do), and most of the big games are cross-platform. Not like in the 8-bit days, when many of the best games were exclusive to one platform, or, if they were ported to other machines, the ports were often lacking or just plain bad.

Reply 19 of 23, by shamino

Rank: l33t

Independence Day
It kills me how Jeff Goldblum plugs his laptop into an alien computer network and infects it with a virus.

Every Crime Show
The magic of "image enhancement"

====
I'm not sure I can count Terminator's 6502 code. I think it was intended as an easter egg for those who caught it.

The one that bothered me was the futuristic CPU chip itself as shown in Terminator 2. This was the CPU for the old-model Terminator (the Schwarzenegger model, which Wikipedia calls a T-800). It is the focus of at least a couple of scenes and part of the plot, so I think it counts as an important detail.

In one scene, the Cyberdyne dude is in the present day, designing the chip that would become the T-800 CPU. I can't find a picture, but as I remember it, it's presented as an archaic DIP chip with something like 16 long pins. There's no way I can accept that this packaging form factor could or would be used for a high-performance CPU, even in the early 90s. It belongs in an early 8-bit PC playing Pac-Man.
Hmm... at least the look of that chip is consistent with the 6502 code mentioned earlier. Maybe MOS is Cyberdyne.

Maybe a DIP looks more interesting on screen than a PGA or some surface-mounted chip, so I guess I can forgive the moviemakers for using a DIP. They could have at least given it more pins, though. It must have terrible bandwidth.

The only CPU pictures I could find were of the actual T-800 CPU (from the future), which they pull out of Ahnold's melon. That chip looks like a USB flash drive with a heatsink on it. It has an edge connector with maybe 4 pins. I'll concede this is future technology, but I still have a hard time accepting how that form factor could ever work as a CPU. If it's pushing power and data through 4 pins, it must be running at such insane frequencies that an edge connector wouldn't work.