BinaryDemon wrote:That's pretty ridiculous; unless Nvidia wanted to do this, I can't see it working.
But I bet you could come close to simulating 286 performance by taking a semi-modern CPU/mobo that does support PCIe, reducing it to its slowest clock speed, and disabling all caches. You could even use a PCIe x16-to-x1 adapter and possibly throttle the PCIe link speed in the BIOS (if supported).
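As a rough sketch of the throttling idea (assuming a Linux box with the `cpupower` tool from linux-tools; actual minimum frequency and available BIOS toggles depend entirely on your hardware):

```shell
# Hypothetical sketch: pin the CPU to its lowest P-state (requires root).
sudo cpupower frequency-set --governor powersave       # always select the lowest frequency
cpupower frequency-info | grep "current CPU frequency" # verify the clock actually dropped

# Caches generally can't be disabled from userspace; some BIOSes expose a
# "CPU Internal/External Cache" toggle, and PCIe link speed (Gen1/Gen2/Gen3)
# is likewise a BIOS/UEFI option where supported.
```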
Personally I'd use a modern OS on it, because what kind of meaningful benchmark are you going to run under Windows 3.0?
No fun in simulating. The real thing would be the real fun! The tougher the challenge, the greater the fun!
derSammler wrote:No. Apart from the fact that you can't adapt PCIe to ISA, a 286 would not even be able to execute (or even address) the BIOS of the card.
Again, the tougher the challenge… (same as above ↑). A feat worth a thousand IT résumés!
Standard Def Steve wrote:agent_x007 paired a GTX 1080Ti with a Northwood Celeron, courtesy of a rare S478 PCIe motherboard. Those Northwood-128 chips aren't much faster than a 286. 😁
🤣! Respect bro!
mrau wrote:Will Nvidia's driver even fit in the 16 MB max of RAM that can be addressed?
edit: It's not even 16 megs - devices have to fit in that space too :>
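mrau's 16 MB figure follows from the 286's 24-bit address bus, and the edit is right that devices eat into it. A quick back-of-the-envelope check (the 384 KB reservation below is just the classic upper-memory-area figure, used here for illustration):

```python
# The 80286 has a 24-bit physical address bus, so its address space is 2**24 bytes.
address_bits = 24
total_bytes = 2 ** address_bits
print(total_bytes // (1024 * 1024), "MiB total")  # 16 MiB

# Part of that space is claimed by devices (video RAM, option ROMs, system BIOS),
# classically the 384 KB between 640 KB and 1 MB, so usable RAM is less than 16 MiB.
reserved = 384 * 1024  # illustrative upper-memory-area reservation
print((total_bytes - reserved) // 1024, "KiB left for RAM")
```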
You totally have a point! Mmmm…..
oeuvre wrote:pff, RTX2080 is what you want to pair with a 286
Ahahahahahaha FANTASTIC! I was undecided about mentioning that… I just wanted the thread popularity to spike… but, definitely! Yes, let’s change plan, let’s go with the RTX2080 in the 286!
Hey, why not, a tweaked mobo to get a dual CPU 286? HAHA!
The Serpent Rider wrote:with the support of a skilled programmer that will write drivers accordingly
Good luck reverse engineering Nvidia's proprietary spaghetti-code drivers to fit them into a 16-bit system. Heck, at least AMD has open-source drivers for old GPUs.
Didn’t think about the legality issues… well, whoever does it will add some hacking/cracking to their résumé! (y)
They said therefore to him: Who are you?
Jesus said to them: The beginning, who also speak unto you