PC games from 1994-1996 had no 3D hardware to work with; the devs had to write their own rendering code in software. Regardless of what "internal precision" the game does its 3D calculations at, the rasterization step - converting polygons in 3D space to pixels on a 2D screen - is written into the game code itself. In other words, the game will only ever draw the exact number of pixels the devs wrote it to. There's nothing an emulator can realistically do about that.
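To make that concrete, here's a toy sketch (hypothetical code, not from any real game) of how a mid-90s software renderer bakes its resolution in. The framebuffer size is a constant in the code, and the projection math targets those exact dimensions - there's no knob for an outside program to turn:

```python
# Toy software renderer sketch. WIDTH/HEIGHT are hardcoded constants,
# just like they were in period game code - the game can only ever
# produce this many pixels.
WIDTH, HEIGHT = 320, 200
framebuffer = bytearray(WIDTH * HEIGHT)  # one byte per pixel (palette index)

def project(x, y, z, fov=256):
    """Perspective-project a 3D point straight into 320x200 screen space."""
    sx = WIDTH // 2 + int(x * fov / z)
    sy = HEIGHT // 2 + int(y * fov / z)
    return sx, sy

def put_pixel(sx, sy, color):
    """Write directly into the fixed-size framebuffer."""
    if 0 <= sx < WIDTH and 0 <= sy < HEIGHT:
        framebuffer[sy * WIDTH + sx] = color

# The game projects and plots - resolution never appears as a parameter.
sx, sy = project(1.0, 0.5, 4.0)
put_pixel(sx, sy, 255)
```

An emulator running this can only stretch the finished 320x200 image; it can't make the code above produce more pixels.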
For consoles (GameCube, etc.), the rasterization step was handled by the console's dedicated graphics hardware. The game would feed the console its 3D geometry, and the console itself would convert that to pixels at its output resolution. In other words, the rasterization half of the rendering process is done by something the emulator has to implement itself, so the emulator authors have the ability to change the output resolution.
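A sketch of that split (again hypothetical, heavily simplified): the game only hands over geometry, and the resolution scale lives entirely inside the emulator's rasterizer, where the game never sees it:

```python
# The console's native output resolution.
NATIVE_W, NATIVE_H = 640, 480

def rasterize_vertex(x, y, scale=1):
    """Emulator-side: map a game-supplied vertex to output pixels.
    `scale` is the emulator's internal-resolution multiplier - the
    game submits the same coordinates regardless of its value."""
    return int(x * scale), int(y * scale)

# Same game-supplied vertex, two different output resolutions:
native = rasterize_vertex(100, 50, scale=1)  # 640x480 output
hires  = rasterize_vertex(100, 50, scale=3)  # 1920x1440 output
```

Since the emulator owns this step, "internal resolution" becomes a settings menu option rather than something welded into the game.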
(This is why upscaled PS1 games can look kind of ropey - the internal geometry calculations were never meant to feed a rasterizer operating above 320x240, so they're just not precise enough to make everything "smooth.")
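A quick illustration of the precision problem, with made-up numbers: the PS1's geometry engine snaps vertex positions to whole pixels in native-resolution space, so by the time an emulator upscales, the sub-pixel information is already gone and the snapping error just gets multiplied along with everything else:

```python
def ps1_snap(x):
    """PS1-style: screen coordinates get truncated to whole pixels."""
    return int(x)

true_x = 123.7             # where the vertex "should" land at 320x240
snapped = ps1_snap(true_x) # 123 - the .7 of a pixel is lost for good

# At 4x internal resolution the ideal position would be int(123.7 * 4) = 494,
# but the emulator only has the already-snapped value to scale up:
upscaled = snapped * 4     # 492 - a couple of high-res pixels off
```

Every vertex carries its own small error like this, and since the errors change frame to frame, you get the characteristic PS1 vertex wobble even at high resolutions.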
You actually can upscale PC games that use 3D hardware APIs, via wrappers that reimplement those APIs - I believe DGVoodoo can do this, and GLRage definitely can. I was playing the ATI version of Wipeout at 1600x1200 a while ago, a resolution it never natively supported. Not everything works, and if the game draws 2D on top of its 3D in the wrong way (HUDs, etc.), things can break.
I've oversimplified a few things here, but that's the basic explanation. Hope it was easy to follow.
twitch.tv/oldskooljay - playing the obscure, forgotten & weird - most Tuesdays & Thursdays @ 6:30 PM PDT. Bonus streams elsewhen!