There are two major reasons why VGA-based LCD solutions struggle to produce pixel-perfect upscaling of Mode 13h 320x200 -> 1600x1200.
The first reason is that VGA does not carry a pixel clock line: the horizontal scanlines are analog, and the LCD has to reconstruct the pixel clock itself. This "genlock" mechanism has to rely on heuristics, since it cannot know what the original pixel width of the image was (was it 320? 360? 640? 720? Or something custom?). If you check the VGA capture threads, it is a common issue that LCD displays mis-guess the input video to be e.g. 640 pixels wide when it should be 720, or vice versa.
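To make the ambiguity concrete: the two standard VGA dot clocks, 25.175 MHz (640-wide modes) and 28.322 MHz (720-wide modes), both produce the same ~31.47 kHz line rate, so the line rate alone cannot tell them apart. A minimal sketch in C (the totals are the standard values; the rest is illustrative):

```c
#include <stdio.h>

int main(void) {
    /* The scaler can measure the line rate from hsync, but must guess
       how many pixel clocks a line holds to reconstruct the dot clock. */
    double line_rate_hz = 31469.0;
    struct { int h_total; int visible; } guesses[] = {
        { 800, 640 },   /* reconstructs ~25.175 MHz: 640/320-wide modes */
        { 900, 720 },   /* reconstructs ~28.322 MHz: 720/360-wide modes */
    };
    for (int i = 0; i < 2; i++)
        printf("h_total=%d -> dot clock %.3f MHz, %d visible pixels\n",
               guesses[i].h_total,
               line_rate_hz * guesses[i].h_total / 1e6,
               guesses[i].visible);
    return 0;   /* both answers are self-consistent: hence the mis-guesses */
}
```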
The second, and far trickier, reason is that DOS Mode 13h is not actually a 320x200 resolution as far as the signal on the VGA line is concerned. First, it is double-scanned (each of the 200 rows is sent out twice) and padded with an active picture "border" signal, growing it to 328x414 pixels (328 or 656 pixels horizontally, you can think of it as either, since there is no pixel clock). Then, on top of that, it is further enlarged to a 356x447 pixel signal by the horizontal and vertical front and back porches.
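To keep those numbers straight, here is a small sketch that reconstructs the totals above from their parts; the border and blanking amounts are just the differences of the quoted sizes:

```c
#include <stdio.h>

/* Per-axis shape of the Mode 13h signal: active picture, border pixels,
   and blanking (porches + sync), per the sizes quoted above. */
typedef struct { int active, border, blanking; } Axis;

int main(void) {
    Axis h = { 320, 8, 28 };   /* low-res pixels: 320 -> 328 -> 356        */
    Axis v = { 400, 14, 33 };  /* scanlines: 200 doubled -> 400 -> 414 -> 447 */
    printf("bordered picture: %dx%d\n", h.active + h.border,
                                        v.active + v.border);
    printf("full signal:      %dx%d\n", h.active + h.border + h.blanking,
                                        v.active + v.border + v.blanking);
    return 0;
}
```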
Reliably cropping away this border and the porches is a major part of implementing pixel-perfect upscaling. Doing it correctly requires a heuristic in the LCD display that embeds deep domain-specific knowledge about the different video modes: the display should count and match this 356x447 size, to know that it should crop away exactly 10 pixels from the left, 22 pixels from the right, 13 lines from the top and 34 lines from the bottom of the image. And it should only do this once the picture pixel contents, clocks and syncs have first been deemed to look like Mode 13h.
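In code, the kind of mode-specific crop table this implies could look like the sketch below (the sizes are the ones quoted above; a real detector would match fuzzily, and also check sync polarities, the ~70 Hz refresh and double-scan before trusting the match):

```c
#include <stdbool.h>
#include <stdio.h>

/* One entry of a hypothetical mode-specific crop table, using the
   Mode 13h numbers quoted above. */
typedef struct {
    int total_w, total_h;                /* expected full signal size */
    int crop_l, crop_r, crop_t, crop_b;  /* border+porch to strip     */
} CropRule;

static const CropRule mode13h = { 356, 447, 10, 22, 13, 34 };

/* Returns true and the cropped picture size if the measured signal
   matches this rule exactly. */
static bool apply_crop(const CropRule *r, int meas_w, int meas_h,
                       int *vis_w, int *vis_h) {
    if (meas_w != r->total_w || meas_h != r->total_h)
        return false;
    *vis_w = meas_w - r->crop_l - r->crop_r;
    *vis_h = meas_h - r->crop_t - r->crop_b;
    return true;
}

int main(void) {
    int w, h;
    if (apply_crop(&mode13h, 356, 447, &w, &h))
        printf("picture after crop: %dx%d\n", w, h);
    return 0;
}
```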
Further, these video modes are just conventions and the VGA adapter is programmable, so game programmers could (and did) freely change the border and porch sizes to produce signals of different shapes.
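For a flavour of how easy that retuning was, a DOS program only had to poke the VGA CRT controller I/O ports. A minimal sketch, assuming a Borland-style compiler for outportb()/inportb():

```c
#include <dos.h>   /* Borland-style outportb()/inportb() */

#define CRTC_INDEX 0x3D4   /* CRT controller index port (color adapters) */
#define CRTC_DATA  0x3D5

/* CRTC registers 0x00-0x07 are write-protected by bit 7 of register 0x11,
   so that bit must be cleared before touching the horizontal timings. */
static void unlock_crtc(void) {
    outportb(CRTC_INDEX, 0x11);
    outportb(CRTC_DATA, inportb(CRTC_DATA) & 0x7F);
}

/* Horizontal Display End (index 0x01) holds the displayed character
   clocks minus one; changing it resizes the active picture, and with it
   the effective border width. */
static void set_h_display_end(unsigned char chars_minus_one) {
    outportb(CRTC_INDEX, 0x01);
    outportb(CRTC_DATA, chars_minus_one);
}

int main(void) {
    unlock_crtc();
    set_h_display_end(0x4F);   /* 80 character clocks, the Mode 13h default */
    return 0;
}
```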
800x600 is slightly different in that when VESA standardized SVGA, it did away with the programmable-width border in that video mode. And by that time, programming custom timings was far less common than it used to be, so LCD manufacturers could read the "one true 800x600 SVGA timing" off the predetermined VESA DMT.pdf standards specification, and develop mode and crop detection based on that.
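As an illustration, the timing a monitor can hard-code for this mode boils down to a single constant. I am quoting the DMT 800x600@60 Hz entry from memory, so verify against the spec before relying on it:

```c
#include <stdio.h>

/* VESA DMT 800x600@60 Hz, quoted from memory: the point is that the
   monitor can match a fixed 1056x628 total instead of guessing. */
typedef struct {
    double pixel_clock_mhz;
    int h_active, h_front_porch, h_sync, h_back_porch;  /* pixels */
    int v_active, v_front_porch, v_sync, v_back_porch;  /* lines  */
} Timing;

static const Timing dmt_800x600_60 = {
    40.000,
    800, 40, 128, 88,   /* horizontal total: 1056 pixels */
    600,  1,   4, 23,   /* vertical total:    628 lines  */
};

int main(void) {
    const Timing *t = &dmt_800x600_60;
    int ht = t->h_active + t->h_front_porch + t->h_sync + t->h_back_porch;
    int vt = t->v_active + t->v_front_porch + t->v_sync + t->v_back_porch;
    printf("total %dx%d, refresh %.2f Hz\n", ht, vt,
           t->pixel_clock_mhz * 1e6 / ((double)ht * vt));
    return 0;
}
```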
After DVI-D came along, pixel-perfect VGA support would probably not have been the biggest selling point an LCD display could have, since DVI-D was marketed exactly for that. So I presume no vendor ever tried to perfect the ultimate DOS VGA border+porch cropping heuristic for differently shaped video signals; they generally ship one heuristic that is good enough for most video modes, even if not perfect, and expect users to move on to DVI-D to get the pixel-perfect result.
Our CRT Terminator project does solve these issues, by implementing this type of deep signal shape heuristic analysis to figure out the required border and porch crop sizes, and by bypassing the analog VGA signal altogether: CRT Terminator Digital VGA Feature Card ISA DV1000. Unfortunately the project is not yet at a stage where the card would be available; hopefully still later this year though.
It upscales 320x200, 320x240 and 800x600 pixel-perfect to 1600x1200 for viewing on 1600x1200 and 1920x1200 displays. We find that 640x480 gives the best results when first upscaled 2x to 1280x960, and then letting the LCD monitor do its own scaling to take the 1280x960 image further up to 1600x1200. 70 Hz modes are supported as long as the monitor in question supports them (I have found two modern 1920x1200@75 Hz displays, the ASUS ProArt PA248QV and the Philips Brilliance 252B9/00, that can both do this fine; I like the ASUS a little better for its UI buttons), but CRT Terminator also supports decimation down to 60 Hz if using a monitor or capture card that does not support 70 Hz.
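Those ratios are consistent with simply picking the largest integer factor per axis that still fits the target; I am inferring that rule from the numbers rather than quoting the implementation, but the sketch below reproduces them:

```c
#include <stdio.h>

/* Largest integer scale factor that fits src into dst on one axis. */
static int int_scale(int src, int dst) { return dst / src; }

int main(void) {
    const int modes[][2] = { {320,200}, {320,240}, {640,480}, {800,600} };
    for (int i = 0; i < 4; i++) {
        int w = modes[i][0], h = modes[i][1];
        int sx = int_scale(w, 1600), sy = int_scale(h, 1200);
        printf("%dx%d -> %dx%d (x%d horizontal, x%d vertical)\n",
               w, h, w * sx, h * sy, sx, sy);
    }
    /* 320x200, 320x240 and 800x600 land exactly on 1600x1200, while
       640x480 only reaches 1280x960: hence the monitor is left to do
       that last scaling step itself. */
    return 0;
}
```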