There is no blur and no distortion. Of course it's not usable due to the slowdown, but it looks good.
In fact, there is a small amount of blur due to bilinear interpolation...
I can see some loss in brightness (and maybe contrast), but no blur. Maybe the slightest hint of blur, but nothing I'd actually recognize as blur without constantly switching between the two images. I took two screenshots of a 320x200 game scene at 1440p, bumped the brightness and contrast of the near-perfect screenshot in GIMP, and they look very close.
Colors are still not correct, though, and it's easy to tell the difference in the dark areas of the hills in the left part of the image. But bilinear seems to be bad at this in general: it gets the colors wrong, and the result looks more "muddied" than "blurry". There are other algorithms that are supposed to be much better at preserving the original color balance and contrast when downscaling (Mitchell is one of the best, I think?). Doing it on the CPU, though, is probably not feasible.
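For anyone curious, here's a rough CPU sketch of the Mitchell-Netravali filter (the common B = C = 1/3 variant) applied as a separable resize. The numpy layout and function names are my own, and this is strictly offline code, nowhere near real-time speed:

```python
# Rough sketch of Mitchell-Netravali resampling (B = C = 1/3), CPU-only.
# Separable: filter along rows, then along columns.
import numpy as np

def mitchell(x, B=1/3, C=1/3):
    x = abs(x)
    if x < 1:
        return ((12 - 9*B - 6*C) * x**3
                + (-18 + 12*B + 6*C) * x**2
                + (6 - 2*B)) / 6
    if x < 2:
        return ((-B - 6*C) * x**3
                + (6*B + 30*C) * x**2
                + (-12*B - 48*C) * x
                + (8*B + 24*C)) / 6
    return 0.0

def resize_axis(img, new_len, axis):
    old_len = img.shape[axis]
    scale = old_len / new_len          # > 1 when downscaling
    filt_scale = max(scale, 1.0)       # only stretch the kernel when minifying
    support = 2 * filt_scale
    out_shape = list(img.shape)
    out_shape[axis] = new_len
    out = np.zeros(out_shape, dtype=img.dtype)
    for i in range(new_len):
        center = (i + 0.5) * scale - 0.5
        lo = max(int(np.floor(center - support)), 0)
        hi = min(int(np.ceil(center + support)), old_len - 1)
        taps = np.arange(lo, hi + 1)
        w = np.array([mitchell((t - center) / filt_scale) for t in taps])
        w /= w.sum()                   # normalize so brightness is preserved
        sl = np.take(img, taps, axis=axis)
        idx = [slice(None)] * img.ndim
        idx[axis] = i
        out[tuple(idx)] = np.tensordot(sl, w, axes=([axis], [0]))
    return out

def mitchell_resize(img, new_h, new_w):
    img = np.asarray(img, dtype=np.float64)
    return resize_axis(resize_axis(img, new_h, 0), new_w, 1)
```

Worth noting that a lot of the "muddied" color in bilinear downscales comes from resampling gamma-encoded sRGB values directly; converting to linear light first, resizing, and converting back tends to help regardless of which kernel you use.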
This picture pretty much sums up my opinions:
The effect is grossly exaggerated.
No, that's a CRT TV with an NES. It doesn't apply to PC CRT monitors, which are much, much sharper than CRT TVs. It was just an example to make the point that retro games were played on screens with a natural smoothing effect (roughly a Gaussian blur), with the phosphor grid further hiding some of the pixel aliasing. Shaders can help replicate that look to some extent (easier for retro console games, much harder for PC games, since those ran on sharp, low dot-pitch CRT monitors and you need something like an 8K LCD to emulate that kind of image).
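Just to illustrate the basic idea (this is nowhere near a real CRT shader, just smoothing plus fake scanlines, with made-up file names and parameters):

```python
# Very crude CRT-look sketch: nearest-neighbor upscale, mild Gaussian blur,
# then darken every Nth row to fake scanlines. A real CRT shader does far
# more (phosphor mask, bloom, curvature); this only shows the principle.
# Assumes Pillow and numpy; "screenshot.png" is a hypothetical input file.
from PIL import Image, ImageFilter
import numpy as np

SCALE = 4                # integer upscale factor
BLUR_RADIUS = 0.8        # stands in for the CRT's natural smoothing
SCANLINE_DARKEN = 0.75   # brightness multiplier on the "gap" rows

src = Image.open("screenshot.png").convert("RGB")
big = src.resize((src.width * SCALE, src.height * SCALE),
                 Image.Resampling.NEAREST)
soft = big.filter(ImageFilter.GaussianBlur(BLUR_RADIUS))

arr = np.asarray(soft).astype(np.float32)
arr[SCALE - 1 :: SCALE, :, :] *= SCANLINE_DARKEN  # one dark row per source line
Image.fromarray(arr.clip(0, 255).astype(np.uint8)).save("crt_look.png")
```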
In any event, a completely unprocessed image of a retro PC game just looks "wrong" to me. Bilinear upscaling is horrible, of course; I'll take integer upscaling instead any day of the week. But I like a good CRT shader even more 😀
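Integer upscaling itself is trivial, for what it's worth. A minimal sketch with Pillow, assuming a hypothetical 320x200 capture and a 1440p display (and ignoring the non-square-pixel aspect correction that 320x200 modes really want):

```python
# Minimal integer-upscaling sketch: pick the largest whole-number factor that
# fits the display, then nearest-neighbor scale. 320x200 into 2560x1440 gives
# factor 7 (2240x1400), leaving black borders instead of interpolation blur.
from PIL import Image

def integer_upscale(img, display_w, display_h):
    factor = min(display_w // img.width, display_h // img.height)
    if factor < 1:
        raise ValueError("display smaller than source image")
    return img.resize((img.width * factor, img.height * factor),
                      Image.Resampling.NEAREST)

game = Image.open("frame.png")  # hypothetical 320x200 capture
scaled = integer_upscale(game, 2560, 1440)
```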