Does your monitor have a sampling phase adjustment, and does adjusting it have any effect, such as shifting the bars horizontally? Does automatic adjustment help?
I took a look at some of the TVGA9000 cards, and the simplest cards do not have any kind of anti-aliasing filter on the DAC output.
An analog monitor may not care, or may just show some undershoot or overshoot at pixel boundaries, but the TFT's ADC will care, depending on the sampling phase. Some later TVGA 9000i models do have footprints for the filter components, but they are populated on some boards and left unpopulated on others. So I *do* blame the VGA cards in this case.
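For a sense of scale, here is a sketch of what such an output filter does. The component values are hypothetical, picked purely for illustration (they are not from any TVGA9000 schematic), assuming a simple first-order RC low-pass working into the 75-ohm monitor termination:

```python
import math

# Hypothetical first-order RC low-pass on the DAC output.
# Values are illustrative only, NOT taken from a TVGA9000 schematic.
R = 75.0        # ohms, the monitor-side termination the filter works into
C = 100e-12     # farads, a plausible small filter capacitor

f_cutoff = 1 / (2 * math.pi * R * C)     # -3 dB corner frequency

pixel_clock = 28_322_000                 # 720x400 text-mode pixel clock, Hz
detail = pixel_clock / 2                 # alternating-pixel fundamental, Hz

print(f"cutoff:        {f_cutoff/1e6:.1f} MHz")
print(f"pixel clock:   {pixel_clock/1e6:.1f} MHz")
print(f"finest detail: {detail/1e6:.1f} MHz")
```

With these made-up values the corner sits at roughly 21 MHz: the finest pixel detail (alternating pixels, about 14 MHz) mostly passes, while the sharp DAC staircase edges at and above the pixel clock get rounded off. Without any filter, the ADC in the TFT sees the full staircase, which is exactly when the sampling phase starts to matter.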
Based on the striping, the TFT seems to sample the signal at its own frequency, most likely because it wants to scale the analog active portion to fit the screen, instead of sampling at the VGA pixel clock and scaling digitally. Since most TFTs can lock to the signal and scale digitally, phase adjustment should work. So it might be that the display does not support a 70Hz VGA signal, or that it is tuned for the 720x400@70Hz VGA text mode rather than the 320x200@70Hz VGA graphics mode. The two should have almost identical timing, except for the pixel clock and the fact that the graphics mode is double-scanned to 400 lines (which the TFT cannot distinguish anyway).
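To show how close those two modes are from the monitor's point of view, here is a quick comparison using the nominal standard VGA figures (28.322 MHz over 900x449 total clocks/lines for 720x400 text mode, 25.175 MHz over 800x449 for double-scanned 320x200; these are the standard numbers, not measurements from this card):

```python
# Rough comparison of the two 70 Hz VGA modes (nominal standard timings,
# not measured from this particular TVGA9000 card).
def rates(pixel_clock_hz, h_total_clocks, v_total_lines):
    hsync = pixel_clock_hz / h_total_clocks   # horizontal scan rate, Hz
    vsync = hsync / v_total_lines             # vertical refresh rate, Hz
    return hsync, vsync

# 720x400 text mode: 28.322 MHz pixel clock, 900 clocks/line, 449 lines/frame
text_h, text_v = rates(28_322_000, 900, 449)

# 320x200 graphics mode (mode 13h): 25.175 MHz clock, 800 clocks/line,
# 449 lines/frame -- double-scanned, so the monitor sees 400 active lines
gfx_h, gfx_v = rates(25_175_000, 800, 449)

print(f"text:     {text_h/1000:.3f} kHz, {text_v:.2f} Hz")   # ~31.469 kHz, ~70.09 Hz
print(f"graphics: {gfx_h/1000:.3f} kHz, {gfx_v:.2f} Hz")     # ~31.469 kHz, ~70.09 Hz
```

Both modes come out at about 31.47 kHz / 70.09 Hz, so a monitor that only looks at the sync rates cannot tell them apart. It has to guess the pixel clock (900 vs. 800 clocks per line), and guessing the text-mode value for a graphics-mode signal would produce exactly this kind of periodic striping.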