I guess "100% brightness" can only be an arbitrary definition, since it varies quite a bit between monitors. And just to illustrate the issue: if vertical deflection fails and the whole beam lands on a single horizontal line across the middle of the screen, that line will be many times brighter than any "100%" you could reach under normal conditions.
But even if you settle on some arbitrary "100%", there can't really be a 'correct' answer, because RGB color spaces are fundamentally different from single-phosphor "color spaces", and I don't think there's even a meaningful way to compare the gamuts. The long and short of it is that you'd have to settle for an approximation that looks somewhere in the ballpark perceptually.
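One way to get such a ballpark approximation is to treat the phosphor as a fixed CIE chromaticity whose luminance scales with beam intensity, then map that through the standard sRGB matrix and clip whatever falls outside the gamut. A minimal sketch, with an assumed P31-like green chromaticity (real phosphors vary, and the clipping is exactly the lossy gamut comparison described above):

```python
# Approximate a single-phosphor CRT color in sRGB.
# Assumption: phosphor chromaticity (x, y) ~ (0.193, 0.420), roughly a
# P31-style green -- a hypothetical value, not a measured one.

def phosphor_to_srgb(x, y, Y):
    """Map phosphor chromaticity (x, y) at relative luminance Y (0..1)
    to gamma-encoded sRGB, clipping out-of-gamut components."""
    # xyY -> XYZ
    X = x * Y / y
    Z = (1.0 - x - y) * Y / y
    # XYZ -> linear sRGB (IEC 61966-2-1 matrix, D65 white point)
    r = 3.2406 * X - 1.5372 * Y - 0.4986 * Z
    g = -0.9689 * X + 1.8758 * Y + 0.0415 * Z
    b = 0.0557 * X - 0.2040 * Y + 1.0570 * Z

    def encode(c):
        c = min(max(c, 0.0), 1.0)  # clip: the gamuts don't actually match
        return 12.92 * c if c <= 0.0031308 else 1.055 * c ** (1 / 2.4) - 0.055

    return tuple(encode(c) for c in (r, g, b))

print(phosphor_to_srgb(0.193, 0.420, 0.5))
```

A saturated phosphor chromaticity like this one lies outside the sRGB triangle, so the red channel clips to zero; that clipping is why the result can only ever be perceptually "in the ballpark" rather than correct.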
Long ago I made some attempts here (some of them based on this), but their "correctness" is also subjective.