Why do you think they are inaccurate? All it would take is for someone else to have uploaded a similar video, or even just individual frames similar to other content, and there would be a same/similar-content match between two users, which is probably why you got the notices. Given that it's game footage, from something as popular as DOOM... it's highly unlikely that you are the only person ever to have uploaded those frames, or ones very like them.
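To give a rough idea of how "similar frames" can match even across different videos, here's a minimal sketch of difference hashing (dHash), one common perceptual-hash technique. To be clear, this is just an illustration, not YouTube's actual matching system; the file names are made up and it assumes Pillow is installed.

```python
# Minimal dHash sketch: near-identical frames produce hashes that
# differ in only a few bits, even after re-encoding or mild scaling.
from PIL import Image

def dhash(path, hash_size=8):
    # Shrink to (hash_size+1) x hash_size grayscale, then compare each
    # pixel to its right-hand neighbour to build a 64-bit fingerprint.
    img = Image.open(path).convert("L").resize((hash_size + 1, hash_size))
    pixels = list(img.getdata())
    bits = []
    for row in range(hash_size):
        for col in range(hash_size):
            left = pixels[row * (hash_size + 1) + col]
            right = pixels[row * (hash_size + 1) + col + 1]
            bits.append(left > right)
    return sum(1 << i for i, b in enumerate(bits) if b)

def hamming(a, b):
    # Count differing bits; a small distance means visually similar frames.
    return bin(a ^ b).count("1")

# Hypothetical files: two different uploads of the same DOOM scene
# would typically land within a handful of bits of each other.
d = hamming(dhash("my_doom_frame.png"), dhash("their_doom_frame.png"))
print("similar" if d <= 10 else "different")
```

The point being: matching doesn't need identical files, just frames that look alike at a coarse level, which popular game footage produces in abundance.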
Flagging something as a false positive isn't really within a user's scope to decide. The neural networks would be completely skewed and pretty useless if every user could say "this isn't correct" (even when it was). If it really is a false positive, it would become "training data" for the models, which would require a "human" moderator to visually confirm whether it is a false positive or not... ergo every video flagged as a false positive would have to be manually checked and confirmed, which we can pretty safely assume YouTube/Google do not have the "human" resources to do. Or would even bother doing if they did have the resources.
A false positive indicates an identification error. Options like "original recording" or "public domain" are about data classification, not data matching.
i.e. there is a match with other content, but YouTube does not know whether the content itself is public domain or copyrighted... you can probably say it's public domain and sleep well at night.
As for what game recordings would come under... I would have thought they'd be well within the realm of "fair use", but I don't know, so don't quote me on that. You can tell YouTube about "fair use", though.
Image recognition gets better with larger training sets, and given the masses of data their models can learn from (all uploaded content, and probably more, since it's Google!), I highly doubt it would be getting things that wrong in this day and age... could be wrong though, only they know 😀.
/2cents