AFAIK you should always think in terms of EIP on the 80386+; it's just that EIP's high 16 bits are cleared when it's loaded by a 16-bit jump (operand size is 16 bits). That's one of the issues with 32-bit "huge" real mode (unreal mode with CS.D set and a high base plus a large limit via the granularity bit). Apparently, loading CS in real mode in any way (interrupts, far jumps) resets those bits, which makes a high-base/high-limit program come crashing down into low memory (<1 MB), executing whatever happens to be there. So something like JMP F000:12345678 actually runs F000:5678 instead, and the code becomes a 16-bit program until it re-enters protected mode or uses LOADALL.
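Just to illustrate the truncation part (NASM-style sketch; the offsets are only the ones from the example above, not a full unreal-mode setup):

```nasm
[bits 16]
; Default operand size here is 16, so a direct far jump can only carry a
; 16-bit offset (EA off16 seg16) -- the upper half of the target simply
; isn't representable:
    jmp 0xF000:0x5678            ; EIP <- 0x00005678

; Asking for the 32-bit form adds an operand-size prefix
; (66 EA off32 seg16), so the full offset is at least encoded:
    jmp dword 0xF000:0x12345678  ; ptr16:32 far jump
```

Even with the second form, per the above, the CS reload itself is what drops you back to ordinary 16-bit real-mode behaviour for whatever executes next.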
In other words, aside from how the segment descriptor cache is updated internally, there aren't many differences between real mode and protected mode (or, for a closer comparison of segment loads, a simplified Virtual 8086 mode): mainly interrupt handling and the segment-loading logic, where the cached descriptor is constructed directly from the selector instead of being fetched from a descriptor table in memory. Much of the protection machinery (cached descriptors, maybe even the VM bit in EFLAGS when set via LOADALL; it can't be set through POPFD) is still in place while running in real mode. So basically, protected mode and V86 mode aren't extensions of real mode: real mode and Virtual 8086 mode are extensions and hacks on top of protected mode, kept for compatibility with older CPUs. That's why things like unreal mode work in the first place (see the rough descriptor-cache sketch below).

I'm curious, though, what "V86 mode in real mode" would do if you set it up through LOADALL... Would it use protected-mode or real-mode interrupt handling? And would it load segments like Virtual 8086 mode (fully constructing the internal cached descriptor on every load) or like real mode (fully reloading only on CS loads, partially otherwise)?
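To make the "cached descriptor" part a bit more concrete, here's roughly what one per-segment entry looks like in the 386 LOADALL (0F 07) memory image, i.e. the hidden state that unreal mode relies on staying loaded. Field names are mine and the layout follows the usual undocumented-LOADALL write-ups, so treat it as a sketch:

```nasm
; One cached-descriptor entry as reportedly stored (per segment register)
; in the 386 LOADALL image -- the hidden part of the segment register that
; real-mode segment loads only partially rewrite:
struc seg_cache
    .attrib: resd 1   ; access rights / attribute bits (present, DPL, D/B, ...)
    .base:   resd 1   ; 32-bit linear base -- unreal mode leaves this pointing high
    .limit:  resd 1   ; 32-bit expanded limit (e.g. 0xFFFFFFFF after a 4 GiB load)
endstruc
```

A real-mode segment load rewrites .base from the 16-bit selector but leaves the limit (and at least some attributes) alone, which is exactly why unreal mode keeps working until something reloads CS.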