First post, by superfury
Currently it just reads values from memory one byte at a time, shifting left to assemble 16-bit and 32-bit quantities. All structures (CPU and hardware) used in the emulator are little-endian structs. Do I need to add any logic to convert to/from little endian (no bitfields are used anymore)? If so, where should that logic live? In the MMU, for 8/16/32-bit reads/writes from memory? Do the structures (typedef struct) need to be changed?
Edit: I've modified all CPU registers and word/dword conversion unions to support big-endian byte order. Are there any other data structures (related to x86 PCs) that would need to be changed?
Currently words are read from memory like this (writes are the reverse):
Word: mem[addr] | (mem[addr+1] << 8)
DWord: word(addr) | (word(addr+2) << 16)
Is that fully cross-platform compatible? Will it work correctly on big-endian machines?
Btw, some registers like the IDTR and GDTR are read as word and byte quantities on the 80286. Is it still cross-platform when assembled with shifts like that?
E.g.
LDTR.Limit = readMMUword(addr)
LDTR.Base = readMMUword(addr+2)|(readMMUbyte(addr+4)<<16)
What about arrays (like descriptor tables) that are read from memory in 64-bit quantities using two 32-bit reads (thus byte-swapped into host order on big-endian machines), but whose fields are then accessed as interleaved byte and word quantities?
Edit: I've modified the full CPU (as well as the SoundFont loading) to process everything from memory as little-endian, reading byte-by-byte and converting to the host's byte order when used as 16-bit or 24-bit quantities.
Does this mean UniPCemu can now run on any Big-Endian CPU as well as Little-Endian CPUs?
Author of the UniPCemu emulator.
UniPCemu Git repository
UniPCemu for Android, Windows, PSP, Vita and Switch on itch.io