VOGONS


Does anyone own a MISTer FPGA and how is it?


Reply 20 of 65, by held

Rank Newbie
SScorpio wrote on 2021-10-02, 14:29:
held wrote on 2021-10-02, 14:04:

It looks great, but portability is out of the question

If you are looking for something smaller, Porkchop Express from MisterAddons posted a teaser for a new project I can't find an image of right now.

But it takes the DE10-Nano with either the digital or full IO hat, and replaces the USB hub with a custom board that sits along the side, which gives you IO at the front and back of the case. So it will be only slightly larger than two DE10-Nanos sitting side by side. And it could support the digital IO board with dual memory modules, if that ends up being necessary in the future for the Saturn and PSX cores.

Sounds nice, I'll start searching. Man, I watched that video twice, and now I want the RMC one too. But I'm still on the fence, because the desire will wane, and when it does I can make a proper decision.

Reply 21 of 65, by SScorpio

Rank Member
held wrote on 2021-10-04, 18:08:

Sounds nice, I'll start searching. Man, I watched that video twice, and now I want the RMC one too. But I'm still on the fence, because the desire will wane, and when it does I can make a proper decision.

It's something that's in the development phase, so it will probably be a while. The fact that it uses the standard add-on boards, except for the USB hub, is likely the main draw for most.

You could always just get a DE10-Nano, and the 128MB SDRAM, along with a cheap micro USB OTG hub. That will run everything over HDMI, and is a good starting point.

Reply 22 of 65, by ZellSF

Rank l33t

If you want to play retro console games and care about latency, I would recommend getting a MISTer FPGA setup, but also having a Raspberry Pi for games that work well with runahead.

If latency isn't important to you, I would just get a Raspberry Pi.
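For context on "runahead": it's a software-emulator technique (RetroArch ships it) that hides a game's built-in input lag by emulating a frame or two ahead of real time using savestates, at the cost of roughly doubled CPU work per frame. A minimal sketch of the idea, using a hypothetical libretro-style `core` wrapper rather than real RetroArch code:

```python
# Rough sketch of the runahead idea. The `core` object and its methods are
# hypothetical stand-ins for a libretro-style frontend API, not real RetroArch code.
def frame_with_runahead(core, pad_state, lag_frames=1):
    """One host vsync: advance real time by one hidden frame, then emulate
    `lag_frames` further ahead and display that frame instead, so the game's
    internal input delay is hidden from the player."""
    core.run_frame(pad_state, video=False, audio=True)      # the "real" frame (kept)
    snapshot = core.save_state()                             # remember the real timeline
    for i in range(lag_frames):
        show = (i == lag_frames - 1)
        core.run_frame(pad_state, video=show, audio=False)   # speculative frames; only the last is shown
    core.load_state(snapshot)                                # throw the speculation away
```

Not every core savestates cleanly or runs fast enough to do this every frame, which is why the recommendation above is limited to games that work well with runahead.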

Reply 23 of 65, by held

Rank Newbie
ZellSF wrote on 2021-10-06, 08:36:

If you want to play retro console games and care about latency, I would recommend getting a MISTer FPGA setup, but also having a Raspberry Pi for games that work well with runahead.

If latency isn't important to you, I would just get a Raspberry Pi.

Yeah, I kinda hate choppiness and controllers not responding. It's a tick.
The RPi is €135 incl. shipping, so that's a €50 difference; maybe it's worth it for harder-to-emulate systems like the PS2 & Dreamcast. (Actually, now that I think about it, maybe I'll get it too on the side.)

SScorpio wrote on 2021-10-04, 23:03:

You could always just get a DE10-Nano, and the 128MB SDRAM, along with a cheap micro USB OTG hub. That will run everything over HDMI, and is a good starting point.

Yeah, I'm considering the board, it's €185 incl. shipping.
Where do I source the 128MB from, and what version do I need?

Also what controllers are people using?

Reply 24 of 65, by SScorpio

Rank Member
held wrote on 2021-10-08, 15:47:
SScorpio wrote on 2021-10-04, 23:03:

You could always just get a DE10-Nano, and the 128MB SDRAM, along with a cheap micro USB OTG hub. That will run everything over HDMI, and is a good starting point.

Yeah, I'm considering the board, it's €185 incl. shipping.
Where do I source the 128MB from, and what version do I need?

Also what controllers are people using?

The SDRAM is a custom board; you can either make your own or order one.

https://github.com/MiSTer-devel/Main_Mi ... mbly-(DIY)

Since you posted a price in euros, I take it you're in Europe? UltimateMister.com seems to be the main source for official boards in Europe.

The controller is up to you. Some people use original controllers with adapters, others use modern controllers over Bluetooth.

Reply 26 of 65, by held

Rank Newbie

All right, I bit the bullet and parted with my money 😒
I went all in and ordered a full package, although I need to source some controllers.

My reasoning 🤔
1. Don't know if or when supply chains will break
2. The chip shortage is not getting any better
3. Mouser and Digikey were selling out fast, and restock was estimated far into 2022
4. I could be stuck at home for some reason, with nothing better to do
5. I did not want to risk not being able to order in the future
6. Inflation was 6.2% in October (not a good sign)

And the most important one for me: I could not get it out of my head 😆

Reply 27 of 65, by Brawndo

Rank Member

Yeah, if you don't already have that stuff, you ain't getting it anytime soon. Everything is sold out pretty much everywhere, and who knows when it will be restocked. I ordered one of the ultimate kits from ultimatemister.com when they still had a few; now those are gone.

Now you can't find the standalone DE10-Nano kits anywhere either. The Terasic Amazon store had them in stock up until a few days ago for $170 plus shipping; now those are out of stock, the Terasic website shows the price increased to $208, and the next batch likely won't be available for a couple of months, at best.

I happened to notice somebody listed an eBay auction for a used DE10-Nano kit for $355, and it sold in a couple of days!

Reply 28 of 65, by maxtherabbit

Rank l33t

I'd like to point out that the only things you *need* for a fully functional setup are the DE10-Nano, a RAM board, and an SD card. Everything else is expensive fluff.

Reply 29 of 65, by ZellSF

Rank l33t

You don't need the RAM for some cores either, but you do need a USB hub. You also obviously need input devices, though presumably everyone who's looking into this already has those.

I've been testing this a bit more lately, looking for an optimal DOS setup, which as usual comes back to the same conclusion: DOSBox works best in pretty much all cases. The ao486 core's performance is too low and it has a few compatibility quirks, which means it really shouldn't be your only way to play DOS games. I'm thinking the Raspberry Pi 4 might actually be a better DOS experience (I haven't tested this theory).

One important thing I noted while testing the ao486 core is that you have to remember to set sync to variable, not 60 Hz. If sync is set to 60 Hz it can affect game speed in weird ways. Of course, you need a 70 Hz-capable display to set sync to variable for most games.
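To put rough numbers on why the sync setting matters (a back-of-the-envelope illustration only, not a description of how the ao486 core actually implements its sync option): most DOS-era VGA modes refresh at roughly 70 Hz, so forcing output to a 60 Hz display means either slowing the whole machine down or dropping frames.

```python
# Back-of-the-envelope numbers for the sync mismatch, purely illustrative.
NATIVE_HZ = 70.086    # typical VGA 320x200 mode refresh
DISPLAY_HZ = 60.0

# If frame pacing is locked to the display, anything the game times off its
# own refresh rate runs slow:
print(f"speed if frame-locked to the display: {DISPLAY_HZ / NATIVE_HZ:.0%}")   # ~86%

# If the core free-runs instead, speed is correct but roughly this many frames
# per second get dropped or repeated, which shows up as judder:
print(f"frames dropped/repeated per second: {NATIVE_HZ - DISPLAY_HZ:.1f}")
```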

Reply 30 of 65, by Shreddoc

Rank Oldbie

It should also be remembered that DOSBox is far older in its development cycle than anything on MiSTer. It has been around since 2002.

The ao486 core is merely single-digit years old, and on MiSTer itself it is even younger still. It is not complete, and it is under continual development and improvement; improvements which are occasionally quite dramatic in effect, as new methods are discovered in this young medium.

Of course, the MiSTer FPGA has certain hardware capacity limits meaning it will never cover the full gamut that DOSBox could cover when run on a modern PC, but ao486 could still have a hugely valuable place, bolstered by the very same advantages which MiSTer brings to console simulation.

A ~zero-latency, cycle-accurate, hardware-based mini computer which can be a 100% accurate mid-486 and below?? I and many others would take that any day, once it matures a bit more!!

Reply 31 of 65, by held

Rank Newbie

Well, I've had it for about 10 days now and I must say, I like it.

I'm in for over 500 euros, give or take. I know a lot of people have controllers lying around; I unfortunately did not, so I need to add another 150 euros on top. Of course I would have that problem with any other device too, but still.

I got the official MiSTer case and I'm really glad I did; that stuff looks seriously vulnerable without it. I did not want to pay extra at first, but given the price increase and the scarcity I feel much more comfortable with it, and in retrospect I'm glad I did.

It boots almost instantly, so there are no delays in playing games. It's not as polished as the RPi, but it's coming along nicely, as far as I can tell from my 10 days with it.

Thank all of you for your time and advice, it is sincerely appreciated 👍

Reply 33 of 65, by SquallStrife

Rank l33t

The ao486 core is never likely to be much good, because the sheer transistor count is way too high to do gate-level simulation.

So any PC core is going to be an implementation of a high-level emulator by necessity. E.g. ao486 is effectively just a port of Bochs.

For perspective, a 68020 has 190K transistors, a 486 has 1.1M.

MiSTer is an awesome platform, with great strengths, but there's a ceiling to its capabilities unless they move to a new (bigger, more expensive) FPGA.


Reply 34 of 65, by mothergoose729

Rank Oldbie
leileilol wrote on 2021-09-22, 21:57:
SScorpio wrote on 2021-09-22, 21:48:

The hardest thing with MiSTer is trying to explain why it's better than just running an emulator on your PC. Maybe you are fine with a Pi hooked up to your TV. But I and others can tell a difference. People have said they have sold parts of their retro hardware collections after getting a MiSTer because it was accurate enough.

A lot of it is due to certain... marketing people that deny it's a form of emulation. This often leads to misinformation and arguments, as well as dismissing hobbyist software-based emulators' research (despite many authors of the cores themselves likely using them as references and not personally engaging in that discourse).

However, when you have these guys as the rival fanbase, then..... yeah

I have been told that many MiSTer cores aren't even strictly cycle accurate. They approximate, just like software emulators, for performance. Which is totally fine and works exactly as advertised. The reason the MiSTer has less latency is because there is no OS getting in the way, as all the software runs on it bare metal. In theory you could run software emulators bare metal on a Raspberry Pi or an x86 PC and they would be just as good.

Reply 35 of 65, by vstrakh

Rank Member
mothergoose729 wrote on 2022-01-21, 07:28:

the MiSTer has less latency is because there is no OS getting in the way, as all the software runs on it bare metal. In theory you could run software emulators bare metal on a Raspberry Pi or an x86 PC and they would be just as good.

The FPGA vs SW question is not about OS vs bare metal. The cores in the FPGA are hardware, just differently implemented - using hard logic elements to implement a revised model of the original hardware's behavior. And the behavior of the core with respect to timings is extremely well defined. It might not match the original hardware exactly for whatever reason - like not doing certain internal stuff exactly at the falling edge of the system clock, or not implementing 8-bit operations with a 4-bit ALU - but who cares, if observable effects can't expose that.
But to reach comparable timing precision in software running on a CPU with unpredictable timings (you know, DDR3 latency/refresh, cache behavior, pipelining and out-of-order execution, etc.) you'd need a dedicated high-precision timer with nanosecond resolution/accuracy, and a CPU clock frequency much, much higher, to allow sampling that timer and managing to perform actions within its polling period. This is where an 8-bit micro can beat a multi-GHz CPU, like generating a composite video signal on a GPIO pin. In an FPGA there is no polling of a time counter to update something; events just happen at the exact required moment in time, because the hard logic was wired that way.
The CPU is vastly more performant, it just wasn't built for precise timing. Software emulators can live without it simply by not providing that precision. The results you see with SW emulators are acceptable because you don't care when exactly something happened within the screen refresh period (60 Hz, etc.). So the latency is inherent to the SW implementation; it's not about the OS in between. SW can't get close to an FPGA in terms of latency/accuracy even in theory, without a close-to-infinite CPU clock frequency.
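A toy sketch of the software side of that argument, with hypothetical names, just to show the shape of the problem:

```python
import time

def fire_at(deadline_ns, action):
    """Hypothetical sketch of how software hits a precise deadline: spin on a
    clock and hope nothing (scheduler, cache miss, DRAM refresh) gets in the
    way between the last check and the deadline."""
    while time.monotonic_ns() < deadline_ns:
        pass            # burn an entire CPU core just to watch the clock
    action()            # still lands an unpredictable number of ns/us late

# In an FPGA core there is no loop at all: a counter reaches a value and the
# output flips on that exact clock edge, every time, by construction.
```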

Edit: Just to clear out possible misunderstandings.
I'm not talking about SW being unable to position two emulated CPU clocks at an exact distance within the emulated system. Surely it can do simulation no less precise than the FPGA can. But it does not do so in real time. It can run (and does run) much faster than the real hardware, and then it has to wait until real time catches up with the simulated time that's way ahead of it. The problems arise when you must interact with the outside world via an interface that does not exist in the hosting system, like the 1-bit beeper on the ZX Spectrum. To allow beeper emulation, SW has to pile up a significant amount of simulated samples before it can offload that block to the hosting system's audio card, and the card consumes such a block at real-world speed, not instantly. This introduces lag that no SW can resolve. Similarly with video: the SW can precisely simulate the screen bitmap's construction as if the real CRT beam were drawing it. But pushing it over to existing video interfaces - or more importantly, waiting for the exact time to push that update - will introduce lag.
The FPGA does something totally different; it's not a simulation, it's a real system implementation using different components (after all, you don't care if a timer is made of a 50 MHz oscillator and a bunch of flip-flops instead of a single NE555). It exists in the real world and its actions happen in real time. That's why you can have 70 Hz VGA output, or an OPL3 sampling rate of 49.716 kHz - actual, not resampled later into 44/48/96 kHz. And the output update happens exactly at the moment the implemented (not simulated) system wrote something to the register or memory location. It's not buffered; it doesn't wait until some global timer says "hey, it's the 60 Hz tick, it's time to upload the finished image", or "ok, we have 20 ms worth of audio data, we can finally submit it to the audio server".
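A minimal sketch of the buffering described above; the `machine` and `host` interfaces are hypothetical, but the overall shape is what most software emulators do: run ahead of real time, pile up a finished frame and a block of audio, then hand them to the host, which is where the unavoidable lag comes from.

```python
AUDIO_BLOCK = 1024      # samples handed to the host audio API per call
SAMPLE_RATE = 44100     # host mixing rate

def emulate_one_display_frame(machine, host):
    """Typical software-emulator step: simulate faster than real time,
    accumulate a finished frame plus a block of audio, and only then hand
    them to the host. `machine` and `host` are hypothetical interfaces."""
    audio = []
    while not machine.frame_complete():
        machine.step()                          # runs well ahead of real time
        audio.extend(machine.drain_audio_samples())

    # Nothing reached the outside world while the frame was being built.
    # Only now does the host receive ~16.7 ms of video and ~23 ms of audio,
    # so the player sees and hears everything at least that much later.
    host.present_frame(machine.framebuffer())   # typically waits for host vsync
    for i in range(0, len(audio), AUDIO_BLOCK):
        host.queue_audio(audio[i:i + AUDIO_BLOCK])
```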

Reply 36 of 65, by mothergoose729

Rank Oldbie
vstrakh wrote on 2022-01-21, 08:09:
mothergoose729 wrote on 2022-01-21, 07:28:

the MiSTer has less latency is because there is no OS getting in the way, as all the software runs on it bare metal. In theory you could run software emulators bare metal on a Raspberry Pi or an x86 PC and they would be just as good.

The FPGA vs SW question is not about OS vs bare metal. The cores in the FPGA are hardware, just differently implemented - using hard logic elements to implement a revised model of the original hardware's behavior. And the behavior of the core with respect to timings is extremely well defined. It might not match the original hardware exactly for whatever reason - like not doing certain internal stuff exactly at the falling edge of the system clock, or not implementing 8-bit operations with a 4-bit ALU - but who cares, if observable effects can't expose that.

Yes, indeed, who cares.

As I have heard it explained before, the implementation of MiSTer cores is far more practical than exact. Each individual chip in a SNES, for example, is not described exactly in HDL, where each chip has a different set of timings and clock cycles and the interactions between each chip are described faithfully; instead, operations are approximated and batched together at acceptable timing - just like in software emulators. There is nothing wrong with this! It works exactly the way it should. It's just that a MiSTer is not a hardware clone, it's emulation. And it's good emulation, so yeah, it doesn't matter.

There is this idea that FPGA emulation is better than software emulation just because an FPGA is involved. It's not more accurate in any meaningful way, it's just different. The MiSTer has a number of advantages over PC emulation, but accuracy, strictly speaking, isn't one of them.

Reply 38 of 65, by vstrakh

Rank Member
mothergoose729 wrote on 2022-01-21, 16:37:

Each individual chip in a SNES, for example, is not described exactly in HDL, where each chip has a different set of timings and clock cycles and the interactions between each chip are described faithfully; instead, operations are approximated and batched together at acceptable timing

It's not approximated, it's expressed in terms of the different hardware in use, and there are really no practical reasons (logic utilization, routing efficiency, etc.) to express the real chips separately, unless those parts would be reused in another project.
The Z80 is an example of something that could be used in many systems, so it's a project on its own, exposing the interface (wires/signals) as in the real Z80. You could model some other hardware to attach to that bus and the system would behave exactly the same as if a real Z80 were in use, as observed on the bus, but you don't look at the internals - there are entirely different components there.

mothergoose729 wrote on 2022-01-21, 16:37:

- just like in software emulators.

I beg to differ. In an FPGA it's not batched. It delivers the required computations at the defined observable interfaces exactly at the moments where those computations would happen in the original hardware. This is why it needs puny resources compared to software emulation.

mothergoose729 wrote on 2022-01-21, 16:37:

It's just that a MiSTer is not a hardware clone, it's emulation.

It's definitely not a replica, but it is a hardware clone. You can call it a fake/counterfeit, or even a bad Chinese clone, because it doesn't use the original chips that are long gone.
But it's a hardware clone, imitating the observed behavior that was inherent to the real system, just built from different ICs. The HDL modules would be the new "chips", and for practical reasons the functionality would be packed in a way that is different from the original hardware, which had to use off-the-shelf components. This doesn't make the clone any less hardware.

mothergoose729 wrote on 2022-01-21, 16:37:

It's not more accurate in any meaningful way, it's just different. The MiSTer has a number of advantages over PC emulation, but accuracy, strictly speaking, isn't one of them.

Which accuracy do you have in mind?

And you still can't deny the latency associated with SW emulation. Removing the OS from the SW performance equation can't remove the latency when the simulation interacts with the real world, even in theory.
The FPGA implementation simply doesn't have this trait; the events on exposed/observable interfaces happen in real time, exactly at the required moment, down to some nanoseconds. Because it is hardware, and not a data model simulation that spins in a higher-level virtual machine with unpredictable timings.

P.S.
Anyway, defending the MiSTer was never my intent. Perhaps we're simply on different pages about the definition of the word "emulation".
For me, emulation is the SW stuff, where there is no hardware implementing the functionality, but there is a model expressed as code for the machine running the simulation.
Existing hardware would mean either a clone, or an "imitation" if the word clone means an absolute copy to someone else. After all, all the variety of ZX Spectrum clones out there, built with totally different chips except for the Z80, are all called "clones" and not emulations.

Last edited by vstrakh on 2022-01-21, 20:15. Edited 1 time in total.

Reply 39 of 65, by mothergoose729

Rank Oldbie
vstrakh wrote on 2022-01-21, 19:35:
mothergoose729 wrote on 2022-01-21, 16:37:

Each individual chip in a SNES, for example, is not described exactly in HDL, where each chip has a different set of timings and clock cycles and the interactions between each chip are described faithfully; instead, operations are approximated and batched together at acceptable timing

It's not approximated, it's expressed in terms of the different hardware in use, and there are really no practical reasons (logic utilization, routing efficiency, etc.) to express the real chips separately, unless those parts would be reused in another project.
The Z80 is an example of something that could be used in many systems, so it's a project on its own, exposing the interface (wires/signals) as in the real Z80. You could model some other hardware to attach to that bus and the system would behave exactly the same as if a real Z80 were in use, as observed on the bus, but you don't look at the internals - there are entirely different components there.

So... what? Software emulation does the same thing... just in software. This is a value assessment. The output is the same. Neither is a real SNES.

mothergoose729 wrote on 2022-01-21, 16:37:

- just like in software emulators.

I beg to differ. In an FPGA it's not batched. It delivers the required computations at the defined observable interfaces exactly at the moments where those computations would happen in the original hardware. This is why it needs puny resources compared to software emulation.

mothergoose729 wrote on 2022-01-21, 16:37:

It's just that a MiSTer is not a hardware clone, it's emulation.

It's definitely not a replica, but it is a hardware clone. You can call it a fake/counterfeit, or even a bad Chinese clone, because it doesn't use the original chips that are long gone.
But it's a hardware clone, imitating the observed behavior that was inherent to the real system, just built from different ICs. The HDL modules would be the new "chips", and for practical reasons the functionality would be packed in a way that is different from the original hardware, which had to use off-the-shelf components. This doesn't make the clone any less hardware.

The only thing I called the MiSTer was another form of emulation. What are we arguing about?

mothergoose729 wrote on 2022-01-21, 16:37:

It's not more accurate in any meaningful way, it's just different. The MiSTer has a number of advantages over PC emulation, but accuracy, strictly speaking, isn't one of them.

Which accuracy do you have in mind?

And you still can't deny the latency associated with SW emulation. Removing the OS from the SW performance equation can't remove the latency when the simulation interacts with the real world, even in theory.
The FPGA implementation simply doesn't have this trait; the events on exposed/observable interfaces happen in real time, exactly at the required moment, down to some nanoseconds. Because it is hardware, and not a data model simulation that spins in a higher-level virtual machine with unpredictable timings.

Well, that is the question. What is accuracy? If the real interaction of the chips is expressed faithfully and exactly, is that accurate? If only the functional utility of the device is expressed, is that enough? It's a philosophical question. The most accurate software emulators, like BSNES, do not simulate each transistor and gate, but they do represent the behavior of those chips, sometimes in great detail and sometimes in aggregate. MiSTer cores do not have the space to represent the entire circuit 1:1, so they build equivalent circuits. Does it matter? Which one is "better"? I have an apple for your orange.

One dangerous notion that does need to be dispelled is that MiSTer cores are perfect because they are in some fundamental way better simulators. The good MiSTer cores are incredible. There are many MiSTer cores that are still being developed and need more time. They have bugs, they don't support all the features of the original hardware, they don't always have the performance or the behavior you would expect, etc.

The latency thing is a separate issue. From a practical standpoint you don't have to process inputs in nanoseconds for a video game. The real reason software emulators have (sometimes) higher latency than real hardware or an FPGA is complicated, and yes, the OS and the drivers absolutely play a part. Why it's worse almost doesn't matter, though; only that it is worse. The MiSTer has negligible input lag compared to real hardware, which is one of many good reasons to play games on one.