First post, by Xanthoma
Hello. Is anyone aware of any microprocessors that implement the “double dabble” algorithm in hardware for converting binary unsigned integers to decimal?
I prefer to keep all my integers in binary and only convert to decimal when displaying them to the user, so I don't use the Z80's DAA opcode to repair binary-coded decimal sums and differences.