Currency in banking is handled with bigints. Not rationals, just integer counts of the smallest unit (e.g., 1 cent). This forces you to order operations so that divisions are done last or not at all.
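A minimal sketch of what "divisions last" looks like in practice, using TypeScript BigInt (the names, the basis-point rate convention, and the truncation choice are all my assumptions, not anyone's production code):

```typescript
// Amounts are bigints of cents; rates are basis points (assumed convention).
const BASIS_POINTS = 10_000n;

// 3.5% interest on $1,234.56 = 350 bp on 123456 cents.
function interestCents(principalCents: bigint, rateBp: bigint): bigint {
  // Multiply first, divide once at the end. BigInt division truncates
  // toward zero, so the rounding decision is explicit and happens last.
  return (principalCents * rateBp) / BASIS_POINTS;
}

console.log(interestCents(123_456n, 350n)); // 4320n cents = $43.20
```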
There is some friction in that it forces an explicit decision about rounding at every step. Most importantly, it turns errors that print money (or burn money) into impossible states. I'm aware of BigRational libraries and more clever currency abstractions, but this is the least magical way of doing it, kind of like representing datetimes as 64-bit UNIX epochs.
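Here is a sketch of the "no money printed or burned" invariant when a division can't be avoided: splitting a total into shares, handing out the remainder cent by cent, and checking that the shares sum back to the original. `splitEvenly` is a hypothetical helper I made up for illustration.

```typescript
// Split totalCents into `parts` shares without creating or destroying cents.
function splitEvenly(totalCents: bigint, parts: bigint): bigint[] {
  const base = totalCents / parts;      // truncating division
  const remainder = totalCents % parts; // leftover cents to distribute
  const shares: bigint[] = [];
  for (let i = 0n; i < parts; i++) {
    // The first `remainder` shares each absorb one extra cent.
    shares.push(i < remainder ? base + 1n : base);
  }
  return shares;
}

const shares = splitEvenly(1_000n, 3n); // [334n, 333n, 333n]
const sum = shares.reduce((a, b) => a + b, 0n);
console.assert(sum === 1_000n, "no cents created or destroyed");
```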
Well, you could look at it as x/100 rationals representing a dollar value. But you could also look at it as an integer amount of a smaller unit (cents). The difference is insignificant, except that the integer view is what computers support natively.