There are only 7 bits in ASCII. An eighth bit can be used for parity when transmitting data, but a regular program will never see it. Anything above 0x7F is simply not an ASCII character.
But that's not what the code is doing, is it? It's not doing (ch & 0x7F); it's doing ch <= 0x7F. And the parity checking/filtering is done in the tape drive or serial port driver anyhow; the parity bit would never reach wc in the first place.
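To make the distinction concrete, here's a minimal sketch (not the actual wc source, just an illustration of the two approaches):

    #include <stdio.h>

    int main(void)
    {
        int ch;
        long chars = 0;

        while ((ch = getchar()) != EOF) {
            /* Stripping a parity bit would be: ch &= 0x7F;
             * i.e. keep bits 6..0 and count every byte.
             * The code under discussion instead filters: */
            if (ch <= 0x7F)   /* count only bytes that are valid 7-bit ASCII */
                chars++;
        }
        printf("%ld\n", chars);
        return 0;
    }

With masking, a byte like 0xC6 would be counted as 0x46; with the filter, it's dropped entirely.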
Yes, that's true for that code. But that wasn't really the point. The point I made in my earlier post was that ASCII is 7 bits, 0..127, and, depending on where the characters came from, only values below 128 are valid ASCII. Because a parity bit was common, ASCII was limited to 7 bits to leave room for one. When other transports are involved, e.g. reading from a file, there aren't any parity bits (well, that's not entirely true: a minicomputer I worked with back in the day used parity bits on characters in text files, but that wasn't the case on the platform where this particular old 'wc' was used), so the code simply focuses on valid ASCII, which is below 128.
I am talking about the parity bit. 0x46 has an odd number of bits set (three, to be precise), so for the parity to check out (that is, for the total number of set bits to be even), the parity bit needs to be set, and the resulting encoding is 0xC6, with four bits set.
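A tiny even-parity encoder makes the arithmetic easy to check (the helper name is mine, just for illustration):

    #include <stdio.h>

    /* Compute the even-parity encoding of a 7-bit character:
     * set bit 7 iff the number of set bits in bits 0..6 is odd,
     * so the total number of set bits comes out even. */
    static unsigned char even_parity(unsigned char ch)
    {
        unsigned char p = 0;
        for (int i = 0; i < 7; i++)
            p ^= (ch >> i) & 1;   /* p = XOR of bits 0..6 */
        return ch | (p << 7);     /* parity bit goes in bit 7 */
    }

    int main(void)
    {
        printf("0x%02X\n", even_parity(0x46));  /* prints 0xC6 */
        return 0;
    }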
Assuming parity is enabled, the parity check is done at a lower level (serial port, TTY driver, etc.) and you'll never see it from the application. I used to mess around with serial ports and terminals a ton in my youth.
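You can see that lower-level handling in the POSIX termios interface. A sketch (the helper name is mine) of how a serial line gets configured so the application never sees the parity bit:

    #include <termios.h>
    #include <unistd.h>

    /* Sketch: configure a serial line so the TTY layer handles parity.
     * With these flags, the driver checks parity on incoming bytes,
     * and ISTRIP clears bit 7 before delivery, so read() at the
     * application level only ever sees 7-bit values. */
    int enable_parity_handling(int fd)   /* hypothetical helper */
    {
        struct termios t;

        if (tcgetattr(fd, &t) < 0)
            return -1;
        t.c_cflag |= PARENB;   /* generate/expect even parity on the wire */
        t.c_iflag |= INPCK;    /* check parity on incoming bytes          */
        t.c_iflag |= ISTRIP;   /* strip the 8th bit before delivery       */
        return tcsetattr(fd, TCSANOW, &t);
    }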
The parity bit is not part of the character. It's external, an error-detection mechanism. To read ASCII you always look at bits 6..0, seven bits. You don't throw away the character because it has the parity bit set; you mask off the parity bit (whether it's set or not).
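In code, that reading boils down to a single mask (a trivial sketch):

    /* Strip the parity bit, whether it was set or not:
     * 0xC6 ('F' with even parity) and 0x46 ('F') both map to 0x46. */
    unsigned char strip_parity(unsigned char ch)
    {
        return ch & 0x7F;   /* keep bits 6..0, drop bit 7 */
    }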