That's what I mean by "we have to consider the iPhone backdoored". Once you can't trust the device any more, all bets are off, and we can't be sure that what Apple says they do with that fingerprint is what they're actually doing.
(edit: regarding jailbreaking, I seriously doubt a sufficiently well-hidden backdoor would be found by a jailbreaker. Have we found the backdoors in OSX or Windows yet? Since the latest leak, we know they're there.)
edit: Also, there's a difference between a subtle crypto vulnerability and sending to a server data that, according to the announcement, is supposed to stay protected in its own enclave and never leave the device. The latter would be far more obvious in the code and easier to spot.
But we can't see the code. And it's far from certain you'd be able to pick it up by watching data packets.
They might want to do approximate matching, though. That could make it hard or impossible to match without first decrypting the stored template.
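To see why: a fresh scan of the same finger is never bit-identical to the enrolled one, so the password trick of comparing hashes doesn't work, and the matcher needs the plaintext template. A minimal sketch (the byte templates and threshold here are made up for illustration, not Apple's actual format):

```python
# Hypothetical illustration: why approximate matching resists the
# "store only a hash" trick that works for exact passwords.
import hashlib


def hamming(a: bytes, b: bytes) -> int:
    """Count the differing bits between two equal-length templates."""
    return sum(bin(x ^ y).count("1") for x, y in zip(a, b))


enrolled = bytes([0b10110100, 0b01101001, 0b11110000])
# A fresh scan of the same finger differs slightly (here: one bit):
scan = bytes([0b10110101, 0b01101001, 0b11110000])

# Exact comparison via hashes fails outright for a near-match...
print(hashlib.sha256(enrolled).digest() == hashlib.sha256(scan).digest())  # False

# ...so the comparison must run on the raw templates, meaning whoever
# does the matching must hold (or be able to decrypt) the real data.
print(hamming(enrolled, scan) <= 4)  # True: within the match threshold
```

So if matching happens inside the enclave, the decryption key effectively lives there too, and trusting the scheme means trusting that hardware.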
good spy novel stuff... steal the prints of some foreign bigwig, or say Julian Assange, plant a copy in some compromising crime scene...
If you're going to do anything, including working on the computer you're typing your comments on, you have to trust a lot of parties. Some of that trust involves knowing who made the code, and some of it may involve the knowledge that the NSA will not be using their best, most secret backdoors against a whole lot of people.