Hacker News

Does that not cause problems on some card machines? I've come across a few that definitely don't let you put in more than four digits.



Surprisingly, no, or at least it's not common.

I'm from a country that has 6-digit PINs on most cards, and I've traveled to e.g. the United States where people are surprised that credit card PINs can be more than 4 digits, but in my experience, terminals accept them just fine. It seems like they are designed to suggest a PIN is only 4 digits but they will happily accept more. So while you're entering your PIN, the display looks something like:

    [....]
    [*...]
    [**..]
    [***.]
    [****]
    [*****]
    [******]
And then you hit OK and the PIN is accepted.
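The behavior described above can be sketched roughly like this. This is illustrative logic only, not any real terminal's firmware; the function and constant names are invented, and the 4..12 digit range follows ISO 9564:

```python
# Sketch of variable-length PIN entry: collect digits until an explicit
# OK key, rather than auto-submitting after the fourth digit.
MIN_PIN_DIGITS = 4
MAX_PIN_DIGITS = 12  # upper bound in ISO 9564

def accept_pin(keystrokes):
    """Collect digit keys until 'OK'; return the PIN if its length is valid."""
    digits = []
    for key in keystrokes:
        if key == "OK":
            break
        if key.isdigit() and len(digits) < MAX_PIN_DIGITS:
            digits.append(key)
    pin = "".join(digits)
    return pin if MIN_PIN_DIGITS <= len(pin) <= MAX_PIN_DIGITS else None
```

A terminal that instead submits the moment four digits are buffered can never even see the fifth digit of a longer PIN, which is the failure mode being discussed.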


For some reason, a lot of credit card processing APIs are still oriented around physical card machines, so they have a lot of fields devoted to self-declaration of the available features.

Some of the APIs allow the machine to say "I can accept a PIN up to 12 digits".

However, I don't know 1) whether anyone actually delivered on that, or 2) how you'd know in advance just by poking at random devices in stores.
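As a rough sketch of what such self-declaration looks like, here is a hypothetical capability record; every field name below is invented for illustration and not taken from any real processor's API:

```python
# Hypothetical capability self-declaration, loosely modeled on the kind of
# feature flags card-processing APIs ask terminals to report about themselves.
from dataclasses import dataclass

@dataclass
class TerminalCapabilities:
    supports_pin_entry: bool
    max_pin_length: int   # e.g. 12, even though many assume 4
    supports_contactless: bool

caps = TerminalCapabilities(supports_pin_entry=True,
                            max_pin_length=12,
                            supports_contactless=False)
```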


That provides very valuable information: DO NOT TRUST this machine to be secure!

Similarly, any web site or app that can’t correctly handle a space character at the end of the password should never be trusted with anything of consequence.
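Concretely, the trailing-space test works because a correct implementation hashes the password bytes exactly as entered. A minimal sketch, with PBKDF2 chosen only for illustration:

```python
import hashlib

def hash_password(password: bytes, salt: bytes) -> bytes:
    # The password must NOT be trimmed or normalized before hashing.
    return hashlib.pbkdf2_hmac("sha256", password, salt, 100_000)

salt = b"\x00" * 16  # fixed salt only for the demo; use os.urandom(16) in practice
h1 = hash_password(b"hunter2", salt)
h2 = hash_password(b"hunter2 ", salt)
assert h1 != h2  # the trailing space is part of the password
```

A site where the trailing space makes no difference is doing something other than hashing the raw input.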


Why? PINs are limited to 4 digits in many markets, so it's not exactly extreme for a developer not to consider the (to them) edge case of 6-digit PINs on foreign cards.

Conversely, it seems very possible to support 6-digit PINs and yet still make a ton of horrible implementation mistakes, security-related and otherwise.


Why is the space thing inherently insecure? I’m thinking bad form validation could trip it up and count as “not handled”, as opposed to the failure suggesting plaintext storage.


Are you really worried that a card machine is going to leak your PIN? That doesn't seem to be a common attack vector compared to a third-party skimmer being attached or someone just mugging you and demanding your PIN under threat of physical violence.

To answer the actual question: I don't know because I left my PIN at 4 digits, despite knowing I could use more, precisely because I didn't think it would really make my life any more secure.


I'm not worried specifically about the PIN leaking.

The concern is that a 4-digit max PIN length is certainly implemented by someone who couldn't be bothered to read the spec for secure credit card transaction handling.

It's the equivalent of the "No brown M&Ms" clause or "Canary in the coal mine" test.

Nobody actually cares about the M&M color or some dumb bird.


"Must support 6-digit PINs" is not part of "the spec for secure credit card transaction handling" – which isn't a single thing anyway: there are dozens of card networks, and many of them have tons of regional variations.

In some markets, issuers only allow 4-digit PINs, and customers don't expect to have to press an "enter" key when they're done entering their 4-digit PIN – so the reasonable implementation is to allow only 4-digit PINs, or you'll be left with people staring at the ATM/POS terminal, waiting for something to happen.


Four is the minimum number of digits required, but there are over a dozen different PIN block standards, and most allow ranges from 4–9 up to 4–16 digits: https://www.eftlab.com/knowledge-base/complete-list-of-pin-b...
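For illustration, ISO 9564-1 format 0 – one of the most common of those PIN block formats – encodes the PIN length in the block itself, which is why 4- to 12-digit PINs fit the same 8-byte structure. A sketch with made-up test values:

```python
# Illustrative sketch of an ISO 9564-1 format 0 PIN block: one length
# nibble means 4..12 digit PINs all fit the same 8-byte block.
def iso0_pin_block(pin: str, pan: str) -> bytes:
    assert 4 <= len(pin) <= 12 and pin.isdigit()
    # PIN field: control nibble 0, PIN length nibble, PIN digits, F padding
    pin_field = f"0{len(pin):X}{pin}".ljust(16, "F")
    # PAN field: four zero nibbles + rightmost 12 PAN digits, check digit excluded
    pan_field = "0000" + pan[:-1][-12:]
    return bytes(a ^ b for a, b in
                 zip(bytes.fromhex(pin_field), bytes.fromhex(pan_field)))

block = iso0_pin_block("123456", "4111111111111111")  # demo PAN, not a real card
```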

Making an ATM that can accept cards from multiple issuers (which is the norm these days) and allowing only 4 digits is the same category of error as requiring that someone's last name start with a capital letter, or blocking symbol characters in names.


I agree: An annoying/avoidable implementation shortcoming, but arguably relatively orthogonal to security.

> there are over a dozen different PIN block standards

You almost certainly don't need to support all of these inside the PIN pad or even ATM/POS. If necessary, translation can happen in other parts of the system.


> or to block symbol characters in names.

People tend to change their mind on that one very quickly once they get a few right-to-left control characters that flip over the text layout of the entire program.
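A middle ground between allowing everything and rejecting names wholesale is to strip just the Unicode bidirectional control characters that can flip surrounding text, while keeping legitimate non-ASCII names. A sketch (the character set covers the embedding, override, and isolate controls):

```python
# Strip only Unicode bidi control characters; keep everything else,
# including apostrophes and non-ASCII letters in legitimate names.
BIDI_CONTROLS = {"\u202a", "\u202b", "\u202c", "\u202d", "\u202e",
                 "\u2066", "\u2067", "\u2068", "\u2069"}

def strip_bidi_controls(name: str) -> str:
    return "".join(ch for ch in name if ch not in BIDI_CONTROLS)
```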


"We'll just not do business with anyone named O'Neill, that's just a made-up name designed to trick the database with SQL injection."

Also: https://www.ancestry.com.au/name-origin?surname=null


Are you arguing to just allow any byte in names (which are ultimately user-defined input data), including 0x00, Unicode byte order markers etc., in a thread about a perceived correlation between developers' lack of i18n awareness and security bugs no less?

The reality is that there is often a tradeoff between keeping your test and edge cases simple via constraining allowable inputs and internationalization.

So I think you've got it exactly the wrong way around: These limitations might have happened precisely because somebody wanted to do the right thing from a safety/security perspective by doing overly strict input validation, at the expense of internationalization/compatibility.

Not saying that that's a good tradeoff in every case, but I really don't think you can draw any conclusions about a system's security by looking at whether it arbitrarily disallows some inputs (or if anything, maybe the opposite).


Ok, but that doesn't answer my question: what specific problem are you worried about that would allow someone to steal your money, that isn't incomparably unlikely compared to other methods? I'm just not aware of any problem that has happened in practice from poorly written card reader software.


It most certainly does.




