
Starbucks caught storing mobile passwords in clear text - JumpCrisscross
http://www.computerworld.com/s/article/9245438/Evan_Schuman_Starbucks_caught_storing_mobile_passwords_in_clear_text_?taxonomyId=17&pageNumber=1
======
ZoFreX
> Starbucks could have chosen not to store the password on the phone, but
> users would then be forced to key in their username and password every time
> they wanted to use the app to make a purchase.

These aren't the only two options. Storing a token would let users remain
logged in without having the same security implications as storing the
password.

Some advantages of a token vs a password:

1\. Lots of users use the same password on multiple sites

2\. You could allow common usages via a token but still request a password re-
entry for more potentially dangerous actions like changing the email address
on the account

3\. Tokens can be invalidated, so if the phone is lost the user can disable
the app on it without needing to change their password

4\. Tokens can be selectively invalidated, so if you have multiple devices you
could log some of them out without logging them all out

5\. Tokens can be set to expire so you can request password re-entry every so
often just to ensure a bad actor would get locked out eventually

~~~
Almaviva
> 1\. Lots of users use the same password on multiple sites

I'd never say this in a job interview, but I'll play Devil's advocate: as a
business, this isn't my problem, it's yours. If you want the convenience of
the same password across multiple sites, then in the real world some of those
sites will have weak points, and anyone who can exploit any link in that
chain can obtain your password for all of them.

As for the rest, expecting to change my passwords if my phone is stolen is not
an unreasonable thing at all. I should do this anyway, even if businesses
assure me that I don't have to.

And 99% of users who aren't IT or security professionals would just prefer to
be done with entering their password after the first time, period.

~~~
thirsteh
> I'd never say this at a job interview but I'll be Devil's advocate: As a
> business, this isn't my problem, it's yours.

This argument is the same as saying "It's not my fault you're being spied on
because you're not using OTR in your IM; it's yours." You're technically right
that the user could theoretically avoid this problem, but you're wrong in
practice since you're setting impossible expectations that even security-
conscious people often don't meet.

It's a service provider's responsibility to not let user credentials be easily
accessible because those credentials are used in many places. The latter is a
more fundamental issue, yes, but you deserve the flak you get if you just say
"not my problem."

------
ben1040
The actual disclosure linked in the article is a bit different from what the
article implies.

[http://seclists.org/fulldisclosure/2014/Jan/64](http://seclists.org/fulldisclosure/2014/Jan/64)

The file in question is a log generated by the application. They are NSLog'ing
stuff to the console, and Crashlytics must be capturing it and putting it in
this file. Along with debug tracing messages like those below, they're logging
server interactions and JSON responses that contain personal information (I
see my home address, telephone number, etc. coming back from the server and
being logged).

    
    
       2539 $ -[CardDetailViewController refreshCardDetails:] line 798 $ 
       2548 $ -[CardDetailImageViewController viewDidLoad] line 28 $ view did load
       2551 $ -[CardDetailImageViewController viewWillAppear:] line 48 $ view will appear
       2551 $ -[CardDetailImageViewController viewDidAppear:] line 53 $ view did appear
       2588 $ -[CardDetailViewController refreshCardDetails:] line 798 $ 
       2607 $ -[CardDetailViewController doPageChange:] line 684 $ :
       2652 $ -[CardDetailViewController viewDidAppear:] line 301 $ I APPEARED!!
       2652 $ -[StarbucksAppDelegate trackView:] line 1084 $ 2014-01-14 18:50:37 +0000 /Card/MyCard
       2791 $ -[CardDetailViewController viewDidAppear:] line 301 $ I APPEARED!!
    

So the situation here isn't one of tokens vs passwords vs encryption or
otherwise how they're being stored for interacting with the server. The user
is going to have to enter a password at some point in the workflow, regardless
of whether it's encrypted at rest or exchanged for an OAuth token or whatever.
You shouldn't be logging that password back to the console when the user
enters it!

edit: This also means that all that personal information of mine is presumably
on some Crashlytics host as a side effect of all this having been logged and
sucked up by Crashlytics.
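A redaction pass before anything reaches the logger would prevent most of this. A minimal sketch (in Python rather than Objective-C for brevity, and with hypothetical field names, not Starbucks' actual schema):

```python
import json
import logging

logging.basicConfig(level=logging.DEBUG)
log = logging.getLogger("app")

# Hypothetical sensitive field names; adapt to the actual JSON schema.
SENSITIVE = {"password", "username", "email", "address", "phone"}

def redact(obj):
    """Recursively mask sensitive keys before anything reaches a log sink."""
    if isinstance(obj, dict):
        return {k: ("***" if k.lower() in SENSITIVE else redact(v))
                for k, v in obj.items()}
    if isinstance(obj, list):
        return [redact(v) for v in obj]
    return obj

def log_response(resp):
    """Log a server response with personal data masked out."""
    log.debug("server response: %s", json.dumps(redact(resp)))
```

The point is that the crash reporter only ever sees the output of `redact`, so a captured log file leaks structure but not credentials or addresses.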

------
brudgers
_" If you grab someone's phone, you can effectively go through this log and
see effectively where this person has been," Wood said. "It's a bad thing for
user privacy"_

Compared to what? The implicit assumption is that Starbucks gathering and
storing geolocation data is not a potential invasion of privacy or a
meaningful risk. The person who steals my iPhone is very unlikely to do so for
the data it contains. Their goal is to flip it for cash and a datum ain't
worth much to anyone other than the PI my wife hired to find out if I'm
sleeping around or an attractive lab technician in _CSI: Miami_.

No, the real risk to privacy is when Starbucks' servers are compromised.
Today's Willie Suttons are still bank robbers, not pickpockets. Spreading the
data out spreads the risk and reduces or eliminates the probability of a
catastrophic breach.

Of course it will be popular sport to pillory Starbucks for not following the
conventional wisdom because it allows us to ignore the fact that passwords are
broken. There's no technical fix for poor password hygiene among iPhone owners
and an encrypted password will barely slow down a determined attacker with a
couple of GPUs and physical possession of the phone.

~~~
gamerdonkey
_"... a datum ain't worth much to anyone other than the PI my wife hired to
find out if I'm sleeping around or an attractive lab technician in CSI:
Miami."_

This is completely off-topic, but am I the only one who wondered why his wife
would hire a PI to find out if he is an attractive lab technician?

Not as a critique, I just find grammatically-correct ambiguous cases like this
one interesting.

------
jader201
Pretty off topic, but I submitted this story last night [1] with the exact
same URL and title. I thought whenever duplicate stories were submitted, it
just upvoted the original submission without posting a duplicate story (that's
what's happened to me in the past).

I originally saw this, and thought the URL was just different (even if just
slightly), so I wasn't even going to say anything.

But since the URLs are identical, was just curious how the HN logic works when
submitting duplicate URLs like that. Is it that too much time had passed,
considering them "different" submissions?

[1]
[https://news.ycombinator.com/item?id=7068298](https://news.ycombinator.com/item?id=7068298)

~~~
robbiea
It's not the same URL. Yours was the mobile version (it has an m.) and his was
the normal URL.

I think this one also got more traction because of timing: it was posted in
the morning and picked up enough momentum to make the front page.

~~~
jader201
Ah, that's it, thanks. I made the mistake of clicking the link (I'm on a
desktop now), and copying the redirected URL to compare, which was the same as
this one.

That's what I get for posting submissions from my phone.

------
bigtunacan
I LOL'ed just reading the headline. It is absolutely terrible, but I have
seen this before with big companies that should know better. About 6 years
ago I was working on a project for The Wall Street Journal (yeah, that WSJ)
in which all customer data was stored in the DB in plain text, then exported
nightly to an Excel report and emailed unencrypted to client managers so they
could review daily sales.

On numerous occasions I told them that this was extremely risky, that we were
violating PCI compliance, and that we were opening the company to huge
potential fines in addition to putting customers' information at risk.

Every time I brought this up I was told there wasn't time to fix the
application and that the client managers thought it was too difficult to deal
with encrypted files so just leave everything the way it was.

Eventually they got busted in a PCI compliance audit and started using PGP to
encrypt the files sent via email, but by the time I left they still were not
encrypting the backend data or actually maintaining PCI compliance. Extremely
sad, but this happens all the time.

------
nly
_Every_ single app on your phone that remembers a username and password
combination, or any other credential, is likely vulnerable.

IMHO it's about time Google required the presence of an HSM in Android
devices for key storage. An HSM that locked me out after ~10 x 6-digit PIN
guesses (with software locking me out at a lower number) strikes me as a good
thing. If someone wants to destroy my $500 phone, prise out a chip, grind it
down, and look at it under an electron microscope to extract my passwords,
then good luck to them. Why isn't this happening?

~~~
jessaustin
_An HSM that locked me out after ~10 x 6 digit PIN guesses..._

And now you've got a DoS enforced by hardware. Hopefully if I bring it back to
the store I can get it reset? The existence of that reset sort of negates the
point of an HSM.

However, I could see the point of an Android module that, rather than locking
the user out, would simply delete keys the password to which had been entered
incorrectly a configurable number of times. For an app like this, the user
would simply have to enter a password and CC again.

~~~
nly
Yes, a reset/unlock is a bad idea. Keys should be erased, and it should be up
to apps/services to determine an appropriate means to re-authenticate users.
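The erase-on-failure policy could look something like this toy model (no real hardware involved, just the behavior):

```python
import secrets

class KeyVault:
    """Toy model of erase-on-failure: after MAX_TRIES wrong PINs the stored
    key is destroyed, and the service must re-authenticate the user."""
    MAX_TRIES = 10

    def __init__(self, pin, key):
        self._pin = pin
        self._key = key
        self._failures = 0

    def get_key(self, pin):
        if self._key is None:
            raise LookupError("key erased; re-authenticate with the service")
        if not secrets.compare_digest(pin, self._pin):
            self._failures += 1
            if self._failures >= self.MAX_TRIES:
                self._key = None  # erase; never a resettable lockout counter
            return None
        self._failures = 0
        return self._key
```

The key point is that there is nothing to "reset": once the key is gone, recovery happens at the service level, not at the device.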

------
eli
Storing a token isn't really much more secure than storing the password
itself. (If I steal your token, I'm buying coffee with your account even
though I don't know your password. A token that authenticates account access
_is_ a password.)

But it _seems_ much more secure (which, it turns out, matters) and it does
somewhat protect people who reuse the same password everywhere.

~~~
troebr
You can invalidate tokens for a compromised phone. Granted, you can also
change your password, but I'd rather revoke my phone's access than have my
password stolen. Some kind of confirmation step before you order (like a
screen lock) would be nice, something you can do with one hand.

------
unreal37
I sympathize with the developers because I face this maddening argument
between convenience and security every day, but storing passwords in
plaintext on the device? Geez.

Make a token on the server after initial login and store that! Not much more
secure, but then this story wouldn't be news.

~~~
dangoldin
Seriously. The bigger problem is password reuse. If someone gets access to a
user's password, they most likely have access to their email account, etc.

------
coldcode
We use Crashlytics as well, but only an idiot would store a username and
password in the clear in their log. Geez, just store a randomized database ID
if you have to. It's funny how people justify stupidity when they get caught.

------
jessaustin
_Only when adding money to the app is the password required._

This seems precisely backwards.

~~~
ctdonath
"Adding money" means applying a credit card to purchase Starbucks credit,
usually by a sequence like "Reload card -> $25 -> Confirm" on a pre-stored
credit card number. Buying from the app has a small cash pool to draw from;
reloading from a CC can get one a whole lot more money.

~~~
jessaustin
I'm not a Starbucks regular so I may have missed a detail. Transferring money
from a stored bank account to the app seems to be a server action, so the
server should be doing the auth. What then is the point of storing the
password on the client? If it's just to confirm possession of the phone, a
token would be better for usability, as well as in all the other ways a token
is superior to a password. TFA says the password is also used to activate the
app, but a token signed with a timestamp and emailed to the user would be
better for that.
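Concretely, a timestamped token could be just an HMAC over the username and issue time (the secret and the TTL below are hypothetical, of course):

```python
import hashlib
import hmac
import time

# Hypothetical server-side secret; it never ships inside the app.
SECRET = b"server-side-secret"
ACTIVATION_TTL = 15 * 60  # activation link valid for 15 minutes

def make_activation_token(user, now=None):
    """Server mints a short-lived token and emails it to the user."""
    ts = str(int(now if now is not None else time.time()))
    sig = hmac.new(SECRET, f"{user}:{ts}".encode(), hashlib.sha256).hexdigest()
    return f"{user}:{ts}:{sig}"

def verify_activation_token(token, now=None):
    """Accept only unexpired tokens with a valid signature."""
    user, ts, sig = token.rsplit(":", 2)
    expected = hmac.new(SECRET, f"{user}:{ts}".encode(),
                        hashlib.sha256).hexdigest()
    if not hmac.compare_digest(sig, expected):
        return False
    age = (now if now is not None else time.time()) - int(ts)
    return 0 <= age <= ACTIVATION_TTL
```

Since only the server holds SECRET, a token can't be forged, and the embedded timestamp means a stolen activation email goes stale on its own.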

------
nickflees
> Customers need only enter their password once when activating the payment
> portion of the app and then use the app to make unlimited purchases without
> having to key in the password or username again.

The "unlimited purchases" claim only holds if you have automatic reloading
turned on, which is a crucial point; otherwise spending is capped by the
balance already loaded on the card. This doesn't excuse the practice of
storing passwords in clear text, but it's an important detail.

------
lukasm
I don't get why not at least AES it.

~~~
eli
With a static key? I mean, sure... but that's just obfuscation.

~~~
curmudgeoned
Not really.

It goes something like this:

    
    
      1. Starbucks' server has the private key, the iPhone app 
         has the public key.
    
      2. The app locks the plaintext up in AES with the public 
         key, local to the phone, and keeps the locked data, 
         and sends a copy to the server. The server has the 
         private key, and can unlock the data locked up with 
         the public key anytime, even though the app (in 
         possession of the public key only) cannot unlock the 
         data by itself.
    
      3. The app needs network access to operate properly, 
         because honestly, why is Starbucks attempting to 
         transact without a network connection, so if there's 
         no network access and the protected data can't be 
         accessed, oh well. Oh, and by the way, if the app 
         really needs the plaintext, why not just ask the user? 
         Oh right, thinking is hard. Don't ask a lazy user to 
         do anything.
    
      4. Each time the app needs to unlock the protected data 
         and use it locally, it sends a GET request to the 
         server via HTTPS. Maybe it sends XML, maybe it sends 
         JSON. Who cares, as long as it's not keeping and using 
         the plaintext.
    
      5. Based on the nature of the request, the server decides 
         whether it needs to send the plain text back over 
         HTTPS, or whether the app is just asking the server to 
         do something server-side involving sensitive data. If 
         the app *REALLY* needs the locked data sent back in 
         plaintext, the server sends it back for one time use 
         via HTTPS (still protected from interception, even 
         though it's being sent over network), to be nulled out 
         after the process or function returns complete.
    
      6. The server is a fortress, and has the private key 
         (...somewhere). It does not store the sensitive data 
         in plain text. It too only stores the locked data, but 
         is capable of unlocking the data on the fly, per 
         request, each request, every time. The server should 
         actively garbage collect the plaintext data, and not 
         leave stale copies lying around.
    
      7. The server *NEVER* give an app a copy of the private 
         key. NEVER, EVER. The iPhone app can rot in hell if it 
         can't get the data unlocked. If it has to wait, it 
         waits. Find something else to do. Mine bitcoins, 
         unfold some proteins, whatever.
    

Yes. This demands a server infrastructure with high performance and high
availability, according to the popularity of the app (many millions of
concurrent users). It will be expensive and complicated to execute something
like this. One would not _JUST_ AES it.

But hey, lazy users can't be bothered to type passwords and such. Gee whiz!
Isn't this Starbucks app easy to use? How did they do that?

~~~
drdaeman
> AES with the public key

I beg your pardon, but this part is nonsense, because AES is a symmetric cipher.

And even if you use public-key crypto, the scheme's no better than plain old
OAuth 2 tokens (or the like). Actually, it's worse, because of the unnecessary
complexity and because, as opposed to an encrypted password, an OAuth token
has no relation to the password at all.

Lose the encrypted password and you're not secure until you change the
password or revoke the phone's key (so the encrypted copy becomes unusable).
Same with a token, except that you don't have to change the password, just
revoke the token.
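For the record, if you really wanted the "phone encrypts, only server decrypts" property, you'd use hybrid encryption: generate a fresh AES key per message and wrap it with the server's RSA public key. A sketch using the third-party `cryptography` package (all names here are illustrative, not anything Starbucks does):

```python
import os

from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import padding, rsa
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

OAEP = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)

def encrypt_for_server(public_key, plaintext):
    """Client side: AES-GCM encrypts the data under a fresh random key;
    RSA-OAEP wraps that key so only the private-key holder can recover it."""
    aes_key = AESGCM.generate_key(bit_length=256)
    nonce = os.urandom(12)
    ciphertext = AESGCM(aes_key).encrypt(nonce, plaintext, None)
    wrapped_key = public_key.encrypt(aes_key, OAEP)
    return wrapped_key, nonce, ciphertext

def decrypt_on_server(private_key, wrapped_key, nonce, ciphertext):
    """Server side: unwrap the AES key, then decrypt the payload."""
    aes_key = private_key.decrypt(wrapped_key, OAEP)
    return AESGCM(aes_key).decrypt(nonce, ciphertext, None)
```

Note the phone only ever holds the public key, so a stolen device yields ciphertext it cannot open, which is the property the grandparent was after.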

~~~
curmudgeoned
Whoops! Looks like you're right... AES doesn't use public key exchange or
public/private key pair generation at all! I was confusing it with other,
completely different things.

Ha ha! Sorry...

