
Storing secrets in environment variables is such a bad security anti-pattern, and it seems to be getting more popular.



Storing secrets elsewhere (for some value of elsewhere that may or may not be better than environment variables) would not have helped this vulnerability: even if the OAuth token were in a file, and it were copied into .git/config, git would still have printed the URL on a failed push.
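For context, the credential here rides inside the remote URL itself (Travis embeds the OAuth token as the userinfo part of an https URL). A minimal, hypothetical sketch of redacting such URLs before they reach a build log (`redact_url` is my name for it, not anything git or Travis provides):

```python
from urllib.parse import urlsplit, urlunsplit

def redact_url(url):
    """Replace any userinfo (user:password or token) in a URL with a placeholder."""
    parts = urlsplit(url)
    if "@" in parts.netloc:
        host = parts.netloc.rsplit("@", 1)[1]
        parts = parts._replace(netloc="***@" + host)
    return urlunsplit(parts)

# A remote URL with an embedded OAuth token, the shape that leaked here:
leaky = "https://abc123token@github.com/owner/repo.git"
print(redact_url(leaky))  # https://***@github.com/owner/repo.git
```

Anything that prints remote URLs to a log would need a filter like this in the path; git itself has no such hook for its error output, which is why the file-vs-environment question doesn't help.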

But storing secrets in environment variables isn't a universally bad thing (which is what I interpret "anti-pattern" to mean). It's a bad thing in certain contexts, including OSes (e.g., AIX and OpenBSD) where environment variables are readable by any local user. But there are certainly uses where it's more secure than any alternative (e.g., mosh uses $MOSH_KEY for this, and it's pretty sound). And except for the OSes where environment variables are intentionally world-readable, most attacks that would let you read an environment variable would also let you read any other place you could store a secret: a file, a pipe, or a special kernel thing like Linux keyrings.
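The mosh pattern mentioned above is roughly: the child reads the key from its environment once at startup, then scrubs it. A rough Python sketch of that idea (mosh itself is C++; `DEMO_KEY` is a stand-in for MOSH_KEY):

```python
import os

# Stand-in for the parent process setting MOSH_KEY in the child's environment.
os.environ["DEMO_KEY"] = "s3cret"

# Read the key exactly once, then scrub it so it isn't inherited by any
# further subprocesses or visible in later environment dumps.
key = os.environ.pop("DEMO_KEY")

print("DEMO_KEY" in os.environ)  # False
# Caveat: on Linux, /proc/<pid>/environ still shows the exec-time
# environment, so the scrub only protects against later snapshots and
# children, not against someone who can already read your procfs entries.
```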


I think you have some faulty assumptions. Environment variables are not usually disclosed because somebody got shell access.

It's very common to print them out while debugging, but not common to print arbitrary files containing your credentials. I have yet to see phpinfo() print ~/.ssh/id_rsa to the world, for example, but I sure have pulled a lot of passwords out of it.


Storing secrets elsewhere wouldn't have prevented this specific problem, where git potentially leaked credentials (in URL form) to the build log.


What is a better pattern?


Secrets or credential management is hard, but the first step is to centralise. Many folk use Vault. There's also Knox, Keywhiz, and I forget some others. I've been on a secrets-management product team (CredHub) for several months now.

We've looked at different ways of shuttling secrets but really, it's going to be specific to the context. For example, one job our software does is to hand credentials to a trusted BOSH director during deployments. That's basically done at this point and works very nicely from an operator perspective.

But then when we look at handing secrets to applications, or getting secrets to CI, it's a bit trickier.

We use Concourse a lot, and for Concourse the next major track of work centres entirely on creating a secrets-management layer that backs onto secrets-management systems.

Disclosure: At the moment I work on CredHub on behalf of Pivotal.


Not applicable for y'all, but for our AWS people, I'll cape up for Credstash[1] (or its moral equivalent, Sneaker[2], but I prefer the use of a Dynamo table to S3 for this). Uses IAM to ensure secret access and offers revisioning. Plus it's super easy to work with. My normal stack, using my own Auster[3] for orchestration, uses an offline, file-based secrets store (usually an encrypted USB key in production) for stuff like database root passwords that don't need to be online, then push database passwords into Credstash with encryption contexts (the KMS thing that makes IAM effective for this). My Chef cookbooks happily slurp in credstashed secrets via rcredstash, now that the PR to make it work with parameterized KMS keys has hit, and provisioning is very straightforward. (There are a lot of proper nouns in this paragraph, but the nice thing is that each component handles its own business and so there aren't many vertical concerns.)

CI secret access isn't a thing in the systems I develop (unit tests don't need them, integration tests get a spun-up environment that provisions its own secrets), but you could provide access with a bog-standard token machine.

[1] https://github.com/fugue/credstash

[2] https://github.com/codahale/sneaker

[3] https://github.com/eropple/auster


Secrets or credential management is hard, but the first step is to centralise.

Ah yes, the "all eggs, one basket" approach to secret management. This is the correct approach, if you are trying to sell a platform -- gets you lock-in, and if you fail to keep secrets secure, you were going to blow up anyways, so the business risk management dictates that you should shoot for the moon and risk your client's data in the hopes of getting traction.


That's definitely one way of looking at it.

Another view is that:

1. You can't invest in heavily defending scattered resources.

2. Individual teams are not all experts in secret management.

Pivotal started CredHub (it's now in the Cloud Foundry Incubation process) partly because of client requests and partly because of the problems we and our fellow Cloud Foundry Foundation members have encountered. There are literally thousands of secrets and credentials scattered across dozens of teams, including hundreds of high-risk operational secrets.

We have had multiple unintentional leakages, usually via git. It's so easy to do that we now have tools to watch commits and checkouts for secret-like patterns. The same tools constantly comb our repositories for possible secrets as well.

Development teams should not need to care. Operators should not have to hand-manage thousands of secrets. There should be a safe, sane, central, highly assured place or places to keep your secrets.


I've seen secrets stored in special files, but personally I prefer something like https://www.vaultproject.io/, which may still use environment variables, but they can potentially be passed in a more secure, controlled way.


I'd love to know, too. As best I can tell, the only really secure option is to read them in from STDIN.
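For what it's worth, the STDIN approach can look something like this sketch (assuming a supervisor pipes the secret in; all names here are illustrative):

```python
import subprocess
import sys

# The consuming program reads the secret from stdin, so it never appears in
# argv, the environment, or on disk. This demo consumer only reports length.
consumer = "import sys; secret = sys.stdin.readline().rstrip('\\n'); print(len(secret))"

result = subprocess.run(
    [sys.executable, "-c", consumer],
    input="hunter2\n",   # the supervisor writes the secret to the child's stdin
    capture_output=True,
    text=True,
    check=True,
)
print(result.stdout.strip())  # 7
```

The secret exists only in the two processes' memory, which is the property the other options struggle to guarantee.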

Depending on how the process is running, it may also be reasonably secure to read the secrets from files. But getting this right is tricky, and really prone to human error: all it takes is one errant chmod/chown to remove the security.
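A sketch of the defensive check that mitigates the errant-chmod problem: refuse to read a secret file unless only the owner can access it (the helper is hypothetical, not a standard API):

```python
import os
import stat
import tempfile

def read_secret_file(path):
    """Refuse to use a secret file that is accessible by group or others."""
    mode = os.stat(path).st_mode
    if mode & (stat.S_IRWXG | stat.S_IRWXO):
        raise PermissionError(f"{path} is accessible by group/other; chmod 600 it")
    with open(path) as f:
        return f.read().strip()

# Demo: mkstemp creates the file with mode 0600, so this passes.
fd, path = tempfile.mkstemp()
with os.fdopen(fd, "w") as f:
    f.write("s3cret\n")
print(read_secret_file(path))  # s3cret

os.chmod(path, 0o644)  # one errant chmod later...
try:
    read_secret_file(path)
except PermissionError as e:
    print("rejected:", e)
os.unlink(path)
```

It doesn't fix ownership mistakes or a compromised process, but it at least turns the silent chmod failure mode into a loud one.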

Honestly, given the challenges of those options, keeping secrets in environment variables seems like a reasonable compromise to me.


Even putting them in text files is far better than environment variables.

Consider how often people log environment variables, or even leave phpinfo() lying around. As someone who does penetration tests, I can tell you that it's far more common to gain access to environment variables than to read files on the server. The difference between services that didn't keep secrets there and the ones that did was the difference between a low-severity finding that was barely worth logging and a full remote compromise.


How often do people log environment variables? Maybe the places I've worked are outliers, but `puts ENV.inspect` or the use of PHP wouldn't pass code review in my prior gigs, nor would they be things I'd write today. My experience is literally the opposite: somebody forgetting a chmod is a lot more common than somebody leaking environment variables like that (which leaves an attacker to figure out a really smart way to read `/proc/pid/environ` or something).
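For anyone curious, the `/proc/<pid>/environ` route isn't especially smart on Linux: any process whose procfs entries you can read (your own, or any process if you're root) exposes its exec-time environment as NUL-separated `KEY=VALUE` pairs. A quick Linux-only demonstration:

```python
import os
import subprocess

# Spawn a child with a secret in its environment, then read the child's
# /proc/self/environ the way an attacker with file-read access would.
# LEAKED_TOKEN is a made-up name for the demo.
env = dict(os.environ, LEAKED_TOKEN="abc123")
raw = subprocess.run(
    ["cat", "/proc/self/environ"], env=env, capture_output=True, check=True
).stdout
entries = raw.split(b"\0")
print(b"LEAKED_TOKEN=abc123" in entries)  # True
```

So a file-read primitive on Linux reaches environment variables too, which is the parent thread's point about the attack surfaces overlapping.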


I've written code that does that [1] to log information to help diagnose crashes in long-running daemons. The dumping of environment variables is optional, but it has been helpful to me in isolating issues with some CGI-based programs (as Apache passes critical information to the script in environment variables). Of course, it was in a situation where core dumps are not generated, so anything related to a crash is helpful.

[1] https://github.com/spc476/CGILib/blob/master/src/crashreport...


I suspect the vast majority of software development organizations do not do code reviews, so you're already in the minority.


Even if they do, they 1) don't reject PHP wholesale and 2) don't require the devops person trying to find a bug to go through code review when deploying a debugging tool.


Distressingly likely. Point taken.


Encrypt the data at rest with an encryption appliance (HSM). Or use a key management service[0][1] to store encryption keys that can be used to decrypt your at-rest data.

[0] https://cloud.google.com/kms/

[1] https://aws.amazon.com/kms/


+1. With KMS you could implement access control for different jobs.


I think there are services designed to store and provide secrets... maybe one of those.


While I agree with this statement, I don't know of any other way to do so with Travis CI.



