
Show HN: SeKey: An SSH Agent for OS X, Using Secure Enclave and TouchID, in Rust - ntrippar
https://github.com/ntrippar/sekey
======
nyolfen
see also the really excellent Kryptonite, which I found linked here a few
months ago and have been positively delighted with:

[https://krypt.co/](https://krypt.co/)

I'm not a security expert, and I have a lot more trust in my iPhone for
managing secrets than in my personally configured Linux PCs.

~~~
srcmap
I have a slightly different point of view:

On Linux, I can easily monitor how much data, and what kind of data, is
transferred to a remote IP, and disable those connections on a per-app basis
anytime I want.

On iOS, I can't do any of that. At most, I can disable an app from using
cellular data. If anyone knows how to monitor/block network connections on a
per-app basis on iOS, I would love to hear about it.

For me, it is the difference between "trust" and "trust but verify".

I use iOS and have some level of trust in Apple/iOS. But I don't trust the
majority of iOS apps on security/privacy.

~~~
staplers

      If anyone knows how to monitor/block network connections on a per-app basis on iOS, I would love to hear about it.
    

I've been seeking a Little Snitch type connection monitor/firewall for iOS for
some time. I believe Apple purposely makes this difficult/impossible on
iPhones due to the constant metrics/analytics sent to Apple servers.

Look at what they have done with turning off Bluetooth and Wi-Fi from
Control Center. It's really eye-opening to see how often Apple products ping
home.

~~~
matthewmacleod
_I believe Apple purposely makes this difficult/impossible on iPhones due to
the constant metrics/analytics sent to Apple servers._

I don't think that follows. Allowing applications to intercept and mess with
other applications' network traffic would be an obvious security issue; that's
far more likely to be the reason than some kind of vague "we want to track
your data" thing.

It's certainly possible that Apple could construct an appropriate API for
allowing users to configure apps in such a fashion that they could monitor
network traffic, much the same way as similar APIs exist for accessing e.g.
photos. But since it's a niche application at best, I'm hardly surprised they
haven't done so.

~~~
comex
Such an API already exists: the NetworkExtension APIs, specifically
NEPacketTunnelProvider and NEAppProxyProvider. They're meant for VPNs, but it
should be possible to use the same APIs to monitor traffic and send it on,
rather than tunneling it through a VPN. (There is also NEFilterProvider, but
that API is designed to run the filter in a tight sandbox that doesn't allow
it any outbound communication, so that it can filter packets but not
exfiltrate them.)

…though if you're concerned about privacy, perhaps you should be using a real
VPN anyway, in which case you could handle traffic monitoring and filtering on
the server side.

Edit: I guess the server-side approach wouldn't allow identifying which app
is making the connection. The NetworkExtension APIs, however, should allow
that: you get a flow of NEPackets, each of which has a 'metadata' property
containing a 'sourceAppUniqueIdentifier' and a 'sourceAppSigningIdentifier'. I
don't have personal experience with these APIs, though.

------
_hyn3
Very, very cool.

To use a public key with something like Userify
([https://userify.com](https://userify.com), plug: SSH key management) or a
service like GitHub, use _\--export-key_ to export the public key in OpenSSH
format:

    sekey --export-key <key-id>

(See also in the readme: you can use sekey --list-keys to get key IDs.)
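
The OpenSSH text format that _\--export-key_ produces is nothing magic: for a
P-256 key it's a base64-encoded, length-prefixed blob as specified in RFC 5656.
A rough sketch of the encoding in Python (the coordinates below are made up
for illustration, not a real key):

```python
import base64
import struct

def ssh_string(data: bytes) -> bytes:
    """RFC 4253 'string': 4-byte big-endian length prefix, then the bytes."""
    return struct.pack(">I", len(data)) + data

def openssh_ecdsa_p256(x: int, y: int, comment: str = "sekey") -> str:
    """Encode an uncompressed P-256 point as an OpenSSH public-key line."""
    point = b"\x04" + x.to_bytes(32, "big") + y.to_bytes(32, "big")
    blob = (ssh_string(b"ecdsa-sha2-nistp256")
            + ssh_string(b"nistp256")
            + ssh_string(point))
    return "ecdsa-sha2-nistp256 " + base64.b64encode(blob).decode() + " " + comment

# Made-up coordinates purely for illustration, not a real key.
line = openssh_ecdsa_p256(x=12345, y=67890)
print(line.split()[0])  # ecdsa-sha2-nistp256
```

Since it's the same line format ssh-keygen emits, the output can be pasted
straight into an authorized_keys file or a GitHub SSH-key form.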

This is seriously such an awesome project that I might have to get a new MBP
just for this.

------
floatboth
For TPMs on non-Apple computers: [https://github.com/ThomasHabets/simple-tpm-pk11](https://github.com/ThomasHabets/simple-tpm-pk11)

But with a TPM there's no external unlock mechanism like Touch ID; the TPM
unlock happens from the operating system.

------
abhv
This is a really cool project. I just tried to compile it myself and code-
signed it with my Mac dev key. I changed the assets/sekey.entitlements file,
replacing 5E8NNEEMLP with my own developer team ID.

However, when I run ./bundle/Sekey.App/.../sekey, I keep getting a "Killed: 9"
message.

When I run the unsigned version, the binary at least runs (it shows the -h
message). Any hints on how to fix this?

~~~
ntrippar
That's because the code signature is invalid. Are you signing the whole app,
or only the binary? It could be many things; one is the lack of a provisioning
profile on your computer (for your key).

------
froderick
This looks useful. I'd love to use this with gopass
([https://github.com/justwatchcom/gopass](https://github.com/justwatchcom/gopass)).
It would have to support preexisting keys for that to work for me.

I wonder if the limitation to elliptic curve keys originates with the
Secure Enclave, or if that's just the one type of key this tool supports.

------
epistasis
I would love to see this usable as a U2F device too.

------
falcolas
Very cool project. I’ve done similar things with YubiKeys before; having it
built into the hardware is great.

Can it also support a PIN to go with the biometric auth?

~~~
ntrippar
As far as I read in the documentation, no. I will do more research on the
implementation. Also, your device has to be unlocked for the enclave to work.

[https://developer.apple.com/documentation/security/ksecattra...](https://developer.apple.com/documentation/security/ksecattraccessiblewhenunlockedthisdeviceonly)

------
viraptor
> Can’t import preexisting key

That makes me sad :( Does anyone know if that's a SE limitation, or the app's?

~~~
ntrippar
It's an SE limitation :(. It also makes me sad; it would be awesome if you
could import a key you generated yourself, so that if you reinstall you could
import it into the enclave again.

~~~
tedchs
It's common for hardware security tokens like this to limit themselves to
self-generated private keys. The intent is that the device needs to guarantee
the private key is not accessible anywhere else, whereas it could be if the
user generated it, perhaps wrote it to disk, provided it to the SE, and
perhaps to SEs on other devices.

~~~
0culus
YubiKeys allow you to load your own private key, which turned out to be a
great feature. [0]

[0]: [https://www.yubico.com/support/security-advisories/ysa-2017-01/](https://www.yubico.com/support/security-advisories/ysa-2017-01/)

------
arete
Awesome work, this is a really cool idea and well executed.

------
galadran
I'm amazed there's a market for this kind of thing. Since the 1990s it's been
known how to implement an undetectable backdoor in hardware-based crypto that
leaks your keys to an attacker.

E.g.,
[http://paper.ijcsns.org/07_book/201006/20100623.pdf](http://paper.ijcsns.org/07_book/201006/20100623.pdf)
details how to do it for elliptic curves, but it's been studied since
[https://link.springer.com/content/pdf/10.1007%2F3-540-69053-...](https://link.springer.com/content/pdf/10.1007%2F3-540-69053-0_6.pdf)

The tl;dr is that any device whose implementation you can't inspect, and from
which you can't extract the private key (e.g. Secure Enclave, TPM), can leak
your key to a passive attacker in a way that you _provably_ can't detect.
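
To make that concrete, here is a toy sketch of the idea (my own illustration
with made-up names and deliberately tiny, insecure parameters, not the paper's
actual construction): a Schnorr-style signer whose nonce is secretly derived
from a manufacturer secret via HMAC. Every signature verifies normally, yet
anyone holding the baked-in secret can recover the private key from a single
signature:

```python
import hashlib
import hmac

# Toy Schnorr group (deliberately tiny, insecure): p = 2q + 1, g has order q.
p, q, g = 2039, 1019, 4

def H(*parts) -> int:
    """Hash to [1, q-1] so the challenge is always invertible mod q."""
    digest = hashlib.sha256("|".join(map(str, parts)).encode()).digest()
    return int.from_bytes(digest, "big") % (q - 1) + 1

def verify(pub, msg, sig):
    e, s = sig
    r = (pow(g, s, p) * pow(pub, -e, p)) % p  # g^s * y^(-e) == g^k
    return H(r, msg) == e

# The backdoor: the nonce k is not random but derived with HMAC from a
# secret baked in by the manufacturer. Outputs still verify normally.
ATTACKER_SECRET = b"baked in at the factory"

def leaky_nonce(msg):
    mac = hmac.new(ATTACKER_SECRET, msg.encode(), hashlib.sha256).digest()
    return int.from_bytes(mac, "big") % (q - 1) + 1

def backdoored_sign(priv, msg):
    k = leaky_nonce(msg)
    r = pow(g, k, p)
    e = H(r, msg)
    s = (k + e * priv) % q
    return (e, s)

def attacker_recover(msg, sig):
    """Anyone holding ATTACKER_SECRET recomputes k and solves for the key."""
    e, s = sig
    k = leaky_nonce(msg)
    return ((s - k) * pow(e, -1, q)) % q

priv = 777                                       # the device's private key
pub = pow(g, priv, p)
sig = backdoored_sign(priv, "example message")
print(verify(pub, "example message", sig))       # True: looks legitimate
print(attacker_recover("example message", sig))  # 777: key recovered
```

From the outside, these signatures are indistinguishable from honest ones to
anyone without the secret; that's why "you can't extract the key" cuts both
ways.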

~~~
falcolas
On the flip side, it's also demonstrable that private keys stored on disk can
be exposed by a minimal intrusion into your user account, likewise with no
indication that they were leaked.

Another layer of security is never bad.

~~~
thisacctforreal
Additional layers of security can be bad when they reduce usablity, reduce
accessibility/availability, or increase complexity too much.

As usual it's trade-offs all the way down.

