
GOST cryptography – Russian Federation’s cryptographic algorithms - stargrave
http://www.cypherpunks.ru/gost/English.html
======
eternalban
I don't get these crypto types who use plain HTTP for the download page [1]
and then make a show of "You have to verify downloaded tarballs integrity
and authenticity to be sure that you retrieved trusted and untampered
software." And if you go to the "alternate resources" links [2][3], you get
"Error code: sec_error_cert_signature_algorithm_disabled."

[1]:
[http://www.cypherpunks.ru/gogost/Download.html#Download](http://www.cypherpunks.ru/gogost/Download.html#Download)

[2]:
[https://lists.cypherpunks.ru/mailman/listinfo/gost](https://lists.cypherpunks.ru/mailman/listinfo/gost)

[3]:
[https://git.cypherpunks.ru/cgit.cgi/gogost.git/](https://git.cypherpunks.ru/cgit.cgi/gogost.git/)

~~~
nullc
If your primary threat model is an active attacker in proximity to the server,
rather than the client, SSL provides _NO_ security: the attacker can intercept
the plaintext domain-validation traffic used to show control over the domain
and trivially obtain a certificate.

If your threat model is adversarial nation states that control a CA or ten and
are willing to create some bad issuance drama... again no security added by
browser SSL.

If your threat model is an attacker who will compromise your webserver because
no one can keep up with the flood of new vulnerabilities, and the only way you
can keep a private key private is to keep it offline -- which can't be done
with SSL -- then again, no joy.

Some people believe the use of HTTPS in these cases creates a false sense of
security and reduces the likelihood that people will check using other
mechanisms. I am pretty confident(*) that they are wrong and that they've
not actually measured the effect. But it's not a crazy position to take.

(Especially when you mix in how easy it is for https snafus to result in
giving users scary warnings that make them blind to scary warnings)

(*: confidence due to religiously verifying packages and keys, and finding
_frequently_ that they are unverifiable even on major high profile targets
like major linux distros or crypto libraries... e.g. signed with a key that is
signed by no one else and exists only in the same directory as the binary; if
people were actually checking I wouldn't find so many messed up cases; Also
confident by watching the number of .sig downloads on my own software-- no one
checks).

~~~
3pt14159
> The attacker can intercept the plaintext communication to your domain used
> to show control over the domain and trivially obtain a certificate.

Could you explain? Are you talking about DNS spoofing / hijacking or protocol
downgrade attacks? There are answers to those, so I'm not following.

> If your threat model is an attacker who will compromise your webserver
> because no one can keep up with the flood of new vulnerabilities, and the
> only way you can keep a private key private is to keep it offline-- which
> can't be used with SSL then again, no joy.

If you are important enough to worry about 0days, then you need to invest
heavily in monitoring and tools like Appcanary that can autoupdate your
packages. If you detect that your key is compromised then issue a new one and
move on with your life.

HTTPS does not create a false sense of security, MD5 sums for packages
delivered over non-TLS do. HTTPS isn't perfect but not using it because it
won't stop some very well funded actors is silly. I'm worried about hacked
wifi routers at my cafe, not about state level actors stealing HTTPS
certificates.

~~~
stargrave
But OpenPGP detached signatures, which are standalone and do not depend on the
transport protocol (TLS), defend you from all those kinds of MitM attacks,
because they are point-to-point (directly from developer to end-user), without
depending on any third party (like the CAs issuing TLS certificates, DNSSEC
providers, intermediate DNS proxies that must not strip DNSSEC off, and so on).

~~~
Quiark
Except when the user sees the author's public key for the first time and
downloads it from the same site. Key distribution sucks ....

~~~
eternalban
That was my point. How do you know the pub key is not tampered with? Come to
think of it, is meeting in person the only reliable way to exchange keys?

~~~
lfam
Perhaps, but it depends on your needs.

When using PGP, you have to decide how much you trust each key. All that PGP
does is enforce your trust preferences.

------
contingencies
Some projects, such as Gentoo, use multiple hashing algorithms in parallel to
protect against potential collision attacks while verifying package sources.
Adding Streebog for diversity may be a good idea.

~~~
hannob
Actually Gentoo stopped doing that. (Which imho is sane. It's useless overhead
for a completely theoretical scenario that will never happen.)

~~~
contingencies
I think perhaps portage simply stopped winking ;-) at you while validating
package sources on a per-algorithm basis. Typical _Manifest_ file:

    DIST <filename> <size> SHA256 <sha256_checksum> SHA512 <sha512_checksum> WHIRLPOOL <whirlpool_checksum>

Looks like SHA256, SHA512 + Whirlpool to me. Apparently the SHA algorithms
have FIPS (US) and NESSIE (EU) certification. Whirlpool has NESSIE (EU)
certification. Streebog has GOST (Russian) certification.
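
A per-algorithm check over such a Manifest line can be sketched in Python (a
minimal illustration: the filename, size, and digests are made up, and
WHIRLPOOL support in hashlib depends on the local OpenSSL build):

```python
import hashlib

def parse_dist_line(line):
    """Split a Manifest DIST line into filename, size, and {ALGO: digest}."""
    parts = line.split()
    assert parts[0] == "DIST"
    filename, size = parts[1], int(parts[2])
    digests = dict(zip(parts[3::2], parts[4::2]))
    return filename, size, digests

def verify(data, digests):
    """Check data against every listed digest; all available algos must match."""
    algo_map = {"SHA256": "sha256", "SHA512": "sha512", "WHIRLPOOL": "whirlpool"}
    for name, expected in digests.items():
        try:
            h = hashlib.new(algo_map.get(name, name.lower()))
        except ValueError:
            continue  # algorithm unavailable in this build; a real tool should warn
        h.update(data)
        if h.hexdigest() != expected:
            return False
    return True

# Hypothetical entry, digests computed over stand-in contents:
data = b"example tarball contents"
line = "DIST pkg-1.0.tar.gz {} SHA256 {} SHA512 {}".format(
    len(data),
    hashlib.sha256(data).hexdigest(),
    hashlib.sha512(data).hexdigest(),
)
filename, size, digests = parse_dist_line(line)
print(verify(data, digests) and size == len(data))  # prints True
```

The point of the parallel digests is that a forged tarball would have to
collide every listed algorithm simultaneously.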

Added
[https://bugs.gentoo.org/show_bug.cgi?id=597736](https://bugs.gentoo.org/show_bug.cgi?id=597736)
for discussion.

~~~
hannob
You're right, I thought this was deprecated with thin manifests. Still I don't
see a point in this. We should discuss cryptographic algorithms based on
technical arguments, not on algorithm origin.

~~~
ethbro
Seems a fairly small computational price to pay for protection against
government crypto weakening.

Yes, that's paranoid, but if the price of being safe from paranoid outcomes is
dropping $0.02 in a change jar, I'm willing to pay.

~~~
hannob
The price is not mainly computational, but in complexity. You need to have a
library implementing that alg, maintain it, make sure it has no security
flaws...

And by the way, these Gentoo manifests, if you're worried about their
cryptographic security there's something much bigger to worry about: For most
users they're transmitted unprotected and unsigned via rsync. There are non-
default ways to improve that, but the default is insecure. (It pains me to say
this, because I'm a long time Gentoo dev, but it's a nasty truth about
Gentoo's lack of security.)

If you want to improve Gentoo's cryptographic integrity this is the first
thing that should be worked on (either through a working signing system that
would be acceptable by default or by switching to an authenticated
transmission mechanism like git over https). This would be much more helpful
than adding an obscure hash algorithm.

~~~
contingencies
The current sync recommendation is _emerge-webrsync_ :

    
    
      # emerge-webrsync
      Fetching most recent snapshot ...
      Trying to retrieve 20161021 snapshot from http://mirror.com/gentoo ...
      Fetching file portage-20161021.tar.xz.md5sum ...
      Fetching file portage-20161021.tar.xz.gpgsig ...
      Fetching file portage-20161021.tar.xz ...
    

GPG is enough cryptographic assurance for me.

(Edit: WTF! Either past-midnight has zapped my brain, or this is a total
community pants down moment. _emerge-webrsync_ doesn't actually verify the GPG
signature by default, _despite downloading it_ , and no warning is issued! One
must follow the obtuse, well-buried instructions @
[https://wiki.gentoo.org/wiki/Handbook:AMD64/Working/Features...](https://wiki.gentoo.org/wiki/Handbook:AMD64/Working/Features#Validated_Portage_tree_snapshots)
to get it to actually verify ... I've added another bug @
[https://bugs.gentoo.org/show_bug.cgi?id=597800](https://bugs.gentoo.org/show_bug.cgi?id=597800)
about this... _serious_ misfeature! Looks like for all intents and purposes if
you want to own the average Gentoo box, having MITM on sync + emerge is
enough! NSA must have been using that...)

------
ex3ndr
In Actor.im we use double encryption of all our traffic with AES+Kuznechik
and SHA256+Streebog. We modified the Signal protocol to handle such
encryption, while keeping curve25519-only for public-key cryptography, as
Russia doesn't have any kind of standard for PKI.

The main issue is performance. AES and SHA256 usually have hardware
optimizations on ARM and x64 processors, but the Russian algorithms have no
such support.

Second, I think this is not actually required, as AES and Kuznechik are built
on very similar ideas with just a slightly different combination. Also, AES
is not cracked and is not going to be in the near future.

~~~
stargrave
Maybe I misunderstood you, but
[http://www.cypherpunks.ru/gost/enVKO.html](http://www.cypherpunks.ru/gost/enVKO.html)
VKO 34.10-2001 is an ECDH analogue. It uses two 256- or 512-bit elliptic
curve keypairs to derive a common shared 256-bit key. It is Diffie-Hellman,
like curve25519, with at least a 128-bit security margin.

[http://www.cypherpunks.ru/gost/en3410.html](http://www.cypherpunks.ru/gost/en3410.html)
GOST R 34.10-2001/2012 are digital signatures based on elliptic curves, just
like ECDSA.
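
The key-agreement shape is the same in all these schemes. A toy finite-field
Diffie-Hellman in Python shows both sides deriving the same secret; VKO 34.10
and curve25519 do the equivalent over elliptic curves, and these toy
parameters are deliberately NOT secure:

```python
import secrets

# Toy Diffie-Hellman over the multiplicative group mod p.
p = 2**127 - 1          # a Mersenne prime, far too small for real use
g = 3                   # toy generator

a = secrets.randbelow(p - 2) + 1        # Alice's private scalar
b = secrets.randbelow(p - 2) + 1        # Bob's private scalar

A = pow(g, a, p)        # Alice's public value, sent in the clear
B = pow(g, b, p)        # Bob's public value, sent in the clear

# Each side combines its own private scalar with the other's public value;
# the shared secret itself is never transmitted.
shared_alice = pow(B, a, p)
shared_bob = pow(A, b, p)
assert shared_alice == shared_bob
```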

------
tormeh
So, if you encrypted a file with a NIST algorithm and then encrypted the
resulting file again with a GOST one, would that make it secure?

~~~
kkl
That is called a "cipher cascade"[1] and practically speaking, no.

[1]
[https://en.wikipedia.org/wiki/Multiple_encryption](https://en.wikipedia.org/wiki/Multiple_encryption)
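
The idealized cascade argument can still be sketched: with two independent
one-time-pad layers standing in for the two ciphers (a toy, not real
NIST/GOST algorithms), recovering the plaintext requires both keys:

```python
import secrets

def xor(data, key):
    # One-time-pad layer: XOR the message with an equal-length random key.
    return bytes(a ^ b for a, b in zip(data, key))

plaintext = b"attack at dawn"
k_inner = secrets.token_bytes(len(plaintext))  # stands in for the GOST key
k_outer = secrets.token_bytes(len(plaintext))  # stands in for the NIST key

# Cascade: outer(inner(plaintext)), i.e. NIST(GOST(plaintext)).
ciphertext = xor(xor(plaintext, k_inner), k_outer)

# Decryption needs BOTH keys, applied in reverse order -- an attacker with
# only one hypothetical backdoor strips one layer and still sees noise.
recovered = xor(xor(ciphertext, k_outer), k_inner)
assert recovered == plaintext
```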

~~~
code_sardaukar
I think you're missing the implication in the original comment, that the NSA
has put a backdoor into NIST and the Russian equivalent has put a backdoor
into GOST, but neither can use the other's backdoor.

~~~
ex3ndr
We have done double encryption based on same ideas.

~~~
tptacek
I cannot tell you how silly this is. If you're using GOST, you're no longer
building NIST-compliant crypto. If you're using NIST, you're no longer
building GOST-compliant crypto. _For Christ's sake, just stop using
standardized crypto if you're worried about backdoors like this_. Use an
eSTREAM portfolio cipher for bulk crypto, use Blake2 as your hash, and use
Curve448 for key agreement and signatures.

You can just use the Noise protocol framework to accomplish this, which was
designed to use all of these components.

~~~
hueving
It's still NIST compliant crypto on the outside regardless of the inner
contents. Assuming NIST(GOST(plaintext)), NIST would be terribly broken if
using GOST ciphertext as the payload weakened the security.

It's crypto 101 that, given a ciphertext without the key, an algorithm's
correct input should be indistinguishable from random input of the same
length.

I'm shocked that you think the plaintext contents would have an effect on
whether or not something is NIST compliant.

~~~
tptacek
I don't follow this objection even a little bit, but I'm also not very
motivated to try, so: no need to clarify. I'm just going to reiterate.

I am making a very simple point. If you don't trust NIST standards because you
think they're backdoored, but won't run Russian standards because you think
they might be too, the answer isn't to _compose the two flawed standards_.

Instead: just use a crypto stack composed of well-reviewed, well-regarded
components that are neither NIST nor GOST standards.

Nobody in the world thinks Curve25519 is backdoored, or that Chapoly is, or
that Blake2 is.

In fact: this is what I think you should do anyways. Maybe, just maybe, you
should keep using AES because it will be more performant -- but the
cycles/byte cost of bulk encryption is so low that I'm skeptical that this
matters. Otherwise: avoid crypto standards like NIST and GOST. Standards
processes produce crypto that is at best ungainly and at worst actively
harmful. Standards are evil.

I am, of course, addressing this advice to the _very, very_ limited subset of
engineers who should be working with crypto directly. Everyone else should
just use Nacl.

~~~
hueving
If you think standard A is good except for a potential backdoor with the key
held by entity X and you think standard B is good except for a potential
backdoor held by entity Y _and_ you assume entity X and entity Y do not
cooperate, then composing A and B is completely reasonable.

It's the same thing as having 3 computers vote on the space shuttle control
signals. You could follow your argument and claim, "If the software has a bug
in it that would produce output different from the other 2, then don't use
it!" The problem is that we don't know if there is an issue or not, so we go
the safer route with multiple implementations.

You also did not address the main issue I have with your comment. You made
this assertion: "If you're using GOST, you're no longer building NIST-
compliant crypto.", which implies that the contents of the plaintext determine
if the crypto is NIST compliant. This is completely false.

It's like claiming that uploading an AES encrypted file over an HTTPS
connection is less secure than uploading via HTTP.

~~~
tptacek
Considering only the secure channel problem and not the entire systems problem
(which might motivate encrypting clientside in anticipation of the file being
stored), encrypting before sending on a secure channel is indeed pointless,
which is the reason you'll find very few soundly designed cryptosystems that
do this.

The point is again simple: there are far better options to untrustworthy
standards than composing them in the hopes of mitigating their flaws. It's for
the same reason that we used to use hash combiners to handle MD5 and SHA1, but
now we use HKDF over SHA-2.
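
The HKDF construction mentioned here is small enough to sketch; a minimal
HKDF-SHA256 per RFC 5869 (extract-then-expand), shown for illustration rather
than production use:

```python
import hashlib
import hmac

def hkdf_extract(salt, ikm):
    # Extract: concentrate the input keying material into a fixed-size PRK.
    return hmac.new(salt or b"\x00" * 32, ikm, hashlib.sha256).digest()

def hkdf_expand(prk, info, length):
    # Expand: T(i) = HMAC(PRK, T(i-1) | info | i), concatenated and trimmed.
    okm, block = b"", b""
    for counter in range(1, -(-length // 32) + 1):
        block = hmac.new(prk, block + info + bytes([counter]),
                         hashlib.sha256).digest()
        okm += block
    return okm[:length]

prk = hkdf_extract(b"some salt", b"input keying material")
key = hkdf_expand(prk, b"context label", 42)
assert len(key) == 42
```

The `info` argument is what lets one extracted secret yield independent keys
for different purposes.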

~~~
hueving
>which is the reason you'll find very few soundly designed cryptosystems that
do this.

Nearly every secure system I've dealt with (in the military side) encrypted at
the network layer (VPN) and they sent encrypted files over that channel.

~~~
tptacek
Yes, because (as I just said), encrypting files mitigates systems problems
outside the scope of the secure channel problem. A secure channel doesn't help
you if the bag of bits you send down it ends up persisted on an exported,
unencrypted filesystem.

That doesn't mean that redundant clientside encryption of files is a sensible
feature for a secure channel to have.

------
meshko
So if you encrypt your data with both, only people who have access to the
backdoors from both sides will be able to decrypt it!

------
based2
[https://en.wikipedia.org/wiki/GOST_(block_cipher)](https://en.wikipedia.org/wiki/GOST_\(block_cipher\))

------
avodonosov
GPL licensed Belarusian crypto standards implementation:
[https://github.com/bcrypto/bee2](https://github.com/bcrypto/bee2)

------
mataug
It's nearly impossible to predict when someone will find vulnerabilities in
crypto primitives (or whether they already have in secret -- Bletchley Park,
anyone?), and the problem gets compounded when we try to use untested crypto
primitives such as those highlighted in this article.

AES has been around since 2001 and researchers haven't gotten past 7 of its
10 rounds, which significantly improves my confidence in its ability not to
crumble under even simple cryptanalysis.

Here's an interesting video by the author of one of the attacks on the inner
round of SHA-3 explaining why public analysis is exceptionally important.
[https://www.youtube.com/watch?v=uT4hrWkbBxM](https://www.youtube.com/watch?v=uT4hrWkbBxM)

My point is that gaining popularity may be good, because more researchers may
find vulnerabilities, but until these primitives are proven it's probably not
a good idea to use them in any real-world application.

------
matt_wulfeck
> "Why those algorithms could be interesting and great worth alternative to
> foreign ones? Because they are obviously not worse, in some places are much
> better and have high serious security margin."

Is there a reason these algorithms aren't formally introduced as NIST
standards? Are they copyrighted? Couldn't anyone submit them?

~~~
chiph
The uncertainty around their origin (a foreign country that until a few years
ago was hostile to the US) would preclude their use by US government agencies
(who depend on NIST for the stamp of approval).

~~~
matt_wulfeck
Is that true for an open-source algorithm? I'm sure that the Russian
Government still uses things like SHA256, even though it's an approved NIST
standard.

~~~
andwur
I would imagine so. Particularly in cryptography, being able to view the
source is not the same as being able to (easily) verify that it doesn't
contain any backdoors. See ECDSA as a good example: we can see the entire
standard and yet we can't be sure it hasn't been weakened/backdoored by its
creators, despite it looking secure on the surface. See the start of [0], the
bit about the standard pushing state leakage through secret-dependent
operations.

[0]
[https://blog.cr.yp.to/20140323-ecdsa.html](https://blog.cr.yp.to/20140323-ecdsa.html)

Edit: my point here is that America would have to trust that there was no way
that the Russians had added backdoors. Given their history and the current
political climate I can't see that trust coming soon.

~~~
rocqua
You can go a long way towards showing a standard hasn't been backdoored by
demonstrating the parameters were generated from 'nothing up your sleeve'
numbers. E.g. derive candidate parameters starting from SHA-256 of 1, and
keep incrementing until the requirements are met.
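
That derivation can be sketched in Python (a toy: deriving a 128-bit probable
prime from SHA-256 of an incrementing counter, with primality standing in for
whatever requirements a real standard would impose):

```python
import hashlib

def is_probable_prime(n):
    # Miller-Rabin with fixed bases; adequate for a demonstration.
    if n < 2:
        return False
    bases = (2, 3, 5, 7, 11, 13, 17, 19, 23, 29, 31, 37)
    for b in bases:
        if n % b == 0:
            return n == b
    d, r = n - 1, 0
    while d % 2 == 0:
        d, r = d // 2, r + 1
    for a in bases:
        x = pow(a, d, n)
        if x in (1, n - 1):
            continue
        for _ in range(r - 1):
            x = pow(x, 2, n)
            if x == n - 1:
                break
        else:
            return False
    return True

def nothing_up_my_sleeve_prime(bits=128):
    # Derive candidates from SHA-256 of an incrementing counter; anyone can
    # rerun this and confirm the parameter wasn't cherry-picked.
    counter = 1
    while True:
        digest = hashlib.sha256(str(counter).encode()).digest()
        candidate = int.from_bytes(digest, "big") >> (256 - bits)
        candidate |= (1 << (bits - 1)) | 1   # force full size and oddness
        if is_probable_prime(candidate):
            return counter, candidate
        counter += 1

counter, prime = nothing_up_my_sleeve_prime()
assert prime.bit_length() == 128 and prime % 2 == 1
```

Because the whole derivation is deterministic and public, a verifier rerunning
it must land on exactly the same parameter.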

------
pps43
> they are obviously not worse, in some places are much better

Not sure why it is obvious, especially after Alex Biryukov et al.
reverse-engineered the S-boxes of Streebog and Kuznyechik [1].

If you suspect a Dual_EC_DRBG kind of weakness, why not use an algorithm
without magic constants, like Speck [2]?

[1]
[https://eprint.iacr.org/2016/071.pdf](https://eprint.iacr.org/2016/071.pdf)

[2] [http://eprint.iacr.org/2013/404.pdf](http://eprint.iacr.org/2013/404.pdf)

