
Kangaroo Twelve Implementation in Go - forestjc
https://github.com/mimoo/GoKangarooTwelve
======
tptacek
K12 is _not_ a "faster SHA-3". K12 is K12, M14 is M14.

K12 is related to SHA-3, but if you're throwing standards out the door (and I
think you should), you can use any of the hash core algorithms, whether or not
they're related to Keccak.

The best all-purpose cryptographic hash is probably truncated SHA-2 512, and
the hipster modern hash is probably still Blake2.

~~~
snakeanus
> The best all-purpose cryptographic hash is probably truncated SHA-2 512

Could you elaborate on that please? Why would you consider it as the "best
all-purpose cryptographic hash"?

~~~
wongarsu
On 64-bit processors SHA-512 truncated to 256 bits is faster than SHA-256, and
has the advantage of being safe against length-extension attacks (which are a
major gotcha of full-length SHA-512 and the rest of the SHA-2 family).

Out of SHA3, K12, SHA2 and Blake, SHA-512 is one of the fastest (some variants
of Blake2 are faster), and it's the one with the longest track record, while
the Blake-family and SHA3 are fairly new. It's also widely supported nearly
everywhere. All that makes truncated SHA-512 a sane default.

~~~
snakeanus
> Out of SHA3, K12, SHA2 and Blake, SHA-512 is one of the fastest (some
> variants of Blake2 are faster)

[https://twitter.com/KeccakTeam/status/834789451708628995](https://twitter.com/KeccakTeam/status/834789451708628995)

> It's also widely supported nearly everywhere

This is true for the non-truncated variants, but I am not aware of any
protocol that uses the truncated ones.

I will agree with the "longest track record" point, however; that is an
important one.

~~~
wolf550e
If you are free to choose a hash function, you are free to truncate SHA-512.
Truncated SHA-512 is secure, there is no need to wait to see it used in a
standardized protocol to increase confidence in its security.

Creating a truncate(SHA-512, 256) out of regular SHA-512 is trivial, so it is
true that it is widely supported nearly everywhere.

~~~
snakeanus
This is wrong: SHA-512/256 is not the same as computing SHA-512 and taking the
first 256 bits of the output. The two use different IVs, so even their
truncated outputs differ.

~~~
tptacek
This only matters if there's an existing protocol that uses truncated SHA-2
you need to interoperate with (and your premise is there is not).

There is no functional advantage to having a different IV for SHA-512/256.

~~~
baby
Yup, this looks like it's a cross-domain protection.

~~~
tptacek
Sure, but if you care about domain separation, you can't rely on the IV
anyways.
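A common alternative is to make the separation explicit in the hash input rather than relying on IVs; `domainHash` below is a hypothetical helper sketching that idea:

```go
package main

import (
	"crypto/sha512"
	"fmt"
)

// domainHash is a hypothetical helper: it separates different uses of
// the same hash by prefixing a length-delimited domain tag, instead of
// relying on per-variant IVs.
func domainHash(domain string, msg []byte) [sha512.Size256]byte {
	h := sha512.New512_256()
	h.Write([]byte{byte(len(domain))}) // length prefix keeps tags unambiguous
	h.Write([]byte(domain))
	h.Write(msg)
	var out [sha512.Size256]byte
	copy(out[:], h.Sum(nil))
	return out
}

func main() {
	a := domainHash("app/signature", []byte("payload"))
	b := domainHash("app/commitment", []byte("payload"))
	fmt.Println(a == b) // false: same input, different domains
}
```

The length prefix matters: without it, ("ab", "c"+msg) and ("abc", msg) would hash identically.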

------
dom0
(from linked resources)

> Keccak won the SHA-3 competition, and became the FIPS 202 standard on August
> 5, 2015. All the other contenders lost.

I don't think many professionals would agree with this view of cryptographic
competitions.

> Being too late to take part in the fight to become SHA-3, it just stood on
> its own awkwardly.

Again with the wording... SHA-3 is not an epic battle for blood and honour.
Acting as if a NIST competition is a public fight is rather disconcerting.

> It is only in 2013 that SHA-2 joined Intel's set of instructions, along with
> SHA-1...

Technically true, but desktop processors supporting the SHA extensions were
only released now, four years later, and so far none of them are from Intel.

------
tankfeeder
KangarooTwelve on PicoLisp:
[https://bitbucket.org/mihailp/tankfeeder/src/5a91a025d78eacf...](https://bitbucket.org/mihailp/tankfeeder/src/5a91a025d78eacf68655b6dbec5c2dd346624146/crypto/kangaroo12.l?at=default&fileviewer=file-view-default)

------
mtgx
Isn't this SHA-3 less safe, too? Why not just go with BLAKE2?

~~~
snakeanus
BLAKE2 is less safe than BLAKE as well.

> Isn't this SHA-3 less safe, too?

Yes, both K12 and M14 are less safe than SHA-3. Keccak-f[1600] uses 12 + 2ℓ
rounds, where ℓ is defined by 25 * 2^ℓ = r + c = 1600 (so ℓ = 6, giving SHA-3
its 24 rounds); K12 uses just 12 rounds (14 for M14). K12 also halves the
sponge capacity c: c = 512 in SHA3-256 versus c = 256 in K12. It provides the
same number of bits of security against collision attacks on both classical
and quantum systems, but because of the reduced capacity the preimage security
drops to 128 bits on classical systems (256 in SHA3-256) and ~85 bits on
quantum systems (128 in SHA3-256).

~~~
tptacek
The word "safe" isn't doing us any favors here. What we're really talking
about is the "security margin" of the different hashes. Every hash we're
talking about on this thread has a margin far beyond any plausible or
foreseeable attacks. None are unsafe.

~~~
dsacco
This is a serious issue, and I think it's a problem of cryptography education.

On the one hand, we (rightly, in my opinion) teach people who haven't studied
cryptography that they shouldn't try to implement it on their own. This is
good, because cryptography is mostly applied math (hard) and careful software
implementation (also hard), and mixing these two hard things without shooting
yourself in the foot is _very_ hard.

On the other hand, developers who follow this advice don't have a clear way to
find the answer to questions like, "Which of these great algorithms do I
want?" It's easy to find sound recommendations for which algorithms are well-
studied and essentially safe; it's less easy to find guides that compare and
contrast algorithms along nuanced axes like computational cost,
interoperability, security margin or adaptability.

This leads developers to come to their own conclusions based on imperfect
understanding of the metrics they can quantify:

"I need to encrypt something - Serpent has a much higher security margin than
AES (Rijndael), I should probably go with that."

"I need to hash something - Keccak won the SHA-3 contest, so I should use that
instead of BLAKE."

"I need to authenticate something - if I choose two strong algorithms it will
be better than one, so I'll use AES-CTR and HMAC-SHA, instead of an AEAD."

"More bits means higher security, so I'll use AES256 and SHA-512."

I'm noticing this more and more in online discussion, where people seem to be
talking past each other about things like the "safety" of algorithms without
discriminating between other metrics like speed. In my opinion, it would be
better for people to consider "safety" as a mostly binary property if they
haven't done enough due diligence to know why one algorithm has a higher or
lower security margin than another one, because the other metrics are so much
more important.

There probably needs to be better cryptography education as well, something
that is a bit higher than "do this, don't do this" and which provides some
elaboration about the _whys_ inherent to different security margins.

~~~
zokier
> It's easy to find sound recommendations for which algorithms are well-
> studied and essentially safe; it's less easy to find guides that compare and
> contrast algorithms along nuanced axes like computational cost,
> interoperability, security margin or adaptability.

I might be hugely mistaken, but isn't the current situation that almost all of
the "essentially safe" hash functions are good enough for almost all uses? I
understand that the amount of choice can lead to analysis paralysis, but
sensible solution generally is just pick one (possibly the one that has a
reasonable implementation most readily available) and move on. Sure, the
almost randomly picked algorithm might not be ideal on all possible metrics, but
perfect is the enemy of good.

~~~
dsacco
Yes, this is precisely my point. "Safe" is a binary property in cryptography.
But developers don't have much to go on, so even if they follow best practices
they often interpret differentiating metrics as variances in "safety", when in
actuality those things are only relevant for computational cost,
implementation difficulty, adaptability and interoperability.

In general if you provide people with a list of possibilities and say, "Yeah,
any of these is fine", they're not going to literally choose one at random.
They're going to search a bit and read about differences between the
algorithms, and unless they spend a lot of time reading about the respective
properties of each algorithm they might eventually say something like, "Argon2
is safer than bcrypt."

"Safer" is a misnomer. There are meaningful reasons to choose one algorithm
over another, but these generally boil down to performance impact,
implementation time and how easy it is to shoot yourself in the foot with
implementation.

This matters because people seek understanding, and if you don't provide a
satisfying amount they might draw erroneous conclusions that have a
meaningful _non-security_ impact. For example, you most likely don't need to
be using AES256. People like to, because 256 > 128. But seriously - you
probably don't need to. The impulse to just use the bigger number leads to
decision making based on signaling more than real security, and has a
meaningful reduction in things like usability and performance.

To rephrase my point here - "safe" is safe enough, and there should probably
be better education on the peripheral implementation metrics for cryptographic
algorithms, because you're far more likely to run up your AWS bill or
inadvertently screw up your security by e.g. choosing a complex cryptographic
algorithm with a huge security margin versus a simple one with a "good enough"
security margin. If we're not able to provide more education about what to
_actually_ look for when discriminating between algorithms, then we need to
find a way to decouple certain metrics from the perception of "safety", like
key size.

For a real world example of this, look at the advertising with things like
"bank-grade" or "military-grade" security which are heavily associated with
phrases like "128-bit" or "256-bit." Sometimes a developer will see that
AES128 is used and think to themselves, "I'll use 256 and be even safer!", all
the while the public is slowly taught to associate safety with ever larger bit
sizes. Then when a legitimately superior ECC algorithm comes out that uses 224
bit keys, people cling to RSA because they can use a 2048 bit key size and
2048 > 224.

~~~
zokier
When I said good enough, I meant good enough on all aspects, including non-
safety related. I mean it is somewhat difficult to imagine that someones
project would fail because they picked the "wrong" hash function or encryption
algorithm from the "essentially safe" ones.

------
plygrnd
A mention of the always-on culture of these times seemed appropriate. Easy
access to information doesn't seem beneficial if said information is geared to
attract clicks or foment fear. Or both.

