
Machine Learning on Encrypted Data Without Decrypting It - ViralBShah
https://juliacomputing.com/blog/2019/11/22/encrypted-machine-learning.html
======
ChrisRackauckas
This blog post reminds me of the "Machine Learning Systems are Stuck in a Rut"
paper [1], where they mentioned:

> It is hard to experiment with front end features like named dimensions,
> because it is painful to match them to back ends that expect calls to
> monolithic kernels with fixed layout. On the other hand, there is little
> incentive to build high quality back ends that support other features,
> because all the front ends currently work in terms of monolithic operators.
> An end-to-end tool chain for machine learning requires solutions from many
> specialist disciplines.

This is a fantastic example of getting around this problem using Flux.jl's
interaction with the full Julia language. Here the author does some
exceedingly cool stuff (doing machine learning on encrypted data!), and in
order to get there he needed to write what many would think of as lower-level
kernels that should be "provided by the library" (encrypted matrix
multiplication, encrypted convolutions). To make the interface useful, he
needed to use a mature interface that people are already using and make it
something that can automatically switch over to encrypted implementations.
And, because fully homomorphic encryption (FHE) is so computationally
expensive by nature, the implementation has to be fast, otherwise it's
dead in the water!
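To make the dispatch idea concrete, here is a minimal sketch (the type and the `fhe_*` kernel names are hypothetical, not the actual API from the blog post): you wrap ciphertexts in a new type and overload the standard operations, so generic model code like a Flux `Dense` layer, which computes `σ.(W*x .+ b)`, transparently routes through the encrypted kernels.

```julia
# Hypothetical sketch: names like `EncryptedMatrix`, `fhe_matmul`, and
# `fhe_add` are illustrative stand-ins for an FHE library's real API.
struct EncryptedMatrix
    ciphertext  # opaque handle to the FHE ciphertext
end

import Base: *, +

# Generic code calls `*` as usual; dispatch on the argument type
# routes the call to the encrypted matrix-multiply kernel.
function *(W::AbstractMatrix, x::EncryptedMatrix)
    EncryptedMatrix(fhe_matmul(W, x.ciphertext))
end

# Likewise for the bias addition inside a Dense layer.
function +(x::EncryptedMatrix, b::AbstractVector)
    EncryptedMatrix(fhe_add(x.ciphertext, b))
end
```

With these dispatches in place, an unmodified Flux model applied to an `EncryptedMatrix` never sees plaintext; the compiler specializes the whole call chain on the new type via type inference + JIT.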

To me, this example showcases one way Flux.jl is helping machine
learning get out of that "rut". The author adds dispatches to standard kernels
that accept his encrypted data types, which lets standard Julia
Flux.jl machine learning models act on encrypted data, and type
inference + JIT compilation make it fast enough to work. Not only that, but
it's also not tied to some "sub-language" defined by a machine learning
framework. That means the FHE framework not only works nicely with machine
learning, it can be used by any other package in the Julia language
(differential equation solvers, nonlinear optimization, macroeconomics
models?). This allows composability of tools and of community: all tools from
all fields can now use this same FHE implementation, so authors can
collaborate and mature it into a very good one. These knock-on effects give
people doing research in Julia a lot of competitive advantages over other
researchers, and it will be interesting to see how this affects not just the
ML research community, but everyone else as well!

[1]: [https://dl.acm.org/citation.cfm?id=3321441](https://dl.acm.org/citation.cfm?id=3321441)

