
Show HN: Larq – Binarized Neural Network Inference with MLIR and TFLite - khelwegen
https://github.com/larq/compute-engine/
======
khelwegen
Author here: I work at Plumerai, a startup that’s building chips for efficient
inference of binarized neural networks (BNNs). We previously open-sourced Larq
[1], the library we use to build and train BNNs, and Larq Zoo [2], our
repository of pre-trained BNNs from the literature. Now we’re opening up the
deployment side as well with Larq Compute Engine (LCE).

You can grab a BNN from Larq Zoo (or build and train your own using Larq), use
the MLIR converter in LCE to convert it into a TensorFlow Lite-compatible
Flatbuffer file, and then use the LCE runtime to run inference on mobile and
edge devices like Android phones [3] and the Raspberry Pi [4] (support for
Armv8-M microcontrollers coming soon).
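For intuition on why binarized inference is efficient: with weights and activations constrained to {-1, +1}, a dot product reduces to an XNOR followed by a popcount instead of multiply-accumulates. A minimal pure-Python sketch of that idea (illustrative only, not LCE's actual optimized kernels):

```python
def binarize(values):
    """Sign-binarize a real vector and pack it into an int bitmask.

    Bit i is 1 if values[i] >= 0 (represents +1), else 0 (represents -1).
    """
    bits = 0
    for i, v in enumerate(values):
        if v >= 0:
            bits |= 1 << i
    return bits

def binary_dot(a_bits, b_bits, n):
    """Dot product of two packed {-1, +1} vectors of length n.

    Positions where the bits agree contribute +1 and disagreements -1,
    so the result is 2 * popcount(XNOR(a, b)) - n.
    """
    mask = (1 << n) - 1  # keep only the n valid bits
    matches = bin(~(a_bits ^ b_bits) & mask).count("1")
    return 2 * matches - n

# Example: sign([0.5, -1.2, 3.0]) . sign([1.0, 2.0, -0.5])
#        = (+1)(+1) + (-1)(+1) + (+1)(-1) = -1
print(binary_dot(binarize([0.5, -1.2, 3.0]), binarize([1.0, 2.0, -0.5]), 3))
```

In a real implementation the same trick runs over full 32- or 64-bit packed words with hardware popcount and SIMD instructions, which is where the speedup over float or 8-bit kernels comes from.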

Internally at Plumerai, LCE has enabled our ML researchers to benchmark
different BNN architectures and downstream application implementations on real
hardware, so we’re excited to see what other developers and researchers will
use it for!

Happy to answer any questions :)

[1] docs: [https://docs.larq.dev](https://docs.larq.dev), github:
[https://github.com/larq/larq](https://github.com/larq/larq)

[2] docs: [https://docs.larq.dev/zoo](https://docs.larq.dev/zoo), github:
[https://github.com/larq/zoo](https://github.com/larq/zoo)

[3] docs: [https://github.com/larq/compute-engine/blob/master/docs/quickstart_android.md](https://github.com/larq/compute-engine/blob/master/docs/quickstart_android.md)

[4] docs: [https://github.com/larq/compute-engine/blob/master/docs/build_arm.md](https://github.com/larq/compute-engine/blob/master/docs/build_arm.md)

