
Design of CMOS-memristor Circuits for LSTM architecture - godelmachine
https://arxiv.org/abs/1806.02366
======
tw1010
This is cool stuff. But you guys, the perception that there is no low-hanging
fruit anymore is false, and the reason it persists is that people oddly
obsess over frontier techniques and their combinations instead of innovating
at first principles.

Just trying to be the voice of psychological health in this community.

~~~
bcheung
AI is largely a research-dominated field, and as such it emphasizes novelty.
Once the average developer starts using it, I think things will change towards
more practical concerns. We're starting to see this already with existing
libraries being ported to non-Python environments.

------
bcheung
I'm not that familiar with CMOS-memristors. The paper is a bit light on
background.

Is this just for inference or training time as well?

Is it digital, or do the inputs carry analog voltage levels and only then
trigger like a neuron?

Can the network topology and weights be modified or is this fixed like an
ASIC?

------
londons_explore
Interesting concept, but note this can't be used for _training_ an LSTM
network.

It's training that involves far more computation and memory: rather than
storing only the latest state of the LSTM cells, one needs to store all past
states and activations of the cells.

For forward inference, this logic looks good, although I'm unclear on what
use cases it would apply to.
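The asymmetry described above can be sketched in a few lines of NumPy: inference only carries the latest (h, c) pair forward, while backprop-through-time must retain every timestep's activations. The gate formulation and names below are a generic textbook LSTM used for illustration, not the circuit from the paper.

```python
# Toy sketch contrasting the memory footprint of LSTM inference vs.
# training. Shapes and names are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
H, X, T = 4, 3, 10  # hidden size, input size, timesteps

# One weight matrix per gate, acting on the concatenation [h; x].
W = {g: rng.standard_normal((H, H + X)) * 0.1 for g in "ifog"}

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_forward(xs, keep_cache):
    h = np.zeros(H)
    c = np.zeros(H)
    cache = []  # what backprop-through-time would need to retain
    for x in xs:
        z = np.concatenate([h, x])
        i = sigmoid(W["i"] @ z)   # input gate
        f = sigmoid(W["f"] @ z)   # forget gate
        o = sigmoid(W["o"] @ z)   # output gate
        g = np.tanh(W["g"] @ z)   # candidate cell state
        c = f * c + i * g
        h = o * np.tanh(c)
        if keep_cache:            # training must store every step
            cache.append((z, i, f, o, g, c.copy()))
    return h, cache

xs = rng.standard_normal((T, X))

# Inference: only the final (h, c) survives -> O(1) state in T.
h_inf, cache_inf = lstm_forward(xs, keep_cache=False)

# Training: all T steps' activations are kept -> O(T) state in T.
h_tr, cache_tr = lstm_forward(xs, keep_cache=True)

print(len(cache_inf), len(cache_tr))  # 0 10
```

The same forward pass is run both ways; only the retained state differs, which is why an inference-only circuit can get away with holding just the current cell state.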

~~~
amirhirsch
It makes much more sense to design ASICs for inference than for training. Any
useful network will be run for inference many millions of times more often
than it is trained, and so inference will require much more computation in
aggregate. Inference may also need to run in embedded environments, on
battery power with no network connectivity, while training can usually run in
the cloud.

------
senatorobama
Are there any Master's degrees in AI + chip design?

~~~
soravux
At my university, both machine (and deep) learning and FPGA/chip design are
specialties of the electrical engineering department. Earlier this year we
launched an industry-oriented AI master's degree. Taking your elective
classes in chip design would give you a solid foundation for this.

~~~
godelmachine
Would you kindly provide a link to that program at your university?
Thanks :)

~~~
soravux
The master's program in AI I was referring to is only available in French
(Laval University, Québec City).

The chip design expertise is provided by most electrical engineering
departments, with courses usually named VLSI design, FPGA/ASIC development or
microelectronics.

If you apply for a master's degree (in AI, for example), you can often mix-
and-match specialty classes and ask for those chip design courses to be added
to your curriculum.

If you are a hands-on person curious about the matter, you can buy an FPGA
(~$50 for entry level) and follow a Verilog or VHDL tutorial online. Put
simply, an FPGA is a chip that can be "rewired" at will, which is very useful
for learning or prototyping before building a production chip.

~~~
godelmachine
Thanks for the info :)

Hope it will be available in English soon.

------
monk_e_boy
Any interesting algorithm eventually finds its way down into silicon.

