The best definition I can come up with is hardware that implements a neural network architecture directly, especially McCulloch–Pitts spiking neurons (which have a temporal component). In neuromorphic chips, neurons are an actual component of the hardware: you can ask questions like "how many neurons does this chip have?". Contrast this with neural nets as we use them today, which are actually implemented as a computation graph over tensors. It turns out that one special kind of neural network, the dense feedforward layered network[1], can be abstracted well as a series of tensor ops, but this is not necessarily true of arbitrary neural networks. So neuromorphic chips have the potential to be far more general.
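To make the temporal aspect concrete, here's a minimal sketch of a McCulloch–Pitts-style neuron stepped through discrete time. The weights, threshold, and spike train are made-up illustration values, not taken from any real chip; the point is just that state is evaluated step by step rather than as one big tensor expression:

    import numpy as np

    def mcp_step(inputs, weights, threshold):
        """One time step of a McCulloch-Pitts neuron: fires (1) iff the
        weighted sum of its inputs crosses the threshold."""
        return 1 if np.dot(inputs, weights) >= threshold else 0

    # Three input lines over five time steps (rows = time, cols = inputs).
    spike_train = np.array([
        [1, 0, 1],
        [0, 0, 0],
        [1, 1, 1],
        [0, 1, 0],
        [1, 1, 0],
    ])
    weights = np.array([0.5, 0.5, 0.5])

    outputs = [mcp_step(x_t, weights, threshold=1.0) for x_t in spike_train]
    print(outputs)  # [1, 0, 1, 0, 1]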
[1]: Which are wired something like this: http://neuralnetworksanddeeplearning.com/images/tikz40.png - Notice the dense connections and the layered architecture. For all intents and purposes, this is what neural nets look like today, because of how easy it is to treat a NN with this specific wiring as a chain of tensor computations and thus execute it on more conventional hardware.
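As a rough sketch of what "a chain of tensor computations" means here (the layer sizes and the ReLU nonlinearity are arbitrary choices for illustration):

    import numpy as np

    rng = np.random.default_rng(0)
    W1 = rng.normal(size=(4, 6))   # layer 1: 4 inputs -> 6 hidden units
    W2 = rng.normal(size=(6, 3))   # layer 2: 6 hidden -> 3 outputs

    def forward(x):
        # Each dense layer collapses into one matrix multiply plus a
        # pointwise nonlinearity -- exactly the ops GPUs are built for.
        h = np.maximum(0, x @ W1)
        return np.maximum(0, h @ W2)

    print(forward(rng.normal(size=(1, 4))).shape)  # (1, 3)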