Yep. falcor84: you’re thinking of the so-called ‘multilayer perceptron’, which is basically an archaic name for a (densely connected?) neural network. I was referring to traditional perceptrons, i.e. single linear-threshold units.
While ReLU itself is relatively new, AI researchers have understood the need for nonlinear activation functions, and have been building multilayer perceptrons with them, since the late 1960s, so I had assumed that’s what you meant.
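To make the distinction concrete, here’s a minimal numpy sketch (with hand-picked weights rather than learned ones, purely for illustration): a traditional perceptron is one linear unit behind a hard threshold, so its decision boundary is a hyperplane and it can never compute XOR, while an MLP with any nonlinear hidden activation (ReLU here just for brevity) can.

```python
import numpy as np

def perceptron(x, w, b):
    # Classic Rosenblatt perceptron: a single linear-threshold unit.
    # Linear decision boundary only, so XOR is out of reach.
    return np.where(x @ w + b > 0, 1, 0)

def mlp_xor(x):
    # Tiny MLP: one hidden layer with a nonlinear activation.
    # These weights are hand-picked to compute XOR exactly.
    W1 = np.array([[1.0, 1.0], [1.0, 1.0]])
    b1 = np.array([0.0, -1.0])
    h = np.maximum(x @ W1 + b1, 0.0)   # ReLU; the nonlinearity is the essential part
    W2 = np.array([1.0, -2.0])
    return np.where(h @ W2 > 0.5, 1, 0)

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
print(mlp_xor(X))  # -> [0 1 1 0], which no single perceptron can produce
```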