Rosenblatt's perceptron has little to do with neural nets. Geoffrey Hinton has said he regrets the name "multi-layer perceptron" precisely because the two are really unrelated.
Hm, yeah, I suppose you could argue it's not really a neural network unless hidden layers are involved. And according to Wikipedia, backpropagation wasn't a thing until 1975, so you might have a point.
"It's not really a neural network" is an understatement. Perceptrons are about as related to neural nets as linear regression is (note that you can "train" linear regression with stochastic gradient descent).
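To make the linear-regression comparison concrete, here's a minimal sketch (toy data and function names are my own, not from any particular library) showing that the classic perceptron update and SGD on linear regression have the same algebraic form, `w += lr * (y - pred) * x`; the only difference is whether the prediction passes through a step function or stays linear:

```python
def dot(w, x):
    return sum(wi * xi for wi, xi in zip(w, x))

def train(data, predict, lr=0.1, epochs=50):
    # data: list of (features-with-bias, target) pairs
    w = [0.0] * len(data[0][0])
    for _ in range(epochs):
        for x, y in data:
            err = y - predict(dot(w, x))
            # identical update form for both models
            w = [wi + lr * err * xi for wi, xi in zip(w, x)]
    return w

# Toy dataset: logical AND, with a constant 1.0 bias input.
data = [([1.0, 0, 0], 0), ([1.0, 0, 1], 0),
        ([1.0, 1, 0], 0), ([1.0, 1, 1], 1)]

step = lambda a: 1 if a >= 0 else 0   # perceptron activation
identity = lambda a: a                # plain linear regression

w_perc = train(data, step)
w_lin = train(data, identity)

# AND is linearly separable, so the perceptron classifies it perfectly.
print([step(dot(w_perc, x)) for x, _ in data])  # [0, 0, 0, 1]
```

Swap `step` for `identity` and you've gone from a 1958-style perceptron to linear regression trained by SGD, which is the point: neither involves hidden layers or backprop.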