You can't just "toss a NN at stock market data" and expect good results. There is too little historical data, and you can't easily "generate" more for the network to learn from, which makes it very easy to overfit. In other fields (e.g. computer vision) a lot of research has focused on inventing techniques that prevent overfitting and thus enable "learning" (i.e. generalization of patterns), such as dropout, convolutional neural networks, and augmenting the training set by flipping/rotating images. Very few of these techniques can be applied generally.
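A minimal sketch of the image-augmentation idea mentioned above, using torchvision (the specific transforms and parameters here are illustrative choices, not anything from the comment itself):

```python
from torchvision import transforms

# Randomized transforms applied on the fly: each epoch sees a slightly
# different version of every training image, effectively enlarging the
# dataset. There is no obvious analogue for a fixed financial time series.
augment = transforms.Compose([
    transforms.RandomHorizontalFlip(p=0.5),  # flip half the images
    transforms.RandomRotation(degrees=15),   # rotate within +/-15 degrees
    transforms.ToTensor(),
])
```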
Dropout can certainly be applied generally - it's useful as a regularization technique (especially in wide and deep networks) for combating overfitting in fields other than computer vision.
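A minimal sketch of dropout as a general-purpose regularizer, here in PyTorch on plain tabular inputs (the layer sizes and the 0.5 rate are illustrative assumptions, not from the comment above):

```python
import torch.nn as nn

model = nn.Sequential(
    nn.Linear(64, 256),
    nn.ReLU(),
    nn.Dropout(p=0.5),  # randomly zero half the activations during training
    nn.Linear(256, 256),
    nn.ReLU(),
    nn.Dropout(p=0.5),
    nn.Linear(256, 1),
)

model.train()  # dropout active: activations are zeroed (and rescaled) each pass
model.eval()   # dropout is a no-op at inference time
```

Because each training pass samples a different random sub-network, the model can't rely on any single co-adapted set of units, which is why the technique works regardless of whether the inputs are images.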