MNIST and other small, easy-to-train datasets are widely available. Thanks to a few decades of Moore's law, you can try out anything you like even on a cheap laptop these days.
It is definitely NOT out of your reach to try any ideas you have. Kaggle and other sites exist to make it easy.
My pet project has been trying to use Elixir with NEAT or HyperNEAT to evolve a spiking network, then, once that's working decently, drop in some glial interactions I saw in a paper. It would probably be bad at purely functional tasks, but it seems fun. The biggest problems are time and having to implement a lot of both the evolutionary machinery and the network itself. But yeah, the ubiquity of free datasets does make the training side easy.
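To give a flavor of the "spiking" half of that idea (the NEAT/HyperNEAT evolution and the Elixir port are the hard parts and are left out entirely), here's a minimal leaky integrate-and-fire neuron sketch in Python. All the names and constants here are my own illustration, not from any particular paper or library:

```python
# Minimal leaky integrate-and-fire (LIF) neuron sketch.
# A hypothetical, simplified model: membrane potential leaks toward rest,
# integrates a constant input current, and emits a spike (then resets)
# whenever it crosses a fixed threshold.

def simulate_lif(current, steps=200, dt=1.0, tau=20.0,
                 v_rest=0.0, v_thresh=1.0, v_reset=0.0):
    """Integrate a constant input current; return the list of spike times."""
    v = v_rest
    spikes = []
    for t in range(steps):
        # Euler step of dv/dt = (-(v - v_rest) + current) / tau
        v += dt * (-(v - v_rest) + current) / tau
        if v >= v_thresh:
            spikes.append(t)   # record the spike time
            v = v_reset        # fire and reset
    return spikes

if __name__ == "__main__":
    # A strong enough current drives the steady state above threshold,
    # so the neuron fires periodically; a weak one never spikes.
    print(simulate_lif(1.5))
    print(simulate_lif(0.5))
```

An evolved network would then be a bunch of these wired together, with NEAT mutating the topology and weights between them.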
Good luck! 8)