Well, AWS does have some very useful value-add services: Elastic MapReduce, S3, and DynamoDB being my favorites.
Way back when, Amazon gave me about $1200 of free-use credits over a two-year period, and I experimented with the platform and used it for most customer projects. I also used AWS for almost all of my own projects.
In the last year or two, however, I have gone back to renting large VPSes by the month (I use RimuHosting, but there are a lot of good providers) because you get so much more capacity for the same amount of money.
A little off topic, but another way I have found to save money is to wean myself off of Heroku by taking the little bit of time to set up a git commit/push hook that automatically deploys my web apps. Before that I was using a manual deployment scheme that took me a minute per deployment; not so good.
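For anyone curious, the usual way to do this is a `post-receive` hook in a bare repo on the server, which checks out the pushed branch into the web directory. Here is a minimal sketch; the paths, branch name, and restart command are all assumptions you would adapt to your own setup:

```shell
#!/bin/sh
# post-receive hook: place in /srv/git/myapp.git/hooks/post-receive
# and make it executable (chmod +x). Paths below are hypothetical.
TARGET=/var/www/myapp          # where the app is served from (assumed)
REPO=/srv/git/myapp.git        # the bare repo receiving pushes (assumed)
BRANCH=main

# git feeds one "oldrev newrev refname" line per updated ref on stdin
while read oldrev newrev ref; do
    if [ "$ref" = "refs/heads/$BRANCH" ]; then
        # check the pushed branch out into the serving directory
        git --work-tree="$TARGET" --git-dir="$REPO" checkout -f "$BRANCH"
        # restart your app server here if needed, e.g.:
        # systemctl restart myapp
    fi
done
```

Then `git push server main` from your laptop deploys in one step, much like `git push heroku main`.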
All that said, AWS is really awesome for some jobs, like periodically crunching data with MapReduce. I bought a very useful little book, "Programming Amazon EC2," a few years ago, and I recommend it as a good reference for using the AWS APIs.
Using a VCS as your deployment strategy is The Wrong Way to do it.
I won't go into it in this comment because it's been beaten to death, and you can find information about deployments everywhere; even the commenter on your blog post makes a better suggestion than your VCS deployment strategy.
Continuous integration, source/binary distributions, automated provisioning, sandboxing, versioning, and so on; it's all out there.
For a lot of cases, you are correct that continuous testing, integration, and deployment are the way to go, especially when deployed systems have many moving parts. However, for small projects, mimicking the Heroku workflow of testing locally, committing changes, and letting a git hook handle deployment seems like a good solution.