The pros of Docker so far:
Dependencies: the Dockerfile gives an explicit list of system dependencies for each app. This can be done in other ways (package files, config management), but it wasn't being done before, and this is an easy catch-all that forces it for any type of environment.
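As a sketch of what that explicit dependency list looks like, here's a hypothetical Dockerfile (the base image, package names, and app layout are made-up examples, not our actual setup):

```dockerfile
# Base image pins the OS and runtime version explicitly
FROM python:3.9-slim

# System dependencies are declared right here, visible in the repo
# (libpq-dev and curl are illustrative examples only)
RUN apt-get update && apt-get install -y --no-install-recommends \
        libpq-dev \
        curl \
    && rm -rf /var/lib/apt/lists/*

WORKDIR /app
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

COPY . .
CMD ["python", "app.py"]
```

Anyone cloning the repo can read the full environment spec in one file, which is the "catch-all" effect described above.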
Logical Grouping: the app environment (Dockerfile + docker-compose.yml) lives alongside the codebase in a single git repo
Deployment: Deploy to any box with `git clone myapp && docker-compose up` for testing/dev instances or migrations
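A minimal docker-compose.yml that makes that one-liner deploy work might look like this (service names, image tags, and ports are assumptions for illustration):

```yaml
version: "2"
services:
  web:
    build: .                 # uses the Dockerfile in the repo root
    ports:
      - "8000:8000"          # hypothetical app port
    depends_on:
      - db
  db:
    image: postgres:9.5      # example pinned version
    environment:
      POSTGRES_PASSWORD: changeme   # placeholder only
```

With this checked in, `git clone myapp && docker-compose up` brings up the whole stack on any box with Docker installed.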
Development: We mount the codebase from a host directory into each container, with git hooks to update the codebase, which works well for us (we have no CI)
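The host-directory mount can be sketched as a compose override (the paths here are hypothetical):

```yaml
services:
  web:
    build: .
    volumes:
      # Bind-mount the checked-out codebase over the image's copy,
      # so a git pull on the host (triggered by a git hook) is
      # immediately visible inside the running container
      - ./src:/app
```

The trade-off is that the container no longer runs exactly the code baked into the image, which is fine for dev boxes but something to avoid in production.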
Plus it's fun!
The cons so far:
Operational Complexity: dev/ops teams probably won't want to learn a new tool. I set up a Rancher instance to provide a GUI, which makes things a bit easier to swallow; it has a drop-in shell, log viewer, performance metrics, etc.
Network complexity: we never needed reverse proxies before; now we do.
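For the reverse-proxy piece, a minimal nginx server block routing a hostname to a container's published port might look like this (hostname and port are assumptions):

```nginx
server {
    listen 80;
    server_name myapp.example.com;

    location / {
        # Forward to the port the container publishes on this host
        proxy_pass http://127.0.0.1:8000;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
    }
}
```

One of these per app is the extra moving part that didn't exist when each app simply listened on its own port.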
Clustering/Orchestration: we don't cluster our containers, but the more we add, the more I think we might want to. That would add a whole new layer of complexity to the mix, and it seems unnecessary for such a small shop.
Security?: lots of unknowns, lack of persistence can be bad for forensics, etc.
Newness: Documentation isn't great, versions change fast, online resources may be outdated.
Like you, I'm sometimes unsure if this is the right choice. Maybe a monolithic server or traditional VMs + Puppet would be easier, simpler, better? In the end, I think Docker just fit with the way I conceptualized my problem so I went for it. You may never get that "definitely good enough" feeling, but if it fits your workflow and keeps your pipeline organized and manageable, then I say go for it.