Within less than a day I had a prototype up and running that handled inbound requests, did some processing, offloaded the rest to background workers, and returned a response. The workers asynchronously stream data to BigQuery for real-time reporting.
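For context, streaming rows into BigQuery boils down to POSTing a small JSON body to the tabledata.insertAll endpoint. A minimal sketch of what a worker might build (the helper name and row fields are hypothetical, not from the comment above):

```python
import uuid

def build_insert_all_payload(rows):
    """Build the JSON body for BigQuery's tabledata.insertAll streaming API.
    Each row gets an insertId so BigQuery can de-duplicate retried inserts.
    (Hypothetical helper -- the actual worker code isn't shown in the thread.)"""
    return {
        "kind": "bigquery#tableDataInsertAllRequest",
        "rows": [{"insertId": str(uuid.uuid4()), "json": row} for row in rows],
    }

# Example: one event row, ready to POST to the insertAll endpoint.
payload = build_insert_all_payload([{"user": "a", "latency_ms": 12}])
```

Because each row carries its own `insertId`, a worker can safely retry a failed batch without creating duplicate rows in the reporting table.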
The only thing I found baffling is that I could not load Google's own cloud SDK from within the GAE environment and had to "vendor" it in, doing a bunch of workarounds/patching to get it to run. That probably consumed 3 of the 4 hours it took to get this running.
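For anyone hitting the same wall: the vendoring itself is tiny once you know the incantation. A minimal sketch, assuming the SDK packages were installed into a local `lib/` directory (e.g. with `pip install -t lib <package>`):

```python
# appengine_config.py -- loaded automatically by the GAE Python runtime.
# Adds the local "lib" directory (where the vendored packages live)
# to sys.path before any request handlers are imported.
from google.appengine.ext import vendor

vendor.add('lib')
```

This file only runs inside the App Engine sandbox; the directory name `lib` is just the conventional choice.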
When I load tested this thing, GAE automatically managed everything and I never saw a single error response throughout the test. It's amazing how little ops work is needed.
AWS Lambda and Kinesis Firehose/S3/Athena could provide part of this, but it's far from turnkey. GAE/GCP are just so much more complete.
This always happens when I try to use one of Google's APIs/SDKs. I dread using them; they feel overengineered most of the time.
The SDK works as expected once it's vendored in as an outside library and the various outbound-request settings are configured and patched.
I think I overstated the complexity. All I needed to do was enable SSL for outbound connections (a one-liner in the yaml file) and patch the requests library to use the URLFetch service (a Python/GAE-specific step). Searching Stack Overflow and Google's documentation is what took up 95% of the 3 hours I spent on this.
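Concretely, those two steps look something like this (a sketch of the commonly documented approach, not my exact files). The yaml one-liner is the `ssl` entry:

```yaml
# app.yaml (excerpt) -- enable the ssl library so outbound HTTPS works
libraries:
- name: ssl
  version: latest
```

And the requests patch uses the AppEngine adapter from requests_toolbelt, which routes requests through the URLFetch service:

```python
# Run once at startup, before any requests calls are made.
import requests_toolbelt.adapters.appengine

requests_toolbelt.adapters.appengine.monkeypatch()
```

Both snippets are only meaningful inside the GAE Python sandbox.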