> Simply provide your BigQuery credentials and we stream logs into your BigQuery table while automatically managing the schema
I didn't know BigQuery was capable of accepting streaming log data - in my mental model of the world it was the kind of database that you update using the occasional batch job, not from a streaming source of data.
It's pretty awesome & very cost effective! When I was at Twilio we streamed logstash/fluentd (I forget which) -> Kinesis -> BigQuery and it worked great - certainly better than the days when we were trying to manage Elasticsearch ourselves.
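For anyone curious what the Kinesis -> BigQuery hop can look like, here's a rough sketch in Python. The stream name, table ID, and the assumption that each record is one JSON object matching the table schema are all made up, and a real consumer would use the KCL and fan out across shards rather than hand-rolling shard iteration like this:

```python
import json
import time

import boto3
from google.cloud import bigquery

KINESIS_STREAM = "app-logs"            # hypothetical stream name
BQ_TABLE = "my-project.logs.app_logs"  # hypothetical table ID

kinesis = boto3.client("kinesis")
bq = bigquery.Client()

# Grab the first shard only; a production consumer would track every shard.
shard_id = kinesis.list_shards(StreamName=KINESIS_STREAM)["Shards"][0]["ShardId"]
iterator = kinesis.get_shard_iterator(
    StreamName=KINESIS_STREAM, ShardId=shard_id, ShardIteratorType="LATEST"
)["ShardIterator"]

while True:
    resp = kinesis.get_records(ShardIterator=iterator, Limit=500)
    # Assumes each Kinesis record's Data payload is a JSON log line.
    rows = [json.loads(r["Data"]) for r in resp["Records"]]
    if rows:
        # insert_rows_json streams the batch into BigQuery's streaming buffer.
        errors = bq.insert_rows_json(BQ_TABLE, rows)
        if errors:
            print("insert errors:", errors)
    iterator = resp["NextShardIterator"]
    time.sleep(1)  # stay under per-shard read limits
```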
Is Kinesis in the middle for more durability / being able to handle spikes? When and how does one make the decision to insert Kinesis into a pipeline like that? (Especially since the data was going from AWS -> GCP anyway, so you don't save anything by staying within AWS.)
Looks like that's the tabledata.insertAll method, which has been around for quite a few years - though it's now called the "legacy streaming API" on https://cloud.google.com/bigquery/docs/streaming-data-into-b... which suggests using the more recent Storage Write API instead: https://cloud.google.com/bigquery/docs/write-api
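For reference, the Python client's insert_rows_json wraps that legacy tabledata.insertAll call, so a minimal streaming insert looks roughly like this (the table ID and row contents are made up; the Storage Write API has its own, more involved client in the google-cloud-bigquery-storage package):

```python
from google.cloud import bigquery

client = bigquery.Client()

# insert_rows_json goes through the legacy tabledata.insertAll streaming API;
# rows land in the streaming buffer and become queryable almost immediately.
errors = client.insert_rows_json(
    "my-project.logs.app_logs",  # hypothetical table ID
    [
        {"ts": "2024-01-01T00:00:00Z", "level": "info", "msg": "request handled"},
        {"ts": "2024-01-01T00:00:01Z", "level": "error", "msg": "upstream timeout"},
    ],
)
if errors:
    print("insert errors:", errors)
```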