
https://github.com/Logflare/logflare/tree/staging

> Simply provide your BigQuery credentials and we stream logs into your BigQuery table while automatically managing the schema

I didn't know BigQuery was capable of accepting streaming log data - in my mental model of the world it was the kind of database that you update using the occasional batch job, not from a streaming source of data.

Looks like that's the tabledata.insertAll method, which has been around for quite a few years - though it's now called the "legacy streaming API" on https://cloud.google.com/bigquery/docs/streaming-data-into-b... which suggests using the more recent Storage Write API instead: https://cloud.google.com/bigquery/docs/write-api
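
For anyone curious what that looks like in practice, here's a minimal sketch of a streaming insert using the Python client's insert_rows_json, which wraps tabledata.insertAll; the project/dataset/table names and row schema below are made up for illustration:

    # Minimal streaming-insert sketch against the legacy streaming API.
    # Assumes the dataset and table already exist; names are hypothetical.
    from google.cloud import bigquery

    client = bigquery.Client()
    table_id = "my-project.my_dataset.logs"  # hypothetical table

    rows = [
        {"timestamp": "2024-01-01T00:00:00Z", "level": "info", "message": "request served"},
        {"timestamp": "2024-01-01T00:00:01Z", "level": "error", "message": "upstream timeout"},
    ]

    # insert_rows_json calls tabledata.insertAll under the hood and returns
    # a list of per-row errors (empty on success).
    errors = client.insert_rows_json(table_id, rows)
    if errors:
        print("Insert errors:", errors)

The Storage Write API is the newer path the docs point to, but the legacy endpoint above is still the simpler one to try from a script.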




It's pretty awesome & very cost-effective! When I was at Twilio we streamed logstash/fluentd (I forget which) -> Kinesis -> BigQuery and it worked great - certainly better than the days when we were trying to manage ES ourselves
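
In case it helps picture that pipeline, a rough sketch of the Kinesis -> BigQuery hop might look something like this (hypothetical stream/table names, single shard, no checkpointing or retries):

    # Rough consumer sketch: read log records from one Kinesis shard and
    # stream them into BigQuery. A real pipeline would handle multiple
    # shards, retries, and checkpointing.
    import json
    import time

    import boto3
    from google.cloud import bigquery

    kinesis = boto3.client("kinesis")
    bq = bigquery.Client()
    table_id = "my-project.logging.app_logs"  # hypothetical table

    shard_iterator = kinesis.get_shard_iterator(
        StreamName="log-stream",             # hypothetical stream
        ShardId="shardId-000000000000",
        ShardIteratorType="LATEST",
    )["ShardIterator"]

    while True:
        resp = kinesis.get_records(ShardIterator=shard_iterator, Limit=500)
        rows = [json.loads(record["Data"]) for record in resp["Records"]]
        if rows:
            bq.insert_rows_json(table_id, rows)  # streaming insert
        shard_iterator = resp["NextShardIterator"]
        time.sleep(1)  # avoid hammering an empty shard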


Were people using it as an actual ES replacement for logs? fluentd -> GCP log sink -> BQ can get awful if you need to work with it on a daily basis


Curious, how was it awful for you? I really haven't regretted any part of using BigQuery.


Is Kinesis in the middle for more durability/being able to handle spikes? When and how does one make the decision to insert Kinesis into a pipeline like this? (Especially since the data was going from AWS -> GCP anyway, so you don't save anything by staying within AWS.)



