
Ask HN: S3 but with append? - dhbradshaw
Ideally, append-only.

Basically, what do you do if you want to pool a bunch of streams into a
(potentially very large) log file?
======
abd12
You can use Kinesis Firehose to stream data to S3. It'll buffer data for a
while -- you set thresholds based on size of data or time, whichever is hit
first -- then it will save the data to S3.

It won't be a single large file, but they'll all have the same prefix based on
date. Most data processing tools will let you suck up an entire prefix and
treat it like a single file.
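The size-or-time flush behavior described above can be modeled as a toy buffer. This is a sketch, not Firehose itself: the class name and API are made up, and the defaults only mirror Firehose's usual S3 buffering hints (roughly 5 MB or 300 seconds, whichever is hit first).

```python
import time

class FlushBuffer:
    """Toy model of Firehose-style buffering: collect records and flush
    when either the size threshold or the time threshold is reached,
    whichever comes first. (Illustrative only; not the Firehose API.)"""

    def __init__(self, max_bytes=5 * 1024 * 1024, max_seconds=300,
                 clock=time.monotonic):
        self.max_bytes = max_bytes
        self.max_seconds = max_seconds
        self.clock = clock  # injectable for testing
        self.records = []
        self.size = 0
        self.started = None  # time the current batch began

    def add(self, record: bytes):
        """Add a record; return the flushed batch if a threshold was hit,
        else None."""
        if self.started is None:
            self.started = self.clock()
        self.records.append(record)
        self.size += len(record)
        if (self.size >= self.max_bytes
                or self.clock() - self.started >= self.max_seconds):
            return self.flush()
        return None

    def flush(self):
        """Hand back the batch (in real life: write one S3 object) and reset."""
        batch = self.records
        self.records, self.size, self.started = [], 0, None
        return batch
```

Each flushed batch would become one S3 object under a shared prefix, which is why the result is many files rather than one.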

------
idunno246
If you know the size ahead of time, you can use multipart uploads. Otherwise
you would have to buffer to disk. You could also consider Kinesis Firehose,
which has dumping to S3 built in.

The Google Cloud Storage API has a mode where you can stream bytes, and the
object doesn't become visible until you close it (and then, like S3, it can't
be modified). And unlike S3, it requires a delete permission to overwrite an
object, though with S3 you can turn on versioning and not grant
DeleteObjectVersion.

------
nikonyrh
You could always partition them into chunks and upload them to S3 as
individual objects.

