Cool! Handling each event record as structured data (JSON) is very interesting, because it connects seamlessly with MongoDB (a document DB) for stream output or aggregation. This slide deck covers using Fluentd together with MongoDB:
http://www.slideshare.net/doryokujin/an-introduction-to-flue...
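To give a rough idea of that path, with the fluent-plugin-mongo output a config along these lines streams matching JSON events straight into a collection (the tag, database and collection names here are just placeholders):

    <source>
      type forward
    </source>

    <match app.**>
      type mongo
      host localhost
      port 27017
      database fluentd
      collection events
    </match>

Because each event is already a structured record, it lands in MongoDB as a document with no extra parsing step.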
Excellent, I am going to try it. But I still think something like Flume has advantages, mainly because of the Hadoop ecosystem. For instance, you can plug the log data into HBase and use Hive to write high-level, abstracted queries and run them on Hadoop. I am only guessing, but it seems like there are plugins on the way for various systems, just not for Hadoop.
Update:
Also, Flume can consume any data stream, for instance the Twitter stream, so it is not limited to log analysis.
We're developing a Hoop plugin (REST API for HDFS), and also a Stargate plugin (REST API for HBase). They require additional setup on the infrastructure side, but they are still very useful and perform well.
You can feed in the Twitter stream by writing an in_twitter plugin. That would be about 30 lines of Ruby code :-)
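A very rough sketch of the shape such a plugin could take (the Twitter streaming client is stubbed out with a dummy record here, so this is only an illustration of an input plugin, not a working in_twitter):

    require 'fluent/plugin/input'

    module Fluent
      module Plugin
        class TwitterInput < Input
          Fluent::Plugin.register_input('twitter', self)
          helpers :thread

          config_param :tag, :string, default: 'twitter.status'

          def start
            super
            thread_create(:in_twitter_runner) do
              loop do
                # A real plugin would read statuses from the streaming API here.
                record = { 'user' => 'example', 'text' => 'hello from the stream' }
                router.emit(@tag, Fluent::EventTime.now, record)
                sleep 1
              end
            end
          end
        end
      end
    end

Everything downstream (MongoDB, files, whatever output you match) then gets each status as a structured record.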
It's more about log analysis than just transferring the files over. Apart from Splunk there aren't many projects that handle this well.
It also seems to have many input plugins beyond standard syslog, which is great. I think we're past the point where plain lines of text work well enough. First you format your data as text, then you have to parse that text to get your original data back... it seems like wasted effort.
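For example, with the Ruby fluent-logger library an application can post a structured event directly, instead of formatting a log line and parsing it back later (the host, port and tag here are just illustrative):

    require 'fluent-logger'

    # Post a structured event straight to a local fluentd; no text
    # formatting on the way out, no regex parsing on the way back in.
    log = Fluent::Logger::FluentLogger.new('myapp', host: 'localhost', port: 24224)
    log.post('access', 'agent' => 'curl', 'path' => '/status', 'code' => 200)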