I have terabytes of new data arriving every day that need to be verified, ingested, and translated. There is simply no way to do this via online or stream processing. If you're ingesting clickstream or Twitter data then sure, streaming will work. But more often than not you need to work with complete sets of data, and for that batch processing is the only option.
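To make that concrete, here's a toy sketch of what one verify -> translate stage of that kind of batch job might look like. Everything in it is illustrative: the paths, the newline-delimited JSON format, and the field names are assumptions, not our actual pipeline (which runs on a distributed framework, not a single local script). The point is that the unit of work is a whole file, not an individual event:

```python
import json
from pathlib import Path

# Hypothetical landing and output directories; in practice these would be
# HDFS/S3 locations, not local paths.
LANDING_DIR = Path("/data/landing")
OUTPUT_DIR = Path("/data/verified")

def verify(record: dict) -> bool:
    """Reject records missing the fields downstream jobs rely on (illustrative check)."""
    return "customer_id" in record and "timestamp" in record

def translate(record: dict) -> dict:
    """Normalise into an internal schema (field names are made up)."""
    return {
        "id": record["customer_id"],
        "ts": record["timestamp"],
        "payload": record.get("payload", {}),
    }

def process_batch(src: Path, dst: Path) -> None:
    """One verify -> translate pass over an entire file."""
    with src.open() as fin, dst.open("w") as fout:
        for line in fin:
            record = json.loads(line)
            if verify(record):
                fout.write(json.dumps(translate(record)) + "\n")

if __name__ == "__main__":
    OUTPUT_DIR.mkdir(parents=True, exist_ok=True)
    for src in LANDING_DIR.glob("*.jsonl"):
        process_batch(src, OUTPUT_DIR / src.name)
```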
Are you folks doing sensor analysis or something, or eating logfiles, or what?
What we can get from that data is a major competitive advantage. We can offer much cheaper financial products because we model risk individually rather than by cohort. It also lets us maintain a single customer view despite reselling other companies' products.
Building a single customer view with lots of disparate data sets is a big trend right now.
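For anyone who hasn't built one: at its core it's record linkage across sources that don't share a key. Here's a deliberately toy Python sketch of the matching step; the source schemas and the email match key are invented for illustration, and real-world linkage is much messier (fuzzy name matching, addresses, dedup):

```python
from collections import defaultdict

# Two hypothetical source extracts with different schemas;
# the field names here are invented for illustration.
crm_rows = [
    {"email": "Ann@Example.com", "name": "Ann Smith"},
]
partner_rows = [
    {"contact_email": "ann@example.com", "policy": "HOME-123"},
]

def norm_email(e: str) -> str:
    """Crude match key; real pipelines use much fuzzier linkage than this."""
    return e.strip().lower()

def build_single_customer_view(crm, partner):
    """Fold every source record into one profile per match key."""
    profiles = defaultdict(dict)
    for row in crm:
        profiles[norm_email(row["email"])].update(name=row["name"])
    for row in partner:
        key = norm_email(row["contact_email"])
        profiles[key].setdefault("policies", []).append(row["policy"])
    return dict(profiles)

print(build_single_customer_view(crm_rows, partner_rows))
# {'ann@example.com': {'name': 'Ann Smith', 'policies': ['HOME-123']}}
```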