chunks <- Streams.toList i
let feed = decode (BL.fromChunks chunks) :: Maybe Feed
You lose streaming here; this reads the entire response body into memory before handing it to Aeson to decode. Which is a shame, since both Aeson and http-streams otherwise seem able to stream arbitrarily large content through without buffering much of it in memory. (I have not tested how well Aeson manages this; just looking at the types!) Streaming could be used to, e.g., fold an operation over the list of earthquakes, no matter how long the list becomes.
Fixing this is probably beyond the scope of a simple walkthrough. I suspect it could be managed using unsafeInterleaveIO to translate between the io-streams chunks and the lazy bytestring chunks. Might be a nice thing to ask the io-streams developers to add. (Dealing with resource finalization could make it tricky though.)
On a less esoteric note, fetchQuakes would be improved by using withConnection. That would make it shorter and would ensure the connection is closed if there's an error.
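Roughly this shape — a sketch only, coded from memory against the http-streams API; the host, path, and the Feed type are placeholders standing in for whatever the walkthrough uses:

    import           Data.Aeson           (decode)
    import qualified Data.ByteString.Lazy as BL
    import           Network.Http.Client
    import qualified System.IO.Streams    as Streams

    -- withConnection closes the connection even if sendRequest,
    -- receiveResponse, or decoding throws an exception.
    fetchQuakes :: IO (Maybe Feed)
    fetchQuakes =
        withConnection (openConnection "example.com" 80) $ \c -> do
            q <- buildRequest $
                http GET "/quakes.json"   -- placeholder path
            sendRequest c q emptyBody
            receiveResponse c $ \_ i -> do
                chunks <- Streams.toList i
                return (decode (BL.fromChunks chunks))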
It's not all that difficult, since io-streams has attoparsec support and aeson's value parser is an attoparsec Parser. (Coded directly in the comment box; I don't have io-streams installed, so I can't check whether this typechecks:)
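A minimal sketch of what that might look like, assuming a Feed type with a FromJSON instance (parseFromStream is from io-streams' attoparsec support; json' is aeson's strict Value parser):

    import           Data.Aeson                   (FromJSON, Result (..), fromJSON)
    import           Data.Aeson.Parser            (json')
    import           Data.ByteString              (ByteString)
    import           System.IO.Streams            (InputStream)
    import qualified System.IO.Streams.Attoparsec as Streams

    -- Feed the stream's chunks to attoparsec incrementally, instead of
    -- collecting them all with Streams.toList first.
    parseFeed :: FromJSON a => InputStream ByteString -> IO (Maybe a)
    parseFeed i = do
        v <- Streams.parseFromStream json' i
        case fromJSON v of
            Success x -> return (Just x)
            Error _   -> return Nothing

Note this still builds the whole parsed Value in memory before converting; it only avoids the intermediate list of raw chunks. Truly folding over the earthquakes incrementally would need a hand-rolled parser.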
Yeah, I figured that defeated the benefit of streaming. And I agree, withConnection is cleaner, I'll add that if I do a followup (which I think I will given all the good advice I'm getting from HN). Thank you!