Many browsers throttle JavaScript performance (particularly looping timers) when the tab doesn't have focus. My guess is the "color bomb" is just a side effect of JS performance returning to normal when focus returns to the tab.
60 events per second and it's pretty but basically unfollowable.
It's something I always wonder about visualizations - no human can follow the amount of data in even a small system. How much are they just hype and feelgood, and how much are they genuinely useful?
In finance this is a classic problem and is why you see traders with a proliferation of monitors, trying to track everything, and even so they are outmatched by the machines. The irony is that the vast majority of tick data points don't contain much incremental information. They're very correlated. Think of how most stocks tend to move in lockstep with the index.
We use principal component analysis a lot to cut thousands of feeds down to the "big picture", usually 4-6 "global" variables, and then we use PCA regression to find the "outliers" in the rest of the data and show those. That gives us 2-3 orders of magnitude less data - big picture + outliers - in a form mere humans can actually interpret, and it's very rare that this simple technique misses much. It can literally cut thousands of feeds down to a couple of dozen. We've found this far more effective than animated "dot swarms", which look beautiful but are poor at conveying rich information.
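A minimal sketch of the idea (the data, the choice of 3 components, and feed 17 as the planted outlier are all made up for illustration): project the feeds onto a few principal components for the "big picture", then rank feeds by how much variance those components fail to explain.

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical data: 500 snapshots x 200 feeds, mostly driven by 3 common factors
factors = rng.normal(size=(500, 3))
loadings = rng.normal(size=(3, 200))
feeds = factors @ loadings + 0.1 * rng.normal(size=(500, 200))
feeds[:, 17] += rng.normal(scale=5.0, size=500)  # one feed with its own story

# PCA via SVD on the centred data
centred = feeds - feeds.mean(axis=0)
U, S, Vt = np.linalg.svd(centred, full_matrices=False)
k = 3
scores = centred @ Vt[:k].T        # "big picture": k global variables
reconstructed = scores @ Vt[:k]    # the part the global factors explain
residual = centred - reconstructed # the part they don't

# Rank feeds by unexplained variance; the top few are the "outliers" to show
outlier_rank = np.argsort(residual.var(axis=0))[::-1]
print(outlier_rank[:5])  # the planted outlier, feed 17, ranks first
```

Instead of 200 raw feeds, a human watches k global variables plus whichever feeds currently top the residual ranking.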
That's actually a challenge that we tried to overcome - being able to get the big picture on the one hand, while still being able to catch and filter a single event.
I think visualisations need to be a seriously terse summary, leaving out most of the detail apart from one or two critical dimensions where you're looking for the pattern or trend. A latency distribution heatmap can make a lot of data points visually useful, but you can't cram in five other dimensions and still make it work.
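As a sketch of why a latency heatmap scales so well (the request counts and bucket sizes here are arbitrary): binning collapses any number of points into a fixed-size grid, so the visual cost stops growing with data volume.

```python
import numpy as np

rng = np.random.default_rng(1)
# Hypothetical: 100k request latencies (ms) with timestamps over one hour
timestamps = rng.uniform(0, 3600, size=100_000)
latencies = rng.lognormal(mean=3.0, sigma=0.5, size=100_000)

# Bin into (time bucket, latency bucket) counts: 60 one-minute columns,
# 40 log-spaced latency rows. Each cell's count becomes a colour intensity,
# so 100k points collapse into a 60x40 grid a human can actually read.
time_edges = np.linspace(0, 3600, 61)
lat_edges = np.geomspace(latencies.min(), latencies.max(), 41)
heat, _, _ = np.histogram2d(timestamps, latencies,
                            bins=[time_edges, lat_edges])
```

The grid stays 60x40 whether you feed it a hundred thousand points or a billion; it's the extra dimensions, not the point count, that break the picture.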
Alooma pricing varies greatly. Our customers are paying anywhere between $1000 and $15000 per month. Because the variance is so big, we prefer to have a conversation before providing a quote. There is a two-week free trial, though, to test things out.
It would be great if you had this on the pricing page. It's really frustrating for there to be nothing there.
My startup has a ~$50 / mo Kafka ETL system hooked up to a few things and it seemed like a good candidate to move to hosted. But if your small clients are $1000 / mo... Noooope, not our scale! Good thing I saw this post or I might have spent a lot of time trying it out.
1) What is the cost relating to: requests? GB ingested? unique users? sources? Too many variables...
2) On a different topic. There is one thing I'd like to see that would put you at a great advantage over all competitors: the ability to have the tracker appear under my own domain, like "stats.mysite.com/whatever".
Right now all client-side JavaScript trackers are expected to reside at a well-known address that can't be changed (e.g. alooma.com, google-analytics.com/analytics.js, or segment.com). These well-known addresses are systematically blocked by all adblockers.
1. Mostly event volume, but also the number of integrations.
2. It's actually possible if you CNAME your domain, e.g. `alooma.user5994461.com`, to `inputs.alooma.com`. However, SSL won't work, as we don't support installing custom certificates at the moment.
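For concreteness, the CNAME setup described above would look something like this in a zone file (the TTL and the user's hostname are illustrative):

```
; Alias your own subdomain to the Alooma collector.
; Note the SSL caveat: the certificate served will be for
; inputs.alooma.com, not for your subdomain.
alooma.user5994461.com.   300   IN   CNAME   inputs.alooma.com.
```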