Will all the people who bought it be refunded? Or does Apple simply take the money and run, leaving customers with empty hands?
Are you saying that Apple shouldn't let developers stop selling an app?
That wasn't clear in the article. However, as others noted, it's also been removed from their list of purchased apps, so if their iPhone gets damaged, or the app gets deleted from their phone in a mishap, they're still out the cash they paid and left with empty hands.
> Are you saying that Apple shouldn't let developers stop selling an app?
They should do the same thing Steam does: Remove the store page and keep it available for redownload by customers who paid until it positively cannot function anymore.
Not if they have it on the Mac/Windows PC they sync with.
> When an app is acquired, the acquirer usually requests for it to be pulled.
Record video, extract frames, simulate burst mode. Not that difficult really.
My understanding (from reading the various threads by those much more qualified than me) is that the application was an incredible technical achievement, and that the skill and knowledge required to develop such an application was extraordinarily rare - so rare that acquiring such a developer for a few million dollars (or whatever they might pay) would be a great investment by Apple.
Do you have information/knowledge that leads you to believe otherwise?
The app maker wrote about the tech behind it on HN a while back: https://news.ycombinator.com/item?id=6137979
A really nice technical achievement. Not at all simple or run-of-the-mill.
"At the core of SnappyCam is a capture and image signal processing engine with innovations that took over 12 months of research and development. With it, we can also beat competing iOS camera apps by 400% on full-sensor shooting performance with the same iOS device and hardware.
Once photos are captured and buffered in real-time, our multi-threaded JPEG compression engine takes over. It compresses shots in software at speeds that exceed that of the hardware encoder normally dedicated to the task.
We had to reinvent JPEG to do it. First we studied the fast discrete cosine transform (DCT) algorithms of the early 1990s, when JPEG was first introduced. We then extended some of that research to create a new algorithm that’s a good fit for the ARM NEON SIMD co-processor instruction set architecture. The final implementation comprises nearly 10,000 lines of hand-tuned assembly code, and over 20,000 lines of low-level C code. (In comparison, the SnappyCam app comprises almost 50,000 lines of Objective C code.)
At first we did try to leverage the iPhone graphics processing unit (GPU) for the DCT computation. It turned out to be a dead-end. Back then, iOS 4 limited the data transfer speed in and out of the GPU; but even with that limitation eliminated, with the introduction of OpenGL pixel buffers in iOS 5, it appeared that the GPU parallelism was limited to about two render units that ran at a slower clock-rate than the main CPU. Without support for OpenCL or multiple render targets, we were also forced to use a naive (slow) DCT algorithm that was essentially a full matrix multiplication.
The ARM NEON approach was optimal: the SIMD pipeline can perform up to 8 simultaneous arithmetic operations in parallel at the full clock rate of the device, without any data transfer overheads, and allowing us to use any DCT algorithm we could conceive. And when it comes to speed, it’s all about doing less for more. Less computation, more work done, faster.
We also optimized out pipeline bubbles using a cycle counter tool so that every clock tick was put to work.
JPEG compression comprises two parts: the DCT (above), and a lossless Huffman compression stage that forms a compact JPEG file. Having developed a blazing fast DCT implementation, Huffman then became a bottleneck. We innovated on that portion with tight hand-tuned assembly code that leverages special features of the ARM processor instruction set to make it as fast as possible.
Similar innovations were put into a custom JPEG decoder, powering the unique SnappyCam thumb-to-interact living photo viewer. When dealing with massive 8 Mpx (32 MByte BGRX uncompressed) images, decoder performance became critical to a great user experience."
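To make the quote concrete: the "naive (slow) DCT algorithm that was essentially a full matrix multiplication" mentioned above looks roughly like this. This is a pure-Python sketch of the textbook 8×8 DCT-II computed as D·X·Dᵀ, purely for illustration; it has nothing to do with SnappyCam's hand-tuned NEON code, and all function names are mine:

```python
import math

N = 8  # JPEG operates on 8x8 pixel blocks

def dct_matrix(n=N):
    # DCT-II basis matrix: D[k][x] = c(k) * cos(pi * (2x + 1) * k / (2n))
    d = []
    for k in range(n):
        c = math.sqrt(1.0 / n) if k == 0 else math.sqrt(2.0 / n)
        d.append([c * math.cos(math.pi * (2 * x + 1) * k / (2 * n))
                  for x in range(n)])
    return d

def matmul(a, b):
    # Full matrix multiplication -- the "naive (slow)" part
    return [[sum(a[i][k] * b[k][j] for k in range(len(b)))
             for j in range(len(b[0]))] for i in range(len(a))]

def transpose(m):
    return [list(r) for r in zip(*m)]

def dct2d(block):
    # Separable 2-D DCT as two full matrix products: D * X * D^T
    d = dct_matrix()
    return matmul(matmul(d, block), transpose(d))
```

For a constant 8×8 block of value 128, this produces a single DC coefficient of 1024 with every other coefficient zero, which is the expected DCT-II result. The fast algorithms the quote refers to (and their NEON adaptations) exploit the symmetries of the cosine basis to avoid most of these multiplications.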
The idea behind SnappyCam was also to code each picture independently, rather than relying on motion prediction or video codecs. If you try to pull a single frame from an HD video you might be disappointed: video compresses the YUV dynamic range ("studio swing") and the frame looks washed out, even if you land on an I-frame.
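A minimal sketch of what "studio swing" means in practice: video-range luma occupies 16–235 rather than the full 0–255, so a frame pulled straight out of a video looks washed out unless it is stretched back to full range. The function name is illustrative, not from any real API:

```python
def expand_studio_swing(y):
    # Studio-swing ("video range") luma uses 16..235; full range is 0..255.
    # Black (16) maps to 0 and reference white (235) maps to 255; values
    # outside the nominal range are clamped.
    return max(0, min(255, round((y - 16) * 255 / 219)))
```

Without this expansion, video black (16) displays as dark grey and white (235) as light grey, which is the washed-out look described above.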
On the other hand, the iPhone 4 and 5 video modes advertise 1080p at 30 fps, and 1080p is equivalent to about 2.1 megapixels.
So the iPhone video mode compresses about 180 megabytes of raw data a second, while SnappyCam compresses more like 457 megabytes a second (assuming three bytes per pixel).
Of course, SnappyCam could be using a simpler compression algorithm to get throughput at the cost of quality.
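The back-of-the-envelope figures above can be checked with a few lines of arithmetic. Note the assumptions: a 3264×2448 (8 Mpx) sensor and a 20 fps full-resolution burst rate are my guesses at the inputs behind the 457 MB/s figure, not numbers stated in the thread:

```python
MB = 1024 ** 2  # bytes per megabyte

# 1080p video: 1920x1080 px, 3 bytes/px, 30 fps -> the "about 180 MB/s" figure
video_rate = 1920 * 1080 * 3 * 30 / MB   # ~178 MB/s

# Full-sensor burst: assumed 3264x2448 (8 Mpx) at an assumed 20 fps, 3 bytes/px
burst_rate = 3264 * 2448 * 3 * 20 / MB   # ~457 MB/s
```

Under those assumptions the two figures come out to roughly 178 MB/s and 457 MB/s, matching the comparison above.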
I'm happy to have it and I'd pay for it again if it's ever reissued. Highest praise I can offer.