SnappyCam disappears from App Store without a trace (smh.com.au)
41 points by mcbain 1297 days ago | 26 comments



Annoyingly it's also disappeared from my big list of purchased apps too.


Where is a class-action lawsuit when you need one!?!



Hasn't iOS 7 shipped with the main features of this app as stock since its release? Was he acquired beforehand, maybe?


I'm not sure it's the same thing. I just did a test, and burst mode on my iPhone 5S captured about 9 frames per second. SnappyCam claimed 20 to 30 fps at full resolution.


I don't own an iPhone, and hearing that Apple will happily remove purchased applications with no explanation whatsoever doesn't exactly sway me towards ever getting one. However, I have to ask:

Will all the people who bought it be refunded? Or does Apple simply take the money and run, leaving customers with empty hands?


The app wasn't removed from people's iPhones (I still have it on mine, for example); it was removed from the App Store, likely at the request of the developer.


Except that it's only been removed from the App Store. I still have it on my iPhone.

Are you saying that Apple shouldn't let developers stop selling an app?


> I still have it on my iPhone.

That wasn't clear in the article. However, as others noted, it's also been removed from their list of purchased apps, so if their iPhone gets damaged or the app gets deleted in a mishap, they're still out the cash they paid and left with empty hands.

> Are you saying that Apple shouldn't let developers stop selling an app?

They should do the same thing Steam does: remove the store page but keep the app available for redownload by customers who paid, until it positively cannot function anymore.


>they're still out the cash they paid and left with empty hands.

Not if they have it on the Mac/Windows PC they sync with.


It's almost certainly not Apple that removed the app (unless they're the ones that bought it). Read the article again, it clearly states:

> When an app is acquired, the acquirer usually requests for it to be pulled.


jpap discussed a lot of what the app does in another HN thread: https://news.ycombinator.com/item?id=6137979


I could see Instagram acquiring this as well.


Since iOS apps do not have direct access to native hardware, what exactly would Apple be buying?

Record video, extract frames, simulate burst mode. Not that difficult, really.


I'm (genuinely) curious what your background is to assess that SnappyCam was "not that difficult really."

My understanding (from reading the various threads by those much more qualified than me) is that the application was an incredible technical achievement, and that the skill and knowledge required to develop such an application was extraordinarily rare - so rare that acquiring such a developer for a few million dollars (or whatever they might pay) would be a great investment by Apple.

Do you have information/knowledge that leads you to believe otherwise?


Apple is buying someone who was smart enough to find a way around that limitation and build a top-ranked app. I actually paid for this app and hope it doesn't disappear from my phone; it was extremely useful at several weddings for getting a bunch of rapid-fire shots off and choosing the perfect one. It worked reasonably well even on an older iPhone 4S, and the quality was fine.

The app maker wrote about the tech behind it on HN a while back: https://news.ycombinator.com/item?id=6137979


I don't think apps disappear from your phone, even if they disappear from the app store. If they're in your iTunes library, you can put them on your phone, in my experience.


jpap invented a new, more efficient JPEG encoder, faster even than the hardware encoder normally dedicated to the task. Also, I think you can optimize hardware usage of iOS devices at the assembly level; it is not as locked down as Android.


You can include assembly in Android apps (it requires the NDK). The bigger problem is that an Android device could be ARM, x86, or even MIPS.
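A rough sketch of what that multi-architecture problem looks like: an NDK project's hot kernel needs a guarded fast path per ABI plus a portable fallback. The guards below are shown for shape only; the fast paths are left as comments, and only the C fallback is implemented.

```c
#include <stdint.h>

/* Architecture-guarded kernel, as an NDK build of that era had to
   provide it (one shared library each for armeabi-v7a, x86, mips).
   Only the portable fallback is implemented here. */
static int32_t dot8(const int16_t *a, const int16_t *b) {
#if defined(__ARM_NEON__)
    /* hand-written NEON intrinsics or .S assembly would live here */
#elif defined(__SSE2__)
    /* SSE2 intrinsics would live here */
#endif
    int32_t acc = 0; /* portable C fallback */
    for (int i = 0; i < 8; i++)
        acc += (int32_t)a[i] * b[i];
    return acc;
}
```

Each per-ABI path has to be written and validated separately, which is exactly the maintenance cost the parent comment alludes to.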


That matches my recollection as well. Also see http://www.gizmag.com/snappycam-iphone-app/28558/

A really nice technical achievement. Not at all simple or run-of-the-mill.


If video frames are of lower quality than photos, then your technique won't work. From the sounds of it, he figured out a way to encode images far faster, which allows faster use of the photo mode (which is usually of higher quality than the video mode).


Here is part of what Apple is buying (SnappyLabs has disappeared from the web, so thank goodness for the Wayback Machine):

https://web.archive.org/web/20131010012005/http://www.snappy...

"At the core of SnappyCam is a capture and image signal processing engine with innovations that took over 12 months of research and development. With it, we can also beat competing iOS camera apps by 400% on full-sensor shooting performance with the same iOS device and hardware.

Once photos are captured and buffered in real-time, our multi-threaded JPEG compression engine takes over. It compresses shots in software at speeds that exceed that of the hardware encoder normally dedicated to the task.

We had to reinvent JPEG to do it. First we studied the fast discrete cosine transform (DCT) algorithms of the early 1990s, when JPEG was first introduced. We then extended some of that research to create a new algorithm that’s a good fit for the ARM NEON SIMD co-processor instruction set architecture. The final implementation comprises nearly 10,000 lines of hand-tuned assembly code, and over 20,000 lines of low-level C code. (In comparison, the SnappyCam app comprises almost 50,000 lines of Objective C code.)

At first we did try to leverage the iPhone graphics processing unit (GPU) for the DCT computation. It turned out to be a dead-end. Back then, iOS 4 limited the data transfer speed in and out of the GPU; but even with that limitation eliminated, with the introduction of OpenGL pixel buffers in iOS 5, it appeared that the GPU parallelism was limited to about two render units that ran at a slower clock-rate than the main CPU. Without support for OpenCL or multiple render targets, we were also forced to use a naive (slow) DCT algorithm that was essentially a full matrix multiplication.

The ARM NEON approach was optimal: the SIMD pipeline can perform up to 8 simultaneous arithmetic operations in parallel at the full clock rate of the device, without any data transfer overheads, and allowing us to use any DCT algorithm we could conceive. And when it comes to speed, it’s all about doing less for more. Less computation, more work done, faster.

We also optimized out pipeline bubbles using a cycle counter tool so that every clock tick was put to work.

JPEG compression comprises two parts: the DCT (above), and a lossless Huffman compression stage that forms a compact JPEG file. Having developed a blazing fast DCT implementation, Huffman then became a bottleneck. We innovated on that portion with tight hand-tuned assembly code that leverages special features of the ARM processor instruction set to make it as fast as possible.

Similar innovations were put into a custom JPEG decoder, powering the unique SnappyCam thumb-to-interact living photo viewer. When dealing with massive 8 Mpx (32 MByte BGRX uncompressed) images, decoder performance became critical to a great user experience."


This reads like some genuine old-school low-level wizardry hacking and tuning that is rarely seen these days. Amazingly well done by the developers. Sounds like an epic amount of effort went into it.
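For a sense of why the Huffman stage they describe became the bottleneck, here is a toy JPEG-style bit packer in C (my own sketch, not jpap's code): each symbol's code has a data-dependent length, so every write depends on the bit position left by the previous one, making the inner loop inherently serial and branchy.

```c
#include <stdint.h>
#include <stddef.h>

/* Toy MSB-first bit packer of the kind a JPEG Huffman encoder drives.
   Variable code lengths make the loop serial: each write's position
   depends on all the codes emitted before it. */
typedef struct {
    uint8_t *buf;     /* output bytes */
    size_t   pos;     /* next byte to write */
    uint32_t bits;    /* pending bits, right-aligned */
    int      nbits;   /* number of pending bits */
} BitWriter;

static void put_bits(BitWriter *w, uint32_t code, int len) {
    w->bits = (w->bits << len) | (code & ((1u << len) - 1));
    w->nbits += len;
    while (w->nbits >= 8) {               /* spill completed bytes */
        w->nbits -= 8;
        w->buf[w->pos++] = (uint8_t)(w->bits >> w->nbits);
    }
}

static void flush_bits(BitWriter *w) {    /* pad the final partial byte */
    if (w->nbits > 0) {
        w->buf[w->pos++] = (uint8_t)(w->bits << (8 - w->nbits));
        w->nbits = 0;
    }
}
```

(A real JPEG encoder also byte-stuffs a 0x00 after every emitted 0xFF, adding yet another data-dependent branch.) The "special features of the ARM processor instruction set" the post mentions are presumably things like barrel-shifted operands and conditional execution, which suit exactly this kind of shift-and-merge loop.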


From jpap's thread at https://news.ycombinator.com/item?id=6137979, here is why he wasn't trying to extract pictures from the video buffer:

The idea behind SnappyCam was also to code each picture independently, and not rely on motion prediction or video codecs. If you try and pull a single frame from a HD video you might be disappointed: they compress the YUV dynamic range (studio swing) and it looks washed out, even if you land on an i-frame.
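The washed-out look he describes comes from video ("studio swing") range: BT.601-style pipelines store luma in 16..235 rather than 0..255. A frame grabbed from the video buffer needs to be stretched back to full range, roughly like this (an illustrative sketch, not SnappyCam code):

```c
#include <stdint.h>

/* Expand BT.601 video-range luma (16..235, "studio swing") to the
   full 0..255 range a still photo uses; out-of-range values clamp.
   Chroma (nominally 16..240) would be expanded similarly. */
static uint8_t expand_luma(uint8_t y) {
    int v = ((int)y - 16) * 255 / 219;   /* 219 = 235 - 16 */
    if (v < 0)   v = 0;
    if (v > 255) v = 255;
    return (uint8_t)v;
}
```

Even with this correction applied per pixel, you have still lost the contrast information that was squeezed out during encoding, which is why coding each still independently is the better approach.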


SnappyCam reportedly captures 8 megapixel images at 20 fps.

On the other hand, the iPhone 4S and 5 video modes advertise 1080p at 30 fps, and 1080p is equivalent to about 2.1 megapixels.

So the iPhone video mode compresses about 180 megabytes of raw data a second, while SnappyCam compresses more like 457 megabytes a second (assuming three bytes per pixel).

Of course, SnappyCam could be using a simpler compression algorithm to gain throughput at the cost of quality.


I can't recall the details of exactly what he claimed he'd developed, but what I can say is that he wrung more performance out of the iPhone camera than Apple was able to. It's really something impressive in use.

I'm happy to have it and I'd pay for it again if it's ever reissued. Highest praise I can offer.



