
Android's Camera API has existed for _literally years_ to take high-quality photos, and these guys are still just taking a screenshot of the viewfinder and calling it a day. It's been like this forever, and Snap has refused to fix it. I have yet to come across any other Android app that uses the camera this way. The photos are complete garbage.

And here they are, trying to add more 'features' to an app whose main feature has been inherently broken since inception.
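For anyone wondering what "taking a screenshot of the viewfinder" means concretely, here is a minimal sketch of the two capture paths using the old Camera1 API. This is purely illustrative (class and method names are mine); Snapchat's actual implementation isn't public.

    import android.graphics.Bitmap;
    import android.hardware.Camera;
    import android.view.TextureView;

    public class CaptureSketch {
        // The "screenshot" path: grab whatever pixels the preview surface
        // currently shows. Fast and WYSIWYG, but capped at the preview's
        // resolution, with none of the still-capture processing.
        public static Bitmap screenshotViewfinder(TextureView preview) {
            return preview.getBitmap();
        }

        // The conventional path: request a real still capture, which
        // returns a full-resolution, properly processed JPEG.
        public static void takeRealPhoto(Camera camera, Camera.PictureCallback onJpeg) {
            camera.takePicture(null /* shutter */, null /* raw */, onJpeg);
        }
    }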

edit: this post kinda blew up. For a real side-by-side comparison, see https://imgur.com/a/wuaZi




As a near-daily Snapchat user, and someone who has developed an Android app that uses the camera, I prefer it this way. I'm not using Snapchat to take super-high-quality pictures, but to share moments quickly. The WYSIWYG approach works well for that, and a lot of Android devices are super slow to take pictures.

Dealing with the Camera1 vs. Camera2 Android APIs, plus device-specific issues, is a mess; I don't blame them for taking the easier route. The iOS camera system is a lot easier to develop against, so I blame Android for this rather than Snapchat.
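To give a sense of the mess: even deciding which of the two APIs to use requires probing the device at runtime. A rough sketch of the kind of check involved (my own illustrative code, not from any shipping app; the fallback policy is an assumption):

    import android.content.Context;
    import android.hardware.camera2.CameraAccessException;
    import android.hardware.camera2.CameraCharacteristics;
    import android.hardware.camera2.CameraManager;
    import android.os.Build;

    public class CameraApiPicker {
        // Camera2 exists from API 21, but many devices expose only its
        // LEGACY hardware level, a shim over the old HAL that often
        // behaves worse than Camera1 did.
        public static boolean shouldUseCamera2(Context ctx) {
            if (Build.VERSION.SDK_INT < 21) return false; // Camera2 unavailable
            CameraManager mgr =
                    (CameraManager) ctx.getSystemService(Context.CAMERA_SERVICE);
            try {
                for (String id : mgr.getCameraIdList()) {
                    Integer level = mgr.getCameraCharacteristics(id)
                            .get(CameraCharacteristics.INFO_SUPPORTED_HARDWARE_LEVEL);
                    if (level != null && level ==
                            CameraCharacteristics.INFO_SUPPORTED_HARDWARE_LEVEL_LEGACY) {
                        return false; // fall back to Camera1 on LEGACY devices
                    }
                }
            } catch (CameraAccessException e) {
                return false; // play it safe if the camera service balks
            }
            return true;
        }
    }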


It screws over higher-end devices, though. One of the reasons I bought an LG G3 was the "laser autofocus" thing, which actually did make it insanely fast to take an in-focus photo. Snapchat was way slower, and I couldn't focus on anything closer than about three feet.

I kind of blame Android too; as a dev, I know how crappy the camera APIs can be. But as a user, the fact stands that only Snapchat takes crappy photos because it doesn't use the standard API; everything else works fine. (Okay, Facebook Messenger actually does this too, but oddly the main Facebook app does it right.)


Screws over! Oh no!

Good thing it still has the native app

Is Snap obligated to transport your high-res images via the agreement you have when using their software?

Sounds like a market opportunity for software that doesn’t screw you over


I'm not sure how you can think your sarcasm adds anything to the conversation, nor how it could be warranted.

In case you haven't noticed, Snapchat's main hurdle these days is that its growth has been vastly slowed by the "stories" timelines in Facebook's apps (Instagram, WhatsApp, Messenger), which do offer higher-quality pictures.


I don’t really see how histrionics over a tech company, in a so-called free market, with an obligation to make money rather than to “not screw over users,” add anything to any conversation.

We’ve been here before, especially here on HN. The whole setup itself is banal.

Are we supposed to sit here and fix their problems? What point does that serve?

It’s like forcing an author to change their story.

Get over it, America! No one cares about gameifying Silicon Valley


Taking the easy route was OK when they were a scrappy startup, but now that they are a multi-billion-dollar public company, I honestly don't see why they can't hire a couple of qualified Android devs to fix this.


It's never really about the devs. It's about creating buy-in from management to support the work that needs to be done. You'll be hard-pressed to find leadership willing to re-architect/re-engineer apps from scratch without significant pressure from somewhere in the org other than engineering.

Technical debt is always the last concern in a product. But for once, it's manifesting as a real user-facing problem, one that was acknowledged for the Android app on the last SNAP earnings call.


Ugh, and I thought that "The Camera App Company" that only hires rockstar engineers would be different...


Doesn't Instagram use the same approach on their Stories Camera on Android? Seems more of an issue with Android Camera performance/APIs than an engineering critique.


So you prefer gimping your photo quality because you're too lazy to learn how the API works, and assume that your users are OK with this as well?


Aren't the photos on Snapchat intended to be ephemeral? I'd be totally okay with potato quality if I was sharing something quickly because Snapchat isn't a photography app.

It's not as if Snapchat is a tiny company with nobody smart enough to notice there's a camera API; this is intentional.

If you want to send a quality photo to a friend, there are apps for that.


"snapchat isn't a photography app"

you literally take photos and send them to people


Snapchat is to photography as passing notes in class is to literature


Sure, but it's a sticky note, not a novel.


> Aren't the photos on Snapchat intended to be ephemeral? I'd be totally okay with potato quality if I was sharing something quickly because Snapchat isn't a photography app.

This hasn't been the primary focus for years. You can now send Snaps that don't expire after a time limit, and you can save your snaps to your device's storage or to Snap's cloud storage (called "Memories").


The non-expiring snaps still disappear when you close them; they just don’t auto-close. I guess there’s memories, but I doubt most people take photos just to put in memories.




Snapchat (the company) defines itself as a camera company; this can be seen in its IPO and SEC filing documents. And I bet it has a lot to do with Snapchat Spectacles.


I've used some pretty recent Android devices which take 1-2 seconds to focus and capture using the actual camera, so I see where he's coming from. Done > perfect when you're trying to capture something happening RIGHT now.


Hate to break it to you, but Snapchat still struggles to focus quickly, probably because they're using the viewfinder's lower resolution to perform the focus action, which further hampers its ability to focus accurately on an object.


We're doing video, so we're using Camera1 (and Camera2 where it's available), but it's a bad situation. Lots of obscure devices we don't have access to test on fail regularly, and users blame us, not Android. The API is clumsy and broken, not just difficult to use.

If I were making an app with photo-taking capabilities, I would explore the Snapchat route. Obviously Snapchat's users are OK with this for the most part.


This^

I have made countless cross-platform AR apps for mobile and desktop. Never an issue with WebRTC/getUserMedia, iOS, Windows, etc. Android, however, is always a nasty problem: incorrect aspect ratios, image stretching, silent failures. Each device needs a custom fix, which is time-consuming and costly. No wonder Snapchat went an alternative route.


Are time-consuming custom fixes still commonly required in other (non-camera-reliant) Android apps too? I haven't really done any Android development since I had a Galaxy phone four years ago, because this problem really turned me off making hobby apps for Android.


The camera & low-level video encoding / decoding are the most device-specific quirky in my experience. If you're making something that is just "software" it's all pretty standardized, but when you start accessing the "hardware" (video codec, camera, possibly low-level audio) is where the device specific differences catch you.


It sounds like he does understand how the API works, doesn't like the way it works, and has decided that for the scope and goals of his project, doing things the fast and low quality way meets the needs of his users.


How often is the Venn overlap of "High Quality Photo" and "OMG Selfie!" exceedingly relevant?


What kind of photo quality do you need to take a selfie to send to your BFF with cat ears drawn on?


Why not implement both, and keep the screencap method for devices with device-specific issues, or at least offer an option to use either?


I worked on an app where we needed to take a live video stream, run it through OpenCV, and also quickly take high-res photos while that was running.

On Android I had real trouble getting the preview stream to line up with the photographs. I needed to keep the preview resolution low enough that devices could process it and present it to the screen in real time. But not all devices have the same preview and photograph resolutions.

I found there were inconsistencies in the way the image was cropped (at the OS/hardware level) between different resolutions.

You could take a preview stream at resolution X and it would appear full frame. Then you would take a photograph at resolution Y and, despite it being the same aspect ratio, it would be cropped differently. The results varied between devices: some matched the crop, some cropped vertically, some horizontally, and (as far as I could tell) there was no way to identify programmatically how the camera was doing the cropping. It seemed to be happening at a level much lower than I could access.

In the end we just had to compromise on a resolution in the middle and use it for both preview and photo.
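For the curious, that compromise looks roughly like this with Camera1: walk the supported preview and picture sizes and pick a pair with matching aspect ratios, since (as above) the ratio is the only lever you reliably control and the crop itself happens below the API. Illustrative sketch only; the 1% tolerance and first-match policy are arbitrary choices of mine:

    import java.util.List;
    import android.hardware.Camera;

    public class SizeMatcher {
        public static void configure(Camera camera) {
            Camera.Parameters p = camera.getParameters();
            List<Camera.Size> previews = p.getSupportedPreviewSizes();
            List<Camera.Size> pictures = p.getSupportedPictureSizes();
            for (Camera.Size pre : previews) {
                double preRatio = (double) pre.width / pre.height;
                for (Camera.Size pic : pictures) {
                    double picRatio = (double) pic.width / pic.height;
                    // Accept the first preview/picture pair whose aspect
                    // ratios agree to within 1%.
                    if (Math.abs(preRatio - picRatio) < 0.01) {
                        p.setPreviewSize(pre.width, pre.height);
                        p.setPictureSize(pic.width, pic.height);
                        camera.setParameters(p);
                        return;
                    }
                }
            }
        }
    }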

This may not be the reason, and maybe there's a way around it; the project was a year or two ago now. Implementing the same thing on iOS, I didn't run into the issue.


I'm fine with the picture quality (they are ephemeral, after all), but I'm not fine with how the Android app is a slow, clunky, battery-destroying mess.

Example: if I get a plaintext chat message through Snapchat and tap the notification to open it, do I go straight to the chat thread? No, I'm dropped at the camera, which has to start streaming and download all the new filters and God knows what else before it lets me swipe left to get to chats, then tap the specific thread to see the new message.


Yeah, the Android app has sucked for a while now. You can't quickly snap something interesting. What's the point?


I always thought this was to ensure you get a photo of what you're seeing in the moment, not something after a delay, which is what was happening with older phones. Essentially it's a fast-and-dirty way to make photos WYSIWYG and avoid camera lag.


Snapchat takes beautiful photos on iOS. Some people I know even prefer it to the native camera. I wonder why this is different for Android? Does anyone have any insight here?


Just a hunch, but I'm guessing the call to capture a screenshot is restricted by Apple, forcing them to do it the 'correct' way.


Saving a view to an image on iOS is a few lines of code. They could save the view to an image if they were so inclined.


People prefer to use it because it shows you the mirrored picture, which you're more used to seeing.


It has a built-in low-light photo mode on iOS that I use over the stock camera app all the time. In full light you can't even tell the difference between stock and Snapchat photos.


It sure doesn’t for me; it takes the worst photos of any app I use, by a long shot. It's fine for ephemeral stuff, but nothing I want to save ever turns out well. (This goes for both my 6S and my 8+.)


When you save, you're saving exactly what would be uploaded to Snapchat, which is downsized for bandwidth reasons. You'll especially notice this on videos.


I'm looking at some of my saved snapchat videos on iOS and I am almost certain from the quality that there's no downsizing done here. Maybe if you send it to your story and then download it from there it'll be downsized to what exists on their server.


Move it to a desktop and you'll notice. I saved a number of videos directly from the camera view, and they're at a much lower resolution than the built-in camera produces, like 720p or lower, which, disappointingly, I didn't discover until I got home.


As much as I love to hate on Snapchat, there's probably no reason to use the Camera API for high-quality pictures given Snapchat's use case. And if your nudes were going to be leaked, would you rather they were blurry or high-res?


As a social network whose main feature is taking photos, why wouldn't you want those to be the best quality possible?


The main feature is not "taking photos"; it's "capturing memories for quick sharing." The additional time it takes to focus and shoot (which, judging by certain built-in camera apps, could be seconds) makes or breaks the in-the-moment feel.

In the end, these photos are meant to be looked at for at most 10 seconds, with stickers/drawings/text superimposed. You don't need a high-quality shot for that.


I think you're overstating the delay involved in taking a photo and having Android process it. Whether you're taking a screenshot of the viewfinder or properly processing the photo, the moment you press the shutter button is the moment Android captures what the camera sees. The processing may take a second or so (if you're using the camera API), but that's fairly negligible in the long run. In any other Android app that properly uses the camera, whether a built-in camera app or something third-party like Facebook, Instagram, OpenCamera, or Snapseed, you're not waiting 15 seconds for an image to process. It's all fairly quick and seamless.
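You can actually measure where the time goes using Camera1's callbacks: the shutter callback fires at roughly the instant of sensor capture, and the JPEG callback arrives after processing, so any processing delay happens after the frame is already frozen. A hedged sketch (the timing code and names are mine, not from any app):

    import android.hardware.Camera;
    import android.util.Log;

    public class CaptureTiming {
        public static void snap(final Camera camera) {
            final long pressed = System.nanoTime();
            camera.takePicture(
                new Camera.ShutterCallback() {
                    @Override public void onShutter() {
                        // Fires close to the moment the sensor captures.
                        Log.d("Timing", "sensor captured after "
                                + (System.nanoTime() - pressed) / 1000000 + " ms");
                    }
                },
                null, // no raw callback
                new Camera.PictureCallback() {
                    @Override public void onPictureTaken(byte[] jpeg, Camera c) {
                        // Arrives once the processed JPEG is ready.
                        Log.d("Timing", "jpeg ready after "
                                + (System.nanoTime() - pressed) / 1000000 + " ms");
                    }
                });
        }
    }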

The excuse of "oh, it's just for quick sharing, so we don't have to focus on quality" is a strawman that doesn't hold up, because the tech is there to make it work both beautifully and quickly; they just don't put the resources into making it happen.


I think it can take a hundred milliseconds or more to actually capture. It has to switch the camera from preview mode to capture mode, which involves a change in resolution and encoding, and perhaps recalibrating white balance and automatic gain. So you press the button and hold for a beat, or when you release the button you may shake/blur the image at the moment of capture.


None of these performance hits sounds like it negatively affects the purpose of the app, though, at least not to the point where people would complain about it.

I am but one person, but there are far more complaints about the quality of the photos than about the milliseconds of lag after hitting the capture button.


Using the stock camera app on my unmodified LG G6, this is just not true: there's about a second of lag between pressing the button and actually capturing photons. Trying to capture my son in a specific pose is basically impossible. Maybe I'll download an app that screenshots a live video rather than using the camera API, or whatever it's doing.


Try the Google Camera app that's in the store.


The main feature is absolutely not taking photos. Snapchat's product fit is high-volume, low-effort communication. In fact, I'd say the low-quality pics are almost a feature, because most of the time people put stickers or one of those dynamic face-manipulation 'sunglasses' or 'crowns' on the picture, and those (inherently low-res) would look ridiculous on very high-res images.


Depends on what you mean by "high quality." If you're talking about >1 MB photos, that can be a hefty payload to upload for the subset of users with poor connections. Keeping the photos lean enough while still looking great on a wide variety of phones is probably a better approach. That said, I'm not sure they get that kind of quality just from a viewfinder screenshot.


It would be the same amount of data if they scaled the image on the client and then transferred it. The problem isn't compression quality or resolution; it's that it doesn't properly autofocus or adjust for very bright or low-light conditions.
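Client-side scaling is a few lines on Android. Something like the sketch below keeps the upload payload the same size while starting from a properly focused, properly exposed capture; the target width and JPEG quality are arbitrary placeholders of mine:

    import android.graphics.Bitmap;
    import android.graphics.BitmapFactory;
    import java.io.ByteArrayOutputStream;

    public class UploadShrinker {
        // Decode a full-resolution capture, scale it down preserving the
        // aspect ratio, and recompress to an upload-friendly JPEG.
        public static byte[] shrink(byte[] fullResJpeg, int targetWidth, int quality) {
            Bitmap full = BitmapFactory.decodeByteArray(fullResJpeg, 0, fullResJpeg.length);
            int targetHeight = full.getHeight() * targetWidth / full.getWidth();
            Bitmap small = Bitmap.createScaledBitmap(full, targetWidth, targetHeight, true);
            ByteArrayOutputStream out = new ByteArrayOutputStream();
            small.compress(Bitmap.CompressFormat.JPEG, quality, out); // e.g. quality = 80
            return out.toByteArray();
        }
    }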


The apparent delight in low-quality photography that pervades both Snapchat and Instagram is among the reasons I don't bother using either. Earlier photography applications and platforms such as Flickr celebrated high-quality camera equipment and the resulting photographs. Intentional dramatic downsampling and filters that add vintage discoloration have never resonated with me.


Instagram is a much more diverse place than Snapchat. If you follow the right people, you will see basically only high-quality photos (likely taken with high-end camera equipment, too!), but if you just follow your “friends” et al., you are going to be disappointed.


> And here they are, trying to add more 'features' into an app where its main feature has been inherently broken since inception.

I am an Android user as well (I own a Nexus 4, 5, and 6, and the 2013 Wi-Fi Nexus 7) and I welcome this feature. I think they will fix the Android way of doing things at some point. In fact, given the choice between entirely disabling the new swipe-right and a beautiful Camera API experience, I'd pick the former. I don't care about the Kardashians and I don't want them in my Snapchat. If that means I have to live with a crappy camera, I'm OK with it (mostly because my friends are all on iPhone and I mostly watch snaps rather than post them).


Sorry what? Why would making the app actually take photos mean you have to see the Kardashians?


I believe the point mcny is making is that the developers chose to focus on the social/media split instead of the camera API. "Making the app actually take photos" means devoting resources to that instead of to the new feature they've announced today.


> the developers chose

This seems unlikely.

More likely scenario: the developers chose to keep their jobs and did what management told them.


My impression was that they did this because it made taking a picture faster than using the camera, and they specifically did not want to encourage high-quality photos. They wanted quick personal moments where you don't care so much about how you look, or about capturing the perfect lighting, or whatever.


The Android Camera API is notoriously bad, and the Camera1/Camera2 split isn't helping either. I would guess the main reason they take the screenshot approach is that it's actually faster than asking Android to take a photo, and it produces a photo that exactly matches what the user expected.


I wonder if it has to do with picture size and bandwidth, since they never seemed to focus on Android, what with its many low-end devices (meaning little value for advertisers?). Also, they aren't Instagram or Flickr; it was disappearing pics, meant to last a few seconds, right? Or have I got this wrong?


They can last for 24 hours sometimes or forever in the “memories” section.


Thanks, I've done my bit in return by citing your comparison URL in my (relevantly critical) review of the app on the Play Store.


I always liked that, because what you see is what you get, and it's much faster.


I cannot wrap my head around why it was even implemented like this in the first place. Were they too lazy to look up the Camera API implementation? Was it a hacky workaround to avoid some restriction in Android?


It could be for a variety of reasons:

1. Compatibility with old devices or API levels
2. Performance optimisation for the filters
3. Lack of priority for the Android app

On my iPhone, Snapchat definitely uses the full-fledged APIs and even uses OIS or the image processor to pick/merge the best frame out of the video stream (a lot of blurry photos I take come out unblurred), so I’m not sure why the Android experience is allegedly so lacklustre.


omg they are really doing this? A shame!


lol is this true? Proof? If so, that's most likely why my phone took such shitty snaps when I had Android.



