Hacker News
Sex toy company Lovense admits its Android app locally stored audio recordings (theverge.com)
193 points by danso on Nov 10, 2017 | 55 comments

Normally I'd call BS on a company calling unauthorized collection of data a 'minor' bug, but in this particular case, it seems likely that's what it was.

I say this because -- according to the company's official response[1] -- the recordings were only created on the Android version and not the iOS version of their app.

They also state that the recording is only cached locally on the phone and not uploaded to their servers.

Leads me to think that it's very likely their Android app programmer(s) wrote some test code to save the file, and forgot to delete it or conditionally disable it in the release version of the app. There's no excuse for the sloppy programmer(s) if that's the case; I'm just saying the 'bug' angle is a possibility, because the company has also been very quick to fix it and release an updated version of the Android app.

One more thing, I wouldn't call something like this a 'minor' bug. Likely their PR team threw that word in.

> Source: Lovense's official account on Reddit: https://www.reddit.com/r/sex/comments/7bmi3i/psa_lovense_rem...

Hanlon's Razor: Never attribute to malice that which is adequately explained by stupidity.

I think it totally applies here.

Not removing temporary files is a very common bug. As a developer, I've encountered it several times: sometimes I caused it, sometimes I fixed it, and I've seen it happen in other people's software just as often. This is one of the reasons I hate temporary files; they just don't want to be temporary.
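To make the failure mode concrete, here's a minimal Java sketch (all names here are invented for illustration, not Lovense's code): the scratch file only gets cleaned up because the delete sits in a finally block. Forget that one line and the "temporary" file survives every run.

```java
import java.io.File;
import java.io.FileWriter;
import java.io.IOException;

public class TempFileDemo {
    // Hypothetical processing step that needs a scratch file on disk.
    static String processViaTempFile(String data) throws IOException {
        File tmp = File.createTempFile("session-", ".tmp");
        try (FileWriter w = new FileWriter(tmp)) {
            w.write(data);
            // ... real work on the file would happen here ...
            return "processed:" + data;
        } finally {
            // The easy-to-forget line: without it, the file lingers forever.
            tmp.delete();
        }
    }

    public static void main(String[] args) throws IOException {
        System.out.println(processViaTempFile("audio-chunk"));
    }
}
```

The point of the sketch: cleanup is a separate, easily omitted step, which is exactly why "temporary" files so often outlive their purpose.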

The Godfather's Switchblade:

Make it look like an accident.

Ultimately I don't think we can judge intent in situations like these, especially because it's so easy to disguise[0]. A sentence can be constructed in favor of any opinion; so I'll spare you my attempt at an alternate reading of the situation that makes them sound guilty so long as you agree that it's possible to do.

I think a few questions can be raised about whether we can trust a company's claim that a problem stops exactly at the limits of what's been made publicly visible. How do we really know that select installations weren't phoning home with their captures?

This is a great moment to plug the idea that we really should be able to see the source code of the software we're running.

[0] http://www.underhanded-c.org/

This is very good. The parent comment is very reasonable, but your switchblade admits the possibility of something very concerning which is not easily dismissed.

They did post a bug fix though.

Well, why wouldn't they post a bug fix? I don't see how that says anything about motive/cause. They'd do that even if it was intentional.

> Hanlon's Razor: Never attribute to malice that which is adequately explained by stupidity.

Why? How do you "accidentally" record someone?

Presumably you have a feature that depends on recording (but not storing) audio. You then write some code to store the audio anyway, because you'd rather just use the stored audio as a development aid while working on the audio-dependent feature. Then you forget to disable the storing in the release build.
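A hedged sketch of how that mistake ships, in plain Java (BuildConfig.DEBUG is the real Android build flag; everything else here is made up): the feature only needs the live buffer, the development aid persists it, and a single build-time flag is all that separates the two.

```java
public class DebugCaptureDemo {
    // Stand-in for Android's BuildConfig.DEBUG; in a real app this is a
    // compile-time constant baked in by the build system.
    static final boolean DEBUG = false; // release build

    // Fake "disk" so the sketch is self-contained.
    static final StringBuilder disk = new StringBuilder();

    // Hypothetical handler: the feature consumes the live chunk, but a
    // development aid also writes it out. Ship with DEBUG stuck at true,
    // or with the check missing, and every user's audio hits storage.
    static void onAudioChunk(String chunk) {
        // ... feed chunk to the audio-dependent feature here ...
        if (DEBUG) {
            disk.append(chunk);
        }
    }

    public static void main(String[] args) {
        onAudioChunk("chunk-1");
        onAudioChunk("chunk-2");
        System.out.println("chars persisted: " + disk.length());
    }
}
```

With the flag correctly off in release, nothing is persisted; the reported bug is consistent with that one gate being wrong or absent.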

It seems that storing the file is not just a development aid, because according to the report they are still doing it after the fix (I don't have any compatible product to try it out...). They just make sure to delete it when it is no longer needed.

The reason they store the file probably has to do with technical constraints. Maybe the audio API they are using only works with files and not memory buffers, maybe the buffer took too much space, maybe it is a workaround of some kind. We won't know without analyzing their code.

It's possible this was only done on Android specifically because the odds of this being picked up and the app rejected by Apple are close to 100% while the odds of that happening in Google Play are about 0%.

Haha. Yeah, no. Apple catches UI issues and obvious use of private APIs. You just need the slimmest of excuses to do what you want. Like, if you want to track a user's location, put a location feature ANYWHERE in the app.

Also, if the app is using the mic while you're outside the app, you'd see a big red bar at the top of your screen.

Does Apple hellban in cases where they discover someone was maliciously trying to call private APIs? How easy is it to get away with this long term?

I know places that have been doing it for years. They just tell you to stop. It's basically an arms race: Apple trying to figure out how you're doing it, and you figuring out ways to prevent them from detecting it. The app has a few million users, so I'm not sure the average Joe would just get hellbanned.

Private APIs are APIs that only Apple is allowed to use. They let you do stuff you're not supposed to be able to do, like put stuff on the home screen. However, these days it's mostly enforced by the security model, so you can call the private API all you want and it's not going to do anything.

They also let you do things like record what other apps are doing. Think of the Linux top command: on iOS there's no real reason for an app to need to list the other processes, so you can't. However, if you wanted to know what apps are running on the phone, this would be a very useful API to have access to.
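Java reflection gives a rough analogy for the concept (this is not iOS code; the Framework class and wipeDevice method are invented): a "private" method isn't part of the public surface, yet a determined caller can still reach it, which is why Apple backs the policy with runtime enforcement rather than review alone.

```java
import java.lang.reflect.Method;

public class PrivateApiDemo {
    // Invented stand-in for a framework class with a non-public method.
    static class Framework {
        private static String wipeDevice() {
            return "device wiped";
        }
    }

    public static void main(String[] args) throws Exception {
        // The compiler won't let outside code call wipeDevice() directly,
        // but reflection reaches it anyway; roughly how private-API use
        // worked before the sandbox started refusing at runtime.
        Method m = Framework.class.getDeclaredMethod("wipeDevice");
        m.setAccessible(true);
        System.out.println(m.invoke(null));
    }
}
```

The analogy only goes so far: on iOS the modern defense is that the call executes but the security model denies the effect, not that the symbol is hidden.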

Apologies for my laziness, but what is meant by "private API" here, and why does Apple not want them to be used?

They are functions (in the programming sense) that can be called by other code, but Apple reserves them for their own use. They don't necessarily have anything to do with privacy, though some of them do. The "private" refers to them not being publicly documented and not provided for developer use. Internal developers at Apple are allowed to use them, but only as appropriate.

Consider a function (API) that allows one to wipe the phone. Apple doesn't want just anyone to be able to write an app that provides that feature. Same with a lot of other functionality. Other private APIs are things that Apple might someday offer to developers, but hasn't yet. For example, Apple can change the icon of its own apps (note the calendar app, which shows the current numeric month day, and the clock app, which shows the time including a moving second hand) but it does not offer this functionality to developers. [Edit: I was wrong about this; see firloop's comment below.] If they did, the results could be off the rails, visually, and make the platform look bad, so I expect when they do allow this, they'll do it with a heavy review process in place... but that's just one small example.

A lot of private APIs are more under-the-hood housekeeping stuff that you just don't want the average programmer to mess with.

The API to change an app's icon is not private (as of iOS 10.3).

However, it's much more restricted than the functionality natch referred to. An app can only change its icon while running, and even then only with the user's explicit permission.

Oooh, nice! Thanks.

> Apologies for my laziness, but what is meant by "private API" here, and why does Apple not want them to be used?

Private APIs are not meant for use by third-party developers. They are not publicly documented and are subject to change without warning.

Often, private APIs are used to track users, because they allow reading identifying serial numbers from the phone.

Apple markets privacy as one of the features for the iPhone.

In the technical sense it sounds like a minor bug; in its other implications, not so minor. Perhaps another mistake on their part was not being clearer with their language.


> hahaha do you even code bro?

Yes I do. https://github.com/theShiva/

'No excuse' because of the context and sensitivity. It's a different issue if the programmer was leaving around some code-execution log files or non-sensitive data on the disk...

A file containing recordings (audio & video?) of a sexual experience doesn't constitute 'temp file clean up bug' in my world. That said, we may be working in different worlds.

One nice thing about their hardware, though, is that it is very easy to write custom software to control it. Almost concerningly easy, actually. See the Metafetish project for one example of this: https://github.com/metafetish

This is my new favorite Github project, but I only read it for the app names.

The issue was probably blown out of proportion by writers who are encouraged to come up with click-bait headlines.

Oh wow. While recording audio is concerning, I am so glad to be living in the timeline where I can control my sex toy with my smartphone. The next thing they need is an Alexa skill (which someone might already have).

Feature or bug, the company deserves no mercy; you should triple-check those things, after you quadruple-check them.

BUT, personally I assume that anything I do with a smartphone gets uploaded to the cloud, whether via a permission I mistakenly gave, a bug, or a hack. In other words, certain kinds of pictures, movies, or acts don't (IMO) mix with a smartphone.

Incendiary title, even if correct. There is no proof whatsoever the recordings were uploaded anywhere.

Not sure why you were so heavily downvoted; the title explicitly says "recording users' remote sessions". To record something remotely heavily implies a connection between two places. If it was only recording on the device and nothing else, there would be zero reason to say "remote", because it wouldn't be remote; it would be a local recording.

...where does the title say they were uploaded? It says they were recorded. The company admits they were recorded.

It says they recorded it, which strongly implies they had the data and their servers did the recording.

Recording something in a technical context generally implies that it's accessible to the person or group that caused the recording to happen. That is not the case here, and the headline is somewhat misleading.

> admits recording users' remote sessions

To me, this sounds like the recording of remote sessions which would involve some sort of connection (otherwise how would it be remote if it's only on the device itself?)

The title is crafted to imply that the sex toy company was recording its users, which it could only do remotely. But hey, titles sell.

"Sex toy company admits recording..." implies that the company did the recording. A more accurate title might begin "Sex toy company admits product records..."

Hence the "incendiary, even if correct". Just the "admits recording" is incendiary enough. Or do you think the title wasn't written with the purpose of being clicked upon?

That's a real humdinger!

Just like it was "accidental" when Google's street view cars were capturing wifi data.

Yes, likely. It's plausible that Google captured WiFi signals in order to get the SSID etc. to build a location database, and forgot to cut off the data part, possibly assuming it was encrypted anyway. Considering that they announced the mistake themselves, and the short time the car was in receiving distance, this is quite believable.

Similar here: The developers logged information onto the device itself in one version of the software and didn't hesitate to push a new release.

From an engineering perspective, these are relatively small bugs. The impact is a bit larger.

> and they forgot to cut off the data part, possibly assuming it was encrypted anyway

At the point in time when Google pulled that stunt, unprotected networks were still incredibly common. A company like Google playing dumb on that, or actually not knowing that, would be pretty sad.

Let's assume they did that on purpose: what would they gain? Some random fragments of communication? They already have more relevant info about anybody via Google Analytics and such than what they'd gather while driving by...

You’re being facetious, but it’s more like when the whole “Apple is tracking you through your iPhone” thing was going on - in that case, it was similarly a local database which was recording data which should have been erased, but it wasn’t transmitted to anyone.

I can completely see how this sort of thing could happen accidentally - not that it’s a great excuse for sloppy work, but mistakes do happen.

They only call it a 'minor bug' because nobody's discovered the 'major bug' yet. Gotta leave yourself some wiggle room (so to speak).

I believe we should push Google, Apple, etc. into taxing permission usage: say, 2% for every permission if you use it and 5% if you require it but don't use it, so that if common sense doesn't work, at least greed will encourage programmers to request access only to what is really necessary.

How about just giving users the ability to turn off a permission? Like Google already does.

I use that a lot to turn off contacts access to apps that have no business reading my contacts.

The problem is that 99% of users don't know about them or don't know how to disable them. The damage was already done when years ago apps started to require permissions for hardware or data they shouldn't have access to, so that users now blindly accept every request because "the app comes from the official store, so it is safe".

*Like Google added in Android 6, but Apple has had for years.

How would you tax free apps?

In many professional contexts, a bond is required to be paid.

In the Free Software world, packaging of applications generally involves a third-party who does that packaging, and has specific guidelines to adhere to. E.g., Debian Package Maintainers and the Debian Policy Manual.

Do not permit them. Free apps are just a way to reinforce the user-is-the-product problem that exists today.

Note: I don't agree with a "permissions tax."

You may also want to purchase your phone from a company that isn't the world's largest advertising company.

It's kinda funny that everyone is upset about a temp file when the device itself sends their entire porn history to Google every day of the week.

Basically, if you don't want to be spied on, stop using Google products.

The problem is now that turns into a cost-benefit problem. Can we make back the 5% that we're paying to get access to the phone book by selling that data? Chances are we can, so it's worth it.

I feel that could encourage them to lower the number of permissions though, which I'd argue is worse.

It does make sense to pay a fee for usage of the hardware, if only to guarantee legitimate usage.

That was my point: pay a little if you use it and pay more if you don't, which should discourage lazy programmers from asking for access to everything by default "just in case I need that later" and then leaving potential holes in every app they release. Not to mention the bloat involved.
