Apple hires one of the developers behind Signal (techcrunch.com)
467 points by scottyates11 on Feb 26, 2016 | 113 comments



This article seems more than a little silly. They blew up a single tweet into an article about Apple's corporate strategy in relation to the FBI.

What next? Are they going to dig through Apple employees' trash, looking for variations in the number of credit card offers?

"Apple Employees Load up on Credit"

"Investigators have uncovered a 10% uptick in the number of accepted credit card offers from key Apple employees. Speculation about Apple's poor recent performance seems validated by their own employees obtaining as much cheap credit as they can get before the inevitable catastrophe approaches. Leading VCs interviewed had this to say: 'We always recommend to our partners that they obtain credit during times of prosperity, so that they don't need to unnecessarily dilute their shares by raising money in a downturn. If you're profitable but don't need the money, it's a great time to at least seek a line of credit from your bank.'

Apple representatives declined to comment on this article, possibly wishing to delay the bad news until the next shareholder meeting.

Next up: Microsoft reallocates its purchases of employee free soda to 20% Coke / 80% Pepsi. But what are the impacts on its cloud computing business?"


I know you joke about looking at credit card info, but it reminded me of this story[1] where fraud researchers at a credit card company (ab)used their access to their customers' credit card transactions in order to mine the data and perform fundamental research on various companies' retail performance.

They then used this information to trade those companies' stocks just before earnings releases, and made a lot of money. They were eventually caught by the SEC because their trading was deemed suspicious, i.e. their options bets always seemed to work out.

1. http://www.bloombergview.com/articles/2015-01-23/capital-one...


So basically, they did their homework and got penalized for it?


>Here, Bonan Huang and Nan Huang allegedly got the information from their employer, Capital One, which was supposed to have exclusive use of the -- hey, wait a minute, does that mean that Capital One was allowed to trade on this data for its own profit? Wouldn't that be amazing? Surely the answer is no: I assume that Capital One signed agreements with retailers (or rather, with Visa and MasterCard, which signed agreements with retailers) in which it promised not to disclose transaction data, or use it for nefarious purposes. Really anyone who used this data would be misappropriating it from, ultimately, Chipotle. Which gets to keep its sales data to itself. Except once a quarter when it releases that data and the stock jumps.


I see everyone's point, but they did write one hell of a SQL statement, and having written a few of those in the past (seems typical with ANY kind of "report" for some reason), it's too bad they were penalized for it instead of just being reprimanded and having most (but perhaps not all) of the profit confiscated. Cleverness, if not overtly malicious, should not be punished too harshly IMHO.
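
For illustration only, the kind of per-merchant aggregation at issue might look something like this minimal sketch (hypothetical database, table, and column names; not the actual query from the case):

    import sqlite3

    # Hypothetical schema: transactions(merchant, amount, txn_date)
    conn = sqlite3.connect("card_transactions.db")

    # Total spend and transaction count per merchant for one quarter,
    # the kind of aggregate one could compare against prior quarters
    # ahead of an earnings release.
    query = """
        SELECT merchant,
               SUM(amount) AS total_spend,
               COUNT(*)    AS txn_count
        FROM transactions
        WHERE txn_date BETWEEN ? AND ?
        GROUP BY merchant
        ORDER BY total_spend DESC
    """
    for merchant, total_spend, txn_count in conn.execute(query, ("2014-10-01", "2014-12-31")):
        print(merchant, total_spend, txn_count)
    conn.close()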


If I set up cameras with visual recognition software in front of a statistically significant number of Chipotle branches to count the number of customers and base pre-earnings security purchase decisions on that, have I committed fraud? What they did seems equivalent to that, in a way.

I mean, that's basically a company idea right there, if someone hasn't already done that
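
For concreteness, a minimal sketch of that kind of customer counting, assuming a hypothetical camera feed and using OpenCV's stock pedestrian detector (illustrative only, not a claim about what any firm actually does):

    import cv2

    # Stock HOG-based pedestrian detector that ships with OpenCV.
    hog = cv2.HOGDescriptor()
    hog.setSVMDetector(cv2.HOGDescriptor_getDefaultPeopleDetector())

    # Hypothetical camera feed pointed at a storefront.
    capture = cv2.VideoCapture("storefront_feed.mp4")
    total_detections = 0
    frames = 0

    while True:
        ok, frame = capture.read()
        if not ok:
            break
        frames += 1
        # Detect people in the frame; each rectangle is one detection.
        rects, _weights = hog.detectMultiScale(frame, winStride=(8, 8))
        total_detections += len(rects)

    capture.release()
    if frames:
        print("average people per frame:", total_detections / frames)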


It's not equivalent because the data was given to them under certain terms that preclude such uses.


For a long time, satellite imaging of things like WalMart parking lots (used as a proxy for sales) has been something analysts and traders look at.


It's still a different scenario. Cops, up until a few years ago, used to think they could stick GPS devices on people's cars without a warrant because they were legally allowed to tail and observe people driving in vehicles without a warrant. The GPS device, like a lot of technology, allowed them to "observe and tail" hundreds of vehicles all at once.

The SCOTUS stepped in and said the practice was illegal without a warrant: even though a vehicle moving around a city is in plain and public view, monitoring hundreds or even thousands of vehicles from a central location with only a handful of officers is outside the scope of what allowed them to hop into a car and follow someone, which would require hundreds of officers and hundreds of vehicles observing with their own eyes.

It's the same idea. Getting access to data that is not public (the credit card transaction data, not the satellite imagery) in order to profit from a publicly traded stock does not create a level playing field. Semi-realtime satellite imagery, on the other hand, may not be completely public, but it is publicly available data (possibly for a fee, from the operators of satellites that weren't built specifically to observe Walmart parking lot capacity). I would argue that it's still a grey area, since you can only interpolate sales from a tangential dataset like parking lot capacity. But access to actual transaction history from the stores correlates directly with their sales and revenue, which drives their eventual stock price.

I don't see how anyone could argue that they were "just doing their homework". They were subverting a system for financial gain. They weren't taking data that anyone could obtain and applying a novel approach to interpret tangential sales figures.


If, by "homework", you mean fraudulently/inappropriately obtaining information from their employer in order to profit from it, then yes, I guess they did their homework.

I'm not detracting from the effort and skill they demonstrated in their endeavor (which IMO was substantial), but the amount of effort and skill put into an activity doesn't necessarily determine criminality/inappropriateness or lack thereof.


> Microsoft reallocates its purchases of employee free soda to 20% Coke / 80% Pepsi.

You joke, but I bet you could get a leading indicator if you looked at companies reallocating from European fizzy drinks to Coke/Pepsi.


> They blew up a single tweet into an article about Apple's corporate strategy in relation to the FBI.

This sort of thing happens all the time with "news" blogs. You're just particularly attuned to this one. The writer takes a smattering of fact, fits it into the existing narratives the news media is already telling itself, and bloviates until they have something to staple advertisements to.

> What next? Are they going to dig through Apple employees' trash

Yes. News organizations that should know better pad their pages with exactly that. Constantly and forever. The trashcan is typically metaphorical, and the rank and file employees are boring.


>>> "Investigators have uncovered a 10% uptick in the number of accepted credit card offers from key Apple employees. Speculation about Apple's poor recent performance seems validated by their own employees obtaining as much cheap credit as they can get before the inevitable catastrophe approaches.

That is exactly the sort of thing investors might look for. The personal behavior of executives is very telling. It is more so for privately-held corps, but can be applied to Apple. Seeing an exec liquidating assets or taking on apparently unnecessary debt can speak to that exec's future plans, which are tied to corporate moves. Some investors watch family members. A wife/mistress/girlfriend/husband shopping for a new house out of town may be a sign that the exec is about to leave one firm for another. How they plan to finance the purchase also counts.

One big tell, especially with startups, is communications with particular immigration lawyers. Execs facing a windfall often want to abandon their US citizenship in favor of somewhere with better tax treatment. See Saverin at Facebook. So any communication with lawyers specializing in this process is a good sign of a buyout in the works. Either that, or they plan on winning the lottery in the coming months. You have to finish the process before you win the money.


+1.

The deal was probably on its way well before the FBI scandal started.


Oh, they hired a developer behind Signal. No offense to Mr. Jacobs, I'm sure he is an excellent developer. But I saw the headline and assumed they had grabbed Moxie.


Everyone who has ever known, or known of, Moxie thought the same :D I think we'd all love to see him be CCO (Chief Cryptography Officer) or something similar for Apple. Not to diminish his work at Whisper Systems, but talent like his should be reaching the hundreds of millions of customers that Apple can reach. Moxie, I know you hop on Hacker News every so often, so if you read this: would you go work for Apple? Or are they too closed-source for your tastes?


> I think we'd all love to see him be CCO (Chief Cryptography Officer) or something similar for Apple.

I assumed they were referring to Moxie, and my initial thought was that it would represent a sad loss of autonomy for him.


I don't see Moxie ever joining a company like Apple.


I for one hope that his efforts aren't wasted within one narrow ecosystem.


Exactly. Speaking as a user of Signal on Android, Apple is the last place I want him to be!


I'd honestly prefer he remains independent of Apple.

I have no desire to purchase Apple products, and their picking him up would probably be a loss for Signal, which I actually use.


Agreed. Apple's security culture is not compatible, in my opinion, unless Apple changes, which is unlikely.


I assumed the same


Hmm, so Apple just hired away the dev of pretty much the only secure, open, cross-platform iMessage alternative?


Frederic Jacobs has been planning on moving away from Open Whisper Systems for a while. They didn't hire him away, he took another job.


I think this move shows that Apple is serious about security. They previously assessed the risk of a government-ordered backdoor as low and the potential for bugs in the Secure Enclave as higher, and hence made the trade-off to allow signed updates.


The CoreOS (https://coreos.com/) security team, or just the core OS security team? If the former, I'm curious what Apple's involvement with that project is.



Apple's internal OS development team is called the core OS team, totally unrelated to coreos.com


Does anybody see through these PR plays? They've unlocked many phones in the past for the government; they're protecting their technology and using the moral issue to look good at a time when they're still majorly losing their way. To me this looks like governmental appeasement.

Shutting down Snowden's and others' methods of private communication is a fantastic gift to a government that doesn't want more of that kind of scrutiny, or more people talking badly about the NSA; there are already enough who think it's a major problem. What a perfect guise to get it done under another company's name, one that also happens to be having a great PR week on the back of data they gave up, or are going to give up anyway, as they always knew.

I wish more people would think for themselves, or at least consider why the script might not be reality. They hired him! What happened is that a formerly non-corporate, secure, private form of communication is now... who knows what. Maybe the government just figured out how to deal with the next Lavabit without facing more backlash. Nobody trusts them right now, yet everybody seems to love this Apple letter PR play.


>They've unlocked many phones in the past for the government

I do not really see why this is always brought up. Of course they unlocked phones in the past: they had a master password, and they could not legally refuse to do it. There was no legal way for them to resist such actions by the government.

Do you understand the difference between the security model today, and previous versions of the iPhone?

Further, I do believe there is a fundamental difference between Apple run by Steve Jobs and Apple run by Tim Cook in how they view government. This is why you're seeing Apple shift its technology to resist government agents as well as more "traditional" threats.

//For the record, I hate Apple's business model and their walled-garden ecosystem. I will never own an iPhone because of that. However, this ongoing theme of "well, they unlocked it in the past" is just technological ignorance that needs to be put down.


What do you mean by hating their business model? Genuinely curious. I don't like Google collecting and selling my personal data, so I don't use their products. What is your reasoning for "hating" Apple's model? It seems pretty much identical to other companies' except that they don't collect and sell user data.


>What do you mean by hating their business model? Genuinely curious.

They do not support open protocols, they do not support interoperability, and they want too much control over the device I supposedly bought from them.

I have to use their App Store, their operating system, their backup (iCloud), their desktop app (iTunes), etc.

There is no F-Droid for iOS, for example.

I hate walled gardens.

>Seems pretty identical to other companies except they don't collect

They may not sell it, but they are certainly collecting data about you...


All kinds of things went wrong after Steve left us. They've lost their way.


In what way?

You preferred the less secure, work-with-the-government-in-all-things Apple? Because that is what Steve Jobs' Apple was...


I preferred when the software didn't suck and got out of the way. I preferred when they built truly world-class hardware with out-of-this-world materials. I'll keep all the downvotes, fuckers, no prob.


Yes. And I'll say it again: when Apple revised their privacy text (around 12 months ago), it looked to me like a warrant canary had died.

Replaced with text like

"We care about your privacy." "We protect you with all legal means available."

And then there's PR like

http://www.dailytech.com/Feds+Cant+Crack+Apples+iMessage+Enc...

Maybe it's true. But are you really that trusting?


How can you possibly be so cynical that you think this? Be an Apple hater all you want, but your comment is just silly. If Apple has "unlocked many phones for the government" before, why does the government have so many they want unlocked? A warrant canary can't "look to you" like it died. It did or it did not, and it most definitely did not. You're blowing this almost as far out of proportion as the article. Where is your proof for all these prior phones they unlocked??


Yes, this could all be smoke and mirrors; Apple could already be in bed with the USG, just playing us and taking us for suckers. But setting those legitimate concerns aside and assuming this whole controversy is legit, I think Apple's position is vulnerable given their tax policies: hoarding cash reserves in offshore subsidiaries and holding off on repatriating those funds to the US.

I think the USG could really twist Apple's arm, take them to court over trumped-up tax evasion charges, and force Apple to cooperate with them on the FBI issue.

Let's just wait and see how this interesting story evolves and concludes before passing judgement too early.


While I agree that closed-source privacy protection isn't trustworthy, and therefore isn't sufficient, that doesn't mean that Apple doesn't intend to protect Apple customers.

There's a big difference between not going far enough and deliberate deception on this topic. There are also multiple stages to the battle for privacy, and most technology products and services are relatively far behind Apple in that progression.


Not sure if I believe that Apple acqui-hiring this developer was a concession for the bad press they've been giving the government lately. It was his choice to work for Apple; I'm going to guess that they didn't coerce him into taking this deal.


Conjecture: Isn't Apple's private signing key already a "master key to turn 100 million locks"?

I.e. the key they use to sign software updates. With that key, someone could create malware and sign it... Apple creating the malware just saves them a step. Ergo the "target on that piece" is already pretty high value, yet Apple is able to keep it secret and stay prepared for contingencies (like rotating the key).
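
To make the conjecture concrete, here's a minimal sketch of the general code-signing model, assuming an Ed25519 key pair and Python's cryptography library (illustrative only; nothing here reflects Apple's actual scheme):

    from cryptography.exceptions import InvalidSignature
    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

    # The vendor's private signing key: whoever holds it can produce
    # an update that every device will accept.
    private_key = Ed25519PrivateKey.generate()
    public_key = private_key.public_key()  # baked into the device/boot chain

    update_image = b"hypothetical firmware or OS update bytes"
    signature = private_key.sign(update_image)

    # What a device does before installing: verify the signature against
    # the embedded public key. Any payload signed with the private key
    # passes, which is why the key itself is the "master key".
    try:
        public_key.verify(signature, update_image)
        print("signature valid: device would install this update")
    except InvalidSignature:
        print("signature invalid: update rejected")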

Thoughts?


Well, this is true for any form of authentication. If you have information you need to update, you need a form of authentication, and authentication data can be lost or stolen. You just need good routines limiting access to that data.

This is a problem for signing software, but also for things like updating their webpage and content on the App Store. All these systems need authentication data to exist, and if it falls into the hands of people with malicious intent, it can be abused.


So what does this say about Apple's claim that a "master key" is too dangerous to create? Don't they already have that: something that hackers could use to unlock iPhones? Doesn't that danger already exist? (Again, this is meant as thought-provoking conjecture.)


Yes, Apple has never denied that it is possible for Apple to create a signed build of iOS with some of the security stripped out. They just point out, rightly, that it is not a good idea.

It follows that this is a pretty thin layer of security.

And it seems that Apple's signing keys are well-protected, high-value targets. Has Apple been "able to keep it secret"? As far as we know, yes. But we don't know everything.


I don't care about Apple, but I hope this won't end badly for Signal.


This news is another +1 for Apple in my view.


Congrats on the new gig. I wonder if this is a sign of bad financial health at Open Whisper Systems?


Frederic Jacobs announced he was looking elsewhere some time ago. I don't have any insight into Open Whisper Systems' internals, but considering they're still committing code and posting new job openings, I suspect this has less to do with Open Whisper Systems and more to do with Jacobs wanting a change of scenery.


Apple is not known for high salaries, so I wouldn't jump to that conclusion. There are lots of reasons beside money why people switch jobs...


Good for Apple. Maybe he can help critique Apple's security methodology. It will be interesting to hear what he works on and how he finds Apple's security systems.


I can understand how people want to put puzzle pieces together, but this is completely idiotic.

Whatever remaining security holes there are with secure enclave, they have nothing to do with a software chat app.

This is entirely coincidental and has nothing to do with anything.

TechCrunch should be ashamed of itself (again) for being such a douchebag.

Edit: I'm not saying Apple hiring the guy is stupid. I'm responding to the hattery from the article itself.

As a hire, it makes sense. But trying to decide that it means "Apple is now serious about security" is just a bunch of horseshit on both ends.


It's arguably a poor, baity article, but please don't rant like this on HN. It lowers the quality of the discussion and usually sets off a degrading spiral (as below).


I apologize for the rant. It was misplaced here. If you feel like it needs to be deleted, I'm okay with that.


No, but thanks. We really appreciate your understanding.


Is it idiotic to assume that a company embroiled in a debate about privacy and security for a communication device (the biggest driver of revenue for the company) hired someone in the secure communications space to work on communications products?

Also, Apple has a PR problem and can't operate without secure systems. Article title notwithstanding, it is a pretty big deal that, while an intelligence agency is coming at them hard, they very publicly hired a developer whose application is used by the very person who made the evidence of surveillance known.

This could be a signal to the market that they not only passively oppose this, but are actively locking down their systems and won't cooperate. He seems like a very sharp developer, and as a bonus he did secure messaging, so it is not idiotic.

edit: I amended the post to reflect that he is likely not working on the iPhone directly.


I will argue that this guy has none of the skills needed to up the ante on the current security model of the latest versions of iOS. What's not known is how the Secure Enclave works. But what is known is that it's firmware.

Something very much outside what we know about the secure chat app.

We also know that iMessage has never been shown to have any fundamental security flaws.

I tried to clarify above, and I'll do so here again. I don't think the hire was idiotic. I think TC's characterization of him as a security messiah was idiotic.

That is not anywhere close to reality.


It is a strong signal to the market that they aren't cooperating and are, in fact, actively hiring to get to market with something that is non-trivial to break into.

I don't know about his engineering abilities but the interview I read and some of the news articles presented him as quite a talented person. Signal, if it is as secure as the EFF audit suggests, would be one way to shore up older iPhones.

> I think TC's characterization of it was idiotic.

I mean, if we grade it on the TC scale, it wasn't. It is hard to say it is unrelated. Their communication device is very publicly being regulated into compliance, and they want to hire all the good people they can get. This is good on three levels: solid engineer, strong communication to the market, and that commitment brings in other solid engineers.


"I will argue that this guy has none of the skills needed to up the ante on the current security model of the latest versions of iOS."

What are you basing that assertion on?


I disagree and think this is potentially big news. You push us to relax security... we push back by trying to make a play in the secure chat market (for everyone, not just iPhone users), which would make your life a lot harder.

The market is tough, but it would be interesting if Apple actually entered it. They have enough power to seed the needed network effect with a large enough user base. I think this entire saga has actually opened up a nice spot to push really hard for the "secure by default" positioning slot. It's been done by a lot of people, including Apple, before, but I think we're at a point in time where the media echo might be good enough for a big company to make a true positioning play. It's also a great differentiation against Google/Facebook. Apple has voiced, in other words, the "essentially our competitors are in the we-make-money-off-privacy-violations business" line, but they might want to hit that harder soon. A bit fickle, since you need FB/Google in the "security now" alliance, but still interesting.

I'm still skeptical about closed source software for secure X but I guess it's better than nothing.


The worrying thing is that Apple really has a terrible record of making their solutions available on other platforms. Secure communication software that only works well on Apple platforms, with a half-broken solution for perhaps one more, is not really the direction we want to move in :/


It's not idiotic, it's interesting news given the climate. They didn't say what his role or project will be. What's wrong with reporting on Apple hiring a developer of one of the most popular secure messaging tools?


Are you blind to the difference between reporting an event and interpreting the event badly?


Please quote the article where you feel it interpreted events poorly. And be civil.


How about the first paragraph?

    > Apple hires plenty of interns all year round, but one particular addition
    > revealed this week caught the eye given the company’s current position
    > opposing a controversial order to enable the FBI to access the iPhone used
    > by one of the San Bernardino shooters.
// Of course it's worth writing about, but it certainly would have been higher quality reporting if they didn't immediately link it to the FBI story.


That is hardly a misinterpretation of events. They're just saying,

"hey, there's this software developer who's done some work in the field of security on an app which is famous, and he's going to work at Apple during a time when some issues with Apple's security are in the news. And we noticed and we want to share that with you"

It's interesting. Tech Crunch can write about whatever they want. If you don't like the article, downvote it and move on. Perhaps Ian is just jealous nobody is writing an article about him, because he is clearly smarter than this developer.


TechCrunch can certainly write whatever they want, but it becomes problematic when they think a single tweet is newsworthy.

When news outlets start writing puff pieces about memes[0], you know that we've all collectively hit rock bottom.

[0] http://qz.com/622001/damn-daniel-the-new-viral-meme-is-gener...


> If you don't like the article, downvote it and move on.

Generally it's courteous to leave a comment explaining a downvote before moving on.


[flagged]


I read it, Ian. I didn't see them call him a messiah anywhere. Is the word transparent like your comments are now?

Also, I'm no longer a student in the traditional sense. That's sort of a life mantra of mine, to be perpetually learning. You can think of it as the opposite of your world, in which you think you know everything.


It does make sense to hire a guy who has had great success in security. The chat app is just one of the use cases he handled, and good experience in designing secure software always helps.


No, but this is very likely to be them tightening up other parts of their software stack.


Oh right. All the other insecure parts of their software stack. You know, all that other insecure stuff that's notoriously insecure. That one guy who wrote a chat app is going to tighten up.

Come on. Don't pretend this is anything more than it is. A really hard-working guy worked hard and built a thing that was worthwhile. Apple said, "hmmm, it would be easier to buy this person than to hire him." So they did.

There is no one-person fix to secure enclave or any of Apple's other problems. You are being delusional. Apple's problems, such as they are, are systemic and cultural. Apple cannot buy its way into better cloud services or better Siri, or better security, and certainly not with the purchase of such a small company.


> Apple said, "hmmm, it would be easier to buy this person than to hire him."

Those two things sound the same to me. The guy was hired. What are you saying here?

> There is no one-person fix to secure enclave or any of Apple's other problems

Nobody said he's going to work on that.

> Apple's problems, such as they are, are systemic and cultural. Apple cannot buy its way into better cloud services or better Siri, or better security, and certainly not with the purchase of such a small company.

You seem to know a lot about Apple's culture. Do you have some evidence to support your claims?


> All the other insecure parts of their software stack

I didn't say all the other parts, or that he's going to do it singlehandedly. Maybe they want to improve end-to-end encryption for iMessage or similar and figure he's got relevant experience.

> Apple's problems, such as they are, are systemic and cultural.

Possibly, but even then, I would argue that this current situation is a culturally defining moment for post-Jobs Apple, maybe even strong enough to override other parts of their culture.

One thing is for sure: it's being driven from the very top down, and Tim Cook is making clear, unequivocal comments about where the line in the sand is.


> "hmmm, it would be easier to buy this person than to hire him."

In what way did they buy him?

Apple didn't buy Whisper Systems. They hired Frederic Jacobs.


There are probably not many tech companies who would turn down an internship applicant with this guy's résumé. So, yes, no puzzle to be pieced together here.


He's not been hired to work on messaging though. He's been hired to work on the Core OS team, i.e. the low level parts of iOS and OS X.


Why is this about Edward Snowden?


I'm starting to think Apple has found its first viable post-Jobs narrative.


I would say that this has been Tim Cook's narrative for a while, and along this path we've seen iOS integrate things like WiFi MAC randomization and website ad blocking.


> I would say that this has been Tim Cook's narrative for a while

While part of it may be PR, I also believe these are Tim Cook's values shining through. Having an orientation that gets you jailed or killed in some countries makes you value individual freedom and privacy.

It's great to see this new Apple.


> Having an orientation that gets you jailed or killed in some countries

Cook was born in Alabama in 1960. For a long time, his orientation could get him jailed or killed in his place of birth. Hell, as of 2016 Alabama still is hardly a good place for people with Cook's sexual orientation[0][1][2].

[0] https://en.wikipedia.org/wiki/Roy_Moore#Same-sex_marriage

[1] http://www.npr.org/sections/thetwo-way/2016/01/06/462161670/...

[2] 13A-6-65(3)[3], though ruled unconstitutional (since 2003 and Lawrence v. Texas), is still on the books

[3] http://codes.lp.findlaw.com/alcode/13A/6/4/13A-6-65


Why does his orientation have to be included when we discuss his ideals and motivations?


Why not? He said himself that this gave him a deeper understanding of the struggles of minority groups:

> Being gay has given me a deeper understanding of what it means to be in the minority and provided a window into the challenges that people in other minority groups deal with every day.

Source: http://www.bloomberg.com/news/articles/2014-10-30/tim-cook-s...

I think it is relevant to the discussion, because I believe that this is a much deeper motivation than profit maximization. Though, there are of course many other ways to reach the conclusion that privacy is important.


And on top of that, he probably understands the tactics the FBI used against civil rights leaders in the 60s and before (and even after).


If your opinion is that his orientation influenced/drove his ideals and motivations, it's relevant to mention. I don't think the gp is arguing that's the only way to have those ideals.


In a perfect world it wouldn't be, because it wouldn't make a difference.

Although things are better these days than there were, there is not complete equality - and I am not speaking legally.

There is no need to "come out" to your friends and family that you are straight and like girls (if you are male and vice versa).

Until there is no bigotry, until it literally makes no difference what sex you prefer, something like this will have an effect on your life; thus it may have played a part in Tim's view regarding his current position.


He (or she) literally said why in his (or her) post. If you are part of a group for which discrimination is very real, then you naturally value privacy laws much more.


It is a matter that is very private to some people, and he gets that. Not all people understand at all what personal privacy means.


> While part of it may be PR

It definitely is. The NSA has access to all this data anyway. They just need to get it nailed down legally now, as far as possible.

Apple just can't be seen as collaborating.


> The NSA has access to all this data anyways

How so?


This seems like a disingenuous question.


You can't seriously ask that question if you have been following the Snowden leaks, the corrupt behavior and proven lies of certain very powerful government officials.


And Apple is owned by the government?


1. Is the NSA capable of doing it? YES
2. Does the NSA have a reason to do it? YES


> Is the NSA capable of doing it? YES

Serious question - is it?


Serious question. Is it safe to assume that the NSA can't? NO

The NSA might or might not share with the FBI, and the FBI might or might not be able to use what's shared in court, which could explain why they want new authority.


I was simply questioning the bald assumption that the NSA already has access to this, not suggesting that we should assume the opposite.


Have you heard of Ernst Röhm, the Nazi general who was openly gay during his tenure?

Apparently his homosexuality and the resulting stigmatization didn't stop him from espousing horrendous values and ideals.

Someone's sexual orientation has no weight on the values system that he/she would subscribe to and thus shouldn't be taken into consideration in any serious discussion about the topic.


> Someone's sexual orientation has no weight on the values system that he/she would subscribe (...)

It _may_ have no weight, indeed, as your anecdote illustrates.

People tend to empathize more with people they have something in common with (some research suggests [0][1]), so I think many of us would certainly have a mindset affected by our minority-held sexual orientation, in matters where our sexual orientation actually plays a role (security being one of the more obvious cases).

[0]: http://psp.sagepub.com/content/36/3/398

[1]: http://journals.cambridge.org/action/displayAbstract?fromPag...


The OP's statement was basically that being homosexual automatically makes him a lover of individual freedom and liberties, to which I objected, exposing the flaw in that argument by citing a historical example of a famous openly gay person who was all for everything that's the opposite of freedom and liberty.

If the OP's argument had been limited to only the privacy part, I'd have tentatively agreed with him/her, as it's an undisputed truth that gays under persecution, or living in discriminatory environments, favor privacy intensely, and therefore it could be argued that this influenced Tim Cook's decision in the apparent fight with the FBI.


  > I found a cat that does not like milk.
  >
  > Therefore, if I see a cat, I can draw no inference
  > about the probability that they like milk.
This is not correct. It is correct to say that having found a cat that does not like milk, we know that cats do not necessarily like milk. That is valuable and relevant.

But nevertheless, we can still have some confidence that, absent other indicators, cats like milk.

p.s. Here’s another example closer to home of a “cat that did not like milk:” Roy Cohn, who carried out anti-homosexual witch hunts, and later died of AIDS.

Edit: Adding link to https://en.wikipedia.org/wiki/Roy_Cohn


This is all a waste and showing off, as far as I'm concerned, until they go open on everything.

With this move, they will also waste a very valuable developer's skills (crypto experience ain't cheap).


I am not, nor have I ever been, a big fan of Apple (apart from all the usual complaints, I am one of those freaks who just doesn't like Apple's interface & design). But this is a step in the right direction. Would it be better if it were more open? Sure. Is it useless if it isn't? Not really; it still protects users and puts pressure on other manufacturers to do something similar.

Who knows, maybe internally they're even looking at making things more open, but that could require a lot more work w.r.t. scanning for patents and such in their code, and perhaps they want to have a strong, solid release of whatever they would consider a "full system" before they put up their code for all to see.

Apple doesn't, historically, have much of a record of opening up, but they did open-source Swift recently. Perhaps new winds are starting to blow, like they did at Microsoft.


> This is all a waste and showing off, as far as I'm concerned, until they go open on everything.

What would that bring them? Other companies would rip off iOS, bastardize it and put their slow skins on it, and never release security updates (or modified source files).

I agree that opening up security infrastructure would be good though.


I agree with you that there is no privacy protection one can rely on against the highest-level state-actor threats without open tools for privacy.

BUT Apple appears to be effective at keeping the contents of smartphones out of the hands of police and prosecutors and out of the courts. If you can't distinguish between those levels of protection, you are allowing the perfect to be the enemy of the good.


So having his work on billions of devices is a waste in your view? OK.


I'm inclined to agree, in that they can't really be trusted if they can't be thoroughly audited, and all the lockdown is an obstruction to that.

On the other hand, I think they could really be doing good things behind the veil, and that could benefit very large numbers of people who don't have the knowledge or inclination to defend their own communications (and anyone who has the knowledge and inclination but also the misfortune of needing to communicate with those who don't).

I don't know anything about Jacobs beyond what we've just seen here, but I would guess someone who has worked on that level with Open Whisper Systems wouldn't be prone to accepting poor security design, nor to accepting unethical practices in handling user information. I'd be much happier with an open Apple Inc. too, but as long as it keeps standing for a closed and locked environment, Jacobs seems like just the kind of person I would want working there.


I'm a big fan of using open source software to build a business on - particularly BSD/MIT/Apache (aka "permissive") licenses - but the idea that "Open Source === Audited" is laughable.

How many huge bugs have been discovered in very widely used open source libraries/applications and identified as having affected the software for many years?

Would you be satisfied if Apple provided the option for NDA-sealed access to the source, allowing people/researchers to view (but not redistribute) their stack?

Edit: fixed brain shart (extra word)


Heartbleed is a classic example.

OpenSSL was vulnerable from the end of 2011 until it was fixed in mid-2014.

And it's one of the most popular and commonly used open source technologies.


>Apple Hires Developer Behind Signal, Edward Snowden’s Favorite Secure Chat App

A "secure" chat app that depends on Google Play Services (spyware) and is only available through the Play Store (rather than F-Droid, an open source software repository for Android) and maintained by an author who refuses to integrate fixes to either of these problems upstream.

For those wondering whether Google Play Services really is spyware: one of its purposes is to backdoor your phone for Google so they can _silently_ update any of their apps on your phone. It has access to _every_ Android permission and can (and does) grant any permission to any app silently. It also monitors your location and reports it to Google, along with brief voice snippets for "OK Google", as well as a list of all apps installed on your phone, and more. It's definitely an awful thing to have on your phone if you're privacy conscious.


You're welcome to inform yourself on the subject before posting crude FUD: https://github.com/WhisperSystems/Signal-iOS/blob/master/BUI...


I encourage you to do the same:

https://github.com/WhisperSystems/Signal-Android/issues/127

Security should be available to all, not just those with the environment and know-how to compile apps from source. Doubly so on iOS where you have to pay x dollars for a developer license.



