Hacker News

I don't think disabling the enterprise certs was particularly moral, Facebook and Google were flagrantly violating the terms of the enterprise program. Apple also apparently didn't even notice (or didn't care) until articles about it started getting a lot of attention.

Apple definitely does make some commendable decisions, but I think it's also important to distinguish between bravery and what Ben Thompson calls "Strategy Credits" (https://stratechery.com/2013/strategy-credit/):

> Strategy Credit: An uncomplicated decision that makes a company look good relative to other companies who face much more significant trade-offs.


> Apple also apparently didn't even notice

Do they have any information about enterprise apps? As I understand it, Apple never phones home with app info (such as the identifier, name, etc.) when verifying or installing enterprise-signed apps, so the only things they likely know are the IP address requesting verification of the enterprise-signed app and how often Apple devices perform this certificate check.
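To make the point concrete, here is a conceptual sketch (not Apple's actual protocol, and the field names are illustrative) of what a standard OCSP-style revocation check carries. Note that it identifies the signing *certificate*, never the app:

```python
# Sketch of an OCSP-style revocation check request. A real request
# identifies a certificate by issuer hashes and serial number only --
# there is no field for a bundle identifier, app name, or anything
# else describing the app being launched.

from dataclasses import dataclass

@dataclass
class RevocationCheckRequest:
    issuer_name_hash: str   # hash of the signing cert's issuer name
    issuer_key_hash: str    # hash of the issuer's public key
    serial_number: int      # serial of the enterprise signing certificate

def build_check(cert_serial: int) -> RevocationCheckRequest:
    """What a device might send when validating an enterprise-signed app.
    The server learns which certificate is being checked (plus the
    requester's IP address), nothing about the app itself."""
    return RevocationCheckRequest(
        issuer_name_hash="sha1:...",  # placeholder digests
        issuer_key_hash="sha1:...",
        serial_number=cert_serial,
    )

req = build_check(0x1A2B3C)
# No app metadata is present on the request object.
assert not hasattr(req, "bundle_identifier")
```

So even a server that logs every check would only see certificate serials, request frequency, and client IPs.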

Considering FB and Google have many employees in all different parts of the world, it wouldn't be too suspicious to see a good amount of diversity across GeoIP regions.

Correct me if I'm wrong about what info Apple collects about enterprise apps.


As far as I can see this is correct. Even if devices are enrolled in Apple's Enterprise MDM program, the administration staff are the ones who get to see which applications are installed on the iDevice, not Apple. And I really do not think they are so preoccupied with this that they want to actively scan IP addresses for suspicious behavior (of which there probably isn't any to begin with).

Anyway, I wholeheartedly agree with you here, and I think Apple genuinely had no knowledge of this activity until news outlets reported on it. Or if they did, it did not make its way to the higher-ups who revoke developer certs.


Going forward, Apple will require that companies submit their enterprise apps for auditing.

I see them adding something like the macOS "notarization" requirement to iOS enterprise apps.

Indeed a company's "morals" are better exposed when it has to make inconvenient choices.

I would say true morals lead to structuring your company in such a way that you don’t have to rely on business people making ethical decisions moment to moment, because they won’t.

As nice as that sounds, I think it requires an impossibly perfect prediction of future events. You face ethical decisions whenever you have power or limited resources.

No. You can bend your business model towards transactions you are comfortable with, without perfect future vision, or even a clear strategic understanding of how that might happen.

In fact, the world around you will bend to meet your values whether or not you're even aware of it. And that includes any companies you run.

The world does extend beyond your knowledge of it.


You really need to build a company that values that right down to its core. It has to be embedded so deep into the hiring process that you only select for people who share that value, and it has to be easy to let go of people who don’t fit.

Otherwise, it only takes one person to short-circuit that value to set the ball rolling on a shift towards lower standards.

You need to run a super tight ship, which I think is not as hard as it sounds until you put VC, investment, and shareholders into the mix. You at least need to be super diligent about the people you bring in who are not accountable to you, but to whom you are accountable.

Basecamp is an amazing example of a company that has succeeded without compromising itself a jot. They do all kinds of things that we might consider unthinkable because they won’t budge on their values. Probably the one company I’d drop everything to work for if I had a chance at getting through their hiring process.


The same forces that require several levels of management make it increasingly difficult to enforce ethical decisions. Basically, when no one person can keep track of all the moving pieces, you get splits around what individuals think is acceptable behavior. The larger the organization grows, the more things tend to diverge, with different branches often having wildly different perspectives.

This tends to degrade further as new employees are added, and whatever original vision existed continues to fade over time, especially as both the times and the business model change.


> make it increasingly difficult to enforce ethical decisions

Nonsense - this is a solved problem. You simply need to remove the ability to make defined classes of bad decisions by binding the company's future decision making capability with a Ulysses pact[1]. Cory Doctorow gave a good talk[2] about using Ulysses pacts in the tech industry.

>> It's not that you don't want to lose weight when you raid your Oreo stash in the middle of the night. It's just that the net present value of tomorrow's weight loss is hyperbolically discounted in favor of the carbohydrate rush of tonight's Oreos. If you're serious about not eating a bag of Oreos your best bet is to not have a bag of Oreos to eat. Not because you're weak willed. Because you're a grown up. And once you become a grown up, you start to understand that there will be tired and desperate moments in your future and the most strong-willed thing you can do is use the willpower that you have now when you're strong, at your best moment, to be the best that you can be later when you're at your weakest moment.

>> The answer to not getting pressure from your bosses, your stakeholders, your investors or your members, to do the wrong thing later, when times are hard, is to take options off the table right now.

This shouldn't be a problem for anybody who actually wants the moral outcome. Why would anybody insist on preserving the option to behave badly in the future unless that bad behavior is part of their future plans?

[1] https://en.wikipedia.org/wiki/Ulysses_pact

[2] https://www.youtube.com/watch?v=zlN6wjeCJYk (transcript: https://d3j.de/2016/06/24/cory-doctorow-how-stupid-laws-and-... )


They don’t insist on preserving the option, they simply can’t predict the choice.

Companies lack unified decision making. The founders can’t predict every moral choice any employee will make. And for any large organization the CEO has no idea what most of the day to day decisions involve.

Consider something as simple as an old brick company setting up an email server for the first time. That opens up a host of choices management likely had no idea even existed.


It's easy to talk about "not preserving the option to behave badly" in the abstract, but actually doing it requires perfect foresight (predicting every way anything could be used for evil) and often a lot of implementation difficulty (because you need to completely foreclose on the bad options without impeding good or neutral ones).

Yes, and all of those are decisions you can go along with or reject. Deciding how big an organization you will join is one of many ways you apply your ethics.

That’s another issue. As an organization is viewed as less ethical, only less ethical people join, creating a downward spiral.

Hopefully the unethical organization develops a reputation for being unethical. Then only unethical people will patronize said organization. Ideally, unethical behaviour leads to marginalization, though obviously that does not always happen in practice.

Samsung is a strong counter-example. They've been caught red-handed doing all sorts of truly nasty stuff. Pretty much anything Apple has ever been suggested as doing, Samsung has been enthusiastically up to their eyeballs in, and a lot more, yet they seem to just shrug off the negative press like water off a duck's back.

I think once a company develops a reputation like that, most people just get desensitised to it. Also there is a degree to which lower cost products get a pass, because hey, they must be cutting corners somewhere.


Yeah, I agree, but I think that is just a way of stating that morals are impossible to perfect.

Hmm maybe - what I have in mind is that you could run something undeniably good e.g. a hospital, and you will still face hard ethical decisions about how you handle uncertainty, apply power or allocate your resources. Doing good things just isn’t easy!

You can certainly form a company whose line of business minimizes how many ethical questions it will face; I'd consider that to be ducking out and ultimately less moral than entering a business where there are genuinely tough ethical questions that you will need to take positions on (and inevitably sometimes get wrong).

How is that even possible?

For example, choosing between an easily upgradeable, environment-friendly product and a box of glued components with no user-replaceable parts, so that customers need to buy a new item in the line sooner.

I'm not convinced that morality and self-interest are mutually exclusive. Very often the best decision for a given entity to make is a moral one.

We should still reward/praise companies who make decisions that are morally superior to their competitors, regardless of whether the morality itself was a primary motivation.


I personally believe morality and self interest strongly overlap over the long term, but short term they are largely independent conditional on the probability of getting caught...

Humans might have lifespans and memories too short, and rationality too limited, for the long-term benefit of morality to weigh strongly in our individual self-interest, though... one of the possible benefits of anti-aging and cognitive enhancement tech is that, as a side effect, it might incentivize us to be more moral, all other things being equal.


Yeah, I'm not sure I attribute Apple and Tim Cook's latest stances to strong moral fortitude. I think it's more corporate 101:

1) Public sentiment is hammering companies for perceived privacy violations

2) Our business model does not rely heavily on selling user data

3) Make public statements about how much we value privacy at literally no cost to us

4) Get in a good dig at our competition at the same time


Perhaps. But I also find it easy to buy that a guy who grew up gay in Alabama could think privacy is of fundamental importance.

Indeed. But I also think that Zuckerberg, Bezos, Page & Brin all value their own privacy. They just don't value your privacy that much.

Zuckerberg definitely does; there are pictures of him with tape over his webcam, his microphone taped over, etc.

[flagged]


Those are very different things. The former is doing; the latter is showing.


