
Here's a short demo of a pretty good voice to text interface that's available for free: https://twitter.com/lunixbochs/status/1378159234861264896

I had to use it for a while when I was unable to touch a keyboard or mouse while recovering from RSI and I was surprised by how quickly I was able to get to about 80% of my previous productivity using just my voice. I still use it sometimes even though my RSI is fully healed.


That's not how extremist online echo chambers tend to play out. See January 6th for an easy example.


You should already know the answers to those questions.


Why? It's only been out for a day, you telling me everyone's already had their run at hacking it? That seems like pretending you know what folks will do with this thing.


The fact that you and many, many others believe all of these things to be 100% true is exactly the problem. Social media has poisoned your brain.


Epic is the worst EMR system, except for all the others.


He doesn't seem to be stepping down from much of anything. It's an online social media company and he's still going to be in charge of product and technology, so what exactly is this new CEO going to be doing? This just looks like he hired a VP of Advertising and decided to call her a CEO for no reason. What kind of tech CEO doesn't control product and technology?


> What kind of tech CEO doesn't control product and technology?

The kind that has a CTO to do that.


The decision to kill third party clients has cut my usage dramatically.

Beyond that, the decision to show every single reply from a Twitter Blue subscriber above any reply from a non-subscriber is one of the worst changes I've seen in a social media product. Elon chose to brand subscribing to Twitter Blue as a political act, so now below every tweet there's nothing but people for whom agreeing with Elon's politics is a significant part of their identity.

It's not like Twitter replies were all that good before, and yet somehow he's made them significantly worse.


The tricky thing about cutting third party clients is that those users were never counted as a 'mDAU' in their reports anyway. They could have brought other benefits (like 'power users' generating content to attract monetisable users).


I'll give it a go.

A large chunk of your daily behaviors are governed by habits. Habits are made up of cues followed by some sort of routine that you do which results in some sort of reward. If you want to change a habit then you need to focus on the cues that set off the routine. When a cue occurs, alter the routine and give yourself an alternate reward.

I used to have a drinking problem, say 1-2 bottles of wine every night. I cook almost every night in my house, so starting to cook dinner was a major cue for me to start drinking. Specifically, whenever I would put on my apron around 6pm I would get a strong urge to pour a glass of wine. I had a lot of difficulty resisting that urge even when I genuinely wanted to quit. It felt eerily automatic and involuntary. I didn't start having success until I focused on that cue and replaced the routine that followed it. In my case, I decided I would put on my apron and immediately make myself a plate of fancy cheese and some crackers. I still had a routine and a reward after my cue, but the new routine was significantly less destructive.

So you're right with your exercise example that simply placing your gym bag by the door isn't going to be successful. You need some cue to go exercise, then exercise, then immediately reward yourself with some chocolate or your favorite candy or whatever.

I won't go so far as to say we can cure everyone's addiction with this one neat trick, but I have found it to be a useful framework in my life.


are you now ingesting 2 boards of crackers and cheese every day instead?

(kidding)


What investors?


>As we have stated in the past, there is no effective way to weaken encryption for some use cases such as law enforcement while keeping it strong for others.

I've never been fully satisfied by this assertion.

Apple has the ability to push whatever code it wants to whatever device it wants. They can make a version of iOS that bypasses encryption and restrict it to run only on devices identified by a warrant. They could post it on GitHub and it wouldn't matter, because iPhones would refuse to run the code if anyone other than Apple modified it to run on a different device. You would need Apple's internal code signing keys to actually do any damage.
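
To make that concrete, here's a toy sketch of what I mean by device-pinned signing. Everything below is hypothetical and simplified (a real scheme would use Apple's asymmetric firmware signatures tied to a hardware identifier like the ECID, not an HMAC with a shared secret), but the point is that the signature covers the target device, so the same signed build is useless anywhere else:

    # Toy sketch: device-pinned firmware signing (hypothetical names, not Apple's real API).
    # The signature covers both the firmware hash and a specific device ID, so the same
    # signed blob won't boot on any other device.
    import hashlib, hmac

    APPLE_SIGNING_KEY = b"stand-in for Apple's private signing key"

    def sign_for_device(firmware: bytes, device_id: str) -> bytes:
        msg = hashlib.sha256(firmware).digest() + device_id.encode()
        return hmac.new(APPLE_SIGNING_KEY, msg, hashlib.sha256).digest()

    def device_will_boot(firmware: bytes, my_device_id: str, signature: bytes) -> bool:
        # The boot chain only accepts firmware whose signature matches *this* device.
        expected = sign_for_device(firmware, my_device_id)
        return hmac.compare_digest(expected, signature)

    unlock_build = b"iOS build with the retry limits removed"
    sig = sign_for_device(unlock_build, "ECID-OF-WARRANTED-PHONE")

    print(device_will_boot(unlock_build, "ECID-OF-WARRANTED-PHONE", sig))   # True
    print(device_will_boot(unlock_build, "ECID-OF-ANY-OTHER-PHONE", sig))   # False

Leaking the build itself changes nothing; only a leak of the signing key would.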

Currently, the security of your iPhone is dependent on Apple's internal security. If that gets compromised then your phone can be compromised. In a world where law enforcement can get a warrant to force Apple to unlock a specific iPhone, the security of your phone is still dependent on Apple's internal security. Nothing changes for anyone not targeted by a warrant.

You don't need to give law enforcement a universal backdoor key to enable them to execute warrants on devices. What am I missing?


> They can make a version of iOS that bypasses encryption and restrict it to only run on devices identified by a warrant.

Of course. Apple could absolutely do this and undermine any trust people have in them in seconds if they want to.

Didn't they pass a law in Australia requiring corporations to do exactly what you describe? I remember reading news here about several corporations moving out of Australia as a result of the inherent untrustworthiness of any system where the government can compel any party you're doing business with to ship you malware.


In the current world Apple is not actually able to compromise your iPhone, warrant or not, by design (remember the whole FBI debacle?). Encryption is done through separate hardware on the phone that they can’t remotely bypass. In the world you’re proposing they would have to be legally compelled to engineer a weaker system. That’s the problem.


I could be wrong but I don't remember Apple claiming it was literally impossible for them to open an iPhone. If that were true then Apple wouldn't have had to fight them at all. The government can't force you to do impossible things.

Apple objected to being compelled to create the tooling to bypass an iPhone's security. That implies they can do it whenever they want, they just choose not to.


They could create a build with the unlock attempt limits removed, making it possible to brute-force weaker unlock schemes. That's what they didn't want to do, because if they created something like that they would lose a lot of customers.


Are unlock attempt limits not built into the security chip?


I wasn't sure about that, so I erred on the side of caution, but I do think that is how it works, yes.


You already can't sideload Signal on an iPhone. The App Store is a single point of failure that's being ignored by too many.


great idea. while we are at it, we should also require that the police have a key to your house and vehicle, as well as your debit card PIN, email password, and bank password.

it's necessary security, so you agree with this, correct?


They effectively have all of those things already. They don't need a key to enter a house or a car, and they can get anything they want from your bank and email provider with a subpoena.

I agree that giving law enforcement a universal key that can defeat all encryption would be a monumentally stupid idea, but it's not actually necessary to enable them to bypass encryption on specific devices. As far as I can tell, privacy activists have invented the claim that law enforcement wants a universal key because it's easier to argue against that than to argue in favor of their actual position.


They already have the capability to remotely access most modern vehicles; this is built into cars manufactured in the past few years, along with telemetry that tracks everywhere you drive.

Access to bank account details and emails is also routine in police investigations. And they can simply smash your door in if they need access to your house.


To the degree that they can do this at all, it's only because they own both the hardware and the software, and the two are tightly integrated.

You need hardware that can securely hold secrets; you need software to detect tampering and tell the hardware about it; you need software that the hardware trusts to communicate with Apple servers. Without the whole integrated pipeline from the local secrets cache to Apple's servers, protected because it's all owned and managed by the same secret-keeper, what Apple does can't be done.


> What am I missing?

The fact that this is not how security on iPhones works at all.

There are unencrypted partitions of the storage, yes, which means the device can boot into iOS without any user password, but the ones containing user data are encrypted, and an OS update can at best remove the limit on how fast or how many times you can attempt to unlock the user data (or rather the Secure Enclave, which stores the actual key for the user data).

Sure, with a 4-6 digit PIN the data can then be unlocked quickly in the scenario you're talking about, but with an alphanumeric password you can make that attack completely infeasible.
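
Rough numbers, assuming the commonly cited ~80 ms per guess that the Secure Enclave's key derivation imposes regardless of any software retry limit (the 80 ms figure and the 10-character password length are illustrative assumptions, not Apple specs):

    # Back-of-envelope brute-force times if the retry limits were removed, assuming
    # ~80 ms per attempt enforced by the key derivation itself (an assumption here,
    # not something an OS update could strip).
    ATTEMPT_SECONDS = 0.08

    def worst_case_years(keyspace: int) -> float:
        return keyspace * ATTEMPT_SECONDS / (86400 * 365)

    print(f"4-digit PIN: {10**4 * ATTEMPT_SECONDS / 60:.0f} minutes")         # ~13 minutes
    print(f"6-digit PIN: {10**6 * ATTEMPT_SECONDS / 3600:.0f} hours")         # ~22 hours
    print(f"10-char alphanumeric: {worst_case_years(62**10):,.0f} years")     # ~2 billion years

So removing the limits only really helps against short numeric passcodes.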

The bottom line is that there is no way for Apple to make a version of iOS that completely bypasses encryption; encryption doesn't work like that.

They could, I guess, make a version which silently checks whether its host device's serial number is in a certain list provided by e.g. law enforcement, and then silently removes the SEP encryption, IF the user unlocks the device AFTER that software is installed. But they would have to secretly add this code to the normal iOS releases, which would severely compromise their customers' data en masse if they did it this way, or create a way to push a specific build over the air to a specific device (which actually doesn't sound that far-fetched, honestly, now that I'm thinking about it), without alerting the user that they are installing a backdoor for law enforcement.

I don't think that's feasible either way, because it would quickly come out that they've done that, and once the cat's out of the bag they are losing big bucks, and would-be criminals will simply stop installing updates on their iPhones.


> an OS update can at best remove the limit on how fast or how many times you can attempt to unlock the user data

Wouldn't an OS update be able to store the user password in a plain text file on the non-encrypted partitions? I don't think those partitions are hardwired to be readonly until the rest of the system is unlocked?


Absolutely, but only if that code is running when the user supplies the password. I mentioned this possibility as well.

