Looks like Apple is doubling down on at least two features that really seem like duds -- 3D Touch (which as far as I can tell nobody even knows exists), and Siri (which everyone raves about but nobody uses for anything).
I'd love to see UX improvements around minor inconveniences of the iOS experience:
- what the hell the volume button does in different contexts (sometimes it changes the ringer, other times the in-app volume, other times it just seems to go into a black hole)
- easier access to settings for the app you're using (things like privacy settings, notification settings, and revoking/re-allowing privileges like camera access); maybe context-sensitive pull-up or pull-down panels? I like that the OS controls this, not the app; I just need a better way to navigate to it.
- turn off buzzing and booping when I plug in a charger, and maybe some sort of wireless charging thing if the technology is where it needs to be yet
- stop showing album art on the lock screen (or make it optional)
- more uniformity on how lock screen notifications are acknowledged and dismissed -- sometimes there's an 'x' in the swipe-over, sometimes actions, sometimes just pressing the notification does something
The only feature that seems interesting to me is the voicemail transcription; previous services I've used for this have been pretty terrible, but hopefully this is better.
With Marshmallow's security model (of run-time grantable and revocable permissions) Android has become a lot more attractive to me -- my biggest pain point in using Android devices has always been the nightmare of looking at what an app wants in order to install.
Hold down the Option key and click on AirPort or the volume control to see what I mean in macOS. It gives you more information -- detail you don't necessarily need without holding down the Option key.
In AirPort, with the Option key held, you see an option to generate a wireless diagnostics report. Without the Option key held, you can just switch networks.
In the volume control, with the Option key held, you see options to change speaker outputs and mic inputs. Without the Option key held, you just see the volume controls.
On iOS, a regular touch on the Camera icon gets you the camera. With 3D Touch, you can specify what you want the camera to do before the app even loads.
3D Touch is for iOS power users.
Option key is for macOS power users.
I've been using Macs for I guess a decade or so now and I still have no idea what the Option key is for. When a keyboard shortcut is like ^+⌥+F6 I just mash buttons until it works. Why do they use a different symbol on the button than in the UI? I really have no desire to learn the seven different buttons in Mac OS and I wish application authors would stop using them so Apple could someday simplify the keyboard.
I don't understand why we tolerate having shift, fn, control, alt, option, and command, not to mention ⌘, whatever that is. Is it really necessary? Do I need to be an Apple historian to be able to understand the primary input device on a Mac?
Command makes sense. And we need shift for capital letters. Let's just leave it at that.
Useless fact of the day: the special characters Apple uses to represent keyboard shortcuts in the menu bar, with the exception of Ctrl (^)[1], are all part of an ISO standard [2]. I've never really understood why they stopped printing them on the keyboard, though -- it turns figuring out shortcuts like ⌥⌘⎋ into a game of trial and error for new users, rather than something that's relatively self-explanatory.
A few of the symbols are still printed on Apple's keyboards in a few locales, but it looks like the confusing ones (option, escape) are gone in all locales on their most recent keyboards.
[1] Apple follows the popular convention from *nix here, rather than the ISO standard, which is a sort of a dharma wheel thing that I can't find in Unicode.
"The "⌘" symbol (the "looped square") was chosen by Susan Kare after Steve Jobs decided that the use of the Apple logo in the menu system (where the keyboard shortcuts are displayed) would be an over-use of the logo. Apple's adaptation of the symbol — encoded in Unicode (and HTML) at U+2318 ⌘ (HTML ⌘) — was derived in part from its use in Scandinavian countries to denote places of interest. The symbol is known by various other names, including "Saint John's Arms" and "Bowen knot"."
Yes, that's not ideal, and it's a striking UX failure on Apple's part that the keycap says "option" while the menu says ⌥ - all the more striking given Apple's generally well deserved reputation for getting that kind of thing right.
Same goes for control and ^ - granted, Unices have represented Control as ^ since forever, but part of the appeal of a Mac is supposed to be that you don't have to be a grizzled hacker to understand it.
That said, how does it take (more than) a decade to learn these correspondences? Not that I don't understand your complaint here, and I agree it should be fixed, but...I mean, this stuff doesn't seem all that hard.
I use option and control all the time for muscle memory keybindings, but every time I see "⌥" or "^" as a shortcut in a menu, I need to check which is which. After all the time I've spent on cli, you'd think I'd have "^" memorized by now, but for some reason it doesn't stick.
Aye, but on newer Mac keyboards the key lacks the symbol. So you have that funny symbol in the menus, and you're supposed to divine (you're a new user, remember) that it means "option" with no visual similarity.
How odd! But I believe you are right [1]: all two dozen Apple keyboards have the option (aka "alt") key, including the "British" one, but not the "English" one... (Who's the imperialist now ;-)) Perhaps the idea is that the average user only needs the option key for non-ASCII characters like £, so Americans won't need it.
It's in flux. International English keyboards (a bit different layout from US and not the UK one either) used to have the symbols. The most recent iteration has the command symbol, but not the others, it says "control" and "alt option" on them. Still has a symbol on the return, delete and tab keys instead of US-style labels.
Why? It's no different than pressing "Windows + D" to display the desktop on a PC.
As for the characters ⌘, ⌥, etc., they are printed on the keys themselves...
Edit: Also, ⌘ historically denotes a place of interest. ⌥ visually suggests taking an alternative path... a line that branches off and runs parallel. It's the key for alternative functions... Alt on a PC.
⌥ is not printed on any key on the Mac I am using, and I am sure the ^ above the 6 key is not the ^ the parent mentioned. I agree with the parent: these are ridiculous.
Could be that it's difficult to describe over the phone which keys are alt/option/command/control? "The one with four loops" vs "the one with forking road" would be funny :D
I feel like your complaint is a little silly. ⌘ is Command and is printed right on the keyboard, Shift is an up arrow, and ^ is instantly recognizable as Control if you've used a Unix ever.
The only one that's really strange is ⌥, which by elimination is Option.
I found ^ the difficult one. Despite working with PCs for more than 20 years (mostly Windows, only some Linux), I had no idea it could mean Control, so I had to google what it means as an OS X keyboard shortcut. ⌥ was obvious, since it's printed on my keycaps; ^ is not.
I miss open-apple and closed-apple. I guess I didn't realize that Apple took those away at some point, and that the command and option symbols aren't printed on the keyboard anymore. Weird.
I am still not sure why PC keyboards have a context menu key. I've only ever pressed it by accident.
Well, then you haven't been using Macs for a decade, but for a year, times ten (and that's totally fine, unless you're leaving arrogant comments on HN about "why we tolerate this or that"). I've been using Macs for about 5 years and I still learn new useful features and shortcuts, and having 3 modifier keys is so useful (e.g. emacs + iterm2 + spectacle) that I can't see myself going back to linux/similar. ETA: also, brew is lovely for people who want a stupid simple package manager
There's Control, Option/Alt, Command, and Shift. Four modifier keys. This is the same set you have on other OSes, except there, Command is called Windows Key or Super.
Yes, Apple has symbols as abbreviations for Control, Command, Option, and Shift. Yes, Option is also called Alt. But it's not really all that complicated.
The problem isn't the number of modifiers, the problem is that Apple uses symbols for them in the UI that over time it has progressively removed from keyboards, and which have no obvious connection to the names on the keyboards (except for Ctrl sometimes getting the same ^ treatment on other platforms, this just isn't the case outside of Apple.)
What I really meant was Windows laptops, as the parent was expressly mentioning Windows desktops. I did not know about the Chromebook, though. I am curious: does it have F keys, which are usually the ones used with the Fn key on Windows?
> Hold down the Option key and click on AirPort or the volume control to see what I mean in macOS. It gives you more information -- detail you don't necessarily need without holding down the Option key.
You just blew my mind. I've been using Macs for nigh on 10 years and this is the first time I've heard of this. This is going to make so many common tasks (choosing sound output) so much easier!
a: mind blown
b: isn't this a terrible UX? I mean, these features are GREAT but unless I randomly caught this comment in an HN thread, I would never have known about them. And when I option-click an icon, I have no idea what's going to happen before I do it. (Incidentally, I have the same problems with 3D Touch).
> what the hell the volume button does in different contexts
If the app you're using right now has a handle for audio playback, it changes the speaker volume. If not, it changes the ringer volume. As far as I can tell, that's all, and in a half decade of using iPhones and iPods, I've never actually seen it misbehave.
This bugs me like mad on Android. When I know the app is going to blast out, I have to let it start playing before I can turn it down. I'm forever having to hop through menu after menu to turn the volume down.
My phone is permanently on vibrate (have young kids in the house and don't want to wake them), so the buttons changing the ringer volume is a minor frustration at most.
I'm not sure what solution they're looking for. A single universal volume control where if you turn up a video it means your ringer is now on full blast? 6 volume buttons on the side of your phone to adjust them each individually?
Android's system of adjusting media volume when media is playing and ringer volume when it isn't seems like a reasonably clean solution. It takes a single tap to expand to the three sliders.
Maybe a shortcut where pressing volume up and down simultaneously mutes everything?
It's not that exotic: you've got a child or a sleeping partner or a boss in the same room and you want to watch a mostly visual video without any unnecessary jingles or other audio cues disturbing them.
So you reduce the volume, but it only reduces the ringer volume for calls, which isn't what you want.
As the sibling answer says, newer Androids can do this. One click to display the volume, a touch to see the individual controls and then slide.
Press the volume once, the display has an options "cog" on the right, tapping that shows all the volumes on the device and you can set them to your heart's content.
If you're on a newer version of Android you can actually open up a three-slider volume control that has alert volume, media volume and alarm volume separately.
So, on Android, if you know it's going to blast out, you can't go into settings from the notifications shade and change the volume for Music, video, etc. to a reasonable level? I can on mine...
It just needs to be smarter about the context. If a video is loading in chrome, 90% of the time the intent is to turn down the volume of the video, it's broken that it waits for it to start playing.
Honestly, the most intuitive option for me would be for the volume buttons to never touch the ringer -- bury that behind a menu -- rather than switching based on context.
How about having them not ever adjusting the ringer?
You can always adjust the ringer (something people rarely do -- for quiet times there's the "silent" mode anyway) from the UI, and always use the volume buttons for the system's (music, video, etc.) audio volume.
It's an even bigger nightmare if you have a Chromecast streaming to a TV. Now you have another volume control (and the Chromecast usually takes priority in this situation). This has been the biggest usability issue I've had with Android.
It is still impossible to change audio volume for headphones before the app begins playing.
Consider this scenario:
- Use iphone at max volume to power speakers (pseudo line-out)
- Switch to sensitive IEMs later, attempt to change volume before starting... can't. It just changes the notification volume.
- Begin playing music, rupture an eardrum if the IEMs are in.
Fortunately the iPhone doesn't push enough current to damage the hardware (which can be replaced), but it is more than enough to damage hearing (which can't be replaced).
I have yet to find a way to change headphone volume without initiating playback.
Android's volume control is a mess but at least it is possible to change these volumes.
BlackBerry 10 handles the situation best: it remembers the volume for line-out usage and remembers the headphone volume used last. If Apple can detect that headphones are unplugged, they can detect line impedance vs. headphone impedance.
(There is a setting in Settings that will turn off ringer volume changes with the buttons. That way you will always adjust the volume and never the ringer, no matter whether anything is playing or not. As such you can then adjust the volume of headphones without playing anything with the buttons. Works like a charm. I find that I personally never need to adjust the ringer volume once I have set it to an appropriate level. I just care about turning the ringer off and on, not adjusting the volume.)
I also think Apple already does remember headphone volume? It seems to for me?!
You can plug in the headphones and then adjust the volume from Control Center. I do this habitually after experiencing several near eardrum blowouts as you describe.
What I do is slide up the tray from the bottom, and then very carefully, turn down the volume slider, making sure not to accidentally hit the play button.
I think there are a few other edge cases. Siri has its own volume control while you are using Siri. I think phone calls have their own volume too. And while you are in the camera, the volume buttons become 'take photo' buttons.
In Siri and a phone call, you're still adjusting the speaker volume. The current setting appears to be independent of random apps that make noise, but it's still speaker volume vs. ringer volume.
That's true, but how do I turn the volume down before a video starts? As far as I can tell there is no way to do that. In YouTube, even if my phone is muted, as soon as the video plays it will be at 70% volume and I have to rush to turn it down.
If you're like me and never want to change the ringer volume, then go into Settings, Sounds, Ringer and Alerts, and set Change with Buttons to off. Then the volume buttons change the playback volume all the time.
That's confusing as hell though. How do I know whether the app I'm using has a handle for audio playback? It makes it rather a guessing game what's going to happen when you use the volume button.
I've never actually run into that case, but in informal testing, it looks like the camera overrides whatever else you happen to be doing at the same time, even if that's music or a call.
That's a design choice, and there're arguments to be made for and against, but it seems hardly likely to surprise anyone more than once or twice.
Anecdotal, but I've found Siri to be quite helpful with Car Play, along with Google's equivalent (can't remember its marketing name) in Android Auto.
Just drive along, push the button to issue a command, say "Siri, navigate me to <address> in <city>", and boom. This might have existed in luxury cars for years, but it's coming down into mainstream ~$30k cars recently and I'm loving it.
I'm thankful for technologies like Siri in the context of driving at least.
Anecdotal counterpoint: Siri is horrible at recognizing my speech if it is anything that it doesn't expect. I can ask it to define a word all day long but I doubt it will get any better. It just keeps giving me the same (incorrect) response repeatedly. Google Now does a very good job at recognizing voice including words I may mispronounce when I say "define X"
I can sort of imagine why. Google "knows" me better. What is awesome/scary is that I have tried this on other people's iPhones and Google Now still does better than Siri.
Google Now's dictation is so much better for people with accents.
My family's English pronunciations have a very strong Indian/Punjabi accent. For people with accents, Siri works very unreliably to the point that we've given up on it, whereas Google Now "magically" works all the time.
Indeed, I have noticed the gap in comprehension between Siri and Google Now. I'd attribute that to Google having a much deeper expertise of ML than Apple. I wouldn't entirely discount things like Siri though, they'll get better with time.
Google built their own voice recognition using neural nets. Apple licenses theirs from Nuance, the makers of Dragon NaturallySpeaking. I don't know what technology Nuance uses now, but historically it was Markov models, I believe. The founder of what would become Nuance, and an early pioneer in voice recognition, Ray Kurzweil, was hired a few years ago by Google "to bring natural language understanding to Google."
I speak a weird hybrid of four Norwegian accents, and Siri + iOS dictation is indispensable in both using my phone faster, and operating it when my hands are occupied with other things. So there's clearly a variety of experiences here.
English with Norwegian accent is generally very clear and pronounced though.
I never have problems with Google Now either, and I use it several times a day. Like you, I speak a Norwegian that is called "lett blanding" or "light blend" in English, composed of western, southern and eastern dialects :)
Well, there are some problems... If I have it set to English, I can't really get it to grasp Norwegian names for streets, etc. I wish it could handle a mix of some sort.
One thing that always worked for me on iOS but never on Android was triggering Siri from a standard bluetooth head unit.
I could hit the "call" button and it would launch Siri and I could say "message my wife" or a plain "read me my messages".
Despite loads of mucking about, installing and buying 3rd-party utilities, I could never get Android to do the same (and Google Now would never "read me my texts" and often "text my wife" would say it had sent but would never arrive).
I'm sure it's easy with Android Auto, but I have old cars and I'm not shelling out for a head unit that costs almost as much as the car itself.
I've heard this sentiment with 3D Touch and Siri before. I'm probably an outlier, but I use each about a dozen times or so a day.
3D Touch is one of the reasons I upgraded to the 6S. Lack of 3D Touch was the one thing that disappointed me the most about the new iPad Pro. I wasn't sure at first if it was something I'd end up using long-term, but it didn't take long for me to develop a lot of habits around it. Here are some of my daily actions through 3D Touch:
* When I go to Starbucks, I use 3D Touch to get to the app's Pay option.
* I use it on the Phone app to quickly dial the people I call the most.
* Same with iMessage.
* I use it on Sleep Cycle to quickly start my morning alarm.
* On Intercom to speed up launching/getting to the part of the UI I want (People vs. different support mailboxes).
* Day One to quickly record a journal entry.
* Drafts to jot down a quick bit of info (someone's name or something, or a thought) to later process/organize somewhere.
* Pandora to quickly get to my most-used stations (I only really have a few I often use, so it's perfect)
When I need to place a call and am in the middle of something (getting ready in the morning, looking something up, making dinner, etc.), I use Siri. Actually, I'd say most of my calls are placed with Siri unless I have the phone in my hand.
"Hey Siri, call <person> on speakerphone" is pretty common here.
I also use Siri to start timers for laundry or to remind me of something I need to think about later. "Hey Siri, in 4 hours, remind me to do <blah>".
I hate naps, but sometimes I need them. "Hey Siri, wake me in 30 minutes."
I ask her every day in the morning what the weather's going to be like. Even though I live in Palo Alto and it's not exactly hard to guess. Super handy when I'm going to go somewhere else though.
I also use her as an app launcher pretty frequently. Mostly in the morning when I want to kick off some music at home. I'll wake up, start getting dressed, and say "Hey Siri, launch Harmony." Since that takes a few seconds to launch sometimes, I do that while I'm otherwise occupied. Then when I can get to the phone, I just have to tap the item on the screen to get music going.
My car will let me know when I have incoming texts, but I can't respond through it, so Siri's also my dictation if I'm stopped at a red and need to fire off a message to someone saying I'm late.
So that's how I use 3D Touch and Siri. Consider me a data point :)
Same here. I use Siri for everything that would be slower to do by hand, or my hands are busy.
"Wake me up in X minutes"
"Call <person>"
"Remind me to ... [in X hours, in the evening, when I get home]"
"What's the weather like today", when I'm getting ready in the morning
"What's on my agenda today?", while driving to work
I rarely use Siri to query for specific data, which seems to be the thing Google Now does well. Siri for me is just a way to operate the phone faster.
I am really curious how the features they said were activated by 3D Touch will work on a non-3D Touch iPhone. I'm guessing that, like iOS 9, you won't get the menus on the home screen, but what about the notifications on the lock screen? There's no semantic meaning for a long press, so maybe they'll make a long press activate the menu as well?
I use it all the time, and have been praying for the ability to do more with it. I spend a ton of time in the car, and it's really useful in that context.
Also, it's faster to tell Siri to remind me __________ or create an appointment for ____ on ____ than to (unlock, find icon, tap +....)
> 3D Touch (which as far as I can tell nobody even knows exists)
I think people know it exists. The problem is that it doesn't do anything remotely useful, except for the iPhone keyboard cursor thing.
> Siri (which everyone raves about but nobody uses for anything).
I use it all the time (at least once a day) to add a reminder, set a timer, set an alarm, check sports scores (I don't really follow sports, so this will probably end after the Warriors' run), call someone (I do this rarely enough that I'm incredibly slow trying to figure out the Phone/Contact apps), or do a Google search. I would love some basic improvements to Siri UI, like sending my Google search straight to google.com instead of reading my entire query back to me first.
> what the hell the volume button does in different contexts
I've never had a problem with this. Another commenter already covered it, but it's extremely predictable and fairly obvious.
> turn off buzzing and booping when I plug in a charger, and maybe some sort of wireless charging thing if the technology is where it needs to be yet
Charging cues have never bothered me in the slightest, but I see no reason not to have a setting for it. Wireless charging would be absolutely huge. It's probably my number one feature request. I'd probably pay for a brand new 6s if all they added was wireless charging. Heck, I don't even need it to be fast, just let me set my phone down at my office desk and bedside table and I'll be happy.
> stop showing album art on the lock screen (or make it optional)
Again, this is something I've never even thought about, but it's hard to argue against making it optional.
> more uniformity on how lock screen notifications are acknowledged and dismissed -- sometimes there's an 'x' in the swipe-over, sometimes actions, sometimes just pressing the notification does something
That's enough for me to never want to miss 3D Touch again.
>I use it all the time (at least once a day)
I use it almost daily too; you get used to these things so quickly. The thing that annoys me with Siri is that it is slow. I want to say "Hey Siri, how is the weather today?" and not "Hey Siri" wait, wait, wait, bling, bling, wait, wait, wait, "How is the weather today?".
I don't actually use Hey Siri much, because it is so unreliable. I mostly press the home button and hold for the entire duration of my voice command (because that allows you to pause in the middle of your voice command without Siri thinking you're done speaking).
I didn't know you could do that; maybe it could be useful sometime. Currently I use Siri only when I can't use my hands to interact with the phone (cooking, dressing the child, etc.).
Can you not do that? In my experience (including a test just now), speaking that phrase in a natural cadence (no weird pauses) is recognized perfectly.
That's interesting. For me, after saying "Hey Siri" it takes maybe half a second to a second for Siri to "wake up".
I use it in German, maybe that's the difference.
Just the shortcut menu for "Take Video"/"Take Slo-Mo"/"Take Photo". Not the greatest of use cases, I'll grant you, but it does remove that small hump of "launch camera, swipe left" when you're wanting to capture something quickly.
>looking at what an app wants in order to install.
I believe they request the same things on iOS as they do on Android in the case of multi-platform apps like Facebook, but Google decided to show them in a different way that comes across as worse than how Apple does it.
I haven't actually used Marshmallow yet; only read about the security model.
But historically, apps on iOS can only ask for permissions at run time, not at installation time, and you can refuse any permission if you choose (and restore it later if you want). So if an app wants to use a camera, it can either (terribly) ask you the first time you run it, or (nicely) ask you when you do something that uses the camera, and app store guidelines say the app has to not break horribly when it is told no.
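To make the "nicely" case concrete, here is a minimal sketch of what asking at the point of use looks like from the developer side, using AVFoundation's camera authorization API (the view controller and helper method names are just illustrative, and the app would also need a camera usage description in its Info.plist):

    import AVFoundation
    import UIKit

    final class ScannerViewController: UIViewController {
        // Called from whatever action actually needs the camera, so the system
        // permission prompt shows up in context rather than at first launch.
        func startScanning() {
            switch AVCaptureDevice.authorizationStatus(for: .video) {
            case .authorized:
                beginCaptureSession()
            case .notDetermined:
                // First use: iOS shows its permission dialog right now, at the point of use.
                AVCaptureDevice.requestAccess(for: .video) { granted in
                    DispatchQueue.main.async {
                        if granted {
                            self.beginCaptureSession()
                        } else {
                            self.showCameraUnavailableUI()
                        }
                    }
                }
            case .denied, .restricted:
                // The user said no earlier (or a profile restricts it); the app has to
                // keep working without the camera and can point the user at Settings.
                showCameraUnavailableUI()
            @unknown default:
                showCameraUnavailableUI()
            }
        }

        private func beginCaptureSession() { /* set up the AVCaptureSession here */ }
        private func showCameraUnavailableUI() { /* explain the feature is off and how to re-enable it */ }
    }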
In Android (historically) at installation time you get a list of requested permissions, and you either accept them all ("can access the microphone", "can access your contacts", "can access your browser history", "can control your phone", "can implant thoughts into your mind") or you can opt out by not installing the app. That's the scary thing; you go to install a todo app that wants to access your contacts -- what do you do? Is it malware? Will it be changed to be malware in the future?
From what I can tell, the interface in Marshmallow is still annoying in that the app tells you what it is going to ask for at install time, but doesn't get granted those permissions until it asks explicitly and you say yes.
In Marshmallow, apps that are built for Marshmallow behave like iOS, in that they will install without asking for permissions. Old apps behave like previous versions of Android and pop up a list at install time; those permissions are granted at install. You can go into settings and disable granted permissions in both cases (with a warning that things can break for old apps).
> - turn off buzzing and booping when I plug in a charger, and maybe some sort of wireless charging thing if the technology is where it needs to be yet
Does setting it to mute not work? My iPad 3 makes the noise on plug and unplug exactly when it is not muted.
Still buzzes when it's on mute; in a quiet room at night that's still more than I want. If they just made it configurable like other device sounds I would be more than satisfied.
> 3D Touch (which as far as I can tell nobody even knows exists)
Is this feature that bad? I'm an Android guy and it's probably the most appealing thing I've seen on the Apple side in a long time. At least as a replacement for the long press.
The real issue with 3D Touch is that there is nothing contextually there to say a button supports it. It's just "try it and see," which means you don't even think about it.
It just isn't incredibly useful. Looks like they're refining how it's used (glanceable info instead of just shortcuts on a force touch on a home screen icon). I don't know what the parent expected them to do (throw out years of development?), but it looks like it's heading in the right direction to me.
They should just replace the long press with it. The problem is it's only on the 6S models, and the difference between 3D Touch and a long press is a delicate balance.
Replace the long press, and the experience degrades nicely on older hardware.
It just flopped on Android like it's now doing on iOS. Push-hold / long-click / whatever you want to call it is just easier / more intuitive, I suppose. I believe the only big hit pressure sensitivity had on Android was on the Samsung Note type devices (which used it for pen pressure) as well as the various Wacom-style drawing tablet things.
The things you see in the news about "X new Android smartphone will come with an Android version 3D touch!" are basically just marketing hype / PR. Provably, and beyond a doubt, Android has had 3D touch for 7 years now.
On what devices? The problem with these kinds of features (e.g. Huawei's knuckle touch) is that unless you have large enough device support, developers cannot rely on it. 3D Touch will have massive support in a few iPhone generations.
My main use of 3D Touch is switching apps. That, combined with swiping from the left, eliminates having to reach down to the home button or up to the back button when using the phone one-handed.
Holy moly, Apple is very aware of the competing chat apps. iMessage looks amazing and they packed a lot of features in there, but I wonder how they're going to make it all easily discoverable.
Really? I felt the presentation of iMessage was embarrassingly trying to seem hip. The only thing I didn't find to be annoying bloat was the ability to tap and "like" a message. Really cuts down on messages like "great" and "ok." Otherwise it honestly felt like Yahoo Messenger from 2005 in level of bloat, where you could "buzz" people and shake their windows. People use things that aren't iMessage, not because of features, but because they are cross platform and secure. Apple doesn't have that, so everything to me just felt like texting with bloatware.
I don't find those features confusing, just annoying.
No. People use other apps because they have large adoption and stickers (WeChat and Line) or because they were free while SMS was not (WhatsApp). Only us tech nerds care about security.
Of course, that is how you get to the large user base. But I think it is wrong to assume that users care about security. If they did, nobody would use SMS, email, or McDonald's Wi-Fi, but here we are. It is a very important feature, but it does not sell.
I dunno, I think my parents will like it. One of the great features of snapchat is the ability to caption images with stickers and text. It's a lot of fun.
The problem is not discoverability; the problem is being certain that the other party can receive these things.
I already find the SMS app on iOS confusing. Am I sending a paid SMS or a free message?
I dare say one of the major advantages of messaging apps in general is the certainty that the receiver will get the message in the same shape as the sender sent it.
With MMS you never know if the picture can be opened, and people just gave up.
Green bubble = SMS. Blue bubble = iMessage (data only, doesn't count towards text limits). I would assume the app would only let you use the new features with other iMessage users.
Even better, you can tell before you click "Send" whether the message will be green or blue, by looking at the color of the send button. Kinda nice, you can actually just enter any phone number in the "To:" field and wait a second, and check the color of the button. This will tell you instantly whether a given phone number is for an iPhone or not. Kinda neat.
It's really troubling to me that 3D Touch features are a big part of this release and yet their most recent phone release (the iPhone SE) has no support for it at all, and neither does ANY iPad, including the flagship iPad Pro 9.7 released in March.
I know many people are sceptical about 3D Touch. I myself found it to be quite useful if you build habits around it - until I got an iPad Pro 9.7 and found all those habits were useless. Since I had to break the habits on the iPad I rarely use it on the iPhone now.
If Apple wants people to adopt 3D Touch and make it useful it has to actually be available on anything other than the 6S.
Finally they managed to remove the weird intelligent selection behavior in Safari!!! (The one that unpredictably selects blocks or the whole page.) Thank you, Apple :)
They also ignore the viewport meta tag, so you can always zoom. My Cordova app doesn't like this :( I hope it can be disabled for WKWebView programmatically.
For websites this seems OK to me, but for apps that run inside WKWebView, I think there should be more control over this behavior. You can't zoom native apps, so this would probably make the usability worse or unexpected.
How is object recognition possible on an iPhone without a huge/massive dataset like Google uses on Google Photos' AI backend? It has to be worse than Google Photos' AI. Can someone clarify?
You only need large datasets to train a model - the result can be shipped around pretty easily. My company builds an app http://forevery.com/ which actually runs a neural net on your phone to organize your photos.
There's a whole subfield of machine learning, called model distillation or model compression, that's concerned with shrinking large neural nets, or ensembles of them, to fit on small devices like phones and tablets. By reducing the space needed by 10x, they only lose 1-2% in accuracy.
But what I like about it is that neural networks can be transferred between different architectures, and the whole process of training a neural net can be sped up by starting from a previous result.
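To make the distillation idea concrete: in the classic formulation (Hinton-style knowledge distillation -- one common recipe, not necessarily what Apple uses), a small "student" network is trained to match both the real labels and the big "teacher" network's softened outputs:

    \mathcal{L} = (1-\alpha)\,\mathrm{CE}\bigl(y,\ \sigma(z_s)\bigr) + \alpha\, T^{2}\,\mathrm{KL}\bigl(\sigma(z_t/T)\,\|\,\sigma(z_s/T)\bigr)

Here z_s and z_t are the student and teacher logits, σ is softmax, T is a temperature that softens the distributions, and α balances the two terms (the T² factor keeps that term's gradients on the same scale as the plain cross-entropy). The student can then use a much smaller architecture than the teacher while keeping most of its accuracy.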
When you say relatively, just how many MBs of data would a trained net be that can recognize objects and do the kind of recognition/photo grouping they showed at the keynote?
Is this the death of hybrid apps? Seems all the announcements were focused on the deep integration that apps can now have with the OS. For a long time, the big knock against hybrid development was that performance was never as good as native, but it seems this new wave of integration presents a far more compelling case for native development, no?
As long as one can leverage plugins for hybrid apps, I don't see why most of the demoed features today aren't possible in a hybrid app. At least nothing sent up huge red flags for me as a hybrid app developer.
It's always easy to say that things are inspired by a different company or product, but keep in mind that these features don't always come ready before a year or two of work. The first to announce it working and integrated gets the credit, but a lot of the time the same things are being worked on in parallel in different places :)
And no one knows what effect those 'monumental calculations' will have on the battery. Paranoia kept apart, doing them on the cloud makes more sense IMO.
Yes, the company that is expending significant effort on (differential) privacy, end to end encryption, ad blocking (and just wrapped up their small, privacy-sensitive ad network)... is going to start hosting Pampers ads.
No, you didn't get it. They won't host ads, but they'll have to upload things to their servers to compete with Google in terms of features (both Google Photos and Google Now) (of course, they'll still do it with the holier-than-thou attitude). It's not that they've never taken complete U-turns anyway.
Which is reasonable. For all that Apple gets a lot of stick for "failure to innovate", as a reasonably longstanding iPhone user, I'm quite pleased to see they're maintaining their habit of letting other manufacturers take risks, and letting time and user preference shake out what's worth actually implementing.
You already swipe for widgets in iOS, it's just a different direction. I'm not sure that's exactly a copy.
But there is a ton of stuff in iOS 10 that's 'copied':
iMessage looks like WeChat/Snapchat/some other stuff had a baby
A lot of the photos features seem right out of Google Photos
3rd party apps have done the Mac unlock thing before
Android has had the ability to add plugins for some apps for quite a while, no? Or at least things like keyboards weren't sandboxed the same so they could do similar things to what iOS is now getting.
That's progress. Someone makes a cool feature, if it works out well other companies start integrating it. Google has 'stolen' from Apple, Apple has 'stolen' from Google. Everyone borrowed stuff from Palm.
I feel like all the photo features are them just putting features back into Photos after dropping iPhoto. iPhoto did face matching, geo tracking, etc., etc.
Since they’re tweaking copy/paste to work between devices, hopefully they are also working on the phone UI for text selection. Before I care about cloud-sharing any text, I need to be able to SELECT IT properly, and frankly iOS fails completely at this right now. They need to remove the second-guessing, over-selecting, contextually-smartly-selecting code, or whatever it is doing now.
Am I the only one thinking that Steve Jobs would have refused most of the things that were introduced this morning? Apple sure has changed over the years.
Maybe not. Does it matter? Steve Jobs, brilliant though he was, has been dead for a while now, and shows every indication of staying that way into the indefinite future. While that doesn't necessarily mean he has been unable to stay up on the last few years' worth of change and innovation in the mobile space, it does mean that whatever unique insights he's been able to derive from same are not in a place where they can do Apple's designers and engineers much good.
Steve was fantastic at saying no. He said no to the app store multiple times. He said no to almost everything that ended up being Apple staples. I'm not sure it's fair to give him so much credit.
Yeah, I don't think that all that junk added to Messages would have been added under Steve Jobs. Not that I think that's a bad thing, but it's definitely a notable departure away from clean and simple and minimal to something that is much, much more complicated and glitzy.
Why build a first party messaging app if most people aren't going to use it? For people who just want to send plain text messages the UI is pretty much identical. Text box, type, send.
Did you miss the part where they indicated that Messages was the most used app on iOS? Why would they abandon that experience for their customers and leave it up to a 3rd party to control?
Always wondered if they'd stall at iOS 10 and just start moving up minor versions like macOS. 10.1 etc. Another change that doesn't make a ton of sense but I guess looks nice.
I've got iOS 10.0 installed and there is nothing you can do from my lock screen except access the camera. Settings has you completely covered if you want to disable those things.
That was my first thought, too, but you'll most likely be able to toggle what information is visible when the device is locked, the same way you can now with Messages, Mail, etc.
iOS 10: ten features that I think they had a hard time convincing the audience were so important.
I've been using iOS since v3 and it's getting exponentially more complicated every year. Come on guys, nobody expects you to deliver a revolution at every WWDC. Just don't make the interface more complicated.
Siri, 3D Touch, HomeKit? I barely know anyone using those, and everyone these days has an iPhone. The focus has shifted from usability to features. So sad.
>> "Siri, 3d touch, home kit? I barely know anyone using those"
Shouldn't Apple improve them so that they become useful for more people, then? I use Siri very occasionally, but with the SDK I can see myself using it quite often when I'm at home and need to, for example, order an Uber. I can just yell it at my phone without stopping what I'm doing. With 3D Touch, do you expect them to just throw out years of development? No, they work to improve the UI and more people will use it -- the glanceable info on a force press on a home screen icon looks great to me.
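For developers, the announced Siri SDK (SiriKit) boils down to a set of intent types an app registers to handle from an Intents app extension. A rough sketch of the shape of it, using the send-message intent as the example (the class name is made up, and the ride-booking intent your Uber scenario would actually use follows the same pattern):

    import Intents

    // Illustrative handler; INSendMessageIntentHandling and the response types
    // come from Apple's Intents framework, but the class name is hypothetical.
    final class MessageIntentHandler: NSObject, INSendMessageIntentHandling {
        // By the time this runs, Siri has already resolved who to message and what
        // to say; the app just performs the action and reports how it went.
        func handle(intent: INSendMessageIntent,
                    completion: @escaping (INSendMessageIntentResponse) -> Void) {
            guard let content = intent.content, !content.isEmpty else {
                completion(INSendMessageIntentResponse(code: .failure, userActivity: nil))
                return
            }
            // Hand `content` (and intent.recipients) to the app's own sending code here.
            completion(INSendMessageIntentResponse(code: .success, userActivity: nil))
        }
    }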
I also use voice to start timers, probably more than once a day on average.
Which is why, by now, the idiotic and unfunny jokes Siri makes almost every time I do that are really fucking annoying (even though it doesn't say them out loud, and I just have to see them printed on the screen).
You can tell their engineers never continuously tested Siri's responses for multiple weeks. I'd rather have the "Ok" than Siri's 5 other longer responses, and the one sassy response.
Siri is great in a car and other hands-free situations. The car is where I use it the most, mostly for music control, navigation, calls and text messages.
All of these things will improve usability. Right now you have to fumble around inside apps for many common actions that will be much more readily accessible.
Third-party integration and interoperability with third-party apps have been one of the areas where Android fared better than iOS. Now with Siri, Maps and Messages third-party APIs and keyboard updates, Apple can completely shed its walled-garden image.
Anyone know if Siri will still only respond when plugged in? I wish it were an option to turn on at all times. I realize the battery would be drained considerably.
"It looks like the Photos app is getting a huge upgrade and now looks more like Google Photos. The big difference, obviously, is that Apple handles everything on your device and doesn’t collect data about you. "
Is it just me, or has Apple really lost its advantage with the departure of Steve Jobs? I mean, Jony Ive is still there, but they seem to be doing a mediocre job. Don't get me wrong, I really liked the new MacBook when it came out, and I own an Apple Watch (more of a fashion accessory than anything else), but Maps, Siri, Music, Photos, etc. seem like minor increments. Blow me away, Apple, like you did with the first Intel-based Macs, or when Mac OS X first came out, or the very first iPod. We need more makers, not an operations guy.
I've been a loyal Apple user since the Motorola 68030 Mac II -- this was in the 90s -- so it's natural to expect Apple to change the game every few years. With Cook at the helm, he's done nothing exciting, just a lot of conservative incremental changes; you can see where the stock's been trending lately. I am not some fanboy that jumped on the Apple bandwagon in the past 10 years, so from my perspective Apple is starting to falter. Jobs laid a solid foundation of innovative products, and I couldn't see myself buying a Dell/Lenovo or an Android phone anytime soon, but things have certainly changed in the past few years, regardless of how much cash Apple has in the bank. WWDCs in the past unveiled Mac OS X, the Intel-based Macs (from PowerPC), the iPod, the iPhone, the iPad. This WWDC has really been underwhelming; granted, it is a developer conference, but Apple has always used it to market their new gadgets.
Uh, most of us don't have that use case to worry about, but notification center and lock screen settings have our backs when we do. (Which I did, in the lead-up to the divorce. No trouble at all with phone opsec, which was nice, because as anyone who's been through it knows, divorce means enough trouble all by itself without overly helpful electronics adding more.)
I'm sure you'll be able to decide the amount of stuff you want accessible from the lock screen, as you already get to do now, but you may have more pressing concerns regarding your partner's trust.
Oh, I used the example of my GF responding to messages/viewing notifications to essentially bring out my privacy concerns. Not that I don't trust my partner :)
I must have disabled this feature when I installed iOS 9, and now I'm being unnecessarily surprised by it.
iOS already shows stuff like message previews on the lock screen, with the option to hide their contents. I don't see any reason to believe they won't have a similar option here.
I'm extremely surprised this was downvoted by two people. Anyone care to explain? (I just showed how to watch the keynote.) Did I post this on the wrong article, or is this not what this is about, or what's wrong?
The vast majority of what you'll learn there still applies. UITableView, for example, hasn't changed appreciably since it was introduced in iOS 2.0 (just to take a totally random example that popped into my head).
More philosophically, you're looking at it all the wrong way. Learn the fundamentals and an understanding of new features comes easy.
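For instance, the table view data source pattern that every intro course drills into you is, give or take Swift syntax changes, the same API it has been for years -- a rough sketch from memory, with a made-up cell identifier that would be registered in a storyboard or in code:

    import UIKit

    final class ModifierKeyListViewController: UITableViewController {
        // Fitting for this thread: one row per macOS modifier key symbol.
        private let symbols = ["⌘ Command", "⌥ Option", "^ Control", "⇧ Shift"]

        override func tableView(_ tableView: UITableView,
                                numberOfRowsInSection section: Int) -> Int {
            return symbols.count
        }

        override func tableView(_ tableView: UITableView,
                                cellForRowAt indexPath: IndexPath) -> UITableViewCell {
            // "ModifierCell" is a hypothetical reuse identifier.
            let cell = tableView.dequeueReusableCell(withIdentifier: "ModifierCell", for: indexPath)
            cell.textLabel?.text = symbols[indexPath.row]
            return cell
        }
    }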
If you're developing production apps you're likely supporting at least the last two versions of iOS. I'm a full-time iOS developer and don't remember half the new stuff in iOS 9, simply because I don't need it yet or it's iOS 9 only and I have to support iOS 7+. Also, all of that knowledge transfers. iOS 10 will bring extra, deeper stuff that you're not going to get anywhere near in an intro course.
Which course did you buy? The iOS development courses from Stanford are free on iTunes U and are probably the best out there. I did the one for iOS 7 and I learned all the fundamentals and more.
Looks pretty interesting... Apple is adding support for 3D Touch, more features we expect to come, and better Siri support for the future. Siri is fine now, I think; developers should work harder on Siri!
Apple used to be about quality products. They would ignore the demands of shareholders, who are always wrong and never actually understand what customers want.
Now they're a generic computer company like Compaq and Gateway. Shareholders are running their business into the ground, and they're making terrible products, from head to toe.
Does Apple really think that people use Siri? Yes, limited voice control is useful in the car when you need to call someone. Other than that, I've never seen someone using Siri for anything other than screwing around. I think Amazon has forced Apple's hand with Alexa. The problem is, Alexa is more useful, more open, and overall far ahead of Apple's technology. I don't think that Apple understands that the main reason Alexa is great is that Amazon put it in a tube and made it able to control the lights in my house, the music, and lots of other stuff. It's not just for changing your name to something stupid and asking it to Google things for you like Siri is.