> you don't even have to look down to see the slider's progress because macOS will mirror it smack-dab in the middle of your display.
Progress isn't the issue. Input is. To operate a slider, your pointer/finger needs to be where the handle is. The location of the handle is variable. The touchbar could potentially handle this smoothly by displaying sliders in such a way that the handle always appears underneath the finger, but this is not the behavior the touchbar was displaying when I tested it out -- sliders opened out to the side of the target you tap to open them, which means you then have to move your finger to the location of the handle, which requires a visual reorientation.
And none of this is an improvement on quick taps with well-defined change increments for stuff like adjusting brightness and volume. Not to mention the value of tactile feedback.
Unfortunately, this is exactly the issue I'm talking about: Apple has solved these problems; it just hasn't done a great job spreading the word about it.
> To operate a slider, your pointer/finger needs to be where the handle is. The location of the handle is variable. The touchbar could potentially handle this smoothly by displaying sliders in such a way that the handle always appears underneath the finger, but this is not the behavior the touchbar was displaying when I tested it out -- sliders opened out to the side of the target you tap to open them, which means you then have to move your finger to the location of the handle, which requires a visual reorientation.
The location of the slider actually does not change, interestingly. What actually happens is that when you tap, the touch target for the slider expands to include your original tap point, even though the slider is drawn off to the side. This means all you have to do is slide, and that works even from the original tap location.
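To make the mechanism concrete, here's a minimal sketch of how such an anchored slider could work. This is a hypothetical illustration, not Apple's actual implementation: the finger's touchdown point is treated as the handle's current position, and subsequent movement is interpreted as a delta from that anchor, so the finger never has to find the drawn handle first.

```typescript
// Hypothetical sketch of a slider whose input is anchored to the
// touchdown point rather than to the handle's visual position.
class AnchoredSlider {
  private anchorX = 0;      // finger x-position at touchdown
  private anchorValue = 0;  // slider value at touchdown

  // value is normalized 0..1; trackWidth is in the same units as touch x.
  constructor(public value: number, private trackWidth: number) {}

  // On touchdown, remember where the finger landed and what the value was.
  // No jump occurs: the current value is bound to the touchdown point.
  touchDown(x: number): void {
    this.anchorX = x;
    this.anchorValue = this.value;
  }

  // Movement is relative to the touchdown point, so sliding works
  // immediately, regardless of where the handle is drawn on screen.
  touchMove(x: number): void {
    const delta = (x - this.anchorX) / this.trackWidth;
    this.value = Math.min(1, Math.max(0, this.anchorValue + delta));
  }
}
```

The design choice this illustrates is exactly the one being discussed: the input model anchors to the finger, while the visualization is free to appear elsewhere, which is why the drawn handle can mislead.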
This is an improvement over the choice I thought they'd made, but it's an improvement in utility alone, and it makes the UI choice even more of a head-scratcher. It means they realized you want the handle right where the touchdown happens, so they can't plead ignorance, but they chose to have the in-touchbar visualization the user is interacting with appear elsewhere. It's not just non-obvious and therefore a failure of spreading the news, it's actually misdirection. Not having any visual feedback in the touchbar at all would have been better UX. As you pointed out earlier, the screen does fine for independent visualizations.
Interesting, it's got a lot in common with one of the complaints about the escape key -- the actual touchable area is different from the outlined area (which is positioned farther in from the keyboard than it's historically been).
Laptops have long had a display surface and a touch surface. If marrying the two is worth experimenting with, then correctly connecting visual feedback to touches is crucial.