Touchscreens, which are ideal for presenting multiple user interfaces in an intuitive way but require constant visual contact, are being replaced by a physical user interface that can be used by both touch and memory?
Why was touchscreen ever even a consideration for controls you're not looking at?
Touchscreen is great for phones. It is awful for keyboards (see MacBook Pro). And it is even more awful for controls. Has nobody researched this before spending a few hundred million dollars?
Though, on a serious note: I work in industrial automation, and user interfaces, aka HMIs, have been touch-oriented for quite some time. It was and still is common to see graphical elements that emulate the look of physical buttons used on machinery. This was done to help operators who were used to panels full of buttons, knobs, and switches navigate touch screens.
Recently I rebuilt a machine that had half analog and half digital controls to be all-digital. I first started with a full touch interface with provisions for knobs and buttons. During testing, operators hated, and I mean HATED, the touch screen when it came to virtual potentiometers (one operator got up and walked away saying "this screen is a fucking piece of shit"). I installed encoder knobs to emulate the potentiometers and it was a success. Everyone was happy.
Some things just can't be done with a screen. You need physical things to interact with.
IIRC the official explanation from the makers of Star Trek is that the displays look like simple touch screens, but they are actually overlaid with structured force fields, so for the users it feels like a physical button press.
By the way, are you "Le Jojo" of "Jojo on UI", or a different "that_jojo"? ;)
The high voltage needed makes it a no-go for most/all portable devices.
I played with some other "controllable friction" tech but don't remember the basis of it. IIRC one actually made small indentations in the screen to simulate buttons. It had a bunch of limitations; I think the button placement was baked in at time of manufacture, but it's been a while and I don't trust my memory on the topic.
You can twist a virtual knob with good precision. What is much harder is the feel and state of the knob, particularly when it is released and then gripped again.
Mechanically, the structure of the knob can take some energy input, and it serves as a mechanical pivot, or fulcrum, depending on how people use it.
Without all those physical things, people lack the complex frames of reference needed for fine, "thought is action" type control.
This is especially true in environments where gloves are worn.
Have you ever used an iPod (capacitive touchpad) scroll wheel?
A good physical knob is still better, but a touch wheel can be made pretty decent.
A bigger thing (for me) is being able to sense gaps/shapes without pressing anything, plus a fixed layout. Touchscreens are all about change, but that's only good for UI that you look at.
One example of something that creates physical feedback is that Disney VR project where they use air to create a feeling of resistance.
There's really no need to force this decision to go in either direction. Ever since I first heard of capacitive multitouch, going back to the timeframe before the first iPhone announcement, I've been waiting for someone to build 'stick-on' encoder knobs that the touchscreen controller can read.
These would simply take the form of a knob with a metal leaf or other polygonal electrode in its base, whose rotation could be sensed by code similar to that used to implement crappy 'virtual potentiometers' on existing touchscreens. The fixed part of the knob base would be epoxied or otherwise bonded directly to the screen surface, or perhaps held in place with some sort of frame.
Doesn't seem to have happened yet despite being an incredibly obvious (and inexpensive!) thing to do. Seems like the MIDI community would be all over something like this, even if no one else considered it worthwhile.
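For what it's worth, the decoding side would be straightforward. A minimal sketch, assuming the digitizer reports the knob's electrode as an ordinary touch point orbiting the bonded center (the class and coordinates are hypothetical, Python purely for illustration):

    import math

    class StickOnKnob:
        """Tracks a hypothetical stick-on encoder knob whose electrode
        shows up as an ordinary touch point orbiting a fixed center."""

        def __init__(self, center_x, center_y):
            self.cx, self.cy = center_x, center_y  # where the knob is bonded
            self.last_angle = None
            self.position = 0.0                    # accumulated rotation, radians

        def on_touch(self, x, y):
            """Feed in the (x, y) of the electrode's touch point each frame."""
            angle = math.atan2(y - self.cy, x - self.cx)
            if self.last_angle is not None:
                delta = angle - self.last_angle
                # unwrap across the +/-pi boundary so a small turn stays small
                if delta > math.pi:
                    delta -= 2 * math.pi
                elif delta < -math.pi:
                    delta += 2 * math.pi
                self.position += delta
            self.last_angle = angle

        def on_release(self):
            # electrode lost; keep the position but forget the stale angle
            self.last_angle = None

Since the electrode's angle is absolute, the "released and gripped again" problem mentioned elsewhere in this thread mostly goes away.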
Being able to set the dial on the screen and just turn it is a really good-feeling workflow, though likely not for something as mission-critical as the article is describing.
>On-screen detection: Touch digitizer reports the onscreen location through a capacitive pattern (Studio only)
I envision using cam-levered suction cups to hold on rotary knobs and linear sliders that have touch-screen-sensitive rubber tips. One could go as far as to 3D-print arbitrary interaction devices that attach to the face of the touch screen. You could use the multitouch sensor without the screen, but still be able to configure arbitrary devices to go on the front.
I even had a design for a joystick. Lots of analog opportunities when you have something like a back-illuminated camera or a touch controller that can sense areas. You could also serially transfer data from the device to the touch screen, either using physical touches or electrically simulated touches.
I'd be surprised if the concept weren't already patented, though, just because the idea of a generalized capacitive control surface seems fairly obvious, and the patent office doesn't really apply an "obviousness" test. What definitely surprises me is that, patented or not, I can't just go out and buy these sorts of controls.
We would then be able to receive that pattern and understand how the knob is rotated.
What's the resolution of today's touchscreens?
But for simple things, if possible without affecting the other parts of the screen, it'd be amazing to have a broadly supported, low bandwidth standard.
This might be as simple as "Here's my encoded position * frequency of sampling", but for a general interface you'd want something adaptable.
What if there were two dials on the control? Two dials and three buttons? One dial, four buttons, and a joystick?
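Purely as a sketch of what the adaptable part could look like (nothing here is a real protocol; every name is invented): the overlay announces a list of control descriptors once, then streams per-control state updates.

    from dataclasses import dataclass
    from enum import Enum

    class ControlKind(Enum):
        DIAL = 1
        BUTTON = 2
        JOYSTICK = 3

    @dataclass
    class ControlDescriptor:
        control_id: int
        kind: ControlKind
        resolution: int      # e.g. detents per revolution for a dial
        sample_rate_hz: int

    @dataclass
    class StateUpdate:
        control_id: int
        value: float         # dial position, button 0/1, one joystick axis

    # Two dials and three buttons is just five descriptors:
    overlay = [
        ControlDescriptor(0, ControlKind.DIAL, resolution=96, sample_rate_hz=120),
        ControlDescriptor(1, ControlKind.DIAL, resolution=96, sample_rate_hz=120),
        *[ControlDescriptor(i, ControlKind.BUTTON, 1, 120) for i in (2, 3, 4)],
    ]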
Check this: http://huyle.de/2019/02/12/accessing-capacitive-images/ As you can see, the sensor elements are huge, 4×4 mm each, i.e. there are only 15×27 sensors for the complete touch screen. On top of that, there's a lot of noise in the signal of each sensor.
The reason it works OK in practice: fingers have a very predictable shape, and there's a lot of software involved at all levels of the stack. Touch screen firmware filters out sensor noise and generates touch points. Higher-level GUI frameworks "snap" touches to virtual buttons; some platforms go as far as making virtual keyboard buttons different sizes, depending on which virtual keys are expected to be clicked next according to predictive input software, i.e. dictionaries.
What you propose probably can be done by using a finger-like object, but I don't expect the resolution will be great. At least not in comparison with hardware turning knobs; even cheap ones can be made extremely precise. See https://en.wikipedia.org/wiki/Rotary_encoder and https://en.wikipedia.org/wiki/Incremental_encoder for more info; both are used in a wide variety of applications. Old mice with a ball had two of them, and the reason ball mice sucked was not sensor precision, it was dirt accumulation, which is a minor issue for a knob.
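For intuition on why even cheap incremental encoders are precise: the two offset channels produce a 2-bit Gray code, and each valid transition encodes both a step and its direction, so decoding is a tiny lookup table. A minimal sketch:

    # Map (previous AB state, new AB state) -> -1, 0, or +1 step.
    TRANSITIONS = {
        (0b00, 0b01): +1, (0b01, 0b11): +1, (0b11, 0b10): +1, (0b10, 0b00): +1,
        (0b00, 0b10): -1, (0b10, 0b11): -1, (0b11, 0b01): -1, (0b01, 0b00): -1,
    }

    def decode(samples):
        """Accumulate position from a stream of (A, B) channel samples."""
        position, prev = 0, None
        for a, b in samples:
            state = (a << 1) | b
            if prev is not None:
                # unknown transitions count as 0 (no move, or a glitch)
                position += TRANSITIONS.get((prev, state), 0)
            prev = state
        return position

    # One full clockwise Gray-code cycle = 4 counts:
    print(decode([(0, 0), (0, 1), (1, 1), (1, 0), (0, 0)]))  # -> 4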
 VR and AR Fundamentals: https://medium.com/@michaelnaimark/vr-ar-fundamentals-prolog...
 Other Senses (Touch, Smell, Taste, Mind): https://medium.com/@michaelnaimark/vr-ar-fundamentals-3-othe...
 Food Simulator: https://www.wired.com/2003/08/slideshow-wonders-aplenty-at-s... https://ars.electronica.art/center/en/food-simulator/ http://icat.vrsj.org/papers/2003/00876_00000.pdf
Hiroo Iwata is a brilliant mad scientist, and in a previous HN discussion about pie menus and haptic multitouch interfaces, I linked to his wonderful work on a 3DOF Multitouch Haptic Interface with Movable Touchscreen.
 Professor Hiroo IWATA: http://www.frontier.kyoto-u.ac.jp/te03/member/iwata/index.ht...
 HN discussion of pie menus and haptic multitouch interfaces: https://news.ycombinator.com/item?id=17105984
 3DOF Multitouch Haptic Interface with Movable Touchscreen: https://www.youtube.com/watch?v=YCZPmj7NtSQ
 3DOF Multitouch Haptic Interface with Movable Touchscreen: https://link.springer.com/chapter/10.1007/978-981-10-4157-0_...
>Shun Takanaka, Hiroaki Yano, Hiroo Iwata, Presented at AsiaHaptics2016. This paper reports on the development of a multitouch haptic interface equipped with a movable touchscreen. When the relative position of two of a user’s fingertips is fixed on a touchscreen, the fingers can be considered a hand-shaped rigid object. In such situations, a reaction force can be exerted on each finger using a three degrees of freedom (3DOF) haptic interface. In this study, a prototype 3DOF haptic interface system comprising a touchscreen, a 6-axis force sensor, an X-Y stage, and a capstan drive system was developed. The developed system estimates the input force from fingers using sensor data and each finger’s position. Further, the system generates reaction forces from virtual objects to the user’s fingertips by controlling the static frictional force between each of the user’s fingertips and the screen. The system enables users to perceive the shape of two-dimensional virtual objects displayed on the screen and translate/rotate them with their fingers. Moreover, users can deform elastic virtual objects, and feel their rigidity.
There are some other really bizarre examples of haptic interfaces in the AsiaHaptics2016 conference videos! (Not all safe for work, depending on your chosen profession, predilection for palpation, and assessment of sphincter tone.) 
 AsiaHaptics2016: https://www.youtube.com/channel/UC8qMmIgmWhnQBeABjGlzGbg/vid...
So, Star Fleet made the same transition as the US Navy, from a touchscreen to 3D tactile controls.
The Galaxy class was the first class of ship with a saucer separation system that could be re-attached while in flight. I believe the Constitution class did have the ability to separate its saucer, but that was accomplished with explosive bolts and it couldn't re-attach without being in drydock.
Tactically that would only make sense if the engineering hull could keep an enemy ship entirely occupied; if there was more than one enemy ship engaging, then it's going to be difficult to keep them all from pursuing the saucer.
You are right though, if you did suffer a warp core breach you aren't going anywhere quickly. But in almost all situations help is only one subspace call away.
Huh? No way; whether you're in a battle with 3 cloaked Romulan warbirds, or have an imminent warp core breach, you have minutes, at most, to get help. Other starships aren't that close by.
I didn't watch Voyager much, and never saw that episode, but this is extremely disappointing. The ST:TNG Technical Manual (which came out before VOY) clearly addressed this issue, way way back in the early 90s. You can actually feel touchscreen controls, because they have miniature force fields/tractor beams that provide the same tactile sensation you get with mechanical controls. Didn't the writers of VOY ever read the TNG Tech Manual?
It's of course even more disappointing that a sci-fi TV show in the 80s/90s was able to address this important HMI issue in a book meant just for geeky fans, yet 25 years later people in the industry still don't get it. Of course, we don't have tractor beams or force fields to implement what they wrote about in the tech manual, but it does show the show's technical consultants were thinking about and aware of this issue back then, 15 years before slate-style smartphones were even invented, and that maybe we should not be using touchscreens for certain controls until we do have force fields or some other workaround.
>and I mean HATED the touch screen when it came to virtual potentiometers (one operator got up and walked away saying "this screen is a fucking piece of shit")
That operator was correct. Virtual potentiometers on a touchscreen are a horrible idea and miserable to use.
Nope, in fact they were expressly prohibited from writing about the technology. Instead of doing their built-world homework and writing it coherently into the story, the writers just had to put "[TECH]" into the scripts. Then the technical consistency editors came along and filled that stuff in. Not even kidding. This led to some really disastrous (IMO) early scenes in Voyager, such as one where two characters (Cpt. Janeway and B'Elanna, IIRC) are bonding while solving a technical crisis... and the dialogue is a total hash because it was "co-written" using a completely insane method.
You can add all the haptic and UI flair you want in, but it's not going to make my nerves tingle in the way my brain expects.
Video (in German): https://www.youtube.com/watch?v=tJjp-P9jZCk
Most industrial touchscreens I've seen are resistive which makes them much more difficult to use, especially for dragging motions, so that may be compounding the problem. Capacitive screens are generally much easier to use.
I bought an origami cover for the reader, so it can be put at an angle on the desk.
The combination of both means that I need to turn it off; otherwise it would randomly switch pages when it is in my pocket.
I miss my first Kindle with its physical page turning buttons.
I'm not anti-touchscreen by any measure, but did you really find that surprising? It seems common knowledge to me that keyboard shortcuts are faster than touchscreens for most tasks.
That common knowledge is wrong, but it comes from an important truth: brand new users who are not yet committed to your product will get frustrated if the thing they need to do is hidden away as a key-binding or command-line command.
So successful products optimise for the UX of a user who doesn't yet know how to use the product well. And such users really love touchscreens.
If you look at the menus in Win 3.1, for example, you will see that nearly every menu item has both the Alt-menu keystroke shortcut and the global keyboard shortcut to the right of the menu entry. E.g. Alt-F, S was Save, but some applications might have also put another keyboard shortcut to the right, say Ctrl-S or something.
This means that your average user who kept clicking the File->Save menu could see that Alt-F, S (via the underlines) would perform the same function without the mouse, or just that there was a simple shortcut.
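(The same pattern is still a one-liner in modern toolkits. A minimal tkinter sketch, purely as an illustration: 'underline' marks the Alt mnemonic and 'accelerator' prints the shortcut next to the entry, while the actual binding stays explicit.)

    import tkinter as tk

    root = tk.Tk()
    menubar = tk.Menu(root)

    filemenu = tk.Menu(menubar, tearoff=0)
    filemenu.add_command(label="Save", underline=0, accelerator="Ctrl+S",
                         command=lambda: print("saved"))
    menubar.add_cascade(label="File", menu=filemenu, underline=0)
    root.config(menu=menubar)

    # 'accelerator' is only a label; the shortcut itself must be bound:
    root.bind("<Control-s>", lambda e: print("saved"))
    root.mainloop()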
Windows gradually removed this, and in the XP timeframe you only saw the shortcuts when the Alt key was pressed while the menus were active. This of course defeats the purpose of sticking the shortcut in the user's face every time they click the menu, and the concept has stuck around to the point where in Win 10, Google is pretty much the only way to discover shortcuts, if they even exist (which is hardly rare). This is part of the reason I stick to the classic interface in Win 7; with the right tweaks it still puts the keyboard shortcuts in your face.
I only wish that modern UI designers spent a day or two actually reading the human/computer interaction guidelines and research before calling themselves experts...
I think the problem is that touchscreens get ported to applications where there should be a reasonable expectation that the end user is an expert in the system. For example, cars and aircraft. Touchscreens are great when you have portable systems that have to condense a lot of functionality into a small device, but I don't want to be in a position where a pilot has to touch the correct button on a touchscreen in the middle of serious turbulence. Likewise, no driver should be taking their eyes off the road to navigate to the air-conditioning tab. Applying touchscreens in these situations is not only bad engineering, it's outright dangerous. You have to demonstrate competent control of a vehicle just to operate it, so we shouldn't be assuming operators are brand new users that aren't committed to the product.
All of that results in undiscoverable interfaces designed for first-time users and horrible for everything else, and thus the cult of first the mouse, now the touchscreen.
The person you responded to wrote:
> custom control board with specialized buttons and keyboard
So we weren’t even talking about keyboard shortcuts in the ctrl-c sense, but that the specific action has a specific button.
How, indeed, could that have been a surprise.
Here's my stupid story:
On the laser cutter I operate, there is no way to tell the machine "the stock / remnant / offcut I want to cut the parts from is located at x,y".
The machine has a touch screen display which shows the cutting head's current location. So you put it in manual mode, drive the cutting head to the start position matching your material's location, then type the coordinates it displays as a graphic into two fields in a dialogue box.
There’s no button on the machine or touch screen to automate that.
This is why I think UX professionals will be the first against the wall when the revolution comes.
Thankfully I'm aware of AutoHotKey and Capture2Text, so I wrote a script to turn a keyboard shortcut into a series of mouse movements, clicks, and OCR, taking the graphical display of numbers and turning it into strings of numbers.
I still can't believe there hasn't been a software update to implement a feature I can build into a compiled .exe, and which literally took me 45 minutes to go from aware-the-tools-exist to implementation.
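(Not the actual AutoHotKey script, but a rough Python equivalent of the same trick using pyautogui and pytesseract; all coordinates are made up and would be measured once for the machine's dialog layout.)

    import pyautogui
    import pytesseract

    COORD_DISPLAY = (840, 120, 160, 40)        # region where the head position is drawn
    X_FIELD, Y_FIELD = (400, 300), (400, 340)  # the two dialog input fields

    def copy_head_position_into_dialog():
        # OCR the on-screen coordinate readout, e.g. "1234.5 678.9"
        shot = pyautogui.screenshot(region=COORD_DISPLAY)
        text = pytesseract.image_to_string(shot, config="--psm 7")
        x_str, y_str = text.split()[:2]

        # retype the numbers into the dialog fields
        pyautogui.click(*X_FIELD)
        pyautogui.write(x_str)
        pyautogui.click(*Y_FIELD)
        pyautogui.write(y_str)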
Someone give me a billion dollars already. I’m clearly a genius.
> You weren’t even talking about that.
> The person you responded to wrote:
> > custom control board with specialized buttons and keyboard
> So we weren’t even talking about keyboard shortcuts in the ctrl-c sense, but that the specific action has a specific button.
> How, indeed, could that have been a surprise.
I considered typing keypad or array of buttons, but I figured keyboard conveyed the intended meaning closely enough.
Yamaha has a nice idiom called "touch-and-turn" where there's an unassigned encoder right next to the screen which manipulates whichever parameter you tap onscreen. Navigating the touchscreen is at least as nice as visually scanning a large-format analog console to find the right handle, but a knob is the right way to actually tune it. It works well.
Your hands spend little if any time on the touchscreen during the show, but it helps you understand what's going on with the physical control surface and grab rarely accessed parameters, not worthy of dedicated console real estate, when needed.
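The idiom boils down to a tiny bit of state: the last parameter you tap becomes the encoder's target. A sketch (parameter names invented, step size assumed):

    class TouchAndTurn:
        def __init__(self, params):
            self.params = params     # name -> current value
            self.focused = None

        def on_tap(self, name):
            # tapping a parameter on the touchscreen focuses it
            self.focused = name

        def on_encoder(self, detents):
            # the one unassigned encoder edits whatever has focus
            if self.focused is not None:
                self.params[self.focused] += detents * 0.5  # e.g. 0.5 dB/detent

    desk = TouchAndTurn({"ch1_gain": 0.0, "ch1_hpf": 80.0})
    desk.on_tap("ch1_gain")
    desk.on_encoder(+4)
    print(desk.params["ch1_gain"])  # 2.0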
Of _course_ they did.
But nobody listens to the UX/HCI researchers and academics, when the touchscreen vendor's sales team are inviting them to fully catered lunches on a tropical golf course...
Business as usual.
It happens with _everything_ else in this stupid industry, what're the chances of it not happening just the same when the tech industry targets the military industrial complex???
Unlike in business, military purchasing is, I think, generally run by people who were once users of the systems they were purchasing, and are not unlikely to rely on them again. Far more likely than, 'I don't care if the ensign standing watch on my next ship can do his job, this Mai Tai tastes great!' is 'wow, I see all these touch screens on TV shows, in movies and in the Tesla cars the print media tells me are the wave of the future; we need to get the New Hotness™ for our future combat systems too!' and 'hey everybody, look at this awesome futuristic 21st century cockpit my team designed!' They were, I think, completely well-intentioned and genuinely believed that they were doing a good job — and the few folks who opposed them probably sounded like cranky old guys ('why, in my day you had to get three stout sailors to man the rudder, and we liked it that way!').
Our entire culture is neophilic; military purchasers live in our culture; is it any surprise that they might be neophilic too?
Maybe that's just asking too much these days.
You could replace that with the more general "Humans as usual."
As pointed out in the article, this is not a good human-factors process when the key set of command options is fixed and unlikely to change. And hard controls for similar functions across equipment give better cost savings in human training time than "soft" controls give at acquisition time.
I do think military acquisition folks can "fall in love" with "sci fi" interfaces but there has been a ton of excellent research at NASA Ames which disputes the utility of such interfaces.
(I despise Gizmodo, so it's a deep link for this one)
So if there are things you can do with the touchscreens, it's likely to be a whole lot of small things you might want to do in rare, non-emergency situations. That's a pretty good fit for a touch screen. You can potentially give the astronauts a huge amount of control over all aspects of the spacecraft without creating a mountain of buttons and knobs. It could even lead to better safety, since you know that every single physical control you see is important, and there's less room for mistakes when using only the physical interface.
I'm not an expert in this area, so I could be wrong, but my impression is that SpaceX has a pretty good design here. Not necessarily better or worse than Boeing's approach. It's not like NASA would let astronauts fly on it if they thought it was unsafe or hard to operate.
That said, it does look like they are prioritizing aesthetics over functionality of the control layout.
Same with smartwatches: I hate using touch controls to operate them. I really prefer physical buttons, since they provide multiple benefits:
- keep the display smudge-free
- work with gloves (useful during winter)
- allow actions from touch memory (i.e., skip a song without looking at it)
Having a thoughtful default UI along with the ability to display any control or status on the screen would seem like a better idea for a central control station. A captain could pull up their own configuration for the control panel. During an emergency, they could fall back to a default view that could be referenced by operating procedure documents. Give the ability to delegate controls to other stations, but do not allow multiple stations to control the same input; that seems like an idiotic idea.
Add labels on the display and use a set of generic controls and boom, you can use the same hardware for a ton of functions given the right abstractions.
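The abstraction is basically one indirection table. A sketch (every name invented): each mode maps the generic controls to actions, and the on-screen labels are drawn from the same table so label and behavior can never disagree.

    SOFTKEY_MAP = {
        "normal": {
            "knob_1": ("Helm trim",     lambda v: print("trim", v)),
            "key_1":  ("Silence alarm", lambda v: print("alarm off")),
        },
        "emergency": {
            "knob_1": ("Throttle", lambda v: print("throttle", v)),
            "key_1":  ("All stop", lambda v: print("ALL STOP")),
        },
    }

    def label_for(mode, control):
        return SOFTKEY_MAP[mode][control][0]   # rendered next to the control

    def handle(mode, control, value=None):
        SOFTKEY_MAP[mode][control][1](value)

    handle("emergency", "key_1")   # -> ALL STOP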
This is how, for example, the interface of the $50k Arri Alexa looks (the de facto standard in cinema cameras):
There were lower-priced cameras like the Blackmagic Production Camera which used only a touchscreen.
And you can guess why that Blackmagic model is discontinued...
In fact the UI can be identical for a device with softkeys vs. one with a touchscreen. It causes a few DOH! moments when you are working with some instruments that work one way and some the other, but this isn't nearly as annoying as you might expect.
You still have to look at the screen though. Unless maybe you could 'lock' the UI into one mode (with no sub-menus), which might be a good compromise for vehicles.
The modular scene generally frowns upon systems that need deep menu diving, which is most of the time an indicator that your interface design is lacking.
I don't think it's that researchers think it's good. I think that executives and programmers do.
I am no expert on this, but from talking to people in the aviation industry I noticed a slow trend away from the cockpit with 2,000 controls and towards computer screens which switch between displaying multiple things. Or rather, there still are 2,000 controls, but thanks to the computer screens, the number hasn't blown up to 200,000.
So my guess is the touchscreens in these ships replaced some computer screen where input (and mode switching) had been done by physical buttons. And now they are moving it back.
OTOH, controls that you normally, and especially in a critical situation, reach for and operate without looking are a different thing. They should be stationary and provide good tactile feedback.
For planes, that means controls required during normal flight can be touch controls. After all, on Concorde and the Tu-144, pilots didn't even have visual. However, during departure and landing, pilots need physical buttons and switches, as that is visual flight.
Another conclusion is that fighter jets can’t have any touch controls, they are supposed to be in visual all the time.
No touchscreens for in-flight use, though, although iPads became common as a replacement for paper charts.
Touchscreens are vulnerable to fat-fingering.
Every time I carry my touchscreen laptop, it goes crazy because of accidental touches, moves, and brushes, and if I forget to lock the screen completely, this borders on disaster: deleted files, moved folders, sporadically launched apps, etc.
To be fair, though, the crucial controls are still under my feet and under my thumbs on the steering wheel, and those are all physical/tactile.
On a more serious note: the touch screen is the greatest "generalist" UI configuration. A whole lot is possible, but no UI is any good, in the sense of having tactile elements.
Apple could still make it happen with a future version of their Taptic Engine. Preferably with hover haptic feedback, where you feel a "tingle" or something before touching the screen.
Touchscreens are the future™
However, I would not be certain I could type well on a full touch-screen keyboard such as the Optimus Tactus. I do my typing blind; a touch screen would constantly require me to find the correct position for my fingers by looking, while on a normal keyboard that is easy to do by touch.
I've never seen a MacBook Pro with a touchscreen keyboard. Can you point me at one?