
Wait. Let me get this straight.

Touchscreens, which are ideal for representing multiple user interfaces in an intuitive way but require constant visual contact, are being replaced by a physical user interface that can be used by both touch and memory?

Why was touchscreen ever even a consideration for controls you're not looking at?

Touchscreen is great for phones. It is awful for keyboards (see macbook pro). And it is even more awful for controls. Has nobody researched this before spending a few hundred million dollars?




Well, it worked on Star Trek. Then again, that was addressed in a Voyager episode when Tom Paris designs the helm in their new runabout with old-fashioned buttons and switches because he wanted to actually feel the controls, much to Tuvok's dismay.

Though, on a serious note: I work in industrial automation, and user interfaces (a.k.a. HMIs) have been touch-oriented for quite some time. It was and still is common to see graphical elements that emulate the look of physical buttons used on machinery. This was done to help operators who were used to panels full of buttons, knobs, and switches navigate touch screens.

Recently I converted a machine that had half analog and half digital controls to all-digital control. I first started with a full touch interface with provisions for knobs and buttons. During testing operators hated, and I mean HATED the touch screen when it came to virtual potentiometers (one operator got up and walked away saying "this screen is a fucking piece of shit"). I installed encoder knobs to emulate potentiometers and it was a success. Everyone was happy.

Some things just can't be done with a screen. You need physical things to interact with.


> Well, it worked on Star Trek.

IIRC the official explanation from the makers of Star Trek is that the displays look like simple touch screens, but they are actually overlaid with structured force fields, so for the users it feels like a physical button press.


I saw a touch screen for controlling AV equipment at Sun decades ago that had a single solenoid underneath the glass that could thunk to provide tactile feedback as you pressed buttons and dragged sliders.


My god. Sun invented the Taptic Engine.


I don't think they actually designed it, they just used it in one of their conference rooms.

By the way, are you "Le Jojo" of "Jojo on UI", or a different "that_jojo"? ;)

http://www.art.net/~hopkins/Don/unix-haters/x-windows/jojo-o...


Totally different jojo, sadly.


I've played with some tech demos that used electrostatic stiction (I may be remembering the name wrong) to create friction on a glass touch panel.

The high voltage needed makes it a no-go for most/all portable devices.

I played with some other "controllable friction" tech but don't remember the basis of it. IIRC one actually made small indentations in the screen to simulate buttons. It had a bunch of limitations; I think the button placement was baked in at time of manufacture, but it's been a while and I don't trust my memory on the topic.


tbf this could be achievable with Apple's virtual click tech. Would like to see how that'd work in practice.


I assume that actions like turning/twisting do not work well on touch screens. You can do sliders, but for some things that may not give the correct control. I can probably turn a knob at finer increments than I can slide my finger on the screen.


It is all about haptic feedback, and low latency between action and outcome.

You can twist virtually with good precision. But what is much harder is the feel and state of the knob, particularly when it is released and then gripped again.

Mechanically, the structure of the knob can take some energy input, and it serves as a mechanical pivot, or fulcrum, depending on how people use a knob.

Without all those physical things, people lack the complex frames of reference needed for fine, "thought is action" type control.


> You can twist virtually with good precision. But what is much harder is the feel...

This is especially true in environments where gloves are worn.


In addition, the essential ship controls must work in rough weather, so you want something you can grasp to keep your hand steady. So long as airplanes have human pilots, I do not suppose the physical stick, rudder pedals and throttles are going away.


Yes. And it communicates state. Where that input device is matters.


> actions like turning/twisting do not work well on touch screens

Have you ever used an iPod (capacitive touchpad) scroll wheel?

A good physical knob is still better, but a touch wheel can be made pretty decent.


The iPod wheel has a physical barrier acting as an affordance for your finger to trace. It certainly works better than it would just floating in space in the middle of a touch screen.


I wonder if it's much easier to process the signal (at the level of measuring and interpreting the capacitances) if you assume a one-dimensional user input. I'm not sure what signal processing happens when sliding on a touchscreen/trackpad.


An iPod touchwheel can be used without looking at it. It's much more like a knob than a touchscreen display, even if it uses similar technology.


I imagine the Star Trek interface can really apply arbitrary forces. So you can simulate a knob, for example, not just a tactile sensation when you're already in (near) contact with the interface.


A really advanced Star Trek interface would use holo/replicator tech to materialize physical knobs, switches, and sliders in a user-defined configuration as they activated the console.


That sort of technology, if scale was not an issue, could allow you to make a miniaturized version of the situation and use your hands to move the ship among the other elements, and that input would then be translated into engine/thruster settings. The model ship would resist movement as needed.


This! This!


There were no knobs in their UI as far as I can remember :)

A bigger thing (for me) is being able to sense gaps/shapes without pressing anything, and a fixed layout - touchscreens are all about change, but that's only good for a UI that you look at.


Isn't this what Project Soli [0] was trying to achieve?

[0] https://atap.google.com/soli/


AFAIK Soli is the recognition part (through radar).

Something that creates physical feedback is, for example, that Disney VR project where they use air to create the feeling of resistance.


> I installed encoder knobs to emulate potentiometers

There's really no need to force this decision to go in either direction. Ever since I first heard of capacitive multitouch, going back to the timeframe before the first iPhone announcement, I've been waiting for someone to build 'stick-on' encoder knobs that the touchscreen controller can read.

These would simply take the form of a knob with a metal leaf or other polygonal electrode in its base, whose rotation could be sensed by code similar to that used to implement crappy 'virtual potentiometers' on existing touchscreens. The fixed part of the knob base would be epoxied or otherwise bonded directly to the screen surface, or perhaps held in place with some sort of frame.

Doesn't seem to have happened yet despite being an incredibly obvious (and inexpensive!) thing to do. Seems like the MIDI community would be all over something like this, even if no one else considered it worthwhile.
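
To make it concrete, here's a minimal sketch of the sensing side, assuming the knob's base presents two conductive pads to the touch controller (a fixed hub pad plus an offset electrode that swings around it). All names and coordinates are made up:

    import math

    def knob_angle(center, satellite):
        # Angle of the offset electrode relative to the fixed hub, in radians
        return math.atan2(satellite[1] - center[1], satellite[0] - center[0])

    def knob_delta(prev_angle, center, satellite):
        # Change in rotation since the last sample, wrapped to [-pi, pi)
        angle = knob_angle(center, satellite)
        delta = (angle - prev_angle + math.pi) % (2 * math.pi) - math.pi
        return angle, delta

    # Two "touches" as reported by the touchscreen controller:
    angle, step = knob_delta(0.0, center=(120.0, 80.0), satellite=(128.0, 83.0))

The same loop that tracks finger drags could track the satellite pad, so there'd be no electronics at all beyond the passive knob itself.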


This reminds me of the Surface Dial. https://www.microsoft.com/en-us/p/surface-dial/925r551sktgn

Being able to set the dial on the screen and just turn it is a really good-feeling workflow, though likely not for something as mission-critical as the article is describing.


What I had in mind would cost closer to $0.99 than $99.00, though. (Well, OK, $9.99.) The Surface Dial was a relatively complex Bluetooth device; it didn't work through the touchscreen itself, except to the extent that the touchscreen somehow knew that a Surface Dial was resting on it.


According to the spec it uses the touchscreen, or at least some touchscreens:

>On-screen detection: Touch digitizer reports the onscreen location through a capacitive pattern (Studio only)

https://www.microsoft.com/en-us/p/surface-dial/925r551sktgn?...


Right, to sense location, not rotation. Rotation goes through Bluetooth, as with any number of existing knob controllers.


Funny, I have a whole sketch for this. Well, the cat is out of the bag; we might as well complete this to prevent any patents.

I envision using cam-levered suction cups to hold on rotary knobs and linear sliders that have touchscreen-sensitive rubber tips. One could go as far as to 3D print arbitrary interaction devices that attach to the face of the touch screen. You could use the multitouch sensor without the screen but still be able to configure arbitrary devices to go on the front.

I even had a design for a joystick. Lots of analog opportunities when you have something like a back-illuminated camera or a touch controller that can sense areas. You could also serially transfer data from the device to the touch screen, either using physical touches or electrically simulated touches.


Yep, any number of other controls besides knobs would work under the same basic principle. A linear slider control would be an obvious one, as would calculator-style membrane buttons.

I'd be surprised if the concept weren't already patented, though, just because the idea of a generalized capacitive control surface seems fairly obvious, and the patent office doesn't really apply an "obviousness" test. What definitely surprises me is that, patented or not, I can't just go out and buy these sorts of controls.


These are great ideas! Tbh I think that much of the reason these don't exist is the configurability of software UIs - these controls in Linux/Windows/OSX would need to be specifically programmed for by the programmer, rather than, say, in Squeak by the users. Breaking down that user/programmer barrier is key, I think.


What if the metal base of the rotatable knob had a certain rotationally asymmetric pattern instead of being completely flat?

We would then be able to receive that pattern and understand how the knob is rotated.

What's the resolution of today's touchscreens?


It's an interesting thought. You'd want to use a full bandwidth data channel (e.g. Bluetooth) for anything complicated.

But for simple things, if possible without affecting the other parts of the screen, it'd be amazing to have a broadly supported, low bandwidth standard.


I can't really make sense of the bandwidth (information rate) of a tactile interface. A measure in units of length would make more sense to me.


Any digital control signal effectively has a minimum bandwidth.

This might be as simple as "Here's my encoded position * frequency of sampling", but for a general interface you'd want something adaptable.

What if there were two dials on the control? Two dials and three buttons? One dial, four buttons, and a joystick?
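
Back-of-the-envelope, using the "encoded position * frequency of sampling" framing above (all figures made up):

    # Rough minimum data rate for a naive position-times-rate encoding,
    # ignoring framing/addressing overhead. Numbers are illustrative.
    position_bits = 12      # a 4096-step dial
    sample_rate_hz = 100    # fast enough to feel responsive
    num_controls = 4        # e.g. one dial and three buttons (buttons only need
                            # ~1 bit, so this overestimates, but it bounds it)

    min_bits_per_second = position_bits * sample_rate_hz * num_controls
    print(min_bits_per_second)  # 4800 bit/s: trivial for Bluetooth

A few kilobits per second is nothing over a real radio link; the hard part would be squeezing it through the touch-sensing channel itself.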


It's enough for knobs of 1-inch spacing or wider, but there's no easy support for fancy geared knobs that coaxially combine a coarse setting and a fine adjustment.


Very strongly related to what you describe are fiducial markers. They were a big thing in the small pond that was projected touch surfaces some years ago, though I haven't heard much about them in recent years as the industry has headed in a different direction. It's harder to find info on them now, but it's still out there. A quick search yielded https://www.christianholz.net/fiberio.html which contains an image of the concept at the bottom.


Also see Reactable

https://reactable.com/


I’m not sure touch screens have sufficient resolution.

Check this: http://huyle.de/2019/02/12/accessing-capacitive-images/ As you can see, the sensor elements are huge, 4×4 mm each, i.e. there are only 15×27 sensors for the complete touch screen. On top of that, there's a high amount of noise in the signal of each sensor.

The reasons it works OK in practice: fingers have a very predictable shape, and there's a lot of software involved at all levels of the stack. Touch screen firmware filters out sensor noise and generates touch points. Higher-level GUI frameworks "snap" touches to virtual buttons; some platforms go as far as making virtual keyboard buttons different sizes, depending on which virtual keys are expected to be clicked next according to predictive input software, i.e. dictionaries.

What you propose probably can be done by using a finger-like object, but I don't expect the resolution will be great, at least not in comparison with hardware turning knobs; even cheap ones can be made extremely precise. See https://en.wikipedia.org/wiki/Rotary_encoder and https://en.wikipedia.org/wiki/Incremental_encoder for more info; both are used in a wide variety of applications. Old mice with a ball had two of them; the reason ball mice sucked was not sensor precision, it was dirt accumulation, a minor issue for a knob.
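
To put a number on that, here's a rough estimate, assuming (optimistically) that firmware can interpolate a pad's centroid to about a quarter of the 4 mm cell pitch; all figures are illustrative:

    import math

    cell_pitch_mm = 4.0          # sensor element size quoted above
    interp_factor = 4.0          # optimistic sub-cell centroid interpolation
    electrode_radius_mm = 10.0   # offset pad on a ~2 cm knob base

    position_res_mm = cell_pitch_mm / interp_factor          # ~1 mm
    angular_res_rad = position_res_mm / electrode_radius_mm  # small-angle approx
    steps_per_turn = 2 * math.pi / angular_res_rad

    print(round(steps_per_turn))  # ~63 steps/turn; a $1 mechanical encoder does better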


It would require multi-touch displays. I have no idea how ubiquitous those are these days, but I know most touchscreen interfaces in my life have not supported it.


In Michael Naimark's series of articles about "VR and AR Fundamentals" [1], the chapter on "Other Senses (Touch, Smell, Taste, Mind)" [2] discusses haptic feedback, and even mentions Hiroo Iwata's delicious "Food Simulator" [3].

[1] VR and AR Fundamentals: https://medium.com/@michaelnaimark/vr-ar-fundamentals-prolog...

[2] Other Senses (Touch, Smell, Taste, Mind): https://medium.com/@michaelnaimark/vr-ar-fundamentals-3-othe...

[3] Food Simulator: https://www.wired.com/2003/08/slideshow-wonders-aplenty-at-s... https://ars.electronica.art/center/en/food-simulator/ http://icat.vrsj.org/papers/2003/00876_00000.pdf

Hiroo Iwata is a brilliant mad scientist [4], and in a previous HN discussion about pie menus and haptic multitouch interfaces [5], I linked to his wonderful work on 3DOF Multitouch Haptic Interface with Movable Touchscreen. [6] [7]

[4] Professor Hiroo IWATA: http://www.frontier.kyoto-u.ac.jp/te03/member/iwata/index.ht...

[5] HN discussion of pie menus and haptic multitouch interfaces: https://news.ycombinator.com/item?id=17105984

[6] 3DOF Multitouch Haptic Interface with Movable Touchscreen: https://www.youtube.com/watch?v=YCZPmj7NtSQ

[7] 3DOF Multitouch Haptic Interface with Movable Touchscreen: https://link.springer.com/chapter/10.1007/978-981-10-4157-0_...

>Shun Takanaka, Hiroaki Yano, Hiroo Iwata, Presented at AsiaHaptics2016. This paper reports on the development of a multitouch haptic interface equipped with a movable touchscreen. When the relative position of two of a user’s fingertips is fixed on a touchscreen, the fingers can be considered a hand-shaped rigid object. In such situations, a reaction force can be exerted on each finger using a three degrees of freedom (3DOF) haptic interface. In this study, a prototype 3DOF haptic interface system comprising a touchscreen, a 6-axis force sensor, an X-Y stage, and a capstan drive system was developed. The developed system estimates the input force from fingers using sensor data and each finger’s position. Further, the system generates reaction forces from virtual objects to the user’s fingertips by controlling the static frictional force between each of the user’s fingertips and the screen. The system enables users to perceive the shape of two-dimensional virtual objects displayed on the screen and translate/rotate them with their fingers. Moreover, users can deform elastic virtual objects, and feel their rigidity.

There are some other really bizarre examples of haptic interfaces in the AsiaHaptics2016 conference videos! (Not all safe for work, depending on your chosen profession, predilection for palpation, and assessment of sphincter tone.) [8]

[8] AsiaHaptics2016: https://www.youtube.com/channel/UC8qMmIgmWhnQBeABjGlzGbg/vid...


Star Trek's bridge controls can be reconfigured, but there's a standard static configuration that doesn't change while they're in use. This is also referenced at a different point in Voyager when Tuvok is annoyed by how Tom Paris uses a nonstandard configuration. So that at least reduces the need for constant visual contact.


There's another Voyager episode in which Tuvok is temporarily blinded and he tells the computer to turn on the tactile interface so he can work on a computer console. So even 25 years ago before everyone had a touchscreen in their pockets, it was obvious that touch screens wouldn't be great for all situations.


And of course there’s the comment in ‘All good things’ by Dr. Crusher when she says she can’t believe how they got by with 2D controls in the old days, and how the current holographic controls were much easier to use.

So, Starfleet made the same transition as the US Navy: from touchscreens to 3D tactile controls.


Also, in theory, controls for any system can be transferred to any console on the ship; I can think of one episode where Seven transferred helm, tactical, and ops down to an engineering console.


That was part of the problem on the USS McCain: there were so many different modes of control that the crew lost track of the state of the system.


I'd be really interested to see how the UI handled signposting for those handed-off functions, because I don't think the idea is inherently unwieldy, it just needs a good clear implementation.


Is it true that the whole bridge section can be separated and replaced from the saucer, which of course can also be separated from that thing with the engines?


The bridge modules can be fairly easily swapped out in drydock, but I don't believe that can be done while underway.

The Galaxy class was the first class of ship with a saucer separation system that could be re-attached while in flight. I believe the Constitution class did have the ability to separate its saucer, but that was accomplished with explosive bolts and it couldn't re-attach without being in drydock.


That galaxy-class saucer separation thing was honestly not a great idea, and it was somewhat annoying when they used it in the episodes. The problem with it is that only the "stardrive" section has warp capability, so what good is it to be able to separate except for using the saucer as a last-resort escape pod in case of warp engine failure? The saucer is simply too slow to go anywhere in a reasonable amount of time without a warp engine: at sublight speeds, it would take years just to get to the closest star system. This is probably one of the most annoying things about Star Trek: they completely ignored speed-of-light issues like this too often. Using "warp drive" as a plot device to get the characters from system to system in a week or less at FTL speeds is fine, but if you're going to do that, don't fall back on sublight "impulse drive" as something that's actually useful for anything except getting into and out of orbit.


The saucer section can still maintain a warp field for a little while after separation at warp; it could potentially coast quite far out of harm's way.

Tactically that would only make sense if the engineering hull could keep an enemy ship entirely occupied; if there were more than one enemy ship engaging, then it's going to be difficult to keep them all from pursuing the saucer.

You are right though, if you did suffer a warp core breach you aren't going anywhere quickly. But in almost all situations help is only one subspace call away.


>But in almost all situations help is only one subspace call away.

Huh? No way; whether you're in a battle with 3 cloaked Romulan warbirds, or have an imminent warp core breach, you have minutes, at most, to get help. Other starships aren't that close by.


One of the reasons for that functionality, which I don't recall actually making it into an episode: tactical advantage in a fight. The rear section had increased maneuverability when separated, and it resulted in two targets for attackers, both of which could fire back.


Perhaps, but one of those targets is effectively stationary because it's so comparatively slow.


IIRC they used a similar tactic in "The Best of Both Worlds."


Right, before they mainly used it as an escape vehicle in case of warp nacelle or containment issues.


Yeah, the warp core ejection systems always seem to conveniently fail. You would want to dump the entire engineering section and haul ass if you were sitting on an antimatter bomb.


There were also physical backups in case the automated ejection system didn't work. There was always a physical backup to any touch interface that was critical to system operations.


>Well, it worked on Star Trek. Then again, that was addressed in a Voyager episode when Tom Paris designs the helm in their new runabout with old-fashioned buttons and switches because he wanted to actually feel the controls, much to Tuvok's dismay.

I didn't watch Voyager much, and never saw that episode, but this is extremely disappointing. The ST:TNG Technical Manual (which came out before VOY) clearly addressed this issue, way way back in the early 90s. You can actually feel touchscreen controls, because they have miniature force fields/tractor beams that provide the same tactile sensation you get with mechanical controls. Didn't the writers of VOY ever read the TNG Tech Manual?

It's of course even more disappointing that a sci-fi TV show in the 80s/90s was able to address this important HMI issue in a book meant just for geeky fans, yet 25 years later people in the industry still don't get it. Of course, we don't have tractor beams or force fields to implement what they wrote about in the tech manual, but it does show the show's technical consultants were thinking about and aware of this issue back then, 15 years before slate-style smartphones were even invented, and that maybe we should not be using touchscreens for certain controls until we do have force fields or some other workaround.

>and I mean HATED the touch screen when it came to virtual potentiometers (one operator got up and walked away saying "this screen is a fucking piece of shit")

That operator was correct. Virtual potentiometers on a touchscreen are a horrible idea and miserable to use.


> Didn't the writers of VOY ever read the TNG Tech Manual?

Nope, in fact they were expressly prohibited from writing about the technology. In the scripts, instead of doing their world-building homework and writing that coherently into the story, they just had to put "[TECH]" into the scripts. Then the technical consistency editors came along and filled that stuff in. Not even kidding. This led to some of the really disastrous (IMO) early scenes in Voyager, such as one where two characters (Cpt. Janeway and B'Elanna, IIRC) are bonding while solving a technical crisis... and the dialogue is a total hash because it was "co-written" using a completely insane method.


Did not know about the technical manual, just ordered a copy.


It's not surprising that for brains evolved to deal with a physical world, physical controls feel more natural.

You can add all the haptic and UI flair you want in, but it's not going to make my nerves tingle in the way my brain expects.


Last year I had great fun trying out a new knob that a local industrial company, KEBA, is working on/producing. The whole knob is configurable. It can do resistances of any strength and direction, make the output run fast if you turn it fast and then fine-tune slowly like you did on an old radio. You can make it turn just one way or both, or limit it to part of the 360 degrees. You can give it that click feel to separate the 360 degrees into 10 positions, or even 11!

Video in German: https://www.youtube.com/watch?v=tJjp-P9jZCk


I totally agree that a combination of physical controls and touchscreen is the way to go. However there are ways to improve the design for touchscreen. Virtual knobs in my experience are pretty difficult to use on a screen, however sliders work reasonably well in their place.

Most industrial touchscreens I've seen are resistive which makes them much more difficult to use, especially for dragging motions, so that may be compounding the problem. Capacitive screens are generally much easier to use.


I have a Kobo Glo HD with what I think is one or more IR sensors for the touchscreen. It is very sensitive, even a fly walking on the screen will activate it.

I bought an origami cover for the reader, so it can be put at an angle on the desk.

The combination of both mean that I need to turn it off, otherwise it would randomly switch pages when it is in my pocket.

I miss my first Kindle with its physical page turning buttons.


I would love to see ghost touches messing up the industrial touchscreen and wreaking havoc on the machinery.


The more I get into synthesizers, the more I want analog controls (knobs, sliders, buttons, CV cable inputs!)


Are you the reason why my office water dispenser has a touch screen?


The Federation also had to design interfaces to be universal and not assume bipedal humanoids with two hands and an opposable thumb; modern engineers have much narrower requirements.


They had to design them to fit within the production budget.


What about the shiny metal throttle?

http://propsummit.com/upload/408/tt4.jpg


On one project for the automotive industry we had a huge problem with the touchscreen interface because mechanics often have grease and dirt on their hands, and the touch screens we used would get all confused by the residue left on the screen. We first tried to fix it in software, but quickly realized it was much easier to switch to a custom control board with specialized buttons and a keyboard; just wrap them in plastic bags so you don't have to clean them, and it all works like a charm. Unexpectedly, it turned out to be faster for users too.


> Unexpectedly, it turned out to be faster for users too.

I'm not anti-touchscreen by any measure, but did you really find that surprising? It seems common knowledge to me that keyboard shortcuts are faster than touchscreens for most tasks.


Ever since mice and GUIs were invented it was common knowledge that "ordinary" users don't bother to learn keyboard shortcuts.

That common knowledge is wrong, but it comes from an important truth: brand new users who are not yet committed to your product will get frustrated if the thing they need to do is hidden away as a key-binding or command-line command.

So successful products optimise for the UX of a user who doesn't yet know how to use the product well. And such users really love touchscreens.


This is something that has irritated me since Windows/Macs started hiding the UI/shortcut hinting in favor of a "cleaner" interface. UI discoverability was a huge part of human/computer interaction research in the late 70s/80s.

If you look at the menus in Win 3.1, for example, you will see that nearly every menu item had both the alt-menu keystroke shortcut as well as the global keyboard shortcut to the right of the menu entry. E.g. alt-f, s was save, but some applications might have also put another keyboard shortcut to the right, say 'ctrl-s' or something.

This means that your average user who kept clicking the file->save menu could see that alt-f,s (via the underlines) would perform the same function without the mouse, or just that there was a simple shortcut.

Windows gradually removed this, and in the XP timeframe you only saw the shortcuts when the alt key was pressed while the menus were active. This of course defeats the purpose of sticking the shortcut in the user's face every time they click the menu, and the concept has stuck around to the point where in Win 10 Google is pretty much the only way to discover shortcuts, if they even exist (which is hardly rare). This is part of the reason I stick to the classic interface in Win 7; with the right tweaks it still puts the keyboard shortcuts in your face.

I only wish that modern UI designers spent a day or two actually reading the human/computer guidelines/research before calling themselves experts..


The ALT-f-s option still works in a lot of windows software now. Microsoft Office will even overlay the letter of the keypress on top of the menu item.


> So successful products optimise for the UX of a user who doesn't yet know how to use the product well. And such users really love touchscreens.

I think the problem is that touchscreens get ported to applications where there should be a reasonable expectation that the end user is an expert in the system. For example, cars, and aircraft. Touchscreens are great when you have portable systems that have to condense a lot of functionality into a small device, but I don't want to be in a position where a pilot has to touch the correct button on a touchscreen in the middle of serious turbulence. Likewise, no driver should be taking their eyes off the road to navigate to the air-conditioning tab. Applying touchscreens in these situations is not only bad engineering, it's outright dangerous. You have to demonstrate competent control of a vehicle just to operate it, so we shouldn't be assuming operators are brand new users that aren't committed to the product.



But I wasn't talking about onboarding. I was talking about the speed of people using your software to do work on either a touchscreen or a keyboard.


The decision makers in UI design often design for "someone ripped from plow by grenade", as the saying goes in Poland - someone who has no idea about the product at all. Others tend to follow this.

All of that results in undiscoverable interfaces designed for first-time users and horrible for everything else, and thus the cult of first the mouse, now the touchscreen.


You weren’t even talking about that.

The person you responded to wrote:

> custom control board with specialized buttons and keyboard

So we weren’t even talking about keyboard shortcuts in the ctrl-c sense, but that the specific action has a specific button.

How, indeed, could that have been a surprise.

Here's my stupid story:

On the laser cutter I operate, there is no way to tell the machine "the stock / remnant / offcut I want to cut the parts from is located at x,y".

The machine has a touch screen display which shows the cutting head's current location. So you put it in manual mode, drive the cutting head into the start position as per your material's location, then type the coordinates it displays as a graphic into two fields in a dialogue box.

There’s no button on the machine or touch screen to automate that.

This is why I think UX professionals will be the first against the wall when the revolution comes.

Thankfully I'm aware of AutoHotKey and Capture2Text, so I wrote a script to turn a keyboard shortcut into a series of mouse movements, clicks, and OCR, to take the graphic display of numbers and turn it into strings of numbers.

I still can't believe there hasn't been a software update to implement a feature I can build into a compiled .exe that literally took me 45 minutes to build, from aware-the-tools-exist to implementation.
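
For the curious, a rough Python sketch of the same idea (the real script was AutoHotKey + Capture2Text; the screen regions and click targets below are placeholders):

    import pyautogui      # mouse/keyboard automation
    import pytesseract    # OCR; requires the tesseract binary installed

    X_READOUT = (850, 400, 120, 30)   # left, top, width, height of the X display
    Y_READOUT = (850, 440, 120, 30)   # ...and the Y display (made-up regions)

    def read_value(region):
        # Screenshot the coordinate readout and OCR it into text
        return pytesseract.image_to_string(pyautogui.screenshot(region=region)).strip()

    x_text, y_text = read_value(X_READOUT), read_value(Y_READOUT)

    pyautogui.click(600, 500)     # focus the X input field in the dialog
    pyautogui.typewrite(x_text)
    pyautogui.click(600, 540)     # focus the Y input field
    pyautogui.typewrite(y_text)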

Someone give me a billion dollars already. I’m clearly a genius.


Yeah AutoHotKey is surprisingly helpful for things that seem otherwise unautomateable.

> You weren’t even talking about that.

> The person you responded to wrote:

> > custom control board with specialized buttons and keyboard

> So we weren’t even talking about keyboard shortcuts in the ctrl-c sense, but that the specific action has a specific button.

> How, indeed, could that have been a surprise.

I considered typing keypad or array of buttons, but I figured keyboard conveyed the intended meaning closely enough.


Well, not surprising, just unexpected, as we hadn't really been trying to optimize the process for that. This was an industrial-like interface, so there were no keyboard shortcuts in the usual sense, other than hitting Tab to move through the input fields (but that was supported with the touchscreen too). The main speed-up was because of the higher tolerance for imprecision that a mechanical system gives you. You can hit a button while doing something else, as you walk by, or use it wearing gloves, or even hit it with an elbow without having to lay down your tools, which turned out to be a huge time saver in their specific use case. We also added some extra-big buttons next to the keyboard specially for the common tasks, to maximize this effect.


A buddy of mine who has a shop briefly considered a Leap Motion controller to get around the 'greasy hands' problem. I'm not sure if he ever followed through with it.


Entertainment industry control consoles in audio, video, and lighting blend the two pretty well. Hardware faders and encoders for actually manipulating parameters. An LCD right next to each physical control showing its current assignment and meaning. Keyboard buttons for selecting pages and layers, playing back automation, and command-line programming. Touchscreens for situational awareness, drilldown to detailed views, configuration, routing, and navigation of complex hierarchies.

Yamaha has a nice idiom called "touch-and-turn" where there's an unassigned encoder right next to the screen which manipulates whichever parameter you tap onscreen. Navigating the touchscreen is at least as nice as visually scanning a large-format analog console to find the right handle, but a knob is the right way to actually tune it. It works well.

Your hands spend little if any time on the touchscreen during the show, but it helps you understand what's going on with the physical control surface and grab rarely accessed parameters, not worthy of dedicated console real estate, when needed.
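
The idiom is simple enough to sketch. A toy model (not Yamaha's actual firmware, and all parameter names invented):

    class TouchAndTurn:
        # The last parameter tapped on the touchscreen becomes the target
        # of one unassigned hardware encoder next to the screen.
        def __init__(self, parameters):
            self.parameters = parameters   # name -> current value
            self.assigned = None

        def on_tap(self, name):
            if name in self.parameters:
                self.assigned = name       # tapping re-targets the encoder

        def on_encoder_turn(self, delta):
            if self.assigned is not None:
                self.parameters[self.assigned] += delta

    desk = TouchAndTurn({"ch1_gain": 0.0, "reverb_mix": 0.2})
    desk.on_tap("reverb_mix")
    desk.on_encoder_turn(0.05)    # now tuned by feel, eyes on the stage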


> Has nobody researched this before spending a few hundred million dollars?

Of _course_ they did.

But nobody listens to the UX/HCI researchers and academics, when the touchscreen vendor's sales team are inviting them to fully catered lunches on a tropical golf course...

Business as usual.


Do you know that this has happened specifically around touchscreens, or is this a more general observation?


Just generalised Monday morning cynicism...

It happens with _everything_ else in this stupid industry, what're the chances of it not happening just the same when the tech industry targets the military industrial complex???


Principle of charity: don't attribute mistakes to malevolence.

Unlike in business, military purchasing is, I think, generally run by people who were once users of the systems they were purchasing, and are not unlikely to rely on them again. Far more likely than, 'I don't care if the ensign standing watch on my next ship can do his job, this Mai Tai tastes great!' is 'wow, I see all these touch screens on TV shows, in movies and in the Tesla cars the print media tells me are the wave of the future; we need to get the New Hotness™ for our future combat systems too!' and 'hey everybody, look at this awesome futuristic 21st century cockpit my team designed!' They were, I think, completely well-intentioned and genuinely believed that they were doing a good job — and the few folks who opposed them probably sounded like cranky old guys ('why, in my day you had to get three stout sailors to man the rudder, and we liked it that way!').

Our entire culture is neophilic; military purchasers live in our culture; is it any surprise that they might be neophilic too?


I guess some of us are hoping for a little more competence and actually doing some research, like, for instance, trying out these products in field tests to see how soldiers like them and use them and how effective they are with them before committing to a big contract.

Maybe that's just asking too much these days.


Maybe. I have very little faith that sailors "who were once users of the systems" who've worked their way up inside the military to decision making roles have _any clue_ how evil and deceptive and persuasive the sales teams from companies like Adobe or Oracle or Palantir or $touchScreenVendorDeJour can be...


> Business as usual.

You could replace that with the more general "Humans as usual."


Touch screens have a cheaper cost per command input and are more easily changed. Further, you can reconfigure the controls on the screen based on use to achieve way more controls than would fit in a similar amount of space.

As pointed out in the article, this is not good human-factors process when the key set of command options is fixed and unlikely to change. And hard controls for similar functions across equipment give better cost savings in human training time than the "soft" controls give at acquisition time.

I do think military acquisition folks can "fall in love" with "sci fi" interfaces but there has been a ton of excellent research at NASA Ames which disputes the utility of such interfaces.


And this is why I like Boeing's new manned capsule better than the one from SpaceX:

Boeing: https://www.nasaspaceflight.com/2018/08/boeing-starliner-cre...

SpaceX: https://i.kinja-img.com/gawker-media/image/upload/atk7lokher... (I despise Gizmodo, so it's a deep link for this one)


You should keep in mind that the SpaceX capsule is not really meant to be human-operated under normal circumstances, and whatever actions are meant to be taken by humans during normal operation or emergencies supposedly have physical buttons.

So if there are things you can do with the touchscreens, they're likely to be a whole lot of small things you might want to do in rare non-emergency situations. That's a pretty good fit for a touch screen. You can potentially give the astronauts a huge amount of control over all aspects of the spacecraft without creating a mountain of buttons and knobs. It could even lead to better safety, since you know that every single physical control you see is important, and there's less room for mistakes when using only the physical interface.

I'm not an expert in this area, so I could be wrong, but my impression is that SpaceX has a pretty good design here. Not necessarily better or worse than Boeing's approach. It's not like NASA would let astronauts fly on it if they thought it was unsafe or hard to operate.


I don't actually see any touchscreen controls on the SpaceX panel. There are physical buttons in the middle and what appear to be information-only displays to either side. There might be some sliders, but it's hard to tell.

That said, it does look like they are prioritizing aesthetics over functionality of the control layout.


IMO touchscreens are awful in any kind of vehicle or activity requiring you to have your eyes focused on something other than the display.

Like with smartwatches: I hate using touch controls to operate them. I really prefer physical buttons since they provide multiple benefits:

- keep the display smudge-free

- works with gloves (useful during winter)

- allow actions from touch memory (i.e. skip a song without looking at it)


I feel like destroyers are large enough that taking your eyes off the "road" for a moment to adjust controls is perfectly fine. Ships like that have radar systems to detect other boats. These things could probably pilot themselves with enough sensors. They move slowly enough in crowded waterways and have actual radar (versus lidar or cameras), so seeing the speed and direction of other things around them is already built in. Relying on a bunch of tired people to work together to steer a ship seems like a worse idea than having just one well-rested captain who can sit back and observe an autopilot.

Having a thoughtful default UI along with the ability to display any control or status on the screen would seem like a better idea for a central control station. A captain could pull up their own configuration for the control panel. During an emergency, they could fall back to a default view that could be referenced by operating procedure documents. Give the ability to delegate controls to other stations, but do not allow multiple stations to control the same input; that seems like an idiotic idea.
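
That last rule is easy to state in code. A minimal sketch of exclusive delegation (all names hypothetical, not how any actual Navy system works):

    class ControlRouter:
        # Each control input is owned by at most one station at a time,
        # and every handoff is explicit and announced.
        def __init__(self, controls):
            self.owner = {name: None for name in controls}

        def claim(self, control, station):
            holder = self.owner[control]
            if holder not in (None, station):
                raise PermissionError(f"{control} is held by {holder}; release it first")
            self.owner[control] = station
            print(f"HANDOFF: {control} -> {station}")   # surfaced on every console

        def release(self, control, station):
            if self.owner[control] == station:
                self.owner[control] = None

    router = ControlRouter(["helm", "throttle_port", "throttle_stbd"])
    router.claim("helm", "bridge_1")
    # router.claim("helm", "aft_station")  # raises: no silent split control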


There’s something even worse than touchscreens IMHO - those capacitive buttons on non-screen surfaces, like the capacitive power touch points on some monitor bezels. As a visually impaired person, these are absolutely maddening. I don’t know if anyone else gets this, but it’s almost like my brain doesn’t know how to press them. Touchscreens are fine for me, but for some reason I get major tension in my hands when trying to find these surface ‘buttons’. They’re often denoted with a tiny symbol in low contrast, just to make it even worse.


As a visually unimpaired person, those touch buttons are just horrible. Especially so when it is a function that does not have an immediate effect. Just this morning I switched my monitor on, off and on again, simply because by the time it actually reacted, I had pressed the power button again already.


I loved how my XBox would turn on and do its 60 second boot sequence every time I dusted it. What a nice feature.


Or when a cat brushes past, in our case.


My dog intentionally turns on my PS4 in the morning. When she turns on the PS4 it automatically turns on the TV and will play the soothing main menu music. We aren't sure if she does it for the music, the light from the TV, or the beeping noise from when you first press the button, but she has been doing it almost every morning for the last year and a half.


Agreed and they tend to get worse with age for cheaper models since the already low contrast symbol tends to fade away with time.


They were useful because they can be reconfigured on-demand to different interfaces - e.g. you can route helmsman's responsibilities to a different bridge station in case of hardware damage or human casualties.


As someone who designs and builds modular synthesizer modules: so can mechanical interfaces, if designed appropriately.

Add labels on the display and use a set of generic controls and boom, you can use the same hardware for a ton of functions given the right abstractions.
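
In code, that abstraction is roughly a fixed bank of generic controls with per-page labels and handlers bound at runtime. A toy sketch (synth-flavored names invented for illustration):

    # A fixed bank of generic knobs, relabeled per page on the display.
    PAGES = {
        "filter": [("cutoff",    lambda v: print("cutoff", v)),
                   ("resonance", lambda v: print("resonance", v))],
        "env":    [("attack",    lambda v: print("attack", v)),
                   ("release",   lambda v: print("release", v))],
    }

    def labels(page):
        # What the display shows next to each physical knob
        return [label for label, _ in PAGES[page]]

    def on_knob(page, knob_index, delta):
        label, handler = PAGES[page][knob_index]
        handler(delta)               # same hardware, meaning set by the page

    on_knob("filter", 0, -0.1)       # knob 0 currently means "cutoff"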


Yup - just requires the hardware designers to have some knowledge of the required UIs. My main point is that the use of touchscreens was more a case of lazy/cheap design than trend-driven design.


And I agree. Touch screens can be beneficial under certain circumstances — but they are not amazing for anything that has to be reliable and must be operated under stressful conditions.

This is, for example, what the interface of the $50k Arri Alexa (the de facto standard in cinema cameras) looks like: http://nofilmschool.com/sites/default/files/styles/article_w...

There were lower priced cameras like the Black Magic Production Camera which use only a toucscreen.

And you can guess why that Black Magic model is discontinued..


Sure. Softkeys - a row of buttons around the edges of the screen - are how a lot of electronic test equipment works.

In fact the UI can be identical for a device with softkeys vs. one with a touchscreen. It causes a few DOH! moments when you are working with some instruments that work one way and some the other, but this isn't nearly as annoying as you might expect.

You still have to look at the screen though. Unless maybe you could 'lock' the UI into one mode (with no sub-menus), which might be a good compromise for vehicles.


Many modular synth systems also do this without screens e.g. by indicating the state of the system with a row of colored LEDs and then colorcoding the labels on the controls. This certainly offloads the cognitive load to the user, but it also works.

The modular scene generally frowns upon systems that need deep menu diving, which is most of the time an indicator that your interface design is lacking.


With the added benefit of having your control stations look like the bridge of Star Trek space ships.


That's also possible with a common set of hardware controller devices. Just like multiple games are played with the same controller on a console, or the way we use personal computers with a keyboard and mouse.


That was part of the problem on the USS McCain: the crew lost track of where essential control inputs were being routed from.


Just about every Human Computer Interaction focused computer scientist I know is adamant that touch screens are inappropriate and vastly sub-optimal for many of the purposes they are currently employed for.

I don't think it's that researchers think it's good. I think that executives and programmers do.


Touchscreens, even where they don't make sense, are just one of those collective idiocies that regularly sweep the IT world. Hard to say who is at fault - there are many fashion victims in every group. Also many who have doubts or don't care either way, but will happily play along to sell something.


> Touchscreens, which are ideal for representing multiple user interfaces

I am no expert on this, but from talking to people in the aviation industry I noticed a slow trend away from the cockpit with 2,000 controls and towards using computer screens which switch between displaying multiple things. Or rather, there still are 2,000 controls, but thanks to the computer screens, the number hasn't blown up to 200,000.

So my guess is the touchscreens in these ships replaced some computer screen where input (and mode switching) had been done by physical buttons. And now they are moving it back.


Configurable displays that you can quickly glance are great.

OTOH, controls that you normally, and especially in a critical situation, reach and operate without looking are a different thing. They should be stationary, and provide good tactile feedback.


One important fact here is that everything the pilots will need to reach quickly in an emergency still is a physical button. It's just the more "nice to have" things (e.g. programming the cost factor of the engines, putting the route into the autopilot) that are done through the screens. Though, even those inputs are largely done with a funky kind of mouse+keyboard. Actual touchscreens are largely relegated to tertiary functions like airport information displays.


I'm nowhere near an expert and don't know anything, but I gave it a bit of thought. It seems to me that there are two types of vehicles, or two types of operations: instrumental and visual. If a system is operated in instrumental mode, like instrument flight, where all the information necessary is available on the screen in front of you, it seems to me that it would be more efficient to make the screen interactive and turn it into a touch screen. You're only looking at the screen, after all. But when you operate something in visual mode, when you have to monitor the surroundings, the controls should be knobs and buttons, ones that you know from muscle memory and can operate without looking.

For planes that means that controls required during normal flight can be touch controls. After all, on Concorde and the Tu-144 pilots didn't even have visual. However, during departure and landing, pilots need physical buttons and switches, as that is visual flight.

Another conclusion is that fighter jets can't have any touch controls; they are supposed to be in visual all the time.


A scene from a movie that comes to mind is the one from District 9, when they are starting the spaceship at the end. He's totally focused on the screen; everything is happening on the screen, so it totally makes sense for it to be a touch screen. Same goes for Star Trek; I suppose the operators don't actually have visual on that big screen, it's just for comfort and reference.


Aviation tends toward multi-function displays with somewhat common controls used for them. It started with softkeys around the screens; on the recent A350 you have two keyboard/mouse sets for both pilots, plus extra buttons, and a somewhat complex setup of which screens can display what.

No touchscreens for in-flight use, though, although iPads became common as a replacement for paper charts.


+1

Touchscreens are vulnerable to fat-fingering.

Every time I carry my touchscreen-enabled laptop, it goes crazy because of accidental touches, moves, and brushes, and if I forget to lock the screen completely, this borders on disaster: deleted files, moved folders, sporadically launched apps, etc.


I've owned a Tesla Model S for a few years now and its touchscreen has never felt like an impediment.

To be fair, though, the crucial controls are still under my feet and under my thumbs on the steering wheel, and those are all physical/tactile.


They're ideal for Steve Jobs to make marketing presentations of.

On a more serious note - the touch-screen is the greatest "generalist" UI configuration: a whole lot is possible, but no UI is any good, in the sense of having tactile elements.


The ability to route steering, throttle, and other critical controls previously locked to a couple of stations to ANYWHERE in the ship (including, say, the CIC) is a pretty convincing argument to an Admiral, I'd say. Combined with some modular control systems (a la the modular synth comments in the thread) it would be pretty awesome. The problem here was that they kept getting swapped between stations. It wasn't JUST the touch screen, it was also WHICH touch screen, which is a problem that should be resolved in case they're still looking to allow for control switching.


Tell that to Elon Musk’s Tesla


Having an input system with haptic feedback but also context-based controls and customizable layouts is still the Holy Grail.

Apple could still make it happen with a future version of their Taptic Engine. Preferably with hover haptic feedback, where you feel a "tingle" or something before touching the screen.


So, the closest we have to the Holy grail of input systems is the Steam Controller?


Even for gaming it's awful. Playing a video game on a touch screen never felt right; it depends on the game, but for most games it's an awful experience.


In aircraft, yes. Ships? Probably, but the US Navy Surface Warfare community has a special way of believing they are more special than reality.


> Why was touchscreen ever even a consideration for controls you're not looking at?

Touchscreens are the future™


Fully touch-capable keyboards would be awesome, actually. Imagine remapping keys according to apps (games, editing, coding). I am not sure why you are using the MacBook Pro as a reference, as the only issues I had with it are related to the classic keyboard, not the touch.


If you mean that you have physical keys which you can remap, such as the Optimus Maximus keyboard concept [1], that would indeed be a great idea.

However, I would not be certain I could type well on a full-touch-screen keyboard such as the Optimus Tactus [2]. I do my typing blindly; having a touch screen would constantly require me to find the correct position for my fingers by looking, while on a normal keyboard that is easy to do by touch.

[1]: https://www.artlebedev.com/optimus/maximus/ [2]: https://www.artlebedev.com/optimus/tactus/


I wonder how possible it would be to touch-type on a touchscreen. I don't look at my phone keyboard anymore and type quite fast on it.


Shorthand writing should also work very fast on modern phone screens.


>It is awful for keyboards (see macbook pro)

I've never seen a MacBook Pro with a touchscreen keyboard. Can you point me at one?



Except the sentence clearly implies that MacBook Pros use a touchscreen keyboard. That's a hilariously wrong assertion, hence my sarcasm.



