US Navy will replace touchscreen with mechanical controls on its destroyers (theverge.com)
822 points by Grazester 12 days ago | 440 comments




Wait. Let me get this straight.

Touchscreens, which are ideal for representing multiple user interfaces in an intuitive way but require constant visual contact, are being replaced by a physical user interface that can be used by both touch and memory?

Why was touchscreen ever even a consideration for controls you're not looking at?

Touchscreen is great for phones. It is awful for keyboards (see MacBook Pro). And it is even more awful for controls. Has nobody researched this before spending a few hundred million dollars?


Well, it worked on Star Trek. Then again, that was addressed in a Voyager episode where Tom Paris designs the helm in their new runabout with old-fashioned buttons and switches because he wanted to actually feel the controls, much to Tuvok's dismay.

Though, on a serious note, I work in industrial automation, and user interfaces (aka HMIs) have been touch-oriented for quite some time. It was, and still is, common to see graphical elements that emulate the look of physical buttons used on machinery. This was done to help operators who were used to panels full of buttons, knobs, and switches navigate touch screens.

Recently I rebuilt a machine from half-analog, half-digital controls to all-digital control. I first started with a full touch interface with provisions for knobs and buttons. During testing, operators hated, and I mean HATED, the touch screen when it came to virtual potentiometers (one operator got up and walked away saying "this screen is a fucking piece of shit"). I installed encoder knobs to emulate potentiometers and it was a success. Everyone was happy.

Some things just can't be done with a screen. You need physical things to interact with.


> Well, it worked on Star Trek.

IIRC the official explanation from the makers of Star Trek is that the displays look like simple touch screens, but they are actually overlaid with structured force fields, so for the users it feels like a physical button press.


I saw a touch screen for controlling AV equipment at Sun decades ago that had a single solenoid underneath the glass that could thunk to provide tactile feedback as you pressed buttons and dragged sliders.

My god. Sun invented the Taptic Engine.

I don't think they actually designed it, they just used it in one of their conference rooms.

By the way, are you "Le Jojo" of "Jojo on UI", or a different "that_jojo"? ;)

http://www.art.net/~hopkins/Don/unix-haters/x-windows/jojo-o...


Totally different jojo, sadly.

I've played with some tech demos that used electrostatic stiction (I may be remembering the name wrong) to create friction on a glass touch panel.

The high voltage needed makes it a no go for most/all portable devices.

I played with some other controllable-friction tech but don't remember the basis of it. Iirc one actually made small indentations in the screen to simulate buttons. It had a bunch of limitations, I think the button placement was baked in at time of manufacturing, but it's been a while and I don't trust my memory on the topic.


tbf this could be achievable with Apple's virtual click tech. I'd like to see how that'd work in practice.

I assume that actions like turning/twisting do not work well on touch screens. You can do sliders, but for some things that may not give the correct control. I can probably turn a knob in finer increments than I can slide my finger on the screen.

It is all about haptic feedback, and low latency between action and outcome.

You can twist virtually with good precision. But what is much harder is the feel and state of the knob, particularly when it is released and then gripped again.

Mechanically, the structure of the knob can take some energy input, and it serves as a mechanical pivot, or fulcrum, depending on how people use a knob.

Without all those physical things, people lack the complex frames of reference needed for fine, "thought is action" type control.


> You can twist virtually with good precision. But, what is much harder is the feel..

This is especially true in environments where gloves are worn.


In addition, the essential ship controls must work in rough weather, so you want something you can grasp to keep your hand steady. So long as airplanes have human pilots, I do not suppose the physical stick, rudder pedals and throttles are going away.

Yes. And it communicates state. Where that input device is matters.

> actions like turning/twisting do not work well on touch screens

Have you ever used an iPod (capacitive touchpad) scroll wheel?

A good physical knob is still better, but a touch wheel can be made pretty decent.


The iPod wheel has a physical barrier acting as an affordance for your finger to trace. It certainly works better than it would just floating in space in the middle of a touch screen.

I wonder if it's much easier to process the signal (at the level of measuring and interpreting the capacitances) if you assume a one-dimensional user input. I'm not sure what signal processing happens when sliding on a touchscreen/trackpad.

An iPod touchwheel can be used without looking at it. It's much more like a knob than a touchscreen display, even if it uses similar technology.

I imagine the Star Trek interface can really apply arbitrary forces. So you could simulate a knob, for example, not just a tactile sensation when you're already in (near) contact with the interface.

A really advanced Star Trek interface would use holo/replicator tech to materialize physical knobs, switches, and sliders in a user-defined configuration as they activated the console.

That sort of technology, if scale was not an issue, could allow you to make a miniaturized version of the situation and use your hands to move the ship among the other elements, and that input would then be translated into engine/thruster settings. The model ship would resist movement as needed.

This! This!

There were no knobs in their UI as far as I can remember :)

A bigger thing (for me) is being able to sense gaps/shapes without pressing anything, and a fixed layout — touchscreens are all about change, but that's only good for a UI that you look at.


Isn't this what Project Soli [0] was trying to achieve?

[0] https://atap.google.com/soli/


Afaik Soli is the recognition part (through radar).

Something that creates physical feedback is, for example, that Disney VR project where they use air to create the feeling of resistance.


> I installed encoder knobs to emulate potentiometers

There's really no need to force this decision to go in either direction. Ever since I first heard of capacitive multitouch, going back to the timeframe before the first iPhone announcement, I've been waiting for someone to build 'stick-on' encoder knobs that the touchscreen controller can read.

These would simply take the form of a knob with a metal leaf or other polygonal electrode in its base, whose rotation could be sensed by code similar to that used to implement crappy 'virtual potentiometers' on existing touchscreens. The fixed part of the knob base would be epoxied or otherwise bonded directly to the screen surface, or perhaps held in place with some sort of frame.

Doesn't seem to have happened yet despite being an incredibly obvious (and inexpensive!) thing to do. Seems like the MIDI community would be all over something like this, even if no one else considered it worthwhile.
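For what it's worth, the sensing side of such a stick-on knob could be a few lines of geometry: the knob's rotating electrode would show up to the touch controller as an ordinary contact orbiting a known, fixed center, and rotation is just the change in that contact's angle. A hedged sketch with invented names (the knob center and contact coordinates are assumed inputs from the touch stack, not a real driver API):

```python
import math

def knob_angle(center, contact):
    """Angle of the electrode contact around the knob center, in degrees."""
    dx, dy = contact[0] - center[0], contact[1] - center[1]
    return math.degrees(math.atan2(dy, dx))

def rotation_delta(prev_angle, new_angle):
    """Smallest signed rotation between two readings, in degrees.

    Unwraps across the +-180 degree boundary so a knob turned a few
    degrees past 'south' doesn't register as a near-full turn back.
    """
    return (new_angle - prev_angle + 180.0) % 360.0 - 180.0
```

Accumulating `rotation_delta` over successive touch frames gives exactly the stream a crappy "virtual potentiometer" already produces, except the user's fingers are on a real knob.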


This reminds me of the Surface Dial. https://www.microsoft.com/en-us/p/surface-dial/925r551sktgn

Being able to set the dial on the screen and just turn it is a really good-feeling workflow, though likely not for something as mission-critical as the article is describing.


What I had in mind would cost closer to $0.99 than $99.00, though. (Well, OK, $9.99.) The Surface Dial was a relatively complex Bluetooth device; it didn't work through the touchscreen itself, except to the extent that the touchscreen somehow knew that a Surface Dial was resting on it.

According to the spec it uses the touchscreen, or at least some touchscreens:

>On-screen detection: Touch digitizer reports the onscreen location through a capacitive pattern (Studio only)

https://www.microsoft.com/en-us/p/surface-dial/925r551sktgn?...


Right, to sense location, not rotation. Rotation goes through Bluetooth, as with any number of existing knob controllers.

Funny, I have a whole sketch for this. Well the cat is out of the bag, we might as well complete this to prevent any patents.

I envision using cam-levered suction cups to hold on rotary and linear sliders that have touchscreen-sensitive rubber tips. One could go as far as to 3D-print arbitrary interaction devices that could get attached to the face of the touch screen. You can use the multitouch sensor without the screen but still be able to configure arbitrary devices to go on the front.

I even had a design for a joystick. Lots of analog opportunities when you have something like a back-illuminated camera or a touch controller that can sense areas. You could also serially transfer data from the device to the touch screen, either using physical touches or electrically simulated touches.


Yep, any number of other controls besides knobs would work under the same basic principle. A linear slider control would be an obvious one, as would calculator-style membrane buttons.

I'd be surprised if the concept weren't already patented, though, just because the idea of a generalized capacitive control surface seems fairly obvious, and the patent office doesn't really apply an "obviousness" test. What definitely surprises me is that, patented or not, I can't just go out and buy these sorts of controls.


These are great ideas! Tbh I think that much of the reason these don't exist is the configurability of software UIs: these controls in Linux/Windows/OSX would need to be specifically programmed for by the programmer, rather than, say, in Squeak by the users. Breaking down that user/programmer barrier is key, I think.

What if the metal base of the rotatable knob had a certain rotationally asymmetric pattern instead of being completely flat?

We would then be able to read that pattern and work out how the knob is rotated.

What's the resolution of today's touchscreens?


It's an interesting thought. You'd want to use a full bandwidth data channel (e.g. Bluetooth) for anything complicated.

But for simple things, if possible without affecting the other parts of the screen, it'd be amazing to have a broadly supported, low bandwidth standard.


I can't really make sense of the bandwidth (information rate) of a tactile interface. A measure in units of length would make more sense to me.

Any digital control signal effectively has a minimum bandwidth.

This might be as simple as "Here's my encoded position * frequency of sampling", but for a general interface you'd want something adaptable.

What if there were two dials on the control? Two dials and three buttons? One dial, four buttons, and a joystick?
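To put rough numbers on that: the "encoded position times sampling frequency" scheme above needs very little raw bandwidth. A back-of-envelope sketch, with illustrative figures rather than anything from a real standard:

```python
# Minimum payload bits/second to report n absolute-position controls.
# Figures below are illustrative assumptions, not a spec.

def control_bitrate(positions, sample_hz, n_controls=1):
    """Bits/second for n controls, each with `positions` distinct values."""
    bits = (positions - 1).bit_length()   # bits to encode one position
    return bits * n_controls * sample_hz  # raw payload rate

# A 1024-step dial sampled at 120 Hz needs only 1.2 kbit/s:
dial = control_bitrate(positions=1024, sample_hz=120)

# Two dials plus three on/off buttons stays under 3 kbit/s:
panel = (control_bitrate(1024, 120, n_controls=2)
         + control_bitrate(2, 120, n_controls=3))
```

Even the two-dials-and-three-buttons case comes to about 2.8 kbit/s of payload, which is why a broadly supported low-bandwidth channel seems plausible; the hard part is the adaptable descriptor format, not the data rate.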


It's enough for knobs with 1-inch spacing or wider; there's no easy support for fancy geared knobs that coaxially combine a rough setting and a fine adjustment.

Very strongly related to what you describe are fiducial markers. They were a big thing in the small pond that was projected touch surfaces some years ago, though I haven't heard much about them in recent years as the industry has headed in a different direction. It's harder to find info on them now, but it's still out there. A quick search yielded https://www.christianholz.net/fiberio.html which contains an image of the concept at the bottom.

Also see Reactable

https://reactable.com/


I’m not sure touch screens have sufficient resolution.

Check this: http://huyle.de/2019/02/12/accessing-capacitive-images/ As you can see, the sensor elements are huge, 4×4 mm each, i.e. there are only 15×27 sensors for the complete touch screen. On top of that, there's a high amount of noise in the signal of each sensor.

The reason it works OK in practice: fingers have a very predictable shape, and there's a lot of software involved at all levels of the stack. Touch screen firmware filters out sensor noise and generates touch points. Higher-level GUI frameworks "snap" touches to virtual buttons; some platforms go as far as making virtual keyboard buttons different sizes depending on which virtual keys are expected to be clicked next, according to predictive input software, i.e. dictionaries.

What you propose can probably be done, using a finger-like object, but I don't expect the resolution to be great. At least not in comparison with hardware turning knobs; even cheap ones can be made extremely precise. See https://en.wikipedia.org/wiki/Rotary_encoder and https://en.wikipedia.org/wiki/Incremental_encoder for more info; both are used a lot in a wide variety of applications. Old mice with a ball had two of them, and the reason ball mice sucked was not sensor precision, it was dirt accumulation, a minor issue for a knob.
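To illustrate how firmware squeezes finer positions out of such a coarse grid: threshold away the noise, then take a capacitance-weighted centroid across the remaining electrodes. This is a toy sketch of the idea, not any real controller's algorithm; the 4 mm pitch and noise floor are assumptions taken from the article linked above:

```python
def touch_centroid(grid, pitch_mm=4.0, noise_floor=10):
    """Estimate one touch position (x_mm, y_mm) from raw capacitance counts.

    `grid` is a 2-D list of per-electrode readings. Cells at or below the
    noise floor are ignored; the rest contribute a weighted centroid.
    """
    total = x_sum = y_sum = 0.0
    for row, cells in enumerate(grid):
        for col, count in enumerate(cells):
            if count > noise_floor:          # crude noise filter
                total += count
                x_sum += count * col * pitch_mm
                y_sum += count * row * pitch_mm
    if total == 0:
        return None                          # no touch detected
    return (x_sum / total, y_sum / total)
```

The centroid lands between electrodes, which is how a 4 mm sensor pitch can still yield sub-millimeter touch points for a predictable blob like a fingertip; a small knob electrode would give the same firmware far less signal to average over.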


It would require multi-touch displays. I have no idea how ubiquitous those are these days, but I know most touchscreen interfaces in my life have not supported it.

In Michael Naimark's series of articles about "VR and AR Fundamentals" [1], the chapter on "Other Senses (Touch, Smell, Taste, Mind)" [2] discusses haptic feedback, and even mentions Hiroo Iwata's delicious "Food Simulator" [3].

[1] VR and AR Fundamentals: https://medium.com/@michaelnaimark/vr-ar-fundamentals-prolog...

[2] Other Senses (Touch, Smell, Taste, Mind): https://medium.com/@michaelnaimark/vr-ar-fundamentals-3-othe...

[3] Food Simulator: https://www.wired.com/2003/08/slideshow-wonders-aplenty-at-s... https://ars.electronica.art/center/en/food-simulator/ http://icat.vrsj.org/papers/2003/00876_00000.pdf

Hiroo Iwata is a brilliant mad scientist [4], and in a previous HN discussion about pie menus and haptic multitouch interfaces [5], I linked to his wonderful work on 3DOF Multitouch Haptic Interface with Movable Touchscreen. [6] [7]

[4] Professor Hiroo IWATA: http://www.frontier.kyoto-u.ac.jp/te03/member/iwata/index.ht...

[5] HN discussion of pie menus and haptic multitouch interfaces: https://news.ycombinator.com/item?id=17105984

[6] 3DOF Multitouch Haptic Interface with Movable Touchscreen: https://www.youtube.com/watch?v=YCZPmj7NtSQ

[7] 3DOF Multitouch Haptic Interface with Movable Touchscreen: https://link.springer.com/chapter/10.1007/978-981-10-4157-0_...

>Shun Takanaka, Hiroaki Yano, Hiroo Iwata, Presented at AsiaHaptics2016. This paper reports on the development of a multitouch haptic interface equipped with a movable touchscreen. When the relative position of two of a user’s fingertips is fixed on a touchscreen, the fingers can be considered a hand-shaped rigid object. In such situations, a reaction force can be exerted on each finger using a three degrees of freedom (3DOF) haptic interface. In this study, a prototype 3DOF haptic interface system comprising a touchscreen, a 6-axis force sensor, an X-Y stage, and a capstan drive system was developed. The developed system estimates the input force from fingers using sensor data and each finger’s position. Further, the system generates reaction forces from virtual objects to the user’s fingertips by controlling the static frictional force between each of the user’s fingertips and the screen. The system enables users to perceive the shape of two-dimensional virtual objects displayed on the screen and translate/rotate them with their fingers. Moreover, users can deform elastic virtual objects, and feel their rigidity.

There are some other really bizarre examples of haptic interfaces in the AsiaHaptics2016 conference videos! (Not all safe for work, depending on your chosen profession, predilection for palpation, and assessment of sphincter tone.) [8]

[8] AsiaHaptics2016: https://www.youtube.com/channel/UC8qMmIgmWhnQBeABjGlzGbg/vid...


Star Trek's bridge controls can be reconfigured, but there's a standard static configuration that doesn't change while they're in use. This is also referenced at a different point in Voyager when Tuvok is annoyed by how Tom Paris uses a nonstandard configuration. So that at least reduces the need for constant visual contact.

There's another Voyager episode in which Tuvok is temporarily blinded and he tells the computer to turn on the tactile interface so he can work on a computer console. So even 25 years ago before everyone had a touchscreen in their pockets, it was obvious that touch screens wouldn't be great for all situations.

And of course there’s the comment in ‘All good things’ by Dr. Crusher when she says she can’t believe how they got by with 2D controls in the old days, and how the current holographic controls were much easier to use.

So, Star Fleet made the same transition as the US Navy, from a touchscreen to 3D tactile controls.


Also, in theory, controls for any system can be transferred to any console on the ship; I can think of one episode where Seven transferred helm, tactical, and ops down to an engineering console.

That was part of the problem on the USS McCain: there were so many different modes of control that the crew lost track of the state of the system.

I'd be really interested to see how the UI handled signposting for those handed-off functions, because I don't think the idea is inherently unwieldy, it just needs a good clear implementation.

Is it true that the whole bridge section can be separated and replaced from the saucer, which of course can also be separated from that thing with the engines?

The bridge modules can be fairly easily swapped out in drydock, but I don't believe that can be done while underway.

The Galaxy class was the first class of ship with a saucer separation system that could be re-attached while in flight. I believe the Constitution class did have the ability to separate its saucer, but that was accomplished with explosive bolts and it couldn't re-attach without being in drydock.


That galaxy-class saucer separation thing was honestly not a great idea, and it was somewhat annoying when they used it in the episodes. The problem with it is that only the "stardrive" section has warp capability, so what good is it to be able to separate except for using the saucer as a last-resort escape pod in case of warp engine failure? The saucer is simply too slow to go anywhere in a reasonable amount of time without a warp engine: at sublight speeds, it would take years just to get to the closest star system. This is probably one of the most annoying things about Star Trek: they completely ignored speed-of-light issues like this too often. Using "warp drive" as a plot device to get the characters from system to system in a week or less at FTL speeds is fine, but if you're going to do that, don't fall back on sublight "impulse drive" as something that's actually useful for anything except getting into and out of orbit.

The saucer section can still maintain a warp field for a little while after separation at warp; it could potentially coast quite far out of harm's way.

Tactically, that would only make sense if the engineering hull could keep an enemy ship entirely occupied; if there were more than one enemy ship engaging, it's going to be difficult to keep them all from pursuing the saucer.

You are right though, if you did suffer a warp core breach you aren't going anywhere quickly. But in almost all situations help is only one subspace call away.


>But in almost all situations help is only one subspace call away.

Huh? No way; whether you're in a battle with 3 cloaked Romulan warbirds, or have an imminent warp core breach, you have minutes, at most, to get help. Other starships aren't that close by.


One of the reasons for that functionality I don't recall actually making it to an episode - tactical advantage in a fight. The rear section had increased maneuverability when separated, and it resulted in two targets for attackers, both of which could fire back.

Perhaps, but one of those targets is effectively stationary because it's so comparatively slow.

IIRC they used a similar tactic in "The Best of Both Worlds."

Right, before they mainly used it as an escape vehicle in case of warp nacelle or containment issues.

Yeah, the warp core ejection systems always seem to conveniently fail. You would want to dump the entire engineering section and haul ass if you were sitting on an antimatter bomb.

There were also physical backups in case the automated ejection system didn't work. There was always a physical backup to any touch interface that was critical to system operations.

>Well, it worked on Star Trek. Then again, that was addressed in a Voyager episode when Tom Paris designs the helm in their new runabout with old fashioned buttons and switches because he wanted to actually feel the controls much to Tuvok's dismay.

I didn't watch Voyager much, and never saw that episode, but this is extremely disappointing. The ST:TNG Technical Manual (which came out before VOY) clearly addressed this issue, way way back in the early 90s. You can actually feel touchscreen controls, because they have miniature force fields/tractor beams that provide the same tactile sensation you get with mechanical controls. Didn't the writers of VOY ever read the TNG Tech Manual?

It's of course even more disappointing that a sci-fi TV show in the 80s/90s was able to address this important HMI issue in a book meant just for geeky fans, yet 25 years later people in the industry still don't get it. Of course, we don't have tractor beams or force fields to implement what they wrote about in the tech manual, but it does show the show's technical consultants were thinking about and aware of this issue back then, 15 years before slate-style smartphones were even invented, and that maybe we should not be using touchscreens for certain controls until we do have force fields or some other workaround.

>and I mean HATED the touch screen when it came to virtual potentiometers (one operator got up and walked away saying "this screen is a fucking piece of shit")

That operator was correct. Virtual potentiometers on a touchscreen are a horrible idea and miserable to use.


> Didn't the writers of VOY ever read the TNG Tech Manual?

Nope, in fact they were expressly prohibited from writing about the technology. In the scripts, instead of doing their world-building homework and writing that coherently into the story, they just had to put "[TECH]" into the scripts. Then the technical consistency editors came along and filled that stuff in. Not even kidding. This led to some really disastrous (IMO) early scenes in Voyager, such as one where two characters (Cpt. Janeway and B'Elanna, iirc) are bonding while solving a technical crisis... and the dialogue is a total hash because it was "co-written" using a completely insane method.


Did not know about the technical manual, just ordered a copy.

It's not surprising that for brains evolved to deal with a physical world, physical controls feel more natural.

You can add all the haptic and UI flair you want in, but it's not going to make my nerves tingle in the way my brain expects.


Last year I had great fun trying out a new knob that a local industrial company, KEBA, is working on/producing. The whole knob is configurable. It can do resistance of any strength in either direction, make the output run fast if you turn it fast and then fine-tune slowly, like you did on an old radio. You can make it turn just one way or both, or limit it to part of the 360 degrees. You can give it that click feel to separate the 360 degrees into 10 positions, or even 11!

Video (in German): https://www.youtube.com/watch?v=tJjp-P9jZCk


I totally agree that a combination of physical controls and touchscreen is the way to go. However there are ways to improve the design for touchscreen. Virtual knobs in my experience are pretty difficult to use on a screen, however sliders work reasonably well in their place.

Most industrial touchscreens I've seen are resistive, which makes them much more difficult to use, especially for dragging motions, so that may be compounding the problem. Capacitive screens are generally much easier to use.


I have a Kobo Glo HD with what I think is one or more IR sensors for the touchscreen. It is very sensitive, even a fly walking on the screen will activate it.

I bought an origami cover for the reader, so it can be put at an angle on the desk.

The combination of both means that I need to turn it off, otherwise it would randomly switch pages when it is in my pocket.

I miss my first Kindle with its physical page turning buttons.


I would love to see ghost touches messing up the industrial touchscreen and wreaking havoc on the machinery.

The more I get into synthesizers, the more I want analog controls (knobs, sliders, buttons, VC cable inputs!)

Are you the reason why my office water dispenser has a touch screen?

The Federation also had to design interfaces to be universal and not assume bipedal humanoids with two hands and an opposable thumb; modern engineers have much narrower requirements.

They had to design them to fit within the production budget.

What about the shiny metal throttle?

http://propsummit.com/upload/408/tt4.jpg


On one project for the automotive industry we had a huge problem with the touchscreen interface, because mechanics often have grease and dirt on their hands, and the touch screens we used would get all confused by the residue left on the screen. We first tried to fix it in software, but quickly realized it was much easier to switch to a custom control board with specialized buttons and a keyboard, and just wrap them in plastic bags so you don't have to clean them; it all works like a charm. Unexpectedly, it turned out to be faster for users too.

> Unexpectedly, it turned out to be faster for users too.

I'm not anti-touchscreen by any measure, but did you really find that surprising? It seems common knowledge to me that keyboard shortcuts are faster than touchscreens for most tasks.


Ever since mice and GUIs were invented, it has been common knowledge that "ordinary" users don't bother to learn keyboard shortcuts.

That common knowledge is wrong, but it comes from an important truth: brand new users who are not yet committed to your product will get frustrated if the thing they need to do is hidden away as a key-binding or command-line command.

So successful products optimise for the UX of a user who doesn't yet know how to use the product well. And such users really love touchscreens.


This is something that has irritated me since Windows/Macs started hiding the UI/shortcut hinting in favor of a "cleaner" interface. UI discoverability was a huge part of human/computer interaction research in the late '70s and '80s.

If you look at the menus in Win 3.1, for example, you will see that nearly every menu item showed both the Alt-menu keystroke shortcut and the global keyboard shortcut to the right of the menu entry. E.g. Alt-F, S was save, but some applications might also have put another keyboard shortcut, say Ctrl-S, to the right of the entry.

This means that your average user who kept clicking the File->Save menu could see (via the underlines) that Alt-F, S would perform the same function without the mouse, or just that there was a simple shortcut.

Windows gradually removed this, and in the XP timeframe you only saw the shortcuts when the Alt key was pressed while the menus were active. This of course defeats the purpose of putting the shortcut in the user's face every time they click the menu, and the concept has stuck to the point where in Win10, Google is pretty much the only way to discover shortcuts, if they even exist. This is part of the reason I stick to the classic interface in Win7; with the right tweaks it still puts the keyboard shortcuts in your face.

I only wish that modern UI designers spent a day or two actually reading the human/computer interaction guidelines and research before calling themselves experts.


The Alt-F, S option still works in a lot of Windows software now. Microsoft Office will even overlay the letter of the keypress on top of the menu item.

> So successful products optimise for the UX of a user who doesn't yet know how to use the product well. And such users really love touchscreens.

I think the problem is that touchscreens get ported to applications where there should be a reasonable expectation that the end user is an expert in the system. For example, cars, and aircraft. Touchscreens are great when you have portable systems that have to condense a lot of functionality into a small device, but I don't want to be in a position where a pilot has to touch the correct button on a touchscreen in the middle of serious turbulence. Likewise, no driver should be taking their eyes off the road to navigate to the air-conditioning tab. Applying touchscreens in these situations is not only bad engineering, it's outright dangerous. You have to demonstrate competent control of a vehicle just to operate it, so we shouldn't be assuming operators are brand new users that aren't committed to the product.



But I wasn't talking about onboarding. I was talking about the speed of people using your software to do work on either a touchscreen or a keyboard.

The decision makers in UI design often design for "someone ripped from the plow by a grenade", as the saying goes in Poland: someone who has no idea about the product at all. Others tend to follow this.

All of that results in undiscoverable interfaces designed for first-time users and horrible for everything else, and thus the cult of first the mouse, now the touchscreen.


You weren’t even talking about that.

The person you responded to wrote:

> custom control board with specialized buttons and keyboard

So we weren’t even talking about keyboard shortcuts in the ctrl-c sense, but that the specific action has a specific button.

How, indeed, could that have been a surprise.

Here's my stupid story:

On the laser cutter I operate there is no way to tell the machine "the stock / remnant / offcut I want to cut the parts from is located at x,y".

The machine has a touch screen display which shows the cutting head's current location. So you put it in manual mode, drive the cutting head into the start position as per your material's location, then type the coordinates it displays as a graphic into two fields in a dialogue box.

There’s no button on the machine or touch screen to automate that.

This is why I think UX professionals will be the first against the wall when the revolution comes.

Thankfully I'm aware of AutoHotKey and Capture2Text, so I wrote a script to turn a keyboard shortcut into a series of mouse movements, clicks, and OCR, taking the graphic display of numbers and turning it into strings of numbers.

I still can't believe there hasn't been a software update to implement a feature I was able to build into a compiled .exe in literally 45 minutes, from aware-the-tools-exist to implementation.

Someone give me a billion dollars already. I’m clearly a genius.


Yeah, AutoHotKey is surprisingly helpful for things that seem otherwise unautomatable.

> You weren’t even talking about that.

> The person you responded to wrote:

> > custom control board with specialized buttons and keyboard

> So we weren’t even talking about keyboard shortcuts in the ctrl-c sense, but that the specific action has a specific button.

> How, indeed, could that have been a surprise.

I considered typing keypad or array of buttons, but I figured keyboard conveyed the intended meaning closely enough.


Well, not surprising, just unexpected, as we hadn't really been trying to optimize the process for that. This was an industrial-style interface, so there were no keyboard shortcuts in the usual sense, other than hitting Tab to move through the input fields (but that was supported with the touchscreen too). The main speedup came from the higher tolerance for imprecision that a mechanical system gives you. You can hit a button while doing something else, as you walk by, or use it wearing gloves, or even hit it with an elbow without having to lay down your tools, which turned out to be a huge time saver in their specific use case. We also added some extra-big buttons next to the keyboard specially for the common tasks, to maximize this effect.

A buddy of mine who has a shop briefly considered a Leap Motion controller to get around the 'greasy hands' problem. I'm not sure if he ever followed through with it.

Entertainment industry control consoles in audio, video, and lighting blend the two pretty well. Hardware faders and encoders for actually manipulating parameters. An LCD right next to each physical control showing its current assignment and meaning. Keyboard buttons for selecting pages and layers, playing back automation, and command-line programming. Touchscreens for situational awareness, drilldown to detailed views, configuration, routing, and navigation of complex hierarchies.

Yamaha has a nice idiom called "touch-and-turn" where there's an unassigned encoder right next to the screen which manipulates whichever parameter you tap onscreen. Navigating the touchscreen is at least as nice as visually scanning a large-format analog console to find the right handle, but a knob is the right way to actually tune it. It works well.

Your hands spend little if any time on the touchscreen during the show, but it helps you understand what's going on with the physical control surface and grab rarely accessed parameters, not worthy of dedicated console real estate, when needed.
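The touch-and-turn idiom boils down to a tiny state machine. A toy sketch, with made-up parameter names (nothing here reflects Yamaha's actual firmware):

```python
class TouchAndTurn:
    """Tap a parameter on the touchscreen to assign it to the one
    physical encoder; turning the encoder then adjusts that parameter."""

    def __init__(self, params):
        self.params = dict(params)  # parameter name -> current value
        self.assigned = None        # which parameter the encoder controls

    def tap(self, name):
        # Touchscreen tap: select the parameter, don't change its value.
        self.assigned = name

    def turn(self, detents):
        # Encoder rotation: adjust only the currently assigned parameter.
        if self.assigned is not None:
            self.params[self.assigned] += detents

desk = TouchAndTurn({"ch1_gain": 0, "ch1_aux3": 0})
desk.tap("ch1_gain")   # tap the gain readout on the screen...
desk.turn(+5)          # ...then tune it with the knob
```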


> Has nobody researched this before spending a few hundred million dollars?

Of _course_ they did.

But nobody listens to the UX/HCI researchers and academics, when the touchscreen vendor's sales team are inviting them to fully catered lunches on a tropical golf course...

Business as usual.


Do you know that this has happened specifically around touchscreens, or is this a more general observation?

Just generalised Monday morning cynicism...

It happens with _everything_ else in this stupid industry, what're the chances of it not happening just the same when the tech industry targets the military industrial complex???


Principle of charity: don't attribute mistakes to malevolence.

Unlike in business, military purchasing is, I think, generally run by people who were once users of the systems they were purchasing, and are not unlikely to rely on them again. Far more likely than, 'I don't care if the ensign standing watch on my next ship can do his job, this Mai Tai tastes great!' is 'wow, I see all these touch screens on TV shows, in movies and in the Tesla cars the print media tells me are the wave of the future; we need to get the New Hotness™ for our future combat systems too!' and 'hey everybody, look at this awesome futuristic 21st century cockpit my team designed!' They were, I think, completely well-intentioned and genuinely believed that they were doing a good job — and the few folks who opposed them probably sounded like cranky old guys ('why, in my day you had to get three stout sailors to man the rudder, and we liked it that way!').

Our entire culture is neophilic; military purchasers live in our culture; is it any surprise that they might be neophilic too?


I guess some of us are hoping for a little more competence and actually doing some research, like, for instance, trying out these products in field tests to see how soldiers like them and use them and how effective they are with them before committing to a big contract.

Maybe that's just asking too much these days.


Maybe. I have very little faith that sailors "who were once users of the systems" who've worked their way up inside the military to decision making roles have _any clue_ how evil and deceptive and persuasive the sales teams from companies like Adobe or Oracle or Palantir or $touchScreenVendorDeJour can be...

> Business as usual.

You could replace that with the more general "Humans as usual."


Touch screens have a cheaper cost per command input and are more easily changed. Further, you can reconfigure the controls on the screen based on use to achieve far more controls than would fit in a similar amount of physical space.

As pointed out in the article, this is not good human-factors practice when the key set of command options is fixed and unlikely to change. And hard controls that stay consistent across equipment give better cost savings in human training time than the "soft" controls give at acquisition time.

I do think military acquisition folks can "fall in love" with "sci fi" interfaces but there has been a ton of excellent research at NASA Ames which disputes the utility of such interfaces.


And this is why I like Boeing's new manned capsule better than the one from SpaceX:

Boeing: https://www.nasaspaceflight.com/2018/08/boeing-starliner-cre...

SpaceX: https://i.kinja-img.com/gawker-media/image/upload/atk7lokher... (I despise Gizmodo, so it's a deep link for this one)


One thing to keep in mind is that the SpaceX capsule is not really meant to be human-operated under normal circumstances, and whatever actions are meant to be taken by humans during normal operation or emergencies supposedly have physical buttons.

So if there's things you can do with the touchscreens it's likely to be a whole lot of small things you might want to do in rare non-emergency situations. That's a pretty good fit for a touch screen. You can potentially give the astronauts a huge amount of control over all aspects of the spacecraft without creating a mountain of buttons and knobs. It could even lead to better safety since you know that every single physical control you see is important, and there's less room for mistake when using only the physical interface.

I'm not an expert in this area, so I could be wrong, but my impression is that SpaceX has a pretty good design here. Not necessarily better or worse than Boeing's approach. It's not like NASA would let astronauts fly on it if they thought it was unsafe or hard to operate.


I don't actually see any touchscreen controls on the SpaceX panel. There are physical buttons in the middle and what appear to be information-only displays to either side. There might be some sliders, but it's hard to tell.

That said, it does look like they are prioritizing aesthetics over functionality of the control layout.


IMO touchscreens are awful in any kind of vehicle or activity that requires your eyes to be focused on something other than the display.

Like with smartwatches: I hate using touch controls on them, and really prefer physical buttons, since they provide multiple benefits:

- keep the display smudge-free

- works with gloves (useful during winter)

- allow actions from touch memory (i.e. skip a song without looking at the watch)


I feel like destroyers are large enough that taking your eyes off the "road" for a moment to adjust controls is perfectly fine. Ships like that have radar systems to detect other boats. These things could probably pilot themselves with enough sensors. They move slow enough in crowded waterways and have actual radar (versus lidar or cameras) so seeing the speed and direction of other things around them is already built-in. Relying on a bunch of tired people to work together to steer a ship seems like a worse idea than having just one well-rested captain that can sit back and observe an auto-pilot.

Having a thoughtful default UI along with the ability to display any control or status on the screen would seem like a better idea for a central control station. A captain could pull up their own configuration for the control panel. During an emergency, they could fall back to a default view that could be referenced by operating-procedure documents. Give the ability to delegate controls to other stations, but do not allow multiple stations to control the same input; that seems like an idiotic idea.


There’s something even worse than touchscreens IMHO - those capacitive buttons on non-screen surfaces, like the capacitive power touch points on some monitor bezels. As a visually impaired person, these are absolutely maddening. I don’t know if anyone else gets this, but it’s almost like my brain doesn’t know how to press them. Touchscreens are fine for me, but for some reason I get major tension in my hands when trying to find these surface ‘buttons’. They’re often denoted with a tiny symbol in low contrast, just to make it even worse.

As a visually unimpaired person, those touch buttons are just horrible. Especially so when it is a function that does not have an immediate effect. Just this morning I switched my monitor on, off and on again, simply because by the time it actually reacted, I had pressed the power button again already.

I loved how my XBox would turn on and do its 60 second boot sequence every time I dusted it. What a nice feature.

Or when a cat brushes past, in our case.

My dog intentionally turns on my PS4 in the morning. When she turns on the PS4 it automatically turns on the TV and will play the soothing main menu music. We aren't sure if she does it for the music, the light from the TV, or the beeping noise from when you first press the button, but she has been doing it almost every morning for the last year and a half.

Agreed, and they tend to get worse with age on cheaper models, since the already low-contrast symbol tends to fade with time.

They were useful because they could be reconfigured on demand into different interfaces - e.g. you can route the helmsman's responsibilities to a different bridge station in case of hardware damage or human casualties.

As someone who designs and builds modular synthesizer modules: so can mechanical interfaces, if designed appropriately.

Add labels on the display and use a set of generic controls and boom, you can use the same hardware for a ton of functions given the right abstractions.


Yup - just requires the hardware designers to have some knowledge of the required UIs. My main point is that the use of touchscreens was more a case of lazy/cheap design than trend-driven design.

And I agree. Touch screens can be beneficial under certain circumstances — but they are not amazing for anything that has to be reliable and must be operated under stressful conditions.

This is what the interface of the $50k Arri Alexa (the de facto standard in cinema cameras) looks like, for example: http://nofilmschool.com/sites/default/files/styles/article_w...

There were lower-priced cameras, like the Black Magic Production Camera, which used only a touchscreen.

And you can guess why that Black Magic model is discontinued...


Sure. Softkeys - a row of buttons around the edges of the screen - are how a lot of electronic test equipment works.

In fact the UI can be identical for a device with softkeys vs. one with a touchscreen. It causes a few DOH! moments when you are working with some instruments that work one way and some the other, but this isn't nearly as annoying as you might expect.

You still have to look at the screen though. Unless maybe you could 'lock' the UI into one mode (with no sub-menus), which might be a good compromise for vehicles.


Many modular synth systems also do this without screens, e.g. by indicating the state of the system with a row of colored LEDs and color-coding the labels on the controls. This certainly shifts cognitive load onto the user, but it works.

The modular scene generally frowns upon systems that need deep menu diving, which is most of the time an indicator that your interface design is lacking.


With the added benefit of having your control stations look like the bridge of Star Trek space ships.

That's also possible with a common set of hardware controller devices. Just like multiple games are played with the same controller on a console, or the way we use personal computers with a keyboard and mouse.

That was part of the problem on the USS McCain: the crew lost track of where essential control inputs were being routed from.

Just about every Human-Computer Interaction focused computer scientist I know is adamant that touch screens are inappropriate and vastly sub-optimal for many of the purposes they are currently employed for.

I don't think it's that researchers think it's good. I think that executives and programmers do.


Touchscreens, even where they don't make sense, are just one of those collective idiocies that regularly sweep the IT world. Hard to say who is at fault - there are many fashion victims in every group. Also many who have doubts or don't care either way, but will happily play along to sell something.

> Touchscreens, which are ideal for representing multiple user interfaces

I am no expert on this, but from talking to people in the aviation industry I have noticed a slow trend away from the cockpit with 2,000 controls and towards computer screens which switch between displaying multiple things. Or rather, there still are 2,000 controls, but thanks to the computer screens the number hasn't blown up to 200,000.

So my guess is the touchscreens in these ships replaced some computer screen where input (and mode switching) had been done by physical buttons. And now they are moving it back.


Configurable displays that you can quickly glance are great.

OTOH, controls that you normally, and especially in a critical situation, reach for and operate without looking are a different thing. They should be stationary and provide good tactile feedback.


One important fact here is that everything the pilots will need to reach quickly in an emergency still is a physical button. It's just the more "nice to have" things (e.g. programming the cost factor of the engines, putting the route into the autopilot) that are done through the screens. Though, even those inputs are largely done with a funky kind of mouse+keyboard. Actual touchscreens are largely relegated to tertiary functions like airport information displays.

I’m nowhere near an expert, but I gave it a bit of thought. It seems to me that there are two types of vehicles, or two types of operation: instrumental and visual. If a system is operated in instrumental mode, like instrument flight, where all the information necessary is available on the screen in front of you, it seems to me that it would be more efficient to make the screen interactive, i.e. a touch screen. You’re only looking at the screen, after all. But when you operate something in visual mode, when you have to monitor your surroundings, the controls should be knobs and buttons, ones you know from muscle memory and can operate without looking.

For planes, that means controls required during normal flight can be touch controls. After all, on Concorde and the Tu-144 pilots didn’t even have visual. However, during departure and landing pilots need physical buttons and switches, as that is visual flight.

Another conclusion is that fighter jets can’t have any touch controls; they are supposed to be flown visually all the time.


A scene from a movie that comes to mind is the one from District 9, when they are starting the spaceship at the end. He’s totally focused on the screen; everything is happening on the screen, so it totally makes sense for it to be a touch screen. Same goes for Star Trek: I suppose the operators don’t actually have visual through that big screen, it’s just for comfort and reference.

Aviation tends toward multi-function displays with somewhat common controls. It started with softkeys around the screens; on the recent A350 you have two keyboard/mouse sets, one per pilot, plus extra buttons and a fairly complex setup of which screens can display what.

No touchscreens for in-flight use, though, although iPads have become common as a replacement for paper charts.


+1

Touchscreens are vulnerable to fat-fingering.

Every time I carry my touchscreen laptop it goes crazy from accidental touches, moves, and brushes, and if I forget to lock the screen completely, it borders on disaster: deleted files, moved folders, sporadically launched apps, etc.


I've owned a Tesla Model S for a few years now and its touchscreen has never felt like an impediment.

To be fair, though, the crucial controls are still under my feet and under my thumbs on the steering wheel, and those are all physical/tactile.


They're ideal for Steve Jobs to make marketing presentations of.

On a more serious note: the touchscreen is the ultimate "generalist" UI configuration. A whole lot is possible, but no single UI is any good, in the sense of having tactile elements.


The ability to route steering, throttle, and other critical controls previously locked to a couple of stations to ANYWHERE in the ship (including, say, the CIC) is a pretty convincing argument to an Admiral, I'd say. Combined with some modular control systems (a la the modular synth comments in the thread) it would be pretty awesome. The problem here was that they kept getting swapped between stations. It wasn't JUST the touch screen, it was also WHICH touch screen, which is a problem that should be resolved in case they're still looking to allow for control switching.

Tell that to Elon Musk’s Tesla

Having an input system with haptic feedback but also context-based controls and customizable layouts is still the Holy Grail.

Apple could still make it happen with a future version of their Taptic Engine. Preferably with hover haptic feedback, where you feel a "tingle" or something before touching the screen.


So, the closest we have to the Holy grail of input systems is the Steam Controller?

Even for gaming it’s awful. Playing a video game on a touch screen has never felt right; it depends on the game, but for most games it’s an awful experience.

In aircraft, yes. Ships? Probably, but the US Navy Surface Warfare community has a special way of believing they are more special than reality.

> Why was touchscreen ever even a consideration for controls you're not looking at?

Touchscreens are the future™


Fully touch-capable keyboards would actually be awesome. Imagine remapping keys according to apps (games, editing, coding). I am not sure why you are using the MacBook Pro as a reference, as the only issues I had with it are related to the classic keyboard, not the touch.

If you mean that you have physical keys which you can remap, such as the Optimus Maximus keyboard concept [1], that would indeed be a great idea.

However, I am not certain I could type well on a full-touch-screen keyboard such as the Optimus Tactus [2]. I type blindly, and a touch screen would constantly require me to find the correct position for my fingers by looking, while on a normal keyboard that is easy to do by touch.

[1]: https://www.artlebedev.com/optimus/maximus/ [2]: https://www.artlebedev.com/optimus/tactus/


I wonder how feasible it would be to touch-type on a touchscreen. I don't look at my phone keyboard anymore and type quite fast on it.

Shorthand writing should also work very fast on modern phone screens.

>It is awful for keyboards (see macbook pro)

I've never seen a MacBook Pro with a touchscreen keyboard. Can you point me at one?



Except the sentence clearly implies that MacBook Pros use a touchscreen keyboard. That's a hilariously wrong assertion, hence my sarcasm.

You know, I think touchscreens are not preferable unless you need/want controls for many different systems in the same space (e.g. a smartphone with multiple apps). If you need a dedicated control for a high-stakes system, the fact that a touchscreen is a more modern-looking interface should not matter. Touchscreens are inherently inferior to a mechanical interface for a single-purpose system (i.e. one that doesn't need to morph into something different and back again).

This is why I detest touchscreens for most car functions. Specifically A/C: with old-fashioned knobs I have full control over the A/C without taking my eyes off the road, but nowadays everyone wants to put it on the infotainment system, hidden beneath four other buttons.

Yesterday I was driving back from Scotland to England, doing 70ish, when we ran into a really heavy thunderstorm. Went from good visibility to almost zero in seconds, and the only thing I could properly see was the crash barrier alongside the lane. I found myself desperately trying to see where the road was, simultaneously trying not to get rear-ended as I slowed up, switch on the rear fog lamps and hazards, max wipers, and select max demist because suddenly the windscreen had fogged up. Was one of those thankfully-rare maximum-mental-workload moments. Thank god for manual controls and muscle memory. Pretty sure I wouldn't have had enough spare attention and/or brainpower left over for operating a touchscreen.

Honest question: Why don't modern cars yet monitor humidity in and outside the car and make the correct decisions to keep your windows clear? Why are we focusing on self-driving cars while our cars are too dumb to do this much on their own?

It isn't triggered by any specific humidity level. While fogging deposits water droplets onto the glass surface, that same water is normally suspended harmlessly in the air (the cold windows are effectively dehumidifying the air).

It is caused by the glass's surface temperature being much lower than the air temperature inside the car. Humans are warm. The air we breathe is warm. We're heating up the interior of the vehicle. When that warm air contacts the cold surface of the glass, water condenses out of the air onto the surface.

In order to detect it you'd need to know the glass's surface temperature, ideally in the middle (away from the car's body), and also the interior temperature. The interior temp they already have, but figuring out the glass's temp is non-trivial. An infrared camera is the only thing I can imagine working (since a contact sensor wouldn't be transparent and wouldn't be replaced along with the glass), but that would likely give inaccurate readings due to the outside temperature.


Yes, but could we maybe get a decent approximation of the windshield temperature from interior temperature, exterior temperature, and speed? Combined with interior/exterior humidity (maybe those sensors are the expensive/finicky part of this project?), we could calculate a probability of window fogging.

It doesn't have to be perfect -- we can have it just turn on the defrost whenever the probability of window fogging is >10% or something.
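That heuristic can be sketched with the standard Magnus dew-point approximation. The glass-temperature estimate and the 2 °C safety margin below are assumptions for illustration, not anything a manufacturer actually ships:

```python
import math

def dew_point_c(temp_c, rel_humidity):
    """Magnus approximation of the dew point in Celsius.
    rel_humidity is a fraction in (0, 1]."""
    a, b = 17.62, 243.12
    gamma = math.log(rel_humidity) + a * temp_c / (b + temp_c)
    return b * gamma / (a - gamma)

def fog_likely(cabin_temp_c, cabin_rh, est_glass_temp_c, margin_c=2.0):
    """Fog forms when the glass sits at or below the cabin air's dew
    point; the margin turns the defroster on a little early."""
    return est_glass_temp_c <= dew_point_c(cabin_temp_c, cabin_rh) + margin_c

# A 22 degC cabin at 60% RH has a dew point near 13.9 degC, so glass
# estimated at 5 degC should trigger the defroster:
print(fog_likely(22.0, 0.60, 5.0))  # → True
```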


Could glue/epoxy a sensor right behind the centre mirror I’d think. It’s far enough from the defroster vents that it wouldn’t heat up quickly from the hot air.

You could do one of two things. First, a heated windshield, like I believe some Land Rovers have. It’s about $5,000 to replace, last I heard, but it works. Second, a double-pane windshield like house windows or some motorcycle helmet face shields. These work 100% of the time, but would definitely be more expensive as well.

Best car I had for window defrosting/demisting was a Ford Mondeo which had a fine-mesh heating element embedded. Super fast clearing. More expensive than regular glass but not $5000 by a long shot.

I had a Honda Accord with a cracked radiator.... It defrosted right quick, but idling it after it warmed up was very ungood. After the radiator was replaced defrost went back to normal.

You can get it for most Fords. It's one of my favorite features, and any car maker that caters to the more northern people should have that option.

When it's sort of cold, but not "ice on the car" cold, it works great for defogging. Way faster than waiting for the car to heat up. When there is ice on the car, you turn on the heaters and within a minute or two you can simply use your wipers to clear the window of ice.


But figuring out whether visibility through the windshield is good is still manifestly a much easier problem to solve than self-driving cars.

It's not super accurate, but most cars these days also have outside temperature. Add a humidity sensor inside and outside, and we have temperature and humidity inside and outside the car, which should give us enough information for a basic microcontroller to approximate the temperature differential across the glass and whether or not fog (or frost) is likely.

I think knowing the temperatures and such amounts to predicting the foggy state. Detecting it could look much different. Visual observation is obvious. With purpose built glass, perhaps one could look at electrical properties on the inside of the windshield. Maybe changes in the reflectivity of the glass could be used. Maybe some refractive index shenanigans?

This seems like a simpler route. Why not just a camera pointed at a test mark on the windshield that checks if it is visible through the glass?

Or simply look for reflectivity at an angle, like the recent HN story about the guy who implemented a touchscreen on his Macbook by adding a small mirror to the camera above the screen.

> In order to detect it you'd need to know the glass's surface temperature ideally in the middle (away from the car's body) and also know the interior temperature

Or, perhaps more simply, have an optical sensor that detects and reacts to the fogging itself sooner than a human would.


Wouldn't it be easier to point a camera at the windshield and detect a change in opacity? I would think that's a far easier problem to solve than, say, facial recognition.

> But figuring out the glass's temp is non-trivial.

Not really. Many windshields already have elements embedded in them, like the ultra-thin wire used for the AM radio antenna. Sensing temperature via the resistance of a wire like that is about as trivial as it gets.
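The physics is indeed simple: copper's resistance rises roughly linearly with temperature, so inverting R(T) = R0 * (1 + alpha * (T - T0)) recovers the wire temperature. The reference values below are illustrative, not from any real windshield antenna:

```python
def wire_temp_c(resistance_ohm, r0_ohm, t0_c=20.0, alpha=0.00393):
    """Invert the linear resistance model R(T) = R0 * (1 + alpha*(T - t0))
    for copper (alpha is about 0.393 %/degC) to recover temperature."""
    return t0_c + (resistance_ohm / r0_ohm - 1.0) / alpha

# A trace measuring 100 ohm at 20 degC that now reads 103.93 ohm
# is near 30 degC:
print(round(wire_temp_c(103.93, 100.0), 2))  # → 30.0
```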


What if the glass had some small heating applied to it constantly?

Complete waste of energy? Additional energy required for AC when it is hot?

Because when the sensor goes screwy I don't want my car to start flipping shit, like turning the defroster on when it's not needed. Because manual controls make such automation totally unnecessary. Because even if the automated system were designed to be "fail safe", the state that's 'safe' is actually context-dependent, so manual controls will be necessary anyway.

It ain't broke, so stop trying to 'fix' it.


The problem is that the current solution doesn't work, because trying to perfect those settings manually while driving in near-zero visibility is scary and dangerous. And nobody even knows the correct settings to use! There are YouTube videos testing whether cold or warm air is the fastest way to defrost and/or defog your windows, because nobody knows or remembers.

I honestly don't remember and my solution is "try both for a little bit while maintaining control of the car".


I just leave the defrost on when it's raining or cold regardless of whether my windshield is fogging. No need for an automated system and my windshield never fogs up.

It's really not that complicated; you just have to understand why condensation happens in the first place. It's always because the glass is colder than the moist air touching it. There are two strategies for dealing with this: either blow air with very low relative humidity over the glass, or fix the temperature differential itself. If you're using the "blow dry air" strategy, you always want to turn on the AC and at least some heat, as this creates air with low relative humidity.

Now, in the winter you can sometimes do better than this by fixing the temperature differential itself: you can either lower the windows or cool the entire interior by blowing air with no heat. In reality, most people don't actually want to drive around in the winter with the windows down or no heat, so the best tolerable option is usually as above: turn on the AC and the heat.

In the summer you can also get fogging on the outside of the windows. Just use your wipers for this.


> it's really not that complicated

After reading this I am still not sure which one to use. They seem to both work.


Both work; one is just faster than the other. The post you replied to seems to imply that cold air is better in the winter, but that is absolutely not my experience.

When I get in the car in the morning, the air is already cold. No amount of extra cold air from the vents will clear the fog. What's making the glass fog up is the moisture I'm exhaling, and more cold air is not going to fix that unless I drive with all the windows down. So the only reasonable option, since AC doesn't work at low temperatures, is waiting for the car to heat up.


tl;dr of my post: turn on the AC and blast hot air at the windshield. If that doesn't work in the winter, open your windows. If it doesn't work in the summer, it's because the fog is on the outside of the windshield; use your wipers.

In my car there is a front-window defrost button that turns on the A/C (to dehumidify) but also the heat. Dry, warm air does the best job of removing condensation.

Perfect the settings? I've never driven a modern car that can't do it perfectly well when you hit the demister button. You're well overthinking things.

Manual works just fine. I just drove clear across the Trans-Canadian Highway, east to west, in a car with manual controls and it all worked great. Your eyes stay on the road because sober people can reach out and grab objects, like knobs or buttons, without looking. If the matter of defrosting confuses you then maybe you should sit down in your car for five minutes and learn how the multi-ton machine works before attempting to operate it; you owe that much to the rest of society. Don't ruin a car just because you are too lazy to work a damn knob.

(Warm dry air evaporates water that's condensing on cold surfaces. That's very far from rocket science.)


One: I think we can discuss this without condescension for anyone who doesn't agree with you.

Two: You are presenting a false choice: automatic settings need not replace manual ones. Essentially what we are talking about is a spot on the dial that says "auto" which you are free not to use, much like the headlight controls on higher-end cars.


In practice the introduction of high-tech options reduces the availability of the manual options as manufacturers seek to reduce costs. Touch screen AC controls don't supplement manual controls, they replace them. This leaves people without irrational infatuations with tech high and dry, hence why I'm annoyed at anybody who advocates for it. If you've been in the market for a new car recently you'd see what I mean. Considering the state of the automotive industry right now, I think my comment was gentle. The suggestion that defrosting windows should be a matter delegated to a computer because it's confusing made me roll my eyes so hard, I was nearly blinded.

But cars already have this. My 2014 Ford Edge has front/rear defrost/defog buttons and they work just fine. As "liability" said, the answer is warm, dry air which happily works for both the defrost and the defog requests.

Well, they worked fine, but my A/C just went out, so getting dehumidified air is a problem these days.


>It ain't broke, so stop trying to 'fix' it.

Sure, but the marketing department had to fill a few blanks in the "bad weather options pack" (optional, but casually installed on all cars in production that will be available in the next six to eight months) for a mere US$3,000:

1) self-learning wipers (using AI to automatically set an appropriate wiping speed, including an eco mode that only wipes the right side of the windscreen when you make a free right turn at a traffic light)

2) intelligent defrost (computing the correct defrost temperature through analysis of real-time satellite heat maps)

3) heated and ventilated mats (that can dry your wet shoes and the lower half of your trousers independently of the heating/conditioning settings)


I hope you make an artistic statement and patent some of these awesome ideas. 1) for conversation starter purposes, and 2) so that we can make sure auto manufacturers never actually do this, or at least you get fabulously rich in the process...

Interestingly, this patent literally just expired today: https://patents.google.com/patent/US5412296

Expired in 2012, buddy.

Oops, read that last line incorrectly; it's more of a status than an event.

I'd rather have the AC system dehumidify outside air, then reheat it before blowing it on the inside of the windshield. Most car HVAC systems make AC and heat mutually exclusive.

Dry hot air to heat the windshield above the dew point avoids the "damned if you do, damned if you don't" choice of cold dry air to keep the inside from fogging up versus hot humid air to keep the outside from fogging up.

It's not a matter of temperature sensing on the windshield, but humidity sensing in the cabin air.


Automatics are great when they work. I do have automatic wipers, which usually work pretty well. Fortunately they didn't remove the manual controls though because I had a boat on the roof yesterday, and the rain sensor behind the rear-view mirror didn't really see as much rain as I did.

There is a max-demist button, which for me is the perfect automatic function - it says what I want, and lets the car get on with selecting max fan/heat/AC/flow to the demist vents. And I can find it with muscle memory when I need it.


This is the Airbus v Boeing problem. There are two schools of thought, one being that you the operator can control all the settings including the ones necessary to avoid the fogging. There are others that do so automatically. So many times people would be aggravated that my a/c would turn on with the heat in my Audi. What they didn’t realize was how that prevented fogging of the windows which inevitably would happen a few min later. I noticed my current car also does this but it turns itself on after the heat has kicked in.

Automatic things are annoying IF the user isn't sure what is happening or why.

Heat randomly turns on -> human is confused, maybe uncomfortable, and might think the car is broken/haunted.

Heat randomly turns on, car displays "Preventing windshield fogging" -> human is thankful car is thinking ahead.


Except that displaying unexpected content interrupts the driver’s attention and takes it way from where it needs to be.

It’s certainly not important enough to distract someone performing any delicate maneuver.


Thats why a good notification system comes in different levels. Like red and big for important messages, blue and small for information like "heating windshield".

> This is the Airbus v Boeing problem

This is a false dichotomy.


Why don't cars have 2 batteries? One for starting the car, and the other for all the crap that tends to kill the battery when you most need it?

Probably because it costs more money and nobody will pay for it. That's the best guess I have.


Some cars do have two batteries, one for start/stop and one for everything else. (For example, some Mercedes have these, Suzuki "mild hybrid" cars, etc.)

I know about the Mercedes but did not know about the Suzuki. My main point though is that this is not a feature in your bog standard Honda Civic or Toyota Corolla or Ford Fusion. I think it should be, it's a huge safety and convenience feature. That it is not in your average car implies it's probably too expensive for the utility it offers the average person.

Personally I solved the problem by buying a lithium ion car starter for both of our cars. Works great, but I really would prefer the cars just had some redundancy built in.


You just have to be picky when you choose cars; my twenty year old Land Cruiser came with twin batteries factory fitted - though rather than one for starting and one for all the thingamajigs, they went with two beefy ones in parallel - 2*105Ah makes sure it starts every time.

I have been contemplating using the second one as an auxiliary battery after installing a fridge in the boot, though. Basically a matter of fitting a voltage monitor and a hefty relay.


Two beefy ones in parallel don't really solve the problem. If you leave your 1-Amp (probably a gross underapproximation) headlights on over the course of ~200 hours, you'll deplete the battery.

It doesn't solve the problem (which is why I am considering splitting them now that I have added a ~1A consumer to the mix), but it does postpone the problem for long enough to (in most cases) ensure that you will realise your mistake before it causes you not to be able to start your car.

Leaving the headlights on (~10A) as you leave the car in the evening will still let you start it in the morning, for example.


It's an add-on you can choose for cars, if you want to. A lot of service vehicles in Australia have 2 batteries for this reason. 1 for the vehicle, the second for chargers (USB, DC, etc), routers, fridges, cameras, etc.

Not around here. I've never seen it offered for a non top of the line luxury vehicle.

on the other hand, AC is one of the only electronic features that causes a meaningful hit to fuel economy (and available power, if you have a weak engine). I very much prefer to decide for myself when it turns on.

Your car likely already turns off the AC when you accelerate hard. Most of them do that.

But not at cruise, which op was alluding to

The difference is so minute that I don't see what the point could be.

Some cars do that automatically - in which case there are tons of complaints about the car turning on the climate control automatically. This is something where you just can't win...

they do but people resort to hyperbole all the time to justify their hatred of touch screen interfaces or just interfaces they do not like.

there are bad touch screen systems and there are good ones. just as there are badly laid out physical controls and good ones.

even muscle memory works with touch screens; to say otherwise is just a lie. However, the biggest oversight is how little people actually interact with the controls of their car other than the turn signals and such. Most modern cars have full climate controls which do include humidity and such, many have automatic lights and wipers, and some go as far as doing the driving.

Nineteen to twenty-four buttons on a steering wheel is just fine and intuitive to some, just like the same number on a center console, yet these same people will complain about the simplicity of a properly designed touch UI with fewer physical buttons as being too complex.


Rear fog lights: something more cars should have in the US for this exact situation.

Or fewer. The only time I ever see them on is when it's a clear night and I'm getting blinded by the person in front of me. That said, they are pretty useful for more than fog... They're good for flashing at people behind you who don't have their headlights on.

...a rear fog light is an additional parking light on the rear driver side of your car to help other cars identify the width of your car in foggy or other inclement weather situations. They are not brighter than any other light that would regularly be present on that car. If it is brighter and blinding as you say, rest assured the car has been illegally modified and is subject to constant harassment by police in all other jurisdictions!

In all the cars I've owned, the rear fog light is brighter than the regular rear lights. Not sure if they are brighter than the braking lights, but brighter than the "position" lights for sure.

https://content.icarcdn.com/styles/article_cover/s3/field/ar... http://www.waycoolinc.com/graphics/z3/03/052503/jt/P5250326....


I've owned three cars with rear fog lights and all of them came from the factory with the rear fog light(s) being brighter than the rear parking lights. That makes sense too -- because if rear parking lights were bright enough for fog then you wouldn't need rear fog lights.

In cases like this I love the radar cruise control in modern cars; it very accurately shows what's ahead even when you can't see anything. I agree with your conclusion: some of the latest luxury cars have touch screens and the UX is awful.

My radar cruise control, on a Mercedes, fails with heavy rainfall. Warns you to switch to visual control.

Eyesight on my Subaru would fail as well. Manual basically says “if you can’t see, that thing can’t see as well”. It uses two cameras after all as well.

My Volvo has touch controls for AC but physical buttons for quickly toggling windshield fans/demist. Feels like a decent compromise.

Honest question: why didn't you take the train? This would have been a good time to take another sip of your whisky and soda.

In the UK taking a train is often (but not always) more expensive than driving and can take longer too. Depending on the route there might not be a viable service. The service in bad weather isn’t very reliable either.

But for some routes it’s really easy and convenient. It just depends on where you are and where you’re going.


I didn't take the train because I was transporting my son's sailing boat back from a competition. Also the train line was flooded yesterday, so that wouldn't have been an option. But normally that's not a problem, and I agree, if I'm just transporting myself, I do like the train.

On Friday this happened https://www.bbc.co.uk/news/uk-49300025

British trains are very expensive and unreliable at the best of times. Anyone who can drive, does.


Because he didn't know the weather would be like that?

And because he already owns a car?


You know, when you take your driver's test in the US, you have to demonstrate your ability to know where all those functions are before you even pull away to start your test.

The real question, as I found yesterday, is can you switch on high-speed wipers, rear fog lights, hazard lights, and max-demist, in a few seconds, while you're devoting 99% of your brainpower to not crashing into an unseen vehicle slowing in front, not veering out of the lane that you can't really see, and glancing repeatedly in the mirror in case you're about to be rear-ended.

The question is not whether you can operate the controls under normal circumstances. It's whether you can do it when task-saturated with more important tasks. Under such circumstances, I simply wouldn't have had any attention left for glancing at a touchscreen, but I did know and could hit all the manual controls rapidly without looking.


The answer is you focus on dealing with what's in front of you, so turn on wipers and demist. If all the drivers do that, then no one needs to worry about what's happening behind them. After you take care of being able to see and control the car, then you worry about rear fogs and hazards.

And if it's really, really bad, pull over. Don't be like those idiots who decide to stop in the middle of a highway with 75mph traffic around them, under an overpass to protect themselves from a hailstorm.


This is like the arguments that C programmers just need to be more careful about errors. We know that people make mistakes, get distracted, and get older. Real engineering is about designing systems which work well in actual conditions, not some unrealistic ideal case assuming perfect conditions.

It’s also simply wrong: in California, I had to show use of turn signals – not anything else – and no other place I’ve driven since has ever made me pass a test again. Assuming that everyone reads the manual and practices with every new vehicle is unrealistic, so you’re looking at potentially half-century lags between what’s tested and what people are driving.


I always wondered if Californians know what turn signals are.

About as well as the rest of the country, with the possible exception of Maryland, where they’re used to decoy people by indicating the turn required by the lane you’re in as opposed to the turn three lanes over you’re actually going to make.

I actually wonder whether google maps / Waze integration would be a worthwhile improvement: self-driving cars are a good ways off but simply signaling the direction which the driver was just told to turn would be nice, and a majority of drivers seem to be using mapping apps these days.


It is by far the worst state (or province) I've lived in for lack of turn signal usage. I want to get a bumper sticker that says, "Your Turn Signal is Broken".

How about a bumper sticker that says, "I slow down for stop signs".

This depends on the state/region. There's no universal standard in the US, and some states definitely don't require the operation of all of these controls before the test.

[flagged]


Hmm, this doesn't seem to be a response to what I said. Maybe you accidentally replied to the wrong comment?

> you have to demonstrate your ability to know where all those functions are before you even pull away to start your test

if only s/he had to pass the same threshold to post on HN!


Even if that were true everywhere, people drive vehicles other than the one they used for their drivers test in their lifetime.

And just like when they learn to use the shifter, they need to learn what that thing on the other side of the steering wheel does too before driving off whether it's tested or not.

I commend you on your attention to detail under every circumstance, but not everyone masters the windshield defogger (to the point where they don’t have to look at it) the moment they sit down in a rental car.

This was not my experience in New York. You don’t have to demonstrate anything other than knowledge of where the turn signal, ignition, brake and gas pedal are.

I took a driver's test 3 years ago and this was not included.

For what it's worth, Mazda recently decided that it doesn't want touchscreens in its cars anymore because they're a safety hazard.

"Mazda is purging touchscreens from its vehicles"

https://news.ycombinator.com/item?id=20200335


I had a top model Mazda 3 2014 before (manual Astina) as the #2 and everything about it was just first class usability wise.

The rotary navigation knob let me do everything on the screen, and all the buttons for everything else were in intuitive places. Also a great radar cruise control, simple yet highly functional HUD, and things like rear cross alert which is amazing when backing out of 45 degree parking spots next to some big trucks.

As much as I'd love a Tesla as my next car I'm not sure if their UI is where I want to go.

Presently riding to save my health, and money.


Old Panamera and new Panamera. Disgusting and disorienting touch controls. I honestly can't tell why anyone would prefer touch controls. I also can't honestly tell how Tesla's screen is street legal.

I have knobs on the climate control on my car, but I still need to look at the screen. The problem is the knobs have no stops so I still need to look down to figure out what they're set to after they update the electronic display. On my old vehicle I knew the whole way to the left was fan blowing on feet, whole way to the right was defroster, and the 2 or 3 clicks in between were different settings of those extremes. Now I have no idea where it ends up when I spin the dial.

I'm not sure why dials have stopped clicking. Maybe it had some direct mechanical function at one point, but even if that's no longer necessary, the haptic feedback of the click seems just as important.

In fact, even on touchscreens there will often be some sort of audiovisual "click" effect, whether that's an icon lighting up, expanding, or making an audible click noise.


My method is to turn the knob back until it stops moving, so it is now in a known state and I know how many clicks away each setting is.

That's the thing: it doesn't stop. It's just a rotary dial with the lightest detents I've ever felt. You just spin it and it's a circular buffer through every setting.

> I detest touchscreens for most car functions

Agreed. I consider them dangerous, a threat to public safety, they should be banned by the NTSB, and some cars that use them should be under recall to retrofit them with mechanical controls.

Many states are now passing hands free laws that criminalize the use of phones or any other devices that use your hands while driving. The states provide exceptions for car touch screen use, but they shouldn't. They should criminalize the use of all touch screens when driving, not just some.


For the life of me I can't understand why touch screens built in to a car are considered "different" than phones. Cities and states are finally waking up and making it a crime to use your phone while driving, how in the world are car touch screens any different?

I suspect it's more because using your phone is almost definitely not controlling the car, while using a touchscreen that is part of the car's controls is still technically considered "operating the vehicle" and thus not a distraction, regardless of how difficult it is to actually operate.

This is a good idea but a recall is too much for this to be practical. At least the NTSB should be doing this for all new cars.

I had this discussion with my teenage child this week and neither of us are sure. Our state requires handsfree phone use but has the car touchscreen exception. Would operating Android Auto or CarPlay be illegal or does it fall under the allowed exceptions?

Absolutely agree. Controls with tactile feedback are literally lifesavers when you're trying to pay attention to the road. Touchscreens are cool, but all the basic controls should be presented as neat panels of knobs and buttons that don't require so much as a glance to operate.

You're not the only one who detests touchscreens in cars. Mazda agrees with you[0]. I think touch screens are primarily useful for complex interfaces and interfaces that change every time. But driving a car needs to be as easy and reliable as possible, because you need those eyes and brainpower on the road, not trying to figure out an interface.

[0] https://www.motorauthority.com/news/1121372_why-mazda-is-pur...


There’s a part of me that misses T9 texting on my cell phones. There was a time that I was very good at T9, and because it was deterministic (with auto complete off) you could text without looking and it’d text exactly what you meant. Definitely can’t do that now, and voice controls aren’t as good.

I can text without looking on iPhone 5/SE, but that went away with the 8

I'm going to repeat a comment I made a couple times before as I think it's relevant:

In one of the talks at Google I/O a few years ago a VP from Audi (or Volvo?) spoke in a thick and lofty German accent about how "ve haf completely oferhauled ze driver exzperienze". He played a sexy video clip showcasing their new Android infotainment system (something like https://youtu.be/h_7_fKJ0PNs), and the first thing I noticed is how they'd taken away my traditional temperature knobs and replaced them with digital touchscreen ones. They looked just like physical ones, and were in the exact same place you would expect (https://9to5google.com/2017/05/15/android-cars-audi/). So, I've gained absolutely nothing, and now I have to take my eyes off the road and look down at the stupid console just to change the temperature.

TLDR: Tactile feedback is a Good Thing(tm) and designers should cultivate - not fight - muscle memory.


Plus GUIs are random in terms of response time...

The emergency brake. I really need that to be a mechanical lever and not an electrical one. If something is going wrong on the road I need a backup brake that takes me under a second to activate while in a state of panic with my eyes glued to the road and it had better be functional.

That's the worst thing you could do. You lock your wheels and totally lose control over the vehicle. ABS-enabled brakes stop you quicker while maintaining control over the vehicle.

ABS is not much help when your brakes fail. I think that's what GP was referring to.

This is often repeated but not the complete picture.

Modern, knob-based vehicle climate control systems actually bury a lot of functionality. For example, with Toyota's "automatic" AC you can pick between fully automatic or a subset of manual controls. That's because they ran out of space/complexity headroom for more controls.

Touch screen climate control integrates better with voice (which you should be using while driving), offers better fine-grained control, more information (like current interior temp rather than just outside/target temp), and can offer new functionality (like profiles/pre-sets, additional automatic modes, memory climate, linked seat heat & cool/positional vents, etc).

I dislike several touch screen based climate control systems I've used. But that's because they're BAD. Car manufacturers are bad at making touch screen systems. But let's not throw the baby out with the bathwater. There's a lot of good reasons to go that direction, car manufacturers just need to work on UX a lot.


I feel like you're making some very false choices between smarts/voice compatibility and physical controls. Physical buttons and dials for users to interact with that still can be changed by the system are wholly possible. For instance, a rocker switch that sits in the middle can be used to issue on and off commands, while being separate from say, a status light that lets you know whether the device is on or off.

Which is to say, I think you're significantly limiting what you believe is possible with physical UX.


Voice controls suck and are exclusionary as fuck. Maybe they work for the subset of the population who natively speak a very standard dialect of American or British and have no speech impediments; for those people it's just an annoyance to have to pause the podcast or music.

For the rest of the population of this planet, voice controls are horrible; either the user doesn't speak the supported languages at all, or speaking the supported language doesn't come naturally and switching to it requires mental effort, or it's an infuriating experience to try to make the bloody machine correctly interpret what you're saying.

That's even ignoring the complete fucking shit show that is software trying to understand natural language and infer meaning, without just turning it into a tedious form of a command line interface.


> For the rest of the population of this planet, voice controls are horrible

Please count me out. I'm a non-native English speaker, with a strong accent, and I'd absolutely prefer voice control over having to:

1) Remove my right hand from the steering wheel, reaching forward to where the controls are. There is a good reason all important controls (besides pedals) are on the steering column.

2) Either a) moving my eyes away from the road for a moment, looking for the knob; or b) trying to find the exact controls by touch and memory.

> an infuriating experience to try to make the bloody machine correctly interpret what you're saying

Machines are limited - and despite marketers wanting us to believe otherwise, I don't expect them to display comprehension of natural human languages any better than my cat does. Just as I query search engines with keywords and special syntax rather than proper sentences, I expect to communicate with the machine in a special, non-natural language.

Restricted speech recognition with strict grammar and limited vocabulary works quite OK these days. It's the general-purpose voice recognition and "smart" assistants that suck hard.
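To make the restricted-grammar idea concrete, here is a minimal sketch (entirely hypothetical, not any real car's voice API): a fixed vocabulary and a rigid COMMAND := ACTION TARGET [VALUE] grammar, where anything outside the grammar is rejected rather than guessed at.

```python
# Hypothetical strict-grammar command parser: small fixed vocabulary,
# one rigid sentence shape, no attempt at natural-language inference.
ACTIONS = {"set", "increase", "decrease"}
TARGETS = {"temperature", "fan"}

def parse_command(words):
    """Return an (action, target, value) tuple, or None if the
    utterance falls outside the grammar."""
    if len(words) < 2 or words[0] not in ACTIONS or words[1] not in TARGETS:
        return None
    if words[0] == "set":
        # "set" requires exactly one numeric value
        if len(words) != 3 or not words[2].isdigit():
            return None
        return ("set", words[1], int(words[2]))
    # "increase"/"decrease" take no value
    if len(words) != 2:
        return None
    return (words[0], words[1], None)

print(parse_command("set temperature 21".split()))  # ('set', 'temperature', 21)
print(parse_command("increase fan".split()))        # ('increase', 'fan', None)
print(parse_command("play some jazz".split()))      # None: outside the grammar
```

Because the recognizer only has to distinguish a handful of utterance shapes, it can be accurate even with a strong accent - at the cost of the user learning the grammar, which is the trade-off described above.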

Heck, maybe I'm a total weirdo, but I'd rather learn a special conlang than reach for AC controls by touch.


> Physical buttons and dials for users to interact with that still can be changed by the system are wholly possible.

Sure, but irrational. One of the largest benefits of physical controls is physical feedback, like being able to feel when you've reached the max cold/hot ceilings or the state of the switches.

To make them voice compatible you have to remove that physical association (e.g. infinite spinners, or blind switches), which means you now need to take your eyes off the road to use them - which was a major perk of physical controls.

Physical controls that have no physical state are the worst of both worlds. You've lost the physical and electronic control's advantages while adding the disadvantages to both.


> Physical controls that have no physical state are the worst of both worlds.

This is so true. I absolutely hate the AC knob in my car. Apparently, they thought it was good engineering to make it infinite.


At least you can find it without taking your eyes off the road. This works fine for fan power, where you know when to stop based on what you feel, but not so much for temperature, where you can’t immediately tell what’s right.

One of my friends has a motorized knob he uses for volume control; he wrote a script that relays the current volume to the Arduino that controls the whole thing.

It's awesome- I love that thing so much and have parts on the way to build my own.

I think the issue is more bad integration than it is a bad idea.

A VFD or 7-segment display next to a pair of buttons for up and down works quite well for temperature control, and lets you integrate it nicely with digital controls.

The issue here is not that it's impossible to integrate the analog physical and the digital aether- it's the implementation that's oft lacking.
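The button-plus-display pattern can be sketched in a few lines (names and ranges here are made up for illustration, not any vehicle's actual API): a single clamped setpoint that physical up/down buttons and a digital path (voice, app) both drive, with the display reading off the shared state.

```python
# Hypothetical temperature controller: physical buttons and a digital
# front end mutate the same bounded setpoint, so the two stay in sync.
class TempControl:
    def __init__(self, lo=16, hi=30, setpoint=21):
        self.lo, self.hi = lo, hi
        self.setpoint = setpoint

    def up(self):
        # physical button press: clamp at the ceiling instead of wrapping
        self.setpoint = min(self.setpoint + 1, self.hi)
        return self.setpoint

    def down(self):
        self.setpoint = max(self.setpoint - 1, self.lo)
        return self.setpoint

    def set(self, value):
        # digital path (voice/app) shares the same clamped state
        self.setpoint = max(self.lo, min(value, self.hi))
        return self.setpoint

    def display(self):
        # what the 7-segment display would show
        return f"{self.setpoint:02d}"
```

Because both paths mutate one bounded state, the buttons keep their "can't go past the end" behavior while remaining voice-compatible - which is the integration point the comment above argues is usually fumbled.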


I think all of these things are solvable by haptics and good design.

Of course - the latter is the factor least likely to be present.


>Touch screen climate control integrates better with voice

That's a false dichotomy. Tactile controls don't mean you can't have electronic logic behind them that you can talk to in alternative ways (e.g. through voice).

Also, I wouldn't say the whole BS added to modern "climate control" (i.e. bloated AC) systems is much of an improvement...


I really think that nowadays touchscreens are becoming the cheap option. To me, when I see a machine with custom designed buttons I know they had to do some mechanical design, whereas a flat touch screen just says "we slapped a tablet there and hacked an interface on Android".

Which is ironic because I'd happily pay more not to have one. I bought a replacement for my 10+ year old MP3 player. The new thing is so cool and slick and flat and sophisticated and plain complex, and it's horrible to use. It demands your attention for everything.

My old MP3 player with its physical buttons and weird control lever, well, I could work it by touch and muscle memory.


It's why I like my Up!: they decided everyone already has their own touchscreen with infotainment. So you have a dock for your phone, a USB charger and a bluetooth enabled sound system. The rest uses mechanical controls.

It's interesting that despite Apple's no-buttons obsession, even the latest iPhone still has a physical silent-mode switch and volume buttons.

I don’t think Apple is against buttons, just not on the screen. Gotta say I think it’s pretty weird they have a silent mode button, and a Do Not Disturb mode hidden in the controls? What is that duplication about?

Do not disturb is not the same as silent. The former mutes all notifications so they don't wake up the phone nor produce sound or vibration, the latter disables UI sounds and uses vibrations instead of sounds for notifications. They are completely independent.

Sony had one of the best mp3 players imo, nw series, the ones shaped like a bic lighter. There was a rotary knob to switch between tracks, very intuitive to use without looking at it. But the main problem was it required their custom software to transfer files. The later models got turned into ipod imitations, with an lcd screen and buttons, such a shame.

That reminds me of how great a device my iPod shuffle was. Too bad the battery died...

As with everything, it depends. Throwing a whole category of input under the bus because it failed in some of the uses we've tried would be short-sighted. Especially as we've had touchscreens for less than a few decades.

> inherently inferior / single-system

Off the top of my head, maps work better on touchscreens. Navigating document trees. Making elements link together, grouping them. And there must be hundreds of uses not obvious now because we haven't tried enough.

Actually I don’t see how any input system would not have some high-stake single system where it’s a perfect fit. That would just be failure of imagination in my opinion.


Right. For anything dynamic, an interface that can redraw itself in milliseconds has enormous advantages over mechanical knobs and levers.

For managing permanent properties, I think mechanical interfaces are the best option, though of course vastly more costly.


You are right, but I think it goes beyond that.

Any control interface is by definition acting on something dynamic, so even when pushing a physical button, we expect some status to reflect the result instantly.

I think touchscreen could be best used as a step above joysticks, when an action cannot just be summed by a on/off (or +/-) state, or a vector in one direction.

Imagine in a military setting if you need to open a communication channel with 4 teams moving close to each other to warn them of a nearby enemy. Would you prefer to switch the “open communication” lever, check the team numbers, push the buttons 1, 5, 6, 12 and the “open channel” button. Or circle them directly on the map and click “open communication” on the right side panel?

I also think that a lot of touchscreen interfaces are immature and ill thought out, but it would be the same issue with stupid buttons like we can find in so many cars, for instance (why would the parking brake switch be next to the A/C on/off?).


I think you just talked me into making some sort of huge steampunk button controller for slack... Have a label with the contact name, an incandescent lamp for new message status then a button to chose that to go to the conversation... Hmmm now all I need to find is some time

My Prius has manual controls for the A/C, but I get confused anyway...and, I think, for good reason.

What I have: There's one button that toggles A/C on/off; when it's on, a green LED glows in the button (the button is flush at all times). There's an up/down lever that increases/ decreases the fan speed: hold it up, and the fan speed increases by a notch every second or two, and similarly for down. When you let go, the lever reverts to the middle position. The fan speed shows in a series of bars in a tiny display above that switch. There's another up/down lever that controls the temp setting, although exactly what that does depends on whether the A/C is on (I think). Its current setting shows as numbers above that switch. There's another controller that determines whether the air goes to the front seat only, or the front + back; the setting for that appears on the main console screen, but only for a few seconds after you change it--after that, the main console reverts to showing engine status or something, depending on how that's set. Whether it goes to the driver only, or to the driver + front seat passenger, is determined by whether the car thinks there's a passenger (which it determines by weight on the seat). There are two other controls that determine whether the defrosters are on; I can't remember how you tell what they're set for.

What I wish: two knobs to control temp and fan settings; clockwise would be higher/faster. A lever to determine air to the front/back seats. A toggle switch to turn the A/C on/off; up is on. Toggle switches for the front/rear defrosters.

What I'm glad I don't have: touch screen controls for all this.


This is absolutely correct. Humans are physical creatures living in a physical world and our brains have evolved to interact with that world in a physical way. Touch-screen interfaces, even good ones, necessitate a level of mental abstraction, which is going to make operating in a fatigued state even more difficult.

To add, we're analog creatures.

For a human to interface with anything digital, we need to drop what we’re doing and fully engage the touch interface.

Dangerous when driving, or similar.

With analog interfaces, our hands know exactly where to go and what to do just on feel, keeping attention elsewhere.

I’m glad the DOD made this decision and I hope others will follow.

Just because you can chip it doesn’t mean you should.


Heh, I just had a vision of some car manufacturer making the steering controlled by a touch screen. (Is there a reason that that would already be illegal?)

Is there a reason that that would already be illegal?

The FMVSS (https://en.wikipedia.org/wiki/Federal_Motor_Vehicle_Safety_S... ) probably has some rule stating that a car has to have a physical steering wheel that operates in a particular way, as well as a brake pedal to the left of the accelerator, etc., although I haven't actually looked. I do know, however, that it requires automatic transmissions to have the PRNDL shift sequence.


This is a pretty old complaint, but my initial issue with the iPod, especially the 2nd and 3rd gen, was the lack of manual controls. At a younger age, a Walkman or Discman in my pocket with headphones in my ears was a constant, and I was able to find my way through whatever media was in there quickly and easily by thumb.

With the iPod, I could manage sometimes, but about half the time I ended up having to take it out and look at what I pressed because it just wasn't working by feel. I remember a couple of mp3 players from the time that had manual buttons with different sizes, placement, and better feedback, but they weren't around for long.


This seems unfortunately common across all industries. People focus on visual appeal only to the detriment of all other senses.

Fruit gets larger and brighter, but the flavor and texture go down.

Acoustic paneled ceilings are removed in favor of exposed ductwork and beams, but noise transmission between floors and echoes on the same floor get much worse. (I am torn about this one though)

Buttons with tactile guidance feel great but look too 1990s so everything becomes a glass rectangle.

I would really really like to see and participate in a renaissance of design that accounts for all of the human senses in all their forms.


Then we have the “touch buttons”, which don’t have any of the benefits of touchscreens (mutability) nor the benefits of regular buttons (tactile feedback).

I agree, but isn't one (tiny, almost never worth it) benefit that it is not a mechanically moving part? Shouldn't it last longer?

Well, user familiarity is important but not paramount. The Navy uses Xbox controllers in submarines and found that the learning curve for 19-year-old men is much better that way.

This is different. The issue here wasn’t the use of a general-purpose device itself, but that the device they used (a touchscreen) wasn’t the best fit for the task at hand. Even worse, multiple screens were able to control the ship at the same time, which to me seems to be the root cause of the accident. A single touchscreen still wouldn’t have been ideal, but it would’ve still prevented the accident, IMO.

In the case of using Xbox controllers it’s actually a great idea - these controllers were designed & refined over decades of intensive use by all kinds of people around the world - you wouldn’t be able to do better even if you tried with a custom design. There are obvious concerns regarding reliability of the hardware, but the design itself is in my opinion flawless.
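The root-cause observation a couple of paragraphs up (multiple stations able to command the ship at once) maps to a well-known pattern: a single-writer arbiter that grants "the conn" to exactly one station, with explicit, logged transfer. A hypothetical Python sketch, not based on the actual shipboard system:

```python
# Hypothetical sketch (not the real IBNS design): exactly one station holds
# control at a time; inputs from any other station are rejected outright
# rather than silently merged with the active station's commands.

class HelmArbiter:
    def __init__(self, stations):
        self.stations = set(stations)
        self.active = None  # no station has the conn until one takes it

    def take_control(self, station):
        """Explicit transfer -- control never moves implicitly."""
        if station not in self.stations:
            raise ValueError(f"unknown station: {station}")
        previous, self.active = self.active, station
        return previous  # caller can log who gave up control

    def apply_input(self, station, command):
        """Only the active station's commands take effect."""
        if station != self.active:
            return False  # ignored: this station does not have the conn
        # ... forward `command` to steering/throttle here ...
        return True

arbiter = HelmArbiter({"bridge", "aft_station"})
arbiter.take_control("bridge")
assert arbiter.apply_input("bridge", "rudder left 10")
assert not arbiter.apply_input("aft_station", "rudder right 10")
```

The point isn't the ten lines of code; it's that "who is steering right now" becomes a single, inspectable piece of state instead of an emergent property of several touchscreens.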


For another example in reverse: the Thrustmaster Warthog [0] HOTAS (hands-on throttle & stick, a controller for flight sims) is based on the A-10C controls. Part of this is for an immersive experience if one is flying a Warthog, but a large reason many people favor it is the excellent ergonomics. Fairchild & the Air Force spent a lot of design effort and thought on control placement and developed a very pleasant set of controls to use. Just as there are control schemes where a game controller makes perfect sense, aping actual aircraft controls is a great way to fly a pretend plane. Or, in my case, a pretend spaceship. I've used an Xbox controller to control various robotics projects because there was no need to reinvent the wheel, and I can certainly understand why there is a niche market for controls that mimic cockpit controls.

[0] http://mobile.thrustmaster.com/node/812


I would add a major caveat: Xbox controllers don’t have physical feedback the way an actual throttle lever or steering wheel does. See, for example, the endless arguments over whether the Boeing control style (where the position of a control means something) is better or worse than the Airbus style (where the position of a control is just an input).

If I were designing a destroyer control, I would probably make the throttle be some sort of device where the position of the device indicates the throttle state. But I would certainly make the throttle be a single-purpose device that controls nothing else.
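The Boeing-vs-Airbus distinction above boils down to whether a control's position *is* the state or merely an input. A hypothetical sketch of the two mappings (names and scaling are illustrative, not taken from any real aircraft or ship system):

```python
# Illustrative contrast of the two control philosophies mentioned above.

def position_is_state(lever_position):
    """Boeing-style: the lever angle maps directly to thrust, so glancing
    at (or feeling) the lever tells you the current setting."""
    return max(0.0, min(1.0, lever_position))  # thrust fraction, 0.0..1.0

def position_is_input(current_thrust, stick_deflection, dt):
    """Airbus-style: deflection commands a *rate* of change, so the stick
    itself carries no information about the current thrust."""
    rate = 0.2 * stick_deflection  # thrust change per second, illustrative
    return max(0.0, min(1.0, current_thrust + rate * dt))

# Reading the first control gives you the state directly:
assert position_is_state(0.75) == 0.75

# With the second, the same stick position means different things over time:
t = position_is_input(0.5, 1.0, 1.0)  # thrust rises to ~0.7
t = position_is_input(t, 1.0, 1.0)    # same deflection, thrust now ~0.9
```

With the second style the operator must consult a display (or mental model) to know the throttle state, which is exactly the extra abstraction a single-purpose, position-indicating throttle avoids.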


Exactly: Xbox controllers are highly tactile and ergonomically designed so that they can be used effectively without ever having to look at them. In-car touchscreens are a menace, and a design fad that I cannot wait to see pass.

Additionally, they are proven to be highly ergonomic for a very wide swath of people. They aren't ideal for everyone (no input device can be ergonomic for literally everyone), but as shown by the wide array of people who use them effectively, they are tolerable for long periods of use by a (likely) majority of the population, and already benefit from existing muscle memory for a lot of people. There's a good reason the Xbox and PlayStation controllers share a very similar form factor, and have done so for several hardware generations now.

> i.e. doesn't need to morph into a different and back again

The UX terminology for what you're describing is a modal interface: https://en.wikipedia.org/wiki/Mode_(user_interface)

... which is to say, an interface that works differently depending on what "mode" the interface is in.

A classic example of a modal interface is vi/vim, where you have to explicitly switch into "editing" mode in order to actually insert text into the document. And a classic demonstration of why modal interfaces are undesirable can be produced simply by taking someone who's never used vi/vim before, sitting them down in front of it, and telling them to enter a line of text. They will start typing, and immediately become confused when the words they typed don't actually show up anywhere. (Or worse, when they happen to have typed a keystroke that corresponds to a vi command and weird stuff they didn't expect starts happening.)

As you note, modal interfaces are particularly bad in situations where the user needs to operate the system under pressure -- such as (say) on a warship, where the life of the operator may literally depend on being able to accomplish tasks quickly and accurately. Modal interfaces force the operator to first orient themselves as to what mode the system is currently in before they can do anything, which slows down even experienced operators and pulls attention away from where it could otherwise be applied.

What does all that have to do with touchscreens? Because how a touchscreen operates can be modified with only some programming, they tend to lure developers into building modal interfaces in an effort to cram as many features into them as possible. From a feature-checklist perspective that's great, but from a usability perspective it's a disaster. Anyone who's experienced a modern car infotainment system will understand why -- paging through menu screens to find the one mode with the feature you want while piloting a two-ton hunk of metal at 75mph is a bad combination.
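The vi demonstration above fits in a few lines of code: the same keystroke does entirely different things depending on hidden state. An illustrative toy sketch, not vi's real behavior (keys reduced to 'i', 'x', and ESC):

```python
# Toy modal editor, loosely modeled on vi's normal/insert modes.

class ModalEditor:
    def __init__(self):
        self.mode = "normal"   # the hidden state the user must track
        self.text = []

    def press(self, key):
        if self.mode == "normal":
            if key == "i":
                self.mode = "insert"  # 'i' enters insert mode, types nothing
            elif key == "x" and self.text:
                self.text.pop()       # in normal mode, 'x' deletes
            # every other key is silently ignored here
        else:  # insert mode: the same keys now insert text
            if key == "ESC":
                self.mode = "normal"
            else:
                self.text.append(key)

ed = ModalEditor()
for key in "hello":       # a first-time user just starts typing...
    ed.press(key)
assert ed.text == []      # ...and nothing appears: they were in normal mode

ed.press("i")             # only after explicitly switching modes...
for key in "hello":
    ed.press(key)
assert "".join(ed.text) == "hello"   # ...does typing insert text
```

The failure mode in the sketch is exactly the one described: the user's input is valid, the system is working as designed, and yet nothing visible happens until they discover the mode switch.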


Another classic example of a modal interface is the Photoshop toolbox, dating back to MacPaint.

http://www.psd-dude.com/tutorials/resources-images/adobe-pho...

https://en.wikipedia.org/wiki/MacPaint

It does not take much work for a Photoshop operator to orient themselves to the current mode, because it is communicated by a number of cues, including the shape of the mouse pointer.

One problem with vi’s interface is that, in contrast to Photoshop, it’s not easy for a vi operator to discover the available modes.

Are there any paint programs without a modal interface?


Rehosted Photoshop toolbox image:

https://i.imgur.com/oIKOOCC.jpg


I don't think you can call modal interfaces inherently bad. I suppose Gmail is a modal interface because if you pull it up and start typing you'll be issuing commands instead of composing an email, until you enter "composition" mode. But what is the alternative? Have every button on the keyboard do the exact same thing regardless of what application or part of an application is currently in use? It's fine if you don't like vim but you don't have to pitch it as an objective universal truth that emacs > vim.

I didn't say anything about emacs. emacs has plenty of problems of its own.

> But what is the alternative?

It takes some outside-the-box thinking. Here's an article by Larry Tesler, of Xerox PARC and original-Macintosh-team fame, on how his desire to build a modeless text editor led to the invention of copy-and-paste: http://worrydream.com/refs/Tesler%20-%20A%20Personal%20Histo...


Modal interfaces can work very well, if used wisely:

- with experienced users, e.g. people who are used to vim

- with a sharply limited number of modes, e.g. the three modes that vim has

It's just really easy to get carried away when you have the enormous number of combinations that touchscreens allow.


The main problem with

with experienced users, e.g. people who are used to vim

... though, is that it basically represents a surrender on the idea that UX is important at all. Given enough time and commitment on the part of the user, any interface can be learned well enough to be adequately useful. You could build an interface out of loaded guns and rotating knives, and a sufficiently committed user could eventually learn how to operate it without killing themselves. The challenge is that most users' time and commitment are not unlimited.

A stronger argument in favor of modal interfaces is that they enable the creation of complex interfaces that wouldn't be possible without modes. This is part of what appeals to so many vim users about vim -- once you've climbed the learning cliff it presents, you can do some things much more efficiently than you could in, say, a WYSIWYG editor. I personally don't agree with that tradeoff, but I recognize that a number of people see it as worthwhile.


This is absolutely not the case. There is a difference between UI built for people with no training/experience, and those with some training/experience; see e.g. the Therac disaster for an illustration of the UI challenges of building for experienced users. On personal computers, it means attention to keyboard shortcut design, making data more prominent and well-separated visually, and labels less prominent/space-consuming.

Consumer-oriented UI is not the only UI! Users' time and commitment are not unlimited, but they are substantial when e.g. you are in the military and already need to be trained to understand the problem domain.

EDIT: A great example is the UI of the controls of a car. It's important to put commonly-used controls in ergonomic areas, follow the principle of least surprise, etc.; but it can still be built with the assumption that the user is a licensed driver and knows where the turn signal usually is, that up is left and down is right, what all the assorted symbolic labels mean, etc.


Or if you need to clean/disinfect often. Touchscreens are good in hospitals.

Hospitals could also use hand controls made of metal with a high copper or silver content, or foot pedals. Maybe also autoclavable interface boards, or disposable single-use plastic covers.

Too many medical devices are designed to be operated by fingers, rather than a foot. Touchscreens help the problem of hand-transferred contamination, but that is the wrong problem to fix. The problem is that hands are being forced to touch things unnecessarily. No one cares about what feet touch, because feet don't touch patients, and shoe-clad feet can step through a shallow disinfecting bath if necessary.


I agree, and would add that a case where the UI is evolving and you want to be able to ship functional updates and changes easily is another 'use a touch screen' indicator.

Google Maps
