Honda bucks industry trend by removing touchscreen controls (autocar.co.uk)
1922 points by trenning 60 days ago | 743 comments

Look at the cockpit of any modern airliner and you will see screens, but they are never interactive. There are hardware buttons, dials and lights all over the place. A tactile interface is more obvious, sturdier and more stable, and therefore safer. The problem that touch interfaces solve, ever since the advent of the first smartphone, is that the interface is now dynamic: you can change it without having to replace the hardware. Here's the catch: for safety-critical interfaces, YOU DO NOT want the interface to change. The point is moot.

Touch screens will hopefully never make it into any critical pilot systems, because safety and stability matters to airline manufacturers, current ongoing scandals notwithstanding. I only wish automobile manufacturers took their job equally seriously.

I work for an avionics manufacturer and I can assure you most of our upcoming commercial systems (and even a healthy portion of government ones) feature touch screen inputs.

Touch screens add multiple points of failure to a device that, if properly built, would last decades. A single glitch in the software driving a screen could render useless all touch inputs displayed on it, information loss aside. I'm all for mechanical switches everywhere. As for potentiometers, sliders etc., we already have optical and mechanical encoders that hardly fail, or if/when they do, fail gracefully, leaving enough time for replacement. To me, the reason for touch screens is either cost or aesthetics, or both.

In many of the newer systems, all those physical dials and switches are just inputs to the computer system which ultimately decides to do what the user is requesting. A bug which would prevent inputs from working right on a touchscreen could also happen on reading inputs on other systems. Not that I'm arguing for touchscreen controls, just that these days having a physical knob does not mean you're directly manipulating things. Software glitches can still muck up physical controls.

> In many of the newer systems, all those physical dials and switches are just inputs to the computer system which ultimately decides to do what the user is requesting.

Even so, a program for processing a switch or dial can be really short and simple. You can print it out on a sheet and check and double-check every line of code to make sure it's correct and all possibilities are accounted for.
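To make that concrete, here's a toy sketch (hypothetical fan-mode dial; the names and values are illustrative, not from any real system) of how small such a program can be:

```python
# The entire logic for a 3-position mode dial: small enough to audit by eye.
ACTIONS = {0: "fan_off", 1: "fan_low", 2: "fan_high"}

def handle_dial(position):
    """Map a validated dial position to exactly one action."""
    if position not in ACTIONS:
        return "fan_off"  # fail safe on any unexpected reading
    return ACTIONS[position]
```

Every input, including garbage, maps to a defined output, and you can verify that by inspection in seconds.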

A program handling a touchscreen will be complicated. Millions of lines of code. Maybe even billions. The best you can hope for is empirically verifying it's mostly correct most of the time.

You've done a lot of programming for hardware switches and such then?

I do some. On the last device we built, we still fought with a simple rotary switch. You have to do things like debounce inputs that seem like obvious binary switches. Getting the debounce windowing right can be just as "guessy". And guess what the highest point of failure on said device is: that selector switch. I had a similar experience with buttons. I think the software part is just two forms of the Law of Conservation of Ugly.

I do like tactile better, but more for affordance/discoverability (e.g. ergonomic) issues than what you're driving at above.

Yeah processing switch and button and encoder data manually is terrible. Once you install a library to wrap this hardware device in a sane process, how different is that than getting an x/y pixel coordinate from a touchscreen? Touch technology is incredibly reliable. I touch my phone probably 5k times a day or more and I don’t have touch failures, and I carry it around with me and get debris on it and drop it off of tables and all that too. I would argue that a touch interface is one of the most reliable from a hardware standpoint despite not being tactile.

If anyone else is wondering what debouncing is:


Haha, this is actually a remarkably funny read with some grizzled, hard-won bits of wisdom sprinkled in.

"It’s surprising how many of those derelicts hanging out at the waterfront bars pick an almost random time constant. “The boys ‘n me, we jest figger sumpin like 5 msec”. Shortchanging a real analysis starts even a clean-cut engineer down the slippery slope to the wastrel vagabond’s life."

Pressure / proximity info is just as noisy and requires its own version of debouncing, plus x/y jitter handling on top. (Is it a click, a drag, or hovering over?) My partner can't even stop accidentally registering right-clicks on her laptop touchpad, which should be a really polished experience these days.

I'm not sure I buy touchscreens ever being simpler to handle (or even in a similar range; they're strictly harder).

Even modern AAA computer games sometimes miss mouse clicks, because they foolishly poll for transitions of the button up/down state in the main loop, for each frame they render, instead of properly tracking the OS event queue.

It's a very common (and lazy) way of programming games (and other more mission-critical apps): naively polling the input device state in the main simulation or rendering loop, instead of actually responding to each and every queued operating system event like mouse clicks.

It's entirely possible to get multiple mouse down/move/up/click events per render frame, if the system has frozen or stalled for any reason (which happens all the time in the real world). But polling just can't deal with that, so it sometimes ignores legitimate user input (often at a critical time, when other things are happening).

So it's still unfortunately quite common for many apps to sometimes miss quick mouse clicks or screen touches, just because the system freezes up for an instant or lags behind (like when the CPU overheats and the fan turns on madly and SpeedStep clocks the CPU waaaay down, or even the web browser opens up another tab, or anything else blocks the user interface thread), and it just doesn't notice the quick down/up mouse button transition that it would have known about if it were actually tracking operating system events instead of polling.
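As a toy illustration of the difference (illustrative code, not any particular engine's API): a per-frame poll only sees the button state at frame boundaries, while draining the event queue sees every transition, even a down/up pair that fits entirely inside one stalled frame.

```python
# Simulated OS input: (timestamp_ms, button_is_down) transitions.
events = [(10, True), (14, False)]  # a 4 ms click

def clicks_by_polling(events, frame_times):
    """Sample the button state once per frame; count down->up transitions."""
    def state_at(t):
        s = False
        for ts, down in events:
            if ts <= t:
                s = down
        return s
    clicks, prev = 0, False
    for t in frame_times:
        cur = state_at(t)
        if prev and not cur:
            clicks += 1
        prev = cur
    return clicks

def clicks_by_queue(events):
    """Process every queued transition; no click can be missed."""
    clicks, prev = 0, False
    for _, down in events:
        if prev and not down:
            clicks += 1
        prev = down
    return clicks

# A stall: frames rendered at 0 ms and 100 ms; the click fell in between.
print(clicks_by_polling(events, [0, 100]))  # 0 -- the click is lost
print(clicks_by_queue(events))              # 1 -- the click is seen
```

With frames at normal cadence the poll happens to catch the click; it's exactly the stalled-frame case where the two approaches diverge.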

I had a microwave that used a digital knob that was completely screwed up, from a debouncing perspective. You would turn the knob to try and add 30 seconds to the time and it would stutter between 5-10 seconds for a bit and then shoot up to 3 minutes and then you’d try to drop it down to 30 seconds and end up stuck between 1-2 minutes. It was infuriating! The old mechanical microwave dials were way more reliable than that piece of junk!
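That stutter is roughly what a naive decoder produces on a bouncy quadrature encoder. A common fix, sketched here with illustrative names, is a state-table decoder that counts only valid Gray-code transitions and treats anything else as bounce noise:

```python
# Gray-code sequence for a quadrature encoder: 00 -> 01 -> 11 -> 10 -> 00 (CW).
# Index by (old_state << 2) | new_state; +1 = CW quarter-step, -1 = CCW,
# 0 = no movement or an invalid (bounce) transition we deliberately ignore.
TRANSITIONS = [
     0, +1, -1,  0,
    -1,  0,  0, +1,
    +1,  0,  0, -1,
     0, -1, +1,  0,
]

def decode(samples):
    """samples: iterable of (A, B) pin readings. Returns the net step count."""
    pos = 0
    prev = None
    for a, b in samples:
        state = (a << 1) | b
        if prev is not None:
            pos += TRANSITIONS[(prev << 2) | state]
        prev = state
    return pos
```

A bounce that flips back and forth between two adjacent states contributes +1 then -1 and cancels out, so the count only advances when the knob actually moves through the sequence.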

Are you debouncing on tactile or is the hardware doing it for you?

In an ancient textbook I was reading, they were explaining how to debounce with transistors, capacitors and resistors - so at one time in history debounce was done in hardware.

In my own brief stint in (non-critical) hardware development, all debouncing was done manually in software.
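For the curious, a minimal sketch of what a manual software debounce often looks like (time-window approach; the names and the 20 ms default are illustrative, not from any particular project):

```python
class Debouncer:
    """Accepts a raw switch reading each poll; reports a state change
    only after the input has been stable for window_ms."""

    def __init__(self, window_ms=20, initial=False):
        self.window_ms = window_ms
        self.stable_state = initial
        self.candidate = initial
        self.candidate_since = None

    def update(self, raw_state, now_ms):
        if raw_state != self.candidate:
            # Input changed (or bounced): restart the stability timer.
            self.candidate = raw_state
            self.candidate_since = now_ms
        elif (raw_state != self.stable_state
              and now_ms - self.candidate_since >= self.window_ms):
            # Held steady long enough: commit the new state.
            self.stable_state = raw_state
        return self.stable_state
```

Picking `window_ms` is exactly the "guessy" part mentioned upthread: too short and bounces leak through, too long and fast taps get swallowed.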

It all depends on the application... and perspective.

Hardware debouncing works well for most applications but may not be cost-effective at scale. With time and effort, software debouncing can produce results as good as, or sometimes better than, hardware.

Remember the saying, "When all you have is a hammer, everything starts to look like a nail..."

Mechanical switches and rotary controls require debouncing, which is no picnic.

I can 100% tell from your comment that you've never had to work with one.

It's less science than black magic to avoid double presses or missed presses.

I've done a bunch of debouncing of switch inputs, keypads, keyboards, rotary switches, and wrote some test code for a capacitive touch display (a long time ago).

It's the kind of problem that will tend to bite you in the butt if you aren't aware of all the gotchas. The difficulty is that they're application-specific. But I wouldn't describe the code as particularly complicated.

Most of this stuff a crusty old neckbeard embedded programmer can do half drunk on Friday afternoon.

So because an expert can do it easily means it's easy? In that case literally anything is easy.

OP was saying that mechanical switches could be deterministic, which is something that I haven't experienced.

I do agree that there is less to go wrong than a complicated touchscreen interface however.

Billions? That sounds like a vast overestimate, no?

Are there any programs that approach a billion lines of code?

I'd think a few npm dependencies should do the trick ;)

a quick search brought up https://www.freecodecamp.org/news/the-biggest-codebases-in-h... which reports Google's codebase is around 2 billion LOC. MS Office comes in at close to 50 million, for example.

Not sure how accurate these are, but seem to give some rough comparisons, and yeah, not too many things are billions of LOC.

I wouldn't be surprised, the amount of crap that gets downloaded for a simple react app is incredible.

> A program handling a touchscreen will be complicated. Millions of lines of code. Maybe even billions.

I think you're off by a few orders of magnitude.


The phone switch for Nortel's Meridian PBX system circa 1994, which supported SONET and IP, had about 16 million lines of code. The complexity of a touchscreen is less than 1%, maybe less than 0.1%, of that. Lines of code, however absurd a metric, does say something in this case. I'm just not sure exactly what, though.

Isn't code just vastly different now... abstracted, compared to '94-era lingo?

Interesting link. Can anyone explain why a car needs so much code?

Instead of having a single computer that stores all the code, automobiles have lots of embedded systems with their own code and hardware, and lots of systems designed for validating safety critical functionality. When I say lots, I mean it's usually several dozen and can be over 100. Since many of these need to meet special regulations and oftentimes require hard real-time characteristics, this tends to add to the complexity significantly.

Apparently a lot of it is just generated templates coming from commercial SDKs. Adding to that, a car is a set of distinct embedded systems interacting in a sort of a ladder network topology rather than a vertebrate analog like multi-core PCs, so a lot of code in a car would be redundant or have few footprint restrictions.

I mean ... we all know a car does not need that much code

Really, why do you know that?

I would expect a car to have tons of code.

Think of all the functions...

Engine management, Engine monitoring, Powertrain control, Emissions, Diagnostics, Infotainment, Satnav, Climate Control, Traction Control, ABS, Anti-collision radar, Cruise control, Lane keeping, Backup camera, Parking sensors...

Now keep in mind that these hundreds of components exist in many many possible configurations so the system needs to handle having certain hardware available or not, and also handle a multitude of failure modes gracefully.

Perhaps because cars ran just fine (albeit with fewer features) for a long time with zero lines of code.

So did horses without any gasoline

Yeah, and so did banks, and so did airplanes.

So the more important question is: did adding software improve things (enough to be worth the “cost”)?

With cars, there are certainly many things where it did improve things: satnav, reverse camera, traction control etc, but also some where it made a perfectly working system worse (ie the “fixed” something that wasn’t broken): touchscreen dashboards.

Because my car has exactly zero lines of code and it runs just fine.

I guess in cars they use fewer (public) libraries because of safety, so their own libs are included in the LOC count. If you look at a modern Microsoft license, they list tons of open-source libs used in their products. I guess if you include all the libraries, the LOC numbers should be much higher.

Is this a good place for redundant microservices?

Each service handles data from a handful of physical knobs.

At least that way you don't have the UI as a single point of failure.

There is no good point for "redundant" microservices.

They are, by very definition, an additional point of failure as you're always adding an additional interface. They're good for scaling, not for redundancy, and even that's wishful thinking for most applications.

EDIT: You could argue that microservices might free up the UI thread from locking mistakes, but if your team is going to make locking mistakes, you're also going to make mistakes in the microservice interfaces, so what's the point?

It's not about minimizing points of failure, it's about removing singular points of failure. Perhaps "redundant" was the wrong word but the point is instead of having a single UI component (touchscreen/UI thread) which can cause the whole system to fail, if you group buttons/knobs into individual microservices they are unlikely to all go down simultaneously.

I don't get how to explain it more simply: if you add a microservice, you're adding another point of failure. You add another interaction, you add more code. Add more code, you add more bugs.

It's pretty much the only empirical thing we have in software engineering, more code = more bugs.

Lots of little microservices means lots of extra code means lots of extra bugs.

Plus you've got to manage how they all interact. Which microservice has priority? The brakes microservice? Or the volume microservice? Did you even think about that, or try to test it? Your brakes get disabled every time you turn up the volume?

Whoops, you just killed a thousand people with your "redundant" microservices.

Erlang is built around the idea of microservices and message passing. It's by far one of the best languages we have for making fault-tolerant programs; it was invented to solve exactly that problem. While that does not mean you can take every program in any language and turn it into a microservice architecture, it shows that, implemented correctly, the idea of microservices and messages absolutely can increase reliability.

Do you object to redundancy under all circumstances, or only when it’s preceded by “microservice”?

If you add one microservice, you add a point of failure, but if you then add a few redundant copies of that microservice, for failover, then you don't add a point of failure.

So… You're suggesting reducing the impact of potential error… By increasing the surface area of potential error…?!

It depends on the error. If its a logic error that will deterministically break all instances, then sure. If, on the other hand, its a transient error that a backup copy of the service can avoid, then redundancy adds safety. Sometimes safety critical software has multiple redundant units which need to agree, too, for example.
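As a toy sketch of that last idea (illustrative only, not how any real avionics system is specified): triple-modular redundancy runs three units and takes a majority vote, so a single faulty unit is outvoted.

```python
from collections import Counter

def majority_vote(readings):
    """Triple-modular redundancy: return the value a strict majority of
    units agree on; a single faulty unit is outvoted. Raise if there is
    no majority (more than one unit disagrees)."""
    value, count = Counter(readings).most_common(1)[0]
    if count * 2 <= len(readings):
        raise RuntimeError("no majority -- more than one unit disagrees")
    return value

print(majority_vote([42, 42, 17]))  # 42 -- the faulty unit is outvoted
```

Note this only masks independent faults; a deterministic logic bug shared by all three units still sails through the vote, which is the caveat above.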

Multiple physical knobs and buttons add multiple points of failure: moving parts fail and even worse, they often fail intermittently. We all have that experience. Even optical encoders fail (I’ve had one fail on an engine, and obviously consumer mice, or the connectors fail).

A modern touch screen is superbly reliable because it has no moving parts, and it can be tested. The (consumer grade) iPad touchscreen is very reliable.

I had a smartphone that at some point started to randomly create touch events. Kind of like if you put it in your pocket while the screen is unlocked.

I don't remember any physical light switch that ever switched on or off by itself.

In an aircraft flying through turbulence I'd feel a lot more comfortable knowing that all switches are physical. Try using your smartphone while jogging...

Comparing a light switch to the switches and dials found in an aircraft is a bit like comparing a light switch to a keyboard key.

Yeah they’re both switches, but size is incredibly important, and small mechanical devices are finicky and don’t produce nice clean digital output (that’s a lie that electronic engineers tell software engineers to keep things simple).

So yeah I’m sure you’ve never seen a light switch fail, but I bet you’ve seen a keyboard fail (especially if you’ve spent any time around a recent MacBook).

But I do agree with your point on using a touchscreen in turbulence. A counter point is that there are probably hundreds of controls or settings on a plane that you never touch during turbulence, possibly that you never touch in flight (like telling the flight computer how much cargo you’re carrying). Stuff like that is ideal for a touchscreen.

A mechanical switch is easy and cheap to fix or replace. A touch screen is the opposite. It cannot be repaired, only replaced. A switch can usually be cleaned easily, to restore its function. And proper quality switches can be actuated millions of times before failure.

> A mechanical switch is easy and cheap to fix or replace.

Not if it is in an airplane. Think of all the QC steps required to track the production, storage, shipping, installation, testing, etcetera for the replacement of a single switch. If a switch has failed it needs to be inspected to understand the reason for failure (no switch should fail; tracked to understand if it is a batch failure, plus other steps). I am only making an educated guess here.

> A switch can usually be cleaned easily, to restore its function.

Ummm, you think they put known failed parts back in planes? I think not. They do fix major parts, but the QC for that would be insane. You would make a switch to be hermetic and add anti-tampering - a manufacturer of any safety related device doesn’t want it to be “fixed”. Items are designed to be maintained (with proper schedules), or replaced.

> And proper quality switches can be actuated millions of times before failure.

On average? Or does it have a bathtub curve? Yes, quality switches are insanely reliable, but so are touchscreens.

If you have a variety of 50 switches and knobs, then the overall reliability is more than 50x worse, because every item has its own reliability curve, and it only takes one failure to muck up your day.

>Not if it is in an airplane.

Even less so if it's on the space station. Or on a Mars rover. But we're talking about cars. Something a lot of people like to mend for themselves.

OP context of this thread is touch panels in aeroplanes, not cars.

I can say that an intermittent switch failure is hard to diagnose and potentially costly. The dash on a 2007 Ford I got cheaply had an intermittent fault where the whole dash would shutdown, and headlights would go off, while driving. Switching ignition off and on would fix it, so I presumed it just needed a reset. It was actually the barrel switch of the key - intermittent enough to cause a lot of dangerous trouble but hard to diagnose.

Point of fact: a touch interface digitizer and the LCD screen are two separate components, often just glued together and sold as a single unit. Replacing a digitizer or a screen should be no more difficult than swapping out an analog component, given a proper modular physical layout and connectors.

A cheap phone or tablet hardly represent best of breed for the technology as a whole.

Another factor is having to keep a larger variety of inventory for mechanical switches, vs a single touch screen SKU standardized across many models.

What happens if an object impacts a touchscreen and it breaks? I have had screens break (even behind a screen protector) and fail to respond to touch. Imagine you are flying a plane and the touchscreen breaks due to impact. All your inputs are centralized behind that one touchscreen and you can no longer operate it. Wouldn't it be better, then, if there were multiple points of failure? You are decentralizing the impact a failure can have. Sometimes having multiple points of failure is a good thing: a failure in one component won't affect the other components.

Have a spare screen that you can just slot in and carry on.

The airplane cockpits I've read about have multiple screens, with knobs selecting the function of each screen (and which of the redundant flight computers control each screen), so a failed screen can be replaced by selecting its function on another screen. This makes more sense (and is simpler) than carrying a replacement screen (which would be dead weight most of the time) and designing a hot-swap mechanism (which would be yet another potential point of failure).

While your plane is falling from the sky?

Anecdotally, my touchscreen devices' screens are less durable than my mechanical keyboard, but I agree with your overall point: touch screens can be made incredibly durable.

Very interesting... I wonder what it would take for a touchscreen keyboard to be as reliable or live as long as a mechanical keyboard. Maybe it already does, but my mental picture and experience of phone screens getting flaky, and taking secondary+ swipes does not give me the confidence a mechanical keyboard does.

If they can make a durable touchscreen keyboard with a good click sound and feel and reasonable key travel and resistance, and that doesn't get crudded up with skin oils and food film, I'm willing to pay big bucks for it.

Multiple physical knobs will never fail all at once like a single touchscreen can. As a small plane pilot I prefer to have as many independent systems as possible; I use an iPad for instruments and navigation, but every single function is duplicated by a separate instrument, so if I lose the iPad for any reason I still have the instruments in the cockpit. I have had several incidents where single components failed, but I was able to safely land the plane every time. Hardware failures are as much of a problem as software failures, and dual-redundant critical systems are far better than multi-functional devices.

The iPad touchscreen is very reliable until you splash water on it. I don't think splashing a cup of water onto 1995 car dashboard would trigger any of the buttons.

There are plenty of individual use cases where touch screens make sense. The interactive map as it is enabled by the multitouch screen, with arbitrary rescale and repositioning and display of arbitrary layers of data, all handled at the speed of thought, is something unmatched by any other object or interface, for one example amongst many.

A wet finger makes my device go ballistic.

In airplanes touchscreens are less of an issue than in cars. In cars you have to have a hand on the wheel 99% of the time, and your gaze on the road. You can't afford interaction with a complicated non-haptic touch screen menu.

In airplanes it's different. Here, outside of takeoff and landing, it's OK to look at a screen for 10 seconds while interacting with it with a hand.

Not to mention that airplanes usually have TWO pilots onboard, so one can use touch screens while the other watches the sky ahead. Cars, on the other hand, have just one person in the driver's seat.

> it's OK to look at a screen for 10 seconds

Not really. Pilots are supposed to be visually looking for traffic 90% of the time, and the rest scanning instruments.

So to be heads-down for 10 seconds, the non-flying pilot would have to arrange that with the flying pilot.

Under VFR sure, obviously under IFR they are looking at the instruments as much as 100% of the time, including the instruments which inform them if planes are nearby, and instruments which inform them of their location, direction, and all of the variables therein.

It would be madness if pilots had to rely solely on their eyes to locate other planes nearby. There are, thankfully, instruments which do this as well.

In VMC you still have a responsibility to see and avoid, regardless of whether you're on an instrument flight plan or not. (Ref: Regulation 14 CFR Part 91.113 (b))

Radar coverage has become ubiquitous in most places, but there's not universal coverage. Heads-up time is very important unless you're flying in actual IMC.

Can't find equivalent rule from ICAO, is this regulation US only?

Refer to ICAO Annex 11 - it's not the exact point being made by the GP, but note that traffic separation for IFR traffic from VFR traffic is only provided in Class A, B and C airspace.

Or, you know just flying the autopilot, which many do like 90% of the flying time...

I think what GP is trying to say is that your average airliner has around 10+ km of altitude to lose before rapid disassembly commences, which takes a non-trivial amount of time, compared to a car, which can go on a short and unintentional offroad trip within a few seconds or less.

No, you don't. You can fly over high mountains, with a lot less distance between the plane and the terrain, but how you lose altitude matters more: just a single km of drop can be unrecoverable if you exceed VNE and lose a wing in the process.

Most planes in the world are not the big airliners, but smaller planes where single pilot operation is very frequent and many planes do not have a complex autopilot, but usually a 2 or 3 axis stabilization. Also in turbulent weather looking for 10 seconds at a screen can induce motion sickness even to seasoned pilots. We have to do that for map checks and calculations, not a pleasure.

Huh, I wonder if they can use a measurement to adjust the refresh rate based on turbulence. I think they did this for the space shuttle experiencing vibrations to make a screen readable.

It is not the refresh rate; it's the fact that you are looking at a screen that is not moving much while the airplane and your body are shaking. Your internal sensors tell you you are moving, the eyes do not corroborate the information, and you get motion sickness. I had this problem in my first 50 hours of flying, sometimes even after 100 hours.

Not necessarily. There are many moments during the operation of an aircraft where full attention is paramount. Yes, you see a pilot leaving the cockpit to use the lavatory while the co-pilot is monitoring the autopilot, but the margins are just as small as operating a motor vehicle.

My partner is a first officer, and frequently describes his job as being a glorified babysitter outside of takeoff and landing.

Worth noting that a significant amount of the information pilots use in the cockpit (at major US carriers, at least), things like flight plans, are on an iPad.

... until something goes wrong. Then a touch screen is the last thing you want. An airliner moving uncontrollably is no time to try touching just the right spot of the screen, and avoid touching the wrong spot.

Was team screen til this point.

In critical systems, you want to make sure inputs are easy to use in the worst case scenario.

Even the best of touchscreens can't compare to physical controls in tough times.

Kinda like trying to flip just the right toggle switch and avoid flipping the wrong one?

Just because it's on a touchscreen doesn't mean it has to be tiny and hard to touch. A 17" touchscreen could have fewer controls than the same hardware panel. And the controls could be bigger on the touchscreen.

It’s not like trying to flip just the right toggle. You’ve flipped it thousands of times and memorized its exact location through haptic memory. You gently feel the surroundings to ensure your hand is in the right location and then you make the decisive motion with your fingers. You don’t need to look while doing this. This is how hands work. This is how you type. I wish people would stop pretending they don’t understand this.

> flipped it thousands of times

This raises a question: how many times does an average pilot actually flip a switch over their career?

Many switches are flipped once or twice on takeoff, then again before or after landing. Depending on the plane (small GA aircraft, airliner or anything in between) there could be 4 or 5 switches to flip in total before takeoff, or 30-40. One of the most ubiquitously used controls in aircraft is the concentric rotary encoder, though, with a button integrated into it if you press the top of the encoder. Those are used for menu navigation in the GPS units found in many planes, or for the altitude/heading selectors in autopilot units.

Worth mentioning that these days GPS units seem to be getting touchscreens but usually still aren't losing the physical buttons.

Thiss os how i typw somtimws.


Imagine trying to find the right switch on this by feel, without hitting the wrong one by accident.


Easy mode, given the variety of orientation points in the form of switches you can gently touch with your fingers to recognize your position.

That said, you usually take a short look at the panel to benefit from that hardwired visual-motor coordination hardware in your head.

I can just as easily make the argument the other way around: An airliner moving uncontrollably is no time to try touching just the right knob (of which there are like a hundred), and avoid touching the wrong one.

You'd be wrong. The critical controls are uniquely shaped so that the pilot can put his hands quickly on the correct one and know it's correct.

Part of pilot training (at least in my dad's day in the AF) was blindfolding the pilot and the instructor names a control, and the student must put his hands on it. Or he flunks.

I don't believe this is equivalent though. With hardware controls, most (if not all) of them are immediately accessible at all times. With proper training, body movement and tactile feedback will train your muscle memory which will help you find the right control without much of a hassle.

Just out of curiosity, do pilots still manually take off and land fully, or does auto-pilot / computer do this too nowadays?

The technology exists, but autoland functionality depends on the plane model and the airport. Usually, most of the approach is done with ILS with the final moments being manual.

Absolutely not true. In cruise, stabilized, especially with AP, the margins are MUCH MUCH higher. Pilots have fallen asleep (both of them), overflown airports, and still landed, etc.

I think the major difference is commercial airliner vs., say, fighter jet. As a layman it still seems pretty wrong to make most of the screen highly dynamic. Maybe a designated one on the side.

Back to the topic: in a car, unless a screen is specifically intended for other passengers, the driver should never stare at some stupid screen in a place way off the line of sight for driving. Whenever I do that even for a split second in my 15-year-old BMW (checking if that knob is really for what I want), there could be an atomic blast in front of me and I wouldn't see it.

I'm not arguing for touch screens - I don't like them, but to say a commercial pilot in cruise at 30K in controlled airspace with controlled separations and TCAS / ADS-B has margins as low as someone driving 85 MPH with traffic literally 50 feet in front of them is absolutely ridiculous. The spacing distances alone are 100x different (in terms of travel time).

Deaths and injuries per mile traveled supports the idea that flying is MUCH MUCH safer.

Fighter jets are starting to use touchscreens, too.

Because fancy, advanced UIs work so well for Navy ships.

I remember when that happened, pretty shocking: https://en.wikipedia.org/wiki/Go!_(airline)#2008_incident_an...

Modern airliners are just short of autonomous. Even if they aren't, unless you are landing or on initial ascent, you are generally minutes away from catastrophic outcomes regardless of your control inputs. In fact, most of the time, if something bad is happening, simply letting go of the controls will lead to the issue resolving itself.

A car is very often fractions of a second away from a serious accident.

A plane at cruise altitude is rarely less than minutes away (unless, in some planes, you are actively trying to crash the plane/make the wings fall off)

> In fact, most of the time, if something bad is happening, simply letting go of the controls will lead to the issue resolving itself.

Unless you're flying a Boeing 737 MAX that is.

Indian Road Congress specifications recommend 3.5m minimum lane width for multi lane roads, 1.06m minimum center margin. Airplanes complex enough for touchscreen interfaces would be awful tight in those margins ;)

Big mistake, imnsho. It will work fantastic right up to the point where you actually need that control in an emergency and then it will fail you because it is impossible to hit the right area consistently in a bucking aircraft. It also requires visual confirmation rather than tactile confirmation, which requires you to take away your attention from the surroundings, something you do at your peril in aircraft.

The sad but real reason a ton of this is happening is one very big word that is typically not present in things like car/airplane design: flexibility... flexibility to change the user controls, flexibility to fix problems, flexibility to let the SW team work up until the last minute to get stuff working. The second part that goes with this is cost: touchscreens mean increased flexibility for the design and better control over the cost to deliver features. Unfortunately, if the display dies and you can't see anything, the car or plane may crash... so sadly it will probably take a couple of those events before this is changed to include some kind of redundant system the pilot can use when the display dies suddenly.

If something needs that much flexibility in its UI that people are messing with it at the last minute, it doesn't belong in something people interact with while driving. The OP talks about HVAC controls. How much flexibility do you really need for that? The interface has been standardized for a long time. It's a known quantity both from a design perspective and a user perspective. Ditto for common audio controls like volume, pause/play, skip.

You'll often see screens surrounded on all sides by physical buttons. The screen can be updated and changed over time but the interaction is still physical.

There already are redundant systems like the ones you want. Multiple displays. Most planes that are heavily reliant on screens for instruments/navigation, touchscreens or not, have two of them at least. If one fails, there is a button you press which condenses the information previously shown distributed over both screens on each individual screen (so, the remaining one if one has failed).
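A toy model of that condensing behavior (not any real avionics architecture - just the shape of the idea):

```python
# Each display owns a set of pages; when one fails, a surviving display
# absorbs its pages, so no information is lost with the hardware.
displays = {"left": ["PFD"], "right": ["ND"]}

def fail(name: str) -> None:
    """Simulate a display failure: move its pages to a surviving display."""
    orphaned = displays.pop(name)
    survivor = next(iter(displays))
    displays[survivor].extend(orphaned)
```

After `fail("right")`, the left screen carries both the PFD and ND pages.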

Planes have multiple screens, which provides more redundancy than a broken physical switch does.

User controls should never change. It leads to people getting confused and making mistakes.

There's lots of discussion here about the benefits or otherwise of analog switches etc, but it's not about the failure or otherwise of the input device, it's about the affordance to the pilot.

Well designed physical UI allows pilots to use touch and haptic feedback independent of sight. Whether the switch/dial/whatever is analog behind the scenes or is a digital input to the control infra is not the important thing.

I hope this is known to the manufacturer:

"The US Navy will replace its touchscreen controls with mechanical ones on its destroyers

After a deadly 2017 crash between a destroyer and an oil tanker"


What exact type of input is being grabbed by these screens? "Critical pilot systems"? Touchscreens already exist in airliners, for systems used by crew and passengers, but that's beside the point.

I fly recreationally and all the higher end gear has touch screens. Some of them are redundant with paths using hardware controls or not. But that's definitely the minority. There are a slew of critical operations that I simply cannot complete without interacting with a touch screen.

Not saying it's right or wrong but your original post is 100% incorrect.

Are these things certified?

Many recreational pilots fly with uncertified gear (GPS in particular), and even regular smartphone/tablet apps. They also have the required paper documentation and certified instrument but that's just to cover themselves, and as a backup.

Garmin has quite a few certified touchscreens these days, some intended for panel upgrades (e.g. https://buy.garmin.com/en-US/US/p/67886) and even some in new light jets (e.g. https://buy.garmin.com/en-US/US/p/66916).

Also, some airlines now have officially certified iPads as EFBs, meaning pilots no longer need to carry paper backups.

Yes this has all been in Pipers and Cessnas.

My iPhone touch screen won't work if my fingers aren't clean and dry.

They are not capacitive

Critical driver systems (i.e. steering, signaling, acceleration and brakes) aren't controlled by touchscreen in any car I know of either, so if non-critical systems are beside the point, what is the criticism here? Or are you thinking of some particularly extreme cars that are even more reliant on touchscreen than the Model 3?

I was just trying to make an illustrative example of why I think static interfaces are safer and therefore better, at the end of the day it comes down to subjectivity for the part of a positive user experience, but when it comes to safety the trend is, to me, worrying.

I also imagine that there are other reasons for both airliners and cars to replace buttons with touchscreens, namely that of cost instead of prioritizing safety and stability, and in general I am not a fan of that trade-off. But I'm also not claiming to be representative of the automobile market in general.

If you believe that aviation not having touch screens means that cars should not either, then evidence to the contrary should change your mind. The A350 and 777X both have touch screens now.

It's not just Boeing, who you accused of being backwards who are doing this, Airbus is too, along with every other manufacturer. Garmin and BendixKing now offer touch screens and it's clearly the future of GA as well not just commercial aviation.

Everyone believes that this will increase safety. That showing only the relevant information in a tunable and interactive way will decrease distractions and help focus on what matters.

The idea that this is to save money is totally absurd! A 777X costs $350 million. Any accident would cost an astronomical amount compared to the cost of switches. Even leaving that aside, the touchscreens are actually far more expensive than the old instruments.

This is just a way for Honda to cover up the fact that they can't write software, can't design a reasonable UX, don't want to spend money on it, and want to live as if it's 1999 forever.

I'd pay extra for a 1999-style car. That year was near the peak for car design quality. Fancy cars hit the peak around 1995, and cheap ones hit the peak around 2010 or 2015.

Who is this everyone that believes touchscreens will increase safety? I could not find any reputable sources for this claim.

Furthermore, tactile feedback is safety. The fact that each switch has a feel, a size, a position - that lets your brain know what you are doing without having to take your eyes off the road.

I have seen side mirrors controlled by touch screen, as well as some headlight functions. Those are critical safety systems, though rarely an issue.

Still, if you’re borrowing your wife’s car it’s easy to realize you don’t have great blindspot visibility at which point looking at a touch screen is very distracting.

What happens during turbulence when you can't accurately touch a specific point on the screen? It doesn't take much turbulence, either, as I cannot reliably use the GPS on my phone, while driving, when it's in its (very stable) mount, and I'm on an average paved road.

Touch screen looks awesome on Star Trek, but in actual use it's an inaccurate, attention-magnet, nightmare.

Which control do you exactly need to adjust in the middle of a turbulence?

Most of them, as you can encounter turbulence in all phases of the flight.

Roll? Pitch? Yaw? Engine RPM? Prop pitch?

Those are “most” of the controls... none of which require touchscreens.

What controls are you referring to specifically?

A plane isn't a good analogy for a car.

Airplanes have keypads that control complex functions on a screen, going from that kind of keypad to a touchscreen is logical.

In the case of cars, a touchpad is overkill for controlling the cabin temperature, stereo volume, etc.

It's logical? It's absurd. You can usually pull over a car and get out in the case of a critical hardware failure. There is rarely such a luxury in a plane.

Anything critical on a plane will be duplicated, including power systems. Sometimes triplicated.

You say that like the 737-Max incident didn't just happen.

I wonder about the temperature settings. I have manual dials but I basically set them to 21C for winter and 19C for summer. Other than that, I don't readily touch the hvac panel at all: I could certainly do those rare adjustments over a touchscreen.

I can imagine people would need to tune a radio panel more often, so at least basic functionality would be good to have as physical inputs. But even then basic radio functions are usually accessible via steering wheel buttons.

I pray you all read the ux bible About Face on interaction design.

So in due time we can be almost certain to hear about a crash due to a bad touch or a wonky screen.

New isn't always better.

That’s too bad. Imagine unresponsive touchscreen due to dry skin or wearing gloves, or hands are too sweaty.... or the controller goes out, etc.

Flight plans are one thing, but controls are altogether a different sort of thing.

Why ask for trouble?

I have no idea how, but my smartwatch touchscreen remains responsive underwater, in a bathtub or swimming pool. It was with me on every workout, and it both endured the energetic movements and had no issue with sweaty hands. The thing has more RAM and CPU power than my desktop in the first half of the '90s, runs Linux, and is programmed with JS (well, there's gotta be some faulty part in every design). Anyway, if I can buy such a thing for a few hundred dollars, then - unless there are some physical limitations I'm not aware of - it should be possible for the people who build the planes.

Touchpads can't even be operated by a cat https://ask.metafilter.com/91541/Why-cant-Godfrey-work-the-t...

Actually the thread showed they can.

The theory I thought was reasonable for why the OP had troubles, was that Mac touch pads are sensitive enough to treat the separate pads of the paws as multitouch.

Testable: try with individual pad of paw on a Mac touchpad.

Touch screens for all purposes? It's probably worth distinguishing whether some things, even at your company, should not be dynamic (are they dynamic?). Some life-critical control that is vulnerable to an uncovered sneeze?

The big driver for this is reducing the number of components and thus manufacturing cost, and secondarily being trendy.

I wouldn't fly on a touchscreen plane, imagine Boeing going corporate with their airplane screens, nope, nope, nope.

function (buttons) > form (touch screen)

Usable inputs save lives.

Adding "touch screen calibration failure" to the list of things that can kill you.

please tell them to stop

and all SCADA systems


No offense intended, but the sector as a whole has been doing all sorts of dumb sh*t recently when it comes on-board electronics (787 batteries, 737max MCAS, A380 wiring, F35 ... everything). This doesn't exactly refute OP's point, is all I'm saying. Maybe airspace firms shouldn't be taking their design cues from Cupertino.

OP was making an appeal to authority, that avionics manufacturers know what they're doing and decided against touchscreens. So IMO it does strongly refute OP's argument.

(That's not to say that OP is wrong, of course, just that their argument isn't really a valid one. My belief is that touch screens would suck for flying a plane, but I'm not a pilot.)

What I was trying to say was more that avionics are by nature risk averse, so if they're doing something it's probably worth understanding why. So it's not so much "it's safe because it's in an airplane", I was more going for "consider why this very safety-focused environment looks different". So sure, I might be in the wrong if what I said was interpreted as a simple appeal to authority, but I was trying to get a point across that people spent a lot of time trying to make and keep these systems secure, so let's try to learn from that instead of invalidate it as being simply old or outdated (which Boeing themselves ironically seem to be guilty of).

It is very clear that the aviation industry is NOT risk averse. They are averse to losing money (via needing to spend money to redesign systems, recertify interfaces, re-train pilots, and rebuy new equipment and simulators; all of those reduce risk but are capital intensive). But they are no longer risk averse. They might never have been risk averse at all - the roots of the aviation industry are exceedingly risk seeking (to fly is itself a risk-seeking activity, and that's something understood by all pilots and all aviation and aerospace engineers on day 1 of wanting to fly).

You can say the civilian oversight groups that seek to regulate the industry are risk averse, but the companies that build the planes themselves, if they had their say, we'd be flying mach 3 upside down all day.

> They are averse to losing money (via needing to spend money to redesign systems, recertify interfaces, re-train pilots, rebuy new equipment and simulators, all of those reduce risk but are capital intensive).

Re-designing systems introduces risk and uncertainty. Being able to leverage existing pilot training reduces risk (because crashes have resulted from pilots forgetting they were flying X and applying training for Y). Buying new equipment introduces risks of manufacturing defects that weren't present in the working equipment.

Appeal to authority is an informal fallacy of logic. It can/should never be used to prove an argument.

Sure, if everyone is an expert on the subject, or is willing to spend the time to become one, you should never appeal to authority.

That's not most people on most subjects. If someone appeals to authority and says "climate change is real, here's 100 scientists with PhDs who agree" I accept that. I am not willing to become an expert on the subject to be able to spend the time to review the facts for myself. Citing sources in a paper is essentially appealing to authority (I understand I could read those papers and the ones they cite, all the way down, but for most things, I'm not going to do that).

For what it's worth, for a specialist in a field, they'll have already read most of the papers that are cited and will be looking for new or missing ones to find gems or flaws in the argumentation.

That's fair. Most papers are probably written with specialists in mind, so I guess that wouldn't be a good example of appeal to authority.

> The problem that touch interfaces solve, ever since the advent of the first smart phone, is that the interface is now dynamic. You can change it without having to replace the hardware.

I mean, the other point of a dynamic interface is that you can now have more controls than would fit on a static interface. Touchscreen fit-to-purpose controls might suck more than hardware fit-to-purpose controls, but either option is better than a single set of generic controls that control multiple systems that "should" have different control paradigms, translating to the generic controls being a compromised bad fit for any use-case.

E.g. a hardware English-language keyboard is probably better than a touchscreen English-language keyboard (though people with modern Blackberries might dispute this); but both are better than entering English text through T9 on a dial pad. And the touchscreen has the benefit of allowing you to have more keyboards (for e.g. the multiple native languages you type that use different alphabets), which wouldn't even fit on the phone as hardware keyboards.

I bring this up, because eventually you run out of space to stuff additional controls. As airplanes become ever-more advanced, their cockpits will approach that point. At that point, dynamic affordances may be necessary, just so you can have some kind of "pagination" allowing you to squeeze more controls in. (Hopefully it'd just be for the non-time-critical switches to flip.)

There's always the possibility for hybrid UIs. Something with physical inputs with a dynamic display based on context, like a screen above a series of buttons and maybe a dial at the end ( think ATMs) or even buttons with OLED displays.

It's the best or worst of both worlds depending on your perspective, but they do offer superior hands-free operation over a pure touch device, but at the sacrifice of interface flexibility.
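A toy sketch of that pattern (all names hypothetical - not any real avionics API): the physical keys never move; only the label/action table behind them changes with the displayed page.

```python
# Hypothetical soft-key table: five fixed physical buttons beside a screen,
# whose meaning is looked up from whichever page is currently displayed.
PAGES = {
    "nav":   ["DIRECT-TO", "FLIGHT PLAN", "NEAREST", "MAP RANGE", "BACK"],
    "radio": ["COM SWAP", "NAV SWAP", "VOLUME", "SQUELCH", "BACK"],
}

def soft_key_action(page: str, key_index: int) -> str:
    """Return the action bound to physical key `key_index` on `page`."""
    return PAGES[page][key_index]
```

The muscle-memory part (key position, click feel) stays constant; only the on-screen legend next to each key is dynamic.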

Also known as an MFD, and used for many decades in the aviation industry.

MFDs were pioneered in fighter jets, where the cockpit physical space is extremely limited while the amount of information the pilot has to deal with is far beyond any civilian pilot workload.

MFDs combine the durability of physical controls with the configurability and flexibility of screens, and it's completely beyond me why they are not standard equipment in all cars.

That's a funny coincidence. Two days ago, I was searching the Web for cars with MFDs after watching a review covering the Porsche Taycan's touchscreen interface, thinking how odd it was that I couldn't find any.

>Something with physical inputs with a dynamic display based on context, like a screen above a series of buttons and maybe a dial at the end

BMW's iDrive is, I think, the canonical early example of this in the automotive world.

> I mean, the other point of a dynamic interface is that you can now have more controls than would fit on a static interface.

I think this point is lost sometimes, but is also useful. My car has a lot of physical controls. Some of the ones that are useful during driving are tucked inconveniently below my left knee!

Moving some less frequently used controls to a touch screen might actually benefit some of these designs.

I'm not disagreeing with anything you're saying and despite my somewhat ranting comment I am not completely opposed to a mix of touchscreens and hardware interfaces: but I think you'll agree that there needs to be a decision made in terms of safety when you decide to use a touchpad for input. If you're adjusting the screen brightness on your smart phone, that slider doesn't have to be perfect, and maybe adjusting the cabin lights for an airliner doesn't have to be either: I'm just dreading the day when things like navigational headings and airspeed creeps into a touch interface because of "convenience". But if my observation about this trend is wrong, which I hope, that separation of concerns will stay in future designs as well.

Too late! Boeing 777-X will have touch screen interfaces for pilots.

Personally, I think information display can and should use touch interface, but actions should be tied to physical switches or buttons.

> Personally, I think information display can and should use touch interface, but actions should be tied to physical switches or buttons.

Underrated comment!

Information - the user is already looking at the screen, so they can touch virtual buttons. And that is probably the best approach, as they are manipulating information that is being displayed and they can see.

For actions, you won't necessarily have your attention on the screen. The information may not even be displayed in the screen yet, so now you have to divert attention and manipulate the system to get it to a state you can then change(eg, moving to the climate control screen).

Well, keyboards are an interesting one. Airliners have space to install hardware keyboards; light aircraft don't, so they have touch displays.

You could argue that an onscreen keyboard is significantly less mental load for the pilot than having to scroll letter by letter on a "dumb" interface.

Where do accidents happen? It's fairly rare for a plane to just break. And even then, if your electronics fail there are usually mechanical backups for the critical instruments. It's much more common for pilots to reach task saturation and make mistakes.

> You could argue that an onscreen keyboard is significantly less mental load for the pilot than having to scroll letter by letter on a "dumb" interface.

I don't buy that argument. What kind of keyboard? QWERTY or something else? What language? English is the language of aviation, but if you're not a native speaker who uses keyboards all the time then chances are you're going to have a LOT of mental load using a keyboard. Even something like the French or German keyboards, which use mostly the same letters, may be different enough to cause frustration -- and when you have an engine on fire you don't need to be struggling with those details.

The ones I've seen are a-z, not qwerty:


And English, because anything you'd need to enter, like waypoints or airport IDs, uses Roman/Latin letters or numbers.

Airbus (A380 and A350) mounts both a full-size QWERTY keyboard on a retractable "desk" and an A-Z keypad on the center console.

The keyboard is mainly used for planning ahead where you have more data to input. Both are equipped with mouse as well.

The problem is not "having touch interfaces", the problem is "having touch interfaces for critical systems".

I can assure you that PFD (primary flight display) interfaces are very much safety-critical. And they will be touch interfaces.

Personally I think it's an awful idea. I just watched the promotional video that I expected to give some answers as to why this design decision was made and I frankly ended up even more worried.[1] I can accept that I'm somewhat of a luddite when it comes to this and I might be wrong, I just hope these are thoroughly tested and actually solve real world problems, and aren't just a way for Boeing to save money or solve the problem of "hey why aren't there any cool touchscreens in here".

[1]: https://www.boeing.com/777x/reveal/touchscreens-come-to-777x...

You would have a difficult time finding an airborne system that does not already do this:

> I just hope these are thoroughly tested and actually solve real world problems

Changing from a legacy style to a new one is not cheap, and aerospace companies are not the type to spend money on useless, less reliable technology.

That sounds like a horrible idea.

If you're trying to, say, lower the landing gear, and the button malfunctions, you can probably smack the button a few times until it works; failing that, rip the switch out, short the wires inside the switch, and get the plane landed.

With a touch screen? What if the glass breaks and the capacitive layer fails? Or the software running the screen crashes? Or a bug prevents you from switching from the "Climate control" tab to the "Landing gear" tab?

There are three redundant methods to lower the landing gear on my aircraft, depending on whether you still have hydraulic ability available or not. It is not unique. One is a big, fat lever that will not go away with a hypothetical touchscreen option, since we pilots tend to like physical backups for flight-critical systems like that (though I’m curious what pc86 is flying upthread, since even the G1000 aircraft I’ve flown have usually had airspeed dials).

It is even totally possible to gravity drop landing gear on nearly all commercial airliners, I would expect, though I can only speak on the types I’ve rated on. I don’t see you asking “what happens if the landing gear lever fails?” which is actually a totally reasonable question, and one manufacturers have thought of. Touchscreens aren’t magic devices, they’re just another type of input to build redundancy behind.

It sounds like a horrible idea because you probably haven’t flown an aircraft and don’t know this. That isn’t an indictment of you, just a request to not judge so soon. I like the idea of screens that adjust to phase of flight so what I need is where I need it, because pilot workload is a real problem that automation has addressed for decades.

> One is a big, fat lever that will not go away with a hypothetical touchscreen option

Thanks for the explanation -- This makes me feel much safer as a passenger if the touch screen is provided to you as a convenience instead of a replacement. Yep, I haven't flown an aircraft. I was thinking that it was like a car where they are getting rid of physical knobs and replacing them with touchscreen-only interfaces which I hate.

> If you're trying to, say, lower the landing gear, and the button malfunctions

If you're trying to lower the landing gear, and the button malfunctions, you use the gravity gear extension handle, which is a completely independent system. You can also land without the landing gear in the worst case.

> What if the glass breaks and the capacitive layer fails? Or the software running the screen crashes? [...]

You use the other screen, which is controlled by the other computer. There are also knobs to switch which computer controls each screen. In the worst case, there are the standby instruments.

Airplanes have a lot of redundancy.

"The US Navy will replace its touchscreen controls with mechanical ones on its destroyers

After a deadly 2017 crash between a destroyer and an oil tanker"


> Touch screens will hopefully never make it into any critical pilot systems

Touch screens in the cockpit seem like madness to me. Cockpits sometimes fill with smoke and the pilot has to be able to find and operate the controls.

Ever notice that the flap levers have little flaps on top of them? The nosewheel steering control has a little tire on the top? That's so the pilot knows without looking what his hands are on. These designs were not the result of some study group following fashion, but were the result of accidents.

Another consideration is vibration; trying to press a virtual button while in a high turbulence situation can be non-trivial, for example.

Critical functions will (hopefully) always remain on tactile controls for these and other reasons.

Not just smoke. During explosive decompression the air turns to thick fog, and that's when you'll need to fumble around for the correct controls the most.

This comment is off-topic: the article is about physical buttons for non-safety-critical systems. E.g. the article explicitly mentions climate control.

You can still die if the driver spends too long messing with air conditioner settings instead of focusing on the road.

CarPlay and Android Auto make this problem worse, IMO. Now you have app publishers writing arbitrarily complex UIs for cars. Spotify is a bitch to use while driving and because of Apple's reluctance to enable Siri support for third party apps, it's not very controllable by voice.

yup, just rented a cool modern car and was distraught at how much menu-diving there was. and even if you memorize it, the lag was still huge vs. real-time controls!

"Siri, play the Eagles on spotify"

Been available for a little bit now.


The ergonomics in question here are not about the driver being able to operate controls in a critical situation. They're about operating the controls without causing a critical situation.

If you want, you might think of radio, climate controls, etc as having negative values on the safety axis. You still want to shift them to the right as far as you can.

I'm sorry if I wasn't clear enough about this but I was more trying to make a point that physical input is safer, and that the trend towards touchscreens in general worries me. So a return to "normal", which I guess is not so normal these days, is for me very welcome. Touchscreens in cars cause safety issues for other reasons, namely those of distraction, but my concern is what the future will look like if touchscreen normality takes the upper hand over safety concerns.

Climate control is something you will want to adjust while driving, so it should be as eyes-free as possible once the basics have been acquired.

I can’t think of a car I’ve driven where the climate control was not physical though, that seems pretty insane.

Will you though? You have a temperature you're comfortable at. Set climate control to that and let the software manage the air.

All Tesla climate is non-physical. All newer Volvos are except for the defrost controls, as well.

I am really sorry but foggy windshield is definitely safety critical.

I am not a pilot, but it seems like MFDs (multi-function displays - screens with physical buttons arranged around the bezel) are very common even in fighter planes.

Yes, the basic controls do not change. But more advanced functionality is easier presented through menus and screens which guide you through a process -- instead of adding tons of switches for every possible function.


Indeed, when phones started moving over to touchscreen dial pads I always thought it was a step backward usability-wise. I had one phone where I couldn't hang up calls easily, as you needed to swipe in a particular direction, and a number of my calls went through a VOIP number that had a different way of hanging up. Same with WhatsApp calls now: answering and hanging up is different from a normal call. It's a real shitshow.

Touchscreens on phones are great for usability! There are so many things that are enabled by touchscreens that you couldn't do with an array of physical buttons or whatever. OP's comment was specifically for mission critical hardware where safety is paramount.

For making a phonecall they are crap in comparison to buttons.

One could argue that, despite still being called "phones," their primary purpose is not making phone calls anymore. I personally use my device for making phone calls less than 1% of the time I use it (and this includes talk time).

yeah the phone thing is just another app that I use about 5 times a month. it's nice that they include that app but if it wasn't there i'd prob still buy it!

I don't think this is entirely true. The GA glass panels love touchscreens, and they are just as FAA-certified as their non-touchscreen counterparts.

You are right that they are flaky. Here's Martin Pauly (great YouTube channel!) using his touchscreen transponder and it just stops working: https://youtu.be/bopcQSJKcD8?t=732

I think that’s partially why the market for handheld game devices didn’t get crowded out by smartphones. Tactile inputs let the player’s muscle memory work as a more direct shorthand into the product for a lot of video games. It helps the repeated interactions of a video game be more accessible over a “look, then touch” method of input.

Didn't the US Navy try this and end up crashing a destroyer or something?

You are correct.

> The US Navy is replacing touch screen controls on destroyers, after the displays were implicated in collisions.

Unfamiliarity with the touch screens contributed to two accidents that caused the deaths of 17 sailors, said incident reports.

Poor training meant sailors did not know how to use the complex systems in emergencies, they said.

Sailors "overwhelmingly" preferred to control ships with wheels and throttles, surveys of crew found.


I'd argue that humans have evolved to physically manipulate their environment. Safety critical systems are the last place you want UI variation. Things need to be predictable and tactile to build muscle memory.

I think you're missing something pretty major -- a touchscreen can support multiple UIs, menus, and controls, with minimal hardware. If I have 20 user adjustable inputs, I would need 20 dials/buttons scattered around the cockpit. On the other hand, with a screen, I can display 5 on each page and allow the user to swap between pages.
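That multiplexing is cheap in software - a minimal sketch (hypothetical, just to illustrate the space saving):

```python
def paginate(controls, per_page=5):
    """Split a flat list of controls into fixed-size pages, the way one
    touchscreen can stand in for many dedicated dials and buttons."""
    return [controls[i:i + per_page] for i in range(0, len(controls), per_page)]

# 20 user-adjustable inputs fit on one screen as 4 pages of 5.
pages = paginate([f"input_{n}" for n in range(20)])
```

The trade-off, of course, is that anything not on the current page takes at least one extra interaction to reach.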

The thing I'd like to see more of is OLED buttons (Optimus Keyboard style). Physical, clicky, mechanical buttons, but each button is a screen. MFD buttons already change their function depending on what mode the screen is in, having the button also be an icon of what it will do can be very nice.

Otherwise I agree. You need the use of the interface to be as automatic as possible, and exploiting muscle memory & tactile feedback are very important for that. Touch screens fail there.

I was about to ask if there are any pilots on this thread who could weigh in because this was my intuition as a non-pilot. Just to drive one of your points home further, you don't want the interface to change because that could confuse the pilot. Toyota's braking fiasco is an example where unfamiliarity led to mis-operation even though the hardware or the device itself was functioning "to spec".

> The problem that touch interfaces solve, ever since the advent of the first smart phone, is that the interface is now dynamic. You can change it without having to replace the hardware.

They also enable the completion of hardware design before the interface design is completed. While the plastic molds and mechanical designs are worked out, the interface and software development can continue.

That sounds good, but for some reason, design of "virtual" controls always seems to end up far inferior to physical ones. Perhaps the thought is "We'll just toss something out there and we can fix it later", as opposed to "We only have one shot at this so we better get it right".

I'm reminded as well of web "app" interfaces. In the early days, with relatively fixed controls, one could often navigate sites more easily since there just weren't that many ways they could work. Now, with a blizzard of JS UI kits and an oh-so-wonderful variety of ways of doing everything, each site works differently. And it's not an improvement.

Yeah, sorry, I should have been clearer on my opinion on the matter: virtual controls are almost always worse than physical ones.

Cockpit design is strictly regulated by some very risk averse and conservative people. (And not just the cockpit, the passenger space as well.)

So it's hard to tell which parts of the design stem from being conservative and which parts actually enhance safety.

(I get your overall point. But your argument isn't really a good one for the point.)

One time I had to create a button for an in-flight software system controlling a lidar instrument. The button let you select when to turn the instrument on and off.

We went through a few iterations where the button would light up when the laser was on. There were tons of issues: the state of the laser was not stored in the hardware, so every time the button disconnected briefly, the state would get reset and lost. After months of back and forth with algorithms to save the state and filter out spurious button presses around disconnects, the button was still completely useless; disconnects happened very often inside helicopters, and the spurious presses near each disconnect/reconnect kept getting through.

Finally, we decided to go with a hardware switch (which I wanted from the start). In about one day the whole thing blew over.
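
For what it's worth, the kind of guard logic this story describes can be sketched in a few lines. This is a hypothetical illustration, not the actual flight software: it keeps the laser state on the host side so it survives reconnects, and ignores button presses that arrive too close to a link event.

```python
import time

class DebouncedToggle:
    """Hypothetical sketch: drop button presses near a device
    disconnect/reconnect, and hold the authoritative laser state
    on the host rather than in the hardware."""

    def __init__(self, guard_seconds=1.0, clock=time.monotonic):
        self.guard_seconds = guard_seconds
        self.clock = clock                  # injectable for testing
        self.laser_on = False               # authoritative state lives here
        self._last_link_event = float("-inf")

    def on_link_change(self):
        """Called on device disconnect or reconnect."""
        self._last_link_event = self.clock()

    def on_button_press(self):
        """Toggle the laser unless we're inside the guard window."""
        if self.clock() - self._last_link_event < self.guard_seconds:
            return False  # spurious press near a link event: drop it
        self.laser_on = not self.laser_on
        return True
```

Even with this, the story's conclusion stands: a hardware switch made the whole problem disappear, because the switch position *is* the state.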

> Touch screens will hopefully never make it into any critical pilot systems

For one, you're too late: touchscreens are prevalent in modern avionics, and unlabeled buttons on the border of a screen that change function depending on what screen you're viewing are the second most common input. The more relevant point for the Honda discussion is that the considerations for car and airplane interfaces are so different that "planes don't/shouldn't do this, so cars shouldn't either" isn't a good argument.

The issue with a car isn't the interface, it's the fact that people look inside their vehicle for too long to fiddle with the radio. Even if the buttons/knobs could be operated entirely without looking, most people would still stare at their radio while they are doing it. Pilots learning to fly are trained to look outside after pretty much any action, they shouldn't ever stare inside the plane. They do a quick instrument scan and look outside. They glance at their chart and look outside. If they need to change frequency they do it and then look back outside. It's kind of hard to break the habit of looking outside when you start instrument training.

But most car drivers don't have the same amount of training and fixate on things inside the car, like the radio or climate controls. TBH, pilots still get fixated on things, it's just that they usually snap out of it and regain situational awareness before anything bad happens because the skies are pretty spacious. But car drivers don't have spacious roads. There's another car right next to you going 75 mph and if you drift out of your lane you'll cause an accident.

The issue isn't the design of the buttons at all. That matters to a fighter pilot, but the issue for a car driver is the fact that the screen is even on and the radio is accessible while driving down the road. The real critical safety feature would be disabling the screen while driving, and either locking out controls or only allowing voice control. But people would never buy a car that doesn't let them fiddle with the radio or stare at their little screen, so the actual safety feature that needs to be implemented won't happen.

Edit: When I say "fiddle with the radio" I'm including all activities that take place in a car's center stack- audio, navigation, climate control, etc. I'm also a pilot, have designed tests for avionics upgrades for multiple fighter jets, and own my own plane. I have lamented the introduction of touchscreens into modern avionics at a professional level and the personal level. I own three cars with varying levels of touchscreen invasion. So I've thought about the issues surrounding touchscreen quite a bit, and have concluded that the interfaces in a car are so simple that the issue isn't whether you can operate it without looking, it's the fact that people aren't trained to do so.

A dynamic interface can still be useful in some cases. Cars require the driver’s active attention, but certainly not the passenger’s, who can operate the touchscreen. Further, part of what makes controls intuitive is undoubtedly familiarity, and thanks to smartphones, tablets, and to some degree even modern laptops, it’s hard to argue against touchscreens from a familiarity standpoint. I’m sure it’s been attempted, but it’s hard to imagine a good mapping interface in a car with no touchscreen at all.

I’m still glad for what Honda is actually doing, which is not removing touchscreens wholesale but moving climate control back to physical buttons. These are things a driver ought to be able to operate safely while in motion, and touch controls only ever made them more complicated, I think.

> Touch screens will hopefully never make it into any critical pilot systems

The F-35 fighter jet has basically only touchscreens (apart from hands-on-throttle-and-stick controls for when you are busy pulling Gs).

Depends on your definition of "critical" but I don't think this is a great example.

And the F-35 is a great example of how not to do things, so of course it has touchscreens.

While not touch, many Airbus aircraft feature a full keyboard and trackball for operating those screens.

The problem here is not with the technology; it's with the people implementing it. Instead of leaving the design and software to tech companies, we have car companies trying to do it on their own.

The result is terribly designed software that looks like it's from 2000.

The arrogance of that statement is amazing. I recently bought an iPad for the first time in 5+(?) years, from before multi-touch and pressure-sensitive screens. I have no f'n clue what I'm doing anymore. I accidentally had Safari running two windows side by side with no idea how to stop that. I'm still not sure what I did to make it go back to one window.

My friends and I decided to try out a (new to us) game which required Microsoft Store / Xbox PC Game Console or whatever the shit it is. 4 of us cannot figure out how to add someone as a friend. It's not in any menu anywhere. I can follow, I can favorite.. I have no idea how to "friend".. which means we can't figure out how to invite people to games.

I'm convinced if Silicon Valley were to design car interfaces I'd be stuck in some sort of pay per action dark pattern captivity hell.

And as a student pilot nothing scares me more than touch screen controls. Maybe I spend too much time down low in the thermals but it's so much easier to hold on to a knob and turn it, while counting clicks, than trying to press a touch screen and hope you hit the right finger sized button the right number of times to change radio frequencies. Different story on heavy planes since they don't bounce around as much as GA planes but it sucks to fight the fight while trying to maintain control/coordination.

So because you had a bad experience on one Microsoft product, all of the tech industry has poor user interface design?

Microsoft, Apple, Facebook, Google, etc. all have some of the best user interface designs in technology. They have teams of researchers, psychologists, and designers working together to achieve this.

Have you heard of the auto industry doing anything remotely similar on this scale? Instead, they push away much better alternatives offered by Apple and Google for their own proprietary solution no customers asked for.

The arrogance of your comment is almost greater than your ignorance.

Honestly, if you told me that moving forward I could only have CarPlay or Android Auto then I would quit buying new cars. I dislike both of them. I also don’t want my car tied to a company that has a reputation for canceling products more than I change shoes. And if you think the auto industry doesn’t employ people to try and improve their user experience then you’re as daft as you think I am.

While working from home, I was trying to figure out how to mute my Android phone on a conference call the other day. I eventually did, but the sequence of actions to get to the menu was very strange. I'm pretty sure it was more intuitive just a few years ago, but of course, you don't control whether you update software anymore. Interface design is accelerating downhill, and it's amazing how things have regressed since Apple and Microsoft published guidelines for good design in the 80s and 90s.

Unfortunately, that kind of arrogance is common on this forum.

An airplane cockpit has two pilots and literally several hundred physical buttons. No one wants to drive that car.

In cars we're mainly talking about the volume and fan speed. Personally I much prefer knobs for those. But I wouldn't call those critical safety systems.

Someone already mentioned this: climate control becomes safety critical when your windshield fogs up. Good luck finding that goddamn touch button then... Additionally, I sometimes hear radio ads that have honk or siren sounds in them that immensely piss me off and cause me to immediately turn the radio off just to be able to hear if something's actually happening or not.

I'm afraid I have some bad news for you:

Just a few examples.

Aviation very much uses touch screens quite extensively. The difference usually is that more thought is put into when and how to use them; you're not just replacing all the buttons with a touchscreen and letting some underpaid intern design the interface for it.

All the latest Garmin avionics include touch screens. Airplane cockpits aren't the right metaphor anyway, because airliners are largely flown automatically.

Perhaps, but on many pilots' knees you'll find a digital touchscreen kneeboard, which, having learned with paper approach plates and map books myself, I consider a godsend.

Sure, but there are still tons of touchscreens in the cabin -- controlling the climate, the lighting, and the playback of the safety video.

Yes, and I have no problem with that. I don't imagine that an imprecise control of the climate, lighting or video playback will bring the plane down. I mean, I sure hope not.

Electronic Flight Bag is one of the biggest screens, right? And that's used for navigation, which is largely strategic, not tactical.

I'd love for someone who's in the industry or an actual pilot to comment on this, because I'm frankly not sure what scope EFBs commonly have on commercial airliners, or what the backup procedures are if they fail. For private pilots, I know bringing an iPad up is common these days, but I think (and sure hope) commercial flight is a lot more risk-averse and slow to adopt these things without thorough procedure.

80 to 90% of airlines worldwide use EFBs, based on iPad or Microsoft Surface.

Great points.

BUT - touch screens can be used if people are trained, the layout is rational, and the device is responsive.

There are two underlying things:

1) Tactile. As you spelled out.

2) Changing interfaces. This is the real killer. 100 screens, don't know what's what, supposed to be driving.

These UIs need some thinking, but I suggest that the 'knobs and buttons' could be mapped to different functions depending on context.
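
The mapping idea amounts to a simple lookup table. A minimal sketch, with mode and function names that are purely illustrative, not taken from any real cockpit or car:

```python
# Hypothetical: the same two physical knobs, remapped per mode.
KNOB_MAP = {
    "radio":   {"knob_1": "volume",    "knob_2": "frequency"},
    "climate": {"knob_1": "fan_speed", "knob_2": "temperature"},
}

def knob_function(mode, knob):
    """Look up what a physical knob controls in the current mode."""
    return KNOB_MAP[mode][knob]
```

This keeps the tactile benefit (a real knob under your fingers) while still getting some of the flexibility that motivates touchscreens, at the cost of the driver having to know which mode is active.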

I see many people comparing automotive electronics to avionics, but there’s a big difference, in flight you can take your eyes off the “road” and in some situations even fly with 0 visibility, whereas in a car your next obstacle is only milliseconds away from you.

The F-35 has a large touch screen that pilots use. Maybe the $1m helmet has a way to interact without touching.

For some more info: https://www.reddit.com/r/aviation/comments/cmypjd/the_finger... It's a fun thread; you can catch some actual military pilots there.

> Touch screens will hopefully never make it into any critical pilot systems, because safety and stability matters to airline manufacturers, current ongoing scandals notwithstanding.

You might want to take a look at Garmin's general aviation product lineup. They're pretty popular.

Not to mention, touchscreens really disadvantage the visually impaired.

True, but they are already a little disadvantaged when flying an airplane.

I was thinking more of essentially everything else, from working a stereo to an oven.

Pretty sure this has everything to do with planes being old. Boeing pretty clearly demonstrated that planes being planes doesn't mean they're immune from bad design, corruption, or any other problem.

Check out all the latest Garmin aviation equipment. Touch is here for critical systems including GPS systems used on instrument approaches. Super fun to reprogram in turbulence.

Nope, not anymore. Honeywell is building the Symmetry Flight Deck for the new Gulfstream G700, and it is touchscreen enabled.

It's not useful to compare cars to planes since cars can easily pull over to the side of the road.

Look at the upcoming TEMPEST fighter to see touch screen displays in full force.

Well said. I for one don't like all those touchscreen buttons. Screens should be for maps and other non-interactive stuff.

Pilots already use iPads for checklists.

That's a different ballpark though as you can fall back to the paper version at any time.

More often than not, navigation charts are now on iPads and there is no paper backup carried anymore.

I suppose you don't include the flight manuals as a critical system, since those have been touch screens for almost a decade?
