
“Smartphone” is the wrong name - loup-vaillant
http://loup-vaillant.fr/articles/smartphone-is-the-wrong-name
======
gjm11
I have for some time had a crackpot theory that language will actually evolve
the other way.

We have these things we carry around with us, called "phones"; they are
actually general-purpose computing devices and making (what we still call)
phone calls is a small part of their purpose. And to an increasing extent
these can fulfill almost all our computing needs.

So, fast-forward a couple of decades, assuming no huge technological
shifts[1]. Everyone is used to having one of these things that they carry
around in their pocket, and scarcely anyone really needs any other general-
purpose computing device. We still call them "phones" because we always did --
so now this is the usual term for a general-purpose computing device. And,
yeah, there are some people who, because of their unusual needs, have a bigger
more powerful one. Well, what are you going to call something like a phone but
sized to sit on your desk while you work? Obviously it's a desktop phone,
right? And somewhere out there In The Cloud there are whole farms of server
phones :-).

And there will be conversations like the ones we[2] now have about how
"computer" used to mean a human being who did calculations. "Hey, did you ever
look up where the word 'phone' actually comes from? It turns out that back in
the 20th century they mostly used them for talking to one another, and it
comes from some Greek thing meaning 'sound at a distance'. Weird, huh?"

[1] That's a pretty big assumption, of course.

[2] For sufficiently small values of "we".

~~~
api
> there are some people who, because of their unusual needs, have a bigger
> more powerful one.

Unusual needs like being able to see a lot of text at once, type things longer
than a paragraph, multitask, use more than one app in a work flow, and run
real software?

I doubt it... unless phones develop dockable desktop capability in which case
they are now dual-purpose converged devices. Right now to believe that mobile
is "the future" of _everything_ requires one to believe that most people have
nothing to say beyond one-liners and selfies and that the only purpose of
computing is to interact with canned services. That might be true for a subset
of the market, but I think it's a smaller subset than some believe.

~~~
clumsysmurf
> I doubt it... unless phones develop dockable desktop capability in which
> case they are now dual-purpose converged devices.

Already here, it's only going to get better.

[http://bgr.com/2016/02/21/hp-elite-x3-review-
pt-1-preview/](http://bgr.com/2016/02/21/hp-elite-x3-review-pt-1-preview/)

~~~
igravious
Them's are some drool-worthy specs:

[http://cdn.bgr.com/2016/02/screen-
shot-2016-02-20-at-11-50-3...](http://cdn.bgr.com/2016/02/screen-
shot-2016-02-20-at-11-50-35-am.png)

------
stillsut
The average adult is required to be available by phone. Not by snapchat, not
by skype, not even by text message. By. Phone.

In tech-land and teen-land you can specify platforms to each of your
relationships. But in the average American's day, doctors, daycare providers,
clients, etc are all going to be contacting you by phone.

Without being required to carry around the phone part, a pocket-sized tablet
would have 10x less adoption than it does today. "smart _phone_" it still is.

~~~
Razengan
Not to mention that you can have multiple accounts on any social network, and
switch between them at any time, but you can have only one _phone number_ on
that mobile device, whatever you want to call it.

So if that device is uniquely identifiable by a phone number, and you can be
the only user of that number at any time (again unlike social network
accounts), you might as well call it a phone.

~~~
kruczek
> but you can have only one phone number on that mobile device

Not necessarily. There are phones which can hold two SIM cards.

------
wodenokoto
Japanese cellphones had a lot of the same use cases that modern smartphones
have. You could buy stuff online, use GPS, surf the mobile web, send pictures
via email, and purchase goods and movies (Disney even produced direct-to-featurephone
cartoons in Japan!).

This is usually used as an argument for why the Japanese decided to
colloquially shorten 携帯電話 ("keitai denwa", hand-held telephone) into just
携帯(hand-held, or handy) instead of just calling it 電話(phone) like so many
western countries do or 携電("kei-den"), which follows a more typical pattern
for abbreviations in Japanese.

However, with the success of the iPhone, which was incompatible with all the
featurephone services, a new segment arose, consisting of iPhones and similarly
incompatible Android handsets. These are denoted as "smartphones",
shortened to スマホ (sma-pho).

I think it is interesting that they chose to move closer to the "phone"
moniker than to stay focused on the hand-held part.

~~~
Freak_NL
In that respect, I miss the word 携帯. スマホ ( _sumaho_ , ugh!) feels so
uncultivated in such a pleasant language.

It was a predictable outcome though. Japanese has a tendency to abbreviate
common words, and the term _smartphone_ (the unwieldy スマートフォン/sumātofon) was
imported with the first generations of smartphones.

The German _Handy_ on the other hand (as _ccozan_ mentions)… What a lovely
word.

~~~
rangibaby
I only hear people using スマホ to differentiate them from feature phones ガラ携
(Galapagos phone, also a homonym for "junk phone"). 携帯 is still the general
term for "cellphone".

~~~
wodenokoto
You see cellphone stores advertise feature phones as Galapagos phones, and I
believe it was Panasonic or Sharp that even had a line of phones called
Galapagos.

I'm not sure I get what word "Gara" is a homonym with; to me it seems like a
fairly normal word.

~~~
rangibaby
Yes, the Galapagos phone is where ガラ comes from.

I've heard people calling them ガラクタ (crap) phones before. It's obviously just
slang...

------
gilgoomesh
> "Smartphone" has the wrong connotation. It suggests your expensive brick is
> a phone first, and can do smart things second.

If you look at where the battery goes, that's usually an accurate distinction.
If your screen-on time is less than 4 hours per day, you may spend 75% of
battery on the cellular antenna. Everything else takes a lower priority.

Calling it a phone also efficiently distinguishes between smartphones and
tablets or devices like the iPod Touch which are identical devices except for
the lack of cellular antenna.

~~~
provemewrong
There are plenty of tablets with cellular antennas though (not all of them have
voice-call functionality, but that's mostly just a software limitation). And
with VoIP and instant-messaging apps, even a WiFi-only device can fulfill
"phone" roles.

------
chroma
> For instance, picture Microsoft in 2001, after it got sued for bundling
> Internet Explorer with Windows. Imagine what would have happened if they
> sold Windows XP with an exclusive application store. Imagine that every
> program must be approved by Microsoft to run on XP, and they take a 30% cut.
> Oh, and no interpreter allowed.

> They would have been sued into oblivion, lost half their customers, suffered
> one hell of a bad press. They could have sunk. Yet when Apple did exactly
> that to their new computer, the iPhone, few objected and customers flocked.

This just tells me that execution matters more than the initial idea. There's
plenty of competition in the mobile space. If users didn't want Apple's
locked-down ecosystem, they wouldn't buy iPhones. The vast majority of users
don't care about side-loading apps or installing another operating system.
They are (quite rationally) willing to sacrifice customization and app choices
to avoid malware.

~~~
greggman
Microsoft was considered to have a monopoly (whether you or I agree). As such
they were subject to different rules. Apple, especially when iPhone launched,
did not have any kind of monopoly on cellphones nor did they have one on PDAs
(which is really where _smartphones_ come from, not cellphones).

As for people, people didn't sue Microsoft; companies did. People mostly
didn't care. Companies like Sun and Netscape cared. They're the ones that
lobbied to have the government declare MS a monopoly. In other words, it had
nothing to do with execution and perceived awesomeness by customers. At the
time MS was sued, IE was the best browser by pretty much every measure, so
customers wanted it. It was companies that were upset.

Also malware infested vs walled garden is a false dichotomy

~~~
tremon
_At the time MS was sued IE was the best browser by pretty much every measure_

Meh. IE was the browser with the best website compatibility, because Microsoft
went out of its way to reimplement half of the HTML standard in its own way,
and could use the prevalence of IE to make webdevelopers follow "their"
flavour of HTML.

Mozilla's first Firefox release came with the slogan "take back the web". That
motto had nothing to do with the excellence of IE.

~~~
jerf
IE4 was head and shoulders above Netscape 4. IE4 is recognizably a modern
browser (by introducing .innerHTML), if a very poor one by modern standards;
Netscape 4 was hackery on an engine taken way beyond what it could sustain.
Netscape required a _long_ time to catch up, even with the helping hand
Microsoft gave by slowing down dev on IE round about IE6. There was a long
window where IE was the best browser by pretty much every measure, and I'm
pretty sure it overlapped this lawsuit, though my brain and the calendar don't
always get along perfectly.

You can argue that IE was that good due to Microsoft being able to leverage
its monopoly to develop something for free they could never have afforded to
develop otherwise, since that argument basically won in a court of law give or
take some nuances. But, in the meantime, IE really _was_ a better browser for
a good long time.

------
danjayh
> _Personally, I have more faith in a third alternative: a virtual instruction
> set. Like bytecode, though not managed like Java, and not meant to be
> interpreted or JIT compiled either. It could run on the bare metal, or be
> translated into something that is —like the Mill CPU. That way you can keep
> the illusion of having a single instruction set, while sacrificing virtually
> nothing. Moreover, future CPUs don't even need backward compatibility, as
> long as you can translate (and optimise!) the virtual assembly for them._

I don't think he really gets how CPUs work -- it's already that way, and has
been for quite a while. The published instruction sets are the 'virtual'
instructions, and translation of these instructions (be they ARM, x86, or
PowerPC) is baked into each CPU's microcode. We actually have no visibility
into the 'real' instructions that CPUs execute ('micro-ops'), because they're
proprietary.
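
A toy sketch of that split, in Python (all instruction and micro-op names here are invented for illustration; real decoders are hardware and real micro-ops are undocumented):

```python
# Toy model of the "public ISA -> internal micro-ops" translation
# described above. Every name is made up; no real CPU works from
# tables this simple.

def decode(instruction):
    """Crack one architectural instruction into internal micro-ops."""
    op, *args = instruction
    if op == "ADD_MEM_REG":  # read-modify-write memory add
        addr, reg = args
        return [
            ("LOAD", "tmp", addr),   # fetch the memory operand
            ("ADD", "tmp", reg),     # do the arithmetic internally
            ("STORE", addr, "tmp"),  # write the result back
        ]
    if op == "ADD_REG_REG":  # already simple: maps to one micro-op
        dst, src = args
        return [("ADD", dst, src)]
    raise ValueError(f"unknown instruction: {op}")

# One complex architectural instruction becomes three micro-ops:
print(decode(("ADD_MEM_REG", 0x1000, "rax")))
```

The published ISA is the stable interface; the micro-op list on the right-hand side is free to change every CPU generation without breaking any software.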

~~~
loup-vaillant
The idea is to own up to this reality, and stop designing instruction sets as
if they were meant to run directly. x86 in particular has a legacy of simple CPUs
that didn't have much of a decoding unit. But now it has gone too far in the
other direction: it takes a significant amount of chip surface and energy to
decode, making it unsuitable for low power situations. ARM on the other hand
is probably lacking in the SIMD department (I haven't checked).

We need to go back and overhaul the CPU instruction set like Vulkan, Dx12 and
Mantle overhauled the GPU APIs. We need to reflect on what CPUs can do, what
they can do fast, and what they can do with low power. Then we need an
instruction set that would act as an API to these subsystems. Something
orthogonal, that doesn't take too much energy to decode, and could be decoded
in parallel if need be (for crazy desktop speed ups).

While we're at it, it might be nice to have explicit support for things like
pointer tags, to speed up dynamic stuff like garbage collection and runtime
type information.
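
For reference, here is roughly what pointer tagging looks like when done in software today, as a sketch (the tag names and bit layout are made up; the point is that every access pays a mask, which is the cost hardware support could hide):

```python
# Sketch of low-bit pointer tagging, as used by many language runtimes.
# Assumption: heap objects are 8-byte aligned, so the low 3 bits of an
# address are always zero and can store a type tag for free.

TAG_BITS = 3
TAG_MASK = (1 << TAG_BITS) - 1
TAG_INT, TAG_CONS, TAG_STRING = 0, 1, 2  # invented tag assignments

def tag(addr, t):
    """Pack a type tag into the spare low bits of an aligned address."""
    assert addr & TAG_MASK == 0, "address must be 8-byte aligned"
    return addr | t

def untag(word):
    """Recover (address, tag); this masking is the per-access cost."""
    return word & ~TAG_MASK, word & TAG_MASK

word = tag(0x1000, TAG_CONS)
assert untag(word) == (0x1000, TAG_CONS)
```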

You're right, I don't really get how CPUs work. But I did pick up a few things
that lead me to trust instruction set design is not over. We can do better.

~~~
loup-vaillant
When I make a long comment, I like to see explanations about what warranted a
downvote. So I can learn…

~~~
danjayh
I didn't personally downvote you, but I'll give you my best guesses:

1) The percentage of die used for decoding is actually already quite low. Per
Anandtech 2014, it was at 10% for x86 and decreasing:

[http://www.anandtech.com/show/8776/arm-challinging-intel-
in-...](http://www.anandtech.com/show/8776/arm-challinging-intel-in-the-
server-market-an-overview/12)

Since there is a floor to the number of transistors needed for decoding, there
might not be a whole lot to gain there.

2) Progress has already been made towards updating instruction sets to reflect
what CPUs do quickly, well etc. It started with MMX back in the stone age, and
has progressed through a plethora of SIMD and media acceleration instructions.

3) Instruction sets are already not designed as if they were supposed to be
run directly. Quite to the contrary, they are abstracted -- the instruction
set is the API, and the microoperations are the instructions. Designing them
as if they were to run directly would mean exposing the micro-ops, which would
require backwards compatibility breaking changes to the CPU each generation
when there were changes to the micro ops.

4) The current system already pretty well levels energy consumption between
competing ISAs. See: [http://www.extremetech.com/extreme/188396-the-final-isa-
show...](http://www.extremetech.com/extreme/188396-the-final-isa-showdown-is-
arm-x86-or-mips-intrinsically-more-power-efficient)

You're describing something that sounds like a bytecode VM, but in essence,
that's what modern processors _already are_. Unfortunately, x86 (at least 32
bit) assembly is pretty unpleasant as an 'API', but ARM and PowerPC are both
pretty good.

As far as pointer tagging, that's something that's probably mostly limited by
memory bandwidth (it's not a particularly compute heavy thing to do), so
unless the HW support came in the form of a dedicated on-chip cache, it
probably wouldn't get you very much ... and then, if you're going to the
expense of adding a dedicated on-chip cache, it's probably going to be more
effective as a general purpose cache -- if the code is accessing the pointer
metadata often, it will be in cache, and therefore accelerated.

Not trying to rain on your parade, I like seeing creative ideas, and I have no
idea why people are downvoting you. Have an upvote on me :).

EDIT: Also, ARM has NEON.

~~~
loup-vaillant
Whoa, I was hopelessly out of date. Thanks.

------
Animats
This article is from France. What's the French term? Has the Académie
Française decided on one yet? (Unlike English, French has an official
standards body.)

~~~
loup-vaillant
The most common term here is "portable", or "téléphone portable", or just
"téléphone". "Portable" means what you think it means: you can carry it.
Sometimes, (especially in the commercials), we see "mobile" instead of
"portable". It also means what you think it means: you can move it.

I think one reason behind the word "mobile" is because "ordinateur portable"
(or "portable" for short) is already used to talk about laptops. Desktops are
in rare cases called "ordinateur de bureau". Generally, we just say "PC"
—unless it's from Apple.

I am not aware of what the Académie Française may or may not have decided.

What I love about the English language here is that we have 3 words that
neatly apply to the three form factors: desktop, laptop, and palmtop. It's a
bit of a bummer we can't exploit such regularity in French. If I had to settle
on a term, I'd probably try "ordinateur mobile", or "mobile" for short. Unlike
"palmtop", it wouldn't scream "computer", so the best I can hope for is that
we just stop using "téléphone" to describe those things.

------
kalleboo
In many countries they're mostly referred to as "mobiles" (as an abbreviation
of "mobile phone"), especially now that nobody has feature phones anymore and
there's no reason to make the distinction.

I find it perfectly vague yet easily understandable.

~~~
carlob
In Italy many people refer to mobiles as telefonino (lil' phone). I guess that
doesn't really apply to phablets though :)

------
icebraining
I agree that naming is important, yet I don't think calling them "palmtops"
would have changed anything. Tablets were and often are still called "tablet
computers", yet they usually have the same limitations.

~~~
loup-vaillant
True. On the other hand, the name makes it a bit easier to protest: they can't
say those things are not computers, since the name says they are.

Alas, we often just say "tablet", so it's still possible to make an artificial
distinction between them and "real" computers such as laptops and desktops.

The idea behind "palmtop" is to appeal to the intuitions behind laptops and
desktops, and suggest it should be subjected to the same rules (when
possible).

~~~
astrocat
I can't be the only one who has made a habit of casually referring to them as
"space computers." Whenever my wife asks some easily google-able question I
reply "well just ask your pocket space computer."

I want to hang on to the sense of awe and amazement at the reality of having a
powerful computer and all of the internet at our disposal at all times.

------
talles
If I had to rename "Smartphone" I would call it "Personal Computer". My phone
is way more "personal" than my desktop and laptop.

But there's absolutely nothing wrong with "Smartphone". There are two reasons:

1 - There's history there. Most words aren't precisely crafted by linguists
when society needs them; we simply build on what already exists. The so-called
smartphone came from the cellphone, which came from the telephone; the name
itself hints at its origins. It's fascinating what you can learn about a word
when you study its etymology.

[http://etymonline.com](http://etymonline.com)

2 - Meaning wise I wouldn't worry so much about the connotations that the
author mentions (I wouldn't worry at all). I'm with Wittgenstein in this one:
the word means whatever we decide it means.

[https://en.wikipedia.org/wiki/Philosophical_Investigations#M...](https://en.wikipedia.org/wiki/Philosophical_Investigations#Meaning_is_use)

~~~
punee
I'm pretty sure that Wittgenstein, while agreeing with your point about
meaning being defined by use, would still argue that there's a lot of unspoken
confusion that comes from calling the smartphone a smartphone.

The first thing I did when opening this thread was Ctrl+F to see if anyone was
suggesting it should be called a "personal computer", because that's what I've
thought best described what the smartphone really has become.

Now, to suggest even trying to use that term to refer to smartphones these
days would be adding a much heavier dose of confusion. But the insight seems
fundamentally right to me.

Interesting article on the subject that brings up this point: [http://ben-
evans.com/benedictevans/2015/11/7/mobile-ecosyste...](http://ben-
evans.com/benedictevans/2015/11/7/mobile-ecosystems-and-the-death-of-pcs)

------
jdlyga
We call them "buttons" on interface windows because they look and act like
mechanical buttons in the real world. But mechanical buttons have that name
only because they look like shirt buttons.

------
castell
Bill Gates called it the "Wallet PC", back in 1994. Later Pen-based computing.
Later "Pocket PC".

~~~
decasteve
Bill's description was pretty spot-on too. I read The Road Ahead again
recently and he was right about a lot of things. Though even with all that
prescience, Microsoft still missed the boat on a number of things they saw
coming a mile away.

------
robbrown451
I doubt anyone is going to change what they call them (other than dropping the
"smart" now that such features are becoming the default), but the point about
not being able to install apps except through an app store with a monopoly is
very true. Of course it tends to be true with tablets as well.

Maybe when we start to see more android devices that are in a laptop
configuration we'll start to struggle more with that issue.

~~~
jpindar
Why? Are you under the impression that you can't install Android apps except
through an app store?

------
Zigurd
Articles like this gloss over how small tweaks to a product put it in a
different category, with large commercial consequences. In the case of
smartphones, having a mobile network radio is a very large difference in
capabilities, price, operating cost and channel. Smartphones are the largest
business on the planet and growing, and tablets have stumbled for not having
met their potential in displacing enough PCs in office productivity use cases.

Smaller distinctions are important within the smartphone and tablet markets,
like "phablets" and various size and price categories for tablets, and for
not-quite-tablets like "convertibles." Microsoft tried and failed in multiple
product generations to turn PCs into tablets, while in Windows CE and NETCF
they had the basic formula for a modern smartphone but they treated NETCF like
a red-headed stepchild. Small differences, big results.

------
justMaku
Well, to be honest the best name would be just "PC". Think about it: PC stands
for personal computer, and a smartphone is in fact a small computer that's as
personal as it gets.

------
peter303
Note Apple came to this market from a different direction than most other
cellphone makers. Their first mobile computer was a revolutionary music player
(innovative UI and music store). It evolved into a sophisticated media
computer with color video, wireless and computer utilities in the last iTouch
before the iPhone. (Apple still ships iTouches which are the iPhone device
without the cellular phone in it, or a micro iPad.) So they pretty much had a
full-fledged mobile computer before grafting phone technology onto it.

~~~
dragonwriter
> Their first mobile computer was a revolutionary music player

The Newton was not a revolutionary music player, and no Apple music player
before the iPhone was a "mobile computer" in a sense that would make this
portrayal meaningful and accurate.

> It evolved into a sophisticated media computer with color video, wireless
> and computer utilities in the last iTouch before the iPhone.

iPod Touch (sometimes nicknamed "iTouch") was introduced after the iPhone,
running a later version of iPhone OS (the OS which later became iOS) than the
first iPhone. There was no "iTouch before the iPhone."

There was the old-style wheel-controlled iPod before the iPhone, and that was
a revolutionary (at least in terms of commercial impact) music player, but it
wasn't a "mobile computer" in the same sense as the iPod Touch, modern
smartphones, or even earlier PDAs or the Apple Newton.

------
OJFord
Does anyone actually say "smartphone" except (vanishingly rarely) when
necessary to describe the device in contradistinction to "dumbphones"? Is
there a difference whether spoken or written?

If so, where are you? I'm curious because all I hear in the UK is "phone" or
"mobile" \- both of which we always used. But I'm acutely aware that I hear or
read "cell" in American sources significantly less often than, I think, I used
to.

~~~
loup-vaillant
> _Does anyone actually say "smartphone" except (vanishingly rarely) when
> necessary to describe the device in contradistinction to "dumbphones"?_

True. My beef isn't about "smart", however; it's about "phone". I believe my
point stands even more acutely in this light.

~~~
OJFord
Sure, it was something of a side-point, but the more relevant point (which I
may have forgotten to make..) was that in the UK 'mobile' is still common.

Although actually a contraction of 'mobile phone', it doesn't have the same
'problem' going forward, since we can just decide or assume that it refers to
a mobile computing device.

Possibly though things will get even more blurred than that - everything a
computing device and almost all of them mobile... Time will tell!

------
eitally
They won't stop carrying the "phone" connotation until the service providers
stop pitching themselves as primarily telephony companies.

------
mikeehun
I usually refer to those as "tracking devices"

~~~
zipwitch
I was thinking more "self-funding prole monitoring device program".

------
joakleaf
"Smartphone" is the wrong name? What about "Feature phone"?

Edit: Actually, I meant that the term "Feature phone" is an even worse name
for feature phones than "Smartphone" is for smartphones. I wasn't suggesting
that it was a better name, but the downvote and comments suggest that I didn't
make that clear.

~~~
squilliam
'Feature phone' is already an industry term for (mainly) bottom-tier Android
phones that lack the hardware to be considered among 'smart phones' but have
the basic ability to access the internet and play basic media files.
Basically the phones that come with the pay-as-you-go Wal-Mart phone plans.

[https://en.wikipedia.org/wiki/Feature_phone](https://en.wikipedia.org/wiki/Feature_phone)

~~~
castell
Nokia's Symbian based phones, various China based look-a-like phones and
WinPhone 7 were called "feature phone".

You couldn't do much beyond the basic functionality. And there was a very
limited amount of apps available, if any at all. They were as cheap as today's
entry level Android smartphones.

------
fit2rule
In my opinion, it'd only be a truly smart phone if it shipped with a compiler
onboard so you could build apps for it ..

~~~
khedoros
DroidDevelop and AIDE are available on Android devices. As far as I know, no
portable devices ship with a compiler installed, but since it's easily
available in the app store, I'm not convinced that it matters.

~~~
fit2rule
OpenPandora, a portable device (which I consider to be an utterly arbitrary
class of computing), ships with usable compilers..

Point is, I think it's arbitrary that you can't use an iPhone to write apps
for the iPhone.

~~~
khedoros
Right, OpenPandora, its predecessors, and its successors. I considered buying
one for a time, but never quite convinced myself, so I've never gotten to play
around with one.

------
harryf
Microsoft has been exploring the smartphone / laptop gap with Continuum -
[http://windows.microsoft.com/en-us/windows-10/getstarted-
con...](http://windows.microsoft.com/en-us/windows-10/getstarted-continuum-
mobile)

------
gvurrdon
I've long thought that "hand computer" was a good term for it, but that has
rather a lot to do with nostalgia:

[https://app.box.com/s/i6tw2gc8avr9r1hevn0trf4w41nt9r1b](https://app.box.com/s/i6tw2gc8avr9r1hevn0trf4w41nt9r1b)

------
agumonkey
They're not even phones; they're not handheld devices anymore. The lack of a
physical interface is very sad. Taking pictures, listening to music, even
calling someone: all require high-attention interaction over an unfit
touchscreen. It's a small slate computer.

~~~
coldtea
> _Taking pictures, listening to music, even calling someone, all require high
> attention interaction over an unfit touchscreen._

Are you kidding me? All of those functions are 10x easier to do on the "unfit
touchscreen" than on what we had before (tiny non-touch screens, LCDs,
80's-style arrays of buttons, etc).

And when it comes to actually taking the picture, changing track or volume
etc, smartphones even offer physical buttons on the sides.

Remember trying to get to the 10th track of the 4th folder on your "physical"
CD MP3 player or MiniDisc? Setting anything more advanced than zoom level and
picture mode on a typical 2002-2005 compact camera?

~~~
agumonkey
90% of the time I need direct blind access to simple functions. Now I have to
swipe carefully; otherwise the app goes into gallery mode, or the picture is
taken, and since I was pushing on the screen, it's blurry, and I need two
hands to do so.

Side picture buttons are increasingly rare if not defunct, volume rockers are
still here though.

A lot of things that could be done quickly are now fragile and subtle. A side
effect of translating desktop UX to IRL handheld gadgets, on the promise that
software will be smart enough to make it one button away.

psedit: as noted below, I indeed never realized the volume rocker was bound to
snapshot. My rant is half void now u_u;

~~~
oxplot
This can all be "fixed" in software. In fact, on my HTC One M8, which is
running CyanogenMod, if I double-press the power button at any time, it
switches to the camera app and I can then take photos by pressing either
volume button. I don't need to (nor do I) look at the screen, let alone
interact with it.

~~~
khedoros
How often do you take a picture without needing to set the focus, or at least
watching the screen to see when the app has finished its autofocus? More
generally, beyond changing the volume or initiating a voice query, how much
can you really do without pulling the device out to use the screen? Lack of
hardware buttons and tactile feedback can't be fixed in software.

~~~
oxplot
Even with a purpose-built DSLR, you still have to frame your shot. I meant I
don't need to look at the screen to make the phone ready to shoot. I believe
the software can go a long way. It will never be as UX friendly as a camera,
sure.

~~~
khedoros
I agree that there's room for UX improvement in the software, and as a
software guy myself, it's the first place I'm inclined to look at as a source
of improvement. It's just that not all the problems have a software fix, in my
opinion.

Thank you for the double-tap tip, though. It works in Marshmallow too,
apparently.

~~~
agumonkey
To me the issue is the lack of context and pressure. Physical interfaces
tapped into deep human perception: sub-millimeter movements, skin sensitivity,
changes of texture, response curves. None of that is taken into account in the
desktop design world. At least before, software was designed for high
throughput and underpowered machines (I often find AS/400 UX as ugly as it is
efficient, and in reality nobody cares about software being pretty). Now you
get Material Design: 80% eye candy. Whenever I have to use KitKat I feel so
relieved, because it has centered, static, squary input menus. I don't have to
avoid overlapping free-floating (+) buttons. Every new market causes a
regression until it learns lessons from the past. The learning phase is still
going, indeed.

------
joe_momma
Dude, just give it a minute and they'll be called "tricorders" as they should
be.

~~~
Animats
We're getting closer. Caterpillar just announced their S60 smartphone with
thermal imaging.[1]

[1] [http://www.catphones.com/en-
us/phones/s60-smartphone](http://www.catphones.com/en-
us/phones/s60-smartphone)

------
sschueller
Better than what we call them in Switzerland. 'Handy', pronounced the English
way.

~~~
pluma
Germany as well. I always heard it dates back to an early marketing thing.

There's also the old (~1990s) joke that it's actually Bavarian and short for
"Hän die ken Schnür?" ("Haben die kein Kabel?" / "Do these not have a cord?").

------
s3cur3
I prefer "tricorder."

------
radley
from the Greek _phon_ : voice, sound

i.e. a means to communicate.

You do not call an automobile "wheeled combustion chamber".

------
ask5
The device is still a phone, because it needs a SIM card to operate. Without
it, it's pretty much useless. You can't even start the iPhone without it; the
first thing it needs is a SIM card.

~~~
jdminhbg
What? If you boot an iPhone without a SIM it'll show you a warning/error that
the SIM is missing, but it still turns on and works.

------
nullc
Handcuff computer was rejected by the marketing department.

