

What can the Myo armband actually do? - agrant
https://www.thalmic.com/blog/can-myo-armband-actually/

======
shostack
This was a very thoughtful and interesting post.

It's funny, because despite the potential, it seems people these days don't
care about the "what will it do for me tomorrow" as much as "what can it do
for me right now."

Companies like Apple have the luxury of war chests that let them absorb
substantial risk while iterating on a hardware product in a new, unproven
direction, where the device is so groundbreaking that its true potential can't
be realized yet.

I have high hopes for the Myo and anything else that can translate my
movements into signals for various things. Ever since I read Rainbows End,
where people's slight twitches and movements serve as input devices for their
wearable computers, I've realized this is where things are going.

Someone needs to take the risk to bring this technology to market, and given
Thalmic's vision, my hope is that they survive long enough to see it become
reality, rather than just serving as a stepping stone for a larger tech giant
to make it mainstream.

------
jadeddrag
Unfortunately, Thalmic Labs is very developer-unfriendly, so while the Myo
still might have potential, working with the closed-source code makes it very
difficult to expand the functionality, especially on Linux. I suspect it's
just another expensive paperweight.

~~~
smngreenberg
Scott from Developer Relations at Thalmic here. I'm sorry to hear that you're
running into problems. We'll be opening up a lot of the resources needed to
port to Linux and other platforms soon, but it is more of a time issue than
anything else.

In the interim, feel free to reach out to me on Twitter (@smngreenberg) if you
have any questions. I'll mostly be able to help with our supported platforms
(Win, Mac, Android, iOS), but I'll do my best if it's something other than
those four.

------
jadeddrag
The short answer: it's not accurate or reliable enough to do anything useful.
It's just an expensive toy and, worse, artificially difficult to hack (the
raw data is restricted, even to developers).

------
lam
I am curious about the current limitations on classifying other hand
gestures, such as a thumb-to-middle-finger gesture. Is the limitation due to
processing of (possibly aliased) signals?
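One plausible limitation (this is an illustrative guess, not anything from Thalmic): gestures that recruit overlapping forearm muscles produce nearly identical surface-EMG signatures. A minimal sketch below, with entirely made-up activation profiles and simulated signals rather than real Myo data, shows how windowed RMS features separate a distinct gesture easily but tend to confuse two similar finger gestures:

```python
# Purely illustrative: why similar finger gestures are hard to separate from
# surface EMG. All activation profiles and numbers here are invented, not
# measured Myo data.
import math
import random

random.seed(0)
CHANNELS = 8   # the Myo has 8 EMG pods around the forearm
WINDOW = 50    # samples per classification window

def rms_features(window):
    """Root-mean-square amplitude per channel: a common, simple EMG feature."""
    return [math.sqrt(sum(s * s for s in ch) / len(ch)) for ch in window]

def synth_window(profile, noise=0.3):
    """Simulate one EMG window: zero-mean noise scaled by muscle activation."""
    return [[random.gauss(0, a + noise) for _ in range(WINDOW)] for a in profile]

# Hypothetical activation profiles. The two thumb gestures differ only
# slightly because (in this toy model) the same flexors drive both.
profiles = {
    "fist":            [1.0] * CHANNELS,
    "thumb_to_middle": [0.2, 0.9, 0.8, 0.2, 0.2, 0.2, 0.2, 0.2],
    "thumb_to_ring":   [0.2, 0.8, 0.9, 0.3, 0.2, 0.2, 0.2, 0.2],
}

# "Train": mean RMS feature vector (centroid) per gesture.
centroids = {}
for name, prof in profiles.items():
    feats = [rms_features(synth_window(prof)) for _ in range(30)]
    centroids[name] = [sum(col) / len(col) for col in zip(*feats)]

def classify(window):
    """Nearest-centroid classification on RMS features."""
    f = rms_features(window)
    return min(centroids,
               key=lambda n: sum((a - b) ** 2 for a, b in zip(f, centroids[n])))

# Evaluate: "fist" is far from the others in feature space and is recognized
# reliably; the two similar gestures tend to get confused with each other.
accuracy = {}
for name, prof in profiles.items():
    hits = sum(classify(synth_window(prof)) == name for _ in range(100))
    accuracy[name] = hits / 100
print(accuracy)
```

Under this toy model, the per-channel feature noise is comparable to the distance between the two thumb-gesture centroids, so no amount of classifier cleverness fixes it; you'd need features (or sensor placement) that actually separate the muscles involved.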

