
Segway robot - tomerico
http://robot.segway.com/
======
jakozaur
Wow, I use Double Robot:
[http://www.doublerobotics.com/](http://www.doublerobotics.com/)

and was wondering how to script this thing, e.g. appear at a meeting and record
it for people in a different timezone. However, it seems to be hard with Double.

With the Segway it may be the killer use case.
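The kind of scripting described above could look something like this. Everything here is hypothetical: neither Double nor Segway is assumed to expose this API, and `RobotClient` is a toy stand-in defined locally just to show the sequencing.

```python
from dataclasses import dataclass, field


@dataclass
class RobotClient:
    """Hypothetical telepresence-robot API; a real robot would expose
    something similar over HTTP or a vendor SDK."""
    log: list = field(default_factory=list)

    def go_to(self, waypoint: str) -> None:
        self.log.append(f"drive:{waypoint}")

    def start_recording(self, filename: str) -> None:
        self.log.append(f"record:{filename}")

    def stop_recording(self) -> None:
        self.log.append("stop")


def attend_and_record(robot: RobotClient, room: str, outfile: str) -> None:
    """Appear at the meeting, record it, then return to the dock."""
    robot.go_to(room)
    robot.start_recording(outfile)
    # ... meeting happens; a real script would block or poll here ...
    robot.stop_recording()
    robot.go_to("dock")


robot = RobotClient()
attend_and_record(robot, "conference-room-2", "standup.mp4")
print(robot.log)
```

A real version would add scheduling (cron or a calendar webhook) and upload the recording somewhere the other timezone can watch it.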

~~~
trymas
It reminds me of a South Park Halloween special episode, where Stan FaceTimes
with his iPad [0].

It's my personal opinion, but even though the idea looks rather good (giving
you a physical presence while working remotely), wouldn't it be very strange
and awkward (and maybe smug) to use? Like the original Segway, which was a very
cool invention but did not gain much traction due to how awkward it is, how
smug people look using it, etc. (even though the 'hoverboard' gained a lot of
popularity, it is IMHO an inferior gadget).

All in all, cool, though IMHO people will need a lot of time to get accustomed
to such a device.

[0]
[http://wac.450f.edgecastcdn.net/80450F/screencrush.com/442/f...](http://wac.450f.edgecastcdn.net/80450F/screencrush.com/442/files/2012/10/sp_1612_promo01.jpg)

P.S. I can't compare it to the Segway robot, as their servers seem to be
struggling to load the content at the moment of writing this comment.

~~~
GavinB
The Segway is quite large and needs parking or storage. Also, it can't go up
stairs. Hoverboards can be picked up and carried easily around an office or
home, and can be tossed under a desk or in a backpack for storage.

A Segway is like a bike in that you always have to think about where to put it
when you arrive. A hoverboard is much more like walking, in that you can toss
it in your backpack or just carry it when you arrive.

Though I'd rather just walk.

~~~
DavidAdams
The immediate use case I thought of when reading about it was that you don't
have to worry about where to put it. You ride it to work then the robot pops
up its cute little face and it drives itself back home. Then it's waiting to
pick you up when you're done. Obviously, that would depend on software that's
probably a lot more advanced than the current prototype. But I suppose that's
why they're soliciting developers.

------
ideamonk
OT: they forgot to set up their meta tags properly. The description is set to
"Responsive Minimal Bootstrap Theme".

~~~
fratlas
A funny find, but why were you looking at their source?

~~~
TeMPOraL
Why not? For some of us it's a decades-old habit, from back when the Web was
more friendly.

~~~
accommodavid
I just looked at the source code and, well, wow. Comments are sometimes in
English and sometimes in Chinese, scripts are being loaded all over the place,
one tag is just commented out, one inline script does nothing but assign a
global variable to a value that's presumably just output from PHP, and there
are many more things that feel really... odd for something that's supposed to
advertise a new and complex technology.

~~~
vinceguidry
I wouldn't be so surprised. They're a robotics company, not a web design
company. They probably contracted it out.

~~~
pbhjpbhj
Surely it's far worse if they contracted it out, since then this is the result
of people they brought in on the basis of their web design. If it was in-house,
then there's an excuse of sorts.

~~~
vinceguidry
Nah. If you're not a web design company, then you're not going to know what to
look for when trying to contract a web design company. I'd expect shit-show
engineering to be the norm rather than the exception.

------
lrizzo
I really don't understand why, for this use case, people don't use a third
wheel and save the energy consumed by self-balancing while standing still. Not
just this Segway; also the Double robot (the latter, at least, can claim
aesthetic motivations given its barrel shape).

~~~
robotresearcher
Tri-wheel robots wobble really badly. For telepresence, the screen is at the
top of a mast so wobbles are amplified. The balancing robots have very smooth
motion.

~~~
mhb
Wouldn't it be easier to stabilize the screen than the whole robot?

~~~
stcredzero
Put it this way: The head on the end of the mast is a significant mass at the
end of a long lever. To dynamically stabilize that, you're going to need big,
hefty, expensive motors. Well, you already have big, hefty expensive motors to
move the thing around in the first place, so it actually costs less money to
dynamically stabilize the whole robot.

~~~
jasonwatkinspdx
Check out the inverted pendulum on a cart problem, or go balance a couple of
bats on your hand. When near vertical, the inertia of the pendulum mass helps
you out; it's the inertia of the cart that the motors need to overcome. The
better the control loop, the fewer and smaller the corrections will be, tending
towards zero.
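The cart-pole behavior described above can be sketched with a linearized simulation and a simple PD controller. The length, gains, and time step below are arbitrary illustration values, not tuned for any real robot:

```python
# Linearized pendulum-on-a-cart near vertical:
#   theta'' = (g/L)*theta - a/L
# where a is the cart's horizontal acceleration (the control input).
g, L = 9.81, 1.0          # gravity (m/s^2), pendulum length (m), illustrative
Kp, Kd = 40.0, 10.0       # PD gains, hand-picked for this toy model
dt = 0.001                # integration step (s)

theta, omega = 0.1, 0.0   # start tilted 0.1 rad from vertical, at rest
for _ in range(5000):     # 5 simulated seconds
    a = Kp * theta + Kd * omega        # drive the cart under the fall
    alpha = (g / L) * theta - a / L    # resulting angular acceleration
    omega += alpha * dt                # Euler integration
    theta += omega * dt

print(abs(theta))   # residual tilt after 5 s; decays towards zero
```

With a good controller the corrections shrink as the pole approaches vertical, which is exactly the "tending towards zero" behavior the comment describes.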

------
vinceyuan
This is really a cool idea.

Segway was acquired by Xiaomi-backed Ninebot. And the background in the video
[https://www.youtube.com/watch?v=nr-9p8o60gY](https://www.youtube.com/watch?v=nr-9p8o60gY)
is Beijing. I believe this project is developed in Beijing, China.

~~~
keville
Good info, I missed that bit of news.

[http://www.bloomberg.com/news/articles/2015-04-15/xiaomi-
bac...](http://www.bloomberg.com/news/articles/2015-04-15/xiaomi-backed-
startup-says-it-plans-to-buy-u-s-rival-segway)

------
ragebol
So you sit on its head if you use it as a 'hoverboard'; that's curious.

But what could be the use cases for a robot like this? Its head is too low for
telepresence like Double Robotics. The arm extensions look nice, but it can't
grasp anything with them because they seem to be stiff, Lego-like hands. Plus
the hands can't reach up to a table or kitchen counter to grab anything.
Holding anything with weight would also shift the center of gravity and mess
with the balancing.

On the other hand, combining this robot with something that is otherwise useful
can get the technology out the door and get sales going, with functional arms
added later on.

------
cpitman
Making robots with Segways was actually a thing around 2004-2006. RoboCup is a
robotics soccer league with several divisions, each with a different kind of
challenge.

Briefly, there was a Segway league. Each team had a single robotic Segway and
a single human-driven Segway, which had to cooperate to score goals on the
other team. Here's the page of one of the teams, with pictures and videos of
the Segway robots in action:
[http://www.nsi.edu/~nomad/segway/](http://www.nsi.edu/~nomad/segway/)

------
donkeyd
I had hoped that this would be a self-driving Segway, but it seems that when
you ride it, all the sensors are covered by your legs.

This could be the first step to robots getting into the mainstream though,
since it has some useful functionality beyond being a robot.

------
chinathrow
This might actually work:

"AN EXTENDABLE ROBOT

FOR THERE ARE MANY POSSIBILITIES OUT THERE. WHAT WOULD YOU LIKE YOUR GEARS TO
BE?"

Let the public find the exact use cases; provide the basic tooling such as
mobility, GPS, a depth camera, etc.

~~~
cake
Meh, I was trying to find an actual use for this robot. I guess even Segway
has no idea what to do with it.

------
amelius
Would be nice if combined with a 360 degree lens, and an Oculus Rift.

The problem is you can only use it indoors, because outside anybody could just
steal it.

~~~
drzaiusapelord
>and an Oculus Rift.

Well, it's a telepresence robot, which means it will be at arbitrary locations
doing its job. How many places have the kind of reliable low-latency, high-
bandwidth, low-packet-loss WiFi network needed for a reliable Rift experience?
I mean, we can barely do this with the powerhouse of high-end video cards over
HDMI. How can we transmit high-quality 1440p per eye at 60fps over the common
internet? Maybe 25 Mbps per eye with a lot of compression, so a good 55 Mbps
with overhead? Note, this is live video, so no buffering past a few dozen
milliseconds. We can't, without some really dedicated connections and a WiFi
client-to-AP relationship with little to no interference. You're not getting
this in your average office or other common telepresence locales. You sure as
hell aren't getting it over 3G/4G.

I don't think people appreciate how exotic HMDs like the Rift are. We can't
just plug them in anywhere. For the 3D effect to "work" you need high
resolutions and high framerates. If you can't get those, then a normal screen
should be used, because it's a waste of resources and a waste of time strapping
that thing to your face.
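The bandwidth arithmetic above can be sanity-checked with rough numbers. The bits-per-pixel compression figure is a loose assumption chosen to illustrate the estimate, not a measured codec rate:

```python
# Rough bitrate estimate for streaming 1440p-per-eye video at 60 fps.
width, height = 2560, 1440       # one eye, "1440p"
fps = 60
bits_per_pixel = 0.11            # assumed: aggressive H.264-class compression

per_eye_mbps = width * height * fps * bits_per_pixel / 1e6
both_eyes_mbps = 2 * per_eye_mbps
with_overhead_mbps = both_eyes_mbps * 1.1   # ~10% protocol overhead, assumed

print(round(per_eye_mbps, 1), round(with_overhead_mbps, 1))
```

Under these assumptions the estimate lands in the same ballpark as the 25 Mbps per eye / 55 Mbps total quoted above.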

~~~
vectorjohn
High-framerate video isn't necessary. The reason for the 60fps-capable video
card the Rift is connected to is that when you move your head, you need the
view to update quickly. If you had a 360-degree camera like the parent
suggested, you don't need the source to be 60fps; you just need the view on the
Rift to change at 60fps when you move your head.

Think of a panoramic photo. It's not even 1fps, it's 1 frame ever, and it still
gives you a great 3D effect. I think you'd be fine with a much lower framerate
source. And that would make WiFi and a not-that-impressive internet connection
fine.

Edit to add: of course, with 360-degree video you have much larger frames, not
just 1440p. You could probably do with less than 360 degrees though, since you
can assume a person isn't going to be looking behind them or at the floor.
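The reprojection idea above, rendering a fresh view at headset rate from an older panorama, boils down to mapping the current head orientation into panorama coordinates. A minimal sketch of the equirectangular lookup (pure math, no real frame data, and the 4096x2048 frame size is an assumption):

```python
import math


def equirect_pixel(yaw, pitch, width, height):
    """Map a head orientation (radians) to the pixel at the view center
    of an equirectangular panorama.
    yaw in [-pi, pi), pitch in [-pi/2, pi/2]."""
    u = (yaw + math.pi) / (2 * math.pi)   # 0..1 left-to-right across the frame
    v = (math.pi / 2 - pitch) / math.pi   # 0 at the top, 1 at the bottom
    return int(u * (width - 1)), int(v * (height - 1))


# Looking straight ahead lands in the middle of the frame. The headset can
# re-run this lookup (for every screen pixel) at 60+ fps locally, even if
# the panorama itself only arrives a few times per second.
print(equirect_pixel(0.0, 0.0, 4096, 2048))
```

This is why the source framerate and the display framerate can be decoupled: only the lookup has to run at headset rate.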

------
drzaiusapelord
I have thousands of dollars to spend on something like this if it had an arm
and could do things like get items from the fridge by verbal order (with no
step by step programming, just visual recognition of the fridge and how to
open it), play catch with the dog, sweep up, act as a security guard when I'm
gone, etc.

Robotics is one of those things where the hardware, price, and networking are
there but the software isn't. We don't have an AI-lite engine we can toss in
for simple things a dog could understand like "get my slippers." Until someone
cracks that code, this stuff is just going to be rich-boy novelties and
unneeded contenders in the already over-saturated telepresence market.

~~~
sbierwagen

      Robotics is one of those things where the hardware, 
      price, and networking are there but the software isn't.
    

I'm not sure where you got the impression that the hardware is ready for prime
time, because it ain't. I work in mobile robotics, and I wrote a blog post
about this a while ago: [http://c1qfxugcgy0.tumblr.com/post/31187427192/the-
enduring-...](http://c1qfxugcgy0.tumblr.com/post/31187427192/the-enduring-
tragedy-of-the-robotics-industry)

Batteries aren't good enough, linear actuators have awful power/weight ratios,
and computers just aren't fast enough to solve CV problems and calculate
grasps and paths in seconds, rather than minutes.

Saying you "have thousands of dollars to spend" is great, but not sufficient.
The PR2 I talk about in the blog post costs _four hundred thousand US
dollars,_ and it sucks! It's like saying you're willing to spend five thousand
bucks to buy a Lamborghini Aventador. It's going to be many decades before a
useful household robot only costs ten grand. A household robot just requires
too many breakthroughs in too many different fields.

~~~
drzaiusapelord
Oh, I don't know. It seems to me that the PR2 is designed for industry, so its
pricing is going to reflect that. I suspect there is a home robot space that
some startup can fill sooner rather than later. Whether it does all the things
I listed is the big question, and I suspect it won't, but it may be able to do
a few things that make it a worthwhile purchase.

I've played a bit with OpenCV, PCL, ROS, etc. There's some very impressive
image recognition stuff available right now that works well on commodity x86
platforms (1). I don't think the market is expecting a HAL-like or Jetsons-like
robot, but I could see something akin to an early-80s home computer, where the
product is clearly a long list of compromises but it does a few things very
well and is compelling. Home robotics may be the same way for a while, until it
has its 1984 Macintosh moment, which as you say might be a decade or two or
four away.

I did appreciate your post, but I think it's a little dismissive of some of the
homebrew and smaller-scale stuff out there. The PR2 is a VC-backed monster
designed to bring industrial robots to retail, hospitals, etc. These guys want
to build the 747 of the robot world. That's great. But there are people out
there building the Cessnas of the robotics world. I expect an affordable
consumer product that isn't a joke by Christmas 2018-2020. There's just way too
much potential here.

1\. Home robot hackers are falling in love with the super-low-power NUC, which
gives a CPUMark score around 5,000+

[http://www.showusyoursensors.com/2014/09/intel-nuc-for-
ros.h...](http://www.showusyoursensors.com/2014/09/intel-nuc-for-
ros.html#comment-form)

~~~
sbierwagen

      designed for industry
    

The PR2 has an arm payload of 1.8kg, total payload of 20kg, and a top speed of
1m/s. That doesn't sound "industrial" to me, that sounds like "the absolute
bare minimum to be a mobile anthropomorphic robot". And to achieve those
numbers, it weighs 480 kilos! That's a payload fraction of 4.1%! How is this
anything like a 747?

If it _only_ weighed 60 kilos, the average weight of a human, then we would
expect a total payload of 2.4kg, and an arm payload of... 7.3 grams. Doesn't
sound too useful to me. And the damn thing would still cost $50,000!

The PR2 _was_ a VC backed monster, but remember, Willow Garage _went out of
business_ last year, because their products just weren't very useful! The
technology just isn't there, and won't be for a long time.

    
    
      Home robot hackers falling in love with 
      the super low power NUC
    

I'm sure that's fine for pathing, and Kinect-SLAM, but "getting a beer from
the fridge" is picking arbitrary items in arbitrary poses in an unconstrained
environment, basically the Amazon Picking Challenge, which nobody can solve
with reasonable speed yet, even with hundreds of thousands of dollars of
equipment.

If you honestly think you can build a _cheap_ robot that can do all that by
2020, then by all means, launch a startup and earn billions of dollars. But I
don't think it's going to be done before 2040.

------
PaulHoule
The Segway really is the architecture for "bipedal" robots, and the reason you
haven't seen this before is all those patents...

------
passive
The most interesting thing about this for me is that the third partner is
Xiaomi. They are a company that I expect any day now to eat a large portion of
the consumer electronics market. Items like the Mi Band and Pistons compete
well with things 4X their price. I'm very curious about their involvement
here.

~~~
bravo22
They own Segway.

------
api
Heh... this is pretty close to the Wheelie Boy from William Gibson's recent
novel _The Peripheral_.

------
stock_toaster
It kind of reminds me of Johnny five!

------
digi_owl
Intel inside, huh? I guess the camera array used is based on Intel RealSense
then.

~~~
chinathrow
Yes, that is correct.

"Intel RealSense RGB-D camera enables depth sensing. Great for development on
object recognition, tracking and other cool projects."

~~~
steve_k
The problem that I see with RealSense is that, just like the Kinect, it's
based on structured light. This means that it will only work indoors (just
like the Kinect). So their cool demo of putting it on a drone is kind of
pointless, because who flies a drone indoors?

There are now some devices available which can provide depth data through
passive stereo vision. I have recently seen this one in action:
[http://nerian.com/products/sp1-stereo-
vision/](http://nerian.com/products/sp1-stereo-vision/)

The problem with that, however, is that it is targeted at the industrial
market and probably way too expensive for ordinary consumers. I guess we will
still have to wait for some major revolution in depth sensing.

~~~
reaktor
Heh, RealSense is not exactly one depth camera. This particular form factor
uses active stereo IR (the R200 camera), so it also works outside, but loses
the projective texturing (where it's usually not needed anyway).

~~~
steve_k
Actually, I don't think so. It's kind of hard to get any technical information
on RealSense, and I know that Intel is making different versions of it, so
please correct me if I'm wrong and they have one which really does stereo.
On [http://www.intel.com/content/www/us/en/architecture-and-
tech...](http://www.intel.com/content/www/us/en/architecture-and-
technology/realsense-shortrange.html) they say the following:

"The Intel® RealSense™ Camera F200 is actually three cameras in one—a 1080p HD
camera, an infrared camera, and an infrared laser projector"

So this just looks like the first Kinect. The infrared camera will observe
the projected pattern and the RGB camera is there to capture the color
information. You can't really match an infrared image (which is also covered
with a laser pattern) against a visible-light image, as they will look very
different. So you would require yet another camera (infrared or visible
light) in order to do stereo.
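For reference, passive stereo of the kind discussed above recovers depth from the disparity between two calibrated cameras. A minimal sketch of the pinhole relation, with made-up calibration numbers:

```python
def depth_from_disparity(focal_px, baseline_m, disparity_px):
    """Classic pinhole stereo relation: Z = f * B / d.
    focal_px: focal length in pixels, baseline_m: camera separation in
    meters, disparity_px: horizontal shift of a feature between views."""
    if disparity_px <= 0:
        raise ValueError("feature must appear shifted between the views")
    return focal_px * baseline_m / disparity_px


# Assumed calibration: 700 px focal length, 7 cm baseline.
# A 49-pixel disparity then corresponds to a point about 1 m away.
print(depth_from_disparity(700, 0.07, 49))
```

The relation also shows why structured or coded light helps: it creates texture so that the disparity of otherwise featureless surfaces can be measured at all.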

~~~
reaktor
The robot is using the R200, an active stereo camera, not the F200. Even the
F200 uses a fundamentally different technique (coded light, projected grey
code) rather than structured light as the Kinect uses.

Source: I work as a computer vision engineer on these products for Intel
RealSense.

------
flippyhead
Maybe it's just me, but the lead image makes it look like the dog is chasing
the robot and it's scared. Which is an odd choice for a lead image.

------
tzm
I want this to welcome me at the airport.

------
yitchelle
A side comment: anybody else notice that a request was made to hm.baidu.com
for a JavaScript file from this site?

------
jacquesm
Let's hope these fare better than hitchBOT:

[http://www.usatoday.com/story/news/nation-
now/2015/08/03/hit...](http://www.usatoday.com/story/news/nation-
now/2015/08/03/hitchhiking-robot-destroyed-philadelphia-ending-cross-country-
trek/31051589/)

------
hellameta
This is going to be so much more awesome as drones improve.

------
clintboxe
No way it's as good as Kevin.
[http://savedbythebell.wikia.com/wiki/Kevin](http://savedbythebell.wikia.com/wiki/Kevin)

------
ck2
_" We target to start shipping Segway Robot Developer Edition in early Q3
2016"_

so 7-8 months from now or later

~~~
vectorjohn
Yes, that is how you time.

