Tesla Bot Update – Sort and Stretch [video] (youtube.com)
19 points by migueloller on Sept 25, 2023 | 29 comments


I've been generally curious about the benefits of the humanoid robot form factor.

I'll admit I don't know much about the state of the enterprise robotics/automation market, but it would seem like the market would be limited by the fact that: 1.) The companies able to afford such robots would have much higher throughput capacity requirements and would want to set up much more customized automation. 2.) The companies in the sweet spot in terms of lower throughput capacity requirements would not be able to afford the upfront cost of such robots.

I think the primary use case would emerge if these could actually get into the sub-$100K-$150K range in end cost, which seems a bit far off given the complexity I'm seeing here. Perhaps the idea is to play the long game?

It feels like a version with just the top half of the humanoid would be more interesting if it could cut costs particularly since this thing likely needs to be tethered for reliable power and communications in a factory automation setting.


It seems a bit strange that Tesla can build multiple electric cars for $150k, while a robot made of far less material should cost that much.

The car has lots of compute power, and they are likely sharing those components with the robot. The car also has 5 cameras and a host of other sensors.

The car also requires the cost of a huge battery and tons of materials plus a lot of manual work for the interiors, plus cooling and so on.

Yes, the robot has a lot more actuators and complex mechanical pieces, but is that going to cost as much as 2 full cars?

What complexity do you think would be so costly here?


> It seems a bit strange that Tesla can build multiple electric cars for $150k, while a robot made of far less material should cost that much.

Cars are only cheap because of massive economies of scale. This robot will probably be hand assembled, making it obviously quite expensive. Not to mention the need to recoup a lot of the R&D investment over relatively few units.


But that was the whole premise of Tesla's investment in this bot. They don't want to produce 2, they want to produce a lot of them. They already produce the electronics in the millions, and have supply contracts for cameras and so on.

Tesla is vertically integrated into producing most of the electronics (including the engine) in the car and in the charging networks. They also design products to be manufactured efficiently.

This suggests that they should be able to mass produce actuators and put those robots together efficiently. Musk treats mass manufacture as part of the product development. However, in the end this relies on enough demand being out there so that it's worth continuing the scale-up of manufacturing lines.

It will likely take many more iterations for this to be more than a tech demo.


Agree - I think it will be interesting to watch to see if it becomes more like the Tesla Model 3/Y or is more like the Tesla Semi/FSD/Cybertruck.


Yes, the hand alone seems complex/intricate enough that it'd be pretty expensive to assemble/repair - whether using human or humanoid workers.

[1] https://techcrunch.com/wp-content/uploads/2022/09/tesla-robo...


I believe their target price is $20k. No idea if it costs that right now; I doubt it.


I did see that.

It'd be interesting if one of the Tesla car teardown YouTubers (e.g. Munro, Engineering Explained) did an Optimus humanoid robot first-principles teardown.


Conveniently, it's not labeled that the video is all at 4x speed. Using the “playback speed” menu to watch at 0.25x makes the human’s movements much more realistic and the robot’s movements much less impressive.


At 00:15, a label pops up at the top-right that says “1.5X Speed”


I think .25x is a bit slow, as .5x replay also looks reasonable, but you are right, there is something like a 2.5–3.5x speed-up. It makes the robot’s smooth motion much less impressive.


Yeah I couldn’t decide if it was .25 or .5 because the dropped frames make the movements of the human look janky at any speed, but going off the part where he’s moving the blocks around, .25 looked a bit more “natural” to me.

Either way the video is purposely deceptive and the robot is nowhere near even a university-level vision-based robot when seen at normal speeds.


Uh, really? Are universities routinely developing robots with that level of hand dexterity? I mean the way fingers close around the objects, it looks incredibly human. Not to mention the coordination between the arm movements and the hips and torso to keep balance. I keep finding it very natural, even if slowed down to 0.25.


.5x makes the part where the human interrupts the robot while sorting look natural.


The comments on YouTube prove that either:

1) Tesla is only good at creating spambots
2) Google owns lots of Tesla stock
3) Humanity is completely braindead

or all of the above. lmao. :'O

The video is CLEARLY fake, AND the fact that Tesla, with such huge amounts of money, does such a bad job at CGI makes me really wonder how stupid people have become. Lots of glitches; no need to be an expert to see that.


Is this CGI? The shadows and movements of the block pieces don't seem real at all. Almost like they're from the PS3 tech demo era.


At 0:40 I see the sort task has magnets on the blocks.

I'm curious (I dropped out of state-school economics): for robotics types, is "sort blue and green" something you should be able to pull off as an undergrad, given equipment?


that's ridiculous. sorting is easy; building a neural net that controls every servo/motor in a humanoid robot so that it can complete a given task while maintaining balance and posture is not. this isn't a simple control loop keeping the robot balanced while doing a hardcoded task.


Ok. Took a couple reads but I think I understand.

I'll put you down as a yes on sorting, no on self balancing humanoid robot


very easily. here is the lecture from my robotics course. Amazing professor. Dr. Sodemann was the best I had in my lifetime. https://robogrok.com/1-2-3_Cameras_and_Color.php

It's actually easier than you would expect! You can use a common image-processing technique called "thresholding", where you basically filter on the RGB or color values. It's also a handy little "hack" on compute. Computers "look" at images or video by converting the camera data into large matrices: red is one matrix, blue another, green another. Instead of working with all three, you typically convert to a single black-and-white mask, but you can threshold on whatever color you're looking for. This reduces your compute from 3 matrices to 1, which is roughly a 3x constant-factor speedup rather than anything asymptotic. Someone can help me on the O(n) notation lol.
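
Here's a minimal sketch of that kind of color thresholding in Python with OpenCV (the library choice, the "blocks.jpg" filename, and the hue ranges are my own assumptions, not from the lecture; in practice you'd usually convert to HSV first rather than filter raw RGB, but the idea is the same: reduce everything to a single mask matrix and look for blobs):

    # Minimal color-thresholding sketch (assumes OpenCV and NumPy are installed).
    # Convert to HSV, mask a color range, then find the block outlines.
    import cv2
    import numpy as np

    frame = cv2.imread("blocks.jpg")              # hypothetical input image
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)  # HSV makes color ranges easier to pick

    # Rough hue ranges for blue and green; real values need tuning to your lighting.
    ranges = {
        "blue":  (np.array([100, 80, 80]), np.array([130, 255, 255])),
        "green": (np.array([40, 80, 80]),  np.array([80, 255, 255])),
    }

    for color, (lo, hi) in ranges.items():
        mask = cv2.inRange(hsv, lo, hi)           # single-channel 0/255 mask: one matrix
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
        for c in contours:
            if cv2.contourArea(c) > 500:          # ignore tiny noise blobs
                x, y, w, h = cv2.boundingRect(c)
                print(f"{color} block at x={x}, y={y}, size={w}x{h}")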


You're talking about different things.

Computer vision to detect red and green blocks in a controlled environment is completely trivial.

Driving a robot with human hands to pick up and manipulate blocks is still advanced research.


Fascinating...my most recent deep work was around color so this really tickled me, ty!


Tesla can’t solve self driving, but it will produce a bot which can function in an environment which is a hundred times more open and chaotic than a road system? I don’t think so.


Unlike self-driving they aren't charging people in advance for something they haven't done, so I think this is fine. Plenty of companies are researching humanoid robots. I don't see why Tesla can't too.

This one does look quite impressive even if it is sped up a bit (find me a robot hand manipulation video that isn't).


Honestly the stakes are a lot lower for a humanoid robot IMO. There's a lot of risk around releasing FSD cars on public roads as opposed to testing out robots in a factory / controlled environment.

I'm actually really excited about the current state of robotics, and wouldn't be surprised if it ends up being the most impactful outcome of the most recent AI developments over the next 10-15 years.

I know there's a lot of hate for Elon Musk / Tesla, but at the end of the day he's just the money guy for some really great researchers out there.


> he's just the money guy

This argument never makes sense as it's pure circular reasoning. He's had no money to invest in anything without taking from some of his companies (which is what happened to Twitter). You can't invest in Tesla with money from Tesla.


> risk around releasing FSD cars

This reads like they haven't already taken that risk with fatal consequences.

Not seeing how a robot, literally marketed as being powered by the FSD computer, could actually work unless the underlying FSD system actually works. Which it doesn't.


I'm not sure about their marketing claims, but the robot in this video doesn't have wheels and isn't going 80mph on an interstate. Maybe the cars and this robot are using similar sensors, but I doubt there's a ton of overlap between the underlying system and the directions it gives. AGI doesn't exist yet.


Psshhh they're only a few GPUs short of training this bot to drive the car for you. Checkmate!



