Show HN: Play rock-paper-scissors against your computer via webcam, neural nets (tenso.rs)
196 points by antimatter15 160 days ago | 33 comments



Does anyone else notice that there's enough of a delay to switch your choice after the computer's choice is displayed, letting you get a 100% win rate?


Yup. It's still a super cool demo though.


I can't wait for someone to make a sign language browser interface out of this. I'm sure things like that have been done with desktop accessibility apps, but embedding it into a web page or browser plugin would be amazing.


Please correct my understanding if I'm wrong, but if a user is able to use sign language, wouldn't they also be able to type? What accessibility improvement are you envisioning?


Fluent sign language is much faster than typing. I think it's about the same speed as spoken word, so you'd get the same benefits as normal speech recognition.


Learning sign language


My friends and I worked on this problem for a hackathon and even managed to get 3 letters that could be recognized! The judges were unimpressed and the win went to a group that copied a project off of instructables.com :)


If you'll suffer a shameless plug, a friend and I made an (imperfect) ASL learning project at TAMUHack one year. If you're interested:

https://github.com/ssaamm/sign-language-tutor


We thought about LeapMotion but went with a CV implementation instead. Very cool project!


exactly what I thought


It could also be useful in cases where you don't have a physical keyboard (mobile, for example, where you do have a camera...).


I don't know sign language, but I can imagine that for some people typing might be much slower than using sign language.


Which reminds me... I saw a funny trick once. It's probably impossible over a network, but there was an implementation that would always win, no matter what you chose. The secret was that image recognition was effectively instantaneous, and it displayed the winning hand on screen so fast that the human assumed it was fair play. At first you'd think they were lucky. After a few games, that they had a good prediction algorithm. But after a while it just felt weird. Of course people guessed sooner or later, but it was funny anyway.
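The "always win" trick described above doesn't need any prediction at all: classify the human's throw from the first camera frames, then display the move that beats it before the human can perceive the delay. The counter-move part reduces to a lookup table, roughly like this (a minimal sketch; the function name is hypothetical, not from any actual implementation):

```javascript
// Map each throw to the move that beats it.
const BEATS = { rock: "paper", paper: "scissors", scissors: "rock" };

// Given the classifier's guess at the human's throw,
// return the move the "cheating" bot should display.
function counterMove(humanThrow) {
  return BEATS[humanThrow];
}

console.log(counterMove("rock")); // "paper"
```

All of the difficulty is in the classification latency, not the game logic: the hack works only if recognize-and-render fits inside human reaction time (roughly 200 ms).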



Ah yes. No idealistic Asimov algorithms for real robots. You can't trust them, especially when programmed by wetware in the first place.


I love it when I win because it misinterpreted my paper as scissors.


Seems to be misinterpreting my hand signals around 5-10% of the time. The little icon flickers rapidly between the correct hand signal and the "face" icon, and holding my hand closer to the webcam didn't seem to help. Very cool though.


I'd love to see this combined with https://www.youtube.com/watch?v=3nxjjztQKtY for a bot that always wins without a perceptible delay.


So, my 8- and 10-year-old sons and I just started playing again after we were introduced to Rock Paper Scissors Lizard Spock from The Big Bang Theory.

http://www.samkass.com/theories/RPSSL.html

https://www.youtube.com/watch?v=Kov2G0GouBw


It reminded me of http://www.nytimes.com/interactive/science/rock-paper-scisso... a game illustrating how a simple AI strategy (er, algorithm) can win well over 50% of games; but it's unplayable now in the post-Flash era.
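The win-over-50% strategy in that kind of bot comes from humans being bad random number generators: people fall into patterns, so even a first-order model of "which throw tends to follow your last throw" beats chance. The NYT bot matched longer sequences against a database of past games, but the core idea can be sketched as a hypothetical first-order predictor (all names here are illustrative):

```javascript
// Counter the throw the opponent most often plays after their last throw.
const BEATS = { rock: "paper", paper: "scissors", scissors: "rock" };

class MarkovPredictor {
  constructor() {
    this.counts = {}; // counts[prev][next] = times `next` followed `prev`
    this.prev = null; // opponent's most recent throw
  }

  // Record one observed opponent throw.
  observe(move) {
    if (this.prev !== null) {
      if (!this.counts[this.prev]) {
        this.counts[this.prev] = { rock: 0, paper: 0, scissors: 0 };
      }
      this.counts[this.prev][move] += 1;
    }
    this.prev = move;
  }

  // Predict the opponent's next throw from history, then counter it.
  nextPlay() {
    const row = this.counts[this.prev];
    if (!row) return "rock"; // no history yet: arbitrary default
    const predicted = Object.keys(row).reduce((a, b) => (row[a] >= row[b] ? a : b));
    return BEATS[predicted];
  }
}

const bot = new MarkovPredictor();
["rock", "paper", "rock"].forEach((m) => bot.observe(m));
// Opponent played paper after rock before, so predict paper and counter it.
console.log(bot.nextPlay()); // "scissors"
```

Against a truly random opponent this does no better than 1/3 wins, which is exactly why the games where you "feel weird" are the ones against pattern-matching bots.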


Interesting. I had the opposite experience and went 8/2/4 against the bot, by assuming it would mimic typical human responses to my previous choices. The last 4 games weren't enough for it to start engaging in metacognition, I guess.


Why is it just a red box in the latest chrome?

https://puu.sh/xaaCh/cae3f64840.png


This is hilarious and awesome. I'm reminded of the great Simpsons exchange:

Lisa: poor predictable Bart, always chooses rock

Bart: ah, good ol' rock. Nothing beats that!


Where can I find the training data/sample code for training a similar rock-paper-scissors network myself?


I'm getting a really high misclassification rate on what it says I'm throwing. Cute idea, but it didn't work for me.


this is an awesome awesome demo =)


<plug>

Show HN: a weekend hack using Django. Rock Paper Scissors on Twitter.

https://twitter.com/rpsrobot

https://rps.barwap.com/

</plug>


> Unfortunalely, your browser doesn't support accessing your webcam.

This is in Safari — is this really the case? I'm not a web dev, so I don't really know, but AFAIK Safari can do web cam. And if not now, they should support WebRTC in High Sierra, correct?


Correct. There's no support for getUserMedia().

We've done some WebRTC stuff using Chrome. Super excited for iOS 11 where it'll be available (and High Sierra too, but that's not as big an issue as we can just tell people to use Chrome).
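For anyone hitting that error message: the standard way for a page to detect camera support at runtime is to check for `navigator.mediaDevices.getUserMedia` before trying to open a stream. A small sketch (the `nav` parameter is only there so the check can be shown outside a browser; in a real page you'd pass `navigator`):

```javascript
// Feature-detect webcam access before starting a camera-based demo.
function webcamSupported(nav) {
  return !!(
    nav &&
    nav.mediaDevices &&
    typeof nav.mediaDevices.getUserMedia === "function"
  );
}

// In a browser that supports it, you would then request a video stream:
//   navigator.mediaDevices.getUserMedia({ video: true })
//     .then((stream) => { /* attach stream to a <video> element */ })
//     .catch((err) => { /* user denied access, or no camera */ });

console.log(webcamSupported({ mediaDevices: { getUserMedia: () => {} } })); // true
console.log(webcamSupported({})); // false
```

Safari at the time of this thread fails the check above, which is why the demo shows the "doesn't support accessing your webcam" message there.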


This is insanely impressive, huge shout out to the TensorFire team!


So cool! I can't wait for this to be released.


Looking forward to a YOLO or SqueezeDet or SSD+MobileNet version!


+1 for YOLO model being added. Awesome work guys!



