I expect it is not available!
There are these really cool smart chessboards that can suggest moves and track your games... but they’re $400 and weigh 19 pounds. And of course there are apps that can analyze games but tracking and inputting games by hand is a huge pain. Or fully-digital chess apps... but board games are way more fun in real life!
We wondered: “why can’t you just do that in software and bring the best parts of chess apps into the real world?” So we did!
A camera passes a feed of the board through our machine learning model which interprets the state of the board and passes it off to Stockfish to display move suggestions in real time.
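To make the hand-off concrete: whatever the detector sees has to be serialized into something an engine like Stockfish understands, and FEN is the standard format for that. A minimal sketch, assuming the detector reports an 8x8 grid of piece labels (the `wP`/`bQ` label names and `grid_to_fen` helper are my invention, not the authors' actual schema):

```python
# Hypothetical sketch: turn a detector's 8x8 grid of piece labels into the
# piece-placement field of a FEN string, which Stockfish can then consume.
PIECE = {"wK": "K", "wQ": "Q", "wR": "R", "wB": "B", "wN": "N", "wP": "P",
         "bK": "k", "bQ": "q", "bR": "r", "bB": "b", "bN": "n", "bP": "p"}

def grid_to_fen(grid):
    """grid[0] is rank 8 (top of the board); None marks an empty square."""
    ranks = []
    for row in grid:
        fen_row, empties = "", 0
        for cell in row:
            if cell is None:
                empties += 1
            else:
                if empties:  # flush the run of empty squares as a digit
                    fen_row += str(empties)
                    empties = 0
                fen_row += PIECE[cell]
        if empties:
            fen_row += str(empties)
        ranks.append(fen_row)
    return "/".join(ranks)

# The starting position, as the detector might report it:
start = [
    ["bR", "bN", "bB", "bQ", "bK", "bB", "bN", "bR"],
    ["bP"] * 8,
    [None] * 8, [None] * 8, [None] * 8, [None] * 8,
    ["wP"] * 8,
    ["wR", "wN", "wB", "wQ", "wK", "wB", "wN", "wR"],
]
print(grid_to_fen(start))
# → rnbqkbnr/pppppppp/8/8/8/8/PPPPPPPP/RNBQKBNR
```

This is only the placement field; a full FEN also carries side-to-move, castling rights, and so on, which per-frame detection alone can't recover.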
We didn’t quite get to recording the state over time in PGN but we hope to continue this project and add that soon!
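Since the PGN piece isn't built yet, here's roughly what the output step would look like once moves are being tracked: PGN movetext is just numbered SAN moves plus a result marker. This `moves_to_pgn` helper is a sketch of mine, not the project's code:

```python
def moves_to_pgn(sans, result="*"):
    """Join a list of SAN moves into a numbered PGN movetext line.
    '*' is the PGN marker for a game still in progress."""
    parts = []
    for i, san in enumerate(sans):
        if i % 2 == 0:  # White's move starts a new numbered pair
            parts.append(f"{i // 2 + 1}.")
        parts.append(san)
    parts.append(result)
    return " ".join(parts)

print(moves_to_pgn(["e4", "e5", "Nf3", "Nc6"]))
# → 1. e4 e5 2. Nf3 Nc6 *
```

A real exporter would also emit the Seven Tag Roster (Event, Site, Date, etc.) above the movetext.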
Would love to know what you think. We’re working on enhancing other board games with computer vision as well; if you want to help us beta test, sign up at https://boardboss.com
Also I live tweeted about our progress during the hackathon so if you’re interested in how the sausage is made you can check out the blow-by-blow here: https://threadreaderapp.com/thread/1179424684502388736.html
Looking forward to seeing what BoardBoss could become. These days I've been wanting a CV app to track backgammon games. Those dice can be pretty tiny though :)
ML tools have definitely come a long way in the past few years! We used CreateML for our first pass which is great for prototyping; you just give it your training data and hit go.
Unfortunately it’s not particularly transparent or tweakable. If it doesn’t do the job well enough, you’re out of luck and have to switch to a different tool entirely.
Edit: cool project!
Can you share some insight into your ML process? One thing we did to simplify the vision problem was capture images from the same perspective (hence our tripod). We labeled 2894 objects across 292 images, with 12 classes to detect: one per piece type for each of black and white. We struggled with occlusion, especially when a pawn sits behind a queen.
There is also a presentation we had prepared for an informal talk: https://github.com/chesseye/chesseye/blob/master/presentatio...
Hope this helps! I always enjoy talking about these things, so feel free to reach out if you want to discuss it more.
I see a micro-projector with two cameras just 10 cm apart on a single tripod.
With stereo the tripod does not need to be that big for reliable detection.
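For a sense of why a 10 cm baseline could be enough: under the standard pinhole stereo model, depth falls out of the disparity between the two cameras as Z = f·B/d. A quick sketch (the numbers are illustrative assumptions, not measurements from this setup):

```python
def stereo_depth(focal_px, baseline_m, disparity_px):
    """Classic pinhole stereo relation: depth Z = f * B / d,
    with focal length in pixels, baseline in metres, disparity in pixels."""
    return focal_px * baseline_m / disparity_px

# e.g. a 1000 px focal length, 10 cm baseline, and a 40 px disparity
# would put the matched point 2.5 m from the cameras:
print(stereo_depth(1000, 0.10, 40))
# → 2.5
```

At tabletop distances the disparities are large, so even a short baseline gives usable depth for telling a tall queen from a short pawn.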
Have you seen Tilt Five? It’s an AR headset that operates off a similar concept. Instead of trying to figure out the passthrough optics they put projectors in the glasses that reflect back at the viewer.
We were hoping to get this done at the hackathon actually but it turned out our v1 model wasn’t stable enough to track the game state over time. Hopefully v2 of the model will let us do this.
An added bonus over tracking by hand: a computer-vision-powered version could also record the time taken for each move.
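The timing idea is cheap once moves are being detected at all: timestamp each detected move and take differences. A minimal sketch, assuming the camera pipeline emits `(seconds_since_start, san_move)` events (the `move_durations` helper is hypothetical):

```python
def move_durations(events):
    """events: list of (seconds_since_start, san_move) pairs as moves are
    detected. Returns (san, seconds_spent) pairs, where seconds_spent is
    the think time since the previous move landed on the board."""
    out, prev = [], 0.0
    for t, san in events:
        out.append((san, t - prev))
        prev = t
    return out

print(move_durations([(3.0, "e4"), (10.5, "e5"), (14.0, "Nf3")]))
# → [('e4', 3.0), ('e5', 7.5), ('Nf3', 3.5)]
```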
If we built this would your club be willing to help us beta test?
We’ll aim to support the view from a seated player on each side of the game.
You should also consider posting this on the chess subreddit: http://reddit.com/r/chess
We’ll likely share it with the 147k-subscriber /r/chess once we have an app others can demo. Good call.
If not, anyone else wanna work on this together? Email me (address in my profile).
But I’d love to chat more about what you’d be interested in building! I’ll follow up with you.
At present, we’re simply outputting the recommended move from Stockfish, which does also give a “why” of sorts. Perhaps letting users add that commentary in the app sufficiently solves that need.
How did you handle a queen occluding a smaller pawn behind it? Simply more training data?
(Not chess but I think we’re on the same page!)