It's a recording of the phone screen running their app. The input of the app is the phone's camera, which they have pointed at their hands holding their Switch. The output of the app is the input plus the game overlaid on top of the Switch.
What I find very cool is that it's extracting scenes from the running game and then projecting them into the VR space, allowing the scenes to be panned/rotated in a way that isn't possible in the original game.
If you haven't watched the video, check it out; my assumption from just reading explanations here was that it was just projecting a facade of the emulated device and its normal rendering of the game into VR.
I actually originally assumed this was entirely done by some AI interpretation of the original 2D projection of the game scene, conjuring the entire new 3D scene. But of course, just extracting the vertex data from the emulator is way better.
It's pretty clear from the title; it's an AR app that projects an emulator over top of your controller.
Sounds pointless if you're using a phone as your window into AR, but if you were to imagine this on a headset like a Quest or Vision Pro, it would basically "turn" any controller into a handheld retro console.
> It's pretty clear from the title; it's an AR app that projects an emulator over top of your controller.
It's not that clear at all if this is your takeaway -- it's not just projecting the 2D surface emulators normally generate, but is instead taking the 3D data from the emulator and projecting _that_ into space. If you move the controller, the projected scene moves with it.
The camera is only used to track the controller; the actual 3D data is pulled from the game files. The Quest API does let you layer elements onto the passthrough view[0], so it could work; the game would just hover in front of you instead of tracking the controller.
Then why isn't it running on Android? That would instantly work on the Quest. Why would someone voluntarily subject themselves to Xcode? People need to treat themselves better.
Which is interesting, because the release of ARKit doesn’t pre-date that of Android’s ARCore by even a whole year, and as I recall Google was dabbling in AR before Apple was. The difference is that ARKit has received consistent attention and iteration.
Well, there was a clear and specific product driving it in the Apple camp. Not just to get it ready for the dev kit for visionOS devices, but to get their dev teams a constant flow of useful data from any AR apps created in ARKit along the way, data that has likely fed into the design decisions and roadmap of the entire project.
I’m pretty sure ARKit has existed 99.99% as a way of getting as much real-world data as possible for the headset design and ‘spatial computing platform’ project.
This sort of stuff also just seems to be way more in Apple’s wheelhouse. Apple is a device company, providing these kinds of developer libraries is, like, table stakes. For Google, supporting hardware isn’t in their DNA, right?
I think this type of camera control has some big potential in the AR market. It's not fully immersive, but (1) (almost) no nausea, because you're peeking into the world, (2) screens everywhere, (3) it can be used in many genres ranging from casual to RPGs and action games, and (4) it feels more natural to existing gamers.
Somehow this made me more convinced of the future of AR, but not VR.
To me "walled garden" sounds pretty negative. It's a wonderful place unavailable to pretty much everyone. Some pretty anti-democratic, capitalistic s*t. But I suppose it carries different connotations for different people.
What's your preferred terminology? "Monopolist prison" comes to mind as an alternative that's not possible to interpret positively.
This seems like an incredibly specific demo. Even they say that it doesn't work with games that aren't Mario Kart.
Unlike the NES, which has only a few ways to lay out the display (which the 3D NES emulator relies on to get good results), the DS can do arbitrary 3D stuff. It also has no API layer from which to reliably extract the camera information needed to generalize this kind of re-projection.
Neat-looking trick, but it's sort of a dead end. It requires hand-written code for any game you would want to do. I'm also curious about the rendering artifacts visible in the demo.
The DS is very much the wrong choice for this. They should try something similar with Dolphin and GameCube games, because the community there already does some stuff with camera manipulation.
There is a fork of Dolphin for VR. I played a few minutes of Zelda with my Oculus on several years ago. Pretty trippy to be transported to a game I enjoyed on a CRT 22 years ago, with a life-scale Link staring me in the face, 3 feet away.
This reminds me that there seem to be few "VR ports" or "cross platform VR releases" of normal games. I guess the gameplay of standard games doesn't necessarily work very well with the inherent limitations of VR. (E.g. nausea from moving around too much)
If someone had told me ~15 years ago that one day I'd have the ability to play a VR version of Super Mario Galaxy, but that, when that time came, I wouldn't even bother to try and play it, I'd have (1) had a hard time believing it, and (2) assumed that future gaming must be godlike, in order to make me not bother with SMG VR.
But the reality of it is so much more human and boring.
That is, I know from playing far-more-tame 3D motion games in VR that it actively sucks as much as it rules. The eyes are saying Wow, and the rest of the body is saying No thanks, in a way not experienced before++. As such, when I heard of Dolphin VR last year, I didn't even dream of using it. Even the mere thought of trying to do SMG levels in VR is intimidating.
++I grew up loving roller coasters, and was the only local kid that never got sick on rough boats, but this, this is something else.
I still think the Metroid Prime games would work well in VR, giving you more depth perception, situational awareness, and ability to look around than the original 4:3 or 16:9 aspect ratios did.
There are way fewer NES mappers than there are DS games. It's 30ish vs hundreds. On an NES emulator, coding up just a few mapper chips gets you most of the library and probably 95% of the games worth playing, whereas with this, you need to write a "camera and 3D asset exporter" for each and every title you want to emulate. The work to give this effect to any game is nontrivial. Honestly, interacting with the AR library and headset is probably the EASIEST part of this demo.
Why do you have to be so negative about it? The README itself says it's a "prototype/concept". It's a neat hack, certainly the best thing I've seen on this website today. You make entirely valid technical points, but you don't have to present them as if you were scrutinizing someone's startup pitch. It's not that.
With ARKit you could probably get it to recognise and track the controller using its object recognition; there's sample code here: https://developer.apple.com/documentation/arkit/arkit_in_ios... Then you wouldn't have to strap the phone to the controller.
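Roughly, it could look something like the sketch below (not from this project; it assumes the controller has already been scanned into an .arobject file with ARObjectScanningConfiguration, and the "Controllers" resource-group name is made up):

    import ARKit

    final class ControllerTracker: NSObject, ARSessionDelegate {
        let session = ARSession()

        func start() {
            // Reference objects are scanned ahead of time and bundled as
            // .arobject files; "Controllers" is a hypothetical group name.
            guard let controllers = ARReferenceObject.referenceObjects(
                inGroupNamed: "Controllers", bundle: nil) else { return }

            let config = ARWorldTrackingConfiguration()
            config.detectionObjects = controllers
            session.delegate = self
            session.run(config)
        }

        // Fires when ARKit recognizes one of the scanned objects.
        func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {
            for case let anchor as ARObjectAnchor in anchors {
                // anchor.transform is the controller's pose in world space;
                // the emulated scene could be parented to it.
                print("Found \(anchor.referenceObject.name ?? "object")")
            }
        }
    }

The catch, as far as I know, is that object detection works best on things that stay still, so it's probably better for finding the controller than for continuously following it while you play.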
I'm not so sure about RealityKit, I found it doesn't have the flexibility and finer grained options of ARKit, it's annoying that Apple make 2 frameworks for AR but don't have the same features in both
This makes me think: is there a Nintendo 3DS emulator for VR, by any chance? Unfortunately, non-VR screens that are able to display stereoscopic 3D are getting increasingly rare again.
I mean, that is pretty cool, but... It seems kind of obvious that they could have just printed the QR code and taped the paper to the Switch instead of using rubber bands to attach a phone to their Switch...
Not sure why this is getting downvoted; I thought the same. Having worked on plenty of projects that rely on tracking a QR code, you'll get a much more stable experience scanning it on paper than on an LCD screen.
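For what it's worth, ARKit's image tracking handles a printed marker pretty well, as long as it knows the artwork ahead of time and you tell it the physical size of the print. A rough sketch (the "Markers" resource-group name is made up, and this isn't necessarily how this project does its tracking):

    import ARKit

    // Minimal sketch: track a printed marker with ARKit image tracking.
    // The marker artwork lives in an AR Resource Group named "Markers"
    // (hypothetical), with its real-world printed width set in the asset
    // catalog so ARKit can recover a stable pose.
    func makeMarkerTrackingConfiguration() -> ARImageTrackingConfiguration? {
        guard let markers = ARReferenceImage.referenceImages(
            inGroupNamed: "Markers", bundle: nil) else { return nil }

        let config = ARImageTrackingConfiguration()
        config.trackingImages = markers
        config.maximumNumberOfTrackedImages = 1  // only one controller to follow
        return config
    }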
Please consider what it would feel like to be the sole developer working on this project and then to hear this response from someone you excitedly shared the idea with in person. This is not a nice way to respond to someone in person, and it isn't a nice way to respond online.
Consider a rephrase: "This is awesome. I am really looking forward to the world when holographic displays make this possible without AR".
I actually was very excited to see what 3D images “floating” above my DS were supposed to look like. Imagine my disappointment when I found out I’d been clickbaited. Nothing is floating here; it’s just an AR app.
“Meh” is exactly how I feel. Indifferent. Unmoved. Unexcited. I left quickly and spent an hour reading about the technology behind the Voxon display. True volumetric images floating before your very eyes.
Perhaps this is an age thing, but I've been around (following) technology long enough to know what's likely / possible, and a leap to a fully-visible hologram on a handheld device isn't likely in the coming few years (at the very least we'd see some precursors to it first).
It's ok to feel indifferent, and you may even express that in person if someone is talking directly to you, but adding "meh" in a message you're choosing to write when no one asked you seems not nice to me.
I'm not even insisting people don't do it -- I'm recommending one reflect more deeply before one speaks; it might seem hard at first, but with practice it will likely filter out the unpleasant-to-others gut reactions you may be used to having.
Let people express their true feelings. We shouldn't be blowing smoke up each other's asses. Criticism is a fact of life and we can't culture our way out of it. There is "nice" and there is honest. And neither is kind (though honesty is much more likely to be kind).
Yes but there's a difference between just putting down other people's work and making substantive criticisms that the creator(s), as well as us readers, can learn from. That's why the HN guidelines say: "Please don't post shallow dismissals, especially of other people's work. A good critical comment teaches us something." - https://news.ycombinator.com/newsguidelines.html
We definitely want good critical comments here, so in that sense you're correct—but we don't want shallow internet putdowns. Those grow like weeds, choking out interesting conversation and eventually degrading the entire ecosystem.
Your phrase "express their true feelings" is interesting in this context. Shallow internet putdowns aren't true feelings. Often they're a kind of side-expression of someone's bad mood, frustration with life, or whatever—feelings which are actually about something entirely different than (e.g.) AR or holographics or whatever the topic at hand is. Those feelings get displaced into internet swipes because the latter are cheap, risk-free, and bring a slight temporary relief. But they're not "true feelings". If they were, they'd be much more interesting reading.
The GP comment actually included a great second paragraph saying something about what they'd really like to see, and that was interesting because it was rooted in genuine curiosity and enthusiasm. But there was no need to put down someone else's work for not being that.
Calling every curt response "dismissive" is throwing the baby out with the bathwater. The original comment personally resonated with me, and while the linked project is kinda neat, I guess, there is a growing contingent of us who see AR as an inadequate stopgap on the way to holography. My response would have been very much similar.
But I also take issue with this because it sort of compels people to be wordy and forces them to use awful gentrified business-bullshit speak where everything has to be couched as positive. There is value in being blunt and direct, and there is value in an economy of words. Just because someone may not feel bothered enough to couch their language in pleasantries, or is incapable of formulating the words in a satisfactory manner, shouldn't mean that they shouldn't speak at all.
I get that some sorts of negativity "sprout up like weeds" but if we truly want an arena of thoughtful, critical discussion then perhaps we should give such comments more benefit of the doubt.
"Meh, just AR, not true holographics" was such a classic shallow dismissal that it should be easy to understand why so many readers, including me, reacted negatively.
No one is calling every critical response dismissive nor compelling anyone to use gentrified business speak, false pleasantry, or any of the other things you're worried about.
Edit: one thing that you and the GP appear to have in common is a pre-existing understanding of "AR as an inadequate stopgap on the way to holography". OK, that's interesting! But the context that pre-exists inside your (or anyone's) head doesn't communicate itself to others. It needs to be explained explicitly. That can be done without putting down the work of someone else who happens to be working on (say) AR.
If you express the point only as a putdown, without the surrounding information that would make it meaningful, then you're going to feel like you made a substantive comment (because for you it's resting on a substantive foundation that you're taking for granted) but it's going to land with readers (who don't have that internal context) as a shallow dismissal.
By "you" I don't mean you personally, but all of us—we all take our own internal state for granted. For HN to function well, though, commenters (all of us) need to work on not doing that, but rather explaining enough relevant background to make our messages interesting instead of empty or mean. The GP's second paragraph did a great job of that—just not the first bit.
Hey while you're at it can you remove my rate limit please. If I'm not meant to post then you should ban me. This reduces my ability to fully participate but does not at all reduce my ability to make fun of cowards.
I've removed the rate limit as a good faith gesture, but please keep in mind that we're trying for quality over quantity and particularly trying to avoid flamewars. "Make fun of cowards" is a distinctly inauspicious phrase in that respect.
“Meh” isn’t critique let alone criticism. It’s dismissal.
It’s not even relevant to this project, because what are they expecting them to do? Make bespoke hardware for it? It’s a link to a GitHub page.
It’s like if someone posted a new Tesla and someone replied with “Meh, should have been a Jetpack”. It’s just completely irrelevant if one were to take it as criticism/critique. It’s pure dismissal,
which is a dismissal. It adds nothing of value, provides no actionable task, and doesn’t do anything beyond saying “I want something else”, which is like saying “okay, but that’s not within the realm of what this singular person can do, so it’s irrelevant”.
This is semantics. If I present something and the response is "meh" then that is important information to me.
Let people express themselves without policing it.
Attitudes like yours create echo chambers of sycophants. I personally hate an environment where the only feedback allowed is positive-to-neutral. Negative things, like the dismissal you seem to hate so much for some reason, are feedback too.
You’re missing the point. Nobody is saying don’t critique it. Critique is constructive; “meh” is dismissive.
Be as brutally critical as needed, but criticism should be something of value.
“Meh” is the equivalent of “neat”. It adds nothing, but unlike “neat”, it takes away pride.
Also, if we’re having the meta discussion, then you’re doing the same thing you’re saying not to do. People are just giving feedback, but unlike the original comment, it’s constructive feedback.
Not to mention that dang often tells people not to be dismissive. If you don't like it, click back and see something else. Or do something more important, like prepare for standup.
You’re basically describing the Looking Glass, and it’s been available for several years now, with a Unity 3D framework if you did want to build games for it.