Show HN: Swip.js, a library to create multi device experiments (github.com/paulsonnentag)
152 points by brackcurly on Oct 18, 2016 | hide | past | favorite | 17 comments

Using the devices as planes for the golf demo is incredibly creative!


I know you can use HTML5 DeviceOrientation for angle and the "swip" for relative positioning. But how did you get the physical size of the screen of each device?

We haven't figured out a way to do this automatically. Currently we prompt the user to enter the size of the device when the app is opened for the first time.
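A user-entered diagonal size is enough to map pixels to physical distances. This is just a sketch of that arithmetic (not Swip's actual code; the function name is made up):

```javascript
// Sketch: convert a user-entered diagonal size (inches) plus the
// reported resolution into pixels-per-inch, so pixel coordinates can
// be mapped to physical distances on the table.
function pixelsPerInch(diagonalInches, widthPx, heightPx) {
  const diagonalPx = Math.sqrt(widthPx * widthPx + heightPx * heightPx);
  return diagonalPx / diagonalInches;
}

// e.g. a 5.5" 1920x1080 phone:
const ppi = pixelsPerInch(5.5, 1920, 1080); // ~400 ppi
const physicalWidthInches = 1920 / ppi;
```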

Why not two swips (maybe even a single, two-finger swip)? That's enough for relative sizing.

I don't think it's possible unless you add an app.

One solution would be to create a companion app that determines the physical screen size, then makes that available to your network via web sockets.

There are a few steps I left out here, but it's possible.

The app could be optional - if it's installed, that device gets fully automatic configuration. If it's not installed, you just fall back to your user prompt.

How do you know which device is positioned higher than the other (on a bunch of books)?

Based on the demo they showed, this isn't needed. As long as they know the orientation of each device relative to a flat plane, they can move the virtual items within the bounds of that device's screen. When an item crosses over onto another device, they only need to calculate movement based on its relative orientation.

Yes, exactly. If it enters the space of a tilted device, a force (depending on the rotation of the device) is applied to the ball.
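That physics step could look roughly like this (a hypothetical sketch, not Swip's actual code; the function and names are made up): project gravity onto the screen plane using the DeviceOrientation tilt angles.

```javascript
// Hypothetical sketch: map DeviceOrientation tilt angles (degrees) to
// an acceleration applied to the ball while it is on that device.
// beta = front-to-back tilt, gamma = left-to-right tilt.
const G = 9.81; // m/s^2
function tiltForce(betaDeg, gammaDeg) {
  const toRad = Math.PI / 180;
  return {
    ax: G * Math.sin(gammaDeg * toRad), // roll pushes the ball sideways
    ay: G * Math.sin(betaDeg * toRad),  // pitch pushes it along the screen
  };
}
```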

This is an amazing demo and a great reminder of the sheer cool factor of having tiny computers in our pockets.

Does anyone know how precisely you can keep time via an app?

I wonder how playing sound across devices would sound. Streaming and clock syncing can happen beforehand; just the playback has to be tightly synced to an absolute clock.

Maybe one could do some crude sound-propagation synthesis by playing with timing, propagation delay, GPS, and a crowd. If you synthesize the sound, the app even stays very small. You could e.g. make an ocean wave roll through the audience (if every device knows where it is and knows the exact time the wave will hit that position).
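The scheduling for that wave could be sketched like this (hypothetical; all names are made up): each device computes when the wave front reaches its position and delays playback accordingly.

```javascript
// Hypothetical sketch: given a shared start time, a wave origin, a wave
// speed, and this device's position (meters), compute the absolute time
// (ms) at which this device should start playing its sample.
function waveArrivalTime(startTimeMs, origin, position, waveSpeedMps) {
  const dx = position.x - origin.x;
  const dy = position.y - origin.y;
  const distanceM = Math.sqrt(dx * dx + dy * dy);
  return startTimeMs + (distanceM / waveSpeedMps) * 1000;
}

// A device 5 m from the origin, with a wave rolling at 5 m/s, plays
// one second after the shared start time.
```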

Same could work with devices as pixels, but I don't find it that interesting.

I went to a hackathon a few years back (I think it was Michigan's MHacks), and one of the teams did exactly that. It was actually really cool.

They split the room into 3 sections, and then controlled each section to play its own tune.

If you are playing sound you can use the microphone for synchronisation.
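One way to do that (a sketch of the general technique, not anything from Swip): estimate the lag between a known reference signal and the microphone recording by brute-force cross-correlation; the lag in samples tells you the local clock offset.

```javascript
// Sketch: find the lag (in samples) at which a known reference signal
// best matches the microphone recording, via brute-force correlation.
function bestLag(reference, recording) {
  let best = 0;
  let bestScore = -Infinity;
  const maxLag = recording.length - reference.length;
  for (let lag = 0; lag <= maxLag; lag++) {
    let score = 0;
    for (let i = 0; i < reference.length; i++) {
      score += reference[i] * recording[lag + i];
    }
    if (score > bestScore) {
      bestScore = score;
      best = lag;
    }
  }
  return best;
}
```

Dividing the lag by the sample rate gives the offset in seconds.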

hehe, right, but it will get messy in a crowd. Maybe fun though, if the phones first try to agree on a common timing.

I don't have a use case for this (yet) but ... the demo is absolutely incredible. Very nice work

I agree, but I could do without the royalty-free bg music.

Wow, great work. I assume the pinch gesture calibrates the positioning, and is always at the center of the screen? The bounce off the screen edge in pong is satisfying.

You're right: when the server receives two swipe events at the same time in opposite directions, we assume that both devices are aligned at that point. It doesn't matter where on the screen you pinch them together, though.
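The matching step described there might look roughly like this (a sketch of the assumed logic, not Swip's actual server code; names and thresholds are made up): two swipe events count as a pinch if they arrive within a small time window and point in roughly opposite directions.

```javascript
// Sketch (assumed logic): two swipe events form a "pinch" if they
// arrive close together in time and their directions are roughly
// opposite (dot product of unit vectors near -1).
function isPinch(a, b, windowMs = 300) {
  const closeInTime = Math.abs(a.timeMs - b.timeMs) <= windowMs;
  const dot = a.dx * b.dx + a.dy * b.dy;
  const norm = Math.hypot(a.dx, a.dy) * Math.hypot(b.dx, b.dy);
  return closeInTime && dot / norm < -0.9;
}
```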

That golf game has a lot of potential even as a native app, nice!
