Visualizing an iOS Device in Blender Through Quantum Entanglement (medium.com)
29 points by Ivoah 4 days ago | 14 comments

Ah, I did something vaguely similar for lulz a few months ago with a ThinkPad X61 tablet - the hard disks have accelerometers in them.

A few minutes dorkily waving the laptop around showed how the values changed, and I discovered the range differences were so straightforward I was able to compute the device orientation (which way is facing up) using a simple shell script!

Five (okay, maybe ten) minutes later, and X11 was happily rotating and inverting the display automatically as I flipped the thing around. (Seeing Chrome auto-flip exactly like a tablet does (albeit with nice animations) was neat.)
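For anyone curious, the gist of it can be sketched in a few lines of Python instead of shell. The hdaps sysfs path is what the ThinkPad accelerometer driver exposes; the axis signs and threshold below are made-up placeholders you'd calibrate by, well, dorkily waving the laptop around:

```python
import subprocess

HDAPS_POS = "/sys/devices/platform/hdaps/position"  # ThinkPad hdaps driver; format is "(x,y)"

def read_position(path=HDAPS_POS):
    """Parse the '(x,y)' string the hdaps driver exposes."""
    with open(path) as f:
        x, y = f.read().strip().strip("()").split(",")
    return int(x), int(y)

def orientation(x, y, rest=(0, 0), threshold=100):
    """Map raw accelerometer deltas to an xrandr orientation name.
    Axes, signs, and threshold are guesses -- calibrate for your unit."""
    dx, dy = x - rest[0], y - rest[1]
    if abs(dx) < threshold and abs(dy) < threshold:
        return "normal"
    if abs(dx) >= abs(dy):
        return "right" if dx > 0 else "left"
    return "inverted" if dy > 0 else "normal"

def rotate(orient):
    # xrandr -o accepts: normal, left, right, inverted
    subprocess.run(["xrandr", "-o", orient], check=True)
```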

...with one caveat. Every single orientation change made the backlight kick off and back on again.

Tons of printk() debugging was insufficient to fully track down what was causing it. My "solution" was to neuter the bits of KMS that actually did the low-level DPMS calls - the result was some mildly scary display tearing in certain circumstances, but the backlight didn't flicker anymore.

Unfortunately, 'xset dpms force standby' (or 'suspend') no longer worked either - and neither did anything else that put the display to sleep via DPMS. Whoops.

libdrm, KMS, DRM and X11 are a mess.

(As for what I was unable to get to the bottom of - something at the X11 level was deciding, in certain circumstances, that some rotates required "full GPU hard resets" and other rotates didn't. For example, rotating left was fine, but rotating right was not. And inverting was fine, but switching back to normal was not. But, get this, _if I had an external display attached_ (or had the VGA port forced-on), rotating left and right were both fine! Given that I was able to later reproduce this behavior on both an Intel and AMD system, this is why I glare at KMS/DRM and call them a mess.)


Spoiler: this has zero to do with quantum entanglement. The author thought that was funny? Cool project, though.

Seriously wtf... I'll honestly say that I fell for this click bait. If it said something along the lines of "blah blah... accelerometer" I would've checked it out, too, AND actually read it. But because of the shit click bait, I'm not even going to bother.

Sorry about that. I figured most would get it was a lighthearted reference, and not groundbreaking quantum research that would change the world as we know it. Even so, I made the first words after the title be “I might have the definition of ‘quantum entanglement’ wrong.”, to try and make that clear. Oh well, I’ll get it right next time. Thanks for the feedback.

Personally I thought it was genuinely funny.

And on top of that, I thought it was in a real blender. So my brain imagined something like "Will it blend" while receiving orientation of all the parts via quantum entanglement. Now that would be the future.

Did you really expect it to?

The huge range of solutions that developers with different backgrounds will come up with is evidenced by the fact that, when I read this...

The code is pretty straightforward. It opens up a socket to the host, then for each motion update it creates a MotionData value, sets the properties on it, encodes it into JSON and sends it to the script running in Blender. It reads any data the host sends and discards it.

...my first thought was why JSON? I'd be curious to know the reasoning, since if I wanted to do this same task, I'd just send the values directly as binary: 4-byte floats seem the natural choice here, since that's the representation both sides ultimately want. Also, this protocol is clearly unidirectional, so there's no need to even bother with the other direction.
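A minimal sketch of that binary approach, assuming the four values are something like a quaternion (that's my guess, not anything from the article) — struct.Struct gives you the fixed 16-byte wire format for free:

```python
import socket
import struct

MSG = struct.Struct("<4f")  # four little-endian 32-bit floats = 16 bytes, fixed size

def send_motion(sock, w, x, y, z):
    """Send one orientation sample as raw floats -- no encoding step, no encode errors."""
    sock.sendall(MSG.pack(w, x, y, z))
```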


Why not?

It's a straightforward format that's widely adopted with easy-to-use libraries, and the data's tagged (with keys) so you get a bit of extra information when you're debugging.

It's a decent choice when you're building a prototype, or anything that isn't supposed to be extremely performance-oriented. If it is, I totally agree: a binary protocol would be more appropriate.
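For comparison, here's what one tagged message might look like next to the fixed-size binary alternative. The field names here are hypothetical (the article's MotionData may differ):

```python
import json
import struct

# Hypothetical motion sample: roll/pitch/yaw plus a timestamp.
sample = {"roll": 0.12, "pitch": -0.34, "yaw": 1.57, "t": 0.016}

# Self-describing and grep-able while debugging...
as_json = json.dumps(sample).encode()

# ...vs. a fixed 16 bytes that both sides must agree on out of band.
as_binary = struct.pack("<4f", sample["roll"], sample["pitch"], sample["yaw"], sample["t"])

print(len(as_json), len(as_binary))  # roughly 55 bytes vs a fixed 16
```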


Every layer of abstraction introduces additional complexity, and more significantly, possible error cases. Observe how the given code has a path to handle JSON encoding/decoding errors. If you use a plain binary protocol, that's not even a consideration.

I know it may be a contrived example, but what happens if you try to send it several MB of JSON, well-formed or otherwise? Examining the code more closely, I notice that it only reads up to 4K at a time, and there is no message framing, despite using stream-oriented TCP. It's going to get very confused if it gets more or less than one exact JSON "message" per socket read. If you modify it so it does do framing, then the question above still holds... what happens if it runs out of memory, etc...?

All questions which are completely avoided by a simple "read 4 numbers from the socket and set positions" protocol. Code in which error cases simply cannot occur has no need to fear even a fuzzer feeding it input. KISS, YAGNI.
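Fixed-size records also make the framing trivial. The only subtlety left is that recv() on a TCP stream can return short reads, so a sketch of the receiving side (again assuming four floats per sample) might look like:

```python
import struct

MSG = struct.Struct("<4f")  # 16-byte fixed-size record: no framing ambiguity

def recv_exact(sock, n):
    """TCP is a byte stream: keep reading until exactly n bytes have arrived."""
    buf = b""
    while len(buf) < n:
        chunk = sock.recv(n - len(buf))
        if not chunk:
            raise ConnectionError("peer closed mid-message")
        buf += chunk
    return buf

def recv_motion(sock):
    """Return one (w, x, y, z) sample; garbage in yields garbage floats, not a crash."""
    return MSG.unpack(recv_exact(sock, MSG.size))
```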

(The larger point I'm making is that simplicity is often surprisingly easy to apply, yet rather undervalued in these times of preferring more abstraction. Of all the easily-exploited IoT out there, I bet the majority of them would not be so if some simplicity had been applied to their design.)


It would probably also be better to use UDP instead of TCP: you want to send updates as fast as possible, and if a packet is dropped you don’t want to waste network capacity resending it — just send the next packet.

Then again, TCP is a lot easier to work with, since you get an error on the client side if you made a mistake. With UDP there is no way to know whether the packet was received.
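A fire-and-forget UDP sender along those lines might look like this (host/port and the four-float payload are placeholders):

```python
import socket
import struct

MSG = struct.Struct("<4f")

def make_sender(host="127.0.0.1", port=9999):
    """Return a send(w, x, y, z) function that fires one datagram per sample."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

    def send(w, x, y, z):
        # Fire-and-forget: a dropped sample is simply superseded by the next one.
        sock.sendto(MSG.pack(w, x, y, z), (host, port))

    return send
```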


If you have a Python script at the other end JSON is nicer to decode and debug, and for such a low amount of data the few bytes overhead doesn't really matter.

Also, once you go with a binary encoding, a fixed-point integer format is almost always a better choice than sending floats. Depending on the needed spatial resolution (which isn't much in this example), you can reduce message size by using 8-bit or 16-bit fixed-point integers, or, if you need more precision than a float in the same space, use a 32-bit fixed-point number.


I recently started learning about how powerful Blender’s Python API is. I have a YouTube channel that people support through donations, and every month I send out little 3D printed objects as a thank you gift. I wanted to include a 3D printed little thank you card with each person’s name on it, but obviously I can’t take the time to model a unique card for each person.

After just a few hours of playing around I was able to set up a Python script that reads people’s names out of a CSV file, launches Blender as a background process, generates a model of a little plate with text on it thanking a person by name, exports that model as an STL, calls up Slic3r (which can also be run headlessly) to generate the gcode for the printer, and then finally uploads that file to my 3D printer.

Previously I was writing people little notes by hand. Not only is this much cooler, it takes considerably less work from me. I just run a single command to execute the script and then walk over to my printer and hit “print”. A couple hours later I have a pile of little thank you cards.
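The driver for a pipeline like that can be surprisingly small. A rough sketch, where gen_card.py is a hypothetical bpy script that reads a name and an output path from the arguments after Blender's `--` separator, and the Slic3r invocation is a guess at the headless usage:

```python
import csv
import subprocess
from pathlib import Path

def commands_for(name, outdir="cards"):
    """Build the headless Blender + Slic3r command lines for one recipient."""
    stl = Path(outdir) / f"{name}.stl"
    gcode = stl.with_suffix(".gcode")
    return [
        # --background runs Blender headless; args after "--" go to the script
        ["blender", "--background", "--python", "gen_card.py", "--", name, str(stl)],
        ["slic3r", str(stl), "--output", str(gcode)],
    ]

def run_all(csv_path):
    """One thank-you card per row; first column is the person's name."""
    with open(csv_path, newline="") as f:
        for row in csv.reader(f):
            for cmd in commands_for(row[0]):
                subprocess.run(cmd, check=True)
```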

I really think Blender is the crown jewel of open source software.


Reminds me of when I was playing with an Oculus Rift DK2 kit:

https://www.youtube.com/watch?v=nlc-epzkhb8



