
Show HN: Live-programmable voxel landscape in WebVR - moron4hire
http://www.primroseeditor.com/examples/editor3d/
======
moron4hire
Primrose is a text editor for use in WebGL/WebVR contexts.

I want to make live-programmable environments and create new user experiences
that require VR, rather than adapting 2D UX to the stereo display. Having a
text editor run in a texture is a necessary tool to start. I've demoed just
the editor before, but this is the first time I've been able to demonstrate an
environment constructed on the fly using the editor.
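
If you haven't seen the trick before, the gist is to draw the editor's text
into a 2D canvas and hand that canvas to WebGL as a texture on a quad. A
bare-bones three.js sketch (not Primrose's actual API, just the shape of the
idea):

    // Draw the editor's text into an offscreen 2D canvas.
    var canvas = document.createElement('canvas');
    canvas.width = canvas.height = 1024;
    var ctx = canvas.getContext('2d');

    // Use that canvas as the texture on a quad in the scene.
    var texture = new THREE.Texture(canvas);
    var quad = new THREE.Mesh(
        new THREE.PlaneGeometry(2, 2),
        new THREE.MeshBasicMaterial({ map: texture }));

    function redraw(lines) {
      ctx.fillStyle = 'black';
      ctx.fillRect(0, 0, canvas.width, canvas.height);
      ctx.fillStyle = 'white';
      ctx.font = '32px monospace';
      lines.forEach(function (line, i) {
        ctx.fillText(line, 10, 40 * (i + 1));
      });
      texture.needsUpdate = true; // re-upload the canvas to the GPU
    }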

You can find the source code here:
[https://github.com/capnmidnight/Primrose/](https://github.com/capnmidnight/Primrose/)

------
highCs
FYI, a voxel (volumetric pixel) is a 3D rendering primitive that competes
with the polygon (yes, there are other primitives too, like ellipsoids [0]).
A point cloud carrying whatever augmented information isn't in itself a voxel
model. Ultimately, a voxel is a voxel when it is rendered as such: when it is
the 3D primitive actually used to produce the rendering.

So are Minecraft landscapes made of voxels? No, they are polygons.
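
To make that concrete: a toy voxel model is just a dense 3D grid of cells,
and a true voxel renderer walks that grid directly (sketch below; the names
are made up):

    // A voxel model is a dense 3D grid of cells, indexed directly.
    var SIZE = 64;
    var grid = new Uint8Array(SIZE * SIZE * SIZE);

    function setVoxel(x, y, z, material) {
      grid[x + SIZE * (y + SIZE * z)] = material;
    }

    // A voxel renderer samples this grid cell by cell (e.g. by ray
    // marching). Minecraft instead extracts the exposed faces and draws
    // them as ordinary triangle meshes, so the on-screen primitive is
    // the polygon, not the voxel.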

[0]
[http://www.hardcoregaming101.net/ecstatica/ecstatica2-6.jpg](http://www.hardcoregaming101.net/ecstatica/ecstatica2-6.jpg)

~~~
moron4hire
I know the difference, but people today hear "voxel" and think "Minecraft",
and I didn't want to say "Minecraft" in the title, because there isn't any
world-level interactivity in this particular demo.

------
codeshaman
Good work. I can envision live editing becoming standard, and we're bound to
see new 'code' builders where programs are expressed using graphics mixed
with text.

That being said, editing text in 3D isn't fun. Text is by definition 2D, and
it's weird to edit it in projection (although the idea itself seems cool).

I'd be perfectly happy with a split screen of 2D editor + 3D scene being
edited.

~~~
moron4hire
It's necessary in the WebVR setting, though, because mixing traditional 2D
elements with the VR scene is extremely jarring. I built Primrose specifically
because, at the time, other VR live-programming demos were overlaying HTML
elements using CSS3 transform matrices to fake a 3D, barrel-distorted
projection, which meant there were two completely different Z-buffers (the
WebGL one and the DOM one) that didn't interact.
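
For reference, the overlay trick was roughly this (paraphrased; editorDiv
stands in for whatever element those demos used):

    // What Primrose avoids: faking the projection by positioning a live
    // DOM element with a CSS 3D transform. The element renders in the
    // DOM's compositing order, not WebGL's depth buffer, so the WebGL
    // scene can never occlude it, and vice versa.
    editorDiv.style.transform =
        'perspective(500px) translate3d(-120px, 40px, -300px) rotateY(25deg)';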

Certainly this is not the ideal setup, but I think it's a good baseline to
continue building from. I'm working on a game of collaborative AI bots that
the user will program using a simple, not-JavaScript programming language
(maybe CoffeeScript, IDK; I actually slapped together a BASIC interpreter not
too long ago for another demo). I think it will work better to organize
functions 1-to-1 with editor objects, tied directly to the physical objects
you're editing. I'm approaching it from a "crack open the back of the robot
and twiddle the parameters" metaphor. I'm hoping the more concrete scenario
will obviate some of the awkward issues you've mentioned.

------
iamwil
What is this useful for? I'm not a WebGL/VR programmer, so I'm curious as to
why people would use this, besides the neat factor.

~~~
moron4hire
Think of it a bit like a 3D REPL.
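
The core loop is roughly this (a sketch; the editor calls are invented
stand-ins, not the real Primrose API):

    // Pull the text off the in-world editor and evaluate it against the
    // scene, reporting errors back into the 3D editor itself.
    function execute(editor, scene) {
      var src = editor.getText();
      try {
        new Function('scene', src)(scene); // rebuild the world from code
      } catch (err) {
        editor.showError(err);             // surface errors in-world
      }
    }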

Working in the browser gives me, a single developer, a much faster route to a
wide variety of other features (webcam, audio, WebRTC) as well as cross-
platform compatibility (for example, this same site works in WebVR builds of
Chromium and Firefox for Android using a Google Cardboard-like setup).

With that as the baseline, I want to experiment with creating new user
experiences that require VR, rather than adapting the same ol' 2D UX to the
stereo display.

One of those things might be live-editable objects. Having a text editor run
in a texture is a necessary tool to start. Eventually, I'll be building more
controls and features for manipulating objects on the fly.

