
Simula: A VR window manager for Linux - georgewsinger
https://github.com/SimulaVR/Simula
======
georgewsinger
In case people are curious what it's like to work in VR, here's a demo of me
using Simula: [https://youtu.be/FWLuwG91HnI](https://youtu.be/FWLuwG91HnI)

I would argue that working in VR is fundamentally superior to working on
PCs/laptops. It's basically 10x the screens, focus, and work immersion.
Simula's text quality is also very good (getting around the eye fatigue
present in older VR Desktops).

~~~
skratlo
Well, looking at the video: I can have dozens of workspaces with hundreds
of windows (if I need to) and navigate them by moving just my fingers,
whereas you are apparently breaking your neck to navigate more than 3 windows.
I think what you're doing is cool and has to be done for the sake of
exploration, but I would really, really focus on navigation. Also, how are you
supposed to focus on text when reading or writing if it's wiggling all the
time? It seems to me that VR really is for games, but for working with text I
think humans are better off with stationary screens.

On second thought: if you made it easier to "fix" a window, such that one can
focus on it and work (as when you are "placing" a new window in your video),
that could make it a workable solution.

~~~
kanetw
Navigation is one of the things we definitely need to improve upon.

I think the wiggling is actually kind of natural -- you don't notice your head
or eye movements when you do them automatically, but it might be exacerbated
in VR.

The fixing is an interesting idea, we'll try it out. I'm a bit worried it
might feel weird in VR.

~~~
ggreer
YouTube has support for VR videos. If you could record & render such that
people with headsets could watch a session, that might help with interest and
adoption.

I think navigation needs eye tracking, which sadly no headset currently
supports. Focus-follows-gaze would be a game changer.

~~~
georgewsinger
Uploading a YouTube VR video of a work session is a potentially fantastic
idea. Thanks for the suggestion.

Right now, Simula uses "dumb" eye tracking, in the sense that windows receive
keyboard and cursor focus when the user's forward gaze intersects a
window. We also let users control the cursor focus with their forward gaze
(presently bound to `Super + Apostrophe`); similarly, users can drag windows
around by holding `Super + Alt` and looking around. The experience adds up to
something quite productive once you learn all the keyboard shortcuts (your
fingers don't need to leave the keyboard).
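The gaze-focus rule described above (a window becomes active when the user's forward gaze intersects it) boils down to a ray/rectangle intersection test. Here is a minimal sketch in Python; Simula itself is written in Haskell and its real code differs, so every name and convention below is invented for illustration:

```python
from math import sqrt

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def sub(a, b):
    return tuple(x - y for x, y in zip(a, b))

def cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def gaze_hits_window(eye, gaze, center, normal, half_w, half_h):
    """True if the forward gaze ray intersects a rectangular window.

    Assumes `normal` is a unit vector and the window is roughly upright
    (i.e. `normal` is not parallel to world-up)."""
    denom = dot(gaze, normal)
    if abs(denom) < 1e-6:               # gaze parallel to the window plane
        return False
    t = dot(sub(center, eye), normal) / denom
    if t < 0:                           # window is behind the viewer
        return False
    hit = tuple(e + t * g for e, g in zip(eye, gaze))
    # Build in-plane axes so the hit point can be tested in window coords.
    right = cross((0.0, 1.0, 0.0), normal)
    norm = sqrt(dot(right, right))
    right = tuple(c / norm for c in right)
    up = cross(normal, right)
    local = sub(hit, center)
    return abs(dot(local, right)) <= half_w and abs(dot(local, up)) <= half_h
```

In a compositor, a test like this would run per frame over all windows, and the first hit along the gaze ray would receive keyboard and cursor focus.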

------
carapace
Been thinking about VR workspaces since sometime in the '80s. Broadly, they
suck. It seems so cool, but in practice VR adds pointless overhead to efficient
UI. Windows with affine transformations _suck_ at their _one job_. Very few
people use Second Life as an IDE.

The big win, as far as I can tell, would be to engage the user's spatial
memory. (There is a small but non-zero niche for 3D visualization of complex
systems: weather, large molecules, etc.) You're going to want to combine
"memory palace" with "zooming" UI in a kind of pseudo-3D (I think of it as
2.618...D, but the exact fractal dimension isn't important, I don't think).
Then infuse with Brenda Laurel's "Computers as Theatre"...

[https://en.wikipedia.org/wiki/Second_Life](https://en.wikipedia.org/wiki/Second_Life)

[https://en.wikipedia.org/wiki/Method_of_loci](https://en.wikipedia.org/wiki/Method_of_loci)

[https://en.wikipedia.org/wiki/Zooming_user_interface](https://en.wikipedia.org/wiki/Zooming_user_interface)

[https://en.wikipedia.org/wiki/Brenda_Laurel](https://en.wikipedia.org/wiki/Brenda_Laurel)
\- [https://www.goodreads.com/book/show/239018.Computers_as_Thea...](https://www.goodreads.com/book/show/239018.Computers_as_Theatre)

~~~
echelon
They might suck right now, but this is a relatively nascent application of the
technology.

Wait until resolution improves and we break out of the "desktop" paradigm. We
could have a collection of unlimited windows and tabs that exist in a
continuum around us, and we could use gestures to organize and surface the
contextually relevant ones.

We won't need a bulky multi-monitor setup, and we could work remotely nearly
anywhere. Imagine carrying your workspace with you.

> The big win, as far as I can tell, would be to engage the user's spatial
> memory.

Absolutely! Physical workspaces and work benches are incredibly functional
because we are spatial animals. Breaking out of the limitations of using a
screen could unlock more of our senses for use in problem solving.

I'm extremely excited about this technology. It will be great for software
engineers, creatives (2d and 3d artists), mechanical engineering, CAD, ... you
name it.

I really hope this keeps getting pushed forward. While I'm using all of my
spare cycles on a tangentially-related problem domain, I'd be more than happy
to donate money and ideas. This technology will be a dream come true if it
continues to mature.

~~~
IdiocyInAction
> Wait until resolution improves and we break out of the "desktop" paradigm.
> We could have a collection of unlimited windows and tabs that exist in a
> continuum around us, and we could use gestures to organize and surface the
> contextually relevant ones.

You can get 90% there by just using multiple desktops IMHO, at least that's my
experience.

~~~
Robotbeat
I think in the mid-term, the big win would be the ability to have 4-10 large
monitors, but with a cheaper, mobile, and compact solution, and with the eyes
focused further away.

Headsets have improved roughly 2x in resolution, have halved in price, and
some have become wireless with better optics. That's a long way from 4-5 years
ago, but they still need another doubling of resolution (or maybe more), an
increase in wearing comfort (lighter, more compact), an improvement in
wireless latency, and maybe a price drop from $400 to $300; then you're
looking at something that would be useful just as a replacement for multiple
monitors.

Plus, probably, improvements in automatically registering where your laptop
and mouse are. In principle that could be done with a software update to the
inside-out tracking software.

Additionally, some improvement is possible even at roughly the current
resolution, with improved subpixel rendering and RGB pixel layout.

Seeing what has been done with the Oculus Quest since I last checked out VR
about 3 years ago has left me pretty impressed. A lot of this stuff with
multiple windows in this demo could be done natively and wirelessly with the
Quest (which runs a kind of Linux). The inside-out tracking is impressively
good. If combined with an insert that lets you put your tracked controllers on
a Bluetooth mouse and keyboard (so the Quest can register their positions in
3D space and render them properly in-headset), it could give you a high-
productivity workstation experience just about anywhere with WiFi (which could
be through a phone). Hand-tracking (which works already) could even allow
gestures, although I'm not sure how important that is. Can the Quest do
subpixel text rendering like ClearType, but in 3D?

------
hello_there
This looks very nice! I've been thinking about something like this for quite a
while. May I suggest two additional features that I would love to see in such
a window manager:

1\. A way to neatly arrange all the windows on a virtual sphere that surrounds
the user, possibly arranging them automatically in a similar manner as a
tiling window manager.

2\. A way to rotate the aforementioned sphere around you without forcing the
user to rotate their head. This would avoid much of the neck strain. It could
be done, for example, by holding a button on the keyboard while moving the
mouse, or by a simple keyboard shortcut that rotates the sphere by X degrees
in any direction.

This concept could also be extended to virtual desktops where each desktop is
a sphere around the next, like an onion, with the ability to "zoom" in to the
next desktop.
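Suggestion 1 (windows auto-tiled on a surrounding sphere) could be sketched as a lat/long grid centered on the user's forward direction. This is a hypothetical illustration; the function name, parameters, and defaults are all invented:

```python
from math import cos, sin, radians

def sphere_layout(n_windows, radius=2.0, cols=4, h_fov=120.0, v_fov=60.0):
    """Place window centers on a virtual sphere around the user (origin).

    Windows fill a `cols`-wide grid spread across the given horizontal and
    vertical fields of view, centered straight ahead (-z, y-up).
    Returns a list of (x, y, z) positions, all at distance `radius`.
    """
    rows = (n_windows + cols - 1) // cols
    positions = []
    for i in range(n_windows):
        r, c = divmod(i, cols)
        # Spread columns across h_fov and rows across v_fov, centered on 0.
        yaw = radians((c - (cols - 1) / 2) * (h_fov / max(cols - 1, 1)))
        pitch = radians((r - (rows - 1) / 2) * (v_fov / max(rows - 1, 1)))
        x = radius * cos(pitch) * sin(yaw)
        y = radius * sin(pitch)
        z = -radius * cos(pitch) * cos(yaw)
        positions.append((x, y, z))
    return positions
```

The onion-of-spheres idea for virtual desktops would then just vary `radius` per desktop, with "zooming" interpolating the user between shells.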

~~~
philsnow
> 2\. A way to rotate the aforementioned sphere around you without forcing
> the user to rotate their head. This would avoid much of the neck strain. It
> could be done, for example, by holding a button on the keyboard while moving
> the mouse, or by a simple keyboard shortcut that rotates the sphere by X
> degrees in any direction.

One more thing for the author to play around with (I don't have the hardware
to try it myself): experiment with speed/acceleration. The mouse pointer has
speed and acceleration parameters that affect how it moves, so that, if you
want, you only have to move your mouse very little in real dimensions to move
it thousands of pixels.

It might cause motion sickness, but maybe you can get away with
pitching/yawing 5 degrees for every 1 degree of head movement, so to look
"straight up" you only have to tilt your head up 18 degrees. Hopefully you'd
still have the illusion of being oriented in a space.
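The amplified head-rotation idea reduces to applying a gain to the physical head angles; with a gain of 5, an 18-degree physical tilt yields the 90-degree "straight up" mentioned above. A minimal sketch (the gain value and clamping behavior are illustrative choices, not from any real system):

```python
def amplified_view_angles(head_yaw_deg, head_pitch_deg, gain=5.0):
    """Map physical head rotation to virtual view rotation with a gain.

    With gain=5, tilting the head up 18 degrees points the virtual view
    straight up (90 degrees). Pitch is clamped so the virtual view can't
    flip past straight up or straight down.
    """
    virt_yaw = head_yaw_deg * gain
    virt_pitch = max(-90.0, min(90.0, head_pitch_deg * gain))
    return virt_yaw, virt_pitch
```

In a real compositor this mapping would be applied to the view matrix each frame, which is exactly where the motion-sickness risk comes from: the rendered rotation no longer matches the vestibular signal.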

~~~
ggreer
> It might cause motion sickness, but maybe you can get away with
> pitching/yawing 5 degrees for every 1 degree of head movement, so to look
> "straight up" you only have to tilt your head up 18 degrees.

That would cause extreme motion sickness.

------
masswerk
While the demo is impressive, I'm a bit unhappy with the naming. Mind that
Simula [0] is a historically enormously important programming language. (The
language, which introduced object-oriented programming concepts, developed by
Ole-Johan Dahl and Kristen Nygaard, 1962-1987; Dahl and Nygaard were awarded
the IEEE John von Neumann Medal and the A. M. Turing Award for this.)

[0]
[https://en.wikipedia.org/wiki/Simula](https://en.wikipedia.org/wiki/Simula)

------
ken
There's something a little humorous about a new VR system whose primary
feature is how well it can display VT100 emulators.

I don't know anyone who used MS Windows when their software was all still MS-
DOS programs. Windows really took off when programmers started writing
programs designed to take advantage of the native GUI paradigm.

Likewise, VR is never going to be accepted as a practical user interface until
it moves beyond the concept of "windows". I fully believe it's possible to
make a great VR user interface, but I can't believe it's going to look
anything like some rectangles floating in space.

This could be a VR-fvwm. What we really need is a VR-GTK+.

~~~
tyrust
Programmers work with text and will continue to do so for the foreseeable
future. Terminal-based editors are powerful, so properly displaying terminals
is a great goal.

What would a non-rectangular UI for manipulating text look like?

~~~
ken
> Programmers work with text and will continue to do so for the foreseeable
> future.

Programmers work with 2D overlapping windows and will continue to do so for
the foreseeable future. I'm giving a criterion for when that's going to
change. They're connected. You can't beat a 2D display for displaying 2D data.

> What would a non-rectangular UI for manipulating text look like?

Looked at a DOM Inspector recently (HTML)? Or Computed Styles (CSS)? Or
Network Activity (HTTP)? All of the most common text formats I use, I view
through a non-text interface. These aren't inherently 2D data streams --
that's just what we do because they're being put on a 2D display. All of them
would be even more useful in 3D.

That's not even counting the biggest classical use for an extra dimension:
time, e.g., version control history, animation state, or database transactions
or migrations.

~~~
tyrust
I don't really do much frontend web programming, so those examples don't
resonate much with me. I see your point, though, with your last examples.

It's still very difficult for me to imagine how I would translate interfaces
I'm used to into another dimension. Even if I picture something floating in
front of me, I only perceive a 2D projection of it. Perhaps with clever
transparency or by rotating it around I could receive more information than I
would in 2D.

I feel like a character in Flatland, stuck in my own dimension.

------
_anastasia
Sounds like this would work nicely with vim³[0]

[0][https://github.com/oakes/vim_cubed](https://github.com/oakes/vim_cubed)

~~~
georgewsinger
Nothing improves productivity more than vim³.

~~~
disconcision
Indeed. vim³ is great, but it's hard to compete with nothing.

------
ollifi
Looks cool and I will try it out. That said, these things always feel like the
old paradigm stuffed into new technology. A bit like when film was new and
people kept using it to record theater: it took years to come up with editing,
with not having to start a scene by people entering the stage, etc. Now that
we have unlimited spatial space to manipulate, the thing we come up with is to
fill it with 2D screens.

~~~
vsareto
There aren't a lot of jobs that can really use the 'space' provided by VR. I
think the adoption here would have to drastically outpace multiple monitors
and OS workspaces and keyboard shortcuts. I can easily get to 10-20 tabs in VS
Code or a browser, but my attention is only really ever on one at a time. If I
have to search through them, I'll have to do that on a monitor or in VR. I
might be able to organize better in 3D: HTML files on the left, CSS files on
the right, JS in the middle, backend reference above, tools and console below
(like 5 monitors in a plus pattern).

I'd need probably a dozen or so channels of information that I had to cycle
between quickly to convince me, given I've gotten used to multi-monitors,
workspaces, and keyboard shortcuts. Maybe some futuristic high level analyst
job that needs to look through lots of different kinds of information
(geographical, text, videos, etc) at once.

~~~
netsharc
I can imagine a web editor with some "3d space" flair. You have your
HTML/JS/CSS editor in front of you, but imagine strands connecting CSS rules
to the HTML elements in the preview window, which is floating slightly to your
right. Or the page being previewed can even be the size of a 5-storey building
which seems to be standing across the street.

For another interesting use case, how about a 3d debugger that lets you follow
a program flow over multiple classes and stacks...

------
daxterspeed
It seems to me like VR could get some great usage of vector assets. Having
icons and especially text render perfectly regardless of how close you get
seems ideal.

~~~
kroltan
Yes! In fact, games use another method that could also be very beneficial in
VR: signed distance-field rendering. As far as I know it's mainly used for
text, but there is nothing specific stopping it from working with other kinds
of images.

Essentially you get your vector data, and map it into a texture of distance
from the boundary, with positive being outside the boundary, and negative
inside. Then you apply a shader to that resulting texture to make it
presentable for the user. This means crisp contours/outlines at all but the
most extreme proximity, and easy implementation of other common vector effects
(such as bevels, shadows, etc).
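The distance-field idea described above can be shown with a brute-force sketch: map a binary image to signed distances from the shape boundary, then let a trivial "shader" threshold the field. Real implementations precompute the field on the GPU with far faster algorithms; this O(n^2) version is only to show the data flow:

```python
def signed_distance_field(mask):
    """Brute-force signed distance field for a small binary image.

    mask: 2D list of 0/1 (1 = inside the shape). Each output pixel holds
    the distance to the nearest pixel of the opposite value, negated
    inside the shape. Assumes the mask contains both values.
    """
    h, w = len(mask), len(mask[0])
    cells = [(x, y) for y in range(h) for x in range(w)]
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            opposite = [(cx, cy) for cx, cy in cells
                        if mask[cy][cx] != mask[y][x]]
            d = min(((cx - x) ** 2 + (cy - y) ** 2) ** 0.5
                    for cx, cy in opposite)
            out[y][x] = -d if mask[y][x] else d

    return out

def shade(sdf_value, threshold=0.5):
    """The 'shader' step: a crisp inside/outside test that stays sharp at
    any magnification, since the field is interpolated, not the bitmap."""
    return 1.0 if sdf_value < threshold else 0.0
```

Effects like outlines, bevels, or shadows fall out of the same field: they are just different thresholds or offsets applied in the shader.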

------
lostmsu
Wow, it is written in Haskell.

~~~
1MachineElf
I was also impressed to see Nix package manager given first-class treatment.

~~~
kanetw
Packaging this was an absolute nightmare due to various distributions shipping
incompatible library versions and so on.

Nix, despite issues with everything OpenGL related, was the only thing that
actually worked. And Cachix made installation really fast.

PS: if it starts building for some reason during the install process, let us
know. We've tested it on various machines but due to some peculiarities
specific configurations might require a rebuild even with Cachix. We can add
that to the repo so that future installs and updates are faster.

------
htfy96
There was a similar product for Windows in 2016:
[https://techcrunch.com/2016/06/28/office-in-vr/](https://techcrunch.com/2016/06/28/office-in-vr/)

Disclaimer: I worked at that company at the time

------
war1025
Looks like a nightmare to me, but I also prefer a single monitor with
workspaces where most of my coworkers seem to feel the more monitors the
better.

------
wdroz
I think that in 1-2 generations of VR headsets, we will no longer need
monitors. For now, my HTC Vive isn't comparable to my 4K screens.

~~~
_emacsomancer_
I think VR desktops look really cool, but I'm not sure I'd want to _have_ to
wear a VR headset every time I used my computer. Monitors will surely stick
around in parallel.

~~~
stragies
Until the VR headset comes integrated into contact lenses (my personal dream
ever since I started needing reading glasses after passing 40).

------
jstanley
Possibly a stupid question, but is it possible to try this out on an Oculus
Quest? E.g. with a link cable?

~~~
greggman3
I have a Rift, not a Quest, so I don't know if they're the same, but on the
default Rift OS you can pull your desktop and every window on your desktop up
in VR, even in the middle of running another VR app. I had a browser open in
the middle of playing Half-Life: Alyx. I did the same in No Man's Sky VR.

Someone asked if I could watch all the Star Wars movies at the same time. I
got 6 trailers running at once in 6 separate virtual monitors with the default
built in OS feature.

~~~
jstanley
The Quest runs Android and is a stand-alone device; it's not just a display
unit for your PC.

~~~
greggman3
Yes, but there's the PC link option for the Quest, so if you have the link do
you get the same experience as a Rift including being able to pull all your
Windows desktop windows individually into VR?

------
drusepth
This looks absolutely awesome, but I think there's a bit more to be done RE
neck strain and general ergonomics for the vast majority of people.

If you are at a desk and/or without a swivel chair, you're pretty limited to a
grid of ~9 screens directly in front of you, and even then it looks like
you're turning your head a TON to see the outer layer.

Some ideas that might help:

1) To use an analogy, could you increase the "mouse speed" of moving your
head? Right now it's 1:1 with the virtual space and makes geometric "sense",
but it might be nice to be able to focus on screens on the left/right by
moving your head less and having your focus move just as fast. For example,
moving your head 50% of what you do now to zoom over to the same place would
reduce neck strain when looking at those windows, but also open up a bit more
window space if, for example, you could still move your neck 100% of what
you're doing and be able to "look over" far enough to fit another set of
screens even further out. I don't know if this disconnect in pan speed would
be disorienting, but it's worth trying. :) c.f.
[https://i.imgur.com/0KCo9hG.png](https://i.imgur.com/0KCo9hG.png)

2) Keyboard gestures to "re-center" your focus might also go a long way.
Perhaps instead of craning your head to look at a window that isn't "straight
ahead" for a long time (as long as you're working in that window), you could
do a quick glance at that window and hit a keyboard shortcut that would
recenter wherever you're looking to be your "straight ahead", so you could
then straighten your neck back out and continue working in that window while
in a more comfortable position. c.f.
[https://i.imgur.com/u6SI284.png](https://i.imgur.com/u6SI284.png)

3) Likewise, we don't all have amazing headset resolutions and there's a lot
of potentially wasted space whenever focusing on a particular window that's
not "full-screened". Perhaps a keyboard shortcut to expand whatever screen
you're looking at to the full size of your view would be helpful for eye
strain. Even something that could temporarily "pin" a window to your view
(full screen and moving with your gaze) might go a long way toward minimizing
unused focus/pixels. c.f.
[https://i.imgur.com/yA1twKf.png](https://i.imgur.com/yA1twKf.png)
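Idea 2 (recentering "straight ahead" onto wherever you're currently looking) amounts to storing an angular offset and subtracting it from subsequent head poses. A hypothetical sketch, with all names invented:

```python
class GazeRecenter:
    """Keyboard-shortcut recentering: the window you're currently looking
    at becomes the new 'straight ahead', so you can straighten your neck
    and keep working in it."""

    def __init__(self):
        self.yaw_offset = 0.0
        self.pitch_offset = 0.0

    def recenter(self, head_yaw, head_pitch):
        """Call on the shortcut: the current head pose becomes the origin."""
        self.yaw_offset = head_yaw
        self.pitch_offset = head_pitch

    def view_angles(self, head_yaw, head_pitch):
        """Virtual view direction relative to the recentered origin."""
        return head_yaw - self.yaw_offset, head_pitch - self.pitch_offset
```

Because this applies a constant rotation rather than a continuous gain, it should be much gentler on the vestibular system than remapping head speed.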

My immediate thought watching this was to mount a keyboard on a swivel chair,
but I think requiring end-users to get customized hardware to take full
advantage of your environment might be the wrong way to go. :)

~~~
IggleSniggle
Having spent some time in VR in the past: remapping inputs like "the mouse
speed of your head" can be super disorienting when you return to the "real
world," perhaps dangerously so if you spend the majority of your workday in
VR.

Your brain adapts, and starts to believe at a primal level that "things fall
this speed when I drop them," "I move this fast when I walk", "when I point
over there, it's at this angle," etc.

If anybody remembers playing GTA for too long, and later walking outside with
the feeling that you ought to be able to jump in the nearest car and drive it
away, you have an inkling of how your brain gets remapped.

Remarkably quickly, moving around in the real-world can feel "off" to the same
degree that it felt "off" when you first entered the VR world when you start
messing with these primal feedback loops. Get ready to bump into things, and
don't operate any heavy machinery if you've been in a non-1:1 VR mapping.

~~~
ninkendo
After playing half-life: alyx for a few hours straight the other day, my mind
kept wanting to hit the teleport button to move around my house instead of
just walking. It was a very strong reaction.

------
jitl
Well now I have something to use my SteamVR after I finish Half-Life: Alyx

~~~
andybak
If you're stuck I can name a few dozen other cool things to do.

It's not like Alyx is the only reason to own a VR headset.

~~~
fcoury
Please do share, I am new to this world as I just got a headset.

~~~
andybak
That's a bit like saying "Just got a monitor. What's good to look at on it?"
;-)

The world of VR is so diverse it's tricky to recommend things without knowing
you.

Personally I'm not much of a gamer, so I tend to be interested in narrative
stuff, abstract visualisation and geometry stuff, or just exploring
interesting environments.

And the games I do like tend to have one or more of the above qualities.

You might be an RPG fan, a strategy gamer, an art aficionado, someone looking
for productivity or educational tools, or any number of things.

------
MereInterest
This has been the main thing that I have wanted out of VR, and want to try
whenever it is that I get one. I'm glad to see that it exists, and that it is
usable for my primary goal of coding.

------
ganzuul
I really hope we soon get UI elements which are not flat, with both form and
function.

Volumetric video, spatial website-navigation design guidelines, and augmented
reality are high on my wish-list.

~~~
georgewsinger
Can you elaborate what you mean by volumetric video and spatial navigation?

~~~
ganzuul
Video with z-depth, maybe computed from stereoscopic video. The added
complexity would then be similar to an alpha channel. This would break video
out of the rectangles we see in all UI design.

Wikipedia articles follow a format which makes it easy to find information.
Perhaps site maps for a VR web could be arranged according to a convention, so
that users can easily figure out where to look for specific information. Early
VR web tried to do this by arranging content like rooms in a building, and
streets, but that is cumbersome to use in practice. I think the wiki format
suggests a much better approach can be created.

------
viklove
Seems like this doesn't have any relation to xrdesktop[1], which doesn't have
mouse support! Really glad to see they have it here, any VR workspace is
completely unusable if I have to use motion controls to simulate a mouse. Will
definitely be trying this out tonight.

[1]
[https://gitlab.freedesktop.org/xrdesktop/xrdesktop](https://gitlab.freedesktop.org/xrdesktop/xrdesktop)

~~~
georgewsinger
xrdesktop has a very talented team of hackers working on it (so I have no
doubt they'll fix the mouse issue!).

Our approach with mouse cursors has been to give every window its own mouse
cursor. This is possible in Simula since the active window is the one the
user is currently gazing at, so that, for example, if you gaze at one window
and move its cursor, it won't affect the other windows' cursors. We also
allow users to control mouse cursors with their eye gaze (presently bound to
`Super + Apostrophe`), which is a good productivity boost.

In fact, you can do everything in Simula with just the keyboard (no mouse or
VR controllers needed): move windows, move the mouse cursor, click the mouse
cursor, etc. Once you learn the shortcuts, it's very quick.
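The per-window cursor model described above can be sketched as a map from window to cursor position, where mouse motion only reaches the window currently under the user's gaze. This is an illustrative Python sketch, not Simula's actual Haskell implementation:

```python
class PerWindowCursors:
    """Each window keeps its own cursor; mouse motion only moves the
    cursor of the gazed-at window, leaving all other cursors in place."""

    def __init__(self):
        self.cursors = {}          # window id -> (x, y) cursor position
        self.gazed_window = None

    def set_gaze(self, window_id):
        """Gaze moved onto a window; give it a cursor if it lacks one."""
        self.gazed_window = window_id
        self.cursors.setdefault(window_id, (0, 0))

    def move(self, dx, dy):
        """Relative mouse motion, routed only to the gazed-at window."""
        if self.gazed_window is None:
            return
        x, y = self.cursors[self.gazed_window]
        self.cursors[self.gazed_window] = (x + dx, y + dy)
```

The payoff of this routing is exactly what the comment describes: glancing away and back never disturbs where you left a window's cursor.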

------
nixpulvis
I play a decent amount of flight sims in VR, and my eyes simply wouldn't be
able to handle any more of this than they already do for the sake of game
immersion. Unless there's some new headset out there that fixes this problem,
I'm not sure this will work for me yet.

I await the day my whole system is an AR/VR hybrid, but I suspect it'll be a
while still.

~~~
seanmcdirmid
The tech is still getting better, so it isn't like VR headsets are stagnant
technology-wise. I can see resolutions getting higher, and maybe AMOLEDs being
used so there's less backlight hitting your eyes all the time.

I've been spending an hour and a half in VR everyday as part of my quarantine
workout, and it seems fine, though I'm not really reading anything in that
environment.

------
noisy_boy
Does this work with Samsung Gear VR[0]? I got one for free but never really
used it much.

[0]: [https://www.samsung.com/us/mobile/virtual-reality/gear-vr/ge...](https://www.samsung.com/us/mobile/virtual-reality/gear-vr/gear-vr-with-controller-sm-r324nzaaxar/)

------
axegon_
Very clever idea actually; major props. Unfortunately I get really sick from
VR, so it's not going to be for me, but I can definitely see how someone could
take advantage of it. In the WFH era, for people who have tiny laptops, this
could do the job. I personally hook mine up to the TV as a second screen, but
still... very clever!

~~~
kanetw
Or in cramped environments where you don't always have space for a multi-
monitor setup.

------
iamwil
How does it get the image of an application inside of SimulaVR? Is it
something weird like grabbing a graphics buffer? Or is it launching the
applications as a child process so it can access its viewport? Or is it asking
the windows manager? I'm not sure how this part works and am curious.

~~~
kanetw
We implement a Wayland compositor interface via `wlroots`. Windows can launch
like usual in our environment (i.e. via DISPLAY or WAYLAND_DISPLAY).

X11 apps are supported via Xwayland.

------
thecrumb
I remember seeing something like this during one of the early MagicLeap demos
and was intrigued. If the headset was lightweight enough this would rock. In
that video the screen remained in 'front' of the viewer and he could swipe
left or right to switch screens.

------
jimmySixDOF
I can easily see this in a day-trader application (Forex/Bloomberg/etc.): you
could run a live-feed 4x2 dashboard matrix with a "click"-through drill-down
experience, or similar. So many possibilities. Nice work with big upside
potential.

------
bitwize
Fantastic! Now I just need a SinoLogic 16 with Sogo-7 data gloves and Thomson
eyephones. :)

------
sfj
What about an underwater setup where the VR headset is merged into a scuba
mask?

Eventually we could move on to having tubes implanted for feeding and waste.
You could stay in there indefinitely, forget you ever even existed outside the
simulation...

------
snickell
Very cool work; the text quality improvements are particularly important.
Could you talk about the approach to improving text quality? Are you
implementing projected font hinting, by any chance?

~~~
kanetw
The core issue with surface (text) quality is essentially that classic texture
filtering fails for surfaces here.

So we set a higher DPI than usual, and then use a supersampling algorithm with
a suitable kernel as a lowpass filter. This allows us to avoid artifacts while
maintaining sharpness.

------
matsemann
About eye strain: how is focusing in a VR headset? Is it always at a fixed
depth, given that it's really just flat screens, or does one have to focus
differently when things are "far away"?

~~~
stronglikedan
To add to your questions - What about old farts with reading glasses, like me?
Do they _just work_ with VR, or are they even needed?

~~~
kanetw
I wear glasses and VR is fine for me. I know some people who can't use it with
glasses, but I think that's for clearance reasons.

There's prescription lenses for headsets, but they're fairly expensive and IMO
only worth it if you use VR a lot.

~~~
stronglikedan
thanks. Good to know. I'll be sure to bring my reading glasses when I go try
on headsets. Something I would have likely neglected to do otherwise.

~~~
wincy
I’m pretty sure you never focus on anything up close in VR. It’s all “far
away”; as someone who is near-sighted, I absolutely have to use glasses in VR.
I suspect, but am not certain, that some far-sighted people might not need
correction, as something may seem close up but really it’s just “big and far
away”.

Try it!

~~~
outworlder
> It’s all “far away”, as someone who is near sighted I absolutely have to use
> glasses in VR

Yeah, me too. As far as my eyes are concerned, most things are "far away".
Interestingly, though, I can read things in VR "farther" away than I can in
real life. I'm not exactly sure what the convergence looks like.

------
tambourine_man
Very cool demo but I don’t want to read/write (code or otherwise) at an angle.
It’s tiresome and straining.

Whatever I’m editing should be frontmost and dead center, IMO.

~~~
georgewsinger
We have a key binding that automatically orients the active window towards the
user's gaze. If you're interested, I could make a launch flag that leaves this
feature on by default, so that any window you look at automatically orients
itself toward your gaze. This actually used to be the default behavior in
Simula a few iterations ago, and worked nicely in most contexts.

------
haolez
Not being able to see the keyboard would push my touch typing skills to the
limit! I hope it's doable, because this seems incredibly useful.

~~~
georgewsinger
When touch typing fails, Simula has a mouse & keyboard view:
[https://www.youtube.com/watch?v=D5c3Hfp8Hcw](https://www.youtube.com/watch?v=D5c3Hfp8Hcw)
Right now it's bound to `Super + w`.

~~~
haolez
Amazing! I had no idea this could be done in the current generation of VR
devices.

------
FpUser
One word: neck strain. As if carpal tunnel syndrome weren't enough, we now
get even more fun trying to hurt our necks.

------
cjohansson
I imagine the neck strain after a full work day. Ergonomics is very important.
I like the idea though

------
rijoja
Hmm, I think we have a common interest.

You can reach me at: [http://tbf-rnd.life/contact/](http://tbf-rnd.life/contact/)

Or, if you have some way for me to contact you, I'd be very happy to do so.

~~~
hugodutka
A tip for finding a GitHub user's email: choose any of their commits, open it
on GitHub, and add ".patch" at the end of the URL, like this [1]. There's a
good chance they used a real email to sign their commits.

[1]
[https://github.com/SimulaVR/Simula/commit/19cf46894cae1962e9...](https://github.com/SimulaVR/Simula/commit/19cf46894cae1962e9036052acc9c1237f4e871b.patch)
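The trick works because GitHub's `.patch` endpoint returns `git format-patch` output, which carries the commit author in a `From:` header. A small sketch of extracting it from the fetched text (the fetching itself is omitted; the sample header values are made up):

```python
import re

def author_email_from_patch(patch_text):
    """Extract the author email from git-format patch text.

    The first `From:` header line looks like
    `From: Jane Doe <jane@example.com>`. Returns None if no such header
    is present (or malformed).
    """
    m = re.search(r"^From: .*<([^>]+)>", patch_text, flags=re.MULTILINE)
    return m.group(1) if m else None
```

Note that users who enable GitHub's email privacy setting will show a `@users.noreply.github.com` address here instead of a real one.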

------
cjbassi
Is this for Xorg or Wayland? I couldn't find the answer in the readme.

~~~
kanetw
It's based on wlroots, but Wayland is less supported than the X compatibility
layer via Xwayland. That's fixable though.

~~~
DonHopkins
Developing a VR desktop on top of X-Windows seems like driving very fast down
a dead-end dirt road with a cliff at the end.

~~~
kanetw
We don't actually use X as the framework. The underlying code is Wayland, and
then Xwayland as a compatibility layer. It's just that most apps we test
against are X11.

------
yarrel
The name is familiar. :-)

------
thcleaner999
How much change would it require to run on macOS?

~~~
greggman3
It would require Apple to care about VR, gaming, and VR-capable GPUs. Except
for the Mac Pro, no Apple devices are powerful enough for VR. Even the top-end
MacBook Pro at $19000 still has an underpowered GPU, below the lowest VR
specs.

~~~
mjcohen
The highest possible price for a MacBook Pro is US $6099. How do you get
$19000? The iMac Pro can get to $14,299.

~~~
greggman3
Sorry, bad memory. Still, that's beside the point: Apple so far is neither
game- nor VR-friendly. I hope they change that stance. I'd prefer to be on a
Mac 100% of the time, but I have a Windows machine as well because that's
basically where all VR is happening.

------
enobrev
One step closer to having an Ono-Sendai

------
jbirer
Now make the windows wobbly and put them on a spinning cube.

------
DonHopkins
Soon after the invention of the movie camera, there was a "genre" of films
that consisted of nothing but pointing a movie camera at a stage and filming
a play in one shot.

That's the classic example of using a new technology to emulate an old
technology, without taking advantage of the unique advantages of the new
technology, before the grammar and language of film had been invented.

[https://en.wikipedia.org/wiki/Film_grammar](https://en.wikipedia.org/wiki/Film_grammar)

[https://en.wikipedia.org/wiki/History_of_film](https://en.wikipedia.org/wiki/History_of_film)

>The first decade of motion picture saw film moving from a novelty to an
established mass entertainment industry. The earliest films were in black and
white, under a minute long, without recorded sound and consisted of a single
shot from a steady camera.

>Conventions toward a general cinematic language developed over the years with
editing, camera movements and other cinematic techniques contributing specific
roles in the narrative of films. [...]

>In the 1890s, films were seen mostly via temporary storefront spaces and
traveling exhibitors or as acts in vaudeville programs. A film could be under
a minute long and would usually present a single scene, authentic or staged,
of everyday life, a public event, a sporting event or slapstick. There was
little to no cinematic technique, the film was usually black and white and it
was without sound. [...]

>Within eleven years of motion pictures, the films moved from a novelty show
to an established large-scale entertainment industry. Films moved from a
single shot, completely made by one person with a few assistants, towards
films several minutes long consisting of several shots, which were made by
large companies in something like industrial conditions.

>By 1900, the first motion pictures that can be considered "films" – emerged,
and film-makers began to introduce basic editing techniques and film
narrative.

Simply projecting desktop user interfaces designed for flat 2D screens and
mice into VR is still in the "novelty show" age, like filming staged plays
written for a theater, without any editing, shots, or film grammar.

VR window managers are just a stop-gap backwards-compatibility bridge, while
people work on inventing a grammar and language of interactive VR and AR user
interfaces, and re-implement all the desktop and mobile applications from the
ground up so they're not merely usable but actually enjoyable and
aesthetically pleasing to use in VR.

The current definition of "window manager," especially as it applies to
X-Windows desktops, tightly constrains how we think and what we expect of user
interface and application design. We need something much more flexible and
extensible. Unfortunately X-Windows decades ago rejected the crucially
important ideas behind NeWS and AJAX, that the window manager should be open-
ended and dynamically extensible with downloadable code, which is the key to
making efficient, deeply integrated user interfaces.

For example, the "Dragon NaturallySpeaking" speech synthesis and recognition
system has "dragonfly", a Python-based "speech manager" that is capable of
hooking into existing unmodified desktop applications and scripting custom
speech-based user interfaces.

[https://github.com/t4ngo/dragonfly](https://github.com/t4ngo/dragonfly)
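To illustrate the concept (this is a toy sketch of the command-grammar idea,
not the actual dragonfly API): a "speech manager" essentially maps phrase
specs with slots, like `"focus <window>"`, onto callbacks that drive existing
applications.

```python
import re

class SpeechManager:
    """Toy dispatcher: spoken phrases with slots mapped to actions."""

    def __init__(self):
        self.rules = []  # list of (compiled regex, callback) pairs

    def add_rule(self, spec, callback):
        # Turn a spec like "focus <window>" into a regex with named groups.
        pattern = re.sub(r"<(\w+)>", r"(?P<\1>.+)", spec)
        self.rules.append((re.compile(f"^{pattern}$"), callback))

    def recognize(self, utterance):
        # Dispatch a recognized utterance to the first matching rule.
        for regex, callback in self.rules:
            m = regex.match(utterance)
            if m:
                return callback(**m.groupdict())
        return None

mgr = SpeechManager()
mgr.add_rule("focus <window>", lambda window: f"focusing {window}")
mgr.add_rule("close <window>", lambda window: f"closing {window}")

print(mgr.recognize("focus emacs"))  # focusing emacs
```

Real systems like dragonfly do the hard part this sketch skips: getting the
utterance from a speech engine and injecting the resulting keystrokes or
window-manager actions into unmodified applications.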

Another, more ambitious example is Morgan Dixon's work on Prefab, which
screen-scrapes the pixels of desktop apps and uses pattern recognition and
composition to remix and modify them. This is like cinematographers finally
discovering they can edit films: cut and splice shots together, overlay text
and graphics, pictures-in-pictures, and adjacent frames. But Prefab isn't
built around a scripting language like dragonfly, NeWS, or AJAX.
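A toy sketch of the core idea behind pixel-based widget detection (nothing
here is Prefab's actual algorithm, which must cope with theming and
variable-size widgets; this only shows exact template matching over a tiny
made-up "screen"):

```python
def find_template(screen, template):
    """Return (row, col) of the first exact occurrence of template in screen."""
    th, tw = len(template), len(template[0])
    sh, sw = len(screen), len(screen[0])
    for r in range(sh - th + 1):
        for c in range(sw - tw + 1):
            if all(screen[r + i][c + j] == template[i][j]
                   for i in range(th) for j in range(tw)):
                return (r, c)
    return None

# 0 = background, 1 = widget border, 2 = widget fill (invented "pixel" values)
screen = [
    [0, 0, 0, 0, 0],
    [0, 1, 1, 1, 0],
    [0, 1, 2, 1, 0],
    [0, 1, 1, 1, 0],
    [0, 0, 0, 0, 0],
]
button = [[1, 1, 1],
          [1, 2, 1],
          [1, 1, 1]]

print(find_template(screen, button))  # (1, 1)
```

Once a widget is located this way, a system can overlay new graphics on it or
synthesize clicks at its coordinates, which is how pixel-level remixing of
closed-source interfaces becomes possible.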

Here's some stuff I've written about the direction user interfaces should
take to move beyond the antique notion of "window managers" and enable much
deeper integration, accessibility, and alternative input and output methods.

[https://news.ycombinator.com/item?id=14182061](https://news.ycombinator.com/item?id=14182061)

>Glad to see people are still making better window managers! [...] I think
extensibility and accessibility are extremely important for window managers.
[...] I'd like to take that idea a lot further, so I wrote up some ideas about
programming window management, accessibility, screen scraping, pattern
recognition and automation in JavaScript. [...] Check out Morgan Dixon's and
James Fogarty's amazing work on user interface customization with Prefab,
about which they've published several excellent CHI papers: [...]

>Imagine if every interface was open source. Any of us could modify the
software we use every day. Unfortunately, we don't have the source.

>Prefab realizes this vision using only the pixels of everyday interfaces.
This video shows the use of Prefab to add new functionality to Adobe
Photoshop, Apple iTunes, and Microsoft Windows Media Player. Prefab represents
a new approach to deploying HCI research in everyday software, and is also the
first step toward a future where anybody can modify any interface.

[https://news.ycombinator.com/item?id=18797818](https://news.ycombinator.com/item?id=18797818)

>Here are some other interesting things related to scriptable window
management and accessibility to check out: aQuery -- Like jQuery for
Accessibility

[https://donhopkins.com/mediawiki/index.php/AQuery](https://donhopkins.com/mediawiki/index.php/AQuery)

>It would also be great to flesh out the accessibility and speech recognition
APIs, and make it possible to write all kinds of intelligent application
automation and integration scripts, bots, with nice HTML user interfaces in
JavaScript. Take a look at what Dragon Naturally Speaking has done with
Python:

[https://github.com/t4ngo/dragonfly](https://github.com/t4ngo/dragonfly)

>Morgan Dixon's work with Prefab is brilliant.

>I would like to discuss how we could integrate Prefab with a Javascriptable,
extensible API like aQuery, so you could write "selectors" that used prefab's
pattern recognition techniques, bind those to JavaScript event handlers, and
write high level widgets on top of that in JavaScript, and implement the
graphical overlays and gui enhancements in HTML/Canvas/etc like I've done with
Slate and the WebView overlay.

------
co_dh
htop at top, my neck.

