I love the edit markers in the last screenshot, but I'm skeptical about the performance of a text editor running in WebKit. All of my experiences writing text online have been bad, so the jump to using WebKit for an entire dev environment fills me with dread.
The nice thing about Vim mode is that it glosses over latency by making simple edits take fewer keystrokes. It's sort of why vi was invented.
In the words of Bill Joy himself:
> ... you've got to remember that I was trying to make it usable over a 300 baud modem. That's also the reason you have all these funny commands. It just barely worked to use a screen editor over a modem. It was just barely fast enough. A 1200 baud modem was an upgrade. 1200 baud now is pretty slow.
> 9600 baud is faster than you can read. 1200 baud is way slower. So the editor was optimized so that you could edit and feel productive when it was painting slower than you could think. Now that computers are so much faster than you can think, nobody understands this anymore.
Brackets is a decent programmer's text editor developed by Adobe and built on WebKit. It works fine. The performance and polish of web editor components like CodeMirror have come a long way in the past couple of years.
I thought CodeBox was using the ACE editor as part of their system, like Cloud9 does. And I believe Orion and Codenvy (my company) use CodeMirror. Looking forward to trying out Atom; I can already see many advantages of it as an editor.
FYI, it's not WebKit but Chromium, I'm betting, based on the crash reporting. If you want an idea of the performance, Edge Code from Adobe will provide an answer. Oh, and there's a welcome repo with an IRC channel listed. Atom.io just says "nope" for me, but it's likely the editor's future home.
I'm currently working on a project that uses Google Maps in a few places. The main reason I'm not using OSM is gmaps.js. Ideally I'll find or create a similarly simple wrapper on top of OSM in the near future.
I thought about that the other day (development in general), but I couldn't think of any actual use cases. I always want to keep my head straight when I'm coding, but maybe you could use head movements as commands? E.g., tilting your head down slightly scrolls down, tilting it up scrolls up, and so on.
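A minimal sketch of that mapping, assuming a hypothetical head tracker that reports pitch in degrees (positive meaning tilted down); the dead zone is there so ordinary head movement doesn't trigger scrolling:

```python
# Sketch: map head pitch to scroll commands. The pitch source would be
# some head tracker (hypothetical here); only the mapping logic is shown.
DEAD_ZONE_DEGREES = 10.0   # ignore tilts smaller than this
LINES_PER_DEGREE = 0.5     # scroll speed outside the dead zone

def scroll_lines(pitch_degrees: float) -> int:
    """Positive pitch = head tilted down = scroll down (positive lines)."""
    if abs(pitch_degrees) <= DEAD_ZONE_DEGREES:
        return 0  # small movements are just the user looking around
    excess = abs(pitch_degrees) - DEAD_ZONE_DEGREES
    lines = round(excess * LINES_PER_DEGREE)
    return lines if pitch_degrees > 0 else -lines
```

With these numbers, a 20-degree downward tilt scrolls 5 lines down; anything within 10 degrees of level does nothing. The constants are obviously made up and would need tuning per user.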
That sounds more like a use case for eye tracking. What about, say, a torus or sphere of virtual displays accessible via Rift? I don't think I'd actually want that, but it's possible someone might, although the resolution might be poor compared to an ordinary IPS panel -- I'm not familiar enough with the Rift to know for sure.
On the whole, I'm less excited by the Rift, as a developer, than I thought I would be; I'm just not seeing all that much in the way of use cases for it. (As a gamer, though, I'm over the moon, especially since Star Citizen will probably support the Rift.)
As a developer, I'm more interested in new input methods, such as the programming-by-voice scheme Tavis Rudd demoed at PyCon 2013; my wrists aren't as young as they used to be, and given that programming is my hobby as well as my profession, anything that takes some of the load off them will be welcome. (I just wish he'd release his damn code already! I've made a halfway decent start from scratch, in that I've got basic dictation working, but not having to reinvent all the glue from first principles would make life a lot easier…)
Take a look at the video down on this page. They used multiple cameras to capture a very wide angle (in this case 360 degrees), so you can navigate the scene after the fact. Matching something like this up with the Oculus Rift would be very interesting.