
Live Coding in VR with Oculus Rift, Firefox WebVR, JavaScript, Three.js [video] - adamnemecek
https://www.youtube.com/watch?v=db-7J5OaSag
======
IanCal
While there's a lot of talk here about the future of IDEs, difficulties with
current hardware and other challenges, I'd like to take a moment to talk about
what's actually in the video.

This fills me with almost childish glee. This is a virtual reality, where you
can pop up a "god window" and create/mess with a world around you. Even just
what's in the video, making some cubes move and change colour is _wonderful_.
It's fun!
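For flavour, the kind of thing typed into the demo boils down to a few lines of Three.js-style update logic. A hypothetical sketch (the helper names are mine, not RiftSketch's), with the pure math pulled out so it stands alone:

```javascript
// Per-frame state for a bobbing, colour-cycling cube: a sine wave
// drives the height, and the hue walks around the colour wheel.
function cubeState(t) {
  return {
    y: Math.sin(t) * 2,   // bob between -2 and 2
    hue: (t * 0.1) % 1    // 0..1, wraps around the colour wheel
  };
}

// In a Three.js render loop this would drive a mesh, roughly:
//   const s = cubeState(clock.getElapsedTime());
//   cube.position.y = s.y;
//   cube.material.color.setHSL(s.hue, 1, 0.5);
```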

Remember the early bits of amazement at programming? Where you could make the
computer say your name or draw a square? How much cooler would that be in VR?

~~~
bjt
For a year or so I worked as a contract programmer on the inworld content team
for Second Life. I didn't have VR, and the language I programmed in (LSL) was
even worse than JavaScript, but it was still incredibly fun to use code to
shape the world around me. It's even more fun when you can do that in a shared
space and see others interacting with your creations.

Someday there will be something like Second Life with Oculus support and a
decent programming environment, and that will be awesome.

~~~
vlunkr
Second Life looks like a great game that I would never play. Most people on it
seem to be only interested in the social aspect, but it's pretty awesome that
you can create programmable objects and sell them.

------
tinco
This is why I bought a DK2; I believe VR IDEs are going to be a thing, and
they're going to be great. The fonts aren't very crisp in this video, but once
we get some nice editors in there it's going to rock. The new display has a
high enough resolution to make it look great, and I bet the consumer version
is going to have a retina display, which will just make any discomfort vanish.

Note that in the video there's just one editing window; in an IDE there could
obviously be editing windows and references all around you. You would never
need a mouse, since your head movements convey positional information (though
you could still use one, of course), and with good keyboard controls I think
you could have an editing space ten times as big as traditional dual 27"
monitors give you.

No more hiding buffers in tabs, no more disruptions of flow as you scroll
through your buffers looking for the right one. You just use the visual
memory in your brain to intuitively remember where you left the buffers you
use.
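The spatial layout behind that idea is simple math. A hypothetical sketch (names are mine) that spaces N buffers evenly on a ring around the user:

```javascript
// Place n editor buffers on a ring of the given radius around the
// user, so each buffer keeps a stable, memorable spatial position.
function ringLayout(n, radius = 3) {
  return Array.from({ length: n }, (_, i) => {
    const angle = (i / n) * 2 * Math.PI;  // even angular spacing
    return {
      x: Math.cos(angle) * radius,
      z: Math.sin(angle) * radius,
      angle
    };
  });
}
```

Each position could then drive a Three.js text plane; the point is that buffer i always lives at the same bearing, so visual memory can find it.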

Also, the idea that this would somehow be bad for your neck sounds ludicrous
to me. How is staring for hours at a stationary rectangle of miniature
information more natural and easier on your neck muscles than having
information spread evenly around you at a comfortable virtual distance? I
think it's going to be a relief for nearly everyone.

~~~
bhouston
> This is why I bought a DK2; I believe VR IDEs are going to be a thing, and
> they're going to be great. The fonts aren't very crisp in this video, but
> once we get some nice editors in there it's going to rock

To a degree. But when I work I have two WQHD monitors (2560x1440), with one
turned on its side so I have one text editor with a vertical resolution of
2560. Whenever I work on a laptop, even an HD laptop, I miss this setup so
much; it feels like I am drinking through a straw.

Because of this I am unsure if VR, which is going to be much less than what I
am working with now, is going to be a serious replacement.

~~~
tinco
At the moment, the effective vertical resolution is about 3240p, and the
horizontal I'd guess would be around 6000p. And that's just the technology
preview. I don't see how you think VR is going to be less than your setup,
when it's already more.

~~~
brian_peiris
OP here. Your numbers are definitely off. The effective resolution of the DK2
is more like 320p or something. Definitely _lots_ of room for the tech to
improve there but I'm hopeful. Despite this limitation, the experience is
pretty darn good if I do say so myself.

~~~
tinco
Why 320p? I say 3240 because the height res is 1080p, and I think you can
comfortably tilt your head to get three times that height resolution. I made
the horizontal guesstimate similarly.

edit: ah I see, I hadn't factored in the warping of the optics, if you're
correct then the effective resolution would only be around 900p so I'm being
too optimistic :)

~~~
brian_peiris
Oh, you're factoring in head movement. I see. Well, you still haven't
accounted for the warping that the optics do. They stretch those 1080 pixels
across a 100-degree vertical field of view, so the pixels directly in your
sight (not including peripheral vision) are really only about 320.
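That back-of-the-envelope math can be sketched as follows (the 30-degree central-vision figure is an assumption, chosen because it reproduces the ~320 number):

```javascript
// DK2 panel: 1080 vertical pixels stretched over a ~100 degree
// vertical field of view by the lenses.
const panelPixels = 1080;
const fovDegrees = 100;
const pixelsPerDegree = panelPixels / fovDegrees;  // ~10.8 px/deg

// Assume central (non-peripheral) vision covers roughly 30 degrees.
const centralDegrees = 30;
const effectivePixels = pixelsPerDegree * centralDegrees;
console.log(Math.round(effectivePixels));  // 324
```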

~~~
fsiefken
Funny coincidence: the Oculus has a similar apparent resolution to what the
vanilla Doom game had on CRTs.

------
daw___
Source code:
[https://github.com/brianpeiris/RiftSketch](https://github.com/brianpeiris/RiftSketch)

------
fastball
That is the coolest thing I have seen on this website in a long time. I'm
literally giddy with excitement.

~~~
vincentleeuwen
I couldn't agree more, very cool. This is not just technological innovation,
this is potentially Art innovation too.

------
hanief
This is brilliant. I imagine remote collaboration will be very exciting with
VR technology. Since I haven't tried Oculus yet, how is text rendered on the
screen? Is it crisp? Or at least readable?

~~~
StavrosK
I tried one yesterday for the first time. It was a DK2, it was very low
resolution, and you could clearly see the colored pixels of the display.

Text had to be very large, so only three or four lines fit on the screen at a
time. After hearing people praise the DK2, I thought it would be basically
seamless, but it was jarringly low-res.

The experience was great, very natural, but the pixel density left a lot to be
desired.

~~~
hauget
I own a DK1 and the low resolution makes text reading incredibly difficult.
DK2 is a step up but text reading still strains my eyes. I think the 'sweet
spot' for us coders will be when the DK4 hits market late next year (DK3's
supposed to be out early/mid 2015).

~~~
widdershins
Huh? Oculus has said there will be no more development kits before the
consumer version. And they haven't announced a date for CV1.

~~~
hauget
cv1="dk3"; cv2="consumer version" BUT... it will all depend on the advances
others make in the input space.

------
hypertexthero
Reminds me of [Ready Player One][rp1] by Ernest Cline.

[rp1]: https://en.wikipedia.org/wiki/Ready_Player_One

------
readerrrr
This is really impressive and I'm interested in trying this.

But... the video shows a very specific example of programming, where you are
basically designing a map or a level. As soon as you are programming something
other than a simple 3D game, it becomes less useful.

The major problem we are trying to solve here is managing windows, but I'd
argue that normal screens already do that and you don't need VR.

A large screen or two, plus alt-tab and virtual desktops, can completely
replace, and improve on, virtual screens in VR. In VR you have to move around
the world to access information. On a normal desktop, where the information is
represented abstractly (whereas in VR it is a physical object), it can be
accessed instantly with a simple, fast key combination. I simply don't see
which programming activity you could do better in VR than on a normal desktop.

~~~
brian_peiris
OP here. Yeah, I was definitely not aiming for a full IDE here (obviously).
It's more like a playground environment. Something like CodePen for VR, with a
focus on the live-coding experience.

Using this app is comfortable but VR will definitely need higher resolutions
for anything more serious. It's not too far in the future though. If things go
well, I expect VR will be the driving force behind ultra-high-density 5"
screens in the next few years. Or some entirely new display tech (retinal
projectors?) will replace it.

~~~
endergen
I agree with Brian on this one. Like I told him recently, IDEs were definitely
next on my side-project radar. I tried out the Gear VR, which is only about
1.5 times the DK2's resolution (in both vertical and horizontal resolution),
and it made a huge difference. At Oculus Connect I felt like a lot of the
talks were about exactly this: that we finally have a use case for way
higher-res screens, instead of the incremental demand that is only slowly
cranked up by vested companies like hardware manufacturers and really early
adopters.

------
bsaul
This demo begs for drag-and-drop interfaces.

You'll soon build a Counter-Strike level by walking inside it and sculpting
the world around you. How cool!

~~~
waldir
Like this? :)
[https://www.youtube.com/watch?v=VzFpg271sm8](https://www.youtube.com/watch?v=VzFpg271sm8)

------
davyjones
I haven't seen this mentioned in the comments here: I think this is huge for
CAD/CAE. I can see this disrupting current
[Digital Mockup](http://en.wikipedia.org/wiki/Digital_mockup) offerings.

Everything from design to analysis to testing moves into the VR environment.
CAD/CFD/CAE computational models are already saving companies huge costs.
Throw in VR and we are that much closer to a tangible, collaborative
environment. Very very excited!

~~~
tgb
Are there any players in the high-end VR market for that sort of work that
look like they'll be better than the consumer Oculus? I'm thinking a very
high-resolution screen would be the main thing a high-end device could buy
you. Maybe good inertial measurement units as well, but I'm not sure how much
that would help over Oculus-style positioning.

------
justifier
oculus is digital stereoscopy

this demo is cool but i'd argue impractical

be careful getting excited over demos of technology that add a layer of
abstraction from the tech itself

ever seen a photo of a painting? ever read a book you loved and your friends
hated?

i'd suggest you try the oculus as you would use it: in your home on your box;
before spending any money on the device

also, you are blind to the keyboard with the headset on

i feel oculus carries a mythos around it that is very difficult to obviate if
you have yet to have worn one.. i'll link to an old comment i posted to
carmacks keynote at the first ocucon :
[https://news.ycombinator.com/item?id=8347450](https://news.ycombinator.com/item?id=8347450)

essentially i have stopped referring to oculus as vr, oculus is digital
stereoscopy, and being stereoscopy the oculus is exploiting an optical
illusion and as such, i believe, a flawed direction for the touted outcome:
virtual reality

this demo:
[http://www.phoronix.com/scan.php?page=news_item&px=MTcyMTI](http://www.phoronix.com/scan.php?page=news_item&px=MTcyMTI)
; is what encouraged me to shell out for the dk2

the experience failed to deliver on my expectations, though at the time i
thought my expectations were practical and rooted in evidence, of course now i
realise those expectations were irrational and that evidence was secondary
being derived from this video demo

~~~
simias
You have commented elsewhere in this thread with this same rant but I simply
don't understand the point you're trying to make.

Of course stereoscopy is part of what the oculus does (although there's also
head tracking, so talking only about the stereoscopy is misleading). You say
it's a flawed direction, so what's the right direction then?

~~~
justifier
it is my belief that oculus will sell a lot of devices largely by manipulating
the ad copy, resulting in frustrated people out a lot of money, i suppose i am
attempting to include another perspective on the device, one that i failed to
see before i bought one, and have yet to see since, but feel confident in my
understanding of it

i suppose it depends on your definition of virtual reality..

but i think the verbiage of vr is very misleading

my position is that if oculus was called: oculus digital stereoscopy; the tech
would lose a lot of its hype

as for the tracking it is certainly the best feature of the headset, again
with the goal of calling the thing what it does i refer to the 'head tracking'
as 'inferred spherical screen encapsulation'

what the head tracking is doing is windowing your vision to the size of a
fisheyed cell phone screen and then inferring your position in a 3d space and
displaying the contents of that field of view on the screen, refreshing the
content of the image in regard to the movement of your neck, giving you the
impression that you are in a sphere of flat displays looking through fisheye
glasses with blinders with a forced forward stare

when i put it on, the restricted periphery and the inability to use my eyes to
look around was the most dramatic realisation

but that is just dismantling the vr hype, actual stereoscopy is flawed in its
own ways..

people's eyes differ in distance apart, the oculus is solid plastic, certainly
the error is measured in mm to fractions of mm but that is still error that
your brain now is required to correct,

also the goggles press against your face tightly to seal out distracting
ambient light, this hurts after some time, and in the case of my watching a
film, completely depleted my eyes of blood leaving white bags under my eyes

that is the hardware

as for the software, if you read the comment i linked in the comments you are
referring to, i talk about an experiment i performed when i first received the
oculus that shows that the brain error-corrects the inconsistencies between
the eye inputs at a pretty substantial distance. talking here of the blue and
pink box experiment: in the experiment only the pink box moves, but when i
showed it to a friend thae asked why, when the two boxes began overlapping,
the blue one started moving as well? i, watching this all cloned on the screen
in front of me, explained that it was still the case that only the pink one
moved and the blue one was being moved around by thaer brain. so if a simple
experiment has my brain working overtime shifting still objects, it seems
reasonable that a lot of the 'jitter' that is so often talked about when
displaying entire worlds is your brain organising its inputs

as for what i believe is the right direction?

if we are talking about visual input manipulation.. i'd think projection,
perhaps onto a dome or disc encapsulating the entire eye range including
periphery of the eyes even with the pupil extended to the outer most regions
of the eye

or one fantasy device i dreamed up used this recent research:
[http://newsoffice.mit.edu/2014/magnetic-hair-directs-water-flow-0806](http://newsoffice.mit.edu/2014/magnetic-hair-directs-water-flow-0806)
; if you watch the video, at the end, 57s in, you can see these hairs bending
optics. what you'd do is have two rows of these hairs, one for each eye, and
each row bends to follow the movement of a pupil and each hair acts as a pixel

this way simulates two photons bouncing off a single object and reaching the
eye, that would be wild

but mind that i referred to the effort as visual input manipulation, if you
want to talk vr i think the only possible direction is bypassing our sensors
and interacting directly with the processors, a la the suit in Zero Theorem :
[http://cdn-static.denofgeek.com/sites/denofgeek/files/styles/insert_main_wide_image/public/zero_theorem.jpg](http://cdn-static.denofgeek.com/sites/denofgeek/files/styles/insert_main_wide_image/public/zero_theorem.jpg)

~~~
cLeEOGPw
I don't understand your argument against stereoscopy. It is how you see the
world. Even your suggested "dream device" would be a stereoscopic device. You
can't run away from it. Besides, even if a device like the one you suggest is
ever developed, there will be those few guys who complain that it is not real
VR and that you need to send signals directly to the optic nerve, as well as
the spinal cord and all the other senses, to achieve "real VR". These kinds of
talks are unproductive. Better to try to develop your version of VR instead of
complaining about the current best.

------
possibilistic
If a first-class, distributed VR world became a thing, I could get lost in
such a world forever. I would stay logged in for work, hobbies, friends,
adventure, and romance. Perhaps I would log off a few times a day for food,
sleep, hygiene, and exercise; I could imagine that being the extent of my
physical world interaction if the VR tech were good enough. (It has a long
road ahead to reach that point though. Ancillary technologies would also need
to evolve for closer to brain/full-sensory I/O.)

Given tools to continually innovate and improve the digital world, I don't
think I could ever grow bored of it. There would be endless permutations of
possibility with which to express myself. I could make games, movies, physics,
patch the stack. I could play God. Or I could let someone else play God and
adventure through the worlds that they create. Then perhaps send a pull
request.

An all-enveloping creative world sounds so much more exciting than the life
I'm living right now. I could finally be anyone I wanted. Make anything I
wanted. You might say I have this freedom now, but none of us really do. In a
virtual world there is no central society to tell us how to conform and fit
in, no traditional family structure. No tradition. Just feeding your short
life's want for expression. I want that so much.

I can see so many people like me preferring a life like this. With this level
of freedom. Imagine: Religion, money, and power could have little bearing.
Meritocracy reigns, but even that could be ignored if you wanted.

Furthermore, I can see costs as becoming nearly negligible. No more
consumption of material things beyond bare necessities: food, shelter,
hygiene, and healthcare. Nobody needs a physical high-rise view when the
digital world can give that to anyone. I don't see why anyone else would need
to buy the things we are trained to as consumers. (Would that be an economic
problem? It very well could...) I don't see this necessarily ruling out a
digital economy, but unless things are contracted and paid for on a per-demand
basis, digital assets can easily be copied... an economic problem that might
need to be considered.

If this were widely adopted (and barring societal collapse stemming from large
swaths of society switching to a VR lifestyle), I can see it bringing ungodly
amounts of funding to biochem research. We might elevate our concern for
solving our biochemical longevity problems above all else.

Sorry for rambling and wasting your cognition with this idle pipe dream. I'm
actually pretty pragmatic and don't expect for much of this to happen while
I'm still young enough to enjoy it. Or at all.

~~~
vlunkr
I enjoyed reading your pipe dream, it's a pretty interesting thought. It would
be like living in our own matrix, where everyone can be Neo.

VR is still missing the ability to simulate smells, tastes, and tons of other
sensations. I think I would get bored in a world like that. Plus my real life
is pretty fun.

------
moron4hire
I'm building an open source framework for making these sorts of WebGL VR
experiences.
[https://github.com/capnmidnight/VR/](https://github.com/capnmidnight/VR/)

It doesn't yet support "live programming" like this, but the hope is the
framework would eventually be used to easily make these sorts of things.

So far, the focus has mostly been on the user input side. There is a
constraint-based system for defining input values from different "devices",
i.e. keyboard, mouse, gamepad, head mounted display motion tracking, arm
motion tracking, etc. There is even a rudimentary gesture system, where things
like a nodding gesture or head shaking gesture can be wired up and used to
activate commands just as easily as a mouse click or key press.
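As a rough illustration of the idea (hypothetical code, not the framework's actual API), a nod gesture can be reduced to "pitch dipped by some threshold, then came back up" over a window of head-tracking samples:

```javascript
// Detect a nod in a window of head-pitch samples (degrees).
// A nod = the head dips by at least `threshold` degrees from the
// starting pitch, then swings back up by the same amount.
function isNod(pitchSamples, threshold = 15) {
  let min = pitchSamples[0];
  let sawDip = false;
  for (const p of pitchSamples) {
    if (p < min) min = p;
    if (!sawDip && pitchSamples[0] - min >= threshold) sawDip = true;
    if (sawDip && p - min >= threshold) return true;  // came back up
  }
  return false;
}
```

A real system would also need debouncing and a time limit on the window, but the core wiring is just this: reduce a sensor stream to a boolean, then treat that boolean like a mouse click.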

The framework also supports multi-user environments, and I've already started
building the text input controls (somewhat similar to what is shown here) to
do things like this (right now, it's only useful for live chat). Eventually,
I'll have buttons and text boxes and togglers, etc.

If you're interested in this sort of stuff, please consider joining me.

~~~
brian_peiris
OP here. I like where you're going with that. If you weren't already aware,
you definitely need to integrate WebVR. A couple of devs from Mozilla and
Google are adding VR APIs to Firefox and Chrome.

Join the mailing list:
[https://mail.mozilla.org/listinfo/web-vr-discuss](https://mail.mozilla.org/listinfo/web-vr-discuss)

More info:
[http://blog.bitops.com/blog/2014/06/26/first-steps-for-vr-on-the-web/](http://blog.bitops.com/blog/2014/06/26/first-steps-for-vr-on-the-web/)
[http://blog.tojicode.com/2014/07/bringing-vr-to-chrome.html](http://blog.tojicode.com/2014/07/bringing-vr-to-chrome.html)

------
malux85
I like this. I have an oculus rift version 1, and I use the terminal
([https://github.com/hyperlogic/riftty](https://github.com/hyperlogic/riftty))
to code in occasionally.

This is really good because I have a stomach condition that means being in a
seated position is very painful for me, but using this I can stretch out on
the bed, with a keyboard resting on me, and still code :)

------
jasonkostempski
Eye tracking would be a sweet enhancement to an interface like this. If you
could just target something by looking at it (without having to turn your
head), press a key to use/select it, and then use some other keys to jump
through the various pieces of code that created it, I can't even imagine how
much time would be saved in a large project.

------
maga
I wouldn't mind doing ordinary programming in it as well. It would be much
easier than messing with multiple real displays.

~~~
spacefight
I'm not so sure how my neck would feel about the constant tiny movements in
every direction.

~~~
goldfeld
Look into the Bates method, it's natural vision research from an
ophthalmologist in the 20s. The argument is that animals (including us)
naturally move neck and eyes in exactly tiny little movements literally _all
the time_ , non-stop. And that is healthy because constant movement promotes
relaxation, whereas keeping your neck (and eyes) fixed leads to tension and
over time is what actually increases our prescription, which he argues is just
like being inflexible (that is, it becomes your new normal), but in the eye
and neck muscles.

So I'm actually pretty excited for something that needs me to look around all
the time. I've still gotta try making my text editor twitch about all the
time. A moving target makes more sense for our predator eyes, so maybe VR will
finally allow people who work with computers not to have a disproportionately
greater need for glasses.

~~~
Dewie
So I guess I'll end up with a standing/treadmill workstation because sitting
is unnatural, and now also twitching/slightly evasive windows because staring
at a fixed spot is unnatural. I guess the next thing after that is an input
device you have to bounce on and off the screen, in order to mimic those
primal hunter-gatherer spear-throwing techniques.

------
noobermin
This is freaking cool by itself, but the video gave me an interesting idea:
what if you used this as an IDE for coding in general? You wouldn't need to
have multiple monitors, you'd have 360 degrees of stuff to look at.

Then again, the idea of being cut off from the outside world with only your
code to look at might be less appealing, so why not put a camera on this
thing? Then you could have "projected windows", basically like the holographic
interface of Tony Stark without the holograms. I personally think that would
be pretty cool.

~~~
andrewliebchen
I think I would prefer the "solitude" of a 360deg coding environment that
would be afforded by an Oculus, NR headphones, and music. No visual
distractions!

Besides, I imagine that this setup would be cheaper than two monitors and a
desk. I'd need to drastically improve my touch typing skills...sometimes even
now I need to look down. Maybe in this set up there's a representation of my
hands or better yet, a VR native way to interact with code that isn't a
keyboard.

Addendum: I remember seeing something a while back about building an IDE
designed to be used with only a gaming controller...

------
zacfinger
How has no one mentioned William Gibson or "Neuromancer" yet?

> The matrix is an abstract representation of the relationship between data
> systems. Legitimate programmers jack into their employers' sector of the
> matrix and find themselves surrounded by bright geometries representing the
> corporate data.

> Towers and fields of it ranged in the colorless nonspace of the
> simulation matrix, the electronic consensus-hallucination that facilitates
> the handling and exchange of massive quantities of data.

~~~
ctdonath
It's a 30-year-old book ... older than many people who are making viable VR a
thing now. _Neuromancer_ sparked the otherwise naive notion of what VR would
be like, but now we've developed and normalized a good chunk of the technology
& language (verbal & visual), in a society which has grown accustomed to
pervasive computer interfaces (I sit here with 8 on my desk alone, and
totaling 22 in this room). Reality has taken a somewhat different & concrete
direction from what Gibson fluidly predicted. The wonder of a hypothetical
world of people jacking into their sector of the matrix surrounded by bright
geometries has been replaced by the mundanity of a real world of people
staring at Facebook et al; we live the electronic consensus-hallucination of
massive data, and are not impressed by someone telling us in abstract terms
how wonderful it will be.

------
erikpukinskis
If anyone played Myst, you may be tickled to realize that this is in fact what
Myst was about.

The hand-wavy "this might look like a book with a video in it and some neat
drawings, but it actually contains a magical language that describes the world
you're in" ...

... you're looking at a book written in a magical language that describes the
world it's in.

------
c3d
I really love the "in-world" editing. However, I'm not convinced by the
description of the scene. Here is what it should look like:
[https://www.youtube.com/watch?v=paJG7Fy5Few](https://www.youtube.com/watch?v=paJG7Fy5Few).

------
source99
VR is no doubt very cool.

But I'm not sure this is taking advantage of the true benefits that a VR
environment brings.

What are the major benefits that a VR environment brings?

Screen real estate would be one benefit, but I would have to see an
implementation where this was done well before I could believe it.

What other benefits does VR have?

------
createuniverses
Has anyone tried "praxis"?

[https://www.youtube.com/watch?v=6rB39AXPmQQ](https://www.youtube.com/watch?v=6rB39AXPmQQ)

[https://www.youtube.com/watch?v=1VRtRazMYSA](https://www.youtube.com/watch?v=1VRtRazMYSA)

------
radisb
Nice, though I don't know if it is the wisest thing to start coding The Matrix
in JavaScript.

~~~
CmonDev
The only thing sadder would be programming a strong AI or brain-computer
interfaces in JavaScript. Hopefully the disaster will be stopped well before
then.

------
runewell
VR is a medium where I could see voice commands and visual programming
actually having an impact. I would love to see a 3D version of UE4 Blueprints
or Scratch.

------
alexkehayias
So cool! Great work. Might be nice to keep the window as more of a HUD with
some transparency so you can move around and still see the results while you
code.

------
mpg33
How easy is it to use a keyboard while wearing a headset, though? Even though
90% of the time you don't need to look down, occasionally you do.

~~~
brian_peiris
OP here. You definitely need to be very familiar with your keyboard for this
to be a fun experience. Even switching to a slightly different key layout
would throw me off. As long as you're a proficient touch-typist, it's really
not a problem.

------
fsiefken
I wonder if something like this can be achieved on the OpenSim platform; does
anyone know? I know a WebKit browser can be accessed in the sim, but that's on
a 2D surface. You want to 'rev' the 3D world live and in-game.
[https://www.youtube.com/watch?v=ubYYVnfLULI](https://www.youtube.com/watch?v=ubYYVnfLULI)

------
LeicaLatte
Nice effort. Input for VR still sucks though...

~~~
melling
Some combination of voice recognition, gestures (e.g. space, tab, return), and
a virtual keyboard is needed. Don't think people want to be confined to
sitting in front of a keyboard at a desk. It's not going to be easy.

------
elwell
Now throw an Emotiv headset [0] into the mix.

[0] - [http://www.emotiv.com/](http://www.emotiv.com/)

------
ck2
Now do it the other way around, you create virtual objects with hand motions
and it writes the code equivalent for you.

~~~
XorNot
VR-enabled FreeCAD would actually let you do something like this. Everything
in it is represented as Python code.

------
elwell
Every now and then you see a demo that catches a glimpse of the future.

------
shocks
Someone please build vim in VR...

Then I can legitimately wear an Oculus Rift at work!

------
vhiremath4
Very rarely do I watch something and am instantly consumed by it.

------
kentf
Brian FTW!

------
bikamonki
Beautiful!

------
curiously
I don't think a VR IDE will be practical. You have a screen beaming inches
away from your eyes. It's bad enough that we stare at computer monitors all
day.

I think what would work is some sort of holographic augmented reality that
can project 3D images into the environment you are in.

Again, the ergonomic aspects of the Oculus might be improved, but my primary
concern is the vision-health aspect of it.

~~~
ilaksh
VR HMDs have lenses that make it more like you are staring into the distance.

It's sort of like wondering about the health effects of wearing glasses.

~~~
curiously
I don't know, but I find light being shone into your eyeballs for long periods
of time worrying.

