
Please show a paper from 20 years ago where fine-grained vehicle classification in unconstrained images is anywhere near this performance. You won't be able to, because no such result existed.


State-of-the-art classification accuracy/range, not speed.


This is brilliant, thank you.

On the Musk transcript I found this formatting confusing: "Have you played Kerbal Space Program?

What do you think SpaceX uses for testing software?"

I can't access Reddit to see what the original comment was.


Two KSP mentions:

Q: In order to use the full MCT design (100 passengers), will BFR be one core or 3 cores?

EM: At first, I was thinking we would just scale up Falcon Heavy, but it looks like it probably makes more sense just to have a single monster boost stage.

Q: Nice to see you are doing things the Kerbal way.

EM: Kerbal is awesome!

The second one:

Q: "Hi Elon! Huge fan of yours. Have you heard of/played Kerbal Space Program? Also do you see SpaceX working with Squad (the people behind KSP) to integrate SpaceX parts into KSP?"

Reply (not from EM): What do you think SpaceX uses for testing software?

EM to Reply: Kerbal Space Program!

Short version - Elon Musk likes and plays Kerbal Space Program.


I don't see anything new here, but for practitioners like myself, Scikit-learn and Spyder were the Python tools that finally converted me from being a die-hard MATLAB junkie.

I grabbed the handy Anaconda package from here: https://store.continuum.io/cshop/anaconda/

Within about two weeks, and with a little bit of discipline, I became a MATLAB-to-Python convert. Spyder is a solid IDE, and Anaconda comes with all the packages you need (e.g. Matplotlib).

All that's left now is to find a replacement for MATLAB's excellent debugger. You can break into pydb in Spyder, but the debug environment is nowhere near as functional as IPython.


Have you tried ipdb?



I had always heard this was the case, but now believe it must be a common misconception. See Table 6 in this file from the Office of National Statistics:


While from 2004 to 2012 total UK fertility increased from 1.80 to 1.98, the fertility rate for non-UK born women actually /decreased/ from 2.50 to 2.29 (albeit in a slightly messy non-monotonic fashion) while the fertility rate for UK born mothers /increased/ from 1.69 to 1.90 quite monotonically.

It would be interesting to see the stats including 2nd- and 3rd-generation immigrant mothers (i.e. mothers born in the UK but with parents born elsewhere).


I wasn't aware of UK-born mothers' increased fertility rate.

However, I'd argue that even if the fertility rate of immigrant mothers decreased, immigration can still explain the data: if more immigrants came in, the average fertility rate goes up, since the immigrants' (decreased) rate is still quite a bit higher than the UK-born rate.
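To make the weighted-average point concrete, here's a quick sketch using the ONS rates quoted above. The immigrant shares of mothers (10% in 2004, 25% in 2012) are hypothetical numbers I picked purely for illustration; the real shares would come from the same ONS tables.

```python
def overall_rate(uk_rate, immigrant_rate, immigrant_share):
    """Population-weighted average fertility rate across the two groups."""
    return uk_rate * (1 - immigrant_share) + immigrant_rate * immigrant_share

# Rates from the ONS figures quoted above; shares are HYPOTHETICAL.
rate_2004 = overall_rate(1.69, 2.50, 0.10)  # ~1.77
rate_2012 = overall_rate(1.90, 2.29, 0.25)  # ~2.00
```

So the overall rate can rise from roughly 1.77 to 2.00 even though the immigrant group's own rate fell, provided that group's share of mothers grew.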


Relevant, very similar paper input/output wise, from our resident karpathy with a detailed discussion in comments: https://news.ycombinator.com/item?id=8621658


If you read his e-mail, he admits he was "completely wrong".



This would work for 2D games, but 3D projection doesn't allow this. Furthermore, in a 3D game there are more than two degrees of freedom.


You can render the images onto the sides of a cube with a 90° FOV, put your viewpoint in the center, and that's it. I once programmed something like this, and it worked pretty well. I suppose Google Street View works in a similar way, but I can't back that up.
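For anyone curious, the lookup half of that idea can be sketched in a few lines: once the six 90°-FOV faces are rendered, any view direction maps to a face and a texture coordinate. This is just a sketch of the standard cube-map mapping (major-axis selection), not code from any particular engine; the face labels are my own.

```python
def cubemap_lookup(x, y, z):
    """Map a 3D view direction to (face, u, v), with u and v in [0, 1].

    Picks the face whose axis has the largest absolute component, then
    projects the remaining two components onto that face.
    """
    ax, ay, az = abs(x), abs(y), abs(z)
    if ax >= ay and ax >= az:
        face = '+x' if x > 0 else '-x'
        ma, sc, tc = ax, (-z if x > 0 else z), -y
    elif ay >= az:
        face = '+y' if y > 0 else '-y'
        ma, sc, tc = ay, x, (z if y > 0 else -z)
    else:
        face = '+z' if z > 0 else '-z'
        ma, sc, tc = az, (x if z > 0 else -x), -y
    # Normalize from [-1, 1] on the face to [0, 1] texture coordinates.
    u = (sc / ma + 1) / 2
    v = (tc / ma + 1) / 2
    return face, u, v
```

Looking straight down an axis, e.g. `cubemap_lookup(1, 0, 0)`, lands in the center of the corresponding face, `('+x', 0.5, 0.5)`.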


Of course it does. Oculus is already using it; it's called timewarp.


Works great in IE11 on an X1 Carbon.


Were they playing anything, or just walking around in the demo? I have one and have tested it out with many people. When people get sick, it's usually because they've been in one of the less interactive demos, just sort of looking around, wiggling the mouse, etc. The people I let loose straight onto Half-Life 2, or other games where there is some sort of goal, rarely get the same type of nausea. I think this is worth taking into account when people say "it's nausea-inducing".

For insta-nausea put someone in the Tuscany demo and move them around while they wear the rift. Hilarity ensues!


We did one demo that was more or less floating above static terrain, and another with Half-Life 2.

My own nausea didn't actually start up until HL2. We were playing on a laptop (probably not enough horsepower), using slightly unfamiliar keyboard & mouse controls. So some movements were very fluid & second-nature to me, while others were jerky and off because of the slightly-off control scheme.

It's possible the most nauseating moment was taking the headset off, actually. Up to that point, my mouse hand had been a pretty accurate proxy for my in-game hand and arm, but taking my hand off the mouse and then ripping the "world" away with that same (now "phantom") hand was deeply disconcerting.

But it's also possible the nausea built up slowly over the course of playing. I'd love to spend more time with it to see if it's something you really can adjust to.


On the subject of movement while wearing a VR headset, it seems to me that most successful games will be those where your character remains seated. Flight simulators, space combat sims, mech games, stuff like that.

For that kind of game, just the Rift and a joystick should provide an amazing experience.


This is where the Omni, or a future product like it, comes into play.


Didn't hunt for the source of this one but I did find some true 4K examples here:


CTRL-F 2160p


RED posted this to Vimeo, which they say is 5K: http://vimeo.com/25424362#at=0


