Yes, it's true that it's larger than books and movies, but in the end games just compete with other forms of media for customers' entertainment dollars.
Unless we can come up with a use case for VR beyond gaming and fulfilling escapist fantasies, it will likely just be another contender for entertainment dollars.
Smartphones, on the other hand, essentially made it possible to do anything a general desktop PC could do while on the go, without being tethered to a chair in front of a desk. Phones can be used for payment, note-taking, etc. There are people who use smartphones and don't play games at all.
And the ultimate use case of VR is that you will be able to be "on the go" without ever leaving your chair -- in a form that's good enough that it doesn't feel inferior to being in person.
It's sort of the inverse of the smartphone use case.
Is it there yet? Of course not. But if/when it does get there... that's a much more compelling use case than a smartphone.
This is true, but I think it's mostly irrelevant to the topic at hand. If VR were as popular as the PS4, we'd have said "VR took off". It doesn't need to be as popular as smartphones to be called successful, but it's arguably not successful yet by many definitions. It certainly hasn't "taken off".
Most gaming accessories never took off. The racing wheel has been available for decades now and never "took off", yet it never went away either. Racing wheels have a market and will continue to have one.
I predict the VR market will be slightly larger than the racing-wheel market, but headsets will not become super common in the next decade because they require a large amount of space. This doesn't matter at all, because even today VR games and hardware are great fun to use.
The use case I always envisioned was an all-in-one replacement for a multi-monitor setup, a TV, and essentially all visual media. I also imagined the flexibility to operate a computer in bright sunlight or while lying in bed, thanks to the goggles being sealed and replacing your vision. This is surely only a matter of the technology improving a bit. I believe a redesign of peripherals suited to the ergonomics of VR would be required for my particular vision to be achieved. I suppose what I'm talking about is just an HMD, but it could optionally be combined with head tracking like the implementations that exist now. I recall that HMDs were available many years ago, so they're nothing new. Why didn't they take off? Too clunky? Unpleasant somehow?
Yes, and pjreddie seems to have concluded computer vision is mostly (or too often for their liking) used for the digital equivalent of those bad things.
I think pjreddie's concerns are extremely relevant. However, he's not the only one working on things like this, so it's unlikely that research and development will stop, although I certainly think such development is ethically questionable. In some ways the situation resembles the ethical problems facing the scientists who worked on the nuclear bomb. I just hope to God that this tech will be used for good rather than bad, but given the way things are going with political censorship (government-sponsored or otherwise) and people in opposing camps doing their best to dox political opponents, let's just say I'm not too optimistic...
> it's unlikely that research and development will stop
I know you didn't make this argument here, but I still want to point out that that's ethically irrelevant for his decision.
Or the other way around: "Someone else would have done it" is not a defense when you've built something that was clearly gonna be used for Bad Things(TM).
> Yes, and pjreddie seems to have concluded computer vision is mostly (or too often for their liking) used for the digital equivalent of those bad things.
Indeed, there are many nefarious applications of computer vision. But applications to the medical industry are plentiful too.
I see weighing up the net benefit as a tricky and personal matter.
That's fine - that's a personal choice he is free to make. But I completely disagree with it. I also don't think that unencumbered AI research is going to lead to the overthrow of the human race by machines like Elon does.
Making cheap computer vision is just as dangerous to the "tyrant" as to his supposed victims. You can already make a plausible anti-president suicide drone, "A Ticket to Tranai"-style.
This sounds like it will fall into the same "echo chamber" problem platforms like YouTube fell into. Note that with the social-credit system, you are docked points if you are friends with the "wrong" people. Thus, you naturally befriend people with similar interests. I don't think they designed it with "anti-viral" as a motive, but instead as a way to weed out and isolate people with contrarian thoughts.
When I lived in China 10+ years ago, the West would criticize the Chinese government for limiting the spread of "opinions" on social media. Today, in the West, we call that "fake news", and limit its spread too.
Give it another 10 years and, at least here in Europe, we will have the first countries copying the "social points" system China is currently rolling out.
We, along with Europe, have had a "social points" system for a long time. If you smoked weed, were associated with communist/socialist circles, etc., you were or could be barred from certain jobs (public and private), banks, voting, etc. The idea that it's only bad when China does it is just hypocrisy or borderline jealousy.
> Today, in the West, we call that "fake news", and limit its spread too.
We use "civility" or "western values" to censor, attack, etc. The Chinese use "harmony", "Chinese characteristics", etc. to censor, attack, etc.
We shame the Chinese and demand they treat Muslims better while killing millions of Muslims and destroying a few Muslim countries. The Chinese shame the west and demand we treat Muslims better while interning millions of Muslims.
Too bad hypocrisy isn't a currency or we'd all be rich and living large.
Your false equivalence here is absurdly full of shit on both points.
The highly decentralized, mostly non-computerized western social shaming that existed for decades around certain social beliefs and practices, and which still exists to some extent in limited contexts, is in no meaningful way comparable to a systematic, almost entirely centralized, government-directed formal social credit scoring system that dictates one's "worthiness" and so many aspects of one's social life, economic life, and literal freedom of movement based on what political opinions one publishes, who one's friends are, and what one reads, shares on social media, or even buys.

Comparing the two is ridiculous; it's either blatantly blind or you're trolling for some other reason. No western state has anything like this. No notable number of people in the west are literally ruined economically, banned from travel, or abandoned by friends nudged by the government because they posted something against Trump, or against Obama, or in any other context. Very public, very open debate and protest against numerous government policies is still very much alive and well in the west. In China this is blatantly not the case.
> Your false equivalence here is absurdly full of shit on both points.
Ah, "false equivalence". It gets me every time because at this point it's so obvious: a talking point used by people who don't know what it means.
> The highly decentralized, mostly non-computerized western social shaming that existed for decades for certain social beliefs and practices
Highly decentralized? FBI background checks are decentralized? Credit checks are highly decentralized? If you think "the west" is decentralized, you really have bought into the propaganda hard. Let me guess: you think "the west" is individualistic while the east is "collectivist".
> No notable number of people in the west are literally ruined economically or banned from travel, or the victims of their friends being nudged by the government into abandoning them because they posted something against Trump, or against Obama or in any other context.
And a notable number of people in China have been? Or are you just parroting propaganda you choose to believe?
> Comparing the two is ridiculous.
Only to people with an agenda.
> It's either blatantly blind or you're trolling for some other reason.
This is called a false dichotomy with a dash of ad hominem.
> In China this is blatantly not the case.
Don't the Hong Kong protests happen? How come there are so many protests in China?
Just because they aren't identical doesn't mean they aren't comparable. Just because one is bad doesn't mean the other is good. You can cherrypick nonsense to fit an agenda or just look at the obvious objectively.
If you can't plainly see the difference between the west's flawed but essentially democratic and mostly liberal systems of government, media rules, and management of free expression (even if they sometimes veer into the authoritarian in an ad-hoc, sporadic way) and the overtly authoritarian, systematically repressive systems of social and political control that exist in the Chinese state, then there's no debating your nonsense. Spare me the claims of it all being anti-China propaganda, too. The CCP is quite blatant about what it does, and numerous credible NGOs and media sources have reported on these subjects extensively.
> As a friend from out that way put it, "its harder to be a consumer there, but easier to be a human".
This rings true but it depends on what kind of human you are (I lived in Montreal for many years and enjoyed it there, but did not see myself living there long term). It's easier to be lower middle class, an artist, a student, a chef, a government employee, etc. in Montreal, but if you're at all an ambitious human, Montreal has less for you.
Apart from the universities (there's one that is highly-ranked internationally), it's not the kind of city that attracts go-getters (exceptions exist of course -- the tech scene these days, though still not comparable to major U.S. cities, is much different from when I was there).
Montrealers feel less of an economic struggle (more joie-de-vivre and love of the simple life, rents are controlled, CoL is low), but the existential struggle to fit in (if you're not pur-laine Quebecois), to find community (if you don't speak French at near native levels) and to find meaningful work (if you're at the top of your profession) is far more pronounced if you're ambitious.
Being fully human (for me) means being able to express myself in my work and having good relationships. For many Americans, finding these things in Montreal may be more challenging than in a place like Toronto.
On the other hand, Montreal is a way more interesting place to vacation than Toronto.
A lot can be learned working with average engineers. You shut yourself out of a lot of jobs by avoiding the average ones, and you'll probably be screwed if the "above-average" places consider you an average candidate.
"Any tech company" was probably an overstatement; companies like Apple and Google are much more grounded in valuation than companies like Amazon, which had a P/E ratio of 85.99 in 2018 [0].
But PE is obviously the wrong metric for Amazon since we know it reinvests all its earnings and has proven ability to convert investment into cash flow.
This is due to Amazon purposefully keeping profits at roughly $0 for years. CapEx investment was through the roof, in things like AWS, which were criticized heavily in their infancy. It was widely understood that Amazon would be able to turn a profit; it simply decided to ruthlessly expand its enterprise instead.
Amazon has low margins, I think the theory with the PE is that they can increase margins after achieving market dominance and get better earnings. Walmart is still bigger so Amazon isn't there yet. And then it could also be high because of the cloud/tech bubble.
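For context, the P/E figure being debated above is just share price divided by earnings per share. A minimal sketch (the numbers below are illustrative, not Amazon's actual financials):

```python
def pe_ratio(share_price: float, earnings_per_share: float) -> float:
    """Price-to-earnings ratio: how many dollars investors pay
    per dollar of annual earnings."""
    return share_price / earnings_per_share

# Illustrative numbers only: a company earning $20/share and trading
# at $1,720 has a P/E of 86, while one earning $100/share at $1,700
# has a P/E of 17 -- same price, very different "expensiveness".
print(pe_ratio(1720, 20))   # 86.0
print(pe_ratio(1700, 100))  # 17.0
```

This is why a high P/E can mean either "overvalued" or "earnings deliberately suppressed by reinvestment", which is exactly the disagreement here.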
There are many ways to rationalize a war though, sure you can't take over others' resources without outright destroying them, but if your goal was to simply wipe a competitor out and monopolize a scarce resource, the picture changes.
The U.S. did very much benefit from the balance of power change that resulted from WWII. War determines who's left, not who's right. Eventually someone might try to be the one who is left.
Learning a new language doesn't necessarily deepen the vertical bar if the language cannot really be used to improve productivity/innovation on top of an engineer's current toolset. Learning TypeScript on top of Javascript can be thought of as vertical, but learning say Lua or C# on top of JS is probably better described as horizontal unless you're already intending to do some really specific desktop application.
Learning different concepts from other languages can make a big difference. For instance, I learned about the value of composition over inheritance by learning Rust, then applied it to my life as an Objective-C developer (prior to Swift). It forces you to break out of your well-trodden paths and fold the best of other systems into yours.
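As a rough illustration of the composition-over-inheritance idea (sketched in Python with made-up names, not Rust or Objective-C): instead of building a deep class hierarchy, a type holds its behaviors as components and delegates to them.

```python
class Engine:
    def start(self) -> str:
        return "engine started"

class Radio:
    def play(self) -> str:
        return "playing music"

class Car:
    """Composition: Car *has* an Engine and a Radio, rather than
    inheriting from some Vehicle/AudioDevice hierarchy."""
    def __init__(self) -> None:
        self.engine = Engine()
        self.radio = Radio()

    def start(self) -> str:
        # Delegate to the component instead of overriding a base method.
        return self.engine.start()

car = Car()
print(car.start())       # engine started
print(car.radio.play())  # playing music
```

Rust pushes you this way because it has no class inheritance at all, only structs and traits, so "has-a" becomes the default design.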
Coming from a procedural, dynamically typed language like PHP, learning Rust, Clojure, and Node.js all had huge benefits.
Rust's ownership and type system teaches you about the freedom it affords when reasoning about values in a system.
Clojure teaches you about separating state from logic and the benefit of keeping it at the edges of a system.
Node.js teaches you about async programming, which is, IMO, as different from synchronous programming as functional is from OO. The way you need to reason about things is very different. The non-blocking model forces you to learn which kinds of operations block and which don't.
I took all these lessons back to PHP and my systems are massively improved as a result.
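A rough sketch of the "keep state at the edges" lesson (in Python with invented names, since the original applies it in PHP): the core logic is a pure function that returns new values, and mutation lives only in a thin outer layer.

```python
def apply_discount(order: dict, percent: float) -> dict:
    """Pure core: no I/O, no mutation of the input; returns a new value."""
    discounted = order["total"] * (1 - percent / 100)
    return {**order, "total": round(discounted, 2)}

# Impure edge: the mutable state lives out here, not inside the logic.
orders = [{"id": 1, "total": 100.0}, {"id": 2, "total": 50.0}]
orders = [apply_discount(o, 10) for o in orders]
print(orders)  # [{'id': 1, 'total': 90.0}, {'id': 2, 'total': 45.0}]
```

Because the core never touches shared state, it's trivially testable and safe to call from anywhere, which is much of what Clojure's model buys you.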
Most of the PHP hate comes from people dealing with PHP code written by people who simply don't know how to program.
That is not a defense of PHP, it has many faults, but it's a language like any other. I have problems, big ones, with every language and ecosystem I've ever been exposed to. That doesn't detract from their benefits or the concepts they can teach you.
Also, it takes like 20-30 hours to get mediocre with a new standard library and syntax. You won't "learn the language" but you'll get a good feel for it.
Arguing time cost as a reason to avoid learning new languages is pretty weak when you're spending a career programming.
This is a good point, though for most people, valuable skills are often "dry" and undesirable, like plumbing, so someone who was really passionate might've persevered on their own anyway. Even if academia has the merit of making you persevere through a class you wouldn't have on your own, it's not efficient; most of the classes colleges require just to graduate aren't "you-will-actually-need-this-later-in-life" classes, they're filler.
Front-loading on dry subjects also has the downside of scaring away people who would've otherwise done well given a different path of learning.
The timing of when you learn things also matters; it does you little good to learn software architecture in college only to have the knowledge go stale by the time you actually need it - you'd have to review it, or worse, relearn it all over again on your own.
> Most of the classes colleges require from you to just graduate aren't "you-will-actually-need-this-later-in-life" classes and are just filler classes.
There's no way to know in advance which of those classes you won't need, though. Programming has become very specialized; it's no longer possible for a university to provide an education sufficient to prepare a student for the work they'll be doing in the "real world", because there's so much variability from one job to the next. Having a broad education, though, puts a person in a position to adapt more readily to a wide variety of roles than they could without that background knowledge.