Reliably measuring IQ through commercial puzzle games (sciencedirect.com)
93 points by davidiach on Aug 29, 2015 | 60 comments

Well, reading the PDF was anticlimactic. Why? Because page 2 of the PDF describes the 12 games that were used. They were games that test memorization, puzzle completion, and pattern matching.

In other words, the "commercial video games" used in their research were "brain games" and not Pac-Man, Mario Bros, and Call of Duty, etc.

If you've ever taken a real IQ test where the proctor shows you cards to match or pictures with "missing" data, the 12 video games they used are very similar in spirit. It's not surprising that there is high correlation between the scores on those particular types of video games and real IQ tests.

However, that's not to say there is zero correlation between blockbuster video games like Call of Duty and IQ. Maybe another study will pursue that.

Thanks for the warning. I was really hoping to see Halo or Quake or something, because in my experience practicing these games in a casual-competitive way, I hit a performance ceiling. In high school I played Halo nearly every day, and a bright friend of mine, who didn't even own the game, was a surprisingly good player for his limited experience.

On a similar note, I once read that reaction time and IQ are correlated (not sure how strongly), which is interesting because you might expect motor functions like that to be orthogonal to higher-order cognitive abilities.

> On a similar note, I once read that reaction time and IQ are correlated (not sure how strongly), which is interesting because you might expect motor functions like that to be orthogonal to higher-order cognitive abilities.

Speed is good, in many contexts. Say you have two people, and person A has overall cognitive speed 25% greater than person B's.

That doesn't just help with Jeopardy!, it gives you 25% more time to think in normal conversations, on SAT tests, while playing video games, at work, etc.

Even a 10% speed deficit relative to a baseline human is a big, big disadvantage.

> Even a 10% speed deficit relative to a baseline human is a big, big disadvantage.

This would make sense if all thinking were equal and only the rate of thinking varied. In practice, the quality of thought processes seems much more important than the rate at which such processes are carried out.

For example, attempting to assess "cognitive speed" can be very dangerous in interviews. It seems like such a promising metric. A candidate who answers questions 10% faster than average will generally be much more impressive than a candidate who answers questions 10% slower than average, and it's tempting to think the faster candidate will be 22% more productive. Of course, over time, it may turn out that he occasionally provides bad solutions to problems, or can only solve superficial problems, or solves the wrong problems. And you realize the person who consistently solves problems correctly (albeit at a slower pace) is a better choice than the person who introduces new problems as quickly as he solves existing ones.
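To make that 22% figure explicit, here's the arithmetic as a toy model (my own illustration; it assumes productivity scales linearly with answer speed, which is exactly the assumption being questioned):

```python
# Toy model: relative throughput of two candidates whose only
# difference is answer speed (10% above vs. 10% below average).
fast = 1.10  # answers per unit time, relative to average
slow = 0.90

ratio = fast / slow
print(f"{(ratio - 1) * 100:.0f}% more answers in the same time")  # prints "22% more answers in the same time"
```

The comparison is between the two candidates, not against the average, which is why 10% up vs. 10% down yields 22% rather than 20%.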

It could always be argued that person B is actually thinking faster, but thinking through the problem much more carefully, causing him to verbalize his answer later. Or perhaps both people are thinking at the same speed, but person B has simply thought more. Maybe person B is actually thinking slower but more effectively. From an outsider's perspective, we can't know what's happening under the hood, which makes "cognitive speed" a weak metric for judging effectiveness in practice.

> In practice, the quality of thought processes seems much more important than the rate at which such processes are carried out.

Of course the quality of thought matters.

But if you have two individuals that have the same quality, but different speeds, the faster one will seem smarter.

Think of a fast flowing conversation. A slower thinker is going to miss references and associations that a faster one will get. So the faster thinker will learn more from the conversation, and be able to participate more. This has many benefits.

This. You want some easy proof? Watch a video of Steve Jobs answering questions, like the WWDC 1997 Q&A video. Time how long it takes him before he starts speaking after the questions are asked, and note the quality of the answer.

Yes, but presumably in those long pauses, a faster thinker would have time to consider more possible replies, make predictions on how the audience would receive them, and generally make more adjustments and improvements than a slower thinker.

In other words, faster cognition would probably help you formulate a slow, considered reply just as much as it would help you answer more quickly. That might be especially helpful in a public speech to a group asking unplanned questions.

I agree with freyr, though, that quality of cognition/ideation matters more than speed, and doubt that 'thinking speed' is necessarily correlated to better thought output.

Like IQ itself, faster basic cognition probably means something, but I don't think anybody knows what it means or how (or even if) it relates to "intelligence" (whatever that means).

In my personal experience, I have seen more great ideas and solutions come from "weird thinkers" than "fast thinkers". I realize that sentence does reduce to the cliche, "Think outside the box, bro," but it seems true over the course of all that I have observed in my own life.

In that example, it may also simply have been that Jobs had been prepped (or prepped himself) thoroughly with answers to a lot of likely questions.

I don't put a lot of credence in the speed thing here. I know one senior executive who made a point of writing down questions on cards when he was asked them live at an event. I always thought that was a rather clever approach: 1) it made sure he understood the question correctly, and 2) it gave him some time to formulate the best response.

Reaction time is good in general, but rather useless in the traditional IQ tests, i.e. culture-neutral multiple choice pattern-based questions. If you are 25% faster than your neighbour it gives you the benefit of the additional 25% of the reaction time, not total answering time, which is usually much larger in the context of IQ tests. Or nearly any test, for that matter.

Oh, sorry if I wasn't clear. Reaction time is strongly correlated with the overall speed of brain function. So it applies to other brain operations too.

Normal IQ tests should also contain a computerized part where you have to react and solve problems on a computer under a time limit. Last time I did that, they still used Win2000, but I can only imagine that has improved since.

A possible explanation for the correlation with reaction time is brain damage. If you are exposed to more lead as a child, that will damage both your motor cortex and higher cognitive functions, even if those aren't otherwise connected. The same goes for mutational load: a random mutation might make your neurons slightly less efficient across the board (everyone carries such mutations, but our genetics is redundant enough to handle most of them).

But I wouldn't expect a perfectly "normal", undamaged human to be a genius, though I would expect them to have superior reflexes. And there are tons of animals that are dumb but have amazing reflexes.

Processing speed is only one part of IQ tests. At higher IQs you can see a pattern where processing speed is lower than normal but other parts nearly max out. People with jobs where you always have to be meticulous (like software) can show this pattern. Processing speed is also weighted lower in IQ tests than things like pattern matching or problem solving.

In an FPS like Halo the relevance of skill sets is much harder to measure, but I think it's a delicate balance.

If I were forced to choose, I'd say motor skills and reaction time combined with spatial intelligence are the key drivers of success. I have yet to see an FPS professional who lacks tremendous shot accuracy and agility and relies only on strategic skills. But I have seen some with great shots and spatial awareness make many questionable decisions. Then again, my exposure is limited and my perception is biased.

I'd guess general intelligence shares relevance almost equally as it affects weapon utilization, detection avoidance, enemy action prediction, strategic positioning, verbal skills (team games), random trickery, etc.

Well, you could think about it in the sense that a quick mental reaction time is necessary but not sufficient for a quick physical reaction time.

> However, that's not to say there is zero correlation between blockbuster video games like Call of Duty and IQ. Maybe another study will pursue that.

There's a lot to learn and practice to achieve mastery in those games. And experience in other similar video games will strongly correlate with how quickly, and how far, skill in one particular (popular) game is acquired.

I was watching a video a while back with these two guys playing QuakeIII. They both knew the map intimately, and kept track of when power-ups would respawn based on when they were picked up. Then they based their strategy on what resources they had, and what resources their opponent had and where the other was likely to go next.

So yeah, we're talking hundreds, if not thousands of hours devoted to mastery in this game. Against an opponent like that, two Quake newbies would score about the same (dying almost instantly), even if one of the newbies was otherwise very smart, and the other very dumb.

My god, did you just bring up some memories. In the '90s my buddy Morgan was head of the game lab at Intel. He was a master of Quake, but another factor was that in the '90s broadband was much more scarce. So imagine playing Quake on the fastest machines Intel made, on the fastest corporate network likely available at the time, AND being a master of Quake. That was Morgan. And he got paid to do it.

I fucking hated playing Quake against him. I was OK, but he was amazing. He also would taunt you.

However, my best gaming memories were when we would basically live at Intel for days at a time, playing UO across six different machines simultaneously on that same corp net. No lag. Thanks for triggering that memory.

Quake (actually Quake III) is still alive today in the form of http://quakelive.com which is free-to-play.

I think the time is ripe for Arena FPS to make a return to the mainstream (with updated graphics). Toxikk which is still Early-access seems very promising in this regard (https://www.youtube.com/watch?v=B28oQvaL7Mg).

I think CS:GO would make for a particularly interesting study. It can probably be studied just by looking at logs. You would have to consider network lag when measuring performance.

I like it because maps like Dust 2 are so well known by competitive players. The success of an individual depends on sophisticated social factors around cooperation or divergence from team members. Strong teams nearly always outperform teams with strong individuals but poor cohesion in my experience.

Intelligence could be measured across a variety of types/dimensions including motor skills/reaction time, individual tactical performance and team cohesion.

Is anybody doing work like this? Is the data available from Valve or independent server admins?

I was hoping for at least a mainstream game that was mainly about problem solving, like Portal.

Ok, we (belatedly) changed "video games" to "puzzle games" in the title here.

I had to go through an IQ test and a couple of brain teasers for a job application. I wasn't hired even though I scored well on the IQ test and answered 4 out of 5 of the brain teasers correctly; they were expecting an outstanding result (though they would pay only about $20/hour).

I came to the conclusion that IQ tests and brain teasers are just bullshit (in the context above, that is). They expect an exceptional result from you, when really, in day-to-day life you already solve far more challenges, and give better solutions, than, let's say, 90% of the population.

You are not the only one criticizing IQ tests. Even Theodor Adorno had his problems with those tests. Some even claim that IQ research as a whole is not very scientific at all. An IQ test, for example, is considered good if it produces a Gaussian distribution of results when applied to a large set of people. However, we still don't know whether human intelligence is actually Gaussian distributed. The only evidence you can find is such IQ tests, which are designed with this very pattern in mind. It's like asking a question with the answer already in mind.

I guess with the coinage of phrases like "emotional IQ" and "cognitive IQ", even the research community expressed doubt that those tests can truly keep their promises. Freerk Huisken might be right in the end. He argued that it's a logical problem to try to measure intelligence with artificial tests, as they cannot capture the true capabilities of a human. Intelligence is a damn complicated subject, and although we humans love to measure and categorize everything, we should consider that IQ tests are far from comparable. Truth is, we still do not understand what intelligence actually is :)

This is actually a nice topic to discuss with philosophers.

Speaking as someone whose exposure to intelligence testing comes second-hand via a psychology graduate student, my understanding is that the quality of an IQ test is just based on how well it correlates with other IQ tests. The distribution always looks normal because they normalize it. https://en.wikipedia.org/wiki/G_factor_%28psychometrics%29#P...
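To make the normalization point concrete, here is a minimal sketch (my own illustration, not from the linked article) of rank-based norming: each raw score's percentile is mapped to a normal quantile, so the output distribution looks Gaussian by construction regardless of the input's shape.

```python
from statistics import NormalDist

def normalize_to_iq(raw_scores):
    """Map raw test scores to IQ-style scores (mean 100, sd 15)
    by converting each score's percentile rank to a normal quantile."""
    n = len(raw_scores)
    ranked = sorted(range(n), key=lambda i: raw_scores[i])
    iq = [0.0] * n
    for rank, i in enumerate(ranked):
        pct = (rank + 0.5) / n          # midpoint percentile avoids 0 and 1
        iq[i] = 100 + 15 * NormalDist().inv_cdf(pct)
    return iq

# Even a heavily skewed raw distribution comes out normal-looking:
print(normalize_to_iq([1, 2, 3, 50, 200]))
```

Note that this transformation preserves only the rank ordering of raw scores; any information about the true shape of the raw distribution is discarded, which is exactly the circularity the parent comments are pointing at.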

We may not understand what intelligence is philosophically, but in the field of psychology, IQ/general intelligence/g factor is a statistical construct that isn't directly measurable but can be inferred by its correlation with measurable factors. https://en.wikipedia.org/wiki/Factor_analysis

When we give someone an IQ test we give them a made up set of problems to solve. We hope that if someone can solve these made up problems they will be better at solving problems that real life throws at them.

When we look at the research, we find that if we know someone's score on an IQ test (how well they solve made-up problems), they tend to do a little better (on average) at life than people who aren't as good at solving made-up problems. This is not meant to be the full measure of a man, or to tell us what intelligence is. (That's probably a question a bunch of tenured philosophers could argue about until the end of civilization :) )

The pro IQ argument I hear is that nothing else predicts success in life as well.

One must wonder how much that has to do with our individualist society that tries to assign rewards to individuals instead of emphasizing group cooperation and communal advancement. In a society that freely shares, would IQ be as powerful a predictor?

An intelligence test for groups of people would be a very interesting idea.

Wow, this is such an out-of-left-field but excellent idea. I personally spent most of my formative years in a culture that wasn't as individualistic as American culture, so it's fascinating to me that the IQ test has such an implicit individualistic bias.

Right, IMHO IQ tests (and the brain teasers, anyway) could be used, but only as a small part of a more complex process. Like, what about giving the applicant a real problem to work on? Unless you don't have enough time to make a hire, but in that case, well, good luck.

I have more interesting subjects to read about, I will let the experts tell me their conclusion (when they get one.)

These academic pay walls really are a cancer to learning and human knowledge.

In proper scientific fields, authors post preprint PDFs on freely available sites like arXiv.

A year and a half back, when I was in the 10th grade, I did similar research on the effects of video games on students' cognitive skills. I designed games specifically for this research. I wanted to extend the study to test commercially available video games, but due to lack of support and funding, I could not (I'm still in high school).

You can go through the paper here if you want: http://www.emerginginvestigators.org/2015/01/gaming-cognitiv... OR http://arxiv.org/abs/1504.01665

TL;DR of the paper: I divided the class into two groups, one that played video games and one that did not. Two tests were administered: one before the children were allowed to play games and another at the end of one week. The children played games for an hour daily. The results showed that gaming does improve cognitive skills. I was particularly surprised by the substantial improvement in scores that it led to.
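For anyone curious what the core comparison in a pre/post design like that boils down to, here's a minimal sketch with hypothetical numbers (stdlib only; the real paper would also need a significance test and larger samples):

```python
from statistics import mean

# Hypothetical pre/post test scores per child (invented for illustration).
gaming_pre,  gaming_post  = [52, 48, 60, 55], [58, 53, 67, 60]
control_pre, control_post = [50, 49, 58, 56], [51, 50, 57, 57]

def mean_gain(pre, post):
    """Average pre-to-post improvement across a group."""
    return mean(p2 - p1 for p1, p2 in zip(pre, post))

print("gaming gain:", mean_gain(gaming_pre, gaming_post))    # 5.75
print("control gain:", mean_gain(control_pre, control_post)) # 0.5
```

Comparing gains rather than raw post-test scores is what controls for the practice effect of having seen the test once before, since both groups took it twice.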

PS: Anyone who wants to take this further or has ideas for implementing it, please feel free to contact me. I would love to help you out with this.

I am the founder of http://brainturk.com . I would like to talk to you about this. What's your email?

You can send me a connection request on Linkedin (https://in.linkedin.com/in/mohnish7) and once I accept it, you can check out my email address. I'm a bit skeptical about sharing personal data on public forums.

Maybe a better summary of this paper would be "IQ tests presented on a computer screen produce scores highly correlated to IQ tests presented on paper," since they didn't use what you would normally call "video games."

Was thinking this knowledge could be used as a recruiting tool, then immediately this came to mind...

"Greetings, Starfighter. You have been recruited by the Star League to defend the frontier against Xur and the Ko-Dan armada..."

I don't know if this is what you're referring to, but this is exactly what patio11 is working on now http://www.starfighters.io/

He's referring to the (1984) movie "The Last Starfighter", in which an arcade game is used to recruit a fighter pilot.

It's also where patio11 + ptacek got the name for that project.


They are already used for that purpose:

"The largest human cognitive performance dataset reveals insights into the effects of lifestyle factors and aging" Daniel A. Sternberg, Kacey Ballard, Joseph L. Hardy, Benjamin Katz, P. Murali Doraiswamy and Michael Scanlon, Front. Hum. Neurosci., 20 June 2013 http://dx.doi.org/10.3389/fnhum.2013.00292

One addendum here is that you can use them to measure g, but as soon as you start trying to use them to improve g, any value of g you measure with them will be incorrect due to learning effects. And there is no reliable evidence that playing such games can improve g. Source: wrote my prelims on this.

reliably ≠ accurately

Wow, some psychology professor who read Ender's Game (probably watched it, since, come on, psychology is science at its laziest, and the timeframes match) decided to make a quick buck by publishing a paywalled paper "discovering" "for the first time" that people give off signs of their intelligence when observed performing an activity that requires them to use it (which describes an infinite number of activities...).

Hey, I can reliably measure your intelligence by reading your comments on Hacker News. May I have your cash-money bucks, please? $_$ I have been working hard with hundreds of my students to prove it! For the first time ever! $_$

The fact is that you need specifically designed games for an accurate test. Most video games just won't make the cut, even in combination: take the top 10 from Steam. I dare anyone to accurately measure the IQ of their players just by watching them play in whatever lab environment they want. And games that would work would be pretty boring. They would be, in fact, tests.

Publish or perish mentality ... or ... good science hindered by a hyperbolic, misleading title. The former is cancerous but the latter can be easily rectified. I myself can't call it because I didn't click the link hoping I would get a TLDR in the comments.

Is there a link to the paper?

You are a saint.

I was curious what they meant by "commercial" video games. I was expecting real video games like Quake or The Witcher, but instead it's what you'd expect to be used to measure IQ: puzzle games, some of which mimic questions on regular IQ tests. Not as exciting as the abstract leads you to believe.

I don't have a credit card; I'm a penniless person studying at a public university that teaches (almost) nothing good, and I want to become a game dev based on free-software philosophy. I'm currently researching edutainment and... zomg... you pwn, thanks! Really, thanks!

Every article I want to read about this is payment-only, so it is becoming really hard to research this subject :( But today, you saved the day!!

This just might make your life easier: when you encounter a paper you want to read, but it's behind a paywall, post a request to https://www.reddit.com/r/scholar (the instructions are in the sidebar, make sure to follow them closely to get a speedy reply). One of the kind souls there who have access to the article will give you the link.

I don't know what specific university you're going to, but many university libraries have free access to journals for students.

You should do some research on your school's library's website.

Sad that we need to research how to get access to research.

How much impediment to the progress of the arts and sciences should we allow copyright to impose?

Now that you mention it, I think my university has access to some scientific journals... but I think that only covers HINARI and SciELO... and I have to go through a formality to get an institutional account.

Fuck yeah! You rock!

(Also, I didn't need anything weird to download it; it was a straight HTTPS transfer over port 443 for me.)

Fuck Mega. For some reason they require Flash to download a file? Doesn't sound very trustworthy.

Use Chrome. The best thing to do is to download MEGASync for Windows or Mac OS, as it makes everything straightforward.

Welcome back to the days of "You need Internet Explorer 4 and please download this ActiveX control to use this site"...

Use a different browser.

We can estimate a metric of intelligence by observing people performing tasks that require thought and problem solving?


> Here we show, for the very first time, that commercial video games can be used to reliably measure individual differences in general intelligence (g). One hundred and eighty eight university undergraduates took part in the study.

Is this a joke? Reliably measure? 188 students? So are you trying to tell me that if I could play Call of Duty with A. Einstein, I would be humiliated despite him never having played a video game in his entire life?

You would do well to learn about effect sizes and p-values. Your common-sense guess at what might be reliable is incorrect, which means you're doing a disservice to your own understanding of the world.
