Neurosignal decoding was the topic of my PhD (which, to be fair, I quit after a year).

Every time an article comes up on the topic, I read through the comments section and realise how full of shit we all are here at HN. Then I read another article and assume the commenters are all smart people who understand the topic at hand.

I can't remember the name of the law that describes this, but it is absolutely real.




Ok, but rather than posting a meta putdown, why not share some of what you know about the topic? That way we can all learn something. A comment about neurosignal decoding would be much more interesting than yet another complaint about people being wrong on the internet, which we're all in a position to supply.

https://hn.algolia.com/?dateRange=all&page=0&prefix=true&sor...

https://news.ycombinator.com/newsguidelines.html

To be fair, this problem is a co-creation between comments and upvotes, and the upvotes deserve more of the blame. But if you post an on-topic, substantive comment in the first place, then you're contributing to improving this place rather than worsening it. Recent comments on similar cases:

https://news.ycombinator.com/item?id=27110515

https://news.ycombinator.com/item?id=26894739


Hmm, perhaps there was some sort of cultural misunderstanding (I've always assumed you're American, and perhaps the choice of words on my part was quite Australian), but what I posted wasn't intended as an insult or flamebait - I included myself in the group of people who are "full of shit"!

It was more intended as a gentle reminder that there are a lot of comments on HN posted in an authoritative tone by people who are not experts in the field, and perhaps we should all stop and think before contributing to the problem in search of validation.

This is something I honestly care more about than BCI technology, and genuinely felt it was a positive contribution to the discussion as a whole. Based on others' reactions, it didn't seem to be taken negatively either.

Regardless of all that, in one of my comments below I did actually share both a short summary of the field and my experience as a researcher, as well as a 20-page unpublished literature review. Hopefully that can help enlighten people somewhat on the terminology used in the BCI field and point out some of the limitations and issues with the current state of the art.


I hadn't seen your other comment when I wrote that. Thanks for the kind reply (and for teaching us about BCI technology!)


Thanks for the work you put in as a moderator, and I mean that genuinely. Very few people seem to be able to walk the tightrope between freedom of expression and civility, and HN would be a worse place without you.


Oblig. XKCD: https://xkcd.com/386/

(My wife says this particular XKCD reflects me the most!)


Who are you, the master of articulation? _/||\_


In a manner of speaking, he is, as a moderator on HN.



Gell-Mann amnesia effect[1], coined by Michael Crichton.

“Briefly stated, the Gell-Mann Amnesia effect is as follows. You open the newspaper to an article on some subject you know well. In Murray's case, physics. In mine, show business. You read the article and see the journalist has absolutely no understanding of either the facts or the issues. Often, the article is so wrong it actually presents the story backward—reversing cause and effect. I call these the "wet streets cause rain" stories. Paper's full of them. In any case, you read with exasperation or amusement the multiple errors in a story, and then turn the page to national or international affairs, and read as if the rest of the newspaper was somehow more accurate about Palestine than the baloney you just read. You turn the page, and forget what you know.”

[1] https://www.goodreads.com/quotes/65213-briefly-stated-the-ge...


With zero sarcasm, I truly want to know your opinion on the subject. Generally, I find this kind of stuff interesting, with wonderful potential for abuse. Someone more in the know, boots on the ground if you will, is worth infinitely more metric fuck tonnes than anyone else's opinion... to me at least.


My honest opinion: anyone reading this will be dead by the time the technology advances to the stage where BCI ethics and abuse are even worth considering.

These articles are designed to make it seem like the technology is developing rapidly, similar to the early CPU and memory chips. A more apt analogy is the study of electricity in the 18th century: we can observe some odd effects but have no idea what we're actually looking at. This won't change until we gain a proper understanding of the human brain, just as electronics didn't take off until we understood the atom.

If you want my opinion on the current state of the art (as of 2018), as well as a quick introduction to some of the technology, here is an old draft of my research proposal (you can skip almost all of it except Appendix 1; that's by far the most important argument for why I think the current paradigm cannot deliver significant improvement): https://docs.google.com/document/d/1pmgCpDLEfHlWDu6OoHuoTOQ4...
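
(For anyone who wants a concrete picture of what "the current paradigm" means for non-invasive decoding, it's roughly: band-pass filter the signal, extract band-power features, fit a linear classifier. Here's a minimal, purely illustrative sketch, using synthetic data and standard scipy/scikit-learn calls, not taken from the proposal:)

    import numpy as np
    from scipy.signal import butter, filtfilt
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

    fs = 250                                  # sampling rate in Hz
    rng = np.random.default_rng(0)

    # Synthetic "EEG": 100 trials x 8 channels x 1 s of samples.
    X = rng.standard_normal((100, 8, fs))
    y = rng.integers(0, 2, 100)               # two imagined-movement classes
    t = np.arange(fs) / fs
    X[y == 1, 0, :] += 0.5 * np.sin(2 * np.pi * 12 * t)  # weak 12 Hz rhythm in class 1

    # 1. Band-pass to the mu/beta band (8-30 Hz), where motor imagery shows up.
    b, a = butter(4, [8, 30], btype="band", fs=fs)
    Xf = filtfilt(b, a, X, axis=-1)

    # 2. Features: log band power per channel.
    feats = np.log(np.var(Xf, axis=-1))

    # 3. Linear classifier; LDA is the de facto baseline in the BCI literature.
    clf = LinearDiscriminantAnalysis().fit(feats[:80], y[:80])
    print("held-out accuracy:", clf.score(feats[80:], y[80:]))

Real systems add spatial filtering (e.g. CSP) and constant per-session recalibration, but the skeleton is the same, which is part of why I don't see the current approach delivering much more.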

As you can probably tell, I gave up because I could identify gaping flaws in the current paradigm but wasn't able to formulate a new one out of nothing.

On a side note, I suspect this is why impostor syndrome is so prevalent amongst PhD students. You're not actually expected to contribute worthwhile knowledge to society, but instead to expand the body of knowledge in your field. In a static field, unless you are a hyper-genius capable of spinning up a whole new paradigm, this means churning out papers which you know are bullshit but are supported well enough by the existing body of research to appear reasonable. In short, grad students feel like impostors because they are.


Exact same reason I got out of academia, and my field was a lot more "realistic" than yours (post-silicon semiconductor physics).


I don't think it's fair to expect a PhD student to revolutionize the field. The goal of a PhD is to train a good researcher, which is probably more important than producing new knowledge. Most papers are maybe boring and have little that's new in them, but I still think they are worthwhile.


Clearly it's not ready for commercialisation (it needs weekly recalibration, only a single individual has been tested, etc.), but it's still amazing: we're decoding signals from the brain! Isn't that worthy of discussion?


So then what's your take on Neuralink? Are they going to hit a plateau within the current paradigm? And what do you think that plateau will look like technologically? Maybe we could still get pretty far within the current paradigm.


Gell-Mann amnesia effect: https://en.wikipedia.org/wiki/Michael_Crichton#GellMannAmnes...

> In any case, you read with exasperation or amusement the multiple errors in a story, and then turn the page to national or international affairs, and read as if the rest of the newspaper was somehow more accurate about Palestine than the baloney you just read. You turn the page, and forget what you know.


It baffles me that the Gell-Mann Amnesia effect still doesn't have its own Wikipedia page, given how pervasive it is.


Add in Dunning-Kruger and you get one hell of a shitshow.



