You're confusing concurrency and parallelism. Concurrency describes having several tasks or operations that contend for resources or events. They may or may not execute in parallel, and certainly don't in Node. In contrast, Netty provides a thread pool for executing events and thus offers both concurrency and parallelism.
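
To make the distinction concrete, here's a minimal Java sketch using plain java.util.concurrent (not Netty's actual API; its event loop groups are, roughly speaking, specialised thread pools): one worker thread gives you concurrency without parallelism, Node-style, while a fixed pool gives you both.

    import java.util.concurrent.ExecutorService;
    import java.util.concurrent.Executors;

    public class ConcurrencyVsParallelism {
        public static void main(String[] args) {
            Runnable task = () ->
                System.out.println(Thread.currentThread().getName() + " handled an event");

            // Concurrency without parallelism: many tasks, one worker thread.
            // Tasks are interleaved but never execute at the same instant.
            ExecutorService single = Executors.newSingleThreadExecutor();
            for (int i = 0; i < 4; i++) single.submit(task);
            single.shutdown();

            // Concurrency with parallelism: many tasks, several worker threads,
            // which is roughly what a Netty-style thread pool buys you.
            ExecutorService pool = Executors.newFixedThreadPool(4);
            for (int i = 0; i < 4; i++) pool.submit(task);
            pool.shutdown();
        }
    }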



As far as I know, by definition (coming from someone with a predominantly Java background), concurrency requires a multi-core processor, or, for example, a distributed application with at least two nodes that communicate with the same central server / "hub" (which, as far as I know, would still require a multi-core processor on the server). The central issue usually raised with concurrency is race conditions, where it cannot be predicted which thread or node will access a resource first. If a task isn't executed in parallel, e.g. it has to wait for the other to finish before it starts or proceeds, I don't see how it could cause a race condition.

Admittedly, I don't have deep knowledge of how high-level code translates to actual CPU instructions, so it could be possible, or even likely, that if the processor switches between tasks such that each successive instruction comes from a different method, concurrency issues would occur even on a single-core processor.

Java has robust, mature language features to deal with this, so I would be wary of using something like Node.js if this weren't well documented. Concurrency issues are nasty, and I have seen firsthand the mayhem they can cause in legacy applications I've worked on.
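
To illustrate the kind of interleaving I'm worried about, here's a minimal sketch with plain JDK threads (the class name is just for illustration): an unsynchronized counter incremented from two threads can lose updates even on a single core, because the scheduler can preempt one thread between the read and the write of counter++.

    public class SingleCoreRace {
        static int counter = 0;  // shared, unsynchronized state

        public static void main(String[] args) throws InterruptedException {
            Runnable increment = () -> {
                for (int i = 0; i < 100_000; i++) {
                    counter++;  // read-modify-write; not atomic, so a context
                                // switch between the read and the write loses updates
                }
            };

            Thread a = new Thread(increment);
            Thread b = new Thread(increment);
            a.start();
            b.start();
            a.join();
            b.join();

            // Expected 200000, but usually prints less, even if the JVM
            // only ever runs on one core, because the threads are time-sliced.
            System.out.println(counter);
        }
    }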


Concurrency doesn't require multi-core. Parallelism requires multi-core.


I certainly don't like to go around spouting nonsense, so I spent some time looking for formal definitions of concurrency. I wasn't able to find any support for the description you provided, though. Do you have a source?

FWIW, Wikipedia seems to believe that "concurrency is a property of systems in which several computations are executing simultaneously, and potentially interacting with each other".

Regardless, sorry if sloppy (or poorly defined) terminology obscured my point.


If you quote the Wikipedia definition with a bit more context you'll see it matches mine:

"In computer science, concurrency is a property of systems in which several computations are executing simultaneously, and potentially interacting with each other. The computations may be executing on multiple cores in the same chip, preemptively time-shared threads on the same processor, or executed on physically separated processors."



