Am I the only one who finds the use of the phrase "real world" for the world of microcontroller programming jarring? It is used throughout the article.
e.g.: "rationals are never used in the real world, and floating-point is quite rare"
To the author's credit, he does define his use of the term: "On the other hand, all of these languages are too big to be used in the real world (micro-controllers), which is the realm where Forth and C remain popular."
but I still found it a bit jarring.
"It is reasonable to predict that Factor will largely phase out Forth in the world of desktop programming, but that Forth and C will continue to be predominant in the real
world."
I understand that English is probably not the author's first language (it isn't mine either), but if he is reading this, replacing "real world" with "microcontroller world" might be an easy change to make.
I've been lurking on the factor-talk mailing list where this paper has been discussed, and I believe the author explained that he didn't mean 'real world' in a derogatory, arrogant way, but rather in a more literal sense. For example, while floating-point is predominant in scientific computing, spreadsheet software, and even video-game engines, in 'the real world' of physical, human-scale objects moving at moderate speeds and unremarkable altitudes, fixed-point is often more than adequate.
It's like saying that while quantum mechanics and general relativity may be useful for certain applications, in The Real World, Newton's laws of motion are all you need. It's not belittling more sophisticated solutions; it's just a kind of tactless pragmatism.
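To make the fixed-point point concrete (this is just my own sketch, nothing from the paper, and the numbers are made up): on a part with no FPU, human-scale quantities fit comfortably in 16.16 fixed point, e.g. in C:

    #include <stdint.h>
    #include <stdio.h>

    /* 16.16 fixed point: top 16 bits are the integer part,
       bottom 16 bits are the fraction. */
    typedef int32_t fix16;
    #define FIX_ONE ((fix16)1 << 16)

    /* Multiply in 64 bits, then shift back down to 16.16. */
    static fix16 fix_mul(fix16 a, fix16 b) {
        return (fix16)(((int64_t)a * b) >> 16);
    }

    int main(void) {
        fix16 speed = 3 * FIX_ONE / 2;            /* 1.5  m/s */
        fix16 time  = 2 * FIX_ONE + FIX_ONE / 4;  /* 2.25 s   */
        fix16 dist  = fix_mul(speed, time);       /* 3.375 m  */

        /* Print without ever touching a float. */
        printf("distance = %ld.%03ld m\n",
               (long)(dist >> 16),
               (long)(((int64_t)(dist & 0xFFFF) * 1000) >> 16));
        return 0;
    }

That prints "distance = 3.375 m" using nothing but integer arithmetic, which is roughly what is meant by fixed-point being adequate at human scale.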
Have you considered the possibility that perhaps when the author says "the real world", he might mean "the real world"? I would not be surprised to learn that his point of view is that embedded systems make the world go round, and web programming is a kiddy pool for quiche-eaters. Most programmers are elitists about something or other; that "something" might as well be microcontrollers.
"Most programmers are elitists about something or other; that "something" might as well be microcontrollers."
It is certainly possible.
I guess it is just that I think the hypothesis that "English is not his first language" is more probable than "he is an elitist who thinks only microcontroller programming is real programming" as an explanation for this usage. I could be wrong.
Interesting view of "the real world", where Python, C++, Factor, etc., are not used. This real world is contrasted with the desktop.
This is a pretty detailed comparison of Factor and Forth, insofar as I can tell (being neither a Forth nor a Factor programmer). There is a little confusion about compiled vs. interpreted: the author of the paper claims that C is interpreted based on its use of "%1" in printf statements.
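If I'm reading that part right, the underlying observation is just that printf walks its format string at run time, so the format string is effectively a tiny interpreted language. A hand-rolled sketch of the idea (not the actual libc code; it only handles %d and %s):

    #include <stdarg.h>
    #include <stdio.h>

    /* A toy printf: it scans the format string character by character
       at run time, which is the "interpretation" being talked about. */
    static void tiny_printf(const char *fmt, ...) {
        va_list ap;
        va_start(ap, fmt);
        for (; *fmt; fmt++) {
            if (*fmt == '%' && fmt[1] == 'd') {
                printf("%d", va_arg(ap, int));  /* cheat: reuse printf for the digits */
                fmt++;
            } else if (*fmt == '%' && fmt[1] == 's') {
                fputs(va_arg(ap, const char *), stdout);
                fmt++;
            } else {
                putchar(*fmt);
            }
        }
        va_end(ap);
    }

    int main(void) {
        tiny_printf("%s has %d cells\n", "the stack", 8);
        return 0;
    }

Whether that makes C itself "interpreted" is another question, but that seems to be the sense in which the word is being used.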