

Sapir-Whorf with Programming Languages - gnosis
http://nklein.com/2009/02/sapir-whorf-wit-programming-languages/

======
jfb
I've long thought that at least metaphorically, Sapir-Whorf fit computer
languages like a glove. This observation is hardly new ("You can write COBOL
in any language"), and it's not particularly _useful_, save as shorthand for
describing, say, potential hires; in my experience, most programmers write
code as in the language they first really used, regardless of the language
they're _actually using_, and so it's often useful to ask someone with years
of Java but some Python (for instance) to write something in Python and note
if they've bothered to learn the idiom. Hardly determinative, and of course
there are those who are capable of really picking up new computer languages,
semantics included, just as there are those with the gift for human languages,
but it's just something that I keep an eye on.

~~~
Miky

> most programmers write code as in the language they first really used, regardless of the language they're actually using
I doubt that this is true. I can't quite remember, but I'm pretty sure the
first language I ever used was Color Basic. I've forgotten quite a bit of it,
and it has pretty much no influence on my programming.

The first real programming I did was with structured imperative languages
(Game Maker was the first thing I used for anything real). The code I wrote
bore almost no resemblance to Color Basic, and while I was using these
languages, solutions to problems using the tools of structured imperative
programming came to mind much more readily than solutions using the style of
unstructured Basic.

Later, I learned Haskell and functional programming. For the first while,
everything I wrote was like a puzzle. I had to think for a long time to do
things the functional way. But having used it enough, functional solutions
now come to me in Haskell just as readily as imperative solutions do while
I'm in those languages, perhaps even more readily.

Of course, people who are experienced with one language and inexperienced in
another will try to apply the idioms from the first language in the second
whether they're applicable or not, but programmers do not always write code as
in the first language they really used.

~~~
jfb
I may have not made myself clear; I think it is largely true that programmers
_start_ writing COBOL in language _x_; some rapidly progress, others don't.
It's the latter group that are red flags to me — it signals a lack of
intellectual curiosity.

Again, it's not dispositive or anything; sometimes programmers, particularly
younger ones, just haven't seen the strengths of the idiom in their alternate
languages.

------
quinndupont
No one actually believes the Sapir-Whorf hypothesis anymore, do they?

~~~
long
I'm a grad student in cognitive psychology; my advisor works in the so-called
"neo-Whorfian" literature.

What Sapir and Whorf were proposing is nowadays called "linguistic
determinism" and is pretty much universally repudiated.

However, there is evidence for a less radical version called "linguistic
relativity". For an example, see this paper:
langcog.stanford.edu/papers/winawer2007.pdf

~~~
noblethrasher
I haven't read the whole paper yet, but I find it interesting that the Russian
language prescribes a distinction between light blue and dark blue, and that it
just so happens that humans are more sensitive to blue than red or green.

In fact, many expert Photoshop artists are aware of a trick whereby they can
make small changes to the blue channel to get dramatic effects on the
composite.

~~~
jacobolus
> _humans are more sensitive to blue than red or green_

Without clarification, this is a nearly meaningless statement, but under the
most plausible interpretations I can think of it is wrong [in particular, we
have few "short wavelength" S cones in the fovea, the very central part of the
retina, and it is the signal from these S cones that allows us to distinguish
yellow from blue; the S cones contribute much less to lightness response than
M or L cones].

What human vision is primarily sensitive to is differences between a color and
its surroundings, and the relative sensitivity to differences along the
light/dark, red/green, and blue/yellow dimensions depends so much on the size
of the object, the prevalence of such differences in the rest of the scene,
the state of adaptation of the eye, and so on, that no single ranking holds.
At small scale, we are most sensitive to lightness contrast, and two objects
of similar lightness will tend to blend together at the edge even if they are
of quite different hue.

~~~
bane
It's also wrong: human photoreceptors are more sensitive to green wavelengths
than to blue or red.

<http://en.wikipedia.org/wiki/Green>

"The sensitivity of the dark-adapted human eye is greatest at about 507 nm, a
bluish-green color, while the light-adapted eye is most sensitive about 555
nm, a yellowish-green color"

It's why night-vision systems are geared to green, but displays that preserve
dark-adapted vision are red.

<http://en.wikipedia.org/wiki/Red>

"Red light is also used to preserve night vision in low-light or night-time
situations, as the rod cells in the human eye aren't sensitive to red."

------
rgbrgb
Yes! Making a new function in Obj-C is so unnecessarily complicated. I would
bet that Obj-C functions are much longer on average than functions in more
functional languages. Though header files are great for documentation, most
functions one writes should NOT be public; they're small, Lisp-macro-like
helpers that shouldn't be part of the object's header.

Yes, I know you can make 'private' functions by declaring them at the top of
your implementation file. However, this seems like a hack inasmuch as you
never look at these declarations again until you write another private
function - they're just there so your code compiles without warnings.
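In plain C terms, the "declarations at the top of the implementation file"
pattern looks roughly like this (function names are invented for
illustration; `static` is the C analogue of keeping a helper out of the
public header):

    
    
      /* "Private" helper: a static prototype at the top of the .c file.
         static limits it to this translation unit, so nothing needs to
         appear in a public header. */
      static int clamp_to_byte(int v);
      
      int brightness_scaled(int v, int factor)
      {
          return clamp_to_byte(v * factor);
      }
      
      /* The definition can come after its callers, because the prototype
         above already told the compiler its signature. */
      static int clamp_to_byte(int v)
      {
          if (v < 0)   return 0;
          if (v > 255) return 255;
          return v;
      }

And as the comment says, that prototype at the top is write-once noise: you
only revisit it when adding another private helper.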

Here's hoping for Obj-C 3.

~~~
Zev
Since Objective-C is a superset of C, it also gets compiled in a top-down
fashion[1]. If you don't want to add a method to your interface (public or
private), just put it above everything else that will use it.

1. I believe there is something in the standard about only allowing for a
single pass, but don't quote me on it.
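The same define-it-above-its-caller trick works in plain C, with no prototype
at all (names here are made up for illustration):

    
    
      /* Defined above its caller, so no forward declaration is needed:
         by the time the compiler reaches scale(), it has already seen
         double_it(). */
      static int double_it(int x)
      {
          return x * 2;
      }
      
      int scale(int x)
      {
          /* If double_it() were instead defined below this point with no
             prototype, C compilers would warn about an implicit
             declaration. */
          return double_it(x) + 1;
      }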

~~~
rgbrgb
Yeah, unfortunately you'll get a lot of warnings from Xcode if you do this.

~~~
Zev
No you don't. You get warnings about not responding to the method if you put
the method implementation _after_ it is used. If it is before, it works fine.
Example implementation:

    
    
      @implementation SomeClass
      - (void) foo { [self baz]; } // a warning will occur; the compiler doesn't know about -baz yet
      - (void) bar { [self foo]; } // no warning; the compiler already knows about -foo
      - (void) baz { NSLog(@"bzz"); } // doesn't do anything special
      @end

------
6ren
It's been a while since I used either C++ or lisp, but for a function that's
only used in the same class, isn't it just the function name, argument names
and types, and type of return value? You only need to mess around with header
files if it will be used elsewhere.

For lisp, you still need the function name and argument names (don't you? or
is it common to access them as a list, with cddr etc?) - so it seems the
saving is only in the type of return value and types of parameters. That is
extra typing, but it doesn't seem much more - I guess it may make a difference
at the margin. But if so, this is just dynamic typing. You still choose
argument names and a function name.

Or is it more a cultural, idiomatic thing, of what's customary in a language?
Which is the author's point, of Sapir-Whorf. I think of course you'll tend to
do what's accepted in a language, what's easiest, what its strengths are. But
it's also possible to cross-use idioms. It may be a little awkward, but if it
really is the right idiom for that specific task, it should help.

Or it might be another kind of cultural factor: that the author's C++ is work
code, seen by other people (or more likely to be), and so function and
argument names need to be fretted over, for others to understand (and also not
criticize) them. I'm just guessing here. Why fret over C++ names, but not lisp
names?

------
kingkilr
This came up once before, here's what I had to say:
<http://news.ycombinator.com/item?id=2144142>

