Wow, I know it doesn't sound very mature, but seeing this article and the comments on HN really makes me feel happy and proud to be a Sikh.
But alas, the Sikhs' own state of Punjab is nowhere close to the successes of the community all over the world. It's a great tragedy.
The state of Punjab, once a pioneer of the Green Revolution and the richest state in India, is now being looted by dynastic politics and corruption. Things are so bad that people don't even want to go back and visit. Sometimes we wonder if that's the key to our current success everywhere but in our own home.
I hope, perhaps naively, that AAP has a chance to root out that particular dynasty. I remember they polled pretty well at the last election, but I'm not sure how things worked out on the ground afterwards.
As a member of the diaspora I wonder what more can be done from the outside?
Both! Many villages are now fighting to stop the opening of wine shops (some small villages have one school and 3-4 wine shops!). It is said that corrupt elements in the Punjab government get a cut for every wine shop opened in Punjab. They also own most of the bus network.
My grandfather was murdered in Punjab, and my cousin, who lived in Punjab for 18 years before moving here, would rather be called American and moved on immediately. Rule of law is necessary for market activity; the problem is we seem to have moved away from respect for the rule of law recently.
Just thought I would mention: I'm also of Sikh descent, part of the diaspora, happily living in an Anglo-derived culture, and also an atheist.
If the big idea was messaging and not objects, I'm curious how we ended up in this primarily object-oriented world, especially when its founding father didn't have that intention, as he says: "I invented the term object-oriented, and I can tell you that C++ wasn't what I had in mind."
I have a guess, based on the following quote: "This was why I complained at the last OOPSLA that - whereas at PARC we changed Smalltalk constantly, treating it always as a work in progress - when ST hit the larger world, it was pretty much taken as 'something just to be learned', as though it were Pascal or Algol."
There were people in a hurry to make something with whatever theoretical basis there was, instead of further refining the original concept. Probably that's where it took a wrong turn.
I don't think we really did end up in an object-oriented world. I think we ended up with nominally object-oriented languages used in ways that are more struct-oriented than object-oriented.
I had the good fortune to do early Java work with a bunch of Smalltalk programmers, so I know you can write Java that is reasonably well object oriented. But I don't think I ever saw an intro-to-Java book that was any good in that regard.
Refining a language or programming paradigm requires a philosophical or scientific bent. Not much money in that. Most people just want to get a job done and go home at 5pm.
Edit: On the bright side, I think polyglot programming is more popular now than at any point in history. In the past you learned one language, and that was 20 years of your career before you went into management. Your identity was wrapped up in being a "VB person", a "C person", a "COBOL person", a "Java/C# person", or a "Rubyist".
There is still language identity politics, but I think Ruby might have been the last of the One True Religions; now the frontier is spread across Python, Ruby, Node, Java, Go, C#, etc.
I think it is a perversion of computer science that we think of ourselves as users of languages instead of creators of languages, especially because it is not hard to create a language these days: we know much more and have better tools than people did 30 years ago. Many of the problems we consider uber-complicated could be solved relatively easily by creating appropriate problem-specific languages; consider web programming, data mining, etc.
The reason is that DSLs are viewed as just "another" burdensome language, instead of as codified problem-specific knowledge. A DSL should evolve as our understanding of the problem evolves. Think of successful DSLs like Mathematica, for example.
"Object-oriented" was like "agile" in the late 80s and early 90s. It was viewed by management as a quasi-silver-bullet that would solve the software complexity crisis -- and by consultants as easy money. The qualities of Smalltalk that were selected for in the object-oriented craze were the ones that most appealed to computer-unsavvy middle management -- the ones that helped them herd programmers by the dozens or hundreds without them stepping on one another's toes. So classes, data encapsulation, inheritance, and virtual-method polymorphism were emphasized to promote the creation of sealed components that could be extended or wrapped but not changed (Brad Cox's software IC dream) and late-bound messaging de-emphasized in order to enforce static typing constraints.
OO is still a useful way to organize programs. Even if you use, let's say, C++ templates instead of inheritance and method-table dispatch, it's still a useful way to mentally and physically organize your code.
> I'm curious to know how did we end up in this primarily object oriented world
Hundreds of millions of lines of C and millions of C programmers.
They could, in theory, smoothly migrate into the world of object orientation using C++.
C++ has, however, held as a constant design principle that if you don't use the new stuff, it's still C. Put another way: feed your C program to a C++ compiler and it behaves the same way.
That design back pressure makes late binding and message-passing a la Smalltalk a non-starter.
Objective C solved this in a very different way: by making the Smalltalk-y additions syntactically orthogonal to the existing C bits. That's why you send messages [inside brackets].
C++ was backed by AT&T. Objective C was backed by some whacky tech company from California founded by that loony guy they kicked out of Apple when the grownups took over.
Not to mention that Smalltalk had serious corporate backers too (eg IBM), so what room was there for Objective C?
Eventually Java came along with the mission to rescue the world from the explosive mixture of C++ and median programmers. It took a page from the C++ book and aimed to be easy to switch. Similar model, similar basic concept of operation. And that's largely been that.
For those complaining about the "quality" of this piece - that it somehow degrades his past essays - it's a response, like the others here: http://paulgraham.com/kedrosky.html Even in the past he has usually explained when he thinks he is being misunderstood.
For that matter, I think this was more of a response too: http://paulgraham.com/speak.html (written when suddenly the whole world seemed to be worried about pg saying "um" a lot). I think that article did calm those people a bit.
We have an expression for this in Hindi, and I'm not sure it translates well: trying to use a sword as a needle. Using mindfulness, which is transcendental in nature and can open the secrets of life, for a few more bucks! Yes, as someone becomes mindful it does help them concentrate, focus, and be more creative and productive, but that's a side product; and yes, we still have to earn our bread. But making that the entire aim is not the right direction for that energy.
But Zen is such an open philosophy that it won't mind, and won't judge this as corrupt or otherwise. It will let people naturally see the difference: that they are giving away their diamonds in return for pebbles. Not profitable, in my opinion.
The post is very negative. If you take away from someone what you think of as false, don't run away without providing a better alternative, even if that alternative is just to think for yourself and not follow anybody.
Yes, most blog posts go down the same path as so-called self-help books. But once in a while a sincere anecdote can give you a fresh perspective on how to handle a situation you didn't handle well.
I think we as a generation should feel lucky to be living while experimental psychology is blossoming. Books from authors like Daniel Kahneman, Dan Ariely, and Daniel Gilbert have had a life-changing impact on my thinking. Really understanding that our brains are not perfect, but are in fact a kludge, makes you think about taking precautions against these limitations.
And for productivity, the number one thing we need to outsmart is our own impulsiveness.
I wonder how the author keeps such a long list in his mind while working. I'm not sure how, when his brain is not cooperating, he would say "brain, here's the remedy, so stick to it". Most points follow the pattern "don't...", "resist...", "seize...". Whom are we kidding, as if it's that easy to direct our quirky minds.
I'm reminded of Daniel Kahneman's closing notes in his book 'Thinking, Fast and Slow'. After listing all the biases and quirks (to which he has devoted his life), he writes:
What can be done about biases? How can we improve judgments and decisions, both our own and those of the institutions that we serve and that serve us? The short answer is that little can be achieved without a considerable investment of effort. As I know from experience, System 1 is not readily educable. Except for some effects that I attribute mostly to age, my intuitive thinking is just as prone to overconfidence, extreme predictions, and the planning fallacy as it was before I made a study of these issues. I have improved only in my ability to recognize situations in which errors are likely: “This number will be an anchor...,” “The decision could change if the problem is reframed...” And I have made much more progress in recognizing the errors of others than my own.