I used to consider my knowledge of history better than at least 95% of the population's, but while reading The Decline and Fall of the Roman Empire I realized how sketchy my view of history really was. So at the ripe old age of 35 I set off on a course of study centered on two series of books: The Story of Civilization, by Will and Ariel Durant, and Timeframe, a Time-Life series in which each volume covers a span of human history and what was going on in all parts of the world inhabited by man: lots of pictures and of course superficial, but it painted in a lot of gaps I otherwise would never have covered. The Timeframe series starts much earlier than the Durants', but once both series were in sync I would read the books in both series for an epoch, as well as at least two other books, either written in the era or about the era, drawn mostly from science, culture, and biography. For instance, I read all the books of Euclid, Newton's Opticks and Principia (I slogged through the Motte translation before the first modern English translation became available), The Wealth of Nations, Shelby Foote's three-volume history of the Civil War, and The Origin of Species. (It's really easy for me to spot folks who spout off about Wealth or Origin without having actually read the books.) My program culminated with Tragedy and Hope, which, being such an inflammatory work, I did not trust myself to read without the full background of history. The process was like watching Western Civilization unfold.
Now for the unintended consequences: I became a bore at cocktail parties. I wanted to talk about the ideas in whatever fascinating book I was reading. I used to love arguing politics. Even with my prior knowledge it was hard enough finding opponents who would engage in rational discourse; now it is impossible. It's been so long that my debating skills have totally gone down the tubes. The sad thing is, I believe my problem is really society's. Political correctness (among other problems) in academia has produced a generation of intellectually crippled intellectuals, and the entertainment industry, including the 24-hour news cycle as entertainment, has just stupefied people. I fear for democracy and republican government.
I'd love to debate random topics with you. Assuming you've actually read everything you claim, I think you'd be a refreshing change from the vast majority of people who simply regurgitate whatever their heroes say.
If you're interested, email me sometime: zedshaw at zedshaw.com.
People really aren't interested.
If there were some forum where people of like minds could get together and talk (verbally, not typing on a keyboard), say a Google Hangout or something, I think that would be awesome and definitely something I would be interested in.
Provided it can be moderated and organized and not be chaos (and focused on content, learning, expanding knowledge, etc.).
Euclid The Elements
Arrian The Campaigns of Alexander
Garmonsway (trans.) The Anglo-Saxon Chronicle
Komroff (ed.) The Travels of Marco Polo
Haydn The Counter-Renaissance
Braudel The Mediterranean and the Mediterranean World in the Age of Philip II, Vol. I & II
Pascal The Thoughts
Spinoza The Ethics
Christianson In the Presence of the Creator: Isaac Newton & his Times
Newton Principia (get the modern English translation by U.C. Press)
Hampson The Enlightenment
Rousseau The Social Contract
Boswell Life of Johnson
Phillips The Cousins' Wars
Schom Napoleon Bonaparte
Heidler & Heidler Old Hickory's War
Babbage On the Economy of Machinery and Manufactures
de Tocqueville Democracy in America
Darwin On the Origin of Species
Foote The Civil War: A Narrative
Twain Life on the Mississippi
Spector Admiral of the New Empire, The Life and Career of George Dewey
Cardwell The Norton History of Technology
Abbott Flatland, A Romance of Many Dimensions
Meyer & Brysac Tournament of Shadows, The Great Game and the Race for Empire in Central Asia
Doughty Travels in Arabia Deserta
Lefevre The Golden Flood
Massie Dreadnought, Britain, Germany, and the Coming of the Great War
Lawrence Seven Pillars of Wisdom
Durant The Story of Philosophy
Cardozo The Nature of the Judicial Process
Schapiro The Communist Party of the Soviet Union
Popper The Logic of Scientific Discovery
Kershaw Hitler: 1889-1936 Hubris
Blumenson (ed.) The Patton Papers (1940-1945)
Hayek The Road to Serfdom
von Mises Human Action
Skinner Walden Two
Kuhn The Structure of Scientific Revolutions
Guevara Guerrilla Warfare
Cleaver Soul on Ice
Lacey The Kingdom, Arabia & the House of Saud
Durant The Lessons of History
Hackworth Lessons Learned, Vietnam Primer
Quigley Tragedy & Hope: A History of the World in Our Time
EDIT: Just remembered two more
Locke Two Treatises of Government
(Since you've read Faust and the initial post mentions Gibbon's Decline and Fall, you might be interested in Theodor Mommsen's "History of Rome" as well.)
P.S. I'm heartened to see Mises and Hayek on your list. I recommend adding Burke and (especially) Carlyle. If you can fully digest Carlyle, your fear for democracy and republican government will turn to panic, and your journey to the Dark Side will be complete.
Tuchman A Distant Mirror
There's a Twain short story about folks who get more than the initial vote, in proportion to their accomplishments. So a propertied or moneyed man is that much more enfranchised, a doctor of philosophy gets a couple more votes, and so forth. The incentive there for demagogues would seem to tilt the system even more firmly than at present in favor of the rich and the academic guild. I don't support any scheme that brings us closer to a plutocracy than we already are--I merely mention this as a starting point.
And then there is government by lot: the next dogcatcher, etc., to be selected in some randomized fashion; perhaps confine the pool to certified catchers of dogs, make tampering treason, and so forth. The incentive becomes enormous to raise the general quality of the citizenry in proportion to how broad the pool of candidates is; and, of course, to monkey with the criteria for candidacy to benefit oneself and disadvantage one's adversaries.
This of course brings its own set of problems...
That is to say, compulsory voting forces the most uninterested and politically ignorant members of society into the polling booths. It usually favors the most conservative party. In Australia that translates to maintaining "old-fashioned" views on, most notably, the processing of refugees.
Australia was sporting the White Australia policy up until about 30 years ago, and still requires citizenship applicants to pass an 'English' test. Its rather primitive and discriminatory processing of refugees (arriving by boat only!?) seems to fall in line with that as well.
Perhaps I'm overly optimistic, but it seems plausible to me that there's a subreddit out there for people just like you.
It seems like having good political sense requires two things: having a solid understanding of the current state of people and assets, and having a good understanding of the impacts of policy. (As well as having some understanding of the whole human experience thing.)
While history is obviously important for understanding this stuff, it seems like after a certain point one would get more insight from other places. I don't doubt that most people don't know enough of history to understand politics, but at the same time most people don't know enough of anything else to understand politics either.
I dunno, I think it'd be more interesting to talk with you than to talk with people about what they drank at the last party they went to--especially since "what" usually just means "how much".
It's not enormously surprising that this question is 'rarely asked'. One would learn a lot more about philosophy ("what questions are these guys asking?" "what are some of the answers they've come to?") from even a mediocre introductory text or a chat with a TA/tutor, than by assuming that this rather sophomoric answer represents a reasonable response to the entire field. Calling it 'sophomoric' doesn't properly engage with the claims, but the claims are so smug, random and content-free:
"Books on philosophy per se are either highly technical stuff that doesn't matter much, or vague concatenations of abstractions their own authors didn't fully understand (e.g. Hegel)... It can be interesting to study ancient philosophy, but more as a kind of accident report than to teach you anything useful."
... that I can't find anything remotely meaningful to engage with.
Someone recommended Russell's History of Western Philosophy as an option; this isn't bad (although its treatment of Continental philosophy is hopelessly biased, it would still be enough to get you oriented).
The fact that whenever PG makes statements on an area I understand more about (philosophy, politics, economics), they seem to be incredibly shallow, juvenile crowd-pleasers, makes me wonder about his expertise in areas that I don't know much about (history, painting).
It's mainly naive young programmers (socially stunted nerds) who believe they are smarter than everyone and have everything figured out.
I believe nerd hubris comes from a fundamental lack of empathy; an inability to put themselves in the shoes of people who are different.
People who lack perspective are easy pickings for silver-tongued writers who make emotionally potent, audience-fellating oversimplifications.
Suppose that my mother has Alzheimer's disease, and that I find the prospect of caring for her inconvenient, undesirable, and/or expensive. Do I have a responsibility to care for her anyway?
It is "useful" because it could potentially describe a real-life situation in which one might have to make a decision. Analyzing this quickly leads one to a general discussion of ethics.
As you point out, definitions are imprecise. What does it mean to say I have a "responsibility", or that I "ought" to do something? Logic and observation can get one to "If I do nothing, then my mother will suffer and/or die", but this is not the heart of the matter.
I think there is no hope of a "solution", in the same way, say, that Fermat's Last Theorem was proved. But does that mean that studying ethics, and what the usual lineup of dead white males had to say about it, isn't worthwhile?
The problem with ethics is that you can't show systematically that meat, closed source software, homosexuality, infanticide, government, self-defense, or torture are justified or unjustified. You have to have some basis for the logic, and even if you come up with some foundational principle, there's nothing special about foundational principles that makes one any more or less likely to reject them than the conclusions reached from them. The history of ethics is a history of people working out logically consistent ethical systems only to discover that they justified things like kidnapping strangers and killing them for their organs; and, based on the "moral intuition" that it's wrong to kidnap strangers and kill them for their organs, the logically consistent ethical system was amended. The history of philosophy is filled with this kind of thing, but let's think about what we're doing here: what do we need ethics for, if all we're going to do is change them whenever they produce unintuitive results?
Not that he offers solutions per se, but he starts from a similar view of the subject.
The amount of philosophy needed is not too significant. One needs to know enough logic to show that inability to find a justification doesn't mean a justification doesn't exist.
So if someone is using Singer's arguments from Animal Liberation to persecute meat eaters and you want to do something about it, general skill at persuasion is more important than philosophy.
Also: most disciplines require "general skill at persuasion," philosophy being no exception.
To the contrary, almost every vegetarianism argument on the Internet has comments like "plants are living things which you slaughter for food", and they just get ignored.
I suspect it's because, if you seriously believed killing living things for food is wrong, the only consistent course would be suicide; so the only conceivable answer is to reason in such a way that it comes out with the answer you want - draw your arbitrary cutoff between important-living-things and not-important-living-things somewhere above plants and bacteria while staying comfortably below humans.
Then let the argument rage as if there were a real substantive difference between drawing the line above cow or below cow, or diverting it around milk, or only around milk sourced in a particular way.
On the other hand, you can look at the vast similarities between our nervous systems and those of other vertebrates as well as their behavior under pain (squealing and so on) to reasonably conclude that these animals feel pain like we do.
We don't have as much evidence and the evidence we do have isn't as visceral or widely discussed. That doesn't mean there is no evidence.
By focusing on pain and suffering you are implicitly drawing the line I mentioned about important/non-important living things there, and calling other views 'absurd'. But you aren't objectively right.
I don't believe that killing living things for food is wrong. Of course you could be using the generic "you," in which case your point is irrelevant to the conversation.
so the only conceivable answer is to reason in such a way that it comes out with the answer you want
I don't disagree. If you find flaws in the reasoning by all means point them out.
- draw your arbitrary cutoff between important-living-things and not-important-living-things somewhere above plants and bacteria while staying comfortably below humans.
My cutoff isn't arbitrary (again, I presume you are addressing me); that is a mistaken presumption. If you endeavor to minimize suffering among the things in the world capable of suffering, eliminating animal suffering is a net plus.
If you want to argue that since all living things suffer and therefore all of our food sources suffer and therefore any attempt to delimit an ethical food source is arbitrary, good luck. I eagerly await your finding that vegetables feel pain.
I will not be surprised when it happens. Plants have nervous systems and sophisticated sensory capabilities. They don't have voices and their body language moves too slowly for humans to make sense of it without looking for it deliberately (such as with time-lapse photography). Plants display a variety of defensive behaviors.
Although technically, vegetables probably don't feel pain in the same sense that meat doesn't feel pain.
All of the above are arguable - why not argue that humans and cattle are inedible, but question where the line between insects and small mammal suffering is? Why not question whether stupid non-suffering cows are OK to kill? Or whether non-suffering humans are OK to kill and eat?
Even if you continue with your line, there are other ways you could argue it - if suffering is bad, then that doesn't mean eating animals is bad, you could tranquillise before killing and it would be OK. Or if you want to minimise suffering then wild animals aren't OK, and all animals should have a human managed lifespan from sedated birth to anaesthetised death followed by being eaten, and not-eating-animals but leaving them to die "naturally" is worse.
And there's room to step back and say suffering is a behaviour and thought pattern that's only of interest to the creature connected to that nervous system -- otherwise of no interest -- so why go for 'minimise suffering' at all? You could argue about methane emissions or water/other resource use for why it's unethical to prioritise farming animals for food; or you could look at it as social signalling, wanting to signal that we understand suffering and choose to avoid it; or as personal suffering (guilt) reduction aside from the animal's suffering; or you could believe in a deity which punishes you after death for animal suffering you caused.
But however you look at it, you come back to this: humans have to eat, so plants or animals have to be ethically OK to eat; and we don't want to hurt ourselves or our relatives and friends, so cannibalism can't be OK. Within that, though, whether extending compassion to animals is "right" or not is not objective. Refusing to eat farm animals because of a calculation tying their methane emissions to global warming and future human suffering would be objective; where the ethical line in farming living creatures for food lies is not. It's just arguable back and forth.
I eagerly await your finding that vegetables feel pain.
Even if they don't feel pain, they have a natural life, lifespan, life processes and reproduction which eating them stops. There is room for an ethos where doing that is wrong, you just dismissed it as daft.
The practical skill philosophy teaches is critical thinking and logical reasoning. But you don't need philosophy to learn either of those.
"general skill at persuasion" sounds more like sophistry/rhetoric...
Persuasive techniques can be used deceptively but that's hardly necessary. It's important to accept that people are emotional and simply proving them wrong is usually not sufficient to change their mind about something they've already decided.
I think philosophy should have a function/use. But does it? I think its functional importance lies in asking questions in such a way that they can be answered by experiment.
So in the philosophy of mind, the functional role of philosophy is to refine concepts so there is a conceptual difference that makes a difference in experiment.
The problem is, there is no way to tell if you are doing useful philosophy if we define its function as creating the conditions for experiment to take over. We can only tell if philosophy is useful in retrospect, by the impact a work has. Philosophy is not like science, where we know we are 'doing science' by performing experiments. But it is like science in that we judge its usefulness retrospectively.
A good example of philosophy being useful is Bertrand Russell's work on 'Russell's Paradox'. Russell's work in the Principia Mathematica was partly about formulating definitions that would free us of this paradox.
PG says "Outside of math there's a limit to how far you can push words; in fact, it would not be a bad definition of math to call it the study of terms that have precise meanings."
Surely there is use in forming these precise meanings in the first place. Some (not all) of that work is philosophy.
Russell's work was outside of math, or at a minimum it was 'meta-math'. Russell was attempting to crystallize new logical definitions that could form precise foundations, and hence become mathematics. He was trying to create mathematics from philosophy.
Did it work? No. The Principia Mathematica was a failure. But it failed as a piece of mathematics, at the hands of Kurt Gödel. Here is a case of philosophy becoming mathematics. In short, Gödel's work led to the work of Church and Turing, which led to the computer.
So we can at least point to the concrete usefulness of some philosophy, retrospectively. This work was useful, even if its only effect was to shift around the conceptual landscape so non-philosophers could do 'serious work'.
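The self-reference at the heart of Russell's Paradox can be sketched in a few lines of Python. This is a loose analogy rather than Russell's actual formalism: a "set" is modeled as a membership predicate, and the Russell set is the set of all predicates not true of themselves.

```python
# The Russell "set" R: R(x) holds exactly when x is NOT a member of itself.
def russell(predicate):
    return not predicate(predicate)

# For ordinary predicates, self-membership is a well-defined question:
def always_false(p):
    return False

print(russell(always_false))  # True: always_false is not true of itself

# But "is R a member of R?" has no consistent answer. Evaluating
# russell(russell) unfolds to "not russell(russell)" and recurses forever,
# raising RecursionError -- the computational shadow of the paradox.
```

Type theory (Russell's fix in the Principia) and ZF set theory both block the paradox by forbidding exactly this kind of unrestricted self-application.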
And here's the yc discussion about it: http://news.ycombinator.com/item?id=404707.
From what I recall, these critiques of philosophy are well known in philosophy, especially as reactions to Aristotle. An inexperienced reader would get the impression that Aristotle and the mistakes he engendered went unchecked for a couple thousand years, until brought to bay by the sterling efforts of Bertrand Russell and Paul Graham.
From someone proposing a new approach to philosophy (!):
1. I would expect to see a little more on some of the less peculiar empiricists - Berkeley, in my dim memories, was the most obscure of that little grouping.
2. A less Cliff's Notes discussion of what's happened in analytic philosophy that strays beyond the brand names (Russell, Wittgenstein) wouldn't go astray either. Claiming it's all about 'language games' is a sophomoric 'out' to having to engage with any of the rest of analytic philosophy.
(Also, would you mind being civil? I'm doing it; you can too.)
Civility is a difficult, but simple, art, requiring primarily that one wish to get along with others: onan does not seem to be going out of his way to make himself agreeable to pg or anyone else in this exchange.
Perhaps you would prefer people use more phrases like "I think", "I feel like", or pose sentences as questions. I suppose that works well enough for me without destroying conversation efficiency too much.
The word "accusation" has a broad spectrum of emotion, but perhaps a very narrow emotive connotation when heard by the accused.
If I've said something false, I'd honestly like to know what it is. I wrote that essay, like most essays I write, to figure out things for myself. So if I've made a mistake I want to know what it is.
For me there is one (very excusable) blind spot in pg's position: Eastern classical philosophy (read: Confucianism and a bit of Taoism). For some, it will not even count as philosophy, because it does not ask questions about questions. It is closer to politics, because it tries to answer the one question that matters most in Eastern eyes: how can human beings live together without too much blood? The answers are complex, and rely on ethics, self-cultivation, etc. Studying these Classics is not just about satisfying one's curiosity about the past; it helps one understand the world, and what to be in the world, all without any "magic trick" (read: God, or the promise of paradise).
I don't think the omissions you are talking about are along these lines, so it would be nice to see where these omissions make pg's essay unfair.
So no King Solomon, no Vedas and probably no Confucius either, as secular as he tended to be.
You mentioned at the end of the article that "...philosophy was hopelessly intermingled with religion. It didn't shake itself free till a couple hundred years ago..." I'd have to counter that philosophy without religion is generally new, hollow, and unworthy of much study -- which in some ways I think is your point.
I'm genuinely curious why (or whether) you think this decoupling is a good thing. So few other disciplines (only Religious Studies, and perhaps The Arts) ever touch upon the idea of spirit at all -- which I think is at the core of Philosophy -- the nexus of sapience and sentience. If one is not a fan of religion on principle (its over-adherence to dogma, and its dominance, are two criticisms that I have), then certainly this is a strength of Philosophy -- the decoupling of religion from matters of the spirit/soul. Spinoza's Ethics and Nietzsche's Beyond Good and Evil are two interesting books for students of Philosophy to consider.
Edit: reduce strength of statement.
The subjective is very important which is precisely the value of philosophy and other liberal arts.
I have to disagree with that. I've met many people, especially in larger enterprises, who started in development but then moved into more abstract roles over time. They weren't worse for it; in fact, they were excellent at their jobs.
Programming to me has never been something that has to be continually pursued in order to stay fluent or able, but merely something that reflects your more basic skills and talents.
It's like playing a musical instrument. Almost anyone can learn to play the guitar, but it takes a special talent to excel at it. For the guitar this requires an ear, a sense of rhythm, and more; for programming, it is analytical thinking, systematic thinking, and more. Some people will try to program but never be really good at it. I studied with people like that. It's not their fault; their skills are just in another area. Others are great at it. Once they have learned, it doesn't matter if they don't develop anything for 3 years; after their break, they look at a piece of code / framework / technology, understand what it does, and continue programming.
And the traits that make you a good programmer help you in other fields, even management. Yes, large corporations have structures, but we need structure to manage them. And we need managers. And a manager who was a distinguished developer will be much better suited for leading a team of developers -- even if he doesn't program any longer. This is a valid career path, and an interesting one at that.
My general opinion is that if you want to stay a programmer, find yourself a role where you can do that. If not, don't bother pursuing programming at all costs. It won't lead you in the right direction.
I am not so sure about that. I would say a distinguished manager who was a developer might satisfy your criterion. Social skills, leadership, empathy, and other such traits might be (perhaps even overwhelmingly) more important. That suggests an interesting application of the Peter principle. It seems to me quite possible that promoting the worst developers will lead to (on average) better managers, since we would not be removing the best developers from the development pool (and thus lowering the average competency of the developers), while--assuming people skills are by and large independent of coding skills and come from a normal distribution--the promotee's people skills will likely be better than their coding skills.
Really quite counterintuitive, but to me, a cool idea.
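That promotion argument can be checked with a rough Monte Carlo sketch. This is my own illustration with invented parameters, under the stated assumption that coding skill and people skill are independent standard normals:

```python
import random

def simulate(promote="worst", n_devs=1000, trials=200, seed=0):
    """Promote one developer per trial; return (average coding skill of the
    remaining pool, average people skill of the promotee) across trials."""
    rng = random.Random(seed)
    pool_avg = people_avg = 0.0
    for _ in range(trials):
        # (coding skill, people skill), drawn independently
        devs = [(rng.gauss(0, 1), rng.gauss(0, 1)) for _ in range(n_devs)]
        devs.sort(key=lambda d: d[0])                 # sort by coding skill
        promoted = devs[0] if promote == "worst" else devs[-1]
        pool = devs[1:] if promote == "worst" else devs[:-1]
        pool_avg += sum(d[0] for d in pool) / len(pool)
        people_avg += promoted[1]
    return pool_avg / trials, people_avg / trials

worst_pool, worst_people = simulate(promote="worst")
best_pool, best_people = simulate(promote="best")
# Promoting the worst coder leaves a stronger remaining pool, while the
# promotee's expected people skill is about zero either way (independence).
```

Under independence the promotee's expected management aptitude is the same whichever coder you promote, so the only systematic difference is what you subtract from the development pool.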
What you're describing -- a small team of independent programmers -- accounts for only one kind of software engineering project: the kind where programmers are at the center of the effort and everything else is just overhead.
Many projects are more complex. Development spans multiple long-term iterations, is time- and safety-critical, very expensive, and only an integrated part of a larger development process (e.g. automotive, aeronautics, ...). A lot of the man-hours on these projects go into tasks that don't directly involve programming. These are not necessarily management roles. You have architects, test designers, ... not to count all the roles that have nothing to do with IT at all (if you design a car, what percentage of project members are programmers?).
I personally think that anyone who works in software engineering should have a strong background in programming/development. But active programming is not necessarily part of many roles in many software engineering projects today.
If you don't wish to work in such an environment, no one is forcing you to. If you are, and you are unhappy, explore different options. But don't underestimate the incredible importance that non-programmer roles play in many, many, many projects.
The two examples you use would have to be the best examples of 'waste' in the software industry.
I liken them to having someone on your team who is "The Debugger", whose whole job is to debug the developers' code.
Developers should know how to test their code/app, how to design their app, and how to debug their app. Sure, they could 'specialise' in one or more of these areas, but they should all be able to build software (i.e. write code).
Yes, architects and test designers can be a waste. But they can also act to make developers' lives easier.
Good architects can code, but usually act at a higher level - making sure different projects (a) know about each other, and (b) work together. That's important because teams tend to focus on their own success without always taking account of the bigger picture.
That is synonymous with saying that good developers are also good architects.
I think (if I can articulate my thoughts correctly) my meaning was that there is an obsession with breaking tasks into specific roles that are then the sole responsibility of one person -- to the point of inefficiency (given the number of people you have to talk to for a given feature).
I don't agree that specialisation is wasteful (for the project at least). If it isn't needed, the person can still be a 'standard' dev. But if it is needed, they contribute more. (The only waste is to the individual if they specialise in an un-needed skill).
But most architects are lazy geniuses. The only way to utilize their skills is not to allow them to code, but to let them design and coach.
Similarly, you could hire a developer on the strength of their testing and get them to champion a stronger infrastructure but like the architect they need to be an active developer to judge the usefulness of their solutions.
I'd say managers (of coders) need to be coders too. They can survive not being so if they're great managers, but they're doing it with the handicap of not being able to understand the tools or see the big picture their team is missing. And the state of the art, not whatever they used in school twenty years ago.
But, conversely I think I was a lousy employee early on (in terms of value created for the core product / hours spent) because I wasn't thinking of the business aspects of the company. So while I think everyone needs active programming skills to participate meaningfully in programming, I also think those programmers need to be aware of the business they're in, the entire industry trends, and that of the problem domain the work is in. If you design/make/sell a farming GPS, for instance, you'd better have farming experience.
Perhaps the loosely overlapping networks of broadly skilled individuals managing their own work (but with input from, and on, the rest of the company) don't scale well, but I've yet to see a better idea.
This neither confirms nor disconfirms Paul Graham's thesis. In fact, while PG's assertion was a bit too specific--pointy-haired boss means "bad manager" in general, not "programmer turned manager"--I don't think you can deny that people want promotions, and in some places the only way to get promoted is to go into management. PG is of course correct in suggesting that if you want to keep programming, you shouldn't accept a promotion to a non-programming role. However, if you find this stunts your career growth, you can either look for a job in a company that will allow you to grow as a programmer, or you can choose to go into management. That many programmers are averse to this doesn't make it a bad choice. You just have to know what you're getting into, and hopefully how to do it well.
Depends on what your goal is. It's perfectly possible to build a good career without keeping up with the bleeding edge of research. You don't have to know what monads are to be well compensated (at least in a Western country). However, neither will you be the next Peter Norvig. If you want to make money, there's many ways to do that. If you want to eventually get a Turing Award... there's fewer ways to do that.
A manager who is a good programmer will be golden at interview time. Whether he will also be a good lead is unclear. Like iand said, they should get out of the way and enable programmers to work optimally. As an aside, the structure of corporations and how it affects programming is an interesting topic worthy of its own discussion thread.
That's not what he is suggesting, though: his statement doesn't include "if you want". He formulates it as an at-all-costs mantra -- stay in programming, lest the worst happen. That can be absolutely horrid advice to a number of people who would excel even more in other IT roles or management roles (see the interesting comment from robertk). I myself long avoided applying to non-development roles, because many of those great opportunities are increasingly looked down upon as "not programming". I know some people who've gone as far as "Be Mark Zuckerberg or die trying", completely rejecting any business model that isn't two guys coding in a garage. I didn't actually read PG's comment like that -- I have the utmost respect for PG; but this is the background for why I authored such an opposing comment. I don't actually think the RAQs we are discussing were designed to be put under lawyer-like scrutiny ;)
> Depends on what your goal is
Completely agree. Casual focus will rarely lead to greatness.
> they should get out of the way and enable programmers to work optimally
For someone to get out of the way, he first has to be in the way. You're making the general assumption that a manager is bad unless he actively changes his style. I can't challenge that assumption, since I have no data, and no idea where to find any, on whether managers are universally good or bad. And even then, employee satisfaction might not be the only metric for success. Although I do think that having fun at work is one of the most important metrics, if not the most important. Many things would be better if everyone enjoyed their work.
I believe the idea behind that perspective is that the aggressive startup model will lead to either failure or fabulous success, whereas a more traditional career will lead to only incremental improvements to your lot in life. Some people are attracted to that.
No no, that was merely short-hand. To say "they should get out of the way" is to say the manager should create a good working environment for the programmers, while not micro-managing. So yes on the nice office with few distractions, yes on the manager being the sole gateway between the team and the rest of the company, no on the manager dictating what sorting algorithm to use. See for instance this article by Joel Spolsky: http://www.joelonsoftware.com/items/2009/03/09.html
Curious about this notion of 'special talent' - So some people are just born to be better guitar players? Are you saying that something in the brain is pre-wired to be a better guitar player?
I'd say for a claim like that, you need a citation (preferably multiple).
In my opinion:
You can excel at anything you like, you just have to work at it (it helps if you do it when you're really young, as your brain has higher plasticity). Get rid of this "special talent" notion.
>Are you saying that something in the brain is pre-wired to be a better guitar player?
Yes, whether due to nature or nurture, I would say certain people will be faster at learning how to play the guitar, and thus, be a better guitar player given an equal amount of exposure. Of course, you could always put more time into it, but there is only a finite amount of time available to a given individual.
No citations, but I believe this agrees with the general consensus. In fact, I would be greatly interested if anyone could provide studies showing the contrary.
In any case, I second the request for more studies.
At various ages, the ability to learn changes (http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.88....). It might well be true that there is a lot of optimization to be done in the way we teach -- something that Salman Khan illustrates well with his new teaching methodologies. That doesn't necessarily mean however that at all stages, ages, circumstances, people will be equally able to learn something.
Nature: Some people have an innate aptitude; I won't look for papers, as I find this to be truly obvious.
Talent = "a special natural ability or aptitude"
Aptitude = "capability; ability; innate or acquired capacity for something"
Like tentonwire argued, whether it's nature or nurture, people are better at some things than others.
From what I understand, you are effectively making the argument that anyone can excel at anything. In this case, I also concur with tentonwire, that the burden might be on you to prove that hypothesis.
Yes, see Io:
OMeta looks like a combination of the two. It sounds really interesting, but I suspect it's for people smarter than me.
Some previous HN posts: http://news.ycombinator.com/item?id=2144067 & http://news.ycombinator.com/item?id=2157961
Operating by consensus and not valuing the role of "manager" only goes so far. It may work when you're a few people living together, but I think it ultimately leads to cultures like Google's, where every decision requires a room full of engineers to agree.
At scale you try to find above-average leaders who know how to solicit input and sell ideas.
What is this fantasy Google? The Google I interned at was highly hierarchical, with enough meddling managers to drive this whole thread insane.
This is one of the ways that you can get over narrow-minded specialists making decisions that work for them but screw everybody else.
What you take away is a very precise way to pose questions that make sense and to avoid questions that don't make sense.
The humility gained from those encounters would do a great deal to open people's minds that "there are more things in heaven and earth than are dreamt of in your philosophy". Maybe people would be less adamant, and pay more attention to why they hold their opinions. If everyone did that, we'd have far more productive discussions.
it says somewhere "thought it need not be red"; should be "though".
Proposition 4.022 should be called 4.002, according to my edition and another edition on the web I checked.
Perhaps the Gutenberg edition you used is not as properly proofread as one would hope.
Mentioning the source on the site is probably a good idea, for those not inclined to open up the source and hunt for it.
You can see that Kindle users have highlighted the passage here: https://kindle.amazon.com/work/in-the-plex-ebook/B003QJNVO4/...
Edit: maybe instead of downvotes, you could reply explaining what's wrong with this position. Or just look at camlp4 and see how it provides macros that are better (with a better underlying language) than LISP. And yes, I've written a LISP compiler.
I read some of the docs for camlp4. It looks like a library for extending OCaml using syntax grammars, similar to Langscape (formerly EasyExtend) for Python. It's not clear whether that really addresses PG's point (a). It would be really nice if rwmj would make his point explicit instead of whining about downvotes and telling people to go slog through camlp4's incomplete, broken-English documentation.
I draw cartoons/caricatures (examples here: http://www.smileecards.com) and have painted a few times, but I don't quite call myself a painter.
About teachers, I totally agree that good teachers earn the respect of the students by having high standards and calling students out on bad-quality work. I once suspected a teacher only read the beginnings and ends of essays, so I submitted a 4-page essay that contained a recipe for banana cake in the second and third pages; I received a B+!
Though looking for painters who are also programmers -- programmers of things other than computer-art systems -- is also an interesting question. Likewise with musicians: people who are musician-programmers in computer music, and people who are musicians and also programmers separately, might be interesting to consider as different groups.
I think part of this comes from my interest in video games. If you find yourself working as a one man show you'll quickly start developing skills with pixel art (or modeling and texturing if you work in 3D), much as programmers interested in webdev frequently start to pick up skills in graphic design. It would be nice to collaborate with other people and outsource some of these skills, but at the end of the day the best way to get precisely the results you have in mind is to be able to build every component of a project.
"I would guess a smart person can learn to hack sufficiently well in 6 months to a year."
Hmm... interesting take considering the source, especially when contrasted with the general mentality that programming and software development is the finest of all trades and takes a near preternatural mastery only found elsewhere in classical musicianship 300 years ago. I quite enjoy the feeling that I could be good in a year.
Which brings to mind patio11's constant reminders, that running a successful software business has very very little to do with programming well.
Which means that pg thinks that 6 months is the level you need to be at to be able to start a startup. This seems intuitively right to me, especially these days.
"I didn't trust Larry and Sergey as coders," said Craig Silverstein. "I had to deal with their legacy code from the Stanford days and it had a lot of problems. They're research coders: more interested in writing code that works than code that's maintainable." According to Jeff, one of the quirks in the early Stanford version of Google was that when something unusual happened the program would print out an error message without any explanation. The message read simply, "Whoa, horsey!"
Though there are a few people around here who've built software businesses around things that are also nontrivial technical advances; cperciva comes to mind.
I live in Israel, where a very common path to an Exit is to build some sophisticated technology, then sell it after a few years to a larger company. So it is a very possible path you can take. It has its own advantages and disadvantages over the much more visible, much more HN-friendly, "Consumer Internet Startup".
The conclusion should be that a startup that is not doing something technically hard should be considered in trouble (because it is so easy to clone).
But, that you should look for the easiest way to success. It'll take months for a large company to have the meetings to acknowledge your niche, let alone try to serve it. The only thing you have to fear is another startup, and you're at an advantage in that situation.
Building huge walls or picking a niche with a large cost to serve is like wasting time on DRM for your game instead of working on marketing and getting someone to actually buy it.
That may be appropriate for patio11. Not to demean the guy but let's be clear, he runs an online bingo service. It's my understanding that he makes much more money from consulting gigs leveraged from this modest endeavour.
Rest assured that if your ambition is to write the next Dropbox or, hell, any YC company, at least 50% of your founding team will need to be very good programmers indeed.
If you can't see the pure hack value in that, you may be commenting on the wrong message board.
The conclusion you think you can draw about Patrick and "programming" from the nature of his first product: you can't draw that conclusion.
I'm writing this not to offend Patrick (he couldn't care less about threads like this), but because you are taking exactly the wrong lesson about what Patrick did and then broadcasting it to the rest of the community. Stop, please? :|
I'm a systems programmer, since 1995. The list of YC companies that involve "real programming" to me is pretty short. Dropbox qualifies, but how many from the rest of their class would? Which is to say: "technical impressiveness is both subjective and a stupid metric for a business".
"Gladwell repeatedly mentions the "10,000-Hour Rule", claiming that the key to success in any field is, to a large extent, a matter of practicing a specific task for a total of around 10,000 hours." http://en.wikipedia.org/wiki/Outliers_(book)
At an average of 50 hrs/week this means it would take you close to 4 years to achieve mastery. pg is certainly correct that you can be productive before this point, but I do believe mastery is much less a function of calendar time, and much more a function of how many coding hours you can get under your belt as the days fly by.
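As a quick sanity check of that figure (assuming 50 working weeks a year, with a couple of weeks off), the arithmetic works out like this:

```python
# Back-of-envelope check of the "10,000 hours at 50 hrs/week" figure.
TOTAL_HOURS = 10_000
HOURS_PER_WEEK = 50           # an aggressive, sustained schedule
WEEKS_PER_YEAR = 50           # allowing a couple of weeks off

weeks = TOTAL_HOURS / HOURS_PER_WEEK    # 200 weeks
years = weeks / WEEKS_PER_YEAR          # 4.0 years
print(f"{weeks:.0f} weeks, about {years:.1f} years")

# At a more typical 20 focused hours/week, the same total stretches out:
print(f"{TOTAL_HOURS / 20 / WEEKS_PER_YEAR:.0f} years")
```

Which also shows why "coding hours under your belt" matters more than calendar time: at 20 focused hours a week the same total takes a decade.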
In short, because what we normally mean by "tasks" or "fields" can be broken down into smaller tasks or fields, the statement is self-contradictory if those words mean what they normally mean. But if we search for a definition of "task" that makes the statement true, we can trivially find a circular one: a "field" is something that a talented person can achieve mastery of in about ten thousand hours of practice.
Once reformulated that way, it becomes clear that this is not a statement about human expertise or expert performance, but about the economic structure of current society, and in particular our degree of specialization: most people (within the professional class Gladwell finds interesting, anyway) work in jobs that require about ten thousand hours of focused practice by a talented person to reach mastery.
(I should point out that this occurred to me when I saw Jeff Fox (RIP: http://groups.google.com/group/comp.lang.forth/browse_thread...) claiming that the research Gladwell summarized meant that you needed about ten thousand hours of programming in Forth to be good at programming in Forth, so you shouldn't put too much stock in the opinions of experienced programmers who didn't have much Forth experience.)
Once stated this way, we can formulate interesting hypotheses about the cause: perhaps when a field starts to require substantially more than ten thousand hours of practice to master it, its practitioners start to specialize in a part of it, so that they can reach mastery sooner.
Regardless, it's interesting that you bring up the example of the neurosurgeon, because Gladwell has specifically written about the mastery of expert surgeons, including the example of Charlie Wilson, a highly regarded neurosurgeon. In his telling of the 2,987 transsphenoidal resections of pituitary tumors that Wilson performed, Gladwell highlights the seemingly thousands of hours Wilson spent practicing surgical procedures on rats early in his career. You can read the article here: http://gladwell.com/1999/1999_08_02_a_genius.htm
Well, maybe. But maybe Jeff Fox is right, and you really need ten thousand hours of programming in Forth to perform at an expert level programming in Forth. And, given IBM's historical turnover, it wouldn't be surprising if performing at an expert level maintaining DB2 required not just ten thousand hours of programming, but ten thousand hours of programming in PL/M on DB2, in order to know your way around that particular codebase well enough to perform well. Certainly the ramp-up time is likely to be longer on that codebase than on something more modern.
But maybe we should go in the other direction — maybe Bill Gates is right, and it only takes a year or two of programming to perform within an epsilon of your best possible performance, and it's really something broader, like "engineering design of artifacts with human users", that needs a whole ten thousand hours of practice to master — not merely programming as such.
My point is that these can't all be correct. Being an expert at engineering design of artifacts with human users implies that you're an expert at programming, which implies that you're an expert at programming in any context: in Forth, in PL/M in the core of DB2, on the space shuttle. Being an expert surgeon entails being expert at tying sutures, but it surely doesn't entail spending ten thousand hours practicing the tying of sutures.
Where you (and many other people including myself) find trouble in Gladwell's work is that it's too cursory and often overreaching. It's true that he doesn't go into the specifics of various fields, defining the specific areas that each requires to achieve that genius level. But such a study would be extremely exhausting, if not impossible. It's also not where Gladwell sees his role; he's a pop scientist, not an academic. His job isn't to take academic studies and dig deeper. Quite the opposite. As a pop scientist, he takes academic research from fields such as psychology, sociology, and anthropology and extrapolates these ideas into popular culture to cater to a much broader audience. Taken in this context, I think you'll have a much different experience reading his work.
The decision to learn enough programming to build what I want stemmed from the realization that understanding how my webapp will work at the technical level significantly refines my vision for the product, bringing it much closer to reality. Whereas before I was talking to recommendation-engine people because I wanted "recommendation features like Hunch," now I'm working in PHP/SQL and getting exactly what I want. My current strategy is to talk to the CTOs and creators of similar webapps to understand their technology and language choices (e.g. GAE or MAMP, SQL or Solr), go as far as I can on my own, and then work with a tutor for an hour or two to get momentum and correct my architecture mistakes.
I just started learning HTML, CSS, jQuery, and PHP recently, and I found myself advancing at a surprisingly fast pace (I'm now watching the new_boston jQuery tutorials on YouTube).
The main problem for me is finding good material to learn from, since I only learn from the internet.
Do you have any advice on good online sources for a total beginner to learn coding really well?
Caveat: I'm not a developer, so my comments about learning how to program should be taken with considerable skepticism. But I have some opinions based on my effort to learn.
For me, programming syntax is less of a problem than poorly organized code, a poor grasp of problems, and poorly thought-out solutions. Design is a subtle and important topic that does not lend itself to the shorter coverage typical of internet material. Book-length immersion in the concepts is only a beginning to an understanding. So I think your "internet only" restriction is a severe constraint.
But it isn't as if there are many design books for beginners. These generally target people who've mastered some languages and know the design-problem space. "Pragmatic Programmer" was a big help.
I repeat: I'm not a developer. Hopefully this is thought-provoking, but I'm no authority.
The best thing to do, especially if you are on a budget is spend a few hours in a bookstore each week reading over various books on various subjects. I found this a lot more helpful.
I agree. One of the absolute worst pieces of advice I got over and over again was "don't go into programming. It's all being outsourced overseas anyway. Just learn how to manage programmers." Luckily for some reason I finally decided to ignore that advice and strive to become a great programmer myself. One of the, if not the, best decisions in my entire life.
Even now, although I certainly could become more of a "manager", I choose to stay in the pit coding. Although I now do tasks that can be called "managing" such as helping out other coders with their bugs and problems, mentoring, communicating with people outside of engineering, recruiting and interviewing, the biggest chunk of my time is spent programming and working on my skills.
It's worked for PG. It's worked for Paul Farmer(replace "programming" with "doctoring"). I'd bet it's worked for nearly every master of their field. I think it's an essential rule to follow.
To that end, I think PG is dead wrong. Programmers can make great managers, and find it very rewarding, even if they're not in code all day. There's as much variation in that end of the business as there is in the coding. Would you go into programming if you believed that every job was a dead-end VB jockey? Of course not. The way to avoid being a PHB is to avoid behaving like a PHB.
Not all structures can stay small enough to avoid management. The startup phase is just an initial phase; the structure becomes dysfunctional as the number of employees grows.
Additionally, I don't think that having a manager who programs is a way to increase management quality.
So how do you get good management? There's no simple answer to that, it's a central problem to all companies as they grow.
> Couldn't you add something equivalent to Lisp macros to languages like
> Perl or Python?
> Not without turning them into dialects of Lisp. Real macros need to
> operate on the parse tree of the program.
Perl6 comes with full Lisp-like macros: http://en.wikipedia.org/wiki/Perl_6#Macros
See previous HN discussion on this: http://news.ycombinator.com/item?id=1279238
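The parse-tree point can be illustrated without Lisp at all. Here's a toy sketch in Python using the standard `ast` module (the transform itself is invented for illustration): a "macro" that rewrites the program tree before compilation, which a purely textual preprocessor can't do safely.

```python
import ast

# A macro-like transform must see the parse tree, not just the text.
# This toy "macro" rewrites every addition node into a multiplication
# node before the code is compiled -- something a string-level
# preprocessor can't do reliably (it would trip over '+' inside strings).

class SwapAddForMul(ast.NodeTransformer):
    def visit_BinOp(self, node):
        self.generic_visit(node)          # transform children first
        if isinstance(node.op, ast.Add):
            node.op = ast.Mult()
        return node

def expand(source):
    tree = ast.parse(source, mode="eval")
    tree = SwapAddForMul().visit(tree)
    ast.fix_missing_locations(tree)
    return eval(compile(tree, "<macro>", "eval"))

print(expand("2 + 3"))      # 6, not 5
print(expand("'a + b'"))    # 'a + b' -- the '+' inside the string is untouched
```

This is only an analogy for what Lisp macros (and, per the quote, real Perl 6 macros) do natively; in Lisp the program *is* its parse tree, so no separate `ast` layer is needed.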
I've never met anyone who bridged visual art with anything tech-related, but composers, singers, pianists, orchestra members... hell yes.
Is it just that painters are less common than musicians, and that ratio stays true in the tech world?
I also play piano, compose music and write (bad) poetry! I have a small sample of my music here: http://soundcloud.com/mournfulempathy/empathy-intro .
I don't think I'm good as a painter but it is a hobby I enjoy greatly!
You've never met a web designer, video editor, 3D artist or graphic designer?
In my head they are called "art"; I have to make an active decision to think of "art" as "the arts", including other things -- I guess that's just the way I learned the vocabulary as a kid. So it was already a step to edit from "art" to "visual art".
Perhaps I should have gone with "static visual art that is created for the sake of being art"?
As to your examples, I suspect the main reason that there's such a link is (obviously) the big crossover. If you're an artist who wants to make money making websites, or videos, or whatever, it makes sense to learn at least some of the technology related to it to help you. And if you're, say, a website developer, it sometimes makes sense to go the other way too.
I don't believe there's a difference. All the good graphic designers I know don't only do applied design for clients, they also create "static visual art that is created for the sake of being art". On the other hand, most sculptors I know mostly create statues for banks and local governments.
Having attended art school, I can tell you there is little art free from commerce and it mostly is no good. Having some constraints actually helps the quality of the work.
You create a website, your aim isn't generally to have people stand around and admire the art, that's a side effect. (And sure, sometimes maybe it's the strongest influence when a designer is making a website, but it isn't meant to be - it's meant to be a part of making something functional).
Plenty of websites have no commercial goal, they're just meant to be gawked at. All the major contemporary art museums have websites, photos and posters in their collections.
Also, many sculptures found in museums weren't made to be used as art - they were meant as tools or status symbols, for communication, sexual arousal or worship.
To be a 'real' artist isn't determined by the medium one uses, but by the quality of the work created. I enjoy the work of Dieter Rams, Saul Bass and Dick Bruna just as much as that of Vincent van Gogh and Jackson Pollock.
All I was trying to do was find a decent way to separate art that isn't related to anything technical -- the type that (loosely speaking) has been produced for centuries or more -- from stuff like video, web, etc.
As to not thinking my wording is the problem: your replies have shown that the impression I gave you wasn't at all what my thoughts were, so clearly that was the problem.
"The one with the most determined and smartest founders (in that order) is the more likely to succeed."
All else being equal this is true and it's good advice.
But unfortunately when you are choosing from two startups to work for all else is not equal.
Also, be ready to put up with bullshit from parents, admins and other teachers who insist on the path of least resistance.
(You too, Peter Norvig.)
I am open to suggestions for changes; since some of the books are older, there are multiple versions or translations available. Some of these are freely available as well - Franklin's autobiography and Caesar's Gallic Wars, for example.
I haven't read any of these particular books but I have read a great deal of other history books covering some of the same topics... this is a very good list though and now my to-read stack is going to be even higher :)
PG's RAQ History Books - Home http://bit.ly/qRK15K
(Use the bit.ly link so I can track.)
Otherwise, the site is:
Note that about half of these are out of print.
My initial reaction to this was to agree with the statement. And I still sort of do.
But I think women (or men) can be a big distraction if you are trying to create something. That said, the right woman or man can help you achieve as well, if the relationship is stable. The saying "behind every great man there is a ..." is basically true in my experience.
Then there is your "if the relationship is stable". Well, yes, and 'effective'! So, and especially for the huge fraction of men who don't have mothers as helpful as FDR's, how to get such a 'relationship'????!!!! May I suggest Women 101-102!!!!
Otherwise, yup, "can be a big distraction if you are trying to create something". Twelve year old boys, BELIEVE this!
Indeed, one of the first lessons in Women 101-102 is: Since the sixth grade or so, you REALLY wanted the girls to like you, right? So, you did this and that and it didn't work very well, right? Well, heavily what the girls want to see in a boy or man is someone strong, capable, and confident who understands the emotions of girls and women and is nice to them. For a while, being a successful quarterback on the varsity football team can work well. But, big secret: What REALLY works well is to be out of school, have a good career going, have plenty of money in your pocket, have a nice car, say, a recent Corvette, have a house bought, and have plenty of money to be able to support a wife and children. Then a large fraction of the females will, to exaggerate a little, do 'just ANYTHING' to be 'friendly'! Usually they won't even be able to help themselves! Uh, that reaction is a 'built-in function'! With these advantages in place, your brand of aftershave, blue jeans, shoes, cell phone, manner, tone of voice, pickup lines, even height and weight, etc. won't much matter! How 'bout that!
In particular, largely f'get about females your own age. Instead, in your teens, go for girls 1-4 years younger than you are. If she is 13 and you are 16, then she can be REALLY impressed that you have your own car but won't expect you to have a good career or a house yet!
Next, as from an expert on females: "Of COURSE, women are MUCH more emotional than men. That's the cause of all the problems." Yup, not exactly right, but close! In a 101-level description, what rationalism and logic are for a hacker programming, fixing a car, or diagnosing a network problem, emotions are for females. In simple terms, first-cut, at the level males need to understand starting at age 12, the life of a female is all about emotions. Even when she is doing math or science, mostly she is seeking good 'emotions' from praise, acceptance, and approval from others she wants to please!
"Twelve year old boys, BELIEVE this!"
At first I read it as
"12 year old boys believe this!!!" (mocking)
but I think you meant
"12 year old boys need to understand this!!"
Good points, and the inverse is true for women. My sister spent much of high school being popular, and although she was smart (and later got a Master's), she didn't achieve as much as she could have had she not been so popular and attractive, with such a good personality. No question in my mind about that.
Man up, walk over to her, and say "Hi. I thought you looked cute and wanted to meet you. I'm Robert." Repeat times 1000. That was your warm-up. Now things should feel much more natural and you will understand a whole lot more about social interaction.
Your remark might be good and, thus, a good contribution to some Web site that tries to provide advice on "What I wish my father had explained to me when I was 12"; however, I've come to expect that mostly he didn't understand it very well himself.
For making an A in Women 101-102, right, the grade is to come from some unnamed course professor and not from the women the man wants to meet!
A year? Depends! To be very useful on Windows, really need to be okay on the content of several books, each about 1000 pages long, have worked through about 2500 Web pages of documentation at Microsoft's MSDN, along with more pages from other sources. Then need to write some code, at least as exercises, using what you learned. For writing code, need to learn either an integrated development environment (IDE), e.g., Visual Studio, or get good with a powerful text editor (I use KEdit) and its macro language (I have about 150 such macros) and a scripting language. And need to get good with Windows, e.g., have traversed much of the obscure tree of things to click on. And need to be good at software installation, e.g., .NET Framework, service packs, IIS (for a Web server), Internet Explorer and some other Web browsers, likely some version of Office, maybe Knuth's TeX, SQL Server or some alternative, etc. Should learn some Word, Excel, and PowerPoint. Should learn some T-SQL, HTML, CSS, ASP.NET, and ADO.NET. Also need to be good at backup and recovery, ESPECIALLY of the operating system and boot drive. A year? Want to give up sleep for a year?
For Web development on Windows, that is a minimal list of EXACTLY what is needed. Web development on Windows is JUST what I'm doing, and that list of topics is JUST what is centrally involved.
What are you going to leave out? T-SQL? OS backup and recovery? Good skills with an IDE or editor?
Yes, learning most of, say, Visual Basic by itself is easy. The 'rub' is that for anything at all serious in Web development or just software development, it is crucial to make good use of .NET, and that is HUGE. For a long time, nearly every time you turn around, you will need to use another .NET class, and that will take you 1-3 dozen Web pages at MSDN to find, download, save, index, abstract, and read. The total of MSDN Web pages needed adds up quickly: I have over 2500 now, and it grows quickly. Just yesterday I downloaded a lot just for 'request data validation'. Before that, the topics were HtmlEncode and UriEncode. Before that, TCP/IP sockets. Before that, Windows Communication Foundation. Before that, Windows 'remoting'. Before that, Windows remote procedure call. Before that, a LOT on SQL Server administration and management. And on and on and on. It's ENORMOUS.
All you need at first is something you can type text into that knows how to highlight syntax. You don't even need to know the features of that particular editor, aside from stuff like "save" and "open".
> and a scripting language.
For what, exactly?
> And need to get good with Windows, e.g., have traversed much of the obscure tree of things to click on. And need to be good at software installation
Why? How does this actually help? How is this actually difficult?
> Should learn some Word, Excel, and PowerPoint.
None of that has anything to do with programming. And how is this actually difficult?
> Should learn some T-SQL, HTML, CSS, ASP.NET, and ADO.NET.
Well, you need HTML and CSS for sure, and you need SQL. And I think you're going about it the wrong way if you try and comprehensively read and memorize the API documentation for every single framework call you make. Any programmer looks up what he needs on a case by case basis and over time comes to remember the stuff he uses a lot.
But I think you're just making it harder on yourself using the Microsoft stack. There are, believe it or not, entire web development frameworks where you don't actually need to manage the TCP/IP sockets yourself.
> Also need to be good at backup and recovery, ESPECIALLY of the operating system and boot drive.
This is part of owning a computer, not programming. It's also not difficult.
"There are, believe it or not, entire web development frameworks where you don't actually need to manage the TCP/IP sockets yourself."
Actually, what I'm doing with TCP/IP is the easy way to proceed!
You interpreted what I wrote as working with TCP/IP for the Web page Internet communications. Sure, I let Windows and IIS handle that!
But, consider a Web page that needs to execute a SQL query. SQL Server runs in a different 'process', likely in a different address space, maybe on a different machine, real or virtual. So, how does the Web page communicate with SQL Server? Sure: That's already worked out by Microsoft. BUT: There IS communications, and it is 'asynchronous'.
Okay. What holds for a Web page getting to SQL Server can also hold for a Web page getting to some other software possibly particular to the Web site.
So, suppose the design of the Web site has a program that needs to run as a 'service' and be accessed from the Web pages 'asynchronously'. So, how to do that communication?
I looked into Windows 'remoting' (now considered obsolete by Microsoft), the Windows version of old remote procedure call (RPC), Windows Communications Foundation (WCF), Windows pipes (on the same machine), and old TCP/IP sockets. Just to investigate these may have involved over 100 Web pages at MSDN.
I settled on old TCP/IP sockets. For the data to be sent/received, to convert to/from instances of classes, I settled on 'serialization'.
So, I got (back) into TCP/IP sockets. I've done socket programming for over 20 years. To get started with sockets on Windows, I started with some simple socket code in the scripting language Rexx. I got both the 'client' and 'server' sides working with each other. Then I wrote the server side in Visual Basic .NET (the Microsoft 'managed code' version of their TCP/IP) and got that working with the client side code in Rexx. Then I wrote the client side code in VB.NET and got it working with the server side code in both Rexx and VB.NET.
Now the TCP/IP code is ready to drop into the VB Web page code and the VB 'remote server' code.
But using TCP/IP and serialization for this communications was the easy way.
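The client/server scheme described above can be sketched in a few lines of Python (my choice of language here; the original used Rexx and VB.NET). JSON stands in for .NET serialization, and a newline delimits each message on the stream; both choices, and all the names, are illustrative assumptions, not the original code.

```python
import json
import socket
import threading

def serve_once(listener):
    """Accept one connection, deserialize the request, send back a reply."""
    conn, _ = listener.accept()
    with conn:
        request = json.loads(conn.makefile().readline())   # bytes -> object
        reply = {"echo": request["msg"]}
        conn.sendall((json.dumps(reply) + "\n").encode())  # object -> bytes

def call(msg):
    """Run a one-shot server in a thread, then act as the client."""
    listener = socket.socket()
    listener.bind(("127.0.0.1", 0))   # port 0: let the OS pick a free port
    listener.listen(1)
    port = listener.getsockname()[1]
    server = threading.Thread(target=serve_once, args=(listener,))
    server.start()
    client = socket.socket()
    client.connect(("127.0.0.1", port))
    client.sendall((json.dumps({"msg": msg}) + "\n").encode())
    reply = json.loads(client.makefile().readline())
    client.close()
    server.join()
    listener.close()
    return reply["echo"]
```

The pattern is the same as getting the client side and server side working against each other one at a time: once the wire format (here, newline-delimited JSON) is agreed on, either end can be rewritten in another language.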
Net, all your criticisms have turned out to be unjustified. I've done nothing wrong. What I wrote to begin with was right on target. You should fear that being so consistently wrong will hurt your credibility; it's not good to be so consistently wrong.
"I want to start a startup, but I don't know how to program. How long will it take to learn?"
So, he wants to "learn" "how to program". Good objective, clearly stated.
"All you need at first is something you can type text into that knows how to highlight syntax. You don't even need to know the features of that particular editor, aside from stuff like 'save' and 'open'."
"All you need at first"
is not very relevant to the clearly stated objective of learn to program for a startup. You just ran off the subject to try to find something to object to.
From your statement, it would appear that even Notepad would be sufficient, but it is not. Instead, for anything productive, even just in the learning, what I wrote is on the center of the target.
Actually, the editor I use, KEdit, does know "how to highlight syntax", and I think so little of that I keep it turned off and don't use it. I didn't go into what makes a good editor or IDE, but there's a LOT to it even when "how to highlight syntax" doesn't make the list.
"> and a scripting language.
For what, exactly?"
If not using an IDE, then you need a scripting language for common tasks otherwise done by the IDE. Or maybe you would expect the poor student to type
C:\Program Files\Microsoft SQL Server\90\Tools\Binn\SQLCMD.EXE
What I wrote is correct: if you don't depend on an IDE for essentially everything in the development, then you need a scripting language.
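As a sketch of the point, in Python: a few lines of script replace retyping that long path every time. Only the SQLCMD path comes from the thread; the wrapper function is hypothetical, though `-S` (server) and `-Q` (run one query) are standard SQLCMD options.

```python
# The long SQLCMD path quoted above; adjust for your installation.
SQLCMD = r"C:\Program Files\Microsoft SQL Server\90\Tools\Binn\SQLCMD.EXE"

def sqlcmd(server, query):
    """Build the SQLCMD command line: -S names the server, -Q runs one query."""
    return [SQLCMD, "-S", server, "-Q", query]
```

In real use you would hand the list to `subprocess.run(sqlcmd(...), check=True)`; building the list separately keeps the sketch testable off Windows.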
"> And need to get good with Windows, e.g., have traversed much of the obscure tree of things to click on. And need to be good at software installation
Why? How does this actually help? How is this actually difficult?"
If you get very far with it, then it's horrendous. The clicking and clicking and clicking goes on and on. There's next to nothing good in the documentation. Much of what you need to learn is learned just by trying, e.g., for everything in sight on the screen, right click, left click, double right click, double left click, and see what the heck happens.
"> Should learn some Word, Excel, and PowerPoint.
None of that has anything to do with programming. And how is this actually difficult?"
If they don't know how to program, then maybe they won't know these tools, either. If they are doing a startup, then they should know them. Word is a pain in the backside, very poorly documented, and commonly takes about two weeks to get good enough to produce, say, a document fit to be a 'business plan' or a report to the Board. Similarly for writing letters to the government, lawyers, etc.
With Excel, getting far enough to do some financial projections for fund raising or the Board takes some days or weeks the first time.
Once you are okay with Word and Excel, PowerPoint can go quickly; say, the first good 'foil deck' takes a few days wrestling with PowerPoint.
And I omitted Outlook: back when I was on OS/2, I wrote my own POP3 e-mail client and used it for years as a great tool. But with Outlook, I commonly yell and scream myself hoarse in frustration. Outlook is awash in serious problems. E.g., the PST files are by default in a hidden directory deep in the directory tree on the boot drive; it takes a while to discover this and how to put the PST files in a decent place. "Decent"? Yes: you DO want to back up the PST files as part of a daily incremental backup, but you do NOT want the boot partition as part of that backup, so you have to get the Outlook PST files OFF the boot partition. Next, starting a new PST file commonly causes Outlook to lose the Contact List, and to get the list back you have to discover a certain Web page at Microsoft and go through an obscure dance of about 12 steps. If Outlook sees more than one PST file, then on each use it sets the archive bits of all the PST files so that they all get backed up in the incremental backup; so you need a workaround, or your daily incremental backup can start at 500 MB or so. The options in Outlook are horrendous to get set appropriately; I finally traversed the whole tree of menus, set everything, and documented each click. We're talking DAYS of work. It's super tough to find old e-mail in Outlook. And those are just some of the problems; I omitted security issues and much more.
"And I think you're going about it the wrong way if you try and comprehensively read and memorize the API documentation for every single framework call you make. Any programmer looks up what he needs on a case by case basis and over time comes to remember the stuff he uses a lot."
I wrote nothing that suggested I "comprehensively read and memorize ..." You are straining to get off the subject to find things to criticize. Indeed, I wrote that I "index, abstract" the documentation, of course "to look up" as needed. Still, at least the first time, you have to read the pages downloaded. And, again, to get very far with the goal, especially for "a startup", you really WILL need MORE than 2500 such Web pages. To pull this all off in one year, we're talking ballpark 10 Web pages a day, on top of the rest. It's beginning to look like a busy year.
"But I think you're just making it harder on yourself using the Microsoft stack."
There are pros and cons to going with Microsoft. While there are alternatives, net it's not a bad decision. It's not the most popular decision here at HN, but it's still okay, and it's the one I selected, so it's the one I can write about. If some alternatives are much faster, fine, but for the goal, especially for "a startup", my main point is correct: going with Microsoft, one year is FAST.
"> Also need to be good at backup and recovery, ESPECIALLY of the operating system and boot drive.
This is part of owning a computer, not programming. It's also not difficult."
It's also commonly neglected, especially by people who are not yet deep enough into computing to program. So it is reasonable to believe that the person might have to learn it. For a startup, it's IMPORTANT.
And, as I wrote, for
ESPECIALLY of the operating system and boot drive
it IS difficult. It's a self-inflicted, unanesthetized root canal procedure, at least.
Oh, it's easy if you don't do it. And it's easy if you do it but don't test to see whether it actually works, that is, results in a bootable partition with everything back where it was. But if you do it and confirm that it actually works, then it's TOUGH. Just the testing requires running experiments where you have to reinstall the OS, likely several times.
There may be some easy ways to proceed, with some special third party programs or some newer versions of Windows, but for XP SP3 and using NTBACKUP, we're DEFINITELY talking a self-inflicted, unanesthetized root canal procedure, at least. Why? The relevant documentation for NTBACKUP totally sucks: About all a user can do is just guess and try. Getting a good solution to backup and restore of a bootable XP SP3 partition with NTBACKUP is a real accomplishment, 1-2 weeks of work. But NTBACKUP does have some highly desirable functionality: It can do a 'shadow copy' where it backs up the boot partition while the partition is booted and running. And it can restore to another disk partition (with the same drive letter) of a different size.
As I discovered the hard way, the testing to be sure you are doing the backup so that a restored partition actually is bootable is just CRUCIAL.
Actually, in the end, NTBACKUP can be okay, but the documentation is so bad that lots of experiments are needed to get the procedure working correctly and test it to be sure it is working correctly. The main problem is just the documentation. If someone hands you some good documentation, say, in a well documented script, then you can be okay right away. Then you can save 1-2 weeks.
Here's how I solved the backup problem: I bought an external hard drive, connected it to my Mac, and clicked "yes" when my Mac asked if I wanted to use it for backups. Here's how I solved my email problem: I set up Google Apps on my domain, which took a total of maybe 30 minutes (though I'm sure you'd go through and document every single click it took to configure the DNS settings). Here's how I got my first editor: I got TextMate and started using it, and was productive from the first second. As far as scripting languages for building code, I think I learned how to use make and that's about it. That's all I needed to do to be productive. I can, and in many cases have, wasted time wanking over configuring everything in finer detail, but that's not a useful mentality.
It's probably not even about Microsoft at this point. There are lots of programmers who are productive on Microsoft (though I've never met a serious programmer who actually uses Visual Basic). You just don't seem to be one of them if it's as much of an ordeal for you as you make it out to be.
Everything I said is well informed and well justified.
While you were not clear on just what you did on a Mac, for your
"Here's how I solved the backup problem: I bought an external hard drive, connected it to my Mac, and clicked "yes" when my Mac asked if I wanted to use it for backups."
on Windows XP SP3, that in no way addresses the problem I have now explained twice: backing up a bootable partition so that it can be restored and still boot.
One severe problem with such a backup is backing up a booted partition as it is running. As I explained, NTBACKUP has this problem solved with 'volume shadow copy', but the solution is involved. You didn't describe how the Mac has this problem solved.
As I explained, the problem with NTBACKUP is the documentation and getting the options so that the backup can be restored and be bootable. Due to the documentation problems, that is NOT easy.
And once you have such a backup of a bootable partition, how do you restore it? I do have a solution, have tested it, and have shown that it WORKS.
Again, yet again, there can be solutions with third party software and/or versions of Windows after XP SP3, but for that version of Windows what I wrote is a very accurate description of what HAS to be done if just using Microsoft's software.
Your claim that I am making things too difficult in backup is just flatly WRONG -- uninformed, misinformed, and just plain WRONG.
For Outlook, your solution was to use gmail. That has some pros and cons, and I don't like the cons.
For writing down all the clicks in configuring Outlook, you don't have a better solution. There are a LOT of options for Outlook -- a LOT. Some of them are important for security. About the only way to get all the options right is to do JUST what I did. Security problems with bad options in e-mail programs have been grim for years; maybe most Outlook users without a support group to set all the options have some security holes.
Your claim that I am making things too difficult in Outlook is just flatly WRONG -- uninformed, misinformed, and just plain WRONG.
Heck, I even omitted describing setting options for Internet Explorer and Firefox. For Firefox, a default option is to PERMIT running Java in a Web page. OUTRAGEOUS security hole: Java can do ANYTHING. If a user is not REALLY careful with Firefox options, then they will have Java enabled.
"though I've never met a serious programmer who actually uses Visual Basic"
taken literally means nothing, but it suggests something misinformed, uninformed, and just plain wrong. In fact, on Windows, the .NET version of Visual Basic is fine: it gives essentially full access to the basic 'common language runtime' (CLR) and .NET; it is fully 'managed code', including its memory management; and using another language on the CLR is mostly just a matter of different syntactic sugar. An advantage of Visual Basic is that the syntax is easy to read and not borrowed from the too-sparse and idiosyncratic C/C++ tradition, as in C#.
When I hire people, I will have them start on Visual Basic because it is easier to get going with than C#.
On Windows, what language would you use? C/C++? That's not Windows 'managed code', and managed code is IMPORTANT. So your main options will be C# or Visual Basic. You could use either one, but working for me I'd ask you to use Visual Basic. You would not be able to use C/C++.
Your claim that I am doing something wrong using Visual Basic .NET is just flatly WRONG -- uninformed, misinformed, and just plain WRONG.
"I don't even know what you're talking about when you say you "index" and "abstract" API docs--I just keep them open in a browser window and search for things I don't remember offhand."
your confession that you don't know is on target. As I wrote, I have something over 2500 Web pages of Microsoft MSDN documentation, and for those 2500 pages, what you described is not a solution. My "index" and "abstract" are a good solution: I have some simple 'flat ASCII' files, one for each of Visual Basic, Windows, ASP.NET, and SQL Server. Typically I have about two of these files open in my favorite editor. For an issue, say, about ASP.NET or related parts of .NET or communications, I start with the file for ASP.NET. In that file each Web page of documentation has a few lines: some give the title of the Web page; another gives the tree name of that Web page on my system; those lines form the 'index' and are an appropriate use of the word. A few more lines say a little about what is in the page, typically copied from its first paragraph or two; if the page is a good 'root' page for a larger topic, that is noted. Those extra lines are the 'abstract'. And there can be some additional notes of mine.
To find something, I just use the 'locate' command of my editor. Then to open the Web page, in my editor I give one keystroke on the line with the tree name.
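The lookup scheme described above can be sketched in Python. The entry format here, a title line, then a local path line, then a short abstract, with a blank line between entries, is my assumption for illustration; the original uses flat ASCII files and an editor's 'locate' command.

```python
def find_entries(index_text, term):
    """Return the entries whose text contains `term`, case-insensitively."""
    return [entry for entry in index_text.strip().split("\n\n")
            if term.lower() in entry.lower()]

def page_path(entry):
    """In this assumed format, an entry's second line holds the saved page's path."""
    return entry.splitlines()[1]
```

The design point is the same as with the editor: a flat text file plus a substring search is a perfectly serviceable index for a few thousand entries, with no database or special tooling needed.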
But there's no way, as you suggest, to have all 2500 Web pages open at once, and your solution said nothing about how to find the right one of the 2500 pages; mine did. And there is no XP SP3 'search' function nearly as effective as what I have with the four files and my editor.
Your solution with 2500 Web pages would be much less productive, not more.
Same for using a weak editor, C/C++, MAKE instead of a scripting language, and more. And your suggestion for backup on XP SP3 has little chance of resulting in being able to restore the boot partition so that it will boot.
Your claim that I am doing something wrong with my indexing and abstracting of those 2500 Web pages is just flatly WRONG -- uninformed, misinformed, and just plain WRONG.
As for your "ignorant", "irrational compulsion", and "unproductive busywork", those are all insulting, uninformed, misinformed, and just plain WRONG.
Again, yet again, one more time, the original question, go back and read it for yourself, was:
"I want to start a startup, but I don't know how to program. How long will it take to learn?"
and you responded with
"That's all I needed to do to be productive"
which does not answer the original question.
E.g., you said nothing about SQL Server. Hmm.... So, yes, from a plugin in a Web browser, I got a virus. Right: now I keep nearly all plugins disabled. I have to work at this, since some software installs and enables browser plugins without notice. That virus came from one use of the Akamai download manager to get a PDF file on a motherboard from an Asus Web site. Bummer.
So, to be sure to solve the virus problem, I reinstalled Windows, Office, .NET, TeX, SQL Server, etc.
Then I wanted SQL Server to read my old database, which was not on the boot drive and was not damaged in the reinstallation. So, I used the T-SQL CREATE DATABASE with FOR ATTACH. This didn't work (for complicated reasons, not my fault). Finally, in trying to get SQL Server to use my old database, SQL Server got 'sick' and quit doing much of anything. So, I tried to uninstall SQL Server. It wouldn't uninstall. I tried to 'repair' it; it wouldn't repair. I tried to reinstall SQL Server; it wouldn't reinstall. Basically SQL Server had just wiped out my boot drive, and I had to start over with the boot drive. BUMMER.
Now I have a backup, bootable, of my boot partition with everything installed EXCEPT SQL Server, so that if SQL Server ruins my boot partition again I will be able to restore the partition from just before SQL Server and then reinstall SQL Server. That is some of what is important in Windows and SQL Server 'administration and management'.
Yes, I have more than one bootable partition!
One of the problems was that when I installed SQL Server as I had before the virus, what was installed was different: strongly against my wishes, I ended up with two versions of SQL Server in a 'side by side' installation. Well, apparently 'side by side' is just awash in bugs, so that one can't uninstall, repair, or reinstall.
I finally got through it, but it was NOT fast.
You didn't mention SQL Server. But for the original question, SQL Server or some substitute stands to be important.
So, what you are doing that is "productive" does not answer the original question. I answered the original question, and you didn't.
On what I'm doing, you are WRONG, consistently, 100% WRONG.
That's enough of responding to your errors.
Surprise: there are more environments in which you could do web development, and quite a few of them are a lot easier to master than the one you describe.
Also important is scalability. Some 'development environments' that are easy to get started with have bad reputations for 'scalability'.
Another practical issue: is the guy just a developer in a group, where he can draw on others, or is he really, now and during most of his development, just one unfunded guy who has to do nearly everything alone? To do it all as just one guy, he needs to know a LOT about computing a bit far from Web development alone.
Actually, I omitted a lot: for a Web startup, you usually need something like a relational database, e.g., SQL Server. If the startup gets very far into production, then he will need some background in SQL Server management and administration, e.g., performance, backup, and recovery, and I omitted such topics.
I gave a description of some of what is important if developing on Windows. Here at HN, Windows is not the most popular option. But if you have some alternatives in mind, then describe them.