Push: You learn from books, classes, mentors, and studying examples, then apply what you have learned.
Pull: You have a problem that you must solve, then you learn what you need, any way you can, to build the solution.
I suppose there are pros and cons of each method, and I imagine that many people here have used some of both.
For the record, I am 100% Pull. I have absolutely no formal training. It took me 2 years to find my first job and then I was thrown into the deep end. It was simultaneously frustrating and exhilarating. There were so many times I didn't know what to do or didn't have enough "tools" in my box. So I had to figure it out and find sources of learning. But I always did. And when I got that first thing working and then saw my customer's eyes light up, I was hooked.
Your CS degree may make you think that you're a "push" learner, but may I suggest that you adopt a "pull" approach. Forget what you think you know and find a job or a project or someone who has a real need. Then build what you need. You have several advantages over me: (a) It shouldn't take you long to find that job/demand/customer. Just keep looking. (b) You already have tools in your tool box, maybe not the right ones for the job, but you have something. And (c) It's easier than ever to adopt a "pull" approach. Help is everywhere.
You may feel frustrated, but I don't think you have a problem at all. You're in a great (and very normal) situation. Just adjust your attitude, find something to build, and do it.
On the other hand a pure push approach results in lack of practical knowledge, and frequently a "purity over pragmatism" mindset. As a guy who works regularly with grad students, this is frustrating -- the 100th argument consisting of me saying "yes I know the theory your professor taught, but here is how you do it for real given the bugs in the library implementation (or the runtime constraints or whatever)" gets pretty old too.
I guess I'm saying a pure approach to either has a tendency to miss a pile of benefits, and introduces its own drawbacks.
The first 9-12 months were tough. I read some beginner's books, wrote lots of really ugly code, and got things to work eventually, if only barely.
Then the code started to get prettier and more robust, and I picked up some more theoretical books like Leiserson's Intro to Algorithms to learn the theory. Having "pulled" around code for 9-12 months on my own, learning more formal computer science really allowed me to begin gluing those pieces together.
Everyone learns differently. Maybe a better approach for some CS programs would be to throw students into the water for their first year, let them struggle reinventing the wheel and solving practical problems as best they can, and then teach the theory and formalisms later?
But at some point, I picked up a book because of a recommendation and it was astonishingly good (Cocoa Programming for Mac OS X, Hillegass). It gave me a real sense of confidence and overview that I was lacking in all my pull-learned topics. I guess this is because pull only ever provides very localized knowledge, so you rarely see the big picture.
Since then, I tend to push-learn at first to get a feeling and overview over a language or framework and then gradually switch over to a more pull-oriented approach. This has served me very well. Some books out there are real marvels that can get you started with a new technology very painlessly. Gathering all that knowledge would take a lot longer. Yet, there is nothing as instructive as getting your own hands wet.
Some form of bidirectional or multi-pass algorithm is likely to be more efficient here... ;-)
I just wanted to expand on the "find a job or project or someone who has a real need." This part was tricky for me. I love learning, and my approach for a long time was to just keep learning rather than doing. As soon as I would finish one programming book or tutorial series I'd start on another. If I ran across something new and interesting I would switch over to it. After a long time of doing that, I realized I had a lot of half-baked knowledge and nothing to show for it.
Finally I buckled down and "did." I started by tackling a problem of my own - keeping focused, on task, and efficient. It has become a tangible project that I will (soon) be able to show a potential employer or even make money off of it. The best part is that it's something I personally believe in and need, so I'm passionate about it. Working on someone else's project sometimes makes it hard to get excited about what you're creating.
If you can't think of some problem of your own to solve, work on someone else's. The best approach I know of here is to find a local not-for-profit organization or charity. Ask them what could improve their ability to do whatever it is they do, or even save them money, and then create it. Give it to them free of charge. This a) gives you something tangible to show potential employers, b) could possibly be a tax write-off, c) gives you real-world "pull" experience, and d) is for a good cause.
Some people might tell you that you should never do work for free. In this case, the work you're doing is benefiting you, even if there is no money involved. Besides, if it's for a good cause then there will be rewards - assuming you believe in God or karma :)
To reiterate edw519, adjust your attitude, find something to build, and do it.
I've since changed my approach. I limit my Push to little bits in order to familiarize myself with tools, but don't go too far in detail. This way I can cover a wider range of topics without wasting time on details I'll never use. Then when I actually start a project, I'll dive into the details as I'm using them. I find digging into details after you have context for what you're doing also helps with memory.
I think the OP probably learned more than they think they did, they just need some context to string it all together and help jog their memory. I wish I'd figured this out in college or shortly afterward, because most of what I learned was "in one ear, out the other".
I once asked the head of our program why our degree didn't require any CS courses, but did require that we learn to code. She smiled and said "Well, we think you should just learn it." That short conversation made an impression on me. I wasn't in college to study things, I was there to learn how to do things.
I find the attitude you expressed strange. They still insisted you take math courses, correct? Or did they say, "Well, we think you should just learn differential equations"?
Now, there are some who are doing high performance computing where abstraction and software engineering become important--but again, at the undergraduate level, how many courses are emphasizing the advantages of say templated C++ programming for scientific computations?
I think a service course on scientific computing would be of more use to most physicists than a general background in CS. For those that will end up working on large projects, then perhaps a course on software engineering. As for math courses, maybe I was a bit weird, but didn't you learn diff. equations in high school :p? But as for things like PDEs and such--I will say that a lot of it is learned as part of the coursework for say a first modern physics course, rather than in a math course...
For us, math, programming, etc. are tools that we use to do what we're interested in...(This is in no way to denigrate math or programming)
I have to mention, I did take a 100-level CS course as an elective, and personally, it was too basic. I think to get what we needed, we would have had to skip the intro courses, and it's possible the CS department would have frowned on that. On a related note, my PhD was in medical physics, and our department did have an agreement with the bio department to let us take a 400-level physiology class without the prereqs. It was a point of pride in the physics program that our students always led those classes.
No doubt CS can get deep and anyone that codes could benefit from those higher-level courses. But, I think the situation wouldn't allow for that.
Maybe she was saying that physics students were bright enough to figure out what they needed on their own. From what I saw, she would be correct there. We weren't math majors, but we were pretty able. :)
Anyway, it seems more like you agree CS courses aimed specifically at scientific computing would be good and worthwhile.
Oh, absolutely. I think there might be an analogy in writing. -I'm no writer, but I can pen an effective grant. I'd really like to be an effective writer too, but that would be a divergent path in light of my work.
There's just not enough time in one human life. :/
BTW, a second postdoc is cruel and unusual punishment IMO. My best wishes to your friend.
I actually took multiple programming classes and ended up doing my PhD in computational physics. One of the biggest issues is that much of the software written by and for physicists tends to have very poor design, because they tend to learn the bare minimum necessary to get things to work (in Fortran no less).
They still insisted you take math courses, correct?
This is true, but we still learned a ton of math techniques in our physics classes. I must have learned Fourier transforms about 6 times (and finally understood them after the 3rd time or so :) )
This is a great way to describe these different ways of learning. I'll be using these terms again. Thanks.
If you are all push, you won't be able to program.
If you are really all pull, you will be able to program, but it will be kind of magical programming.
Ex. If I am all pull: I know that the regular expression matches based on certain rules, but I don't understand complexity theory or finite automata, so I don't understand that I shouldn't use a regex to process xml. I might create a working program, but it is likely that it will be convoluted and take me longer (or be buggy in ways which I cannot comprehend!), because I reinvented solutions to already solved problems, or didn't understand the tools I was using.
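To make that regex-vs-XML point concrete, here's a minimal sketch (the tag names are contrived): a regular expression has no memory of nesting depth, so a non-greedy match closes an element at the first closing tag it sees, while a real parser tracks the nesting.

```python
import re
import xml.etree.ElementTree as ET

# Nested markup: a <b> inside a <b>. The regex closes the outer
# element at the *first* </b>, returning the wrong span.
text = "<b>outer <b>inner</b> still outer</b>"

match = re.search(r"<b>(.*?)</b>", text)
print(match.group(1))  # "outer <b>inner" -- truncated at the inner close tag

# A real parser tracks depth and recovers the whole element.
root = ET.fromstring(text)
print("".join(root.itertext()))  # "outer inner still outer"
```

The pull-only programmer sees the regex "work" on flat input and ships it; the theory (regular languages can't count nesting) predicts exactly where it breaks.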
If I am all push, programming becomes an intimidating experience. For obvious reasons, I'm going to have difficulty problem solving -- I've never actually sat down and solved problems. It would be like trying to learn math without having done any practice exercises.
I started as a 100% pull programmer (with minimal formal CS training), and have been pushing through a masters degree in CS. It has helped me by leaps and bounds. The concepts themselves are important, but also important is understanding the approach to problem solving that led to the concept. I can apply the theories behind a compiler or database (or anything) in places where a compiler or database would be totally inappropriate.
Point being, I see things differently now.
There are a lot more options for me to look into when I'm initially researching how to solve a problem, and my approach is a lot less likely to be influenced by whatever the new-hotness buzzwords happen to be for the season. Because of that, my results are much better.
You could make a comparison to music. The self-taught guitarist versus the classically trained guitarist, and the guitarist somewhere in between. When I took guitar lessons I wasn't taught classically, but I was taught the concepts (the theory behind chords, scales, etc.) and I was also encouraged to pick things up on my own by both playing around and finding tabs for my favourite songs and having a go.
This push vs pull concept can probably be applied to ways of learning in general. I think it's safe to say most people here are agreeing a mixture is best.
I call this "lazy evaluation", and it's my strongest learning style also.
When I set out to write a software synthesizer, for instance, I was reading papers and learning signal-processing stuff I wouldn't have otherwise bothered with.
1. Should college teach you to program?
This is a philosophical question about whether a CS degree should be educational or vocational. The general consensus seems to be that it should be the former not the latter. It should give you the theoretical foundation that you can apply to almost anything.
Therefore, college won't necessarily teach you any programming beyond what you need to read examples and do assignments, which may not be particularly deep.
2. Do you like to program?
Frankly, every good programmer I know started programming long before they went to college or at least programmed outside of college for their enjoyment.
If you haven't done that and don't do that then I really have to question if you're in the right profession.
3. Programming vs Being a Programmer.
Being a programmer as a job for which you are paid is different to programming. It involves many other skills such as design, reviews, dealing with people, writing documentation, supporting applications and so on. This is not something taught in school (nor could it be really).
I view the first 2-3 years of your work life as an apprenticeship of sorts. You've got the basic theory, now you have to go out into the real world and make yourself useful to somebody.
Many companies have graduate programs and the like. I think it's fairly important to start off somewhere that's good, meaning they'll teach you something (rather than simply crushing you, which, sadly, is more the norm).
If you can put in 2-3 years at, say, Google, Facebook or Apple after doing a good CS course, you'll have gotten yourself off to a very good start.
I personally think that's a problem. There is nothing wrong with being educational in your sense of the term, but I do think people with a degree in CS should be expected to know how to program.
Not expecting a CS alumnus to be a programmer is like having an architect not know how to draw buildings, or a surgeon not know how to wield a scalpel.
Sure, the degree should absolutely teach you more than just programming, but there is no value in being a CS grad without knowing how to program. It's sort of the low-level toolkit that you absolutely must be proficient in to begin with.
A strange thing about CS education is that we have all these people studying Computer Science or even Software Engineering without ever looking at a significant piece of software. You learn lots of small examples and lots of theoretical bits, but as far as I can see most programs never take a look at, say, the Apache webserver and explain how all these small things come together to produce a successful piece of software. That's really a pity.
You do realize that with both professions you noted there is usually an apprenticeship period (internship/residency) -- and also in both professions you must be licensed (in architecture typically after your internship, and in medicine before the residency).
I do disagree though that there is no value in not being able to program. The main contribution of CS is not programming. It is in the theory of computation. I really do view programming as purely incidental.
You wouldn't expect a mechanical engineer to know how to repair a 2011 Ford Focus. They'll know the basic ideas under the hood, but all of the technology-specific aspects of it shouldn't be taught.
This scared me in college, since I hardly programmed before (or even during) my degree. However, I've found that it's simply not true. Not just for myself, but for the many people I've met who started with completely different careers before getting into programming.
Anyway, my point is this: My current machine I do most of my work on has 64 GB of ram and 32 cores, but you absolutely can get into computing with the kind of money you can earn on a weekend mowing lawns.
(And yes, I was born before 1990. Not everyone in college knew how to program already, but all the kids who made it through the weedout courses sure did...)
When I entered college, I was surprised that most of my classmates had had broadband connections and better computers for years, but never bothered to do much more than receive email with them.
Do you apply those standards to everyone, or just to programmers? If only programmers, why?
However, there are others in the same vein as programming:
* fine arts (drawing, painting)
* football and other sports
* creative writing
On the other side, there are fields that are more "professional", where students only begin learning these disciplines at college:
The engineering-ish side of CS (programming) is far more similar to the sciences (physics, chemistry) or mathematics than other engineering disciplines are because anyone with drive can and does delve into it without the help/permission of formal educators.
This is to say nothing of the fact that real CS is a branch of mathematics...
We recently hired and then let go of a guy who is doing a CS degree, good grades, competes in ACM tournaments, so he was really sexy as a junior programmer. It turned out he doesn't really like to program, he's not a programmer. The way this materialized was when given an assignment, he wasn't willing to proactively figure out how to solve it. We either told him in great detail how to solve it or he'd throw up his arms.
This happened in 9 out of 10 cases, whether it was a really small (1 hour) or a larger (multi-month) assignment. After two months of this we let him go.
My list would be:
- basic electronic concepts, ie. transistors, XOR / NAND, bit shifters, adders
- assembly language
- hardware layout: CPU, registers, main memory, MMU, interrupts
- operating systems: hardware interface, multi processing, file systems
- networking: packet networks, ethernet, TCP/IP (sliding window, congestion control)
- low level programming: C with fopen & co, sockets, structs, system calls
- compiler construction (this is important!), parsers, translators, programming language implementation and virtual machines
- algorithms: complexity theory, automata, CFGs, common data structures (heaps, maps, trees)
- discrete mathematics, statistics
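The first item on that list really can fit in a few lines of code. A toy sketch (function names are my own, and real hardware is of course transistors, not Python): build every gate from NAND, then wire a half adder out of XOR (sum) and AND (carry).

```python
# NAND is functionally complete: every other gate can be built from it.
def nand(a, b):
    return 1 - (a & b)

def not_(a):     return nand(a, a)
def and_(a, b):  return not_(nand(a, b))
def or_(a, b):   return nand(not_(a), not_(b))
def xor(a, b):   return and_(or_(a, b), nand(a, b))

def half_adder(a, b):
    """Return (sum, carry) for two 1-bit inputs."""
    return xor(a, b), and_(a, b)

for a in (0, 1):
    for b in (0, 1):
        print(a, b, "->", half_adder(a, b))
# e.g. half_adder(1, 1) == (0, 1): sum bit 0, carry bit 1
```

Chaining half adders into full adders, and full adders into a ripple-carry adder, is essentially the path from the first bullet to the "hardware layout" bullet.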
Of course, most places I've dealt with have no desire to hire someone who is a generalist: specialist or get out is my observation.
I've done that stuff; by necessity, I'm old enough to have grown up with computers and didn't have it all on a platter. My sons are learning "top down" and doing fine.
"I see websites like Stack Overflow and search engines like Google and don't know where I'd even begin to write something like that."
In response I'll quote part of the best paragraph PG's ever written:
"I've always been fascinated by comb-overs, especially the extreme sort that make a man look as if he's wearing a beret made of his own hair. How does the comber-over not see how odd he looks? The answer is that he got to look that way incrementally. What began as combing his hair a little carefully over a thin patch has gradually, over 20 years, grown into a monstrosity. Gradualness is very powerful. And that power can be used for constructive purposes too: just as you can trick yourself into looking like a freak, you can trick yourself into creating something so grand that you would never have dared to plan such a thing. Indeed, this is just how most good software gets created. You start by writing a stripped-down kernel (how hard can it be?) and gradually it grows into a complete operating system." -- http://paulgraham.com/essay.html (His best essay IMO.)
In this case, a lot of it is what you tell your brain.
- Understand time & space complexity
- Security & Encryption
- Concurrency & why ACID transactions are hard
I've actually worked at a company that wasted millions on a fruitless project because management didn't understand basic automata theory:
- Finite State Machines / Regular Expressions
- Stack Machines / Context Free Grammars
- Turing Machines / Anything Computable
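The boundary between the first two rungs can be demonstrated in a few lines (the language a^n b^n and the helper name are my own choices): a regular expression can check the *shape* of a string but not an unbounded *count*, while a single counter -- the simplest possible stack -- handles it.

```python
import re

# A regex happily checks the shape (all a's before all b's)...
shape = re.compile(r"^a*b*$")
print(bool(shape.match("aaabb")))  # True -- but "aaabb" is NOT balanced!

# ...while one counter recognizes a^n b^n exactly.
def is_anbn(s):
    count = 0
    i = 0
    while i < len(s) and s[i] == "a":  # "push" phase
        count += 1
        i += 1
    while i < len(s) and s[i] == "b":  # "pop" phase
        count -= 1
        i += 1
    return i == len(s) and count == 0

print(is_anbn("aabb"))   # True
print(is_anbn("aaabb"))  # False
```

This is the whole point of the hierarchy: the company in the story was trying to do rung-two (or rung-three) work with rung-one tools.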
EDIT: Another good thing to study, though I'm not sure if there's a good single source: Why distributed systems are Hard and why distributed transactions are Really Hard. I really didn't care for the teaching style of the professor who staked out that whole area as "turf" so none of my materials were particularly memorable or good, but the information was valuable.
management didn't understand basic automata theory
IMHO management isn't required to understand much about CS, as long as the relationship with the developers is as between a customer and an engineer (i.e. customer asks for stuff, engineer explains what he can and cannot do).
Most problems with management come from a disconnect with reality. Even when they know CS theory and/or could understand the complexity involved (ex-developers), having bikeshed-style opinions and making demands with an iron fist in spite of developers advising otherwise (or the reverse, having developers on the team without an engineering mindset who can't think for themselves)... that's the real poison.
management didn't understand basic automata theory
That company tried to translate code from one programming language to another using only regular expressions. (This was a long time ago, before you could count on most "regular expression" implementations being Turing complete.)
Perhaps the problem was really that engineers who understood the futility of the approach didn't speak up or weren't listened to. Still, I'd expect people managing an engineering firm that builds bridges to have some basic grasp of physics. So I don't think it's going too far to expect those running a software company to have some basic grasp of the basic laws governing their field.
It might work like this: some programmer working on the project realizes something can't be parsed in a finite state machine, and 'cleverly' separates out one batch of regular expressions into a loop/recursive call. Repeat until the programmers are satisfied with the number of conditions they can handle. I've seen software that was actually written like this. Fortunately, the language it was designed to parse was pretty simple.
It might work like this: some programmer working on the project realizes something can't be parsed in a finite state machine, and 'cleverly' separates out one batch of regular expressions into a loop/recursive call
In this case, you've created an "Augmented Transition Network" which is equivalent in power to a stack machine or a Context Free Grammar. Is this really playing devil's advocate? That would imply setting up something that isn't a straw man whose solution isn't basic automata theory.
This also gives you an idea of how broken the project was, since my understanding is that they'd fix "that particular bug" and keep writing more regular expressions.
You can create a submachine representing the regular expression transition conditions, and just attach that at the state you wanted, resulting in a finite state machine.
Of course this doesn't hold when you are talking about non-standard regular expressions, and it's probably a nice feature to have when creating the automata, but IMHO it still sounds like a silly idea when CFG-tools like ANTLR and YACC are available.
Wouldn't that still be a finite state automaton?
I also hate working with ANTLR/YACC ... heavy, hard to start with, steep learning curve. These tools are designed for industrial-strength compilers, where performance / flexibility matters.
And PEGs are better than CFGs (that's teeshirt material right there :))
The key is the bit about the "recursive call."
If you've got an algorithm that does that, then it cannot be called "iterative".
I.e. backtracking is recursive, no matter how you implement it.
In school textbooks they do differentiate between "recursive" and "iterative" backtracking, to teach you how to get rid of the call-stack and manage your own. But that's another story.
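That textbook distinction can be made concrete. A sketch (subset-sum is my own choice of example, not from the thread): the same backtracking search written once on the call stack and once with a stack the program manages itself -- the *algorithm* is recursive either way.

```python
def subset_sum_recursive(nums, target, i=0):
    """Backtracking on the call stack: take nums[i] or skip it."""
    if target == 0:
        return True
    if i == len(nums) or target < 0:
        return False
    return (subset_sum_recursive(nums, target - nums[i], i + 1)
            or subset_sum_recursive(nums, target, i + 1))

def subset_sum_iterative(nums, target):
    """Same search, but the 'frames' live on a stack we manage."""
    stack = [(target, 0)]
    while stack:
        remaining, i = stack.pop()
        if remaining == 0:
            return True
        if i == len(nums) or remaining < 0:
            continue
        stack.append((remaining, i + 1))            # branch: skip nums[i]
        stack.append((remaining - nums[i], i + 1))  # branch: take nums[i]
    return False

print(subset_sum_recursive([3, 34, 4, 12, 5, 2], 9))  # True (4 + 5)
print(subset_sum_iterative([3, 34, 4, 12, 5, 2], 9))  # True
```

The iterative version trades the language's call stack for an explicit one; the branching structure, and hence the backtracking, is unchanged.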
Ha! I know a guy who essentially got a US patent for roughly the same idea. His non technical bosses were so impressed that they gave him a raise and made him "Director of Analytics" at what was (or rather should have been) an Analytics heavy company.
It wasn't just the management that was to blame: in the case of building bridges, engineers / architects can even go to prison if a bridge collapses.
That will be really helpful. Thanks.
Algorithms: Introduction to Algorithms (Cormen, Leiserson, & Rivest)
Security: Applied Cryptography (Schneier)
I cannot find my automata book just now. It's down in the garage, and I want to stay inside where it is warm now. As for concurrency, I got part of that from the Tanenbaum OS book:
But the rest, I actually got from a coworker on the job! We did cover databases and ACID transactions in school, but that wasn't taught very well and I didn't really get it until I was doing real work.
Do some assembly language. It will give you a key advantage over everyone who is too scared to touch it.
Write a compiler and/or interpreter. This can actually be pretty small, and it will also give you an advantage over those too scared of something so seemingly "esoteric."
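To show how small "small" can be, here is a sketch of a complete interpreter (the grammar and names are my own, purely illustrative): a recursive-descent evaluator for +, *, and parentheses over single digits, with precedence falling out of the grammar structure.

```python
def evaluate(src):
    """Evaluate expressions like '2 + 3 * (1 + 4)' over single digits."""
    tokens = list(src.replace(" ", ""))
    pos = [0]  # mutable cursor shared by the nested parsers

    def peek():
        return tokens[pos[0]] if pos[0] < len(tokens) else None

    def eat():
        ch = tokens[pos[0]]
        pos[0] += 1
        return ch

    def expr():    # expr := term ('+' term)*   -- lowest precedence
        value = term()
        while peek() == "+":
            eat()
            value += term()
        return value

    def term():    # term := factor ('*' factor)*
        value = factor()
        while peek() == "*":
            eat()
            value *= factor()
        return value

    def factor():  # factor := digit | '(' expr ')'
        if peek() == "(":
            eat()
            value = expr()
            eat()  # consume ')'
            return value
        return int(eat())

    return expr()

print(evaluate("2 + 3 * (1 + 4)"))  # 17
```

One function per grammar rule is the whole trick; a real compiler just adds a tokenizer, an AST, and a code-emitting pass on the same skeleton.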
Thomas recommends Practical Cryptography instead.
"Practical" will be my next technical read.
That would be about right.
I remember that because all my IRC friends immediately got to work on crazy crypto tools with algorithms and ideas cadged from that book. It definitely didn't teach them that crypto was hard.
It's one thing to do some fun project. It's another thing to do something for production. It's yet another thing to read such a book and realize it means there's people who know a lot more about this than you. At the same time, it is a fun read for a techie.
Also, if you have time and passion, write a compiler.
Requiring a degree is popular because it is an easy-to-implement filter, not because of any correlation with ability. (Note that the question of whether a CS degree is relevant or not is another question entirely - I'm a CS graduate and I think the maths & theory heavy course I did is irrelevant to 98% of development jobs, even if it has been pretty useful for me).
There are a few tidbits from your education that will save you from serious egg on your face. (See my other post this thread.) This is not just theoretical. I've seen a company suffer major embarrassment because software sold to Fortune 500 companies was written by programmers who thought, "all that poppycock about race conditions/transactions is just academic hot air."
IMHO Computer Science courses deliver value by providing knowledge of abstractions of software and hardware systems. These formal abstractions remain applicable pretty much forever, whereas knowledge of particular technologies is of passing value.
[NB My hardwired knowledge of vi commands, Unix system calls and C was merely a pleasant side effect of my CS degree, not an end in itself]
The real thing I feel that I miss is not being able to recognise where known algorithms/data structures could help out, something I'm working on fixing.
Praxis (as I understand it) is simply the harmonious blending of practice and theory into action. As I'm beginning the exit trajectory from undergrad and into the "real world" (whatever that is), I've been reflecting on my education in the art of programming. I've realized that praxis has been the only substantial educational tool I've had.
Knuth says it best. “If you find that you're spending almost all your time on theory, start turning some attention to practical things; it will improve your theories. If you find that you're spending almost all your time on practice, start turning some attention to theoretical things; it will improve your practice.”
Of course you don't know how to program, because most computer science programs suck. They teach how to code, the basic stuff like variables, if statements, loops, etc. You probably had a course in data structures and algorithms where you learned how to code your own lists and then how to sort them. You probably learned about object oriented programming with tons of examples about how a dog is a type of animal or maybe a foo is a type of bar. You probably wrote a ton of little one-off programs to count the words in a file, calculate Fibonacci sequences, fling disks around the Towers of Hanoi, or calculate interest rates.
Probably by the time you were deep into your data structures class you started zoning out, because heck, you can get by just fine with arrays. Same for algorithms, why are they making all this stuff so complicated, I can do most of this with just if and while. You never had to write a program more than a few hundred lines long, so object oriented programming just seems like confusing overkill. So do functions for that matter since you never had to do the same thing more than once in your program.
The problem is, you were taught all that shit in a vacuum and nobody ever told you how to put it all together. Since the programs you were working on were so simple, you never ran into the problems that those techniques were meant to solve. You were taught tons of theory, but nobody ever sold you on it. You have no idea how to apply all that stuff you learned. When and why to use each little bit of theory.
Maybe you tried to bootstrap yourself, start writing your own real programs. Maybe you had some success, most likely you got bored or frustrated and collapsed a few days later under the cruft of your own code base because you had no idea how to structure a real program.
Of course your professors will tell you that what you really need to learn wasn't the job of the university. They are there to educate, not help you get a job. I can understand their point of view to an extent, but what would you say about an English major who just got their BA and doesn't know how to write more than a 2 page paper? Theory & knowledge are just a giant pile of bullshit to a person without a proper impetus to apply them in a non-trivial application.
I realize there are some awesome colleges out there that are probably nothing like this. Given what I still see out of the junior developers I work with, I would say they are still the exception rather than rule.
That was all first year in my program. Second year was algorithm design, architecture, source control, *nix development, etc (in addition to math, technical writing, etc).
I don't think my school is particularly spectacular, but even in second year we've gone beyond what you describe, and the people who take the attitudes you talk about have failed out. I don't think most CS programs are as bad as you say.
I think an apt adjective for many of our courses and assignments is "contrived".
I'm graduating with a ME degree but I don't feel like I know how to weld.
About half of the professional ME's I know either own a TIG/MIG welding system and/or know how to weld.
Do you remember learning how to ride a bike?
Knowing rigid-body mechanics, the theory of angular momentum, gyroscopic effect, etc. isn't going to keep your ass in that seat. Contrariwise, by trying and falling many times you learned how to balance yourself atop an unstable structure of wheels and frame when you were five years old.
Programming is the same. To feel confident in your ability to program, you simply have to do it. A lot. And you will write a lot of crappy, crappy code. Just keep doing it.
The lightning talks meeting was also really popular; one guy showed how to use Blender in five minutes, someone else talked about a darknet he had written, and another student used it to distribute Google Wave invites she had from her summer internship.
I would say that 30% of my programming skill and knowledge comes from what I learned in class, while 70% comes from reading, watching lectures online, having friends critique my code, looking at open source projects, etc. But without that initial, introductory training from my CS department I would never have been in a position to understand those other topics.
Personally I would love to see a community effort to put together a free, open source "bridge the gap" curriculum for new college grads. It would focus mostly on coding best practices, code reviews, the importance of tests, engineering work flows, version control, staging servers, etc. In particular I think it should focus on real-world codebases as opposed to the one-off programs you constantly write in academia. Sure you can learn most of this stuff at your first job, but I think there are a ton of people out there who would love to learn this stuff, but they don't really know what they need to learn or where to start.
- One semester-long software development class, where the teacher did literally nothing while we formed small teams and worked on a project for a real client in or near our university
- Software engineering internship in the industry
- Extremely hard-ass Advanced Placement C.S. teacher in high school
I learned a lot of cool theoretical stuff in college that is definitely good to know: functional programming, operating systems, machine learning techniques, compilers, graph algorithms, etc., but was never taught any core programming techniques or required to write very extensive code at any level of quality. It's all about real-world experience, if that's what you're shooting for.
Whether you can write a novel that is compelling or sellable or worthwhile is another story (pun) altogether, of course.
In CS programs I've seen many students who expect to leave with the ability to just sit down and code up a Facebook clone off the top of their heads. It's similar in IT roles. There is no class called "What to Do When the Network Is Slow 101".
If you're unlucky then you got a pseudo-trade-school education in Bachelor of Science clothing. Sorry to say that you probably got ripped off. But don't worry, if you have a passion for creating software then you can still use self-study to build up the knowledge and skills necessary to become a true software craftsman, and at least you're not starting at square one. Plus, a CS degree still looks good on paper and will help with job searches.
P.S. Learn source control inside and out, it'll pay massive dividends.
I like the first answer on the SO post; developing a website or writing a game will definitely make you a much better coder. When you build something from the ground up by yourself, you learn a lot more than just coding. You learn how to break out of deep troughs of despair. You learn how to work more than 8 hours at a time. You learn why it's important to add efficiencies to your work habits. These are a few of the things you learn, and they are what separate coders (the subset) from people with degrees (the superset).
I think it's shameful that colleges are perfectly happy to take 4 years of tuition and tell you that your shiny BS in CS will get you a great job when you graduate. A few profs will drill it into your head that co-op is mandatory, but that's only if you get the good ones.
College will teach you the basics; it will give you the ground to know how to start building. It won't tell you how to put the bricks together.
That's why I help my co-workers with things they don't know and encourage them to do the same with others.
Learning how to recover and navigate gracefully around the brick walls will serve you well in the real world.
A worthless piece of paper. Had a fun three years getting it though.
Or maybe I just never get any better, hm.
I don't think this problem indicates the usefulness of a CS degree. Instead, I think it says that the only real way to learn how to program is to program.
It's one thing to solve a problem once it's in front of you, but it's another to make the plan that exposes you to the problems, and keeps new problems coming along so that you can keep solving them, until one day... the project is done.
This is why there are so many bad programmers around. Depressing.
I can't fathom how anyone could be a programmer, or be trying to become one, and not have constant side projects, partially written OSes, games, etc.
It's easy to say "I want to be an X", but then you realize that you're not spending enough time doing X to become good at it. And the reason you're not spending enough time doing it is because you just don't love it enough, perhaps you love something else more.
He writes "I'm trying to improve my knowledge by studying algorithms, but it is a long and painful process."
Which is depressing. You do not learn how to program by 'studying algorithms'.
Do you learn to become a great author by reading the dictionary? No.
It doesn't sound like he wants to be a programmer for the right reasons to me.
The (quite old; it'd be interesting to see how this panned out) question came across to me as someone who, at the end of their many years in education up to this point, has got very used to directed learning. They are used to learning by being set a challenge with some accompanying theory and completing the challenge. They may well be good at this and enjoy doing it, but they haven't become used to searching out the material for themselves.
In fairness, for a corporate setting this may not be a great limitation! When you're working on someone else's project doing maintenance work as most developers end up doing, the need to seek out new problems and work out novel methods can come later; for now, your work domain is fairly closely controlled.
We may well have a CS graduate without the passion to stick at development, but this isn't necessarily the case. I did very little personal work then, because I had other interests as well and was doing plenty of programming for my course, thank you. I think we've got someone with the self-awareness to realise their limitations but insufficient experience of learning outside of education to know how to address this.
Becoming a good programmer is something different.
- Deconstructing problems into smaller problems and expressing those in computer commands.
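That kind of decomposition is easier to show than to define. Here's a toy sketch in Python (the problem and every name in it are invented purely for illustration): the task "find the most common words in a text" broken into two smaller problems, each solved by its own small function, then composed.

```python
from collections import Counter


def tokenize(text: str) -> list[str]:
    # Smaller problem 1: turn raw text into a list of lowercase words.
    return text.lower().split()


def count_words(words: list[str]) -> Counter:
    # Smaller problem 2: tally how often each word appears.
    return Counter(words)


def top_words(text: str, n: int = 3) -> list[tuple[str, int]]:
    # The original problem, solved by composing the two pieces.
    return count_words(tokenize(text)).most_common(n)


print(top_words("the cat sat on the mat the cat"))
# → [('the', 3), ('cat', 2), ('sat', 1)]
```

Trivial as it is, this is the skill the bullet describes: noticing that one fuzzy problem is really a pipeline of small, testable ones.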
If this is so, and you can't program, then how did they appear?