I prepared for a decade to graduate in CS in three months (miguelrochefort.com)
425 points by miguelrochefort 11 months ago | hide | past | favorite | 374 comments

I'll repost my WGU experience from another comment:

I dropped out of traditional college. Last class I remember they were talking about what a GET and POST request were. I had been a professional web developer for years at that point. I felt it was a waste of time.

In WGU I had a similar easy class, stuff that was "way below me". Not to worry, I passed the class in a day and stopped worrying about wasted time.

I recently did a C++ assignment that was filled with artificial requirements making it artificially difficult, yet I learned some things about pointers and figured out the basics of CMake for the first time. Useful stuff. Now I'm doing a similar assignment with Java. I knew I would have to set aside my distaste for things like Java (gross!), and with that mindset things aren't so bad. I'm actually enjoying JavaFX.

WGU's calculus class was harder than calc 1 at my brick and mortar university, but easier than calc 2. Once again, I knew most of the material and passed after only a few weeks of casual review. I also found a calculus textbook I personally like (and will happily refer to again in the future), rather than just using whatever the department chose.

I do agree that a CS degree doesn't teach as much as it sometimes gets credit for. I do believe WGU's CS degree is as good as many brick and mortar CS degrees, but with minimal "bullshit", though there is still some.

From your comments and the article, seems like the WGU model could completely disrupt the normal 4 year university experience, for self motivated learners.

Of course, if the goal is 4 years of fun social life and putting off "adult" responsibilities, the WGU approach is counter productive.

But for people who want the benefit of a degree as an employment credential, looks like this is the fastest and cheapest way to attain it, and I don't think many other existing four year colleges are prepared to compete with this model. Their cost structure doesn't allow it.

> From your comments and the article, seems like the WGU model could completely disrupt the normal 4 year university experience, for self motivated learners.

I don't agree at all. The "normal" university experience is designed around the assumption that the student is not familiar with the subject, and thus they specify a path comprised of a combination of lessons and assignments that allows everyone to get up to speed on a topic and share a common base level of all the fundamentals at a given stage.

The OP's comments mention nothing of the sort. They show that, even though the student lacked some fundamentals, the preliminary work already had covered most of the topics.

Thus OP's story is mostly one about getting a certification rather than actually learning something new. It involves learning at their own pace instead of being forced to ramp up on a time budget, with hard constraints on how fast they must grow.

We should not confuse the two, and we should certainly not mistake being able to pass a certification for a reflection of intelligence or capability. The challenge of getting from cluelessness to a solid command of the fundamentals of a topic in less than 4 months is incomparably higher than casually cruising through a topic for 10 years.

If the goal is employment, an internship is increasingly expected. That's easier to secure at a larger school, where recruiters visit to get face time on campus.

Considering that most of my social life (in my late 20s) is made up of people I met and worked with in bachelor's and grad school, I am really unsure about this approach. Apart from the social factor, I also got immense support and motivation from my peers. Given the emotional/social support a brick and mortar institution gives, its graduates will fare better at jobs.

I think you'll enjoy this: https://tracingwoodgrains.medium.com/speedrunning-college-my...

There's actually a bit of a college speed running community.

Can you please share the name of Calculus textbook?

I think the subheading may be a better title -- 'This is the story of how I prepared for a decade to graduate in 3 months.'

This seems feasible only for a specific set of people: those with previous experience & looking specifically for the credential and not so much the learning experience that comes with a traditional 4 year degree. Not to take away from the author's achievement, I just think it would be misleading to imply that this is a path that most people can take.

Reminds me of every single one of those "how I paid off $100,000 in debt in 1 year" articles. It's always like "dad got me an executive job at his company and grandma gave me a condo as a graduation gift so instead of living in it I moved in with my parents rent free and turned the condo into income stream. (Aren't I smart!?!) Since mom always had a gourmet hot meal prepared for me I could put all my income into my loans instead of silly things like buying food and worrying about preparing meals."

These sorts of writings have their audience (I guess) but they aren't written for normal people. What I mean is, normal (or even above average) people can't follow a similar path and get similar results - it's extraordinary people getting extraordinary results.

I paid off $240,000 in 3 years by not being an idiot with my money. I got on a budget with my wife. We cut up our credit cards because we always double or triple spent our money. We budgeted for things like Christmas. We followed Dave Ramsey's plan. We're 2 months from paying off our house now.

Edit - I went back and checked, it was $140k.

Heads up: this is a common pattern of expression that makes you come off as really obnoxious to people who will never tell you that that's what's happening. If you can't understand why people have a problem with this, here's the explanation: there are almost certainly smarter people than you who happened to be more unlucky than you.

You were lucky-but-stupid before, so you stopped being so stupid, and now your messages carry the undertone that anyone else is some variation of the person you were back in the stage before you wised up. It's a view that doesn't provide any space for people who were wise from the beginning. When you say things like what you said above, you're not highlighting how smart you are or getting down to how dumb other people are; you're just highlighting how lucky you were to even have the circumstances where you were allowed to be that stupid in the beginning.

Do you do social skills coaching? Plenty of tone deaf people (myself included) who would gladly pay for this.

Seriously, what an amazing comment. Imagine if we had someone like this in every discussion on the internet.

This is one of the best comments I've ever read on HN, bar none. Thank you for that.

I would guess he is actually getting downvoted because he is pointing out an uncomfortable truth.

The majority of the people that frequent this site are above the median income. Being in the tech field gives you a huge leg up economically.

You can either squander that, or you can use it to stay out of debt and build wealth.

Having someone point out that you maybe don't actually make the best financial decisions is uncomfortable and makes people defensive.

Doesn't make the statement any less true. Most people on a tech salary (even outside the unicorn tech hubs) have the means to be debt free and live quite comfortably besides.

Alternatively, maybe someone really does have a "royal flush" <https://www.youtube.com/watch?v=rV8XhzG_rAg> (or anal fissures, cf The Office episode "Health Care", season 1, episode 3).

At the risk of more downvotes, I'll disagree with you. There are certainly smarter people than me who are less lucky than me. However, it's a much smaller subset than many would believe. Many claim bad luck when in reality their luck is a consequence of their past choices. They've taken risks and those risks didn't work out.

I once read some advice that I'll try to repeat; I don't know whether I can do it justice, but here it goes. The general idea was that if you're on a date with someone, you should avoid asking or saying things that subtly insist something is true when you don't know it to be true. An example: if you don't know whether your date was molested by their father as a child, you should tread carefully with any questions or comments that carry the presumption that they weren't.

Here's some more sage wisdom: the average human has approximately one testicle. Except not, right? Because that's not how numbers work. So, it doesn't matter if, say, only 4% of the people you interact with are unlucky and 96% aren't. This is not an engineering problem. If you do meet someone who is (or was) unlucky, and you have an interaction with them like this, then even though they're in the 4%, their circumstances are still 100% at odds with your assumption.

interesting perspective. Thanks for sharing!

You can only get downvoted on a post to -4, fyi. I stopped caring nearly as much about being agreeable once I learned that.

For what it's worth, congrats on what you did. I'm in a similar position and I completely agree that for the majority of people here it wouldn't take anything beyond stopping making bad choices.

> I stopped caring nearly as much about being agreeable once I learned that.

The score is just some meaningless internet dick measuring, so why care at all?

Psychological junk, you know.

If staying in my home country is a bad decision then nothing will change my mind.

I think it's important to make a distinction here that I didn't think to make initially:

I am not saying "everyone can make hundreds of thousands of dollars per year if they stop being dumb."

What I am trying to say is "Most people can avoid being massively in debt by making better spending decisions."

That means you earned more than $80,000 post-tax income surplus to your needs. That surplus is a third more than the US average total household pre-tax income.

But the average household probably also isn't in 240k of debt unless they made very very poor financial choices.

this is correct, the median household has a (positive!) net worth of ~$120k. you have to go down to the tenth percentile and below to find households that actually have negative net worths.


I do not think that "240k in debt" (he clarifies it was 140k in another comment) actually means owing 240k more than your total assets.

As far as the median net worth, that's just someone most of the way through paying off their mortgage...

I find it really hard to believe that only the tenth percentile has more student debt than savings. Virtually everyone I know in their 20s, and many in their 30s with advanced degrees, have more debt than savings.

that shouldn't be too surprising. the lower net worth buckets are made up disproportionately of younger people who a.) are still paying off student loans (if they took them), b.) have yet to reach their peak lifetime income, and c.) haven't had as much time to accumulate savings. if you look at the breakdown by age on the page I linked, you'll see that median net worth increases almost monotonically by age bracket.

you might also consider that there is probably some sampling bias in your social circle. I'd guess it disproportionately consists of people who have advanced degrees, possibly from more expensive schools. as a counter-anecdote, most of my friends got STEM degrees at a state university. the ones who took out loans paid them off completely within two years of graduation.

Or student debt; a doctor going 200k into debt for med school probably isn't exactly in dire financial straits.

I'd feel bad for a med student halfway through that gets sick. 100k of debt and no degree. yikes!

Or they're in the US and had to go to hospital for treatment?

I went back and checked my records. It was 140K! oops. Anyways, here is my list.

$85k student loans
$30k 401(k) loan
$20k car loan
$30k car loan

So 46,000 in disposable income? That's a massive amount of money by normal-person standards.

Hell, the fact that you had $30k in a 401k that you could borrow from yourself already makes you exceptional.

Yes. I had ~$500 payments for each of those every month. Luckily, once I got the first one paid off, I was able to roll that previous payment into the next one.

401k match is nice, and I had been working for 5-6 years when I did this, so it was vested.

I was in a deep hole, but I had a big shovel. I wish I hadn't dug the hole though. I encourage others to not dig holes either now that I've learned my lesson the hard way.

How was this "the hard way"?

going into debt in the first place is the hard way.

People say "I learned it the hard way" when in the end they lost something for eternity, went through suffering etc. Like a recovered drug addict with permanent health damage or someone who turned on the path of crime and spent his youth behind bars or something. To me it seems like you had a pretty good life, took on some debt to live even more carelessly and then matured and calmed down and channeled more of your disposable income into paying back said debt. I don't see the hardship based on your comments so far.

you know there are shades of gray in life, right? Things can be hard without somebody dying. Marriage is hard. Raising children is hard. Those things are hard without dying, drug addiction, or other exceptionally hard circumstances.

You mention some pretty heavy consequences to learn lessons.

I'd say committing to something for 3-4 years that means telling yourself NO is pretty hard. It's not "survive being a Vietnam Prisoner of War" hard, but I never claimed it was.

Well said reply.

Actually, "learned it the hard way" refers to making the mistake yourself, rather than learning from someone else's mistake and avoiding making the mistake yourself at all.

It's better to learn from others' mistakes than to "learn it the hard way".

"The prudent sees danger and hides himself, but the simple go on and suffer for it." - Proverbs 22:3

You're right. But I also changed my mentality. We became intentional with our money. It's amazing how much money you can find in a budget to put towards debt. It's also how you stay out of debt going forward. You learn to prioritize your spending.

I think you're getting downvoted because you're missing the point somewhat. You had the opportunity and means to change your mentality, become 'intentional' with your money, and learn how to prioritise your spending, because you've had such a surfeit of wealth compared to the average that you won't have faced the same problems others would (and still) have. That's fantastic for you, don't get me wrong, but to many people it sounds quite tone-deaf.

If you can afford to put aside almost 100k a year and still live comfortably you are decidedly not in the class of people for whom saving is, if not an impossibility, a luxury.

We're not just talking about Pratchett's Boots Theorem[1], but also the fact that such people are quite literally only one or two paycheques away from total disaster. These are the people who can't afford to save because every penny is spent on merely surviving, and they can't amass the bare minimum amount of wealth it would take for them to be able to lift themselves out of that situation.

[1] https://moneywise.com/a/boots-theory-of-socioeconomic-unfair...

Being poor is only something you can truly understand once you've tasted it yourself. The stress alone feels like it's shaving many IQ points from your baseline. Even then, I got only a limited understanding compared to people in the third world. (not going for the misery olympics here, just an observation)

Having been there, I second that. Juggling late utility bills in the middle of winter against having food to eat. If you come through that, it hangs with you forever in ways that others that have not will never understand. Even though that life is almost forty years behind me now, it clearly affects my life decisions every day. I am aware of that, and I manage it, but it is always there. Like you, I am not going for the misery olympics either. Just don't assume that, given my career and place in life now, I am not happy with my perfectly maintained fourteen year old car. I know I can afford a new one, thank you very much. But spending that kind of money, just because I can, does not bring me comfort. People that have never been poor will never understand that.

I completely agree here. I've never been poor. I can't imagine how some families that grew up in the south Chicago projects feel. The stress has to be very brutal.

The deepness of my debt and the steepness of my ascent are just variables in the equation.

Many people are 1-2 paychecks away from total disaster, but they never take the step to make a change. It can certainly feel impossible. But the reality is "merely surviving" has a broad usage. I know lots of people that are living paycheck to paycheck, but have the iPhone X.


I don't know if you're shilling this guy or drank too much of his kool aid but it's still blind to the reality of the situation seen across the entire world. You're talking about your 'equation', which is not the same equation other people are subjected to.

It’s amazing what you can do when you earn way more than the median household.

This is one of the hardest things to explain to people.

Here is a thought experiment. Lets assume that the minimum required to live in a given place is $20K.

So now you have two people: one makes $30K, the other makes $40K. How much more disposable income does the second one have compared to the first? Twice as much. At $30K, you have $10K disposable. At $40K, you have $20K disposable. And someone making $60K has four times the disposable income of the $30K earner.

Once you get above cost of living, everything gets easier.
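The thought experiment above is easy to sketch in a few lines of code (the $20K cost-of-living floor is the thought experiment's assumption, not a real figure):

```python
# Disposable income = income minus a fixed cost-of-living floor.
# The fixed floor turns small income differences into large
# disposable-income ratios.
COST_OF_LIVING = 20_000  # assumed minimum to live, per the thought experiment

def disposable(income):
    return income - COST_OF_LIVING

for income in (30_000, 40_000, 60_000):
    d = disposable(income)
    ratio = d / disposable(30_000)
    print(f"${income:,} income -> ${d:,} disposable ({ratio:.0f}x the $30k earner)")
```

A 33% raise ($30K to $40K) doubles disposable income; a 2x raise quadruples it, which is why everything compounds above the floor.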

When we were looking for an apartment between selling and buying houses, I was surprised to find a decided floor to rent prices. Nothing exists below a certain point. Somehow I felt we could just get a super cheap place because I thought housing prices would scale with income. I was surprised when I found out they didn't. So your cost of living example holds up in my experience.

That's the result of housing shortages. Landlords can pick their tenants and since they don't run a charity they usually rent out to the highest bidder. The obvious solution would be to increase supply.

Yea. You can get $240k in debt.

Good luck getting anyone to lend you that kind of money if you're actually poor.

Someone who is able to get deeply in debt basically by definition has access to money. They would not be granted loans otherwise.

the largest portion of my debt was student loans (over half), which anyone can get in the US (not sure about other countries).

What the hell? So you're literally earning $80k above your living expenses. Waaaaaay above average

I'd be more surprised if you couldn't pay that sort of money off

Re your edit: I'm even more sceptical now. Your figures were off by a third. I know exactly the value of my mortgage when I took it out. That figure is burned into my mind. I don't know how you could be out by $100,000

I don't know what it will add to the conversation, but I thought it was worth adding in my two cents.

My family of five is supported by my income (~$100k / year).

I get paid weekly. Every Friday, I go into my bank account and transfer anything in excess of $2000 into the stock market.

Some weeks that's quite a bit of money, some weeks it's not so much money.

As of a week ago when I did the math, we had $65k more to our names this year than we did one year ago.

Probably half of that was money we saved directly, and half was stock market gains, so I guess from a debt payoff perspective maybe that's closer to $30k.

But I guess my point is, I don't think it's unreasonable to think a two income household in a higher-income metro than where I live could have $80k/yr in excess income to pay down debt with.

It's certainly not unreasonable or even uncommon. But, it's still way above the norm in the US. Median household income in the US is somewhere around $65k/year (before taxes and expenses).

Side note - You're only holding $2k in cash for a family of 5? That seems low. Your efforts to invest for the future are commendable, but if you get caught on the bad end of a 2008-style recession, is it enough to keep paying bills?

> You're only holding $2k in cash for a family of 5?

I am holding $2k in a checking account. My savings accounts are appropriately sized to get us by for probably six months to a year.

The other side of this is that "average" living expense has little meaning when applied to a place as big and varied as the U.S., much less the world. In some areas, the cost of living is much higher. The commonly offered solution to living somewhere the cost of living is too high is to move. Unfortunately, moving comes with all sorts of other constraints that are rarely considered.

In this case, we're talking about a family, so what if one or both spouses need to work but can't find work, or are paid much less in the new location? Is it still a net win if the family income is reduced to a large degree?

What if there is family in the current location but not any target location? Beyond any help they may be able to provide (child care, if not regularly, in special circumstances), they may provide a real tangible mental health benefit compared to living in a new location where you know few if any people. Or what if you're providing help to that family member, and leaving would be problematic?

The bottom line is that some people are stuck in or around places like SF making very little compared to the cost of living in the area, but also don't feel like they are in a position to leave. If you made $80k in SF and lived in SF or close enough to commute, you might find that a large portion of that goes purely to housing. For a family, probably $24k a year minimum, even if you commute in from an hour away.

I mean, that's still contingent on having a decent salary relative to the amount of debt - I'm assuming it didn't happen with a couple of ~30k/year salaries.

I was a sole provider making under $100k.

Edit: salary. We were receiving small bonuses ($10k) at the time, but we learned to keep our expenses down, and every bonus we did receive we rolled right into debt before spending it on stupid crap.

Something is off with the math here. I don't doubt your story, but perhaps your numbers are a bit off?

240k over 3 years is $80k/yr in principal alone, substantially more with interest. If your post-tax take-home was $95k, that leaves only $15k/yr for living expenses for your household, which is extremely low.

Maybe that's possible with a lot of beans and rice, paid off used cars, and second-hand clothes and such.

Seems like part of the debt you are paying down is a mortgage, so that means your housing cost at least is part of the $80k/yr debt service. Still doesn't leave much for transportation (often second biggest expense for a household) and the rest.
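As a rough check on the "substantially more with interest" point, the standard amortization formula shows what retiring $240k in 3 years would actually take (the 6% blended rate is an assumption for illustration, not a figure from the thread):

```python
# Monthly payment on a fully amortized loan:
#   P = principal * r / (1 - (1 + r) ** -n)
# where r is the monthly rate and n is the number of payments.
def monthly_payment(principal, annual_rate, years):
    r = annual_rate / 12
    n = years * 12
    return principal * r / (1 - (1 + r) ** -n)

pay = monthly_payment(240_000, 0.06, 3)  # assumed 6% blended rate
print(f"monthly payment: ${pay:,.0f}")
print(f"total paid over 36 months: ${pay * 36:,.0f}")
```

At that assumed rate the total paid comes out well above the $80k/yr of principal alone, which is the commenter's point about the living-expense math being tight.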

no, you may be right (it was a few years ago). It may have been 3.5 or 4 years. I did receive small raises over the years, but we kept our budget fixed. I did receive a big windfall near the end from a company acquisition ($40k IIRC), but at the rate we were paying stuff off, it only brought the payoff in by ~6 months.

The math still shocks me every time I do it.

So it was really around 100k over four years with the last 40k covered by your acquisition earnings. So 25k a year on a 100k salary (pre/post-tax?). Which is a lot less than 80k a year. Still impressive, and kudos, but let's not pretend that it's anywhere the same. That's a difference of two whole median incomes per year.

Jeez, I made $99k last year and saved more than $25k and I AM an idiot with my money (I consider my lifestyle to be lavish). So I suspect it's more the "making a lot of money" part instead of the "not being an idiot with my money" part. Absolutely nobody is going to "not be an idiot" themselves to a six figure loan payoff in a few years on a 30k/year salary.

This math doesn't check out. $240k for three years is $80k/year. Assuming you are already factoring interest in. So you and your wife lived off less than $20k a year? Where?

Must have had extremely cheap rent, and been very frugal when it comes to eating and basically any expense, really. In rent alone I pay $10.5k/year (which most people here would consider extremely cheap rent). When I was still in school living off student loans, our frugal weekly groceries (almost no meat, mostly fresh produce from discount markets) came out to around $80 a week, so about $4-5k/year. We're already at $15k. That leaves $5k for literally everything else. I'd wager no kids, no car payments, no cell phone beyond maybe prepaid, basically nothing in utilities, buying used clothing, no appliances that broke, no internet above basic, next to nothing in entertainment expenses, etc. Quick mental math keeping the strict minimum from what I pay right now and I'm already over $20k. This is weirdly cheap.

And honestly, if the numbers are right, good on him for having the discipline to do it, but this sounds like a rather awful three years.


I didn't mean to make this about finance or the ability to pay off debt, it just reminded me so much of the type of writing that is really, really common in "pop finance"-type publications.

I wasn't at all saying that I believe that it's impossible to pay off a large debt quickly or that nobody ever does - just that, of course you are going to if you're in such an extraordinary situation. So the entire thing ends up being not very interesting or actionable, yet the title promises that it will be both.

I guess there's an audience for it though, because they keep getting published.

>by not being an idiot with my money

It was also due to the fact that you were making $80k/year ABOVE your living expenses.

Living expenses is a flexible item. You learn to keep your living expenses down.

Living expenses are only flexible when you have the money to be flexible.

If you're on a low wage, maybe with some kids. You're buying whatever inefficient car, low quality clothes that need replacing more often, a house near your child's school. Having that huge amount of disposable income offers you a lot more luxuries

Yes but it's a touch difficult to keep living expenses below $0, which most people in the developed - let alone developing - world would need to do on their salaries to save the extra $80k/year.

ha, yes. looking back, I don't know how we did it as fast as we did. we kept our expenses low (20-25k/year low) and poured every dollar into the debt. We didn't live in a fancy house or have new cars (beyond when we got into it with the new car loans).

You're being completely tone deaf, to the point that it's frustrating to believe that people like you exist.

Just be thankful you don't work with them

tone deaf to what? That some people have big problems? That those big problems may be bigger than mine? That sometimes solutions to big problems are hard? That sacrifice for some isn't the same as sacrifice for others?

Or that some people's problems are as big as they are because they fail to recognize the change needed to solve them, or are unwilling to make that change? Oftentimes, people fail to recognize the change needed because it first requires them to acknowledge their situation and take responsibility for it.

Congratulations for that!

However, you also paid off that $240k by earning at least $80k/year.

How much did you tally as living expenses during that period? As a rough estimate

it was 10ish years ago, so this is a guess...

Mortgage: $750ish
Water/electric: $200
Phones/internet: $100
Food: $500-600
Fuel: $25
Insurance: $100-200 (???)

Maybe $1500-2000 a month, so $20-25k/year in real expenses.

The thing is, 90% of the value of a 4 year degree (at this point) is the credential. I support broad and quality general education, I think it has value, but I don't suggest anyone put themselves deeply in debt for the rest of their lives over it. But to have any shot whatsoever at a decent job? Yeah, maybe.

So if you have the opportunity to pay a fraction of the cost for the credential, even if it means teaching yourself everything, that seems like a strategy very much worth considering for many people. It's dystopian that that's where we are now, but it is what it is.

> 90% of the value of a 4 year degree (at this point) is the credential.

Why do you say that, and how do you measure value?

I have quite a few lifelong friends from undergrad, people I wouldn't have met or bonded with if I'd done school online. And it's not necessary to know people to get great jobs, but I feel like it helps; 4 of the 6 jobs I've had came about through my undergrad network. One of them, as a founder, was partially funded by a prof. I met as an undergrad. My undergrad department helped my education and career in various ways including giving me a scholarship and RA and TA work, and publishing articles about me after graduating.

I am not an average case, partly because I went to graduate school, but for me it's safe to say the value of the credential part of my undergrad degree is somewhere near zero. Definitely not zero, but I've never needed the credential for anything but passing the checklist of requirements for my first job. The value I got from my time spent in the 4 year degree is almost entirely from the relationships I formed.

Do you work in academia now? If so, yes, the story is wildly different for your case.

> I've never needed the credential for anything but passing the checklist of requirements for my first job

That's a pretty crucial piece of value! For people who don't come from wealthy families, that entry into the college-educated workforce is often the difference between poverty and not-poverty.

The networking angle is an interesting one I hadn't considered, though personally, while I made several great friends in college, they and the other people I met there have had little to no effect on my job prospects. The closest thing would be that I went to a job fair while in school which eventually led to my first job, though there are plenty of job fairs that are open to the public (potentially even the one I went to; I can't remember). Of course it's possible I'm not the average case myself and that many people get networking benefits from their college experience; I can't say for sure.

> how do you measure value?

I do want to clarify that I'm focusing on financial prospects because in today's America, those are quite dark for many (most?) people. I'm not someone who dismisses the value of a real education, or the value of those relationships and experiences and memories. But I've seen several friends who put themselves in hundreds of thousands of dollars of debt for those things, and will probably be paying the interest for the rest of their lives, and I just don't think that tradeoff is worth it.

> Do you work in academia now?


Yes, there is value in being able to get the first job, though that value isn't really quantifiable, and it changes over time. Plus the first job typically has other requirements besides a degree, all of which are valuable for the purposes of not getting rejected before an interview. I was just curious where "90%" comes from. (I'm comfortable with it being symbolic, representing your feeling that it's a majority. I just wanted to tease that out and clarify.)

To your original point, for better or worse, incomes and lifetime earnings are statistically higher for college grads. I didn't realize the difference was as high as it is, but the Fed recently published that in the U.S., incomes are roughly double for bachelor's degrees over non-graduates, and roughly triple for advanced degrees. That means that there is a large financial value wrapped up in getting the degree, one way or another. It might have a lot to do with networking, and it might have a lot to do with the credential and social signaling. I'm certain there's some of both. But this value is definitely a must to know about before deciding to forego a degree, and probably a very good thing to keep in mind before choosing an online program over an in-person one.

Yes, the 90% was a stand-in for my subjective impression.

> incomes and lifetime earnings are statistically higher for college grads

Yep. And my interpretation (possibly biased) was that, in a world where nearly all information can be found online for free, the value of college as an environment for gaining professional skills has diminished greatly, but it still gets used as a (perhaps lazy) gatekeeping signifier by hiring departments. I still think there's truth to that, but there's probably something to the networking aspect as well.

So many are in this boat, though; having the chops they've built informally but not having the paper to get in the door.

I really like WGU because it focuses on competency rather than time in a classroom. I hope more educational orgs follow suit.

That said, my wife, who is getting a Master's in Nursing from WGU, is learning plenty -- but she can focus on the actual learning while testing out of areas that she already knows well.

Would it not be possible for a very focused, hardworking individual to take all the courses from MIT for free, then enroll in this school and finish fast?

It's also possible for an amateur jogger to join an ultra marathon and run until they pass out.

College students are all amateurs.

That's why it takes them 4 years.

I went to WGU. I spent 6 months and got my degree.

I really don't think it is possible to go any faster than about 3 months max. Even that is a slog and you will need to be going full time.

Most of the exams are 70-100 questions and you will need to do 2 exams per course. The final exam is proctored, which means you need to make an appointment and go through this whole process of verifying your identity, going through an anti-cheat process and so on that burns another 20-30 minutes between setup and take down. So I would say expect a minimum 2 hour block of time per exam. That is just taking the exams.

So even a class that I finished in one day because I knew everything still takes a FULL day minimum because you are taking 2 exams totaling nearly 200 questions. It is mentally exhausting even if you know the material.

But truthfully you are going to need to study at least somewhat for at least half of the classes. Plus you can only grind through full day exams for so long until you reach exhaustion.

Burn out is also very real. I was averaging 3.5 courses a week for the first month or so. Then 2 classes a week. And by the end I was lucky to finish 1 course a week. It really is a TON of work, even if you come in with a lot of knowledge and experience like I did and don't need to study as hard as other people. The courses are still a lot of effort and take a toll on your energy and brain.

But if you are VERY motivated, you can do it in one term. The way that WGU charges you, you pay per 6-month term. It is a flat fee that covers all proctoring, text books, exams, etc for as many courses as you can complete during the 6 month term. So you would pay the same amount of money to do it in 3 months as 6 months. But if you go into 8 months, then you have to pay for the full second term. So I would set your goal on getting it done in 6 months and if you come in prepared I think it is practical. But you need to have a lot of time to dedicate to it during those 6 months.

Edit: Oh, I forgot to add that not all the courses are exams. About 1/3 of the courses are "performance based". These are courses like History, English Composition, and such that require you to do several graded assignments, often culminating in a large research paper to pass the course. All of the assignments are required and must be graded. So you will need to submit each one, get a plagiarism scan, and wait for an instructor to grade it (usually 36-48 hours after submitting). These courses require less time studying, but more time working on all the research papers and written assignments, and the grading turnaround will also slow you down.

20 weeks of 40 hour weeks is fast. Exams and assignments are going to consume some time and you probably won't know every nuance of a course even if you've taken a different and more rigorous variation.

It also probably depends how big a deal the 4-year degree is to you. There are other options like edX MicroMasters that may make more sense (and it wouldn't shock me if they carried more cred in at least some circles).

Sure, but not in 3 months.

Yes of course, but could someone pull it off in 1 year if they take all the free MIT courses ahead of time?

College debt is a huge burden on kids these days.

If you study in advance, you're limited by how fast your institution can give you credits.

That's what the OP is about. They studied ahead and compressed 3 years of school.

You're usually paying per credit for these transactional degrees though, not per year.

College debt is only a burden for scam private schools and people trying to buy their way into the upper class. If you go with a community college Associate's and a public state college Bachelor's, it's not a huge debt burden to get a degree. The main cost is the time spent studying (not working for money), plus paying to live in your own apartment (instead of being supported in your parents' home) and eating at the college cafeteria (instead of sharing meals at home).

> I just think it would be misleading to imply that this is a path that most people can take.

Yes, but if we focus just on computing, there is evidence, long amazing to me, that in that field "most people can" and, in a significant sense, do: I've long been impressed by the degree to which the learning crucial for computing in the US has been obtained via self-study, much as in the OP.

Okay, say what you will about an undergrad computer science degree, and then move on to graduate degrees, Master's and Ph.D. Uh, reality check: at the grad level, the material to be taught is often not all polished up and instead has rough edges, loose strings, things left unclear, etc. The fundamental reason is that the material, the work, the education, whatever, are supposed to be at the leading edge or, if you will, the bleeding edge. Moreover, the profs are paid mostly for their research (which has some efficacy as a path to teaching quality) and not for polished, pedagogically sound course teaching.

And in Ph.D. programs, and sometimes in Master's programs, the student is expected to write something original, hopefully publishable -- usual criteria, "new, correct, and significant".

So the work is, in a word, new. Right away one can assume false starts, dead ends, encounters with brick walls, unpredictable rates of progress, unpredictable results, etc. So, we're back to independent work or, if you will, self-study.

Main point: In education, self-study is not only helpful but at times crucial. In particular, independent study has been especially important in the US computer industry. So, in a sense, the independent study in the OP is not very surprising.

Oh, by the way, the OP mentioned a Georgia Tech Master's degree for ~$10,000. Secret: Commonly high end US research university graduate programs are very short on good students and long on tuition scholarships -- tuition should be about $0.00 for the whole graduate school effort.

Evidence: I was a student and a prof in applied math and computing. From a world class research university, I got a Master's and a Ph.D. For the grad degrees, I paid nothing in tuition. Independent study was crucial.

E.g., of the five Ph.D. qualifying exams, I did the best in the class on four of them; all four made heavy use of independent study, and three of them of independent study done before the grad program.

E.g., the work that did me the most good in grad school was the independent study with results that were clearly publishable.

> Oh, by the way, the OP mentioned a Georgia Tech Master's degree for ~$10,000. Secret: Commonly high end US research university graduate programs are very short on good students and long on tuition scholarships -- tuition should be about $0.00 for the whole graduate school effort.

It’s vastly easier to get into a Master’s than into a grad school programme where everyone is supposed to be aiming at a doctorate. Most terminal Master’s programmes are cash cows. GA Tech isn’t. They just don’t want to make a loss, but the population of people who can get into a terminal Master’s is quite different from those who can get admitted to a Ph.D. And GA Tech’s OMSCS can be completed while working a full time job. Good luck doing that while in proper grad school.

When I went to grad school, I just got admitted to the grad school and not specifically a Master's program, a terminal Master's program, or a Ph.D. program.

After enough courses, and maybe a paper (I wrote a paper, later published, as part of a reading course), one can get a Master's. I did that, got a Master's.

If you want a Ph.D., then just pass the Ph.D. qualifying exams (QE), courses or not, Master's or not. After passing the QE, do some research. The standard was "an original contribution to knowledge worthy of publication", and the usual criteria for publication are "new, correct, and significant". So, I proposed a dissertation research project (I already had a 50-page manuscript I had done on my own, on a problem I brought with me to grad school), got approval to work on that as my dissertation, did some research, derived some math, wrote and ran some software, wrote up what I'd done, stood for an oral exam, passed, and got a Ph.D.

I never paid tuition.

If for some Master's program Georgia Tech is charging a lot of money, then they must have something different from what I outlined.

I agree that this is a much better title. I actually also attended WGU during the pandemic and got a degree over the course of 6 months for about the same amount of money as the author. I came in with 42 transfer credits, so I completed 90 credits at WGU to complete my degree.

The degree took me 6 months for about the same number of classes as the author. So I was going at about half the speed of the author of this post, yet my pace still astonished my mentor. While you will read about people flying through the program, the norm is to take about 2.5 years.

When I first started the program I always joked that I was simply buying a $3,500 paper (the rough cost for one 6-month term). I looked at it as a degree-mill. I had accepted it for what it was. But after going through everything and graduating, I was wrong about that initial assessment. I am actually far more proud of the work I did at WGU than what I did at my official 4-year university back in my mid-20s. I also walked away feeling like I learned more at WGU than at my previous university.

Yes, some of the courses are very easy. I was able to coast through them in a day, relying entirely on my previous experience. But this is the same as any other college, there are always 10-20% of the courses that are easy, and you basically just need to show up. The difference is that a normal university would require you to go through the motions for 16 weeks before you can complete the class that you could have passed on day #1. WGU simply lets you take the final exam whenever you want, allowing you to control the timeframe.

While I flew through about 20% of the classes, finishing each in less than 48 hours after starting it, the rest took me an average of about a week of full-time work. I was usually juggling two classes at a time.

A standard 3-credit college course is supposed to take around 20 hours of in-class time with about an equal amount of studying time at home. So a normal 3-credit class generally takes about 40 hours of work to pass, but it is spread across 16 weeks while you juggle 4-6 other courses. At WGU you take 1-2 at a time and go full bore until you are done with the course. I found that I was still averaging about 40 hours per course, but since I could do it all at once I got through it faster. I also found that I retained the knowledge a lot better.

Funny enough, my breakdown is very similar to the author's. I would say I didn't even look at the material for about 1/5 of the courses. I skimmed through the material while watching cherry-picked lectures for 3/5 of the courses. For the final 1/5 of the courses I read the textbook cover to cover.

Basically what I am getting at is that WGU isn't just a "buy an online degree" program. You really need to work for it. I was working full-time on school: generally 8-10 hours a day during the week and 3-5 hours a day on the weekend. The tests were generally very difficult. Even in courses where I had a lot of experience, I really had to slow down for the final assessments.

Passing a course requires you to pass two tests. They call them a "pre-assessment" and an "objective assessment"; this is essentially a mid-term and a final exam at any other school. That is generally the only requirement. So as soon as you pass the "mid-term" they will let you take the "final exam", and once you pass that you are done with the course.

You can choose how to spend your time preparing for these tests. You can spend zero hours or 100 hours on the course. There are lectures (they call them "cohorts"), textbooks, study guides, practice tests, homework problems, and flash cards provided for each course. You can choose what you want to use and what you don't. You can also look at a course syllabus, recognize which parts you already know and which parts you need to study, and then only spend your time on those parts. You go at your pace and you are in charge. Nothing else is required. For this reason you really need good time management and self-control.

Overall I am really proud of my WGU Degree. It wasn't easy. I struggled on several classes, but I also walked away learning far more than I expected.

It's also worth noting that at the speed I went, it was exhausting. I really don't recommend it. It is clearly possible, but by the end I was burnt out. I hit full burnout when I had 4 courses left. I really struggled through the final few classes because I had simply gone so fast and so hard for too long. Unless you really need it, I don't recommend cramming this into a single term even if it's technically possible.

This was essentially my experience with WGU as well. I started in November and was done by the following August. Most of my classes from community college transferred, and I had ~90 CUs (EDIT: just a guess on the number; it's been a while...) left to complete. I was able to complete several courses a week for the first few months, but tapered off to about 1 course a week as time went on. I was exhausted by the end, but it was worth every penny and minute spent.

Congratulations on finishing the degree and being proud of the effort you put into it.

If you don't mind me asking: what motivated you to pursue it?

I am in my mid-thirties right now. I dropped out of university in my early-twenties to pursue a career. I saw a lot of initial success in my career and was flying up the corporate ladder. But I always felt like I had this skeleton in my closet or dark secret of not having even a bachelors degree.

Early on in my career I could compensate for not having a degree with my experience and work history. I was mostly competing against people with Bachelors Degrees who had little or moderate experience, and I simply had more experience and could compete well for jobs against them.

I found that as I have started applying for much higher level positions now, the degree skeleton has come to haunt me much more. A lot of the positions I have been applying to lately actually ask for MBA's or Master's Degrees. I am competing against other applicants who have Master's Degrees, while I have a high school diploma and some nice experience. The disparity is getting much wider and I knew I needed to get rid of this unnecessary blemish on my resume/CV.

For any other young people reading this: you might feel like you don't need a degree because your career is going great right now. But I will say that life is a lot harder without a degree as you get into higher positions. I am not saying it's impossible to get a VP or senior position without a college degree; it is possible, but it is MUCH harder than simply having that paper. There are definitely jobs that you are more than qualified for, and want, where you will be turned away simply because of the lack of a degree. It is a sad stumbling block that I was sick of dealing with.

In your experience, what's the max(0) position you can get to without the degree?

(0) max in this case being a general term since as you mention it’s possible to get to the higher levels. Essentially where’s the point at which it becomes much harder.

If you have an associates degree right now. Would you list both?

It's been well known for a while among those who are honest, with themselves and with the evidence, that degree requirements are mostly a cultural gatekeeping exercise for entry-level work. That isn't to say that training and expertise aren't critical, particularly the more responsibility a worker takes on; it's more that one rarely needs 4 years of preparation for a placement involving work that a supervisor who's known you for a month, at best, will trust you with. And the old axiom holds: "The best way to learn how to do something is to do it."

A degree does show that you're committed and willing to jump through hoops (even if only because you don't know any better).

That title would attract fewer eyeballs.

The author sure didn't take any shortcuts to graduate.

Some people on Reddit did say they completed 2-3 month bootcamps/courses with no prior experience and got junior dev jobs (in the UK), so I don't even know. Then again it's online and on Reddit, so they may be just lying.

I believe it is possible to learn enough to be decent in 3 months full time, and then learn everything else as you go. However, I don't see anyone hiring with that kind of experience.

But then why pay for a certificate? Four figures, no less. You could go through a bunch of free programming courses in 3 months and print your own certificate, same thing as long as you can actually do a job...

Personally did a bootcamp about 4 years ago. Also taught myself a couple more tech skills in the 2 months post-certificate before I got hired, but nothing special. Applied to several dozen jobs a week (easy to do when most don't follow up in any way...)

Went from about spot-on median salary for the US prior to switching into tech, to 50% more in my first role. In my first 3 years I increased my income over 400%, though I also switched from salaried to contract, so it ended up being about a 300% increase in net pay.

Wouldn't have been able to afford a CS degree, in terms of either money or time, so it was a huge opportunity for me to go the bootcamp route. Have moved up into a senior/lead position, as well, so all the late nights of working on my skills post-bootcamp (and still while working, even now!) seem to have paid off.

But I don't know what the market in Europe is like. MCOL area, US, here.

Here in the US we also take a good hard look at bootcamp graduates. Code school/bootcamp graduates I've found to be good enough to do most of the mundane web work broiler-plate we have to write. It gives them experience, and it gives us that broiler-plate code no one wants to write.

Bootcamps here like Turing or Galvanize cost $20k and take 4-6 months, but you'll get an entry-level dev job at $75k or more when you're done. It's been a really good experience. Some have been really good programmers, others not so much. The same could be said of any demographic. There are performers and under-performers.

It's "boilerplate".

A reusable letterpress metal plate of text is called a "boilerplate" because it looks like the manufacturer's nameplate on a steam train's boiler. That led to any reusable block of text being called "boilerplate". No BBQ equipment was involved.

Indeed, I wrote it on my phone, brisket wins on iOS apparently. I usually write it boiler-plate.

No worries.

This is really interesting! What about applicants who didn't go through a bootcamp, but have decent projects on their website/Github/whatever? Is it the fact that you can't verify if they actually built that themselves or just copied/stole it?

I’ll be more keen on a candidate that has interesting GH projects and is self-taught than someone without GH and came from bootcamp. But if both have interesting projects, they both are equal as far as candidacy goes. I can only speak for myself. YMMV.

> But then why pay for a certificate? Four figures, no less. You could go through a bunch of free programming courses in 3 months and print your own certificate, same thing as long as you can actually do a job...

My understanding is a lot of these boot camps have partnerships with companies and help place the people who complete them.

I always worry about the kind of companies that hire bootcamp devs like that.

Sure, I agree that quite a lot of university is not necessary for everyone, but not 2.5 years of it.

If someone started from scratch 3 months ago and only knows how to program a bit in JavaScript, interface with a server, and populate a database, I don't want them anywhere near a project.

I want them to at least know the basic data types and algorithms, security and integrity and design patterns. I don't want to have to deal with software crashes because they don't understand what O(N^2) is.
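To make that complexity point concrete, here's a toy illustration (mine, not the commenter's): finding duplicates in a list with a nested loop is O(N^2), while a single pass with a set is O(N).

```python
def has_duplicates_quadratic(items):
    # Compares every pair: roughly n^2 / 2 comparisons.
    for i in range(len(items)):
        for j in range(i + 1, len(items)):
            if items[i] == items[j]:
                return True
    return False

def has_duplicates_linear(items):
    # Single pass with a set: O(n) time, O(n) extra space.
    seen = set()
    for item in items:
        if item in seen:
            return True
        seen.add(item)
    return False
```

On a list of 100,000 items the quadratic version does on the order of 5 billion comparisons, while the linear one does 100,000 set lookups; that gap is exactly the kind of thing that turns into a hang in production.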

> I always worry about the kind of companies that hire bootcamp devs like that.

Typically it's a red flag for companies that are cheap and treat software as an expense and not a part of their core product.

That's definitely a thing. Very popular over here. The format is normally that you do a crash course over 2-3 months, and then do a year's on-the-job training as an apprentice.

> In 2012, I studied computer science at Concordia University and dropped out after 1 semester.

> On my first day, I attended 2 lectures. I quickly realized that a 2-hour commute to listen to someone slowly recite a PowerPoint wasn’t the best use of my time.

It's so depressing that a lot of expensive higher education is this pathetic. I wish our education system worked enough like a market that it could hold lecturers, programs, and schools to a higher standard, and destroy them if they cheated their students like this.

I am going to go out on a limb and say that most of getting a Comp. Sci. degree at Concordia University isn't actually "listen[ing] to someone slowly recite a PowerPoint". Having attended a public University in Canada myself it certainly wasn't my experience (and also not very expensive, <10K a year). Of course, as with most things in life a lot of the benefit of an activity is what you make of it, not just what is on offer to the passive consumer.

Upon finishing the article, my main takeaway was that the author would have been a lot better off just toughing it out at Concordia in the first place. If you are the sort of person who tries something once for a few hours and then backs away from it entirely the way this person did... it's really something!

As a fellow 2012 Concordia dropout, it's possible there's more to that particular story going untold, and this is the clean version of it for employers. You can look up "printemps érable 2012" for some context and Montreal Police brutality related to that. My personal experience being a student involved running from police on horseback launching teargas grenades. Finding excuses not to talk about it is easier than trying to explain it.

Thing is, 20+ years ago you had to go to college to find a dozen like-minded students to collaborate with. Today, Reddit study groups are free.

You don't pay for the slides, you pay for the course design, the office hours, and access to experts in the field that not only know the course material but can also guide you to further learning outside the syllabus.

...But you have to actually show up and talk to your professor.

I would say that the idea is that you pay for those things. Sadly, however, a lot of universities put programs together for the sole purpose of being able to offer them. Oftentimes CS programs (like the ones at my university) are put together for that reason.

I've had an inordinate number of instructors who treat the lecture as the only teaching duty they have. No matter how much you would like to engage outside, the instructors have to participate as well. Many do not, and have no interest in doing so. As a consequence, the end result is often that you're just buying powerpoint slides and an optional seat in an auditorium.

Oh well that's certainly possible. Like anything else, you can still have a poor program or bad luck with a professor.

I'm just sad to see comments from students that have no idea what a university has available outside the lecture. I'm also quite surprised by it. Did these people never talk to a teacher in grade school?

Yeah, not only that. I went to a state school and we had access to a lot of computing resources I likely wouldn't have had otherwise. I attended in the 90s and unless you already had a job in the industry, you were unlikely to find a network with hundreds or thousands of computers with different OSes on them. I not only got to use macOS and Windows, but VAX, and just about every flavor of Unix at the time (AIX, HPUX, Apollo, Solaris, A/UX, etc.), and even got time on an IBM mainframe to see what that was like.

I remember, during my senior year, interviewing with a well-known company that had a grand idea for putting together a new documentation system that would allow cross-linking of documents so you could just click on a word and it would take you the definition of that word or the manual page for it. I asked, "Oh, like HTML?" to which they responded, "What's HTML?" This was around 1992-93-ish. Needless to say, I had a leg up on those already in the industry thanks to having had access to those resources at school.

It also offers a lot of job opportunities. I got to be a Unix sysadmin on the school network, which both helped pay my expenses and gave me real-world experience. It wasn't glamorous, but it looked better on a resume than flipping burgers.

The whole point of college is to figure out what you want to do, explore, network and learn how to learn.

The amount of learning that happens outside of classes, during labs with peers, is on par with lecture halls if not more. I've seen side projects, discussed random technologies, even startup MVPs on Campus.

That's where the real value lies. That and an environment where there's cutting edge research.

Never really heard about Concordia CS.

Some public universities seem to just be concert-hall-sized lectures. Is it one of them?

You're paying for the brand.

To be fair, this is the problem with PowerPoint. It's a well-studied phenomenon.

I use it as a rule of thumb: the best thing I can do with PowerPoint is show graphics. If I start writing words, I'm misusing it. That's not entirely true (outlines and such can be helpful), but it's a reasonable ideal to shoot for to avoid just reading from the slide.

Everything in higher education makes a lot more sense when you realize they aren't selling an education; they're selling credentials.

Among other things, yes.

At least in US universities along with the powerpoint slide lectures there are also smaller discussion sessions, group assignments, take home programming tests, semester-long papers and projects, optional research and teaching opportunities and more. The lecture is really the least important part.

A 2 hour commute is also a massive problem. Students should be onsite wherever possible.

Requiring a computer science degree to apply for a programming job has always been bizarre to me and this blog post is a great example of why.

"The bulk of the effort was memorizing things that I normally would have Googled."

And given how memory works, in a few weeks you'll probably have to Google them again.

To earn my CS degree I learned a lot of stuff before starting my career that I never used. It would have been much more efficient to learn the things I needed on demand. I learned more about programming in 30 days on my first real programming job after graduation than I did in 4 years in school.

You know the things you learned exist, at least approximately what they're useful for, and what you have to search for to look them up. I think it's definitely not a perfect system, but having programmed a long time before I started studying computer science, it did fill a lot of knowledge gaps I didn't even know I had. If you're working in a team you might not notice this as much, though, as long as at least someone knows what they're looking for.

Going to college is this peculiar process where what you get out of it is commensurate with what you put into it, 100%.

I was a math major in college. A lot of my exams were open book - because exactly like you said - it's mostly about do you know how to think about something, do you know how to unfold something, do you know what to search for, and how it's useful.

Over the years I've had to help nieces and nephews with calculus homework, and it's been interesting to see this at play in the "real" world. I approach math a little differently now than a high schooler would, and usually my mind starts to wander around thinking, "you know, I'm pretty sure I remember xyz theorem that we can use here, let me look it up." Most of the time it works every time. From there it's easy to put it into the context of their current lesson and go over it in a way that makes sense with how and what they are studying in class.

When looking at job applicants, if someone has a degree, at the very least it means they had an opportunity to put a lot of effort into it and HOPEFULLY it means they got a lot out of it.

Sure, it's possible, maybe even most common, for it to just be a piece of paper - and that's OK. That's why we interview and a degree isn't the be all end all.

> Requiring a computer science degree to apply for a programming job has always been bizarre to me and this blog post is a great example of why.

What folks don't realize is that it's a signal to noise ratio issue [0]. It's not perfect but saves a lot of time down the line.

> "The bulk of the effort was memorizing things that I normally would have Googled."

That's missing the point. A lot of Engineering is being able to spot patterns and know enough about a subject to be able to research it properly and efficiently.

"is this a state machine", "can I represent this as a tree", "is this a regular language or do I need a more sophisticated parser".

[0] https://blog.codinghorror.com/why-cant-programmers-program/
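A toy version of that last question, as my own sketch (not from the linked post): nested parentheses are not a regular language, so a plain regex can't validate them in general, while a simple counter-based scan can. A regex is still fine for genuinely flat patterns.

```python
import re

# A regex handles a flat, regular pattern like comma-separated digits...
FLAT = re.compile(r"^\d+(,\d+)*$")

def balanced(s):
    # ...but arbitrarily nested parentheses need a counter
    # (a tiny pushdown automaton), not a regular expression.
    depth = 0
    for ch in s:
        if ch == "(":
            depth += 1
        elif ch == ")":
            depth -= 1
            if depth < 0:  # closed more than we opened
                return False
    return depth == 0
```

Recognizing which of the two situations you're in is exactly the kind of pattern-spotting the parent comment describes.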

There are two horrors here:

1) most people who apply for programming jobs can't write FizzBuzz;

2) the traditional way to signal that you are not one of them will cost you 5 years of your life, and probably burden you with decades of debt.

Employers of course worry about the first one, because the costs of too much interviewing come directly from their pockets. Students worry about the second one, because they pay for that one directly with their money and time.

The fact that, if you can learn in 3 months what the average student learns in 5 years, there is a possibility to get a signal that really only costs you 3 months of your time, and proportionally less money, is a fantastic improvement over the traditional way, where if you are faster, you just have to spend more time waiting... while still paying huge money for that time.

(The equivalent improvement for the employers would be like, doing FizzBuzz tests during phone call, so you would only spend 2 minutes per hopeless candidate, instead of scheduling one hour of your time.)
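For reference, the FizzBuzz screen mentioned above asks for something on the order of this (one common formulation):

```python
def fizzbuzz(n):
    # Classic screening exercise: multiples of 3 -> "Fizz",
    # multiples of 5 -> "Buzz", multiples of both -> "FizzBuzz",
    # everything else -> the number itself.
    out = []
    for i in range(1, n + 1):
        if i % 15 == 0:
            out.append("FizzBuzz")
        elif i % 3 == 0:
            out.append("Fizz")
        elif i % 5 == 0:
            out.append("Buzz")
        else:
            out.append(str(i))
    return out

print(fizzbuzz(15))
```

That this trips up a large share of applicants is the whole point of using it as a two-minute filter.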

> The fact that if you can learn in 3 months what the average student learns for 5 years, there is a possibility for you go get a signal that really only costs you 3 months of your time, and also proportionally less money, is a fantastic improvement over the traditional way where if you are faster, you just have to spend more time waiting... while still paying huge money for that time.

The author did code professionally for 10 years before getting his CS degree. And he did have an Associates Degree in IT to land those jobs.

Yes, the author was not just an ordinary student. I still consider this a fantastic improvement for two reasons:

> many opportunities ... out of reach because I didn’t have the required papers

> From there, a Bachelor’s degree typically requires an additional 3 years of study.

1) The author had a problem he wanted to fix; the traditional way required 3 years and lots of money; the alternative required only 3 months and less money.

2) If, hypothetically, there is an 18 years old genius who could accomplish this trick right after high school (maybe not in 3 months, but 6 or 12 months -- still faster and cheaper than the traditional way), they can, and this is how.

Computer science is as relevant to most web developer jobs as quantum physics is to welding. You want job training to be a react monkey take a community college night course. If you want to study something truly fascinating and is not job skills training, study computer science.

While I praise the author's effort, as a professor myself (I teach CS in a local polytechnic school - something like the US community colleges) I really doubt the quality of either the WGU and of the knowledge retention.

As an example, this semester I'm teaching Operating Systems concepts (definitions, processes, threads, semaphores, mutexes, signals, memory layouts, IO, etc.), and even if I take the theoretical concepts out (like not explaining how a fork() works under the hood), I still have 20 hours of labs and some 60 hours for two projects. How can this author ever reach the same level of competency as my students who let these concepts "sink", as he dedicated 15 hours in only 2/3 days.

Second example: in 2019 I taught some classes for a 4 month course on basic programming and web apps. Even having 7h of classes per day, things had to be really succinct, and in 4 months, although they were capable of doing some web apps, there was a lot of confusion in their heads because of the fast pace.

Some other examples on his post: 9h30 minutes for Discrete Math I. Either trivial things were handled, he is a math wiz, he rushed through all the exercises, or no exercise solving was necessary..

Again, I don't want to undervalue the author's effort, but as someone who as been teaching for quite some time, and has taught people from the 7th grade to MSc, I'm really suspicious of the quality of the degree and of the knowledge retention..

I read the story differently. It seems like WGU splits two problems (teaching material and verifying that the students learned material) into two different problems. It's not that the author crammed weeks of learning into hours, it's that the author _already knew all the same material from years of experience with it_, and just needed someone to help him verify "Yes, this person understands this material." I would love to see more universities who are happy to verify people's learning, instead of pretending there's only one way to learn the material.

The objection you raise seems more relevant to bootcamps, which claim to teach one how to program in a matter of weeks. After a few weeks, students might be able to get by, but they aren't going to be on par with people who have been working through some of this stuff for several years.

I took a Data Structures and Algorithms course at a brick and mortar university. It was being taught by a substitute teacher from the art department who knew some programming. The class was 5 assignments, the first of which I completed with about 60 characters of Python code; I remember feeling smug because others struggled with the same assignment. The other assignments weren't much harder. It's not all bad though, I had a great time "hanging out" (our classes were little more) with the teacher and other students.

My point is: Brick and mortar universities can also be ridiculously easy. Your students are either lucky or unlucky, depending on how they look at it. I'm currently approaching 40 hours spent on Software 1 at WGU, and I've worked as a professional programmer for over 10 years. In my experience WGU has been harder that what I had at my little "community college" (which was actually an accredited university).

> It was being taught by a substitute teacher from the art department who knew some programming. The class was 5 assignments (...)

I guess you were unlucky! Teacher from the art department, and a class with only 5 assignments?! Didn't your university had a CS department with CS-competent people to substitute? For instance, the CS department of my college has at least 40 teachers, and it's a polytechnic school (more like a vocational school than a theoretical university)..

What I want to say with this is that maybe your university is also questionable for delivering CS courses like that..

It was a summer class, and I don't know what their staffing situation was like.

> What I want to say with this is that maybe your university is also questionable for delivering CS courses like that.

Yes, it was. My point is that most CS degrees come with questionable quality. We should not knock WGU for its flaws while ignoring the widespread flaws of traditional universities.

> Even having 7h of classes per day, things had to be really succinct, and in 4 months, although they were capable of doing some web apps, there was a lot of confusion in their heads because of the fast pace.

If I'm reading that right you taught 7 hour per day for 4 months? 16 weeks * 5 days a week * 7 hours = ~500 hours?

If I were to spend just 100 hours doing (WGU style) self study of any topic I would expect to come away with a clear view: not knowing everything, but at least knowing what I know, and knowing how to learn more as needed. I think the traditional style of teaching (large class, one teacher) isn't very time efficient, so it makes me sad to see something like WGU dismissed solely because of the time spent.

> If I'm reading that right you taught 7 hour per day for 4 months? 16 weeks * 5 days a week * 7 hours = ~500 hours?

There were 6 professors. I taught ~75h of server-side web development..

> I think the traditional style of teaching (large class, one teacher) isn't very time efficient, so it makes me sad to see something like WGU dismissed solely because of the time spent.

Not that time spent == quality, but as in the examples I mentioned above, 15h for operating systems concepts (which I've been teaching for the last 3/4 semesters) doesn't teach you anything unless you already know most of it, and 9h30 for Math is only enough if you are reading a book (diagonally, that is)..

Again, the author's effort is something that he should be proud, but I, personally, think that he hasn't learn much things with enough quality..

I probably spent even less time passing Discrete Math 2 at WGU. However, that was largely because I had fully embraced that I would never graduate from a university and had to learn on my own, so the year prior I had read some books on proofs, probability, and statistics. I have notes and flashcards, I really studied, never expecting school credit. I had also encountered combinatorics and such in grade school and other math classes over the years. Discrete Math always seemed like "programmer's math" to me, in contrast to Calculus, and being a programmer it felt easier.

We don't know a lot about what the author was doing outside of school, but we know he was serious about self improvement - he wasn't just putting in hours at work and collecting a paycheck. It sounds like he's completed some impressive programming projects, he attends meetups, keeps a blog, uses Linux, hates Windows, learned to like Windows anyway, ported a Windows UI library to Android and iOS, speaks multiple languages, has attended multiple universities in the past, etc. I think there is a good chance that "he already [knew] most of it".

The sad thing is that no one cares abou the quality of the degree outside of academia. All they care about is seeing the BS on your resume.

Most employers aren't going to see someone with a degree from "Western Governors University" any different than someone with a degree "Central Ohio University" or "University of Wisconsin - Stout". Even though one of these allows you to get a degree in 6 months for $3,500 and the other two are full 4-5 year degrees that total $60,000.

Yes if you have MIT or Harvard on your resume it will be memorable and stand out, but other than that... they are all basically looked at equally.

This is overly simplistic.

There are tiers to everything. If you think it's MIT or Harvard, and then everyone else is a tossup, I've got some very bad news for you.

That only proves GP's point. What the author of the blog-post did in 3 months does not meet the expectations of a CS program from even a half decent school (think bottom of the top-100 list).

You know what also doesn't meet those expectations? a degree from a diploma mill.

So what you're saying is that, since outside of academia no one cares about quality of education, employers don't mind (or maybe shouldn't mind, according to your opinion) degrees from diploma mills.

When I say that the quality of the degree is not good, it is somewhat implied that the quality of the work of the person who did the degree is also questionable.. :/

Unfortunately you get this with students at every school. Many are just there to skate by and get a degree. The OP could also just know the material already and had no issue being able to pass as quickly as possible. This is something that is a huge benefit of WGU.

Also for what it's worth I took 8 weeks doing about 10 hours of work a week on Discrete math 1 and found it very interesting. My discrete math notes is a 1300 line org file that translates to a 29 page word doc. This doesn't include the many proofs and problems I practiced on a white board or on my iPad. I completed Operating systems in 5 weeks with about 15-20 hours of work a week. I used Georgia Techs Udacity course on operating systems to supplement my learning.

> I completed Operating systems in 5 weeks with about 15-20 hours of work a week. I used Georgia Techs Udacity course on operating systems to supplement my learning.

Which gives ~75h-100h, way more credible than the 15h the author mentions in the post..

My students have 3h/week of Lecture, 2h/week of labs which is about 75h of classes (15 weeks). If we add ~60h estimated time for two projects, it's 135h. It means to me that you had enough time to learn the concepts and may have internalized some/most of them..

Here you say that the time spent is "way more credible" than the authors. I think it's worth noting though, that it's the same course.

Im not a native english speaker, what I mean by more credible is the outcome, not the degree itself. In other words, this guy that took 100 hours, I would say that it seems more credible to have learned more than the other that says he took only 15 hours...

I have to agree here. I am going back to get a 2nd BS, this time in a technical field, and I just finished discrete mathematics (at a community college no less). We used Rosen, 8th edition, and I had single homework assignments with enough problems to fill up 6-8 hours if you include the readings required. It was probably one of the most challenging classes I've ever taken.

And then the proofs. Oh the proofs.

One thing that stood out for me as well was that someone else indicated that, in general, the only requirements were the exams. Most of the engineering classes I took had weekly problem sets that, yes, took at least a solid evening to do.

The way my undergraduate did credits was the (theoretical) number of hours a week they took between lectures, lab, recitation, and study/problem sets. Typical classes added up to 12 with usually 4 hours of the first 2 categories and 8 hours of the last. Classes varied but that wasn't too far off for the typical class. (Though there were some real time sink outliers.)

My university recommends that you spend 2-3 hours on your own per credit hour of study per week.

That was pretty much my experience at least for lecture-type courses.

I actually read that as 2-3x the in class time. No, where I went people generally spent way more than 2-3 hours/week of their own time on many courses.

It is per credit hour. A 3 credit course would require 6-9 hours per week, a 4 credit course 8-12.

I'm glad to see others attending WGU. I worked in the industry for a little while, but I couldn't take web dev any longer and decided to go back to school.

I've enjoyed the CS program quite a bit, especially the self-pacing. I have been going at a much slower pace than you have, but I hope to finish up in the next 3 months. For someone new to CS I still think WGU is a good school if you are self-motivated. A standard class is expected to last 8-12 weeks, but they take the full 6 months if needed. Some of the intro classes are very short and can be done within a day or 2. Some classes have cohorts which are nice and instructors will have webinars to go over things in the class.

I had a lot of fun with the later project courses giving me a chance to build out some neat JavaFX and Tkinter applications. A lot of students get crushed by Discrete Math 1 and 2, but it's just one of those classes you need to practice with paper and pencil at. I also like that you are open to using other material to learn the subject matter. For the operating system class, I worked on Georgia Tech's Intro to operating systems class on Udacity and learned everything I needed for the test. The school also provides a Pluralsight subscription so, you can benefit from the excellent classes there as well. There is also a subreddit where students discuss courses and their strategies for passing them here: https://www.reddit.com/r/WGU_CompSci/

I can see the program being brutal for people who cannot keep themselves on track due to lack of deadlines though. You do have an advisor that follows you through the whole program that communicates with you every week. If you do struggle with deadlines, let them know and they can help you strategize and keep you on track.

Lastly, I would like to note that the capstone does force the student down the path of a Data analysis/AI application. As someone that has little to no interest in AI, I was pretty bummed that it wasn't more open-ended. These things change though, so it might be different in the future.

This is interesting to me. I attended a public/state university for 5 semesters studying computer science, and dropped out due to personal matters (while also getting paid employment during the internet boom in a relatively rural part of the United States.) My first paid job didn't pan out, but then I got my first salaried position as a web developer, and I've been employed ever since (though I'm currently furloughed but mostly satisfied to be so because - I can handle it financially, and I have lots of work and projects around the house to keep me busy.)

I've often thought about getting a degree, mostly for some additional pointers on low-level, algorithmic engineering, and for exploring more interesting computer science subjects in greater depth. However, I don't plan on doing full-time development work for a significant period of time going forward, so the investment of money and time at a traditional four-year university doesn't seem worth it to me.

This alternative may or may not pay for itself in compensation, but that isn't why I would pursue it. I'm just interested in the education!

This was my motivation for finally going back to school. I dropped out originally due to running out of money and having to take a full time job. Or rather I told myself I'd continue school part time, but never did. Lack of a degree was never a problem professionally so there was no reason to go back and spend a pile of money and a lot of time.

But then I got to the point where I wanted to advance my knowledge of various topics and a master's degree would have been a great structured way to do that. For example I did the MITx MicroMasters in Supply Chain Management. After that I would have had an option to finish the master's program on-site at MIT with 1 additional semester. But I couldn't pursude this due to lack of completed undegrad.

So this finally motivated me to look into some options, and I also found WGU very appealing due to the rolling start, low cost and ability to quickly blaze through the classes I know and only spend time on the ones I need to learn something new.

Hey can I talk to you about that course? I’m switching into a new job in supply chain management (I currently work as a sys admin/IT admin) and feel pretty overwhelmed trying to figure out how that industry works

Certainly. Email is on my website and link is in my profile.

At a high level for anyone else following along, I'd highly recommend the MITx SCM Micro Masters. I learned a ton from those courses and then followed it up with a 1 week SCM bootcamp in person at MIT. The bootcamp was more expensive than the entire previous 5 classes but it was a great way to fill in some additional things that you can't really do online, specifically doing case several case studies and having awesome group discussions.

A comment on the CompSci part. For me, the CompSci degree was the best part of my entire technology background. I loved the classes that stretched the mind: Automata, multiple algo classes, etc.

The downside - is once I had exposure to those mental highs, programming wasn't nearly the same. The luster of C++ pointers or code optimization (or whatever) wasn't nearly the same after touching the deeper stuff. Which made me sad - to be honest.

I had some rich experiences going through computer engineering at Notre Dame. I very much didn’t come from wealth, and received some amazing scholarship. I left with a relatively small student loan. (Now, USC on the other hand - offered me no financial aid! Which I rightfully understood as they didn’t really want me to attend.)

One of them was doing an internship after sophomore year with the Air Force Research Labs (Hanscom AFB, though I ended moving off base and subletted an apartment in Boston). I built a system with GNU radio (software defined radio), a free attenas, etc. and was able to take it to “production”. My peers in the program came from other great schools like NC State, LSU, a couple historically black colleges, etc.

Now, this program was only available to folks considering going through the office candidate program (which, I don’t think any of us ended up pursuing). But sophomores in college.

I returned to campus and found an opportunity to be a research assistant with doctoral candidates doing SDR work on Zigbee - so I hopped on.

There were many more rich experiences, too: studying abroad in London, meeting the girl I’d marry, realizing I was not the smartest person in the room and inheriting a bunch of culture that I’m better off for having.

I don’t know how you replicate these experiences outside a formal program; that’s the uphill battle for any of these programs. I also don’t think that it would’ve been best for me to pursue any of these modern alternatives. But, the articles approach, and programs like Lambda School do provide interesting value to their own customers and on their own merits. They also provide appropriate competitive pressure to keep Unviersity programs honest.

> I don’t know how you replicate these experiences outside a formal program;

They aren’t designed for that. The specific programmed mentioned I believe is designed solely to bypass credential based gates where you might meet them.

Bootcamps are generally designed for quick turn around times for people who want a high paying career change, and come with a different set of experiences.

Cool. I applaud your achievement.

In my case, I have often considered going back for a degree (I had a "redneck tech school" education), but I could never justify devoting the time and money to get a degree in tech that would already be at least three years out of date by the time I graduated, qualifying me for a job, paying half of what I was already making.

Nowadays, the point is moot. I have spent my entire adult life, relentlessly self-teaching (still at it), and no longer need to prove anything to anyone (the biggest issue with not having a sheepskin, is looking up noses everywhere).

I love learning, and do wish I could have gotten a "more rounded" education, but the mitigating factor is that I have been shipping software, since I was 22, and that has taught me some stuff that I wouldn't have learned otherwise.

I read something ages back about a lesser known process called 'testing out' where you are basically able to just do the exams at universities to get a degree. It's suppose to encourage life-long learning and its based on the assumption that many professionals and adults accrue relevant experience outside of academia. I had always wanted to look more into this myself but it seemed to require too much bureaucracy. OPs university sounds like it might be an actual working alternative to this. I find this fascinating and wonder what other degrees might be possible

As a full time working adult, WGU was _amazing_ for getting a degree, at my own pace. Which was a lot faster than a standard brick and mortar school. You can see the course catalog here [0]. My work paid tuition for my degree, but they had an upper limit on reimbursement. WGU was what I found that they approved, and came in under cost (I didn't want to pay out of pocket).

I really do recommend WGU for experienced professionals, and self-guided and motivated learners. I would not suggest it for a brand new high schooler, or people who have no experience in the industry they're trying to get a degree in.


Testing out of a course used to generally be available for freshman level courses but rarely for anything more advanced than that. (The exception might be that one could generally test out of up to a year of calculus since some people don't consider calculus a freshman course - e.g., business majors, etc.) It was a cheaper option at some universities than the AP exams were for students that had taken AP courses, although AP courses were not required to test out of a course. The testing out option enabled high school students that had taken advanced courses to start university more in-line with their backgrounds rather than having to repeat courses for no reason.


Edit: that was decades ago - don't forgo an AP exam thinking you can test out unless you already have admission and have verified the availability and costs of testing out of a course.

Reputable colleges won't give you credit for that, beyond skipping first year prerequisites. But you can get "a degree" for employers like government that pay based a formula based on credentials.

Sort of. CLEP tests go over much of your core curriculum, but not major specific stuff. So really more like two years worth of material, maximally. And while some top tier universities won't credit that, they -will- credit transfer credit from colleges that DO accept CLEP credit. Takes a bit of work to figure it all out, but you can end up going to top tier universities with two years of college credit if you do it just right.

But yeah, per OP, you can't get a degree with it.

If they did, they'd get the reputation for leaving money on the table. Notgonnahappen.

My understanding of “testing out” is that it generally refers to standardized things like CLEP where there are only a handful of subjects you can test out in, typically prerequisite academics.

I think the takeaway point from this is how dumb it is that most organizations value credentials over experience. It shouldn't come as a surprise that someone who has a decade of work experience in the field can sail through a college curriculum like this, and it is really a failing of society more broadly if the author feels that he needs to get a diploma to be taken seriously.

It's not dumb, it just hides a cynical compromise. Companies need to wade through tons of applications, young people want a path through which they have some degree of certainty (pun semi-intended) that their work will lead to social capital + being with other people their age in a common setting. The college experience leaves no one very satisfied but many chunks of society tolerably satisfied.

Of course, the social contract is weakening and a new compromise will have to be found. That may simply be the broader acceptance of online degrees and certificates.

For anyone who is wondering, the school he listed (Western Governors University) is unfortunately not ABET accredited as far as I can tell.

ABET accreditation doesn't mean much for CS however. For example, none of these schools are ABET accredited for CS:

- UC Berkeley


- Stanford

- Cornell

- University of Washington

- Caltech

- Princeton


It means quite a lot if you want to use the CS degree for certain purposes, such as the U.S. patent bar as mentioned below.

I was pointing it out not to criticize the school generally, but because it is meaningful for some use cases that sound like "I need a CS degree, quick."

Add WGU to that list and tell me which one doesn't belong.

I'm not arguing for/against WGU.

My argument is that ABET accreditation isn't relevant to how "good" a CS program is. It does establish a minimum baseline, but lacking it just means you have to determine the program's value in other ways.

I think it is relevant though. Most of the mid-level schools I looked at were ABET accredited. Going unaccredited in CS seems to be a thing that you can only get away with if you have so much or so little name recognition that it doesn't matter.

Isn’t the point of ABET mostly so that you can obtain engineering licenses. I don’t imagine most software development jobs would even bother looking up if your program was accredited because you aren’t going to need a license.

The US government for one requires ABET accreditation for all engineering degrees regardless of licensing requirements for the job. I'm currently a government contractor who is hoping to eventually convert. It's not yet clear to me how much the lack of ABET accreditation on my UC CS degree is going to hurt me.

You're correct. The path I took is not without compromises:

- WGU is not ABET accredited.

- WGU is not as rigorous as other universities.

- WGU is not as prestigious as other universities.

- WGU gives all graduates the same 3.0 GPA.

I probably should point them out in my post.

That said, I still believe WGU was the right choice given my circumstances. I don't value these things enough to justify taking 12 to 16 times longer to graduate.

I agree with all of your points. Thanks for sharing your experience and especially the break down on the time. Do you plan on going for a Masters, either elsewhere or at WGU ?

Ew everyone gets 3.0 GPA? What's the point of that? Are all of the exams pass/fail?

I think it's a combination of:

* The school targets students with previous experience, which transfer credits that have no influence on the GPA.

* The school uses technical certifications (e.g., Oracle Oracle Database SQL 1Z0-07) as a final assessment for some courses, which don't cleanly map to a GPA.

* The school doesn't grade project-based courses (PAs), and instead requires students to resubmit their project until 100% of the rubric is met.

It's basically pass/fail. They won't even show students their grade on final exams (although Chrome's DevTools can reveal them).

I believe it is regionally accredited.

This should be at the top. What's the point of that paper if the issuer is not accredited?

It's not ABET accredited, but it is regionally accredited -- which all of the big schools are. That means credits earned at WGU will generally transfer to a traditional school.

There are a lot of top schools that are regionally accredited in their CS department but not ABET.

ABET has been for more traditional engineering like Chemical Engineering and Mechanical Engineering and the like.

Even for those schools that do have their CS departments ABET accredited, it's a new thing.

The school I went to, University of Texas at Austin, for example, only became ABET accredited in CS in 2017. Whereas the ChE was ABET accredited in the 1930's.

The school may be accredited but not the degree. This is the case for CS degrees from several UC schools including Berkley. My CS degree from a slightly less prestigious UC is not accredited but no one has ever said anything about it.

The article says the school is accredited by a different group.

I assume if you're just interested in learning. (Not trying to be sarcastic or trollish -- I'm looking for stuff that optimizes more for personal enlightenment than a credential, and I accept that both targets are valid to pursue.)

With that said, even for that use case, it feels clickbaity to label it "I got a CS degree" rather than "I learned the content of a CS degree".

> "I'm looking for stuff that optimizes more for personal enlightenment than a credential"

The point of accreditation is to assure that various minimum standards are met by the educational institution. (Whether that works or not is a different discussion.) Lack of normal accreditation is usually a sign of a diploma mill and, more importantly, that the student will not be getting their money's worth either in terms of a credential or personal enlightenment.

I assume this matters if they want to go to grad school, but does it matter for any other reason? I think employers attribute value to programs they have heard of, whether it is ABET or not is not something I have ever considered.

One example is taking the patent bar in the United States. A CS degree from this school cannot be used for that purpose. I'm not saying it matters to employers (I don't know either way) and I expect its regional accreditation may be fine for some purposes.

There plenty of people getting a BS from WGU that go on to Master level degrees at other universities. I know quite a few of them personally. Going to WGU for a BS and then a different university for a Masters, is quite possible and not unheard of.

This is actually what my sister did. She got a WGU Bachelors Degree and then went on to get a Master's Degree from a more recognizable college. She shaved 3-4 years off her full masters program that way (if you consider the time spent for the Bachelors degree in that process) and at the end of the day she is applying on her resume that she has a Master's from a recognizable university name that raises eye brows.

I am currently looking at doing the same.

If you do the math, it is possible to get a WGU Bachelors Degree + Full 2-Year Master's Degree program from a "recognizable" school for less money and in less time than most people are spending to get a bachelors degree from an unrecognizable community college.\

That's a life hack for anyone looking for Master's programs.

The US government is very big on requiring ABET accredited degrees for engineering. I've never gotten a straight answer on whether this is the case for CS or not.

I was initially skeptical, but this looks pretty solid after reading. The author seems to be an absolute beast though, so anyone diving into this should expect to focus and work hard / very consistently.

Aside from it being every-day work (he explains it was his only job, and during Covid), the daily effort seems to average around 6-7 hours.

That said, I don’t see many days where he didn’t have school. So yea this isn’t for everyone obviously.

Does anyone know about similar programs for non-US residents? I understand the original author is Canadian, but it seems they got granted a "special exception" that is unlikely to be available to, say, people from the EU (my case), or other countries.

Look up Thomas Edison State University, which will let you test out of many classes, transfer in other (US) college credit, and offers PLA credit (though that isn't as easy as just studying for and taking an exam). Also OpenClassrooms, which is project based and offers many options for Bachelor-level diplomas through a non-traditional route, with many of the diplomas backed by reputable European universities such as Ecole de Paris and Lyon University.

Also check out degreeforums for other places that are legitimate.

I am not affiliated with any of these websites or businesses, but I have been through their educational programmes. If you are an intrinsically motivated self-learner/self-teacher that works well on their own with occasional mentor contact, then I can highly recommend them.

Thanks! That may be very useful at some point.

Scott H. Young completed the entire 4-year MIT curriculum for computer science in one year [1]. That was in 2012, when MOOCs were still a new thing. Good luck, I hope your WHY is strong enough to push through difficult days.

[1] https://www.scotthyoung.com/blog/myprojects/mit-challenge-2/

WGU would be great if they would just accept my friggin high school transcripts. I've had them mailed out twice, once via certified letter, and both times their admissions department failed to receive said transcripts. It's really infuriating too because I'd love to take their CS program but they just... won't... accept... my... transcripts.

I also gave up on attending because of the admissions process.

I eventually enrolled and graduated from WGU. But when I first tried to go a few years ago, their admissions and communication process was such a disaster that I gave up.

I went back later and literally had to force them to let me enroll. When I finally complained to my mentor about the admissions process being like pulling teeth she simply said "We want to make sure students are motivated before we enroll them". I am not sure I buy that response, but that's what she told me.

I agree, it is a joke.

I applied to WGU this year to get a CS degree. I work as a dev with no degree. My college transcript has AP classes on it, so they wanted those sent as well. These were from two decades ago and I have virtually no chance of getting them. I emailed my "enrollment advisor" three times about the situation, with no response. But every week or so he would send a canned email about how great WGU is and enroll now! I figured if they were this uncommunicative before they got my money, imagine how it would be after.

This is very interesting to me. I have an EE degree but have been doing software full time since I graduated in 2009. Even though I am mostly self taught and feel competent enough compared to my coworkers, I frequently have pangs of self doubt around not having a CS degree. Mainly worry about not having a deeper knowledge of algorithms and data structures. I have been thinking for a while of an online CS degree and this looks pretty good!

> Mainly worry about not having a deeper knowledge of algorithms and data structures.

Why not just look at these then? As an example, you could check out MIT's algorithms and DS lectures on YouTube. An entire degree might be a lot of wasted effort.

Some of it is just the unknown unknowns I guess. I can only look at things I think I may not know or know well enough. Whereas a directed course might guide me to what I don't know better. But perhaps you are right, I should probably start there.

What I would probably do is look at MIT's degree requirements for their 6-2 or 6-3 degree (BS for EE and CS--with apparently different emphasis on the EE or the CS). And then look at OpenCourseware for the syllabi. Of course you can look at other schools as well--though I don't know of other resources as comprehensive as OpenCourseware. That should give you a fairly good idea of at least what MIT requires that you may not be familiar with.

MIT is probably also a good start if you're an EE, given that the degrees are in the same school, vs. places where CS is closer to the Math department.

Also I assume OpenCourseware and edX.

I also dropped out of a Canadian computer science degree program but in my case I will likely never get a degree.

I left University of Alberta in 1989 after completing about two years of my degree to co-found a startup with the thought that if it failed I would go back to get a degree at a better program. The startup was a success and I have never gone back to school.

Sadly, in 1988 the University of Alberta CS program was stuck in a 1970s data processing curriculum and, for undergrads, had neither PCs nor Unix. I had already been making money writing applications software for PCs for several years, and there was no way I was going to go work for an oil company or provincial government doing mainframes.

I did miss out on some things by not completing. I had very much wanted to take the compilers course offered by Jonathan Schaeffer (of Chinook checkers and poker bot fame). Had the U of A program included internships or co-op, more Unix or PCs, and a more modern curriculum, I probably would have stayed. Indeed, they closed the program to new students for a year in the fall of 1989 to retool and modernize it. The revised program was much better (ironically, I was hiring interns from the program at about the same time as I would have originally graduated).

It is still strange when I sometimes have to explain that I have no degree. I can't imagine getting a CS degree just for the piece of paper, though I can understand why someone might need that paper in addition to the skills, such as for a TN work visa. Thankfully I have not. If I did go back (and in my 50s I would feel like I was stealing a seat from someone who really needed it for their career), I would go for math or statistics instead of CS, a degree complementary to the CS skills I already have.

> Sadly, in 1988 the University of Alberta CS program was stuck in a 1970s data processing curriculum and, for undergrads, had neither PCs or Unix

I feel the landscape today is very different for CS degrees. Most universities standardized quite a bit around what CMU, Stanford, and Berkeley are doing. But you are not the first to tell me that CS degrees were a crapshoot in the '80s. EE as a pathway to software was much more common.

At that time there was work in the ACM to standardize the college CS curriculum and, for me, the obvious differences between the proposed ACM curriculum and what I was taking was a stark indicator.

I feel like some of the other comments are missing out on how it's possible to do this slower while still working, regardless: Thank you for this experience, it's super motivating. I recently got into the GA Tech OMSCS program myself. I had a non-tech BSc from a South Asian university that I barely survived because it wasn't my thing. Ever since I got out of school, I've worked in tech. I even managed to get a job and move to North America through my work, but the lack of a degree always felt off to me as well.

If I'd read this article a few years ago, I'd likely have jumped into this program before joining GA Tech's OMSCS program. But at this point, I'm happy with what I chose, and to be able to work and study at the same time, no matter how intense it may get.

> it's possible to do this slower while still working.

This. Woah, I had to scroll down so much to reach this point! This is a very important aspect indeed.

Does the GATech OMSCS program have this flexibility?

So they admit people from outside the US?

yes and yes

Congratulations. It is somewhat surprising to me how some credits required vastly more effort than others. For example Algorithms 2 is worth as much as "Introduction to IT" - 4 credits each.

Were there other restrictions on which credits you could use? In your opinion is this "fair"?

This is normal. When I was scheduling classes I would weight them by difficulty and have a cap for total course weight to account for it. Otherwise it's easy to overload on hard courses.
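That weighting-and-cap idea can be sketched as a simple greedy heuristic. This is just an illustration of the approach the comment describes; the course names, difficulty weights, and cap value below are all made up:

```python
# Hypothetical course-load balancer: each course gets a difficulty
# weight (1 = easy, 3 = hard); courses are considered in priority
# order and added until the total weight would exceed a self-imposed cap.

def pick_schedule(courses, cap):
    """courses: list of (name, difficulty) tuples, in priority order."""
    schedule, total = [], 0
    for name, difficulty in courses:
        if total + difficulty <= cap:
            schedule.append(name)
            total += difficulty
    return schedule

courses = [
    ("Algorithms II", 3),
    ("Operating Systems", 3),
    ("Intro to IT", 1),
    ("Technical Writing", 1),
    ("Compilers", 3),
]

print(pick_schedule(courses, cap=8))
# A cap of 8 admits the first four courses (3 + 3 + 1 + 1) and skips Compilers.
```

The point is that the cap is on self-assessed difficulty, not credit hours, which is exactly why a term full of 4-credit courses can still be wildly unbalanced.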

I certainly tended to do that as an undergraduate when I could. Credit hours or the equivalent correlate pretty loosely to difficulty and the time you'll need to spend. (And, of course it's even truer when you mix in non-engineering classes which aren't necessarily easy but were relatively speaking for me for the most part.)

>On average, I studied for 40 hours a week and completed a course every 3 days.

I will say that when I was in university, there were very few courses I only spent about 24 hours on. In fact, there are probably a fair number of courses which, even had I taken them a couple years earlier, would still have probably taken more time than that to do assignments and brush up for exams.

Credits are usually used as a proxy for how much out-of-classroom time the class will occupy in a week. Not necessarily the difficulty of the material.

Algorithms II is typically a mathematical proof class about algorithms. At my university, it was used as a filter class, and quite demanding. I think it was also 4 credits.

Hopefully this style of university takes off. The traditional college system is an outright scam.

That's brilliant. I wish something like this had been available back in the late 90s when I was getting my CS degree from Oregon State. I pulled it off in a couple years by taking a heavy load and working through summers, I'd have been ecstatic to find a fully self-paced program.

I also think the idea of capping it off with OMSCS is a great idea. I did that myself but 20 years later. And mostly just for entertainment rather than boosting my career.

Is this accreditation accepted to apply to Georgia Tech's OMS CS?

Yes. I did the WGU BSCS purely to get into GaTech OMSCS, and I got in with no problem.

How do you guys feel about this?

He signed up at an accredited university and then transferred in credits from resources that weren't universities at all.

Is that concept new? Is this something we can all be doing for other degrees, or is this barely accredited and likely to come under fire soon and stop being respected?

Those online classes he took are ACE or NCCRS accredited. Study.com credits, for example, can be transferred to many universities: https://study.com/college/school/index.html#transferSearch

Take a look at ACE and NCCRS: https://www.acenet.edu/Programs-Services/Pages/Credit-Transc... http://www.nationalccrs.org/policies-procedures

I did notice the classes themselves were accredited, and I also like the possibility of rapid iteration and refinement of the course material that these online classes can offer. I.e., a university has one professor following an unchallenged lecture routine, whereas online courses compete with each other, can update constantly, and can use the most effective visual aids.

I went to a public 4-year university and transferred in community college credits that I'd previously used to satisfy high school graduation requirements.

Requiring courses from a university is assuredly too high a bar, given that it would exclude anyone who, say, transferred from a community college into a 4-year program (several of my brightest peers fell into that category).

Several of these courses (Calc 1, American Government, Environmental Science, Stats) are comparable to taking AP exams (cost of $95) for credit, which you could ostensibly do without taking the course in the first place.

One travesty to me is that high school AP courses are so much more rigorous than every single other alternative for college credit. 10 months straight reserved for the local overachievers, with a suspenseful lead up to a third party test that determines their future. A future which is ultimately no different from college-bound people that coasted and chilled through high school, and a professional future that isn't much different from everyone there.

The actual college course is 3-4 months and may just have only a midterm that you need to show up for, or less.

The AP exam and equivalents you can actually self-study for, so the overachievers congratulating themselves for 10 months straight would probably be equipped to use their time more wisely if they already have the discipline and support system.

It's not new. Some accredited institutions will give credits for "life experience", and I know some universities work with local community colleges so that students can transfer credits in. Typically these transfers are very limited, though, and only let you bypass "Intro to X" type classes, or meet requirements for graduation that aren't core to the program.

This is interesting, but from what I have seen in the industry a CS degree is only really useful if you are starting out or are a relatively junior developer. With 10+ years of experience, an additional "BS degree from XYZ University" line on your resume is worth pretty much nothing.

In the article he suggests that wasn't really his experience, and it's hard to argue with someone's experience. I also read him as just being bothered by the fact that he had never gotten a degree. Given that, and the way and circumstances under which he did it, it's hard to fault him for just enrolling online for a semester and doing it.

From what I understand, it can sometimes be useful to have the paper. On the other hand, I'm sure you're right that a lot of the time in tech someone with 10+ years of experience probably doesn't add a lot by getting a credential like this--especially if e.g. there were significant opportunity cost to doing so.

My personal anecdata is that it's most important dealing with government or government contractors. This is because they have incentives built into their contracts. In the private sector I've found it less of a hurdle.

WGU CS grad here (with 10+ years of experience before I entered the program). If you're in the midwest (and from what I hear, east coast) it's definitely necessary to get through a lot of HR screens.

I see a lot missing, here. No assembly, compilers, linear algebra, or digital design courses that you find in every other CS program.

How did this courseload get accredited? (and I say this as someone who never graduated - just showing some respect for the courseload required by some of my peers, here)

FWIW, I did this program, and assembly / digital design (assuming you mean computer architecture) was pretty well covered in the OS class. They use the Patterson and Hennessy book that many other large universities seem to use. That class was one of the ones that took me the longest, since the only background I had was going through the Nand to Tetris book a few years before.

Definitely no compilers or linear algebra though. Personally, I plan on going through some self-study of those topics at some point or another, but I've had a hard time prioritizing them over what seems like more relevant material to the area I've been working in (full stack web dev).

I was wondering about linear algebra as well. Not sure how you can skip on that.

To my knowledge, certain accreditations are easier to get than others.
