Google's Plan to Disrupt the College Degree Is Absolute Genius (inc.com)
39 points by jvreagan 32 days ago | hide | past | favorite | 80 comments

I question the depth of education you can receive from a six-month course set as opposed to a full four-year undergraduate experience. This feels like the same kind of marketing you saw from Flatiron School or one of those other coding bootcamps, but with a larger name attached.

Cynically, I'd say that Google is just trying to increase the number of qualified individuals to fill grunt-work positions so they can pay each employee less.

While much of the full four-year undergrad experience is full of other priorities (ex. socialization, networking), there is usually a lot of slow learning and almost traumatic adaptation that I'm convinced can't happen on a shorter time frame. Being in a competitive learning environment with colleagues and resources specialized to help learning is hard to replicate. Many of the biggest and most interesting ideas require you to sort of soak your brain in them.

Unrelated to the four year undergrad experience, coding bootcamps and the like that are 3 month and 6 month courses do not work for building deep knowledge for complete beginners -- IIRC their most common usage was for people already in tech to make a change to different specializations. I think the claims that Flatiron School or a bunch of other coding bootcamps sold are basically fraudulent -- it would take over 6 months of high quality practice just to get a handle on HTML, CSS and JavaScript, but they throw random new programmers into HTML, CSS, then JS by way of React (and sometimes even Rails/backend stuff). Deep, stable knowledge is not built that way.

> Cynically, I'd say that Google is just trying to increase the number of qualified individuals to fill grunt-work positions so they can pay each employee less.

This is exactly what they're trying to do. I guess I could think about it as trying to vertically integrate employee education. This is also what Flatiron School and those other coding bootcamps were trying to do IMO, along with capturing some of the funding that was made available for those efforts.

I don't think that's necessarily a bad thing though -- while people in those positions might see a drop in salary (or rather a slowing of growth in salary), more people with those kinds of skills is probably a net positive.

That's a smart way to skimp on training costs and to produce employees that are very dependent on Google and the affiliate corporations. Sure, some people may benefit from this, especially the ones who'd otherwise get indebted with education costs and not have any job prospects once they graduate. But this is not real education, it's some kind of limited training with a certificate. Yes, it seems to me that they want to fill grunt-work positions and they'd get the higher-level workers with degrees from real universities. What's not clear is whether this training is free of cost and what kind of classes they intend to offer.

Maybe enough to get a jr position and to demonstrate dedication? You can learn everything else on the job.

I found my studies were largely useless and a lot of it was just filler, fee justification material.

One of the mandatory elective credits was a course with a title “Role of a car in a North American society”.

There is not much to question here. If you look at the list, the positions listed are not rocket-science type and the basics of each can be squeezed into 6 months. The problem is that the result is most likely a very narrowly skilled individual with not much future, unless of course they keep learning.

>>...individual with not much future unless of course they keep learning

This is now true no matter what level of education. Just need to start somewhere, and this is a great option for those without access to funds or the desire to take on the crippling debt increasingly associated with a 4 year degree.

The current offerings are:

- Data Analyst

- Project Manager

- UX Designer

- IT Support Specialist

I wonder how specific to Google products those courses are. It sounds like Google's answer to Microsoft Certified Microsoft Product Fixer certifications.

It's just courses on Coursera for $49/month.

For the IT Support Specialist, essentially none of it is specific to Google products. The major focus is Windows and Linux. When mobile devices are discussed, both Android and iOS are treated in equal measure. ChromeOS is mentioned but never really used.

A lot of the questions and examples mention competing products like Office365; they don't mention G Suite at all that I remember. The brief cloud section is pretty general and doesn't discuss GCP specifically.

I see it as an abbreviated version of A+, Net+, and Sec+ in one. For someone at a beginner level I think it's a pretty good introduction to IT support.

Oh yes, let my project be managed by someone with a 6 mo course. I'm sure that will work just fine

Six months of onboarding is plenty of time to form someone you'd want to keep around.


Not because of the project management side (which, OK, can be a long and specialized career by itself) but because of the "how do you have someone manage something when they don't know how it works" side (which comes up in smaller projects).

Products ... or internal tooling and processes.

There's a long history of dominant monopolies structuring pedagogy and business skills training to their own internal ops, dating to DuPont, GM, AT&T, IBM, Xerox, Microsoft, Cisco, etc., etc., etc.

I don't trust a company like Google to act in its students' and certificate holders' best interests. It's not hard to imagine that the certifications will all be tied back to a Google account that they can terminate at any time, at their discretion. On a more Google-y note, I wouldn't want to get an email in five years telling me that they're sunsetting their online certificate verification portal, meaning no one can actually trust my claims as a certificate holder.

This is only a problem for a specific implementation. You can skip the online verifier for example by handing out a certificate-signed PDF. Until we see the actual implementation or plan, there's no need to criticise the potential ones.

Thought exercise: how do employers verify certificates currently, and how would they do it if your college shut down? In my experience, it's "they don't" and "they couldn't". It's both a bit more and less tricky with jobs that require recognised certification - there will be likely a nationally issued certification which matches your education one.

I agree, my speculation is just that: speculation. I sincerely hope they can come up with a generic, decentralized way to verify professional credentials, as that could bolster certification's viability as a career path.

Elsewhere in tech, Cisco's certifications come with a sort of verification code which you can punch in to their site to check (in addition to the paper trail from Pearson and the physical certificate they mail you). If it weren't such a long-running part of their business, I'd be equally skeptical about them maintaining the certification infrastructure, as it's wildly tangential to the business of building networking equipment. As to the question of what a potential employer would do if Cisco (or Google, or a college, or a bootcamp) did shut down their verification systems? I couldn't say; I've never hired anyone, nor have I had HR breathing down my neck to check off position requirements.

> "there's no need to criticise the potential ones."

Why not? There might be people on these forums who could take note.

The certificate is via Coursera, which is entirely separate and only has a 'sign in with Google' option. You don't need a Google account to do it, get your certificate, or link it to your LinkedIn.

I think the point of these certs is to get a job at Google which lessens some of these concerns.

Is there such a term as "edmill"? A process of mass producing cheap labor for low-paid positions. This post just smells of hype and marketing for something like that.

I think back in the Soviet-influenced days of my country, people went to college knowing which job they'd be assigned at graduation. I guess these are the modern-day PTUs? [1]

[1] https://en.wikipedia.org/wiki/Professional_technical_school

Soviet bloc country here as well. However, professional training schools are not a communist invention and have their place in educating people for useful jobs. I, for one, don't know how to build a high-quality bathroom, and I don't think that people who can should be valued less than someone who can make a terrible Electron app. My UX in the former is much better tbh. :)

My issue is that this stuff now pretends to substitute for colleges which is totally different category. Six months of youtube videos are just professional training for office workers of the disposable type, not the creative/problem-solving variety that needs broad understanding of the field.

I agree with you about PTUs having their place in planned and free market economies alike. I think you’re saying that if people have the option to go to a good “traditional” school and learn CS there (and meet a bunch of students and professors, etc), they shouldn’t throw away that option for something like this, is that right?

On the other hand, I feel like there are plenty of people who have either graduated with a humanities degree or simply do not have the ability to go to a good technical school: I would hope that this provides a new on-ramp for people like that, and it actually adds value to people’s lives. Even though as most people here have stipulated, this seems commercially motivated by Google’s interests.

My issue with those courses is that they teach you the "hows" of a problem, but not the "whys". Completing them, you know how to perform a certain task, maybe even according to best practices, but you don't know why those practices are good, when they are a bad choice, or what the alternatives are. Once the assumptions of the course break, it becomes useless. Without a deeper understanding of the field, moving to a newer technology, framework, or even GUI interface might turn out to be a struggle.

Yes, such courses might be useful to fill in gaps in one's education, but they can't substitute for said education. Low-quality tech graduates might profit, but humanities graduates won't become decent developers, at least not without continuous training for years, which might take as long as a second degree.

Short answer: Yes, highly-focused skills-centric but narrow education has a long history.

Longer answer:

It's useful to keep in mind the (usually) unstated goals of educational systems:

- Produce a technically-skilled, but politically pliable, working class.

- Produce a managerially competent, but not revolutionary, management and professional class.

- Persist existing power structures, whatever their form; political, cultural, corporate, religious, technical, epistemic.

The fundamental division in education has long been between liberal education and technical education, and can be traced to the emergence of the modern university in the 11th century (Bologna, 1088; Oxford, 1096), if not to the Romans and Greeks distinguishing the artes liberales from the artes mechanicae, the latter also called the "servile" or "mechanical" arts. This later expands to basic literacy skills ("'readin', 'ritin', and 'rithmetic", the "three Rs"), and to basic skills vs. higher-order thinking skills, on which there is much long-standing debate and contention.

Expansion and reform were limited by numerous factors, including a monopoly by law in England establishing Cambridge and Oxford as England's only permitted universities until 1827. (https://www.historytoday.com/miscellanies/medieval-universit...)

In the industrial era, Prussian and Humboldtian education reforms instituted universal compulsory scientific and technical (rather than religious) education, largely at state expense, from Kindergarten, and including a university system for advanced education. With increasing demand for basically literate workers under factory and clerical work as well as technically-skilled workers in heavy industry, chemical, agricultural, transport, communications, information, government, and military sectors, the basic outlines of this system were widely adopted in industrialised countries through the 19th and 20th centuries.

The first technical, polytechnic, and engineering universities emerged in the 19th century. M.I.T. as a leading exemplar, though not the first, was founded in 1861. It was preceded by others, with Rensselaer Polytechnic Institute (RPI) possibly the earliest in 1821. Notably, technical schools were among the first to offer specifically-focused courses of study, persisting to this day in the numbered M.I.T. catalogue, where lower-numbered offerings are generally more fundamental and earliest-established, modulo some subsequent subdivision.

Major expansions occurred through and following major wars, including the US Civil War (founding of M.I.T.), and the first and second World Wars, as well as the post-war / Cold War era, notably Vannevar Bush's "Science, The Endless Frontier" (https://nsf.gov/od/lpa/nsf50/vbush1945.htm).

The Academic Major system began emerging in the early 19th century, though it would not really reach recognisable form (or be named) until after 1875. It replaced a general liberal education university system without formal major emphasis.

The post-1960 public research university is exemplified through projects such as the California Master Plan for Higher Education, 1960, strongly driven by governor Pat Brown and University of California president Clark Kerr. That effort was itself a reaction to an earlier technological monopoly, that of the railroads. Similar expansions occurred elsewhere, see the Robbins Report (1963) for the UK, or a set of Chinese initiatives since the 1990s: the Double First Class University Plan, Project 211, Project 985, or the C9 League.

In the US (and strongly similarly in much of the industrialised world), there is a de facto if not explicit hierarchy: prestigious, highly selective top-tier universities (largely private, though with some public institutions), then other highly selective schools (many state university systems). These are followed by less selective institutions (many formerly state colleges and "normal schools", i.e. teachers' colleges), numerous smaller private schools, and some polytechnics and ag & tech schools. Community colleges ("junior colleges") may feed 4-year programmes or train the workforce directly, and are generally not selective (all applicants are accepted). Public and private vocational schools, as well as company-specific credentialing programmes (CCIE, RHCE, MCSA, OCP, Java SE, etc.), provide a range of skills training and certification: some basic, some advanced technical, some continuing professional education.

The various roles of education in teaching basic skills, higher skills, and cultural indoctrination were commented on by John Stuart Mill in ~1860s Britain, as noted by Hans Jensen, and were subject to various forms of control and coercion, largely via funding or lack thereof:

First, the universities were given the task of providing an unceasing supply of ideologically correct candidates for vital positions in government, church, and business. The state was able to make the faculties of the "venerable institutions" of higher education, or rather indoctrination, assume this duty because it controlled appointments and held the purse from which "emoluments" flowed into the coffers of academics....

The state devised a second educational strategy in order to prevent such a calamity from occurring. According to Mill, the "elementary schools for children of the working classes" were given the task of ensuring that the poor would continue to accept docilely their dismal station in life. It was very easy for the state to force the public schools to assume this role. It did so simply by failing malignantly to allocate sufficient funds for the operations of what Mill identified contemptuously as "places called schools"...

Hans E. Jensen, "John Stuart Mill's Theories of Wealth and Income Distribution". Review of Social Economy. Pages 491-507. Published online: 05 Nov 2010. (http://www.tandfonline.com/doi/abs/10.1080/00346760110081599)

(A more complete cite and discussion here: https://old.reddit.com/r/dredmorbius/comments/6x7u6a/on_the_...)

Huh, I've struck a nerve there. :)

A question is where we should draw the line between education never being able to teach everything one needs and education being only a way to utilize people. I would put it where a skill lets you acquire additional knowledge and capabilities, such as reading, math, logic, and the like. In that sense, MIT and teaching someone how to use Excel for project reports go in different categories.

This is an opinion only though. The debate is rather large and it has many nuances.

Yeah, I've been kicking these ideas around for a while.

There are a few directions we could take this. Some freaks (myself) find them all fascinating.

There's the development of modern business comms and procedures, notably at railroads and DuPont Chemical. Joanne Yates has written a history. These became codified in business practices training.

Standardisation itself has been a tremendous advance, much of it led by a Republican, Herbert Hoover, as Commerce Secretary.

The establishment of common practices, methods, and skills is itself a powerful asset for companies. Armies of workers skilled in typing, filing, programming languages, operating systems, productivity software, and more, make the underlying companies' products and services more valuable.

The classification of skills on a hierarchy is its own mess --- from basic to complex. Breaking apart the Seven Liberal Arts into their sub-groupings of the trivium (grammar, logic, rhetoric --- think of these as input, processing, and output), and quadrivium (maths, geometry, music, and astronomy --- quantity, quantity in space, quantity in time, and quantity in space and time), reveals some of this. I've been considering similar fundamental divisions of technical mechanisms into fundamental dynamics or elements.

Or the classical professions: medicine, law, theology, business. Later engineering and technology in its own right.

There's the durability or ephemerality of knowledge, skills, and equipment. I've been in tech long enough to have some sense of what does and does not endure, possibly even why. Future Shock spoke to this 50 years ago.

Then there is the nature, history, and function of education, as institution, as service, as profession, its roles in society, culture, business, industry, politics, and military. The impacts of the wars of the 19th & 20th centuries really cannot be overstated. There are many tensions, and many covert or latent functions (a wonderful concept from sociologist Robert K. Merton).

I would argue the first polytechnic is the École Polytechnique (1794) in France. It served as a model for institutions in Germany and especially Canada (Polytechnique Montreal, 1873).

The French one, however, was founded as a military school but the general course structure and focus on engineering transcended time.

A comprehensive implementation of something akin to the German vocational training system is what the US needs.

It has been discussed across the political spectrum. The US needs to finally take concerted action toward this, with big investment behind it.

It was widely floated as an approach a few years ago by Benioff (and countless others), and the government made some basic motions in that direction.


It's not nearly enough. The government has to go a lot bigger to make a dent in a job market the size of the US.

The author seems to exist in a permanent state of enthusiasm, for mega corps and their corpy things.


Just an observation.

Well, just about any move made by FAANG will raise waves of enthusiasm and loud "genius!!!" shouting... regardless of what the move was about.

I clicked your link to disagree with you but the evidence is on your side.

College in the USA is an economic plague on young people. This most certainly can't be as bad.

Except that’s completely untrue. College degrees are well worth the investment — the earning potential increases of having a college degree vastly dwarf tuition costs. When people are young, debt is a problem, but those same people continue to see (exponential) income growth over the course of their careers and eventually beat the debt, and then some. Folks without degrees struggle with income growth.

Some college degrees are worth the investment. The productization of college degrees being offered by second-rate institutions has created a race to the bottom. I see many young people graduating with little in the way of employable skills, with (usually non-STEM) degrees that are merely a piece of paper they paid tens or maybe even hundreds of thousands of dollars for. I was one of those people. It took me years of grinding away at entry-level jobs and self-learning in my spare time to create income growth. When I look back, I wonder why I even went to school at all. All the skills I learned were on the job or through knowledge I knew I needed in order to get to the next level in my career. Could I have made better choices with my degree? Sure. If I could do it over, I would. The common narrative is what you have said above: any Bachelor's degree == good. I think the reality is much different for many of us.

This would be a more persuasive argument if graduates weren't faced with an entire horizon of corporations that aren't very interested in candidates w/o job experience.

This is just... not true? At least not among anyone I’ve ever known who didn’t go to school for CS, law, or med school.

Are you basing this on your experience or the current market realities?

The average student loan debt for a new grad is about $33k, and the median is about $17k. $33k is a little bit less than the average price of a new car sold in the United States.

It sucks to owe money after college, but it is by no means a plague. It's a burden that many adults take on willingly each year just for a new car -- surely 4 years of education is worth as much money as a new Toyota Camry with leather seats.

On top of that, there are a lot of studies that show that college grads have better health outcomes throughout their lives.

The author doesn't seem to understand how supply and demand work. Unless there's about to be a massive increase in the availability of these jobs, this is going to create a bubble that will reduce these salaries and lower the skill level in these positions.

It doesn't push down salaries, but it does lower the signal-to-noise ratio in hiring and therefore raises the cost of talent acquisition.

An influx of bootcamps in my area caused something similar with software engineers. We didn't want to automatically exclude people from these institutions, but we ended up having to.

If you had not excluded them do you believe it would have pushed down salaries?

I took the IT Support Specialist course in early 2019 and it only took one week of dedicated video watching and test-taking (I rushed it so I wouldn't need to keep paying the Coursera membership fee, and because I already had a strong background in IT support/troubleshooting, software development, and cybersecurity, and was intermediate in networking). I feel like the course is solid at explaining the concepts it covers, but I can't attest to how well it works for people with very little experience in IT.

You solidified your knowledge with your prior experience. Without practice, one may get good scores on tests, but you can't magically skip the practice. Learning takes time.

What I've been wondering is, if these kinds of programs succeed, how do we ensure a smooth transition to the new career paths at the high school level? I worry that we're going to enter (or maybe have already entered!) a long period of bifurcation, where most schools continue to teach kids they'd better go to college even as college increasingly sets them up for failure.

Race to the bottom.

Great idea as the article notes, subpar execution.

I took a similar format course offered by IBM in data science on Coursera, and my goodness, I found a lot of pure junk work in the peer review process. Either that, or it was straight up copy and pasted from somewhere else online.

Online classes can definitely work, but they need more iterations to get there.

I see a lot of negativity in this thread. What’s wrong with Google giving a bunch of people a taste and the opportunity to grow?

I prefer to believe in the ability of people to reach beyond their station in life. Sometimes all they need is inspiration. If Google can provide that it’s great.

The concern is whether Google actually is giving them an opportunity to grow. As was mentioned downthread, most existing training certificate programs offer no real growth and exist only to create a stagnant job pool in service of the commercial interests of the company issuing them. I'm optimistic that this program will be different, but I understand why someone who distrusts Google might be concerned.

Strictly speaking, neither Google nor any other single entity is responsible for giving anyone any opportunity of any kind; to work, learn, grow, etc. That they do is very cool.

From my experience, the best people at anything (it doesn't have to be coding, anything in life) are those who caught a glimpse of something that resonated with them and, from that point forward, could not let it go.

In other words, from my perspective, the opportunity to grow is created by the person, not by external actors.

Here's a simple example: how often do the children of very rich people end up being complete losers? They don't lack resources at all. They don't lack opportunity at all. What they lack is hunger, exposure, discipline, struggle, and that something which creates the spark that inspires so many to excel.

This is why I said that what Google is doing is great. You don't need to put people through a full CS curriculum for them to be useful and, more importantly, to change their lives. What they do with the experience and insight after that is up to them.

I don't get the negativity either. Someone with experience could take it essentially for free if they have the time, and form their opinion that way.

They even give people a $150.00 cash card if they commit to completing the course in X number of months.

Overall, as I posted elsewhere I think the IT one is a pretty good intro to the subject, definitely best for people who have some skills and passion for the topic tho.

Half the classes they make you take at universities are worthless anyways. They're just there to keep you 4 years and get more money. A CS major doesn't need to learn about Greek Mythology.

Universities are not intended to be vocational schools or trade schools. That's why electives are required to graduate. That's why CS programs teach theory more than practice.

It's always worth being a well-rounded person. The most successful software engineers I know are not the most technically proficient. They are the ones who understand how their customers and their employees and their managers view the world, and they can use that understanding to prioritize the engineering work that will have the most impact. They are good at explaining what they are doing in terms that non-engineers understand, they can take requirements from non-engineers and turn them into realistic engineering requirements, and they can talk to corporate executives without seeming like Comic Book Guy from the Simpsons. These are all good skills to have, and none of them are taught in CS class.

Ok but in reality you need a degree to get a decent job so they are de facto trade schools whether you realize it or not. Your assumption that electives somehow make someone more 'rounded' is also laughable. That comes from being curious.

Agreed, but how much debt should one take on to learn these other skills? (Scott Galloway has been talking about this a lot lately - university degree cost inflation, which has skyrocketed without a commensurate increase in the quality of the product/service.)

There are other ways to pick up these skills, which are much more cost effective.

That argument also applies to CS skills. There are lots of successful self-taught programmers out there.

Boot camps are the software equivalent of vocational/trade schools and they are looked down on for a reason.

I assume your comment is made from an US perspective. Other countries manage to give students education without burying them in debt... maybe there's a lesson, somewhere...

It’s hard for me to express how deeply I disagree with this philosophy. I started out in the hard sciences with a MS in biochemistry, so I certainly don’t use any of the direct “skills” I learned in my major-specific courses in my current career as a programmer. Micropipetting, mass spectrometry, and gel electrophoresis have yet to come up in writing software (at least in the jobs I’ve had).

What is continuously useful are three main categories of skill: general critical thinking, analytical writing, and empathy. The first is helpful when solving any problem at all, the second when proposing solutions to new problems, and the third when understanding what problems need to be solved, both at a code level (what does the user need, what documentation do my coworkers need) and at an interpersonal level (how should I focus pairing sessions with this colleague, what projects and teams would they function well on, how can we help to ensure their professional growth).

I have to say that, on average, the humanities courses I took did a much better job of teaching these skills than the science courses.

Of course, even outside of immediate practical benefit, I learned a lot of really interesting stuff in those classes! A lot of them helped to shape the way I still see the world today.

Sure, we can try to pare down “higher” education to what has historically been called vocational school (in the US) or similar, but I think we’re potentially losing a lot along the way.

I have to disagree with that. I came into college as an engineering student with a strong interest in humanities. I really wanted to get a lot out of some sociology, philosophy, and psychology classes, and signed up for more than required. But the reality of it was, they were complete wastes of time and ended up turning me off of humanities.

For me personally, I'd have been much better off just following some curricula and reading the material on my own, not fussing with memorizing arbitrary details for a test and working on team reports with other disinterested students.

I think there's something to the apprenticeship thing, and would love to see this succeed. Even if it's not for everyone, I think it's good for lots of people. Six months seems a little short, for sure, but maybe 18 would be enough to learn the main algorithms, learn some project management, and push to be a productive team member. And I think working on "real work" is much more motivating than classwork for the average 18 year old, so that's going for it too.

> general critical thinking, analytical writing, and empathy.

I wonder if there is a more direct way to acquire those skills.

The goal of the university is to give you a wide angle of view and, most importantly, to teach you how to educate yourself. I studied physics at a Soviet university ages ago, for free, and consider it the best thing ever. I later moved to Canada, am doing just fine, and can never thank my tutors enough.

...for free. Would you have paid $60K to $100K plus for the same? That is what students are facing now.

Very few would disagree with the intangible benefits of a university education. The question is, how much is it worth, and how much debt should you take on to obtain those benefits?

This is a coulda, shoulda, woulda type of question. I was raised in the former Soviet Union and as a consequence was able to get a great education without having to pay for it directly.

What it would be for me if I was born in the US - no slightest idea. I could've been rich or poor. It is useless to speculate.

> Would you have paid $60K - to $100K plus for the same?

Average new grad loan debt in the US is about $33k and the median is about $17k. It's not that bad.

97 Things Every Programmer Should Know by Kevlin Henney: Chapter 71. Read the Humanities


They're only worthless if you're treating college as a vocational school instead of a chance to broaden your horizons.

There are a lot of other ways to broaden one's horizons that don't cost tens of thousands of dollars. Or that do, but broaden the horizons in other ways.

This seems like more of a problem with our system of financing college than with the idea of college itself. Plenty of places manage to make college education more affordable than in the US.

They are worthless from a cost-benefit analysis. You could join a social club for free and broaden your horizons in ways that would be vastly better for your career.

Half is an exaggeration.

But I agree that it's not obvious what the direct, immediate benefit of a Mythology class is, versus Medieval History or Classical Literature. Still, I do believe they all share a long-term indirect benefit.

The fundamental skills I expect a college grad to have are the ability to learn new concepts fast, and to ingest and process a lot of (often unstructured) information and synthesize it clearly. That's something you'll get from studying unrelated subjects (at least, at a decent university). Often these courses force you to read a lot of material and write about it.

That's the practical goal of humanities.

To be fair, I'd probably hire the programmer with the knowledge of Greek mythology over the one without. Personal preferences aside, great programmers are very rarely one-dimensional.

That may very well be true, but why does one have to spend tens of thousands of dollars to learn it from a university when they are there to get a degree in computer science?

That sounds like a strawman. Does any CS major require Greek mythology?

I would have preferred Greek mythology to art history, which I took to meet a requirement of my CS major.

I don’t know how it is today but when I went, you had to take a certain number of credits in humanities, social sciences, etc, no matter what your major was. So while they didn’t specifically mandate “Greek Mythology,” they mandated a certain number of credits from courses in that bucket, which stretched the time commitment for the degree at least a year from what it would have been if I focused on just my major requirements.

"CS requires some humanities" and "CS requires mythology" are pretty different claims. And ideally CS would include some relevant topics for understanding how tech can inadvertently mess up people's lives.

> ideally CS would include some relevant topics for understanding how tech can inadvertently mess up people's lives

I saw this elsewhere on HN and thought it was worth bringing to this thread. https://www.zdnet.com/article/a-young-stanford-graduate-trie...

Yay more gate-keeping!

A new chapter in the SV Neofeudalism saga. More dependency on large corporations, on yet another aspect of life.
