Cynically, I'd say that Google is just trying to increase the number of qualified individuals to fill grunt-work positions so they can pay each employee less.
> Cynically, I'd say that Google is just trying to increase the number of qualified individuals to fill grunt-work positions so they can pay each employee less.
This is exactly what they're trying to do. I could think of it as an attempt to vertically integrate employee education. This is also what Flatiron School and the other coding bootcamps were trying to do, IMO -- that, and capturing some of the funding that was made available for those efforts.
I don't think that's necessarily a bad thing though -- while people in those positions might see a drop in salary (or rather a slowing of salary growth), more people with those kinds of skills is probably a net positive.
I found my studies largely useless; a lot of it was just filler -- fee-justification material.
One of the mandatory elective credits was a course titled “Role of the Car in North American Society”.
This is now true at every level of education. You just need to start somewhere, and this is a great option for those without access to funds, or without the desire to take on the crippling debt increasingly associated with a 4-year degree.
- Data Analyst
- Project Manager
- UX Designer
- IT Support Specialist
I wonder how specific to Google products those courses are. It sounds like Google's answer to Microsoft Certified Microsoft Product Fixer certifications.
It's just courses on Coursera for $49/month.
A lot of the questions and examples mention competing products like Office 365; they don't mention G Suite at all, as far as I remember. The brief cloud section is pretty general and doesn't discuss GCP specifically.
I see it as an abbreviated version of A+, Net+, and Sec+ in one. For someone at a beginner level I think it's a pretty good introduction to IT support.
Not because of the project management side (which, OK, can be a long and specialized career by itself) but because of the "how do you have someone manage something when they don't know how it works" side (which is needed in smaller projects).
There's a long history of dominant monopolies structuring pedagogy and business-skills training around their own internal ops, dating back to DuPont, GM, AT&T, IBM, Xerox, Microsoft, Cisco, etc., etc., etc.
Thought exercise: how do employers verify certificates currently, and how would they do it if your college shut down? In my experience, it's "they don't" and "they couldn't". It's both a bit more and a bit less tricky with jobs that require recognised certification -- there will likely be a nationally issued certification which matches your educational one.
Elsewhere in tech, Cisco's certifications come with a sort of verification code which you can punch in on their site to check (in addition to the paper trail from Pearson and the physical certificate they mail you). If it weren't such a long-running part of their business, I'd be equally skeptical about them maintaining the certification infrastructure, as it's wildly tangential to the business of building networking equipment. As to the question of what a potential employer would do if Cisco (or Google, or a college, or a bootcamp) did shut down its verification systems? I couldn't say; I've never hired anyone, nor have I had HR breathing down my neck to check off position requirements.
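The lookup itself is trivial to automate if the issuer exposes one. Here's a minimal sketch in Python -- note that the host, query parameter, and response fields are hypothetical stand-ins for illustration, not Cisco's (or anyone's) actual verification API:

```python
import json
import urllib.parse
import urllib.request

def verify_certificate(code: str) -> dict:
    """Look up a certification by its verification code.

    The host and response schema are invented for illustration;
    substitute whatever lookup service the issuer actually runs.
    """
    query = urllib.parse.urlencode({"code": code})
    url = f"https://certcheck.example.com/api/verify?{query}"
    with urllib.request.urlopen(url, timeout=10) as resp:
        return json.load(resp)

# Hypothetical usage: expect fields like holder, credential, status.
result = verify_certificate("ABC123XYZ")
print(result.get("holder"), result.get("credential"), result.get("status"))
```

Of course, the whole scheme depends on that host staying up, which is exactly the fragility in question.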
Why not? There might be people on these forums who could take note.
My issue is that this stuff now pretends to substitute for college, which is a totally different category. Six months of YouTube videos is just professional training for office workers of the disposable type, not the creative/problem-solving variety that needs a broad understanding of the field.
On the other hand, I feel like there are plenty of people who have either graduated with a humanities degree or simply do not have the ability to go to a good technical school. I would hope that this provides a new on-ramp for people like that, and that it actually adds value to people’s lives -- even though, as most people here have stipulated, it seems commercially motivated by Google’s interests.
Yes, such courses might be useful to fill in gaps in one's education, but they can't substitute for said education. Low-quality tech graduates might profit, but humanities graduates won't become decent developers, at least not without continuous training for years -- which might take as long as a second degree.
It's useful to keep in mind the (usually) unstated goals of educational systems:
- Produce a technically-skilled, but politically pliable, working class.
- Produce a managerially competent, but not revolutionary, management and professional class.
- Persist existing power structures, whatever their form; political, cultural, corporate, religious, technical, epistemic.
The fundamental division in education has long been between liberal education and technical education, and can be traced to the emergence of the modern university in the 11th century (Bologna, 1088; Oxford, 1096), if not to the Romans and Greeks distinguishing the artes liberales and artes mechanicae, the latter also called the "servile" or "mechanical" arts. This later expands to basic literacy skills ("'readin', 'ritin', and 'rithmetic", the "three Rs"), and to basic skills vs. higher-order thinking skills, on which there is much long-standing debate and contention.
Expansion and reform were limited by numerous factors, including a legal monopoly establishing Cambridge and Oxford as England's only permitted universities until 1827. (https://www.historytoday.com/miscellanies/medieval-universit...)
In the industrial era, Prussian and Humboldtian education reforms instituted universal compulsory scientific and technical (rather than religious) education, largely at state expense, from Kindergarten, and including a university system for advanced education. With increasing demand for basically literate workers under factory and clerical work as well as technically-skilled workers in heavy industry, chemical, agricultural, transport, communications, information, government, and military sectors, the basic outlines of this system were widely adopted in industrialised countries through the 19th and 20th centuries.
The first technical, polytechnic, and engineering universities emerged in the 19th century. M.I.T., a leading exemplar though not the first, was founded in 1861; it was preceded by others, with Rensselaer Polytechnic Institute (RPI) possibly the earliest, in 1821. Notably, technical schools were among the first to offer specifically-focused courses of study, persisting to this day in the numbered M.I.T. catalogue, where lower-numbered offerings are generally more fundamental and earliest-established, modulo some subsequent subdivision.
Major expansions occurred through and following major wars, including the US Civil War (the founding of M.I.T.), the first and second World Wars, and the post-war / Cold War era, notably Vannevar Bush's "Science, The Endless Frontier" (https://nsf.gov/od/lpa/nsf50/vbush1945.htm).
The Academic Major system began emerging in the early 19th century, though it would not really reach recognisable form (or be named) until after 1875. It replaced a general liberal education university system without formal major emphasis.
The post-1960 public research university is exemplified through projects such as the California Master Plan for Higher Education, 1960, strongly driven by governor Pat Brown and University of California president Clark Kerr. That effort was itself a reaction to an earlier technological monopoly, that of the railroads. Similar expansions occurred elsewhere, see the Robbins Report (1963) for the UK, or a set of Chinese initiatives since the 1990s: the Double First Class University Plan, Project 211, Project 985, or the C9 League.
In the US (and strongly similarly in much of the industrialised world), there is a de facto if not explicit hierarchy: prestigious, highly-selective, top-tier universities (largely private, though with some public institutions), then other highly selective schools (many state university systems). These are followed by less selective institutions (many formerly state colleges or "normal schools", that is, teachers' colleges), numerous smaller private schools, and some polytechnics and ag & tech schools. Community colleges ("junior colleges") may feed 4-year programmes or train the workforce directly, and are generally not selective (all applicants are accepted). Public and private vocational schools, as well as company-specific credentialing programmes (CCIE, RHCE, MCSA, OCP, Java SE, etc.), provide a range of skills training and certification: some basic, some advanced technical, some continuing professional education.
The various roles of education --- teaching basic skills, teaching higher skills, and cultural indoctrination --- were commented on by John Stuart Mill in ~1860s Britain, as noted by Hans Jensen; education was subject to various forms of control and coercion, largely via funding or the lack thereof:
> First, the universities were given the task of providing an unceasing supply of ideologically correct candidates for vital positions in government, church, and business. The state was able to make the faculties of the "venerable institutions" of higher education, or rather indoctrination, assume this duty because it controlled appointments and held the purse from which "emoluments" flowed into the coffers of academics....

> The state devised a second educational strategy in order to prevent such a calamity from occurring. According to Mill, the "elementary schools for children of the working classes" were given the task of ensuring that the poor would continue to accept docilely their dismal station in life. It was very easy for the state to force the public schools to assume this role. It did so simply by failing malignantly to allocate sufficient funds for the operations of what Mill identified contemptuously as "places called schools"...
Hans E. Jensen, "John Stuart Mill's Theories of Wealth and Income Distribution", Review of Social Economy, pp. 491-507, published online 05 Nov 2010. (http://www.tandfonline.com/doi/abs/10.1080/00346760110081599)
(A more complete cite and discussion here: https://old.reddit.com/r/dredmorbius/comments/6x7u6a/on_the_...)
A list of Wikipedia articles addressing much of this history and development:
A question is where we should draw the line between education never being able to teach everything one needs, and education being only a way to utilize people. I would put it where a skill lets you acquire additional knowledge and capabilities, such as reading, math, logic, and the like. In that sense, MIT and teaching someone how to use Excel for project reports go in different categories.
This is only an opinion, though. The debate is rather large and has many nuances.
There are a few directions we could take this. Some freaks (myself included) find them all fascinating.
There's the development of modern business communications and procedures, notably at the railroads and DuPont; JoAnne Yates has written a history. These became codified in business-practices training.
Standardisation itself has been a tremendous advance, much of it led by a Republican, Herbert Hoover, as Commerce Secretary.
The establishment of common practices, methods, and skills is itself a powerful asset for companies. Armies of workers skilled in typing, filing, programming languages, operating systems, productivity software, and more, make the underlying companies' products and services more valuable.
The classification of skills on a hierarchy is its own mess --- from basic to complex. Breaking apart the Seven Liberal Arts into their sub-groupings of the trivium (grammar, logic, rhetoric --- think of these as input, processing, and output) and quadrivium (maths, geometry, music, and astronomy --- quantity, quantity in space, quantity in time, and quantity in space and time) reveals some of this. I've been considering similar fundamental divisions of technical mechanisms into fundamental dynamics or elements.
Or the classical professions: medicine, law, theology, business. Later, engineering and technology in their own right.
There's the durability or ephemerality of knowledge, skills, and equipment. I've been in tech long enough to have some sense of what does and does not endure, and possibly even why. Future Shock spoke to this 50 years ago.
Then there is the nature, history, and function of education --- as institution, as service, as profession --- and its roles in society, culture, business, industry, politics, and the military. The impacts of the wars of the 19th & 20th centuries really cannot be overstated. There are many tensions, and many covert or latent functions (a wonderful concept from sociologist Robert K. Merton).
The French one, however, was founded as a military school, but the general course structure and focus on engineering transcended its time.
It has been discussed across the political spectrum. The US needs to finally take concerted action toward this, with big investment behind it.
It was widely floated as an approach a few years ago by Benioff (and countless others), and the government made some basic motions in that direction.
It's not nearly enough. The government has to go a lot bigger to make a dent in a job market the size of the US.
Just an observation.
It sucks to owe money after college, but it is by no means a plague. It's a burden that many adults take on willingly each year just for a new car -- surely 4 years of education is worth as much money as a new Toyota Camry with leather seats.
On top of that, there are a lot of studies that show that college grads have better health outcomes throughout their lives.
An influx of bootcamps in my area caused something similar with software engineers. We didn't want to automatically exclude people from these institutions, but we ended up having to.
I took a similar format course offered by IBM in data science on Coursera, and my goodness, I found a lot of pure junk work in the peer review process. Either that, or it was straight up copy and pasted from somewhere else online.
Online classes can definitely work, but they need more iterations to get it right.
I prefer to believe in the ability of people to reach beyond their station in life. Sometimes all they need is inspiration. If Google can provide that it’s great.
From my experience, the best people at anything (it doesn't have to be coding, anything in life) are those who caught a glimpse of something that resonated with them and, from that point forward, could not let it go.
In other words, from my perspective, the opportunity to grow is created by the person, not by external actors.
Here's a simple example: how often do the children of very rich people end up being complete losers? They don't lack resources at all. They don't lack opportunity at all. What they lack is hunger, exposure, discipline, struggle -- that something which creates the spark that inspires so many to excel.
This is why I said that what Google is doing is great. You don't need to put people through a full CS curriculum for them to be useful and, more importantly, to change their lives. What they do with the experience and insight after that is up to them.
They even give people a $150.00 cash card if they commit to completing the course in X number of months.
Overall, as I posted elsewhere, I think the IT one is a pretty good intro to the subject -- though it's definitely best for people who have some skills and passion for the topic.
It's always worth being a well-rounded person. The most successful software engineers I know are not the most technically proficient. They are the ones who understand how their customers and their employees and their managers view the world, and they can use that understanding to prioritize the engineering work that will have the most impact. They are good at explaining what they are doing in terms that non-engineers understand, they can take requirements from non-engineers and turn them into realistic engineering requirements, and they can talk to corporate executives without seeming like Comic Book Guy from the Simpsons. These are all good skills to have, and none of them are taught in CS class.
There are other ways to pick up these skills, which are much more cost effective.
Boot camps are the software equivalent of vocational/trade schools and they are looked down on for a reason.
What is continuously useful are three main categories of skill: general critical thinking, analytical writing, and empathy. The first is helpful when solving any problem at all, the second when proposing solutions to new problems, and the third when understanding what problems need to be solved, both at a code level (what does the user need, what documentation do my coworkers need) and at an interpersonal level (how should I focus pairing sessions with this colleague, what projects and teams would they function well on, how can we help to ensure their professional growth).
I have to say that, on average, the humanities courses I took did a much better job of teaching these skills than the science courses.
Of course, even outside of immediate practical benefit, I learned a lot of really interesting stuff in those classes! A lot of them helped to shape the way I still see the world today.
Sure, we can try to pare down “higher” education to what has historically been called vocational school (in the US) or similar, but I think we’re potentially losing a lot along the way.
For me personally, I'd have been much better off just following some curricula and reading the material on my own, not fussing with memorizing arbitrary details for a test and working on team reports with other uninterested students.
I think there's something to the apprenticeship thing, and would love to see this succeed. Even if it's not for everyone, I think it's good for lots of people. Six months seems a little short, for sure, but maybe 18 would be enough to learn the main algorithms, learn some project management, and push to be a productive team member. And I think working on "real work" is much more motivating than classwork for the average 18 year old, so that's going for it too.
I wonder if there is a more direct way to acquire those skills.
Very few would disagree with the intangible benefits of a university education. The question is: how much is it worth, and how much debt should you take on to obtain those benefits?
What it would have been for me had I been born in the US -- not the slightest idea. I could've been rich or poor. It is useless to speculate.
Average new grad loan debt in the US is about $33k and the median is about $17k. It's not that bad.
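Back-of-the-envelope, here's what that looks like per month -- a rough sketch assuming a standard 10-year repayment term at 5% APR (both numbers are assumptions; actual rates and terms vary):

```python
def monthly_payment(principal: float, annual_rate: float, years: int) -> float:
    """Standard amortized-loan payment formula."""
    r = annual_rate / 12   # monthly interest rate
    n = years * 12         # total number of payments
    return principal * r / (1 - (1 + r) ** -n)

# Assumed terms: 10-year repayment at 5% APR (illustrative only).
for debt in (33_000, 17_000):
    print(f"${debt:,} -> ${monthly_payment(debt, 0.05, 10):,.2f}/month")
```

That works out to roughly $350/month on the average and $180/month on the median -- in the same ballpark as a car payment.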
But I agree that it's not obvious what the direct, immediate benefit of a Mythology class is, versus Medieval History or Classical Literature. I do believe they all share a long-term indirect benefit, though.
The fundamental skills I expect a college grad to have are the ability to learn new concepts fast, and the ability to ingest and process a lot of (often unstructured) information and synthesize it clearly. That's something you'll get from studying unrelated subjects (at least at a decent university). Often these courses force you to read a lot of material and write about it.
That's the practical goal of the humanities.
I saw this elsewhere on HN and thought it was worth bringing to this thread. https://www.zdnet.com/article/a-young-stanford-graduate-trie...