University of the People: Tuition-Free, Accredited Online Degree Programs (uopeople.edu)
725 points by tomato2juice 8 months ago | 464 comments



Hi. I attended this school and then got accepted to another, more standard institution, which I had applied to at the same time. The new institution, on learning I had been attending U of the P, promptly told me that U of the P credits would not transfer. With universal transferability in question, I opted out.

I did enjoy meeting people from all walks of life, all over the world. However, I also saw a grossly wide range of educational professionalism in the students. In the mandatory introductory writing course, for example, there were a number of classmates who could not grasp the idea of plagiarism being unethical. With plagiarism assignments graded by those same peers, it was difficult to feel that one's own higher education was moving along at a progressively more intellectually challenging pace.


> could not grasp the idea of plagiarism being unethical

I remember experiencing this at a private religious university. At the time, my hyper-religious mind was blown to see students outright cheating in the Testing Center.

Since then I've been exposed to additional perspectives on plagiarism. It is an extremely deep and nuanced topic. A few years out of school, I ended up mentoring and then teaching college students who seemed to match the sort of person you describe. This was a huge shock at first.

The more I learned about these students, the more I learned about the sheer variety of perceptions involved: One person's fairness concept is, to another person or group, a latent power dynamic which ought to be questioned.

Or, this person's concern for the big-picture ethical questions is this other person's small-picture roadblock in an economic problem which seems more urgent with each passing moment. You want a big picture? Can you justify it in seconds, with something that's not simply a subjective perception or largely-covert moral construct of your own?

Yet another person's assumption of a perpetually commonly-understood contract is another's baroque exercise in cleverness and flexibility. It's the sneaky laser dance from _Ocean's Twelve_, and _that_ kind of challenge is, psychologically speaking, extremely energizing for them. Don't think they didn't notice how things work in the "real" world! (When these two see each other face to face--so to speak--there are harsh outcomes.)

Anyway--sorry to hear about your experience & thank you for sharing so that others can be more educated about their choice of institution.


The problem with plagiarism isn't best characterized as some sort of power struggle between teachers and students via bullshit assignments.

The purpose of learning to write is to make yourself a formidable communicator. If you can independently analyze a new topic, learn something new, and apply the results of those learnings toward a particular goal, you can be amazingly effective in everything you aim for. But if you plagiarize every assignment, you rob yourself of the training in this critically important competency.

Plagiarizing some work doesn't really hurt the work; it hurts you.


This is under the assumption that you see college as a program of self-betterment and not busy work for receiving a degree that says you can have a middle-class job. It feels like even the universities themselves see it as the latter these days.


The cynical anti-intellectualism in this thread is bracing.

All part of the zeitgeist I guess.

News is fake. Science is fake. Schools are barriers. Everything is subjective, objective reality is nonexistent.

How do we have productive disagreements going forward?


> News is fake. Science is fake. Schools are barriers. Everything is subjective, objective reality is nonexistent. How do we have productive disagreements going forward?

Funny you describe it that way. I'd argue that young people in STEM fields, including IS/CIS/CompSci undergrad programs, think everything can be objective when that clearly is not the case.

You don't need to go to college to press buttons, fill out spreadsheets, or input code until you get the output you seek. You need to go to college to make the subjective decisions, which don't have a clear right/wrong answer.


I can't word this in a non-snarky way, but it's a genuine question:

Why do you believe that college can teach making subjective decisions?


Because moral philosophy and epistemology have been a thing for well over 4,000 years, and college could easily teach the basics.


This.

The humanities are glossed over at best in public high schools across the United States. I don't know of one that requires a PHIL intro course of students.

College is not vocational training (unless you're a law or medicine student), it's for learning how to think.


If you study the International Baccalaureate (IB), a high school curriculum taught around the world that is based on the French system, you are required to study Theory of Knowledge, effectively an introduction to philosophy.


> The humanities are glossed over at best in public high schools across the United States. I don't know of one that requires a PHIL intro course of students.

I'm not sure how much weight this argument holds. The whole "gen ed" thing is a rather US-centric concept.

I don't know of any universities in the UK that require a PHIL intro course of students. When you go to university, you overwhelmingly study the one course ("major") that you picked beforehand. There's often a small amount of room on many courses for optionals from other fields, if you want to take them, but this is by no means mandatory, and I'd say the proportion of folks at my alma mater studying a different degree who took philosophy modules was slim.


No wonder the UK's app startups are even more ridiculous than the US's app startups ;).

Yes, gen-ed is ubiquitous in the US. If you're in any humanities related program in the state I live in (Texas, so that's probably 25-30 large universities total), you'll have to take an intro PHIL course at least, which will probably be Plato and a random survey of 19th century European readings.

More is highly recommended for students looking for law school admission after their undergraduate degree at the state-owned college I attended.


In theory, it would be good to have a part of our educational system which teaches people how to think. But what does that look like? I'd say that boils down to two things: logic and evidence.

The vast majority of philosophy classes are absolutely garbage at teaching either of those things. Sure, in theory, logic is part of philosophy, but in any of the philosophy classes I've taken, we didn't talk about logic. The things we did talk about were often examples of how not to think, yet they were presented as equally valid next to much more rational ideas.

For example, one of the things we covered in my ethics class was Kant's categorical imperative. The categorical imperative falls over instantly if you apply even the most basic logic to it, but no mention of this was made in any of the course discussion or materials. I'm sure lots of people walked out of that class thinking that the categorical imperative was a perfectly reasonable way to make ethical decisions. If this is the sort of "learning how to think" philosophy classes are doing, then I'd prefer we didn't--I'd rather let people figure out how to think on their own than to teach them unequivocally incorrect ways of thinking. Philosophy could be useful if these classes were taught as, "Here's a bunch of historical ideas, and here's how we apply logic to prove them wrong." But until that happens, I'd strongly oppose introducing any more philosophy to curricula.

Other fields are better-equipped to teach people logic and evidence. Science is all about evidence collection, and logically applying the collected evidence to the evaluation of hypotheses. Math, especially around proofs and derivations, is all about logic, and probability and statistics give you tools that are very broadly applicable. History, if taught well, teaches you how to logically analyze artifactual evidence and logically contextualize the present in terms of the past.

But, there are two problems: first, many college students don't focus much on these areas. And second, the parts of these fields which I mentioned aren't particularly well taught even by these fields. Many students get A's in science classes thinking that science is memorizing a bunch of facts about chemicals or living things, without ever having learned how to obtain new facts themselves. Many students get A's in math classes having memorized a bunch of formulas without being able to derive even basic proofs. Many students get A's in history classes having memorized a bunch of historical events, without knowing the difference between primary and secondary sources, and without ever considering that an author might have bias. Even the classes which do teach people how to think, to some extent, generally do a piss-poor job of it.

That's not to say that these fields (and other fields not mentioned) have no value. Even if you think well, your thinking is only as useful as the evidence you feed into it, and colleges do a very good job at moving vast amounts of evidence on a variety of subjects into people's brains. Further, colleges often do a lot of work getting people skills: lab techniques, using computers, effective communication, etc. You can argue that the purpose of college is learning how to think, but the implementation of college is much better at teaching people information and skills. Learning how to think would certainly be valuable, but de facto it's not what colleges are doing, and the things colleges are doing do have some value.

That said, modern colleges often put teaching of any kind behind profits, and that's not something I see any value in for society or students.


I agree completely on your critique of profit motive at universities, and think it particularly applies to state-owned institutions. There is a false notion that profit is an automatic good, when that is clearly not the case.

There is more to critical thinking than formal logic, I'd argue. The classroom format of a typical humanities college course has a lot to do with this. For example I would argue that a person without any background in the classics and some basic theology is going to struggle to get much from Paradise Lost. It's dense, difficult, and you have to know some things going into it to pick up on the nuances. But 30 people discussing it collaboratively 2 or 3 times per week, with the expert professor's guidance when they stumble on certain parts makes for a lot of interesting discussion. Thirty different people will pick up on thirty different bits and pieces in every classroom session.

I'd guess that a big part of the reason there is such a glut of humanities graduates who can't find professorships is that people simply enjoy the classes enough to keep going all the way through graduate degrees. You get discussion and debate in those classes that you can't find anywhere else.

I don't think the above is true of many other disciplines of study, with so many degrees pitched purely as job training out of the profit motive, as you mentioned above.

I can't do a better job of describing this than this professor, who puts his public lectures on YouTube for free:

https://www.youtube.com/playlist?list=PLpO_X3dhaqULiItXg84O9...


> There is more to critical thinking than formal logic, I'd argue. The classroom format of a typical humanities college course has a lot to do with this. For example I would argue that a person without any background in the classics and some basic theology is going to struggle to get much from Paradise Lost. It's dense, difficult, and you have to know some things going into it to pick up on the nuances. But 30 people discussing it collaboratively 2 or 3 times per week, with the expert professor's guidance when they stumble on certain parts makes for a lot of interesting discussion. Thirty different people will pick up on thirty different bits and pieces in every classroom session.

Well, if you look at literary criticism, there are a bunch of different ways to do it. The oldest ways, such as authorial intent or historical criticism, aren't that divorced from history as described in my previous post, or from just normal old formal logic. But a lot of the ways popular now, such as Marxist criticism or feminist criticism, are forms of reader-response criticism. In the worst cases, this sort of criticism can be used as a pulpit for professors to pass on their ideologies, which is deeply problematic--rather than teaching students how to think for themselves, it's teaching them to think like the instructor. In the best case, it can teach students how to evaluate literature in relation to their own goals--but I would argue that this is just an application of formal logic. The reality, in my limited experience, is neither of these extremes--classes I've taken and my friends have taken have mostly been "these are some of the ways people have thought about literature"--it's more about passing on information than about teaching how to think.

As I've said before, there's a lot of value in giving people information, I just don't think it supports the "college is about teaching people how to think" narrative.

That said, I'll give two caveats here:

1. My own formal training isn't in literary criticism, and beyond some general-ed requirements and second-hand experience from friends/partners in literature programs, I have very little experience here. My impressions here may very well be off-base, which is why I didn't mention literary programs in my previous post. A notable bias in my previous post is that I talked most about the fields I'm most familiar with.

2. Up to this point, I've basically been speaking about teaching facts versus teaching how to think as if they were two different, mutually exclusive things, but it's worth noting that that's not quite true. It's true that simply giving a student a fact doesn't teach them how to evaluate whether something is a fact, but if you give a student enough facts, eventually they come across discrepancies and begin to experience cognitive dissonance. Over vast swaths of facts resulting in a few discrepancies, a student will eventually begin to come up with their own framework for evaluating discrepancies, and hopefully that framework will look a lot like formal logic and evidence collection. I'd argue that this is a very, very inefficient way to teach students how to think, but eventually I think it does work.


> For example, one of the things we covered in my ethics class was Kant's categorical imperative. The categorical imperative falls over instantly if you apply even the most basic logic to it, but no mention of this was made in any of the course discussion or materials.

I've read this a couple of times, and I'm curious about what you're saying here: do you mean that your class just reviewed some writing on the categorical imperative on its own, or read Groundwork of the Metaphysics of Morals?


I'm not sure what you mean by 'teach the basics'. Certainly, college can provide the texts and an environment where other people are interested in the same subjects. If that's all you mean I agree, though it's far from the only institution where that's possible.

The trouble, I think, is that making ethical judgements requires wrestling with ethical conundra oneself, and that is not something a professor with 300 undergrads to teach can provide any useful guidance to the majority of them for. The idea of accurately assessing performance is even more unrealistic. Maybe it's a function of the kind of university I attended, but the vast majority of my fellow students who were taking these 'subjective' courses were simply gaming a rubric in their writing. And this is true even of those who were genuinely interested in the subject matter, they saw it as a price of admission.

Which seems to me like an impediment to actually learning what was traditionally taught on more of an apprenticeship than an industrial model. If your own undergraduate experience was different, I'd be curious what your university did differently.


> making ethical judgements requires wrestling with ethical conundra oneself, and that is not something a professor with 300 undergrads to teach can provide any useful guidance to the majority of them for.

I don't remember a lot from my undergraduate course on Ethics, but I do recall that literally in the first lecture, the professor presented us with questions about things like how should one behave or treat others, and then presented us with "edge cases" that directly challenged what most of us had answered.

As a young person, it's very easy to think that our problems are novel and unique, and the ethics course very clearly showed that many of these problems are millennia old, with people having given names to better-realized versions of what most of us think of as the way we should behave, and that people have spent lifetimes of work writing and arguing about the ramifications and "edge cases" of such philosophy.

I feel like the biggest benefit from the course was not any particular ethical guidance, but rather the challenging of our beliefs, and the realization that these things _are_ hard, and are not something we can trivially answer with something that fits on a Hallmark card.


Do you really think new college students absorb all that they hear in the entry-level classes? You're also operating under the assumption that all professors employed by a college are capable of effectively communicating the topic they're supposed to teach.


>I can't word this in a non-snarky way, but it's a genuine question:

>Why do you believe that college can teach making subjective decisions?

Lol, most people actually believe that common sense can be taught to people. I don't.


I agree. Developing it is one thing, even building a space where it's easier to develop, but teaching it? As a guy who's taught for both fun and profit, I don't buy it.


Anti-college is not anti-intellectualism. Quite the opposite. Higher education is no longer about education; it's about profit. Pieces of paper are pointless for anything other than wallpaper. Free-range education through meeting and collaborating with others is more beneficial to expanding your knowledge than handing over money to some college. Save those tens of thousands of dollars and years of your life. Spend that time and money being an apprentice, creating your own curriculum, or taking specific training.


> Anti-college is not anti-intellectualism. Quite the opposite. Higher education is no longer about education; it's about profit.

In the US, maybe. Do people take ridiculous loans for their degree outside of US? Some loans, sure, but loans that amount to 5-10x their future yearly income? I don't know...


While there are some truly staggering examples of US college loan debt, the average loan debt at the end of a 4-year degree in the US is $26k, or about the price of a new mid-range car. For the majority of people, their total college loan debt is below a single year of their starting annual income out of college, and certainly not 5-10x.

[1] http://www.collegescholarships.org/loans/average-debt.htm


> In the US, maybe. Do people take ridiculous loans for their degree outside of US? Some loans, sure, but loans that amount to 5-10x their future yearly income? I don't know...

Yes. In the UK. I was in a relationship with someone who specifically learned German in school and, after she graduated, went to Germany to tutor in a cross-education outreach program while scouting for universities she wanted to attend, in order to avoid taking out massive loans like her siblings did back home. Very smart girl.

She and I enrolled in online classes; I had already completed my BSc but wanted to do this with her. But she felt she was missing out on the 'campus life' part of the university experience, went into pedagogy to the Master's level, and now teaches back in the UK.

In a post-Brexit world, that is just not possible.

The EU is still pretty favourable in terms of university costs, which are hidden and obfuscated via VAT for the students, but many industries within its local economies (the PIIGS, Romania, Hungary, Slovenia in the Eurozone, and just about most of the periphery member nations) cannot provide adequate jobs, let alone careers, to their graduates, so they have to go to Germany, the UK, Holland and, as things have gotten worse, France, though to a much lesser degree than when I was there.

The ideal is landing a job in the US or China, where they can make obscene amounts of money in certain fields like tech or medicine, with little to no debt and subsidized advanced degrees. Which still opens them up to the work visa lottery, and to uprooting your life during some of the most critical years of your entire life (late 20s to early 30s) in the hopes it pans out.

The best thing that can happen is to disrupt it entirely, level the playing field, and restructure it in such a way that it's both affordable and accessible to everyone motivated to go in and meet its requirements. And to incentivize them to stay in their home towns, build a solid community, and tie it to the needs of the actual labor force: hopefully doing away with the notion of studying civil engineering for oil rig drilling if you're from Iceland, that kind of thing. It makes no sense, and doesn't reflect the value system or the job prospects of your community, let alone those of a nation that is entirely dependent on renewable geothermal.

How exactly the lab portion of STEM gets solved is still a mystery.

I propose building auxiliary wet labs in libraries within students' communities. The net benefit being that, for the privilege of having such a model, students would be required to teach children and adults of their community the topic or subject they are studying, as a graded portion of their coursework, building community in the process. Or perhaps that should be the only real on-campus component (at both universities and community colleges) of what is an otherwise entirely online system?

Just look at this example. Having attended my midterm and practicals during one of the largest fires in San Diego history (I was literally trapped in my car for 7 hours on my way back home to OC after they closed campus while we were sitting for the exam and the classroom filled with smoke), and my finals during the H1N1 swine flu pandemic, I can understand this from both sides:

https://ktla.com/news/local-news/ucla-professor-suspended-af...

That hot-button issue could be entirely mitigated; whether you're for or against the BLM protests is irrelevant. As a purely practical and logistical matter, you could overcome this with the technology we currently have and avoid the certain backlash against the professor and department from an irate student body and opportunistic media.

I saw a rant from a UCLA professor pretty much laying out how he and his entire profession have not seen a single increase in pay since he left university in the late 80s as a TA, and how he watched the CSU/UC extortion system being assembled in what was once the envy of the entire US university system--a system that followed the EU's model pretty well, was low to no cost if you were local, and had the ability to employ its graduates as the California economy could support it. That was a net benefit that significantly contributed to CA becoming the 8th largest economy in the world.

I can't seem to find it, and I really wish I had saved it, as the very employees in the system are at the point where they know it went too far. And they are perhaps even afraid of what an angry mob can do these days.


I agree with you, but the primary issue is signalling via degrees. For the elite non-college educated who already have a working portfolio of projects to reference, landing a "white-collar" job may be a possibility. For the rest of them, a non-degree holder, even if objectively competitive with a degree-holder, will be immediately discounted by a hiring manager who's looking at 250 resumes.


Productive disagreement? I think to begin, we need to vet people we might have disagreements with. I don't think productive dialogue can be commoditized. Maybe it can after the fact via podcasts etc, but dialogues themselves are, I think, inherently analogue and highly personal. I think, given the realities of the attention economy, that we need to be much more selective about whom we let into a conversation that might change our own mind. And this needs to be on a case-by-case basis, with everyone setting and updating their own standards on whom they'll let into dialogue.

I think people like the one you're responding to would agree, and increasingly think that association with institutions of higher learning sends a strong signal to avoid dialogue. It doesn't necessarily look like anti-intellectualism to me, any more than filtering out people who didn't graduate high school is necessarily elitism. I could see myself rationalizing either, depending on the kind of conversation I wanted to have.


Could you please define anti-intellectualism by your standards? To me, anti-intellectual notions would be if the initial intentions of a discussion were based on a very rich topic and were dissolved in some manner. I beg a brief and graceful peer review for the poor submissions that likely seek solace in the isolation of a virus-infected planet.

And in the end: in my intention to post, I was solely being altruistic, informing whoever is reading that if they were to read this article and consider getting a degree from U of the P, they should consider the risk. Just a gesture. However, I think my writing style might have been misunderstood as some semblance of a pseudo-intellectual attempt or such. Do know, for the record, that as the 1st to reply to the post, my intention was to inform.

But I am intrigued and inspired. How about we both try to post an article that invites our versions of intellectualism! Ready, set, go.


>"How do we have productive disagreements going forward?"

We are barely having any of those right now in the greater society. As long as we can't argue facts and objective reality, and do so without feelings, we'll continue descending into anarchy and divisiveness.


US is "the greater society" now??


I'm not from or residing in the US. My comment was meant as commentary on what I can see happening globally.


Your post lacks nuance.

Some news is fake, some isn't.

Some science is fake, some isn't.

Schools are barriers, but for many elements of a school, the fact that it's a barrier is a good thing--we don't want ignorant people performing in roles where knowledge is required. The problem is that many elements of schools are barriers which are poor at achieving their purpose, or are directly counterproductive to it.

> How do we have productive disagreements going forward?

That's a complicated question, but oversimplifying the opinions of people we disagree with and then labeling it ("cynical anti-intellectualism") isn't the answer.


Or: Objective reality might exist, but it's not necessarily present in the universities, which are mostly there as an IQ test by proxy with a filter for the most lazy, with some social indoctrination thrown in. Any science or truth exists only at the whims of the social order of the day.


>Any science or truth exists only at the whims of the social order of the day.

This is not objectively true, but I understand what you're trying to say. I'm sad to hear that your experience of science and truth has been only that which society has given you, or at least that you feel others are only experiencing it that way.


Sorry, I should have worded that more clearly: only science or truth that doesn't contravene the current social order is allowed air, allowed to be talked about without reprisal and shunning. The degree to which this is true is a canary for how totalitarian and neurotic your micro-society is. The micro-society of universities generally seems to be becoming more, rather than less, rabid and paranoid.

A lot of it is like Bostrom's idea of the decentralized electroshock dystopia: even though a significant proportion of people are witches, everyone's afraid of reprisal for not actively hunting witches, so the witches-in-hiding hunt their own when they're unmasked.

But this is the way of things; this wave will pass, eventually, as well. And like the Soviet scientists who kept their heads down and mouthed the party line, the secret iconoclasts will survive till the current order is replaced by the next, with its own peculiar taboos.


This is a crucial argument. Do you have a link to something that supports your statement that “this is not objectively true”? I think it would help a lot in these debates.


Yeah it's called "The Barstool Experiment". Look it up :-)

The short summary is: Two people sitting in a bar, discussing how there is no objective truth, or that "science or truth exists only at the whims of the social order" and stuff like that, all day long. And then a third person comes along and smashes their brains with a barstool. In some variations, the first two people are a scientist and a priest. But the third person is just some dude with a barstool.


But if you can't do the middle-class job because you can't write, the piece of paper will only get you so far. It may get you an entry-level job, but you probably won't get much in the way of promotions. So even from that perspective, plagiarism isn't helpful.


> But if you can't do the middle-class job because you can't write, the piece of paper will only get you so far. It may get you an entry-level job, but you probably won't get much in the way of promotions. So even from that perspective, plagiarism isn't helpful.

Where do these mythical jobs exist where being able to write well is a requirement for career growth? Certainly not at engineering companies.

I wish what you said were true, but in my experience, "being able to write and communicate well is critical in the workplace" is one of the top lies taught to me when I was at university. We had to take a regular writing class, and a technical writing class to graduate with an engineering degree. And when I get to industry, I see no signs of people practicing what they're taught, and it doesn't hold anyone back.

Edit: I should say my experience is more about writing than communicating as a whole. People do need to be good speakers/presenters. But writing? Not really.


Yes, writing is important. I have coworkers who struggle at writing, and it takes mental overhead to try to understand them. It hurts their ability to communicate clearly, think clearly, and be taken seriously. I have clients who struggle to write well; I was just looking at a legally binding document that is incomprehensible in places and would have serious repercussions for those using it. That is bad. Society is better when we are truly educated to think and to communicate. We can be enriched by each other when we understand each other. Writing coherently helps us think coherently. That improves our lives more than a degree certificate ever will.


Writing and communication skills are absolutely necessary for career growth. The power of persuasion is directly linked to the ability to communicate your ideas well.


Careers can be grown in many ways. In terms of career advancement, good writing will be outperformed by ruthless opportunism nine out of ten times. Cheating in education is identifying a metric and optimizing for it instead of the quality the metric is intended to measure, and that strategy won't stop being effective upon getting a degree. Implication: no, "they are only hurting themselves" isn't a valid excuse.


> Writing and communication skills are absolutely necessary for career growth. The power of persuasion is directly linked to the ability to communicate your ideas well.

I've updated my original comment to reflect that I was referring to writing rather than communication in general (although I wrote it more broadly).

I've seen people really value presentation skills and PPT. Persuasion 1:1 and via presentations is definitely valued.

But via writing? No. They're atrocious when writing emails. And they rarely write docs/briefs. If they do the latter, it's really meant to be a teaser to get someone interested, and then that person will go talk 1:1 to get the details or ask for a presentation.

My experience at work: Writing anything longer than 1-2 pages is a good way to ensure no one will read it. And again, if I have a good enough "lead", what will happen is the senior person will read the lead, stop reading, and schedule something to talk to me in person so he can understand in detail. At some level, I understand why he would do that - it can be an interactive conversation where he can interrupt, ask for clarification, etc. Whereas if he read the thing, he would have to write up a response, or even worse, make notes to ask me the next time he sees me.

I almost never get anything as well written as a typical HN comment. Even (internal) documentation/manuals/Wikis are poorly written.


At a large tech co that pays well, writing is very important for career progression:

1. You often have to write design docs to communicate what you are making and gather feedback. These documents, if badly written, won't be as well received.

2. At many companies, you have a million things to work on, so in a way, you get to choose who you work with at some level. If someone communicates badly to the point of annoyance, it will take something special for you to decide to work with them.

3. To convince execs and managers to approve your project ideas, you often have to write a document explaining your idea. If it's badly written, the exec isn't going to be as interested in it.

4. To get fame as an engineer, you often should write compelling blog articles. Badly written blogs tend not to be read.

5. Good docs make popular libraries, popular libraries get attention.

Which leads to:

6. Promotion is often done by a committee of people who don't know your work, and all they are going to do is review what you wrote. And promo is often based on leadership of projects. And how do you become the leader of projects? You write compelling documents.

Bad writing is not a good sign. It's like saying, at my company, we don't write tests and we don't have alerting & monitoring on our servers.


> And promo is often based on leadership of projects. And how do you become the leader of projects? You write compelling documents.

This sounds somewhat idealized. As much as writing well is an asset, the understanding and ability to navigate office politics is what increases one's chances at career advancement. In the simplest form, just doing what the boss wants/expects one to do, not doing the unwanted things. No matter big or small company, a lot of subtle things that are said and done matter more, than the ways things are put in writing.


What do you think navigating office politics is? A lot of it is communication, and a lot of that communication is written down, especially in the remote work world we are going to. :)

Also, how do you become the well-paid boss who tells people what to do? You get promoted. You often have to communicate to your subordinates what to do through writing and so on. Otherwise you're known as a bad boss, and the best might not want to work with you.


> You often have to communicate to your subordinates on what to do through writing and so on.

There are many different ways the communications end up. Some ways/bosses are better at writing, others not so much. If anything, notice that some well-writing bosses scale down to very short one-paragraph emails, almost Slack-style. Even more, these already short messages may have typos (how?? the spell check is ftee!!). Well, that's the busy-boss style, as writing long replies or bothering with minor corrections just takes time away from the tons of emails in the queue.

Which brings me to the next point: perhaps a _comprehension_ skill is of equal importance for success in any kind of team. And that goes beyond what's written.

Some people/bosses won't read long texts, others want details, yet others want structure; many prefer visual depictions (hello, white board), some would rather write the whole thing in their own words.

As it's said: write for your audience. I'd add that one needs to communicate at the audience's comprehension level, and then you'd stand a better chance of being understood.


> What do you think navigating office politics is? A lot of it is communication, and a lot of that communication is written down, especially in the remote work world we are going to. :)

I think you and GP have quite different understandings of office politics. I know the GP's perspective: Most office politics is done over coffee/lunch. Then the followup is "Can you present your stuff in our next meeting?" And if not that, then "Do you have a PPT with your proposal?"

People in my company like looking at PPTs. Reading a few pages? Not as much. Just today a senior exec sent out a 4 page whitepaper on the key initiative our department is working on. I probably should do a survey in a week's time to see how many people bothered to read it.

And once again, I think your perspective is biased a bit towards certain SW companies. Most engineering companies don't have remote work (although there's talk in my company to continue allowing it once COVID blows over). It's a well known tech company, but definitely a very traditional one.

> Also how do you become the well paid boss that tells people what to do? You get promoted.

As I pointed out earlier, most companies don't require a formal writeup for promotion beyond the annual review. When I got my last promotion, my manager told me, "Both I and my manager are quite familiar with your work, so this paper you're writing is merely to fulfill an HR requirement, and I'm here only to make sure you don't claim something you didn't do." We don't "apply" for promotion. It is granted based on the manager/committee's opinion of how well you're doing. I didn't know I was getting one till it was granted.

Oh, and those annual writeups are a thing of the past. So now your bonus/promotion is entirely based on your manager's opinion. People did protest this change, but I actually welcomed it - I knew the act of writing things up in the past was viewed by most teams as a mere formality, so why waste time on it?

Our company has lots of mentors who do 1:1 or group presentations on career growth. What I write here is reflective of what they say. Not once did they advocate "writing well". Networking, getting to know your manager's needs (or his/her manager's needs), etc are the usual ways to do well. And presenting in front of audiences (for networking, not for promoting your idea - that is secondary).

> You often have to communicate to your subordinates on what to do through writing and so on. Otherwise your known as a bad boss and the best might not want to work with you.

There are plenty of bad bosses in my company. And that's where office politics come into play - "good" engineers in our culture are those who navigate around such bosses via politics (and don't spend time complaining). "Bad" engineers complain or leave.

(I don't agree with the sentiment, but that is the perspective here).

Probably a lot of this has to do with the fact that we have very few competitors. For most of the "best" engineers, going to a competitor is a step down. And from what I've heard from those who left, the culture isn't any better there.

Based on the responses to my initial comment, I'll concede that not every company is like mine - and that's refreshing. It is also clear that my company is not by any means an outlier.


Nope, he's talking about actual tech companies. Competition is fierce there, and "just doing what the boss wants/expects one to do" is the bare minimum or even underperforming. Being able to articulate your ideas and persuade others that you're right and to follow your lead is crucial to rising through the ranks.


> 4. To get fame as an engineer, you often should write compelling blog articles. Badly written blogs tend not to be read.

I could go step by step, but really? This sounds so far removed from reality. There is a reason engineers don't write blogs: they don't matter.


The proviso there is "to get fame".

Technical accomplishments only make you famous if people know about them - if you want to become Joel Spolsky or Bruce Schneier or John Carmack or Donald Knuth then either you're going to have to publicise the evidence of your brilliance, or someone else is.

Plenty of people don't particularly pursue fame - you can make plenty of money and get plenty done without it. But if fame is your goal and you hope to achieve it through code alone, I challenge you to name the maintainer of grep or openssl or the linux kernel bluetooth subsystem without looking it up :)

There are other routes to getting your accomplishments known, of course. Sid Meier and Bill Gates and Linus Torvalds aren't famous for blogging.


Neither John Carmack nor Donald Knuth nor Bruce Schneier became famous because of blogging. The only one who became famous for blogging was Joel Spolsky, who blogged more as a technical manager or entrepreneur than as an engineer. Joel Spolsky also blogged when light management blogs were rare and would show up high in Google search. It was something new back then; it is not something new now.


Knuth and Schneier both wrote highly influential books, though, which I would argue are examples of programmer plus writing skills equals famous programmer.


Which is not the same thing as having a blog. It is completely different. They also wrote under different conditions than blogs are written in: a book was paid work, and publishers pass your books through editors.

Also, Bruce Schneier is not a programmer. He can code, but the bulk of his work is not programming.


As they say, if you are trying to get rich and famous, instead try to just get rich. Chances are it will suit you perfectly.


I am at "large tech co"

> 1. You often have to write design docs to communicate what you are making and gather feedback. These documents if badly written won't be as well received.

In my company this is entirely up to the culture of the org. Some parts of the company will require it. Other parts treat it as a formality (i.e. few will read it). And other parts don't require it at all.

> At many companies, you have a million things to work on, so in a way, you get to choose who you work with at some level. If someone communicates badly to the point of annoyance, it will take something special for you to decide to work with them or not.

The key phrase is "to the point of annoyance". If most people are poor writers, they are not annoyed at the fact that their peers are poor writers. Even worse, being a good writer is not an advantage.

If your culture doesn't value it, then it is not of value.

> 3. To convince execs and managers to approve your project ideas, you often have to write a document explaining your idea. If it's badly written, the exec isn't going to be as interested in it.

I addressed this in my comment and won't repeat what I've said.

> 4. To get fame as an engineer, you often should write compelling blog articles. Badly written blogs tend not to be read.

I suspect this is reflective of the SW point of view. My company is an engineering one. It is a giant, and is usually the top company in its discipline. I've worked with engineers who are likely the best in their discipline globally, and often way ahead of academia.

Not one of them has a blog - internal or external.[1] Very senior leaders tend to have them, and they usually are not technical, but corporate speak.

Keep in mind: Most of the engineering world is very different from your typical SW company.

> 6. Promotion is often done by a committee of people who don't know your work, and all they are going to do is review what you wrote. And promo is often based on leadership of projects. And how do you become the leader of projects? You write compelling documents.

Of your whole comment, this most reflects how unrepresentative your perspective is of the engineering (and even SW) world. Yes, I do know some companies that do promotions via a committee of people who don't know your work (Google, etc). For the rest of the tech world, this is rare. People get promoted because they have a manager who will root for them in front of the committee. The committee is typically the next-level manager, and he/she is likely aware of your work. There's no "promotion packet" that one writes. There's the annual review (under 2 pages), and the committee only scrutinizes it if your manager is pushing for a good bonus or promotion. And as long as it's readable, it's good enough. Of course, this means that spelling errors and poor grammar are OK.

When you want to get to a really senior role (usually takes 15+ years in the company - less than 1% of employees reach that level), only then does a wider committee get involved and will scrutinize your work. Do you have patents? Do you have external publications? And this is only for a technical role. You're not subjected to this scrutiny to get into senior management. Which is why surprise, surprise, we have a larger number of senior managers than senior engineers.

> And how do you become the leader of projects? You write compelling documents.

Oh heck no. You get an idea and pitch it verbally to management.

Trust me - a very consistent piece of feedback I've gotten from management at work is "You write too much. No one will read what you wrote" - almost always delivered after I write an email that has 4+ paragraphs. I'm not claiming I'm a great writer, but they don't give up because they find my writing hard to read. They give up after seeing that the email doesn't fit on their screen. If I have a "tldr" they'll read that and talk to me in person (only a tiny minority will read the actual email).

Culture is king. Writing well will serve your career only if you are in a company that values it. Let's not pretend that writing well will take you places in organizations that don't value it.

[1] I should say that they do not openly have a blog. Some may have ones they don't publicize. Having a technical blog you spend a lot of time on would not be viewed positively, and the more senior you are, the more people will be concerned you'll leak IP. The company is pointlessly secretive and senior management doesn't want to allocate resources to vet your blog's content for IP violations.


Sounds like a great, healthy company to work for. /s

> "You write too much. No one will read what you wrote" - almost always delivered after I write an email that has 4+ paragraphs.

Writing well does not mean writing a lot. Often it means the opposite: the same thing written plainly in fewer words is usually better. Apply Occam's razor to your writing and shave away the extraneous.

Also, 15 years isn't that long of a career, and eventually you want to get to the staff engineer level, where the promotion committee dynamic applies. And at big tech cos, similar promo packet stuff happens for sr. management too.


> Writing well does not mean writing a lot.

4 paragraphs is "writing a lot"? Seriously? Since this whole thread was about writing classes in university, I don't think anything I wrote there was less than 4 paragraphs, and my grade would have been poor if I had.

How many whitepapers or technical docs have you read that were less than 4 paragraphs? Manuals? Changes explaining a product pivot?

Honestly, some of the responses to my comments fit in the category of "I'm unquestionably right. So if it's not working for him, he must be doing something wrong! Let me try to guess at what that is."

Not the most fruitful way of having a conversation.


I work for a small (about 50 people) company. The ability to write well is absolutely huge. I've seen people refused salary increases (indirectly), not given more important tasks, and simply struggling on a daily basis when they can't put in writing what they need. I wouldn't be able to count how many times I've had calls from partner-level suppliers telling me they can't understand what the email they received from my colleague means. It's so bad that sometimes I'm honestly thinking of introducing a writing test for the next hires.


I'm generally against pre-employment tests, but a short writing test is quite possibly the best I've heard of. The biggest problem I can think of is that making it non-discriminatory might be difficult.


I’ve seen many engineers stuck in their careers in part because their writing skills were lacking. Articulating your ideas in a written form is a very effective way to influence others and share your vision. It shows others how you approach a problem and what you value.


Now imagine how bad it would be if we let in the people who are worse at writing than the current set of people.


"longer than 1-2 pages"

^ not to be confused with strong writing skills!

Brevity is crucial. Learning to compress ideas, e.g. limiting emails to 5 clear sentences, is part of this rare and important skill. Sorry you haven't (yet?) experienced a work env where there's good writing. Such places do exist!


> "being able to write and communicate well is critical in the workplace" is one of the top lies taught to me when I was at university

Definitely. Communication is full of casual txtspeak and/or broken English all over. At first I suspected my first job's recruitment emails might be some kind of scam, because they were written in 3 different fonts in the same email, with random words capitalized or colored various colors for emphasis, and of course full of broken English. And I'm not talking about just terms like "do the needful", which are valid Indian English; that's fine. But even evaluated as that language, so much of the communication is just terrible, and nobody seems to care. I guess it works out fine and ultimately doesn't matter much, but it still feels unprofessional.


I strongly disagree.

Communication skills -- especially in writing -- are increasingly important, rare, and valuable.

I've been doing software-related work for a living since 1998. The trend toward remote and async collab -- which has only ever increased in that 22-year span -- strengthens my conviction.


You are seriously limited in your ability to move past being an individual contributor if you aren't able to write well. Especially in this industry where remote work, even across time zones, is so common.

I see some really brilliant problem solvers in my company, for instance, that are definitely being held back by their inability to communicate well. Communication allows you to scale your impact several times over.


Which field(s) of engineering do you talk about? What are the highest level(s) of promotion the persons who don't write well reach and continue to work in?

I would think that writing well is at least a requirement for promotion into a technical leadership role (above senior individual contributors).

By writing well, I don't mean in the style of journalists or novelists. Rather, writing clearly and concisely to effectively convey one's points and reasoning should be very valuable in engineering.


> Which field(s) of engineering do you talk about? What are the highest level(s) of promotion the persons who don't write well reach and continue to work in?

Electrical, computer, and SW.

I'm not saying communicating well is not needed. I'm saying writing well is not needed. What I've seen: a good presentation (including PPT skills) is much more valued than writing. Decisions are usually made because of presentations, not because someone wrote a good brief outlining positives/negatives. Emails longer than a few lines tend not to be read, so people don't focus on them. Documents are usually not read by many except those beneath them, etc. I almost never see senior management write anything of substance unless it is required by Legal/HR--they'll always get an underling to write it (and no, writing these is not how underlings become senior management).

I'm not saying I like the state of affairs, but it is how I've seen it.


You must be joking, or else work with very low caliber people. I don’t consider people for positions who cannot write well, and certainly know that most successful people do write well. Really, can you take a poorly written engineering spec or RFP seriously?

Just in case you are serious, counterpoint for others who may not know, all forms of communication are very important to succeed and move up.


> You must be joking, or else work with very low caliber people.

They're only low caliber people when it comes to writing. Otherwise they're exceptional engineers. The company historically and currently is a market leader. We're not talking about a small shop.

> I don’t consider people for positions who cannot write well

And this is exactly what I'm talking about. If you're in a setting that values it, and set up filters for it, then of course writing is important. If you're in my company where it isn't valued, then not only is it not important, it has little benefit. No point in writing well if people aren't going to read it.

> Really, can you take a poorly written engineering spec or RFP seriously?

In the case of my company, yes - if you want to keep your job. Unless it's inscrutable, I can't go to my manager and refuse to work on something because the spec is poorly written. He'll immediately tell me to go contact the author and sort it out. Occasionally the author will be nice enough to fix the spec and release a new document. But it's hit and miss. The reason many don't fix the formal spec in these cases? Their managers don't value it.

Specs are for major efforts. At the intermediate level: "What the heck is a spec? We just communicate requirements via PPT."

(No, I'm not kidding).


> I don’t consider people for positions who cannot write well

But that is your personal preference, enforced only where you personally can enforce it. That does not imply your decision making is typical for the industry.


I've gotten lots of emails from bosses and bosses' bosses like "i semt the file,pls confirm." Few seem to care about writing well in the workplace. With psychological effects like how talking like someone makes them like you more, I wouldn't be surprised if "proper" writing was a hindrance to career advancement in such a scenario.


That may not be a counter-argument, it's quite possible that upper-level management doesn't feel the need to impress 'the little people', and only sweats over documents that will be read by their peers.


I can only believe you are trolling.


You write well. Maybe you have a blind spot because of that.


Wait... does anybody care about writing skills anymore, outside (or even inside) journalism? I wish they did. Writing seems to be treated like a 20th-century skill of the uber-affluent layabout these days. I jump for joy when I stumble upon some great writing on the internet.


Yes. Try consulting. Information security consulting, to be precise: our work product is a written report. We have to communicate nuanced and complex topics to a variety of audiences. Writing well, even technical writing, is hard, and it's not as obvious as code when it is not quite right. We care deeply about it. Writing clean proposals, white papers, blog posts, etc.: it all matters to us. Sure, your average coding job doesn't require it, but plenty of work does. And I do believe having written communication skills at a high level is a competitive advantage. It's hard to see, but in the long run people who can eloquently write their ideas have an advantage over the less articulate.


The number of people in this thread breezily dismissing the value of effective writing is taking my breath away. I don't think these confident declarations we're getting in this thread paint a remotely accurate portrayal of skills that actually help career advancement. I think everyone's kind of playing a game where it's treated as a trick question and they're looking to emphasize the exceptions as much as possible.

Work in, say, nonprofits, public relations, marketing, or consulting, or at any institution where you're at a level of management where your job is to present plans and preside over their progress while being accountable to oversight (examples off the top of my head where I have at least some familiarity), is work where strong writing is an asset. And I'm sure I'm just pointing to a small slice that I know from my own experience. These aren't special exceptions; they're the norm. The counterexamples make me wonder what actual career experience, if any, people are drawing on to claim otherwise, and whether they have the perspective to understand how representative those counterexamples are.


Throughout grad school, I worked as a writing tutor to support my humanities habit. Each new class of high school seniors was less skilled (and even less interested) in writing than the last. These were kids with 3.9 GPAs who went to Dalton, Horace Mann, Brearley, Choate, and similar. I think great writing is a valuable life skill. I wish I saw more evidence that employers care about it in the real world (hiring and advancement).


I literally just named three or four entire industries where, based on my career contact with those fields, I feel it's almost certainly an important career skill.

The lack of responsiveness comments have to one another on the internet is disorienting to me, because I would have thought this would merit acknowledgement. Your anecdote may as well have dropped out of the sky in response to basically any comment in this thread.


Shoot. I was trying to be supportive/positive. I have worked in many non-profits. I agree, writing virtuosity would be a tremendous asset in some of the jobs you mentioned. I have not yet encountered such skill in these realms. I’m all for bringing back literacy! Beautiful! But... unless one is employed at Harper’s or Granta, expect a less than orgasmic response to your stylish articulations. I’m still a fan of great writers.


My point is not so much whether employers value it, but that it is a tool in your arsenal for advancing yourself.


To be honest, I arrived at university already fully capable of doing my first job out of it. I think most people do. The selection process has you write essays and do math and all that shit.

Work is hard, but like most things, you learn best by doing the thing. Not saying SICP was shit. Just that I could have done that in high school and accelerated my time to money (and, through it, contentment).

Maybe I'll let my kids do something like that if they feel the mildest desire to.


Kudos to you. Even having started coding QBasic in fifth grade, I definitely needed 3-4 years of undergrad to be ready for my first professional SWE job.


That's really funny. We first did Logo and then BASIC at that age (~10 years). Good times, eh? Back in the day? Everything was new and fresh.


I was _thrilled_ that I could program with something already on my PC. This was before the internet, and I was under the impression that a C compiler cost several hundred dollars, which we couldn't afford. Eventually, I even wrote "production" QBasic: my boss at the ISP would manually move spam to a spam folder in Thunderbird, export it periodically, and kick it over to tech support. We used to manually look at IPs and enter them into the Cisco blocklist, until I wrote a super janky QBasic script to extract IPs.

Two years later, I learned about regular expressions.
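
(If anyone's curious what that job looks like with regexes: here's a minimal sketch in Python rather than QBasic, with a made-up export filename and a naive IPv4 pattern that doesn't validate octet ranges.)

    import re

    # Naive IPv4 pattern: four dot-separated runs of 1-3 digits.
    # It will happily match nonsense like 999.999.999.999.
    IP_RE = re.compile(r"\b(?:\d{1,3}\.){3}\d{1,3}\b")

    # "exported_spam.txt" stands in for the mail-client export.
    with open("exported_spam.txt") as f:
        ips = sorted(set(IP_RE.findall(f.read())))

    for ip in ips:
        print(ip)  # ready to paste into a blocklist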


Haha, incredible. Love that you managed to use BASIC in production. What a tale.

We had to go to the computer lab at school to use a computer. But then my parents spent a fortune on getting us one. Now that I think about it, it was like half a year's rent. Jesus Christ, what were they thinking?!

Mostly played games. But then I got a Linux CD from a magazine when I was 13. Wiped the drive accidentally trying to partition it. Disaster. Path to writing code: begun.


If you have the skills to get the degree with either 8 hours of work per day or 2, getting it with 2 hours of work does not mean you will be unable to do a job that requires 8 hours of work when you graduate.


There seems to be a sizable group for whom any job (especially an entry-level middle-class job with the associated health insurance benefits and salary) would be a massive step up in income and social status. Cheating on a test to escape a life of poverty in the slums is not much of a dilemma, IMO.


It's actually the opposite. First-generation migrants try their best because they know what happens if they fail. Cheating is easier if you can bribe the authorities.


Most jobs don't require you to write well at all, especially not at a university level. Really, all most folks need is the ability to communicate effectively in person or over email. You might, just might, be required to write a letter. Of course, these skills are ones that a store manager at any random retail or dining establishment needs too, and some of these 'middle class' jobs require nothing much at all (if it is a factory that pays well enough).

Besides, plagiarism isn't really about writing. You can lump it into two categories: cheating, which isn't most folks' intention, and, more importantly, failing to give someone credit for an idea. The latter is something folks need to get right in some professions. Don't take an employee's idea and call it your own; the same goes for something your boss has you pass along. Don't pretend something is your own idea when it was implemented at a job you had years ago. This version of plagiarism is vastly more important than writing skills (which can be taught without needing to address plagiarism).


> This is under the assumption that you see college is a program of self-betterment and not busy work for receiving a degree that says you can have a middle-class job.

That's not really up for discussion though.

The degree itself would become utterly meaningless extremely quickly if that kind of reasoning were actually generally accepted.

The whole reason the degree is worth something is because it's perceived as a token of you having done the work and self-betterment etc.

It's not an empty token that allows you to have a middle-class job. In practice it might be, but as soon as you openly accept that that's just what it is, and only what it is, then you only get cheaters.


To have a middle-class job, you need knowledge, not a degree. (Disclaimer: my son does not have a degree but has a pretty demanding middle-class engineering job.)

If you spend time at a university just to get a diploma and maybe some connections, you are likely wasting your time and significant money (remember, a student loan cannot be discharged in bankruptcy).


Even if you only care about obtaining the degree, rampant plagiarism also diminishes the value of that degree.


I think plagiarism also harms the character. If, instead of doing things right and putting in the effort, you cheat, you eventually learn that doing things right isn't worth it. And then it becomes a personal trait, so to speak.


> I think plagiarism also harms the character. If, instead of doing things right and putting in the effort, you cheat, you eventually learn that doing things right isn't worth it. And then it becomes a personal trait, so to speak.

Welcome to the underlying systemic problem with some of Society's more critical institutions.

Fraud and corruption have become institutionalized, and lying and cheating are just the name of the game.

Explain to me how banks got away with what they did, if not for this very root issue: blow up the economy with reckless, risky investments, and it's bonuses, bailouts, and golden parachutes for all.

Default on your student loans, utility bill, or car payment? We'll ruin your credit for all of your miserable existence, while you slave away anyway, because the former cannot be expunged.

I'm so glad I borrowed from family and friends instead of banks or the State. It was hard paying them back, but if I had to choose a creditor of last resort I think I made the right choice.


> Plagiarizing some work doesn't really hurt the work, it hurts you

Except learning isn't really the reason most go. It's to get a well paid job at the end of it.


I returned to college after 10 years in the insurance industry. I’ve always viewed programming as a hobby, but decided to take the leap and pursue a career that I might actually enjoy.

Anyway, I notice a lot of younger students have this attitude and it frankly causes them to produce really crappy work. As long as they pass the class, they don’t really care to absorb the material.

I can’t help but wonder what kind of job they’re hoping to get when they leave school. What will happen when they get a technical interview? I can’t imagine them doing anything beyond answering phones at a company’s IT Help Desk.

I dunno. I listen to every word the professors say as if they’re telling me the secret to eternal life while half the class is dozing off.


The one class I took in college that was relevant to my job at all was the AI course on support vector machines, which were irrelevant two years later. For most programs, correctness is a nice-to-have, forget speed or responsiveness or data structure choice... and in production code, non-technical and constant factors override theoretical solutions. University computer science courses, on the whole, are quickly obsolete or obsolete to begin with, incompatible with software development as it's practiced, and are best reviewed in the week before seeking a new job to score well on leetcode. The best skills you're going to learn are the metacognition you pick up at university, the ability to learn new languages, adapt to new environments, and solve new problems, not anything that's actually taught.


> What will happen when they get a technical interview? I can’t imagine them doing anything beyond answering phones at a company’s IT Help Desk.

My experience says you'd be very surprised. As in most fields, networking, charisma, and ability to bullshit play a substantial role in IT hiring.


> I listen to every word the professors say as if they’re telling me the secret to eternal life while half the class is dozing off.

This is absolutely brilliant.


As a software developer, my education has nothing to do with my job or my salary.


> > Plagiarizing some work doesn't really hurt the work, it hurts you

> Except learning isn't really the reason most go. It's to get a well paid job at the end of it.

There's no contradiction here.


It doesn't necessarily hurt you, but it doesn't help as much as it could.

From talking to people further along the plagiarism spectrum than myself, I gather they see it as developing good taste, or almost as coaching.

Yeah, yeah, in a writing class it's hard to justify not learning to write. But in any other class...

Let's hypothetically say we're in a computer science class and our assignment is to write an essay on the supremacy of the C++ language. There are all these English-department goals of becoming a better writer that would be met by my pitiful attempt to glorify polymorphism. But the C++ goal of learning to be a better C++ programmer would be best met by extensive reading and research to find the best Stroustrup quote.

If I were involved in the academic scene of converting papers into salary via cooperation with other researchers, I would need to quote my coworkers accurately to share the revenue appropriately. But what if I don't have the goal of playing that game? In a learning environment, in casual verbal conversation, I might tell my C++ instructor that C++ main() returns an int. Yet if I write that down, as I just did, I'm committing the academic sin of plagiarism by not properly footnoting Stroustrup; that's a direct quote from him. But I'm not trying to play the academic game; I'm trying to learn to program, and to develop good taste by copying the right people.

It seems a little unfair to grade students on a game different from the one they signed up for. Even if the institutional goal is to produce little academics, in practice almost none of the kids will become academics.

That's a very authoritarian example of copying a guy at the top, but it also applies to lower-level copying.

They're not necessarily wrong or self-destructive, just kids on a different path with different priorities.


I was going to write something similar -- that if a writing assignment seems pointless from a student's point of view, and they think the study time is better spent in other ways (maybe coding a software program instead), then to some extent I could sympathize with those who plagiarize, if it's to save time for something they think is more on-topic.

But if it's a writing class, then no! Same for writing about history or society, etc.


> Power struggle between teachers and students via...

Uh, did I say that, or were you intuiting? Not exactly a learner's approach :D

> Plagiarizing some work doesn't really hurt the work, it hurts you.

Except when it benefits you? This is the subjective perception I was talking about. That their mindset differs does not instantly make them wrong, especially when you can throw a dime out the window and hit an educated professional who falls short of the best (heaven forbid the "perfect") ethical standard.

Ethics is, and should be, hard. If you put words into my mouth, is your position ethical? This stuff requires the ability to stick around, listen, learn, and stay in the game, moreso if you plan to claim the high ground.


This isn't a subjective ethics problem. It's a competence problem. If you can't formulate your own thoughts into a coherent statement (the purpose of learning to write), then you'll be useless to anyone who needs new coherent arguments.


That's a bit of an overreach, isn't it, calling my writing incoherent and incompetent? (Edit: Parent clarified that "you" is meant to mean a hypothetical student, not me.) If more information is needed, as was the case here, we can ask questions. That's got nothing to do with author competence. It's not an essay contest. And why assume we know it all, filling in the gaps like that? In a discussion of ethics, this is a qualitative issue to say the least.

The concept of competence as you describe it is also very much a vague, subjective concern out of which you've just attempted to carve a covert competence contract. This leaves your blind spot unguarded because you are unknowingly making the discussion focus on you and your own competence level.

And this is a big part of why "hyper ethical" subjective ethics people struggle--they assume their view is right and don't ask questions of others.


I haven't said anything about your writing at all. Maybe it wasn't clear enough that I meant "you" as in "someone" or "one". I'm sorry it was possible to misunderstand my general statement as a personal attack.

This doesn't have to do with me either -- the market will determine whether any one person is valuable enough to employ (or promote). My only claims are that being able to write makes one more valuable, and that plagiarizing assignments at school fails to teach one to write.


Ah, I see what you mean, that it was not meant to refer to my level of competence. :) I did indeed read the "you" wrong. Thank you for the clarification.


His statement was very matter-of-fact.

If students don't bother to do the work, they won't develop any competence in the discipline.

It would be like sending someone for Scala training, only to have them skip all of the work, buy the answers to the quiz, get the accreditation.

University is about much more than 'skill acquisition' but there is a lot of that. Cheating is almost universally pointless.


‘Cheating is almost universally pointless’

Except that it clearly isn’t. There are all kinds of people in positions of power who are clearly incompetent in many of the skills we would want them to be expert in.

Often cheating enabled them to pass the gatekeepers and attain their position.


(Edit: Thanks for the clarification on the other comment--I had read the "you" wrong)

Sure, that example absolutely works. One reason why you wouldn't cheat is that you know that a specific outcome you want requires something of you that you must learn. I found it striking just how rare this was, though. We can fault students for not having made up their minds, but I found that many of them are just really open to new directions, and this can help to enable part of the plagiarism equation, but it's also something of a gift...

Anyway, talking to my students I discovered that the discipline is really often completely up in the air. So while the rhetorical / imaginary student's path for the purposes of argument might be "study math -> work in applied math," quite often it's "study phil -> work in I don't know what" or similar.

The students who plagiarize with this mindset are really quite something. It's nuanced--they're smart about it, leaving no final question on which points against them can rest. For example, the student submits a first prospective paper in which they quote-paste for pages on end and then plagiarize not by direct-copy, but by reading and then re-hashing someone else's conclusion from a book or another paper, and they get a C+. Well, if a B- is all they need in the class, they are good to go. Then they take a reactive / tactical stance and only change this approach in the future if they absolutely have to.

This pattern happens over and over. If you attempt to pin the student down on qualitative issues, they have a number of tools to use here. You have to be ready for extreme negotiation. They _may not be able_ to learn about quality, ethics, etc. Shocking sometimes but it's a struggle for many. One of the most common negotiation techniques is, "I just...I don't understand. I'm really not that smart" and then they start crying or leave the room in a rage. This can instantly shut down a professor with average or greater levels of sympathy. The student converted the negotiating professor's original value proposition into a risky interpersonal issue. If further negotiations occur, they will find ways to illustrate why things are unfair to them. What is the prof going to do about that? Do they even have time for it at all?

Then you can go back to students who are in the "study math -> work in applied math" group. You look outside of the math classes and you can see the same pattern. They know wasted effort when they see it, or think they do. And again--some, not all. Savvy employers also weed out some of these people but then other employers hire them because they desire tactical cleverness in their organization, and they recognize it when they see it. Maybe it's how they got the boss job in the first place.


It’s not an overreach. The competence is a product of originality and independence.

> And this is a big part of why "hyper ethical" subjective ethics people struggle--they assume their view is right and don't ask questions of others.

That is itself quite the assumption.


Not an assumption at all. That's from my own past experience, from having conducted qualitative research in this area as I interviewed and taught students. I shared that experience in my original reply and would love for anyone else to do the same, referring us to their past experiences, even anecdotally, rather than forecasting woe via subjective intuition.


My real-world experience has taught me that many developers cannot write (code or prose) and would rather jump in front of a bus than write their own original code. The next time I am tasked with candidate selection, I will use an essay assignment as the first round of filtering.


> Uh, did I say that, or were you intuiting?

What is it with this everywhere on HN these days where people assume every response someone makes is a refutation? It's a conversation, dude. People will take it places. Sometimes people will build on ideas you mention. Other times people will take an incident and draw their own conclusions. Other times people will draw on a similar incident or talk about a related concept.

I'll accept my off-topic downvotes since that is fair but this is so frustrating.


Hmm, I think it's only fair that we zoom in a bit from "everyone on HN" complaints to the details in question. Otherwise it makes it appear that you want to avoid discussing the details in question, which flow directly into the qualitative nature of this discussion.

I just re-read what you wrote. If you're saying your response was not a refutation, given that you wrote "is not best characterized as," I think it's pretty clear as to why that could be misunderstood, to say the least. It seems clear to me that you were replying in direct disagreement and also projecting words I never said right into your reasoning. And further, it now seems as if you're claiming that I'm being assumptive. This is just compounding, not helping.

I think specifics are important here because it's unfortunately common for people to attempt to sweep pesky details under their subjective-ethical rug in the name of [hand wave]. Since this is an ethics discussion some due concern to communicating ethically seems reasonable to expect. If that's frustrating, maybe you can at least see the frustration on both sides.


I wonder if a similar principle is at play when people throw links at each other in place of a debate.


>Plagiarizing some work doesn't really hurt the work, it hurts you.

And a job well done is its own reward, right? I think it's very pretentious to say to a person who's attending school in order to improve their lot in life (because credentials count for so much) that what's more important than the credential is some abstract notion of improvement. You might as well cast it in terms of sin and salvation.


But isn't this abstract notion of improvement supposed to be the entire point of education? It feels to me like education's purpose is undermined by its role as a prerequisite for a middle-class career. "When a measure becomes a target, it ceases to be a good measure."


Abstract or not, it's completely qualitative / subjective. Easy to argue. Very easy. Try it: Pick a specific, abstract notion, name the institution, and talk to the students.


> and a job well done is its own reward right?

Sure, if the life you want is to be at a desk for 8 hours a day regurgitating your superior's existing biases back at them, go ahead. I find employers much prefer someone that can attack a real problem and think critically about potential solutions from multiple levels of analysis. If all you're good at is chewing someone else's cud and spitting it out with a slightly different word order then you're useless to people that actually want to solve problems.

And, sorry to say, credentials are counting for less and less every year. If employers have to take a year to train you to think critically, then what use does a credential serve as a filter? I wonder why that's happening...


(For work) I have both "plagiarised" a policy that I found on the internet, AND written one using the ToCs of 2-3 other policies as a source, but doing all the fill-in myself. I learned nothing from the copy & paste; I gained plenty from typing it up from scratch myself (I even added points that the ToCs were missing).

The clients got the same value (they wanted a v1 policy, and they got one). I became better by doing the work, so the next time I had a discussion on the matter I felt that I controlled the discussion instead of just pitching in.

Faking it till you make it carries the risk that you fake it forever and become a paradigm case of the Peter Principle.

Walking the walk takes more time but always pays off in the long run.

I have met plenty of people, though, who take the risk of never growing/evolving and stay in their comfort zone, because they just want the base salary to fund their hobbies and get no sense of accomplishment through their work (for many reasons; I'm not getting into that discussion).

Edit/PS: I now work like this (when asked to develop a policy for a new client): spend some time thinking of key points (technology changes fast enough in some areas), drop a couple of examples for each bullet point, and then "plagiarise" from my own previously made work. This way I have already prepared part of the downstream Procedures. You would think that a Policy is high-level enough that it shouldn't need frequent changes, but different clients want different things.


> a person who's attending school in order to improve their lot in life (because credentials count for so much)

If every person at the school had this reasoning, the credentials wouldn't count for anything.

The credentials only count for something if enough of the people graduating are actually fulfilling this promise of self improvement in the subject of their studies.

The cheaters are literally freeloading off the prestige of the credentials produced by the people who do put in the effort. If that group of people did not exist, the credentials would be useless to the cheaters as well.


> in order to improve their lot in life (because credentials count for so much);

Those credentials only matter insofar as they describe the likely caliber of the alumni that come from that particular school. Being a terrible student isn't going to help the value of your degree a whole lot...


> Plagiarizing some work doesn't really hurt the work, it hurts you.

Funny enough, this idea is almost certainly one you are plagiarising. Perhaps you could find nuance, depth, and understanding by turning those words on themselves?

Somehow I feel like most of the debate here centers on these words:

> The purpose of <foo> is <bar>.

That little word "the" there at the beginning seems a bit myopic to me. I mean, clearly, something like writing functions in more ways than one. Here are some other potential purposes of writing, off the top of my head:

* Deconstruct your personal ontology,

* Intrinsic artistic value,

* Emotional expression,

* Elucidation of unconscious grammatical habits,

* Social signaling.

Any of those are perfectly functional operations of the writing act, and a few of them actually benefit by plagiarising (Identifying these is left as an exercise for the reader). Of course, saying that "university should allow students to operate under any possible goal framework" is a different matter, but hopefully that at least points toward one way of thinking with more nuance about plagiarism.


I've seen this myself, and I think this sort of moral relativism is pretty bad for society in general. You can spin a story of victimhood or power dynamics to justify anything unethical. What should be a fair transaction becomes a game of who can backstab first and who is left holding the bag. This, I think, eventually leads to a breakdown in the fabric that keeps things stable.


The problem is that these people will plagiarize their previous employers' code when working on your codebase, plagiarize your code when they move on to their next employer, and plagiarize FOSS code without worrying about the niceties of licensing. Their perceptions are one thing, but the legal reality makes these sorts of people hazardous to employ.


Except when their boss hired them because they're clever and can solve extant work problems (huge workload) via a happily-localized scope of concern. I've seen this easily placed back on the employer. Non-FOSS versions happen in various industries all the time.

You can make the argument that legal teams exist to relieve the very potent pressure of this uncomfortable situation for those on both ends. There are truths here that flow right up to the top and all the way to the bottom. Attacking those perspectives is too hard. Better in some ways to set up social norms around it. Thus, lawyers, the concept of win-win negotiation, etc. Just when your anger is boiling over you learn that the C-levels want you to drop the moral crusade.


This is a thoughtful comment; at the same time, it's some pretty scary postmodernism.

Advocating cheating because it helps a student from possibly lesser means achieve their 'check-box' in education is missing the point, in a very serious way.

We can always strive to have an open mind, to try and be sympathetic to the plight of others - but the Truth is pretty much still the Truth.

Plagiarism is not a controversial subject in the context you've mentioned, it's just wrong in every sense.

More pragmatically: if someone is in a situation wherein their need for the 'checkbox degree' outweighs their actual need to learn something, then they almost assuredly should not be there. It's pointless for society to spend a lot of energy and resources only for people to waste them, and it ruins the credibility of the system. I'm not unaware that a lot of Uni may feel like jumping through hoops, but even then, if the hoops are merely jumped through, we're learning something. Uni is not meant to be enlightening at every step; it's also, like everything, a grind.

A 'free public Uni' is going to attract a wide swath of students, and there will necessarily be all sorts of issues: probably very low graduation rates, challenges in communicating the material. I totally support the idea; at the same time, we should strive to maintain the credibility of our own ideals and institutions. Universities are there to help develop character; they're not just about 'absorbing data'. In the long run, it's worth it.


The "checkbox degree" hypothesis is mostly false. Some companies do require a degrees, yes. But that's a minimum requirement. Literally zero companies promise a job to everyone with a degree.

It's a bit difficult for me to believe that there are highly competent people who would be effective employees now, but for whom cheating is easier than doing the work. Remember: cheating (without getting caught) takes time too.

I can write a few-page essay on any given topic much faster than I can figure out how to copy/pasta that essay without getting caught.

I can implement most undergraduate programming assignments faster than I can figure out how to obfuscate someone else's code enough to fool cheat detectors.

Etc.

The subset of people who are truly competent but for whom cheating is easier than doing the work has to be vanishingly small.


It depends; like many things, this is very nuanced.

In my time at university, I knew many capable and bright people who cheated once in a while, but only in well-picked cases, not as a general policy: for example, a mandatory bullshit subject, or something they regarded as a worthless waste of time, where they'd rather spend the hours on the important things.

I think it's a very naive, shielded, "good boy" view of the world that there is some simple, rigid moral rule like never lie, never cheat, etc. It may work in a benevolent environment (rich, protective parents, never dealing with adversity). One has to develop one's own sense of justice.

This can be easily misconstrued. The point isn't to believe in nothing, be exploitative and selfish. Rather, be mindful and don't just blindly follow someone's bullshit. Indeed much of the purpose of education is to kill this ability and to certify the capability for blind obedience and jumping through hoops without ever questioning it.

One guy I mentioned above is actually really honest in general, and sometimes I wonder how he gets away with it in corporate environments: saying straight nos, not putting up with colleagues criticizing him for working too fast, etc. I've usually been much more careful, but he's more successful. It's an art of picking your battles and refusing bullshit, sometimes openly, sometimes secretly (at least don't lie to yourself), sometimes making a stink, sometimes just complaining to fellow students, and knowing the unwritten lore of which courses are unofficially considered "cheats allowed" by most talented students, and probably the teacher too.

The world is complicated, but for shielded kids with underdeveloped social skills it can be hard to learn how widespread "rule bending" is in real adult life and how much this is basically known, expected and part of life.

Again, this is not to say be selfish and disregard others. Rather, think for yourself and know when something is bullshit (there is lots of fancy, official, institutional, stamped-and-signed authoritative bullshit out there, often coming from people who know it's bullshit but either don't care or feel their hands are tied).


I agree. I'm a CS major and am forced to take a class on history. The class is entirely online and the teacher doesn't seem to care one bit. The entire class is just random quizzes that can easily be googled; just follow along on Quizlet.

I have two options: read the super boring textbook for 5 hours, take the test legit, and get an 80%; or Quizlet the test, get 100%, and spend the next 5 hours listening to Dan Carlin's "Hardcore History" podcast, which I find much more informative and enjoyable.


I had similar cases with "business" and "management" courses in my computer engineering degree. While understanding business thinking is definitely an important part of being a computer engineer, these courses were utter bullshit and everyone knew it. So everyone I knew cheated; it was an open secret.

But the university wanted to boast that it is modern and prepares students for business stuff: look, we even require business-related courses in our program! In reality it was nonsense like memorizing various lists (the 5 different aspects of whatever), some pseudo-mathy formulas, etc. I mean, if that's how seriously you take me, giving me this type of nonsense as "learning material", then I will take you precisely as seriously when it comes to the exam.


> Can you justify it in seconds, with something that's not simply a subjective perception or largely-covert moral construct of your own

You are making this way too complicated.

It's honesty vs. dishonesty.

Those of us playing on team honesty are right to view the dishonest as playing on the opposite team.

Which team is actually morally right, is the complex issue.


> You are making this way too complicated.

> It's honesty vs. dishonesty.

Kind of like you're either a criminal or you're not. If you drive 58 mph in a 55 mph zone, you are. Otherwise you're not.

Binary positions are always convenient, once you pick a side.


This is a great meta level view of the moral perception landscape, the individual replies advocating for one particular prescriptive position or the other are missing the forest for the trees.


> Since then I've been exposed to additional perspectives on plagiarism. It is an extremely deep and nuanced topic.

There are deep and nuanced ways in which people tell themselves and others how ethical they are while taking credit for other people's work.

Whether it's bad is not really up for discussion. The rules are pretty clear, and in serious higher education the punishments are extremely harsh. And deservedly so.

What does it even matter what it says on your diploma if you cheated your way to getting it?

These people might pat themselves on the back for being oh so clever in subverting the system or whatever, but what does it really mean? That you're good at cheating, unwilling to do the work, and glad to take credit for other people's work. In other words, a useless turd.

You could have been studying astronomy, physics, math ... but instead all that you really proved is that you're good at cheating.


People are extremely good at justifying their own actions in any shape and form. It reminds me of an ex's mother, who was a hardcore churchgoer and was not declaring profits from her company. When asked why she would do such a thing, her excuse was that the taxes were way too high (the country was within the top 20 countries in the world for taxes). Another thing is that these things usually don't happen all at once, but rather gradually. The same is observable in companies, where corruption starts from tiny things and evolves over time.


... the Testing Center!? Hey I went there too.

But I don't see how this is more nuanced than being honest or dishonest. I really pride myself on being able to see where a lot of opposing viewpoints are coming from, but I can't see how any of the people you describe are doing much more than lying to themselves about what they're doing.


How is this different to the following?

"Using counterfeit money is arguably OK because some of the people who use it might need it, or see the need to earn a living as a needless waste of time as long as there's idiots who'll accept fake currency".


The issue with counterfeit money is that it devalues the worth of all other money.

But given who holds the majority of the money being devalued, given the extent to which money is already being devalued for the benefit of some at the expense of others, and given that taking a portion proportional to wealth from the group for the benefit of the poor is a form of redistribution the group already engages in regardless of the consent of the giver, I think you can easily construct an argument that it is not immoral to use counterfeit money as long as you pick the right targets.


Plagiarism as a means to acquire a university degree devalues the worth of all other degrees...


Plenty of people can construct arguments that breaking the rules is acceptable when they do it.


Wait, this is interesting. I thought plagiarism is universally bad? Can someone explain?


themodelplumber is a perceptive observer of human psychology, and they've described some of the ways people justify cheating with one narrative or the other. This isn't advocating for cheating or plagiarism, just hypothesizing why some brains are happy to do it.


People have to justify bad things otherwise they wouldn't do them. This is how we get racism, antisemitism and discrimination in general. A lot of taboo activities have an initial moral barrier and once you break it there is nothing stopping you from doing it over and over again.


Plagiarism was very widespread at my university. The students doing it were simply not punished.


> At the time, my hyper-religious mind was blown to see students outright cheating in the Testing Center.

Then you would have lost your mind during some of my upper-division STEM courses, where the normal class average is in the mid-40s (percent) on the high end and the high 30s most semesters. We had a class average of ~83% in one of them. The professor was in total disbelief about how blatant the cheating was (several students didn't even try not to get 100%); he got upset and decided to stop re-using the previous year's exams, which were circulating amongst the frats/sororities, and the class average dropped back down to the low 30s for the last 2 midterms and finals. This personally helped me, as I got my average into the B+ range after my lab scores were included.

Academia, be it online or IRL, is always going to have that element of corruption; there has been so much money to be made that it drew the worst from other sectors into its administrations. Eric Weinstein goes into depth about the artificial scarcity within STEM that was created in the US to favour a 'race to the bottom' approach to wages since the 90s, and he went to both MIT and Harvard!

My best professors were often disillusioned and jaded after having been in academia for a few decades, one even having had to petition and protest to the dean of his department to be paid for having taught back-to-back summer courses during the budget cuts. A person who went to Oxford for his BSc, no less...

All of this takes place while some high-ranking official or executive is in the process of being disgraced for having plagiarized their Master's or PhD thesis decades ago, which raises the question: how the hell did it pass back then? Don't they all have to go through dissertation defenses and have academic advisors who are PhDs or postdocs in the very same department?

I won't even talk about the administration-led favoritism and vetting they did for foreign students during my time there.

I'm a proponent of online learning in general, especially as it's disrupting the brick-and-mortar university exploitation model: you can now get a Master's in Electrical Engineering from CU Boulder entirely via Coursera for $20k, with installments and financial aid available where applicable, whereas that's not even the total cost of a single year of undergraduate studies there. That really sucks for incoming students, but it could be a net boon if online programs can start reaching undergrad degrees in time, now that most universities are having to go online due to COVID.

I just think it's worth keeping in mind that this institution (academia) doesn't deserve to be regarded as anything more than the corrupt money-pit that it is, one which occasionally avails itself to allow brilliant minds to excel in their field, after having exploited them to publish and use its labs until they can make a name for themselves and operate outside of it. I just think about the wasted human capital, talent, and drive that you only have when you're young, naive, idealistic, and determined to make a difference in the world.


All universities have pretty serious problems at this point, but we can't write off all of academia as a "corrupt money-pit." It sounds like you are generalising from your own experience at a single institution, and there are a lot of bad universities out there.


This detailed reply is fantastic. I want to paint a narrative in acrylic. (Perhaps just my mood today.) It will have a plumber modeling on a catwalk as a moral war rages, cool fashion balconies meet the vision of the dog fights in a proverbial rotation of another fresh coat to the red carpet. See you in the concrete warehouse just short of arm's reach, but certainly out of the limelight. Or in. Or out. Or in. Which is real?

You're welcome!


> The new institution, learning I had been attending U of the P, promptly told me that U of the P credits would not transfer. Universal transferability in question, I opted out.

Yes, many schools will not accept coursework from nationally accredited universities. This includes coursework from UoPeople. You will not be able to apply to many graduate schools either. A degree from UoPeople definitely comes with limits, and I wish the school were a little more upfront about that. (There are lots of discussions about this on the internal social network.)


This isn't restricted to untraditional schools. At least one of our local state universities has stopped accepting credits from the community colleges, due to the practice of transferring in for the final couple of years. Might be something similar going on here. It might be that they want the degree to represent the full 4 year experience at the school or it might just be the money. The cynic in me prefers the latter.


If that were happening in my state I'd definitely be writing my representatives to take action and change it: transferability to a university, while being able to live at home and pay less, was quite explicitly one of the points of having a community college system when I was growing up.


Many state universities changed those transfer policies when states made deep higher ed budget cuts during the great recession. The alternative in our state was closing branch campuses (thereby making college even less accessible for most of the state).

Lots of universities in the US are lavish and expensive. Our state's branch campuses are the definition of utilitarian. Tuition is some of the lowest in the country. Salaries for professors at the branch campuses are lower than what we pay our high school teachers. Very little admin overhead. There's simply not much to cut.

Anyways, not saying it's right. But if you cut higher ed budgets, higher ed will get more expensive to the end user. Especially if the system was already hyper-efficient. Limiting transfer credits is pretty much the only way to make things more expensive to the end user without increasing tuition.


Sounds like the setup in your state is to have branch campuses try to serve many of the needs that community colleges serve (or used to serve) in my state.

Whatever the setup, I think access to higher education at a local level is important.


A number of states ended up enacting laws to solve this exact problem. Lawmakers had to force the big state universities to accept transfer credits from community and regional colleges.


> Might be something similar going on here. It might be that they want the degree to represent the full 4 year experience at the school or it might just be the money. The cynic in me prefers the latter.

Everything I've ever heard and learned about the college admissions and pricing situation makes me prefer the latter.


That's surprising. Around me it's mostly been the opposite, with schools signing transfer agreements and the community colleges developing programs specifically for students who plan to transfer.

Though, in the case of UoPeople, it's purely about national accreditation lacking the prestige (and sometimes rigor) of regional accreditation. It's been that way for at least a couple of decades, since I started reading up on higher ed.


A degree says something about the student. I completely understand why universities would not want to accept credits from schools below their own qualification level. If they confer a degree on a student, that student represents the university's education quality to everyone who hires them or works with them.

Spending three years at a community college and expecting the credits to transfer to a major university makes no sense for the institution. At that point they did a minority of your training; why would they recommend you?


I follow your logic, but I really think it's about money since they also put up barriers for testing out of classes.

Usually, a strong state-run community college program lets an Associate's degree earned at a community college transfer into any public state university at junior standing. Most non-degree-focused classes transfer directly. Going between states (even within the same regional accreditation) seems to be difficult, and not all states are set up as I described. The theory is that someone who isn't quite ready for university gets an opportunity to catch up, and in my experience it gives more options when popular prerequisites are full. So the state has a reason to push for this.

I'm also skeptical about universities pointing to their qualification level, because I've interviewed graduates from many well-known universities and was kind of floored at the basic things many candidates seemed completely unfamiliar with.

Nationally accredited organizations don't have anyone pushing for this relationship. This seems to generally work for those schools because they can advertise as accredited and get a fig leaf of transferring credits, but they usually want you to stay and complete training there. Most tend to be vocational, which makes a lot of that moot.


Your example (3 years) is extreme. I could not get 15 CS credits to transfer from one private, 2nd-tier US college to another (Boston University). BU refused, and I took the classes again and paid for them twice. It put me in the rare position of being able to compare the quality of education at the two institutions side by side.


Hah! Don’t leave us hanging. What was your experience between the two for classes?

I took some community college programming classes in NJ and NYC. Didn't have to repeat the ones I did. I was in awe of how weak the curriculum was in both cases. It's not the students' fault, IMO, that their programming skills were lacking. The classes weren't challenging.

Yet some of them eventually transferred to State Uni for comp sci. It really amazed me. Wonder how they fared and if that just means that specific state school is just really easy too.


I'd be curious to hear your observations. I did BA/MS at BU in CS (CAS of course) and found the classes often hit or miss, but the ones that were good were really good. That being said, I did have an interest in theoretical CS while many CS undergrads do not, which is what the department emphasizes.


The student:professor ratios at Boston University CS classes were much higher. I didn't realize the importance of that until i had this comparison with the other institution.

It led to many fewer questions from students, to teaching assistants (something the other institution did not have and did not need, because one professor could attend to all of his students), and in general to less interaction among students and between students and professors/TAs.

The curriculum at both was essentially the same for this set of classes.


I agree, except for the fact that these are both state-run public schools, part of the same state school system. Citizens should be able to switch between them freely based on the needs of the time and how they wish to pursue their education.


CCs don't teach junior- or senior-level (300 or 400) courses. If you stay at one longer than two years, it is because you are going part-time or started with college-prep courses.

I spent 3 years (including summers) at a CC, then transferred into a top-10 engineering program. The 'extra' time was required because I needed to take a year of math classes before calculus.

And IMO, the CC's pre-engineering program left me better prepared for the university's department/major coursework than some of my peers who started at the university.


> Spending three years at a community college

This makes no sense. Spend as long as you want at the community college and transfer all the credits you want; by definition they will all be 1xx or 2xx classes. Your degree will require half the classes to be 3xx & 4xx, so you will always have to spend at least two years at the university.


The STEM in me says it should all be the same.


one of our local state universities has stopped accepting credits from the community colleges

Do you remember which state and university?


Yes, I was slightly miffed. Yet the professor was notably intelligent and very professional. My comment was for others wondering if it was a scam. It's not a scam; it's real, but it's best suited to people seeking a multicultural experience along with their general educational knowledge, regardless of the near-guaranteed nontransferability of credits. Perhaps if one is retired, for example, it would simply be an amazing choice. :)


Speaking to your comment about plagiarism: in my second term, there was a student who submitted their Java homework as a Word document. It contained two screenshots. The first was of the NetBeans IDE. The second was of their browser opened to a GitHub page that contained the answer to the assignment. They'd non-destructively cropped the second using Word, making it look like the code had been typed into NetBeans. I was able to track down the original source.


Sometimes laziness shoots the moon and becomes artistic genius.


>there were a number of classmates whom could not grasp

Sorry, pet peeve. It's "who". "Classmates" are the ones doing "could not grasp" instead of "could not grasp" being done to them.


An easy trick I use to remember which one to use is to replace who/whom with he/him and see which makes more sense in context.

"He" could not grasp. => "Who" could not grasp.

I should vote for "him" => "Whom" should I vote for?


Not to invalidate your observation, but plagiarism is a hot topic at any college or university. I attended 6 different ones, and at all of them it was at or near the top of concerns.


I am sorry to hear that your transfer of credit was not possible between UoPeople and the more standard institution. But I don't think you should feel bad, or conclude that the failed transfer means UoPeople's standards are low or questionable. Transfer of credit depends on the university in question: University A may refuse while University B accepts, and both may be of high standard. There are high-standard universities out there that accept transfer credit from UoPeople; all you need to do is research which ones before you apply, rather than applying first and then requesting a transfer of credit that might not be possible.

As for plagiarism, it is a serious offense, and at UoPeople many students have not grasped the idea because they did not take their orientation classes and foundation courses seriously; otherwise they would not be ignorant of it or fall victim to it.

Putting all this together can truly make one feel that study at UoPeople is not like higher education. On the other hand, all things being equal, if you are a student at UoPeople, you should take responsibility for following all the instructions and rules governing the platform through to the end of your study; then you will feel that UoPeople is more than a higher education center.

NB: There is no institution in the world without a downside that affects its students.


>there were a number of classmates whom could not grasp the idea of plagiarism being unethical

It might help to define what form of plagiarism you encountered, as there are some behaviors that count as plagiarism (at least, they were counted as such back in my college ethics class) for which I never found any valid reasoning. I have no difficulty understanding the ethical issues of passing off someone else's work as your own, but you can also plagiarize by passing off your own earlier work as your own new work.


I've just completed my first term with UoPeople. I agree that I learned a lot about plagiarism and even had the opportunity to report some!

For me UoPeople is a blessing. I'm able to pursue a higher education all while working and supporting my family. This will give me so many new opportunities for better careers when I eventually move back to the states.

The educational method is challenging. I do like peer assessment, but I feel that some peers don't really try when grading. I like to give grades that are warranted, good or bad. Being told my work is not good with no correction is highly vexing.

I hope to continue to grow as a student with UoPeople and see others do the same.


I mean, that's probably true, but even students at the absolute top universities are capable of this.

https://en.wikipedia.org/wiki/2012_Harvard_cheating_scandal


The more students you meet at "top Universities" the more the illusion of superiority disappears.

The students are mind-numbingly average. (But don't worry, they will let you know what college they went to 12 years ago.)

Heck if the job requires any communication, you might be better off finding someone humble or someone who needs to prove themselves.

Or maybe the best of these students go on to work at companies that pay 250k/yr and I just don't meet them.


I attended a no-name college and went to a top-tier university for my phd (think mit/stanford/cmu/berkeley). I also taught at a different top-tier university. Everything below is specific to CS.

I think that top-tier universities are different in the top 20% and bottom 20% of students. The top 20% because you won't find those types of people at no-name. The bottom 20% because they're at least not completely incompetent (unlike the bottom 20% at no-name).

Here's how I break it down:

Top 20% of students at top universities: They really are that good. Also, they are the ones who benefit most from the intense curriculum at top universities. So they start off really good and actually do get a force multiplier effect from the comparatively very rigorous course-work. Most of these students end up on rapid trajectories in industry or with NSF fellowships.

Average students at top universities: Most of these students would be in the top 10% of the class at a no-name college. Still quite good, but nothing special.

Bottom 20% of students at top tier universities: Really nothing special. Comparable to the average student at no-name. Contrast with the bottom 20% at no-name, many of whom couldn't fizzbuzz on their first attempt in our senior interview prep course.

So, top tier places are really characterized by their top 20% and bottom 20%. The top 20% really are quite amazing to work with. The bottom 20% are at least not incompetent.

One last note:

> Or maybe the best of these students go on to work at companies that pay 250k/yr and I just don't meet them.

Most of my students at top-tier had offers in the $150K - $250K range. So if your sample of top-tier university CS students is sampled exclusively from non-entry-level engineers who did not receive offers in the $250K range, you're almost certainly sampling from the bottom 20% of top-tier graduates.


> However, I also saw a grossly wide range of educational professionalism in the students

Your specific example of plagiarism is odd; I'm not sure what's difficult to grasp there. But wide ranging dedication (or professionalism if you prefer to call it that) is pretty standard at brick and mortar universities - it's pretty standard for a subset of people to prefer the partying and socialising aspects, and also pretty standard for quite a lot of people to drop out altogether.


The saying goes: copying from one person is plagiarism, copying from many is research.


But in the latter case, sources are properly referenced?


plagiarism was how the germans caught up on the industrial revolution. plagiarism is how china caught up to the western world. the strict argument on this is that ideas cannot be owned. no one owns an idea. ideas just float around and can be left aside or captured. that's how i think about it anyways.


There's a moral argument w/r/t the concept of stealing IP, but in the context of education there is arguably more damage being done.

If you steal a design for a jet turbine, you still have to learn how it works and you add that understanding to your personal/corporate skillset. Basically you have learned a secret.

If you have someone write your college essay for you, then you probably do so in order to avoid putting in the work to write the essay yourself. Basically, you have learned nothing.

I know there's nuance to both but I think your argument is overly-reductive.


Oh yeah, you're right, let's not mix the two. I thought he strictly meant the first.


I took a semester of courses from UofP. It had a wonderful diversity of ages, genders, and nationalities. Interacting with those people through peer assignments was interesting, and I applaud what they're trying to do.

Ultimately, however, I found the entire thing way too tedious - full of the classic "make-work" and silly hoops. The level of pedagogy was very basic.

I dream of (and am actively working towards) a day where the computer's potential as a new educational medium is fully realized, rather than the current parade of attempts at transplanting a brick and mortar classroom into a remote asynchronous delivery system.


Are you willing to share more about what you're working on? It sounds interesting.

One language app I tried for a few minutes was a VR app that put you, for example, on a train, and you had to have a relevant conversation. It was cool, not sure how effective because the GearVR I used at the time made me sick too quickly.


(Disclaimer: Most of what I say here has caveats and qualifications – hard to capture full nuance in a comment's space.)

One of the distinguishing features of computers is easy simulation. Humans learn well by direct, tight interaction with a system – poking at a thing to see how it works. Tight feedback loops are how we quickly build intuitions.

Explorable Explanations [0] are a great extant example of what I mean, but we can do more... Imagine a "textbook" constructed around a well-built simulator (extant example, Earth Primer [1]). Sections of the textbook would present the simulator configured in a pre-set state, with various simplifications and initial conditions. Students can poke and prod these simulators, reset them, test out new states, and answer questions. Assignments could be on the order of "Given Sim[Initial Conditions], figure out what gets you to Sim[Desired State]".
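
To make the "Sim[Initial Conditions] -> Sim[Desired State]" idea concrete, here is a minimal Python sketch. All names are invented for illustration; this is not the API of Earth Primer or any existing explorable:

    # Toy logistic-growth simulator: one tweakable parameter per concept.
    class PopulationSim:
        def __init__(self, population, growth_rate, capacity):
            self.population = population
            self.growth_rate = growth_rate
            self.capacity = capacity

        def step(self):
            p = self.population
            self.population += self.growth_rate * p * (1 - p / self.capacity)

    # Sim[Initial Conditions]: the chapter hands the student this state...
    sim = PopulationSim(population=10, growth_rate=0.3, capacity=1000)
    for _ in range(50):
        sim.step()

    # ...and the assignment asks which parameters reach Sim[Desired State],
    # e.g. a population stabilizing near some target value.
    print(round(sim.population))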

Current explorables are lovely, but (usually) incredibly bespoke. One concrete improvement would be to produce an OpenSim standard, which would allow other content creators to embed and customize these sims to construct new narratives.

Another powerful computer affordance is extreme specificity – a system should be able to build a representation of a learner's current knowledge / skill graph, and figure out a (or many) shortest path(s) from current knowledge to desired knowledge, scoped to that user's interests. (Many LMS systems attempt this, but we're still in the early, clunky days).
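
A toy version of that path-finding idea is just breadth-first search over a prerequisite graph. (Topic names are invented; a real curriculum graph needs AND-semantics, i.e. all prerequisites satisfied, which this sketch ignores.)

    from collections import deque

    # Edges point from a topic to the topics it unlocks.
    unlocks = {
        "arithmetic": ["algebra"],
        "algebra": ["calculus", "linear_algebra"],
        "calculus": ["machine_learning"],
        "linear_algebra": ["machine_learning"],
        "machine_learning": [],
    }

    def shortest_learning_path(known, goal):
        """BFS from topics the learner already knows to the goal topic."""
        queue = deque((topic, [topic]) for topic in known)
        seen = set(known)
        while queue:
            topic, path = queue.popleft()
            if topic == goal:
                return path
            for nxt in unlocks.get(topic, []):
                if nxt not in seen:
                    seen.add(nxt)
                    queue.append((nxt, path + [nxt]))
        return None

    print(shortest_learning_path({"arithmetic"}, "machine_learning"))
    # -> ['arithmetic', 'algebra', 'calculus', 'machine_learning']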

Next we have non-linearity. Most textbooks and courses impose a false idea of topic dependence. Yes, there are some intrinsic dependencies, but there are many MANY more ways (orderings) of moving through a learning space than your textbook's chapter ordering suggests.

I can go on like this forever. I myself am currently focused on constructing usable knowledge graphs (and localizing incoming students on them), and simulations students can easily play around with to build powerful intuition. For the latter, we've found the wealth of STEM tools available in Python to be a huge boon. We (I work for an ed-tech non-profit startup) usually teach students some programming, and then have them start building models (physics simulations) or interacting with existing toolkits (such as, recently, the Rosetta protein modeling suite [2]). These tools act as a forcing function for real-world relevance, and allow for direct intuition building over rote memorization. More effective, and much more engaging.
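
For flavor, here's a generic sketch of the kind of first physics model a student might build (not our actual curriculum code): projectile motion via Euler integration, which can be checked against the closed-form answer.

    # Projectile motion with simple Euler steps (no air resistance).
    dt, g = 0.001, 9.81
    x, y = 0.0, 0.0
    vx, vy = 10.0, 10.0          # initial velocity in m/s

    while y >= 0.0:
        x += vx * dt
        y += vy * dt
        vy -= g * dt

    print(f"range ~ {x:.1f} m")  # closed form: 2*vx*vy/g ~ 20.4 m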

To address your comment: VR (and IMHO, AR) look to have a lot of potential in the future, though they are still in their infancy. I think there's a lot of potential for the humanities to provide more immersive experiences of places and time periods. (Re humanities, simulations also work quite well. Mock trials, political re-enactments, etc. There's a reason simulation games are so popular...)

[0] https://explorabl.es/

[1] https://www.earthprimer.com/

[2] https://rosettacommons.org/


A knowledge map (or graph) is definitely something that should be used more intensively in course structure and instruction.

I remember very well reaching an understanding of a subject by different means than what the teacher expected (or was taught to expect). That path of learning was rejected (despite the result being the same) and accordingly graded as a failure. Such resistance to different learning paths, a "status quo" of how people should learn, and the tendency to try to fit everybody into well-defined boxes are frustrating and counter-productive.


This is fantastic. Not that you’re lacking in ideas, but as you mention the value of simulation I wanted to ensure that you had seen this historical case study: https://obscuritory.com/sim/when-simcity-got-serious/


Literally all of these ideas are ones I've had for years in the education space. I'm glad someone is actually doing them.


You might also be interested in what I'm building at https://learnawesome.org : a rich learning map of educational resources, augmented with tools like spaced repetition, notes with bidirectional linking, and project-based learning cohorts with peers and mentors.
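
For the curious: most spaced-repetition tools descend from SuperMemo's classic SM-2 scheduling rule. A simplified sketch (illustrative only, not our exact implementation):

    def sm2(quality, reps, interval, ef):
        """One review of a card under the classic SM-2 rule (quality 0-5)."""
        if quality < 3:                      # failed recall: reset, see it tomorrow
            return 0, 1, ef
        ef = max(1.3, ef + 0.1 - (5 - quality) * (0.08 + (5 - quality) * 0.02))
        if reps == 0:
            interval = 1
        elif reps == 1:
            interval = 6
        else:
            interval = round(interval * ef)  # gaps grow roughly geometrically
        return reps + 1, interval, ef

    # Three successful reviews of one card: intervals go 1 -> 6 -> 16 days.
    state = (0, 0, 2.5)
    for q in (5, 4, 5):
        state = sm2(q, *state)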


Looks very cool, thanks


I'm 100% on board with these ideas, and I've also been following a lot of these ideas for a while. I'd like to learn more about the startup if possible, my email is in my profile if you don't want to post here.


Likewise, I’d love to learn more about what you’re doing! I’ve found the learning-by-doing paradigm has worked for me (e.g. with learning ML concepts by applying them alongside reading theory, rather than after) and your simulation driven learning using cutting edge tech and applications (like your protein folding example) is a very intriguing extension of that idea. Email also in profile.


Wow, that kinda sounds like my dream job. Do you mind sending me a plain link in an email?


A young lady's illustrated primer? ;)


Sure, but as of today (2020), it would probably be GPT-2 (or -3) filling in for the voice actors, which would create students extremely well educated in knowledge that is 60% real. Maybe that's not so different...


The greatest potential benefit AI has for humans is the ability to act as a life-long personal mentor: to collect, filter, summarize, translate into a form best suited for the individual, and communicate that information. This matters especially as we enter the "Post-Information Age", where the amount of information becomes so large that traditional methods of management, such as search engines, become unusable.


> I dream of (and am actively working towards) a day where the computer's potential as a new educational medium is fully realized, rather than the current parade of attempts at transplanting a brick and mortar classroom into a remote asynchronous delivery system.

I assume you already know about PLATO, yes?


Wow it's been a while since I've read about PLATO. Thank you for the reminder. Another example of how so much that's cool in computing has its roots in systems of the past (the mother of all demos, neural nets, LOGO (Mindstorms), etc.).

My ideas aren't novel, but every decade we try again and hope all the pieces are in place to start something which grows and thrives.


Assuming you're not talking about the ancient philosopher, I'd like to know what you're referencing. Searching for "PLATO education" isn't finding anything beyond the expected.



I've been thinking about this as well. There is so much potential given the vast number of free and incredible resources online. Lmk if you're interested in working on something together.


I started building this last year with the same realization: https://learnawesome.org


The first question I always ask of these weird online universities is: are they regionally accredited? National accreditation is not really a high bar and isn't considered competitive; all legitimate institutions of higher learning should qualify for and receive regional accreditation from an appropriate authority before studying there can be considered in one's best interest.

This online learning site is eligible for regional accreditation as of 2020, but they really play up the eligibility in a way that seems suspect. They aren't clear about whether they have even applied, but they have 5 years to apply before they have to undergo a redetermination of eligibility.

Not saying this is a scam, but it feels like a scam.

This site was very hard to find, for what it’s worth.

https://www.uopeople.edu/student-experience/quality/accredit...


> Not saying this is a scam, but it feels like a scam.

Not directly related to your specific concern, but every attempt to make an open/democratized degree has felt like that to me, unfortunately.

Example: There was an online degree program I saw that awarded a real, regionally accredited degree from a state university. The degree and coursework were exactly the same, but the online program was basically open. No admission requirements except to pay for and pass some pre-requisites that would then grant you access to the rest of the degree. The equivalent brick and mortar program went through the typical admissions process.

My first thought: "this sounds very close to for-profit school".


I've attended several schools online at both the undergrad and graduate level, including a for-profit school. I don't find much difference between them because they've all followed an asynchronous approach.

However, UoPeople is different in that it's much, much less instructor-led. They use a "peer learning" pedagogical approach. It's something I quite dislike about the school. I consider it to be poorly implemented, and it leads to problems like "revenge grading." However, I tolerate it because it allows me to study in a structured environment for less than I'd pay at my local community college.

It's not something I would recommend for weak learners and/or for many who need a regionally accredited degree.


I'm there not because I need it either. I graduated from a software bootcamp and I've already been working for 2 years, so I'm good. However, I found out about UoPeople and researched them a bit, e.g., I called a local college's HR department to find out if they'd recognize someone with a degree from there to teach, and they said yes. My next thought was to review the visa process for a couple of countries; this is my primary interest and motivation. None of them specified where the degree came from, only that it was from a sanctioned/accredited institution.

So I don't really care about the details. I learn more, great, but otherwise I'm happy because I just want to be able to apply to get into Europe or Asia as a software developer in the future and not worry about whether I have enough points, etc to pass the rigor.

All of that being said, the first course was a gong show with lots of academic violations, but like all things in life it's what you put into it and I plan on graduating with Honors, so even if it's a weak degree I'll still be able to respect my work and effort.


I assume by "first course" you mean UNIV 1001.

For those unaware, that's the "study skills, introduction to college, college success" course.

Yeah, that was pretty awful. I don't mind the material. It was nothing new to me, but the quality of discourse, grades, assessments, and so on--ugh.

With each step through the CS prerequisites, things have improved. So it seems the less-prepared students either wash out or get stuck doing general education courses. (Thankfully, I've got all that covered in transfer.)


Can you give some examples of 'peer learning'? I noticed OP said that about plagiarism and I was curious how that works.

The problem with all forced group work and not being able to choose the group is you get a very mixed quality of effort and talent. I'd stay away from any program that pushed that heavily.


"Peer learning" is implement primarily in two ways at UoPeople.

The first is through discussion forums. We are required to post a response to a discussion prompt each week. We return to write a minimum of three responses to our peers, and we also assess their response on a scale of 0-10.

The idea is that peers will touch on different aspects, extend one another's learning, and engage in further discussion. In reality, we end up getting "great post" responses most of the time and minimal effort put in by our peers. That's not always the case, but it happens enough that I don't think it's actually helpful or a meaningful approach.

The other means is through peer assessment of regular assignments (e.g., 'papers,' programming assignments, etc.). We submit these one week, and then our peers read/study and assess our assignments based on established criteria. Peers are supposed to provide feedback on each aspect and then an overall response.

Again, the idea is that we learn through studying and responding to our peers. In reality, most students make minimal effort. So most of the time my assessment responses are limited to "good job." So, not really beneficial. (I've had a few really good responses, but it's rare.)
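
To illustrate why a single bad-faith peer matters, and one obvious mitigation (I don't know what aggregation UoPeople actually uses), here's a sketch of a trimmed mean over 0-10 peer scores, which blunts a lone "revenge" grade:

    def trimmed_mean(scores, drop=1):
        """Average peer scores (0-10) after dropping the extremes.
        Purely illustrative -- not UoPeople's actual grading rule."""
        s = sorted(scores)
        kept = s[drop:-drop] if len(s) > 2 * drop else s
        return sum(kept) / len(kept)

    print(trimmed_mean([9, 8, 9, 0]))  # the lone 0 is dropped -> 8.5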

As for group work, I'm not aware of any instances of that at the school. (I've had it in graduate courses in the past, and I don't like it.)

I really do not like the "peer learning" system at UoPeople. I think it can work, but I don't think they've done a good job of implementing it. And the instructors I've had to date have only minimal interaction with students. (Though, I don't really need it, so it doesn't concern me much.) I've considered transferring several times, but the idea of paying $1-2k per course at a state school vs. $100 per course at UoPeople doesn't appeal to me. (As I've noted elsewhere, I'm not doing this primarily for the degree. Instead, it's about learning for me.)


My local community college's online classes have that same mandatory write-something-and-respond-three-times format. I really disliked it. Forcing that sort of thing with such specific requirements meant the majority were talking without much care at all. I was the same, as I don't like the forced interaction. At least the way it was set up.


This is a common approach used in online courses. I've taken both undergrad and grad courses which required it, and it doesn't really help raise the level of discourse. I don't like it for that reason. It comes across as "make work."


>It's not something I would recommend for weak learners and/or for many who need a regionally accredited degree.

I should hope not given the fact that they are not regionally accredited and now that they have finally applied they likely won't be for another 2-3 years. Assuming they pass.


In principle, this could work if the school maintained high standards and allowed everyone to attempt to pass their courses and eventually attain the degree. Having an online platform without limitations (or with fewer limitations) on instructor resources allows legitimate programs to pursue this model.

To my knowledge, the difference with for-profit schools is that you are essentially guaranteed a degree by enrolling and paying the tuition and fees. If the for-profit institution actually had a good reputation, I can imagine they could also implement more restrictive policies. But the way academia works, a large part of prestige comes with history - and this makes it nearly impossible for new schools to break into this echelon of elites. I suspect it's also for the same reason that for-profit schools are not able to consistently attract talent in terms of instruction, since the ones who can would prefer to be employed by more prestigious institutions.


> To my knowledge, the difference with for-profit schools is that you are essentially guaranteed a degree by enrolling and paying the tuition and fees.

A large majority of students who start at the University of Phoenix fail to graduate. The guarantee you speak of does not exist, not because the standards are high, but because the average student admitted to these universities is incapable of the work, or too disorganized to do it. There are plenty of not-for-profit universities that admit similar students, and they don't graduate either.


I won't name names since I've buried the hatchet but here's a story along those lines.

During a hotly contested political debate in my city a person who called himself a doctor (psych therapist, to be precise) showed up at a council meeting and threatened a city council member from my district.

Being the data-fluent person I am I offered to help, so I pulled all of the person in question's educational, legal, and accrediting history to see what would turn up.

It turned out that the person was a state employee before moving to my state but was fired after burning a car for insurance fraud during a divorce. In fairness to him he admitted to that crime and paid his fines, restitution, and community service and what not, but he could no longer work for the government in his previous role (police) after being convicted. So he managed to get himself into an online psych program like the one you're describing (from a state university, but awfully scam-looking), which got him an Education degree, not a medical degree. In short, he could have been a high school guidance counselor and that's about it. But what his online undergrad and master's degrees in psych education did get him was the ability to make his entry into a PhD program look legitimate. But wait, you say, what would make any PhD program accept such a person who had a sketchy online undergrad and master's degree?

Presumably it's a lot easier to stomach such a PhD candidate if the university in question is in the African nation of Malawi, and is affiliated with the evangelical church that the person in question attends in the US. The Malawi university was one of the African universities that was churning out PhDs in suspect STEM disciplines back in the early 2000s to go around publishing papers about "intelligent design", if you remember those days. If you don't, it was the latest in a long line of hare-brained evangelical attempts to end-around Charles Darwin in textbooks in the early 2000s. To this day, the person in question advertises himself as a "Christian hypnotherapist" providing all of the usual counseling services, and has an ID in the government medical provider database so he can accept Medicare, Medicaid, and insurance payments. The state I live in has since tightened up applications for medical licenses for people moving in from out of state. Previously, they had what seemed to be a fast-track process for people licensed in other states that did not involve re-checking educational credentials (that was in the late 90s/early 00s).

So I could discern from looking at all of this that there was likely some sort of coach, seminar, consultant, or similar that was training people on how to do all of this back then. If not, this guy planned the whole thing as his escape from the car burning bit and new career as a psychiatrist with shady credentials. First he got the state university degree, then the state university online masters which was fairly worthless, then the shady online PhD from Africa, then as soon as the ink was dry on his conditional (based on criminal background) therapist license in the state he lived in he moved to my state, and applied for a new license based on his existing license in his previous state, all in an attempt to get out from under the criminal record and reinvent himself as a shady psych therapist.

And it worked! As far as I know, his "medical" practice is open today.

In the process of digging all of this dirt I found other professors at other religious affiliated universities in the city I live in that had very sketchy African university credentials, including one listed as a neurosurgeon. I also found quite a few high school teachers who were making regular trips to African countries, including Malawi, from this area that they surely could not afford to make on high school teacher salaries. Those teachers per their social media profiles were all associated with the two evangelical sects in question that the car burning cop therapist and the shady neurosurgeon came from.

The likely erosion of educational legitimacy from all of this (job markets, US university professorships, professional licenses, etc.) is fairly obvious, I'd say.


> If not, this guy planned the whole thing as his escape from the car burning bit and new career as a psychiatrist with shady credentials.

Psychiatrists are medical doctors, and the rules on who gets to practice in the US are for practical purposes written by the AMA. That guy was not a licensed psychiatrist. He wouldn’t even be allowed to sit the first levels of the requisite exams.


You're right, I mixed up my 'iatrists and 'ologists.

Looks like the edit window has passed, sorry!


This sounds really interesting. Would you specify which state university?


Sure, it's the University of Colorado Boulder master's programs hosted through Coursera:

https://www.coursera.org/degrees/msee-boulder

https://www.coursera.org/degrees/master-of-science-data-scie...

The fact that they're master's programs with performance-based admissions makes them even more suspicious to me, but if you're looking for bachelor's programs like this, I think the ASU bachelor's hosted on Coursera also has such an admission path.


How far do they go on the performance-based admission? Must you have a Bachelor's to be admitted? I'm currently in the last year of my Master's and did not fulfill the usual entrance requirements, but CEFIMS will let you into their Master's programmes sans Bachelor's, though there is a bit of hoop-jumping, and it's difficult to finish in the two-year minimum that applies to students who start with a Bachelor's.


Nope, there's no bachelor's degree required. I was actually pretty surprised by this when I first heard about it, so I emailed them. From their response, I quote:

Our program aims to make admission less bureaucratic (no degree, no GRE, nor recommendation letters, nor statement of purpose required), but we still are appealing to students who have the near-equivalent of a Bachelor’s degree, whether through academic study, years’ of industry experience, or, being self-taught to a high level. This is a rather rigorous program at the Masters level, equivalent to the content taught for many decades in our on-campus courses, but presented in slightly smaller pieces and in 8-week sessions, rather than 16-week semesters.

I have heard of people getting entrance to a master's program without a bachelor's degree, but not in technical disciplines and definitely not in a manner this open.


Thank you. If I ever want another Master’s I’m going to keep this in mind.


So what makes regional accreditation so great? Do other countries have it? Why can't a major regional body go national? Is it the law? Everything from that to textbooks sounds like a scam.


To somewhat oversimplify, the regional accreditation bodies (about a half dozen of them) are older and better established than the national accreditation bodies. They generally have 100+ year histories, accredit all of the major universities within their jurisdiction, and in general have acquired a great deal of respect throughout US academia.

National accreditation boards in general, and especially those that are not field-specific, are seen as being less rigorous, faster to accredit, and more profit- or prestige-motivated than the regional accreditation boards. Most are relatively new and some of the most prominent (e.g. ACICS) have been involved in specific controversy over the quality of their work. Because the regional accrediting boards are older and better respected, the first question about a university that advertises a national accreditation tends to be "why aren't they regionally accredited?". It is both symptom and cause of this difference that most for-profit and otherwise "questionable" institutions are nationally accredited and not regionally accredited.

While some national accreditation bodies are considered rigorous and respectable (e.g. ABET), they tend to be field-specific (in the case of ABET, engineering) and often require regional accreditation as a prerequisite to accredit a university, as they consider regional accreditation to be the indication that the broader learning institution is able to provide a quality program. Usually specific departments or colleges of a university will go to these types of accreditation boards to add a credential to specific programs (e.g. "ABET accredited computer science program") above and beyond the university already having a regional accreditation.


Short version: It's not great, but if you want your credits to transfer, you go to a regionally accredited school. It's old and spent a large part of its history entrenching itself in the American education industry, therefore it's automatically granted legitimacy.

There are schools with a shit curriculum that have regional accreditation. The reason it matters to Americans is because most large universities are significantly more accepting of transfer credits that come from a regionally accredited school.

I've been to nationally and regionally accredited schools. My degree was from a top 200 school in the US. The classes I took at the "lesser" institution were significantly more rigorous and significantly more effective at teaching skills relevant to the major I was studying.

I've known several people that went to schools that have been sued or otherwise have terrible reputations for being predatory for-profit schools, like ITT and the University of Phoenix. Their coursework was frequently more difficult than mine.

To be fair though, those schools were sued partly because of their shady high-pressure sales tactics and for tricking people into taking on massive amounts of debt to take classes. That's pretty scummy, but based on the experiences of my friends it seemed like they at least had legitimate work to do in order to graduate.

Note: I don't doubt that on average the regionally accredited schools have been more thoroughly vetted. I'm just saying accreditation is not even close to perfect so the only reason it matters in the US is because the credits are easily transferable.


Regional accreditation is the "gold standard" in the US. National accreditation is considered below that. However, the school is accredited by the DEAC, which is recognized by the US Department of Education. (Some have tried to claim the school is unaccredited, but that is false.)


I graduated from Bob Jones University. During my time there, BJU was Nationally Accredited and not Regionally Accredited.

This didn't personally affect me very much, but I know of many people who had difficulty getting into graduate school or professional work because of this.

A degree from a nationally accredited university limits your options. I'm not saying you'll be completely screwed if you get it, but not as many graduate schools will consider you.


Not yet. They started the process a few months ago.


UoPeople is not a scam. It's like every other institution of learning, with qualified instructors. Do your research before you make any comment.


I graduated from this university with a bachelor's in Computer Science last year. Overall I had a good experience. I found the curriculum to be on the weaker side. I was interested in various things about computers when I was young, so it felt like I knew half of the material. While studying, I was working at a company part-time. At the normal study pace, you should be ready to put up to 30 hours per week when taking 2 courses. In reality, it took me half of that time. Usually 10 hours per week. Some people with whom I studied had a full-time job. I guess it's a tradeoff: you have a less rigorous curriculum, but you then have more time to pursue other interests. Closer to the end of my studies I quickly found a full-time job at a good company locally.

The tuition-free part is more marketing than reality. At the end of the day, it cost me $4,060 for the whole degree, which is quite reasonable.

Speaking of employment and red flags: from LinkedIn I know that people who studied at UoPeople work for many big companies, including FAANG. No problem here.


> At the normal study pace, you should be ready to put up to 30 hours per week when taking 2 courses. In reality, it took me half of that time. Usually 10 hours per week.

30 hours per week for just 2 courses? Are these courses covering more than a typical university?

In my undergrad, I took a minimum of 5 courses per semester (16-20 credits). Scaling from your expectations, that would be 60-80 hours per week? I doubt I went over 40/week, including lectures (admittedly, it wasn't a demanding university).

Just giving you my 2 cents so you don't assume that the time you spent per course is all that different from a traditional university.


The school uses terms (5 per year) rather than semesters, so each course is 8 weeks (plus 4 days for finals) rather than the usual 15 weeks plus finals week.


That's what I'm finding. I started with one course because I was worried about time to study and time with my wife and child, but now I'm thinking I could do 2 courses probably without too much concern.

My only worry is the math course. Not having gone to university or high school in 20 years, I'm fearful. The other courses have been easy so far, though, but like you I'm working in the field, so it's not so new to me.


The math courses were surprisingly difficult for me, but in general, quite good. I spent way too much time on College Algebra (I kid you not, like 20+ hours every week--the pace is insane, they give you like 20-40 pages to read, and a number of assignments).

I'm happy I went through them, but yea, MATH1201 was kinda difficult and MATH1280 was super boring (but not as difficult to be honest).


A fellow student at UoPeople here. I too spent an enormous amount of time on College Algebra. I did nothing but math, morning till evening, all day every day. I dreamt about math at night. And guess what? It was one of the best periods of my life. I fell in love with math.


Yep, same here... I also thoroughly enjoyed it, and was deeply dissatisfied with how easy the final test was; it tested very little, in my view. Seeing as they do not allow a calculator, that limits a lot of possibilities for problems on the test.


YMMV. When I took Calc 3 at a "legitimate" university, we were told that the class would probably require a good 30 hours a week. I don't think I ever spent more than 5 hours a week, and I still got out with a good grade.

On the other hand, I've had small classes worth only half the credits, with absolutely ridiculous workloads.


>In reality, it took me half of that time. Usually 10 hours per week.

I got a bachelor's at a normal university while holding a full-time job during the last two years. I see no issue with this point.


So this seems like a good time to ask: does anyone have experience with the University of London's online degree, and how US employers feel about it? I've been eyeing it for a couple of years now and was wondering if anyone either has direct experience with this program, or at least can speak to how US employers feel about degrees from overseas.

https://london.ac.uk/sites/default/files/prospectuses/comput...

https://www.coursera.org/degrees/bachelor-of-science-compute...


This may be slightly off-topic, but if you're looking at online/distance-learning offers from the UK, the Open University ( https://en.wikipedia.org/wiki/Open_University ) is very well known and well respected, and has decades of experience doing this.


I will second the OU; both my wife and I got access to higher education through the OU and successfully transferred credit elsewhere. It was still cassette tapes and books back then, but those materials were excellent. The difference I see now is that the government funding for UK learners is not as good. I didn't pay a penny for my courses.


OU is pretty interesting, and I've always heard really good things about it, at least from people in the UK or nations strongly associated with the UK. But one thing the GP asked about was how US employers feel about it, and I wouldn't consider OU a well-known school in the US.

Some people have brought this up about UoTP, and I think it may also apply to OU in the US: The name makes it sound fishy.

Slightly unrelated, but I find it weird that I, an American, can apply to some of their law degrees, but not to a number of fully-online STEM degrees.


Unfortunately, the OU is now a lot more expensive than it used to be. It's a shame it can't offer anything at the price of this University of the People.


Another vote for OU; I haven't attended, but from an employer perspective, OU grads have had excellent depth of knowledge.


I’m doing a Master’s in Finance from CEFIMS[1]. It’s a taught Master’s so it’s just fifth year of undergrad, with higher grading standards, though not that much higher. I’ve found it reasonably rigorous and intellectually stimulating and to the extent it’s not rigorous a great deal of that is my own fault for not choosing harder courses. Can’t speak to the esteem it’s held in by US employers but it’s cheaper than almost anything equivalent in the US. If you want prestige and you’re in Software Engineering do Oxford’s M.Sc. They’ll accept enough quality work experience in lieu of a Bachelor’s[2]. You need to go to Oxford for a week for each module though, and the fees aren’t cheap. If you want a regionally accredited US degree cheap and fast people have gotten Excelsior College degrees in under a year. If that interests you go to degreeforum.net. It’s devoted to speed running the credential acquisition process at minimal cost. People have done GA Tech’s MS in CS after.

[1] https://www.cefims.ac.uk/

[2] http://www.ox.ac.uk/admissions/graduate/courses/msc-software...

> However, more extensive experience may compensate for a lack of formal qualifications, and a strong, immediately-relevant qualification may compensate for a lack of professional experience.


I don't know about the online course, but in general the University of London is highly regarded. Some of the best universities in the world (LSE, UCL) are part of the University of London system, and all the members are very good universities.

I don't imagine US employers would have an issue with degrees from there. You can find it easily and it looks reputable.


University of London includes some of the most prestigious colleges in the world: King's College London, the London School of Economics, and Goldsmiths. They have been doing distance learning for decades, if not longer, if I remember correctly.

The one thing to note, at least when I took it in the early 2000s, is that it's tough. I signed up for their business/finance program and it was quite the course workload.

