I did enjoy meeting people from all walks of life, all over the world. However, I also saw a grossly wide range of educational professionalism in the students. In the introductory mandatory writing course, for example, there were a number of classmates who could not grasp the idea that plagiarism is unethical. With a plagiarism assignment graded by those same peers, it was difficult to feel that one's own higher education was progressing at an intellectually challenging pace.
I remember experiencing this at a private religious university. At the time, my hyper-religious mind was blown to see students outright cheating in the Testing Center.
Since then I've been exposed to additional perspectives on plagiarism. It is an extremely deep and nuanced topic. A few years out of school, I ended up mentoring and then teaching college students who seemed to match the sort of person you describe. This was a huge shock at first.
The more I learned about these students, the more I learned about the sheer variety of perceptions involved: One person's fairness concept is, to another person or group, a latent power dynamic which ought to be questioned.
Or, this person's concern for the big-picture ethical questions is this other person's small-picture roadblock in an economic problem which seems more urgent with each passing moment. You want a big picture? Can you justify it in seconds, with something that's not simply a subjective perception or largely-covert moral construct of your own?
Yet another person's assumption of a perpetually commonly-understood contract is another's baroque exercise in cleverness and flexibility. It's the sneaky laser dance from _Ocean's Twelve_, and _that_ kind of challenge is, psychologically speaking, extremely energizing for them. Don't think they didn't notice how things work in the "real" world! (When these two see each other face to face--so to speak--there are harsh outcomes.)
Anyway--sorry to hear about your experience & thank you for sharing so that others can be more educated about their choice of institution.
The purpose of learning to write is to make yourself a formidable communicator. If you can independently analyze a new topic to learn something new and apply the results of those learnings toward a particular goal, you can be amazingly effective in everything you aim for. But if you plagiarize every assignment, you rob yourself of training in this critically important competency.
Plagiarizing some work doesn't really hurt the work, it hurts you.
All part of the zeitgeist I guess.
News is fake. Science is fake. Schools are barriers. Everything is subjective, objective reality is nonexistent.
How do we have productive disagreements going forward?
Funny you describe it that way. I'd argue that young people in STEM fields, including IS/CIS/CompSci undergrad programs, think everything can be objective when that clearly is not the case.
You don't need to go to college to press buttons, fill out spreadsheets, or input code until you get the output you seek. You need to go to college to make the subjective decisions, which don't have a clear right/wrong answer.
Why do you believe that college can teach making subjective decisions?
The humanities are glossed over at best in public high schools across the United States. I don't know of one that requires a PHIL intro course of students.
College is not vocational training (unless you're a law or medicine student), it's for learning how to think.
I'm not sure how much weight this argument holds. The whole "gen ed" thing is a rather US-centric concept.
I don't know of any universities in the UK that require a PHIL intro course of students. When you go to university, you overwhelmingly study the one course ("major") that you picked beforehand. There's often a small amount of room on many courses for optionals from other fields, if you want to take them, but this is by no means mandatory, and I'd say the proportion of folks studying a different degree who took philosophy modules at my alma mater was slim.
Yes, gen-ed is ubiquitous in the US. If you're in any humanities related program in the state I live in (Texas, so that's probably 25-30 large universities total), you'll have to take an intro PHIL course at least, which will probably be Plato and a random survey of 19th century European readings.
More is highly recommended for students looking for law school admission after their undergraduate degree at the state-owned college I attended.
The vast majority of philosophy classes are absolutely garbage at teaching either of those things. Sure, in theory, logic is part of philosophy, but in any of the philosophy classes I've taken, we didn't talk about logic. The things we did talk about were often examples of how not to think, yet they were presented as equally valid next to much more rational ideas.
For example, one of the things we covered in my ethics class was Kant's categorical imperative. The categorical imperative falls over instantly if you apply even the most basic logic to it, but no mention of this was made in any of the course discussion or materials. I'm sure lots of people walked out of that class thinking that the categorical imperative was a perfectly reasonable way to make ethical decisions. If this is the sort of "learning how to think" philosophy classes are doing, then I'd prefer we didn't--I'd rather let people figure out how to think on their own than to teach them unequivocally incorrect ways of thinking. Philosophy could be useful if these classes were taught as, "Here's a bunch of historical ideas, and here's how we apply logic to prove them wrong." But until that happens, I'd strongly oppose introducing any more philosophy to curricula.
Other fields are better-equipped to teach people logic and evidence. Science is all about evidence collection, and logically applying the collected evidence to the evaluation of hypotheses. Math, especially around proofs and derivations, is all about logic, and probability and statistics give you tools that are very broadly applicable. History, if taught well, teaches you how to logically analyze artifactual evidence and logically contextualize the present in terms of the past.
But, there are two problems: first, many college students don't focus much on these areas. And second, the parts of these fields which I mentioned aren't particularly well taught even by these fields. Many students get A's in science classes thinking that science is memorizing a bunch of facts about chemicals or living things, without ever having learned how to obtain new facts themselves. Many students get A's in math classes having memorized a bunch of formulas without being able to derive even basic proofs. Many students get A's in history classes having memorized a bunch of historical events, without knowing the difference between primary and secondary sources, and without ever considering that an author might have bias. Even the classes which do teach people how to think, to some extent, generally do a piss-poor job of it.
That's not to say that these fields (and other fields not mentioned) have no value. Even if you think well, your thinking is only as useful as the evidence you feed into it, and colleges do a very good job at moving vast amounts of evidence on a variety of subjects into people's brains. Further, colleges often do a lot of work getting people skills: lab techniques, using computers, effective communication, etc. You can argue that the purpose of college is learning how to think, but the implementation of college is much better at teaching people information and skills. Learning how to think would certainly be valuable, but de facto it's not what colleges are doing, and the things colleges are doing do have some value.
That said, modern colleges often put teaching of any kind behind profits, and that's not something I see any value in for society or students.
There is more to critical thinking than formal logic, I'd argue.
The classroom format of a typical humanities college course has a lot to do with this. For example, I would argue that a person without any background in the classics and some basic theology is going to struggle to get much from Paradise Lost. It's dense, difficult, and you have to know some things going into it to pick up on the nuances. But 30 people discussing it collaboratively 2 or 3 times per week, with the expert professor's guidance when they stumble on certain parts, makes for a lot of interesting discussion. Thirty different people will pick up on thirty different bits and pieces in every classroom session.
I'd guess that a big part of the reason there is such a glut of humanities graduates who can't find professorships is that people simply enjoy the classes enough to keep going all the way through graduate degrees. You get discussion and debate in those classes that you can't find anywhere else.
I don't think the above is true of many other disciplines of study, with so many degrees offered being pitched for purely profit motive as job training, as you mentioned above.
I can't do a better job of describing this than this professor, who puts his public lectures on YouTube for free.
Well, if you look at literary criticism, there are a bunch of different ways to do it. The oldest ways, such as authorial intent or historical criticism, aren't that divorced from history as described in my previous post, or from just normal old formal logic. But a lot of the ways popular now, such as Marxist criticism or feminist criticism, are forms of reader-response criticism. In the worst cases, this sort of criticism can be used as a pulpit for professors to pass on their ideologies, which is deeply problematic--rather than teaching students how to think for themselves, it's teaching them to think like the instructor. In the best case, it can teach students how to evaluate literature in relation to their own goals--but I would argue that this is just an application of formal logic. The reality, in my limited experience, is neither of these extremes--classes I've taken and my friends have taken have mostly been "these are some of the ways people have thought about literature"--it's more about passing on information than about teaching how to think.
As I've said before, there's a lot of value in giving people information, I just don't think it supports the "college is about teaching people how to think" narrative.
That said, I'll give two caveats here:
1. My own formal training isn't in literary criticism, and beyond some general-ed requirements and second-hand experience from friends/partners in literature programs, I have very little experience here. My impressions here may very well be off-base, which is why I didn't mention literary programs in my previous post. A notable bias in my previous post is that I talked most about the fields I'm most familiar with.
2. Up to this point, I've basically been speaking about teaching facts versus teaching how to think as if they were two different, mutually exclusive things, but it's worth noting that that's not quite true. It's true that simply giving a student a fact doesn't teach them how to evaluate whether something is a fact, but if you give a student enough facts, eventually they come across discrepancies and begin to experience cognitive dissonance. Over vast swaths of facts resulting in a few discrepancies, a student will eventually begin to come up with their own framework for evaluating discrepancies, and hopefully that framework will look a lot like formal logic and evidence collection. I'd argue that this is a very, very inefficient way to teach students how to think, but eventually I think it does work.
I've read this a couple of times and I'm curious about what you're saying here: do you mean that your class just reviewed some writing on the categorical imperative on its own, or read the Groundwork of the Metaphysics of Morals?
The trouble, I think, is that making ethical judgements requires wrestling with ethical conundrums oneself, and that is not something a professor with 300 undergrads can provide useful guidance on for the majority of them. The idea of accurately assessing performance is even more unrealistic. Maybe it's a function of the kind of university I attended, but the vast majority of my fellow students taking these 'subjective' courses were simply gaming a rubric in their writing. And this is true even of those who were genuinely interested in the subject matter; they saw it as a price of admission.
Which seems to me like an impediment to actually learning what was traditionally taught on more of an apprenticeship than an industrial model. If your own undergraduate experience was different, I'd be curious what your university did differently.
I don't remember a lot from my undergraduate course on Ethics, but I do recall that literally in the first lecture, the professor presented us with questions about how one should behave or treat others, and then presented us with "edge cases" that directly challenged what most of us had answered.
As a young person, it's very easy to think that our problems are novel and unique, and the ethics course very clearly showed that many of these problems are millennia old, with people having given names to better-realized versions of what most of us think of as the way we should behave, and that people have spent lifetimes of work writing and arguing about the ramifications and "edge cases" of such philosophy.
I feel like the biggest benefit from the course was not any particular ethical guidance, but rather the challenging of our beliefs, and the realization that these things _are_ hard, and are not something we can trivially answer with something that fits on a Hallmark card.
>Why do you believe that college can teach making subjective decisions?
Lol, most people actually believe that common sense can be taught. I don't.
In the US, maybe. Do people take ridiculous loans for their degree outside of US? Some loans, sure, but loans that amount to 5-10x their future yearly income? I don't know...
Yes. In the UK. I had a relationship with someone who specifically learned German in school, and went to Germany to tutor in a cross-education outreach program after she graduated to go scout for Universities she wanted to attend in order to avoid having to take out massive loans like her siblings did back home. Very smart girl.
She and I enrolled in online classes; I had already completed my BSc but wanted to do this with her. But she felt she was missing out on the 'campus life' part of the university experience, so she pursued Pedagogy to the Masters level and now teaches back in the UK.
In a post Brexit World, that is just not possible.
The EU is still pretty favourable in terms of university costs being hidden and obfuscated via VAT for students, but many industries within its local economies (the PIIGS, Romania, Hungary, Slovenia in the Eurozone, and just about most of the periphery member nations) cannot provide adequate jobs, let alone careers, to their graduates, so they have to go to Germany, the UK, Holland, and, as things have gotten worse, France to a much lesser degree than when I was there.
The ideal is landing a job in the US or China, where they can make obscene amounts of money in certain fields like tech or medicine with little to no debt and subsidized advanced degrees. Which still opens it up to the work visa lottery, and uprooting your life during some of the most critical years of your entire life (late 20s to early 30s) in the hopes it pans out.
The best thing that can happen is to disrupt it entirely, level the playing field, and restructure it in such a way that it's both affordable and accessible to all who are motivated to go in and meet its requirements. And incentivize them to stay in their home towns and build a solid community, tying education to the needs of the actual labor force: hopefully doing away with the notion of studying civil engineering for oil rig drilling if you're from Iceland, that kind of thing. It makes no sense, and doesn't reflect the value system or the job prospects of your community, let alone those of a nation that is entirely dependent on renewable geothermal energy.
How exactly the Lab portion of STEM gets solved is still a mystery.
I propose building auxiliary wet-labs in libraries within their communities. The net benefit here being that students would be required to teach children and adults of their community the subject they are studying, as a graded portion of their coursework, in exchange for the privilege of having such a model, building community in the process. Or perhaps that should be the only real on-campus component (at both universities and community colleges) of what is an otherwise entirely online system?
Just look at this example: having attended my midterm and practicals during one of the largest fires in San Diego history (I was literally trapped in my car for 7 hours on my way back home to OC after they closed campus while we were sitting down for the exam, as the classroom filled with smoke), and my finals during the H1N1 swine flu pandemic, I can understand this from both sides:
That hot-button issue could be entirely mitigated; whether you're for or against the BLM protests is irrelevant. On a purely practical and logistical level, you could overcome this with the technology we currently have and avoid the certain backlash against the professor and department from an irate student body and opportunistic media.
I saw a rant from a UCLA professor pretty much laying out how he, and his entire profession, have not seen a single decrease in pay since he left university in the late 80s as a TA, and how he watched the CSU/UC extortion system being assembled out of what was once the envy of the entire US university system--one which followed the EU's model pretty well, was low to no cost if you were local, and could employ its graduates as the California economy supported it. That was a net benefit that significantly contributed to CA becoming the 8th largest economy in the world.
I can't seem to find it, and I really wish I had saved it, as the very employees in the system are at the point where they know it went too far--and are perhaps even afraid of what an angry mob can do these days.
I think people like the one you're responding to would agree, and increasingly think that association with institutions of higher learning sends a strong signal to avoid dialogue. It doesn't necessarily look like anti-intellectualism to me, any more than filtering out people who didn't graduate high school is necessarily elitism. I could see myself rationalizing either, depending on the kind of conversation I wanted to have.
And at end.
In my intention to post, I was solely being altruistic: informing whoever was reading that, if they were to read this article and consider getting a degree from U of the P, they should consider the risk. Just a gesture. However, I think my writing style might have been misunderstood as some sort of pseudo-intellectual attempt. Do know, for the record, that as the first to reply to the post, my intention was to inform.
But I am intrigued and inspired.
How about we both try to post an article that invites our versions of intellectualism! Ready, set, go.
We are barely having any of those right now in greater society. As long as we can't argue facts and objective reality, and do so without feelings, we'll continue descending into anarchy and divisiveness.
Some news is fake, some isn't.
Some science is fake, some isn't.
Schools are barriers, but for many elements of a school, the fact that it's a barrier is a good thing--we don't want ignorant people performing in roles where knowledge is required. The problem is that many elements of schools are barriers which are poor at achieving their purpose, or are directly counterproductive to their purpose.
> How do we have productive disagreements going forward?
That's a complicated question, but oversimplifying the opinions of people we disagree with and then labeling it ("cynical anti-intellectualism") isn't the answer.
This is not objectively true, but I understand what you're trying to say. I'm sad to hear that your experience of science and truth has been only that which society has given you, or at least that you feel others are only experiencing it in that way.
A lot of it is like Bostrom's idea of the decentralized electroshock dystopia: even though a significant proportion of people are witches, everyone's afraid of reprisal for not actively hunting witches, so the witches-in-hiding hunt their own when they're unmasked.
But this is the way of things; this wave will pass, eventually, as well. And like the Soviet scientists who kept their heads down and mouthed the party line, the secret iconoclasts will survive till the current order is replaced by the next, with its own peculiar taboos.
The short summary is: Two people sitting in a bar, discussing how there is no objective truth, or that "science or truth exists only at the whims of the social order" and stuff like that, all day long. And then a third person comes along and smashes their brains with a barstool. In some variations, the first two people are a scientist and a priest. But the third person is just some dude with a barstool.
Where do these mythical jobs exist where being able to write well is a requirement for career growth? Certainly not at engineering companies.
I wish what you said were true, but in my experience, "being able to write and communicate well is critical in the workplace" is one of the top lies taught to me when I was at university. We had to take a regular writing class, and a technical writing class to graduate with an engineering degree. And when I get to industry, I see no signs of people practicing what they're taught, and it doesn't hold anyone back.
Edit: I should say my experience is more about writing than communicating as a whole. People do need to be good speakers/presenters. But writing? Not really.
I've updated my original comment to reflect that I was referring to writing rather than communication in general (although I wrote it more broadly).
I've seen people really value presentation skills and PPT. Persuasion on 1:1 and via presentations is definitely valued.
But via writing? No. They're atrocious when writing emails. And they rarely write docs/briefs. If they do the latter, it's really meant to be a teaser to get someone interested, and then that person will go talk 1:1 to get the details or ask for a presentation.
My experience at work: Writing anything longer than 1-2 pages is a good way to ensure no one will read it. And again, if I have a good enough "lead", what will happen is the senior person will read the lead, stop reading, and schedule something to talk to me in person so he can understand in detail. At some level, I understand why he would do that - it can be an interactive conversation where he can interrupt, ask for clarification, etc. Whereas if he read the thing, he would have to write up a response, or even worse, make notes to ask me the next time he sees me.
I almost never get anything as well written as a typical HN comment. Even (internal) documentation/manuals/Wikis are poorly written.
1. You often have to write design docs to communicate what you are making and gather feedback. These documents if badly written won't be as well received.
2. At many companies, you have a million things to work on, so in a way, you get to choose who you work with at some level. If someone communicates badly to the point of annoyance, it will take something special for you to decide to work with them or not.
3. To convince execs and managers to approve your project ideas, you often have to write a document explaining your idea. If it's badly written, the exec isn't going to be as interested in it.
4. To get fame as an engineer, you often should write compelling blog articles. Badly written blogs tend not to be read.
5. Good docs make popular libraries, popular libraries get attention.
Which leads to:
6. Promotion is often done by a committee of people who don't know your work, and all they are going to do is review what you wrote. And promo is often based on leadership of projects. And how do you become the leader of projects? You write compelling documents.
Bad writing is not a good sign. It's like saying, at my company, we don't write tests and we don't have alerting & monitoring on our servers.
This sounds somewhat idealized. As much as writing well is an asset, the understanding and ability to navigate office politics is what increases one's chances at career advancement. In the simplest form, just doing what the boss wants/expects one to do, not doing the unwanted things. No matter big or small company, a lot of subtle things that are said and done matter more, than the ways things are put in writing.
Also, how do you become the well-paid boss who tells people what to do? You get promoted. You often have to communicate to your subordinates what to do through writing and so on. Otherwise you're known as a bad boss, and the best might not want to work with you.
There are many different ways the communication ends up happening. Some bosses are better at writing, others not as much. If anything, notice that some well-writing bosses scale down to very short one-paragraph emails, almost Slack-style. Even more, these already-short messages may have typos (how?? the spell check is free!!). Well, that's the busy-boss style, as writing long replies or bothering with minor corrections just takes time away from the tons of emails in the queue.
Which brings this to the next point, that, perhaps, a _comprehension_ skill is of equal importance for success in any kind of team. And that goes beyond what's written.
Some people/bosses won't read long texts, others want details, yet others want structure; many prefer visual depictions (hello, white board), some would rather write the whole thing in their own words.
As it's said: write for your audience. I'd add that one needs to communicate at the audience's comprehension level, and then you stand a better chance of being understood.
I think you and GP have quite different understandings of office politics. I know the GP's perspective: Most office politics is done over coffee/lunch. Then the followup is "Can you present your stuff in our next meeting?" And if not that, then "Do you have a PPT with your proposal?"
People in my company like looking at PPTs. Reading a few pages? Not as much. Just today a senior exec sent out a 4 page whitepaper on the key initiative our department is working on. I probably should do a survey in a week's time to see how many people bothered to read it.
And once again, I think your perspective is biased a bit towards certain SW companies. Most engineering companies don't have remote work (although there's talk in my company to continue allowing it once COVID blows over). It's a well known tech company, but definitely a very traditional one.
> Also how do you become the well paid boss that tells people what to do? You get promoted.
As I pointed out earlier, most companies don't require a formal writeup for promotion beyond the annual review. When I got my last promotion, my manager told me, "Both I and my manager are quite familiar with your work, so this paper you're writing is merely to fulfill an HR requirement, and I'm here only to make sure you don't claim something you didn't do." We don't "apply" for promotion. It is granted based on the manager/committee's opinion of how well you're doing. I didn't know I was getting one till it was granted.
Oh, and those annual writeups are a thing of the past. So now your bonus/promotion is entirely based on your manager's opinion. People did protest this change, but I actually welcomed it - I knew the act of writing things up in the past was viewed by most teams as a mere formality, so why waste time on it?
Our company has lots of mentors who do 1:1 or group presentations on career growth. What I write here is reflective of what they say. Not once did they advocate "writing well". Networking, getting to know your manager's needs (or his/her manager's needs), etc are the usual ways to do well. And presenting in front of audiences (for networking, not for promoting your idea - that is secondary).
> You often have to communicate to your subordinates what to do through writing and so on. Otherwise you're known as a bad boss, and the best might not want to work with you.
There are plenty of bad bosses in my company. And that's where office politics come into play - "good" engineers in our culture are those who navigate around such bosses via politics (and don't spend time complaining). "Bad" engineers complain or leave.
(I don't agree with the sentiment, but that is the perspective here).
Probably a lot of this has to do with the fact that we have very few competitors. For most of the "best" engineers, going to a competitor is a step down. And from what I've heard from those who left, the culture isn't any better there.
Based on the responses to my initial comment, I'll concede that not every company is like mine - and that's refreshing. It is also clear that my company is not by any means an outlier.
I could go step by step, but really? This sounds so far out of reality. There is a reason engineers don't write blogs - they don't matter.
Technical accomplishments only make you famous if people know about them - if you want to become Joel Spolsky or Bruce Schneier or John Carmack or Donald Knuth then either you're going to have to publicise the evidence of your brilliance, or someone else is.
Plenty of people don't particularly pursue fame - you can make plenty of money and get plenty done without it. But if fame is your goal and you hope to achieve it through code alone, I challenge you to name the maintainer of grep or openssl or the linux kernel bluetooth subsystem without looking it up :)
There are other routes to getting your accomplishments known, of course. Sid Meier and Bill Gates and Linus Torvalds aren't famous for blogging.
Also, Bruce Schneier is not a programmer. He can code, but the bulk of his work is not programming.
> 1. You often have to write design docs to communicate what you are making and gather feedback. These documents if badly written won't be as well received.
In my company this is entirely up to the culture of the org. Some parts of the company will require it. Other parts treat it as a formality (i.e. few will read it). And other parts don't require it at all.
> At many companies, you have a million things to work on, so in a way, you get to choose who you work with at some level. If someone communicates badly to the point of annoyance, it will take something special for you to decide to work with them or not.
The key phrase is "to the point of annoyance". If most people are poor writers, they are not annoyed at the fact that their peers are poor writers. Even worse, being a good writer is not an advantage.
If your culture doesn't value it, then it is not of value.
> 3. To convince execs and managers to approve your project ideas, you often have to write a document explaining your idea. If it's badly written, the exec isn't going to be as interested in it.
I addressed this in my comment and won't repeat what I've said.
> 4. To get fame as an engineer, you often should write compelling blog articles. Badly written blogs tend not to be read.
I suspect this is reflective of the SW point of view. My company is an engineering one. It is a giant, and is usually the top company in its discipline. I've worked with engineers who are likely the best in their discipline globally, and often way ahead of academia.
Not one of them has a blog - internal or external. Very senior leaders tend to have them, and they usually are not technical, but corporate speak.
Keep in mind: Most of the engineering world is very different from your typical SW company.
> 6. Promotion is often done by a committee of people who don't know your work, and all they are going to do is review what you wrote. And promo is often based on leadership of projects. And how do you become the leader of projects? You write compelling documents.
In your whole comment, this point best shows how unrepresentative your perspective is of the engineering (and even SW) world. Yes, I do know some companies that do promotions via a committee of people who don't know your work (Google, etc). For the rest of the tech world, this is rare. People get promoted because they have a manager who will root for them in front of the committee. The committee is typically the next-level manager, and he/she is likely aware of your work. There's no "promotion packet" that one writes. There's the annual review (under 2 pages), and the committee only scrutinizes it if your manager is pushing for a good bonus or promotion. And as long as it's readable, it's good enough. Of course, this means that spelling errors and poor grammar are OK.
When you want to get to a really senior role (usually takes 15+ years in the company - less than 1% of employees reach that level), only then does a wider committee get involved and scrutinize your work. Do you have patents? Do you have external publications? And this is only for a technical role. You're not subjected to this scrutiny to get into senior management. Which is why, surprise surprise, we have a larger number of senior managers than senior engineers.
> And how do you become the leader of projects? You write compelling documents.
Oh heck no. You get an idea and pitch it verbally to management.
Trust me - one very consistent piece of feedback I've gotten from management at work is "You write too much. No one will read what you wrote" - almost always delivered after I write an email that has 4+ paragraphs. I'm not claiming I'm a great writer, but they don't give up because they find my writing hard to read. They give up after seeing that the email doesn't fit on their screen. If I have a "tldr" they'll read that and talk to me in person (only a tiny minority will read the actual email).
Culture is king. Writing well will serve your career only if you are in a company that values it. Let's not pretend that writing well will take you places in organizations that don't value it.
I should say that they do not openly have a blog. Some may have ones they don't publicize. Having a technical blog you spend a lot of time on would not be viewed positively, and the more senior you are, the more people will be concerned you'll leak IP. The company is pointlessly secretive and senior management doesn't want to allocate resources to vet your blog's content for IP violations.
> You write too much. No one will read what you wrote" - almost always delivered after I write an email that has 4+ paragraphs.
Writing well does not mean writing a lot. Often it means the opposite. The same idea written plainly in fewer words is usually better than a longer version. Apply Occam's razor to your writing and shave away the extraneous.
Also, 15 years isn't that long in a career, and eventually you want to get to the staff engineer level, where the promotion-committee dynamic applies. At big tech cos, similar promo-packet processes apply to sr. management too.
4 paragraphs is "writing a lot"? Seriously? Since this whole thread was about writing classes in university, I don't think anything I wrote there was less than 4 paragraphs, and my grade would have been poor if I had written less.
How many whitepapers or technical docs have you read that were less than 4 paragraphs? Manuals? Changes explaining a product pivot?
Honestly, some of the responses to my comments fit in the category of "I'm unquestionably right. So if it's not working for him, he must be doing something wrong! Let me try to guess at what that is."
Not the most fruitful way of having a conversation.
^ not to be confused with strong writing skills!
Brevity is crucial. Learning to compress ideas, e.g. limiting emails to 5 clear sentences, is part of this rare and important skill. Sorry you haven't (yet?) experienced a work env where there's good writing. Such places do exist!
Definitely. Communication full of casual txtspeak and/or broken English all over. At first I suspected my first job's recruitment emails might be some kind of scam, because they were written in 3 different fonts in the same email, with random words capitalized or colored various colors for emphasis, and of course full of broken English. I'm not talking about terms like "do the needful", which are valid Indian English - that's fine. Even judged as that dialect, so much of the communication is just terrible, and nobody seems to care. I guess it works out fine and ultimately doesn't matter much, but it still feels unprofessional.
Communication skills -- especially in writing -- are increasingly important, rare, and valuable.
I've been doing software-related work for a living since 1998. The trend toward remote and async collab -- which has only ever increased in that 22-year span -- strengthens my conviction.
I see some really brilliant problem solvers in my company, for instance, that are definitely being held back by their inability to communicate well. Communication allows you to scale your impact several times over.
I would think that writing well is at least a requirement for promotion into a technical leadership role (above senior individual contributors).
By writing well, I don't mean in the style of journalists or novelists. Rather, writing clearly and concisely to effectively convey one's points and reasoning should be very valuable in engineering.
Electrical, computer, and SW.
I'm not saying communicating well is not needed. I'm saying writing well is not needed. What I've seen: a good presentation (including PPT skills) is valued much more than writing. Decisions are usually made because of presentations, not because someone wrote a good brief outlining positives/negatives. Emails longer than a few lines tend not to be read, so people don't focus on them. Documents are usually not read by many except those beneath the author, etc. I almost never see senior management write anything of substance unless it is required by Legal/HR - they'll always get an underling to write it (and no, writing these is not how underlings become senior management).
I'm not saying I like the state of affairs, but it is how I've seen it.
Just in case you are serious: as a counterpoint for others who may not know, all forms of communication are very important to succeed and move up.
They're only low caliber people when it comes to writing. Otherwise they're exceptional engineers. The company historically and currently is a market leader. We're not talking about a small shop.
> I don’t consider people for positions who cannot write well
And this is exactly what I'm talking about. If you're in a setting that values it, and set up filters for it, then of course writing is important. If you're in my company where it isn't valued, then not only is it not important, it has little benefit. No point in writing well if people aren't going to read it.
> Really, can you take a poorly written engineering spec or RFP seriously?
In the case of my company, yes - if you want to keep your job. Unless it's inscrutable, I can't go to my manager and refuse to work on something because the spec is poorly written. He'll immediately tell me to go contact the author and sort it out. Occasionally the author will be nice enough to fix the spec and release a new document. But it's hit and miss. The reason many don't fix the formal spec in these cases? Their managers don't value it.
Specs are for major efforts. At the intermediate level: "What the heck is a spec? We just communicate requirements via PPT."
(No, I'm not kidding).
But that is your personal preference, enforced only where you personally can enforce it. That does not imply your decision-making is typical for the industry.
Any sort of work in, say, nonprofits, or public relations, or marketing, or consulting, or any institution where you're at a level of management where your job is to present plans and preside over their progress while being accountable to oversight - and these are just examples off the top of my head where I have at least some familiarity - is a place where strong writing is an asset. And I'm sure I'm only pointing to a small slice that I know from my own experience. These aren't special exceptions. These are the norm. The counterexamples make me wonder what actual career experience, if any, people are drawing from to claim otherwise, or whether they have the perspective to understand how representative those counterexamples actually are.
The lack of responsiveness comments have to one another on the internet is disorienting to me, because I would have thought that this would merit acknowledgement. Your anecdote may as well have dropped out of the sky in response to basically any comment in this thread.
Work is hard, but as with most things, you learn best by doing the thing. Not saying SICP was shit. Just that I could have done that in high school and accelerated my time to money (and through it, contentment).
Maybe I'll let my kids do something like that if they feel the mildest desire to.
Two years later, I learned about regular expressions.
We had to go to the computer lab at school to use a computer. But then my parents spent a fortune on getting us one. Now that I think about it, it was like half a year's rent. Jesus Christ, what were they thinking?!
Mostly played games. But then got a Linux CD from a magazine when I was 13. Wiped the drive accidentally trying to partition it. Disaster. Path to writing code begun.
Besides, plagiarism isn't really about writing. You can lump it into two categories: cheating, which isn't most folks' intention, and, more importantly, failing to give someone credit for an idea. This last one is something folks need to handle in some professions. Don't take an employee's idea and call it your own; same for something your boss has you pass along. Don't pretend something is your own idea when it was implemented at a job you had years ago. This version of plagiarism is vastly more important than writing skills (which can be taught without needing to address plagiarism).
That's not really up for discussion though.
The degree itself will become utterly meaningless extremely quickly if we would actually generally accept that kind of reasoning.
The whole reason the degree is worth something is because it's perceived as a token of you having done the work and self-betterment etc.
It's not an empty token that allows you to have a middle class job. In practice it might be, but as soon as you openly accept that is just what it is, and only what it is, then you only get cheaters.
If you spend time in a university just to get a diploma and maybe some connections, you are likely wasting your time and significant money (remember, a student loan cannot be discharged in bankruptcy).
Welcome to the underlying systemic problem with some of Society's more critical institutions.
Fraud and corruption have become institutionalized, and lying and cheating are just the name of the game.
Explain to me how banks got away with what they have, if not for this very root issue: blow up the economy with reckless, risky investments? Bonuses, bailouts, and golden parachutes for all.
Default on your student loans, utility bill or car payment? We'll ruin your credit for all of your miserable existence, while you slave away anyway because the former cannot be expunged.
I'm so glad I borrowed from family and friends instead of banks or the State. It was hard paying them back, but if I had to choose a creditor of last resort I think I made the right choice.
Except learning isn't really the reason most go. It's to get a well paid job at the end of it.
Anyway, I notice a lot of younger students have this attitude and it frankly causes them to produce really crappy work. As long as they pass the class, they don’t really care to absorb the material.
I can’t help but wonder what kind of job they’re hoping to get when they leave school. What will happen when they get a technical interview? I can’t imagine them doing anything beyond answering phones at a company’s IT Help Desk.
I dunno. I listen to every word the professors say as if they’re telling me the secret to eternal life while half the class is dozing off.
My experience says you'd be very surprised. As in most fields, networking, charisma, and ability to bullshit play a substantial role in IT hiring.
This is absolutely brilliant.
> Except learning isn't really the reason most go. It's to get a well paid job at the end of it.
There's no contradiction here.
From talking to people further along the plagiarism spectrum than myself, they see it as developing good taste or almost coaching.
Yeah, yeah - in a writing class it's hard to justify not learning to write. But in any other class...
Let's say, hypothetically, that we're in a computer science class and our assignment is to write an essay on the supremacy of the C++ language. There are all these English-department goals of becoming a better writer that would be met by my pitiful attempt to glorify polymorphism. But the C++ goal of becoming a better C++ programmer would be best met by extensive reading and research to find the best Stroustrup quote.

If I were involved in the academic scene of converting papers into salary via cooperation with other researchers, I'd need to quote my coworkers accurately to share the revenue appropriately. But what if I don't have the goal of playing that game? In a learning environment, in casual verbal conversation, I might tell my C++ instructor that C++ main() returns an int. Yet if I write that down, as I just did, I'm committing the academic sin of plagiarism by not properly footnoting Stroustrup - that's a direct quote from him.

But I'm not trying to play the academic game; I'm trying to learn to program, and to develop good taste by copying the right people. It seems a little unfair to grade students on a different game than the one they signed up for. Even if the institutional goal is to produce little academics, in practice almost none of the kids will become academics.
That's a very authoritarian example of copying a guy at the top; but it also applies to lower level copying.
They're not necessarily wrong or self destructive, just kids on a different path with different priorities.
Then to some extent I could sympathize with those who plagiarize. ... If it's to save time for something they think is more on-topic.
But if it's a writing class, then, no! Or writing about history or society etc
Uh, did I say that, or were you intuiting? Not exactly a learner's approach :D
> Plagiarizing some work doesn't really hurt the work, it hurts you.
Except when it benefits you? This is the subjective perception I was talking about. That their mindset differs does not instantly make them wrong, especially when you can throw a dime out the window and hit an educated professional who falls short of the best (heaven forbid the "perfect") ethical standard.
Ethics is, and should be, hard. If you put words into my mouth, is your position ethical? This stuff requires the ability to stick around, listen, learn, and stay in the game - more so if you plan to claim the high ground.
The concept of competence as you describe it is also very much a vague, subjective concern out of which you've just attempted to carve a covert competence contract. This leaves your blind spot unguarded because you are unknowingly making the discussion focus on you and your own competence level.
And this is a big part of why "hyper ethical" subjective ethics people struggle--they assume their view is right and don't ask questions of others.
This doesn't have to do with me either -- the market will determine whether any one person is valuable enough to employ (or promote). My only claims are that being able to write makes one more valuable, and that plagiarizing assignments at school fails to teach one to write.
If students don't bother to do the work, they won't develop any competence in the discipline.
It would be like sending someone for Scala training, only to have them skip all of the work, buy the answers to the quiz, and get the accreditation.
University is about much more than 'skill acquisition' but there is a lot of that. Cheating is almost universally pointless.
Except that it clearly isn’t. There are all kinds of people in positions of power who are clearly incompetent in many of the skills we would want them to be expert in.
Often cheating enabled them to pass the gatekeepers and attain their position.
Sure, that example absolutely works. One reason why you wouldn't cheat is that you know that a specific outcome you want requires something of you that you must learn. I found it striking just how rare this was, though. We can fault students for not having made up their minds, but I found that many of them are just really open to new directions, and this can help to enable part of the plagiarism equation, but it's also something of a gift...
Anyway, talking to my students I discovered that the choice of discipline is often completely up in the air. So while the rhetorical / imaginary student's path for the purposes of argument might be "study math -> work in applied math," quite often it's "study phil -> work in I don't know what" or similar.
The students who plagiarize with this mindset are really quite something. It's nuanced--they're smart about it, leaving no final question on which points against them can rest. For example, the student submits a first prospective paper in which they quote-paste for pages on end and then plagiarize not by direct-copy, but by reading and then re-hashing someone else's conclusion from a book or another paper, and they get a C+. Well, if a B- is all they need in the class, they are good to go. Then they take a reactive / tactical stance and only change this approach in the future if they absolutely have to.
This pattern happens over and over. If you attempt to pin the student down on qualitative issues, they have a number of tools to use here. You have to be ready for extreme negotiation. They _may not be able_ to learn about quality, ethics, etc. Shocking sometimes but it's a struggle for many. One of the most common negotiation techniques is, "I just...I don't understand. I'm really not that smart" and then they start crying or leave the room in a rage. This can instantly shut down a professor with average or greater levels of sympathy. The student converted the negotiating professor's original value proposition into a risky interpersonal issue. If further negotiations occur, they will find ways to illustrate why things are unfair to them. What is the prof going to do about that? Do they even have time for it at all?
Then you can go back to students who are in the "study math -> work in applied math" group. You look outside of the math classes and you can see the same pattern. They know wasted effort when they see it, or think they do. And again--some, not all. Savvy employers also weed out some of these people but then other employers hire them because they desire tactical cleverness in their organization, and they recognize it when they see it. Maybe it's how they got the boss job in the first place.
> And this is a big part of why "hyper ethical" subjective ethics people struggle--they assume their view is right and don't ask questions of others.
That is itself quite the assumption.
What is it with this everywhere on HN these days where people assume every response someone makes is a refutation? It's a conversation, dude. People will take it places. Sometimes people will build on ideas you mention. Other times people will take an incident and draw their own conclusions. Other times people will draw on a similar incident or talk about a related concept.
I'll accept my off-topic downvotes since that is fair but this is so frustrating.
I just re-read what you wrote. If you're saying your response was not a refutation, given that you wrote "is not best characterized as," I think it's pretty clear as to why that could be misunderstood, to say the least. It seems clear to me that you were replying in direct disagreement and also projecting words I never said right into your reasoning. And further, it now seems as if you're claiming that I'm being assumptive. This is just compounding, not helping.
I think specifics are important here because it's unfortunately common for people to attempt to sweep pesky details under their subjective-ethical rug in the name of [hand wave]. Since this is an ethics discussion some due concern to communicating ethically seems reasonable to expect. If that's frustrating, maybe you can at least see the frustration on both sides.
and a job well done is its own reward right? i think it's very pretentious to say that to a person who's attending school in order to improve their lot in life (because credentials count for so much); that what's more important than the credential is some abstract notion of improvement. you might as well cast it in terms of sin and salvation.
Sure, if the life you want is to be at a desk for 8 hours a day regurgitating your superior's existing biases back at them, go ahead. I find employers much prefer someone that can attack a real problem and think critically about potential solutions from multiple levels of analysis. If all you're good at is chewing someone else's cud and spitting it out with a slightly different word order then you're useless to people that actually want to solve problems.
And, sorry to say, credentials are counting for less and less every year. If employers have to take a year to train you to think critically, then what use does a credential serve as a filter? I wonder why that's happening...
The clients got the same value (they wanted a v1 policy, and they got one). I became better by doing the work, so next time I had a discussion on the matter I felt that I controlled the discussion instead of pitching in.
Faking it till you make it has the risk that you fake it forever and you become the paradigm of the Peter Principle.
Walking the walk takes more time but always benefits in the long run.
I have met plenty of people, though, who take that risk and never grow or evolve, staying in their comfort zone because they just want the base salary to fund their hobbies and get no sense of accomplishment through their work (for many reasons I am not getting into here).
PS: I now work like this (when asked to develop a policy for a new client): spend some time thinking of key points (technology changes fast enough in some areas), drop a couple of examples for each bullet point, and then "plagiarise" from previously made work. This way I have already prepared part of the downstream Procedures. You would think that a Policy is high-level enough that it shouldn't need frequent changes, but different clients want different things.
If every person at the school had this reasoning, the credentials wouldn't count for anything.
The credentials only count for something if enough of the people graduating are actually fulfilling this promise of self improvement in the subject of their studies.
The cheaters are literally freeloading off the prestige of the credentials produced by the people who do put in the effort. If that group of people did not exist, the credentials would be useless to the cheaters as well.
Those credentials only matter insofar as they describe the likely caliber of the alumni that come from that particular school. Being a terrible student isn't going to help the value of your degree a whole lot...
Funny enough, this idea is almost certainly one you are plagiarising. Perhaps you could find nuance, depth, and understanding by turning those words on themselves?
Somehow I feel like most of the debate here centers on these words:
> The purpose of <foo> is <bar>.
That little word "the" there at the beginning seems a bit myopic to me. I mean, clearly, something like writing serves more functions than one. Here are some other potential purposes of writing, off the top of my head:
* Deconstruct your personal ontology,
* Intrinsic artistic value,
* Emotional expression,
* Elucidation of unconscious grammatical habits,
* Social signaling.
Any of those are perfectly functional purposes of the act of writing, and a few of them actually benefit from plagiarising (identifying which is left as an exercise for the reader). Of course, saying that "university should allow students to operate under any possible goal framework" is a different matter, but hopefully that at least points toward one way of thinking about plagiarism with more nuance.
You can make the argument that legal teams exist to relieve the very potent pressure of this uncomfortable situation for those on both ends. There are truths here that flow right up to the top and all the way to the bottom. Attacking those perspectives is too hard. Better in some ways to set up social norms around it. Thus, lawyers, the concept of win-win negotiation, etc. Just when your anger is boiling over you learn that the C-levels want you to drop the moral crusade.
Advocating cheating because it helps a student from possibly lesser means achieve their 'check-box' in education is missing the point, in a very serious way.
We can always strive to have an open mind, to try and be sympathetic to the plight of others - but the Truth is pretty much still the Truth.
Plagiarism is not a controversial subject in the context you've mentioned, it's just wrong in every sense.
More pragmatically - if someone is in a situation wherein their need for the 'checkbox degree' outweighs their actual need to learn something, then they almost assuredly should not be there. It's pointless for society to spend a lot of energy and resources only for people to waste them, and it ruins the credibility of the system. I'm not unaware of the fact that a lot of Uni may feel like jumping through hoops, but even then, if the hoops are merely jumped through, we're learning something. Uni is not meant to be enlightening at every step; like everything, it's also a grind.
A 'free public Uni' is going to attract a wide swath of students, and there will necessarily be all sorts of issues, probably very low graduation rates, challenges in communicating the material. I totally support the idea, at the same time, we should strive to maintain the credibility of our own ideals and institutions. Universities are there to help develop character, they're not just about 'absorbing data'. In the long run, it's worth it.
It's a bit difficult for me to believe that there are highly competent people who would be effective employees now, but for whom cheating is easier than doing the work. Remember: cheating (without getting caught) takes time too.
I can write a few page essay on any given topic much faster than I can figure out how to copy/pasta that essay without getting caught.
I can implement most undergraduate programming assignments faster than I can figure out how to obfuscate someone else's code enough to fool cheat detectors.
The subset of people who are truly competent but for whom cheating is easier than doing the work has to be vanishingly small.
In my time at university, I knew many capable and bright people who cheated once in a while, but only in well-picked cases, not as a general policy. For example, if there was a mandatory bullshit subject, or something they regarded as a worthless waste of time, they would rather spend the time on the important things.
I think it's a very naive shielded "good boy" view of the world that there is some simple rigid moral rule like never lie, never cheat etc. It may work in a benevolent environment like rich protective parents and never dealing with adversity. One has to develop one's own sense of justice.
This can be easily misconstrued. The point isn't to believe in nothing, or to be exploitative and selfish. Rather, be mindful and don't just blindly follow someone's bullshit. Indeed, much of the purpose of education is to kill this kind of independent judgment and to certify the capability for blind obedience and jumping through hoops without ever questioning it.
One guy I mentioned in the above parts is actually really honest in general and sometimes I wonder how he gets away with it in corporate environments, saying straight nos, not putting up with colleagues criticizing him for working too fast etc. I've usually been much more careful but he's more successful. And it's an art of picking your battles, refusing bullshit, sometimes openly sometimes secretly (at least don't lie to yourself), sometimes making a stink, sometimes just complaining to fellow students, knowing the unwritten lore of which courses are unofficially considered "cheats allowed" by most talented students and probably the teacher included.
The world is complicated, but for shielded kids with underdeveloped social skills it can be hard to learn how widespread "rule bending" is in real adult life and how much this is basically known, expected and part of life.
Again, this is not to say be selfish and disregard others. Rather, think for yourself; know when something is bullshit (there is lots of fancy, official, institutionally stamped-and-signed authoritative bullshit out there, often coming from people who know it's bullshit but either don't care or feel their hands are tied).
I have two options: read the super boring textbook for 5 hours, take the test legit, and get an 80%; or I Quizlet the test, get 100%, and spend the next 5 hours listening to Dan Carlin's "Hardcore History" podcast, which I find much more informative and enjoyable.
But the university wanted to boast that they are modern and prepare students for business stuff: look, we even require business-related courses in our program! But actually it was some nonsense like memorize various lists, like the 5 different aspects of whatever, some pseudo-mathy formulas, etc. I mean, if you take me that seriously to give me this type of nonsense as "learning material" then I will take you precisely as seriously when it comes to the exam.
You are making this way too complicated.
It's honesty vs. dishonesty.
Those of us playing on team honesty are right to view the dishonest as playing on the opposite team.
Which team is actually morally right, is the complex issue.
> It's honesty vs. dishonesty.
Kind of like you're either a criminal or you're not. If you drive 58mph in a 55mph zone, you are. Otherwise you're not.
Binary positions are always convenient, once you pick a side.
There are deep and nuanced ways in which people tell themselves and others how ethical they are while taking credit for other people's work.
Whether it's bad is not really up for discussion. The rules are pretty clear, and for serious higher education the punishments are extremely harsh. And deservedly so.
What does it even matter what it says on your diploma if you cheated your way to getting it?
These people might pat themselves on the chest for being oh so clever in subverting the system or whatever, but what does it really mean? That you're good at cheating, unwilling to do the work, and gladly take credit for other people's work. In other words, being a useless turd.
You could have been studying astronomy, physics, math ... but instead all that you really proved is that you're good at cheating.
But I don't see how this is more nuanced than being honest or dishonest. I really pride myself on being able to see where a lot of opposing viewpoints are coming from, but I can't see how any of the people you describe are doing much more than lying to themselves about what they're doing.
"Using counterfeit money is arguably OK because some of the people who use it might need it, or see the need to earn a living as a needless waste of time as long as there's idiots who'll accept fake currency".
But given who holds the majority of the money being devalued, given the extent to which money is already being devalued for the benefit of some at the expense of others, and given that taking a portion proportional to wealth from the group for the benefit of the poor is a form of redistribution the group already engages in regardless of the consent of the giver, I think you can easily construct an argument that it is not immoral to use counterfeit money as long as you pick the right targets.
Then you would have lost your mind during some of my upper division STEM courses, where the normal class average is in the mid 40% on the high end and high 30s most semesters. We had a class average of ~83% in one of them, and when the professor was in total disbelief about how blatant they were being (several didn't even try not to get 100%) he got upset and decided to stop re-using the previous year's exams that were circulating amongst the Frats/Sororities and the class average dropped back down to low 30% for the last 2 Midterms and finals. This personally helped me as I got my average to a B+ range after my Lab scores were included.
Academia, be it online or IRL, is always going to have that element of corruption; there has just been so much money to be made that it drew the worst from other sectors into its administrations. Eric Weinstein goes into depth about the artificial scarcity within STEM that was created in the US to favour a 'race to the bottom' approach to wages since the '90s, and he went to both MIT and Harvard!
My best professors were often disillusioned and jaded after having been in academia for a few decades, one even having to petition and protest to the dean of his department to be paid for having taught back-to-back summer courses during the budget cuts. A person who went to Oxford for his BSc, no less...
All of this is taking place while we hear about some high-ranking official or executive being disgraced for having plagiarized their Master's or PhD thesis decades ago, which raises the question: how the hell did it pass back then? Don't they all have to go through dissertation defenses and have academic advisors who are PhDs or postdocs in the very same department?
I won't even talk about the administration-led favoritism and vetting they did for foreign students during my time there, either.
I'm a proponent of online learning in general, especially as it's disrupting the brick-and-mortar university exploitation model; you can now get a Master's in Electrical Engineering from CU Boulder entirely via Coursera for $20k, with installments and financial aid available where applicable, whereas that's not even the total cost of a single year of undergraduate studies there. Which really sucks for incoming students, but it could be a net boon if the same reaches undergrad degrees in time, now that most universities are having to move online due to COVID.
I just think it's worth keeping in mind that this institution (academia) doesn't deserve to be regarded as anything more than the corrupt money pit that it is, which occasionally avails itself to allow brilliant minds to excel in their field, after having exploited them to publish and use their labs until they can make a name for themselves and operate outside of it. I just think about the wasted human capital, talent, and drive that you only have when you're young, naive, idealistic, and determined to make a difference in the world.
Yes, many schools will not accept coursework from nationally accredited universities. This includes coursework from UoPeople. You will not be able to apply to many graduate schools as well. A degree from UoPeople definitely comes with limits, and I wish the school were a little more upfront about that. (There are lots of discussions about this on the internal social network.)
Lots of universities in the US are lavish and expensive. Our state's branch campuses are the definition of utilitarian. Tuition is some of the lowest in the country. Salaries for professors at the branch campuses are lower than what we pay our high school teachers. Very little admin overhead. There's simply not much to cut.
Anyways, not saying it's right. But if you cut higher ed budgets, higher ed will get more expensive to the end user. Especially if the system was already hyper-efficient. Limiting transfer credits is pretty much the only way to make things more expensive to the end user without increasing tuition.
Whatever the setup, I think access to higher education at a local level is important.
Everything I've ever heard and learned about the college admissions and pricing situation makes me prefer the latter.
Though, in the case of UoPeople, it's purely about national accreditation lacking the prestige (and sometimes rigor) of regional accreditation. It's been that way for at least the couple of decades since I started reading up on higher ed.
Spending three years at a community college and expecting the credits to transfer to a major university makes no sense for the institution. At that point they did a minority of your training; why would they recommend you?
Usually, a strong state-run community college program lets an Associates degree earned at a community college transfer into any public state university as a Junior. Most non-degree focused classes directly transfer. Going between states (staying in the same regional accreditation) seems to be difficult and not all states are set up like I had described. The theory being that someone who isn't quite ready for university has an opportunity to catch up and in my experience it gives more options when popular prerequisites are full. So the state has a reason to push for this.
I'm also skeptical about universities pointing to their qualification level because I've interviewed graduates from many well known universities and was kind of floored at the basic things many candidates seemed completely unfamiliar with.
Nationally accredited organizations don't have anyone pushing for this relationship. This seems to generally work for those schools because they can advertise as accredited and get a fig leaf of transferring credits, but they usually want you to stay and complete training there. Most tend to be vocational, which makes a lot of that moot.
I took some community college programming classes in NJ and NYC. I didn't have to repeat the ones I did. I was in awe at how weak the curriculum was in both cases. It's not the students' fault, imo, that their programming skills were lacking. The classes weren't challenging.
Yet some of them eventually transferred to State Uni for comp sci. It really amazed me. Wonder how they fared and if that just means that specific state school is just really easy too.
It led to many fewer questions from students, teaching assistants (something the other institution did not have and did not need, because one professor could attend to all of his students), and in general less interaction among students and between students and professors/TAs.
The curriculum at both were essentially the same for this set of classes.
I spent 3 years (including Summers) at a CC then transferred into a top 10 engineering program. The 'extra' time was required because I needed to take a year of math classes before calculus.
And IMO, the CC's pre-engineering program left me better prepared for the university's department/major coursework than some of my peers who started at the university.
This makes no sense. Spend as long as you want at the community college, transfer all the credits you want, by definition they will all be 1xx or 2xx classes. Your degree will require half the classes to be 3xx & 4xx, so you will always have to spend at least two years at the university.
Do you remember which state and university?
Sorry, pet peeve. It's "who". The "classmates" are the ones doing the "could not grasp," not having it done to them.
"He" could not grasp. => "Who" could not grasp.
I should vote for "him" => "Whom" should I vote for?
It might help to define what form of plagiarism you encountered, as there are some behaviors that count as plagiarism (at least they were counted as such back in my college ethics class) which I never found any valid reasoning for. I have no difficulty understanding the ethical issues of passing off someone else's work as your own, but you can also "plagiarize" by passing off your own work as your own (i.e., self-plagiarism).
For me UoPeople is a blessing. I'm able to pursue a higher education all while working and supporting my family. This will give me so many new opportunities for better careers when I eventually move back to the states.
The educational method is challenging. I do like peer assessment, but I feel that some peers don't really try when grading. I like to give grades that are warranted, good or bad. Being told my work is not good with no correction is highly vexing.
I hope to continue to grow as a student with UoPeople and see others do the same.
The students are mind numbingly average. (But don't worry they will let you know what college they went to 12 years ago)
Heck if the job requires any communication, you might be better off finding someone humble or someone who needs to prove themselves.
Or maybe the best of these students go on to work at companies that pay 250k/yr and I just don't meet them.
I think that top-tier universities are different in the top 20% and bottom 20% of students. The top 20% because you won't find those types of people at no-name. The bottom 20% because they're at least not completely incompetent (unlike the bottom 20% at no-name).
Here's how I break it down:
Top 20% of students at top universities: They really are that good. Also, they are the ones who benefit most from the intense curriculum at top universities. So they start off really good and actually do get a force multiplier effect from the comparatively very rigorous course-work. Most of these students end up on rapid trajectories in industry or with NSF fellowships.
Average students at top universities: Most of these students would be in the top 10% of the class at a no-name college. Still quite good, but nothing special.
Bottom 20% of students at top tier universities: Really nothing special. Comparable to the average student at no-name. Contrast with the bottom 20% at no-name, many of whom couldn't fizzbuzz on their first attempt in our senior interview prep course.
So, top tier places are really characterized by their top 20% and bottom 20%. The top 20% really are quite amazing to work with. The bottom 20% are at least not incompetent.
One last note:
> Or maybe the best of these students go on to work at companies that pay 250k/yr and I just don't meet them.
Most of my students at top-tier had offers in the $150K - $250K range. So if your sample of top-tier university CS students is sampled exclusively from non-entry-level engineers who did not receive offers in the $250K range, you're almost certainly sampling from the bottom 20% of top-tier graduates.
Your specific example of plagiarism is odd; I'm not sure what's difficult to grasp there. But wide ranging dedication (or professionalism if you prefer to call it that) is pretty standard at brick and mortar universities - it's pretty standard for a subset of people to prefer the partying and socialising aspects, and also pretty standard for quite a lot of people to drop out altogether.
If you steal a design for a jet turbine, you still have to learn how it works and you add that understanding to your personal/corporate skillset. Basically you have learned a secret.
If you have someone write your college essay for you, then you probably do so in order to avoid putting in the work to write the essay yourself. Basically, you have learned nothing.
I know there's nuance to both but I think your argument is overly-reductive.
Ultimately, however, I found the entire thing way too tedious - full of the classic "make-work" and silly hoops. The level of pedagogy was very basic.
I dream of (and am actively working towards) a day where the computer's potential as a new educational medium is fully realized, rather than the current parade of attempts at transplanting a brick and mortar classroom into a remote asynchronous delivery system.
One language app I tried for a few minutes was a VR app that put you, for example, on a train, and you had to have a relevant conversation. It was cool, not sure how effective because the GearVR I used at the time made me sick too quickly.
One of the distinguishing features of computers is easy simulation. Humans learn well by direct, tight interaction with a system – poking at a thing to see how it works. Tight feedback loops are how we quickly build intuitions.
Explorable Explanations are a great extant example of what I mean, but we can do more... Imagine a "textbook" constructed around a well-built simulator (extant example: Earth Primer). Sections of the textbook would present the simulator configured in a pre-set state, with various simplifications and initial conditions. Students can poke and prod these simulators, reset them, test out new states, and answer questions. Assignments could be on the order of "Given Sim[Initial Conditions], figure out what gets you to Sim[Desired State]".
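A minimal sketch of what one of those pre-set simulator sections might look like, in Python. Everything here (the pendulum model, its parameters, the "section preset") is a hypothetical illustration of the pattern, not code from any existing explorable:

```python
import math
from dataclasses import dataclass, replace

@dataclass(frozen=True)
class PendulumSim:
    """Toy undamped-pendulum state a textbook section could ship pre-configured."""
    angle: float          # radians from vertical
    velocity: float       # radians per second
    gravity: float = 9.81
    length: float = 1.0

    def step(self, dt: float = 0.01) -> "PendulumSim":
        # One explicit-Euler integration step; kept simple so students can read it.
        accel = -(self.gravity / self.length) * math.sin(self.angle)
        return replace(self,
                       angle=self.angle + self.velocity * dt,
                       velocity=self.velocity + accel * dt)

# A "section" presents the sim in a pre-set state; students poke at it,
# reset it, change initial conditions, and try to reach a desired state.
section_preset = PendulumSim(angle=0.5, velocity=0.0)
state = section_preset
for _ in range(100):          # simulate one second
    state = state.step()
print(round(state.angle, 2))  # the pendulum has swung past vertical
```

Because the state is an immutable value, "reset" is just going back to the preset, which makes the poke-reset-poke loop cheap for the learner.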
Current explorables are lovely, but (usually) incredibly bespoke. One concrete improvement would be to produce an OpenSim standard, which would allow other content creators to embed and customize these sims to construct new narratives.
Another powerful computer affordance is extreme specificity – a system should be able to build a representation of a learner's current knowledge / skill graph, and figure out a (or many) shortest path(s) from current knowledge to desired knowledge, scoped to that user's interests. (Many LMS systems attempt this, but we're still in the early, clunky days).
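To make the shortest-path idea concrete, here is a toy sketch: model prerequisites as a directed graph and BFS from what the learner already knows to the goal topic. The graph and topic names below are invented for illustration, not from any real LMS:

```python
from collections import deque

# Hypothetical prerequisite graph: an edge A -> B means "knowing A unlocks B".
prereqs = {
    "arithmetic": ["algebra"],
    "algebra": ["calculus", "linear_algebra"],
    "linear_algebra": ["machine_learning"],
    "calculus": ["machine_learning"],
    "machine_learning": [],
}

def shortest_learning_path(graph, known, goal):
    """Breadth-first search from every already-known topic to the goal topic."""
    queue = deque((topic, [topic]) for topic in known)
    seen = set(known)
    while queue:
        topic, path = queue.popleft()
        if topic == goal:
            return path
        for nxt in graph.get(topic, []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append((nxt, path + [nxt]))
    return None  # goal unreachable from current knowledge

print(shortest_learning_path(prereqs, {"algebra"}, "machine_learning"))
```

A real system would weight edges by estimated effort and scope the search to the learner's interests, but the core is just pathfinding over a knowledge graph.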
Next we have non-linearity. Most textbooks and courses impose a false idea of topic dependence. Yes, there are some intrinsic dependencies, but there are many MANY more ways (orderings) of moving through a learning space than your chapter textbook suggests.
I can go on like this forever. I myself am currently focused on constructing usable knowledge graphs (and localizing incoming students on them), and simulations students can easily play around with to build powerful intuition. For the latter, we've found the wealth of STEM tools available in Python to be a huge boon. We (I work for an ed-tech non-profit startup) usually teach students some programming, and then have them start building models (physics simulations) or interacting with existing toolkits (such as, recently, the Rosetta protein modeling suite). These tools act as a forcing function for real-world relevance, and allow for direct intuition building over rote memorization. More effective, and much more engaging.
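To give a flavor of the model-building step, here is the sort of first physics model a student might write; this is a hedged sketch with arbitrary numbers, not our actual course material:

```python
import math

def projectile_range(speed: float, angle_deg: float, g: float = 9.81) -> float:
    """Ideal (drag-free) range of a projectile launched over flat ground."""
    return speed ** 2 * math.sin(2 * math.radians(angle_deg)) / g

# Sweeping the launch angle lets a student rediscover that 45 degrees
# maximizes range, rather than memorizing the fact.
best_angle = max(range(1, 90), key=lambda a: projectile_range(20.0, a))
print(best_angle)  # 45
```

The point is the workflow: write a small model, interrogate it with a loop or a plot, and let the intuition fall out of the experiment.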
To address your comment: VR (and IMHO, AR) look to have a lot of potential in the future, though they are still in their infancy. I think there's a lot of potential for the humanities to provide more immersive experiences of places and time periods. (Re humanities, simulations also work quite well. Mock trials, political re-enactments, etc. There's a reason simulation games are so popular...)
I remember very well reaching an understanding of a subject by different means than what the teacher expected(?) or was taught to expect. That path of learning was rejected (despite the result being the same) and accordingly graded as a failure. Such resistance to different learning paths, a "status quo" of how people should learn, and the tendency to try and fit everybody into well-defined boxes is frustrating and counter-productive.
I assume you already know about PLATO, yes?
My ideas aren't novel, but every decade we try again and hope all the pieces are in place to start something which grows and thrives.
This online learning site is eligible for regional accreditation as of 2020, but they really play up the eligibility in a way that seems suspect. They aren't clear about whether they have even applied, but they have 5 years to apply before they have to have a redetermination of eligibility.
Not saying this is a scam, but it feels like a scam.
This site was very hard to find, for what it’s worth.
Not directly related to your specific concern, but every attempt to make a open / democratized degree has felt like that to me, unfortunately.
Example: There was an online degree program I saw that awarded a real, regionally accredited degree from a state university. The degree and coursework were exactly the same, but the online program was basically open. No admission requirements except to pay for and pass some prerequisites that would then grant you access to the rest of the degree. The equivalent brick and mortar program went through the typical admissions process.
My first thought: "this sounds very close to for-profit school".
However, UoPeople is different in that it's much, much less instructor-led. They use a "peer learning" pedagogical approach. It's something I quite dislike about the school. I consider it to be poorly implemented, and it leads to problems like "revenge grading." However, I tolerate it because it allows me to study in a structured environment for less than I'd pay at my local community college.
It's not something I would recommend for weak learners and/or for many who need a regionally accredited degree.
So I don't really care about the details. If I learn more, great, but otherwise I'm happy because I just want to be able to apply to get into Europe or Asia as a software developer in the future and not worry about whether I have enough points, etc., to qualify.
All of that being said, the first course was a gong show with lots of academic violations, but like all things in life it's what you put into it and I plan on graduating with Honors, so even if it's a weak degree I'll still be able to respect my work and effort.
For those unaware, that's the "study skills, introduction to college, college success" course.
Yeah, that was pretty awful. I don't mind the material. It was nothing new to me, but the quality of discourse, grades, assessments, and so on--ugh.
With each step through the CS prerequisites, things have improved. So it seems the less prepared students either wash out or get stuck doing general education courses. (Thankfully, I've got all that covered in transfer.)
The problem with all forced group work and not being able to choose the group is you get a very mixed quality of effort and talent. I'd stay away from any program that pushed that heavily.
The first is through discussion forums. We are required to post a response to a discussion prompt each week. We return to write a minimum of three responses to our peers, and we also assess their response on a scale of 0-10.
The idea is that peers will touch on different aspects, extend one another's learning, and engage in further discussion. In reality, we end up getting "great post" responses most of the time and minimal effort put in by our peers. That's not always the case, but it happens enough that I don't think it's actually helpful or a meaningful approach.
The other means is through peer assessment of regular assignments (e.g., 'papers,' programming assignments, etc.). We submit these one week, and then our peers read/study and assess our assignments based on established criteria. Peers are supposed to provide feedback on each aspect and then an overall response.
Again, the idea is that we learn through studying and responding to our peers. In reality, most students make minimal effort. So most of the time my assessment responses are limited to "good job." So, not really beneficial. (I've had a few really good responses, but it's rare.)
As for group work, I'm not aware of any instances of that at the school. (I've had it in graduate courses in the past, and I don't like it.)
I really do not like the "peer learning" system at UoPeople. I think it can work, but I don't think they've done a good job of implementing it. And the instructors I've had to date have only minimal interaction with students. (Though I don't really need it, so it doesn't concern me much.) I've considered transferring several times, but the idea of paying $1-2k per course at a state school vs. $100 per course at UoPeople doesn't appeal to me. (As I've noted elsewhere, I'm not doing this primarily for the degree. Instead, it's about learning for me.)
I should hope not given the fact that they are not regionally accredited and now that they have finally applied they likely won't be for another 2-3 years. Assuming they pass.
To my knowledge, the difference with for-profit schools is that you are essentially guaranteed a degree by enrolling and paying the tuition and fees. If the for-profit institution actually had a good reputation, I can imagine they could also implement more restrictive policies. But the way academia works, a large part of prestige comes with history - and this makes it nearly impossible for new schools to break into this echelon of elites. I suspect it's also for the same reason that for-profit schools are not able to consistently attract talent in terms of instruction, since the ones who can would prefer to be employed by more prestigious institutions.
A large majority of students who start at the University of Phoenix fail to graduate. The guarantee you speak of does not exist, not because the standards are high, but because the average student admitted to these universities is incapable of the work, or too disorganized to do it. There are plenty of not-for-profit universities that admit similar students, and they don't graduate either.
During a hotly contested political debate in my city a person who called himself a doctor (psych therapist, to be precise) showed up at a council meeting and threatened a city council member from my district.
Being the data-fluent person I am I offered to help, so I pulled all of the person in question's educational, legal, and accrediting history to see what would turn up.
It turned out that the person was a state employee before moving to my state but was fired after burning a car for insurance fraud during a divorce. In fairness to him he admitted to that crime and paid his fines, restitution, and community service and what not, but he could no longer work for the government in his previous role (police) after being convicted. So he managed to get himself into an online psych program like the one you're describing (from a state university, but awfully scam-looking), which got him an Education degree, not a medical degree. In short, he could have been a high school guidance counselor and that's about it. But what his online undergrad and master's degrees in psych education did get him was the ability to make his entry into a PhD program look legitimate. But wait, you say, what would make any PhD program accept such a person who had a sketchy online undergrad and master's degree?
Presumably it's a lot easier to stomach such a PhD candidate if the university in question is in the African nation of Malawi and is affiliated with the evangelical church that the person in question attends in the US. The Malawi university was one of the African universities that was churning out PhDs in suspect STEM disciplines back in the early 2000s to go around publishing papers about "intelligent design," if you remember those days. If you don't, it was the latest in a long line of harebrained evangelical attempts to end-run around Charles Darwin in textbooks in the early 2000s. To this day, the person in question advertises himself as a "Christian hypnotherapist" providing all of the usual counseling services, and has an ID in the government medical provider database so he can accept Medicare, Medicaid, and insurance payments. The state I live in has since tightened up applications for medical licenses for people moving in from out of state. Previously, they had what seemed to be a fast-track process for people licensed in other states that did not involve re-checking educational credentials (that was in the late 90s/early 00s).
So I could discern from looking at all of this that there was likely some sort of coach, seminar, consultant, or similar that was training people on how to do all of this back then. If not, this guy planned the whole thing as his escape from the car burning bit and new career as a psychiatrist with shady credentials. First he got the state university degree, then the state university online masters which was fairly worthless, then the shady online PhD from Africa, then as soon as the ink was dry on his conditional (based on criminal background) therapist license in the state he lived in he moved to my state, and applied for a new license based on his existing license in his previous state, all in an attempt to get out from under the criminal record and reinvent himself as a shady psych therapist.
And it worked! As far as I know, his "medical" practice is open today.
In the process of digging all of this dirt I found other professors at other religious affiliated universities in the city I live in that had very sketchy African university credentials, including one listed as a neurosurgeon. I also found quite a few high school teachers who were making regular trips to African countries, including Malawi, from this area that they surely could not afford to make on high school teacher salaries. Those teachers per their social media profiles were all associated with the two evangelical sects in question that the car burning cop therapist and the shady neurosurgeon came from.
The likely erosion of educational legitimacy from all of this (job markets, US university professorships, professional licenses, etc.) is fairly obvious, I'd say.
Psychiatrists are medical doctors, and the rules on who gets to practice in the US are for practical purposes written by the AMA. That guy was not a licensed psychiatrist. He wouldn’t even be allowed to sit the first levels of the requisite exams.
Looks like the edit window has passed, sorry!
The fact that they're master's programs with performance-based admissions makes them even more suspicious to me, but if you're looking for bachelor's programs like this, I think the ASU bachelor's hosted on Coursera also has such an admission path.
Our program aims to make admission less bureaucratic (no degree, no GRE, no recommendation letters, and no statement of purpose required), but we are still appealing to students who have the near-equivalent of a Bachelor's degree, whether through academic study, years of industry experience, or being self-taught to a high level. This is a rather rigorous program at the Master's level, equivalent to the content taught for many decades in our on-campus courses, but presented in slightly smaller pieces and in 8-week sessions rather than 16-week semesters.
I have heard of people getting entrance to a Master's program without a bachelor's degree, but not in technical disciplines and definitely not in a manner this open.
National accreditation boards in general, and especially those that are not field-specific, are seen as being less rigorous, faster to accredit, and more profit- or prestige-motivated than the regional accreditation boards. Most are relatively new and some of the most prominent (e.g. ACICS) have been involved in specific controversy over the quality of their work. Because the regional accrediting boards are older and better respected, the first question about a university that advertises a national accreditation tends to be "why aren't they regionally accredited?". It is both symptom and cause of this difference that most for-profit and otherwise "questionable" institutions are nationally accredited and not regionally accredited.
While some national accreditation bodies are considered rigorous and respectable (e.g. ABET), they tend to be field-specific (in the case of ABET, engineering) and often require regional accreditation as a prerequisite to accredit a university, as they consider regional accreditation to be the indication that the broader learning institution is able to provide a quality program. Usually specific departments or colleges of a university will go to these types of accreditation boards to add a credential to specific programs (e.g. "ABET accredited computer science program") above and beyond the university already having a regional accreditation.
There are schools with a shit curriculum that have regional accreditation. The reason it matters to Americans is that most large universities are significantly more accepting of transfer credits that come from a regionally accredited school.
I've been to nationally and regionally accredited schools. My degree was from a top 200 school in the US. The classes I took at the "lesser" institution were significantly more rigorous and significantly more effective at teaching skills relevant to the major I was studying.
I've known several people that went to schools that have been sued or otherwise have terrible reputations for being predatory for profit schools like ITT and the University of Phoenix. Their coursework was frequently more difficult than mine.
To be fair though, those schools were sued partly because of their shady high pressure sales tactics and for tricking people into taking on massive amounts of debt to take classes. That's pretty scummy, but based on the experiences of my friends it seemed like they at least had legitimate work to do in order to graduate.
Note: I don't doubt that on average the regionally accredited schools have been more thoroughly vetted. I'm just saying accreditation is not even close to perfect so the only reason it matters in the US is because the credits are easily transferable.
This didn't personally affect me very much, but I know of many people who had difficulty getting into graduate school or professional work because of this.
A degree from a nationally accredited university limits your options. I'm not saying you'll be completely screwed if you get it, but not as many graduate schools will consider you.
The tuition-free part is more of a marketing claim. At the end of the day, it cost me $4,060 for the whole degree, which is quite reasonable.
Speaking of employment and red flags, from LinkedIn I know that those who had studied at UoPeople work for many big companies including FAANG. No problem here.
30 hours per week for just 2 courses? Are these courses covering more than a typical university?
In my undergrad, I took a minimum of 5 courses per semester (16-20 credits). Scaling from your expectations, that would be 60-80 hours per week? I doubt I went over 40/week, including lectures (admittedly, it wasn't a demanding university).
Just giving you my 2 cents so you don't assume that the time you spent per course is all that different from a traditional university.
My only worry is the math course. Not having gone to university or high school in 20 years, I'm fearful. The other courses have been easy so far, though, but like you I'm working in the field, so it's not so new to me.
I'm happy I went through them, but yea, MATH1201 was kinda difficult and MATH1280 was super boring (but not as difficult to be honest).
On the other hand, I've had small classes worth only half the credits, with absolutely ridiculous workloads.
I got a bachelor's at a normal university while holding a full-time job during the last two years. I see no issue with this point.
Some people have brought this up about UoTP, and I think it may also apply to OU in the US: The name makes it sound fishy.
Slightly unrelated, but I find it weird that I, an American, can apply to some of their law degrees but not to a number of fully-online STEM degrees.
> However, more extensive experience may compensate for a lack of formal qualifications, and a strong, immediately-relevant qualification may compensate for a lack of professional experience.
I don't imagine US employers would have an issue with degrees from there. You can find it easily and it looks reputable.
The one thing to note, at least when I took it in early 2000, is that it's tough. I signed up for their business/finance program and it was quite the course workload.