Normally, it means having a joint appointment in two academic departments (e.g., Linguistics and Cognitive Science at State U, or a primary appointment in the Department of Linguistics at State U with a secondary appointment in the Head-and-Neck Surgery program at the local med school). This is a well-known and common practice, and although it can be tricky (particularly with even splits, where there's no one true home), it's not a 'harmful' thing.
As the author explains in the article itself, he's talking about an industry/academic split. This is much harder, for the reasons he's outlined, and as an academic, I too am skeptical. It could be a nice idea in moderation, and it'd be great to have more bridges between academia and industry, particularly given the brutality of the academic job market.
But I can easily see a professional administrator somewhere deciding that it's cheaper to stock departments with 20% appointees than to actually hire career professors and educators. And as any adjunct-heavy institution will tell you, a department full of moonlighters is no place to make a life. Perhaps more damning, 20% of anybody's life isn't enough to support anything but the weakest teaching, even for a single course, so over-reliance on this will just further damage the instructional core of universities.
So, one or two in a department could be nice, but I don't think it's a great model for the future.
If anything, it's responding to emails ;)
Quoting the piece, "Part of the point of being a big company is to control your environment by crushing, containing, or co-opting inconvenient innovations." I think the author is arguing that attitude is fundamentally at odds with the values of the academy.
It seems more likely to me that this article assumes zero-sum competition where there are actually positive-sum gains to be had. It's good that Facebook wants to pay for PhDs to do real, market-driven work instead of these people starving as adjuncts somewhere.
My experience with academia is that everyone is scrambling to get grants and get published. Nobody ever asked questions about where the grants came from. A lot (probably even the majority at the time) came from the Department of Defense and was explicitly targeted at creating weapons.
Professors spent a huge amount of their time writing grant proposals. It's like pitching to a bunch of VCs, only you do it every month, and the amounts of money are much smaller.
And this is the reward for a lifetime of achievement. If you're starting at the bottom today, conditions are positively Dickensian. The average (not the maximum, the average!) PhD in CS took 6 years. During that time you'll be paid almost nothing, no matter what the cost of living is around you. And you are essentially an indentured servant of the professor. If he wants you to do a routine task that has nothing to do with your research, you have to do it. Cumulatively these tasks could add up to years of delays. After you graduate, you'll probably have to take multiple postdoc jobs, often at very low salaries, in hopes of getting a faculty position. Sometimes the hopes come true, but very often not.
And from what I understand, CS is actually one of the "good" subjects to go to graduate school for. Things are much, much worse in the humanities.
It's truly incredible that anyone would hold this up as a better system than how industry works. Hmm, let's see... a two week interview process, after which the company will tell the applicant whether they're hired. Or, a two year postdoc after which the university may choose to throw them away like garbage. Spending half your time writing grants, versus spending a few minutes a week writing a status report. Come on.
Also, the section about how "the students will suffer" from industry partnerships reads like a bad joke. Students suffer because most universities hire faculty purely based on research, and not at all based on teaching. Full stop. The top research schools have contempt for teaching undergrads; that's why they hire adjuncts to do it at minimum wage. (Well, they also dump some of the burden on graduate students, too.)
With internships, I made over 60k a year in grad school. I worked on projects of my choice. I did not do a postdoc after graduating. I graduated from a small, unranked department and got a tenure-track position at an R1 university in a top 75 department.
My salary as a student wasn’t 40k. My income from my RA position plus my internship was more than 60k.
It worked out well since I would be doing basically the same research whether I was at the company or at my university.
Edits: ah, just saw in your profile it was software engineering!
I graduated in 5 years into my dream tenure-track job, and 7/8 of the students in my cohort got good tenure-track jobs too (the remaining one went back to running a successful business unrelated to her research). The school I'm at pays $40K/year stipends for PhD students, including the summer; so I think many people only count the 9-month stipend for PhD students which is about 27K, and not the summer salary.
I would have been thrilled if my freshman and sophomore computer science classes were taught by guest "professors" who had legitimate careers, and occasionally took the time to teach a class. I'm sure their feedback to my department would have been extremely valuable, because, at the time, every class was some professor's half-baked experiment on teaching computer science.
Now, almost 20 years later, I'd really enjoy the opportunity to take a school's almost-complete syllabus and teach it to students.
How does this apply to Artificial Intelligence? I'm sure there are lots of unwritten lessons from industry that haven't made their way back into academia.
He was pushed out because the school wanted professors who would do research and he just wanted to work and teach classes in the evening.
Seems like what you needed was a trade school. Maybe having a degree from a more prestigious university helped you get an initial job, but in terms of the things you were actually looking for, that's trade school stuff and barely overlaps with Computer Science.
The evolution, management, and use of large computer programs over long periods is pretty much unstudied. And yet hundreds of millions of people have to use these things every day.
The programming methods used by practitioners (and now taught in the Ivy League) have been developed by folk science, and are not rigorous in any way I can think of.
The Academy has failed in these two ways, at least. For large systems it's a market failure: the study of systems in the field over decades is not a way to get tenure, so it has largely not happened. The development practices created and advocated by academia failed in the field; so we got "Agile", and we can't get rid of it with science (or at least, we haven't so far).
We can dismiss this all as artisanal, but in that case we need to separate Computer Science from Software Engineering properly. This means that Comp Sci goes to the Maths faculty and SE goes to Engineering. Crucially, the expected level of funding for Comp Sci drops to Maths levels; buy a whiteboard and get on with it. This is no solution, to be honest, but it's better than the current situation, where the community claims to be tackling the problems of industry but actually addresses little in the core.
Also, this split absolutely does happen at some universities, and the funding is decided by agencies based on their priorities. One such university is the University of Waterloo, and its CS department is not wanting for funding... certainly not at the "buy a whiteboard and get on with it" levels you're suggesting.
The best right-out-of-school sysadmins I've seen were failed physicists. Apparently they run some moderately large stuff.
In terms of CRUD apps - my jaw is on the floor... Don't you care about the harm that is inflicted on the people doing the development, their victims (everyone), and the reputation of the infrastructure that they create? What about voting machines? Compulsory XKCD link: https://www.xkcd.com/2030/
I think that the lofty disregard is fine - just don't go arguing for grant funding on the basis of real world impact.
On AI and ML - where is the work that will enable these methods to actually be managed in the wild? How come the estimates of performance based on academic testing methodologies are so woeful? Why has the academy been content with "it provides 94% TP in test with 99% confidence, but when we ran it in production it gave us about 80% after review"?
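A rough sketch of the kind of gap I mean, on synthetic data (the numbers and the drift are made up purely for illustration; nothing here comes from a real deployment):

    # Synthetic illustration: a classifier that looks great on a clean held-out
    # test set degrades once the production data drifts. All data is fake.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)

    def make_data(n, drift=0.0):
        # Labels depend on the first feature; `drift` moves samples toward the
        # decision boundary, so the learned rule separates them less cleanly.
        X = rng.normal(size=(n, 2))
        y = (X[:, 0] > 0).astype(int)
        X[:, 0] -= drift * (2 * y - 1)
        return X, y

    X_train, y_train = make_data(5000)
    X_test, y_test = make_data(1000)              # clean "academic" test set
    X_prod, y_prod = make_data(1000, drift=0.3)   # drifted "production" data

    clf = LogisticRegression().fit(X_train, y_train)
    print("held-out test accuracy:", clf.score(X_test, y_test))   # close to 1.0
    print("production accuracy:   ", clf.score(X_prod, y_prod))   # noticeably lower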
Every single grant application offers an (often bogus) economic rationale for its purported benefits to society and the economy. The trope in mathematics is that everything is relevant to either cryptography or protein folding.
> theoretical CS which (tragically) effectively includes much of the database, programming language and methodology community, and much of the AI and ML community too, then this is a misallocation of capital
You may not accept it but there is a whole bunch of very theoretical mathematical work that goes on in AI and ML. There is a whole bunch of work that is more empirically grounded and less whiteboard as well. There is a whole spectrum on the whiteboard to deployed-in-the-real-world. That is why there are often Applied Physics programs, different from Physics programs, different from Engineering programs. And people in each of those have varying levels of overlaps with each other based on where they sit on the theoretical-applied spectrum.
> Don't you care about the harm that is inflicted on the people doing the development, their victims (everyone) and the reputation of the infrastructure that they create? What about voting machines?
I never said anything about not caring - this is a silly red herring. I was making a statement about the computational resources needed to solve people and project management issues in software engineering as a counter to your "just give them some whiteboards" comment. I still don't see why throwing more cloud compute resources at Software Engineering departments will make your Scrum meetings more efficient. In fact I don't know if academia is well-poised to solve such problems at all.
> arguing for grant funding on the basis of real world impact
The idea behind funding the sciences in academia is that we fund research that may have long-term impact on society. You don't get to throw a fit because every problem you have at work isn't being solved by someone sitting in a university.
> the estimates of performance based on the methodologies of testing from academia are so woeful?
Are you claiming that every experiment that comes out of a physics lab works flawlessly out in the real world? Or every paper from a life science lab goes on to successfully become a new medical treatment? I mentioned it before but there are often several fields of study dedicated to just taking highly controlled results from labs and trying to get them to work in the real world. Not everything makes it (especially in the life sciences example). AI/ML are at least better in that they often (but they should be doing it even more) give you what you need to replicate the lab experiment on the controlled, sanitized data.
Keeping in contact with industry is important even for those who work on the theoretical side, as a source of inspiration and applications. Even if the computer science itself is technically divorced from the language, watching it in action and understanding it helps one realize what is important, so researchers don't spend all of their effort minimizing memory footprint at the cost of other constraints. Following what industry is doing informs both the industry and the professor.
Theory without reference to reality can easily lead to a solipsistic spiral that leads nowhere, to put it politely. While it is good to explore theoretical foundations, breakthroughs come from noticing what exists in reality and understanding it. An immortal's eternity with classical physics in an infinite white-walled room with infinite ink would not come up with quantum physics, much less subatomic particles. Alienation from industry should be something to be ashamed of, not proud of.
Clearly, the professors didn't get the message that they were teaching classes at a teaching school, and not a research University.
I also need to emphasize that most students at this school do not pursue academic careers. Heck, most students pursuing a bachelor's degree do not pursue academic careers! If you think most educational institutions exist purely for research, then I don't think you understand the point of getting an education.
Completely clueless? I think you mean: clueless about working as a practitioner on a large, messy code base. No?
My experience shows the opposite. Academics write code that only they need to understand. And the type of deep thinking required for academia lends itself nicely to less context switching, i.e., massively long classes and functions and a procedural execution of steps.
On the other hand, in real life you have to worry about others modifying your code, sometimes at the same time as you, meaning abstractions, decomposition, decoupling, etc.
However Dijkstra, of course, always wrote impeccable code.
I'd say the raiding of AI faculty is a reality, so Facebook's proposal is better than the raw poaching that is occurring. With salaries more than 3-6x higher in industry than in academia, the cost of being in academia is high for AI researchers compared to other fields where it is close to 1.5x. The program I received my PhD from had many faculty leave entirely to join companies, and this "dual affiliation" seems like a reasonable compromise.
Facebook enables PhD students to be funded, provides access to massive resources for building datasets and for compute power, and removes much of the grant-writing burden, allowing one to focus more on curiosity-driven research.
Faculty have so many things that pull them away from research: grant writing/fundraising, teaching multiple courses, and university service. Because things are moving so fast in AI, staying current on research requires more time, which is hard to find in academia when so much time is spent doing other things.
This should be the headline.
The most critical issue here is Intellectual Property. In academia the IP is owned by the academic institution and has traditionally found its way into published research before being put through a technology transfer office or taken out of the university by its creator. Don't forget Stanford still gets HUGE annuities from its licensing of the tech to Brin/Page.
Corporations, by contrast, own the IP from any research from the outset, and history indicates that trade secrets or patents that can monopolize a technology built from the research will be pursued before, or simultaneously with, any publication in research journals.
You have to pay one of them eventually if you want to make a product with it. So the question is, do we want corporations or academic institutions being the primary driver/owner of new knowledge?
Or is it just totally pragmatic, and we let the one with the most money win?
The only exceptions I can think of are the fringes of physics attacked by complexity analysis - but these really are fringes!
Now, one of these professors announces that Facebook will pay them $500k instead, but Facebook will allow them to spend 20% of their time at the university and will pay you $50k.
Now you have $150k, and 1 class per year already in the bank. You need to find 4 more classes taught, and by spending $120k you are able to do that; you also have $30k left over for TAs.
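Spelling out that arithmetic (the per-course adjunct rate below is my inference from the $120k figure, not something stated):

    # Back-of-the-envelope for the scenario above: the department ends up with
    # $150k of freed budget and one of the professor's five classes still covered.
    freed_budget       = 150_000  # as stated above
    courses_to_fill    = 4        # the professor still teaches 1 of their 5 classes
    adjunct_per_course = 30_000   # inferred: $120k buys 4 courses of adjunct teaching

    adjunct_cost = courses_to_fill * adjunct_per_course  # 120,000
    left_for_tas = freed_budget - adjunct_cost           # 30,000
    print(adjunct_cost, left_for_tas)                    # 120000 30000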
I fear that the total commercialisation of academia means we are unlikely to see meaningful material gains for society, in terms of new cures for disease or technology advances, outside of those that can be monetised for recurring revenue in the next decade.
It's really an unfortunate but self-inflicted American problem.
> The proposal harms our students directly. Our faculty at their best secure everyone’s future by teaching talented students how to understand the challenges facing the broader world. Such mentorship is enriched by the courage, independence, security, and trained judgement of senior scholars to guide students’ perspectives on what is worth doing, what is likely irrelevant, and what is wrong. Engaging with a student body requires an all-in commitment, both in teaching and advising roles. Faculty primarily working elsewhere means cancelled classes. Faculty wedded to a company means advice that’s colored by the interest of the company.
I'm not sure I agree with the implications of what follows the first sentence in the paragraph above - these are rather broad generalizations.
Academia may certainly help students understand challenges of the larger world but in my experience this is as mixed a bag as other settings: working in the private sector, working in non-profits and volunteering.
Finally, faculty working elsewhere seems like a very common thing. I've hired academics as consultants in the past and worked at companies where this was common and seemed to be encouraged by their universities. Note that they weren't primarily working for us - so this is a good distinction - but it also begs for a clearer definition of what "all-in commitment" means above.
(edits: grammar, etc.)
Pick your poison here. Either you're a servant to teaching, training grad students, chasing grants, and academic politics, or you only get to work on things that will make Facebook, Google, etc, money.
Without dumping blame on universities, the point is that the cost of "teaching" gets passed on to students, saddling them with debt. Profit or no, the math of most tuitions often boils down to students paying what might be the cost of a 1:1 student:teacher ratio - effectively paying an entire adjunct's salary (gross pay), for 4 years straight, single-handedly.
Take a look at what that means when the reality is a 30:1 ratio or worse. The rationalization is that they get a diversity of expertise, vetted for world-class quality (hopefully), even if they don't get the one-on-one personal touch of a direct hands-on apprenticeship with the personal attention of a mentor.
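Rough numbers for that comparison (the tuition and adjunct-pay figures below are assumptions, picked only to make the ratio visible):

    # Back-of-the-envelope for the tuition-vs-instruction argument above.
    # Both dollar figures are assumptions for illustration only.
    annual_tuition     = 40_000  # assumed sticker-price tuition
    adjunct_gross_pay  = 40_000  # assumed annual gross pay for a full adjunct load
    students_per_class = 30      # the 30:1 ratio mentioned above

    # One student's tuition roughly covers one adjunct outright (the 1:1 framing)...
    print(annual_tuition / adjunct_gross_pay)                       # ~1.0
    # ...but at 30:1, the instructional labor attributable to that student is tiny.
    print(adjunct_gross_pay / students_per_class)                   # ~1,333 per year
    print(adjunct_gross_pay / students_per_class / annual_tuition)  # ~3% of tuition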
But hey, yeah, the campus grounds cost money, and accreditation, and administrative bureaucratic overhead, and so on.
But yeah, adjuncts make shit, work part time, and have more than one source of income anyway. So, suck it, ivory tower!
The number of times I've read a sentence that started with something similar to "Wearing my community hat, .." and felt the sentence was anything but community-oriented is way higher than I would like. I'm sure a certain percentage of this is actually my own biases, but I'm also pretty sure that percentage isn't 100%.
I disagree. Perhaps the most famous exception is Grace Hopper, a naval reservist who had both a civilian employer and a separate military career. https://en.wikipedia.org/wiki/Grace_Hopper
I too am a reserve officer, though Army not Navy: two separate careers with often unrelated skills in unrelated industries. Yes, there are challenges to the division of effort in this regard. But to say humans are incapable of doing this is ignorance from people who have never tried it or who failed magnificently at it.
Another example I worked with personally: MG Scottie Carpenter. He is also an Army Reservist with two separate careers. When I worked with him in his first general command, he was a senior leader of the North Carolina State Troopers (state police). He is now the deputy commanding general of the Army Reserve. http://www.usar.army.mil/Leadership/Article-View/Article/126...
Yes, competent and career-minded individuals can achieve dual affiliation, serving two masters. It is completely possible, and some people excel at it.
What people don't see about dual affiliation is that there is extra insight gained from these struggles that other people cannot relate to. I have tried to explain this to people many times, and it is often utterly incomprehensible to them.
In the civilian corporate world, software development is a big, common thing. In the military it is nearly nonexistent. The primary reason for this is workplace culture and the near absence, in the civilian world, of a professional structure around software that the military could model internally. By professional structure I mean a widely accepted definition of skills (even in an ad hoc, de facto way), licensing, or a code of conduct that defines the profession; none of these exist. Trying to explain how the military is behind the times and could save hundreds of millions of dollars a year by aggressively building an internal professional culture of its own around software development is equally frustrating.
Another example is that people in the civilian world are sometimes easily offended. This is incredibly frustrating when every conversation is a midnight tiptoe on eggshells. This accepted degree of sympathy and sensitivity is what, in my opinion, allows the Dunning-Kruger effect to occur (sometimes rampantly). https://en.wikipedia.org/wiki/Dunning%E2%80%93Kruger_effect
Conversely, in the military you want to be just kind enough to avoid crushing someone's soul and destroying their self-esteem, unless they have honestly earned a good soul-destroying moment. Kindness has far lower value than honesty, which makes for a wildly different work culture. Jumping between these work cultures can be disorienting. Brutal honesty is a pretty simple thing to figure out, even if emotionally scarring, compared to guessing at people's self-serving emotional motives. So in the corporate world you really have to test the waters slowly before challenging people's opinions, even if you have 20 years of experience and they have none.
Perhaps the most similar quality between the military and the corporate software world is the variety of irrational things people do to assert job security. In the military it is hard to fire people, but it is easy to cancel a contract and swiftly eliminate a large swath of civilian contractors. This can result in software products that are massively complex and hardware-bound, so that they need continued, exclusive, and highly specialized support from particular vendors. In the corporate world, on the other hand, it is every person for themselves, resulting in high doses of "invented here" syndrome. God forbid software developers ever expose their incompetence by writing original software; instead they write as little software as possible and simply glue third-party products together as much as possible. This is why many developers in the web world spend their careers painfully specializing in framework/library APIs instead of spending a few hours learning the standard APIs that everything compiles down to. https://en.wikipedia.org/wiki/Invented_here
* People would work at an academic institution XOR at your company: you cannot give the grant to the best in your field and still have them work for you, so you'd always want to hire the best you can and give the grants to the rest;
* Grants most often support publications, not working solutions. At least with government-sponsored grants, it's best when the academic analysis and publication are sponsored by a grant, whereas industrial involvement guarantees that the work is relevant and the published solution actually works.
In theory, the contract structure seems a lot more limited than an NIH, NSF, etc. grant, where you are minimally constrained by the proposal, but in practice, the program managers seem willing to amend the contract so that no one collects a bunch of obviously useless data.
There is sort of a continuum between consulting and a research contract, admittedly, but I'll also note that I have government grants with deliverables. All it really means is that there's potentially interim products that need to be delivered, or that there's things that are less vague than papers and presentations that the agency/sponsor wants to see.
80/20 is more like "Somewhere between 60/40 and 100/0"
I had a boss ask me not to take a second university class; this was smart for everyone involved.
There are busy times at work and boring times at work; to claim that it's unrealistic to manage a structured life outside of work is silly.
I've worked with professors who let me take 2 weeks off for work travel.
I've had to cut out of work because I had a class. I came in early the next day and prepared for my meeting, things were fine.
I'm a big fan of moderation, and this article describes an extreme situation that is temporary and, more often than not, unlikely.