Hacker News
You Cannot Serve Two Masters: The Harms of Dual Affiliation (argmin.net)
255 points by stochastician on Aug 10, 2018 | 79 comments



Just a quick note: 'Dual Affiliation' is being used here differently than in most academic contexts, so the headline's a bit odd.

Normally, it means having a joint appointment in two academic departments (e.g. Linguistics and Cognitive Science at State U), or an appointment split across institutions (e.g. the Department of Linguistics at State U plus an appointment in the head-and-neck surgery program at the local med school). This is a well-known and common practice, and although it can be tricky (particularly with even splits, where there's no one true home), it's not a 'harmful' thing.

As the author explains in the article itself, he's talking about an industry/academic split. This is much harder, for the reasons he's outlined, and as an academic, I too am skeptical. It could be a nice idea in moderation, and it'd be great to have more bridges between academia and industry, particularly given the brutality of the academic job market.

But I can easily see a professional administrator somewhere deciding that it's cheaper to stock departments with 20% appointees than to actually hire career professors and educators. And as any adjunct-heavy institution will tell you, a department full of moonlighters is no place to make a life. Perhaps more damning, 20% of anybody's time isn't enough to support anything but the weakest teaching, even for a single course, so over-reliance on this will just further damage the instructional core of universities.

So, one or two in a department could be nice, but I don't think it's a great model for the future.


Dual affiliation meaning an industry/academic split is unambiguous here, since that's the term Facebook AI Research chose to use [1][2] and the authors are explicitly responding to Facebook's model.

[1]: https://www.businessinsider.com/facebook-yann-lecun-dual-aff... [2]: https://www.facebook.com/schrep/posts/10156638732909443


Facebook's reappropriation of a well-understood term in academia doesn't mean it's not ambiguous... If anything, it's disingenuous.


Maybe better to call it misguided.


I think particularly in the case of FAIR, this is a response to Facebook being outspoken about "dual affiliation" for their AI researchers, so it's really a response to Facebook specifically.

https://www.facebook.com/yann.lecun/posts/10155435964892143


For the last year of my degree our teachers were mostly contract teachers taken from industry. The courses they taught suffered fairly heavily for it. Their contracts covered only the time spent at the school, so marking, coming up with assignments, and preparing lectures were all done on the teachers' own time. This led to some pretty half-assed efforts by the teachers, arguably not entirely their fault, and in one case a teacher made trouble for students working on a project outside of school and then used his position to acquire grant money for work he didn't do.


The bulk of academic work is chasing grants, not "curiosity driven research". If universities want professors not to leave in droves to tech companies that will consistently give them funding and cut away the bullshit that eats up the majority of a professor's time, they should try competing, rather than bemoaning that professors are no longer following the sacred path of academic asceticism.


While I would like to stop chasing grants as much as the next tenure track assistant professor, I don't think it's fair to characterize "the bulk" of academic work as chasing grants.

If anything, it's responding to emails ;)


Emails and sitting in meetings (e.g. https://twitter.com/research_tim/status/1017506139137826819 ).


I do not think this is what the piece is arguing. Rather, the concern is that by creating these 80/20 splits, the core values of the university are compromised. There's nothing a priori wrong with industrial research, it's this attempted hybrid that's problematic. Hence the title, "you cannot serve two masters".

Quoting the piece, "Part of the point of being a big company is to control your environment by crushing, containing, or co-opting inconvenient innovations." I think the author is arguing that attitude is fundamentally at odds with the values of the academy.


If BigCompany expects its research center to crush inconvenient innovations, they're not really running a research center, they're just calling it that because they like the titles.


i read "inconvenient" as "competing". if your r&d can lead to a few choice patents (or you "just" buy them) you can sometimes hamstring competition or get little graft by leeching off competitor's developments.


The point of being a big company, not the point of running a research center.


Top conferences accept papers with a large expected impact, not necessarily papers the authors were really curious about. It's deeper than merely how time is spent; it's also where accolades are given. The Facebook arrangement exposes researchers to real problems and provides good data, good starts to high-impact papers.


The article says "two masters", which implicitly assumes industry would be some new master pulling professors in some new direction. But right now professors already have multiple masters: undergrads, grads, administration, grants/funding, etc. The idea that the previous master was "curiosity-driven research" and now it will be "shareholders" is an exaggeration. I could cynically say the previous master was "least publishable unit."

It seems more likely to me that this article assumes zero-sum competition where there are actually positive-sum gains to be had. It's good that Facebook wants to pay for PhDs to do real market-driven work instead of these people starving as adjuncts somewhere.


Wow. I don't even know where to start on this one.

My experience with academia is that everyone is scrambling to get grants and get published. Nobody ever asked questions about where the grants came from. A lot (probably even the majority at the time) came from the Department of Defense, and was explicitly targeted at creating weapons.

Professors spent a huge amount of their time writing grant proposals. It's like pitching to a bunch of VCs, only you do it every month, and the amounts of money are much smaller.

And this is the reward for a lifetime of achievement. If you're starting at the bottom today, conditions are positively Dickensian. The average (not the maximum, the average!) PhD in CS took 6 years. During that time you'll be paid almost nothing, no matter what the cost of living is around you. And you are essentially an indentured servant of the professor. If he wants you to do a routine task that has nothing to do with your research, you have to do it. Cumulatively these tasks could add up to years of delays. After you graduate, you'll probably have to take multiple postdoc jobs, often at very low salaries, in hopes of getting a faculty position. Sometimes the hopes come true, but very often not.

And from what I understand, CS is actually one of the "good" subjects to go to graduate school for. Things are much, much worse in the humanities.

It's truly incredible that anyone would hold this up as a better system than how industry works. Hmm, let's see... a two-week interview process, after which the company will tell the applicant whether they're hired. Or, a two-year postdoc after which the university may choose to throw them away like garbage. Spending half your time writing grants, versus spending a few minutes a week writing a status report. Come on.

Also, the section about how "the students will suffer" from industry partnerships reads like a bad joke. Students suffer because most universities hire faculty purely based on research, and not at all based on teaching. Full stop. The top research schools have contempt for teaching undergrads; that's why they hire adjuncts to do it at minimum wage. (Well, they also dump some of the burden on graduate students.)


It seems like once a week I see comments like this. This was not my experience at all.

With internships, I made over 60k a year in grad school. I worked on projects of my choice. I did not do a postdoc after graduating. I graduated from a small, unranked department and got a tenure-track position at an R1 university in a top 75 department.


I don't know a single person who has gotten a faculty position recently (past 5 years) without one or more postdocs. I'd say this is an extremely uncommon experience. Nor have I seen salaries above 40k for students; even postdocs don't always make 50k.


All of my friends on the market this year from a variety of schools got faculty positions without postdocs. It isn’t uncommon in CS.

My salary as a student wasn’t 40k. My income from my RA position plus my internship was more than 60k.


I guess I'm not in CS; maybe that's the difference.


60k per year doing internships? I thought a typical tech internship paid $5-8k/month, which means at most $25k over the summer. (Maybe my number is for undergrads, and grad student interns get paid significantly more?)


$6-9k a month for research internships with the ability to extend them beyond 3 months.

It worked out well since I would be doing basically the same research whether I was at the company or at my university.


What field? Even at top tier institutions I've never seen grad students make more than 40k, but I'm thinking of research heavy PhDs, where you continue your research over the summer.

Edits: ah, just saw in your profile it was software engineering!


I had the same experience as the grandparent, and averaged 70K a year. I had a lot of great opportunities during my Ph.D., so I was really fortunate, but many of the students I met during internships were serial interns and got fellowships too, so it seemed at the time like most strong PhD students in Computer Science averaged about 60K a year and enjoyed it. Also note that you don't pay FICA (Social Security / Medicare) on your PhD stipend, so 25K is like 30K.

I graduated in 5 years into my dream tenure-track job, and 7/8 of the students in my cohort got good tenure-track jobs too (the remaining one went back to running a successful business unrelated to her research). The school I'm at pays $40K/year stipends for PhD students, including the summer; so I think many people only count the 9-month stipend for PhD students which is about 27K, and not the summer salary.


I continued my research at companies every summer. Seems fairly common in CS, so you get access to real data/systems and good salaries.


I think this is fairly unique to CS. In physics, I had much the other experience.


When I took computer science in 1999-2003, the best professors had rather significant industry experience. The worst professors were completely clueless because they never worked with a large computer program in their life.

I would have been thrilled if my freshman and sophomore computer science classes had been taught by guest "professors" who had legitimate careers and occasionally took the time to teach a class. I'm sure their feedback to my department would have been extremely valuable, because, at the time, every class was some professor's half-baked experiment in teaching computer science.

Now, almost 20 years later, I'd really enjoy the opportunity to take a school's almost complete syllabus, and teach it to students.

How does this apply to Artificial Intelligence? I'm sure there are lots of unwritten lessons from industry that haven't made their way back into academia.


My professor for senior level graphics was a prominent employee of the major visual effects company in town and he taught a fantastic course.

He was pushed out because the school wanted professors who would do research and he just wanted to work and teach classes in the evening.


> The worst professors were completely clueless because they never worked with a large computer program in their life.

Seems like what you needed was a trade school. Maybe having a degree from a more prestigious university helped you get an initial job, but in terms of the things you were actually looking for, that's trade school stuff and barely overlaps with Computer Science.


This is a problem on so many levels.

The evolution, management, and use of large computer programs over a long period is largely unstudied. And yet hundreds of millions of people have to use these things every day.

The programming methods used by practitioners (and now taught in the Ivy League) have been developed by folk science, and are not rigorous in any way I can think of.

The Academy has failed in these two ways, at least. For large systems it's a market failure: the study of systems in the field over decades is not a way to get tenure, so it has largely not happened. The development practices created and advocated by academia failed in the field; so we got "Agile", and we can't get rid of it with science (or at least, we haven't so far).

We can dismiss this all as artisanal, but in that case we need to separate Computer Science from Software Engineering properly. This means that Comp Sci goes to the Maths faculty and SE goes to Engineering. Crucially, the expected level of funding for Comp Sci goes to Maths levels; buy a whiteboard and get on with it. This is no solution, to be honest, but it is better than the current situation, where the community claims to be tackling the problems of industry but actually addresses little in the core.


I actually agree with your comment about separating Software Engineering and Computer Science, but completely disagree with the funding-level statement. The notion that branches of Mathematics don't use computer resources is absolutely laughable. If anything, I don't see why the best practices for managing people who are developing the same CRUD apps over and over again need any significant investment of any kind besides a few laptops. Computer Science cannot and should not give a shit about development practices like Agile or its alternatives. CS people do own a lot of problems facing industry, like machine learning and AI, and they address plenty of problems "in the core"...just nothing to do with process.

Also, this split absolutely does happen at some universities, and the funding is decided by agencies based on their priorities. One such university is the University of Waterloo, and the CS department is not wanting for funding...certainly not at the "buy a whiteboard and get on with it" levels you're suggesting.


>The notion that branches of Mathematics don't use computer resources is absolutely laughable.

The best right-out-of-school sysadmins I've seen were failed physicists. Apparently they run some moderately large stuff.


I had a really good one work for me - but he went off back to physics so that he could play with a real computer!


I think there is a conflation that needs sorting out. Mathematicians and physicists may need funding at a high level, and they may deserve it in a philosophical and natural-justice sense as well. Computer Science argues for (and gets) high levels of funding by asserting economic rationales with a justification that physics and maths struggle to match. My view is that if we are talking about the SE side of the shop then this is rational and fine, but if we are talking about theoretical CS, which (tragically) effectively includes much of the database, programming language, and methodology community, and much of the AI and ML community too, then this is a misallocation of capital.

In terms of CRUD apps - my jaw is on the floor... Don't you care about the harm that is inflicted on the people doing the development, their victims (everyone), and the reputation of the infrastructure that they create? What about voting machines? Compulsory XKCD link: https://www.xkcd.com/2030/

I think that the lofty disregard is fine - just don't go arguing for grant funding on the basis of real world impact.

On AI and ML - where is the work that will enable these methods to actually be managed in the wild? How come the estimates of performance based on academic testing methodologies are so woeful? Why has the academy been content with "it gives 94% true positives in testing with 99% confidence, but when we ran it in production it gave us about 80% after review"?


> Computer Science argues for (and gets) high levels of funding by asserting economic rationales for funding with a justification that physics and maths struggle to match.

Every single grant application makes an (often bogus) economic rationale for its purported benefits to society and the economy. The trope in mathematics is that everything is relevant for either cryptography or protein folding.

> theoretical CS which (tragically) effectively includes much of the database, programming language and methodology community, and much of the AI and ML community too, then this is a misallocation of capital

You may not accept it, but there is a whole bunch of very theoretical mathematical work that goes on in AI and ML. There is a whole bunch of work that is more empirically grounded and less whiteboard as well. There is a whole spectrum from the whiteboard to deployed-in-the-real-world. That is why there are often Applied Physics programs, different from Physics programs, different from Engineering programs. And people in each of those have varying levels of overlap with each other based on where they sit on the theoretical-applied spectrum.

> Don't you care about the harm that is inflicted on the people doing the development, their victims (everyone) and the reputation of the infrastructure that they create? What about voting machines?

I never said anything about not caring - this is a silly red herring. I was making a statement about the computational resources needed to solve people and project management issues in software engineering as a counter to your "just give them some whiteboards" comment. I still don't see why throwing more cloud compute resources at Software Engineering departments will make your Scrum meetings more efficient. In fact I don't know if academia is well-poised to solve such problems at all.

> arguing for grant funding on the basis of real world impact

The idea behind funding the sciences in academia is that we fund research that may have long-term impact on society. You don't get to throw a fit because every problem you have at work isn't being solved by someone sitting in a university.

> the estimates of performance based on the methodologies of testing from academia are so woeful?

Are you claiming that every experiment that comes out of a physics lab works flawlessly out in the real world? Or every paper from a life science lab goes on to successfully become a new medical treatment? I mentioned it before but there are often several fields of study dedicated to just taking highly controlled results from labs and trying to get them to work in the real world. Not everything makes it (especially in the life sciences example). AI/ML are at least better in that they often (but they should be doing it even more) give you what you need to replicate the lab experiment on the controlled, sanitized data.


That reminds me of the foibles of some of the Greek philosophers, who thought that checking whether a theory had anything to do with reality would pollute the purity of thought, and thus concluded that projectiles must follow triangular paths, dropping at the end once they run out of impetus, and that women must have fewer teeth than men because they have smaller heads and all teeth are clearly the same size. Their 'wisdom' was clearly the height of foolishness despite lifetimes of thinking, because they insisted upon their alienation as an inherent good.

Keeping in contact with industry is important even for those who work on the theoretical, as a source of inspiration and applications. Even if the computer science itself should be technically divorced from the language, watching the language in action and understanding it helps researchers realize what is important, so they don't spend all of their effort on minimizing memory footprint at the cost of other constraints. Following what industry is doing informs both the industry and the professor.

Theory without reference to reality can easily lead to a solipsistic spiral that leads nowhere, to put it politely. While it is good to explore the theoretical foundations, breakthroughs come from noticing what exists in reality and understanding it. An immortal's eternity with classical physics in an infinite white-walled room with infinite ink would not come up with quantum physics, much less subatomic particles. Alienation from industry should be something to be ashamed of, not proud of.


It was a technical university that promoted itself as preparing students to enter engineering careers. To be quite blunt, during their open houses they advertised that students would have high salaries upon graduation.

Clearly, the professors didn't get the message that they were teaching classes at a teaching school, not a research university.

I also need to emphasize that most students at this school do not pursue academic careers. Heck, most students pursuing a bachelor's degree do not pursue academic careers! If you think most educational institutions exist purely for research, then I don't think you understand the point of getting an education.


> The worst professors were completely clueless because they never worked with a large computer program in their life.

Completely clueless? I think you mean: clueless about working as a practitioner on a large, messy code base. No?


I don't get this narrative (and the memes) that university code is clear and perfect and real-life code is ugly, dirty, and hacky.

My experience shows the opposite. Academics write code that only they need to understand. And the type of deep thinking required for academia lends itself nicely to less context switching, i.e. massively long classes and functions and procedural execution of steps.

On the other hand, in real life you have to worry about others modifying your code, sometimes at the same time as you, meaning abstractions, decomposition, decoupling, etc.


You misunderstood me. I wasn't implying academics write clean code, but exactly what you're saying: usually it's small and dirty code.

However, Dijkstra, of course, always wrote impeccable code.


I'm a professor working in AI.

I'd say the raiding of AI faculty is a reality, so Facebook's proposal is better than the raw poaching that is occurring. With industry salaries 3-6x higher than academic ones, the cost of being in academia is high for AI researchers compared to other fields, where the gap is closer to 1.5x. The program I received my PhD from had many faculty leave entirely to join companies, and this "dual affiliation" seems like a reasonable compromise.

Facebook enables PhD students to be funded, provides access to massive resources for building datasets and for compute power, and removes much of the grant-writing burden, allowing one to focus more on curiosity-driven research.

Faculty have so many things that pull them away from research: grant writing/fundraising, teaching multiple courses, and university service. Because things are moving so fast in AI, it takes more time to stay current on research, which is hard to do in academia when so much time is spent doing other things.


You can’t do disruptive entrepreneurism if 80% of what you do is owned by a big company.

This should be the headline.

The most critical issue here is Intellectual Property. In academia the IP is owned by the academic institution and has traditionally found its way into published research before being put through a technology transfer office or taken out of the university by its creator. Don't forget Stanford still gets HUGE annuities from licensing the tech to Brin/Page [1].

Alternatively, corporations own the IP from any research from the outset, and history would indicate that trade secrets or patents that can monopolize a technology made from the research will be pursued before, or simultaneously with, any publishing in research journals.

You have to pay one of them eventually if you want to make a product with it. So the question is, do we want corporations or academic institutions being the primary driver/owner of new knowledge?

Or is it just totally pragmatic, and we let the one with the most money win?

[1]https://www.redorbit.com/news/education/318480/stanford_earn...


I think an important factor is the salary disparity. Unless you live in the middle of nowhere, academic salaries cannot provide an upper-middle-class lifestyle anymore. Academics are only human, and it is natural to want the lifestyle all of their peers from grad school and college (or even their recently graduated students who work for FAANG) seem to enjoy. Unless academic salaries in CS increase significantly (and match, say, business school salaries), this trend will continue.


I'm guessing salaries may come down somewhat for CS industry jobs as the supply of people with software development training increases. Salaries are much lower in physical science, engineering, and biology, even for PhDs, compared to software development/CS/"tech". In those fields industry salaries are still higher than academic ones, although the difference is not as dramatic.


Why does "faculty working elsewhere mean cancelled classes" - if the faculty is paid for 100% by facebook, but works 20% at the university does this not mean that the students receive a bonus teacher? Is there no scope for enrichment of computer science by industry? And what is "academic computer science"? I mean, look down at your keyboard - nothing in computer science is purely academic; it's the most applied of domains!

The only exceptions I can think of are the fringes of physics attacked by complexity analysis - but these really are fringes!


Working 80% for Facebook basically precludes the time commitment for teaching classes.


As per many other comments - doing admin and applying for grants takes up ~80% of normal academic time - and yet classes get taught.


Ok - let's say that the departmental budget for professors is $1m, and you pay $100k each to fund 10 professors, who then teach 50 classes. All is well.

Now, one of these professors announces that Facebook will pay them $500k instead, but will allow them to spend 20% of their time at the university, and Facebook will pay you $50k.

Now you have $150k (the freed-up $100k salary plus Facebook's $50k), and 1 class per year already in the bank. You need to find 4 more classes taught; by spending $120k you are able to do that, and you still have $30k for TAs.
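To make that back-of-the-envelope math explicit, here's a minimal sketch in Python using the hypothetical numbers above (all figures are illustrative, not real budget data):

    # Hypothetical numbers from the comment above, not real budget data
    budget = 1_000_000          # departmental budget for professors
    salary = 100_000            # per-professor salary
    professors = 10             # 10 x 100k = 1m, teaching 50 classes total
    classes_per_prof = 5

    # One professor moves to an 80/20 split: the department stops paying
    # their salary, Facebook contributes $50k, and the professor still
    # teaches 1 class (the 20%).
    freed_salary = salary
    facebook_contribution = 50_000
    available = freed_salary + facebook_contribution        # 150,000
    classes_to_replace = classes_per_prof - 1               # 4

    adjunct_cost = 120_000      # hypothetical cost to cover those 4 classes
    leftover_for_tas = available - adjunct_cost             # 30,000
    print(available, classes_to_replace, leftover_for_tas)  # 150000 4 30000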


So I'm actually a professor. And while admin and grants are a lot of my day, they're built into the assumptions of having an academic appointment. We know. Packing all of your university-related duties into 20% of your time? That means classes won't get taught.


Our government, and more so our society through cultural and governmental apathy, has devalued anything academic that does not produce material gains: capital, wealth, patents, etc.

I fear that the total commercialisation of academia means that in the next decade we are unlikely to see meaningful material gains for society, in terms of new cures for disease or technology advances, outside of those that can be monetised for recurring revenue.

It's really an unfortunate but self-inflicted American problem.


It's a problem, sure... but the problem is that we're not funding schools. If we spent some tax dollars on our education system, academia would be more independent from business, and could focus more on the sorts of long-term and difficult-to-monetize research that business is not so great at.


Semi-orthogonally, in my opinion, Ben Recht is one of the most important voices in ML and AI. I highly recommend people follow him.


I agree; he is a very crisp and clear thinker.


snip snip:

> The proposal harms our students directly. Our faculty at their best secure everyone’s future by teaching talented students how to understand the challenges facing the broader world. Such mentorship is enriched by the courage, independence, security, and trained judgement of senior scholars to guide students’ perspectives on what is worth doing, what is likely irrelevant, and what is wrong. Engaging with a student body requires an all-in commitment, both in teaching and advising roles. Faculty primarily working elsewhere means cancelled classes. Faculty wedded to a company means advice that’s colored by the interest of the company.

I'm not sure I agree with the implications of what follows the first sentence in the paragraph above - these are rather broad generalizations.

Academia may certainly help students understand challenges of the larger world but in my experience this is as mixed a bag as other settings: working in the private sector, working in non-profits and volunteering.

Finally, faculty working elsewhere seems like a very common thing. I've hired academics as consultants in the past and worked in companies where this was common and seemed encouraged by their universities. Note that they weren't primarily working for us - so that's a useful distinction - but it also calls for a clearer definition of what the "all-in commitment" above means.

(edits : grammar etc)


Yeah, working 80% of your time in industry would be catastrophically destructive to most academics' intellectual freedom. You know what else would be catastrophic to their intellectual freedom? Working 20% of their time at a university.


Where exactly do I sign up to get paid to do research on topics that don't have immediate business application?

Pick your poison here. Either you're a servant to teaching, training grad students, chasing grants, and academic politics, or you only get to work on things that will make Facebook, Google, etc, money.


FAIR, DeepMind, Brain, MSR and many others all employ large numbers of people working on pure research with no immediate business application, and with at least as much freedom as their academic counterparts.


Last time I checked, adjuncts make shit, get shat on, and have to work multiple jobs anyway.

Without dumping blame on universities, the point is that the cost of "teaching" gets passed on to students, saddling them with debt. Profit or no, most tuitions often work out to students paying something like the cost of a 1:1 student:teacher ratio: effectively paying an entire adjunct salary (gross pay), single-handedly, for 4 years straight.

Take a look at what that means when the reality is a 30:1 ratio or worse. But the rationalization is: they get a diversity of expertise, vetted for world-class quality (hopefully), even if they don't get the one-on-one personal touch of a direct, hands-on apprenticeship with the personal attention of a mentor.

But hey, yeah, the campus grounds cost money, and accreditation, and administrative bureaucratic overhead, and so on.

But yeah, adjuncts make shit, work part time, and have more than one source of income anyway. So, suck it, ivory tower!


I feel lots of this applies to Open Source too. The notion of wearing the company or community hat is pretty common, and often used to persuade others of your good intentions.

The number of times I've read a sentence that started with something similar to "Wearing my community hat, .." and felt the sentence was anything but community-oriented is way higher than I would like. I'm sure a certain percentage of this is actually my own biases, but I'm also pretty sure that percentage isn't 100%.


TL;DR: some professors are worried that their poorly paid students, doing grunt-level work the professors consider beneath them, will realise quite quickly which is the better deal after working 80% for a company and 20% for a university.


> This model assumes people can slice their time and attention like a computer, but people can’t do this.

I disagree. Perhaps the most famous exception is Grace Hopper, a naval reservist who had both a civilian employer and a separate military career. https://en.wikipedia.org/wiki/Grace_Hopper

I too am a reserve officer, though Army not Navy. Two separate careers with often unrelated skills in unrelated industries. Yes, there are challenges to the division of effort in this regard. To say humans are incapable of doing this, though, is ignorance from people who have never tried it or who have magnificently failed at it.

Another example I worked with personally: MG Scottie Carpenter. He is also an Army Reservist with two separate careers. When I worked with him in his first general command he was a senior leader of the North Carolina State Troopers (state police). He is now the deputy commanding general of the Army Reserve. http://www.usar.army.mil/Leadership/Article-View/Article/126...

Yes, competent and career-minded individuals can achieve dual affiliation, serving two masters. It is completely possible, and some people excel at it.

---

What people don't see about dual affiliation is that there is extra insight gained from these struggles that other people cannot relate to. I have tried to explain this to people many times before and it is often utterly incomprehensible.

My civilian employment is as a senior JavaScript developer at a big bank. The military considers itself a profession, and like other professions there is mandated education and certification to do anything. Other civilian careers, nearly every one I can think of, have this, but software does not. As a result, somewhat fewer incompetent people get promoted to higher responsibility compared to the corporate world. Trying to explain this kind of incompetence to software developers is like shooting yourself in the face.

In the civilian corporate world software development is a big, common thing. In the military it is nearly nonexistent. The primary reason for this is workplace culture and the near absence of a professional nature around software in the civilian world that the military could model internally. By a professional nature I mean there is no widely accepted definition of skills (even in an ad hoc, de facto way), licensing, or code of conduct that defines the profession. Trying to explain how the military is behind the times and could save hundreds of millions of dollars a year by aggressively building an internal professional culture of its own around software development is equally frustrating.

Another example is that people in the civilian world are sometimes easily offended. This is incredibly frustrating when every conversation is a midnight tiptoe on eggshells. This accepted degree of sympathy and sensitivity is what, in my opinion, allows the Dunning-Kruger effect to occur (sometimes rampantly). https://en.wikipedia.org/wiki/Dunning%E2%80%93Kruger_effect

Conversely, in the military you want to be just kind enough to avoid crushing someone's soul and destroying their self-esteem, unless they have honestly earned a good soul-destroying moment. Kindness has a far lower value than honesty, which makes for a wildly different work culture. Jumping between these work cultures can be disorienting. Brutal honesty is a pretty simple thing to figure out, even if emotionally scarring, compared to guessing at people's self-serving emotional motives. So in the corporate world you really have to slowly test the waters before challenging people's opinions, even if you have 20 years' experience and they have none.

Perhaps the most similar quality between the military and the corporate software world is the various irrational things people do to assert job security. In the military it is hard to fire people, but it is easy to cancel a contract and swiftly eliminate a large swath of civilian contractors. This can result in software products that are massively complex and hardware-bound, so that they need continued, exclusive, and highly specialized support from particular vendors. In the corporate world, on the other hand, it is every person for themselves, resulting in high doses of "invented here". God forbid software developers ever expose their incompetence by writing original software, so instead they write as little software as possible and simply glue third-party products together as much as possible. This is why many developers in the web world spend their careers painfully specializing around framework/library APIs instead of spending a few hours learning the standard APIs that everything compiles down to. https://en.wikipedia.org/wiki/Invented_here


Their response is different from my first thought, which is that Facebook wants the freedom to embed its employees inside universities, where they are in a position to get the best and brightest students into Facebook's hiring pipeline.


I find the idea odd. If you want to contribute to academic research, create a grant (or grants) with your budget.


That's a big can of worms. This dual employment may have its drawbacks (mostly, I think, due to the fact that this is a mega-corp; for a small bootstrapped company it's only good), but consider the following two points:

* People would work in an academic institution XOR your company: you cannot give the grant to the best in your field and still have them work for you, so you'd always want to hire the best you can, and give the grants to the rest;

* Grants most often support publications, not working solutions. At least with government-sponsored grants, it's best when the academic analysis and publication are sponsored by a grant, whereas industrial involvement guarantees that the work is relevant and the published solution actually works.


It's trivial to structure grants (or indeed contracts) such that they have concrete deliverables, even for academic research.


I think the moment a 'grant' has concrete deliverables (to a private firm?) it ceases being a grant and starts being some kind of consulting. I think people try to do this around UCLA pretty often, i.e. "I give a professor money, they do the research, I keep the IP"; it is pretty frowned upon in that context. In ML and CS I think this is probably more kosher; Michael Kearns seems to have a pretty thriving consultancy.


DARPA structures a lot of its research funding like this. The 'award' is a contract, typically for data, models, etc. For example, the statement of work for my last project has about a dozen deliverables of the form "tabulated data comparing [outcome] against [experimental variable 1], [exp. variable 2], ...." or "a computational model relating...." As far as I can tell, they are not actually interested in the data itself; this is just a hack to use the procurement process to fund research. They were happy to let the researchers keep and share the data.

In theory, the contract structure seems a lot more limited than an NIH, NSF, etc. grant, where you are minimally constrained by the proposal, but in practice, the program managers seem willing to amend the contract so that no one collects a bunch of obviously useless data.


Mmmm, today I learned something new. Thanks.


My university, in contrast, is pushing industry engagement like this pretty hard. I happen to disagree with that approach, but it is a thing.

There is sort of a continuum between consulting and a research contract, admittedly, but I'll also note that I have government grants with deliverables. All it really means is that there are potentially interim products that need to be delivered, or things less vague than papers and presentations that the agency/sponsor wants to see.


The real question is: why should they? The cynical golden rule is in force here - the one with the gold makes the rules. There needs to be some sort of carrot or stick to get them to go along with doing it 'their way', such as getting better results or people judging their output as biased. "It is the way we have always done things" is a statement of complacency, not a justification.


Then they wouldn't own the research and the resulting capabilities by default. Yes, they do open-source most of their research and might continue to, but they have the right of first refusal to publish, which is really the most important piece.


It's relatively easy to structure a grant program such that the rights are either shared, or can be licensed for a pretty nominal amount. There are other sectors of academic research that are quite good at this.


Nobody is saying you can't do it; it's just rarely done. The default is to not share/license, so that's what normally happens.


At my university, the default is actually a licensing agreement.


Again, the point is that corporations do not have licensing agreements by default, whereas universities do, because their charter is to license, usually through a technology transfer office.


Companies also have this problem...


Seems more like 80% industry 20% spy to me.


As a person working roughly 40% in academia and 60% in industry, I'd say it is the reverse: you get to 'spy' on industry's knowledge and practice and transplant it into an academic setting, where it is very likely publishable.


I find this a bit silly.

80/20 is more like "Somewhere between 60/40 and 100/0"

I had a boss ask me not to take a second university class; this was smart for everyone involved.

There are busy times at work and boring times at work; claiming that it is unrealistic to manage a structured life outside of work is silly.

I've worked with professors that let me take 2 weeks off for work travel.

I've had to cut out of work because I had a class. I came in early the next day and prepared for my meeting, and things were fine.

I'm a big fan of moderation, and this article posits an extreme situation that is temporary and often unlikely.



