Re. This section:
It would break the world if everyone did it
[…]
To be clear, I’m not saying perverse incentives never induce bad behavior in medicine or other fields. Of course they do. My point is that practitioners in other fields at least appear to have enough sense not to loudly trumpet The Incentives as a reasonable justification for their antisocial behavior—or to pat themselves on the back for being the kind of people who are clever enough to see the fiendish Incentives for exactly what they are. My sense is that when doctors, lawyers, journalists, etc. fall prey to The Incentives, they generally consider that to be a source of shame. I won’t go so far as to suggest that we scientists take pride in behaving badly—we obviously don’t—but we do seem to have collectively developed a rather powerful form of learned helplessness that doesn’t seem to be matched by other communities. Which is a fortunate thing, because if every other community also developed the same attitude, we would be in a world of trouble.
I’m not in any of the listed industries, but software developers routinely create works that scam others out of life savings, facilitate hate crimes, price-fix with the intention of harvesting every potential dollar from the working class, and so on. And barely a rounding error bats an eye. Having briefly worked in related areas, I can say that instead of blaming The Incentives, they assuage all guilt by pointing to individual responsibility, or by saying that regulations would stop them if it were actually harmful.
Reaching a bit beyond my field, we can thank career politicians for the current state of polarization in several jurisdictions that use a first-past-the-post voting system. Having no insight into their closed-door discussions, I can’t say whether they privately feel remorse for the direct harm they’ve caused to stay employed, but publicly the best I’ve seen is a shrug and a dismissal of the overall topic.
I suppose this is my roundabout way of asking the question: has the author looked at the world lately? Admittedly this piece is seven years old …
It seems hard to say how widespread such attitudes are in the software industry? Certainly there are lots of people who are quick to make allegations of unethical behavior, which at least shows that it’s not an openly accepted norm. On the other hand, companies like Uber do exist.
I’d argue that it’s difficult to determine the prevalence of an attitude within a given population absent a survey. I don’t think the author did one across academia, which is why I felt comfortable extrapolating from my terribly comprehensive dataset (n=1). I’d imagine that most people are either ignorant of the issues or actively deflect them, and that hasn’t seemed to lessen the effect of their actions. This runs contrary to the author’s position that learned helplessness in other industries would lead to widespread justification of antisocial behaviour: in practice, the alternative is to ignore the problem and take the optimal (incentivized) choice regardless.
On the other hand, companies like Uber do exist.
Minimally, I’d add the entire online gambling industry to that list, as well as anyone working on software in the fossil fuel industry. One could make an argument for all development in the crypto sector, and many financial institutions as well.
I’m not sure that everyone in these industries is unethical, but I agree that unethical behavior is widespread, and in some industry sectors, it’s the norm.
I read about half of this article before it became too long, so forgive me if I write about something not relevant.
I think the author believes that by making people aware of how they justify their behavior as simply working within the status quo, this awareness will somehow bring about change. I do not agree with this assessment. They spend a lot of time highlighting what can and does go wrong by working within the system, instead of addressing why the system exists the way it does.
I instead propose some practices that I think would help bring about change, in a move towards "Reformation":
End artificial publication/crediting scarcity. If you couldn't have done it without me, then you didn't do it without me; give credit where credit is due.
We have gotten to a point in science where we no longer have to be so stingy about who gets authorship, or rather, who gets credit, for doing or assisting with the work that leads to a publication. I can only assume that back in the day we literally had physical space constraints on crediting everyone, and that experiments were often done by only 1-5 people at a time, and thus we developed this convention of being very strict and clear about authorship and who did what. These days, that just isn't true anymore, and, as the author points out, science already has a lot of issues with promoting antisocial behavior; this is just another one. Now, I am not naive enough to think that artificial authorship scarcity is merely a product of convention. I know it's also deeply tied to notoriety being a huge commodity in science, so I argue that the main reason we are in this predicament these days is obsession with ego. I would love to see a crazy huge list of "thank you"s the way that movies give credit: "coffee boy #1 - so and so". (I know we have acknowledgement sections and other "throw me a bone" type stuff, but IYKYK, even those are so coveted and prestigious.)
Increase pay/prestige for non-PhD positions and for positions outside the "top tier" (professorships, PIs, etc.). There are so few options for anyone in academic science to make a decent living outside of going for the top positions, and there is also no way to get any credit outside of pursuing the top positions, or rather, it's much harder. Again, there is a constant balance between the two major commodities in science: being known and being paid. I think by increasing the prestige and pay of non-"top" positions, we would reduce a lot of the antisocial behaviors that stem from wanting to be known, and it would soothe a lot of people's egos. It would also reduce competition for the limited number of top positions, which are limited by nature. I suppose we could also reduce the requirements for some of these top-tier positions; I am not opposed to this, and I have seen some "assistant PIs" or whatever you want to call them who do not have PhDs, for example. But like any post-hoc position that does not eliminate the original one, these people are often exceptional and would have been PhDs anyway but aren't for whatever reason, and they end up with less pay and less prestige than a "traditional" PI or professor.
Publish/document/archive "failed" or rejected manuscripts and experiments. Publish negative data. (This one is extremely unrealistic or lofty, but a man can dream.) A lot of collaboration has died in science due to "required" secrecy. I had to leave science because it was painful to me that, in a field where we are supposedly pursuing truth and knowledge for the sake of pursuing truth and knowledge and expanding the common knowledge of humankind, we would spend so much time hiding information from other scientists. It felt so wasteful doing experiments over and over that someone else had already done but whose results I did not know (I mean people outside your lab, not within the lab, which is its own issue), or doing experiments that "went nowhere" because we dropped the project or didn't publish the results. We do not help anyone else by hiding these experiments; we only help ourselves out of fear of being scooped, or fear of giving someone a head start. It's weird and perverse.
Enacting this would also encourage praising "good science" (how it was performed) over "what we found out" (what was sexy), and it would actually help with issue 1 by allowing people to get recognition even if their finding wasn't exciting or "new" (don't get me started on the issues of reproducibility or replication).
Reward or encourage collaboration. I suppose reproducibility and replication could help with encouraging collaboration. However, what I had in mind was groups actually working on the same thing at the same time together (at various degrees of togetherness), since most of our problems in science are quite large endeavors at this point. I don't know how you "reward" this when our current incentives are: be known, get paid.
Before I tap out, my understanding is that a lot of this is USA centric, and that in other places some of these problems are not as perverse, but they still exist, since at least right now, the USA is still the powerhouse of publication. I feel that the legacy publishers are really some of the worst aspects of science, and I hope they can be gutted from the inside out, if not removed entirely.
Thanks for coming to my TED talk.
Random notes
Scientists have only a few choices these days for making science their living: academia, industry, or niche fields (government, teaching, advocacy, journalism, writing, etc.), but let's say for the sake of this post that the behemoths are academia and industry, roughly 50/50. Industry has far fewer of the problems in academia that the author discusses; my personal opinion is that this is because industry's main commodity is money and marketability, and it is upfront about that. Academia's main commodity is being first and being known (ego driven), while projecting a veneer that it is about the pursuit of truth for truth's sake. This mismatch of ideals with practice makes it worse, for me personally, to be in academia.
I think Science could stand a better chance at reformation if more scientists were cross-trained outside of science, if for no other reason than for them to realize they don't have to stay in science, but for other reasons as well, like understanding how other systems can function. The reason I think it's important for scientists to realize they could leave science at any given time is twofold. 1. In the scenario we are in, of "work within the system to change the system" or "work within the system as it stands", most people will not consider the option of "leave the system, because I cannot do either of those things." Learning and working in things outside of science helps you have that third option. 2. It gives scientists conviction in choosing science, and perhaps that will motivate them to change or embrace science; but at least it was a choice, and not just something they went along with because they didn't know any better.
Science also suffers from a huge amount of "fuck you, I got mine," but also "I need to haze the shit out of you, because suffering is part of the process and I need to validate my own shitty experience." It relies heavily on the mentality another poster here described, along the lines of "if what I was doing was so bad, they wouldn't allow me to do it." The old guard is really toxic, and I guess, like everything else in our world right now, most change doesn't come from within, which is sad; it's time for new blood.
I absolutely agree that the way towards meaningful change is via systemic change. All the changes you’ve illustrated would make science a better place to work.
I also agree that the author’s article isn’t really meaningful advocacy. I do think the author believes they are pushing for change when they really aren’t.
They spend a lot of time highlighting what can and does go wrong by working within the system, instead of addressing why the system exists the way it does.
Those words still deserve to be said. Human systems don’t exist in a vacuum; they are made up of their individual components: humans. Systems only change when the humans within them change, often against the incentives that be.
For example, if everyone in industry maximized their salary, non-profits and other good-doing groups would struggle to find employees. If regulators always capitulated to the demands of lobbyists, society would suffer.
The author is criticizing the mindset that the systemic incentives deprive humans of agency. They don’t. You still have many, many individual choices. The hardest choices are always those that go against the incentives, and they often make the biggest difference.
During my time in academia, I made a set of choices that went against the incentives that existed. They may have cost me my career within academia. I also believe those choices likely galvanized enough political will within my community to make small but meaningful systemic change. I wasn’t solely responsible (most of the credit for those changes should go to those in leadership positions), but I have a sense those changes wouldn’t have happened without me; otherwise they would have happened years prior.
I won’t argue that there is a moral imperative to make those extremely difficult decisions; I think it’s perfectly okay to behave as the incentives drive you to. But I think it’s crucial to acknowledge that there is an opportunity cost to that behavior. There is always individual agency toward making the changes you do want to see.
I made a set of choices that went against the incentives that existed. They may have cost me my career
Hate to split the sentence like that, but that's one of the biggest problems across the board with not fixing the incentives: The only people who will buck against them are the ones willing to lose their job or career.
It's harder to rock the boat when you have mouths to feed. Hence why whistleblowers are rare in all but the most extreme situations.
There are thousands of people who could say exactly what happens in C-suite meetings regarding some truly awful behavior. They almost certainly will not unless the guilt outweighs their need for continued employment.
I think this article is glossing over what is (to my mind) the main problem with perverse incentives. The author is correct that incentives do not force people to act a certain way; after all, people act against their own interests all the time, even without any ethical or philosophical reasoning.
But incentives reward and promote people who optimize for those specific incentives, and they punish and demote people who do not. The more competitive the field is, the more it will be filled by people who serve its incentive structure most effectively.
If the incentive structure does not align with ethical behavior, then ethical people are forced out. If the incentive structure does not align with rational behavior, then rational people are forced out. If the incentive structure does not align with creative behavior, then creative people are forced out. And so on.
And once you lose your ethical/rational/creative/whatever people, it's not realistic to ask people still working in the field to start acting ethically/rationally/creatively/whatever. Even if you do change the incentive structure to perfectly align with these values, you probably won't see all that much change unless you simultaneously make it easy and desirable for existing people to leave the field and for new people to come in and replace them. Anything that makes turnover difficult (high education/training/experience requirements, high networking requirements, poor accessibility to alternative fields with equivalent pay and prestige, etc.) will make change slower.
And this is why bad incentive structures are poison. They can ruin an institution for a generation.
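To make that selection argument concrete, here is a toy simulation I threw together. This is my own illustrative sketch, not anything from the article or the comment above; the traits, the two scoring metrics, and the turnover rates are all invented assumptions. The idea: a field repeatedly retains whoever scores best on its incentive metric and backfills the rest with new entrants, so a metric that rewards corner-cutting selects ethical people out, and fixing the metric afterwards only brings them back about as fast as turnover allows.

```python
# Toy model of the selection dynamic described above (illustrative only;
# the traits, metrics, and turnover rates are made-up assumptions).
import random
import statistics

SIZE = 500  # number of practitioners in the field

def new_person(rng):
    # (talent, ethics), independent and uniform in [0, 1]
    return (rng.random(), rng.random())

def perverse_metric(person):
    talent, ethics = person
    return talent + (1.0 - ethics)   # cutting corners inflates the score

def reformed_metric(person):
    talent, _ethics = person
    return talent                    # ethics no longer penalized

def step(field, metric, turnover, rng):
    """One hiring/promotion cycle: keep the top scorers under `metric`,
    then backfill the rest of the field with fresh entrants."""
    keep = round(SIZE * (1.0 - turnover))
    survivors = sorted(field, key=metric, reverse=True)[:keep]
    return survivors + [new_person(rng) for _ in range(SIZE - keep)]

def mean_ethics(field):
    return statistics.mean(ethics for _talent, ethics in field)

if __name__ == "__main__":
    rng = random.Random(0)
    field = [new_person(rng) for _ in range(SIZE)]

    # A generation under perverse incentives pushes ethical people out.
    for _ in range(30):
        field = step(field, perverse_metric, turnover=0.10, rng=rng)
    print("after perverse incentives:        ", round(mean_ethics(field), 2))

    # Fix the incentives with low turnover: recovery is slow...
    slow = list(field)
    for _ in range(10):
        slow = step(slow, reformed_metric, turnover=0.02, rng=rng)
    print("10 reformed cycles,  2% turnover: ", round(mean_ethics(slow), 2))

    # ...while the same fix with high turnover recovers much faster.
    fast = list(field)
    for _ in range(10):
        fast = step(fast, reformed_metric, turnover=0.20, rng=rng)
    print("10 reformed cycles, 20% turnover: ", round(mean_ethics(fast), 2))
```

The absolute numbers are meaningless, but the shape of the result tracks the argument: the perverse metric drags the field's average "ethics" well below that of the general population, and after the fix it drifts back toward the baseline roughly as fast as turnover lets new people in, which is the point about bad incentive structures ruining an institution for a generation.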
I pursued industry over science because of exactly this. My undergraduate research advisor had a saying, "always have the cameras rolling. It only has to work once, as long as it's on camera". I still don't have the gumption to tell him that's the precise opposite of science.
I have a few papers to my name and presented at a conference, which was cool, but man, he and my project partner just fundamentally did not give a shit about advancing human knowledge. The advisor in particular was an entertainer; he just wanted a crowd to gasp in awe, damn the consequences.
From the article:
There is, of course, an element of truth to this kind of response. I’m not denying that perverse incentives exist; they obviously do. There’s no question that many aspects of modern scientific culture systematically incentivize antisocial behavior, and I don’t think we can or should pretend otherwise. What I do object to quite strongly is the narrative that scientists are somehow helpless in the face of all these awful incentives—that we can’t possibly be expected to take any course of action that has any potential, however small, to impede our own career development.
It could be generalized to many other areas of life. Sometimes, ironically, it’s idealists who overestimate the power of incentives: someone with a conflict of interest must be lying, and no industry research could possibly be honest.