My research career as a PI is now taking off, and my work is enmeshed in big, sometimes huge, cross-disciplinary teams. When I read these accounts, I want to go back to clinical work only.
My terror of perpetuating falsehoods made me so slow and deliberate through my training and early career that I routinely became the "bottleneck" other team members would push and push to move faster. Being the person who frequently spots misused references (a correct, good paper cited, but with an incorrect interpretation or misrepresentation added by the writer) makes me both valuable and annoying to work with. But bring checking for others' fraudulent data into the picture, and it's a much darker and stickier terror. I'm not sure how long I want to do this!
This is going to be required reading for my lab, and I'm going to create a lab manual similar to Dr. Laskowski's.
Thank you for a great read!
“If you create perverse incentives, you’re going to create perverse behaviors.”
This pretty much explains it all right here. Until science rewards negative data and true replication, it will continue to hinge entire careers on positive data and novelty only, which directly incentivizes people to falsify stories and data.
Yeah, I think ultimately fraud in science can only really be combatted by changing the system that incentivizes it. I liked these observations in the article:
He views its incentive structure as a “moral hazard” for scientists, he says, because it tells them that the only way to advance is to take big swings, but most big swings are misses, not home runs. So some may feel compelled to cut corners or fudge results to propel their status and career. Heinz sees a system in which labs push toward “personal brand goals” instead of biological truths. He loves being a scientist, he says, but he’s not sure if he can protect himself from the dissonance that comes from existing within the scientific world.
The last sentence particularly hits me.