this post was submitted on 21 Jul 2023
63 points (100.0% liked)

Science

[–] RootBeerGuy@discuss.tchncs.de 16 points 1 year ago (1 children)

This is a major problem in probably all high-profile labs. The PI is super busy because he's such a top dog in his field; he has dozens of postdocs and PhD students who are all lucky if they get to see him for 10 minutes every few weeks. No supervision or control, but all the academic pressure to produce something. And not just anything, but something great and interesting. Of course this can result in people doctoring (heh) results.

[–] Gaywallet@beehaw.org 9 points 1 year ago (1 children)

I think what's surprising about it is that this isn't a laundry list of shitty journals. High-quality journals have a fairly rigorous review process meant to surface and deal with exactly this kind of thing. The bigger journals are quite good at spotting simple techniques like omitting data or p-hacking, but it appears that, at least historically, they were less resistant to image manipulation. That said, I've never been a prolific researcher going through the submission process at a place with the prestige of Science or Nature, and it's very possible that they relax the process for high-profile people or those who submit regularly. Either way, I'm sure many journals are watching this unfold quite closely, as there will be much to learn about making their processes more resilient to issues like this.

[–] RootBeerGuy@discuss.tchncs.de 4 points 1 year ago (2 children)

quite good at spotting simple techniques like omitting data or p-hacking

I don't know about that. Spotting omitted data would only work if a key experiment is missing, or if a reviewer suggests a control experiment that was actually done but not shown. Or what do you mean?

And how would you spot p-hacking? That would only work if you could see all the underlying raw data. Otherwise, especially in high-impact journals, the p-values are always excellent when they need to be.
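The multiple-comparisons problem behind p-hacking is easy to demonstrate with a small simulation (a minimal sketch, not anyone's actual analysis; all numbers are hypothetical, and it assumes `numpy` and `scipy` are available). Every comparison below draws both groups from the same distribution, so any "significant" result is a false positive by construction:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_experiments = 1000   # simulated "studies", all run on pure noise
n_subgroups = 20       # hypothetical subgroup comparisons tried per study
alpha = 0.05

false_positives = 0
for _ in range(n_experiments):
    # Each comparison: two samples from the SAME normal distribution,
    # so the null hypothesis is true for every single test.
    pvals = [
        stats.ttest_ind(rng.normal(size=30), rng.normal(size=30)).pvalue
        for _ in range(n_subgroups)
    ]
    # The p-hacker reports only the best-looking comparison.
    if min(pvals) < alpha:
        false_positives += 1

rate = false_positives / n_experiments
print(f"Studies reporting a 'significant' result: {rate:.0%}")
# Expected to land near 1 - 0.95**20 ≈ 64%, far above the nominal 5% --
# and a reviewer who sees only the one reported comparison can't tell.
```

This is why, as the comment says, the practice is essentially invisible without the raw data: each published p-value looks perfectly legitimate on its own.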

[–] Lowbird@beehaw.org 5 points 1 year ago

Not to mention these peer review processes rely on unpaid labor from professionals who are heavily incentivized to use their time for basically anything else. They skim.

The replication crisis does not at all exclude highly regarded journals, unfortunately.

[–] jarfil@beehaw.org 4 points 1 year ago* (last edited 1 year ago)

That would only work if you'd be able to see all underlying raw data.

A paper without the underlying raw data is like a bicycle without wheels: you know it might've been useful at some point, but it isn't anymore.

Very few papers publish both the raw data and the analysis tools used on it, so that everyone can verify the results.

The rest are no different from a 4th grader writing down an answer and, when the teacher asks them to "show your work", coming back with "no, trust me, my peers agree I'm right, you do your own work".

It's extra sad when you contact a researcher directly for the data, and you get any of "it got lost in the last lab move", "I'll only give it to you if you show me how you're going to process it first", or some clearly spotty data backfilled from the paper's conclusions.

[–] yip-bonk@kbin.social 13 points 1 year ago (1 children)

In light of Baker's reporting, Stanford University opened its own internal inquiry into the matter. A panel of scientists concluded that Tessier-Lavigne's work contained image manipulations in 2001, the early 2010s, 2015-2016, and 2021.

But the panel dismissed any allegations of fraud or misconduct on the part of Tessier-Lavigne himself. Instead, they conclude that the "unusual frequency of manipulation of research data" in the neuroscientist's lab "suggests that there may have been opportunities to improve laboratory oversight and management".

lol

[–] Cenzorrll@beehaw.org 8 points 1 year ago (1 children)

If I understand many of my colleagues' gripes about their days in graduate school, the PI basically told them to make it work, so they did. Either by manipulating procedures, by using the one study out of five that worked, or by photoshopping images. I'd say manipulation is absolutely rampant in his lab; this is just one way they were doing it, and they got caught.

[–] prole@beehaw.org 9 points 1 year ago* (last edited 1 year ago)

Sure, but that doesn't mean the grad students who did the actual manipulation are blameless, or some kind of victims. A big part of science is integrity, and by the time you get to grad school, you know this and you know better.

[–] Kindajustlikewhat@beehaw.org 12 points 1 year ago

Between this and Northwestern, I'm so glad student journalism is getting a spotlight!

[–] FlashMobOfOne@beehaw.org 2 points 1 year ago

Well that escalated quickly.
