July 17, 2020 10:40 pm

Tech Firms Hire 'Red Teams.' Scientists Should, Too

The recent retraction of a research paper that claimed to find no link between police killings and the race of the victims was a story tailor-made for today's fights over cancel culture. From a report:

First, the authors asked for the paper to be withdrawn, both because they'd been "careless when describing the inferences that could be made from our data" and because of how others had interpreted the work. (In particular, they pointed to a recent op-ed in The Wall Street Journal with the headline "The Myth of Systemic Police Racism.") Then, after two days of predictable blowback from those decrying what they saw as left-wing censorship, the authors tried to clarify: "People were incorrectly concluding that we retracted due to either political pressure or the political views of those citing the paper," they wrote in an amended statement. No, the authors said, the real reason they retracted the paper was that it contained a serious mistake.

In fact, that mistake -- a misstatement of its central finding -- had been caught soon after the paper's initial publication in the Proceedings of the National Academy of Sciences in July 2019, and was formally corrected in April of this year. At that point, the authors acknowledged their error -- sort of -- while insisting that their main conclusions held. That the eventual retraction came only after the paper became a flashpoint in the debate over race and policing in the wake of George Floyd's murder ... well, let's agree that the retraction happened. [...]

As we and others have written many times, peer review -- the way journals ask researchers to perform it, anyway -- is not designed to catch fraud. It's also vulnerable to rigging and doesn't go so well when done in haste. Editors and publishers tend to admit these problems only under duress -- i.e., when a well-publicized retraction happens -- and then hope that we believe their claims that such colossal blunders somehow show that "the system is working the way it should." But their protestations only serve as an acknowledgement that the standard system doesn't work, and that we must instead rely upon the more informal sort of peer review that happens to a paper after it gets published. The internet has enabled such post-publication peer review, as it is known, to happen with more speed, on sites like PubPeer.com. In some cases, though -- as with the PNAS paper described above -- the resolution of this after-the-fact assessment comes much too late, after a mistaken claim has already made the rounds.

So how might journals do things better? As Daniel Lakens, of Eindhoven University of Technology in the Netherlands, and his colleagues have argued, researchers should embrace a "Red Team challenge" approach to peer review. Just as software companies hire hackers to probe their products for potential gaps in their security, a journal might recruit a team of scientific devil's advocates: subject-matter specialists and methodologists who will look for "holes and errors in ongoing work and ... challenge dominant assumptions, with the goal of improving project quality," Lakens wrote in Nature recently. After all, he added, science is only as robust as the strongest critique it can handle.

Original Link: http://rss.slashdot.org/~r/Slashdot/slashdot/~3/8GU4zRbxEDw/tech-firms-hire-red-teams-scientists-should-too


Slashdot

Slashdot was originally created in September 1997 by Rob "CmdrTaco" Malda. Today it is owned by Geeknet, Inc.
