March 8, 2021 05:34 am

Furious AI Researcher Creates Site Shaming Non-Reproducible Machine Learning Papers

The Next Web tells the story of an AI researcher who discovered the results of a machine learning research paper couldn't be reproduced. The researcher then heard similar stories from Reddit's Machine Learning forum:

"Easier to compile a list of reproducible ones...," one user responded.

"Probably 50%-75% of all papers are unreproducible. It's sad, but it's true," another user wrote. "Think about it, most papers are 'optimized' to get into a conference. More often than not the authors know that a paper they're trying to get into a conference isn't very good! So they don't have to worry about reproducibility because nobody will try to reproduce them."

A few other users posted links to machine learning papers they had failed to implement and voiced their frustration that code implementation is not a requirement at ML conferences.

The next day, the researcher, who posts on Reddit as ContributionSecure14, created "Papers Without Code," a website that aims to create a centralized list of machine learning papers that are not implementable...

Papers Without Code includes a submission page, where researchers can submit unreproducible machine learning papers along with the details of their efforts, such as how much time they spent trying to reproduce the results... If the authors do not reply in a timely fashion, the paper will be added to the list of unreproducible machine learning papers.
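The escalation rule described above (collect reproduction details, contact the authors, list the paper if they do not reply in time) amounts to a small piece of logic. What follows is a minimal, hypothetical Python sketch of that rule; the site's actual implementation is not public, and the field names and the 30-day response window here are assumptions for illustration only.

    from dataclasses import dataclass
    from datetime import date, timedelta

    # Hypothetical model of the escalation rule described in the story.
    # Papers Without Code's real implementation is not public; the 30-day
    # window and all names below are assumptions.

    RESPONSE_WINDOW = timedelta(days=30)  # assumed deadline, not from the source


    @dataclass
    class Submission:
        paper_title: str
        hours_spent_reproducing: float   # the kind of detail submitters provide
        contacted_authors_on: date
        authors_replied: bool = False


    def should_list_as_unreproducible(sub: Submission, today: date) -> bool:
        """Return True once the (assumed) response window has elapsed
        without a reply from the authors."""
        deadline = sub.contacted_authors_on + RESPONSE_WINDOW
        return not sub.authors_replied and today > deadline


    # Example: authors contacted Feb 1, 2021, still no reply by Mar 8, 2021.
    sub = Submission(
        paper_title="An Example Paper",
        hours_spent_reproducing=40.0,
        contacted_authors_on=date(2021, 2, 1),
    )
    print(should_list_as_unreproducible(sub, date(2021, 3, 8)))  # True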

Original Link: http://rss.slashdot.org/~r/Slashdot/slashdot/~3/kcUrW80tP2s/furious-ai-researcher-creates-site-shaming-non-reproducible-machine-learning-papers


Slashdot

Slashdot was originally created in September 1997 by Rob "CmdrTaco" Malda. Today it is owned by Geeknet, Inc.
