OpenAI releases larger GPT-2 model. Can it write fake news better than a human?
OpenAI has released a more extensive version of its generative language model.
We're releasing the 774 million parameter GPT-2 language model after the release of our small 124M model in February ...
2. Humans can be convinced by synthetic text. Research from our partners Sarah Kreps and Miles McCain at Cornell, published in Foreign Affairs, says people find GPT-2 synthetic text samples almost as convincing (72% in one cohort judged the articles to be credible) as real articles from the New York Times (83%). Additionally, research from AI2/UW has shown that news written by a system called GROVER can be more plausible than human-written propaganda. These research results make us generally more cautious about releasing language models.
Blockquoted below is something I just had it make (using Talk to Transformer, which has been updated with the new model).
I wrote the first (bolded) paragraph. GPT-2 wrote the rest.
Former Democratic presidential candidate and United States Senator Hillary Clinton was arrested today and charged on four counts of conspiracy, one count of fraud, and one count of lying to Federal investigators.
The details of the case are detailed below.
A Brief Overview of the Case
On June 2, 2014, Clinton (pictured) admitted to FBI agents that, on June 23, 2013, she, and others, had conspired with other political figures to take "official action" in response to a series of negative articles which she wrote in the Washington Times and other outlets.
The following is a summary of Clinton's admission:
Secretary Clinton used the Washington Post as her de facto personal email account and for the official State Department email account.
Original Link: http://feeds.boingboing.net/~r/boingboing/iBag/~3/oQxKUzBDTgs/openai-releases-larger-gpt-2-d.html