aifaq.wtf

"How do you know about all this AI stuff?"
I just read tweets, buddy.

#fact-checking


October 12, 2023: @omarsar0

#fact-checking   #hallucinations   #tweets  

June 28, 2023: @mitalilive

#actual work   #labor   #journalism   #fact-checking  

I wasn't there, but I cite this tweet like every hour of every day.

May 30, 2023: @mmitchell_ai

#limitations   #fact-checking  

May 27, 2023: @d_feldman

#limitations   #lol   #hallucinations   #law and regulation   #fact-checking   #failures  

We're going to see a lot of people doubling down on the (accidental? incidental?) falsehoods spread by ChatGPT.

May 15, 2023: @ofirpress

#fact-checking   #challenges  

May 13, 2023: @jjvincent

#lol   #hallucinations   #fact-checking  

May 12, 2023: @mmitchell_ai

#summarization   #fact-checking  

The part I'll stress here is "without fiddling...[summarization] can go terribly wrong." We like to think summarizing things is easy – and it is, comparatively! – but give this a read. In a Danish newsroom experimenting with summarization, 41% of the auto-generated story summaries needed to be corrected before publication.

May 10, 2023: @kashhill

#lol   #hallucinations   #fact-checking   #failures  

Hallucinations for book and paper authorship are some of the most convincing. Subject matter typically matches the supposed author, and the titles are always very, very plausible. Because they are just generating text that statistically would make sense, LLMs are masters of "sounds about right." There's no list of books inside of the machine.
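To see why "sounds about right" falls out of the mechanism, here's a deliberately tiny sketch: a bigram model that only knows which word tends to follow which, trained on a few made-up titles (the titles and the `plausible_title` helper are both invented for this illustration, not anything from a real system). Real LLMs are vastly more sophisticated, but the core move is the same — chain statistically likely next words, with no catalog of actual books anywhere in sight.

```python
import random

# Toy training data: a handful of invented book titles.
# The point is the mechanism, not the data.
titles = [
    "the art of data science",
    "the science of machine learning",
    "machine learning for everyone",
    "data science for business",
    "the business of artificial intelligence",
]

# Count which word follows which across all titles.
bigrams = {}
for title in titles:
    words = title.split()
    for a, b in zip(words, words[1:]):
        bigrams.setdefault(a, []).append(b)

def plausible_title(start="the", max_words=6):
    """Chain statistically likely next words — no list of real books involved."""
    words = [start]
    for _ in range(max_words - 1):
        followers = bigrams.get(words[-1])
        if not followers:
            break
        words.append(random.choice(followers))
    return " ".join(words)

print(plausible_title())  # e.g. "the business of machine learning" — plausible, possibly nonexistent
```

Every output is locally sensible, because every word pair really did appear in the training data — but the title as a whole may describe a book nobody wrote.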

May 6, 2023: @baddatatakes

#lol   #hallucinations   #fact-checking   #failures  

The issue here is: what is a "language model" actually for? We can say "predicting the next word in a sequence of words," but that's kicking the can down the road.

Most of the time it's pretty good at giving you facts, so where do you draw the line?

May 4, 2023: @cpautoscribe

#hallucinations   #lol   #fact-checking   #failures  

At some point I stopped collecting tweets like this; there were just too many.