Artificial intelligence may well save us time by finding information faster, but it is not always a reliable researcher. It frequently makes claims that are not backed up by the sources it cites. A study by Pranav Narayanan Venkit at Salesforce AI Research and colleagues found that about one-third of the statements made by AI tools such as Perplexity, You.com and Microsoft's Bing Chat were not supported by the sources they provided. For OpenAI's GPT-4.5, the figure was 47%.