
Research has found that artificial intelligence search tools are often wrong.

A new study found that AI search tools confidently spit out the wrong answers.

The Columbia Journalism Review (CJR) conducted a study that provided excerpts of an article to eight AI chatbots and asked each one to identify the “title of the corresponding article, original publisher, publication date and URL.” Overall, the chatbots “provided the wrong answers to more than 60% of queries.”

The errors vary. Sometimes the search tools reportedly speculate or offer incorrect answers to questions they cannot answer. Sometimes they invent links or sources. Sometimes they cite plagiarized versions of real articles.


“Most of the tools we tested presented inaccurate answers with alarming confidence, rarely using qualifying phrases such as ‘it appears,’ ‘it’s possible,’ ‘might,’ etc., or acknowledging knowledge gaps with statements like ‘I couldn’t locate the exact article,’” CJR wrote.

The full study is worth a look, but it seems reasonable to be skeptical of AI search tools. The problem is that people don’t seem to be doing that. CJR noted that 25% of Americans said they use AI instead of traditional search engines.

Search giant Google is increasingly pushing AI on consumers. This month, it announced it would expand AI Overviews and begin testing AI-only search results.

The CJR research is just one more data point on AI’s inaccuracy. These tools have shown time and time again that they will confidently give wrong answers, and tech giants are forcing AI into nearly every product. So be careful what you believe out there.
