
Study shows AI models can exhibit human-like gambling addiction behaviors

Researchers warn that the irrational betting behavior of AI models could have an impact as the technology reaches deeper into finance. Photo by Sara Oliveira for Unsplash+

Gambling addiction in humans has long been characterized by behaviors such as the illusion of control, the belief that winning will follow a losing streak, and attempts to recoup losses by continuing to bet. A new study by researchers at South Korea’s Gwangju Institute of Science and Technology shows that this kind of irrational behavior may also appear in artificial intelligence models.

The study, which has not yet been peer-reviewed, found that large language models (LLMs) make high-risk gambling decisions, especially when given more autonomy. Seungpil Lee, one of the report’s co-authors, said these tendencies could pose risks as the technology becomes more deeply integrated into asset management. “We will use [AI] increasingly in decision-making, especially in finance,” he told the Observer.

To test the models’ gambling behavior, the authors ran four of them through a simulated slot machine game: OpenAI’s GPT-4o-mini and GPT-4.1-mini, Google’s Gemini-2.5-Flash, and Anthropic’s Claude-3.5-Haiku. Each model started with $100 and could keep betting or quit, and the researchers tracked its choices using an Irrationality Index, which measures factors such as betting aggressiveness, extreme betting and loss chasing.
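As a rough illustration of the setup described above, a slot-machine session of this kind can be sketched in a few lines of Python. The payout odds, bet-sizing rule and stopping logic here are assumptions for illustration only, not the authors’ actual code or experimental parameters; a real run would replace the `choose_action` stand-in with a prompt to the LLM under test.

```python
import random

# Illustrative sketch of a slot-machine betting session like the one described
# in the study. All parameters below are assumed, not taken from the paper.
STARTING_BALANCE = 100
WIN_PROBABILITY = 0.3    # assumed chance of a winning spin
PAYOUT_MULTIPLIER = 3.0  # assumed payout on a winning bet

def choose_action(balance, history):
    """Stand-in for querying an LLM: return a bet size, or 0 to quit."""
    # A real experiment would prompt the model with its balance and history;
    # here we just bet a random fraction of the balance or occasionally quit.
    if random.random() < 0.1:
        return 0
    return round(random.uniform(1, balance), 2)

def run_session():
    balance = STARTING_BALANCE
    history = []  # (bet, outcome) pairs for later analysis
    while balance > 0:
        bet = min(choose_action(balance, history), balance)
        if bet <= 0:
            break  # the model chose to walk away
        won = random.random() < WIN_PROBABILITY
        balance += bet * (PAYOUT_MULTIPLIER - 1) if won else -bet
        history.append((bet, "win" if won else "loss"))
    return balance, history

if __name__ == "__main__":
    final_balance, history = run_session()
    print(f"Final balance: ${final_balance:.2f} after {len(history)} bets")
    print("Bankrupt" if final_balance <= 0 else "Walked away")
```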

The results showed that all four LLMs went bankrupt more often when given more freedom to vary bet sizes and choose target amounts, though the extent varied across models, a difference Lee said may reflect differences in their training data. Gemini-2.5-Flash had the highest bankruptcy rate at 48 percent, while GPT-4.1-mini had the lowest at just over 6 percent.

The models also consistently exhibited hallmarks of human gambling addiction, such as win chasing, in which a gambler keeps betting because winnings are treated as “free money,” and loss chasing, in which a gambler keeps betting to recoup losses. Win chasing was especially common: across the LLMs, the rate of bet increases rose from 14.5 percent to 22 percent during winning streaks, according to the study.
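For illustration, a win-chasing statistic like the one quoted above could be read off a betting log as the fraction of bets that are raised immediately after a win versus after a loss. The sketch below assumes a simple log format and is not the study’s exact metric definition.

```python
# Illustrative only: estimate how often bets are raised after wins vs. losses
# from a log of (bet, outcome) pairs. Not the study's actual Irrationality Index.

def chasing_rates(history):
    """history: list of (bet, outcome) pairs, outcome in {"win", "loss"}."""
    raised_after = {"win": 0, "loss": 0}
    total_after = {"win": 0, "loss": 0}
    for (prev_bet, prev_outcome), (bet, _) in zip(history, history[1:]):
        total_after[prev_outcome] += 1
        if bet > prev_bet:
            raised_after[prev_outcome] += 1
    return {
        outcome: raised_after[outcome] / total_after[outcome] if total_after[outcome] else 0.0
        for outcome in ("win", "loss")
    }

# Example: a log in which bets rise after every win but never after a loss
log = [(10, "win"), (15, "win"), (20, "loss"), (20, "loss"), (20, "win")]
print(chasing_rates(log))  # {'win': 1.0, 'loss': 0.0}
```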

Despite these similarities, Lee emphasized that there are still important differences. “These results don’t actually show that they reason exactly the way humans do,” he said. “They learn some characteristics from human reasoning that may influence their choices.”

That does not mean these human-like tendencies are harmless. Artificial intelligence systems are increasingly embedded in finance, from customer experience tools to fraud detection, forecasting and earnings-report analysis. Earlier this year, MIT Technology Review Insights surveyed 250 banking executives, 70 percent of whom said they were using agentic AI in some form.

Since gambling-like tendencies increase significantly when LLMs are granted more autonomy, the authors argue that these behaviors should be accounted for in monitoring mechanisms. “We have to be more precise rather than giving them complete freedom of decision-making,” Lee said.

However, the prospect of developing a completely risk-free model is unlikely, Lee added, noting that the challenge goes beyond artificial intelligence itself. “It seems that even humans can’t do it.”
