Inbred, gibberish or just MAD? Warnings rise about AI models
When academic Jathan Sadowski reached for an analogy last year to describe how AI programs decay, he landed on the term "Habsburg AI".
The Habsburgs were one of Europe's most powerful royal houses, but entire sections of their family line collapsed after centuries of inbreeding.
Recent studies have shown how AI programs underpinning products like ChatGPT go through a similar collapse when they are repeatedly fed their own data.
"I think the term Habsburg AI has aged very well," Sadowski told AFP, saying his coinage had "only become more relevant for how we think about AI systems".
The ultimate concern is that AI-generated content could take over the web, which could in turn render chatbots and image generators useless and throw a trillion-dollar industry into a tailspin.
But other experts argue that the problem is overstated, or can be fixed.
And many companies are enthusiastic about using what they call synthetic data to train AI programs. This artificially generated data is used to augment or replace real-world data. It is cheaper than human-created content but more predictable.
"The open question for researchers and companies building AI systems is: how much synthetic data is too much?" said Sadowski, lecturer in emerging technologies at Australia's Monash University.
- 'Mad cow disease' -
Training the AI programs known in the industry as large language models (LLMs) involves scraping vast quantities of text or images from the internet.
This information is broken into trillions of tiny machine-readable chunks, known as tokens.
When asked a question, a program like ChatGPT selects and assembles tokens in a way that its training data tells it is the most likely sequence to fit with the query.
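The selection step can be sketched, very loosely, with a toy bigram model. This is a deliberate simplification for illustration only; real LLMs use neural networks trained on trillions of tokens, and the corpus and function names here are invented:

```python
from collections import Counter, defaultdict

# Toy illustration: a bigram "model" that, given the current token,
# picks the successor its training data says is most likely.
corpus = "the cat sat on the mat the cat ate the fish".split()

# Count how often each token follows each other token.
follows = defaultdict(Counter)
for cur, nxt in zip(corpus, corpus[1:]):
    follows[cur][nxt] += 1

def most_likely_next(token):
    """Return the most frequent successor seen in training."""
    return follows[token].most_common(1)[0][0]

print(most_likely_next("the"))  # prints "cat" - it follows "the" most often
```

A real model scores every token in its vocabulary and samples from that distribution rather than always taking the single most likely token, but the principle is the same: the output is whatever the training data makes statistically probable.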
But even the best AI tools generate falsehoods and nonsense, and critics have long expressed concern about what would happen if a model was fed on its own outputs.
In late July, a paper in the journal Nature titled "AI models collapse when trained on recursively generated data" proved a lightning rod for discussion.
The authors described how models quickly discarded rarer elements in their original dataset and, as Nature reported, outputs degenerated into "gibberish".
A week later, researchers from Rice and Stanford universities published a paper titled "Self-consuming generative models go MAD" that reached a similar conclusion.
They tested image-generating AI programs and showed that outputs became more generic and riddled with undesirable elements as AI-generated data was added to the underlying model.
They labelled model collapse "Model Autophagy Disorder" (MAD) and compared it to mad cow disease, a fatal illness caused by feeding the remnants of dead cows to other cows.
- 'Doomsday scenario' -
These researchers worry that AI-generated text, images and video are clearing the web of usable human-made data.
"One doomsday scenario is that if left uncontrolled for many generations, MAD could poison the data quality and diversity of the entire internet," one of the Rice University authors, Richard Baraniuk, said in a statement.
However, industry figures are unfazed.
Anthropic and Hugging Face, two leaders in the field who pride themselves on taking an ethical approach to the technology, both told AFP they used AI-generated data to fine-tune or filter their datasets.
Anton Lozhkov, machine learning engineer at Hugging Face, said the Nature paper gave an interesting theoretical perspective but its disaster scenario was not realistic.
"Training on multiple rounds of synthetic data is simply not done in reality," he said.
However, he said researchers were just as frustrated as everyone else with the state of the internet.
"A large part of the internet is trash," he said, adding that Hugging Face already made huge efforts to clean data -- sometimes jettisoning as much as 90 percent.
He hoped that web users would help clear up the internet by simply not engaging with generated content.
"I strongly believe that humans will see the effects and catch generated data way before models will," he said.
A.Gasser--BTB