A few months ago, millions of TV viewers across South Korea were watching the MBN channel to catch the latest news.
At the top of the hour, regular news anchor Kim Joo-Ha started to go through the day's headlines. It was a relatively normal list of stories for late 2020 - full of Covid-19 and pandemic response updates.
Yet this particular bulletin was far from normal, as Kim Joo-Ha wasn't actually on the screen. Instead she had been replaced by a "deepfake" version of herself - a computer-generated copy that aims to perfectly reflect her voice, gestures and facial expressions.
Viewers had been informed beforehand that this was going to happen, and South Korean media reported a mixed response after people had seen it. While some people were amazed at how realistic it was, others said they were worried that the real Kim Joo-Ha might lose her job.
More politely known as AI-generated video, or synthetic media, the technology's usage is growing rapidly in sectors including news, entertainment and education, and it is becoming increasingly sophisticated.
One of the early commercial adopters has been Synthesia, a London-based firm that creates AI-powered corporate training videos for the likes of global advertising firm WPP and business consultancy Accenture.
"This is the future of content creation," says Synthesia chief executive and co-founder Victor Riparbelli.
Lilian Edwards, professor of law, innovation and society at Newcastle Law School, is an expert on deepfakes. She says that one issue surrounding the commercial use of the technology that hasn't been fully addressed is who owns the rights to the videos.
"For example, if a dead person is used, such as [the actor] Steve McQueen or [the rapper] Tupac, there is an ongoing debate about whether their family should own the rights [and make an income from it]," she says.
"Currently this differs from country to country."
C/o BBC News