Deep Fake

Meria

Elder Lister
Staff member
A few months ago, millions of TV viewers across South Korea were watching the MBN channel to catch the latest news.
At the top of the hour, regular news anchor Kim Joo-Ha started to go through the day's headlines. It was a relatively normal list of stories for late 2020 - full of Covid-19 and pandemic response updates.
Yet this particular bulletin was far from normal, as Kim Joo-Ha wasn't actually on the screen. Instead she had been replaced by a "deepfake" version of herself - a computer-generated copy that aims to perfectly reflect her voice, gestures and facial expressions.
Viewers had been informed beforehand that this was going to happen, and South Korean media reported a mixed response after people had seen it. While some people were amazed at how realistic it was, others said they were worried that the real Kim Joo-Ha might lose her job.
Use of the technology - more politely called AI-generated video, or synthetic media - is growing rapidly in sectors including news, entertainment and education, and it is becoming increasingly sophisticated.
One of the early commercial adopters has been Synthesia, a London-based firm that creates AI-powered corporate training videos for the likes of global advertising firm WPP and business consultancy Accenture.
"This is the future of content creation," says Synthesia chief executive and co-founder Victor Riparbelli.
Lilian Edwards, professor of law, innovation and society at Newcastle Law School, is an expert on deepfakes. She says that one issue surrounding the commercial use of the technology that hasn't been fully addressed is who owns the rights to the videos.
"For example, if a dead person is used, such as [the actor] Steve McQueen or [the rapper] Tupac, there is an ongoing debate about whether their family should own the rights [and make an income from it]," she says.
"Currently this differs from country to country."
C/o BBC News

@Field Marshal be very careful
deepfakes are all over
 

Kasaman

Elder Lister
When is the international day for the shemales?
 

Mongrel

Elder Lister
Ah, if she doesn't want to, she won't be coaxed; quit the childishness. Come to me, baby girl.
 

punce

New Lister
The Bill Hader video is an expertly crafted deepfake, built on technology invented in 2014 by Ian Goodfellow, then a Ph.D. student and now at Apple. Most deepfake technology is based on generative adversarial networks (GANs), in which two neural networks are trained against each other: a generator produces fake images and a discriminator tries to tell them from real ones.
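For anyone curious what "generative adversarial network" means in practice, here is a minimal sketch in PyTorch of the adversarial setup. The tiny network sizes, the 28x28 toy image shape and the random tensors standing in for real face crops are all illustrative assumptions; production deepfake models are vastly larger and train on aligned face datasets.

```python
# Minimal GAN sketch: a generator learns to produce images that a discriminator
# cannot tell apart from "real" ones. Random tensors stand in for real face crops.
import torch
import torch.nn as nn

LATENT_DIM = 64          # size of the random noise vector fed to the generator
IMG_DIM = 28 * 28        # toy flattened image size; face models use far larger inputs

generator = nn.Sequential(
    nn.Linear(LATENT_DIM, 256), nn.ReLU(),
    nn.Linear(256, IMG_DIM), nn.Tanh(),       # fake "image" with values in [-1, 1]
)
discriminator = nn.Sequential(
    nn.Linear(IMG_DIM, 256), nn.LeakyReLU(0.2),
    nn.Linear(256, 1),                        # raw score: real vs. fake
)

opt_g = torch.optim.Adam(generator.parameters(), lr=2e-4)
opt_d = torch.optim.Adam(discriminator.parameters(), lr=2e-4)
loss_fn = nn.BCEWithLogitsLoss()

for step in range(200):
    # Discriminator step: label real images 1 and generated images 0.
    real = torch.rand(32, IMG_DIM) * 2 - 1    # placeholder for a batch of real face crops
    fake = generator(torch.randn(32, LATENT_DIM)).detach()  # detach: no generator update here
    d_loss = loss_fn(discriminator(real), torch.ones(32, 1)) + \
             loss_fn(discriminator(fake), torch.zeros(32, 1))
    opt_d.zero_grad(); d_loss.backward(); opt_d.step()

    # Generator step: try to make the discriminator call its fakes "real".
    fake = generator(torch.randn(32, LATENT_DIM))
    g_loss = loss_fn(discriminator(fake), torch.ones(32, 1))
    opt_g.zero_grad(); g_loss.backward(); opt_g.step()

print(f"final losses - D: {d_loss.item():.3f}, G: {g_loss.item():.3f}")
```

In a real face-swap pipeline the generator would be conditioned on a target identity and a driving video rather than pure noise, but the adversarial training loop is the same idea.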
 

Kamau wa Kíríro

Elder Lister
Reminds me of Winston, the AI character in Dan Brown's Origin....
 

punce

New Lister
That's why it's a good idea to know how to spot deepfake videos; a rough automated check for the first sign is sketched after this list.
1. Unnatural eye movement
2. Unnatural facial expressions
3. Awkward facial-feature positioning
4. A lack of emotion
5. An awkward-looking body or posture
6. Unnatural body movement
7. Unnatural coloring
8. Hair that doesn't look real
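As a rough illustration of how the first sign (unnatural eye movement and blinking) could be checked automatically, here is a sketch using OpenCV's stock Haar cascades. The video filename is a placeholder and the "no blink for 150 frames" threshold is an arbitrary assumption; this is a toy heuristic, not a validated deepfake detector.

```python
# Toy blink-rate heuristic with OpenCV Haar cascades: deepfaked faces have been
# reported to blink less often than real ones, so a long run of frames where a
# face is visible but the eyes never momentarily disappear is worth a second look.
import cv2

face_cascade = cv2.CascadeClassifier(cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
eye_cascade = cv2.CascadeClassifier(cv2.data.haarcascades + "haarcascade_eye.xml")

cap = cv2.VideoCapture("suspect_clip.mp4")   # placeholder path to the video under review
frames_since_blink = 0
suspicious_runs = 0
BLINK_GAP_LIMIT = 150                        # ~5 s at 30 fps with no blink: arbitrary threshold

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = face_cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(faces) == 0:
        continue                             # only count frames where a face is on screen
    x, y, w, h = faces[0]
    eyes = eye_cascade.detectMultiScale(gray[y:y + h, x:x + w], scaleFactor=1.1, minNeighbors=5)
    if len(eyes) >= 2:
        frames_since_blink += 1              # both eyes open and detected
    else:
        frames_since_blink = 0               # eyes momentarily lost: treat as a blink
    if frames_since_blink > BLINK_GAP_LIMIT:
        suspicious_runs += 1
        frames_since_blink = 0

cap.release()
print(f"long no-blink runs found: {suspicious_runs}")
```

A flagged clip isn't proof of anything; it just tells you to look more closely at the other signs on the list.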
 