′Deepfake′ technology sees dramatic rise | News | DW | 07.10.2019

News

'Deepfake' technology sees dramatic rise

The technology that can realistically manipulate footage is mostly being used for pornography, says a new report. It has also caused several major political scandals already.

A new report published Monday on the state of so-called "deepfakes" — highly manipulated audio and video, usually of famous people or politicians — shows that use of the deceptive technology is not only increasing drastically, it is also creating real-world political crises.

The report by Deeptrace, an Amsterdam-based cybersecurity firm that is amongst the first to specifically offer protection from deepfake technology, also found that an overwhelming proportion, 96% in fact, of these videos are pornographic material. Specifically, they mostly depict the faces of famous women convincingly grafted onto nude footage.

"Despite being a relatively new phenomenon, deepfake pornography has already attracted a large viewership" of more than 134 million views, the report found. Most of those views occur on sites dedicated solely to deepfakes, but eight of the top ten porn sites on the internet also host some of the fake videos.

The report noted that while many of the targets of the malicious tech were American and British actresses, most of the content was being created in China and South Korea. Indeed, after Western actresses, the biggest targets for the porn creators were female K-pop stars.

Deepfake prompts coup attempt

Deeptrace found that 100% of the porn videos were of women. However, manipulated videos on YouTube, such as the well-known footage purporting to show US House Speaker Nancy Pelosi "babbling," (and viewed at least 2.2 million times) are mostly of men — 61% to be precise.

These videos have already caused a scandal in the Malaysian government, as well as an attempted coup in Gabon, after a video of President Ali Bongo, filmed shortly after he had suffered a stroke, was mistakenly labeled a deepfake because of his odd-sounding speech.

Alongside deepfakes, there is also the problem of "shallowfakes": less heavily manipulated videos that are usually used to promote a particular political agenda. One example is a video shared by the administration of US President Donald Trump that manipulated the body movements of CNN journalist Jim Acosta to make it look like he was behaving more aggressively toward a White House aide than he had in reality.

The report also warned that the ever-increasing number of available apps, service portals, and professional deepfake creators is also helping online scammers, spies, and politically-motivated trolls.

New tool for trolls, spies, scammers

Whereas a fake picture used for a social media account used to be traceable to its original source, fraudsters can now create synthetic photos of non-existent people that cannot be traced. Deeptrace found one case in which a "foreign spying operation" used a LinkedIn profile purporting to belong to a US academic, and another in which a Twitter profile claimed to belong to a business journalist in what appeared to be an attempt to scam investors in the car company Tesla.


Another startling new trend Deeptrace found was the emergence of a technology called DeepNude, which can scan a picture of a clothed woman and create a realistic photo of what she might look like naked. The technology has been developed only for women and does not work on male bodies. Although the website was taken down by its creators, the technology is still being used and was sold in July to an anonymous buyer.

The term "deepfake" was coined by a user on the website Reddit in 2017. Deeptrace found nearly 15,000 deepfake videos available online in September 2019, an almost 100% increase from the roughly 8,000 it found in December 2018. It also found that online deepfake websites and communities on sites like 4chan and 8chan have at least 100,000 members, despite Reddit taking down the original deepfake thread.
