Deepfake used to attack an activist couple shows a new disinformation frontier


WASHINGTON (Reuters) – Oliver Taylor, a student at the University of Birmingham in England, is in his twenties with brown eyes, a light beard and a slightly stiff smile.

A combined photograph showing an image purporting to be of British student and freelance writer Oliver Taylor (L) and a heat map of the same photograph produced by Tel Aviv-based deepfake detection company Cyabra, seen in this undated handout obtained by Reuters. The heat map, produced by one of Cyabra’s algorithms, highlights areas of suspected computer manipulation. Digital inconsistencies were one of several indicators experts used to determine that Taylor was an online mirage. Cyabra/Handout via REUTERS

Online profiles describe him as a coffee lover and political junkie who grew up in a traditional Jewish home. His half-dozen freelance editorials and blog posts reveal an active interest in anti-Semitism and Jewish affairs, with bylines in the Jerusalem Post and the Times of Israel.

The catch? Oliver Taylor appears to be an elaborate fiction.

His university says it has no record of him. He has no obvious online footprint beyond an account on the question-and-answer site Quora, where he was active for two days in March. Two newspapers that published his work say they have tried and failed to confirm his identity. And forensic image experts, using state-of-the-art analysis programs, determined that Taylor’s profile photo is a hyper-realistic forgery, a “deepfake.”

Who is behind Taylor is not known to Reuters. Calls to the UK phone number he provided to publishers generated an automated error message, and he did not respond to messages left at the Gmail address he used for correspondence.

Reuters was alerted to Taylor by London academic Mazen Masri, who drew international attention in late 2018 when he helped launch an Israeli lawsuit against the surveillance company NSO on behalf of alleged Mexican victims of the company’s hacking technology.

In an article published in the American Jewish newspaper The Algemeiner, Taylor accused Masri and his wife, Palestinian rights defender Ryvka Barnard, of being “well-known supporters of terrorism.”

Masri and Barnard were surprised by the accusation, which they deny. But they were also baffled as to why a university student would single them out. Masri said he pulled up Taylor’s profile photo. He couldn’t put his finger on it, he said, but something about the young man’s face “seemed off.”

Six experts interviewed by Reuters say the image has the characteristics of a deep fake.

“The distortion and inconsistencies in the background are a telltale sign of a synthesized image, as are some glitches around his neck and collar,” said digital image forensics pioneer Hany Farid, who teaches at the University of California, Berkeley.

Artist Mario Klingemann, who regularly uses deepfakes in his work, said the photo “has all the hallmarks.”

“I am 100 percent sure,” he said.


‘A VENTRILOQUIST’S DUMMY’

The Taylor persona is a rare in-the-wild example of a phenomenon that has become a key anxiety of the digital age: the marriage of deepfakes and disinformation.

The threat is drawing growing concern in Washington and Silicon Valley. Last year, House of Representatives Intelligence Committee chairman Adam Schiff warned that computer-generated video could “turn a world leader into a ventriloquist’s dummy.” Last month, Facebook announced the conclusion of its Deepfake Detection Challenge, a competition intended to help researchers automatically identify falsified footage. Last week, online publication The Daily Beast revealed a network of fake journalists, part of a larger group of fake personas spreading propaganda online.

Deep fakes like Taylor are dangerous because they can help build “a totally untraceable identity,” said Dan Brahmy, whose Israel-based startup Cyabra specializes in detecting such images.

Brahmy said investigators pursuing the origin of such photos are left “searching for a needle in a haystack, except that the needle does not exist.”

Taylor appears to have had no online presence until he began writing articles in late December. The University of Birmingham said in a statement that it could not find “any record of this person using these details.” Editors at the Jerusalem Post and The Algemeiner say they published Taylor after he pitched them stories cold via email. He did not ask for payment, they said, and they took no aggressive steps to verify his identity.

“We are not a counterintelligence operation,” said Algemeiner editor-in-chief Dovid Efune, although he noted that the newspaper had since introduced new safeguards.

After Reuters began asking about Taylor, The Algemeiner and the Times of Israel removed his work. Taylor emailed both newspapers protesting the move, but Times of Israel opinion editor Miriam Herschlag said she rebuffed him after he was unable to prove his identity. Efune said he did not respond to Taylor’s messages.


The Jerusalem Post and Arutz Sheva have kept Taylor’s articles online, although the latter removed the “supporters of terrorism” reference following a complaint by Masri and Barnard. Post editor-in-chief Yaakov Katz did not respond when asked whether Taylor’s work would stay up. Arutz Sheva editor Yoni Kempinski said only that “in many cases” media outlets “use pseudonyms to write opinion pieces.” Kempinski declined to elaborate or to say whether he considered Taylor a pseudonym.

Oliver Taylor’s articles drew minimal social media engagement, but Herschlag of the Times of Israel said they were still dangerous, not only because they could distort public discourse but also because they risked making editors in her position less willing to take chances on unknown writers.

“Absolutely we need to screen out impostors and up our defenses,” she said. “But I don’t want to erect barriers that prevent new voices from being heard.”

Report by Raphael Satter; editing by Chris Sanders and Edward Tobin

Our Standards: The Thomson Reuters Trust Principles.