Cyberpunk 2077’s dialogue lip sync is generated by AI




The goal was to do that for every character in the open world. Because of the game’s huge scope, CDPR needed to pull it off with zero facial motion capture. To do so, the studio leveraged Jali Research’s facial animation and lip sync technology to procedurally generate how characters’ faces move.
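Jali’s technology itself isn’t public, but the general idea behind audio-driven procedural lip sync can be sketched: align the recorded dialogue to phonemes, map each phoneme to a mouth shape (viseme), and emit keyframes for the face rig. The Python below is only a toy illustration under those assumptions; the viseme table, timings, and function names are invented for the example and are not CDPR’s or Jali’s code.

```python
# Toy illustration of phoneme-to-viseme lip sync, not Jali's actual pipeline.
# Input: a list of (phoneme, start_time, end_time) tuples, e.g. from forced alignment.

VISEME_MAP = {
    # Hypothetical grouping of phonemes into mouth shapes (visemes).
    "AA": "open", "AE": "open", "AH": "open",
    "B": "closed", "M": "closed", "P": "closed",
    "F": "teeth_on_lip", "V": "teeth_on_lip",
    "OW": "rounded", "UW": "rounded",
}

def visemes_to_keyframes(phoneme_track):
    """Turn a timed phoneme track into (time, viseme) keyframes for a face rig."""
    keyframes = []
    for phoneme, start, end in phoneme_track:
        viseme = VISEME_MAP.get(phoneme, "neutral")
        # One keyframe at the phoneme's midpoint; a real system would blend curves.
        keyframes.append(((start + end) / 2.0, viseme))
    return keyframes

if __name__ == "__main__":
    track = [("HH", 0.00, 0.08), ("AH", 0.08, 0.20), ("L", 0.20, 0.28), ("OW", 0.28, 0.45)]
    for t, v in visemes_to_keyframes(track):
        print(f"{t:.2f}s -> {v}")
```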

Some of the characters in Cyberpunk 2077 speak several languages, sometimes switching between them within a single sentence. To account for that, CDPR tags the dialogue transcripts. Tags also adjust a character’s facial expression when their emotional state changes partway through a line. On top of that, the system uses audio analysis to carry the emotion of a vocal performance into the animations.
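CDPR hasn’t detailed its exact tag syntax, but one way to picture transcript tagging is inline markers that switch language or emotion mid-line, which the animation system then reads per segment. The sketch below uses a hypothetical `<lang:...>` / `<emotion:...>` syntax purely for illustration; it is an assumption, not the game’s actual format.

```python
import re

# Hypothetical tag syntax for illustration only, not CDPR's transcript format.
# Tags like <lang:jp> or <emotion:angry> switch state mid-line; untagged text
# inherits the current language and emotion.

TAG_RE = re.compile(r"<(lang|emotion):(\w+)>")

def parse_tagged_line(line, lang="en", emotion="neutral"):
    """Split a tagged dialogue line into (text, language, emotion) segments."""
    segments = []
    pos = 0
    for match in TAG_RE.finditer(line):
        text = line[pos:match.start()].strip()
        if text:
            segments.append((text, lang, emotion))
        # Update whichever state the tag names, keeping the other one.
        if match.group(1) == "lang":
            lang = match.group(2)
        else:
            emotion = match.group(2)
        pos = match.end()
    tail = line[pos:].strip()
    if tail:
        segments.append((tail, lang, emotion))
    return segments

if __name__ == "__main__":
    line = "You got a lot of nerve, <emotion:angry>choom. <lang:jp>Nani o shiteru?"
    for text, lang, emo in parse_tagged_line(line):
        print(f"[{lang}/{emo}] {text}")
```

Each segment carries its own language and emotion, so a downstream lip sync and expression pass could process a single line in chunks rather than assuming one language and one mood per recording.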

CDPR used algorithms to help animate scenes in The Witcher 3, and this is an evolution of that approach. Many games use motion capture to sync dialogue in a single language, which can break immersion for players who switch to another one.

While Cyberpunk 2077’s procedurally generated animations may not be as detailed or expressive as those in some other AAA games, they could improve the experience for the many players who prefer to play in other languages. You can get a closer look at how that works in practice when the game arrives on November 19.


