Cyberpunk 2077's dialogue was lip-synced by AI


The goal was to do this for every character in the open world. Given the vast scope of the game, CDPR needed to do it all with zero motion capture. To pull that off, the studio tapped the lip-syncing and facial animation tech of JALI Research to drive how the characters' faces move.
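At a high level, procedural lip sync of this kind takes the recorded audio and its transcript, aligns phonemes to timestamps, and maps each phoneme to a viseme (mouth shape) that the facial rig can blend, with no facial capture session involved. The Python sketch below illustrates only that general idea; the phoneme timings, viseme names, and keyframe format are invented for the example and are not CDPR's or JALI Research's actual data or tooling.

```python
# Minimal sketch of procedural lip sync: aligned phonemes -> viseme keyframes.
# All phoneme/viseme names and timings here are invented for illustration;
# they are not taken from CDPR's or JALI Research's pipeline.

# A forced-alignment step would normally produce (phoneme, start, end) tuples
# from the audio plus its transcript. Here we hard-code a tiny example.
ALIGNED_PHONEMES = [
    ("HH", 0.00, 0.08),
    ("EH", 0.08, 0.20),
    ("L",  0.20, 0.28),
    ("OW", 0.28, 0.45),
]

# Hypothetical mapping from phonemes to mouth shapes (visemes).
PHONEME_TO_VISEME = {
    "HH": "breath",
    "EH": "open_mid",
    "L":  "tongue_up",
    "OW": "round",
}

def visemes_from_alignment(aligned):
    """Turn aligned phonemes into viseme keyframes an animation rig could blend.

    A real system would overlap and co-articulate neighbouring shapes; this
    sketch simply keys each viseme on at the phoneme start and off at its end.
    """
    keyframes = []
    for phoneme, start, end in aligned:
        viseme = PHONEME_TO_VISEME.get(phoneme, "neutral")
        keyframes.append({"time": start, "viseme": viseme, "weight": 1.0})
        keyframes.append({"time": end, "viseme": viseme, "weight": 0.0})
    return keyframes

if __name__ == "__main__":
    for kf in visemes_from_alignment(ALIGNED_PHONEMES):
        print(kf)
```

Because the animation is generated from audio and text rather than a capture session, the same pipeline can be rerun for every localized voice track.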

Some characters in Cyberpunk 2077 speak multiple languages, sometimes switching between them within a single sentence. To account for this, transcript tagging was used. Tags also help adjust a character's facial expression when their emotional state changes partway through a line of dialogue. The system also used audio analysis to mirror the emotion of the voice performance in the animation.
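The article doesn't describe the actual tag format CDPR and JALI use, so the markup below is purely hypothetical. It only shows the general idea: inline language and emotion tags in a transcript give the animation system the cues it needs to switch phoneme sets and facial expression partway through a line.

```python
# Illustrative only: a made-up inline tag format for a dialogue transcript,
# not the actual markup used by CDPR or JALI Research.
import re

LINE = "[lang=en][emotion=calm]You look lost.[emotion=angry] [lang=ja]早く行け!"

TAG_PATTERN = re.compile(r"\[(lang|emotion)=([^\]]+)\]")

def parse_tagged_line(line):
    """Split a tagged line into text segments, each carrying the active
    language and emotion, so a downstream system could switch phoneme sets
    and facial expression mid-sentence."""
    state = {"lang": "en", "emotion": "neutral"}
    segments = []
    pos = 0
    for match in TAG_PATTERN.finditer(line):
        text = line[pos:match.start()]
        if text.strip():
            # Snapshot the state that was active while this text was spoken.
            segments.append({"text": text.strip(), **state})
        state[match.group(1)] = match.group(2)
        pos = match.end()
    tail = line[pos:]
    if tail.strip():
        segments.append({"text": tail.strip(), **state})
    return segments

if __name__ == "__main__":
    for seg in parse_tagged_line(LINE):
        print(seg)
```

Running the sketch yields one segment tagged English/calm and a second tagged Japanese/angry, which is the kind of mid-line switch the article describes.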

CDPR used algorithms to help animate The Witcher 3, and this is an evolution of that approach. Many games use motion capture to lip-sync dialogue in only one language, which can break immersion slightly for players who switch to a different language.

While the procedurally generated animation in Cyberpunk 2077 may not be as detailed or expressive as in some other AAA games, it could enhance the experience for the many players who prefer to play in different languages. When the game comes out on November 19th, you'll be able to take a closer look at how it works in practice.