When you listen to music and want to dance, your brain sends signals that get lost along the way and never reach your limbs in the right order. You feel as if you were made of reinforced concrete. You have forgotten how to move your joints. You are terrible at dancing. Don't worry: despite all this, you too can be Beyoncé, thanks to machine learning. A computer program can take the movements of the pop goddess and transfer them to your body, so that you appear on screen as if you were born to star in Fame. Technology can do (almost) anything. This technique, which at first glance may not seem very useful in your life, has changed the way filmmakers, video game designers and animators develop the action in their creations.
The most established technique for digitally transferring movement from one person to another tracks it with sensors and cameras to build a 3D model. But that process is expensive and time-consuming. A technique that performs the same trick with an ordinary video, taken from a single camera, would therefore be enormously innovative. This is exactly what researcher Caroline Chan and her colleagues at the University of California, Berkeley, have done. Their technique lets them transfer the choreography of a professional dancer to an amateur with relative ease.
To make you dance like Beyoncé, only two videos are needed: a sample of the diva and another of the person whose movement must be adapted: you. "Our goal is to generate a new video of the target person with the same movements as the source," Chan explains to MIT Technology Review. But there is an intermediate step. To make the transfer, the researchers reduce the human figure to a stick figure, which takes the movements of the dancer and applies them to the amateur. That stick figure encodes the position of the limbs but not the appearance: it is the lowest common denominator the two people share (head, trunk, two legs and two arms), and it is the key to making the technique work.
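The stick-figure intermediate can be pictured as a small set of 2D joint coordinates. Below is a minimal, hypothetical Python sketch of the idea, not the authors' actual pipeline: the joint names are illustrative, and the final rendering (a trained generator painting the target person over the skeleton) is left abstract. What the sketch does show is the core trick: the skeleton carries posture, not appearance or position.

```python
# Minimal sketch of pose-based motion transfer (illustrative only).
# A real system would use a pose estimator and a trained image-to-image
# generator; here the "skeleton" is just a dict of 2D joint coordinates.

JOINTS = ["head", "neck", "hip", "l_hand", "r_hand", "l_foot", "r_foot"]

def normalize_pose(pose):
    """Express all joints relative to the hip, so the skeleton encodes
    posture (where the limbs are) but not where the person stands."""
    hx, hy = pose["hip"]
    return {j: (x - hx, y - hy) for j, (x, y) in pose.items()}

def transfer(source_pose, target_anchor):
    """Re-anchor the source's normalized posture at the target's hip
    position: the target 'performs' the source's move."""
    rel = normalize_pose(source_pose)
    ax, ay = target_anchor
    return {j: (x + ax, y + ay) for j, (x, y) in rel.items()}

# Source dancer mid-move; the target person stands elsewhere in frame.
dancer = {"head": (100, 20), "neck": (100, 40), "hip": (100, 100),
          "l_hand": (60, 30), "r_hand": (140, 30),
          "l_foot": (80, 180), "r_foot": (130, 170)}
pose_for_target = transfer(dancer, target_anchor=(300, 250))
print(pose_for_target["head"])  # (300, 170): same posture, new position
```

In the real system, a neural network trained on footage of the target person would then turn each transferred skeleton into a photorealistic frame.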
That simplicity comes with limitations. The software does not account for the different limb lengths of the source and the target, and it does not recognize when different camera angles distort or foreshorten certain poses. Sometimes it also fails to detect the correct posture because the subject is moving too fast. The only remaining question, as MIT Technology Review notes, is how the technique will reach the market.
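A toy example (my own illustration, not from the paper) shows why differing limb lengths matter: copying the dancer's raw joint positions onto a shorter person would stretch their arms. A simple fix one could imagine is to keep the source limb's direction but rescale it to the target's own limb length.

```python
# Toy illustration of the limb-length issue in pose transfer.
# Keeping the source limb's direction while rescaling to the target's
# limb length preserves the target person's proportions.
import math

def retarget_limb(src_joint, src_end, tgt_length):
    """Point the limb the way the dancer points it, but shrink or
    stretch it to the target person's own limb length."""
    sx, sy = src_joint
    ex, ey = src_end
    dx, dy = ex - sx, ey - sy
    scale = tgt_length / math.dist(src_joint, src_end)
    return (sx + dx * scale, sy + dy * scale)

# The dancer's arm spans 50 px; the target's arm is only 30 px.
shoulder, hand = (0.0, 0.0), (50.0, 0.0)
print(retarget_limb(shoulder, hand, tgt_length=30.0))  # (30.0, 0.0)
```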
For now, the 3D technology mentioned at the beginning is already widely used in the shooting of series and films, especially in action scenes and in those featuring characters that do not exist in real life and were created solely for the screen. The special effects in movies such as War for the Planet of the Apes, released last year, rely on this motion-capture technology, which made it possible to give the apes a more natural, believable way of moving while performing traditionally human actions, such as riding a horse, in a fluid way.