Capturing human behaviour

{Our Production Assistant Ellie has been getting her head, face and body into motion capture as part of research into streamlining our processes for soft skills VR training development. We asked her to talk us through her research and findings as part of a peek behind the curtain at how we create what we do.}

Utilising Motion Capture for Virtual Reality Soft Skills Training

People are complicated. Human behaviour has been studied for decades, yet many people still struggle to read social behaviours in their day-to-day lives.

Soft skills training uses fictional scenarios to help the user recognise social cues, build empathy, and improve their customer service skills. Through clever scripting and character design, it’s possible to create scenarios that are commonplace in the work environment, making them relatable to the end user. However, even with a good script and realistic-looking characters, behaviours and reactions can appear unrealistic or exaggerated. This can break the illusion of the training, and the user is reminded that it’s not real. Effective character animation can mean the difference between an engaging, impactful learning experience and ‘just another training module to sit through’.

Developing VR soft skills

Characters need to be believable and relatable. If this can be achieved, the user may forget they’re talking to a pre-scripted character and therefore feel more comfortable in their decision-making during the training.

Before animating a 3D character, its design needs to outline who the character is as a person and how they behave. Are they cheerful, stubborn, careful? Who are their family, friends, partners? What do they prioritise in their daily life? Studying real people’s behaviours, motives, speech patterns and body language gives us traits we can apply to fictional characters, making them more believable and relatable.

Animating 3D Characters

It is possible to animate a character by hand (without motion capture), and many games and films have done this successfully for years. However, when animating humans, it is very easy for realistic-looking characters to appear unnatural in their movements. There are cheap libraries of animations that can be applied to human 3D models, but these tend to suit prototype tests or characters in the background of a scene. With soft skills, we’re asking learners to get up close and personal with 3D humans, and stock animations need a layer of uniqueness to bring them to the next, more engaging level.

Motion capture trackers and suits (using 6DoF*) record the coordinates of a person’s body within a virtual 3D space. This movement data can then be applied to a 3D character. It also captures subtle movements that wouldn’t otherwise be noticed, such as eye and shoulder movements. Capturing these subtleties saves time animating the character and results in a much more organic, natural performance.
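
To make that concrete, below is a minimal Blender Python sketch of what ‘applying the movement data to a 3D character’ can look like once the samples are exported. The object name, bone names and data layout are placeholders for illustration, not output from any particular tracking system:

    # A minimal sketch: keyframing recorded 6DoF samples onto a Blender armature.
    # The bone mapping and data layout are assumptions made for illustration.
    import bpy

    # Hypothetical recording: per tracked point, one (position, quaternion)
    # sample per frame - (x, y, z) in metres and (w, x, y, z) rotation.
    recording = {
        "hips":      [((0.0, 0.0, 1.0), (1.0, 0.0, 0.0, 0.0))],  # frames 1, 2, ...
        "left_foot": [((0.2, 0.0, 0.1), (1.0, 0.0, 0.0, 0.0))],
    }

    armature = bpy.data.objects["Character"]  # object name is an assumption

    for bone_name, samples in recording.items():
        bone = armature.pose.bones[bone_name]
        bone.rotation_mode = 'QUATERNION'
        for frame, (position, rotation) in enumerate(samples, start=1):
            bone.location = position              # relative to the rest pose
            bone.rotation_quaternion = rotation
            bone.keyframe_insert(data_path="location", frame=frame)
            bone.keyframe_insert(data_path="rotation_quaternion", frame=frame)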

Necessary software and hardware – Body Capture

Firstly, let’s look at what is necessary for mo-cap and what hardware would be most appropriate, both in terms of ease of use and cost.

There are two types of body capture:

  • One – where the actor has individual trackers strapped to their body
  • Two – where all the trackers are built into a one-piece suit fitted to the actor

Purchasing a one-piece motion capture suit is a big investment, so using HTC Vive Tracker pucks for body mo-cap can be a good alternative. Many VR enthusiasts have DIY-ed their own low-budget set-ups for full-body tracking in VR games, and with a similar set-up it is possible to record motion data in much the same way as a dedicated mo-cap suit.

Setting up the HTC Vive Trackers* for the first time involved a lot of trial and error. Some pucks had trouble pairing with the PC and, even when they were paired, tracking would be lost mid-performance, which disrupted the recordings. Even once the data was recorded successfully, it needed to be converted to a humanoid skeleton before it could be applied to the 3D character. This approach ended up being far less efficient because of the time it took to set up, record, clean up capture disruptions and apply the data to the character.
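
To give a flavour of that clean-up step, here’s a small, generic Python sketch that patches short tracking dropouts by interpolating between the surrounding good samples. The data layout (None marking a frame where tracking was lost) is an assumption for illustration:

    # A sketch of one clean-up step: filling short tracking dropouts by
    # linearly interpolating positions between the last and next good samples.
    def fill_dropouts(samples):
        """samples: list of (x, y, z) tuples, with None where tracking was lost."""
        filled = list(samples)
        i = 0
        while i < len(filled):
            if filled[i] is not None:
                i += 1
                continue
            start = i - 1                     # last good frame before the gap
            end = i
            while end < len(filled) and filled[end] is None:
                end += 1                      # first good frame after the gap
            if start < 0 or end == len(filled):
                raise ValueError("Gap touches the start/end of the take; re-record")
            a, b = filled[start], filled[end]
            span = end - start
            for j in range(i, end):
                t = (j - start) / span
                filled[j] = tuple(a[k] + (b[k] - a[k]) * t for k in range(3))
            i = end
        return filled

    # e.g. fill_dropouts([(0, 0, 0), None, None, (0.3, 0, 0)])
    # -> approximately [(0, 0, 0), (0.1, 0, 0), (0.2, 0, 0), (0.3, 0, 0)]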

Creating a DIY motion capture suit can be low-cost and fun to use in social VR worlds, but for character animation, investing in an off-the-shelf product designed for precision tracking is the way to go. With limited development budgets, in terms of both cost and time, the systems we use likely won’t be seen on the set of the next Planet of the Apes, but they suffice for soft skills training needs.

Necessary software and hardware – Facial Capture

For facial capture, one option was to paint dots onto the actor’s face, record with a standard camera and then transfer the dot-tracking data onto a 3D character. This is doable in Blender*; however, with over 200 spoken animation clips to record, it would have meant repainting the dots for multiple shoots over several days.
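
For anyone curious what that route looks like, here’s a rough Blender Python sketch that reads one tracked dot from the Movie Clip Editor and drives a facial shape key from its vertical movement. The clip, track and shape key names, and the scaling, are illustrative rather than taken from our production set-up:

    # A rough sketch of the painted-dot route: read a 2D motion-tracking marker
    # from Blender's Movie Clip Editor and keyframe a shape key from it.
    import bpy

    clip = bpy.data.movieclips["face_take01.mp4"]      # footage tracked beforehand
    track = clip.tracking.tracks["dot_brow_left"]      # one painted dot per track
    key = bpy.data.objects["Head"].data.shape_keys.key_blocks["browUp_L"]

    neutral_y = track.markers.find_frame(1).co.y       # dot position at rest

    for frame in range(1, clip.frame_duration + 1):
        marker = track.markers.find_frame(frame)
        if marker is None:
            continue                                   # dot lost on this frame
        # Vertical displacement (in normalised clip space) drives the shape key,
        # scaled by an eyeballed gain; a real pipeline would calibrate this.
        key.value = max(0.0, min(1.0, (marker.co.y - neutral_y) * 10.0))
        key.keyframe_insert(data_path="value", frame=frame)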

The next option was to use an iOS device with a depth-sensing front camera (iPhone X or later, running iOS 14.2 or later) with a motion capture app (Face Cap – Motion Capture). The app makes it easy to record anyone’s face and export the mo-cap data in a universal format (FBX*). The facial mo-cap is recorded onto the Face Cap app’s model using Blendshapes*, and the eye tracking is recorded as rotations of the Face Cap eyeball model.
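
Once exported, those recorded weights are straightforward to replay onto any character that has matching deformation targets. Here’s a minimal Blender Python sketch, assuming the Face Cap data has already been parsed into per-frame weights (the names and values below are illustrative):

    # A minimal sketch: keyframing per-frame blendshape weights onto matching
    # shape keys in Blender. The parsed data and names are assumptions.
    import bpy

    weights_per_frame = {
        1: {"jawOpen": 0.30, "eyeBlink_L": 0.00},      # illustrative values
        2: {"jawOpen": 0.45, "eyeBlink_L": 0.85},
    }

    key_blocks = bpy.data.objects["Head"].data.shape_keys.key_blocks

    for frame, weights in sorted(weights_per_frame.items()):
        for name, weight in weights.items():
            block = key_blocks.get(name)
            if block is None:
                continue                               # character lacks this target
            block.value = weight
            block.keyframe_insert(data_path="value", frame=frame)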

It was impressive how eye tracking added a level of animation that had previously been overlooked. People say the eyes are the windows to the soul, and this was particularly noticeable when compared with our earlier character animations. A subtle sideways glance, a look up to the ceiling in thought, a roll of the eyes: all made for a much more engaging 3D character.

Conclusion

Combining body and facial capture on a single 3D character makes for a much more believable performance, which enhances the entire VR training experience. Hand-animating a good character performance can be time-costly, but the subtlety of motion found in motion capture makes a character more believable.

This is why, at Make Real, the development team have spent considerable time and effort investigating and defining internal processes for motion capture. This has enabled the art and code teams to work in a more streamlined, efficient manner, reducing development time and costs whilst maintaining an assured level of quality output.

Glossary:

6DoF – 6 Degrees of Freedom
Blendshapes – a 3D character’s facial deformation targets
Blender – a free, open-source 3D software
FBX – a widely supported 3D interchange file format (used here to bring mo-cap data into Unity)
Trackers – additional hardware devices that can be tracked by the HTC Vive system to add more points of reference

Get in touch

We’re always happy to talk to you about how immersive technologies can engage your employees and customers. If you have a learning objective in mind, or simply want to know more about emerging technologies like VR, AR, or AI, send us a message and we’ll get back to you as soon as we can.