
Virtual Reality : Motion capture with Unity and Oculus Quest

At VRProject, we needed to animate some 3D characters for our project. One solution would have been to keyframe the animations by hand, in Blender for example. But that is hugely time-consuming and the result rarely looks natural.

Another solution is to own or rent a motion capture studio.

Motion Capture Studio

The quality is very good, but so is the price! And if you get your animation wrong, it becomes very expensive!

So at VRProject, we needed a fast, reliable way to record animations together with sound. Blender and a classic motion capture studio were not right for us, so we found another solution: use the Oculus Quest with its 6DoF controllers for motion capture, and a microphone headset to record the audio.

And guess what? The result is pretty good!

So here is how we did it!

General logic

Our motion capture studio is split into three parts:

  • An application on the Oculus Quest that, with a single click, records the controllers' and headset's positions / rotations and sends them to a Node.js server.
  • A Node.js server with socket.io that relays the data as JSON between the Oculus Quest client and the editor application (see the example packet below).
  • An application running in the Unity Editor that receives the record command and the telemetry from the Oculus Quest, and saves them as an animation clip and a wav file.
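
To give an idea of what travels over the wire, here is roughly what a single telemetry packet looks like once JsonUtility has serialized the packet class shown later in socketClient.cs (the values are made up and the JSON is formatted for readability; one such packet is emitted per "telemetry" socket.io event):

{
  "left_hand":  { "position": { "x": -0.2, "y": 1.1, "z": 0.3 },
                  "rotation": { "x": 0.0, "y": 0.0, "z": 0.0, "w": 1.0 } },
  "right_hand": { "position": { "x": 0.25, "y": 1.1, "z": 0.3 },
                  "rotation": { "x": 0.0, "y": 0.0, "z": 0.0, "w": 1.0 } },
  "head":       { "position": { "x": 0.0, "y": 1.6, "z": 0.0 },
                  "rotation": { "x": 0.0, "y": 0.0, "z": 0.0, "w": 1.0 } }
}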

For the occasion, we bought a wireless headset microphone (Corsair HS70) that connects directly to the PC. It was simpler to pass only the telemetry through the server: audio can be quite heavy, and keeping it local avoids sync problems.

HS70 – Corsair: ~$70

Since we use an asset that we bought online (Best HTTP Pro), we cannot publish the project on GitHub. But here is how we did it, along with the code that will let you do the same!

The client

View in the client

First, for the networking part, to communicate between the client, the server and the editor, we used socket.io: it's free.

For both the client and the editor, we used an excellent asset: BestHTTP Pro

https://assetstore.unity.com/packages/tools/network/best-http-10872

It's quite good and makes using sockets with Node.js very simple.

The code on the client side is very simple. We just record the controllers' and headset's positions and send them to our Node.js server.

The code “socketClient.cs”:

using BestHTTP.SocketIO;
using System;
using System.Collections;
using System.Collections.Generic;
using UnityEngine;

[Serializable]
public class telemetry_packet
{
    public TF left_hand;
    public TF right_hand;
    public TF head;
}

[Serializable]
public class TF
{
    public Vector3 position;
    public Quaternion rotation;

    public TF(Transform t)
    {
        this.position = t.localPosition;
        this.rotation = t.localRotation;
    }
}

public class socketClient : MonoBehaviour
{
    SocketManager Manager;
    public const string localSocketUrl = "http://localhost:7000/socket.io/";

    public const string webSocketUrl = "http://yourInternetEndPoint:7000/socket.io/";
    telemetry_packet tp;

    public GameObject right_hand;
    public GameObject left_hand;
    public GameObject head;
    // Start is called before the first frame update
    void Start()
    {

        timeFrame = 1f / fps;
        timeTelemetry = timeFrame;
        SocketOptions options = new SocketOptions();
        options.AutoConnect = false;
        options.ConnectWith = BestHTTP.SocketIO.Transports.TransportTypes.WebSocket;

        Manager = new SocketManager(new Uri(webSocketUrl), options);
        Manager.Socket.On("connect", OnConnect);
        Manager.Socket.On("disconnect", OnDisconnect);
        
        Manager.Open();
    }
    bool connect = false;
    bool record = false;

    void OnConnect(Socket socket, Packet packet, params object[] args)

    {
        tp = new telemetry_packet();
        connect = true;
        Debug.Log("connected");

    }

    void OnDisconnect(Socket socket, Packet packet, params object[] args)
    {
        connect = false;
        Debug.Log("disconnected");

    }

    bool click = false;
    float time = 0f;
    float fps = 60;
    float timeFrame = 0f;
    float timeTelemetry = 0f;
    public Material skybox;


    // Capture the current headset / controller transforms, serialize them to JSON
    // and emit them to the server as a "telemetry" event.
    IEnumerator Tele ()
    {
        tp.head = new TF(head.transform);
        tp.right_hand = new TF(right_hand.transform);
        tp.left_hand = new TF(left_hand.transform);
        string json = JsonUtility.ToJson(tp);
        yield return null;

        Manager.Socket.Emit("telemetry", json);
        yield return null;
    }
    // FixedUpdate is called at a fixed time step
    void FixedUpdate()
    {
        if (Input.GetKeyUp(KeyCode.R) || OVRInput.Get(OVRInput.Button.One))
        {
            if (!click)
            {
                time = 0f;
                click = true;
                if (record)
                {
                    skybox.SetColor("_SkyTint", Color.white);
                    record = false;
                }
                else
                {
                    skybox.SetColor("_SkyTint", Color.red);

                    record = true;
                }

                if (connect)
                {
                    Manager.Socket.Emit("setrecording", record);
                }
            }
            
        }
        else
        {
            time += Time.deltaTime;
            if (time > 1f)
            {
                click = false;
            }
            
        }

        if (connect)
        {
            timeTelemetry -= Time.deltaTime;
            if (timeTelemetry < 0f)
            {
                timeTelemetry = timeFrame;

                StartCoroutine("Tele");
            }
           
            
        }
      

    }

    void OnDestroy()
    {
        skybox.SetColor("_SkyTint", Color.white);
        // Leaving this sample, close the socket
        if (Manager != null)
        {
            Manager.Close();
        }

    }
}

The server

It's quite simple here too, since we just need to receive the packets from socket.io and re-emit them to our editor. You need to install express and socket.io (fs is built into Node.js):

Use the command npm install express socket.io, then start the server with node server.js.

The code “server.js”:

var express = require("express");
var fs = require('fs');

var app = new express();
var http = require("http").Server(app);
var io = require("socket.io")(http);

app.use(express.static(__dirname + "/public" ));

app.get('/',function(req,res){
  res.redirect('index.html');
});

io.sockets.on('connection', function (socket) {

  console.log('Someone is connected');

  socket.on('setrecording', function (value) {
    console.log('Recording changed to ' + value);
    // Forward the record / stop command to the editor
    io.emit('recording', value);
  });

  socket.on('telemetry', function (json) {
    // Forward the telemetry JSON to the editor as-is
    io.emit('telemetry', json);
  });

  socket.on('disconnect', function () {
    console.log('Disconnected');
    io.emit('clientdisconnected', true);
  });

});

console.log('Server is correctly running for the moment :-)');
http.listen(7000,function(){
console.log("Server running at port "+ 7000);
});

The editor

This is the most complicated part, but it remains quite understandable. First, to record the audio from the microphone and save it, we used DarkTable's "SavWav" project.

Why reinvent the wheel?

https://gist.github.com/darktable/2317063

First, we need to receive our telemetry from the server:

The code “socketMaster.cs”:

using BestHTTP.SocketIO;
using System;
using System.Collections;
using System.Collections.Generic;
using UnityEngine;

[Serializable]
public class telemetry_packet
{
    public TF left_hand;
    public TF right_hand;
    public TF head;
}

[Serializable]
public class TF
{
    public Vector3 position;
    public Quaternion rotation;

    public TF(Transform t)
    {
        this.position = t.position;
        this.rotation = t.rotation;
    }
}

public class socketMaster : MonoBehaviour
{
    SocketManager Manager;
    Follower follow;
    AnimMicroRecorder amr;
    public const string localSocketUrl = "http://localhost:7000/socket.io/";

    public const string webSocketUrl = "http://yourEndPoint:7000/socket.io/";
    // Start is called before the first frame update
    void Start()
    {
        SocketOptions options = new SocketOptions();
        options.AutoConnect = false;
     
        options.ConnectWith = BestHTTP.SocketIO.Transports.TransportTypes.WebSocket;
        Debug.Log(new Uri(webSocketUrl));
        Manager = new SocketManager(new Uri(webSocketUrl), options);
        Manager.Socket.On("connect", OnConnect);
        Manager.Socket.On("disconnect", OnDisconnect);
        Manager.Socket.On("telemetry", Telemetry);
        Manager.Socket.On("recording", OnStatusChange);
        Manager.Open();

        follow = GameObject.FindObjectOfType<Follower>();
        amr = GameObject.FindObjectOfType<AnimMicroRecorder>();

    }

    void OnConnect(Socket socket, Packet packet, params object[] args)

    {
        Debug.Log("connected");

    }

    void OnDisconnect(Socket socket, Packet packet, params object[] args)
    {
        Debug.Log("disconnected");
      
    }

    void Telemetry(Socket socket, Packet packet, params object[] args)
    {
        // args[0] is the JSON string emitted by the Quest client
       

        string data = (string)args[0];
        
        telemetry_packet tp = JsonUtility.FromJson<telemetry_packet>(data);
        if (follow.tp == null)
        {
            follow.tp = tp;
            follow.setHeadOriginal();
        }
        else
        {
            follow.tp = tp;
        }
        

    }

    void OnStatusChange(Socket socket, Packet packet, params object[] args)
    {
        bool record = (bool)args[0];
        Debug.Log("OnStatusChange "+ record);

        if (record)
        {
            follow.record = true;
            follow.setHeadOriginal();
            amr.startRecording();
        }
        else
        {
            amr.stopRecording();
            follow.record = false;

        }

       

    }

    void OnDestroy()
    {
        // Leaving this sample, close the socket
        if (Manager != null)
        {
            Manager.Close();
        }

    }

    // Update is called once per frame
    void Update()
    {
        
    }
}

When we receive the JSON data from the server, we deserialize it into our packet class and pass it to the "Follower" object, which mimics the client's movements.

The code “Follower.cs”:

using System.Collections;
using System.Collections.Generic;
using UnityEngine;

public class Follower : MonoBehaviour
{
    [HideInInspector]
    public telemetry_packet tp;

    public Transform root;
    TF original_pos;
    // Start is called before the first frame update
    void Start()
    {
        original_pos = new TF(root);
    }


    public GameObject right_hand;
    public GameObject left_hand;
    public GameObject head;
    public bool record = false;


    Vector3 originalHeadPosition = Vector3.zero;
    public void setHeadOriginal ()
    {
        if (tp != null)
        {
            originalHeadPosition = tp.head.position;
        }
    }
    // FixedUpdate is called at a fixed time step
    void FixedUpdate()
    {
        if (tp != null)
        {
            // Offset everything by the head position captured when recording started,
            // so the character stays near the rig's origin wherever the player stands.
            Vector3 right = tp.right_hand.position - originalHeadPosition;
            //right.y = tp.right_hand.position.y;

            right_hand.transform.localPosition = right;

            Vector3 left = tp.left_hand.position - originalHeadPosition;
           //left.y = tp.left_hand.position.y;
            left_hand.transform.localPosition = left;

            head.transform.localPosition = tp.head.position - originalHeadPosition;

            right_hand.transform.rotation = tp.right_hand.rotation;

            left_hand.transform.rotation = tp.left_hand.rotation;

            head.transform.rotation = tp.head.rotation;


        }
       

    }
}

Now that everything is moving together, we just need to save our animation and our sound. To save the animation, we used the Unity editor utility called GameObjectRecorder, which can record an object's transforms and save them as an animation clip. For the sound, we used DarkTable's "SavWav" project. I take the opportunity to thank him for his great work!

The code “AnimMicroRecorder.cs”:

using System.Collections;
using System.Collections.Generic;
using UnityEditor;
using UnityEditor.Animations;
using UnityEngine;

public class AnimMicroRecorder : MonoBehaviour
{
    private AnimationClip clip;

    private GameObjectRecorder m_Recorder;

    public GameObject root_target;

    AudioClip _RecordingClip;

    int _SampleRate = 44100;    // Audio sample rate

    int _MaxClipLength = 300;  // Maximum length of audio recording

    public float _TrimCutoff = .01f;  // Minimum volume of the clip before it gets trimmed    


    public Material mat_skybox;

    // Start is called before the first frame update
    void Start()
    {
       
    }

    float timeRecord = 0f;
    bool record = false;
    float timefps = 0f;
    public void startRecording ()
    {
        mat_skybox.SetColor("_SkyTint", Color.red);

        Debug.Log("Start Recording");
        clip = new AnimationClip();
        m_Recorder = new GameObjectRecorder(root_target);
        m_Recorder.BindComponentsOfType<Transform>(root_target, true);

        timeRecord = 0f;
        timefps = 1f / fps;
        timeFrame = 0f;
        record = true;
        if (Microphone.devices.Length > 0)
        {
            _RecordingClip = Microphone.Start("", true, _MaxClipLength, _SampleRate);
        }

    }
    int number = 1;
    public void stopRecording ()
    {
        mat_skybox.SetColor("_SkyTint", Color.white);

        Debug.Log("Stop Recording");

        record = false;
        m_Recorder.SaveToClip(clip, 30f);
        string fileName = System.DateTime.Now.ToString("dd-hh-mm-ss");
        string name = fileName + ".anim";
        AssetDatabase.CreateAsset(clip, "Assets/Resources/Recordings/Animations/"+ name);
        AssetDatabase.SaveAssets();



        if (Microphone.devices.Length > 0)
        {
            Microphone.End("");

            var samples = new float[_RecordingClip.samples];

            _RecordingClip.GetData(samples, 0);

            List<float> list_samples = new List<float>(samples);

            int numberOfSamples = (int)(timeRecord * (float)_SampleRate);

            list_samples.RemoveRange(numberOfSamples, list_samples.Count - numberOfSamples);

            var tempclip = AudioClip.Create("TempClip", list_samples.Count, _RecordingClip.channels, _RecordingClip.frequency, false, false);

            tempclip.SetData(list_samples.ToArray(), 0);
            string path = Application.dataPath + "\\Resources\\Recordings\\Animations\\" + fileName;
            SavWav.Save(path, tempclip);
        }

        
      


    }

    float fps = 30f;

    float timeFrame = 0f;
    // FixedUpdate is called at a fixed time step
    void FixedUpdate()
    {
        

        if (Input.GetKeyDown(KeyCode.R))
        {
            if (record)
            {
                stopRecording();
            }
            else
            {
                startRecording();
            }
        }
        if (record)
        {
            timeRecord += Time.deltaTime;
            timeFrame += Time.deltaTime;
            timefps -= Time.deltaTime;
            if (timefps < 0f)
            {
                //Debug.Log("fps");
                m_Recorder.TakeSnapshot(timeFrame);
                timefps = 1f / fps;
                timeFrame = 0f;
            }

        }
    }

    void OnDestroy()
    {
        mat_skybox.SetColor("_SkyTint", Color.white);
       

    }

}

Conclusion

In conclusion, you now have a motion capture studio with nothing more than a microphone headset and an Oculus Quest, that's the magic! Our only limit now is our imagination!
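
To check a take quickly, here is a minimal playback sketch (a simple illustration, not part of the recording pipeline above): it assumes the character has an Animator and an AudioSource, and it plays the recorded .anim clip through a PlayableGraph together with the matching wav so both start in sync.

using UnityEngine;
using UnityEngine.Animations;
using UnityEngine.Playables;

// Hypothetical playback helper: plays a clip recorded by AnimMicroRecorder
// on an Animator, together with the matching wav, so both start at the same time.
[RequireComponent(typeof(Animator), typeof(AudioSource))]
public class RecordingPlayer : MonoBehaviour
{
    public AnimationClip recordedClip; // the .anim asset saved by GameObjectRecorder
    public AudioClip recordedVoice;    // the .wav asset saved by SavWav

    PlayableGraph graph;

    void Start()
    {
        // Play the clip directly through a PlayableGraph,
        // so no AnimatorController asset is needed.
        AnimationPlayableUtilities.PlayClip(GetComponent<Animator>(), recordedClip, out graph);

        var source = GetComponent<AudioSource>();
        source.clip = recordedVoice;
        source.Play();
    }

    void OnDestroy()
    {
        // Playable graphs are not destroyed automatically.
        if (graph.IsValid())
            graph.Destroy();
    }
}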

To go further, we animated the mouth of our character with a Unity asset called Salsa Sync. It's quite simple to use and very useful! To make the video, we used the Unity Recorder; you can find it as a package in Unity's Package Manager.

Don't hesitate to ask your questions. And if you add improvements to this project or record things with it, don't hesitate to send them to us too 🙂

Cheers !

The VR Project team
