Note on creating 3D characters for Kinect

This is a note on how I created 3D characters with Three.js for performers to control with Kinect, as part of Professor Shawn Van Every's project in Future of Storytelling. The idea is to create conversations between dancers and audiences over the internet, so audiences aren't restricted to the theatre.


  • key points: a neutral character the audience can easily relate to, simple and relatable actions, and a feeling of magic

( insert video inspiration )



← What you can get from Kinect

← My simplified version

To represent the joint data as dancers, I tried several approaches: a 2D plane, separate 3D geometry, skeletal geometry, and joint geometry. So far the most satisfying versions are the 2D plane and the 3D joint geometry. The skeletal geometry didn't suit this project because of its hierarchical structure: changing joint_0 also changes joint_1, because joint_0 is the parent of joint_1.
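To see why the hierarchy was a problem, here's a minimal sketch with plain JavaScript objects standing in for Three.js bones (which compose their transforms the same way, parent to child):

```javascript
// Each joint stores a local position; its world position is the sum of
// all ancestors' local positions. So moving joint_0 also moves joint_1,
// even though joint_1's own data never changed.
function worldPosition(joint) {
  const p = { x: joint.x, y: joint.y };
  let parent = joint.parent;
  while (parent) {
    p.x += parent.x;
    p.y += parent.y;
    parent = parent.parent;
  }
  return p;
}

const joint0 = { x: 0, y: 10, parent: null };
const joint1 = { x: 5, y: 0, parent: joint0 };

console.log(worldPosition(joint1)); // { x: 5, y: 10 }
joint0.x += 3; // move only the parent...
console.log(worldPosition(joint1)); // { x: 8, y: 10 } -- the child moved too
```

This is exactly what makes a skeleton great for rigged animation, and awkward when every joint should follow its own raw Kinect coordinate independently.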

( insert video documentations )

2D plane version


  1. draw textures for the limbs first
  2. based on the size of each picture, create THREE.PlaneGeometry
  3. offset the geometry –> change the pivot to an appropriate position, e.g. the head should rotate around the neck, not around the nose.
    • code example:
      • var headTexHeight = headTex.image.height / 111, headTexWidth = headTex.image.width / 111;
      • var headGeo = new THREE.PlaneGeometry(headTexWidth, headTexHeight);
      • transY(headGeo, 4);
  4. To make each bear body part follow the joint data, update the body part's position with the corresponding joint's position.
  5. Use the lookAt function of Three.js (which rotates an object to face a point in space) to update the rotation of the bear's body parts. !!Note!! also apply a matrix pre-rotation to make sure the rotation points in the right direction.
  6. For a stretchy effect, scale the bear's body based on the distance changes between joints.
    • hL2.lookAt(joints[5].position);
    • hL2.position.copy(joints[0].position);
    • hL2MoveX = Math.sqrt(Math.pow((joints[0].position.x - joints[5].position.x), 2)) - 5;
    • hL2MoveY = Math.sqrt(Math.pow((joints[0].position.y - joints[5].position.y), 2)) - 2;
    • hL2.scale.set(1, 1, 1 + (hL2MoveX/armULTexWidth + hL2MoveY/armULTexHeight/2));
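The transY call in step 3 is my own helper, not a Three.js built-in; here's a minimal sketch of what such a pivot-offset helper does, shown on a plain vertex array instead of a THREE.Geometry:

```javascript
// Hypothetical pivot-offset helper: shift every vertex along Y so the
// plane's rotation pivot effectively moves (e.g. the head plane pivots
// at the neck edge instead of its center).
function transY(vertices, offsetY) {
  return vertices.map(v => ({ x: v.x, y: v.y + offsetY, z: v.z }));
}

// A 2x2 plane centered at the origin:
const plane = [
  { x: -1, y: -1, z: 0 }, { x: 1, y: -1, z: 0 },
  { x: -1, y:  1, z: 0 }, { x: 1, y:  1, z: 0 },
];

// After shifting up by 1, the bottom edge sits at y = 0, so rotating
// the mesh around its origin now pivots around that edge.
const shifted = transY(plane, 1);
console.log(shifted[0].y); // 0
```

In the Three.js of that era the same effect comes from translating the geometry's vertices (e.g. via applyMatrix with a translation matrix) so the mesh's origin lands where you want the pivot.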


3D joint version


  1. Build a 3D model in Maya
  2. Create joints
  3. Bind the skin to the joints
  4. Paint skin weights to fix the binding weights, creating smooth deformation
  5. Export as an FBX file, import into Blender, then export as a JS file (I know that's a lot of detouring, but so far it's the only method that has worked for me)
  6. Compared with the 2D version's hacky way of updating the positions of the character's body parts, the skinned mesh is much easier to animate. Since the geometry is already bound to the joints, all I have to do is update the joint positions correctly.
    1. elmoBone[0].position.copy( joints[5].position );
  7. For the rotation, I can derive the desired rotation angle from the relationship between the joints' positions
    1. lengthForRot = elmoBone[3].position.distanceTo( elmoBone[2].position );
      rotForJoint = Math.asin( (elmoBone[3].position.y-elmoBone[2].position.y)/lengthForRot );
      elmoBone[3].rotation.y = rotForJoint;
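The asin step above can be sketched as a small pure function, assuming joints are plain {x, y} points (the real code uses THREE.Vector3 and its distanceTo method):

```javascript
// Compute a bone's in-plane rotation from two joint positions:
// the angle whose sine is (vertical rise / bone length), as in the
// rotForJoint snippet above.
function boneRotation(parent, child) {
  const dx = child.x - parent.x;
  const dy = child.y - parent.y;
  const length = Math.sqrt(dx * dx + dy * dy); // distanceTo equivalent
  return Math.asin(dy / length);
}

// A bone pointing 45 degrees up and to the right:
const angle = boneRotation({ x: 0, y: 0 }, { x: 1, y: 1 });
console.log(angle); // Math.PI / 4 (~0.7854)
```

Note that asin alone only covers a half-circle of directions; if a bone can also point backwards, the x sign (or Math.atan2) is needed to disambiguate.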


And here’s a snippet!

( insert code snippet screenshot )

Virtual Reality Tour of the Met

For my internship during the Spring 2014 semester in the Media Lab of The Metropolitan Museum of Art, I hooked up

    1. 3D models of the Met from the Architecture Department
    2. the official Audio Guide
    3. 3D models of art pieces in the Greek and Roman galleries, made by 3D-scanning photos
    4. Unity as the game engine
    5. the virtual reality head-mounted display Oculus Rift as the controller

and created an immersive virtual reality tour of the Met!

With the Oculus Rift, users can wander around the museum, listen to the audio guide and admire art pieces, walk upstairs, watch butterflies, get blocked by a huge bowl, and step inside the surreal mash-up models (credits to Decho <horse> and Rui <uncolored triangles>).


With a background as a VFX artist in 3D animation and post-production, I'm always interested in 3D and how it can be made interactive in creative ways. Once I got the chance to intern in the Media Lab of the Met and learned we could access the 3D models of the museum, I wanted to use the Oculus Rift to walk inside a fantasy version of the Met and enjoy an immersive experience of the space.



Virtual Met Museum –> Fantasy Experiment –> Art piece + Audio Guide



First of all, there's tons of basic knowledge about Unity here, and how to set up a project from scratch here.


✓ Import BIM 3D models into Unity

Basically, just put the FBX file into the Assets folder of the project you just created. Not too complicated, but there's one thing you should be aware of: the SCALE. It's good practice to set the scale correctly in the modeling application before importing the model into Unity. Related details below:

  • 1 Unity unit = 1m
  • the fewer GameObjects the better. Also, use 1 material if you can
  • useful link: wiki unity3d


✓ Oculus Rift Plugin in Unity 3d Setup

Just follow the clear instructions on YouTube!


✓ Add collider to meshes

To prevent the player from walking through meshes (e.g. walls, stairs), we need to add a Collider component to the models. Steps below:

  • select the model
  • go to the Inspector
  • Add Component –> Physics –> Box Collider or Mesh Collider
  • a Mesh Collider fits the geometry more precisely than a Box Collider, but is also more expensive to use.



✓ Occlusion Culling

This means objects you aren't looking at don't get rendered, so the game runs faster.

  •  geometry must be broken into sensibly sized pieces.
    • if you have one object that contains all the furniture, either all or none of the entire set of furniture will be culled.
  • tag all scene objects that you want to be part of the occlusion as Occluder Static in the Inspector.
  • Bake!
  • useful link: unity3d manual


✓ Import 3D-Scanned Models from 123D Catch

  • Take about 20 photos around the object you want to 3D scan (a full 360 degrees!).
  • Upload the photos to 123D Catch.
  • Yeah, now you'll have both an .obj model file and a texture file!
  • Just download the files and drag the whole folder into the Assets folder of Unity!



  • Gives access to people who can't visit the museum in person.
  • Installation design simulation.



It was a really good experience interning at the Media Lab of the Met. I knew I wanted to keep working in 3D and also step into the virtual reality world with the Oculus Rift, so it was a great match that I could have this topic as my own project while also meeting the needs of the Met. From this internship I gained valuable resources from the museum and got to know amazing mentors and colleagues at the Lab. This project led me into the world of virtual reality, and I'm glad and thankful to have been a Spring '14 intern of the Media Lab of The Metropolitan Museum of Art.