Virtual Reality Puppeteering

An ongoing research project into creating and puppeteering characters in VR.

This is a virtual reality toolbox I am creating. The project is an ongoing process without a clear end goal (for now). At the top of this page are the latest developments in this ongoing process. Below that you can find the evolution of the project and all its different iterations.

Latest development

drawing and animating with live music
last puppeteering of 2020
timelapse of the making of

Evolution of the project

The first version of this project started in 2014, when I was in high school. Every year there was an evening where students could perform. There was a lot of music, but not a lot of visual performances. I wondered how I could make a performance out of me drawing, and I came up with an idea: I would make a drawing of a dog on stage. People would be able to see what I was making via an overhead camera and a projection screen behind me. And then, suddenly, the dog would come to life and start running.

bringing a drawing to life

At that time I didn’t have the technical skills to do this for real, so I faked it. This was still extremely effective.


Years later, in 2019, I was still fascinated by the idea of imitating a musical performance, but visually. I still wanted to bring something to life, just like in my performance in 2014. But unlike then, when I needed weeks to animate the dog and fake the drawing, this time I wanted to do it all live, in real-time.

So, in one week's time, I made a demo, inspired by loop musicians like Marc Rebillet. Could you create a loop machine, but for visuals in virtual reality? The result was my own virtual reality software in which I could move puppets.

demo of moving a puppet

I could loop these movements, make new loops, and in this way create increasingly complex animations.

Demo of the VR puppeteering tool with loops
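The core of such a loop can be sketched in a few lines of Unity C#. This is an illustration under my own assumptions, not the tool's actual code: a component records the pose of a puppet part every frame while a motion controller drives it, then replays the recorded poses on repeat.

```csharp
using System.Collections.Generic;
using UnityEngine;

// Hypothetical sketch of a movement loop: while recording, capture this
// transform's pose every frame; afterwards, replay the poses on repeat.
public class MotionLoop : MonoBehaviour
{
    private readonly List<Vector3> positions = new List<Vector3>();
    private readonly List<Quaternion> rotations = new List<Quaternion>();
    public bool recording;
    private int playhead;

    void LateUpdate()
    {
        if (recording)
        {
            // The transform is being driven live by a motion controller.
            positions.Add(transform.localPosition);
            rotations.Add(transform.localRotation);
        }
        else if (positions.Count > 0)
        {
            // Wrap the playhead so the captured movement repeats forever.
            transform.localPosition = positions[playhead];
            transform.localRotation = rotations[playhead];
            playhead = (playhead + 1) % positions.Count;
        }
    }
}
```

Stacking several of these, one per body part or per loop, is how the animations could grow increasingly complex while earlier loops keep playing.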

This demo used 3D models I got from the internet, with some custom shaders. But I wanted to make it even more custom than this: I didn't only want to create the movement in real-time, I also wanted to create the 3D models in real-time. The easiest way I could think of was to build the models out of other 3D objects, for example cubes.

Creating a puppet out of cubes and then animating it
Puppet built out of tools like jigsaws, hammers, paintbrushes, paint buckets, and hardhats
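One way such an assembly step could work, purely as a sketch (the component and names below are assumptions, not the actual implementation): when a building block is released close to the puppet, parent it to the nearest bone so it follows the rig from then on.

```csharp
using UnityEngine;

// Hypothetical sketch: when a building block is released near the puppet,
// attach it to the closest bone so it moves along with the rig.
public class BlockAttacher : MonoBehaviour
{
    public Transform[] bones;          // the puppet's bones
    public float snapDistance = 0.1f;  // max attach distance in meters

    public void OnBlockReleased(Transform block)
    {
        Transform closest = null;
        float best = snapDistance;
        foreach (Transform bone in bones)
        {
            float d = Vector3.Distance(block.position, bone.position);
            if (d < best) { best = d; closest = bone; }
        }
        // Parenting with worldPositionStays keeps the block exactly where
        // it was released while making it follow the bone afterwards.
        if (closest != null) block.SetParent(closest, true);
    }
}
```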

At this point I started working on my graduation project and stopped developing this tool. But it was always there at the back of my mind. A year later, after my graduation at the HKU, I started working on this project again.

I had learned a lot about coding, so I recreated the tool from scratch. I still wanted to make even more of the visuals inside the tool itself, so with a solid base to work from I started on paintbrush functionality. I really liked the idea of bringing a drawing to life.

Painting and then puppeteering a character (performance at this point was not yet optimized)
Demo of working with the Color cube (a 3D color picker) and some fun with animation loops
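The idea behind a 3D color picker like the color cube can be sketched as a straight mapping from position to color: where the brush tip sits inside the cube, in normalized local coordinates, becomes the RGB value. The class below is a hypothetical illustration, not the tool's code.

```csharp
using UnityEngine;

// Hypothetical sketch: sample a color from where a point sits inside
// a cube, mapping local XYZ coordinates directly to RGB.
public class ColorCube : MonoBehaviour
{
    public Color Sample(Vector3 worldPoint)
    {
        // Unity's default cube spans -0.5..0.5 in local space;
        // shift that to 0..1 so each axis maps onto one color channel.
        Vector3 p = transform.InverseTransformPoint(worldPoint) + Vector3.one * 0.5f;
        return new Color(Mathf.Clamp01(p.x), Mathf.Clamp01(p.y), Mathf.Clamp01(p.z));
    }
}
```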

Being able to draw your own characters opened up a lot of possibilities. But I was still stuck with human-looking characters because of the fixed character rig I was using. Wouldn't it be great if you could make your own character rig?

Yes, yes it would. Now the possibilities were truly limitless.

Flower Dance

All this time I didn't have a clear idea about the final form of this project. At this point in time I imagined something like the performance in 2014, but for real and with live music. In early 2021 I put this to the test. Together with Aicha Cherif I experimented with her live music and my virtual reality toolbox.

drawing and animating with live music

This was very insightful. There were some great moments, but overall it wasn't that interesting, probably for the following reasons:

  • It takes too long to create a character (at least 10 minutes). This is too long to show live.
  • There is no real tension or surprise: what you see is what you end up with, and because you see everything, you know how it will end.
  • The real-time movement of the characters is limited. The loops are essential for full movement, but this leaves little room for improvisation.

So now we end up in the present (January 2021). I'm taking a break from this project again. I still think it's great, interesting, and full of potential, but in a different form than its current one.

To be continued…

If you want to stay up to date on this project, or if you want to share your thoughts with me, you can follow me on Instagram.


Resources used:

Hardware:
  • HTC Vive: used as HMD
  • Valve Index Controllers: used as motion controllers
  • Oculus Quest 2 + Link cable: used as secondary VR system
Software:
  • Unity: game engine
  • Visual Studio: coding
Unity Assets:
  • SteamVR: used for VR functionality
  • FinalIK: used for calculating motion
  • Dynamic Bone: used for secondary motion
  • Quick Outline: used for the outline effect

Virtual Embodiment

Research into embodiment in VR, including but not limited to sharing your body with another person.

With motion controllers and full-body motion capture it is possible to translate your physical movements to a virtual avatar. How this avatar looks and behaves doesn't have to be limited by the rules of the physical world.
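As a rough sketch of that basic mapping (hypothetical names, not this project's code): each tracked device drives a matching IK target on the avatar, with a simple height ratio to compensate for size differences between player and avatar. In the actual setup, Final IK handles this kind of retargeting.

```csharp
using UnityEngine;

// Hypothetical sketch: copy a tracked device's pose onto an avatar IK
// target, scaling positions so a short player can drive a tall avatar.
public class TrackerToTarget : MonoBehaviour
{
    public Transform tracked;           // e.g. HMD, controller, or mocap marker
    public Transform ikTarget;          // the matching IK target on the avatar rig
    public float playerHeight = 1.75f;  // in meters
    public float avatarHeight = 1.75f;  // in meters

    void Update()
    {
        // Scale tracked positions by the avatar/player height ratio.
        float scale = avatarHeight / playerHeight;
        ikTarget.position = tracked.position * scale;
        ikTarget.rotation = tracked.rotation;
    }
}
```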

I wanted to know how this would feel. How would it feel if you were multiple copies of yourself? Or if you shared the same body with somebody else? Or if you were a goat? Or how would it feel if you were an abstract entity? Or just a pair of eyes connected to a brain?

Resources used:

Hardware:
  • HTC Vive (Pro): used as HMDs
  • OptiTrack Motion Capture System: used for full-body motion capture
  • Leap Motion: used for hand tracking
Software:
  • Unity: game engine
  • Visual Studio: coding
  • Blender: 3D modeling
  • Motive: motion capture software
  • OSC Bridge: used to send motion data to multiple computers
Unity Assets:
  • SteamVR: used for VR support
  • Final IK: used to translate movements between different sizes of people and avatars
Resources:
  • Mixamo: for the avatars
  • Adobe Fuse: for the avatars

Miscellaneous Projects

Old, non-XR projects.

A collection of projects that are just fun to share.

2015 High School Art Project

2016 High School Self-Portrait

In my last year of secondary school I made a self-portrait. I asked if it would be okay if I made a giant head, and it was, so I did.

2016 HKU Animation Seminar

In this seminar I learned the basics of the animation program TVPaint and of 2D animation. In the end we all had to make a short animation as part of a video, but we had time to experiment too.

2016 HKU video for fun

HKU project Hell is other people

2017 HKU Cardboard Maquettes

2017 HKU project experimental film making

Hyperloop is a video I created together with Stijn Aa in our second year of IMT. The purpose of this project was to create an experimental film. You can see the process by which we arrived at this technique in the video.

2018 HKU animation for 2 voor 12

2 voor 12 is a TV quiz show. Every episode, two animations are shown explaining a question. These animations are made by HKU students. I got the chance to make one of these animations.

2018 HKU experiments with point clouds

In the second year of IMT you make a music video. This process is split into two parts: making a concept and making the actual video clip. I worked on the concept and the prototypes with Stijn Aa. This video shows our process for creating the concept.