Experiments with:


This is a virtual reality toolbox I am creating. This project is an ongoing process without a clear end goal (for now). On the top of this page are the latest developments in this ongoing process. Below that you can find the evolution of this project and all its different iterations.

Latest development

drawing and animating with live music
last puppeteering of 2020
timelapse of the making of

Evolution of the project

The first version of this project started in 2014, when I was in high school. Every year there was an evening where students could perform. There was a lot of music, but not a lot of visual performances. I wondered how I could make a performance out of me drawing, and I came up with an idea. I would make a drawing of a dog on stage. People would be able to see what I was making via an overhead camera and a projection screen behind me. And then, suddenly, the dog would come to life and start running.

bringing a drawing to life

At that time I didn’t have the technical skills to do this for real, so I faked it. This was still extremely effective.

Years later, in 2019, I was still fascinated with trying to imitate a musical performance, but then visually. I still wanted to bring something to life, just like my performance in 2014. But unlike then, when I needed weeks to animate the dog and fake the drawing, this time I wanted to try to do it all live, in real-time.

So, in one week's time I made a demo, inspired by loop musicians like Marc Rebillet: could you create a loop machine, but for visuals in virtual reality? The result was my own virtual reality software in which I could move puppets.

demo of moving a puppet

I could loop these movements, make new loops, and in this way create increasingly complex animations.

Demo of the VR puppeteering tool with loops
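The idea of a loop machine for motion can be sketched in a few lines: record a sequence of pose samples, replay them cyclically, and layer several loops of different lengths on top of each other. This is a minimal illustration in Python of the concept only; the actual tool is built in Unity with C#, and all names here are my own.

```python
# Sketch of a motion loop recorder, in the spirit of an audio loop
# machine but for puppet poses. Illustrative only: the real tool runs
# in Unity/C#; the class and pose names are hypothetical.

class MotionLoop:
    """Records a sequence of pose samples and replays them cyclically."""

    def __init__(self):
        self.samples = []  # one pose sample per frame

    def record(self, pose):
        self.samples.append(pose)

    def sample(self, frame):
        # Replay wraps around, so the recorded motion repeats forever.
        return self.samples[frame % len(self.samples)]


# Layering loops: each loop drives a different part of the puppet,
# so simple recordings combine into more complex animations.
arm_loop = MotionLoop()
head_loop = MotionLoop()

for pose in ["arm_up", "arm_down"]:
    arm_loop.record(pose)
for pose in ["look_left", "look_center", "look_right"]:
    head_loop.record(pose)

# Loops of different lengths drift against each other over time,
# so the combined animation takes longer to repeat than either loop.
frames = [(arm_loop.sample(f), head_loop.sample(f)) for f in range(6)]
```

Because the two loops have lengths 2 and 3, the combined animation only repeats every 6 frames, which is what makes layered loops feel richer than any single recording.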

This demo used 3D models I got from the internet, with some custom shaders. But I wanted to make it even more custom than this. I didn't just want to create the movement in real-time, I also wanted to create the 3D models in real-time. The easiest way I could think of was to build the models out of other 3D objects, for example cubes.

Creating a puppet out of cubes and then animating it
Puppet made out of building tools like jigsaws, hammers, paintbrushes, paint buckets and hard hats

At this point I started working on my graduation project and I stopped developing this tool. But it was always there at the back of my mind. A year later, after my graduation at the HKU, I started work on this project yet again.

I had learned a lot about coding, so I recreated the tool from scratch. I still wanted to make even more of the visuals inside the tool, so with a solid base to work from I started on paintbrush functionality. I really liked the idea of bringing a drawing to life.

Painting and then puppeteering a character (performance at this point was not yet optimized)
Demo of working with the Color cube (a 3D color picker) and some fun with animation loops
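A 3D color picker like the Color cube can be thought of as mapping a position inside a unit cube directly to an RGB color: one axis per channel, so the corners are the primaries and their mixes. A minimal sketch of that mapping, with hypothetical names of my own (not the tool's actual code):

```python
# Sketch of a "Color cube" style 3D color picker: a point's position
# inside a unit cube maps directly to RGB. Function name and the
# clamping choice are my own illustration.

def cube_to_rgb(x, y, z):
    """Map a position in the unit cube (0..1 per axis) to 8-bit RGB."""
    clamp = lambda v: min(max(v, 0.0), 1.0)
    return tuple(round(clamp(v) * 255) for v in (x, y, z))

# Corners of the cube give the primaries and their mixes:
print(cube_to_rgb(1, 0, 0))  # red corner
print(cube_to_rgb(1, 1, 1))  # white corner
print(cube_to_rgb(0.5, 0.5, 0.5))  # center of the cube is mid grey
```

In VR this is appealing because picking a color becomes a spatial gesture: you reach into the cube with the controller instead of sliding 2D sliders.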

Being able to draw your own characters brought a lot of possibilities. But I was still stuck with human looking characters because of the fixed character rig I was using. Wouldn’t it be great if you could make your own character rig?

Yes, yes it would. Now the possibilities were truly limitless.

Flower Dance

All this time I didn’t have a clear idea about the final form of this project. At this point in time I imagined something like the performance in 2014, but then for real and with live music. In early 2021 I put this to the test. Together with Aicha Cherif I experimented with her live music and my virtual reality toolbox.

drawing and animating with live music

This was very insightful. There were some great moments, but overall it wasn't that interesting, probably for the following reasons:

  • It takes too long to create a character (at least 10 minutes). This is too long to show live.
  • There is no real tension or surprise: what you see is what you end up with, and because you see everything, you already know how it will end.
  • There is a real limit to the movement of the characters in real-time. The loops are essential for the full movement, but this means there is little space for improvisation.

So now we end up in the present (January 2021). I'm again taking a break from this project. I still think it's great and interesting and full of potential, but in a different form than it has now.

To be continued…

If you want to stay up to date on this project, or if you want to share your thoughts with me, you can follow me on Instagram.

Resources used:

Hardware:
  • HTC Vive: used as HMD
  • Valve Index Controllers: used as motion controllers
  • Oculus Quest 2 + link cable: used as secondary VR system

Software:
  • Unity: game engine
  • Visual Studio: coding

Unity Assets:
  • SteamVR: used for VR functionality
  • FinalIK: used for calculating motion
  • Dynamic Bone: used for secondary motion
  • Quick Outline: used for the outline effect
