This is a virtual reality toolbox I am creating. This project is an ongoing process without a clear end goal (for now). At the top of this page are the latest developments in this ongoing process. Below that you can find the evolution of this project and all its different iterations.
Evolution of the project
The first version of this project started in 2014, when I was in high school. Every year there was an evening where students could perform. There was a lot of music, but not many visual performances. I wondered how I could turn drawing into a performance, and I came up with an idea: I would make a drawing of a dog on stage. People would be able to see what I was making via an overhead camera and a projection screen behind me. And then, suddenly, the dog would come to life and start running.
At that time I didn’t have the technical skills to do this for real, so I faked it. This was still extremely effective.
Years later, in 2019, I was still fascinated by the idea of a musical performance translated into visuals. I still wanted to bring something to life, just like my performance in 2014. But unlike then, when I needed weeks to animate the dog and fake the drawing, this time I wanted to do it all live, in real-time.
So, in one week's time, I made a demo, inspired by loop musicians like Marc Rebillet. Could you create a loop machine, but for visuals in virtual reality? The result was my own virtual reality software in which I could move puppets.
I could loop these movements, record new loops on top, and in this way create increasingly complex animations.
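The actual tool is built in Unity, but the core looping idea can be sketched in a few lines. The sketch below is my own illustration under assumptions (all names, the pose representation, and the nearest-previous-sample playback are mine, not taken from the tool): each recorded movement becomes a cyclic stream of timestamped poses, and layering loops is just querying several of them against one shared clock.

```python
from bisect import bisect_right

class MotionLoop:
    """One recorded movement that replays cyclically: a rough analogue
    of a musician's loop pedal, but for motion instead of audio."""

    def __init__(self):
        self.samples = []  # (timestamp, pose) pairs, timestamps ascending

    def record(self, t, pose):
        self.samples.append((t, pose))

    def duration(self):
        return self.samples[-1][0]

    def sample_at(self, t):
        # Wrap playback time around the loop, then return the most
        # recent pose recorded at or before that moment.
        t = t % self.duration()
        times = [s[0] for s in self.samples]
        i = bisect_right(times, t) - 1
        return self.samples[max(i, 0)][1]

# Layering: each puppet (or limb) gets its own loop; sampling them all
# at one shared clock stacks the loops into a more complex animation.
arm = MotionLoop()
arm.record(0.0, "down"); arm.record(1.0, "up"); arm.record(2.0, "down")
head = MotionLoop()
head.record(0.0, "left"); head.record(3.0, "right")
frame = {"arm": arm.sample_at(3.5), "head": head.sample_at(3.5)}
```

Because each loop keeps its own duration, loops of different lengths drift in and out of phase against each other, which is where much of the "increasingly complex" feel comes from.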
This demo used 3D models I got from the internet, with some custom shaders. But I wanted to make it even more custom than that: I didn't only want to create the movement in real-time, I also wanted to create the 3D models in real-time. The easiest way I could think of was to build the models out of other 3D objects, for example cubes.
At this point I started working on my graduation project and stopped developing this tool. But it was always there at the back of my mind. A year later, after graduating from the HKU, I started working on this project again.
I had learned a lot about coding, so I recreated the tool from scratch. I still wanted to create even more of the visuals inside the tool itself, so with a solid base to work from I started on paintbrush functionality. I really liked the idea of bringing a drawing to life.
Being able to draw your own characters opened up a lot of possibilities. But I was still stuck with human-looking characters because of the fixed character rig I was using. Wouldn't it be great if you could make your own character rig?
Yes, yes it would. Now the possibilities were truly limitless.
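What makes a user-built rig feasible is that a rig is, at its core, just a tree of joints. Here is a minimal sketch of that idea, assuming a position-only skeleton (a real rig also stores rotations, which a solver like FinalIK then works against; the names `Bone` and `world_positions` are mine, not from the tool):

```python
class Bone:
    """One joint in a user-built rig: a name, a positional offset from
    its parent, and any child bones hanging off it."""

    def __init__(self, name, offset, children=None):
        self.name = name
        self.offset = offset          # (x, y, z) relative to the parent
        self.children = children or []

def world_positions(bone, parent_pos=(0.0, 0.0, 0.0)):
    """Walk the rig tree, accumulating each bone's local offset into a
    world position, the traversal a skeleton hierarchy implies in Unity."""
    pos = tuple(p + o for p, o in zip(parent_pos, bone.offset))
    positions = {bone.name: pos}
    for child in bone.children:
        positions.update(world_positions(child, pos))
    return positions
```

Because the rig is only a tree, "make your own character rig" reduces to letting the user attach bones anywhere: a dog, a six-armed creature, and a human all become the same data structure.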
All this time I didn't have a clear idea about the final form of this project. At this point I imagined something like the performance in 2014, but for real and with live music. In early 2021 I put this to the test: together with Aicha Cherif, I experimented with combining her live music and my virtual reality toolbox.
This was very insightful. There were some great moments, but overall it wasn't that interesting, probably for the following reasons:
- It takes too long to create a character (at least 10 minutes), which is too long to show live.
- There is no real tension or surprise: what you see is what you end up with, and because you see everything, you know how it will end.
- There is a real limit to the movement of the characters in real-time. The loops are essential for full movement, but this means there is little room for improvisation.
So now we end up in the present (January 2021). I'm again taking a break from this project. I still think it's great, interesting, and full of potential, but in a different form than its current one.
To be continued…
If you want to stay up to date on this project, or if you want to share your thoughts with me, you can follow me on Instagram.
Hardware:
- HTC Vive: used as HMD
- Valve Index Controllers: used as motion controllers
- Oculus Quest 2 + Link cable: used as secondary VR system

Unity Assets:
- SteamVR: used for VR functionality
- FinalIK: used for calculating motion
- Dynamic Bone: used for secondary motion
- Quick Outline: used for the outline effect