Manus (in development)

An intuitive VR experience where two people are invited to co-create moving sculptures with the movement of their hands.

I’m very excited that my project Manus (working title) has received funding from the Venice Biennale College Cinema VR Program! At the moment I’m hard at work on this project, so for more info you will have to head over to Institute of Time.

Online virtual museum in collaboration with the artist Bink.

The Bink Cafe is a virtual museum where a new collaborative exhibition is organized every month. The artist Bink invites artists with different backgrounds, working in various disciplines. What they all have in common is that they have a passionate story to tell.

Visit to experience the museum for yourself. Click on the door to go inside the cafe. By holding down your mouse button and moving your mouse you can look around. Click on the ground to move to that position. Click on people, radios and paintings for special interactions. Press F (or the cat) to enter fullscreen (and Escape to exit fullscreen).

Making of

Bink already had the idea of a virtual museum in her head for some time before she came to me. At our first meeting she brought these sketches of what she wanted to accomplish.

One of the main things we wanted to figure out for this project was how to make it easy for her to change the placement of artworks without needing to learn Unity. It would also have been a hassle to have to build and upload the Unity project every time we wanted to make a new expo.

In the end we decided to work with a JSON file which described where all the artworks were hanging, what everybody was saying, which radio made what sound, how many floors the lighthouse has, and much more. Changes in the file could almost instantly be seen in the experience itself, creating a very fast feedback loop.

Simplified excerpt of the JSON file:

      {
          "floorName": "Pixels in Time",
          "darkMode": true,
          "people": [],
          "artworks": [
              {
                  "title": "giraffe",
                  "artist": "NO FISH",
                  "position_x": 118.2,
                  "position_y": 0,
                  "height": 1.5
              }
          ],
          "videos": []
      }
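At runtime the experience parses this file and places everything accordingly. The real project does this in C# inside Unity; purely to illustrate the idea, here is a minimal Python sketch (the `load_floor` function name is hypothetical, the fields follow the excerpt above):

```python
import json

def load_floor(path):
    """Parse an exhibition file and return the floor name plus artwork placements."""
    with open(path) as f:
        floor = json.load(f)
    placements = [
        {
            "title": art["title"],
            "artist": art["artist"],
            # Horizontal coordinates in the museum, plus a hanging height.
            "position": (art["position_x"], art["position_y"]),
            "height": art["height"],
        }
        for art in floor.get("artworks", [])
    ]
    return floor["floorName"], placements
```

Because the whole exhibition lives in one data file, re-reading it is enough to refresh the scene, which is what made the fast feedback loop possible.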

Another important goal was to keep everything in the experience in Bink’s own drawing style. We used multiple techniques to achieve the final look: the inverted hull technique for the stairs and the bar, a texture for the other lines on the walls, 2D billboards for the objects and most of the characters, and a special kind of billboard for Bonk (see image to the right).
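The inverted hull technique works by rendering a second, slightly inflated, inside-out copy of the mesh in the line color, so its silhouette peeks out around the original as a drawn outline. The vertex inflation step can be sketched like this (illustrative Python; in the actual project this is done by a Unity material/shader setup):

```python
def inverted_hull(vertices, normals, thickness=0.02):
    """Offset each vertex outward along its normal to build the 'hull' copy.
    Rendered with flipped faces, only the rim of this copy remains visible,
    which reads as an outline around the original mesh."""
    return [
        (vx + nx * thickness, vy + ny * thickness, vz + nz * thickness)
        for (vx, vy, vz), (nx, ny, nz) in zip(vertices, normals)
    ]
```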

My own expo

I was the first artist to show their work in the virtual museum. The exhibition was temporary, but here are some screenshots from when it was still online.

Bink Cafe Expo with NO FISH | 20 April 2021 – 31 May 2021

Resources used:

Hardware:
  • Oculus Quest (experience kind of works in VR)
Software:
  • Unity (game engine)
  • Visual Studio (coding)
  • Blender (3D modeling)
  • Photoshop (texture work)
  • WinSCP (file transfer to server)
Unity Assets:
  • WebXR Export (used to get WebXR working)
  • Prefab Lightmapping (used to bake lighting data into prefabs)
  • Mesh Baker (used for optimization)

An online virtual museum where the architecture is ever-changing, giving a new context to the art on display.

This is a virtual experience I worked on for Museum Arnhem. I was responsible for the technical and interactive side of the project, and I also worked on the building and environment and on realizing the visual style of the experience. The virtual museum can be enjoyed from your browser (in the window below) or at the museum itself through a virtual reality headset. The surfaces of the architecture in the museum are always changing based on where your mouse is. The patterns and textures are inspired by riso prints.

Visit to experience it for yourself. You can look around by holding down your mouse button and moving your mouse. Click on the ground and you will move to that position.

Making of:

For this project we had to figure out how to animate the patterns on the walls and floors. Eventually we used simple black-and-white animation frames, which were picked randomly and recolored with colors from two color palettes.

To save space, the animation frames for the buildings were combined in single textures (one per building). By color coding them they can easily be read by a custom shader. They were also tiled so even more frames could fit in a single (square) texture.
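Because the frames sit in a fixed grid inside the texture, the shader only needs a frame index to find the right tile: a modulo picks the column and an integer division picks the row. Sketched here in Python as a stand-in for the actual shader math:

```python
def frame_uv_offset(frame, tiles_per_row):
    """UV offset of a frame in a square texture atlas.
    Frames are laid out row by row; each tile spans 1/tiles_per_row UV units."""
    tile = 1.0 / tiles_per_row
    col = frame % tiles_per_row
    row = frame // tiles_per_row
    return (col * tile, row * tile)
```

Sampling the atlas at `uv * tile + offset` then reads exactly one animation frame.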

The final combined textures have the extra benefit of looking very nice. 😀

Custom shaders were very important for this project. They were used for ‘animating’ the patterns, but also to get the style we wanted. We were inspired by riso prints and the way they use noise to create gradients (see left).

In the experience we used the custom shaders to combine (baked) lighting, patterns and noise to create surfaces which look interesting from far away as well as from close by.
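The riso-style grain can be approximated by thresholding brightness against noise: a pixel only gets ink when its gradient value beats a random value, so a smooth gradient turns into varying dot density. An illustrative Python sketch of that idea (the project does this per pixel in a shader, not on the CPU):

```python
import random

def riso_dither(gradient, seed=0):
    """Turn a smooth 0..1 gradient into riso-style grain.
    Each sample prints ink (1) only where its value beats a random
    threshold, so darker areas get denser dots instead of smooth shading."""
    rng = random.Random(seed)  # fixed seed keeps the grain pattern stable
    return [1 if value > rng.random() else 0 for value in gradient]
```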

Resources used:

Hardware:
  • Oculus Go (VR headset used in the museum)
Software:
  • Unity (game engine)
  • Unity Shader Graph (for custom shaders)
  • Visual Studio (coding)
  • Blender (3D modeling)
  • Photoshop (texture work)
  • WinSCP (file transfer to server)
Unity Assets:
  • WebXR Export (used to get WebXR working)
  • Oculus Integration (used for support on the Oculus Go)
  • Odin (used for custom Unity inspectors)

In collaboration with:

Virtual Reality Puppeteering

An ongoing research project into creating and puppeteering characters in VR.

This is a virtual reality toolbox I am creating. The project is an ongoing process without a clear end goal (for now). At the top of this page are the latest developments in this ongoing process; below that you can find the evolution of the project through all its different iterations.

Latest development

drawing and animating with live music
last puppeteering of 2020
timelapse of the making of

Evolution of the project

The first version of this project started in 2014, when I was in high school. Every year there was an evening where students could perform. There was a lot of music, but not a lot of visual performances. I wondered how I could turn my drawing into a performance, and I came up with an idea. I would make a drawing of a dog on stage. People would be able to see what I was making via an overhead camera and a projection screen behind me. And then, suddenly, the dog would come to life and start running.

bringing a drawing to life

At that time I didn’t have the technical skills to do this for real, so I faked it. This was still extremely effective.

Years later, in 2019, I was still fascinated with imitating a musical performance, but visually. I still wanted to bring something to life, just like in my performance in 2014. But unlike then, when I needed weeks to animate the dog and fake the drawing, this time I wanted to do it all live, in real-time.

So, in one week’s time I made a demo, inspired by loop musicians like Marc Rebillet: could you create a loop machine, but for visuals in virtual reality? The result was my own virtual reality software in which I could move puppets.

demo of moving a puppet

I could loop these movements, make new loops, and in this way create increasingly complex animations.

Demo of the VR puppeteering tool with loops
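The looping described above boils down to recording poses every frame and then sampling the buffer with a wrap-around index. A stripped-down sketch of that idea (hypothetical class and method names; the real tool is written in C# for Unity):

```python
class LoopRecorder:
    """Record a stream of poses, then play them back as an endless loop,
    mirroring the loop-machine idea of layering recorded movements."""

    def __init__(self):
        self.frames = []
        self.recording = False

    def start(self):
        # Begin a fresh loop; any previous recording is discarded.
        self.frames = []
        self.recording = True

    def capture(self, pose):
        # Called once per frame while performing.
        if self.recording:
            self.frames.append(pose)

    def stop(self):
        self.recording = False

    def sample(self, frame_index):
        # Wrap around so playback loops forever.
        return self.frames[frame_index % len(self.frames)]
```

Running several of these recorders at once, each driving a different limb or puppet, is what lets simple loops stack up into complex animations.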

This demo used 3D models I got from the internet, with some custom shaders. But I wanted to make it even more custom than this: I didn’t only want to create the movement in real-time, I also wanted to create the 3D models in real-time. The easiest way I could think of was to build the models out of other 3D objects, for example cubes.

Creating a puppet out of cubes and then animating it
A puppet built out of building tools like jigsaws, hammers, paintbrushes, paint buckets and hardhats

At this point I started working on my graduation project and I stopped developing this tool. But it was always there at the back of my mind. A year later, after my graduation at the HKU, I started work on this project yet again.

I had learned a lot about coding, so I recreated the tool from scratch. I still wanted to make even more of the visuals inside the tool, so with a solid base to work from I started on paintbrush functionality. I really liked the idea of bringing a drawing to life.

Painting and then puppeteering a character (performance at this point was not yet optimized)
Demo of working with the Color cube (a 3D color picker) and some fun with animation loops

Being able to draw your own characters brought a lot of possibilities. But I was still stuck with human looking characters because of the fixed character rig I was using. Wouldn’t it be great if you could make your own character rig?

Yes, yes it would. Now the possibilities were truly limitless.

Flower Dance

All this time I didn’t have a clear idea about the final form of this project. At this point I imagined something like the performance in 2014, but for real and with live music. In early 2021 I put this to the test: together with Aicha Cherif I experimented with her live music and my virtual reality toolbox.

drawing and animating with live music

This was very insightful. There were some great moments, but overall it wasn’t that interesting, probably for the following reasons:

  • It takes too long to create a character (at least 10 minutes), which is too long to show live.
  • There is no real tension or surprise: you see everything as it is made, so you already know how it will end up.
  • There is a real limit to the movement of the characters in real-time. The loops are essential for full movement, but this leaves little room for improvisation.

So now we end up in the present (January 2021). I’m once again taking a break from this project. I still think it’s great, interesting and full of potential, but in a different form than it has now.

To be continued…

If you want to stay up to date on this project, or if you want to share your thoughts with me, you can follow me on Instagram.

Resources used:

Hardware:
  • HTC Vive (used as HMD)
  • Valve Index Controllers (used as motion controllers)
  • Oculus Quest 2 + Link cable (used as secondary VR system)
Software:
  • Unity (game engine)
  • Visual Studio (coding)
Unity Assets:
  • SteamVR (used for VR functionality)
  • FinalIK (used for calculating motion)
  • Dynamic Bone (used for secondary motion)
  • Quick Outline (used for the outline effect)

rLung [Graduation Project]

A VR experience where you explore your surroundings with your breath.

rLung is the Tibetan word for both breath and wind. There is a strong connection between ourselves and our environment. In this VR experience the normally invisible world of rLung is made visible, letting you explore this connection by sensing your environment with your breath.

Resources used:

Hardware:
  • HTC Vive (used as HMD)
  • Leap Motion (used for hand tracking)
  • Lavalier microphone (for better breath detection)
  • Lego (for detachable sensor holder)
Software:
  • Unity (game engine)
  • Visual Studio (coding)
  • Blender (3D modeling)
  • Tiltbrush (for drawing the environment as a reference)
Unity Assets:
  • Leap Motion (used for hand tracking functionality)
  • LM Realistic Hands (used for hand models)
  • Oculus Integration (used for spatial sound)

Artheneum [Riezebos+VR]

VR expo made for the artist Peter Riezebos in which the environment balances unstably on books.

A VR exposition made for the Dutch artist Peter Riezebos. In this experience you can view 14 of his paintings, exhibited in a space inspired by the former library of Harderwijk, now the studio of Peter Riezebos.

Making of

The goal for this project was to create a space consisting of recognizable elements from the original library. With these building blocks a new space could be created that felt like it could fall over at any moment, with walls and roofs supported by wobbly piles of books. To achieve this, custom software was developed with which the building blocks could be stacked by hand in VR.

Demo of the software

Resources used:

Hardware:
  • HTC Vive (used for building the environment)
  • Oculus Go (used for the final experience)
Software:
  • Unity (game engine)
  • Visual Studio (coding)
  • Blender (3D modeling)
Unity Assets:
  • SteamVR (used for building the environment)
  • Quick Outline (used for building the environment)
  • Oculus Integration (used for support on the Oculus Go)
  • Arc Teleporter (used as basis for the teleportation system)
Resources:
  • Books (for all the books)
  • Brick Texture (for the walls)

In collaboration with:

Peter Riezebos

01X digital

Virtual Embodiment

Research into embodiment in VR, including but not limited to sharing your body with another person.

With motion controllers and full-body motion capture it is possible to translate your physical movements to a virtual avatar. How this avatar looks and behaves doesn’t have to be limited by the rules of the physical world.
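One basic part of that translation is compensating for size: a tracked point on the player has to be mapped onto an avatar of a different height (in this project Final IK handled the full version of this). The core idea can be pictured as uniform scaling about the floor, sketched here as a toy Python function (the name and simplification are mine, not the actual implementation):

```python
def retarget(tracker_pos, player_height, avatar_height):
    """Scale a tracked (x, y, z) position so a player of one height
    drives an avatar of another, scaling uniformly about the floor plane."""
    scale = avatar_height / player_height
    return tuple(coord * scale for coord in tracker_pos)
```

So a 1.8 m tall person driving a 0.9 m tall avatar sees all their tracked points halved, keeping proportions intact while the avatar stays its own size.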

I wanted to know how this would feel. How would it feel if you were multiple copies of yourself? Or if you shared the same body with somebody else? Or if you were a goat? Or how would it feel if you were an abstract entity? Or just a pair of eyes connected to a brain?

Resources used:

Hardware:
  • HTC Vive (Pro) (used as HMDs)
  • OptiTrack Motion Capture System (used for full-body motion capture)
  • Leap Motion (used for hand tracking)
Software:
  • Unity (game engine)
  • Visual Studio (coding)
  • Blender (3D modeling)
  • Motive (motion capture software)
  • OSC Bridge (used to send motion data to multiple computers)
Unity Assets:
  • SteamVR (used for VR support)
  • Final IK (used to translate movements between different sizes of people and avatars)
Resources:
  • Mixamo (for the avatars)
  • Adobe Fuse (for the avatars)