MuseumArnhemVR.nl is a virtual experience I worked on for Museum Arnhem. I was responsible for the technical and interactive side of this project, and I also worked on the building and environment and on realizing the visual style of the experience. The virtual museum can be enjoyed from your browser (in the window below) or at the museum itself through a virtual reality headset. The surfaces of the architecture in the museum are constantly changing based on where your mouse is. The patterns and textures are inspired by riso prints.
Visit museumarnhemvr.nl to experience it for yourself. You can look around by holding down your mouse button and moving your mouse. Click on the ground and you will move to that position.
For this project we had to figure out how to animate the patterns on the walls and floors. In the end we used simple black-and-white animation frames which are picked randomly and recolored with colors from two color palettes.
To save space, the animation frames for the buildings were combined into single textures (one per building). By color coding them, they can easily be read by a custom shader. They were also tiled so even more frames could fit in a single (square) texture.
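The tiling idea can be sketched in a few lines. This is a minimal illustration of how a frame index maps to a tile inside a square atlas, not the project's actual shader code; the function name and the row-major layout are assumptions.

```python
def frame_uv(frame_index, tiles_per_side):
    """Map a frame index to the UV offset and scale of its tile
    inside a square texture atlas (row-major layout assumed).
    A shader would do the same arithmetic to sample one frame."""
    col = frame_index % tiles_per_side
    row = frame_index // tiles_per_side
    scale = 1.0 / tiles_per_side
    return (col * scale, row * scale), scale

# Example: frame 5 in a 4x4 atlas sits at column 1, row 1.
offset, scale = frame_uv(5, 4)
```

With one atlas per building, switching animation frames is just a UV offset change, so no extra textures need to be bound.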
The final combined textures have the extra benefit of looking very nice. 😀
Custom shaders were very important for this project. They were used for ‘animating’ the patterns, but also for getting the style we wanted. We were inspired by riso prints and the way they use noise to create gradients (see left).
In the experience we used the custom shaders to combine (baked) lighting, patterns and noise to create surfaces that look interesting from far away as well as from up close.
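The riso-style gradient trick can be shown in miniature. This is a hypothetical per-pixel sketch of noise-based dithering (the actual work happens in a shader, and the threshold and noise strength here are illustrative): instead of a smooth tone, each pixel becomes either ink or paper, with the chance of ink following the light value.

```python
import random

def riso_shade(light, noise_strength=0.3, rng=random.random):
    """Riso-style grainy shading: jitter the light value with noise,
    then threshold it to a binary ink (1) / paper (0) decision.
    Averaged over many pixels this reads as a grainy gradient."""
    jitter = (rng() - 0.5) * noise_strength
    return 1 if light + jitter > 0.5 else 0
```

With `noise_strength` at zero this is a hard threshold; raising it widens the band where ink and paper pixels mix, which is what gives riso prints their characteristic grain.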
The Bink Cafe is a virtual museum where a new collaborative exhibition is organized every month. The artist Marie Bink invites artists with different backgrounds, working in various disciplines. What they all have in common is that they have a passionate story to tell.
Visit bink.cafe to experience the museum for yourself. Click on the door to go inside the cafe. By holding down your mouse button and moving your mouse you can look around. Click on the ground to move to that position. Click on people, radios and paintings for special interactions. Press F (or the cat) to enter fullscreen (and Escape to exit fullscreen).
Marie Bink already had the idea of a virtual museum in her head for some time before she came to me. At our first meeting she brought these sketches of what she would like to accomplish.
One of the main things we wanted to figure out for this project was how to make it easy for her to change the placement of artworks without needing to learn Unity. It would also have been a hassle to have to build and upload the Unity project every time we wanted to make a new expo.
In the end we decided to work with a JSON file which described where all the artworks were hanging, what everybody was saying, which radio made what sound, how many floors the lighthouse has, and much more. Changes in the file could be seen in the experience almost instantly, creating a very fast feedback loop.
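To make the idea concrete, here is a hypothetical sketch of what such a file and its loader could look like. The field names (`artworks`, `radios`, `lighthouse_floors`) and values are invented for illustration; the real file's structure may differ.

```python
import json

# Illustrative expo description, not the actual file format.
expo_json = """
{
  "artworks": [
    {"title": "drawing-01", "wall": "north", "x": 1.5, "y": 1.2}
  ],
  "radios": [{"id": "bar", "sound": "loop-a.mp3"}],
  "lighthouse_floors": 3
}
"""

def load_expo(text):
    """Parse the expo description so everything can be placed at
    load time, without rebuilding or re-uploading the project."""
    return json.loads(text)

expo = load_expo(expo_json)
```

Because the experience reads this file at load time, a new expo is just a new JSON file: no Unity knowledge, no rebuild, no re-upload.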
Another important goal was to keep everything in the experience in Marie’s own drawing style. We used multiple techniques to achieve the final look: the inverted hull technique for the stairs and the bar, a texture for the other lines on the walls, 2D billboards for the objects and most of the characters, and a special kind of billboard for Bonk (see image to the right).
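The billboard technique boils down to one bit of math: rotating a flat sprite around the vertical axis so it always faces the camera. A minimal sketch (the function name is an assumption, and real engines do this with a rotation matrix or quaternion):

```python
import math

def billboard_yaw(obj_pos, cam_pos):
    """Yaw angle (radians) that turns a flat 2D sprite around the
    vertical (Y) axis so it faces the camera. This keeps hand-drawn
    characters readable from any viewing direction."""
    dx = cam_pos[0] - obj_pos[0]
    dz = cam_pos[2] - obj_pos[2]
    return math.atan2(dx, dz)
```

Rotating only around the vertical axis (rather than fully facing the camera) keeps the drawings standing upright, which suits characters and props drawn in a 2D style.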
My own expo
I was the first artist to show their work in the virtual museum. The exhibition was temporary, but here are some screenshots from when it was still online.
Bink Cafe Expo with NO FISH | 20 April 2021 – 31 May 2021
This is a virtual reality toolbox I am creating. This project is an ongoing process without a clear end goal (for now). On the top of this page are the latest developments in this ongoing process. Below that you can find the evolution of this project and all its different iterations.
Evolution of the project
The first version of this project started in 2014, when I was in high school. Every year there was an evening where students could perform. There was a lot of music, but not a lot of visual performances. I wondered how I could make a performance out of me drawing, and I came up with an idea. I would make a drawing of a dog on stage. People would be able to see what I was making via an overhead camera and a projection screen behind me. And then, suddenly, the dog would come to life and start running.
At that time I didn’t have the technical skills to do this for real, so I faked it. This was still extremely effective.
Years later, in 2019, I was still fascinated with the idea of imitating a musical performance, but visually. I still wanted to bring something to life, just like in my performance in 2014. But unlike then, when I needed weeks to animate the dog and fake the drawing, this time I wanted to try to do it all live, in real-time.
So, in one week’s time, I made a demo inspired by loop musicians like Marc Rebillet: could you create a loop machine, but for visuals in virtual reality? The result was my own virtual reality software in which I could move puppets.
I could loop these movements, make new loops, and in this way create increasingly complex animations.
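The loop-machine idea can be sketched very simply: record a stream of poses, then play it back cyclically while new loops are layered on top. This is an illustrative model, not the tool's actual code; the class and method names are invented.

```python
class MotionLoop:
    """Minimal sketch of a visual loop machine: poses recorded once,
    then replayed forever by wrapping the timeline with modulo."""

    def __init__(self):
        self.frames = []
        self.recording = True

    def record(self, pose):
        if self.recording:
            self.frames.append(pose)

    def stop(self):
        self.recording = False

    def sample(self, frame_index):
        # Loop playback: frame 3 of a 3-frame loop is frame 0 again.
        return self.frames[frame_index % len(self.frames)]

loop = MotionLoop()
for pose in ["a", "b", "c"]:
    loop.record(pose)
loop.stop()
```

Running several such loops side by side, each driving a different puppet or limb, is what lets simple recorded gestures build up into increasingly complex animations.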
This demo used 3D models I got from the internet, with some custom shaders. But I wanted to make it even more custom than this: I didn’t only want to create the movement in real-time, I also wanted to create the 3D models in real-time. The easiest way I could think of was to build the models out of other 3D objects, for example cubes.
At this point I started working on my graduation project and stopped developing this tool. But it was always there at the back of my mind. A year later, after my graduation at the HKU, I started working on this project again.
I had learned a lot about coding, so I recreated the tool from scratch. I still wanted to make even more of the visuals inside the tool, so with a solid base to work from I started on paintbrush functionality. I really liked the idea of bringing a drawing to life.
Being able to draw your own characters brought a lot of possibilities. But I was still stuck with human looking characters because of the fixed character rig I was using. Wouldn’t it be great if you could make your own character rig?
Yes, yes it would. Now the possibilities were truly limitless.
All this time I didn’t have a clear idea about the final form of this project. At this point I imagined something like the performance in 2014, but for real and with live music. In early 2021 I put this to the test: together with Aicha Cherif I experimented with her live music and my virtual reality toolbox.
This was very insightful. There were some great moments, but overall it wasn’t that interesting, probably for the following reasons:
It takes too long to create a character (at least 10 minutes). This is too long to show live.
There is no real tension or surprise: what you see is what you end up with, and because you see everything, you already know how it will end.
The real-time movement of the characters is limited. The loops are essential for the full range of movement, but this means there is little room for improvisation.
So now we end up in the present (January 2021). I’m again taking a break from this project. I still think it’s great, interesting and full of potential, but in a different form than it has now.
To be continued…
If you want to stay up to date on this project, or if you want to share your thoughts with me, you can follow me on Instagram.
rLung is the Tibetan word for both breath and wind. There is a strong connection between ourselves and our environment. In this VR experience the normally invisible world of rLung is made visible, letting you explore this connection by sensing your environment with your breath.
A VR exposition made for the Dutch artist Peter Riezebos. In this experience you can view 14 of his paintings, exhibited in a space inspired by the former library of Harderwijk, now the studio of Peter Riezebos.
The goal for this project was to create a space consisting of recognizable elements from the original library. With these building blocks a new space could be created that felt like it could fall over at any moment, with walls and roofs supported by wobbly piles of books. To achieve this, custom software was developed with which you could stack the building blocks by hand in VR.
With motion controllers and full-body motion capture it is possible to translate your physical movements to a virtual avatar. How this avatar looks like and behaves doesn’t have to be limited to the rules of the physical world.
I wanted to know how this would feel. How would it feel if you were multiple copies of yourself? Or if you shared the same body with somebody else? Or if you were a goat? Or how would it feel if you were an abstract entity? Or just a pair of eyes connected to a brain?
HTC Vive (Pro)
used as HMDs
OptiTrack Motion Capture System
used for full-body motion capture
used for hand tracking
motion capture software
used to send motion data to multiple computers
used for VR support
used to translate movements between different sizes of people and avatars
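Translating movements between people and avatars of different sizes comes down to scaling tracked positions. A deliberately simplified sketch (the function name is invented, and real retargeting works per bone rather than with one uniform factor):

```python
def retarget_position(joint_pos, person_height, avatar_height):
    """Uniformly scale a tracked joint position so a person of one
    height can drive an avatar of another height. Real retargeting
    is per-bone, but the core idea is this ratio."""
    s = avatar_height / person_height
    return tuple(c * s for c in joint_pos)

# A 1.8 m tall person driving a 0.9 m tall avatar: everything halves.
scaled = retarget_position((1.0, 1.8, 0.0), 1.8, 0.9)
```

The same ratio also makes the more playful embodiments possible: driving a goat, a pair of eyes, or several copies of yourself is just different mappings from the same motion-capture stream.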
A collection of projects I did which are just fun to share.
2015 High School Art Project
2016 High School Self Portrait
In my last year of secondary school I made a self-portrait. I asked if it would be okay if I made a giant head, and it was, so I did.
2016 HKU Animation Seminar
In this seminar I learned the basics of the animation program TVPaint and about 2D animation. At the end we all had to make a little animation as part of a video, but we had time to experiment too.
2016 HKU video for fun
HKU project Hell is other people
2017 HKU Cardboard Maquettes
2017 HKU project experimental film making
Hyperloop is a video I created together with Stijn Aa in our second year of IMT. The purpose of this project was to create an experimental film. You can see the process by which we came to this technique in this video.
2018 HKU animation for 2 voor 12
2 voor 12 is a TV quiz show. Every episode, 2 animations are shown explaining a question. These animations are made by HKU students. I got the chance to make one of them.
2018 HKU experiments with point clouds
In the second year of IMT you make a music video. This process is split into two parts: making a concept and making the actual video clip. I worked on the concept and the prototypes with Stijn Aa. This video shows our process for creating the concept.