Sunday, September 20, 2015

Fun with Unreal Engine and VR

I've been meaning to try out Unreal Engine ever since it became "free", and recently I found some time to do just that. Since I haven't written any C++ for at least 15 years, I was curious to see how much code was needed to get a basic project up and running - which, as it turns out, is none at all. Blueprint (a node-based graph network) and some tweaking of options was more than enough.

Since I haven't upgraded my VR rig yet and still use my old Oculus DK1, I was pleasantly surprised that it worked out of the box with the latest Oculus 0.7 driver and Unreal Engine 4.9 under Windows 7.

This spring and summer Otoy ran a competition called Render The Metaverse, and I wanted to view some of the resulting images in VR - since that was the premise of the whole competition. If you own a Gear VR you are in luck, but unfortunately, as far as I have been able to find out, there's no viewer for these stereoscopic cube map images available for the Rift. This was an excellent opportunity to check out how hard it is to develop something in Unreal Engine.

Single eye cube map
As it turns out, it's not hard at all. The most difficult part was getting the cube maps for the left and right eye to go to the correct eye. The default behavior is to have a texture mapped to an object and then have the engine create a stereo pair out of that, which will get the proper depth when viewed in a VR headset. This, of course, will give you a cube with flat surfaces - although with depth for the cube itself. Since the provided cube maps have depth baked into them, we need to take special care to display only the relevant texture to each eye.

The solution here was to create a shader that detects which eye is being rendered and provides the correct texture. After some searching I found the magic node to be the "ScreenPosition" node. We only need the horizontal component, so make sure you add a "BreakOutFloatToComponents" node before feeding the output to the "If" node.

Shader graph network
I also love the "VR Preview" mode in Unreal Engine, which lets you easily test things out in the VR goggles from within the GUI.

Output to VR goggles
Although this solution works really well with the default setup in UE, it remains to be seen if I can get an even better and more correct result. As it is now, we use the same cube for both textures, and this cube is scaled arbitrarily without any thought to real-world scale. This might cause issues, since the cube itself will be rendered with a depth, and the texture on it will, in some ways, inherit that depth. That in turn might work against the depth baked into the stereo cube maps, and the result might be a feeling of wrong scale. Therefore we might have to work with the IPD (eye separation) a bit. At this point I'm only guessing, but we might have to set the IPD to 0 - I'm not sure yet.

Further testing needed.