VR Development: Cinder-VR vs Aframe

Mark van de Korput
2017-07-12

Cinder Tools

  • Cinder - framework, similar to OpenFrameworks
  • Cinder-VR - Cinder block: "Virtual reality support for Oculus Rift and HTC Vive in Cinder!"
  • Cinder-poScene - scene graph

 

VR video

To get everything to work I had to build Cinder with the Cinder-VR block. In Visual Studio this wasn't too much of a pain. One thing that is probably common knowledge for a Cinder developer, but less obvious if you're used to working with OpenFrameworks, is that I first had to open the Cinder-VR block's project files and build the block, so there are binary lib files on the development machine, which are then used when building the Cinder applications that use the block. This has to be done for each architecture (32-bit/64-bit) and each target (debug/release) separately.

Another issue is that the actual VR functionality is not always available (on Mac the Oculus Rift isn't even supported, and there won't always be VR glasses connected). For this I created a VR_ENABLED compiler flag, which is enabled (at compile time) for Windows machines and disabled for Mac machines. When the flag is disabled, a standard virtual camera object is used as a "replacement" for the VR glasses.
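The flag itself can be defined using Cinder's platform macros; a minimal sketch (the flag could equally be set in the Visual Studio project's preprocessor settings):

```cpp
// Hypothetical header snippet: enable VR only in Windows builds.
// CINDER_MSW is Cinder's Windows platform define.
#if defined( CINDER_MSW )
    #define VR_ENABLED
#endif
```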

This is the relevant setup code in the main application class:

#ifdef VR_ENABLED
    {
        CI_LOG_I("Setting up VR...");

        try {
            ci::vr::initialize();
        }
        catch (const std::exception& e) {
            CI_LOG_E("VR failed: " << e.what());
        }

        try {
            mVrContext = ci::vr::beginSession(
                ci::vr::SessionOptions()
                    .setOriginOffset(vec3(0, 0, -3))
                    .setControllerConnected([this](const ci::vr::Controller* controller) {
                        CI_LOG_I("VR controller connected");
                        this->pController = controller;
                    })
                    .setControllerDisconnected([this](const ci::vr::Controller* controller) {
                        CI_LOG_I("VR controller DISconnected");
                        this->pController = NULL;
                    })
            );
        }
        catch (const std::exception& e) {
            CI_LOG_E("VR Session failed: " << e.what());
        }

        if (mVrContext) {
            try {
                mHmd = mVrContext->getHmd();
                //mHmd->setMirrorMode(ci::vr::Hmd::MirrorMode::MIRROR_MODE_STEREO);
                mHmd->setMirrorMode(ci::vr::Hmd::MirrorMode::MIRROR_MODE_UNDISTORTED_STEREO);
            }
            catch (const std::exception& e) {
                CI_LOG_E("VR Hmd failed: " << e.what());
            }
        }
    }
#else
    {
        CI_LOG_I("Setting up VR-replacement camera...");
        mCamUi = CameraUi(&mCam);
    }
#endif


 


And here's the main draw routine:

void CinderApp::draw() {
    gl::clear(Color(0.02f, 0.02f, 0.1f));

#ifdef VR_ENABLED
    // render for VR glasses
    if (mHmd) {
        // render for the individual eyes
        mHmd->bind();
        {
            for (auto eye : mHmd->getEyes()) {
                mHmd->enableEye(eye);
                drawScene();
                mHmd->drawControllers(eye);
            }
        }
        mHmd->unbind();

        if (drawMirrored) {
            mHmd->submitFrame();
            gl::viewport(getWindowSize());
            mHmd->enableEye(ci::vr::EYE_HMD);
            drawScene();
        } else {
            gl::viewport(getWindowSize());
            gl::setMatricesWindow(getWindowSize());
            mHmd->drawMirrored(getWindowBounds(), true);
        }

        if (bDrawDebug)
            drawDebug();

        return;
    }
#endif

    // draw "normal" (no VR)
    gl::viewport(getWindowSize());
    gl::setMatrices(mCam);
    drawScene();

    if (bDrawDebug)
        drawDebug();
}

 

Note that when VR_ENABLED is defined, the app will try to render to the VR HMD (Head Mounted Display), but if no HMD is available, it falls back to rendering "normally", the same way as when VR_ENABLED is not defined.

 

Spatial Audio

Cinder communicates through OSC with a standalone MaxMSP patch that uses the Hoa Library to implement Ambisonic (spatial) audio. There are also OpenFrameworks and Cinder bindings for the Hoa library, but so far I haven't been able to get them to work...

How the communication between the Cinder app and Max currently works:

  • Cinder sends a trigger message for a certain audio sample
  • Cinder sends position information (pitch/yaw/distance) for the currently active audio object

It's not easy to support multiple simultaneous audio samples this way, and it's unclear for how long the Cinder app should keep sending position updates for a triggered sample.
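As a rough sketch of what travels over OSC in the current setup (the addresses and the argument layout here are assumptions for illustration, not the actual ones in the patch):

```cpp
#include <sstream>
#include <string>

// Hypothetical OSC message layouts for the current Cinder -> Max protocol.
// The real patch's addresses and argument order may differ.
std::string triggerMessage( const std::string &sampleName ) {
    return "/audio/trigger " + sampleName;
}

// pitch/yaw in radians, distance in meters, for the single active audio object
std::string positionMessage( float pitch, float yaw, float distance ) {
    std::ostringstream ss;
    ss << "/audio/position " << pitch << " " << yaw << " " << distance;
    return ss.str();
}
```

Because there is only one `/audio/position` stream, every update implicitly refers to whichever sample was triggered last, which is exactly why multiple samples are hard to support.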

How the communication should work in a more robust setup:

  • Cinder sends a trigger message, including a 3D (x, y, z) position, for an audio object
  • Cinder constantly sends the listener's orientation

Cinder simply always provides the audio module with the listener's current orientation, and sound objects are triggered and positioned ("spawned") at a 3D position. This would require the audio module (the Max application) to calculate the pitch/yaw/distance attributes (the data the Hoa library works with) from the position of the sound objects relative to the listener.
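To make that calculation concrete, here's a minimal sketch of the conversion the audio module would need to do. The coordinate conventions (y up, -z forward at yaw 0) and all names are assumptions, not part of the actual patch:

```cpp
#include <cmath>

struct Vec3 { float x, y, z; };

struct Spherical { float pitch, yaw, distance; };

// Converts a sound object's world position into the pitch/yaw/distance
// attributes relative to the listener. Assumes y is up and -z is the
// listener's forward direction at yaw 0 (OpenGL-style coordinates).
Spherical toListenerSpace( const Vec3 &source, const Vec3 &listener, float listenerYaw ) {
    float dx = source.x - listener.x;
    float dy = source.y - listener.y;
    float dz = source.z - listener.z;

    float distance = std::sqrt( dx * dx + dy * dy + dz * dz );
    float yaw      = std::atan2( dx, -dz ) - listenerYaw;  // rotate into listener space
    float pitch    = distance > 0.0f ? std::asin( dy / distance ) : 0.0f;

    return { pitch, yaw, distance };
}
```

With this in the Max patch (or a small shim in front of it), the Cinder app only ever needs to send world-space positions and the listener's orientation.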

 

Aframe test

 


 

Fast prototyping

Aframe lets you build your 3D scene like you build a web page: using HTML and JavaScript. It's a very declarative way of programming, very different from writing a C++ application (potentially MUCH faster, but also very opinionated). I was able to put some stuff together and get it working really fast. Adding Leap Motion support, for example, was literally just a matter of including the addon's js file and adding two lines of HTML:

<script src="lib/aframe-leap-hands.min.js"></script>
<a-entity leap-hand="hand: left;"></a-entity>
<a-entity leap-hand="hand: right;"></a-entity>

This already gave me the 3D robot arms. (I will need to do a bit more to actually get some interaction with objects).

Aframe is great for quickly throwing a little scene together to try something, and it seems like a good tool for fast prototyping. You don't have to think about low-level technical stuff like texture loading (you simply add a tag), and it has a pretty sophisticated animation system for building, for example, transitions. I still have to see how Aframe holds up when you want to start building more complicated applications.

In my experience so far, Aframe works (and performs) pretty well on both desktop and mobile browsers, so you can easily develop and then try things out in Google Cardboard/Daydream. WebVR is a standard that's still under development, but there are already some browsers with (beta) support for Oculus Rift and HTC Vive (see https://webvr.info/ for up-to-date information on the status of the different platforms). This means that with the right browser (and settings) you should also be able to run an Aframe application inside a VR headset, but I haven't tried that yet.

Another awesome "feature" of WebVR is that you can simply upload your app to a website (or, for example, a GitHub page) and everybody can try it on their computer/phone/VR headset, which makes it easy to share updates with team members and clients.

Spatial Audio in WebVR

I haven't tried web-based spatial audio yet, but it's definitely possible.