Thursday, June 1, 2017

Porting GHEngine to VR Part 1: VR GUI Rendering


We're porting our in-house engine to VR!  Namely to Oculus, because they were generous enough to help us out with a development headset.  This version of the engine is in use by one released game on iOS, Android, Windows Store, and OSX, and by two in-development games, including one for Steam.

2D on-screen GUIs have worked on every other platform we've shipped on, but they don't work in VR.  The images are too close to your face and force you to cross your eyes while looking around.  The answer is to move the GUI out into the world.  However, we don't want to write a whole new GUI system; our existing system for loading and interacting with menus is fully implemented, working, and took an immense amount of time to build.

We introduced a new concept to our GUI system that we're calling a canvas.  It's basically an output location for our 2D GUI system: either the screen or a placement in the world.  Each GUI widget gets assigned a canvas via data inheritance.
class GHGUICanvas
{
public:
    GHGUICanvas(bool is2d, const GHPoint3& pos, const GHPoint3& rot,
        const GHPoint3& scale, GHMDesc::BillboardType bt);

    bool is2d(void) const { return mIs2d; }
    // Builds the transform that places gui space (0-1) into the world.
    void createGuiToWorld(const GHViewInfo& viewInfo,
        GHTransform& outTrans) const;

protected:
    bool mIs2d; // projected to screen or not.
    // World-space placement of the canvas.
    GHPoint3 mPos;
    GHPoint3 mRot;
    GHPoint3 mScale;
    GHMDesc::BillboardType mBillboardType; // how the canvas turns to face the camera.
};
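As a rough usage example, a world-space canvas floating in front of the player might be set up like the snippet below.  The specific numbers and the BT_NONE enum value are made up for illustration; the real values come from data.

// Hypothetical: a menu canvas about eye height, a bit in front of the player.
GHGUICanvas menuCanvas(false,               // not 2d: place it in the world.
    GHPoint3(0.0f, 1.5f, -1.5f),            // pos.
    GHPoint3(0.0f, 0.0f, 0.0f),             // rot: facing back at the player.
    GHPoint3(1.0f, 1.0f, 1.0f),             // scale.
    GHMDesc::BT_NONE);                      // hypothetical "no billboarding" value.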
We then moved the creation of the transform from GUI space (0-1) to screen space (-1 to 1 in D3D) out of the shader and onto the CPU using a per-object shader callback.  If there's no canvas, or the canvas specifies 2D, we just pass a simple GUI-space to screen-space transform.  Otherwise we multiply offset, scale, rotation, position, and WorldViewProj together, in that order, and pass the result to our GUI vertex shader.
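Here's a minimal self-contained sketch of that callback's logic.  Mat4 and the helpers below are stand-ins for GHTransform, not GHEngine's actual API; they use row-major matrices with row-vector math, so in a product A * B, A applies first.  The recentering "offset" and the elided rotation are also guesses at the details.

struct Mat4 { float m[16]; };

static Mat4 identity(void)
{
    Mat4 r = {};
    r.m[0] = r.m[5] = r.m[10] = r.m[15] = 1.0f;
    return r;
}

static Mat4 mul(const Mat4& a, const Mat4& b)
{
    Mat4 r = {};
    for (int row = 0; row < 4; ++row)
        for (int col = 0; col < 4; ++col)
            for (int k = 0; k < 4; ++k)
                r.m[row * 4 + col] += a.m[row * 4 + k] * b.m[k * 4 + col];
    return r;
}

static Mat4 scale(float x, float y, float z)
{
    Mat4 r = identity();
    r.m[0] = x; r.m[5] = y; r.m[10] = z;
    return r;
}

static Mat4 translate(float x, float y, float z)
{
    Mat4 r = identity();
    r.m[12] = x; r.m[13] = y; r.m[14] = z;
    return r;
}

// Per-object callback logic: pick the transform the gui vertex shader gets.
static Mat4 buildGuiTransform(bool is2d, const Mat4& rotation,
                              const Mat4& position, const Mat4& worldViewProj)
{
    if (is2d)
    {
        // 2d path: gui space (0-1, y down) to d3d clip space (-1 to 1, y up).
        return mul(scale(2, -2, 1), translate(-1, 1, 0));
    }
    // 3d path: offset * scale * rotation * position * WorldViewProj.
    // The offset recenters gui space on the origin so the canvas scales
    // and rotates about its middle (a guess at what "offset" does).
    Mat4 offset = translate(-0.5f, -0.5f, 0.0f);
    Mat4 canvasScale = scale(1.0f, 1.0f, 1.0f); // the canvas's mScale.
    return mul(mul(mul(mul(offset, canvasScale), rotation), position),
               worldViewProj);
}

The real version would build the rotation from mRot (or the billboard type) and the position from mPos; the interesting part is the multiply order.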

And the result:
