Oculus Launchpad Experience 0.11

Introduction

The past few months have been a bit crazy, but we made it to the end of the Oculus Launchpad program with a WebVR prototype submitted. A great deal has been learned, and here I will talk briefly about what the Circles team created and some of the lessons we learned along the way.

Past

You can find many of the blog posts detailing our process here: http://portfolio.anthony-scavarelli.com/category/oculus-launchpad/. Everything started last year when I began my PhD in digital media at Carleton University and started asking myself three basic questions:

  1. What is education?
  2. What is education going to be?
  3. What can I do to make it better?

With these three questions in hand, and largely inspired by my previous MASc and artwork that aimed to bring people together in public spaces using technology, I stumbled across VR, read some books, and was blown away by its potential to change people. It is the ultimate way to change someone’s environment, and thus their behaviour. I often follow my interests and love the challenge of exploring new spaces, so I “jumped in”. From my initial research, and my experience as a post-secondary teacher, I found three main areas where education and education technology could be improved:

  1. Accessibility (VR HMDs do not work for everyone)
  2. Maintenance (PC images and working with IT to keep things updated is a huge pain)
  3. Lack of social experiences (we learn together – in classrooms – and from each other. Why are we not building experiences that support this?)

I first started building a research platform in Unity, exploring how we might share physical spaces in VR (without crashing into each other), but I quickly found developing for VR in Unity cumbersome (HMD on/off/on/off/on …), and it was difficult to build cross-platform experiences, with and without a VR mode, for those who can’t, or don’t want to, wear an HMD. This is where I found WebVR and Aframe. It was right around this time that I applied to Oculus Launchpad and was accepted, and so with my small but amazing team we have built a [very early] prototype …

Present

Throughout the process of building this prototype we often wrestled with what we could actually do, and how to show intent for the things we couldn’t. It remains to be seen how well we accomplished this (we hear back in early November about who wins the Oculus Launchpad scholarships), but it was a constant thought, as this is becoming such a large project. We needed to start somewhere, though, and starting with WebVR and building upon the amazing tools developed by the talented Aframe, Mozilla Reality, and Networked-Aframe teams was wise – a much better strategy than my original effort to build it all myself using websockets. It allows us to focus on the questions we wanted to explore above, and to build the important components and experiences on top that will drive it.

We wanted to create “lived-in” environments where, though no people are present, you can hear and see the life of the space …

The Accessibility Issue – Are we considering it enough in VR?
For the accessibility question we needed to support multiple devices. Interestingly, as we progressed we found that high-end VR HMD support was not a priority, as educational institutions rarely have the large budgets and IT know-how to keep them going (and the wires. Oh my, the wires!). Focusing instead on mobile, desktop, and standalone 3DOF headsets like the Oculus Go seems extremely useful within educational institutions due to their lower cost and/or greater access to hardware. This is an interesting tack, as many VR developers seem to believe that 3DOF will soon become extinct. I can’t help but feel that ignoring devices like the Oculus Go is an example of a lack of diverse thought in the industry. We already know that passive content (i.e. 360 videos) is a huge part of VR growth, and that cost and space are still huge issues for many people. We can also look to similar industries such as gaming, where the Nintendo Wii was huge, but eventually people didn’t want to dance around and would rather chill more passively with a traditional controller after a day’s work. In any case, part of the fun of a new technology opening up new industries is that nobody knows what the future holds. Steve Jobs, after all, once felt that only web apps should exist on the iPhone platform …

Within VR we also need to try to support accessible modes of locomotion. We decided upon a checkpoint system for several reasons (a minimal sketch of the idea follows the list below).

  • Checkpoints are easy to understand across all devices (click to go!)
  • No nausea of the kind free locomotion can cause in immersive VR.
  • With performance such a HUGE issue in WebVR, being able to control where users can go lets us refine models and environments for particular viewpoints.
  • In a classroom it is unlikely, unless it is a drama or gym class, that instructors want all their students wandering around a room full of desks and other hazardous objects. AR and VR video passthrough provide solutions for this, but they are not production-ready yet.
  • Other Oculus Go experiences use them, and they work well. Chances are that VR in the classroom will be used by all sorts of people from all sorts of backgrounds. Until VR is ubiquitous we should play it safe – and keep it simple.
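
To make the checkpoint idea concrete, here is a minimal sketch of a click-to-teleport Aframe component. It is not our production code: the component name and the “#rig” id for the camera rig are assumptions, and the real system also has to handle things like fading, facing direction, and networking.

  AFRAME.registerComponent('checkpoint-spot', {
    init: function () {
      var el = this.el;
      // The cursor / laser-controls emit 'click' on the intersected entity.
      el.addEventListener('click', function () {
        var rig = document.querySelector('#rig');       // assumed camera rig entity
        var spot = el.object3D.position;
        // Move the rig to the checkpoint, keeping the rig's own height.
        rig.setAttribute('position', {
          x: spot.x,
          y: rig.object3D.position.y,
          z: spot.z
        });
      });
    }
  });

Each clickable checkpoint entity carries this component, so “click to go!” works the same whether the click comes from a mouse, a gaze cursor, or a laser pointer.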

We also implemented snap-turning, and tried to create an object inspection system that works similarly across devices. One thing we have yet to implement is a settings page where users can select the snap-turning angle, whether they want locomotion animated, how tall they are, etc. Soon, though.
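
For reference, here is a rough sketch of the snap-turning idea, attached to a controller entity: rotate the rig by a fixed angle when the trackpad/thumbstick is pushed left or right. The “#rig” id, the 0.8 threshold, and the default angle are illustrative assumptions, not our exact implementation.

  AFRAME.registerComponent('snap-turn', {
    schema: {angle: {default: 30}},  // degrees per snap
    init: function () {
      this.ready = true;
      this.el.addEventListener('axismove', this.onAxisMove.bind(this));
    },
    onAxisMove: function (evt) {
      var x = evt.detail.axis[0];                            // horizontal trackpad/thumbstick axis
      if (Math.abs(x) < 0.8) { this.ready = true; return; }  // wait for a deliberate push
      if (!this.ready) { return; }                           // one snap per push
      this.ready = false;
      var rig = document.querySelector('#rig');              // assumed camera rig entity
      rig.object3D.rotation.y += (x > 0 ? -1 : 1) * this.data.angle * Math.PI / 180;
    }
  });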

Stories
Our intent is to create a collection of stories that instructors can mix and match to build lessons with. For the prototype we focused on the story of Canadian civil rights pioneer Viola Desmond. It is a fascinating story, and it interestingly parallels Rosa Parks’ “sit-in” nine years later. Viola is also being honoured this year as the first non-royal solo woman to be featured on Canadian currency, so there will hopefully be many people interested in exploring her story soon. Our plan is to make it as easy as possible to use VR as a learning tool.

We aim not to recreate the classroom, but rather to extend it.

Lessons Learned

  • VR is hard to get right, and everyone has a different idea of how to do it. User studies will be an important next step, as they will focus on our more specific educational context.
  • VR in education requires a different way of thinking about VR. The environment, and the diversity of people potentially using it, require us to make sure everyone can use it. We want to push what VR education could look like, not push people away from it.
  • The social component is huge. We have a lot of exploration and prototyping ahead of us regarding how to best use the fact that many people are exploring these worlds at the same time.
  • Bugs are not fun, but having access to the source of Aframe and Mozilla Hubs allows us to modify things where necessary to get them working (e.g. laser-controls in Aframe assumes a click is both “triggerup” and “trackpadup”, but we needed different behaviour for each: trigger for selection tasks and trackpad left/right for snap-turning – a sketch of the idea follows this list). A PR (change) fixing this was added to Aframe after this post was written 🙂
  • WebVR is still early, but it has so much potential. We need to push the technology so that others see its power – not only in development, but also in accessibility and “ease of use”. Hopefully the Oculus team agrees 🙂
  • There are so many smart people working on WebVR. It is going to be around for the long-haul.
  • The Oculus Go browser is magic. With Fixed Foveated Rendering it was handling our scenes quite well, likely due to the many optimizations the team has made, including having FFR on by default in the browser. I wish Aframe made it easy to set these kinds of VR display presentation variables (including canvas size). I might have to start looking into it if this project goes any further 🙂
  • There are bugs, but fortunately there are workarounds. It seems both the Oculus Browser and Firefox Reality on the Go suffer from a major performance issue where, if you traverse a link and stay in VR, performance goes to hell when the linked scene loads. Perhaps the last scene is stuck in memory? In any case a bug has been filed, and the workaround was to manually exit VR before the next scene loads (a sketch of this also follows the list). It is definitely immersion-breaking, but far less so than a sub-10 fps experience :/
  • I feel like we have only scratched the surface of preparing models for WebVR (we are hardly using the PBR workflow currently). I am converting our models to glTF, but there is much room for optimization. For example, in our theatre scene I originally let you go anywhere, but rendering that many seats was a terrible hit on performance. To fix this I limited movement to the front, remodelled the seats at lower poly counts, and used “imposters”/cards (i.e. single-polygon stand-ins) to mimic the seats in the farther rows.
  • We love dynamic lights, but we probably need to use baked shadows more often 🙁 Shadows help a lot with depth perception, which suffers without them. I am going to keep working on optimizing the dynamic point light in our campfire, though, as the bouncing shadows add so much to the ambience 🙂
  • Debugging the Oculus Go browser via the Chrome developer tools is very handy for tracking down bugs and seeing console output.
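
On the laser-controls point above, here is a hedged sketch of the kind of change we needed: restrict the laser cursor so that only the trigger counts as a “click”, leaving the trackpad free for snap-turning. The “#right-hand” id is hypothetical, and the exact cursor properties may differ between Aframe versions.

  var controller = document.querySelector('#right-hand'); // hypothetical controller entity with laser-controls
  controller.addEventListener('controllerconnected', function () {
    // Only trigger events act as the cursor's "click"; trackpad events stay free for turning.
    controller.setAttribute('cursor', {
      downEvents: ['triggerdown'],
      upEvents: ['triggerup']
    });
  });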
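
And for the link-traversal performance issue, a sketch of the workaround described above – leaving VR before navigating so the next scene does not start at single-digit framerates. The goToScene helper name is just for illustration.

  function goToScene (url) {
    var sceneEl = document.querySelector('a-scene');
    if (sceneEl && sceneEl.is('vr-mode')) {
      sceneEl.addEventListener('exit-vr', function onExit () {
        sceneEl.removeEventListener('exit-vr', onExit);
        window.location.href = url;   // navigate only once we have left VR
      });
      sceneEl.exitVR();
    } else {
      window.location.href = url;     // already flat; just navigate
    }
  }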

Anyhow, here is the final result of our prototype efforts, in Twitter video form. We may write some more later once we move into our next phase, but for now this is the end of the “Oculus Launchpad Experience”.

Per aspera ad astra!

Future

We will continue work on this project. It may go much more slowly if we don’t get funded, but I am still very interested in it, and I still have a PhD thesis to write in a couple of years that could benefit from its development! There are also a few features I would like to get in that I did not have time for before submitting. I am sure this story is not yet finished …

Acknowledgements

As mentioned in previous blog posts, thanks to those who helped make this happen, including my graduate advisors Dr. Ali Arya and Dr. Rob Teather, and the wonderful development team (Grant Lucas, Tetsuro Takara, Kirk Starkey, Virginia and Nathaniel). Also thanks to my wonderful wife and colleagues, who have provided guidance and feedback. And, of course, special thanks to all the open-source contributors to Aframe, its community components, and Mozilla Hubs, who have made starting social WebVR so much easier. And finally, thanks to Oculus for supporting a program like Launchpad, and to the Oculus Launchpad team for their great feedback, organization, and support.
