Welcome to the Procter & Gamble VR deep dive workshop (available on http://learnwebvr.xyz until April 1st).

Hello, let's all meet in VR! Open http://learnwebvr.xyz/social on your mobile phone right now. No slide deck, no round of questions to find out who knows what, no workshop agenda up front: all of that comes afterwards. Explanations come after the experience, because VR is the medium of experience.

What just happened? How do you think it worked? How do you think you could make it work yourself?

Description

The Procter & Gamble VR deep dive workshop is an opportunity to discover the key concepts of VR: how to build a scene, how to interact with it individually or as a group, how to generate content dynamically, and much more. You will gradually learn the key principles and the technical framework needed to turn your idea into a proper VR experience. From marketing to presentations, from solo to social market research: VR has many potential uses. You'll learn the initial steps to start your own project in an efficient, collaborative and interactive environment.

Concepts we will explore:

  1. the stereo effect
  2. fundamental rules of VR-scene design
  3. A-Frame as a sensible and established framework
  4. loading your own 3D assets
  5. sharing your VR experience with the outside world
  6. WebVR as a perfect entry to VR

Source: "Inside P&G's digital revolution", McKinsey Quarterly, November 2011

Introduction to virtual reality (key concepts: object position, camera, stereo effect)

This first session will be a gentle introduction to the key concepts. All will be done directly in a coding environment without any installation required to start.
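As a sketch of these key concepts, a minimal A-Frame scene only needs a few lines of HTML. The positions and colour below are arbitrary choices, and the CDN URL pins an A-Frame version as an assumption:

```html
<!DOCTYPE html>
<html>
  <head>
    <!-- A-Frame from its CDN; the exact version here is an assumption -->
    <script src="https://aframe.io/releases/1.5.0/aframe.min.js"></script>
  </head>
  <body>
    <a-scene>
      <!-- an object positioned in 3D space: x y z, in metres -->
      <a-box position="0 1.5 -3" color="#4CC3D9"></a-box>
      <!-- the camera entity; A-Frame renders the left/right stereo
           pair automatically once VR mode is entered -->
      <a-camera position="0 1.6 0"></a-camera>
    </a-scene>
  </body>
</html>
```

Opening this page on a phone and tapping the goggles icon splits the view into the stereo pair: the stereo effect is handled by the framework, not by you.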

Please note that the workshop is a place to experiment and learn, not an inventory of answers. The goal is to give you all the pointers you need to efficiently start thinking with and for VR.

Building for virtual reality (primitives, meshes, lights, assets including 360° photos and videos)

Assuming the key concepts are understood, we can make a scene richer using primitives, meshes, lights, and assets such as 360° photos and videos.
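As a sketch under the same assumptions (current A-Frame build loaded in the page, placeholder asset file names), a richer scene might combine primitives, a custom mesh, a light, and a preloaded 360° photo used as the sky:

```html
<a-scene>
  <!-- preload assets once, then reference them by id -->
  <a-assets>
    <img id="pano" src="pano.jpg">  <!-- placeholder 360° photo -->
    <a-asset-item id="model" src="model.gltf"></a-asset-item>  <!-- placeholder mesh -->
  </a-assets>

  <!-- primitives -->
  <a-sphere position="-1 1.25 -4" radius="1.25" color="#EF2D5E"></a-sphere>
  <a-plane position="0 0 -4" rotation="-90 0 0" width="8" height="8" color="#7BC8A4"></a-plane>

  <!-- a custom mesh loaded from a glTF file -->
  <a-entity gltf-model="#model" position="1 0 -4"></a-entity>

  <!-- a light source shaping how the materials appear -->
  <a-light type="point" position="2 4 -2" intensity="0.8"></a-light>

  <!-- the 360° photo wrapped around the whole scene -->
  <a-sky src="#pano"></a-sky>
</a-scene>
```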

Interaction in virtual reality (gaze, button, proper distance)

The focus here is not on objects as static or even animated elements, but on how the person having the experience can interact with them. We will cover what works and what doesn't, in terms of both hardware and user experience.

Note that most of what we have done so far used only the declarative side of A-Frame. From this point on, the dynamic aspects require JavaScript.
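As a minimal sketch of gaze-based interaction, the declarative part adds a cursor to the camera, and a small JavaScript component (the component name below is our own invention) reacts to the click event the cursor emits when the gaze fuses on an entity:

```html
<script src="https://aframe.io/releases/1.5.0/aframe.min.js"></script>
<script>
  // "change-color" is a name made up for this sketch
  AFRAME.registerComponent('change-color', {
    init: function () {
      // the gaze cursor emits a synthetic "click" after fusing
      this.el.addEventListener('click', () => {
        this.el.setAttribute('color', '#F2E646');
      });
    }
  });
</script>

<a-scene>
  <a-box position="0 1.5 -3" color="#4CC3D9" change-color></a-box>
  <a-camera>
    <!-- fuse: true turns staring at an entity into a click -->
    <a-cursor fuse="true" fuse-timeout="1500"></a-cursor>
  </a-camera>
</a-scene>
```

This works without any controller, which is why gaze is a good lowest common denominator across headsets and phones.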

Being social and dynamic in virtual reality (WebSockets, server-side content)

A virtual reality experience doesn't have to be experienced alone. In fact, it is quite easy to turn a basic solo experience into a rich multi-user one. Instead of programming complex algorithms, letting users interact with each other is a quick way to showcase interesting behaviour. In addition, we will show how to use dynamic content, either generated on the fly or processed by a server that then generates a relevant scene.
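The multi-user part boils down to each client broadcasting its pose over a WebSocket and applying the poses it receives. As a minimal sketch (the message shape and field names below are our own, not a fixed protocol), the serialisation logic is plain JavaScript and independent of any library:

```javascript
// Encode this client's pose into a compact JSON message.
// The { id, p, r } shape is an assumption made for this sketch.
function encodePose(id, position, rotation) {
  return JSON.stringify({ id: id, p: position, r: rotation });
}

// Decode a message received from the socket back into a pose,
// returning null for anything that does not look like one.
function decodePose(message) {
  const data = JSON.parse(message);
  if (typeof data.id !== 'string' || !data.p || !data.r) return null;
  return { id: data.id, position: data.p, rotation: data.r };
}

// In a real page each client would then do roughly:
//   const socket = new WebSocket('wss://example.invalid/social'); // placeholder URL
//   socket.onmessage = (e) => { const pose = decodePose(e.data); /* move that avatar */ };
//   setInterval(() => socket.send(encodePose(myId, myPos, myRot)), 100);
```

Each client only moves the avatars of the other identifiers it sees, which keeps the server a dumb relay.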

Going faster and further

  • registering a component, following the model of proper distance
  • selecting multiple custom elements
  • emitting an event
  • registering a shader
  • developing your own scaffolding, e.g. a grid with associated .svg files
  • trying augmented reality using <a-marker> from AR.js
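Taking the first bullet as an example, registering a component is the standard A-Frame extension point. A sketch of a "proper distance" check (the component name, schema, and threshold are our own choices for illustration) that fades an entity out when it sits uncomfortably close to the camera:

```html
<script>
  // "proper-distance" is our own name for this sketch
  AFRAME.registerComponent('proper-distance', {
    schema: {
      min: { type: 'number', default: 0.5 }  // metres; arbitrary default
    },
    tick: function () {
      const camera = this.el.sceneEl.camera;
      if (!camera) { return; }
      const entityPos = new THREE.Vector3();
      const cameraPos = new THREE.Vector3();
      this.el.object3D.getWorldPosition(entityPos);
      camera.getWorldPosition(cameraPos);
      // fade the entity when it is closer than the minimum distance
      const tooClose = entityPos.distanceTo(cameraPos) < this.data.min;
      this.el.setAttribute('opacity', tooClose ? 0.2 : 1.0);
    }
  });
</script>

<!-- usage: <a-box proper-distance="min: 0.75" position="0 1.5 -1"></a-box> -->
```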


Done? Go back to learn more.