Product Manager's Guide to Getting Started in VR/AR — Notes: Design and Experience (1)


Background

I come from a design and product management career path, and am keen to understand VR/AR principles, technologies, platforms, ecosystems, experiences, dependencies, markets and evangelists, so I can become relevant to the VR/AR industry. I have a specific interest in VR being used as part of the working day, not gaming. I may therefore draw similarities to traditional software methods, frameworks and practices, specifically software that empowers users to create — e.g. create an email, create a collage of images, create a presentation or create code.

This article represents my notes and learnings for a larger piece: A Product Manager’s Guide to Getting Started in VR.

I hope that by sharing with the product management community the notes I collate from studying numerous industry papers, articles and literature, I can help filter out the key points I feel are relevant for product managers who, like me, do not have VR experience or background.

I’m coming at this with zero days of commercial VR/AR experience, so I encourage feedback in the form of inline comments and responses to this article. Your corrections will help me and the product management community learn.

Visual Design Methods for Virtual Reality by Mike Alger

Source for all of the below study notes: Visual Design Methods for Virtual Reality by Mike Alger, 2015, pages 1–51.

VR Basics

  • Common VR chat applications include VRChat, ConVRge and AltSpace.
  • As a medium there are very few standards, protocols and workflows.
  • “It’s like participating in film before cinematography had shot names.” — what a great quote, albeit more relevant at the time of writing than currently.
  • “How can I contribute” — I am asking myself this same question.
  • A significant amount of VR content is designed to be consumed rather than created.
  • Mouse and keyboard interactions require movement only from the shoulder or elbow down, because of the nature of those input devices. VR users move their bodies in completely new ways, so the UX/HCI playbook for GUIs within rectangular windows on rectangular screens may become irrelevant for VR.
  • The operating system for interacting with VR itself needs to be ergonomically redesigned.
“In order to create this new medium of interaction, a method of creation must first be invented.” — Mike Alger, 2015
  • Many VR interaction studies have not been conducted in controlled environments or with quantitative support, but hey, it’s 2016, everything is lean now.
  • The world around us is not simply around us. Our senses see, hear, feel, smell and taste; we interpret and process that information, and only then is a world revealed to us. Without the senses, and the interpretation and processing, I imagine we would all be the same, and the world would be the same to all of us. This learning was enlightening.
  • “…humans have certain predictable outputs based on certain sets of inputs.” — it is this principle that the best VR software will bow and curtsey to. Our instincts in VR will be played upon, and we will exhibit behaviours in VR as naturally as in reality. I’m looking forward to my first goosebumps, tears and belly chuckles in VR.
  • The original desktop GUI was based on the 2D analogy of paper on a desk. I question whether this will remain in VR. Paper can live its little life on a desk because of gravity, and it is gravity that keeps files in folders. In VR we don’t have to obey Sir Isaac Newton’s laws. If the first GUI had been designed in a weightless world, I wonder what kind of desktop it would have been?
“A HMD is like headphones for your eyes.” — Mike Alger
  • All product managers getting into VR need to learn a little about the plenoptic function (see the formula after this list).
  • There are 10 degrees of freedom, learn them: x, y, z position; pitch, yaw, roll rotation; distance, horizontal and vertical position of the point of convergence and focus; and the size of the pupil’s aperture.
  • Focus and aperture are not catered for in current HMDs, and the range of light in HMDs is not as great as in the real world.
  • Immersion and presence are VR “wank words”, used for marketing. Instead remember this quote:
“Immersion is being presented only with data from a false environment, but presence is having your body believe it on a fundamental basis.”
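Since two of the bullets above lean on it, here is the plenoptic function for reference. This is the common textbook (Adelson and Bergen) formulation, quoted from my own reading rather than reproduced from Alger’s paper: it gives the intensity of light arriving at any viewpoint, from any direction, at any wavelength, at any time.

```latex
% Plenoptic function: everything that can, in principle, be seen.
% Intensity of light at viewpoint (V_x, V_y, V_z), arriving from
% direction (\theta, \phi), at wavelength \lambda and time t.
P = P(\theta, \phi, \lambda, t, V_x, V_y, V_z)
```

The 10 degrees of freedom in the bullet above are, in effect, the parameters a head-and-eye system varies while sampling this function.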

Basics around viewing

  • 75 times a second — that's the frame rate suggested by Oculus in 2015.
  • 20 millisecond motion-to-photon latency — sounds pretty fast to me.
  • 90 frames per second — the Vive requirement; no wonder a decent graphics card costs $700 for this stuff (see the frame-budget sketch after this list).
  • Shit gets nasty if you don’t hit the above three numbers. No one wants vomit on their Oculus, so pay attention. Apparently the brain senses a mismatch between the optical and vestibular (balance and orientation — I had to Google it) systems, assumes you have been poisoned (as with alcohol that simulates the room spinning), and makes you ill to eject whatever has been consumed. Apparently this is a contributing reason why 1990s VR didn’t hang around.
  • Also, don’t rotate or accelerate the viewpoint independently of the user’s head, or motion sickness can again occur.
  • Reading small text or text far away is still not great in VR.
  • The rules for viewing in VR can be applied directly to AR. The only difference between the two is the amount of natural light that reaches the eye (AR: >1% up to 100% natural light; VR: 0% natural light).
“Many elements of an operating system interface design for the Oculus Rift or HTC Vive could be used with Microsoft Hololens” — Mike Alger
  • The interaction methods, however, may change between AR and VR.
  • VR interactions have the opportunity to be more intense, and it is this opportunity that we must design for.
  • Larger or multiple monitors have been used to increase efficiency in the traditional desktop workplace; in VR we have an infinite amount of desktop space, so in theory VR may increase workplace productivity.
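To put the refresh-rate bullets above in perspective, here is my own back-of-envelope budget check; the arithmetic is mine, not the paper’s:

```typescript
// Per-frame render budget in milliseconds for a target refresh rate.
function frameBudgetMs(fps: number): number {
  return 1000 / fps;
}

console.log(frameBudgetMs(75).toFixed(1)); // "13.3" (Oculus' 2015 suggestion)
console.log(frameBudgetMs(90).toFixed(1)); // "11.1" (the Vive requirement)

// Whatever the budget, tracking, simulation, rendering and display
// scan-out together still have to land under the ~20ms motion-to-photon
// target, or the vestibular mismatch described above kicks in.
```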
List of common computer uses — Visual Design Methods for Virtual Reality by Mike Alger

Designing for VR

  • Alger’s list of the most common uses for desktop machines is impressive. I found it thought-provoking that, with all the computing power available to us today, we tend to have only around 35 use cases.
  • A cursor enables x and y location specification; adding six other variables will make design interesting (see the sketch after this list).
  • Candidate input devices include trackpads, single buttons, motion-tracked controllers, hand tracking and omnidirectional treadmills (these look great!).
  • Hands seem like a natural input controller, as people in VR seem to raise their hands naturally.
  • Hand tracking and gesture recognition are still primitive, even with technologies like Leap Motion.
  • Hands in space don’t experience haptic feedback, which makes pushing VR buttons a little weird, as there is no pressure being returned to your finger. Although this looks to target that need.
  • Ergonomic comfort will need to be achieved for VR/AR to be adopted in the workplace.
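As a minimal sketch of the jump from a two-variable cursor to a tracked controller, mentioned in the list above; the type and field names are my own illustration, not from any real VR API:

```typescript
// A 2D cursor gives a designer exactly two variables to work with.
interface CursorInput {
  x: number; // pixels from the left edge
  y: number; // pixels from the top edge
}

// A motion-tracked controller instead reports a full pose in 3D:
// position plus orientation, for every hand, on every frame.
interface TrackedControllerInput {
  position: { x: number; y: number; z: number }; // metres
  rotation: { pitch: number; yaw: number; roll: number }; // radians
}
```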

Input UI

Radial interface concept — Visual Design Methods for Virtual Reality by Mike Alger
  • For single-item selection, VR cues for input have included a laser pointer (or a cone for distance), radial 2D menus and world-in-miniature.
  • Locomotion (moving around) and text inputs are still maturing.
  • For text input, accuracy, speed and comfort will all need to outperform the traditional keyboard for workplace adoption. Voice recognition may play a cameo role. Using your finger to draw letters may be of use, as per the Apple Watch; however, this will not outperform 60 words per minute, therefore failing the speed requirement. I’m most excited by the concept of radials around the hands: Alger alludes to six options on each hand, and possibly multiple rings, enabling 36 input combinations, similar to the HEX input scene from the film “The Martian” (or something like this). Swype-style input using a laser or finger may be an option, but I think it would require a lot of energy. (See the radial sketch after this list.)
  • Object inputs will need to be redefined for the VR GUI: knobs, sliders, buttons, radio buttons, dropdowns and so on. However, the ability to intersect an object in VR (with your hand, or with an additional virtual object) may assist with this.
  • We may need new models for GUI inputs. A video player, for instance, is a rectangular texture (an image that updates many times per second) overlaid on a rectangular 2D model; with a third dimension available for watching or creating video, the rectangular model may no longer be the most suitable.
  • VR for data visualisation is going to be huge!
  • Hover and pressed states for buttons will be interesting as they evolve. The “tap” (as per smartphones) may be more intuitive than the “click”.
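To make the two-handed radial idea concrete, here is a minimal sketch of how six options per hand could cover 36 characters. The character layout and function names are my own illustration, not Alger’s design:

```typescript
// Each hand selects one of six radial sectors; combining the two
// selections gives 6 x 6 = 36 inputs, enough for a-z plus 0-9.
const CHARSET = "abcdefghijklmnopqrstuvwxyz0123456789";

// Map a hand's pointing angle (radians) to a sector index 0-5.
function sectorForAngle(angle: number): number {
  const TWO_PI = Math.PI * 2;
  const normalized = ((angle % TWO_PI) + TWO_PI) % TWO_PI;
  return Math.floor(normalized / (TWO_PI / 6));
}

// Combine left- and right-hand sectors into one of the 36 characters,
// e.g. left sector 0 with right sector 4 selects "e".
function charForSectors(left: number, right: number): string {
  return CHARSET[left * 6 + right];
}
```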

Content Zones

OK, so digital product managers first had to worry about content inside a 760px width; then screens got larger and smaller at the same time, so we worried about adaptive and then responsive experiences; and at the same time pixel density varied, for further complication. We all knew our numbers for these constraints and considerations. Now it’s time to learn the VR equivalents. In this paper, Alger refers heavily to presentations by Alex Chu.


I suggest learning the following (and drawing it to learn it). Note that the paper refers to the Oculus DK2 in this instance:

Horizontal

  • 210° — Human field of view
  • 94.2° — DK2 field of view
  • 0.5 meters — content closer than this makes you go cross-eyed or triggers eye strain (Oculus recommended 0.75m in 2015).
  • 0.5m to 10m — strong sense of depth between objects.
  • 10m-20m — weaker sense of depth.
  • 20m+ — due to screen resolution, stereo separation is not possible and the environment appears flat.
  • 30° — comfortable horizontal head rotation range (left or right from looking straight ahead)
  • 77° left or right from straight ahead — comfortable content zone
  • >77° and <102° left or right from straight ahead — uncomfortable content zone; strained looking.
  • >102° — stuff is behind you, and you need to move more than just your head (see the classifier sketch below).
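Those horizontal numbers condense into a small classifier. This is just my own restatement of the zones above, with my own labels:

```typescript
type HorizontalZone = "comfortable" | "strained" | "behind";

// Classify content placement by its angle (degrees) left or right
// of straight ahead, per the zones listed above.
function horizontalZone(angleDegrees: number): HorizontalZone {
  const a = Math.abs(angleDegrees);
  if (a <= 77) return "comfortable"; // easy head-and-eye reach
  if (a <= 102) return "strained";   // viewable, but an uncomfortable stretch
  return "behind";                   // needs more than a head turn
}
```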
Workspace zones — Visual Design Methods for Virtual Reality by Mike Alger
Content zones in VR — Visual Design Methods for Virtual Reality by Mike Alger

Vertical

  • 60° up and 40° down — max head rotation
  • 20° up and 12° down — comfortable head rotation
  • All of these numbers are for visuals; sound comfort is not covered in the paper. (A quick comfort-check sketch follows this list.)
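And the vertical equivalent, again with my own labels:

```typescript
// Vertical head-rotation comfort per the figures above
// (positive degrees = up, negative = down).
function verticalRotationComfortable(deg: number): boolean {
  return deg >= -12 && deg <= 20;
}

function verticalRotationPossible(deg: number): boolean {
  return deg >= -40 && deg <= 60;
}
```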

Reading Text

  • 1.3m — comfortable reading distance on the DK2; 2.5m for future devices.

  • Once the content zones have been defined, the overlap between what we can see and what we can comfortably reach with our arms can be defined, based on a 2/3 reach rule (see the sketch after this list).
  • Although the workspace can be infinite, and the most comfortable position whilst seated is looking down slightly (as per the reasoning in the paper), users who fear heights may find this distressing in VR.
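A quick sketch of the 2/3 reach rule as I read it; my interpretation is that comfortable content sits at roughly two thirds of full arm extension:

```typescript
// Comfortable working radius as ~2/3 of full arm extension.
function comfortableReachMetres(armLengthMetres: number): number {
  return armLengthMetres * (2 / 3);
}

// e.g. a 0.6m arm suggests a ~0.4m comfortable working radius.
console.log(comfortableReachMetres(0.6).toFixed(2)); // "0.40"
```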
Reachable and visually friendly content working zones — Visual Design Methods for Virtual Reality by Mike Alger
Workspace zones at a comfortable viewing angle up to 20m depth — Visual Design Methods for Virtual Reality by Mike Alger

Personal reflection on these notes

Reading, studying and Googling, then processing and writing these notes up, represents about 6 hours of work that I hope has been summarised into useful content for the purpose stated in the background of this article.

Some other concepts for interacting with VR content that I’ve thought of as part of my research include an iOS-style magnifier, foot or toe tracking, and finger tracking on users’ thighs for the purpose of typing long-form content.

I’m enjoying my time on this, and I hope you have found these notes useful. I’ll be collating my key learnings from this and future articles into my four-piece series A Product Manager’s Guide to Getting Started in VR.
