
Practical Tips on Time Travel: How to Transport Using Gestures in VR

Derrick Rose probably wishes that he could travel to a time when he doesn't play for the Knicks.

Why do we need time travel, when we already travel through space so far and fast? For history. For mystery. For nostalgia. For hope. To examine our potential and explore our memories. To counter regret for the life we lived, the only life, one dimension, beginning to end.
— James Gleick, Time Travel

Time travel is a powerful tool for storytellers, but here I argue that all VR/AR designers can use associations with time travel to help users grasp concepts more quickly.

For VR/AR designers, time travel presents an interesting case study of:

  • Using physical gestures to represent abstract concepts
  • Activating known associations with each of those abstract concepts

Make Physical Gestures Represent Time in Space

Conceptually, humans move through time, which means that how we think about space affects how we think about time. As technologies for tracking gestures improve, VR/AR designers can build these existing brain/body connections into their systems.

In today’s world of VR devices, imagine that you wanted to activate people’s sense of the future or the past using HTC Vive controllers.

HTC Vive controller

Start by thinking about what represents the future. Generally, people perceive the future as being in front of them, and they will even lean forward when thinking about it. The opposite is true for the past: people can be observed leaning backward when reminiscing.

VR is a 360-degree environment, so it’s not exactly clear what counts as the front or back of an experience. Instead, anchor associations with the future and the past to the user’s body using hand gestures. Point your index finger forward to advance time, and extend your thumb behind you to reverse it. Or, if the user has a set of controllers, map the trigger buttons under the forefinger to future movement and the thumb buttons to movement into the past. You can also give users the sense of moving forward or backward if you want them to feel transported to the future or the past.
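To make this concrete, here is a minimal input-mapping sketch in TypeScript. The `HandController` and `Timeline` interfaces are hypothetical stand-ins for whatever SDK you are using; the point is simply that the index-finger trigger maps to the future and the thumb input maps to the past.

```typescript
// A minimal sketch, not tied to a specific SDK. `HandController` and
// `Timeline` are hypothetical wrappers: the controller emits button events
// for a tracked hand controller, and the timeline scrubs the scene's
// simulated time.
interface HandController {
  on(event: "triggerDown" | "trackpadDown", handler: () => void): void;
}

interface Timeline {
  advance(seconds: number): void; // move the scene state toward the future
  rewind(seconds: number): void;  // move the scene state toward the past
}

const TIME_STEP_SECONDS = 5;

function bindTimeTravelGestures(controller: HandController, timeline: Timeline): void {
  // Index-finger trigger (pointing forward) maps to the future...
  controller.on("triggerDown", () => timeline.advance(TIME_STEP_SECONDS));
  // ...while the thumb-operated trackpad maps to the past, keeping the
  // "future in front, past behind" association anchored to the hand.
  controller.on("trackpadDown", () => timeline.rewind(TIME_STEP_SECONDS));
}
```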

But does it matter if I’m not literally building a VR game about time travel?

Yes! These physical associations with the future and the past can be used by all VR/AR designers. By now, I hope that I’ve convinced you that people are wired to perceive the future in front and the past behind. Now you can use associations your target user has with the future or the past to map back to physical gestures.

The future is generally associated with the following:

  • It offers more influence over external events, meaning that it feels easier to control outcomes in the future than in similar situations in the past.
  • People tend to expect that more positive things and fewer negative things will happen to them in the future.
  • This leads them to anticipate all sorts of self-improvements (being healthier, skinnier, wiser, etc.) because they expect to exercise, diet, read, and so on more.

If people have such a strong sense of optimism and expect so many positive things in the future, this also means that their thinking about the future is:

  • More imaginative than realistic (no reality checks)
  • Focused on the fulfillment of their greatest desires
  • Filled with more stereotypes and less variety

Now that you know the associations with the future, you can imagine the converse associations for the past: more negativity, more uncontrollable outcomes, and more variety and detail.

Using physical cues for the future (front) and the past (behind) can bring to mind the associations that Americans already have with each. A gesture that represents a concept as abstract as the future can also call up the things associated with it, such as a greater sense of self-control.

Here’s an example of how to use the gestures and associations with the future in an indirect way.  Let’s say that you wanted to give people a heightened sense of autonomy and control inside of your experience.  Prime them to act with more autonomy by having them lean into your experience.

Another instance where you could utilize these associations would be around storytelling.  If you wanted people to consume a narrative that you have created, let them sink back to watch it.  Indicate that the events have already occurred and that they cannot change.  
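If you wanted to detect whether a user is leaning in or settling back, one rough approach is to compare the current head position against a neutral position captured at the start of the experience. The sketch below is illustrative only; the coordinate convention and threshold are assumptions you would tune for your own setup.

```typescript
// A rough sketch, assuming you can sample the HMD position each frame and
// that you calibrated a neutral head position when the experience started.
// Treating positive z as "forward" is purely an illustrative assumption.
type Vec3 = { x: number; y: number; z: number };

const LEAN_THRESHOLD_METERS = 0.08; // tune per experience

type Lean = "forward" | "backward" | "neutral";

function classifyLean(neutralHeadPos: Vec3, currentHeadPos: Vec3): Lean {
  const dz = currentHeadPos.z - neutralHeadPos.z;
  if (dz > LEAN_THRESHOLD_METERS) return "forward";   // leaning in: future / agency cues
  if (dz < -LEAN_THRESHOLD_METERS) return "backward"; // settling back: past / narrative cues
  return "neutral";
}
```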

Takeaways

  • People come into any experience with pre-existing associations, and it’s up to designers to utilize existing ones or train new ones.
  • Abstract concepts such as the future or the past can have physical associations (leaning forward, etc.).
  • It is possible to use physical gestures to represent abstract concepts. Pointing with the index finger, or using the trigger button on a controller, can map onto the future.
  • Start by thinking about the feelings and associations that you want your users to have. Identify the associations and references that people already have with those things, and work backward from there.

References

Kane, J., McGraw, A. P., & Van Boven, L. (2008). Temporally asymmetric constraints on mental simulation: Retrospection is more constrained than prospection. The handbook of imagination and mental simulation, 131-149.

 

 

Implications of Texture & Pressure in VR

Microsoft unveiled its NormalTouch and TextureTouch controllers. They are specialized controllers that give people feedback to make them believe they are touching real objects in VR. There is a longer write-up here, and the video is embedded below.

The controllers aren't perfect and you are limited to getting feedback on the tip of your finger. However, the technology itself is an important gateway to more immersive experiences. People have powerful associations with texture and pressure that designers can use to change people's feelings inside of the experience.  

In a previous post, Metaphors are Jet Fuel, I discussed how feeling differing textures affected subsequent decision-making without people's conscious awareness of the influence.  The example I used was the metaphor of rough or smooth social interactions. Consider these common metaphors in American English:  

  • Having a rough day
  • Using coarse language
  • Being rough around the edges
  • Acting as a smooth operator

These new Microsoft controllers could be key to experiences that depend on creating feelings that are commonly associated with texture and pressure.  When people feel those sensations on the controllers, it will activate their associations with softness, smoothness, pressure, etc.
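As a thought experiment, here is what wiring authored material properties to such a controller might look like. The `HapticSurface` interface is entirely hypothetical (it is not a published Microsoft API); it just illustrates mapping a material's roughness and stiffness to the feedback the user feels.

```typescript
// A speculative sketch: `HapticSurface` stands in for whatever texture/
// pressure API a controller like TextureTouch might eventually expose.
// The point is the mapping from authored materials to felt feedback.
interface HapticSurface {
  setRoughness(level: number): void;  // 0 = glassy smooth, 1 = sandpaper
  setResistance(newtons: number): void;
}

interface VirtualMaterial {
  roughness: number;  // 0..1, authored per material
  stiffness: number;  // 0..1
}

function applyMaterialFeedback(surface: HapticSurface, material: VirtualMaterial): void {
  // Rough materials should also *feel* rough, so the user's existing
  // "rough/coarse/harsh" associations get activated along with the visuals.
  surface.setRoughness(material.roughness);
  surface.setResistance(material.stiffness * 4); // cap feedback at roughly 4 N
}
```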

Reality Is What You Do (Not What You See)

“Being there” means the capability to act there.

Your perception of reality is based on what you can do. When you are inside a VR environment, the more functionality you have, the more the experience resembles your everyday life. You believe an object is real when you can interact with it, not just when you see it.

Presence is defined as a sense of “being there,” or the extent to which virtual environments are perceived as places visited rather than images seen.  If you accept that presence is a design ideal for VR environments, there are systematic ways to increase users’ feelings of it.  Here I review two scientific papers on using body movement to heighten presence.  

Locomotion: Walking in Place vs. Using a Mouse

The degree of presence depends on the match between proprioceptive and sensory data. Researchers at the University of London asked people to walk in place while they were inside a virtual experience. The participant's gaze in the HMD determined which direction they felt they were walking in. The researchers compared the walking-in-place experience against the use of a computer mouse for locomotion. They believed that walking in place offered an advantage because it doesn't require people to use their hands for navigation.
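Here is a simplified sketch of gaze-directed walking in place. The paper's system used a trained pattern recognizer on head movement; this version just treats vertical head oscillation as "walking" and translates the user along their flattened gaze direction, so the detection rule and numbers are assumptions for illustration.

```typescript
// A simplified, assumption-laden sketch of gaze-directed walking in place:
// detect vertical head bob, then move the player along the horizontal
// component of their gaze.
type Vec3 = { x: number; y: number; z: number };

const BOB_THRESHOLD = 0.02; // meters of vertical head movement per frame
const WALK_SPEED = 1.2;     // meters per second

function walkInPlaceStep(
  headPos: Vec3,
  prevHeadPos: Vec3,
  gazeForward: Vec3, // unit vector derived from head orientation
  playerPos: Vec3,
  dt: number
): Vec3 {
  const bobbing = Math.abs(headPos.y - prevHeadPos.y) > BOB_THRESHOLD;
  if (!bobbing) return playerPos;

  // Flatten the gaze vector so walking stays on the ground plane.
  const len = Math.hypot(gazeForward.x, gazeForward.z) || 1;
  return {
    x: playerPos.x + (gazeForward.x / len) * WALK_SPEED * dt,
    y: playerPos.y,
    z: playerPos.z + (gazeForward.z / len) * WALK_SPEED * dt,
  };
}
```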

The hand may be entirely reserved for the purposes for which it is used in everyday reality, that is, the manipulation of objects and activation of controls.
— Slater, Usoh & Steed (1995)
Valve's The Lab uses a teleportation function built into the hand controllers.  The user points a light beam where they want to go and releases the trigger to teleport.  In this case, the user will land in the green circle in front of them.

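For comparison, here is the general shape of that pointer-teleport pattern, not Valve's actual implementation. The `raycastToGround` helper and the state shape are assumptions; the key interaction is that holding the trigger previews a landing spot and releasing it commits the move.

```typescript
// A hypothetical sketch of pointer teleportation: hold the trigger to
// preview a landing point, release to commit.
type Vec3 = { x: number; y: number; z: number };

interface TeleportState {
  target: Vec3 | null;
}

function onTriggerHeld(
  state: TeleportState,
  rayOrigin: Vec3,
  rayDir: Vec3,
  raycastToGround: (o: Vec3, d: Vec3) => Vec3 | null // assumed helper
): void {
  // While the trigger is held, keep updating the highlighted landing circle.
  state.target = raycastToGround(rayOrigin, rayDir);
}

function onTriggerReleased(state: TeleportState, setPlayerPosition: (p: Vec3) => void): void {
  // Releasing the trigger commits the move: the user lands on the marked spot.
  if (state.target) setPlayerPosition(state.target);
  state.target = null;
}
```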

From the Slater, Usoh, and Steed (1995) article:

A fundamental requirement for an effective virtual reality is, therefore, that there is a consistency between proprioceptive information and sensory feedback, and in particular, between the mental body model and the virtual body…Proprioception is “the continuous, but unconscious sensory flow from the movable parts of our body (muscles, tendons, joints) by which their position and tone and motion [are] continually monitored and adjusted, but in a way that is hidden from us because it is automatic and unconscious.” (Sacks 1985).  Proprioception allows us to form a mental model that describes the dynamic spatial and relational disposition of our body and its parts. We know where our left foot is (without having to look) by tapping into this body model. We can clap our two hands together (with closed eyes) similarly by relying on this unconscious mental model formed from the proprioceptive data flow.

The control groups (the “pointers”) navigated the environment using a 3D mouse, initiating movement by pressing a button, with direction of movement controlled by pointing. The experimental groups (the “walkers”) used the walking technique. In each case the mouse was also used for grasping objects. The task was to pick up an object, take it into a room, and place it on a particular chair. The chair was placed in such a way that the subjects had to cross a chasm over another room about 20 feet below in order to reach it…With respect to the ease of navigating the environment, subjects in both experiments marginally preferred to use the pointing technique. This result was not surprising: as Brooks et al. [1992] noted, with the real treadmill more energy certainly is required to use the whole body in a walking activity, compared to pressing a mouse button or making a hand gesture (or driving a car, with respect to the similar comparison in everyday reality).

This is quite interesting…People found that the mouse was EASIER, but walking was more natural.  More evidence that the best experiences might not be the easiest ones.

Other results showed that “for the “walkers” the greater their association with the virtual body the higher the presence score, whereas for the “pointers” there was no correlation between virtual body association and the presence score. In other words, participants who identified strongly with the virtual body had a greater degree of reported presence if they were in the “walking” group than if they were in the “pointing” group. Association with the virtual body is important…We argue that the walking technique [helps people match their proprioception to their sensory information,] compared to the pointing technique, and therefore other things being equal should result in a greater sense of presence. However, we found that this is modified by the degree of association of the individual with the virtual body…The virtual body association is significantly positively correlated with a subjective presence for the walkers but not for the pointers, which is certainly consistent with the proposed model.”

This is important because it means that in Samsung Gear VR experiences where a person is represented as a black hole and has no virtual body, presence is going to be very difficult to create.

Beyond Walking: The Influence of Bending, Standing, and Task Complexity on Presence

The researchers of the walkers vs. pointers study conducted a follow-up experiment in which they asked people to “walk” in place through a forest. They varied the height of the trees, so that participants in the high-variability (vs. low-variability) condition had to bend down and look up more.

The results showed a significant positive association between reported presence and the amount of body movement, in particular head yaw, and the extent to which subjects bent down and stood up…The practical importance of the results of this experiment is that since there does seem to be a relationship between body movement and presence, it is a reasonable goal to design interactive paradigms that are based on semantically appropriate whole body gestures. These will not only seem more ‘natural’, but may also increase presence. We further believe that the increase in presence in itself will engender more body movement, which in turn will generate higher presence, and so on.

Interestingly, adding a layer of cognitive effort did not increase users’ feelings of presence. The researchers manipulated task complexity by asking some participants to count the number of trees that they saw and to remember the distribution of diseased trees. Exerting this mental effort produced no increase in presence.
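If you want to act on this finding, one low-cost option is to log how much users actually move their bodies. The sketch below accumulates head yaw and vertical head movement, the two signals the study tied to presence; the sampling hook and units are assumptions, and it ignores yaw wrap-around for brevity.

```typescript
// A sketch for logging the two body-movement signals the study linked to
// presence: head yaw and vertical (bend down / stand up) movement.
// How and when you sample these values is up to your engine; this only
// shows the accumulation step.
interface MovementStats {
  totalYawRadians: number;
  totalVerticalMeters: number;
}

function accumulateMovement(
  stats: MovementStats,
  prevYaw: number, currYaw: number,     // head yaw in radians (wrap-around at ±π ignored here)
  prevHeadY: number, currHeadY: number  // head height in meters
): MovementStats {
  return {
    totalYawRadians: stats.totalYawRadians + Math.abs(currYaw - prevYaw),
    totalVerticalMeters: stats.totalVerticalMeters + Math.abs(currHeadY - prevHeadY),
  };
}
```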

Head Yaw is Good for Presence

Using Walking in Place to Make Stairs and Ladders

If you are building a Harry Potter wizarding experience, then flying or teleporting might be the best locomotion. However, if you are building an educational simulation, such as training firefighters, consider integrating more humdrum actions.

The same idea [of walking-in-place] can be applied to the problem of navigating steps and ladders. One alternative is to use the familiar pointing technique and to “fly.” While in some applications there may be a place for such magical activity, the very fact that mundane objects such as steps and ladders are in the environment would indicate that a more-mundane method of locomotion be employed. The walking-in-place technique carries over in a straightforward manner to this problem.

When the collision detection process in the virtual reality system detects a collision with the bottom step of a staircase, continued walking will move the participant up the steps. Walking down the steps is achieved by turning around and continuing to walk. If at any moment the participant’s virtual legs move off the steps (should this be possible in the application), then they would “fall” to the ground immediately below. Since walking backward down steps is something usually avoided, we do not provide any special means for doing this. However, it would be easy to support backward walking and walking backward down steps by taking into account the position of the hand in relation to body line: a hand behind the body would result in backward walking.

Ladders are slightly different; once the person has ascended part of the ladder, they might decide to descend at any moment. In the case of steps, the participant would naturally turn around to descend. Obviously this does not make sense for ladders. Also, when climbing ladders it is usual for the hands to be used. Therefore, in order to indicate ascent or descent of the ladder, hand position is taken into account. While carrying out the walking-in-place behavior on a ladder, if the hand is above the head then the participant will ascend the ladder and descend when below the head. Once again it is a whole-body gesture, rather than simply the use of the hand, that is required in order to achieve the required result in an intuitive manner. If at any time the virtual legs come off the rungs of the ladder, then the climber will “fall” to the ground below.
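Translated into code, the stair and ladder rules described above might look roughly like this. Everything here (the surface classification, the walking-in-place flag, the hand and head heights) is a hypothetical stand-in for systems the paper assumes already exist.

```typescript
// A sketch of the stair/ladder rules described above, assuming a
// walking-in-place detector and simple collision queries already exist.
// All names are hypothetical.
type Surface = "ground" | "stairs" | "ladder";

interface BodyPose {
  walkingInPlace: boolean; // from the walking-in-place detector
  handY: number;           // tracked hand height, meters
  headY: number;           // head height, meters
}

function verticalStep(surface: Surface, pose: BodyPose, stepSize: number): number {
  if (!pose.walkingInPlace) return 0;
  switch (surface) {
    case "stairs":
      // Continued walking moves the user up; turning around (handled by the
      // gaze-directed locomotion code) walks them back down.
      return +stepSize;
    case "ladder":
      // On ladders, hand position disambiguates: a hand above the head
      // climbs, a hand below the head descends.
      return pose.handY > pose.headY ? +stepSize : -stepSize;
    default:
      return 0;
  }
}
```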

Key Takeaways to Maximize Presence in VR

  • Presence is defined as the user’s sense of “being there” inside of a simulated environment.  
  • The way that you believe you can interact with your environment is just as important as what you see in VR.
  • Walking in place is a locomotion and navigation metaphor that has been shown to increase presence.
  • There is evidence that using body movements such as walking, bending down, and moving your head also heightens a sense of presence.
  • Cognitive complexity does not increase a sense of presence.  

Further reading

Sacks, Oliver. (1985). The Man Who Mistook His Wife for a Hat. Picador, London.

Slater, M., McCarthy, J., & Maringelli, F. (1998). The influence of body movement on subjective presence in virtual environments. Human Factors: The Journal of the Human Factors and Ergonomics Society, 40(3), 469-477.

Slater, M., Usoh, M., & Steed, A. (1995). Taking steps: the influence of a walking technique on presence in virtual reality. ACM Transactions on Computer-Human Interaction (TOCHI), 2(3), 201-219.

 

Metaphors are Jet Fuel

Ceci n’est pas une brush

VR/AR experiences will live or die by how quickly a user can learn how the experience works.  Thankfully, designers have an arsenal of tools in the form of metaphor to help speed the acquisition of knowledge.  Metaphors are cognitive shortcuts.  If I tell you, “I had a rough morning,” you know exactly what I mean. 

In Google Tilt Brush, you can choose the type of brush that you want from the palette.  However, you’re not choosing an actual brush.  You are choosing the type of line/pattern/texture that you want to use. Metaphors, or words applied to objects, actions, or concepts to which they are not literally applicable, are extremely efficient means of communicating complex ideas.  

“For instance, if I say “This is a Trash Bin,” you may not know a computer’s file management system or directory structures, but you’ve got a pretty good idea of how trash bins work, so you can deduce that the unwanted files go in the trash bin, and you’ll be able to retrieve them until the bin is emptied. Metaphors are assistive devices for understanding.”
— Frank Chimero

VR/AR offer tremendous possibility in creating entirely new experiences and environments.  The primary obstacle that designers and developers will encounter is that they can only design for the speed of the user’s understanding.

“When virtual objects and actions in an app are metaphors for familiar experiences - whether these experiences are rooted in the real world or the digital world - users quickly grasp how to use the app.”
— iOS Human Interface Guidelines, 2015

Learning on Jet Fuel

Metaphors that "embody," or perfectly represent, an idea will communicate those ideas faster.  Same for qualities, feelings, etc.  Furthermore, exposure to metaphor can be incidental!  It doesn’t have to be something that users are conscious of in order to influence them. Researchers from MIT, Harvard and Yale tested how mere exposure to smoothness or coarseness affected people’s subsequent decision-making.

One half of the participants completed a jigsaw puzzle with a smooth surface.  The other half completed the same jigsaw puzzle, except that it was covered in sandpaper.  After participants finished the jigsaw puzzle, they were given a second task to read and evaluate a story.  They were told that this was an unrelated task.  However, the researchers were really studying how the haptics of touching a smooth or a rough puzzle would change people’s judgments of the story.  In American culture, roughness is associated with coarseness, difficulty, and harshness.

Here’s how the researchers described the rest of their study:

“After the puzzle task, participants read a scenario describing an interaction between two people and formed impressions about the nature of this interaction. This passage described both positive components (e.g., kidding around) and negative components (e.g., exchange of sharp words) of a social interaction and thus was ambiguous as to the overall tenor of the interaction…After reading, participants rated whether the social interaction was: adversarial/friendly, competitive/cooperative, a discussion/argument, and whether the target people were on the same side/on opposite sides using 1-9 scales.

Results indicated that participants who completed the rough puzzle rated the interaction as less coordinated (more difficult and harsh) than did participants who completed the smooth puzzle, F(1, 62) = 5.15, P = 0.027. Thus, roughness specifically changed evaluations of social coordination, consistent with a 'rough' metaphor.”

Experience of texture influences unrelated task of judging social interaction.  A rough texture led to more competition and less coordination.   Half of the participants touched the rough textured puzzle and the other half touched the smooth textured puzzle.  Next they read a story about an ambiguous social interaction.  The people who touched the rough texture believed that the interaction was more difficult and harsh.  

So why would touching a smooth or a rough puzzle piece change an American’s subsequent evaluations of a social experience?  Likely because of how essential it is to the following metaphors:

  • Having a rough day
  • Using coarse language
  • Being rough around the edges
  • Acting as a smooth operator

This study is an example of how sensory input (touching rough or smooth puzzle pieces) affected people’s subsequent decision-making. The puzzle pieces acted as embodiments (e.g., tangible representations) of social dynamics. The lesson for people making VR/AR experiences is that people are making automatic associations with everything that they see, touch, and hear. Users are in a constant state of monitoring their environment and taking in new information.  Metaphors are a powerful cognitive shortcut to help users learn what your world is and how to navigate it. 

Metaphors can help you communicate abstract information quickly.  Consider which metaphors capture the experience that you want to create and how you want your user to feel.  Then work backward from there to decide how to represent it.  For example, if I wanted to create a narrative about love, the metaphor “Love is a Journey” would be extremely useful.  Consider the following examples used to describe love (and its challenges):

  • Look how far we’ve come.
  • We have a long way to go.
  • It’s been a long, bumpy road.
  • We can’t turn back now.
  • We’re at a crossroads.
  • We may have to go our separate ways.
  • This relationship isn’t going anywhere.
  • We’re spinning our wheels.
  • Our relationship is on the rocks.
  • This relationship has hit a dead-end street.

This list of metaphors likely triggered ideas for how to represent love inside of a digital experience.  Hopefully, it also provided an example of taking an abstract concept and representing it in a way that users can quickly grasp.  For additional advice on metaphor, I recommend Maggie Appleton's article, Why Metaphors Matter for App Designers. It's not specific to VR/AR, but it contains some nuggets.  

Further reading

Ackerman, J. M., Nocera, C. C., & Bargh, J. A. (2010). Incidental haptic sensations influence social judgments and decisions. Science,328(5986), 1712-1715.

Chimero, Frank.  What Screens Want.  Build conference Belfast.  Nov 14, 2013.  

 iOS Human Interface Guidelines, 2015