I’m not too sure how, but this video gave me the idea of expanding my concept from an experience in a single room to an experience within a whole building… A story told within a building rather than a room.
I started to think about how imagery around the building could change as the player moves: they could walk into a void and come out the other side with something changed (e.g. blue-tinted vision turning red after passing through an object), and signage could direct them to specific locations within the building. But how would all this work? Would I have to build the building in 3D software? Mapping (a HoloLens type of thing)? Or constant AR ‘trigger images’ for arrows? How much memory and power would this need/consume? Too complicated? How do I even test this?
- Test 1 – Map a small area in 3D software and see if I can move an object around it in an AR prototype.
- Test 2 – Try to set up player-positioning triggers within a room, i.e. when the player reaches a very specific location, run a given animation or piece of code.
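Test 2 could be sketched out before touching any engine. Here is a rough illustration in Python of the logic (the real prototype would use C# in Unity, e.g. trigger colliders); the `PositionTrigger` class, its names, and the example coordinates are all my own inventions for the sketch, not from any AR framework:

```python
import math

class PositionTrigger:
    """Fires an action once when the player comes within radius of a point."""

    def __init__(self, x, y, radius, action):
        self.x, self.y = x, y    # trigger location (e.g. metres from origin)
        self.radius = radius     # how close the player must get
        self.action = action     # the animation/code to run
        self.fired = False       # ensure the trigger only fires once

    def update(self, player_x, player_y):
        # Call every frame with the player's tracked position
        if not self.fired and math.hypot(player_x - self.x,
                                         player_y - self.y) <= self.radius:
            self.fired = True
            self.action()

# Usage: run a (pretend) animation when the player reaches (5, 5)
events = []
trigger = PositionTrigger(5.0, 5.0, 0.5, lambda: events.append("play_red_tint"))
for pos in [(0.0, 0.0), (3.0, 3.0), (5.1, 4.9)]:
    trigger.update(*pos)
# events now holds ["play_red_tint"] — fired only at the last position
```

The one-shot `fired` flag matters: without it, the animation would restart every frame the player stands inside the trigger zone.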
If I were to place specific things (Vuforia targets that display videos when triggered, for example) in specific areas of the building and direct the user towards them in AR mode (so that the camera is on), this would eliminate the need for any type of mapping. Essentially this could be done anywhere, as long as the targets are set up in the desired way.
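The targets idea boils down to a lookup: each recognised trigger image maps to an action. A minimal sketch, assuming hypothetical target names and actions of my own (in a real build this would be wired to Vuforia's target-found callbacks in Unity):

```python
# Invented target names/actions for illustration only
TARGET_ACTIONS = {
    "arrow_stairwell": "show arrow pointing to stairwell",
    "door_poster": "play intro video",
    "help_sign": "open help overlay",
}

def on_target_found(target_name):
    """Called when the AR camera recognises a trigger image."""
    return TARGET_ACTIONS.get(target_name, "no action for this target")

# Usage: the camera spots the poster taped to a door
result = on_target_found("door_poster")
```

Because the behaviour lives in the lookup rather than in a 3D map of the building, the same set of printed targets could be re-taped up in any room or building, which is what makes the “this could be done anywhere” point above work.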
Here is an example of how I could lead the user around a room or building in AR mode:
- AR Mode – the camera is on, enabling the user to see the real world
- VR Mode – traps the user in a fully simulated environment
- What happens if the user turns the arrow (trigger sheet) the wrong way round? The trigger may need to be fixed in place, e.g. taped down on the ground or to a wall
- This could also be a method of triggering communication/assistance for the user – there could be a ‘help’ area to help solve issues they’re having in the game.