Cross Reality Views Via an Unmanned Aerial Vehicle

Aaron C. Hitchcock

The fields of Augmented Reality (AR) and Virtual Reality (VR) have consistently focused on first-person experiences. Using wearable devices such as the Microsoft HoloLens, Oculus Rift, or Google Cardboard, users can enter and interact with a virtual space. The Augmented Space Library (ASL) [1], developed by the Cross Reality Collaboration Sandbox (CRCS) research group [2], seeks to combine physical and virtual spaces with virtual objects and allows multiple users, both local and remote, to interact. The capabilities of both technologies can be expanded with a remote-controlled camera, enabling third-person or remote first-person viewing. This project creates a system and corresponding API for integrating these views into AR and VR applications as well as the ASL system. The system gives a user the ability to navigate and explore a remote physical space in real time, or to see themselves and their surroundings in third person. Functionally, this requires remote control of a highly maneuverable camera, which was achieved with a drone, or Unmanned Aerial Vehicle (UAV). These new virtual views allow for new forms of AR/VR interaction, as all prior physical points of view were bound to the user.
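To make the third-person viewing idea concrete, the sketch below shows one way a drone camera could be positioned relative to a tracked user: place the camera a fixed distance behind and above the user's heading so that the user appears in frame. This is a hypothetical illustration, not the project's actual API; the function name, coordinate convention (y-up), and parameters are assumptions.

```python
import math

def third_person_pose(user_pos, user_yaw_deg, distance=3.0, height=1.5):
    """Compute a drone camera pose behind and above a tracked user.

    user_pos: (x, y, z) user position, y-up coordinates (assumed convention).
    user_yaw_deg: user heading in degrees about the vertical axis.
    Returns (camera_pos, camera_yaw_deg); the camera shares the user's
    heading so it looks over the user's shoulder.
    """
    yaw = math.radians(user_yaw_deg)
    # Forward unit vector on the ground plane for the given heading.
    fx, fz = math.sin(yaw), math.cos(yaw)
    # Offset the camera `distance` behind the user and `height` above.
    camera_pos = (user_pos[0] - fx * distance,
                  user_pos[1] + height,
                  user_pos[2] - fz * distance)
    return camera_pos, user_yaw_deg

# Example: a user at the origin facing along +x (yaw 90 degrees).
cam, cam_yaw = third_person_pose((0.0, 0.0, 0.0), 90.0)
```

In a real system, a pose like this would be sent as a setpoint to the UAV's flight controller each frame, while the drone's video feed is rendered into the AR/VR scene.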


Yusuf Pisan, Computing & Software Systems (CSS), University of Washington Bothell