Al Zayer

PhD Student
Virtual Reality Lab
University of Nevada, Reno

Download CV

About Me

I am a researcher in the making at the University of Nevada, Reno. For the last two years I have been working with Dr. Eelke Folmer at the Virtual Reality Lab to push the boundaries of what we currently know about navigating virtual environments. My research is especially concerned with making VR locomotion equally effective and enjoyable for all users, irrespective of their individual differences. I also have research interests in accessibility computing and social network analysis. Prior to joining UNR, I was a software engineer for almost 8 years at Saudi Aramco's EXPEC Computer Center. I am proud to have received my software engineering education at Carnegie Mellon University and King Fahd University of Petroleum and Minerals.

Research Portfolio

PAWdio: Hand Input for Mobile VR using Acoustic Sensing

Hand input offers a natural, efficient, and immersive form of input for virtual reality (VR), but it has been difficult to implement on mobile VR platforms. Accurate hand tracking requires a depth sensor, and performing computer vision on a smartphone is computationally intensive, which may degrade the frame rate of a VR simulation and drain battery life. PAWdio is a novel 1-degree-of-freedom (DOF) hand input technique that uses acoustic sensing to track the position of an earbud, held in the user's hand, relative to the headset. PAWdio requires no additional instrumentation, and its low computational overhead assures a high frame rate. Published in CHI PLAY '16 (Best Note Award).
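To give a flavor of how phase-based acoustic tracking of this kind can work, here is a minimal sketch, not PAWdio's actual implementation: the phone plays an inaudible tone through the earbud, and the phase of that tone at the phone's microphone shifts as the hand moves, with one full phase cycle per wavelength of path change. The carrier frequency, sample rate, and framing below are illustrative assumptions.

    import numpy as np

    SPEED_OF_SOUND = 343.0   # m/s in air at roughly 20 C
    CARRIER_HZ = 20_000.0    # assumed inaudible tone played through the earbud
    SAMPLE_RATE = 48_000     # assumed microphone sampling rate

    def frame_phase(frame):
        """Phase of the carrier within one microphone frame (complex demodulation)."""
        t = np.arange(len(frame)) / SAMPLE_RATE
        baseband = np.sum(frame * np.exp(-2j * np.pi * CARRIER_HZ * t))
        return np.angle(baseband)

    def relative_displacement(frames):
        """Cumulative change in earbud-to-phone distance (metres), frame by frame."""
        phases = np.unwrap([frame_phase(f) for f in frames])
        wavelength = SPEED_OF_SOUND / CARRIER_HZ   # about 1.7 cm at 20 kHz
        # One full 2*pi phase cycle corresponds to one wavelength of path change.
        return (phases - phases[0]) / (2 * np.pi) * wavelength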

StereoTrack: 180° Inside-Out Positional Tracking for Mobile VR using Acoustic Sensing

Positional tracking ensures a high sense of presence in virtual reality (VR). Though mobile VR has great potential to bring VR to the masses, enabling positional tracking on mobile VR platforms has been a challenge. Existing implementations often rely on computer vision or require special sensors and are often computationally intensive, resulting in a low frame rate and reduced battery life. StereoTrack is a 180° positional tracking method that uses acoustic sensing; it has a low computational overhead and is minimalistic in terms of required hardware (a pair of speakers), which may allow for large-scale adoption. The project is under development.
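As a rough illustration of why a pair of speakers suffices for 180° tracking, consider a minimal trilateration sketch under an assumed geometry (the actual ranging and filtering in StereoTrack are not shown): once the acoustic ranges from the headset to two speakers at known positions are estimated, the 2D position follows directly, with an inherent front/back ambiguity that restricts coverage to the half-plane in front of the speakers.

    import math

    def position_from_ranges(d1, d2, b):
        """2-D headset position from acoustic ranges to two speakers at (-b, 0) and (b, 0).

        The mirror solution behind the speaker axis is indistinguishable,
        which is why two speakers can only cover a 180-degree area in front.
        """
        x = (d1 ** 2 - d2 ** 2) / (4 * b)
        y_squared = d1 ** 2 - (x + b) ** 2
        if y_squared < 0:        # ranges inconsistent with the geometry (noise)
            y_squared = 0.0
        return x, math.sqrt(y_squared)

    # Example: equal ranges put the headset on the centreline in front of the speakers.
    print(position_from_ranges(2.0, 2.0, 0.5))   # (0.0, ~1.94)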

Handsfree Omnidirectional VR Navigation using Head Tilt

Navigating mobile virtual reality (VR) is a challenge due to limited input options and/or a requirement for handsfree interaction. Walking-in-place (WIP) is considered to offer a higher presence than controller input, but it only allows unidirectional navigation in the direction of the user's gaze, which impedes navigation efficiency. Leaning input enables omnidirectional navigation but currently relies on bulky controllers, which aren't feasible in mobile VR contexts. This note evaluates the use of head tilt, implemented using inertial sensing, to allow for handsfree omnidirectional VR navigation on mobile VR platforms. A user study with 24 subjects compared three input methods using an obstacle-avoidance navigation task: (1) head tilt alone (TILT); (2) a hybrid method (WIP-TILT) that uses head tilting for direction and WIP to control speed; and (3) traditional controller input. TILT was significantly faster than WIP-TILT and controller input, while WIP-TILT and TILT offered the highest presence. There was no difference in cybersickness between input methods. Published in CHI '17.
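A minimal sketch of how head tilt can be mapped to omnidirectional motion follows; the dead zone, saturation angle, and speed are illustrative assumptions, not the values used in the study.

    import math

    DEAD_ZONE_DEG = 5.0    # assumed: small tilts are ignored so the user can look around
    MAX_TILT_DEG = 30.0    # assumed: tilt angle at which speed saturates
    MAX_SPEED = 2.0        # assumed: maximum virtual speed in metres per second

    def velocity_from_tilt(pitch_deg, roll_deg):
        """Map head tilt from the headset's inertial sensors to a 2-D velocity.

        Pitch (nodding forward/back) drives forward/backward motion and roll
        (tilting left/right) drives strafing, decoupling travel direction
        from gaze so navigation is omnidirectional and handsfree.
        """
        magnitude = math.hypot(pitch_deg, roll_deg)
        if magnitude < DEAD_ZONE_DEG:
            return 0.0, 0.0
        # Scale speed linearly between the dead zone and the saturation angle.
        scale = min((magnitude - DEAD_ZONE_DEG) / (MAX_TILT_DEG - DEAD_ZONE_DEG), 1.0)
        return (scale * MAX_SPEED * pitch_deg / magnitude,
                scale * MAX_SPEED * roll_deg / magnitude)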

Exploring the Use of a Drone to Guide Blind Runners

People with visual impairments have a hard time getting consistent physical exercise, as they cannot do some exercises, such as running outside, without a sighted guide. People with visual impairments have been shown to have stronger spatial localization skills than sighted people, which led us to believe that they could follow a drone in a running-track environment. This paper presents a feasibility study in which we investigate the ability of people with visual impairments to localize and follow a low-cost flying drone. A Wizard of Oz style study was conducted with 2 blind participants. Our results indicate that blind individuals can accurately localize and follow the drone. Qualitative results also indicate that the participants were comfortable following the drone and reported high efficacy in localizing and following it. The study supports future development of a fully functioning prototype. Published in ASSETS '16.

Analyzing the Use of Twitter to Disseminate Visual Impairments Awareness Information

People with visual impairments have been surrounded by myths and misconceptions that have made it challenging for them to live as productive members of society and have partly contributed to the substandard living conditions in which many of them live. To shift the public's focus from the disability to the abilities of the visually impaired, non-profit organizations and government agencies have conducted a series of campaigns to spread awareness about visual impairments. Social media platforms, such as Twitter, have become the primary channel for diffusing information to the public. The goal of this paper is to analyze the use of Twitter as a social media platform to spread information during visual impairments awareness campaigns. We focused our analysis on five key concerns: (i) the characteristics of active users during the event, (ii) the major players in information dissemination, (iii) the common topics of tweets, (iv) the reachability of information, and (v) the temporal tweeting behavior. We report the results of the campaigns along with recommendations for designing an effective campaign communication strategy. Published in ASONAM '17.
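For illustration, identifying the major players in information dissemination can be approximated by a centrality analysis of the retweet network; this is a sketch assuming a simplified tweet schema, and the paper's exact metrics may differ.

    import networkx as nx

    def top_influencers(tweets, k=10):
        """Rank users by in-degree centrality of the retweet network.

        `tweets` is assumed to be a list of dicts with a 'user' field and,
        for retweets, a 'retweeted_user' field (a simplified Twitter schema).
        """
        graph = nx.DiGraph()
        for tweet in tweets:
            if "retweeted_user" in tweet:
                # Edge from retweeter to original author, so authors whose
                # content spreads widely accumulate in-degree.
                graph.add_edge(tweet["user"], tweet["retweeted_user"])
        centrality = nx.in_degree_centrality(graph)
        return sorted(centrality.items(), key=lambda kv: kv[1], reverse=True)[:k]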

A Vibrotactile Navigation Guide for the Visually Impaired in Unmapped Outdoor Spaces

Outdoor navigation applications such as Google Maps are reliable enough to move people from one point to another, until one reaches an unmapped outdoor space such as an old market square. The lack of sufficient landmarks and navigation aids in such open spaces means that egocentric navigation is used more often than its counterpart, allocentric navigation. WaypointsTracer is a smartphone app that augments the egocentric navigation capabilities of people with visual impairments in unmapped outdoor spaces. When started, the app drops frequent "breadcrumbs" that form a geo-history of the user's navigated trajectory. This history comes in handy when the user wants to trace a previously navigated trajectory back to their initial position. Guidance to the next historical point is provided in the form of four encoded vibrotactile pulses, each representing a discrete relative waypoint position: to your left, to your right, in front of you, and behind you. A motivating example scenario, a brief technical specification, and the source code are available in the project's repository on GitHub.
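A minimal sketch of the direction-classification step is shown below; the great-circle bearing formula is standard, while the four 90° sectors and the pulse counts are assumptions for illustration, not necessarily the app's exact encoding.

    import math

    def waypoint_direction(user_lat, user_lon, heading_deg, wp_lat, wp_lon):
        """Classify the next breadcrumb as in front of, behind, left of, or right of the user.

        Uses the standard great-circle initial-bearing formula; the four
        90-degree sectors and pulse counts are illustrative assumptions.
        """
        lat1, lat2 = math.radians(user_lat), math.radians(wp_lat)
        dlon = math.radians(wp_lon - user_lon)
        bearing = math.degrees(math.atan2(
            math.sin(dlon) * math.cos(lat2),
            math.cos(lat1) * math.sin(lat2)
            - math.sin(lat1) * math.cos(lat2) * math.cos(dlon)))
        relative = (bearing - heading_deg + 360.0) % 360.0   # waypoint angle vs. gaze
        if relative < 45 or relative >= 315:
            return "in front of you"   # e.g. one pulse
        if relative < 135:
            return "to your right"     # e.g. two pulses
        if relative < 225:
            return "behind you"        # e.g. three pulses
        return "to your left"          # e.g. four pulses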