VIZSSTA: A HYBRID TABLET AND AUGMENTED REALITY INTERFACE FOR SPACE SYNTAX DATA ANALYSIS
Date
2022-08-04
Authors
Kaur, Ramanpreet
Abstract
I developed VizSSTA, a hybrid tablet and head-worn augmented reality (AR) interface designed to support analysis of Space Syntax data. Space Syntax is a family of quantitative approaches for characterizing physical environments and predicting how they will be used. VizSSTA provides an interactive floorplan on a tablet display, renders related space syntax analysis data in AR layers above the tablet display, and uses the area around the display to render additional floorplans (such as different floors of a building) or an expanded floorplan in AR. In a within-subjects comparative study (n=48), I explored how layers above the display promote understanding of how two spatial attributes (Openness and Visual Complexity) relate to each other and to the raw visibility ("isovist") data, how isovist perimeter and connectivity are related, and how well participants can identify regions with similar space syntax attributes across large floorplans. In the study, I compared VizSSTA against a tablet-only interface. Quantitative and qualitative results indicate that VizSSTA helped participants comprehend the space syntax attributes and their interrelationships. VizSSTA yielded higher accuracy on tasks involving identifying how isovist shape and size relate to openness and visual complexity, facilitated detection of similar regions across a large floorplan, and enhanced comprehension of how isovist perimeter and connectivity correlate. A number of limitations of the current implementation of VizSSTA are explored, including ergonomic issues with the AR headset, and related modifications are proposed.
Keywords
Augmented Reality, Hybrid system, Space Syntax