ABSTRACT

Authoring site-specific outdoor augmented reality (AR) experiences requires a nuanced understanding of real-world context to create immersive and relevant content. Existing ex-situ authoring tools typically rely on static 3D models to represent spatial information. However, in our formative study (n=25), we identified key limitations of this approach: models are often outdated, incomplete, or insufficient for capturing critical factors such as safety considerations, user flow, and dynamic environmental changes. These issues necessitate frequent on-site visits and additional iterations, making the authoring process more time-consuming and resource-intensive. To mitigate these challenges, we introduce CoCreatAR, an asymmetric collaborative mixed reality authoring system that integrates the flexibility of ex-situ workflows with the immediate contextual awareness of in-situ authoring. We conducted an exploratory study (n=32) comparing CoCreatAR to an asynchronous baseline workflow, finding that CoCreatAR enhanced engagement, creativity, and confidence in the authored output; the study also yielded preliminary insights into its impact on task load. We conclude by discussing the implications of our findings for integrating real-world context into site-specific AR authoring systems.

Full-text PDF · Niantic project page · arXiv page


Video

CoCreatAR System

CoCreatAR is a collaborative authoring system for outdoor AR experiences, designed to facilitate real-time interaction between ex-situ developers and in-situ collaborators. The system enables ex-situ creators, who typically design and develop the experience asynchronously within Unity, to collaborate synchronously with in-situ users who experience the AR content directly in the field. By integrating real-time communication, contextual reference tools, and spatial data capture, CoCreatAR aims to reduce the need for repeated on-site visits during the iterative design of site-specific AR experiences.
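The paper does not include source code, but a minimal Unity C# sketch can illustrate how per-object state synchronization between the ex-situ editor and the in-situ client might look. The INetworkChannel transport, the NetworkedTransform component, and the message format below are illustrative assumptions, not CoCreatAR's actual networking layer.

// Hypothetical sketch: broadcasts local transform changes of authored objects
// so the other client sees edits in near real time. INetworkChannel and the
// field names are assumptions for illustration only.
using UnityEngine;

public interface INetworkChannel
{
    void Send(string objectId, Vector3 position, Quaternion rotation, Vector3 scale);
}

public class NetworkedTransform : MonoBehaviour
{
    public string objectId;            // stable ID shared by both clients
    public INetworkChannel channel;    // injected transport (assumed)
    const float sendInterval = 0.1f;   // throttle updates to ~10 Hz
    float lastSendTime;
    Vector3 lastPosition;
    Quaternion lastRotation;

    void Update()
    {
        // Only send when the transform actually changed and the interval elapsed.
        bool moved = transform.position != lastPosition ||
                     transform.rotation != lastRotation;
        if (moved && Time.time - lastSendTime >= sendInterval)
        {
            channel.Send(objectId, transform.position, transform.rotation,
                         transform.localScale);
            lastPosition = transform.position;
            lastRotation = transform.rotation;
            lastSendTime = Time.time;
        }
    }
}

Throttling by interval rather than sending every frame is a common design choice for this kind of live-editing link, trading a small amount of latency for much lower bandwidth.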

The five main site-specific AR issue types we aimed to address with CoCreatAR. We refer to our paper for further details on the full range of issues identified in our formative study.


Overview of CoCreatAR feature usage during Phase 1, shown as ex-situ perspective screenshots. Participant conversations are shown in color-coded speech bubbles: in-situ (green) and ex-situ (blue). Speech bubbles with a glow indicate utterances made at the moment of the screenshot. (A) Alignment of a boombox based on spatial context captured using the 3D Snapshot feature; (B) The ex-situ participant moving the map to a position on the wall as specified by the in-situ participant through Surface Drawing; (C) Alignment of a garland to a previously unmapped region of the street using the Coarse 3D Mesh feature; (D) Alignment of misplaced food items based on in-situ input using the 3D Cursor.

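As an illustration of how an interaction like Surface Drawing (B) might work on the in-situ side, the following hedged Unity C# sketch raycasts screen touches against the reconstructed environment and accumulates the hit points into a stroke. The SurfaceDrawing component, its fields, and the assumption that captured meshes carry colliders are all hypothetical.

// Hypothetical sketch of a Surface Drawing interaction: touches are raycast
// against scanned environment geometry and the world-space hits form a stroke
// that could then be replicated to the ex-situ editor.
using System.Collections.Generic;
using UnityEngine;

public class SurfaceDrawing : MonoBehaviour
{
    public Camera arCamera;              // the in-situ device camera
    public LineRenderer strokeRenderer;  // visualizes the stroke
    readonly List<Vector3> strokePoints = new List<Vector3>();

    void Update()
    {
        if (Input.touchCount == 0) return;
        Touch touch = Input.GetTouch(0);
        if (touch.phase != TouchPhase.Began && touch.phase != TouchPhase.Moved)
            return;

        // Project the touch into the scene; keep the hit if it lands on
        // reconstructed environment geometry (assumed to have colliders).
        Ray ray = arCamera.ScreenPointToRay(touch.position);
        if (Physics.Raycast(ray, out RaycastHit hit, 50f))
        {
            // Offset slightly along the normal to avoid z-fighting with the surface.
            strokePoints.Add(hit.point + hit.normal * 0.005f);
            strokeRenderer.positionCount = strokePoints.Count;
            strokeRenderer.SetPositions(strokePoints.ToArray());
        }
    }
}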

Ex-situ user interface of CoCreatAR. (A) All objects under NetworkedScene are automatically synchronized between ex-situ and in-situ users; (B) 3D Snapshots captured by the in-situ user; (C) The location mesh of Location A; (D) Coarse Mesh captured by the in-situ user; (E) Live feed of the in-situ user’s screen, including AR content; (F) The 3D Cursor of the ex-situ user, projected into world space; (G) Close-up of the 3D Cursor of the ex-situ user as seen in the scene view; (H) List of annotations and spatial captures, persistently saved in the scene for later review; (I) Sample assets that can be added to the scene at runtime.

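The 3D Cursor (F, G) suggests a straightforward construction: unproject the ex-situ user's mouse position into the scene and place a marker wherever the ray hits captured geometry such as the Coarse Mesh or a 3D Snapshot. The sketch below assumes those meshes have colliders; ExSituCursor and its fields are illustrative names, not the system's actual implementation.

// Hypothetical sketch of the ex-situ 3D Cursor: the mouse position in the
// ex-situ view is raycast against captured environment geometry, and a marker
// is placed at the hit point so the in-situ user can see what the ex-situ
// user is pointing at. Replication of the marker is handled elsewhere.
using UnityEngine;

public class ExSituCursor : MonoBehaviour
{
    public Camera exSituCamera;     // camera rendering the ex-situ view
    public Transform cursorMarker;  // visual marker shared with the in-situ client

    void Update()
    {
        // Unproject the mouse into the scene; colliders on the captured
        // meshes make the environment hit-testable.
        Ray ray = exSituCamera.ScreenPointToRay(Input.mousePosition);
        if (Physics.Raycast(ray, out RaycastHit hit, 100f))
        {
            cursorMarker.position = hit.point;
            // Orient the marker to sit flush against the surface it hit.
            cursorMarker.rotation = Quaternion.LookRotation(hit.normal);
        }
    }
}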

System Usage

Overview of CoCreatAR feature usage during Phase 1, shown as ex-situ perspective screen recordings. (A) Alignment of a boombox based on spatial context captured using the 3D Snapshot feature; (B) The ex-situ participant moving the map to a position on the wall as specified by the in-situ participant through Surface Drawing; (C) Alignment of a garland to a previously unmapped region of the street using the Coarse 3D Mesh feature; (D) Alignment of misplaced food items based on in-situ input using the 3D Cursor.


CITING

@inproceedings{numanCoCreatAREnhancingAuthoring2025,
  title = {{CoCreatAR}: {Enhancing Authoring of Outdoor Augmented Reality Experiences Through Asymmetric Collaboration}},
  shorttitle = {{CoCreatAR}: {Enhancing Authoring of Outdoor AR Experiences Through Asymmetric Collaboration}},
  booktitle = {{Proceedings of the 2025 CHI Conference on Human Factors in Computing Systems}},
  author = {Numan, Nels and Brostow, Gabriel and Park, Suhyun and Julier, Simon and Steed, Anthony and Van Brummelen, Jessica},
  year = {2025},
  month = apr,
  series = {{{CHI}} '25},
  publisher = {Association for Computing Machinery},
  address = {New York, NY, USA},
  doi = {10.1145/3706598.3714274},
  isbn = {979-8-4007-1394-1},
}