Nels Numan

I am currently a research intern in the BIRD Lab (Blended Interaction Research & Devices) at Google. As a PhD candidate in the VECG Group at University College London, I conduct research on asymmetric collaborative mixed reality in both indoor and outdoor environments. Previously, I was a research intern at Niantic Labs, Microsoft Research, and TNO Research. My current research interests span mixed reality, computer-supported cooperative work, human-computer interaction, and generative artificial intelligence.

Recent Updates

· Started my internship at the Blended Interaction Research & Devices (BIRD) lab at Google under the mentorship of Eric Gonzalez and Mar Gonzalez-Franco!

· BlendScape received an Honorable Mention award at ACM UIST ‘24 (top 1.5% of 608 submissions)!

· SpaceBlender and BlendScape (both co-first authored) were accepted to ACM UIST ‘24. Two corresponding patent applications were filed!

· Supported by the In2Research program, we are excited to welcome our interns, Rowan Meng and Aiman Sohail, to the VECG lab! Rowan and Aiman will be working on projects related to generative AI and virtual reality, based on Ubiq-Genie.

· Finished my internship at Niantic Labs. It was wonderful to work with Jessica van Brummelen and Gabriel Brostow – looking forward to continuing our collaboration and sharing our work soon!

· Our paper on DreamCodeVR (led by Daniele Giunchi) and two posters (AIsop and StreamSpace) were presented at IEEE VR ‘24. I had a great time as a student volunteer at the conference!

· Our paper on extending Ubiq to the web was presented at ACM Web3D ‘23 by Sebastian Friston.

· Our student competition entry, Reviving the Euston Arch: A Mixed Reality Approach to Cultural Heritage Tours, received an Honorable Mention for the Best Design Award at the ISMAR 2023 student competition.

· Completed my internship at the EPIC Group at Microsoft Research in Redmond. It was a fantastic experience working with Bala Kumaravel, Nicolai Marquardt, and Andy Wilson – looking forward to sharing our work soon!

· Presented our workshop paper on Ubiq-Genie at the OAT workshop, and Ziwen Lu presented another workshop paper on our outdoor collaborative MR prototype at the ReDigiTS workshop, both during IEEE VR ‘23. Presenting my ongoing research at the Doctoral Consortium was also a valuable experience!

· Our paper on immersive competence and literacy was published in Frontiers in Virtual Reality, led by my advisor, Anthony Steed.

· Presented my paper on user behavior in asymmetric collaborative mixed reality, co-authored with my advisor Anthony Steed, at VRST ‘22 in Tsukuba, Japan.

· Our paper on Ubiq-Exp, our toolkit for running remote mixed reality experiments, was published in Frontiers in Virtual Reality, led by my advisor, Anthony Steed.

· Had lots of fun at the Niantic Lightship VPS Hackathon in London! Our project, developed with Ziwen Lu, Kalila Shapiro, and Joe McFadden, won 2nd place!

· Moved to London to start my PhD at University College London with Anthony Steed and Simon Julier! I’m grateful to receive funding through the European Union’s Horizon 2020 Research and Innovation program under grant agreement No. 739578.

· Presented a workshop paper on generative head-mounted display removal at the VHCIE workshop during IEEE VR ‘21. I am also grateful for the opportunity to present my work at internal events at Microsoft and IBM!

· Completed my research internship at the Intelligent Imaging group within the Netherlands Organisation for Applied Scientific Research (TNO) in The Hague, where I worked on my master’s thesis research on generative head-mounted display removal with Frank ter Haar.