VR/AR Showcase

The VR/AR Showcase will take place during the sixth annual University of Arizona Information Technology (IT) Summit.

Experience virtual reality, augmented reality, 360 video, and more. Consult with campus innovators about teaching and research applications. Join us for demonstrations of projects and a lunchtime panel discussion.

12:15–12:30 p.m.

Department of Defense/Defense and Security Research Institute Opportunities in VR Presentation: Chris Fox, Research, Discovery, and Information/DSRI

At the University of Arizona’s Defense & Security Research Institute, we promote the unique capabilities of UA researchers across a range of disciplines to foster investment and collaboration with the nation's aerospace and defense economy.

Lunch and Panel
12:30–2:15 p.m.

Virtual Reality and Education Panel Discussion


  • Ash Black, Global Initiatives
  • Lila Bozgeyikly, School of Information, College of Social and Behavioral Sciences
  • Bryan Carter, Africana Studies, Center for Digital Humanities, College of Humanities
  • Arne Ekstrom, Psychology, College of Social and Behavioral Sciences
  • Joe Farbrook, School of Art, College of Fine Arts
  • Hong Hua, College of Optical Sciences
  • Sam Rodriguez, College of Fine Arts

Lunch will be available only to the first 100 people who sign in on the day of the event and request a lunch ticket.

Open Demo Sessions
9:00 a.m.–12:30 p.m. and 2:30–4:00 p.m. 

Memory Leaks: Joe Farbrook, School of Art, College of Fine Arts

Memory Leaks is a virtual reality art installation investigating memes, media narratives, and cultural mythology. Inspired by the influx of virtual content leaking into the physical world as well as the human psyche, the piece magnifies this phenomenon in a fully immersive VR experience. The conceptual underpinning is to use gross enlargement and collaged cinematic imagery to dramatize current shifts in attention, trains of thought, dreams, time and time spent, communication, social life, emotions, expectations, and more. It offers an investigation into the relationship between the democratization of digital technology and its emergence as the de facto form of communication.

Tangiball: A Tangible Virtual Reality Ball Game: Ren Bozgeyikly, School of Information, College of Social and Behavioral Sciences 

This demo showcases enhanced virtual reality interaction through real-world extensions of virtual objects. Users interact with a tangible ball in a video game setting and see its movements replicated in the virtual world in real time, offering the unique experience of playing with a real ball in a virtual world.

VR in Architecture, Planning and Landscape Architecture: Lucas Guthrie, College of Architecture

Learn how students at the College of Architecture, Planning and Landscape Architecture are using Revit and virtual reality technology to experience their designs in a new way. With the ability to "walk around" in a design, students are able to fine-tune their models and present their vision.

A Virtual Factory, Medical Surgery, and Emergency Evacuation based on Unity Game Engine: Son Young-Jun, Systems and Industrial Engineering, College of Engineering

In this VR demo, you will stand in the middle of a modern manufacturing factory and observe the operation of various equipment, such as CNC machines, robots, automated guided vehicles, and an overhead crane. This demo will be used by students in manufacturing courses at UA. The second demo covers VR/AR-based training and planning for medical surgery, developed in collaboration with researchers at Banner UMC. The third demo presents emergency evacuation scenarios. For all three applications, the Unity game engine was used to create highly realistic environments.

Leaning into Tomorrow: Immersive Student Engagement Experiences with the Center for Digital Humanities and Tech.Global: Bryan Carter, Africana Studies, Center for Digital Humanities, College of Humanities

Tech.Global and the Center for Digital Humanities collaborate on a multitude of 360, AR, and VR projects designed to move UA faculty research agendas forward and put our students in the spotlight. Please come by to explore our current projects, including Lingyin Temple VR (Dr. Wu and Dr. Welter - China), Justice 360 (Dr. Kiser, Dr. Beita - Costa Rica), and our African Dance VR Gallery (Dr. Praise - CGI/Motion Capture).

Cyber Operations in Mixed Reality with UA South and Tech.Global: Ash Black, Tech.Global, UA Global Initiatives

Tech.Global and UA's Cyber Operations Program are working together to develop a network visualization tool leveraging VR and AR. The Virtual Exploit Network Offense Management (VENOM) tool will allow users to rapidly jump to any network node and conduct deep packet analysis, flow analysis, and/or vulnerability analysis. The goal of our demo is to allow users to dive into a 50-node shipping-and-receiving company network within CyberApolis, an unstructured, synthetic, live environment designed to replicate the real internet and simulate cyber attacks, all in a geospatial mixed reality environment. The project leverages University of Arizona expertise and equipment from the Cyber Operations program and Tech.Global, in partnership with faculty at the University of Texas at San Antonio for research and equipment support. This multi-year, multi-phase project will significantly improve offensive cyber operations through the use of virtual and augmented reality environments and facilitate enhanced instruction in computer networking and network defense.

Box Cat: Escape XRtist: Sam Rodriguez, College of Fine Arts

"Box Cat: Escape XRtist" offers insight into our relationship with technology and with ourselves. In this virtual reality/live experience, an animatronic (MAHT) plays the role of the artist (SR), a machine plays the role of an artificial intelligence (Mac), and participants play the role of human beings. MAHT is a little-known artist. As his “debut” performance piece, he deliberately puts himself into a “coma” and urges participants to come and experience his “waking dream,” where his friend, an artificial intelligence, resides. MAHT is hooked to a heart monitor, and two participants must interact with his mental and physical forms: one wearing a virtual reality headset and a second monitoring an animatronic connected to the virtual game. The piece illustrates how we interact with one another as human beings and how we interact with technology, and the accompanying presentation covers topics such as human/human and human/tech relationships as well as our fears and the power structures around technology as a tool or as a companion.

Virtual Tour of Whipple Observatory: Dallan Porter, MMT Observatory, Department of Astronomy and Steward Observatory

Whipple Observatory, located on Mt. Hopkins near Amado, Arizona, is an astronomical observatory owned and operated by the Smithsonian Astrophysical Observatory (SAO). The MMT Observatory is a large optical telescope at the summit of Mt. Hopkins and a joint facility of the SAO and the University of Arizona. The MMT's 6.5-meter mirror was the first of the large mirrors made at the mirror lab on the UA campus, under the football stadium. This virtual reality tour takes the visitor from the Whipple Observatory's base camp visitor center all the way to the summit, with breathtaking views of the telescopes, the Santa Rita Mountains, and the entire southern Arizona landscape. The tour concludes with an amazing Arizona sunset and a view of the night sky from the chamber of the MMT.

How virtual and altered reality can help us to understand the neural basis of human spatial navigation: Arne Ekstrom, Psychology, College of Social and Behavioral Sciences

Devices such as head-mounted displays and omnidirectional treadmills offer enormous potential for gaming and networking-related applications. However, their use in experimental psychology and cognitive neuroscience has so far been relatively limited. One of the clearest applications of such novel devices in experimental psychology is the study of human spatial navigation, historically an understudied area compared to more experimentally constrained studies in rodents. Here, we present several experiments the lab has recently conducted using VR/AR and discuss in detail how we overcame some of the obstacles involved and the new insights we can gain into how humans navigate. First, we show how VR/AR provides novel insight into how we learn large-scale environments with enriched body-based input, something typically difficult to study in the real world. Second, we discuss novel findings from simultaneous wireless scalp EEG recordings and ambulation on an omnidirectional treadmill.

VR Studio and iSpace: Anthony Sanchez, UA Libraries

At the university library, we have been actively creating and iterating on a model for operating a VR service space while supporting a vision of discovery and informal learning within VR environments. We have hosted events and workshops to cultivate learning communities around virtual and augmented reality tools, in an effort to have a greater impact on the campus curriculum. This work has also made us rethink job responsibilities and staffing models, including hiring students to host drop-in hours and programming around VR tools and activities. Advances in technology are transforming education, offering new and exciting ways to deliver content, engage students, and better ensure comprehension across disciplines. Virtual, augmented, and mixed realities create significant immersive experiences for research and teaching and could greatly enhance any field of study in the sciences, from medicine to engineering. They can also enrich the experience of the liberal arts and social sciences.

Hangzhou Buddhist Culture VR: Feng Chen and Albert Welter, College of Humanities

November 7, 2018, 9:00 a.m. to 4:00 p.m.
Student Union North Ballroom