Learn from experts breaking ground in AR and VR development in the Innovation Track. Hear in-depth lectures and panels illustrating how XR has the power to influence how we connect and engage with the world around us, inspired by leaders in aerospace, data visualization, games, museums, and more. Learn practical approaches to immersive design and UX challenges.
In this special session, XRDC has selected the co-creators of three standout AR/VR/XR titles to present case studies of their intriguing and innovative work. First, Tinker VR's Shimon Alkon discusses bringing to life an experience of what it might be like to gradually lose a loved one to Alzheimer's. The experience integrates animation with live performance in VR in a way that allows the participant to interact with, influence, and shape the narrative as it unfolds.
Second, Federico Fasce, creative technologist at The Guardian, will discuss his work on The Guardian VR project, specifically the VR experience 'Songbird', an investigation of sound ecology that includes recordings of the songs of several extinct and endangered birds from the Alakai plateau in Hawaii.
And finally, Kimberly Hieftje from Yale's play4REAL lab will discuss 'smokeSCREEN VR', a VR intervention focused on e-cigarette prevention among teens. Created with input from teens, the game uses voice recognition software to let the player practice refusal skills in real time.
Jim Toepel, a former rocket scientist and current game designer, will show you how the failure-mitigation techniques of the aerospace industry can be applied to individual AR products and to the industry as a whole. This session introduces the concept of Failure Modes and Effects Analysis (FMEA) in the context of manned spaceflight. Jim will then apply those lessons to the broad challenges facing AR platforms and products, and provide examples of how FMEA guided the design process of 'Mindshow' and his other early AR design efforts with Kinect.
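The core arithmetic of FMEA is simple enough to sketch. In the minimal example below, the failure modes and scores are invented for illustration (they are not drawn from the session): each mode is rated 1 to 10 for severity, occurrence, and detection difficulty, and modes are prioritized by Risk Priority Number, the product of the three scores.

```python
# Minimal Failure Modes and Effects Analysis (FMEA) sketch.
# Each failure mode is scored 1-10 for severity (S), occurrence (O),
# and detection difficulty (D); the Risk Priority Number is S * O * D.
# The failure modes below are illustrative AR examples only.

from dataclasses import dataclass


@dataclass
class FailureMode:
    name: str
    severity: int    # 1 = negligible, 10 = catastrophic
    occurrence: int  # 1 = rare, 10 = near-certain
    detection: int   # 1 = easily caught, 10 = effectively undetectable

    @property
    def rpn(self) -> int:
        """Risk Priority Number: higher means fix sooner."""
        return self.severity * self.occurrence * self.detection


modes = [
    FailureMode("Tracking loss in low light", severity=7, occurrence=6, detection=4),
    FailureMode("Virtual object occludes real hazard", severity=9, occurrence=3, detection=7),
    FailureMode("UI text unreadable outdoors", severity=4, occurrence=8, detection=2),
]

# Rank highest-risk items first so mitigation effort goes where it matters.
for m in sorted(modes, key=lambda m: m.rpn, reverse=True):
    print(f"{m.name}: RPN = {m.rpn}")
```

In a full FMEA, each high-RPN item would also get a documented mitigation, after which the mode is re-scored to confirm the risk actually dropped.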
Each day, humans generate 2.5 quintillion bytes of data. Sifting through the noise to generate meaningful insights can be time-consuming and repetitive. What if there were a more efficient way to visualize your data and derive business intelligence? IBM Immersive Insights is an augmented reality tool that gives data scientists the ability to analyze data and present their findings in 3D. This supplemental tool has implications for data analysis across industries worldwide, because every functioning business has something in common: the need to understand its data in order to make profitable decisions.
How do you design the same app for both HoloLens and VR? What do you do about the two different input systems? Do you stay consistent with your design or with the platform? In this session, Jada Williams, Lead UX Designer at Microsoft's Mixed Reality studio, will cover these questions and compare how web and mobile have solved similar problems with responsive/adaptive design. She will also cover how the Microsoft team approached adaptive design in MR while building Microsoft Layout, a mixed reality app that allows space planners to see their ideas in context. Topics discussed in this session include UI consistency, menu patterns, and input choices.
Every user's environment is unique, but what are the common trends seen in experience and space sizes? How do you design for various environmental constraints? How can knowledge of these constraints empower you to design for all users? Understanding the different environment sizes and how to design for them is one of the first steps creators can take when developing meaningful experiences everyone can enjoy. Google AR designer Alesha Unpingco will share insights from ARCore applications and break down observations and techniques creators can use for designing table-scale, room-scale, and world-scale experiences for unpredictable spaces.
Empathy, a word often used yet little understood, is a critical component of any XR experience and separates XR from all other media. This talk covers three XR techniques for deepening empathy, distilled from four years of research with thousands of players in our own XR Lab for Google, Hidden Path, Survios, 'Beat Saber', 'Fragments' (HoloLens), and three unannounced titles: Empathy Hurdles, Empathy Bridges, and Empathy Channels. Whether the goal is to reduce grief, create branded emotions for Apple, or bring to life deeply rich narrative spaces for the Harry Potter IP, these three empathy techniques double your XR creation's ability to create emotional depth.
Methane emissions are invisible and odorless, making them a difficult problem to both communicate and solve, yet they account for a quarter of the warming we're experiencing today. Enter the Environmental Defense Fund's virtual "Methane CH4llenge." Informed by peer-reviewed science and developed in collaboration with oil and gas experts, the experience, created by the EDF and immersive XR agency Fair Worlds, takes users into a digitally simulated wellsite to show the ease and efficiency of controlling key sources of methane emissions.
In this session, Isabel Mogstad (EDF) and Erik Horn (Fair Worlds) will talk through the six-month process that brought this VR experience to life, how it was viewed by a US Senator, and how one of the largest publicly traded oil and gas companies is exploring integrating the experience into its VR lab.
Augmented reality can extend our cognitive abilities. Things that we cannot perceive can be brought within reach of our senses in real time. As a demonstration we present a cognitive assistant for blind persons based on the Microsoft HoloLens. The system identifies objects in the environment and gives them virtual voices. By interacting with these voices the blind user gains all kinds of new abilities: from obstacle avoidance to formation and recall of spatial memories. In fact, without any training, blind people can navigate an unfamiliar multi-story building on their first attempt. The design principle is that the computer should maximally distill the desired knowledge, and then communicate it in a manner that is most intuitive and natural to the human user. Finally, the session argues that with the same principles AR can power intelligent assistants that boost cognitive powers for everyone.
How can museums use technology to both deepen appreciation of art for existing audiences and bring art to new ones? frog and the San Francisco Museum of Modern Art aim to achieve this with The Interpretative Gallery, a mixed-reality space that is part of the René Magritte exhibition, open May to October 2018. Designed and developed by frog, the immersive environment uses stereo cameras and computer vision algorithms to invite visitor participation in a uniquely Magrittian "augmented reality." This talk focuses on how mixed reality can elevate art exhibitions in museums and beyond.
Since the onset of consumer virtual reality, room-scale VR experiences have expanded beyond the living room and into virtually every industry. Since pioneering early standards for VR interaction, Owlchemy Labs has new insights about the latest best practices for room-scale VR. Learn from the team behind 'Job Simulator', 'Rick and Morty: Virtual Rick-ality', and the upcoming 'Vacation Simulator' as they cover several topics - including advanced interaction with objects and characters, zone-based teleportation, accessible design, and more - that anyone doing VR development can apply to their current and future projects.
The augmented world is rich with data but obscured by fragmentation, and it grows every day. What the industry needs is a way to pull everything together so designers can communicate their needs in a way that forms blueprints for engineers. Unity has been researching a solution they call "reasoning APIs": a new AR technique that is equal parts coding, puzzle solving, and adventure-game-style ingredient substitution. The Unity team has used it to turn weather into light, bridge the gap between devices, and more. Come hear about their experimentation in this area, and how to create your own!
The hardest part of building great AR apps is not the technology but getting users familiar with entirely new user interface paradigms. Until now, the tech behind AR cloud applications has been far from consumer-ready. That is changing, however: recent breakthroughs in 3D mapping and AI are making it possible to build AR apps that truly interact with the real world, such as indoor navigation and location-based games, drawing on use cases from Placenote's work.