Panels & Discussions

Panel 1 - eXtending Reality (XR) into Our Hearts and Minds to Define the Future of Mixed and Augmented Reality (Hybrid Panel)

Tuesday, 5 October
18:00 CEST (UTC+2)

YouTube Stream (non-interactive)
Discord Channel for Zoom link and Interactive Q&A Access (registered attendees only): Browser, App
Post-session Discussion with Panelists in Gathertown Room: Q&A Track A (registered attendees only)

This panel explores how the eXtended Reality (XR) continuum can be extended by considering people’s hearts and minds. Human connection and social engagement have become paramount in the application of immersive media across the XR continuum. With the expansion of remote work, collaboration, and interpersonal relationships, the XR continuum has been enhanced by other enabling and disruptive technologies such as machine learning, augmented intelligence, perceptual computing, bio-sensors, empathic computing, and tracking of brain activity, integrating internal as well as external perceptual models of multiple realities (reality, virtuality, sociality, and imaginality; Stapleton, ISMAR 2012). This has brought more diverse disciplines and applications together to serve core human functions, with larger implications for interpersonal relationships and social cohesion.

An example of how the XR continuum can be extended is the field of Empathic Computing, which sits at the overlap of three trends in interaction and communication: (1) Natural Collaboration, (2) Experience Capture, and (3) Implicit Understanding. Research in this space combines AR/VR technologies with physiological sensors to measure and share emotional states. Examples include using EEG to measure the brain activity of pairs of people in VR and observe brain synchronization, or capturing and sharing a user’s surroundings while overlaying remote gaze and gesture cues. The technology being developed can be applied in many domains where people want to better connect with one another over distance, from remote assistance in industry to education, healthcare, entertainment, and psychology, among others.

Although this offers exciting new possibilities for applications, there is a wide range of open research issues. How do we measure and track the complexity of human experience using expanding capabilities for data collection and multi-sensory stimulation? How can Virtual, Mixed, and Augmented Reality be used for social good? What methodology can be used to design these new types of XR experiences? How can knowledge of the brain and perception be used to enhance XR experiences? These are just a few of the many topics that need to be addressed.

In this panel, several pioneers in technological invention, human experience modeling, and social communication will present, discuss, and propose future implications and an expanded taxonomy of eXtended Reality. The panel will cover real-world civic applications and academic research, together with commercial innovations, that are shaping the future impact of XR by developing new measures, methods, and meaning for modeling human experience and performance.

The panelists will discuss future methods to measure, validate, and aggregate the analytics of machine and human intelligence across users, developers, and subject matter experts. These developments are providing a more rapid innovation cycle and evidence-based frameworks to transform potentially disruptive technology into liberating innovations for those who need it most. The panelists will also share how they transfer research into practice to enhance therapy, education, training, and social resilience, enriching social cohesion after a tumultuous period of international pandemic and social unrest. Finally, they will discuss directions for future work and the research opportunities in this space.

Professor Mark Billinghurst, Director, Empathic Computing Laboratory, University of Auckland and University of South Australia. Mark Billinghurst is a Professor at the University of South Australia in Australia and at the University of Auckland in New Zealand, where he directs the Empathic Computing Laboratory. He earned a PhD in 2002 from the University of Washington, researches how virtual and real worlds can be merged, and has published over 600 papers. The current focus of his research is on using advanced technologies to develop systems that share understanding and enable people to better understand what one another is seeing, hearing, and feeling. In 2019 he received the ISMAR Career Impact Award in recognition of his lifetime contribution to AR research.

Denise McCall, Founder & Executive Director of the Snyder Center for Aphasia Life Enhancement (SCALE), Aphasia Rehabilitation Pioneer. Ms. McCall is developing innovations in the Life Participation Approach to Aphasia that help People With Aphasia (PWA) address the social isolation experienced by clients with communicative disorders. Her Aphasia Confidence program helps community institutions address the cognitive impairment and mental health challenges that can prevent social inclusion for PWAs. Her work transferring clinical therapy to social engagement using XR helps analyze a client’s abilities, progress, and challenges, enabling people with and without aphasia to have satisfying group experiences. Her investigations are contributing to international evidence-based data repositories to advance communication and social inclusion using communication beyond words.

Valentino Megale, Co-Founder and CEO of Softcare Studios, Ph.D. in Neuropharmacology, XR Entrepreneur. Mr. Megale develops XR solutions designed to improve the therapy experience of hospitalized patients, with a special focus on stress management and therapy education, reducing conditions such as anxiety, pain, and social isolation in the clinical setting. He created TOMMI, a gamified VR experience that helps pediatric patients better cope with therapy challenges in oncology and dentistry. His goal is to leverage the potential of the brain and perception to improve the overall wellbeing of patients through digital, drug-free solutions based on immersive technology. He will present on how a cross-reality approach creates meaningful impact in healthcare.

Christopher Stapleton, CEO of Simiosys & Co-Founder, Virtual Reality Society. Mr. Stapleton is adapting his pioneering work in Mixed Reality and Experience Architecture to explore future social resilience in deep space and how to reverse the global epidemic of social isolation. His transdisciplinary research network is stimulating, capturing, and measuring the escalation and de-escalation of social cohesion in populations at high risk of isolation. His team is exploring new methods for measuring, virtualizing, analyzing, and predicting group performance. He will present his design approach and taxonomy of XR for building collaboration within virtual worlds that span cultural borders and time zones, simulating future challenges in sustaining social resilience.

Panel 2 - New Interactive Narratives in Virtual Reality

Wednesday, 6 October
11:30 CEST (UTC+2)

YouTube Stream (non-interactive)
Discord Channel for Zoom link and Interactive Q&A Access (registered attendees only): Browser, App
Post-session Discussion with Panelists in Gathertown Room: Q&A Track A (registered attendees only)

In this panel we intend to discuss our insights into the possibilities of designing interactive narratives by analyzing a number of recently published VR game applications. Specifically, we will focus on Down the Rabbit Hole, The Line, A Fisherman’s Tale, and Wolves in the Walls.

‘Who are you?’ This question is a literary classic, posed to Alice by the stern Caterpillar during her adventures in Wonderland. For VR scholars, however, it is also the starting point for an intriguing investigation into the user perspective of VR experiences. While film and videogames tend to offer mainly classic first- or third-person perspectives, VR offers a whole range of newly invented hybrid viewpoints, with the limits set only by the developers’ skills and imagination.

‘Do you see me?’ is one of the first issues that users are confronted with when entering a virtual world. Sometimes their only way of making themselves known to the characters inhabiting the virtual world is by handling simple mechanical interactions in the role of an invisible observer, as is the case in The Line. In other cases they may be highly involved in the story, embodying the main character and interacting with both the storyworld and the characters, as in A Fisherman’s Tale. In Wolves in the Walls the issue is even more complex, as users are invited to take up the part of the imaginary friend of the main character, Lucy, while remaining invisible to the members of her family.

‘What am I doing here...?’ is another compelling issue that VR users are confronted with. In all four cases that we will compare and discuss, users are expected to take part in the experience by solving puzzles or performing other actions to move the narrative onward. In Wolves in the Walls the user even needs to be there as emotional support, as a close friend and confidant of little Lucy, being the only one who knows what she knows and sees what she sees.

These examples offer new insights into the design of interactive narratives. Some of our cases show how interactive storytelling can be embedded in VR by combining physical interaction and different user perspectives to offer unique possibilities. In Down the Rabbit Hole, for instance, players can’t change the story’s progression, but they can interact with the stage itself by manually manipulating the storyworld, thus determining their own point of view and ultimately guiding Alice back home through a virtual version of Wonderland.

By discussing these insights and connecting them to theories of interactive narrative, we aim to arrive at a meaningful overview of best practices for storytelling in VR.

Mirjam Vosmeer, PhD, initiated her first VR project, Interactive Cinema, at the Amsterdam University of Applied Sciences in 2013. Since then, she has presented workshops, keynotes and presentations on storytelling for VR at conferences all over the world. She received two government grants for her research on VR, focusing on interaction, user experience and narrative design. In her current project, VR for Diversity, she collaborates with a range of knowledge, industry and educational partners to investigate the persuasive effects of interaction in VR. Her teaching is focused on immersive environments and videogame design. https://civicinteractiondesign.com/people/mirjam-vosmeer/ http://ispace.amsterdam/

Christian Roth, PhD, is a media psychologist investigating the cognitive psychology behind interactive narrative design and user experiences. He researches, teaches, and designs transformational aspects of interactive artefacts. Since 2016, Christian has been a researcher and lecturer in the Netherlands, teaching narrative and applied game design as well as research for design at the School of Games & Interaction at the University of the Arts Utrecht (HKU). He leads the Minor Interactive Narrative Design, an educational program for the future generation of narrative designers with a cross-cutting perspective. Christian has been involved in several interdisciplinary European projects and is an advocate for innovative learning. spieleforschung.de / @spieleforscher.

Marijn Mado is a Ph.D. Candidate in Communication at Stanford University. She is working with the Virtual Human Interaction Lab and is interested in the implications of engaging with historical events via new media technologies. She draws from mixed methods to explore how the past is portrayed in VR, how VR can support history education, and the ways in which VR aims to promote empathy for others. Marijn obtained her MA in Sociology from The New School for Social Research and her BA in Liberal Arts and Sciences from Amsterdam University College.

EU Funding Opportunities for Disruptive Innovation in Digital Transformation, Human-Computer Interfaces, and Augmented Reality

Thursday, 7 October
16:00 CEST (UTC+2)

YouTube Stream (non-interactive)
Discord Channel for Zoom link and Interactive Q&A Access (registered attendees only): Coming Soon

Programme Managers have recently been introduced in the European Innovation Council (EIC) to proactively manage portfolios of projects and to catalyse innovation among the beneficiaries of the Pathfinder, Transition, and Accelerator funding programmes within Horizon Europe. Among the roles of Programme Managers is the identification of strategic research and innovation trends to which budget for challenge-based calls is allocated. One of the 2021 Pathfinder challenges focuses on ‘awareness inside’, with a specific emphasis on human-machine interaction and artificial intelligence, while the EIC Accelerator programme funds SMEs to transform research into innovation addressing EU societal challenges. The key aims of this presentation are to describe the funding opportunities for disruptive research and innovation offered by EIC programmes in Horizon Europe, and to describe the process followed to define the specific challenges of the Work Programme, which takes into account scientific and technological aspects, economic and societal implications, and the broader innovation ecosystem framework.

Dr. Antonio Marco Pantaleo has 20 years of experience in multidisciplinary research projects in renewable and clean energy technologies (solar, wind, biomass, and hybrid), energy systems integration, biosystems engineering, and energy use in agricultural engineering and food processing. He holds a first degree in electrical engineering from Politecnico di Bari and a PhD in process systems engineering from Imperial College London, where he is affiliated as a research fellow with the Clean Energy Processes Lab (CEP) and the Centre for Process Systems Engineering (CPSE) of the Department of Chemical Engineering. He is an associate professor of clean energy technologies in the Department of Agro-Environmental Sciences of the University of Bari. Before joining the university, he co-founded an energy service company, joined Edison Energie Speciali as a wind and biomass energy engineer, and worked in the strategic planning division of GSE. He has served as a scientific expert and consultant for public and private organisations, including the Italian Ministry of Research, and was Vice-Rector for energy policy at the University of Bari. Antonio has authored over 100 scientific papers, is a member of the IEA working group on energy storage and of the ASME technical committee on clean fuels, and is the delegate for energy in the Italian Sustainable Universities network. Since October 2020, he has been Programme Manager at the EIC for energy systems and green technologies.


Community Discussion - Planning a Virtual Experience Research Accelerator (VERA)

Thursday, 7 October
16:15 CEST (UTC+2)

YouTube Stream (non-interactive)
Discord Channel for Zoom link and Interactive Q&A Access (registered attendees only): Browser, App
Post-session Discussion with Panelists in Gathertown Room: Q&A Track A (registered attendees only)

Researchers Greg Welch, Tabitha Peck, Valerie Jones Taylor, Gerd Bruder, and Jeremy Bailenson are planning a research proposal to the US National Science Foundation to develop and test a Virtual Experience Research Accelerator (VERA): a world-class research infrastructure intended to support the XR research community, initially in the US but eventually worldwide.

Specifically, the researchers envision a Mechanical Turk-like resource that lets the XR community run human-subject research en masse with a curated pool of subjects across the US (for now) with a priori chosen demographics. Its primary function would be to ingest proposals and experiments, push experiment software out to the subjects, retrieve experiment data, and store it for consumption by the community. The primary envisioned benefits to the XR community include the ability to obtain more and better experimental data, e.g., more subjects, a more diverse population, and repeatable, controlled circumstances. In addition, it could afford improved opportunities for replicability of studies and for longitudinal studies.

During this one-hour session, members of the team will present the basic ideas, listen to and collect feedback, facilitate discussion, and convey opportunities for involvement.

Dr. Gregory Welch is a Pegasus Professor and the AdventHealth Endowed Chair in Healthcare Simulation at the University of Central Florida College of Nursing. A computer scientist and engineer, he also holds appointments in the College of Engineering and Computer Science and in the Institute for Simulation & Training. Welch earned his B.S. in Electrical Engineering Technology from Purdue University (Highest Distinction), and his M.S. and Ph.D. in Computer Science from the University of North Carolina at Chapel Hill (UNC). Previously, he was a research professor at UNC. He has also worked at NASA’s Jet Propulsion Laboratory and at Northrop Grumman’s Defense Systems Division. His research interests include human-computer interaction, human motion tracking, virtual and augmented reality, computer graphics and vision, and training-related applications. His awards include the IEEE Virtual Reality Technical Achievement Award (VR 2018) and the Long Lasting Impact Paper Award at the 15th IEEE International Symposium on Mixed and Augmented Reality (ISMAR 2016).

Dr. Tabitha C. Peck is an Associate Professor of Mathematics and Computer Science at Davidson College. Dr. Peck received her PhD in Computer Science from The University of North Carolina at Chapel Hill. Her research involves developing and testing usable virtual reality systems, and her current work investigates the psychological implications of embodiment in self-avatars, with the goal of using avatars to reduce and mitigate bias. Dr. Peck has been both a Journal and Conference Paper Program Chair for IEEE VR and a Journal Paper Science and Technology Chair for IEEE ISMAR. She has received numerous honorable mentions and nominations for best paper awards and is an NSF CAREER award recipient.

Dr. Valerie Jones Taylor is an Assistant Professor of Psychology and Africana Studies at Lehigh University. She earned a doctorate in social psychology at Stanford University and a B.A. in psychology and ethnic studies, with a concentration in African & African American Studies, at the University of Texas at Austin. Taylor previously served as an assistant professor at Spelman College and was a postdoctoral fellow at Princeton University. Her research areas include intergroup relations, social identity threat, stereotyping and discrimination, and cultural psychology. Specifically, she investigates how stereotyping and prejudice affect underrepresented groups’ academic performance, interracial interactions, and the treatment of racialized physical spaces. Her research also examines ways to improve interracial encounters in educational and social contexts using virtual reality. Taylor has been awarded grants from the National Science Foundation and the Ford Foundation, and is currently the recipient of an NSF CAREER award to examine VR interracial interactions in STEM contexts.

Dr. Gerd Bruder is a Research Assistant Professor for virtual and augmented reality at the Institute for Simulation and Training at the University of Central Florida. He held prior positions as a postdoctoral scholar at the University of Hamburg (2014-2016) and the University of Würzburg (2011-2014). He received his Habilitation in Computer Science from the University of Hamburg in 2017 and his Ph.D. in Computer Science from the University of Münster in 2011. His research interests include perception and cognition, virtual humans, social presence, locomotion, and 3D user interfaces. He has authored almost 200 publications, has been awarded more than a dozen best paper and honorable mention awards, and regularly serves as program chair, general chair, and committee member in the research community.

Dr. Jeremy Bailenson is the founding director of Stanford University’s Virtual Human Interaction Lab, Thomas More Storke Professor in the Department of Communication, Professor (by courtesy) of Education, Professor (by courtesy) in the Program in Symbolic Systems, a Senior Fellow at the Woods Institute for the Environment, and a Faculty Leader at Stanford’s Center for Longevity. Bailenson studies the psychology of Virtual and Augmented Reality, in particular how virtual experiences lead to changes in perceptions of self and others. His lab builds and studies systems that allow people to meet in virtual space and explores the changes in the nature of social interaction. His most recent research focuses on how virtual experiences can transform education, environmental conservation, empathy, and health. He is the recipient of the Dean’s Award for Distinguished Teaching at Stanford. In 2020, IEEE recognized his work with the Virtual/Augmented Reality Technical Achievement Award.

Sponsors

Platinum

Gold


Silver


Bronze


SME and Media partner


Partners

IEEE
IEEE Computer Society
IEEE VGTC
ACM In-Cooperation