We met Professor Ovtcharova, Head of the Institute for Information Management in Engineering (IMI) at the Karlsruhe Institute of Technology, and Michael Grethler, Head of the Industry 4.0 Test Bed at the IMI, to talk about extended reality and AI technology and how they will develop in the future.
ILI.DIGITAL: A mother in Korea loses her seven-year-old daughter. She can't get over it because, in her opinion, she could not say goodbye properly. In a project, the Korean television station Munhwa Broadcasting Corporation is developing a reunion with the help of advanced virtual reality – and is thus helping the mother out of her depression. Prof. Ovtcharova and Mr. Grethler, what do you think about this use case from Korea?
Jivka Ovtcharova: This use case can be seen as the crowning glory of human efforts to overcome physical limits in space and time. Ever since the invention of stereoscopy 200 years ago, people have tried to place themselves in artificially created worlds as if they were "inside" them. In the mid-80s, NASA pursued the development of virtual reality (VR) technology to enable astronauts to "meet" their families and enjoy virtual encounters during long spaceflights. In fact, VR technologies "deceive" our senses so that we "feel" part of any world we can imagine, based on our true perception of that world. This is why VR should be understood primarily as a mental process whose goal is not a complete 1:1 reproduction of reality, but the human's acceptance of the representation as realistic. Our brain builds on past experiences from the physical world to develop "rules" by which to operate authentically in the virtual world as well. Since we humans are all unique in our mental perception, the application of VR is revolutionary, especially in the social sphere, as the example above shows.
Michael Grethler: Psychologists from the USA have already shown that experiencing positive feelings can have a therapeutic effect. Virtual reality is used here in a similar way to confrontation therapy. In contrast to fighting negative emotions, however, a positive situation is used to improve the mother's condition. Whether virtual reality has the same effect on mental states when used against anhedonia remains to be proven.
Prof. Ovtcharova, you have been an expert in this field for a long time. Do you remember what sparked your enthusiasm for the relationship between humans and machines in a digital reality?
Jivka Ovtcharova: Enthusiasm is not innate; it is born. As a child of the Sputnik and nuclear energy era, I was fascinated by technology that makes the invisible visible, whether in space or at the atomic level. This led me to study nuclear power plant engineering in Moscow. At that time, I first experienced interaction with machines in an analog way. Digital human-machine interaction came later, with punch cards and computer terminals, which demanded enormous effort to adapt to the way machines work. Unforgettable for me is the moment when I saw the world's very first VR CAVE (Cave Automatic Virtual Environment) at the Electronic Visualization Laboratory of the University of Illinois at Chicago in the early 90s. At that time, I was a guest researcher at the Fraunhofer Institute for Computer Graphics in Darmstadt, working on semantic modeling in an industrial environment. Real-time human-machine interaction in a virtual space finally sharpened my vision for modern engineering.
Your institute conducts research, among other topics, on Smart Immersive Environments (SIE). Can you explain what this means and what fascinates you about the topic?
Jivka Ovtcharova: Virtual reality is described as a medium consisting of interactive computer simulations that track the user's position and actions in real time and replace or extend the feedback to one or more human senses. This creates the impression for users that they are mentally present in the simulation. The focus lies on overall human behavior, such as posture, gestures and speech. Thus, the human being is embedded in the VR environment with his or her whole body and way of thinking, which is characterized by the technical term "immersion". Humans capture information with all five senses simultaneously and react, for example, with speech, actions and unconscious body language. The task of a Smart Immersive Environment (SIE) is therefore to intelligently map this broad spectrum of human "input and output channels" with the help of VR systems in order to enable seamless human-machine interaction. In this way, humans change their role from operators of a computer system into customers to be served by the computer, which is one of the greatest challenges of the digital transformation of economy and society today. This concerns the future of organizations, working cultures and lifestyles in which people no longer think in terms of computer processes but increasingly in experiences and sensory impressions.
Mr. Grethler, do you see results from research in the field of SIE that could be used to address everyday issues and challenges?
Michael Grethler: Some early adopters are now in their second or third iteration of product or service design. Others have taken use cases all the way to industrialization. For example, BMW has incorporated virtual reality into its automobile design process, while Air France has deployed "immersive entertainment systems" on some flights that allow passengers wearing VR headsets to watch movies in 3D. Some pioneering companies are using extended reality to immerse trainees in lifelike situations that would otherwise be too expensive or logistically impossible to recreate. For example, UPS now provides VR driving tests that allow new drivers to prove themselves in a virtual environment before taking the wheel of a five-ton delivery van. In its training simulation, KFC places employees in a virtual "escape room" where they must successfully complete a five-step chicken preparation process before they are released. Consumer-focused use cases are proliferating across the retail, travel-hospitality-leisure and real estate sectors as vendors use extended reality to bring potential customers closer to their products and services. For example, Estée Lauder has launched an AR virtual makeup mirror on its web and mobile sites that adjusts for light, skin texture, and shine so that users can virtually try on product shades using their photo or live video. Meanwhile, guided virtual visits are poised to transform the real estate industry and its agents' daily work; agents may never have to show up for an open house again. Use cases and full deployments of XR technologies in gaming, storytelling, and live events are varied and numerous – and will likely become more so in the coming years. IDC projects that investment in AR/VR gaming use cases alone will reach $9.5 billion by 2021.
Prof. Ovtcharova, in the physical world a human being has five senses. Do you see a future in which all five senses can be digitally addressed? What would be the consequences?
Jivka Ovtcharova: There are many reasons to digitally address all five human senses in the future. On the one hand, it is about the seamless interplay in a "reality-virtuality continuum"; on the other hand, it is about the transition from "line-of-business" to "ecosystem" solutions. Especially in so-called virtual engineering, it is of crucial importance to give experts a basis for making decisions about future products or services as early as possible, drawing on their experience from reality. These products or services can affect a wide range of industries: mechanical and plant engineering and automotive, but also the agriculture, food and textile industries. A reality-virtuality continuum is a scale between the completely virtual and the completely physical reality and therefore includes all possible variants and constellations of real and virtual objects. Closeness to reality is achieved by personalizing all interactions of humans with machines or computers and constructing them in a logically comprehensible way in real time. All human senses are deceived by technical devices in order to increase the individual degree of perception in virtual spaces. The human senses mostly addressed today are vision (usually 3D projections), hearing (usually spatial sound) and touch (supported by haptic devices). Technologies such as head or finger tracking determine the position and orientation of the user in space in real time and adapt the simulation to his or her actions. The user can navigate through the virtual world and manipulate it with the help of input devices or gestures. Virtual engineering helps to design and explore tomorrow's reality and to influence its development. Thus, situations such as the maintenance and repair of wind turbines or nuclear reactors can be simulated, which would be difficult to reproduce or very dangerous in reality.
In this way, human abilities and senses can be trained to make decisions based on purely virtual checks, for example as early as the concept development stage of a product, without wasting material and time unnecessarily. This is a paradigm shift in the world of work. Even taste and smell can be digitized today, for example in the perfume and beverage industry. However, broad application of these possibilities requires continued targeted investment in this area, mainly because taste and smell are highly individual.
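The tracking-driven adaptation described above – a device reports the user's position and orientation every frame, and the simulation follows – can be sketched in a few lines. This is a minimal illustrative sketch, not a real VR pipeline; all names (`Pose`, `read_tracker`, `update_camera`) are hypothetical stand-ins for what an SDK such as OpenXR would provide.

```python
from dataclasses import dataclass

@dataclass
class Pose:
    """Tracked head pose: position in meters, orientation as a quaternion."""
    position: tuple[float, float, float]
    orientation: tuple[float, float, float, float]  # (x, y, z, w)

def read_tracker(frame: int) -> Pose:
    # Stand-in for a tracking device: here the user simply drifts
    # along the x-axis over time at head height (1.7 m).
    return Pose((0.1 * frame, 1.7, 0.0), (0.0, 0.0, 0.0, 1.0))

def update_camera(pose: Pose) -> dict:
    # The simulation adapts to the user's actions: the virtual camera
    # is placed exactly where the tracked head is, every frame.
    return {"eye": pose.position, "rotation": pose.orientation}

# Render loop: each frame reads the tracked pose and updates the view.
for frame in range(3):
    camera = update_camera(read_tracker(frame))
```

In a real system the same loop runs at 90 Hz or more, since any lag between head movement and the rendered view breaks immersion.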
In a TEDx lecture you describe human polymorphism in extended reality. Can you explain what this term means?
Jivka Ovtcharova: I know the term polymorphism mainly from computer science, where it describes the ability of an object to take on many forms or behave differently. This is similar to how people act in real life, because every person is unique and his or her experiences are personal. Each individual has many roles, e.g. in the family, in the circle of friends, at work and in the social sphere. Moreover, his or her experiences are situation-driven. Human polymorphism means that every human being today has many roles, moves in many different life worlds, real and longed-for, past, present and future, and accordingly develops many partial identities, according to the motto "I am many, so what!" Contradictions are "programmed in", and they raise big questions, for example how versatile an individual may be.
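The computer-science sense of polymorphism that the analogy rests on can be shown in a short sketch: one interface, many forms. The class and method names below are illustrative only, chosen to mirror the "many roles" metaphor from the answer above.

```python
class Person:
    """Common interface: every role answers the same question."""
    def introduce(self) -> str:
        raise NotImplementedError

class Parent(Person):
    def introduce(self) -> str:
        return "At home, I am a parent."

class Engineer(Person):
    def introduce(self) -> str:
        return "At work, I am an engineer."

class Friend(Person):
    def introduce(self) -> str:
        return "Among friends, I am just me."

# Polymorphism: the same call behaves differently depending on
# which concrete "role" the object currently embodies.
roles: list[Person] = [Parent(), Engineer(), Friend()]
for role in roles:
    print(role.introduce())
```

The caller never needs to know which partial identity it is talking to; each object responds in its own way through the shared interface, which is exactly the behavior the term borrows when applied to people.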
How do you see the interaction between AI and extended reality? And since both of you are experts in the field of extended reality, it would be interesting to know how digital your everyday lives look!
Jivka Ovtcharova: We still don't know what Artificial Intelligence (AI) is. What we understand today is that AI is still a tool driven by algorithms. In my opinion, AI stands for augmented intelligence, driven by computers, and I don't believe that machines are capable of transferring knowledge from one open-ended system, like that of human civilization, to another. So computers, or machines in general, will be dominant in closed systems, whether it's a particular application such as GPS or any other world designed by humans. Among the countless digital applications that accompany my everyday life, Google, Wikipedia and Spotify are indispensable for me. Google is still the fastest "fishing net" in the data ocean, even though I would like to see more intelligence in the search process. Wikipedia makes humanity's collected knowledge accessible to everyone, which in my opinion cannot replace a real encyclopedia but is an admirable ideal, and Spotify creates for me a kind of virtual reality of the streaming world.
Michael Grethler: The smartphone has played a major role in the digitalization of my everyday life. Like 76% of all German citizens, I spend most of my time on the Internet with my smartphone. I use it as an alarm clock, listen to podcasts, check mails and messages and prepare for my appointments with Outlook. Maps brings me to my appointments in the shortest time without getting caught by a speed camera. But in my work I experience advanced techniques in MR and AI. I have virtually attended the scanning of a Rodin sculpture and its optimal placement in a museum, the simulation of a production line and even the simulation of a criminal case, all without leaving my office.
Mr. Grethler, what impact do the technical developments in extended reality have on digitization in general? Will organizations be empowered, or does it just mean more complexity to handle?
Michael Grethler: Extended reality represents the next digital transformation. It changes how we engage with technology: through augmented, virtual and mixed reality, 360° video, and immersive experiences that are at once intuitive and data-rich and that put the human user at the center of design. As you explore extended reality's potential for your organization, consider these two opportunity areas. Connect: "Cooperation without colocation." Extended reality already makes it possible for workers to engage with, share information with, and support colleagues in other locations. Some may think of this as glorified video telephony, but it is much more than that. For example, engineers sitting in a regional office will be able to see what field workers see as they repair and maintain remote equipment, helping to guide their actions. Scientists separated by oceans will convene in a "virtual sandbox" where they can perform collaborative research.
Know: Extended reality can offer knowledge workers – a broad term that basically applies to anyone using a computer – access to specific information at the exact moment they need it. This is more than a souped-up document-sharing tool: it can actually present information in a visual context. For example, wearing XR glasses, construction engineers can see a detailed description of a project's electrical and plumbing parts and also how the individual parts will fit into a wall. Imagine leveraging this flexibility in any initial conceptualization phase, such as architecture and interior design, consumer product R&D, or supply chain and logistics mapping. Immersive analytics can further enhance virtual collaboration by helping users explore data along multiple axes and dimensions.
What do you think companies are currently missing out on when it comes to extended reality?
Michael Grethler: We all need to remember that on this side of the transformation, the future can look opaque, and the areas above are only starting points. Much like the mobile transformation, we can’t yet see what, in hindsight, will someday be obvious. Ten years ago, if someone told you that the average person would spend five hours a day on their mobile phone and touch it more than 2,500 times per day, you’d think they were crazy. Now, it’s how we live.
It might just take that one simple, killer app to bring the full potential of extended reality into focus. That app could launch tomorrow – what do you think it will do? And will your company be ready when it comes?
Imagine a world which is
built pixel by pixel