title: People and the HMI of the Future
authors: Kellermann, Timm; Räth, Detlef; Danz, Daniel; Heinath, Marcus
date: 2020-10-23
journal: ATZ Worldwide
DOI: 10.1007/s38311-020-0308-8

Digitization has fundamentally changed consumers' expectations of their mobility experience, and it is creating ever-increasing pressure to change the Human-Machine Interface (HMI) of the car. As a result of ever more extensive driver assistance systems, the classic driving task is becoming easier and less demanding; at the same time, drivers are looking for ways to repurpose their time behind the steering wheel: to do some work, to relax, to be entertained, to communicate, or to enjoy a treasured moment of solitude. IAV's insights from numerous customer projects show that an "ideal" traveling experience, even within the same vehicle, should probably look a little different for each occupant. As a result, the HMI of future vehicles will have to present itself differently to each user. However, this individualization is not explicitly requested by the user; rather, it is identified and applied through continuous analysis of the activity context.

For a long time, HMI development focused on designing the driver's workplace. Today's vehicles still have the technical character of "connected cars," that is, they are stand-alone products to which connectivity capabilities were added late in the lifecycle. Now, however, the E/E vehicle architecture is starting to change: it is being transitioned toward an Internet of Things (IoT) platform with a redundantly designed, maximum-security domain for safety-relevant functions and a well-protected second domain for comfort and third-party functions. At the end of this multi-step process, future vehicles will have been transformed into "rolling devices on the IoT," that is, they will have a digital DNA from the outset.

As a consequence, vehicles will become yet another device in the portfolio of every user. Users will expect relevant services and functions to be available on all devices at all times and without explicit orchestration on their part. Core HMI aspects, such as menu navigation, color schemes and dialogues, will be perceived as intuitive if they fall in line with the paradigms of the digital systems chosen by the user, FIGURE 1.

To date, display and operating systems have been developed and implemented as a rigid 1:1 mapping of function to HMI interaction element, for example a switch or dial. In the future, HMIs will be implemented as a flexible, self-adapting platform with micro-services. Even during the use phase, hardware, operating system, functions, content location and the design elements themselves can be continuously maintained and expanded. User behavior in the field will constantly provide important insights for optimizing intuitive operability. The theoretically possible individualization options thus increase exponentially, FIGURE 2. However, maximum customizability creates stress when users are guided through a seemingly endless chain of choices before use. In addition, unfamiliar presentation and interaction schemes impair many people's willingness to make full use of vehicle capabilities.

The time people spend in a car accounts for only a small portion of their day. The majority of their interaction preferences are therefore formed outside the vehicle. On current smart devices, these preferences are saved by means of digital profiles.
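To make the notion of such a profile concrete, the following is a minimal sketch of a cross-device user profile as it might be synchronized into a vehicle after authentication. All class and field names are illustrative assumptions, not part of any manufacturer's or platform's actual API.

```python
from dataclasses import dataclass, field
from typing import List

# Minimal sketch of a cross-device user profile that a vehicle could pull in
# once the user authenticates. Field names are illustrative assumptions only.

@dataclass
class HmiPreferences:
    ecosystem: str        # e.g. "android" or "ios"; drives default layout and dialogue style
    color_scheme: str     # e.g. "dark"
    voice_assistant: str  # e.g. "assistant" or "siri"
    font_scale: float = 1.0

@dataclass
class UserProfile:
    profile_id: str       # identity the user authenticates with on any device
    hmi: HmiPreferences
    favorite_apps: List[str] = field(default_factory=list)

# A "rolling device" would fetch this profile on first authentication, so the
# user does not need to configure anything in the car itself.
profile = UserProfile(
    profile_id="user-4711",
    hmi=HmiPreferences(ecosystem="ios", color_scheme="light", voice_assistant="siri"),
    favorite_apps=["navigation", "podcasts", "messaging"],
)
print(profile.hmi.ecosystem)  # -> "ios"
```

A production profile would of course carry far more detail and be protected end to end; the sketch only illustrates that the preferences relevant to the in-car HMI already exist outside the vehicle.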
The current evolutionary step in the digital device ecosystem lies in synchronizing initiated activities across all of a user's devices. Users authenticate with their profile ID and do not need to make any further settings on a device before using it for the first time. Already today, this is orchestrated by Apple and Google in the background and is based on continuous learning, at a very high level of security and data protection.

To optimize the mobility experience for each and every user, manufacturers will need detailed insight into user preferences and their mobility-relevant apps. The key to obtaining consent to use this information lies, quite literally, in the key to the car itself. In the future, manufacturers will want to provide their vehicles with a digital, that is, smart-device-based key as standard equipment. The purpose is not to earn money with this feature but to give users access to the entire world of the vehicle, of the brand and of the company's mobility offerings. Every smart device, including all future "rolling devices," will require an automatically applied, personalized identity (ID) plus profile in order to deliver a learning, 100 % human-centric UX, UI and HMI experience that anticipates users' wishes based on the full situational context. No car manufacturer, not even Tesla, has implemented the digital key in this fashion.

What manufacturer value proposition could be so compelling that it makes all users want to grant access to their interfaces and personal preferences? Throughout history, privileged people have expressed their wealth by employing staff. Using digital technologies, the experience of having the best personal assistants, butlers and chauffeurs for mobility can not only be emulated for the first time but can also be offered to nearly everybody, seemingly "free of charge," via a cloud service across all IoT devices.

In the initial stages of maturity, this will likely have the nature of an assistant, meaning it will still require some direct instructions and feedback from its users. In later stages, particularly when data from all interactions across all mobility activities are gathered and evaluated from the individual user's profile history, the assistance function will evolve into something new and more significant. From years of aggregated information, the mobility assistant will increasingly recognize the contexts that trigger user preferences: when users want to be mobile, in which situations they choose a particular means of transport, what the experience is like before and during the journey, whether they travel alone or with passengers, and much more. Already today there is a trend, in China among other places, to humanize the assistance function with avatars. A system that takes shape, interacts and learns with this frequency, intensity and intimacy creates a relationship so human-like that users ultimately perceive it as humanoid [1].

This example shows that there is a strong need to create these assistants and companions, and to do so with the highest level of system security, privacy and governance. Even more important is the following understanding: if users grant companies insight into their lives this intimately and on this scale, the business model must not lie in selling this data to third parties. The auto-mobility business model will need to lie in using this information to make each journey faster, more convenient, less stressful and more ecologically sustainable.
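How aggregated interaction history could turn into context-dependent anticipation can be illustrated with a minimal sketch. The context keys and the most-frequent-choice heuristic below are assumptions for illustration only, not the actual logic of any production assistant.

```python
from collections import Counter, defaultdict
from typing import Optional, Tuple

# Minimal sketch of context-dependent preference learning: record which option
# a user chose in which context, then suggest the most frequent past choice.

class PreferenceLearner:
    def __init__(self) -> None:
        self._history = defaultdict(Counter)  # context -> Counter of observed choices

    def observe(self, context: Tuple[str, ...], choice: str) -> None:
        """Record one interaction, e.g. context=("weekday_morning", "solo")."""
        self._history[context][choice] += 1

    def suggest(self, context: Tuple[str, ...]) -> Optional[str]:
        """Return the most frequently chosen option for this context, if any."""
        counts = self._history.get(context)
        return counts.most_common(1)[0][0] if counts else None

learner = PreferenceLearner()
learner.observe(("weekday_morning", "solo"), "news_briefing")
learner.observe(("weekday_morning", "solo"), "news_briefing")
learner.observe(("friday_evening", "with_passengers"), "shared_playlist")
print(learner.suggest(("weekday_morning", "solo")))  # -> "news_briefing"
```

A real assistant would weight recency and confidence and respect privacy constraints; the point of the sketch is only that the raw material is the per-user interaction history accumulated across devices.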
A 100 % human-centric vehicle HMI needs to start with understanding the expectations users have built up in their digital lives away from the car: information structure, screen design, dialogue architecture, and so on. In this context, user-friendliness is an essential purchase criterion for 93 % of all smartphone users [2]. Even with respect to seemingly simple questions such as "Where can I see the time of day?" or "How can I take an incoming phone call?", people are influenced so deeply by the digital ecosystem they have chosen that even minor changes in layout, color and form cause them to feel stress, FIGURE 3.

FIGURE 3 Example of the spread of requirements regarding "intuitive usability," illustrated by the use case of an incoming phone call: UI expectation depending on the operating system used (Android versus iOS) and the technology chosen (cellular telephony versus Voice over IP) (© IAV)

Therefore, the free assignment of a function to a display-operating element is an important success factor. Along this line of thought, two flexibility challenges can be seen:
- The same function is presented to each user in a slightly different way on a selected operating element (how?).
- A function can be assigned freely, or multiple times, to all available display-operating elements (where?).

After users opt in, their display and control preferences are transferred as extensively as possible from their smart devices or cloud services. Hardly any setting requires explicit user intervention. At the same time, this self-adaptation scheme can easily be overridden, for example when a user chooses the manufacturer's traditional "look and feel" or selects a new third-party theme world for the car, for example TikTok, Game of Thrones, etc.

It is anticipated that flexibly formable, intelligent, morphing multi-functional interfaces have the potential to gradually replace classic instrumentation across all vehicles [3]. As an example, all classic displays as well as the decorative panels could be removed from a car and replaced with all-round flex-form OLED modules, creating a screen belt in the vehicle's interior. Such a screen belt opens up a completely new UX/UI/HMI experience space in the vehicle, one that far exceeds what the replaced components offered. As a design and ambient light panel, the screen belt can, for example, reproduce classic looks as well as completely new, dynamic patterns. In addition to standard patterns and photos that a user swipes from the smartphone toward the screen belt, third-party providers can introduce entire theme worlds into the appearance of the vehicle via an interface provided by the manufacturer. This interface creates a "per use" business model. At the beginning, earnings will probably be insignificant, but as manufacturers learn where users are willing to spend money, earnings will grow with the speed of learning, just as they did for Google, Apple and Amazon before them.

As most users find touchscreen operation in the middle of the instrument cluster awkward, some segments of the screen belt in the HMI premium version should extend and tilt toward the driver's or passenger's hand for optimum operation. The key to ease of use in this case goes beyond component technology: it lies in anticipating, in a context-sensitive manner, which person in the vehicle wants to do what next.
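A minimal sketch of this anticipation logic, assuming illustrative segment names, occupant roles and a simple rule table (none of which represent an actual IAV or production implementation): given who is reaching toward which surface and in which driving state, decide which screen-belt segment should extend and which control set it should morph into view. The window-control example that follows shows how such anticipation would feel in practice.

```python
from dataclasses import dataclass

# Minimal sketch of context-sensitive anticipation: decide which screen-belt
# segment extends toward an occupant and which controls it shows.
# Segment names, occupant roles and the rule table are illustrative assumptions.

@dataclass(frozen=True)
class InteractionContext:
    occupant: str        # "driver" or "front_passenger"
    hand_near: str       # surface the interior camera sees the hand approaching
    vehicle_state: str   # e.g. "parked", "assisted_driving", "dynamic_maneuver"

# (occupant, surface) -> (screen-belt segment to extend, control set to present)
ANTICIPATION_RULES = {
    ("driver", "door_panel_left"):           ("belt_segment_driver_door", "window_controls"),
    ("front_passenger", "door_panel_right"): ("belt_segment_passenger_door", "window_controls"),
    ("driver", "center_belt"):               ("belt_segment_center", "media_controls"),
}

def anticipate(ctx: InteractionContext) -> tuple:
    """Return (segment, control set) to present, or stay with the ambient pattern."""
    if ctx.vehicle_state == "dynamic_maneuver":
        # Illustrative safeguard: do not morph controls during demanding driving.
        return ("none", "ambient_pattern_only")
    return ANTICIPATION_RULES.get((ctx.occupant, ctx.hand_near),
                                  ("none", "ambient_pattern_only"))

print(anticipate(InteractionContext("driver", "door_panel_left", "assisted_driving")))
# -> ('belt_segment_driver_door', 'window_controls')
```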
In this HMI scheme, opening a window all the way or only part of the way can be done in a variety of ways: by voice dialogue; via a hand gesture at the window that is identified and intelligently interpreted by the interior camera; by touching the screen belt module that extends outward; or via the screen belt door panel, which replaces part of the ambience pattern as the hand approaches. The door panel section of the screen belt can morph and show window controls in classic, modern or customized form.

For users, the beauty of this HMI concept lies in the fact that it adapts to each and every one of them. For manufacturers, the benefits lie in reducing component and wiring harness variants. As a consequence, much of the complexity is shifted from hardware to the car platform and software. It goes without saying that developing and managing an HMI like the one just illustrated requires a completely new set of tools for development and operations.

The use of public transportation or ride sharing comes with a fundamental drawback: the loss of privacy for the traveler. For many, the car is one of the few remaining retreats where they can experience pure freedom of choice: listening to loud music, enjoying peace and quiet, making a confidential phone call while not having to wear a Covid-19 face mask. These are just three of the many examples that make a car journey a privilege in a hectic society so filled with the need for self-discipline. Thus, the most important value proposition of future vehicle interior concepts lies in the ability for driver, front passenger and rear-seat occupants to cocoon themselves. Or, if preferred, the opposite: to enable effortless interaction between driver, front passenger and rear-seat occupants, FIGURE 4. Whether one thinks of personalized, dynamic and anticipatory air conditioning, ambience control, communication, playing multimedia content or automatically orchestrating the transition between self-driving and assisted driving: IAV sees cocooning as one of the most precious value propositions overall. It substantially reduces any journey's stress level and, in return, generates a willingness to pay a premium for a privilege of this nature.

Ever since the first car was built, the intention has been to make the car HMI driver-oriented. HMI functions were developed and implemented separately from one another; the HMI was a part of the car. So far, the concept of a "one-size-fits-all" HMI has been accepted by drivers; they were willing to spend time learning and adapting to each manufacturer's control logic. While this HMI paradigm was successful for over a hundred years, it is no longer considered sustainable today. Digital systems have fundamentally changed user expectations; they define the understanding of intuitive, personalized and cross-device usage more than anything else in people's lives. The ability to integrate a vehicle seamlessly and individually into the user's digital ecosystem is becoming a key to success for automobile manufacturers. As a consequence, a human-centric HMI presents two design challenges:
- HMI components in the car will continue to be essential but will become a commodity.
- A cloud-based mobility companion will be created that knows and understands each user so well that it improves every aspect of mobility, from the first moment the user thinks of a journey until they arrive.
This goal is not achievable in a single development step.
It takes continuous, rapid, system-supported learning from each interaction of each user in the field to create and sustain a premium UX, UI and HMI. Consequently, future components in the vehicle will be defined by multi-functional usability and a platform-based, high-security architecture, permitting a software-based, continuous evolution of the human-machine interface over the vehicle's entire service lifetime. According to [4], "for an OEM, the central mission will be to attune to the personalized, on-demand and situation-specific delivery of content with a view, at the end of the day, to creating the ultimate experience for users, current drivers and future mobility customers."

Emulating or adopting the HMI control paradigms of the systems of Google, Apple, WeChat, etc. collides with the strategies of almost all established manufacturers. However, this ability and willingness will be essential for a 100 % human-centric HMI. Emerging electric vehicle manufacturers can already be seen trying to leverage this HMI paradigm to "leapfrog" the automotive establishment. The former principle of creating a "one-size-fits-all" HMI is giving way to the "anytime, anywhere" doctrine of the digital age. Intuitive usability in the car, as on any digital device, requires self-adaptation of the HMI based on extensive user preference and interaction history outside of the car. The trust and confidence of future vehicle users will be earned by providing the highest standards of data security and privacy to all users in everyday mobile life.

A vehicle whose HMI is adapted to people's needs represents an element of freedom and a sanctuary at the same time. It will therefore continue to trigger such a high level of passion and willingness to buy that developers can look to the future with optimism despite the demanding work that lies ahead of them. The HMI of the future belongs to people.

REFERENCES
[1] Turkle, S.: Alone Together - Why We Expect More from Technology and Less from Each Other
[2] What Smartphone Buyers Really Want
[3] Intelligent Surfaces as Discreet Control Elements
[4] Smart Mobility in der Praxis: Das Auto - unverzichtbar für den intermodalen Verkehr