Christina Huber (left), User Experience Designer at Audi, and Jan Pflüger, Mixed Reality Expert at Audi, developing the innovative Audi dimensions operating concept for the Audi activesphere concept*.
A hand makes a delicate gesture in the air, as if it were turning a dial. With just a flick of the wrist, the hand then swipes to the left. It is the hand of Christina Huber, User Experience Designer at Audi, who is sitting in a mock-up of the Audi activesphere concept*. She takes off her high-tech glasses – and smiles: “It’s all about intuitive movement; that’s how Audi dimensions works.” Jan Pflüger, Coordinator Augmented & Virtual Reality and Mixed Reality Expert at Audi, who is sitting next to Christina, agrees: “We equip our customers with sensor technology that works both ways. It delivers the information the users need. At the same time, it finds out what the users actually want to do with it.”
The mock-up of the Audi activesphere concept* is a model built to scale for testing the functionalities of the vehicle.
Jan Pflüger checking the mixed reality glasses. They are connected to a small computer unit attached to his belt.
Christina and Jan are developing the innovative operating concept of the Audi activesphere concept*. The graphical user interface (GUI) is no longer permanently displayed in one place, on screens or as a projection, as it has been up to now; instead, it appears in the right place at the right time, and it is no longer just two-dimensional but three-dimensional. When you wear the mixed reality glasses, the operating elements appear in three-dimensional space, and you can interact with the content using natural gestures within this space.
Christina: “With this operating concept, there are no classic screens where you tap on the selected content. The glasses use sensors to detect what the user is focusing on and move the control element towards them.” That sounds simple, but it is in fact highly complex. That a vehicle uses sensors to understand its surroundings has become state of the art. What is entirely new, however, is that mixed reality glasses give the driver the same kind of sensor technology to locate them in space. Jan: “We want to interact spatially and use the most advanced mixed reality technology currently available. Unlike virtual reality glasses, which isolate you from the real world, mixed reality glasses let you actually see the world around you through their transparent displays. Any information that is relevant to you is then displayed in your field of vision.” Almost reverently, Jan lifts the glasses, turns them in the air and looks at them from all sides.
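Technically, determining which control the wearer is focusing on can be pictured as a simple geometric test: the gaze ray reported by the glasses is compared against the virtual anchor points of the controls in the cabin. The following TypeScript sketch is purely illustrative; the types and the findFocusedControl helper are assumptions made for this article, not Audi’s actual software.

```typescript
// Illustrative sketch only: how a gaze ray could be matched to a 3D control anchor.
// All names and values here are assumptions, not Audi's implementation.

type Vec3 = { x: number; y: number; z: number };

interface Control {
  id: string;               // e.g. "climate.temperature" (hypothetical identifier)
  anchor: Vec3;             // fixed 3D position of the control in the cabin
  activationRadius: number; // how close the gaze ray must pass, in metres
}

// Basic vector helpers
const sub = (a: Vec3, b: Vec3): Vec3 => ({ x: a.x - b.x, y: a.y - b.y, z: a.z - b.z });
const dot = (a: Vec3, b: Vec3): number => a.x * b.x + a.y * b.y + a.z * b.z;
const length = (a: Vec3): number => Math.sqrt(dot(a, a));

// Distance from a gaze ray (origin plus unit-length direction) to a control's anchor point.
function rayPointDistance(origin: Vec3, direction: Vec3, point: Vec3): number {
  const toPoint = sub(point, origin);
  const t = Math.max(0, dot(toPoint, direction)); // project onto the ray, ignore points behind the head
  const closest: Vec3 = {
    x: origin.x + direction.x * t,
    y: origin.y + direction.y * t,
    z: origin.z + direction.z * t,
  };
  return length(sub(point, closest));
}

// Return the control closest to the gaze ray, if any lies within its activation radius.
function findFocusedControl(origin: Vec3, direction: Vec3, controls: Control[]): Control | null {
  let best: Control | null = null;
  let bestDistance = Infinity;
  for (const control of controls) {
    const d = rayPointDistance(origin, direction, control.anchor);
    if (d <= control.activationRadius && d < bestDistance) {
      best = control;
      bestDistance = d;
    }
  }
  return best;
}
```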
The mixed reality glasses, which locate the driver in space, are equipped with numerous sensors and cameras. These help to contextualise the surroundings, so the experience with the glasses can be programmed in such a way that all the controls appear exactly where they are needed.
For all its complexity, three-dimensional operation must remain logical. Jan Pflüger and Christina Huber exchange views on this challenge.
Jan Pflüger
The mixed reality glasses* don’t isolate you from the real world; you can actually see the world around you through their transparent displays.
Jan wearing the highly complex mixed reality glasses. They contextualise the surroundings using numerous sensors and cameras.
And what’s more: The virtual control is intelligent and moves towards the user. As long as the user looks aimlessly around, nothing happens; the UI elements, which enable various operations, remain inactive. But once the user focuses on one of these elements, thus signalling interest, the system displays more detailed information and moves the element towards the user, where it can be controlled via gestures. Here’s an example: The so-called Audi dimensions anchor point in the door of the Audi activesphere concept* is a physical element. When you rest your eyes on it, the system registers your interest and displays, for example, the current temperature of the interior. When you focus on it for longer, it becomes interactive: The UI element moves towards you and can be controlled virtually. When you’re done, it disappears after a few seconds. Alternatively, you can swipe the content away with your hand, returning the UI element to its sleep mode. So physical elements are not just superimposed with virtual 3D content; there is an intelligent, context-based user interface that is displayed flexibly and can be controlled seamlessly and conveniently. Audi dimensions focuses on the users and understands their needs.
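The behaviour described above, a short glance for a summary, a longer focus to activate, a swipe or a few idle seconds to dismiss, can be thought of as a small state machine per UI element. The sketch below is a hedged illustration of that idea; the dwell thresholds, timeouts and state names are assumptions, not Audi’s implementation.

```typescript
// Illustrative sketch of a dwell-based activation state machine.
// Thresholds, timeouts and state names are assumptions, not Audi's implementation.

type ElementState = "inactive" | "preview" | "interactive";

const PREVIEW_DWELL_MS = 300;   // assumed: a brief gaze shows summary information
const ACTIVATE_DWELL_MS = 1200; // assumed: a longer focus makes the element interactive
const IDLE_TIMEOUT_MS = 3000;   // assumed: the element goes back to sleep after a few seconds

interface UiElement {
  id: string;
  state: ElementState;
  gazeStartedAt: number | null; // when the user's gaze landed on the element
  lastInteractionAt: number;    // last gaze or gesture, used for the idle timeout
}

// Called every frame with whether the element currently has the user's gaze
// and whether a "swipe away" gesture was detected.
function updateElement(el: UiElement, hasGaze: boolean, swipedAway: boolean, now: number): void {
  if (swipedAway) {
    // Explicit dismissal returns the element to its sleep mode.
    el.state = "inactive";
    el.gazeStartedAt = null;
    return;
  }

  if (hasGaze) {
    el.gazeStartedAt ??= now;
    el.lastInteractionAt = now;
    const dwell = now - el.gazeStartedAt;
    if (dwell >= ACTIVATE_DWELL_MS) {
      el.state = "interactive"; // the element moves towards the user and accepts gestures
    } else if (dwell >= PREVIEW_DWELL_MS && el.state === "inactive") {
      el.state = "preview";     // e.g. show the current interior temperature
    }
  } else {
    el.gazeStartedAt = null;
    if (el.state !== "inactive" && now - el.lastInteractionAt >= IDLE_TIMEOUT_MS) {
      el.state = "inactive";    // disappears after a few seconds without attention
    }
  }
}
```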
When you are in the front left seat of the Audi activesphere concept*, the steering wheel is retracted into an invisible position. If the driver wants to take over, the steering wheel swivels out, and the content of the user interface adjusts to manual driving mode. Christina: “Even with navigation, you don’t just click on an icon to access a use case; you can interact three-dimensionally with the navigation map. There are different layers for different levels of information, which build up gradually depending on what interests you.”
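One way to picture how the interface could follow the driving mode and build up map layers only as interest grows is the short sketch below; the mode names, layer names and the interest scale are illustrative assumptions rather than details from Audi.

```typescript
// Illustrative sketch: interface content that follows the driving mode and the
// user's interest in the navigation map. All names and values are assumptions.

type DrivingMode = "automated" | "manual";

// Navigation information organised in layers that build up gradually.
const NAVIGATION_LAYERS: string[] = ["route overview", "points of interest", "lane guidance"];

// Called when the steering wheel swivels out and the driver takes over,
// or when the vehicle returns to automated driving.
function contentForMode(mode: DrivingMode, navigationInterest: number): string[] {
  // In manual driving mode, driving-relevant elements move into the foreground.
  const base = mode === "manual"
    ? ["speed", "driver assistance status"]
    : ["ambient scenery", "media"];

  // Further map layers appear only as the user shows more interest in the map.
  const layerCount = Math.max(0, Math.min(navigationInterest, NAVIGATION_LAYERS.length));
  return [...base, ...NAVIGATION_LAYERS.slice(0, layerCount)];
}

// Example: the driver has taken over and focused on the map twice.
// -> ["speed", "driver assistance status", "route overview", "points of interest"]
console.log(contentForMode("manual", 2));
```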
When you leave the vehicle to explore nature on your mountain bike or skis, for example, you can keep wearing the glasses so that relevant information continues to be displayed, such as trail networks, slope gradients or warnings about limited visibility.
Jan: “Not only are we making sure that now, for the first time, you can use such a technology to interact with your vehicle and control its functions, but we go far beyond that. Ultimately, the vehicle is a digital mixed reality platform that is ready for what we hope to see in the future: I get into the car, and it comes naturally to me to control it via the MR glasses. I get out of the car and go about my activities such as skiing or mountain biking, where the glasses will also be of support. The vehicle is my extended companion, enabling this seamless experience.” The car of the future is embedded in an ecosystem, enriching the real world with context-based virtual content.
Christina Huber
Christina Huber wearing mixed reality glasses.
The Audi dimensions anchor point located in the door is a physical element. When you rest your eyes on it, it will become interactive*.
This technology of the future enables designers to create a previously unimagined user experience. Christina Huber: “The vehicle itself is a holistic feel-good zone without any visible computer technology. And yet it is intelligent – but only in the background. Only when I put on the mixed reality glasses do I bring the controls to life.”
This opens up completely new possibilities for future vehicle design. Designers will no longer have to consider where controls should be physically placed. No matter how and where you sit in the vehicle, be it in a lounge or a reclining position, the mixed reality glasses always guarantee access to all the controls. Completely different design languages thus become conceivable, and an entirely new interior design becomes possible. The future remains innovative.