The Future of Mobility: BMW “i Interaction Ease” with Motius at CES 2020
Imagine a car that interacts with you naturally. It reacts to your voice, your gestures, your gaze. That is the future of mobility. And the future of mobility is close — in fact, it’s already here. How do we know? We are a part of it; we’re creating it. In this blog post, we take you on a journey behind the scenes of the BMW “i Interaction Ease” — because Motius was in the middle of it all.
What the future will be like
Swipe your hand through the air to darken the side windows. Press on the surface of your seat to make a footrest come up. Gaze through the windscreen to see an augmented version of the city: floating icons that provide enhanced information about your surroundings.
We could go on and on with such examples, but you get the concept. The way that you will interact with cars in the future will be natural and intuitive — it just makes sense to the user. Why is that so important?
Once your car drives autonomously and you no longer need to drive it yourself, you want the best travel experience possible. You won’t get that by manually turning knobs or hunting for buttons to control your entertainment system. Knobs and buttons simply don’t sound like a great concept for the future.
But why wait for Level 5 driving automation to create the best travel experience possible? What if we used cutting-edge technologies to build a functioning system with all those cool features now? Well, that is exactly what we did with the BMW “i Interaction Ease” for CES 2020.
Still can’t imagine what this whole thing looks like? Check out the video below.
Now that you know exactly what we’re talking about, let’s look at how it actually works.
An Intelligent Personal Assistant is the brain of it all
As you saw in the video, Anna’s car experience is seamless. It is as though the car knows what she is doing, what she needs, what she is about to do. How is that possible? Everything she does is recognized by the system surrounding her. We call this system IPA — Intelligent Personal Assistant.
The IPA is the car’s brain. It’s where all the different sub-systems come together. It’s where the car thinks. Many people say it’s where the magic happens.
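To make the idea of a central “brain” more concrete, here is a minimal sketch of how such a hub could route events between subsystems. This is purely illustrative — the class and event names are our own invention, not the actual IPA implementation:

```python
from collections import defaultdict

class AssistantHub:
    """Minimal publish/subscribe hub: subsystems (gaze, gesture, voice)
    publish events, and registered handlers react to them."""

    def __init__(self):
        self._handlers = defaultdict(list)

    def subscribe(self, event_type, handler):
        """Register a handler that reacts to one type of event."""
        self._handlers[event_type].append(handler)

    def publish(self, event_type, payload):
        """Forward an event to every subscribed handler and collect results."""
        return [handler(payload) for handler in self._handlers[event_type]]

# Example wiring: a gaze event triggers an AR overlay request.
hub = AssistantHub()
hub.subscribe("gaze", lambda p: f"show AR overlay for {p['target']}")
print(hub.publish("gaze", {"target": "restaurant"}))
# prints ['show AR overlay for restaurant']
```

The point of a hub like this is that new subsystems can be added without the existing ones knowing about each other — they all just talk to the brain.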
But what different sub-systems are involved in such a seamless, next-level user experience? Well, let us give you some insights.
Enhancing your reality with AR windscreens
“Where do all the floating icons with restaurant information come from? Why can I watch movies on my windscreen? How is it pos-”, we cut you right there. The answer: panorama head-up displays.
The cool thing? This is not just one of the tiny head-up displays with traffic information that you get nowadays, but one the size of the whole windscreen. It enables you to interact with the environment you are currently seeing. How does it do that?
First, through geolocation, the IPA knows exactly where you are at any moment. In addition, gaze-recognition cameras detect what you are looking at. The system then uses Augmented Reality to display the relevant information right on top of those objects.
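Combining geolocation with gaze direction boils down to a geometric question: which point of interest lies inside the cone of your gaze? A toy sketch of that matching step, with made-up POI names and a flat 2-D map frame for simplicity:

```python
import math

# Hypothetical points of interest near the car: (name, x, y) in meters
# in a local map frame centered on the area. Names are invented.
POIS = [("Trattoria Roma", 40.0, 5.0), ("City Cinema", -20.0, 30.0)]

def poi_in_gaze(car_xy, gaze_angle_rad, fov_rad=math.radians(10)):
    """Return the closest POI whose bearing from the car lies within
    the gaze cone, or None if nothing matches."""
    best, best_dist = None, float("inf")
    for name, px, py in POIS:
        dx, dy = px - car_xy[0], py - car_xy[1]
        bearing = math.atan2(dy, dx)
        # Smallest absolute angle between gaze direction and POI bearing
        diff = abs((bearing - gaze_angle_rad + math.pi) % (2 * math.pi) - math.pi)
        dist = math.hypot(dx, dy)
        if diff <= fov_rad / 2 and dist < best_dist:
            best, best_dist = name, dist
    return best

# Car at the origin, gazing roughly along the x-axis:
print(poi_in_gaze((0.0, 0.0), math.radians(7)))
# prints Trattoria Roma
```

A real system would work in 3-D, fuse noisy gaze estimates over time, and query a live map service, but the matching idea is the same.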
Sounds easier than it is. The software has to be paired with gaze- as well as hand-recognition cameras in order to make this happen. Synchronizing all this input was one of the main challenges since the different building blocks came from different suppliers.
The gesture control, the hardware, and the applications were made by several project partners, while the overall concept was designed by Designworks. We at Motius built the software that coordinated the different subsystems. Thanks to the expertise of our partners and ourselves, we were able to make the IPA work smoothly.
Imagine you look at a restaurant and your car shows you the menu. Or you point at a cinema and your car shows you tonight’s movies. It looks effortless, but getting to a seamless user experience involves endless hours of development. Together, we made it happen — we built the future with emerging technologies that are here today. How cool is that?
But wait, there’s more…
Talking to your car via a Voice Service
You probably noticed the nice “Have a nice day, Anna” at the end of the video. As you can imagine, the Voice Service was a key element of the intuitive user interaction: speech recognition lets you simply talk to your car. But did you also hear how human the voice sounds? No? Make sure to check it out again, then read on to find out about neural voices.
We all know those horrible robotic computer voices — nobody likes listening to them, let alone talking to them. That’s why we use neural voices. With speech synthesis technology, we built a natural text-to-speech/speech-to-text (TTS/STT) interaction.
In other words, speech synthesis technology enabled us to use unique, customized, human-like voices. They are the basis for a Voice Service that you actually enjoy talking to. Are we the only ones wondering why there are still so many robotic voices out there? Come on guys, it’s 2020.
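The TTS/STT loop itself is a simple pipeline: audio in, text out, a decision in the middle, text back in, audio out. Here is a stripped-down sketch with stand-in functions — in a real system, `transcribe` and `synthesize` would call neural STT and TTS models, which we only mock here:

```python
def transcribe(audio):
    """Stand-in for a speech-to-text (STT) engine; a real system would
    run a neural STT model on the audio signal."""
    return audio["transcript"]

def synthesize(text):
    """Stand-in for a neural text-to-speech (TTS) engine; real systems
    return an audio waveform, we return a labeled string."""
    return f"<neural-voice>{text}</neural-voice>"

def handle_utterance(audio):
    """One turn of the voice loop: hear, decide, answer."""
    text = transcribe(audio)
    if "traffic" in text.lower():
        reply = "It will clear up in ten minutes. Want to watch a movie?"
    else:
        reply = f"You said: {text}"
    return synthesize(reply)

print(handle_utterance({"transcript": "How bad is the traffic?"}))
```

The interesting engineering happens inside the two stand-ins; the loop around them stays this simple.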
But of course, the Voice Service doesn’t just have a nice voice; it also offers functions that go way beyond what Siri, Alexa & Co. can do. What’s so special about this Voice Service is that it reacts to you and to what’s happening inside and outside the car.
Imagine you’re stuck in traffic and the car recognizes that it’ll take a while to get out of there. What would you do with your free time? Watch Netflix? Take a nap? Just listen to your favorite music? All these are options that the Voice Service would suggest by speaking to you, depending on the situation, your habits, your mood etc.
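Such situational suggestions can be thought of as rules mapping context to offers. The following sketch is our own simplified illustration — the context keys and the suggestions are invented, and a production system would learn preferences rather than hard-code them:

```python
def suggest(context):
    """Map the current situation and the passenger's habits to
    activities the voice service could offer. Purely rule-based toy."""
    suggestions = []
    if context.get("traffic_delay_min", 0) >= 15:
        if "movies" in context.get("habits", []):
            suggestions.append("Watch an episode on the panorama display?")
        if context.get("time_of_day") == "evening":
            suggestions.append("Recline the seat and take a nap?")
        suggestions.append("Play your favorite playlist?")
    return suggestions

ctx = {"traffic_delay_min": 25, "habits": ["movies"], "time_of_day": "evening"}
for s in suggest(ctx):
    print(s)
```

With no delay in the context, the function suggests nothing — the assistant stays quiet unless the situation calls for it.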
Again, it’s not a simple thing to build. To create such a user experience, multiple systems need to interact with each other and work together perfectly. What if there’s one little bug in the code or one camera that doesn’t recognize your hand gesture? The whole experience would seem ridiculous. The fact that we were able to pull that off proves that the future of mobility is already here.
Sit back and enjoy your interactive Zero Gravity Seat
With BMW i Interaction Ease, everything is connected, everything has a function. Many different systems and technologies interact with each other to enable a seamless natural travel experience. Let’s have a look at the seat as an example.
Why does the seat light up as you touch it? Simply put: the sensors underneath detect that you touched it. Whenever your seat lights up, you know that you can interact with the car in a certain way.
If, for example, you slide backwards with your hand, your seat tilts backwards. Sounds intuitive, right? The interaction is seamless, but it requires a lot of effort to build something like that.
Although the seat wasn’t the part that Motius was most involved in, we’re still proud of what was achieved during the whole project. After all, it was great teamwork and our car intelligence made the background system interactions possible.
The seat’s material needs to allow the tech underneath to recognize the interaction but still be comfortable. The tech underneath needs to recognize your input, interpret it correctly and react accordingly. The system that tilts your seat backwards needs to get the signal and move your seat fluently. And whatever happens, your car’s “intelligence” — the IPA — needs to know about it.
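The chain described above — sensor detects touch, input is interpreted, actuator moves the seat, and the IPA is kept informed at every step — can be sketched as a toy model. Class, method, and event names are our own; the real actuation logic is of course far more involved:

```python
class Seat:
    """Toy model of the interaction chain: touch sensor -> gesture
    interpretation -> actuation, with every step reported to the IPA."""

    def __init__(self, ipa_log):
        self.angle = 0          # backrest angle in degrees
        self.lit = False
        self.ipa_log = ipa_log  # the IPA "knows about" every event

    def on_touch(self):
        """Sensor layer: light up the surface to signal interactivity."""
        self.lit = True
        self.ipa_log.append("seat touched, surface lit")

    def on_slide(self, direction):
        """Interpretation + actuation: a backward slide tilts the backrest."""
        if direction == "back":
            self.angle = min(self.angle + 15, 45)  # tilt in 15-degree steps
            self.ipa_log.append(f"backrest tilted to {self.angle} degrees")

log = []
seat = Seat(log)
seat.on_touch()
seat.on_slide("back")
print(log)
# prints ['seat touched, surface lit', 'backrest tilted to 15 degrees']
```

Note how every action ends by appending to the IPA’s log: that is the “whatever happens, the IPA needs to know about it” rule in miniature.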
The future is here
At CES in Las Vegas, the BMW i Interaction Ease delivered an important tech message: with many different systems and various partners, we managed to achieve something we’re really proud of — building the future of mobility with the emerging technologies we have at hand today. The result is a user experience unlike anything before.
We don’t need to wait for new tech to arrive, we can build the future with what we have. If you’re down for a ride, let us know!