Mitsubishi Electric Automotive America believes that providing information about what an automated driving system is "seeing" can make the occupants feel more confident while being transported. In this setup there are three screens: a vertically oriented 10.2-inch screen in the center and two horizontal 12.3-inch screens, one in front of the driver and one for the passenger. Information from the sensor array can be displayed for the driver and passenger in real time.
There is one critical thing that needs to be taken into account when it comes to automated driving: trust. As in the trust of the person who takes her hands from the wheel (assuming that there is one), and the trust of the person who is sitting in the passenger's seat (assuming that all the seats aren't passengers' seats).
And to address that, as Jacek Spiewla, advanced development engineer at Mitsubishi Electric Automotive America (MEAA; meaa-mea.com), demonstrates, they have developed what they're calling the FLEXConnect.AI in-vehicle infotainment system, which features a multiscreen interface. Three displays built by Mitsubishi are integrated into an instrument panel: two 12.3-inch 1920 x 720 displays and a 10.2-inch 768 x 1280 capacitive touch display between them. The first 12.3-inch display presents the driver's information. The central 10.2-inch display is the infotainment unit, where swipes and touches allow screens to be changed and options to be selected. Then there is the third screen, located directly in front of the front-seat passenger.
The system runs on a Snapdragon 820Am processor from Qualcomm. Spiewla notes that this quad-core processor is capable of handling all the screens, whereas "with traditional systems, you would need three separate processors to run the cluster, the center display and the passenger display." He adds of the approach they're taking: "We're able to reduce both complexity and time to market."
For the operating system, they're using Android. "One of the nice things about Android," he says, "is that you can customize each screen independently."
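As a rough sketch of what driving several screens from one Android image can look like, the snippet below uses Android's standard multi-display mechanism (DisplayManager plus ActivityOptions.setLaunchDisplayId) to put a dedicated activity on each additional display. The activity names are hypothetical placeholders, not MEAA's actual software.

```kotlin
import android.app.Activity
import android.app.ActivityOptions
import android.content.Intent
import android.hardware.display.DisplayManager

// Hypothetical per-screen UIs: one for the 10.2-inch center touch display,
// one for the passenger's 12.3-inch display.
class CenterStackActivity : Activity()
class PassengerActivity : Activity()

fun launchSecondaryScreens(host: Activity) {
    val displayManager = host.getSystemService(DisplayManager::class.java)
    // displays[0] is the default display; treat the rest as the extra screens.
    val extraDisplays = displayManager.displays.drop(1)
    val screenUis = listOf(CenterStackActivity::class.java, PassengerActivity::class.java)

    extraDisplays.zip(screenUis).forEach { (display, ui) ->
        // Route each activity to its own physical display as a separate task.
        val options = ActivityOptions.makeBasic().setLaunchDisplayId(display.displayId)
        val intent = Intent(host, ui)
            .addFlags(Intent.FLAG_ACTIVITY_NEW_TASK or Intent.FLAG_ACTIVITY_MULTIPLE_TASK)
        host.startActivity(intent, options.toBundle())
    }
}
```

Because each screen runs its own activity, the cluster, center stack and passenger display can be styled and updated independently, which is the flexibility Spiewla is describing.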
So what does any of this have to do with trust?
One of the things they've done is, as Spiewla puts it, make it possible for the vehicle occupants to be brought "into the loop of what the vehicle is doing."
The car goes into automated driving mode. At the same time, the screens can show what the vehicle's sensors are "seeing" in real time. "We have a real-time visualization of the automated driving task based on cameras and radar. It shows lane markings, pedestrians, parked vehicles, cross-street traffic, and other cars." The detected items are then categorized and given a "threat assessment." This allows people to better understand not only what the vehicle is "seeing," but also to gain confidence that what they can see by looking through the windshield is being monitored by the car as well.
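As an illustration of the kind of categorization involved (not MEAA's actual data model; the types and thresholds below are assumptions), a detection with a coarse threat ranking might be modeled like this:

```kotlin
// Illustrative detection model for the on-screen visualization: what the
// sensors report, plus a simple threat level that could drive color coding.
enum class ObjectClass { LANE_MARKING, PEDESTRIAN, PARKED_VEHICLE, CROSS_TRAFFIC, OTHER_CAR }
enum class ThreatLevel { NONE, LOW, HIGH }

data class Detection(
    val kind: ObjectClass,
    val rangeMeters: Double,      // distance from the ego vehicle
    val closingSpeedMps: Double   // positive when the object is approaching
)

// Naive threat assessment: nearer, approaching objects rank higher.
fun assess(d: Detection): ThreatLevel = when {
    d.kind == ObjectClass.LANE_MARKING -> ThreatLevel.NONE
    d.rangeMeters < 20.0 && d.closingSpeedMps > 0.0 -> ThreatLevel.HIGH
    d.rangeMeters < 60.0 -> ThreatLevel.LOW
    else -> ThreatLevel.NONE
}
```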
Another feature of the Snapdragon 820Am processor is a Qualcomm Snapdragon X12 LTE modem that provides up to 600 Mbps downlink and 150 Mbps uplink speeds. Spiewla shows how they're using it in a couple of ways, one of which adds a measure of confidence whether or not autonomous driving is involved.
MEAA is partnering with AccuWeather, which makes it possible to provide information about what's going to occur along a route. Spiewla programs a route from Atlanta to Memphis that the car "thinks" it is traveling. (We're actually in a static Audi in a garage in Plymouth, Michigan.) At the time, there are actual thunderstorms 60 miles ahead on the route, which are displayed on the screens of the FLEXConnect.AI system. This allows the driver and the other vehicle occupants to have a sense of what's ahead.
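One way such route-ahead weather could be wired up is sketched below; WeatherProvider is a hypothetical stand-in rather than AccuWeather's actual API, and the route is reduced to a simple list of waypoints.

```kotlin
// Hypothetical sketch: find the first weather hazard ahead on a programmed route.
data class Waypoint(val lat: Double, val lon: Double, val milesFromStart: Double)

interface WeatherProvider {
    // Returns a hazard description for a location (e.g., "thunderstorm") or null if clear.
    fun alertAt(lat: Double, lon: Double): String?
}

// Walks the remaining waypoints and reports the first hazard plus its distance ahead.
fun nextHazard(
    route: List<Waypoint>,
    milesTraveled: Double,
    weather: WeatherProvider
): Pair<String, Double>? =
    route.filter { it.milesFromStart > milesTraveled }
        .firstNotNullOfOrNull { wp ->
            weather.alertAt(wp.lat, wp.lon)?.let { alert ->
                alert to (wp.milesFromStart - milesTraveled)
            }
        }
```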
Another company MEAA is working with is Movimento (movimentogroup.com), which provides over-the-air updates. Think of it as updating your smartphone, but in this case you're updating various systems in your vehicle. Connectivity can occur via the embedded modem in the car or over a Wi-Fi connection (e.g., when the car is parked in your garage, it can connect to the home Wi-Fi network, or a smartphone can be set up to serve as a hotspot). The vehicle makes a secure connection to the cloud, the vehicle's VIN is checked to assure that it is the vehicle in question, and the necessary downloads are made to the required components in the vehicle (or the updates can be scheduled for convenience). These updates could be for a variety of things, including infotainment, navigation and the powertrain. Or for presenting the information about what the car is "seeing" to the vehicle occupants, as in adding labels to the objects (e.g., "pedestrian").
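In outline, that update flow could look like the following sketch; the UpdateServer interface and data types are hypothetical placeholders, not Movimento's actual API.

```kotlin
// Sketch of the OTA sequence described above: secure session, VIN check,
// then install (or defer) any packages pending for the vehicle's components.
data class UpdatePackage(val targetEcu: String, val version: String, val payload: ByteArray)

interface UpdateServer {
    fun openSecureSession(vin: String): Boolean           // secure handshake plus VIN verification
    fun pendingUpdates(vin: String): List<UpdatePackage>  // e.g., infotainment, navigation, powertrain
}

fun runOtaCycle(
    server: UpdateServer,
    vin: String,
    installNow: Boolean,
    install: (UpdatePackage) -> Unit
) {
    if (!server.openSecureSession(vin)) return   // wrong vehicle or failed handshake: do nothing
    val updates = server.pendingUpdates(vin)
    if (installNow) updates.forEach(install)     // otherwise leave them for the scheduled window
}
```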
Getting data onto the screens is something that MEAA is working on, too, says Mark Rakowski, executive director of sales and engineering for the organization. "We have a variety of ADAS [advanced driver assistance systems] products, such as for automated parking, lane-keeping assistance and lane departure warning. Another division is working on mobile mapping." They're producing radar and ultrasonic sensors. They are developing ECUs for ADAS applications (working with processor suppliers like Qualcomm, Renesas, NVIDIA and Intel). While they don't make LiDAR sensors, Rakowski says they're looking to partner with companies that do.
Rakowski says that system development is occurring at an unprecedented speed. "People are designing systems around chips that aren't even available yet. From an automotive standpoint, we never did that. The computer industry did." He explains that a company will tell MEAA what its next chip will do, and that they'll provide samples in six months. "We used to wait for the samples," Rakowski says. "Now we're designing around something that's just on paper."
While he thinks it will take some time for there to be a significant number of fully autonomous vehicles ("A lot of people are talking about 2021, but how much of the fleet will that be?"), he thinks that ADAS will continue to make big strides in the safety, comfort and convenience it provides.
"Costs will keep coming down as you have more sensor fusion and deep learning takes over," he says, pointing out, "This is not going to be technology just for luxury cars, but it will be in trim lines for every vehicle."