With Industry Giants All Stepping into Car Making, What Path Should Intelligent Vehicles Follow? — Ken YING from PATEO Gave His Answer
9/29/2021
BY XINHUANET.com

On Sept. 27, the World Intelligent Connected Vehicles Conference (WICV) was held in Beijing, at which Ken (Yilun) YING, Founder & Chairman of PATEO CONNECT+, delivered a speech titled “Smart Cockpit Software Platform Empowers the Take-off of Intelligent Vehicles”.

Gradual Integration — A Trend of Intelligent Connectivity

As the level of intelligence keeps rising, the identity of the vehicle is changing at an accelerating pace, from a traditional means of transportation to a third space for mobility: the attribute of simply moving people or goods from one place to another is gradually receding, while the experiential, interactive and technological attributes keep growing. In the future, the vehicle is expected to become the third-generation intelligent mobile terminal after the PC and the smartphone.

Looking at the development trend of intelligent connectivity technology, the future smart cockpit system will need to support more displays, including a full-LCD instrument cluster, a large intelligent center console display, co-driver and rear-seat entertainment screens, a streaming-media rearview mirror, and an AR-HUD. In interaction technology, the purely GUI-based interaction of the past will evolve toward multimodal interaction that fuses facial, gesture, lip-movement and speech recognition. In communication, 5G-V2X represents the future direction, and cockpit computing performance will move steadily closer to that of consumer-electronics chips.

Accordingly, the architectures of the whole intelligent vehicle (IV) and its hardware will evolve toward service orientation and the separation of software and hardware. The Service-Oriented Architecture (SOA) will enable better cross-domain integration (CDI) among the smart cockpit, body and intelligent driving domains, allowing the latter two to expose more atomic-service capabilities in the form of microservices.
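As a rough illustration of this idea (not PATEO's actual architecture), the sketch below shows how body- and driving-domain capabilities might be wrapped as atomic services and composed from the cockpit side; every interface and class name here is a hypothetical assumption.

```kotlin
// Hypothetical SOA sketch: the body and driving domains expose small, self-contained
// atomic services that the cockpit discovers and composes. In a real vehicle these
// would be remote microservices reached over in-vehicle service middleware,
// not in-process stubs.

interface AtomicService { val serviceId: String }

// Body-domain atomic services
interface WindowService : AtomicService { fun setOpenPercent(percent: Int) }
interface SeatService   : AtomicService { fun setReclineAngle(degrees: Int) }

// Intelligent-driving-domain atomic service
interface ParkingService : AtomicService { fun startAutoPark() }

class StubWindowService : WindowService {
    override val serviceId = "body.window"
    override fun setOpenPercent(percent: Int) = println("[$serviceId] windows $percent% open")
}

class StubSeatService : SeatService {
    override val serviceId = "body.seat"
    override fun setReclineAngle(degrees: Int) = println("[$serviceId] seat reclined to ${degrees} deg")
}

class StubParkingService : ParkingService {
    override val serviceId = "driving.parking"
    override fun startAutoPark() = println("[$serviceId] automatic parking started")
}

// Cockpit-side composition across domains: one user action fans out to
// atomic services owned by the body and driving domains.
class CockpitOrchestrator(
    private val windows: WindowService,
    private val seat: SeatService,
    private val parking: ParkingService
) {
    fun arriveAtParkingLot() {
        windows.setOpenPercent(0)
        seat.setReclineAngle(25)
        parking.startAutoPark()
    }
}

fun main() {
    CockpitOrchestrator(StubWindowService(), StubSeatService(), StubParkingService())
        .arriveAtParkingLot()
}
```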

Application of Technical Services — A Key to the Intelligent Vehicle

As the third-generation intelligent terminal, the IVs of the future will also need to deeply integrate AI capabilities to create application scenarios built around real needs in the user's life, since abundant personalized applications that allow rapid iteration will be the direct carrier of a comfortable and convenient in-car user experience.

In terms of the intelligent connectivity technology platform, the smart cockpit system needs to support multiple platforms and virtualization technology, as well as interacting with an application on multiple screens, displaying an application across screens, and managing multiple windows. On top of traditional basic car navigation, Qing Map also features excellent inertial navigation and positioning, high-precision lane-level positioning and navigation, and AR capabilities to empower autonomous driving, and offers application services such as automatic parking, reverse vehicle search in parking lots, one-touch vehicle retrieval, AR (live-view) navigation, group travel, and self-drive road books, to enhance user experience and driving safety. The Qing AI platform has built an AI assistant that embodies the brand image, supporting an all-scenario ecosystem covering hotels, air travel, airline / movie / train tickets, food and takeaway, and equipped with active, multimodal and multisensory interaction as well as text, image and visual emotional perception. As a voice assistant that truly achieves intelligent, natural speech, Qing AI covers all scenarios and supports cross-scenario interaction. Mr. Ying also gave a live demonstration of its application scenarios, including wake-up in a high-noise environment, voice query and navigation, and continuous speech and dialogue.
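To make the notion of cross-scenario voice interaction concrete, here is a minimal, purely illustrative sketch of intent routing across domains within one dialogue; the Intent classes, the naive parse function and the route dispatcher are assumptions made for the example, not Qing AI interfaces.

```kotlin
// Minimal sketch of cross-scenario intent routing for an in-car voice assistant.
// A real NLU stack would replace the naive keyword parser below.

sealed class Intent {
    data class Navigate(val destination: String) : Intent()
    data class BookTicket(val kind: String, val query: String) : Intent()
    data class PlayMedia(val title: String) : Intent()
    object Unknown : Intent()
}

// Deliberately simple keyword matching standing in for real speech understanding.
fun parse(utterance: String): Intent = when {
    utterance.startsWith("navigate to ") ->
        Intent.Navigate(utterance.removePrefix("navigate to "))
    utterance.startsWith("book a movie ticket ") ->
        Intent.BookTicket("movie", utterance.removePrefix("book a movie ticket "))
    utterance.startsWith("play ") ->
        Intent.PlayMedia(utterance.removePrefix("play "))
    else -> Intent.Unknown
}

// Cross-scenario routing: one assistant entry point dispatching to different
// domain services (navigation, ticketing, media) within a single conversation.
fun route(intent: Intent) = when (intent) {
    is Intent.Navigate   -> println("Navigation: routing to ${intent.destination}")
    is Intent.BookTicket -> println("Ticketing: ${intent.kind} ticket, \"${intent.query}\"")
    is Intent.PlayMedia  -> println("Media: playing \"${intent.title}\"")
    Intent.Unknown       -> println("Assistant: sorry, I did not catch that")
}

fun main() {
    // A continuous dialogue that hops across scenarios without re-waking the assistant.
    listOf(
        "navigate to the nearest charging station",
        "book a movie ticket for tonight",
        "play some relaxing music"
    ).forEach { route(parse(it)) }
}
```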

User Experience — The Core of the Intelligent Vehicle

With the emergence of the smartphone, the main function of the mobile phone ceased to be making calls; likewise, with the arrival of the era of autonomous driving, the main function of the intelligent vehicle may cease to be conveying people from one place to another. The future cabin space will be able to deliver an immersive, all-round experience with a sense of ceremony to the owner through vibration as well as audio-visual and olfactory effects, in combination with AR and VR technologies.

Mr. Ying gave several vivid examples. Many white-collar workers are in the habit of taking a lunch break: in that scenario, speech recognition (SR) can trigger operations such as fully reclining the seat, dimming the lights to a comfortable level, and closing the windows, while suitable music is recommended based on the user's preferences. When driver fatigue is detected, the system can vibrate the steering wheel, blow cold air from the A/C, open the windows, release a stimulating fragrance, and so on to keep the driver alert. There are also a “Passenger Fatigue Mode”, a “Waiting for Kid Mode”, and a “Car Wash Mode”, among others.
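A minimal sketch of how such scene-style orchestration might look in code, assuming a simple mapping from cabin events (a voice command, a fatigue alert from the driver monitoring system) to ordered lists of actions; the event names and print statements are placeholders for real seat, HVAC, window and audio services.

```kotlin
// Sketch of scene-style orchestration: each cabin event activates a named scene,
// which is simply an ordered list of actions. Real cockpit software would call
// actual domain services instead of printing.

enum class CabinEvent { LUNCH_BREAK_REQUESTED, DRIVER_FATIGUE_DETECTED, CAR_WASH_REQUESTED }

data class Scene(val name: String, val actions: List<() -> Unit>)

val scenes: Map<CabinEvent, Scene> = mapOf(
    CabinEvent.LUNCH_BREAK_REQUESTED to Scene("Lunch Break Mode", listOf(
        { println("seat: fully reclined") },
        { println("lights: dimmed to a comfortable level") },
        { println("windows: closed") },
        { println("media: playing the user's preferred nap playlist") }
    )),
    CabinEvent.DRIVER_FATIGUE_DETECTED to Scene("Fatigue Alert", listOf(
        { println("steering wheel: vibration pulse") },
        { println("A/C: burst of cold air") },
        { println("windows: opened slightly") },
        { println("fragrance: stimulating scent released") }
    )),
    CabinEvent.CAR_WASH_REQUESTED to Scene("Car Wash Mode", listOf(
        { println("windows and sunroof: closed") },
        { println("mirrors: folded") },
        { println("wipers: automatic mode disabled") }
    ))
)

fun onEvent(event: CabinEvent) {
    val scene = scenes[event] ?: return
    println("Activating ${scene.name}")
    scene.actions.forEach { it() }
}

fun main() {
    onEvent(CabinEvent.LUNCH_BREAK_REQUESTED)   // e.g. triggered by a voice command
    onEvent(CabinEvent.DRIVER_FATIGUE_DETECTED) // e.g. triggered by driver monitoring
}
```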

By merging the body domain with the cockpit domain, and leveraging the NFC, UWB, Bluetooth and other capabilities of the mobile phone, the physical buttons can all be consolidated onto a single screen and abstracted into various scenarios, to truly create an intelligent vehicle that centers on user experience.
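As a purely illustrative sketch of the phone-as-key side of this (the radios, distance readings and thresholds below are made-up assumptions; a production digital key would follow the relevant industry specifications), proximity readings from the phone might drive unlock and welcome behavior like this:

```kotlin
// Hypothetical phone-as-key logic: pick the most precise authenticated reading
// (NFC tap, UWB ranging, or a BLE distance estimate) and decide what the car does.

enum class Radio { NFC, UWB, BLE }

data class KeyReading(val radio: Radio, val distanceMeters: Double, val authenticated: Boolean)

fun decideAction(readings: List<KeyReading>): String {
    val best = readings.filter { it.authenticated }.minByOrNull { it.distanceMeters }
        ?: return "stay locked"
    return when {
        best.radio == Radio.NFC && best.distanceMeters < 0.1 -> "unlock and allow engine start"
        best.distanceMeters < 2.0  -> "unlock doors and run the welcome scene"
        best.distanceMeters < 10.0 -> "light up the door handles"
        else -> "stay locked"
    }
}

fun main() {
    val readings = listOf(
        KeyReading(Radio.BLE, 8.5, authenticated = true),
        KeyReading(Radio.UWB, 1.4, authenticated = true)
    )
    println(decideAction(readings))  // -> unlock doors and run the welcome scene
}
```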

Besides this, Mr. Ying also gave free rein to his imagination of the future IV, saying that the IV is not merely a means of moving between two places, but also a collection of the Internet of Everything (IoE) + a new space + an immersive experience + virtual reality (VR) + a mobile robot + the Metaverse. By virtue of superior immersion, all-round interaction from inside to outside of the vehicle, and user-defined scenarios, a more personalized and intelligent automotive user experience is created. Inside the cabin, we can record our own or our children's voices to communicate and interact with the car smoothly and naturally; after leaving the car, we can remotely check its location on the mobile phone and navigate to it with the phone, or even let a passer-by experience the vehicle's self-selling mode, while it is parked, by scanning a QR code with a phone.
