Using Blended Reality to Reduce Downtime of Fleet Vehicles

February 26, 2019 Snigdha Petluru

 

The vehicle services industry has spent nearly $3B on maintenance activities, a significant portion of which goes toward the inspection, analysis, and forecasting of vehicle performance. While a large suite of tools provides insights and recommendations about vehicle health and longevity, those insights still need to be translated into intuitive solutions that can be used within the confines of a garage. The lack of a unified system to support accurate and intelligent communication across departments and agencies has led to costly delays in task allocation, information assimilation, and work order tracking. In such settings, the ability to draw simultaneously on physical interaction with vehicles and on analysis of recorded data is vital for prompt and effective decision-making. To address these challenges, we present "gAR-age," an ecosystem that enables maintenance personnel to interact with both worlds in a common setting.

gAR-age is a Blended Ecosystem

With gAR-age, Conduent proposes the concept of a blended ecosystem: an ecosystem that uses blended reality to enable immersive interactions among the user, the system, and the real environment. Our immersive solution allows maintenance personnel to effortlessly access dynamic insights about a vehicle's health with a hand-held device. In doing so, we reduce the time it takes to diagnose a fault, and thereby the downtime of the vehicle. The design of our solution is driven by three goals:

  1. To support data-driven decision making without the need to switch contexts continually
  2. To facilitate faster contextualization and comprehension of information by supporting immersive analytical visualizations on the blended interface
  3. To enable multi-channel interaction among members within the organization through blended user feedback and context-sharing

In addition to providing fleet-level insights, our solution can drill down to granular information about a vehicle. Our computer vision modules support this ability and can identify individual parts within a vehicle. Personnel can scan a vehicle component of their choice (such as a tire) and obtain descriptive and predictive insights, such as miles travelled and predicted remaining life. Personnel can therefore make decisions on-site, backed by data-driven insights, and agencies are better positioned to make the more informed, complex decisions that save them money and time.
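To make the scan-to-insight flow concrete, here is a minimal Python sketch of how a component label recognized by a computer vision module could be mapped to descriptive and predictive insights. This is an illustration only, not code from the gAR-age system: the names (`scan_component`, `ComponentInsight`, `TELEMETRY`, `bus-042`) and the data values are all hypothetical.

```python
from dataclasses import dataclass


@dataclass
class ComponentInsight:
    """Descriptive and predictive insights for one scanned vehicle part."""
    component: str
    miles_travelled: float            # descriptive: usage to date
    predicted_remaining_miles: float  # predictive: estimated life left


# Hypothetical telemetry store keyed by (vehicle_id, component_label).
# In a real deployment this would be backed by the fleet's analytics data.
TELEMETRY = {
    ("bus-042", "tire"): ComponentInsight("tire", 38_500, 11_500),
    ("bus-042", "brake-pad"): ComponentInsight("brake-pad", 38_500, 4_000),
}


def scan_component(vehicle_id: str, detected_label: str) -> ComponentInsight:
    """Map a CV-detected component label to its stored insights."""
    return TELEMETRY[(vehicle_id, detected_label)]


# A technician points the hand-held device at a tire on bus-042:
insight = scan_component("bus-042", "tire")
```

The point of the sketch is the shape of the interaction: the vision module supplies only a label, and the ecosystem joins it with recorded data so the answer appears in place, without switching contexts.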

We recognize that many decisions personnel make daily require actions demanding greater attention and understanding than the blending of data with real-world objects alone can provide. Our immersive analytics suite therefore has three distinct engines that support additional learning and contextualization of user preferences:

Blended User Feedback Engine: Blended user feedback serves as the backbone of a multi-channel platform for shared, real-time interaction among the user, system, and environment. This engine provides three modes of interaction between the user and the blended ecosystem:

  • Alerts – Ability to create prioritized notifications and tag users to contextual cues about real-world objects as they arise
  • Instructions – Ability to add specialized directions and tasks relating to blended elements in the ecosystem for a more individualized experience
  • Conversations – Ability to interact with multiple users to access and share intelligence about common elements in the ecosystem
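The three channels above share a common structure: a message attached to a blended element, optionally prioritized and tagged to users. A minimal Python sketch of such a multi-channel feedback model follows; the class and field names (`FeedbackEngine`, `FeedbackItem`, `Channel`) are assumptions for illustration, not the paper's API.

```python
from dataclasses import dataclass, field
from enum import Enum


class Channel(Enum):
    ALERT = "alert"                # prioritized notification tagging users to an object
    INSTRUCTION = "instruction"    # specialized direction for a blended element
    CONVERSATION = "conversation"  # multi-user thread about a shared element


@dataclass
class FeedbackItem:
    channel: Channel
    element_id: str            # the blended element (e.g. a scanned tire)
    author: str
    body: str
    priority: int = 0          # only meaningful for alerts
    tagged_users: list = field(default_factory=list)


class FeedbackEngine:
    """Shared store of feedback items posted against blended elements."""

    def __init__(self):
        self.items = []

    def post(self, item: FeedbackItem) -> None:
        self.items.append(item)

    def inbox(self, user: str) -> list:
        """Items tagging the user: alerts first, highest priority on top."""
        mine = [i for i in self.items if user in i.tagged_users]
        return sorted(mine, key=lambda i: (i.channel != Channel.ALERT, -i.priority))


engine = FeedbackEngine()
engine.post(FeedbackItem(Channel.ALERT, "tire-7", "lead-tech",
                         "Replace before next route", priority=2,
                         tagged_users=["alex"]))
engine.post(FeedbackItem(Channel.INSTRUCTION, "tire-7", "lead-tech",
                         "Torque lugs to spec after replacement",
                         tagged_users=["alex"]))
```

Because every item is anchored to an element rather than to a document, a technician scanning that element sees alerts, instructions, and conversations in place on the blended interface.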

Longitudinal Learning Engine: The longitudinal learning engine captures data and assimilates its context over time. This historical and current context enables users to understand not only the current state of an object, but also how it has transitioned over time. The engine incorporates three learning opportunities to help the user understand the longitudinal context of objects:

  • Visual Learning – Learning about changes in the physical characteristics of the object through historical visual tracking
  • Behavioral Learning – Learning about changes in the characteristics and actions of the user over time, gathered through invasive and non-invasive methods
  • Feedback Learning – Learning about user needs and priorities from the feedback that the user leaves on the ecosystem across multiple sessions
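All three learning modes depend on the same underlying capability: an append-only history of observations per object, from which both the current state and the transition over time can be read. A small Python sketch of such a store follows, under the assumption (mine, not the paper's) that observations are timestamped values per object; `LongitudinalLog` and the tread-depth example are hypothetical.

```python
from collections import defaultdict


class LongitudinalLog:
    """Append-only history of observations per object, so the current
    state can be read alongside how the object transitioned over time."""

    def __init__(self):
        # object_id -> list of (timestamp, observation)
        self._history = defaultdict(list)

    def record(self, object_id, timestamp, observation):
        self._history[object_id].append((timestamp, observation))

    def current(self, object_id):
        """Latest observation for the object."""
        return max(self._history[object_id], key=lambda t: t[0])[1]

    def trajectory(self, object_id):
        """All observations in time order, for visualizing the transition."""
        return [obs for _, obs in
                sorted(self._history[object_id], key=lambda t: t[0])]


# Example: tread depth (mm) of one tire across three inspection sessions.
log = LongitudinalLog()
log.record("tire-7", 1, 8.0)
log.record("tire-7", 2, 6.5)
log.record("tire-7", 3, 5.1)
```

Visual learning would populate such a log from historical visual tracking, behavioral learning from observed user actions, and feedback learning from the feedback users leave across sessions; the engine then surfaces the trajectory, not just the latest reading.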

Personalization Engine: By learning from object characteristics, user interactions, needs and priorities, this engine personalizes three aspects within the ecosystem:

  • System Learning from User Behavior – Customizing the information display and architecture by learning the user's role in the organization and their information-seeking patterns, i.e., the information they could be looking for
  • System Learning from Data – Providing the user with prioritized information on-the-fly by learning from the data the system collects itself or retrieves from other sources and objects, i.e., the information they need to be looking at
  • System Learning from User Feedback – Customizing the intelligence provided to the user by taking suggestions, feedback, and preferences from users directly, i.e., the information they frequently prefer finding
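One simple way to picture how these three learning sources could combine is as a blended ranking over candidate information cards. The sketch below is purely illustrative: the function, the additive scoring, and the sample weights are assumptions of mine, not the mechanism described in the paper.

```python
def rank_cards(cards, role_weights, data_priority, user_prefs):
    """Order information cards by a blended score:
    role_weights  - learned from user behavior (what this role looks for)
    data_priority - learned from the data itself (what needs attention now)
    user_prefs    - explicit feedback (what the user says they prefer)
    Each is a dict mapping card topic -> weight in [0, 1]."""
    def score(card):
        topic = card["topic"]
        return (role_weights.get(topic, 0)
                + data_priority.get(topic, 0)
                + user_prefs.get(topic, 0))
    return sorted(cards, key=score, reverse=True)


cards = [{"topic": "tire-wear"}, {"topic": "fuel-economy"}, {"topic": "brake-temp"}]
ranked = rank_cards(
    cards,
    role_weights={"tire-wear": 0.6, "brake-temp": 0.4},   # technician role
    data_priority={"brake-temp": 0.9},                    # sensor anomaly now
    user_prefs={"fuel-economy": 0.5},                     # explicit preference
)
```

In this toy example the anomalous brake temperature outranks the technician's habitual tire-wear view, which is the intended effect of letting the data channel override learned habits when something needs attention.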

While most vehicle monitoring solutions neither capture both historical and current user feedback nor personalize information, Conduent's solution customizes insights based on the user's needs, preferences, and priorities. In addition to learning from the historical changes of the user and the ecosystem, this blended ecosystem supports multi-channel interaction among users, enabling them to make data-driven decisions in consultation with other personnel in the garage.

 

You can read more complete details in our paper: gAR-age: A Feedback-Enabled Blended Ecosystem for Vehicle Health Monitoring, published in the Proceedings of the 10th International Conference on Automotive User Interfaces and Interactive Vehicular Applications (Automotive UI 2018).

--------------------------------------------------------

Conduent is a digital interactions company designing interactive products and solutions that enable informed decisions and accelerate your business. Learn more about innovation at Conduent.

 

 
