Fleets using artificial intelligence to accelerate safety, efficiency

Updated Nov 11, 2019
Photo by Steven Diaz, courtesy of SmartDrive.

“Artificial intelligence” (AI) may evoke fears of robots writing their own software code and refusing to take orders from humans.

The real AI, at least in its present form, is delivering results in the business world. Technology companies are using powerful computers and advanced statistical models to accelerate their product development. Most are not calling these efforts AI but rather machine learning.

As a form of AI, machine learning is making it possible to quickly find relevant patterns in data captured by Internet of Things (IoT) devices and sensors, explains Adam Kahn, vice president of fleets for Netradyne, which has a vision-based fleet safety system called Driveri (“driver eye”).

Ten years ago, fleet safety managers had to interpret critical events reported from telematics systems, Kahn says. A “hard brake” event, for instance, may not be a result of distracted or aggressive driving. The driver might have hit the brakes to avoid a car that suddenly cut him off in traffic.

Video-based safety systems give fleets context for hard braking and other safety-critical events. With machine learning, these systems have advanced to automate the review of video and data by identifying complex patterns of risk.
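
The distinction between a justified hard brake and a risky one is ultimately a classification call made from contextual signals. As a rough sketch, assuming hypothetical field names and thresholds rather than any vendor's actual schema, an automated review step might only queue an event for coaching when the context points to risky driving:

```python
# Minimal sketch of context-aware triage for hard-brake events.
# Field names and thresholds (decel_g, cut_in_detected, gaze_on_road,
# headway_s, 0.35 g) are hypothetical illustrations, not any vendor's schema.
from dataclasses import dataclass

@dataclass
class BrakeEvent:
    decel_g: float          # peak deceleration, in g
    cut_in_detected: bool   # road-facing camera saw another vehicle cut in
    gaze_on_road: bool      # cab-facing camera judged the driver attentive
    headway_s: float        # following time gap before braking, in seconds

def needs_review(event: BrakeEvent) -> bool:
    """Keep only hard brakes that look like risky driving, not evasive action."""
    if event.decel_g < 0.35:                          # not a hard brake at all
        return False
    if event.cut_in_detected and event.gaze_on_road:  # justified reaction to a cut-off
        return False
    # Distracted or tailgating before the brake: route to a safety manager.
    return (not event.gaze_on_road) or event.headway_s < 1.5

print(needs_review(BrakeEvent(0.5, True, True, 2.0)))    # False: evasive, attentive
print(needs_review(BrakeEvent(0.5, False, False, 1.0)))  # True: distracted tailgating
```

In production these decisions come from trained models rather than hand-set thresholds, but the inputs are the same kinds of signals the article describes: deceleration, road-facing video context and driver attention.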

New developments are giving drivers visual and audible alerts to deter risky behaviors such as fatigue and distraction. This direct-to-driver coaching model is helping to eliminate the need for managers to schedule face-to-face training meetings with drivers. Other AI applications are instantly solving difficult transportation problems beyond the realm of safety.

Moving to the edge

The foundation of machine learning and artificial intelligence is precision of data and accuracy of statistical learning models, Kahn says.

Data precision comes from vehicle and engine electronics, cameras, sensors and IoT devices in vehicles. With precision, technology suppliers are able to apply machine learning models to accurately identify relevant patterns.

The patterns are detected by algorithms uploaded to servers in the cloud and to “edge” computing devices with the processing power to support advanced mobile applications.

Some edge devices use teraflop processors similar to those in the Xbox gaming system. That processing power enables computer vision to detect complex patterns of risk from high-definition video, Kahn says. Patterns of driver fatigue, such as yawning, and of distraction can be detected instantly, as can behaviors like following distances that are unsafe for current speeds, road and traffic conditions.
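
Following distance is a good example of a pattern that only makes sense in context. A minimal sketch of the underlying arithmetic, with thresholds that are illustrative assumptions rather than Netradyne's actual values, converts the measured gap into a time headway before judging it:

```python
# Illustrative arithmetic for "unsafe following distance given current speed":
# convert the measured gap to a time headway and compare it with a condition-
# dependent threshold. The 2.0 s and 3.0 s margins are assumptions.
def headway_seconds(gap_m: float, speed_kph: float) -> float:
    speed_mps = speed_kph / 3.6
    return float("inf") if speed_mps <= 0 else gap_m / speed_mps

def following_too_close(gap_m: float, speed_kph: float, wet_road: bool = False) -> bool:
    threshold_s = 3.0 if wet_road else 2.0   # wider margin on wet roads
    return headway_seconds(gap_m, speed_kph) < threshold_s

# 25 m behind the lead vehicle at 100 km/h is roughly a 0.9-second headway.
print(round(headway_seconds(25, 100), 2))   # 0.9
print(following_too_close(25, 100))         # True
```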

Ward Trucking, a less-than-truckload carrier, is using the Netradyne Driveri technology in 220 of its 600 trucks, with plans to equip the entire fleet. Steve Dunn, director of safety for the Altoona, Pa.-based carrier, is paying close attention to three kinds of risky driving behaviors.

When drivers run stop signs or red lights, follow too closely or have hard-brake events in the LTL environment, Dunn says, it signals they are rushing to get to their next stop and becoming frustrated.

“When a driver allows that to impact how he is driving, he becomes dangerous and it could result in accidents,” he says. With real-time information from Netradyne, “we can identify those who are experiencing that and interact with them sooner.”

Ward Trucking is recognizing professional drivers with information and video captured by the Driveri platform. Road-facing video clips of safe driving events are replayed on the TV systems at its 19 terminals to “recognize drivers for things they do well,” Dunn says. The company also lists drivers by name on the TV system with a high “Green Zone” score calculated by Driveri.

The open gateway

At present, motor carriers may be cobbling together data from multiple IoT systems in vehicles that do not “talk” to each other. As a result, they may have operational, safety and maintenance data in separate databases and do their reporting and analysis after the fact.

The Dell Edge Gateway 3002 is designed to interface to a multitude of sensors and systems on a vehicle.

Having disparate streams of data can also limit the opportunities for using machine learning and AI. Fleet management technology is reversing that trend as more vendors take an open platform approach.

Dell Technologies sees an opportunity to “defragment” the IoT systems used in trucking. It recently launched an edge computing device that connects to various hardware and sensors on a vehicle to “start the analytics process powering digital transformation,” says Brent Hodges, who leads the company’s IoT planning and product strategy.

“The trend is to put more compute power at the edge (in the cab) so that one device can run analytics on several applications before sending data to the cloud,” Hodges says.

The new Dell Edge Gateway 3002 is powered by an Intel Atom processor and runs Windows 10 IoT or Linux for local applications and analytics processes. It comes with a wide range of connectivity options: besides native CAN bus connectivity, it has Bluetooth, ZigBee, Wi-Fi and mobile broadband wireless options.
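
The edge pattern Hodges describes can be sketched in a few lines: sample locally, aggregate on the device and upload only summaries. The read_sensor and send_to_cloud functions below are placeholders for a real CAN bus reader and cloud client, not Dell or Microsoft APIs:

```python
# Sketch of edge-side aggregation: sample locally, summarize on the device,
# push only the summary upstream. read_sensor() and send_to_cloud() are
# stand-ins for whatever CAN bus reader and cloud client a fleet actually uses.
import json
import random
import statistics
import time

def read_sensor() -> float:
    return 4.0 + random.random()      # stand-in for a CAN bus signal

def send_to_cloud(payload: str) -> None:
    print("uploading:", payload)      # stand-in for an MQTT or HTTPS publish

def edge_loop(window: int = 60) -> None:
    """Summarize one window of samples locally, then push a single message."""
    samples = []
    while True:
        samples.append(read_sensor())
        if len(samples) >= window:    # one upload per window, not per sample
            send_to_cloud(json.dumps({
                "mean": round(statistics.mean(samples), 3),
                "max": round(max(samples), 3),
                "n": len(samples),
            }))
            samples.clear()
        time.sleep(1)
```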

Dell collaborated with several companies for the new IoT platform. Blue Dot, an enterprise mobile software application provider, has a fleet management system that runs on Dell’s Edge Gateway. Dell also partnered with Microsoft. Data from the gateway can reside in Microsoft Azure Cloud Platform for real-time analysis.

An open platform may appeal to transportation companies that want a single gateway in vehicles that communicates with a single database in the cloud through one wireless subscription.

Fleets may also want to choose their own mobile applications and software for analytics and machine learning, explains John Crupi, vice president of IoT Analytics at Greenwave Systems. An open platform makes that possible.

The Greenwave AXON Predict software engine runs on edge computing devices to find anomalies in data.

Greenwave Systems is focused on real-time analysis of IoT data. Its AXON Predict software engine runs on edge computing devices such as the Dell gateway to find anomalies in data. Lately its efforts have focused on real-time monitoring of cargo using sensor data such as temperature and humidity, Crupi says.

Being able to detect patterns instantly at the edge, rather than in the cloud, brings transformative capabilities to carriers and shippers, he says. A shipper could instantly know if the temperature of a load fluctuates at certain locations during transit. Without real-time analytics on the edge, “these things are really hard to catch,” he says.
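
A simple way to picture that kind of edge analytics is a rolling statistical check: compare each new temperature reading with recent history and flag readings that fall well outside it. The window size and threshold below are assumptions for illustration, not Greenwave's method:

```python
# Illustrative streaming check of the kind an edge analytics engine might run
# on reefer temperature: flag readings that drift well outside recent history.
from collections import deque
import statistics

class TempMonitor:
    def __init__(self, window: int = 30, z_threshold: float = 3.0):
        self.readings = deque(maxlen=window)
        self.z_threshold = z_threshold

    def check(self, temp_c: float) -> bool:
        """Return True if this reading looks anomalous versus recent history."""
        anomalous = False
        if len(self.readings) >= 10:
            mean = statistics.mean(self.readings)
            stdev = statistics.pstdev(self.readings) or 1e-6
            anomalous = abs(temp_c - mean) / stdev > self.z_threshold
        self.readings.append(temp_c)
        return anomalous

monitor = TempMonitor()
stream = [2.0, 2.1, 1.9, 2.0, 2.2, 2.1, 2.0, 1.9, 2.1, 2.0, 2.1, 6.5]
print([t for t in stream if monitor.check(t)])   # [6.5] caught at the edge
```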

Accelerated development

With the volume and richness of IoT data continuing to increase, some technology suppliers have accelerated their path of product development.

“What you need for the next level is not who has the most data, but who has the most that is relevant to the problem you are trying to address,” says Ray Ghanbari, chief technology officer of SmartDrive, which provides an open video-based driver safety and fleet telematics platform.

Having a database of GPS locations from commercial vehicles may be useful for solving routing problems. If the goal is solving problems with driver distraction and other risky behaviors, then a much more robust dataset is needed, he says.

“Machine learning and computer vision are the backbone of what we have been doing as a company,” says Ray Ghanbari, chief technology officer of SmartDrive.

In March, SmartDrive announced its next-generation Smart Recorder 4 (SR4) platform. The edge device and cloud-enabled analytics infrastructure will be able to capture even more data for analysis of commercial routes and events to find correlations with safety and operating efficiency.

To date, SmartDrive has analyzed 200 million events with relevant human training data, he says. More than 250,000 collisions and near-collisions have been captured using the SmartDrive platform. The new SR4 platform can support up to nine cameras around a vehicle, which will help accelerate development of computer vision products, he says.

“Machine learning and computer vision are the backbone of what we have been doing as a company,” he says.

SmartDrive is developing a line of SmartSense computer vision products. One of its products can monitor eye movements and head positions of drivers to detect inattention, distraction and drowsy driving behaviors. New products will alert drivers, assess behaviors and trigger the capture of event data and video for lane departures, short following distances, forward collision warnings, posted speed detection, traffic signs and signal violations.

“Data is the new oil and machine learning is the new internal combustion engine,” Ghanbari says.

To date, Lytx has captured and analyzed more than 80 billion miles of driving data through its DriveCam program.

All Truck, a Chicago-based local and regional Midwest truckload carrier, implemented the Lytx ActiveVision service in its fleet of 350 trucks earlier this year. ActiveVision has been in use for over two years in more than 55,000 vehicles. The service uses machine learning and a precision analytics algorithm to detect and capture driving patterns consistent with distracted and drowsy driving.

Lytx ActiveVision uses machine learning to detect behaviors due to fatigue and distracted driving.

Drivers receive real-time alerts as part of the ActiveVision service to encourage immediate self-correction. In addition to the real-time alerts, All Truck uses Lytx’s video command center remote coaching option. Drivers can review videos of risky driving events.

The remote coaching option “allows me to get in touch with my drivers quickly and removes the burden of waiting for a phone call or going out to meet them in the field, which can occupy several days,” said Don Henderson, safety director of All Truck.

Selective review

With machine vision, technology can automatically identify safety-critical events for management to review while ferreting out “false positives.”

Fleet mobility provider PeopleNet has been using machine learning to automatically detect video events that need management review based on safety-critical behaviors like fatigue and distraction, says Chris Orban, vice president of cross business unit analytics for Trimble, the parent company of PeopleNet.

PeopleNet is currently beta testing a new machine learning feature with select customers with plans to expand the offering to all fleets using its Video Intelligence platform. Orban and his analytics team also have plans to use machine learning to create new products for fleets. One aspiration is to match drivers to the right freight at the right time to better manage fatigue levels and improve job satisfaction.

“The end game is an optimization problem,” Orban says.
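
One common way to frame that kind of optimization is as an assignment problem: score every driver-load pairing and choose the combination with the lowest total cost. The cost blend below, deadhead miles plus a fatigue penalty, and all of its numbers are purely illustrative, not PeopleNet's model:

```python
# Toy assignment-problem framing of driver/load matching.
import numpy as np
from scipy.optimize import linear_sum_assignment

deadhead_miles = np.array([   # rows: drivers, columns: loads
    [ 40, 120,  15],
    [ 90,  20, 200],
    [ 60,  75,  30],
])
hours_on_duty = np.array([9.5, 3.0, 6.0])        # crude fatigue proxy per driver

# Penalize long empty miles and assigning already-tired drivers.
cost = deadhead_miles + 10 * hours_on_duty[:, None]

drivers, loads = linear_sum_assignment(cost)      # optimal one-to-one matching
for d, l in zip(drivers, loads):
    print(f"driver {d} -> load {l} (cost {cost[d, l]:.0f})")
```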

The applications of machine learning go beyond safety. Omnitracs, for instance, developed an advanced routing algorithm that uses proprietary business logic to create optimal routes.

Omnitracs uses machine learning in its routing software to determine the optimal stop assignment across multiple routes.

A feature within its Routing application analyzes historical service times to recommend and adjust planned service times, says David Palle, Omnitracs’ vice president of product management. The feature enables fleets to factor in delays such as waiting at docks and traffic to more accurately account for time that increases or decreases with volume, he says.
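
A toy version of that idea is to plan each stop using a high percentile of its recent dwell times rather than the average, so routine dock delays are already built into the plan. The percentile choice below is an assumption for illustration, not Omnitracs’ actual logic:

```python
# Recommend a planned service time from observed dwell times at a stop.
from statistics import quantiles

def planned_service_minutes(history_min: list[float], percentile: int = 80) -> float:
    """Plan with a high percentile of recent dwell times, not the mean."""
    if len(history_min) < 4:
        return max(history_min, default=30.0)    # fall back when history is thin
    cut_points = quantiles(history_min, n=100)   # 1st through 99th percentiles
    return round(cut_points[percentile - 1], 1)

dock_history = [22, 25, 31, 28, 45, 24, 27, 52, 26, 29]
print(planned_service_minutes(dock_history))     # ~42 min, well above the ~31 min average
```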

The predictions determine the optimal stop assignment across multiple routes and the most efficient and effective order to sequence those stops within a route, Palle explains. The information helps minimize mileage and drive time while meeting customer delivery windows.

The predictions also identify the best time to start the route on a specific day and can auto-adjust the start time. As part of this machine learning process for routing, proactive notifications can leverage the dispatch algorithm and traffic data to alert the contact at each destination of the estimated arrival time, he says. This helps minimize back-and-forth calls between customers, dispatch and drivers.

In these and other instances, fleets are more than willing to let machines learn how to identify risk, save time and accelerate product development. There seems to be no fear in that.

“I see more things being able to be learned with computer vision capabilities and identify other areas we need to focus on for a driver or fleet,” says Ward Trucking’s Steve Dunn.