This story appears in the February 6 print edition of iTECH, a supplement to Transport Topics.
The video safety systems in truck cabs are becoming smarter. Rather than simply recording video of events on the road, some of the latest systems have gained the ability to “see” the truck’s environment and draw conclusions about driver behavior.
“Machine vision” technology enables onboard cameras to track lane markings and objects such as nearby vehicles, making it possible to detect when drivers weave slightly in their lanes, travel significantly slower or faster than surrounding traffic, follow too closely, fail to stop completely at stop signs or run red lights.
These safety systems, which also collect data from other sources such as accelerometers and the engine’s electronic control unit, then can alert drivers and their fleets when dangerous situations arise, such as distracted or drowsy driving. Fleets also use the systems to coach employees on better driving habits and document when drivers are doing everything right.
Lytx Inc., the supplier of the DriveCam video-based safety program, now offers ActiveVision, an enhanced service that employs machine vision to monitor driver behavior.
“DriveCam is the core video-based safety program that Lytx is known for, but we’ve upgraded it to have the most advanced sensor inputs and algorithms,” said Kara Kerker, executive vice president and chief marketing officer for Lytx.
The event recorder, which has an outward-facing lens and an inward-facing lens, is equipped with sensors that make it possible for the software to see the environment ahead, Kerker said.
“Objects like road markers, lane markings and vehicles become data points that we can track,” she said. “We also can track the movement of the driver within that environment.”
Hirschbach Motor Lines, a refrigerated carrier based in Dubuque, Iowa, has installed ActiveVision on more than 200 trucks and plans to roll out the technology across its fleet of almost 1,000 trucks by the end of the first quarter.
ActiveVision has helped Hirschbach uncover distracted driving that previously went undetected, said Brian Kohlwes, chief general counsel and vice president of safety for Hirschbach.
“We’ve contacted several drivers after discovering that they exhibited some sort of distracted driving, and we haven’t had any of them show repeat behaviors,” he said. “While not a rampant problem, it is something we are now able to see with ActiveVision that we would not have had the opportunity to see without it.”
One of the telltale signs that a driver might be distracted or drowsy is repeated episodes of swerving slightly.
“We have algorithms built into our devices that help us discern patterns of behaviors that serve as triggers,” Lytx’s Kerker said. “For example, with respect to fitness to lane, we can detect if you swerve outside your lane line a certain number of times within a certain time period.”
If repeated swerving occurs, ActiveVision generates audio and visual alerts in the cab to warn the driver.
“If the alerts go unheeded and the behavior continues, the event recorder will capture video and corresponding data for use in coaching the driver,” Kerker said. “The video gets sent through a cellular network to expert human reviewers. The video is reviewed, scored and sent to the client.”
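The pattern Kerker describes — a set number of lane departures within a set time window tripping an alert — can be sketched as a rolling-window counter. The limit and window values below are illustrative assumptions, not Lytx's actual settings.

```python
from collections import deque

DEPARTURE_LIMIT = 3    # departures allowed before alerting (assumed value)
WINDOW_SECONDS = 60.0  # rolling window length in seconds (assumed value)

class SwerveMonitor:
    """Counts lane departures in a rolling time window."""

    def __init__(self, limit=DEPARTURE_LIMIT, window=WINDOW_SECONDS):
        self.limit = limit
        self.window = window
        self.events = deque()  # timestamps of recent lane departures

    def record_departure(self, timestamp):
        """Log a lane departure; return True if the pattern warrants an alert."""
        self.events.append(timestamp)
        # Discard departures that have fallen out of the rolling window.
        while self.events and timestamp - self.events[0] > self.window:
            self.events.popleft()
        return len(self.events) >= self.limit

monitor = SwerveMonitor()
alerts = [monitor.record_departure(t) for t in (5.0, 20.0, 42.0, 130.0)]
print(alerts)  # three departures inside 60 s trip the alert; the fourth is alone
```

In a real system the departure events would come from the machine-vision lane tracker; here they are simulated as bare timestamps.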
The video-based safety program from SmartDrive Systems also uses software to monitor and evaluate driver performance.
The company is testing a number of algorithms that have a high correlation to drowsy driving, SmartDrive CEO Steve Mitgang said.
“We are able to rapidly design, deploy, test and refine new algorithms in an iterative approach that significantly shortens the time between concept and production deployments,” Mitgang said. “We are right in the middle of this process and, while we’re encouraged by our early results, we know we have a few more cycles to go before we have a product ready for the market.”
Driver-i is a new video-based platform introduced by Netradyne in September. It uses a single module with four high-definition cameras. When the system detects an event, such as a driver following the vehicle in front too closely, it sends alerts to the fleet in the form of texts or e-mails and allows managers to view captured video.
About 12 fleets have joined Netradyne’s early adopter program. The company is planning a broader commercial launch of the product early this year, according to Adam Kahn, vice president of fleet business at Netradyne.
“Artificial intelligence and deep learning are the core technologies of Driver-i,” Kahn said. “We have equipped the device with a superprocessor that allows for a teraflop of processing — a trillion calculations per second. The system sees a number of objects in the driver’s environment, such as cars, trucks, pedestrians, traffic lights, stop signs and more.”
The use of artificial intelligence to evaluate video instead of providing human review is a key characteristic that distinguishes Netradyne’s approach from that of other suppliers. Kahn stressed that this enables carriers to access important video information more quickly.
“Video systems that require human review of video are slow to provide information to the fleet,” he said. “When an event triggers the capture of video for these systems, the video is sent to a review center. Someone has to look at that video before it gets sent to the fleet manager. The Driver-i system can do all that computation at the device level and supply information to the fleet manager within minutes. Timeliness of data is definitely an advantage that we bring to customers.”
Central Transportation Services, a dry bulk carrier based in Wichita, Kansas, is using Driver-i on five of its trucks as part of Netradyne’s early adopter program.
“We like the real-time aspect of Driver-i, both for capturing data for insurance purposes and for coaching drivers,” company President Steve Cullins said. “Other systems send the video first to people who review it and then it’s sent to the fleet. We like that Driver-i sends the video directly to us.”
In contrast, Lytx’s Kerker and SmartDrive’s Mitgang emphasized the importance of having people examine video before sending it to fleets.
“Human review is an important piece of what we do at Lytx, and we believe that it adds enormous value to our clients,” Kerker said. “Essentially, it removes from our clients any burden of determining if there is a coachable behavior. Identifying coachable behaviors is critical for helping drivers improve and become better drivers.”
In Mitgang’s view, the human review provided by SmartDrive is essential for ensuring that fleets are not wasting their time on irrelevant video.
“Services claiming accuracy merely with algorithms, or even just self-serve event video, are in effect asking the fleet to be in the ‘video review’ business,” he said. “No fleet in the world wants to wade through hours of worthless video. Not having a managed review process only leads to wasted time, false positives and more expense in missed opportunities and staff overhead.”
Products from two other technology providers — Seeing Machines and Safety Vision — target distracted and drowsy driving by focusing on the driver’s head and eye movements.
Seeing Machines’ Guardian system features internal camera sensors, which are focused on the driver, as well as a forward-facing camera, said Chris Sluss, vice president of business development for North America.
“By using sensors and image processing technology to track the micromovements of a driver’s eyes, facial expressions and head, the system can identify a fatigue or distraction event,” Sluss said. “When an incident occurs, the cameras record both the driver’s face and the road, giving complete visibility of the driver’s environment.”
Drivers are alerted instantly through sound and seat vibrations. The Guardian system captures video for each event, which is then reviewed by specialists who are available around the clock.
“The program produces reports based on ongoing data concerning events and trends across the fleet,” Sluss said. “This can be used to develop ways to assist drivers, such as a fatigue management plan.”
Through a new partnership with MiX Telematics announced in December, Seeing Machines’ Guardian system will be integrated into the fleet management and safety technology offered by MiX.
“The hardware platforms will be connected in the vehicle,” said Pete Allen, executive vice president of sales at MiX Telematics North America.
Event alerts and distractions captured by Guardian will be sent to the event engine at the MiX Fleet Manager portals, Allen said. “This event data will be available in standard MiX safety reporting and can be sent as notification alerts to supervisors as requested.”
Adding the Seeing Machines’ drowsy and distracted driving events to the driving behavior events from MiX will give the end user “a deeper view into driver behavior metrics to improve overall safety,” he said.
MiX also offers its own integrated in-cab video product, MiX Vision, which now has the ability to support multiple cameras, allowing fleets to view different angles such as blind spots, Allen said.
Another product that uses camera technology to track facial movement is the Driver Distraction Monitor from Safety Vision.
“The DDM monitors and analyzes the driver’s eye lids and pupil dilation using proprietary algorithms,” said Rex Colorado, business development manager for Safety Vision.
The DDM detects driver distraction by tracking the percentage of time the driver’s eyes are focused directly ahead. The system changes the thresholds for alerts depending on the truck’s speed.
“When you’re at a lower speed — say 30 miles per hour — the system allows more time without focusing directly on the road,” Colorado said. “But when you’re driving at highway speeds, the system is less tolerant of the driver looking away.”
When an unsafe pattern is detected, the camera sounds an alarm to keep the driver alert.
“We can also vibrate the seat cushion and send text or e-mail alerts to fleet supervisors of the event,” Colorado said.
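Colorado's description amounts to a speed-dependent threshold on eyes-off-road time. The cutoffs and allowed durations below are invented for illustration; Safety Vision's actual thresholds are proprietary.

```python
def max_eyes_off_road_seconds(speed_mph):
    """Allowed continuous eyes-off-road time at a given speed (assumed values)."""
    if speed_mph < 35:   # city speeds: more latitude
        return 3.0
    elif speed_mph < 55: # intermediate speeds
        return 2.0
    return 1.0           # highway speeds: least tolerance

def should_alarm(speed_mph, eyes_off_road_s):
    """True when the driver's gaze has been off the road too long for this speed."""
    return eyes_off_road_s > max_eyes_off_road_seconds(speed_mph)

print(should_alarm(30, 2.5))  # False: tolerated at low speed
print(should_alarm(65, 2.5))  # True: too long at highway speed
```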
Two prominent features of the DDM are continuous video and Live-View.
The system keeps four or more weeks of video on the DVR inside the cab. Once it reaches full capacity, it writes over the old video.
“Continuous video is invaluable,” Colorado said. “It allows supervisors to see the entirety of what happened, and equally as important, what may not have taken place.”
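The overwrite behavior of a continuous-recording DVR is a classic ring buffer: once storage is full, the oldest footage is discarded first. A minimal sketch, with a small "segment" capacity standing in for the four-plus weeks of video the article mentions:

```python
from collections import deque

class ContinuousDVR:
    """Fixed-capacity recorder that overwrites its oldest footage when full."""

    def __init__(self, capacity_segments):
        # A deque with maxlen silently drops the oldest item on overflow.
        self.buffer = deque(maxlen=capacity_segments)

    def record(self, segment):
        self.buffer.append(segment)

    def stored(self):
        return list(self.buffer)

dvr = ContinuousDVR(capacity_segments=3)
for clip in ["week1", "week2", "week3", "week4"]:
    dvr.record(clip)
print(dvr.stored())  # the oldest segment ("week1") has been overwritten
```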
Many of Safety Vision’s customers also use its Live-View feature.
“You can have immediate access to what the cameras are recording inside and outside the truck as it goes down the road,” Colorado said. “The technology is advancing and the cost is coming down for the cellular airtime used to transmit the live video. Several years from now, not having Live-View will be the exception and not the rule for most fleet operators.”
The goals of today’s video-based safety programs extend beyond detecting problems. They also give fleets the ability to document good driving behavior.
“The ActiveVision technology uncovers what could be done better, but it also uncovers when drivers are doing everything right and gives their companies an opportunity to recognize that,” Lytx’s Kerker said. “Sometimes you capture really terrific defensive driving, and that is as important as anything else.”
The Driver-i program from Netradyne also collects data on good and bad driving behaviors, Kahn said.
“We compute driver scores by looking at all of the data points collected during a workday,” he said. “In addition to detecting problems, we’re able to confirm that the driver was actually doing what he was supposed to do.”
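One simple way to turn a day's event data points into a score, in the spirit of Kahn's description, is to subtract weighted penalties from a perfect baseline. The event types, weights and scale here are all hypothetical; Netradyne's actual scoring model is not disclosed in the article.

```python
# Hypothetical per-event penalties (assumed values, not Netradyne's).
EVENT_PENALTIES = {
    "hard_brake": 5,
    "following_too_close": 8,
    "rolling_stop": 10,
}

def daily_score(events, base=100):
    """Start from a perfect score and subtract a penalty for each logged event."""
    penalty = sum(EVENT_PENALTIES.get(e, 0) for e in events)
    return max(0, base - penalty)  # floor the score at zero

print(daily_score([]))                              # 100: a clean day
print(daily_score(["hard_brake", "rolling_stop"]))  # 85: two infractions
```

A clean day with no detected events keeps the full score, which matches the article's point that the same data can confirm good driving, not just flag problems.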
Central Transportation Services is using Driver-i to evaluate its drivers.
“Driver-i is a great measurement of driving habits, because it provides quantitative data,” Cullins said. “The ability of the system to score drivers is key. That’s what really excites me about this product.”