Most AI dash cameras on the market have long been designed to detect safety risks like cellphone use, failure to use a seatbelt, following distance, lane departure and more. Lytx’s safety cameras can identify over 100 risky driving behaviors, but now the video telematics and fleet management technology provider is experimenting with taking that capability even further.
Daniel Witriol, a principal engineer at Lytx Lab, exhibited one of the company’s potential future solutions last week during the Innovation Showdown at its annual user conference, Lytx Protect. Despite its name, the showdown isn’t a competition; it highlights Lytx’s research and development efforts that haven’t yet – and may never – become available products.
“We test ideas before they become products, ask the tough questions and figure out what's worth scaling. We do this in partnership with you because we know that the best ideas come from co-creating with our customers because we know you're in the best position to tell us about your operations,” said Sundari Masters, lead product manager at Lytx Lab, a collaborative space where ideas are tested, refined and brought to life with direct input from customers.
Lytx showcased several prototypes for its customers during this year’s event – the first Protect conference the company has opened to media. CCJ, the sole media member present, was invited to view these prototypes alongside customers.
The first of those prototypes was Project Hercules, a capability that enables customers to define their own custom, cloud-based AI detections, not only from Lytx’s cameras, but from any source.
“Every fleet carries risks that standard detections weren't really built for,” Masters said. “So we got curious. What if your team could describe any of these problems in plain language and have AI go find it in your footage, no technical skills required.”
Custom detection
Witriol performed a live demonstration of the tool, which is still in development, illustrating an example of just one use case based on a real-world delivery fleet customer: driver dress code compliance. This prospective feature would enable that customer to identify when a driver is non-compliant, as well as trigger an action to address the issue.
The tool was initially designed as a form with a multitude of options for the fleet to choose from, but the Lytx Lab team opted instead for a chatbot that lets a fleet create a custom detection from scratch in natural language or by using a provided template.
Witriol prompted the AI to “ensure all drivers are wearing their uniforms while driving,” and with behind-the-scenes logic, it understood the direction and requested a description of the uniform. The user could either type a detailed description or upload sample images.
After the tool analyzed the images to determine key components relevant to the detector it was requested to build, it responded with a list of identified characteristics of the uniform: a dark green, branded polo shirt, a black cap and dark pants.
“This is just to show that there can be a lot of intricacy and detail in what it is you're trying to detect,” Witriol said.
The tool then built out a rule that defined the criteria for when the camera should capture video, and Witriol configured the rule with specifications, including applying the rule only to employees in driver roles and only during working hours.
Once the tool processes all the inputs and creates the algorithm for the detection rule, the fleet can test the algorithm for accuracy before deploying it.
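Lytx hasn’t published a schema for these rules, but conceptually the output of this flow might resemble the following sketch, where the uniform criteria, scope and a compliance check are captured in plain data. All field and function names here are hypothetical illustrations, not a Lytx API:

```python
# Hypothetical sketch of a custom detection rule like the one demoed.
# Every field name is illustrative; Lytx has not published a schema.
uniform_rule = {
    "name": "driver_uniform_compliance",
    "description": "Ensure all drivers are wearing their uniforms while driving",
    "criteria": [
        {"item": "polo_shirt", "color": "dark green", "branded": True, "required": True},
        {"item": "cap", "color": "black", "required": True},
        {"item": "pants", "color": "dark", "required": True},
    ],
    # The rule applies only to driver employees during working hours.
    "scope": {"roles": ["driver"], "hours": "working_hours_only"},
    "status": "testing",  # rules are tested for accuracy before deployment
}

def is_compliant(detected_items: set[str], rule: dict) -> bool:
    """A video passes if every required uniform item was observed in it."""
    required = {c["item"] for c in rule["criteria"] if c["required"]}
    return required <= detected_items

print(is_compliant({"polo_shirt", "cap", "pants"}, uniform_rule))  # True
print(is_compliant({"polo_shirt", "pants"}, uniform_rule))         # False: cap missing
```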
“You realize pretty quickly as we're working on this that oftentimes there are a lot of corner cases and issues you need to work through before you want something like this just to go live everywhere,” Witriol said. “So we introduced this review and refinement process.”
The fleet would be able to set a threshold for success and then review videos that identified drivers for noncompliance with the rule.
Witriol reviewed multiple videos and provided feedback. He confirmed that the system correctly flagged several instances of uniform noncompliance, but he also identified two videos that had been flagged unnecessarily: he marked the cap as optional and clarified that rolled-up sleeves were acceptable, refining the engine’s understanding of the rule and adjusting it accordingly.
If the tool met the defined threshold for success, the fleet could then deploy the rule. If it fell short, the system would continue testing the algorithm against the provided samples until it reached the threshold. If it still couldn’t reach it, the system would interact with the customer through the prompt engine to learn what better performance would look like.
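The review-and-refine loop Witriol described can be sketched roughly as follows. This is a simplified illustration, not Lytx’s implementation: the rule is modeled as a set of required uniform items, and each piece of reviewer feedback (such as “the cap is optional”) relaxes the rule before it is retested against the sample videos:

```python
# Hypothetical sketch of the review-and-refine loop described above.
# A "sample" is a labeled test video: the items seen in it, plus the
# reviewer's ground-truth verdict on whether the driver was compliant.

def classify(required: set[str], sample: dict) -> bool:
    """True = compliant: the video shows every required item."""
    return required <= sample["items"]

def accuracy(required: set[str], samples: list[dict]) -> float:
    """Fraction of sample videos the rule classifies correctly."""
    correct = sum(classify(required, s) == s["label"] for s in samples)
    return correct / len(samples)

def refine_until_ready(required, samples, feedback, threshold=0.9, max_rounds=5):
    """Retest against the samples, folding in reviewer feedback
    (items to mark optional) until the accuracy threshold is met."""
    feedback = iter(feedback)
    for _ in range(max_rounds):
        if accuracy(required, samples) >= threshold:
            return required, "ready_to_deploy"
        optional = next(feedback, None)   # e.g. reviewer says the cap is optional
        if optional is None:
            return required, "needs_more_customer_input"
        required = required - {optional}
    return required, "needs_more_customer_input"

samples = [
    {"items": {"polo", "cap", "pants"}, "label": True},
    {"items": {"polo", "pants"}, "label": True},   # no cap, reviewer says that's OK
    {"items": {"pants"}, "label": False},          # genuinely non-compliant
]
rule, status = refine_until_ready({"polo", "cap", "pants"}, samples, feedback=["cap"])
print(rule, status)  # {'polo', 'pants'} ready_to_deploy
```

The first pass misclassifies the capless-but-compliant video, falling below the threshold; one round of feedback marks the cap optional, and the rule then passes.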
Once live, the fleet can set up automated actions associated with the rule. For example, if the driver is flagged for noncompliance, the fleet can ask the system to send an email to the driver, share a coachable event or even send an in-cab alert.
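Those automated follow-ups amount to a mapping from a noncompliance flag to one or more configured actions, something like the sketch below. The action names mirror the examples in the demo but are illustrative, not a Lytx API:

```python
# Hypothetical sketch of rule-triggered actions: when a rule flags a
# driver, every action the fleet configured for that rule runs.
# Action names are illustrative, not a published Lytx interface.

ACTIONS = {
    "email_driver": lambda event: f"email sent to {event['driver']}",
    "share_coachable_event": lambda event: f"coachable event shared for {event['driver']}",
    "in_cab_alert": lambda event: f"in-cab alert played for {event['driver']}",
}

def handle_flag(event: dict, configured_actions: list[str]) -> list[str]:
    """Run each action the fleet configured for the flagged rule."""
    return [ACTIONS[name](event) for name in configured_actions]

results = handle_flag({"driver": "D-1042", "rule": "uniform_compliance"},
                      ["email_driver", "in_cab_alert"])
print(results)  # ['email sent to D-1042', 'in-cab alert played for D-1042']
```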
Witriol said the team has left room to further develop the tool with the potential to incorporate it with Lytx's recently announced agentic AI features.
“We expect a lot of cool things to be able to come from this,” he said.
From coder to customer
AI tools like this are fundamentally changing the software industry, enabling non-technical users to code using natural language. It’s a shift that is enabling fleets to customize their systems to their diverse individual business models.
At last year’s conference, Lytx demoed the prototype of a detection feature that identified all active construction zones on major roadways around the U.S. in real time. That tool, which is now available to customers, took months to build, Witriol said.
“If you had an idea – if you had something that would really help your business – you better hope that there are a lot of other people sitting in this room near you that need that same thing,” Witriol told the crowd during the Innovation Showcase. “That's the only way we're going to be able to justify the expense of spending all that time building one detector.”
Lytx’s wide customer base faces many unique risks defined by the specific sectors in which those fleets operate. Lytx Chief Technology Officer Rajesh Rudraradhya told CCJ that’s why Lytx has always offered a “sort of out-of-the-box AI that we do really well.”
Project Hercules can create detections in minutes rather than months, and it puts that work in the hands of customers, who can tailor detections to their distinctive needs without help from professional software developers who may not understand the ins and outs of their operations.
Witriol tempered expectations with the caveat that this tool will not replace all machine vision algorithms.
“If you want to do something like fatigue detection, where you really want to analyze every single frame coming out of that video and that device in real time, this is not the tool for that,” he said. “This is really well suited to cases where you can think of a trigger condition where the cloud could effectively contact the device, fetch some video off of it, process it, and then trigger an action based on that.”
His demo of the driver uniform compliance rule is one such case. Another example he offered was a custom detector that would identify and notify when auxiliary components of a waste truck were extended at inappropriate times, causing safety risks.
Rudraradhya said Project Hercules is a gamechanger for Lytx customers.
“We’ve now democratized AI building models, which always used to be something that only my team could do or data scientists could do,” Rudraradhya said. “We’ve enabled someone speaking natural language, with simple sentences, to go through it on their own. That makes it a powerful platform.”