Cyber criminals use AI to accelerate cyberattacks


Bad actors have begun using AI to create fake LinkedIn profiles, map organization charts and make connections. They’re using these profiles to get hired at companies, which then ship them laptops for remote work — giving the bad actors remote access to the company’s network.

That was one of the tactics described at the National Motor Freight Traffic Association (NMFTA) Cybersecurity Conference as hackers find yet another way in.

“They're getting more creative,” said NMFTA COO Joe Ohr.

That’s just one of the many ways AI can accelerate cyberattacks and expand opportunities for bad actors.

Ohr said ransomware remains the most common form of attack, and phishing – a form of social engineering – is one of the most common ways of delivering that. AI is automating those phishing attacks, making it easier to cast a broader net and increase the likelihood of someone taking the bait.

“You really have to enhance your training, because they're using the phishing to do even more,” Ohr added. “And the thing is, now they can hit the smaller trucking companies. They're not just going after the big ones. They can hit the smaller, medium-sized ones, because AI gives them the capability to scale where they didn't have the capability to scale like that before; it had to be more surgical. Now it could be more of a grenade approach.”

Mollie Breen, co-founder and CEO of automation platform Perygee, said AI can help bad actors scale the volume of their attacks.

Just as developers use AI to write apps faster, AI can help someone write an exploit when they previously may not have known how, or help them do it faster, she said. Bad actors could also ask an AI to discover vulnerabilities they can exploit.

“It used to be that it would take an expert – somebody with a ton of experience – to be able to spot those things, a lot of pattern recognition, but that's what AI is really good at,” Breen said. “I think that as an industry, we're not going to see a big change in attacks right away, but we're going to see a big step function in terms of the increase of attacks, and eventually we will probably also see entirely new types of attacks that AI is capable of.”


Seed-key exchange is one example where AI could be used to detect patterns and make an attack more efficient, Ohr said. If a bad actor has access to a truck and can see what comes across the data bus, AI can predict more of what’s coming from the data bus, he added.
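To illustrate why pattern recognition matters here: seed-key exchange is a challenge-response scheme in which an ECU issues a random seed and unlocks only if the tool answers with the matching key. The sketch below is a toy example — the XOR transform and the `SECRET` constant are invented for illustration and do not represent any real ECU algorithm — showing how a weak, predictable transform can be recovered from a single observed seed/key pair:

```python
# Toy illustration only: a (hypothetical) weak seed-to-key transform
# that XORs the seed with a fixed secret constant inside the ECU.
SECRET = 0x5A3C  # invented constant, not from any real vehicle

def ecu_key_from_seed(seed: int) -> int:
    """The ECU's (weak) transform: key = seed XOR constant."""
    return seed ^ SECRET

# An observer who captures even one seed/key pair off the bus can
# recover the constant, because XOR is its own inverse.
observed_seed = 0x1234
observed_key = ecu_key_from_seed(observed_seed)
recovered = observed_seed ^ observed_key  # equals SECRET

def attacker_key_from_seed(seed: int) -> int:
    """The attacker now answers any future challenge without the secret."""
    return seed ^ recovered
```

Real seed-key algorithms are more complex, but the principle stands: if the transform has learnable structure, enough captured exchanges let a model predict keys — which is the kind of pattern recognition AI accelerates.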

Using chatbots

Another way AI is affecting trucking companies and the broader logistics industry is through AI chatbots like ChatGPT, Microsoft Copilot and Google Gemini.

“AI is the silver, shiny new penny out there, and everybody wants to use it, but you have to be cautious,” Ohr said.

NMFTA uses software that alerts it when a new AI tool is launched from a desktop, for example, but caution goes beyond that.

Artie Crawford, director of cybersecurity at NMFTA, said just as companies train employees to identify social engineering attacks, they should train them on what is and is not acceptable to enter into an LLM.  

He said employees need to have specific guidelines about entering company data into an open-source training model or LLM.

“We need to also make sure that the subscriptions that we get at the enterprise level are only using in-organization data and no outside training models,” Crawford added.

And for companies using those LLMs, an enterprise subscription is necessary to keep data secure, Ohr said. NMFTA uses Copilot on the development side.

“It can cut your development costs by 40%, but we want to make sure we're using a version that is not going to feed the public engine because what we're putting in there is proprietary,” he said.

But the danger of using LLMs goes beyond entering data. Companies across every industry have begun using LLMs for customer service, which opens another attack vector. Bad actors can manipulate chatbot inputs to steal data, bypass access controls and perform remote commands.
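One common defense against that kind of chatbot manipulation is to treat everything the model produces as untrusted input: the chatbot never executes arbitrary actions, only ones on an explicit allow-list, with parameters validated before anything runs. The sketch below is a minimal illustration of that pattern — the action names and validation rules are hypothetical, not from any vendor's product:

```python
# Minimal sketch of an allow-list guardrail for an LLM-driven chatbot.
# Action names and validation rules here are invented for illustration.
ALLOWED_ACTIONS = {"track_shipment", "get_rate_quote"}

def handle_chatbot_action(action: str, params: dict) -> str:
    # Refuse anything outside the allow-list -- including actions a
    # manipulated prompt tried to smuggle into the model's output.
    if action not in ALLOWED_ACTIONS:
        return "refused"
    if action == "track_shipment":
        # Validate parameters before acting on them; never pass
        # model-generated strings straight into a query or command.
        pro_number = str(params.get("pro_number", ""))
        if not (pro_number.isdigit() and len(pro_number) <= 11):
            return "invalid input"
        return f"tracking {pro_number}"
    return "ok"
```

The design choice is that the LLM only ever *proposes* an action; deterministic code decides whether it runs, so a prompt-injected "delete all records" request is simply refused.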

“There are so many avenues,” Ohr said. “Sometimes you need to take a step back and say, ‘Have we gotten too automated? Are we too AI? Do we have to reintroduce the human factor?’”

Angel Coker Jones is a senior editor of Commercial Carrier Journal, covering the technology, safety and business segments. In her free time, she enjoys hiking and kayaking, horseback riding, foraging for medicinal plants and napping. She also enjoys traveling to new places to try local food, beer and wine. Reach her at [email protected].