Data Mining: Hitting your numbers


Two years ago, Randy Black discovered that one particular lane in the Shaw Industries transportation division was performing significantly worse than average. Black, e-business manager for Shaw, was looking specifically at a 55-truck operation that delivers flooring to locations in Georgia. The Dalton, Ga.-based flooring giant also operates more than 900 total vehicles in interplant, regional and over-the-road divisions.

As part of a Six Sigma effort – a data-driven approach to improving business processes – Black had begun looking at lane, driver and vehicle metrics for the 55-truck operation on a per-pound basis. Viewing performance in terms of weight of product hauled made it easier to compare loads of varying volumes – carpet versus hardwood, for example.
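A minimal sketch of that per-pound normalization, assuming nothing more than a table of lane labor costs and pounds hauled (the lane names and figures below are invented for illustration, not Shaw's data):

```python
# Illustrative only: divide each lane's labor cost by the pounds hauled so
# lanes carrying different product mixes can be compared on one metric,
# then flag lanes running above the fleet average.
lanes = {
    # lane: (labor_cost_dollars, pounds_hauled)  -- invented figures
    "Atlanta":  (18_500, 1_400_000),
    "Macon":    (12_300,   950_000),
    "Savannah": (16_900,   880_000),
}

cost_per_lb = {lane: cost / lbs for lane, (cost, lbs) in lanes.items()}
fleet_avg = sum(cost_per_lb.values()) / len(cost_per_lb)

for lane, cpl in sorted(cost_per_lb.items(), key=lambda kv: kv[1], reverse=True):
    flag = "  <-- above fleet average" if cpl > fleet_avg else ""
    print(f"{lane:<10} ${cpl:.4f}/lb{flag}")
```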

After Black found that the labor cost per pound was higher than average in this one lane, he looked for the reason and found that drivers were making a daily 50-mile roundtrip simply to drop off bills of lading at the company’s headquarters in Dalton. Shaw easily fixed this problem by placing a drop box for BOLs at the truck leasing facility where drivers picked up and dropped off their trucks each day, Black says.

Through numerous small changes like this, Shaw realized savings of 6.5 percent in labor costs per pound in its Georgia delivery fleet. Shaw achieved this improvement not only by ferreting out bad habits and processes but also by identifying top performers and modeling their best practices.

“That is the fun thing – to not only see what is bad, but to know what the top performers are, and what are they doing right,” Black says. “Without running the data tools or a pro-rate chart on costs, you are really just guessing to see where you are inefficient.”

With varying degrees of sophistication, many fleet owners mine their operational and financial data to unearth hidden poor performers and unheralded best practices. Data mining helps executives get answers – often without even knowing the questions – and may help managers break out of conventional thinking and assumptions that hamper improvement.


Challenging assumptions
With today’s dispatch and accounting software systems, instant communications and various other technological “bells and whistles,” trucking executives seldom lack data. Indeed, trucking companies and private fleets typically generate numerous standard reports that are intended to help managers analyze operational and financial data. And they can get those reports far more frequently than in the past – weekly, daily or even hourly.

Faster and more comprehensive reports certainly represent a huge leap over yesteryear, but there's a risk in focusing too heavily on the metrics you choose. When you decide what to measure, you make assumptions about what is important. These assumptions can sometimes blind you to what truly is happening in your business or industry.

In 2004, a key manager approached Ray Johnson with a concern. Recent monthly financial statements showed that Pitt Ohio Express had only been able to increase its average shipment weight from 1,250 pounds to 1,280 – well below the supposed LTL “sweet spot” of 2,000 to 5,000 pounds. But Johnson, chief financial officer of the Pittsburgh-based carrier, responded by scrutinizing not Pitt Ohio’s performance but rather the metric itself.

Johnson, an engineer and data analysis expert, had joined Pitt Ohio in 1998 after a 30-year career in the aerospace and defense industry. He questioned the significance of average weight per shipment, which was calculated by dividing the total tonnage by the total number of shipments in a month.

“There are lots of examples in business where people use an average that doesn’t mean anything,” Johnson says.

To identify a more revealing measurement of performance, Johnson created a histogram of shipments by weight range using historical data. A histogram is a statistical tool found in most data analysis packages, such as Microsoft Excel, that represents frequency data graphically.

Johnson found that the mode – the most frequently occurring shipment weight – was 400 pounds. The median, or midpoint, was 500 pounds. So a relatively small number of heavy shipments was pulling the average up to the point that it was meaningless.
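A quick way to see why the average misleads is to compare mean, median and mode directly. The shipment weights below are invented to mimic the skew Johnson found, not Pitt Ohio's actual data:

```python
# Hypothetical shipment weights in pounds: mostly small shipments plus a
# handful of heavy ones that drag the mean far above the median and mode.
import statistics

weights = [400] * 40 + [500] * 25 + [600] * 25 + [8000] * 5 + [9000] * 5

print("mean:  ", round(statistics.mean(weights)))   # ~1,285, inflated by heavy loads
print("median:", statistics.median(weights))        # 500, the midpoint shipment
print("mode:  ", statistics.mode(weights))          # 400, the most common shipment
```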

“That changed the dynamics,” Johnson says. “It opened up a whole new approach to moving freight.”

Johnson used other analysis tools, including portfolio and bucket analyses, to characterize Pitt Ohio's customer base from several other perspectives.

The portfolio analysis is similar to what you might do with a stock portfolio, balancing holdings across industry sectors and among large and small companies within each industry.

“The point was to get a broad mix of customers so as to dampen out economic impacts as best we could,” Johnson says.

In the bucket analysis, Johnson ranked customers in terms of revenue from large to small. He then placed them in 10 equal revenue “buckets.” Revenue was used as a proxy for the amount of “work” each bucket entailed – a rough approximation – and then Johnson looked at the profit from each group.
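A minimal sketch of those mechanics, with invented customers, revenue and margins rather than Pitt Ohio's figures: rank customers by revenue, fill ten buckets that each hold roughly one tenth of total revenue, then compare profit by bucket.

```python
# Bucket analysis sketch -- all customer data below is randomly generated.
import random

random.seed(1)
base = [(f"cust{i:03d}", random.lognormvariate(10, 1)) for i in range(200)]
customers = [(name, rev, rev * random.uniform(-0.05, 0.20)) for name, rev in base]

customers.sort(key=lambda c: c[1], reverse=True)      # largest revenue first
total_rev = sum(rev for _, rev, _ in customers)
target = total_rev / 10                               # revenue per bucket

buckets, current, bucket_rev = [], [], 0.0
for cust in customers:
    current.append(cust)
    bucket_rev += cust[1]
    if bucket_rev >= target and len(buckets) < 9:     # last bucket takes the remainder
        buckets.append(current)
        current, bucket_rev = [], 0.0
buckets.append(current)

for i, b in enumerate(buckets, 1):
    rev = sum(c[1] for c in b)
    profit = sum(c[2] for c in b)
    print(f"bucket {i:2d}: {len(b):3d} customers, revenue ${rev:,.0f}, "
          f"margin {profit / rev:6.1%}")
```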

“Often the largest customers are not the most profitable but contribute a large amount to overhead,” Johnson says. “Doing this analysis helped us to understand where our profit was coming from and to develop strategies for improving the profit of buckets that were particularly low.”

One result of this deeper understanding of its customers and freight patterns was a decision by management to change the equipment mix. The company added 25 Sprinter delivery vans in 2005 and another 25 in 2006. Using the vans, the company has increased efficiency 30 to 45 percent in the delivery of smaller shipments – typically 200 pounds and under.

Pushing the limits
For truckload carriers, shipment size generally is irrelevant as a customer typically books an entire trailer. The bigger challenges are freight pricing and routing, especially given variations in fuel prices, toll costs, wages and many other expenses. Data mining can help shed light on these factors.

“If you are generating a flat rate per mile and you don’t do some data mining, you think you are making money – but when you get into lane by lane, you find that you are not charging enough,” says Greg Brown, president of B.R. Williams Inc., a 145-truck carrier and logistics/warehousing provider based in Oxford, Ala.

Four years ago, Brown and others at B.R. Williams used Microsoft Access to develop a data mining tool that makes it possible to analyze all aspects of operations at the trip level, which includes both outbound and inbound dispatch, and all events that happen in between.

With the Access tool, the company converts revenues from its McLeod LoadMaster dispatch and accounting software into net revenue-per-mile for a trip. A staff member also downloads and, in some cases, manually enters expense items to ensure an accurate accounting of costs at the trip level. The costs for each trip include fuel, driver wages, layovers, hotels and credits for fuel surcharges.

With this information in a database, Brown can sort, query and analyze trips using any combination of geographical regions, lanes, customers and drivers. This capability helps B.R. Williams truly understand the profitability of each lane and customer. Brown has discovered the incremental costs of trips to the East Coast due to toll roads, identified areas of low backhaul rates and revealed the incremental cost of fuel in different regions.
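As a rough illustration of that kind of trip-level roll-up (plain Python standing in for the Access tool; the lanes, trips and dollar figures are invented): compute net revenue per mile trip by trip, then aggregate by any dimension such as lane, customer or driver.

```python
# Illustrative sketch only -- not B.R. Williams' actual data or tool.
from collections import defaultdict

trips = [
    # lane,              revenue, fuel_surcharge, fuel, wages, tolls, miles
    ("ATL -> Newark",       2400,            280,  520,   610,   175,  860),
    ("ATL -> Newark",       2350,            270,  540,   615,   180,  865),
    ("ATL -> Memphis",      1100,            120,  230,   290,     0,  390),
    ("Memphis -> ATL",       650,             80,  225,   285,     0,  385),
]

by_lane = defaultdict(lambda: {"net": 0.0, "miles": 0.0})
for lane, rev, fsc, fuel, wages, tolls, miles in trips:
    net = rev + fsc - fuel - wages - tolls        # credit the surcharge, subtract trip costs
    by_lane[lane]["net"] += net
    by_lane[lane]["miles"] += miles

for lane, t in sorted(by_lane.items(), key=lambda kv: kv[1]["net"] / kv[1]["miles"]):
    print(f"{lane:<16} net ${t['net'] / t['miles']:.2f}/mile over {t['miles']:.0f} miles")
```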

“It has changed the way we solicit freight,” Brown says. “We get paid for going into areas, or else we don’t go there at all.”

Bison Transport developed a data mining tool that works with its enterprise software and activity-based cost analysis systems to help managers make complex pricing decisions on a daily basis. The 800-truck fleet based in Winnipeg, Manitoba, Canada, handles a large volume of cross-border shipments, so it must deal with frequent shifts in freight volume between the United States and Canada caused by changes in the exchange rate.

As the U.S. dollar drops in value, exports northbound suddenly heat up. “Two different customers will phone us from a state like Indiana,” says David Fulawka, director of business development. “They both want us to start hauling more freight or handle a new lane.”

In times past, Fulawka says, making a profitable decision on the spot required assumptions and guesswork about, for example, trends in the company's operating ratio (OR) in each lane driven by toll costs or regional differences in fuel prices. Not anymore.

Bison Transport uses an activity-based cost analysis solution called the Truckload Cost Information System (TL/CIS) from Transportation Costing Group (TCG). The system automatically pulls operational and cost data from TMWSuite, the company’s enterprise transportation management system. On an ongoing basis, the tool builds an accurate, trip-by-trip record of the true cost of servicing each lane and customer.

The tool Bison developed – called “Order Ferret” – gives front-line managers quick access to cost data from the TL/CIS database. Order Ferret integrates information from the TL/CIS system into TMWSuite’s central dispatch application. By clicking on a custom Order Ferret icon in the dispatch screen, users bring up a grid or pivot table. They can bring in all the loads out of Indiana in the past month, for example, and then drill down to find the OR of a specific lane, and all the trips that went into computing the OR of that lane.
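As a rough stand-in for that roll-up and drill-down (plain pandas here, not Bison's Order Ferret or TMWSuite; all loads and figures are invented), the idea is to group trips by lane, compute the OR, then pull back the individual trips behind any lane's number:

```python
# Illustrative sketch: operating ratio (cost / revenue) by lane, with drill-down.
import pandas as pd

loads = pd.DataFrame([
    # origin_state, lane,                    revenue, cost   -- invented
    ("IN", "Indianapolis -> Winnipeg",          3400,  2900),
    ("IN", "Indianapolis -> Winnipeg",          3300,  2950),
    ("IN", "Fort Wayne -> Calgary",             4100,  3950),
    ("IN", "Fort Wayne -> Calgary",             4050,  4200),
], columns=["origin_state", "lane", "revenue", "cost"])

summary = loads.groupby("lane").agg(revenue=("revenue", "sum"), cost=("cost", "sum"))
summary["OR"] = (summary["cost"] / summary["revenue"] * 100).round(1)  # under 100 = profitable
print(summary)

# Drill down: every trip that went into one lane's OR.
print(loads[loads["lane"] == "Fort Wayne -> Calgary"])
```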

“[Order Ferret] gives you all the information you need,” Fulawka says. “Based on historical data and the profitability of lanes, we know more accurately what price to quote.”

Profiling personnel performance
Today, more businesses are drawing on new sources of data to find more detailed and dynamic ways to view their operations. But the challenge of using multiple sources of data is deciding where to look for patterns and relationships that can be turned into reasonable – and useful – predictors for your operations.

Matteson, Ill.-based RayTrans Distribution Services designed a data interface between two systems – a voice-over-IP phone system from Vertical Communications that captures important details on employee phone usage; and a custom, homegrown transportation management software system called White Lightning. The data interface has allowed President Jim Ray to connect phone usage with dispatcher productivity and profitability.

“Really good dispatchers have patterns in the way they use phones,” Ray says. By identifying and modeling these patterns, RayTrans can train and monitor new and veteran dispatchers in optimal phone usage.

When Ray began his data mining project, he assumed that dispatchers who made the most calls were the ones moving the most loads. Not true. He then looked to see if dispatchers who took the most incoming calls moved the most loads. Also false. And he looked to see whether dispatchers who had the longest calls moved the most loads. This too proved unreliable. “Some people just blab and get nothing done – some people are effective in blabbing,” Ray says.

The truth, Ray found, “is a combination of them all.” By correlating patterns in phone usage and dispatcher performance, Ray now knows the optimal talk time and the number of calls a dispatcher should be making – both incoming and outgoing – in a day.

“We can tell when a dispatcher is hot based on their phone usage,” Ray says. Through data mining, he found the following trends among dispatchers in the company’s brokerage operation in August and September of this year:

  • Seasoned dispatchers have more calls overall and talk longer than rookie dispatchers; the average talk time was 2.04 minutes versus 1.69 minutes, respectively.
  • Seasoned and rookie dispatchers make about the same number of outbound calls (roughly 41 per day each).
  • Seasoned dispatchers receive many more inbound calls, reflecting the quality of their carrier relationships (114 versus 28 per day).

Ray also correlates the number of calls abandoned and hold time for each phone queue with performance data in the White Lightning system. “There is a direct correlation,” he says. “I look at it every week.”

Ray also has identified a correlation between the profitability of a load and how quickly it moves – that is, the time between order entry and booking a carrier to take the load. With this data, the company now decides which shipments to accept based on how quickly similar loads have moved in the past.
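The check itself is a straightforward correlation. The numbers below are invented solely to illustrate the calculation, not RayTrans data:

```python
# Pearson correlation between time-to-book and load margin.
# statistics.correlation requires Python 3.10 or later.
import statistics

minutes_to_book = [22, 35, 48, 55, 70, 95, 120, 180, 240, 300]
margin_pct      = [18, 17, 16, 15, 14, 12,  11,   9,   6,   4]

r = statistics.correlation(minutes_to_book, margin_pct)
print(f"correlation between time-to-book and margin: {r:.2f}")  # strongly negative here
```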

“What we’ve found is that stuff that moves the quickest is sometimes the best, but it’s not 100 percent true,” Ray says. “We see that the terminals that do the best have an average less than one hour. The ones that do the worst (in time) are moving loads at too low a margin.”

All in one place
To leverage data mining to the fullest, some fleets have developed business intelligence systems that consist of a central database – often called a data warehouse or data mart – where data from multiple sources is stored. These systems include software applications generically referred to as online analytical processing (OLAP) that provide users with the ability to store and quickly extract data from the warehouse on demand into multi-dimensional “cubes.”

While the concept may sound sophisticated, cubes have the familiar look and feel of grids, pivot tables and spreadsheets. On a practical level, the advantage of business intelligence tools is that they provide a common, easy way to report and analyze data at many different levels within a company, says Pitt Ohio’s Johnson. The company has developed a business intelligence solution called Cube-IT.
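A small pandas pivot table can stand in for the idea of a cube slice: pre-aggregated facts viewed along whichever dimensions you pick. This is not Pitt Ohio's Cube-IT, and the customers, lanes and revenue figures are invented:

```python
# One "slice" of a cube: revenue by customer and month. Swapping the index
# and columns slices the same facts by lane, sales rep, or any other dimension.
import pandas as pd

facts = pd.DataFrame([
    ("Acme Corp", "PIT -> CLE", "Jan", 42_000),
    ("Acme Corp", "PIT -> CLE", "Feb", 39_500),
    ("Acme Corp", "PIT -> ERI", "Jan", 11_200),
    ("Bravo Ltd", "PIT -> CLE", "Jan", 27_800),
    ("Bravo Ltd", "PIT -> ERI", "Feb", 31_400),
], columns=["customer", "lane", "month", "revenue"])

print(facts.pivot_table(index="customer", columns="month",
                        values="revenue", aggfunc="sum", fill_value=0))
```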

“Typically in a management position, you have a SQL set of data, and you often need someone to help you write reports,” says Johnson. SQL stands for Structured Query Language, the language that provides an interface to most databases. The problem is that when managers ask someone for a report, they base their request on a preconceived notion.

“The employee goes off and writes it, comes back and you say ‘I should have told you this.’ They rewrite, run it by you again, and you go through an iterative process. By the time you get it done right, you forget why you even asked for it.”

Scott Sullivan, Pitt Ohio’s vice president of information technology services, says the data used in Cube-IT originates from a separate database that contains the company’s activity-based costing model from TCG. Sullivan describes the look and feel of Cube-IT as a “spreadsheet on steroids.”

The company has created separate “cubes” and defined them by business function. A sales manager, for example, has a cube designed to analyze sales volume by customer, by lane, or by customer manager and sales rep. The system’s flexibility allows users to define as many dimensions as they want.

“I primarily use it to track what sales is doing – to track our yields by customer,” Johnson says. “It allows me to really understand the revenue side of our business. I can also dive into the expense side of our business. This allows me to do analysis at the speed of thought.” Johnson can follow data where data wants to lead him, rather than having to test whether data fits his preconceived notions. “It is an inverse way of doing analysis.”

Cube-IT has helped the motor carrier eliminate $40 million in unprofitable business in three years, Johnson says. Pitt Ohio, which generates more than $225 million in revenue annually, also improved its operating ratio by 3 points while growing its business significantly in the same period.

As more businesses fine-tune the art of data mining, they can provide broad access to corporate information and become what consultants like to call an “information democracy.” Another approach is “information monarchy,” where only senior management has access to information. And there is “information communism,” where everyone gets the same information, but it’s not really that useful and they have to wait a long time to get it.

“Most managers understand the value of data and are working with those details themselves,” says Mike Ludwick, vice president of technology for Bison Transport. “We are continually looking at historical data to make decisions, but I think the real advancement is the ownership that non-IT people are taking in their data.”


Real-time insight
Affordable tools help fleets keep a close eye on important metrics

As lead designer and programmer for RayTrans Distribution Services’ transportation management platform called White Lightning, Jim Ray – who is also the president of the Matteson, Ill.-based truck brokerage firm – was perplexed by one task. How could he get the system to track and report automatically – and in real time – the company’s most important key performance indicator, profitability per day?

“I’ve really struggled with that for a long time,” Ray says. After much work, Ray has the process down to a science. “I can break down profitability by day to the terminal, department and individual level. Once you get profitability by day, you can really tune your operations. Instantly, you know when things are rolling and when to be scared shitless.”

Today, many fleets track key performance indicators (KPIs) by using real-time performance management tools they have either developed in-house or purchased from leading enterprise software providers. Often called scorecards or dashboards, these tools automatically capture, calculate, display and refresh KPIs throughout the workday. The good news is that such tools can be had with minimal investment.

“When I think of data mining, I think of something large and complex,” says Quintin Holmberg, MIS manager at J.R. Schugel Trucking, a 600-truck carrier based in New Ulm, Minn. Holmberg says that for real-time monitoring of KPIs, the only tool he needs is Microsoft Excel.

Any time management comes up with a new idea for a performance metric or report they want to monitor daily, hourly or even by the minute, Holmberg develops a set of SQL (Structured Query Language) statements to extract the data from the company’s database, the Innovative Enterprise System. He then writes a Visual Basic for Applications (VBA) macro in the “back end” of a spreadsheet to automatically pull that data into the spreadsheet and refresh it.

“When you get down to the nuts and bolts of measuring, simple tools really work,” Holmberg says. One of the most interesting KPIs that J.R. Schugel developed is the average miles per hour on a load; this metric includes all down time, from pickup to drop, and is a “very interesting way to book freight,” Holmberg says. “It lets you boil down utilization of a single load.”
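A toy stand-in for that kind of query, computing average miles per hour on a load from pickup to drop, downtime and all. This uses an in-memory SQLite table with invented loads, not the Innovative Enterprise System that J.R. Schugel actually queries:

```python
# Sketch of a door-to-door average-mph KPI via SQL.
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE loads (load_id TEXT, pickup TEXT, delivery TEXT, miles REAL)")
con.executemany("INSERT INTO loads VALUES (?, ?, ?, ?)", [
    ("L1", "2006-03-01 06:00", "2006-03-02 14:00", 1040.0),   # invented loads
    ("L2", "2006-03-01 08:00", "2006-03-01 20:00",  420.0),
])

# Elapsed hours from pickup to delivery, including all downtime, via julianday().
rows = con.execute("""
    SELECT load_id,
           miles / ((julianday(delivery) - julianday(pickup)) * 24.0) AS avg_mph
    FROM loads
    ORDER BY avg_mph
""").fetchall()

for load_id, mph in rows:
    print(f"{load_id}: {mph:.1f} mph, pickup to drop")
```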

While data mining techniques are helpful for shedding light on hidden patterns or relationships among historic data, scorecards and dashboards ensure the spotlight shines on key performance metrics throughout the day.

“If you mine data and summarize it, you can get the data to report key indicators, and you are then making day-to-day and more long-term decisions based on actual experience,” says Dave Fulawka, director of business development at Bison Transport, an 800-truck carrier based in Winnipeg, Manitoba, Canada.

The common denominator of different KPI systems is that they facilitate communication among employees by making key information visible and well-known. As a result, companies can operate with total clarity regarding their corporate goals – and how performance measures up against those goals.

“I can really quickly see where the problem is,” Ray says. “Or if a dispatcher is jamming, I make sure to give them praise.”


Case Study 1
Pitt Ohio Express, Pittsburgh
In 2001, Pitt Ohio Express began building a business intelligence tool named Cube-IT, which gave the less-than-truckload carrier new insights about the profitability of each customer. In the past three years, the company has eliminated $40 million in unprofitable business while significantly growing overall.

“From my view, this whole business intelligence platform allows us to bring together all of our customer information,” says Scott Sullivan, vice president of information technology services. “We are moving to be a more customer-centric organization. We no longer give general rate increases.”

Case Study 2
Bison Transport, Winnipeg, Manitoba, Canada
Bison Transport developed a homegrown program called the “Order Ferret” that provides management an analysis tool to make complex pricing decisions on the spot. Users click on an Order Ferret icon while working in the central dispatch screen to bring up a grid of various pricing and profitability data.

“We can make decisions based on real data, as opposed to estimates or guesses,” says Dave Fulawka, director of business development. “You’re not just eyeing a graph or trend. Your decisions are more precise.”

Case Study 3
B.R. Williams Inc., Oxford, Ala.
Wanting to assess profit on a trip-by-trip basis, B.R. Williams in 2002 developed a database and analysis tool using Microsoft Access that helps it make quick changes in lane pricing due to varying external factors, such as toll expenses, fuel prices and backhaul rates.

“We are soliciting more freight in certain lanes because they are more profitable,” says Greg Brown, president. “We know now where our niches are, and we solicit around those lanes.”

Case Study 4
RayTrans Distribution Services, Matteson, Ill.
By integrating two separate databases – a custom transportation management system and a software-based phone system – RayTrans has identified patterns in how successfully dispatchers in its brokerage division use their phones. The company also monitors profitability by day at multiple levels – by terminal, department, dispatcher and truck.

“Once you get profitability by day, you can really tune your operations,” says Jim Ray, president.

Case Study 5
Shaw Industries Group Inc., Dalton, Ga.
To guide its data mining efforts, fleet managers at Shaw Industries follow Six Sigma methodology, which companies use to analyze the efficiency of any process. Shaw’s analysis begins by converting all cost variables into a common metric – performance per pound of flooring – to help managers identify top and bottom performers.

“Essentially, that’s why we do Six Sigma,” says Randy Black, e-business manager. “Measures fluctuate based on the volume of shipments, but over time you see trends because we unify shipment data to the poundage level.”