“I believe in the power of shared data and technology to build a better future.” While it’s unlikely Microsoft co-founder Paul Allen had supply chain in mind when he said this, the statement resonates truer than ever in our industry. Shared data and technology have the power to propel supply chain performance to new levels.
If I were to start telling you the benefits of cloud technology, you would roll your eyes and maybe even be tempted to start clicking on ads! But let’s face it—the traditional benefits are table stakes now. They’re simply expected. Much like when the internet reached a certain level of maturity and smart entrepreneurs began using it to break paradigms (think Uber, Netflix and Instagram), providers of supply chain technology now must think about more than just planning a load, optimizing a pick path or planning an inventory purchase.
Moving Beyond Commoditized Capabilities
Supply chain execution is a mature field by now. We’re talking 30 years of transportation and warehousing software. Tendering a load, RF picking and scheduling appointments work basically the same regardless of which software you use. By now, most companies have exhausted the “5 percent to 10 percent savings” every TMS and WMS vendor promises.
Applications in isolation do the best they can, but an application within a network, leveraging data from the ecosystem—that’s where the real magic starts to happen. Companies that adopt this software model of the future will take great leaps forward in managing their operations more efficiently.
Don’t want to take my word for it? How about Gartner, which last year released the first-ever Multienterprise Supply Chain Business Networks (MESCBN) Magic Quadrant, evaluating networks that “support a community of trading partners that need to coordinate and execute on business processes that extend across multiple enterprises.” A subsequent brief in May 2019 concluded that MESCBNs “are essential and chief supply chain officers need to incorporate them into their business plans.”
The conversation about supply chain software development can no longer start and stop with applications…or data, or a network. Achieving best-in-class performance requires these components working together intelligently, and that is where we believe supply chain tech will take our industry to a whole new level.
Operating in a Network
As previously described, purpose-built applications for transportation and warehousing have long done their jobs very well—optimizing in the isolation of the business, where contact with the outside world is generally limited to one-to-one conversations: Can you take my load? How much? Where is it? By contrast, when an application is built to work within a network, companies gain easier access to services and other benefits. Collaboration with partners is easier, and access to carriers and additional capacity comes faster through membership in a larger community and via digital freight marketplaces.
From a development perspective, applications that exist in a silo are fairly simple. Applications that participate in a network (or are the network) must be designed differently. API calls and web services must be available throughout the application to share or request data, and architects must think ahead about what can be done with information that may not exist natively within the application but is readily available in the extended network.
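To make the contrast concrete, here is a minimal sketch of the two designs. All class and carrier names are hypothetical; in a real system the network lookup would be a web-service call rather than an in-process method.

```python
# Hypothetical sketch: a siloed application can only answer from its own
# data, while a network-aware application also queries the shared
# ecosystem through an API layer.

class NetworkAPI:
    """Stand-in for the network's web-service layer (hypothetical)."""

    def __init__(self, community_carriers):
        # Carriers contributed by the wider trading community, keyed by lane.
        self.community_carriers = community_carriers

    def request_carriers(self, lane):
        # In production this would be an API call across the network.
        return self.community_carriers.get(lane, [])


class TransportApp:
    """A TMS-style application; pass a network to widen its reach."""

    def __init__(self, own_carriers, network=None):
        self.own_carriers = own_carriers
        self.network = network  # None means the app runs in a silo

    def find_capacity(self, lane):
        carriers = list(self.own_carriers.get(lane, []))
        if self.network:
            # Network participation extends the carrier pool beyond
            # the application's native data.
            carriers += self.network.request_carriers(lane)
        return carriers


net = NetworkAPI({"CHI-DAL": ["CarrierC", "CarrierD"]})
siloed = TransportApp({"CHI-DAL": ["CarrierA"]})
networked = TransportApp({"CHI-DAL": ["CarrierA"]}, network=net)

print(siloed.find_capacity("CHI-DAL"))     # ['CarrierA']
print(networked.find_capacity("CHI-DAL"))  # ['CarrierA', 'CarrierC', 'CarrierD']
```

The design point is the optional `network` parameter: the same application logic serves both modes, but only the networked instance can see capacity it does not natively own.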
Producing and Harnessing Powerful Data
Large networks have two primary benefits. First and foremost is the connectivity and collaboration described above. Beyond that, networks that manage transactions generate a natural by-product: data. Tons of data. Imagine tens of millions of loads and billions of dollars in freight spend moving through an application and network, and you’ve got massive amounts of it.
Used properly, that data can be crunched with cutting-edge algorithms (insert your favorite buzz term here—artificial intelligence, machine learning, deep learning) to decipher where the business has been and where it is going. When the results are made available to the original applications, decisions can be driven by data and choices can be made based on analysis and comparison. This is another way in which the original application architecture must be reconsidered.
Take, for example, the scenario of meeting Walmart’s on-time, in-full (OTIF) policy, which requires suppliers to deliver full truckloads within a two-day window 87 percent of the time, and 97.5 percent in-full for food and beverage deliveries. The penalty of 3 percent of the cost of goods for failing to meet these requirements is significant, particularly for an industry that operates on single-digit margins to begin with.
Say a supplier is trying to improve its on-time performance by 3 percentage points to stay above the requirement. Multiple factors could help achieve this, not all of which are visible to a siloed application alone. For instance: with my current carriers, my shipments are late 5 percent of the time. Data-driven analysis shows that reducing that to 2 percent would cost an extra $500,000 in freight spend, which could be more than the penalty itself. Armed with this insight, I can weigh all the factors and decide whether the increased cost is worth it (or maybe have an entirely different conversation with Walmart).
“There are relatively few ideas that you can do just by yourself.” That Paul Allen guy was pretty darn smart. We don’t live in a world of isolation; we are connected in more ways than we probably realize. So why develop and deploy applications in a silo when you can work in connected harmony, using applications with networks and data?