Supply Chains data accuracy and new technology purchase

Roger Oakden | Logistics Management, Operations Planning, Procurement, Supply Chains & Supply Networks


‘New’ technologies in 2025

One skill that technologists have in abundance is coining terminology that potential users will, hopefully, accept and use. Technology products promoted as a 'must have' for supply chains in 2025 include 'agentic AI' (or AI agents) and the 'AI-powered decision intelligence data platform (DIDP)'. What are they, and are they needed?

A brochure states that "AI agents work alongside professional employees in Procurement and Supply Chain operations to make decisions and execute them across Procurement and complex global supply chains. The agents self-reflect, with a memory engine that allows them to learn from mistakes. They integrate reasoning with LLM (large language model) inputs, so that employees can provide directions in natural language, and the agents will pull data and interact directly with internal systems and external stakeholders". The brochure does not, however, state how mistakes are identified.

We are also informed that an "AI-powered decision intelligence data platform (DIDP) enables businesses to collaborate in real time across functions. Unified data involves bringing data from relevant areas into a single cohesive supply chain system. DIDP analyses the real-time data, makes operational recommendations to managers and updates those recommendations when the data reflects changing dynamics".

Whether you are impressed by these technologies is not the point. The questions are whether there is an actual use for a technology in areas of your supply chains, what the limits of the technology are, and whether the resources required to make the technology operate are available.

The workings of your organisation's supply chains can always be improved, most readily in areas controlled by your organisation, where data is most likely to show distinct patterns that can be measured. Examples are warehouse and distribution centre operations, production scheduling, vehicle routing, fleet management and equipment maintenance.

As the challenges of gaining improvements become more complex, there is an increasing reliance on parties external to your organisation's Supply Chains group. This brings an increasing likelihood of errors in the data, of insufficient data being available for comprehensive analysis and, consequently, of insufficient time to evaluate potential options.

Quality of data

The quality of data in an organisation is typically not a topic at management and board meetings. Instead, data quality issues stay with users, where correcting duplicated, outdated, incomplete and missing data is too often considered just part of the job. However, until data quality becomes an issue for senior management, implementing technologies that rely on data is fraught with unknowns.

Within an organisation, the basic Master Data includes relatively static reference data, e.g. customer, location and product (SKU). However, values at the same data point can be inconsistent, which prevents grouping and summarising data into useful information. Examples are:

  • How should a customer or supplier be designated? For example, should it be 'Procter & Gamble', 'Procter and Gamble' or 'P&G'?
  • How should calendar dates be formatted? For example, October 21, 2024 can be written as 10/21/2024, 21/10/2024 or 2024/10/21
  • How should addresses be formatted? For example, 'New York City' can be written as 'NYC', 'New York' or 'NY NY' (which incorporates the State)
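Inconsistencies like these are typically resolved by agreeing one canonical form and normalising incoming records against it. A minimal sketch in Python, where the alias tables and the choice of ISO 8601 as the agreed date format are illustrative assumptions, not from the article:

```python
from datetime import datetime

# Illustrative alias tables: every known variant maps to one agreed value.
SUPPLIER_ALIASES = {
    "procter & gamble": "Procter & Gamble",
    "procter and gamble": "Procter & Gamble",
    "p&g": "Procter & Gamble",
}

CITY_ALIASES = {
    "nyc": "New York City",
    "new york": "New York City",
    "ny ny": "New York City",
}

def canonical_supplier(name: str) -> str:
    """Return the agreed designation, or the input unchanged if unknown."""
    return SUPPLIER_ALIASES.get(name.strip().lower(), name.strip())

def canonical_date(text: str) -> str:
    """Try the formats in use and emit one agreed format (ISO 8601)."""
    for fmt in ("%m/%d/%Y", "%d/%m/%Y", "%Y/%m/%d"):
        try:
            return datetime.strptime(text, fmt).strftime("%Y-%m-%d")
        except ValueError:
            continue
    raise ValueError(f"Unrecognised date: {text!r}")

print(canonical_supplier("P&G"))      # Procter & Gamble
print(canonical_date("2024/10/21"))   # 2024-10-21
```

Note that a date such as 01/02/2024 is genuinely ambiguous and here resolves to whichever format is tried first, which is exactly why agreeing a single format at the point of entry matters more than any amount of downstream cleaning.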

Linked to the Master Data is the Product Master Data, which identifies the product classification and attributes of a product SKU. Studies through this century have identified a high level of error.

  • A 2006 study in the US of nearly 100,000 new and changed products found that more than 60 percent of the data entries required correction for incorrect weight, dimensions or sizing; incorrect product classifications; multiple versions of the same product; and incorrect conversions of internal descriptions to international standards. There were also mistakes in spelling, punctuation and abbreviation.
  • A 2009 study in the UK by GS1 (the global product identification standards organisation) stated that over 80 percent of transactions between suppliers and retailers had inconsistencies in what should be identical data.
  • A similar study in 2014, by GS1 in the US, identified that about 50 percent of the data surveyed was inaccurate.

Although defect-free data should be the ultimate goal, what is an acceptable error rate for data? For physical items, the Six Sigma target is 3.4 defects per million opportunities (DPMO). For data, is it realistic to accept only 97 or so correct records in every 100 data points? And does an accepted error rate reduce trust in the data?
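To put the comparison in numbers: a '97 in 100' accuracy rate is nowhere near the Six Sigma benchmark. The arithmetic below is my own illustration, not from the article:

```python
# Compare the Six Sigma defect target with a 3% record error rate.
SIX_SIGMA_DPMO = 3.4               # defects per million opportunities

data_error_rate = 3 / 100          # 3 incorrect records per 100
data_dpmo = data_error_rate * 1_000_000

print(f"Six Sigma target:     {SIX_SIGMA_DPMO} defects per million")
print(f"3% record error rate: {data_dpmo:,.0f} defects per million")
print(f"Ratio:                {data_dpmo / SIX_SIGMA_DPMO:,.0f}x the target")
```

A 3 percent error rate is 30,000 defects per million opportunities, roughly 8,800 times the Six Sigma target, which frames how far typical master data sits from manufacturing-grade quality.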

Data in the Supply Chains group

In your organisation, when was the last audit of data quality for the Supply Chains group conducted, and were the results published? And why, although problems with data quality can cross department boundaries, is there rarely a person in the Supply Chains group with responsibility for data quality, audits and a process for problem resolution?

Procurement: An analysis at one company found that documents contained errors from 93 percent of suppliers, concerning 88 percent of items supplied. The errors were associated with five areas that exist in all businesses, which illustrates the data quality challenge facing Procurement professionals:

  • Suppliers can trade under multiple divisions and businesses, although ultimately owned or controlled by one entity; that one entity must be identified
  • A supplier's name can be entered differently in the Procurement files and the buying organisation's accounts payable files
  • There can be multiple general ledger and cost centre names
  • Categories and standard industry classification (SIC) codes may be used in one part of the organisation, but not in others
  • Descriptions of a supplied item or service can vary, depending on the user; a standard descriptor is required
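The second problem, the same supplier entered differently in Procurement and accounts payable files, is often tackled by fuzzy matching candidate records for human review. A sketch using Python's standard library, where the supplier names and the 0.85 threshold are illustrative assumptions to be tuned per data set:

```python
from difflib import SequenceMatcher

def normalise(name: str) -> str:
    """Crude normalisation before comparison: lower-case, drop punctuation."""
    return "".join(ch for ch in name.lower() if ch.isalnum() or ch == " ").strip()

def likely_same_supplier(a: str, b: str, threshold: float = 0.85) -> bool:
    """Flag two supplier records as probable duplicates for human review.

    The 0.85 threshold is an assumption; in practice it is tuned against a
    sample of known duplicates and known distinct suppliers.
    """
    return SequenceMatcher(None, normalise(a), normalise(b)).ratio() >= threshold

# A Procurement record and an accounts payable record for the same entity:
print(likely_same_supplier("Acme Industrial Supplies Ltd",
                           "ACME Industrial Supplies Limited"))   # True
```

The output is a candidate list, not a verdict: the point of the five areas above is that a person with data quality responsibility still has to confirm which records refer to one entity.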

Operations Planning: In organisations that have implemented Sales & Operations Planning (S&OP), the Master Planner, as facilitator for the process, relies on the quality and reliability of the input data aggregated from multiple sources. These inputs may be wrong, but what is the audit process? For example, at a large transport organisation, an audit of the spreadsheets used in the planning process identified that more than 90 percent of the formulas used contained at least one error.

Logistics: An example of the challenges is a US study of process compliance, which found that one business's supply chains accessed 117 disconnected document formats, built in Access, Excel and Google Analytics, with opportunities for errors in all of them. Incorrect data can be uploaded and transmitted at each node.

Given the data quality challenges to overcome, what value is there in purchasing and then implementing technologies that rely on high data quality as a prerequisite? Is it not better to review the objectives and identify whether the investment is justified, or whether there is a different approach to addressing the challenge?


About the Author

Roger Oakden


With my background as a practitioner, consultant and educator, I am uniquely qualified to provide practical learning in supply chains and logistics. I have co-authored a book on these subjects, published by McGraw-Hill. As the program manager at RMIT University in Melbourne, Australia, I developed and presented the largest supply chain post-graduate program in the Asia Pacific region, with centres in Melbourne, Singapore and Hong Kong.
