In our post, Improve Your Terminal Operations With These 6 Steps, we note that operators must obsess over clean data. To measure everything correctly in a terminal, the data must be clean, which means finding, correcting, or removing inaccurate records from your dataset. Dirty data, meaning a dataset that contains errors, not only impacts your measurements but also your day-to-day operations. Dirty data can even affect accounting, which can lead to a negative experience for your customers.
The first step on the path to clean data is to identify any dirty data in your data warehouse. Here are three common scenarios where you’ll find data that needs fixing.
Three Common Dirty Data Scenarios in Port Terminals
A manifest, or cargo document, can arrive with dirty data. In this scenario you cannot control the data coming in, but it is important to be aware of the risk. Suppose the manifest includes a typo in the consignee name: you may accidentally create a new consignee in the system, which wreaks havoc on the accounting department.
Incoming electronic data, such as an EDI message, can also arrive with dirty data. Again, if a consignee’s name is misspelled, you’ll end up with two records in the system when there should be one. In this scenario, the customer ends up bearing the problem with their cargo, and your team is stuck spending hours identifying and fixing the error.
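One way to catch this class of error is to compare an incoming consignee name against existing records before creating a new account. Here is a minimal sketch using Python's standard library; the similarity threshold and the names are illustrative assumptions, not Octopi's actual logic.

```python
# Flag near-duplicate consignee names before a typo creates a second
# account. Threshold and sample names are illustrative assumptions.
from difflib import SequenceMatcher

def similar_consignees(new_name, existing_names, threshold=0.85):
    """Return existing names that look like typos of new_name."""
    new_name = new_name.strip().lower()
    return [
        name for name in existing_names
        if SequenceMatcher(None, new_name, name.strip().lower()).ratio() >= threshold
    ]

existing = ["Acme Shipping Ltd", "Global Freight Co", "Harbor Logistics"]
# A manifest arrives with a one-letter typo in the consignee name:
print(similar_consignees("Acme Shiping Ltd", existing))  # ['Acme Shipping Ltd']
```

A real system would likely also normalize punctuation and legal suffixes ("Ltd", "Inc") before comparing, but even a simple check like this can stop a misspelled consignee from silently becoming a duplicate record.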
Do you have ‘floating containers’ in your yard? These are containers that the system records at an upper tier of a stack with no container underneath them. How quickly can you identify a floating container in your yard? Are you always able to fix these before the vessel arrives? If not, your data will suffer.
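Floating containers are straightforward to detect programmatically once yard positions are in a database. The sketch below assumes positions are recorded as (bay, row, tier) tuples with tier 1 at ground level; the function name and data shape are illustrative, not a specific TOS schema.

```python
# A minimal sketch of floating-container detection, assuming yard
# positions are recorded as (bay, row, tier) with tier 1 at ground
# level. Names and data shape are illustrative assumptions.

def find_floating_containers(positions):
    """Return IDs of containers recorded above an empty slot.

    positions: dict mapping container ID -> (bay, row, tier).
    """
    occupied = set(positions.values())
    floating = []
    for container_id, (bay, row, tier) in positions.items():
        # Every tier below this container must also be occupied;
        # otherwise the record says the box is hovering in mid-air.
        if any((bay, row, t) not in occupied for t in range(1, tier)):
            floating.append(container_id)
    return floating

yard = {
    "MSCU1234567": ("A", 1, 1),
    "TGHU7654321": ("A", 1, 2),
    "CAIU0000001": ("B", 3, 2),  # nothing recorded at tier 1 below it
}
print(find_floating_containers(yard))  # ['CAIU0000001']
```

Running a check like this on a schedule, rather than waiting for a crane operator to discover the discrepancy, is what makes it possible to fix the record before the vessel arrives.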
How to Solve the Dirty Data Problem
In the three examples above, it’s clear that the best way to avoid bad data is to eliminate manual entry. However, dirty data is almost inevitable in any terminal, so having a solution in place that identifies and corrects it is vital for any port. Terminals cannot always control how clean incoming data is, so a top-notch system is necessary.
Data errors can be simple or complex, but they all require a process to fix. The faster a problem can be identified, the faster it can be fixed, and fixing data quickly saves both time and money. If you spend only 30 minutes per day fixing data, basic math shows that this time adds up to a serious loss in resources -- roughly $3,100 per year* per employee. That, plus the cost of an unhappy customer, shows how dirty data can impact a terminal.
How Octopi Solves the Dirty Data Problem
Our team understands the profound impact that dirty data can have on a terminal and its customers. We also know that the most important step is to identify the problem, especially when no one else realizes there is one. Our system alerts our customers to anomalies in their data. Octopi uses unique algorithms to identify common bad-data scenarios and alerts the necessary team members and customers so that the dirty data can be found and corrected.
*Calculation: 30 wasted minutes x 261 business days = 7,830 wasted minutes per year, or 130.5 hours. At a $24 USD/hour salary, 130.5 hours x $24 is about $3,132 per year.
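The footnote's estimate can be reproduced as simple arithmetic; the inputs below are the footnote's own assumed figures.

```python
# The footnote's cost estimate as arithmetic. All inputs are the
# assumed figures from the footnote, not measured values.
wasted_minutes_per_day = 30
business_days_per_year = 261
hourly_salary_usd = 24

wasted_hours = wasted_minutes_per_day * business_days_per_year / 60
annual_cost = wasted_hours * hourly_salary_usd
print(f"{wasted_hours} hours wasted, ${annual_cost:,.0f} per year")
# 130.5 hours wasted, $3,132 per year
```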
About Author: Guille Carlos
Guille is the founder of Octopi: a modern, web-based Terminal Operating System (TOS). He is constantly trying to simplify processes and uses data to truly understand customer needs.