Golden Records: the key to a clean database?
Planned Maintenance, Procurement, Quality and Crew Management are all data-rich domains whose information is usually stored in databases existing both in the office and on board the ships.
Over time, this information is likely to be handled by a large number of people. Some of them are well trained on the software they work with; others are not.
The natural result of this scenario is a gradual but constant deterioration of data quality. There can be several reasons for this, but the main culprit is generally the duplication of records.
To give an example, it is extremely common for the same spare part to be created multiple times in a database simply because it can be used on more than one piece of equipment. Or an entire machinery item is duplicated because the database engineer who created the record was unaware that a second one already existed in a different area of the ship.
Consumables are often created with generic descriptions that do not always match the requisition criteria, and this usually ends with the creation of additional items which are, again, quite generic.
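Near-duplicates like these rarely match character for character, so a first pass typically relies on fuzzy string comparison. The sketch below is a minimal, illustrative example (the item descriptions and the 0.85 threshold are assumptions, not values from any real catalogue), using Python's standard-library `difflib` to flag suspect pairs for human review:

```python
from difflib import SequenceMatcher

def normalise(desc: str) -> str:
    """Lower-case and collapse whitespace so trivial variations match."""
    return " ".join(desc.lower().split())

def find_suspects(descriptions, threshold=0.85):
    """Return pairs of descriptions whose similarity exceeds the threshold."""
    suspects = []
    for i, a in enumerate(descriptions):
        for b in descriptions[i + 1:]:
            ratio = SequenceMatcher(None, normalise(a), normalise(b)).ratio()
            if ratio >= threshold:
                suspects.append((a, b, round(ratio, 2)))
    return suspects

# Hypothetical catalogue entries: the first two describe the same spare part.
items = [
    "FILTER ELEMENT, OIL, MAIN ENGINE",
    "Filter element oil main engine",
    "GASKET, CYL HEAD",
]
print(find_suspects(items))
```

In practice the threshold would be tuned on known duplicates, and the candidate pairs would still be verified by a person before any merge.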
In general, users who cannot find what they are looking for because of incomplete information will create more records, adding to the clutter and spiralling your database into a state that will ultimately make your system unusable.
It has been demonstrated that duplicates are a concern for procurement processes, causing companies financial losses every day, not to mention the frustration generated for users. They also distort procurement statistics and key performance indicators (KPIs), which will ultimately impact your results.
Remember: any ERP system is only as good as the data supporting it!
When you find yourself in such an unpleasant situation, spreading your arms and looking up to the sky won’t solve it, but proceeding with an in-depth analysis aimed at the creation of a “golden record” may do the trick.
A golden record is a unique record created from a list of verified duplicate records, which, by definition, replaces them all.
“The Golden Record is the ultimate prize in the data world. A fundamental concept within Master Data Management (MDM) defined as the single source of truth; one data point that captures all the necessary information we need to know about a member, a resource, or an item in our catalogue – assumed to be 100% accurate”.
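Once a set of duplicates has been verified, the golden record is built by choosing a surviving value for each field. The sketch below is one possible survivorship rule, not a standard algorithm: keep the most frequent non-empty value per field, breaking ties in favour of the longer (more complete) one. The record layout and values are invented for illustration:

```python
from collections import Counter

def build_golden_record(duplicates):
    """Merge verified duplicate records (dicts) into one golden record.

    Survivorship rule (an assumption for this sketch): for each field,
    keep the most frequent non-empty value, preferring longer values on ties.
    """
    golden = {}
    fields = {f for rec in duplicates for f in rec}
    for field in fields:
        values = [rec.get(field, "").strip() for rec in duplicates]
        values = [v for v in values if v]  # drop blanks
        if values:
            counts = Counter(values)
            golden[field] = max(values, key=lambda v: (counts[v], len(v)))
    return golden

# Three verified duplicates of the same hypothetical spare part.
dupes = [
    {"code": "SP-1001", "description": "FILTER ELEMENT", "maker": ""},
    {"code": "SP-2044", "description": "FILTER ELEMENT, OIL", "maker": "MAN B&W"},
    {"code": "SP-1001", "description": "FILTER ELEMENT", "maker": "MAN B&W"},
]
print(build_golden_record(dupes))
```

Real MDM tools apply richer survivorship rules (source trust, recency, completeness scores), but the principle is the same: one consolidated record replaces the verified duplicates.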
So now you have a way to keep your database in top condition: look for duplicates and remove them through the creation of golden records. But how do you get there?
One way is to proceed with a data cleansing project. This is usually done all at once, but it can be very time-consuming, especially if the information has been piling up for years. Data cleansing is most beneficial when performed on a regular basis, which brings us to the concept of “database maintenance”.
The companies achieving better results do not wait for their databases to be engulfed in meaningless data before taking action; instead, they carry out continuous monitoring with the help of specifically designed tools and skilled resources.
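Continuous monitoring can start very simply: track a data-quality KPI on a schedule and act when it drifts. The sketch below is an assumed, minimal metric (the share of catalogue entries whose normalised description is not unique); the sample data is invented:

```python
from collections import Counter

def duplicate_ratio(descriptions):
    """Share of entries whose normalised description collides with
    another entry -- a simple data-quality KPI to trend over time."""
    if not descriptions:
        return 0.0
    normalised = [" ".join(d.lower().split()) for d in descriptions]
    counts = Counter(normalised)
    colliding = sum(c for c in counts.values() if c > 1)
    return colliding / len(normalised)

# Hypothetical snapshot: "GASKET" and "gasket" collide after normalisation.
catalogue = ["GASKET", "gasket", "O-RING", "Filter"]
print(f"{duplicate_ratio(catalogue):.0%}")  # → 50%
```

Run periodically (e.g. from a scheduled job), the trend of this number tells the maintenance team whether clutter is accumulating faster than it is being cleaned.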
Every system will, in the long term, leave behind waste data, and unless you want to expensively rebuild your databases to speed up your applications, a much wiser choice is to invest in a great database maintenance team.
Do you want to be notified when new content is available? Please subscribe to our newsletter.
Written by Alberto Rinaldo