Data Validation is the first step in the journey to Customer Data Governance

It’s debatable which of your customer data is the most valuable, but without a doubt the data that is unique to your business is the most important. It is the data you rely on to make critical business decisions and to engage most comprehensively and effectively with your customers, and it is likely to be valued far above other data.

It helps to provide a definition of what that unique first-party data really is.

First-party data is the data that you obtain directly from your prospects, customers and audiences. Typically it is given first-hand through form entries, engagement, calls or transactions.

You could think of it as the classic Rolodex entry, but these days it is more likely to be lurking in your ERP, CRM, CDP or POS system.

Your first-party data may also be present in emails, spreadsheets and, of course, in a CMDM such as the Pretectum Customer MDM.

Your first-party data typically carries all the essential information you need to contact, transact with and engage a given customer or prospect. Over time that data may be enhanced through the addition of measures, insights and indicators related to transactional behaviour, preferences, tastes and engagement. Some of these enhancements may be unique to your business and its direct relationship; others might come from annual data refreshes or contact update requests. Some may be inferred.

The reason you have this data is to minimize friction when engaging and transacting with the customer or prospect. You minimize friction best by personalizing the customer experience every time they engage with your brand, message, people, processes and technologies.

But there is a problem with first-party data, one that is inherent in almost all data that is not appropriately managed and one that devalues the data. The problem relates to the classical six core data quality dimensions:

  • Accuracy
  • Completeness
  • Consistency
  • Timeliness
  • Validity
  • Uniqueness

You can learn more about these on the web, but they are just a handful of the 65 dimensions and subdimensions created by DAMA, which flex according to the needs of different industries.

Data quality dimensions were described by Richard Y. Wang and Diane M. Strong in “Beyond Accuracy: What Data Quality Means to Data Consumers”, where they recognized 15 dimensions. DAMA International, a not-for-profit, vendor-independent, global association of technical and business professionals dedicated to advancing the concepts and practices of information and data management, developed a more elaborate list containing 65 dimensions and subdimensions.
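To make those six core dimensions a little more concrete, here is a minimal sketch in Python that applies one simple check per dimension to a single customer record. The field names, rules and reference set are assumptions made purely for the example and are not drawn from any particular product or standard.

```python
import re
from datetime import date, timedelta

# Hypothetical reference set used for the accuracy check (an assumption for this example).
KNOWN_POSTAL_CODES = {"94105", "10001", "60601"}

def check_record(record, all_records):
    """Apply one simple, illustrative check per core data quality dimension."""
    issues = []

    # Completeness: every required field has a value.
    required = ("customer_id", "name", "email", "postal_code", "last_updated")
    if any(not record.get(field) for field in required):
        issues.append("completeness: one or more required fields are empty")

    # Validity: the email matches an expected pattern.
    if not re.match(r"^[^@\s]+@[^@\s]+\.[^@\s]+$", record.get("email", "")):
        issues.append("validity: email does not match the expected format")

    # Accuracy: the postal code can be confirmed against a trusted reference set.
    if record.get("postal_code") not in KNOWN_POSTAL_CODES:
        issues.append("accuracy: postal code not found in reference data")

    # Consistency: the same customer id does not appear elsewhere with a different name.
    if any(other is not record
           and other.get("customer_id") == record.get("customer_id")
           and other.get("name") != record.get("name")
           for other in all_records):
        issues.append("consistency: conflicting names for the same customer id")

    # Timeliness: the record has been refreshed within the last year.
    if record.get("last_updated", date.min) < date.today() - timedelta(days=365):
        issues.append("timeliness: record not updated in over a year")

    # Uniqueness: no other record shares this email address.
    if sum(1 for other in all_records if other.get("email") == record.get("email")) > 1:
        issues.append("uniqueness: duplicate email in the data set")

    return issues
```

Run over a list of customer dictionaries, each call returns the dimension-level issues for one record, which is exactly the kind of measurement the next paragraphs argue for.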

So the first challenge in ensuring that your data is valuable initially, and continues to be valuable over time, is ensuring that it meets some sort of data quality objective. Defining data quality measures and then measuring your data against them is the best way to determine whether your data is going to be useful.

Pretectum helps you manage first-party data quality by letting you define the measures of quality upfront, before you even add data to your CMDM. When you eventually load or enter that data, the system informs you of problems and keeps you aware of records that may be incomplete or inconsistent with defined values. Other mechanisms let you verify the data against external reference sets to assess accuracy and validity as well as uniqueness. All the while, the latest version of the data is served up to you, with the ability to examine the change history of each record over time.
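As a rough illustration of that schema-first pattern, defining the quality rules before any data arrives, reporting problems on load and keeping every version of a record, here is a small hypothetical sketch. It is not Pretectum's actual API or data model; the schema fields, rules and history mechanism are assumptions made purely for the example.

```python
import re
from datetime import datetime

# A hypothetical schema defined up front, before any data is loaded
# (the fields and rules here are assumptions made purely for illustration).
CUSTOMER_SCHEMA = {
    "customer_id": {"required": True},
    "name":        {"required": True},
    "email":       {"required": True, "pattern": r"^[^@\s]+@[^@\s]+\.[^@\s]+$"},
    "country":     {"required": False, "allowed": {"US", "GB", "DE", "JP"}},
}

class CustomerStore:
    """Illustrative store that validates records on load and keeps a change history."""

    def __init__(self, schema):
        self.schema = schema
        self.history = {}  # customer id -> list of (timestamp, record version)

    def validate(self, record):
        """Report any values that fall short of the schema's rules."""
        problems = []
        for field, rules in self.schema.items():
            value = record.get(field)
            if rules.get("required") and not value:
                problems.append(f"{field}: missing required value")
            elif value and rules.get("pattern") and not re.match(rules["pattern"], str(value)):
                problems.append(f"{field}: value does not match the expected pattern")
            elif value and rules.get("allowed") and value not in rules["allowed"]:
                problems.append(f"{field}: value is not in the allowed set")
        return problems

    def load(self, record):
        """Store the record as a new version and return any quality problems found."""
        problems = self.validate(record)
        versions = self.history.setdefault(record.get("customer_id"), [])
        versions.append((datetime.now(), dict(record)))
        return problems

    def latest(self, customer_id):
        """Return the most recent version of a record; earlier versions stay available."""
        versions = self.history.get(customer_id, [])
        return versions[-1][1] if versions else None
```

The point of the sketch is the order of operations: the rules exist before the first record arrives, every load reports its problems instead of silently accepting or rejecting data, and prior versions of each record remain available for inspection.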
