Introduction to Data Quality


Fullcast has robust Data Quality functionality that assesses data quality along three key dimensions. With this assessment, you can make informed decisions about which data points to use in your go-to-market strategy and territory creation. This article delves into the core aspects of Fullcast's Data Quality settings, focusing on data quality threshold settings, how they work, and the benefits they provide.

The Importance of Data Quality

Nearly everyone in sales and RevOps is quick to bemoan their data quality. Indeed, data quality issues can significantly impact the efficiency and effectiveness of your sales motion. That said, when it comes to data for the purpose of go-to-market strategy, it’s important to align on what data you’re looking at and which sources you’re getting it from.  


What data? 

  1. CRM Objects: When sales and RevOps talk about data quality issues, they are often thinking of contact or opportunity data. When it comes to GTM strategy, however, account data is typically the most important: in Fullcast, your market segments and territory structure typically use account data as the primary input. 
  2. Fields: Once we’re looking at account data, we can focus further on the subset of fields that are actually used to create the GTM segmentation model. That is typically quite a small number of fields compared to all of the data that lives in the CRM. 

In sum, focusing on a specific set of fields on the account object can help alleviate the feeling that your data quality issues are insurmountable. 


Which sources? 

In addition to understanding the specifics of your CRM data, understanding the broader landscape of data complexity is also essential to approaching data quality. Two components of this are recency and specialization. 

  1. Recency: Third-party data sources are typically outdated by at least six months. For this reason, we recommend organizational policies that prioritize sources of truth. For example, SDRs often have the most current information, since they are talking directly to people who work at a company. Events like mergers and acquisitions can take years to be reflected in data providers’ datasets, so it is worth establishing internal policies that account for them.
  2. Specialization: Additionally, global data sources like Dun & Bradstreet (D&B) and ZoomInfo can quickly deteriorate in quality outside the US, so if you are going to market in non-US countries, you may not get what you need from global providers. These providers may also lack data relevant to industry-specific needs: for example, if you sell to hospitals, information such as “number of beds” may be a critical input to your segmentation. To mitigate these challenges, it is advisable to supplement popular data providers with industry-specific providers, and to establish sources of truth (or a priority order among sources) for specific data points to maintain data integrity.

In sum, Fullcast’s data quality functionality can only operate on the data that exists in your CRM. We can certainly suggest best-practice policies for your organization and recommend reputable data providers, but those suggestions sit outside the data quality functionality itself. 


Data Quality in Fullcast

Fullcast's Data Quality functionality enables users to set thresholds on acceptable data quality levels during the import process. This feature is particularly useful for fields used in creating territory hierarchies. Here's how it works:


Data Quality Dimensions and Thresholds

  1. Completeness: This measures how "filled in" a field is across all records. For example, a completeness threshold of 95% means that 95% of the records must have this field filled in. Below is a screenshot we used for the Number of Employees field. Note that you can also determine whether to accept Null or Empty values. In this case, we require accurate employee counts as part of our market segmentation, so we are not allowing nulls.

    Completeness thresholds for the Number of Employees field
  2. Uniqueness: This looks at every value in a given field to confirm it is unique. For instance, on a Company ID field, setting a uniqueness threshold of 100% ensures that every ID number is unique, which is crucial to avoid duplicates. Below is a screenshot we used for the Company ID field.

    Uniqueness thresholds for the Company ID field
  3. Accuracy (also referred to as "Validity"): This ensures that the values in a field are valid according to a predefined set of conditions. For example, you can set a rule that employee counts of 0 are not valid. Below is a screenshot we used for the Number of Employees field; a sketch of all three checks follows this list.

    Accuracy (Validity) thresholds and conditions for the Number of Employees field
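
To make these dimensions concrete, below is a minimal sketch of how the three checks could be computed over a small table of account records, using pandas. The field names, sample values, and thresholds are illustrative assumptions, not Fullcast's internal implementation.

```python
# Minimal sketch of the three data quality checks over a toy account
# table. Field names, data, and thresholds are illustrative only.
import pandas as pd

accounts = pd.DataFrame({
    "company_id": ["A1", "A2", "A2", "A4", "A5"],
    "num_employees": [250, 0, 1200, None, 35],
})

# Completeness: share of records where the field is filled in.
completeness = accounts["num_employees"].notna().mean()

# Uniqueness: share of values in the field that appear exactly once.
uniqueness = (~accounts["company_id"].duplicated(keep=False)).mean()

# Accuracy/Validity: share of records passing a predefined condition,
# here the example rule that employee counts of 0 are not valid.
accuracy = (accounts["num_employees"] > 0).mean()

print(f"Completeness: {completeness:.0%}")  # 80% -> fails a 95% threshold
print(f"Uniqueness: {uniqueness:.0%}")      # 60% -> fails a 100% threshold
print(f"Accuracy: {accuracy:.0%}")          # 60% (null and 0 both fail > 0)
```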

Data Quality Score

Based on the thresholds set for each of these dimensions, Fullcast creates an aggregate data quality score. The score represents the percentage of the records in your system that pass all three of your thresholds for Completeness, Uniqueness, and Accuracy. This score can be viewed on the Entities & Fields settings page. Below is a screenshot of the Number of Employees field, which has a Data Quality score of 99.07%. 
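
As an illustration of that definition, the sketch below computes the aggregate as the share of records that pass all three checks at once. The data and field names are again hypothetical, not Fullcast's exact computation.

```python
# Sketch: aggregate score as the share of records passing all three
# checks simultaneously. Toy data; illustrative only.
import pandas as pd

accounts = pd.DataFrame({
    "company_id": ["A1", "A2", "A2", "A4", "A5"],
    "num_employees": [250, 0, 1200, None, 35],
})

passes = (
    accounts["num_employees"].notna()                 # completeness
    & ~accounts["company_id"].duplicated(keep=False)  # uniqueness
    & (accounts["num_employees"].fillna(0) > 0)       # accuracy/validity
)
print(f"Data Quality score: {passes.mean():.2%}")  # 40.00% on this toy data
```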


Rules on Relationships

In addition to setting Data Quality thresholds on individual fields, Fullcast can also set data quality thresholds on relationships between fields on different objects. For example, if you import both accounts and opportunities from Salesforce, you could potentially be importing opportunities that reference accounts that haven’t been imported into Fullcast. Fullcast's Data Quality functionality can be configured to prevent this situation, so that all references across records are valid.
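
A minimal sketch of that cross-object check, assuming Salesforce-style Id and AccountId fields and made-up record IDs, might look like this:

```python
# Sketch of a cross-object referential check: flag opportunities whose
# AccountId does not match any imported account. Data is made up.
import pandas as pd

accounts = pd.DataFrame({"Id": ["001A", "001B", "001C"]})
opportunities = pd.DataFrame({
    "Id": ["006X", "006Y", "006Z"],
    "AccountId": ["001A", "001D", "001B"],  # "001D" was never imported
})

orphaned = opportunities[~opportunities["AccountId"].isin(accounts["Id"])]
valid_ratio = 1 - len(orphaned) / len(opportunities)
print(f"Valid references: {valid_ratio:.0%}")  # 67%; opportunity 006Y flagged
```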


Data Quality Indicators

Data quality scores are represented using a color-coded tagging system (a sketch of the mapping follows the list):

  • Green: The data meets the threshold.
  • Yellow: The data is below the threshold but not significantly.
  • Red: The data does not meet the threshold.
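
For illustration, the sketch below shows one way such a tagging rule could be expressed. The five-point tolerance defining the yellow band is an assumption made for the example, not Fullcast's documented cutoff.

```python
# Hypothetical mapping from a score and its threshold to a color tag.
# The 5-point yellow tolerance is an assumed value for illustration.
def dq_tag(score: float, threshold: float, tolerance: float = 5.0) -> str:
    if score >= threshold:
        return "green"   # meets the threshold
    if score >= threshold - tolerance:
        return "yellow"  # below the threshold, but not significantly
    return "red"         # does not meet the threshold

print(dq_tag(99.07, 95.0))  # green
print(dq_tag(92.0, 95.0))   # yellow
print(dq_tag(50.0, 95.0))   # red
```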

To avoid undue concern about data quality, Fullcast limits the visibility of data quality scores to specific areas:

  1. Entities & Fields Imports Settings: Users can see data quality indicators when configuring imports. See screenshot below.

    In the Entities & Fields Settings, you can see the data quality score indicated in the column labeled DQ.
  2. Segmentation Rule Creation: Data quality indicators are displayed for fields used in segmentation rules. For instance, if you're using employee counts to create segments (e.g., enterprise, mid-market, SMB), but the fill rate for employee counts is only 50%, many accounts will end up Unassigned due to missing data (see the sketch after this list). In the screenshot below, you can see the data quality tag for each of the fields used to create the territory segmentation rules.

    Data quality scores for Status, Shipping State, and ID fields
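
The sketch below illustrates the point with a hypothetical 50% fill rate: accounts missing an employee count cannot match any band and fall out as Unassigned. The segment boundaries and account names are made up.

```python
# Sketch of why low fill rates surface during segmentation: records with
# a missing employee count land in "Unassigned". Illustrative data.
import pandas as pd

accounts = pd.DataFrame({
    "name": ["Acme", "Globex", "Initech", "Umbrella"],
    "num_employees": [12000, None, 150, None],  # 50% fill rate
})

def segment(employees):
    if pd.isna(employees):
        return "Unassigned"  # missing data -> no segment
    if employees >= 1000:
        return "Enterprise"
    if employees >= 200:
        return "Mid-Market"
    return "SMB"

accounts["segment"] = accounts["num_employees"].map(segment)
print(accounts["segment"].value_counts())  # half the accounts are Unassigned
```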


Data Pre-processing

As part of the standard import process, the following pre-processing is conducted. 

  • Geographical Data: Fullcast compares address information against Mapbox (a geographical data provider) to normalize inconsistencies.
  • Industry Data: NAICS codes are verified against the NAICS database to ensure validity.
  • Parent-Child Relationships: Parent-child hierarchies are validated to ensure they contain no loops or errors (see the sketch below).
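
As an illustration of the parent-child validation idea, the sketch below walks each account's parent chain and flags any cycle. The hierarchy data is made up, and this is not Fullcast's actual validator.

```python
# Sketch: detect loops in a parent-child account hierarchy by walking
# each record's parent chain. Hypothetical data, illustration only.
def find_loops(parent_of):
    looped = set()
    for start in parent_of:
        seen = set()
        node = start
        while node is not None:
            if node in seen:  # revisited a node: cycle detected
                looped.add(start)
                break
            seen.add(node)
            node = parent_of.get(node)
    return looped

# "A" and "B" point at each other; "C" has a clean chain to a root.
hierarchy = {"A": "B", "B": "A", "C": "Root", "Root": None}
print(find_loops(hierarchy))  # {'A', 'B'}
```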

Data Policies

As your RevOps team works to institute data governance that maintains data quality and integrity, below are a few key policies that we recommend. 


Suggested Policy: Sources of Truth

Fullcast can provide customers with documentation of our suggested policies for sources of truth, particularly in cases where you source data from multiple providers. These policies help maintain consistency and reliability in data used for territory carving and other critical operations.


Fullcast Data Policies Overview

  • Account Families: Policies to set rules for accurate matching of parent and child accounts. 
  • Account Dedupe: Policies to deal with duplicate accounts.
  • Industry Taxonomy: Preferences on industry values for territory carving, customizable based on business needs.

Benefits of Fullcast's Data Quality Functionality

  1. Enhanced Decision-Making: By ensuring high data quality, businesses can make more informed decisions, leading to better outcomes.
  2. Improved Operational Efficiency: Automating data quality checks and thresholds reduces manual data cleaning efforts, saving time and resources. Practically speaking, we have heard of teams feeling like they need a master data project to improve their RevOps and GTM strategy. That is rarely the case. Focusing on the fields that matter, and investing in assessing, improving, and maintaining data quality on those, is not only a better approach but also far less costly than a huge undertaking to clean up all your data. 
  3. Increased Trust in Data: With reliable and accurate data, stakeholders can have greater confidence in the information they use for strategic planning and daily operations.

Fullcast's Data Quality functionality is an essential tool for maintaining high standards of data integrity. By setting and monitoring data quality thresholds, businesses can ensure their data is complete, unique, and accurate, leading to better decision-making and operational efficiency. Implementing these data quality measures helps overcome common data challenges and establishes a robust foundation for successful revenue operations.
