The DCM Blog

Build a Business Case for Data Quality Improvement

By Pekka Korpi

April 17, 2023

Accurate and reliable data is the backbone of any successful organization. Countless data quality improvement projects are started across organizations since poor data quality can lead to misinformed decisions, wasted resources, and lost opportunities. On the other hand, good data quality can enable efficient workflows, automation, and even AI.

We are often asked to help create business cases for data quality improvement initiatives and to justify the cost of a tool like Data Content Manager. In this blog post, I will explore what I see as the key elements of such a business case and share several resources you should find helpful.

Here is what I will talk about:

 

  • The Cost of Bad Data
  • Understanding Data Quality
  • Identifying Stakeholders
  • Assessing the Current Cost of Poor Data Quality
  • Outlining a Solution
  • Starting Small, Showing Results
  • Comparing Approaches
  • Calculating ROI
  • Addressing Risks & Challenges

The Cost of Bad Data

So why should you care if your data is good or bad? Well, bad data is bad for business and costs enormous amounts of money. According to a report by IBM, bad data can cost companies as much as 3.1 trillion dollars per year in the United States alone. The losses can be attributed to lost revenue, decreased productivity, and increased costs resulting from poor decision-making.

Bad Data Costs the U.S. $3 Trillion Per Year (hbr.org)

Gartner, for its part, estimates that poor data quality costs organizations an average of $12.9 million per year.

 12 Actions to Improve Your Data Quality (gartner.com)

In the context of ServiceNow, poor data quality can lead to inefficiencies in workflows and decision-making, causing delays, errors, and increased costs. Ultimately, if the data is not good enough, ServiceNow as a platform cannot deliver on all its promises. Want to utilize automation and even AI capabilities? It’s not going to happen if your data doesn’t support it.

Understanding Data Quality

Good quality data is accurate, complete, consistent, timely, and relevant to its intended use. In other words, good quality data is fit for purpose and can be relied upon for making informed decisions or supporting business processes.

Let’s break this down: 

  • Accuracy means that the data is free from errors and reflects the reality it represents.
  • Complete data contains all the necessary information and is not missing any critical elements.
  • Consistency means the data is formatted and structured consistently, making it easy to compare and analyze.
  • Timely data is up-to-date and available when needed.
  • Relevant data is directly related to the problem or question at hand.

Data is not automatically useless if it doesn’t meet all these requirements. Nevertheless, the further away you are, the more problems you will likely have. More problems mean increased costs.
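The dimensions above can be turned into concrete checks. Here is a minimal sketch in Python that flags completeness, consistency, and timeliness issues in a single record; the field names, allowed values, and thresholds are illustrative assumptions, not DCM functionality:

```python
from datetime import date

# Hypothetical CMDB-style record check; field names and rules are
# illustrative assumptions only.
def quality_issues(record, required=("name", "owner", "environment"),
                   max_age_days=90, today=date(2023, 4, 17)):
    """Return a list of data-quality issues found in one record."""
    issues = []
    # Completeness: every required field must be present and non-empty.
    for field in required:
        if not record.get(field):
            issues.append(f"incomplete: missing {field}")
    # Consistency: environment values must come from a known set.
    if record.get("environment") not in (None, "", "production", "test", "development"):
        issues.append("inconsistent: unexpected environment value")
    # Timeliness: the record must have been verified recently.
    verified = record.get("last_verified")
    if verified is None or (today - verified).days > max_age_days:
        issues.append("stale: not verified within the allowed window")
    return issues

server = {"name": "srv-001", "owner": "", "environment": "prod",
          "last_verified": date(2023, 1, 2)}
print(quality_issues(server))
```

Even this toy example shows why consistency matters: “prod” and “production” may mean the same thing to a human, but to any automated process they are different values.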

When we look at what this means in the context of the CMDB, examples are easy to find: a server record with the wrong owner, a missing relationship that hides a dependency, or an outdated CI that no longer reflects production.

It is not an academic discussion either. The real-life consequences of an inaccurate or incomplete CMDB can be harsh: How to Avoid a 2 Million Euro Incident.

Data Content Manager has been built to help with data quality improvement. It addresses the issues mentioned above and more.


Identifying Stakeholders

Understanding who will be affected by the data quality improvement initiative is crucial in building a solid business case. First, identify key stakeholders, such as decision-makers, data users, and IT personnel impacted by the initiative. Then, communicate the benefits of improving data quality to each stakeholder group, addressing their unique concerns and priorities.

We always talk about establishing ownership. It is essentially about identifying the relevant stakeholders for your data and then making sure those people understand their roles and what is expected of them.

A very effective way to motivate people to take care of their part is personalizing KPIs so that individuals only get measured for things they are responsible for and can affect themselves. Unfortunately, generic KPIs such as “Data Quality of the entire CMDB” are often too far from an individual’s work to have a meaningful, motivational effect.

Data Content Manager is very effective in turning these ambitions into reality. Here’s a short list of resources that can help you understand how DCM can be used to enable all of the above and more:

 

Assess the Current Cost of Poor Data Quality

Once you know your stakeholders and understand their requirements and contributions, you should also have better visibility into your current state of data quality. Next, quantify the negative impact of poor data quality on your organization. You could include the costs associated with the following:

  • Time spent correcting errors or validating data
  • Lost revenue due to poor decision-making based on inaccurate data
  • Customer dissatisfaction and potential loss of business
  • Regulatory fines or penalties due to non-compliance

By calculating the current cost of poor data quality, you can highlight the potential savings achievable through data quality improvement initiatives.
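A simple spreadsheet-style estimate along those cost categories is often enough for a first version of the business case. The sketch below uses placeholder figures; every number is an assumption to replace with data from your own organization:

```python
# Illustrative annual-cost estimate for poor data quality.
# All figures below are assumptions; substitute your own numbers.
hours_correcting_per_week = 20   # time spent correcting or validating data
hourly_rate = 75                 # fully loaded cost per work hour
lost_revenue = 150_000           # revenue lost to decisions based on bad data
churn_cost = 50_000              # business lost to dissatisfied customers
compliance_fines = 25_000        # regulatory fines or penalties

correction_cost = hours_correcting_per_week * hourly_rate * 52
annual_cost_of_poor_data = (correction_cost + lost_revenue
                            + churn_cost + compliance_fines)
print(f"Estimated annual cost of poor data quality: ${annual_cost_of_poor_data:,}")
```

Even rough figures like these give stakeholders a concrete baseline to argue about, which is far more productive than debating whether data quality matters in the abstract.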


Outline the Proposed Solution

Your plan for improving data quality should include specific steps, tools, and technologies. Consider, for example, the following components:

  • Data governance policies and procedures
  • Data cleansing and validation processes
  • Data integration and consolidation efforts
  • Training programs for employees to improve data-handling practices

I want to stress that you will want to include tools in your solution. We believe Data Content Manager is the most comprehensive tool available for ServiceNow that addresses all of the parts in your plan. The most likely alternative is to resort to custom development around out-of-the-box capabilities.

Read more here: Data Content Manager & Out-of-the-Box Tools.

Finally, ensure your proposed solution addresses the identified data quality issues.

I suggest looking at our 5 Steps model for ideas on how to plan the beginning of the project. Our 5 Steps model is very much applicable to initiatives around the CMDB. Look here for more general ideas: Why data quality is (still) so important? | Data & Analytics Insight (positivethinking.tech)

Start Small, Show Results

You want to develop a realistic plan and timeline. Break your project into smaller, measurable parts, and start small so you have results to show. Expanding once you have something concrete makes it easier to communicate your successes, and others are easier to convince once they see real results. This proof-of-concept approach helps stakeholders understand what you are striving for and encourages their buy-in.

Data Content Manager, as well as its licensing model, fully supports the start-small approach. For example, we offer a Free Guided Trial to help you verify that the tool works as expected. Then, you can begin in a production instance with a Starter License with a low commitment. Many companies use the Starter License for a Proof of Concept.

Compare Approaches

I think it is a good idea to compare approaches. We believe that Data Content Manager is the easiest and most comprehensive tool available to get data quality management under control. However, many organizations approach the matter by using OOB capabilities and developing them further with customizations & scripting.

Extend Capabilities by Customization

While custom development can achieve many things, the challenge is that it is… well, development. It takes significant time and effort. The key steps look something like this:

  • Define requirements: You need to gather and document the business requirements and objectives for the customization. That includes understanding the problem or need the customization will address and any specific functionality and features desired.
  • Analyze existing functionality: You should review existing ServiceNow functionality to determine if it can meet the requirements with minor configuration changes or if customization is necessary.
  • Design the customization: A detailed design is needed, including the user interface, workflows, data model, and any necessary integrations. All while adhering to ServiceNow best practices.
  • Estimate effort and resources: Estimate the effort, time, and resources required to develop, test, and implement the customization. Will you use internal resources or external consultants?
  • Develop the customization: Developers will do their job according to the design. That may involve creating new tables, forms, fields, scripts, workflows, or integrations.
  • Test the customization: Thorough testing is needed to ensure the customization meets the requirements and performs as expected. Be prepared to fix any bugs or issues identified during testing.
  • Prepare for deployment: Document the customization, including any necessary setup or configuration steps, and create a deployment plan. Ensure you have a rollback plan if the customization fails for some reason.
  • Deploy the customization: Follow the deployment plan to deploy the customization to your production instance. Monitor the system for any issues that might arise.
  • Maintain and support the customization: Often overlooked, you need to plan for providing ongoing maintenance and support for the customization, including fixing any issues, addressing user feedback, and adapting the customization as required. Also, you may need to update the customization when ServiceNow is upgraded, as platform changes can impact customizations.

You may need several customizations to achieve everything you require. When things change – and they always do – there’s a good chance you’ll return to the drawing board. And if you neglect maintenance, you will accumulate technical debt.

Get Data Content Manager

DCM is a ServiceNow-certified app available in the ServiceNow Store. It has already gone through all the steps above. Furthermore, it is maintained and further developed for future compatibility and feature enhancements.

And maybe most importantly: when things change, you only need to adjust your DCM Blueprints to match your changed requirements or create new ones to address new needs. There is no development involved.

While there are undoubtedly some overlapping steps, especially in defining requirements, getting DCM into use is very straightforward: 

  • Acquire a license
  • Install
  • Create your first Blueprint, run an audit, and you will get results

One of our customers, an Enterprise Architect in a global Healthcare company, summarized the difference:

“My complex Blueprint was up and running in 10 minutes, and I got audit results immediately. It would have taken months to complete without DCM.”


Calculate the Return on Investment (ROI)

Calculating the return on investment (ROI) for improved data quality can be challenging, as the benefits often include both tangible and intangible factors. However, you can estimate the ROI by quantifying the costs and benefits associated with better data quality.
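The ROI formula itself is simple; the work is in estimating the inputs. Here is a minimal sketch with hypothetical figures (the benefit and cost values are assumptions for illustration):

```python
# Basic ROI calculation: (benefit - cost) / cost, expressed as a percentage.
def roi(annual_benefit, annual_cost):
    """Return ROI as a percentage of the investment."""
    return (annual_benefit - annual_cost) / annual_cost * 100

# Hypothetical figures: savings from fewer errors and faster workflows
# versus tooling, licensing, and project effort.
benefit = 300_000
cost = 100_000
print(f"ROI: {roi(benefit, cost):.0f}%")  # 200%
```

In practice, you would derive the benefit figure from the cost-of-poor-data estimate above and the cost figure from your proposed solution, including licenses, implementation, and ongoing effort.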

We already talked about the cost of bad data. In addition, you should develop a realistic estimate of the benefits associated with improved data quality, including both tangible and intangible benefits.

The 1-10-100 Rule

The exact numbers will be specific to your organization. However, there are some general guidelines and concepts. One of them is the “1-10-100 Rule” of data quality, which states that it costs:

  • $1 to verify a record as it is entered,
  • $10 to cleanse and correct the record later,
  • $100 (or more) if nothing is done, as the ramifications of the bad data continue to spread.
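Applied to a whole dataset, the rule makes the case almost by itself. The sketch below multiplies the rule’s per-record costs across a hypothetical batch of records (the record count is an assumption for illustration):

```python
# The 1-10-100 Rule applied to a batch of records.
# Per-record costs come from the rule; the record count is an assumption.
records = 10_000

verify_at_entry = records * 1    # $1 per record to verify on entry
cleanse_later = records * 10     # $10 per record to correct later
do_nothing = records * 100       # $100+ per record as bad data spreads

for label, cost in [("Verify at entry", verify_at_entry),
                    ("Cleanse later", cleanse_later),
                    ("Do nothing", do_nothing)]:
    print(f"{label:>16}: ${cost:,}")
```

The spread between the three scenarios – two orders of magnitude – is the core argument for proactive data quality management rather than reactive cleanup.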

Garbage In, Garbage Out

Then there is the age-old Garbage In, Garbage Out concept: if the input data is of poor quality (garbage), the output or results generated by a system or process will also be of poor quality (garbage).

If CMDB is the data source for your business processes, you can quickly see how these two concepts work together:

 

  • Garbage in the CMDB = Garbage in your processes.
  • Proactively maintaining and improving your data quality = much less money lost than ignoring or not being aware of the problems in data quality.

Address Risks and Challenges

No initiative is without risks and challenges, so addressing them upfront is wise.

Almost always, when our customers use Data Content Manager as their tool for data quality improvement, the risks relate to the organization itself and to ambitions that exceed what the organization can deliver.

Data quality improvement is not a one-person show. Consider the following:

  • Stakeholders have different and sometimes conflicting interests.
  • Often, data needs to be maintained by people who may not even know they are expected to do so. How do you motivate them? What’s in it for them?
  • The entire concept of data quality is abstract:
    • How do you assess the current state?
    • How do you assign tangible targets and KPIs?
    • How do you make progress visible?

Data Content Manager helps address many of these issues. It helps focus efforts, set targets, and communicate progress. Furthermore, it helps make the entire thing personal so that people can see the results of their contributions. Management stakeholders can view progress as it happens.

Conclusion

Developing a solid business case for data quality improvement is vital for obtaining stakeholder support. By comprehending the detrimental effects of poor data quality and presenting a comprehensive solution, you can emphasize the importance of investing in data quality initiatives.

Identifying key stakeholders, quantifying the current costs of poor data quality, and outlining a robust solution will enable a strong foundation for your business case. Furthermore, breaking the project into manageable pieces, calculating the ROI, and addressing risks and challenges will facilitate stakeholder buy-in and foster a smoother implementation process.

Data Content Manager can be a centerpiece of your efforts for any data quality initiatives on ServiceNow. As mentioned earlier, a Guided Trial is available, and we’re more than happy to give you a demo.

Featured Posts

5 Challenges to Address for Better CMDB Data Quality


Comprehending the impact of CMDB Data Quality, especially its absence, can be difficult. It is a big problem since poor data quality is often the main reason ITSM systems, like ServiceNow, don’t meet expectations. We are deeply involved with data quality daily. Our...

How LapIT Improves CMDB Data Quality with DCM


LapIT designs and implements solutions for information and communication technology environment services in Northern Finland. Their customers are mainly municipal, public administration, and healthcare organizations. We interviewed Leena Broas, Development Manager at...

Video: How to Improve Foundation Data in ServiceNow and CSDM


In this video, Pekka Korpi, CEO of Qualdatrix, and Mikko Juola, Product Owner of Data Content Manager, discuss the importance of Foundation data in ServiceNow and how it can be improved using Data Content Manager. Foundation Data in ServiceNow refers to the critical...

How Metsä Group Improves Data Quality with DCM


Metsä Group uses Data Content Manager to improve data quality in their CMDB. We had a chance to interview Mika Lindström, the ICT Configuration Development Manager at Metsä Group. Thanks, Mika, for joining us. Metsä Group are an internationally operating frontrunner...

How to use CSDM to Improve Incident Management


It’s been a while since this article was originally published, so I thought it would be time to update it to reflect changes to the CSDM model and our latest thinking. We published this article first in 2020 when ServiceNow’s Knowledge event included the first...

How CMDB Supports Regulatory Compliance at Danske Bank


Data Content Manager improves data quality within an organization’s CMDB, reducing manual and disparate work for data quality maintenance. This increase in data transparency and ease of data management helps companies to achieve and maintain regulatory compliance,...

Get Started

Book a Call with us Now.

 

Explore how Data Content Manager can enhance the quality of your data in ServiceNow. See how you can accelerate your CSDM journey and improve your CMDB or any data in your platform. All without the need for scripting, additional reports, or customizations.

 

Related Content You Might Like:

CSDM

The Recipe For Success eBook

Contents:
  • Establish Ownership & Roles
  • Manage Your Scope
  • Choose the Right Tools

We talk to people about CSDM alignment every day and constantly see organizations struggle with the same things over and over again. We wrote CSDM - The Recipe for Success to share our experiences.

It gives you hands-on guidance on some of the most important things you must address on your CSDM Journey, regardless of your maturity level.

CSDM The Recipe for Success eBook

Get Your Free eBook