Avoiding Data Pitfalls

In our last post, we discussed some of the risks of having too much data, including data security, data standardization and validation, data shoveling, and API abuse.

Much like the people of Amsterdam, who have figured out how to live and thrive below sea level (i.e., with too much water), you and your company can also learn to navigate and benefit from “too much data”.

The following are a few best practices adopted by organizations that successfully maneuver through the flood of data and its associated risks.

Data Discovery

Discover what data you already have! Nine out of ten companies we visit have neither up-to-date schematics of all their systems nor mapped-out data flows. Bringing your systems and data catalog up to date is a great place to start. Remember that while automated data discovery tools and AI-enabled data classification applications can be helpful, they are not a cure-all; you will still need human review to accurately map and validate your data.
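
As a concrete (and deliberately simplified) starting point, a catalog can begin as a script that introspects each system's schema. This sketch uses an in-memory SQLite database standing in for a real system; the table and column names are hypothetical:

```python
import sqlite3

def catalog_sqlite(conn):
    """Build a simple table -> column catalog for one SQLite database."""
    catalog = {}
    tables = conn.execute(
        "SELECT name FROM sqlite_master WHERE type = 'table'"
    ).fetchall()
    for (table,) in tables:
        # PRAGMA table_info rows are (cid, name, type, notnull, dflt_value, pk)
        cols = conn.execute(f"PRAGMA table_info({table})").fetchall()
        catalog[table] = [(col[1], col[2]) for col in cols]
    return catalog

# Demo with an in-memory database standing in for one real system.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customers (id INTEGER PRIMARY KEY, email TEXT)")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER)")
print(catalog_sqlite(conn))
```

A real catalog would span many systems and formats, which is exactly where automated tools earn their keep; the human step is confirming that what the scan finds matches how the data is actually used.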

Define Data Needs

Rather than simply proliferating data for the sake of potential future needs, assess your true data needs and the associated cost of wrestling with and fixing your data every day (i.e., the cost of dirty data). Look around (and also look in the mirror) at folks using Excel for time-consuming manual processing just to get their data into a workable structure. Sure, expensive integration platforms (MuleSoft, Boomi, etc.) can help, but first evaluate the true end goal of your analysis and reporting. Many of us get overwhelmed by “too much data” by trying to utilize every last byte.

Trying to convey or utilize too much data can also become confusing for everyone involved, as in the street sign conundrum referenced earlier.
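
Much of the manual Excel work mentioned above is repetitive standardization that a small script can do consistently. A minimal sketch, assuming hypothetical customer fields:

```python
def clean_record(raw):
    """Standardize one customer record instead of fixing it by hand in Excel."""
    return {
        "name": raw.get("name", "").strip().title(),
        "email": raw.get("email", "").strip().lower(),
        "phone": "".join(ch for ch in raw.get("phone", "") if ch.isdigit()),
    }

rows = [
    {"name": "  jane DOE ", "email": "Jane.Doe@EXAMPLE.com", "phone": "(555) 123-4567"},
    {"name": "BOB smith", "email": " bob@example.com ", "phone": "555.987.6543"},
]
cleaned = [clean_record(r) for r in rows]
print(cleaned)
```

Once the rules live in code instead of in someone's muscle memory, the daily cost of dirty data becomes visible — and fixable at the source.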

Focus on Data During Digital Transformation

If you are implementing a new enterprise technology, be it an ERP, EPM, or HCM solution, remember that the ultimate purpose of your initiative centers on improving business processes, gaining efficiencies, and getting reliable data and insight out of your new system. Even if you are simply replacing a sunsetting solution, don’t squander the opportunity to get your data under control as you migrate. Failing to improve your data’s health (i.e., garbage in, garbage out) or losing control of your data during migration by resorting to data shoveling is a waste of both opportunity and money.
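
One way to avoid shoveling garbage into the new system is to validate legacy records before they load, and report the rejects rather than migrating them blindly. A minimal sketch, with hypothetical field names and rules:

```python
def validate_for_migration(record):
    """Return a list of problems; an empty list means the record can load."""
    problems = []
    if not record.get("customer_id"):
        problems.append("missing customer_id")
    if "@" not in record.get("email", ""):
        problems.append("invalid email: %r" % record.get("email", ""))
    return problems

legacy_rows = [
    {"customer_id": "C001", "email": "jane@example.com"},
    {"customer_id": "", "email": "not-an-email"},
]
clean = [r for r in legacy_rows if not validate_for_migration(r)]
rejects = [(r, validate_for_migration(r)) for r in legacy_rows
           if validate_for_migration(r)]
print(f"{len(clean)} ready to load, {len(rejects)} need repair")
```

The reject report is the valuable part: it turns a vague sense of "dirty data" into a concrete remediation list before go-live.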

Consider ESB vs. API

As referenced in the spaghetti example regarding API abuse, APIs are not the be-all and end-all. They do offer certain advantages, but ESB (Enterprise Service Bus) integration has come a long way and can help with both data standardization and data protection. With advancements such as canonical data models, companies with a large number of applications, constantly changing technical environments, or multiple locations or entities have some worthy options to help govern and manage their data.
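
The idea behind a canonical model is that every integrated system maps into one shared record shape, rather than each pair of systems inventing its own translation. A minimal sketch with made-up CRM and ERP field names:

```python
# Hypothetical canonical customer shape: {"id", "name", "email"}.
def from_crm(row):
    """Map a (made-up) CRM export row into the canonical model."""
    return {"id": row["CustomerID"], "name": row["FullName"], "email": row["EmailAddr"]}

def from_erp(row):
    """Map a (made-up) ERP export row into the canonical model."""
    return {"id": row["cust_no"], "name": row["cust_name"], "email": row["email"]}

crm_row = {"CustomerID": "C001", "FullName": "Jane Doe", "EmailAddr": "jane@example.com"}
erp_row = {"cust_no": "C001", "cust_name": "Jane Doe", "email": "jane@example.com"}

# Both sources resolve to the same canonical record.
assert from_crm(crm_row) == from_erp(erp_row)
print(from_crm(crm_row))
```

With N systems, a canonical model means N adapters instead of up to N×(N−1) point-to-point translations — much of why a bus-style integration avoids the spaghetti problem.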

At ValidDatum we LOVE data and solving data issues. If you have questions on any of this content or would like to discuss your own situation, please reach out! We also welcome you to subscribe to our content to receive our latest publications via email. https://validdatum.com/contact/


#DataSecurity #API #DataManagement