2020 and 2021 have been game-changing years in nearly every aspect of life. Beyond global shutdowns, disrupted supply chains, inflation, and other financial strains, the creation and use of data has been staggering. While tough to calculate, estimates for the amount of data created in 2020 alone range from 40 to 64 zettabytes (that's a substantial number, look it up), and it keeps growing every year. So, what does this mean for businesses?
In this post, we will summarize some of the data trends and issues we are seeing, as well as provide an overview of how organizations can navigate them.
Consider the following as you head into the new year:
The Data Exists
In most cases, the issue is not a lack of data; it is making appropriate use of what we already have. Companies are creating and storing data at unprecedented (i.e., zettabyte) rates, yet few are accurately using the data they create or the data available globally.
What We Are Seeing:
- Data Standardization: Companies are wrestling with how to standardize and combine their data and are struggling to get meaningful, trustworthy reports and accurate intelligence.
- Data Shoveling: Data shoveling is when a company shovels massive quantities of data into a new system without first cleaning, standardizing, and testing it (see the sketch after this list for the kind of pre-migration checks we mean).
- API Abuse: Let's face it, APIs are the easy way to go, but an API is like a dumbwaiter: it moves data in a predetermined way from one place to another. APIs typically do nothing to continually monitor and test the data moving through them, and in a typical company the high-level view of API connections looks like my 2-year-old grandson just dumped a couple of boxes of uncooked spaghetti onto the floor!
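To make the point about shoveling concrete, here is a minimal sketch (in Python) of the validate-before-you-migrate step we are describing. The field names, formats, and rules are hypothetical placeholders, not a real schema:

```python
import re

# Hypothetical validation rules for a customer master extract.
# The fields and formats below are illustrative, not a real schema.
RULES = {
    "customer_id": lambda v: bool(re.fullmatch(r"C\d{6}", v or "")),
    "email":       lambda v: bool(re.fullmatch(r"[^@\s]+@[^@\s]+\.[^@\s]+", v or "")),
    "country":     lambda v: v in {"US", "CA", "GB", "DE"},  # standardized ISO codes
}

def validate_records(records):
    """Return (clean, rejected) so bad rows get fixed instead of shoveled in as-is."""
    clean, rejected = [], []
    for row in records:
        failures = [f for f, rule in RULES.items() if not rule(row.get(f))]
        if failures:
            rejected.append((row, failures))
        else:
            clean.append(row)
    return clean, rejected

# One good row, and one row that would otherwise be shoveled in as-is.
rows = [
    {"customer_id": "C123456", "email": "a@b.com", "country": "US"},
    {"customer_id": "123", "email": "not-an-email", "country": "USA"},
]
clean, rejected = validate_records(rows)
print(f"{len(clean)} clean, {len(rejected)} need cleanup before migration")
```

The point is not this particular code; it is that every row either passes your standards or lands in a cleanup queue before it ever reaches the new system.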
What Can Be Done:
- Data Discovery: Discover what data you already have! Nine out of ten companies we visit do not have up-to-date schemas showing all their systems, never mind having their data flows mapped out.
- Define Data Needs: Rather than simply proliferating data for the sake of potential future needs, assess your true data needs and the associated cost of wrestling with and fixing your data every day (i.e., the cost of dirty data).
- Focus on Data During Digital Transformation: If you are implementing a new ERP, CRM, or HCM solution, don’t lose sight of the ultimate purpose of your initiative. Ideally, it is to improve business process efficiencies AND to get good data out of your new system. Don’t squander the opportunity to get your data under control as you are migrating to your new system.
Security is a Growing Concern
This has been a constant message for years and is an ever-increasing concern. Security measures and data protection are constantly evolving, but so is the competition (e.g., hackers, bugs, malware, hijackers; the list goes on).
What We Are Seeing:
- Remote Work: Global workforces are permanently adopting remote and hybrid work, giving hackers a distributed network with more access points to exploit.
- Increasing Data: The more data you have, the more opportunity, but also the more risk. Remember: data is your greatest asset and one of your biggest liabilities.
- Quantum Hacking: You've heard about quantum computing, but what about quantum hacking? Researchers in China have reported quantum computing advances that they claim threaten widely used encryption, including standards like AES-256 that much of the Western world has relied on to protect data and systems for the last decade.
- Reliance on Cyber Insurance: To cover the sins of poor data security hygiene, companies are leaning on insurance as protection. Keep in mind that insurance companies adjust rates based on risk, so I think we can all see where this is going.
What Can Be Done:
- Tokenization: Tokenization is the process of replacing sensitive data with "tokens" that carry no referenceable meaning and cannot be reversed by a mathematical formula. Tokenized information can then be shared across the internet without revealing its true form. The technology is best known from credit card processing, but it can be adapted to make almost any kind of data quantum resilient (see the sketch after this list).
- Compliance and Audits: These sound like painful, scary things, but their objective is to expose your company's weaknesses before the bad guys do. Your company has to keep its systems and data safe all the time; the criminals only have to be right once to bring a company to its knees and ruin the careers of the executives in charge at the time of the breach.
- Update Security Protocols: Review and update your security protocols every quarter, and switch up your MSSP (Managed Security Service Provider) team. Consider an independent audit to help your team re-assess your weaknesses and correct them before the bad guys do.
- Hire a CISO: If you don’t have a CISO or VCISO (virtual or fractional Chief Information Security Officer), then get one!
- Take Cyber Insurance Renewal Self-Assessments Seriously: If you overstate or misstate what you are doing in terms of processes, protocols, and technology, you might find your claim delayed or denied.
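To make the tokenization idea above concrete, here is a minimal sketch. A token is a random value with no mathematical relationship to the data it replaces, and the mapping exists only inside a secured vault. Real implementations use hardened vault services and HSMs; the code below is illustrative only:

```python
import secrets

# Minimal tokenization sketch. The "vault" here is an in-memory dict for
# illustration only; a real vault is a hardened, access-controlled service.
_vault = {}

def tokenize(sensitive_value: str) -> str:
    """Replace a sensitive value with a random token that has no
    mathematical relationship to the original (unlike encryption)."""
    token = "tok_" + secrets.token_urlsafe(16)
    _vault[token] = sensitive_value  # mapping lives only inside the vault
    return token

def detokenize(token: str) -> str:
    """Only systems with vault access can recover the original value."""
    return _vault[token]

card = "4111 1111 1111 1111"
token = tokenize(card)
print(token)               # safe to pass between systems or store downstream
print(detokenize(token))   # recoverable only via the vault
```

Because the token is random rather than derived from the original value, there is no formula for an attacker (quantum or otherwise) to reverse; the vault becomes the single point to protect.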
Governance is Key
In addition to properly using and protecting data, we need to stay in charge. Watch movies like Her (2013), Ex Machina (2014), or even Iron Man (2008); while a bit futuristic, the idea of data becoming overwhelming is not a fantasy. The key is to make sure we humans actively manage technology and the data associated with it.
What We Are Seeing:
- Integration Software is Lacking: Most integration software touts ease of connection and orchestration, but it lacks easy-to-use, granular controls over data access. What buyers need to know is that the data moving through those connections should be programmatically monitored for changes in veracity, volume, and variety (see the monitoring sketch after this list).
- Infrastructure is Strained: Over-stressed IT departments and outgrown systems are struggling to keep up with companies’ data needs. The result is highly customized systems and workarounds that are cumbersome, uncontrolled, and unreliable.
- Field Hijacking: Field hijacking is when a data field intended for one purpose is used for some other purpose. We see a lot of it in customer master data and sales transactions, where, over time, elaborate manual codes get tucked away in strange places like squirrels hiding nuts in the fall.
- AI and Machine Learning Limitations: These systems consume data and claim to "clean" it, but they are NOT an auto-clean magic fairy godmother housecleaner, despite what the sales rep tells you!
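As a rough illustration of the programmatic monitoring and field-hijacking detection described above, here is a minimal sketch. The thresholds and expected field patterns are hypothetical:

```python
import re

# Hypothetical expected patterns per field. Values drifting away from these
# patterns are a classic sign of field hijacking (e.g., manual codes stuffed
# into a postal-code field).
EXPECTED = {
    "postal_code": re.compile(r"\d{5}(-\d{4})?"),
    "phone":       re.compile(r"\+?[\d\s().-]{7,15}"),
}

def monitor_batch(records, baseline_count, volume_tolerance=0.5):
    """Flag volume anomalies and fields whose values no longer match
    their expected pattern."""
    alerts = []
    # Volume check: did this feed suddenly shrink or explode?
    if abs(len(records) - baseline_count) > volume_tolerance * baseline_count:
        alerts.append(f"volume anomaly: got {len(records)}, expected ~{baseline_count}")
    # Veracity check: how many values fail their expected pattern?
    for field, pattern in EXPECTED.items():
        values = [r.get(field, "") for r in records]
        bad = sum(1 for v in values if not pattern.fullmatch(v or ""))
        if values and bad / len(values) > 0.05:  # >5% failures: investigate
            alerts.append(f"possible hijacking of '{field}': {bad}/{len(values)} off-pattern")
    return alerts

batch = [
    {"postal_code": "02139", "phone": "617-555-0100"},
    {"postal_code": "RUSH-VIP", "phone": "617-555-0101"},  # code tucked into a field
]
print(monitor_batch(batch, baseline_count=2))
```

A check like this, run on every feed, turns "we found out six months later" into an alert the day the data starts drifting.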
What Can Be Done:
- Focus on What’s Important: Sexy dashboards and colorful BI reports are useless if the underlying data cannot be trusted or if you have not constructed granular data access and privacy controls and logs.
- Focus on Change Management: Many people bristle at the term "governance" because they view it as controlling and restrictive. That can create resistance to change before the project even begins.
- Standardization is the Key to Trusted Data: Defining what data will be collected and how it will be rendered for various purposes across a company's ecosystem is critical to maintaining trusted data (a small sketch follows this list).
- Consider ESB vs. API: As referenced in the spaghetti example above, APIs are not the be-all and end-all. They do offer certain advantages, but ESB (Enterprise Service Bus) integration has come a long way and can help with data standardization and protection.
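As a small illustration of the standardization point above, here is a minimal sketch of rendering data one agreed way no matter how the source system formatted it. The mappings and formats are hypothetical:

```python
from datetime import datetime

# Hypothetical company-wide standard: every system renders these fields
# the same way, no matter which source they came from.
COUNTRY_STANDARD = {"usa": "US", "united states": "US", "u.s.": "US",
                    "uk": "GB", "united kingdom": "GB"}

def standardize_country(raw: str) -> str:
    key = raw.strip().lower()
    return COUNTRY_STANDARD.get(key, raw.strip().upper())

def standardize_date(raw: str) -> str:
    """Accept the date formats we know sources use; render one standard form."""
    for fmt in ("%m/%d/%Y", "%d-%m-%Y", "%Y-%m-%d"):
        try:
            return datetime.strptime(raw, fmt).strftime("%Y-%m-%d")  # ISO 8601
        except ValueError:
            continue
    raise ValueError(f"unrecognized date format: {raw!r}")

print(standardize_country("U.S."))       # -> US
print(standardize_date("01/30/2024"))    # -> 2024-01-30
```

When the agreed formats live in one shared place and every integration point applies them, reports from different systems finally agree with each other.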
Over the following weeks we will dive deeper into each of these areas. If you have any questions in the meantime, or would like to discuss your current data environment, please reach out to us at ValidDatum:
info@validdatum.com