Analytics applications are aimed at solving specific business problems. But what if business and data change?
Companies experience "drift" when their analytics applications begin to lose accuracy and effectiveness, underperforming in the business use cases they were originally designed for. There are many reasons analytics drift away from their original purposes; most are linked to changes in data, algorithms or business use cases.
SEE: Electronic Data Disposal Policy (TechRepublic Premium)
When analytics drift occurs, it is damaging to proponents of analytics in organizations. Ineffective analytics make CEOs and other top-line leaders less trustful of analytics—and less likely to rely on or endorse them.
IT and analytics proponents can prevent these situations by proactively looking for instances where analytics begin to underperform and then taking corrective action. Early symptoms might be analytics reports that aren't used as frequently as they once were, or analytics outcomes that are frequently questioned. Once IT locates an underperforming analytics application, it can examine the application more closely.
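One way to spot the "reports aren't being used as frequently" symptom before it festers is to compare recent report usage against its historical baseline. The sketch below is a minimal illustration, assuming access logs are available as timestamps per report; the function name, window sizes and 50% drop threshold are all illustrative choices, not prescriptions from the article:

```python
from datetime import datetime, timedelta

def flag_underused_reports(access_log, now, recent_days=30, baseline_days=180,
                           drop_threshold=0.5):
    """Flag reports whose recent usage has fallen well below their historical rate.

    access_log: dict mapping report name -> list of access datetimes.
    Returns names of reports whose daily access rate over the last `recent_days`
    is below `drop_threshold` times their baseline daily access rate.
    """
    recent_cutoff = now - timedelta(days=recent_days)
    baseline_cutoff = now - timedelta(days=baseline_days)
    flagged = []
    for report, accesses in access_log.items():
        recent = sum(1 for t in accesses if t >= recent_cutoff)
        baseline = sum(1 for t in accesses if baseline_cutoff <= t < recent_cutoff)
        recent_rate = recent / recent_days
        baseline_rate = baseline / (baseline_days - recent_days)
        if baseline_rate > 0 and recent_rate < drop_threshold * baseline_rate:
            flagged.append(report)
    return flagged
```

A report flagged this way isn't proof of drift, only a prompt for the closer look the article recommends.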
Here are the most logical places for IT to look when an analytics application begins to underperform:
Have new data sources become available that would improve the quality and thoroughness of the data that the analytics queries?
New data sources continue to come online that can improve the outcomes of analytics queries because their data is more comprehensive than what was previously available. The key to improving analytics is ensuring that the most current data sources are integrated into the data repository your company uses for queries.
Is the data corrupt?
How often are you refreshing the data in your analytics data repository? Is data being adequately cleaned and prepared before it is admitted into the master repository, or have users (or IT) been altering data in ways that make it less reliable?
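The cleaning-and-preparation gate described above can be made concrete as a validation pass that quarantines suspect rows before they reach the master repository. This is a minimal sketch, assuming incoming batches arrive as dicts; the field names and plausibility ranges are illustrative assumptions:

```python
def validate_batch(rows, required_fields, value_ranges):
    """Split an incoming batch into clean rows and rejects before loading.

    rows: list of dicts. required_fields: fields that must be present and non-empty.
    value_ranges: dict mapping field -> (min, max) plausible values.
    Returns (clean, rejects), where each reject is a (row, reason) pair.
    """
    clean, rejects = [], []
    seen = set()
    for row in rows:
        key = tuple(sorted(row.items()))
        if key in seen:                       # exact duplicate of an earlier row
            rejects.append((row, "duplicate"))
            continue
        seen.add(key)
        missing = [f for f in required_fields if row.get(f) in (None, "")]
        if missing:
            rejects.append((row, f"missing: {missing}"))
            continue
        out_of_range = [f for f, (lo, hi) in value_ranges.items()
                        if f in row and not (lo <= row[f] <= hi)]
        if out_of_range:
            rejects.append((row, f"out of range: {out_of_range}"))
            continue
        clean.append(row)
    return clean, rejects
```

Keeping the rejects (rather than silently dropping them) gives IT an audit trail for the "who has been altering data" question.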
Is there data lag?
If your industry is transportation, do you know with confidence the latest highway repairs and closures in different areas of the country that your truck fleet travels? And do you communicate with your data providers regularly to see how frequently the data they provide you is refreshed?
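Those refresh conversations can be backed by an automated freshness check that compares each feed's last successful refresh against the maximum lag the use case can tolerate. The sketch below is an assumption-laden illustration: the feed names and per-feed SLA hours are hypothetical, and real pipelines would pull the timestamps from feed metadata:

```python
from datetime import datetime, timedelta

def stale_feeds(feed_status, now, sla_hours):
    """Report feeds whose last refresh exceeds the lag the business can tolerate.

    feed_status: dict mapping feed name -> datetime of last successful refresh.
    sla_hours: dict mapping feed name -> maximum acceptable age in hours.
    Returns a dict of feed name -> hours overdue (beyond the SLA).
    """
    overdue = {}
    for feed, last_refresh in feed_status.items():
        max_age = timedelta(hours=sla_hours.get(feed, 24))  # default 24h SLA
        age = now - last_refresh
        if age > max_age:
            overdue[feed] = round((age - max_age).total_seconds() / 3600, 1)
    return overdue
```

For the trucking example, a road-closure feed might carry a much tighter SLA than, say, quarterly fuel-price data.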
SEE: How to make data analytics work for you (TechRepublic)
Has the business use case changed?
Yesterday's analytics might have been based on lost and unclaimed shipments, but today's focus might be on inventory miscounts. If a business use case has migrated significantly from the original intent of the analytics, it might be time to rewrite the analytics or discontinue them.
Algorithms and queries
Are the algorithms and queries that users pose producing the desired results?
It might be time to tune up algorithms so they can more accurately mine data for the information that users are looking for. This can be done by iteratively testing different variations of algorithms and queries and then checking results.
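The iterate-and-check loop described above can be framed as a small harness that scores each candidate variant against cases with known-good answers. This is a hedged sketch, not a tuning framework: the variant names, threshold logic and test cases are invented for illustration:

```python
def best_variant(variants, test_cases):
    """Score candidate query/algorithm variants against known-good answers.

    variants: dict mapping variant name -> function(case_input) -> result.
    test_cases: list of (case_input, expected_result) pairs.
    Returns (best_name, accuracy) for the variant matching the most expectations.
    """
    scores = {}
    for name, fn in variants.items():
        hits = sum(1 for inp, expected in test_cases if fn(inp) == expected)
        scores[name] = hits / len(test_cases)
    best = max(scores, key=scores.get)
    return best, scores[best]
```

The value of the harness is less the winner it picks than the repeatable comparison it gives each tuning iteration.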
Has the business use case changed?
A significant change in a business use case can render most algorithms and queries useless overnight. If this occurs, it's time to redesign queries and algorithms to meet the objectives of the new business case.
SEE: Gartner: Top 10 data and analytics technology trends for 2021 (TechRepublic)
Other areas of analytics mitigation
There are many reasons analytics begin to lose their effectiveness. When this occurs, companies begin to distrust their analytics, which leads to reduced use. It also places IT in a spot where it doesn't want to be: trying to promote analytics when key individuals in the organization have begun to distrust them.
In addition to the data and algorithm practices IT can adopt to maintain analytics relevance, IT can also take these steps:
- Regularly monitor for new sources of data that could contribute more meaning to existing analytics;
- Exercise strong data cleaning and preparation on data before it is admitted to analytics data repositories; and
- Implement machine learning, which can detect repetitive patterns in data and deduce meaning that can be fed back into the analytics, making them "smarter" and more responsive to changing business conditions.
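A lightweight starting point for the pattern-monitoring idea in the last bullet is to watch whether the distribution of incoming data still resembles the data the analytics were built on. The sketch below uses a two-sample Kolmogorov-Smirnov statistic as a simple drift alarm; the 0.2 alert threshold is an assumption that would need tuning per feature:

```python
import bisect

def ks_statistic(baseline, live):
    """Largest gap between the empirical CDFs of baseline (build-time) and
    live feature values; near 0 means similar distributions, near 1 means drift."""
    b_sorted, l_sorted = sorted(baseline), sorted(live)
    max_gap = 0.0
    for x in set(baseline) | set(live):
        cdf_b = bisect.bisect_right(b_sorted, x) / len(b_sorted)
        cdf_l = bisect.bisect_right(l_sorted, x) / len(l_sorted)
        max_gap = max(max_gap, abs(cdf_b - cdf_l))
    return max_gap

def drifted(baseline, live, threshold=0.2):
    """True when the distribution gap exceeds the chosen alert threshold."""
    return ks_statistic(baseline, live) > threshold
```

An alarm like this doesn't say why the data changed; it tells IT which feature to investigate before users notice the analytics underperforming.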