A few months ago, I had the pleasure of speaking with Bill Hewitt, the president and CEO of Kalido, to learn more about data governance. He explained that data governance is the process of monitoring and maintaining data rules and inspecting data as it moves through systems to ensure that it complies with the established rules. This improves data quality and ensures that the time, money, and effort spent collecting data are not wasted by bad data entering the system. In addition, data management costs are lowered, and many types of business risk (particularly regulatory and legal) are reduced as well. Another benefit is that decisions can be made more quickly because the data is more reliable. Originally, data governance was tied to specific pieces of equipment to ensure that the data they provided was within the realm of “reasonable.” Now, data governance has spread to many other areas.
We also talked about Kalido’s data governance product, the Kalido Information Engine, and how it fits into this space. Mr. Hewitt says that it goes far beyond a basic data constraint to actively help businesses improve their data quality. For example, a good data governance system can provide a way of directly contacting the users when data does not appear to comply with the system rules. Traditional database constraint mechanisms start at the bottom, at a technical level, and rely upon the UI or other systems built on top of them to resolve problems. In contrast, the Kalido Information Engine starts with the end user to prevent problems from ever reaching the database.
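To make the contrast concrete, here is a minimal sketch of that "start with the end user" approach: rules carry a plain-language message that goes straight back to the submitter, and a record is rejected before it ever reaches the database, rather than failing against a low-level constraint later. The rule names, record fields, and `notify` callback are illustrative assumptions, not Kalido's actual API.

```python
from dataclasses import dataclass
from typing import Callable

# Hypothetical rule: each check is paired with a message a business user
# can understand, so the system can contact the submitter directly.
@dataclass
class Rule:
    name: str
    check: Callable[[dict], bool]
    message: str

RULES = [
    Rule("positive_amount", lambda r: r.get("amount", 0) > 0,
         "Amount must be greater than zero."),
    Rule("known_region", lambda r: r.get("region") in {"NA", "EMEA", "APAC"},
         "Region must be one of NA, EMEA, or APAC."),
]

def validate(record: dict) -> list[str]:
    """Return the messages for every rule the record violates."""
    return [rule.message for rule in RULES if not rule.check(record)]

def submit(record: dict, notify: Callable[[str, list[str]], None]) -> bool:
    """Notify the submitter and reject a bad record before it reaches the
    database, instead of relying on a constraint to fail at load time."""
    problems = validate(record)
    if problems:
        notify(record.get("submitted_by", "unknown"), problems)
        return False
    return True  # record is safe to load
```

A traditional `CHECK` constraint would only surface a cryptic error after the load was attempted; here the feedback loop closes at the point of entry.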
The Kalido product is a closed-loop process that allows users to request changes to the system as needed. It translates business requirements into technical specifications, which is always a challenge in designing large systems. In addition to monitoring incoming data, the system includes a data warehousing component that layers business rules over the live database; as the underlying database changes, so does the data warehouse. This is in stark contrast to traditional data warehouse systems, in which a change to the database requires a warehouse expert to change the associated models and how they work with the database, and possibly alter anything that depends on the data model. It can also provide “timeline” functionality and unite multiple data sources into a unified data view.
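The closed-loop idea above can be sketched as business rules stored as data rather than code: when a user's change request is approved, only the stored rule is updated, and every downstream check picks up the new value immediately, with no redeployment. The rule format and function names below are assumptions made for illustration only.

```python
# Business rules held as data, not hard-coded logic. In a real system this
# table would live in a repository; a dict stands in for it here.
rules = {
    "max_discount": {"field": "discount", "op": "<=", "value": 0.20},
}

# Map rule operators to comparisons.
OPS = {
    "<=": lambda a, b: a <= b,
    ">=": lambda a, b: a >= b,
}

def complies(record: dict) -> bool:
    """Check a record against every active rule."""
    return all(OPS[r["op"]](record[r["field"]], r["value"])
               for r in rules.values())

def request_change(rule_name: str, new_value) -> None:
    """A user-driven change request: once approved, only the stored rule
    changes, and all subsequent checks use the new value."""
    rules[rule_name]["value"] = new_value
```

For example, raising the discount ceiling via `request_change("max_discount", 0.30)` immediately changes what `complies` accepts, which is the essence of translating a business requirement into a technical specification without touching code.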
Disclosure of Justin’s industry affiliations: Justin James has a contract with Spiceworks to write product buying guides; he has a contract with OpenAmplify, which is owned by Hapax, to write a series of blogs, tutorials, and articles; and he has a contract with OutSystems to write articles, sample code, etc.