Fixing a fine data mess

A college's business process improvement initiative is given a push when several changes occur in the IT shop.

I've written previously about our efforts at Westminster College to begin a campus-wide business process improvement initiative. In fact, I've already begun meeting with various departments on campus and will soon draft a first-run report to use as a starting point for these efforts. Over the past few weeks, though, other issues have come to light that will require immediate and significant attention.

A few weeks ago, my database person, as in my only hardcore database person, left the college to take a position at the local nuclear plant, where I'm sure he will have a glowing future. All kidding aside, this individual was a master DBA and very good at what he did. Over his few years at the college, he continued the efforts begun by his predecessor and worked very hard to take us even further, leading initiatives that have saved many hours in the long haul.

However, all was not well. Once a person leaves a department, it very quickly becomes apparent where the weak spots lie. This is in no way a slam against the individual, as he definitely got the job (and more) done each day. However, in the short time since his departure, we've had multiple instances of what I'm calling data failures. That is, the processes and procedures that are in place are not adequate for the proper long-term functioning of the department or the college.

Some examples:

  • Our dining system was down for the first two days of the term because the scripts used to load the system referenced a table that did not have the right data. Our administrative software system has two locations in which this information can be stored. With the purchase of a particular module, which we own, the data is stored in one location. Without it, the data is stored in a master table. Now, it's possible that, at some point, this information is supposed to be automatically transferred from the module table to the master table, but that didn't happen. Instead, the query that was used to identify students on new meal plans referenced the master table. In this table, the necessary fields were blank. Hence, no data. With some sleuthing, that problem is now fixed.
  • Some new students were denied access to the student portal. Upon investigation, we discovered that these new students did not have security rights to the student portlet. Again, the problem was resolved.
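The dining-system failure above is a classic two-location data problem. As a rough sketch of what a more defensive load query could look like, here is a minimal, self-contained example; the table and column names (`master_students`, `module_meal_plans`, `meal_plan`) are invented for illustration and are not the administrative system's actual schema:

```python
import sqlite3

# Hypothetical schema: the "master" table, where the failing load looked,
# and the module table, where the purchased module actually stores the data.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE master_students (student_id TEXT PRIMARY KEY, meal_plan TEXT)")
cur.execute("CREATE TABLE module_meal_plans (student_id TEXT PRIMARY KEY, meal_plan TEXT)")

# A new student: the master table's meal-plan field is blank,
# but the module table has the real assignment.
cur.execute("INSERT INTO master_students VALUES ('S100', NULL)")
cur.execute("INSERT INTO module_meal_plans VALUES ('S100', '19-meal')")

# The original load queried only the master table -- and found nothing.
broken = cur.execute(
    "SELECT student_id FROM master_students WHERE meal_plan IS NOT NULL"
).fetchall()

# A defensive version checks both locations, preferring the module table.
fixed = cur.execute("""
    SELECT m.student_id,
           COALESCE(mod.meal_plan, m.meal_plan) AS meal_plan
    FROM master_students m
    LEFT JOIN module_meal_plans mod ON mod.student_id = m.student_id
    WHERE COALESCE(mod.meal_plan, m.meal_plan) IS NOT NULL
""").fetchall()
```

The point isn't this particular query; it's that a load script shouldn't silently depend on one of two possible storage locations being populated.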

In the past, our documentation on these processes was almost non-existent, but now we do have information, so we were able to correct the problems. However, these incidents have pointed out some glaring gaps that we need to take steps to correct.

  • More documentation. We simply have to have a much deeper understanding of what's happening in our systems.
  • More cross-training. When a person departs, it shouldn't bring us to a standstill. Sure, any time someone leaves, things happen, but these instances should be relatively rare.
  • Better processes. We're working hard to help others on campus improve their processes so we have to make absolutely sure that our own are bulletproof. Right now, they aren't.

Although a lot of effort will be put into the first two points, it's really the last one that will reap some significant rewards. I'll expand on that a bit here.

While we make efforts to gain a better understanding of what's happening under the hood with our systems, we'll also start rebuilding the links between the systems to be more consistent and have much better documentation. As much as possible, I'd like to do away with "mass loads" and move to a more real-time synchronization model. For example, instead of mass loading all new freshmen into the dining system and then having the dining folks do manual updates as things change after the load date (which is what we do now), we'll develop on-the-fly routines that scan for changes. When a change is detected (maybe a student pays his tuition deposit, which would make him eligible for a meal plan), this routine will have enough intelligence to completely handle the process. An account for him will be automatically created in the dining services system and he will be assigned to the appropriate meal plan. Again, the whole process will undergo end-to-end analysis and documentation. The analysis will include everyone who takes part in the currently manual process, including IT, our housing office, and the dining people. We want to make sure that any human decisions that would be made will find their way into the new process.
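The change-detection idea above can be sketched in a few lines. Plain Python structures stand in for the real systems here, and every name (the `deposit_paid` flag, the dining-account store, the default plan) is an assumption for illustration, not the college's actual data model:

```python
def sync_dining_accounts(students, dining_accounts, default_plan="standard"):
    """Scan for students who have become eligible (deposit paid) but have
    no dining account yet; create one and assign a plan. Returns the IDs
    of newly provisioned students so the run can be logged and audited."""
    created = []
    for student_id, record in students.items():
        if record.get("deposit_paid") and student_id not in dining_accounts:
            dining_accounts[student_id] = {"meal_plan": default_plan}
            created.append(student_id)
    return created

# Hypothetical data: S1 just paid a deposit, S2 hasn't,
# S3 was already provisioned on an earlier run.
students = {
    "S1": {"deposit_paid": True},
    "S2": {"deposit_paid": False},
    "S3": {"deposit_paid": True},
}
dining = {"S3": {"meal_plan": "standard"}}

new_accounts = sync_dining_accounts(students, dining)
```

Run on a schedule (or triggered by the change itself), a routine like this replaces both the one-time mass load and the manual catch-up afterward; because it is idempotent, running it again costs nothing.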

There is nothing revolutionary about what I'm considering, but I believe it's the next step in evolution when it comes to automating processes. I've never really liked mass data loads. There's too much room for error; they often happen so infrequently that people forget minor details; and, quite frankly, I don't like processes that should be routine being handled manually.

This whole concept fits in very nicely with the overall business process review that we're currently undertaking, too. We don't need to consider this service-oriented architecture initiative to be separate from that effort; in fact, the two are quite complementary and, at the end of the day, they will together help us eliminate significant inefficiency and error that exist in the current methods.


Since 1994, Scott Lowe has been providing technology solutions to a variety of organizations. After spending 10 years in multiple CIO roles, Scott is now an independent consultant, blogger, author, owner of The 1610 Group, and a Senior IT Executive w...
