
Fixing a fine data mess

A college's business process improvement initiative is given a push when several changes occur in the IT shop.

I've written previously about our efforts at Westminster College to begin a campus-wide business process improvement initiative. In fact, I've already begun meeting with various departments on campus and will soon draft a first-run report to use as a starting point for these efforts. Over the past few weeks, though, other issues have come to light that will require immediate and significant attention.

A few weeks ago, my database person -- as in, my only hardcore database person -- left the college to take a position at the local nuclear plant, where I'm sure he will have a glowing future. All kidding aside, this individual was a master DBA and very good at what he did. Over his few years at the college, he continued the efforts begun by his predecessor and worked very hard to take us even further, leading initiatives that have saved many hours in the long haul.

However, all was not well. Once a person leaves a department, it very quickly becomes apparent where the weak spots lie. This is in no way a slam against the individual, who definitely got the job (and more) done each day. But in the short time since his departure, we've had multiple instances of what I'm calling data failures. That is, the processes and procedures in place are not adequate to keep the department, or the college, functioning properly over the long term.

Some examples:

  • Our dining system was down for the first two days of the term because the scripts used to load it referenced a table that did not contain the right data. Our administrative software can store this information in two locations: with a particular module, which we own, the data lives in the module's own table; without it, the data lives in a master table. It's possible that, at some point, the information is supposed to be transferred automatically from the module table to the master table, but that never happened. Meanwhile, the query used to identify students on new meal plans read only the master table, where the necessary fields were blank. Hence, no data. With some sleuthing, that problem is now fixed (a rough sketch of one way to handle the lookup follows this list).
  • Some new students were denied access to the student portal. Upon investigation, we discovered that these new students did not have security rights to the student portlet. Again, the problem was resolved.
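To make the first failure concrete, here is a minimal, hypothetical sketch of a lookup that checks the module's table first and falls back to the master table. The table and column names (module_meal_plans, master_students, meal_plan_code) are invented stand-ins for whatever the administrative system actually uses, and sqlite3 stands in for the real database; this is not the query we run, just the shape of the fix.

```python
import sqlite3


def get_meal_plan(admin_db: sqlite3.Connection, student_id: str):
    """Return a student's meal plan code, preferring the module's table."""
    # Preferred source: the table populated by the purchased module.
    row = admin_db.execute(
        "SELECT meal_plan_code FROM module_meal_plans WHERE student_id = ?",
        (student_id,),
    ).fetchone()
    if row and row[0]:
        return row[0]

    # Fallback: the master table, which may be blank when the module is in use.
    row = admin_db.execute(
        "SELECT meal_plan_code FROM master_students WHERE student_id = ?",
        (student_id,),
    ).fetchone()
    return row[0] if row else None
```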

In the past, our documentation on these processes was almost non-existent. We now have enough information that we were able to correct these problems, but the incidents have exposed some glaring gaps that we need to take steps to close.

  • More documentation. We simply have to have a much deeper understanding of what's happening in our systems.
  • More cross-training. When a person departs, it shouldn't bring us to a standstill. Sure, any time someone leaves, things happen, but these instances should be relatively rare.
  • Better processes. We're working hard to help others on campus improve their processes, so we have to make absolutely sure that our own are bulletproof. Right now, they aren't.

Although a lot of effort will go into the first two points, it's really the last one that will yield significant rewards. I'll expand on that a bit here.

While we work to gain a better understanding of what's happening under the hood, we'll also start rebuilding the links between our systems so that they're more consistent and much better documented. As much as possible, I'd like to do away with "mass loads" and move to a more real-time synchronization model. For example, instead of mass loading all new freshmen into the dining system and then having the dining folks make manual updates as things change after the load date (which is what we do now), we'll develop on-the-fly routines that scan for changes. When a change is detected--maybe a student pays his tuition deposit, which makes him eligible for a meal plan--the routine will have enough intelligence to handle the process end to end: an account will be created for him automatically in the dining services system, and he will be assigned to the appropriate meal plan. The whole process will undergo end-to-end analysis and documentation. The analysis will include everyone who takes part in the currently manual process: IT, our housing office, and dining services. We want to make sure that any human decisions being made today find their way into the new process.
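To illustrate the shape of those on-the-fly routines, here is a rough, hypothetical sketch of a polling sync job. Every table, column, plan code, and file name below is invented for illustration, and sqlite3 is used only as a stand-in for the administrative and dining databases; a real implementation would go through whatever interfaces those vendors actually expose.

```python
import sqlite3
import time

POLL_SECONDS = 300  # how often to scan for changes


def sync_dining_accounts(admin_db: sqlite3.Connection, dining_db: sqlite3.Connection):
    """Detect newly eligible students and push them into the dining system."""
    # Eligibility signal in the administrative system: tuition deposit paid.
    eligible = admin_db.execute(
        "SELECT student_id, full_name FROM students WHERE deposit_paid = 1"
    ).fetchall()

    # Accounts the dining system already knows about.
    existing = {row[0] for row in dining_db.execute(
        "SELECT student_id FROM dining_accounts").fetchall()}

    for student_id, full_name in eligible:
        if student_id in existing:
            continue  # already synchronized, nothing to do
        # Create the account and assign the default plan in one pass.
        dining_db.execute(
            "INSERT INTO dining_accounts (student_id, name, meal_plan) "
            "VALUES (?, ?, ?)",
            (student_id, full_name, "DEFAULT_FRESHMAN_PLAN"),
        )
    dining_db.commit()


if __name__ == "__main__":
    admin_db = sqlite3.connect("admin.db")    # placeholder connections
    dining_db = sqlite3.connect("dining.db")
    while True:
        sync_dining_accounts(admin_db, dining_db)
        time.sleep(POLL_SECONDS)
```

Polling is only one way to detect changes; triggers, change-data-capture, or a commercial synchronization product (as one commenter suggests below) could serve the same purpose. The point is that the decision logic lives in a documented routine rather than in someone's memory of a manual load.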

There is nothing revolutionary about what I'm considering, but I believe it's the next evolutionary step in automating our processes. I've never really liked mass data loads: there's too much room for error; they happen so infrequently that people forget minor details; and, quite frankly, I don't like seeing processes that should be routine handled manually.

This whole concept also fits very nicely with the overall business process review we're currently undertaking. We don't need to treat this service-oriented architecture initiative as separate from that effort; in fact, the two are quite complementary, and, at the end of the day, they will together help us eliminate much of the inefficiency and error that exist in our current methods.

About

Since 1994, Scott Lowe has been providing technology solutions to a variety of organizations. After spending 10 years in multiple CIO roles, Scott is now an independent consultant, blogger, author, owner of The 1610 Group, and a Senior IT Executive w...

14 comments
BrooklynPennyPincher

Your DBA might have had two good reasons to do bulk loads: 1. There might be tables with clustered indices based on ascending keys. A B-tree index based on a sequential ascending value will end up deep and thin instead of wide and bushy, leaving that index unable to speed up queries. The solution to this is to drop and rebuild the index after most of the data is loaded. 2. Do you have the hardware to support good online response time during large data flows? Those bulk data loads may have been done at night, to preserve online response time. You might need a workload replay system to test the waters before going into production on your incremental update system.
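[Editor's note: a minimal sketch of the drop-load-rebuild pattern this comment describes, assuming a plain DB-API database; the table, index, and column names are invented, and sqlite3 stands in for whatever engine is actually in use.]

```python
import sqlite3


def bulk_load_students(db: sqlite3.Connection, rows):
    """Bulk insert without paying per-row index maintenance costs."""
    # Drop the secondary index so the load doesn't maintain it row by row.
    db.execute("DROP INDEX IF EXISTS idx_students_last_name")

    # Load the data in one pass.
    db.executemany(
        "INSERT INTO students (student_id, last_name, meal_plan) VALUES (?, ?, ?)",
        rows,
    )

    # Rebuild the index once, after most of the data is in place.
    db.execute("CREATE INDEX idx_students_last_name ON students (last_name)")
    db.commit()
```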

david_horn

Sorry... is it just me, or have others read this wondering how and why the situation occurred in the first place? Surely we have had it drummed into IT departments around the world for years (especially since 2000) that well-documented processes for incident, change and storage management are key for any IT environment to be successful in terms of turnaround when problems occur. IBM have indeed pushed this for years, and we see the ITIL standard as a result. In fairness, I can understand that no matter how you prepare yourself or your department's staff and processes, these types of incidents can still occur. However, your actions now, although well placed, seem to be more reactive than proactive (I know! I hate using those corporate, project-management-type words too). Surely, if you knew your DBA's worth and excellence, you could not have seen him working at your college into his dying days!? Therefore, would it not have been prudent to team him up with a junior during his time there to lessen the pain upon his eventual departure? Just a thought.

Scott Lowe

You're absolutely right that we're being very reactive in this case. As some other posters pointed out, it's easy to delegate to a very competent individual, but if that individual simply has too much to do, it's easy to become mired in the day-to-day and not be able to focus on the long term. Teaming him up with a junior person would have been a great solution... if I had a staff position to hire someone into! I have to work with what I have... additional resources are scarce.

georgef

...he is dealing with the reality as opposed to theory or textbook concepts. In most situations, it cannot be escaped that IT operates within a business and therefore only has the resources available under the company's current profit situation. In that framework, you will only be able to direct resources at activities that don't directly benefit/affect the bottom line as far as your company and budget can afford. Yes, it's nice to say "reactive vs. proactive" as a judgement of a situation you have no insight into or knowledge of, but this is the real world, and no one knows what situations you inherited when you took the job or what arose once there. To understand this, you would have to have experienced meeting with a CEO/CFO and explaining your budget and its expenditure purposes. The "rubber meets the road" in that case, and you can easily be left in the treads, or as road-kill, if you ignore the business situation.

tuomo

Right - the real world is different from books, and even different from what it was yesterday. Only if you are (un)lucky enough to be the first and mess up, then... You play the cards you get and try to make the situation better; sometimes it can be a fast process, but more often it takes years, and even then the results may not fully be what you wanted. The problems are "don't rock the boat", "this has been working for us for years", "not my/our business", "be a good corporate citizen", "your manager is a god", "nobody (who pays the bills) has asked for that", "we don't have time for that nuisance", "this standard is better than the other one", "this product will solve all our problems", or whatever other old cliches. If your CIO/CTO/COO/Cxx uses any of those without reality-based facts and you don't have the CEO's/top support, case lost! Sometimes it makes you wonder why you were hired in the first place: for your skills to benefit the company, or to support some manager's own power play? Take your pick.

Back to the original question: I have worked in places where the process was supported by the CEO and places where the top didn't understand the problems. It often is a sales job, which leads to some interesting internal fights if you are not next to the top or at least have unwavering support from the top. Seen both. Now, government and academic institutions can be a problem because of the strong hierarchy; private/public companies can(?) be easier, but there are no hard rules. Seen good and bad business documentation (and process) in both. Reasons such as "it is expensive" or "it is difficult" are just weak excuses for laziness or incompetence - of course, if it is an afterthought it is both more difficult and more expensive, but designed up front into the business process (yes, IT is business!) it's neither. Same as with security, capacity, whatever.

An example: in the '70s I designed online infrastructure documentation, change management, security and capacity (at the CEO's request!) in a big company - millions of customers, a 500+ person IT department (defined internally as a profit center, so we had to show the benefits to the business, not just be a support function!), etc. It worked beautifully until the end of the '80s, when IT was separated from the rest of the business; it went downhill after that, so much so that the business ran into IT problems (IT just froze and couldn't serve the company any more - a chaos with the typical solution: hire hundreds more IT people and get worse each time!) and had to be merged into an even bigger company - and we used to be one of the largest. Sad! Seen that happening in too many companies; some things just don't seem to change.

david_horn

Indeed gents, fair point. I do realise that in the 'real world' not everything is as black and white as following documented process to alleviate this type of problem. Yes, budgetary issues are always the biggest challenge for IT managers and directors alike, yet this is also an integral part of their own function and responsibility. Please understand that I am not criticising, or even agreeing, that we could all 'blame' the old adage of 'If only I had the budget to be able to cover this'. One important skill, as I see it, for an IT director and/or manager is that of being extremely efficient in 'balancing' the books for what is most critical for a department's future. The present of course needs a close eye kept on it, but it is future planning that takes the great skill. Hope all works out great, as I am sure it will with the new procedures you are putting into place.

georgef

Hello Scott, your name sounds familiar for some reason. Enjoyed your comment about the ex-employee and the brilliant career ahead of him. I'm sure he'll apply boundless energy in pursuing the core/atomic needs of his new employer. The funnies and puns over (cathartic?), you seem, to me, to be taking the correct approach. As IT managers we slip into a comfort zone when we have star employees who "take care of it" and free us to worry about other things. Also, sometimes in our business, people will resort to "single stroke" solutions either because they feel it looks brilliant (serious now) or because they are lazy, i.e., your mass loads. When I was a programmer, I always looked to cover my bases for the future, not the short term, and was always pleasantly surprised later when I realized I had done something with that in mind. The more effort you spend now, even to the neglect of other issues, the better equipped you will be to deal with the future. There are all sorts of fads in business management theory that try to evangelize concepts like this, but it's just plain good management, and you either are one or you're not. It's all well and good for IT to get its act together with regard to these things, but you, and any other manager reading this, should try to get it to be an enterprise-wide program. All processes should be examined, refined, and clearly documented, with updates, if any, applied regularly. Some may feel this adds to worker "replaceability", but I would counter that someone that conscientious about their work is much harder to find, and it identifies that person as a keeper. To scripter, I would add another long-standing product for business process modeling and analysis, SciForma Process. Be well and good luck. George

valerie.delahouliere

Hi. I definitely concur with Scott and George... Things can get so busy that it is easy to delegate more than operations to a competent, energetic and enthusiastic employee. And if there is not enough dialogue, well, then that employee could be doing more short-term firefighting than medium- to long-term prevention. (Sorry, just saying the same thing a different way.) In any case, the main thing that strikes me is that a focus on business rules expressed in client terms would be a better place to start documenting. Of course the business and technical processes are important and critical with regard to operations, but the business rules are more important: they are longer-lived than technology, they should be independent of technology, they frequently impact security, and they give the technical implementers a better understanding of what needs to be achieved, which in turn can lead to clarification and efficiencies in the business processes. Hope this doesn't sound silly or obvious.

valerie.delahouliere

There are a few products on the market that could help in federating and/or synchronising identity-related data across multiple different systems. True, if everything is in the same db or db system, triggers and stored procs are fine, but if the business rules are complex (e.g. a new student who is also part of faculty and can access other systems or resources), it might be worth investing in a synchronisation product. We use MIIS.

scripter

Very apropos article for me, as I have just been asked by our Risk team to find a good software solution for documenting our business processes. Most BPM software I've found so far consists of mid-six figure soup-to-nuts solutions that help you document, analyze and write code to improve your processes. Since we just want to document it (and don't want to rely on Visio or Excel since so many of these processes have one-to-many relationships), the field is much narrower. I'd be curious to learn which tools you're using for your documentation project. So far the best candidates I've found are Tibco Business Studio and a SaaS tool called BPM Blueprint by Lombardi Software.

feral

I mapped a number of processes for our aircraft maintenance repair unit with the Graham Process Charting software, very nice and very fast. Shallow learning curve. Great documentation and tutorial files along with excellent support. Dr Graham's white papers and manuals are worth the read alone, his modelling method is ISO compliant if I remember correctly. Check it out.

Dr Dij

Ravenflow is pretty neat: you can highlight a paragraph with an 'if' statement describing the biz process and it will draw a flowchart and insert it in-line in the document. Tibco and Lombardi both have free downloads you can use to try out the software. iRise is good for a slightly different purpose, documenting requirements for quick prototyping, and they now have a trial version and tutorials online. Also, some programs are not as good at documenting the people part of business processes; IT departments tend to only document the computer/technical part. Some of these support documenting people-centric processes, such as deciding what to do with a document, what artifacts to create for a client, or how to continue the process.

dfolzenlogen

Can you give me ballpark figures on the cost of these software solutions?

aandrews

We use this and it does the job well. Big step up from Visio and Word which is what we used to use.