Software Development

Short Order Programmers

54 comments
absher786

I have personally experienced that high cohesion between programmers working on the same project/code is like "too many cooks spoiling the soup." Usually, I find myself more relaxed and sharper working independently. Collaboration between programmers within a team should work the way two or more classes collaborate to accomplish a task: encapsulation (offering each other a set of tools (classes/methods/functions/DLLs) while not getting bogged down in the same code).

stew

[I had to start at the top level to reply to "If/Else."]

"Now, instead of doing all of that if/else, I do a lot of try/catch. But, to be frankly honest, I still prefer to validate within if/else and other conditional blocks, because it lets me manage the response a bit better. Also important is that try/catch seems to be a bit slower compared to an if/else."

You have a flawed view of exception handling, at least from my experience and reading. First, the speed of error handling isn't usually important unless try blocks add unwanted overhead. For example, in C++, most compilers manage to avoid any overhead for try blocks. When an exception is thrown, you pay a high performance penalty, of course. If the languages you use make try blocks slow, complain!

Second, error handling of abnormal events is much simpler with exceptions. Using exceptions avoids the clutter of "if (failed) return error" throughout a function, its caller, its caller's caller, and so on. You can simply write the code assuming everything works, but with exceptions in mind, and then add the handler at whatever level of the call stack makes sense to address the problem.

A language like Python uses exceptions for normal error reporting, so it clutters up one's code relative to conditional logic. In C++, however, exceptions are reserved for exceptional conditions. Things that one expects, especially those that are normal and even frequent, one can handle optimally with conditionals. Even there, however, if the relative frequency is low enough, exceptions can be superior by permitting a low-level function to signal a failure by throwing an exception while a higher level handles the exception. Everything in between remains clean and unaware of the failure. C++ provides mechanisms, captured in common idioms, that make this treatment of exceptions straightforward. Python, by contrast, must be riddled with try blocks to account for errors along the way, making the code less readable and maintainable.
Ultimately, it is the programming language that dictates the efficacy of exceptions. Just don't let their treatment in one language color your perception of them in another.

onbliss

I would prefer exceptions for abnormal events, as you point out. In my view, it is good practice to handle as much logic/validation as possible using if/else rather than exception handlers. I am missing your point about clutter. The logic (a.k.a. clutter) has to be some place in the code, right? Be it in the if/else or the try/catch block?

stew

The clutter comes from every function in the call chain having to deal with errors reported further down the chain. Sometimes, the return types differ and a function has to translate from one type to another to satisfy its own interface. With exceptions, the function at the lowest level can throw an exception and the function at the top of the call chain can handle it. No functions between those two need any code to handle the errors reported by those exceptions. Please understand that I'm not advocating exceptions in lieu of conditionals. I'm advocating using each to its best advantage.
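
A minimal Python sketch of this pattern (all function and exception names here are illustrative, not from the thread): the lowest level throws, the top level handles, and nothing in between mentions errors at all.

```python
class DataError(Exception):
    """Raised at the lowest level when input data is bad."""

def lowest(value):
    # Lowest level: detect the problem and throw.
    if value < 0:
        raise DataError(f"negative value: {value}")
    return value * 2

def middle(value):
    # Intermediate level: exception-neutral, no error-handling code at all.
    return lowest(value) + 1

def top(value):
    # Top of the call chain: the one place that handles the error.
    try:
        return middle(value)
    except DataError:
        return None  # translate the failure into a sentinel for the caller
```

Note that middle() never translates return codes or checks error values; the exception passes through it untouched.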

stew

My original statement, for which I provided an example, was, "With exceptions, the function at the lowest level can throw an exception and the function at the top of the call chain can handle it. No functions between those two need any code to handle the errors reported by those exceptions." My example showed how the intervening functions can be oblivious -- other than being exception neutral -- to any exceptions reporting errors from lower levels.

Your question has to do with situations in which the intervening functions need to handle errors from lower-level functions. The same basic rules apply: how common are the errors, and through how many otherwise uninterested levels must they be communicated? If b() needs to act based upon a()'s errors, then a() should return an error indicator and b() should act conditionally. If it is d() that needs to act on a()'s errors, but neither b() nor c() cares, then an exception is likely better.

Now let's consider your example, which presents a different scenario from mine. In this case, b() doesn't propagate any of a()'s errors, so I wouldn't have a() throw exceptions for b() to catch unless those errors were uncommon. I'd have a() return a value for b() to inspect. If, as you suggest in the latter half of your example, b() only cares whether a() succeeds or fails, then a() can take whatever actions are sensible to return a Boolean answer. If that means a() must handle exceptions from still lower-level functions, then so be it. I advocate using exceptions when they are appropriate and other reporting mechanisms when they are better, according to some rational definition of "better."

onbliss

But, say, if b() or c() had to do something more than returning "false" or "33," then either they have to catch the exception or use some kind of validation. For example, suppose b() had to do something different if a() returned 0 or -1, and did not have to propagate the error up to d(). We could write the code in such a way that a() would trap all the exceptions (use exception handling there) and then return just 0 or -1. Then b() does not have to use exception handling; it could just check the value with an if/else and execute the code it wants to. c() and d() do not even know about the guts (encapsulation??).
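
This wrapping idea can be sketched in a few lines of Python (risky() and the return values are illustrative; the function names a and b mirror the ones in this exchange):

```python
class LowLevelError(Exception):
    """Illustrative exception raised somewhere below a()."""

def risky():
    # Hypothetical lowest-level operation that may raise.
    raise LowLevelError("device unavailable")

def a():
    # a() traps all exceptions internally and returns a status code,
    # so its callers never see any exception-handling machinery.
    try:
        risky()
        return 0       # success
    except LowLevelError:
        return -1      # failure, encoded as a plain return value

def b():
    # b() reacts with ordinary conditionals; no try/catch needed here.
    status = a()
    if status == 0:
        return "processed"
    else:
        return "skipped"
```

Here the exception handling is encapsulated entirely inside a(), exactly as described: b(), c(), and d() never know about the guts.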

stew

I wrote: "The clutter comes from every function in the call chain having to deal with errors reported further down the chain. Sometimes, the return types differ and a function has to translate from one type to another to satisfy its own interface." Here's a quick example:

    int a() { /* ... returns nonzero on failure ... */ }

    bool b()
    {
        if (0 != a()) { return false; }
        /* ... */
    }

    int c()
    {
        if (!b()) { return 33; }
        /* ... */
    }

    int d()
    {
        if (33 == c())
        {
            std::cerr << "error\n";
        }
        /* ... */
    }

onbliss

Can you give an example for the scenario described in your first paragraph?

Justin James

"You have a flawed view of exception handling, at least from my experience and reading. First, the speed of error handling isn't usually important unless try blocks add unwanted overhead. For example, in C++, most compilers manage to avoid any overhead for try blocks. When an exception is thrown, you pay a high performance penalty, of course. If the languages you use make try blocks slow, complain!"

I am well aware that try/catch does not add overhead; it is the speed of what happens on "catch" that is the problem. If it is 10% slower, and you are using try/catch to do what if/else validation could do while, say, parsing a 10 GB log file, your application is going to be slow as a dog. Remember, "input" means "any data that came from external to this executable file." Try/catch's speed of exception handling is fine for the 5 things the user just typed in, but using it to handle batch processing of data is not such a hot idea, unless you require the process to halt on any errors.

Re: Python/C++, etc. I know exactly where you are coming from on this. I spent a lot of time with Perl, where exception handling was kind of loosey-goosey (the old "whatever || die('Error Message')" stuff). Every language is indeed different, and what works in one does not work well in another. My statement above was not to say "don't use try/catch," not by any means, but try/catch, in my mind, is a bit of a nuclear bomb of a solution. If you need something that says, "oh, goodness, an error has occurred!" it is great. To use it to precisely handle the wide variety of errors that could crop up in a large block of code, though, is very tricky, and you would have to write so much conditional code in the catch that it would look like what the validation code in the original would have been anyway. J.Ja

stew

"It is the speed of what happens on 'catch' that is the problem. If it is 10% slower, and you are using try/catch to do what if/else validation could do while, say, parsing a 10 GB log file, your application is going to be slow as a dog."

I already addressed that in my message, to wit: "Things that one expects, especially those that are normal and even frequent, one can handle optimally with conditionals. Even there, however, if the relative frequency is low enough, exceptions can be superior by permitting a low level function to signal a failure by throwing an exception while a higher level handles the exception. Everything in between remains clean and unaware of the failure." If the relative frequency of those conditions is high and performance matters, don't use exceptions.

"To use [exception handling] to precisely handle the wide variety of errors that could crop up in a large block of code though is very tricky, and you would have to write so much conditional code in the catch that it would look like the validation code in the original would have been anyways."

I've never written that sort of conditional logic in exception handlers. In the worst case, you can rely upon polymorphism to dispatch the correct response. You can also distinguish cases by throwing different exception types and handling them separately.
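
A short Python sketch of distinguishing cases by exception type rather than by conditionals in a single catch block (the exception classes and read_field() are illustrative names, not from the thread):

```python
class ParseError(Exception):
    """Raised when input text is not a number."""

class RangeError(Exception):
    """Raised when a parsed value is out of bounds."""

def read_field(text):
    # Throw a distinct exception type for each failure category.
    if not text.isdigit():
        raise ParseError(text)
    value = int(text)
    if value > 100:
        raise RangeError(value)
    return value

def handle(text):
    # One handler per type replaces a ladder of if/else tests
    # inside a single generic catch block.
    try:
        return read_field(text)
    except ParseError:
        return "not a number"
    except RangeError:
        return "out of range"
```

Each except clause dispatches on the exception's type, so the handler body contains no conditional logic at all.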

Ken Cox

As the title suggests, you have to scale your problem handling in your application to correspond with the layer in which the problem resides. Too often the solutions offered are holistic between what could be termed a business-logic or user error and what is an exception. Definition: an exception is an event not planned for in code that constitutes a major break in program logic and cannot otherwise be handled as a planned occurrence. Let's face it: most events can be foreseen by programmers, and the "KNOWN" events we catch are usually much better handled utilizing if/then or select/case type constructs and validation tools. Try/catch is designed for the unplanned event, the true error that has not been foreseen but still needs to be accounted for in the application architecture. Now, granted, that is just the approach I have usually taken, and there are always exceptions, excuse the pun, to every rule.

stew

[I find it ironic that you complained about my reply being snotty, overzealous, and bordering on personally attacking you in a post that is itself snotty, etc. Indeed, I find it in terrible taste to call me snotty, overzealous, and confrontational and then run away with no opportunity for me to defend myself to you. My response was written with none of those things you ascribed to it in mind. If you misunderstood something, I'm sorry. Turning on me as you did hardly improves communication. Read on if you'd like to continue in more civilized discourse.]

You cited exceptions as those things that cannot be foreseen, saying that all other "events are usually much better handled utilizing if/then - select/case type constructs." That understanding of exceptions is far too constraining; hence, it is a bad definition. Whether that definition succeeds in your "frame of reference" doesn't make it a good definition for anyone else.

As for the false dichotomy: you relegated exceptions to unforeseen events and called everything else "business logic or user errors." Since there are other interpretations for the use of exceptions, and additional categories of "events," yours was a false dichotomy. That is, there were more than two choices, so making the reader choose between your two choices is what is known as a "false dichotomy."

I defended my position by stating that "the right logic is to use exceptions when the points at which the problem can be detected and at which it can be addressed are well separated or when the events are relatively infrequent." Perhaps you took the phrase "the right logic" as an affront. I was merely trying to adjust the narrow view of exceptions you presented. Here, too, your "frame of reference" might allow certain simplifying assumptions that lead to your choice of when to use exceptions and when not, but that doesn't make your simplifications generally useful. My correction may help others, at least, see the general picture.

You categorically asserted that my statement, "It is common to want errors to halt the application," is "neither common nor something that *you* would want under any conditions" (emphasis mine). I find it odd that you would tell me what *I* wouldn't want under any conditions in my applications. I do want that behavior. Indeed, nearly every desktop/console application does that very thing. Suppose the user supplies bad command-line arguments such that the application cannot start. Rather than run in some horribly deficient state, the application simply displays a diagnostic and exits. There are many other circumstances in which this behavior comes into play. In those cases, throwing an exception that isn't handled until some top-level function allows displaying the diagnostic and quitting gracefully. No code anywhere else in the application need account for such error conditions.

Using many exception handlers may be a clue that your use of exceptions is flawed, unless you're using a language like Python for which they are a common error-reporting mechanism. In C++, it most definitely is a problem to write many exception handlers. RAII, in its many forms, and exception-neutral coding are the preferred means to account for exceptions in most contexts. With other languages, other idioms apply, of course.

Since you acknowledged that your "frame of reference" is a web environment, you might consider framing your statements in that light in the future so readers can judge them in that context. Otherwise, they are construed as intended for general application.

Ken Cox

The dichotomy is neither false nor spurious, and the definition of exceptions I provided may not correspond with your reality using your toolsets, but it fits perfectly within my frame of reference. If you perceive a specific error condition, then you can catch and handle that condition gracefully. Our goal as developers should not be that no errors occur but that no error occurs without handling. It is obvious that we are talking apples and oranges, because your statement that "it is common to want errors to halt the application" is neither common nor something that you would want under any conditions. Try/catch exists to prevent just such a thing from occurring. Granted, I work in a web environment, and that shades my perceptions. With that said, I will not post here further. I might point out that my comments were posted as my opinion, unlike your rebuttal, which is framed with zealousness and borders upon personal attacks. You might try to take a deep breath next time before responding. I have removed my subscription from this post, so respond as you will.

stew

You present a false dichotomy and a bad definition of exceptions. The right logic is to use exceptions when the points at which the problem can be detected and at which it can be addressed are well separated or when the events are relatively infrequent.

Consider iteration in Python: when you reach the end of the range you're iterating, you get an exception. Your Python code needn't check a condition every time. (Looping constructs take advantage of that implicitly, but you can deal with it explicitly if you like.) If the range to iterate is small, the exception overhead is high. If the range is large, the overhead is small. Was using exceptions to signal the end of iteration the right decision in Python? Maybe, but it certainly works. The point is that you have to decide whether to use exceptions based upon how likely or often they will occur.

It is common to want errors to halt the application. By throwing an exception, your code can unwind to main(), where you can report the error and exit. Meanwhile, the unwinding can close open files, release memory, etc., yielding a graceful exit.

Whether to use exceptions or conditionals is not based upon whether you know about an error condition. It is based upon many factors, including readability and maintainability.
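
The Python iteration behavior described above can be made explicit in a few lines (manual_iteration is an illustrative name; this is the loop that a for statement performs for you):

```python
def manual_iteration(items):
    # Explicit version of what a for-loop does internally: next()
    # raises StopIteration when the sequence is exhausted, so the
    # loop body never checks an end-of-range condition itself.
    it = iter(items)
    results = []
    while True:
        try:
            results.append(next(it) * 2)
        except StopIteration:
            break  # the exception, not a conditional, ends the loop
    return results
```

The end-of-iteration signal is an exception raised once per loop, which is exactly the "relatively infrequent event" case where exceptions shine.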

meceli

Not sure if you document your work and also prepare your analysis/design specs, project plan, test cases, etc., but that is what you have to do when you are solo. They don't magically happen; you have to make them happen. For QC, don't do it yourself; get someone else to do it, as you are most probably the worst choice to QC your own project. With "Agile" you may be required to have more discipline than just checklisting the old way.

Justin James

"Not sure if you document your work and also prepare your analysis/design specs/project plan, test cases, etc. but that is what you have to do when you are solo. They don't magically happen, you have to make them happen."

Most of the time, there is no time for this. Think about it... a customer calls you at 10 AM asking for something to be done by 4 PM. The "project specs" are a 5-bullet-point list. The "documentation" is a quick email. There just is not enough time to do it properly.

"For QC, don't do it, just get someone to do it as you are most probably the worst choice to QC your own project. With 'Agile' you may be required to have more discipline than just checklisting the old way."

I agree, being my own QC is a recipe for disaster. Unfortunately, only one of my clients will help me QC; outside of them, everything has to be checked by me, because my boss/coworkers typically do not have the time or the knowledge to check it.

whevansiii

I am in the same situation. I work alone. Getting help with QA and preventing project mission creep is a real pain. Using the client to "test" is hit or miss, even if you spell out the process.

Ian Thurston

It's great to have the freedom and flexibility of the Lone Ranger, isn't it? But the price tag for all that autonomy and power is that we have to take responsibility for quality assurance, scope management, and disaster recovery, without even Tonto to backstop us. It's our job to explain (over and over) why resources should be dedicated to testing our work (and why the testing CANNOT be done by us alone, since it's our assumptions that have to be tested by someone with different assumptions). It's our job to take the heat for unforeseen consequences and convince management and workers that even though we are not programming gods of perfection, we are doing our job as well as they are doing theirs. A professional, businesslike manner (and a TON of patience), backed up with documentation to avoid "he said / she said" arguments, goes a long way toward smoothing the job. Sure, this is human relations, not programming. But it is a key part, perhaps the heart, of succeeding at the Lone Ranger business. If you can't do this, you WILL fail on a project. Finally, "count to 10" is a useful strategy for dealing with frustration as a kid. It's ESSENTIAL for dealing with clients as the Lone Ranger.

Justin James

The customer's attitude plays a huge role in the success or failure of these projects. I have some customers that will do a lot of testing once I pass it on to them (and, in a bizarre way, seem to enjoy it); they actually schedule their time around mine, so I can pass off code for them to play with. For these customers, it ends up being a very informal version of Agile programming. The rest of my customers, though, have this fantasy of an IT black box: you put project specs in and get results out. These are the customers that I get very nervous about, because there is no sanity check at all on the work. I make assumptions, they do not test them, and then 2 months later they call me about the assumptions, picking my brain to find out why something works the way it does. J.Ja

Mark Miller

I was working with a small IT/software shop that did projects for non-profits and small businesses. I worked from home, doing a lot of the work myself. It's not my ideal work, but I learned a TON by doing it. It got kind of lonely, though. I realized that when I worked in teams, there was a sense of camaraderie with fellow coders. We would get together and chat at lunch about what we were up to, and common interests. It was challenging trying to get that same feeling working at home. I guess forums like these were minor substitutes. :)

[i]It is not that I turn these opportunities down, or even that I interview for them but I am rejected; it seems like they just do not exist in my neck of the woods.[/i]

I got confused when you said this. Are you saying that the "5 star" opportunities exist, but you get rejected, or are you saying you just haven't found any, period? Having people like you and me working alone seems to be the way that a lot of companies have wanted to do things since IT went bust in 2001. When I was struggling to find work a few years ago, I noticed most want ads showed that they wanted developers to work alone and do everything that other people used to do in teams: gathering requirements, requirements analysis, design, coding, testing, and deployment. I did all of this in my last position, with some help from my boss, the guy I contracted with.

One of the things I got into when I worked with teams in the past is that I tended to lean on the other members. I did this without realizing it. I would pull my weight, but sometimes it would get to feeling like it was too much of a burden and I would kind of dump my problems on other people. Once I realized what I was doing, I felt bad about it and vowed not to do that anymore. Sometimes it's tough not to, because otherwise you just have to suck it up and carry the extra burdens that you would rather not deal with.
I think I learned to be a better tester by working alone, because I knew there was me, and then there was the customer. We always did it such that the customer would do an acceptance test before putting it into production, but the customer expected to find minor problems, if any. They'd complain loudly if the app. crashed during normal operation, for example. When I worked in teams, I used to do some integration testing, but I largely left the thorough testing to the QA team. It felt great, because I'd count on them to find any problems a more detailed test on my part would've found. I tended to hate putting a lot of time into testing. I just wanted to find the really obvious stuff. When I was working alone I imposed that extra pressure on myself to try and break my own program, because I knew if I didn't the customer would. The one thing I really get down on myself about in my work is if I disappoint a customer. So that motivated me to just tough it out and try and do what it took to find all of the bugs, even if it wasn't enjoyable. I'd make up for it by hearing how pleased the customer was with what I created. I didn't have a methodology for doing it. I knew of the principles of unit testing and code coverage, so I just tried to test all the different operations individually, and in combination. I tried boundary conditions, throwing junk at the app., and I challenged myself to come up with new inputs I hadn't tried before. That was it.

Justin James

"I got confused when you said this. Are you saying that the '5 star' opportunities exist, but you get rejected, or are you saying you just haven't found any, period?"

I just am not seeing them. By and large, when I *do* see a multi-developer environment, it is for "an exciting opportunity developing data-driven applications in [fill in framework name here]." Yawn. Just what I want to be doing: rewriting the same application that I have been writing for the last 5 years. It is telling that the last time I worked on a long-term project that constantly forced me to think, instead of wrestling with the peculiarities of a library while writing "glue" code, was for a dot-com that folded making a useless product targeted toward an audience that is notorious for not having technology budgets. J.Ja

Mark Miller

[i]It is telling that the last time I worked on a long-term project that constantly forced me to think instead of wrestling with the peculiarities of a library while writing "glue" code was for a dot-com that folded making a useless product that they targeted towards and audience that is notorious for not having technology budgets.[/i] It sounds like you're saying something about the IT software market. Would you mind elaborating? I'm always interested in insights. From what I gather, you're saying there's hardly any innovation going on. It's all about turning out applications like (physical) widgets in assembly line fashion. Is this what you were getting at?

Justin James

"I've heard many times that a lot of businesses are using Excel spreadsheets as poor-man's databases, something it really wasn't meant for."

You would be surprised at what I have seen Excel used for. On the one hand, it is kind of cool in that hackish sort of "look what I managed to pull off!" way. On the other hand, it becomes a real mess, because it gets embedded as an important business tool, and people keep attempting to extend the app until it gets more and more impossible. Case in point: we made a small Excel report which originally took one small CSV file as input, and it worked like a charm, kicking out reports for each individual sales territory. Then they wanted some detailed information added, and now Excel is attempting to handle 300 MB worth of data, and it is mangling itself when you run the macros.

"The problem is it gets real difficult to relate data with each other, if that's ever needed."

Oh, don't even get me started. You end up adding columns to make indexes of combinations of unique information and VLOOKUPing on them, but if you add another column to VLOOKUP, you need to go through all of the macros and hope you find all of the references to the columns to shift them in code. I suppose a named range could help with that, but I never tried it.

"Have you proposed the idea of using InfoPath instead?"

Almost all of my customers are still on Office 2000. :(

"It seems like if Microsoft would just take Excel, extend it with database capabilities (ie. roll Access into it, or something better), most customers would be thrilled. It would offend the purists, but most customers would love it."

That would be insanely sweet!

"Sounds like maps are the first reusable web component that most people find useful. It also sounds like a potential opportunity for Google: offer customers the ability to store their data in a Google database, and link it to Google maps, so they don't have to do it themselves. Slap a customer-specific web front end on it, and there you go!"

Maps are easy because the input specification is so standardized and easy to work with. Phone numbers, same thing. Try doing a mashup with more complex or less rigid data, and every customer will need to write a conversion layer. This is why Web services and UDDI never took off, XML is fairly useless (beyond its technical problems), and distributed server-side processing as a collaboration of backend systems just isn't happening anytime soon. Everyone handles the same general data differently, the way their business needs to, and no one wants to change how they do it. J.Ja

Mark Miller

[i]Excel is very much so the default app (along with Outlook) for a huge number of users. It is why you see Excel being used for so many things that it is lousy at, because people use it enough to feel comfortable.[/i] I've heard many times that a lot of businesses are using Excel spreadsheets as poor-man's databases, something it really wasn't meant for. I can understand why users decided to do this, though. Simple databases are just rows and columns of data, conceptually. Excel facilitates that, and makes it frictionless. There's no setup. Just start typing in your rows and columns. Easy, right?

The problem is it gets real difficult to relate data with each other, if that's ever needed. Another problem is Excel has a capacity limitation. Once you hit that ceiling, the only choice you have is to create another empty spreadsheet with the same rows and columns, which has no relationship to the data in the other spreadsheet.

Data entry I can kind of understand, since that's a piece of the functionality that a spreadsheet must have. It's difficult to validate the input though, isn't it? Have you proposed the idea of using InfoPath instead? It's designed for data entry and validation, and it's part of Office. I vaguely remember seeing a demo where the guy somehow integrated Excel into InfoPath. It looked seamless.

It seems like if Microsoft would just take Excel, extend it with database capabilities (ie. roll Access into it, or something better), most customers would be thrilled. It would offend the purists, but most customers would love it.

I read an interview with Avi Bryant, one of the creators of Seaside, a web framework I've investigated (runs on Squeak). He's done something like what I'm talking about. He created an online app. for small business called "DabbleDB" (www.dabbledb.com). He wrote it using Seaside, and its user interface is similar to Excel. The difference is it has database capabilities. He took advantage of Excel's appeal as a database tool. 
Users put fields and values into rows and columns that can be moved and resized. All data is typeless unless the user asks it to put constraints on it. It uses a subscription model. Kinda neat.

[i]Sure, there are some neat mashups with maps out there... but the fact that 75% of the mashups out there are Google Maps (or a knockoff of it) + some database of addresses says a lot about the idea of mashups, doesn't it?[/i] Sounds like maps are the first reusable web component that most people find useful. It also sounds like a potential opportunity for Google: offer customers the ability to store their data in a Google database, and link it to Google maps, so they don't have to do it themselves. Slap a customer-specific web front end on it, and there you go!

Justin James

"The frustration you're expressing is a symptom IMO that you're ahead of the curve." Thanks! Your point about the printing press is absolutely perfect. Right now, computers are the point of computers. Once they are accepted as simple appliances and "just work", then we can start using them as a lever. Some aspects of the Internet have already gotten there... email was almost there, but now spam is destroying it.

"In terms of Excel, I guess it's become kind of the standard offline app?" Excel is very much so the default app (along with Outlook) for a huge number of users. It is why you see Excel being used for so many things that it is lousy at, because people use it enough to feel comfortable. But that is also why it is a good tool to use as an input/output system. Things like autofill make it easy for data entry, much easier than a standard database interface or a Web form. For one project, I made a giant Excel spreadsheet that takes the user's input and transforms it into the proper INSERT statements to be used to populate a database. While I hated writing it, the customer liked it much better than the Web forms.

"Are you saying Excel is pointless because these packages already have offline technologies they could've used?" Yup, Cognos (and probably Siebel) can output results in Excel or PDF format, as well as on the screen. But using these monolithic OR mapping report systems is such a hassle, they would rather dump data on us and have us make the report, instead of using their million dollar systems. Consultants are much easier black boxes than reporting systems. :)

"I've seen some of these mashups where the maps make sense, like in a phone directory." Sure, there are some neat mashups with maps out there... but the fact that 75% of the mashups out there are Google Maps (or a knockoff of it) + some database of addresses says a lot about the idea of mashups, doesn't it? J.Ja
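The spreadsheet-to-INSERT trick Justin describes can be sketched outside of Excel, too. Here is a rough Java version; the table name, columns, and quoting rule are my own illustration, not the actual spreadsheet's logic, and in production code parameterized statements would be the safer choice:

```java
import java.util.List;

public class InsertBuilder {
    // Build an INSERT statement from one row of spreadsheet-style values.
    // Single quotes in values are doubled, the minimal SQL escaping rule;
    // in real code, prefer PreparedStatement parameters over string building.
    static String buildInsert(String table, List<String> columns, List<String> values) {
        StringBuilder sb = new StringBuilder("INSERT INTO ").append(table).append(" (");
        sb.append(String.join(", ", columns)).append(") VALUES (");
        for (int i = 0; i < values.size(); i++) {
            if (i > 0) sb.append(", ");
            sb.append('\'').append(values.get(i).replace("'", "''")).append('\'');
        }
        return sb.append(");").toString();
    }

    public static void main(String[] args) {
        // One "row" of user input becomes one statement.
        System.out.println(buildInsert("territories",
                List.of("region", "rep"),
                List.of("Northeast", "O'Brien")));
    }
}
```

Each spreadsheet row maps to one generated statement, which is presumably why the customer found it so much friendlier than re-keying data into Web forms.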

Mark Miller

[i]Yeah, I see little innovation indeed. The market is *still* stuck in the data processing mindset. We are still writing dumb reports based on dumb data. I look at the reports my clients have me run, what they measure is absolutely meaningless to the business, but because they just want to replicate a paper process on a screen, they are focused on "how" and not "why".[/i] When you said this it reminded me of something that Alan Kay and Douglas Engelbart have both complained about: that we're automating what we already do on paper. I see some value in doing that, actually, but I see their point as well. Automating paper is not that interesting to us. They see computers as a new medium, as revolutionary to our culture as the printing press was. Kay said that when the printing press was invented, people did things with it that were not too novel: just automating what they used to do by hand. A few hundred years later our society was really harnessing the power of the printing press, doing things like enabling democracy and expanding educational opportunities. He predicts that computers will have a similar effect, but we may not see it for a while. It can take a while for people to realize what they really have. What would jumpstart the process is if some business people understood it and started using it to their advantage. The frustration you're expressing is a symptom IMO that you're ahead of the curve.

A few years ago I read an article, one that's now 10 years old, about laid-off employees at an industrial plant being bewildered as to why it happened. The plant managers were looking at new technologies for becoming more efficient. One former employee that was interviewed said something about how he was told the company was going to start using the internet. He didn't even know what the internet was. I doubt the plant managers did either. They were probably listening to outside consultants telling them to do this. 
I don't know, but maybe the people you're dealing with think that having computerized reports in the fashion you're delivering is a fabulous modernization of what they used to do. They don't get it yet that they could be doing so much more. Even if you explained it to them (if they'd listen), it still might be over their heads.

[i]Yes, I am extremely frustrated with the software market. It is all "me too!" technology. Why am I pulling data out of Siebel and Cognos in order to generate Excel reports? It makes no sense.[/i] Alan Kay's Turing Award speech, which he gave a couple years ago, is titled "The Computer Revolution Hasn't Happened Yet". I loved the title. This was probably where he started talking about what I mentioned above.

In terms of Excel, I guess it's become kind of the standard offline app? I wondered about this too when I worked on a project last year where the customer wanted a web app. that would receive and send back Excel spreadsheets as an auxiliary function (the main function was data entry into a database, done through the web app., and producing reports on that data). They were using Excel templated worksheets as an offline app., in case they were not connected to the internet. They could enter their values in the cells, upload the file later, and the app. transferred the values into the database. At first this seemed real odd to me, but then I looked at what I would've had to write from scratch if I wanted to duplicate its functionality. It would've been a lot of work and a lot of expense for them. Excel was the cheaper option. And Excel handles data entry and calculation well. It's already prepared for it. No need to write an app. to do that.

The web app. for this project was pretty boring. All I was doing was mindless C/R/U/D. The Excel processing was the most interesting part. Figuring out how to produce the Excel templates, which were complex, was the most challenging. 
They had to be filled in with data, by category, for the end users to update or add to. Personally I like working on apps that have some business logic in them, validating what the user enters, or filtering what they see if they put in a query. That takes some problem solving on my part--something I like exercising. I've heard of Siebel (for CRM), but haven't heard of Cognos. Are you saying Excel is pointless because these packages already have offline technologies they could've used? Maybe there's a training aspect as well. Rather than train new employees on Siebel and Cognos, they figure people already know Excel coming in, so they'll just have them use that when they're not connected to the central system. Just a guess.

[i]Remember when AJAX and "Web 2.0" became the "bleeding edge" in "innovation"? And what do we have? A thousand "mashups" of some relatively worthless data with Google Maps. Astounding. "Innovation" is not combining a database with a map; MapPoint and other pieces of software have been doing this for years, even DECADES. What is "innovative" about putting it on the Web?[/i] I've seen some of these mashups where the maps make sense, like in a phone directory. In the recent election I wanted to look up what my state House and Senate districts were (something pretty obscure), and I found a very nice site that did that. I entered my zip code and it showed me the districts on a Google map, and where I was, and it told me who my current reps. were. I haven't really seen a useless map mashup yet, but then I haven't gone looking for them.

Justin James

Mark - Yeah, I see little innovation indeed. The market is *still* stuck in the data processing mindset. We are still writing dumb reports based on dumb data. I look at the reports my clients have me run; what they measure is absolutely meaningless to the business, but because they just want to replicate a paper process on a screen, they are focused on "how" and not "why". Yes, I am extremely frustrated with the software market. It is all "me too!" technology. Why am I pulling data out of Siebel and Cognos in order to generate Excel reports? It makes no sense. Remember when AJAX and "Web 2.0" became the "bleeding edge" in "innovation"? And what do we have? A thousand "mashups" of some relatively worthless data with Google Maps. Astounding. "Innovation" is not combining a database with a map; MapPoint and other pieces of software have been doing this for years, even DECADES. What is "innovative" about putting it on the Web? J.Ja

onbliss

...your code does not get reviewed by others?

Justin James

The vast majority of the time, yes. There is no one in my company who would even understand what they are looking at, except for a few bits of SQL or maybe the more common pieces of VBA script in Excel macros. I have asked the other folks to review my code; instead, it is reiterated that it has to be right when I say that it is done. To compound the problem, results must be *perfect*. Hundreds of thousands, sometimes millions of dollars ride on my work at any given moment in time. Not to get too geeky, but sometimes I feel like Luke Skywalker with his one chance to get the torpedo in the exhaust vent.

This is why I am such a stickler and perfectionist about code, and why I really dislike the attitude a lot of developers, particularly Web developers, take about defects and bugs in software. "Well, if it eats a post or two, oh well." Well, I sometimes spend hours on a post; it is not "oh well." Since I have been with my current employer, I can only recall two bugs that made it to production, and luckily, neither one of them was a bug that affected the final output (they were interface bugs that simply prevented people from using the application when given odd input) or had to do with monetary matters.

So yes, while others are willing and able to test with/for me on occasion, it is primarily all on my shoulders. To compound the problem, with the deadlines that I am frequently given, it is often hard to do things the "right" way. For example, I work in OOP languages much of the time, but I rarely have the time to abstract or plan the code. That's why I like programming at home too; at least then, I can plan, with no deadlines or pressure. :)

* Edited to clarify: at previous employers, I often had someone to review code, but not at all employers. The lack of code review is a matter of circumstances, not preference. J.Ja

jslarochelle

peer review. I use FindBugs (I work mostly in Java), but I know there are some good tools for C++ (the one with the little quiz in the advertisement in C/C++ Developers Journal). I know Borland Delphi now has something similar also. I also use a nice Eclipse Metrics plugin. Using this I can monitor things like the size of methods or more exotic parameters like the "cyclomatic complexity". That way I can quickly catch some problems in the nest before they become too complex to fix.
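For readers unfamiliar with the number such metrics plugins report: cyclomatic complexity is roughly one plus the count of decision points in a method. A toy illustration (real tools compute it from the parse tree; this naive substring count is only meant to show what the number measures):

```java
public class Complexity {
    // Rough cyclomatic complexity estimate: 1 plus the number of decision
    // points. Real metrics tools walk the parse tree; this naive substring
    // count (which would even match "if" inside a longer word) is only
    // meant to illustrate what the metric counts.
    static int estimate(String source) {
        int score = 1;
        String[] decisionTokens = {"if", "for", "while", "case", "catch", "&&", "||"};
        for (String token : decisionTokens) {
            int idx = 0;
            while ((idx = source.indexOf(token, idx)) != -1) {
                score++;
                idx += token.length();
            }
        }
        return score;
    }

    public static void main(String[] args) {
        System.out.println(estimate("if (a && b) { x(); }")); // prints 3
    }
}
```

A method whose score keeps climbing is exactly the kind of "problem in the nest" jslarochelle describes catching early.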

Justin James

Mark - I did some research, and stand 75% corrected. Seagate's software division changed their name to "Crystal Decisions", which was acquired in 2004 by Business Objects. I had it backwards, and thought Seagate had bought Crystal Reports. :) It's still hard to work in. J.Ja

Mark Miller

What I remember is that CR is made by a company called Business Objects. Are they owned by Seagate?

Justin James

But didn't, for that very reason. I looked at it for about 10 minutes, and said, "no". Talking to someone who is a consumer of Crystal Reports reports confirmed my suspicions about it. Plus the idea of Seagate as a company making development tools just does not make sense to me. J.Ja

Mark Miller

Don't forget the other least supported technology, Crystal Reports.Net. The documentation for it sucks! I had to go to online forums to learn how to use it. It seems no matter what technology I've used (even non-Microsoft), the report creation tools are the worst. The reporting engines work OK, but the tools for creating the reports and the documentation for them are atrocious. Why is this??

Justin James

... most of my work is in VB.Net, ASP.Net, SQL (primarily FoxPro, the least supported program in the history of the world), and VBA (the worst language interfacing with the worst object model in history). :( J.Ja

onbliss

..that you have a good head on your shoulders :-) Based on your blogs and posts, you appear to be a very sharp and responsible individual. Happy coding....

Justin James

It's always nice to hear nice things, even if on my first sick day since February, I get a call letting me know of my third bug to hit production. The customer changed project specs, I hand-ran the correction, didn't fold it back into the codebase, and when I re-ran the entire codebase a bit later, the change was lost. The customer says it is not a big deal, but the report had already been sent to the printers, so I am beating myself up over it. What a way to spend a sick day, VPNed in and fixing code. J.Ja

Big George

in order to avoid bugs when it goes to production, since pretty much you alone analyze, develop, and deploy your own programs.

Justin James

jbeteta - That is an excellent question; I was thinking about writing about that very topic next week. Indeed, I could (and maybe I should!) fill a whole book about the subject. In a nutshell, I have had to learn a bit about every aspect of the process, from the RFP all of the way through to sending an invoice and verifying that the project has been completed to the customer's satisfaction. Of course, my main area of focus is the coding itself and the QA/QC phase.

Since most of my work is running various reports, I hand pick the data with single SQL statements or a data browser, using a methodology as different as possible from what the program uses, and compare the results. I do this at random, as well as inspecting the first set of results and the last set of results. This turns up the vast majority of bugs. Using the first and last groups of results shows fenceposting problems, and a few random picks are likely to turn up the rest. Whenever possible, I use the code to essentially duplicate the work and compare the results directly. If two different codebases generate identical results, either I am really making a mess of things, or the code is right. :)

For non-report applications, I typically will throw a ton of junk inputs and valid inputs at the program while stepping through the code in a debugger. It serves as a code review, and you get to see if anything is doing something wrong. It also lets me see interactively where my opportunities to refactor are. Hope this helps! J.Ja
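The "two different codebases" cross-check Justin describes can be sketched as a diff of keyed report totals. This is my own minimal illustration (the key names and tolerance are invented), not his actual process:

```java
import java.util.Map;
import java.util.TreeMap;

public class CrossCheck {
    // Compare report totals computed by two independent code paths, keyed
    // by some identifier (territory, account, etc.). Any key where the two
    // runs disagree, or that only one run produced, is flagged for review.
    static Map<String, String> disagreements(Map<String, Double> first, Map<String, Double> second) {
        Map<String, String> diffs = new TreeMap<>();
        for (Map.Entry<String, Double> e : first.entrySet()) {
            Double other = second.get(e.getKey());
            if (other == null || Math.abs(e.getValue() - other) > 1e-9) {
                diffs.put(e.getKey(), e.getValue() + " vs " + other);
            }
        }
        for (String key : second.keySet()) {
            if (!first.containsKey(key)) {
                diffs.put(key, "null vs " + second.get(key));
            }
        }
        return diffs;
    }

    public static void main(String[] args) {
        Map<String, Double> mine = Map.of("North", 1250.50, "South", 980.25);
        Map<String, Double> theirs = Map.of("North", 1250.50, "South", 981.00);
        System.out.println(disagreements(mine, theirs));
    }
}
```

An empty result means the two independent implementations agree; anything else points straight at the rows worth inspecting, including the first/last groups where fencepost bugs hide.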

Wayne M.

Agile may just work for you. I think it addresses your short iteration cycle issues, your changing requirements issues, and your QA/QC concerns. XP is based on short 1-2 week cycles, and there is no waiting for years for feedback. The concept is to take an idea into production code immediately. Likewise, in XP you do not wait for all requirements to stabilize. Decide upon something that can be completed in a week and do it. At the end of the week, either accept the new version, accept the new version and add some enhancements to the list, or reject the new version outright. Using test-first design may solve many of your QA issues. The xUnit set of test frameworks handles most mid-tier and database issues. The user interface, especially when client-side coding of any sort is used, remains an issue unless you want to invest in some big-time, yet clumsy, UI testing tools. I haven't been real happy with any of the written books on Agile Development; they seem to avoid the necessary details. There are some decent online tutorials, and I'll post them back when I find them again.
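The test-first rhythm Wayne mentions can be shown without any framework. A real project would use an xUnit tool such as JUnit, but plain assertions capture the idea; the invoice function here is a made-up example, not from any real system:

```java
public class InvoiceTest {
    // Production code, written only after the assertions below were agreed on.
    // Rounds to cents; the invoice domain is an invented example.
    static double totalWithTax(double subtotal, double taxRate) {
        return Math.round(subtotal * (1 + taxRate) * 100) / 100.0;
    }

    public static void main(String[] args) {
        // The "tests" come first in the workflow: expected behavior is
        // pinned down before the implementation exists.
        // Run with java -ea to enable assertions.
        assert totalWithTax(100.0, 0.07) == 107.0;
        assert totalWithTax(19.99, 0.0) == 19.99;
        System.out.println("all tests pass");
    }
}
```

The point of the discipline is that the one-week iteration ends with the checks already in place, so accepting or rejecting the new version is mechanical rather than a judgment call.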

Dr Dij

Extreme Programming Refactored, by Matt Stephens and Doug Rosenberg. Apress, 2003 (428 pages), ISBN 1590590961. This text provides a thorough and systematic analysis of XP practices, proposing better ways of achieving XP's agile goals that are applicable to a much wider range of projects. Hilarious at points; it uses examples such as the "XP Picnic": no requirements defined, people are just told to "go to the company picnic" (not even told which park to go to, as that would be written requirements; "they'll figure it out"). People show up, then realize they need something else, so they drive home... And they've got Beatles songs with XP lyrics. Has to be the funniest serious book. Goes over the C3 major project failure, which XP people seem to think was a success even after it was canceled. They give serious alternatives to really stupid parts of XP, such as pair programming and not documenting requirements, and explain some parts that ARE good to adopt. I read this online at books24x7.com, in the IT subscription.

Justin James

Wayne - Please do post those links. Nothing I have seen on Agile Programming or XP leads me to believe that either one meets my needs. For example, I wrote the bulk of a program in 3 hours yesterday. It can barely be called a "program" (150 - 200 lines of code?), but it was cranking out results before lunch, and had gone through 3 or 4 iterations of specification changes on the customer's part by the time I left. Indeed, I had to code a brand new feature in under 20 minutes this afternoon. Most programming methodologies that I am aware of simply break down under those kinds of time restrictions. J.Ja

onbliss

Once I worked in a group that just cranked out code. No process, no methodology, and no enforcement from the management, even though everybody knew that some things just took time. The turnover of the managers was very high. Nothing wrong with writing code fast, but I just pitied the developers. Because of the way they programmed, bugs crept in, and all the knowledge became resident in the developers' heads. A dangerous situation for the company. But nobody really cared, except the developers.

jslarochelle

"Code Complete" are great suggestions. The case studies in Rapid Development are enlightening and fun to read. Code Complete is a classic. His latest book, "Software Estimation", is a good practical guide to the estimation process (with just enough theory).

onbliss

...are kind of resource hungry :-)

Justin James

Mark - That is how I was taught as well. Now, instead of doing all of that if/else, I do a lot of try/catch. But, to be frankly honest, I still prefer to validate within if/else and other conditional blocks, because it lets me manage the response a bit better. Also important is that try/catch seems to be a bit slower compared to an if/else. For example, with .Net 2.0, it is recommended, when casting to an Integer, to do a TryParse and wrap that in if/else instead of putting a CInt or Integer.Parse within a try/catch, for speed reasons (I know, I just used VB.Net syntax/objects there). J.Ja
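The TryParse pattern Justin describes translates to other languages, too. In Java terms (this is my own sketch, not his code; the helper name is invented), you can wrap the exception once so every call site stays a plain conditional:

```java
import java.util.OptionalInt;

public class SafeParse {
    // Wrap the exception once, so every call site can use a plain if/else
    // test instead of its own try/catch; a Java approximation of the
    // .NET TryParse idea. The helper name is my own invention.
    static OptionalInt tryParse(String s) {
        try {
            return OptionalInt.of(Integer.parseInt(s.trim()));
        } catch (NumberFormatException | NullPointerException e) {
            return OptionalInt.empty();
        }
    }

    public static void main(String[] args) {
        OptionalInt n = tryParse("42");
        if (n.isPresent()) {
            System.out.println("parsed: " + n.getAsInt());
        } else {
            System.out.println("not a number");
        }
    }
}
```

The exception cost is paid at most once per bad input inside the helper, while the validation logic at the call site reads as the if/else Justin prefers.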

Mark Miller

Well, part way through it. The main thing I got out of it was: every "if" statement should have an "else". If it doesn't you should have a good reason why not, and document it. That helped me write robust code and code that could be diagnosed. Often the "else" case was an error, which I'd output to a log file (doing server work). It was great. Any time an error occurred I could diagnose the problem because I had covered every possible error condition. Nowadays with frameworks this rule doesn't really work, because the frameworks often throw exceptions in case of an error. There's often no return value to test. I wonder what the author says about this, if anything.
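Mark's every-"if"-gets-an-"else" rule, with the unexpected branch logged rather than dropped, might look like this in miniature (the status codes and names are invented for illustration):

```java
public class Dispatch {
    // Every "if" gets an "else": the unexpected case is logged rather than
    // silently ignored, so production failures can be diagnosed from the
    // log alone. Status codes and the log-as-StringBuilder are invented.
    static String handle(int status, StringBuilder log) {
        if (status == 200) {
            return "ok";
        } else if (status == 404) {
            return "missing";
        } else {
            log.append("unexpected status: ").append(status).append('\n');
            return "error";
        }
    }

    public static void main(String[] args) {
        StringBuilder log = new StringBuilder();
        System.out.println(handle(200, log)); // prints ok
        System.out.println(handle(500, log)); // prints error
        System.out.print(log);                // prints unexpected status: 500
    }
}
```

As Mark notes, framework code that throws exceptions instead of returning status values needs a different version of this discipline: a catch block that logs, rather than a final else.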

Justin James

I remember that my father has this book as well. Between you & him, that is a powerful endorsement of it. To be frankly honest, I rarely read computing books. Too often, they have nothing to do with reality, it seems. The worst offenders are the programming books that seem to just be a printed copy of the online reference guide; they do a great job of explaining syntax (like I could not figure it out myself) with absolutely no helpful information regarding where, when, and why I would want to use a particular function or object or library or whatever. J.Ja

onbliss

Here is his website. You might want to just browse some of his other books. Looks like he has a book on software estimation alone. http://www.stevemcconnell.com/ I have "Code Complete 2" on my book shelf. These two along with Martin Fowler's refactoring book are some of the books a developer should always read. 2 cents :-) edited: grammar

Justin James

Thanks for the suggestion! I just added that book to my "To Get" list. J.Ja

onbliss

[i]So why do people think that they can speed the coding process?[/i] Two kinds of people could think to speed up the coding process: those who are not aware of the intricate details, and those who, in spite of knowing, do not care. The lack of care could be a result of time pressure or the organizational development process. Essentially, people do this because they can live with the consequences. On the subject of estimates, I often find myself consulting the book "Rapid Development" by Steve McConnell. There is a chapter exclusively devoted to deriving estimates. Of course, it is a rapid treatment of the subject.

Justin James

That sounds like a pretty bad situation. I am surprised that there wasn't also a high turnover in *developers*. To be honest, I am not a fan of triage programming. It is hard for me to really enjoy my job when I am not given the resources (which includes "time") to do a job right. All too often, someone is screaming that "yesterday isn't soon enough!", which is a guarantee that the job will be botched. Would you say that to the guy working on your brakes and feel safe driving the car? No way. So why do people think that they can speed the coding process? They seem to think that the only thing holding me back is the speed at which I can type. In a weird way, it is nice to know that they think that I intuitively know the answers off of the top of my head and just need to type the code and hit "run". But the reality is, the majority of the pieces of code that initially pop into my mind need a ton of reworking or refining to be useful, and that's where the time comes into it. My boss loves to ask me how long it will take me to write something. How am I supposed to know, especially when working with something new? On a month-long project, my time estimate will be right, give or take a day. On a 5-day project, my estimate can be off by as much as 2 days due to snags and changing project specs. Measure twice, cut once, and all of that... J.Ja