General discussion


Challenge - Delivery and Development

By larryhurwitz
If one had to ask the questions:

1) What are the elements of a successful application delivery?
2) What are the elements of a successful application development?

Your comments and input are encouraged.



Not Sure They Are Separate

by Wayne M. In reply to Challenge - Delivery and ...

First, one cannot judge the success of an application until after it is deployed. The development team and the deployment team must work together to ensure success.

The measure of success of an application is based on cost to date compared to benefits to date. The sooner an application is initially deployed, the lower the sunk costs and the sooner benefit is realized. Assume development and deployment will be ongoing and not one-time events.
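The cost-versus-benefit arithmetic above can be sketched in a few lines; this is illustrative only, and all monthly figures and parameter names are hypothetical:

```python
# Illustrative only: compare cumulative net benefit (benefit minus cost) for an
# application deployed early vs. late. All figures are made up.

def cumulative_net_benefit(deploy_month: int, months: int,
                           monthly_cost: float, monthly_benefit: float) -> float:
    """Costs accrue every month; benefits accrue only after deployment."""
    total = 0.0
    for m in range(1, months + 1):
        total -= monthly_cost
        if m > deploy_month:          # benefit starts once the app is in use
            total += monthly_benefit
    return total

# Same spend rate, same benefit rate; only the deployment date differs.
early = cumulative_net_benefit(deploy_month=3, months=12,
                               monthly_cost=10_000, monthly_benefit=25_000)
late = cumulative_net_benefit(deploy_month=9, months=12,
                              monthly_cost=10_000, monthly_benefit=25_000)
print(early, late)  # 105000.0 -45000.0
```

Under these made-up numbers, the only variable is the deployment date, and the early deployment is the difference between a net gain and a net loss over the same twelve months.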

Successful development should minimize the time to initial deployment and between subsequent deployments. Successful deployment should minimize the amount of disruption incurred. Development choices may affect the amount of disruption an application brings, and deployment choices may affect the tasks required of the development team.

I see a lot of risk implied by the separation of development and deployment, while noting it is a very common practice. It is better to get both groups aligned with the goal of successfully getting the application into the users' hands as quickly as possible.


Milestones and Benchmarking

by larryhurwitz In reply to Not Sure They Are Separat ...

Wayne - Thank you.

To some extent I agree with you. However, in order to improve future deliveries and cycles, I would think one would need to break down the entire process and then evaluate each of its parts. If so, during the product life cycle, would you not pre-schedule milestones (or markers) so one could benchmark the success or failure of each module/product? And would those milestones not be very different for development and for deployment? Assuming your answer is yes, what tools would you use to determine these markers/milestones, and what factors would determine them?




Purely looking at development

by onbliss In reply to Milestones and Benchmarki ...

Organizations mean different things when they say "development". Some imply just "coding". Some imply "analysis, design & coding", and some mean "coding & testing".

In any case, most of them do differentiate this from the "requirements" gathering phase/process. In my opinion, if one were to ponder determining development success, then some kind of traceability matrix that a) links each requirement to a development task and to a test case, and b) can provide a simple pass-or-fail status, would be required.
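A minimal sketch of the traceability matrix described above, where each requirement links to a development task and a test case and rolls up to a single pass/fail answer; the requirement, task, and test-case IDs here are hypothetical:

```python
from dataclasses import dataclass

@dataclass
class TraceRow:
    requirement: str   # e.g. "REQ-001: user can log in"
    dev_task: str      # the development task implementing it
    test_case: str     # the test case verifying it
    passed: bool       # simple pass/fail status for that test case

def development_success(matrix: list[TraceRow]) -> bool:
    """Development 'passes' only if every requirement traces to a passing test."""
    return all(row.passed for row in matrix)

matrix = [
    TraceRow("REQ-001", "TASK-17", "TC-101", passed=True),
    TraceRow("REQ-002", "TASK-18", "TC-102", passed=True),
    TraceRow("REQ-003", "TASK-22", "TC-103", passed=False),
]
print(development_success(matrix))  # False: REQ-003's test case failed
```

In practice the matrix would be generated from the requirements and test tools rather than hand-built, but the roll-up logic is the same: one unlinked or failing requirement means development is not yet done.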


Elaboration on Delivery and Development

by larryhurwitz In reply to Purely looking at develop ...

If I had to elaborate on the statements, so they would read:

a) Define the elements that make for a successful delivery, and
b) Define the elements that make a successful development project,

What would those elements be in each case?




Full Release Cycle Evaluation

by Wayne M. In reply to Milestones and Benchmarki ...

The root of the issue, as I see it, is that "development", using the broadest possible definition, is inseparable from deployment. The most severe issues in deployment arise from prior decisions made by development and management. Oftentimes, a poor deployment will overshadow a product's benefits. Clients can often recall a poor deployment years later; they don't just "get over it." Deployment is that important and cannot be segregated into its own little box.

If one accepts that the true value of a product is only realized after it has been put into use, then one can also accept that the product will be evaluated on both the benefits provided by the product's capabilities and the costs of the disruption incurred in introducing it. Development-only success evaluations shift efforts towards increasing benefits and away from reducing costs.

Ignoring deployment creates situations where the development team is not provided with the full range of the target environment. Many times the target environment is not fully defined, or the definition is based on wishful thinking rather than on the realities of the user environment. Environment variables include operating systems and versions, browser types and versions, third-party tools and versions, and hardware configurations. Product configuration steps are ignored and rarely automated. Key decisions are not made, leaving options and configuration steps that increase product complexity and the risk of deployment error.

Interim measures of development and other phases are simply surrogate measures for the value determined by product use. I am no longer of the opinion that these interim measures can be adjusted to reflect the true value; therefore, they provide a distorted view. I doubt I will convince very many based on a short message, but there is a growing body of practitioners embracing feedback through short release cycles and a flattened release process. Process improvement needs to be based on a true end-to-end view.


Excellent reply - well substantiated

by larryhurwitz In reply to Full Release Cycle Evalua ...

Wayne -
Excellent reply, and I agree wholeheartedly with paragraphs 1-3.


Concise and to the point

by larryhurwitz In reply to Full Release Cycle Evalua ...

Concise and to the point, excellently put.
Thank you


Delivery or deployment ?

by Tony Hopkinson In reply to Challenge - Delivery and ...

I'm going to assume deployment because you can't have a successful development without a successful delivery.

How many deployments failed?

One of the big ones is that you naturally test on licensed, fully patched systems with all the other components in place, on an architecture that at least meets the minimum specification.

For a horribly significant number of our customers this wasn't true. Of course our software got the blame, whether it was down to some of MS's egregious mistakes or to our customers' own IT infrastructure. It all used to work!

Our customer support had no in-team resource to address these issues, so us R&D bods had to go deal with irate, IT-illiterate customers. We were actually embarrassed into asking for help from our IS function (the guys who support us). Oh unhappy day. I had to unplug my USB mug warmer as it wasn't company issue! Normally we don't let them in the office, you see.

We changed vendors for the bulk CD copying because they were cheaper. The number of bad CDs in the run was a tad higher than usual; take a guess!

When covering off deployment, no case is too strange, such as the customer who was running his business on XP Media Centre, or the chap whose IT resource sold him a Linux box to install SQL Server on.

Measures of a successful development

How many bugs were discovered by customers ?

How many planned features did you get in the time frame?

How much unplanned extra effort did it take to achieve whatever success you had at the above?

How many of your customers didn't like the way you did it?

How much easier will the next development be?

Did it meet the business expectations in terms of customer retention and new customers?

Newly added: how well did it deploy?

Some things to think about anyway.
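The questions above could be tracked release over release as a simple scorecard; here is a hedged sketch, with entirely hypothetical field names and figures:

```python
from dataclasses import dataclass

@dataclass
class ReleaseScorecard:
    customer_found_bugs: int      # bugs discovered by customers
    planned_features: int
    delivered_features: int       # planned features that made the time frame
    unplanned_effort_days: float  # extra effort beyond the plan
    deployment_incidents: int     # "how well did it deploy?"

    def feature_hit_rate(self) -> float:
        """Fraction of planned features delivered on time."""
        if self.planned_features == 0:
            return 0.0
        return self.delivered_features / self.planned_features

current = ReleaseScorecard(customer_found_bugs=4, planned_features=10,
                           delivered_features=8, unplanned_effort_days=12.5,
                           deployment_incidents=2)
previous = ReleaseScorecard(customer_found_bugs=9, planned_features=10,
                            delivered_features=7, unplanned_effort_days=20.0,
                            deployment_incidents=5)

# "How much easier will the next development be?" shows up as a trend:
improved = (current.customer_found_bugs < previous.customer_found_bugs
            and current.deployment_incidents < previous.deployment_incidents)
print(current.feature_hit_rate(), improved)  # 0.8 True
```

The point of the trend comparison is that none of these numbers means much in isolation; it is the release-over-release direction that answers the questions in the list.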


Tony - Great reply

by larryhurwitz In reply to Delivery or deployment ?

Tony -
Thanks, great reply. I can relate.

