Can your Web site pass the eight-second test?

While customers may be willing to stand in line at some stores, they won't wait around long at your Web site's storefront. In this article, we show you what several businesses did to ensure their e-commerce applications were up to speed.

Customers may stand in line in a bricks-and-mortar store, but they won’t wait at your clicks-and-mortar storefront. Response time can make or break your e-business. That’s why it is becoming increasingly important to test Web applications before they become public failures.

“The increasing emphasis on the need for testing has been brought about by the high cost of failure, the pressure of time to market, and the increasing complexity of applications that support e-business,” says Dick Heiman, analyst with International Data Corp., the Framingham, MA-based market research company.

New options, such as hosting services, are emerging for testing applications written to support e-business strategies. These options, which are many and growing, represent a convergence of application development/deployment testing and network testing.
This article originally appeared in the February issue of Wiesner Publishing's Software Magazine and appears on TechRepublic under a special arrangement with the publisher.
Performance factors
The e-business architecture typically involves applications running inside a firewall, and communicating through the firewall to applications of business partners or directly with consumers. This complexity gives rise to several key requirements. The most important are:
  • Performance—Response time should be under eight seconds, or the partner or consumer may click away.
  • Availability—Brownouts and outages cost time and money and, increasingly, cause stock valuations to drop.
  • Scalability—From an internal perspective, the organization fielding the e-business application needs to know where the break points are.
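The eight-second rule above is easy to check mechanically. Below is a minimal sketch in Python of timing a full page fetch against that ceiling; the constant, helper names, and record shape are illustrative, not taken from any vendor tool mentioned in this article.

```python
import time
import urllib.request

EIGHT_SECONDS = 8.0  # the response-time ceiling cited in the article

def classify(elapsed_seconds, limit=EIGHT_SECONDS):
    """Label a measured response time against the eight-second ceiling."""
    return "pass" if elapsed_seconds <= limit else "fail"

def timed_fetch(url, timeout=EIGHT_SECONDS):
    """Fetch a URL, timing the full request, and classify the result.
    Any error (timeout, DNS failure, HTTP error) counts as a failure."""
    start = time.perf_counter()
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            resp.read()  # time the full body, not just the first byte
        elapsed = time.perf_counter() - start
        return {"seconds": elapsed, "verdict": classify(elapsed)}
    except Exception:
        return {"seconds": time.perf_counter() - start, "verdict": "fail"}
```

A single probe like this only measures one path at one moment; the tools and services discussed below repeat such measurements under load and from many locations.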

The “Victoria’s Secret” incident is the classic example: the company successfully drove visitors to its site, then was unable to serve them due to poor performance.

Historically, the major market for software testing tools has been the IT, service, and software supplier organizations building applications, first for the mainframe environment, and then the client/server platform. Now with applications being built for the e-business Internet environment, developers’ priorities have shifted to time-to-market and, to some degree, the ability to test infrastructure.

In response to time-to-market pressure, existing test tool suppliers are extending their solutions and new suppliers are emerging to offer testing services. These are much less invasive than tools that need to be used by internal application development and quality assurance (QA) departments, and of course, they serve a different purpose.

E-business applications live, in part, on the Internet architecture, outside of any one organization’s control. Thus the approach of many test tools, and increasingly, service providers today is to offer some degree of capability to test the Internet architecture, usually to identify performance bottlenecks.

Price pressure is also being introduced by some of the players who have a different business model than the traditional test tool players. This may translate into greater benefits for buyers.

What follows is an overview of selected experiences of IT professionals using available testing tools and services, as well as comments from some providers serving the market today.

Bitlocker busts bottlenecks
Bitlocker is a startup whose success will in large part depend on the reliability and performance of its Web site. The company is a free Web database service, offering small businesses and consumers the ability, through a Web browser, to create a database in minutes using pre-built templates. Formed in 1998 and having attracted investments from several sources, Bitlocker launched a public beta in late December and has since formally launched its service.

“We had three key requirements” for testing, says Deanna Falcon, director of customer care and QA for Bitlocker. “First, we needed to reliably scale to any number of users we wanted. Second, the testing had to be easy to set up and run. Third, the data that came back to us about I/O, CPU use, and so on had to be worthwhile.”

Bitlocker had conducted a search to find hosted testing services that use repeatable testing scripts and that could provide in-depth results. That search led to the ActiveTest service from Mercury Interactive Corp., Sunnyvale, CA, which was in a beta test stage at the time. Even so, the results were impressive.

“We found bottlenecks with our bandwidth and hardware configuration that were resolved before they became an issue,” Falcon says. The network provider was surprised to learn about the bottlenecks as well, she says.

Currently Bitlocker runs hosted monitoring from Mercury; a transaction runs on Bitlocker’s Web site every 15 minutes from eight locations worldwide. The company is considering the purchase of Mercury’s LoadRunner software to gain complete control over when the tests are run, and has a request in to Mercury to run tests against its server. “We want to have control to run it any time we want,” Falcon says.

Mercury is working on an enhancement to enable ActiveTest users to schedule their own tests.
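The hosted-monitoring pattern Bitlocker describes—a scripted transaction fired from several probe locations on a fixed interval—can be sketched as follows. This is an illustration only: the location names, record shape, and function names are assumptions, not Mercury’s ActiveTest API.

```python
INTERVAL_SECONDS = 15 * 60  # one transaction every 15 minutes, as in the article
# Illustrative stand-ins for the eight worldwide probe sites:
LOCATIONS = ["us-east", "us-west", "london", "tokyo"]

def run_cycle(run_transaction, url, locations=LOCATIONS):
    """Fire the scripted transaction from each probe location and collect
    a pass/fail result per location for the central report."""
    report = {}
    for loc in locations:
        try:
            run_transaction(url, loc)
            report[loc] = "ok"
        except Exception as exc:
            report[loc] = f"fail: {exc}"
    return report
```

A scheduler (hosted by the vendor today, customer-controlled once self-scheduling ships) would simply invoke `run_cycle` every `INTERVAL_SECONDS`.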

According to Zohar Gilad, vice president of product marketing for Mercury, a key advantage of ActiveTest is that it allows organizations to have load testing done without needing their own expert staff.

“Many companies just don’t have the bandwidth to perform load testing for their projects to go live,” so Mercury responded with a hosted service.

Service provider selects hosted solution
A different hosted solution was the answer for the Web applications group within GTE Internetworking, in Burlington, MA. The group offers a suite of value-added, Web-based applications and integration services that enable organizations to migrate their businesses to the Web.

The development team responsible for building e-commerce solutions at the company must ensure the applications perform under load conditions. “When people shop on the Web, they want results quickly,” says John Mihnos, QA engineer with the organization. “It is imperative that we be confident that our solutions will result in the end user having a positive experience regardless of what the traffic is on the site.”

To this end, the GTE Internetworking team investigated a number of load testing tools for Web applications. They found most to be adaptations of products designed to do load testing of client/server applications. An exception was WebLoad from RadView, Lexington, MA, designed from the ground up as a Web application testing tool.

“WebLoad provided the statistics we were looking for: transactions per second, HTTP connect time, and success and failure rates,” he says. The tool uses JavaScript for its Web test scripts, which was attractive to the GTE team. WebLoad also scored well on total cost of ownership (TCO), which the team calculated from the price of the tools, estimated setup costs (including training and consulting), estimated time to develop and maintain test scripts, estimated costs of executing tests (including number of machines required, statistics gathered, and interpretation), and overall fit of the tool.
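The statistics Mihnos names are simple aggregates over raw test samples. The sketch below shows one way to compute them; the `(connect_seconds, succeeded)` record shape is an assumption for illustration, not WebLoad’s actual output format (WebLoad’s own test scripts are written in JavaScript; Python is used here purely to illustrate the arithmetic).

```python
def summarize(samples, duration_seconds):
    """Reduce raw load-test samples to the statistics named above:
    transactions per second, mean HTTP connect time, and success and
    failure rates. `samples` is a list of (connect_seconds, succeeded)
    tuples -- an assumed record shape."""
    total = len(samples)
    successes = sum(1 for _, ok in samples if ok)
    return {
        "transactions_per_second": total / duration_seconds,
        "mean_connect_seconds": sum(c for c, _ in samples) / total,
        "success_rate": successes / total,
        "failure_rate": (total - successes) / total,
    }
```

For example, three transactions completed in a two-second window—two successful—yield 1.5 transactions per second and a two-thirds success rate.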

The JavaScript support was especially appealing, because scripts the group had written in other testing tools’ languages had to be rewritten when they required revisions. “Complete rewrites of load testing scripts would be very expensive in terms of engineering time,” Mihnos says. He found the product’s overall TCO to be lower than that of any alternative.

“We were built on the Web and it shows in how we price the product,” says Andrew Cabot, vice president of sales and marketing for RadView. WebLoad is priced at $6,500 for a 100-virtual-client license. Other differentiating features of the product, he says, include:
  • The ability to pool hardware resources to run the virtual clients needed for load testing.
  • The ability to run different numbers of virtual clients at different points during a test.
  • The use of JavaScript to write test scripts.

Agents at work
Beyond availability and scalability, the ability to manage e-business response times is the third premier requirement. Response Networks Inc., in Alexandria, VA, is offering products and services focused on application performance measurement, especially of e-transaction sites. The company’s ResponseCenter can diagnose the response time of an e-transaction across networks, servers, databases, middleware objects, and application components. It does this with live agents that continually hit multiple points on the Internet and report back the results.

“The agent generates synthetic transactions,” says John Morency, executive vice president with Sage Research Inc., Natick, MA. “The objective is to test the average availability of a site over a representative time period.” He says Response Networks is one of several companies now using this active agent technique. Other companies include Inverse Network Technology, Sunnyvale, CA, a provider of service-level management software and Internet measurement services for e-commerce and other Internet applications; and Keynote Systems Inc., San Mateo, CA, a provider of Internet performance measurement, diagnostic and consulting services.

“We don’t usually see a box fall over, but we see intermittent performance degradations—brownouts—becoming increasingly important in how people manage the performance of their e-business sites,” says Ivan Shefrin, founder and chief strategist of Response Networks. ResponseCenter’s approach is to instrument an environment in which multiple infrastructures serve customers, planting a series of lightweight, thin-client software agents at distributed locations on the Internet. “They go out and create test transactions that mimic what an end user would do,” Shefrin says. The agents report back to a middleware component that stores data in a database, where it can be accessed via browsers.
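The agent pattern Shefrin describes—run a scripted user transaction, time it, report to a central collector—can be sketched in a few lines. This is a sketch in the spirit of the article, not ResponseCenter’s implementation; all names are illustrative.

```python
import time

def make_agent(location, transaction):
    """Build a thin probe agent: run a scripted user transaction,
    time it, and return a report a central collector could store.
    `transaction` is any callable simulating the end-user steps."""
    def probe():
        start = time.perf_counter()
        try:
            transaction()
            status = "ok"
        except Exception as exc:
            status = f"error: {exc}"
        return {
            "location": location,
            "status": status,
            "seconds": time.perf_counter() - start,
        }
    return probe
```

Deploying many such probes at different network locations, each reporting into a shared store, is what lets intermittent brownouts surface even when no single box ever “falls over.”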

Response Networks works closely with test tools from Mercury, Rational Software Corp., Cupertino, CA, and Lexington, MA-based Segue Software Inc., whose SilkTest products are tightly integrated with Response Networks’ as a result of a strategic relationship between the two companies.

The ability to predict the performance of an e-business application before it is deployed would be an extremely valuable capability. “I’m sure there are people working in their garages on this,” says IDC’s Heiman. “Any kind of simulated test is a good approach.”

The products available today are a good start. Still, more highly developed testing tools are needed to enable testing groups to focus their efforts where they are most needed.

The “tweener” zone
Another issue for many organizations is where the responsibility lies for availability of an application once it is deployed. That responsibility is in the “tweener zone” between application development and production operations, suggests Bruce Hall, vice president of marketing at InCert Software Inc., Cambridge, MA. Production usually is charged with monitoring whether applications are up and running, so it should be the first to know when they go down. What happens at that point? Who determines the cause of unplanned application downtime?

“The enterprise environment has been tightly controlled,” Hall suggested. “To try to convey these concepts in the hot, fast-moving world of the Internet is often a culture shock. There needs to be some maturity around this.”
What do you test for when rolling out a new e-commerce application? Post a comment below or send us an e-mail.