So your company has gone through the trouble of developing a scalable, hosted platform and the software required to develop, deploy, and maintain it. Once the hosted system is installed, the ASP (or your company, if you’re self-hosting) will turn its attention to integrating with your legacy systems.
In this article, I’ll examine the three levels of system integration and describe why an ASP might just be the adhesive that holds your corporate applications together.
This is the fourth and final installment in a series of articles discussing the creation and management of hosted environments on micro-computer platforms. In the first article, “Hosting: What is it and why should CIOs care?,” we discussed the reasons that application hosting providers and some enlightened companies would be creating hosted platforms. Over the last two weeks, we’ve looked at issues surrounding the scalability of hosted platforms in “Building a scalable network” and the software required to develop, deploy, and maintain these systems successfully in “Application hosting: How to manage your software the right way.”
Do we really care about interoperability?
A recent META Group report summarized application interoperability issues as follows:
- Large corporate IT shops ranked application interoperability as the most important issue they face after fixing their Y2K problems.
- The “applications” listed by the respondents really aren’t applications in and of themselves but are, in fact, Enterprise Application Integration (EAI) initiatives. These include Customer Relationship Management (CRM), e-commerce, knowledge management, and data warehousing.
- The largest technical challenge will continue to be access to legacy systems.
- Although many companies are still working out their strategies, the most promising technologies appear to be message brokering with XML as the payload.
If interoperability is a big issue for companies trying to connect their own systems, then imagine how much more difficult it becomes when connecting to external systems like those provided by an ASP!
Let’s discuss the basic levels at which systems can integrate before we deal with the ASP integration issues.
How do we get systems to “play together”?
Interoperability between any two systems can take place at one of three levels:
- The network level
- The data level
- The application level

Both systems must be capable of processing information at the same level in order for the systems to cooperate. Let’s look at each of these levels in more detail.
Two systems need some common characteristics at the network level before integration of any kind can take place. First and foremost, one system must be able to authenticate the credentials of the other; the second system then grants access based on those authenticated credentials. Access permissions between the two systems will likely rest on a common directory service (Novell Directory Services, Microsoft Windows NT domains, or Active Directory), different directory services joined by a common authentication protocol (such as Kerberos tickets), or standard X.509 certificates issued by one system and accepted by the other. Once this base level of authentication and access has been established, the systems can share data at the file-protocol level (HTTP, FTP, NetBIOS file sharing, NFS, etc.).
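The authenticate-then-grant sequence can be sketched in a few lines of Python. This is a minimal illustration, not a real directory service: the names `DIRECTORY`, `ACL`, `FILES`, and `read_file` are all hypothetical, and a pre-shared secret stands in for the credentials a directory service or X.509 certificate would actually carry.

```python
import hmac

# Hypothetical stand-in for a shared directory service: maps account
# names to a credential (here, a pre-shared secret known to both systems).
DIRECTORY = {
    "system1": b"s3cret-key",
}

# Hypothetical access-control list: which files each authenticated
# account may read over the file protocol (HTTP, FTP, NFS, ...).
ACL = {
    "system1": {"/pub/catalog.txt"},
}

# Stand-in for the files the second system exposes.
FILES = {
    "/pub/catalog.txt": "ISBN 0-596-00035-4\n",
}

def authenticate(account: str, credential: bytes) -> bool:
    """Step 1: verify the caller's credential against the directory."""
    expected = DIRECTORY.get(account)
    return expected is not None and hmac.compare_digest(expected, credential)

def read_file(account: str, credential: bytes, path: str) -> str:
    """Step 2: grant access based on the authenticated identity."""
    if not authenticate(account, credential):
        raise PermissionError("authentication failed")
    if path not in ACL.get(account, set()):
        raise PermissionError("access denied")
    return FILES[path]
```

Here `read_file("system1", b"s3cret-key", "/pub/catalog.txt")` succeeds, while a wrong credential or an unlisted path raises `PermissionError` — the two distinct failure modes (authentication vs. authorization) the paragraph above describes.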
In fact, this already happens every day. When you load your browser and surf to a Web site, this process happens automatically. Your PC (System 1) connects to the Web server (System 2) and is authenticated as an anonymous user. You’re then granted read access to all the HTML files on that system, delivered to you over the HTTP protocol. In theory, you could build a corporate, integrated book-buying system by having your mainframe or PC server respond to a user’s internal request to purchase a book by connecting in the background to Amazon.com, where the user could then complete the purchase. This isn’t that different from the 3270 screen scraping that many companies do today against their mainframes. If Amazon somehow knew who you were and what company you came from, the order could be processed for you automatically, without the additional logon that all e-commerce sites require today.
Once some level of network authentication has taken place, two systems can begin communicating at the data level. Data integration can take place either with native data streams (CICS, DB2, SQL Server, Oracle) using the vendor’s own driver or through vendor-neutral connectivity standards such as ODBC or JDBC. This communication can be a one-time, client-initiated request for a set of records, a continuous set of two-way conversations passing data sets, or even continuous replication between two hosts. The major disadvantage of conversations at the data level is that they can be easily broken if the data format on either side of the conversation changes.
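A one-time, client-initiated request for a set of records might look like the sketch below. Python’s built-in sqlite3 module stands in for an ODBC or JDBC connection to a remote host, and the `orders` table and its columns are hypothetical.

```python
import sqlite3

# In practice the connection would point at a remote DB2, SQL Server,
# or Oracle host via ODBC/JDBC; an in-memory sqlite3 database stands
# in here so the sketch runs anywhere.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, sku TEXT, qty INTEGER)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?, ?)",
    [(1, "BOOK-100", 2), (2, "BOOK-200", 1)],
)

def fetch_orders(min_qty: int):
    """Client-initiated request: ask the remote system for a record set."""
    cur = conn.execute(
        "SELECT id, sku, qty FROM orders WHERE qty >= ?", (min_qty,)
    )
    return cur.fetchall()

rows = fetch_orders(2)
```

Note that the conversation is exactly as fragile as the paragraph warns: rename or retype the `qty` column on the server side and the client’s query breaks immediately.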
Communications at the application level are much more robust. These conversations can take place synchronously between objects on the two systems if both use the same object specification (DCOM or CORBA) or have object-brokering software in place. Synchronous communication can also take place between two systems capable of emitting and consuming XML packages containing business process instructions. Using message-queuing technology, objects or information packages (XML or proprietary) can not only be passed but also staged for later processing in case the receiving system is not currently answering.
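The XML-plus-queue pattern can be sketched with Python’s standard library. Here `queue.Queue` stands in for real message-queuing middleware (MSMQ, MQSeries, and the like), and the `<order>` message schema is hypothetical.

```python
import queue
import xml.etree.ElementTree as ET

# Stand-in for message-queuing middleware: messages are staged here
# even while the receiving system is not answering.
broker = queue.Queue()

def emit_order(sku: str, qty: int) -> None:
    """Sender: package a business instruction as XML and enqueue it."""
    order = ET.Element("order")
    ET.SubElement(order, "sku").text = sku
    ET.SubElement(order, "qty").text = str(qty)
    broker.put(ET.tostring(order))

def consume_orders():
    """Receiver: drain and parse staged messages whenever it comes online."""
    processed = []
    while not broker.empty():
        doc = ET.fromstring(broker.get())
        processed.append((doc.findtext("sku"), int(doc.findtext("qty"))))
    return processed

emit_order("BOOK-100", 2)   # staged while the receiver is offline
emit_order("BOOK-200", 1)
orders = consume_orders()   # receiver later processes the backlog
```

Because the payload is a self-describing XML package rather than a raw record layout, the receiver can tolerate schema additions it doesn’t recognize — the robustness advantage over data-level integration.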
Can ASPs play too?
Interestingly, in order for most ASPs to “play” with corporate applications, the corporate systems themselves must have strong inter-application standards and strategies already developed. But perhaps the converse is also true: ASPs may provide the glue necessary for corporate applications to play together. How would this work?
Since an ASP has to provide access and authentication mechanisms for its customers, corporations may choose to use these mechanisms to integrate their own applications.
Let’s suppose you choose to allow an ASP to handle your ERP, CRM, and collaboration systems. If the ASP successfully takes these systems off your hands, it will have solved not only the user authentication issue but also the application integration issues. If the ASP allows your users to authenticate against its systems but then come “back into your building” and connect to your other internal systems, it has, in effect, helped you solve one of the more difficult integration tasks you’ll face. Moreover, ASPs could also serve as repositories for the credentials that get passed to external sites. In the Amazon example cited earlier, the ASP knows who the user is and can interoperate seamlessly with Amazon to complete the purchase on behalf of the individual or the company.
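One way to picture the ASP-as-credential-repository idea: the ASP signs a short assertion about the user it has authenticated, and your internal system verifies that signature before trusting the identity. The sketch below uses Python’s standard library and assumes a secret shared between the ASP and the internal system; every name here is hypothetical, and a real deployment would use certificates or a standard token format rather than this hand-rolled scheme.

```python
import base64
import hashlib
import hmac
import json

# Hypothetical secret shared between the ASP and the internal system.
SHARED_SECRET = b"asp-and-corp-shared-secret"

def asp_issue_assertion(user: str) -> str:
    """ASP side: sign a claim about the user it has authenticated."""
    claim = json.dumps({"user": user}).encode()
    sig = hmac.new(SHARED_SECRET, claim, hashlib.sha256).digest()
    return (base64.b64encode(claim).decode() + "."
            + base64.b64encode(sig).decode())

def internal_verify(assertion: str) -> str:
    """Internal system: accept the identity only if the signature checks."""
    claim_b64, sig_b64 = assertion.split(".")
    claim = base64.b64decode(claim_b64)
    sig = base64.b64decode(sig_b64)
    expected = hmac.new(SHARED_SECRET, claim, hashlib.sha256).digest()
    if not hmac.compare_digest(sig, expected):
        raise PermissionError("assertion not signed by the ASP")
    return json.loads(claim)["user"]

token = asp_issue_assertion("jdoe")
user = internal_verify(token)
```

The user authenticates once at the ASP, and every internal system that holds the shared secret can accept the resulting token — which is exactly the “come back into your building” scenario described above.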
Whether you choose to use an ASP or to self-host, you’ll still benefit from analyzing your interoperability issues from this perspective. As new systems are developed, developers and system designers should design interoperability as a feature from the beginning, rather than forcing companies to add this capability later on.
Tim Landgrave is the founder and CEO of eAdvantage, which assists small to medium enterprises in the development of their electronic business strategy.
Have you had good luck with the interoperability of your system? Tell us about it by posting a comment below. If you have a story idea you’d like to share, please drop us a note.