Tech Buyers Guide
The Challenges of Information Technology Today
If you are an IT executive today, you are very likely being challenged by issues and circumstances more complicated than anything previously faced by managers of information technology.
First among these are the ramifications of the unprecedented investment in IT infrastructure that
was made over the last ten years. The larger part of this IT infrastructure investment was directed
towards core business operational requirements such as resource planning, supply chain
management, human resource management, and customer relationship management. These
were very ambitious projects with extraordinary costs, many of which are still in the process of
being implemented and absorbed within the enterprise or at the business unit level. The
information that resides in these various applications has global value throughout the organization
but its availability is constrained by the specific functionality of each respective system and
specialized knowledge is required to access it. Because there are dozens, if not hundreds, of
these separate and diverse systems, the backlog of application integration projects and the
budget for them is growing daily.
In addition, the economy has recently experienced a significant and difficult adjustment. These
major IT investments were made during a boom period when double-digit top line growth was the
norm. This is no longer the case and the impact has been swift and harsh. Budgets have been
cut, operational resources have been scaled back and capital investment requirements are
carefully scrutinized. The mantra today is that more must be done with less.
Yet despite the economic conditions, innovations in information technology continue to advance
at an overwhelming pace. The Internet and the World Wide Web were the driving forces behind
this acceleration and they continue to be so. They provided the impetus for the distributed
computing model, which accounted for the wholesale rearrangement of the IT architectural
landscape over the last five years. And the momentum for this distributed dynamic continues to
grow, as evidenced by the emergence of Web Services, an exciting technology that represents
the programmatic enablement of the Web. To ignore these important innovations leaves one
potentially exposed to competitive disadvantage in the short and long term.
Last but not least, the Internet and the World Wide Web forever changed the behavior of people.
By empowering people with direct access to information in real time and anywhere, it raised the
expectations of information technology (and those responsible for it) way above the previous
levels of performance acceptability, in every respect. This phenomenon is impacting every aspect
of business life. Customers, management, workforce, and business partners alike now have a
much higher demand threshold for information than ever before. They want what they want, when
they want it, and how they want it. And they are making their loyalty, productivity and responsiveness contingent on this. This is the new marching order of information technology.
Information technology executives have to balance numerous competing contingencies,
expectations, constraints and opportunities. Under these circumstances tactical measures take
on strategic significance, and strategic choices can make or break a company’s future.
The Issue of Complexity
Presently, one of the most difficult problems in any organization is managing the complexity of
embedded IT infrastructure. In large companies there are typically dozens, if not hundreds, of
these systems spread throughout the organization and they provide core operational functions for
the corporate entity or an operational unit. There is nothing inherently wrong about this
proliferation of heterogeneous systems. Large corporate entities are aggregate entities containing
multiple business units with completely different business models and agendas. The IT
infrastructure should reflect the diversity and multiplicity of business requirements and these
requirements should always drive the choice of technology.
The inherent complexity of an embedded infrastructure is not simply the result of the proliferation
and operation of the systems themselves. This is certainly difficult but it is manageable. What
have become unmanageable, particularly in large organizations, are the development and
integration projects that attempt to extend the functionality of these systems or make the
information that resides in them accessible to other applications.
Everyone today understands the value of timely and accessible information. Certainly this was
the rationale for the decisions to expend tens of millions of dollars to automate core business
processes in the first place. What was not anticipated at the time was how important access to
the information would be outside of the specific functional context of the application it is
generated from. The World Wide Web has educated us in how the value of information increases
in proportion to the number of people who have access to it. And this axiom is even more
demonstrable within the dynamics of a self-defined organization that shares common functional
and economic objectives. Information flow informs opportunities and solutions and makes
efficiencies possible. Consequently, once functionally specific business processes were
automated the logical and apparent next step was to connect these isolated systems together.
However, what was also not anticipated, because such costs are never apparent until they are encountered, was how difficult and costly it would be to expose this information outside the context of a host application so that it could be accessed and used by other applications.
To date the methodology for integrating applications is based on creating point-to-point interfaces
between applications. Programming is required to access an API, convert data formats, and
exchange information. It is a tightly coupled, highly specific set of functions that exist and execute
in the form of procedural code. It is worth examining the mechanics and logistics required to
create one instance of this type of application:
- The specifications for the structured business information of the source and destination applications must be stated.
- Data format incongruities between the source and destination systems need to be identified and the mapping between them specified.
- The transport protocols for how and when to exchange the information must be determined.
- The programming or procedural requirements to accomplish the respective export, import or interfacing functions need to be determined and specified.
- The programs or execution procedures above need to be written and tested.
- The operational procedures for executing the exchange need to be documented and made part of the appropriate larger process.
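A single instance of such an interface, hand-coded from the steps above, might look like the following Python sketch. Every system name and data format here is invented for illustration; the point is that each detail is specific to this one pair of applications, which is the hallmark of tight coupling:

```python
# Hypothetical point-to-point interface: exports orders from a "source"
# system's CSV format and imports them into a "destination" system's
# pipe-delimited format. Every parsing rule, field order, and date format
# below is hard-wired to this one application pair.

def export_from_source(csv_text):
    """Parse the source system's export: order_id,amount,date (MM/DD/YYYY)."""
    records = []
    for line in csv_text.strip().splitlines():
        order_id, amount, date = line.split(",")
        records.append({"id": order_id, "amount": float(amount), "date": date})
    return records

def import_to_destination(records):
    """Emit the destination system's import: id|YYYY-MM-DD|amount (2 decimals)."""
    lines = []
    for r in records:
        month, day, year = r["date"].split("/")   # the data-format conversion step
        iso_date = f"{year}-{month}-{day}"
        lines.append(f"{r['id']}|{iso_date}|{r['amount']:.2f}")
    return "\n".join(lines)

source_data = "1001,250.5,03/15/2002\n1002,99,03/16/2002"
print(import_to_destination(export_from_source(source_data)))
```

If either system changes its format, both functions must be revisited, which is exactly the maintenance burden described above.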
The resource requirements and activities involved in point-to-point application integration development can take on a life of their own, and they do. Consider the number of possible interfaces
that would need to be developed if just ten applications were required to exchange information
with every other one. This is expressed by the N-Square equation: N*(N-1)/2, where N is the
number of interfacing endpoints. With just ten applications, forty-five interfaces would potentially
need to be written. Multiply this number by some time, manpower and cost factor representing the
cumulative effort involved in the one inter-exchange instance itemized above and suddenly the
magnitude of the problem reveals itself for what it is - overwhelming and endless. And this is just
the starting point, before anything is built. There are always changes and modifications and the
introduction of new applications. If just one application out of the ten requires modification, a minimum of nine program modules will have to be re-written.
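The N-Square arithmetic is easy to verify, and it also shows how quickly the problem compounds as applications are added:

```python
def interfaces(n):
    """Point-to-point interfaces needed for n fully connected applications:
    each pair needs one interface, i.e. n*(n-1)/2."""
    return n * (n - 1) // 2

for n in (2, 5, 10, 20):
    print(n, "applications ->", interfaces(n), "interfaces")
# Ten applications require 45 interfaces; doubling to twenty more than
# quadruples the count to 190 -- the growth is quadratic, not linear.
```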
Compounding the problem is the fact that point-to-point application integration development
requires detailed knowledge of the API for each application. It becomes necessary to continually
maintain programming specialists, or obtain the services of consultants, for every API potentially
involved in an integration scenario. Furthermore, because each integration module is uniquely
specific, the knowledge of its design, structure and functionality is usually tightly coupled to the
individual who developed it. As with all tightly coupled interactions, dependency is a defining
characteristic of the relationship and is ultimately a potential point of failure.
Point-to-point application integration has never been workable or manageable, even when using
pre-built interfaces or templates. The development process and subsequent maintenance is a
drain on time, resources and budgets and it does not scale. Its inherent linearity and complexity
renders it inadequate to efficiently and cost effectively address the application integration
demands that are a result of the rapid proliferation of IT infrastructure over the last decade.
The way that this inadequacy is compensated for is through patches, workarounds, and manual
processes. A substantial amount of any company’s operational inefficiencies can be attributed to
the inability of its IT resources to build applications or extend the functionality of existing ones on
a timely and cost effective basis. Not surprisingly, in any given company, a significant range of
processes continues to be executed on a manual basis. This unsatisfactory reality is far from the
ideal of the “real-time” enterprise that is being conceptualized today.
An Alternative Paradigm for Enterprise Application Integration
What is required to minimize or eliminate these efficiency liabilities is an alternative paradigm for
application integration development. To be truly effective such a solution must facilitate
considerable improvements in the following areas of development:
- Significantly decrease the dependency on procedural code in both the development and maintenance phases of integration projects
- Provide an alternative to tightly coupled interfaces
- Eliminate the need to know and program to APIs
- Minimize the effort required to formulate design and functional specifications
- Facilitate the modular re-use of implementation services and functions
- Use standards-based tools and specifications
Fortunately, an alternative model for application integration development has emerged and the
dominant providers of software development tools have embraced it and collaborated on
standards for the execution of its methodologies. The commercial products that embody this paradigm are known as Enterprise Application Integration (EAI) platforms or Integration Message Brokers.
The concept that defines this EAI paradigm is the elevation, and hence abstraction, of the integration process from the application programming layer to the information (document) and
integration process from the application programming layer to the information (document) and
transport (messaging) layer. At the most basic level, the EAI platform primarily functions as a
translation and transformation hub. Participant applications submit natively generated information
to it and the hub system makes the native information comprehensible, accessible and operable
to any other application on a re-usable basis. Because EAI platforms more and more frequently
use XML to expose the meaning of the information, and standards-based protocols to execute the
message inter-exchanges, the application integration process becomes transparent, modular,
and independent of the source applications. Consequently almost any application integration
requirement can be more expediently addressed with far less programming effort.
The core mechanism of the integration platform is the use of a semantic model, more and more
frequently an XML schema, to represent the functional meaning of information generated or
received by an application. These schemas are stored and made available in a repository. A
mapping tool (preferably, but not necessarily based on the XSLT standard) is used to map the
conversion of one application’s information format (based on its schema) to any other format.
These transformation maps are also stored in a repository. An inter-exchange takes place when
one application sends information to the EAI platform identified as a specific document type
(schema) that the platform recognizes as being the input to another application. The EAI platform
executes the format conversion through its mapping facility and furnishes the document to the
receiving application in the format required. While XML functionality is a feature of most EAI platforms, specialized knowledge of XML technologies such as schema construction, SOAP messaging, XPath inspection, and XSLT transformation should not be required. The EAI platform should hide the operation and complexity of these facilities from the user.
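The hub mechanism described above can be sketched in a few lines of Python. The document types, mapping function, and XML formats below are all invented for illustration; a real EAI product would hold registered XML schemas and XSLT stylesheets in its repository rather than hand-written functions:

```python
import xml.etree.ElementTree as ET

# The transformation repository: maps a (source type, target type) pair to
# a conversion. In a real platform this would hold XSLT maps keyed by schema.
transform_repository = {}

def register_map(source_type, target_type, fn):
    transform_repository[(source_type, target_type)] = fn

def hub_exchange(doc_xml, source_type, target_type):
    """Recognize an inbound document by type and convert it to the
    format the receiving application requires."""
    doc = ET.fromstring(doc_xml)
    return transform_repository[(source_type, target_type)](doc)

# A hypothetical map: a CRM's <customer> document into a billing
# system's <account> document.
def crm_to_billing(doc):
    account = ET.Element("account")
    ET.SubElement(account, "name").text = doc.findtext("fullName")
    ET.SubElement(account, "ref").text = doc.findtext("customerId")
    return ET.tostring(account, encoding="unicode")

register_map("crm.customer", "billing.account", crm_to_billing)

crm_doc = ("<customer><customerId>C-17</customerId>"
           "<fullName>Acme Corp</fullName></customer>")
print(hub_exchange(crm_doc, "crm.customer", "billing.account"))
# -> <account><name>Acme Corp</name><ref>C-17</ref></account>
```

Note that neither application knows anything about the other; each only knows its own document format, and the hub owns the mapping.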
The flexibility and efficiency of this information hub model for application integration represents a
significant improvement over conventional point-to-point methods if only because of its inherent
ability to facilitate many-to-many inter-exchanges. For example, a source application could
generate a single, large document that contains information that will be used differently by
numerous other applications. This document can be distributed, through a “publish and subscribe” function that is common to many EAI platforms, to multiple subscriber destinations (folders, URLs, message queues) where the information in each instance of the document is extracted and transformed according to the specific requirements of the subscriber application.
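A minimal publish-and-subscribe dispatch can be sketched as follows; the subscriber names and the order document are hypothetical, and a real platform would deliver to queues or URLs rather than return a dictionary:

```python
# Minimal publish/subscribe sketch: subscribers register an interest in a
# document type together with a per-subscriber transformation; publishing
# one document fans it out to every subscriber in its own required shape.
subscribers = []

def subscribe(doc_type, name, transform):
    subscribers.append((doc_type, name, transform))

def publish(doc_type, document):
    deliveries = {}
    for sub_type, name, transform in subscribers:
        if sub_type == doc_type:
            deliveries[name] = transform(document)  # subscriber-specific view
    return deliveries

# Two hypothetical subscribers extract different fields from one order.
subscribe("order", "billing", lambda d: {"id": d["id"], "total": d["total"]})
subscribe("order", "logistics", lambda d: {"id": d["id"], "ship_from": d["warehouse"]})

order = {"id": "A-9", "total": 120.0, "warehouse": "East"}
print(publish("order", order))
```

The source application publishes once; each subscriber receives only the slice of the document it needs, in the form it needs.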
An examination of how EAI platforms using an architecture based on XML technology can
address the development inefficiencies itemized above provides a strong indication of their
potential for providing quantum leap improvements in application integration productivity.
By assigning semantic value to information, and separating and exposing the information (with its
semantic definition) apart from the host applications that generate it, it is possible to use
standards-based XML tools to access, process and transform the information based on its
semantic content. This radically changes the development process. There is no longer a need to
write code (or at least far less of it) to access, map and convert data formats. Further, there is less of a need to understand the APIs of dozens, if not hundreds, of applications. The tightly coupled, coded interfaces between applications, which require inordinate amounts of specialized programming to create, no longer exist in this paradigm. The information is now de-coupled from its sources, and available for any-to-any exchanges without coding and its attendant design and maintenance burdens.
It is also worth noting the complete independence and modularity of each integration coupling. A
change to one side of an inter-exchange implementation has no direct impact on the logic,
function, structure or integrity of the objects on the other side. In conventional application
integration development an inter-exchange is embodied in a singular program module that
incorporates the structure of the endpoint objects, conversions of formats and any other relevant
implementation requirement. If a modification is required to any one facet of the integration module, then the integrity of the entire inter-exchange implementation can be, and frequently is, compromised. The risk of introducing unexpected behavior when modifying code has always
been a pitfall of software development and it accounts for the trepidation and resistance to any
large-scale integration initiative based on conventional coding methods. The loosely coupled
architecture that characterizes XML-based EAI platforms largely eliminates this risk, which in and
of itself is an extraordinary benefit.
Features of Business Process Integration Platforms
EAI platforms provide functional support for integration inter-exchanges that goes beyond the core mechanism of creating and exposing the semantic models of information and mapping coupled transformations. In general, comprehensive EAI platforms are comprised of the following components:
- Messaging encapsulation facilities with support for receipts and reliable messaging
- Built-in Web Server support for URL listening using programmatic (ASP/JSP) page processing
- Folder and queue polling functions for document routing
- Support for HTTP, SMTP and Message Queues for messaging transport
- A virtual repository for storing schema and transformation specifications
- Document transformation using mapping functions
- Logical endpoints and connector functions that represent message destinations and processing nodes where document exchanges can be validated, authenticated and processed
- A standards-based adapter framework
- Specific adapters to applications and data sources
- Event monitoring facilities
This list represents a broad diversity of tools and services, reflecting the myriad integration challenges that will confront most distributed, heterogeneous organizations over time.
The EAI Platform Services Stack
A comprehensive EAI solution resembles a sophisticated and flexible toolset that allows different
technologies and services to be utilized in different circumstances to achieve an organization’s
overall integration objectives. Given the tremendous diversity of organizational structures,
determined standards, installed applications and systems, and operational processes, the tools in
the toolbox that will be utilized on a given project will vary dramatically. However, there is a fairly
logical hierarchy to those services, whereby the most common services that are required for most
or all integration projects should exist at the lowest level of the stack, allowing more sophisticated
services to be built upon them.
More specifically, in a comprehensive EAI solution, certain extremely common services can and
should be leveraged in the underlying platform. Services such as basic network connectivity for
support of transports such as HTTP and message queuing, security services, and data parsing
and validation services for business documents based on XML standards should exist in the
underlying operating system. Having these services predictably present, as a result of being developed and tested in the core platform, lowers the cost and complexity of the resulting solution and therefore reduces risk.
There are many other technologies and services that will be needed in a comprehensive EAI
solution over time. The more specialized, integration-specific tools and services should be able to gracefully leverage those lower level services, which should be accessible not only to the integration tools but also to the organization’s more general application development efforts.
Capabilities such as a standardized deployment environment, management, monitoring, and data
replication and synchronization are quite important to integration projects, but represent more
general-purpose services that should exist in a more horizontal “layer” of an organization’s
services stack. Though these are important elements of a comprehensive EAI offering, allowing the integration-specific tools and services to leverage a more general set of infrastructure services for accomplishing these tasks lowers the complexity and cost of the overall IT environment in question. Another key recent technology development that is critical to the overall integration challenge is the progress on standards-based XML Web Services.
The Role of Web Services in EAI Platforms
The significance of the Web Services protocols is that they provide the first truly workable
architecture (from the perspective of their design simplicity, platform independence and use of
HTTP) for building highly distributed computing processes. Just as web servers and browsers facilitated the explosive communication and distribution of information among people, Web
Services will facilitate the wide-scale proliferation of automated and distributed interactions
among applications and devices. Because Web Services leverage the scope of the Internet, the
ramifications of this technology are potentially disruptive on many levels. However, it is important
to consider that while Web Services are immensely valuable in the context of standards-based
application connectivity, they are simply an enabling technology upon which other higher level
services depend. In other words, Web Services address an important (lower level) aspect of standardizing application access and integration, but other complementary services are required to complete an overall integration strategy. These higher level complementary services include capabilities such as state management of long-running processes, transaction scoping, robust process-level error and exception handling, and process task flow management such as concurrency and branching, all of which are the purview of EAI solutions.
Web Services implementations today normally leverage two primary protocols, the Web Services Description Language (WSDL) and the Simple Object Access Protocol (SOAP). WSDL is the
specification in XML for describing what a program does and how to communicate with it. A
WSDL document resides at a URL and is “linked” to the actual program module, which is typically located elsewhere. In addition, a WSDL document also informs you of what information
you have to provide the program to do its job, and what information the program will respond with.
The way that this information is conveyed to and from the program is through a SOAP message.
SOAP is the specification, also in XML, for the message format in which the variables and
parameters required by the program are sent to it, through the WSDL intermediary, to invoke its
methods. The program in turn, sends the results of its process back to the request originator in
another SOAP message. A SOAP message is simply an XML formatted text document
encapsulated with an HTTP header for transport through the Internet. The content and
construction of a SOAP message is determined by the instructions found in the specific WSDL
document. In short, WSDL describes the standard interface to an application in an open way,
while SOAP determines the message format that will be used to access that interface.
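A SOAP request really is just structured XML carried over HTTP. The operation name, namespace, and parameter below are invented for illustration, but the Envelope/Body structure follows the SOAP 1.1 convention:

```python
import xml.etree.ElementTree as ET

SOAP_NS = "http://schemas.xmlsoap.org/soap/envelope/"

def build_soap_request(operation, params, target_ns):
    """Build a SOAP 1.1 envelope invoking `operation` with `params`.
    The operation and parameter names would normally be dictated by the
    service's WSDL document."""
    ET.register_namespace("soap", SOAP_NS)
    envelope = ET.Element(f"{{{SOAP_NS}}}Envelope")
    body = ET.SubElement(envelope, f"{{{SOAP_NS}}}Body")
    op = ET.SubElement(body, f"{{{target_ns}}}{operation}")
    for name, value in params.items():
        ET.SubElement(op, f"{{{target_ns}}}{name}").text = str(value)
    return ET.tostring(envelope, encoding="unicode")

# A hypothetical operation a WSDL document might describe: GetOrderStatus(orderId).
msg = build_soap_request("GetOrderStatus", {"orderId": "A-9"},
                         "http://example.com/orders")
print(msg)
```

The resulting text, wrapped in an HTTP POST, is the entire wire format; the response travels back as another SOAP envelope in the same shape.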
Both of these functions, individually and combined, can potentially facilitate the integration of
different application programs and computing platforms, automate workflow processes, and
coordinate business interactions among multiple participants. The fundamental integration
mechanism is very straightforward: any WSDL request/response operation or SOAP messaging
event can be the input to, or output of, any other WSDL operation or SOAP messaging event. By
pipelining and coordinating operations, messaging events, and transformations in this way it will
eventually be possible to assemble complex interactions and processes.
What is ultimately envisioned for Web Services is the automatic determination at run-time of any
process instruction set that governs the execution behavior of the interacting Web Services, as
well as any security protocol negotiations. The Web Services would then be executed according
to the instruction set on a spontaneous, ad-hoc basis without requiring a design-time
configuration. This will be accomplished by adding a Web Services process language and
security protocol on top of the WSDL and SOAP protocols. Work on creating standardized
protocols for both of these functions is in progress under the auspices of the relevant standards bodies.
It is important to keep in mind that a Web Service is nothing more than program code whose
methods are exposed through XML at a URL location and are invoked by an XML message
posted to the URL address. As with any other application, its secure and reliable execution is a
function of context, that is, the operational support infrastructure that it executes under. And this
is exactly what an EAI platform is designed to provide.
The way to approach Web Services technology right now is to recognize that although the run-
time protocol infrastructure to support secure, orchestrated and reliable execution of programs
based on Web Services is not yet in place, the infrastructure of an EAI platform provides these deployment and execution capabilities presently within a design-time framework. In this respect
EAI platforms complement and facilitate the adoption of Web Services by providing the real-world
implementation support for functions such as state management of an ongoing process, process-
level errors and exception handling, and complex transaction management.
How to Evaluate an EAI Platform
It should be apparent from the information presented here that an EAI platform becomes part of
the embedded IT infrastructure. By definition it is a vital function that supports numerous
dependent operations. Consequently, choosing the right platform is a critical strategic decision. A
thorough investigation of an EAI platform’s architecture and design, implementation requirements
and performance attributes should be undertaken. Some of the important criteria that should be
taken into consideration when evaluating an EAI platform are:
Purity of Approach: Consider whether the platform was designed with an XML semantic model
and loosely coupled inter-exchange paradigm from the ground up. Most previous generation EAI
platforms are based on the application-to-application integration model with a proprietary
translation function in the hub. While vendors of these platforms are incorporating the newer
concept they have a vested interest in maintaining the value of their proprietary solutions.
Support for reliable messaging, failover, persistence, and recovery capabilities: Reliable messaging is a platform’s ability to guarantee message delivery on an “at least once” or “once and only once” basis with acknowledgment and unique correlation capabilities. Failover is the
incorporation of pervasive facilities throughout the platform to support alternate means of
delivering or receiving messages automatically. Persistence and recovery is the platform’s ability
to keep track of the progress of every message exchange and maintain its state and guarantee
completion at any time. It has to be assumed that integration inter-exchanges will support core
business functions that have financial ramifications. Every messaging instance needs to be
considered mission critical. Consequently the EAI platform should have built in and well thought
out facilities for these features.
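The “once and only once” guarantee is typically achieved by combining at-least-once redelivery on the sending side with duplicate suppression, keyed on a unique correlation id, on the receiving side. A simplified sketch, with hypothetical message contents:

```python
# Simplified "once and only once" delivery: the sender may redeliver a
# message whose acknowledgment was lost (at-least-once), and the receiver
# suppresses duplicates by correlation id, so the business effect is
# applied exactly once.
processed_ids = set()   # in a real platform this state would be persisted
results = []

def receive(message):
    msg_id = message["id"]
    if msg_id in processed_ids:        # duplicate redelivery: ack, don't reapply
        return "duplicate-acknowledged"
    processed_ids.add(msg_id)
    results.append(message["payload"]) # apply the business effect once
    return "acknowledged"

# The same message arrives twice, e.g. because the first ack was lost.
print(receive({"id": "m-1", "payload": "post invoice 1001"}))
print(receive({"id": "m-1", "payload": "post invoice 1001"}))
print(results)
```

Persisting `processed_ids` and the in-flight message state is what allows the platform to recover and guarantee completion after a failure.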
Security: Because an EAI platform is an information routing hub tied to numerous other systems, access to and control of its components and functions needs to be highly secured. Any EAI platform needs to have pervasive authentication and encryption capabilities.
Considerations of form, function and design: Most EAI platforms look conceptually similar on
paper but differ considerably in their design, structure, operation, user interface, functional
utilities, and organization. These factors will significantly impact the efficiency and overhead of
implementing an EAI platform. Working with a prospective EAI candidate platform to build a
“Proof Of Concept” (POC) application is highly recommended.
Robust adapter framework that enables a third party market for adapters: Investigate whether a wide range of adapters is available for the platform. Adapters are pre-built integration modules and templates for accessing the data structures of legacy systems and messaging formats such as EDI. An adapter should integrate cleanly and directly into the EAI platform and provide plug and play capabilities. The presence of a third party market for adapters for a given EAI platform encourages competition, which increases breadth of choice and quality while driving down cost.
Scalability: Scalability is another pervasive design consideration. The ability to scale both the individual functions of the platform incrementally and the platform in its entirety should be a prominent design factor.
Same Platform-to-Platform Interoperability: This is not an obvious function but it can have
significant ramifications. When evaluating an EAI platform investigate what was envisioned for
same platform communication, replication and integration capabilities. This design feature is a
significant consideration as it applies to scalability, recovery and interoperability issues.
Vendor credibility, reliability and support: This is an important consideration for any strategic
information technology investment, but even more so for an EAI platform because of the central
role that an EAI platform will have in an IT operation.
Total Cost of Ownership: From a direct cost perspective, EAI platforms are an excellent value. They cost significantly less than the previous generation of EAI platforms because they use a paradigm that takes advantage of standards-based technology rather than depending on proprietary technology, as the previous generation did. However, it is the implementation and
continuing operational expenses that make up the greater portion of information technology costs.
In this respect as well, the new generation of XML based EAI platforms offer significant value
because their methodology and architecture minimize the dependency on specialized knowledge
resources for both the applications being integrated and the operation of the platform itself. This
factor in itself figures prominently in reducing the total cost of ownership.
For application integration requirements alone, the immediate, ongoing and cumulative return on
investment that can be derived from this technology is potentially extraordinary, and it is certainly
where the need for development efficiency and productivity is most pressing. However, the value
of the XML semantic inter-exchange paradigm goes well beyond its applicability for enterprise
application integration. The very same model provides the basis for automated business-to-
business (B2B) interactions that can be defined and implemented in an equally efficient manner.
The same applies to automating business processes.
B2B inter-exchanges have stronger requirements for authentication and encryption, as well as for standard Internet transport protocols, than internal application integration scenarios. For automating business processes the semantic inter-exchange model has to incorporate and be directed by sophisticated business rules, concurrency and contingency logic, and exception handling routines, all of which are available in the tool sets of most EAI platforms.
A robust and well thought out EAI platform is suitable for, and capable of modeling, almost any business behavior, whether it is a purely internal function or involves external interactions.
Consequently the investment made in this technology is highly leveraged. The expertise and
knowledge acquired from applying this technology to application integration activities will translate
directly to almost any other development requirement, with equally impressive results.
Standardizing an organization’s development methodologies around the XML semantic inter-
exchange paradigm is the first step towards achieving the ideal of the “real-time” organization. By
exposing the meaning of information and the functions of applications so that they are fully accessible and comprehensible throughout the organization, an EAI platform can easily generate applications and processes among disparate and distributed computing resources simply by assembling “loosely coupled” computing, transformation and communication events.
EAI platforms that are based on the semantic inter-exchange paradigm have unique application
integration development capabilities because they leverage the combined versatility of XML, Web
Service technologies, URL addressing, and HTTP transport.
Combining these standards-based technologies with additional support for security, failover,
persistence, and recovery, results in an application development and execution environment that
takes full advantage of the most compelling innovations in distributed computing yet provides the
operational assurances necessary for real-world information processing.
With the power of this technology at their service, IT managers can now confidently meet and
exceed the heightened expectations for the delivery of timely and useful information wherever
and whenever it is required.