No one can question the need for integrating applications. It is the only way to extract the maximum value out of the applications an organization has invested in. Though integrating applications calls for highly skilled resources and large investments, it is central to an organization's IT strategy and contributes directly to making the business smarter. While the concept of integration has been around since the inception of IT, the ability to embrace applications within and outside the organization, such that data, applications and business processes can be shared, is a new possibility. Needless to say, the Internet and related technologies have made this possible.
The traditional view on integration deals with inter-application messaging. But new age integration envisages going beyond that stage and has a direct impact on the way business is carried out. Says Shamik Mehta, president, Asia Pacific, TIBCO: “The ability to get new business-level applications into play as early as possible implies the need for a solid development environment where application adapters can be readily built and deployed. A toolkit that allows the enterprise to develop its own toolbox is absolutely critical as is the availability of standard adapters for a broad range of applications.”
Integration is a business issue as much as a technical one. Says Alok Kumar, director-IT, Tata Teleservices (M): “Many companies look at the technology and then find a business problem it could solve ... in integrating applications, this approach would lead you nowhere.” Also, the project has to be owned by the business units and functions that are being integrated. Notes a report from Butler Group: “Selecting an integration framework requires a thorough understanding of the business processes and an analysis of the IT infrastructure, so the integration strategy can be successfully aligned with both.”
The promise of enterprise application integration (EAI) is to provide a central point capable of integrating applications, processes and people. An EAI framework will simplify connections, providing a central point of management and making it easy to create a plug-and-play integration architecture.
The goal of EAI is to share data and processes between various applications and data sources in the enterprise. Integration happens at three levels: data, application and process. Data-level integration establishes links at the database level.
Application-level integration provides access to the data and business logic contained in the application. Process-level integration addresses the issues of business processes crossing application boundaries. (See box titled ‘The Three Levels Of Integration’.)
The mechanics of integration
The integration challenge, often referred to as ‘stovepipes’, ‘islands of information’, ‘vertical application silos’, is tackled by a diverse set of tools and technologies that have evolved out of the general category of middleware and have now been grouped under EAI tools. Whereas traditional middleware facilitates the integration of individual applications and discrete transactions between them, EAI enables an enterprise to manage relationships among multiple applications and the surrounding network of transactions that constitute a business process. It is essential to keep in mind that it is the chaining together of discrete transactions in the form of a business process from one application to the next that constitutes EAI. The amount of custom coding required to effectuate EAI with traditional middleware far surpasses that required through the utilization of special EAI tools like the ones from TIBCO and IBM.
So, how is integration between applications achieved? These EAI tools operate in three layers:
- The business process layer, which automates or replicates a business process.
- The business rules layer, which dictates the rules according to which a business process is to be executed.
- The data transformation layer, which interprets and transforms large volumes of data into a usable format (often XML) that can be communicated between applications.
Finally, the transportation layer, which is message-oriented middleware, handles the routing of messages and guarantees their delivery between applications.
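To make the data transformation layer concrete, here is a minimal sketch (plain JDK, with invented field names) of turning a flat order record from one application into a neutral XML message another application could consume; a real EAI tool would do this through configurable mappings rather than hand-written code.

```java
import javax.xml.parsers.DocumentBuilderFactory;
import javax.xml.transform.*;
import javax.xml.transform.dom.DOMSource;
import javax.xml.transform.stream.StreamResult;
import org.w3c.dom.Document;
import org.w3c.dom.Element;
import java.io.StringWriter;
import java.util.Map;

public class OrderToXml {
    // Transform a flat, application-specific record into a neutral XML message.
    static String toXml(Map<String, String> order) throws Exception {
        Document doc = DocumentBuilderFactory.newInstance().newDocumentBuilder().newDocument();
        Element root = doc.createElement("Order");
        doc.appendChild(root);
        for (Map.Entry<String, String> field : order.entrySet()) {
            Element e = doc.createElement(field.getKey());
            e.setTextContent(field.getValue());
            root.appendChild(e);
        }
        StringWriter out = new StringWriter();
        TransformerFactory.newInstance().newTransformer()
                .transform(new DOMSource(doc), new StreamResult(out));
        return out.toString();
    }

    public static void main(String[] args) throws Exception {
        // Hypothetical order record as it might come out of an order-entry system.
        System.out.println(toXml(Map.of("OrderId", "A-1001", "Customer", "C-42", "Amount", "7500")));
    }
}
```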
Currently, in a distributed application architecture, these layers together form an independent tier of computing called the integration layer. This is where applications plug and play. Why is this layer important? Because this is the layer where an SCM can talk to a CRM, where an analytic CRM tool can talk to a custom-developed marketing application, where a small ERP like Navision used by the Indian operations of an MNC can talk to the SAP system used by the worldwide HQ. Understand that at this level new islands of information are not created–something that typically happens when point-to-point integration is achieved by authoring individual APIs between two programs.
There is more to this layer: event-driven processing. A “business event” is a piece of information generated by an application (such as an order entry by a customer) that must be shared with other applications (such as a requisition system with a supplier). Middleware has long facilitated the transportation of this information on a point-to-point basis, with the assistance of considerable custom coding. Today, this layer allows business analysts, not custom developers, to configure applications so that “business events”, once published into the separate integration infrastructure, become available to all the other applications within the enterprise that need to be aware of them–hence the phrase event-driven processing. (See box titled ‘Event-oriented and Service-oriented Architectures: Which One To Choose?’)
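A minimal sketch of the publish-and-subscribe idea behind event-driven processing, assuming an in-memory bus and invented topic names; in practice messaging middleware plays this role, but the shape of the interaction is the same: the publisher raises a business event once, and every interested application receives it.

```java
import java.util.*;
import java.util.function.Consumer;

// Toy integration layer: applications publish business events to a topic, and every
// subscribed application receives them, without the publisher knowing the consumers.
public class EventBus {
    private final Map<String, List<Consumer<String>>> subscribers = new HashMap<>();

    public void subscribe(String topic, Consumer<String> handler) {
        subscribers.computeIfAbsent(topic, t -> new ArrayList<>()).add(handler);
    }

    public void publish(String topic, String event) {
        for (Consumer<String> handler : subscribers.getOrDefault(topic, List.of())) {
            handler.accept(event);
        }
    }

    public static void main(String[] args) {
        EventBus bus = new EventBus();
        // Two downstream applications interested in the same business event.
        bus.subscribe("order.created", e -> System.out.println("Requisition system saw: " + e));
        bus.subscribe("order.created", e -> System.out.println("Billing system saw: " + e));
        // The order-entry application publishes once; both consumers react.
        bus.publish("order.created", "<Order><OrderId>A-1001</OrderId></Order>");
    }
}
```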
Tools of the trade
New generation EAI tools automate business processes by taking advantage of the separate tier of computing for the integration infrastructure. Simply put, rather than having business processes hard-coded into applications, the business processes are configured by business analysts into the tool itself. In other words, the business process, like the integration itself, is “decoupled” from the native application. This helps in business process management (BPM). The advantage of this approach is that new processes can be configured into the integration infrastructure on the fly without having to rewrite any of the distributed applications. (See box titled ‘Managing the BPM Way’)
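One way to picture this decoupling: the process definition lives as configuration data in the integration tier, while the applications only contribute individual steps through adapters. The sketch below is illustrative, with invented step names, and is not any vendor's actual BPM engine.

```java
import java.util.*;
import java.util.function.UnaryOperator;

// Toy business-process engine: the process is a configurable, ordered list of named
// steps, each backed by some application adapter, so the sequence can be changed
// without rewriting the applications themselves.
public class ProcessEngine {
    private final Map<String, UnaryOperator<String>> steps = new HashMap<>();

    public void registerStep(String name, UnaryOperator<String> adapter) {
        steps.put(name, adapter);
    }

    // The process definition is just data: an ordered list of step names.
    public String run(List<String> processDefinition, String message) {
        for (String step : processDefinition) {
            message = steps.get(step).apply(message);
        }
        return message;
    }

    public static void main(String[] args) {
        ProcessEngine engine = new ProcessEngine();
        engine.registerStep("validate-order", m -> m + " [validated]");
        engine.registerStep("check-credit", m -> m + " [credit ok]");
        engine.registerStep("notify-warehouse", m -> m + " [warehouse notified]");
        // Reordering or extending this list reconfigures the process on the fly.
        System.out.println(engine.run(
                List.of("validate-order", "check-credit", "notify-warehouse"), "Order A-1001"));
    }
}
```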
The complete EAI tools suite has toolsets for messaging, adapters, message transformation, business flow coordination, event notification, and monitoring and management. Most EAI tool vendors have substantially revised their product architectures or are in the process of doing so. New features have been added: Web services, business-to-business protocols, application-specific adapters, adapter development toolkits, publish-and-subscribe messaging, event handling, business activity monitoring (BAM), and BPM. Integration vendors continue to innovate to include new capabilities from emerging architectures.
Integration technology is now available from specialist integration middleware vendors like TIBCO, IBM, SeeBeyond, Mercator and others; application server vendors like IBM (WebSphere), BEA (WebLogic) and others; and packaged application vendors. The involvement and interest shown by the latter is proof of the importance of the integration market. It is critical for enterprises to understand the technical evolution of integration middleware so that architects and developers are able to select and use the right tools.
One important trend worth noting concerns the role of the network. It is a given that in the context of enterprise-wide application integration we are talking about applications that run on the network. In traditional architecture, all the data and logic (which constitute the intelligence) are resident within the application; the network is just a transport mechanism. The next generation of information systems will see a departure from this trend. Says Dilip Kumar, managing director, Datacraft India: “The business information system or network, as we may call it, would be intelligent in that it would be process-aware and application-aware. Some functions like content-based routing, messaging, and BPM would be partially handled by the network. This is called application networking.” Another reason the network becomes all-important is that data, in its myriad forms and formats, relies on the network. Data relevant to multiple business units and processes is shared in operational data stores, message warehouses, data warehouses and metadata repositories–all of which sit outside the applications.
The architecture imperative
For the CIO, selecting the right enterprise IT architecture is the biggest bet. Reason: the CIO would fail miserably if the selected architecture is not able to deliver business benefits or yield commensurate returns on IT investment. More difficult than selecting a new IT architecture is proposing a change to the existing one. Imagine the CIO asking for an investment of Rs X crore to change the current architecture on the strength of a technology argument alone. Not only would the management question the wisdom of the proposal and most likely shoot it down, it would also have the CIO dusting off his CV and renewing contacts with search firms.
Enterprise IT architecture has hitherto been positioned as a technical project while in reality it is the enabler for evolving business models and strategies.
What then is an enterprise IT architecture? Simply put, it is a combination of technologies, models and design patterns put together to deliver multiple business processes and the orchestration between them. The ‘technology’ part is the standard shopping list comprising servers, development tools, application software, middleware and others. The ‘modeling’ part is the ‘this-is-the-way-it-is-to-be-done’ part, wherein templates and blueprints in the form of object models, data models, process models and others are created. The ‘design’ part is the philosophy or style in which it would get done: centralized/distributed, n-tier, fat client/thin client, zero-latency, straight-through-processing, event-driven or service-oriented, real-time and so on. The reason for discussing enterprise architecture here is that the choice of design principles, as alluded to above, has the most significant bearing on the business and what it expects out of IT.
This is particularly relevant for the Indian CIO–some of whom have multiple applications, integrated or otherwise, and many of whom are evaluating what to do next after the ERP implementation. Design principles, therefore, become very important as one proceeds along the continuum–from ERP as the backbone to SCM/CRM/PDM and others that give functional reach and richness to the next generation business information system (analyst firms have started calling it the ENS, or Enterprise Nervous System).
A special case of an architecture having a business case is that of the real-time enterprise (RTE). Understand that an RTE does not mean that the enterprise is functioning at real-time latency. It only means that the system has been tuned to remove information lags and made as responsive as the business situation demands. Supply chain strategies provide an excellent example of RTE in practice and illustrate its dependence on architecture. According to a Gartner report, an RTE supply chain can do the following: quickly expand or contract production based on real-time inventory data, rapidly shift production from one product line to another based on real-time sales data, model and forecast two weeks into a quarter rather than two months into a quarter, help enterprises exceed earnings targets, hand over industrial design work from one location/time zone to another, and more.
Impact of Web services on application integration
Any current discussion on EAI would be incomplete without understanding the implications of Web services. The promise of Web services is that software components appear as services to other components and users through the medium of well-defined interfaces and message protocols. While the idea is alluring and sounds ridiculously simple, it does not by itself achieve the same effect as application integration. Says Dr Hemant Adarkar, CTO, Apar Technologies: “The silver bullet is that Web services would lower the cost of integration, but this is not true because the initiative is mute in many vital areas.” The biggest lacunae are the lack of a single management platform and the lack of agreement on standards. Adds Adarkar: “What is required within the Web services arena is the ability to have a centrally-run application which talks to multiple applications ... unfortunately all the stakeholders are silent about this area.”
While Web services cannot fully replace a middleware-oriented EAI approach, piecemeal integrations can be enabled, thanks to some degree of basic standardization. Says Kumar, “Undeniably, everything is moving towards Web services. While traditional integration methods have their value, Web services would make integration a less formidable project to approach.”
Apart from XML, a meta-language that lends itself to creating other languages and protocols, the key standards are: Web Services Description Language (WSDL), which describes a service; Simple Object Access Protocol (SOAP), which provides a mechanism for packaging messages once services understand each other; and Universal Description, Discovery, and Integration (UDDI), which supports the creation of a directory of services so that they can be located. Notice that there is not yet an agreed standard for business processes. Philosophically, the Web services approach is a federated one, based on a service-oriented architecture (SOA). It is providing momentum to SOA as a design principle, and most EAI vendors are now adopting Web services standards. At the very least, Web services–with the hype and hoopla surrounding them–will get the credit for catapulting the issue of integration to the top of the CIO's agenda.
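To show what the plumbing looks like at the wire level, here is a rough sketch of invoking a Web service: a hand-built SOAP envelope posted over HTTP. The endpoint, namespace and operation are hypothetical; in practice a client stub would be generated from the service's WSDL.

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class SoapCall {
    public static void main(String[] args) throws Exception {
        // Hypothetical operation and namespace; the service's WSDL would normally describe these.
        String envelope =
            "<soap:Envelope xmlns:soap=\"http://schemas.xmlsoap.org/soap/envelope/\">" +
            "  <soap:Body>" +
            "    <GetOrderStatus xmlns=\"http://example.com/orders\">" +
            "      <OrderId>A-1001</OrderId>" +
            "    </GetOrderStatus>" +
            "  </soap:Body>" +
            "</soap:Envelope>";

        // Post the envelope to a hypothetical endpoint and print the raw response.
        HttpRequest request = HttpRequest.newBuilder(URI.create("http://example.com/orderService"))
                .header("Content-Type", "text/xml; charset=utf-8")
                .POST(HttpRequest.BodyPublishers.ofString(envelope))
                .build();
        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println(response.body());
    }
}
```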
Easwar S Nair in Mumbai
The Three Levels Of Integration
- Data-level integration establishes links at the database level. This requires understanding both the database structure and the way in which the data is to be used. It is common for this type of solution to create an enterprise-wide metadata repository that describes the structure and usage of data throughout the organization. It will also define the mappings and necessary transformations required to move data between applications. The actual links can be either point-to-point or can address the repository itself, which acts as a virtual database for the organization.
- Application-level integration provides access to the data and business logic contained in the application. It enables integration of messages from any source, often using an asynchronous message flow, to reduce dependency between applications. Message brokers can transform from one message format to another and route and store messages as required. Consumer applications will generally use adapters, which can be of several types (an application programming interface (API), a component, a database adapter, or a user-interface adapter), to receive and process these messages. This type of architecture scales well and offers guaranteed message delivery. (A rough sketch of the broker-and-adapter idea follows this box.)
- Process-level integration addresses the issues of business processes crossing application boundaries. This approach begins with defining the underlying business process, which might be the processing of a credit account application. Once the process has been defined, the underlying applications are integrated into this process, typically using adapters and a message broker to transmit the information.
Source: Butler Group
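As referenced in the application-level item above, here is a rough, illustrative sketch of a message broker that transforms each message into a consumer's expected format and delivers it through that consumer's adapter (the formats and adapter behaviours are invented for illustration).

```java
import java.util.*;
import java.util.function.Consumer;
import java.util.function.UnaryOperator;

// Toy message broker: it transforms each incoming message into the format a
// consumer expects and delivers it through that consumer's adapter.
public class MessageBroker {
    private final List<Map.Entry<UnaryOperator<String>, Consumer<String>>> routes = new ArrayList<>();

    public void register(UnaryOperator<String> transform, Consumer<String> adapter) {
        routes.add(Map.entry(transform, adapter));
    }

    public void deliver(String message) {
        for (var route : routes) {
            route.getValue().accept(route.getKey().apply(message));
        }
    }

    public static void main(String[] args) {
        MessageBroker broker = new MessageBroker();
        // One consumer wants upper-case text, another wants an XML wrapper.
        broker.register(m -> m.toUpperCase(), m -> System.out.println("Legacy adapter got: " + m));
        broker.register(m -> "<msg>" + m + "</msg>", m -> System.out.println("ERP adapter got: " + m));
        broker.deliver("order a-1001 created");
    }
}
```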
Event-oriented and Service-oriented Architectures: Which One To Choose?
Event-oriented and service-oriented are two design approaches to applications and the way they interact. Says Hemant Adarkar, CTO, Apar Technologies: “Traditional EAI using TIBCO, MQ Series and the like largely rests on event-driven philosophy, but service-oriented constructs are getting increasingly mainstream.” Clarifies Anup Varma, country manager, TIBCO India: “It is not a one versus the other kind of a debate. There are commonalities and differences between the two approaches and they are quite complementary.”
Both are ways of combining multiple software modules into large distributed applications. They differ in the way they organize the relationships among the modules, and both are conceptual design patterns that can be implemented with many different kinds of middleware. A service-oriented architecture (SOA) is based on interfaces, which are its basic design element. SOA interfaces are one-to-one connections; the flow is routed by the service request and is closed to new inputs once the flow has started. Event-driven architecture (EDA) supports many-to-many connections; the flow is determined by the content of the message and can react to new inputs.
Developers use SOA when the nature of the business problem requires a request/response relationship. EDA is used when a business problem needs to understand what is happening in the application. Applications that depend on events typically need to dynamically add, drop or modify steps in processing, take in external input that arrives unpredictably, or run multiple process threads simultaneously.
The good news is that EDA and SOA are compatible and, in fact, complementary. A transaction may begin in EDA and end in SOA and vice versa–developers and business analysts need to understand the business requirements and process models to determine the right choice, or combination, for each step of each business process.
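A compact way to see the contrast (the interface, topics and payloads are invented for illustration): the SOA caller invokes one known service and waits for its reply, while the EDA producer publishes an event without knowing which consumers will react.

```java
import java.util.*;
import java.util.function.Consumer;

public class SoaVersusEda {
    // SOA style: a one-to-one, request/response interface the caller invokes directly.
    interface CreditCheckService {
        boolean approve(String customerId, double amount);
    }

    // EDA style: a many-to-many channel; the producer does not know the consumers.
    static final Map<String, List<Consumer<String>>> topics = new HashMap<>();
    static void subscribe(String topic, Consumer<String> c) {
        topics.computeIfAbsent(topic, t -> new ArrayList<>()).add(c);
    }
    static void publish(String topic, String event) {
        topics.getOrDefault(topic, List.of()).forEach(c -> c.accept(event));
    }

    public static void main(String[] args) {
        // Request/response: the flow is routed by the service request itself.
        CreditCheckService credit = (customer, amount) -> amount < 10000;
        System.out.println("Credit approved: " + credit.approve("C-42", 7500));

        // Event-driven: new consumers can be added without touching the producer.
        subscribe("shipment.delayed", e -> System.out.println("CRM alerted: " + e));
        subscribe("shipment.delayed", e -> System.out.println("SCM replanning: " + e));
        publish("shipment.delayed", "PO-778 delayed by 3 days");
    }
}
```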
Managing the BPM Way
This is the new term on the block that is catching the attention of many CIOs. Its older avatar is workflow management, which is about moving work between people; the new-found excitement is about holistic process management, where work is moved between systems. Says Paul Maguire, country manager-India, Staffware: “BPM enables things like straight-through-processing (STP) and process optimization. When work moves between systems, one can start analyzing trends within these movements and can refine the process.”
Specialist BPM vendors like Staffware believe that the data-centric approach of EAI and application server vendors doesn’t suffice. Such vendors create a separate layer for business processes, not tied to any one application. Says Maguire: “Modern business processes are invoked through new channels like the ATM or the phone. The processes themselves are complex, hence business process management is the only way forward.”
One of the key drivers in the process-centric view of the market is Web services, which essentially is a form of data exchange between applications. BPM vendors are busy writing Web services packages, recognizing the enormous opportunity.
ICICI Bank, HDFC Bank and HDFC Standard Life Insurance are some of the companies that have jumped on to the BPM bandwagon with Staffware’s product suite. HDFC Bank has adopted a phased approach, opting to start with its trade finance operations. Says Anand Narayan, vice-president-IT, HDFC Bank: “We looked at the best and earliest possible way to get value out of BPM, and hence we didn’t wait for seamless integration between all our back-end applications.”
The solution, which is in the process of being implemented, helps the bank augment its centralized processing, thereby lowering operational costs while maintaining high levels of service. The next application in the pipeline is the customer service process for credit card customers. At HDFC Standard Life, BPM is being implemented initially to automate and streamline the processing of new business. Sunil Ramlani, head-IT, HDFC Standard Life, says: “The goal is to make process management a central element of business operations.”