   Master project: EJB with CORBA vs. COM+ with DCOM

    Authors:   Fredrik Janson and Margareta Zetterquist

Definition of the problem
   The employees at Adcore Creative AB should gain better knowledge of the
    component models Enterprise JavaBeans™ and COM+.
   The company should get a decision framework based on a comparison between
    OMG's CORBA and Microsoft's DCOM. This thesis should give guidelines for
    which distributed model should be chosen for a given kind of system that is
    going to be implemented.

The objective of the master project
To simplify network programming and to realize component-based software
architecture, two distributed object models have emerged as standards, Microsoft's
Distributed Component Object Model (DCOM) and Object Management Group's
(OMG) Common Object Request Broker Architecture (CORBA).
It is important to know the differences between these two competing architectures for
distributed computing in order to make the right choice. This master project should
serve Adcore Creative AB with a decision framework for choosing between those
architectures. Programmers mastering one side of this comparison should get an
overview of the other. The master project should also give an overview of the two
component technologies Javasoft’s Enterprise JavaBeans™ (EJB) and Microsoft’s COM+.

Description of the literature study
The goals of the literature study
Before the literature study began, it was decided that its results should be:
   1. that the students should be able to make a theoretical comparison between the
       two architectures, i.e. find differences, advantages and bottlenecks.
   2. that the students should know how to design a good test application.

The comparison should be the basis for the design of the test application. Questions
to be answered during the literature study are how to show the differences, what
should be measured, and how to measure it.

The contents of the literature study
A lot of information about EJB, CORBA, COM+ and DCOM was found on the
Internet. Useful information was found on OMG’s and Microsoft’s homepages, in
newsgroups and on homepages debating which of CORBA and DCOM is superior. The
students have also read many books. To learn how to design a test application and
make measurements with it, tests including benchmarking have been studied.

What the students have learned can be divided and presented in three sections: facts,
tests and comparison.

   1. Facts: The students have learned about distributed objects and component
      technology, why it has been developed and why it is useful. They have learned
      very much about the architecture and functionality of DCOM and CORBA
      and now understand most of the terminology involved when reading about
      DCOM/CORBA. The students have also learned about multi-tier architectures
      and where DCOM/CORBA is used in a multi-tier system.

   2. Tests: The students have studied benchmarking tests for different ORB
      implementations and specifications for the Debit-Credit Benchmark, which
      simulates a bank and tellers making transactions (also called TPC-A). The
      conclusion the students drew is that it takes a lot of both theoretical and
      practical knowledge to be able to create good tests. CORBA is only a
      specification and the implementations differ between different ORB vendors.
      Therefore, a specific ORB will be used in the tests and the comparison. There
      are many differences but they seem difficult to measure directly and the test
      applications will probably be overall performance tests.

   3. Comparison: The main part of the literature is not impartial; the texts are
      either pro-DCOM or pro-CORBA. A CORBA feature that one DCOM author sees
      as a disadvantage is presented as an advantage in a CORBA text, and vice
      versa. But the students have literature favouring both sides, and the
      presentation and the comparison of the differences will be impartial. The
      terminology in DCOM and CORBA is somewhat different, but the students
      will search for counterparts in the two technologies and compare them.

Methods for solving the problem
Starting from a background of basic RPC and distributed objects, an introduction to
Enterprise JavaBeans with CORBA and COM+ with DCOM should be made.
Following this introduction is a comprehensive study of EJB, CORBA, COM+ and
DCOM focusing on CORBA and DCOM. The layer structuring for DCOM and
CORBA should be described; the top, the middle and the bottom layer. A code
example will be used to describe the way to implement a client and a server with the
two competing architectures.
Based on the study, the differences between the two are distinguished. The theoretical
differences serve as a platform for building the test procedures. Finally, a decision
framework is formed in order to make the right choice for a given system that is to be
implemented.

The tests
When comparing CORBA and DCOM, the following test applications will be used:

1. Ping
A simple counter to measure the performance of CORBA and DCOM static method
invocations. Remote static count ping is studied for both technologies.

2. Throughput
To evaluate CORBA’s and DCOM’s capabilities to transfer data, benchmarks on
passing arguments are included in this test. Some basic data types (at least integers
and characters) are transferred as separate entities, within arrays or within sequences.
Both the delivery time of the request and the round-trip time (RTT) of the request
completion are measured. The RTT is important for client programmers, and the
delivery time hints at the throughput.
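A sketch of such a measurement for integer arrays follows; the `Transfer` interface is a hypothetical stand-in for the generated stub, and `LocalTransfer` lets the sketch run without a remote server:

```java
// Throughput sketch: times one round trip per payload size and derives an
// approximate transfer rate from the payload size and the RTT.
public class ThroughputBenchmark {
    interface Transfer { void sendInts(int[] data); }   // hypothetical stub

    static class LocalTransfer implements Transfer {
        long intsReceived = 0;
        public void sendInts(int[] data) { intsReceived += data.length; }
    }

    /** Round-trip time in nanoseconds for one call passing size ints. */
    public static long roundTripNs(Transfer stub, int size) {
        int[] payload = new int[size];
        long start = System.nanoTime();
        stub.sendInts(payload);
        return System.nanoTime() - start;
    }

    /** Approximate throughput in bytes per second for one measured call. */
    public static double bytesPerSecond(int size, long rttNs) {
        return (size * 4L) / (rttNs / 1e9);             // 4 bytes per int
    }

    public static void main(String[] args) {
        Transfer stub = new LocalTransfer();
        for (int size : new int[] {100, 10_000, 1_000_000}) {
            System.out.println(size + " ints: " + roundTripNs(stub, size) + " ns");
        }
    }
}
```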

3. First call
This test is designed to compare the invocation time of a method when it is the first
method called on an object instance with the time when a method has already been
called on the instance. The test calls one method on 1000 objects in order to see if
there is a difference between the first and the second call of the method. (The method
returns void and takes no parameters.)
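This comparison can be sketched as below; `Target` and `LocalTarget` are hypothetical stand-ins for the generated stub and the remote instance it would bind to:

```java
// First-call sketch: calls a no-argument void method twice on each of N
// fresh objects and averages first-call vs. second-call time separately.
public class FirstCallBenchmark {
    interface Target { void noop(); }

    static class LocalTarget implements Target {
        public void noop() { }
    }

    /** Returns {avgFirstCallNs, avgSecondCallNs} over n fresh objects. */
    public static long[] compare(int n) {
        long first = 0, second = 0;
        for (int i = 0; i < n; i++) {
            Target t = new LocalTarget();   // the real test binds a remote instance here
            long t0 = System.nanoTime();
            t.noop();                        // first call on this instance
            long t1 = System.nanoTime();
            t.noop();                        // second call on the same instance
            long t2 = System.nanoTime();
            first += t1 - t0;
            second += t2 - t1;
        }
        return new long[] { first / n, second / n };
    }

    public static void main(String[] args) {
        long[] avg = compare(1000);
        System.out.println("first: " + avg[0] + " ns, second: " + avg[1] + " ns");
    }
}
```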

4. Invocation speed
This test is designed to give an idea of the impact of the number of objects on the
performance of remote calls. A simple method with no parameters and no return value
is called on varying numbers of server objects.
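A sketch of this test, with hypothetical names and local objects standing in for the remote servants:

```java
// Invocation-speed sketch: average call time as the number of server
// objects grows. InvocationSpeedBenchmark and its members are
// hypothetical stand-ins for the real remote setup.
public class InvocationSpeedBenchmark {
    interface Target { void noop(); }
    static class LocalTarget implements Target { public void noop() { } }

    /** Average ns per call when cycling the calls over `objects` servers. */
    public static double avgCallNs(int objects, int totalCalls) {
        Target[] servers = new Target[objects];
        for (int i = 0; i < objects; i++) {
            servers[i] = new LocalTarget();  // the real test creates remote objects
        }
        long start = System.nanoTime();
        for (int i = 0; i < totalCalls; i++) {
            servers[i % objects].noop();     // spread calls across all objects
        }
        return (System.nanoTime() - start) / (double) totalCalls;
    }

    public static void main(String[] args) {
        for (int n : new int[] {1, 10, 100, 1000}) {
            System.out.println(n + " objects: " + avgCallNs(n, 10_000) + " ns/call");
        }
    }
}
```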

5. Multi clients
This test compares the performance obtained when several clients perform calls on
the same server. The method called takes a large array and does not return any
value.
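The multi-client setup can be sketched with threads standing in for the separate client processes; `Server` and `LocalServer` are hypothetical stand-ins for the generated stub and the remote servant:

```java
import java.util.concurrent.CountDownLatch;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.atomic.AtomicLong;

// Multi-client sketch: several client threads call a method that takes a
// large array on the same server object.
public class MultiClientBenchmark {
    interface Server { void consume(int[] data); }

    static class LocalServer implements Server {
        final AtomicLong calls = new AtomicLong();
        public void consume(int[] data) { calls.incrementAndGet(); }
    }

    /** Runs `clients` threads making `callsPerClient` calls each;
     *  returns the total number of completed calls. */
    public static long run(Server server, int clients, int callsPerClient) {
        ExecutorService pool = Executors.newFixedThreadPool(clients);
        CountDownLatch done = new CountDownLatch(clients);
        AtomicLong completed = new AtomicLong();
        int[] payload = new int[10_000];            // the "large array" argument
        for (int c = 0; c < clients; c++) {
            pool.execute(() -> {
                for (int i = 0; i < callsPerClient; i++) {
                    server.consume(payload);
                    completed.incrementAndGet();
                }
                done.countDown();
            });
        }
        try {
            done.await();                           // wait for all clients to finish
        } catch (InterruptedException e) {
            throw new RuntimeException(e);
        }
        pool.shutdown();
        return completed.get();
    }

    public static void main(String[] args) {
        System.out.println("completed calls: " + run(new LocalServer(), 4, 100)); // 400
    }
}
```

Timing the `run` call and dividing by the total number of calls would give the per-call cost under contention.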

6. Debit-Credit
A 3-Tier Debit-Credit environment simulating a bank, based on the standard Debit-
Credit benchmark defined in 1985 by Jim Gray. The Debit-Credit transaction is
generated by a client, which performs a deposit/withdrawal transaction. One
implementation will use EJB and CORBA and the other COM+ and DCOM. Both
implementations will be written in Java.
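The core transaction logic can be sketched as follows. The thesis implementations will use JDBC against a real database; the in-memory maps here are only stand-ins so the logic can run on its own, and the class and method names are hypothetical:

```java
import java.util.HashMap;
import java.util.Map;

// Debit-Credit (TPC-A style) transaction sketch: each transaction updates
// an account, a teller and a branch balance and records a history row.
public class DebitCredit {
    final Map<Integer, Long> accounts = new HashMap<>();
    final Map<Integer, Long> tellers  = new HashMap<>();
    final Map<Integer, Long> branches = new HashMap<>();
    int historyRows = 0;   // stand-in for inserts into the history table

    /** One Debit-Credit transaction: adjust the account, teller and branch
     *  balances by delta (a deposit if positive, a withdrawal if negative)
     *  and record a history row. Returns the new account balance. */
    public long transact(int accountId, int tellerId, int branchId, long delta) {
        long balance = accounts.getOrDefault(accountId, 0L) + delta;
        accounts.put(accountId, balance);
        tellers.merge(tellerId, delta, Long::sum);
        branches.merge(branchId, delta, Long::sum);
        historyRows++;
        return balance;
    }

    public static void main(String[] args) {
        DebitCredit bank = new DebitCredit();
        bank.transact(1, 1, 1, 100);                      // deposit 100 to account 1
        System.out.println(bank.transact(1, 2, 1, -30));  // withdraw 30 -> 70
    }
}
```

In the 3-tier setup this logic sits in the middle tier (an EJB or COM+ component), with the client generating the transactions and the database holding the four tables.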

These things will be measured:
    Throughput. Measure the throughput in transactions per second (tps).
    Response time.
    Concurrent users. The number of users has a great effect on the response time
       and the throughput.
    Cost per transaction.
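The derived metrics follow directly from the raw counts; a small sketch, where all input figures are hypothetical examples rather than measured results:

```java
// Metric computation sketch. All figures in main are hypothetical.
public class Metrics {
    /** Throughput in transactions per second (tps). */
    public static double tps(long transactions, double elapsedSeconds) {
        return transactions / elapsedSeconds;
    }

    /** Cost per transaction, given the total system cost attributed to
     *  the measured period. */
    public static double costPerTransaction(double totalCost, long transactions) {
        return totalCost / transactions;
    }

    public static void main(String[] args) {
        System.out.println("tps: " + tps(12_000, 60.0));                      // 200.0
        System.out.println("cost/tx: " + costPerTransaction(300.0, 12_000)); // 0.025
    }
}
```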

The test environment
The tests will use the following test bed:

Platform: Windows 2000 service pack 1 (SP1)
Hardware: Pentium III 700 MHz, 196 MB RAM, 100 Mbit/s Ethernet

Three workstations will be used for the three-tier application, one for the client, one
for the application server and one for the database. The other tests will use two
workstations.

EJB/CORBA environment:
Programming language: Java with JDBC
CORBA ORB: Inprise VisiBroker
Tools: WebGain VisualCafé

COM+/DCOM environment:
Programming language: Java with JDBC
Tools: Microsoft Visual J++

Preliminary disposition of the report
Time plan for the master project
Time plan for writing the report

Preliminary disposition of the master thesis

1 Introduction
1.1 Background of the problem
This text gives a background of distributed objects and related technologies such as RPC.

1.2 Problem statement and the objective of the master
The definition of the problem and the objective of the master project.

2 RPC and Distributed objects
2.1 Distributed objects
Overview of distributed objects.

2.2 Remote procedure call
Overview of the RPC technology and a description of the structuring of the
architecture into the top, middle and bottom layers.

2.3 Components
Short description of what a component is.

3 CORBA
3.1 Overview
An overview of CORBA including its origin and development. Describes the overall
architecture.

3.2 The layer structuring
The top, middle and bottom layers of the RPC structure are described.

3.2.1 The top layer
A description of what the CORBA top layer consists of.

3.2.2 The middle layer
A description of what the CORBA middle layer consists of.

3.2.3 The bottom layer
A description of what the CORBA bottom layer consists of.

3.3 ORBs
3.3.1 ORB implementations
Different ORB implementations are described.

3.3.2 The Visigenic VisiBroker
One of the most widely used ORB implementations is described.

4 DCOM
4.1 Overview
An overview of DCOM. Describes the overall architecture.

4.2 The layer structuring
The top, middle and bottom layers of the RPC structure are described.

4.2.1 The top layer
A description of what the DCOM top layer consists of.

4.2.2 The middle layer
A description of what the DCOM middle layer consists of.

4.2.3 The bottom layer
A description of what the DCOM bottom layer consists of.

5 Enterprise JavaBeans
5.1 Overview
An overview of Enterprise JavaBeans.

6 COM+
6.1 Overview
An overview of COM+.

7 CORBA compared to DCOM with code example
7.1 CORBA and DCOM side by side
CORBA and DCOM are compared based on the facts from the study.

7.1.1 Object models
The two object models are studied.

7.1.2 Abstraction level
Compares the level of abstraction for DCOM and CORBA.

7.1.3 Maintain state across invocations
The state of a session and across sessions is studied for CORBA and DCOM.

7.1.4 Ping performance
The results from the ping tests are studied for CORBA and DCOM.

7.1.5 Wire-level transactions
The DCOM Distributed Transaction Coordinator (DTC) is compared to CORBA’s
Object Transaction Service.

7.1.6 Persistent object references
The persistent object references are compared for CORBA and DCOM.

7.1.7 Open standard
The openness of CORBA and DCOM is compared.

7.1.8 Errors and exceptions
The support for an exception-handling mechanism and user-defined exceptions, and
how exceptions are handled in different target implementation languages.

7.1.9 Identity and persistence
Describe DCOM’s and CORBA’s different notions of object identity and how objects
are activated/deactivated. Description of how persistence of objects is supported.

7.1.10 Scalability
Description of how a system supports mechanisms to handle increasing numbers of
users, transactions and data.

7.1.11 Services
The services provided by DCOM are compared to the services provided by CORBA.

7.1.12 Platform support
Compares the platform support for DCOM and CORBA.

7.1.13 Tool support
Compares the support for development tools, including design tools, for CORBA and
DCOM.

7.1.14 The learning curve
The learning curves for CORBA and DCOM are compared.

7.1.15 Security
The wire-level security for CORBA and DCOM is compared.

7.2 The code example
A code example is used to show the differences in how to implement a server and a
client using CORBA and DCOM.

8 EJB compared to COM+ with code example
8.1 EJB and COM+ side by side
EJB and COM+ are compared based on the facts from the study.

8.2 The code example
A code example is used to show the differences in how to implement a component
using EJB and COM+.

9 Tests
9.1 Overview
Description of what should be measured and how it should be measured.

9.2 The test procedures
The test procedures are presented.

9.3 The test results
The results with benchmarks are presented.

10 Conclusion
Based on the tests a decision framework is formed.

Time plan for the master project

[Gantt chart, weeks 32-51: preparation, report writing, design and execution of the tests, and preparing the oral presentation.]
Time plan for writing the report

[Gantt chart, weeks 32-51: the descriptions, the tests, the code examples, and the final report.]
