					                QoS Measurement Issues with DAML-QoS Ontology

                                Chen Zhou, Liang-Tien Chia, Bu-Sung Lee
                               Center for Multimedia & Network Technology
                    School of Computer Engineering, Nanyang Technological University
                             Email: {pg04878518, asltchia, ebslee}@ntu.edu.sg


                        Abstract

   With the industry's efforts in promoting the use of Web services, a huge number of Web services are being developed for the web. In this paper we describe a framework for providing QoS measurement for Web services through the use of the DAML-QoS Ontology and a measurement code generator. The framework comprises the DAML-QoS Ontology, which specifies QoS constraints for Web services and works as a complement to DAML-S; the measurement framework, which checks the system's compliance with the service level agreement; and a code generator, which generates the measurement code according to the DAML-QoS specification. Because the specification is defined at the Ontology level, its semantics helps to achieve better interoperability, automation and extensibility.

   keywords: Ontology, Web Service, measurement, QoS

1. Introduction

   The number of Web services grows continuously as this technology becomes more and more popular. Service requesters are often presented with a choice of offers that provide similar services, but different offers have quite different Quality of Service (QoS).

   For service selection and management purposes, it is necessary to precisely and flexibly specify constraints, Quality of Service metrics, service level objectives, and other contracts between Web services. The formal specification of constraints and Service Level Agreements (SLAs) for services has been researched extensively in computing and telecommunications. An SLA defines the agreed level of performance for a particular service between a service provider and a service requester. Web services are XML-based protocol stacks and have their own specific features. Web service discovery, composition, and cooperation need to be more dynamic, more automatic and able to cross enterprise boundaries. This requires domain-specific knowledge.

   The semantic web technology is a promising solution for the automatic service discovery process. It requires that data be not only machine readable, but also machine understandable. With the help of the Semantic Web, a unified system among different partners can be realized with minimum misunderstanding. The DAML-S Ontology [2] (now OWL-S) is one of the semantic research community's efforts for Web services. DAML-S aims to enable automated Web service discovery, invocation, composition and monitoring. However, the specification does not provide a detailed set of classes, properties and constraints to represent QoS descriptions [1]. We have developed a QoS Ontology design pattern for the formal specification of QoS constraints and QoS metrics as a complement to DAML-S. This novel QoS Ontology is based on DAML+OIL and named DAML-QoS.

   The metrics concepts in DAML-QoS provide a powerful way for a measurement organization to monitor and bill against the agreed-upon SLAs. This paper focuses on the metrics layer of the DAML-QoS Ontology and presents the measurement architecture and code generator of the system.

   The rest of this paper is organized as follows. Section 2 introduces background information that is helpful for understanding the paper. Section 3 presents the design principles of the QoS metrics layer. Section 4 introduces the design of the metrics layer. Section 5 describes the measurement system's architecture and the measurement code generator. Section 6 presents the related work, and finally, section 7 draws the conclusion.

2. Background

2.1. Ontology Languages

   Ontology plays a key role in the Semantic Web by providing machine-readable vocabularies for applications to understand shared knowledge. DAML+OIL [3] is an Ontology language designed specifically for the Semantic Web; it is based on RDF and RDF Schema.
Its well-defined semantics is similar to the description logic (DL) SHIQ(D) [8]. DAML+OIL describes the structure of a domain in terms of classes and properties. Like SHIQ(D), DAML+OIL (March 2001) also supports the use of datatypes in class descriptions. By defining service descriptions upon the semantics provided by DAML+OIL, we can use a DL reasoner to make inferences over and classify descriptions written in DAML+OIL. The latest Semantic Web Ontology language, OWL [15], has evolved into a promising successor to DAML+OIL, but we build our Ontology on DAML+OIL because of its better tool support. In future we will migrate the current system to OWL.

2.2. DAML-S

   DAML-S [2] (now OWL-S) is a DAML+OIL Ontology for describing Web services. Through its tight connection with DAML+OIL, DAML-S aims to make Web services computer-interpretable and to enable automated Web service discovery, invocation, composition and monitoring. It defines the notions of a Service Profile (what the service does), a Service Model (how the service works) and a Service Grounding (how to use the service). As a DAML+OIL Ontology, DAML-S retains all the benefits of Web content described in DAML+OIL. It enables the definition of a Web services vocabulary in terms of objects and the complex relationships between them, including class and subclass relations, cardinality restrictions, etc. [3]. It also includes XML datatype information. Recently DAML-S has evolved into OWL-S, which is based on the OWL Ontology language [15].

   DAML-S provides a good representation of a service's functional capability. However, Cardoso et al. [1] point out that significant improvements to the QoS model are needed to supply a realistic solution to DAML-S users. One current limitation of the DAML-S QoS model is that it does not provide a detailed set of classes and properties to represent quality of service metrics. The QoS model needs to be extended to allow a precise characterization of each dimension. This is an important motivation of our current work on the QoS Ontology.

3. Design Principle

   DAML-QoS [16] is designed as a complementary Ontology that provides additional QoS information for DAML-S. It mainly deals with the non-functional aspect of the system. It contains three layers: the QoS profile layer, designed for matchmaking purposes; the QoS property definition layer, for defining properties and elaborating the properties' domain and range constraints; and the QoS Metrics layer, for metrics definition and measurement. The main idea is to transform the problem of judging QoS constraint conformance into the problem of judging Ontology subsumption relationships.

   The non-functional aspect of the system describes constraints such as Quality of Service, management statements, security policies, pricing information, and so forth. The QoS specification is the major portion of the non-functional aspect of the system. During design time, people normally focus more on the functional aspect of the system than on the non-functional aspect. However, the best practice is to keep QoS issues in mind during the early design phase. This requires appropriate QoS specification support for describing Web service QoS. Web services are XML-based protocol stacks and have their own specific features. They are used in loosely coupled, platform-independent environments and need to be more dynamic, more automatic and able to cross enterprise boundaries. Some of our design principles are listed as follows:

   • The specification should be easily understood and used by developers. Considering QoS during design is valuable, since it directly impacts the system's design pattern across all layers. Any obstacles to learning and understanding the QoS specification should be eliminated. Through layered design and an Object Oriented style, the specification is relatively easy to understand.

   • To allow value-added services for dynamic service discovery, composition and integration in a more automatic and customized manner, the QoS specification is required to be both precise and flexible. Precision means the QoS specification should answer the questions of when, which, where, what and how the specification should be evaluated against the Web service [12]. Flexibility should allow the specification to define customized metrics for diverse Web services to meet their varied needs. These will be discussed further in section 4.

   • The QoS requirements should be associated with the Web service as defined in the DAML-S service profile. Since the Object Oriented design principle is widely accepted in the software development community, an Object Oriented QoS specification is chosen. When developers know the super interface's QoS requirements, a natural way is to add the specific QoS constraints in the inherited interface and utilize all inherited QoS constraints described in the super interface.
   • The automatic validation of the specification is important for large projects. Otherwise it is too costly to determine the correctness of the specification manually. In our system, the syntax correctness is checked by the Ontology parser, and the semantic correctness is checked by the reasoner according to the specification's semantics.

   • Clear definition of the metrics is the premise for measurement. The number of different SLA metrics that can be defined for a service is potentially large, and even simple metrics can be defined in many ways. Take response time as an example: the definition concerns measuring from the client or from the server; average or percentile; the time interval for measurement; the history data size; and so forth. The specification should provide a definition detailed enough to calculate the metrics from first-hand data.

   • Computing the SLA metrics and evaluating the compliance may involve third parties and agents. It is important that the measurement knowledge can be shared and understood unambiguously among parties. Furthermore, each party should receive only the part of the specification that it needs to carry out the measurement task. By representing the knowledge in concepts and individuals, partial knowledge export/import can be supported.

4. QoS Metrics Layer

   The QoS Metrics Layer provides the QoS metrics concept definitions for the QoS properties' range constraints. It also defines precise semantic meanings for the service measurement partner to measure the service and check compliance. The concept definitions of the metrics provide a brief guide for developers to understand the system's non-functional requirements during the system's design, while the individuals of the metric concepts are defined by the measurement partner for measurement purposes by filling in the measurement details. A measurement partner is the group that deploys the measurement handlers and performs the service measurement tasks. The service provider or requester can take on this role, or it can be outsourced to a third party.

   The metric concepts are defined in the QoS Metrics Layer in DAML-QoS. The service QoS metrics are divided into AtomicMetrics and ComplexMetrics, which are shown as follows:

      AtomicMetric ⊑ Metric
      ComplexMetric ⊑ Metric

   The Metric class is a common superclass for all metrics. It defines some general properties for each metric. The Metric class has the related properties hasUnit, value, windowSize, and pushPoint. The hasUnit property defines the unit information of the metric. The value property is used to initialize the measurement handler's data register when the measurement starts. windowSize describes the amount of history data the metric should keep. pushPoint is the address to which the data should be reported. A collector (corresponding to a ComplexMetric) is deployed at the pushPoint to collect the pushed data. When new data are measured or calculated by the metric, a report is generated and pushed to the address described by pushPoint. Figure 1 shows the property definitions for the Metric class.

    <daml:DatatypeProperty rdf:ID="value">
      <rdfs:domain rdf:resource="#Metric"/>
      <rdfs:range rdf:resource="&xsd;#integer"/>
    </daml:DatatypeProperty>

    <daml:ObjectProperty rdf:ID="hasUnit">
      <rdfs:domain rdf:resource="#Metric"/>
      <rdfs:range rdf:resource="#Unit"/>
    </daml:ObjectProperty>

    <daml:DatatypeProperty rdf:ID="pushPoint">
      <rdfs:comment>
        The push approach for the communication. The signature of the push web service
        is defined as "void setMetricValue(String metricName, String strUID, String value)".
      </rdfs:comment>
      <rdfs:domain rdf:resource="#Metric"/>
      <rdfs:range rdf:resource="&xsd;#string"/>
    </daml:DatatypeProperty>

    <daml:DatatypeProperty rdf:ID="windowSize">
      <rdfs:comment>
        The window size of the collected data, which defines the current metric's data
        container's size. The data will be retrieved into the current metric's data container
        and the metric will keep windowSize number of data for usage. The invoker function
        will operate on this metric operand based on this size of data.
      </rdfs:comment>
      <rdfs:domain rdf:resource="#Metric"/>
      <rdfs:range rdf:resource="&xsd;#integer"/>
    </daml:DatatypeProperty>

    <daml:DatatypeProperty rdf:ID="measureAt">
      <rdfs:comment>
        Measure at "Server", "Client", or some customer defined place, which will in turn
        help to generate the measurement code.
      </rdfs:comment>
      <rdfs:domain rdf:resource="#AtomicMetric"/>
      <rdfs:range rdf:resource="&xsd;#string"/>
    </daml:DatatypeProperty>

    <daml:ObjectProperty rdf:ID="hasFunction">
      <rdfs:domain rdf:resource="#ComplexMetric"/>
      <rdfs:range rdf:resource="#Function"/>
    </daml:ObjectProperty>

      Figure 1. Metrics Property Definition

   The Metric class has two subclasses: AtomicMetric and ComplexMetric.
   An AtomicMetric collects first-hand data from managed resources using measurement handlers. The AtomicMetric's property measureAt answers the "where" question of the measurement; it may be "Server", "Client", and so forth. According to the AtomicMetric's individual definition, the measurement party can generate the measurement handler and push its measured data to the collector located at the pushPoint. The push interface's signature is defined as void setMetricValue(String metricName, String strUID, String value), in which strUID is the unique id for the service invocation.
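   To make the push interface concrete, the following is a minimal Java sketch of a collector endpoint that exposes the setMetricValue operation named above and appends each reported value to the data series of the named metric. The class name, the in-memory storage and the main method are our own illustration, not part of DAML-QoS; a real deployment would expose this method as a Web service at the metric's pushPoint.

    import java.util.ArrayList;
    import java.util.HashMap;
    import java.util.List;
    import java.util.Map;

    /**
     * Illustrative collector endpoint implementing the push interface
     * "void setMetricValue(String metricName, String strUID, String value)".
     * (Hypothetical class; the paper only fixes the method signature.)
     */
    public class MetricCollectorService {

        // Data series per metric individual, keyed by the metric's full name.
        private final Map<String, List<String>> series = new HashMap<String, List<String>>();

        // metricName identifies the metric individual, strUID identifies the
        // measured service invocation, value is the reported reading.
        public synchronized void setMetricValue(String metricName, String strUID, String value) {
            List<String> data = series.get(metricName);
            if (data == null) {
                data = new ArrayList<String>();
                series.put(metricName, data);
            }
            data.add(strUID + "=" + value);
        }

        public synchronized List<String> getSeries(String metricName) {
            List<String> data = series.get(metricName);
            return data == null ? new ArrayList<String>() : data;
        }

        public static void main(String[] args) {
            MetricCollectorService collector = new MetricCollectorService();
            collector.setMetricValue("ResponseTimeMSMetric1", "uid-42", "350");
            System.out.println(collector.getSeries("ResponseTimeMSMetric1"));
        }
    }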
   A ComplexMetric retrieves data from other metrics (AtomicMetric or ComplexMetric) and then generates its result value according to its function definition. It has the properties hasFunction, operand, windowPos, and so on. The operand property points to the constituent metrics from which it retrieves the data. The hasFunction property points to the function used by the ComplexMetric to calculate its value from the received data. The functions comprise Arithmetic functions, Boolean functions, and Aggregate functions. Aggregate functions can be used to describe the statistical characteristics of the service, such as a percentile or mean. The windowPos property is the parameter offset into the constituent metric's data series, from which the function will find its parameter; a negative windowPos refers to history data in the data series. These property definitions and this composition style provide a detailed definition for high-level metrics. The "what" and "how" questions of the measurement are naturally answered when the metrics are defined.
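   To make the operand and windowPos mechanics concrete, the following is a minimal Java sketch, under our own naming, of an Aggregate function (a mean) applied over an operand's data series: it reads up to window entries starting at a negative offset, with -1 denoting the newest value. Neither the class nor the method is part of DAML-QoS; in the real system this calculation code is generated from the metric individual definition.

    import java.util.Arrays;
    import java.util.List;

    /** Illustrative evaluation of an Aggregate (mean) function over an operand window. */
    public class AverageFunction {

        // window: how many history entries to aggregate (e.g. 100).
        // startOffset: negative offset of the newest entry to use, -1 meaning the latest value.
        public static double apply(List<Double> operandSeries, int startOffset, int window) {
            double sum = 0.0;
            int count = 0;
            for (int offset = startOffset; offset > startOffset - window; offset--) {
                int index = operandSeries.size() + offset;   // -1 maps to the last element
                if (index < 0) {
                    break;                                   // not enough history yet
                }
                sum += operandSeries.get(index);
                count++;
            }
            return count == 0 ? 0.0 : sum / count;
        }

        public static void main(String[] args) {
            List<Double> responseTimesMs = Arrays.asList(420.0, 380.0, 510.0, 290.0);
            // Average the (up to) 100 newest entries, starting from offset -1.
            System.out.println(apply(responseTimesMs, -1, 100));
        }
    }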
   The specification for design and the specification for measurement have different views. During design, developers focus on the essential question of which QoS metrics affect the design. The semantics of the metrics should be unambiguously understood, while the measurement details are kept apart from the developers. This is achieved by defining the structure of the service level objective and its metric concepts in the terminology definition (T-Box), while keeping the detailed measurement information in the assertion definition (A-Box). Figure 2 shows the example of the AverageResponseTimeMSMetric definition for the developer during design. The SLO requires that averageResponseTimeMS should be less than 1000 msec. The AverageResponseTimeMSMetric is calculated by the Average function using the data pushed by the ResponseTimeMSMetric. This metric concept definition guides developers to choose a proper design, for example, some resource redundancy.

   As to the specification validation, the syntax correctness of the specification is checked by an Ontology parser such as the Jena parser [9]. As to the semantic correctness, if the specification's SLO concept is equivalent to ⊥, there is a contradiction in the specification and it is deemed semantically incorrect. Such a conflict is caused by conflicting QoS requirements.

      [Figure 2 (diagram, not reproduced): the SLO concept MyQoS restricts averageResponseTimeMS
      to <= 1000 with the AverageResponseTimeMSMetric, whose operand1 is the ResponseTimeMSMetric
      and whose hasFunction is Average.]

      Figure 2. Average Response Time Metric Example

   On the other hand, the measurement partner fills in the details of the metric measurement information to answer the "how" question. These data are declared as individual instances of the metric concepts. Figure 3 shows a sample individual instance definition for the metric concepts shown in figure 2. The measurement is related to the individual MyQoSSLO. The MyQoSSLO individual is an individual instance of the SLO concept MyQoS (see equations (1) and (2)). This instance answers the "which" and "when" questions for the measurement; that is, which Web service profile the SLO deals with, and when the service evaluation begins and ends. The ComplexMetric AverageResponseTimeMSMetric's individual AverageResponseTimeMSMetric1 uses the Function individual Average100 to calculate over its operand1, ResponseTimeMSMetric1. The Average100 function operates on 100 entries (window) of history data (from offset -1 to offset -100) to generate the average value. The ResponseTimeMSMetric1 measures the response time at the server side, and its measurement unit is msec. It stores 200 history data entries (windowSize). When the measurement data are collected, they will be pushed to the AverageResponseTimeMSMetric1's collector web service. Section 5 will introduce the supporting architecture and the code generator for the run-time measurement.
    <qosmetrics:TimeMSUnit rdf:ID="TimeMSUnit1"/>

    <qosmetrics:Average rdf:ID="Average100">
      <qosmetrics:functionName>
        <xsd:string xsd:value="Average100"/>
      </qosmetrics:functionName>
      <qosmetrics:window>
        <xsd:integer xsd:value="100"/>
      </qosmetrics:window>
      <qosmetrics:functionDefinition>
        <xsd:string xsd:value="Average"/>
      </qosmetrics:functionDefinition>
    </qosmetrics:Average>

    <qosmetrics:ResponseTimeMSMetric rdf:ID="ResponseTimeMSMetric1">
      <qosmetrics:hasUnit rdf:resource="#TimeMSUnit1"/>
      <qosmetrics:windowSize>
        <xsd:integer xsd:value="200"/>
      </qosmetrics:windowSize>
      <qosmetrics:value>
        <xsd:integer xsd:value="1000"/>
      </qosmetrics:value>
      <qosmetrics:measureAt>
        <xsd:string xsd:value="Client"/>
      </qosmetrics:measureAt>
      <qosmetrics:pushPoint>http://155.69.150.102:9080/axis/urn:collector</qosmetrics:pushPoint>
    </qosmetrics:ResponseTimeMSMetric>

    <qosmetrics:AverageResponseTimeMSMetric rdf:ID="AverageResponseTimeMSMetric1">
      <qosmetrics:hasFunction rdf:resource="#Average100"/>
      <qosmetrics:operand1 rdf:resource="#ResponseTimeMSMetric1"/>
      <qosmetrics:hasUnit rdf:resource="#TimeMSUnit1"/>
      <qosmetrics:windowSize>
        <xsd:integer xsd:value="15"/>
      </qosmetrics:windowSize>
      <qosmetrics:windowPos1>
        <xsd:integer xsd:value="-1"/>
      </qosmetrics:windowPos1>
      <qosmetrics:value>
        <xsd:integer xsd:value="1000"/>
      </qosmetrics:value>
      <qosmetrics:pushPoint>http://155.69.150.102:9080/axis/urn:collector</qosmetrics:pushPoint>
    </qosmetrics:AverageResponseTimeMSMetric>

    <myqos:MyQoS rdf:about="&myqosslo;#MyQoSSLO">
      <qosmetrics:averageResponseTimeMS rdf:resource="#AverageResponseTimeMSMetric1"/>
    </myqos:MyQoS>

      Figure 3. Monitoring Party Definition for Metrics
5. System Architecture and Code Generation

   The measurement system has some requirements. The service measurement should allow the involvement of third parties, and the measurement system itself should be able to plug into the measured Web service system with no or minimal influence on the original system.

   Figure 4 shows a prototype example of the measurement system's architecture. Each measurement handler corresponds to one AtomicMetric individual. Its measured data are pushed to the collector service described in the metric's pushPoint property. Since all the measurement components are loosely coupled, third parties can join in the measurement easily. void setMetricValue(String metricName, String strUID, String value) is the push Web service's signature: metricName stands for the metric individual's full name, strUID is the unique id for the measured service invocation, and value is the reported measurement value. Each collector corresponds to one ComplexMetric and stores the received data series for function calculation. Part of the specification knowledge can be shared by the measurement parties. Each partner uses the concepts and metric individuals it requires to build up its measurement code. Since the metricName uniquely represents the metric individual, a third party can understand the received metric's meaning unambiguously according to the metric individual definition and generate the measurement reports correctly.

   A summary collector collects the data pushed by all the SLO's property metrics and then generates a summary report in a form similar to the SLO concept. This report is used to check whether the Web service is in compliance with the SLA specification. The evaluation service is defined according to the conformance definition of the DAML-QoS Ontology [16]. The algorithm for the evaluator service is defined as follows: suppose MeasureSLO is the SLO generated by the summary collector, and MySLO is the agreed-upon service level objective. The service conforms to the SLO iff MeasureSLO ⊑ MySLO; otherwise, a violation occurs in the service invocation. Take the Ontology MyQoS in section 4 as an example, and assume the calculated AverageResponseTimeMSMetric value is 800. Equation (1) shows the definitions of the MyQoS class and the MeasureQoS class. The reasoner can draw the conclusion that MeasureQoS ⊑ MyQoS. This means the measured service QoS is more constrained than MyQoS and no violation occurred in the service invocation.

      MyQoS      ≐ QoSProfile ⊓ (≤ 1000 averageResponseTimeMS.AverageResponseTimeMSMetric)
      MeasureQoS ≐ QoSProfile ⊓ (= 800 averageResponseTimeMS.AverageResponseTimeMSMetric)        (1)

      MyQoS(MyQoSSLO)
      hasServiceProfile(MyQoSSLO, "&provider service;#")
      startTime(MyQoSSLO, 2004-09-24T09:00:00)
      endTime(MyQoSSLO, 2004-09-30T09:00:00)                                                     (2)
                                        Figure 4. Measurement Architecture

   Measurement handlers are selected as the monitoring implementation to minimize the influence on the service. Our handler implementation is based on the Axis SOAP engine [13]. Multiple handlers form a handler chain to perform more complex measurement tasks. Since a handler is located before or after the service interface pivot, the measurement code does not change the service's implementation. To distinguish between different invocations, a UIDHandler is required in the handler chain to assign a unique identity in the SOAP message's header.
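   For illustration, the following is a minimal sketch of what a response-time handler might look like on Axis 1.x, with the same handler class placed in both the request and the response chain of the measured service. The property keys, the UID property assumed to be set by the UIDHandler, and the stubbed push method are all our own assumptions; they are not defined by Axis or by DAML-QoS.

    import org.apache.axis.AxisFault;
    import org.apache.axis.MessageContext;
    import org.apache.axis.handlers.BasicHandler;

    /**
     * Sketch of a server-side response-time handler for the Axis handler chain.
     * Deployed in both the request flow and the response flow of the service.
     */
    public class ResponseTimeHandler extends BasicHandler {

        private static final String START_PROP = "qos.responseTime.start"; // hypothetical property key
        private static final String UID_PROP   = "qos.invocation.uid";     // assumed to be set by the UIDHandler

        public void invoke(MessageContext msgContext) throws AxisFault {
            Object start = msgContext.getProperty(START_PROP);
            if (start == null) {
                // Request flow: remember when the invocation entered the chain.
                msgContext.setProperty(START_PROP, new Long(System.currentTimeMillis()));
            } else {
                // Response flow: compute the elapsed time and push it to the collector.
                long elapsedMs = System.currentTimeMillis() - ((Long) start).longValue();
                String strUID = (String) msgContext.getProperty(UID_PROP);
                push("http://155.69.150.102:9080/axis/urn:collector",
                     "ResponseTimeMSMetric1", strUID, Long.toString(elapsedMs));
            }
        }

        // Stub: a real handler would invoke the pushPoint collector's
        // setMetricValue(metricName, strUID, value) Web service operation.
        private void push(String pushPoint, String metricName, String strUID, String value) {
            System.out.println("push to " + pushPoint + ": " + metricName + " " + strUID + "=" + value);
        }
    }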
   Some common metrics are reusable across many web services. We have selected some common metrics as a basic profile to speed up measurement handler development, for example: response time, cost, invocation, etc. Each AtomicMetric within the basic profile has a template handler for code generation and deployment. If the AtomicMetric is supported by the code generator, the code generator generates the metric handler automatically for the measurement party's deployment, according to the AtomicMetric's property values and its code template. The measureAt property affects the selection of the code template, and the matching code is generated.

   The ComplexMetric instance utilizes its function to calculate its value from its constituent metrics. The functions are divided into Arithmetic functions, Boolean functions, and Aggregate functions. Some of the supported function definitions are listed below: NOT, ∗, /, MOD, +, −, >, <, AND, XOR, OR, Condition (IF, THEN, ELSE), LN, LOG, EXP, SQRT, AVERAGE, MEAN, SUM, MAX, MIN, and so forth. Since each constituent metric is treated uniformly as a data container, the code generator will create the calculation code for every ComplexMetric individual whose function it supports. As to the data container, it is created as a circular list with the default value specified by the value property. A pointer targets the last completed data entry. When the missing data is reported to the ComplexMetric or the timeout is reached, the pointer moves forward.
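   A minimal sketch of such a circular data container is given below, under our own naming: a fixed-size ring pre-filled with the metric's default value, with a pointer to the last completed entry that advances either when a reading arrives or when the timeout for a missing reading expires.

    /**
     * Illustrative circular data container for a ComplexMetric operand.
     * Slots are pre-filled with the default taken from the metric's value property;
     * the pointer marks the last completed entry.
     */
    public class CircularDataContainer {

        private final double[] slots;
        private int lastCompleted = -1;   // index of the last completed entry, -1 before any data

        public CircularDataContainer(int windowSize, double defaultValue) {
            slots = new double[windowSize];
            for (int i = 0; i < windowSize; i++) {
                slots[i] = defaultValue;
            }
        }

        /** A reading arrived: move the pointer forward and store the value. */
        public void report(double value) {
            lastCompleted = (lastCompleted + 1) % slots.length;
            slots[lastCompleted] = value;
        }

        /** The reading is missing and the timeout expired: move forward, keeping the default. */
        public void timeout() {
            lastCompleted = (lastCompleted + 1) % slots.length;
        }

        /** Read history at a negative offset: -1 is the last completed entry, -2 the one before. */
        public double get(int offset) {
            int index = ((lastCompleted + 1 + offset) % slots.length + slots.length) % slots.length;
            return slots[index];
        }

        public static void main(String[] args) {
            CircularDataContainer container = new CircularDataContainer(5, 1000.0);
            container.report(420.0);
            container.report(380.0);
            container.timeout();                    // missing reading keeps the default value
            System.out.println(container.get(-1));  // 1000.0 (default, from the timed-out slot)
            System.out.println(container.get(-2));  // 380.0
        }
    }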
   The evaluator generates the measurement service level Ontology and compares it against the original service level objective Ontology. Since the measurement Ontology is clear enough to be understood by different partners, outsourcing can be achieved naturally. A DL reasoning engine is used to judge whether the measurement data comply with the SLO. If the measurement Ontology is subsumed by the SLO Ontology, it is deemed a compliance; otherwise there is a violation, and a notification can be sent to the service provider to indicate the violation.
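   The following sketch is a deliberately simplified stand-in for that evaluation step, assuming the SLO is a single upper-bound constraint like the MyQoS example in section 4: instead of invoking a DL reasoner to test MeasureQoS ⊑ MyQoS, it compares the summary collector's measured value directly against the SLO bound. In the real system this decision is made by the reasoner over the generated Ontology, not by hand-written comparisons.

    /**
     * Simplified evaluator for a single upper-bound SLO such as
     * MyQoS: averageResponseTimeMS <= 1000. A stand-in for the DL subsumption
     * check performed by the reasoner in the actual system.
     */
    public class SloEvaluator {

        private final String metricName;
        private final double upperBound;

        public SloEvaluator(String metricName, double upperBound) {
            this.metricName = metricName;
            this.upperBound = upperBound;
        }

        /** true = compliance (measured value is within the agreed bound), false = violation. */
        public boolean conforms(double measuredValue) {
            return measuredValue <= upperBound;
        }

        public static void main(String[] args) {
            SloEvaluator evaluator = new SloEvaluator("averageResponseTimeMS", 1000.0);
            // A measured average of 800 ms conforms to the <= 1000 ms objective, as in the example.
            System.out.println(evaluator.metricName + " compliant: " + evaluator.conforms(800.0));
            System.out.println(evaluator.metricName + " compliant: " + evaluator.conforms(1200.0));
        }
    }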
   The code generating process is as follows: after the Ontology is parsed into memory, the code generator starts from the SLO instance and gets the top-level metrics through the QoS properties. After that, the second-level metrics contained in the first-level ComplexMetrics are traversed, and so on. Once all the metric levels have been visited, the code generation process finishes. During the code generation, each AtomicMetric instance generates one measurement handler, while each ComplexMetric instance generates one collector. By deploying these measurement services, the measurement party can start the measurement task. Take the metrics in figure 3 as an example: the code generator will first generate the AverageResponseTimeMSMetric1 collector, and then the ResponseTimeMSMetric1 handler. Since there are no more metric individuals, the generation process ends there. Because most of the measurement code is generated automatically, the measurement party's burden is reduced significantly.
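   As a rough illustration of that traversal, the sketch below walks the metric individuals reachable from an SLO, emitting one collector per ComplexMetric and one handler per AtomicMetric, which mirrors the order described for the figure 3 example. The MetricIndividual model class and the emit methods are hypothetical; the real generator works from the parsed Ontology and the code templates of the basic profile.

    import java.util.Arrays;
    import java.util.Collections;
    import java.util.List;

    /** Sketch of the level-by-level code generation walk over metric individuals. */
    public class MeasurementCodeGenerator {

        /** Hypothetical in-memory view of a metric individual parsed from the Ontology. */
        static class MetricIndividual {
            final String name;
            final boolean complex;
            final List<MetricIndividual> operands;   // empty for an AtomicMetric

            MetricIndividual(String name, boolean complex, List<MetricIndividual> operands) {
                this.name = name;
                this.complex = complex;
                this.operands = operands;
            }
        }

        /** Visit the top-level metrics of the SLO, then their operands, and so on. */
        public void generate(List<MetricIndividual> topLevelMetrics) {
            for (MetricIndividual metric : topLevelMetrics) {
                if (metric.complex) {
                    emitCollector(metric);       // one collector per ComplexMetric
                    generate(metric.operands);   // then descend to the next metric level
                } else {
                    emitHandler(metric);         // one measurement handler per AtomicMetric
                }
            }
        }

        private void emitCollector(MetricIndividual metric) {
            System.out.println("generate collector for " + metric.name);
        }

        private void emitHandler(MetricIndividual metric) {
            System.out.println("generate handler for " + metric.name);
        }

        public static void main(String[] args) {
            MetricIndividual atomic = new MetricIndividual("ResponseTimeMSMetric1", false,
                    Collections.<MetricIndividual>emptyList());
            MetricIndividual complex = new MetricIndividual("AverageResponseTimeMSMetric1", true,
                    Arrays.asList(atomic));
            new MeasurementCodeGenerator().generate(Arrays.asList(complex));
        }
    }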
6. Related Works

   There are many research works that target describing, advertising and signing up to Web services at defined QoS levels. These are normally application-layer specifications which are hardware and platform independent. They include the aspect oriented approach [10], such as the QuO framework [18] and its QoS Description Language (QDL); the object oriented approach, such as HP's QoS Modeling Language (QML) [6]; XML-based QoS languages, such as HP's Web services Management Language (WSML) and its framework [12], IBM's Web Service Level Agreement (WSLA) language [11] and its supporting framework [4], and the Web Services Offer Language (WSOL) [14]; as well as approaches based on WS-Policy [7].

   QDL consists of three sub-languages: the contract description language (CDL), the structure description language (SDL), and the resource description language (RDL). The CDL specifies a QoS contract and contains: nested regions for the possible states of QoS; transitions that trigger behavior when the region changes; system condition objects for measuring QoS information; and callbacks for notification. There is no special reusable construct for specification reuse.

   QML [6] is a non-XML based specification for defining multi-category QoS specifications for components in distributed object systems. Through an object oriented approach, it provides specification refinement and simple contract types such as reliability and performance. Complex QoS specifications can also be expressed, for example using percentiles, variance and frequency. Profile refinement and conformance are defined for profile management. QML supports specification reusability through contract and profile refinement.

   WSML [12] and WSLA [11] were developed for the XML-based specification of custom-made SLAs for Web services. They define SLAs which contain QoS constraints, prices and management information. In addition to the SLA definition, they are oriented towards management applications in enterprise scenarios, and appropriate management infrastructures accompany these specifications. In the SLA definition aspect, these specifications try to provide a precise and flexible solution. Some support for templates is available in WSML and WSLA to provide flexibility.

   WSOL [14] provides a formal representation of various constraints as well as management statements. Its major features are its rich set of reusability constructs and its lightweight management infrastructure. The definition of how its QoS metrics are measured or computed is done in external Ontologies.

   WS-Policy [7] is a general framework for the specification of policies for Web services. The details of the specification for particular categories of policies are to be defined in specialized languages. It is flexible because policies are not limited to certain places and its specification is extensible through additional specifications. However, when these new specifications will appear and how the policies are monitored and evaluated remain open problems.

   Our QoS Ontology is based on the DAML+OIL layer instead of a pure XML layer. This brings some special advantages gained from semantic web technology:

   • Interoperability: In a complex scenario such as an enterprise application, there are a variety of management systems and tools to assist system discovery, design, and execution. A semantic web based specification provides a way for different systems to speak the same language, and integration between different tools is simplified. Previous supporting systems normally use wrappers to solve interoperability problems; a syntax wrapper is not necessary if the semantics approach is chosen. Uniform data definitions also provide a better view for decision-making and matchmaking. A common Ontology should be established for each domain so that all partners in the cooperation speak and understand the same words [5].

   • Automation: Higher-level automation is achieved through the logical view of the system's knowledge. The system can aggregate knowledge from various components. Matchmaking, validation, decision making, and so forth, are based on the logic in the collected knowledge rather than on hard-coded programs. Reasoners or rule engines help the system to achieve better automation.

   • Extensibility: Different Web services have various requirements for their QoS descriptions. New requirements are relatively easy to add into the knowledge base. When a new definition for a QoS property is introduced, its related metrics definition needs to be introduced as well to resolve the logical warnings. The openness of Ontology definition facilitates the sharing of experience and speeds up the development cycle.

   Similar to the QML specification, we use the object oriented approach to define our specification. The object oriented approach is implicitly inherited from DAML+OIL to enable the specification's reusability. Any programmer who is familiar with the object oriented design principle will have a relatively short learning curve for specification design and reuse. Different from the QML syntax, our approach is based
on XML syntax and its semantics is based on DL. In this paper we mainly focus on using this Ontology as a descriptive advertisement for service design and measurement purposes. It does not address the problem of which actions to trigger at runtime if the QoS requirements cannot be satisfied; therefore some specifications, such as QDL, are more expressive than the DAML-QoS Ontology. WSLA and some other works have made good efforts on the SLA supporting framework [4] for Web services. An SLA supporting system based on the Ontology layer offers better interoperability and automation. Agents and third parties can share the domain knowledge and export/import partial information for the measurement task.

   Thanks to the efforts of semantic web research groups, well-established reasoning tools are a great help for checking the validity of the SLO and for building the matchmaking algorithm for DAML-QoS. The syntax parsers for DAML+OIL (or later OWL) and the DL reasoners ensure quick development of the system frameworks. Furthermore, because these tools have already been tested by many research groups, we are more confident about the tools' correct interpretation of the syntax and semantics of our DAML-QoS specification. The design department can get related technical support easily as well. When a more optimized reasoner is built by logic experts, it can be used directly rather than redeveloping all the tools from scratch.

7. Conclusion

   This paper describes the QoS Metrics Layer in the DAML-QoS [16] Ontology and presents the measurement supporting system for the QoS Metrics Layer. As a complement to DAML-S, DAML-QoS helps to provide detailed QoS constraint information for service discovery purposes. Meanwhile, it provides clear metric definitions for QoS measurement purposes. During design time, the QoS specification guides the developer's design choices. The validation of the specification helps to check whether there is a conflict in the QoS constraints. During run time, additional measurement information is specified in the metric individual instances by the measurement partner. The code generator helps to generate the code for the measurement system according to the metric individual definitions. Based on the Ontology approach, better interoperability, automation, and extensibility can be achieved.

   In future we would like to convert the current system from DAML+OIL to OWL. Furthermore, we would like to incorporate the QoS matchmaking system into our current QoS-aware service discovery framework [17].

References

 [1] J. Cardoso, A. Sheth, and J. Miller. Workflow Quality of Service. In Proceedings of the International Conference on Enterprise Integration and Modeling Technology and International Enterprise Modeling, 2002.
 [2] DAML-S Coalition. DAML-S: Web Service Description for the Semantic Web. In Proc. International Semantic Web Conference (ISWC 02), 2002.
 [3] DAML+OIL. The DAML+OIL Language. 2001.
 [4] A. Dan, D. Davis, R. Kearney, A. Keller, R. King, D. Kuebler, H. Ludwig, M. Polan, M. Spreitzer, and A. Youssef. Web services on demand: WSLA-driven automated management. IBM Systems Journal, Mar 2004.
 [5] M. Fox and M. Gruninger. Ontologies for enterprise integration. In 2nd International Conference on Cooperative Information Systems (CoopIS), pages 82-89, Toronto, Canada, 1994.
 [6] S. Frolund and J. Koistinen. QML: A Language for Quality of Service Specification. Technical report, HPL-98-10, Feb 1998.
 [7] M. Hondo and C. Kaler. Web Services Policy Framework (WS-Policy) Version 1.0, December 2002.
 [8] I. Horrocks and U. Sattler. Ontology reasoning in the SHOQ(D) description logic. In Proceedings of the Seventeenth International Joint Conference on Artificial Intelligence, 2001.
 [9] HP Labs Semantic Web Programme. Jena - a semantic web framework for Java. Available: http://jena.sourceforge.net/index.html, 2002.
[10] J. Jin and K. Nahrstedt. QoS Specification Languages for Distributed Multimedia Applications: A Survey and Taxonomy. IEEE Multimedia Magazine, 11:74-87, July 2004.
[11] H. Ludwig, A. Keller, A. Dan, R. P. King, and R. Franck. Web Service Level Agreement (WSLA) Language Specification, v1.0, Jan 2003.
[12] A. Sahai, A. Durante, and V. Machiraju. Towards Automated SLA Management for Web Services. HPL-2001-310 (R.1), 2002.
[13] A. D. Team. Apache Axis 1.0, http://ws.apache.org/axis/, 2002.
[14] V. Tosic, B. Pagurek, and K. Patel. WSOL - A Language for the Formal Specification of Classes of Service for Web Services. In the Int. Conf. on Web Services (ICWS03), 2003.
[15] W3C Committee. OWL Web Ontology Language Reference. Available: http://www.w3.org/TR/owl-ref/, Feb 2004.
[16] C. Zhou, L.-T. Chia, and B.-S. Lee. DAML-QoS Ontology for Web Services. In International Conference on Web Services (ICWS04), pages 472-479, 2004.
[17] C. Zhou, L.-T. Chia, and B.-S. Lee. QoS-Aware and Federated Enhancement for UDDI. International Journal of Web Services Research (JWSR), 2:58-85, 2004.
[18] J. A. Zinky, D. E. Bakken, and R. E. Schantz. Architectural support for quality of service for CORBA objects. Theory and Practice of Object Systems, 1997.

				