ICEIS 2011 Abstracts


Area 1 - Databases and Information Systems Integration

Full Papers
Paper Nr: 21
Title:

DOCUMENTS AS A BAG OF MAXIMAL SUBSTRINGS - An Unsupervised Feature Extraction for Document Clustering

Authors:

Tomonari Masada

Abstract: This paper provides experimental results showing how maximal substrings can be used as elementary features in document clustering. We extract maximal substrings, i.e., substrings whose number of occurrences decreases when even a single character is added at their head or tail, from the given document set and represent each document as a bag of maximal substrings after reducing the variety of maximal substrings by a simple frequency-based selection. This extraction can be done in an unsupervised manner. Our experiment compares the bag-of-maximal-substrings representation with the bag-of-words representation in document clustering. For clustering documents, we use Dirichlet compound multinomials, a Bayesian version of multinomial mixtures, and measure the results by F-score. Our experiment showed that maximal substrings were as effective as words extracted by dictionary-based morphological analysis for Korean documents. For Chinese documents, maximal substrings were not as effective as words extracted by supervised segmentation based on conditional random fields. However, one fourth of the clustering results given by the bag-of-maximal-substrings representation achieved F-scores better than the mean F-score of the bag-of-words representation. The use of maximal substrings thus achieved acceptable performance in document clustering.
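
The definition of a maximal substring above can be checked directly on a toy corpus. The following sketch is not the paper's extraction method (which would use efficient suffix-structure techniques); it enumerates all substrings by brute force and keeps those whose occurrence count strictly drops under every one-character extension at the head or tail:

```python
from collections import Counter

def maximal_substrings(text, min_count=2):
    """Return (substring, count) pairs that are 'maximal': adding one
    character at the head or tail strictly reduces the occurrence count."""
    # Count every substring (brute force; fine for short illustrative texts).
    counts = Counter(text[i:j] for i in range(len(text))
                     for j in range(i + 1, len(text) + 1))
    alphabet = set(text)
    maximal = []
    for s, c in counts.items():
        if c < min_count:
            continue  # simple frequency-based selection
        left_ok = all(counts.get(ch + s, 0) < c for ch in alphabet)
        right_ok = all(counts.get(s + ch, 0) < c for ch in alphabet)
        if left_ok and right_ok:
            maximal.append((s, c))
    return sorted(maximal, key=lambda x: -x[1])
```

For example, in the text "abcabc", the substring "abc" occurs twice, while every one-character extension of it occurs at most once, so it is the only maximal substring with count at least 2.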

Paper Nr: 48
Title:

A COMPREHENSIVE STUDY OF THE EFFECT OF CLASS IMBALANCE ON THE PERFORMANCE OF CLASSIFIERS

Authors:

Rodica Potolea and Camelia Lemnaru

Abstract: Class imbalance is one of the significant issues affecting the performance of classifiers. In this paper we systematically analyze the effect of class imbalance on several standard classification algorithms. The study is performed on benchmark datasets, in relation to concept complexity, the size of the training set, and the ratio between the number of instances and the number of attributes of the training data. In the evaluation we considered six different metrics. The results indicate that the multilayer perceptron is the most robust to imbalance in the training data, while the support vector machine’s performance is the most affected. We also found that unpruned C4.5 models work better than the pruned versions.
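
As a generic illustration (not taken from the paper) of why several metrics are needed under imbalance, consider metrics computed from a binary confusion matrix: plain accuracy can stay high while the minority class is missed entirely, whereas F-measure and the geometric mean expose the failure:

```python
def imbalance_aware_metrics(tp, fp, tn, fn):
    """Metrics commonly used when classes are imbalanced, computed from
    confusion-matrix counts (tp/fp/tn/fn)."""
    accuracy = (tp + tn) / (tp + fp + tn + fn)
    recall = tp / (tp + fn) if tp + fn else 0.0        # minority-class TPR
    precision = tp / (tp + fp) if tp + fp else 0.0
    specificity = tn / (tn + fp) if tn + fp else 0.0   # majority-class TNR
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)
    g_mean = (recall * specificity) ** 0.5
    return {"accuracy": accuracy, "f1": f1, "g_mean": g_mean}
```

A classifier that labels everything as the majority class on a 1:99 dataset scores 0.99 accuracy but 0.0 on both F1 and G-mean.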

Paper Nr: 50
Title:

LOCAL ONTOLOGIES FOR SEMANTIC INTEROPERABILITY IN SUPPLY CHAIN NETWORKS

Authors:

Milan Zdravković

Abstract: Most of the issues in current supply chain management practice are related to the challenges of interoperability between the relevant enterprise information systems (EIS). In this paper, we present an ontological framework for semantic interoperability of EISs in supply chain networks, based on the Supply Chain Operations Reference (SCOR) model, its semantic enrichment, and mappings to relevant enterprise conceptualizations. In order to introduce the realities of the enterprises, namely their models, into this framework, we define and implement an approach to generating local ontologies from the databases of their EISs. We also discuss the translation between semantic and SQL queries, a process in which the implicit semantics of the EISs’ databases and the explicit semantics of the local ontologies become inter-related.

Paper Nr: 68
Title:

DATA MINING ON DENGUE VIRUS DISEASE

Authors:

Daranee Thitiprayoonwongse

Abstract: Dengue infection is an epidemic disease typically found in tropical regions. Its symptoms appear rapidly and can become severe within a short time. The World Health Organization (WHO) classifies dengue infection as Dengue Fever (DF) or Dengue Hemorrhagic Fever (DHF), and the symptoms of DHF are divided into four grades. A problem arises when an expert misdiagnoses a dengue infection, for example diagnosing a patient as non-dengue or DF even though the patient has DHF; this can be fatal if the patient does not receive treatment. We therefore adopt a data mining approach to this problem, employing a decision tree algorithm to learn new knowledge from the data set. The first experimental result yields knowledge for classifying dengue infection into four groups (DF, DHF I, DHF II, and DHF III), with an average accuracy of 96.50%. The second experimental result yields a tree and a set of rules for classifying dengue infection into two groups following our assumption, with an accuracy of 96.00%. Furthermore, comparing our performance in terms of false negatives with the WHO criteria and with other researchers, we found that our approach outperforms them.

Paper Nr: 93
Title:

APPLICATION INTEGRATION OF SIMULATION TOOLS CONSIDERING DOMAIN SPECIFIC KNOWLEDGE

Authors:

Tobias Meisen and Philipp Meisen

Abstract: Because of the increasing complexity of modern production processes, it is necessary to plan these processes virtually before realizing them in a real environment. On the one hand, there are specialized simulation tools that simulate a specific production technique with an accuracy close to that of the real object of the simulation. On the other hand, there are simulations of whole production processes, which often do not achieve prediction accuracy comparable to the specialized tools. Simulating a production process as a whole with the needed accuracy is hard to realize: incompatible file formats, different semantics used to describe the simulated objects, and missing data consistency are the main causes of this integration problem. In this paper, a framework is presented that enables the interconnection of production-engineering simulation tools while considering the specific knowledge of a certain domain (e.g., material processing). To this end, an ontology-based integration approach using domain-specific knowledge to identify the necessary semantic transformations has been realized. The framework provides generic functionality which, once concretized for a domain, enables the system to integrate any domain-specific simulation tool into the process.

Paper Nr: 143
Title:

POST-PROCESSING ASSOCIATION RULES WITH CLUSTERING AND OBJECTIVE MEASURES

Authors:

Veronica Oliveira de Carvalho

Abstract: The post-processing of association rules is a difficult task, since a large number of patterns can be obtained. Many approaches have been developed to overcome this problem, such as objective measures and clustering, which are used, respectively, to (i) highlight the potentially interesting knowledge in a domain and (ii) structure the domain by organizing the rules into groups that contain, in some sense, similar knowledge. However, objective measures neither reduce nor organize the collection of rules, making the domain difficult to understand. Clustering, on the other hand, neither reduces the exploration space nor directs the user toward interesting knowledge, making the search for relevant knowledge harder. This work proposes the PAR-COM (Post-processing Association Rules with Clustering and Objective Measures) methodology, which combines clustering and objective measures to reduce the association rule exploration space while directing the user to what is potentially interesting. Thereby, PAR-COM minimizes the user’s effort during post-processing.
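
For reference, the classic objective measures typically used in this kind of rule post-processing (support, confidence, lift) can be computed from simple contingency counts. The sketch below is a generic illustration of those measures, not the PAR-COM methodology itself:

```python
def objective_measures(n, n_a, n_b, n_ab):
    """Objective measures for a rule A -> B.
    n: total transactions; n_a, n_b: transactions containing A, B;
    n_ab: transactions containing both A and B."""
    support = n_ab / n
    confidence = n_ab / n_a               # P(B | A)
    lift = confidence / (n_b / n)         # >1 means A and B co-occur
    return {"support": support, "confidence": confidence, "lift": lift}
```

A rule whose lift is close to 1 co-occurs no more often than chance, which is one way such measures flag uninteresting rules for pruning.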

Paper Nr: 157
Title:

RESOURCE-AWARE HIGH QUALITY CLUSTERING IN UBIQUITOUS DATA STREAMS

Authors:

Ching-Ming Chao and Guan-Lin Chao

Abstract: Data stream mining has attracted much research attention from the data mining community. With the advance of wireless networks and mobile devices, the concept of ubiquitous data mining has been proposed. However, mobile devices are resource-constrained, which makes data stream mining a greater challenge. In this paper, we propose the RA-HCluster algorithm, which can be used on mobile devices for clustering stream data. It adapts algorithm settings and compresses stream data based on currently available resources, so that mobile devices can continue clustering at acceptable accuracy even under low memory resources. Experimental results show that not only is RA-HCluster more accurate than RA-VFKM, but it is also able to maintain a low and stable memory usage.

Paper Nr: 253
Title:

UF-EVOLVE - UNCERTAIN FREQUENT PATTERN MINING

Authors:

Shu Wang and Vincent Ng

Abstract: Many frequent-pattern mining algorithms, such as the FP-tree structure and the FP-growth algorithm, were designed to handle precise data. In data mining research, attention has recently turned to mining frequent patterns in uncertain data, so frequent-pattern mining algorithms that handle uncertain data are needed. A common way to represent the uncertainty of a data item in record databases is to associate it with an existential probability. In this paper, we propose a novel uncertain-frequent-pattern discovery structure, the mUF-tree, for storing summarized, uncertain information about frequent patterns. With the mUF-tree, the UF-Evolve algorithm can use shuffling and merging techniques to generate iterative versions of the tree. Our main purpose is to discover new uncertain frequent patterns from these iterative versions of the mUF-tree. Our preliminary performance study shows that the UF-Evolve algorithm is efficient and scalable for mining additional uncertain frequent patterns over uncertain databases of different sizes.
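
The abstract does not spell out its uncertainty model; a common choice in this literature, on which UF-tree-style structures build, is the expected support of an itemset under independent existential probabilities. A minimal sketch of that baseline notion (not of the mUF-tree itself):

```python
def expected_support(db, itemset):
    """Expected support of an itemset in an uncertain database.
    db: list of transactions, each a dict mapping item -> existential
    probability. Assuming item independence, each transaction contributes
    the product of its items' probabilities."""
    total = 0.0
    for txn in db:
        p = 1.0
        for item in itemset:
            p *= txn.get(item, 0.0)   # absent item => probability 0
        total += p
    return total
```

An itemset is then called frequent when its expected support meets a user-given threshold, just as raw counts are thresholded in the precise-data setting.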

Paper Nr: 269
Title:

UNIFICATION OF XML DOCUMENT STRUCTURES FOR DOCUMENT WAREHOUSE (DocW)

Authors:

Ines Ben Messaoud and Jamel Feki

Abstract: Data warehouses and OLAP (On-Line Analytical Processing) technologies analyse the huge amounts of structured data that companies store in conventional databases. Recent works underline the importance of textual data for the decision-making process and have therefore led to the building of document warehouses. Indeed, documents help decision makers better understand the evolution of their business activities. In general, these documents exist in XML format, are geographically distributed, and are described by multiple, different structures. This paper presents a method for building a distributed document warehouse. The method consists of two steps: i) unification of XML document structures in order to establish a global, generic view of the distributed document warehouse, and ii) multidimensional modeling of the unified documents for decisional purposes. More specifically, this paper focuses on the unification step.

Paper Nr: 287
Title:

DOCUMENT CLASSIFICATION - Combining Structure and Content

Authors:

Samaneh Chagheri and Sylvie Calabretto

Abstract: Technical documentation such as user manuals and manufacturing documents is now an important part of industrial production. Indeed, without such documents, products can be neither manufactured nor used, given their complexity. The increasing volume of such documents stored in electronic format therefore calls for an automatic classification system that categorizes them into pre-defined classes and allows the information to be retrieved quickly. These documents are strongly structured and contain elements such as tables and schemas; however, traditional document classification typically considers only the document text and ignores its structural elements. In this paper, we propose a method that makes use of structural elements to create the document feature vector for classification. A feature in this vector is a combination of a term and its structure, where the document structure is represented by the tags of the XML document. The SVM algorithm is used for learning and classification.

Paper Nr: 296
Title:

CONCEPTUALISATION APPROACH FOR COOPERATIVE INFORMATION SYSTEMS INTEROPERABILITY

Authors:

Mario Lezoche

Abstract: In order to increase enterprise performance, economic paradigms focus, now more than ever, on how to better manage information. The modern architecture of information systems is based on distributed networks, and a grand challenge is representing and sharing the knowledge managed by those ISs. One of the main issues in making such heterogeneous Cooperative Information Systems (CIS) work together is removing semantic interoperability barriers. This paper first analyses interoperability issues between CISs and then proposes patterns for conceptualising data models to make their knowledge explicit, based on expert-knowledge injection rules and a fact-oriented approach. A case study is presented, related to a work-order process in Sage X3, an Enterprise Resource Planning application.

Paper Nr: 333
Title:

DATA CONCERN AWARE QUERYING FOR THE INTEGRATION OF DATA SERVICES

Authors:

Muhammad Intizar Ali and Reinhard Pichler

Abstract: There is an increasing trend for organizations to publish data over the web using data services. The published data is often associated with data concerns like privacy, licensing, pricing, quality of data, etc. This raises several new challenges. For instance, it must be ensured that data consumers utilize the data in the right way and are bound to the rules and regulations defined by the data owner and data service provider. Current Data Integration systems using data services lack the ability to preserve data concerns while querying multiple services in an integrated environment. In this paper, we design a new querying system which takes data concerns into account. It supports automatic service selection based on data concerns and perfectly fits into dynamic data integration applications.

Short Papers
Paper Nr: 14
Title:

EVENT-BASED MAINTENANCE OF DIGITAL LIBRARY COLLECTIONS

Authors:

Wendy Osborn

Abstract: We consider the problem of updating a digital library collection automatically when certain events occur. Existing digital library systems require the user to either re-build manually or to schedule re-building at specified times. Ideally, a collection should be re-built whenever some collection-changing event occurs. We propose the Event Monitor for Greenstone, a tool that continually tests for collection-altering events and re-builds the collection when these events occur. The Event Monitor interacts with the existing scheduling mechanism on the host system, thereby making it a simple yet powerful tool for event-based maintenance.

Paper Nr: 40
Title:

RESEARCH OF CREDIT RISK OF COMMERCIAL BANK PERSONAL LOAN BASED ON ASSOCIATION RULE

Authors:

Zhang Zenglian

Abstract: To guard against financial risks, reduce bad loans, and increase commercial banks’ ability to identify risk, the key is risk warning. In view of the increasing proportion of personal loans in the banking business, it is particularly important to provide early warning of personal loan credit risk. Commercial bank lending is itself a complex nonlinear system whose laws are difficult to reflect objectively using general linear theory, so this paper uses association rules. A personal loan credit index is first constructed, and the Apriori algorithm is then used to extract rules. The results show that the Apriori algorithm plays an important role in identifying risk in personal loans.
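
The paper's credit index and loan data are not given in the abstract, but the Apriori step it relies on can be sketched generically. This is a minimal, illustrative frequent-itemset miner over invented loan-attribute transactions (rule generation from the itemsets is omitted):

```python
from itertools import combinations

def apriori(transactions, min_support):
    """Frequent itemsets via the Apriori principle: every subset of a
    frequent itemset must itself be frequent."""
    n = len(transactions)
    txns = [frozenset(t) for t in transactions]
    items = {i for t in txns for i in t}

    def support(itemset):
        return sum(itemset <= t for t in txns) / n

    frequent, current = {}, [frozenset([i]) for i in items]
    while current:
        counted = {c: support(c) for c in current}
        level = {c: s for c, s in counted.items() if s >= min_support}
        frequent.update(level)
        # Candidate generation: join frequent k-itemsets into (k+1)-itemsets.
        keys = list(level)
        current = list({a | b for a in keys for b in keys
                        if len(a | b) == len(a) + 1})
    return frequent
```

Running it on four toy loan records where "loan" and "default" co-occur in half the transactions surfaces the itemset {loan, default} with support 0.5, the kind of co-occurrence a risk-warning index would flag.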

Paper Nr: 70
Title:

IRTA: AN IMPROVED THRESHOLD ALGORITHM FOR REVERSE TOP-K QUERIES

Authors:

Cheng Luo, Feng Yu, Wen-Chi Hou and Zhewei Jiang

Abstract: Reverse top-k queries are recently proposed to help producers (or manufacturers) predict the popularity of a particular product. They can also help them design effective marketing strategies to advertise their products to a target audience. This paper designs an innovative algorithm, termed IRTA (Improved Reverse top-k Threshold Algorithm), to answer reverse top-k queries efficiently. Compared with the state-of-the-art RTA algorithm, it further reduces the number of expensive top-k queries. Besides, it utilizes the dominance and reverse-dominance relationships between the query product and the other products to cut down the cost of each top-k query. Comprehensive theoretical analyses and experimental studies show that IRTA is a more effective algorithm than RTA.
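
A reverse top-k query can be stated concretely: given a query product, find the users (weighting vectors) whose top-k result contains it. The naive baseline below evaluates one full top-k query per user, which is exactly the cost RTA and the proposed IRTA work to reduce; treating scores as costs to be minimized is an assumption made here for illustration:

```python
def top_k(products, weights, k):
    """Rank products by weighted score (lower = better, treating the
    score as a cost); return the top-k product ids."""
    scored = sorted(products,
                    key=lambda p: sum(w * v for w, v in zip(weights, p[1])))
    return [pid for pid, _ in scored[:k]]

def reverse_top_k(products, query, user_weightings, k):
    """Naive baseline: a user 'sees' the query product if it appears in
    that user's top-k result."""
    all_products = products + [query]
    return [w for w in user_weightings
            if query[0] in top_k(all_products, w, k)]
```

With two existing products and a balanced query product, users who weight either attribute heavily include the query in their top-2, while a user who weights both equally does not.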

Paper Nr: 75
Title:

STATIC OPTIMIZATION OF DATA INTEGRATION PLANS IN GLOBAL INFORMATION SYSTEMS

Authors:

Janusz R. Getta

Abstract: Global information systems provide their users with a centralized and transparent view of many heterogeneous and distributed sources of data. Requests to access data at a central site are decomposed and processed at the remote sites, and the results are returned to the central site. A data integration component of the system processes the data retrieved and transmitted from the remote sites according to previously prepared data integration plans. This work addresses the problem of static optimization of data integration plans in a global information system. Static optimization means that a data integration plan is transformed into a more optimal form before it is used for data integration. We adopt an online approach to data integration, in which the packets of data transmitted over a wide area network are integrated into the final result as soon as they arrive at the central site. We show how a data integration expression obtained from a user request can be transformed into a collection of data integration plans, one for each argument of the expression. This work proposes a number of static optimization techniques that change the order of operations and eliminate materialization and constant arguments from data integration plans implemented as relational algebra expressions.

Paper Nr: 82
Title:

NON-EXHAUSTIVE JOIN ORDERING SEARCH ALGORITHMS FOR LJQO

Authors:

Tarcizio Alexandre Bini, Adriano Lange and Marcos Sfair Sunye

Abstract: In relational database systems, the optimization of select-project-join queries is a combinatorial problem, and the use of exhaustive search methods is prohibitive because of the exponential growth of the search space. Randomized searches are used to find near-optimal plans in polynomial time. In this paper, we investigate the large join query optimization (LJQO) problem by extending randomized algorithms and implementing a 2PO algorithm as a query optimizer in a popular open-source DBMS. We compare our solution with an implementation of a genetic algorithm. Using a multidimensional test schema, we discuss the pros and cons of the behavior of these algorithms. Our results show that the 2PO algorithm is fast to run and that, in most cases, the costs of the generated plans are better than those of the genetic algorithm.

Paper Nr: 98
Title:

RISK MANAGEMENT IN SUPPLY NETWORKS FOR HYBRID VALUE BUNDLES - A Risk Assessment Framework

Authors:

Holger Schrödl and Matthias Geier

Abstract: In the market for tangible goods there is an increasing trend away from the production of single individual products towards individualized mass customization. At the same time, so-called hybrid value bundles are gaining more and more importance for achieving market share and differentiating from competitors. Hybrid value bundles are integrated solutions combining tangible and intangible goods. For these complex solutions, subparts are often delivered by different suppliers and have to be bundled by a focal supplier, who delivers them as a single solution to the customer. The large number of heterogeneous suppliers within the supplier network requires complex supplier relationship management, and classic supply chain management techniques fail because of the specific requirements of hybrid value bundles. One major issue in supplier management is risk management: the focal supplier has to evaluate its suppliers according to risk characteristics and then choose those with the lowest risk. In this paper a risk management model is presented that addresses the specific requirements of hybrid value bundles and complex supply networks. This model may serve as a risk assessment framework for a focal supplier to identify optimal supply chains for a specific offering.

Paper Nr: 113
Title:

SEMANTICALLY ENHANCING MULTIMEDIA DATA WAREHOUSES - Using Ontologies as Part of the Metadata

Authors:

Andrei Vanea and Rodica Potolea

Abstract: Data warehouses are versatile systems capable of storing and processing large quantities of data. They are most suited for aggregating and reporting. The data managed by these systems vary from simple, numeric data, to more complex, multimedia data. One of the domains in which multimedia data is intensively produced is medicine. We present a method for semantically enhancing the metadata stored in a medical multimedia data warehouse. This semantically rich environment will gain in autonomy, reducing the dependence on human intervention to resolve new, unforeseen queries. Furthermore, the use of the semantic relations defined in the ontology allows the system to speed up the execution of a query, by computing the results of new, unforeseen queries, from the fact data already stored in the data warehouse.

Paper Nr: 136
Title:

PERFORMANCE EVALUATION OF QUERY TRIMMING STRATEGIES IN SEMANTIC CACHING ENVIRONMENT

Authors:

S. Kami Makki and Stefan Andrei

Abstract: Semantic caching is an efficient caching strategy for client-side processing of queries. It compares user queries with previously cached queries and finds the similarities between them. These similarities constitute a partial answer to the user query and are therefore extracted from it, so that only the remainder of the user query is sent to the server. This not only significantly reduces the communication between client and server, freeing network bandwidth, but also improves the speed of query processing in a distributed environment. This paper presents simulations of the manipulation of multi-table queries and provides extensive simulations for single-table queries in comparison with previous methods.

Paper Nr: 151
Title:

COMPARATIVE ANALYSIS OF THREE TECHNIQUES FOR PREDICTIONS IN TIME SERIES HAVING REPETITIVE PATTERNS

Authors:

Arash Niknafs and Bo Sun

Abstract: Nonlinear patterns can be modelled with nonlinear regression (curve fitting) methods, but they can also be modelled with linear regression (LR) methods. The latter kind of modelling is usually used to depict and study trends, not for prediction. Our goal is to study the applicability and accuracy of piecewise linear regression for predicting a target variable over different time spans (where a pattern is repeated). Using a moving average, we identify the split points and then test our approach on a real-world case study: the dataset of the amounts of recycling material in Blue Carts in Calgary (more than 31,000 records). Root mean square error (RMSE) and Spearman’s rho are used to evaluate the applicability of this prediction approach and its performance. A comparison between the performance of Support Vector Machines (SVM), Neural Networks (NN), and the proposed LR-based prediction approach is also presented. The results show that the proposed approach works very well for such prediction purposes: it outperforms SVM and is a strong competitor to NN.
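
The piecewise idea can be sketched with ordinary least squares fitted independently on each segment. Here the split points are assumed to be already known (the paper locates them with a moving average), and the data are invented for illustration:

```python
def fit_line(xs, ys):
    """Ordinary least squares fit for y = a*x + b."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    a = sxy / sxx
    return a, my - a * mx

def piecewise_predict(xs, ys, split_points, x_new):
    """Fit one line per segment between split points and predict with
    the line of the segment that contains x_new."""
    bounds = [min(xs)] + sorted(split_points) + [max(xs) + 1]
    for lo, hi in zip(bounds, bounds[1:]):
        if lo <= x_new < hi:
            seg = [(x, y) for x, y in zip(xs, ys) if lo <= x < hi]
            a, b = fit_line([x for x, _ in seg], [y for _, y in seg])
            return a * x_new + b
```

On data with a jump at x = 3, each segment gets its own line, so predictions on either side of the split follow the local trend rather than a single global fit.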

Paper Nr: 161
Title:

GUIDELINES FOR THE CHOICE OF VISUALIZATION TECHNIQUES APPLIED IN THE PROCESS OF KNOWLEDGE EXTRACTION

Authors:

Juliana Keiko Yamaguchi

Abstract: Visualization techniques are tools that can improve an analyst’s insight into the results of the knowledge discovery process, or that can be used to explore and analyze data directly. They allow analysts to interact with the graphical representation to gain new knowledge. The choice of visualization techniques must follow certain criteria to guarantee a consistent data representation. This paper presents a study, based on Grounded Theory, that identifies parameters for selecting visualization techniques, namely: data type, task type, data volume, data dimensionality, and the position of the attributes in the display. These parameters are analyzed in the context of several categories of visualization techniques: standard 1D-3D graphics, iconographic techniques, geometric techniques, pixel-oriented techniques, and graph-based or hierarchical techniques. The analysis of the associations between these parameters and the visualization techniques culminated in the establishment of guidelines for choosing the most appropriate techniques according to the characteristics of the data and the objective of the knowledge discovery process.

Paper Nr: 190
Title:

A NEW METHOD FOR MONITORING INDUSTRIAL PRODUCT-SERVICE SYSTEMS BASED ON BSC AND AHP

Authors:

Michael Abramovici and Feng Jin

Abstract: Driven by increased competitive pressure in the last few years, a number of manufacturers are shifting their focus from products towards Industrial Product-Service Systems (IPS²). However, the shift to IPS² is also accompanied by risks. The monitoring of IPS² can support executives in identifying IPS² risks in time and can serve as the basis for optimizing future IPS². In this paper a new method for the hierarchical monitoring of IPS² is developed based on the Balanced Scorecard (BSC) and the Analytic Hierarchy Process (AHP). The performances and imbalance degrees of IPS² on different levels are calculated to characterize IPS² comprehensively. BSC is applied to define IPS²-specific perspectives and indicators, while AHP is used to construct a hierarchical monitoring structure and to generate weights for the different IPS²-specific perspectives and indicators. Finally, a case study is presented to validate the method.
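
The AHP weighting step mentioned above has a standard computation that can be sketched independently of the paper: priorities are derived from a reciprocal pairwise comparison matrix, here via the common geometric-mean approximation of the principal eigenvector:

```python
from math import prod

def ahp_weights(pairwise):
    """Approximate AHP priority weights from a reciprocal pairwise
    comparison matrix, using the geometric mean of each row."""
    n = len(pairwise)
    gm = [prod(row) ** (1.0 / n) for row in pairwise]
    total = sum(gm)
    return [g / total for g in gm]
```

For two criteria where the first is judged three times as important as the second (matrix [[1, 3], [1/3, 1]]), the method yields weights of 0.75 and 0.25, which would then scale the corresponding BSC indicators.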

Paper Nr: 195
Title:

AVIS - APPLIED VISA INFORMATION SYSTEM - Case Study for the Embassy of the Gabonese Republic in China

Authors:

Tristan Daladier Engouang and Liu Yun

Abstract: Nowadays, the handwritten workbook, once a valuable tool, is disappearing in favor of computer keyboard typing. Since the establishment of Sino-Gabonese diplomatic relations in 1974, the embassy of the Gabonese Republic to China, using a paper-based visa application procedure, has accumulated a very large amount of paperwork data, and using this data efficiently has become a major problem. A computerized data management application is proposed that manages new visa applications and records the existing paper archives as electronic data, for effective analysis and knowledge of visa demand. In this paper, the Applied Visa Information System (AVIS) is developed, a system consisting of data storage and data view modules. The development process and implementation of AVIS, and the procedures for operating the system, are detailed. The system works with a database that contains all applicants’ information and the resulting visa decisions, such as accepted or denied.

Paper Nr: 209
Title:

LABEL ORIENTED CLUSTERING FOR SOCIAL NETWORK DISCUSSION GROUPS

Authors:

Ahmed Rafea

Abstract: This paper proposes applying the bisecting K-means algorithm to cluster social network discussion groups and to provide a meaningful label for the cluster containing these groups. The clustering of the discussion groups is based on the heterogeneous meta-features that define each group, e.g., title, description, type, sub-type, and network. The main idea is to represent each group as a tuple of multiple feature vectors, construct a proper similarity measure for each feature space, and then perform the clustering using the proposed bisecting K-means clustering algorithm. The main key phrases are extracted from the titles and descriptions of the discussion groups of a given cluster and combined with the main meta-features to build a phrase label for the cluster. The analysis of the experimental results showed that combining more than one feature produced better clustering in terms of quality and of the interrelationship between the discussion groups of a given cluster. Some features, like the network, improved the compactness and tightness of the objects within the clusters, while other features, like the type and subtype, improved the separation of the clusters.
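
Bisecting K-means itself is standard and can be sketched in a few lines. This toy version uses Euclidean distance on plain numeric tuples, whereas the paper combines per-feature-space similarity measures over heterogeneous meta-features:

```python
import random

def kmeans2(points, iters=20, seed=0):
    """Split one cluster into two with plain k-means (k = 2)."""
    rnd = random.Random(seed)
    centers = rnd.sample(points, 2)
    for _ in range(iters):
        groups = ([], [])
        for p in points:
            dists = [sum((a - b) ** 2 for a, b in zip(p, c)) for c in centers]
            groups[dists.index(min(dists))].append(p)
        # Recompute each center as the mean of its group (keep it if empty).
        centers = [tuple(sum(vals) / len(g) for vals in zip(*g)) if g else c
                   for g, c in zip(groups, centers)]
    return [g for g in groups if g]

def bisecting_kmeans(points, k):
    """Repeatedly bisect the largest cluster until k clusters exist."""
    clusters = [list(points)]
    while len(clusters) < k:
        biggest = max(clusters, key=len)
        clusters.remove(biggest)
        parts = kmeans2(biggest)
        if len(parts) < 2:              # degenerate split: force one
            parts = [biggest[:1], biggest[1:]]
        clusters.extend(parts)
    return clusters
```

Each pass removes the largest cluster and replaces it with its two halves, so the loop always terminates with exactly k clusters that partition the input.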

Paper Nr: 214
Title:

COST-BASED BUSINESS PROCESS DEPLOYMENT ADVISOR

Authors:

Steffen Preissler

Abstract: Service-oriented environments increasingly become the central backbone of a company’s business processes. Thereby, services and business process flows are loosely coupled components that are distributed over many server nodes and communicate via messages. Unfortunately, communication between SOA components using the XML format is still the most resource- and time-consuming activity within a traditional business process flow. This paper describes an approach that considers these transfer-costs and advises a process and service placement for a given set of server nodes to partially eliminate remote message transfers. Furthermore, experiments have been conducted to show its feasibility.

Paper Nr: 248
Title:

CRL DISTRIBUTION USING AN ALTERNATIVE COMMUNICATION MEDIA FOR VEHICULAR NETWORKS

Authors:

HyunGon Kim

Abstract: Efforts on vehicular network security have been undertaken, with consensus on utilizing public key cryptography to secure communications. However, efficiently revoking nodes’ certificates remains a major challenge in vehicular networks. Certificate revocation lists (CRLs) should be distributed quickly to every vehicle in the network to protect vehicles from malicious users and malfunctioning equipment, as well as to increase the overall security and safety of vehicular networks. In this paper, we propose a Terrestrial Digital Multimedia Broadcasting (T-DMB) aided method for CRL distribution. The method broadens network coverage and achieves real-time delivery and enhanced transmission reliability by using an alternative communication medium, namely T-DMB data broadcasting channels. Even if roadside units are sparsely deployed, or not deployed at all, vehicles can obtain recent CRLs from the T-DMB infrastructure. In addition, to broadcast CRLs over the T-DMB infrastructure, we design a new Transport Protocol Experts Group (TPEG) CRL application following the TPEG standards.

Paper Nr: 257
Title:

FAULT DIAGNOSIS OF BATCH PROCESSES RELEASE USING PCA CONTRIBUTION PLOTS AS FAULT SIGNATURES

Authors:

Alberto Wong Ramírez and Joan Colomer Llinàs

Abstract: The diagnosis of qualitative variables in certain types of batch processes requires time to measure the variables and obtain the final result for the released product. With principal component analysis (PCA), any abnormal behavior of the process can be detected. This study proposes a method that uses contribution plots as fault signatures (FS) over the different stages and variables of the process in order to diagnose the quality variables of the released product. For a product resulting from abnormal process behavior, the qualitative variables that would otherwise need to be measured can thus be obtained from the quantitative variables of the process, by classifying the FS with a knowledge model extracted from a fault signature database (FSD) using a classification algorithm. The method is tested on a biological nutrient removal (BNR) sequencing batch reactor (SBR) for wastewater treatment, diagnosing the qualitative variables of the process: ammonium (NH₄⁺), nitrites and nitrates (NO₂⁻ or NO₃⁻), and phosphate (PO₄³⁻).

Paper Nr: 259
Title:

ACCEPTANCE OF ENTERPRISE RESOURCE PLANNING SYSTEMS BY SMALL MANUFACTURING ENTERPRISES

Authors:

Rubina Adam

Abstract: ERP systems are widely used by large enterprises for managing the functional areas of the enterprise. Recently, however, ERP systems have also been introduced to the small enterprise market, and they are now considered an important small enterprise management aid that may contribute to the sustainability and growth of the small enterprise. Although there are several factors that may impact the acceptance of ERP systems, limited research has been done to understand their acceptance by small enterprises. This paper addresses this gap by considering the strategic, business, technical and human factors that influence the acceptance of ERP systems in small manufacturing enterprises in South Africa. The consolidated list of acceptance factors flowing from this research may guide future initiatives aiming to ensure the acceptance of ERP systems by small manufacturing enterprises.

Paper Nr: 267
Title:

COMPUTER-AIDED DATA-MART DESIGN

Authors:

Fatma Abdelhédi and Geneviève Pujolle

Abstract: With decision support systems, decision-makers analyse data in data marts extracted from production bases. The data-mart schema design is generally performed by expert designers (administrator or computer specialist). With data-driven, requirement-driven or hybrid-driven approaches, this designer builds a data mart defining facts (analysis subjects) and analysis axes. This process, based on data sources and decision-makers' requirements, often turns out to be approximate and complex. We propose that the data-mart schema be designed by the decision-maker himself, following a hybrid-driven approach. Using an assistance process that successively visualises intermediate schemas built from data sources, the decision-maker gradually builds his multidimensional schema. He determines the measures to be analysed, the dimensions, and the hierarchies within dimensions. A CASE tool based on this concept has been developed.

Paper Nr: 268
Title:

A UNIFIED MODEL DRIVEN METHODOLOGY FOR DATA WAREHOUSES AND ETL DESIGN

Authors:

Faten Atigui and Franck Ravat

Abstract: During the last few years, several frameworks have dealt with Data Warehousing (DW) design issues. Most of these frameworks provide partial answers that focus either on multidimensional (MD) modelling or on Extraction-Transformation-Loading (ETL) modelling. However, little attention has been given either to unifying both modelling issues into a single structured framework or to automating the warehousing process. To overcome these limits, this paper provides a generic, unified and semi-automated method that integrates DW and ETL process design. The framework is handled within the Model Driven Architecture (MDA). It (i) first helps the designer in modelling the decision-makers' requirements, then (ii) generates the MD model as well as (iii) the logical and physical models, and finally (iv) generates the source code. In this approach, the transformation rules are formalized using the Query/View/Transformation (QVT) language.

Paper Nr: 279
Title:

SEMANTICALLY RICH API FOR IN-DATABASE DATA MANIPULATION IN MAIN-MEMORY ERP SYSTEMS

Authors:

Vadym Borovskiy and Christian Schwarz

Abstract: Assuming the feasibility of main-memory database management systems, the current research aims at designing a new type of data manipulation API, called Business Object Query Language (BOQL), specifically tailored for in-database data manipulation in main-memory ERP systems. The paper contributes the concept of business object virtualization and describes a query processor that takes advantage of this concept. The first serves as a means of grouping raw memory-resident data into high-level data structures, while the second exposes a flexible query-like API to manipulate the high-level data structures. Special effort has been dedicated to integrating the API into the C++ programming language.

Paper Nr: 294
Title:

ENABLING UBIQUITOUS DATA MINING IN INTENSIVE CARE - Features Selection and Data Pre-processing

Authors:

Manuel Santos and Filipe Portela

Abstract: Ubiquitous Data Mining and Intelligent Decision Support Systems are gaining interest among both computer science researchers and intensive care doctors. Previous work contributed Data Mining models to predict organ failure and patient outcome in order to support and guide clinical decisions, based on the notion of critical events and on data collected from monitors in real-time. This paper studies the impact on prediction sensitivity of the Modified Early Warning Score, a simple physiological score that may allow improvements in the quality and safety of management provided to surgical ward patients. The feature selection and data pre-processing are also detailed. Results show that for some variables associated with this score the impact is minimal.

Paper Nr: 302
Title:

JAR2ONTOLOGY - A TOOL FOR AUTOMATIC EXTRACTION OF SEMANTIC INFORMATION FROM JAVA OBJECT CODE

Authors:

Nicolás Marín

Abstract: We present here a novel approach (and its implementation) for the automatic extraction of semantic knowledge from Java libraries. We want to match software libraries, so we need to obtain as much information as possible to use in the matching process. For this purpose, this approach extracts information about the structure of the classes (i.e., name, fields and hierarchy), as well as information about the behavior of the classes (i.e., methods). In the literature, to our knowledge, only lightweight approaches to the extraction of this kind of information from Java object code can be found. The approach is implemented in an automatic extraction tool (called Jar2Ontology) that has been developed as a plug-in of the Protégé Ontology and Knowledge Acquisition System. Jar2Ontology extracts the semantics from Java libraries and translates it into OWL (Web Ontology Language).

Paper Nr: 310
Title:

THE DESIGN OF THE FRAMEWORK OF THE SIYE INTEGRATED E-COMMERCE PLATFORM BASED ON THE THEORY OF THE INTEGRATED E-COMMERCE

Authors:

Shuguang Kong

Abstract: This article describes the meaning and development of integrated e-commerce and the integrated e-commerce platform. Integrated e-commerce is the integration of ERP and e-commerce. The integrated e-commerce platform is an e-commerce supply chain management (eSCM) system, an informatization platform for small and medium enterprises. Then, based on this theory, it proposes the design idea of the framework of the SIYE integrated e-commerce platform.

Paper Nr: 311
Title:

A UML-EXTENDED APPROACH FOR MINING OLAP DATA CUBES IN COMPLEX KNOWLEDGE DISCOVERY ENVIRONMENTS

Authors:

Alfredo Cuzzocrea

Abstract: In this paper, we propose theoretical assertions and practical instances of an innovative UML-extended approach for mining OLAP data cubes in complex knowledge discovery environments. This analytical contribution is further extended by means of a comprehensive set of case studies that clearly demonstrate the feasibility and the benefits of the proposed approach in the context of next generation Data-Warehousing/Data-Mining platforms.

Paper Nr: 324
Title:

RISK MANAGEMENT IN ENTERPRISE RESOURCE PLANNING IMPLEMENTATION USING A FUZZY APPROACH

Authors:

Hamid Akhavan Asghari and Mehrbakhsh Nilashi

Abstract: Without powerful risk management, it is very difficult to imagine a successful implementation of an enterprise resource planning (ERP) system. In order to analyze risk management we need an overall view of the project's Critical Success Factors (CSFs). An accurate assessment of CSFs is key to the challenges ahead in ERP projects. Several weighted lists of CSFs have been created to date. In this article, a new classification of CSFs based on the Balanced Scorecard (BSC) is presented. Our fuzzy approach leads to some changes in CSF importance.

Paper Nr: 326
Title:

STUDY ON DEVELOPING WAREHOUSE RECEIPT HYPOTHECATION FINANCING METHOD IN CHINA

Authors:

Xiao Xiang and Xue Shengjie

Abstract: Small and Medium-sized Enterprises (SMEs) play an important role in the development of the national economy, but financing difficulty has always been one of the major obstacles to their development. How to broaden financing channels and innovate upon financing methods are problems in need of urgent resolution. Combining logistics and finance yields logistics finance, and the warehouse receipt hypothecation financing method offers an effective approach for providing cash flow to SMEs. The paper introduces the concept and operation flow of warehouse receipt hypothecation, analyzes its operation mode, benefits and risk control, and puts forward some suggestions for developing it in China.

Paper Nr: 344
Title:

RESEARCH ON THE EVALUATION INDEX SYSTEM AND METHOD FOR INTELLIGENT PERFORMANCE OF SMART SPACE BASED ON INTERNET OF THINGS TECHNOLOGY

Authors:

Xiaoquan Gong and Tao Gu

Abstract: This article first defines the smart living space and its six widely used application aspects under the Internet of Things, then subdivides and analyses domestic and foreign research on the assessment theory system for smart living spaces based on Internet of Things technology, and concludes with a grey multi-level evaluation method built on that assessment theory system. Finally, the evaluation of intelligent storage is specified as an example.

Paper Nr: 350
Title:

RESEARCH OF CIRCULAR LOGISTICS PROCESS OF TELECOMMUNICATIONS OPERATORS

Authors:

Yingkui Wang

Abstract: This paper first defines the logistics of telecommunications operators and the circular logistics system of telecommunications operators. It points out that there exist important differences among the logistics of telecommunications operators, manufacturing enterprises and materials distribution enterprises, and that relevant research results are lacking. From the aspects of supply logistics, maintenance logistics and waste material logistics, it gives a detailed analysis of the process of circular logistics of telecommunications operators. Finally, it points out that the research of circular logistics of telecommunications operators has important theoretical and practical value.

Paper Nr: 365
Title:

EFFICIENT INDEXING OF LINES WITH THE MQR-TREE

Authors:

Marc Moreau and Wendy Osborn

Abstract: This paper presents an evaluation of the mqr-tree for indexing a database of line data. Many spatial access methods have been proposed for handling either point or region data, with the vast majority able to handle these data types efficiently. However, line segment data presents challenges for most spatial access methods. Recent work on the mqr-tree showed much potential for efficiently indexing line data. We identify limitations of the data sets in the initial evaluation. Then, we further evaluate the ability of the mqr-tree to efficiently index line sets with different organizations that address the limitations of the initial test. A comparison with the R-tree shows that the mqr-tree achieves significantly lower overlap and overcoverage, which makes it a strong candidate for indexing line and line-segment data.
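The overlap metric mentioned above is, in its standard spatial-indexing formulation, the shared area of node minimum bounding rectangles (MBRs); the sketch below is an editorial illustration of that standard computation, not code from the paper, and the rectangle representation `(xmin, ymin, xmax, ymax)` is an assumption:

```python
def overlap_area(r1, r2):
    """Overlap area of two axis-aligned MBRs given as (xmin, ymin, xmax, ymax)."""
    w = min(r1[2], r2[2]) - max(r1[0], r2[0])  # overlap width (negative if disjoint)
    h = min(r1[3], r2[3]) - max(r1[1], r2[1])  # overlap height (negative if disjoint)
    return max(0.0, w) * max(0.0, h)

a = (0, 0, 4, 4)
b = (2, 2, 6, 6)
assert overlap_area(a, b) == 4.0          # 2 x 2 shared region
assert overlap_area(a, (5, 5, 7, 7)) == 0.0  # disjoint MBRs
```

Lower total pairwise overlap among sibling MBRs means fewer subtrees must be searched per query, which is why it is a common quality measure for R-tree-family structures.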

Paper Nr: 367
Title:

KNOWLEDGE EXTRACTION GUIDED BY ONTOLOGIES - Database Marketing Application

Authors:

Filipe Mota Pinto and Teresa Guarda

Abstract: Knowledge extraction from large databases is known to be a long-term and interactive project. In spite of such complexity and the different options for achieving knowledge, there is a research opportunity that can be explored through the support of ontologies, which may then be used for knowledge sharing and reuse. This paper describes research on an ontological approach that leverages the semantic content of ontologies to improve knowledge extraction in an oil company's marketing databases. We analyze how ontologies and the knowledge discovery process may interoperate and present our efforts towards a framework for their formal integration.

Paper Nr: 386
Title:

TRAJECTORY SEMANTIC VISUALIZATION

Authors:

Stanimir Bakshev, Laura Spinsanti and Jose Antonio Fernandes de Macedo

Abstract: Thanks to current GPS technologies, the capture of evolving positions of individual moving objects has become technically and economically feasible. This opens new perspectives for a large number of applications (from transportation and logistics to ecology and anthropology) built on the knowledge of objects’ movements. The goal of this work is to propose a framework that supports querying and visualization of trajectory data. Trajectory data and its semantic context are modeled by means of an application ontology, which allows the user to elaborate semantic queries. Results are rendered using an automatic matching procedure that allows the user to change the actual visualization of the data.

Posters
Paper Nr: 54
Title:

EFFECTIVE DATABASE MIGRATION STRATEGY - THE NEED FOR ADDRESSING DATABASE MIGRATION CHALLENGES OF TODAY, TOMORROW AND BEYOND

Authors:

Prabin R. Sahoo

Abstract: Database migration is often considered a low-priority activity in the software industry. This is a misperception: database migration in reality becomes a nightmare for many who have experienced it. As the migration process to the new target system starts, migration issues crop up, gradually pile up and become unmanageable. Eliminating these issues therefore requires highly effective database migration strategies. Several papers have described migration strategies, and a number of tools have been developed for successful database migration, yet migration issues still persist and organizations find it difficult to adopt an effective strategy. Organizations spend heavily on purchasing migration tools; by the time one realizes that the business is suffering after migration, it is too late, and a huge amount is spent on repair. This is not due to a lack of migration strategy, but to a lack of an effective one. For example, using a migration tool can be a strategy, but it should be preceded by an assessment of how much the tool can do, how much requires in-house tooling, and how the database will behave after migration. Such analyses are lacking in today's migration strategies. This paper proposes an effective approach to database migration that addresses migration assessments and migration issues while also improving reusability.

Paper Nr: 69
Title:

A FRAMEWORK FOR ERP EVALUATION AND SELECTION USING MACBETH TECHNIQUE

Authors:

Abdelilah Khaled and Mohammed Abdou Janati Idrissi

Abstract: Purchasing an inappropriate Enterprise Resource Planning (ERP) system may prove to be a major reason for implementation failure. Given the cost of the investment required to acquire, implement and operate an ERP system, the recent interest expressed by academics and practitioners in selection measures and evaluation techniques for enterprise systems is highly justifiable. Accordingly, the system selection process is an important step in ERP adoption. This paper elaborates a comprehensive framework for ERP selection and evaluation. It serves a threefold objective. First, it proposes a structured methodology in which strategic, functional, technical and managerial goals are considered in the selection decision. Second, it suggests a classification of the main criteria mentioned in the literature under four categories. Third, it presents an application of the Measuring Attractiveness by a Categorical Based Evaluation Technique (MACBETH) that allows the quantification of criteria weights, the construction of utility functions for each criterion and the evaluation of candidate ERP solutions.

Paper Nr: 102
Title:

INTELLIGENT TRANSPORTATION SYSTEMS DATA WAREHOUSES AND THEIR APPLICATIONS

Authors:

Yunjie Zhao, Madhubabu Sandara, Shan Huang and Adel Sadek

Abstract: Archived Data Management Systems (ADMS) offer an opportunity to take full advantage of the data collected by Intelligent Transportation Systems (ITS) devices in improving transportation operations and planning at minimal additional cost. This paper develops an ITS data warehouse, or ADMS, which can be used for a wide range of applications, such as more effective transportation planning, decision-making and performance measurement, extreme traffic event studies, traveller information subscription systems, and data support for model development, calibration and validation. In short, the development of such a data warehouse will benefit both traffic management and research.

Paper Nr: 120
Title:

ZACHMANNTEST - A SYNTHETIC MEMORY BENCHMARK FOR SAP ERP SYSTEMS

Authors:

André Bögelsack and Holger Wittges

Abstract: SAP ERP systems are the backbone of today’s enterprises and business processes. The technical performance of such systems directly influences the business performance. Estimating or measuring the performance of SAP ERP systems is a hard task due to the diversity of the measurement process. We present a synthetic benchmark called Zachmanntest which measures the performance of an SAP ERP system with a focus on main memory operations. The internal flow logic of Zachmanntest is presented, as well as its usage. Results of peak performance measurements and of the scalability of heterogeneous servers are presented.

Paper Nr: 134
Title:

TOWARDS MODEL-DRIVEN EVOLUTION OF DATA WAREHOUSES

Authors:

Christian Kurze, Marcus Hofmann and Frieder Jacobi

Abstract: Data warehouse systems represent centralized data collections used for the purposes of data analysis and decision support. Development and maintenance of extensive data warehouse systems require appropriate support through methods and tools. Therefore, we introduce the project Computer-Aided Warehouse Engineering (CAWE), a model-driven approach to the engineering of data warehouse systems. In particular, the process of data warehouse evolution, i.e. the maintenance of the core data storage component — the data warehouse in the narrow sense — is a tedious task, since changing data structures implies the challenge of ensuring the integrity and history of corporate data. The paper at hand provides research in progress and suggests the adoption of a multidimensional algebraic formalism within the model-driven development paradigm.

Paper Nr: 222
Title:

INDEXING BANGLA NEWSPAPER ARTICLES USING FUZZY AND CRISP CLUSTERING ALGORITHMS

Authors:

A. K. M. Zahiduzzaman and Mohammad Nahyan Quasem

Abstract: The paper presents two document clustering techniques to group Bangla newspaper articles. The first is based on the traditional c-means algorithm, and the latter on its fuzzy counterpart, the fuzzy c-means algorithm. The key principle of both techniques is to measure the frequency of keywords in a particular type of article to calculate the significance of those keywords. The articles are then clustered based on the significance of the keywords. We believe the findings from this research will help to index Bangla newspaper articles, making information retrieval faster than before. However, one of the challenges is to find the salient features among the hundreds of features found in documents; moreover, both clustering algorithms work best on lower dimensions. To address this, we use three dimensionality reduction techniques: Principal Component Analysis (PCA), Factor Analysis (FA) and Linear Discriminant Analysis (LDA). We present and analyze the performance of the traditional and fuzzy c-means algorithms with the different dimensionality reduction techniques.
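The pipeline the abstract describes — reduce dimensionality, then soft-cluster — can be sketched as follows. This is an editorial illustration, not the authors' code: the toy 2-D data, SVD-based PCA reduction and minimal fuzzy c-means update are all assumptions standing in for keyword-frequency document vectors.

```python
import numpy as np

def fuzzy_c_means(X, c, m=2.0, iters=100, seed=0):
    """Minimal fuzzy c-means: returns the n x c membership matrix U."""
    rng = np.random.default_rng(seed)
    U = rng.random((len(X), c))
    U /= U.sum(axis=1, keepdims=True)             # each row sums to 1
    for _ in range(iters):
        W = U ** m                                # fuzzified memberships
        centers = (W.T @ X) / W.sum(axis=0)[:, None]
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
        U = 1.0 / d ** (2 / (m - 1))              # standard FCM membership update
        U /= U.sum(axis=1, keepdims=True)
    return U

# Toy "documents": PCA via SVD, keep one component, then fuzzy-cluster.
X = np.array([[0.0, 0.1], [0.2, 0.0], [5.0, 5.1], [5.2, 4.9]])
Xc = X - X.mean(axis=0)
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)
X1 = Xc @ Vt[:1].T                                # reduce to 1 principal component
U = fuzzy_c_means(X1, c=2)
labels = U.argmax(axis=1)                         # crisp labels from soft memberships
```

Crisp c-means is the special case where each row of U is forced to a single 1; the fuzzy version keeps graded memberships until the final argmax.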

Paper Nr: 244
Title:

COMPARISON OF DIFFERENT CLASSIFICATION TECHNIQUES ON PIMA INDIAN DIABETES DATA

Authors:

Farhana Afroz and Rashedur M. Rahman

Abstract: The development of data-mining applications such as classification and clustering has been applied to large-scale data. In this research, we present a comparative study of different classification techniques using three data mining tools, namely WEKA, TANAGRA and MATLAB. The aim of this paper is to analyze the performance of different classification techniques on a large data set. The algorithms or classifiers tested are Multilayer Perceptron, Bayes Network, J48graft (C4.5), Fuzzy Lattice Reasoning (FLR), NaiveBayes, JRip (RIPPER), Fuzzy Inference System (FIS), and Adaptive Neuro-Fuzzy Inference Systems (ANFIS). A fundamental review of the selected techniques is presented by way of introduction. The diabetes data, with a total of 768 instances and 9 attributes (8 input and 1 output), are used to test and justify the differences between the classification methods. Subsequently, the classification technique with the potential to significantly improve on common or conventional methods is suggested for use in large-scale data, bioinformatics or other general applications.

Paper Nr: 246
Title:

ELEARNING VIRTUAL ENVIRONMENTS MULTI-AGENT MODEL FOR MEDICAL STUDENTS

Authors:

Luis Gaxiola Vega, Bogart Yail Márquez and José Magdaleno-Palencia

Abstract: In today's education it is becoming normal to talk about virtual, online and distance learning. Such environments are increasingly incorporated into daily face-to-face teaching practice, greatly enriching educational opportunities in all areas with the potential of media and technology; this paper explains how to implement them using multi-agents. It discusses how the curriculum can be enriched by activities involving problem-based learning, case studies, simulations and virtual reality. This new model provides multiple uses for exploring knowledge and supporting learning-by-doing. It engages users in the construction, collaboration and articulation of knowledge in a virtual environment, especially in the teaching-learning process.

Paper Nr: 327
Title:

RESEARCH ON INTERACTIVE MECHANISM OF TECHNOLOGY PROGRESS AND M&A

Authors:

Yongmei Cui

Abstract: This paper is based on new growth theory, evolutionary economics and the industrial impact hypothesis as its theoretical foundations. Using mathematical-statistical analysis, case analysis, system analysis and induction, it first expatiates on the internal mechanisms by which technical progress drives M&A and by which M&A promotes technological progress, and reveals the interaction of technological progress and M&A through the establishment of an interaction model. Then, through a comparative analysis of M&A and technology waves in developed countries, it verifies the positive correlation between technological progress and M&A. It concludes that technical progress is a two-phase dynamic process comprising both technological innovation and technology transfer; in this process, M&A and technical progress mutually drive each other through an internal mechanism: technological innovation and transfer bring about the reallocation of industry resources, which drives M&A waves, while M&A in turn promotes technological innovation and transfer by breaking monopolies and concentrating factors. The paper is valuable both in theory and in practice, favouring the establishment of a technological progress system and the maturing of the M&A market. The dynamic definition of technical progress and the establishment of the interaction model between technical progress and M&A are the innovations of this article.

Paper Nr: 329
Title:

THE CONSTRUCTION OF ECONOMIC MODEL EVALUATION SYSTEM BASED ON THE ECONOMIC MODEL OF RESOURCE PLATFORM

Authors:

Rong Ruan

Abstract: With the development of computer and information technology, more network technologies have been applied in teaching. The economic model resource platform, first built by schools of economics and management, hosts an economic model library. The models, classical or innovative, are built by teachers and students using technologies such as MATLAB and web development, and are displayed on the platform. As a kind of teaching resource, economic models should be well built on the platform so that learners can easily understand them. What standard, then, can determine the quality of the economic models? We need to construct an evaluation system to help students select models for learning. This article combines AHP, qualitative and quantitative methods, and Delphi to construct the evaluation system, and finally applies the system to evaluate the economic models. It can help learners select high-quality models, while managers can improve the poor-quality ones.

Paper Nr: 387
Title:

SOA IMPLEMENTATION IN PASSENGER TRANSPORT MARKETING AIDED DECISION SYSTEM

Authors:

Jianxiong Wang and Chunhuang Liu

Abstract: Business Intelligence (BI) technology is reviewed in the first part of this paper, and a Service-Oriented Architecture (SOA) application is built with IBM Cognos BI components to implement the Railway Passenger Transport Marketing aided Decision System, which provides intelligent query and analysis solutions for the China Ministry of Railway and all railway bureaus in China. Finally, a development flow for a multi-dimensional cube and its application to analyzing railway passenger data is presented to illustrate the system's flexibility and high reliability.

Area 2 - Artificial Intelligence and Decision Support Systems

Full Papers
Paper Nr: 43
Title:

PATTERNS IDENTIFICATION FOR HITTING ADJACENT KEY ERRORS CORRECTION USING NEURAL NETWORK MODELS

Authors:

Jun Li and Karim Ouazzane

Abstract: People with Parkinson's disease or motor disabilities often miss-stroke keys. Keyboard layout, key distance and time gap appear to affect this group's typing performance. This paper studies these features using neural network learning algorithms to identify typing patterns and, further, to correct typing mistakes. A specific user typing behaviour, Hitting Adjacent Key Errors, is simulated to pilot this research. In this paper, a Time Gap model and a Prediction using Time Gap model based on a Back-Propagation Neural Network, and a Distance, Angle and Time Gap model based on a Probabilistic Neural Network, are developed for this particular behaviour. Results demonstrate high performance of the designed models: about 70% of all tests score above the Basic Correction Rate, while simulation also shows a very unstable trend in users' Hitting Adjacent Key Errors behaviour with this specific dataset.

Paper Nr: 73
Title:

RESEARCH AND DEMONSTRATION OF AGRICULTURAL POLICY SIMULATION BASED ON CGE MODEL

Authors:

Zhigang Li and Quan Qi

Abstract: Given the lack of a dedicated policy simulation platform, we present in this paper a simulation platform that makes use of computer techniques. On this platform, we integrated a CGE model, DSS, data warehouse, data conversion and other components, established a prototype policy simulation platform system, and simulated agricultural subsidy policy through scenario analysis. The analytic results demonstrated the feasibility and functionality of the prototype system.

Paper Nr: 86
Title:

REASONING IN INTELLIGENT DIAGNOSIS SYSTEMS

Authors:

Vadim Vagin and Alexander Eremeev

Abstract: The paper is devoted to researching and modeling reasoning based on Assumption-based Truth Maintenance Systems (ATMS) and reasoning by analogy in intelligent diagnosis systems. New heuristic approaches for choosing the current measurement point on the basis of supporting and inconsistent environments are presented. A method of reasoning by analogy is also reviewed. This work was supported by the Russian Fund for Basic Research.

Paper Nr: 108
Title:

HIGH-SPEED RAILWAY BASED ON GENETIC ALGORITHM FOR PREDICTION OF TRAVEL CHOICE

Authors:

Long Chen-xu

Abstract: The genetic algorithm is an optimizing search method based on the theory of biological evolution. Just as evolution deals in populations of individuals, genetic algorithms mimic nature by evolving huge churning populations of code, all processing and mutating at once. As passenger travel becomes more frequent and passengers' demands on quality of life grow, passengers place ever higher demands on travel, especially regarding environment, comfort and speed; different passengers focus on different aspects, so passengers must first be classified before the research can proceed. This paper applies the GA toolbox of the University of Sheffield, UK, within MATLAB, and finally makes a forecast analysis. Because the forecast analysis is based on a questionnaire conducted during the Spring Festival, the emphasis on travel choices made by passengers in those particular circumstances is necessary; consequently, this forecast analysis will differ somewhat from forecast analyses made under normal conditions.

Paper Nr: 112
Title:

A CASE-BASED ENTERPRISE INFORMATION SYSTEM FOR THERMAL POWER PLANTS’ SAFETY ASSESSMENT

Authors:

Dong-xiao Gu, Chang-yong Liang, Chun-rong Zuo and Isabelle Bichindaritz

Abstract: Security assessment of Thermal Power Plants (TPP) is one of the important means to guarantee production safety in thermal power enterprises. Modern information technology may play an even more important role in TPP safety assessment. Essentially, the evaluation of power plant systems relies to a large extent on the knowledge and length of experience of experts. Therefore, Case-Based Reasoning (CBR) is introduced in this domain for the security assessment of TPPs, since this methodology models expertise through experience management. This paper provides a case-based approach for the management system security assessment decision making of TPPs (MSSATPP). A case matching method named CBR-Grey is introduced, in which the Delphi approach and Grey System theory are integrated. Based on this method, we implement a prototype enterprise assessment information system (CBRSYS-TPP) for the panel of experts.

Paper Nr: 117
Title:

TOWARDS AN INTELLIGENT SYSTEM FOR MONITORING ELECTRICAL ENERGY QUALITY - Foundations and Motivations

Authors:

Luiz Biondi Neto, Pedro Henrique Gouvêa Coelho and João Carlos C. B. Soares de Mello

Abstract: Electrical energy must be supplied in sufficient quantity but with adequate quality. One of the components of electrical energy quality is harmonic distortion. In this paper, we show an alternative way to measure distortion, mixing Data Envelopment Analysis (DEA) and Fourier Analysis. The technique presented here is especially useful for comparative analysis and is intended to be the basis of an intelligent system for monitoring electrical energy quality.

Paper Nr: 118
Title:

STOCK MARKET FORECASTING BASED ON WAVELET AND LEAST SQUARES SUPPORT VECTOR MACHINE

Authors:

Xia Liang

Abstract: In this paper, we propose a novel method using the wavelet transform to denoise the input of a least squares support vector machine for classification of stock closing prices. The proposed method classifies the closing price as either down or up. We have tested the proposed approach using the past three years of data for 10 stocks randomly selected from the HS300 index and compared the proposed method with other machine learning methods. A classification accuracy of almost 99% was achieved by the WT-SVM model. We observed that the performance of stock price prediction can be significantly enhanced by using a hybridized WT in comparison with a single model.
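The abstract does not specify which wavelet the authors use; as an editorial illustration of the denoising step, the sketch below applies a single-level Haar transform — split the signal into pairwise averages and differences, zero the small difference (detail) coefficients, and invert. The toy series and threshold are assumptions, not data from the paper.

```python
def haar_denoise(signal, threshold):
    """One-level Haar wavelet denoising; signal length must be even."""
    pairs = list(zip(signal[::2], signal[1::2]))
    approx = [(a + b) / 2 for a, b in pairs]          # low-frequency averages
    detail = [(a - b) / 2 for a, b in pairs]          # high-frequency differences
    detail = [d if abs(d) > threshold else 0.0 for d in detail]  # kill small noise
    out = []
    for a, d in zip(approx, detail):                  # inverse Haar transform
        out.extend([a + d, a - d])
    return out

noisy = [1.0, 1.1, 1.0, 0.9, 5.0, 5.1, 5.0, 4.9]
clean = haar_denoise(noisy, threshold=0.2)            # small jitter removed
```

The denoised series would then feed the classifier in place of the raw prices; in the paper's pipeline that classifier is a least squares SVM.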

Paper Nr: 130
Title:

RANDOM BUILDING BLOCK OPERATOR FOR GENETIC ALGORITHMS

Authors:

Ghodrat Moghadampour

Abstract: Genetic algorithms work on randomly generated populations, which are refined toward the desired optima. The refinement process is carried out mainly by genetic operators. Most typical genetic operators are crossover and mutation. However, experience has proved that these operators in their classical form are not capable of refining the population efficiently enough. Moreover, due to lack of sufficient variation in the population, the genetic algorithm might stagnate at local optimum points. In this work a new dynamic mutation operator with variable mutation rate is proposed. This operator does not require any pre-fixed parameter. It dynamically takes into account the size (number of bits) of the individual during runtime and replaces a randomly selected section of the individual by a randomly generated bit string of the same size. All the bits of the randomly generated string are not necessarily different from bits of the selected section from the individual. Experimentation with 17 test functions, 34 test cases and 1020 test runs proved the superiority of the proposed dynamic mutation operator over single-point mutation operator with 1%, 5% and 8% mutation rates and the multipoint mutation operator with 5%, 8% and 15% mutation rates.
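The operator described above — replace a randomly selected section of the bit-string individual with a randomly generated string of the same size, with no pre-fixed rate — can be sketched as follows. This is an illustrative reading of the abstract, not the author's implementation; the list-of-bits representation is an assumption.

```python
import random

def random_building_block_mutation(individual):
    """Overwrite a random section of a bit string with fresh random bits
    of the same length; some bits may coincide with the originals by chance."""
    n = len(individual)
    start = random.randrange(n)            # where the section begins
    length = random.randint(1, n - start)  # section length, sized to the individual
    block = [random.randint(0, 1) for _ in range(length)]
    return individual[:start] + block + individual[start + length:]

parent = [1, 0, 1, 1, 0, 0, 1, 0]
child = random_building_block_mutation(parent)
assert len(child) == len(parent)           # length is always preserved
```

Because the section length scales with the individual's size at runtime, the operator needs no externally tuned mutation rate, unlike the fixed-rate single-point and multipoint operators it is compared against.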

Paper Nr: 153
Title:

PLICAS - A Texas Hold'em Poker Bot

Authors:

Christian Friedrich and Michael Schwind

Abstract: For two decades, poker has been a very popular game, and in the last decade it has also become an interesting topic in game theory (GT), artificial intelligence (AI) and multi-agent systems (MAS). This paper describes the development and evaluation of the poker bot PLICAS, designed for the variant ‘Texas Hold’em Fixed Limit Heads-up’. In the development process, approaches such as opponent modeling, abstraction techniques, and case-based reasoning were studied and implemented. PLICAS also introduces simulation-based methods for exploiting the opponent’s play. In the experimental part of this paper, we analyze the strengths and weaknesses of PLICAS, which participated in the 2010 AAAI Computer Poker Competition (ACPC).

Paper Nr: 223
Title:

BANGLA ISOLATED WORD SPEECH RECOGNITION

Authors:

Adnan Firoze and M. Shamsul Arifin

Abstract: The paper presents Bangla word speech recognition using spectral analysis and fuzzy logic. As human speech is imprecise and ambiguous, fuzzy logic, whose very basis is linguistic ambiguity, can serve as a precise tool for analysing and recognizing human speech. Even though the core source of an uttered word is a voiced signal, our system revolves around the visual representation of voiced signals: the spectrogram. The spectrogram may be perceived as a “visual” entity; its essence is a set of matrices that encode properties of a sound, e.g., energy, frequency and time. In this research, spectral analysis was chosen over image processing for increased accuracy. The decision-making process of our system is based on fuzzy logic. Experimental results demonstrate that our system is 80% accurate, compared to a commercial Hidden Markov Model (HMM) based speech recognizer that shows 73% accuracy on average.

Paper Nr: 224
Title:

TOWARD A NON INVASIVE CONTROL OF APPLICATIONS - A Biomedical Approach to Failure Prediction

Authors:

Ilenia Fronza and Alberto Sillitti

Abstract: Developing software without failures is indeed important. Still, it is also important to detect as soon as possible when a running application is likely to fail, so that corrective actions can be taken. Following the guidelines of Agile Methods, the goal of our research is to develop a statistical prediction model for failures that does not require any additional effort on the side of the developers of an application; the key concept is that the developers concentrate on the code and we use the information that is naturally generated by the running application to assess whether an application is likely to fail. So the developers concentrate only on providing direct value to the customer and then the model takes care of informing the environment of the possible crash. The proposed model uses as input data that is commonly produced by developers: the log files. The statistical prediction model employed comes from biomedical studies about cancer survival prediction based on gene expression profiles where gene expression measurements and survival times of previous patients are used to predict future patients' survival. One of the most prominent models is the Cox Proportional Hazards (PH) model. In this work, we draw a parallel between our context and the biomedical one; we consider types of operations as genes, and operations and their multiplicity in the sequence as expression profiles. Then, we identify signature operations applying the above mentioned Cox PH model. We perform a prototypical analysis using real-world data to assess the suitability of our approach. We estimate the confidence interval of our results using Bootstrap.

Paper Nr: 238
Title:

A SERVICE FRAMEWORK FOR LEARNING, QUERYING AND MONITORING MULTIVARIATE TIME SERIES

Authors:

Chun-Kit Ngan

Abstract: We propose a service framework for Multivariate Time Series Analytics (MTSA) that supports model definition, querying, parameter learning, model evaluation, monitoring, and decision recommendation. Our approach combines the strengths of both domain-knowledge-based and formal-learning-based approaches for maximizing utility over time series. More specifically, we identify multivariate time series parametric estimation problems in which the objective function depends on the time points from which the parameters are learned. We propose an algorithm that is guaranteed to find the optimal time point(s), and we show that our approach produces results superior to those of the domain-knowledge-based approach and the logit regression model. We also develop an MTSA data model and query language for the parameter learning, querying, and monitoring services.

Paper Nr: 298
Title:

A CONTROLLED NATURAL LANGUAGE INTERFACE TO CLASS MODELS

Authors:

Imran Sarwar Bajwa and M. Asif Naeem

Abstract: The available approaches for automatically generating class models from natural language (NL) software requirements specifications (SRS) exhibit low accuracy due to the informal nature of NLs such as English. In automated class model generation, higher accuracy can be achieved by overcoming the inherent syntactic ambiguities and semantic inconsistencies of English. In this paper, we propose an SBVR-based approach to generate an unambiguous representation of NL software requirements. The user inputs the English specification of software requirements, and the approach processes the input to extract an SBVR vocabulary and generate an SBVR representation in the form of SBVR rules. The SBVR rules are then semantically analyzed to extract object-oriented (OO) information, which is finally mapped to a class model. The approach is implemented in a prototype tool, NL2SBVRviaSBVR, an Eclipse plugin that serves as a proof of concept. A case study shows that the use of SBVR in the automated generation of class models from NL software requirements improves accuracy and consistency.

Short Papers
Paper Nr: 13
Title:

SYSTEM ARCHITECTURE OF THE DECISION SUPPORT SYSTEM EMPLOYING MICROSCOPIC SIMULATION AND EXPERT SYSTEM IN PARALLEL FOR THE POST INCIDENT TRAFFIC MANAGEMENT

Authors:

S. Akhtar Ali Shah and Hojung Kim

Abstract: This paper presents the system architecture of a post-incident decision support system (PIDSS), which incorporates the predicted incident impacts from an offline microscopic simulation platform into an expert system. The system yields an immediate operational strategy for freeway managers that can be further fine-tuned with online simulation results. The novel idea presented in this paper is the replacement of the domain expert and knowledge engineer with the output of the microscopic simulation, which would make post-incident congestion mitigation on the road network more efficient and cost-effective.

Paper Nr: 15
Title:

APPLYING FUZZY MULTIPLE CRITERIA DECISION MAKING TO EVALUATE AND IDENTIFY OPTIMAL EXPLOSIVE DETECTION EQUIPMENTS

Authors:

Ying Bai and Dali Wang

Abstract: The purpose of this research is to develop a universal model, with a practical system, to evaluate, identify and select an optimal system or device for a desired task from a large collection of available systems with multiple objectives, based on a fuzzy multiple criteria decision making model (FMCDMM). As an example, we use this research to identify and select an optimal detection system or device for detecting hazardous chemical materials.

Paper Nr: 71
Title:

RESEARCH ON THE ROUTE OPTIMIZATION OF BOOK DISTRIBUTION BASED ON THE TABU SEARCH ALGORITHM

Authors:

Pengfei Zhang and Zhenji Zhang

Abstract: Current research on distribution problems not only widens the area of distribution research but also makes its content more concrete; moreover, it can solve distribution problems in actual industry. The distribution route optimization problem is a Vehicle Routing Problem (VRP) and a research hotspot in the logistics industry. Based on book distribution route optimization, and aiming mainly at all the sales outlets served by a distribution center, optimizing and analyzing the distribution path is a new attempt.

Paper Nr: 84
Title:

SUMMARY OF LASSO AND RELATIVE METHODS

Authors:

Xia Jianan

Abstract: Feature selection is one of the focal points of the pattern recognition field. To select the most salient features, several methods exist, such as LASSO and Bridge Regression, but each has its limitations. This paper presents a survey of these methods and lists the advantages and limitations of each. Finally, an example applying LASSO to the identification of Traditional Chinese Medicine is introduced to show how these methods can be used for feature selection.
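The feature-zeroing behaviour that distinguishes LASSO from ridge-style shrinkage comes from the soft-thresholding operator used in its coordinate-descent solvers; a minimal sketch (standard textbook form, not taken from this paper):

```python
def soft_threshold(z, lam):
    """Soft-thresholding operator S(z, lam): shrinks z toward zero by lam
    and returns exactly zero when |z| <= lam. Coefficients driven to zero
    this way correspond to the features LASSO deselects."""
    if z > lam:
        return z - lam
    if z < -lam:
        return z + lam
    return 0.0
```

For Bridge-type penalties with exponent q > 1 the minimizer is generically nonzero, which is why LASSO (q = 1) is the edge case that still yields exact zeros and hence explicit feature selection.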

Paper Nr: 100
Title:

OPERATIONAL HAZARD RISK ASSESSMENT USING BAYESIAN NETWORKS

Authors:

Zoe Jing Yu Zhu

Abstract: This research investigates a method for hazard identification in modern drinking water treatment technologies. Bayesian networks, an important formalism for representing and performing inference with uncertain knowledge in artificial intelligence, are applied to quantitative risk assessment. A physicochemical ultrafiltration (UF) membrane train is expressed as a Bayesian network, which can be used to quantify the understanding of hazards at the operational level of a treatment plant that affect the risk of infection from pathogens. Once such a Bayesian network is established, risk assessment can be performed automatically using algorithms developed in artificial intelligence, which facilitates the risk assessment of complex water treatment domains.
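The kind of quantification a Bayesian network enables can be shown with a two-node fragment: a UF membrane state influencing infection risk (the structure and all numbers below are illustrative assumptions, not values from the paper):

```python
def p_infection(p_fail=0.05, p_inf_given_fail=0.3, p_inf_given_ok=0.001):
    """Marginal probability of infection, summing over the two states of
    a membrane-failure parent node: P(inf) = sum_s P(inf | s) * P(s)."""
    return p_fail * p_inf_given_fail + (1.0 - p_fail) * p_inf_given_ok
```

In a full network, inference algorithms such as variable elimination automate this marginalization over many interrelated operational-hazard nodes.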

Paper Nr: 111
Title:

PROBLEM SOLVING FRAMEWORK WITHIN DECISION SUPPORT SYSTEMS

Authors:

Yan H. Ng

Abstract: In the study of problem solving, cognitive psychologists have generalized it as a process of applying skills to overcome obstacles and constraints so as to move from a given state to a desired goal state. Here, we examine where a line could be drawn to divide problem solving into two sub-processes: a one-off but cognitively demanding task for a human to define the problem space in terms of representation, and a routine search process that could be undertaken by a machine to search for and select a solution within the defined solution space, overseeing its execution from start to finish to realize the goal.

Paper Nr: 148
Title:

DEVELOPING A PRICE MANAGEMENT DECISION SUPPORT SYSTEM FOR HOTEL BROKERS USING FREE AND OPEN SOURCE TOOLS

Authors:

Slava Kisilevich, Daniel Keim and Roman Byshko

Abstract: In the Internet age, e-commerce provides customers global reach to a wide variety of products and plays a dominant role in business activity and competition. Competition is especially aggressive in the online travel domain, where wholesalers, e.g. brokerage companies, contract through their contract managers with thousands of hotel brands and trade hotel products (usually hotel nights) for travel businesses or end customers. In order to conclude a profitable contract, a contract manager should be able to compare all the particulars of the prospective partner hotel with those of the competing hotels in the target city. Given that the number of contract managers is small compared to the large number of hotels, the available knowledge base is limited. Thus, hotel brokerage companies are only able to bargain with a relatively limited number of hotels, and contract profitability relies heavily on the contract managers’ expertise and communication skills. In this paper we present a price management decision support system (DSS) for hotel brokers that allows analysis of hotel prices using spatial and non-spatial characteristics, estimation of objective relative hotel prices, and determination of the profitability of existing or future contracts. We built our system using free and open source tools, including geographic information system and data mining frameworks, which allow companies with limited financial resources or manpower to implement such a prototype. We show the effectiveness of our tool by covering all the major components of the DSS, such as data selection and integration, model management and the user interface. We demonstrate our tool on the area of Barcelona, Spain, using real data on 168 hotels provided by one of the travel service providers.

Paper Nr: 179
Title:

MULTISENSORY ARCHITECTURE FOR INTELLIGENT SURVEILLANCE SYSTEMS - Integration of Segmentation, Tracking and Activity Analysis

Authors:

Francisco Alfonso Cano and José Carlos Castillo

Abstract: Intelligent surveillance systems deal with all aspects of threat detection in a given scene, ranging from segmentation to activity interpretation. The proposed architecture is a step towards solving the detection and tracking of suspicious objects as well as the analysis of activities in the scene. It is important to include different kinds of sensors in the detection process; indeed, their mutual advantages enhance the performance provided by each sensor on its own. The results of the multisensory architecture offered in the paper, obtained from testing the proposal on CAVIAR project data sets, are very promising within the three proposed levels, that is, segmentation based on accumulative computation, tracking based on distance calculation, and activity analysis based on a finite state automaton.

Paper Nr: 182
Title:

PASSENGER TRAVEL CHOICE PREDICTION BASED ON FUZZY LOGIC

Authors:

Liu Mei

Abstract: Predicting travel choices regarding high-speed rail is very important for improving the level of management. After comparing several methods, this paper puts forward a prediction model based on fuzzy logic to predict travelers' choices. Then, using survey data to train and test the model, we determined an effective model with fuzzy rules. Inspection of the results indicates that the choices predicted by the fuzzy-logic model are very close to the actual choices; therefore, the fuzzy-logic prediction method is feasible. In addition, we put forward the concept of an improving factor, to provide reasonable proposals on how to improve the service and management quality of the high-speed railway.
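A fuzzy-rule model of this kind can be sketched with triangular membership functions and a Mamdani-style AND; the rule, variables and breakpoints below are illustrative assumptions, not the rules trained in the paper:

```python
def tri(x, a, b, c):
    """Triangular membership function rising from a to a peak at b and
    falling back to zero at c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def hsr_preference(ticket_price, hours_saved):
    """Degree to which a traveler prefers high-speed rail, from one toy
    rule: IF price is low AND time saving is high THEN preference is high."""
    low_price = tri(ticket_price, 0, 50, 120)
    high_saving = tri(hours_saved, 0, 3, 6)
    return min(low_price, high_saving)  # min models the fuzzy AND
```

A real rule base would aggregate many such rules over the survey attributes and defuzzify the combined output into a predicted choice.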

Paper Nr: 205
Title:

THE RESEARCH ON THE INTERACTION EFFECTS OF RURAL LOGISTICS AND ECONOMIC DEVELOPMENT

Authors:

Xuegang Chen and Jianqin Zhou

Abstract: There are interaction effects between rural logistics and economic development. The effect of rural logistics on economic development follows an S-shaped curve, so the interaction effects vary at different stages. It is necessary to analyse these interaction effects in order to make appropriate policies to promote the development of rural logistics. A model is constructed to analyse the interaction effects, and the contribution that rural logistics makes to economic development in China is discussed. The interaction effects at present imply that the rural logistics elasticity coefficient of economic development is less than 1. Finally, some policies to develop rural logistics are proposed.

Paper Nr: 219
Title:

FINANCIAL TIME SERIES FORECAST USING SIMULATED ANNEALING AND THRESHOLD ACCEPTANCE GENETIC BPA NEURAL NETWORK

Authors:

Anupam Tarsauliya

Abstract: Financial time series forecasting is regarded as a key yet difficult task because of the high non-linearity and volatility of the data. Various statistical methods, machine learning and optimization algorithms have been widely used for forecasting time series in various fields. To overcome the problem of solutions becoming trapped in local minima, in this paper we propose a novel approach to financial time series forecasting that uses simulated annealing and threshold acceptance with a genetic back-propagation network to reach the global minimum with better accuracy. The time series dataset is normalized and split into training and test sets, which are used for supervised learning in a BPA artificial neural network optimized with a genetic algorithm. The results thus obtained are used as the seed for the starting point of simulated annealing and threshold acceptance. Empirical results from the proposed approach confirm that its forecasts outperform those of conventional BPA artificial neural networks.
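The two escape mechanisms named in the abstract differ only in their acceptance rule; both can be stated in a few lines (a generic sketch of the classical rules, not the paper's tuned schedules):

```python
import math
import random

def sa_accept(delta, temperature, rng=None):
    """Metropolis rule of simulated annealing: always accept an
    improvement (delta <= 0); accept a worsening with probability
    exp(-delta / temperature), which lets the search leave local minima."""
    rng = rng or random.Random()
    return delta <= 0 or rng.random() < math.exp(-delta / temperature)

def ta_accept(delta, threshold):
    """Threshold acceptance: a deterministic variant that accepts any move
    worsening the objective by no more than the current threshold."""
    return delta <= threshold
```

In the paper's setup, the GA-trained network weights would serve as the starting state on which these acceptance rules then iterate.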

Paper Nr: 226
Title:

AN IMPROVED GENETIC ALGORITHM WITH GENE VALUE REPRESENTATION AND SHORT TERM MEMORY FOR SHAPE ASSIGNMENT PROBLEM

Authors:

Ismadi Md Badarudin, Abu Bakar Md Sultan and Md Nasir Sulaiman

Abstract: The purpose of shape assignment is to find the optimal solution that combines a number of shapes so as to make full use of an area. To achieve this, the analysis needs to be run several times, because different solutions produce dissimilar numbers of items. Although finding the optimal solution is a certainty, the ambiguity involved and the huge number of possible solutions require an intelligent approach; the Genetic Algorithm (GA) was chosen to overcome this problem. We found that a basic implementation of the GA yields unpredictable and often long processing times, for several reasons. Thus, in order to reduce analysis time, we improved the GA by focusing on 1) a domain-specific initialization in which gene values are based on the X and Y coordinates of the area, and 2) the use of a short-term memory to avoid revisiting solutions. Through a series of experiments, the time required to obtain the optimal result with both the basic GA (BGA) and the improved GA (IGA) gradually increases as the area of the combined shapes grows. On the same datasets, however, the BGA requires more repetitions than the IGA, indicating that the IGA spends less computation time.

Paper Nr: 228
Title:

SELF-ADAPTIVE INTEGER AND DECIMAL MUTATION OPERATORS FOR GENETIC ALGORITHMS

Authors:

Ghodrat Moghadampour

Abstract: Evolutionary algorithms are affected by more parameters than typical optimization methods. This is at the same time a source of their robustness and a source of frustration in designing them. Adaptation can be used not only for finding solutions to a given problem, but also for tuning genetic algorithms to the particular problem. Adaptation can be applied to problems as well as to evolutionary processes. In the first case, adaptation modifies some components of the genetic algorithm to provide a form of the algorithm that suits the nature of the given problem; these components could be the representation, crossover, mutation or selection. In the second case, adaptation suggests a way to tune the parameters of the changing configuration of the genetic algorithm while solving the problem. In this paper, two new self-adaptive mutation operators, integer and decimal mutation, are proposed for implementing efficient mutation in the evolutionary process of a genetic algorithm for function optimization. Experimentation with 27 test cases and 1350 runs proved the efficiency of these operators in solving optimization problems.

Paper Nr: 266
Title:

MAS-ML TOOL - A Modeling Environment for Multi-agent Systems

Authors:

Enyo José Tavares Gonçalves, Kleinner Farias, Mariela I. Cortés and Allan Ribeiro Feijó

Abstract: Multi-Agent Systems (MAS) have emerged as a promising approach for developing complex and distributed systems. However, tools that support the development of MASs are essential if this approach is to be effectively exploited in an industrial context. There is therefore a need for MAS modeling tools, because creating and manipulating models without the support of an appropriate environment is a tedious, time-consuming and error-prone task. This paper aims to satisfy this need by building a domain-specific modeling environment for MAS, implemented as a plug-in for the Eclipse platform. The environment is based on MAS-ML, a modeling language for MAS. This work focuses on the implementation of tool support for MAS-ML static diagrams, according to version 2.0 of the language.

Paper Nr: 312
Title:

DECISION SUPPORT SYSTEM FOR COST-BENEFIT ANALYSIS IN SERVICE PROVISION

Authors:

Emadoddin Livani

Abstract: Cost-benefit analysis is an approach that relates the effort and cost of an activity to the resulting benefit. In this paper a novel decision support system for cost-benefit analysis in the context of service provision is proposed. Four decision support scenarios are investigated: (i) analyzing the impact of the services on cost and benefit, (ii) sensitivity analysis for the system variables, (iii) goal-seek analysis, and (iv) analyzing the impact of the services on operational resources. The key engine of the analysis approach is a Bayesian Belief Network (BBN). The BBN incorporates the key incoming, control and outgoing service parameters as well as their probabilistic relationships. In the sense of a hierarchical system, the variation of some of the parameters is guided by the results of optimizing operational resources, which are themselves BBN parameters. We evaluated the framework in a case study with the City of Calgary’s Waste and Recycling Services. The results showed that using such a DSS facilitates the decision-making process and improves the overall cost-benefit ratio.

Paper Nr: 313
Title:

A DISTRIBUTED AGENCY METHODOLOGY APPLIED TO COMPLEX SOCIAL SYSTEMS - A Multi-Dimensional Approach

Authors:

Bogart Yail Márquez and Manuel Castanon-Puga

Abstract: Methodology refers to the ways in which reality and knowledge can be studied; it does not question knowledge that has been accepted as true by the scientific community but instead concentrates on strategies to expand that knowledge. This work is motivated by the need to establish a methodology for the study of complex social systems in situations where conventional analysis is insufficient to describe the intricacies of realistic social phenomena and social actors. The general methodology we propose requires the use of all available computational techniques and interdisciplinary theories. This growing consensus must be able to describe all aspects of social life and serve as a common language in which different theories can be contrasted.

Paper Nr: 323
Title:

BASED ON 'SCENARIOS-RESPONSE' MODEL OF SECURITY PLANS FOR EMERGENCY MANAGEMENT SYSTEM OF DATABASE DESIGN

Authors:

Ding Dan and Li Xiaoran

Abstract: This article researches improving the design of the emergency-plan library. Based on existing research on "scenario" theory, it focuses on finding, through a corresponding mathematical model, how "scene" elements change with a sudden incident, and on forming the corresponding model (that is, the "scenario" network). By sorting and analyzing events that have occurred, we extract "scene" elements and enrich the plan library, so as to enhance the real emergency warning capacity of the security management system.

Paper Nr: 352
Title:

INFLUENCING FACTORS OF HIGH-SPEED RAILWAY PASSENGERS’ TRAVEL CHOICE BASED ON ROUGH SET

Authors:

Fan Yuhang

Abstract: Inter-city traffic is a large, complicated system; solving inter-city traffic problems lies in the reasonable allocation of transportation modes, and its core is satisfying the travel needs of high-speed railway passengers. Using rough set theory, this paper calculates the single-factor weights and multi-factor weights for the 5 main factors influencing high-speed railway passengers' choice of travel, clarifies the interactions and connections between the various influencing factors, and provides scientific decision support for the construction of high-speed railways and their actual operation management.

Paper Nr: 355
Title:

MUTUAL INTERDEPENDENCE OF STOCK MARKETS BASED ON SUPPORT VECTOR MACHINE

Authors:

Minghao Zhu and Jie Li

Abstract: As China's market economy continues to advance, the transparency of stock market information increases, information flows faster between stock markets, and the interactions between stocks become increasingly significant. In this paper, the support vector machine method is used to study the nonlinear, discontinuous time series of the stock market. By establishing different support vector machine models, we predict the Shanghai A-share index, the Shenzhen A-share index, the Shanghai B-share index and the Shenzhen B-share index, and analyze their absolute and relative errors. We found a strong nonlinear interdependence within the same stock market and a strong dependence between different securities markets; the Shanghai index has a slightly larger effect than the Shenzhen index.

Paper Nr: 393
Title:

A GENETIC ALGORITHM FOR SOLVING A PUBLIC SECTOR SUSTAINABLE SUPPLY CHAIN DESIGN PROBLEM

Authors:

Ernesto Del R. Santibanez-Gonzalez

Abstract: This paper presents a novel mixed-integer 0-1 programming (MIP) model for solving a sustainable supply chain network design problem that arises in the public sector. In our problem, we have to determine a fixed number of facilities to be located at sites chosen from among a given set of candidate sites. Sustainability issues are integrated into the model by reducing the greenhouse gas emissions produced by transportation and by the operation of the facilities. We propose a simple genetic algorithm (GA) for solving this problem. In order to validate our GA solutions, we used GAMS to obtain optimal objective values for the MIP. Computational results are very good for instances generated from a known OR test library.
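The GA's evaluation step can be illustrated with a toy fitness function in which transport distance doubles as a greenhouse-gas proxy (the 1-D locations and emission weighting are our own simplifying assumptions, not the paper's MIP objective):

```python
import random

def fitness(open_sites, demand_pts, site_pos, co2_per_km=1.0):
    """Cost of a candidate solution: each demand point is served by its
    nearest open facility; distance times an emission factor stands in
    for the transport component of the sustainability objective."""
    return sum(min(abs(d - site_pos[s]) for s in open_sites) * co2_per_km
               for d in demand_pts)

def random_chromosome(num_sites, p, rng=None):
    """A chromosome is a set of exactly p open sites, matching the
    fixed-number-of-facilities constraint."""
    rng = rng or random.Random()
    return set(rng.sample(range(num_sites), p))
```

Selection, crossover and mutation would then evolve such site sets, with the GAMS-solved MIP providing the optimal values for comparison.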

Posters
Paper Nr: 37
Title:

THE MATCHING FOR THE MULTI-PROJECT COLLABORATIVE PLAN OF NEW PRODUCT DEVELOPMENT AND RESOURCE BASED ON GENERALIZED RESOURCE UNIT

Authors:

XiaoGang Deng and Gang Guo

Abstract: The objective is the matching of collaborative product development in manufacturing enterprises, including the large-scale involvement of suppliers. The match function for collaborative multi-project planning and design resources is analyzed under different requirements and time intervals when conflicts exist. A generalized design resource unit and a resource granule quantification model are defined. A multi-project collaborative planning and resource granule constraint-matching model, together with a realization algorithm, is presented. Based on plan matching with resource granules, a new product development method with multi-project collaborative planning, based on generalized design resource constraints, is proposed. A case study following the model and planning demonstrates the feasibility of the method.

Paper Nr: 39
Title:

RESEARCH OF UNIVERSITIES RESEARCH INDIRECT COSTS ALLOCATION BASED ON COOPERATIVE GAME

Authors:

Zhang Zenglian

Abstract: Universities engage in both teaching and research activities. In China, universities' research funds now mainly offset the direct costs of research; the indirect costs of research activities are covered mainly by a management fee of about 5%, which is far from the share that research activities should be assessed to make up. At present, universities' indirect research costs are measured mainly through traditional functional costing and the more advanced activity-based costing, but these methods have shortcomings. This paper proposes a new allocation method for universities' indirect research costs based on cooperative game theory, and justifies it with an example.

Paper Nr: 64
Title:

RESEARCH ON THE PATTERN OF TECHNICAL SERVICE INNOVATION OF RAIL TRANSIT INDUSTRY BASED ON INDUSTRY CHAIN VIEWPOINT

Authors:

Yingsi Zhao and Yanping Liu

Abstract: A complete industrial chain and unique technical service characteristics have formed in China's rail transit industry. In this paper, we study the existing technical service pattern of this industry from the viewpoint of the industrial chain. We also discuss the pattern of technical service innovation in the up-, mid- and downstream of the industrial chain respectively and analyse its evolution trends. The analysis demonstrates that industry alliances and collaborative innovation will be an important developmental direction for technical service innovation, which should be helpful for the continuous innovation of rail transit technical service providers.

Paper Nr: 89
Title:

COOPERATIVE SYSTEM FOR INTERMODAL RAILROAD OPERATIONS

Authors:

Jukka Hemilä

Abstract: The research project EURIDICE (European Inter-Disciplinary Research on Intelligent Cargo for Efficient, Safe and Environment-Friendly Logistics) focuses on the development of intelligent solutions for the transport sector. The basic concept of EURIDICE is an information services platform centered on the individual cargo item and its interaction with the surrounding environment and the user. EURIDICE has promoted the Intelligent Cargo concept as a future solution for transport sector information needs. The platform will be implemented and tested in eight industrial transport scenarios. This paper looks at one scenario related to intermodal railroad operations. The paper presents the functionalities of the developed cooperative system for intermodal railroad operations and the way end-users can use the system environment in their operations.

Paper Nr: 162
Title:

MULTI-OBJECTIVES OPTIMIZATION ON THE DEPARTURE OF AIRPLANES FROM BUSY AIRPORT

Authors:

Zhang Yue

Abstract: The delay of airplanes has become the core negative factor influencing the service quality and development of the airline business. Optimized scheduling of airplane arrivals and departures is one of the best methods to decrease delay, aside from uncontrollable weather. Take-off sequencing can be seen as a single-machine scheduling problem with two objectives: minimizing the number of tardy jobs and minimizing the maximal tardiness over all jobs. The mathematical model is formulated, and a multi-objective GA (Genetic Algorithm) is used to find the Pareto-optimal solutions. Computational results show that the proposed algorithm performs well compared with traditional heuristic methods and also provides the dispatcher with several choices from which to decide according to real conditions. This process will improve the flexibility and effectiveness of scheduling airplane departures.
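For the first of the two objectives, minimizing the number of tardy jobs on a single machine, the classical Moore–Hodgson algorithm gives an exact single-objective baseline against which heuristics can be compared (shown as background; the paper itself uses a multi-objective GA):

```python
import heapq

def min_tardy_count(jobs):
    """Moore–Hodgson: schedule jobs given as (processing_time, due_date)
    in due-date order; whenever the schedule becomes tardy, drop the
    longest job scheduled so far. Returns the minimum number of tardy jobs."""
    t, kept = 0, []            # kept is a max-heap of processing times
    for p, d in sorted(jobs, key=lambda j: j[1]):
        heapq.heappush(kept, -p)
        t += p
        if t > d:              # deadline missed: eject the longest job
            t += heapq.heappop(kept)   # pops -max_p, shrinking t by max_p
    return len(jobs) - len(kept)
```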

Paper Nr: 183
Title:

HIGH-SPEED RAILWAY PASSENGERS’ CHOICES OF TRAVEL FORECAST BASED ON MATLAB NEURAL NETWORK

Authors:

Li Jing

Abstract: As a newly developing mode of transportation, the high-speed railway is expanding its influence on the national economy and social life. At present, domestic research on high-speed railways mostly focuses on the technical level; no systematic and comprehensive research has been done on the aspect of passenger travel. Using Matlab 6.6, this study concentrates on the effects of environmental factors on the travel choices of high-speed railway passengers, building a forecast model based on a BP artificial neural network. Through comparison and analysis of predicted and real data, the effectiveness of this method is proved.

Paper Nr: 368
Title:

THE COMPETITION BETWEEN AVIATION AND HIGH-SPEED RAILWAY

Authors:

Li Chaonan

Abstract: In this article, we analyze the competition between high-speed railway and aviation in China, and study whether the opening of high-speed railway lines will have a tremendous impact on airlines. Firstly, we analyze the development and trends of both high-speed railway and aviation in China. Secondly, we compare the advantages of these two travel modes. Thirdly, we consider examples in and outside China of high-speed railways affecting airline operations. We then use the Share Rate Model to analyze the share rates of these two modes of transportation at different mileages. When the mileage is below 900 km, the high-speed railway carries a larger percentage of passengers than aviation. With increasing mileage, the share rates of the two modes get closer to each other, and at 900 km each mode carries 50% of passengers.
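The qualitative shape described, rail dominating at short distances with a 50/50 split at 900 km, matches a logistic share curve; the functional form and slope below are our own illustrative calibration, not the paper's Share Rate Model:

```python
import math

def rail_share(distance_km, k=0.005, d50=900.0):
    """Share of passengers choosing high-speed rail over aviation as a
    logistic function of trip distance: above 0.5 below d50 km, exactly
    0.5 at d50, and declining as distance grows; k sets the steepness."""
    return 1.0 / (1.0 + math.exp(k * (distance_km - d50)))
```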

Area 3 - Information Systems Analysis and Specification

Full Papers
Paper Nr: 97
Title:

DEVELOPMENT OF AN EMPIRICAL KNOWLEDGE MANAGEMENT FRAMEWORK FOR PROFESSIONAL VIRTUAL COMMUNITY IN KNOWLEDGE-INTENSIVE SERVICE INDUSTRIES

Authors:

Yuh-Jen Chen

Abstract: With the advent of the service-oriented knowledge economy in the 21st century, knowledge-intensive service industries (KISI) have become a trend in industrial development. In knowledge-intensive service industries, enterprise activities are highly creative: performing and completing each activity draws on domain professional knowledge and experience involving ideas such as service innovation and service value-adding. Implementing knowledge management effectively is therefore an urgent task for quickly accumulating enterprise knowledge assets and increasing the efficiency of knowledge-intensive service industries. A professional virtual community is an interactive platform on which enterprise experts jointly create and share empirical knowledge in knowledge-intensive service industries. Such a platform records a high volume of both irrelevant information and empirical knowledge during expert discussions, so how to manage and share the useful content of these discussions has become an important issue for empirical knowledge management in professional virtual communities. This study presents a systematic approach to developing a framework for empirical knowledge management to support professional virtual communities in knowledge-intensive service industries. The approach comprises three phases: (i) proposing an empirical knowledge management model for professional virtual communities, (ii) designing an empirical knowledge management system framework for professional virtual communities, and (iii) implementing an empirical knowledge management system prototype for professional virtual communities. The results of this study facilitate efforts within the professional virtual community to extract, verify, store and share empirical knowledge, in order to effectively help knowledge-intensive service industries enhance their service innovation abilities and create the best services for customers' requirements.

Paper Nr: 99
Title:

ON THE USE OF SOFTWARE VISUALIZATION TO ANALYZE SOFTWARE EVOLUTION - An Interactive Differential Approach

Authors:

Renato Lima Novais and Glauco de F. Carneiro

Abstract: Software evolution is one of the most important topics in modern software engineering research. This activity requires the analysis of large amounts of data describing the current software system structure as well as its previous history. Software visualization can be helpful in this scenario, as it can summarize this complex data into easy-to-interpret visual scenarios. This paper presents an interactive differential approach for visualizing software evolution. The approach builds multi-view structural descriptions of a software system directly from its source code and uses colors to differentiate a version from any previous one. The approach is highly interactive, allowing the user to quickly brush over many pairs of versions of the system. As a proof of concept, we used the approach to analyze eight versions of an open source system and found it useful for quickly identifying hot spot and code smell candidates.

Paper Nr: 132
Title:

A NEW METHOD AND METRIC FOR QUANTITATIVE RISK ANALYSIS

Authors:

Peng Zhou and Hareton Leung

Abstract: Quantitative risk analysis gives practitioners a deeper understanding of the risks in their projects. However, existing methods for impact assessment are inaccurate, and the metrics for risk prioritization cannot properly prioritize risks in certain cases. In this paper, we propose a method for measuring risk impact using AHP. We also propose a new indicator, Risk Intensity (RI), to prioritize the risks of a project. Compared with the widely used metric Risk Exposure (RE), the contours of RI show a convex pattern whereas the contours of RE show a concave pattern. RI allows practitioners to weight probability and impact differently and can better satisfy the needs of risk prioritization. Through a case study, we found that RI prioritizes risks better than RE.
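For illustration, the classic Risk Exposure metric can be contrasted with a hypothetical weighted stand-in for RI (the abstract does not give the actual RI formula; the elliptic form below merely demonstrates a convex-contour metric with a tunable weight). The risk register entries are invented.

```python
def risk_exposure(p, impact):
    """Classic Risk Exposure: probability of the risk times its impact."""
    return p * impact

def risk_intensity(p, impact, w=0.7):
    """Hypothetical stand-in for RI: an elliptic (convex-contour) mean
    that lets practitioners weight probability vs. impact differently."""
    return (w * p ** 2 + (1 - w) * impact ** 2) ** 0.5

# Invented risk register: risk -> (probability, impact on a 0-1 scale).
risks = {"schedule slip": (0.8, 0.3), "key staff loss": (0.2, 0.9)}
ranking = sorted(risks, key=lambda r: risk_intensity(*risks[r]), reverse=True)
print(ranking)
```

With w above 0.5 the metric emphasizes probability, so the likely-but-mild risk outranks the rare-but-severe one; RE alone offers no such knob.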

Paper Nr: 163
Title:

BPEL-TIME - WS-BPEL Time Management Extension

Authors:

Amirreza Tahamtan and Christian Oesterle

Abstract: Temporal management and the assurance of temporal compatibility are important quality criteria for processes within and across organizations. Temporal conformance increases QoS and reduces process execution costs. WS-BPEL, the accepted industry standard, lacks sufficient temporal management capabilities. In this paper we introduce BPEL-TIME, a WS-BPEL extension for time management. It allows the definition, execution and monitoring of business processes with time management capabilities. The extension supports fixed, variable and probabilistic representations of temporal constraints and checks whether a model is temporally compliant. Our approach avoids temporal failures by predicting the future temporal behavior of business processes.

Paper Nr: 176
Title:

TESTING IN SOFTWARE PRODUCT LINES - A Model based Approach

Authors:

Pedro Reales Mateo

Abstract: This article describes a model-driven approach for test case generation in software product lines. It defines a set of metamodels and models, a 5-step process and a tool called Pralíntool that automates the process execution and supports product line engineers in using the approach.

Paper Nr: 233
Title:

AN MDA-BASED METHOD FOR DESIGNING INTEGRATION PROCESS MODELS IN B2B COLLABORATIONS

Authors:

Ivanna M. Lazarte, Pablo D. Villarreal and Omar Chiotti

Abstract: The design of integration processes is a key issue for implementing collaborative business processes in Business-to-Business collaborations. A collaborative process is executed through the enactment of the integration process of each organization, which contains the public and private logic required to support the role an organization performs in the collaborative process. Integration process models must be aligned with and derived from their corresponding collaborative process models to guarantee interoperability among organizations. In this work, we propose a method based on a Model-Driven Architecture to enable organizations to support and automate the design of integration process models. This method provides a model transformation process that uses Workflow Activity Patterns to generate the public/private activities required in integration processes to support cross-organizational message exchanges.

Paper Nr: 251
Title:

APPROACH FOR VERIFYING WORKFLOW VALIDITY

Authors:

Yuan Lin, Thérèse Libourel and Isabelle Mougenot

Abstract: This article presents the solution adopted for tackling the problem of incompatibility inherent in process compositions during a workflow’s construction. The proposed approach is based on a context of pre-constructed resource hierarchies (data and processes) and consists of finding possible composition “paths” between processes within GRSYN and GRSEM resource graphs constructed from the context. We explain the stage of constructing the context from a simple formal description of resources. The stage for resolving the incompatibility is then covered in detail. We briefly present the implemented prototype before highlighting future avenues of research.

Paper Nr: 264
Title:

IMPROVING THE CONSISTENCY OF SPEM-BASED SOFTWARE PROCESSES

Authors:

Eliana B. Pereira and Ricardo M. Bastos

Abstract: The main purpose of this paper is to improve the consistency of SPEM-based software processes through a set of well-formedness rules that check for errors in a software process. The well-formedness rules are based on the SPEM 2.0 metamodel and described using Unified Modeling Language (UML) multiplicities and First-Order Predicate Logic (FOPL). The use of the well-formedness rules is exemplified on a part of the OpenUP process, and the evaluation of one of the proposed rules is shown.

Paper Nr: 276
Title:

TOPOLOGICAL MODELING FOR ENTERPRISE DATA SYNCHRONIZATION SYSTEM - A Case Study of Topological Model-driven Software Development

Authors:

Uldis Donins and Janis Osis

Abstract: In this paper, a formalization approach to problem domain and system modelling is presented in the context of an enterprise data synchronization system development case study. The formalization approach is based on topology borrowed from the topological functioning model (TFM). TFM uses mathematical foundations that holistically represent the complete functionality of the problem and application domains. By applying the proposed topological modelling approach in the software development process, we aim to enable computation independent model creation in a formal way and its transformation into a platform independent model. In addition, traceability can be established between the problem domain model, the solution domain model (or models) and the software code.

Paper Nr: 301
Title:

MULTIOBJECTIVE SOFTWARE RELEASE PLANNING WITH DEPENDENT REQUIREMENTS AND UNDEFINED NUMBER OF RELEASES

Authors:

Márcia Maria Albuquerque Brasil, Thiago Gomes Nepomuceno da Silva and Fabrício Gomes de Freitas

Abstract: Release planning is an important and complex activity in software development. It involves several aspects related to which functionalities will be developed in each release of the system. Consistent planning must meet the customers’ needs and comply with existing constraints. Optimization techniques have been successfully applied to problems in the Software Engineering field, including the Software Release Planning problem. In this context, this work presents a multiobjective optimization approach for the problem when the number of releases is not known a priori or is a value expected by stakeholders. The strategy considers stakeholders’ satisfaction, business value and risk management, and provides ways of handling requirements interdependencies. Experiments show the feasibility of the proposed approach.

Paper Nr: 339
Title:

APPLICATION OF ANALYTIC HIERARCHY PROCESS ON CALCULATING THE WEIGHTS OF ECONOMIC MODEL EVALUATION

Authors:

Dai Wang and Dan Chang

Abstract: As teaching resources and information technology become ever more closely combined, the question of how to manage the quality of online teaching resources has attracted increasing attention. This thesis constructs an evaluation system for the model library of the Economic Model Resource Platform. When determining the weights of the evaluation system, the weighting scales and the corresponding calculations are adjusted based on both the theory of the Analytic Hierarchy Process (AHP) and the characteristics of the Economic Model Resource Platform, in order to make the final weights more suitable for practical applications. This study helps achieve the goals of monitoring the quality of the economic models and promoting their optimization. In addition, the calculation method for determining the weights provides a reference for the application of AHP.
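As a sketch of the AHP weight calculation that the paper builds on, the geometric-mean approximation to the principal eigenvector can be computed as follows; the 3x3 comparison matrix is a made-up example, not taken from the paper.

```python
import math

def ahp_weights(matrix):
    """Approximate AHP priority weights from a pairwise comparison matrix
    via the geometric-mean (logarithmic least squares) method."""
    gms = [math.prod(row) ** (1.0 / len(row)) for row in matrix]
    total = sum(gms)
    return [g / total for g in gms]

# Made-up 3-criterion comparison on Saaty's 1-9 scale (not from the paper):
# e.g. criterion 1 is judged 3x as important as criterion 2.
M = [[1, 3, 5],
     [1 / 3, 1, 2],
     [1 / 5, 1 / 2, 1]]
print([round(w, 3) for w in ahp_weights(M)])
```

The weights sum to one and preserve the judged ordering of the criteria; a full AHP application would also compute a consistency ratio before accepting them.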

Paper Nr: 363
Title:

COST MODELING AND ESTIMATION IN AGILE SOFTWARE DEVELOPMENT ENVIRONMENTS USING INFLUENCE DIAGRAMS

Authors:

Efi Papatheocharous and Despoina Trikomitou

Abstract: Software development according to agile principles seeks to promote adaptive processes, teamwork and collaboration throughout the life-cycle of a project. In contrast, traditional software development focuses on the various phases and activities of the life-cycle while seeking repeatable, predictable processes to maximize productivity and quality. Additionally, project management in conventional development processes aims to plan and predict the future, whereas in agile development environments it aims to adapt to future change. In this paper we investigate, through modeling with Influence Diagrams, the benefit of switching from traditional software development to agile in terms of productivity, expected value and cost. Additionally, we examine how software costs may differ depending on whether traditional or agile development methodologies are followed. We explore the factors that contribute to successful software development and draw our main conclusions from hypothetical and real case scenarios recorded in agile surveys of Information Technology practices. One of our main conclusions is the verification of the need for a skillful manager and a small development team for successful agile projects.

Paper Nr: 381
Title:

BUILDING A WEB EFFORT ESTIMATION MODEL THROUGH KNOWLEDGE ELICITATION

Authors:

Emilia Mendes

Abstract: OBJECTIVE – The objective of this paper is to describe a case study where Bayesian Networks (BNs) were used to construct an expert-based Web effort model. METHOD – We built a single-company BN model solely elicited from expert knowledge, where the domain experts were two experienced Web project managers from a medium-size Web company in Auckland, New Zealand. This model was validated using data from eleven past finished Web projects. RESULTS – The BN model has to date been successfully used to estimate effort for numerous Web projects. CONCLUSIONS – Our results suggest that, at least for the Web Company that participated in this case study, the use of a model that allows the representation of uncertainty, inherent in effort estimation, can outperform expert-based estimates. Another nine companies have also benefited from using Bayesian Networks, with very promising results.
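A minimal sketch of how a discrete Bayesian Network yields an effort estimate by marginalizing over its parent nodes follows; the structure and all probabilities here are invented for illustration, not the expert-elicited model of the case study.

```python
# Toy two-parent Bayesian Network for effort estimation; the structure and
# all probabilities are invented, unlike the elicited model of the paper.
p_large = 0.4        # P(project size = large)
p_skilled = 0.7      # P(team = skilled)
# Conditional probability table: P(effort = high | size large?, team skilled?)
cpt = {(True, True): 0.5, (True, False): 0.9,
       (False, True): 0.1, (False, False): 0.3}

def p_high_effort():
    """Marginalize the effort node over both parent nodes."""
    total = 0.0
    for large in (True, False):
        for skilled in (True, False):
            prior = ((p_large if large else 1 - p_large)
                     * (p_skilled if skilled else 1 - p_skilled))
            total += prior * cpt[(large, skilled)]
    return total

print(round(p_high_effort(), 3))   # prior probability of a high-effort project
```

Entering evidence (fixing a parent to an observed value instead of summing over it) is what lets such a model represent the uncertainty the abstract highlights.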

Short Papers
Paper Nr: 32
Title:

FORMALLY MODELING AND ANALYZING DATA-CENTRIC WORKFLOW USING WFCP-NET AND ASK-CTL

Authors:

Wang Zhaoxia and Jianmin Wang

Abstract: Despite the abundance of workflow analysis techniques from the control-flow perspective, there is hardly any method for workflow verification from the data-processing perspective. In this paper, we restrict WFCP-net, a Colored Petri Net with a WF-net structure, to formally describe the key business entities in a data-centric workflow model. Then, we use ASK-CTL logic to describe workflow requirements from the business data-processing perspective. A model checking method is adopted in our verification approach, which can uncover business-data violations in workflow models. The effectiveness of our work has been validated with CPN Tools.

Paper Nr: 103
Title:

THE PRICING MECHANISM OF SUPPLIERS IN RISK-SHARING IN EXTERNAL FINANCING

Authors:

Zhenyi Wang

Abstract: Against the background of retailers turning to financial institutions for financing due to a lack of funds, and assuming that suppliers are risk-averse and that rebuilding a network of retail outlets is costly, this paper uses a utility function to derive the best wholesale prices for suppliers under risk-sharing. The research shows that, under risk-sharing, suppliers will avoid financing risks through high wholesale prices; it also reveals the influence mechanism of the wholesale price.

Paper Nr: 115
Title:

TRACEABILITY AND VIEWPOINTS IN ENTERPRISE ARCHITECTURES

Authors:

Dmytro Panfilenko and Roman Litvinov

Abstract: In times of increasing information volumes, it is virtually impossible to harness the complexity and change of enterprise processes and requirements without the aid of enterprise architectures, which are supported by the methodologies surveyed in this article. Software architectures serve as templates for system development processes and can describe the basic infrastructures for hardware, software and networks, as well as their interrelations. Each participant, developer and architect alike, fashions its own view on the final system, which constitutes the rationale for introducing the viewpoints at different abstraction levels provided in this article.

Paper Nr: 124
Title:

DESIGN APPROACH OF DISTRIBUTED SYSTEMS FOR THE CONTROL OF INDUSTRIAL PROCESS

Authors:

D. Boudebous and J. Boukachour

Abstract: This article describes a methodological approach to the design of distributed systems for the control of industrial processes. The designer tackles the problem by specifying the behaviour of the process rather than a solution, thereby defining "what to control". This specification can then be converted not only into a Petri net, allowing certain properties of the described behaviour to be checked, but also into a logical network of communicating modules that defines the logical structure of the process control system. In both cases, the conversion rules are direct and simple.

Paper Nr: 129
Title:

AN EFFICIENT METHOD TO IDENTIFY CUSTOMER VALUE IN TOURIST HOTEL MANAGEMENT

Authors:

Changqiu Li

Abstract: Fierce market competition forces tourist enterprises to pay more and more attention to customer demands. Customer relationship management (CRM) is becoming increasingly important in the tourist industry. Identifying valuable customers and cultivating them are the two basic tasks of CRM, so identifying customer value is of significant importance to managers. In this paper, we introduce a convenient statistical method based on RFM (Recency, Frequency, Monetary value) analysis (Xiaoyu Zhao, 2005) for customer value identification in tourist hotels. We study how to use this method in the management of a tourist hotel. The convenience and importance of the method are demonstrated through comprehensive analysis.
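A toy RFM scoring pass illustrates the idea; the cut-off thresholds and guest records below are assumptions for illustration, not the statistical method of the cited analysis.

```python
from datetime import date

# Hypothetical guest history: guest -> (last stay, stay count, total spend).
guests = {
    "A": (date(2011, 3, 1), 12, 8400.0),
    "B": (date(2010, 6, 15), 2, 900.0),
    "C": (date(2011, 1, 20), 5, 3100.0),
}

def rfm_score(last_stay, frequency, monetary, today=date(2011, 4, 1)):
    """Score each RFM dimension 1-3 and sum; the thresholds are invented."""
    days = (today - last_stay).days     # recency in days
    r = 3 if days <= 60 else 2 if days <= 180 else 1
    f = 3 if frequency >= 10 else 2 if frequency >= 4 else 1
    m = 3 if monetary >= 5000 else 2 if monetary >= 2000 else 1
    return r + f + m

ranking = sorted(guests, key=lambda g: rfm_score(*guests[g]), reverse=True)
print(ranking)   # most valuable guests first
```

A hotel would typically derive the thresholds from quantiles of its own guest data rather than fixing them by hand.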

Paper Nr: 149
Title:

NDT-GLOSSARY - A MDE Approach for Glossary Generation

Authors:

J. A. García-García and M. J. Escalona

Abstract: This research paper is contextualized within the paradigm of Model-Driven Engineering (MDE) and is specifically related to NDT, a methodology within the MDE paradigm. The aim of this paper is to present a software tool that facilitates the work of requirements engineers during requirements validation, an activity within the requirements phase of a software project's life cycle. The developed tool, called NDT-Glossary, implements an automatic procedure to generate, from the requirements defined in a project developed with the NDT methodology, a first draft of the glossary of terms for that project.

Paper Nr: 172
Title:

A CONTEXT-AWARE SERVICE CENTRIC APPROACH FOR SERVICE ORIENTED ARCHITECTURES

Authors:

Hatim Hafiddi, Mahmoud Nassar and Hicham Baidouri

Abstract: Evolution in the fields of telecommunication and software engineering has promoted the birth of a new generation of software architectures known as Context-Aware Service Oriented Architectures (CASOA), which are articulated on a new design and development paradigm called Context-Aware Service (CAS). However, the ambiguity of the context concept and the multiplicity of service execution contexts make CAS hard to build, and show why a generic approach for designing such services, in accordance with software engineering best practices, is necessary. This paper focuses on a CAS design approach for building CASOA. To develop such architectures, challenges such as context management and dynamic service adaptation have to be faced. We propose a design process that exploits both our context and CAS specifications and metamodels in order to fulfil the passage from a core service in a Service Oriented Architecture (SOA) to a CAS in CASOA. This passage is achieved through a mechanism that, inspired by Aspect Paradigm concepts, treats service adaptations as aspects.

Paper Nr: 173
Title:

MODELING AND ANALYSIS OF A POWER LINE COMMUNICATION NETWORK SUBJECT TO CHANNEL FAILURE

Authors:

Shensheng Tang and Yi Xie

Abstract: Power line communication (PLC) is a promising technique for information transmission using existing power lines. We analytically model a finite-source PLC network subject to noise/disturbance and evaluate its call-level performance through a queuing theoretic framework. The proposed PLC network model consists of a base station (BS), which is located at a transformer station and connected to the backbone communication networks, and a number of subscriber stations that are interconnected with each other and with the BS via the power line transmission medium. An orthogonal frequency division multiplexing (OFDM) based transmission technique is assumed to be used for providing the transmission channels in a frequency spectrum. The channels are subject to failure during service due to noise/disturbance. When a channel is in failure, its associated call will wait at the channel until the channel is recovered. We model this process and determine the steady-state solution and derive several performance metrics of interest. Numerical and simulation results are presented for the purpose of performance evaluation. The proposed modeling method can be used for evaluation and design of future PLC networks.

Paper Nr: 180
Title:

STUDY ON THE INFORMATION SECURITY SYSTEM FOR BANK IN CHINA

Authors:

Xichen Jiang

Abstract: With the wide use of new network technology in finance, electronic finance, commercial services on the Internet and cash payment systems bring us convenience; at the same time, they also bring many hidden dangers and financial risks. This paper constructs a security system for bank information systems based on contemporary information security theory, combining information technology with management methods.

Paper Nr: 187
Title:

FORMALIZING TRUST REQUIREMENTS AND SPECIFICATION IN SERVICE WORKFLOW ENVIRONMENTS

Authors:

Wattana Viriyasitavat and Andrew Martin

Abstract: The emergence of advanced communication technologies such as the Internet has shifted interactions from face-to-face towards virtual ones in the form of services. The proliferation of services has enabled the creation of new value-added services composed of several sub-services in a pre-specified manner, known as service workflows. A number of security issues arise, as workflows require disparate services to dynamically collaborate and interact on demand. Trust is an enabling technology serving as an adaptive and platform-independent solution that fits in this context. However, the lack of consensus on a unified trust definition and the traditional mindset of treating trust requirements separately make formal specification difficult. This paper provides a formal framework for this problem. The central part of the paper is a logic-based formalism with algebraic expressions to formally specify trust requirements. A trust definition and three modes of trust are described, with algebraic operators to form specification formulae. The contribution of the framework is to allow trust requirements to be formally and uniformly specified by each distributed autonomous service, serving as a core component for automatic compliance checking in service workflows.

Paper Nr: 192
Title:

SURVEY AND PROPOSAL OF A METHOD FOR BUSINESS RULES IDENTIFICATION IN LEGACY SYSTEMS SOURCE CODE AND EXECUTION LOGS

Authors:

Wanderley Augusto Radaelli Junior

Abstract: Computer systems implement business processes from different organizations. Among currently operating computer systems, many are classified as legacy systems. Typically, legacy systems are complex applications that are still active due to the high cost of modernization and a high degree of criticality. In recent years, several works have been published addressing the importance of legacy system modernization, emphasizing the extraction of the business process models implemented in these systems. Within this context, a key step is to extract knowledge from source code and/or system execution logs, aiming to use this information in reverse engineering processes. In this work, methods based on source code manipulation and execution log mining, which can be used to extract knowledge from legacy systems, are presented and analyzed, prioritizing business rule identification. A comparison between the two approaches is presented, along with their positive and negative characteristics. Our results include a list of desired features and a proposal of a method for legacy system reverse engineering and business rule identification.

Paper Nr: 194
Title:

RESEARCH ON COMPONENT COMPOSITION BASED ON FEATURE MODEL

Authors:

Lirong Xiong

Abstract: Traditional component description methods lack sufficient semantic information, making it difficult for users to find components matching their requirements and to perform automatic composition and verification of components. Furthermore, component trustworthiness is an important factor that must be considered during the composition process. Feature-Oriented Domain Analysis (FODA), which uses features and the relations between them to describe the problem domain, can provide the necessary support for component composition. This paper focuses on component functional semantics and component trustworthiness. A feature meta-model is proposed to provide sufficient semantic information, with trustworthiness taken into account. Based on the feature meta-model, a component composition algorithm is presented. Finally, an application of the approach in the credit evaluation domain is presented.

Paper Nr: 210
Title:

DEVELOPMENT OF CONTEXT-AWARE APPLICATIONS IN UBIQUITOUS INFORMATION SYSTEMS

Authors:

Benselim Mohamed Salah and Seridi-Bouchelaghem Hassina

Abstract: Nowadays, software engineering is moving towards the development of ubiquitous and distributed applications. This tendency is constrained by parameters such as mobility and heterogeneity that characterize a user's current situation. Each new application must be able to adapt its services to changes in the context of use and satisfy all of the user's preferences. The aim of this paper is to propose a new development approach that takes the changing context of use into account during the application development process. It allows the contextual aspects of a system to be developed separately and independently from its business aspects and from the technological constraints of the chosen platform. Our proposal, based on the principles of MDA (Model Driven Architecture), is defined in three steps: first, the separation of contextual aspects by introducing the 3TUP process (3 Track Unified Process) and the development process known as PSI "Ψ"; second, context modeling using UML (Unified Modeling Language), conforming to a proposed context metamodel; and third, the integration of the contextual model into the MDA process using a model merging operation.

Paper Nr: 215
Title:

GETTING TO GLOBAL YES! - Designing a Distributed Student Collaboration

Authors:

Selma Limam Mansar

Abstract: The authors have taught a course called 'Global Project Management' for four years, engaging students in three international locations in hands-on distance projects. The distance projects are intended to provide students with enriching, realistic global project experience. With experience, improved planning and better coordination, each iteration of the distance projects has improved. In this paper, the authors present lessons learned and a mind map demonstrating key aspects of design of global hands-on projects.

Paper Nr: 218
Title:

ONTOLOGY-BASED KNOWLEDGE MANAGEMENT - Graphical Query Editor for OWL Ontologies

Authors:

Markus Schwinn

Abstract: The OnToBau research project aims to provide a way to classify, archive and effectively use business knowledge through an ontology-based knowledge archive for small and medium companies in the construction industry. The archive is intended to pro-actively provide users with information that assists them in their daily business process handling. The system consists of four main parts. Document converters prepare the different resources (e-mails, paper documents, PDFs, etc.) to be stored in the knowledge archive for the enclosed inference system. The inference system is the core component and extracts information from the preprocessed resources. Ontologies provide the necessary domain knowledge. In order to exploit the available knowledge, a personal agent monitors the current activities of the user and tries to infer the intention behind his behavior; at certain points it automatically offers the user helpful information. Again, ontologies are used to represent information about the business processes. In addition, the user has the option of searching for information in the archive through the graphical user interface. The importance of simple query systems has already been recognized in the area of database systems. This paper gives an overview of the OnToBau research project, presenting a first approach to visually querying information in the knowledge archive.

Paper Nr: 232
Title:

BUSINESS-IT ALIGNMENT AND ORGANISATION AGILITY ENABLED BY BPM AND SOA APPROACHES INTERPLAY

Authors:

Youness Lemrabet, Hui Liu and Nordine Benkeltoum

Abstract: Business Process Management (BPM) and Service-Oriented Architecture (SOA) approaches receive considerable attention from scholars and industrial practitioners. Thanks to the three streams of the EMT model (Economy, Methodology, Technology) this paper shows that BPM and SOA approaches are two sides of the same coin, which are different but complementary. This combination enables organisations to improve their agility in the face of uncertainty, complexity and change.

Paper Nr: 235
Title:

A MODEL DRIVEN APPROACH SUPPORTING MULTI-VIEW SERVICES MODELING AND VARIABILITY MANAGEMENT

Authors:

Boutaina Chakir and Mounia Fredj

Abstract: Service Oriented Architecture (SOA) is an architectural paradigm for defining how people, organizations and systems provide and use services to achieve their business goals. Moreover, the growth of information systems increases the need for agility, which implies the ability of a system to adapt to changes in requirements and context of use. Managing variability is considered a new leading-edge concept for improving interoperability and reuse. Indeed, variability refers to the ability of a system to adapt, specialize and configure itself to the context of use. Several proposals have been made in this direction, but they remain immature and incomplete. Consequently, in this paper we propose a model-driven method for managing variability in SOA, based on MDA (Model Driven Architecture). Through MDA, the method enables the automation of a service's realization regardless of the supporting platform. Our representation of variability is based on an extension of SoaML, the upcoming standard for modeling services. In addition, we adopt the separation-of-concerns theory by integrating modeling views to better organize the various modeling artifacts.

Paper Nr: 260
Title:

DOMAIN-SPECIFIC MODELLING APPLIED TO INTEGRATION OF SMART SENSORS INTO AN INFORMATION SYSTEM

Authors:

Jean-Philippe Schneider and Joël Champeau

Abstract: Kopetz (1997) stated that a trend in sensor technology is the development of intelligent sensors, also called smart sensors. The development of such sensors relies not only on the hardware but also on the software, which must therefore meet requirements of low cost and high quality. This paper presents our approach to modelling the software of a smart sensor and generating the code for the embedded real-time application. It also describes how the use of a domain-specific modelling methodology enabled us to achieve a high level of modularity, which will save costs and development time.

Paper Nr: 261
Title:

MAPPING THE EVOLUTION OF RESEARCH ON GLOBAL SOFTWARE ENGINEERING - A Systematic Literature Review

Authors:

Josiane Kroll

Abstract: Studies on Global Software Engineering (GSE) have increased in recent years. However, it is not clear how research in this area has evolved in terms of the topics being investigated. For this reason, the purpose of this paper is to identify areas within Software Engineering (SE) in which studies related to GSE have been published and to indicate possible gaps in this recent research area. We report on a Systematic Literature Review (SLR) conducted from September to November 2010. The main result is a mapping of the evolution of this research field, including what has been investigated.

Paper Nr: 270
Title:

MEASURING THE BUFFER OCCUPATION OF SAP ERP SYSTEM APPLICATIONS

Authors:

Stephan Gradl, Manuel Mayer, Alexandru Danciu and Ramón Escrihuela

Abstract: Enterprise resource planning (ERP) systems form the backbone for the execution, controlling and management of business processes in today’s large companies. Availability and performance of ERP systems are extraordinarily critical for a company, as even short unavailability or reduced throughput can be very costly. As companies evolve, the number and kinds of applications that have to be supported rise, which inherently also increases performance needs. However, determining what makes up these performance needs is critical. The performance of SAP ERP systems strongly depends on the use of buffers for caching database contents. In order to predict the performance of SAP ERP systems, it is necessary to understand and measure the buffer usage of applications running on them. In this work we explain the basic concepts and introduce a method for measuring the buffer usage of SAP ERP applications. The method is illustrated by a case study in which each step of a business process executed on a SAP ERP system is analyzed with respect to its memory usage.

Paper Nr: 282
Title:

AN ADVANCED HYBRID P2P BOTNET 2.0

Authors:

Ta-Te Lu

Abstract: Recently, malware attacks over the Internet by e-mail, denial of service (DoS) or distributed denial of service (DDoS) have become more serious. Botnets have become a significant part of Internet malware attacks. A traditional botnet comprises three parts: the botmaster, command and control (C&C) servers, and bots. The C&C servers receive commands from the botmaster and remotely control the distributed computers. Bots use DNS to find the positions of the C&C servers. In this paper, we propose an advanced hybrid peer-to-peer (P2P) botnet 2.0 (AHP2P botnet 2.0) using Web 2.0 technology to hide the botmaster's instructions in social sites, which are regarded as C&C servers. Servent bots are regarded as sub-C&C servers that fetch the instructions from the social sites. AHP2P botnet 2.0 can evaluate the performance of servent bots, reduce DNS traffic from bots to C&C servers, and make bot actions harder to detect than in IRC-based botnets.

Paper Nr: 293
Title:

BUSINESS PROCESS MODELING AND SOA IN INDUSTRIAL O&M APPLICATION DEVELOPMENT

Authors:

David Hästbacka

Abstract: While striving to increase profits in global competition, companies are trying to improve efficiency and reduce costs by outsourcing and focusing on their core functions. For the operation of industrial plants, this often results in the provision of services even for high-priority activities such as maintenance. Integrating external information systems and service providers into business processes and information workflows brings new challenges to application development, so that the introduction of maintenance services can be supported as efficiently as possible. This paper discusses the approach of applying business process modeling and service-oriented concepts to the development of supporting software applications. Business process modeling is proposed for describing service interactions and information flows, and for serving as a foundation for application development. To satisfy the flexibility required in changing business environments, the applications, represented as services, are composed into executable process workflow orchestrations using standard Internet technologies. To validate the approach, a scenario consisting of a condition monitoring process and an environmental footprint estimator is presented.

Paper Nr: 295
Title:

SOA PRACTICES AND PATTERNS APPLIED IN GLOBAL SOFTWARE DEVELOPMENT

Authors:

Marcelo Zilio Pereira, Jorge Luis Nicolas Audy and Rafael Prikladnicki

Abstract: Prior research has established a relationship between the coordination of software development activities and software architecture in both collocated and distributed projects. Despite the recognized importance of software architecture in the coordination of development activities, it is still unclear how software architects design the architecture of software systems in distributed projects. To better understand this scenario, this paper reports on a qualitative empirical study in which we interviewed software architects to collect information about the software architecture of distributed projects. The information collected exposed the wide adoption of Service Oriented Architecture (SOA), indicating a trend towards the use of this loosely coupled architectural style by companies developing projects with distributed teams. More detailed data collected in follow-up interviews suggested a set of best practices for designing SOA architectures that facilitate the work of project members.

Paper Nr: 297
Title:

CHARACTERIZATION OF CONSULTANT ACTIVITIES IN ERP PROJECTS - A Case Study

Authors:

Emiliano Lutteri and Barbara Russo

Abstract: Enterprise Resource Planning (ERP) systems are very common worldwide, but their implementation and upgrade projects are expensive, complex and often unsuccessful. A great role in their implementation is played by consultancy firms, which provide manpower and qualified knowledge. Often, in the preparation stage, IT managers need to know how the project will be organized, how resources will be distributed and what effort will be required from employees to carry out a successful project. Through a case study, we aim to define a way to measure the business value of the consultancy and to understand whether it is possible to build a model to achieve this result.

Paper Nr: 305
Title:

SUPPORTING BUSINESS MODELING USING MULTIPLE NOTATIONS - A Case Study

Authors:

Moshiur Bhuiyan

Abstract: In this paper, we present a case study that illustrates the use of an approach facilitating and supporting the combined use of i* and BPMN for performing business modeling in a synergistic fashion on a complex project for a large government agency in Australia. We used a constrained development methodology to facilitate this modeling practice. The purpose of this case study is to further demonstrate the applicability of our proposed methodology in a real-world, large-scale industrial project.

Paper Nr: 316
Title:

MANAGING TETRA CHANNEL COMMUNICATIONS IN ANDROID

Authors:

Luis Corral

Abstract: Mobile systems have evolved to a level at which they have to carry out their operations in environments demanding high availability and short response times. An excellent radio transmission protocol is necessary to provide the means to satisfy such requirements. The TETRA standard offers a large number of features designed for safety and emergency use; however, accessing a TETRA network requires a standard-specific device. In this paper we explore the conditions that have to be fulfilled to extend Android's capabilities to access TETRA networks from a general-purpose mobile device, and set a path for future development aimed at accomplishing this.

Paper Nr: 321
Title:

ANALYSIS OF REQUIREMENT CAPTURE BASED ON UML

Authors:

Tao Ni

Abstract: The Unified Modeling Language (UML) is the visual modeling language of current object-oriented development. Many enterprise information system developers are concerned with how to apply this technique to the development of enterprise information management systems. Requirement capture in UML is the first stage of system development. Its purpose is to find the real requirements, suitable for users, customers and developers, which determines the success of the system to be developed. A purchasing management system is a very important part of an enterprise information management system. In order to address common problems in enterprise information system development and maintenance, this paper uses a use-case prototype model and, taking a purchasing management system as an example, expounds how to use UML for requirement capture and analysis.

Paper Nr: 331
Title:

SPECIFICATION-BASED AUTOMATED GUI TESTING

Authors:

Andreas S. Andreou

Abstract: GUI testing is currently one of the most expensive and time-consuming processes in the software life-cycle; according to some estimates, its cost can reach 50 to 70% of the whole cost of a project. This paper proposes a framework for specification-based automated GUI testing which employs a GUI analyzer for dynamic analysis and extraction of GUI object information, a system for automatic test-case generation driven by Spec#, an algorithm that executes test cases automatically, and a verifier that compares the expected with the actual result of each test. Preliminary experimental results demonstrate the efficiency and effectiveness of the framework.

Paper Nr: 334
Title:

CO-DESIGN OF AN ADAPTABLE SYSTEM FOR INFORMATION MANAGEMENT

Authors:

Kecheng Liu and Navid Karimi Sani

Abstract: Information management is a discipline dealing with the creation, communication, utilisation and disposal of information. In the 21st century, the volume of information forces organisations not only to have formal processes for information management, but also to employ effective IT systems to leverage the value of information. This paper introduces a method for improving the process of information management and for co-designing the business process and the IT system that support information management practice in organisations. The method takes into account stakeholders’ roles and responsibilities, and their needs for information provided by the information management processes. By adopting organisational semiotics as the theoretical and methodological foundation, the designed IT system is adaptable to changes in requirements arising from changes in the business environment and in stakeholders’ interests.

Paper Nr: 380
Title:

RESEARCH ON INTERNATIONAL MULTIMODAL TRANSPORT DEVELOPMENT STRATEGY IN CHINA

Authors:

Xin Zang

Abstract: As an advanced form of transport organization, multimodal transport can integrate the advantages of various modes of transport. By means of seamless linking, multimodal transport can improve efficiency and quality and meet individual needs in transportation, which makes it important in the environment of global competition. After nearly four decades of construction and development, China has made remarkable achievements in multimodal transport. However, a great gap still exists between China and developed countries. Currently, the relationship between international multimodal transport and manufacturing is becoming increasingly close; the international multimodal transport network has become a core resource, and shipping companies have continually tried to extend their business inland. Under these circumstances, in order to promote the development of multimodal transport in China, we must actively foster large-scale international multimodal transport operators. Meanwhile, we need to construct an international multimodal transport network, improve the management structure and eliminate institutional obstacles to development. We should make a breakthrough in combined sea and rail transport and promote the development of a network linking land and sea transport. Great efforts should be made to accelerate the application of advanced information technology in international multimodal transport business.

Paper Nr: 391
Title:

MODELING WORK PROCESSES AND SOFTWARE DEVELOPMENT - Notation and Tool

Authors:

Andre L. N Campos and Toacy C. Oliveira

Abstract: Process modeling is becoming more essential to the activities of acquisition and development of systems. There are a number of possible notations and tools for process modeling, and choosing among them is not always easy. Through research in a real environment, this paper tries to identify selection criteria and to recommend the most appropriate notations and tools for process modeling.

Paper Nr: 397
Title:

A FRAMEWORK FOR EFFECTIVE IT INVESTMENT - From the Perspective of Business – IT Alignment and Organization

Authors:

Kayo Iizuka

Abstract: Although studies on the investment effect of information technology (IT) have been progressing, the requirements for effective IT investment are changing due to changes in the conditions and circumstances of many companies, such as diversified organization types (e.g., types of relationship between the IT section and the business planning section) and consideration of the effectiveness of enterprise groups at the international level. In this situation, a methodology for reinforcing the IT investment effectiveness of diversified companies individually (not simply a standardized methodology) might be significant. In order to take up this challenge, the authors propose a framework for effective IT investment from the perspective of business-IT alignment and organization.

Paper Nr: 401
Title:

SCHEME OF AUTHENTICATION OF HEALTH MONITORING SYSTEM BASED ON CREDIT CARD MECHANISM

Authors:

Qiming Huang and Qilei Hao

Abstract: Health monitoring systems are supported not only by local hospitals but also by hospitals in other provinces, which raises the question of how a patient’s health data can be shared among different hospitals. The credit-card mechanism offers an analogy: a user first applies for a credit card with a bank and can then buy goods at any merchant accepting credit cards. Merchants need not establish agreements with each other; they just need a trust relationship with one or a few banks that accept payments from credit-card users and pay the merchants. After designing the logical hierarchical diagram of the health monitoring system, the credit-card mechanism is applied to establish a mutual authentication scheme based on Identity-Based Cryptography (IBC), used by clients in referral care between hospitals of different provinces and between hospitals of the same province. The authentication scheme has been analyzed for its ability to resist counterfeit attacks, location privacy attacks and replay attacks.

Posters
Paper Nr: 45
Title:

EVOLUTION OF ENTERPRISE INFORMATION SYSTEMS - An Industrial Approach

Authors:

Pietro D'Ambrosio

Abstract: The problem of the evolutionary maintenance of large information systems is very critical. The time and costs required to adapt a system to the market's evolutionary dynamics are no longer compatible with business objectives. Small-scale laboratory experimentation with new technologies and methods is ineffective for this type of system, and it is rare that a customer accepts the risk of innovation in an industrial project. In this paper we propose an intervention strategy and a reference architecture to transfer, using a "small steps" approach, innovations obtained from research into new industrial applications.

Paper Nr: 96
Title:

BTRANSFORMER - A Tool for BPMN to CSP+T Transformation

Authors:

Aleksander González, Luis E. Mendoza Morales, Manuel I. Capel and María A. Pérez

Abstract: In any organisation, properties such as scope, structure, deployment, capability, structural consistency and concurrency, which support the critical factors for success in Business Process (BP) modelling, need to be verified. Thus, relevant parts of a BP must be formally specified in an appropriate way. Process Calculi (PC) such as CSP, ACP and CCS, which constitute a mathematical basis for programming reactive, communication-bounded systems, can be used to model critical systems and to verify their correctness properties. PC-based notations can be used to specify business processes (BPs) and to reason about their properties. Without demanding training, however, effective use of these languages is beyond the ability of many business modellers. To cope with this drawback, we propose a set of rules to automatically transform a semi-formal model expressed in the Business Process Modelling Notation (BPMN) into a Communicating Sequential Processes + Time (CSP+T) formal system specification. In this paper, we present the BTRANSFORMER tool, which automatically generates such a formal specification and has been programmed with the ATLAS Transformation Language (ATL). As a result, we obtain a plug-in for the Eclipse platform which is capable of transforming BPMN models designed with Intalio into a text file with the equivalent CSP+T formal specification of the business model.

Paper Nr: 107
Title:

KNOWLEDGE FORMS OF ENTERPRISE AND THE RESEARCH OF ITS IDENTIFICATION AND DISTRIBUTION

Authors:

Z. D. Yu

Abstract: Knowledge has become an important factor in enterprises; under the knowledge economy, its status equals that of material resources and manpower. To an enterprise, knowledge is very abstract; in order to make use of it, we must first know the basic forms of knowledge, and then we can identify the knowledge and learn its distribution rules. On the basis of a summary of knowledge concepts, this thesis surveys the forms of knowledge and gives rules for its identification and distribution.

Paper Nr: 109
Title:

ACCESS ISOLATION MECHANISM BASED ON VIRTUAL CONNECTION MANAGEMENT IN CLOUD SYSTEMS - How to Secure Cloud Systems using High Performance Virtual Firewalls

Authors:

Alexey Lukashin

Abstract: This paper describes an access isolation model based on virtual connection management and proposes a mechanism for traffic filtering in transparent mode, invisible to other components. A new level of complexity in information security tasks has been observed in distributed virtualized systems. The paper proposes a specialized firewall solution for implementing access isolation and information security in hypervisors and in the entire distributed cloud system.

Paper Nr: 155
Title:

TEMPORAL META-DATA MANAGEMENT FOR MODEL DRIVEN APPLICATIONS - Provides Full Temporal Execution Capabilities throughout the Meta-data EIS Application Lifecycle

Authors:

Jon Davis and Elizabeth Chang

Abstract: In this paper we discuss how the application of temporal data management techniques to the atomic elements of a meta-data application model can provide a complete temporal execution capability for meta-data Enterprise Information Systems (EIS) applications by maintaining a perfect synchronisation of historical data and historical application states. Temporal data management is a well-understood field as it applies to the common database. Its application to the meta-data EIS application lifecycle in such a solution would minimise the loss of historical information accessibility currently experienced in most applications, whose logical application functionality and data formats are regularly changed by often irreversible version upgrades.

Paper Nr: 213
Title:

TOURISM INFORMATION SYSTEM FOR INDEPENDENT TRAVELLERS - Based on Information Requirements

Authors:

Ping Yin and Lintao Chen

Abstract: Over the last twenty years in China, people’s leisure time has increased quickly, and tourism facilities and service quality have improved steadily. What differs from earlier tourism development is that independent travelling has become the major style and may take the place of group travelling. Independent travellers have different tourism demands, and tourism information is the most important factor in tourists’ decision-making. On the basis of tourist information collected through a questionnaire, this paper analyses independent travellers’ information demands and designs an information system for them.

Paper Nr: 229
Title:

METHOD FAMILY DESCRIPTION AND CONFIGURATION

Authors:

Deneckère Rébecca

Abstract: The role of variability in software engineering grows steadily, as it allows developing solutions that can be easily adapted to a specific context while reusing existing knowledge. In order to deal with variability in the method engineering (ME) domain, we suggest applying the notion of method families. Method components are organized as a method family, which is configured for a given project. As variability relates to variation points, our proposal is to consider the method family configuration as a decision-making (DM) process. We illustrate our approach with a method family dealing with scenario elicitation.

Paper Nr: 241
Title:

MODEL-BASED FAILURE ANALYSIS OF BUSINESS PROCESS

Authors:

Xiaocheng Ge

Abstract: The success of most organisations is based on how well they can engineer and execute their business processes in order to better manage the extra value these processes can provide. However, business processes may fail to deliver the functions for which they are designed. To help understand how and why a business process may fail, we considered techniques from system safety engineering and integrated them with existing techniques of business process modelling and analysis, proposing a framework for business process failure analysis. The kernel of the framework is an extension of the workflow net of a business process that models failures as coloured tokens, so that the failure behaviours of the business process can be simulated. In addition, we developed a tool based on the code of an open-source project to support the analysis. Applications of the proposed technique have demonstrated that it is a powerful and easy-to-use technique for the management of business processes in large, complex enterprise systems.

Paper Nr: 288
Title:

PROFILING THE EFFORT OF NOVICES IN SOFTWARE DEVELOPMENT TEAMS - An Analysis using Data Collected Non Invasively

Authors:

Ilenia Fronza and Jelena Vlasenko

Abstract: New developers exhibit working patterns that differ from those of existing, experienced software developers. Understanding such patterns may help in determining the actual level of introduction developers have within a company. Moreover, browsing the Internet has achieved a pivotal role: by browsing, people collect valuable information about work issues, and indeed developers spend a significant amount of effort browsing. Models have been proposed and validated for the introduction of novices into companies. This paper analyses one of the most promising of these models, the one proposed by (Fronza et al., 2009), and validates it against browser usage patterns. It appears that the patterns of browser use confirm the proposed model: at the end of the observation period, when according to the model the new developers should have been fully introduced into the working patterns of the company, there is substantial congruence between the working patterns of the new developers and those of the existing developers.

Paper Nr: 289
Title:

VERIFICATION OF THE CONSISTENCY BETWEEN USE CASE AND ACTIVITY DIAGRAMS - A Step Towards Validation of User Requirements

Authors:

Sana Oueslati Ben Amor

Abstract: Requirements elicitation, a step between users and developers, has to be precise and formal. This step requires understanding the requirements to be covered by the system and expressing and formalizing them. For structuring, documenting and analysing user requirements, the UML use case diagram illustrates all functional requirements. In a more advanced step, all functionalities of a system can be represented and detailed by a set of activity diagrams. In our work, requirement validation consists of checking that all requirements are covered by these functionalities. In this paper, we present a requirement validation approach for UML models based on a comparison of the UML use case diagram (requirements) with the activity diagrams (functionalities). This comparison, based on a set of rules, ensures that the use case model and the activity model are consistent. Furthermore, we give an overview of the UML-Validation tool, which automates the application of these rules.
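As a rough illustration of the kind of rule such a comparison may rely on (the specific rule, function name and data shapes below are our own illustrative assumptions, not taken from the paper), one basic consistency check is that every use case in the use case diagram is detailed by at least one activity diagram:

```python
def undetailed_use_cases(use_cases, activity_diagrams):
    """Return the use cases that no activity diagram details.

    use_cases: list of use case names from the use case diagram.
    activity_diagrams: mapping from use case name to its activity diagram
    (here simplified to a list of activity names).
    """
    return [uc for uc in use_cases if uc not in activity_diagrams]

# A consistent model yields an empty list; "Pay order" below is uncovered.
diagrams = {"Place order": ["select items", "confirm"]}
print(undetailed_use_cases(["Place order", "Pay order"], diagrams))  # ['Pay order']
```

A real tool would of course work on parsed UML models rather than plain name lists, and would apply further rules (e.g., matching actors and control flow), but the sketch conveys the rule-based comparison described in the abstract.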

Paper Nr: 319
Title:

THE BRAIDED RECONSTRUCTION THEOREMS

Authors:

Duanliang Liu and Shouchuan Zhang

Abstract: In this paper, we introduce the transmutation method, which turns an ordinary (co)quasitriangular Hopf algebra into a braided Hopf algebra, and give the dual construction.

Paper Nr: 346
Title:

LOGISTICS OPERATION SIMULATION IN BEIJING OLYMPIC GAMES STADIUM

Authors:

Xiaochun Lu and Zheng Ni

Abstract: In 2008, the 29th Olympic Games were held successfully in Beijing. They were the largest-scale Games in history and achieved remarkable results; the experience is worth summarizing. This paper presents statistics and analysis of logistics operation volumes in an Olympic Games stadium. A simulation model of the stadium logistics system, based on discrete events, is developed using the Anylogic software. With the simulation model, logistics resource dispatching is discussed. We find logistics coefficients indicating that 3.2 order sheets per fork truck and 1.6 order sheets per worker are proper workloads. These coefficients can help plan logistics resources readily. The paper is useful for improving the logistics system of a stadium.
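The reported coefficients lend themselves to a simple resource-planning calculation. The sketch below only illustrates how they might be applied (the function name and the ceiling-based rounding are our assumptions, not details from the paper):

```python
import math

# Coefficients reported in the abstract: proper workload per resource unit.
ORDERS_PER_FORK_TRUCK = 3.2  # order sheets per fork truck
ORDERS_PER_WORKER = 1.6      # order sheets per worker

def resources_needed(order_sheets):
    """Estimate (fork trucks, workers) for a given number of order sheets.

    Rounds up, since a fraction of a truck or worker cannot be deployed.
    """
    trucks = math.ceil(order_sheets / ORDERS_PER_FORK_TRUCK)
    workers = math.ceil(order_sheets / ORDERS_PER_WORKER)
    return trucks, workers

print(resources_needed(48))  # 48 order sheets -> (15, 30)
```

For example, a shift expected to handle 48 order sheets would, under these coefficients, be staffed with 15 fork trucks and 30 workers.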

Area 4 - Software Agents and Internet Computing

Full Papers
Paper Nr: 88
Title:

QPF SCHEDULING SCHEME FOR PERFORMANCE IMPROVEMENTS OF INTEGRATED MULTIMEDIA APPLICATIONS OVER 3.5G NETWORK

Authors:

Shin-Jer Yang

Abstract: IMS provides a full-IP technology platform which combines information and communications technologies to achieve integrated multimedia services. In IMS, SIP is introduced to transmit session control signalling. The session transmission time of SIP is affected by wireless channel bandwidth, Frame Error Rate (FER), message exchange volume, the retransmission mechanism and other factors. In order to effectively reduce the session transmission delay, we revise the conventional MAC-hs scheduling algorithm (i.e., PF) and propose a QoS-based Proportional Fairness algorithm, called QPF, to enhance SIP session setup using the channel performance of the wireless communication. We perform simulations comparing the PF and QPF schemes on two KPIs, delay and throughput, using the NS-2 software tool under various multimedia applications such as VoIP, VoD and Web. The simulation results indicate that QPF lowers VoIP delay by 20% and obtains 8.19% more VoIP throughput than PF. For VoD, QPF reduces delay by 11.53% and increases throughput by 8.05%. Also, QPF improves Web delay by 28.58% and enhances Web throughput by 25.25%. Consequently, the proposed QPF can enhance transmission performance and obtain a higher utilization of system resources for various multimedia applications in wireless networks.

Paper Nr: 123
Title:

KCSR: KEYMATCHES CONSTRAINED SECURE ROUTING IN HETEROGENEOUS WIRELESS SENSOR NETWORKS

Authors:

K. Shaila, G. H. Vineet, C. R. Prashanth and V. Tejaswi

Abstract: Wireless Sensor Networks (WSNs) consist of a large number of tiny autonomous devices called sensors, which are vulnerable to security threats. The heterogeneous WSNs considered here consist of two types of nodes, L1 and L2, and employ an Unbalanced Key Distribution Scheme to ensure enhanced security. In this paper, the concept of link strength is utilized to determine the path to be selected for secure routing. Our Keymatches Constrained Secure Routing (KCSR) algorithm provides the flexibility to choose a secure path and then route the data accordingly. In this approach, secure and stable paths are chosen for communication. Simulation results show that the proposed algorithm yields better results in terms of security when compared with earlier works.

Paper Nr: 135
Title:

A CASE STUDY: INTEGRATING A GAME APPLICATION-DRIVEN APPROACH AND SOCIAL COLLABORATIONS INTO SOFTWARE ENGINEERING EDUCATION

Authors:

Weifeng Xu and Stephen Frezza

Abstract: Teaching software engineering to undergraduate students is a challenging task. Students are expected to understand both the technical and the social aspects of software engineering. This paper presents a complete case study of a hybrid approach that systematically combines a game application-driven approach and social collaborations in the software engineering curriculum at the undergraduate level. The case study consists of 1) proposing a new curriculum design process, 2) identifying a set of software engineering principles, practices, and online collaborative learning tools by following the design process, 3) proposing a semester-long game project, 4) integrating the principles, practices, and collaborative learning tools into the game development process and 5) delivering the principles, practices, and tools to students during game development. The results of the case study, including analysis of the related project documentation and students’ feedback, indicate that adopting the game application-driven approach motivates students to learn in teams, helps transfer knowledge effectively between instructors and students, and facilitates achieving the student learning objectives.

Paper Nr: 140
Title:

CONFLICT MANAGEMENT PROCESS FOR VIRTUAL COMMUNITIES

Authors:

Juliana de Melo Bezerra and Celso Massaki Hirata

Abstract: Through participation in collaborative tasks in virtual communities, members can express divergences during discussions, which characterizes conflicts. Conflicts can contribute positively to creativity, innovation and the quality of decisions. However, if not managed, conflicts can negatively impact community performance and members’ satisfaction. We propose a conflict management process for virtual communities. The process is useful both for designing new virtual communities, as it allows conflict management mechanisms to be built in, and for improving the mechanisms of existing virtual communities by correctly addressing the causes and consequences of conflicts.

Paper Nr: 160
Title:

KOI SCHOOL - Towards the Next Level of Communication, Organization and Integration in Education

Authors:

Jonas Schulte and Reinhard Keil

Abstract: Information technology plays a central role in the early school education of pupils, since the information society is continuously changing and developing. Facebook, Twitter, the iPhone, Flickr, blogs, YouTube, and Google Earth are just some of the new technologies that have bombarded us from all directions since the genesis of Web 2.0. Although there is a broad offering of educational software, an integrated solution for continuous support does not exist. In this paper, we present KOI, a collaborative school management system that combines typical administration tasks with document management and social networking features. In our opinion, modern school software has to comprise both educational software and those new technologies that are particularly popular among pupils. Thereby, private interests (e.g., social networking or instant messaging) and functions supporting schooling can be combined.

Paper Nr: 202
Title:

NEW MECHANISM DESIGN IN THE C2C ONLINE REPUTATION EVALUATION OPTIMIZING

Authors:

Yi Yang

Abstract: Under the assumption of traders’ bounded rationality, our research analyzes the defects of the current reputation evaluation mechanism and the trust problem in online trade. The mechanism is redesigned in three aspects: the evaluation process, the accounting method for the reputation limit, and the reputation rating scores; a "tell the truth" system is then added to the mechanism to improve its incentive function. In order to prove the effectiveness of the new mechanism, an algorithmic example is given based on a sequential game analysis and the Harsanyi transformation of the boundedly rational trader’s decision-making. The results show that the new mechanism can effectively provide decision-making information for the trading parties in the process of reputation evaluation, and encourages both parties to select the "tell the truth" strategy, achieving the maximum reputation score.

Paper Nr: 234
Title:

TOWARDS AN EXPLANATORY MODEL OF eMARKETPLACES UTILIZATION - A Case Study of Saudi Arabia

Authors:

Fahad Algarni

Abstract: Many aspects of ICT, such as the use of smart cards, mobile phones, and the internet, have become integrated into business operations today, becoming indispensable to organizations. Recent developments have seen the introduction of eMarketplaces, which are virtual spaces where businesses and consumers can interact and exchange goods and services. Whilst the utilisation of eMarketplaces in many regions of the world, such as North America, Europe, and Asia, is increasing, their adoption and utilisation in Saudi Arabia has been very slow. The aim of this paper is to investigate the current lack of utilisation of eMarketplaces in Saudi Arabia. A comparison was made between the utilisation of eMarketplaces in Saudi Arabia and other parts of the world. The statistical data collected show that utilisation of eMarketplaces in Saudi Arabia is the lowest. Possible explanations were identified as weak ICT infrastructure in the country, a weak technological culture, undeveloped complementary services, and a lack of investment by the government. Several strategies that can be used to address this problem are identified. An explanatory model of eMarketplace utilisation is proposed, with suggestions for further work in this area.

Paper Nr: 254
Title:

MULTIPLE MOBILE SYNCHRONISED SINKS (MMSS) FOR ENERGY EFFICIENCY AND LIFETIME MAXIMIZATION IN WIRELESS SENSOR NETWORKS

Authors:

H. Sivasankari, Vallabh M. and Shaila K.

Abstract: Wireless Sensor Networks (WSNs) consist of battery-operated sensor nodes, and improving the lifetime of a sensor network is a critical issue. Nodes closer to the sink drain energy faster due to the large volume of data transmitted towards the sink. This problem can be alleviated by sink mobility: a mobile sink moves to particular positions in a predetermined order to collect data from the sensor nodes. However, a single mobile sink incurs considerable delay. In this paper we use multiple mobile sinks that collect data in different zones and then coordinate to consolidate and process the data received from all the sensor nodes. A distributed algorithm synchronizing all the mobile sinks is used to reduce the delay in consolidating data and to reduce the overall energy consumption; this twin gain increases the lifetime of the Wireless Sensor Network. Simulation results using Multiple Mobile Synchronized Sinks clearly show an increase of 28% and 56% in the lifetime of the network in comparison with a single mobile sink and a static sink, respectively.

Paper Nr: 345
Title:

RESEARCH ON GRID-BASED MOBILE BUSINESS PROCESS AND SIMULATION

Authors:

Dan Chang and Li Si

Abstract: Since the emergence of mobile commerce, there has been much research and practice on improving wireless communication and security technology (Hull, 1997); however, research that integrates wireless technology with the business processes of existing e-commerce is still in its early stage, lacking systematic analysis and theoretical support regarding information sharing, business collaboration, and effective access from mobile devices in practice. In this paper, mobile business processes are the research object. On the basis of surveying and analyzing current mobile business processes, we apply grid management theory to construct a grid-based mobile business process. Furthermore, a quantitative simulation is performed on non-grid and grid-based mobile business processes in order to demonstrate the superiority of the grid-based approach.

Short Papers
Paper Nr: 53
Title:

AN E-BUSINESS FRAMEWORK DESIGN USING ENHANCED WEB 2.0 TECHNOLOGY

Authors:

Karim Ouazzane and Muhammad Sajid Afzal

Abstract: The main aim of this research is to develop a robust, reliable, efficient and novel framework using Web 2.0 technology that will serve as a front-end and middleware collaboration model between data persistence logic and operational requests. This framework will serve as a mediation platform for request brokers. It will provide a high level of abstraction by encapsulating low-level details of the system, such as request handling, request mediation, response handling and service loading. It also aims to overcome the problems of hard-coded service-to-interface mapping, non-customizable business logic, and the absence of generic, customizable workflows. Solving these problems is essential for swiftly moving Small and Medium Enterprises onto a single e-business platform.

Paper Nr: 106
Title:

DYNAMIC CONTEXT MODELING BASED FCA IN CONTEXT-AWARE MIDDLEWARE

Authors:

Zhou Zhong

Abstract: This paper presents a proposal for dynamic context modeling for mobile objects in the Generalized Adaptive Context-Aware Middleware (GaCAM). Context changes are closely tied to the different sensors present in a mobile environment, so common data modeling, in which data structures are fixed once the models are designed, is not adaptable. The principal task is dynamic context modeling when mobile entities holding context models meet each other or enter a new environment, and it is important to consider the differences between dynamic context modeling and common data modeling. Our proposal models context by merging Formal Concept Analysis (FCA) structures. The approach creates a merged specialization/generalization hierarchy that captures the knowledge of context sources in the mobile environment. The hierarchy not only describes both context concepts and the related sensors, but also supports higher-level context reasoning and the activation of potential context events.
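The FCA machinery this abstract relies on can be illustrated with a toy formal context. The sensors, attributes, and helper names below are hypothetical and not taken from GaCAM; this is a minimal sketch of how formal concepts (extent/intent pairs) are enumerated from an object-attribute table:

```python
from itertools import combinations

# Toy formal context mapping hypothetical sensors (objects) to the
# context attributes they provide. These names are illustrative only.
CONTEXT = {
    "gps":         {"location", "outdoor"},
    "wifi":        {"location", "indoor"},
    "thermometer": {"temperature", "indoor"},
}
ALL_ATTRS = set().union(*CONTEXT.values())

def intent(objects):
    """Attributes shared by every object in the set (all attrs for the empty set)."""
    attrs = ALL_ATTRS
    for o in objects:
        attrs = attrs & CONTEXT[o]
    return attrs

def extent(attrs):
    """Objects possessing every attribute in the set."""
    return {o for o, a in CONTEXT.items() if attrs <= a}

def formal_concepts():
    """Enumerate all formal concepts: (extent, intent) pairs that close on each other."""
    concepts = set()
    objs = list(CONTEXT)
    for r in range(len(objs) + 1):
        for combo in combinations(objs, r):
            a = intent(combo)   # shared attributes of the chosen objects
            e = extent(a)       # closure: all objects sharing those attributes
            concepts.add((frozenset(e), frozenset(a)))
    return concepts

for e, a in sorted(formal_concepts(), key=lambda c: (len(c[0]), sorted(c[0]))):
    print(sorted(e), "->", sorted(a))
```

The specialization/generalization hierarchy the abstract mentions is the lattice obtained by ordering these concepts by extent inclusion; merging two contexts amounts to combining their object-attribute tables and recomputing the concepts.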

Paper Nr: 116
Title:

COST EARNINGS ANALYSIS ON AQUATIC PRODUCT E-COMMERCE

Authors:

ZhuoFan Yang and ShuHua Ren

Abstract: With the development of e-commerce in global trade, e-commerce has brought substantial profits, so it is worthwhile to introduce e-commerce into Chinese aquatic enterprises. From the angle of cost and earnings analysis, this article discusses the feasibility of e-commerce for Chinese aquatic enterprises. We conclude that e-commerce can lower purchasing and trading costs while correspondingly increasing the switching cost. In the early stage of e-commerce development the switching cost is very high, but as e-commerce develops, the switching cost will drop while the earnings of aquatic product trade rise.

Paper Nr: 181
Title:

MODEL-DRIVEN APPROACH FOR USER-CENTRIC MASHUPED SOA

Authors:

Meriem Benhaddi

Abstract: The Mashup, a new Web 2.0 technology, has emerged as a new way to promote and enable the End User Development approach. In fact, as underlined by (Boris Büchel et al., 2009), the Mashup targets the inexperienced end user and allows him to develop his own applications. The Service Oriented Architecture (SOA) is enhanced and made user-centric via the Mashup, which allows end users, without any technical skills or advanced knowledge of SOA, to compose services. However, mixing services with Mashups produces fragile and unstable solutions; hence the need to convert the Mashup solution into BPEL, combining the ease of composition of Mashups with the strength and security of a BPEL engine. In Model Driven Development, an essential idea is to automatically transform models from one modelling domain to another. In this paper we present a new approach based on the Model Driven Development paradigm to transform the SOA composition logic from a Mashup script into a BPEL script.

Paper Nr: 193
Title:

RESEARCH ON THE EVOLUTION MECHANISM OF ECOSYSTEM OF CYBER SOCIETY BASED ON THE HAKEN MODEL

Authors:

Xiaolan Guan and Zhenji Zhang

Abstract: The cyber-society is a new social form that derives from the emergence of computer and Internet technology, the representatives of information technology. It is similar to an ecological system: its internal structure is hierarchical and organized according to a certain structure, and self-organization is an important mechanism of its evolution. Based on the theory of self-organization and on cross-disciplinary analysis, this paper analyzes the self-organization conditions of the ecosystem of cyber-society, constructs an evolution model based on the Haken model, and then analyzes the functions of competition and coordination between the system elements and the process by which the order parameter comes to dominate. Finally, it takes the Internet development measurement data of 27 provinces and cities, including Beijing, Shanghai, Shanxi, and Tianjin, in 2006 and 2009 as samples and analyzes the evolution process of China's ecosystem of cyber-society. The results indicate that technology innovation is the dominant order parameter in the evolution of the ecosystem of cyber-society and a key factor in its development in China; the paper then puts forward some ideas and suggestions for the development of China's ecosystem of cyber-society.

Paper Nr: 197
Title:

PROFIT POINT OF WEB 2.0

Authors:

Juchong Liu

Abstract: Web 2.0, with its social, free, and open characteristics, has brought the Internet a profound evolution in less than a decade through the growing application of Web 2.0 elements. In academia, most research focuses on the concept, characteristics, and theoretical foundations of Web 2.0. This paper first reviews the literature on Web 2.0 research, introducing its definition, theoretical basis, features, and business models; it then studies the "Teainchina" case to demonstrate a real application of Web 2.0 elements; ultimately, it concludes that while the business model is truly important, a site's profitability is even more vital, putting forward the notion of the Profit Point and analysing it. Internet companies should deliberate on website design under the theory of Web 2.0 to identify the Profit Point, and then carry out promotion and marketing of the website in order to manage their internet business well.

Paper Nr: 203
Title:

RESEARCH ON THE COORDINATED DEVELOPMENT OF ECOSYSTEM OF CYBER-SOCIETY AND ITS MEASUREMENT

Authors:

Honglu Liu

Abstract: Similar to a general ecosystem, the cyber-society is composed of a number of interrelated and interacting elements. Therefore, to promote the healthy, orderly, and rapid development of the overall ecosystem of cyber-society, we must ensure coordinated development among the various system elements and keep their structures and functions in a dynamic balance of mutual adaptation and coordination. Using theories from Ecology and Synergetics, this paper treats the cyber-society as a typical ecological system, first puts forward the concept of "coordinated development of the ecosystem of cyber-society", systematically studies its content, requirements, and measurement method, and then applies it to the development of China's ecosystem of cyber-society. It aims to provide new theories, ideas, and methods for understanding, regulating, and developing the cyber-society, and to offer forward-looking research and exploration towards realizing its coordinated development.

Paper Nr: 220
Title:

AN EVALUATION MODEL FOR METAMARKET ORIENTED WEBSITES

Authors:

Emad Farazmand

Abstract: A website is an important part of e-business. Much research has been done on website design and evaluation, but most of it focuses on the technical aspects of websites, neglecting the importance of evaluating a website against business requirements. This research tries to determine a suitable set of criteria for website evaluation in the field of metamarkets. To this end, the metamarket and website elements are first discussed. We then select some metamarket websites and examine the website elements in each of them. Finally, we draw conclusions about a suitable set of criteria and an evaluation model for metamarket websites, noting that some criteria are much more important than others, and give recommendations for those who want to establish a metamarket website.

Paper Nr: 286
Title:

AN EFFICIENT COOPERATIVE CACHE APPROACH IN MOBILE INFORMATION SYSTEM

Authors:

Thu Tran Minh Nguyen and Thuy Thi Bich Dong

Abstract: Data availability in mobile information systems is lower than in traditional information systems due to the limitations of wireless communication, such as low bandwidth, instability, and frequent disconnection. Cooperative data caching is an attractive approach to this problem because it allows the cached data to be shared and coordinated among multiple mobile users in the network, improving both system performance and the accessibility of data items. However, without a good cooperative caching approach, the cache hit ratio and response time may suffer. In this paper, we propose an efficient approach for cooperative caching in mobile information systems, namely MIX-GROUP. We evaluate it experimentally on datasets using the NS2 simulator and demonstrate its effectiveness by comparing it with existing approaches such as COCA and GROUPCACHING. The experimental results show that MIX-GROUP is more effective than these other approaches.

Paper Nr: 304
Title:

QUALITY OF SERVICE OPTIMIZATION OF WIRELESS SENSOR NETWORKS USING A MULTI-OBJECTIVE IMMUNE CO-EVOLUTIONARY ALGORITHM

Authors:

Xing-Jia Lu and Zhi-Rong Chen

Abstract: Quality of Service (QoS), the performance level of a service offered by a wireless sensor network (WSN) to its users, is an important topic in WSNs; its goal is to achieve more deterministic network behavior. QoS in WSNs is an extension of the multi-objective optimization problem, modelled here as an optimization model constrained by network connectivity. The QoS must satisfy multiple objectives such as energy consumption, bandwidth, delay jitter, and packet loss rate. In order to search for optimal QoS solutions in WSNs, we propose a multi-objective immune co-evolutionary algorithm (MOICEA). The MOICEA is inspired by the biological mechanisms of immune systems, including clonal proliferation, hypermutation, co-evolution, immune elimination, and memory. The affinity between antibody and antigen is used to measure the optimality of QoS solutions, and the affinity among antibodies is used to evaluate population diversity and to guide the population evolution process. To examine the effectiveness of the MOICEA, we compare its performance with that of a genetic algorithm (GA) on the four objectives while maintaining network connectivity. The experimental results show that the MOICEA obtains promising performance in efficiently searching for optimal solutions compared with other approaches.
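Any multi-objective optimizer of this kind rests on a Pareto-dominance comparison across the objectives (energy, bandwidth cost, delay jitter, packet loss). The sketch below is the generic dominance test and non-dominated-front filter, with made-up objective vectors; it is not the MOICEA algorithm itself:

```python
# Generic Pareto machinery for minimization objectives; the four components
# of each vector stand for (energy, delay jitter, packet loss, bandwidth cost).
# The candidate vectors are invented for illustration.

def dominates(a, b):
    """a dominates b if a is no worse in every objective and strictly better in one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(solutions):
    """Keep only the solutions not dominated by any other solution."""
    return [s for s in solutions
            if not any(dominates(o, s) for o in solutions if o is not s)]

candidates = [
    (3.0, 0.5, 0.02, 1.0),  # balanced
    (2.5, 0.9, 0.05, 1.2),  # low energy, worse jitter
    (3.5, 0.4, 0.01, 0.9),  # low jitter/loss, higher energy
    (4.0, 0.6, 0.03, 1.1),  # worse than the first vector in every objective
]
print(pareto_front(candidates))
```

The first three vectors trade off against each other and survive; the fourth is dominated and is filtered out, which is exactly the pruning an immune or genetic multi-objective search performs each generation.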

Paper Nr: 337
Title:

BRIEF ANALYSIS ON PROFITS ALLOCATION OF IOT INDUSTRY CHAIN

Authors:

Sun Qin and Lu xi-yan

Abstract: The development of the Internet of Things (IOT) is in its infancy in China, and many key technologies and core problems of the industry chain remain unresolved; among them, how to allocate profits among the companies on the industry chain so as to promote the development of the whole industry is an important one. This paper analyzes the IOT industry chain under the model dominated by the Chinese telecom operator, puts forward basic principles for the distribution of benefits, and, using the Shapley-value model, proposes its own view on how the interests of all parties along the whole industry chain should be distributed.
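The Shapley-value allocation the paper applies can be sketched on a toy three-party chain. The coalition values below (operator O, device maker D, service provider S) are invented for illustration and do not come from the paper:

```python
from itertools import permutations

# Hypothetical coalition values (arbitrary profit units) for a toy IOT chain:
# operator (O), device maker (D), service provider (S). Illustrative only.
v = {
    frozenset(): 0,
    frozenset("O"): 10, frozenset("D"): 6, frozenset("S"): 4,
    frozenset("OD"): 24, frozenset("OS"): 18, frozenset("DS"): 14,
    frozenset("ODS"): 36,
}

def shapley(players, v):
    """Average each player's marginal contribution over all join orders."""
    phi = {p: 0.0 for p in players}
    orders = list(permutations(players))
    for order in orders:
        coalition = frozenset()
        for p in order:
            phi[p] += v[coalition | {p}] - v[coalition]  # marginal contribution
            coalition = coalition | {p}
    return {p: phi[p] / len(orders) for p in players}

print(shapley("ODS", v))
```

Each party receives its marginal contribution averaged over all orders in which the coalition could form, and the allocations sum exactly to the grand-coalition value (efficiency), which is the fairness property that motivates using the Shapley value for profit allocation along a chain.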

Paper Nr: 338
Title:

A REVIEW OF ONLINE CONSUMER BEHAVIOR RESEARCH

Authors:

Ling-yu Tong and Qin-jian Yuan

Abstract: As online consumer behavior receives more and more attention from both managers and academic researchers, it is urgent to understand the whole landscape of the research field. To this end, this paper, based on the knowledge-metric technique of mapping knowledge domains, reviews the literature of online consumer behavior research from three perspectives: the development of the research around the world, the theoretical foundations of the research, and its focal topics, providing some insights for future studies.

Paper Nr: 341
Title:

AN INTEGRATED FRAMEWORK FOR THE IMPLEMENTATION AND CONTINUOUS IMPROVEMENT OF THE ROMANIAN SPATIAL DATA INFRASTRUCTURE

Authors:

Cristina Oana

Abstract: This paper provides an overview of the first version of the Romanian NSDI GeoPortal. Geospatial data availability, interoperability, and integration remain problems for current spatial data infrastructures (SDIs). The Romanian INIS GeoPortal, a special IT project of ANCPI, aims to address those challenges. This online GeoPortal application enables easy, open, seamless, on-demand discovery, access, retrieval, visualization, and analysis of distributed geospatial data, information, applications, and web services from any member of the NSDI INIS Council. Challenges remain to be resolved, but a mature next release of the Romanian INIS GeoPortal will produce tremendous benefits for Romanian society and for the implementation of our National Spatial Data Infrastructure.

Posters
Paper Nr: 105
Title:

RECOGNIZE TRUSTWORTHY WEB SERVICES VIA INSTITUTIONS

Authors:

Han Jiao

Abstract: The emergence of web services, which provide flexible but standard methods for heterogeneous entities to interact with each other, has tremendously revolutionized the communication mode on the Web. However, due to the Web's large scale and anonymity, it is difficult for web service users to determine the trustworthiness of web services. This paper introduces a new concept, the web service institution, on which we base a proposed web service trust determination framework. We discuss the necessity of web service institutions, which lead to a win-win situation for all. We devise the core trust computation logic, which takes both institution-level and service-level factors into account, and we also discuss several typical use cases.

Paper Nr: 126
Title:

AN ANALYSIS OF RULES-BASED SYSTEMS TO IMPROVE SWRL TOOLS

Authors:

Adriano Rivolli

Abstract: The Semantic Web has renewed a growing interest in rule-based software systems and their development. The Semantic Web Rule Language (SWRL) is a rule language that enables Horn-like rules to be combined with Web Ontology Language (OWL) knowledge bases to provide even more expressivity. However, as rule-based web systems mature, the number of rules they use grows, making them difficult to manage; developers face problems when trying to understand and control the large rule sets they create. To address this problem, techniques and tools are necessary to organize, view, and create new rules as part of a large rule set. This work presents strategies and techniques developed to improve SWRL tools, based upon a survey of rule tools, a study of the state of the art, and the analysis of representative rule sets.

Paper Nr: 133
Title:

OPTIMISED MODEL OF INFORMATION TRANSFER IN VIRTUAL ENTERPRISES IN CLOUD COMPUTING ENVIRONMENT

Authors:

Rongqian Ni and Runtong Zhang

Abstract: Information transfer within a virtual enterprise is crucial to achieving effective integrated management of virtual enterprises. We apply cloud computing to information transfer in virtual enterprises, and the resulting model is simulated in JSP.

Paper Nr: 146
Title:

CONSTRUCTING WEB ONTOLOGIES INFORMED BY SEMANTIC ANALYSIS METHOD

Authors:

Júlio Cesar dos Reis

Abstract: In the context of Semantic Web (SW) research, recent proposals have explored new approaches for a more precise representation of meaning. These proposals attempt to model information more adequately while remaining compatible with SW standards. This paper proposes heuristics for deriving an initial Web ontology (WO) from the Ontology Charts (OCs) produced by the Semantic Analysis Method (SAM).

Paper Nr: 159
Title:

TRANSFORMATION OF OBJECT-ORIENTED CODE INTO SEMANTIC WEB USING JAVA ANNOTATIONS

Authors:

Petr Ježek and Roman Mouček

Abstract: This paper deals with difficulties occurring during the transformation of schema and data from object-oriented code to a semantic web representation (RDF, OWL). The authors describe differences in semantic expressivity between the object-oriented approach and the semantic web approach and look for ways to fill this semantic gap. Some existing approaches and their difficulties are then introduced, and a preliminary idea using Java annotations is proposed: Java annotations add the missing semantic information to Java code, which is then processed by the proposed framework and serialized into an output semantic web structure (OWL).

Paper Nr: 358
Title:

ENHANCING RECOMMENDER SYSTEMS DEVELOPMENT WITH HUMAN, SOCIAL, CULTURAL AND ORGANIZATIONAL FACTORS

Authors:

Andreas S. Andreou and Stephanos Mavromoustakos

Abstract: Recommender Systems (RS) aim at suggesting filtered Web information adapted to the needs or interests of users by predicting their access behavior using a certain strategy or algorithm. The creation of an RS is usually approached with a focus on user behavior modeling, while the recommendation engine often neglects critical, non-technical aspects of software systems development. Conceiving an RS primarily as a self-contained Web application or as part of one, the present paper utilizes the SpiderWeb methodology and takes into account important requirements that result from human, social, cultural, and organizational (HSCO) factors so as to drive the RS development activities.

Area 5 - Human-Computer Interaction

Full Papers
Paper Nr: 51
Title:

USER ACCEPTANCE OF SOCIAL SHOPPING SITES - Social Comparison and Trust

Authors:

Jia Shen and Lauren Eder

Abstract: This paper describes a study on user acceptance of an emerging e-commerce technology: social shopping websites. Leveraging the power of social networking technologies with online shopping, social shopping sites have emerged in recent years to address the fundamental nature of shopping as a social experience. Despite tremendous business interest and anticipated potential benefits, some central issues remain such as whether users will adopt such websites and the factors that affect the adoption. Incorporating social science theories, this study extends the Technology Acceptance Model (TAM) with social factors such as an online shopper’s tendency to social comparison, and trust in information privacy and data security. Results provide significant support of the extended model. Directions for future research are discussed.

Paper Nr: 164
Title:

AUGMENTING ACCESSIBILITY IN SOCIAL NETWORKS - A Virtual Presenter

Authors:

Leonelo D. A. Almeida, Elaine C. S. Hayashi, Júlio C. Reis and Paula D. P. Costa

Abstract: Vila na Rede is an Inclusive Social Network (ISN) system developed as a product of the e-Cidadania project, with the objective of being accessible to a broad variety of users, including those less familiar with technology or with low literacy skills. Different features were incorporated into Vila na Rede in order to provide its users with scaffolded learning support that helps them profit more from the system. One of these features is the Virtual Presenter, a speech-synchronized facial animation system integrated with a text-to-speech synthesizer (TTS) that allows users to have textual information alternatively presented by a talking head. This paper investigates the integration of the Virtual Presenter into the Vila na Rede system and discusses the results of activities conducted with the target audience to evaluate this new feature in the ISN.

Paper Nr: 185
Title:

VALUATION FRAMING FOR SOCIAL SOFTWARE - A Culturally Aware Artifact

Authors:

Roberto Pereira and M. Cecilia C. Baranauskas

Abstract: Despite the popularity of the so-called social software, just a small fraction of the systems launched on the Web is really successful. The diversity of users, their limitations, preferences, values and culture, are examples that indicate the complexity of developing this kind of system; moreover there is still a lack of approaches, artifacts and methods for supporting designers to deal with this complexity. This paper presents an artifact specially adapted to support designers in the task of evaluating social software, taking values and cultural issues into account. It draws on Organizational Semiotics and on building blocks of culture to shed light on this research area. The artifact was applied to the evaluation of five different prototypes of systems for supporting cross-cultural collaboration, and the results demonstrate the viability of using this artifact for supporting the evaluation as well as the design of social software.

Paper Nr: 225
Title:

ELECTRONIC GOVERNMENT IN BRAZIL - Evaluating Brazil Initiative

Authors:

Giovanni Bogea Viana and Maria Beatriz Felgar de Toledo

Abstract: This article presents an overview of a major e-government site in Brazil, the Transparency Portal, and compares it with electronic government sites of some other countries in order to assess the degree of accessibility of each site. To obtain the results, the validation tools ASES, DaSilva, and TotalValidator were used to evaluate the sites against e-MAG, WCAG v1, and WCAG v2. A survey with entities, NGOs, and ordinary users is also presented, aimed at evaluating the Transparency Portal according to criteria such as navigation and ease of use. These assessments will serve as suggestions for improving the Brazilian site, making it easier to use and accessible to a greater number of citizens, regardless of educational level or specific needs.

Paper Nr: 240
Title:

UNDERSTANDING HCI ISSUES OF BROWSER GAME PLAYING IN CHINA - An Empirical Study

Authors:

Fan Zhao and Qingju Huang

Abstract: Online browser games are becoming one of the most promising and lucrative growth markets, and a comprehensive understanding of why players adopt them is therefore essential. A growing body of research into the experience and effects of online games indicates that the enjoyment of playing is a complex, dynamic, and multifaceted phenomenon. Based on the Technology Acceptance Model, Flow, the Theory of Planned Behavior, and social role theory, this paper proposes an integrated framework that explains player behavior towards the adoption of online browser games. A survey was conducted to evaluate the research model. The results indicate that increasing consumers' perceived ease of use, flow experience, social norms, attitude, perceived behavioral control, subjective norms, critical mass, descriptive norms, perceived enjoyment, and relaxation, and providing players with low access costs, would improve their acceptance of online browser games.

Paper Nr: 249
Title:

UNDERSTANDING MEDIA VIOLENCE FROM A ROLE-PLAY PERSPECTIVE - Effects of Various Types of Violent Video Games on Players’ Cognitive Aggression

Authors:

Younbo Jung

Abstract: In this study I investigated the effects of depicted character roles and wishful identification with the main character on aggression and game enjoyment among players in violent video games. The results (n = 36) showed that character roles (e.g., the police, gangster, and athlete) did not have any significant effect on post-game aggression. However, there was a significant association among depicted character roles, wishful identification, and game enjoyment. Implications in terms of the level of aggression, identification with the game character, and game enjoyment are discussed.

Short Papers
Paper Nr: 33
Title:

STUDENTS’ ACCEPTANCE OF E-GROUP COLLABORATION LEARNING

Authors:

Adam Marks

Abstract: Online students in higher education increasingly use electronic group collaboration learning tools such as discussion forums, blogs, wikis, and journals within their course environment. This study discusses some of these new online group-collaboration tools and the extent to which they are being used. It also investigates learners' level of acceptance of these tools. The findings describe the number and type of electronic group collaboration tools most preferred by online students, and the reasons behind their preference.

Paper Nr: 165
Title:

EVALUATION OF CROSS-CULTURAL WEB INFORMATION SYSTEM DESIGN GUIDELINES

Authors:

Gatis Vitols

Abstract: This article summarizes, analyzes, and evaluates a selected set of guidelines for cross-cultural web information system design. Globalization affects various fields of human activity such as communication, business, and travel. In business, many companies search for a way to expand their services globally, and one of the main support tools for a global business is an information system that functions online and instantly serves information to people from various parts of the world; such a system is often called a cross-cultural web information system. In this paper, the hypothesis is put forward that some of the guidelines provided for cross-cultural web information system designers are not appropriate for application in existing or future projects. Guidelines are selected based on certain criteria. Evaluations of the most visited web information systems by users from Japan and Latvia are performed by researchers from both countries, and the compliance of the evaluation results with the summarized guidelines is reviewed.

Paper Nr: 200
Title:

THE DESIGN OF ADAPTIVE INTERFACES FOR ENTERPRISE RESOURCE PLANNING SYSTEMS

Authors:

Akash Singh and Janet Wesson

Abstract: Research has shown that adaptive user interfaces (AUIs) can improve the usability of software systems. Enterprise Resource Planning (ERP) systems are an example of complex software systems that suffer from several usability issues, yet limited research has been published on the applicability of AUIs to the ERP domain. This paper aims to address this limitation by discussing the design of AUIs for ERP systems. A secondary contribution is the proposal of an adaptive taxonomy for ERP systems. The application of the proposed taxonomy and design is demonstrated through the implementation of an AUI for an existing ERP system, namely SAP Business One (SBO). The AUI was designed to address several of the usability issues identified in SBO, specifically efficiency and learnability.

Paper Nr: 237
Title:

A METHOD PROPOSAL FOR IMPLEMENTING ACCESSIBILITY IN DESKTOP APPLICATIONS FOR VISUALLY IMPAIRED USERS

Authors:

Livia Cristina Gabos Martins and Bruno Elias Penteado

Abstract: Currently, little is said about accessibility in desktop applications. In the case of this study, problems related to the application's structure, which has the characteristics of legacy software, bring challenges that hinder the implementation of accessibility. This article shows how accessibility can be implemented by applying the concepts of web standards to desktop applications, addressing factors such as controlling the events of Flash components to make data accessible to screen readers, and communication between the user interface and business layers that takes information accessibility into account through the use of the MSAA technology.

Paper Nr: 281
Title:

WEB INTERFACE FOR SEMANTICALLY ENABLED EXPERTS FINDING SYSTEM

Authors:

Abramowicz Witold, Bukowska Elżbieta and Dzikowski Jakub

Abstract: In this paper, we address the issue of designing an interface for a semantic-based expert finding system that allows for the creation of sophisticated queries in a user-friendly manner. The designed interface supports a structured way of building queries and the usage of semantic concepts, and facilitates the process of defining complex logical structures. We also discuss the procedure and the outcomes of the interface usability evaluation.

Paper Nr: 284
Title:

ACCESSIBILITY OF GROUP SUPPORT FOR BLIND USERS

Authors:

Yuanqiong Wang and John Schoeberline

Abstract: Group support applications are widely used in the workplace. Unfortunately, persons who are blind often find it difficult to access such applications, due to their highly graphical nature, which hinders their ability to contribute to the group. As a result, persons who are blind often face problems gaining and retaining employment. This paper presents preliminary results of a series of focus group studies conducted in the mid-Atlantic region on accessibility and usability issues of group support applications. It discusses how persons who are blind utilize group support applications to support their group tasks; the tasks and steps utilized to complete a group project; and the accessibility and usability issues experienced by blind users. Additionally, the focus group studies identified the reasons persons who are blind discontinued utilizing group support applications; the other tools utilized to support group work; the accessibility design considerations; and the accessibility documentation and support needed.

Paper Nr: 285
Title:

RIGHT MOUSE BUTTON SURROGATE ON TOUCH SCREENS

Authors:

Jan Vanek and Bruno Jezek

Abstract: The pervasiveness of computer systems is largely determined by their ease of use. Touch screens have proven to be a natural interface with strong sensorimotor feedback. Although multi-touch technologies are ever more popular, single-touch screens are still often preferred. They are cheaper and they map directly to pointing devices such as computer mice, thus requiring no software modifications. Therefore, they can easily be integrated into existing systems with WIMP interfaces, e.g. MS Windows based systems. Applications in such systems often rely on the user pressing the right mouse button to open context menus and the like. Since single-touch screens register only one touch at a time, different methods are used to allow a user to determine the outcome of the touch. The paper proposes a new interaction scheme for this purpose and an algorithm to detect it.

Posters
Paper Nr: 46
Title:

DESIGNING A SITUATIONAL 3D VIRTUAL LEARNING ENVIRONMENT TO BRIDGE BUSINESS THEORY AND PRACTICE - The Role of Scaffolding

Authors:

Shwu-Huey Wang and Yufang Cheng

Abstract: The objective of the study is to explore the effect of a scaffolding strategy in enhancing students’ application ability in a 3D virtual learning environment. In order to provide students with a “hands-on” experience, we utilized virtual reality technology to build a 3D virtual supermarket (3DVS) that guides students in applying classroom theory to practice. A female virtual customer poses questions while shopping in the 3DVS, and the participants act as clerks who answer them. The questions posed by the virtual customer are designed based on marketing mix theory (Kotler et al., 2006). We invited ten students to evaluate the system. The results of the interviews indicated that the participants held quite positive views of the system.

Paper Nr: 262
Title:

MINDDESKTOP - Computer Accessibility for Severely Handicapped

Authors:

Ori Ossmy, Ofir Tam, Rami Puzis and Lior Rokach

Abstract: Recent advances in electroencephalography (EEG) and electromyography (EMG) enable communication for people with severe disabilities. In this paper we present a system that enables the use of regular computers through an off-the-shelf EEG/EMG headset, providing a pointing device and virtual keyboard that can be used to operate any Windows based system, minimizing the user effort required for interacting with a personal computer. The effectiveness of the proposed system is evaluated in a usability study with non-handicapped users, indicating high user satisfaction and a decreasing learning curve for completing various tasks. A short demonstration movie of the proposed system can be seen at the link provided.

Area 6 - Enterprise Architecture

Full Papers
Paper Nr: 90
Title:

REQUIREMENTS FOR AUTOMATED ENTERPRISE ARCHITECTURE MODEL MAINTENANCE - A Requirements Analysis based on a Literature Review and an Exploratory Survey

Authors:

Matthias Farwick, Berthold Agreiter, Ruth Breu and Steffen Ryll

Abstract: Enterprise Architecture Management (EAM) is the practice of modeling the business and IT artifacts in an enterprise and relating them to each other. By documenting these interdependencies between the business and the supporting IT, strategic decisions can be made towards a planned and consolidated enterprise architecture that matches the business needs. However, enterprise architecture models can grow very large, making the manual creation and maintenance of these models a difficult and time-consuming task. From this, we derived the three research questions of this paper: (i) Which related work exists on the maintenance of EA models, and does it refer to automation techniques? (ii) Is EA maintenance automation demanded by practitioners? If yes, (iii) what are the requirements for such an automation method and tool? In this paper we tackle these questions by conducting an analysis of EA literature from research and practice. In addition, we present the results of a survey among EA practitioners that we conducted to find out current maintenance practices. We then describe a collection of requirements for an automated EA maintenance method and tool that we derived from the results of the survey, the findings of the literature review and our own experience as EA consultants. Finally, we present several success evaluation criteria for an automated EA maintenance solution.

Paper Nr: 158
Title:

A CROSS INDUSTRY EVALUATION OF CRITICAL SUCCESS FACTORS FOR ALIGNMENT OF STRATEGY AND BUSINESS PROCESSES - A Case Study of SMEs in the Region of Jönköping in Sweden

Authors:

Marie Noel Eyenga Ondoa

Abstract: The challenges in the global economy have forced companies to rethink the way they operate and their relations with both customers and subcontractors. To remain competitive, companies need to align their business processes with the firm’s strategy and make strategic use of information technology. This paper addresses one of the important issues in the business process management field as well as in strategic alignment, namely: how do we create and sustain alignment between business processes and strategy? The authors have performed a literature review in order to analyse the challenges and critical success factors in process management and business and IT alignment. The results of that investigation are the basis for the approach that is advocated in this paper. Four case studies have been conducted in the area of Jönköping in order to test the validity of the approach in Small and Medium Enterprises (SMEs). The results show that SMEs continuously put effort into maintaining alignment between their business processes and strategy by means of information technology. They usually consider people, management, IT/IS and organisational culture to be most important in creating alignment between strategy and business processes. Organisational structure and performance measurement tend to be less important.

Paper Nr: 299
Title:

WASABIBEANS - Web Application Services and Business Integration

Authors:

Jonas Schulte

Abstract: Albert Einstein, a German physicist born in 1879, said: “Progress requires exchange of knowledge”. This sentence is more relevant than ever, since projects in economy and research are becoming increasingly complex. Due to the information flood and this growing complexity, scientists all over the world do research on knowledge management and organization. This article outlines the particular position of CSCW systems for knowledge organization, since involving people in the process of data structuring and organization is crucial to supporting (work) processes and information retrieval. Furthermore, a framework is presented to demonstrate some innovative concepts for collaborative knowledge organization and work. The focus is on presenting the powerful authentication and authorization infrastructure and means of flexible user-driven repository integration to build up complex knowledge networks.

Paper Nr: 300
Title:

AN INFORMATION ORIENTED FRAMEWORK FOR RELATING IS/IT RESOURCES AND BUSINESS VALUE

Authors:

Alexander Borek and Ajith Kumar Parlikad

Abstract: Several studies have highlighted the importance of information and information quality in organisations, and thus information is regarded as a key determinant of success and organisational performance. At the same time, there are numerous studies, frameworks and case studies examining the impact of information technology and systems on business value. Recently, several studies in the literature have proposed maturity models for information management capabilities, which claim that higher maturity results in higher organizational performance. Although these studies provide valuable information about the underlying relations, most are limited in specifying the relationship in more detail. Furthermore, the most prominent approaches do not, or at least not explicitly, consider information as an important influencing factor for organisational performance. In this paper, we aim to review selected contributions and introduce a model that shows how IS/IT resources and capabilities could be interlinked with IS/IT utilization, organizational performance and business value. Complementing other models and frameworks, we explicitly consider information from a management maturity, quality and risk perspective. Moreover, the paper discusses how each part of the model can be assessed in order to validate the model in future studies.

Short Papers
Paper Nr: 52
Title:

THE ADIWA PROJECT - On the Way to Just-in-Time Process Dynamics based on Events from the Internet of Things

Authors:

Markus Schief, Christian Kuhn, Birgit Zimmermann, Philipp Rösch, Walter Waterfeld, Jens Schimmelpfennig and Dirk Mayer

Abstract: In this paper, we introduce a concept that focuses on innovative commercial system implementations reflecting process-embedded events from the Internet of Things. The developed concepts are derived from experience in applying recent research advances to industry scenarios. The rationale behind the overall concept is twofold: while transparency is increased by event-based methodologies in the context of the Internet of Things, the agility of business processes is fostered by enhanced business process models, orchestration support, execution control, and user assistance.

Paper Nr: 80
Title:

COMPOSITE ENTERPRISE PROCESS MODELING (CEPROM) FRAMEWORK - Setting Up a Process Modeling Center of Excellence using CEProM Framework

Authors:

Eswar Ganesan

Abstract: Process Modeling is one of the important subject areas, as well as a practical tool, in enterprise-wide initiatives like Business Process Management, Enterprise Business Architecture and Enterprise Modeling. Process Modeling is both an art and a science, leading to a better understanding and improvement of how the products and services of the enterprise reach customers. Though various studies have already been conducted and reported on process modeling, they are disparate in nature and either address a part of process modeling in detail or provide the theoretical background for addressing specific issues, rather than addressing enterprise-wide process modeling using a structured framework that assists practitioners. The research objective is to prescribe a generic framework for enterprise process modeling, the Composite Enterprise Process Modeling (CEProM) Framework, with modular parts based on applied research and practical exposure, enabling practitioners to comprehend enterprise process modeling at large.

Paper Nr: 110
Title:

INVESTOR-MANAGER HETEROGENEOUS BELIEFS AND CORPORATE FINANCING DECISION

Authors:

Jian Ma and Zhixin Liu

Abstract: This paper considers a firm that may issue common stock or debt to undertake an investment opportunity when the investors and the manager have different estimates of the expected return from the investment. Firstly, an equilibrium model is developed to reveal the impact of investor-manager heterogeneous beliefs on the corporate financing decision; the model concludes that the greater the investors’ belief relative to the manager’s, the more likely the firm is to issue equity rather than debt. Secondly, using a sample of debt and seasoned equity issues from Chinese listed firms, we empirically analyze the conclusion above. The empirical results support this conclusion.

Paper Nr: 154
Title:

A MULTI-LAYER TREE MODEL FOR ENTERPRISE VULNERABILITY MANAGEMENT

Authors:

Bin Wu and Andy Ju An Wang

Abstract: Conducting enterprise-wide vulnerability assessment (VA) on a regular basis plays an important role in assessing an enterprise’s information system security status. However, an enterprise network is always very complex, separated into different types of zones, and consisting of hundreds of hosts. The complexity of the IT system makes VA an extremely time-consuming task for security professionals, who seek an automated tool that helps monitor and manage the overall vulnerability of an enterprise. This paper presents a novel methodology that provides a dashboard solution for managing enterprise-level vulnerability. In our methodology, we develop a multi-layer tree-based model to describe the enterprise vulnerability topology. Then we apply a client/server structure to gather vulnerability information from enterprise resources automatically. Finally, a set of well-defined metric formulas is applied to produce a normalized vulnerability score for the whole enterprise. We also implemented our methodology as EVMAT, an Enterprise Vulnerability Management and Assessment Tool, to test our method. Experiments on a small e-commerce company and a small IT company demonstrate the great potential of our tool for enterprise-level security.

Paper Nr: 156
Title:

VARIANT LOGIC META-DATA MANAGEMENT FOR MODEL DRIVEN APPLICATIONS - Allows Unlimited End User Configuration and Customisation of All Meta-data EIS Application Features

Authors:

Jon Davis and Elizabeth Chang

Abstract: The scope for end users to influence the design and functionality of off-the-shelf Enterprise Information System (EIS) applications is usually minimal, requiring expensive vendor-supported customisations. Our ongoing development of temporal meta-data EIS applications seeks to overcome these issues through modelling rather than coding, with the meta-data model supporting the capability for end users to define their own application logic meta-data, to supplement or replace the originating vendor’s pre-defined application logic, in what we term Variant Logic. Variant Logic can be applied to any object defined in a meta-data EIS application, can be defined by any authorised user without the need for additional coding, and is available for immediate execution by the framework runtime engine. Variant Logic is also preserved during automated meta-data application updates.

Paper Nr: 166
Title:

THE CONSTRUCTION OF OPEN INNOVATION PARADIGM - A Perspective from the Knowledge Management

Authors:

Sun Xin and Wang Qian

Abstract: In a highly competitive era, innovation is the foundation of and a necessary condition for a manufacturer’s competitiveness and survival. But innovation is a high-risk activity with a high failure rate, owing to market and technical uncertainties. With the rapid development of science and technology and the rapid change of customers’ demands, the traditional closed innovation paradigm cannot maintain a company’s competitive advantage. In this situation, a new innovation paradigm, open innovation, has emerged and compensates for the disadvantages of closed innovation. In the open innovation paradigm, companies can obtain external knowledge resources to improve their knowledge management ability. From the perspective of knowledge management, this paper analyzes the characteristics of the open innovation paradigm and the relationship between open innovation and knowledge management. Based on this analysis, the paper puts forward some useful suggestions on how to construct an effective open innovation paradigm.

Paper Nr: 191
Title:

LABORATORY 2.0 - Towards an Integrated Research Environment for Engineering Mechanics

Authors:

Jonas Schulte and Reinhard Keil

Abstract: Cooperation support systems, or CSCW systems, increasingly offer standardized interfaces to allow their integration into university-wide IT infrastructures. However, several disciplines (e.g. engineering and medical science) require the use and seamless integration of additional applications to meet researchers’ requirements and support collaboration in a sustainable manner. This article outlines the possibilities for integrating high-tech laboratories into existing IT infrastructures to strengthen the exchange of information among teaching, research, and industry. Since laboratory components are usually characterized by proprietary interfaces, we replaced these manufacturer-specific interfaces and protocols with a service-oriented architecture for laboratories. The functionalities of laboratory components are encapsulated as services and made accessible via Linux field-bus couplers. The modularization of the laboratory allows the connection to the world of e-learning documents. This article highlights the symbiosis between research in engineering and teaching at universities: not only can research take a significant influence on teaching, but also, vice versa, teaching feeds into later research in laboratories.

Paper Nr: 204
Title:

MINING OF AD-HOC BUSINESS PROCESSES USING MICROSOFT SHAREPOINT, NITRO & PROM 6.0 - An Industrial Practice

Authors:

Farhad Naderipour

Abstract: The improvement and re-engineering of business processes is a challenging task, especially when ad-hoc processes are taken into account. Process mining techniques allow the identification of process knowledge and characteristics based on so-called event logs. Many information systems today provide such logs, e.g. Microsoft SharePoint, one of the pioneers of collaborative portals. Although it supports both structured and ad-hoc process orchestration, managing dynamic processes in SharePoint remains open to new ideas. Since SharePoint registers all change events on its content databases, it can serve as a valuable source for mining process context. This paper introduces process mining concepts and techniques, demonstrates the application of Nitro to constructing event logs based on the XES standard, and finally applies process mining techniques using ProM 6.0, a newly developed framework for process mining.

Paper Nr: 217
Title:

COMMUNITY-BASED OPEN SOURCE - The Phenomenon and Research Opportunities

Authors:

Manlu Liu and Qiang Tu

Abstract: Community-based open source (community source) development has emerged as a new way of developing enterprise applications, leading to a unique type of open source practice involving collaboration from multiple organizations in a virtual environment. In this paper, we introduce the concept and motivation of community source and address several research directions in community source under multiple perspectives. We examine a real world case, the Kuali community source project, to help better understand this new phenomenon. The long term research objectives are discussed at the end of the paper. This paper facilitates a general understanding of the emerging community source development landscape. We believe that the issues presented in this paper can attract more researchers to study this new area and help organizations make better decisions in IT investments.

Paper Nr: 221
Title:

A FRAMEWORK FOR KNOWLEDGE MANAGEMENT ARCHITECTURE

Authors:

Emad Farazmand and Ali Moeini

Abstract: Many organizations have begun to reexamine and rearrange their business strategies, processes, information technologies, and organizational structures from a knowledge perspective. Adoption and assimilation of the knowledge management paradigm requires the design and establishment of structures, processes, and technologies along with organizational knowledge resources. Knowledge differs from data and information by origin. Many experiences with preexisting methods for information system planning are still usable for knowledge management planning. One of the best known of these is enterprise architecture, and especially the Zachman framework. This paper is about the customization of the Zachman framework for defining a knowledge management architecture in an enterprise.

Paper Nr: 255
Title:

AN ARCHITECTURE FOR INTEROPERABILITY OF ENTERPRISE INFORMATION SYSTEMS BASED ON SOA AND SEMANTIC WEB TECHNOLOGIES

Authors:

Fuqi Song

Abstract: Enterprise interoperability becomes a necessary and critical task for enterprises as the information and complexity of systems increase exponentially. To adapt existing information systems to new requirements and build bridges between them, a conceptual vision and understanding of interoperability can facilitate the technical development needed to reduce the conflicts and gaps between heterogeneous systems. The focus has now shifted from syntactic concerns to semantic issues: semantic heterogeneity is the next barrier and challenge to face in enterprise interoperability, and semantics promises to play a major role in enabling the interoperability of enterprise information systems. The approach can be transposed to the enterprise decision level, where heterogeneous business processes also need to be interoperable. SOA helps to explore a loosely coupled architecture with many advantages, especially for enhancing interoperability among enterprises. To obtain business process interoperability, service orchestration and choreography can coordinate and compose services and processes effectively. This paper first gives a brief introduction to the main issues in this domain, and then recalls a framework for enterprise interoperability. Based on these methodologies and the interoperability framework, a new architecture for enterprise interoperability of information systems using the SOA ideology and semantic web technologies is proposed, whose final objective is to enable and enhance enterprise interoperability and thereby create business value.

Paper Nr: 280
Title:

AN ARCHITECTURAL APPROACH TO ANALYZE INFORMATION QUALITY FOR INTER-ORGANIZATIONAL SERVICE

Authors:

Shuyan Xie and Markus Helfert

Abstract: In inter-organizational service (IOS) systems, the quality of information exchanged and shared by the involved organizations is important. Although information quality (IQ) has been emphasized for decades, IQ problems still widely exist. For a significant class of information problems related to semantic issues, it is necessary to improve information quality not just by working on the information/data itself. However, this is not commonly understood, which casts doubt on the effectiveness of current approaches. We consequently propose an architectural approach to enhancing IQ for IOS: an enterprise-level information architecture provides a rich contextual environment to guarantee IQ and a traceable path to measure IQ across organizations. The approach is demonstrated in an emergency medical service enterprise.

Paper Nr: 314
Title:

RESEARCH ON THE OPERATIONAL INTEGRATION PROCESSING OF TELECOM OPERATORS BASED ON GAME THEORY

Authors:

Xiaoliang Wang and Zhongliang Guan

Abstract: Drawing on game theory, the paper draws conclusions about the restructuring of China's telecom industry, and analyzes how to retain the advantageous businesses of both sides and how to find the profit-maximizing strategy for each side when two operators are merged. In addition, the paper analyzes competition and cooperation between telecom operators, and gives corresponding recommendations based on a "Prisoner's Dilemma" case study.

Paper Nr: 359
Title:

DESIGN SCIENCE AND ACTOR NETWORK THEORY NEXUS - A Perspective of Content Development of a Critical Process for Enterprise Architecture Management

Authors:

Theodora Ngosi and Markus Helfert

Abstract: Design science in the Information Systems (IS) field is situated at the intersection of the behavioural sciences, engineering and the social sciences. However, little critical attention is paid to the behavioural aspects of a design process, specifically the role of the actors, group dynamics, consensus building and how results are achieved. This paper describes a design-oriented case study of the content development of Enterprise Architecture Management, one of 32 critical processes (CPs) of the Information Technology Capability Maturity Framework (IT-CMF). The case study methodology is an ethnographic exploratory approach tracing the content development of this CP from conception to maturity. We apply the combined principles of design science and Actor Network Theory (ANT). We develop a Translation Model that gives an integrative basis for using these combined principles in order to interpret the content development of this CP. The paper concludes with the validation of the Translation Model to highlight its dominant qualities.

Paper Nr: 394
Title:

MODELING APPROACH FOR BUSINESS IT ALIGNMENT

Authors:

Karim Doumi

Abstract: Nowadays, business IT alignment has become a priority in most large organizations. It is a question of aligning the information system with the business strategies of the organization. This step is aimed at increasing the practical value of the information system and making it a strategic asset for the organization. Many works have shown the importance of the documentation, analysis and evaluation of business IT alignment, but few have proposed solutions applicable at the strategic and functional levels. This paper aims to fill this gap by proposing a simple approach with two levels of modeling: (1) the strategic level, modeled through a goal modeling approach, and (2) the functional level, based on an enterprise architecture approach. The approach is illustrated by a case study of a real project in a Moroccan public administration.

Posters
Paper Nr: 57
Title:

A BOTTOM-UP APPROACH FOR THE REPRESENTATION AND CONTINUOUS UPDATE OF TECHNOLOGY ARCHITECTURES IN PUBLIC ADMINISTRATION - Position Paper

Authors:

João Soares

Abstract: In organizations where there is no concern for updating Information Systems Architecture (ISA) models in a strict and automatic fashion, business requirements and pressures frequently lead to an increasing gap between current information systems and the latest version of their ISA and, in particular, their Technology Architecture (TA) models, when they exist. One of the causes of the lack of “investment” in TA models is the considerable effort required to “discover” infrastructural features that serve as input to the development of architectures. This work is focused on devising a bottom-up approach for the representation and continuous update of TAs in Public Administration (PA), using tools to automate the discovery of architectural evidence (such as logs or network events) and a process based on an annotation mechanism to enable interaction contexts that allow actors to make their knowledge about their activities explicit through graphic representations.

Paper Nr: 74
Title:

THE RESEARCH ON KNOWLEDGE MANAGEMENT OF CONSTRUCTION ENTERPRISE

Authors:

Yang Xiaohong

Abstract: Knowledge management has gradually become a key factor in the development of a company. Research on knowledge management in construction enterprises, a key supporting industry of the national economy, is particularly needed. By analyzing the problems and the necessity of knowledge management in construction enterprises, this paper presents thoughts on knowledge management from several aspects, such as enterprise culture, organizational structure, incentive mechanisms, and information construction.

Paper Nr: 81
Title:

PATTERN-BASED DEVELOPMENT OF ENTERPRISE SYSTEMS - From Conceptual Framework to Series of Implementations

Authors:

Sergey V. Zykov

Abstract: Building enterprise software systems (ESS) is a dramatic challenge due to data size and complexity and the rapid growth of both over time. The issue becomes even more dramatic when it comes to integrating heterogeneous applications. Therefore, a uniform approach is required which combines formal models and CASE tools. The suggested methodology is based on extracting common ERP module-level patterns and applying them to a series of heterogeneous implementations. The approach includes an innovative lifecycle model, which extends the conventional spiral model by: (i) formal data representation/management models and (ii) DSL-based "low-level" CASE tools supporting the formalisms. The methodology has been successfully implemented as a series of portal-based ERP systems at the ITERA International Oil and Gas Corporation, and in a number of trading/banking applications for other enterprises. Work in progress includes a semantic network-based airline dispatch system and a 6D-model-driven nuclear power plant construction methodology.

Paper Nr: 231
Title:

AN EVALUATION OF ENTERPRISE MODELLING METHODS IN THE LIGHT OF BUSINESS AND IT ALIGNMENT

Authors:

Banafsheh Khademhosseinieh and Ulf Seigerroth

Abstract: All organisations have to keep improving different aspects of their business in order to survive in their business area. This improvement can be done in different ways, including business and IT alignment. There exist different ways of performing business and IT alignment, such as applying Enterprise Modelling Methods (EMMs). Enterprise Modelling (EM) is an area that has been built up with the intention of improving business practice and management; an EMM is a systematic way of building models that are to be used for planning improvements. For this purpose, an evaluation of the methods' capabilities in supporting business and IT alignment is needed. Accordingly, the results of this paper include a list of EMMs that are applicable to business and IT alignment.

Paper Nr: 343
Title:

INCREMENTALLY DEFINING ANALYSIS PROCESSES USING SERVICES AND BUSINESS PROCESSES

Authors:

Weisi Chen and Fethi A. Rabhi

Abstract: Increasingly, the idea of using SOA and BPM principles is being applied outside the enterprise computing context. One potential area is the use of reusable services to facilitate the definition, composition and execution of analysis processes over large distributed repositories of heterogeneous data. However, it is hard to see how such a solution can be applied when users are already engaged in performing the same tasks using their own tools and processes. This paper studies the problem in more detail and proposes a simple method in which the initial analysis process is iteratively refined into a service-based one. The methodology relies on an ADAGE architecture, which gives the flexibility to define data analysis processes in an incremental way. As a case study, the approach is demonstrated on the process of conducting event studies using the SAS package together with different data sources (e.g. Thomson Reuters Tick History (TRTH)). The case study demonstrates that the resulting process is substantially more efficient and effective than the original one.

Paper Nr: 349
Title:

PREDICTIVE MODEL OF RAIL CONSUMPTION FOR BEIJING SUBWAY LINE 2

Authors:

Lin-rong Pang

Abstract: This paper focuses on a rarely researched topic, the consumption of subway steel rail, based on quantitative analysis, and builds a fuzzy time series prediction model for the aggregate consumption of steel rail in the subway. Taking Beijing Subway Line 2 as a case, it analyzes and predicts consumption and derives the pattern of steel rail use; the results provide a scientific basis for the management of Beijing subway steel rail maintenance.

Paper Nr: 395
Title:

BUSINESS IT ALIGNMENT - A Survey

Authors:

Karim Doumi

Abstract: Nowadays, the strategic alignment of information systems has become a priority in most large organizations. It is a question of aligning the information system with the business strategies of the organization. This step is aimed at increasing the practical value of the information system and making it a strategic asset for the organization. In the literature, several approaches have been developed to solve the problem of alignment, for example the approach of alignment between architecture and the business context, the needs-oriented approach, and the approach of alignment between processes and the information system. In this paper we propose a detailed study of each approach (benefits and limitations) and a comparison between these different approaches.