"Annals. Computer Science Series" Journal Romania, 300559 Timişoara, 6 Lascăr Catargiu str. Phone: 004 0256 220 687 E-mail: conference.fcia [@] tibiscus [.] ro |
|
» Heuristic Algorithm for Graph Coloring Based On Maximum Independent Set » Hilal Almara'beh and Amjad Suleiman ABSTRACT: A number of heuristic algorithms have been developed for the graph coloring problem, but unfortunately, on any given instance, they may produce colorings that are very far from optimal. In this paper we investigate and introduce three heuristic algorithms that color a graph based on a maximum independent set. The algorithm that selects a node with minimum and maximum degree consecutively (Min_Max) is implemented, tested and compared with select a node with minimum degree first (SNMD) and select a node with maximum degree first (SNXD) in terms of CPU time and graph size for different densities (0.1, 0.2, …, 0.9). The results indicate that the Min_Max and SNXD algorithms outperform SNMD in terms of the time to the first maximum independent set, CPU running time and the number of colored nodes. KEYWORDS:Maximum independent set, Heuristic algorithm, adjacent nodes, Independent set. |
9 - 18
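A minimal sketch of the MIS-based coloring strategy described in the abstract above: repeatedly extract a maximal independent set and give all of its vertices one color. The degree-ordering rule is passed as a key function; the adjacency-list representation, the tie-breaking and the paper's exact Min_Max alternation are assumptions, not the authors' implementation.

# Sketch: color a graph by repeatedly extracting a maximal independent set
# and assigning all of its vertices the same color. The node-selection rule
# (min-degree first, max-degree first, ...) is a parameter.

def greedy_mis_coloring(adj, select_key):
    """adj: dict node -> set of neighbours; select_key: ordering heuristic."""
    uncolored = set(adj)
    coloring, color = {}, 0
    while uncolored:
        candidates = set(uncolored)
        independent = set()
        while candidates:
            # pick the next node according to the chosen degree heuristic
            v = min(candidates, key=select_key)
            independent.add(v)
            candidates -= adj[v] | {v}      # drop v and its neighbours
        for v in independent:
            coloring[v] = color
        uncolored -= independent
        color += 1
    return coloring

# Example: SNMD-style rule (minimum degree first) on a 4-cycle.
adj = {1: {2, 4}, 2: {1, 3}, 3: {2, 4}, 4: {1, 3}}
print(greedy_mis_coloring(adj, select_key=lambda v: len(adj[v])))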
|
» Grouping Based Coordinator Election for Service Provisioning in Mobile Ad Hoc Networks » Arul Jothi Yuvaraja and Barakkath Nisha Usman Ali ABSTRACT: Efficient routing and service provisioning in MANETs is a major research challenge. In centralized directory-based schemes, some mobile nodes hold the service directory to assist communication between service providers and clients. Although service coordination is easier, such centralized management is hard to scale and the centralized directories become bottlenecks. Later, hybrid and distributed schemes constructed local directories that form the backbone of the network, but such topology-based schemes are still hard to scale to larger networks (e.g., with several hundred nodes). Existing service provisioning techniques make use of a hierarchical decomposition of the geographic area into zones and select a core node in each zone to act as an agent for all the nodes in that zone, typically using the node ID or a hash value to select the core node. This paper deals with adaptive service coordination through a rendezvous node that delivers efficient tracking and coordination of services. The rendezvous node identifies services that can be grouped, maintains information about available services, and helps reduce overhead and network traffic. KEYWORDS:Hybrid and Distributed schemes, topology-based scheme, service provision, Mobile Ad Hoc Networks. |
19 - 24
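For concreteness, a hedged sketch of the zone-based core-node selection that the abstract contrasts with: nodes are partitioned into geographic grid zones and one core per zone is elected from a hash of the node ID. The zone size, the SHA-1 hash rule and the node records are illustrative assumptions, not the paper's grouping scheme.

# Sketch: partition nodes into geographic zones and pick one core/coordinator
# per zone using a hash of the node ID (the ID/hash-based selection the
# abstract refers to). Zone size and node data are hypothetical.
import hashlib

def zone_of(x, y, zone_size=100):
    """Map a position to a grid-zone identifier."""
    return (int(x) // zone_size, int(y) // zone_size)

def elect_cores(nodes, zone_size=100):
    """nodes: dict node_id -> (x, y). Returns zone -> elected core node."""
    zones = {}
    for node_id, (x, y) in nodes.items():
        zones.setdefault(zone_of(x, y, zone_size), []).append(node_id)
    # within each zone, the node with the smallest ID hash becomes the core
    return {z: min(members, key=lambda n: hashlib.sha1(n.encode()).hexdigest())
            for z, members in zones.items()}

nodes = {"n1": (10, 20), "n2": (40, 70), "n3": (150, 30)}
print(elect_cores(nodes))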
|
» On Comparing Different Chaotic Maps in Differential Evolutionary Optimization » Mohamed F. El-Santawy and A. N. Ahmed ABSTRACT: This paper presents a comparison between new approaches that introduce different chaotic maps, with their ergodicity, irregularity, and stochastic properties, into the Differential Evolution (DE) algorithm. The members of the new family, the so-called Chaotic Differential Evolution (CDE) algorithms, employ chaos in order to improve global convergence by escaping local solutions. KEYWORDS:Chaos, Chaotic Maps, Differential Evolution Algorithm, Optimization. |
25 - 28
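As an illustration of how a chaotic map can replace a pseudo-random stream inside DE, here is a hedged sketch of a plain DE/rand/1/bin loop whose scale factor is driven by the logistic map. The map choice, the 0.5*(1+chaos) scaling and all population settings are assumptions rather than the specific CDE variants compared in the paper.

# Sketch: DE/rand/1/bin in which the scale factor is driven by a logistic
# chaotic map instead of a pseudo-random draw (one common CDE construction).
import numpy as np

def logistic_map(x):
    return 4.0 * x * (1.0 - x)              # chaotic on (0, 1)

def chaotic_de(f, bounds, pop_size=20, gens=100, cr=0.9, chaos=0.7):
    rng = np.random.default_rng(1)
    lo, hi = np.array(bounds, dtype=float).T
    dim = len(bounds)
    pop = rng.uniform(lo, hi, (pop_size, dim))
    fit = np.array([f(x) for x in pop])
    for _ in range(gens):
        for i in range(pop_size):
            chaos = logistic_map(chaos)       # next chaotic value
            a, b, c = pop[rng.choice(pop_size, 3, replace=False)]
            mutant = np.clip(a + 0.5 * (1.0 + chaos) * (b - c), lo, hi)
            trial = np.where(rng.random(dim) < cr, mutant, pop[i])  # binomial crossover
            f_trial = f(trial)
            if f_trial < fit[i]:              # greedy selection
                pop[i], fit[i] = trial, f_trial
    return pop[fit.argmin()], fit.min()

best_x, best_f = chaotic_de(lambda x: float(np.sum(x ** 2)), [(-5, 5)] * 3)
print(best_x, best_f)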
|
» CORE generation from phone calls data using rough set theory » Sanjiban Sekhar Roy and Sarvesh S. S. Rawat ABSTRACT: Rough set theory was introduced by Z. Pawlak in 1982 to deal with uncertain data. It relies only on the available data and attributes to analyse features and to generate classification rules without any additional information. Here we use rough set theory to find the reduct and core of call-center data by removing superfluous information, and then provide an algorithm based on the hidden patterns in the data that can increase the efficiency of the organization. KEYWORDS:Rough set, Uncertain data, Reduct and Core. |
29 - 32
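A minimal sketch of the core computation on a decision table: an attribute belongs to the CORE if removing it changes whether objects with different decisions can still be discerned. This is the textbook indiscernibility construction under a toy set of call-center attributes, not the paper's specific algorithm or data.

# Sketch: compute the CORE of a decision table as the set of condition
# attributes whose removal changes the indiscernibility-based consistency.
# The call-center attributes and rows below are purely illustrative.

def partition(rows, attrs):
    """Group row indices by their values on the given attributes."""
    blocks = {}
    for i, row in enumerate(rows):
        blocks.setdefault(tuple(row[a] for a in attrs), []).append(i)
    return list(blocks.values())

def consistent(rows, attrs, decision):
    """True if every indiscernibility block has a single decision value."""
    return all(len({rows[i][decision] for i in block}) == 1
               for block in partition(rows, attrs))

def core(rows, attrs, decision):
    base = consistent(rows, attrs, decision)
    return [a for a in attrs
            if consistent(rows, [b for b in attrs if b != a], decision) != base]

rows = [  # toy call-center decision table
    {"duration": "long",  "type": "complaint", "region": "north", "escalate": "yes"},
    {"duration": "long",  "type": "query",     "region": "north", "escalate": "no"},
    {"duration": "short", "type": "complaint", "region": "south", "escalate": "no"},
]
print(core(rows, ["duration", "type", "region"], "escalate"))   # -> ['type']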
|
» Intrusion Detection Data Analysis Using Dominance Based Rough Set » Sanjiban Sekhar Roy, V. Madhu Viswanatham and P. Venkata Krishna ABSTRACT: As an extension of classical rough set theory, the dominance-based rough set approach has emerged as a useful mathematical device for dealing with uncertain data. The central theme of this paper is the analysis and evaluation of an intrusion detection data set through the application of the dominance-based rough set approach. KEYWORDS:Dominance based rough set, Intrusion Detection. |
33 - 38
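For concreteness, a hedged sketch of the dominance relation that underlies the dominance-based rough set approach: object x dominates y when x is at least as good on every criterion. The criterion names and feature values are hypothetical and all criteria are assumed to be gain-type; nothing here is drawn from the intrusion-detection data set evaluated in the paper.

# Sketch: dominance relation and dominating/dominated sets used in DRSA.
# All criteria are assumed gain-type (larger is better); data are illustrative.

def dominates(x, y, criteria):
    """x dominates y if x is >= y on every criterion."""
    return all(x[c] >= y[c] for c in criteria)

def dominating_set(obj, universe, criteria):
    """D+(obj): all objects that dominate obj."""
    return [u for u in universe if dominates(u, obj, criteria)]

def dominated_set(obj, universe, criteria):
    """D-(obj): all objects dominated by obj."""
    return [u for u in universe if dominates(obj, u, criteria)]

universe = [
    {"id": 1, "conn_rate": 0.9, "error_rate": 0.8},
    {"id": 2, "conn_rate": 0.4, "error_rate": 0.3},
    {"id": 3, "conn_rate": 0.9, "error_rate": 0.9},
]
criteria = ["conn_rate", "error_rate"]
print([u["id"] for u in dominating_set(universe[0], universe, criteria)])  # [1, 3]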
|
» A Fast Algorithm To Construct Neural Networks Classification Models With High-Dimensional Genomic Data » Waheed B. Yahya, Morolake O. Oladiipo and Emmanuel T. Jolayemi ABSTRACT: This paper presents efficient techniques for constructing Artificial Neural Network (ANN) models for tissue sample classification using high-dimensional microarray cancer data. A good ANN classification model is governed mainly by the quality of the gene predictors, the chosen activation function and the number of interactive (hidden) neurons employed to build the model. To date, no standard procedure to uniquely determine the suitable number of hidden layers that would yield good ANN models in a given microarray cancer classification problem has been reported in the literature. To fill this gap, a data-driven algorithm that efficiently determines the optimal number of hidden layers needed to yield efficient neural network models in any binary-response microarray cancer tumour classification problem is proposed in this work. The sub-sampling scheme of Monte Carlo cross-validation (MCCV) was adopted to construct the best neural network models within a range of specified numbers of hidden layers. Applications on simulated and real-life data sets showed that the proposed method yields stable and efficient neural network classification models with good prediction results. The leukemia and diffuse large B-cell lymphoma (DLBCL) microarray cancer data sets, both of which are publicly available, are employed to demonstrate our results. KEYWORDS:Artificial Neural Networks; average misclassification error rate; Monte Carlo cross-validation; Receiver Operating Characteristic Curve; AUC; Sensitivity; Specificity. |
39 - 58
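The sub-sampling scheme described in the abstract can be mimicked with scikit-learn's ShuffleSplit (a Monte Carlo cross-validation) wrapped around networks with different numbers of hidden layers. This is a hedged sketch on synthetic data; the fixed width of 10 neurons per layer, the candidate range and the scoring are assumptions, not the authors' algorithm or gene-selection pipeline.

# Sketch: Monte Carlo cross-validation (repeated random splits) used to choose
# the number of hidden layers for a binary classifier. Synthetic data stand in
# for the microarray sets.
from sklearn.datasets import make_classification
from sklearn.model_selection import ShuffleSplit, cross_val_score
from sklearn.neural_network import MLPClassifier

X, y = make_classification(n_samples=120, n_features=50, n_informative=10,
                           random_state=0)
mccv = ShuffleSplit(n_splits=30, test_size=0.3, random_state=0)   # MCCV splits

scores = {}
for n_layers in (1, 2, 3):                    # candidate numbers of hidden layers
    clf = MLPClassifier(hidden_layer_sizes=(10,) * n_layers,
                        max_iter=2000, random_state=0)
    scores[n_layers] = cross_val_score(clf, X, y, cv=mccv).mean()

best = max(scores, key=scores.get)
print(scores, "-> chosen number of hidden layers:", best)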
|
» Super QC-Pack: a novel statistical software for analysing quality control data » Waheed Babatunde Yahya and Isaac Adeola Adeniyi ABSTRACT: In this paper, we present a novel statistical package (Super QC-Pack) for analysing quality control data and constructing various quality control charts. The new Super QC-Pack (SQC) computes the control limits for various quality control charts and simultaneously provides graphical output of the results in an easily understood manner. The SQC package is currently packaged as a single setup that can be installed directly on any Windows operating system. It was developed using the C-Sharp programming language under the Visual Studio .NET 2008 framework and the ZedGraph graphics library. The SQC package is computationally efficient and provides results as accurate as those of existing statistical packages. The new SQC software is graphics-driven and very user-friendly, making it suitable for users with a limited background in statistics and statistical computing. The SQC software is not commercialised; it is freely available and can be downloaded and installed without restrictions at www.unilorin.edu.ng/sqcpack/yahya_adeniyi/webpage.htm. KEYWORDS:P-chart, C-chart, R-chart, S-chart, U-chart, X-Bar chart, process capability indices, C-sharp, Microsoft Visual Studio 2008, Zedgraph graphics library. |
59 - 72
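As a small illustration of the kind of computation such a package performs, here is a hedged sketch of X-bar and R chart control limits from subgroup data, using the standard SPC table constants for subgroups of size five. The measurements are made up, and this is not Super QC-Pack's code.

# Sketch: X-bar and R chart control limits for subgroups of size 5, using the
# standard SPC constants for n = 5 (A2 = 0.577, D3 = 0, D4 = 2.114).
import numpy as np

A2, D3, D4 = 0.577, 0.0, 2.114          # table constants for subgroup size 5

subgroups = np.array([                  # 6 subgroups x 5 measurements (toy data)
    [5.1, 5.0, 4.9, 5.2, 5.0],
    [5.0, 5.1, 5.0, 4.8, 5.1],
    [4.9, 5.0, 5.2, 5.1, 5.0],
    [5.2, 5.1, 5.0, 5.0, 4.9],
    [5.0, 4.9, 5.1, 5.0, 5.2],
    [5.1, 5.0, 5.0, 5.1, 4.9],
])

xbar = subgroups.mean(axis=1)                            # subgroup means
ranges = subgroups.max(axis=1) - subgroups.min(axis=1)   # subgroup ranges
xbarbar, rbar = xbar.mean(), ranges.mean()

print("X-bar chart: CL=%.3f UCL=%.3f LCL=%.3f"
      % (xbarbar, xbarbar + A2 * rbar, xbarbar - A2 * rbar))
print("R chart:     CL=%.3f UCL=%.3f LCL=%.3f" % (rbar, D4 * rbar, D3 * rbar))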
|
» Minimization of Boolean Functions Using Genetic Algorithm » Masoud Nosrati, Ronak Karimi and Mehdi Hariri ABSTRACT: Minimization of Boolean functions is one of the basic operations of Boolean algebra. This paper presents a method for minimizing Boolean functions. To do this, we first investigate a graph data structure for storing a Boolean function and its basic operations; in fact, it is used for storing Karnaugh map adjacencies. Then, the adjacencies and the conditions for selecting appropriate adjacencies for factoring are specified. As an essential part of the paper, a brief review of genetic algorithms is presented, and finally the use of a GA for selecting appropriate adjacencies is described. KEYWORDS:Boolean Functions, SOP, Minimization, Factoring, Genetic Algorithm, GA. |
73 - 77
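A hedged sketch of the selection step the abstract outlines: a binary chromosome marks which candidate adjacencies (groups of minterms from a Karnaugh map) are kept, and the fitness rewards covering all required minterms with as few groups as possible. The candidate groups, the penalty weight and all GA parameters are illustrative assumptions, not the paper's exact setup.

# Sketch: a GA that selects a subset of candidate Karnaugh adjacencies so that
# every required minterm is covered by as few groups as possible.
import random

minterms = {0, 1, 2, 5, 6, 7}                               # function's minterms
groups = [{0, 1}, {0, 2}, {1, 5}, {5, 7}, {6, 7}, {2, 6}]   # candidate adjacencies

def fitness(chrom):
    covered = set()
    for g, bit in zip(groups, chrom):
        if bit:
            covered |= g
    uncovered = len(minterms - covered)
    return -(10 * uncovered + sum(chrom))        # full cover first, then few groups

def ga(pop_size=30, gens=100, p_mut=0.1):
    random.seed(0)
    pop = [[random.randint(0, 1) for _ in groups] for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=fitness, reverse=True)
        parents = pop[:pop_size // 2]            # truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, len(groups))
            child = a[:cut] + b[cut:]            # one-point crossover
            child = [bit ^ (random.random() < p_mut) for bit in child]  # mutation
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)

best = ga()
print([sorted(g) for g, bit in zip(groups, best) if bit])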
|
» Design of a Fingerprint Authentication System for Access Control » Akinsowon Omoyele Ajoke, Olumide Sunday Adewale and Alese Boniface Kayode ABSTRACT: With increasing use of the internet, there is increasing opportunity for identity fraud, organised crime, money laundering, theft of intellectual property and other types of cybercrime. There has also been an increase in reported biosecurity incidents, border control incidents and terrorism. The events of September 11, 2001, the incident that occurred in Nigeria on 24 December 2009, and many other occurrences have triggered an increased response from governments, intelligence and law enforcement agencies worldwide. The structure of an automated fingerprint authentication system is described in this work. The system completely eliminates the need for manual perusal of fingerprints to find a possible match. KEYWORDS:Biometric, Security, Authentication, Fingerprint, Minutiae. |
78 - 84
|
» Curve-Fitting Models for Immune-Mediated Cell Destruction - Comparison » Bogdan Timar, Corina Vernic, Simona Apostol and Viorel Şerban ABSTRACT: Background and aims: Variables in biology and medicine have a series of particularities when fitted to a curve, especially when their value is conditioned by time. Many curve-fitting comparisons are based on the value of R squared, an indicator which is not accurate in describing non-nested non-linear models. KEYWORDS:curve fitting models, Akaike's Information Criterion, R squared, Diabetes Mellitus. |
85 - 89
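To make the comparison criterion concrete, here is a hedged sketch computing R squared and the (corrected) Akaike Information Criterion for one fitted non-linear model, using the least-squares form AIC = n*ln(RSS/n) + 2k. The exponential-decay model and the data are illustrative, not the cell-destruction models compared in the paper.

# Sketch: compare R-squared with AIC/AICc for a fitted curve.
import numpy as np
from scipy.optimize import curve_fit

def decay(t, a, k):
    return a * np.exp(-k * t)

t = np.linspace(0, 10, 20)
y = 5.0 * np.exp(-0.3 * t) + np.random.default_rng(0).normal(0, 0.2, t.size)

params, _ = curve_fit(decay, t, y, p0=(1.0, 0.1))
resid = y - decay(t, *params)
rss = np.sum(resid ** 2)
n, k = t.size, len(params)

r_squared = 1 - rss / np.sum((y - y.mean()) ** 2)
aic = n * np.log(rss / n) + 2 * k                    # least-squares AIC
aicc = aic + 2 * k * (k + 1) / (n - k - 1)           # small-sample correction
print(f"R^2={r_squared:.3f}  AIC={aic:.2f}  AICc={aicc:.2f}")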
|
» An Approach for MAC-Protocol In Wireless Sensor Network - The Group Split Property (GSP) - » Mohammad Abd-Allatef Al-Shalabi and Majd Osama Al-Shalabi ABSTRACT: Wireless Sensor Networks (WSN) are collections of sensors that are equipped with a radio and together form a wireless network. In this work we look at communication protocols, which have an obvious effect on the energy consumed within the network. The energy issue is important in WSN because it is generally hard or impractical to charge or replace an exhausted battery, which leads to the primary objective of maximizing network lifetime. We therefore focus on the MAC layer, since this protocol consumes the largest part of a sensor's energy. We propose an approach for a MAC protocol called the GSP protocol. In this protocol, cluster-heads change at regular intervals to ease the load on the sensors in the network, and the sensors send the average of their data to avoid large messages. We used our own simulation to obtain comparison results against direct communication with the base station. The proposed GSP protocol is expected to be more effective in reducing energy consumption by splitting the cluster into two parts by default and by using timers to organize the operations in the cluster and avoid collisions. KEYWORDS: |
90 - 93
|
» A New Sensor Network Protocol Stack Architecture for Congestion Control » Jasmine K.S. and Babitha Naidu ABSTRACT: A wireless sensor network is a new technological advancement in wireless communication. Sensor networks are made up of hundreds or thousands of sensor nodes that are densely deployed in a remote environment with sensing, wireless communication and computation capabilities. Many different routing, power management and data dissemination protocols have been designed for wireless sensor networks. In this paper, a study of the layered protocol stack architecture for communication in sensor networks, along with various congestion control algorithms, is presented. KEYWORDS:Protocol stack, Sensornet protocol layer, Congestion control, Flow control, Reliability. |
94 - 98
|
» On the Robustness of k-Sort and its Comparison to Quick Sort in Average Case » Mita Pal and Soubhik Chakraborty ABSTRACT: The present paper examines the robustness of the average-case O(n log n) complexity of K-sort, a new version of quick sort. In our first study we reconfirm this complexity through computer experiments. A computer experiment is a series of runs of a code for various inputs; a deterministic computer experiment is one which produces identical results if the code is re-run for identical inputs. Our second study reveals that K-sort is the better choice for discrete uniform distribution U(1, 2, ..., k) inputs, whereas quick sort is found better for continuous uniform distribution U(0,1) inputs. Interestingly, increasing k, which decreases the number of ties, is good for quick sort but bad for K-sort. KEYWORDS:Cauchy distribution, Computer experiment, K-Sort, Robustness, average complexity. |
99 - 102
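A hedged sketch of the kind of deterministic computer experiment the abstract describes: timing a sorting routine over seeded, repeatable runs on discrete-uniform U{1,...,k} versus continuous-uniform U(0,1) inputs. K-sort itself is not reproduced; a generic three-way quicksort stands in, and the input sizes and seeds are arbitrary choices for illustration.

# Sketch of a computer experiment: mean running time of a sorting routine on
# discrete-uniform U{1,...,k} vs continuous-uniform U(0,1) inputs.
import time
import numpy as np

def quicksort(a):
    """Simple 3-way quicksort (handles ties); stand-in for the tested sorts."""
    if len(a) <= 1:
        return a
    pivot = a[len(a) // 2]
    return (quicksort([x for x in a if x < pivot])
            + [x for x in a if x == pivot]
            + quicksort([x for x in a if x > pivot]))

def mean_time(make_input, runs=10):
    total = 0.0
    for seed in range(runs):                     # fixed seeds -> repeatable runs
        data = make_input(np.random.default_rng(seed))
        start = time.perf_counter()
        quicksort(data)
        total += time.perf_counter() - start
    return total / runs

n, k = 20000, 20
print("U{1..k}:", mean_time(lambda r: r.integers(1, k + 1, n).tolist()))
print("U(0,1): ", mean_time(lambda r: r.random(n).tolist()))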
|
» The Personal Response System in Combination with the Smart Board » Adriana Hudecz ABSTRACT: The spread of new technologies in education, together with the Interwrite Workspace smart board system and its associated Personal Response System (PRS IR), has given rise to interactive evaluation techniques. This paper proposes an interactive evaluation technique that uses the personal response remote of each student at their desk to obtain real-time feedback. KEYWORDS:smart board, evaluation, assessment. |
103 - 105
|
» Edge Detection Algorithm for Machine Vision Systems » B.Janani, R.Harini, J.B.Bhattacharjee and B.Thilakavathi ABSTRACT: In real-world machine vision problems, issues such as noise and variable scene illumination make edge detection and object finding difficult. An effective edge detection algorithm is required for many important areas such as machine vision and automated interpretation systems, and is often used as the front-end processing stage in object recognition and interpretation systems. Much research has been done to develop effective edge detection algorithms. In this paper, parameters of the Roberts cross edge detection method are modified to achieve a higher level of independence from scene illumination and noise. Results obtained by this method are compared with several leading edge detection methods, such as Sobel and Canny. Furthermore, this work uses a logarithm-based edge detection method for the same purpose. The algorithm can also be applied to image enhancement based on this edge detection concept, and shows better enhancement results. KEYWORDS:Edge Detection, Logarithmic image processing, Contrast Estimation, Standard deviation. |
106 - 111
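For reference, a hedged sketch of the unmodified Roberts cross operator that the paper starts from: the two 2x2 kernels estimate diagonal gradients of a grayscale image and are combined into a gradient magnitude. The toy image, the crude threshold, and the paper's illumination-robust parameter changes and logarithmic variant are not reproduced.

# Sketch: Roberts cross edge detection on a grayscale image.
import numpy as np

def roberts_cross(img):
    """img: 2-D array; returns the gradient-magnitude image of the same shape."""
    g = img.astype(float)
    gx = np.zeros_like(g)
    gy = np.zeros_like(g)
    # kernels [[1, 0], [0, -1]] and [[0, 1], [-1, 0]]
    gx[:-1, :-1] = g[:-1, :-1] - g[1:, 1:]
    gy[:-1, :-1] = g[:-1, 1:] - g[1:, :-1]
    return np.hypot(gx, gy)

# Toy image: a bright square on a dark background.
img = np.zeros((8, 8))
img[2:6, 2:6] = 255
edges = roberts_cross(img)
print((edges > 100).astype(int))        # crude threshold just to visualize edges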
|
» C# Implementation of a Face Detection System using Template Matching and Skin Color Information » ANIDU Adesola Olaoluwa and FASINA Ebunoluwa Philip ABSTRACT: Face detection is the first step in any automatic face recognition system, Human Computer Interaction system, surveillance system, and also a step towards Automatic Target Recognition (ATR) or generic object detection/recognition. This paper presents a face detection system using template matching and skin color information as methods for detecting a face or faces in an image. The system is developed using the C#.net programming language. The stages of development fall into three categories, namely the pre-processing, normalization and face detection stages. The system was tested using pictures taken with a digital camera and stored on the computer, and an average accuracy of 80% was achieved for the tested images. KEYWORDS:Template, normalization, template matching, skin color. |
112 - 119
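For illustration, a hedged Python/OpenCV sketch of the two ingredients the abstract names: a skin-color mask in the YCrCb color space followed by normalized cross-correlation template matching. The file names, the commonly used skin bounds and the 0.6 score threshold are assumptions, and the original system is implemented in C#, not Python.

# Sketch: skin-color filtering in YCrCb followed by template matching via
# normalized cross-correlation (OpenCV).
import cv2

img = cv2.imread("photo.jpg")                 # hypothetical input image
template = cv2.imread("face_template.jpg", cv2.IMREAD_GRAYSCALE)

# 1) keep only skin-colored pixels (commonly used YCrCb bounds)
ycrcb = cv2.cvtColor(img, cv2.COLOR_BGR2YCrCb)
skin_mask = cv2.inRange(ycrcb, (0, 133, 77), (255, 173, 127))
skin_only = cv2.bitwise_and(img, img, mask=skin_mask)

# 2) template matching on the skin-filtered grayscale image
gray = cv2.cvtColor(skin_only, cv2.COLOR_BGR2GRAY)
scores = cv2.matchTemplate(gray, template, cv2.TM_CCOEFF_NORMED)
_, max_val, _, max_loc = cv2.minMaxLoc(scores)

if max_val > 0.6:                             # assumed detection threshold
    h, w = template.shape
    cv2.rectangle(img, max_loc, (max_loc[0] + w, max_loc[1] + h), (0, 255, 0), 2)
    cv2.imwrite("detected.jpg", img)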
|