

International Review on
Computers and Software
December 2013
(Vol. 8 N. 12)






    Learning Objects Retrieval Algorithm Using Semantic Annotation and New Matching Score

    by E. A. Vimal, S. Chandramathi


    Abstract - The retrieval of learning objects is important for a variety of information needs and applications, such as collection management, summarization and analysis. In particular, the retrieval of learning objects from large collections of textual data has become a very active field. Since learning information is usually stored in textual form, retrieval from textual databases faces two major challenges: large-data handling and effectiveness. Solving these two challenges significantly improves performance and applicability. Accordingly, the first challenge, large-data processing, is handled through semantic annotation: the text documents are converted into semantic annotation data using concept-based modeling. Building on existing work, the large data set is converted into annotation objects, so that matching against the input query is simplified and the complexity of handling big data is reduced. The second challenge, effectiveness, is addressed with a newly devised matching score applied to learning object retrieval: the query is matched against the annotated objects using this score. The experiments are carried out on learning objects from the IEEE digital library; performance is evaluated using precision, recall and F-measure, and a comparative analysis is performed to demonstrate the better performance of the proposed technique. The experimental analysis shows that the proposed approach obtains a maximum precision of 0.98 and an average precision of 0.95.
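The evaluation metrics named in the abstract are the standard retrieval measures. As a quick illustration (the document-ID sets below are invented for the example, not data from the paper), precision, recall and F-measure can be computed as:

```python
# Standard retrieval metrics over a set of retrieved vs. relevant documents.
# "retrieved" and "relevant" are hypothetical document-ID lists.

def evaluate(retrieved, relevant):
    tp = len(set(retrieved) & set(relevant))          # true positives
    precision = tp / len(retrieved) if retrieved else 0.0
    recall = tp / len(relevant) if relevant else 0.0
    f = (2 * precision * recall / (precision + recall)
         if precision + recall else 0.0)              # harmonic mean
    return precision, recall, f

p, r, f = evaluate(["d1", "d2", "d3", "d4"], ["d1", "d2", "d5"])
print(p, r, f)
```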

    Copyright © 2013 Praise Worthy Prize - All rights reserved


    Keywords: Data Retrieval, Data Handling, Semantics, Matching Score.


    An Efficient Technique for Frequent Item Set Mining in Time Series Data with aid of AFCM

    by J. Mercy Geraldine, E. Kirubakaran


    Abstract - Frequent itemset mining from a database is a difficult task. Many techniques have been proposed to mine frequent rules from databases, but they consider only the frequency value to decide whether an extracted rule is frequent or not. Hence, we propose a technique that overcomes these problems by utilizing AFCM. Initially, the time series database values are clustered using the AFCM technique. After that, frequent itemsets are mined from the clustered results by exploiting the sliding window technique. During the mining process, both the frequency and the utility of the itemsets are considered, and an itemset is declared frequent only if it satisfies the utility and consistency criteria. The proposed technique is implemented on the MATLAB platform and its performance is evaluated using a rainfall database.
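The sliding-window frequency check in the abstract can be sketched as follows. This is a minimal illustration of counting itemsets over the latest window against a plain support threshold; the paper's AFCM clustering and utility/consistency checks are not modeled, and the transactions are invented:

```python
from collections import Counter
from itertools import combinations

def frequent_itemsets(transactions, window, min_support):
    """Count itemsets of size <= 2 in the latest `window` transactions
    and keep those meeting `min_support` (a plain frequency threshold)."""
    recent = transactions[-window:]
    counts = Counter()
    for t in recent:
        items = sorted(set(t))
        for size in (1, 2):
            for combo in combinations(items, size):
                counts[combo] += 1
    return {iset for iset, c in counts.items() if c >= min_support}

txs = [["a", "b"], ["a", "c"], ["a", "b"], ["b", "c"]]
print(frequent_itemsets(txs, window=3, min_support=2))
```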



    Keywords: Adaptive Fuzzy C Means (AFCM), Cuckoo Search (CS), Frequent Item Set, Window Sliding, Consistency.


    Approximate Search in Very Large Files Using the Pigeonhole Principle

    by Maryam S. Yammahi, Chen Shen, Simon Berkovich


    Abstract - This paper presents a new technique for efficient searching with fuzzy criteria in very large information systems. The suggested technique uses the Pigeonhole Principle. This approach can be realized in different embodiments, but the most effective realization is to amplify already available intrinsic approximate matching capabilities, like those of the FuzzyFind method [1][2]. Consider the following problem: the data to be searched are presented as bit-attribute vectors, and the search operation consists of finding the subset of these bit-attribute vectors that lies within a particular Hamming distance of the query. Normally, such a search with approximate matching criteria requires a sequential lookup over the whole collection of attribute vectors. This process can easily be parallelized, but in very large information systems it would still be slow and energy consuming. The method suggested in this paper, approximate search in very large files using the Pigeonhole Principle, circumvents the sequential search and reduces the required computation tremendously.
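The pigeonhole idea behind this kind of search can be sketched briefly. This is a generic illustration, not the authors' FuzzyFind implementation: to find bit-vectors within Hamming distance k of a query, split every vector into k+1 blocks; since k mismatching bits cannot touch all k+1 blocks, at least one block must match the query exactly, so exact-match indexes on blocks replace a full scan:

```python
from collections import defaultdict

K = 2                    # allowed Hamming distance
BLOCKS = K + 1           # pigeonhole: k mismatches cannot cover k+1 blocks

def split(vec):
    """Split a bit-string into BLOCKS nearly equal pieces."""
    step = len(vec) // BLOCKS
    return [vec[i * step:(i + 1) * step] if i < BLOCKS - 1
            else vec[(BLOCKS - 1) * step:] for i in range(BLOCKS)]

def build_index(vectors):
    index = defaultdict(set)
    for vid, vec in enumerate(vectors):
        for b, block in enumerate(split(vec)):
            index[(b, block)].add(vid)
    return index

def search(query, vectors, index):
    candidates = set()
    for b, block in enumerate(split(query)):
        candidates |= index[(b, block)]   # any exact block hit is a candidate
    return [vectors[vid] for vid in candidates
            if sum(a != c for a, c in zip(query, vectors[vid])) <= K]

db = ["110010101100", "110010101111", "000000000000", "110110101100"]
idx = build_index(db)
print(sorted(search("110010101100", db, idx)))
```

Only vectors sharing at least one exact block with the query are ever compared in full, which is the pruning that avoids the sequential scan.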



    Keywords: Algorithms and Data Structures, Big Data, Information Retrieval, Approximate matching, Pigeonhole Principle.


    A Comparative Analysis of Software Clone Management Techniques

    by Kodhai E., Kanmani S.


    Abstract - Software clone research has shown that there is redundancy in software, and this redundancy increases the maintenance effort. Over nearly the past two decades, a number of software clone detection and clone management techniques have been proposed. In the literature, researchers have proposed clone management techniques such as clone removal and clone modification, analyzed the effect of clones on maintainability, investigated their evolution, and assessed the root causes of clones. A number of studies have also focused on the evaluation of clone detection approaches. This paper is an analysis of different clone detection and management techniques. First, we analyze and evaluate the currently available clone detection techniques and tools. Second, we study and discuss the currently available clone management techniques and tools.



    Keywords: Software Clones, Clone Detection, Refactoring, Clone Management, Software Maintenance.


    A New Ontological Approach to Build Project Memories in the Software Development Life Cycle: A Case Study of the Software Industry

    by Rabab Chakhmoune, Hicham Behja, Youssef Benghabrit, Abdelaziz Marzak


    Abstract - The goal of software development in today's industry is to deliver products that meet customer requirements at the lowest cost, with the best quality and in the shortest time. Design knowledge is needed, and past cases and developers' experience should be utilized as much as possible. In addition, software development is becoming increasingly knowledge-intensive and collaborative. In this situation, the need for integrated know-how, know-why and know-what to support the representation, capture, sharing and reuse of knowledge among distributed professional actors becomes ever more critical. Our approach consists in studying each stage of the software development process and defining the knowledge that must be capitalized in order to organize the project memory based on a domain ontology. The resulting knowledge base will then be used to help professional actors accomplish their tasks by bringing in knowledge from past projects.



    Keywords: Software Development, Knowledge, Project Memory, Domain Ontology.


    A Requirements Engineering Process Assessment Model for Small Software Development Organization

    by Anurag Shrivastava, Surya Prakash Tripathi


    Abstract - Requirements Engineering Process Assessment (REPA) is conducted prior to Requirements Engineering Process Improvement (REPI) activities to identify the Requirements Engineering process areas to be improved. Assessment methods such as REPM, R-CMM, REMMF and Uni-REPM are available to all enterprises, but they are difficult for most small organizations to use because of their complexity and the consequent large investment in time and resources. In order to conduct REPA activities effectively and with little effort in small organizations, we present a quantitative Requirements Engineering process assessment model based on the Quantitative Assessment Model for Software Process Improvement in Small Organizations. It is an efficient approach to evaluating the quality of Requirements Engineering process execution in order to improve the organization's Requirements Engineering process maturity. We present assessment results for four software companies, describing the milestone compliance score distribution and the compliance score distribution for the Requirements Engineering assessment items. All of these companies reported satisfaction with their participation. Our conclusion is that the quantitative REPA model is useful for assessing the Requirements Engineering process in small organizations and for identifying the strengths and weaknesses of their Requirements Engineering processes.



    Keywords: Requirements Engineering Process Assessment, Requirements Engineering Process Improvement, Small Organization, Compliance Score.


    QoS Aware Vertical Handoff Decision for UMTS-WiMAX Networks

    by Nirmal Raj T., R. M. Suresh


    Abstract - The main challenge in 4G networks is to perform vertical handoff between pairs of different network types, in the presence of 2G, 3G, WLAN, WMAN, satellite, etc., while fulfilling quality of service (QoS) requirements. The lack of QoS can cause a break in the network during handoff, or loss of the network under remote conditions. Hence, to overcome these issues, in this paper we propose a QoS-aware vertical handoff decision (QAVHD) for 4G networks. Initially, when the mobile terminal (MT), while moving, finds a new network, it collects the QoS information of that network, which includes signal strength, network coverage area, data rate, available bandwidth, velocity of the MT and network latency. The MT then compares the estimated measurements with those of its old network, and the network that provides better QoS is selected as the current network. The old network then hands over the data transmission to the new network. Through simulation results, we show that the proposed approach enhances the network throughput and minimizes the latency.
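A handoff decision of this kind is often modeled as picking the candidate network with the best weighted QoS score. The sketch below is illustrative only: the metric names follow the abstract, but the weights and the numeric values are invented, and the paper's actual decision rule may differ:

```python
# Weighted-score network selection: higher signal/bandwidth/data-rate is
# better, higher latency is worse (negative weight). All values normalized
# to [0, 1]; the numbers are made-up examples.

def qos_score(net, weights):
    return sum(weights[m] * net[m] for m in weights)

weights = {"signal": 0.3, "bandwidth": 0.3, "data_rate": 0.2, "latency": -0.2}

networks = {
    "UMTS":  {"signal": 0.7, "bandwidth": 0.4, "data_rate": 0.3, "latency": 0.6},
    "WiMAX": {"signal": 0.6, "bandwidth": 0.8, "data_rate": 0.9, "latency": 0.3},
}

best = max(networks, key=lambda n: qos_score(networks[n], weights))
print(best)  # WiMAX under these example numbers
```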



    Keywords: 4G Networks, Vertical Handoff (VHO), Mobile Terminal (MT), Quality of Service (QoS).


    Route Optimization Using Adaptive Shrink Mechanism for MANET

    by G. Mathiyalagan, Amitabh Wahi


    Abstract - In Mobile Ad hoc Networks (MANET), routing protocols do not adapt to the changing network topology: unless there is a link failure, the route is not updated even if another route with a smaller hop count becomes available. In this paper, we propose a route optimization technique using an adaptive shrink mechanism. It initially uses the estimated geometrical distance (EGD) metric for route discovery, together with a link quality prediction technique. The adaptive shrink mechanism involves sending a shrink packet along with the data packets, based on the link quality, traffic rate and link change rate. When the source wants to send data to the destination, it checks these parameters and then decides whether to send the shrink packet. This optimizes the existing routing protocol by reducing the delay and increasing the lifetime of the network. Through simulation, it is shown that the proposed mechanism reduces the delay and packet drop while increasing the throughput.



    Keywords: Mobile Ad Hoc Networks (MANET), Routing Protocols, Estimated Geometrical Distance (EGD).


    Performance Analysis of MAC Schemes in Wireless Sensor Networks

    by Revathi Venkataraman, M. Pushpalatha, K. Sornalakshmi


    Abstract - Wireless sensor networks have become an important area of research in recent years. A sensor network is a network infrastructure consisting of sensing devices that perform communication and computation along with sensing real-world phenomena. The data collected by these devices are sent to an administrator located at a base station, who can react to specific situations by analyzing the data. Data collection is thus an important activity performed by a sensor network. To facilitate smooth and fast data collection, various channel access schemes are available in the literature. Among these, TDMA (Time Division Multiple Access), CDMA (Code Division Multiple Access) and FDMA (Frequency Division Multiple Access) are the most basic as well as the most important ones. This paper analyzes the performance of these schemes in wireless sensor networks by considering a data collection scenario with a set of performance metrics. The TDMA scheme has been implemented using an adaptive timeslot assignment mechanism. The experiments were conducted on the TinyOS platform using Crossbow TelosB motes. The performance analysis carried out in this paper will be helpful in choosing an optimal protocol based on these schemes for network deployment, as well as in developing new medium access protocols.
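The "adaptive timeslot assignment" idea can be sketched very simply: give slots in the next TDMA frame only to nodes that currently have queued data, so idle nodes do not consume airtime. This is a generic illustration (node names and queue sizes invented), not the paper's exact mechanism:

```python
def assign_slots(pending, frame_size):
    """Round-robin the slots of one TDMA frame among nodes whose
    transmit queues are non-empty; idle nodes get no slots."""
    active = [n for n, backlog in pending.items() if backlog > 0]
    if not active:
        return []
    return [active[i % len(active)] for i in range(frame_size)]

pending = {"node1": 3, "node2": 0, "node3": 1}
print(assign_slots(pending, frame_size=4))
```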



    Keywords: Wireless Sensor Networks, TDMA (Time Division Multiple Access), CDMA (Code Division Multiple Access), FDMA (Frequency Division Multiple Access), TelosB Mote.


    Surrogate Object Based Mobile Transaction

    by S. Ravimaran, A. N. Gnana Jeevan


    Abstract - Advances in wireless technology and the rapid growth in the use of mobile devices have the potential to revolutionize computing through the illusion of a virtually infinite computing infrastructure. This new mobile paradigm enables mobile devices, in addition to static nodes, to support data and transaction management. A mobile device anywhere in the wireless environment can seamlessly utilize large computing power or other resources to provide effective data and transaction processing. Moreover, advances in wireless networks, together with the ubiquity of devices, have made it possible to store and retrieve incredible volumes of knowledge from data held on mobile devices, and such knowledge has become essential in many day-to-day applications. However, transaction management and knowledge discovery in mobile environments face several challenges: scarce bandwidth, limited battery resources, asymmetry between wired and wireless connectivity, asymmetry between mobile and fixed hosts, and host mobility with its limitations. Beyond these traditional mobile system challenges, they also face dependency on continuous network connections, data-sharing applications and federation with multiple service providers. Due to these challenges, data loss, frequent disconnection, an unpredictable number of wireless and wired accesses, and high transaction abort rates occur in data and transaction management over the distributed mobile paradigm. To realize data management in the mobile cloud, a Surrogate Object based Mobile Transaction scheme is proposed to deal with the fundamental issues of asymmetry, latency, abort rate and disconnection. The proposed strategy caches the data in the surrogate object, which helps reduce the average transaction time. The required reliability is provided by sending transaction requests to the surrogate object, and the network lifetime is maximized by migrating surrogate objects so as to avoid improper load balancing. The performance of the proposed models has been evaluated and compared with existing models by simulation.



    Keywords: Disconnection, Low Abort, Mobility, Surrogate Object, Transaction.


    Random Scheduling for Exploiting Throughput and TSMA Scheduling for Alleviating Interference in Wireless Systems

    by D. Rosy Salomi Victoria, S. Senthil Kumar


    Abstract - Wireless systems schedule communications efficiently by allocating the shared band among links in the same geographic region. The lack of central control in wireless networks calls for the design of distributed scheduling algorithms. We present a distributed framework that guarantees maximum throughput, based on random scheduling that reflects the optimal throughput power distribution in multi-hop wireless systems. The study of this problem has been limited by the non-convexity of the underlying optimization problems, which prevents an efficient solution; we adopt a randomization method to deal with this difficulty. We demonstrate the power distribution result through mathematical analysis and present several extensions of the considered problem. We also examine how fast data can be collected from a Wireless Sensor (WS) network structured as a tree, exploring and evaluating the method by means of an accurate simulation prototype under convergecast. We consider time scheduling on a single frequency channel with the goal of reducing the number of time slots needed to complete a convergecast. We combine Time Separation Multiple Access (TSMA) scheduling with transmission power control to lessen the effects of interference, and demonstrate that while power control helps in decreasing the schedule length on a single frequency, scheduling transmissions over multiple frequencies is more efficient. We evaluate the performance of several channel assignment techniques and find empirically that, for moderately sized networks of many nodes, the use of multiple-frequency scheduling can be sufficient to eliminate most interference. Graphical results are given for mean lifetime, Packet Delivery Rate (PDR), throughput and time delay.



    Keywords: Gossip Procedure, Power Distribution, TSMA Scheduling, Information Gathering, Euclidean Distance.


    Analysis and Improvement Design on P2P Botnets Detection Framework

    by Raihana Syahirah Abdullah, Faizal M. A., Zul Azri Muhamad Noh, Robiah Yusof


    Abstract - Developing a P2P botnet detection framework is crucial in the fight against P2P botnets: a poor detection method can lead to detection failure, so the framework needs to function accurately. This paper reviews and evaluates various current P2P botnet detection frameworks and analyzes the existing gaps in order to improve the P2P botnet detection framework. Based on a manually conducted review, we report our findings and the analysis performed on the different frameworks concerned with P2P botnet detection. The gaps and motivations found in this review are then discussed, and a P2P botnet detection framework architecture is proposed, with new improvements reinforced by a hybrid detection technique, a hybrid analyzer and in-depth hybrid analysis. The future direction of this work is to develop a P2P botnet detection system capable of high detection accuracy and efficiency.



    Keywords: P2P Botnets, P2P Botnets Detection, P2P Botnets Framework, P2P Botnets Detection Criteria.


    An Adaptive Iris Recognition System with aid of Local Histogram and Optimized FFBNN-AAPSO

    by Nuzhat F. Shaikh, Dharmpal D. Doye


    Abstract - Iris recognition is the process of recognizing a person by analyzing the apparent pattern of his or her iris. Many techniques have been developed for iris recognition so far. A recently developed method uses local histograms and image statistics, but it fails to locate the inner and outer boundaries of the iris in the presence of dense eyelashes, which results in poor pupil and iris localization; it also has high computational complexity, which reduces the performance of the system. Here we propose a new iris recognition system based on local histograms and an FFBNN optimized with AAPSO. In the proposed system, the input eye images are first fetched from the iris database and preprocessed using an adaptive median filter to remove noise. The features extracted from the preprocessed images are then given to the FFBNN to train the neural network. To obtain accurate results, the FFBNN parameters are optimized using the proposed AAPSO (Adaptive Acceleration Particle Swarm Optimization). In the testing process, the images are preprocessed and subjected to feature extraction, and the output of the feature extraction process is given to the well-trained, optimized FFBNN-AAPSO to check whether the given image is recognized or not. To analyze the performance of our proposed recognition system, images from the UBIRIS iris database are used, and the performance of the system is compared with an existing system, a PSO-optimized Feed Forward Neural Network, and a plain Feed Forward NN.
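For context, the textbook PSO update that AAPSO builds on is shown below. This is only the generic particle update (one particle, one dimension, standard inertia and acceleration coefficients); the adaptive acceleration rule that distinguishes the authors' AAPSO is not reproduced here:

```python
import random

def pso_step(x, v, pbest, gbest, w=0.7, c1=1.5, c2=1.5):
    """One velocity/position update for a single 1-D particle:
    inertia + pull toward personal best + pull toward global best."""
    r1, r2 = random.random(), random.random()
    v_new = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
    return x + v_new, v_new

random.seed(0)
x, v = pso_step(x=2.0, v=0.1, pbest=1.0, gbest=0.0)
print(x, v)  # the particle moves toward the best positions
```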



    Keywords: Feed Forward Back propagation Neural Network (FFBNN), Adaptive Median Filter, Feature Extraction, Iris Recognition, Adaptive Acceleration Particle Swarm Optimization (AAPSO).


    New Automatic Clustering Method Based on the Dissemination of Binary Trees Applied to Video Segmentation

    by Adil Chergui, Abdelkrim Bekkhoucha, Wafae Sabbar


    Abstract - In order to manage the growing amount of video information efficiently, a video scene segmentation method is necessary. Many advanced video applications, such as video on demand and digital library indexing, also require scene detection to organize video content. In this paper we use clustering techniques in the video processing field to perform video scene segmentation. Data clustering is a useful technique for discovering interesting data distributions and trends in the underlying data. The concentrated effort of the research community over the last few years has produced many approaches to data clustering and progressed the field quickly. However, more work remains to be done on non-parametric clustering techniques for large databases of high dimensionality. To this end, we have developed a novel non-parametric clustering method based on the dissemination of binary tree structures over a spatial representation of the data. To assess the performance of this method, we compared it with conventional clustering methods, and we show its applications and advantages for video scene segmentation.



    Keywords: Clustering Algorithms, Trees of Dissemination, Density Based Clustering, Criteria of Validation, Dissimilarity Matrix, K-Means, Video Scene Segmentation.


    Dead Sea Water Level and Surface Area Monitoring Using Spatial Data Extraction from Remote Sensing Images

    by Nazeeh A. Ghatasheh, Mua’ad M. Abu-Faraj, Hossam Faris


    Abstract - Satellite images provide important information on the earth's surface, geographic areas, weather and natural phenomena. Analyzing satellite images of the Dead Sea in Jordan can help determine the water level and surface area of the Dead Sea and its rate of decline. This paper derives various measurements from temporal medium-quality remote sensing images to calculate the surface area of the Dead Sea over the past 29 years. It shows the outcome of applying spatial calculations to Dead Sea images taken from Google Earth Engine. Our analysis approach is to extract the region of interest and then perform spatially based calculations, deriving several results according to an adapted mathematical model of the Dead Sea. The main findings are a calculated shrinkage of 60 km2 and a decline of 18 meters over the study period, with a maximum of 1.8 meters in the water level change. In addition, our findings show the atmospheric noise tolerance of each processing technique. We also present a case study illustrating our analysis.
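The core spatial calculation, estimating surface area from a classified image, amounts to counting water pixels and scaling by the ground resolution. The tiny mask and the 250 m/pixel resolution below are made-up illustration values, not the paper's data:

```python
def surface_area_km2(mask, metres_per_pixel):
    """Area of the water region in a binary classification mask
    (1 = water pixel), converted from m^2 to km^2."""
    water_pixels = sum(px for row in mask for px in row)
    return water_pixels * (metres_per_pixel ** 2) / 1e6

mask = [[1, 1, 0],
        [1, 0, 0]]
print(surface_area_km2(mask, 250))  # 3 pixels * 62500 m^2 = 0.1875 km^2
```

Differencing two such areas computed from images taken years apart gives the shrinkage figure the abstract reports.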



    Keywords: Dead Sea, Remote Sensing, Spatial Data Analysis, Temporal Image Processing.


    3D Face Matching Based on Depth-Level Curves

    by Naouar Belghini, Arsalane Zarghili


    Abstract - In the domain of 3D face matching, many techniques have been developed as variants of 3D facial matching approaches that reduce the facial data to a few 3D curves. In the literature, many curves have been considered: level curves, radial curves, iso-stripes, crest lines, etc. In this paper, we exploit the curve concept to represent 3D facial geometry. First, depth-level curves are extracted to represent the 3D facial data; then we investigate the dimensionality reduction offered by Random Projection to build an artificial face recognition system using a back-propagation neural network. The experiment was conducted using VRML files from the FRAV database, considering only one training sample per person.



    Keywords: 3D Face Recognition, Depth-Level Curves, Neural Network Classifier, Random Projection.


    An Efficient Multimodal Biometric System Based on Feature Level Fusion of Palmprint and Finger Vein

    by C. Murukesh, K. Thanushkodi


    Abstract - Biometric authentication plays a vital role in providing security and privacy. This paper presents a contemporary approach to identifying an individual using multimodal biometrics, which is in great demand to overcome the issues involved in single-trait systems. Finger vein and palmprint biometrics are promising technologies, now widely used because of important features such as resistance to criminal tampering, high accuracy, ease of feature extraction and fast authentication speed. The features extracted from the preprocessed finger vein and palmprint images using the Contourlet Transform reduce the overall dimensionality and computational complexity. Fusion at the feature level combines the feature vectors of finger vein and palmprint using the Discrete Stationary Wavelet Transform (DSWT) to improve the overall performance of the multimodal biometric system. Integrating the biometric traits increases the robustness of person identification and reduces fraudulent access. Experimental results based on a homologous database demonstrate that the proposed system is very efficient in reducing the False Rejection Rate (FRR) and False Acceptance Rate (FAR).
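To make the phrase "fusion at feature level" concrete, the simplest common baseline is normalize-then-concatenate, shown below with invented feature values. Note this is only a baseline sketch: the paper itself fuses the vectors with the DSWT, which is not reproduced here:

```python
def minmax(v):
    """Min-max normalize a feature vector to [0, 1]."""
    lo, hi = min(v), max(v)
    return [(x - lo) / (hi - lo) for x in v] if hi > lo else [0.0] * len(v)

def fuse(palm_features, vein_features):
    """Feature-level fusion baseline: normalize each modality's
    vector, then concatenate into one fused vector."""
    return minmax(palm_features) + minmax(vein_features)

fused = fuse([10.0, 20.0, 15.0], [0.2, 0.8])
print(fused)  # [0.0, 1.0, 0.5, 0.0, 1.0]
```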



    Keywords: Multimodal Biometrics, Contourlet Transform, Feature Level Fusion, DSWT.


    An Analysis of Object Detection and Tracking Using Recursive and Non Recursive Algorithms for Motion Based Video

    by Thulasimani K., Srinivasagan K. G.


    Abstract - This study proposes a framework for evaluating recursive and non-recursive algorithms for motion-based video object detection and tracking. Object detection and tracking is a challenging task: video-based object detection systems rely on the ability to detect moving objects in video streams, and many approaches have been adopted for this purpose. Several factors must be considered, such as stationary and non-stationary backgrounds, unconstrained environments, various object motion patterns, and dissimilarity in the types of objects being detected and tracked. In this study, recursive and non-recursive algorithms, namely frame differencing and Mixture of Gaussians, are used to detect objects in motion-based video through foreground and background separation. Object tracking is then performed using the Mean-Shift and Lucas-Kanade optical flow (KLT) tracking algorithms. Based on the video resolution and frame rate, the detection and tracking times are calculated for the input video dataset. Based on this evaluation, we observed that for correct detection and tracking, the recursive detection algorithm together with Mean-Shift tracking should be used to track the detected objects in motion-based video.
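Frame differencing, the non-recursive detector named in the abstract, is simple enough to sketch directly: a pixel is marked foreground when its intensity changes by more than a threshold between consecutive frames. The 3x3 integer "frames" below stand in for real video:

```python
def frame_diff(prev, curr, threshold):
    """Binary foreground mask from two consecutive grayscale frames:
    1 where the pixel changed by more than `threshold`, else 0."""
    return [[1 if abs(c - p) > threshold else 0
             for p, c in zip(prow, crow)]
            for prow, crow in zip(prev, curr)]

prev = [[10, 10, 10],
        [10, 10, 10],
        [10, 10, 10]]
curr = [[10, 80, 10],
        [10, 90, 85],
        [10, 10, 10]]
print(frame_diff(prev, curr, threshold=30))
```

Recursive detectors such as Mixture of Gaussians instead maintain a running statistical model of each background pixel across many frames, which is what makes them more robust to non-stationary backgrounds.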



    Keywords: Object Detection and Tracking, Foreground Background Segmentation, Mixture of Gaussians, Mean-Shift Tracking, Lucas Kanade Optical Flow (KLT).


    Automatic Feature Extraction Using Replica Based Approach in Digital Fundus Images

    by Padmalal S., Nelson Kennedy Babu C.


    Abstract - In diabetic retinopathy, exudates play a major part in causing blindness in diabetic patients: patients lose their vision if the exudates extend to the macular area of the retina. The spread of this disease across the retina can be prevented by automated early detection of the presence of exudates, so detecting exudates is a major challenge in the diagnostic task. The presence of exudates is identified from the variation in grey levels present in the retina. Recognition of the optic disc is necessary in the exudate detection procedure, since the two are similar in terms of color, contrast, etc. Various techniques, such as morphological approaches and region growing approaches, have previously been studied for automatic diabetic retinopathy detection, but improper automatic retinal detection leads to a greater chance of loss of sight. So, to enhance automatic feature extraction in digital fundus images, in this work we implement a replica-based approach. Here the optic disk is localized by Localized Principal Component Analysis (LPCA) and its shape is detected by a Customized Dynamic Shape Model (CDSM). The extraction of exudate fluid is done by mutual region growing and edge recognition methods. A digital fundus image coordinate system is built to present an enhanced depiction of the features. The success of the LPCA and CDSM algorithms can be attributed to the utilization of replica-based methods. The performance of the replica-based approach is measured in terms of optic disk boundary detection, sensitivity and specificity.



    Keywords: Fundus Images, Exudates Detection, Automatic Feature Extraction, Optic Disk, Localized Principal Component Analysis, Customized Dynamic Shape Model, Mutual Region Growing, Edge Detection.


    A Review of Biometric Template Protection Techniques for Online Handwritten Signature Application

    by Fahad Layth Malallah, Sharifah Mumtazah Syed Ahmad, Salman Yussof, Wan Azizun Wan Adnan, Vahab Iranmanesh and Olasimbo A. Arigbabu


    Abstract - The handwritten signature biometric is considered a noninvasive and nonintrusive process by the majority of users. Furthermore, it has high legal value for document authentication and is depended on by both commercial transactions and governmental institutions. Signature verification requires storing templates in a database, which threatens the security of the system, from the templates being stolen to being vulnerable to template playback attacks that may give an attacker invalid access to the system. Moreover, an individual cannot use his or her signature with two or more applications, otherwise a cross-matching problem will occur. The aforementioned problems can be avoided by using biometric template protection techniques for the online signature, which are reviewed and discussed in this paper considering both protection and verification. The discussion of verification covers capture devices, pre-processing, feature extraction and classification methods.



    Keywords: Authentication, Biometrics, Online Handwritten Signature Verification, Template Protection.


    Heuristic Search Attacks on Gradual Secret Release Protocol: A Cryptanalysis Approach on E-learning Security

    by Jibulal B Nair, Saurabh Mukherjee


    Abstract - This work performs cryptanalysis on the GSR fair exchange protocol to study and analyse its robustness. In order to perform the cryptanalysis, heuristic attacks are launched on the protocol. Heuristic attacks are attacks generated artificially by heuristic search algorithms. This work intends to use the renowned heuristic search algorithms Genetic Algorithm (GA) and Particle Swarm Optimization (PSO), as well as a recent search algorithm, Cuckoo Search (CS). GA and PSO have been applied to the cryptanalysis of various protocols and encryption standards, while CS is an emerging search algorithm that has proven its performance in solving optimization problems. Based on these characteristics, cryptanalysis is performed with these algorithms. Attacks are made on the protocol under various algorithm parameters and the performance is studied. The protocol and the search algorithms are implemented in MATLAB and the results are analysed in depth to study the robustness of the protocol under such security threats.
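    As an illustration of the kind of heuristic attack described above, the sketch below runs a plain genetic algorithm against a toy XOR cipher standing in for the protocol under attack. The cipher, the restricted uppercase key alphabet, and the known-plaintext fitness function are all illustrative assumptions, not the paper's actual GSR setup.

```python
import random

random.seed(1)

def xor_encrypt(msg: bytes, key: bytes) -> bytes:
    # Toy XOR cipher standing in for the protocol under attack.
    return bytes(m ^ key[i % len(key)] for i, m in enumerate(msg))

def fitness(candidate: bytes, ciphertext: bytes, known_plain: bytes) -> int:
    # Score a key guess by how many known-plaintext bytes it recovers.
    trial = xor_encrypt(ciphertext, candidate)
    return sum(a == b for a, b in zip(trial, known_plain))

def random_key(key_len):
    # Keys drawn from uppercase letters to keep the toy search space small.
    return bytes(random.randrange(65, 91) for _ in range(key_len))

def ga_attack(ciphertext, known_plain, key_len=4, pop=40, gens=300):
    popn = [random_key(key_len) for _ in range(pop)]
    for _ in range(gens):
        popn.sort(key=lambda k: -fitness(k, ciphertext, known_plain))
        if fitness(popn[0], ciphertext, known_plain) == len(known_plain):
            break                               # perfect key recovered
        elite = popn[:pop // 4]                 # selection
        children = []
        while len(elite) + len(children) < pop:
            a, b = random.sample(elite, 2)
            cut = random.randrange(1, key_len)  # one-point crossover
            child = bytearray(a[:cut] + b[cut:])
            if random.random() < 0.7:           # per-child mutation
                child[random.randrange(key_len)] = random.randrange(65, 91)
            children.append(bytes(child))
        popn = elite + children
    return max(popn, key=lambda k: fitness(k, ciphertext, known_plain))

key = b"GSRX"
plain = b"FAIR EXCHANGE PROTOCOL"
ct = xor_encrypt(plain, key)
recovered = ga_attack(ct, plain)
```

    With a 4-letter key over a 26-symbol alphabet the search space is tiny, so the GA converges within a few generations; a real attack would replace the toy cipher and fitness with the protocol's actual primitives.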



    Keywords: Gradual Secret Release (GSR), Genetic Algorithm (GA), Particle Swarm Optimization (PSO), Cuckoo Search Algorithm (CSA), Encryption, Decryption, Public Key and Private Key.


    Developing an Effective and Compressed Hybrid Signcryption Technique Utilizing Huffman Text Coding Procedure

    by R. Sujatha, M. Ramakrishnan


    Abstract - The functions of digital signature and public key encryption are simultaneously fulfilled by signcryption, a cryptographic primitive that efficiently secures the communication of very large messages. While most public key based systems are suitable only for small messages, hybrid encryption (KEM-DEM) provides a competent and practical alternative. In this paper, we develop a compressed hybrid signcryption technique based on the KEM and DEM approach. The KEM algorithm utilizes a KDF technique to encapsulate the symmetric key, and the DEM algorithm utilizes the AES algorithm to encrypt the original message. Finally, the signcrypted data are compressed with the help of the Huffman text coding procedure. For the security analysis, we introduce three games and prove that attackers fail to break the security attributes of the proposed signcryption algorithm.
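    The final compression stage relies on standard Huffman coding, which can be sketched as follows. This is generic Huffman coding, not the authors' full signcryption pipeline; the signcrypted output is represented here by an arbitrary byte string.

```python
import heapq
from collections import Counter

def huffman_code(data):
    # Build a Huffman table mapping each byte to a prefix-free bit string.
    freq = Counter(data)
    if len(freq) == 1:                       # degenerate one-symbol input
        return {next(iter(freq)): "0"}
    heap = [[n, i, [sym, ""]] for i, (sym, n) in enumerate(freq.items())]
    heapq.heapify(heap)
    nxt = len(heap)                          # unique tie-breaker for merges
    while len(heap) > 1:
        lo = heapq.heappop(heap)
        hi = heapq.heappop(heap)
        for pair in lo[2:]:                  # prepend a bit to every code
            pair[1] = "0" + pair[1]
        for pair in hi[2:]:
            pair[1] = "1" + pair[1]
        heapq.heappush(heap, [lo[0] + hi[0], nxt] + lo[2:] + hi[2:])
        nxt += 1
    return {sym: code for sym, code in heap[0][2:]}

def encode(data, table):
    return "".join(table[b] for b in data)

def decode(bits, table):
    inv = {code: sym for sym, code in table.items()}
    out, cur = bytearray(), ""
    for bit in bits:
        cur += bit
        if cur in inv:                       # prefix-freeness makes this safe
            out.append(inv[cur])
            cur = ""
    return bytes(out)

signcrypted = b"stand-in bytes for the signcrypted output"
table = huffman_code(signcrypted)
bits = encode(signcrypted, table)
```

    Because Huffman codes are prefix-free, the bit stream decodes unambiguously, and for any non-uniform byte distribution the encoded length is below the raw 8 bits per byte.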



    Keywords: KEM (Key Encapsulation Mechanism), DEM (Data Encapsulation Mechanism), KDF (Key Derivation Function), Huffman Encoding Algorithm, Signcryption, Hybrid Signcryption.


    Efficient Elliptic Curve Cryptography Encryption Framework for Cloud Computing

    by Aws N. Jaber, Mohamad Fadli Bin Zolkipli


    Abstract - Cloud computing has benefited significantly from the ever-increasing capabilities of computer hardware, including faster microprocessors, larger memory capacity, and greater network bandwidth and storage; this trend is expected to continue. Cloud computing enables the addition of colored displays to full-blown Web browsers, and some processes add embedded cloud warnings to investigate how data are stored on a local or intermediary system. For instance, all log files of a cloud computing server that uploads or downloads data, and all sessions, are stored in clear, unencrypted text on a local computer to which multiple users have access. The cloud client encrypts sessions with the standard method; however, the server does not fully protect such sessions. A low-cost encryption scheme based on an elliptic curve is established in this study to solve this issue, and a lookup table is integrated into the proposed framework.
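    The elliptic-curve machinery underlying such a scheme can be illustrated on a textbook toy curve, y² = x³ + 2x + 2 over F₁₇ with generator G = (5, 1) of order 19. The curve, key sizes and lookup-table construction used by the paper are not given in the abstract, so everything below is an illustrative assumption.

```python
# Textbook toy curve y^2 = x^3 + 2x + 2 over F_17; G = (5, 1) has order 19.
P_MOD, A = 17, 2
G = (5, 1)

def inv_mod(x):
    return pow(x, P_MOD - 2, P_MOD)    # Fermat inverse, P_MOD prime

def point_add(P, Q):
    # Affine point addition, with None as the point at infinity.
    if P is None:
        return Q
    if Q is None:
        return P
    if P[0] == Q[0] and (P[1] + Q[1]) % P_MOD == 0:
        return None                    # P + (-P) = infinity
    if P == Q:                         # doubling slope
        lam = (3 * P[0] ** 2 + A) * inv_mod(2 * P[1]) % P_MOD
    else:                              # chord slope
        lam = (Q[1] - P[1]) * inv_mod(Q[0] - P[0]) % P_MOD
    x = (lam * lam - P[0] - Q[0]) % P_MOD
    return (x, (lam * (P[0] - x) - P[1]) % P_MOD)

def scalar_mul(k, P):
    # Right-to-left double-and-add.
    R = None
    while k:
        if k & 1:
            R = point_add(R, P)
        P = point_add(P, P)
        k >>= 1
    return R

# A lookup table of precomputed multiples of G; the abstract mentions a
# lookup table, and caching multiples is one plausible use of it.
TABLE = {k: scalar_mul(k, G) for k in range(1, 19)}

# Toy Diffie-Hellman over the curve: both sides derive the same point.
a, b = 7, 11
shared_a = scalar_mul(a, scalar_mul(b, G))
shared_b = scalar_mul(b, scalar_mul(a, G))
```

    Since G has prime order 19, scalar multiples wrap around modulo 19, which is why both parties arrive at the same shared point.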



    Keywords: Cloud Public Key, Community Cloud Computing.


    Performance Evaluation of CSTA Based Effective Transpiration in Data Hiding

    by R. Kalaiselvi, V. Kavitha


    Abstract - Cryptography is effectively synonymous with the technique of encryption and is legally prioritized for secure communication. The cyclic shift transposition algorithm (CSTA) is utilized to disguise information and transmit messages from one end to another in a grid environment. Grid computing is a unique computational approach which mainly focuses on solving precisely specified problems in a heterogeneous environment while honouring the obligations to its users, and security is a critical area in grid computing. In the proposed work, the cyclic shift transposition algorithm (CSTA) is implemented in Java and integrated with GridSim. In CSTA, proportionate numbers of rows and columns are considered, and the two main operations involved are encryption and decryption. During the encryption process, the encrypted message is concealed within an image, contributing added security; it is then decrypted in the same manner. Compared to other security algorithms, the proposed CSTA algorithm is adequate, decisively enhances security and provides better performance.
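    The exact CSTA construction is not spelled out in the abstract. As an illustrative assumption, the sketch below shows the general idea of a cyclic shift transposition: write the message row-wise into a grid of proportionate rows and columns, then cyclically shift each row by its index.

```python
def cyclic_shift_encrypt(msg, cols):
    # Write the message row-wise into a cols-wide grid (padding with '_'),
    # then cyclically left-shift row i by i positions and read the rows out.
    msg += "_" * ((-len(msg)) % cols)
    rows = [msg[i:i + cols] for i in range(0, len(msg), cols)]
    return "".join(r[i % cols:] + r[:i % cols] for i, r in enumerate(rows))

def cyclic_shift_decrypt(ct, cols):
    # Invert by cyclically right-shifting row i by i positions
    # (any '_' padding added at encryption time is retained).
    rows = [ct[i:i + cols] for i in range(0, len(ct), cols)]
    return "".join(r[-(i % cols):] + r[:-(i % cols)]
                   for i, r in enumerate(rows))

ct = cyclic_shift_encrypt("GRIDSECURITY", 4)
```

    In a full CSTA pipeline the transposed text would then be hidden inside an image; that steganographic step is omitted here.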



    Keywords: Grid Security, Encryption, Decryption, CSTA Model, Data Hiding.


    Analysis of Electroencephalogram Signals Using ANFIS and Periodogram Techniques

    by S. Elouaham, R. Latif, B. Nassiri, A. Dliou, M. Laaboubi, F. Maoulainine


    Abstract - In this paper, the Adaptive Neuro-Fuzzy Inference System (ANFIS), Empirical Mode Decomposition (EMD) and the Discrete Wavelet Transform (DWT) are applied. An electroencephalogram (EEG) is a diagnostic test which measures the electrical activity of the brain using highly sensitive recording equipment attached to the scalp by fine electrodes. EEG recordings are often affected by noise, which strongly hampers their visual analysis. To overcome this problem, the ANFIS, EMD and DWT denoising techniques are applied. The efficiency of ANFIS, EMD and DWT in removing the noise was evaluated by several standard metrics computed between the filtered EEG output and the clean original signal. The results obtained show that ANFIS outperformed the other denoising techniques in terms of localization of the components of the abnormal EEG signal. Due to the non-stationary nature of the EEG signal, the use of time-frequency techniques is inevitable; the parametric time-frequency technique used here is the periodogram (PE). The EEG signals used are both normal and abnormal; the abnormal signals are obtained from a patient with sleep-disordered breathing (SDB) and a patient with sleep movement disorders (periodic leg movements, or PLM). The PE technique shows higher performance in terms of resolution and suppression of interference terms than the non-parametric time-frequency techniques reported in the scientific literature. This study demonstrates that the combination of the ANFIS and PE techniques is a good solution in biomedicine. For the experimental study we used the MIT/BIH arrhythmia database, and simulations were carried out in the MATLAB environment.
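    The periodogram itself is simple to state: it is the squared magnitude of the discrete Fourier transform of the record, normalized by the record length. A minimal pure-Python version (a direct DFT, adequate for short records; the signal below is a synthetic tone, not EEG data):

```python
import cmath
import math

def periodogram(x):
    # P[k] = |X[k]|^2 / N, computed with a direct DFT (fine for short records).
    N = len(x)
    P = []
    for k in range(N):
        X = sum(x[n] * cmath.exp(-2j * math.pi * k * n / N) for n in range(N))
        P.append(abs(X) ** 2 / N)
    return P

# An 8 Hz tone sampled at 64 Hz for one second should peak at bin 8.
fs, N = 64, 64
x = [math.sin(2 * math.pi * 8 * n / fs) for n in range(N)]
P = periodogram(x)
peak = max(range(N // 2 + 1), key=P.__getitem__)
```

    For a unit-amplitude tone that fits an integer number of cycles into the record, the peak bin carries power N/4 per sample, i.e. P[8] = 16 here.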



    Keywords: EEG, ANFIS, EMD, DWT, Time-Frequency, Periodogram.


    Optimized Fuzzy Min-Max Artificial Neural Network for Cervical Cancer Application

    by Anas Mohammad Quteishat


    Abstract - In this paper, the application of a Fuzzy Min-Max (FMM) neural network optimized by a Genetic Algorithm (GA) to cervical cancer cell classification is proposed. The proposed system classifies cervical cells as normal, low-grade squamous intra-epithelial lesion (LSIL) or high-grade squamous intra-epithelial lesion (HSIL). The system consists of three stages. In the first stage, cervical cells are segmented using the Adaptive Fuzzy Moving K-means (AFMKM) clustering algorithm. In the second stage, feature extraction is performed, where a total of 18 features are extracted. Finally, in the third stage, the extracted features are fed to the FMM neural network with GA optimization for classification. The obtained results show that the proposed system can enhance cancer cell classification. To further assess the obtained results, the bootstrap hypothesis statistical technique is used.
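    At the heart of an FMM network is Simpson's hyperbox membership function, which returns 1 for points inside a class hyperbox and decays with distance outside it. A minimal 2-D sketch follows; the hyperbox bounds, the sensitivity value gamma and the two-class example are illustrative choices, not the paper's fitted model.

```python
def hyperbox_membership(x, v, w, gamma=4.0):
    # Simpson's fuzzy min-max membership of point x in hyperbox [v, w]:
    # exactly 1.0 inside the box, decaying with distance outside each face.
    # gamma controls how fast membership falls off (illustrative value).
    s = 0.0
    for xi, vi, wi in zip(x, v, w):
        s += max(0.0, 1.0 - max(0.0, gamma * min(1.0, xi - wi)))
        s += max(0.0, 1.0 - max(0.0, gamma * min(1.0, vi - xi)))
    return s / (2 * len(x))

# One hyperbox per class; a cell's feature vector is assigned to the class
# whose hyperbox gives it the highest membership.
boxes = {"normal": ((0.2, 0.2), (0.4, 0.4)),
         "LSIL":   ((0.6, 0.6), (0.8, 0.8))}
cell = (0.5, 0.3)
label = max(boxes, key=lambda c: hyperbox_membership(cell, *boxes[c]))
```

    In the full FMM algorithm the hyperboxes themselves are grown and contracted during training, and in the paper's variant the GA tunes the network further.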



    Keywords: Fuzzy Min-Max Neural Network, Genetic Algorithm, Adaptive Fuzzy Moving K-means, Cervical Cancer.


    Watermarking of Medical Images with Optimized Biogeography

    by A. Umaamaheshvari, K. Prabhakaran, K. Thanushkodi


    Abstract - Multimedia security has become an active area of research in recent years due to the extensive growth of multimedia data exchange on the internet. A number of security techniques have been developed to deal with attackers and hackers, but most existing security approaches do not provide reliable results under certain attacks. This research work presents a novel technique to deal with these security problems, using the digital watermarking process, which embeds the original information into a digitized signal. Strength and reliability are the two important qualities in watermark embedding. However, digital watermarking has its own limitations: inserting the original information into the cover image and identifying the exact location without degrading the original image quality is difficult. In order to solve this problem, this research work proposes a biogeography algorithm with GA optimization to find the optimal locations in images at which to embed information. But the genetic optimization algorithm has certain limitations, such as slow convergence and lower accuracy. Hence, this research work also uses Biogeography Particle Swarm Optimization (BPSO) and Biogeography Firefly optimization (BFA). Experiments are carried out with standard images and the performances of the three methods are compared on parameters such as MSE, SSIM and PSNR. It is observed that BFA shows the best results across different threshold values, which results in better visual quality of the watermarked images.
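    Two of the quality metrics used in the comparison, MSE and PSNR, are straightforward to compute. A minimal sketch for flat lists of 8-bit pixels follows (SSIM is more involved and omitted; the one-pixel "watermark" below is purely illustrative):

```python
import math

def mse(a, b):
    # Mean squared error between two equal-length flat pixel lists.
    return sum((x - y) ** 2 for x, y in zip(a, b)) / len(a)

def psnr(a, b, peak=255.0):
    # Peak signal-to-noise ratio in dB; higher means less visible distortion.
    e = mse(a, b)
    return float("inf") if e == 0 else 10 * math.log10(peak * peak / e)

cover = [100] * 64            # a flat 8x8 "image"
marked = list(cover)
marked[10] = 108              # one pixel changed by the embedded watermark
```

    A single 8-level change in 64 pixels gives MSE = 1 and a PSNR of about 48 dB, well above the roughly 40 dB usually considered imperceptible.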



    Keywords: Watermarking, Genetic Algorithm, Particle Swarm Optimization, Firefly Algorithm, Biogeography, Independent Watermark.


    Bi-Dimensional Zero Padding Angular Interpolation for Arc Handling in Computed Tomography Scanner

    by Ahmed Bacha, Aa. Oukebdanne, Ah. Belbachir


    Abstract - In this work, we examine the accuracy of our proposed zero-padding interpolation algorithm, compared to standard techniques, for the task of interpolating additional projections to be inserted into a sinogram which has lost some projections due to X-ray tube arcing in computed tomography scanners. During the time the X-ray tube recovers to full voltage after an arc, image data are not collected and projections are lost, causing poor image quality in the tomographic reconstruction process. We have developed an algorithm based on an estimated calculation of a virtual projection using zero-padding interpolation. The results show a significant reduction of star-effect noise in the reconstructed image. Our algorithm was compared to a simple linear interpolation method using statistical hypothesis testing. Our simulation was tested on an R × R (R being the number of pixels in row and column) phantom simulating the human body, with the programming carried out in MATLAB 6.5.
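    Zero-padding interpolation in one dimension can be sketched as follows: transform the signal, insert zeros in the middle of its spectrum (splitting the Nyquist bin), and inverse-transform. The direct DFT below is for clarity rather than speed, and the short test signal stands in for whatever projection length the authors actually interpolate.

```python
import cmath
import math

def dft(x):
    N = len(x)
    return [sum(x[n] * cmath.exp(-2j * math.pi * k * n / N) for n in range(N))
            for k in range(N)]

def idft(X):
    N = len(X)
    return [sum(X[k] * cmath.exp(2j * math.pi * k * n / N) for k in range(N)) / N
            for n in range(N)]

def zero_pad_interp(x, factor=2):
    # Upsample a real signal by inserting zeros in the middle of its spectrum.
    N = len(x)                      # N assumed even here
    L = factor * N
    X = dft(x)
    Y = [0j] * L
    for k in range(N // 2):         # non-negative frequencies
        Y[k] = X[k]
    for k in range(N // 2 + 1, N):  # negative frequencies
        Y[L - N + k] = X[k]
    Y[N // 2] = X[N // 2] / 2       # split the Nyquist bin in two
    Y[L - N // 2] = X[N // 2] / 2
    # Scale by `factor` so the original samples are reproduced exactly.
    return [factor * v.real for v in idft(Y)]

x = [math.sin(2 * math.pi * n / 8) for n in range(8)]
y = zero_pad_interp(x)   # 16 samples; even-indexed ones match x
```

    For a bandlimited input like this single tone, the interpolated odd-indexed samples fall exactly on the underlying sinusoid, which is the property that makes spectral zero-padding attractive for estimating missing projections.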



    Keywords: Filtered Back Projection, Computed Tomography, Zero Padding, Linear Interpolation, Radon Projection Estimations.


    Cross-Layer based Energy Efficient Congestion Control Protocol for MANETs

    by R. Vinod Kumar, R. S. D. Wahidabanu


    Abstract - The most common issue that the transport, data-link and network layers face in MANETs is congestion. To overcome this issue, MANETs need a cross-layer congestion control technique that considers the hops and the energy consumption during transmission. So, in this paper, a Cross-layer based Energy Efficient Congestion Control Routing Protocol for MANETs is proposed. This approach selects the shortest paths for transmission using the PEER approach. For the selected shortest paths, the network estimates the link cost, which takes into account the transmission power and receiving power. In addition, a method is proposed for detecting congestion on the selected paths. Simulation results show that the proposed protocol achieves increased energy efficiency, decreased end-to-end delay, increased packet delivery ratio and decreased packet loss.
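    The abstract does not give the exact PEER cost formula, so as an illustrative assumption the sketch below takes link cost to be transmission power plus receiving power and finds the cheapest path with Dijkstra's algorithm; the tiny four-node topology is likewise made up.

```python
import heapq

def link_cost(tx_power, rx_power):
    # Illustrative energy-style link cost: the paper says the estimate
    # considers transmission and receiving power, but not how.
    return tx_power + rx_power

def cheapest_path(graph, src, dst):
    # Dijkstra's algorithm over link costs; graph: node -> [(neighbor, cost)].
    dist = {src: 0.0}
    prev = {}
    pq = [(0.0, src)]
    while pq:
        d, u = heapq.heappop(pq)
        if u == dst:
            break
        if d > dist.get(u, float("inf")):
            continue                     # stale queue entry
        for v, c in graph.get(u, []):
            nd = d + c
            if nd < dist.get(v, float("inf")):
                dist[v], prev[v] = nd, u
                heapq.heappush(pq, (nd, v))
    path, node = [dst], dst              # walk predecessors back to src
    while node != src:
        node = prev[node]
        path.append(node)
    return path[::-1], dist[dst]

graph = {
    "A": [("B", link_cost(0.5, 0.5)), ("C", link_cost(0.75, 0.25))],
    "B": [("D", link_cost(0.5, 0.5))],
    "C": [("D", link_cost(2.0, 1.0))],
}
path, cost = cheapest_path(graph, "A", "D")
```

    The route A-B-D wins here because its total energy cost (2.0) beats A-C-D (4.0); congestion detection on the chosen path would then run as a separate mechanism.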



    Keywords: Shortest Paths, Link Cost, Transmission Power, Receiving Power, MAC Layer and Congestion Detection.


    Authorship Attribution in Tamil Language Email for Forensic Analysis

    by A. Pandian, Abdul Karim Sadiq


    Abstract - This paper presents authorship attribution (AA) for Tamil-language email. The work presents the generation of representative signatures of Tamil emails using lexical and syntactic methods. The signature of each email has a large dimension; in order to make it suitable for subsequent processing, the high-dimensional signature is converted into a 2-dimensional pattern using Fisher's linear discriminant (FLD) method. The 2-dimensional patterns of the signatures are used as training data for a radial basis function (RBF) network and an echo state neural network (ESNN). Improved classification of Tamil email is demonstrated by transforming the patterns using FLD followed by training the RBF network, as well as by training the ESNN. The paper presents a new technique for building the signature database and for optimal AA in Tamil email forensics.
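    For the two-class case, Fisher's linear discriminant reduces to w = S_w⁻¹(m₁ − m₂), where S_w is the pooled within-class scatter. A 2-D sketch using the closed-form 2×2 inverse follows; the toy point clouds stand in for the much higher-dimensional email signature vectors the paper projects.

```python
def fisher_direction(class_a, class_b):
    # Two-class Fisher discriminant in 2-D: w = Sw^-1 (mean_a - mean_b),
    # using the closed-form inverse of the 2x2 pooled scatter matrix Sw.
    def mean(pts):
        n = len(pts)
        return (sum(p[0] for p in pts) / n, sum(p[1] for p in pts) / n)

    ma, mb = mean(class_a), mean(class_b)
    s00 = s01 = s11 = 0.0
    for pts, m in ((class_a, ma), (class_b, mb)):
        for p in pts:
            dx, dy = p[0] - m[0], p[1] - m[1]
            s00 += dx * dx
            s01 += dx * dy
            s11 += dy * dy
    det = s00 * s11 - s01 * s01
    dmx, dmy = ma[0] - mb[0], ma[1] - mb[1]
    return ((s11 * dmx - s01 * dmy) / det,
            (-s01 * dmx + s00 * dmy) / det)

def project(p, w):
    # 1-D Fisher score of a point: its projection onto w.
    return p[0] * w[0] + p[1] * w[1]

class_a = [(1, 1), (2, 1), (1, 2), (2, 2)]   # toy "author A" signatures
class_b = [(6, 6), (7, 6), (6, 7), (7, 7)]   # toy "author B" signatures
w = fisher_direction(class_a, class_b)
```

    Projecting every point onto w separates the two authors on a single axis, which is the property the paper exploits before handing the patterns to the RBF and ESNN classifiers.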



    Keywords: Echo State Neural Network, Tamil Email, Lexical Features, Syntactic Features, Discriminant Function, Radial Basis Function.



