
International Research & Analysis Journal

IRAJ VOLUMES

IRAJ Volume 6, Issue 1: 2015-07-15 to 2015-10-14

  • pages (96-102)


    Topic Name: Increased Performance of MapReduce by High Performance Processes

    Author(s): Shubhra Farsoiya, Deepak Choubey


    Quick abstract

    Hadoop is a framework that allows for the distributed processing of large data sets across clusters of computers using simple programming models. It is designed to scale up from single servers to thousands of machines, each offering local computation and storage. In general, a map task is divided into map and combine phases, while a reduce task is divided into copy, sort, and reduce phases. Since Hadoop-0.20, reduce tasks can start when only some map tasks complete, which allows reduce tasks to copy map outputs earlier as they become available and hence mitigates network congestion. However, no reduce task can step into the sort phase until all map tasks complete, because each reduce task must finish copying outputs from all the map tasks to prepare the input for the sort phase. This research proposes an approach to improve the performance of MapReduce-based Hadoop implementations by applying an additional sorting mechanism combined with an incremental clustering approach.
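
    The sort-phase bottleneck described here is easy to picture with a small sketch. The following Python fragment (a hypothetical illustration, not the paper's implementation) shows how a reduce task could merge already-sorted map outputs incrementally as they arrive, instead of sorting only after the last map task completes:

```python
import heapq

class IncrementalReducerInput:
    """Merge sorted map-output runs as they arrive, rather than
    waiting for every map task before the sort phase can start."""

    def __init__(self):
        self.runs = []  # each run: a list of (key, value), sorted by key

    def add_map_output(self, run):
        # Each map task emits its partition already sorted by key,
        # so the run can join a streaming merge with no extra sort.
        self.runs.append(run)

    def sorted_records(self):
        # heapq.merge streams the runs in key order without first
        # materializing one large list (an incremental merge sort).
        return heapq.merge(*self.runs, key=lambda kv: kv[0])

# Hypothetical usage: runs arrive as individual map tasks finish.
inp = IncrementalReducerInput()
inp.add_map_output([("a", 1), ("c", 2)])
inp.add_map_output([("b", 5), ("c", 7)])
for key, value in inp.sorted_records():
    print(key, value)
```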


  • pages (103-108)


    Topic Name: High Performance Multi Agent Based Intrusion Detection System for Network

    Author(s): Rimjhim Jain, Shubhi Srivastava

    Keywords: Multi Agent System, IDS, Mobile Adhoc Network, MANET, Madkit, Routing


    Quick abstract

    This work proposes a multi-agent system whose agents cooperate to detect intrusions. Some agents implement IDS models, others evaluate the predictions made by the first group, and a third kind of agent weighs the evaluators' suggestions to establish the overall IDS effectiveness. The dynamic weights involved in the adaptive evaluation, which combines the suggestions of several different intrusion detection models, show better performance than the classical approach based on the average sum of the received predictions. The improvement has been measured by introducing a promising metric that takes response costs into account. The recent introduction of decision-making techniques to intrusion detection reveals the need for formal, robust metrics that consider all the parameters involved in the task. Testing has been carried out with real data instead of synthetic data; this is not an easy task, because of the difficulty of experimenting in real networks and the questionable validity of results based on simulated traffic. Adaptive agent behavior can be very useful in intrusion detection, given the highly dynamic environment and the need for automated responses.
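
    As a rough illustration of the adaptive evaluation described above, the sketch below combines per-model intrusion scores using dynamic weights that are updated from observed outcomes. The multiplicative-weights update and the learning rate are illustrative assumptions, not the paper's method:

```python
import numpy as np

class AdaptiveEvaluator:
    """Combine the predictions of several IDS models with dynamic
    weights, instead of a plain average of the predictions."""

    def __init__(self, n_models, lr=0.5):
        self.weights = np.ones(n_models) / n_models
        self.lr = lr

    def suggest(self, predictions):
        # predictions: per-model intrusion scores in [0, 1]
        return float(np.dot(self.weights, predictions))

    def update(self, predictions, truth):
        # Down-weight models that were far from the observed label,
        # then renormalize so the weights remain a distribution.
        errors = np.abs(np.asarray(predictions, dtype=float) - truth)
        self.weights *= np.exp(-self.lr * errors)
        self.weights /= self.weights.sum()

ev = AdaptiveEvaluator(n_models=3)
print(ev.suggest([0.9, 0.2, 0.8]))    # weighted suggestion
ev.update([0.9, 0.2, 0.8], truth=1)   # the second model loses weight
print(ev.weights)
```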


  • pages (109-115)


    Topic Name: Distributed Web Data Mining Using Incremental Clustering

    Author(s): Deepanshu Mahdel, Asst. Prof. Brajesh Patel

    Keywords: Incremental Clustering, Web Usage Mining, Web Data Mining, Proxy Log, Clustering


    Quick abstract

    The Internet is growing very fast, and a large amount of information is stored on it every minute. This increasing amount of information provides users with more options, but also makes it difficult to find the "right" or "interesting" information within it. When a user accesses the Web, his usage details are logged on the servers. The Web access log contains a lot of information about how users explore the Web. Web usage mining discovers user preferences from this log and makes recommendations based on the extracted knowledge. Clustering is a pivotal building block in many data mining applications and in machine learning. In this work, two types of processing have been considered: 1) off-line (batch) processing and 2) online, or incremental, clustering. Incremental clustering requires initial clusters to be decided in advance, i.e., they must pre-exist for processing. If the initial clusters are to be fixed, there are several ways this can be achieved.
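
    A minimal sketch of the incremental step, assuming the pre-existing initial clusters that the abstract calls for; the Euclidean distance threshold and running-mean centroid update are illustrative choices, not the paper's algorithm:

```python
import numpy as np

def incremental_cluster(sessions, initial_centroids, threshold=0.5):
    """Leader-style incremental clustering: each new session vector
    joins the nearest existing cluster, or starts a new cluster when
    no centroid is close enough. Initial clusters must pre-exist."""
    centroids = [np.asarray(c, dtype=float) for c in initial_centroids]
    counts = [1] * len(centroids)
    labels = []
    for s in sessions:
        s = np.asarray(s, dtype=float)
        dists = [np.linalg.norm(s - c) for c in centroids]
        best = int(np.argmin(dists))
        if dists[best] <= threshold:
            # fold the point into the running mean of that cluster
            counts[best] += 1
            centroids[best] += (s - centroids[best]) / counts[best]
            labels.append(best)
        else:
            centroids.append(s)   # no close cluster: open a new one
            counts.append(1)
            labels.append(len(centroids) - 1)
    return labels, centroids

labels, _ = incremental_cluster([[0.1, 0.2], [0.9, 0.8], [0.15, 0.25]],
                                initial_centroids=[[0.0, 0.0]])
print(labels)  # [0, 1, 0]
```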


  • pages (116-122)


    Topic Name: An Approach for Contrast Enhancement of Color Images in HSV Color Space Using DCT, SVD and Adaptive Histogram Equalization with Color Preserving Framework

    Author(s): Prakiti Kapoor, Sandeep Sahu

    Keywords: Histogram Equalization, Adaptive Histogram Equalization, Contrast Enhancement, Color Preservation, Hue Saturation Value Color Model, Discrete Cosine Transform, Singular Value Decomposition


    Quick abstract

    Preserving color information in a color image is a challenging task. Many widely used algorithms are able to enhance the contrast of a given color image, but these methods are not able to preserve the color information in the processed image. In this work we propose an algorithm for enhancing the contrast of a color image without significantly affecting its color information. The proposed method works in the HSV (Hue Saturation Value) color space: it decomposes the input color image's V channel into high- and low-frequency parts using the Discrete Cosine Transform, and then uses the Singular Value Decomposition for contrast enhancement. The results show that the proposed method is able to enhance the contrast of a given color image without significantly affecting its color information.
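
    A sketch of the pipeline just described, using OpenCV, NumPy, and SciPy; the low-frequency block size k and the use of a histogram-equalized copy of V as the brightness reference are assumptions made for illustration, not parameters taken from the paper:

```python
import cv2
import numpy as np
from scipy.fft import dctn, idctn

def enhance_contrast(bgr, k=8):
    """Work in HSV so hue and saturation (the color) stay untouched:
    DCT splits the V channel into frequency bands, and the singular
    values of the low-frequency block are scaled toward those of an
    equalized version of the channel."""
    hsv = cv2.cvtColor(bgr, cv2.COLOR_BGR2HSV)
    h, s, v = cv2.split(hsv)
    v_eq = cv2.equalizeHist(v)  # equalized reference for the target range
    D = dctn(v.astype(np.float64), norm="ortho")
    D_eq = dctn(v_eq.astype(np.float64), norm="ortho")
    # Correction factor: ratio of the largest singular values of the
    # low-frequency (illumination) blocks of the two transforms.
    s1 = np.linalg.svd(D[:k, :k], compute_uv=False)
    s2 = np.linalg.svd(D_eq[:k, :k], compute_uv=False)
    # Scaling the block by this ratio scales all its singular values.
    D[:k, :k] *= s2.max() / s1.max()
    v_new = np.clip(idctn(D, norm="ortho"), 0, 255).astype(np.uint8)
    return cv2.cvtColor(cv2.merge([h, s, v_new]), cv2.COLOR_HSV2BGR)

img = cv2.imread("photo.jpg")            # hypothetical input file
cv2.imwrite("photo_enhanced.jpg", enhance_contrast(img))
```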


  • pages (123-129)


    Topic Name: Efficient Web Usage Mining using Ant Colony Optimization

    Author(s): Rakhi Chourasia, Prof. Brajesh Patel

    Keywords: Web Data Mining, Clustering, XML, XPATH, XSL, ANT Clustering


    Quick abstract

    The process of grouping a set of physical or abstract objects into classes of similar objects is called clustering. Several techniques and algorithms are used for extracting hidden patterns from large data sets and finding the relationships between them. The main novelty of the Hierarchical Data Divisive Soft Clustering (H2DSC) algorithm is that it is a quality-driven algorithm: it dynamically evaluates a multi-dimensional quality measure of each cluster to drive the generation of the soft hierarchy. Specifically, it generates a hierarchy in which each node is split into a variable number of subnodes. Clusters at the same hierarchical level share a minimum quality value, and clusters at lower levels of the hierarchy have a higher quality; this way, more specific clusters (lower-level clusters) have a higher quality than more general clusters (upper-level clusters). Further, since the algorithm generates a soft partition, a document can belong to several sub-clusters with distinct membership degrees. The proposed algorithm is divisive, and it is based on a combination of a modified bisecting K-Means algorithm with a flat soft clustering algorithm used to partition each node.
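
    A rough Python sketch of the divisive, quality-driven idea, using scikit-learn's KMeans for the bisecting step. The mean-cosine-similarity quality measure and the thresholds are illustrative stand-ins for the paper's multi-dimensional measure, and soft membership degrees are omitted for brevity:

```python
import numpy as np
from sklearn.cluster import KMeans

def quality(docs):
    # Illustrative quality: mean cosine similarity to the centroid,
    # so tighter (more specific) clusters score higher.
    centroid = docs.mean(axis=0)
    sims = docs @ centroid / (
        np.linalg.norm(docs, axis=1) * np.linalg.norm(centroid) + 1e-12)
    return float(sims.mean())

def h2dsc_sketch(X, min_quality=0.9, depth=0, max_depth=4):
    node = {"size": len(X), "quality": quality(X), "children": []}
    # Keep bisecting until the node is specific (high-quality) enough,
    # so lower levels end up with higher quality than upper levels.
    if node["quality"] >= min_quality or len(X) < 4 or depth >= max_depth:
        return node
    km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)
    for label in (0, 1):
        child = X[km.labels_ == label]
        if len(child):
            node["children"].append(
                h2dsc_sketch(child, min_quality, depth + 1, max_depth))
    return node

tree = h2dsc_sketch(np.random.rand(200, 5))
print(tree["quality"], [c["quality"] for c in tree["children"]])
```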


  • pages (130-136)


    Topic Name: Social Data Mining using Cultural Algorithm to make usable in Intrusion Detection System

    Author(s): Gurmeet Gill, Prof. Prateek Gupta

    Keywords: Web Data Mining, Clustering, XML, XPATH, XSL, Extended Cultural Algorithm


    Quick abstract

    Classification rule mining is among the tasks most sought by users, since rules represent a highly comprehensible form of knowledge. The rules are evaluated using objective and subjective metrics. The user should be able to specify the properties of the rules, and the rules discovered must have some of these properties to be useful. These properties may be conflicting; hence, the discovery of rules with specific properties is a multi-objective optimization problem. The Cultural Algorithm (CA), which derives from social structures, incorporates evolutionary systems and agents, and uses five knowledge sources (KS's) for the evolution process, is well suited to solving multi-objective optimization problems. In the current study, a cultural algorithm for classification rule mining is proposed for the multi-objective optimization of rules. The social data created by the individuals in the ECA is to be converted into actionable social knowledge, or collective social intelligence, to be applied in applications such as an intrusion detection and prevention system to address the computer security problem.
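
    A minimal skeleton of the cultural-algorithm loop, assuming user-supplied evaluate, random_rule, and mutate functions. Only a situational knowledge source (the accepted elite) is sketched, and a single scalar fitness stands in for the multi-objective evaluation of rules:

```python
import random

def cultural_algorithm(evaluate, random_rule, mutate,
                       pop_size=30, generations=50, accepted=5):
    """Population space + belief space: the best individuals are
    accepted into the belief space, which then influences how the
    next generation of candidate rules is produced."""
    population = [random_rule() for _ in range(pop_size)]
    for _ in range(generations):
        ranked = sorted(population, key=evaluate, reverse=True)
        belief_space = ranked[:accepted]     # acceptance function
        nxt = list(belief_space)             # keep the elite
        while len(nxt) < pop_size:
            # influence function: new rules vary an accepted belief
            nxt.append(mutate(random.choice(belief_space)))
        population = nxt
    return max(population, key=evaluate)

# Toy usage: evolve a 10-bit "rule" toward all ones.
best = cultural_algorithm(
    evaluate=sum,
    random_rule=lambda: [random.randint(0, 1) for _ in range(10)],
    mutate=lambda r: [b ^ (random.random() < 0.1) for b in r])
print(best, sum(best))
```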


  • pages (137-142)


    Topic Name: DICOM Compression, Secure Transmission and Decompression Using DANN (DABPA)

    Author(s): Jitendra Shekhar Pandey, Ritesh Rai

    Keywords: DICOM, Dynamic Adaptive Neural Network, Edge Preserving Image Compressor, Secure Transmission


    Quick abstract

    The increasing adoption of information systems in healthcare has led to a scenario where patient information security is more and more regarded as a critical issue. Allowing patient information to be put in jeopardy may lead to irreparable physical, moral, and social damage to the patient, potentially shaking the credibility of the healthcare institution. This demands the adoption of security mechanisms to assure information integrity and authenticity. Before digital medical images in computer-based patient record systems can be distributed online, it is necessary for confidentiality reasons to eliminate patient identification information that appears in the images. Structured descriptions attached to medical image series conforming to the DICOM standard make it possible to fit collections of existing digitized images into an educational and research framework. Progressive transmission of medical images over the Internet has emerged as a promising protocol for teleradiology applications. The major issue that arises in teleradiology is the difficulty of transmitting a large volume of medical data over relatively low bandwidth. With the tremendous growth in imaging applications and the development of filmless radiology, compression techniques that can achieve high compression ratios with user-specified distortion rates have become necessary. This work applies neural network compression techniques based on Dynamic Associative Neural Networks (DANN) to provide high compression ratios with user-specified distortion rates in an adaptive compression system well suited to parallel implementation. Improvements to DANN-based training, through the use of a variance classifier that controls a bank of neural networks, speed convergence and allow the use of higher compression ratios for "simple" patterns.
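
    The variance classifier mentioned at the end can be pictured as a routing step: low-variance ("simple") blocks are sent to a network that compresses more aggressively than the one handling busy blocks. The block size and threshold below are illustrative assumptions:

```python
import numpy as np

def route_blocks(image, block=8, var_threshold=100.0):
    """Split an image into blocks and route each one by variance,
    so a bank of compressors can treat 'simple' blocks with a
    higher compression ratio than 'complex' ones."""
    h, w = image.shape
    routed = {"simple": [], "complex": []}
    for y in range(0, h - block + 1, block):
        for x in range(0, w - block + 1, block):
            b = image[y:y + block, x:x + block].astype(np.float64)
            key = "simple" if b.var() < var_threshold else "complex"
            routed[key].append(((y, x), b))
    return routed

# Hypothetical 8-bit grayscale slice standing in for DICOM pixel data.
img = (np.random.rand(64, 64) * 255).astype(np.uint8)
groups = route_blocks(img)
print({k: len(v) for k, v in groups.items()})
```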

