IMPLEMENTATION OF NAÏVE BAYESIAN DATA MINING ALGORITHM ON DECEASED REGISTRATION DATA |
Author : Dr. P. Y. Desai |
Abstract :The use of data mining algorithms in different domains such as marketing, finance,
retail, cyber security, fraud detection, and medical science is well known. In recent
times, data mining has also been applied to e-governance data. Normally, e-governance
data are used only in Online Transaction Processing (OLTP) systems. However, data
mining algorithms can uncover hidden and new trends in e-governance data. In this paper,
the Naïve Bayes data mining algorithm is applied to Deceased Registration data to find
hidden trends. |
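The abstract names the algorithm but not its mechanics. As a minimal, self-contained sketch — the attributes and records below are invented for illustration, not the paper's deceased-registration fields — a categorical Naïve Bayes classifier scores each class by its prior times the conditional probability of each attribute value:

```python
from collections import Counter, defaultdict

# Toy categorical Naive Bayes: score = P(class) * prod P(attr=value | class).
# Records and attribute names below are invented for illustration only.
def train(records, labels):
    class_counts = Counter(labels)
    cond = defaultdict(Counter)  # (attr_index, class) -> value counts
    for rec, y in zip(records, labels):
        for i, v in enumerate(rec):
            cond[(i, y)][v] += 1
    return class_counts, cond

def predict(rec, class_counts, cond):
    total = sum(class_counts.values())
    best, best_p = None, -1.0
    for y, n in class_counts.items():
        p = n / total
        for i, v in enumerate(rec):
            # add-one smoothing so unseen values never zero out the score
            p *= (cond[(i, y)][v] + 1) / (n + len(cond[(i, y)]) + 1)
        if p > best_p:
            best, best_p = y, p
    return best

records = [("urban", "male"), ("urban", "female"), ("rural", "male"), ("rural", "male")]
labels = ["hospital", "hospital", "home", "home"]
model = train(records, labels)
print(predict(("rural", "female"), *model))  # → home
```

On registration-style data each record would be a row of categorical fields and the label a field of interest; the classifier then surfaces which attribute values most strongly predict it.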
|
BUFFER BASED ROUTING MECHANISM FOR LOAD BALANCING IN WIRELESS MESH NETWORKS |
Author : Keerthi D S, Shobha Rani A and T G Basavaraju |
Abstract :In recent years, Wireless Mesh Networks (WMNs) have become one of the most
promising technologies, as they offer low-cost broadband wireless connectivity, a
larger coverage area, high flexibility, and easy deployment. WMNs are an extension of
existing wireless networks. WMN is an emerging technology; however, certain challenges
still exist in the network, such as scalability, load balancing, mobility, and power
management. Here we propose a novel routing protocol that considers the buffer
occupancy of intermediate nodes for route selection. Simulation results show that the
proposed protocol markedly enhances network performance by balancing the traffic load
toward less congested nodes compared to the standard protocol.
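The buffer-occupancy idea can be illustrated with a toy bottleneck metric. The route names and occupancy values below are invented; the paper's protocol applies this kind of comparison inside the routing layer rather than over a static table:

```python
# Toy route selection by buffer occupancy: among candidate routes, prefer the
# one whose most-loaded intermediate node has the smallest buffer occupancy.
# Values are fractions of buffer capacity and are invented for illustration.
routes = {
    "R1": [0.2, 0.9, 0.1],   # one highly congested relay
    "R2": [0.4, 0.5, 0.3],
    "R3": [0.6, 0.2, 0.7],
}

def pick_route(routes):
    # bottleneck metric: minimize the worst buffer occupancy along the path
    return min(routes, key=lambda r: max(routes[r]))

print(pick_route(routes))  # → R2
```

Minimizing the per-path maximum (rather than the sum) steers traffic away from any single congested node, which is the load-balancing effect the abstract describes.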
|
|
GPU BASED TOOLBOX FOR FUZZY LOGIC SYSTEM USING WHALE OPTIMIZATION ALGORITHM |
Author : Sarabjeet Singh, Satvir Singh and Vijay Kumar Banga |
Abstract :A Fuzzy Logic System (FLS) is an efficient method for solving engineering
problems. However, training a Fuzzy Logic System is a time-consuming task. An
optimization algorithm can be used to optimize the rule base of any FLS. Of Type-1 and
Type-2 FLS, Type-2 has been found more effective at dealing with noisy data; because of
the computational requirements of General Type-2, Interval Type-2 (IT2) has been
preferred. The Whale Optimization Algorithm (WOA) was introduced recently; it has been
tested on different engineering problems and found to be effective. General-Purpose
computing on Graphics Processing Units (GPGPU) is a new way to solve compute-intensive
problems on a Graphics Processing Unit (GPU), and CUDA-C is a parallel language that
can be used to execute parallel code on NVIDIA GPUs. This paper integrates IT2 FLS,
WOA, and the processing power of the GPU. A toolbox is proposed that can be used to
optimize the rule base in parallel; it provides both serial and parallel
implementations. The FLS-WOA toolbox is designed so that users can pass parameters
dynamically according to their needs without modifying the code.
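The abstract does not spell out WOA itself. A minimal serial sketch of the standard WOA update rules (encircling, random search, and spiral phases) on a toy sphere function — population size, iteration count, and bounds here are arbitrary, and this stands in for the toolbox's serial mode, not its GPU kernel:

```python
import math, random

# Minimal serial Whale Optimization Algorithm on a toy sphere function.
# Parameter names (a, A, C, l) follow the standard WOA formulation.
def woa(f, dim, n_whales=20, iters=200, lo=-10.0, hi=10.0, seed=1):
    rng = random.Random(seed)
    X = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n_whales)]
    best = min(X, key=f)[:]
    for t in range(iters):
        a = 2.0 - 2.0 * t / iters              # linearly decreases 2 -> 0
        for w in X:
            A = 2.0 * a * rng.random() - a
            C = 2.0 * rng.random()
            if rng.random() < 0.5:
                if abs(A) < 1:                 # exploit: encircle the best whale
                    for j in range(dim):
                        w[j] = best[j] - A * abs(C * best[j] - w[j])
                else:                          # explore: move toward a random whale
                    rand = X[rng.randrange(n_whales)]
                    for j in range(dim):
                        w[j] = rand[j] - A * abs(C * rand[j] - w[j])
            else:                              # spiral update toward the best
                l = rng.uniform(-1, 1)
                for j in range(dim):
                    w[j] = abs(best[j] - w[j]) * math.exp(l) * math.cos(2 * math.pi * l) + best[j]
            w[:] = [min(hi, max(lo, v)) for v in w]
        cand = min(X, key=f)
        if f(cand) < f(best):
            best = cand[:]
    return best

sol = woa(lambda x: sum(v * v for v in x), dim=3)
print(sum(v * v for v in sol))
```

In the toolbox setting, the objective `f` would instead score a candidate IT2 rule base, and the per-whale inner loop is what a CUDA-C kernel can evaluate in parallel.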
|
|
AUTOMATED DETECTION AND SEGMENTATION OF VASCULAR STRUCTURES OF SKIN LESIONS: A SURVEY OF EXISTING TECHNIQUES |
Author : Er. Komal Sharma and Er. Sanjay Madaan |
Abstract :Skin cancer is one of the most prevalent forms of carcinoma, particularly
among Caucasian and pale-skinned populations. Melanocytic skin lesions are considered
the most lethal of the three common skin carcinomas and the second most common type
among young adults aged 15-29. These concerns have driven the need for automated
systems that can diagnose skin carcinomas within a limited time frame, reducing
unnecessary biopsies, speeding up diagnosis, and providing reproducible diagnostic
outcomes. In this survey paper, a brief overview of automated detection and
segmentation of vascular structures of skin lesions is presented. |
|
IMPLEMENTATION OF ARTIFICIAL NEURAL NETWORK DATA MINING ALGORITHM: A CASE STUDY OF BIRTH REGISTRATION DATA |
Author : Dr. P. Y. Desai |
Abstract :The scope of applications of the Artificial Neural Network is immense. In
recent times, Artificial Neural Networks have been used in many different types of
applications; their use in medical science, financial markets, employee hiring, fraud
detection, and cyber security is well known. In this paper, the use of an Artificial
Neural Network is proposed for Birth Registration e-governance data. The practical
implementation was done using Microsoft Analysis Services, and the results indicate
that interesting relationships can be obtained using the Microsoft Neural Network data
mining algorithm. |
|
A COMPARATIVE STUDY ON GOOGLE APP ENGINE AMAZON WEB SERVICES AND MICROSOFT WINDOWS AZURE |
Author : MAHESH K, DR. M.LAXMAIAH, DR. YOGESH KUMAR SHARMA |
Abstract :Today the Internet has become pervasive in daily life, and cloud computing is
a rising model in which computing resources are offered over the Internet as scalable,
on-demand (web) services. An organization deploying an Internet service needs to spend
enormous amounts of money on the infrastructure required to serve potential users. This
is not a problem for a large venture, but for Small and Medium Enterprises
affordability becomes a huge factor, and with large infrastructure come problems such
as machine failures, hard drive faults, and software bugs, which can be a big burden
for such a community. Cloud computing is a solution to this problem: rather than
buying, installing, and operating its own systems, an organization can rely on a cloud
provider to do this for it. Key market leaders such as Google, Amazon, and Microsoft
offer operating and business models that allow customers to pay only for the resources
they actually use, instead of making tremendous upfront investments. The purpose of
this paper is to analyze the most popular platforms: Google App Engine, Amazon Web
Services, and the Windows Azure Platform. |
|
REVIEW ON CLUSTERING CANCER GENES |
Author : Prabhuraj, P.M Mallikarjuna Shastry, S.S Patil |
Abstract :Recent developments in genomic technologies have concentrated on gene data
at a very large scale. In the bioinformatics community, investigating this sizable
volume of gene data and distinguishing the behavior of genes under different
conditions is a challenging task. It can be addressed by clustering, which groups
similar patterns across various features. Moreover, gene expression data indicate the
differing levels of gene activity in various tissue cells and effectively provide
feature information. Gene clustering is precise and helpful in cancer detection
because it makes cancerous and non-cancerous genes easy to distinguish, and early
cancer diagnosis is crucial for cancer prevention and treatment. Existing cancer gene
clustering techniques have several limitations, such as the time complexity of
training and testing samples, many redundant features, and high-dimensional data;
these issues severely affect clustering accuracy. This paper surveys various cancer
gene clustering techniques with respect to benchmark cancer gene datasets.
Furthermore, the review of existing cancer gene clustering techniques describes their
advantages and limitations comprehensively. |
|
UNDERSTANDING ADOPTION FACTORS OF OVER-THE-TOP VIDEO SERVICES AMONG MILLENNIAL CONSUMERS |
Author : Dr. Sabyasachi Dasgupta and Dr. Priya Grover |
Abstract :With growing digitization, the challenge for marketers is to understand how
consumers of Over-The-Top (OTT) content adopt and consume messages in this format.
Superimposing the Uses and Gratification theoretical framework, originally meant for
television, onto the Internet platform, this paper takes a novel approach to
understanding the consumption patterns and adoption factors of OTT among consumers.
The qualitative methodology adopted for this research brought out four themes enabling
the success of this platform: convenience, mobility, content, and subscription
strategies. These strategic parameters will ensure higher consumer engagement with
OTT content.
|
|
EXPERIMENTAL STUDY ON CLOUD SECURITY FOR PERSONAL HEALTH RECORDS OVER PATIENT CENTRIC DATA |
Author : Birru Devender, Dr. Syed Abdul Sattar |
Abstract :Cloud computing offers many services, among them Storage as a Service. Using
this service, a user can outsource his data and later download it or share it with
others. Through cloud computing, a PHR owner can share his documents, but because of
the security challenges of cloud computing, the owner's health records must be
encrypted before outsourcing. To secure the owner's health records, Attribute-Based
Encryption (ABE) is better suited than other encryption techniques: using ABE, the
PHR owner can define access control over his encrypted cloud data. However, enforcing
dynamic policies over encrypted cloud data with ABE is a big challenge. In this paper,
a Timestamp server is introduced to achieve grant and revoke privileges over
encrypted data.
|
|
AN IMPLEMENTATION OF SOFTWARE EFFORT DURATION AND COST ESTIMATION WITH STATISTICAL AND MACHINE LEARNING APPROACHES |
Author : B. M. G. Prasad and P. V. S. Sreenivas |
Abstract :In the software industry, estimation of effort, duration, and cost (EDC) is a
troublesome procedure, and the effort itself is responsible for the difficulty of
evaluating EDC. In any software estimation process, the first step is to characterize
and understand the system to be assessed. Algorithmic software cost estimation
methods, such as estimation by analogy, the expert judgment method, the price-to-win
method, the top-down method, and the bottom-up method, have been developed by
researchers in the field of software engineering. No method is superior to the others;
in fact, the strengths and shortcomings of these strategies are complementary.
Presently there are two types of software EDC models: statistical approaches and
machine learning approaches. Many software cost estimation methods follow statistical
approaches, which lack the capability to present causes or strong, accurate results.
Machine learning techniques are suitable in software engineering because they produce
accurate results, estimated through training rules and iterations, and they address
challenges such as developing programs that give effective outputs by learning from
experience. |
|
COMPARATIVE ANALYSIS OF FACE RECOGNITION BASED ON SIGNIFICANT PRINCIPAL COMPONENTS OF PCA TECHNIQUE |
Author : Manzoor Ahmad Lone |
Abstract :Face recognition systems have emerged as acceptable approaches to human
authentication. Face recognition helps in searching and classifying a face database
and, at a higher level, in identifying possible threats to security. In a face
recognition problem, the objective is to search the reference face database for a face
that matches a given subject. The task of face recognition involves extracting feature
vectors of the human face from the face image to differentiate it from other persons
[6]. In this work, a comparative analysis is performed based on a varying number of
highly significant principal components (eigenvectors) of PCA for face recognition.
Experimental results show that a small number of principal components of PCA is
sufficient for matching. PCA is a statistical technique that reduces the dimension of
the search space that best describes the images. |
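The eigenface-style PCA step the abstract relies on can be sketched with synthetic vectors. The 6x4 data matrix below stands in for flattened face images; only the top-k eigenvectors of the covariance matrix are kept, which is exactly the "number of significant principal components" the paper varies:

```python
import numpy as np

# PCA on toy "image" vectors: keep the k most significant principal
# components and project the data into the reduced space. The 6x4 data
# matrix below is synthetic, not a real face database.
rng = np.random.default_rng(0)
X = rng.normal(size=(6, 4))            # 6 samples, 4-dimensional "images"
Xc = X - X.mean(axis=0)                # center the data
cov = np.cov(Xc, rowvar=False)         # 4x4 covariance matrix
vals, vecs = np.linalg.eigh(cov)       # eigh returns ascending eigenvalues
order = np.argsort(vals)[::-1]         # sort descending by explained variance
k = 2
W = vecs[:, order[:k]]                 # top-k eigenvectors ("eigenfaces")
Z = Xc @ W                             # reduced-dimension feature vectors
print(Z.shape)  # → (6, 2)
```

Matching a probe face then amounts to projecting it with the same `W` and comparing distances in the k-dimensional space, so increasing k trades recognition accuracy against search-space size.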
|
A 3-LEVEL MULTIFACTOR AUTHENTICATION SCHEME FOR CLOUD COMPUTING |
Author : Charanjeet Singh and Dr. Tripat Deep Singh |
Abstract :The objective of this paper is to propose a secure, user-friendly, and
economical multi-level authentication scheme that uses multiple factors for gaining
access to resources on insecure platforms and for financial transactions. The proposed
study is based on the premise that when multiple levels and multiple factors are
incorporated into an authentication scheme, it not only becomes difficult to break but
also resistant to different forms of attack. This work proposes a scheme, called
3L-MFA, in which authentication is carried out in three levels using multiple factors.
The scheme also uses Out-of-Band (OOB) authentication as one of the factors, which
offers credible security against the man-in-the-middle (MIM) attack. The first level
uses a username and password with double encryption. The second level uses OTP
verification based on Out-of-Band (OOB) authentication via email id and mobile number.
The third level involves the user's interaction with a graphical screen in terms of a
predetermined number of clicks on images and buttons and the selection of a
predetermined number of menu items. The security of the proposed system depends upon
double encryption using SHA-1 and AES-128-CBC, out-of-band authentication using OTP,
and user interaction on a graphical screen that uses a probability combination of
various numbers. |
|
AN OPTIMIZED VEHICLE PARKING MECHANISM USING ARTIFICIAL NEURAL NETWORK |
Author : Ruby Singh and Dr. Niraj Singhal |
Abstract :Our country has developed rapidly for decades, with many commercial
buildings, well-connected roads, and a growing number of automobiles. The
transportation industry has become the backbone of the economy because of its
widespread use in trade and commerce; therefore, parking vehicles has become a real
concern. Vehicles are still parked under the old parking system, which is maintained
in an unplanned and undisciplined way. Because of this, people usually park their cars
wherever they want, which creates a mess. Some mechanism is needed to park vehicles
appropriately; therefore, this research introduces an optimized vehicle parking
mechanism based on the concepts of the Artificial Bee Colony (ABC) and the Artificial
Neural Network (ANN), which helps to get rid of the chaos usually faced. |
|
A HYBRIDIZATION OF ARTIFICIAL NEURAL NETWORK AND SUPPORT VECTOR MACHINE FOR PREVENTION OF ROAD ACCIDENTS IN VANET |
Author : Chiranjit Dutta and Dr. Niraj Singhal |
Abstract :A Vehicular Ad hoc Network (VANET) is an infrastructure-less network of
dynamic nodes with Road Side Units (RSUs). Data broadcasting becomes a very difficult
task because of the high density, scalability, randomness, and mobility of vehicles.
VANETs have the ability to prevent accidents by transmitting data on time over the
network, and this has attracted the attention of many researchers. Therefore, in this
paper a realistic mechanism is proposed to avoid fatal road accidents using a
clustering approach with concepts from Artificial Intelligence. A hybridization of the
Artificial Neural Network (ANN) and the Support Vector Machine (SVM) speeds up the
data transmission process, which assists in providing information accurately and on
time. To demonstrate the efficacy of the novel mechanism, parameters such as
throughput and Packet Delivery Ratio (PDR) are considered.
|
|
A REVIEW ON DATA MINING AND BIGDATA |
Author : Aarepu Lakshman, Dr. B.M.G. Prasad and Dr. Yogesh Kumar Sharma |
Abstract :Recent years have seen exponential growth in data generation, and this
enormous amount of data has brought a new kind of problem: existing data mining
techniques are unable to process Big Data, or are not efficient at handling it. The
major problems with Big Data are storage and processing; processing big data
efficiently requires high-computation platforms, and extracting valuable information
from it requires appropriate algorithms. To understand these issues in depth and find
recommendations, this paper reviews previous studies. The first part of the review
explores how data analysis is done with data mining algorithms and its limitations;
the second part explores the analysis of big data with different algorithms and
frameworks.
|
|
ENHANCED CLUSTER HEAD MANAGEMENT IN LARGE SCALE WIRELESS SENSOR NETWORK USING PARTICLE SWARM OPTIMIZATION (PSO) ON THE BASIS OF DISTANCE, DENSITY & ENERGY |
Author : Shishir Rastogi, Neeta Rastogi and Manuj Darbari |
Abstract :Wireless Sensor Networks (WSNs) are utilized for a plethora of applications
such as weather forecasting, monitoring systems, surveillance, and so on. The critical
issues of WSNs are energy constraints, limited memory, and computation time, and this
criticality deepens in large-scale WSNs. In such scenarios, the network lifetime has
to be utilized efficiently with the available resources by organizing nodes into
clusters. Even though clustering has proven highly effective in minimizing energy use,
in traditional cluster-based WSNs the protocol overhead is high for Cluster Heads
(CHs), which receive and aggregate the data from their cluster members. Therefore,
efficient management of CHs along with routing behavior is vital for prolonging the
network lifetime. In this paper, an enhanced CH-management technique is proposed which
efficiently elects the CH using Particle Swarm Optimization (PSO); it is hereafter
referred to as PSO_DDE. The PSO_DDE approach considers parameters such as the
within-cluster (intra-cluster) distance between nodes, neighbor density, and residual
energy of nodes for selecting the best CH candidate. Cluster formation is performed by
k-means based on the Euclidean distance. The PSO_DDE approach is integrated with
Dynamic Source Routing (DSR) for efficiently forwarding data packets to the sink node.
The performance metrics are compared with existing approaches using the NS-2
simulator, and the proposed approach shows superior results. |
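The PSO machinery behind PSO_DDE can be sketched with the standard velocity/position update. In the paper's setting the fitness would combine intra-cluster distance, neighbor density, and residual energy; here a simple sphere function stands in for it, and the inertia/acceleration constants are conventional defaults, not the paper's values:

```python
import random

# Minimal Particle Swarm Optimization minimizing a toy fitness function.
# w = inertia weight, c1/c2 = cognitive and social acceleration constants.
def pso(f, dim, n=15, iters=100, w=0.7, c1=1.5, c2=1.5, seed=3):
    rng = random.Random(seed)
    pos = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(n)]
    vel = [[0.0] * dim for _ in range(n)]
    pbest = [p[:] for p in pos]                 # each particle's best position
    gbest = min(pbest, key=f)[:]                # global best position
    for _ in range(iters):
        for i in range(n):
            for j in range(dim):
                r1, r2 = rng.random(), rng.random()
                vel[i][j] = (w * vel[i][j]
                             + c1 * r1 * (pbest[i][j] - pos[i][j])
                             + c2 * r2 * (gbest[j] - pos[i][j]))
                pos[i][j] += vel[i][j]
            if f(pos[i]) < f(pbest[i]):
                pbest[i] = pos[i][:]
                if f(pbest[i]) < f(gbest):
                    gbest = pbest[i][:]
    return gbest

best = pso(lambda x: sum(v * v for v in x), dim=2)
print(sum(v * v for v in best))
```

For CH election, a particle would encode a candidate set of cluster heads and `f` would penalize large intra-cluster distances, sparse neighborhoods, and low residual energy.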
|
RELIEFF FEATURE SELECTION BASED ALZHEIMER DISEASE CLASSIFICATION USING HYBRID FEATURES AND SUPPORT VECTOR MACHINE IN MAGNETIC RESONANCE IMAGING |
Author : Halebedu Subbaraya Suresha and Dr. S.S. Parthasarathi |
Abstract :Alzheimer disease is a form of dementia that results in memory-related
problems in human beings. Accurate detection and classification of Alzheimer disease
and its stages plays a crucial role in human health monitoring systems. In this
research paper, Alzheimer disease classification was assessed on the Alzheimer's
Disease Neuro-Imaging Initiative (ADNI) dataset. After histogram equalization and
skull removal of the collected brain images, segmentation was carried out using Fuzzy
C-Means (FCM) to separate the white matter, Cerebro-Spinal Fluid (CSF), and grey
matter from the pre-processed brain images. Then, hybrid feature extraction (Histogram
of Oriented Gradients (HOG), Local Binary Patterns (LBP), and Gray-Level Co-Occurrence
Matrix (GLCM)) was performed to extract feature values from the segmented brain
images. After hybrid feature extraction, reliefF feature selection was used to select
the optimal feature subsets and reject the irrelevant feature vectors. The selected
optimal feature vectors were then given as input to a supervised Support Vector
Machine (SVM) classifier to classify subjects into three classes: normal, Alzheimer
disease, and Mild Cognitive Impairment (MCI). The experimental outcome showed that the
proposed methodology performed effectively in terms of sensitivity, accuracy,
specificity, and f-score, enhancing classification accuracy by 2-20% compared to
the existing methodologies.
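The feature-selection step can be illustrated with the basic binary-class Relief algorithm, a simplified relative of the ReliefF used in the paper (ReliefF adds k nearest neighbors and multi-class handling). The idea: a feature earns weight when same-class neighbors agree on it while opposite-class neighbors differ. The toy data below is invented:

```python
import math

# Basic (binary-class) Relief feature weighting, a simplified relative of
# ReliefF: for each sample, compare it to its nearest same-class neighbour
# (hit) and nearest opposite-class neighbour (miss).
def relief(X, y):
    m, d = len(X), len(X[0])
    W = [0.0] * d
    for idx in range(m):
        x, label = X[idx], y[idx]
        hits = [X[i] for i in range(m) if i != idx and y[i] == label]
        misses = [X[i] for i in range(m) if y[i] != label]
        h = min(hits, key=lambda p: math.dist(p, x))       # nearest hit
        miss = min(misses, key=lambda p: math.dist(p, x))  # nearest miss
        for j in range(d):
            # reward features that differ across classes, penalize features
            # that differ within a class
            W[j] += (abs(x[j] - miss[j]) - abs(x[j] - h[j])) / m
    return W

# Feature 0 separates the two classes; feature 1 is noise.
X = [[0.0, 0.5], [0.1, 0.9], [1.0, 0.4], [0.9, 1.0]]
y = [0, 0, 1, 1]
w = relief(X, y)
print(w)  # feature 0 should receive a much larger weight than feature 1
```

In the paper's pipeline, `X` would be the concatenated HOG/LBP/GLCM vectors, and only the features with the highest weights would be passed on to the SVM.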
|
|
APP FOR PLACING ORDERS AND BARGAINING WITH AI AT A RESTAURANT |
Author : Shounak Mulay, Rushabh More, Priyanka Wagh and Prof. Saroja T.V |
Abstract :The proposed system attempts to implement various new technologies in an app
that enables users to place orders and bargain over prices with a machine-learning-
powered bot. Such a system can provide a new experience to customers while increasing
business profitability and efficiency with the help of collected data. The system
comprises four parts: the Android app at the frontend, and the database, the natural
language processing platform, and the machine learning model that predicts prices as
the backend, deployed on the cloud. The app is designed with a modular architecture
pattern so that changes can be made easily to fit any business model. Since the
predictions for bargaining are provided by a machine learning algorithm, the system's
performance and accuracy are likely to improve with additional data |
|
TIME TO MODIFY OPERATING SOFTWARE (OS), DATABASES (DB) AND TCP/IP PROTOCOLS FOR DATA TRASH ELIMINATION, BASED ON USER DEFINED SHELF LIFE OF DATA. |
Author : Sarvesh Kumar Tripathi, Dr Avjeet Kaur, KR Dr RK Pandey and Dr Meghna Chhabra |
Abstract :Exponentially growing data, big data, dark data, and data trash are creating
excellent opportunities worldwide, but the associated costs and risks are also
significant. "Big garbage in, big garbage out" seems to be the new phrase in
computing. This paper is motivated by the risk and cost pressures on non-IT firms. A
literature review from a management perspective reveals a lack of attention to the
immortality of data in the world's data stores and the rising risks and costs.
Progressive digitization calls for regular elimination of data trash and the
consequent avoidable costs. The existing concept of Time to Live (TTL), or hop limit,
already eliminates huge amounts of in-transit data on the Internet in real time;
similar concepts and tools could help reduce the size of the data inventory. The paper
presents a rudimentary model for extending the concept of TTL with the assistance of a
user-defined shelf life of data. |
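The shelf-life idea can be sketched as a simple expiry check over stored records, in the spirit of extending TTL from packets to data at rest. The field names and values below are invented for illustration; the paper argues this logic belongs inside the OS, database, and TCP/IP layers rather than in application code:

```python
import time

# Rudimentary user-defined shelf life check: records older than their
# declared shelf life are flagged as data trash for elimination.
def expired(record, now=None):
    now = now if now is not None else time.time()
    return now - record["created"] > record["shelf_life_seconds"]

store = [
    {"id": 1, "created": 0.0, "shelf_life_seconds": 100.0},
    {"id": 2, "created": 0.0, "shelf_life_seconds": 1000.0},
]
now = 500.0
trash = [r["id"] for r in store if expired(r, now)]
print(trash)  # → [1]
```

A periodic sweep that deletes the flagged records would give each data item a finite lifetime, just as TTL bounds a packet's lifetime in transit.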
|
RECENT ADVANCES IN HARMONY SEARCH ALGORITHM |
Author : Assif Assad |
Abstract :Harmony Search (HS) is a metaheuristic algorithm inspired by the music
improvisation process, in which the musician searches for the best harmony and keeps
polishing it to improve its aesthetics. The HS algorithm was introduced in 2001 and
has found applications in diverse fields.
This manuscript reviews the recent significant developments in the structure of the
Harmony Search algorithm and also describes its recent state-of-the-art applications.
As evidenced by a number of studies, the algorithm features several innovative aspects
in its operational procedure that foster its utilization in diverse fields such as
engineering, construction, telecommunications, robotics, health, and energy. |
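The basic HS loop the review builds on can be sketched on a toy minimization problem. The parameter names (HMCR, PAR, bandwidth) follow the usual HS formulation; their values here are arbitrary defaults, not recommendations from the review:

```python
import random

# Minimal Harmony Search minimizing a toy function. HMCR = harmony memory
# consideration rate, PAR = pitch adjustment rate, bw = pitch bandwidth.
def harmony_search(f, dim, hms=10, iters=500, hmcr=0.9, par=0.3, bw=0.2,
                   lo=-5.0, hi=5.0, seed=7):
    rng = random.Random(seed)
    memory = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(hms)]
    for _ in range(iters):
        new = []
        for j in range(dim):
            if rng.random() < hmcr:                  # take a value from memory
                v = memory[rng.randrange(hms)][j]
                if rng.random() < par:               # pitch adjustment
                    v += rng.uniform(-bw, bw)
            else:                                    # random improvisation
                v = rng.uniform(lo, hi)
            new.append(min(hi, max(lo, v)))
        worst = max(range(hms), key=lambda i: f(memory[i]))
        if f(new) < f(memory[worst]):                # replace the worst harmony
            memory[worst] = new
    return min(memory, key=f)

best = harmony_search(lambda x: sum(v * v for v in x), dim=2)
print(sum(v * v for v in best))
```

The structural variants the manuscript reviews mostly adapt how HMCR, PAR, and `bw` evolve during the run, leaving this improvise-and-replace skeleton intact.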
|
STATE-OF-THE-ART REVIEW ON APPLICATIONS OF HARMONY SEARCH META HEURISTIC ALGORITHM |
Author : Assif Assad |
Abstract :Harmony Search (HS) is a metaheuristic algorithm inspired by the music
improvisation process, in which the musician searches for the best harmony and keeps
polishing it to improve its aesthetics. The HS algorithm was introduced in 2001 and
has found applications in diverse fields.
This manuscript reviews state-of-the-art applications of the Harmony Search
algorithm. As evidenced by a number of studies, the algorithm features several
innovative aspects in its operational procedure that foster its utilization in diverse
fields such as engineering, construction, telecommunications, robotics, health, and
energy.
|
MACHINE AS ONE PLAYER IN INDIAN COWRY BOARD GAME: BASIC PLAYING STRATEGIES |
Author : Pouyan Davoudian and P. Nagabhushan |
Abstract :The Cowry game is an ancient board game from India, also known as Chowka
Bhara. It is a race game of chance and strategy for 2-4 players, in which playing
pieces are moved around a square board according to the throw of special dice (cowry
shells). The game involves decision-making under uncertainty and imprecision with
multiple players, and can therefore be considered an appropriate model for real-life
problems that contain stochastic components. In this research, we propose and analyze
a few basic playing strategies for the Cowry game and describe the framework for
implementing these strategies. We also provide an experimental comparison of the
proposed strategies to evaluate their performance. The comprehensive study of the
Cowry game presented in this work can be used to gain a better understanding of the
game, and may lead to the formulation and implementation of more advanced strategies.
It can also serve as a basis for producing better artificial players in similar
strategic race games. |
|
SECURED DATA AGGREGATION USING FIBONACCI NUMBERS AND UNICODE SYMBOLS FOR WSN |
Author : Mohamed Yacoab, Mohemmed Sha and Mohamed Mustaq Ahmed |
Abstract :A Wireless Sensor Network is a combination of one or more nodes basically
used for data collection and data aggregation. The collected data is aggregated
through a data aggregator and sent to the base station. While data transfer takes
place, intruders can sniff the data being transmitted from one node to another. To
prevent intruders from sniffing the data and breaching its boundaries, a two-fold
cryptographic method is proposed using Fibonacci numbers and the Unicode system: (i)
convert the normal plain text to a secret message, and (ii) convert the secret message
to Unicode text. The proposed method uses a two-fold security-key encryption algorithm
for data transfer in wireless sensor networks, and a decryption algorithm is also
provided to convert the ciphertext back to plain text. |
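The abstract does not specify the exact scheme, so the following is only an illustrative sketch of the two-fold idea: stage one shifts each character by successive Fibonacci numbers (the "secret message"), and stage two maps the result into a non-ASCII Unicode range. The offset and the per-character shift rule are invented here, not taken from the paper:

```python
# Illustrative two-fold encoding sketch (not the paper's actual scheme).
def fib_shifts(n):
    """First n Fibonacci numbers: 1, 1, 2, 3, 5, ..."""
    a, b, out = 1, 1, []
    for _ in range(n):
        out.append(a)
        a, b = b, a + b
    return out

OFFSET = 0x2200  # arbitrary Unicode block chosen for the second fold

def encrypt(text):
    # fold 1: shift each character by the next Fibonacci number
    shifted = [ord(c) + s for c, s in zip(text, fib_shifts(len(text)))]
    # fold 2: map the shifted code points into a Unicode symbol range
    return "".join(chr(v + OFFSET) for v in shifted)

def decrypt(cipher):
    vals = [ord(c) - OFFSET for c in cipher]
    return "".join(chr(v - s) for v, s in zip(vals, fib_shifts(len(vals))))

msg = "sensor data"
cipher = encrypt(msg)
print(decrypt(cipher) == msg)
```

Note this toy construction has no cryptographic strength; it only demonstrates how a reversible Fibonacci-keyed substitution can be composed with a Unicode re-mapping, which is the structure the abstract describes.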
|
AN OPTIMAL COMPOSITION PLAN SELECTION USING MULTI OBJECTIVE PARTICLE SWARM OPTIMIZATION |
Author : Parimalam.T, and Dr. Meenakshi Sundaram.K |
Abstract :Domain-ontology-based Particle Swarm Optimization (PSO)-inspired Balanced
Iterative Reducing and Clustering using Hierarchies (BIRCH) with an improved bipartite
graph is an efficient web service composition approach: it composes a number of web
services to achieve high customer satisfaction, improves clustering quality, and
reduces the processing time of web service composition. Most verification problems are
easily checked using Petri nets in the planning, verification, and execution phases.
To provide more efficient automatic web service composition, this paper includes
multiple Quality of Service (QoS) parameters, such as cost, accuracy, accessibility,
robustness, scalability, modifiability, and security, in a Petri-net-based algebra
approach. Adding a greater number of QoS parameters yields a multi-objective service
composition optimization problem, which is solved by introducing a Multi-Objective
Genetic Algorithm (MOGA) and a Multi-Objective PSO (MOPSO) algorithm. The approach
selects qualitatively different services from multiple functionally identical web
services to achieve the best reliability model. Thus, automatic web service
composition discovers the appropriate services that satisfy the customer requirements.
Experimental results show that the proposed method performs better than the existing
method. |
|
FAULT DATA DETECTION IN SOFTWARE USING A NOVEL FGRNN ALGORITHM |
Author : Neeta Rastogi, Shishir Rastogi, Manuj Darbari |
Abstract :The use of and dependence on software in various fields is the reason why
researchers have spent decades of effort on finding better methods to predict software
quality and reliability. Soft computing methods have been used to bring efficient
improvements to software reliability prediction. This study proposes a novel method
called Fuzzy Greedy Recurrent Neural Network (FGRNN) to assess software reliability by
detecting faults in the software. A deep learning model based on the Recurrent Neural
Network (RNN) is used to predict the number of faults in software. The proposed model
consists of four modules. The first module, attribute selection pre-processing,
selects the relevant attributes and improves generalization, which improves prediction
on unknown data. The second module, fuzzy conversion using membership functions,
smoothly combines linear sub-models to produce results. Next, greedy selection deals
with the attribute subset selection problem. Finally, the RNN technique predicts
software failure using previously recorded failure data. To assess performance, the
popular NASA Metrics Data Program datasets are used. Experimental results show that
the proposed FGRNN model has better reliability prediction performance than existing
parameter-based and NN-based models. |
|
MULTIPLE KERNEL FUZZY CLUSTERING FOR UNCERTAIN DATA CLASSIFICATION |
Author : Nijaguna GS and Dr. Thippeswamy K |
Abstract :Traditional decision tree classifiers work with data whose values are known
and precise. We extend such classifiers to handle data with uncertain information.
Value uncertainty arises in many applications during the data collection process;
example sources of uncertainty include measurement/quantization errors, data
staleness, and multiple repeated measurements. With uncertainty, the value of a data
item is often represented not by one single value, but by multiple values forming a
probability distribution. Instead of abstracting uncertain data by statistical
derivatives (such as the mean and median), we discover that the accuracy of a decision
tree classifier can be much improved if the "complete information" of a data item is
utilized. Since processing probability density functions is computationally more
expensive than processing single values (e.g., averages), decision tree construction
on uncertain data is more CPU-demanding than on certain data. To tackle this problem,
we propose a series of pruning techniques that can greatly improve construction
efficiency. |
|
Comparison of Feature Extraction Techniques for EEG Based Brain-Computer Interface |
Author : Mandeep Kaur Ghumman and Satvir Singh |
Abstract :The analysis of electroencephalogram (EEG) signals for the implementation of
brain-computer interfaces (BCIs) has attracted a lot of interest in the research
community. It can be used in a variety of applications ranging from medical
rehabilitation to pleasure and entertainment. BCI is a very promising research domain
even in the face of a number of challenges, especially in signal processing, feature
extraction, and classification, as EEG signals contain a considerable amount of noise
and artifacts. In this paper, various feature extraction algorithms used in BCI are
investigated and compared. Common spatial patterns (CSP) is a popular algorithm for
extracting features from EEG signals in a BCI implementation; filter bank common
spatial patterns (FBCSP) and spectrally weighted common spatial patterns (SWCSP) are
further extensions of CSP. The performance of these methods is evaluated on data set
2a of BCI Competition IV on the basis of the standardized mean squared error (SMSE).
Results show that FBCSP considerably outperforms the other methods under
consideration. |
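The CSP step common to all three compared methods can be sketched via simultaneous diagonalization of the two class covariance matrices. The two "trials" below are synthetic channels-by-samples arrays with an artificial variance difference, not real EEG from data set 2a:

```python
import numpy as np

# Minimal Common Spatial Patterns (CSP): find spatial filters that maximize
# variance for one class while minimizing it for the other.
def csp(X1, X2):
    C1 = np.cov(X1)                       # class-1 channel covariance
    C2 = np.cov(X2)                       # class-2 channel covariance
    vals, U = np.linalg.eigh(C1 + C2)     # whiten the composite covariance
    P = U @ np.diag(1.0 / np.sqrt(vals)) @ U.T
    S1 = P @ C1 @ P.T                     # class-1 covariance in whitened space
    d, B = np.linalg.eigh(S1)             # eigenvalues sorted ascending
    W = B.T @ P                           # rows of W are the spatial filters
    return W[::-1]                        # first rows favour class 1

rng = np.random.default_rng(0)
n_ch, n_s = 4, 500
X1 = rng.normal(size=(n_ch, n_s))
X1[0] *= 5.0                              # channel 0 strong in class 1
X2 = rng.normal(size=(n_ch, n_s))
X2[1] *= 5.0                              # channel 1 strong in class 2
W = csp(X1, X2)
f1 = np.log(np.var(W @ X1, axis=1))      # log-variance features per trial
f2 = np.log(np.var(W @ X2, axis=1))
print(f1[0] > f2[0], f1[-1] < f2[-1])
```

The log-variances of the first and last filtered components are the features fed to a classifier; FBCSP repeats this per frequency band and SWCSP learns the spectral weighting, but both reuse this core decomposition.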
|