Confidential Algorithm for Golden Cryptography Using Haar Wavelet
One of the most important techniques for protecting digital signals is the golden matrix. Golden matrices can be used to create a new kind of cryptography called golden cryptography. Many research papers have shown that the method is fast, simple to implement, and well suited to the cryptographic protection of digital signals. In this paper, we introduce an encryption technique based on the combination of the Haar wavelet and the golden matrix. This combination is applied after the data have been compressed with adaptive Huffman coding to reduce their size and remove redundancy, so the process provides multiple security services. In addition, a Message Authentication Code (MAC) is used to provide authentication and integrity for the scheme. The proposed scheme proceeds in five stages: data compression, key generation, encryption, decryption, and decompression at the communication ends.
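The abstract does not give the arithmetic of the golden-matrix stage, but golden cryptography in the literature (after Stakhov) encrypts 2x2 blocks by multiplying them with powers of the Fibonacci Q-matrix. The following is a minimal sketch of that core operation only, with the exponent n as the key; the Haar-wavelet, Huffman, and MAC stages of the proposed scheme are omitted:

```python
# Sketch of golden-matrix encryption: the key is an exponent n of the
# Fibonacci Q-matrix. This illustrates only the matrix stage; the paper's
# wavelet/compression/MAC stages are not reproduced here.

def fib(n):
    """Fibonacci numbers with F(0) = 0, F(1) = 1."""
    a, b = 0, 1
    for _ in range(n):
        a, b = b, a + b
    return a

def golden_matrix(n):
    """Q^n = [[F(n+1), F(n)], [F(n), F(n-1)]] for n >= 1."""
    return [[fib(n + 1), fib(n)], [fib(n), fib(n - 1)]]

def matmul(a, b):
    return [[sum(a[i][k] * b[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def encrypt(plain, n):
    """Ciphertext C = P * Q^n for a 2x2 block P of signal samples."""
    return matmul(plain, golden_matrix(n))

def decrypt(cipher, n):
    """P = C * (Q^n)^-1; det(Q^n) = +/-1, so the inverse is integral."""
    q = golden_matrix(n)
    det = q[0][0] * q[1][1] - q[0][1] * q[1][0]   # always +1 or -1
    inv = [[det * q[1][1], -det * q[0][1]],
           [-det * q[1][0], det * q[0][0]]]
    return matmul(cipher, inv)
```

Because det(Q^n) = ±1, decryption uses an exact integer inverse, which is what makes the method cheap and simple on digital signals.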
A Secure Attribute-based Model to Foster Collaboration in Healthcare Systems
In today's rapidly evolving, globalized world, there is an undeniable trend towards establishing secure distributed collaborative work environments. In the prototypical example of healthcare institutions, critical research needs involve both effective collaboration modeling and the assurance of highly secure dynamic interactions. In this paper, we introduce a new system design that enables both synchronous and asynchronous secure communication between the different entities in a collaborative work environment. The proposed system provides a fine-grained attribute-based access control model to secure collaboration in distributed systems, namely Computer-Supported Cooperative Work (CSCW) systems. We chose breast cancer diagnosis as a case study for applying our system design. Through a clear specification model of our system, we highlight the feasibility of a real-time, secure breast cancer diagnosis process in which several medical organizations are engaged.
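The paper's actual policy language is not given in the abstract; the sketch below only illustrates what a fine-grained attribute-based permit/deny check can look like in a medical collaboration. All attribute names and rules (role, department, the mammogram example) are invented for illustration:

```python
# Illustrative attribute-based access control (ABAC) check: a rule permits a
# request when every attribute predicate it names holds; everything else is
# denied by default. Attributes and rules are hypothetical.

RULES = [
    # Radiologists may read mammography images inside their own department.
    {"action": "read",
     "condition": lambda s, r: s["role"] == "radiologist"
                               and r["type"] == "mammogram"
                               and s["department"] == r["department"]},
    # Oncologists may annotate any shared diagnosis record.
    {"action": "annotate",
     "condition": lambda s, r: s["role"] == "oncologist"
                               and r["type"] == "diagnosis"
                               and r["shared"]},
]

def is_permitted(subject, resource, action):
    """Deny by default; permit if any rule for this action matches."""
    return any(rule["action"] == action and rule["condition"](subject, resource)
               for rule in RULES)
```

Evaluating attributes at request time, rather than fixed roles alone, is what lets such a model stay fine-grained as new organizations join the collaboration.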
Limitations of Current Security Measures to Address Information Leakage Attacks
Information leakage attacks represent a serious threat because of their widespread and devastating effects. Their significance stems from the fact that they are committed by an organization's authorized computer users, and/or by processes executing on their behalf. The diverse avenues that can be exploited to carry out such attacks add another barrier to addressing them. Based on a literature review, this paper explores the strengths of security measures intended to confront information leakage attacks and focuses on pinpointing their respective limitations. It demonstrates that only a few of them are capable of mitigating such attacks, whereas the rest suffer from conceptual and/or implementation-related limitations that leave them vulnerable to circumvention. They are typically prone to high false-positive and/or false-negative rates, complex to apply, inflexible during execution, degrade performance, or require hardware modification. Most importantly, none of them provides a remedy for new, undetected malicious software, or for the ever-increasing insider threat.
A two-stage architecture for stock price forecasting by combining SOM and fuzzy-SVM
This paper proposes a model to predict stock prices by combining Self-Organizing Maps (SOM) and fuzzy Support Vector Machines (f-SVM). The approach is based on extracting fuzzy rules from raw data by combining statistical machine learning models. In the proposed model, SOM is used as a clustering algorithm to partition the whole input space into several disjoint regions. For each partition, a set of fuzzy rules is extracted with the f-SVM combining model. The fuzzy rule sets are then used to predict the test data using fuzzy inference algorithms. The performance of the proposed approach is compared with that of other models on four data sets.
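As a rough illustration of the two-stage pipeline, the sketch below partitions the input space with a deliberately simplified SOM (its neighborhood is shrunk to the winning unit, so it degenerates to online k-means) and fits one local predictor per region. A per-region mean stands in for the paper's f-SVM fuzzy-rule extraction, which the abstract does not specify:

```python
def train_units(data, n_units=2, epochs=60, lr=0.5):
    """Stage 1: partition the input space. A stripped-down SOM whose
    neighborhood is reduced to the winner only; units start on evenly
    spaced training samples so the sketch is deterministic."""
    units = [list(data[i * len(data) // n_units][0]) for i in range(n_units)]
    for e in range(epochs):
        rate = lr * (1 - e / epochs)          # decaying learning rate
        for x, _ in data:
            b = bmu(units, x)
            units[b] = [w + rate * (xi - w) for w, xi in zip(units[b], x)]
    return units

def bmu(units, x):
    """Index of the best-matching unit for input vector x."""
    return min(range(len(units)),
               key=lambda j: sum((w - xi) ** 2 for w, xi in zip(units[j], x)))

def fit_local_models(units, data):
    """Stage 2 stand-in: one predictor per region -- here just the mean
    target, where the paper extracts fuzzy rules with f-SVM instead."""
    sums = [[0.0, 0] for _ in units]
    for x, y in data:
        b = bmu(units, x)
        sums[b][0] += y
        sums[b][1] += 1
    return [s / c if c else 0.0 for s, c in sums]

def predict(units, models, x):
    """Route a query to the local model of its region."""
    return models[bmu(units, x)]
```

The design point the sketch preserves is that clustering first lets each local model fit a small, homogeneous region instead of the whole noisy series.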
Solving the Problem of the K Parameter in the KNN Classifier Using an Ensemble Learning Approach
This paper presents a new solution for choosing the K parameter in the k-nearest neighbor (KNN) algorithm, based on the idea of ensemble learning: a weak KNN classifier is run once for each value of K, from one up to the square root of the size of the training set, and the results of the weak classifiers are combined using the weighted sum rule. The proposed solution was tested and compared to other solutions in a set of experiments on real-life problems. The experimental results show that the proposed classifier outperforms the traditional KNN classifier regardless of the number of neighbors used, is competitive with other classifiers, and is a promising classifier with strong potential for a wide range of applications.
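The abstract describes the ensemble fully enough to sketch: run a weak KNN once for each K from 1 to the square root of the training-set size and combine the votes with a weighted sum. The exact weights are not stated in the abstract, so the 1/K weighting below is an assumption:

```python
import math
from collections import defaultdict

def ensemble_knn_predict(train, query):
    """Classify `query` with an ensemble of weak KNN classifiers, one for
    each K in 1..sqrt(len(train)), combined by a weighted sum rule.
    `train` is a list of (feature_tuple, label) pairs."""
    # Sort the training points once by distance to the query.
    ranked = sorted(train, key=lambda p: math.dist(p[0], query))
    k_max = max(1, math.isqrt(len(train)))
    scores = defaultdict(float)
    for k in range(1, k_max + 1):
        # Weak classifier: majority class among the k nearest neighbors.
        votes = defaultdict(int)
        for _, label in ranked[:k]:
            votes[label] += 1
        winner = max(votes, key=votes.get)
        scores[winner] += 1.0 / k    # assumed weight; the paper's is not given
    return max(scores, key=scores.get)
```

Sorting once and slicing makes all the weak classifiers share one distance computation, which is what keeps the ensemble no more expensive than a single large-K query.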
Proposing a New Hybrid Approach in Movie Recommender System
Due to the unprecedented growth of information, goods, and services, many application programs have been created in recent years to help customers select goods and services. Among the most important are recommender systems, which are used to suggest items such as movies, books, web pages, and e-business products. Most recommender systems use collaborative filtering (CF) and/or content-based filtering (CBF) to provide suggestions to users. In this paper, a new approach is examined for better assessing the interests of customers: with an understanding of customer behavior, appropriate offers can be made to them. The new hybrid approach resolves, as far as possible, the weaknesses of both content-based and collaborative filtering. The results of this paper can be used to retain and attract customers in institutions or stores with regular customers. We first review recommender systems and investigate the types of filtering, and then present a new hybrid approach combining the CF and CBF methods in a movie recommender system. The results, evaluated on the MovieLens benchmark data, show an improvement in the movie recommender system.
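As a toy illustration of such a hybrid, the sketch below blends a collaborative signal (ratings by other users) with a content-based signal (genre overlap with the user's liked movies). The data set, similarity measures, and the blending weight alpha are all invented; the paper's actual method is not reproduced here:

```python
# Hypothetical weighted hybrid recommender: CF covers items the user's peers
# rated, CBF covers items similar in content, and the blend mitigates each
# method's cold-start weakness. All data below is illustrative.

RATINGS = {            # user -> {movie: rating on a 1..5 scale}
    "u1": {"Alien": 5, "Blade Runner": 4, "Notting Hill": 1},
    "u2": {"Alien": 4, "Blade Runner": 5, "Solaris": 4},
    "u3": {"Notting Hill": 5, "Amelie": 4, "Alien": 1},
}
GENRES = {
    "Alien": {"sci-fi", "horror"},
    "Blade Runner": {"sci-fi", "noir"},
    "Solaris": {"sci-fi"},
    "Notting Hill": {"romance", "comedy"},
    "Amelie": {"romance", "comedy"},
}

def cf_score(user, movie):
    """Mean rating given to `movie` by the other users (a minimal CF signal)."""
    others = [r[movie] for u, r in RATINGS.items() if u != user and movie in r]
    return sum(others) / len(others) if others else 0.0

def cbf_score(user, movie):
    """Mean Jaccard genre overlap with the user's highly rated movies."""
    liked = {m for m, r in RATINGS[user].items() if r >= 4}
    if not liked:
        return 0.0
    overlap = [len(GENRES[movie] & GENRES[m]) / len(GENRES[movie] | GENRES[m])
               for m in liked]
    return 5.0 * sum(overlap) / len(overlap)   # rescale to the rating range

def hybrid_score(user, movie, alpha=0.5):
    """Weighted blend of the two signals; alpha is an assumed tuning knob."""
    return alpha * cf_score(user, movie) + (1 - alpha) * cbf_score(user, movie)
```

For the sci-fi fan "u1", the hybrid ranks the unseen sci-fi title above the unseen romance even though both are rated well by peers, which is the kind of correction a content signal adds to pure CF.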
Computational Algorithms Based on the Paninian System to Process Euphonic Conjunctions for Word Searches
Searching for words in Sanskrit E-text is a problem accompanied by complexities introduced by features of Sanskrit such as euphonic conjunctions, or 'sandhis'. A word can occur in an E-text in a transformed form owing to the operation of sandhi rules, and a simple word search will not yield these transformed forms. Further, there is no search engine in the literature that can comprehensively search for words in Sanskrit E-texts while taking euphonic conjunctions into account. This work presents an optimal binary representational schema for the letters of the Sanskrit alphabet, along with algorithms to efficiently process the sandhi rules of Sanskrit grammar. It further presents an algorithm that uses the sandhi-processing algorithm to perform a comprehensive word search on E-text.
Trellis Analysis of Transmission Burst Errors in Viterbi Decoding
The Viterbi decoder is the most favorable solution to the problem of decoding codewords from a convolutional encoder. It performs exceptionally well when a received codeword block contains single or multiple scattered errors. However, burst errors, which arise in data transmission from high transmission speeds and the widely varying error conditions of wireless fading channels, pose a decoding challenge that results in an unbearable number of residual errors. Using the Viterbi decoder's trellis diagrams, this paper analyses the effects of burst errors on the decoder that lead to residual errors, and proposes improvements to the encoding and decoding procedures of the existing (2, 1, 2) binary convolutional encoder. The improved version makes the Viterbi decoder effective at decoding burst errors and hence reduces residual errors over a poor channel. The proposed enhancements improve the decoder's operational performance by 75 percent; however, the modification reduces the encoder's data transmission rate from 1/2 to 1/6.
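For concreteness, here is a standard (2, 1, 2) binary convolutional code (rate 1/2, constraint length 3, the common generators 7 and 5 in octal) with a hard-decision Viterbi decoder; the paper's proposed rate-1/6 modification is not reproduced. A single scattered error is well within this code's free distance of 5, which is the behavior the abstract contrasts with burst errors:

```python
def conv_encode(bits):
    """(2,1,2) convolutional encoder, generators 7 (1+D+D^2) and 5 (1+D^2)."""
    s1 = s2 = 0                          # the two memory bits
    out = []
    for u in bits + [0, 0]:              # two tail bits flush the encoder
        out += [u ^ s1 ^ s2, u ^ s2]
        s1, s2 = u, s1
    return out

def viterbi_decode(received, n_msg):
    """Hard-decision Viterbi decoding over the 4-state trellis."""
    INF = float("inf")
    metric = [0, INF, INF, INF]          # best path metric per state
    paths = [[], [], [], []]             # survivor input bits per state
    for i in range(0, len(received), 2):
        r0, r1 = received[i], received[i + 1]
        new_metric = [INF] * 4
        new_paths = [None] * 4
        for s in range(4):
            if metric[s] == INF:
                continue
            s1, s2 = s >> 1, s & 1
            for u in (0, 1):
                # Branch metric: Hamming distance to the expected output pair.
                m = metric[s] + ((u ^ s1 ^ s2) != r0) + ((u ^ s2) != r1)
                ns = (u << 1) | s1       # next state after shifting in u
                if m < new_metric[ns]:
                    new_metric[ns] = m
                    new_paths[ns] = paths[s] + [u]
        metric, paths = new_metric, new_paths
    best = min(range(4), key=lambda s: metric[s])
    return paths[best][:n_msg]           # drop the two tail bits
```

Flipping a burst of adjacent coded bits in this sketch quickly exceeds the free distance and produces exactly the residual errors the paper analyses.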
Towards a Mobile-Based DSS for Smallholder Livestock Keepers: Tanzania as a Case Study
Building a useful and responsive Decision Support System (DSS) requires a deep understanding of the pertinent application domain before starting the system design. In this paper we report on an attempt to develop a mobile-based DSS for smallholder livestock keepers, with the Arusha region as a case study. The objective of the reported study is to provide smallholder livestock keepers with an information tool for decision making. The development process involved: 1) employing information-gathering techniques to understand smallholder livestock keepers' information needs; 2) studying the current methods used for information flow among livestock stakeholders (i.e. smallholder livestock keepers, extension officers, and livestock researchers); 3) analysing the current situation within Arusha, located in the northern part of Tanzania, in terms of mobile phone penetration, with the prospect of leveraging the high penetration rate for enhanced information sharing among smallholder livestock keepers; and 4) exploring options for the platform/model to be used for information access and delivery. The outputs of these four activities were used to inform the requirements elicitation and design phases of the mobile-based DSS development, and were supplemented by an extensive literature review of related work on requirements engineering in DSS development. It is anticipated that, once developed, the system will help livestock keepers improve farm-level productivity and decision making. Findings from the study indicate that the majority of smallholder livestock keepers in the selected area possess mobile phones and need access to specific information to support their livestock-related decision making. However, the information access platforms/models currently in place do not offer a satisfactory solution to their needs. Analysis of various options for design
An Ultra Low Power and High Throughput FPGA Implementation of SHA-1 Hash Algorithm
In this paper, we present a low-power and highly parallel architecture for SHA-1, an algorithm that is extremely iterative in nature, making the design specifically suitable for power-sensitive applications. This is achieved by first identifying independent operations among the consecutive iterations of the algorithm and then aligning them for execution in a highly parallel way. Consequently, when one iteration completes, other iterations are also nearly complete, with only a few of their dependent operations remaining. Using this approach, we were able to perform up to four SHA-1 iterations simultaneously, increasing throughput approximately fourfold. We also explain how our results are critical to lowering the power consumption of the design.
Classification of Sleep Stages Using Neural Network Based on EEG and EOG signals
This paper introduces an algorithm for classifying the different sleep stages. The algorithm consists of the wavelet packet transform (WPT), applied to 30-second epochs of EEG and EOG recordings to provide time-frequency information; a feature generator to quantify the information and reduce the data set size; and artificial neural networks for optimal classification. This yields a classification method with an efficiency of 90.41 percent.
Various Solutions to the Black Hole Attack in a Mobile Ad Hoc Network (MANET)
A mobile ad hoc network (MANET) is a kind of wireless network consisting of a number of distributed nodes connected without dependence on any infrastructure. MANET security has been an important issue for many years, and many researchers have focused on the black hole threat, in which a node announces that it has a route to the destination in all cases. Many solutions have been proposed to counter this threat; the problem is that the security threat still exists, because it is not prevented or avoided completely, and, in addition, the performance of the MANET is adversely affected by these solutions. The objective is to find out to what degree this attack can be prevented by these solutions without negatively affecting the efficiency of the MANET, so this survey may facilitate the development of more compact ideas for countering security threats. This paper discusses many important solutions that detect a black hole node using different strategies. A new strategy is also proposed, but it is still under testing.
Generic Lightweight Certificate Management Protocol (GLCMP)
This paper describes a Generic Lightweight Certificate Management Protocol (GLCMP) for handling certificates on mobile devices. In theory, various security solutions are designed to protect the valuable information of mobile users, but a device's power, memory, and processing constraints, together with high response times and authentication latencies, are the main challenges for researchers developing and integrating standard security mechanisms on it. It is observed that most mobile users are not technical enough to configure security parameters, and even existing libraries do not support extended security features such as transparent handling of certificates, verification of identities, and distribution of certificates. In this paper, an innovative and comparatively efficient protocol is designed and implemented. It not only overcomes the shortcomings of certificate handling on mobile devices but also provides extended certificate-related features such as registration, authentication, and trust delegation. The designed GLCMP is lightweight because all the complex and computation-intensive operations involved in creating a certificate request in the PKCS#10 standard format are offloaded to a proxy server. It also provides domain-based secure registration and verification of identities without exchanging any confidential information with the proxy servers, and no user credential is exchanged on the network for authentication. After analyzing its performance, we found that the authentication latency of GLCMP is 0.394 s, which is less than that of previously proposed protocols such as NSI (4.7 s), PKI (5.01 s), and PKASSO (5.19 s delegation time plus 0.082 s authentication time). We also formally verified our design using Z-notation modeling techniques and found that it is protected against man-in-the-middle, replay, impersonation, and non-repudiation attacks.
Tools and Techniques for Ontology Interoperability: A Survey
The idea of the semantic web is to add machine-processable information to web-based data in order to realize interoperability. An ontology is a shared conceptualization of the knowledge representation of a particular domain, used to make semantic information explicit. Ontologies play a prominent role in the semantic web by providing semantic information that assists communication among heterogeneous information repositories. Ontology interoperability provides for the reusability of ontologies. Different domain experts and ontology engineers create different ontologies for the same or similar domains depending on their data modeling requirements, which causes ontology heterogeneity and inconsistency problems. As increasing numbers of ontologies are developed by diverse communities, the demand for rapid ontology mapping is rising; for better and more precise results, ontology mapping is the solution. As their use has increased, providing means of resolving semantic differences has also become very important. Papers on ontology interoperability report results on different frameworks, which makes comparing them almost impossible. Therefore, the main focus of this paper is to provide some basics of ontology interoperability and to briefly introduce its different approaches. We survey the approaches that have been proposed for providing interoperability among domain ontologies, together with the related techniques and tools.
Efficient RSA Variant for Resource Constrained Environment
The work in this paper is concerned with the memory consumption and performance of the RSA cryptosystem, so that the most popular public-key algorithm can also be used efficiently in resource-constrained environments. For this purpose, an RSA variant, RC RSA, is proposed, which results in low computational cost and low memory consumption. RC RSA is an improvement over Dual RSA Small-e (chosen for its low memory consumption). Mathematically, compared to Dual RSA, RC RSA increases the decryption speed by a factor of 9, and in implementation by roughly a factor of 6, while the encryption speed remains as low as in standard RSA. Besides the computational speed-up, RC RSA is proved to be more secure than the Dual RSA scheme.