Muhannad Akram Ahmad1, Farah Hanna Zawaideh2, Ahmad Bisher3
The growing adoption of Accounting Information Systems (AIS) in small and medium-scale organizations raises many uncertainties across different areas of development and reliability. In medium-scale organizations, e-commerce is the most critical process that can be combined with AIS, and strong criteria are needed to meet reliability and productivity requirements. This paper studies and demonstrates the effects of applying e-commerce in medium-scale organizations in cooperation with AIS. The study sample consists of Jordanian medium-scale organizations. A questionnaire was distributed and, based on the analysis demonstrated in this paper, a hypothesis was tested on the study sample, a group comprising 50% of the medium-scale companies in Jordan. A statistical analysis was performed to examine the hypothesis and the relationship between e-commerce and AIS in the study sample. The use of e-commerce supplies suitable accounting information about elements present at the right time, in a state that is both dependable and steady for decision makers; in addition, e-commerce tackles the issue of information access security by requiring a username and password, which prevents unauthorized system entry. A strong relationship between the two sides, AIS and e-commerce, is presented in this paper for a specific scale of organization and business model.
A. Namachivayam and Kaliyaperumal Karthikeyan
This paper introduces a face recognition method using probabilistic subspace analysis on multi-module singular value features of face images. The singular value vector of a face image is a valid feature for identification, but the recognition rate is low when only one module's singular value vector is used. To improve the recognition rate, many sub-images are obtained by dividing the face image in different ways, with all singular values of each sub-image used as a new sample vector of the face image. These multi-module singular value vectors capture the features of a face image from the local level to the whole, so more discriminative information is available for face recognition. Subsequently, probabilistic subspace analysis is applied to these multi-module singular value vectors. The experimental results demonstrate that the method is clearly superior to corresponding algorithms, with recognition rates of 97.5% and 99.5% on the ORL and CAS-PEAL-R1 human face image databases, respectively.
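The block-division idea above can be sketched in plain Python: the face image is split into sub-images, the singular values of each sub-image are computed (here via the closed form for 2x2 blocks, the square roots of the eigenvalues of M^T M), and all of them are concatenated into one multi-module feature vector. The block size and toy image are illustrative assumptions, not the paper's settings.

```python
import math

def singular_values_2x2(m):
    # Closed-form singular values of a 2x2 block: square roots of the
    # eigenvalues of M^T M, obtained from its trace and determinant.
    a, b = m[0]
    c, d = m[1]
    t = a*a + b*b + c*c + d*d          # trace of M^T M
    det = (a*d - b*c) ** 2             # determinant of M^T M
    disc = math.sqrt(max(t*t - 4*det, 0.0))
    return [math.sqrt((t + disc) / 2), math.sqrt(max((t - disc) / 2, 0.0))]

def multi_module_feature(image, block=2):
    # Divide the image into block x block sub-images and concatenate the
    # singular values of every sub-image into one feature vector.
    feat = []
    for i in range(0, len(image), block):
        for j in range(0, len(image[0]), block):
            sub = [row[j:j+block] for row in image[i:i+block]]
            feat.extend(singular_values_2x2(sub))
    return feat

# A toy 4x4 "face image" yields a feature of 4 blocks x 2 singular values.
img = [[1, 2, 3, 4],
       [5, 6, 7, 8],
       [9, 8, 7, 6],
       [5, 4, 3, 2]]
print(len(multi_module_feature(img)))  # 8
```

For real face images one would use larger blocks and a full SVD routine; this sketch only shows how local features are concatenated "from local to the whole".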
Keyloggers have been widely used by hackers as a tool to steal information and passwords from users in e-commerce. Anti-malware security software has grown more capable, but keyloggers have evolved as well. This article reviews some of the techniques hackers use to spread keyloggers and to bypass various security measures with advanced keyloggers, and finally we describe a novel approach to dealing with them. This approach is capable of handling the most advanced keyloggers and greatly reduces the damage they cause.
With the advent of the Web and search engines, online searching has become a common method for obtaining information. People search for information daily and for various purposes: for help with job-related tasks or schoolwork, or just for enjoyment. Searching is the second most common online activity after email; on a typical day, more than half of Internet users search the Internet using a search engine. Web searching is an easy and convenient way to find information, providing almost instant access to millions of search results, but the quality of the information those results link to is uneven: some of it may be incorrect or come from unreliable sources. Many search engines are available today, such as Google, Yahoo, Bing, AltaVista, MSN Search and IceRocket, each offering different features and efficiencies. Obviously, we cannot use all of them at the same time, which raises the questions: which one is best, and which one should be used? The present study tries to answer these questions by comparing two search engines: Google and Yahoo.
H.B. BASANTH KUMAR
Recent advances in technology have made a tremendous amount of multimedia content available. The amount of video is increasing, so systems that improve access to video are needed. Current research topics on video include video abstraction or summarization, video classification, video annotation and content-based video retrieval. All of these applications need to identify shots and key frames in a video that correctly and concisely indicate its contents. This paper presents a brief overview of shot boundary detection and its techniques.
Jalal H. Awad and Amir S. Almallah
Moving object detection has been widely used in diverse disciplines such as intelligent transportation systems, airport security systems, video monitoring systems, and so on. In this paper we propose an edge-segment-based statistical background modeling algorithm that can be used for moving edge detection in a video surveillance system with a static camera. Because the proposed method is edge-segment based, it can overcome some of the difficulties that traditional pixel-based methods face in updating the background model, or that produce ghosts when a sudden change occurs in the background. As an edge-segment-based method it is robust to illumination variation and noise; it also avoids the traditional difficulties of existing pixel-based methods, such as the scattering of moving edge pixels, which prevents them from utilizing edge shape information. Some other edge-segment-based methods treat every edge segment equally, creating edge mismatches due to non-stationary backgrounds. The proposed method offers an elegant solution to this lack by using a model built on the statistics of each background edge segment, so that it can model both static and partially moving background edges using ordinary training images that may even contain moving objects.
Jayachandra Kannemadugu1 and D. Punyasheshudu2
Improving the performance of canal systems is far more challenging than that of groundwater, and canals will remain the dominant water transfer technique in developing countries for the foreseeable future. The standards used for the design of irrigation canals in many countries have not kept up with the development of new technologies. Global Positioning System (GPS) and Geographical Information System (GIS) integrated systems have emerged in recent years as powerful tools for conducting large-scale surveys, for mapping, for creating a spatial database of any geographical area at any scale, and for integrating non-spatial information, modeling, analysis and the generation of statistical data according to the nature of the application. This article employs spatial analysis techniques to extract the extent of the land holdings of the individual owners/enjoyers that would be acquired by the Government for constructing the canal, and to determine the compensation for their land.
Prerna Jain* and Balraj Singh
Fault proneness prediction of a software module plays a great role in quality assurance. Fault proneness prediction aims to find the areas of the software that have the greatest impact, and identifying fault-prone modules has become a significant challenge today. Faults are flaws in the software that can cause a failure, and most failures occur in small parts of the software modules. Repeatedly testing only the fault-prone modules after resolving faults decreases testing cost and effort compared with repeatedly testing the whole module. The primary goal of this paper is to analyse techniques for predicting the fault proneness of software modules.
Saad Talib Hasson1 and Sura Jasim
The Mobile Ad hoc Network (MANET) is one of the essential wireless networks of this era. The routing process is one of the most important challenges facing the designers and managers of such networks, while performance evaluation is the main process that must be controlled in order to test, improve and develop any new or existing MANET. A network's performance can be measured using many metrics, such as throughput, delay and overhead. This paper presents a behavioral study of certain MANET routing protocols using OPNET Modeler with respect to three performance metrics: throughput, routing overhead and delay. Different scenarios with different numbers of nodes (network size) and different network area sizes were implemented. Three routing protocols, namely the Ad hoc On-Demand Distance Vector (AODV), the Optimized Link State Routing protocol (OLSR) and the Geographic Routing Protocol (GRP), were applied in this study.
Geetanjali Kandhari1, Deepak Aggarwal2
The challenges of ad hoc networks are attributable to their lack of established infrastructure, the requirement for decentralized management, dynamic topology, and wireless channel characteristics. We study the performance of the IEEE 802.15 MAC protocol under various conditions, such as using power control and other techniques that yield improvements over other ad hoc networks. To satisfy these needs, a WSN is used that supports adaptivity, and improvement across the layers of the protocol stack is needed. A WSN improves over other types of ad hoc networks because in a WSN data is only transferred when both sender and receiver are ready; otherwise they remain in sleep mode. The major challenge addressed in this paper is avoiding attacks, since a WSN otherwise performs consistently better than other technologies. Attacks are classified in Section 3; from this classification we find that the message replay attack is one of the more powerful attacks: the malicious node keeps contacting the destination node, which assumes the packets will arrive soon, but the malicious node never delivers them. At this stage message dropping starts and the performance of the entire network goes down. We implement a secret key methodology in the network scenario so that each node communicates with another node only if they share a secret key; otherwise communication cannot occur. From the results it was observed that we improved the performance of the WSN and avoided message replay attacks.
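The secret-key defence against replay can be illustrated with a minimal sketch: each packet carries a nonce and an HMAC computed with the pre-shared key, and the receiver rejects any nonce it has already accepted, so a replayed copy of a valid packet is dropped. The key, message format and nonce scheme here are illustrative assumptions, not the paper's exact protocol.

```python
import hmac, hashlib

SECRET_KEY = b"shared-network-key"   # hypothetical pre-shared key
seen_nonces = set()                  # nonces already accepted by the receiver

def send(payload: bytes, nonce: int):
    # Sender tags the message with an HMAC over payload + nonce.
    tag = hmac.new(SECRET_KEY, payload + nonce.to_bytes(8, "big"),
                   hashlib.sha256).digest()
    return payload, nonce, tag

def receive(payload: bytes, nonce: int, tag: bytes) -> bool:
    # Receiver rejects messages with a bad tag or an already-seen nonce,
    # so a replayed packet is dropped even though its tag is valid.
    expected = hmac.new(SECRET_KEY, payload + nonce.to_bytes(8, "big"),
                        hashlib.sha256).digest()
    if not hmac.compare_digest(tag, expected) or nonce in seen_nonces:
        return False
    seen_nonces.add(nonce)
    return True

msg = send(b"sensor reading 42", nonce=1)
print(receive(*msg))   # True  - first delivery accepted
print(receive(*msg))   # False - replayed copy rejected
```

In a real deployment the nonce cache would be bounded (e.g. a sliding window of sequence numbers), but the acceptance logic is the same.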
There are several web architectures present in the World Wide Web (WWW), such as client-server web architecture (CWA), distributed web architecture (DWA) and service-oriented architecture (SOA). DWA and SOA are the most popular distributed technologies in the WWW, and the use of SOA, or more specifically web services technology, is an important architectural decision. Marketing applications also use distributed technology, because it provides several quality attributes: modifiability, performance, availability, security, interoperability and scalability. CWA does not provide these quality attributes to the same extent as DWA and SOA. Today, distributed web architecture has become the most popular approach for developing web systems. In this paper we evaluate the performance quality attributes of CWA, DWA and SOA; these evaluations are expressed in terms of time.
Mohammed Saad Talib1 and Zainab Saad Talib2
Simulation techniques are among the most important tools that can be used in the verification, testing and implementation of Mobile Ad-hoc Network (MANET) environments. These computerized techniques can feasibly be used as a virtual tool to model and operate any proposed environment. A MANET is defined as a set of mobile nodes that move freely and connect with each other without any infrastructure. Simulation is a reliable tool for evaluating the performance of an existing or proposed MANET environment under different configurations over periods of real time. In this paper a certain MANET environment was simulated and operated with the AOMDV routing protocol. The effects of the number of nodes, their speeds and their pause times were modeled and analyzed. The network simulator NS-2 was used to study and evaluate the effects of these factors. Performance metrics such as throughput, packet loss, packet delivery fraction (PDF), average end-to-end delay, normalized routing load (NRL) and jitter were used as comparison indicators. The simulation environment was implemented with different numbers of nodes, different speeds, and varying pause times. Significant effects and relations between these parameters and the performance metrics were found and quantified.
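The metrics listed above reduce to simple ratios over counters extracted from a simulation trace. The sketch below shows the standard formulas; the counter values are hypothetical, not results from the paper.

```python
def pdf(received, sent):
    # Packet delivery fraction: data packets delivered / data packets sent, in %.
    return 100.0 * received / sent

def nrl(routing_packets, received):
    # Normalized routing load: routing packets sent per data packet delivered.
    return routing_packets / received

def avg_end_to_end_delay(delays):
    # Mean of per-packet end-to-end delays (seconds).
    return sum(delays) / len(delays)

# Hypothetical counters collected from an NS-2 trace file.
sent, received, routing = 1000, 930, 2790
print(pdf(received, sent))     # 93.0
print(nrl(routing, received))  # 3.0
print(avg_end_to_end_delay([0.02, 0.04, 0.03]))
```

In practice these counters come from parsing the NS-2 trace (send/receive/forward events); the formulas themselves are the standard ones used for MANET comparisons.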
Merin Varghese, Sherin Chacko Thomas, S. Manimurugan
Image inpainting and geometric rearrangement are closely linked and can be understood together through their algorithms and the shift map. This paper discusses the formulation and algorithm of inpainting by correspondence map, with clear notation and equations. The correspondence map links each blank or missing pixel to some known pixel whose value lies in the seed image; the algorithm is a descent on E(F). The paper also describes a new representation of geometric rearrangement of images using operations such as graph labelling as part of image editing, where the shift map represents the selected label for each output pixel. Pixel rearrangement, pixel saliency and removal, and the smoothness term for pixel pairs are described in detail.
Shrikant Lade and Ashish Pahdey
Data mining is the process of extracting interesting and previously unknown knowledge from data. In this thesis we address the important data mining problem of discovering association rules in single-table and multiple-table databases, and we also introduce a generalization of the database concept of functional dependency: purity dependencies, which can be viewed as a type of rule that is informationally more significant than association rules. An association rule expresses the dependence of a set of attribute-value pairs, also called items, upon another set of items.
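The dependence an association rule expresses is usually quantified by support and confidence, which can be sketched directly over a toy transaction table (the transactions below are illustrative, not data from the thesis):

```python
transactions = [
    {"bread", "milk"},
    {"bread", "butter", "milk"},
    {"bread", "butter"},
    {"milk", "butter"},
]

def support(itemset):
    # Fraction of transactions containing every item in the itemset.
    return sum(itemset <= t for t in transactions) / len(transactions)

def confidence(antecedent, consequent):
    # P(consequent | antecedent): support of the union over support of
    # the antecedent alone.
    return support(antecedent | consequent) / support(antecedent)

print(support({"bread", "milk"}))           # 0.5
print(confidence({"bread"}, {"milk"}))      # 2 of the 3 bread baskets have milk
```

A rule such as {bread} -> {milk} is reported when both values exceed user-chosen thresholds; mining algorithms differ mainly in how they enumerate candidate itemsets efficiently.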
Pramod Kadam, Sachin Kadam
This paper provides initial motivation for why one should undertake an evolutionary study in computer science in general, and in the domain of sorting algorithms in particular. It offers a preliminary evaluation of the evolutionary study of sorting algorithms and a list of the literature that can support such a study.
Cloud computing, with its promise of (almost) unlimited computation, storage and bandwidth, is increasingly becoming the infrastructure of choice for many organizations. As cloud offerings mature, service-based applications need to dynamically recompose themselves to self-adapt to changing QoS requirements. In this paper, we present a decentralized mechanism for such self-adaptation, using natural language processing. We use a continuous double auction to allow applications to decide which services to choose among the many on offer. The proposed scheme exploits concepts derived from graph partitioning and groups tasks together so as to 1) minimize the time overlap of the tasks assigned to a given resource and 2) maximize the time overlap among tasks assigned to different resources. The partitioning is performed using a spectral clustering methodology through normalized cuts. Experimental results show that the proposed algorithm outperforms other scheduling algorithms for different values of the granularity and the load of the task requests.
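The continuous double auction mentioned above can be sketched in a few lines: consumers post bids, providers post asks, and a trade clears whenever the best bid meets or exceeds the best ask. Clearing at the seller's ask price is one simple convention chosen for illustration; the paper's exact pricing rule may differ.

```python
import heapq

class DoubleAuction:
    # Minimal continuous double auction: buyers (service consumers) post
    # bids, sellers (service providers) post asks; a trade clears whenever
    # the best bid meets or exceeds the best ask.
    def __init__(self):
        self.bids = []   # max-heap via negated prices
        self.asks = []   # min-heap

    def bid(self, price, who):
        heapq.heappush(self.bids, (-price, who))
        return self._match()

    def ask(self, price, who):
        heapq.heappush(self.asks, (price, who))
        return self._match()

    def _match(self):
        if self.bids and self.asks and -self.bids[0][0] >= self.asks[0][0]:
            (_, buyer) = heapq.heappop(self.bids)
            (price, seller) = heapq.heappop(self.asks)
            return buyer, seller, price   # trade at the seller's ask price
        return None

market = DoubleAuction()
market.ask(5.0, "provider-A")
market.ask(7.0, "provider-B")
print(market.bid(6.0, "app-1"))   # ('app-1', 'provider-A', 5.0)
print(market.bid(6.0, "app-2"))   # None - remaining ask of 7.0 is too high
```

Because matching is local to the order book, no central scheduler is needed, which is what makes the mechanism decentralized.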
Ibtisam A. Aljazaery
In this paper, a new method is presented to encrypt one-dimensional signals and images (monochrome or color) in much less time than if these signals and images were encrypted at their original sizes. The method depends on extracting the important features that distinguish these signals and images and discarding the rest; the next step is encrypting the lowest dimensions of these data. The discrete wavelet transform (DWT) is used for feature extraction because it is a powerful signal-processing tool owing to its multiresolution capabilities. The chosen data, after its dimension is shrunk, is encrypted with a conventional cryptographic algorithm (a permutation algorithm) using a suitable encryption key. The encrypted data is 100% unrecognizable, and the decryption algorithm returns the encrypted data to its original dimension efficiently.
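The shrink-then-permute pipeline can be sketched with a one-level Haar wavelet transform (the simplest DWT) followed by a key-seeded permutation cipher over the half-length approximation coefficients. The wavelet choice, key handling and signal below are illustrative assumptions; the paper's exact transform and permutation may differ.

```python
import random

def haar_dwt(signal):
    # One-level Haar transform: pairwise averages (approximation) and
    # differences (detail); the approximation is half the original length.
    approx = [(signal[i] + signal[i+1]) / 2 for i in range(0, len(signal), 2)]
    detail = [(signal[i] - signal[i+1]) / 2 for i in range(0, len(signal), 2)]
    return approx, detail

def permute_encrypt(coeffs, key):
    # Key-seeded permutation cipher applied to the shrunken coefficient vector.
    order = list(range(len(coeffs)))
    random.Random(key).shuffle(order)
    return [coeffs[i] for i in order], order

def permute_decrypt(cipher, order):
    # Invert the permutation to recover the coefficients.
    plain = [0.0] * len(cipher)
    for pos, i in enumerate(order):
        plain[i] = cipher[pos]
    return plain

signal = [4.0, 2.0, 6.0, 8.0, 1.0, 3.0, 5.0, 7.0]
approx, detail = haar_dwt(signal)            # 8 samples -> 4 coefficients
cipher, order = permute_encrypt(approx, key=1234)
print(permute_decrypt(cipher, order) == approx)   # True
```

Encrypting only the 4 approximation coefficients instead of all 8 samples is what yields the speed-up the abstract claims; the inverse DWT then restores the original dimension.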
Dr. Saad Talib Hasson1, Jullanar Ali Al-Motairy1, Luay Habeeb Hashim1, Zainab Saad Talib2 and Wurood Mahdi Sahib1
Computer simulation and modeling techniques have been used successfully to represent and solve optimum resource allocation problems in many fields and applications. In wireless communication networks these techniques can be applied to developing and controlling the available paths, power control, coverage area, delay, and delivery guarantees.
In this study the main objective is to find the optimum locations of the wireless network stations so as to maximize throughput, minimize delay, reduce path loss, increase the coverage area and minimize the total network cost, while the network stays within its constraints under the required standard of quality of service.
The modeling results showed good indications and gave a suitable guide for building, extending or developing such networks.
By leveraging virtual machine (VM) technology which provides performance and fault isolation, Cloud resources can be provisioned on demand in a fine-grained, multiplexed manner rather than in monolithic pieces. By integrating volunteer computing into Cloud architectures, we envision a gigantic Self-Organizing Cloud (SOC) being formed to reap the huge potential of untapped commodity computing power over the Internet. Towards this new architecture where each participant may autonomously act as both resource consumer and provider, we propose a fully distributed, VM-multiplexing resource allocation scheme to manage decentralized resources. Our approach not only achieves maximized resource utilization using the proportional share model (PSM), but also delivers provably and adaptively optimal execution efficiency. We also design
a novel multi-attribute range query protocol for locating qualified nodes. Contrary to existing solutions, which often generate bulky messages per request, our protocol produces only one lightweight query message per task on the Content Addressable Network (CAN). It works effectively to find for each task its qualified resources under a randomized policy that mitigates contention among requesters. This paper extends our previous work and proposes a lightweight mathematical model to estimate quantitatively the energy cost of live migration of an idle virtual machine. A series of experiments were conducted on KVM to profile the migration time and the power consumption during live migration. Based on these data we derived an energy cost model that predicts the energy overhead of live migration of virtual machines with an accuracy higher than 90%.
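A cost model derived from profiling data, as described above, can be as simple as a least-squares line relating migration time to energy drawn. The sketch below fits such a one-variable model; the profiling numbers are hypothetical, not the paper's KVM measurements.

```python
def fit_line(xs, ys):
    # Ordinary least squares for a one-variable model y = a*x + b.
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return a, my - a * mx

# Hypothetical profiling data: migration time (s) vs. energy drawn (J).
times  = [10.0, 20.0, 30.0, 40.0]
energy = [110.0, 205.0, 310.0, 405.0]
a, b = fit_line(times, energy)
print(round(a, 2), round(b, 2))    # slope ~9.9 J/s, offset ~10 J
predicted = a * 25.0 + b           # estimated energy of a 25-second migration
```

Once fitted, the model predicts migration overhead without re-profiling; accuracy is then checked against held-out migrations, as the paper does with its 90% figure.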
Priyanka Dubey1, Prof. Roshni Dubey2
The past fifteen years have been characterized by exponential growth of the Web, both in the number of Web sites available and in the number of their users. This growth has generated huge quantities of data related to users' interaction with Web sites, recorded in Web log files. Moreover, Web site owners have expressed the need to understand their visitors better in order to serve them better. Web Usage Mining (WUM) is a rather recent research field corresponding to the process of knowledge discovery from databases (KDD) applied to Web usage data. It comprises three main stages: the preprocessing of raw data, the discovery of patterns, and the analysis (or interpretation) of results. A WUM process extracts behavioral patterns from the Web usage data and, if available, from information about the Web site (structure and content) and its users (user profiles). In this thesis, we make two significant contributions to the WUM process: we propose a customized, application-specific methodology for preprocessing Web logs and a modified frequent pattern tree for efficient pattern discovery.
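The preprocessing stage can be sketched concretely: parse raw Common Log Format lines, drop failed requests, and group each visitor's requests into sessions. The log lines, the 30-minute session timeout and the host-based user identification are conventional illustrative assumptions, not the thesis's exact methodology.

```python
import re
from datetime import datetime, timedelta

LOG_RE = re.compile(r'(\S+) \S+ \S+ \[([^\]]+)\] "(\S+) (\S+) \S+" (\d{3}) \S+')

def parse_line(line):
    # Extract host, timestamp, URL and status from a Common Log Format entry.
    m = LOG_RE.match(line)
    host, ts, method, url, status = m.groups()
    when = datetime.strptime(ts, "%d/%b/%Y:%H:%M:%S %z")
    return host, when, url, int(status)

def sessionize(entries, timeout=timedelta(minutes=30)):
    # Group requests of the same host into sessions, starting a new session
    # whenever the gap between two requests exceeds the timeout.
    sessions, last = {}, {}
    for host, when, url, status in sorted(entries, key=lambda e: e[1]):
        if status >= 400:
            continue                    # data cleaning: drop failed requests
        if host not in last or when - last[host] > timeout:
            sessions.setdefault(host, []).append([])
        sessions[host][-1].append(url)
        last[host] = when
    return sessions

raw = [
    '10.0.0.1 - - [01/Mar/2014:10:00:00 +0000] "GET /index.html HTTP/1.1" 200 512',
    '10.0.0.1 - - [01/Mar/2014:10:05:00 +0000] "GET /about.html HTTP/1.1" 200 256',
    '10.0.0.1 - - [01/Mar/2014:11:00:00 +0000] "GET /news.html HTTP/1.1" 200 128',
]
s = sessionize(parse_line(l) for l in raw)
print(len(s["10.0.0.1"]))   # 2 sessions: the third request is 55 minutes later
```

The resulting per-session URL sequences are exactly the transactions that the pattern-discovery stage (e.g. a frequent pattern tree) consumes.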
Omni Thakur*, Janpreet Singh
Risk identification tools are usually established to avoid or minimize problems likely to occur during software development. Risk identification can be described as the task of analysing and managing the impact of every important risk that arises in a project. In the context of risk identification practices, we conducted a scoping study aimed at analysing the current scenario of risk identification practices in the software development process. We analysed studies published in the most important venues up to the year 2013. Based on the analysed data set, we sketched a set of useful techniques and tools for applying risk identification in software projects. The analysis indicates that most of the studies subjectively describe ways to evaluate risks instead of providing readers with details on how risk identification is to be performed. Such findings point to the need for further research in the field of risk management, especially the identification of risks in the software development process.
*Anil Rajput, **Naveen Kumar
In this paper we discuss a new technology in the mobile industry known as fifth-generation (5G) mobile technology, or nano technology. As technologies are enhanced day by day, the mobile industry advances with them: every 4 to 5 years a new technology is introduced to the IT market, developed to integrate real life with mobile technology. Over the years, the wireless telecommunications market has been recognized as one of the most dynamic and fastest growing segments of the global telecommunications industry. First came 1G, after which 2G and 3G surfaced. Nowadays the very popular 4G, also known as a super-fast technology, is in use, and next will come 5G, the fifth-generation technology. This latest and more advanced technology will provide a unique experience to its users; fifth-generation wireless systems mark a major phase of mobile telecommunications standards.
1Sheo Das, 2Dr. Kuldeep Singh Raghuwanshi
A meta-search engine is a search engine that utilizes multiple underlying search engines: when it receives a query from a user, it invokes those engines to retrieve useful information. Most of the data on the web is in the form of text or images. A good database selection algorithm should identify potentially useful databases accurately, and many approaches have been proposed to tackle the database selection problem. These approaches differ in the database representatives they use to indicate the contents of each database, the measures they use to indicate the usefulness of each database with respect to a given query, and the techniques they employ to estimate that usefulness. In this paper, we design an algorithm, and compare it with an existing algorithm, for selecting the most appropriate search engine with respect to the user query.
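The representative-based selection described above can be sketched as follows: each engine's database is summarized by the document frequencies of its indexed terms, and a query is routed to the engine whose representative scores highest. The representatives and the sum-of-frequencies usefulness measure are simplified illustrative stand-ins for the richer measures (e.g. CORI-style scores) used in the literature.

```python
# Hypothetical representatives: per-engine document frequencies of indexed terms.
representatives = {
    "engine-A": {"laptop": 900, "review": 400, "camera": 50},
    "engine-B": {"laptop": 100, "review": 150, "camera": 800},
    "engine-C": {"recipe": 700, "camera": 20},
}

def usefulness(query_terms, df):
    # Simple usefulness estimate: sum of document frequencies of the query
    # terms in the database representative (higher = more promising).
    return sum(df.get(t, 0) for t in query_terms)

def select_engine(query):
    # Route the query to the engine whose representative scores highest.
    terms = query.lower().split()
    return max(representatives, key=lambda e: usefulness(terms, representatives[e]))

print(select_engine("laptop review"))   # engine-A
print(select_engine("camera"))          # engine-B
```

The design space the abstract describes corresponds to varying the representative (what `df` stores) and the measure (how `usefulness` combines it).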
Er. Syed Minhaj Ali
This paper presents a short survey of the discriminative least squares regression (LSR) framework and its applications. The core idea is to review the discriminative LSR framework and the application areas where it provides an optimal solution. To apply the method to various classification problems, several new ideas have been added, such as ε-dragging, which is introduced to force the regression targets of different classes to move in opposite directions so that the distances between classes are enlarged. The ε-dragging is then integrated into the LSR model for multiclass classification. In this way the new learning framework, referred to as discriminative least squares regression, has a compact model form in which there is no need to train two-class machines that are independent of each other. With its compact form, the model can be naturally extended for feature selection. The aim of this paper is thus to survey the discriminative least squares regression method and the various fields where it can be put to good use.
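The ε-dragging idea can be made concrete by constructing the relaxed regression targets: the standard one-vs-all targets (+1 for the true class, -1 elsewhere) are each pushed further in their own direction, widening class distances. A fixed dragging step is used here for illustration; in discriminative LSR the per-entry dragging amounts are nonnegative variables optimized jointly with the regression weights.

```python
EPSILON = 0.5   # hypothetical fixed dragging step

def dragged_targets(labels, n_classes, eps=EPSILON):
    # Standard one-vs-all regression targets are +1 for the true class and
    # -1 elsewhere; epsilon-dragging pushes each target further in its own
    # direction (+1 -> 1+eps, -1 -> -1-eps) so class distances grow.
    targets = []
    for y in labels:
        row = [(1.0 + eps) if k == y else (-1.0 - eps) for k in range(n_classes)]
        targets.append(row)
    return targets

print(dragged_targets([0, 2], n_classes=3))
# [[1.5, -1.5, -1.5], [-1.5, -1.5, 1.5]]
```

Solving the usual regularized LSR against this dragged target matrix is what gives the single compact multiclass model the survey describes, instead of independent two-class machines.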
Jnana Ranjan Tripathy1, Hrudaya Kumar Tripathy2, S.S.Nayak3
The Neural Network Trainer (NNT) was originally developed as a tool for training neural networks on a PC or comparable computing machine. NNT originally produced for the user an array of weights corresponding to the weights in a neural network architecture designed by that user; from this point, it was the user's responsibility to create a neural network that could utilize these weights. This paper transforms the original tool into a complete neural network implementation package for microcontrollers. The software package includes the trainer, an assembly-language-based generic neural network for the PIC18 series microcontroller, an 8-bit neural network simulator, a microcontroller communication interface for testing embedded neural networks, and a C-implemented neural network for any microcontroller with a C compiler.
N. Lakshmi Prasanna
The field of graph theory plays a vital role in various fields. One of the important areas in graph theory is graph labeling, used in many applications such as coding theory, x-ray crystallography, radar, astronomy, circuit design, communication network addressing and database management. This paper gives an overview of the labeling of graphs in heterogeneous fields to some extent, but mainly focuses on communication networks. Communication networks are of two types: wired and wireless. Day by day, wireless networks have been developed to ease communication between any two systems, resulting in more efficient communication. This paper also explores the role of labeling in expanding the utility of the channel assignment process in communication networks. Various papers based on graph labeling have been reviewed, and their relevance to communication networks identified. This paper addresses how the concept of graph labeling can be applied to network security, network addressing, the channel assignment process and social networks. An overview and new ideas are proposed here.
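The channel assignment application can be sketched as graph coloring: stations are vertices, interference is an edge, and each station is labeled with the smallest channel not used by an interfering neighbour. The greedy rule and the interference graph below are illustrative assumptions; real assignment schemes add distance-dependent separation constraints.

```python
def assign_channels(interference_graph):
    # Greedy graph-coloring channel assignment: each station receives the
    # smallest channel number not already used by an interfering neighbour.
    channels = {}
    for node in sorted(interference_graph):
        taken = {channels[n] for n in interference_graph[node] if n in channels}
        ch = 0
        while ch in taken:
            ch += 1
        channels[node] = ch
    return channels

# Hypothetical interference graph: stations close enough to interfere.
graph = {
    "A": ["B", "C"],
    "B": ["A", "C"],
    "C": ["A", "B", "D"],
    "D": ["C"],
}
print(assign_channels(graph))   # {'A': 0, 'B': 1, 'C': 2, 'D': 0}
```

Note that D reuses channel 0 because it does not interfere with A; minimizing the number of channels this way is exactly the graph-coloring view of labeling.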
JUNEESH K KURIACHAN
This article recounts research in progress which attempts to account for a paradigm shift in Indian design education. The research also explores features that need to be rooted and nurtured in the foundation year of design education to suit the realities of life in 21st-century India. This foundation year has had diverse titles, including "Design Fundamentals" and "Basic Design". The foundation year began at the Bauhaus and evolved after 1945 at the Hochschule für Gestaltung, Ulm and the Academy of Art and Design Basel (HGK Basel). In its emergent period design was concentrated on discrete products including industrial goods, textiles, ceramics, architecture and graphic design. Today, however, to be pertinent to contemporary society, designers need to work on convoluted issues that are interdisciplinary and far more comprehensive in scope. Twenty-first-century design education needs to be able to apply design and develop strategies to solve actual issues, not just look at "good form". There is also a visible shift from client-driven projects towards a more reflective "issue-based" design education that strives for more socially inclusive, locally/glocally/globally relevant solutions: a move from "human-centric design" to "life-centric design". The research to date incorporates an overview of the history of design education in India from its European origins and a literature review of both formal (books, papers, and reports) and informal sources (blogs and emails) to justify and reinforce the argument for change. Identification of the issues central to contemporary design education is based on in-depth interviews and focus group discussions with design educators and professional designers who interact with design students as mentors or share a common concern for design education. Deliberations on design and insights into future directions were gathered from conference and seminar recordings. The problems of the Indian people, both nationally and locally, within the mesh of cultural diversity and economic discrepancy, including health, transportation, housing, agricultural support and safe water provision, to mention some of the many sectors, are areas which offer potential for the designer to contribute remedies to the wide range of problems facing India. It is important for designers to comprehend the complexity of the issues at stake as well as to be aware of "intangibles" like values, social responsibility, empathy, humility, and local/global relevance. When design education includes political, social, economic and ecological discourses in a collaborative, inter/multidisciplinary way, design can perhaps participate actively in nation building.
A. HYILS SHARON MAGDALENE
It is widely recognized that the threat to enterprises from insider activities is increasing and
that significant costs are being incurred. The multi-faceted dimensions of insider threat and
compromising actions have resulted in a diverse experience and understanding of what insider
threats are and how to detect or prevent them. The purpose of this research is to investigate the
potential for near real-time detection of insider threat activities within a large enterprise environment
using monitoring tools centred on the information infrastructure. As insider threat activities are not confined solely to cyber-based threats, the research will explore the potential for harnessing a variety of threat indicators buried in the different enterprise operations connected to or interfacing with the information infrastructure, while enabling human analysts to make informed decisions efficiently.
MAZIN OMAR KHAIRO
The current IT plethora replicates enormous amounts of data, and servers need tremendous processing power to maintain this valuable data and its storage. Therefore, to ease data storage, erasure coding has shown much success in the area of data mining, as it can reduce the space and bandwidth overheads of redundancy in fault-tolerant delivery systems; the exploration of erasure coding in concurrence with metadata is thus worthwhile. Research on the other side of the coin shows that analyzing and understanding consistent hashing will also prove productive for certain related issues, and neural networks can be used for maintaining and exploring new data sciences in order to provide encouraging frameworks for managing the infinite volumes of data at our disposal. In this work, we demonstrate the use of simulated annealing to render solutions at the global maximum and provide improvements to the specified framework or model. We also analyze, and state our disconfirmation of, the claim that write-back caches and neural networks are never incompatible.