Mohammad Shkoukani, Eman Alnagi and Rawan Abulail
With the increasing interactions and interrelationships among companies, their suppliers and customers, the need to manage the flow of products, materials and information has grown, giving rise to the concept of Supply Chain Management (SCM). SCM is a long-established concept, implemented for many years to integrate these entities, but with the emergence of e-business and the widespread adoption of Internet technologies, the activities and processes of supply chain management have been enhanced. This paper gives a short description of the concept of SCM and its two models, upstream and downstream. Two case studies are discussed within these concepts. Finally, it discusses the impact of the evolution of e-business on the processes of both SCM models.
Jnana Ranjan Tripathy1 and Hrudaya Kumar Tripathy2
ANnSP is a stream-based, programmable, code-level statically reconfigurable processor for realizing neural networks in embedded systems. ANnSP is provided with a neural-network-to-stream compiler and a hardware core builder. The stream compiler makes it possible to realize various neural networks on ANnSP, while the builder turns the ANnSP processor into an IP core that can be restructured to satisfy different demands and constraints. This paper presents the architecture of the ANnSP processor, the streaming mechanism, and the builder facilities. Synthesis results of a 64-PE ANnSP on a 0.18μm standard-cell library are also presented. The results show that a 64-PE ANnSP can compute 25.6 giga connections per second, with a throughput of up to 51.2 giga 32-bit fixed-point operations per second. A comparison with high-performance parallel architectures places the 64-PE ANnSP among the best state-of-the-art parallel processors.
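As a quick sanity check, the two throughput figures quoted above are mutually consistent under one assumption (not stated in the abstract): that each PE performs one multiply-accumulate, i.e. two fixed-point operations, per connection per cycle.

```python
# Back-of-envelope check of the reported 64-PE ANnSP figures.
# Assumption: one multiply-accumulate (2 fixed-point ops) per connection.
PES = 64
CONNECTIONS_PER_SEC = 25.6e9            # reported: 25.6 giga connections/s

implied_clock_hz = CONNECTIONS_PER_SEC / PES   # per-PE connection rate
ops_per_sec = CONNECTIONS_PER_SEC * 2          # multiply + accumulate

# The implied clock is 400 MHz, and the op rate matches the reported
# 51.2 giga fixed-point operations per second.
```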
Subrata Paul1, Anirban Mitra2 and Prasanta Padhi3
Social networking tools have grown steadily in popularity. While most people use them to connect with friends, a closer study shows that their utility extends well beyond knowing and connecting with like-minded people: they can also be used for learning and knowledge sharing. "Educational networking" is the appropriate term for social networking technologies used for educational purposes. As the phrase "social networking" can carry negative connotations for educators, "educational networking" is a safer and more objective way to discuss its pedagogical value. In this article, we provide a brief overview of the advantages of educational networking. Our work includes a short review of the concepts, definitions and categorizations available in the social network literature. We also argue that the central characteristic of educational networking is the combination of personalization and socialization. This combination has the potential to foster transparency within student communities: such transparency gives students insight into each other's work, raising the quality and benchmark of their own work.
P. Deepan Babu1 and T. Amudha2
Software agents are autonomous, problem-solving computational entities capable of effective operation in dynamic and open environments. An intelligent agent is a type of software agent, autonomous in nature, which observes and acts upon its environment and performs a task at each host. Agents can also coordinate, reason, solve problems, clone themselves and merge with other agents. A grid is a set of resources (such as CPU, memory, disk, applications and databases) distributed over wide-area networks that supports large-scale distributed applications. Grid resources are geographically distributed and linked through the Internet to create a virtual supercomputer with vast computing capacity for solving complex problems. A genetic algorithm works with key parameters such as the fitness function, crossover probability and mutation probability, and optimizes task scheduling. This paper proposes a software-agent-based architecture to utilize resources effectively in a grid environment. The resource utilization of the architecture is compared and analyzed under three algorithms: shortest job first, arbitrary scheduling and a genetic algorithm. The efficiency of resource utilization is analyzed and the most suitable algorithm is suggested.
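A minimal sketch of a genetic algorithm for task-to-resource scheduling of the kind described above; the fitness function (makespan), population size, crossover and mutation probabilities are illustrative assumptions, not the paper's values.

```python
import random

def makespan(assign, task_len, speed):
    """Finish time of the busiest resource (lower is better)."""
    load = [0.0] * len(speed)
    for t, r in enumerate(assign):
        load[r] += task_len[t] / speed[r]
    return max(load)

def ga_schedule(task_len, speed, pop=30, gens=60, pc=0.8, pm=0.1, seed=1):
    """Evolve task->resource assignments; fitness = makespan."""
    rng = random.Random(seed)
    n, m = len(task_len), len(speed)
    popn = [[rng.randrange(m) for _ in range(n)] for _ in range(pop)]
    for _ in range(gens):
        popn.sort(key=lambda a: makespan(a, task_len, speed))
        nxt = popn[:2]                          # elitism: keep the two best
        while len(nxt) < pop:
            p1, p2 = rng.sample(popn[:10], 2)   # select parents among the fittest
            if rng.random() < pc:               # single-point crossover
                cut = rng.randrange(1, n)
                child = p1[:cut] + p2[cut:]
            else:
                child = p1[:]
            for i in range(n):                  # per-gene mutation
                if rng.random() < pm:
                    child[i] = rng.randrange(m)
            nxt.append(child)
        popn = nxt
    return min(popn, key=lambda a: makespan(a, task_len, speed))
```

For four tasks of lengths 4, 3, 2, 1 on two equal-speed resources, the GA converges toward the balanced split (makespan 5).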
Hasan Z.H Alhajhamad, Faris Amin Abu Hashish, Hoshang Kolivand and Mohd Shahrizal Sunar
This work is a cumulative study of techniques that can help extract the geometry and color of an image in a real-time environment. Image segmentation is an active area of computer vision, yet work is still ongoing to produce accurate segmentation results. In conjunction with other surveys that compare multiple techniques, this paper takes the opportunity to choose the most appropriate technique(s) to adopt for an Augmented Reality environment. Interested readers will gain knowledge of the various categories and types of research challenges in image-based segmentation within the scope of AR environments.
Trushar B. Patel1 and Premal Soni2
Data Mining (DM) is the technique of finding hidden facts in large amounts of data: from historical data, we derive knowledge. Data mining is an important aid for decision making in organizations. The data mining task is the automatic or semi-automatic analysis of large quantities of data to extract previously unknown, interesting patterns from the data store.
Fakhrosadat Fanian and Marjan Kuchaki Rafsanjani
Directed routing protocols, one of which is the data-centric protocol, are the main methods proposed for improving the exploration quality of sensor networks. In these protocols, routing data is stored locally within network nodes, so there is no global information about routes. One such protocol is the Sensor Protocol for Information via Negotiation (SPIN). Using a clustering process, this article attempts to solve SPIN's failure to deliver data to nodes remote from the origin when no request comes from nodes nearer to the origin. It also attempts to prevent separate routing between origin and receiver nodes, which wastes a major part of the network's resources. The proposed algorithm improves performance and efficiency, increases network lifetime, and strengthens the protocol's fault tolerance through rotation of the cluster-head roles.
U. V. Arivazhagu1 and S. Srinivasan2
P2P routing protocols face the tension that queries need to reach as many peers as possible to improve the chances of locating the target file, while message overhead should be kept as low as possible. We propose a trust-based query routing technique for P2P networks. Initially, the node with the maximum trust value is chosen as cluster head; these cluster heads are designated as trust managers. Each peer maintains a trust table that is updated when it receives feedback from the trust manager about the peer requesting a resource. If the update indicates that the node is reachable and trusted, routing is performed; otherwise, its echo time is verified again to decide on re-routing. Simulation results show that the proposed approach offers secure and reliable routing. This peer-to-peer routing was initially carried out without a caching mechanism for data packets; to add one, we propose methods for caching data packets at the peers and for replacing them with new data packets in the next routing round.
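The trust-table update and routing decision described above can be sketched as follows; the blending weights, neutral starting trust of 0.5, and threshold are illustrative assumptions, since the abstract does not give the exact update rule.

```python
class Peer:
    """Toy peer keeping a trust table updated from trust-manager feedback."""
    def __init__(self, name, threshold=0.5):
        self.name = name
        self.threshold = threshold
        self.trust = {}                        # peer -> trust value in [0, 1]

    def feedback(self, peer, score):
        """Blend new feedback from the trust manager into the table."""
        old = self.trust.get(peer, 0.5)        # unknown peers start neutral
        self.trust[peer] = 0.7 * old + 0.3 * score

    def route_decision(self, peer, reachable):
        """Route only to reachable peers above the trust threshold;
        otherwise signal that the echo time must be re-verified."""
        if reachable and self.trust.get(peer, 0.0) >= self.threshold:
            return "route"
        return "verify-echo"
```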
Israa Hadi and Saad Taleb Hasson
This paper aims to develop new techniques for constructing a graph structure that represents the relations between objects in a movie. The main approach proceeds in several steps. The first step is database construction, which creates a table with a record for each frame, each record containing entities such as area, perimeter, and other parameters. The second step calculates a number of features for each object in each frame based on the parameters in the database. The third step constructs a graph structure based on the features of the objects.
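The three steps above can be sketched as follows; since the paper's exact feature set and linking rule are not given here, this sketch assumes per-frame records of (area, perimeter, centroid) and links two objects when their centroids fall within a distance threshold in the same frame.

```python
import math

def build_object_graph(frames, max_dist=50.0):
    """Build an undirected graph whose nodes are object IDs and whose edges
    link objects appearing in the same frame within max_dist of each other.
    Each frame is a dict: object_id -> (area, perimeter, cx, cy)."""
    nodes, edges = set(), set()
    for frame in frames:
        ids = list(frame)
        nodes.update(ids)
        for i in range(len(ids)):
            for j in range(i + 1, len(ids)):
                a, b = frame[ids[i]], frame[ids[j]]
                if math.dist(a[2:], b[2:]) <= max_dist:   # centroid distance
                    edges.add(frozenset((ids[i], ids[j])))
    return nodes, edges
```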
M. Geetha1 and G. M. Kadhar Nawaz2
Improving the efficiency of dynamic routing on a road network is difficult. Numerous works have addressed this problem from different angles, but most existing routing approaches are static. In this paper, we propose a fuzzy Dijkstra shortest-path algorithm based on a dynamic approach. The linguistic variables that qualify user parameters are quantified using fuzzy set theory, yielding fuzzy-number outputs that predict the shortest route on the network. Handling fuzzy parameters raises the issue of comparing two paths whose edge lengths are represented by fuzzy numbers; the addition of fuzzy numbers using the graded mean integration representation is used to extend Dijkstra's algorithm. A numerical example on a road network illustrates the efficiency of the proposed method.
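A minimal sketch of Dijkstra with fuzzy edge lengths, assuming triangular fuzzy numbers (a, b, c), componentwise addition, and the standard graded mean integration value (a + 4b + c)/6 for ranking; the graph below is illustrative, not the paper's example.

```python
import heapq

def gmi(f):
    """Graded mean integration of a triangular fuzzy number (a, b, c)."""
    a, b, c = f
    return (a + 4 * b + c) / 6

def fadd(x, y):
    """Addition of triangular fuzzy numbers is componentwise."""
    return tuple(u + v for u, v in zip(x, y))

def fuzzy_dijkstra(graph, src, dst):
    """Dijkstra where edge lengths are triangular fuzzy numbers,
    compared by their graded mean integration value."""
    best = {src: (0, 0, 0)}
    heap = [(0.0, src)]
    while heap:
        d, u = heapq.heappop(heap)
        if u == dst:
            return best[u]
        if d > gmi(best[u]):          # stale queue entry
            continue
        for v, w in graph.get(u, []):
            cand = fadd(best[u], w)
            if v not in best or gmi(cand) < gmi(best[v]):
                best[v] = cand
                heapq.heappush(heap, (gmi(cand), v))
    return best.get(dst)
```

On a graph where A→B→C has fuzzy length (2, 3, 4) (GMI 3.0) and the direct edge A→C has (4, 5, 6) (GMI 5.0), the two-hop path is correctly preferred.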
Saad Talib Hasson and Alaa Taima
A mobile ad hoc network (MANET) is an autonomous, self-configuring network of mobile nodes that can be formed without any pre-established infrastructure or centralized administration. MANETs are extremely flexible, and each node is free to move independently in any random direction. Each node in a MANET continuously maintains the information required to properly route traffic. This paper presents a simulation study to analyze and evaluate the behavior of a MANET running the AODV routing protocol under four mobility models: Random Waypoint (RWP), Reference Point Group Mobility (RPGM), Gauss-Markov (GMM) and Manhattan Grid (MGM). Several performance metrics (throughput, Packet Delivery Fraction (PDF), Average End-to-end Delay (AED), Normalized Routing Load (NRL) and packet loss) were used to compare the four mobility models in NS-2. Parameters such as the number of nodes, speed, pause time, environment area and traffic rate were varied across five scenarios. The results indicate that AODV performs best under the RPGM mobility model.
Mohammed Najm Abdallah*1 and Ahmed Mosa Dinar2
This paper investigates which routing protocol is most robust and energy-efficient in underwater wireless sensor networks by comparing three protocols: Vector-Based Forwarding (VBF), Hop-by-Hop Vector-Based Forwarding (HHVBF) and Vector-Based Void Avoidance (VBVA). Acoustic communication is the typical physical-layer technology for underwater networks, in contrast to electromagnetic and optical waves. Energy constraints, long delays and limited bandwidth are major challenges in underwater acoustic sensor networks for monitoring and other applications. This paper presents a performance evaluation of these protocols in terms of energy consumption, average end-to-end delay and packet delivery ratio. The comparison is carried out using the Aqua-Sim simulator for underwater sensor networks, an NS-2-based simulator installed in a Linux environment.
Majid J. Jawad1 and Tawfiq A. Abbas*2
While data transmission technology advances rapidly, the technology to protect data from unauthorized users must be enhanced as well. Several technologies, such as cryptography, have been used, but they do not provide an efficient solution; digital watermarking can. Digital watermarking is applied to several media, including images, audio, video and text. In most applications of digital watermarking, the watermark protects the copyright of a digital product; in other words, the product (cover) itself is what matters. Building on this fact, we present a new watermarking concept called intelligent watermarking, taking GIS (Geographic Information System) vector maps as a case study. Briefly, the scheme depends on features of the cover. Embedding proceeds in several steps: first, extract features from the original digital vector map; second, combine these features with an external watermark to obtain the intelligent watermark; third, embed the intelligent watermark in the vector map to obtain the watermarked vector map. Extraction likewise proceeds in several steps: first, extract the intelligent watermark from the watermarked digital vector map; second, decompose the intelligent watermark into the features and the external watermark; third, compare the extracted features against the features of the watermarked vector map. If the watermarked vector map has been attacked, it will be detected as distorted; otherwise it will not. The proposed scheme can be applied to all the other media as well.
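The embed/verify flow above can be sketched as follows. The feature function (a hash of vertex coordinates) and the XOR combination are illustrative assumptions: the paper's actual geometric features and embedding method are not specified in this abstract.

```python
def extract_features(vertices, bits=32):
    """Toy feature: a 32-bit hash over the vector map's vertex coordinates
    (a stand-in for the paper's real geometric features)."""
    h = 0
    for x, y in vertices:
        h = ((h * 31 + int(x * 1000)) ^ int(y * 1000)) % (1 << bits)
    return h

def make_intelligent_watermark(features, external_wm):
    """Combine the cover's features with the external watermark (here: XOR)."""
    return features ^ external_wm

def verify(vertices, intelligent_wm, external_wm):
    """Decompose the intelligent watermark and compare against freshly
    extracted features; a mismatch means the map was attacked/distorted."""
    recovered_features = intelligent_wm ^ external_wm
    return recovered_features == extract_features(vertices)
```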
Swati Rastogi and Sanjeev Thakur
Word sense disambiguation (WSD) is an open problem of natural language processing that governs identifying which sense (i.e., meaning) of a word is used in a sentence when the word has multiple meanings. Research has progressed steadily to the point where WSD systems achieve reasonably high accuracy on a variety of word types and ambiguities. A rich variety of techniques has been researched: dictionary-based methods that use the knowledge encoded in lexical resources; supervised machine learning methods in which a classifier is trained for each distinct word on a corpus of manually sense-annotated examples; and completely unsupervised methods that cluster occurrences of words, thereby inducing word senses. Among these, supervised learning approaches have been the most successful to date. The senses of a word are expressed by its WordNet synsets, arranged according to their relevance, and the relevance of these senses is determined probabilistically through a Bayesian belief network. The main contribution of this work is a fully probabilistic framework for word sense disambiguation with a semi-supervised learning technique that utilizes WordNet. This paper aims to enrich WordNet for word sense disambiguation by adding extra features that may enhance the efficiency of knowledge-based contextual-overlap WSD algorithms when used with WordNet.
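To make the contextual-overlap idea concrete, here is a minimal simplified-Lesk sketch with a toy sense inventory standing in for WordNet synsets; the paper's actual system adds a Bayesian belief network on top, which is not reproduced here.

```python
def simplified_lesk(word, context, glosses):
    """Pick the sense whose gloss overlaps most with the context words.
    glosses: sense name -> gloss string (toy stand-in for WordNet synsets)."""
    ctx = set(context.lower().split())
    best, best_overlap = None, -1
    for sense, gloss in glosses.items():
        overlap = len(ctx & set(gloss.lower().split()))
        if overlap > best_overlap:
            best, best_overlap = sense, overlap
    return best
```

With glosses for a river bank and a financial bank, a river-flavored context selects the river sense by word overlap.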
Tawfiq A. Al-Asadi1 and Wurood H. Albayati2
Communication is an essential part of the human experience; as humanity evolves, technology evolves too, impacting the communicative aspects of our lives. We hear many acronyms nowadays, such as IoT, M2M and cloud-mobile, and cellular communication is the seed of them all, with much research in this growing field. In this paper, we survey work in cellular communication from 2007 to 2012 to determine its challenges and future trends; the role of computer techniques in solving these challenges is also discussed. The paper tries to answer questions such as: What are the open research areas (or challenges) of cellular communication? Do computer techniques represent a good solution to these challenges? Which computer algorithms are used to solve them?
Andril Alagusabai1 and T.P. Saravanabava2
Trends and technologies are growing rapidly, and Internet connectivity plays a significant role, but connectivity is not available in all locations. This case study considers location-based services, which rely entirely on connectivity: if users lose connectivity, they cannot receive updates from location-based services. We therefore implemented an offline location-based service from which users can benefit even when there is no connectivity. The system is realized as an Android application that holds all location-based content in a local database management system, and it has been deployed on a smartphone.
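The core idea, serving location queries from a preloaded local database instead of the network, can be sketched as follows. The paper's system is an Android app; this Python/SQLite sketch, with an illustrative schema and naive bounding-box query, only demonstrates the offline lookup pattern.

```python
import sqlite3

def build_local_store(places):
    """Preload a local SQLite store so lookups work with no connectivity."""
    db = sqlite3.connect(":memory:")          # on a phone this would be a file
    db.execute("CREATE TABLE poi (name TEXT, lat REAL, lon REAL, info TEXT)")
    db.executemany("INSERT INTO poi VALUES (?, ?, ?, ?)", places)
    return db

def nearby(db, lat, lon, radius=0.01):
    """Naive bounding-box query for points of interest near the user."""
    cur = db.execute(
        "SELECT name, info FROM poi "
        "WHERE lat BETWEEN ? AND ? AND lon BETWEEN ? AND ?",
        (lat - radius, lat + radius, lon - radius, lon + radius))
    return cur.fetchall()
```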
Ashish Kumar1 and Kunwar Singh Vaisla2
In the world of computing, information plays an important role in our lives, and one of its major sources is the database. Databases and database technology have a major impact on the growing use of computers: almost all IT applications store and retrieve information from a database, and Database Management Systems (DBMS) are widely used for this purpose. However, databases are often hard to use because their interfaces are quite rigid. Storing and retrieving information requires knowledge of a database language such as Structured Query Language (SQL), the ANSI standard for accessing and manipulating information stored in databases; not everyone can write an SQL query, since they may not know the syntax of SQL or the structure of the database. In India, the natural language of many people is Hindi, and a large number of e-governance applications use databases. For people more comfortable with Hindi to use such applications with ease, the applications should accept a simple sentence in Hindi and process it to generate an SQL query, which is then executed on the database to produce results. Any interface in Hindi will therefore be an asset to these people. This paper discusses the architecture for mapping a Hindi-language query entered by the user into an SQL query.
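A toy sketch of the mapping stage described above. The lexicon uses romanized Hindi keywords for portability and is purely illustrative; a real system would tokenize Devanagari text and use a much fuller grammar than this keyword lookup.

```python
# Illustrative lexicon: romanized-Hindi token -> SQL fragment.
LEXICON = {
    "dikhao": "SELECT",       # "show"
    "naam": "name",           # attribute
    "vetan": "salary",        # attribute
    "karmchari": "employee",  # table
}

def hindi_to_sql(sentence):
    """Map a simple romanized-Hindi request onto a SELECT statement."""
    tokens = sentence.lower().split()
    cols = [LEXICON[t] for t in tokens
            if t in LEXICON and LEXICON[t] not in ("SELECT", "employee")]
    if "dikhao" in tokens and "karmchari" in tokens:
        return f"SELECT {', '.join(cols) or '*'} FROM employee"
    return None               # sentence not understood
```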
Nikhil Lambe and Rajesh Prasad
Cloud computing is not a new technology: it is related to grid computing and is based on distributed systems. Cloud computing is used to provide services and access to resources according to its different service types. This paper briefly explains cloud computing and grid computing, how the two concepts differ from each other, and their key strengths and characteristics.
Md. Sadique Shaikh1 and Gurav Ashok Patil2
Web-based Content Management Systems (WCMS) consist of applications used to create, manage, store and deploy content on the Web, including text, multimedia data (graphics, video or audio) and application code. WCMS are often a component of Enterprise Content Management (ECM) solutions. The content management layer contains the core components of the web content management application. In a WCMS, the authorization component grants appropriate privileges to users based on their respective roles, while library services provide the core content management functionality (check-in/out, version control) along with publishing, staging, logging and content reporting/auditing. The web interface and portal application presents content to the various user segments according to their authorization. Remote portals (e.g., web parts, gadgets, widgets) can be used to embed the content management functionality or sourced content in portals provided by other vendors, and the search indexing engine can create searchable indexes from websites supported by the WCMS solution. This paper reviews prior work and introduces basic models of CMS/WCMS for business processes with reference to the reviewed sources.
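The authorization component described above, granting privileges by role, can be sketched as a simple lookup; the role names and privilege sets below are illustrative, not taken from any particular WCMS.

```python
# Illustrative role -> privilege mapping for a WCMS authorization component.
ROLE_PRIVILEGES = {
    "author":  {"create", "check-out", "check-in"},
    "editor":  {"create", "check-out", "check-in", "publish", "stage"},
    "auditor": {"report", "audit-log"},
}

def authorize(role, action):
    """Grant or deny an action based on the user's role."""
    return action in ROLE_PRIVILEGES.get(role, set())
```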
M. Rajendran and K. Thirukumar
Link prediction is a key technique in many social network applications, where potential links between entities need to be predicted. Typical link prediction techniques deal with either uniform entities (e.g., company-to-company or applicant-to-applicant links) or non-mutual relationships (e.g., company-to-applicant links). A more challenging problem is link prediction among composite entities with mutual links, such as accurately predicting matches between companies and jobs or workers on employment websites, where links are mutually determined by both entities and the composite entities belong to disjoint groups. The nature of the interactions in these domains makes composite and mutual link prediction significantly different from the typical version of the problem. This work addresses these issues by proposing a Support Vector Machine model; implementing the proposed algorithm is expected to increase accuracy on the link prediction problem.
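For a self-contained illustration of link prediction, here is the classic common-neighbours baseline that feature-based models such as the SVM proposed above often build upon; this is a simpler stand-in, not the paper's method.

```python
def common_neighbor_scores(edges, candidates):
    """Score each candidate (u, v) pair by its number of shared neighbours;
    a higher score predicts a more likely link."""
    adj = {}
    for u, v in edges:
        adj.setdefault(u, set()).add(v)
        adj.setdefault(v, set()).add(u)
    return {(u, v): len(adj.get(u, set()) & adj.get(v, set()))
            for u, v in candidates}
```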
Lija Chacko1, S. Karthiprem2 and V. Rajasekar2
Digital image processing is a rapidly evolving field with growing applications in science and engineering, and edge enhancement plays a vital role in it as a pre-processing step for all images. Digital images are numeric representations of two-dimensional images. A multi-scale algorithm for the unsupervised extraction of the most significant edges has previously been applied to SAR images; here we propose a novel technique for automatic edge enhancement in indoor digital images. The characteristics of digital images justify the importance of an edge enhancement step prior to edge detection, so a robust, unsupervised edge enhancement algorithm based on a combination of wavelet coefficients at different scales can be applied. First, the wavelet transform is used to remove noise from and enhance the collected images. Second, edge detection is performed on the enhanced images with a Canny edge detector, using a double-thresholding method modified from Canny's. To derive the final image, the local intensities of each image, as well as their correlation, are taken into account using fuzzy logic. The method has proven robust and effective for this application and could also be used for segmentation of digital images.
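The double-thresholding step mentioned above can be sketched on a gradient-magnitude grid as follows. This is a simplified one-pass hysteresis (true Canny hysteresis propagates weak-edge acceptance iteratively), and the threshold values are illustrative.

```python
def double_threshold_edges(grad, low, high):
    """Canny-style double thresholding on a 2-D gradient-magnitude grid:
    pixels >= high are strong edges; pixels in [low, high) survive only if
    they touch a strong pixel (one-pass 8-neighbour hysteresis, simplified)."""
    rows, cols = len(grad), len(grad[0])
    strong = {(r, c) for r in range(rows) for c in range(cols)
              if grad[r][c] >= high}
    edges = set(strong)
    for r in range(rows):
        for c in range(cols):
            if low <= grad[r][c] < high:          # weak pixel
                if any((r + dr, c + dc) in strong
                       for dr in (-1, 0, 1) for dc in (-1, 0, 1)):
                    edges.add((r, c))
    return edges
```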