Akhil Khare1, Meenu Kumari1 and Pallavi Khare2
In today's world, security is the most important issue in any communication. Many data security and data hiding algorithms have been developed over the last decade, which served as motivation for our research. In this project, named “JPEG Compression Steganography”, we have designed a system that allows an average user to securely transfer text messages by hiding them in a digital image file using the local characteristics within the image. The project combines steganography and encryption algorithms, which provides a strong backbone for its security. The proposed system not only hides a large volume of data within an image, but also limits the perceivable distortion that might occur while processing it. This software has an advantage over other information security software because the hidden text is carried in images, which are not obvious text information carriers. The project contains several challenges that make it interesting to develop. The central task is to survey available steganography and encryption algorithms and pick the ones that offer the best combination of strong encryption, usability and performance. The main advantage of this project is a simple, powerful and user-friendly GUI that plays a very large role in the success of the application.
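The abstract does not specify the embedding scheme, so the following is only an illustrative sketch of the simplest common approach, least-significant-bit (LSB) embedding, applied to a flat list of pixel values. A real JPEG-based system would embed in quantized DCT coefficients, since pixel LSBs do not survive lossy compression; the function names here are hypothetical.

```python
def embed(pixels, message):
    """Hide message bytes in the least-significant bits of a flat pixel list."""
    bits = []
    for byte in message:
        bits.extend((byte >> i) & 1 for i in range(7, -1, -1))
    if len(bits) > len(pixels):
        raise ValueError("message too large for cover image")
    stego = list(pixels)
    for i, bit in enumerate(bits):
        stego[i] = (stego[i] & ~1) | bit  # clear LSB, then set it to the data bit
    return stego

def extract(pixels, length):
    """Recover `length` bytes from the LSBs of a stego pixel list."""
    out = bytearray()
    for i in range(length):
        byte = 0
        for j in range(8):
            byte = (byte << 1) | (pixels[i * 8 + j] & 1)
        out.append(byte)
    return bytes(out)
```

Because only the lowest bit of each pixel changes, no pixel value moves by more than 1, which is why the perceivable distortion stays small.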
Mazhar Khaliq1, Riyaz A. Khan2 and M. H. Khan3
This paper presents a review of software quality measurement. An effort has been made to put forth the demand for, and dependency on, quality software in our modern society. This demand has increased the size and complexity of computer software systems during the past decades. The software industry is responding to the demand for high-quality products by spending resources to improve the quality of software products. A brief discussion of quality and software quality is given. Further, discussions of measurement and software quality measurement are presented. Finally, the paper throws light on the role of various software quality models and software metrics in software quality measurement.
Shwetank1, Kamal Jain2 and Karamjit Bhatia3
In the present study the Multispectral Image Processing (MIP) technique is applied to ASTER (Advanced Spaceborne Thermal Emission and Reflection Radiometer) L1B high-resolution (15 m/pixel) satellite data. A comprehensive spectral library of rice crop varieties, namely Hybrid-6129 (IET 18815), Pant Dhan-19 (IET 17544), Pusa Basmati-1 (IET-18990) and Pant Dhan-18 (IET-17920), has been developed with the Blue (0.56 µm), Red (0.66 µm) and NIR (0.81 µm) spectral bands. A PCA (Principal Component Analysis) transformation with a correlation matrix is applied for feature extraction to select an optimum subset of the data in terms of classification accuracy. Four PC (Principal Component) images were selected for conventional spectral and integrated image classification. The integrated Spectral/NDVI (Normalized Difference Vegetation Index) image is developed using the spectral and NDVI bands and classified using the ML (Maximum Likelihood) classifier. The conventional spectral classification accuracy for rice mapping is 79.5%, which improves to 84.5% with the Spectral/NDVI imagery data.
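The NDVI band used in the integrated image is computed per pixel from the red and NIR reflectances as (NIR - Red) / (NIR + Red). A minimal sketch over flat band lists (a real pipeline would operate on raster arrays; the epsilon guard is an implementation detail, not from the paper):

```python
def ndvi(nir, red, eps=1e-12):
    """Normalized Difference Vegetation Index for paired band values.

    Vegetation reflects strongly in NIR and absorbs red light, so healthy
    crops push the index toward +1; bare soil and water sit near or below 0.
    """
    return [(n - r) / (n + r + eps) for n, r in zip(nir, red)]
```

Stacking this index alongside the spectral bands is what produces the Spectral/NDVI input to the ML classifier.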
R.S. Kamath1 and R.K. Kamat2
In the current implementation, the software is able to browse VRML and STL files. The 3D scenes are rendered by reading .wrl and .stl files, with support for properties such as various lights, material colors, options for solid, wireframe, point and line viewing, texture mapping, transformations, and different camera views. A stereoscopic display is a prime part of this implementation and accounts for the virtual reality aspect. Passive stereo is a low-cost 3D visualization technique that enables the user to enter a world of virtual realism. This paper highlights the technical details of STL and VRML files, the method of parsing such files, rendering the model, and passive stereo viewing of the respective model.
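As an illustration of the parsing step, the sketch below reads facets from an ASCII STL string. It assumes well-formed input; binary STL and VRML need separate parsers, and the paper does not describe its parser in detail:

```python
def parse_ascii_stl(text):
    """Extract (normal, vertices) pairs from an ASCII STL string."""
    facets, vertices, normal = [], [], None
    for line in text.splitlines():
        parts = line.split()
        if not parts:
            continue
        if parts[0] == "facet" and parts[1] == "normal":
            normal = tuple(float(x) for x in parts[2:5])
            vertices = []  # start collecting this facet's triangle
        elif parts[0] == "vertex":
            vertices.append(tuple(float(x) for x in parts[1:4]))
        elif parts[0] == "endfacet":
            facets.append((normal, vertices))
    return facets
```

Each returned facet carries the surface normal the renderer needs for lighting and the three vertices of the triangle.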
Abdolvahab Ehsanirad and Sharath Kumar Y. H.
In this paper, image processing techniques have been used to classify plants by applying them to leaf images. To extract the texture features of the leaves, the Gray-Level Co-occurrence Matrix (GLCM) and Principal Component Analysis (PCA) algorithms have been considered. The algorithms are trained on 390 leaves to classify 13 kinds of plants and tested on 65 new or deformed leaf images. The results indicate that the accuracy of the GLCM method is 78%, while the accuracy of the PCA method is 98%.
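A GLCM counts how often pairs of gray levels co-occur at a fixed pixel offset; texture features such as Haralick contrast are then derived from the normalized matrix. A minimal sketch in pure Python over a small integer image (the horizontal offset and the single contrast feature are illustrative choices, not necessarily the ones the paper uses):

```python
def glcm(image, levels, dx=1, dy=0):
    """Normalized gray-level co-occurrence matrix for offset (dx, dy)."""
    m = [[0.0] * levels for _ in range(levels)]
    rows, cols = len(image), len(image[0])
    total = 0
    for r in range(rows):
        for c in range(cols):
            r2, c2 = r + dy, c + dx
            if 0 <= r2 < rows and 0 <= c2 < cols:
                m[image[r][c]][image[r2][c2]] += 1  # count this gray-level pair
                total += 1
    return [[v / total for v in row] for row in m]

def contrast(m):
    """Haralick contrast feature: sum of p(i, j) * (i - j)^2."""
    return sum(m[i][j] * (i - j) ** 2
               for i in range(len(m)) for j in range(len(m)))
```

A feature vector built from several such statistics (contrast, energy, homogeneity, correlation) is what the classifier is trained on.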
Himanshu Agarwal1, Ashish Seth2 and Ashim Raj Singla3
SOA is a buzzword today and much is said about it. The actual goal of SOA is to help align IT capabilities with business goals. Another important goal of SOA is to provide an agile technical infrastructure that can be quickly and easily reconfigured as business requirements change. Until the emergence of SOA-based IT systems, business and government organizations were faced with a difficult trade-off between the expense of custom solutions and the convenience of packaged applications. In this paper we argue how service-based information systems differ from component-based systems. Further, we identify the line of division between the two approaches and point out the issues of SOA adoption in an organization.
Arushi Arora, Swati Priya and Akhil Khare
During traversals of linked data structures (LDS), prefetching improves performance by reducing memory latency. We discuss jump-pointer prefetching, which hides additional load latency by using an extra pointer to prefetch objects more than a single link away. Jump pointers can be installed in binary trees at creation time and in other LDS at traversal time. Prefetch arrays are also used to store jump pointers. There are two approaches, hardware and software, and both have substantially improved prefetching performance through the use of jump pointers. Prefetching in pointer-based code (such as Java programs) is difficult because separate dynamically allocated objects are disjoint in memory, and the access patterns are thus less regular and predictable. However, according to experimental results, the largest performance improvement is 48% with jump pointers in Java programs, but consistent improvements are difficult to obtain.
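Prefetching itself is a hardware/compiler mechanism, so the sketch below only illustrates the pointer bookkeeping: installing jump pointers at traversal time using a sliding window of recently visited nodes, so each node ends up pointing a few links ahead. The distance of 3 is an illustrative choice:

```python
class Node:
    def __init__(self, value):
        self.value = value
        self.next = None
        self.jump = None  # prefetch target several links ahead

def install_jump_pointers(head, distance=3):
    """At traversal time, point each node at the node `distance` links ahead."""
    history = []  # sliding window of the last `distance` visited nodes
    node = head
    while node is not None:
        history.append(node)
        if len(history) > distance:
            history.pop(0).jump = node  # oldest node now points here
        node = node.next

def traverse(head):
    """Walk the list; in real code, node.jump would feed a prefetch hint."""
    values = []
    node = head
    while node is not None:
        # a compiler/runtime would issue prefetch(node.jump) here, so the
        # node `distance` links ahead is already in cache when we reach it
        values.append(node.value)
        node = node.next
    return values
```

The payoff is that while the processor works on the current node, the memory system is already fetching a node several links downstream, overlapping load latency with useful work.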
Kiran Kumar Reddi1* and M.V. Basaveswara Rao2
Bioinformatics, an area that has evolved in response to this deluge of information, can be viewed as the use of computational methods to handle biological data. It is an interdisciplinary field involving biology, computer science, mathematics and statistics to analyze biological sequence data and genome content and arrangement, and to predict the function and structure of macromolecules. Soft computing is a consortium of methodologies that work synergistically and provide, in one form or another, flexible information processing capabilities for handling real-life ambiguous situations. Its aim, unlike conventional (hard) computing, is to exploit the tolerance for imprecision, uncertainty, approximate reasoning and partial truth in order to achieve tractability, robustness, low solution cost, and close resemblance with human-like decision-making. The paper focuses on the soft computing paradigm in bioinformatics, with particular emphasis on research.
Saurabh Mukherjee1, Nitin Paharia1 and Ritu Tiwari2
Drinking water is known as another life for any living being. Other animals have natural mechanisms to separate pollutants from safe material, but humans do not. Safe drinking water is a big problem in India and other developing countries. Due to the massive pollution present in drinking water, it becomes immensely important to find a cost-effective solution for safe drinking water. Various public water treatment plants are doing well at this task, but there is a huge gap between demand and supply. Even when water has undergone treatment, it is always questionable whether it is safe to drink or not. Various private-sector players have entered the market claiming to supply safe and soft drinking water. Some are doing well, but cost still plays a big role in reaching the common person. The present paper is an attempt to propose a novel model to clean dirty water, with some constraints, in a specified domain.
A.I. Beena and B. Arthi
The goal of image enhancement is to improve the visual appearance of an image. For this purpose we use two methods to compute the image background using morphological transformations. Contrast enhancement in grey-level images with poor lighting can be achieved by two operators based on Weber's law: one operator employs information from block analysis, while the other utilizes opening by reconstruction. The contrast operators normalize the grey level of the input image to avoid abrupt changes in intensity among different regions.
Akhil Khare1, Harsh Lohani2 and Pallavi Khare3
The first step an analyst takes while developing software is to construct a sequence diagram, which describes the interactions that must occur between classes. The sequence diagram does not show the interaction of classes at run time; it shows static coupling, i.e. it only acknowledges the number of interactions between the classes. There have been many studies of the relationship between coupling and external quality factors of object-oriented software. A common way to define and measure coupling is through structural properties and static code analysis. However, because of polymorphism, dynamic binding, and the common presence of unused or dead code in commercial software, the resulting coupling measures are not precise, as they do not perfectly reflect the actual coupling taking place among classes dynamically. This paper proposes using the estimated frequencies of the use cases and propagating these frequencies through the sequence diagram to estimate dynamic coupling. This can be done using a Dynamic Clustering Mechanism (DCM), in which classes that interact with high frequency are clustered or grouped together (called hot spots) and are highly dynamically coupled. With this evaluation we can determine which hot spots are indeed relevant and actually deserve close attention from the designer with respect to the design of each class.
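The frequency-propagation idea can be sketched as follows: weight each class pair's interaction count (read off the sequence diagram) by the estimated execution frequency of its use case, sum across use cases, and flag pairs above a threshold as hot spots. The data shapes and threshold here are hypothetical; the paper does not give a concrete algorithm at this level of detail:

```python
def dynamic_coupling(use_case_freq, interactions):
    """Estimate dynamic coupling per class pair.

    use_case_freq: {use_case: estimated execution frequency}
    interactions:  {use_case: {(class_a, class_b): message count}}
    """
    coupling = {}
    for use_case, freq in use_case_freq.items():
        for pair, count in interactions.get(use_case, {}).items():
            key = tuple(sorted(pair))  # coupling is symmetric
            coupling[key] = coupling.get(key, 0.0) + freq * count
    return coupling

def hot_spots(coupling, threshold):
    """Class pairs whose frequency-weighted coupling exceeds the threshold."""
    return {pair for pair, weight in coupling.items() if weight >= threshold}
```

A rarely executed use case thus contributes little even if its diagram shows many messages, which is exactly how the estimate differs from static coupling counts.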
R. Velayutham1 and D. Manimegalai2
In November 2001 NIST published Rijndael as the algorithm for the AES (Advanced Encryption Standard). Results of new attack methods show that there may be some missing part in the design of the S-box and key schedule of the AES algorithm; the problem is the weakness of linearity existing in the S-box and key schedule. In order to resist these new attacks, and so that implementing AES in software and hardware provides a higher level of security and faster encryption speed, we analyze the AES algorithm in detail and propose a new implementation scheme that increases the complexity of the nonlinear transformation in the design of the S-box. An implementation scheme in Java, using a reconfigurable coprocessor as cryptographic hardware, is proposed.
Smita Selot1, A.S. Zadgaonkar2 and Neeta Tripathi3
Natural language processing has wide coverage in application areas like machine translation, text-to-speech conversion, semantic analysis, semantic role labeling and knowledge representation. Morphological and syntactic processing are components of NLP which process each word to produce the syntactic structure of the sentence with respect to its grammar. Semantic analysis follows syntactic analysis. One of the key tasks in morphological analysis is identifying the correct root word from its inflected form. In the Sanskrit language, these inflected words follow rules which can be used to separate the root word from its suffix, and the extracted suffixes carry a sufficient amount of syntactic and semantic information with them. To develop such a word splitter, rules called sandhi rules, given in the grammar of the Sanskrit language, have been used. The challenge in the problem lies in identifying the junction point, or breaking point, of the word, as multiple junctions can be obtained within a word. The developed system maintains a database of all possible suffixes, which are then used for splitting the word. The algorithm for the same is presented in the paper, along with solutions to problems faced while developing the module.
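The suffix-database idea can be sketched as below. The miniature suffix list and the Romanized example are purely hypothetical; a real system would hold the full inventory of Paninian endings and apply proper sandhi reversal at each candidate junction point:

```python
# Hypothetical miniature suffix database; a real splitter would store the
# complete set of nominal and verbal endings from Sanskrit grammar.
SUFFIXES = ["ena", "asya", "aya", "at", "am", "ah", "e"]

def split_word(word, roots):
    """Try junction points by matching suffixes longest-first; accept a
    split whose stem is a known root and whose remainder is a listed suffix."""
    for suffix in sorted(SUFFIXES, key=len, reverse=True):
        if word.endswith(suffix):
            stem = word[: -len(suffix)]
            if stem in roots:
                return stem, suffix
    return word, ""  # no valid junction found; return the word unsplit
```

Trying longer suffixes first is one simple way to disambiguate when several junction points match, which is the ambiguity the abstract describes.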
Anurag Lal1 and Vivek Dubey2
In this paper, loss of packets in TCP is detected using two diverse methods, CPR (Constant Packet Re-arranging) and WCPR (Without Constant Packet Re-arranging), on diverse platforms. This paper proposes a new version of TCP which gives high throughput when packet rearranging occurs and which, when packet rearranging does not occur, remains friendly to other versions of TCP. The key feature of constant packet rearranging is that duplicate ACKs are not used as an indication of packet loss; instead, a timer is used to detect packet loss. From a computational viewpoint, CPR is more demanding than WCPR. Because CPR does not rely on duplicate acknowledgments, packet rearranging (including out-of-order acknowledgments) has no effect on CPR performance.
Manish Saxena1* and Anubhuti Khare2
This paper deals with the mathematical modeling, design and application of the Fiber Bragg Grating (FBG) as a temperature sensor. We used MATLAB and filter-characteristics simulation software as tools for the simulation results. The fabrication of Fiber Bragg Gratings and their characteristics and fundamental properties are described. The reflectivity of the FBG is described using simulation results. This paper also presents the simulation results of the FBG as a temperature and gas sensor. From the plotting analysis it can be concluded that a longer grating can be used to reduce the width of the reflection spectrum.
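As a point of reference for the reflectivity results, coupled-mode theory gives a standard closed form for the peak reflectivity of a uniform FBG, R = tanh²(κL), where κ is the coupling coefficient and L the grating length. The parameter values below are illustrative, not taken from the paper:

```python
import math

def peak_reflectivity(kappa, length):
    """Peak reflectivity of a uniform FBG: R = tanh^2(kappa * L),
    with kappa the coupling coefficient (1/m) and L the grating length (m)."""
    return math.tanh(kappa * length) ** 2
```

The tanh form shows why reflectivity saturates toward 1 as κL grows, while (separately) the reflection bandwidth narrows with increasing L, consistent with the paper's conclusion about long gratings.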
Sheikh Umar Farooq and S.M.K. Quadri
Testing remains a truly effective means to assure the quality of a software system of nontrivial complexity. An important aspect of test planning is measuring test effectiveness. To make testing more successful we need to choose effective testing techniques. To compare testing techniques we need to place them on a measurement scale which can define the relative merits of the existing techniques, but due to differences among software systems and their allied parameters this task seems to be complex, if not impossible.
There are different techniques that can be used to recognize handwritten digits and characters. Two techniques discussed in this paper are pattern recognition and artificial neural networks. Both techniques are defined, and different methods for each technique are also discussed. Bayesian decision theory, the nearest-neighbor rule, and linear classification or discrimination are types of methods for pattern recognition. Shape recognition and character and handwritten digit recognition use neural networks. A neural network is used to train on and identify written digits. After training and testing, the accuracy rate reached 99%, which is very high.
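The nearest-neighbor rule mentioned above assigns a sample the label of its closest training vector. A minimal 1-NN sketch with toy two-dimensional features (real digit recognition would use pixel or shape features):

```python
def nearest_neighbor(sample, training):
    """1-NN rule: return the label of the training vector nearest to sample.

    `training` is a list of (feature_vector, label) pairs; squared Euclidean
    distance is enough for comparison, so the square root is skipped.
    """
    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(training, key=lambda item: dist2(sample, item[0]))[1]
```

Extending this to k-NN (majority vote over the k closest vectors) is a common robustness improvement over the single-neighbor rule shown here.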
Anil Rajput1, Manmohan Singh2, Naveen Kher3*, Shumali Kankane4 and Pawan Meena5
Virtual Private Networking, or VPN, is a technology that lets people access their office's computer network over the Internet while at home or traveling. Accessing a network in this way is referred to as remote access. (For comparison, another common form of remote access is dialing in to the office network over a telephone line.) But VPN is useful for more than just remote access. It can also be used to link two separate offices over a distance. This is sometimes called a "persistent VPN tunnel", or "site-to-site VPN".
M. Ravinder1, T. Venu Gopal2 and T. Venkat Narayana Rao3
Video data indexing and retrieval, which applies tags to large video databases, is useful as a complementary means for applications which have multimedia content and need faster search responses. Well-ordered and effective management of video documents depends on the availability of indexes. Manual indexing is not viable for large video collections in this modern era of the information superhighway. There is a need for techniques and frameworks that can store, handle, search and retrieve data from huge media archives. This paper discusses application areas and emerging challenges of video indexing and retrieval, and some future directions for video indexing and video retrieval management systems.
Mandeep Kaur, Parul Batra and Akhil Khare
The relationships between coupling and external quality factors of object-oriented software have been studied extensively over the past few years. For example, a clear empirical relationship between class-level coupling and the fault-proneness of classes has been identified by several studies. A number of statistical techniques, principally Agglomerative Hierarchical Clustering (AHC) analysis and bytecode instrumentation, are used to facilitate the identification of such objects. Dynamic coupling indicates the strength of association established by a connection from one software module to another at runtime. Despite the rich body of research in the field of software measurement, dynamic coupling measurement for aspect-oriented software is still missing. A dynamic coupling measurement framework for AspectJ programs is presented in this paper. The framework consists of a suite of measures for both method-level and class-level coupling relations. This paper also presents a new approach that applies static analysis, in particular class analysis, to the computation of dynamic coupling measures and is designed to work on incomplete programs.
K.H. Wandra1* and Ketan Kotecha2
Bhavna Arora Makin1 and Devanand Padha2
Data aggregation is a widely used technique in wireless sensor networks. The security issues in data aggregation, data confidentiality and integrity, become vital when the sensor network is deployed in a hostile environment. Much related work has been proposed to address these security issues. In this paper, we introduce the concept of mobile agents for secure data aggregation in sensor networks. We propose an algorithm that can be used to track a malicious node in the network and hence maintain data confidentiality and data integrity in data aggregation.
Wasim Ahmad Bhat and S.M.K. Quadri
The FAT file system is the most primitive, compatible and simple file system, and it still survives in this era on digital devices such as mini MP3 players, smart phones, and digital cameras. This file system is supported by almost all operating systems because of its simplicity and legacy. This paper presents a review of the basic design technique, constraints and formulas of the most important building-block data structure of the FAT32 file system: the FAT data structure.
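The central formulas such a review covers locate a cluster's entry inside the file allocation table. In FAT32 each entry is 4 bytes wide, so the sector and byte offset of the entry for cluster N follow directly from the byte offset N * 4; the sketch below follows the standard formulas from the FAT specification, and the parameter values in the test are illustrative:

```python
def fat32_entry_location(cluster, reserved_sectors, bytes_per_sector=512):
    """Locate the FAT entry for a cluster on a FAT32 volume.

    Each FAT32 entry is 4 bytes (only the low 28 bits hold the next-cluster
    value), so the entry's byte offset into the FAT is cluster * 4; the FAT
    itself starts right after the reserved region.
    """
    fat_offset = cluster * 4
    sector = reserved_sectors + fat_offset // bytes_per_sector
    offset = fat_offset % bytes_per_sector
    return sector, offset
```

Reading 4 bytes at that (sector, offset) and masking with 0x0FFFFFFF yields the next cluster in a file's chain, which is how the FAT links a file's clusters together.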
Network intrusion detection aims at distinguishing the behavior of the network and is an inseparable part of an information security system. Due to the rapid development of attack patterns, it is necessary to develop a system which can upgrade itself as new threats are detected. The detection rate should also be high, because the rate at which attacks are carried out on networks is very high. In response to this problem, an AdaBoost-based algorithm is proposed which has a high detection rate as well as a low false-alarm rate. In this algorithm, decision stumps are used as the weak classifier. Decision rules are provided for both categorical and continuous features. The weak classifiers for continuous features and for categorical features are combined to form a strong classifier. Strategies for avoiding overfitting are adopted to improve the performance of the algorithm.
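The boosting loop itself can be sketched compactly. The version below handles only continuous features with threshold stumps and ±1 labels; the paper's categorical-feature rules and overfitting strategies are omitted:

```python
import math

def train_stump(X, y, w):
    """Best threshold stump (feature, threshold, polarity) under weights w."""
    best_err, best_stump = float("inf"), None
    for f in range(len(X[0])):
        for t in sorted({x[f] for x in X}):
            for polarity in (1, -1):
                preds = [polarity if x[f] >= t else -polarity for x in X]
                err = sum(wi for wi, p, yi in zip(w, preds, y) if p != yi)
                if err < best_err:
                    best_err, best_stump = err, (f, t, polarity)
    return best_err, best_stump

def adaboost(X, y, rounds=5):
    """AdaBoost: reweight samples toward the ones the last stump got wrong."""
    n = len(X)
    w = [1.0 / n] * n
    ensemble = []
    for _ in range(rounds):
        err, (f, t, pol) = train_stump(X, y, w)
        err = max(err, 1e-10)                      # avoid log(0) on a perfect stump
        alpha = 0.5 * math.log((1 - err) / err)    # stump's vote weight
        ensemble.append((alpha, f, t, pol))
        preds = [pol if x[f] >= t else -pol for x in X]
        w = [wi * math.exp(-alpha * yi * p) for wi, yi, p in zip(w, y, preds)]
        s = sum(w)
        w = [wi / s for wi in w]                   # renormalize to a distribution
    return ensemble

def predict(ensemble, x):
    """Strong classifier: sign of the alpha-weighted sum of stump votes."""
    score = sum(a * (pol if x[f] >= t else -pol) for a, f, t, pol in ensemble)
    return 1 if score >= 0 else -1
```

Each round concentrates weight on misclassified samples, so later stumps specialize on the hard cases; the alpha-weighted vote is the "strong classifier" the abstract refers to.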
Olusegun Folorunso1*, Lateef O. Yusuf1 and Julius O. Okesola2
With the increase of complexity in Real-time Database Systems (RTDBS), the amount of data that needs to be managed has also increased. Adoption of an RTDBS as a tightly integrated part of the SOA development process can give significant benefits with respect to data management. However, the variability of data management requirements in different systems, and their heterogeneity, may require distinct database configurations. We address the challenges that face RTDB managers who intend to adopt RTDBS in the SOA market; we also introduce a service-oriented approach to RTDBS analytics and describe how it is used to measure and monitor the security system. An SOA approach for generating RTDBS configurations suitable for resource-constrained real-time systems, using Service Oriented Architecture tools to assist developers with the design and analysis of services of existing or new systems, is also explored.
Nitin D. Shelokar1 and S.A. Ladhake2
We present the concept of a real-time intrusion detection system. This seminar paper gives a brief idea of how to keep data in a secured system, i.e. free from hackers. We elaborate on the types of intrusion detection systems and cover the concept of a real-time system that detects any intruder entering the system. An intrusion detection system (IDS) is used to detect several types of malicious behavior that can compromise the security and trust of a computer system. We also briefly discuss real-time intrusion detection and the efficiency of intrusion detection systems. Data processed by the IDS may be a sequence of commands executed by a user, a sequence of system calls launched by an application (for example, a web client), network packets, and so on. Finally, the IDS can trigger countermeasures to eliminate an attack's cause or effects whenever an intrusion is detected.
K.S. Mann1 and Preeti Saini2
In hospitals, the ability to send and receive healthcare data, including patient information and various lab reports, is required. The information is stored in non-standard formats and must be converted from one format to another for transmission. In order to achieve this, all healthcare information must be sent in a specialized healthcare language. In healthcare, important or meaningful data can be easily identified by XML markup, or tags, and stored as electronic documents. XML is useful because of its variable-length nested structures; as a well-formed document format, it is easy to parse and manage.
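As an illustration of tagged healthcare data, the sketch below parses a small record with Python's standard library. The element and attribute names here are hypothetical; real exchange standards such as HL7 define their own tag sets:

```python
import xml.etree.ElementTree as ET

# Hypothetical record layout: the tags mark which bytes are the patient id,
# the name, and a lab value, so any parser can identify them unambiguously.
record = """<patient id="P001">
  <name>A. Example</name>
  <labReport test="glucose" unit="mg/dL">95</labReport>
</patient>"""

root = ET.fromstring(record)
patient_id = root.get("id")                   # attribute lookup
glucose = float(root.find("labReport").text)  # nested element's value
unit = root.find("labReport").get("unit")
```

Because the structure is self-describing, a receiving hospital system can locate the lab value and its unit without knowing the sender's internal storage format.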
Olusegun Folorunso, Adesina Temitayo Bello, Adesina S. Sodiya and Lateef O. Yusuf
The present visual analytics query systems used to process large quantities of information with complex analytic reasoning processes do not support intelligent selection of the best visual display and task analysis. In this paper, a Knowledge-Based Visual Analytics Query System (KBVAQS) architecture is designed to address these challenges. The architecture consists of three main parts: the application layer, the logic layer and the back layer. These parts are used for querying and displaying data graphically based on user requests. In the application layer, the administrator and ordinary users interact directly with the system through the user interface it provides. A graphical display based on the decision table is provided, and some task analyses are given for the user to interpret. The logic layer handles the full functionality of the implementation, while the back layer is used for record keeping. The implementation was achieved by employing the object-oriented programming language C-Sharp, with the database created in Microsoft Structured Query Language. The effectiveness of the KBVAQS tool has been evaluated in surveys carried out at the Nigerian Stock Exchange, which deals with stock markets. They show that users generally viewed the KBVAQS tool more positively than the existing Intelligent Visual Query Algorithm (IVQA) technique; these differences were significant at p<0.05. The mean interaction precision, calculated using expert-judge relevance ratings, shows a significant difference between KBVAQS and IVQA performance: 2.47 against 1.73 for precision, with calculated t = 6.33. The hypothesis testing revealed that KBVAQS users performed better and achieved acceptable results.
Hemant K. Sawant and Zahra Jalali
The electroencephalograph (EEG) signal is widely used to analyze neural activity within the human brain for the detection of any abnormalities. Since the EEG signal is dynamic by nature and changes constantly, a highly sensitive yet robust system is required to monitor the activity. In this work, EEG waves are analyzed and then classified using first the Discrete Wavelet Transform (DWT), used for time-frequency analysis, followed by the Fast Fourier Transform (FFT), which captures the rhythmic changes in EEG data. The process uses the DWT for classifying EEG wave frequencies, whereas the FFT is implemented to visualize these waves.
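The two transforms can be sketched in a few lines. The Haar wavelet stands in for whatever mother wavelet the authors used (the abstract does not say), and the naive DFT below computes the same magnitude spectrum an FFT would, just more slowly:

```python
import math

def haar_dwt(signal):
    """One level of the Haar DWT over an even-length signal: the approximation
    (low-pass) coefficients carry the slow trend, the detail (high-pass)
    coefficients carry the fast changes."""
    approx = [(signal[i] + signal[i + 1]) / math.sqrt(2)
              for i in range(0, len(signal), 2)]
    detail = [(signal[i] - signal[i + 1]) / math.sqrt(2)
              for i in range(0, len(signal), 2)]
    return approx, detail

def dft_magnitudes(signal):
    """Naive DFT magnitude spectrum; an FFT gives identical values in
    O(n log n) instead of O(n^2)."""
    n = len(signal)
    mags = []
    for k in range(n):
        re = sum(x * math.cos(2 * math.pi * k * i / n)
                 for i, x in enumerate(signal))
        im = -sum(x * math.sin(2 * math.pi * k * i / n)
                  for i, x in enumerate(signal))
        mags.append(math.hypot(re, im))
    return mags
```

Repeating the DWT step on the approximation coefficients splits the signal into the standard EEG bands (delta, theta, alpha, beta), while the magnitude spectrum is what gets plotted for visualization.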
Hasrat Jahan1 and Manmohan Singh2
An emerging trend in the Signal and Image Processing (SIP) community is the appearance of middleware and middleware standards that can be readily exploited for distributed computing applications by the SIP community. High-performance computing and High Performance Embedded Computing (HPEC) applications will benefit significantly from highly efficient and portable computational middleware for signal and image processing. Open middleware standards include VSIPL, MPI, CORBA, and SOAP-based messaging protocols. More specifically, we focus on the appropriate use of such technologies for implementing new SIP applications, or extending legacy applications through their use. The three middleware standards we have selected all have certain commonalities. All are based around the concept of a client application using the services available on a remote machine, or server. A remote executable object that implements one or more exposed interfaces provides these services. The object's interface represents a contract between the client and the server. This interface is written as a Java interface for Java RMI, in IDL for CORBA, and in WSDL for web services. In the latter two cases, the more generic descriptions can be translated into specific language implementations.