A mobile ad-hoc network (MANET) is an infrastructure-less network of self-configuring mobile nodes connected by wireless links. To provide more security in a MANET, data is routed through the friends present in the network: the data is encrypted and then routed through the friend list. Malicious nodes are detected and isolated from the network by a challenge process.
2 The Autonomous Detection and Tracking of Moving Objects - A Survey Work, P. Sanyal, S. K. Bandyopadhyay
A video frame mainly consists of foreground and background objects. For effective detection and tracking of moving objects, background subtraction is a very important part of surveillance applications, enabling successful segmentation of objects from video sequences. In this paper we survey these methods for the autonomous detection and tracking of moving objects.
3 Generating SMS (Short Message Service) in the form of Quick Response Code (QR-code), Mohammad Zainuddin, D. Baswaraj, S. M. Riyazoddin
This article shows how a QR-code can be generated from a simple SMS. We can see many QR-codes, or mobile barcodes, around us on websites, books, gadgets, T-shirts, etc., making our work easier with a single-click, high-speed decode. Compared with older barcodes, QR-codes are very fast and can store a lot of data, making them superior. QR-codes originated in the technology-hungry country of Japan and have only recently begun to become popular in the Middle East and Europe. The barcodes seen on commercial products are extremely beneficial thanks to their reading speed, accuracy, and functionality. As barcodes reached their peak and came into use worldwide, the need to store more data and more character types became inevitable. Developers began trying to expand the number of bars within the barcode and adjust their positioning to allow greater data capacity. The need for smaller barcodes was another defining factor in the development of QR-codes.
4 Reduce Energy Consumption by Improving the LEACH Protocol, Ali F. Marhoon, Mishall H. Awaad
Wireless sensor networks suffer from the problem of energy consumption, so several protocols have been used to mitigate it; among the best of these is the LEACH protocol, which works to reduce the energy consumption of the network. On the other hand, LEACH suffers from accelerated node death as well as a short network lifetime. In the present work, an improvement is added to the original LEACH protocol by using the idea behind the SPIN protocol, resulting in a new protocol called S-LEACH. SPIN exchanges so-called meta-data (which is very small in size) before full packets are transmitted; S-LEACH takes advantage of this feature so that identical or similar packets are not retransmitted. The improved LEACH protocol is simulated using MATLAB. The simulation results show that the improved protocol outperforms the original one in the following aspects: 1) an increased number of rounds; 2) a delayed first node death; 3) deceleration in the death of nodes; 4) more remaining nodes; 5) an extended network lifetime.
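The randomized cluster-head election at the heart of LEACH can be sketched as follows. This is a minimal Python illustration of the standard LEACH threshold formula, not the authors' MATLAB simulation; the parameter values (desired cluster-head fraction p = 0.05, 100 nodes) are illustrative assumptions.

```python
import random

def leach_threshold(p, r):
    """Standard LEACH election threshold T(n) for a node that has
    not yet served as cluster head in the current epoch of 1/p
    rounds; p is the desired cluster-head fraction, r the round."""
    return p / (1 - p * (r % int(1 / p)))

def elect_cluster_heads(node_ids, p, r, rng):
    """Each eligible node becomes a cluster head for this round
    if a uniform random draw falls below the threshold."""
    t = leach_threshold(p, r)
    return [n for n in node_ids if rng.random() < t]

rng = random.Random(1)
heads = elect_cluster_heads(range(100), 0.05, 0, rng)
```

Late in an epoch the threshold rises toward 1, so nodes that have not yet served are almost certain to be elected, spreading the energy cost of the cluster-head role evenly.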
5 Data Mining Techniques: To Predict and Resolve Breast Cancer Survivability, Vikas Chaurasia, Saurabh Pal
Breast cancer is one of the deadliest diseases: it is the most common of all cancers and the leading cause of cancer deaths in women worldwide, accounting for >1.6% of deaths, with case fatality rates highest in low-resource countries. Breast cancer risks are broadly classified into modifiable and non-modifiable factors. The non-modifiable risk factors are age, gender, number of first-degree relatives suffering from breast cancer, menstrual history, age at menarche, and age at menopause. The modifiable risk factors are BMI, age at first childbirth, number of children, duration of breastfeeding, alcohol, diet, and number of abortions. This paper presents a diagnosis system for detecting breast cancer based on RepTree, RBF Network, and Simple Logistic. In the test stage, the 10-fold cross-validation method was applied to the University Medical Centre, Institute of Oncology, Ljubljana, Yugoslavia database to evaluate the proposed system's performance. The correct classification rate of the proposed system is 74.5%. This research demonstrated that Simple Logistic can be used for reducing the dimension of the feature space, and that the proposed RepTree and RBF Network models can be used to obtain fast automatic diagnostic systems for other diseases.
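The 10-fold cross-validation used in the evaluation can be sketched as follows. This is a generic pure-Python illustration of how the fold indices are formed, not the paper's toolchain; the sample count of 286 is an assumption (the Ljubljana breast cancer dataset is commonly cited as having 286 records).

```python
def k_fold_indices(n, k):
    """Yield (train, test) index lists for k-fold cross-validation
    over n samples, splitting the folds as evenly as possible."""
    fold_sizes = [n // k + (1 if i < n % k else 0) for i in range(k)]
    start = 0
    for size in fold_sizes:
        test = list(range(start, start + size))
        train = list(range(0, start)) + list(range(start + size, n))
        yield train, test
        start += size

# assumed dataset size for illustration
folds = list(k_fold_indices(286, 10))
```

Each sample appears in exactly one test fold, so the reported classification rate averages performance over ten disjoint held-out sets.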
6 Adaptive Pixel Pair Matching Technique for Data Embedding, Nalimela Mounika, T. Madhavi Kumari
This paper proposes a new data-hiding method based on pixel pair matching (PPM). The basic idea of PPM is to use the values of a pixel pair as a reference coordinate. PPM searches for a coordinate in the neighborhood set of this pixel pair according to a given message digit, and the pixel pair is changed to the searched coordinate to conceal the digit. Exploiting modification direction (EMD) and diamond encoding (DE) are two recently proposed data-hiding methods based on PPM. The capacity of EMD is 1.161 bpp, and DE extends the payload of EMD by embedding digits in a larger notational system. The designed method offers lower distortion than DE by providing more compact neighborhood sets and allowing embedded digits in any notational system. The proposed approach always has lower distortion for various payloads compared to the optimal pixel adjustment process (OPAP) method. Experimental results reveal that the proposed method not only provides better performance than DE and OPAP, but is also secure under the detection of some well-known steganalysis techniques.
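The EMD baseline that PPM generalizes can be sketched as follows. This is an illustrative Python implementation of the classic two-pixel EMD scheme (capacity log2(5)/2 ≈ 1.161 bpp, matching the figure quoted above), not the paper's proposed method.

```python
def emd_embed(g1, g2, digit):
    """Embed one base-5 digit into a pixel pair using EMD (n = 2):
    the extraction function is f = (g1 + 2*g2) mod 5, and at most
    one of the two pixels changes by +/-1."""
    f = (g1 + 2 * g2) % 5
    d = (digit - f) % 5
    if d == 0:
        return g1, g2
    if d == 1:
        return g1 + 1, g2
    if d == 2:
        return g1, g2 + 1
    if d == 3:
        return g1, g2 - 1
    return g1 - 1, g2          # d == 4

def emd_extract(g1, g2):
    """Recover the hidden base-5 digit from a pixel pair."""
    return (g1 + 2 * g2) % 5
```

Since any of the five digit values is reachable by changing at most one pixel by one gray level, distortion stays low; PPM generalizes this by searching richer neighborhood sets of the pair.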
This project applies an advanced version of the Spurious Power Suppression Technique (SPST) to multipliers for high-speed and low-power purposes. When a portion of the data does not affect the final computing result, the data-controlling circuits of SPST latch this portion to avoid useless data transitions inside the arithmetic units, so that the useless spurious signals of the arithmetic units are filtered out. The modified Booth algorithm is used in this project for multiplication, which reduces the number of partial products to n/2. To filter out the useless switching power, there are two approaches, i.e., using registers and using AND gates, to assert the data signals of the multipliers after data transitions. The simulation results show that the SPST implementation with AND gates offers extremely high flexibility in adjusting the data-asserting time, which not only strengthens the robustness of SPST but also leads to a speed improvement and power reduction.
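The radix-4 (modified) Booth recoding that halves the partial-product count can be sketched in software as follows. This is a behavioural Python model for illustration only, not the project's hardware implementation; the 8-bit width is an assumption.

```python
def booth_radix4_digits(m, bits):
    """Radix-4 (modified) Booth recoding: examine overlapping
    3-bit groups of the multiplier and emit digits in {-2..2},
    halving the number of partial products versus bit-by-bit."""
    m &= (1 << bits) - 1
    padded = m << 1                      # implicit y[-1] = 0
    table = {0: 0, 1: 1, 2: 1, 3: 2, 4: -2, 5: -1, 6: -1, 7: 0}
    return [table[(padded >> i) & 0b111] for i in range(0, bits, 2)]

def booth_multiply(x, m, bits=8):
    """Multiply x by the signed 'bits'-wide value m by summing
    the Booth partial products, each weighted by 4**i."""
    return sum(d * x * 4 ** i
               for i, d in enumerate(booth_radix4_digits(m, bits)))
```

Because each recoded digit covers two multiplier bits, an 8-bit multiplier needs only four partial products, which is the n/2 reduction mentioned above.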
8 Energy Efficient Mobile Relaying in Data Intensive Wireless Sensor Networks, Satish Chikkala, K. Sathi Reddy
WSNs are common in many application scenarios. A WSN consists of a set of sensor nodes deployed over a geographical area to monitor a variety of phenomena. WSNs have become increasingly useful in a variety of critical applications, such as environmental monitoring, smart offices, battlefield surveillance, and transportation traffic monitoring. The sensor nodes are tiny and limited in power, and sensor types vary according to the application of the WSN. Whatever the application, resources such as power, memory, and bandwidth are limited; moreover, most sensor nodes are disposable in nature. It is therefore vital to consider energy efficiency so as to maximize the lifetime of the WSN. This paper presents energy-efficient mobile relaying in data-intensive wireless sensor networks. The concept of a mobile relay is that mobile nodes change their locations so as to minimize the total energy consumed by both wireless transmission and locomotion. The conventional methods, however, do not take the energy level into account, and as a result they do not always prolong the network lifetime.
9 Virtual Machine-Based Resource Management System for Cloud Computing Services, Sri Hari Reddy Medapati, K. Sathi Reddy
Cloud computing is the delivery of computing as a service rather than a product, whereby shared resources, software, and information are provided to users over the network. Cloud computing providers deliver applications via the Internet, accessed from a web browser, while the business software and data are stored on servers at a remote location. In cloud computing, Resource Allocation (RA) is the process of assigning available resources to the needed cloud applications over the Internet. Resource allocation starves services if it is not managed precisely; resource provisioning solves that problem by allowing service providers to manage the resources for each individual module. A Resource Allocation Strategy (RAS) is all about integrating cloud-provider activities for utilizing and allocating scarce resources within the limits of the cloud environment so as to meet the needs of the cloud application. This paper presents dynamically allocating resources for cloud computing services using virtual machines.
Wireless networking is inherently insecure. From jamming to eavesdropping, from man-in-the-middle to spoofing, there is a variety of attack methods that can be used against the users of wireless networks. Modern wireless data networks use a variety of cryptographic techniques such as encryption and authentication to provide barriers to such infiltrations. However, many of the commonly used security precautions are woefully inadequate: they may deter the casual sniffer, but they are unable to stop a powerful adversary. In this article, we look into the technology and the security schemes of the IEEE 802.11, cellular, and Bluetooth wireless transport protocols. We conclude that the only reliable security measure for such networks is one that is based on application-level security, such as using a VPN. Wireless communication technology is also subject to various types of security threats. This paper discusses a wide variety of attacks in WSNs, their classification mechanisms, and the different defenses available to handle them, including the challenges faced.
11 WarningBird MailAlert Based Malicious URLs Blocker System in Twitter, MS. SARANYA.S, MR. UDHAYA KUMAR.V
Twitter is prone to malicious tweets containing URLs for spam, phishing, and malware distribution. Conventional Twitter spam detection schemes utilize account features such as the ratio of tweets containing URLs and the account creation date, or relation features in the Twitter graph. These detection schemes are ineffective against feature fabrications or consume much time and resources. Conventional suspicious URL detection schemes utilize several features including lexical features of URLs, URL redirection, HTML content, and dynamic behavior. However, evading techniques such as time-based evasion and crawler evasion exist. In this paper, we propose WARNINGBIRD, a suspicious URL detection system for Twitter. Our system investigates correlations of URL redirect chains extracted from several tweets. Because attackers have limited resources and usually reuse them, their URL redirect chains frequently share the same URLs. We develop methods to discover correlated URL redirect chains using the frequently shared URLs and to determine their suspiciousness. We collect numerous tweets from the Twitter public timeline and build a statistical classifier using them. Evaluation results show that our classifier accurately and efficiently detects suspicious URLs. WARNINGBIRD operates as a near real-time system for classifying suspicious URLs in the Twitter stream. In this project we propose to block the malicious URLs and provide a mail alert when malicious URLs occur in the Twitter stream.
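The core observation, that attackers reuse URLs across redirect chains, can be illustrated with a small counting sketch. This is a simplified Python illustration of finding frequently shared URLs, not the WARNINGBIRD implementation; the minimum-chain threshold is an assumption.

```python
from collections import Counter

def frequently_shared_urls(chains, min_chains=3):
    """Count in how many distinct redirect chains each URL
    appears; URLs reused across many chains are candidate entry
    points of correlated (and therefore suspicious) chains."""
    counts = Counter()
    for chain in chains:
        for url in set(chain):           # count each URL once per chain
            counts[url] += 1
    return {u for u, c in counts.items() if c >= min_chains}
```

In the full system such shared URLs seed the grouping of chains, whose aggregate features then feed the statistical classifier.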
12 Iris Recognition using Wavelet Transformation Techniques, P. Thirumurugan, G. Mohanbabu
In this paper, we present the novel techniques that we have developed for iris recognition. A fusion mechanism that amalgamates a Canny edge detection scheme and a circular Hough transform is used to detect the iris boundaries in the eye's digital image. To extract the deterministic patterns in a person's iris in the form of a feature vector, we apply the wavelet transformation technique. By comparing the quantized vectors using the Hamming distance operator, we determine whether two irises are similar.
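The final comparison step can be sketched as follows. This is a minimal Python illustration of the Hamming distance decision on quantized iris codes; the 0.32 threshold is a commonly cited value, an assumption here rather than a figure from the paper.

```python
def hamming_distance(a, b):
    """Fraction of differing bits between two equal-length
    binary iris codes."""
    if len(a) != len(b):
        raise ValueError("iris codes must have equal length")
    return sum(x != y for x, y in zip(a, b)) / len(a)

def same_iris(a, b, threshold=0.32):
    """Declare a match when the normalized Hamming distance falls
    below the decision threshold (threshold is an assumption)."""
    return hamming_distance(a, b) < threshold
```

Codes from the same eye differ in only a small fraction of bits, while codes from different eyes behave like independent coin flips, so a single threshold separates the two cases.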
In remote areas the sun is a cheap source of electricity because solar cells, rather than hydraulic generators, are used to produce it. The output of solar cells, however, depends on the intensity of sunlight and the angle of incidence: the solar panels must face the sun throughout the day, but due to the rotation of the earth the panels cannot always maintain that position, which decreases their efficiency. To obtain a constant output, an automated system is required that can continuously rotate the solar panel. In this project we implement an automatic sun-tracking system with two sides. On one side, a sensor network tracks the position of the sun, rotates the solar panel toward the direction where the intensity of sunlight is maximum, and transmits the data to a remote control system via a wired or wireless medium; based on the data received, the remote system signals a stepper motor to rotate the large panel. The unique feature of this system is that instead of taking the earth as its reference, it takes the sun as a guiding source. The system can display the result on an LCD. The primary objectives were a compact design, efficient energy collection, the ability to monitor available battery charge status using an ARM-based microcontroller, and supplying power to the control system unit. In this project we use an ARM7TDMI-based LPC2148 microcontroller to control the position of the solar panel; it comes in a 64-pin package with 512 KB of flash memory, 8 to 32 KB of SRAM, many peripherals, and internal clock generation using PLLs up to 60 MHz. The application code will be developed in Embedded C. The Keil4 IDE will be used to build the hex file, and Flash Magic will be used to program the generated hex file into the microcontroller.
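The tracking decision itself is simple to sketch: compare light-sensor readings on the two sides of the panel and step toward the brighter side. This Python sketch is for illustration only (the project's firmware is Embedded C); the sensor names and deadband value are assumptions.

```python
def tracking_step(ldr_east, ldr_west, deadband=10):
    """Decide the stepper-motor action from two light-sensor
    readings: rotate toward the brighter side, or hold position
    when the readings are within a deadband (avoids jitter)."""
    diff = ldr_east - ldr_west
    if diff > deadband:
        return "rotate_east"
    if diff < -deadband:
        return "rotate_west"
    return "hold"
```

The deadband keeps the motor idle under nearly uniform illumination, which saves the energy the tracker is meant to harvest.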
A DoS attack, which floods communication resources with traffic in order to make a service unavailable to legitimate users, has been the most prevalent threat for over a decade and continues to be threatening. Denial-of-Service (DoS) attacks have a serious impact on computing systems. In this project, neuro-fuzzy systems are proposed as subsystems of an ensemble, with a Sugeno-type neuro-fuzzy inference system chosen as the base classifier for our research. Individual classifiers make errors on different training samples, so by creating several classifiers and combining their outputs, the total error can be reduced and the detection accuracy increased. The proposed adaptive neuro-fuzzy inference based system is able to detect intrusive behavior in networks. The experiments and evaluations of the proposed method were performed on the KDD Cup 99 intrusion detection dataset. The results show that our system outperforms two previously developed state-of-the-art approaches in terms of detection accuracy.
15 Noise Suppression using Weighted Median Filter for Improved Edge Analysis in Ultrasound Kidney Images, K. Ramamoorthy, T. Chelladurai, P. N. Sundararajan, M. Krishnamurthy
Due to the characteristic speckle noise of ultrasound (US) kidney images, a noise-reducing filter must be applied before image-processing stages such as segmentation and registration. In addition, speckle noise suppression methods are highly desirable for improving the quality of the ultrasound image while retaining the edge features of the kidney. This stage increases the dynamic range of gray levels, which in turn increases the image contrast. The proposed system develops a weighted median filter as a speckle noise suppression method for ultrasound kidney images. This paper designs intensity-invariant local image phase features, obtained using improved circular Gabor filter banks, for extracting edge texture features that occur at core and intermediate layer interfaces. The proposed model extends the phase symmetry features to a modified circular Gabor mode for use in the automatic extraction of kidney edge texture features from US images of normal and diseased patients. The system's functionality is demonstrated qualitatively and quantitatively through experiments on synthetic and real data sets. The speckle noise error ratio with respect to standard US images is compared experimentally.
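The weighted median filter named above can be sketched as follows. This is a generic Python illustration of the technique on a 2-D list of pixel values, not the paper's implementation; the 3x3 window and the example weight mask are assumptions.

```python
def weighted_median(values, weights):
    """Weighted median: each sample is conceptually repeated
    'weight' (integer) times, and the middle of the expanded,
    sorted list is returned."""
    expanded = sorted(v for v, w in zip(values, weights) for _ in range(w))
    return expanded[len(expanded) // 2]

def weighted_median_filter(img, weights):
    """Apply a 3x3 weighted median filter to a 2-D list of pixel
    values (borders left unchanged). 'weights' is a 9-element
    mask in row-major order, e.g. centre-weighted."""
    h, w = len(img), len(img[0])
    out = [row[:] for row in img]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            window = [img[y + dy][x + dx]
                      for dy in (-1, 0, 1) for dx in (-1, 0, 1)]
            out[y][x] = weighted_median(window, weights)
    return out
```

Unlike a mean filter, the median rejects outlier speckle samples outright instead of averaging them in, which is why edges survive the smoothing.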
16 An Efficient Methodology for Detecting Spam Using Spot System, MRS. SARANYA.S, MRS. R.BHARATHI
A major security challenge on the Internet is the existence of a large number of compromised machines. Compromised machines on the Internet are generally referred to as bots, and the set of bots controlled by a single entity is called a botnet. Botnets have multiple nefarious uses: mounting distributed denial-of-service attacks, stealing user passwords and identities, generating click fraud, and sending spam email. Compromised machines are one of the key security threats on the Internet. Given that spamming provides a key economic incentive for attackers to recruit large numbers of compromised machines, we focus on detecting the compromised machines in a network that are involved in spamming activities, commonly known as spam zombies. We develop an effective spam zombie detection system named SPOT by monitoring the outgoing messages of a network. SPOT is designed around a powerful statistical tool called the Sequential Probability Ratio Test, which has bounded false-positive and false-negative error rates. The detection technique also tracks the number and percentage of spam messages originated by internal machines.
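The sequential probability ratio test behind SPOT can be sketched as follows. This is a minimal Python illustration of Wald's SPRT applied to a stream of spam/ham observations; the probabilities p0/p1 and error bounds alpha/beta are illustrative assumptions, not SPOT's tuned values.

```python
from math import log

class SPRT:
    """Wald's sequential probability ratio test for deciding
    whether a machine is a spam zombie. p0/p1 are the assumed
    probabilities that an outgoing message is spam under the
    'normal' and 'compromised' hypotheses; alpha/beta bound the
    false-positive and false-negative rates."""
    def __init__(self, p0=0.2, p1=0.9, alpha=0.01, beta=0.01):
        self.low = log(beta / (1 - alpha))    # accept H0: normal
        self.high = log((1 - beta) / alpha)   # accept H1: zombie
        self.llr = 0.0
        self.p0, self.p1 = p0, p1

    def observe(self, is_spam):
        """Fold one message observation into the log-likelihood
        ratio; return a decision or None to keep observing."""
        if is_spam:
            self.llr += log(self.p1 / self.p0)
        else:
            self.llr += log((1 - self.p1) / (1 - self.p0))
        if self.llr >= self.high:
            return "zombie"
        if self.llr <= self.low:
            return "normal"
        return None
```

The appeal of the SPRT is that it reaches a decision after only a handful of observed messages on average, while the alpha/beta parameters directly bound the error rates, which is the property the abstract highlights.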
17 A Study of Electronic Document Security, Mr. Parag S. Deshmukh, Mr. Pratik Pande
This paper gives an overview of relevant document security issues and technologies and introduces a suite of document security solutions. It also summarizes implementations of document control and digital signatures for protecting electronic documents. As organizations move more business processes online, protecting the confidentiality and privacy of the information used during these processes, as well as providing authenticity and integrity, is essential. Because many automated processes rely on electronic documents that contain sensitive information, organizations must properly protect these documents. Many information security solutions attempt to protect electronic documents only at their storage location or during transmission. However, these solutions do not provide protection for the entire lifecycle of an electronic document: when the document reaches the recipient, the protection is lost, and the document can be intentionally or unintentionally forwarded to and viewed by unauthorized recipients. A significantly more effective solution is to protect a document by assigning security parameters that travel with it. Six criteria must be met in order to provide more effective protection for an electronic document throughout its lifecycle: confidentiality, authorization, accountability, integrity, authenticity, and non-repudiation. The two major security techniques used to establish these six document security criteria are document control and digital signatures. The Electronic suite of security solutions delivers document control and digital signature services that simplify the process of protecting sensitive electronic documents and forms. Organizations can easily integrate Electronic document security solutions into current business processes and enterprise infrastructure to support a wide range of simple and complex processes.
The solutions dynamically protect electronic documents inside and outside the network, online and offline, providing persistent, end-to-end protection throughout an electronic document's lifecycle.
18 Mobile Web Services Invocation System Using SMAP, Prof. Deepak Kapgate, Monali Khune
This paper presents an experiment on the use of the short message application protocol (SMAP) to provide a framework for mobile applications accessing Web Services. The authors refer to the most common architecture used to invoke Web Services, where a client and a server exchange SMAP messages carried by short message service (SMS) technology. To guarantee the independence of the application from the type of communication channel used, the paper addresses the problem of designing a framework allowing a Java application to directly interface with Web Services from a mobile device using SMAP. The authors have developed an application server, called the SMS server, that accepts end-user requests via SMS, retrieves the requested information from the Internet, and sends it back in the form of SMS messages. The application server is connected to the Internet to access user data and send this information to the end user's mobile phone. Sending and receiving SMAP messages is handled by the SMS server.
19 FPGA Implementation of Mutual Authentication Protocol Using Modular Arithmetic, T. Sadaiyappan, K. K. Manoj, S. A. Subhasakthe
Radio-frequency identification (RFID) is a modern technology that utilizes radio frequencies to locate objects. In this paper, we study the RFID tag–reader mutual authentication (TRMA) scheme. Two improved authentication protocols for generating the PadGen function are described. In this project, a hardware-efficient protocol for the RFID tag–reader mutual authentication scheme is proposed. A modified MOD scheme is implemented in the protocol to reduce the hardware cost. The proposed mutual authentication protocol consumes fewer logic elements and is also more secure against external attacks. The proposed protocol is described in Verilog HDL and simulated using the Xilinx ISE Design Suite 14.2.
20 Automated Cryptanalysis of Transposition Ciphers Using Cuckoo Search Algorithm, Morteza Heydari, Mahdieh Nadi Senejani
Cryptography is one approach to information security; cryptanalysis is the science of breaking cryptography without the encryption key. The present paper shows the benefits of implementing a novel nature-inspired metaheuristic, the Cuckoo Search Algorithm (CSA), with a new fitness function for the cryptanalysis of transposition ciphers. The fitness function is evaluated based on the most common bigrams and trigrams. Results show that the algorithm proposed in this paper is effective for the cryptanalysis of transposition ciphers with key lengths up to 30, owing to its strong reliability and fast convergence speed.
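A bigram-based fitness function of the kind described can be sketched as follows. This is an illustrative Python sketch, not the paper's function; the tiny score table covers only a few common English bigrams (a real attack uses a full frequency table, and the paper also uses trigrams).

```python
from collections import Counter

# Illustrative log-frequency scores for a few common English
# bigrams (assumption -- a real table covers all 676 bigrams).
BIGRAM_SCORE = {"TH": 3.56, "HE": 3.07, "IN": 2.43,
                "ER": 2.05, "AN": 1.99, "RE": 1.85}

def bigram_fitness(text):
    """Score a candidate decryption: sum the scores of its
    bigrams, so more English-like text scores higher. The search
    algorithm maximizes this over candidate key permutations."""
    counts = Counter(text[i:i + 2] for i in range(len(text) - 1))
    return sum(BIGRAM_SCORE.get(bg, 0.0) * n for bg, n in counts.items())
```

For example, `bigram_fitness("THEANSWER")` scores well above a scrambled arrangement of the same letters, which is exactly the gradient the cuckoo search exploits to home in on the correct transposition key.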
21 The Survey of Techniques for Link Recovery & Admission Control in Wireless Mesh Network, Kalyani Pendke, Anup Dhamgaye
A wireless mesh network (WMN) is one of the most advanced wireless networks used for cost-effective communication and can cover a very large physical region. During its operational period, a WMN may suffer frequent link failures, which largely degrade the performance of the network. Admission control plays a crucial role in improving the performance of a WMN. This paper presents a study of the various techniques used for link recovery in wireless mesh networks, and also gives a basic overview of admission control in WMNs.
22 Application of Unified Network Management in LAN for Load Shedding, Ravi Prakash
The differentiated services (DiffServ) model has been proposed as a scalable traffic management mechanism to ensure Internet QoS without using per-flow resource reservation and per-flow signaling, but it sacrifices the ability to accurately configure network devices and efficiently utilize network resources. In this thesis, the DiffServ model is augmented with traffic engineering tools, per-flow call admission control (CAC), and dynamic resource-sharing schemes to improve resource utilization efficiency. Specifically, an advanced two-tier resource management (ATTRM) model is proposed for efficient resource allocation over DiffServ networks, which manages network resources based on the “first plan, then take care” principle. By proper boundary service level agreement (SLA) arrangement and path-oriented internal resource mapping, the Internet service provider (ISP) can optimally plan the network resources to achieve the maximum long-term network revenue. To efficiently utilize the well-planned network, novel effective-bandwidth techniques are developed for packet- and call-level QoS control in DiffServ networks.
23 Cued Click Point Technique for Graphical Password Authentication, Vaibhav Moraskar, Sagar Jaikalyani, Mujib Saiyyed, Jaykumar Gurnani, Kalyani Pendke
In today’s world password security is very important, and various techniques are available for password protection. Cued Click Points is a click-based, cued-recall graphical password scheme: users click on one point per image for a sequence of images, and each next image is based on the previous click-point. Users tend to choose passwords that are easy to memorize, which makes them easy for attackers to guess, while passwords assigned by a strong system are difficult for users to remember. In this paper, we focus on the evaluation of a graphical password authentication system using Cued Click Points, including its usability and security. In this authentication system, our usability goal is to support users in selecting better passwords, thereby increasing security by expanding the effective password space. The emergence of hotspots is mainly due to poorly chosen passwords; click-based graphical passwords therefore encourage users to select more random, and hence harder-to-guess, click-points.
24 Data Design and Analysis for Survey System Based on Statistical Functionality, Wessam Abbas Hamed, Atheer Yousif Oudah, Sarmad Kadhim Dhiab
Nowadays, after the rapid development of information systems, researchers look for an easy, automated way to collect and analyze questionnaire data that enables them to obtain results quickly instead of using a manual process. Questionnaires are used in a wide range of settings to gather information about the views and behavior of individuals and to inform people about specific issues. Questionnaire feedback is used by researchers to find accurate results with the support of statistical analysis functionality. The main contribution concerns analyzing the online questionnaire: an easy way that enables researchers to get results quickly instead of working manually. Computing results using Java tools enables the researcher to obtain accurate statistical analyses. In addition, there is a strong relationship between the questionnaire and the research; the questionnaire is the most important phase of the research. Consequently, this study focuses on the development of an online questionnaire to facilitate and support researchers' tasks, using an Online Questionnaire System to reduce the loss of papers, time, and effort. Moreover, the system has been tested for usability and flexibility.
25 Survey on Distributed Accountability for Data Sharing in the Cloud, K. S. Khadke, Prof. Umesh B. Chavan
Cloud computing makes highly scalable services available over the Internet on an as-needed basis. In the cloud, user data is processed remotely on unknown machines. Cloud computing carries risks for both the customer and the cloud provider: usually cloud services are offered by a third-party provider who owns the infrastructure, and application software (services) and databases (financial, health) are moved to the data centers of the CSP, where the controller of the data and services may not be fully trustworthy, a risk that has not been well understood. This paper presents a review of a new way to supplement the current consumption and delivery model for Internet-based IT services, by providing a dynamically scalable framework and often virtualized resources as a service over the Internet. In the cloud, whenever a problem occurs it is hard to know whether the mistake came from the user or the cloud service provider, and users fear the loss of their personal data; providing appropriate privacy protection for cloud computing is therefore important. To solve this problem we propose a data accountability approach to keep track of the usage of users' data in the cloud. We create programmable JAR files to ensure users' data authentication and automated logging in JARs.
26 Survey Paper on Efficient and Secure Dynamic Auditing Protocol for Data Storage in Cloud, Lokesh P. Chaudhari, Prof. Umesh B. Chavan
Cloud computing is growing nowadays; physical systems are going to be history in the coming years, as cloud computing provides a virtualized framework for everything, i.e., software, hardware, etc. One of the most efficient uses of the cloud is data storage on cloud servers on a pay-as-you-go scheme. But as good as this sounds, there are challenging aspects to cloud data storage from the end user's perspective. How do end users know their data is secure on the cloud server? How are they assured that the data has not been tampered with and is successfully updated after operations are performed on it? Here a trusted third-party auditor comes into the picture: using an auditing framework, the auditor assures end users that their data is secure on the server and successfully updated. In this paper an efficient, secure auditing algorithm is designed and also extended to dynamic auditing.
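The basic challenge-response shape of an integrity audit can be sketched as follows. This naive Python sketch is for intuition only: it assumes the auditor holds a reference copy of the block, whereas real auditing protocols (including the kind surveyed here) use precomputed homomorphic tags so the auditor never needs the data itself. All names are illustrative.

```python
import hashlib

def prove_possession(block, nonce):
    """Server side: answer an auditor's random challenge by
    hashing the fresh nonce together with the stored block,
    proving the block is held at challenge time."""
    return hashlib.sha256(nonce + block).hexdigest()

def audit(reference_block, nonce, proof):
    """Auditor side: recompute the expected proof from the
    reference copy and compare with the server's answer."""
    return proof == hashlib.sha256(nonce + reference_block).hexdigest()
```

The fresh nonce is what prevents the server from caching an old answer: a correct proof implies the block was intact when this particular challenge was issued.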
27 Data Monitoring Application using Cortex M3 Core Processor
We combine the mature technology of the Web with embedded systems and fully utilize the advantages of both. The system can carry out remote monitoring and maintenance of equipment over the network using a Web browser. By introducing the Internet into the control network, it is possible to break through the spatio-temporal restrictions of a traditional control network and effectively achieve remote sensing, monitoring, and real-time control of equipment. Communication systems, and especially the Internet, play an important role in daily life, and many applications build on this: home automation, utility meters, and security systems can easily be monitored using either special front-end software or a standard Internet browser client from anywhere in the world. Web access functionality is embedded in a device to enable low-cost, widely accessible, and enhanced user-interface functions for the device. A web server in the device provides access to these user-interface functions through a device web page; a web server can be embedded into any appliance and connected to the Internet so the appliance can be monitored through a desktop browser. Temperature, pressure, displacement, and motion are the most often measured quantities. For example, some processes work only within a narrow range of temperatures; certain chemical reactions, biological processes, and even electronic circuits perform best within limited temperature ranges, so it is necessary to measure the temperature and control it if it exceeds certain limits to avoid any misbehavior of the system. To accurately control process temperature without operator involvement, a temperature control system relies on a controller that accepts input from a temperature sensor.
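The controller logic described in the last sentence can be sketched as a simple on/off (bang-bang) control law. This Python sketch is for illustration only (the target runs on a Cortex-M3, not Python); the hysteresis value and actuator naming are assumptions.

```python
def temperature_control(reading_c, setpoint_c, hysteresis=1.0,
                        heater_on=False):
    """Bang-bang temperature control with hysteresis: the
    controller accepts a temperature-sensor reading and decides
    the heater state needed to hold the process near the
    setpoint without rapid on/off chatter."""
    if reading_c < setpoint_c - hysteresis:
        return True      # too cold: turn heater on
    if reading_c > setpoint_c + hysteresis:
        return False     # too hot: turn heater off
    return heater_on     # inside the band: keep current state
```

The hysteresis band is the key design choice: without it, sensor noise around the setpoint would toggle the actuator continuously.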
28 Quality of Service Assessment of AOMDV for Random Waypoint and Random Walk Mobility Models, V. B. Kute, Dr. M. U. Kharat
Routing protocol performance is strongly affected by user or node mobility in a mobile ad hoc network (MANET), and the performance of any MANET routing protocol is investigated and assessed using a simulator. Network Simulator 2 (ns-2) provides various mobility models to mimic mobility patterns, since simulating a precise real-life user/node mobility pattern is difficult; the ns-2 mobility models attempt, through their mechanisms, to emulate real-world traffic. This paper studies one MANET routing protocol under two mobility models of ns-2.34. The investigation focuses on three quality-of-service metrics to examine protocol performance: throughput, average delay, and packet drop ratio. Through this evaluation we show that the Random Waypoint mobility model is superior in performance to the Random Walk mobility model, but the simulation of real-world mobility patterns is more faithful with the Random Walk model than with the Random Waypoint model.
29 An Optimized Solution to Map Web User Profile Based on Domain Ontology, Prof. Ratheesh Kumar A.M, Prof. Velusamy A, Prof. Shobana G
Ontology defines the concepts, attributes and relations used to describe and represent an area of knowledge. The aim of this paper is to create a personalized ontology for web information gathering using language processing techniques. When representing user profiles, many models have utilized knowledge only from a global knowledge base or only from the user’s local information. Here, the user’s local instance repositories are generated to match the representation of a global knowledge base, and a combined ontology is developed using strategies such as ontology mapping and text categorization; Jaccard and cosine similarity measures are used to evaluate the efficiency.
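The two similarity measures named above can be sketched as follows. This is an illustrative implementation, not the authors' code; the example term profiles are invented for demonstration.

```python
# Jaccard similarity over term sets and cosine similarity over
# term-frequency vectors, as used to compare user profiles.
import math

def jaccard(a, b):
    """|A intersect B| / |A union B| for two collections of terms."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b) if a | b else 0.0

def cosine(u, v):
    """Cosine of the angle between two equal-length frequency vectors."""
    dot = sum(x * y for x, y in zip(u, v))
    norm = math.sqrt(sum(x * x for x in u)) * math.sqrt(sum(y * y for y in v))
    return dot / norm if norm else 0.0

# hypothetical term profiles for two users
profile_a = ["ontology", "web", "mining", "user"]
profile_b = ["ontology", "user", "profile"]
print(jaccard(profile_a, profile_b))   # 2 shared terms / 5 total = 0.4
print(cosine([1, 2, 0], [1, 2, 0]))    # identical vectors give 1.0
```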
Most of the existing packet-scheduling mechanisms of WSNs use First Come First Served (FCFS), non-preemptive priority or preemptive priority scheduling algorithms. These algorithms incur a high processing overhead and long end-to-end data transmission delay due to improper allocation of data packets to queues in multilevel queue scheduling. Moreover, these algorithms do not adapt dynamically to the changing requirements of WSN applications, since their scheduling policies are predetermined. This paper proposes a Dynamic Multilevel Priority (DMP) packet scheduling scheme that deals with circular-wait and preemption conditions. In the proposed scheme, nodes in the WSN have three levels of priority queues. Real-time packets are placed into the highest-priority queue and can preempt data packets in other queues. Non-real-time packets are placed into the two other queues based on a threshold of their estimated processing time. Leaf nodes have only two queues, for real-time and non-real-time data packets, since they do not receive data from other nodes; this reduces end-to-end delay. The scheme improves the performance of task scheduling in terms of end-to-end delay and deadlock prevention.
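The three-level queue discipline described above can be sketched as follows. This is an assumed structure for illustration only, not the paper's implementation; the threshold value and packet names are invented.

```python
# Sketch of the DMP idea: real-time packets go to queue 0, non-real-time
# packets go to queue 1 or 2 depending on estimated processing time,
# and queues are served in strict priority order.
from collections import deque

THRESHOLD = 5  # assumed processing-time threshold (arbitrary units)

class DMPScheduler:
    def __init__(self):
        self.queues = [deque(), deque(), deque()]  # pr1 (real-time), pr2, pr3

    def enqueue(self, packet, real_time, est_time=0):
        if real_time:
            self.queues[0].append(packet)
        elif est_time <= THRESHOLD:
            self.queues[1].append(packet)
        else:
            self.queues[2].append(packet)

    def dequeue(self):
        for q in self.queues:  # highest-priority non-empty queue first
            if q:
                return q.popleft()
        return None

s = DMPScheduler()
s.enqueue("sensor-report", real_time=False, est_time=2)
s.enqueue("alarm", real_time=True)
s.enqueue("bulk-log", real_time=False, est_time=9)
print(s.dequeue())  # "alarm": real-time traffic is served before the rest
```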
31 Evaluate the Performance of the Router Algorithms in Different Scenarios TCP Newreno and TCP Reno, Wessam Abbas Hamed
Nowadays, as new user applications and Internet traffic grow rapidly, developing an Internet infrastructure that guarantees a good level of quality of service has become necessary. Congestion caused by increasing traffic remains the main threat to Quality of Service (QoS) on the Internet. Active queue management mechanisms working in Internet routers help enhance the performance of responsive applications such as TCP, and the choice of active queue management mechanism plays an important role in achieving good network performance and utilization. In this research we evaluate the performance of three router algorithms proposed for IP routers, namely DropTail, REM and RED, among competing sources based on different protocols (TCP Newreno and TCP Reno), using ns-2 on Linux. The purpose of this evaluation is to identify the key parameters that improve fairness and link utilization in TCP/IP networks; it also helps to obtain a better understanding of these mechanisms by identifying and clarifying the factors that influence their performance, in order to improve TCP/IP network performance overall. Finally, we compare the results obtained from statistical analysis based on queue capacity, packet loss and link utilization.
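The RED mechanism evaluated above can be summarized by its drop-probability rule. The sketch below shows standard RED behavior (not code from the paper), with assumed threshold parameters: below the minimum threshold nothing is dropped, above the maximum everything is dropped, and in between the drop probability rises linearly toward a configured maximum.

```python
# Standard RED drop-probability computation on the average queue length.
def red_drop_prob(avg_q, min_th=5.0, max_th=15.0, max_p=0.1):
    if avg_q < min_th:
        return 0.0          # queue short enough: never drop
    if avg_q >= max_th:
        return 1.0          # queue too long: always drop
    # linear ramp between the two thresholds
    return max_p * (avg_q - min_th) / (max_th - min_th)

print(red_drop_prob(4.0))    # 0.0, below the minimum threshold
print(red_drop_prob(10.0))   # 0.05, halfway between the thresholds
print(red_drop_prob(20.0))   # 1.0, above the maximum threshold
```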
32 Segmentation of Nuclei in Cytological Images of Breast FNAC Sample: Case Study, Aditya P. Pise, Rushi Longadge, L. G. Malik
Demand for increased robustness, better reliability and higher automation of image segmentation algorithms has been apparent in recent years. Precise diagnosis and prognosis are essential to reduce the high death rate. In this paper, different methodologies of cytological image segmentation are studied, including the watershed algorithm and active contouring. One can also find here a description of de-noising and contrast enhancement techniques, because the raw image taken from a camera mounted on a microscope contains noise and little usable information. The study covers different pre-segmentation processes, such as the Circular Hough Transform (CHT) for circle detection and a nucleus localization method. Many segmentation algorithms have been introduced, but unfortunately they cannot be used directly for nuclei segmentation, and over the past few years large efforts have been made to develop a fully automatic segmentation algorithm. Here, a group of modified versions of cytological image segmentation methods adopted for fine needle biopsy images is presented. A discussion of common errors and possible future problems is also added.
33 A Review of Numerous Facial Recognition Techniques in Image Processing, A.Swaminathan, N.Kumar, M.Ramesh Kumar
Recognizing faces in images is an emerging trend of research in image processing. Various systems have been proposed in this stream. Human emotions and intentions are communicated more often by changes in one or two discrete facial features. Given a single image, the goal of face detection is to identify all image regions which contain a face regardless of its three-dimensional position, orientation, and lighting conditions. This problem is challenging because faces are non-rigid and have a high degree of variability in size, shape, colour, and texture. Numerous techniques have been developed to detect faces in a single image, and the purpose of this paper is to categorize and evaluate these algorithms. We also discuss relevant issues such as data collection, evaluation metrics, and benchmarking. After analysing these algorithms and identifying their limitations, we conclude with several promising directions for future research.
34 Self-optimization and Self-Protection (Transactional Security) in AODV Based Wireless Sensor Network, Rajani Narayan, Dr. B.P. Mallikarjunaswamy, M.C. Supriya
WSN technologies present significant potential in several application domains. Given the diverse nature of these domains, it is essential that WSNs perform in a reliable and robust fashion. This paper presents methods for integrating autonomic computing principles into WSNs: PSO-based clustering for self-optimization and a watch-mechanism-based method for self-protection from black hole attacks. The paper also proposes a novel approach to transactional security with chaos-based AES cryptography. Results show that the lifetime and throughput of a wireless sensor network can be increased using these methods.
35 Surveillance on Bigdata to Mine Pattern, N. Monica, Dr. K. Ramesh Kumar, T. Nelson Gnanaraj
Big data is a collection of large amounts of data of various types that must be processed at much higher frequency. One of the most popular knowledge discovery approaches is to find frequent items in a transaction data set and derive association rules. Pattern finding is one of the most computationally expensive steps on large data sets. Patterns are often referred to as association rules; association rules play an important role in mining data for sequential patterns, and are used to acquire interesting rules from large collections of data that express an association between items or sets of items. Apriori is a classic algorithm for learning association rules. It is designed to operate on databases containing transactions, and attempts to find subsets which are common to at least a minimum number C of the item sets. Apriori uses a "bottom up" approach, where frequent subsets are extended one item at a time and groups of candidates are tested against the data. The algorithm terminates when no further successful extensions are found. In this paper we enhance the Apriori algorithm to reduce its complexity over large data sets. We first collect a variety of data and then integrate both structured and unstructured data using MapReduce to find sequential patterns in the required data sets.
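The classic "bottom up" Apriori loop described above can be sketched compactly. This is an illustrative version of the standard algorithm, not the paper's enhanced MapReduce variant; the toy transactions are invented.

```python
# Frequent itemsets are grown one item at a time; candidates whose
# support falls below the minimum count are pruned at every level.
from itertools import combinations

def apriori(transactions, min_support):
    transactions = [set(t) for t in transactions]
    items = {i for t in transactions for i in t}
    # L1: frequent single items
    current = [frozenset([i]) for i in items
               if sum(i in t for t in transactions) >= min_support]
    frequent = list(current)
    k = 2
    while current:
        # candidate generation: unions of frequent (k-1)-itemsets of size k
        candidates = {a | b for a, b in combinations(current, 2)
                      if len(a | b) == k}
        current = [c for c in candidates
                   if sum(c <= t for t in transactions) >= min_support]
        frequent.extend(current)
        k += 1  # stop when no candidate survives pruning
    return frequent

txns = [{"milk", "bread"}, {"milk", "bread", "eggs"},
        {"bread", "eggs"}, {"milk", "eggs"}]
result = apriori(txns, min_support=2)
print(sorted(tuple(sorted(s)) for s in result))
```

With these four transactions every single item and every pair is frequent at support 2, but the triple {milk, bread, eggs} appears only once and is pruned.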
36 Improved Human Identification using Finger Vein Images, Sameer Sharma, Mr. Shashi Bhushan, Ms. Jaspreet Kaur
Finger vein is a unique physiological biometric for identifying individuals based on the physical characteristics and attributes of the vein patterns in the human finger. The technology is currently in use or in development for a wide variety of applications, including credit card authentication, automobile security, employee time and attendance tracking, computer and network authentication, endpoint security and automated teller machines. The proposed system simultaneously acquires finger-vein and low-resolution finger images and combines these two evidences using a novel score-level combination strategy. We examine previously proposed finger-vein identification approaches and develop a new approach that illustrates its superiority over prior published efforts. In this work, two new score-level combinations are developed and investigated, i.e., a Gabor filter and Repeated Line Tracking with a median filter, and comparatively evaluated against more popular score-level fusion approaches to ascertain their effectiveness in the proposed system.
37 An Image Fusion Method Using DT-CWT and Average Gradient, A. M. El Ejaily, F. Eltohamy, M. S. Hamid, G. Ismail
Image fusion is a technique for combining complementary information from multiple images originating from different sources into a single fused image. This paper proposes an image fusion method to merge remote sensing satellite images based on the Dual-Tree Complex Wavelet Transform (DT-CWT). Quickbird and Worldview satellite data are used to carry out the experimental work. The objective and visual results show the superiority of the proposed fusion method over methods based on the classical discrete wavelet transform (DWT) and other methods based on the DT-CWT.
38 GAIT RECOGNITION OF HUMAN USING SVM AND BPNN CLASSIFIERS, Arun Joshi, Mr. Shashi Bhushan, Ms. Jaspreet Kaur
Gait recognition identifies an individual by the way he or she walks. It is a type of biometric recognition related to the behavioural characteristics of a person, and it can be used to monitor people without their cooperation. Controlled environments such as banks, military installations and even airports need to be able to quickly detect threats and provide differing levels of access to different user groups. Gait is a less obtrusive biometric, which offers the possibility of identifying people at a distance, without any interaction or cooperation from the subject; this is the property which makes it so attractive. In this work, the binary silhouette of a walking person is first detected in each frame. Secondly, features are extracted from each frame using image processing operations; the center of mass, step-size length, and cycle length are taken as key features. Finally, BPNN and SVM techniques are used for training and testing. All experiments are carried out on a gait database and input video.
39 Wireless Mesh Networks: The Survey of Andover Continuum Wireless Technology and CyberStation Wireless Control, K.Sangeetha, P.Revathi, N.Kokhila, C.Theebendra
This paper discusses how to use the next evolution of network technology, wireless mesh technology, to help improve the automation and control of a dedicated HVAC network, while also saving the costs and overhead expenses of a typical hard-wired network. The Andover Continuum solution results in cost reductions for installation, maintenance and distributed control compared with hard-wired HVAC controllers and devices. Building automation companies looking for an energy-efficient competitive edge are developing wireless network products for the complex internal HVAC, lighting, and safety systems of urban buildings, and Andover Continuum’s building automation control system has developed such wireless technology.
This paper provides a survey on steganography and steganalysis for digital images, mainly covering the fundamental concepts, the progress of steganographic methods for images in spatial representation and in JPEG format, and the development of the corresponding steganalytic schemes. A steganographic technique for the MP3 audio format is then presented, based on the Peak Shaped Model algorithm used for JPEG images. The proposed method relies on the statistical properties of MP3 samples, which are compressed by a Modified Discrete Cosine Transform (MDCT). After the MDCT, it is possible to hide secret information by replacing the least significant bit of the MDCT coefficients. The performance analysis has been made by calculating three steganographic parameters: the Embedding Capacity, the Embedding Efficiency and the PSNR. An attack with the Chi-Square test has also been simulated, and the results have been used to plot the ROC curve in order to calculate the error probability.
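The least-significant-bit replacement step mentioned above can be illustrated on a plain list of integer coefficients. This is a toy sketch of the general LSB idea only, not the paper's method, and the coefficient values and message bits are invented; real MDCT output would be quantized audio transform coefficients.

```python
# Hide message bits in the LSBs of integer coefficients, then read
# them back out.
def embed(coeffs, bits):
    """Replace the LSB of each coefficient with one message bit."""
    out = list(coeffs)
    for i, bit in enumerate(bits):
        out[i] = (out[i] & ~1) | bit
    return out

def extract(coeffs, n):
    """Read the n embedded bits back out of the LSBs."""
    return [c & 1 for c in coeffs[:n]]

coefficients = [14, 7, 22, 9, 40, 3]   # stand-in for MDCT coefficients
message = [1, 0, 1, 1]
stego = embed(coefficients, message)
print(extract(stego, 4))  # [1, 0, 1, 1], the hidden bits recovered
```

Because only the lowest bit of each coefficient changes, the distortion per coefficient is at most 1, which is why the PSNR of such schemes stays high.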
41 Fuzzy Mining Approach for Gene Clustering and Gene Function Prediction, Akhil Kumar Das, Debasis Mandal, Mainak Adhikary, Amit Kumar Sen
Microarray technology helps biologists monitor the expression of thousands of genes in a single experiment on a small chip. A microarray, also called a DNA chip, gene chip, or biochip, is used to analyze gene expression profiles. After genome sequencing, DNA microarray analysis has become the most widely used functional genomics approach in the bioinformatics field. Biologists are vastly overwhelmed by the enormous amount of genome-wide data produced by DNA microarray experiments. Clustering is the process of grouping data objects into a set of disjoint classes, called clusters, so that objects within a class are highly similar to each other and dissimilar to objects in other classes. Generating high-quality gene clusters and identifying the underlying biological mechanism of the gene clusters are the important goals of clustering gene expression analysis; it is presently the most widely used method for gene expression analysis, and a fuzzy mining strategy can extract meaningful information from expression profiles. In this paper we have used a fuzzy mining approach for gene clustering, using different membership functions and dividing the available gene expression data for each type of experimental value into four variables for better accuracy. This approach can effectively capture heterogeneity in expression data for pattern discovery. Based on the discovered patterns, it can make accurate gene function predictions, and these predictions can be made in such a way that each gene is allowed to belong to more than one functional class with different degrees of membership.
Object detection and tracking is one of the most researched areas in computer vision and is receiving growing attention because of its wide range of applications, which include surveillance, industrial inspection, robotics, mobile devices, and 3D gaming, among others. This paper presents the implementation of Speeded-Up Robust Features (SURF) in the development of a detection and tracking system for inanimate objects. The system can detect an inanimate object in a still image containing many other objects which have been saved as a data set, as well as detect and track objects using a webcam. The algorithm is developed in Microsoft Visual Basic 2010 Express Edition using the SURF implementation available in the EmguCV libraries, which are used for the image processing and computer vision tasks. A Logitech C310 High Definition webcam with 5 megapixels is used for real-time detection and tracking.
44 An Analysis on Clustering Algorithms in Data Mining, Mythili S, Madhiya E
Clustering is the grouping together of similar data items into clusters. Clustering analysis is one of the main analytical methods in data mining; the choice of clustering algorithm directly influences the clustering results. This paper discusses various types of algorithms, such as k-means clustering, and analyzes the advantages and shortcomings of each. In algorithms of this type, the distance between each data object and every cluster center must be calculated in each iteration, which limits clustering efficiency. This paper provides a broad survey of the most basic techniques and also deals with issues of clustering algorithms such as time complexity and accuracy, in order to provide better results in various environments. The results are discussed on huge datasets.
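The per-iteration cost mentioned above, comparing every point against every center in every iteration, is visible even in a minimal one-dimensional k-means sketch. This is illustrative only (real implementations use optimized libraries), and the data values are invented.

```python
# Minimal 1-D k-means: alternate between assigning each point to its
# nearest center and moving each center to the mean of its cluster.
def kmeans(points, centers, iterations=10):
    for _ in range(iterations):
        # assignment step: every point is compared with every center
        clusters = [[] for _ in centers]
        for p in points:
            idx = min(range(len(centers)), key=lambda i: abs(p - centers[i]))
            clusters[idx].append(p)
        # update step: centers move to the mean of their cluster
        centers = [sum(c) / len(c) if c else centers[i]
                   for i, c in enumerate(clusters)]
    return centers

data = [1.0, 1.2, 0.8, 9.0, 9.5, 10.0]
print(kmeans(data, centers=[0.0, 5.0]))  # converges near [1.0, 9.5]
```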
45 An Approach to an Emerging Classification Method for Large Dataset in Clustering, Kathiresan V, Dr. P Sumathi
Clustering analysis is used to explore the classification of large datasets, and the Canberra distance is generalized so that it can process data with categorical attributes. Based on the generalized Canberra distance definition, an instance of constraint-based clustering is introduced [1]. Meanwhile, nearest neighbor classification is improved: class-labeled clusters are regarded as classifying models used for classifying data. The proposed classification method can discover data that differ greatly from the instances in the training data, which may indicate a new data type. The approach generalizes the Canberra distance from continuous numerical attributes to mixed attributes, and uses clustering analysis to squash existing instances, improving the classical nearest neighbor classification method.
46 The Image Quality in Computer Tomography Using Curve-let Transform, M.Selvi, J.Vanitha, S.Yasotha
De-noising of CT images is an important research topic in both image processing and biomedical engineering. Independent component analysis (ICA) is a statistical technique whose goal is to represent a set of random variables as a linear transformation of statistically independent component variables. The curvelet transform, as a multiscale transform, has directional parameters occurring at all scales, locations, and orientations. This paper proposes a new model for CT medical image de-noising using independent component analysis and the curvelet transform. This approach can remove more noise while preserving more detail, and its efficiency is better than that of traditional de-noising approaches.
47 Efficient Service Broker Algorithm for Data Center Selection in Cloud Computing, Prof. Deepak Kapgate
In cloud computing, load balancing is required to distribute the local workload evenly across all the nodes. It helps to achieve high user satisfaction and a high resource utilization ratio by ensuring an efficient and fair allocation of every computing resource. Proper load balancing aids in minimizing resource consumption, implementing fail-over, enabling scalability and avoiding bottlenecks. In this paper, we propose and implement a new service broker (data center selection) algorithm in cloud computing, and compare the results of the proposed technique with an existing technique. This study concludes that the proposed DC selection algorithm mainly focuses on reducing the associated overhead and service response time and on improving performance. Various parameters are also identified and used to compare the existing techniques.
This paper considers malicious attacks such as packet dropping and bad-mouthing attacks, with implications for energy, reliability and security. Multipath-routing-based tolerance protocols and intrusion detection are utilized against these attacks. A lightweight intrusion detection system is used to detect malicious nodes in the network, decrease energy loss, increase QoS and achieve high security. A trust/reputation management system is investigated to strengthen intrusion detection through "weighted voting"; it provides a trust system for neighbor nodes and overcomes the downsides of multipath routing for intrusion tolerance in WSNs, achieving high security while making full use of the lifetime of heterogeneous WSNs (HWSNs).
Mobile Ad-hoc Networks (MANETs) are infrastructure-less networks where self-configuring mobile nodes are connected by wireless links. In a MANET, each node in the network performs as both a transmitter and a receiver; nodes rely on each other to store and forward packets. Due to inherent characteristics such as decentralization and self-configuring, self-organizing operation, they can be deployed easily without the need for expensive infrastructure and have a wide range of military, civilian and commercial applications. But the wireless medium, dynamically changing topology, limited battery and lack of centralized control in MANETs make them vulnerable to various types of attacks. An Intrusion Detection System (IDS) is required to detect malicious attackers before they can inflict any significant damage on the network. This paper focuses on the problem of misbehaving nodes in MANETs based on Dynamic Source Routing, and points out the pros and cons of various response-based techniques for this problem.
50 Design and Implementation of Web Based Collaborative Learning Model for ICT Course of College Student in Bangladesh, Md. Tariqul Islam, S. M. J. Rahman, Syed Md. Galib, K. M. A. Uddin, G. M. M. Bashir
The present age is the age of information technology. In order to build technical human power, the Higher Secondary Education Board of Bangladesh has launched a new compulsory course named Information and Communication Technology (ICT) for intermediate college students, taught in the traditional way. Statistics show that the traditional system of teaching the ICT course has many shortcomings; the lack of a sufficient number of qualified ICT teachers is one of them. This paper examines the inadequacies of the traditional system of learning the ICT course and proposes a solution by developing a technology-enhanced web-based collaborative learning (WBCL) model named “ICT Course Helper (ICH)” that will increase learners’ ICT knowledge. Finally, the model is implemented and surveyed. The survey results indicate that the proposed system may give better results than the conventional procedure for learning the ICT course.
52 AES and DES Using Secure and Dynamic Data Storage in Cloud, Gowtham B, Prasanth SP
Cloud computing is the use of both hardware and software as a service through the Internet. Software as a service itself depends on hardware to execute instructions and carry out the user’s requests. When data is accessed through the Internet in the cloud, the security of the data being transferred is a major concern. Since cloud computing is scalable and the servers are located in a distributed manner, the security concerns are even greater. Users of the cloud can access it from anywhere, from any device, so the device must also be secured against attacks, and data stored in the cloud is likewise at risk. To ensure security for the data stored in the cloud, the Digital Signature Algorithm (DSA) is used to ensure the integrity of each file, and the Advanced Encryption Standard (AES) algorithm is used to encrypt and decrypt the files in cloud storage. Public auditability can also be implemented by using the public key that is created during the file creation or editing process. The Secure Hash Algorithm-1 (SHA-1) hash function is used to create the message digest in the Digital Signature Algorithm; any other hash function, such as SHA-2 or MD5, can be used in place of SHA-1. AES is used in the application to encrypt the data stored in the cloud, so that the files are protected from all users of the cloud, including the administrator, and can be verified to check their integrity.
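The integrity-check side of the scheme described above can be sketched with the standard library. This is an illustrative sketch, not the authors' system: it uses SHA-256 in place of SHA-1 (the abstract notes the hash function is interchangeable), and it omits the AES encryption and DSA signing steps, which would require a third-party cryptography library.

```python
# Store a message digest at upload time; re-hash on download and
# compare to detect tampering.
import hashlib

def file_digest(data: bytes) -> str:
    """Digest stored alongside the file when it is created or edited."""
    return hashlib.sha256(data).hexdigest()

def verify(data: bytes, stored_digest: str) -> bool:
    """Re-hash the downloaded bytes and compare with the stored digest."""
    return file_digest(data) == stored_digest

original = b"cloud file contents"      # hypothetical file body
digest = file_digest(original)
print(verify(original, digest))        # True: file unchanged
print(verify(b"tampered data", digest))  # False: integrity violated
```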
53 An Overview of MANET: Applications, Attacks and Challenges, Mr. L Raja, Capt. Dr. S Santhosh Baboo
Advancement in the field of the Internet due to wireless networking technologies gives rise to many new applications. The mobile ad-hoc network (MANET) is one of the most promising fields for research and development of wireless networks. As the popularity of mobile devices and wireless networks has significantly increased over the past years, wireless ad-hoc networks have now become one of the most vibrant and active fields of communication and networking. A mobile ad hoc network is an autonomous collection of mobile devices (laptops, smart phones, sensors, etc.) that communicate with each other over wireless links and cooperate in a distributed manner in order to provide the necessary network functionality in the absence of a fixed infrastructure. This type of network, operating as a stand-alone network or with one or multiple points of attachment to cellular networks or the Internet, paves the way for numerous new and exciting applications. This paper provides insight into the potential applications of ad hoc networks and various attacks, and discusses the technological challenges that protocol designers and network developers face.
54 Real Time HCI using Eye Blink Detection, R.T.Narmadha, T.Mythili, R.T.Nivetha
This project aims to design an HCI that is effective and efficient in terms of both performance and cost. It also designs a hands-free interface which helps the user interact with the computer using facial features. To select a robust facial feature, we use a pattern recognition paradigm for treating features. We use an off-the-shelf webcam that affords a moderate resolution and frame rate as the capturing device. The system assists people with hand disabilities that prevent them from using a mouse, through an application that uses facial features to interact with the computer.
55 A Survey on Wireless Sensor Network Protocols, T.Mythili, R.T.Narmadha, R.T.Nivetha
In this research work, a survey on Wireless Sensor Networks (WSNs) and their technologies, standards and applications was carried out. Wireless sensor networks consist of small nodes with sensing, computation, and wireless communication capabilities. Many routing, power management, and data dissemination protocols have been specifically designed for WSNs, where energy awareness is an essential design issue. Routing protocols in WSNs may differ depending on the application and network architecture. Wireless sensor networks form a multidisciplinary research area, where close collaboration between users, application domain experts, hardware designers, and software developers is needed to implement efficient systems. The flexibility, fault tolerance, high sensing fidelity, low cost, and rapid deployment characteristics of sensor networks create many new and exciting application areas for remote sensing, and in the future this wide range of applications will make sensor networks an integral part of our lives. However, the realization of sensor networks must satisfy the constraints introduced by factors such as fault tolerance, scalability, cost, hardware, topology change, environment, power consumption and energy efficiency.
56 Mining Association Rules to Improve Academic Performance, Rakesh Kumar Arora, Dr. Dharmendra Badal
The main objective of higher education institutions is to provide quality education to their students. Institutions hope to improve the quality of education by identifying the set of students who need special focus to clear their exams, so that appropriate steps can be taken to improve the overall success ratio. This will result in excellent placements, thereby increasing the quality of student intake in subsequent years. A system to analyze the performance of students using an association analysis algorithm is described in this paper. It will assist academic planners in identifying students who need more attention, so that extra effort can be directed toward these students to improve results.
57 Design & Implementation of Data Protection Server: Detect Guilty Agent & Protect Secure Data, Hema Donekar, Pratiksha Raut, Nita Janorkar, Shital Admane, Indu Mandwi
This paper presents a proactive protection scheme based on a data protection server. We propose an improved approach based on detecting leakage and identifying the guilty party, which enhances the security of data. A data distributor has given sensitive data to a set of supposedly trusted agents (third parties). Some of the data are leaked and found in an unauthorized place (e.g., on the web or on somebody’s laptop). The distributor must assess the likelihood that the leaked data came from one or more agents, as opposed to having been independently gathered by other means. We propose data allocation strategies (across the agents) that improve the probability of identifying leakages. These methods do not rely on alterations of the released data (e.g., watermarks). In some cases, we can also inject “realistic but fake” data records to further improve our chances of detecting leakage and identifying the guilty party.
The Mobile Ad hoc Networks community provides a wealth of technologies that enable source and destination nodes to route data through a number of intermediate forwarding nodes. Fast resource discovery and high Quality of Service are key determinants of efficient multimedia transmission. In this paper, we describe a multipath routing technique using the AOMDV routing protocol for multicast multimedia data transmission in MANETs. Multi-path routing is a promising routing method for wireless mobile ad hoc networks: it achieves load balancing and is more resilient to route failures. The Ad Hoc On-demand Multipath Distance Vector protocol is used to choose among the multiple paths available for multicasting multimedia data in a MANET, based on a rate-distortion metric instead of finding disjoint paths. The multimedia data are then transferred to one- and two-hop neighbours. The ability to create multiple routes from a source to a destination is used to provide a backup route: when the primary route fails to deliver packets, the backup is used to maintain the connection. Multipath routing using AOMDV achieves lower average end-to-end delay, higher video data delivery, lower routing overhead and packet loss rate, higher network throughput and better quality of service in comparison with single-hop neighbours.
Cloud computing has emerged as one of the most influential paradigms in the IT industry in recent years. Since this new computing technology requires users to entrust their valuable data to cloud providers, there have been increasing security and privacy concerns about outsourced data. Several schemes employing Attribute-Based Encryption (ABE) have been proposed for access control of outsourced data in cloud computing; however, most of them suffer from inflexibility in implementing complex access control policies. The proposed scheme is Hierarchical Attribute-Set-Based Encryption (HASBE), which extends ciphertext-policy Attribute-Set-Based Encryption (ASBE) with a hierarchical structure of users. The proposed scheme not only achieves scalability due to its hierarchical structure, but also inherits the flexibility and fine-grained access control of ASBE in supporting compound attributes. In addition, it employs multiple value assignments for access expiration time to deal with user revocation more efficiently than existing schemes. We formally prove the security of HASBE based on the security of the ciphertext-policy Attribute-Based Encryption (CP-ABE) scheme and analyze its performance and computational complexity. The scheme seamlessly incorporates a hierarchical structure of system users by applying a delegation algorithm to ASBE. It not only supports compound attributes due to flexible attribute-set combinations, but also achieves efficient user revocation because of multiple value assignments of attributes, realizing scalable, flexible, and fine-grained access control in cloud computing.
60 Memetic Algorithm with Hybrid Mutation Operator, Manju Sharma
Genetic Algorithms are biologically inspired optimization algorithms that mimic the process of natural evolution. The behaviour of an evolutionary algorithm in finding optimal solutions can be broadly categorized into two strategies, exploration and exploitation, but the literature clearly shows that neither can be claimed superior for all problems or for all stages of a problem. The operators used in evolutionary approaches are inclined either towards exploration or towards exploitation, whereas problems need operators that blend both. This paper studies different mutation operators and proposes a hybrid mutation operator whose behaviour is controlled by local search: in the early cycles of evolution it acts more like exploration, and it gradually shifts towards exploitation, which prevents the algorithm from getting stuck in local optima. The experiments were conducted on the TSP Oliver30 and Eil51 benchmark problems, with the implementation carried out in MATLAB. The results show the improvement of the memetic algorithm with the hybrid mutation operator over the existing genetic algorithm with simple mutation operators.
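To make the exploration-to-exploitation shift concrete, here is a minimal Python sketch (our own illustration, not the paper's MATLAB implementation) of a hybrid mutation for TSP tours: early generations favour a random swap (exploration), later ones a 2-opt local-search step (exploitation). The linear `p_explore` schedule and all function names are assumptions.

```python
import random

def tour_length(tour, dist):
    # Total length of a closed tour over a distance matrix.
    return sum(dist[tour[i]][tour[(i + 1) % len(tour)]] for i in range(len(tour)))

def swap_mutation(tour):
    # Exploration: exchange two random cities.
    a, b = random.sample(range(len(tour)), 2)
    t = tour[:]
    t[a], t[b] = t[b], t[a]
    return t

def two_opt_step(tour, dist):
    # Exploitation: reverse the first segment found that shortens the tour.
    base = tour_length(tour, dist)
    for i in range(1, len(tour) - 1):
        for j in range(i + 1, len(tour) + 1):
            cand = tour[:i] + tour[i:j][::-1] + tour[j:]
            if tour_length(cand, dist) < base:
                return cand
    return tour  # already 2-opt locally optimal

def hybrid_mutation(tour, dist, generation, max_generations):
    # Early generations mutate randomly (exploration); the probability of
    # the local-search step (exploitation) grows as evolution proceeds.
    p_explore = 1.0 - generation / max_generations
    if random.random() < p_explore:
        return swap_mutation(tour)
    return two_opt_step(tour, dist)
```

In a full memetic algorithm this operator would sit inside the usual selection/crossover loop; only the mutation stage is sketched here.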
Comparing one entity with another is a characteristic component of human decision making. However, it is not always straightforward to know what to compare or what the alternative entities are when using an information retrieval system. In this paper, we study techniques that automatically evaluate queries and entities in order to mine comparable entities for information extraction. Several methods have been proposed in earlier work to solve the problem of comparable entity mining, but none of them studied the accurate discovery of comparable entities; supervised learning techniques, in contrast, give well-organized results for comparable entities across different queries and can automatically mine equivalent entities from the comparative queries that users post online. In this article we examine the shortcomings of earlier query-and-answering schemes and explain why we prefer comparable entity mining.
62 Code Birthmarks and Graph Isomorphism for Theft Detection, Snehal N. Nayakoji, S. P. Sonavane
JavaScript is becoming more and more popular as a client-side scripting language in the web community. However, it is easy to copy the source code of a JavaScript program with the help of additional functionality provided by the browser, so the intellectual property rights of web application developers are at risk. To address this issue, a new software theft detection technique, called a software birthmark, is introduced. A software birthmark derives from the intrinsic characteristics of a program and can be used to determine the similarity between two programs. This paper demonstrates how to extract a code signature and design a software birthmark, along with the idea of using subgraph isomorphism to detect source code theft in JavaScript programs.
This paper presents multi-wavelet based invisible watermarking. Earlier DWT-DCT techniques offered weak copyright protection and content authentication; the proposed method addresses these problems. First, the multi-wavelet transform is applied to improve image resolution in the LL sub-band. Second, the important data (the watermark image) is embedded into the host multimedia, which can be used for digital rights management, authentication and data hiding. The experimental results show that the watermarking scheme has strong robustness and can embed much more data.
64 Performance Analysis of Image Fusion Algorithms using HAAR Wavelet, Deepika.L, Mary Sindhuja.N.M
Image fusion is a technique used to integrate information from multiple images such that the fused image is suitable for processing tasks. Medical image fusion is used to derive useful information from medical images. The basic idea is to improve the content of an image by fusing images such as Computed Tomography (CT) and Magnetic Resonance Imaging (MRI) images. The proposed method applies discrete-wavelet-based fusion algorithms to the medical image fusion of CT and MRI, implements the fusion rules and evaluates the fused image quality. The fused image then holds information useful for human or machine perception, and such rich information will improve the performance of image analysis algorithms for medical applications. The fusion performance is evaluated using entropy, the Root Mean Square Error (RMSE) and the Peak Signal to Noise Ratio (PSNR).
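The three evaluation measures named above (entropy, RMSE, PSNR) are standard and can be computed as below. This is a generic sketch over flattened 8-bit pixel lists, not the paper's code; function names are our own.

```python
import math
from collections import Counter

def rmse(ref, fused):
    # Root mean square error between reference and fused pixel lists.
    return math.sqrt(sum((r - f) ** 2 for r, f in zip(ref, fused)) / len(ref))

def psnr(ref, fused, peak=255.0):
    # Peak signal-to-noise ratio in dB; infinite for identical images.
    e = rmse(ref, fused)
    return float("inf") if e == 0 else 20 * math.log10(peak / e)

def entropy(pixels):
    # Shannon entropy of the grey-level histogram, in bits:
    # higher entropy suggests the fused image carries more information.
    counts = Counter(pixels)
    n = len(pixels)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())
```

In a fusion experiment these would be evaluated between each source image (or a ground-truth reference) and the fused result.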
65 Survey on Quality Analysis of Cooperation Incentive Strategies in MANET, K Savitha Rohini, S Dhanasekar
In mobile ad hoc networks (MANETs), tasks are conducted through the cooperation of the nodes in the network. However, since nodes are usually constrained by limited computational resources, selfish nodes may refuse to cooperate. A reputation system is one of the main solutions to the node non-cooperation problem: it evaluates node behaviour by reputation values and uses a reputation threshold to distinguish trustworthy nodes from untrustworthy ones. Although such systems are widely used, very little research has investigated the effectiveness of the cooperation incentives they provide. We propose a protocol called the Enhanced Reverse Ad Hoc On-Demand Distance Vector Routing Protocol (ERAODV), which uses a Hybrid Reputation System (HRS). The Hybrid Reputation System is an enhanced version of the Classical Reputation System (CRS); unlike the CRS, it takes into account all of a node's reputation values to determine whether the node is trustworthy.
66 Distributing, Ensuring and Recovery of Data Stored in Cloud, Dr. B. F. Momin, Mr. Girish Bamane
Cloud Computing can be seen as the buzzword of the day. In contrast to traditional systems, where IT services are under the organization's own control and constrained by local limitations such as computing capacity and hardware, Cloud Computing moves applications, databases and huge data sets to large data centres, where the management of the data and services may not be fully trustworthy or reliable. This introduces new security attributes that are still not well understood. In this paper we examine data storage security. To ensure the correctness of data stored in a cloud, we propose a new distributed storage scheme. In this scheme, we verify the correctness of the data stored in the cloud and also identify the storage server which is misbehaving or has been compromised (data error localization). Although many methods exist to prevent data modification and to recover data, such as data replication and parity generation, the scheme proposed in this paper is more efficient because it prevents data loss under both single-point and multi-point storage server failures.
67 A Survey Conducted on E-Agriculture with Indian Farmers, Sumitha Thankachan, Dr. S. Kirubakaran
Technological advances have been a great support for decision making in various fields, especially agriculture. Agricultural development has been held back in recent years by a lack of agricultural knowledge and by environmental changes. The main aim of this paper is to reach farmers and assess their awareness, usage and perception of e-Agriculture. The study used a statistical survey design to collect data from farmers on their awareness of e-Commerce. The results indicate that the level of awareness is low, and hence that e-Agriculture is needed for their support. e-Agriculture is a platform for supporting the marketing of agricultural products.
68 A Survey on Delegating Log Management to The Cloud, Sinu P S, M.Ananthi
A log is a collection of records of events that occur within an organization's systems and networks. Logs are composed of entries, each with its own syntax; each entry holds information related to a specific event that has occurred in a system or network. Logs were originally used mainly for troubleshooting, but today they serve many functions in almost all organizations: optimizing system and network performance, recording user actions, and providing data useful in investigating malicious activity. Within an organization, many logs contain records related to system security; common examples of such computer security logs are audit logs, which track user authentication attempts, and security device logs, which record probable attacks. In this paper, we focus on the challenges of a secure cloud-based log management service and propose a framework for providing one.
69 Survey on Clinical Decision Support System for Diagnosing Heart Disease, Suchithra, Dr. P. Uma Maheswari
Ischemic Heart Disease is a disease which is difficult to diagnose and is very commonly identified only at the death of the individual. The World Health Organization (WHO)[12] statistical report states that more than 70 per cent of coronary deaths occur in subjects older than 70 years in North America and Western Europe, while as per WHO reports, in India and other developing countries 70 per cent of such deaths occur in subjects less than 70 years of age. Coronary Heart Disease (CHD) is an epidemic in India. A retrospective data set of 1000 clinical cases was taken for this work; 88 sets were discarded during preprocessing, and tests were run on the remaining 912 cases using the Weka classifiers[5] available in Weka 3.7.0. Out of 113 classifiers, 16 were identified as the best based on different parameters: sensitivity, specificity, accuracy, F-measure, kappa statistic, correctly classified cases, time taken to run the model, and the ROC curve. The diagnoses made by the Clinical Decision Support System (CDSS)[1][6][9] were compared with those made by physicians during patient consultations. The major goal of this paper is to build an expert system for diagnosing the presence of Ischemic Heart Disease with an integrated automated classifier using Artificial Intelligence techniques.
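The ranking parameters listed (sensitivity, specificity, accuracy, F-measure, kappa statistic) are standard functions of a two-class confusion matrix. A generic sketch follows; it is not tied to Weka, and the function name and returned keys are our own assumptions.

```python
def metrics(tp, fn, fp, tn):
    # Standard two-class measures computed from confusion-matrix counts:
    # tp/fn/fp/tn = true positives, false negatives, false positives, true negatives.
    total = tp + fn + fp + tn
    sens = tp / (tp + fn)          # sensitivity (recall)
    spec = tn / (tn + fp)          # specificity
    acc = (tp + tn) / total        # accuracy (= observed agreement)
    prec = tp / (tp + fp)          # precision
    f1 = 2 * prec * sens / (prec + sens)
    # Cohen's kappa: observed agreement corrected for chance agreement.
    p_e = ((tp + fn) * (tp + fp) + (fp + tn) * (fn + tn)) / total ** 2
    kappa = (acc - p_e) / (1 - p_e)
    return {"sensitivity": sens, "specificity": spec, "accuracy": acc,
            "f_measure": f1, "kappa": kappa}
```

Comparing 113 classifiers then reduces to computing these values per classifier and ranking, together with run time and ROC area.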
70 Monitoring Driver Alertness and Avoiding Traffic Collision Using WSN, B.Arunkumar, N.Deepak, Dr.T.V.P.Sundararajan
Driver drowsiness is among the leading causal factors in traffic accidents occurring worldwide. This paper delineates a method to monitor driver safety by analyzing fatigue-related information in EEG signals. Drowsiness is a state of near-sleep, a strong desire for sleep, or sleeping for unusually long periods. It is also the transition state between waking and sleep, during which a decrease in vigilance is generally observed. Both behavioural and physiological modifications occur during drowsiness: reaction time is slower, vigilance is reduced and information processing is less efficient, which can generate abnormal driving; the number and duration of blinks and yawns increase; and changes in cerebral activity also occur. Therefore, a drowsiness detector system is developed which detects drowsiness from EEG signals. The EEG signals from different persons are analyzed, and feature extraction is carried out with the Fast Fourier Transform (FFT). The EEG signals are classified into delta, theta, alpha and beta bands depending on their frequency values, in order to detect driver fatigue and alert the person.
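As an illustration of the frequency-band analysis described, the sketch below computes delta/theta/alpha/beta band powers from a short EEG window. The band limits follow common EEG conventions (exact cut-offs vary between papers), a naive DFT stands in for the FFT, and all names are our own assumptions, not the authors' system.

```python
import cmath
import math

def dft_power(signal, fs):
    # Naive DFT (O(n^2), fine for short windows; a real system would use
    # an FFT). Returns (frequency_hz, power) for the positive-frequency bins.
    n = len(signal)
    return [(k * fs / n,
             abs(sum(signal[t] * cmath.exp(-2j * cmath.pi * k * t / n)
                     for t in range(n))) ** 2)
            for k in range(n // 2)]

# Conventional EEG band limits in Hz (an assumption; definitions vary).
BANDS = {"delta": (0.5, 4), "theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}

def band_powers(signal, fs):
    # Sum spectral power over each named band.
    spectrum = dft_power(signal, fs)
    return {name: sum(p for f, p in spectrum if lo <= f < hi)
            for name, (lo, hi) in BANDS.items()}
```

A drowsiness detector would then threshold or classify on ratios of these band powers (e.g. rising alpha/theta relative to beta).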
71 Speech Recognition Using Backoff N-Gram Modelling in Android Application, S.Aparna, V.Senthil Kumar
Google is one of the most popular information retrieval systems among users. Spoken queries are a natural medium for searching the web in settings where typing on a keyboard is not practical. This paper describes a speech interface to Google search. The study entails the development of a hands-free voice-recognition Google search interface, with which a user can operate Google and browse search results without using a keyboard or mouse. Speech recognition applications are becoming more and more useful nowadays, and digital processing of the speech signal is very important for fast and precise automatic voice recognition. Here we present a new service which is not currently available in the Google search engine (GSE): the implementation of speech-recognition input in GSE. The paper revives an older method from n-gram language modelling to scale to large training data. The algorithm is implemented efficiently using a MapReduce/SSTable framework, together with spectral subtraction (SS), based on HMMs and Gaussian Mixture Models.
72 Design of Encounter-Based Social Network in Mobile Application, S.Niranjani, A.Rathna
Mobile social networks promise to greatly enhance interaction among mobile users, who share information through encounter-based social networks. In an encounter-based network, as opposed to a traditional social network, users can abuse shared information, so this new approach poses challenges fundamentally different from previous social network designs. In this paper, we propose a design for an encounter-based mobile social network that secures location and encounter privacy, and we explore the different requirements of these new systems. We present a system in which devices that shared a physical location and time can be matched by a central server. However, centralized servers cannot always be relied upon to protect data confidentiality, so we describe the design of SMILE, a secure, privacy-preserving "missed-connections" service for mobile users that operates without relying on a trusted coordinating server. SMILE is designed around key-exchange protocols, and we use cryptographic hash techniques to protect information in the mobile application. This paper presents a design for a secure encounter-based social network, implemented in an Android application called MeetUp.
Digital forensics is an emerging technology used to detect illegal content in videos. In this project, the main objective is to use the Content Delivery Network (CDN) based Resource Aware Scheduling (CRAS) algorithm to determine the originality of a video. The CDN transmits the packets from source to destination in real time. The sample video is given in terms of frames, namely I-frames, B-frames and P-frames. The CRAS algorithm schedules tasks efficiently in the Digital Forensic Service Platform (DFSP) according to resource parameters such as delay and computational load. The proposed system decreases node traffic and improves scalability.
The Column Mobility Model proves useful for scanning or searching purposes. This model represents a set of Mobile Nodes that move around a given line (or column), which is moving in a forward direction. In this paper we propose a modified column mobility model. A slight modification of the Column Mobility Model allows the individual Mobile Nodes to follow one another and search any node randomly in a column in an efficient manner.
75 Predictive Data Mining: A Generalized Approach, Meghana Deshmukh, Prof. S. P. Akarte
In this paper, we take on the ambitious task of formulating a general framework for data mining, and we explain the requirements such a framework should fulfil: it should elegantly handle different types of data, different data mining tasks, and different types of patterns/models. We also discuss data mining languages and what they should support, including the design and implementation of data mining algorithms and their composition into non-trivial multi-step knowledge discovery scenarios relevant to practical applications. We proceed by laying out basic concepts, starting with (structured) data and generalizations (e.g., patterns and models) and continuing with data mining tasks and the basic components of data mining algorithms (i.e., refinement operators, distances, features and kernels). We next discuss how to use these concepts to formulate constraint-based data mining tasks and design generic data mining algorithms. Finally, the paper discusses how these components fit into the overall framework, and in particular into a language for data mining and knowledge discovery.
76 Automatic Detection and Restraining Mobile Virus Propagation using Android, S. Chandrasekar, Prof. V. Jayaprakasan
Mobile viruses and malware are a growing problem that needs to be addressed. Many studies exist on PC viruses and worms, but much less work concerns the same issues in the mobile environment. With the rapid growth of smartphone use, mobile devices increasingly become targets for viruses that propagate through Bluetooth and Wi-Fi and spread into mobile networks. Mobile viruses and malware can cause privacy leakage, extra charges, depletion of battery power, remote listening, and access to private short messages and call history logs; furthermore, they can cripple wireless servers by sending large volumes of spam messages, or track user positions through GPS [3]. We propose a two-layer network model for virus spreading through both Bluetooth and SMS/MMS. Our work addresses the effect of human behaviours, i.e., operational behaviour and mobile behaviour, on virus propagation. Moreover, we examine two strategies for restraining mobile virus propagation, pre-immunization and adaptive dissemination, which draw on the methodology of Autonomy-Oriented Computing (AOC) [13]. Using these methods, both Bluetooth and SMS viruses can be automatically detected and deleted before they enter the smartphone operating system.
77 Various Approaches to Detect Wormhole Attack in Wireless Sensor Networks, Nishant Sharma, Upinderpal Singh
Wireless Sensor Networks (WSNs) are an emerging technology that shows great promise for various futuristic applications, both civilian and military. These small, low-cost, low-power, multifunctional sensor nodes can communicate over short distances. There is currently enormous research interest in the field of wireless sensor network security. The major challenge in employing any efficient security scheme in wireless sensor networks comes from the size of the sensors, and consequently from their processing power, memory, and the type of tasks expected of them. Among the various attacks on wireless sensor networks is the wormhole attack, in which a pair of attackers forms a "tunnel" to transfer data packets and replays them into the network. This paper provides a survey of the wormhole attack and its countermeasures, and describes a proposed scheme that can detect and prevent wormhole attacks in wireless sensor networks.
78 Decision Support System for Precluding Coronary Heart Disease (CHD), K. Cinetha, Dr. P. Uma Maheswari
Cardiovascular disease (CVD) remains the biggest cause of death worldwide, so predicting heart disease at an early stage is important. Coronary heart disease (CHD) is the leading cause of death for both men and women and accounts for approximately 600,000 deaths in the United States every year. We design a decision support system that estimates a patient's ten-year risk of coronary heart disease (CHD) for prevention, and that assists medical practitioners in diagnosing and predicting probable complications well in advance. We identify the major risk factors of CHD and categorize them in order of the damage they cause: high blood cholesterol, diabetes, smoking, poor diet, obesity, hypertension, stress, etc. Data mining functionality is used to identify the level of each risk factor, to help patients take precautionary actions and stretch their life span. As primary prevention, we recommend promoting healthy lifestyles and habits through increased awareness and consciousness, to prevent the development of any risk factors.
Schedulers for cloud computing determine on which processing resource the jobs of a workflow should be allocated. In hybrid clouds, jobs can be allocated either on a private cloud or on a public cloud on a pay-per-use basis. The capacity of the communication channels connecting these two types of resources impacts both the makespan and the cost of workflow execution. Our new approach introduces Ant Colony Optimization for the scheduling problem in hybrid clouds, presenting the main heuristics to be considered when scheduling workflows: cost, makespan, number of cores (multicore) and available bandwidth. Ant Colony Optimization is one of the best optimization techniques for scheduling workflows using heuristics.
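A minimal ant-colony sketch of the job-to-resource assignment at the heart of this problem, for illustration only: it optimizes makespan with a light-load heuristic, whereas the paper's approach also weighs cost, core count and bandwidth, which are omitted here. All names and parameter defaults are our own assumptions.

```python
import random

def aco_schedule(job_times, n_resources, ants=20, iters=50, rho=0.1, seed=1):
    # Ant Colony Optimization sketch: each ant assigns jobs to resources,
    # guided by pheromone and a light-load heuristic; the best schedule
    # found so far reinforces its pheromone trail each iteration.
    rng = random.Random(seed)
    n = len(job_times)
    pher = [[1.0] * n_resources for _ in range(n)]  # pheromone[job][resource]
    best, best_makespan = None, float("inf")
    for _ in range(iters):
        for _ in range(ants):
            loads = [0.0] * n_resources
            assign = []
            for j in range(n):
                # Probability proportional to pheromone, biased to light loads.
                weights = [pher[j][r] / (1.0 + loads[r]) for r in range(n_resources)]
                r = rng.choices(range(n_resources), weights=weights)[0]
                assign.append(r)
                loads[r] += job_times[j]
            makespan = max(loads)
            if makespan < best_makespan:
                best, best_makespan = assign, makespan
        for j in range(n):
            for r in range(n_resources):
                pher[j][r] *= 1 - rho                  # evaporation
            pher[j][best[j]] += 1.0 / best_makespan    # reinforcement
    return best, best_makespan
```

A hybrid-cloud version would extend the ant's scoring to a weighted sum of makespan, monetary cost and transfer time over the private/public link.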
80 Integrated System for Reading Multiple Files, Awanti Kamble, Anshoola Jaiswal, Nikita Dekate, Shama Haridas, Kalyani Pendke
Today, to read documents with different extensions, a user needs to install a different application for each document type. For example, to view a Word document, MS Office must be installed on the PC; to view a PDF file, Adobe PDF Reader must be installed; and the same holds for multimedia files, which again require separate software. Installing all of these applications just to view files is a hectic and cumbersome task, as it consumes many PC resources such as hard disk space and memory, and if the user subscribes to licensed versions of this software there is a large cost involved. Keeping these problems in mind, we decided to build an application that combines all of these viewers. The Global File Reader is a combination of Notepad, MS Word, a web browser, an image viewer, a media player and Adobe PDF Reader, with which a user can view all of the most common file formats. Its main USP is that it can open almost all types of documents and multimedia files, and its tabbed feature allows different files to be opened without closing the previous one. The Global File Reader is a one-man show: what all the different applications, text editors and media players do, it does alone, as an integration or consolidation of all of them.
In wireless media, secure communication is an important concern. We use identity-based cryptosystems to provide security in two-way relay networks; however, because a node's identity serves as its public key, this scheme lacks anonymity and privacy preservation. To solve this problem, we propose a new approach in two-way relay networks that enhances security using cooperative jamming and relay selection. We consider a two-way relay network consisting of two sources, relays and an eavesdropper, and propose a new relay-chatting based transmission scheme: a single relay forwards the messages while the remaining relays transmit interference signals by distributed beamforming to confuse the eavesdropper.
82 Honey Trap Security Server: An Efficient Approach of Securing E-Banking Network, Ashwini Gabhane, Kiran Bansule, Pradnya Kedar, Lekha Gahukar, Swati Sahare, Ketki Bhakare
This paper presents a new way of securing an account. The Honey Trap Security Server is used to protect an account from attackers, intruders, hackers and crackers. A honey trap is "a security resource whose value lies in being probed, attacked or compromised". The honey trap contains no data or applications critical to the company, but has enough interesting data to entice a cracker. A honey trap system should appear to be easier prey for intruders than true production systems, but with minor system modifications so that intruder activity can be logged or traced. An important goal of the Honey Trap Security Server is to trap an intruder, and to record the methods used for intrusion, before the real server is attacked.
83 Weighted Moving Average Forecast Model based Prediction Service Broker Algorithm for Cloud Computing, Prof. Deepak Kapgate
Proper load balancing aids in minimizing resource consumption, implementing fail-over, enabling scalability and avoiding bottlenecks in cloud computing. In the cloud computing scenario, load balancing comprises selecting a Data Center for each upcoming request and managing the virtual machines at each individual Data Center. In this paper, we propose and implement a predictive service broker (DC selection) dynamic algorithm based on a Weighted Moving Average Forecast Model in cloud computing. The proposed predictive DC selection algorithm focuses mainly on reducing the service response time observed at the client side, and the results show a drastic reduction in client-side response time when it is used. Various parameters are also identified, such as Data Center request service times, Data Center hourly loading, total data transfer and virtual machine cost, and their respective values are calculated.
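The core of a weighted-moving-average service broker can be sketched in a few lines. The weights, window length and selection rule below are illustrative assumptions, not the paper's exact model.

```python
def wma_forecast(history, weights=(1, 2, 3)):
    # Weighted moving average over the last len(weights) samples;
    # the most recent sample carries the largest weight.
    k = len(weights)
    return sum(w * x for w, x in zip(weights, history[-k:])) / sum(weights)

def select_datacenter(response_histories):
    # Route the next request to the data centre whose forecast
    # response time is lowest.
    return min(response_histories,
               key=lambda dc: wma_forecast(response_histories[dc]))
```

For example, a DC whose recent response times are falling (40, 35, 30 ms) forecasts lower than a flat 50 ms DC and wins the next request.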
Data replication is the most critical component of a data-intensive grid computing environment. The need for data replication arises in various areas of data analysis such as high-energy physics, bio-informatics, climate modeling and astronomy. Beyond grid data environments, data replication is a key part of various data sharing applications such as digital libraries, persistent archival environments and content distribution. Parallel file replication, where a large file needs to be simultaneously replicated to multiple sites, is an integral part of a data-intensive grid environment. We propose a tool that creates multiple distribution trees by pipelining point-to-point transfers and optimizes the file replication time to multiple sites. A key part of data replication is the replica catalog, which manages the mappings of files from the hierarchical namespace to one or more physical file locations, thus providing efficient and transparent file sharing on a Grid. Managing and coordinating the data movement process is the crucial performance issue.
85 Evaluating and Analyzing Clusters in Data Mining using Different Algorithms, N. Sunil Chowdary, D. Sri Lakshmi Prasanna, P. Sudhakar
Cluster analysis or clustering is the task of grouping a set of objects in such a way that objects in the same group (called a cluster) are more similar (in some sense or another) to each other than to those in other groups (clusters). It is a main task of exploratory data mining, and a common technique for statistical data analysis, used in many fields, including machine learning, pattern recognition, image analysis, information retrieval, and bioinformatics.
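As a concrete example of the grouping described, here is a minimal k-means sketch in pure Python. The deterministic initialisation on the first k points is an assumption made for reproducibility (real implementations usually randomize or use k-means++), and all names are our own.

```python
import math

def kmeans(points, k, iters=100):
    # Lloyd's algorithm: assign each point to its nearest centroid,
    # then move each centroid to the mean of its cluster, until stable.
    # Initialisation on the first k points assumes they are distinct.
    centroids = [list(p) for p in points[:k]]
    clusters = [[] for _ in range(k)]
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            i = min(range(k), key=lambda c: math.dist(p, centroids[c]))
            clusters[i].append(p)
        new = [[sum(x) / len(cl) for x in zip(*cl)] if cl else centroids[i]
               for i, cl in enumerate(clusters)]
        if new == centroids:
            break
        centroids = new
    return centroids, clusters
```

On two well-separated groups of 2-D points, the centroids converge to the group means after a couple of iterations.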
86 High Efficiency Data Access System Architecture for Deblocking Filter Supporting Multiple Video Coding Standards, S.Karthikeyan, K.Saran
In this paper, we propose a novel algorithm for block artifact reduction based on pseudo-random noise masking. In the H.264 video compression standard, an in-loop deblocking filter is implemented to reduce the quantization noise around block boundaries, but artifacts still remain within blocks. These are known as contour artifacts, and they can be further reduced by our proposed pseudo-random noise masking method. The proposed method is selectively applied to macroblocks damaged by block artifacts after H.264 deblocking filtering, and its effectiveness is clearly demonstrated for all evaluation aspects. We estimate the level of block artifacts by measuring the PSNR of the artifact-affected video.
87 Literature Survey on Applications of Digital Signal Processing using Anti-Aliasing and Anti-Imaging Filters, S.Arun Kumar
An advantage of polynomial-based interpolation is that it can be efficiently implemented using the so-called Farrow structure. This discrete-time filter structure consists of Finite Impulse Response (FIR) branch filters with fixed coefficient values; the interpolated samples are obtained by weighting the output samples of these FIR filters by the fractional interval µ. In this paper, a new method for designing polynomial-based interpolation filters is proposed. The method is based on the relationship between the Taylor series of the approximating continuous-time signal and the Farrow structure, and it enables us to design the FIR filters in the Farrow structure separately. Because these FIR filters are linear-phase filters, they can easily be designed using the Remez algorithm.
88 An Investigation of Face Recognition Characteristics Using PCA and ICA, Yundi Fu, Yongli Cao, Arun Kumar Sangaiah
This paper aims to investigate face recognition characteristics using widely adopted statistical approaches, namely Principal Component Analysis (PCA) and Independent Component Analysis (ICA). It focuses on the eigenfaces approach for implementing face recognition and detection on images, in order to compare the performance of PCA and ICA. Our results reveal that, depending on the applied conditions and performance metrics, a tuned ICA can produce better results than PCA. The managerial practices and results of the paper are presented.
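To illustrate the PCA side of the comparison, the sketch below extracts the leading principal component (for face images, the first "eigenface" direction) by power iteration on the covariance matrix. It is a didactic pure-Python sketch with names of our own choosing, not the paper's implementation; real eigenface pipelines operate on flattened face images and keep many components.

```python
def pca_top_component(data, iters=200):
    # Centre the data, build the covariance matrix, then power-iterate
    # to approximate its leading eigenvector (the top principal component).
    n, d = len(data), len(data[0])
    means = [sum(col) / n for col in zip(*data)]
    centred = [[x - m for x, m in zip(row, means)] for row in data]
    cov = [[sum(centred[r][i] * centred[r][j] for r in range(n)) / (n - 1)
            for j in range(d)] for i in range(d)]
    v = [1.0] * d
    for _ in range(iters):
        w = [sum(cov[i][j] * v[j] for j in range(d)) for i in range(d)]
        norm = sum(x * x for x in w) ** 0.5
        v = [x / norm for x in w]
    return v  # unit vector along the direction of maximum variance
```

Projecting each face onto the top components yields the low-dimensional features that PCA- and ICA-based recognizers then compare.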
89 An Investigation of Dijkstra and Floyd Algorithms in National City Traffic Advisory Procedures, Arun Kumar Sangaiah, Minghao Han, Suzi Zhang
This paper focuses on the design and implementation of national city traffic advisory procedures. Its main aim is to provide optimal decisions and transport advisory procedures to passengers. We investigate the Dijkstra and Floyd algorithms, implemented in Microsoft Visual Studio, in order to establish the storage structure of the transportation network and find the shortest path between two cities. We formulate optimal decision rules to analyze the time and cost parameters among various cities in China, and use the Dijkstra and Floyd algorithms to compare performance and provide an optimal solution to passengers.
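The two algorithms compared can be sketched side by side: Dijkstra computes single-source shortest distances with a priority queue, while Floyd(-Warshall) computes all-pairs distances by dynamic programming. The graph encodings below are our own assumptions for illustration (city names would replace the letters; edge weights could be travel time or cost).

```python
import heapq

def dijkstra(adj, src):
    # adj: {node: [(neighbour, weight), ...]}; returns shortest
    # distances from src to every reachable node.
    dist = {src: 0}
    pq = [(0, src)]
    while pq:
        d, u = heapq.heappop(pq)
        if d > dist.get(u, float("inf")):
            continue  # stale queue entry
        for v, w in adj.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(pq, (nd, v))
    return dist

def floyd(nodes, edges):
    # edges: {(u, v): weight}; returns all-pairs shortest distances
    # keyed by (source, destination).
    INF = float("inf")
    d = {(u, v): 0 if u == v else edges.get((u, v), INF)
         for u in nodes for v in nodes}
    for k in nodes:
        for i in nodes:
            for j in nodes:
                if d[i, k] + d[k, j] < d[i, j]:
                    d[i, j] = d[i, k] + d[k, j]
    return d
```

In practice, Dijkstra suits on-demand queries from one city, while Floyd precomputes the full city-to-city advisory table.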
Space-time coding is a transmit diversity scheme, with optional receive diversity, used to achieve high data rates and improve the reliability of a wireless channel. Existing STBC schemes offer full rate and full diversity for four transmit antennas, but at high cost and high complexity due to the larger amount of feedback information. In this paper, a feedback-rotated QOSTBC technique is used to improve transmit diversity with one bit of feedback. The experimental results show that the bit error rate of feedback-rotated QOSTBC is better than that of the other techniques.
A Mobile Ad hoc Network (MANET) is an autonomous system of mobile stations connected by wireless links to form a network. It does not rely on a predefined infrastructure to keep the network connected, and is therefore also known as an infrastructure-less network. A well-designed protocol must provide scalable routing with better security. In this paper, we propose the location-based protocols Dynamic Remote Routing (DFR) and Dynamic Location Routing (DLR), which use location information and the distance between nodes as the routing metric. DLR uses anchored paths that are discovered and managed by sources, using one of two low-overhead protocols: Friend Aided Path Discovery and Geographical Map-based Path Discovery. The performance of these protocols is compared with the god-domain Ad hoc On-demand Distance Vector (AODV) routing protocol using the NS2 simulation software.
92 Porter Five Forces Analysis of the Leading Mobile Cellular Telephony Service Provider in India, Subhasish Majumdar, Partha Pratim Bhattacharya
In this paper, we discuss the Porter Five Forces model and present the mobile communication market scenario in India. We then analyze the intensity of competition among the different mobile phone service providers, and identify the forces that may challenge the existing market leader.
93 An Improved IDS Detection with Protection of Agent Collude Attacks, L. Devi, V. Santhiya
A data distributor may give sensitive data to a set of trusted agents (third parties), and some of the data may later be leaked and found in an unauthorized place. In existing work, watermarking is used to identify the leak, or realistic-looking fake records are injected into the data. We propose data allocation strategies that improve the probability of identifying leakages. As an enhancement, we also investigate agent guilt models that capture leakage scenarios.
94 Denial of Service Attacks in Wireless Networks: The Case of Jammers, L. Devi, A. Suganthi
Multiple-path source routing protocols distribute the total traffic among the available paths. In this article, we consider the problem of jamming-aware source routing, avoiding jamming by splitting the data rate among paths. We formulate this traffic allocation as a lossy network flow optimization problem using portfolio selection theory from financial statistics, and show that in multi-source networks this centralized optimization problem can be solved using a distributed algorithm based on decomposition in network utility maximization (NUM). We demonstrate the network's ability to estimate the impact of jamming and to mitigate it by redirecting packets or splitting the data rate. Finally, we allocate the traffic efficiently to maximize the overall throughput.
95 Fast IP Network Recovery Using MRC from Multiple Failures, L. Devi, M. Suganthi
The Internet plays a vital role in our communications infrastructure and in day-to-day activities such as online transactions and online shopping, yet it suffers from slow convergence of routing protocols after a network failure, which is a growing problem. To assure fast recovery from link and node failures, we present a recovery scheme called Multiple Routing Configurations (MRC). MRC recovers the network from single node/link failures, but does not handle multiple node/link failures. In this paper, we present MRC and analyze its performance with respect to load distribution after a failure. We also show how an estimate of the traffic demands in the network can be used to improve the distribution of the recovered traffic, and thus reduce the chance of congestion when MRC is used. We propose Enhanced MRC (EMRC) to support multiple node/link failures during data transmission in IP networks without frequent global re-convergence. By recovering from these failures quickly, data transmission in the network becomes fast.
96 Wireless Body Area Sensor System for Monitoring Physical Activities Using GUI, L. Devi, R. Nithya
Wireless Sensor Network (WSN) technologies are considered one of the key research areas in computer science and the healthcare industry. The sensors, supply chain, and communication technologies used within such a system, and the power they consume, depend largely on the use case and the characteristics of the application. Recent technological advances in sensors, low-power integrated circuits, and wireless communications have enabled the design of low-cost, miniature, lightweight, intelligent physiological sensor platforms that can be seamlessly integrated into a body area network for health monitoring. Wireless body area networks (WBANs) promise unobtrusive ambulatory health monitoring for extended periods of time and near-real-time updates of patients' medical records through the Internet. We designed a user interface that both addresses the needs of a research prototype WBAN and supports a deployed WBAN system; it must provide seamless and complete control over the WBAN. The authors conclude that, particularly for life-saving applications, thorough studies and tests should be conducted before WBANs are widely applied to humans, in order to develop robust detection and classification techniques that increase accuracy and hence the confidence to apply them without physician intervention.
97 Implementing & Developing Cloud Computing on Web Application, Anisha Tandon
This paper introduces Internet-based cloud computing, its characteristics, service models, and the deployment models in use today. We also discuss the benefits and challenges of cloud computing and the significance of flexibility and scalability in a cloud-based environment. We then focus on the issues and advantages of web- and cloud-based applications, point out the difficulties associated with dynamic updates for such applications, and lay out directions for future work.
98 A ZIG-BEE BASED WEARABLE HEALTH MONITORING SYSTEM, M. Sherwin Nayanar, V. Pandimurugan, V. Gowri
The design and development of a ZigBee-based smart, noninvasive, wearable health monitoring system with effective communication, including GSM technology, is reported in this paper. Health care plays an important role in human life; it involves care and regular updates about the patient. An ageing population will lead to increased healthcare costs, as care for the elderly is much more expensive than for other age groups. The physiological parameters are monitored using different sensors such as a temperature sensor, an impact sensor, and a heart rate sensor, which measure the vital signs of the human body. The LM35 series are precision integrated-circuit temperature sensors whose output voltage is linearly proportional to the Celsius (Centigrade) temperature; the LM35 is suitable for remote applications and is low cost due to wafer-level trimming. An accelerometer measures acceleration forces, which may be static, like the constant force of gravity pulling at our feet, or dynamic, caused by moving or vibrating the accelerometer. The heart rate sensor consists of an infrared LED transmitter and an infrared phototransistor receiver connected to a microcontroller and then to an LCD. ZigBee is reliable, supports a large number of nodes with very long battery life, and its low cost will help lower the cost of health monitoring both at home and in hospital.
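The LM35's linear 10 mV/°C characteristic mentioned above makes the reading-to-temperature conversion a one-line computation. A minimal Python sketch of that conversion follows; the sample voltage is illustrative only.

```python
LM35_V_PER_DEG_C = 0.010  # LM35 output scale: 10 mV per degree Celsius

def lm35_celsius(voltage_v):
    """Convert a measured LM35 output voltage (in volts) to degrees Celsius."""
    return voltage_v / LM35_V_PER_DEG_C

def celsius_to_fahrenheit(deg_c):
    """Standard Celsius-to-Fahrenheit conversion for display."""
    return deg_c * 9.0 / 5.0 + 32.0

reading = 0.37  # hypothetical ADC reading in volts
print(lm35_celsius(reading))  # ~37 degrees Celsius
```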
99 Application of the Fuzzy Logic in Content Based Image Retrieval using Color Feature, Aqeel M. Humadi, Hameed A. Younis
Content Based Image Retrieval (CBIR) is a set of techniques for retrieving semantically relevant images from an image database based on automatically derived image features. Generally, in CBIR systems, the visual features (color, texture, and shape) are represented at a low level. They are just rigid mathematical measures that cannot deal with the inherent subjectivity and fuzziness of people's understandings and perceptions (different people have different understandings and descriptions of the same visual content). As a result, there is a gap between low-level features and high-level semantics. To overcome this problem, we introduce a new system of visual feature extraction and matching using Fuzzy Logic (FL), a powerful tool for reasoning algorithms used to emulate human thinking and decision making in machines. Specifically, the color feature is widely used in content-based image retrieval because of its low computational cost and its invariance to scaling, translation, and rotation. The classic approach to color histogram creation results in very large 3-D histograms with large variations between neighboring bins; small changes in the image may thus produce great changes in the histogram, and manipulating and comparing 3-D histograms is a complicated and computationally expensive procedure. To overcome these problems, a new fuzzy system of color histogram creation, based on the L*a*b* color space, is applied, which links the three components of the L*a*b* color space using a fuzzy inference system and provides a one-dimensional histogram containing only 15 bins.
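The fuzzy binning idea described above can be illustrated with a triangular membership function: instead of dropping a color value into a single crisp bin, its mass is spread over neighbouring bins, so a small shift in the value causes only a small change in the histogram. This sketch uses a hypothetical 5-bin axis rather than the paper's 15-bin L*a*b* inference system.

```python
def triangular(x, a, b, c):
    """Triangular fuzzy membership function with feet a, c and peak b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def fuzzy_bin(value, centers):
    """Spread one observation across neighbouring histogram bins; the returned
    memberships sum to 1 for values between the first and last centers."""
    step = centers[1] - centers[0]
    return [triangular(value, c - step, c, c + step) for c in centers]

# Hypothetical 5-bin axis for one color component
print(fuzzy_bin(12.5, centers=[0, 10, 20, 30, 40]))  # mass split between bins 1 and 2
```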
100 VLSI Implementation and Analysis of Parallel Adders for Low Power Applications, Arunprasath S, Karthick A, Dhineshkumar D
The carry select adder (CSLA) is known to be the fastest of the conventional adder structures. Due to the rapidly growing mobile industry, not only faster arithmetic units but also smaller-area, lower-power arithmetic units are needed. A modified CSLA architecture has been developed using a Binary to Excess-1 Converter (BEC), and an efficient CSLA architecture has been developed using a D latch. In this paper, an analysis is made between the regular CSLA and the modified and efficient CSLA variants. The designs were developed in structural VHDL and synthesized in Altera Quartus II targeting the FPGA device EP2C35F672C6. Experimental results are compared in terms of area, power, delay and PDP, and show that the modified carry select adders are better in area and power consumption.
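The BEC trick can be modelled in a few lines: the carry-in-1 sum of a carry-select stage is just the carry-in-0 sum incremented modulo 2^n, so the duplicate adder of a regular CSLA can be replaced by a Binary to Excess-1 Converter. A behavioural Python sketch (not the paper's VHDL):

```python
def bec(value, n_bits):
    """Binary to Excess-1 Converter: increment an n-bit word, wrapping on overflow."""
    return (value + 1) & ((1 << n_bits) - 1)

def csla_stage(sum_cin0, carry_in, n_bits):
    """Carry-select stage: the cin=0 sum comes from a single ripple adder, the
    cin=1 alternative is derived from it by the BEC instead of a second adder,
    and the incoming stage carry selects between the two."""
    return bec(sum_cin0, n_bits) if carry_in else sum_cin0

print(csla_stage(0b1011, 1, 4))  # 11 with carry-in 1 -> 12
print(bec(0b1111, 4))            # all-ones wraps to 0
```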
101 Degraded Documents Recovering by Using Adaptive Binarizations and Convex Hull Concept, Mahendar R., Navaneetha Krishnan S., Roshini Ravinayagam, Prakash P.
Recovering the original content of degraded documents is a challenging task because text and images overlap. Furthermore, the image of a damaged book page often suffers from various scanning defects, such as scanning shading, dark border noise, and foreground problems. These defects degrade the quality of the scanned documents and cause many problems in the subsequent document image analysis. In this paper, we propose two techniques for recovering the original content of degraded documents. An adaptive binarization technique reduces these issues using adaptive pixel matching, a combination of the local image contrast and the local image gradient that is tolerant to the text and background variation caused by different types of document degradation. A convex hull algorithm proves outstandingly effective for image shading correction and dark border noise removal: it can restore the desired document, free of shading, and yield an illuminated surface of high quality.
102 A New Key Management Paradigm for Fast Transmission in Remote Co-operative Groups, M. Vijayakumar, V. Priya Dharshini, Dr. C. Selvan
Mobile ad hoc networks (MANETs) are an emerging technology widely used in many areas to achieve fast transmission and communication, but they cannot achieve fast transmission/broadcasting in remote areas. To overcome this problem, a new key management paradigm is used. In the proposed method, the nodes form groups, and in each group one node/system is selected, based on priority, to handle secret key distribution between sender and receiver and thereby speed up data transmission in the remote area. For each data transmission a secret key is generated and subsequently updated. Since attackers in the remote area may steal the data, protection against unauthorized parties is required; with the key-updating method, data is transmitted quickly, reliably and more securely. The new key management paradigm creates cooperative groups in the remote area whose computation overhead and communication cost are independent of group size. Using rekeying strategies, any number of member additions/deletions can be carried out efficiently, with strong security against collusion in the remote area.
Providing strong security is necessary for the real-time services of any wireless access network. WiMAX and LTE are the latest wireless broadband access networks; they support high data rates and mobility, and become increasingly important as WiMAX data LANs are deployed for business, government and military applications. However, free-space transmission introduces new opportunities for eavesdropping on wireless data communications. What makes it worse is that the sender and the intended receiver have no means of knowing whether the transmission has been intercepted, so the eavesdropping is virtually undetectable. Several papers treat security as a critical issue in the design and deployment of WiMAX networks, but they do not deal with real-time services in a real-time environment. The main contribution of this paper is to provide highly secure data transmission for real-time services in a real-time environment over WiMAX networks. For that, we introduce Virtual Private Networks (VPNs), which have emerged as an economical alternative for building private networks over current wireless networks. VPNs provide security by integrating a set of authentication, encryption, access control and session management components.
104 On Generating Permutations under User Defined Constraints, Dhruvil Badani
In this paper, a method to generate permutations of a string under a set of constraints decided by the user is presented. The required permutations are generated without generating all the permutations.
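One common way to generate constrained permutations without enumerating all of them is backtracking with pruning: a prefix that violates a constraint is abandoned before any of its permutations are produced. The paper's exact method is not reproduced here; the sketch below shows the general idea with a hypothetical "must not start with 'a'" constraint.

```python
def constrained_permutations(s, allowed):
    """Yield permutations of string s where allowed(pos, char, prefix) approves
    each placement. Pruning happens before a branch is expanded, so rejected
    prefixes never generate their permutations."""
    def backtrack(prefix, remaining):
        if not remaining:
            yield "".join(prefix)
            return
        for i, ch in enumerate(remaining):
            if allowed(len(prefix), ch, prefix):
                yield from backtrack(prefix + [ch], remaining[:i] + remaining[i + 1:])
    yield from backtrack([], list(s))

# Hypothetical constraint: the permutation must not start with 'a'
perms = list(constrained_permutations(
    "abc", lambda pos, ch, prefix: not (pos == 0 and ch == "a")))
print(perms)  # ['bac', 'bca', 'cab', 'cba']
```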
A visual cryptography scheme is a cryptographic technique which allows visual information (printed text, handwritten notes, pictures) to be encrypted in such a way that decryption can be performed by the human visual system, without the aid of computers. The performance of a visual cryptography scheme depends on various measures, such as pixel expansion, contrast, security, accuracy, computational complexity, whether the generated shares are meaningful or meaningless, the type of secret image (binary or color), and the number of secret images (single or multiple) encrypted by the scheme. The intent of this paper is the study and performance analysis of visual cryptography schemes on the basis of pixel expansion, number of secret images, image format, and the type of shares generated.
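As a concrete illustration of pixel expansion and contrast, the classic (2,2) scheme with pixel expansion 2 can be sketched in a few lines: a white secret pixel yields identical subpixel pairs in the two shares, a black pixel yields complementary pairs, and stacking the transparencies acts as a per-subpixel OR. A minimal model for a binary secret (this is the textbook scheme, not a construction from the paper):

```python
import random

def make_shares(secret):
    """(2,2) visual cryptography with pixel expansion 2.
    secret: list of bits (1 = black). Returns two shares of subpixel pairs."""
    s1, s2 = [], []
    for bit in secret:
        pair = random.choice([(0, 1), (1, 0)])
        s1.append(pair)
        # white: identical pairs (stacks to half black); black: complementary (stacks to all black)
        s2.append(pair if bit == 0 else (1 - pair[0], 1 - pair[1]))
    return s1, s2

def stack(s1, s2):
    """Stacking transparencies is a per-subpixel OR."""
    return [(a[0] | b[0], a[1] | b[1]) for a, b in zip(s1, s2)]

secret = [1, 0, 1]
a, b = make_shares(secret)
print(stack(a, b))  # black pixels appear as (1, 1); white pixels keep one white subpixel
```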
The mainstay of this project is to propose a mobile ward round system, based on openEHR standards, for use on Android smartphones and tablet computers, which integrates NFC to explore new ways of computer interaction, data processing and workflows in the medical world. With automatic patient identification via NFC on the mobile device, physicians can easily view recent ward round results and edit or add information without manually selecting the patient from a list. Relying on the openEHR standard, physicians are no longer limited to a fixed ward round document but can define their own ward round templates by means of openEHR templates and archetypes.
107 Trust Based Voting Scheme and Optimal Multipath Routing for Intrusion Tolerance in Wireless Sensor Network, P. Priyadharshini, C. Anoor Selvi
In wireless sensor networks (WSNs) deployed in unattended environments, energy recharging is difficult. A WSN must satisfy application-specific QoS requirements (reliability, timeliness, security) while minimizing energy consumption to prolong the useful system lifetime with limited resources. The drawbacks of existing work include a redundancy management scheme that does not address heavy query traffic, and ambiguity in multipath routing decisions due to a higher level of intrusion tolerance. The proposed work presents a trust-based neighbor weighted voting scheme to strengthen intrusion detection in WSNs. It evaluates the dynamic radio range of neighbor nodes, and a weight threshold is evaluated to mark each sensor node as normal or malicious. The scheme discards the communication of internal malicious nodes by identifying the lower-weight votes of the corresponding sensor nodes, and governs the best WSN settings in terms of the redundancy level used for multipath routing, the number of weighted votes, and the intrusion invocation interval. WSN lifetime is maximized with trust-based weighted voting, and concurrent heavy query traffic is handled.
The growing impact of smartphones motivates the development of a mobile application for educational Guidance and Counseling (GC) at a university, called "UOC". The application provides guidance and counselling services on mobile devices and is designed specifically for college and university students. The method combines interactive multimedia approaches with educational psychology; hence, the design process digitizes the educational GC service material, visualizes it wisely, and makes it interactive. The application provides effective online counselling interaction between a student and a counselling staff member regarding the student's academic and personal successes, drawbacks, problems and feedback.
109 XML Dissemination Scheme for Mobile Computing Based on Lineage Encoding, K. Anusree, Mrs. D. Usha, C. Shiny Jennifer
In wireless environments, broadcasting is an efficient and scalable method of disseminating information to a massive number of clients. We propose an energy- and latency-efficient XML dissemination scheme for wireless mobile computing environments. This paper presents a novel unit structure called the G-node for streaming XML data in a wireless system. It applies the benefits of structure indexing and attribute summarization, integrating related XML elements into a group and providing a path for selective access to their attribute values and text content. The G-node structure removes the structural overhead of XML documents and enables clients to avoid downloading unwanted data during query processing. We also introduce a lightweight and effective encoding technique, called Lineage Encoding, to support the evaluation of predicates and twig pattern queries over the wireless stream. Lineage Encoding expresses the parent-child relationships among XML elements as a sequence of bit-strings, called Lineage Codes (Lineage Code(V), Lineage Code(H)), and provides basic operators and functions for efficient twig pattern query processing at the client side.
The process of prototyping a High Performance Computing platform for robotics is discussed in this paper. The Cloud-based platform gives the advantages of practically unlimited resources, dynamic scalability and better resource utilisation. The different elements and technologies involved in a Cloud are presented, and the building process is explained. For performance, the Cloud built with OpenStack middleware is compared with KVM using hardware-based virtualization.
Propagation path loss greatly impacts the quality of service of a mobile communication system. To establish any mobile communication system, the basic task is to foresee the coverage of the proposed system, and the accurate determination of propagation path loss leads to the efficient design and operation of quality networks. Many different approaches have been developed over the years to predict coverage using what are known as propagation models. However, such models, no matter how accurate, will result in co-channel interference and wasted power when they are used in environments for which they were not developed, so the best option is to perform site-specific measurements. This paper presents a measurement-based path loss model built from experimental data collected in Aba urban, South-East Nigeria. Received Signal Strength (RSS) measurements were gathered in Aba from the GlobalCom Limited (GLO) network operating at 900 MHz. The measurements were used to develop a path loss model for the urban environment; the result shows that the path loss for the measurement environment increases by 3.10dB per decade.
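A measurement-based model of this kind is typically expressed in log-distance form, PL(d) = PL(d0) + 10 n log10(d/d0), where n is the path loss exponent fitted to the RSS data. The sketch below evaluates that formula; the reference loss and exponent shown are illustrative placeholders, not the values measured in Aba.

```python
import math

def path_loss_db(d_m, pl_d0_db, n, d0_m=1.0):
    """Log-distance path loss model: PL(d) = PL(d0) + 10 * n * log10(d / d0)."""
    return pl_d0_db + 10.0 * n * math.log10(d_m / d0_m)

# Placeholder parameters: 31.5 dB reference loss at 1 m, exponent n = 3.1
print(path_loss_db(100.0, 31.5, 3.1))  # 31.5 + 62.0 = 93.5 dB at 100 m
```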
112 Analysis of Lossy Hyperspectral Image Compression Techniques, Dr. S. M. Ramesh, P. Bharat, J. Anand, J. Anbu Selvan
Graphics Processing Units (GPUs) are becoming a widespread tool for general-purpose scientific computing, and are attracting interest for future on-board satellite image processing payloads due to their ability to perform massively parallel computations. This paper describes the GPU implementation of an algorithm for on-board lossy hyperspectral image compression and proposes an architecture that accelerates the compression task by parallelizing it on the GPU. The selected algorithm is amenable to parallel computation owing to its block-based operation, and has been optimized here to facilitate GPU implementation while incurring negligible overhead with respect to the original single-threaded version. In particular, a parallelization strategy has been designed for the compressor, which is implemented on a GPU using MATLAB. Experimental results on several hyperspectral images with different spatial and spectral dimensions are presented, showing significant speed-ups with respect to a single-threaded CPU implementation. These results highlight the significant benefits of GPUs for on-board image processing, and particularly image compression, demonstrating the potential of GPUs as a future hardware platform for very-high-data-rate instruments.
113 VLSI Architecture for Implementing Kaiser Bessel Window Function Using Expanded Hyperbolic CORDIC Algorithm, M. Mohana Arasi, J. Anand, P. Bharat, J. Anbu Selvan
Windowing techniques have been widely used for preprocessing samples before the fast Fourier transform (FFT) in real-time spectral analysis, to minimize spectral leakage and the picket fence effect. Among the popular window functions, the Kaiser-Bessel window is an obvious choice for its better spectral characteristics. In this paper, a CORDIC (CO-ordinate Rotation Digital Computer) based VLSI architecture for implementing the Kaiser-Bessel window is proposed for real-time applications. A parallel pipelined technique has been adopted in the present design to ensure high throughput, and various architectural design and implementation issues are discussed. Physical synthesis for ASIC implementation of the proposed architecture, using the Synopsys Design Compiler (Design Vision) and a commercially available 0.18 µm CMOS process, yields a core area of 52 mm2 and a worst-case dynamic power of 890 mW at an operating frequency of 400 MHz and a voltage of 1.8 V.
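The window itself is straightforward to compute once the zeroth-order modified Bessel function I0 is available; its power series converges quickly. Below is a pure-Python reference model of the Kaiser-Bessel window (a software sketch, not the paper's CORDIC datapath, which evaluates these functions in hardware); the beta value is an illustrative choice.

```python
import math

def bessel_i0(x, terms=25):
    """Zeroth-order modified Bessel function of the first kind, by its series:
    I0(x) = sum_k ((x/2)^k / k!)^2, accumulated with a term recurrence."""
    total, term = 1.0, 1.0
    for k in range(1, terms):
        term *= (x / (2.0 * k)) ** 2
        total += term
    return total

def kaiser_window(n_points, beta):
    """Kaiser-Bessel window: w[n] = I0(beta*sqrt(1 - (2n/(N-1) - 1)^2)) / I0(beta)."""
    denom = bessel_i0(beta)
    return [bessel_i0(beta * math.sqrt(1.0 - (2.0 * n / (n_points - 1) - 1.0) ** 2)) / denom
            for n in range(n_points)]

w = kaiser_window(9, beta=8.6)
print(round(w[4], 6))  # 1.0: the center tap; edges taper toward 1/I0(beta)
```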
114 Copyright Protection through Deep Packet Inspection - An Indian Perspective, Priyanka Jain, Jai Sachith Paul, Prof. K. Pradeep Kumar
The beginning of the 21st century saw file sharing of copyrighted materials emerge as a major threat to the established business models of the content industry. Intellectual property rights in the real world hold true in the virtual world as well. This paper tries to understand the various issues of copyright violation in India, and introduces deep packet inspection, a method for analyzing traffic in real time, as an effective solution.
The performance of LDPC decoding is strongly affected by finite-precision issues in the representation of inner variables. Great attention has been paid to the topic of quantization for LDPC decoders, but mostly focusing on binary modulations and analyzing finite-precision effects in a disaggregated manner, i.e., considering each block of the receiver separately. Modern telecommunication standards, instead, often adopt high-order modulation schemes, e.g. M-QAM, with the aim of achieving large spectral efficiency. This poses additional quantization problems that have been little debated. We study the choice of suitable quantization characteristics for both the decoder messages and the received samples in LDPC-coded systems using M-QAM schemes. The analysis also involves the demapper block that provides the initial likelihood values for the decoder, relating its quantization strategy to that of the decoder. A new demapper version based on approximate expressions is also presented; it introduces a slight deviation from the ideal case but yields a low-complexity hardware implementation. A relevant issue concerns the comparison between the error-rate performance achievable with LDPC codes and that ensured by other schemes employing SISO decoding. Moreover, modern broadcast communications are characterized by increasing throughput requirements; for example, the DVB-T2 standard must support High Definition Television (HDTV) services. Another issue in broadcast transmissions concerns the complexity of the decoder implementation, which can be reduced by introducing suitable approximations. The current scenario of error-correcting codes is dominated by schemes using Soft-Input Soft-Output (SISO) decoding; among them, an important role is played by Low-Density Parity-Check (LDPC) codes, which approach the theoretical Shannon limit while ensuring reduced complexity.
116 FACE RECOGNITION BASED ATTENDANCE MARKING SYSTEM, K. Senthamil Selvi, P. Chitrakala, A. Antony Jenitha
Automatic face recognition (AFR) technologies have seen dramatic improvements in performance over the past years, and such systems are now widely used for security and commercial applications. We present an automated system for human face recognition against a real-time background, for a college to mark the attendance of its employees; Smart Attendance using Real-Time Face Recognition is thus a real-world solution for the day-to-day activity of handling employees. The task is difficult, as real-time background subtraction in an image is still a challenge (6). Faces are detected in real time, and a simple, fast Principal Component Analysis is used to recognize the detected faces with a high accuracy rate; the matched face is used to mark the attendance of the employee. Our system maintains the attendance records of employees automatically. Manually entering attendance in logbooks is a difficult task and wastes time, so we designed an efficient module that uses face recognition to manage the attendance records of employees. Our module enrols the staff's faces (3); enrolment is a one-time process, and the faces are stored in the database. Each employee's roll number serves as a unique employee ID, and the presence of each employee is updated in a database. The results showed improved performance over a manual attendance management system. Attendance is marked after employee identification. This product gives accurate results in a user-interactive manner, improving over existing attendance and leave management systems.
117 Monitoring Factory Machine Status from Remote Location using GSM Technologies, C. Shiny Jennifer, B. V. Baiju, K. Anusree
The proposed project continuously monitors the switching status of the machine, and the data is stored in memory with date and time. This data is sent to a dedicated Android tiny database, and at the same time a message is sent to the user's mobile through GSM. The project provides communication between the machine and an Android mobile; since Android is open source, users can build their own applications according to their requirements. If the user wants to access the data, he logs in to the dedicated Android app and presses the fetch button to view the status of the machine. Finally, if the user wants to view the production details as a graphical representation, he presses the generate button, and the entire production details are shown as a graphical report that can be discussed with officials. This helps the user view and analyze the production details wherever they move, so that the right person can take the right decision at the right time without depending on any supervisor's budget report.
118 EFFICIENT GRIDDING AND SEGMENTATION FOR MICROARRAY IMAGES, P. Thamaraimanalan, D. Dhinesh kumar, K. Nirmalakumari
This work presents a new, efficient gridding and segmentation approach for microarray images. Initially, the microarray images are pre-processed using the Stationary Wavelet Transform (SWT), followed by a hard-thresholding filtering technique, to obtain a de-noised microarray image. Then we use autocorrelation to enhance the self-similarity of the image profile and obtain efficient gridding. Thresholding is used for segmentation; combining global and local thresholding improves the segmentation accuracy, as seen in the improvement of the log-intensity ratio. The proposed approach was evaluated using images from the Stanford Microarray Database and proved more accurate in intensity computation and a more reliable means of estimating gene expression than conventional methods.
119 Secure Token Based Storage System to Preserve the Sensitive Data Using Proxy Re-Encryption Technique, A. Jeeva, Dr. C. Selvan, A. Anitha
In the cloud computing environment, storing sensitive data is a difficult task. The cost of preserving privacy is high when all sensitive data is encrypted, and fully encrypted data does not perform well in cloud applications, which makes preserving sensitive data in the cloud challenging. We therefore analyze which data needs to be encrypted and which does not, split the data into parts, and store the parts in different cloud environments. Each part of the data set carries a token, and the storage server identifies the data using token keys. A proxy re-encryption technique is used: when the client encrypts the data before outsourcing it to the cloud server, the link between the client and the cloud server proxy is encrypted using proxy re-encryption. This preserves the privacy of the data against attackers.
The unattended nature of wireless sensor networks invites the mobile replica node attack: an adversary can capture and compromise sensor nodes, make replicas of them, and then mount a variety of attacks with these replicas. Replica node attacks are dangerous because they allow the attacker to leverage the compromise of a few nodes to exert control over much of the network. Previous work on replica detection relies on fixed sensor locations and hence does not work in mobile sensor networks. The proposed work is a fast and effective mobile replica node detection scheme using the Sequential Probability Ratio Test.
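The SPRT decision rule can be sketched as a running log-likelihood ratio with two thresholds derived from the desired false-positive and false-negative rates. The Bernoulli rates and the speed-based observation below are simplifying assumptions for illustration, not the paper's exact model: a claimed speed above the maximum node speed is only plausible when one identity appears in two places.

```python
import math

def sprt_replica_test(speeds, v_max, alpha=0.01, beta=0.01, p0=0.1, p1=0.9):
    """Sequential Probability Ratio Test sketch for mobile replica detection.
    Each observation is 1 if a claimed node speed exceeds v_max, else 0; p0/p1
    are assumed Bernoulli rates under the benign and replica hypotheses."""
    accept_h1 = math.log((1.0 - beta) / alpha)   # upper threshold: declare replica
    accept_h0 = math.log(beta / (1.0 - alpha))   # lower threshold: declare benign
    llr = 0.0
    for v in speeds:
        x = 1 if v > v_max else 0
        llr += math.log(p1 / p0) if x else math.log((1.0 - p1) / (1.0 - p0))
        if llr >= accept_h1:
            return "replica"
        if llr <= accept_h0:
            return "benign"
    return "undecided"  # keep sampling: neither threshold crossed yet

print(sprt_replica_test([12.0, 15.0, 11.0], v_max=10.0))  # repeated overspeed -> replica
```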
121 A Supervised Method for Multi-keyword Web Crawling on Web Forums, A. Gowtham, Dr. K. Deepa
Web forums are used by a large number of users to post and share comments with other users of various websites. A forum consists of many lists of topics on its boards, with a large list of threads on each board, and users can create threads and share their views in posts. In this paper, a supervised multi-keyword web forum crawler is proposed to crawl relevant content from forum pages with reduced delay. All forums on the web have navigation paths that lead to the forum threads, and these paths are connected by specific types of URLs; the proposed method therefore recognizes the various URLs using regular expression patterns within the forum. Accurate page classifiers trained on other forums can be used to classify the regular expression patterns and detect the URLs. The obtained results show that the proposed method is more reliable and accurate than other existing methods.
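The URL-type recognition step can be illustrated with regular expression patterns: once the patterns for board and thread URLs of a forum are learned, each outgoing link is routed along the navigation path. The patterns below are hypothetical examples for a made-up forum layout, not patterns learned by the paper's classifier.

```python
import re

# Hypothetical URL patterns for one forum layout; a real crawler learns these
# per forum with a trained page classifier.
BOARD_RE = re.compile(r"/forum/board/\d+$")
THREAD_RE = re.compile(r"/forum/thread/\d+(\?page=\d+)?$")

def classify_url(url):
    """Route a URL along the board -> thread navigation path."""
    if THREAD_RE.search(url):
        return "thread"
    if BOARD_RE.search(url):
        return "board"
    return "other"

print(classify_url("http://example.com/forum/thread/42?page=2"))  # thread
```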
122 Improvement of Email Summarization Using Statistical Based Method, Mithak I. Hashem
Automatic text summarization is the subject of wide research and is gaining importance as the availability of online information increases. Email is one of the most important online tools that many of us depend on in everyday life, and email summaries may be crucial for many users. In this research we treat an email text as a single document. Text summarization can be classified into two approaches, extraction and abstraction; this research focuses on the extractive one, whose goal is sentence selection. Our proposed method assigns each sentence a numerical (statistical) measure, called the sentence score, and then selects the best sentences to include in the email summary. The most important step in summarization by extraction is the identification of important features. In our experiment, we used 130 test email texts from the Enron_Sent_Mail_Sample data set. Each email document is prepared by a preprocessing stage: sentence segmentation, tokenization, stop-word removal, and word stemming. Then we use 7 important features and calculate their scores for each sentence. The results show that the best average similarity with the reference (gold) summary was obtained by our method.
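One of the simplest sentence-scoring features of this kind is summed content-word frequency. The sketch below implements just that single feature (the paper combines 7 features and adds stemming), with a toy stop-word list; the sample text is invented.

```python
import re
from collections import Counter

STOP_WORDS = {"the", "a", "an", "is", "to", "of", "and", "in", "on", "for", "it"}

def summarize(text, k=1):
    """Extractive summary sketch: score each sentence by the summed corpus
    frequency of its content words, keep the top k, preserve original order."""
    sentences = [s.strip() for s in re.split(r"(?<=[.!?])\s+", text) if s.strip()]
    words = [w for w in re.findall(r"[a-z']+", text.lower()) if w not in STOP_WORDS]
    freq = Counter(words)

    def score(sentence):
        return sum(freq[w] for w in re.findall(r"[a-z']+", sentence.lower())
                   if w not in STOP_WORDS)

    ranked = sorted(sentences, key=score, reverse=True)[:k]
    return [s for s in sentences if s in ranked]

text = ("Email summarization saves reading time. "
        "The weather was fine. "
        "Summarization of long email threads helps busy users read email faster.")
print(summarize(text, k=1))
```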
123 Intrusion Detection, Secure Protocol & Network Creation for Spontaneous Wireless Ad Hoc Network, Nikhil Varghane, Prof. Bhakti Kurade, Prof. Chandradas Pote
A fundamental aspect of wireless network creation and communication is security, so this paper proposes a secure protocol for spontaneous wireless ad hoc networks that uses a hybrid public/private key scheme and the trust between users to exchange the initial data and the secret keys that will encrypt the data. The protocol covers network creation, protocol messages, and network-management communication. We present a self-configuring secure protocol able to create the network and share secure network services without any previous setup: the network allows users to share resources and offer new services in a secure environment, and the protocol contains all the functionality required to operate without any outside support. Our proposal has been implemented in order to test the protocol's procedures and operation. Finally, we compare the protocol with other spontaneous ad hoc network protocols to highlight its features, and we provide a security analysis of the system.
124 DESIGN PATTERN BASED ANALYSIS IN MULTI-AGENT FRAMEWORK, A.Sindhuja, Rathi.P.R, Anis Fathima.A.R, A.P.Jeyasanthini
A conventional machine or program embodies a static solution to a problem, where the inputs and outputs are fixed. When a machine receives inputs at run time, the program must adapt to new situations, which requires intelligence or knowledge developed and implemented by agents. A single agent running in an environment may identify only some of the states, but when multiple agents are running, each individual state can be judged more easily. Hence, in our proposed system a multi-agent environment is created for a health-based system, with each set of actions performed concurrently by an individual agent. The actions identified by each agent are communicated through agent communication languages, a solution for the given problem is justified using an evolutionary algorithm, and the solution is stored as a pattern for further use. When a solution for a problem already exists in the knowledge base, it can be identified through pattern-based ontology applied to the multi-agent environment, which may decrease the time needed to find new solutions.
There are several diseases that cause changes in speech; larynx cancer is one of them. Cancer is best diagnosed non-invasively and treated in its early stages, and acoustic voice analysis is an effective, non-invasive method for evaluating and detecting laryngeal pathologies. In this paper we collected a wide variety of voice samples, including sustained vowels, words, and sentences compiled from a set of speaking exercises for people with laryngeal diseases. We use multiple sound recordings of a single subject and compute central-tendency and dispersion metrics, which improves the generalization of the predictive model. The Dynamic Time Warping algorithm is used to find the similarity between two voice samples.
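The standard Dynamic Time Warping recurrence can be sketched as follows (absolute difference is assumed as the local cost; the abstract does not specify the cost function used):

```python
def dtw_distance(a, b):
    """Classic dynamic-time-warping distance between two sequences.

    Fills the (n+1) x (m+1) cumulative-cost table where each cell
    adds the local cost to the cheapest of its three predecessors
    (match, insertion, deletion).
    """
    n, m = len(a), len(b)
    INF = float("inf")
    d = [[INF] * (m + 1) for _ in range(n + 1)]
    d[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            d[i][j] = cost + min(d[i - 1][j],      # deletion
                                 d[i][j - 1],      # insertion
                                 d[i - 1][j - 1])  # match
    return d[n][m]
```

Because the warp path may repeat elements, a sequence compared against a time-stretched copy of itself still scores zero, which is exactly what makes DTW suitable for comparing utterances spoken at different rates.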
126 Traveler Guide using GPS, Prashant Beldar, Prashant Bansode, Rajendra Mane, Swapnil Gaikwad
In this paper, an Android-based mobile application is presented to guide tourists and daily commuters travelling in Mumbai city. The application informs the tourist about the nearest bus stop and all details of buses arriving at and departing from that stop. It is designed for Mumbai, where the BEST bus service carries people to different places across the city. The application obtains the user's current location through GPS as longitude and latitude and sends this information to the server. The server replies with the nearby bus stand along with bus details such as bus number, route, source, and destination.
127 TIMETABLE GENERATION SYSTEM, Anuja Chowdhary, Priyanka Kakde, Shruti Dhoke, Sonali Ingle, Rupal Rushiya, Dinesh Gawande
This project introduces a practical timetabling algorithm, used in an automated timetabling system, that handles both strong (hard) and weak (soft) constraints effectively. Each teacher and student can view their timetable once it is finalized for a given semester, but cannot edit it. The Timetable Generation System generates a timetable for each class and teacher in keeping with the availability calendar of teachers, the availability and capacity of physical resources (such as classrooms, laboratories, and computer rooms), and the rules applicable at the class, semester, teacher, and subject levels.
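A hard-constraint check of the kind such an algorithm must perform on every candidate assignment can be sketched as follows (the tuple layout is an illustrative assumption, not the system's data model):

```python
def violates_hard_constraints(timetable, entry):
    """Hard constraints: no teacher, room, or class may be
    double-booked in the same slot.

    `timetable` is a list of accepted (slot, teacher, room, cls)
    tuples; `entry` is the candidate assignment being tested.
    """
    slot, teacher, room, cls = entry
    for s, t, r, c in timetable:
        if s == slot and (t == teacher or r == room or c == cls):
            return True
    return False
```

Soft constraints (e.g. a teacher's preferred slots) would instead contribute a penalty score, letting the generator accept a timetable that merely minimizes violations rather than forbidding them outright.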
128 Smart Remote Health Care Data Collection Server, Kalyani Bangale, Karishma Nadhe, Nivedita Gupta, Swati Singh Parihar, Gunjan Mankar
This paper presents a method to secure a data-collection server by protecting and developing backups for a health-care cloud. The objectives of the Smart Remote Health Care Data Collection Server (SRHDCS) are an auto-response server, better solutions for data backup and restore using the cloud, remote availability of data through safer, protected data transmission, and keeping the confidentiality of data intact. The server collects data and sends it to a centralized repository in a platform-independent format without any network considerations; the central repository is also a source from which other vendors and departments can draw information for their specific requirements. The purpose of SRHDCS is to help users (basically administrators) collect information from any remote location, even when network connectivity is not available at that point in time.
129 LABVIEW Based Module For Bio Signals Monitoring, Sabeetha Begum.M, Kumarnath.J
This project proposes a miniaturized FPGA interfacing module with a ZigBee transceiver, offering a powerful system that monitors human heartbeat and EEG signals. The existing system uses a NIOS II embedded processor and MATLAB; instead, we use the LabVIEW platform to simulate the response signal, which reduces the hardware complexity of the system. The main aim of software in a monitoring system is data visualization and analysis. Analog and digital circuits are integrated on the system, with the FPGA hardware and LabVIEW cooperating; the FPGA is programmed by a software module. When the monitor detects an unusual reaction, the information is sent to the doctor via a recorded voice, and the ZigBee transceiver supports efficient treatment of the patient. The analog section comprises an amplifier, an analog-to-digital converter, and a signal-conditioning unit: signal capture and amplification are realized by the analog circuit, whereas signal processing, signal analysis, and man-machine interface control are implemented on the FPGA digital platform. In addition, to pursue high performance and good expandability, LabVIEW is employed and system tasks are partitioned between hardware and software. The system is mainly intended for brain-death detection in comatose patients.
130 Review on MANET: Characteristics, Challenges, Imperatives and Routing Protocols, Mahima Chitkara, Mohd. Waseem Ahmad
Nowadays, with the rapid proliferation of lightweight wireless devices such as laptops, wireless telephones, and wireless sensors, the potential and importance of nomadic computing, particularly mobile ad hoc networking, have become apparent. A mobile ad hoc network, or MANET, is an infrastructure-less temporary network formed by a set of wireless mobile hosts that have no central administration and establish their own network dynamically. The network environment adds extra complications, such as frequent topology changes caused by node mobility, as well as the unreliability, resource constraints, and bandwidth limitations of wireless channels. A number of protocols have been proposed in the literature for efficient routing in MANETs, and because of the dynamic topology, this paper concentrates on the routing techniques that are the most challenging issue in today's scenario. Different strategies have been proposed for efficient routing, each claiming improved performance, which makes it quite difficult to determine which protocol suits which network conditions. This paper provides an overview of the different routing protocols proposed in the literature.
131 Review of Techniques for Detecting Video Forgeries, Aniket Pathak, Dinesh Patil
In today's fast-paced life, the role of multimedia has grown considerably, and with it the use of digital images and videos. With advanced digital video processing technologies available, videos of many kinds are produced from different perspectives. In court hearings, videos from digital cameras are now accepted as evidence, so there is growing interest in forensic analysis of video content, where the integrity of digital images and videos must be checked. It has therefore become essential to analyze whether a particular video is an original or one that has been tampered with using some technique. As video editing techniques grow more sophisticated, modified videos are hard to detect; however, when a video is modified, some of its basic properties change, and detecting those changes requires complex video processing techniques and algorithms. In this paper we review the existing methods used to determine whether a video is genuine.
132 QUASI RESONANT BUCK CONVERTER FOR DUAL STRING BUCK LED DRIVER, Vijayadevi.A, Sneha Prem, Jikku Mathew, Manikandan.K, Sajla Rehman
The project presents the digital simulation of a Quasi Resonant Converter for driving multiple LED strings using MATLAB Simulink. The Quasi Resonant Converter (QRC) is fast replacing conventional PWM converters in high-frequency operation. The salient feature of the QRC is that the switching devices can be switched on at zero voltage or switched off at zero current, so that switching losses are ideally zero. It adopts a suitable PWM switching method using resonance. The QRC-based, digitally controlled, dual-output buck switching LED driver operates in Discontinuous Conduction Mode (DCM) to reduce the input current ripple, and the scheme extends to driving multiple outputs. Based on the time-multiplexing control scheme in DCM, a theoretical upper limit on the total number of outputs in a buck switching LED driver can be derived analytically for various backlight LED current values. The PWM gate pulses are generated using an active current-summation technique and are used to regulate the LED current accurately. The output of the QRC is regulated by varying the switching frequency of the converter. The proposed scheme eliminates the series current-regulation element present in conventional LED drivers, which greatly improves efficiency and reduces cost.
133 Lossless Compression for Compound Documents Based on Block Classification, A.Thamarai Selvi, M.Sambath, S.Ravi
Image and video compression reduce the number of bits needed to represent the original content. Compressing scanned or compound documents is more difficult than compressing ordinary images because they mix text, pictures, and graphics, and the main requirement for compound documents is the quality of the decompressed data; here, quality means achieving a high compression ratio without visible degradation, which matters for both storage and transmission of the document. Reducing the storage size while keeping lossless quality is a challenging task. The proposed method is a block-based compression scheme for scanned documents (and also video). This paper presents a MATLAB implementation built on a pattern-matching algorithm. H.264/Advanced Video Coding (AVC) serves as a high-quality document compressor even for single- and multi-page documents. The segmented blocks of data are the inputs; each block is matched against previously seen patterns, then H.264/AVC applies the integer transform and the result is encoded using CABAC. The compressed images or scanned documents are visually lossless with a high compression ratio compared to previous standards. We describe the desired features and how they can be implemented using AVC; the proposed encoder is efficient for transform encoding of the residual data.
Nowadays, a large volume of data is produced by various sources such as social media networks, sensory devices, and other information-serving devices. This large collection of unstructured and semi-structured data is called big data. Conventional databases and data warehouses cannot process it, so new data-processing tools are needed; Hadoop addresses this need. Hadoop is an open-source platform that provides distributed computing over big data. It is composed of two components: a storage model called the Hadoop Distributed File System and a computing model called MapReduce. MapReduce is a programming model for handling large, complex tasks in two steps, map and reduce. In the map stage the master node partitions the problem into sub-problems and distributes the tasks to worker nodes; the worker nodes pass their results back to the master node after solving them. In the reduce phase the master node combines the answers to the sub-problems into a final solution.
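The map and reduce steps above can be sketched in miniature with the classic word-count example (plain Python standing in for Hadoop's distributed runtime):

```python
from collections import defaultdict

def map_phase(documents):
    """Map step: each 'worker' emits (word, 1) pairs for its chunk."""
    for doc in documents:
        for word in doc.split():
            yield word, 1

def reduce_phase(pairs):
    """Reduce step: partial results are merged per key into totals."""
    totals = defaultdict(int)
    for key, value in pairs:
        totals[key] += value
    return dict(totals)

# Each input string plays the role of a split handed to one mapper.
counts = reduce_phase(map_phase(["big data", "big hadoop"]))
```

In real Hadoop the shuffle between the two phases groups pairs by key across machines; here the grouping happens inside `reduce_phase`.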
135 Privacy in Map Reduce Based Systems: A Review, Rosmy C Jose, Shaiju Paul
Today, every organisation generates and adds huge amounts of data to the cloud. This vast amount of data, which cannot be effectively captured, processed, and analysed by traditional database and search tools, is called big data. Processing big data is made possible by MapReduce, a programming model and associated implementation introduced by Google. MapReduce processes data located at different data nodes: it pushes computation to where the data resides rather than the opposite. As a result, the MapReduce framework or source code may leak sensitive data during computation. In the current implementation (Airavat), the Mapper code is written by the user and the Reducer code is selected from a list provided by the system; if both codes were supplied by the system itself, usability would be low. Therefore, in the proposed system both the Map and Reduce codes can be written by the user, so usability will be high. The computation system ensures that privacy leaks through storage channels (network connections, files) and through the output of the computation are stopped: SELinux is used to prevent storage-channel leaks, and leaks through computation outputs are checked using differential privacy mechanisms.
MANET is a wireless network technology increasingly used in many applications. These networks are more vulnerable to attack than wired networks, and since they have different characteristics, conventional security techniques are not directly applicable to them. Intrusion detection systems (IDS) are one of the most active fields of research for mobile ad hoc networks (MANETs), and researchers currently focus on developing new prevention, detection, and response mechanisms for them. Packet dropping has always been a major threat to security in MANETs. In this work, we propose and implement a new intrusion-detection system, Elliptic Curve Cryptography Based Enhanced Adaptive ACKnowledgment (ECC-EAACK), specially designed for MANETs, and compare it against other popular mechanisms in different scenarios through simulations. The results demonstrate positive performance against Watchdog, TWOACK, and AACK in the cases of receiver collision, limited transmission power, and false misbehavior reports: ECC-EAACK achieves higher malicious-behavior detection rates in these circumstances while not greatly affecting network performance.
Vehicular ad hoc networks (VANETs) adopt the Expedite Message Authentication Protocol (EMAP) and Certificate Revocation Lists (CRLs) for their security. In any EMAP system, a received message is authenticated by checking whether the sender's certificate is included in the current CRL and by verifying the authenticity of the sender's certificate and signature. In this paper, we propose a Message Authentication Acceleration (MAAC) protocol for VANETs, which replaces the time-consuming CRL checking process with an efficient revocation check. The revocation check uses a keyed Hash Message Authentication Code (HMAC), where the key used in calculating the HMAC is shared only between non-revoked On-Board Units (OBUs). The MAAC protocol uses a novel probabilistic key distribution that enables non-revoked OBUs to securely share and update a secret key. Security analysis and performance evaluation demonstrate that the MAAC protocol is secure and efficient.
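The revocation-check idea can be sketched with the standard HMAC primitive; this shows only the keyed-check concept, not the MAAC key-distribution protocol itself:

```python
import hashlib
import hmac

def revocation_tag(shared_key: bytes, message: bytes) -> bytes:
    """HMAC over the message, keyed with the group key that is
    shared only among non-revoked OBUs."""
    return hmac.new(shared_key, message, hashlib.sha256).digest()

def check_sender_not_revoked(shared_key: bytes, message: bytes,
                             tag: bytes) -> bool:
    """A valid tag proves the sender knew the current group key,
    i.e. it was not revoked when the key was last updated.
    compare_digest avoids timing side channels."""
    return hmac.compare_digest(revocation_tag(shared_key, message), tag)
```

A revoked OBU holds only an old key, so its tags fail this check without the receiver ever consulting a CRL.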
Algorithmic parameterization and dedicated hardware architectures can ensure secure transmission of multimedia data in resource-constrained environments such as wireless video-surveillance networks, telemedicine frameworks for distant health-care support in rural areas, and Internet video streaming. Joint multimedia compression and encryption can significantly reduce the computational requirements of video processing systems: it lowers the cost of encryption while preserving the properties of compressed video (useful for scalability, transcoding, and retrieval) that naive encryption would destroy. In this system, two compression blocks are presented for video coding: a modified frequency transform called the Secure Wavelet Transform (SWT) and a modified entropy-coding scheme called Chaotic Arithmetic Coding (CAC), used for video encryption. Experimental results are shown for selective encryption using the proposed schemes. The SWT has rational coefficients, which allows a high-throughput hardware implementation on fixed-point arithmetic. In CAC, a large number of chaotic maps can be used to perform coding, each achieving Shannon-optimal compression performance.
139 Novel Techniques for Color and Texture Feature Extraction, Miss. Priyanka N.Munje, Prof. Deepak Kapgate, Prof. Snehal Golait
Content-based image retrieval (CBIR) is a challenging problem due to the large size of image databases, the difficulty of recognizing images, the difficulty of devising a query and evaluating results across the semantic gap, the computational load of managing large data files, and the overall retrieval time. Feature extraction is an initial and important step in the design of a CBIR system: it extracts unique and valuable information from the image, and these features are also termed the image's signature. Feature extraction for the database images is done offline, so it does not contribute significantly to computational complexity. Humans tend to differentiate images by color, so color features are the most used in CBIR; the color moment is most often used to represent color features, especially when the image contains just one object. Regularity, directionality, smoothness, and coarseness are some of the texture properties perceived by the human eye, and the Gabor filter and wavelet transform have proved very effective for texture feature extraction, describing visual content via multi-resolution analysis. The paper gives a brief overview of existing retrieval techniques and a comparative analysis of them under different metrics.
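The first three color moments mentioned above (mean, standard deviation, skewness) can be sketched for one channel as follows; the flat-list input is an illustrative simplification:

```python
def color_moments(channel):
    """First three color moments of one color channel, given as a
    flat list of pixel values: mean, standard deviation, and the
    cube root of the third central moment (skewness)."""
    n = len(channel)
    mean = sum(channel) / n
    var = sum((p - mean) ** 2 for p in channel) / n
    std = var ** 0.5
    skew_cubed = sum((p - mean) ** 3 for p in channel) / n
    # Signed cube root keeps the sign of the skewness.
    skew = abs(skew_cubed) ** (1 / 3) * (1 if skew_cubed >= 0 else -1)
    return mean, std, skew
```

Computing these three numbers per channel gives a nine-dimensional color descriptor for an RGB image, which is what makes color moments so compact compared with full histograms.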
140 Data Integration Models for Operational Data Warehousing, G. Swetha, D. Karunanithi, K. Aiswarya Lakshmi
Data warehouses have evolved to support more than just strategic reporting, analytics, and daily forecasting. Organizations are investing significant resources to integrate the valuable information contained in their data warehouses into their day-to-day operations. Incorporating business intelligence into decision making enables these organizations to optimize business performance throughout the day. However, to achieve these efficiencies, data must be provided in a real-time environment. Many data integration technologies serve the data-acquisition needs of a data warehouse, and the demand for low-latency data is causing IT organizations to evaluate a wide range of approaches: intraday batch Extract, Transform, and Load (ETL) processes as well as real-time Change Data Capture (CDC) techniques.
In the process of signal acquisition and transmission, image signals may be corrupted by impulse noise. An efficient VLSI implementation for removing impulse noise is presented in this paper. To achieve good visual quality, edge features should be preserved: pixels detected as noisy are filtered, while the others remain unchanged. Here, fixed-value impulse noise is removed and the design is implemented in VLSI. The VLSI architecture yields a processing rate of about 200 MHz using TSMC 0.18 µm technology. Compared with state-of-the-art techniques, this work reduces memory storage by more than 99%. The design requires only low computational complexity and two line-memory buffers; its hardware cost is low, making it suitable for many real-time applications.
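A software sketch of the detect-then-filter idea for fixed-value impulse noise (a 3x3 median filter is assumed here as the correction step; the paper's contribution is the VLSI datapath, not this algorithm as such):

```python
from statistics import median

def remove_impulse_noise(img):
    """Replace fixed-value (0 or 255) pixels with the median of
    their 3x3 neighbourhood; all other pixels pass through
    unchanged, which is what preserves edges."""
    h, w = len(img), len(img[0])
    out = [row[:] for row in img]
    for y in range(h):
        for x in range(w):
            if img[y][x] not in (0, 255):
                continue  # detector says the pixel is clean
            neighbours = [img[j][i]
                          for j in range(max(0, y - 1), min(h, y + 2))
                          for i in range(max(0, x - 1), min(w, x + 2))
                          if (j, i) != (y, x)]
            out[y][x] = median(neighbours)
    return out
```

Because only the current row and its two neighbours are ever read, a hardware version needs just two line buffers, which matches the memory saving the paper reports.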
142 Providing Efficient Data management Services Using Cloud Cache, Swapnil L. Mahadeshwar, A.R.Surve
Cloud computing provides various computational capabilities to customers and has changed the service infrastructure drastically. The main objective is to reduce the cost of deploying services in the cloud. The model faces performance limitations in everyday applications and networks; by overcoming these limitations, the cloud speeds up processing by migrating data and applications to the cloud and giving faster access to that data from anywhere. Cloud computing applications that offer data-management services have arisen; such clouds support caching of data in order to offer quality query services. Users can query the cloud data, paying for the infrastructure they use. Cloud management serves multiple users in a disciplined yet resource-economic way that allows for cloud profit. A local cache structure is built for the queries users send to the cloud cache, and the pricing solution employs a novel method that estimates the value of the cache service over time to find the best possible price.
143 GAPSS: GPS Aided Photo Search System, Amey Divekar, Pravin Khot, Gaurav Chaudhari, Vishal Shembekar
Our project proposes GAPSS (GPS Aided Photo Search System) to identify buildings from photos captured by phone cameras. The user takes a picture of the building he or she wants to know about with an Android phone camera and uploads it to our system; the system returns the name and information of the building on a Google map. We describe an image-based approach to finding location-based information from camera-equipped mobile devices. Our technique uses content-based image retrieval to search databases for matching images and their source pages to find relevant location-based information. In contrast to conventional approaches to location detection, our method can refer to distant locations and requires no physical infrastructure beyond mobile internet service and a web service for information storage. The Google API can also be used to display navigation to the photographed location on Google Maps.
144 A Review on Privacy Problems in Distributed Information Brokering System and Solutions, Mr. Ashutosh Kamble, Prof. Deepak Kapgate, Prof. Prakash Prasad
There is an increasing need for information sharing via on-demand access across organizations. Information Brokering Systems (IBSs) have been introduced to connect large-scale, loosely associated data sources through a brokering overlay in which brokers make routing decisions to direct client queries to the requested data servers. Some existing IBSs assume that brokers are trusted and thus adopt only server-side access control for data confidentiality. However, the location of data and information about consumers can still be inferred from metadata (such as queries and access-control rules) exchanged in the IBS, and little attention has been paid to protecting it. This paper presents an overview of information sharing in distributed environments through information brokering systems and the problems associated with it. It also describes two attacks: the attribute-correlation attack and the inference attack.
Embedding a secret message into a cover medium without attracting any attention is known as steganography, one of the methods used for hidden communication. Speech is one cover medium that can be used for audio steganography. The methods we have found for audio steganography change the values of a large number of samples in the audio signal; changing sample values usually annoys the listener and reduces perceptual transparency, so special methods are required for hiding information in audio signals. This survey proposes a new approach to steganography in speech signals: secret data are hidden in the silent parts of the speech signal, which are identified by a collaborative non-voice detection algorithm. The secret data are hidden by slightly reducing the values of a small number of samples in the silent parts. The main feature of our method is a highly perceptually transparent steganographic system with acceptable data-hiding capacity. The method can hide information in a speech stream with very low processing time, which makes it a real-time steganography method.
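A rough sketch of the silence-based embedding idea. The energy threshold, minimum run length, and LSB-style embedding below are illustrative assumptions; the paper's collaborative non-voice detector and sample-reduction rule are not specified in the abstract:

```python
def find_silence(samples, threshold=50, min_run=4):
    """Return (start, end) index pairs of runs of low-amplitude
    samples; a crude stand-in for a non-voice detector."""
    runs, start = [], None
    for i, s in enumerate(samples):
        if abs(s) < threshold:
            if start is None:
                start = i
        else:
            if start is not None and i - start >= min_run:
                runs.append((start, i))
            start = None
    if start is not None and len(samples) - start >= min_run:
        runs.append((start, len(samples)))
    return runs

def embed_bits(samples, bits, threshold=50, min_run=4):
    """Hide one bit per silent sample in its least-significant bit,
    leaving all voiced samples untouched."""
    out, it = list(samples), iter(bits)
    for start, end in find_silence(samples, threshold, min_run):
        for i in range(start, end):
            try:
                b = next(it)
            except StopIteration:
                return out
            out[i] = (out[i] & ~1) | b
    return out
```

Confining changes to low-amplitude regions is what keeps the perceptual transparency high: a one-unit change is inaudible in silence but could be audible against a quiet tone.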
146 An Approach for Minimization of Power Consumption in Ad-Hoc Network, Abhiruchi Akre, Kimi Bhoyar, Ankita Malve, Avantika Kalbande, Pawan Khade
Mobile phones with rich media and wireless networking capabilities have ushered in a new paradigm of mobile computing with new emerging social behaviours. Enabling technologies now let users search, locate, download, and share dynamically created content with friends and family from their mobile devices. With ad hoc networking capabilities in mobile devices, this trend is beginning to shift from wide-area communities of users to dense, local social situations. Such a shift presents opportunities to design proximity-aware systems that deliver novel social experiences: for example, fans watching a football game can automatically share pictures taken on their mobile phones with each other while commenting on and rating pictures being taken around them. Designing systems for ad hoc environments presents several interesting research challenges, including the difficult problem of providing scalable, energy-efficient presence and content updates. To keep information fresh in such environments, the distribution mechanisms must focus on frequent, small metadata updates rather than large, infrequent payloads, which could otherwise cause significant battery drain on a mobile device.
147 Outlier Mining for Removing the Anomalies in High Dimensional Data Using ARVDH Algorithm, Krupa Mary Jacob, K.Sangeetha, S.Karthik
In data mining, outliers are one of the main threats to efficient information retrieval from databases. Outliers are also known as anomalies, and mining them out of normal data is very important and has wide scope. Anomaly detection is found in applications such as credit-card fraud detection, intrusion and insider-threat detection in cyber-security, fault detection, and malignancy diagnosis. Anomalous data present in a database are harmful to the processing and use of the stored information: viscous data contain erroneous information and may even contain dangerous code for cracking the whole system in which they are stored. The main drawback of the existing system is that it does not support multi-clustered data when removing viscous data. To avoid this problem, we propose the Algorithm for Removing the Viscous data in High Dimensional data (ARVDH), which uses simple and efficient steps to remove outliers from the information.
We create and automatically recognize the behaviour profile of a user from the commands typed in a command-line interface. Computer-user behaviour is represented as a sequence of UNIX commands, and this sequence is transformed into a distribution of relevant subsequences in order to derive a profile that defines the user's behaviour. The existing system, a novel evolving user-behaviour classifier, is based on Evolving Fuzzy Systems and takes into account the fact that any user's behaviour is not fixed but changing. Timely detection of computer-system intrusion is a problem receiving increasing attention, and previous approaches cannot prevent legitimate users from abusing their rights in a computer system. We propose to monitor, analyse, and detect abnormalities based on the time-varying behaviour of the same user, and thereby to detect a masquerader from the user's behaviour profile. A tree-structured architecture is adopted in the partitioning stage to avoid having to predetermine the number of partitioned regions. In the second stage, multiple SVMs, also called SVM experts, that best fit the partitioned regions are constructed by finding the most appropriate kernel function and the optimal free parameters of the SVMs. On different UNIX command data, we show that a system based on our approach can efficiently recognize a UNIX user and detect masqueraders, and that the SVM experts achieve significant improvement in generalization performance compared with the single SVM models of the existing system.
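The subsequence-distribution profile can be sketched with command bigrams; the L1 distance below is an illustrative comparison, not the paper's fuzzy/SVM classifier:

```python
from collections import Counter

def command_profile(commands, n=2):
    """Relative-frequency distribution of length-n command
    subsequences; this distribution is the user's profile."""
    grams = [tuple(commands[i:i + n]) for i in range(len(commands) - n + 1)]
    total = len(grams)
    return {g: c / total for g, c in Counter(grams).items()}

def profile_distance(p, q):
    """L1 distance between two profiles; a large value flags a
    possible masquerader whose command habits differ."""
    keys = set(p) | set(q)
    return sum(abs(p.get(k, 0) - q.get(k, 0)) for k in keys)
```

A session whose bigram distribution sits far from the stored profile would be escalated for masquerade checking; the paper replaces this fixed distance with classifiers that adapt as the user's habits drift.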
149 EduPad- “A Tablet Based Educational System for Improving Adult Literacy in Rural India”, Mayuri Tawri, Richa Sharma, Charandas Pote
Literacy is one of the great challenges in the developing world, and universal education is an unattainable dream for children who lack access to quality educational resources such as well-prepared teachers and schools. Worse, many of them do not attend school regularly because they must work for the family in agricultural fields or households, and the rural areas of India are often at a disadvantage within the Indian education system. We propose an educational system called EduPad to reduce rural adult illiteracy using advanced technology. The device is an interactive tablet capable of teaching multiple languages; the software helps the user learn to write as well as spell the alphabet.
150 Solution to Data Sharing for Confidentiality between Service Provider and the Data Owner, Mr. Ajay Bhaisare, Prof. Prakash Prasad, Prof. Ashwini Meshram
Cloud Service Provider (CSP) provides various types of services. Such as Storage-as-a-Service (SaaS) is a paid facility provided by CSP, where data owners can outsource their data in the cloud. This having some issue of ensuring the integrity and security of data storage in Cloud. We consider the work of allowing a Trusted Third Party (TTP), on behalf of the cloud client, to verify the integrity and security of the dynamic data stored in the cloud. The data owner securely outsources confidential data in cloud. It allows authorized users to access the owner’s file. It maintains trust between data owner and cloud service provider.
151 Sybil Attack Detection with Reduced Bandwidth Overhead in Urban Vehicular Networks, D. Balamahalakshmi, Mr. K.N. Vimal Shankar
Urban vehicular networks require strong location privacy. For Sybil attack detection, earlier work proposed a footprint concept that detects the attack using trajectory information generated by multiple road-side units (RSUs) while preserving the location privacy of the vehicle. An RSU issues a location-and-time message to each vehicle that passes it; these messages are used for verification, which also accounts for failed RSUs. Reducing the message size, however, is not covered by that system. To achieve this, the proposed system eliminates repeated occurrences of adjacent RSUs, so that the length of the trajectory information is reduced without loss of information and the bandwidth overhead is reduced.
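The trajectory-shortening idea, eliminating repeated occurrences of adjacent RSUs, amounts to dropping consecutive duplicates from the RSU sequence. A minimal sketch (the RSU identifiers here are hypothetical):

```python
def compress_trajectory(rsu_ids):
    """Drop consecutive duplicate RSU entries: the trajectory keeps its
    order of distinct road-side units but sheds the redundant messages
    that inflate bandwidth overhead."""
    out = []
    for r in rsu_ids:
        if not out or out[-1] != r:
            out.append(r)
    return out
```

Note that a later revisit to an earlier RSU is preserved, since only *adjacent* repeats carry no extra information.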
Personal health record (PHR) systems are patient-facing portals that contain patient health information and allow patients to interact with the health system. The PHR enables a patient-centric model of health information exchange and is often outsourced to a third party such as a cloud provider. The key distinction is that a PHR is typically under the patient's control, so that the individual patient is the ultimate guardian and editor of the information stored in or accessible through it. This project describes the design and prototype implementation of a social healthcare network over the cloud. It differs from previous work in using a more secure and strong encryption algorithm, Triple DES, and the system is secured with trust-aware role-based access control.
153 Gas Level Detection and Leakage Monitoring System using a Specific Technique, I. Juvanna, N. Meenakshi
Liquefied petroleum gas (LPG) is the fuel common to most cooking applications, and many of us face much difficulty when the gas cylinder empties during peak cooking hours. This paper aims to make the user aware of the decreasing weight as gas is consumed and to dial the gas booking office automatically. Continuous measurement of the weight cannot be done with electronic weight gauges, since it fatigues their springs; we therefore move to contactless detection involving acoustic waves. In this system, a pressure sensor built into the RFID device measures the gas level inside the cylinder. The sensor output is given to a PIC controller, where the voltage corresponding to the gas weight is stored, and the value is displayed on an LCD connected to the controller's output port. A threshold value is set in the controller; once it is reached, the alarm is triggered to alert the user and the autodialler is activated. A dedicated sensor for the gas detection agent is also built into the RFID device, with its output connected to the alarm for leakage detection.
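The controller's threshold logic can be illustrated with a minimal sketch (the threshold value and action names are assumptions for illustration, not the paper's firmware):

```python
def check_gas_level(weight_kg, threshold_kg=2.0):
    """Return the actions the controller should take when the measured
    cylinder weight falls to or below the configured threshold."""
    actions = []
    if weight_kg <= threshold_kg:
        actions.append("sound_alarm")            # alert the user
        actions.append("autodial_booking_office")  # place the refill call
    return actions
```

In the real system these actions would drive the alarm pin and the autodialler module; here they are returned as strings so the decision logic is visible.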
154 Lane Detection Techniques - A Review, Anjali Goel
Lane detection has become popular in real-time vehicular ad-hoc networks (VANETs). The main emphasis of this paper is to find ways to further improve the results of lane detection algorithms. Noise, poor visibility and similar factors reduce the performance of existing algorithms: the methods developed so far work efficiently and give good results when the images are free of noise, but they fail or give poor results when any kind of noise or fog is present in the road images. The noise can be anything like dust, shadows, puddles, oil stains, tyre skid marks, and so on. The overall goal of this paper is therefore to evaluate the gaps in the existing literature and to suggest suitable solutions.
155 A Review on Image Dehazing Methods, Reshma Kurian, Namitha T.N
This review paper presents a study of various image dehazing methods that remove the haze from captured hazy images to recover better, improved-quality haze-free images. One of the critical problems in the field of image processing is the restoration of images corrupted by various degradations. Images of natural outdoor scenes are degraded by bad weather conditions such as fog and haze: the atmospheric particles cause a decay in the colour and contrast of the captured image, which can make it difficult to detect objects in the captured scenes. With the recent development of the computer vision area, it is now possible to improve outdoor hazy images and remove the haze from them.
This article deals with the growth and proportion of different types of co-authored publications in bioinformatics. The study also explores the applicability of an appropriate statistical model to the decline in the proportion of single-authored publications during the sample periods, and examines the fit of selected statistical models to the distribution of authorship in bioinformatics publications from 1999 to 2013.
157 Scalable and Secure Sharing of Personal Health Records in Cloud Computing using Attribute-Based Encryption, Prof. Y.B. Gurav, Manjiri Deshmukh
Personal health records are maintained on a centralized server holding patients' personal and diagnosis information. The personal health record (PHR) is an emerging patient-centric model of health information exchange, often outsourced to a third party such as a cloud provider. However, there are wide privacy concerns, as personal health information could be exposed to those third-party servers and to unauthorized parties. Security schemes are used to protect personal data from public access, and encrypting PHRs before outsourcing is a promising way to assure patients' control over access to their own records. In this paper we propose a novel patient-centric framework and a suite of mechanisms for data access control to PHRs stored in semi-trusted servers. Issues such as the risk of privacy exposure, scalability of key management, flexible access and efficient user revocation remain the most important challenges toward achieving fine-grained, cryptographically enforced data access control. To achieve fine-grained and scalable access control, we leverage attribute-based encryption (ABE) techniques to encrypt each patient's PHR file. Data owners upload their personal data to third-party cloud data centers, and multiple data owners can access the same data values. Unlike previous work on secure data outsourcing, we focus on the multiple-data-owner scenario and divide the users of the PHR system into multiple security domains, which greatly reduces key-management complexity for owners and users. A high degree of patient privacy is simultaneously guaranteed by exploiting multi-authority ABE. Our scheme also enables dynamic modification of access policies or file attributes, and supports efficient on-demand user/attribute revocation and break-glass access under emergency scenarios.
Peer-to-peer networks are composed of heterogeneous and autonomous peers that cooperate with each other in a decentralized manner. All peers are both users and providers of resources and can access each other directly without intermediary agents. In the proposed system, we introduce a Self-Organizing Trust model (SORT) that aims to decrease malicious activity in a P2P system by establishing trust relations among peers in their proximity. Each peer develops its own local view of trust about the peers it has interacted with in the past. In this way, good peers form dynamic trust groups in their proximity and can isolate malicious peers. Finally, an experimental study is conducted on a real P2P prototype, and a large-scale network is further simulated. The results show the effectiveness, efficiency and scalability of the proposed system.
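A local trust view built from past interactions, as SORT establishes, can be illustrated with a minimal sketch (the scoring rule, neutral prior and cutoff here are assumptions for illustration, not SORT's actual metric):

```python
class PeerTrust:
    """Minimal local trust ledger: each peer scores others purely from
    the outcomes of its own past interactions."""

    def __init__(self):
        self.history = {}  # peer id -> list of outcomes (1 good, 0 bad)

    def record(self, peer, satisfactory):
        self.history.setdefault(peer, []).append(1 if satisfactory else 0)

    def trust(self, peer):
        h = self.history.get(peer)
        # Peers never interacted with start at a neutral 0.5.
        return sum(h) / len(h) if h else 0.5

    def is_malicious(self, peer, cutoff=0.3):
        return self.trust(peer) < cutoff
```

Each peer would maintain its own `PeerTrust` instance, so no global reputation service is needed, matching the decentralized setting described above.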
159 Attendance Monitoring Using Face Recognition, Divya Singh, Ruhi Sunil Hadke, Shruti Sanjay Khonde, Valhavi Diwakar Patil, Monica Kamnani, Mitali R. Ingle
Taking students' attendance in the classroom is an important task, and doing it manually wastes a lot of time. Automatic methods such as biometric attendance exist, but they also waste time because students must queue to place their thumb on the scanning device. This work describes an efficient algorithm that marks attendance without human intervention: a camera attached in the classroom continuously captures images of the students, the faces in the images are detected and compared with a database, and attendance is marked accordingly.
In geographic routing, data transmission and beacon transmission are based on the location information of every node connected in the network. Beacon packets are transmitted from neighbour nodes to the requesting node using the Adaptive Position Update (APU) scheme, which incorporates two rules: the Mobility Prediction (MP) rule and the On-Demand Learning (ODL) rule. APU dynamically adjusts the beacon update interval based on the movement of the node. The ODL rule is used to reduce the updating cost and to improve network performance: it builds the topology only for the active (forwarding) nodes while data is being transferred. The MP rule, however, cannot handle link failure during transmission. To overcome this, the Extended Adaptive Position Update (EAPU) scheme is proposed. It addresses link failure, topology inaccuracy and performance degradation in two ways: (1) requesting the neighbour list of a new neighbour node, and (2) verifying the acknowledgement, transmission and path. To achieve the latter, the system uses the TCP protocol instead of UDP for a trustworthy connection; since TCP provides end-to-end packet delivery, the load on the forwarding node during data transmission is also reduced. The system is thus enhanced with the benefits of EAPU along with TCP for routing as well as a trustworthy logical connection.
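The core of the MP rule, beacon only when a neighbour's predicted position drifts too far from its actual one, can be sketched as follows (the 2-D positions, linear dead-reckoning and 10 m threshold are illustrative assumptions):

```python
def predict(pos, velocity, dt):
    """Linear dead-reckoning from the last received beacon."""
    return (pos[0] + velocity[0] * dt, pos[1] + velocity[1] * dt)

def needs_beacon(predicted, actual, threshold=10.0):
    """MP-rule sketch: a fresh beacon is warranted only when the
    prediction error exceeds the threshold (in metres)."""
    dx = predicted[0] - actual[0]
    dy = predicted[1] - actual[1]
    return (dx * dx + dy * dy) ** 0.5 > threshold
```

A node that keeps moving as its neighbours predicted stays silent, which is exactly how APU cuts beacon overhead relative to fixed-interval beaconing.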
This paper analyzes the deployability of the Network Mobility (NEMO) approach in wireless Vehicular Ad-Hoc Networks (VANETs). The vision for VANETs is road safety and commercial comfort applications enabled by short-range wireless technology. A NEMO-based VANET is a new effort to integrate NEMO with VANETs; its advantage is a tracking-prevention system that stops attackers from localizing a user inside the hotspot of a vehicle. It is used for communication between the road-side unit (RSU) and the vehicle to provide Internet access in public transportation systems (e.g. bus, train) to the mobile network nodes (MNNs), so that passengers can enjoy full Internet access on devices such as cell phones and personal digital assistants. Because of the open wireless network environment, attackers in the OSI model can easily localize mobile network nodes by measuring their received signal strength (RSS). This paper modifies the concealment scheme and, using the idea of power variability, proposes a new scheme with fake-point and cluster-based sub-schemes, whose goal is to confuse attackers by increasing the estimation error in the received signal. Using correctness and certainty metrics, the fake-point-based sub-scheme secures the MNN's location more strongly as the number of grid points decreases. The simulation shows that the fake-point cluster-based scheme achieves 23% and 37% decreases in sender power over the MNN's route length compared with the fake-point sub-scheme.
162 SAT4BSC: A Static Analysis Tool for BPEL Source Codes, Esubalew Alemneh, Abdul Azim Abd Ghani, Rodziah Atan
Business Process Execution Language (BPEL) is an Extensible Markup Language (XML) based language for describing the logic that orchestrates the interaction between Web services in a business process. Even though it is a fairly new language, it is gaining popularity in various software industries and research environments. Recent research and development on Web services and BPEL has emphasized their architecture and interfaces; however, work on tool support, especially for computing metrics and drawing control flow graphs (CFGs), is in its infant stage. Providing tools to compute measures has a multitude of benefits: the CFG is an essential tool for analyzing various properties of source code and is also useful in software testing, software measurement and software maintenance. In this research we have developed a static analysis tool dedicated to computing all available BPEL 2.0 metrics and drawing the CFG of BPEL source code. The tool has been evaluated on various BPEL process source codes obtained from the language specification and from other research papers. The tests show that the tool can compute the metrics and draw the CFG effectively and efficiently.
163 Improvising Authenticity and Security of Automated Teller Machine Services, Srivatsan Sridharan, Gorthy Ravi Kiran, Sridhar Jammalamadaka
This work aims at improving the security and authenticity of the Automated Teller Machine (ATM) using a trusted third-party application, benefiting every customer whose valid ATM card is officially registered with a mobile number. The system provides currency withdrawal at any remote terminal, verification of the end user's identity using a Personal Identification Number, and validation of an authentic One-Time Passkey (Pk) through the mobile phone. Theft of the card and eavesdropping of the password from card holders within the terminal software are major threats yet to be addressed; customers without any insider privileges should be able to withdraw currency without such mechanisms going undetected. The basic solution is a two-tier authentication in which a Pk and a Random Security Question (RSQ) are generated and validated from the user's input at the ATM terminal, with authenticity ensured and confidentiality maintained. In such a system, the correctness burden on the terminal's code is significantly smaller, as customers are given the chance to authorize themselves from their hand-held devices and are allowed to withdraw currency at the terminal only after their identity is proved by a series of authentication procedures. Along with the dual-tier authentication implementation, this paper addresses the issues that arise with it and their solutions, in particular the generation of an RSQ and Pk that are independent and unique for each session.
164 A Survey on Privacy-Preserving Techniques for Secure Cloud Storage, Salve Bhagyashri, Prof. Y.B. Gurav
Cloud computing is a technology that enables obtaining resources such as services, software and hardware over the Internet. With cloud storage, users can store their data remotely and enjoy on-demand services and applications from configurable resources. Cloud data storage has many benefits over local data storage: users should be able to use the cloud storage as if it were local, without worrying about the need to verify its integrity. The problem is ensuring the security and integrity of users' data, so public auditability for cloud storage is provided, whereby users can resort to a third-party auditor (TPA) to check the integrity of their data. This paper surveys the various privacy issues that arise while storing user data in the cloud and during TPA auditing; without appropriate security and privacy solutions designed for clouds, this computing paradigm could become a big failure. We present privacy-preserving public auditing using a ring-signature process for a secure cloud storage system, and analyze various techniques for solving these issues and providing privacy and security for data in the cloud.
165 Cloud Removal from Satellite Images Using Information Cloning, Saranya M
In recent years, about 35% cloud cover has been present on average in optical satellite images. To produce cloud-free satellite images for analyses of current land cover and land-cover change, a cloud removal approach based on information cloning is introduced. The approach removes cloud-contaminated portions of a satellite image and then clones information from cloud-free patches to the corresponding cloud-contaminated patches, under the assumption that land cover changes insignificantly over a short period of time. To identify the exact location of cloud-contaminated regions, cloud detection based on a window-based thresholding approach is introduced. The proposed information-cloning algorithm reconstructs the missing data after the cloud-contaminated region is removed: the reconstruction replaces the contaminated parts of the target image with cloud- and shadow-free parts from a reference image. The approach yields cloud-removed images and has been tested on various input images.
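Window-based thresholding for cloud detection can be illustrated with a minimal sketch (the window size, brightness threshold and plain-list image representation are assumptions; real cloud masks use calibrated radiometric tests):

```python
def detect_cloud_windows(image, win=2, threshold=200):
    """Flag win x win windows whose mean brightness exceeds the
    threshold; clouds are typically the brightest regions in an
    optical scene.  Returns top-left (row, col) of flagged windows."""
    h, w = len(image), len(image[0])
    flagged = []
    for r in range(0, h, win):
        for c in range(0, w, win):
            block = [image[i][j]
                     for i in range(r, min(r + win, h))
                     for j in range(c, min(c + win, w))]
            if sum(block) / len(block) > threshold:
                flagged.append((r, c))
    return flagged
```

The flagged windows mark the regions whose pixels would then be replaced by cloning from the cloud-free reference image.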
166 Automatic Detection of Optic Disc for the Extraction of Ocular Structure, Nivedha S, Dinesh V
Nowadays, some of the most common causes of visual impairment and blindness are diabetic retinopathy, hypertension and glaucoma. These diseases can be detected through regular ophthalmologic examination; however, due to population growth, the number of ophthalmologists and experts available for examination is a limiting factor, so a system for automatic recognition of these pathological cases would provide a great benefit. To this end, the proposed method for the detection of the optic disc is based on mathematical morphology along with Principal Component Analysis (PCA). It makes use of different operations such as the generalized distance function (GDF), the stochastic watershed, and geodesic transformations. The implemented algorithm has been validated on five public databases, obtaining promising results.
167 Phishing Website Detection: A Review, Feon Jaison, Seenia Francis
Phishing is an attempt to steal users' personal and financial information, such as passwords and credit card numbers, through electronic communication such as e-mail and other messaging services. Attackers pretend to be from a legitimate organization and direct users to a fake website that resembles the genuine one, which is then used to collect the users' personal information. Attackers can also trick users into downloading malicious code or malware after they click on a link embedded in the email. Various research has been done on protecting users from phishing attacks, including firewalls, blacklisting of certain domains and Internet Protocol (IP) addresses, spam filtering techniques, fake-website detection, client-side toolbars and user education. Each of these existing techniques has some advantages and some disadvantages. Automatically discovering a phishing target is an important problem for anti-phishing efforts: if we know which webpage is the target, we can confirm which pages are phishing pages, helping owners identify phishing attacks so that they can immediately take the necessary countermeasures.
168 A Review of Intrusion Detection System in Computer Network, Abhilasha A Sayar, Sunil N. Pawar, Vrushali Mane
The Internet is a global network used everywhere by various companies, institutions and government sectors. With the growth of the Internet, the world is coming closer to the individual, but at the same time there is a threat of being robbed: connecting to the Internet can be both advantageous and disadvantageous, in the sense that it provides much comfort to business while also posing tremendous risk to end users. With the increasing speed of information flow, developments in communication networks and many other factors, numerous attacks on computer systems are possible. Intrusion detection systems came into the picture to protect computer systems from these attacks and malicious activities. This paper provides an overview of intrusion detection systems and the various techniques used to implement them.
Community question answering (cQA) services have gained popularity over the past few years. They allow community users to post and answer questions, and enable general users to acquire information from a set of answered questions. However, existing cQA forums provide textual answers alone, which are not informative enough for many questions. To enhance textual answers in cQA with suitable media data, a multimedia question answering (MMQA) method has been introduced, consisting of three parts: answer medium selection, query generation for multimedia search, and multimedia data selection and presentation. The method automatically determines which type of multimedia information should be added to elaborate a textual answer, and automatically gathers data from the web to enrich the answer. By processing a collection of question-answer pairs and adding them to a dataset, the MMQA method lets users find multimedia answers by matching their questions against those in the dataset. It provides images and video not only for direct question answers but also for more complex questions, and a multimedia search diversification method is used to collect relevant answers based on the questions. The results show that it provides more satisfactory answers to users and is more effective.
170 CPU Power Prediction on Modern Multicore Embedded Processor, Shuhaizar Daud, R. Badlishah Ahmad, Ong Bi Lynn
In this paper we put a modern multicore embedded processor in a load-controlled environment and test its actual power consumption in the idle and active states. To retain the highest accuracy during measurement, we measured directly on the processor's power supply line at runtime. The test processors are loaded to specific thresholds, and the actual power consumption during execution is measured and logged in real time. We have found that on modern embedded processors such as our test platform, the idle and active power requirements depend more on processor load than on CPU vcore voltage or CPU execution frequency.
171 Vampire Attacks: Topology Discovery in Wireless Ad Hoc Sensor Networks, P. Dhivya, P. Sathya Priya, M. Thenila
This project explores resource depletion attacks at the routing protocol layer, which permanently disable networks by quickly draining nodes' battery power. These "Vampire" attacks are not tied to any specific protocol, but rather rely on the properties of many popular classes of routing protocols. To mitigate these attacks, we include a new proof-of-concept protocol, alongside the AODV protocol, that provably bounds the damage caused by Vampires during the packet forwarding phase.
172 Integration of Touch Technology in Restaurants using Android, Sushmita Sarkar, Resham Shinde, Priyanka Thakare, Neha Dhomne, Ketki Bhakare
The growing number of restaurants and of restaurant-goers has emphasized the need to enhance the working of the hospitality industry. This research work aims at improving the quality of service and business of the hospitality industry by incorporating technology. Detailed research on the integration and utilisation of technology in hospitality showed that various applications based on wireless technologies are already in use, enabling partial automation of the food ordering process. In this paper we discuss the integration of touch technology in restaurants using Android. The system is a basic dynamic database utility that fetches all information from a centralized database: the tablet at the customer's table runs an Android application with all the restaurant and menu details, and the customer tablet, kitchen display and cashier counter connect directly to each other over Wi-Fi. This wireless application is user-friendly, improves efficiency and accuracy for restaurants by saving time, reduces human error and collects customer feedback. The system successfully overcomes the drawbacks of earlier automated food ordering systems and is less expensive, as it requires only a one-time investment in gadgets.
Data mining is the process of extracting previously unknown knowledge and detecting interesting patterns in a massive data set. The amount of multimedia data available to users has increased exponentially; video is one example of multimedia data, containing several kinds of data such as text, images, metadata, and visual and audio streams. It is widely used in many applications such as security and surveillance, entertainment, medicine, education programs and sports. In many surveillance missions, huge amounts of data need to be gathered, evaluated and analyzed in order to make the right decision, and interesting events or threats are often hidden within these large amounts of data. The discovery of interesting events is one of the core problem areas of the data-mining research community; yet compared with the mining of other types of data, video data mining is still in its infancy, and many challenging research problems exist. Beginning with an overview of the video data mining literature, this paper concludes with the applications of video mining.
174 Review on Authentication Mechanisms of Digital Signatures used for Certification, Shraddha Kulkarni, Prof. Vikrant Chole, Prof. P S Prasad
Signatures are commonly used for certifying financial documents such as payment receipts, cheques, stamp papers, agreements and contracts, as well as for personal identification on identity cards, mark cards and the like. Since these documents usually involve money transactions and identity verification, the signatures should be authenticated for their genuineness. In this paper we clarify the importance of digital signatures for certification and review multiple digital signature authentication mechanisms. This review helps us proceed in the right direction of research in digital signature verification and authentication.
Phishing is a kind of online security attack in which the attacker creates a replica of an existing web page to fool users into revealing their personal, financial or password data. It is a form of online fraudulent activity in which an attacker aims to steal a victim's sensitive information, such as an online banking password or a credit card number; victims are tricked into providing such information by a combination of spoofing techniques and social engineering. In this paper we propose a new approach, an anti-phishing framework with an interactive CAPTCHA validation scheme using visual cryptography, to solve the problem of phishing. It uses visual cryptographic schemes to counter phishing pages: one secret share of a CAPTCHA image resides with the user and the other secret share resides on the server. During authentication, a genuine server forwards its share and the user forwards his share, resulting in secure access to the system via the reconstructed CAPTCHA. The traditional CAPTCHA, however, is prone to character-recognition attacks and third-party human attacks. To overcome this problem, we use a new generation of CAPTCHA, the interactive CAPTCHA, to counter both attacks. By recording CAPTCHA solving time on a per-character basis, we propose detection-threshold algorithms that enable a server to detect and reject third-party human attacks in ways not possible with existing CAPTCHAs. Combined with visual cryptographic schemes, we offer a dynamic, interactive CAPTCHA that can thwart all the authentication threats considered.
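The per-character timing check behind the detection-threshold idea might be sketched as follows (the human-plausible gap bounds are illustrative assumptions, not the paper's calibrated values):

```python
def third_party_relay_suspected(char_times, min_gap=0.15, max_gap=5.0):
    """Given timestamps (seconds) at which each CAPTCHA character was
    entered, flag sessions whose inter-character gaps fall outside a
    plausible human band: too fast suggests an automated solver, too
    slow suggests the challenge was relayed to a third party."""
    gaps = [b - a for a, b in zip(char_times, char_times[1:])]
    return any(g < min_gap or g > max_gap for g in gaps)
```

A server would record these timestamps as the user types and reject the session when the function returns True, before the visual-cryptography share is even reconstructed.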
176 Advanced E-mail Spam Detection Methodology by the Neural Network Classifier, V. Brindha, J. George Christober, V. Nandhini
Spam is unwanted mail sent over the Internet, and it has become a major issue that is difficult to control legitimately. Large amounts of spam consume bandwidth and are more than a mere nuisance. The difficulty is that spam arrives mixed in with new mail, making categorization hard to produce: separating spam from legitimate mail is a tough task. Spam filtering is the processing of email to categorize it against specific criteria. An anti-spam filter is introduced for the classification of spam mails in the inbox, with the spam detector trained on pre-classified emails; the detector then decides whether a mail is an ordinary mail or a spam mail. Here we include an advanced technique, a neural network classifier, in the filtering stage to calculate the probability that a message is spam, using machine learning from spam mails. The neural network is accurate, efficient, adaptable and robust, and is an effective anti-spam approach for classifying whether a mail is spam or an ordinary email.
177 Translation of English Algorithm in C Program using Syntax Directed Translation Schema, Nisha N. Shirvi, Mahesh H. Panchal
Natural language processing (NLP) is one of the most promising areas of research nowadays. An automatic translation application, such as a translator from English-written algorithms to C programs, is very useful for people who want to program but do not know any formal language like C or Java. Translation of English algorithms to C programs has been implemented with a rule-based approach using a syntax-directed translation schema; the rule-based system does not use any intermediate representation. The input to the system is a naturally written English algorithm, and the output is its equivalent C program. The system described in this paper comprises two tools, Flex (scanner) and Bison (parser): Flex, as the scanner, defines the rules for string acceptance and token generation, while Bison, as the parser, defines an NLP phrase-structure grammar (PSG) with its semantic actions.
178 One-Dimension Multi-Objective Bin Packing Problem using Memetic Algorithm, Khushbu Patel, Mahesh Panchal
Memetic algorithms have proven successful at finding near-optimal solutions to hard combinatorial optimization problems. In this paper, a memetic algorithm is designed for the one-dimension multi-objective bin packing problem. A memetic algorithm combines the power of a genetic algorithm with a powerful local search technique that focuses on the population of local optima. Because the memetic algorithm performs a local search on each chromosome, it is expected to come closer to the optimal solution than a genetic algorithm alone.
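A typical baseline that a memetic algorithm's population and local search can refine is the classic first-fit decreasing heuristic for one-dimensional bin packing (shown here as illustrative context; the memetic algorithm itself is not reproduced):

```python
def first_fit_decreasing(items, capacity):
    """Classic FFD heuristic: sort items largest-first, then place each
    into the first bin with enough remaining capacity, opening a new
    bin only when none fits."""
    bins = []
    for item in sorted(items, reverse=True):
        for b in bins:
            if sum(b) + item <= capacity:
                b.append(item)
                break
        else:
            bins.append([item])
    return bins
```

In a memetic setting, each chromosome encodes a packing (or an item ordering), and a local search step repairs or compacts it; FFD gives a sense of the solution quality the evolved packings must beat.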
179 Filtered Wall: An Automated System to Filter Unwanted Messages from OSN User Walls, Ms. Pallavi Kalyane, Prof. Sachin Chavan, Prof. Pravin Rahate
Nowadays, On-line Social Networks (OSNs) are among the most popular interactive media for communicating, sharing, and disseminating a considerable amount of information about human life. This project presents a system that filters unwanted messages posted to a user's wall based on their content, giving OSN users direct control over the messages posted on their walls. Until now, OSNs have provided little support for preventing unwanted wall messages: no content-based preferences are supported, so it is not possible to prevent unwanted messages, such as political or vulgar ones, regardless of who posts them. Providing this service is not merely a matter of applying previously defined web content mining techniques to a different application; it requires the design of ad hoc classification strategies, because wall messages are short texts for which traditional classification methods have serious limitations, since short texts do not provide sufficient word occurrences. One fundamental issue in the existing system is that it blocks a user for a lifetime. Our proposed system overcomes this problem by blocking the user for a limited time and sending a notification and e-mail to whoever posted the unwanted message on the wall. Along with that, we use a Self-Organizing Incremental Neural Network (SOINN) with a Radial Basis Function (RBF) for text classification, employing the back-propagation technique of neural networks (i.e. taking the proper action using previous knowledge of the user's messages).
180 Mining Multilevel Fuzzy Association Rule from Transaction Data, Urvi A. Chaudhary, Mahesh Panchal
Mining multilevel association rules in transaction datasets is common and widely used in data mining. It is more challenging when some form of uncertainty, such as fuzziness, is present in the data or in the relationships among data. Existing models mine multilevel association rules based on frequency; as a consequence, the minimum support at each level must be set to a low value, otherwise many valuable patterns may not be found. We have employed fuzzy set concepts, a multi-level taxonomy, and different minimum supports to find fuzzy multilevel association rules in a given transaction dataset. The Apriori concept is used in the model to find the itemsets. The proposed model adopts a top-down, progressively deepening approach to derive large itemsets, and incorporates fuzzy boundaries instead of sharp boundary intervals.
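A minimal sketch of the fuzzy-support idea behind such mining (the multi-level taxonomy and per-level minimum supports are omitted); the triangular membership function and the transaction format are illustrative assumptions, not taken from the paper:

```python
from itertools import combinations

def mu_high(qty):
    """Illustrative triangular membership: how strongly a purchased
    quantity belongs to the fuzzy region "high"."""
    if qty <= 2:
        return 0.0
    if qty >= 6:
        return 1.0
    return (qty - 2) / 4.0

def fuzzy_support(itemset, transactions):
    """Fuzzy support: average over transactions of the minimum
    membership among the itemset's items (0 if an item is absent)."""
    total = 0.0
    for t in transactions:
        if all(i in t for i in itemset):
            total += min(mu_high(t[i]) for i in itemset)
    return total / len(transactions)

def fuzzy_apriori(transactions, min_sup):
    """Apriori-style level-wise search using fuzzy support."""
    items = sorted({i for t in transactions for i in t})
    cands, frequent, k = [(i,) for i in items], [], 1
    while cands:
        level = [c for c in cands if fuzzy_support(c, transactions) >= min_sup]
        frequent.extend(level)
        k += 1
        pool = sorted({i for c in level for i in c})
        cands = [c for c in combinations(pool, k)
                 if all(s in level for s in combinations(c, k - 1))]
    return frequent

# Each transaction maps item -> purchased quantity.
baskets = [{"milk": 5, "bread": 4}, {"milk": 6, "bread": 1}, {"milk": 3}]
rules = fuzzy_apriori(baskets, min_sup=0.15)
```

Replacing the crisp "item present" test with a graded membership is what lets low-but-nonzero quantities contribute partially instead of falling outside a sharp boundary.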
181 Enhanced Live Migration of Virtual Machine Using Comparison of Modified and Unmodified Pages, Sushil Kumar Soni, Ravi Kant Kapoor
Nowadays, cloud computing is one of the fastest-growing technologies in the fields of computer science and information technology because of its online, low-cost, pay-as-you-use scheme. Cloud computing is mainly a business-oriented model providing on-demand computing resources. It has become popular in a short time because of its attractive services: ease of use, pay-as-you-use billing, worldwide accessibility, and so on. The cloud computing concept is motivated by the idea that information processing can be a public utility and can be done more efficiently on large farms of computing and storage resources, available at all times throughout the world via the Internet. Virtual machine migration is one of the crucial activities carried out in cloud management. In this paper, we propose a model that removes some overhead from the migration approach to increase its efficiency. The model is implemented and tested using a simulator, and the results are compared with contemporary migration approaches.
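A tiny sketch of the modified/unmodified page comparison that underlies pre-copy live migration (the page size, the use of hashing, and the two-round flow are assumptions for illustration, not the paper's model):

```python
import hashlib

PAGE = 4096  # assumed page size in bytes

def page_digests(memory, page=PAGE):
    """Digest every page of the source VM's memory image."""
    return [hashlib.sha256(memory[i:i + page]).digest()
            for i in range(0, len(memory), page)]

def dirty_pages(old_digests, memory, page=PAGE):
    """Indices of pages modified since the previous pre-copy round;
    only these need to be re-sent to the destination host."""
    return [i for i, (old, new) in enumerate(zip(old_digests,
                                                 page_digests(memory, page)))
            if old != new]

# Round 1 copies everything; the guest then dirties one page,
# so round 2 re-sends only that page.
mem = bytearray(4 * PAGE)
round1 = page_digests(mem)
mem[PAGE + 10] = 0xFF  # guest write into page 1 during migration
```

Skipping unmodified pages in later rounds is exactly the overhead reduction the comparison buys.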
182 A Wavelet Transform Based Secure Data Transfer Using Blowfish Algorithm, Rashmi.J, Bharathi.G
The modern era has seen a large number of cryptographic and steganographic techniques for transmitting and receiving data in a secure and confidential manner. In our paper we work in a multi-resolution wavelet domain, combining the concepts of steganography and cryptography. Initially, we use a modified Blowfish algorithm and embed the encrypted message into an image. In the later part of the technique, a discrete wavelet transform is applied so that the stego image is decomposed into approximation and detail images. The final reduced image is sent to the receiver, where the reverse of the technique is applied to recover the plain text. Experimentally, the technique is consistent, and the resulting images are found to arouse little suspicion.
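The pipeline can be sketched end to end on a toy 1-D "image". This is a hedged illustration only: the XOR keystream stands in for Blowfish (which is not in the Python standard library), LSB embedding stands in for the paper's embedding, and the one-level Haar transform stands in for the full DWT:

```python
def xor_crypt(data, key):
    """Stand-in stream cipher (NOT Blowfish): XOR with a repeating key."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

def embed_lsb(pixels, payload):
    """Hide the payload bits in the least significant bits of pixel values."""
    bits = [(byte >> k) & 1 for byte in payload for k in range(8)]
    out = pixels[:]
    for i, bit in enumerate(bits):
        out[i] = (out[i] & ~1) | bit
    return out

def extract_lsb(pixels, nbytes):
    bits = [p & 1 for p in pixels[:nbytes * 8]]
    return bytes(sum(bits[i * 8 + k] << k for k in range(8))
                 for i in range(nbytes))

def haar(signal):
    """One level of the discrete Haar wavelet: approximation and detail."""
    approx = [(signal[i] + signal[i + 1]) / 2 for i in range(0, len(signal), 2)]
    detail = [(signal[i] - signal[i + 1]) / 2 for i in range(0, len(signal), 2)]
    return approx, detail

def ihaar(approx, detail):
    """Inverse Haar step; exact for integer inputs, so the stego pixels
    (and hence the hidden bits) survive the transform round trip."""
    out = []
    for a, d in zip(approx, detail):
        out += [a + d, a - d]
    return [round(x) for x in out]

# Encrypt, embed, transform; the receiver inverts, extracts, decrypts.
pixels = list(range(100, 116))          # a 16-pixel toy "image"
stego = embed_lsb(pixels, xor_crypt(b"hi", b"key"))
approx, detail = haar(stego)            # the representation transmitted
```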
In ancient times, voting took place through the Kudavolai system, in which a vote was cast by picking one rolled paper from among many in a bowl. Later, the Electronic Voting Machine (EVM), a simple electronic device, came to record votes in place of the ballot papers and boxes used in the conventional voting system. The fundamental right to vote, or simply voting in elections, forms the basis of democracy. In all earlier elections, whether state or central, a voter used to vote for his or her favourite candidate by putting a stamp against the candidate's name and then folding the ballot paper as prescribed before putting it in the ballot box. This is a long, time-consuming process and very prone to errors. The situation continued until the election scene was completely changed by the electronic voting machine: no more ballot papers, ballot boxes, or stamping; all of this was condensed into a simple box called the ballot unit of the electronic voting machine. Because biometric identifiers cannot be easily misplaced, forged, or shared, they are considered more reliable for person recognition than traditional token- or knowledge-based methods, so electronic voting systems should be improved with current technologies such as biometrics. However, people are often unwilling to poll their votes by standing in a long queue; in order to improve the voting ratio, SMS voting has been introduced. This article gives a complete review of voting devices, the issues involved, a comparison among the voting methods, and the technology support for SMS voting.
Meetings and conferences play a crucial role in workplace dynamics, and active human participation is the vital component of group social dynamics in them. A smart meeting has three phases: capturing, processing, and information exchange. This work aims at identifying the degree of participation, meaning that the involvement and movement of every participant are tracked; this can be used for future decision making. Gestures and eye-gaze frequencies are also taken into account while identifying each participant's degree of participation.
185 An Efficient Real Time Video Multicasting Protocol and WLANs Cross-Layer Optimization in IEEE 802.11n, Gopikrishnan.R, Ms. J.R.Thresphine
Multicast data transmission over the IEEE 802.11n WLAN standard suffers from two problems: poor reliability and low-data-rate broadcast. To overcome these problems, we implement in our project a new MAC-level multicast protocol, REMP (Reliable Efficient Multicast Protocol), which increases both reliability and efficiency. Efficiency is achieved by adjusting the modulation and coding scheme (MCS), and reliability by selective retransmission of corrupted multicast frames. In addition, we implement another protocol, S-REMP (Scalable Reliable Efficient Multicast Protocol), which delivers minimal-quality video to all users while providing higher video quality to users exhibiting better channel conditions. Simulation results for both protocols are presented.
186 Implementation of Password Guessing Resistant Protocol (PGRP) to Prevent Online Attacks, M.YUVARAJ, A.R.BHARATHIDASAN, N.KUMAR
Login protocols have proven inadequate against large-scale online dictionary attacks (e.g., from a botnet of hundreds of thousands of nodes). Brute-force and dictionary attacks on password-only remote login services are now widespread, and providing convenient login for legitimate users while preventing such attacks is a difficult problem. Automated Turing Tests (ATTs) continue to be an effective, easy-to-deploy approach to identifying automated malicious login attempts at a reasonable cost of inconvenience to users. In this paper, we propose a protocol called Password Guessing Resistant Protocol (PGRP), derived by revisiting recent proposals designed to prevent such attacks. While PGRP limits the total number of login attempts from unknown remote users to as low as a single attempt per username, in most cases (e.g., when attempts are made from known, frequently used machines) legitimate users can make several failed login attempts before being challenged with an ATT. We evaluate the performance of PGRP on two real-world data sets and find that it outperforms the existing proposals.
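A simplified decision rule in the spirit of PGRP can be sketched as follows; the threshold, the bookkeeping, and the notion of a "known" source (e.g. a cookie or IP with a past successful login) are illustrative, not the protocol's exact conditions:

```python
class PGRPSketch:
    """Illustrative PGRP-style decision: serve an ATT (e.g. a CAPTCHA)
    only when the source is unknown or a failure threshold is exceeded."""
    def __init__(self, k=3):
        self.k = k            # failed attempts allowed from known machines
        self.known = set()    # (user, source) pairs with a past successful login
        self.fails = {}       # (user, source) -> failed-attempt count

    def requires_att(self, user, source):
        n = self.fails.get((user, source), 0)
        if (user, source) in self.known:
            return n >= self.k        # known machines get k free retries
        return n >= 1                 # unknown sources get a single free attempt

    def report(self, user, source, success):
        if success:
            self.known.add((user, source))
            self.fails.pop((user, source), None)
        else:
            self.fails[(user, source)] = self.fails.get((user, source), 0) + 1
```

The asymmetry is the point: a botnet probing from unknown sources hits an ATT almost immediately, while a legitimate user on a familiar machine can mistype a few times unchallenged.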
187 Selection of Most Relevant Features from High Dimensional Data using IG-GA Hybrid Approach, Ishani Mandli, Prof. Mahesh Panchal
Feature selection is considered a problem of global combinatorial optimization in machine learning: it reduces the number of features, removes irrelevant, noisy, and redundant data, and results in acceptable classification accuracy. Over the past few decades, researchers have developed a large number of feature selection algorithms. These algorithms are designed to serve different purposes and follow different models, and each has its own advantages and disadvantages. Although there have been intensive efforts to survey existing feature selection algorithms, to the best of our knowledge there is still no dedicated repository collecting the representative algorithms to facilitate their comparison and joint study. To fill this gap, in this work an IG-GA hybrid approach with an mRMR evaluation function is presented for high-dimensional data sets.
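The information-gain half of such a hybrid can be sketched as follows (the GA stage, which would search subsets of the IG-ranked features, and the mRMR evaluation are omitted; the toy data is invented for illustration):

```python
from collections import Counter
from math import log2

def entropy(labels):
    """Shannon entropy of a label list, in bits."""
    n = len(labels)
    return -sum((c / n) * log2(c / n) for c in Counter(labels).values())

def information_gain(feature_values, labels):
    """IG(F) = H(Y) - sum_v P(F=v) * H(Y | F=v); features with higher IG
    are better candidates to keep before the GA search stage."""
    n = len(labels)
    cond = 0.0
    for v in set(feature_values):
        subset = [y for f, y in zip(feature_values, labels) if f == v]
        cond += len(subset) / n * entropy(subset)
    return entropy(labels) - cond

labels = [1, 1, 0, 0]
predictive = ["a", "a", "b", "b"]   # perfectly separates the classes
irrelevant = ["a", "b", "a", "b"]   # tells us nothing about the class
```

Ranking by IG first shrinks the GA's search space from all 2^d feature subsets to subsets of a short candidate list.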
Wireless mesh networks (WMNs) have emerged as a key technology for next-generation wireless networking. WMNs consist of mesh routers and mesh clients, where the mesh routers have minimal mobility and form the backbone of the network. WMNs have a large variety of applications in the military and in disaster management, are undergoing rapid progress, and are inspiring numerous applications. However, many technical issues still exist in this field: queuing delay, unpredictable node mobility, wireless multi-hop communication, the limited battery power and wireless range of mobile devices, and the absence of a central coordination authority. This paper presents a study of recent advances and open research issues in WMNs.
189 The Role of Web Content Mining and Web Usage Mining in Improving Search Result Delivery, Ms. Shital C. Patil, Prof. R. R. Keole
In today’s e-world, search engines play a vital role in retrieving and organizing relevant data for various purposes. In practice, however, the relevance of the results produced by search engines is still debatable, because they return an enormous amount of irrelevant and redundant results, while providing relevant information to the user is the primary goal of the website owner. Web mining is a broad and powerful research area concerned with retrieving relevant information from web resources in a faster and better manner. Web content mining improves the searching process and provides relevant information by eliminating redundant and irrelevant content. However, for a broad-topic, ambiguous query, different users may have different search goals when they submit it to a search engine; web usage mining plays an important role in inferring those goals, which can be very useful in improving search engine relevance and user experience. This paper focuses on a combined approach of web usage mining and web content mining.
Research on web performance has mainly concentrated on the network medium over which pages are transmitted. While it is quite clear that the speed of transmission has the greatest effect on the perceived page load time on the client, this research seeks to evaluate the effect of web development practices, particularly the use of PHP frameworks, on the response time of web pages. The goal is to find a way to significantly reduce the response time of framework-based web pages. The technique considered by this research is the use of AJAX technology to load dynamic content: when a user accesses a website for the first time in a session, a general template (consisting of HTML and CSS files) of the page is sent to the browser and displayed before the dynamic content starts loading. The technique exploits the general observation that once users see something appearing on the screen, they become patient and wait for the rest of the page to load.
191 Security Issues in SaaS Delivery Model of Cloud Computing, Aized Amin Soofi, M. Irfan Khan, Ramzan Talib, Umer Sarwar
SaaS (Software as a Service) is one of the main services provided by cloud computing. It features multi-tenancy, which virtually provides the service on a one-to-one basis while physically all users utilize the service at the same time. SaaS has received considerable attention in the past few years, and an increasing number of countries show interest in promoting the SaaS market. Despite this attention, security is one of the major inhibitors of the growth of SaaS: many organizations may still be hesitant to introduce it, mainly because of trust and security concerns, perceiving more risks than benefits in adopting the service. In this study, an attempt is made to discuss the security issues, and their existing solutions, in the SaaS delivery model of cloud computing.
In this paper we propose to apply data mining techniques to the improvement of listening skill, using real data on 100 students from a Tamil Nadu college and methods such as induction rules and decision trees. In the experiments, we first attempt to predict the students' Personality Test outcomes using all the available attributes; next by selecting the best attributes; and finally by rebalancing the data and using cost-sensitive classification. The outcomes have been compared, and the models with the best results are shown.
194 Depth Video Compression Using Weighted Mode Filtering, Ms. MAANVIZHI.J, Mr. K.SIVAKUMAR
In this system, a technique is proposed to compress a depth video taking coding artifacts, spatial resolution, and the dynamic range of the depth data into account. Due to abrupt signal changes on object boundaries, a depth video compressed by conventional video coding standards often introduces serious coding artifacts over object boundaries, which severely affect the quality of a synthesized view. The coding artifacts are suppressed by post-processing based on weighted mode filtering, which is also utilized as an in-loop filter. The weighted mode filtering is attained by a joint histogram process and is applied, guided by the color information, to reconstruct a final solution with the original dynamic range. It additionally suppresses the distortion from the dynamic-range down/up-scaling process by filtering the up-scaled depth values based on neighborhood information, without much degrading the synthesized view quality. The proposed filter is also tailored to efficiently reconstruct the depth video from reduced spatial resolution and low dynamic range: down/up-sampling coding approaches for the spatial resolution and the dynamic range are used together with the filter to further reduce the bit rate. The proposed techniques are verified by applying them to an efficient compression of multi-view-plus-depth data, which has emerged as an efficient data representation for 3-D video. Experimental results show that the proposed techniques significantly reduce the bit rate while achieving better quality of the synthesized view in terms of both objective and subjective measures.
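The joint-histogram weighted mode idea can be illustrated with a 1-D, single-channel sketch (the Gaussian weights, window size, and parameters are assumptions for illustration, not the paper's filter): neighbours vote their depth value into a histogram with a weight from spatial distance and guidance-color similarity, and the bin with the largest total weight wins.

```python
from math import exp

def weighted_mode_filter(depth, color, radius=2, sigma_s=2.0, sigma_c=10.0):
    """For each pixel, vote neighbours' depth values into a histogram,
    weighted by spatial distance and guidance-color similarity, then
    take the depth value with the largest total weight (weighted mode)."""
    out = []
    for i in range(len(depth)):
        hist = {}
        for j in range(max(0, i - radius), min(len(depth), i + radius + 1)):
            w = exp(-((i - j) ** 2) / (2 * sigma_s ** 2)) \
              * exp(-((color[i] - color[j]) ** 2) / (2 * sigma_c ** 2))
            hist[depth[j]] = hist.get(depth[j], 0.0) + w
        out.append(max(hist, key=hist.get))
    return out
```

Unlike an average, the mode never invents an in-between depth, so an isolated coding artifact is removed while a genuine depth edge aligned with a color edge is preserved.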
Vampire attacks are not specific to any particular protocol; rather, they rely on the properties of many popular classes of routing protocols. A single Vampire can increase network-wide energy usage by a factor of O(N), where N is the number of network nodes. This paper uses two attacks on stateless protocols. The first is the carousel attack, in which an adversary sends a packet with a route composed as a series of loops, such that the same node appears in the route many times. The second is the stretch attack, in which a malicious node constructs artificially long source routes, causing packets to traverse a larger-than-optimal number of nodes. Vampire attacks are very difficult to detect and, moreover, very difficult to prevent.
196 Enhanced Privacy ID for Remote Authentication using Direct Anonymous Attestation Scheme, Uma.R, Aravind.P
Anonymizing networks such as Tor allow users to access Internet services privately by using a series of routers to hide the client’s IP address from the server. The success of such networks, however, has been limited by users employing this anonymity for abusive purposes such as defacing popular Web sites. We consider a system in which servers can “blacklist” misbehaving users, thereby blocking them without compromising their anonymity. As cloud services such as Google collect more and more personal data and store them in a centralized manner, the consequences of exposing or leaking an account’s information could be nightmarish, so it is desirable that some measure of data control is available to users. In this paper, we introduce a cryptographic scheme called Enhanced Privacy ID (EPID) for remote, anonymous authentication of a hardware device, performed securely and privately.
197 Intrusion Detection in Mobile Adhoc Network, Mrs. Mugdha Kirkire, Prof. Poonam Gupta
Nowadays, wireless communication is enhancing rapidly as the demand for wireless networks keeps increasing. One of the most popular and fastest-growing networks is the Mobile Ad hoc Network (MANET), as the number of mobile users increases day by day. A MANET is an infrastructureless network, applicable to communication in various fields such as rescue operations, tactical operations, and environmental monitoring. Securing such a demanding network is itself a big challenge: due to its fast-changing topology and other vulnerabilities, providing security in this kind of network is difficult but essential. To secure the network, we have to detect attacks and take appropriate action on them. In our survey of MANETs we find that some attack signatures depend on previous attack signatures; matching only known types of intrusive actions would allow new or undocumented types of attacks, derived from previous ones, to go unnoticed. To apply intrusion detection, this paper introduces an acknowledgement-based approach and a trust-based approach for detecting intrusions in a MANET, using techniques such as monitoring and a multicasting algorithm. Our proposed system looks for the occurrence of patterns that can be considered attacks, and is divided into two main parts: 1) detecting intruder attacks in the MANET; and 2) developing network safety by describing network behavior structures that point out offensive use of each type of attack, and applying local or network-wide intrusion detection.
198 ENTITY SEARCH ENGINES, Pinky Paul, Mr. Thomas George
This review paper presents a study of entity search engines. It describes the architecture and working of an entity search engine and the different methods adopted for entity extraction, illustrated with working examples such as EntityCube and Microsoft Academic Search. An entity search engine searches summaries of an entity, making the user's search easy: it extracts all the entities and relationships from heterogeneous web pages through different techniques and finally integrates all the extracted information into a single unit.
199 Data Services For E-Tailers Leveraging Web Search Engine Assets- A Review, Shaleena K.P, Thomas George
This review paper presents a study of how data services can be provided to an e-tailer by mining web search engine assets. It describes in detail some of the existing approaches, from mining data in large databases to methods that mine the query log of a search engine. It gives a brief description of some string matching techniques, and also discusses the techniques behind each of the data services presented.
Cloud is an emerging technology used in recent trends, including WS-BA (Web Services Business Activity). In this paper we propose providing more security for atomic transactions in web services. Consider an online net banking system: the user enters a user id and password to access account details, can view all accounts across all branches of the bank online, and can make fund transfers in real time within the bank network. The fund transfers are stored in the bank's database using a set of services, and if a crash or data loss occurs in the database, a replica is used for reliable transactions via a BFT (Byzantine fault tolerance) algorithm. The services include the Activation, Registration, Completion, and Coordinator services. The Activation service creates a Coordinator object and a transaction context for each transaction; essentially, it behaves like a factory that creates Coordinator objects. The Registration service allows the participants and the initiator to register their endpoint references. The Completion service allows the initiator to signal the start of the distributed commit. The Coordinator service runs the 2PC protocol, which ensures atomic commitment of the distributed transaction.
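The Coordinator's 2PC role can be sketched as follows; the participant interface and state names are illustrative, not the WS-BA API:

```python
class Participant:
    """Resource manager stub; `vote` is its answer in the prepare phase."""
    def __init__(self, vote=True):
        self.vote, self.state = vote, "init"
    def prepare(self):
        self.state = "prepared" if self.vote else "aborted"
        return self.vote
    def commit(self):
        self.state = "committed"
    def rollback(self):
        self.state = "aborted"

def two_phase_commit(participants):
    """Coordinator: commit only if every participant votes yes in the
    prepare phase; otherwise roll everyone back."""
    if all(p.prepare() for p in participants):
        for p in participants:
            p.commit()
        return "committed"
    for p in participants:
        p.rollback()
    return "aborted"
```

Atomicity comes from the all-or-nothing branch: either every participant commits, or every participant ends up rolled back.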
201 A Mechanism to Adjust the Updated Data and Recovery for School, Hussain Abdulkareem Younis, Hassen Mohssen Audafa, Dr. Hameed A. Younis
The manual systems used to manage records have become a problem in the present time, after the tremendous progress in the information technology field. Due to the increased number of people and the volume of their information, it is no longer easy to work with the classic manual system; the solution is an automated system, but this generates a new problem: how to control data updates. In the manual system, an update is done by crossing out or with correction ink, a manner usable only for a limited number of updates. In an automated system, the problems are how to know whether the data has been updated, how many times it has been updated, and whether recovery is possible. The system developed in this paper performs the processes of an automated system as a substitute for the manual system, and also solves the update-control problem by giving the manager the ability to monitor modifications and the capability to recover old data.
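A minimal sketch of such update control: instead of overwriting a value, every update is appended to a history, so the manager can count the modifications and recover any earlier version (the class and method names are invented for illustration, not the paper's design):

```python
class VersionedRecord:
    """Keeps every update so modifications can be audited and any
    previous value recovered."""
    def __init__(self, value):
        self.history = [value]          # history[0] is the original value

    @property
    def value(self):
        return self.history[-1]         # the current value

    def update(self, new_value):
        self.history.append(new_value)  # never overwrite, always append

    def times_updated(self):
        return len(self.history) - 1

    def recover(self, version=0):
        """Roll back to an earlier version (0 = original), recorded as
        a new update so the audit trail itself stays complete."""
        self.history.append(self.history[version])
```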
ZigBee is based on the IEEE 802.15.4 standard and was designed for wireless control and sensor networking. It provides self-organized, multi-hop, reliable networking with long battery lifetime, and ZigBee standards have been developed to provide simple, low-cost, battery-efficient wireless devices. Mobility is part of the ZigBee vision, yet it is difficult to provide connections to and from mobile end devices: due to their movement, data delivery failures occur in ZigBee wireless networks. To locate misplaced end devices, a broadcasting method is used to lessen the effects of mobility, but it consumes a large amount of resources in terms of bandwidth and power. Recently, the ZigBee Node Deployment and Tree Construction (ZNDTC) framework was proposed to reduce such resource consumption and to provide efficient data transmission between the coordinator and mobile end devices. An adaptive transmission rate and bandwidth utilization technique is then introduced to improve network throughput: the transmission rate of a flow is increased while the flow experiences no data loss, and is otherwise managed based on network traffic.
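The adaptive-rate rule ("raise while there is no loss") can be sketched as a one-step update; the additive/multiplicative shape and all constants are illustrative assumptions, not taken from the ZNDTC work:

```python
def adapt_rate(rate, loss, step=1.0, factor=0.5, max_rate=250.0):
    """AIMD-style sketch: raise the transmission rate additively while the
    flow sees no data loss; back off multiplicatively when loss appears.
    (Parameters are illustrative; max_rate caps at an assumed kb/s limit.)"""
    if loss:
        return max(rate * factor, 1.0)   # never drop below a floor rate
    return min(rate + step, max_rate)    # probe upward while loss-free
```

Calling this once per feedback interval lets throughput climb under good conditions while reacting quickly to congestion.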
As the Internet is applied in ever more fields, more and more network security issues emerge and catch people’s attention. Adversaries, however, often hide themselves by spoofing their own IP addresses before launching attacks. For this reason, researchers have proposed many traceback schemes to trace the source of such attacks. Some use only one packet in their packet-logging schemes to achieve IP tracking; others combine packet marking with packet logging, creating hybrid IP traceback schemes that demand less storage but require a longer search. In this paper, we propose a new hybrid IP traceback scheme with efficient packet logging, aiming at a fixed storage requirement for each router (under 320 KB, according to CAIDA’s Skitter data set) without the need to refresh the logged tracking information, and at zero false positive and false negative rates in attack-path reconstruction. In addition, we use a packet’s marking field to censor attack traffic on its upstream routers. Lastly, we simulate and analyze our scheme, in comparison with other related research, in the following aspects: storage requirement, computation, and accuracy.
204 A Survey on Quality of Service for Optimized Link State Routing Protocol in Mobile Ad hoc Network, Jalpesh D. Ghumaliya, Sandip Chauhan
A wireless Mobile Ad hoc NETwork (MANET) is a special type of wireless network with no wired infrastructure to support communication between nodes. Quality of Service (QoS) support in the Internet has been widely investigated, but such efforts are unsuitable for MANETs, which introduce bandwidth constraints and a dynamic network topology. In a MANET, routing protocols play a significant role in performance because they determine the way packets are sent and received between mobile nodes, where all nodes are free to move about arbitrarily and configure themselves; each node acts both as a router and as a host, and even the network topology may change rapidly. In this paper we study the OLSR routing protocol across various reputed papers. The key concept used in the protocol is that of Multi-Point Relays (MPRs): selected nodes that forward broadcast messages during the flooding process. The objective is to make observations about how network performance with the OLSR routing protocol can be enhanced.
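The MPR concept can be sketched with the greedy covering heuristic commonly associated with OLSR (the full RFC 3626 rule also considers node willingness and first picks neighbours that are the sole cover of some 2-hop neighbour; this sketch keeps only the greedy core, and the toy topology is invented):

```python
def select_mprs(one_hop, two_hop_of):
    """Pick 1-hop neighbours until every strict 2-hop neighbour is covered
    by at least one selected relay (greedy max-coverage heuristic).
    `two_hop_of` maps each 1-hop neighbour to the set of its neighbours."""
    uncovered = set().union(*two_hop_of.values()) - set(one_hop)
    mprs = set()
    while uncovered:
        # choose the neighbour that covers the most still-uncovered nodes
        best = max(one_hop, key=lambda n: len(two_hop_of[n] & uncovered))
        if not two_hop_of[best] & uncovered:
            break                       # remaining nodes are unreachable
        mprs.add(best)
        uncovered -= two_hop_of[best]
    return mprs

neigh = ["a", "b", "c"]
reach = {"a": {"x", "y"}, "b": {"y"}, "c": {"z"}}
```

Only MPRs retransmit broadcast messages, which is how OLSR shrinks flooding overhead while still reaching every 2-hop neighbour.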
205 OSN User Filtered Walls for Unwanted Messages Using Content Mining, Ms. Shruti C. Belsare, Prof. R.R. Keole
A key requirement in today’s On-line Social Networks (OSNs) is to give users the ability to control the messages posted on their own private space, so that unwanted content is not displayed. Up to now, OSNs have provided little support for this requirement. To fill the gap, in this paper we propose a system allowing OSN users to exercise direct control over the messages posted on their walls. This is achieved through a flexible rule-based system that lets users customize the filtering criteria applied to their walls, and a machine-learning-based soft classifier that automatically labels messages in support of content-based filtering. We also study strategies and techniques limiting the inferences a user can make about the enforced filtering rules with the aim of bypassing the filtering system, e.g., by randomly notifying a message that should instead be blocked, or by detecting modifications to profile attributes made for the sole purpose of defeating the filtering system. The user automatically receives a mail notification.
The proposed system uses features of microscopic images, examining changes in texture, geometry, and color together with statistical analysis; these changes are used as classifier inputs. The presented method shows how effective an automatic morphological method can be in identifying Acute Lymphocytic Leukemia (ALL) from microscope images of blood samples. First the system identifies the leukocytes among the other blood cells; then it recognizes the lymphocyte cells (the cells affected by acute leukemia), evaluates morphological indexes from those cells, and finally classifies for the presence of leukemia. The method also includes 2D PCA for feature extraction, along with separation of nucleus and cytoplasm and other cellular features.
Vehicular ad hoc networks (VANETs) are highly mobile wireless networks designed to support vehicle safety. Our new protocol combines features of on-demand routing with position-based geographic routing in a manner that efficiently uses all the position data available, and gracefully falls back to on-demand routing as the position information degrades. In this paper we present a hybrid routing scheme that integrates the characteristics of on-demand and proactive routing, showing greater scalability under mobility and collision load. We demonstrate that the new hybrid protocol outperforms on-demand routing protocols (such as AODV), geographic routing protocols (such as GPSR), and adaptive hybrid routing protocols under high mobility and collision load, and the improvement grows with both. We show through analysis that our protocol is scalable and has optimal overhead, even in the presence of position errors. It provides an enhanced yet pragmatic location-enabled solution deployable in all VANET-type environments: QOS-AR is simple to deploy and effectively attains optimal scalability, making it an ideal candidate routing protocol for emerging VANETs. By selecting the forwarding nodes carefully, it reduces delay and increases energy efficiency and throughput.
208 Critical Analysis of Cloud Computing Using OpenStack, Paramjot Singh, Vishal Pratap Singh, Gaurav Pachauri
The IT world is changing: the evolution of cloud computing in the recent past has brought drastic change to the IT field. A major advantage of cloud computing is that hardware need not be upgraded, since cloud services provide everything on a demand basis; consuming electricity is the best analogy by which cloud computing can be explained: just pay for whatever you use. In this paper, we present a comparative study of leading cloud providers such as Amazon Cloud Services, Rackspace powered by OpenStack, and other open-source cloud providers. We further discuss how to implement OpenStack, or simply try it out via DevStack and TryStack for testing purposes, and finally cover the releases and recent work going on in OpenStack. The aim of this paper is to show the importance of OpenStack as a cloud provider and how to get started with it.
209 Polarity Classification Using Twitter Data, Paramjot Singh
Polarity classification over Twitter offers organizations a fast and effective way to monitor the feelings and emotions of the general public towards their brand, business, politicians, etc. A wide range of features for training polarity classifiers on Twitter datasets has been researched in recent years, with varying results. In this paper, we introduce a novel approach for automatically classifying the polarity of Twitter messages by adding semantics as additional features. Messages are classified as positive, negative, or neutral with respect to a query term. The paper focuses on polarity classification for product features in product reviews by building semantic associations between product features and polarity words. The results show that our method is encouraging.
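A toy lexicon-based sketch of the core polarity step, with a minimal semantic layer attaching the tweet's polarity to product-feature concepts; the word lists and the concept mapping are invented for illustration, and the paper's semantic features are far richer:

```python
POSITIVE = {"good", "great", "love", "excellent", "happy"}
NEGATIVE = {"bad", "terrible", "hate", "poor", "awful"}
CONCEPTS = {"battery": "HARDWARE", "screen": "HARDWARE"}  # toy semantic layer

def classify(tweet):
    """Polarity = sign of (positive hits minus negative hits) over tokens."""
    tokens = tweet.lower().split()
    score = sum((t in POSITIVE) - (t in NEGATIVE) for t in tokens)
    return "positive" if score > 0 else "negative" if score < 0 else "neutral"

def feature_polarity(tweet):
    """Associate the tweet's polarity with each product-feature concept
    it mentions, mimicking the feature/polarity association above."""
    polarity = classify(tweet)
    return {CONCEPTS[t]: polarity
            for t in tweet.lower().split() if t in CONCEPTS}
```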
210 Defending Against Attack in Heterogeneous Networks, M. Mukesh Krishnan, R. Ravi
In Disruption Tolerant Networks (DTNs), a flood attack occurs when packets or packet replicas are sent continuously from source to destination, normally causing packet loss and inconsistency. To counter flood attacks, a rate limit is set in each node, so that nodes accept only a bounded amount of data. Our detection adopts claim-carry-and-check: each node counts the packets or replicas it has sent and claims the count to other nodes; the receiving nodes carry the claims as they move and cross-check whether their carried claims are inconsistent when they meet. When a node violates its rate limit, it is detected and its data traffic is filtered, thereby reducing the amount of traffic. To avoid data loss, we further propose a Distributed Dynamic Routing Algorithm, which dynamically provides the best path in the network for effective communication by choosing a path from source to destination through intermediate nodes at random. Since the network possesses parallel communication, the transmission time is very low; and since the protocol transmits data randomly and dynamically, communication is secure and efficient, without malicious activity.
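A minimal sketch of the claim-carry-and-check bookkeeping (the real scheme exchanges signed claims between many nodes; here a single inspector caches the highest count seen per sender, and the API is invented for illustration):

```python
class ClaimCheck:
    """Each packet carries the sender's own transmission count. A sender is
    flagged when its claimed counts are inconsistent (a count is reused or
    decreases) or exceed its rate limit."""
    def __init__(self, rate_limit):
        self.rate_limit = rate_limit
        self.claims = {}               # sender -> highest count seen so far

    def inspect(self, sender, claimed_count):
        prev = self.claims.get(sender, 0)
        if claimed_count <= prev or claimed_count > self.rate_limit:
            return "attacker"          # inconsistent claim or over the limit
        self.claims[sender] = claimed_count
        return "ok"
```

A flooder must either claim an honest count (and immediately exceed its limit) or reuse a low count (and be caught by the cross-check), which is the dilemma the scheme exploits.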
We introduce a design for future 5G mobile systems in heterogeneous wireless networks. Compared with the situation today, mobile access networks will face significant challenges by 2020. The paper throws light on the evolution and development of the various generations of mobile wireless technology, along with their significance and the advantages of each over its predecessor. Taking the situation of today as its starting point, the paper pinpoints important focus areas and potential solutions for designing an energy-efficient 5G mobile network architecture. These include the system architecture, where a logical separation of data and control planes is seen as a promising solution; network deployment, where (heterogeneous) ultra-dense layouts will have a positive effect; and radio transmission. In the near future, mobile cloud computing (MCC) is also expected to benefit enterprises by improving network manageability and maintenance.
Mobile cloud computing is an emerging technology for improving the quality of mobile services. Together with the explosive growth of mobile applications and the emergence of the cloud computing concept, mobile cloud computing (MCC) has been introduced as a promising technology for mobile services. MCC integrates cloud computing into the mobile environment and overcomes obstacles related to performance (e.g., battery life, storage, and bandwidth), environment (e.g., heterogeneity, scalability, and availability), and security (e.g., reliability and privacy) encountered in mobile computing. In this paper, we describe what mobile cloud computing is, including its scope, current developments, and research challenges, and list some of the major issues in MCC; one of the key issues is the end-to-end delay in servicing a request. We also compare mobile cloud computing with cloud computing and present the architecture of mobile cloud computing, its developing areas, and its applications.
213 Source Anonymous Message Authentication Based On ECC in Wireless Sensor Networks, B.Renugadevi, T.John Peter
Source Anonymous Message Authentication (SAMA) is one of the most effective ways to prevent unauthorized and corrupted messages from being forwarded in wireless sensor networks (WSNs). A scalable authentication scheme based on elliptic curve cryptography (ECC) is introduced to allow any node to transmit an unlimited number of messages without suffering the threshold problem, while providing message source privacy. For each message, the sending node generates a source anonymous message authenticator based on the MES scheme on elliptic curves. An efficient key management framework is introduced to ensure isolation of compromised nodes. ECC reduces computational and communication overhead at comparable security levels while providing message source privacy.
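The ECC primitives such a scheme builds on can be illustrated with toy parameters. The curve y^2 = x^3 + 2x + 2 over GF(17) with base point (5, 1) is a common textbook example; real deployments use standardized curves and vetted cryptographic libraries.

```python
# Toy elliptic-curve arithmetic over a small prime field, illustrating the
# primitives (point addition, scalar multiplication) that ECC schemes use.
P, A = 17, 2          # field prime and curve coefficient a
O = None              # point at infinity

def ec_add(p1, p2):
    """Add two affine points on y^2 = x^3 + A*x + 2 (mod P)."""
    if p1 is O:
        return p2
    if p2 is O:
        return p1
    (x1, y1), (x2, y2) = p1, p2
    if x1 == x2 and (y1 + y2) % P == 0:
        return O                                   # inverse points
    if p1 == p2:
        m = (3 * x1 * x1 + A) * pow(2 * y1, -1, P) % P   # tangent slope
    else:
        m = (y2 - y1) * pow(x2 - x1, -1, P) % P          # chord slope
    x3 = (m * m - x1 - x2) % P
    return (x3, (m * (x1 - x3) - y1) % P)

def ec_mul(k, point):
    """Scalar multiplication k*point by double-and-add."""
    result = O
    while k:
        if k & 1:
            result = ec_add(result, point)
        point = ec_add(point, point)
        k >>= 1
    return result

G = (5, 1)
print(ec_mul(2, G))  # → (6, 3)
```

The three-argument `pow` with a `-1` exponent (Python 3.8+) computes the modular inverse used in the slope calculations. The base point here has order 19, so `ec_mul(19, G)` returns the point at infinity.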
This project concerns the design and implementation of a floating-point adder architecture using reversible logic, improving the design in terms of the number of garbage outputs and the number of gates used. In recent years, reversible logic has emerged as a promising technology with applications in low-power CMOS, quantum computing, nanotechnology and optical computing, owing to its zero power dissipation under ideal conditions. In this paper, a reversible-logic floating-point adder architecture that closely follows the IEEE 754 specification for binary floating-point arithmetic is designed so as to minimize the number of gates used and their garbage outputs. The existing and proposed floating-point adder architectures are described in Verilog and simulated using the Xilinx ISE 9.1 tool.
215 Analysis of Data Confidentiality Techniques in Cloud Computing, Ms. Mayuri R. Gawande, Mr. Arvind S. Kapse
Cloud Computing is one of the emerging technologies in Computer Science. The cloud provides various types of services to us. Database outsourcing is a recent data management paradigm in which the data owner stores confidential data at a third-party service provider's site. The service provider is responsible for managing and administering the database and allows the data owner and clients to create, update, delete and access the database. There are chances of the security of the data being compromised due to the untrustworthiness of the service provider, so securing data outsourced to a third party is a great challenge. The major requirements for achieving security in outsourced databases are confidentiality, privacy, integrity and availability. To achieve these requirements, various data confidentiality mechanisms such as the fragmentation approach and the High-Performance Anonymization Engine approach are available. In this paper, various mechanisms for implementing data confidentiality in cloud computing are analyzed in great detail, along with their usefulness.
216 Analysis of Malware Detection Techniques in Android, Ms. Prajakta D. Sawle, Prof. A. B. Gadicha
The malware threat for mobile phones is expected to increase with the functionality enhancement of mobile phones. This threat is compounded by the surge in the population of smartphones with stable Internet access, which provides attractive targets for malware developers. Android is currently the most popular smartphone operating system. Due to this popularity, and also to its open source nature, Android-based smartphones are now an ideal target for attackers. Since the number of malware samples designed for Android devices is increasing fast, Android users are looking for security solutions aimed at preventing malicious actions from damaging their smartphones. Anti-malware products promise to effectively protect against malware on mobile devices, and many products are available for free or at reasonable prices. From this perspective, we propose and analyse some potential limitation-oriented techniques for effective malware detection.
217 A Survey on Privacy Preservation in Data Publishing, V. Shyamala Susan, Dr. T. Christopher
Privacy preservation is the most concentrated issue in data publishing, as sensitive information should not be leaked. To this end, several techniques such as generalization and bucketization have been proposed to deal with privacy preservation. However, generalization fails on high-dimensional data because of the curse of dimensionality, and it causes information loss due to uniform distribution. Bucketization, on the other hand, cannot prevent membership disclosure. All the above-mentioned shortcomings are overcome by a technique named slicing, which can also handle high-dimensional data. It is proposed that slicing be combined with the algorithm in order to increase data utility and privacy.
218 Error Detection in Decoding of Euclidean Geometry Low Density Parity Check (EG-LDPC) Codes, K. ADINARAYANA, J. RAVI
In a recent paper, a method was proposed to accelerate the majority logic decoding of difference set low density parity check codes. This is useful as majority logic decoding can be implemented serially with simple hardware but requires a large decoding time. For memory applications, this increases the memory access time. The method detects whether a word has errors in the first iterations of majority logic decoding, and when there are no errors the decoding ends without completing the rest of the iterations. Since most words in a memory will be error-free, the average decoding time is greatly reduced. In this brief, we study the application of a similar technique to a class of Euclidean geometry low density parity check (EG-LDPC) codes that are one step majority logic decodable. The results obtained show that the method is also effective for EG-LDPC codes. Extensive simulation results are given to accurately estimate the probability of error detection for different code sizes and numbers of errors.
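The early-termination control flow can be sketched as below; a Hamming(7,4) check matrix stands in for the EG-LDPC parity checks, since the point here is the zero-syndrome shortcut rather than the specific code.

```python
# Sketch of the early-termination idea: compute the parity checks once and,
# if the word is already consistent (zero syndrome), skip the remaining
# majority-logic iterations. The Hamming(7,4) matrix is a stand-in.
H = [  # each row is one parity-check equation over GF(2)
    [1, 0, 1, 0, 1, 0, 1],
    [0, 1, 1, 0, 0, 1, 1],
    [0, 0, 0, 1, 1, 1, 1],
]

def syndrome(word):
    return [sum(h * b for h, b in zip(row, word)) % 2 for row in H]

def decode_with_early_stop(word, max_iters=4):
    """Return (word, iterations_used); stop after one check if error-free."""
    if all(s == 0 for s in syndrome(word)):
        return word, 1                 # most memory words take this fast path
    for _ in range(2, max_iters + 1):
        pass  # placeholder for the remaining majority-logic iterations
    return word, max_iters

clean = [0, 1, 1, 0, 0, 1, 1]          # a valid Hamming(7,4) codeword
print(decode_with_early_stop(clean)[1])  # → 1: decoding ended early
```

Since most words read from a memory are error-free, the average number of iterations (and hence the memory access time) drops toward one, which is the effect the paper measures for EG-LDPC codes.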
219 A NEW APPROACH TO IMPROVE BUSINESS USING SEO TECHNIQUES, Sreeharsha Muram, VenkataRamana.K, Rajasekhar Kunisetty
SEO stands for "Search Engine Optimization": increasing a website's results via paid or natural means. Search engine optimisation should not be seen as an end in itself; it is a function undertaken to improve the overall commercial performance of a web site. The role of SEO is to legitimately improve rankings. There are few genuine guarantees of a top placement, particularly for highly competitive search terms; good SEO will improve a web site's ranking across a range of selected terms. Any process whereby a search engine is illicitly manipulated in order to guarantee a high placement, however, is referred to as spamming. The successful execution of a search engine optimisation project requires skills in the areas of analysis, research, planning, copywriting and communication. A comprehensive search engine optimisation project is divided into four interrelated phases. 1. Pre-site activities – the research and planning activities undertaken before an existing or new site or page is actually touched or built. 2. On-site activities – the activities directly involved in the content and design of web pages. 3. Off-site activities – building a portfolio of quality inbound links to your website. 4. Post-site activities – analysing and responding to site traffic and user feedback once a website has been optimised. Effective SEO is a continuous activity.
220 Security for Privileged Accounts Using Break-Glass Technique, Arun.S, Mohanasundarm.A, Bhoopathi Siva.K
Break-glass within computing is a term used to describe the act of checking out a system account password for use by a human. It is generally used for the highest-level system accounts, such as root on Unix or SYS/SA for databases. These accounts are highly privileged and not in themselves individualized to a specific human, so break-glass instead limits them by password time duration, with the aim of controlling and reducing the account's usage to that which is necessary. Break-glass has been examined in a number of publications applied to medical systems. What is currently missing is an accurate translation of the original break-glass concepts, especially as applied to high-security environments such as banking. This paper provides a description of how break-glass is evolving into a broader method of time-based access control. Finally, we propose how time-based access control and break-glass can be varied adaptively based on threat level.
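The time-bounded checkout at the heart of break-glass can be sketched with a simple in-memory store; the class, account names and the returned placeholder secret are illustrative.

```python
# A minimal sketch of time-bounded break-glass checkout, assuming a simple
# in-memory store; names and policy values are illustrative.
import time

class BreakGlassVault:
    def __init__(self, duration_seconds: float):
        self.duration = duration_seconds
        self.checkouts = {}   # account -> (user, expiry timestamp)

    def check_out(self, account: str, user: str, now=None) -> str:
        """Check out a privileged account password for a limited window."""
        now = time.time() if now is None else now
        self.checkouts[account] = (user, now + self.duration)
        return f"password-for-{account}"   # stand-in for the real secret

    def is_valid(self, account: str, now=None) -> bool:
        """The checkout expires automatically after the duration."""
        now = time.time() if now is None else now
        entry = self.checkouts.get(account)
        return entry is not None and now < entry[1]

vault = BreakGlassVault(duration_seconds=600)
vault.check_out("root", "alice", now=0)
print(vault.is_valid("root", now=300))   # → True: within the window
print(vault.is_valid("root", now=900))   # → False: checkout expired
```

Adaptive break-glass, as proposed above, would shorten `duration_seconds` (or require extra approvals) as the threat level rises.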
221 Alleviating Internal Data Theft Attacks by Decoy Technology in Cloud, I.Sudha, A.Kannaki, S.Jeevidha
Cloud computing enables many users to share common computing resources and to access and store their personal and business information. These innovations bring advantages but also disadvantages, and new security challenges have arisen. The growth in the number of cloud users follows from the number of World Wide Web users on the Internet. Users who hold valid credentials, consisting of a username and password, are treated as insiders; from a security perspective, however, all remote users must be regarded as potential attackers. Some active security mechanisms fail to prevent data theft attacks, so the system must make sure that a remote user is not an attacker. We propose a new approach for securing data in the cloud using user profiling and decoy technology. When unauthorized access is suspected and then confirmed using various challenge questions, we initiate a disinformation attack by returning a huge amount of decoy information to the attacker. This approach protects against the misuse of the original user's data. When a decoy document is loaded into memory, we verify that it is a decoy document by computing an HMAC based on the entire contents of that document.
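The HMAC check on decoy documents described above can be sketched directly with the standard library; the key and document contents here are illustrative.

```python
# Sketch of the HMAC check: a decoy document carries an HMAC computed over
# its full contents with a key known only to the detector. Key and contents
# are illustrative values.
import hashlib
import hmac

SECRET_KEY = b"decoy-detection-key"   # hypothetical per-deployment key

def decoy_tag(contents: bytes) -> bytes:
    """HMAC-SHA256 over all of the document's contents."""
    return hmac.new(SECRET_KEY, contents, hashlib.sha256).digest()

def is_decoy(contents: bytes, embedded_tag: bytes) -> bool:
    """A loaded document is a decoy iff its embedded tag verifies."""
    return hmac.compare_digest(decoy_tag(contents), embedded_tag)

doc = b"Quarterly results: entirely fabricated numbers"
tag = decoy_tag(doc)
print(is_decoy(doc, tag))                # → True: recognized decoy
print(is_decoy(b"real user data", tag))  # → False: genuine document
```

`hmac.compare_digest` is used instead of `==` so the comparison runs in constant time, avoiding a timing side channel.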
222 Dynamic Data Aggregation Prediction Based Clustering to Mobile Sink in Wireless Sensor Networks, M.Suganthi, Mrs. Susmita Mishra
Wireless Sensor Networks are a fast-advancing technology that has opened up many opportunities in the field of data reporting and monitoring. A WSN is a collection of sensor nodes that report data to the base station, which increases energy consumption and traffic. To avoid network traffic and prolong network lifetime, a clustering scheme is used. Mobile sinks can easily move through the deployed area to reduce data acquisition and gathering time. Therefore, an efficient clustering and prediction-based routing protocol (EECPA) can be used to predict the mobile sink's movement so as to minimize energy consumption and effectively transmit the aggregated data to the sink.
223 Enhancing Search Engine Optimization using Web Content Mining, Mr. S.Balamurugan, Dr. S.Thirunirai Senthil
Most websites suffer from technological issues, some of which webmasters are conscious of and some of which they are not aware. A site's capacity depends on the background work done in web content mining, whether in a changed domain, in web mining generally, or in the common mistakes often made by webmasters. Deciding to hire an SEO is a big decision that can potentially improve your site and save time, but you also risk damage to your site and reputation, so research the potential advantages as well as the damage that an irresponsible SEO can do to your site. Web content mining is not just a means of measuring the privacy of data; it can also be used as a tool for business and market research, and to assess and improve the effectiveness of a web site. By understanding how and when your prospects and clients are interacting with your content and applications, you can successfully optimize your online presence according to both their needs and your business goals.
224 A Survey on Wavelet Domain Techniques for Image Super Resolution, Saranya P, Fatimakani K, Kanchanadevi P, Venkatesan S, Govindaraju S
The main objective of super-resolution (SR) imaging is to reconstruct a high-resolution (HR) image of a scene from one or more low-resolution images of the scene. In resolution enhancement of images, the main loss is in the high-frequency components (edges) of the image, due to the smoothing caused by interpolation. Hence, to enhance the quality of the super-resolved image, preserving the edges is essential. In this paper we study various image resolution enhancement techniques that utilize Wavelet Transform (WT) techniques. The paper compares image resolution enhancement methods that employ the discrete wavelet transform (DWT), stationary wavelet transform (SWT), dual tree complex wavelet transform (DT-CWT), wavelet zero padding (WZP), and cycle spinning (CS). To enhance the contrast of the image, singular value decomposition (SVD) is employed with the wavelet transform, in which the singular value matrix gives the illumination content; by modifying that value, the contrast of the given image is increased. Simulation experiments have been performed on a variety of images using Matlab, and results were compared using peak signal-to-noise ratio (PSNR).
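The PSNR comparison metric mentioned above is straightforward to state in code; this pure-Python sketch works on flat lists of 8-bit pixel values.

```python
# PSNR for 8-bit images: 10*log10(peak^2 / MSE), in dB. Pixel values here
# are toy samples; real comparisons run over whole images.
import math

def psnr(original, reconstructed, peak=255):
    """Peak signal-to-noise ratio between two equal-length pixel lists."""
    mse = sum((a - b) ** 2 for a, b in zip(original, reconstructed)) / len(original)
    if mse == 0:
        return float("inf")   # identical images
    return 10 * math.log10(peak ** 2 / mse)

ref = [52, 55, 61, 59, 79, 61, 76, 61]
test = [52, 54, 61, 59, 78, 61, 76, 60]
print(round(psnr(ref, test), 2))  # ≈ 52.39 dB
```

Higher PSNR means the super-resolved image is closer to the reference; identical images give infinite PSNR because the mean squared error is zero.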
225 Special Scheme of Hidden Data Procession for Several Applications in Wireless Sensor Networks, Janarish Saju C, Sharon Nisha M
In wireless sensor networks, hidden data procession is the concept of collecting, summarizing and combining sensor nodes' data in order to reduce the amount of data transmission in the network. Previous studies have applied homomorphic encryption algorithms to hide data from sensor nodes during aggregation. However, the principle involved in these algorithms does not satisfy several applications in the sensor environment; compromise in the case of a sensor node attack cannot be prevented; and the number of messages aggregated cannot be detected, nor whether a message is a duplicate copy. Therefore a special scheme, "Hidden Data Procession", has been introduced as an extended form of the CRT (Chinese Remainder Theorem) algorithm, with security provided through a key distribution technique. It has three methodologies addressing the problems above: it is designed for varied application environments; it prevents compromised-node attacks; and it applies a special counting capability to prevent unauthorized sensed data. All functions are implemented in a database-as-a-service model, and queries are aggregated separately in encrypted form, which ensures security in hidden data procession.
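The CRT recombination step underlying such a scheme can be illustrated with toy moduli; the security machinery (key distribution, duplicate counting) is omitted here.

```python
# Core Chinese Remainder Theorem step: recover a value from its residues
# modulo pairwise-coprime moduli. Moduli and the secret are toy values.
from math import prod

def crt_combine(residues, moduli):
    """Recover x mod prod(moduli) from x mod m_i (moduli pairwise coprime)."""
    M = prod(moduli)
    x = 0
    for r, m in zip(residues, moduli):
        Mi = M // m
        x += r * Mi * pow(Mi, -1, m)   # Mi * Mi^{-1} ≡ 1 (mod m)
    return x % M

moduli = [5, 7, 9]                     # pairwise coprime, product 315
secret = 123
residues = [secret % m for m in moduli]
print(crt_combine(residues, moduli))   # → 123
```

Each node holding only one residue learns nothing definite about the value, yet the party holding all residues (the aggregator) recovers it exactly, which is the property CRT-based aggregation schemes exploit.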
We propose a novel routing metric, Expected Forwarded Counter (EFW), to cope with the problem of selfish behavior (packet dropping) by mesh routers in Wireless Mesh Networks (WMNs). Wireless Mesh Networks have emerged as a flexible and low-cost network infrastructure, where heterogeneous mesh routers managed by different users collaborate to extend network coverage. EFW combines routing-layer observations of forwarding behavior with MAC-layer measurements of wireless link quality to select the most reliable and highest-performance path. The proposed metric is evaluated through both simulations and real-life deployments on two different wireless testbeds, in a comparative analysis with the On-Demand Secure Byzantine Resilient Routing (ODSBR) protocol and the Expected Transmission Counter (ETX). The cross-layer metric accurately captures path reliability and increases WMN performance even when a high percentage of network nodes misbehave.
227 An Assessment of Android Antimalware that Detects Malicious Dynamic Code in Apps, Miss. Srushti Hatwar, Prof. Chetan Shelke
Android is currently the most popular operating system, and a considerable number of smartphones and tablet computers ship with Android. However, users feel their private information is at threat, facing a rapidly increasing amount of malware for Android which significantly exceeds that of other platforms. Anti-malware software promises to effectively protect against malware on smartphones, and many products are available for free or at reasonable prices. We systematically analyze the security implications of the ability to load malicious dynamic code in Android apps. We assess an Android anti-malware tool's ability to detect attempts to load malicious code, and from the study of many online applications we observed that code loaded in an unprotected way is a major issue. We also show how malware can use code-loading techniques to avoid detection by exploiting a conceptual weakness in current Android malware protection.
228 Field Oriented Control of Permanent Magnet Synchronous Motor, P.Ramesh, RachaPrathyusha
Today, "electric platform"-based commercial and military aerospace, land vehicles and weapon systems are using adjustable-speed PM motor systems to replace older fixed-speed motors that have mechanical gearboxes. The PMSM (Permanent Magnet Synchronous Motor) has been increasingly used in many high-performance applications due to its advantages of high power density, high power factor and high efficiency. Firstly, an SVPWM scheme, a vector control method and a fuzzy controller are derived and applied in the speed control IC of a PMSM drive. Secondly, the Very-High-Speed IC Hardware Description Language (VHDL) is adopted to describe the behavior of the aforementioned control algorithms. Vector control techniques have made possible the application of PMSM motors in high-performance applications where traditionally only DC drives were applied. PMSM torque control has traditionally been achieved using Field Oriented Control (FOC).
Voting has existed for many years and the process of voting has progressed over time; in some countries it has migrated from hand-ballot systems to more electronic means such as Internet voting. An electronic voting system requires a higher level of security than an e-commerce system, and the platform over which electronic voting is carried out goes a long way in determining the security requirements it can achieve and its practicability in actual elections. Traditional voting systems also have shortcomings in terms of lack of voter mobility, flexibility, individual verifiability and accuracy of the tallying process due to human errors, all of which can be addressed using electronic voting over a secure platform. These issues have inspired this thesis, in which I propose an electronic voting scheme that is more secure and offers maximum facilities.
The fixed-width multiplier is attractive to many multimedia and digital signal processing systems, which prefer to maintain a fixed word format and can allow a little accuracy loss in the output data. This paper presents the design of high-accuracy fixed-width modified Booth multipliers. To reduce the truncation error, we first slightly modify the partial product matrix of Booth multiplication and then derive an effective error compensation function that makes the error distribution more symmetric about, and concentrated at, zero error, giving the fixed-width modified Booth multiplier very small mean and mean-square errors. In addition, a simple compensation circuit composed mainly of a simplified sorting network is proposed. Compared to previous circuits, the proposed error compensation circuit achieves a tiny mean error and a significant reduction in mean-square error while maintaining comparable hardware overhead.
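The truncation problem that error compensation targets can be illustrated numerically; the constant average-carry correction below is a generic textbook choice, not the paper's compensation circuit.

```python
# Fixed-width multiplication keeps only the upper n bits of the 2n-bit
# product, introducing a truncation error that a small correction term can
# reduce. The compensation here is a generic average-carry estimate.
def full_product(a, b):
    return a * b

def truncated_product(a, b, n, compensate=False):
    """Keep only the upper n bits of the 2n-bit product of two n-bit inputs."""
    p = a * b
    if compensate:
        p += 1 << (n - 1)   # add an average-carry estimate before truncating
    return (p >> n) << n    # drop the lower n bits

n = 8
a, b = 200, 180
exact = full_product(a, b)
err_plain = abs(exact - truncated_product(a, b, n))
err_comp = abs(exact - truncated_product(a, b, n, compensate=True))
print(err_plain, err_comp)  # → 160 96: compensation cuts the error
```

The paper's contribution is a data-dependent compensation function (driven by the Booth partial products) that does considerably better than this constant estimate on average.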
231 Clustering with Efficient Web Usage Mining, MS. RASIKA KALBENDE, MR. AMIT SAHU
Web usage mining attempts to discover useful knowledge from the secondary data obtained from the interactions of users with the Web. It has become very critical for effective Web site management, creating adaptive Web sites, business and support services, personalization, network traffic flow analysis, etc. The Web site under study is part of a nonprofit organization that does not sell any products, so it was crucial to understand who the users were, what they looked at, and how their interests changed with time. One of the promising approaches to this is web usage mining, which mines web logs for user models and recommendations; its algorithms have been widely utilized for modeling user web navigation behavior. In this study we advance a model for mining users' navigation patterns. Our work proceeds in the direction of building a robust web usage knowledge discovery system, which extracts web user profiles at the web server, application server and core application levels. The proposal optimizes the usage mining framework with the fuzzy C-means clustering algorithm (to discover web data clusters) and compares it with the Expectation Maximization clustering system to analyze Web site visitor trends. An evolutionary clustering algorithm is proposed to optimally segregate similar user interests, and the clustered data is then used to analyze trends using an inference system. By linking the Web logs with cookies and forms, it is further possible to analyze visitor behavior and profiles, which could help an e-commerce site address several business questions. Experimentation conducted with fuzzy C-means and Expectation Maximization clustering on the Syskill & Webert data set from UCI shows that EM performs 5% to 8% better than fuzzy C-means in terms of cluster number.
Data aggregation reduces the large amount of transmission in Wireless Sensor Networks (WSNs). Concealed data aggregation schemes extended from homomorphic public-key encryption systems are designed for multi-application environments. Existing work, however, does not address aggregation security for the Database-As-a-Service (DAS) model: client query aggregation increases the computation cost, and compromised secret keys affect the sensor node aggregations. In DAS, the client stores its database on an untrusted service provider. The proposed work presents concealed data aggregation for Database-As-a-Service. It establishes a trusted database server for client data storage. Client queries for multiple applications are aggregated using privacy homomorphic (PH) encryption standards, and the data responsive to client queries is extracted from the trusted data server with authenticated concealment. The PH scheme provides usable properties to conceal the data of respective clients, minimizes the computation cost due to client query aggregates, and, with uncompromised secret keys, improves the client query response for multiple groups.
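The flavour of additively homomorphic concealment can be sketched with keyed modular masking; the modulus and per-node keys below are toy values, and a real privacy homomorphism is considerably more involved.

```python
# Minimal additively homomorphic concealment sketch: each sensor adds a
# keyed mask modulo M; the untrusted server can sum ciphertexts without
# learning readings, and the key holder removes the combined mask.
M = 10**6   # public modulus, larger than any possible aggregate

def conceal(value, key):
    return (value + key) % M

def aggregate(ciphertexts):
    """The untrusted server sums concealed values without learning them."""
    return sum(ciphertexts) % M

def reveal(aggregate_ct, keys):
    return (aggregate_ct - sum(keys)) % M

readings = [21, 35, 18]
keys = [48231, 90157, 5512]          # per-node secret keys (illustrative)
cts = [conceal(v, k) for v, k in zip(readings, keys)]
print(reveal(aggregate(cts), keys))  # → 74, the true sum
```

The additive property, conceal(a) + conceal(b) revealing a + b, is what lets intermediate nodes aggregate without decrypting; the compromised-key concern above arises because knowing one node's key exposes that node's contribution.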
233 Secure Crypto System for Image Encryption and Data Embedding using Chaos and BB Equation Algorithm, S.Revathy
This project proposes a method for image encryption and decryption, data embedding and data extraction. The content owner first encrypts the image using the BB equation and a chaos algorithm; the data is then encrypted using a data-hiding key and embedded into the LSB of specific pixels. Given an encrypted image containing additional data, a receiver who has the data-hiding key can extract the data without revealing the original image; a receiver who has the encryption key can recover the original image without disturbing the embedded data; and a receiver who has both keys can recover both the additional data and the original content without any error. Since the data embedding only affects the LSBs of the encrypted image, decryption with the encryption key yields an image very similar to the original version.
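The LSB embedding and extraction step can be sketched on a few toy pixel values; the encryption stages are omitted here.

```python
# LSB embedding: message bits replace the least significant bits of
# selected pixel values, changing each pixel by at most 1.
def embed_bits(pixels, bits):
    """Return pixels with each LSB replaced by the corresponding message bit."""
    return [(p & ~1) | b for p, b in zip(pixels, bits)] + pixels[len(bits):]

def extract_bits(pixels, count):
    """Read the message back out of the LSBs."""
    return [p & 1 for p in pixels[:count]]

pixels = [130, 97, 44, 201, 56]   # toy 8-bit pixel values
message = [1, 0, 1, 1]
stego = embed_bits(pixels, message)
print(extract_bits(stego, 4))                           # → [1, 0, 1, 1]
print(max(abs(a - b) for a, b in zip(pixels, stego)))   # → 1: barely perceptible
```

Because only the lowest bit of each chosen pixel changes, decrypting the stego image yields a result visually close to the original, which is the property the abstract relies on.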
234 FPGA Implementation of Wu-Manber Algorithm for BLASTN DNA Sequence Matching, Anitha Ranganathan
BLAST is one of the most popular sequence analysis tools used by molecular biologists; it is fast and ubiquitous within the genomic community. However, because the size of genomic databases is growing rapidly, the computation time of BLAST when performing a complete genomic database search is continuously increasing. To reduce this time-consuming process, we propose a PPBF architecture that speeds up the BLASTN process beyond the NCBI BLASTN software running on a general-purpose computer.
235 Logical Fault Detection Based on Conservative QCA for Ultra Low Power Devices, Yamuna S
As transistors decrease in size so that more can be accommodated on a single die, chip computational capabilities increase. However, transistors cannot get much smaller than their current size. The quantum-dot cellular automata (QCA) approach represents one possible solution for overcoming this physical limit. It can be implemented using reversible gates, which do not dissipate power during computation. In general, testing sequential circuits is more difficult than testing combinational circuits, since the previous state must be tested as well. In this paper, sequential circuits based on reversible logic are tested using only two test vectors, all 1s and all 0s, thereby eliminating the need for any type of scan-path access to internal memory cells. The designs of two-vector testable latches, master-slave flip-flops and double edge triggered (DET) flip-flops are presented. Along with these, a new design for a multiplexer is implemented based on conservative QCA logic; it is not reversible in nature but has properties similar to the Fredkin gate, working as a 2:1 multiplexer. The proposed MX-cqca gate surpasses the Fredkin gate in terms of complexity (the number of majority voters), speed, and area. The importance of the proposed work lies in the fact that it provides designs of reversible sequential circuits completely testable for any stuck-at fault by only two test vectors.
236 Design of Power Gated ML Sensing Low Power CAM, Neelam Sharma S
A Content Addressable Memory (CAM) is a memory unit that performs single-clock-cycle matching on content instead of addresses. CAMs are widely used in look-up-table functions, network routers and cache controllers. Since each lookup is performed over all of the stored memory in parallel, power dissipation is high; in reality there are always trade-offs among power consumption, area and speed. CAMs are popular in network routers for packet forwarding and packet classification, but they are also beneficial in a variety of other applications that require high speed. The main CAM challenge is to reduce the power consumption associated with the large amount of parallel active circuitry without sacrificing speed or memory density; thus robust, high-speed and low-power ML sense amplifiers are highly sought after in CAM designs. In this work, we introduce a parity bit and an effective gated-power technique to reduce the peak and average power consumption and enhance the robustness of the design against process variations.
237 A Survey on Text Based Clustering, S.Chidambaranathan
Clustering is a principal technique for data analysis; it deals with the organisation of a set of objects in a multidimensional space into cohesive groups called clusters. Every cluster contains closely related objects that are very dissimilar to the objects in other clusters. Cluster analysis aims at discovering the objects with the same behaviour in a collection: if an object satisfies a rule, all objects similar to it will satisfy the same rule. With this functionality, hidden similarities, relationships and other concepts can be predicted with respect to a cluster of objects. In this work, we present a survey on text-based mining.
Security testing is very important in software development for finding vulnerabilities while the product is being developed. In this paper I present a method to find injection vulnerabilities. A vulnerability is a security weakness, a hole in the product that was not fixed while developing the software. Hackers typically compromise software by finding these vulnerabilities and cause data loss or severe damage to the entire product. Injection is one of the most dangerous attacks among the many vulnerabilities, placed number one in the OWASP top 10 web application vulnerabilities. This paper presents how to find these vulnerabilities using the code review testing technique. These implementations are for educational purposes only; it is a crime to apply them to any website without permission from the owner. If the crime is proved, the offender can be punished under the Information Technology (Amendment) Act, 2008, Section 43(a) read with Section 66.
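One simple code-review check for injection smells, flagging SQL built by string concatenation or interpolation rather than parameterized queries, can be sketched as below; the patterns are illustrative, and a real review combines many such checks with manual reading.

```python
# Toy code-review check for one SQL-injection smell: queries assembled from
# user input instead of passed as parameters. Patterns are illustrative.
import re

INJECTION_SMELLS = [
    re.compile(r"""execute\(\s*["'].*["']\s*\+"""),      # "..." + user_input
    re.compile(r"""execute\(\s*f["']"""),                # f-string queries
    re.compile(r"""execute\(\s*["'].*%s.*["']\s*%"""),   # %-interpolation
]

def review_line(line: str) -> bool:
    """Flag a source line that appears to build SQL from user input."""
    return any(p.search(line) for p in INJECTION_SMELLS)

bad = 'cursor.execute("SELECT * FROM users WHERE name=\'" + name + "\'")'
good = 'cursor.execute("SELECT * FROM users WHERE name=%s", (name,))'
print(review_line(bad), review_line(good))  # → True False
```

The fix the reviewer would suggest is the `good` form: passing values as a separate parameter tuple so the database driver escapes them.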
239 An Android Based Medicine Reminder System Using External Storage, Prabhukannan.G, Liza M. Kunjachen, Dr. J. Jegadeesan
In modern healthcare, most errors have been identified in out-patient medication administration. These medication errors are caused by under- or over-doses and by forgetting to take medicines at the proper time. Because of such errors, recovery from disease is delayed and the patient suffers for longer. In this paper we introduce an Android-based application for patients that reminds the user to take the proper medicines, in the proper quantity, at the proper time. Because the portability of an Android device could result in theft, data security requirements need to be incorporated in the design process. In the "Med Reminder" application, information on the device is encrypted and stored in the database, making it difficult to obtain illegitimately while still keeping confidential data easy to access. In this application, the data in databases residing on the external secure digital (SD) card of Android devices is encrypted. We discuss the technologies and methods used in the Android database encryption/decryption implementation and in setting reminders from the medicine intake schedule.
A lossy compression technique is used to compress 3D images. These techniques exploit the spatial and temporal redundancy within the point data. To design an effective compression algorithm for point cloud computation, increasing its efficiency and space vector, we perform a spatial decomposition based on octree data structures. By encoding their structural differences, we can successively extend the point clouds at the decoder. Another approach reduces coding complexity and coding precision. Our experimental results show strong compression performance: a ratio of 14 at 1 mm coordinate precision and up to 40 at a coordinate precision of 9 mm.
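The octree spatial decomposition step can be sketched by the child-octant computation below; the coordinates are toy values, and the differential encoding of occupied octants is omitted.

```python
# One octree subdivision step: each point maps to one of eight children of
# a cubic cell by comparing against the cell midpoint on each axis.
def octree_child_index(point, cell_min, cell_size):
    """Return the 0-7 child octant of `point` inside the given cubic cell."""
    index = 0
    for axis in range(3):
        mid = cell_min[axis] + cell_size / 2
        if point[axis] >= mid:
            index |= 1 << axis   # one bit per axis: x, y, z
    return index

# Points in a cube of side 8 anchored at the origin.
points = [(1, 1, 1), (6, 1, 1), (1, 6, 6), (7, 7, 7)]
occupied = sorted({octree_child_index(p, (0, 0, 0), 8) for p in points})
print(occupied)  # → [0, 1, 6, 7]: which of the 8 children hold points
```

A point-cloud codec recurses this subdivision, transmitting only the occupancy bits of each cell; encoding differences between successive occupancy maps is what lets the decoder extend the cloud progressively.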
241 Watermarking Scheme for Colour Images Using Hidden Markov Model, Mr. Kanchan Mahajan, Prof. Mahendrakumar Rai
Achieving robustness, imperceptibility and high capacity simultaneously is of great importance in digital watermarking. This paper presents a new informed image watermarking scheme with high robustness and reduced complexity at an information rate of 1/64 bit/pixel. A hidden Markov model (HMM) is used to find the exact embedding strength in an image. The HMM in the wavelet domain has been successfully applied to grayscale images; this paper extends the same concept to colour images. According to simulation results, the watermarking scheme is robust under common attacks such as Gaussian noise, compression and rotation.
242 SECURITY ANALYSIS OF DYNAMIC GROUPS IN CLOUD, Ms. Shrayu P. Pachgade, Prof. K.G. Bagde
Cloud computing is a technology of distributed data processing in which scalable information resources and capacities are provided as a service to multiple external customers through Internet technology. Cloud computing can and does mean different things to different people. The common characteristics most definitions share are on-demand scalability of highly available and reliable pooled computing resources, secure access to metered services from nearly anywhere, and dislocation of data from inside to outside the organization. While aspects of these characteristics have been realized to a certain extent, cloud computing remains a work in progress. This publication provides an overview of the security and privacy challenges pertinent to public cloud computing and points out considerations organizations should take into account when outsourcing data, applications, and infrastructure to a public cloud environment. Recent advances have given rise to the popularity and success of cloud computing; however, outsourcing data and business applications to a third party makes security and privacy a critical concern. In this paper we propose a secure data sharing scheme for dynamic groups in the cloud and analyze its security and privacy with proofs, algorithms and techniques.
243 Review of Image Processing Techniques for Automatic Detection of Tumor in Human Liver, Vinita Dixit, Jyotika Pruthi
This review paper describes the various image processing techniques for automatic detection of tumors in the human liver. Without a healthy liver a person cannot survive, and liver tumor is a life-threatening disease that poses a challenging problem for both medical and engineering technologists. The chances of surviving a liver tumor depend highly on early detection of the tumor and its subsequent classification as malignant (cancerous) or benign (non-cancerous). In this paper, image processing techniques for automatic detection of liver tumors are discussed, covering the image acquisition, preprocessing and enhancement, image segmentation, classification and volume calculation steps.
244 Comparative Analysis of Routing Protocols AODV DSDV and DSR in MANET, Yassine MALEH, Dr. Abdellah Ezzati
Recent advances in wireless and microelectronic communications have enabled the development of a new type of wireless network called the mobile ad hoc network. MANETs are currently among the greatest innovations in the field of telecommunications. A mobile ad hoc wireless network (MANET) is a collection of autonomous nodes that communicate with each other by forming a multi-hop network, maintaining connectivity in a decentralized manner. Routing protocols are an essential aspect of the performance of mobile wireless networks. In this paper, we present a performance analysis of three prominent MANET routing protocols, DSDV, DSR, and AODV, using NS2.
Network security consists of the provisions and policies adopted by a network administrator to prevent and monitor unauthorized access, misuse, modification, or denial of a computer network and network-accessible resources. In most systems, network security is achieved by a firewall that acts as a filter for unauthorized traffic. But traditional firewalls have problems: they rely on the notion of a restricted topology and controlled entry points to function. Restriction of the network topology, difficulty in filtering certain protocols, the end-to-end encryption problem and a few more issues led to the evolution of distributed firewalls, which secure the network by protecting critical network endpoints, exactly where hackers want to penetrate. This survey paper deals with the general concepts of distributed firewalls, their requirements and implications, and their suitability to common threats on the Internet, and gives a short discussion of contemporary implementations. A distributed firewall gives complete security to the network.
Web services and applications are an important part of our daily life, providing communication and management of information from anywhere. With the increase in web services, web applications have moved to a multitier design: a web server at the front end runs the applications, and a data server at the back end holds the data. Due to their abundant use in managing information, web services have always been the target of attacks such as future session attacks, SQL injection attacks and cross-site scripting attacks. To prevent such attacks, an IDS based on anomaly detection that relies on a honeypot can be used to detect unknown attacks by identifying anomalous behavior that differs from the behavior of a legitimate user.
Magnetic resonance imaging is one of the best technologies currently used for diagnosing brain tumors. Brain tumors are diagnosed at advanced stages with the help of MRI images. Segmentation is an important process for extracting suspicious regions from complex medical images. An intelligent system is designed to diagnose brain tumors through MRI using image processing algorithms such as particle swarm optimization. The proposed system has three phases. In the first phase, preprocessing is performed to remove film artifacts and unwanted skull portions from the brain MRI image. In the second phase, enhancement is performed to remove noise. In the third phase, particle swarm optimization is used to segment tissues such as white matter (WM), grey matter (GM) and cerebrospinal fluid (CSF) in the brain MRI image. The segmented brain MRI helps radiologists detect brain abnormalities and tumors. The algorithm is tested on 50 real patients' brain MRI images.
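A minimal sketch of how PSO can drive intensity-based segmentation: particles search for cluster centers that minimize the within-cluster squared distance of pixel intensities. This is a generic PSO clustering toy on synthetic data, not the paper's system; all parameter values are assumptions.

```python
import random

def cost(centers, pixels):
    """Sum of squared distance from each pixel to its nearest center."""
    return sum(min((p - c) ** 2 for c in centers) for p in pixels)

def pso_centers(pixels, k=2, swarm=10, iters=60, seed=1):
    """PSO search for k intensity cluster centers in [0, 255]."""
    rng = random.Random(seed)
    pos = [[rng.uniform(0, 255) for _ in range(k)] for _ in range(swarm)]
    vel = [[0.0] * k for _ in range(swarm)]
    pbest = [list(p) for p in pos]
    gbest = min(pbest, key=lambda c: cost(c, pixels))
    for _ in range(iters):
        for i in range(swarm):
            for d in range(k):
                r1, r2 = rng.random(), rng.random()
                vel[i][d] = (0.7 * vel[i][d]                      # inertia
                             + 1.5 * r1 * (pbest[i][d] - pos[i][d])  # cognitive
                             + 1.5 * r2 * (gbest[d] - pos[i][d]))    # social
                pos[i][d] += vel[i][d]
            if cost(pos[i], pixels) < cost(pbest[i], pixels):
                pbest[i] = list(pos[i])
        gbest = min(pbest + [gbest], key=lambda c: cost(c, pixels))
    return sorted(gbest)

# Two synthetic intensity populations standing in for two tissue classes.
pixels = [12, 15, 9, 14, 11] + [200, 204, 198, 202, 207]
centers = pso_centers(pixels)
```

Each pixel would then be labeled by its nearest center, giving a segmentation of the intensity image.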
248 An Analysis on the Performance Evaluation of Routing Protocols in Wi-Fi/802.11b Network, Merin Skariah, Prof. Dr. Suriyakala C D
The present scenario in wireless networks is changing rapidly owing to several factors, high data rate, mobility and range being some of them. The selection of an appropriate routing protocol is a key issue when designing a scalable and efficient wireless network. Here, we propose an intensive comparative study of the performance of the AODV, DSR and DSDV protocols in a Wi-Fi network. Our work throws light on the performance analysis of Wi-Fi (802.11b) with appropriate metrics and also evaluates the performance of the different routing protocols against three metrics: throughput, end-to-end delay and packet loss. NS2 (Network Simulator) is used for the analysis.
In recent years the mobile ad-hoc network (MANET) has been one of the most promising fields for research and development of wireless networks. As the popularity of mobile devices and wireless networks has significantly increased over the past years, wireless ad-hoc networks have become one of the most vibrant and active fields of communication and networking. Despite severe challenges, the special features of MANET bring this technology great opportunity. Owing to the vulnerable nature of the mobile ad hoc network, there are numerous security threats that disturb its development. We first analyze the main vulnerabilities of mobile ad hoc networks, which have made them much more susceptible to attacks than the traditional wired network. Then we discuss the security criteria of the mobile ad hoc network and present the main attack types that exist in it. Finally, we survey the current security solutions for the mobile ad hoc network.
250 Selecting Best Features Using Combined Approach in POS Tagging for Sentiment Analysis, Nikita D. Patel, Chetana Chand
Today, the popularity of e-commerce has made the web an excellent source of customer reviews and opinions about purchased products, and the number of reviews a product receives is growing at a very fast rate. Opinion mining from product reviews, forum posts and blogs is an important research topic with many applications. Customers use reviews to judge the quality of a product before buying, so opinion mining supports the decision-making process: reviews either promote or demote a product, and one needs to determine how many reviews are positive and how many are negative. For this, the features on which classification is performed should be optimal. This paper presents various classification approaches for sentiment analysis and proposes selecting the best feature set, such as POS tags, from reviews so that customer reviews can be classified easily. Only the features that give the best decisions are selected during preprocessing, and a combination of the best feature sets is used to classify the reviews.
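One common POS-based feature selection step is to keep only opinion-bearing word classes (adjectives and adverbs) and rank them by frequency. The sketch below assumes the reviews have already been POS-tagged (the tags are hard-coded here); the tag set, function name and data are illustrative assumptions, not the paper's method.

```python
from collections import Counter

# Pre-tagged reviews: (token, Penn Treebank POS tag) pairs. In practice
# the tags would come from a POS tagger; hard-coded for illustration.
reviews = [
    [("battery", "NN"), ("is", "VBZ"), ("great", "JJ"), ("really", "RB")],
    [("screen", "NN"), ("looks", "VBZ"), ("terrible", "JJ")],
    [("great", "JJ"), ("camera", "NN")],
]

OPINION_TAGS = {"JJ", "JJR", "JJS", "RB", "RBR", "RBS"}  # adjectives, adverbs

def best_features(tagged_reviews, top_n=2):
    """Keep only opinion-bearing POS tags and rank tokens by frequency."""
    counts = Counter(tok.lower() for review in tagged_reviews
                     for tok, tag in review if tag in OPINION_TAGS)
    return [tok for tok, _ in counts.most_common(top_n)]

features = best_features(reviews)  # most frequent opinion words
```

The selected tokens would then serve as the feature set for a downstream sentiment classifier.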
Segmentation is the process that subdivides an image into its constituent regions. The technique can also identify regions of interest in an image or annotate the data, and it is regarded as one of the most critical functions of image processing and analysis. Image segmentation plays a crucial role in many medical imaging applications by automating the delineation of anatomical structures and other regions of interest. This paper surveys the current segmentation techniques used in the retrieval of medical images. Numerous segmentation algorithms and techniques have been defined, and this paper presents a detailed review of several of them with their principal ideas, types, advantages and disadvantages.
252 DESIGN OF MULTIPLE CONSTANT MULTIPLICATION ALGORITHM FOR FIR FILTER, T. Sandhya Pridhini, Diana Aloshius, Aarthi Avanthiga, Rubesh Anand
The design of low-power systems has become a significant performance goal in the present world. A fast and energy-efficient multiplier is required in the electronics industry, especially in digital signal processing, image processing and the arithmetic units of microprocessors. The multiplier is an important element that contributes substantially to the total power consumption of a system. In VLSI design, designers face several constraints, including small silicon area, high speed and minimal power consumption. The aim of this research is to design a low-cost finite impulse response filter using faithfully rounded truncated multipliers. The bit width and hardware resources are optimized with good accuracy. In the direct FIR filter, the multiple constant multiplications are implemented using an improved version of truncated multipliers.
253 Categorization of Data Mining Tools Based on Their Types, J. Mary Dallfin Bruxella, S. Sadhana, S. Geetha
Data mining, the extraction of hidden predictive information from large databases, is a powerful technology with great potential to help companies focus on the most important information in their data warehouses. The development and application of data mining algorithms requires the use of powerful software tools. As the number of available tools continues to grow, choosing the most suitable tool becomes increasingly difficult. This paper attempts to support the decision-making process by discussing the historical development and presenting a range of existing state-of-the-art data mining and related tools. The paper is organized as follows: the first section introduces data mining; the second section, Review of Data Mining Tools, explains the criteria used to compare data mining software; the last section, Categorization of Data Mining Software into Different Types, proposes a categorization of data mining software and introduces typical tools of each type.
254 Enhanced Data Transmission for Cluster-Based Wireless Sensor Networks, Miss. Vaishali M. Sawale, Prof. Arvind S. Kapse
A wireless sensor network (WSN) is a network system comprised of distributed devices that use wireless sensor nodes to monitor physical or environmental conditions such as sound, temperature and motion. With the rapid spread of WSN-enabled devices, secure data transmission has become a critical issue for WSNs. Clustering is an effective and practical way to enhance the system performance of WSNs, and sensors deployed for these purposes are often placed randomly. We propose two secure and efficient data transmission (SET) protocols for cluster-based WSNs (CWSNs): SET-IBS, using an identity-based digital signature (IBS) scheme, and SET-IBOOS, using an identity-based online/offline digital signature (IBOOS) scheme. Because these applications require packet delivery from one or more senders to multiple receivers, provisioning security in group communications is a critical and challenging goal. In this paper, we study secure data transmission for cluster-based wireless sensor networks. The results show that the proposed protocols outperform the existing secure protocols for CWSNs in terms of security overhead and energy consumption.
255 QoS Enhanced Architecture for Cloud Computing Environment, Anbumozhi Anbukkarasan, Liza M. Kunjachen
Cloud computing is a popular model for enabling network access to a shared pool of computing resources that can be provisioned with minimal effort, but significant issues remain with regard to proficient provisioning. Existing work on cloud computing focuses on the creation and deletion of static and dynamic VMs, with VMs recycled based on requests [1]; however, a significant amount of time is consumed by this process that could instead be spent serving more user requests. In this paper we introduce a provisioning technique that facilitates adaptive management of the system, offering end users guaranteed quality of service (QoS). To improve the efficiency of the system, we use a workload analyzer and queuing techniques to achieve high QoS. A loop-free path finding algorithm (LPA) is presented to identify duplicate nodes and replace them with the least-cost node.
256 Design of Router Micro Architecture Based on Runtime Adaptive Selection Strategies for On-Chip Communication Interconnection Network, Ms. Aryadevi S, Mr. T Shanmughanathan
To meet the demands of computation-intensive applications and of low-power, high-performance systems, the number of computing resources in a single chip has increased enormously, as current VLSI technology can support such extensive integration of transistors. This paper presents adaptive routing selection strategies suitable for networks-on-chip (NoC). The prototype presented here uses the west-first routing algorithm to make routing decisions at runtime during application execution. Messages in the NoC are switched with a wormhole cut-through switching method, where different messages can be interleaved at flit level on the same communication link without using virtual channels. Hence, the head-of-line blocking problem can be solved effectively and efficiently.
Continuous scaling of CMOS technology makes it possible to integrate a large number of heterogeneous devices that need to communicate efficiently on a single chip, and efficient routers are needed to carry the communication between these devices. As chips scale, the probability of both permanent and transient faults increases, making fault tolerance (FT) a key concern. This project proposes a fault-tolerant solution for a bufferless network-on-chip, including an on-line fault-diagnosis mechanism to detect both transient and permanent faults, a hybrid automatic repeat request and forward error correction link-level error control scheme to handle transient faults, and a reinforcement-learning-based fault-tolerant deflection routing (FTDR) algorithm to tolerate permanent faults.
258 A Survey on Android’s Location Content Search Engine, Kanchan B. Budhakar, Amruta T. Kashid, Rutuja N. Pathare
The internet is widely used in day-to-day life, and in mobile search, location information plays an important role. An Android's Location Content (ALC) search engine captures the location of users and provides information related to that location, mining clickthrough data based on user preferences. ALCSE uses two kinds of concepts, location concepts and content concepts, and GPS is used to identify the user's location. Clickthrough data are stored in ontology files on the client side and used for storing location- and content-based information on the server side. Four entropies are introduced to balance the weights between the content and location facets, and weight vectors are used to re-rank the data according to user preferences. Privacy is protected by collecting and storing clickthrough data on the client side. The ALCSE server performs the actual computation and heavy tasks and sends the results to the client, so the ALC search engine saves the energy of users' mobiles.
259 Remote Administrative Trojan/Tool (RAT), Manjeri N. Kondalwar, Prof. C.J. Shelke
A Remote Administration Tool (RAT) is remote control software that, when installed on a computer, allows a remote computer to take control of it, potentially giving a malicious user remote control over the system. A Remote Administration Trojan (RAT) allows an attacker to remotely control a computing system and typically consists of a server running invisibly and listening on specific TCP/UDP ports of a victim machine, together with a client acting as the interface between the server and the attacker. The most common means of infection is through email attachments; virus developers usually use various spamming techniques to distribute the virus to unsuspecting users. Malware developers also use chat software such as Yahoo Messenger and Skype as another method to spread their Trojan horse viruses. RATs are malicious pieces of code often embedded in lawful programs. They are stealthily planted, through patches, games, e-mail attachments, or even legitimate-looking binaries, and help gain access to victim machines. Once installed, RATs perform their unexpected or even unauthorized operations and use an array of techniques to hide their traces, remaining invisible and staying on victim systems for the long haul.
In this paper, we propose a secure and reliable routing technique based on fuzzy logic (SRRT) for finding a secure and reliable path in wireless mesh networks. For each node, two variables, a trust value and a hop count value, are used to determine the lifetime of routes. The trust level used to choose a reliable and secure route between the communicating nodes is not a predefined value; to facilitate the evaluation of trust levels, a fuzzy-logic-based approach is implemented. To assign trust levels to nodes, a fuzzy trust evaluation mechanism receives information about the behavioral history of the wireless mesh network nodes. Three types of misbehaving nodes are considered: nodes dropping packets, nodes forwarding packets in a wrong direction, and nodes regularly delaying packets. During route discovery, every node records its trust value and hop count value in the RREQ packet. At the destination, with the aid of fuzzy logic, a new parameter called "Route" is generated from the trust value and hop count value of each route, and the path with the highest route value is selected as the secure and reliable route from source to destination. Simulation results show that SRRT offers significant performance and reliability enhancements in comparison with traditional on-demand routing algorithms.
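A small sketch of how fuzzy logic can fuse trust and hop count into a single route score: triangular membership functions fuzzify the two inputs, simple rules combine them, and a ratio defuzzifies the result. The membership shapes and rule set here are illustrative assumptions, not SRRT's actual fuzzy system.

```python
def tri(x, a, b, c):
    """Triangular membership function peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def route_score(trust, hops, max_hops=10):
    """Combine trust and hop count with two simple fuzzy rules.

    Rule 1: high trust AND short path -> good route
    Rule 2: low trust OR long path    -> poor route
    """
    trust_high = tri(trust, 0.4, 1.0, 1.6)         # peaks at trust = 1.0
    short = tri(hops / max_hops, -0.6, 0.0, 0.6)   # peaks at 0 hops
    good = min(trust_high, short)                  # fuzzy AND
    poor = max(1.0 - trust_high, 1.0 - short)      # fuzzy OR of complements
    # Defuzzify: weight the prototype outputs 1.0 (good) and 0.0 (poor).
    return good / (good + poor) if good + poor else 0.0

safe = route_score(trust=0.9, hops=2)
risky = route_score(trust=0.3, hops=8)
```

The destination would compute such a score for every discovered route and reply along the one with the highest value.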
Cloud computing is emerging as a new paradigm of large-scale distributed computing: a framework for enabling convenient, on-demand network access to a shared pool of computing resources. It involves virtualization, distributed computing, networking, software and web services; a cloud consists of several elements such as clients, data centers and distributed servers, and offers fault tolerance, high availability, scalability, flexibility, reduced overhead for users, reduced cost of ownership, on-demand services and more. Load balancing is one of the main challenges in cloud computing: the dynamic workload must be distributed across multiple nodes to ensure that no single node is overwhelmed. It helps in optimal utilization of resources and hence in enhancing the performance of the system. The goal of load balancing is to minimize resource consumption, which further reduces energy consumption and the carbon emission rate, a dire need of cloud computing. Central to these issues lies the establishment of an effective load balancing algorithm. The load can be CPU load, memory capacity, delay or network load. Load balancing is the process of distributing the load among the various nodes of a distributed system to improve both resource utilization and job response time, while avoiding a situation where some nodes are heavily loaded and others are idle or doing very little work. It ensures that every processor in the system, or every node in the network, does approximately the same amount of work at any instant of time. The technique can be sender-initiated, receiver-initiated or symmetric (a combination of the sender-initiated and receiver-initiated types).
Our objective is to develop an effective load balancing algorithm using the divisible load scheduling theorem to maximize or minimize different performance parameters (throughput and latency, for example) for clouds of different sizes (with a virtual topology depending on the application requirements).
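The core intuition of divisible load scheduling can be sketched in a few lines: if communication cost is negligible, the workload is split among nodes in proportion to their processing speeds so that all nodes finish at the same time. This is a textbook-style simplification under stated assumptions, not the algorithm developed in the paper.

```python
def divisible_load_split(total_load, speeds):
    """Split a divisible workload so every node finishes simultaneously.

    Assuming negligible communication cost, node i receives a share
    proportional to its speed: share_i = total * speed_i / sum(speeds).
    """
    s = sum(speeds)
    return [total_load * sp / s for sp in speeds]

speeds = [1.0, 2.0, 2.0]                       # relative node speeds
shares = divisible_load_split(100.0, speeds)   # [20.0, 40.0, 40.0]
finish = [sh / sp for sh, sp in zip(shares, speeds)]  # equal finish times
```

Because every node's finish time `share / speed` is identical, no node sits idle while another is overloaded, which is exactly the balance condition described above.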
262 A High Speed and Area Efficient Wallace Tree Multiplier with Booth Recoded Technique, B. Venkata Sateesh, Shiju C Chacko
The Wallace tree is an improved version of the tree-based multiplier architecture; in this paper, a Wallace tree multiplier is implemented using a Booth recoder. The paper aims to reduce the additional latency and area of the improved Wallace tree multiplier. In the proposed method, the Booth algorithm is used to generate the partial products, and compressor adder techniques are used to sum them. The modified architecture is around 67 percent faster than the existing Wallace tree multiplier, 22 percent faster than the radix-8 Booth multiplier, and 18 percent faster than the radix-16 Booth multiplier, showing better performance in terms of area and speed.
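To illustrate why Booth recoding halves the number of partial products, here is a behavioral model of radix-4 (modified) Booth multiplication: each recoded digit lies in {-2, -1, 0, 1, 2}, so every partial product is just a shift, a negation, or zero. This is a software sketch of the standard algorithm, not the paper's hardware design.

```python
def booth_radix4(multiplicand, multiplier, bits=8):
    """Multiply via radix-4 Booth recoded partial products.

    `multiplier` is taken as a `bits`-wide two's-complement value.
    Each digit examines two new bits plus one overlap bit, so only
    bits/2 partial products are generated instead of `bits`.
    """
    bit = lambda x, i: (x >> i) & 1
    product, prev = 0, 0
    for i in range(0, bits, 2):
        # digit = -2*b[i+1] + b[i] + b[i-1], in {-2, -1, 0, 1, 2}
        digit = -2 * bit(multiplier, i + 1) + bit(multiplier, i) + prev
        product += digit * multiplicand * (4 ** (i // 2))
        prev = bit(multiplier, i + 1)
    return product
```

In the proposed hardware, these few partial products would then be summed by the Wallace tree's compressor adders.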
263 Agile Programming and Design Patterns on Web Applications Using J2EE and Ruby on Rails – A Case Study, Vedavyas J, Kumarswamy Y
Agile programming methodologies reduce the risk of excessive time consumption in software application development, giving much importance to real-time communication and priority to the satisfaction of stakeholders. In the object-oriented world, design patterns have contributed greatly to design reuse: they provide solutions to common design problems and make them easier to solve by reusing successful designs and architectures. By merging an application with an implementation methodology, many design patterns can be formed to help new designers. This paper emphasizes the agile methodology and design patterns for web applications using J2EE and Ruby on Rails, with a respective case study design pattern example.
264 A Novel Design of Reversible Universal Shift Register, Rashid Anwar, Jobbin Abraham Ben?
Reversible logic gates provide power optimization and can be used in low-power CMOS design, optical computing, quantum computing and nanotechnology. This paper proposes a new 4*4 reversible RR gate that works as a reversible 4:1 multiplexer and has a reduced quantum cost. A novel design of a reversible universal shift register using RR gates, with reduced delay and quantum cost, is proposed.
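The defining property of any reversible gate, including the proposed RR gate, is that its input-to-output mapping is a bijection, so no information is erased. The sketch below illustrates this with the well-known Fredkin (controlled-swap) gate, which acts as a reversible 2:1 multiplexer; the RR gate itself is not specified in the abstract, so this is a stand-in example.

```python
from itertools import product

def fredkin(c, a, b):
    """Fredkin (controlled-swap) gate, a reversible 2:1 multiplexer.

    When c = 1 the data lines swap, so the middle output equals a if
    c = 0 and b if c = 1: multiplexing without erasing any information.
    """
    return (c, b, a) if c else (c, a, b)

# Reversibility check: the 3-bit mapping must be a bijection,
# i.e. all 8 input patterns map to 8 distinct output patterns.
outputs = {fredkin(*bits) for bits in product((0, 1), repeat=3)}
```

A 4*4 gate such as the RR gate would satisfy the same condition over all 16 four-bit patterns, which is also what keeps its quantum cost well defined.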
A mobile ad hoc network (MANET) is an autonomous system of mobile nodes connected via wireless links. Each node operates not only as an end system but also as a router that forwards packets, and the nodes are free to move about and organize themselves into a network. Delay-tolerant networking (DTN) is an end-to-end network architecture designed to provide communication in and through highly stressed networking environments. The key part of DTN is the Bundle Protocol, which allows hosts that normally cannot communicate with each other due to network partitioning to exchange data using a store, carry and forward method. In store, carry and forward operation, packets are held only for a limited period of time, so packet loss and delay can occur at a node when the timer expires in the Bundle Protocol. This paper aims to reduce such delay by using the MANET routing protocol OLSR (Optimized Link State Routing protocol) for better delivery of packets: if delay occurs at any node, OLSR is enabled and the packets can be delivered quickly and efficiently to the neighbours. The parameters considered are end-to-end delay, throughput, routing load and packet delivery ratio.
266 Android Based Meter Reading Using OCR, Rohit Dayama, Anil Chatla, Heena Shaikh, Medha Kulkarni
Meter reading and billing are complex tasks for electricity, water and gas supplier companies. The current billing process uses manual meter reading, manual updating of the server with the reading, and manual customer billing. We suggest a technology comprising an Android application and a web application to obtain readings, update the server, and inform consumers about billed units and amounts. The Android application obtains readings from the meter automatically: the user simply captures an image of the meter, and the app performs optical character recognition (OCR) on the captured image. The output of OCR is the meter reading, which is then sent to the server, and the customer receives a mail regarding the bill as soon as the photo is captured. With the help of the web application, customers can view their bill, make payment online, and lodge complaints if any. New features are also added that reduce the workload of the company and its employees.
267 DooDB: A Graphical Password Database Consist of Doodles and Pseudo Signature Based User Authentication, Miss. Yugandhara A. Thakare, Prof. A.S. Kapse
Most computer users have to remember an alphanumeric password or PIN, a combination of numbers, characters and symbols, for each of several applications, which in turn increases the user's load of remembering a username and password per application. This method has been shown to have significant drawbacks: users tend to pick passwords that can be easily guessed, while a password that is hard to guess is often hard to remember. To address this problem and reduce the cognitive load on the user, we present DooDB, containing doodles and pseudo-signatures. The focus here is on doodle-based passwords, a subset of recall-based graphical passwords. Users are authenticated by being asked to draw a doodle with their fingertip on a handheld device's touch screen; the doodle is captured and used for verification, with no restrictions on duration or shape.
268 Secure Key Exchange over Internet, Miss. Pooja P. Taral, Prof. Vijay B. Gadicha
Due to advances in technology and communication, ensuring security requires more effort, and it is essential that every organization has the right level of security. Authentication has emerged as an essential factor in key establishment over the internet. The DIKE (Deniable Internet Key Exchange) protocols add novelty and new value to the IKE standard. In recent communication systems, with ever-increasing use of the internet, security services have become essential, and Diffie–Hellman key exchange (DHKE) is among the core cryptographic mechanisms for ensuring network security.
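The DHKE mechanism named above can be sketched in a few lines: both parties exchange public values and arrive at the same shared secret without ever transmitting it. This toy uses a deliberately tiny 32-bit prime; real deployments (e.g., IKE) use 2048-bit-plus MODP groups or elliptic curves, plus authentication, as the deniable-authentication point of DIKE is precisely that unauthenticated DH is open to man-in-the-middle attack.

```python
import secrets

p = 4294967291          # a 32-bit prime; far too small for real use
g = 2                   # public generator

a = secrets.randbelow(p - 2) + 1   # Alice's private exponent
b = secrets.randbelow(p - 2) + 1   # Bob's private exponent

A = pow(g, a, p)        # Alice sends A to Bob
B = pow(g, b, p)        # Bob sends B to Alice

alice_secret = pow(B, a, p)   # (g^b)^a mod p
bob_secret = pow(A, b, p)     # (g^a)^b mod p, the same value
```

Both sides now hold the identical secret and can derive a session key from it, even though only `A` and `B` crossed the network.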
Open source software (OSS) projects use an open bug repository during development and maintenance so that both developers and users can report the bugs they find. These systems are generally called bug tracking systems or bug repositories: open repositories maintained by open source organizations to track their bugs. In OSS, bug reports are gathered from all over the world, submitted by a geographically distributed team through the internet. OSS team members typically work in a distributed environment, so bug tracking in an open repository is totally distributed and uncontrolled, and different reporters may submit reports again and again for the same problem. A report submitted by several reporters is referred to as a duplicate bug report. Excessive duplicates put extra overhead on software organizations and hinder the utility of the software. In this study, the bug repositories of six open source projects were explored to find the factors that have an impact on duplicate bug reports. The factors analyzed were number of submitters, bug repository size, project size, life span of the project, and number of developers; the projects studied were Thunderbird, Mandriva Linux, Firefox for Android, Eclipse BIRT and Kompare. The results show that some factors impact duplicate bug reports while others do not: the number of submitters and the size of the bug repository impact duplicates, whereas project size, project life span and number of developers do not appear to.
In radio, multiple-input and multiple-output, or MIMO (pronounced my-moh by some and me-moh by others), is the use of multiple antennas at both the transmitter and receiver to improve communication performance. It is one of several forms of smart antenna technology. MIMO technology has attracted attention in wireless communications, because it offers significant increases in data throughput and link range without additional bandwidth or increased transmit power. It achieves this goal by spreading the same total transmit power over the antennas to achieve an array gain that improves the spectral efficiency (more bits per second per hertz of bandwidth) and/or to achieve a diversity gain that improves the link reliability (reduced fading). Because of these properties, MIMO is an important part of modern wireless communication standards such as IEEE 802.11n (Wi-Fi), 4G, 3GPP Long Term Evolution, WIMAX and HSPA+.
271 A Comparative Study on Performance Evaluation of Eager versus Lazy Learning Methods, Solomon Getahun Fentie, Abebe Demessie Alemu, Bhabani Shankar D. M.
Classification is one of the fundamental tasks in data mining and has also been studied extensively in statistics, machine learning, neural networks and expert systems over decades. Naïve Bayes, k-nearest neighbor, and decision tree are among the most commonly used classification algorithms in research. In this study, the performance of eager (naïve Bayes, ADTree) and lazy (IBk, KStar) classification algorithms is evaluated experimentally. Our findings show that, based on the evaluation metrics precision, recall, F-measure and accuracy, eager learners are slow in training but faster at classification than lazy classification algorithms, because they construct a generalization model before receiving any new tuples to classify. Moreover, based on our investigation, eager learners outperform the lazy learners in accuracy.
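The eager-versus-lazy distinction the study evaluates can be illustrated with two toy classifiers: an eager one that builds a model (per-class centroids) at training time, and a lazy 1-NN that stores the data and defers all work to query time. This is only a minimal sketch, not the paper's actual setup (naïve Bayes/ADTree versus IBk/KStar), and the data points are invented:

```python
import math
from collections import defaultdict

class CentroidClassifier:
    """Eager learner: all generalization work happens in fit()."""
    def fit(self, X, y):
        sums, counts = defaultdict(lambda: None), defaultdict(int)
        for xi, yi in zip(X, y):
            counts[yi] += 1
            sums[yi] = xi if sums[yi] is None else [a + b for a, b in zip(sums[yi], xi)]
        self.centroids = {c: [v / counts[c] for v in s] for c, s in sums.items()}
        return self

    def predict(self, x):
        # Classification is cheap: one distance per class.
        return min(self.centroids, key=lambda c: math.dist(x, self.centroids[c]))

class OneNN:
    """Lazy learner: fit() just stores the data; predict() does the work."""
    def fit(self, X, y):
        self.data = list(zip(X, y))
        return self

    def predict(self, x):
        # Classification scans every stored tuple.
        return min(self.data, key=lambda p: math.dist(x, p[0]))[1]

X = [[0, 0], [1, 0], [0, 1], [5, 5], [6, 5], [5, 6]]
y = ["a", "a", "a", "b", "b", "b"]
for clf in (CentroidClassifier().fit(X, y), OneNN().fit(X, y)):
    print(clf.predict([0.5, 0.5]), clf.predict([5.5, 5.5]))  # → a b (both)
```

The asymmetry is visible in the structure: the eager model's prediction cost depends on the number of classes, while the lazy model's cost depends on the size of the stored training set.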
272 Congestion Detection & Minimization in Wireless Sensor Network By Using Multipath Rate Organization Technique, Prof. Sachin Patel, Prof. Rakesh Pandit, Mr. Abhijeet Rathod
To achieve higher consistency and load balancing in wireless sensor networks, various multipath routing protocols have been proposed. Moreover, a wireless sensor network typically incorporates heterogeneous applications within the same network: a sensor node may have multiple sensors (light, temperature, motion, etc.) with different transmission characteristics. An important function of the transport layer in WMSNs is congestion control [1]. When an event occurs, the sensor nodes become active in transmitting information; the resulting increase in data traffic may lead to congestion, which causes packet drops and decreases network performance. In this paper, we present a new node-based Congestion Control with Priority Support technique, using the load on a node as an indication of its congestion degree [6]. The rate assigned to each traffic source is based on its priority index as well as its current congestion degree.
273 Performance Analysis of Two Anaphora Resolution Systems for Hindi Language, Priya Lakhmani, Smita Singh, Sudha Morwal
One of the challenges in NLP is to determine which entities are referred to in a discourse and how they relate to each other; this is known as anaphora resolution. There are three main algorithms for anaphora resolution: Hobbs, Centering and the Lappin-Leass algorithm. This paper presents a comparison of two computational models for resolving anaphora. The first model is based on the Lappin-Leass algorithm and the second on the Centering algorithm; both work on the Hindi language. As Hindi is quite complicated with respect to European languages, many factors need to be considered when resolving anaphora. Our computational models use recency as a salience factor. An experiment is conducted on short Hindi stories and the comparative results for both models are summarized. The accuracy of each model is analyzed, and finally a conclusion is drawn on the most suitable model for the Hindi language.
274 Design of Power and Rate Adaptation with Scheduling in Wireless Networks Based on SIC, M. Vinodhini, R. Uthira Devi
Successive Interference Cancellation (SIC) is an effective multiple packet reception (MPR) technique for combating interference in wireless networks. To properly evaluate the use of SIC, a joint design of power and rate adaptation with scheduling in wireless networks is analyzed. Power management and achieving high data rates are important problems in wireless networks, and both are investigated here with the help of SIC. Link scheduling with power adaptation minimizes the total transmission power in the network, while the rate adaptation algorithm controls the data rate. The main objective is to improve the performance (QoS) of the network topology. The joint design of power and rate adaptation with scheduling has great potential to increase throughput gain, successful packet delivery and capacity in wireless networks. The performance of the proposed scheme has been verified using a network simulator to show that the approach is efficient.
275 New Touch Screen Application to Retrieve Speech Information, J. Rajeswari, E. Thanga Selvi
An adaptive speech rate control technology for ultra fast listening that is equivalent to skimming is described. Nowadays, listening to audio books on mobile devices is quite common. People read books at various levels of detail from close reading to skimming. Although a similar feature to skimming is required to efficiently obtain information from audio sources, there is no tool equivalent to skimming for audio playback. Therefore a new speech rate conversion method is developed to efficiently obtain information from audio sources with very fast replay. This algorithm will help not only sighted people to enjoy audio books but also visually impaired people because almost all of their information is obtained from speech. Thus, the implementation of this technology on special audio players for visually impaired people as a new replay function is expected to be useful. Moreover, this technology should be useful for all audio book listeners, not only people with limited sight. A new touch screen application is developed for consumer use.
Finding energy sources to satisfy the world's growing demand is one of the foremost challenges for the coming century. The seasonal movement of the Earth affects the radiation intensity received by solar systems. This work presents the design and construction of an efficient battery charging system based on tracked solar panels, implemented as an energy management system for a line-follower robotic vehicle. The main proposal of the project is a solar tracking mechanism aimed at increasing the power delivered by the solar panels. The robotic vehicle's battery is charged by the solar panel through an optimal charging circuit, using a microcontroller and the BFO algorithm to increase battery charging efficiency. To improve the solar tracking accuracy, a mixed solar-tracking system combining BFO (Bacterial Foraging Optimization) with the PSO (Particle Swarm Optimization) algorithm is developed, so that the proposed mechanism is capable of tracking the maximum light intensity.
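As a rough illustration of the swarm-based optimizers the tracker combines, the sketch below implements a minimal 1-D PSO maximizing a hypothetical light-intensity profile (the quadratic profile, its 42-degree peak and all coefficients are invented for the example; the paper's BFO component and the hybrid coupling are not shown):

```python
import random

def pso_maximize(f, lo, hi, n=20, iters=60, seed=1):
    """Minimal 1-D particle swarm optimization sketch. A BFO variant would
    replace the velocity update with chemotaxis-style tumble/run steps."""
    rng = random.Random(seed)
    xs = [rng.uniform(lo, hi) for _ in range(n)]
    vs = [0.0] * n
    pbest = xs[:]                 # each particle's personal best position
    gbest = max(xs, key=f)        # swarm-wide best position
    for _ in range(iters):
        for i in range(n):
            r1, r2 = rng.random(), rng.random()
            # Inertia + cognitive pull toward pbest + social pull toward gbest.
            vs[i] = 0.7 * vs[i] + 1.5 * r1 * (pbest[i] - xs[i]) + 1.5 * r2 * (gbest - xs[i])
            xs[i] = min(hi, max(lo, xs[i] + vs[i]))
            if f(xs[i]) > f(pbest[i]):
                pbest[i] = xs[i]
        gbest = max(pbest, key=f)
    return gbest

# Hypothetical intensity profile peaking when the panel angle is 42 degrees.
intensity = lambda angle: -(angle - 42.0) ** 2
best = pso_maximize(intensity, 0.0, 90.0)
print(round(best, 1))
```

In the tracker, f would be replaced by a measured light-sensor reading rather than an analytic function.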
277 Estimation and Detection of Blood Pressure Using Smart Phones without Using Cuff, V. Sathya, R. Mohan Raj
Smart phones are very popular in today's life. Today's phones are equipped with high-resolution cameras, high-end processors and built-in sensors such as accelerometers, orientation sensors and light sensors. Motivated by this trend and the diverse capabilities of smart phones, we focus on utilizing them for biomedical applications. Blood pressure is a significant vital sign, and blood pressure monitoring is of great significance in determining the health status of patients. In this project, blood pressure, temperature and heart beat are estimated using sensors, and the results are viewed on a smart phone via GSM; if an abnormal result occurs, it is sent to a designated mobile number through SMS. A smart phone application is created to view the result as a waveform. We estimate the systolic and diastolic pressure. A device that can measure all vital signs would be useful in such an event, and transmission of the vital signs measured using the smart phone can be a life saver in critical situations.
278 Fingerprint Authentication System Using Minutiae Matching and Application, M. Sathiya Moorthy, R. Jayaraj, Dr. J. Jagadeesan
Fingerprints are the most widely used biometric feature for person identification and verification in the field of biometric identification. Fingerprints possess two main types of features that are used for automatic fingerprint identification and verification: (i) global ridge and furrow structure that forms a special pattern in the central region of the fingerprint and (ii) minutiae details associated with the local ridge and furrow structure. This paper presents the implementation of a minutiae-based approach to fingerprint identification and verification and serves as a review of the different techniques used in the various steps of developing a minutiae-based Automatic Fingerprint Identification System (AFIS). The technique presented in this paper is based on the extraction of minutiae from the thinned, binarized and segmented version of a fingerprint image.
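The minutiae-based matching step described above is commonly implemented by pairing nearby minutiae between two templates. The sketch below is a simplified illustration, not the paper's algorithm: it assumes the templates are already aligned, and the tolerance values and sample minutiae are invented. Each minutia is an (x, y, angle) tuple, and the score is the fraction of matched points:

```python
import math

def match_minutiae(probe, gallery, d_tol=10.0, a_tol=math.radians(15)):
    """Greedy one-to-one pairing of (x, y, angle) minutiae.
    Returns matched pairs / max template size (pre-alignment assumed)."""
    used, matched = set(), 0
    for (x1, y1, a1) in probe:
        best, best_d = None, d_tol
        for j, (x2, y2, a2) in enumerate(gallery):
            if j in used:
                continue
            d = math.hypot(x1 - x2, y1 - y2)
            # Wrapped angular difference in [0, pi].
            da = abs((a1 - a2 + math.pi) % (2 * math.pi) - math.pi)
            if d <= best_d and da <= a_tol:
                best, best_d = j, d
        if best is not None:
            used.add(best)
            matched += 1
    return matched / max(len(probe), len(gallery))

t1 = [(10, 10, 0.1), (40, 35, 1.2), (70, 20, 2.0)]
t2 = [(12, 9, 0.15), (41, 33, 1.25), (90, 80, 0.5)]
print(match_minutiae(t1, t2))  # 2 of 3 minutiae pair up -> 0.666...
```

A verification system would then compare this score against a decision threshold tuned on known genuine and impostor pairs.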
279 A Statistical Communication Analysis Model for Attack Detection in Mobile Network, Amit Kumar, Omprakash Tailor, Mr. Krishna Kumar
Security is one of the most critical issues in mobile networks, as it ensures reliable communication over the network. Because of the dynamic nature of Mobile Ad-hoc Networks, the chances of security issues increase. These attacks are mounted at different security layers of the network; to handle them, researchers have defined many authentication-based, prevention-based and detection-based approaches. In this paper, one such standard security model, based on statistical analysis, is described. The presented analysis model performs communication analysis under different attacks and analyzes the severity of each attack. The work is presented as a constraint-based model description so that reliable communication can be ensured.
280 A Study on Different Approaches on Software Review Mining, Anju, Anshul Anand
Questionnaire-based analysis is one of the traditional approaches for software quality evaluation under different measurement vectors. However, a questionnaire generally involves technical terms and constraints, so end users are often not interested in completing the answer sequence. An improved way to obtain user-interest analysis for software is software review analysis. In this approach, user feedback is accepted in the form of text reviews, and intelligent analysis of the reviews is performed to identify the software product context and analyze the review comments. Based on this analysis, the overall user interest in the software product is identified. In this paper, a study of different approaches to software product review mining is presented.
In earlier work, several data aggregation schemes have been proposed to overcome privacy problems in wireless sensor networks. These methods analyze secure data more efficiently than traditional aggregation because cluster heads can directly aggregate the ciphertexts without decryption, thereby reducing the transmission overhead. However, the data aggregation scheme still has two major problems in the aggregation process: the main cluster head does not receive the entire data, and it cannot verify data truthfulness and dependability via message digests. To overcome these problems, this work first proposes a Generalized Geometric Programming method to select the best cluster head for data aggregation by finding the shortest-hop inter-CH route. In this scheme, transmission overhead is reduced, and coverage-time maximization is formulated as a signomial optimization problem that is efficiently solved using Generalized Geometric Programming (GGP) techniques. The optimal cluster sizes for individual data aggregation are obtained from this analysis. Experimental results show that the transmission overhead is reduced while the common sensing data remain recoverable. Furthermore, the design is general and can be adopted for data aggregation in both homogeneous and heterogeneous wireless sensor networks using GGP clustering or cluster-head routing selection.
282 Characteristic Evaluation of Distributed QoS Routing, Shuchita Upadhyaya, Gaytri Devi
In the current Internet scenario, the demand for real-time multimedia applications has increased. These applications are bandwidth greedy and impose strict delay guarantees, stable jitter and low packet loss probabilities, requiring a fixed Quality-of-Service (QoS) assurance in transmission. Present Internet routing methods, based on the best-effort paradigm, are not able to provide the performance assurance required by these applications. A mechanism is therefore needed that considers the QoS factors (delay, jitter, bandwidth, etc.) during transmission. There are many aspects of the network that contribute to quality-of-service guarantees, but one of the key technologies is QoS routing. The basic problem of QoS routing is to find a path satisfying multiple constraints: it focuses on identifying a path that considers multiple parameters such as bandwidth, delay, jitter, cost and hop count instead of just one. Both routing schemes, source routing and distributed routing, can be used to provide QoS guarantees. In source routing, path computation is done at the source node, whereas in distributed routing, path computation is distributed among the intermediate routers between source and destination. Both have important roles to play in QoS routing. Source routing is seen as impractical in the Internet because the complete explicit path would have to be included in the IP header; it is used in today's Internet only for special cases, such as mapping the network with traceroute or troubleshooting. Distributed routing is currently the dominant method in the Internet. This paper describes the distributed routing approach and its implementation in the QoS domain.
Many distributed QoS routing algorithms have been proposed in the literature, varying in their QoS metrics and protocols. The paper discusses some of these algorithms and provides a characteristic evaluation depicting their type, the metrics considered and the state information used.
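One simple heuristic for the multi-constraint path problem discussed above is to prune links that violate the bandwidth constraint and then run a shortest-delay search over what remains. The sketch below illustrates this idea only; the topology and constraint values are invented, and it is not any specific algorithm surveyed by the paper:

```python
import heapq

def qos_route(graph, src, dst, min_bw, max_delay):
    """Bandwidth-pruned shortest-delay path: drop links below the bandwidth
    constraint, then run Dijkstra on delay and check the delay bound.
    graph: {node: [(neighbor, delay, bandwidth), ...]}"""
    pq, seen = [(0, src, [src])], set()
    while pq:
        delay, u, path = heapq.heappop(pq)
        if u == dst:
            # Dijkstra pops dst at its minimum delay; check the bound once.
            return (path, delay) if delay <= max_delay else None
        if u in seen:
            continue
        seen.add(u)
        for v, d, bw in graph.get(u, []):
            if bw >= min_bw and v not in seen:
                heapq.heappush(pq, (delay + d, v, path + [v]))
    return None

net = {
    "A": [("B", 2, 100), ("C", 1, 10)],
    "B": [("D", 2, 100)],
    "C": [("D", 1, 10)],
}
# The low-delay A-C-D path fails the 50 Mb/s bandwidth constraint,
# so the feasible A-B-D path is chosen instead.
print(qos_route(net, "A", "D", min_bw=50, max_delay=10))
```

A distributed implementation would spread this computation across intermediate routers using their local state, rather than running it at one node as this sketch does.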
Image compression is a method of transmitting an image with minimal size without degrading its quality; however, the result of image compression is often less than optimal. Image processing is a method of converting an image into digital form and extracting useful information. Cloud computing (SaaS) is based on on-demand self-service with a pay-as-you-use model; Software as a Service is a cloud computing service model that uses cloud computing infrastructure to deliver an application to many users. In the existing system, uploading a quality image to the web requires about 20 KB. The objective of this proposed work is to further reduce the memory space needed to store and retrieve images, using the Haar Wavelet Transform. The compressed image is then deployed in the cloud for efficient sharing of images among users.
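The Haar wavelet transform mentioned above compresses by concentrating most of a signal's energy into a few coefficients, so small detail coefficients can be zeroed with little visible loss. A minimal 1-D sketch follows (a real pipeline applies the 2-D transform to image blocks plus quantization and entropy coding, all omitted here; the sample row values and threshold are invented):

```python
import math

def haar_1d(signal):
    # Full orthonormal 1-D Haar decomposition (length must be a power of two).
    s = list(signal)
    out, n = [], len(s)
    while n > 1:
        avgs = [(s[2 * i] + s[2 * i + 1]) / math.sqrt(2) for i in range(n // 2)]
        difs = [(s[2 * i] - s[2 * i + 1]) / math.sqrt(2) for i in range(n // 2)]
        out = difs + out          # coarser details go to the front
        s, n = avgs, n // 2
    return s + out                # [approximation, details coarse -> fine]

def inverse_haar_1d(coeffs):
    s, k = coeffs[:1], 1
    while k < len(coeffs):
        d = coeffs[k:2 * k]
        s = [v for a, b in zip(s, d)
             for v in ((a + b) / math.sqrt(2), (a - b) / math.sqrt(2))]
        k *= 2
    return s

def compress(signal, threshold):
    # Zero out small detail coefficients -> fewer nonzero values to store.
    return [c if abs(c) >= threshold else 0.0 for c in haar_1d(signal)]

row = [100, 102, 99, 101, 20, 22, 19, 21]   # one image row (two flat regions)
c = compress(row, threshold=2.0)
print(sum(1 for v in c if v == 0.0))         # → 6 of 8 coefficients dropped
print([round(v, 1) for v in inverse_haar_1d(c)])
```

Only two coefficients survive, yet the reconstruction stays within about half a gray level of the original row, which is the property the proposed cloud image store exploits.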
284 Design of Power Efficient Low-Cost Embedded Control Systems for Domestic Induction Heating Appliances, S. Shanthi, S. Muthukrishnan, G. Mohanambal
The demand for better-quality, safe and power-efficient products has grown in recent years, and safe, efficient and quick induction heating appliances attract more customers. This work describes the model of the induction heating process, the design of the inverter circuit and the execution results. The heating coil, power converter unit and closed feedback system are very important design factors because they decide the overall operating performance of the induction heater, including its efficiency. The circuit is simulated using the Proteus software and the performance is analysed using the experimental results.
285 Encryption of JPEG2000 Images using Watermarking, Renuga Devi.J, Priyadharshini.K, Umamaheswari.V
The proposed system aims to secure JPEG2000 images using digital image watermarking. The watermarking scheme is robust against noise, scaling and filtering attacks. JPEG2000 image encryption and decryption are performed using the RC4 stream cipher, while watermark embedding and extraction are done using robust watermarking techniques: Spread Spectrum, Scalar Costa Scheme and Rational Dither Modulation. The proposed work is evaluated by Peak Signal to Noise Ratio (PSNR), which is calculated from the Mean Square Error (MSE).
Cloud computing is an emerging computing paradigm which involves virtualization, distributed computing, networking, software and web services. Cloud computing stores data and disseminates resources in an open environment. Load balancing is one of the main challenges in the cloud environment; it aims to optimize resource use, maximize throughput and avoid overload, and requires distributing the dynamic workload across multiple nodes to ensure that no single node is overwhelmed. SQ(d) scheduling algorithms can maintain load balance and provide minimal job scheduling and resource allocation cost. In order to gain maximum profit with optimized load balancing, it is necessary to utilize resources efficiently. The proposed work therefore addresses the JIQ (Join-Idle-Queue) algorithm, which provides efficient load balancing: it effectively reduces the system load and the communication overhead at job arrivals while maintaining a low actual response time.
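The Join-Idle-Queue idea referenced above can be sketched briefly: servers report themselves to the dispatcher's "I-queue" when they become idle, so an arriving job is placed on an idle server in O(1) without probing server loads at arrival time. The toy single-dispatcher version below is a sketch under simplified assumptions (the four-server scenario is invented, and the multi-dispatcher case is not modeled):

```python
import random

class JIQDispatcher:
    """Minimal Join-Idle-Queue sketch: jobs go to an idle server when one is
    known, otherwise to a random server; servers re-join the I-queue when
    their queue drains."""
    def __init__(self, n_servers, seed=0):
        self.loads = [0] * n_servers
        self.idle = list(range(n_servers))   # the I-queue of idle servers
        self.rng = random.Random(seed)

    def dispatch(self):
        if self.idle:
            s = self.idle.pop()              # O(1): no load probing at arrival
        else:
            s = self.rng.randrange(len(self.loads))
        self.loads[s] += 1
        return s

    def complete(self, s):
        self.loads[s] -= 1
        if self.loads[s] == 0:
            self.idle.append(s)              # server reports itself idle

d = JIQDispatcher(4)
servers = [d.dispatch() for _ in range(4)]
print(sorted(servers), d.idle)   # first four jobs each get a distinct idle server
```

The communication saving claimed in the abstract comes from moving the messaging off the critical path: idle notifications happen at job completion, not at job arrival.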
287 Secured Energy Optimization for Wireless Multimedia Sensor Networks using Fuzzy Logic, Sindhu Duraisamy, R. Priya, Vinu Raja VijayaKumar
Network congestion, data accuracy and network lifespan are major concerns for resource-constrained Wireless Multimedia Sensor Networks (WMSNs). Multimedia data comprise a large volume of information that must be transmitted over the network, and these multimedia collections may include audio, video and scalar data. The memory and resource demands of transmitting multimedia data can result in congestion, packet drops, buffer overflow, and degradation of throughput and quality of service. To overcome this, the network in this paper is designed to be deployed as a heterogeneous sensor network. A new fuzzy logic scheme is introduced which involves two phases: an assortment phase and a dispatching phase. In the first phase, the fuzzy logic assortment technique classifies the incoming multimedia streams. In the second phase, the segregated data are routed through designated paths using an ant-based routing scheme. Security is provided by a one-way hash function. Finally, we compare the proposed protocol with an existing distributed predictive and verification algorithm; the results show that the proposed scheme has greater QoS merits.
288 Analysis of Integer Transformation and Quantization Blocks using H.264 Standard and the Conventional DCT Techniques, Priyanka P James, Chirappanath B Albert, Inbalina.K
In the H.264 standard, transformation is a technique for converting image samples into elementary frequency components. The integer transform helps remove redundant data from an image and involves only integer arithmetic, while quantization reduces the precision of the transform coefficients. H.264 is a lossy compression format because of integer transformation and quantization. This paper deals with understanding and analyzing the reduction in complexity of the integer transformation and quantization blocks using H.264 compared with the conventional DCT techniques.
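The complexity reduction discussed above comes from H.264's 4x4 forward core transform being defined by a small integer matrix Cf, applied as Y = Cf·X·Cf^T, so no floating-point DCT arithmetic is needed (the per-coefficient scaling that completes the transform is folded into quantization and omitted in this sketch):

```python
# H.264 4x4 forward core transform matrix: integer arithmetic only.
CF = [[1, 1, 1, 1],
      [2, 1, -1, -2],
      [1, -1, -1, 1],
      [1, -2, 2, -1]]

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def transpose(M):
    return [list(r) for r in zip(*M)]

def forward_core_transform(block):
    # Y = Cf * X * Cf^T
    return matmul(matmul(CF, block), transpose(CF))

# A flat (DC-only) block: all energy lands in the top-left coefficient,
# which is what makes the subsequent quantization step so effective.
flat = [[10] * 4 for _ in range(4)]
coeffs = forward_core_transform(flat)
print(coeffs[0][0])  # → 160
print(all(coeffs[i][j] == 0 for i in range(4) for j in range(4) if (i, j) != (0, 0)))
```

Because every entry of Cf is 0, ±1 or ±2, hardware and software implementations can replace multiplications with additions and shifts, which is the complexity advantage over the real-valued DCT.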
289 Low Power QVCO Using Adiabatic Logic, Vergin Jeyaseeli.F, Udhaya Kumar.S
A new low-phase-noise, low-power quadrature voltage-controlled oscillator (QVCO) using adiabatic logic is proposed; the technique reduces power consumption. The QVCO consists of two identical current-switching differential Colpitts VCOs, in which the first core VCO is coupled to the second in an in-phase mode, and the second core VCO is tightly coupled to the first in an anti-phase mode. To combine the two core VCOs, the substrates of the cross-connected transistors as well as the substrates of the MOS varactors are used; no additional elements are needed for coupling, which reduces both the noise and the power dissipation. Compared to the existing system, the power is reduced by up to 0.02 nW and the frequency range by up to 1.8 MHz.
290 Identity Management System to Ensure Cloud Security, Miss. Priyanka S. Rathod, Prof. Mr. R.R. Keole
Cloud computing can provide seemingly infinite computing resources on demand due to its inherently high scalability, which eliminates the need for cloud service providers to plan far ahead for hardware provisioning. Security is currently the biggest challenge in promoting cloud computing, and trust has proved to be one of the most important and effective means of constructing security in distributed systems. Multi-located data storage and services in the cloud make privacy issues even worse. In order to efficiently and safely construct trust relationships between entities in cloud and cross-cloud environments, identity management services are crucial in cloud computing infrastructures: they authenticate users and support flexible access control to services based on user identity properties (also called attributes) and past interaction histories. Such services should preserve the privacy of users while enhancing interoperability across multiple domains and simplifying the management of identity verification.
291 FUFM-High Utility Itemsets in Transactional Database, S. Priya, E. Thenmozhi, Mrs. D. Shiny Irene
The practical usefulness of frequent itemset mining is limited by the significance of the discovered itemsets. There are two principal limitations: a huge number of frequent itemsets that are not interesting to the user are often generated when the minimum support is low, and frequency alone does not reflect the utility of an itemset. This paper proposes two algorithms, namely Utility Pattern Growth (UP-Growth) and UP-Growth+, for mining high utility itemsets with a set of effective strategies for pruning candidate itemsets.
292 Importance of Virtual Reality in Current World, Shiny Mathew
Virtual reality (VR) is considered an important technology, offering scope for a great leap in diverse fields. Virtual reality, sometimes referred to as immersive multimedia, is a computer-simulated environment that can simulate physical presence in places in the real world or in imagined worlds. This paper highlights the importance of this simulated reality, stating how VR has gone through advancements giving us a cutting-edge technology. It provides in-depth detail about the origin and history of early VR and elaborates on the current standing of this technology in society. It illustrates the misconceptions about how fully developed the technology is and how we can overcome them. People focus on VR mainly for entertainment, but its real impacts are in arts, business, communication, design, education, engineering, medicine and many other fields.
293 A Study of Web Traffic Analysis, Mr. Pratik V. Pande, Mr. N.M. Tarbani, Mr. Pavan V. Ingalkar
With the rapidly increasing popularity of the WWW, websites play a crucial role in conveying knowledge and information to end users. Discovering hidden and meaningful information about web users' usage patterns is critical for determining effective marketing strategies and for optimizing web server usage to accommodate future growth. Most currently available web server traffic analysis tools provide explicit statistical information. These tools use the web access logs that are generated on the server while users access the website; a web access log comprises entries such as the name of the user, IP address, number of bytes transferred, timestamp, etc. The task of web traffic analysis becomes more challenging when the web traffic volume is enormous and keeps growing. In this paper, we propose models to discover and analyze useful knowledge from the available web log data, and we also provide a comparative study of the variety of existing log analyzer tools that help in analyzing web server traffic.
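Parsing the web access log entries described above (user, IP address, bytes transferred, timestamp) is typically done with a Common Log Format pattern. A minimal sketch follows; the sample log lines are invented:

```python
import re
from collections import Counter

# Common Log Format: host ident user [time] "method path proto" status bytes
LOG_RE = re.compile(
    r'(?P<ip>\S+) \S+ (?P<user>\S+) \[(?P<time>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<path>\S+) \S+" (?P<status>\d{3}) (?P<bytes>\d+|-)'
)

def analyze(lines):
    """Count hits per path and total bytes served; skip malformed lines."""
    hits, bytes_total = Counter(), 0
    for line in lines:
        m = LOG_RE.match(line)
        if not m:
            continue
        hits[m["path"]] += 1
        if m["bytes"] != "-":
            bytes_total += int(m["bytes"])
    return hits, bytes_total

log = [
    '10.0.0.1 - alice [10/Oct/2023:13:55:36 +0000] "GET /index.html HTTP/1.1" 200 1024',
    '10.0.0.2 - - [10/Oct/2023:13:55:40 +0000] "GET /index.html HTTP/1.1" 200 1024',
    '10.0.0.1 - alice [10/Oct/2023:13:56:02 +0000] "GET /about.html HTTP/1.1" 404 512',
]
hits, total = analyze(log)
print(hits.most_common(1), total)  # → [('/index.html', 2)] 2560
```

Production log analyzers layer sessionization, bot filtering and time-window aggregation on top of exactly this kind of per-line extraction.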
294 Energy Saving in Wireless Sensor Network using Attribute Based Dynamic Routing, G. Subha, M. Nava Bharathy
In wireless sensor networks, sensor nodes are deployed in various environments for monitoring temperature, pressure and other quantities. These sensor nodes have limited energy and cannot be recharged easily, so to utilize the energy efficiently, this project proposes an attribute-based energy saving mechanism. The information sensed by the sensors is aggregated based on packet attributes, inspired by the concept of pheromone in ant colony optimization. The sensed data are sent to the sink along various paths using a potential-based dynamic routing protocol, which reduces the redundant information produced by adjacent sensors while preserving accuracy. If sensor nodes are compromised or the information is modified by an adversary, the information becomes unusable to the user, so the data must be encrypted before being sent to the base station.
Cloud computing allows users to view computing in a new direction, as it uses existing technologies to provide better IT services at low cost. To offer high QoS to customers according to the SLA, a cloud services broker or cloud service provider uses individual cloud providers that work collaboratively to form a federation of clouds. This is required in applications such as real-time online interactive applications and weather research and forecasting, in which the data and applications are complex and distributed. In these applications, secret data must be shared, so a secure data sharing mechanism is required in federated clouds to reduce the risk of data intrusion and loss of service availability, and to ensure data integrity. In this paper, we propose a zero-knowledge data sharing scheme in which a Trusted Cloud Authority (TCA) controls the federated clouds for data sharing: the secret to be exchanged for computation is encrypted, and retrieved by the individual cloud at the end. Our scheme is based on the difficulty of solving the Discrete Logarithm problem (DLOG) in a finite abelian group of large prime order, which is believed to be computationally intractable. The proposed scheme thus provides data integrity in transit and data availability even when one of the host providers is unavailable during the computation.
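The abstract does not detail the TCA protocol, but its DLOG hardness assumption is the same one behind a Diffie-Hellman style exchange, which the toy sketch below illustrates (the 64-bit prime is far too small for real security and is chosen only so the example runs instantly; real deployments use standardized 2048-bit+ groups, and this is not the paper's actual scheme):

```python
import secrets

P = 0xFFFFFFFFFFFFFFC5   # 2**64 - 59, a prime; toy-sized, NOT secure
G = 5                    # generator of the multiplicative group (assumed)

def keypair():
    # Recovering priv from pub = G^priv mod P is the discrete log problem.
    priv = secrets.randbelow(P - 2) + 1
    return priv, pow(G, priv, P)

def shared_secret(my_priv, their_pub):
    # (G^b)^a = (G^a)^b mod P: both parties derive the same value.
    return pow(their_pub, my_priv, P)

a_priv, a_pub = keypair()        # e.g. cloud A in the federation
b_priv, b_pub = keypair()        # e.g. cloud B
print(shared_secret(a_priv, b_pub) == shared_secret(b_priv, a_pub))  # → True
```

An eavesdropper sees only `a_pub` and `b_pub`; deriving the shared value from them requires solving DLOG in the group, which is exactly the intractability the paper's scheme relies on.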
296 Implementation of Multibiometric System Using Iris and Thumb Recognition, Ashish Naghate, Mayur Sahu, Pranju Bhange, Swati Lonkar, Pallavi Wankhede, Yamini Bute
A multibiometric system relies on the evidence presented by multiple sources of biometric information. Most biometric systems presently in use typically rely on a single biometric trait to establish identity (i.e., they are unibiometric systems). Multibiometric systems address the issue of non-universality (i.e., limited population coverage) encountered by unibiometric systems. There are various traits used in multibiometric security systems: gait recognition, voice recognition, palm recognition, face recognition, iris recognition, fingerprint recognition and vein recognition. Of these, two main approaches are used here: fingerprint and eye iris. Combining these two unibiometric systems results in a multibiometric system that gives more security to users.
297 Software Quality Assessment in Object Based Architecture, N. Jayalakshmi, Nimmati Satheesh
Software metrics are required to measure quality in terms of software performance and reliability-related characteristics such as dependencies, coupling and cohesion. They provide a way to measure the progress of code during development and have a direct relationship with the cost and time incurred in the later stages of software design and development. These major issues must be checked and reported early in the development stage, so that the reliability of any software product can be ensured for large and complex software projects. Object-oriented software metrics focus directly on issues such as the complexity, reliability and robustness of software developed using object-oriented design methodologies, reflecting quality attributes such as functionality, scalability, usability, performance, reliability, maintainability, durability, serviceability, availability, installability, structuredness and efficiency. There are two types of parameters, namely functional and non-functional. Functional parameters deal with the functionality or functional aspects of the application, while non-functional parameters deal with desirable properties that a developer usually does not think of at development time, such as usability, maintainability, extensibility, reusability, effort, manageability and cost [1, 2, 3]. To understand the internal structure of the product, one should understand the interdependencies between metrics parameters and software quality parameters. Figure 1 shows these interdependencies as measured by object-oriented metrics. Object-oriented metrics provide the parameters through which one can estimate the complexity and quality-related issues of any software in the early stages of development.
Three object-oriented metrics suites, namely the MOOD, CK and QMOOD metrics, are considered, and a case study is given to show how these metrics are useful in determining the quality of software designed using the object-oriented paradigm.
Large applications executing on Grid or cluster architectures consisting of hundreds or thousands of computational nodes create problems with respect to reliability. The source of the problems is node failures and the need for dynamic configuration over extensive runtime. This paper presents two fault-tolerance mechanisms called Theft-Induced Checkpointing and Systematic Event Logging. These are transparent protocols capable of overcoming problems associated with both benign faults, i.e., crash faults, and node or subnet volatility. Specifically, the protocols base the state of the execution on a dataflow graph, allowing for efficient recovery in dynamic heterogeneous systems as well as multithreaded applications. By allowing recovery even under different numbers of processors, the approaches are especially suitable for applications with a need for adaptive or reactionary configuration control. The low-cost protocols offer the capability of controlling or bounding the overhead. A formal cost model is presented, followed by an experimental evaluation. It is shown that the overhead of the protocol is very small, and the maximum work lost by a crashed process is small and bounded. One possible solution to address heterogeneity is to use platform-independent abstractions such as the Java Virtual Machine. However, this does not solve the problem in general. There is a large base of existing applications that have been developed in other languages, and reengineering may not be feasible for performance or cost reasons. Environments like Microsoft .NET address portability, but only few scientific applications on Grids or clusters exist.
Mobile devices are now used by everyone. With an increasingly mobile society and the worldwide deployment of mobile and wireless networks, wireless communications can support many current and emerging healthcare applications, including health monitoring, intelligent emergency management systems, healthcare data access, and pervasive mobile telemedicine. This paper presents the design and development of a pervasive health system enabling self-management by patients throughout their daily activities. The proposed system integrates patient health monitoring, status logging for capturing any problems or symptoms encountered, and social sharing of the recorded information within the patient's community, with the aim of facilitating disease management.
300 Performance Analysis of Topology based Routing Protocols in VANET, M. Chitra, S. Sivasathya, B. Muthamizh
Vehicular Ad Hoc Network (VANET) is an emerging field in wireless technology. Data dissemination in VANET is complicated by high mobility and continuous changes in network topology. In this study we analyze the performance of topology-based routing protocols under two different traffic scenarios in VANET. To analyze the performance of these protocols, we consider QoS parameters such as average throughput, end-to-end delay, packet delivery ratio and average jitter. The paper considers different topology-based routing protocols, including OLSR, IARP, AODV, DYMO and ZRP: OLSR and IARP are proactive protocols, AODV and DYMO are reactive protocols, and ZRP is a hybrid protocol. The performance of these protocols has been analyzed and presented using QualNet Simulator 5.0.2.
Graphical passwords and text passwords are the most widely used forms of primary user authentication for websites due to their convenience and simplicity. These kinds of passwords are static in nature, so it is easy for attackers to obtain them using malicious programs and threats. The security drawbacks of the existing methods are phishing, keyloggers, malware, sniffing, spoofing, surfing and guessing. These problems can be overcome by using an OTP protocol, which is dynamic and also greatly mitigates man-in-the-middle, password-reuse and password-stealing attacks. The proposed method requires a long-term password for login that must be remembered. Finally, the OTP is generated using the MD5 algorithm. The authentication system is proposed for Android smartphones.
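The abstract does not give the exact OTP construction, so the following is only a minimal HOTP-style sketch under the assumption that the MD5 digest of a shared secret plus a moving counter is truncated to a fixed number of digits; the function names and format are hypothetical.

```python
import hashlib

def generate_otp(secret: str, counter: int, digits: int = 6) -> str:
    # Hash the shared secret together with a moving counter; server and
    # client both advance the counter, so each OTP is used only once.
    digest = hashlib.md5(f"{secret}:{counter}".encode()).hexdigest()
    # Interpret the hex digest as an integer and keep the last `digits` digits.
    return str(int(digest, 16) % 10**digits).zfill(digits)

def verify_otp(secret: str, counter: int, candidate: str, digits: int = 6) -> bool:
    return generate_otp(secret, counter, digits) == candidate
```

Because the code depends on the counter, a stolen OTP is useless for the next login, which is how the dynamic scheme defeats replay and password-reuse attacks.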
302 Intelligent Monitoring of Patients in Hospitals Using CAN Protocols and ARM7TDMI Processor, Dr. C. Gurudas Nayak, Siddu S Kadiwal, Shobha Kadiwal, Aruna Kumar Angadi
In a hospital, constantly monitoring multiple patients is a major issue when a patient is not in an intensive care unit. This paper presents a monitoring system that has the capability to monitor physiological parameters of multiple patients and alert doctors if a patient's physiological parameters go beyond critical values. In the proposed system, an Electronic Control Unit is attached near the patient's body to collect the physiological parameters and send them to the base station. The sensors attached to the patient's body can sense heart rate, blood pressure and so on. The system can detect abnormal conditions, issue an alarm to the patient and send an SMS to the physician. The main advantages of this system over previous ones are reduced energy consumption to prolong the network lifetime, faster and extended communication coverage, and increased freedom for the patient, enhancing quality of life. We have developed this system in a multi-patient architecture for hospital healthcare.
303 Cloud Security Tracking, Log Maintenance and Notification System for Net Banking Cloud Applications, Miss. Bhagyashree A. Dhamande, Mr. Amit Sahu
Cloud computing offers an innovative business model for organizations to adopt IT services without upfront investment. Its advantages are scalability, resilience, flexibility, efficiency and the outsourcing of non-core activities. Our system is intended to be a multi-tenant, mission-critical cloud-based application. Security is one of the major issues hampering the growth of the cloud. This paper presents a detailed analysis of cloud computing security issues and challenges, focusing on the cloud computing types and the service delivery types, and develops a cloud-based system that demonstrates tracking of activities, maintains a log of events and notifies users about probable threats.
304 Design of Power Optimization using C2H Hardware Accelerator and NIOS II Processor, Mr. Sufiyan B. Mukadam, Prof. Abhijit S. Titarmare
The current trend in the silicon industry has been a steady move towards Chip Multicore Processor (CMP) systems to obtain better throughput. However, chip multicore processors suffer higher rates of soft errors, which degrade overall system reliability. Hence, designers have been cautious about using CMP architectures for fast, reliable embedded real-time applications that demand high reliability levels. The widespread use of these processors also indicates the processor-migration tendency: with new processor architectures, older ones vanish sooner. We present power optimization and a detailed reliability analysis of single-core and multicore based systems. The analysis results are then used to compare the power optimization and reliability of CMP architectures with the corresponding reliability of single-processor architectures. To fulfil this requirement, we design a method for power optimization using the NIOS II processor. Reducing power consumption in embedded systems that use field-programmable gate arrays (FPGAs) is increasingly important, particularly for battery-powered applications or to reduce heat or system cost. Parallel algorithms can exploit the parallel architecture of FPGA devices to accomplish more work per clock cycle, allowing the clock frequency to be lowered. High-level development tools such as the SOPC Builder and the NIOS II C-to-Hardware Acceleration (C2H) Compiler are tremendously useful for exploiting the power-saving potential of FPGA hardware by easily adding hardware accelerators and lowering clock frequencies, thereby optimizing power.
305 Density-Based Spatial Clustering with Noise – A Survey, Naveen Kumar, S. Sivasathya
Spatial data mining is the task of discovering knowledge from spatial data, and density-based spatial clustering occupies an important position within it. This paper presents a detailed survey of density-based spatial clustering of data. The various algorithms, based on DBSCAN (Density-Based Spatial Clustering of Applications with Noise), are described and compared on the basis of various attributes and their different pitfalls.
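For readers unfamiliar with DBSCAN itself, the core algorithm the surveyed variants build on can be sketched in a few lines. This is a minimal, brute-force illustration (no spatial index), not any of the surveyed optimizations:

```python
from math import dist

def dbscan(points, eps, min_pts):
    """Label each point with a cluster id, or -1 for noise."""
    labels = [None] * len(points)
    cluster = 0
    neighbors = lambda i: [j for j in range(len(points))
                           if dist(points[i], points[j]) <= eps]
    for i in range(len(points)):
        if labels[i] is not None:
            continue
        seeds = neighbors(i)
        if len(seeds) < min_pts:      # not a core point: mark as noise
            labels[i] = -1
            continue
        labels[i] = cluster           # start a new cluster
        queue = [j for j in seeds if j != i]
        while queue:
            j = queue.pop()
            if labels[j] == -1:       # noise may be re-claimed as a border point
                labels[j] = cluster
            if labels[j] is not None:
                continue
            labels[j] = cluster
            nb = neighbors(j)
            if len(nb) >= min_pts:    # j is a core point: expand the cluster
                queue.extend(nb)
        cluster += 1
    return labels
```

Points in dense regions grow into clusters; isolated points stay labeled -1, which is exactly the "with noise" behavior that distinguishes DBSCAN from partitioning methods like k-means.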
306 Application for Intra-College Communication Based on Cloud Computing, Neethuanna Mathew, Namrata Salgar, Pretty Varghese, Mr. Yogesh Pawar
Cloud computing is a rapidly growing technology offering virtualized resources as a service through the Internet. These services prove to be of great use in education, providing applications and communication tools for students and teachers. This system is based on the concept of web services and is implemented as an Android mobile application that communicates with Android and Java clients. The proposed system provides a cost-effective application for users in their daily life; cloud-based applications offer academic institutions a better alternative at far lower expense.
Image processing plays an important role in object detection, and many technologies are used for it. However, detectors may face problems such as congestion and noise effects. To remove these distortions, we use the region prop method along with skull detection, which helps remove distortions arising during object detection and recognizes the particular object rather than noise or other distortions. Hence it gives a better result than previous techniques.
In this paper a new very-large-scale integration (VLSI) algorithm for a 2N-length discrete Hartley transform (DHT) is presented; it can be efficiently implemented on a highly modular and parallel VLSI architecture having a regular structure. The DHT algorithm can be efficiently split into several parallel parts that can be executed concurrently. We present a new approach to designing VLSI algorithms and architectures based on a synergistic treatment of the problems at the algorithmic, architectural and implementation levels. Moreover, the proposed algorithm is well suited to sub-expression sharing techniques, which can significantly reduce the hardware complexity of the highly parallel VLSI implementation and increase the speed of the parallel multipliers. Using the advantages of the proposed algorithm and the fact that multipliers with the same constant can be efficiently shared, the number of multipliers has been reduced to be very small compared with existing algorithms. Thereby, the cost and power of the design can be reduced, both through efficient implementation of the transforms and through reduction or removal of intermediate stages using different techniques. In summary, we efficiently replace the adder and multiplier in the existing highly modular and parallel architecture with a faster adder and a high-speed multiplier, resulting in a significant reduction of overall power consumption and propagation delay, increased speed, and reduced overall hardware complexity of the system.
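As a reference for what the hardware computes, the DHT itself can be defined in a few lines. This is only a functional sketch (a naive O(N²) software definition via the DFT identity H[k] = Re(X[k]) - Im(X[k])), not the paper's parallel VLSI algorithm:

```python
from cmath import exp, pi

def dft(x):
    # Naive discrete Fourier transform, O(N^2).
    N = len(x)
    return [sum(x[n] * exp(-2j * pi * n * k / N) for n in range(N))
            for k in range(N)]

def dht(x):
    # DHT: H[k] = sum_n x[n] * cas(2*pi*n*k/N), cas(t) = cos(t) + sin(t).
    # From the DFT X[k]: Re(X[k]) = sum x[n]*cos(...), Im(X[k]) = -sum x[n]*sin(...),
    # so H[k] = Re(X[k]) - Im(X[k]).
    return [X.real - X.imag for X in dft(x)]
```

A useful sanity check is the DHT's involution property: applying the transform twice returns the input scaled by N, so the same hardware can compute forward and inverse transforms.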
The abundance of remotely sensed data raises the challenge of how to process and analyze it as quickly as possible. In accordance with Grid conformity across heterogeneous computing resources, a Grid environment is built for the processing of remotely sensed images. In this study, CSF4 is taken as the meta-scheduler in the collective layer of such a network environment. Message transmission is implemented by a protocol defined by the Grid middleware GRAM (Globus Resource Allocation Manager). SGE, LSF and OpenPBS are used in the fabric layer of the Grid environment. As an example of remotely sensed image processing in the application layer, image smoothing is implemented under the MPICH-G2 programming model. The relationship between the number of nodes and time consumption is analyzed, and the efficiency is shown by comparing parallel and serial processing under different node counts and image sizes.
310 Digital Watermarking of Wavelet Transforms Based on Coding and Decoding Techniques, Mrs. Rashmi Soni, Prof. M. K. Gupta
Digital watermarking is an important technology in today's world for preventing illegal copying of data. The technique can be applied to audio, video, text or images. This paper surveys the features and concepts of various watermarking techniques such as DCT and DWT, and the purpose of digital and image watermarking. The sudden increase in interest in watermarking is most likely due to growing concern over copyright protection: copyright-protected digital content is easily recorded and distributed owing to the availability of high-capacity digital recording devices and the explosive growth of the Internet. The watermark carries information about the object in which it is hidden.
Sensor networks are dense wireless networks of small, low-cost sensors that collect and disseminate environmental data. Wireless sensor networks facilitate monitoring and controlling of physical environments from remote locations with better accuracy. They have applications in a variety of fields such as environmental monitoring, military purposes and gathering sensing information in inhospitable locations. The sensor nodes in a wireless sensor network are battery-powered devices that consume energy during data transmission, processing, and so on. The critical task in a WSN is optimizing energy consumption. Our main focus here is on enhancing the energy levels of WSN nodes by saving energy using a multi-sink scenario.
A wireless sensor network (WSN) is a network of small, lightweight wireless nodes that are highly distributed and deployed in large numbers. Wireless sensor networks provide an economical approach to deploying control devices and distributed monitors, avoiding expensive wired systems. Communication in wireless sensor networks consumes energy, and the main concern here is to avoid battery wastage. The cluster head is chosen according to minimum battery consumption by applying an election algorithm, and the base station (BS) is placed within the deployed area of the wireless sensor network. In our proposed work, NS-2 is used as the simulator to implement the whole scenario.
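The abstract does not specify the election algorithm, so the following is only a plausible sketch of energy-aware cluster-head election: each round, the node with the most residual energy becomes head, and being head costs extra energy, which naturally rotates the role and avoids draining any one battery.

```python
def elect_cluster_head(energy):
    """Pick the node with the most residual energy as cluster head.

    `energy` maps node id -> remaining battery (e.g., in joules).
    """
    return max(energy, key=energy.get)

def simulate_rounds(nodes, head_cost, rounds):
    # Re-elect each round; the head pays an extra energy cost, so the
    # role rotates among nodes instead of exhausting one battery.
    energy = dict(nodes)
    heads = []
    for _ in range(rounds):
        head = elect_cluster_head(energy)
        heads.append(head)
        energy[head] -= head_cost
    return heads, energy
```

With three nodes of unequal charge, the head role visibly rotates as each elected head's reserve drops below its neighbors'.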
In a distributed database the data is physically stored on two or more computer systems, and the DDBMS manages and organizes the whole database as a single collection of data; as a result, individuals are able to access data from any of the database systems because the data is replicated among all the systems in the network. A distributed database may be partitioned and replicated in addition to being distributed across multiple sites, all of which is invisible to the users. In this sense, distributed database technology extends the concept of data independence, a central concept of database management, to situations where data are distributed and replicated over a number of machines connected by a network. In this paper our main focus is on improving the security of the distributed database.
314 Survey of the Green within Computing, Vishnu Kumar M, Ganapathy Sundaram V, Beulah Joice R
Green computing is a term used to describe an innovative way in which technology and ecology converge. Green computing ultimately focuses on ways of reducing overall environmental impact. Recent studies around the world have shown that sustainable IT services require the integration of green computing practices such as recycling, electronic waste removal, reduced power consumption, virtualization, improved cooling technology, and optimization of the IT infrastructure to meet sustainability requirements. This paper provides a literature review of problems arising from data centers, green computing techniques, and a life-cycle perspective on energy and power.
315 Watermarking of Dataset with Usability Constraints Model, R. Shankari, V. Sindhiya, D. Vidhya, Mrs. D. Shiny Irene, Mrs. M. Arshiya Mobeen
Large datasets are being mined to extract hidden knowledge and patterns that assist decision makers in making effective, efficient, and timely decisions in an ever more competitive world. This type of "knowledge-driven" data mining activity is not possible without sharing the datasets between their owners and data mining experts (or corporations); as a consequence, protecting ownership of the datasets (by embedding a watermark) is becoming relevant. The most important challenge in watermarking datasets that are to be mined is how to preserve the knowledge in their features or attributes. Usually, an owner needs to manually define "usability constraints" for each type of dataset to preserve the contained knowledge. Our model aims at preserving the "classification potential" of each feature and other major characteristics of datasets that play an important role during the mining process; as a result, learning statistics and decision-making rules also remain intact. We have implemented the model and integrated it with a new watermark embedding algorithm to show that the inserted watermark not only preserves the knowledge contained in a dataset but also significantly enhances watermark security compared with existing techniques.
316 5G Technology - Evolution and Revolution, Meenal G. Kachhavay, Ajay P. Thakare
In this paper, an attempt has been made to review the existing generations of mobile wireless technology in terms of their portals, performance, advantages and disadvantages. The paper throws light on the evolution and development of the various generations of mobile wireless technology, along with their significance and the advantages of each over its predecessor. In the past few decades, mobile wireless technologies have experienced four to five generations of technology revolution and evolution, from 1G to 4G. Current research in mobile wireless technology concentrates on advanced implementation of 4G and 5G technology. The 5G term is not yet officially used; 5G research is investigating the development of the World Wide Wireless Web (WWWW), Dynamic Ad-hoc Wireless Networks (DAWN) and the Real Wireless World. In this paper we propose a novel network architecture for next-generation 5G mobile networks, in which the mobile terminal can change its Radio Access Technology (RAT) based on certain user criteria.
317 An Advanced Security - A Two-Way Password Technique for Cloud Services, Yogesh Brar, Shobhit Krishan, Ankur Mehta, Vipul Talwar, Tanupriya Choudhury, Vasudha Vashisht
Cloud computing is a model for delivering information technology services in which resources are safely stored on and retrieved from the Internet through web-based tools and applications, rather than through a direct connection to a server. Gathering resources whenever and wherever needed is a big issue for smaller companies and a matter of great concern; cloud computing offers a solution to smaller firms. Using the Internet as the backbone, cloud computing provides accessibility to end users on a "whenever and wherever" basis. Such a system, with access to the web, allows employees to work remotely. Users are only concerned with the computing services they have asked for; the details of how the task is carried out are hidden from the user. The data is secured, stored in massive data centres, and can be accessed from any device all over the world. Research shows that the architecture of current cloud computing systems is centrally structured: all data nodes must be indexed by a master server, which may become a bottleneck of the system. Cloud computing finds use in various areas like web hosting, graphics rendering, financial modeling and web crawling.
318 Anonymously Share Data on Group Signature in the Large Groups of Cloud, Rengasamy.R, Guru Rani.G
We propose a secure multi-owner data sharing scheme. It implies that any user in the group can securely share data with others via the untrusted cloud. Our proposed scheme is able to support dynamic groups efficiently. Specifically, newly granted users can directly decrypt data files uploaded before their participation without contacting the data owners. User revocation can be easily achieved through a novel revocation list without updating the secret keys of the remaining users. The size and computation overhead of encryption are constant and independent of the number of revoked users. We provide secure and privacy-preserving access control, guaranteeing that any member of a group can anonymously utilize the cloud resource. Moreover, the real identities of data owners can be disclosed by the group manager when disputes occur. We provide rigorous security analysis and perform extensive simulations to demonstrate the efficiency of our scheme in terms of storage and computation overhead.
Security has been a core issue in the operation of computer systems. To safeguard data, organizations impose password policies that, in a way, ascertain that there is a degree of security on files that may be sensitive in nature. These policies differ between organizations, and their effectiveness depends on their success, which mainly depends on the behavior of users and how closely they follow the policy. This research focuses on that aspect: it addresses end-user acceptance and attempts to improve these policies by introducing mobile-phone tokens using Bluetooth and Rijndael encryption. This ensures that users are authenticated by the Windows password login and then further authenticated, to gain access to their most private files, using their Bluetooth-enabled mobile phones. In this way we can have less frequent password changes, or less strict policies that users resist, and provide an extra feature allowing an automated environment that uses proximity sensing to verify whether the mobile token is in range. This paper assesses whether an implementation of this system provides extra security for files and also improves password policies.
320 A Review on Novel Approach for MRI Image Detection using Kochanek-Bartels Splines with Masking Algorithm, Sukhjit Kaur, Pooja Sharma
Encryption of images plays a very important role: it protects an image from unauthorized access. Image processing usually refers to digital image processing, but optical and analog image processing are also possible. MRI is widely used in preoperative and postoperative evaluation of patients. Magnetic Resonance Imaging (MRI) is a powerful visualization technique that allows images of the internal anatomy to be acquired in a safe and non-invasive way. It is based on the principles of Nuclear Magnetic Resonance (NMR) and allows a vast array of different types of visualization to be performed. In this paper, we use splines and a masking algorithm to detect MRI images.
321 A Review on Novel Approach of Boundary Detection and Image Segmentation using Brightness Gradient and Cardinal Splines, Pooja Sharma, Sukhjit Kaur
Image processing is a fast and cost-effective process that provides a number of techniques for handling problems like noise and signal distortion. Image segmentation is a fundamental process in many image, video and computer vision applications; it underpins the fundamentals of digital image processing, is used for image enhancement, and is also useful in various medical applications. On the basis of pixel intensity we can differentiate the boundaries of different objects. Segmentation identifies separate objects within an image and also finds the boundaries between different regions. In this paper, we use the brightness gradient and cardinal splines for segmentation.
322 A New Approach for Key Forwarding Scheme in WSN Using Mobile Sink, E. Revathi, C. Darwin, G. Vivitha
A dynamic en-route filtering scheme addresses both false-report injection attacks and DoS attacks in wireless sensor networks. In our scheme, sensor nodes are organized into clusters. Each legitimate report should be validated by multiple message authentication codes (MACs), which are produced by sensing nodes using their own authentication keys. The authentication keys of each node are created from a hash chain. Before sending reports, nodes disseminate their keys to forwarding nodes using a Hill Climbing approach. Then, they send reports in rounds. In each round, every sensing node endorses its reports using a new key and then discloses the key to the forwarding nodes. Using the disseminated and disclosed keys, the forwarding nodes can validate the reports. Each node can monitor its neighbors by overhearing their broadcasts, which prevents compromised nodes from changing the reports. Report forwarding and key disclosure are repeatedly executed by each forwarding node at every hop until the reports are dropped or delivered to the base station. We assume that the topologies of wireless sensor networks change frequently, either because sensor nodes are prone to failures or because they need to switch their states between active and sleeping to save energy. This paper also proposes a framework to maximize the lifetime of the WSN by using a mobile sink: each node does not need to send data immediately as it becomes available; instead, it can store the data temporarily and transmit it when the mobile sink is at the most favorable location for achieving the longest WSN lifetime.
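The hash-chain key mechanism described above can be sketched briefly. This is only an illustrative model (SHA-256, function names assumed): keys are generated forward by repeated hashing but disclosed in reverse order, so a forwarder can verify a newly disclosed key by hashing it until it reaches a previously committed value.

```python
import hashlib

def make_hash_chain(seed: bytes, length: int):
    # k_0 = H(seed), k_i = H(k_{i-1}). Keys are *used and disclosed* in
    # reverse order, so a disclosed earlier-index key can be checked by
    # hashing it forward to a later-index key that was committed first.
    chain = [hashlib.sha256(seed).digest()]
    for _ in range(length - 1):
        chain.append(hashlib.sha256(chain[-1]).digest())
    return chain

def verify_disclosed_key(disclosed: bytes, committed: bytes, max_steps: int = 16) -> bool:
    """Check that `committed` is reachable from `disclosed` by repeated hashing."""
    h = disclosed
    for _ in range(max_steps):
        h = hashlib.sha256(h).digest()
        if h == committed:
            return True
    return False
```

Because hashing is one-way, an attacker who overhears a disclosed key cannot derive the undisclosed keys used in later rounds.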
323 Defending against Flood Attacks in Disruption Tolerant Networks using Blowfish Algorithm, S. Pushpa, B. Mani Megalai, A. Wisy Shantha
Disruption Tolerant Networks (DTNs) utilize the mobility of nodes and the opportunistic contacts among nodes for data communications. Due to the limitation in network resources such as contact opportunity and buffer space, DTNs are vulnerable to flood attacks in which attackers send as many packets or packet replicas as possible to the network, in order to deplete or overuse the limited network resources. In this paper, we employ rate limiting to defend against flood attacks in DTNs, such that each node has a limit over the number of packets that it can generate in each time interval and a limit over the number of replicas that it can generate for each packet. We propose a distributed scheme to detect if a node has violated its rate limits. To address the challenge that it is difficult to count all the packets or replicas sent by a node due to lack of communication infrastructure, our detection adopts claim-carry-and-check: each node itself counts the number of packets or replicas that it has sent and claims the count to other nodes; the receiving nodes carry the claims when they move, and cross-check if their carried claims are inconsistent when they contact. The claim structure uses the pigeonhole principle to guarantee that an attacker will make inconsistent claims which may lead to detection. We provide rigorous analysis on the probability of detection, and evaluate the effectiveness and efficiency of our scheme with extensive trace-driven simulations.
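The pigeonhole argument behind claim-carry-and-check can be shown with a small sketch. This is a simplified model, not the paper's full protocol: each claim is a (sender, count, packet id) triple, and a violation surfaces either as a count above the limit or as the same count reused for two different packets.

```python
def check_claims(claims, limit):
    """Detect a rate-limit violation from carried claims.

    Each claim is (sender, count, packet_id): the sender asserts this is
    its `count`-th packet of the interval. By the pigeonhole principle, a
    sender that injects more than `limit` packets must either claim a
    count above the limit or reuse a count for two different packets.
    Returns the offending sender, or None if all claims are consistent.
    """
    seen = {}
    for sender, count, packet_id in claims:
        if count > limit:
            return sender                      # over-limit claim
        prev = seen.get((sender, count))
        if prev is not None and prev != packet_id:
            return sender                      # inconsistent (reused) count
        seen[(sender, count)] = packet_id
    return None
```

In the real protocol the claims are carried opportunistically and cross-checked when nodes meet, so detection is probabilistic; this sketch shows only the consistency check itself.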
Image compression may be lossy or lossless. Lossless compression is preferred for archival purposes and often for medical imaging, technical drawings, clip art, or comics. Lossy compression methods, especially when used at low bit rates, introduce compression artifacts. Lossy methods are especially suitable for natural images such as photographs in applications where minor (sometimes imperceptible) loss of fidelity is acceptable to achieve a substantial reduction in bit rate. The lossy compression that produces imperceptible differences may be called visually lossless. In numerical analysis and functional analysis, a discrete wavelet transform (DWT) is any wavelet transform for which the wavelets are discretely sampled. As with other wavelet transforms, a key advantage it has over Fourier transforms is temporal resolution: it captures both frequency and location information (location in time).
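The temporal-resolution advantage mentioned above is easiest to see in the simplest DWT, the Haar transform. The sketch below shows one decomposition level on a 1-D signal: pairwise scaled sums (approximation) and differences (detail), which is lossless and exactly invertible.

```python
from math import sqrt

def haar_dwt(signal):
    """One level of the Haar DWT: pairwise averages (approximation) and
    differences (detail), scaled by 1/sqrt(2) to preserve energy."""
    approx, detail = [], []
    for a, b in zip(signal[::2], signal[1::2]):
        approx.append((a + b) / sqrt(2))
        detail.append((a - b) / sqrt(2))
    return approx, detail

def haar_idwt(approx, detail):
    # Exact inverse: recover each input pair from its sum/difference.
    out = []
    for s, d in zip(approx, detail):
        out.append((s + d) / sqrt(2))
        out.append((s - d) / sqrt(2))
    return out
```

Each detail coefficient is tied to one specific pair of samples, which is the "location in time" information a Fourier coefficient does not carry; lossy wavelet coders work by quantizing or discarding small detail coefficients.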
325 Vehicle to Vehicle Communication using RFID along with GPS and WAP, A. Vanitha Katherine, R. Muthumeenakshi, N. Vallilekha
This paper studies vehicle-to-vehicle communication for sharing safety messages. The communication is one-to-many, local, and geo-significant. The vehicular communication network is ad hoc, highly mobile, and has large numbers of contending nodes. We design several random-access protocols for medium access control. The protocols fit within the facilities of RFID, GPS and WAP; this can be done using DSRC.
The main objective is to propose a system for ATM security applications. Bankers collect the customer's fingerprints and mobile number while opening the account; the customer can then access the ATM machine. When the customer enters the ATM and inserts the card, he must place a finger on the fingerprint module; an automatically generated 4-digit code is then sent each time as a message to the mobile of the authorized customer through a GSM modem connected to the microcontroller. The customer enters the received code by pressing keys on the touch screen, and only then is further transaction possible. This proposal will go a long way towards solving the problem of account safety.
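The server side of the 4-digit SMS code step can be sketched as follows. This is an illustrative host-side sketch only (the abstract's system runs on a microcontroller with a GSM modem); the function names are assumptions.

```python
import hmac
import secrets

def issue_code() -> str:
    # Generate an unpredictable 4-digit code to send via SMS.
    # secrets (not random) gives cryptographic-quality randomness.
    return f"{secrets.randbelow(10000):04d}"

def check_code(expected: str, entered: str) -> bool:
    # Constant-time comparison avoids leaking digits through timing.
    return hmac.compare_digest(expected, entered)
```

A fresh code would be issued per transaction and discarded after one comparison, so an intercepted code cannot be replayed later.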
327 Comparative Analysis of Various Underwater Image Enhancement Techniques, Shiwam S. Thakare, Amit Sahu
Image enhancement is the process of improving the quality of an image by improving its features. In this paper a comparative analysis of various enhancement techniques for underwater images is presented. Underwater images suffer from low contrast and resolution due to poor visibility conditions, so object identification becomes a difficult task. Processing captured underwater images is necessary because their quality is degraded, which leads to serious problems compared with images from a clearer environment. A lot of noise occurs due to low contrast, poor visibility conditions, absorption of natural light, non-uniform lighting, little color variation, and blur. For these reasons a number of methods exist to restore underwater images, and different filtering techniques for their processing and enhancement are also available in the literature.
328 Using OLAP with Diseases Registry Warehouse for Clinical Decision Support, A. K. Hamoud, Dr. Taleb A. S. Obaid
Disease registry databases in Iraqi hospitals hold a huge amount of information that can be used to support strategic clinical decisions, but clinicians and professionals often need tools to extract the valuable information, which cannot be obtained with the normal functions of operational databases. On-Line Analytical Processing (OLAP) gives insight and a clear view of the information from different perspectives and makes it very easy and fast to view results as reports. The researchers used a clinical data warehouse to give decision makers timely results that assist them in supporting their decisions. Two database sources were used to construct and build the data warehouse so that OLAP can be used to view multidimensional data.
329 Performance Analysis of Classification Algorithms, Payal Pahwa, Manju Papreja, Renu Miglani
Classification is the process of finding a model for partitioning data into different classes; it generalizes and assigns a class label to a set of unclassified cases. In drug design, classification algorithms help identify the class of a newly designed drug (test data set) on the basis of an existing training data set. In this paper we analyze and compare the behavior of different kinds of classification algorithms on medical data sets taken from the literature.
330 Secret Splitting Scheme: A Review, Nikita Dhule, Mr. Amit Sahu
Secret splitting is employed to protect sensitive information, such as cryptographic keys. It is used to divide a secret value into a number of parts (shares) that must be combined to recover the original value. These shares can then be given to individual parties, who protect them using standard means, e.g., memorizing them, or storing them in a computer or a safe. Secret splitting is used in modern cryptography to minimize the risks associated with compromised data: splitting a secret distributes the risk of compromising the value across several parties. Standard security assumptions of secret-splitting schemes state that when an adversary gets access to any number of shares lower than some defined threshold, it gains no information about the secret value. In recent years, the security of operations taking place over a network has become important. It is necessary to protect such actions against malicious users who may try to misuse the system (e.g., steal credit card numbers, read private mail, execute actions without authorization, or impersonate other users). Many cryptographic protocols and schemes have been designed to solve problems of this kind.
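The simplest instance of the idea is n-out-of-n XOR splitting, sketched below for illustration (threshold schemes such as Shamir's generalize this to k-out-of-n): all but one share are uniformly random, so any subset missing even one share reveals nothing about the secret.

```python
import secrets

def xor_bytes(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

def split_secret(secret: bytes, n: int):
    """n-out-of-n splitting: the XOR of all shares equals the secret.

    The first n-1 shares are uniformly random; the last is the secret
    XORed with all of them, so any n-1 shares are themselves uniform
    and leak nothing about the secret.
    """
    shares = [secrets.token_bytes(len(secret)) for _ in range(n - 1)]
    last = secret
    for s in shares:
        last = xor_bytes(last, s)
    return shares + [last]

def combine_shares(shares):
    out = bytes(len(shares[0]))
    for s in shares:
        out = xor_bytes(out, s)
    return out
```

Each party stores one share; only by bringing all shares together does the XOR collapse back to the original value.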
331 To Enhance Reliability of Dynamic Clustering Using Self Learning Technique: A Review, Shivani Garg
A wireless sensor network is a type of ad hoc network: any sensor node can join or leave the network at will, i.e., it is self-configuring in nature, and no central controller is present. Wireless sensor nodes are responsible for data routing in the network. Wireless sensor networks are used to monitor environmental conditions like pressure, temperature and humidity, and are deployed in remote places like forests and deserts. Wireless sensor nodes are very small in size and have limited resources, and in such remote places it is very difficult to recharge or replace the battery of the sensor nodes. Under these conditions, we focus on reducing the battery consumption of the sensor nodes. In our work, a new technique is proposed to reduce battery utilization, based on dynamic clustering using a neural network: before data transmission, the sensor nodes form clusters dynamically using the neural network.
332 Data Mart Designing and Integration Approaches, Rashmi Chhabra, Payal Pahwa
Today companies need strategic information to counter fiercer competition, extend market share and improve profitability. So they need an information system that is subject-oriented, integrated, non-volatile and time-variant. The data warehouse is the viable solution: an integrated repository of data gathered from many sources and used by the entire enterprise. In order to standardize data analysis and enable simplified usage patterns, data warehouses are normally organized as problem-driven, small units called data marts. Each data mart is dedicated to the study of a specific problem, and the data marts are merged to create the data warehouse. This paper discusses the design and integration of data marts and various techniques used for integrating them.
333 Approaches for Web Service Selection, Vijayalaxmi S Jeure, Y.C.Kulkarni
Web services are a technology for transmitting data over the Internet and allowing programmatic access to that data using standard Internet protocols. There may be many services providing similar properties, effects, capabilities, and interfaces, and selecting the one service that matches the user's requirements is a difficult task. Quality of Service (QoS) attributes provide a differentiation among the competing services, allowing a prospective user to choose the services which best suit his/her QoS requirements. This paper addresses precisely this component. We discuss different approaches for web service selection and propose a particle swarm optimization algorithm for the selection of web services, to match consumers with services based on QoS attributes as closely as possible. Particle swarm optimization is a population-based stochastic optimization technique and search procedure. A population of agents called particles is created and uniformly distributed over the search region. Each particle's position is evaluated according to the objective function; if the particle's current position is better than its previous best, the best is updated, and the particle moves to a new position. The evaluation is repeated to find the best suitable position.
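A minimal sketch of the particle swarm update just described, on a one-dimensional objective standing in for a QoS score to maximize. The inertia and acceleration coefficients (w, c1, c2) are illustrative defaults, not values from the paper.

```python
import random

def pso_maximize(f, lo, hi, n_particles=20, iters=100,
                 w=0.7, c1=1.5, c2=1.5):
    """Maximize f over [lo, hi] with a basic particle swarm."""
    pos = [random.uniform(lo, hi) for _ in range(n_particles)]
    vel = [0.0] * n_particles
    pbest = pos[:]                 # each particle's best-seen position
    gbest = max(pos, key=f)        # swarm-wide best position
    for _ in range(iters):
        for i in range(n_particles):
            r1, r2 = random.random(), random.random()
            # velocity: inertia + pull toward personal and global bests
            vel[i] = (w * vel[i]
                      + c1 * r1 * (pbest[i] - pos[i])
                      + c2 * r2 * (gbest - pos[i]))
            pos[i] = min(max(pos[i] + vel[i], lo), hi)
            if f(pos[i]) > f(pbest[i]):
                pbest[i] = pos[i]
            if f(pos[i]) > f(gbest):
                gbest = pos[i]
    return gbest
```

In the service-selection setting, f would be a weighted aggregate of QoS attributes (response time, cost, reliability) and the position would encode a candidate service assignment.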
334 Intelligent System for Brain Diseases Diagnosis Using Neural Network and Bayesian Methods, Amir Y Mahdi, Shaker k Ali, Rabia R Mkamis
Brain disease is a general term that covers a broad range of brain disorders. It can occur in all age groups, but becomes more common in old age. Medical diagnosis is an important but complicated task that should be performed accurately and efficiently, and its automation would be very useful in the medical domain. In this paper we design an expert system model to detect and diagnose degenerative brain disease. The disease is determined using an artificial neural network and Bayesian theory, the most commonly used techniques for developing expert systems. The system receives the patient's information and passes it to the ANN, which compares it with the reference values for the human body stored in the network; when the symptoms yield similar values for more than one disease, the specific disease is determined by Bayes' law. Moreover, the system can recommend treatments for the diseases in its specialism. These treatments are suggested based on many conditions and constraints related to the patient and the diagnosed disease, so the system can be used to assist doctors at lower cost.
335 Review on Text Clustering Based on Frequent Itemset, Prajakta Jaswante, Dr. P.R. Deshmukh
Recently the vast amount of textual information available in electronic form has been growing at a staggering rate. This increasing volume of textual data has led to the task of mining useful or interesting frequent itemsets (words/terms) from very large text databases, which still seems quite challenging. The use of such frequent itemsets for text clustering has received a great deal of attention in the research community, since the mined frequent itemsets reduce the dimensionality of the documents drastically. In the proposed research, we consider an efficient approach for text clustering based on frequent itemsets. A renowned method, the Apriori algorithm, is used for mining the frequent itemsets. The mined frequent itemsets are then used for obtaining the partition, where the documents are initially clustered without overlapping. Furthermore, the resultant clusters are effectively obtained by grouping the documents within the partition by means of derived keywords. Finally, for experimentation, any dataset can be used, and the obtained outputs can ensure that the performance of the proposed approach is improved effectively.
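A minimal sketch of Apriori-style frequent itemset mining over tokenized documents, as a toy illustration of the mining step described above; the helper names are ours, not the paper's, and a real implementation would prune candidates far more aggressively.

```python
def apriori(docs, min_support):
    """Return all itemsets (frozensets of terms) that appear in at
    least `min_support` documents, mined level by level."""
    transactions = [set(doc) for doc in docs]

    def support(itemset):
        return sum(itemset <= t for t in transactions)

    # level 1: frequent single terms
    items = {term for t in transactions for term in t}
    frequent = {frozenset([i]) for i in items
                if support(frozenset([i])) >= min_support}
    result, k = set(frequent), 2
    while frequent:
        # candidate k-itemsets: unions of frequent (k-1)-itemsets
        candidates = {a | b for a in frequent for b in frequent
                      if len(a | b) == k}
        frequent = {c for c in candidates if support(c) >= min_support}
        result |= frequent
        k += 1
    return result
```

The mined itemsets would then seed the initial non-overlapping document partition described in the abstract.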
336 Review on “Data Mining with Big Data”, Vitthal Yenkar, Prof. Mahip Bartere
Big Data refers to large-volume, complex, growing data sets with multiple independent sources. With the rapid evolution of data, data storage and networking collection capability, Big Data is now speedily expanding in all science and engineering domains. Big Data mining is the ability to extract constructive information from huge streams of data or datasets, a task made difficult by their variability, volume, and velocity. Data mining involves exploring and analyzing large quantities of data to locate different patterns in big data; artificial intelligence (AI) and statistics are the fields which develop these techniques. This paper discusses and characterizes applications of the Big Data processing model and the Big Data revolution from the data mining outlook. The analysis of big data can be troublesome because it often involves the collection and storage of mixed data based on different patterns or rules (heterogeneous mixture data). This has made the heterogeneous mixture property of data a very important issue. This paper introduces "heterogeneous mixture learning". We study the tough issues in the Big Data revolution and in the data-driven model.
337 Review on “Adaption of Ranking Model for Domain Specific Search”, Mr. Pratik R.Mantri, Prof. Mahip M.Bartere
New vertical domains are emerging every day, so running one sophisticated ranking model is no longer desirable, as the domains differ; building a separate model for each domain is also not favorable, because much time is required for labeling the data and training the samples. In this paper we address this problem with a regularization-based algorithm called ranking adaptation SVM (RA-SVM), which adapts the existing ranking model of a sophisticated search engine to a new domain. Performance is still guaranteed, while the time taken to label the data and train the samples is reduced. The algorithm only requires predictions from the existing ranking model and does not require its internal structure. The adapted ranking model concentrates on the specific domain to achieve superior results relevant to the search; further, it reduces the search cost, as the most appropriate search results are shown.
338 Review on “An analysis of the Management Security and Security Algorithms in Cloud Computing”, Nilesh N.Chawande, Prof. Jayant P.Mehare
Cloud computing has elevated IT to newer limits by offering the market data storage and capacity, with flexible and scalable computing and processing power to match elastic demand and supply while reducing capital expenditure. However, the opportunity cost of the successful implementation of cloud computing is effectively managing security in cloud applications; with the constant increase in the popularity of cloud computing there is an ever-growing security risk. Thus security has become a main and top concern. In this paper, we analyze management security and various security algorithms in cloud computing.
339 Automatic Segmentation of Digital Images Applied in Cardiac Medical Images, Mukesh G. Mahore, Vrushali V. Dhanrale, Harshad R. Borde, Pooja G.Lahoti, Suraj B. Borge
Digital image processing is important in medical fields, where it is used for future operations and study purposes. Segmentation is one of the main factors: image segmentation plays a crucial role in many medical imaging applications by automating or facilitating the delineation of anatomical structures and other regions of interest that may be used in a specific study. Several methods exist for this purpose, but it is difficult to find a method that can easily adapt to different types of images. To address this problem, our paper presents an adaptable segmentation method that gives better segmentation. To define the threshold, the method is based on a model of automatic multilevel thresholding and techniques of group histogram quantization, histogram slope-percentage analysis and entropy calculation. This technique rejects the tissue of biopsies from cardiac transplants.
340 Secure Platform for Wireless Sensor Network, Mohamed Otmani, Abdellah Ezzati
Mesh, ad hoc and MANET networks can be used in several cases. In this paper we focus on two ad hoc routing protocols, in order to use one in real cases without too much difficulty in implementation and securing. We start by introducing the two routing protocols. We then create a mesh network using one of them as the routing protocol, with laptops, Android devices, ARM-based computers and smartphones as end devices. Next, we implement a passive and an active attack in order to collect the data exchanged between nodes. Finally, we secure the communications with virtual private network tunneling and connect the end devices to the cloud.
341 Mobile Database Review and Security Aspects, Bhagat.A.R, Prof. Bhagat.V.B
This article gives an introduction to mobile databases and the security threats that may occur for them in the real world, gives possible solutions to eliminate those threats, and presents a case study of a secure mobile database application. In particular, we design, implement and evaluate a mobile database. The importance of databases in modern businesses and governmental institutions is huge and still growing. Many mission-critical applications and business processes rely on databases. These databases contain data of differing degrees of importance and confidentiality, and are accessed by a wide variety of users. Integrity violations for a database can have a serious impact on business processes; disclosure of confidential data in some cases has the same effect. Traditional database security provides techniques and strategies to handle such problems with respect to database servers in a non-mobile context. We identify a set of security issues and apply appropriate techniques to satisfy the corresponding security requirements.
342 Mobile HealthCare Technology Based on Wireless Sensor, Pimpre.D.M, Bhagat.V.B
The recent advances in Wireless Sensor Networks have given rise to many application areas in healthcare and have produced the new field of Wireless Body Area Networks. Using wearable and non-wearable sensor devices, humans can be tracked and monitored. Monitoring from the healthcare perspective can be with or without the consent of the particular person. Even if it is with the consent of the person involved, certain social issues arise from this type of application scenario: privacy, security, legal and other related issues. Healthcare sensor network applications have a bright future, and it is a must to take up these issues at the earliest; they should be carefully studied and understood, or else they can pose serious problems. In this paper we try to raise and discuss these issues and find some answers to them. The paper focuses on WSN utilization in medicine; in order to present the current state of this research field, a few particular projects have been selected and compared.
343 QADR with Energy Consumption for DIA in Cloud, Ms. S.Soundharya, Mr. R.Santhosh
In today’s IT field, scalability of storage resources is provided by cloud computing, and data-intensive applications (DIAs) are also developed with this technology. When data corruption occurs in these DIAs, the QoS requirements are not met. So, in order to satisfy the requirements, we compare two algorithms from an existing paper. The first, HQFR, uses a greedy algorithm; it cannot minimize the cost of data replication and the count of QoS-violated data replicas. These two objectives of the QADR problem are achieved by the MCMF algorithm, which provides an optimal solution in polynomial time. But compared to the HQFR algorithm its computation time is high, since more nodes have to be considered in a cloud environment. To address this time complexity, a combination-of-nodes technique has been introduced into MCMF. The algorithms have been implemented under the Windows Azure environment, which provides comparatively good results for data replication. Further, this implementation has been extended to consider energy consumption in the cloud environment.
344 Local Octal Pattern: A Proficient Feature Extraction for Face Recognition, Nithya J, Suchitra S, Dr. S Chitrakala
This paper presents a novel and efficient face image representation based on Local Octal Pattern (LOP) texture features. The standard methods viz., the local binary pattern (LBP), the local ternary pattern (LTP) and the local tetra pattern (LTrP) are able to encode with a maximum of four distinct values about the relationship between the referenced pixel and its corresponding neighbors. The proposed method calculates the diagonal, horizontal and vertical directions of the pixels using first-order derivatives. Thereby, it encodes eight distinct values about the relationship between the referenced pixel and its neighbors. The performance of the proposed method is compared with the LBP, the LTP and the LTrP based on the results obtained in terms of average precision and average recall on PubFig image database.
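As a point of comparison for the descriptors above, here is a minimal sketch of the standard 3x3 local binary pattern, which thresholds the eight neighbours against the centre pixel; the LOP itself encodes eight derivative-based values and is not reproduced here.

```python
def lbp_code(img, r, c):
    """8-bit LBP code for pixel (r, c) of a 2D grayscale image
    (list of lists), comparing each neighbour to the centre value."""
    centre = img[r][c]
    # neighbours in clockwise order starting at the top-left
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    code = 0
    for bit, (dr, dc) in enumerate(offsets):
        if img[r + dr][c + dc] >= centre:
            code |= 1 << bit
    return code
```

A face descriptor is then typically the histogram of such codes over image blocks; LTP, LTrP and LOP refine the per-neighbour encoding rather than this overall pipeline.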
345 SMART PHONE BASED SOCIAL NETWORKING FOR TEACHING & LEARNING, G. M. M. Bashir, Md. Atikqur Rahaman, Syed Md. Galib, M.M. Rahaman
Smart devices are becoming popular for their extreme features and attributes. Currently, smart mobile devices provide wireless communication using Wi-Fi, 3G and 2G networks. The proliferation of network coverage and smartphone users makes the warm embrace of social networking services easy and simple. With the perfection of wireless communication, mobile social networking has become a hot research topic at this moment. The characteristics of mobile devices, along with software support, allow us to solve the challenges of developing a social networking platform for teaching and learning. In this paper, we propose a mobile-based architecture that will allow students and faculty members to access their required data. The mobile application will be used to transfer class content, lecture slides, e-books, and tutorials on the fly. Users will be able to place requests for teaching-learning content and blog their queries; the answer from any user will be presented in this system as soon as it is posted. The social network bridges the gap between students and teachers through knowledge dissemination.
Engineering finds its wide range of application in every field, the medical and military fields not excepted. One of the technologies which aid surgeons in performing even the most complicated surgeries successfully is Virtual Reality. In this paper I look at different military fields and applications, present example applications being used in those fields, discuss them, and draw conclusions. Haptics, despite being a relatively new topic in the military field, has a small but certain ground already and will be more important in the future. Even though virtual reality is employed to carry out operations, the surgeon’s attention is one of the most important parameters; any mistake may lead to a dangerous end. So, one may think of a technology that reduces the burdens of a surgeon by providing more efficient interaction than VR. Now our dream has come to reality by means of a technology called “HAPTIC TECHNOLOGY”.
347 Network Assisted Mobile Computing with Efficient Cache Maintenance, K Komarasamy, Dr. L M Nithya
Mobile applications retrieve content from remote servers through user-generated queries. Processing a request fully on the mobile device can quickly drain battery resources. Alternatively, processing the request at remote servers can give slow response times due to the communication latency incurred during transmission of a large query. We use a network-assisted mobile computing method where mid-network nodes with “leasing” capabilities are deployed by a service provider. Leasing computation power can reduce battery consumption on the mobile devices as well as improve response times. We evaluate a dynamic programming algorithm to solve for the optimal processing policies, which suggest the amount of processing to be done at each mid-network node in order to minimize processing and communication latency and processing costs. This research provides efficient cache maintenance at mid-network nodes using the Push and Pull (PP) algorithm and Fast Lender Detection (FLD) to minimize processing time, since content is retrieved from the mid-network nodes.
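A simplified one-shot version of the placement decision described above: choose the node along the path at which to process a query so that total transmission-plus-processing cost is minimal. The cost model (per-link cost times payload size, raw before processing and small after) is illustrative and is not the paper's actual formulation, which optimizes the amount of processing at each node.

```python
def best_processing_node(link_cost, proc_cost, big, small):
    """Pick the node minimizing total cost along a path.
    link_cost[i]: cost per unit of data on link i (n links, n+1 nodes);
    proc_cost[j]: processing cost at node j;
    big/small: payload size before/after processing."""
    best, best_node = float("inf"), None
    for j in range(len(proc_cost)):
        # links 0..j-1 carry the raw query, links j.. carry the result
        cost = (sum(link_cost[:j]) * big
                + proc_cost[j]
                + sum(link_cost[j:]) * small)
        if cost < best:
            best, best_node = cost, j
    return best_node, best
```

Node 0 is the mobile device itself and the last node the remote server, so the search naturally captures the device-vs-server trade-off the abstract describes.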
348 Performance Analysis of Sequential Element for Low Power Clocking System, Umayal.S
The sequential element (flip-flop) is a basic building block for designing any clocking system, which consists of the clock distribution tree and flip-flops. A large portion of the on-chip power is consumed by the clocking system. The total power consumption of the clocking system depends on both the clock distribution tree and the register elements (flip-flops), and the power consumption of the register elements is higher than that of the clock distribution tree, so the objective is to reduce the power consumed by the register elements. A Conditional Data Mapping Flip-Flop (CDMFF) was proposed earlier. The drawbacks of CDMFF are that it uses more transistors, has a floating node on its critical path, and cannot be used in noise-intensive environments. Therefore a method called the Clocked Pair Shared Implicit-pulsed Flip-Flop (CPSFF) is proposed. In this method the number of transistors is reduced by sharing the clocked pair of transistors, and the floating-node problem is avoided by using precharge transistors. The design can be implemented in the DSCH and MICROWIND 3.1 CMOS layout tools. The performance is analyzed in terms of number of transistors (N), area (A), power (P), delay (D-Q) and power-delay product (PDP). Analysis of these parameters shows that the performance of CPSFF is superior to the conventional flip-flop: overall power is reduced in CPSFF compared to the previous method CDMFF, with a 20% power reduction achievable. In addition, due to the absence of the floating-node problem, low-swing voltage and dual-edge clocking can easily be employed in the proposed register element (flip-flop) to construct a clocking system.
349 Using Genetic Algorithm for Whole Test suite Generation of Object Oriented Programs, Mrs.R.Gowri, R.DhanBhagya Preety, B.Durga Devi, G.Aruna
In software testing there is a great demand for the automation of test cases. Test cases are either generated before coding, using software specifications, or after coding, using program execution traces. Many genetic algorithms have been proposed for procedural programming, but they do not suit object-oriented programming well, because in object-oriented programs object relationships exist and considering them is important for generating the best test suite. We propose a method to generate test suites for object-oriented programs using a genetic algorithm, by considering the object relationships and their dependencies, such as polymorphism, message passing and inheritance. The key feature of our proposed technique is that the test suites are evolved as a whole instead of generating one test case for each coverage goal. To investigate the effectiveness of our approach we have used a Java program of a chocolate vending machine as the source code, and a few test cases have been developed to evaluate the fittest test suite with maximum coverage. We have applied our technique in the EVOSUITE tool to determine the efficiency of our approach. Our results indicate that the use of a genetic algorithm in test case generation is more beneficial than the traditional approach of targeting single branches.
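A minimal sketch of the whole-suite idea: the genetic individual is an entire test suite, and fitness counts the coverage goals the suite satisfies as a whole. The representation and operators here are toy stand-ins for those used in tools such as EvoSuite, not the paper's implementation.

```python
import random

def evolve_suite(goals, gene_pool, suite_size=5, pop_size=20,
                 generations=50):
    """Evolve a whole test suite (list of test-case 'genes') that
    maximises the number of coverage goals it satisfies.
    goals: dict mapping test-case gene -> set of goals it covers."""
    def fitness(suite):
        covered = set()
        for test in suite:
            covered |= goals.get(test, set())
        return len(covered)

    def mutate(suite):
        # replace one randomly chosen test case in the suite
        child = suite[:]
        child[random.randrange(suite_size)] = random.choice(gene_pool)
        return child

    pop = [[random.choice(gene_pool) for _ in range(suite_size)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        elite = pop[: pop_size // 2]          # keep the fitter half
        pop = elite + [mutate(random.choice(elite))
                       for _ in range(pop_size - len(elite))]
    return max(pop, key=fitness)
```

Because fitness is evaluated on the suite as a whole, no single branch has to be targeted in isolation, which is exactly the contrast the abstract draws with the traditional approach.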
350 Survey on Secure Updates using Over the Air Programming in Wireless Sensor Network, Naresh M. Bhagat, S. P. Akarte
Wireless Sensor Networks (WSNs) face many challenges, including reliability, flexibility and security. When WSNs deployed in remote locations need to be reprogrammed, environmental conditions often make it impossible to physically retrieve them. Over-the-air programming (OAP) plays an important role in achieving this task; it is a fundamental service in sensor networks that relies upon reliable broadcast for efficient dissemination. Reprogramming protocols provide a convenient way to update program images via wireless communication. In hostile environments where there may be malicious attacks against wireless sensor networks, the process of reprogramming faces threats from potentially compromised nodes. While existing solutions can provide authentication services, they are insufficient for a new generation of network-coding-based reprogramming protocols in wireless sensor networks. We present a security approach that is able to defend against pollution attacks on reprogramming protocols based on network coding. It employs a homomorphic hashing function and an identity-based aggregate signature to allow sensor nodes to check packets on the fly before they accept incoming encoded packets, and introduces an efficient mechanism to reduce the computation overhead at each node and to eliminate bad packets quickly. In this paper, we introduce SenSeOP, a selective and secure OTAP protocol for WSNs. For this purpose, the proposed protocol uses multicast transfer supported by cryptography. We evaluate the performance of our approach in real testbeds, compare it with state-of-the-art protocols, and show that this approach enables efficient and reliable wireless reprogramming.
351 A Review of Jelly Fish Attack in Mobile Adhoc Networks, Manjot Kaur, Anand Nayyar
Mobile ad hoc networks have become a part and parcel of technology advancement due to their operation as autonomous systems. MANETs are vulnerable to various types of attacks and threats due to their unique characteristics, such as dynamic topology, shared physical medium, distributed operation and many more. There are many attacks which affect the functioning of MANETs; denial of service is one of the most commonly used to affect the network. The jellyfish attack has gained its name recently in the attack scenario of mobile ad hoc networks: it exploits the end-to-end congestion control mechanism of the Transmission Control Protocol (TCP).
352 A Comprehensive Review of Distance and Density Based Cluster Head Selection Schemes, Naveen Sharma, Anand Nayyar
One of the most important considerations in designing sensor nodes in a wireless sensor network is prolonging the network lifetime by minimizing energy consumption. The number of clusters and the distribution of cluster heads (CHs) always have a major impact on network performance. Distance- and density-based clustering algorithms can greatly improve the energy efficiency of WSNs because they adopt multi-hop communication within each cluster. Besides, nodes in the neighborhood of the sink node (SN) perform direct transmission to relieve the workload of the CHs. Load balancing and scalability are very crucial factors which play an important role in the selection of the cluster head. In this paper we present a study of different distance- and density-based cluster head selection algorithms for wireless sensor networks and compare them on various parameters.
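A minimal sketch of one combined distance-and-density cluster-head score of the kind surveyed above: a node's score rises with the number of neighbours within a radius and falls with its distance to the sink. The linear weighting is purely illustrative; the surveyed schemes each define their own metric.

```python
import math

def choose_cluster_head(nodes, sink, radius, w_density=1.0, w_dist=1.0):
    """Pick the node with the best density/distance trade-off.
    nodes: list of (x, y) positions; sink: (x, y) position."""
    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])

    def score(node):
        # density: neighbours within `radius`; penalty: distance to sink
        density = sum(1 for other in nodes
                      if other is not node and dist(node, other) <= radius)
        return w_density * density - w_dist * dist(node, sink)

    return max(nodes, key=score)
```

In a real protocol this election would run per round with residual energy folded into the score, which is how load balancing across CHs is usually achieved.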
353 Design of Coarse Grain Architecture for DSP Application, Kushal R. Kalmegh, Prof. Vaishali Tehre
Coarse-grained reconfigurable architectures (CGRAs) consist of an array of a large number of function units (FUs) interconnected by a mesh-style network. Stream-based applications such as telecommunications, data encryption, and signal processing are the workloads in many electronic systems, and in these applications the real-time constraints often impose stringent energy and performance requirements. Application-specific integrated circuits (ASICs) inevitably become a customized solution to meet these ever-increasing demands for highly repetitive parallel computations. In this project we propose a coarse-grain architecture and the mapping of some DSP algorithms onto it.
354 Critical Success Factors for Agile Methodology Adaptation in Software Development Organizations, Vishvadeep Tripathi, Arvind Kumar Goyal
This paper presents the outcome of our extensive literature survey and interaction with a number of agile practitioners. The purpose of this study is to identify critical success factors for adopting agile methodologies in software development organizations. The focus is to determine the critical success factors for any software development organization willing to move from a traditional software development methodology to an agile one.
Steganography is the science of hiding the fact that communication is taking place, by hiding information behind information. Many different carrier file formats can be used, but digital media are the most popular because of their usage on the Internet. For hiding secret information in digital media there exists a large variety of steganography techniques; some are more complex than others, and all of them have advantages and disadvantages. Different applications have different requirements of the steganography technique used. For example, some applications may require absolute invisibility of the secret data, while others require a larger amount of secret data to be hidden. This paper intends to give an overview of digital media steganography, its uses and techniques. It also attempts to identify the requirements of a good steganography algorithm and briefly reflects on which steganographic techniques are more suitable for which applications.
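A minimal sketch of the classic least-significant-bit technique over a flat list of pixel values, as one concrete example of hiding information behind information; real systems operate on image file formats and usually encrypt the payload first.

```python
def hide(pixels, message: bytes):
    """Embed message bits into the LSB of successive pixel values."""
    bits = [(byte >> i) & 1 for byte in message for i in range(8)]
    if len(bits) > len(pixels):
        raise ValueError("cover too small for message")
    # overwrite each pixel's lowest bit with one message bit
    stego = [(p & ~1) | b for p, b in zip(pixels, bits)]
    return stego + pixels[len(bits):]

def reveal(pixels, n_bytes: int) -> bytes:
    """Read n_bytes back out of the pixel LSBs."""
    out = bytearray()
    for i in range(n_bytes):
        byte = 0
        for j in range(8):
            byte |= (pixels[i * 8 + j] & 1) << j
        out.append(byte)
    return bytes(out)
```

Each pixel changes by at most 1, which is why LSB embedding is visually invisible, and also why it illustrates the capacity-versus-robustness trade-off the abstract mentions.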
A new genetic algorithm (GA) is used to detect the locations of License Plate (LP) symbols. An adaptive threshold method has been used to overcome dynamic changes of illumination conditions when converting an image into binary. The detection stage of the LP is the most critical step in an automatic vehicle identification system. The connected component analysis technique (CCAT) is used to detect candidate objects inside the unknown image. Encouraging results, with 96.8% overall accuracy, have been reported for two different datasets having variability in orientation, scaling, plate location, illumination and complex background.
357 Alienation of Melanocytes in Cancer Pretentious Cells using Level Set Segmentation Algorithm, Narmadha.R, Ahalya Mary.J
Melanoma is the dangerous form of skin cancer that shortens the lives of many and causes increased death rates every second. Hence it should be detected at an early stage, which today is achieved only by an expert dermatologist after a painful and time-consuming biopsy. The proposed computer-aided system uses different techniques to reduce the pain and time of the cancer-affected patient. It separates the affected melanocytes in the epidermis area. The cancer image is given as input to the system. As the image contains noise and hair, these should be removed first, which is done by means of a bilateral filter. The local intensity of the image is reduced using a level set segmentation algorithm. LRRS is used to fix the candidate nuclei regions generated by the level-set-segmented image. An SVM (Support Vector Machine) is proposed to recognize the objects, i.e., melanocytes and keratinocytes, in the epidermis area. Thus the proposed system produces accurate results over different histopathological images.
Operating system support for wireless sensor networks (WSNs) plays a major role in building scalable distributed applications that are efficient and reliable. Over the years, a variety of operating systems have emerged in the sensor-net community to facilitate developing WSN applications, yet the design of an operating system for WSNs remains a challenging task. In the proposed model, an optimized kernel model for a portable and easy-to-use WSN operating system is presented, with a smooth learning curve for users with C and UNIX programming experience. The OS features a configuration model that allows reducing application binary code size and build time. In contrast to other wireless sensor network operating systems, MansOS provides both event-based and threaded user application support, including a complete lightweight implementation of preemptive threads.
359 A Review on Various Techniques for Image Deblurring, Shital Hemant Umale
Image deblurring refers to procedures that attempt to reduce the amount of blur in a blurry image and grant the degraded image an overall sharpened appearance, to obtain a clearer image. The point spread function (PSF) is one of the essential factors that needs to be calculated, since it is employed by the different types of deblurring algorithms. In this paper, we study various fast deblurring techniques: Richardson–Lucy and its optimized version, Van Cittert and its enhanced version, Landweber, Poisson MAP, and Laplacian sharpening filters. The usage of the PSF in the deblurring algorithms is explained, and a comparison between the optimized and enhanced algorithms and Laplacian sharpening filters is conducted in terms of the number of mathematical operations, number of iterations employed, computation time, deblurring in the presence of noise, and accuracy measured using the peak signal-to-noise ratio (PSNR).
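A minimal sketch of the Richardson–Lucy iteration in one dimension with a small symmetric PSF, assuming circular boundary handling; production implementations work on 2D images (and the other surveyed methods differ only in the update rule). Note how the PSF enters twice per iteration, once to re-blur the estimate and once (flipped) to back-project the correction.

```python
def convolve(signal, psf):
    """Circular 1D convolution with a PSF centred at its middle tap."""
    n, half = len(signal), len(psf) // 2
    return [sum(psf[k] * signal[(i + k - half) % n]
                for k in range(len(psf)))
            for i in range(n)]

def richardson_lucy(blurred, psf, iterations=50):
    """Iteratively estimate the sharp signal from a blurred one."""
    estimate = [1.0] * len(blurred)
    psf_flipped = psf[::-1]
    for _ in range(iterations):
        reblurred = convolve(estimate, psf)
        # ratio of observed to predicted data (guard against /0)
        ratio = [b / max(r, 1e-12) for b, r in zip(blurred, reblurred)]
        correction = convolve(ratio, psf_flipped)
        estimate = [e * c for e, c in zip(estimate, correction)]
    return estimate
```

With a normalized PSF the iteration conserves total intensity, one of the properties that makes Richardson–Lucy attractive for photon-counting data.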
360 Monitoring and Self-Transmitting Data Using Zone Routing Protocol in Ad-hoc Network towards Effective Mobility Management, S.Jayamoorthy, I.Varalakshmi, S.Kumarakrishnan
361 SAAS – A Gateway to Cost Effective Secure Vehicular Clouds, M.R.Yasmeen, M.Ramya Devi
Vehicular Cloud Communication (VCC) is the latest buzz in conventional cloud computing. In a Vehicular Cloud (VC), commuters share resources ranging from storage to computing power, or rent them to others over the Internet. The broad horizon calls for covering various aspects of security, social impact and cost-effective communication. Much research has been carried out on VC architecture and on the security challenges and potential threats applicable to VCs. This paper highlights cost-effective, hassle-free, secure communication between the cloud and vehicles involved in mishaps. Communication is established via Software as a Service (SAAS). Additionally, the nearest medical help is made available; both the haversine formula and the GeoAPI distance matrix have been used for this. The security challenge is addressed with the help of the Digital Signature Algorithm (DSA), and AES and Blowfish have been evaluated to compute the message processing speed.
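A minimal sketch of the haversine great-circle distance mentioned above, used here to pick the nearest of a set of hypothetical medical-facility coordinates (the coordinates in the usage below are made up for illustration):

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in km between two (lat, lon) points."""
    r = 6371.0  # mean Earth radius in km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

def nearest_help(vehicle, facilities):
    """Return the facility (lat, lon) closest to the vehicle."""
    return min(facilities, key=lambda f: haversine_km(*vehicle, *f))
```

A GeoAPI distance matrix, in contrast, would return road distances rather than this straight great-circle estimate.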
362 Dynamic Bandwidth Allocation Scheme for Efficient Handover in IEEE 802.16e Networks, M. Deva Priya, M.L. Valarmathi, R.K. Shanmugapriya, D. Prithviraj
Mobile Worldwide Interoperability for Microwave Access (WiMAX), IEEE 802.16e, is a promising solution that provides ubiquitous wireless access with high data rates, high mobility and wide coverage. The main issue in mobile WiMAX networks is managing user mobility. Queues associated with arriving packets should have enough bandwidth to meet the requirements; otherwise the Handoff Call Dropping Probability (HCDP) will be high. The Dynamic QoS-based Bandwidth Allocation Algorithm (DQBA2) is proposed to increase system utilization and to reduce the dropping probability. Traffic with a high Bandwidth Allocation Factor (BAF) is given high priority, and the number of users is increased by dynamically allocating bandwidth based on the arrival rate (λ). To improve the efficiency of the proposed scheme, a Scanning with Self-Backoff (SSB) scheme is included. The proposed system shows better performance in terms of throughput, delay and packet loss.
363 Change Monitoring of Burphu Glacier from 1963 to 2011 using Remote Sensing, Rahul Singh, Dr. Renu Dhir
The Himalayas hold one of the largest resources of snow and ice, which acts as a freshwater reservoir for all the rivers originating from the range. Monitoring these resources is important for assessing the availability of water in the Himalayan rivers. Mapping glaciers is a very difficult task because of the inaccessibility and remoteness of the terrain. Remote sensing techniques are often the only way to analyze glaciers in remote mountains and to monitor a large number of glaciers in a multitemporal manner. This paper presents the results obtained from the analysis of a set of multitemporal Landsat MSS, TM and ETM+ images for the monitoring and analysis of Burphu Glacier.
364 Designing a Web Based Module for SAR to Ensure Data Security Using JAVA, Bhumi Aravind Kalaria, Dr. C Gurudas Nayak, Dr. N.K. Shrivastava
“Save our souls” is the phrase often uttered by people in distress, whether in a sinking ship or in an aircraft that is about to crash. To save them, the Satellite Aided Search and Rescue system was founded. This system works under the collaboration of many nations and plays a vital role in search and rescue operations worldwide. It is named COSPAS/SARSAT after the nations that founded it. Its main objective is to support all organizations worldwide with responsibility for search and rescue operations, whether at sea, in the air or on land. COSPAS/SARSAT is a successful operational system designed to detect and locate distress positions with the help of radio beacons transmitting at either 121.5 MHz or 406.025 MHz. These signals are received by low polar-orbiting satellites and downlinked to ground stations (LUTs and MCCs), where the signal is processed to provide the distress location. This project involves designing a website for the Satellite Aided Search and Rescue System provided by the Indian Mission Control Centre (INMCC), ISTRAC Bangalore. The site gives detailed information about the system, including its working and principle, the LEOSAR and GEOSAR satellites it uses, the Indian contribution to search and rescue, beacons and their types, and INMCC’s response to real distress calls, along with a facility for online beacon registration and links to the INMCC current status.
In the current trend many businesses publish their application utilities on the web. The need for supporting the classification and semantic annotation of services constitutes an important challenge for service-centric software engineering. Such a semantic annotation may, in turn, need to be made in accordance with a specific ontology. A service description also needs to relate appropriately to other similar services. To match a particular service request with an implicit service description, this paper addresses the issues in web service discovery using semantics. The service request, along with web service composition, is performed using categorization of semantic-based services and enhancement of semantics in an ontology framework.
The increasing use of computers has resulted in an explosion of information, making data mining an important research topic. Data mining in medicine has great potential for exploring the hidden patterns in medical data sets. Thus there is a way to automatically discover knowledge from the data once it is uncovered. Data mining, also referred to as knowledge discovery in databases (KDD), can be defined as the non-trivial process of identifying valid, novel, potentially useful and ultimately understandable patterns in data. These patterns can be effectively utilized for medical diagnosis. Raw data that is widely distributed, heterogeneous in nature and voluminous is collected, organized and integrated to form a hospital management system. Finally, using data mining techniques, we identify a few areas of medical care where such techniques can be applied to the database for knowledge discovery.
Distributed Denial of Service (DDoS) attacks combined with IP spoofing are a major threat faced by networks. The problem is more complicated in the case of proxy networks, as it is difficult to identify the particular attacker node. A novel server-side defense scheme is proposed to resist DDoS attacks by identifying and blocking the particular attacker node, with provision for IP spoofing detection. A TSL-IP based HsMM algorithm and a hop-count detection algorithm are proposed to detect attackers and spoofed IPs. The approach utilizes the TSL behavior of the requesting nodes to identify attacks, and the IP of a node as a unique identity to identify the particular attacker node, which makes the scheme more accurate than existing ones. Soft control, a novel attack-response method proposed in this work, performs behavior reshaping that tries to convert suspicious traffic into relatively normal traffic rather than discarding it outright. The proposed variation of the HTTP protocol supports identifying which client is the intruder rather than blaming the innocent web proxy. The TTL-based filtering maintains a mapping between IP addresses and their hop counts to identify spoofed IP packets.
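The TTL-based filtering idea can be sketched as follows: an attacker can forge the source IP but cannot easily forge the TTL decrements along the real path, so the hop count inferred from the observed TTL can be checked against a learned IP-to-hop-count table. A minimal sketch under that assumption (the paper's exact algorithm may differ):

```python
# Common OS default initial TTL values
INITIAL_TTLS = (30, 32, 60, 64, 128, 255)

def hop_count(observed_ttl):
    """Infer hop count: pick the smallest common initial TTL not below
    the observed value and subtract the observed TTL from it."""
    initial = min(t for t in INITIAL_TTLS if t >= observed_ttl)
    return initial - observed_ttl

def is_spoofed(src_ip, observed_ttl, ip2hops, tolerance=0):
    """Flag a packet as spoofed when its TTL-derived hop count disagrees
    with the hop count previously learned for that source IP."""
    expected = ip2hops.get(src_ip)
    if expected is None:
        return False  # no baseline yet; cannot judge
    return abs(hop_count(observed_ttl) - expected) > tolerance
```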
Edge detection is the process of identifying and locating sharp discontinuities in an image. The discontinuities are abrupt changes in pixel intensity (gray-level value). The traditional method of edge detection involves convolving the image with an operator (a 2-D filter) constructed to respond to such changes while suppressing noise. Edge detectors are an important collection of local image processing methods for locating sharp changes in intensity. Edge detection is a key technique in many image processing applications such as object recognition, motion analysis, pattern recognition and medical image processing. This paper compares edge detection techniques under different conditions, showing the advantages and disadvantages of the algorithms. The comparison was carried out in MATLAB. Further work will address the detection of liver tumors with the help of a newly developed algorithm.
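As an example of the operator-based detectors such comparisons typically include, the classical Sobel operator can be sketched in pure Python (a toy version of what MATLAB's `edge` function does internally):

```python
def convolve2d(img, k):
    """'Valid'-mode 2-D filtering of a grayscale image (list of lists)
    with a 3x3 kernel."""
    h, w = len(img), len(img[0])
    out = [[0.0] * (w - 2) for _ in range(h - 2)]
    for y in range(h - 2):
        for x in range(w - 2):
            out[y][x] = sum(img[y + i][x + j] * k[i][j]
                            for i in range(3) for j in range(3))
    return out

# Sobel kernels: horizontal and vertical intensity gradients
SOBEL_X = [[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]]
SOBEL_Y = [[-1, -2, -1], [0, 0, 0], [1, 2, 1]]

def sobel_magnitude(img):
    """Gradient magnitude sqrt(Gx^2 + Gy^2) at every interior pixel;
    large values mark edges."""
    gx, gy = convolve2d(img, SOBEL_X), convolve2d(img, SOBEL_Y)
    return [[(gx[y][x] ** 2 + gy[y][x] ** 2) ** 0.5
             for x in range(len(gx[0]))] for y in range(len(gx))]
```

On a vertical step edge the response is strong, and on a flat region it is zero, which is exactly the discontinuity-versus-uniformity contrast described above.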
369 Layered Mapping of Cloud Architecture with Grid Architecture, Pardeep Seelwal
Cloud computing is a type of computing that relies on sharing computing resources rather than having local servers or personal devices handle applications. It is a comprehensive solution that delivers IT as a service: an Internet-based computing solution where shared resources are provided like electricity distributed over the electrical grid. Grid computing, on the other hand, is an infrastructure that involves the integrated and collaborative use of networks, databases and computers owned and managed by multiple organizations; it allows an application to run on different machines and to utilize grid resources. This paper describes a layered mapping of the cloud model onto the grid model in terms of their services.
370 On New Approach in Using 433MHz Radio Modules, Prithviraj Shetti, Prasad V. Sakharpe, Amrut Ubare
Cheap radio modules such as 433MHz Rx/Tx pairs are very popular in hobby projects and readily available in the local market. For reliable data transmission and reception, these modules need initial burst training pulses for synchronization and some encoding scheme for reducing the effect of noise. Present approaches use the VirtualWire and Manchester libraries to drive these modules from an Arduino board, giving transmission of 3-4 digit sensor values and 5-12 character text strings. In the present paper we demonstrate that it is possible to send even longer text strings and integer data without these two libraries, using only the SoftwareSerial protocol. Transmission of long text strings is demonstrated with reliable results.
Association rule mining is used to show relations between items in a set of items. The Apriori algorithm is one of the most widely used algorithms in association rule mining. It extracts frequent and useful patterns from large databases. It is easy to understand and implement, yet it has some drawbacks, and many algorithms have been proposed to improve its performance. This paper surveys different improved approaches to the Apriori algorithm that address its drawbacks and reduce its execution time through parallel implementation.
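For reference, the classic level-wise Apriori that the surveyed variants improve upon can be sketched as follows (candidate generation by joining, then pruning candidates with an infrequent subset):

```python
from itertools import combinations

def apriori(transactions, min_support):
    """Classic Apriori: level-wise mining of frequent itemsets.
    Returns {frozenset: support count} for all itemsets meeting min_support."""
    transactions = [set(t) for t in transactions]
    items = {i for t in transactions for i in t}

    def count(cands):
        return {c: sum(1 for t in transactions if c <= t) for c in cands}

    # Level 1: frequent single items
    frequent = {c: n for c, n in count({frozenset([i]) for i in items}).items()
                if n >= min_support}
    result, k = dict(frequent), 2
    while frequent:
        prev = list(frequent)
        # Join step: unions of (k-1)-itemsets that form a k-itemset
        cands = {a | b for a, b in combinations(prev, 2) if len(a | b) == k}
        # Prune step: every (k-1)-subset must itself be frequent
        cands = {c for c in cands
                 if all(frozenset(s) in frequent for s in combinations(c, k - 1))}
        frequent = {c: n for c, n in count(cands).items() if n >= min_support}
        result.update(frequent)
        k += 1
    return result
```

The repeated full scans in `count` are exactly the cost that the improved and parallel variants surveyed here try to reduce.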
372 Firewall and Its Policies Management, Er. Smriti Salaria, Er. Nishi Madaan
Firewalls are core elements in network security. A firewall determines whether to accept or discard a packet that passes through it based on its policy. A firewall also allows separation between frontend and backend entities so as to ensure security. In this paper we critically analyze various firewall management policies and techniques and cover their major types, classification and applications.
373 An Innovative Smart Soft Computing Methodology towards Disease (Cancer, Heart Disease, Arthritis) Detection in an Earlier Stage and in a Smarter Way, Mr. Tanupriya Choudhury, Prof. (Dr.) Vivek Kumar, Dr. Darshika Nigam, Vasudha Vashisht
Cancer, heart disease and arthritis are among the most common diseases found in the majority of populations in recent years. Medical diagnosis is an enormously essential but complex task that should be accomplished exactly and proficiently. Although momentous progress has been made in the diagnosis and treatment of these diseases, further investigation is still desired. Diagnosing these diseases is a challenging task that can offer automatic prediction about a patient's condition so that further treatment can be planned. Due to this fact, disease diagnosis has received enormous interest globally among the medical community. In this paper soft computing plays an important role in diagnosing these diseases with improved effectiveness and suitable accuracy. The paper gives a detailed view of an innovative early-detection system for these diseases that uses soft computing methods and proper attribute reference values to predict the diseases analytically. Different clinical values for attributes and biomarkers are taken as input and matched with the reference values to predict the diseases accurately. Disease datasets are also analyzed using a soft computing approach. The outcome would help doctors, scientists and pharmacists in understanding the characteristics and associations of attributes responsible for these diseases, in providing proper diagnosis methods and in discovering new drugs.
Data mining refers to mining or extracting knowledge from huge volumes of data. Classification is used to assign each item in a data set to one of a predefined set of classes. It is an important data mining technique, generally used in broad applications to classify various kinds of data. In this paper, different datasets from the University of California, Irvine (UCI) repository are compared across different classification techniques. Each technique has been evaluated with respect to accuracy and execution time, and performance evaluation has been carried out with the J48, Simple CART (Classification And Regression Trees), BayesNet and NaiveBayesUpdateable classification algorithms.
Nowadays dengue, a vector-borne tropical viral disease, has become one of the greatest scourges of humankind, and its consequences have had more impact than any other pathogen in shaping the human genome. In Pakistan generally, and in Punjab specifically, dengue is emerging as one of the major public-health problems. The federal and provincial health governments are taking all possible steps on a "war footing" to counter such diseases, and are determined to prepare for such problems in advance. From the WWW, digital libraries, the World Health Organization (WHO) and other news sources, it is estimated that about 2.5 billion people, or 40 percent of the world's population, live in areas with a risk of dengue transmission, because dengue flourishes in poor urban areas, suburbs and the countryside but also affects more affluent neighborhoods in tropical and subtropical countries. As of November 2011, it had killed over 300 people in the preceding months, and over 14,000 were infected by this mosquito-borne disease; the majority of those infected are from the Lahore area in Punjab, Pakistan. A key question is: if the virus attacks somewhere, what will its next target be geographically, given that the dengue virus spreads from one place to another through contaminated water and mosquitoes? Therefore, by using the above data sources, performing preprocessing techniques such as transformation, filtration, stemming and indexing of the documents, and then applying data mining techniques, our system not only helps to identify the geographical spreading patterns of the virus but also proactively suggests the next geographical location the virus is most likely to attack, so that the government can take remedial measures.
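The preprocessing steps named above (transformation, filtration, stemming, indexing) can be sketched in miniature. The stopword list and suffix rules below are illustrative stand-ins for a real stemmer such as Porter's:

```python
import re

STOPWORDS = {"the", "is", "in", "of", "and", "a", "to"}

def preprocess(text):
    """Transformation (lowercasing/tokenizing), filtration (stopword
    removal) and a crude suffix-stripping stemmer."""
    tokens = re.findall(r"[a-z]+", text.lower())
    tokens = [t for t in tokens if t not in STOPWORDS]
    stemmed = []
    for t in tokens:
        for suf in ("ing", "ed", "es", "s"):
            if t.endswith(suf) and len(t) > len(suf) + 2:
                t = t[: -len(suf)]
                break
        stemmed.append(t)
    return stemmed

def build_index(docs):
    """Indexing: inverted index mapping each term to the set of document
    ids that contain it, over which mining can then run."""
    index = {}
    for doc_id, text in docs.items():
        for term in preprocess(text):
            index.setdefault(term, set()).add(doc_id)
    return index
```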
377 Satellite Image Fusion Using Maximization of Non-Gaussianity, A. M. El Ejaily, F. Eltohamy, M. S. Hamid, G. Ismail
Image fusion is a technique for combining images from different sources to obtain a single image with enhanced information content. This paper proposes an image fusion method to merge panchromatic (PAN) and multispectral (MS) remote sensing satellite images, using a genetic algorithm to maximize the non-Gaussianity of the components obtained by independent component analysis (ICA). The genetic algorithm evolves the mixing matrix of the independent components of the MS image by maximizing the kurtosis. The proposed method is applied to Quickbird, Ikonos and Worldview satellite image data, and its performance is compared with that of IHS, PCA and ICA based image fusion methods. Experimental results show the best performance for the proposed method in terms of spatial resolution and color preservation of the fused images across all three types of satellite image data.
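The non-Gaussianity measure being maximized here is the kurtosis: a Gaussian has excess kurtosis zero, so the farther |kurtosis| is from zero, the less Gaussian a component is. A sketch of the sample statistic (the GA evaluates this for candidate mixing matrices):

```python
def excess_kurtosis(xs):
    """Sample excess kurtosis E[(x - mu)^4] / sigma^4 - 3.
    Zero for a Gaussian; its magnitude serves as a non-Gaussianity score."""
    n = len(xs)
    mu = sum(xs) / n
    var = sum((x - mu) ** 2 for x in xs) / n
    m4 = sum((x - mu) ** 4 for x in xs) / n
    return m4 / (var ** 2) - 3.0
```

A two-valued (maximally sub-Gaussian) signal scores -2, while a heavy-tailed signal scores positive.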
378 Design of Mobile Event Management and Broadcast System Using Rational Unified Process, Sonali Addetla, Mohini Gorde, Supriya Ghadge, Suvarna Kusal, Anupkumar M Bongale
Nowadays marketing and event management is one of the biggest challenges in the advertisement industry. Media like television, newspapers, email and posters are some of the ways in which commercial products can be advertised effectively. Certain kinds of advertisements, however, are dynamic in nature: their price or offer may change frequently based on customer demand. Such dynamic advertisements cannot be published effectively to end users through conventional media. To fulfill this requirement, a new technique called Mobile Event Management is proposed in this paper. It works over Bluetooth, so customers receive dynamic advertisements on their mobile phones as soon as they enter the commercial place. We use the Rational Unified Process (RUP) and the Unified Modeling Language (UML) to provide an effective design framework for the proposed advertisement technique.
The method in general use today is text-based (alphanumeric) passwords. This method has not proved efficient against password-guessing attacks: the password is either easy to guess or hard to remember. To overcome this problem we have developed an authentication technique that uses pictures as passwords. According to a survey of graphical password techniques, they fall into two categories: recognition-based and recall-based approaches. We also study the strengths and limitations of each method and consider future scope in this field. This paper focuses on an integrated evaluation of the Persuasive Cued Click-Points graphical password authentication system, with security provided by the centered discretization technique. It also gives a solution for preventing the emergence of hotspots, the portions of an image that are most likely to be chosen as click points. In addition, we propose a new technique for graphical authentication.
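Centered discretization, used for the click points above, stores a grid offset chosen so that the original click sits at the exact centre of its tolerance cell; any later click within ±r of the original then falls in the same cell. One coordinate of it can be sketched as (a simplified illustration of the technique):

```python
def discretize(x, r):
    """Return (offset d, grid index i) for click coordinate x with
    tolerance r. The grid has cell width 2r and is shifted by d so that
    x lands exactly at the centre of cell i."""
    d = (x - r) % (2 * r)
    return d, (x - d) // (2 * r)

def verify(x_new, d, index, r):
    """Accept a re-entered click iff it falls in the same grid cell,
    i.e. strictly within +/- r of the originally stored click."""
    return (x_new - d) // (2 * r) == index
```

For example, `discretize(37, 9)` gives offset 10 and index 1, so the stored cell spans [28, 46) with 37 at its centre.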
Nowadays mobile phones are becoming a basic part of our life and are one of the most important media for communication, yet recharging mobile phone batteries has always been a problem: a phone has to be put on charge once its battery has drained. The main purpose of this paper is to show how mobile phones can be recharged anywhere, without a charger, using microwaves. The microwave signal is transmitted from the transmitter using a special kind of antenna called a slotted waveguide antenna at a frequency of 2.45 GHz. A sensor and a rectenna circuit have to be added to the mobile phone to do this job successfully. On this basis we propose wireless charging of mobile phones using microwaves.
381 Enhancing Security in Mobile Communication using a Unique Approach in Steganography, Prof. Sharmishta Desai, Sanaa Amreliwala, Vineet Kumar
Mobile phones are the most commonly used devices in today's scenario, and the need for secured communication has become more imperative. Steganography is a reliable technique for communicating critical information through publicly available communication channels. Various techniques are available for transferring critical data, such as cryptography and steganography. This paper discusses a proposed model for communicating hidden data through public portals, in which both text steganography and image steganography are used.
382 Performance Improvement of Routing Protocol Using Two Different Mobility Models in Vehicular Ad-hoc Network, Mr. Vaibhav D. Patil, Prof. Atul R. Deshmukh
A Vehicular Ad hoc Network (VANET) is a collection of mobile nodes that are randomly located, so the connections between nodes change dynamically. In a VANET, mobile nodes form a temporary network without any existing network infrastructure or centralized administration. In recent years VANETs have become an interesting research area; their characteristics have led to the need for efficient routing and resource-saving protocols that fit different VANET mobility environments. The aim of this paper is to survey VANET routing scenarios: it gives an overview of VANETs, the existing VANET routing protocols and two existing mobility models. The paper also presents the general outlines of VANETs, investigates different routing schemes that have been developed for VANETs, classifies VANET routing protocols in two classification forms and gives a summary.
383 A Review on the Role of Big Data in Business, Jafar Raza Alam, Asma Sajid, Ramzan Talib, Muneeb Niaz
Big data is a game-changing phenomenon. Successful organizations are achieving business advantages by analyzing big data. It has received significant attention in recent years, but several challenges are diminishing the growth of organizations. The main reasons organizations have not begun planning a big data strategy are that they do not know enough about big data and do not understand its benefits. In this study, an attempt is made to review the role of big data in business.
384 Detection of Automobile Drivers Stress from Physiological Signals, Dayalin Subi J, Anuja H S
This project analyzes various physiological signals of a person with respect to the stress developing within him or her. The analysis of stress was done using ECG, EEG and respiratory signals acquired from automobile drivers who were made to drive on different road conditions to induce different stress levels. As part of the analysis, two features were extracted from the physiological signals, and they clearly show the changes in the features with respect to the driver's stress. From the extracted features, stress is classified using an SVM classifier. The performance of the classifier was tested and compared across the physiological signals, producing better results with high accuracy.
385 Application Layer's Approaches for TCP Incast Problem at Data Center Networks, Irfan Riaz Shohab, Muhammad Younas, Ramzan Talib, Umer Sarwar
Data centers have become very popular for storing huge volumes of data. Many companies, such as Amazon, Google, Microsoft, IBM, Yahoo and Facebook, use data centers for web search, e-commerce and large-scale computation. High-speed links, low propagation delay and limited-size switch buffers are the main characteristics of data centers, which these days comprise hundreds of thousands of servers storing data across many thousands of machines. TCP is the most popular transport layer protocol currently used on the Internet, but data centers face a different set of problems than the Internet does. The main one is TCP incast, which refers to TCP throughput collapse: when multiple data senders respond simultaneously to a single receiver (a many-to-one communication pattern), the burst of data overloads the buffer of the receiver's switch, causing packet loss and a throughput collapse that degrades performance. Many techniques, approaches and algorithms have been introduced to resolve this throughput collapse, and with the shift to cloud computing and cluster-based systems, researchers have joined the race to avoid it. Multi-layer approaches and techniques have been proposed for the incast problem. The application layer, the top layer of the TCP/IP structure, has capabilities to handle or control congestion. This work discusses techniques for the incast problem at the application layer and attempts to lay out the available solutions for researchers who want to work on the incast problem at that layer.
386 Light Fidelity (Li-Fi): A Comprehensive Study, Ekta, Ranjeet Kaur
The latest technology Li-Fi (Light Fidelity) refers to 5G visible light communication systems using light-emitting diodes as a medium for high-speed communication in a manner similar to Wi-Fi. Harald Haas says his invention, which he calls D-LIGHT, can produce data rates faster than 10 megabits per second, which is speedier than the average broadband connection. In days when the Internet has become a major demand, people are always in search of Wi-Fi hotspots. Li-Fi, a new life for data communication, is a better alternative to Wi-Fi in wireless communication: it promises speeds a thousand times greater than Wi-Fi and provides security, as visible light is unable to penetrate walls, proposing a new era of wireless communication. Such technology promises not only a greener but also a safer and cheaper future of communication.
387 Multihoming and Multistream Protocol in Computer Networks, Syed Yasmeen Shahdad, Gulshan Amin, Pushpender Sarao
SCTP (Stream Control Transmission Protocol) is a reliable, message-oriented transport layer protocol. It is an IETF standard developed by the Transport Area Working Group (TSVWG) and has been standardized by the IETF (Internet Engineering Task Force) in a series of RFCs as a reliable transport protocol to carry SS7 (Signaling System 7) signaling messages for Internet applications such as ISDN over IP, telephony signaling, media gateway control and IP telephony [5]. SCTP is similar to TCP in many services: both are unicast, connection-oriented, reliable protocols providing in-sequence packet delivery and congestion control. SCTP preserves message boundaries and at the same time detects lost and out-of-order data. It uses a 32-bit checksum as opposed to the 16-bit checksum of TCP. Although TCP has traditionally been used, we argue that SCTP provides better services in terms of reliability and performance. Due to its attractive features, such as multi-streaming and multi-homing, SCTP has received much popularity in terms of both research and development [6, 7]. SCTP provides multi-homing in that endpoints can use multiple IP addresses for the connection, which may make it more resistant to man-in-the-middle (MITM) and denial-of-service (DoS) attacks. Multi-streaming does not refer to multiple streams in the TCP sense; rather, each stream represents a sequence of messages within a single association. These may be long or short messages, which include flags for control of segmentation and reassembly.
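The checksum contrast above can be illustrated: TCP's 16-bit ones'-complement sum is a plain word sum, so it cannot see the reordering of 16-bit words, while a 32-bit CRC can. (SCTP actually uses CRC32c per RFC 3309; zlib's CRC-32 stands in here purely for illustration.)

```python
import zlib

def internet_checksum16(data):
    """RFC 1071 16-bit ones'-complement checksum, as used by TCP."""
    if len(data) % 2:
        data += b"\x00"  # pad odd-length input
    total = sum(int.from_bytes(data[i:i + 2], "big")
                for i in range(0, len(data), 2))
    while total >> 16:                      # fold carries back in
        total = (total & 0xFFFF) + (total >> 16)
    return ~total & 0xFFFF

msg = b"\x01\x02\x03\x04\x05\x06"
swapped = b"\x05\x06\x03\x04\x01\x02"   # two 16-bit words reordered

# The 16-bit sum is order-insensitive, so it misses the reordering...
assert internet_checksum16(msg) == internet_checksum16(swapped)
# ...while the 32-bit CRC detects it.
assert zlib.crc32(msg) != zlib.crc32(swapped)
```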
388 Literature Survey on DWT Based Image Steganography, Mrs. Suvarna Patil, Mr. Gajendra Singh Chandel
Steganography is the science of secure communication, and it has received much attention from the scientific community recently. The four main objectives of steganography are undetectability, security, embedding payload and robustness. Steganography can protect data by hiding it in a cover object, but using it alone may not guarantee total security; combining encryption with steganography can therefore provide "security in depth". It is the science of embedding information (the payload) into a cover object such as text, video or an image without causing statistically significant changes to the cover. Advanced secure image steganography presents the challenging task of transferring the embedded information to the destination without being detected. This paper provides a state-of-the-art review and analysis of the different existing methods of steganography, along with some common standards and guidelines drawn from the literature.
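A minimal sketch of the DWT-based embedding such surveys cover: an integer Haar transform (the S-transform) splits the signal into approximation and detail coefficients, and payload bits go into the LSBs of the detail coefficients, where small changes are least visible. This toy 1-D version illustrates the idea and is not any particular surveyed method:

```python
def fwd(s):
    """Forward integer Haar (S-transform): per pixel pair, a floor-mean
    approximation a and a difference detail d; exactly invertible on ints."""
    a = [(s[i] + s[i + 1]) >> 1 for i in range(0, len(s), 2)]
    d = [s[i] - s[i + 1] for i in range(0, len(s), 2)]
    return a, d

def inv(a, d):
    """Inverse S-transform: rebuild the pixel pairs."""
    out = []
    for ai, di in zip(a, d):
        s1 = ai - (di >> 1)
        out += [di + s1, s1]
    return out

def embed(pixels, bits):
    """Hide one bit in the LSB of each detail coefficient, then rebuild."""
    a, d = fwd(pixels)
    for i, b in enumerate(bits):
        d[i] = (d[i] & ~1) | b
    return inv(a, d)

def extract(stego, nbits):
    """Re-run the forward transform and read the detail LSBs back."""
    _, d = fwd(stego)
    return [d[i] & 1 for i in range(nbits)]
```

Because the transform is exactly invertible on integers, the bits survive the reconstruct-and-retransform round trip while each pixel moves by at most a couple of gray levels.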
389 Encryption and Decryption of Data Using QR Authentication System, Atul Hole, Mangesh Jadhav, Shivkant Kad, Swanand Shinde, Prof. Pramod Patil
In this paper, we explore how QR codes can be used in education. The low technical barrier to creating and reading QR codes allows innovative educators to incorporate them into their educational endeavors. Securing data is a big problem, and to solve it we propose an efficient method to authenticate the digital information present in our documents: if an intruder tries to change the information in the document, the QR code reveals the tampering. In this paper we encrypt the data using an encryption algorithm, embed the encrypted information inside a QR code, and print the QR code together with the original document data. The data can then be retrieved from the QR code and decrypted using the decryption algorithm. Finally, it can be verified against the data already present in the document.
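The verify step can be sketched under stated assumptions: here a keyed HMAC over the document text stands in for the paper's (unspecified) encryption algorithm, and a JSON string stands in for the QR code payload; the function names are hypothetical.

```python
import hashlib
import hmac
import json

def make_qr_payload(document_text, key):
    """Build the payload to encode into the QR code: the document text
    plus a keyed digest so any tampering with the printed text shows up."""
    mac = hmac.new(key, document_text.encode(), hashlib.sha256).hexdigest()
    return json.dumps({"doc": document_text, "mac": mac})

def verify_qr_payload(payload, key):
    """Decode the (stand-in) QR payload and check the document text
    against its stored digest."""
    rec = json.loads(payload)
    expected = hmac.new(key, rec["doc"].encode(), hashlib.sha256).hexdigest()
    return hmac.compare_digest(rec["mac"], expected)
```

An intruder who alters the printed document text cannot produce a matching digest without the key, which is the tamper-evidence property described above.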
390 IDS: Survey on Intrusion Detection System in Cloud Computing, Mr. Ashish Kumbhare, Mr. Manoj Chaudhari
Cloud computing provides flexible on demand services to the end users with lesser infrastructural investment. Under the supervision of different managements, these services are provided over the Internet using known networking protocols, standards and formats. Existing deficiencies in underlying technologies tend to open doors for intrusion. This paper surveys different intrusions affecting basics of cloud security i.e. availability, confidentiality and integrity of Cloud resources and services. It examines proposals incorporating Intrusion Detection Systems (IDS) in Cloud and discusses various types and techniques of IDS.
391 Improvement of Expectation Maximization Clustering using Select Attribute, Rupali Bhondave, Madhura Kalbhor, Supriya Shinde, K. Rajeswari
Data mining is the process of extracting valuable information from various sources of data to produce knowledge. For mining data, the WEKA tool is used. WEKA offers various processes for producing knowledge, such as Preprocess, Classification, Clustering, Select Attribute and Association. This paper focuses on the clustering technique. Clustering is a technique by which we can group similar objects and separate dissimilar ones, and there are various clustering algorithms. Here, attribute selection is used for experimentation on Expectation Maximization (EM) clustering. For attribute selection we use Best First Search (BFS) and Random Search before EM clustering, which gives better results than those obtained without attribute selection.
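WEKA's EM clusterer fits a Gaussian mixture by alternating expectation and maximization steps. A toy 1-D, two-component version shows the two steps (a sketch for illustration, not WEKA's implementation):

```python
import math

def em_gmm_1d(xs, iters=50):
    """EM for a two-component 1-D Gaussian mixture.
    Returns (means, variances, weights)."""
    mu = [min(xs), max(xs)]        # crude initialisation
    var = [1.0, 1.0]
    w = [0.5, 0.5]
    for _ in range(iters):
        # E-step: responsibility of each component for each point
        resp = []
        for x in xs:
            p = [w[k] / math.sqrt(2 * math.pi * var[k])
                 * math.exp(-(x - mu[k]) ** 2 / (2 * var[k]))
                 for k in range(2)]
            s = sum(p)
            resp.append([pk / s for pk in p])
        # M-step: re-estimate parameters from the soft assignments
        for k in range(2):
            nk = sum(r[k] for r in resp)
            mu[k] = sum(r[k] * x for r, x in zip(resp, xs)) / nk
            var[k] = max(sum(r[k] * (x - mu[k]) ** 2
                             for r, x in zip(resp, xs)) / nk, 1e-6)
            w[k] = nk / len(xs)
    return mu, var, w
```

Attribute selection (BFS or Random Search) would run before this step, pruning the input columns so the mixture is fitted only on the informative attributes.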
392 A Survey on Security in Cloud Computing, Varsha Yadav, Preeti Aggarwaal
Cloud computing is an emerging technology in IT industry. In Cloud computing technology, computing resources are provided as a service over the internet, rather than a product. Cloud computing has gained great attention from the industry but there are still many issues that are hampering the growth of cloud. One of these issues is security of data stored on the servers of cloud service providers. This paper presents a survey on various security schemes that provide data security in cloud computing.
393 Secure Sharing of Medical Records Using Cryptographic Methods in Cloud, M.P. Radhini, P. Ananthaprabha, P. Parthasarathi
Cloud-based data is safer than paper and client-server records; now medical practices just have to be willing to look to the cloud for the future of healthcare IT. There are, however, many security issues related to storing sensitive personal health information in the cloud, which pose challenges to the privacy and confidentiality of personal medical records (PMRs). Cryptography is an essential tool that helps assure data accuracy, and cryptographic techniques can be employed to protect data in the cloud environment. The existing security technique, multi-authority attribute-based encryption, focuses on the multiple-data-owner scenario and divides the users of the PMR system into multiple security domains, which leads to key management complexity for owners and users. In the proposed distributed attribute-based encryption scheme, a PMR can be accessed from any hospital using a single key, thereby reducing the complexity of key management.
394 A Distributed Computer Machine Vision System for Automated Inspection and Grading of Fruits, Yogitha.S, Sakthivel.P
A computer machine vision system that can be used for automatic high-speed fruit sorting and grading is proposed. The objective of the project is to develop an advanced quality-control inspection system that uses a distributed network architecture to interface the camera unit to a computer system through a GigE LAN environment in a flexible way. The development work involves dynamically capturing the image signal from the camera while objects move on the conveyor in real time, based on synchronized trigger events. The project is planned to be implemented in Visual Studio using OpenCV. The process for estimating colour information and geometry parameters uses a sequence of complex library functions for removing noise, detecting edges, smoothing, dilation, erosion, filling, deblurring, filtering, histograms, colour values, pixel averaging, etc.
395 A Multi-Level Security Framework for Cloud Computing, Mr. Anup Date, Mr. Dinesh Datar
Cloud computing is a model of information computing, storage, delivery of services, and sharing of infrastructure resources provided to clients on demand. Instead of purchasing actual physical devices such as servers, storage, or networking equipment, customers use these resources from a cloud provider as an outsourced service. It is defined as "a model of management of information, resources and applications as services over the Internet as per the requirements of clients". Cloud computing is an approach to convenient, on-demand network access to a shared pool of computing resources that can be provided by a service provider in the form of multiple services. It introduces a new Internet-based environment for on-demand access and dynamic provisioning of computing resources through various types of cloud services. These models are referred to as Software as a Service, Platform as a Service, and Infrastructure as a Service. Because of this, it is difficult to manage the security and privacy problems in the cloud caused by its sensitive data, outsourcing of infrastructure, multi-tenant nature, and critical applications. This paper proposes a framework that identifies and summarizes the security and privacy challenges in cloud services. It highlights cloud-specific attacks and risks and clearly illustrates their mitigations and countermeasures. It also presents a multilevel security framework for cloud computing that helps satisfy security and privacy requirements in the cloud and protects against intruder attacks. The purpose of this work is to demonstrate the security and privacy aspects that should be taken into consideration while developing and using the cloud environment, whether by individuals or organizations.
396 Predicting the Effect of Diabetes on Kidney using Classification in Tanagra, Divya Jain, Sumanlata Gautam
As numerous data mining tools and techniques continue to develop alongside the healthcare domain, the applications of data mining in the healthcare sector will undoubtedly play a growing role in the world. There exist a large number of useful techniques in data mining, like classification, association, clustering, and regression, that help discover new trends in huge healthcare databases. This paper applies classification and prediction techniques to find the effect of diabetes on the kidney. The implementation is done using the C4.5 algorithm in Tanagra. The paper demonstrates the utility of classification on a dataset containing records of both diabetic and non-diabetic patients using the data mining tool Tanagra. Using Kidney Function Tests (KFT), the effect of diabetes on the kidney is determined. After manually comparing the result of Tanagra with the actual output, we found that Tanagra's output is very close to the actual one. Finally, the performance of the classifier in Tanagra is evaluated in terms of recall, precision, and error rate.
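C4.5 chooses decision-tree splits by gain ratio: information gain normalized by split information. A minimal sketch of that criterion follows; the attribute names (`glucose`, `smoker`) are hypothetical stand-ins, not the paper's actual KFT fields, and this is only the split criterion, not a full C4.5 tree builder.

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy of a class-label list."""
    n = len(labels)
    return -sum(c / n * math.log2(c / n) for c in Counter(labels).values())

def gain_ratio(rows, labels, attr):
    """Information gain of splitting on `attr`, divided by split information."""
    n = len(rows)
    gain, split_info = entropy(labels), 0.0
    for value in set(r[attr] for r in rows):
        subset = [lab for r, lab in zip(rows, labels) if r[attr] == value]
        p = len(subset) / n
        gain -= p * entropy(subset)        # subtract weighted child entropy
        split_info -= p * math.log2(p)     # penalize many-valued attributes
    return gain / split_info if split_info > 0 else 0.0
```

An attribute that perfectly separates the classes gets gain ratio 1.0, while an uninformative one gets 0, which is the comparison C4.5 makes at every node.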
397 Design of Moderate Speed and Moderate Resolution Successive Approximation Analog to Digital Converter, Mr. Jitendra Waghmare, Prof. P.M. Ghutke
This paper presents the design of an analog-to-digital converter (ADC) for low-power applications, where the selection of the right architecture is crucial. We have chosen the successive approximation ADC because its compact circuitry, compared with the Flash ADC, makes the SAR ADC inexpensive. Day by day, more applications are designed around power consumption, so this SAR ADC will be useful for high speed with medium resolution and low power consumption. The successive approximation (SAR) architecture is very suitable for data acquisition; it offers resolutions ranging from 8 to 12 bits and sampling rates ranging from 50 KHz to 50 MHz.
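The successive approximation principle is a per-bit binary search: the SAR logic trials each bit from MSB to LSB and keeps it if the DAC output for the trial code does not exceed the sampled input. The following is an idealized behavioural model of that loop, not a circuit-level design.

```python
def sar_adc(vin, vref=1.0, bits=8):
    """Idealized SAR conversion: trial each bit MSB-first and keep it
    if the DAC value for the trial code stays at or below vin."""
    code = 0
    for i in range(bits - 1, -1, -1):
        trial = code | (1 << i)
        dac = trial * vref / (1 << bits)   # DAC output for the trial code
        if vin >= dac:
            code = trial                   # keep the bit set
    return code
```

Each conversion takes exactly `bits` comparator decisions, which is why SAR ADCs trade the speed of a Flash ADC for far less hardware.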
398 An Aggregate Key Based Cryptosystem for Secure Data Sharing in Cloud Computing, R. Vanitha, V. Elavarasi
Cloud computing provides a flexible architecture to share applications as well as other network resources. Cloud storage enables networked online storage where data is stored on multiple virtual servers, generally hosted by third parties, rather than on dedicated servers. Key management and key sharing play the main role in the data sharing concept of cloud computing. Traditional key cryptosystems lack enhanced security techniques, as keys are generated by existing random key generation. The existing system uses an aggregate-key cryptosystem in which the key is generated by various derivations of the ciphertext class properties of the data and its associated keys. The aggregate key is generated only once; if the key is lost, the data becomes difficult to access. We therefore introduce an SSH (Secure Shell) key, digital signature, key escrow, and encapsulation algorithm for secure authentication in the cloud. This key is used to authenticate the remote computer and allow it to authenticate the user.
399 Challenges and Security Issues in Cloud Computing, Joshna S, Manjula P
Cloud Computing is a flexible, cost-effective, and proven delivery platform for providing business or consumer IT services over the Internet. However, cloud computing presents an added level of risk because essential services are often outsourced to a third party, which makes it harder to maintain data security and privacy, support data and service availability, and demonstrate compliance. Cloud computing leverages many technologies. Its advantages, to mention but a few, include scalability, resilience, flexibility, efficiency, and the outsourcing of non-core activities. Cloud computing offers an innovative business model for organizations to adopt IT services without upfront investment. The aim of this paper is to render a more elaborate and complete understanding of the issues and challenges related to cloud security and to provide major directions for future research in the concerned areas.
400 Multimodality Sensor System for Sleep-Quality Monitoring, Ms. Snehal R. Sawale, Prof. Vijay S. Gulhane
Multimodality is the mixture of textual, audio, and visual modes in combination with media and materiality to create meaning. The influence of sleep conditions on human health and performance is now well known but still underestimated, and monitoring devices are not widespread. This paper describes the methodology and prototype design of a sleep monitoring system. Sleep monitoring is an important issue and has drawn considerable attention in medicine and healthcare. Given that traditional approaches, such as polysomnography, are usually costly and often require subjects to stay overnight at clinics, there has been a need for a low-cost system suitable for long-term sleep monitoring. In this paper, we propose a system using low-cost multimodality sensors such as video, passive infrared, and heart-rate sensors for sleep monitoring. We apply machine learning methods to automatically infer a person's sleep state, especially differentiating sleep and wake states. This is useful information for inferring sleep latency, efficiency, and duration, which are important for long-term monitoring of sleep quality in healthy individuals and in those with a sleep-related disorder diagnosis. Our experiments show that the proposed approach offers reasonable performance compared to an existing standard approach (i.e., actigraphy), and that multimodality data fusion can improve the robustness and accuracy of sleep state detection.
401 Data Mining Technique its Needs and Using Applications, Anup Arvind Lahoti, Prof. P. L. Ramteke
Data mining is a process of using different algorithms to find useful patterns or models from data. It is a process of selecting, exploring, and modeling large amounts of data. It is mostly used in technology but also in different areas of real life like finance, marketing, and business, and it can be applied in social science methodologies such as psychology, cognitive science, and the study of human behavior. The ability to continually change and acquire new understanding is a driving force for the application of DM. This allows many new future applications of data mining [1], as in today's world the need for data in every field is growing very fast. So, to satisfy this need, proper data mining techniques should be available. In this paper we present valuable information about data mining.
This project proposes a new idea to continuously monitor railway bogies. Each bogie has an electronic control unit (ECU) connected to sensors, and all ECUs are interconnected through a CAN protocol bus. Temperature is the main parameter monitored: if a fire accident occurs, the emergency exits are opened, the sprinkler is turned on, and the accident is notified to the train driver. The air conditioning in AC bogies is also continuously monitored and controlled.
403 An Advanced Watermarking and Compression of Images using SPIHT Compression, Shihas Abdul Razak J, Rekha Bhandarkar
Every day a huge amount of data is embedded in digital media or distributed over the Internet. One way to overcome illegal duplication of data is to insert information, known as a watermark, into potentially vulnerable data in such a way that it is impossible to separate the watermark from the data. A watermark is a form, image, or text impressed onto digital media that provides evidence of its authenticity. Another common issue today is the storage, manipulation, and transfer of digital images. The files that comprise these images, however, can be quite large and can quickly take up precious space on a computer's hard drive. In multimedia applications, most images are in colour, and colour images contain a lot of data redundancy and require a large amount of storage space. This project concentrates on embedding a watermark in images as well as video, and on reducing the size of the data by compressing it using SPIHT (Set Partitioning In Hierarchical Trees) compression. The main consideration for any watermarking scheme is its robustness to various attacks. Experimental results demonstrate robustness by calculating the normalized cross-correlation (NCCR), mean square error (MSE), and peak signal-to-noise ratio (PSNR) between the watermark image and the extracted image.
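The quality metrics named in the abstract are straightforward to compute. A minimal sketch over flattened pixel lists follows, assuming 8-bit images (peak value 255); real implementations would operate on 2-D arrays.

```python
import math

def mse(a, b):
    """Mean square error between two equally sized pixel sequences."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) / len(a)

def psnr(a, b, peak=255):
    """Peak signal-to-noise ratio in dB; infinite for identical images."""
    m = mse(a, b)
    return float("inf") if m == 0 else 10 * math.log10(peak ** 2 / m)

def nccr(a, b):
    """Normalized cross-correlation; 1.0 means a perfect match."""
    num = sum(x * y for x, y in zip(a, b))
    den = math.sqrt(sum(x * x for x in a) * sum(y * y for y in b))
    return num / den
```

A watermarking scheme is typically judged robust when the extracted watermark keeps NCCR close to 1 and PSNR high after attacks such as compression or cropping.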
404 A Review: Grid Computing, Ankit Punia, Ms. Pooja Mittal
A grid is a hardware and software infrastructure that allows service-oriented, flexible, and seamless sharing of a diverse network of resources. A grid computes data-intensive tasks and provides faster throughput and scalability at lower cost. The aim of grid computing is to provide an affordable approach to large-scale computing problems. The geographically isolated computational resources combined within a grid can be viewed as a virtual supercomputer. This paper discusses the various types of grids, grid architecture, OGSA, fault tolerance, load balancing, and various challenges in grid computing.
405 The Multitenant Cloud Architecture, Juee Daryapurkar, Prof. Karuna Bagde
Force.com is the preeminent on-demand application development platform in use today, supporting some 47,000+ organizations. Individual enterprises and commercial software-as-a-service (SaaS) vendors trust the platform to deliver robust, reliable, Internet-scale applications. To meet the extreme demands of its large user population, Force.com's foundation is a metadata-driven software architecture that enables multitenant applications. This paper explains the patented technology that makes the Force.com platform fast, scalable, and secure for any type of application.
In this paper, we introduce IsoNet, hardware-based conflict-free dynamic load distribution and balancing with heavy job transfers. IsoNet is a lightweight job-queue manager that speeds up handling of the list of jobs to be executed while maintaining load balance among all chip multiprocessor cores, providing balanced, workload-based job-queue management for many-core architectures. To implement the S-box method in each queue, plain text of 4-bit data is considered for the encryption algorithm utilizing a key. For high-speed applications, the non-LUT-based implementation of the S-box is preferred. Performance evaluation of the design with respect to area, power, and time has been done.
407 Human Computer Interaction: Analysis and Journey through Eras, Pratibha T Jose, Surbhi Miglani, Sanjay Yadav
This paper provides an overview of Human Computer Interaction (HCI). It explains how and when HCI emerged, how it has evolved over the years, and what future inventions in the field might look like. The excerpt also gives an overview of HCI-related terminology and design principles. The design models of HCI include unimodal and multimodal architectures. Various tools used for building interfaces are also discussed, and the interface-building tools, design theories, and principles of HCI systems are covered in the middle sections. Finally, the future and various applications of HCI are discussed.
408 Survey on Network Based Intrusion Detection System in MANET, Nithya Karthika M, Raj Kumar
Mobile Ad hoc Network is a collection of mobile nodes equipped with both a wireless transmitter and a receiver that communicate with each other via bidirectional wireless links either directly or indirectly. The self-configuring ability of nodes in MANET made it popular among critical mission applications like military use or emergency recovery. However, the open medium and wide distribution of nodes make MANET vulnerable to malicious attackers. It is crucial to develop efficient intrusion-detection mechanisms to protect MANET from attacks. Network-based intrusion detection systems operate differently from host-based IDSes. The design philosophy of network-based IDS is to scan network packets at the router or host-level, auditing packet information, and logging any suspicious packets into a special log file with extended information. We survey on the intrusion detection system in MANET using various methods and algorithms.
409 Social Authentication and Untrusted Clouds for Secure Location Sharing, Miss. Priyanka K. Shinde, Prof. Nitin R. Chopde
Recently, many location-sharing services (LSSs) have emerged that share data collected using mobile devices. However, research has shown that many users are uncomfortable with LSS operators managing their location histories and that the ease with which contextual data can be shared with unintended audiences can lead to regrets that sometimes outweigh the benefits of these systems. In an effort to address these issues, we have developed SLS: a secure location sharing system that combines location-limited channels, multi-channel key establishment, and untrusted cloud storage to hide user locations from LSS operators while also limiting unintended audience sharing. In addition to describing the key agreement and location-sharing protocols used by SLS, we discuss an iOS implementation of SLS that enables location sharing at tuneable granularity through an intuitive policy interface on the users’ mobile device.
410 Different Genetic Operator Based Analysis and Exploration of TSP, Ashima Malik
TSP is considered one of the most complex algorithmic problems because of its time complexity. TSP is NP-complete and becomes more demanding as the number of cities increases. In the present work, an effective solution to TSP is provided using a genetic approach. The work presents a genetic-based model to generate the TSP path in effective time. The improvement is performed at the fitness function and crossover stages, where cost-based analysis is used to generate an effective path. The paper gives a detailed description of the genetic process model with an exploration of the different stages of the genetic algorithm, and includes the presented algorithm along with its associated assumptions. The work provides an effective solution in optimized time.
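The genetic stages the paper explores (fitness, selection, crossover, mutation) can be illustrated with a compact GA for TSP. This is a generic sketch, not the paper's specific cost-based fitness and crossover refinements: it uses truncation selection from the cheaper half of the population, order crossover (OX), and swap mutation.

```python
import math
import random

def tour_len(tour, pts):
    """Total cycle length of a tour over point coordinates."""
    return sum(math.dist(pts[tour[i]], pts[tour[(i + 1) % len(tour)]])
               for i in range(len(tour)))

def order_crossover(p1, p2):
    """OX: copy a random slice from one parent, fill the rest in the other's order."""
    n = len(p1)
    a, b = sorted(random.sample(range(n), 2))
    child = [None] * n
    child[a:b] = p1[a:b]
    fill = [c for c in p2 if c not in child]
    for i in range(n):
        if child[i] is None:
            child[i] = fill.pop(0)
    return child

def ga_tsp(pts, pop_size=30, gens=100, elite=5, mut_rate=0.2):
    n = len(pts)
    pop = [random.sample(range(n), n) for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=lambda t: tour_len(t, pts))   # fitness: shorter is better
        nxt = pop[:elite]                          # elitism keeps best tours
        while len(nxt) < pop_size:
            p1, p2 = random.sample(pop[:pop_size // 2], 2)  # cost-biased parents
            child = order_crossover(p1, p2)
            if random.random() < mut_rate:         # swap mutation
                i, j = random.sample(range(n), 2)
                child[i], child[j] = child[j], child[i]
            nxt.append(child)
        pop = nxt
    return min(pop, key=lambda t: tour_len(t, pts))
```

On a trivial four-city square this converges to the perimeter tour; the paper's contribution lies in how fitness and crossover are refined, which this sketch deliberately leaves generic.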
411 To Avoid Unwanted Messages from User Wall: Content Based Filtering Approach, Miss. Dipali D. Vidhate, Prof. Ajay P. Thakare
This paper proposes a system that implements a content-based message filtering service for Online Social Networks (OSNs). Our system allows OSN users to have a direct control on the messages that are posted on their walls. This is done through a rule-based system, that allows a user to customize the filtering criteria, which is to be applied to their walls, and a Machine Learning based classifier which can automatically produce membership labels for the support of our content-based filtering mechanism.
Mobile ad hoc networks (MANETs) consist of wireless mobile nodes that can dynamically and freely self-organize into arbitrary and temporary ad hoc network topologies. Due to the high mobility of nodes, frequent link breakages lead to frequent path failures and route discoveries. In a route discovery, both clustering and a broadcasting scheme are deployed: a mobile node blindly rebroadcasts the first received route request packet unless it has a route to the destination, which causes the broadcast storm problem. A cluster-based multipath routing scheme is proposed for reducing routing overhead and energy consumption in MANETs. Multipath routing with a cross-layer framework is proposed to effectively exploit load balancing and to improve network lifetime; the additional coverage ratio of a cluster is determined from the sensing coverage knowledge of cluster members. Packet integrity is verified to provide hop-to-hop authentication. By combining clustering and the cross-layer framework, a reasonable node lifetime is set to improve communication coverage. This scheme combines the advantages of a new enhanced RSA encryption/decryption scheme and an energy consumption model, which can significantly decrease the number of retransmissions so as to reduce routing overhead, energy consumption, and vulnerability to attackers, and can also improve network lifetime.
MANETs are a collection of wireless mobile nodes forming a network without using any existing infrastructure. Various problems arise principally from the lack of resources in these networks, and the solutions for conventional networks are usually not sufficient to support efficient ad hoc operation. The wireless nature of communication and the lack of security infrastructure raise many security issues. Enhanced Adaptive Acknowledgment (EAACK) is one of the Intrusion Detection System (IDS) mechanisms; it increases the integrity of the IDS using digital signatures, with each ACK digitally signed before it reaches the destination. In EAACK there is a chance of forged acknowledgements. EAACK uses the DSR routing protocol for identifying the route, but DSR causes additional routing overhead. Instead of DSR, the AODV and TRUST protocols are used to offer lower end-to-end delay and routing overhead. A comparison of these protocols is used to choose a better path for secure transmission between nodes.
414 Hybrid Approach for Optimizing the Search Engine Result, Ashish Kumar Kushwaha, Nitin Chopde
Due to the tremendous growth of the Internet in recent years, a huge amount of data has been collected on the web, and search engine users face problems finding relevant information: after writing a few keywords, the search engine returns a number of result pages, and the user has to spend a long time searching for relevant information among them. In this paper, we propose a hybrid approach for optimizing search engine results using document clustering, a genetic algorithm, and query recommendation to provide the user with the pages most relevant to the search query. The process starts with query recommendation based on learning from query logs, which predicts user information requirements; an algorithm is applied to recommend queries related to the query submitted by the user, and then document clustering and the genetic algorithm are applied to the resulting pages to deliver the most relevant results to the user in minimum time.
A forensic image is often accompanied by a calculated hash signature to validate that the image is an exact duplicate of the original; existing analysis mainly focuses on detecting artifacts introduced by a single processing tool, making it necessary to develop several detectors and to fuse their outputs. We do so by introducing theoretical frameworks based on Dempster-Shafer's Theory of Evidence, Fuzzy Theory, and Bayesian inference, respectively. These decision fusion approaches deal with the heterogeneous or conflicting outputs of forensic algorithms. The models are easily expandable to an arbitrary number of tools, do not require outputs to be probabilistic, and take into account available information about tool reliability. To validate the proposed approaches, experiments address a simple yet realistic scenario in which three forensic tools exploit different artifacts introduced by double JPEG compression to detect cut-and-paste tampering within a specified region of an image. The results we obtained are encouraging when compared with the performance of a simple decision method based on the binary OR operator.
416 Enhancing the Security of WSN using the ALARM Protocol to Prevent Replay Attacks, Neelam Shekhawat, Moumita Ghosh
The wireless ad hoc network is a self-configuring type of network in which mobile nodes can leave or join the network at will. In such networks, many inside and outside attacks are possible; these are broadly classified as active and passive attacks. To prevent inside and outside attacks, a trust relationship between the mobile nodes must be maintained, and this trust relationship is provided by mutual authentication. ALARM is a protocol for providing a trust relationship between mobile nodes. In this protocol, the clocks of the mobile nodes are only weakly synchronized using GPS, so a replay attack is possible. To prevent replay attacks, the clocks of the mobile nodes must be strongly synchronized. In our newly proposed technique, we enhance the ALARM protocol to provide strong clock synchronization between the mobile nodes. Our technique is based on NTP (the Network Time Protocol).
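NTP-style synchronization rests on a four-timestamp exchange: client send (t1), server receive (t2), server send (t3), client receive (t4). Assuming a symmetric path delay, the clock offset and round-trip delay follow directly. A minimal sketch of that calculation (the core arithmetic only, not the full NTP protocol the proposal would build on):

```python
def ntp_offset(t1, t2, t3, t4):
    """Classic NTP clock-offset estimate, assuming symmetric path delay.
    t1: client send, t2: server receive, t3: server send, t4: client receive
    (t1 and t4 read from the client clock; t2 and t3 from the server clock)."""
    offset = ((t2 - t1) + (t3 - t4)) / 2.0   # how far the client clock lags the server
    delay = (t4 - t1) - (t3 - t2)            # round-trip network delay
    return offset, delay
```

With clocks tightly disciplined this way, a replayed ALARM message carrying a stale timestamp falls outside the acceptance window and can be rejected.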
417 Review on “Image Segmentation Methods”, Rita Harle, Prof. M. R. Joshi
The objective of image segmentation is to simplify the representation of a picture into meaningful information by partitioning it into image regions. The aim of this paper is to review existing approaches to image segmentation and highlight their key points, covering recent segmentation techniques and as many research papers as possible.
418 Association Rule Mining in Distributed Database System, E. Deenadayalan, D. Kerana Hanirex, Dr. K. P. Kaliyamurthie
Data mining is one of the crucial research areas, and among its topics, the discovery of association rules is an important one. This paper implements a distributed database algorithm (DD) for mining association rules. The efficiency of this algorithm is compared with the standard FP-Growth algorithm: it produces the same result as FP-Growth with higher efficiency and accuracy. The algorithm is tested against the connect data set, proving its efficiency and accuracy.
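For readers unfamiliar with frequent-itemset mining, the support-counting task that both FP-Growth and the distributed algorithm ultimately solve can be sketched with a naive level-wise (Apriori-style) miner. This illustrates the problem being solved, not either paper's algorithm, which avoids this brute-force candidate counting.

```python
from itertools import combinations

def frequent_itemsets(transactions, min_support):
    """Naive level-wise miner: count support of candidate itemsets,
    keep the frequent ones, and join them to form the next level."""
    items = sorted({i for t in transactions for i in t})
    frequent = {}
    candidates = [frozenset([i]) for i in items]
    size = 1
    while candidates:
        counts = {c: sum(1 for t in transactions if c <= set(t)) for c in candidates}
        level = {c: n for c, n in counts.items() if n >= min_support}
        frequent.update(level)                      # record this level's results
        size += 1
        # join frequent sets whose union has the next level's size
        candidates = list({a | b for a, b in combinations(level, 2)
                           if len(a | b) == size})
    return frequent
```

Association rules are then derived from these itemsets; FP-Growth reaches the same set of frequent itemsets without generating candidates, which is where its efficiency advantage comes from.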
419 A Comprehensive Study on Cloud Computing, Suruchee V. Nandgaonkar, Prof. A. B. Raut
Cloud computing is becoming an increasingly popular enterprise model in which computing resources are made available on demand to the user as needed. The unique value proposition of cloud computing creates new opportunities to align IT and business goals. Cloud computing uses Internet technologies for the delivery of IT-enabled capabilities 'as a service' to any needed users; through cloud computing we can access anything we want from anywhere, on any computer, without worrying about storage, cost, management, and so on. In this paper I provide a comprehensive study of the motivating factors for adopting cloud computing and review the various cloud deployment and service models. I also explore the benefits of cloud computing over the traditional IT service environment, including scalability, flexibility, reduced capital expenditure, and higher resource utilization, which are considered reasons for adopting the cloud computing environment, and I discuss security, privacy, Internet dependency, and availability as concerns, with vertical scalability as a technical challenge in the cloud environment.
420 Optimization of Rule-Based Medication Delivery in Large-Scale Healthcare Framework, G. Manikandan, Dr. S. Prabakaran
In large-scale healthcare point-of-care areas, medication delivery to a large number of patients simultaneously faces numerous barriers in terms of performance and scalability. Integrated decision support systems improve clinical performance and patient outcomes. Computerized programs have novel aspects that have to be considered, but issues such as technical problems/support and user-interface concerns act as barriers. To eliminate such barriers in a large-scale healthcare framework, optimization of the rule-based expert system is highly necessary to attain the performance and scalability that ensure patient safety at the point of care.
Authentication is one of the most important security primitives, and password authentication is the most widely used authentication mechanism. Passwords provide a security mechanism for authentication and protection against unwanted access to resources. To address authentication problems, a new alternative authentication method has been proposed that uses pictures as passwords. Graphical passwords have been designed to make passwords more memorable and easier for people to use and, therefore, more secure. With a graphical password, users click on images rather than typing alphanumeric characters. In this paper, we propose a new hybrid graphical-password-based system, a combination of recognition- and recall-based techniques that offers many advantages over existing systems and may be more convenient for the user. Our scheme is resistant to shoulder-surfing attacks on graphical passwords, and it is proposed for mobile devices, which are more handy and convenient to use than traditional desktop computer systems.
422 An Enhancement in Centroid Algorithm in Range-free Grid Based Environment for Wireless Sensor Networks, Gurleen Singh, Malti Rani
Localization is a prominent part of Wireless Sensor Networks (WSN), as without location information, messages are bound to be invalid on the network. Various techniques have been introduced to localize the unknown nodes in the network, and the efficiency of localization algorithms depends on how precisely they locate the nodes. Anchor nodes are always limited because of hardware restrictions like energy consumption, cost, etc. The primary objective of this work is to achieve the least localization error. To achieve this objective, the centroid algorithm is refined in a grid environment, which provides a regular deployment of the anchor nodes. The derived technique uses the concepts of weight and distance to improve accuracy, and the simulation results manifest its superior performance.
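The weight-and-distance refinement of the centroid algorithm can be sketched as a weighted centroid: each anchor contributes in proportion to the inverse of its measured distance, so nearby anchors dominate the estimate. The 1/d weighting below is an illustrative choice; the paper's exact weighting function may differ.

```python
def weighted_centroid(anchors, dists):
    """Estimate an unknown node's (x, y) from anchor positions and distances.
    Each anchor is weighted by 1/d, so nearer anchors pull the estimate harder
    than in the plain (unweighted) centroid scheme."""
    weights = [1.0 / d for d in dists]
    total = sum(weights)
    x = sum(w * ax for w, (ax, _) in zip(weights, anchors)) / total
    y = sum(w * ay for w, (_, ay) in zip(weights, anchors)) / total
    return x, y
```

In a grid deployment the anchors surrounding a grid cell supply the inputs, and the weighting pulls the estimate away from the cell centre toward the anchors reporting the smallest distances.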
423 Mail_Alert: Online Suspicious URL Detection of Tweets from Twitter Public Timeline, Spoorthi K, Sarvamangala D R
Twitter, a famous social networking site where thousands of users tweet to the world, is prone to spam, phishing, and malware distribution. Tweets are the atomic building blocks of Twitter: 140-character status updates with additional associated metadata. People tweet for a variety of reasons about a multitude of topics. Traditional spam detection schemes for Twitter are ineffective against feature fabrication or consume much time and many resources. Conventional suspicious-URL detection schemes utilize several features, including lexical features of URLs, URL redirection, HTML content, and dynamic behavior; however, evasion techniques such as time-based evasion and crawler evasion exist. In this paper, we propose a suspicious-URL detection system for Twitter in which numerous tweets from the Twitter public timeline are collected and a dynamically trained classifier is built to distinguish suspicious URLs from real ones. Timelines are collections of tweets, ordered with the most recent first. Evaluation results show that our classifier accurately and efficiently detects suspicious URLs in near real time in the Twitter stream. We also propose to block the malicious URLs and provide a mail alert when malicious URLs occur in the Twitter stream.
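The lexical URL features mentioned above are typically simple counts and flags computed from the URL string itself, before any page is fetched. A small sketch of such a feature extractor follows; the feature names are illustrative, not the paper's exact set, and a real detector would feed these into the trained classifier.

```python
from urllib.parse import urlparse

def lexical_features(url):
    """Illustrative lexical features; real detectors use a much larger set."""
    p = urlparse(url)
    return {
        "url_len": len(url),                              # long URLs are suspicious
        "host_len": len(p.netloc),
        "num_dots": p.netloc.count("."),                  # deep subdomain nesting
        "num_digits": sum(c.isdigit() for c in url),
        "has_ip_host": p.netloc.replace(".", "").isdigit(),  # crude IPv4-host check
        "path_depth": p.path.count("/"),
    }
```

Because these features come from the string alone, they are cheap enough to compute in a near-real-time stream, which is exactly what evasion-resistant schemes exploit alongside redirect-chain and behavioral features.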
A novel face detection system is presented in this paper. Finding faces in an arbitrary scene and successfully recognizing them have been active topics in computer vision for decades. Human skin colour is an effective feature for detecting faces; although different people have different skin colours, several studies have shown that the basic difference lies in intensity. Human faces also have a special texture that can be used to separate them from other objects. This paper presents a robust and precise scheme for detecting faces and locating facial features in images with complex backgrounds using a genetic algorithm. The proposed work is divided into two modules: first, face detection using skin-colour regions and the location of various features in the skin area; second, face recognition using an artificial neural network trained with a genetic algorithm. Experimental results demonstrate that this face detection and recognition system provides successful results for images of individuals.
425 ZigBee Wireless Sensor Network Technology, Miss. Pooja V. Ingalkar, Prof. A. B. Deshmukh
A wireless sensor network (WSN) consists of spatially distributed autonomous devices that use sensors to cooperatively monitor physical or environmental conditions. A WSN comprises many inexpensive wireless sensors capable of collecting, storing, and processing environmental information, and of communicating with neighboring nodes. ZigBee is a newly developed technology that works on the IEEE 802.15.4 standard and can be used in wireless sensor networks. Low data rates, low power consumption, and low cost are the features of ZigBee, a worldwide open standard for wireless radio networks in the monitoring and control field. ZigBee/IEEE 802.15.4 devices can be used to improve current manufacturing control systems, detect unstable situations, control production pipelines, and so on. ZigBee and IEEE 802.15.4 are designed for lightweight sensor platforms.
426 Enhancing Probabilistic Packet Marking by Integrating Dynamic Probability and Time to Live (TTL) Clustering, Souzan Asadollahi
In recent years, denial-of-service attacks have emerged as a pressing problem, so much attention has been devoted to denial-of-service defense research and a number of approaches have been proposed. One suggested solution is IP traceback, which refers to tracing malicious packets back to their origin; it comprises several methodologies, of which packet marking is the subject of our study. In this paper, we focus on Probabilistic Packet Marking (PPM), which is inefficient against Distributed Denial of Service (DDoS) attacks owing to a high false-positive rate in reconstructing the attack graph and a long convergence time. We adopt a dynamic marking probability along with a Time to Live (TTL) clustering method in order to reduce both the false-positive rate and the convergence time. We consider a DDoS attack to have started when network traffic exceeds a default threshold. In an abstract view, we use a dynamic probability rather than the fixed one that is the root problem in most PPM schemes, and to speed up reconstruction of the attack graph we exploit two header fields: the TTL field and the identification field for packet fragments arriving from the same distance. Our experimental results show that the model is efficient in comparison with some previous methods.
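The dynamic marking probability mentioned in this abstract can be illustrated with a minimal sketch (this is not the authors' implementation; router names and packet counts are illustrative). If hop i overwrites the mark with probability 1/i, the final mark is uniformly distributed over the routers on the path, unlike fixed-probability PPM, which biases toward routers near the victim:

```python
import random

def mark_path(path, seed=None):
    """Simulate dynamic-probability packet marking along a router path.

    Hop i overwrites the mark with probability 1/i, so every router on
    the path ends up as the final mark with equal probability.
    """
    rng = random.Random(seed)
    mark = None
    for i, router in enumerate(path, start=1):
        if rng.random() < 1.0 / i:      # dynamic probability, not fixed
            mark = (router, i)          # router id + hop distance
    return mark

def reconstruct(path, n_packets=20000):
    """Victim collects marks from many packets, then orders routers by distance."""
    seen = {}
    for _ in range(n_packets):
        m = mark_path(path)
        if m:
            seen[m[0]] = m[1]
    return [r for r, _ in sorted(seen.items(), key=lambda kv: kv[1])]
```

With enough packets, every router on the attack path appears as some packet's final mark, and sorting by the recorded hop distance recovers the path order.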
In a wireless sensor network, multi-hop transmission is used for data collection, with a data packet delivered by a sequence of nodes. Faults are common while packets are transmitted over such a network, so to avoid faults and ensure quality of service, the WSN must take action against service degradation. In the existing system, a generic link model considers only the forwarding quality of a node, using metrics such as packet delay, packet delivery ratio, and number of packets dropped, but it does not address fault detection. The proposed system implements a Distributed Localized Faulty Sensor detection technique that diagnoses the sensor nodes in the deployed area of the network. Diagnosis is complete when every sensor node has been identified as good or faulty within a predefined time. Localized faulty sensor detection thus overcomes the problem of identifying faulty nodes in a sensor network with considerable accuracy; as a result, the proposed system achieves high fault-detection accuracy and a low false-alarm rate. Individual sensor nodes are also subject to security compromise because they may be deployed in hostile environments and each node communicates over a wireless medium. An adversary can create a wormhole by directly linking two compromised nodes or by using out-of-band channels to violate the security of the network. The proposed method uses ACK messages for detecting wormholes and is based on the Lightweight Countermeasure for Wormhole Attacks (LITEWORP) scheme. As a result, the proposed method reduces energy consumption as well as providing greater network security.
428 Detection of Fire Flow in Videos by SVM Classifier with EM-Segmentation Method, S. Shanthi, J. George Christober
In the past decade, computer-vision-based flame detection has received significant attention, since camera surveillance systems are now omnipresent, and many discriminative features such as color, shape, and texture have been employed in the literature. This paper proposes detection based on motion features: the variation of flow in fire motion, which is turbulent and fast, in contrast to the rigid motion of ordinary objects. Fire motion is not well characterized by classical optical-flow methods, so two methods, an optimal mass transport model and a data-driven optical-flow scheme, are used to detect dynamic texture and saturated flame in the fire-detection task, combined with an EM-segmentation image classification process for accuracy. The proposed system uses a Support Vector Machine in place of a neural network.
429 Performance Analysis of Dual Tail Comparator for Low Power Applications, P. Raja Sekara Pandian, Mr. M. Krishnamurthy
Dynamic regenerative comparators are used in analog-to-digital converters to reduce power consumption and area and to increase speed, so designing a low-power dynamic comparator is necessary for a low-power ADC. In this project, an analysis of the delay of dynamic comparators is presented. The analysis yields insight into the main contributors to comparator delay and the trade-offs in dynamic comparator design. Based on this analysis, a new dynamic comparator is proposed in which the circuit of an existing double-tail comparator is modified for low-power, fast operation even at small supply voltages. Without complicating the design, and by adding only a few transistors, the positive feedback during regeneration is strengthened, which results in remarkably reduced delay time. Post-layout simulation results in a 0.18-µm CMOS technology confirm the analysis. It is shown that in the proposed dynamic comparator both the power consumption and the delay time are significantly reduced.
430 A Survey on Advanced Page Ranking in Query Recommendation, Rinki Khanna, Asha Mishra
Search engines are programs that search documents for specified keywords and return a list of the documents in which the keywords were found. Because they return long lists of ranked pages, finding the relevant information on a particular topic is becoming increasingly difficult, and search-result optimization techniques therefore come into play. In this work, an algorithm is applied to recommend queries related to the query submitted by a user. Query logs are important information repositories for tracking user activity through search results; they contain attributes such as query name, clicked URL, rank, and time. Similarity based on keywords and on clicked URLs is calculated, and clusters are obtained by combining the two similarities to perform query clustering. The most favored queries are then discovered within every query cluster. The proposed result-optimization system presents a query recommendation scheme for better information retrieval, enhancing search-engine effectiveness at large scale.
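The combined keyword/clicked-URL similarity described in this abstract can be sketched as follows (a minimal illustration, not the authors' code; the weighting `alpha`, Jaccard similarity, and the greedy clustering pass are assumptions):

```python
def jaccard(a, b):
    """Jaccard similarity of two sets: |A ∩ B| / |A ∪ B|."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b) if a | b else 0.0

def query_similarity(q1, q2, alpha=0.5):
    """Weighted combination of keyword similarity and clicked-URL similarity."""
    kw = jaccard(q1["keywords"], q2["keywords"])
    url = jaccard(q1["clicked_urls"], q2["clicked_urls"])
    return alpha * kw + (1 - alpha) * url

def cluster_queries(queries, threshold=0.3, alpha=0.5):
    """Greedy single-pass clustering: join the first cluster containing
    a sufficiently similar query, otherwise start a new cluster."""
    clusters = []
    for q in queries:
        for c in clusters:
            if any(query_similarity(q, m, alpha) >= threshold for m in c):
                c.append(q)
                break
        else:
            clusters.append([q])
    return clusters
```

Combining both signals lets two queries with no shared keywords still cluster together when users clicked the same result pages.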
431 An Approach for Finding Frequent Item Set Done By Comparison Based Technique, Ms. Ankita Parmar, Mr. Kamal Sutaria, Mr. Krutarth Joshi
Frequent pattern mining has been a focal theme in data mining research for over a decade. Abundant literature has been dedicated to this research and tremendous progress has been made, ranging from efficient and scalable algorithms for frequent itemset mining in transaction databases to numerous research frontiers, such as sequential pattern mining, structured pattern mining, correlation mining, associative classification, and frequent-pattern-based clustering, as well as their broad applications. In this paper, we develop a new technique for more efficient pattern mining. Our method first finds the frequent 1-itemsets and then uses heap-tree sorting to generate the frequent patterns. We present efficient techniques to implement the new approach.
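The first two steps the abstract names, finding frequent 1-itemsets and ordering them with a heap, can be sketched like this (an illustrative fragment under assumed inputs, not the authors' full algorithm):

```python
import heapq
from collections import Counter

def frequent_one_itemsets(transactions, min_support):
    """Count item occurrences and keep those meeting the support threshold."""
    counts = Counter(item for t in transactions for item in t)
    return {item: c for item, c in counts.items() if c >= min_support}

def order_by_support(freq):
    """Heap-based sort: pop items in descending support order
    (ties broken alphabetically by item name)."""
    heap = [(-c, item) for item, c in freq.items()]
    heapq.heapify(heap)
    out = []
    while heap:
        neg_c, item = heapq.heappop(heap)
        out.append((item, -neg_c))
    return out
```

The support-ordered item list is the usual starting point for building longer candidate patterns.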
432 Client-Server Version of Energy Management through the Computational Outsourcing, Sridevi.K, George Christober.J
In this work we optimize energy resources through a computational outsourcing process. Experimental analysis shows that the cost of computation energy is higher than the cost of communication, which motivates the use of computational outsourcing. The outsourcing functions act as a deputy server for mobile-phone-like devices within their storage space. The system decides at run time, during the decision-making process, which computations to outsource, and a client-server version of the device programs runs on the device to manage energy use during computation. The approach is application-independent, requires minimal energy awareness from the programmer, and can be implemented in a real-time system; its decision logic and computation are supported at run time within the dynamic allocation process, thereby saving the energy resources of mobile devices.
Location privacy is a particular type of information privacy that can be defined as the ability to prevent others from learning one's current or past location. Security matters in every area, so security issues in location-based services must be addressed; without this, applications on our devices are not fully reliable. We propose a new technique that uses user anonymity and dummy locations to protect location privacy when using a location-aware application server. The user communicates with the server through a trusted proxy server, which sends dummy locations to the application server along with the user's actual position. The user adopts temporary pseudonyms that are changed frequently according to an algorithm, and whenever a pseudonym is changed, the dummy locations are chosen in a deliberate fashion: a mathematical formula over the path and the number of dummies makes the task of tracing the user very difficult. The security of the user's location is thereby increased.
434 Efficient Allocation of Resources in Cloud Server Using Lopsidedness, B. Selvi, C. Vinola, Dr. R. Ravi
Cloud computing plays a vital role in an organization's resource management. A cloud server allows dynamic resource usage based on customer needs and achieves efficient allocation of resources through virtualization technology. This paper addresses a system that uses virtualization to allocate resources dynamically according to demand and saves energy by optimizing the number of servers in use. It introduces a concept for measuring the inequality (lopsidedness) in the multi-dimensional resource utilization of a server. The aim is to build an efficient resource-utilization system that avoids overload and saves energy in the cloud by allocating resources to multiple clients through virtual-machine mapping onto physical systems; idle physical machines can be turned off to save energy.
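One common way to quantify the "lopsidedness" of multi-dimensional resource utilization is a skewness-style metric: the root of the summed squared deviations of each resource's utilization from the server's mean utilization. This sketch assumes that definition (the paper may use a different formula) and illustrative resource names:

```python
import math

def lopsidedness(utilizations):
    """Unevenness of a server's multi-dimensional resource utilization.

    utilizations: per-resource usage fractions, e.g.
        {"cpu": 0.8, "mem": 0.2, "net": 0.5}
    Returns 0 for perfectly balanced use; larger values mean more uneven use.
    """
    vals = list(utilizations.values())
    mean = sum(vals) / len(vals)
    if mean == 0:
        return 0.0
    return math.sqrt(sum((v / mean - 1.0) ** 2 for v in vals))

def most_balanced(servers):
    """Pick the server whose resource use is most even, a plausible
    placement target when avoiding hotspots."""
    return min(servers, key=lambda name: lopsidedness(servers[name]))
```

A scheduler can prefer placing new virtual machines on low-lopsidedness servers, since a server maxed out on one dimension but idle on others wastes capacity.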
435 Secure and Energy Efficient CDAMA Scheme in Wireless Sensor Network Using DAS Model, Nidhi Mouje, Nikita Chavhan
Wireless sensor networks (WSNs) are ad-hoc networks composed of tiny devices with limited computation and energy capacities. For such devices, data transmission is a very energy-consuming operation, so minimizing the number of bits sent by each device is essential to the lifetime of a WSN. Concealed data aggregation (CDA) schemes based on the homomorphic characteristics of a privacy homomorphism (PH) enable end-to-end encryption in wireless sensor networks. CDAMA is designed for a multi-application environment by using multiple points, each of a different order; it mitigates the impact of compromising attacks in single-application environments and limits the damage from unauthorized aggregations. In the database-service-provider model, clients must outsource their data to servers in encrypted form to maintain data privacy, yet they must still be able to execute queries over the encrypted data.
The fetal ECG (fECG) provides a means of monitoring fetal heart activity non-invasively. In this paper, an extended nonlinear Bayesian filtering framework is presented for extracting electrocardiograms (ECGs) from a single channel, as encountered when extracting the fetal and maternal ECG from an abdominal sensor. The recorded signals are modeled as the summation of several ECGs, each described by a nonlinear dynamic model previously presented for generating highly realistic synthetic ECGs. A modified version of this model is used in several Bayesian filters, including the Extended Kalman Filter, the Extended Kalman Smoother, and the Unscented Kalman Filter. Each ECG has a corresponding term in the model and can thus be efficiently discriminated even when the waves overlap in time. The framework is validated on the extraction of fetal and maternal ECGs from actual abdominal recordings, as well as on actual twin magnetocardiograms. In this paper, maternal blood pressure is also estimated from single-channel recordings using Kalman filtering.
437 An Efficient Directional Multiresolution Image Representation using Contourlet Transform, Ankita Sharma, Prof. Abhay Kumar, Prof. Rahul Deshmukh
The limitations of the commonly used separable extensions of one-dimensional transforms, such as the Fourier and wavelet transforms, in capturing the geometry of image edges are well known. A "true" two-dimensional transform is needed that can capture the intrinsic geometrical structure that is key to visual information. The main challenge in exploiting geometry in images comes from the discrete nature of the data. Thus, unlike other approaches, such as curvelets, that first develop a transform in the continuous domain and then discretize it for sampled data, our approach starts with a discrete-domain construction and then studies its convergence to an expansion in the continuous domain. Specifically, we construct a discrete-domain multiresolution and multidirection expansion using non-separable filters, in much the same way that wavelets were derived from filter banks. This construction results in a flexible multiresolution, local, and directional image expansion using contour segments, and it is therefore named the contourlet transform. The discrete contourlet transform has a fast iterated filter-bank algorithm that requires order-N operations for N-pixel images. Furthermore, we establish a precise link between the developed filter bank and the associated continuous-domain contourlet expansion via a directional multiresolution analysis framework. We show that with parabolic scaling and sufficient directional vanishing moments, contourlets achieve the optimal approximation rate for piecewise smooth functions with discontinuities along twice continuously differentiable curves. Finally, we present numerical experiments demonstrating the potential of contourlets in several image processing applications.
438 Integrating Static Analysis Tools for Improving Operating System Security, Ashish Joshi, Kanak Tewari, Vivek Kumar, Dibyahash Bordoloi
Static analysis is widely used to detect vulnerabilities in code before execution. The C/C++ programming languages account for the highest number of vulnerabilities, of which buffer overflow is the highest rated. Of all the static analysis tools available, none detects every vulnerability. Hence, we propose an integrated approach that combines two open-source static analysis tools, Flawfinder and Cppcheck, to develop a new static analysis tool.
439 Brain Tumor Detection using Curvelet Transform and Support Vector Machine, Bhawna Gupta, Shamik Tiwari
Brain tumors are a prevalent cause of death in human beings. A brain tumor is a mass or growth of anomalous cells in the brain, and its detection is a difficult task for which image processing provides effective techniques. In the proposed technique, features of MRI (Magnetic Resonance Imaging) images are first extracted with the curvelet transform, and these features are then applied to a support vector machine for identification. The proposed methodology gives efficient results.
440 Utilizations of LSB Matching and Replacement for Efficiency Improvement in Digital Secret Communication, Nisha.M.J, G.H.Asha, Anandh Kumar.V, Mahendar.R
This proposal develops and evaluates steganography in MATLAB, investigating information hiding in digital media by both the least significant bit (LSB) matching and the LSB replacement schemes. The method can completely recover the original images without any distortion from the secret images by utilizing the parity features of the original images and defining two embedding pairs. The hidden message is embedded via LSB matching, and replacing the LSB of the cover image with the MSB of the message image forms a stego image that contains the message. The message can be retrieved only by a receiver who knows that what the sender transmitted is a stego image. The proposed method always has lower distortion at various levels of abstraction. Experimental results disclose that the proposed method not only provides better performance and size reductions than OPAP and DE, but is also secure against the well-known steganalysis techniques tested here.
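The two embedding schemes the abstract contrasts can be sketched on raw bytes (a minimal illustration of the general techniques, not the authors' MATLAB implementation or their parity-recovery method):

```python
import random

def embed_lsb(cover, message_bits):
    """LSB replacement: overwrite the LSB of each cover byte with one message bit."""
    assert len(message_bits) <= len(cover)
    stego = bytearray(cover)
    for i, bit in enumerate(message_bits):
        stego[i] = (stego[i] & 0xFE) | bit
    return bytes(stego)

def embed_lsb_matching(cover, message_bits, seed=0):
    """LSB matching: if the byte's LSB already equals the message bit, leave
    it; otherwise randomly add or subtract 1. Harder to detect statistically
    than plain replacement, yet the extractor is identical."""
    rng = random.Random(seed)
    stego = bytearray(cover)
    for i, bit in enumerate(message_bits):
        if (stego[i] & 1) != bit:
            delta = rng.choice((-1, 1))
            if stego[i] == 0:
                delta = 1           # avoid underflow
            elif stego[i] == 255:
                delta = -1          # avoid overflow
            stego[i] += delta
    return bytes(stego)

def extract_lsb(stego, n_bits):
    """Read the message back from the low bits (works for both schemes)."""
    return [b & 1 for b in stego[:n_bits]]
```

Either way, each cover byte changes by at most 1, which is why the distortion stays visually negligible.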
NFC is the incorporation of RFID into smartphones, offering a large number of services that can make our lives simpler. In this paper, we discuss one such innovation, "NFC-enabled car keys". In the future we will have cars that facilitate keyless entry, since the key will reside inside the phone and be conveyed to the car via NFC. It becomes necessary that such a system be secured by multiple authentication steps. The complete system is designed so that no single fact is sufficient for any adversary to drive away with the car; keys and code are distributed to make the system more secure. A key manager is involved in the scenario to ease the transfer of keys over the air.
442 Recognition of 2D Barcode Images Using Edge Detection and Morphological Operation, Priyanka Gaur, Shamik Tiwari
Barcode recognition has been widely used for several years in many commercial applications. Each barcode symbol mainly contains information about the product to which it is attached. 2D barcodes store more information, in both the vertical and horizontal directions; common types include Aztec, Data Matrix, and QR Code, and while they all look similar in appearance, each is encoded in a different format. The availability of cell phones with digital cameras provides users a mobile platform for decoding barcodes, rather than the conventional scanner, which lacks mobility. The motive of this paper is to provide a novel technique for recognizing barcode images captured by mobile phones with the help of image processing techniques. In this work the MATLAB platform is used for all processing operations, and the performance shows satisfactory results.
443 Improve Performance by Task Scheduling Beneficial to Both User and Cloud Provider in Cloud Computing, Drashti P. Hirani, Prof. Altaf B. Mogal
Cloud computing refers to Internet-based development and utilization of computer technology, and so it can be described as Internet-based computing. Scheduling is a critical problem in the cloud because a cloud provider has to serve many users in the cloud environment. The main objectives of scheduling are to maximize resource utilization and to minimize the processing time of tasks. Resource utilization is the cloud service provider's perspective, ensuring that resources are used efficiently, while task processing time is the user's perspective, considering parameters such as task completion time or task execution cost. This paper presents an algorithm that satisfies both the cloud service provider's and the user's perspectives and improves performance over sequential scheduling.
444 Fingerprint Based Gender Classification Using Discrete Wavelet Transform & Artificial Neural Network, Samta Gupta, A. Prabhakar Rao
This research implements a novel method of gender classification using fingerprints, combining two methods. The first is the wavelet transform, employed to extract fingerprint characteristics by decomposition up to 5 levels. The second is the backpropagation artificial neural network algorithm, used for the gender identification process. The method was tested on an internal database of 550 fingerprints, of which 275 were male and 275 were female, and an overall classification rate of 91.45% was achieved. These results make this method a prime candidate for use in forensic anthropology for gender classification, narrowing the suspect search list by providing a likelihood value for the criminal's gender.
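Multi-level wavelet decomposition for feature extraction, as described in this abstract, can be sketched with a simple 1-D Haar transform (an illustrative stand-in: the paper does not state which wavelet it uses, and the energy features and stopping rule here are assumptions):

```python
def haar_1d(signal):
    """One level of the 1-D Haar transform: pairwise averages
    (approximation) and pairwise half-differences (detail)."""
    approx = [(signal[2*i] + signal[2*i + 1]) / 2 for i in range(len(signal) // 2)]
    detail = [(signal[2*i] - signal[2*i + 1]) / 2 for i in range(len(signal) // 2)]
    return approx, detail

def wavelet_energy_features(signal, levels=5):
    """Detail-subband energies over several decomposition levels,
    usable as an input vector for a backpropagation network."""
    features = []
    approx = list(signal)
    for _ in range(levels):
        if len(approx) < 2:
            break                       # signal too short to split further
        approx, detail = haar_1d(approx)
        features.append(sum(d * d for d in detail))
    return features
```

On an image, the same idea is applied along rows and columns; the per-level subband energies then form the feature vector fed to the classifier.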
445 Analysis of Packet Filtering Technology in Computer Network Security, Miss. Rupali P. Hinglaspure, Prof. B. R. Burghate
Today's networks are becoming more complex, and the demands on bandwidth are through the roof. To achieve the visibility needed for the best network performance, monitoring the right data is critical. We explore a network monitoring switch tool that provides advanced filtering options, able to quickly resolve network problems and add new capabilities as future requirements arise, without much manual effort to maintain an updated set of filtering rules. Packet filtering has proved to be a handy tool for applying access controls to IP traffic: packet filters can block IP packets based on criteria such as the protocol used and various protocol characteristics. On the Internet, packet filtering is the process of passing or blocking packets at a network interface based on source and destination addresses, ports, or protocols.
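The pass-or-block decision described above reduces to first-match rule evaluation over packet fields, which can be sketched as follows (the rule fields, wildcard convention, and default-drop policy are illustrative assumptions, in the spirit of typical packet filters):

```python
def match(rule, packet):
    """A packet matches when every field the rule constrains agrees;
    a missing or None rule field means 'any'."""
    for field in ("src", "dst", "proto", "port"):
        want = rule.get(field)
        if want is not None and packet.get(field) != want:
            return False
    return True

def filter_packet(rules, packet, default="drop"):
    """First-match-wins evaluation, with a default action when no rule fires."""
    for rule in rules:
        if match(rule, packet):
            return rule["action"]
    return default
```

Because evaluation stops at the first match, rule order matters: specific blocks must appear before broad accepts.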
446 Ongole Breed Cattle Health Expert Advisory System Using Parallel Particle Swarm Optimization Algorithm, B. Jogeswara Rao, Prof. M. S. Prasad Babu
This paper deals with the development of expert systems that use machine learning techniques to advise farmers online in villages. An expert system can be defined as a computer program with a set of rules encapsulating knowledge about a particular problem domain. Machine learning, a mechanism used in the development of expert systems, is concerned with writing computer programs that automatically improve with experience. Taking the PSO algorithm as a base, a new algorithm known as Parallel Particle Swarm Optimization was designed, and using it we developed an Ongole breed cattle expert system. The system is mainly aimed at identifying the diseases of Ongole breed cattle, managing those diseases, and advising farmers online in the villages. The advisory system is designed with JSP as the front end and MySQL as the back end.
Companies have long relied on simple business analysis to make their strategic decisions, but the dilemma of using this method for internal and external decisions is that decision makers can see only one side of the picture, because the context of the business process is missing; decisions made in light of such analysis are therefore not accurate. Simple business analysis focuses only on the data itself from various applications, while the ambience (context) recedes into the background, which subsequently affects the decisions made. In today's competitive business environment, organizations cannot rely solely on the data; they must also focus on its context. To collect all the relevant information about a particular process, the process logs, operational data, and related event logs ought to be considered and integrated. This matching allows us to take advantage of both the data and its semantics. The outcome of this research will enable decision makers to consider the whole scenario and make far better decisions.
Resolution enhancement (RE) methods that are independent of wavelets, i.e., interpolation methods, lead to blurring because high-frequency components are lost, while RE schemes based on the discrete wavelet transform (DWT) produce artifacts due to its shift-variant property. A complex wavelet-domain image resolution enhancement algorithm based on the dual-tree complex wavelet transform (DT-CWT) with non-local means (NLM) filtering and the curvelet transform is proposed. In this scheme, the low-resolution image first undergoes a curvelet transform for denoising. The high-frequency subbands obtained by applying the DT-CWT to the denoised image are interpolated using a Lanczos interpolator, and are then passed through an NLM filter to remove the artifacts generated by the DT-CWT. The low-resolution input image and the filtered high-frequency subbands are combined using the inverse DT-CWT to obtain a resolution-enhanced image. Quantitative peak signal-to-noise ratio (PSNR) results are presented to show the superiority of the proposed technique through comparisons with state-of-the-art resolution enhancement methods.
Digital watermarking is an effective way to protect the copyright of multimedia data even after its transmission. Current trends favor digital image files as the cover file in which another digital file with a secret message or data is hidden. This paper proposes the Discrete Wavelet Transform (DWT) to authenticate the multimedia image; it converts the image from the spatial domain to the frequency domain. The idea behind the LSB algorithm is to insert the bits of the hidden message into the least significant bits of the pixels. Field Programmable Gate Array (FPGA) technology has become a viable target for implementing real-time algorithms suited to video image processing applications. The system is developed on a Xilinx Spartan-3 FPGA device using the Embedded Development Kit (EDK) tools from Xilinx. The results show that the proposed algorithm has very good hiding invisibility, good security, and robustness against many hiding attacks.
450 Development of Mushroom Expert System Based on SVM Classifier and Naive Bayes Classifier, Prof. M.S. Prasad Babu, Rajani Thommandru, K.Swapna, E.Nilima
Machine learning is the ability of a machine to improve its own performance through software that employs artificial intelligence techniques; in practice, this involves creating programs that optimize a performance criterion through the analysis of data. The support vector machine (SVM) is a machine learning algorithm for building classification models from training data. In this paper, Support Vector Machine and Naive Bayes algorithms are used for the classification of mushrooms. A mushroom expert system is developed to classify mushrooms and to predict the class of a mushroom on submission of its characteristics. Programmed interviews were conducted with domain experts to build the knowledge base for the mushroom expert system. The performance of both algorithms is evaluated on mushroom data using cross-validation. The system is a web-based application for online users, developed with JSP as the front end and MySQL as the back end.
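The Naive Bayes side of such a classifier, over categorical mushroom attributes, can be sketched as follows (a minimal stand-alone version with Laplace-style smoothing; the feature values and the exact smoothing denominator are illustrative assumptions, not the paper's configuration):

```python
import math
from collections import Counter, defaultdict

def train_nb(samples, labels):
    """Count class priors and per-class, per-feature value frequencies."""
    priors = Counter(labels)
    counts = defaultdict(Counter)       # (class, feature_index) -> value counts
    for x, y in zip(samples, labels):
        for i, v in enumerate(x):
            counts[(y, i)][v] += 1
    return priors, counts, len(labels)

def predict_nb(model, x):
    """Pick the class maximizing log prior + smoothed log likelihoods."""
    priors, counts, n = model
    best, best_score = None, float("-inf")
    for y, py in priors.items():
        score = math.log(py / n)
        for i, v in enumerate(x):
            c = counts[(y, i)]
            # add-one smoothing so unseen values never zero out a class
            score += math.log((c[v] + 1) / (sum(c.values()) + len(c) + 1))
        if score > best_score:
            best, best_score = y, score
    return best
```

For example, training on a few (odor, cap-surface) samples labeled edible/poisonous lets the model classify a new mushroom from those same attributes.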
451 An Efficient Privacy Preserving Scheme for VANET, BUVANESWARI GANESAN, BENNET PRABHA, MOHANA SUNDARI G
Vehicular Ad Hoc Network (VANET) schemes help to authenticate messages, identify valid vehicles, and preserve the privacy of each vehicle. The objective of a Public Key Infrastructure (PKI) is to enable the secure and convenient acquisition of public keys and to provide functionality using certificates. However, fixed public keys allow an eavesdropper to associate a key with a vehicle and a location, violating drivers' privacy. In this work we propose a time-stamped VANET key management scheme based on Temporary Anonymous Certified Keys (TACKs), which replaces the time-consuming CRL checking process. To reduce the message overhead substantially and to enhance the effectiveness of message verification, the BSpec algorithm is used. Our scheme efficiently prevents malevolent vehicles from linking different keys and provides timely revocation of misbehaving participants while maintaining the same or less overhead for vehicle-to-vehicle communication.
452 Survey on Access Control Delegation to Protect and Maintain Privacy of Cloud Data, Nicholaus Gati, Sudhakar G.
Conventional access control models often assume that the entity enforcing access control policies is also the owner of the data. This assumption no longer holds when a third party such as a cloud provides only data-storage facilities, and it forces the data owner to do a great deal of computation: the approaches that enforce fine-grained access control on confidential data hosted in the cloud are based on fine-grained encryption of the data. Under these models the data owner must perform the fine-grained encryption before uploading to the cloud, and once user dynamics or credentials change, the owner must re-encrypt the data, incurring high computational and communication costs. We propose a better approach that delegates the enforcement of fine-grained access control to the cloud, minimizing the overhead at the data owner while assuring data confidentiality from the cloud. The proposed approach is based on two layers of encryption: the data owner performs coarse-grained encryption, and the cloud performs fine-grained encryption on top of the owner-encrypted data. The main challenge is how access control policies (ACPs) can be decomposed so that the two layers of encryption perform as required; novel optimization algorithms are proposed to solve this problem. An efficient group key management scheme is also utilized to support expressive access control policies. Our system assures the confidentiality and integrity of data and preserves the privacy of end users from the cloud while delegating most access control enforcement to the cloud.
In a federated information system with diverse participants (from different organizations) such as data producers, data consumers, or both, the need for cross-organizational information sharing naturally arises. However, different types of applications often need different forms of sharing: while some applications (e.g., stock-price updating) need a publish-subscribe framework, on-demand information access is more suitable for others. A number of information brokering systems have been developed to provide efficient and secure information sharing. Many of them adopt server-side access control deployment and honesty assumptions on brokers, yet little attention has been paid to the privacy of the data and metadata stored and exchanged within an Information Brokering System (IBS). We propose an IBS on top of a peer-to-peer overlay to support information sharing among loosely federated data sources. It consists of diverse data servers and brokering components that help client queries locate the data servers. The privacy of data locations and data consumers can still be inferred from metadata (such as queries and access control rules) exchanged within the IBS, so we study the problem of privacy protection in the information brokering process. A formal presentation of the threat models is given, focusing on two attacks: the attribute-correlation attack and the inference attack. We propose a flexible and scalable system using a broker-coordinator overlay network. Through an innovative automaton segmentation scheme, distributed access control enforcement, and query-segment encryption, the proposed system integrates security enforcement and query forwarding while preserving system-wide privacy. A comprehensive analysis of privacy, end-to-end performance, and scalability shows that the proposed system integrates security enforcement and query routing with reasonable overhead.
A wireless sensor network consists of distributed autonomous sensors that monitor physical or environmental conditions, such as temperature, sound, and pressure, and cooperatively pass their data through the network to a main location. Energy consumption is one of the most important concerns in a wireless sensor network, so to bring energy efficiency to WSNs an energy-aware sensor node is implemented. The objective of the energy-efficient strategy is to reduce energy consumption at both the sensor-node level and the network level. At the sensor node, to decrease communication energy consumption, the distance between the transmitter and the receiver is predicted before each transmission, and the lowest transmission power needed to transmit the measurement data is then calculated and applied. In addition, the sensor nodes are put into sleep mode between two consecutive measurements to save energy. To improve energy saving further, we introduce a technique that combines energy efficiency with multiple-path selection for data fusion in the WSN. The network is divided into clusters, and in each cluster the node with the highest residual energy is chosen as the cluster head. For each cluster head, the sink computes multiple paths for data transmission. At the cluster head, the data from the sensors are compressed using distributed source coding and the lifting-scheme wavelet transform. To balance energy use across rounds of transmission, the path is changed in a round-robin manner. With this method we achieve lower energy consumption with an increased packet delivery ratio.
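Two of the steps described above, choosing the lowest sufficient transmit power from a predicted distance and rotating among sink-computed paths, can be sketched as follows (the free-space path-loss model, receiver sensitivity, and link margin are assumptions for illustration; the paper does not specify its propagation model):

```python
import math

def min_tx_power(distance_m, p_rx_dbm=-90.0, freq_hz=2.4e9, margin_db=10.0):
    """Lowest transmit power (dBm) for a predicted link distance, using
    free-space path loss: FSPL = 20log10(d) + 20log10(f) + 20log10(4*pi/c).
    Receiver sensitivity and fade margin are hypothetical values."""
    c = 3e8
    fspl_db = (20 * math.log10(distance_m)
               + 20 * math.log10(freq_hz)
               + 20 * math.log10(4 * math.pi / c))
    return p_rx_dbm + fspl_db + margin_db

class RoundRobinPaths:
    """Rotate among the multipaths computed by the sink, one per round,
    so no single route drains its nodes' batteries."""
    def __init__(self, paths):
        self.paths, self.i = paths, 0

    def next_path(self):
        p = self.paths[self.i % len(self.paths)]
        self.i += 1
        return p
```

Shorter predicted distances yield lower required power, which is where the per-transmission energy saving comes from.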
455 Color Image Segmentation using IMOWT with 2D Histogram Grouping, Christo Ananth, A.S.Senthilkani, S.Kamala Gomathy, J.Arockia Renilda, G.Blesslin Jebitha, Sankari @Saranya.S.
In this paper, a novel algorithm based on 2D histogram grouping for color image segmentation is proposed. The proposed method uses intermediate features of the maximum overlap wavelet transform (IMOWT) as a pre-processing step. The coefficients derived from IMOWT are subjected to 2D histogram grouping. This method is simple, fast, and unsupervised. 2D histograms are used to obtain a grouping of the color image. The grouping output gives three segmentation maps, which are fused together to get the final segmented output. This method produces better segmentation results than the direct application of 2D histogram grouping. IMOWT is an efficient transform in which a set of wavelet features of the same size is used across various levels of resolution, with different local window sizes for different levels. IMOWT is efficient because of its time effectiveness, flexibility, and translation invariance, which are useful for good segmentation results.
456 Cloud Computing for Academic Environment, S. Palaniappan
In traditional computing, we install software programs on a system (computer) and update the hardware as per our requirements. Documents we create or save are stored on our own computer and are accessible on our own network, but they cannot be accessed by computers outside the network. With cloud computing, software programs are not run from one's personal computer but are instead stored on servers accessed via the Internet. Cloud computing provides the resources and capabilities of information technology (e.g., applications, storage, communication, collaboration, infrastructure) via services offered by a cloud service provider (CSP). Cloud computing has various characteristics: shared infrastructure, self-service, a pay-per-use model, dynamic and virtualized resources, and elasticity and scalability. In an academic environment, cloud computing benefits every student and staff member, since academia requires extensive collaboration and data safety. An academic institution has various departments and many semesters in which large numbers of students need computing access, so highly available, up-to-date software and hardware are a must. The scaling capacity and elasticity of cloud computing are perfect for such an environment.
Project planning is one of the most important activities in a software project. Poor planning often leads to project faults and dramatic outcomes for the project team. Software effort estimation is a critical task in software engineering, and a suitable estimation technique is crucial to control quality and efficiency. This paper gives a comparative analysis of the available software effort estimation methods. These methods can be broadly categorized into algorithmic, non-algorithmic, parametric, and machine learning models. The existing methods for software cost estimation are illustrated and their aspects discussed. No single technique is best for all situations, so a careful comparison of the results of several approaches is most likely to produce a realistic estimate. An example of estimation from an actual software project is also presented.
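As a concrete illustration of the algorithmic/parametric family of methods the paper compares, here is the basic COCOMO model with Boehm's published coefficients; the 32 KLOC input is an invented example, not a figure from the paper:

```python
def cocomo_basic(kloc, mode="organic"):
    """Basic COCOMO estimate.
    Effort (person-months) = a * KLOC^b; schedule (months) = c * Effort^d."""
    coeffs = {
        "organic":       (2.4, 1.05, 2.5, 0.38),
        "semi-detached": (3.0, 1.12, 2.5, 0.35),
        "embedded":      (3.6, 1.20, 2.5, 0.32),
    }
    a, b, c, d = coeffs[mode]
    effort = a * kloc ** b
    months = c * effort ** d
    return effort, months

# Example: a hypothetical 32 KLOC organic-mode project.
effort, months = cocomo_basic(32, "organic")
```

Non-algorithmic approaches (expert judgment, analogy) and machine-learning models would estimate the same quantity from historical data rather than a fixed formula, which is exactly why comparing several approaches is advisable.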
458 A Self-Destructing Data System Based on Active Storage Framework for Protecting Data Privacy from Attackers, R.Rengasamy, V.Kumaresan, G.Guru Rani
Personal data stored in the cloud may contain account numbers, passwords, and other important information that could be used and misused by a rival, a competitor, or a court of law. These data are cached, copied, and archived by cloud service providers (CSPs), often without users' authorization and control. Self-destructing data mainly aims at protecting the privacy of user data: all data and their copies become destructed or unreadable after a user-specified time, without any user intervention. In addition, the encryption key is destructed after the user-specified time. In this paper, we present SeDas, a system that meets this challenge through a novel integration of cryptographic techniques with active storage techniques based on the T10 OSD standard. We implemented a proof-of-concept SeDas prototype. Functionality and performance evaluations show that SeDas meets all the privacy-preserving goals described. Compared to the system without the self-destructing data mechanism, throughput for uploading and downloading with the proposed SeDas acceptably decreases by less than 72%, while latency for upload/download operations with the self-destructing data mechanism increases by less than 60%.
459 An Industrial Investigation of Human Factors Effect on Software Productivity: Analyzed by SEM Model, Rabia Khan, Israr Ahmed, Md. Faisal
The software productivity of any software organization can be greatly enhanced if the human factors associated with the company's employees are aligned toward cooperation and unity among the different development teams working on the same software project. In brief, the growth of any software company depends highly on understanding the concept of software productivity and on a means to measure and quantify it. In this paper we analyze some critical factors associated with software development productivity, using industrial case-study data gathered through questionnaires, interviews, etc. in a typical medium-sized software firm. The data analysis is done using the structural equation modeling (SEM) approach. We then present the effect of human factors on software development productivity. It is important to combine both human factors and software productivity in our study so as to arrive at one final conclusion about the factors affecting software development productivity. Finally, the paper ends with a brief discussion of the results and the scope for future work on this topic.
460 Design of Piezoelectric MEMS Sensor for Energy Harvesting from Low Frequency Applications, Swapnil D. Shamkuwar, Kunal N. Dekate
This paper presents the design of a piezoelectric sensor model that acts as a vibration-based energy harvester. The ambient vibration-based microelectromechanical systems (MEMS) piezoelectric harvester has become an important subject in recent research. It provides a green and practically infinite alternative to conventional power sources; such a harvester can significantly extend the applications of wireless sensor networks, biomedical implants, etc., which may require mW or µW levels of power. Vibration energy harvesting converts ambient kinetic energy into electric energy through several different transduction methods. Among the many transducing harvesters, the piezoelectric energy harvester (PEH) is compatible with MEMS technology, has a high electromechanical coupling effect, and requires no external voltage source, and accordingly has featured in much recent research. The simplicity associated with piezoelectric micro-generators makes them very attractive for MEMS applications in which ambient vibrations are harvested and converted into electric energy. These micro-scale generators may become an alternative to battery-based solutions in the future. In this paper, we propose a model and present the simulation of a MEMS-based energy harvester under ambient vibration excitation using COMSOL 4.3b.
461 Implementation of Hash Function Based On Neural Cryptography, Vineeta Soni, Mrs. Sarvesh Tanwar, Prof. Prema K.V.
In this paper a new hash function is constructed based on a multilayer feed-forward network with a piecewise linear chaotic map. Chaos has been used in data protection because of its sensitivity to initial values, random similarity, and ergodicity. We use three neuronal layers to provide confusion, diffusion, and compression respectively. This hash function takes input of arbitrary length and generates a fixed-length hash value (128, 256, or 512 bits). Performance analysis and results show that the generated hash function is one-way, collision resistant, and secure against birthday attacks and meet-in-the-middle attacks.
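The paper's neural construction is not reproduced here, but the piecewise linear chaotic map (PWLCM) it builds on is standard, and the sketch below shows how iterating it can drive a toy digest. This is illustrative only and NOT a secure hash; the mixing constants, round count, and digest folding are arbitrary choices of mine:

```python
def pwlcm(x, p):
    """One iteration of the piecewise linear chaotic map on (0, 1);
    p in (0, 0.5) is the control parameter."""
    if x >= 0.5:
        x = 1.0 - x          # the map is symmetric about 0.5
    if x < p:
        return x / p
    return (x - p) / (0.5 - p)

def toy_chaotic_hash(message: bytes, p=0.37, rounds=16) -> int:
    """Illustrative 128-bit digest: each byte perturbs the map state,
    and the iterated trajectory is folded into the digest."""
    x, digest = 0.123456789, 0
    for byte in message:
        x = (x + (byte + 1) / 257.0) % 1.0 or 0.1   # mix byte into state
        for _ in range(rounds):                     # chaotic diffusion
            x = pwlcm(x, p)
        digest = ((digest << 8) ^ int(x * 255)) & ((1 << 128) - 1)
    return digest
```

The sensitivity to initial values mentioned in the abstract is what makes small input changes propagate: a one-byte difference shifts the trajectory of every subsequent iteration.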
Controller Area Network (CAN) is a network protocol that allows multiple processors in a system to communicate efficiently with each other. Based on the requirements of modern vehicles, an in-vehicle Controller Area Network (CAN) architecture has been implemented. In order to reduce the point-to-point wiring harness in vehicle automation, CAN is suggested as a means for data communication within the vehicle environment. The benefits of a CAN-bus-based network over traditional point-to-point schemes include increased flexibility and expandability for future technology insertions. This project is aimed at implementing the CAN protocol using a PIC microcontroller for a vehicle monitoring and security system that uses principal component analysis. The main feature of the system is the monitoring of various vehicle parameters. For security, the system automatically takes photos of the driver and compares his or her face with a database to check whether he or she is an authenticated driver; access to the vehicle is granted only to authenticated drivers. Additionally, the owner of the vehicle receives an image of the attempted theft via email, which is an additional feature of the system.
463 Reversible Watermarking Data Embedding into Images using DHS and IMC, D.Maladevi, V.Rajaram, K.Thaiyalnayaki
A reversible watermarking scheme based on histogram shifting modulation modifies the local precision of image content. Existing schemes hide only small amounts of data while their capacity-distortion cost is high, so we propose dynamic histogram shifting, which inserts more data into a single original image with lower distortion. For the same capacity, the achieved peak signal-to-noise ratio (PSNR) is 1 to 4 dB higher. Invariant image classification is used to classify the parts of the image so that the embedder and extractor remain synchronized with the watermarked image; the classification of the original image is derived from the image itself.
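The dynamic, classification-driven variant of the paper is not reproduced here, but the underlying histogram-shifting idea can be sketched in its classic single peak/zero-bin form (function and variable names are my own; a real scheme works on a 2-D image and also transmits the peak/zero positions):

```python
def hs_embed(pixels, bits):
    """Classic histogram-shifting embedding on a flat list of 8-bit
    pixel values: shift the bins between the peak and an empty bin up
    by one, then hide one bit per peak-valued pixel."""
    hist = [0] * 256
    for v in pixels:
        hist[v] += 1
    peak = max(range(255), key=lambda v: hist[v])            # most common value
    zero = min(range(peak + 1, 256), key=lambda v: hist[v])  # emptiest bin above
    if hist[zero]:
        raise ValueError("sketch needs a zero-count bin above the peak")
    out, it = [], iter(bits)
    for v in pixels:
        if peak < v < zero:
            out.append(v + 1)              # shift to free the bin peak + 1
        elif v == peak:
            out.append(v + next(it, 0))    # embed: bit 0 -> peak, 1 -> peak + 1
        else:
            out.append(v)
    return out, peak

def hs_extract(marked, peak):
    """Recover the bits from pixels valued peak or peak + 1."""
    return [v - peak for v in marked if v in (peak, peak + 1)]
```

Because the shift between the peak and the zero bin is exactly invertible, the receiver can both read the bits and restore the original pixel values, which is what makes the scheme reversible.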
464 Improving Selfish Node Detection in ZigBee Wireless Network with Shortcut Tree Routing, Mr. A. Mohanasundaram, Mr. R. Regin, Dr. P. S. Prakash
ZigBee tree routing is used in many resource-limited devices and applications, since it requires no routing table and no route discovery cost to send a packet to the destination. However, ZigBee tree routing has the fundamental limitation that a packet must follow the tree topology, so it cannot provide an optimal routing path. In this project work, a shortcut tree routing protocol is proposed that provides a near-optimal routing path while maintaining the advantages of ZigBee tree routing, such as no route discovery overhead and low memory consumption. The key idea of shortcut tree routing is to calculate the remaining hops from an arbitrary source to the destination using the hierarchical addressing scheme in ZigBee. Each source or intermediate node forwards a packet to the neighbor node with the smallest remaining hops in its neighbor table. Shortcut tree routing is fully distributed and compatible with the ZigBee standard, in that it only uses the addressing scheme and neighbor table without any changes to the specification. Mathematical analysis proves that 1-hop neighbor information improves overall network performance by providing an efficient routing path and distributing the traffic load concentrated on the tree links. The performance evaluation shows that shortcut tree routing achieves performance comparable to Ad hoc On-demand Distance Vector (AODV) with only the limited overhead of neighbor table maintenance, and outperforms ZigBee tree routing in all network conditions, such as network density, network structure, traffic type, and traffic volume.
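The address-based hop computation behind this idea can be sketched from the ZigBee distributed addressing scheme. The code below is my own simplified reconstruction (router-only topology, no end-device handling): it derives a node's ancestor chain purely from its address and counts tree-path hops via the lowest common ancestor:

```python
def cskip(depth, cm, rm, lm):
    """ZigBee address offset Cskip(d): the address-block size handed to
    each router child of a router at `depth`, for a network with at most
    cm children, rm router children, and depth lm."""
    if rm == 1:
        return 1 + cm * (lm - depth - 1)
    return (1 + cm - rm - cm * rm ** (lm - depth - 1)) // (1 - rm)

def ancestors(addr, cm, rm, lm):
    """Ancestor addresses of `addr` (root = 0), derived from the
    distributed addressing scheme alone -- no routing table needed."""
    node, depth, chain = 0, 0, [0]
    while node != addr and depth < lm:
        skip = cskip(depth, cm, rm, lm)
        k = (addr - node - 1) // skip       # which router subtree holds addr
        node = node + 1 + k * skip
        chain.append(node)
        depth += 1
    return chain

def tree_hops(a, b, cm, rm, lm):
    """Hop count along the tree path between addresses a and b."""
    ca, cb = ancestors(a, cm, rm, lm), ancestors(b, cm, rm, lm)
    lca = 0
    for x, y in zip(ca, cb):
        if x != y:
            break
        lca += 1
    return (len(ca) - lca) + (len(cb) - lca)
```

A shortcut-style forwarder would evaluate `tree_hops(neighbor, dst, ...)` for every entry in its neighbor table and hand the packet to the neighbor with the smallest value, which is how the tree path gets shortcut without any route discovery.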
465 Path Planning Algorithm for Mobile Anchor-Based Localization in Mobile Networks, T. S. Lokhande, Prof. R. R. Shelke
Mobile technology has helped to simplify networking by enabling multiple computer users to simultaneously share resources in a home or business without additional or intrusive wiring. A mobile network consists of hundreds or thousands of nodes and a small number of data collection devices. Localization of the mobile nodes is an essential issue in mobile networks, because many applications require the mobile nodes to know their locations with a high degree of precision. Various localization methods based on mobile anchor nodes have been proposed for assisting the mobile nodes in determining their locations. However, none of these methods attempt to optimize the trajectory of the mobile anchor node. Accordingly, the proposed path planning scheme ensures that the trajectory of the mobile anchor node minimizes the localization error and guarantees that all of the mobile nodes can determine their locations.
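The localization step that such anchor trajectories serve can be illustrated with plain trilateration: given three anchor positions (e.g., three points on the mobile anchor's path) and the corresponding range estimates, the node's position follows from linearizing the circle equations. This is a minimal sketch with exact ranges; real systems use least squares over many noisy measurements:

```python
def trilaterate(a1, a2, a3, d1, d2, d3):
    """Solve for (x, y) from three anchor positions and ranges by
    subtracting pairs of circle equations, which leaves a 2x2 linear
    system solved here with Cramer's rule."""
    (x1, y1), (x2, y2), (x3, y3) = a1, a2, a3
    A, B = 2 * (x2 - x1), 2 * (y2 - y1)
    C = d1**2 - d2**2 + x2**2 - x1**2 + y2**2 - y1**2
    D, E = 2 * (x3 - x2), 2 * (y3 - y2)
    F = d2**2 - d3**2 + x3**2 - x2**2 + y3**2 - y2**2
    det = A * E - B * D          # zero iff the anchors are collinear
    return (C * E - B * F) / det, (A * F - C * D) / det
```

The collinearity condition in the comment is precisely why anchor path planning matters: a trajectory that only ever offers nearly collinear anchor positions makes the system ill-conditioned and inflates the localization error.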
Cloud-assisted privacy-preserving mobile health monitoring, which applies mobile communications and cloud computing technologies, is considered an efficient approach for improving the value of healthcare services while lowering healthcare costs. This method protects the privacy of the parties involved in the system and their data.
467 Secure Authentication with 3D Password, Ganesh Jairam Rajguru, Prof. P. L. Ramteke
Providing authentication to any system provides more security to that system. Many authentication techniques are available, such as textual passwords and graphical passwords, but each of these individually has limitations and drawbacks. To overcome the drawbacks of previously existing authentication techniques, a new improved authentication technique called the 3D password is used. The 3D password is a multi-password and multi-factor authentication system, as it uses various authentication techniques such as textual and graphical passwords. The most important part of the 3D password scheme is the inclusion of a 3D virtual environment consisting of real-time object scenarios. It is not an actual real-time environment; it is just a user interface provided to the scheme that looks like a real environment. The 3D password is a more secure authentication scheme than other authentication techniques because it is more advanced than the other schemes; it is also hard to break and easy to use. In this paper we introduce our contribution towards making the 3D password more secure and more user-friendly for users of all categories. The paper also explains what a 3D password is, how the 3D password scheme works, some mathematical concepts related to the 3D password, and applications of the scheme, all briefly introduced and explained section by section.
468 A Review of Rule Based Clustering and Routing Approach to Improve Clustered Network Communication, Shafali Vashist, Mr. Samit Yadav
Many challenges are associated with sensor networks. Communication in a sensor network is controlled by a routing protocol, and achieving effective communication over the network requires selecting an appropriate protocol. In this paper, a classification of sensor networks is defined according to applications and requirements. The paper categorizes the available protocols into related classes and explores each protocol class.
469 Approaches for Combating Delay and Achieving Optimal Path Efficiency in Wireless Sensor Networks, Agam Gupta, Mansi Gupta, Anand Nayyar
Wireless sensor networks (WSNs) have gained much interest from researchers because of their wide range of applications. A sensor network has four main performance metrics: network lifetime, end-to-end delay, packet delivery ratio, and throughput. In recent times much research has been done on extending the lifetime of the network, and end-to-end delay is often compromised to achieve it. But in some applications, such as environment monitoring and intrusion detection, delay is not tolerable. In this paper we list some of the approaches and protocols that are both delay resistant and energy efficient. The aim of this paper is to make readers aware of how to increase network lifetime without compromising the delay factor, which can be very helpful for applications that do not tolerate delay.
470 Review of Routing Techniques Driving Wireless Sensor Networks, Shuchi Sharma, Mansi Gupta, Anand Nayyar
In recent times wireless sensor networks (WSNs) have grown enormously and become progressively attractive in a wide variety of applications. Wireless sensor networks contain tiny sensor nodes that have the capability of sensing, communication, and computation. Many new protocols have been designed for wireless sensor networks in which energy, congestion, lifetime, coverage, etc. are essential considerations. In this paper, a survey of routing protocols for WSNs has been done and various categories of routing protocols are explored. The main categories of routing protocols in wireless sensor networks are data-centric protocols, hierarchical protocols, location-based protocols, and QoS-aware protocols. The main aim of this paper is to summarize the various routing techniques along with their merits and demerits.
The rapid growth of the web is causing a steady growth of information, leading to several problems such as an increased difficulty in extracting potentially useful knowledge. With the huge amount of information available online, the World Wide Web is a fertile area for data mining research. Research in web mining aims to develop new techniques to effectively extract and mine useful knowledge or information from web pages. Due to the heterogeneity and lack of structure of web data, automated discovery of targeted or unexpected knowledge is a challenging task. In this paper, we survey the research in the area of web mining, point out the categories of web mining and the variety of techniques used in those categories, and elicit the research scope in the areas of web usage mining, web content mining, and web structure mining. We conclude this study with a brief discussion on data managing, querying, and representation issues.
472 A Review on Partitioning Techniques in Database, Abhay Kumar, Jitendra Singh Yadav
Data is most important in today's world, as it helps organizations as well as individuals to extract information and use it to make various decisions. Data is generally stored in a database so that retrieving and maintaining it becomes easy and manageable. All operations of data handling and maintenance are done using a database management system. Data management is a monotonous task in a growing data environment; partitioning is a possible and partly accepted solution. Partitioning provides user-friendliness, easier maintenance, and improved query performance to database users. This paper briefly reviews partitioning methods and how they help reduce response time, and shows positive results with partitioning methods.
473 A Framework for an Outlier Pattern Detection in Weather Forecasting, Miss. Kavita Thawkar, Prof. Snehal Golait, Prof. Rushi Longadge
Data mining is the process of discovering new patterns from large data sets. Meteorological data mining is a form of data mining concerned with finding rare patterns inside the large amounts of available weather data. Detecting rare weather patterns is a difficult challenge because these rare events are characterized by low occurrence and uncertainty. In this paper, we propose an adaptive Markov chain algorithm model which uses an open number of Markov chain states to accommodate the dynamic temporality of the data. The data is collected with the Tropical Atmosphere Ocean (TAO) array, which was developed by the international Tropical Ocean Global Atmosphere (TOGA) program. Variables including latitude, longitude, zonal wind, meridional wind, humidity, air temperature, and sea surface temperature are considered for identifying climate change patterns. By adding the Markov property as a global restriction, the granular size of the clusters is determined for optimal performance. Our climate change pattern detection algorithm is shown to be of potential use for climatic and meteorological research, as well as research focusing on temporal trends in weather and the consequent changes.
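The adaptive model itself is not reproducible from the abstract, but its basic machinery, estimating Markov transition probabilities from a discretized series and flagging low-probability transitions as rare patterns, can be sketched as follows (the state labels and probability floor are invented for illustration):

```python
from collections import defaultdict

def fit_transitions(states):
    """Estimate first-order Markov transition probabilities from a
    sequence of discrete states (e.g. binned temperature readings)."""
    counts = defaultdict(lambda: defaultdict(int))
    for a, b in zip(states, states[1:]):
        counts[a][b] += 1
    return {a: {b: n / sum(nxt.values()) for b, n in nxt.items()}
            for a, nxt in counts.items()}

def rarity(states, probs, floor=1e-6):
    """Score each observed transition; rare (low-probability or never
    seen) transitions flag candidate outlier patterns."""
    return [1.0 - probs.get(a, {}).get(b, floor)
            for a, b in zip(states, states[1:])]
```

An adaptive variant in the spirit of the paper would add new states when incoming observations fit no existing bin well, rather than fixing the state set in advance.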
474 Clustering Techniques Analysis for Microarray Data, Shweta Srivastava, Nikita Joshi
Microarray data is gene expression data consisting of the expression levels of various genes across a set of samples. It is high-dimensional data, and high dimensionality is a curse for the analysis of gene expression data. Thus a gene selection process is used, in which the most informative genes are selected from the pool of the gene expression data set. Not all genes are relevant in every case: we need to select those genes which are relevant while keeping the redundancy among them to a minimum. For this purpose various approaches can be used, such as filter methods, wrapper methods, embedded approaches, and clustering. In this paper, an embedded approach for gene selection and a clustering method are used to perform sample clustering to refine the classification, and the two are compared on the basis of various parameters.
475 A Review: Security Issues in Mobile Ad Hoc Network, Priti, Dr. Priti Sharma
A mobile ad-hoc network is an infrastructure-less, spontaneous, and multi-hop network consisting of decentralized wireless mobile nodes. A MANET uses temporary ad-hoc network topologies, allowing people and devices to seamlessly connect in areas with no pre-existing communication infrastructure, e.g., disaster recovery environments. A mobile ad-hoc network is a collection of nodes connected to each other through a wireless medium, forming rapidly changing topologies. Routing in a mobile ad-hoc network is challenging due to these dynamic changes in topology. There are many trust models and routing protocols used in MANETs to achieve security, and various trust schemes are used to provide integrity, confidentiality, and availability in order to create a secure environment. In this paper, we discuss the characteristics of, attacks on, and security in mobile ad-hoc networks.
476 The Energy Efficient Multi-Hop Clustering Process for Data Transmission in Mobile Sensor Networks, V. Shunmuga Sundaram
The main function of cognitive radio technology is to enable spectrum utilization by detecting unused spectrum and sharing it without harmful interference to other users. Energy consumption is a primary concern in wireless sensor networks. The proposed solution is a Distributed Efficient Multi-hop Clustering (DEMC) routing protocol that considers not only static nodes but also mobile environments, and is used to reduce packet loss during cluster communication. Its main function is to elect the cluster head according to energy level, connectivity, and stability, and to transfer information from the source to the destination. The nodes in the clusters advertise the cluster head to other nodes. The protocol improves connectivity with the cluster head and also provides active communication. The DEMC protocol adapts to changes in the network topology and the information stored in the radio network, and it mainly increases the chance of generating communication links, leading to more reliable communication paths for data transmission.
477 Comparison of SVD-Watermarking and LSB-Watermarking Techniques, Anupma Yadav, Anju Yadav
Watermarking, which belongs to the data hiding field, has gained much attention and interest from researchers, and work in this field is ongoing. Watermarking algorithms have various uses, including content protection, tamper detection, and copyright protection. In this paper, we present two well-known image watermarking techniques, SVD and LSB. When the two are compared, the SVD-based technique shows better results than the LSB-based one.
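For reference, the LSB side of the comparison is simple enough to sketch in a few lines (using a flat pixel list instead of a 2-D image; SVD embedding, which modifies the singular values of image blocks, is omitted here because it needs linear-algebra machinery):

```python
def lsb_embed(pixels, bits):
    """Embed watermark bits into the least-significant bit of each
    pixel (flat list of 8-bit values); the simplest spatial-domain scheme."""
    out = list(pixels)
    for i, bit in enumerate(bits):
        out[i] = (out[i] & ~1) | bit   # clear LSB, then set it to the bit
    return out

def lsb_extract(pixels, n):
    """Read the first n watermark bits back out of the LSBs."""
    return [p & 1 for p in pixels[:n]]
```

This fragility is the crux of the comparison: flipping LSBs is visually imperceptible but is destroyed by almost any lossy processing, whereas SVD-based embedding, which perturbs the singular values of the image, survives far more distortion.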
478 Implementation of Reversible Sequential Circuits Using Conservative Logic Gates, Dhaarinee.S, Rajeswaran.N
Reversible circuits do not lose any information during computation. Reversible computation can be performed using reversible gates such as the Fredkin gate, Feynman gate, and Toffoli gate. A reversible gate has a unique output vector for each input vector, i.e., a one-to-one mapping between inputs and outputs. The existing system is designed using Fredkin gates cascaded in series or parallel using the characteristic equation of each reversible gate. In the proposed system, sequential circuits such as the master-slave flip-flop and the edge-triggered flip-flop are designed using the Toffoli gate, which is universal in nature. These circuits have less power dissipation and are used in applications such as quantum computing, digital signal processing, cryptography, nanotechnology, and testing. The circuit can detect stuck-at faults using two test vectors, all 0s and all 1s.
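The two gates named above are easy to model as truth functions, which also makes the "unique output vector for each input vector" claim directly checkable (a minimal sketch, not the paper's flip-flop designs):

```python
def toffoli(a, b, c):
    """Toffoli (CCNOT): inverts target c iff both controls a and b are 1."""
    return a, b, c ^ (a & b)

def fredkin(c, a, b):
    """Fredkin (controlled swap): exchanges a and b iff control c is 1.
    It is conservative: the number of 1s in the vector is preserved."""
    return (c, b, a) if c else (c, a, b)
```

Applying the Toffoli gate twice restores the input (it is its own inverse), and mapping all eight input vectors yields eight distinct outputs, i.e. a bijection. The conservative property of the Fredkin gate is what enables stuck-at fault detection with only the all-0s and all-1s test vectors.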
Groups are becoming one of the most compelling features in both online social networks and Twitter-like microblogging services. A stranger outside an existing group may need to find out more about the attributes of the group's current members in order to decide whether to join. However, in many cases, the attributes of both the group members and the stranger need to be kept private and should not be revealed to others, as they may contain sensitive and personal information. How can we find out whether matching information exists between the stranger and the members of the group, based on attributes that are not to be disclosed? In this paper, we present a new group matching mechanism that takes advantage of private set intersection and ring signatures. With our scheme, a stranger is able to collect correct group matching information while the sensitive information of the stranger and the group members is not disclosed. Finally, we propose to use batch verification to significantly improve the performance of the matching process.
Static random access memory (SRAM) is an indispensable part of most modern VLSI designs because of its low power consumption and high speed, and it dominates silicon area in many applications. SRAM plays a significant role in energy consumption due to the high density required for ever-increasing computing power in many ultra-low-power applications. We propose a novel low-power 6T SRAM cell with a single bitline to enhance stability. SRAM energy efficiency can be improved with a wider SRAM array structure having fewer rows than columns, particularly at low supply voltage. In the proposed 6T SRAM cell, the write operation is done by charging or discharging a single bitline (BL), which reduces dynamic power consumption. Simulation results show better efficiency for the same SRAM bit density and the same supply voltage.
481 Reversible Data Hiding and its Methods: A Survey, Sukhdeep Kaur, Manshi Shukla
Hiding information destroys the host image even though the distortion introduced by hiding is imperceptible to the human visual system. Reversible data hiding techniques are designed to solve the problem of lossless embedding of large messages in digital images so that after the embedded message is extracted, the image can be restored completely to its original state before embedding occurred. This paper presents a survey on various reversible data hiding methods.
482 Result Analysis for LBP and Shape Context Methodologies used as Authentication Mechanisms of Digital Signatures used for Certification, Shraddha Kulkarni, Prof. Vikrant Chole
The signature is a very important attribute, mostly used for financial document certification and personal identification, so it is necessary to check the authenticity and genuineness of a signature. In the literature, most signature verification methods verify whether the test signature is perfectly aligned to a specified axis; if not, the method rejects the signature even though it may be genuine. The technique implemented here verifies signatures in a size- and angle-invariant manner; the invariance is obtained through scaling and rotational manipulations of the test image. The proposed shape-context methodology involves cropping the signature image from the cheque, grayscale conversion, edge detection, and binarization of the image, which is then localized and compared with the account holder's reference information to authenticate and clear the cheque.
483 Optimizing the Performance and Secure Distributed Wireless Network in Unreliable D-NCS using CGA, Babu Pinjar, C.N.Chinnaswamy
Distributed networked control systems (D-NCS) are network structures and components capable of integrating sensors, actuators, communication, and control algorithms to suit real-time applications. D-NCS are vulnerable to various network attacks when the network is not secured; thus, a D-NCS must be well protected with security mechanisms (e.g., cryptography), which may adversely affect its dynamic performance because of limited system resources. This paper is concerned with the problem of designing a secure distributed control methodology capable of performing a secure consensus computation in a D-NCS in the presence of misbehaving nodes. It embeds four phases (detection, mitigation, identification, and update) into the distributed control process. We use a wireless-network-based robot navigation path tracking system called Intelligent Space (iSpace) as a D-NCS test bed. The network security algorithms DES and 3DES are integrated with the application to secure the sensitive information flow. The paper then proposes a paradigm of performance-security tradeoff optimization based on a coevolutionary genetic algorithm (CGA) for D-NCS.
484 Dynamic Routing for ADA in Wireless Sensor Networks, Nagaraj, C.N.Chinnaswamy
Sensor webs consisting of nodes with limited battery power and wireless communications are deployed to collect useful information from the field [1]. Gathering sensed information in an energy-efficient manner is critical to operating the sensor network for a long period of time. In the data collection problem, in each round of communication, every sensor node has a packet to be sent to the distant base station. There is a fixed energy cost in the electronics when transmitting or receiving a packet, and a variable cost when transmitting a packet that depends on the distance of transmission. If each node transmits its sensed data directly to the base station, it will deplete its power quickly. Data aggregation [3][5] has been widely recognized as an efficient method to reduce energy consumption [4] in wireless sensor networks, which can support a wide range of applications such as monitoring temperature [4], humidity, level, speed, etc. Data sampled by the same kind of sensors has much redundancy, since sensor nodes are usually quite dense in wireless sensor networks. To make data aggregation more efficient, packets with the same attribute, defined as the identifier of data sampled by different kinds of sensors such as temperature sensors and humidity sensors, should be gathered together. However, to the best of our knowledge, existing data aggregation mechanisms do not take the packet attribute into consideration. Following this paradigm, numerous data aggregation schemes [6], [7], [8], [9], [10], [11], [12], [13], [14], [15], [16], [17], [18] have been proposed to save the limited energy of sensor nodes in WSNs. In this paper, we introduce the packet attribute into data aggregation and propose an Attribute-aware Data Aggregation (ADA) mechanism using dynamic routing, which makes packets with the same attribute convergent as much as possible and therefore improves the efficiency of data aggregation.
Procure Data Centre (PDC) is an emerging patient data-centric framework for data interchange in large-scale data-centric applications, in which data is outsourced for storage to general IT providers such as cloud providers, raising the question of how to protect private data while it is stored on cloud servers. To retain control over access to their own files, a promising approach is to encrypt files and personal information before outsourcing. Yet issues such as the risk of privacy exposure, scalability of key management, flexible access, and efficient user revocation remain the most significant challenges to achieving fine-grained, cryptographically enforced data access control. This thesis proposes a new data-centric role model and a suite of mechanisms for access control over personal profiles stored on semi-trusted servers. To achieve fine-grained and scalable data access control for PDCs, Distributed Multi-Authority Attribute-Based Encryption (DMA-ABE) is employed to generate the ciphertext when encrypting each data file. Unlike previous work on secure data outsourcing, the focus is on the multiple-data-owner scenario, and users in the PDC scheme are divided into multiple security domains, which greatly reduces key-management complexity for owners and consumers. A high degree of data privacy is guaranteed at the same time by employing distributed multi-authority ABE. The scheme also enables dynamic alteration of access policies and file attributes, and ensures efficient user/attribute revocation as needed. Evaluating the performance of cloud provisioning policies, application workload models, and resource performance models in a repeatable manner under varying system and user configurations and requirements is difficult to achieve.
To overcome this challenge, we use CloudSim, an extensible simulation toolkit that enables modeling and simulation of cloud computing systems and application provisioning environments. The CloudSim toolkit supports both system and behavior modeling of cloud system components such as data centres and virtual machines (VMs). The proposed algorithm is implemented using the CloudSim 3.0.1 simulator.
486 Effective Copyright Protection of Digital Products by Embedding Watermarking, Monika Craig, Prof. Deepak Kapgate
The watermarking of digital images, audio, video, and multimedia products in general has been proposed for resolving copyright ownership and verifying the originality of content. This paper studies the contribution of watermarking to the development of protection schemes. The Wolfgang and Delp algorithm is used for watermark embedding. The algorithm proposes a watermarking scheme in which the watermark image is encrypted using a symmetric cipher such as DES. In this method, the watermark symbol is first encrypted using DES and then embedded into the RGB vectors of the original image using SVD-based transformation sampling. Experiments show that the watermark-embedding algorithm has good robustness against JPEG compression attacks.
It is crystal clear that there has been a systematic transformation of human activities and societies since the advent of information technology (IT). However, information technology, which ought to be of enormous advantage and usable without fear, has been hijacked by the world's terrorists to launch many attacks via cyberspace (the internet). Cyber terrorism has been one of the social and ideological menaces posing great challenges to contemporary society all over the world, and its emergence has the potential to vitiate the positive uses of information technology if not properly and promptly addressed. It is in this light that this study critically explores the contemporary literature on this topical issue of information-technology-aided terrorism (cyber terrorism). The study presents the semantic interpretations of cyber terrorism, its components, the motivating factors behind it, the various cyber terrorism techniques usually adopted by the perpetrators (terrorists), and the consequences of such prevalent cyber attacks. The study equally suggests possible panaceas for tackling the problem of cyber terrorism so as to maximize the benefits of information technology (IT) and minimize the evil posed by the terrorists.
488 Performance Improvement in Multimedia Answering By Web Excavation, Aparna N., Prathima V. R.
Question Answering (QA) can be considered an alternative to information retrieval systems. It is a way of responding to a query asked in natural language with an accurate and precise result. The relevance of question answering lies in the downside of search engines, which return many irrelevant documents based on a few key terms. Community-based Question Answering (CQA) services are dedicated platforms where users with diverse backgrounds share information and knowledge and respond to other users' questions, resulting in a community where users interactively rate questions and answers. The downside of existing CQA forums is that most previous systems are text based and fail to provide the more detailed information that helps users understand things completely. Here I propose a model that provides answers from different CQA forums along with suitable multimedia information. In this model, the combination of media through which a question should be answered is first selected based on the question-and-answer pair; in the next stage, the most relevant keywords are selected based on the question and answer; in the final stage, appropriate multimedia information is collected from different web sources and presented to the user along with the textual information. Compared with many multimedia question answering approaches, this model mainly focuses on extracting textual answers from different web sources along with multimedia information; it is faster than other approaches, and the results are precise with appropriate media data.
489 Prevention of Session Hijacking and IP Spoofing With Sensor Nodes and Cryptographic Approach, Krunal P. Bhaturkar, Prof. Karuna G. Bagde
Many web applications available today make use of some form of session to communicate between the server and the client. Unfortunately, it is possible for an attacker to exploit a session in order to impersonate another user at a web application. Session hijacking is the most common type of attack in infrastructure-based networks; the confidentiality of user information is not preserved under this attack. A session hijacking attack is launched by creating a fake access point, so if we can detect the fake access point we can stop session hijacking, and various techniques have been proposed for this. In this paper, we present a new mechanism to detect fake access points using sensor nodes in the network. In this mechanism we also provide protection against IP spoofing through a public/private-key cryptographic key-exchange algorithm. We also discuss the results through simulations in Network Simulator.
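The abstract does not name the specific key-exchange algorithm, so the sketch below is only a minimal Diffie-Hellman exchange, a standard public/private-style key agreement; the constants `P` and `G` are small illustrative parameters, not values from the paper (real deployments use much larger primes).

```python
import secrets

# Toy public parameters, for illustration only (real systems use 2048-bit+ primes)
P = 0xFFFFFFFB  # a prime modulus
G = 5           # a generator

def keypair():
    """Each party picks a random private exponent and publishes G^a mod P."""
    private = secrets.randbelow(P - 2) + 2
    public = pow(G, private, P)
    return private, public

def shared_secret(my_private, their_public):
    """Both sides arrive at the same value: G^(ab) mod P."""
    return pow(their_public, my_private, P)

# A node and an access point each generate a keypair and exchange public values
a_priv, a_pub = keypair()
b_priv, b_pub = keypair()
assert shared_secret(a_priv, b_pub) == shared_secret(b_priv, a_pub)
```

The derived shared value can then seed a symmetric session key, so an eavesdropper who sees only the public values cannot impersonate either side.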
490 Document Clustering Using Concept Weight, Sapna Gupta, Prof. Vikrant Chole
Traditional techniques for clustering documents are mostly based on the number of occurrences and the existence of keywords. Similarly, phrase-based clustering ignores the semantics behind the words and captures only the order in which they appear in a sentence, while term-frequency-based clustering techniques treat documents as a bag of words, ignoring the semantic relationships between words. Considering the drawbacks of such systems, this paper proposes a concept-based clustering technique. The idea is to use the Medical Subject Headings (MeSH) ontology for concept extraction, with concept weights calculated from each concept's identity and its relationships with its synonyms. The K-medoid algorithm is used for semantic document clustering, through which the results are analyzed.
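As a rough illustration of the clustering step, here is a toy K-medoid selection over two-dimensional "concept weight" vectors; the vectors and the exhaustive search are illustrative simplifications, not the paper's MeSH-derived pipeline (real PAM swaps medoids iteratively instead of enumerating combinations).

```python
import itertools

def dist(a, b):
    """Euclidean distance between two concept-weight vectors."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def k_medoids(points, k):
    """Pick the k data points (medoids) minimising total distance
    from every point to its nearest medoid (fine for tiny toy data)."""
    best = None
    for medoids in itertools.combinations(points, k):
        cost = sum(min(dist(p, m) for m in medoids) for p in points)
        if best is None or cost < best[0]:
            best = (cost, medoids)
    return best[1]

# Hypothetical concept-weight vectors for five documents
docs = [(0.1, 0.2), (0.2, 0.1), (0.15, 0.15),   # one topical group
        (0.9, 0.8), (0.8, 0.9)]                  # another topical group
medoids = k_medoids(docs, 2)
```

Unlike K-means, the cluster centres here are actual documents, which keeps each cluster representative interpretable.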
491 An Analysis of Subspace Methods for Large South Indian Datasets, Krishna Murthy C. R., C. Naveena, T. C. Manjunath
Optical Character Recognition (OCR) is one of the important fields in the image processing and pattern recognition domain. Handwritten character recognition has always been a challenging task, and the complexity of accurately recognizing multilingual South Indian scripts makes it especially challenging for researchers because of the high degree of similarity between characters. This paper presents an analysis of subspace methods for the recognition of isolated handwritten multilingual South Indian scripts for the Kannada, Tamil, and Malayalam languages. The study was carried out on a large dataset containing 33,640 handwritten samples. The proposed method preprocesses the 841 character classes obtained from scanned documents of these scripts. Both Principal Component Analysis (PCA) and Fisher Linear Discriminant Analysis (FLDA) are used to extract character features. For classification, a Probabilistic Neural Network (PNN) is used in combination with both the PCA and FLDA feature extraction methods, and the classification performance of PNN with each feature set is analyzed and discussed.
492 A Simplified Review on Fast HSV Image Color and Texture Detection and Image Conversion Algorithm, Monika Deswal, Neetu Sharma
In order to identify the perceived qualities of texture and color when building mathematical models of objects, an optimized and efficient algorithm, 'A Fast HSV Image Color and Texture Detection Algorithm', based on color intensity and using artificial intelligence and fuzzy logic, is presented in this paper. We use a color-intensity method in place of the conventional method. The algorithm aims to integrate the detection of image color with the detection of texture using AI and fuzzy logic. Color detection has been one of the most widely researched areas in computer science. In computer vision there are several pre-existing color models for specifying colors, such as RGB, CMY, and HSV. This paper presents color detection using the HSV-based (hue, saturation, value) color model, since it greatly decreases the size of the color and grey-scale information of an image. This paper can be treated as a reference for gaining in-depth knowledge of color and texture detection.
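For reference, the RGB-to-HSV conversion underlying such an HSV-based model can be sketched with Python's standard `colorsys` module; the degree-scaled wrapper below is our own convenience, not something from the paper.

```python
import colorsys

def rgb_to_hsv_degrees(r, g, b):
    """Convert 8-bit RGB to HSV with hue in degrees.
    colorsys works on floats in [0, 1] and returns hue as a fraction of a turn."""
    h, s, v = colorsys.rgb_to_hsv(r / 255, g / 255, b / 255)
    return h * 360, s, v

# Pure red sits at hue 0 with full saturation and full value
h, s, v = rgb_to_hsv_degrees(255, 0, 0)
```

Separating hue from saturation and value is what lets a detector match "red" regardless of lighting intensity, which is awkward to do directly in RGB.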
493 A Framework for MEMS Based Hand Gesture Recognition System for Controlling the Mouse Cursor using Wireless Technology, Khushbu Agrawal, Vikrant Chole
Hand gesture recognition provides an intelligent, natural, and convenient way of human-computer interaction (HCI). Sign language recognition (SLR) and gesture-based control are two major applications of hand gesture recognition technologies. SLR aims to interpret sign languages automatically by computer in order to help the deaf communicate with hearing society conveniently. Since sign language is a highly structured and largely symbolic set of human gestures, SLR also serves as a good basis for the development of general gesture-based HCI. It provides a means of communication between human and computer for operating a mouse cursor on the screen of a PC.
494 Intrusion Detection using Fuzzy Data Mining, Sandeep Dhopte, Prof. Manoj Chaudhari
With the rapid expansion of computer networks during the past few years, security has become a crucial issue for modern computer systems. A good way to detect illegitimate use is by monitoring unusual user activity. The solution is an Intrusion Detection System (IDS), which identifies attacks and reacts by generating an alert or blocking the unwanted data. For the IDS, a genetic algorithm generates the large number of rules required for anomaly intrusion detection. These rules detect Denial of Service and Probe attack connections with high accuracy, and identify U2R and R2L connections with appreciable accuracy. After obtaining these rules, we apply fuzzy data mining techniques to build a fuzzy-data-mining-based intrusion detection model. The findings from this experiment give promising results for applying GA and fuzzy data mining to network intrusion detection. The performance of the proposed system is measured using the standard KDD 99 dataset.
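A basic building block of such fuzzy techniques is a membership function mapping a crisp network feature to a degree in [0, 1]. The sketch below uses a standard triangular membership function with a hypothetical "high connection rate" fuzzy set; the thresholds are invented for illustration and are not from the paper.

```python
def tri_membership(x, a, b, c):
    """Degree to which x belongs to a fuzzy set peaking at b
    and dropping to zero outside [a, c] (triangular membership)."""
    if x <= a or x >= c:
        return 0.0
    if x <= b:
        return (x - a) / (b - a)
    return (c - x) / (c - b)

# Hypothetical fuzzy set "HIGH connection rate" for an IDS feature
high = lambda rate: tri_membership(rate, 50, 100, 150)
```

Fuzzy rules then combine such degrees (e.g. "IF rate is HIGH AND failed-logins is HIGH THEN alert"), letting the IDS flag borderline behaviour instead of relying on hard thresholds.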
495 A Simplified Review on Study and Implementation of Recognizing Terrorist by Rapid Detection by Activities and their Facial Match, Ritu Kanwa, Samit Yadav
Security is an important aspect of all environments and is closely related to safety. Our topic, "Study and Implementation of Recognizing Terrorists by Rapid Detection of Activities and Facial Match", applied to various kinds of faces, is presented in this paper. Recognizing terrorists is one of the major tasks of any government agency, and its implementation is one of the most influential works of humankind. Our approach detects faces and analyses people's activities over a fifteen-second video capture. Each person who comes under the camera is recognized separately; their activity is matched against the record of each suspect, and facial matching, one of the most important steps, is performed across every kind of change, such that a match can be predicted even after plastic surgery. The study of face recognition falls into three stages: face normalization, feature extraction, and face matching.
496 Security in WLAN - Review of Security and Throughput Tradeoff, Avinash Kaur, Harwant Singh
An IEEE 802.11 WLAN is a group of wireless nodes located within a moderate physical area. It provides increased productivity, lower installation cost, and mobility. The open-air nature of the wireless channel, which gives us the great convenience of mobility, also makes it vulnerable to attack. Various security issues are described in this paper. The tradeoff between security and throughput is the main focus in WLANs: as security increases, throughput gradually decreases. Various techniques that reduce this problem are discussed.
The database domain and the Internet have been revolutionized over many years, and a large number of people now exchange information over the internet. This rapid change and the cost-effectiveness of newly evolving technologies are providing large opportunities for developing distributed database systems. These large-scale systems are made up of various interacting components, each of which is well encapsulated. However, this rapid growth has also brought many security issues, as data made available on the open Internet is delicate.
498 Content Based Image Retrieval using Color Feature Extraction with KNN Classification, Ms. Pragati Ashok Deole, Prof. Rushi Longadge
An image retrieval system is an effective and efficient tool for managing large image databases. A content-based image retrieval system allows the user to present a query image in order to retrieve images stored in the database according to their similarity to the query. Content-Based Image Retrieval (CBIR) is a technique that uses visual features of an image, such as color, shape, and texture, to find the images a user requires in a large image database according to the user's request in the form of a query. In this paper, a content-based image retrieval method is used to retrieve images matching a query image from a large database using three features: color, shape, and texture. The main objective of this paper is the classification of images using the K-nearest neighbors (KNN) algorithm.
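A minimal sketch of the KNN classification step, assuming toy three-bin colour-histogram features; the feature design and the class names are invented for illustration and are not the paper's.

```python
from collections import Counter

def knn_classify(query, examples, k=3):
    """examples: list of (feature_vector, label) pairs.
    Majority vote among the k nearest neighbours by squared Euclidean distance."""
    dist = lambda a, b: sum((x - y) ** 2 for x, y in zip(a, b))
    nearest = sorted(examples, key=lambda ex: dist(ex[0], query))[:k]
    return Counter(label for _, label in nearest).most_common(1)[0][0]

# Hypothetical 3-bin colour-histogram features with invented labels
train = [((0.9, 0.1, 0.0), "sunset"), ((0.8, 0.2, 0.1), "sunset"),
         ((0.1, 0.2, 0.9), "ocean"),  ((0.0, 0.3, 0.8), "ocean")]
label = knn_classify((0.85, 0.15, 0.05), train, k=3)   # → "sunset"
```

In a real CBIR pipeline the feature vector would concatenate colour, shape, and texture descriptors; KNN itself is unchanged.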
499 A Novel Patient Centric Framework for Data Access Control in Semi-trusted Cloud Servers, Allamaprabhu G Rudraxi, Mr. Parikshith Nayak S. K.
Cloud storage is a model of networked online storage where data is stored in virtualized storage pools generally hosted by third parties. The personal health record (PHR) is an emerging patient-centric model of health information exchange. Issues such as risks of privacy exposure, scalability in key management, flexible access, and efficient user revocation remain the most important challenges, so we propose a novel patient-centric framework and a suite of mechanisms for data access control to PHRs stored in semi-trusted servers. To achieve fine-grained and scalable data access control for PHRs, we leverage attribute-based encryption (ABE) techniques to encrypt each patient's PHR file. This project also supports multiple-owner scenarios and divides the users in the system into multiple security domains, which greatly reduces key-management complexity for owners and users. We implement this system and evaluate it atop the DriveHQ cloud to demonstrate that our new system provides secure data access over outsourced data.
500 Network Security Analysis Based on Authentication Techniques, Anupriya Shrivastava, M A Rizvi
Network security issues are becoming important as society moves into the digital information age. Data security is the most critical component in ensuring safe transmission of information through the internet. It comprises authorization of access to information in a network, controlled by the network administrator. The task of network security requires ensuring the security not only of end systems but of the entire network. Authentication is one of the primary and most common ways of ascertaining and ensuring security in a network. In this paper, an attempt has been made to analyze various authentication techniques such as knowledge-based, token-based, and biometric-based approaches. Furthermore, we consider multi-factor authentication, formed by combining the above techniques, and compare the options.
501 An Effective Approach on Scheduling Algorithm in Cloud Computing, Suman Sangwan, Sunita Sangwan
Cloud is the most effective technology of recent times and attracts many users and organizations. Because of the different opportunities and characteristics provided by cloud computing, it has captured the attention of all business-market stakeholders. A main problem in cloud computing is task scheduling. Scheduling is the most important task for a computer system, as it decides the order of process execution when different processes are kept in a queue. Effective scheduling gives better resource utilization, which in turn gives better results to users. The need for a scheduling algorithm arises from the requirement of most modern systems to perform multitasking and multiplexing. In this paper we describe different scheduling algorithms.
502 Intruder Proof and Secure Cryptography, Ravinder Sangwan, Anshul Anand
Cryptography is a method of storing and transmitting data in a form that only its intended recipients can read and process. It is the science of protecting information by encoding it into an unreadable format. Although the ultimate goal of cryptography, and of the mechanisms that make it up, is to hide information from unauthorized individuals, most algorithms can be broken and the information revealed if the attacker has enough time, desire, and resources. This paper presents ways of protecting information from:
• Intruders: those who capture packets and alter the information.
• Cryptanalysts: those who decrypt ciphertext into plaintext without the key.
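As a toy illustration of both points, the repeating-key XOR cipher below round-trips a message with its key, yet is deliberately weak: a cryptanalyst can break it with frequency analysis, which is exactly why real systems use modern ciphers instead.

```python
def xor_cipher(data: bytes, key: bytes) -> bytes:
    """Toy repeating-key XOR: the same function both encrypts and decrypts.
    NOT a real cipher -- included only to show the encode/decode round trip
    and the kind of scheme a cryptanalyst breaks without the key."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

msg = b"transfer 500 to account 42"
key = b"s3cret"
cipher = xor_cipher(msg, key)
assert cipher != msg                      # unreadable without the key
assert xor_cipher(cipher, key) == msg     # recovers the plaintext with the key
```

Because the key repeats, letter-frequency patterns of the plaintext leak into the ciphertext, giving the attacker a foothold; modern block and stream ciphers are designed to remove exactly this structure.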
Discovering relationships between entities using Wikipedia is among the most significant current issues in the field of information retrieval. Keyword-based web page search is well developed, whereas searching for information about a particular entity has only recently been investigated. Searching Wikipedia is often a better option for a user seeking information about a particular entity than typical search engines. A novel system is designed for computing a relationship by reflecting the full notions of distance, connectivity, and co-citation. It uses a generalized maximum flow on the information network to compute the strength of a relationship between entities by evaluating the flow. Cohesion-based techniques were not adopted, since they penalize entities with high degrees even though such entities are significant for a number of relationships in Wikipedia. Entities related through common entities become mutually connected when the direction of each edge is reversed, and hence co-occurrence is treated as the reverse of co-citation.
504 Achieving Optical Fiber Communication Experiments by OptiSystem, Alla Abbas Khadir, Baydaa F. Dhahir, Xiquan Fu
Recently, optical fiber communication technology has made great progress; constant exploration of new technologies has greatly enhanced communication capabilities in the traditional sense, and optical fiber communication technology has been applied in a broader context. Deployment of optical communication systems is costly, and reconfiguration is in some cases impossible or uneconomical; therefore, experiments on and simulation of such systems have become a necessity for predicting and optimizing system performance. This paper highlights the implementation of these technologies using the OptiSystem program, which increases proper understanding of the mechanism of action of each optical communication component and provides more freedom to explore design parameters than analytic calculations and physical experiments, thus allowing students to develop an intuitive understanding of optics rapidly.
505 Metadata-Based Modeling Approach: A Multi-Viewpoints UML Profile, Ali Hassan Muosa
This paper presents a process associated with UML that deals with structural aspects in a methodical way; the development process offers complete multi-view modeling by addressing both structural and behavioral aspects. Previous work saved all features of a multi-view system (MVS) in a metadata-based model as a matrix/cross-table but did not address the behavioral aspects of the modeling process. The metadata-based approach addresses a structural aspect related to sharing features across views and sharing data, without specifying how these views will respond or how to synchronize them in order to obtain the behavior of multi-view objects (instances of a multi-view system). The work in this paper aims to fill this gap by providing a new mechanism in the UML profile that allows expressing the behavioral needs of a system. We focus on describing the individual behavior of multi-view objects using Actors-Views, which requires adjustments to UML modeling concepts.
506 A Review on GPS and its Applications in Computer Science, Pooja Singal, Dr. R. S. Chhillar
The Global Positioning System (GPS) is a satellite-based navigation system. GPS is used in guidance and mapping applications and in location services. Nowadays, GPS not only provides position but is also used in various computer science applications, such as wireless video processing and monitoring using mobiles, localization of automobiles, fisheries and marine studies, IEEE 802.11 networks, and auto-regressive ionospheric prediction models. GPS and all the above applications are discussed in this paper.
507 A Secure Routing Protocol for Wireless Adhoc Network Creation, Rajashekharagouda Patil, Shreedharamurthy S. K.
An ad-hoc wireless network is an infrastructure-less network, i.e., there is no centralized coordination of network operations. As and when a new node comes into the vicinity of the network, it spontaneously joins the network. This paper presents a secure protocol for spontaneous wireless ad-hoc networks which uses a hybrid symmetric/asymmetric scheme and the trust between users in order to exchange the initial data and the secret keys that will be used to encrypt the data. Our proposal is a complete self-configured secure protocol that is able to create the network and share secure services without any infrastructure. Network creation stages are detailed, and the communication, protocol messages, and network management are explained. Our proposal has been implemented in order to test the protocol procedure and performance.
The wavelet transform is used to analyze images. The watershed transformation, a useful morphological segmentation tool, is applied to a variety of grey-scale images. An efficient segmentation method is presented based on modified multiresolution analysis, which combines pyramidal image segmentation with a hierarchical watershed segmentation algorithm. The segmentation procedure consists of pyramid representation, image segmentation, region merging, and region projection. Each layer is split into a number of regions by a root-labeling technique, and boundaries are extracted by thresholding. Morphological operations are used to smooth the image.
509 A Modified Version of DMAC Clustering Algorithm in MANET, Ira Nath, Trisha Gorai
A Mobile Ad-hoc Network (MANET) is formed when a group of mobile wireless nodes collaborate to communicate through wireless links in the absence of fixed infrastructure and centralized control. These characteristics enable it to adapt and operate in difficult conditions. It is vital to keep the topology stable as long as possible. In this paper a new approach is adopted for cluster formation by modifying the DMAC clustering algorithm. This algorithm can result in a more stable configuration and thus yield better performance.
510 A Comparative Analysis of Pro-active Routing Protocols in MANET, B. Kondaiah, Dr. M. Nagendra
A mobile ad hoc network (MANET) is a collection of mobile nodes connected through a wireless medium, forming rapidly changing topologies: the nodes form a network without any fixed infrastructure. Routing in MANETs is therefore a critical task due to the highly dynamic environment, and efficient routing protocols make MANETs reliable. Routing protocols in MANETs are classified into three categories: proactive (table-driven), reactive (on-demand), and hybrid. Here we discuss the proactive routing protocols DSDV and WRP. This paper provides an overview of these protocols by presenting their characteristics, functionality, and benefits, followed by a comparative analysis; the protocols are measured with suitable metrics.
511 A New ANN, GRNN and RBF Neural Network for Heart Disease Diagnosis, Pramod Kumar Yadav, Dr. K. L. Jaiswal, Dr. Kamlesh Singh, Satish Pandey
In this paper, two types of artificial neural networks (ANNs), the Generalized Regression Neural Network (GRNN) and the Radial Basis Function (RBF) network, have been used to prescribe medicine for heart disease. Diagnosing heart disease and prescribing medicine on the basis of symptoms is a very challenging task, even for capable physicians. The training capacity of these two techniques and the medicines they suggest are compared with the original medicines prescribed by a heart specialist. Data from about 300 patients were collected from S.G.M. Hospital, Rewa, under a doctor's supervision; the study includes detailed patient information, and preprocessing was performed. The GRNN and RBF were applied to this patient data to output the medicine. The results of this evaluation show that the overall performance of RBF makes it applicable for prescribing medicine for heart disease patients.
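A GRNN's output is a Gaussian-kernel-weighted average of the training targets (the Nadaraya-Watson form). The sketch below uses a hypothetical one-dimensional symptom-score-to-dosage mapping invented for illustration, not the paper's patient records.

```python
import math

def grnn_predict(x, train_x, train_y, sigma=0.5):
    """GRNN prediction: Gaussian-kernel-weighted average of training targets.
    sigma is the smoothing parameter (the only parameter a GRNN needs)."""
    weights = [math.exp(-(x - xi) ** 2 / (2 * sigma ** 2)) for xi in train_x]
    return sum(w * y for w, y in zip(weights, train_y)) / sum(weights)

# Hypothetical 1-D training data: symptom score -> dosage
xs = [1.0, 2.0, 3.0, 4.0]
ys = [10.0, 20.0, 30.0, 40.0]
pred = grnn_predict(2.5, xs, ys, sigma=0.3)
```

Unlike an RBF network, a GRNN needs no iterative weight training: all training pairs are stored and only `sigma` is tuned, which is why it is attractive for small medical datasets.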
512 A Rule Based Clustering and Routing Approach to Improve Clustered Network Communication, Shafali Vashist, Mr. Samit Yadav
Many challenges are associated with sensor networks. Communication in a sensor network is controlled by a routing protocol, so to achieve effective communication over the network, selection of a suitable protocol is required. In this paper, sensor networks are classified according to their applications and requirements. The paper categorizes all available protocols into related classes, and each protocol class is explored.
513 Improved Accuracy for Decision Tree Algorithm Based on Unsupervised Discretization, Ihsan A. Kareem, Mehdi G. Duaimi
A decision tree is an important classification technique in data mining. Decision trees have proved to be valuable tools for the classification, description, and generalization of data, and work on building decision trees exists in multiple disciplines such as signal processing, pattern recognition, decision theory, statistics, machine learning, and artificial neural networks. This paper deals with the problem of finding the parameter settings of a decision tree algorithm in order to build an accurate tree. The proposed technique is an unsupervised filter: the suggested discretization is applied to the C4.5 algorithm before constructing a decision tree. The improvement to C4.5 includes two phases. The first phase discretizes all continuous attributes instead of dealing directly with numerical values; the second phase constructs a decision tree model and evaluates its performance. The technique has been tested on three data sets, all picked from the popular UCI (University of California at Irvine) data repository. The results obtained from the experiments show that C4.5 after discretization performs better than C4.5 before discretization.
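The first phase can be illustrated with the simplest unsupervised filter, equal-width binning, sketched here on a hypothetical "age" attribute; the paper does not specify its exact discretization, so treat this as one representative choice.

```python
def equal_width_bins(values, n_bins):
    """Unsupervised equal-width discretization: map each continuous
    value to a bin index 0..n_bins-1 over the observed range."""
    lo, hi = min(values), max(values)
    width = (hi - lo) / n_bins or 1.0   # guard against an all-equal column
    return [min(int((v - lo) / width), n_bins - 1) for v in values]

# Hypothetical continuous attribute discretized into three nominal bins
ages = [15, 22, 37, 41, 58, 63, 80]
bins = equal_width_bins(ages, 3)   # → [0, 0, 1, 1, 1, 2, 2]
```

After this step, C4.5 treats each bin as a nominal value, so it no longer has to search for numeric split thresholds at every node.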
514 Improved Accuracy and User Satisfaction by Inferring User Search Goals based on Feedback Sessions, Ms. S. S. Jadhav, Prof. N. D. Kale
User search goals can be defined as the information on various aspects of a query that users want to obtain, and can be considered the collection of information needs for a query. Different users may have different search goals in mind when they pass an ambiguous query to a search engine. Thus, it is important to infer and analyze user search goals to improve search engine performance and user experience. By clustering the proposed feedback sessions, we infer different user search goals for a query. A feedback session is a combination of both clicked and unclicked URLs, and each feedback session is mapped to a pseudo-document to better represent the user's information needs. These pseudo-documents are clustered using the bisecting K-means clustering algorithm, which produces better results than K-means clustering and reduces computation time. Finally, the Classified Average Precision (CAP) evaluation criterion is used to evaluate system performance. In this way, the proposed system can infer user search goals efficiently and satisfy users' information needs. Experimental results using user click-through logs from a commercial search engine validate the effectiveness of the proposed methods.
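Bisecting K-means starts from a single cluster and repeatedly splits the largest cluster with plain 2-means until the desired number of clusters is reached. A compact sketch on toy 2-D points (standing in for the pseudo-documents, which in the paper are high-dimensional term vectors) might look like:

```python
import random

def kmeans2(points, iters=20, seed=0):
    """Plain 2-means: split one group of points into two clusters."""
    rng = random.Random(seed)
    centers = rng.sample(points, 2)
    for _ in range(iters):
        groups = [[], []]
        for p in points:
            d = [sum((a - b) ** 2 for a, b in zip(p, c)) for c in centers]
            groups[d.index(min(d))].append(p)
        # Recompute each centroid; keep the old center if a group emptied
        centers = [tuple(sum(x) / len(g) for x in zip(*g)) if g else c
                   for g, c in zip(groups, centers)]
    return [g for g in groups if g]

def bisecting_kmeans(points, k):
    """Repeatedly split the largest cluster with 2-means until k clusters."""
    clusters = [list(points)]
    while len(clusters) < k:
        biggest = max(clusters, key=len)
        clusters.remove(biggest)
        clusters.extend(kmeans2(biggest))
    return clusters

pts = [(0, 0), (0, 1), (1, 0), (10, 10), (10, 11), (11, 10)]
clusters = bisecting_kmeans(pts, 2)
```

Each split only runs 2-means on one cluster, which is why the divisive approach is typically cheaper than re-running full K-means for every candidate k.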
515 Which one is better - JavaScript or jQuery?, Md. Zeeshan Ahmed
JavaScript is a programming language with direct support for object-oriented methodologies. Basically, JavaScript is a computer language that runs in most web browsers. jQuery is not a language; in fact, it is well-written JavaScript code. As quoted on the official jQuery website, "it is a fast and concise JavaScript Library that simplifies HTML document traversing, event handling, animating, and Ajax interactions for rapid web development".
516 IDS Implementation in a Private Cloud, Mr. Ashish Kumbhare, Mr. Manoj Chaudhari
Cloud computing is a rising field because of its performance, high availability, low cost, and many other advantages. Despite this, companies hold their business back from cloud computing because of the dread of data leakage: the lack of a proper security control policy and weaknesses in protection lead to many vulnerabilities in cloud computing. This thesis focuses on the problem of data leakage and proposes a framework that works in two phases. The first phase, 'Threats in Cloud Computing', launches attacks on the cloud server and analyzes their impact on the server side; the attacks launched on the cloud server are DDoS, ICMP, and malware attacks. The second phase is 'Data Security', in which data arrangement is done by the client before storing the data.
517 Rician Noise Reduction in MRI Images using Wave Atom Transform, Shashi Jangra, Mr. Samit Yadav
Magnetic resonance imaging is a medical imaging technique that measures the response of the atomic nuclei of body tissues to high-frequency radio waves when placed in a strong magnetic field, and produces images of the internal organs. De-noising is always a challenging problem in magnetic resonance imaging and is important for clinical diagnosis and computerized analysis, such as tissue classification and segmentation. It is well known that the noise in magnetic resonance imaging has a Rician distribution. In this paper, an improved de-noising technique using wave atom shrinkage is proposed for magnetic resonance images highly corrupted with Rician noise.
518 An Effective Method for Load Balancing in MANET, Dr. M.V. Siva Prasad, P. Niranjan, B. Swathi
A mobile ad hoc network (MANET) is a collection of wireless mobile hosts forming a temporary network without the aid of any stand-alone infrastructure or centralized administration. Mobile ad-hoc networks are self-organizing and self-configuring multi-hop wireless networks in which the structure of the network changes dynamically. A routing protocol in a MANET should fairly distribute the routing tasks among mobile hosts. Most current routing protocols for mobile ad-hoc networks consider the shortest path with minimum hop count as the optimal route, without any consideration of a particular node's traffic, thus degrading performance by causing serious problems for mobile nodes such as congestion, power depletion and queuing delay. Therefore it is very attractive to investigate routing protocols which use a routing metric to balance load in ad-hoc networks. We present various load-balanced routing protocols for efficient data transmission in MANETs.
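As a minimal sketch of the idea behind load-aware route selection (the function and the load metric are illustrative assumptions, not any specific protocol from the survey), a route can be chosen by its bottleneck node load rather than by hop count alone:

```python
def least_loaded_route(routes, load):
    """Pick the route whose most congested node carries the least load,
    breaking ties by hop count. `load` maps node -> queued packets."""
    return min(routes, key=lambda r: (max(load[n] for n in r), len(r)))
```

A shortest-path protocol would pick the two-hop route even when its relay node is congested; this metric prefers the longer but lighter path.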
519 Data Encryption Methods Used in Secure ATM Transactions, CH. Krishna Prasad, G. Srinivasa Rao, Dr. M.V. Siva Prasad
Data security is an important issue in the current scenario of banking operations, especially for transactions involving secure and confidential data, which must be sent with high security at the time of communication. In this paper we discuss various types of encryption methods and standards used to secure banking data transmissions. In particular, we discuss the communication security methods used between an automated teller machine (ATM) and the bank server: when data is transmitted from an ATM to the bank server it must be sent in encrypted form so that an unauthorized user cannot access the secure information during communication. This paper explains how data transactions can be made more secure with the different security techniques used in ATM transactions, and surveys the security levels and encryption standards used in banking data transaction security. Encryption methods are built into the communication network to prevent unauthorized transactions and protect the data from unauthorized access. The paper focuses on the Data Encryption Standard (DES) and the Advanced Encryption Standard (AES), the encryption standards used by banks to protect data and secure its transmission.
520 Image De-noising and its Methods: A Survey, Barjinder Kaur, Manshi Shukla
An image is considered a collection of information stored as intensities, and the occurrence of noise in the image degrades its quality. The basic idea behind image processing is how to estimate the correct pixel values. Image de-noising is one of the fundamental problems faced in image processing and computer vision. Various de-noising methods have been employed to remove noise from an image, and much research is still ongoing to find the best method for removing noise while preserving fine details. In this paper we discuss different noise models and the methods to remove these noises, with their various advantages and disadvantages.
521 A Comparison of Performance Metrics for Various Routing Protocols in MANET, Hemant Rai
MANET (Mobile Ad Hoc Network) is an independent set of mobile users that communicate through wireless connections. The network topology of a MANET changes rapidly over time. Since the nodes are mobile, packet delivery ratio, energy consumption and delay are important constraints of a MANET. Routing is the key area where energy can be saved and delay decreased. This paper presents a comparison of performance metrics for the DSDV, AODV and ZRP routing protocols for MANET.
522 An Elegant Fusion of Concurrent Crawling and Page Rank Technique for Spidering Websites, Smita Marwadi, Mr. Neelabh Sao
The World Wide Web is expanding day by day. With the great growth of the Web, it has become a massive challenge for all-purpose single-process crawlers (a crawler is a program that downloads and stores Web pages, often for a Web search engine) to locate resources that are precise and relevant in a reasonable amount of time, so more enhanced and convincing algorithms are in demand. It thus becomes vital to improve the crawling procedure in order to finish downloading pages in a sensible amount of time. We describe a Web crawler which employs multi-processing to allow multiple crawler processes to run concurrently. We propose a resourceful concurrent crawler that is a fusion of PageRank and a concurrent multi-process crawler, offering a means to efficiently crawl the Web and presenting a scalable solution that allows crawl speeds to be tuned as needed.
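A hedged sketch of the fusion the abstract describes (threads stand in for the paper's crawler processes, precomputed scores stand in for PageRank, and `fetch` stands in for the real HTTP download; none of this is the authors' code): worker threads share a priority frontier that always yields the highest-scored unvisited URL:

```python
import queue
import threading

def crawl(seed_scores, links, fetch, workers=4):
    """Concurrent crawler sketch: a shared priority frontier (highest score
    first) feeds several worker threads until every reachable URL is fetched.
    `links` maps url -> [(outlink, score), ...]."""
    frontier = queue.PriorityQueue()
    for url, score in seed_scores.items():
        frontier.put((-score, url))          # negate: PriorityQueue pops lowest
    seen, pages, lock = set(seed_scores), {}, threading.Lock()

    def worker():
        while True:
            _, url = frontier.get()          # block until work is available
            pages[url] = fetch(url)
            with lock:                       # frontier bookkeeping is shared
                for out, score in links.get(url, []):
                    if out not in seen:
                        seen.add(out)
                        frontier.put((-score, out))
            frontier.task_done()

    for _ in range(workers):
        threading.Thread(target=worker, daemon=True).start()
    frontier.join()                          # wait until all queued URLs are done
    return pages
```

The `task_done`/`join` pairing lets the main thread wait for URLs that workers discover mid-crawl, which is what makes the concurrent frontier safe to drain.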
523 An Elegant Draw Near to Improve the Design of an E-commerce Website Using Web Usage Mining and K-Means Clustering, S. Divya Rajan, Mr. Neelabh Sao
Web Mining is an enormous field that helps us to understand a range of concepts from different fields. Web Usage Mining techniques address diverse issues of business intelligence, including marketing proficiency as domain knowledge, and are specifically designed for electronic commerce purposes. The growing reputation of e-commerce makes data mining a requisite technology for several applications, especially online business competitiveness. The World Wide Web provides profuse raw data in the form of web logs. Nowadays many business applications utilize data mining techniques to extract useful business information on the web, having evolved from web searching to web mining. This paper introduces a web usage mining system to group user information based on transactional data by applying association and K-means clustering data mining algorithms.
524 Adaptive Image De-Noising Model Based on Multi-Wavelet with Emphasis on Pre-Processing, Shubhra Soni, Ahsan Hussain
The field of signal and image processing naturally deals with image de-noising. An image may be corrupted by noise, poor illumination, high temperature, or transmission errors. The ability of multi-wavelet techniques to capture the energy of a signal provides a better solution for de-noising natural images corrupted by Gaussian noise. Multi-wavelets can satisfy both symmetry and asymmetry, which are very important characteristics in signal processing. The image is de-noised most effectively when the noise level is low. Normally, the image's energy is concentrated in the low-frequency band, while both noise and fine details are spread over the high-frequency band, so a hard threshold is applied in the high-frequency sub-bands at various scales. This paper investigates the suitability of various wavelet and multi-wavelet bases, and of different neighborhood sizes, for image de-noising in terms of PSNR. Finally it compares wavelet and multi-wavelet techniques and shows that the multi-wavelet technique produces the best de-noised image in terms of PSNR.
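Two ingredients of the abstract are simple enough to sketch directly (this is a generic transform-domain illustration on flat coefficient lists, not the paper's multi-wavelet pipeline): hard thresholding of sub-band coefficients, and the PSNR score used to compare results:

```python
import math

def hard_threshold(coeffs, thresh):
    """Hard thresholding used in transform-domain de-noising: coefficients
    whose magnitude falls below `thresh` are treated as noise and zeroed;
    the rest are kept unchanged."""
    return [c if abs(c) >= thresh else 0 for c in coeffs]

def psnr(orig, restored, peak=255.0):
    """Peak signal-to-noise ratio between two equally sized images,
    given here as flat lists of pixel values."""
    mse = sum((a - b) ** 2 for a, b in zip(orig, restored)) / len(orig)
    return float('inf') if mse == 0 else 10 * math.log10(peak ** 2 / mse)
```

In a full pipeline the thresholding would be applied to the high-frequency sub-bands of the forward transform before inverting it.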
This paper presents a review of steganography and various steganography techniques used for data compression. The purpose is to have a deep study of various steganographic techniques used for data compression. The main objective is to find out a technique, which can hide a large amount of data. To fulfil the purpose, various researches and projects done earlier are taken into consideration.
526 FIRE-ROUTER: A NEW SECURE INTER-NETWORKING DEVICE, Er. Shikha Pandit, Er. Pritam Kumar, Er. Deepak Malik
As networking is the backbone of the computer industry, every industry nowadays depends on networking for its development and expansion. Networking includes special devices for specific tasks: routers for path selection, switches for better connectivity, and firewalls for better security. With increasing spoofing and snooping threats, the need for security grows day by day, and implementing better security at minimum cost has become the bottleneck for engineers. In this paper we discuss the working of some important networking devices and propose a new device to deal with the problem discussed above.
527 Improved Accuracy Distribution Localization in Wireless Sensor Networks, Asmaa Q. Shareef, Maad M. Mijwel
A localization system is important when there is uncertainty about the exact location of some fixed or mobile devices. In this paper, the problem of estimating the positions of randomly deployed nodes of a wireless sensor network (WSN) is addressed. A proposed algorithm named Improved Accuracy Distribution localization for wireless sensor networks (IADLoc) is presented; it minimizes the localization error rate without any additional hardware cost, with minimum energy consumption, and with a decentralized implementation. IADLoc is a combined range-free and range-based localization algorithm that uses both types of antenna (directional and omni-directional). It allows sensors to determine their location based on the region of intersection (ROI): the beacon nodes send their information to the sink node, which forwards it to the sensors according to the antenna.
As sensor networks may handle sensitive data and operate in hostile unattended environments, it is imperative that security concerns be addressed from the beginning of the system design. But sensor networks also face severe resource constraints due to their limited data storage and power. Both represent major obstacles to implementing traditional computer security techniques in a wireless sensor network, so there has to be some compromise between security and energy. Many symmetric protocols have been implemented for sensor networks; the major problem with symmetric-key security protocols is key distribution. Asymmetric protocols like RSA have not been implemented due to their high power and memory requirements. In this paper, RSA is implemented for sensors in an efficient manner using optimized computation: the cost of exponentiation with large encryption and decryption keys is optimized, which reduces the computational cost of RSA. The design also reduces the power consumption of the cluster head by decrypting the information message at the base station of the cluster node, which uses the public key of the corresponding cluster node to encrypt the message.
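The optimization the abstract alludes to, replacing a naive power function with modular exponentiation by repeated squaring, can be sketched with textbook RSA (the tiny primes and helper names here are illustrative only; real keys use primes hundreds of digits long, and this is not the paper's implementation):

```python
def rsa_keypair(p, q, e=65537):
    """Toy RSA key generation from two primes p, q."""
    n, phi = p * q, (p - 1) * (q - 1)
    d = pow(e, -1, phi)          # modular inverse of e (Python 3.8+)
    return (e, n), (d, n)

def rsa_encrypt(m, pub):
    e, n = pub
    return pow(m, e, n)          # square-and-multiply: never forms m**e fully

def rsa_decrypt(c, priv):
    d, n = priv
    return pow(c, d, n)
```

The key point for constrained sensors is the three-argument `pow(m, e, n)`: a naive `m ** e % n` would materialize an astronomically large intermediate integer, while modular exponentiation keeps every intermediate value below n.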
529 A Survey on Data Storage and Security in Cloud Computing, V. Spoorthy, M. Mamatha, B. Santhosh Kumar
Cloud computing has been envisioned as the next-generation architecture of the IT enterprise. It moves application software and databases to large data centers, where the management of the data and services may not be fully trustworthy. This poses many new security challenges which have not yet been fully addressed. In this paper, we mainly focus on providing security for data storage in the cloud, on architectures for data storage implemented by other service providers and vendors in the cloud, and on the key points for proving security of data storage.
530 Cloud Testing: A Review Article, Radhika Batra, Naveen Sharma
Today, cloud computing is emerging as a new technology in the corporate world. Cloud testing is a form of software testing in which web applications use cloud computing environments (i.e. a "cloud") to handle real-world user traffic by using various cloud technologies and solutions. Cloud computing offers an opportunity for testing as a service (TaaS) for SaaS and clouds. At the same time, it raises new issues, challenges and needs in software testing, particularly in the fields of 1) testing clouds and 2) cloud-based applications. In this paper we discuss some basic concepts of cloud testing, its types, and the major issues and challenges in this field. The paper also sheds light on the benefits of cloud testing over conventional software testing.
531 Earthquake Reporting System by Using Real Time Nature of Twitter, SK. Haseena, P. Sandeep Reddy, Dr. M.V. Siva Prasad
Twitter is a popular microblogging service whose important characteristic is its real-time nature. We analyze the real-time interaction of events such as earthquakes on Twitter and propose an algorithm to monitor tweets and detect a target event. To detect a target event, we devise a classifier of tweets based on features such as the keywords in a tweet, the number of words, and their context. Subsequently, we propose a probabilistic spatiotemporal model for the target event that can find the center of the event location. We regard each Twitter user as a sensor and apply particle filtering, which is widely used for location estimation. The particle filter works better than other comparable methods for estimating the locations of target events.
Next-generation Satellite Routed Sensor Systems (SRSS) are expected to provide disaster detection with high real-time performance. By using satellite networks, SRSS realizes data collection from multiple sensor terminals deployed over a wide area. However, an efficient access control scheme is needed to achieve multiple access from numerous sensor terminals to the satellite with its limited bandwidth. In order to resolve these problems efficiently, we propose a new scheme which utilizes a divide-and-conquer approach for efficient bandwidth allocation. Numerical results demonstrate the effectiveness of the proposal.
533 QR Code Scanning app for Mobile Devices, Mircea Moisoiu, Andrei Negrău, Robert Győrödi, Cornelia Győrödi, George Pecherle
A QR code (abbreviated from Quick Response Code) is the trademark for a type of matrix barcode first designed for the automotive industry [1]. Today the QR code is widely used in all industries. In our paper we present an implementation for Android devices using libraries and combined algorithms in order to be able to scan any QR code quickly, accurately and easily. The devices that we targeted for our application are Google Glass and an Android-operated phone. The implementation for each of the devices was slightly different, but the core algorithms and libraries were the same.
The ability to track and check the location of people or equipment in real time has a number of application areas, such as child safety, prisoner tracking and supply chains, to name but a few. Wi-Fi location determination is a technology developed in recent years that utilizes existing Wi-Fi equipment such as that installed in personal computers, personal digital assistants (PDAs) and mobile phones. The technology uses modulated Wi-Fi transmission signals to detect the presence of a device, which does not necessarily have to be connected to the network in question, just visible to it; the system is then able to determine the position of the device based on the signals received from the various access points (APs).
Many scheduling algorithms have been studied to assure the time constraints of real-time processes. Scheduling decision of these algorithms is usually based on parameters which are assumed to be crisp. However, in many circumstances the values of these parameters are vague. The vagueness of parameters suggests that we make use of fuzzy logic to decide in what order the requests should be executed to better utilize the system and as a result reduce the chance of a request being missed. Our main contribution is proposing a fuzzy approach to multiprocessor real-time scheduling in which the scheduling parameters are treated as fuzzy variables. A simulation is also performed and the results are judged against each other. It is concluded that the proposed fuzzy approach is very promising and it has the potential to be considered for future research.
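As a minimal sketch of treating scheduling parameters as fuzzy variables (the two membership functions, their weights, and the tuple layout are illustrative assumptions, not the paper's rule base): each request's laxity and execution time are mapped to fuzzy degrees in [0, 1] and defuzzified into a single priority by a weighted average:

```python
def fuzzy_priority(laxity, exec_time, max_laxity=100, max_exec=50):
    """Fuzzy degrees of 'urgency' and 'shortness', defuzzified by
    weighted average: a higher value means schedule sooner."""
    urgency = max(0.0, 1 - laxity / max_laxity)
    shortness = max(0.0, 1 - exec_time / max_exec)
    # rule 1: IF the request is urgent THEN high priority (weight 0.7)
    # rule 2: IF the request is short  THEN high priority (weight 0.3)
    return 0.7 * urgency + 0.3 * shortness

def fuzzy_schedule(tasks):
    """Order (name, laxity, exec_time) requests by descending fuzzy priority."""
    return sorted(tasks, key=lambda t: -fuzzy_priority(t[1], t[2]))
```

Unlike a crisp earliest-deadline rule, a request that is both moderately urgent and very short can overtake one that leads on a single crisp parameter.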
536 A Comparative Study on Data Aggregation in Wireless Sensor Networks by using Directed Diffusion and Ant Colony Algorithm, M. Shailaja
Wireless sensor networks use the data aggregation concept to reduce energy consumption. Data aggregation gathers correlated sensing data and aggregates it at intermediate nodes to reduce the number of messages exchanged in the network. This paper considers the problem of comparing the energy consumption of Directed Diffusion and the Ant Colony Algorithm for data aggregation. Directed diffusion is data-centric: all nodes in a directed diffusion-based network are application-aware, which enables diffusion to achieve energy savings by selecting empirically good paths and by caching and processing data in-network. The ant colony system provides a natural and intrinsic way of exploring the search space in determining data aggregation: every ant explores all possible paths from the source node to the sink node, and the data aggregation tree is constructed from the accumulated pheromone.
The Internet will soon be sailing in very rough waters, as it is about to run out of the current Internet Protocol version four (IPv4) addresses. Moving from IPv4 to Internet Protocol version six (IPv6) is not straightforward because IPv4 and IPv6 are incompatible protocols. To enable a smooth transition between IPv4 and IPv6, several transition mechanisms have been proposed by the IETF IPng Transition Working Group (NGTrans), such as tunneling, dual stack and translation. Tunneling supports "like-to-like" IP connectivity across an "unlike" network, whereas translation supports "like-to-unlike" IP interconnectivity. No comprehensive strategy exists to address all possible scenarios, but tunneling can keep the end-to-end model that the Internet is built on. Tunneling enables IPv6 connectivity across an IPv4 network and vice versa. Although tunneling cannot achieve direct interworking between IPv4 and IPv6, broadly adopting it as the foundation for the IPv6 transition will accelerate IPv6 adoption, retain legacy IPv4 connectivity, and let operators leverage their existing IPv4 assets during the transition period. The key merit is that tunneling retains the end-to-end notion and the IP like-to-like affinity on which the Internet is built. Bandwidth allocation is another important factor to be considered in networking, and an efficient bandwidth management technique is important in satisfying the requested services. In this project, the emphasis is laid on developing a tunnel-based framework that solves the transition problems in the backbone and allocates bandwidth efficiently, granting the requested bandwidth on demand.
Keyword generation for search engine advertising is an important problem for sponsored search or paid placement advertising. A recent strategy in this area is bidding on non-obvious yet relevant words, which are economically more viable. Targeting many such non-obvious words lowers the advertising cost, while delivering the same click volume as expensive words. Generating the right non-obvious yet relevant keywords is a challenging task. The challenge lies in not only finding relevant words, but also in finding many such words. In this paper, we present TermsNet, a novel approach to this problem. This approach leverages search engines to determine relevance between terms and captures their semantic relationships as a directed graph. By observing the neighbours of a term in such a graph, we generate the common as well as the non-obvious keywords related to a term.
539 Leader Election Algorithms in Distributed Systems, Seema Balhara, Kavita Khanna
A distributed system is an application which executes a collection of protocols to coordinate the actions of multiple processes on a network, such that all components cooperate to perform a single task or a small set of related tasks. It is difficult for processes to cooperate with each other because any of the processes may fail during communication. Leader election is a critical problem in distributed systems, as data is distributed among different nodes which are geographically separated; to maintain coordination between the nodes, a leader node has to be selected. This paper surveys the various existing leader election mechanisms used for selecting a leader in different problem settings.
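One classic mechanism of the kind this survey covers is the bully algorithm; a hedged, message-free sketch of its outcome (the function models only the takeover chain among reachable process ids, not the real timeout-driven message exchange) looks like:

```python
def bully_election(alive, starter):
    """Bully algorithm sketch over process ids: the starter challenges all
    higher ids; any reachable higher process takes over and repeats, so
    the highest id in `alive` (a set of reachable processes) wins."""
    candidate = starter
    while True:
        higher = [p for p in alive if p > candidate]
        if not higher:
            return candidate      # no one outranks the candidate: it leads
        candidate = min(higher)   # a higher process answers and takes over
```

In the real protocol each takeover is an ELECTION/ANSWER message round with timeouts to detect crashed processes; the loop above only captures the resulting chain of authority.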
540 Measuring Height of an Object using Accelerometer and Camera in iOS and Android Devices, Anmole Dewan, Abhijeet Sharma, Tanupriya Choudhary, Vasudha Vashisht
The height of an object can be determined using inch tapes, angles of elevation and basic trigonometry. All of these can be replaced by a standard smartphone with an accelerometer, GPS receiver, network connectivity and camera. This research deals with calculating the height of an object from the data received from these sensors.
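The underlying trigonometry can be sketched directly (the function signature and the two-angle sighting procedure are my own illustrative framing, not necessarily the paper's exact method): aim the camera at the object's base, then at its top, and read the pitch angles from the accelerometer; each tangent times the horizontal distance gives one part of the height:

```python
import math

def object_height(distance, depression_deg, elevation_deg):
    """Height of an object from one horizontal distance reading and two
    device pitch angles: the depression angle down to the object's base
    and the elevation angle up to its top."""
    below = distance * math.tan(math.radians(depression_deg))
    above = distance * math.tan(math.radians(elevation_deg))
    return below + above
```

For example, at 10 m from the object, sighting the base 45° below horizontal and the top 45° above it gives a height of about 20 m.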
Ad hoc wireless sensor networks (WSNs) promise many exciting new applications, such as on-demand computing power, continuous connectivity, and instantly deployable communication for the military and first responders. As WSNs become more essential to the everyday functioning of people and organizations, many attacks have arisen against them. This project explores resource depletion attacks at the routing protocol layer, which permanently disable networks by quickly draining nodes' battery power. In the worst case, a single vampire can increase network-wide energy usage by a factor of O(N), where N is the number of network nodes. This project proposes methods to prevent the adversary's influence on the network, including an implemented PLGPa method of the AODV protocol that provably bounds the damage caused by vampires during the packet forwarding phase.
542 Application of Cloud Rank Framework to Achieve the Better Quality of Service (QoS) Ranking Prediction of Web Services, Miss. Shraddha B. Toney, Prof. N.D. Kale
In cloud computing, QoS rankings provide priceless information for making an optimal cloud service selection from a set of functionally comparable service candidates. To obtain QoS values, real-world invocations of the service candidates are usually required. To avoid time-consuming and expensive real-world service invocations, a QoS ranking prediction framework is used; this framework requires no additional invocations of cloud services when making QoS ranking predictions. Two personalized QoS ranking prediction approaches, CloudRank1 (CR1) and CloudRank2 (CR2), are used to predict the QoS rankings directly. Extensive experiments are conducted employing real-world QoS data, including 1000 distributed users and three real-world web services. The implemented framework uses a modernized ranking approach which uses different QoS parameters to predict the ranking more accurately. Normalized Discounted Cumulative Gain (NDCG) has been used to analyze the accuracy of QoS ranking prediction for the implemented framework.
543 An Efficient Face Recognition using PCA and Euclidean Distance Classification, Ashutosh Chandra Bhensle, Rohit Raja
Person identification using the face is a very challenging problem. Recognition of a person from an arbitrary viewpoint is a crucial requirement for security and access control, and recognizing a particular face is helpful for many problems such as human-computer interaction and criminal detection. Current systems involve heavy calculation due to high dimensionality and are not very effective; thus, instead of using face vectors of high dimensionality, it is better to use face vectors of lower dimensionality. The implemented face recognition system can easily recognize faces in videos taken from a distance and from webcams. The improved PCA algorithm extracts facial features, and classification is performed by minimum-distance classification.
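The final classification stage is simple enough to sketch (this assumes the PCA projection has already reduced each face image to a short feature vector; the gallery layout and function names are illustrative, not the paper's code): a probe is assigned the identity of the nearest enrolled vector in Euclidean distance:

```python
import math

def euclid(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def classify(probe, gallery):
    """Minimum-distance classification: assign the probe feature vector
    the identity of the nearest enrolled face in the PCA-reduced space.
    `gallery` maps person -> projected feature vector."""
    return min(gallery, key=lambda person: euclid(probe, gallery[person]))
```

Because distances are computed in the low-dimensional projected space rather than on raw pixels, this step stays cheap even with many enrolled faces, which is the efficiency argument the abstract makes.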
In the present-day scenario, power is a major need for human life. There is a need to develop non-conventional sources for power generation, because our conventional sources of power are getting scarcer by the day. This paper emphasizes the idea that the kinetic energy wasted while vehicles move can be utilized to generate power by using a special arrangement called a "power hump" at toll gates. The main aim is to run the toll system using a piezoelectric sensor, thereby saving the power needed for running the toll gate. Nowadays there is a huge rush at toll plazas to pay the toll tax. Therefore, in order to reduce traffic jams, save time, and reduce the monetary loss of 300 crores per year, automation of toll tax payment using RFID is designed in this paper. Automation of the toll plaza is achieved using a combination of a microcontroller, RFID and a piezoelectric sensor. In addition, the design includes solar panels, which satisfy the power needs when there is no vehicular movement.
545 Text Grouping using Textual Entailment, Partha Pakray
Textual Entailment is an important field in the Natural Language Processing domain. Given two texts called T (Text) and H (Hypothesis), textual entailment recognition is the task of deciding whether the meaning of H can be logically inferred from that of T. A Textual Entailment (TE) system has been developed and tested on various standard entailment datasets. The TE system is then applied to different texts to group them into distinct groups. A corpus with a total of 10 groups containing 3540 sentences has been created for this experiment. The F-score of the textual entailment system is 61%, and it detects 8 of the 10 groups correctly.
546 A Semi-Distributed Load Balancing Algorithm Using Clustered Approach, Shweta Rajani, Renu Bagoria
The rapid increase and advancement in the use of computers and the internet has increased the demand for resource sharing, since it has raised the load across the internet to a vast level. In a distributed computing system made up of different types of processors, each processor may have different performance and reliability characteristics. In order to take advantage of this diversity of processing power, a modular distributed program should have its modules assigned in such a way that the applicable system performance index, such as execution time or cost, is optimized. This situation can be handled either by increasing the capacity of servers or by effectively distributing the workload among multiple servers. This paper discusses various techniques of load balancing and proposes a new design and algorithm which uses a clustered, semi-distributed approach to perform dynamic load balancing.
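A hedged sketch of the clustered, semi-distributed idea (the two-level greedy placement and the data layout are illustrative assumptions, not the paper's algorithm): a job is first routed to the least-loaded cluster, and only within that cluster to the least-loaded server:

```python
def assign(job_cost, clusters):
    """Semi-distributed placement sketch: pick the cluster with the lowest
    total load (the 'global' step), then the least-loaded server inside it
    (the 'local' step), and record the new load in place.
    `clusters` maps cluster name -> {server: current load}."""
    cluster = min(clusters.values(), key=lambda servers: sum(servers.values()))
    server = min(cluster, key=cluster.get)
    cluster[server] += job_cost
    return server
```

Splitting the decision this way is what makes the scheme "semi-distributed": only cluster totals need to be shared globally, while per-server loads stay local to each cluster.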
547 Survey of Creating Location Based Regions of Spatial Queries Using Proxy Approach in Mobile Environments, K. Suresh Babu, Swetha Madireddy
Caching valid regions of spatial queries at mobile clients is effective in reducing the number of queries submitted by mobile clients and the query load on the server. However, mobile clients suffer from longer waiting times while the server computes valid regions. We propose in this paper a proxy-based approach to continuous nearest-neighbor (NN) and window queries. The proxy creates estimated valid regions (EVRs) for mobile clients by exploiting the spatial and temporal locality of spatial queries. For NN queries, we devise two new algorithms to accelerate EVR growth, leading the proxy to build effective EVRs even when the cache size is small. For window queries, we propose to represent the EVRs in the form of vectors, called estimated window vectors (EWVs), to achieve larger estimated valid regions. This novel representation and the associated creation algorithm result in more effective EVRs for window queries. In addition, due to their distinct characteristics, we use separate index structures, namely the EVR-tree and the grid index, for NN queries and window queries, respectively. To further increase efficiency, we develop algorithms that exploit the results of NN queries to aid grid-index growth, benefiting EWV creation for window queries. Similarly, the grid index is utilized to support NN query answering and EVR updating. We conduct several experiments for performance evaluation. The experimental results show that the proposed approach significantly outperforms the existing proxy-based approaches.
548 Improving the Effectiveness of Marketing and Sales using Genetic Algorithm, Pratima O. Fegade, Prof. Dinesh D. Patil
The proposed system is a useful tool for discovering customer purchasing patterns. Its main aim is to discover the items frequently purchased by customers, which is helpful for marketing and improving sales. The system supports decisions on profit, placement, pricing and promotion of products, and examining the products frequently purchased by customers helps in estimating the improvements required in products. In this system the Apriori algorithm is first applied over the item sets, after which an optimization method is applied over the result of the Apriori algorithm.
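The first stage can be sketched with a textbook level-wise Apriori (a minimal illustration on tiny transactions; the genetic-algorithm optimization stage from the title is not shown, and the helper names are my own):

```python
from itertools import combinations

def apriori(transactions, min_support):
    """Level-wise Apriori sketch: count frequent 1-itemsets, then extend
    only frequent sets, pruning candidates with any infrequent subset."""
    def support(itemset):
        return sum(1 for t in transactions if itemset <= t)

    items = {i for t in transactions for i in t}
    frequent = {frozenset([i]) for i in items
                if support(frozenset([i])) >= min_support}
    result, k = set(frequent), 2
    while frequent:
        candidates = {a | b for a in frequent for b in frequent
                      if len(a | b) == k}
        frequent = {c for c in candidates
                    if all(frozenset(s) in result for s in combinations(c, k - 1))
                    and support(c) >= min_support}
        result |= frequent
        k += 1
    return result
```

The subset-pruning line is the Apriori property itself: no itemset can be frequent unless all of its subsets are, which is what keeps the candidate space small.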
Network Traffic Analysis (NTA) in heterogeneous networks is one of the emerging research areas receiving substantial attention from both the research community and traffic analyzers. Many tasks in NTA can be naturally cast as supervised or unsupervised learning problems, and many supervised classification models and unsupervised clustering models from data mining have been proposed for heterogeneous networks. Given the importance of network traffic analysis in data mining research and the rapid development of new models, this paper provides a comprehensive review of supervised classification and unsupervised clustering models for heterogeneous networks and systematically summarizes the state-of-the-art techniques for network traffic analysis. It addresses network management problems such as traffic load, quality of service, and trend analysis. The survey covers real-time supervised classification and unsupervised clustering algorithms and analysis techniques for heterogeneous networks. It provides a taxonomy of the different supervised classification and unsupervised clustering algorithms and evaluates the various performance metrics that are commonly used for comparison. A detailed review is provided covering fuzzy relational clustering algorithms, classification learning algorithms, the global voting algorithm and hybrid algorithms. The survey identifies certain open issues and key research challenges for network traffic analysis using supervised classification and unsupervised clustering models in heterogeneous networks, and suggests productive research directions.
550 An Analysis of Genetic Algorithm and Tabu Search Algorithm for Channel Optimization in Cognitive AdHoc Networks, V. Jayaraj, J. Jegathesh Amalraj, S. Hemalatha
Cognitive Radio Ad Hoc Networks (CRAHNs) constitute a viable solution to the current problems of inefficiency in spectrum allocation, and enable the deployment of highly reconfigurable and self-organizing wireless networks. Cognitive Radio (CR) devices are envisaged to utilize the spectrum in an opportunistic way by dynamically accessing different licensed portions of the spectrum. However, the phenomena of channel fading and primary and secondary interference in cognitive radio networks do not guarantee that application demands can be met continuously over time. The limited available spectrum and the inadequacy of spectrum usage necessitate a new communication standard that utilizes the existing wireless spectrum opportunistically. Here, we discuss the existing mechanisms used to improve the optimization of channel allocation in cognitive networks. This paper compares the techniques used to optimize the secondary user's performance in channel allocation.
551 Analysis of Location Based Routing Protocols against Wormhole Attack for MANETs: A Literature Survey, Devendra Kumar, Deepak Kumar Xaxa
The mobile nodes in MANETs dynamically change their topology and hence require an efficient mechanism to communicate with each other. A number of routing protocols have been proposed for the MANET environment, categorized as location-based and non-location-based routing protocols. In MANETs, location-based routing protocols are preferred as they route more efficiently than non-location-based protocols. The paper presents a classification of mobile ad-hoc routing protocols and a survey of location-based routing protocols against the wormhole attack.
552 Call Admission Control and Resource Utilization in 3G Networks, K. Prasuna, Chilakalapudi Meher Babu, Dr. Ujwal A. Lanjewar, K. Priya
Every newly developed technology faces the challenge of justifying its potential and coming alive in a real working environment. Combining two of the leading new technologies in the field of mobile communications, GPRS and multicasting, makes this challenge even more exigent. At a time when radio and core network resources are utilized at a maximum, multicasting has the potential to improve overall resource utilization and, as a result, to provide more channels for voice or data transmission. This also significantly improves the quality of the service and allows applications with high bandwidth demands, such as streaming video and video conferencing, to be widely promoted. In this paper we focus on studying the impact of multicast implementation on GPRS platforms. From analyses of the output of the system simulator, we determine the multicast influence on GPRS under different network loads, numbers of users, and multicast density modes. We have studied the impact of packet delays and packet overflow, which is closely related to the Quality of Service provided by the system.
553 Modified Shape Context for Signature Verification of Automated Cheque Authentication System, Sangeeta Girish Narkhede, Prof. Dinesh D. Patil
Every person has a unique signature; hence it is considered one of the biometrics for the purpose of authentication. A signature is an indivisible part of any bank cheque, but it can be copied by skill or by mere observation. To avoid such frauds, many techniques have been devised to date. The research here uses shape contexts for authentication of the signature on a bank cheque. Shape contexts are mostly used to verify whether two shapes are similar or not. Existing work on shape contexts requires a transformation for shape alignment, but this research eliminates that extra work. This paper presents a modified version of the shape context for signature verification on bank cheques using a K-Nearest Neighbour classifier. The proposed system also demonstrates effective performance when compared with another pattern matching technique, Local Binary Pattern.
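Since the abstract above pairs shape-context features with a K-Nearest Neighbour classifier, the core decision step can be sketched as follows. This is an illustrative sketch only: the feature vectors, labels, and k value below are invented toy data, not the descriptors or parameters from the paper.

```python
import numpy as np

def knn_classify(train_features, train_labels, query, k=3):
    """Label a query signature by majority vote among its k nearest
    training feature vectors (Euclidean distance)."""
    dists = np.linalg.norm(train_features - query, axis=1)
    nearest = np.argsort(dists)[:k]
    votes = train_labels[nearest]
    return int(np.bincount(votes).argmax())

# Toy 2-D feature vectors standing in for shape-context descriptors:
# label 1 = genuine, label 0 = forged (illustrative data only)
features = np.array([[1.0, 1.1], [0.9, 1.0], [1.1, 0.9],   # genuine cluster
                     [3.0, 3.2], [3.1, 2.9], [2.9, 3.0]])  # forged cluster
labels = np.array([1, 1, 1, 0, 0, 0])

print(knn_classify(features, labels, np.array([1.0, 1.0])))  # -> 1 (genuine)
```

In practice the features would be the (modified) shape-context histograms of the signature image, and the distance measure could be a histogram distance rather than plain Euclidean distance.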
This is an attempt to devise a memory-efficient WDR (Wavelet Difference Reduction) algorithm by decomposing a matrix using a customized echelon algorithm and applying WDR to it in parts. The standard WDR algorithm requires the entire matrix to be available in RAM (random access memory), which might not always be feasible, or not for the duration needed until the entire matrix is encoded. This idea takes advantage of working on the matrix in parts without having to partition the matrix in any predefined way. The encoded matrix can be reconstructed using symbolic computations by creating equations and solving them against the decomposed matrices.
555 A NEAR WELLBORE SIMULATOR, Zeeshan Ahmad, Jaspal K Saini, Haziq Jeelani
The major objective of this project is to develop a Near Wellbore Simulator, a software programme which will assist oil and gas companies in performing effective chemical squeeze treatments in oil and gas fields. This report contains the methodology planned for developing the complete software package for the proposed Near Wellbore Simulator. The document describes, in sequence, the purpose of and requirements for such a programme, the design of the user interface, the methodology for flow computation, and the calculation of results from the flow computation. In the beginning, background information about the challenges oil and gas companies are facing is described, along with the demand for this type of software programme. The objective of this project is to develop a near wellbore simulator (a software package) to simulate scale inhibitor squeeze treatment operations and evaluate post-squeeze well behavior. The proposed simulator will address all the challenges (described above) faced by operators during scale squeeze treatments. All the mathematical equations for solution flow, adsorption, and desorption will be provided and coded inside the software. The user of the software will input the required data and compute squeeze treatments to solve the various challenges faced during scale inhibitor squeezes. There is a significant demand for a software programme, a near wellbore simulator, which assists oil and gas companies in effective squeeze treatments. The region of rock layers close to the wellbore (say, up to 50 metres away from the wellbore) is called the near wellbore region. Due to deposition of scale minerals either on the rock surface in the near wellbore region or inside the wellbore, oil and gas production decreases. This software will help the oil and gas industry determine what amount of chemicals should be used so that minerals do not get deposited in the wellbore.
556 Data Broadcasting Approximation Algorithms for Wireless Networks, B. Sujatha, S. Nagaprasad, G. Srinivasa Rao
In this paper, an algorithm for efficient network-wide broadcast (NWB) in mobile ad hoc networks (MANETs) is proposed. The algorithm is performed in an asynchronous and distributed manner by each network node. The algorithm requires only limited topology knowledge, and therefore, is suitable for reactive MANET routing protocols. Simulations show that the proposed algorithm is on average 3-4 times as efficient as brute force flooding. Further, simulations show that the proposed algorithm compares favorably over a wide range of network sizes, with a greedy algorithm using global topology knowledge, in terms of minimizing packet transmissions. The application of the algorithm to route discovery in on-demand routing protocols is discussed in detail. Proofs of the algorithm's reliability and of the intractability of solving for a minimum sized transmitter set to perform NWB are also given.
557 Study of Energy Consumption in DSR using NS2, Palak, Nasib Singh Gill
A mobile ad hoc network is a self-organizing temporary network. It consists of mobile nodes roaming here and there, and routing in such networks is a challenging task. Various routing protocols have been proposed since the origin of MANETs. DSR is the most popular routing protocol based on source routing. This paper is an extensive study of DSR over various performance parameters, and the effect of these parameters on energy consumption is studied. An efficient DSR is one which gives the best route with minimum energy consumption and increases network lifetime.
558 Multi-Robot Co-ordination For Box-Pushing using Embedded Controller, Ketaki P. Dahale, Vijay D. Chaudhari, Kantilal P. Rane
Multi-robot coordination for task allocation has been a widely studied topic in the literature. A robot team can accomplish a given task more quickly than a single robot by dividing the task into sub-tasks and executing them concurrently, in application domains where the task can be decomposed; this leads to effective coverage of a large area. The principle of territoriality in homogeneous agent groups is a physical division of space and all associated resources in order to minimize interference and maximize synergy. Each robot is assigned a working area, and robots operate only in the workspace allocated to them in order to reduce interference between them. Dynamic task allocation allows robots to change their behavior in response to environmental changes in order to improve overall task performance. From studies of the division-of-labor mechanism in insect societies, it can be found that an individual engages in task performance if environmental stimuli exceed its intrinsic threshold. The advantage of centralized approaches is that they can produce globally optimal plans. In our work, we exploit some of the features of territoriality, dynamic task allocation, division of labor in insect societies, and centralized systems. We propose action selection without explicit communication for multi-robot box-pushing, which switches to a suitable behaviour set depending on the situation, for adaptation to a dynamic environment. A central controller identifies the situation and issues the appropriate control signals to adopt the behaviour set suited to it. As a result, we find that our approach is promising for designing adaptive multi-robot box-pushing. We demonstrate these concepts using three line-following mobile robots. The task is to push a given box from a source location to a destination location. The number of robots active at a given time depends upon the weight of the box, and the boxes to be moved are of different weights. The ratio of robots working at a given time changes dynamically as per the changes in the environment.
559 A Comparison and Selection on Basic Type of Searching Algorithm in Data Structure, Kamlesh Kumar Pandey, Narendra Pradhan
Searching is a common fundamental operation that arises in many practical fields of computer science, including database management systems, networks, data mining, and artificial intelligence, and the searching problem takes a different form in each of these fields. This research paper presents the basic types of searching algorithms in data structures: linear search, binary search, and hash search. We try to cover the main technical aspects of these searching algorithms. The paper provides a detailed study of how each searching algorithm works, guidance on selecting a searching algorithm according to the problem, and a comparison on the basis of different parameters, such as total number of comparisons, type of data structure, time complexity, and space complexity.
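The three searching algorithms compared in the abstract above can be sketched as follows. This is a minimal illustration, not code from the paper; the sample data is invented.

```python
def linear_search(items, target):
    """O(n): scan every element until the target is found."""
    for i, value in enumerate(items):
        if value == target:
            return i
    return -1

def binary_search(sorted_items, target):
    """O(log n): repeatedly halve the search interval; requires sorted input."""
    lo, hi = 0, len(sorted_items) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if sorted_items[mid] == target:
            return mid
        elif sorted_items[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1

def hash_search(items):
    """O(1) average lookup after building an index (extra space for the table)."""
    index = {value: i for i, value in enumerate(items)}
    return lambda target: index.get(target, -1)

data = [3, 8, 15, 23, 42, 57]
print(linear_search(data, 23), binary_search(data, 23), hash_search(data)(23))
# -> 3 3 3
```

The trade-offs the paper compares are visible here: linear search needs no preprocessing, binary search trades a sort for logarithmic lookups, and hash search trades memory for constant-time lookups.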
560 Performance Evaluation of AODV and AODV-LR in terms of Route Maintenance, R. Vijayadharshini, Dr. A. Padmapriya
The idea behind an ad hoc network is multi-hop broadcasting, in which packets are sent from the source node to the destination node without any infrastructure-based network; routing in an ad hoc network is distributed. Because of node mobility, there are recurring path breakages. In an ad hoc network, the primary goal of routing is to detect paths between source and destination with minimum routing overhead. Applications of ad hoc networks include military applications as well as emergency operations. This work focuses on AODV route maintenance in ad hoc networks. The paper suggests a new route maintenance scheme, AODV-LR, in AODV, which reduces the data transmission time of AODV and achieves better performance with minimum bandwidth utilization and maximum packet delivery ratio. The concept has been implemented and tested in OMNeT++.
561 Design and Development of Speech Database for Travel Purpose in Marathi, Pooja V. Janse, Ratnadeep R. Deshmukh, Smita B. Magre
The paper presents brief information about developing a speech database in the Marathi language for the travel domain in Aurangabad District. Development of a speech database is a primary requirement for developing an Automatic Speech Recognition (ASR) system. The accuracy of speech recognition depends on the quality of the speech data recorded and the algorithms implemented in the development of the ASR. The paper describes the data collection procedure, involving various speakers from Aurangabad district, for developing an ASR system in the Marathi language for the travel domain.
562 A Novel Interoperable Mobile Wallet Model with Capability Based Access Control Framework, Neeharika P, V N Sastry
Initially, mobile phones were used only for calls and messaging services. Nowadays almost all the basic utility devices around us have been replaced by mobile phones, ranging from the simple alarm clock to remote control of ubiquitous devices. Mobile phones today are much smarter than the devices used for payment processing in the early days of banking. The plastic cards that we carry in our wallets, such as financial cards, membership cards, and driving licences, all hold digital data, which gave rise to the idea of placing the plastic cards onto a mobile phone. There are a large number of mobile wallet initiatives currently, and we describe the challenges these initiatives are facing. In this paper we give a model for the development of a mobile wallet that can work across various platforms. Security is the major concern when it comes to finance-related information; to address the security issues of our proposed mobile wallet model, we also describe in detail an access control model that works with our interoperable mobile wallet.
Information brokering systems are attracting increasing attention as an efficient means of sharing data among large, diverse, and dynamic sets of users. Peers form a logical overlay network by establishing links to other peers they discover. A user in a peer-to-peer system issues queries that describe the data of interest; the queries are propagated through the overlay network to locate peers that provide relevant data, and only matching results are returned to the user. An Information Brokering System (IBS) atop a peer-to-peer overlay has been proposed to support information sharing among loosely federated data sources. Existing IBSs adopt server-side access control deployment and make honesty assumptions about brokers, paying little attention to the privacy of the data and metadata stored and exchanged within the IBS. This paper studies the problem of privacy protection in the information brokering process (PPIB). It proposes a broker-coordinator overlay, along with two schemes, an automaton segmentation scheme and a query segment encryption scheme, to share the secure query routing function among a set of brokering servers. Through a comprehensive analysis of privacy, end-to-end performance, and scalability, we show that the proposed system can integrate security enforcement with query routing while preserving system-wide privacy with reasonable overhead. The system is further enhanced using the Data Encryption Standard (DES), digital signatures, and an XOR swap algorithm.
564 Distributed Data Collection Scheme for Store and Forward Information in Wireless Sensor Network, Madhavi S. Kukade, Prof. Kapil N. Hande
Sensor networks are capable of collecting an enormous amount of data over space and time. Often, the ultimate objective is to "sample, store and forward": to sense the data, store it locally, and ultimately forward it to a central point to be analysed. Typical sensor nodes are wireless nodes with limited storage and computational power. Furthermore, they are prone to failure by going out of wireless range, interference, running out of battery, etc. They can be deployed in isolated or dangerous areas to monitor objects, temperatures, etc., or to detect fires, floods, or other incidents. There has been extensive research on sensor networks to improve their utility and efficiency. The sensor and storage nodes are distributed randomly in some region and cannot maintain routing tables or shared knowledge of the network topology; some nodes might disappear from the network due to failure or battery depletion. A distributed data collection algorithm to accurately store and forward information obtained by wireless sensor networks is proposed. The proposed algorithm does not depend on the sensor network topology or the geographic locations of sensor nodes, but rather makes use of uniformly distributed storage nodes. Analytical and simulation results for this algorithm show that, with high probability, the data disseminated by the sensor nodes can be precisely collected by querying any small set of storage nodes.
565 Automated Windows 8 Security and Safety System, Geeta, Dr. Neelam Sharivastav
Automated Windows 8 Security Console is a software program aimed at helping Windows 8 users protect their privacy and defend against malicious attacks. It is a graphical user interface application that guides an average user through the latest security issues in Windows 8, and then offers to implement each security measure on simple approval by the user; the approval can be a simple click or an Enter key press. Automated Windows 8 Security Console establishes a security configuration for Microsoft Windows 8. It will help system and application administrators, security specialists, auditors, help desk staff, and platform deployment personnel who plan to develop, deploy, assess, or secure solutions that incorporate Microsoft Windows 8. We are building a Windows Security Console that frees the user from malicious attacks and privacy breaches.
566 FoCUS – Forum Crawler Under Supervision, V. Rajapriya
Forum Crawler Under Supervision (FoCUS) is a supervised web-scale forum crawler. The web contains a large amount of data and innumerable websites that are monitored by a tool or program known as a crawler. The goal is to crawl relevant forum content from the web with minimal overhead. Forums have different layouts or styles and are powered by different forum software packages, but they have similar implicit navigation paths connected by specific URL types that lead users from entry pages to thread pages. FoCUS reduces the web forum crawling problem to a URL-type recognition problem. It also shows how to learn accurate and effective regular expression patterns of implicit navigation paths from automatically created training sets, using aggregated results from weak page-type classifiers. These type classifiers can be trained on and applied to a large set of unseen forums. The approach achieves high effectiveness, addresses the scalability issue, and incorporates sentiment analysis.
567 Intelligent Pressure Measuring System, Shayama Subair, Lizy Abraham
In this paper, pressure measurement using a MEMS-based sensor, the BMP180, is implemented as an economical and feasible method. This study mainly deals with the applicability of the BMP180 in space applications, pressure being an important parameter to monitor. The total process consists of sensing the pressure using the BMP180 and a PIC16F877A, sending the readings to a PC over an RS232 serial interface, displaying them digitally and as a waveform using the real-time software LabVIEW (Laboratory Virtual Instrumentation Engineering Workbench) of National Instruments, USA, and comparing the results with a conventional sensor used in industry.
Cloud computing provides multiple services to cloud users; in particular, in Infrastructure-as-a-Service (IaaS) clouds, users may install vulnerable software on their virtual machines. Attackers exploit these virtual machines, compromising them as zombies, and use them to perform distributed denial-of-service (DDoS) attacks. DDoS attacks are caused by an extreme flow of requests from clients to the cloud server at the same time, and they remain very hard for existing intrusion detection systems to handle. To overcome these problems, a modified approach called Effective Intrusion Detection and reducing Security risks in Virtual networks (EDSV) is proposed. It enhances intrusion detection by closely inspecting suspicious cloud traffic and determining the compromised machines. A novel attack-graph-based alert correlation algorithm is used to detect DDoS attacks, whose impact is reduced by incorporating access control and software switching mechanisms. The approach also reduces infrastructure response time and CPU utilization.
569 A Review on Question Generation System from Punjabi Text Containing Historical Information, Parshan Singh, Rajbhupinder Kaur
Automatic question generation is the process of generating questions automatically from a text with the help of various NLP techniques. The main challenge while generating questions from a text automatically is that they must be semantically correct. The rule-based approach is the most common approach to generating questions automatically from text. In this paper we present a review of question generation from historical documents written in the Punjabi language. To generate questions automatically from Punjabi text, a corpus in the Punjabi language containing various named entities, such as names of persons, locations, cities, countries, and other entities, is required, but is not yet available. So an NER (Named Entity Recognition) tool also needs to be created, which recognizes the names in a given sentence so that appropriate questions can be generated from it.
Power factor gives the relationship between the input voltage and input current waveforms of an electrical load powered by an AC source. Most often the AC utility mains will be the source, but in certain cases it could also be the output of a motor drive, inverter, or other localized AC source. Electrical energy quality is an important factor both technically and economically, and accurate power factor measurement is important in any electrical system. This paper presents a simple, cost-effective, and accurate power factor measurement system implemented using an ATmega microcontroller and LabVIEW. In the proposed hardware, current measurements are taken using a Hall effect current sensor, and voltage measurements are obtained directly by stepping down the input voltage and converting it to a suitably proportional voltage input to the microcontroller. The system is able to effectively measure the power factor of appliances consuming between 20 W and 1000 W.
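The quantity measured above, power factor, is the ratio of real power to apparent power, which can be computed directly from sampled voltage and current waveforms. A minimal sketch follows; the 50 Hz waveform, amplitudes, and 30° phase lag are assumed test values for illustration, not the paper's hardware pipeline.

```python
import numpy as np

def power_factor(voltage, current):
    """Power factor = real power / apparent power, from sampled waveforms.
    Real power is the mean of the instantaneous product v(t)*i(t);
    apparent power is the product of the two RMS values."""
    real_power = np.mean(voltage * current)
    apparent_power = np.sqrt(np.mean(voltage ** 2)) * np.sqrt(np.mean(current ** 2))
    return real_power / apparent_power

# Simulated 50 Hz mains over 5 full cycles: current lags voltage by 30
# degrees, as for an inductive load (all values illustrative)
t = np.linspace(0, 0.1, 10000, endpoint=False)
v = 325 * np.sin(2 * np.pi * 50 * t)            # ~230 V RMS mains
i = 5 * np.sin(2 * np.pi * 50 * t - np.pi / 6)  # 30 degree lag
print(round(power_factor(v, i), 3))  # -> 0.866, i.e. cos(30 degrees)
```

Averaging over an integer number of cycles matters: with partial cycles the mean of v(t)·i(t) no longer equals the true real power.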
571 Wireless Acoustic Signal Monitoring Using MEMS sensor and ATmega on LabVIEW Platform, Swathy L, Lizy Abraham
Acoustics and vibrations caused in machine structures can result in faults in, or collapse of, the whole mechanical system. The acoustics produced by machinery are vital indicators of machine health, and acoustic signal analysis can be used as a tool for locating a problem and taking the necessary actions. MEMS sensors provide highly accurate, small, and low-cost microphones which can even be used in low-cost space applications such as sounding rockets, nano-satellites, etc. This paper explains the algorithm to extract acoustic signals from the ADMP401, which is a high-quality, high-performance, low-power, analog-output, bottom-ported omnidirectional MEMS microphone. The sensor output gives error-free data even in noisy conditions. An ATmega microcontroller is used to read the data from the sensor and send it to LabVIEW software running on the computer, which extracts and displays the data from the serial port in real time. As rotating structures are those machine parts which may experience high vibration and acoustic noise, the system is also implemented wirelessly, which helps in extracting acoustic signals from rotating as well as inaccessible parts of the machinery.
572 Efficient Image Retrieval Using Different Content Based Image Retrieval Methods: A Review, Priyanka Srivastava, Dr. K. S. Patnaik
In this paper, we review different methods used in content-based image retrieval (CBIR) to address the problem of efficient retrieval of similar digital images from large databases with high precision. The use of different features, such as texture, color, and shape, has been explored in different ways to implement better and faster retrieval of data through different CBIR methods. Methods such as the Statistical Fractal-scaled Product Metric (SFPM), the Fractal-scaled Product Metric (FPM), and local tetra patterns (LTrPs) are examined for maximizing the accuracy of CBIR systems and enhancing similarity queries.
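As a concrete illustration of the color-feature retrieval idea the review discusses, a global color histogram with histogram-intersection similarity is one of the simplest CBIR measures. This is a generic textbook sketch, not one of the metrics (SFPM, FPM, LTrP) surveyed in the paper, and the images below are synthetic toy data.

```python
import numpy as np

def color_histogram(image, bins=8):
    """Quantize each RGB channel into `bins` levels and build a joint,
    normalized color histogram -- a simple global feature for CBIR."""
    pixels = image.reshape(-1, 3)
    q = (pixels * bins // 256).clip(0, bins - 1)
    codes = q[:, 0] * bins * bins + q[:, 1] * bins + q[:, 2]
    hist = np.bincount(codes, minlength=bins ** 3).astype(float)
    return hist / hist.sum()

def histogram_intersection(h1, h2):
    """Similarity in [0, 1]; 1 means identical color distributions."""
    return float(np.minimum(h1, h2).sum())

rng = np.random.default_rng(0)
red = np.zeros((32, 32, 3), dtype=int); red[..., 0] = 200        # query image
red_noisy = np.clip(red + rng.integers(-10, 11, size=red.shape), 0, 255)
blue = np.zeros((32, 32, 3), dtype=int); blue[..., 2] = 200      # unrelated

hq = color_histogram(red)
print(histogram_intersection(hq, color_histogram(red_noisy)) >
      histogram_intersection(hq, color_histogram(blue)))  # -> True
```

In a real CBIR system this color feature would be combined with texture and shape descriptors, which is exactly the multi-feature direction the reviewed methods pursue.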
573 A Review on Compressive Sensing in Synthetic Aperture Radar (SAR), Ankit, Manoj Ahlawat
The system parameters have been specified in view of all the constraints and practical limitations. The performance metrics of the system, such as range resolution and cross-range resolution, have been worked out, and the system-level specification has been derived keeping in view the desired performance. Using MATLAB as the major tool, the specified system parameters have been tested for their accuracy and correctness. A simulation of a pulse Doppler radar has been completed, including waveform design, target modeling, LFM pulse compression, sidelobe control, and threshold detection. This paper surveys the use of sparse reconstruction algorithms and randomized measurement strategies in radar processing. Although the two themes have a long history in the radar literature, the accessible framework provided by compressed sensing illuminates the impact of joining them. Potential future directions are conjectured, both for extension of theory motivated by practice and for modification of practice based on theoretical insights. A SAR image formation algorithm (Doppler Beam Sharpening) has been implemented in MATLAB.
574 Using the Literature to Develop a Preliminary Conceptual Model for the Student Success Factors in a Programming Course: Java as a Case Study, Salam Abdulabbas Ghanim, Nassir Jabir Al-khafaji
The complexity and difficulty ascribed to computer programming have been asserted to be the causes of its high failure and attrition rates. It is opined that programming, whether for the novice, the intermediate learner, or the self-branded geek, is always a course to be apprehensive of, with different studies reporting varying findings. Studies on the factors leading to success in programming courses in higher institutions have been carried out. Many universities have very high failure rates, particularly in Java. This motivates this study, which aims at discovering the factors affecting the success of programming courses.
575 A Review on Comparative Analysis of Routing Protocols of MANET, Umara Urooj, Nafees Ayub, Ramzan Talib, Yahya Saeed
Mobile ad hoc networking is an emerging field with some distinct characteristics compared to existing networking technologies. The network is built up by a collection of mobile nodes that come into contact with each other to work together or serve each other. If a packet is not destined for a node, it is forwarded to the next node, so each node acts as a router. MANETs have their own set of routing protocols, of proactive, reactive, and hybrid nature. In this paper, journals are selected for review that used OPNET as the simulator and performed comparative analysis of different MANET routing protocols, examining which type of traffic was considered for routing, which routing protocol performed well, and which parameters were considered. After reviewing this paper, an idea of MANET routing protocols and their working can be developed.
576 Securing Online Name System through Trust Demonstrating and Temporal Analysis, Chandana Krishna Sivunigunta, Sudha K.
With the rapid growth of reputation systems in various online networks, manipulations against such systems are evolving quickly. Because of the anonymity of the Internet, it is very hard for ordinary users to evaluate a stranger's reliability and quality, which makes online interactions risky and limits how online participants can protect themselves by judging the quality of strangers or unfamiliar products beforehand. To cope with this problem, online reputation systems have been built up. Their objective is to create large-scale digital word-of-mouth networks where individuals share opinions and experiences, in the form of reviews and ratings, on a variety of products, companies, services, digital content, and even other people. In this paper, we propose a scheme called TATA, the abbreviation of joint Temporal And Trust Analysis, which protects reputation systems from a novel perspective: the combination of time-domain anomaly detection and Dempster–Shafer theory-based trust computation. We provide a change detector in TATA as the anomaly detector, which takes rating sequences as inputs and detects changes occurring in them. The proposed change detector will detect not only sudden rapid changes but also small changes accumulated over time. In this way, even if malicious users insert dishonest ratings with small shifts in order to gradually mislead items' reputation scores, such changes will still be accumulated and will eventually be found by the proposed change detector.
577 A Generalized Flow-Based Method for Research on Acted Relationships in Wikipedia, Nalini N, Padmavathi V, Prudhvi Raj V
We focus on measuring relationships among pairs of objects in Wikipedia, whose pages can be regarded as individual objects. Two kinds of relationships exist between two objects in Wikipedia: an explicit relationship is represented by a single link between the two pages for the objects, and an implicit relationship is represented by a link structure containing the two pages. Most previously proposed methods for measuring relationships are cohesion-based methods that underestimate popular objects having high degrees, although such objects could be important in constituting relationships in Wikipedia. The other methods are inadequate for measuring implicit relationships because they use only one or two of the following three important factors: distance, connectivity, and co-citation. We propose a new method using a generalized maximum flow that reflects all three factors and does not underestimate objects having high degrees. We validate by experiments that our method can measure the strength of a relationship more appropriately than the previously proposed methods do. Another remarkable aspect of our method is mining elucidatory objects, that is, objects constituting the relationship. We explain that mining elucidatory objects opens a novel way to deeply understand a relationship.
578 Polymorphic Worms Collection in Cloud Computing, Ashraf A. Shahin
In the past few years, computer worms have come to be seen as one of the significant challenges of cloud computing. Worms are rapidly changing and getting more sophisticated to evade detection. One major issue in defending against computer worms is collecting worms' payloads to generate their signatures and study their behavior. To collect worms' payloads, we identified the challenges of detecting and collecting them and propose a high-interaction honeypot to collect payloads of zero-day polymorphic worms on homogeneous and heterogeneous cloud computing platforms. Virtual machine (VM) memory and VM disk images are inspected from outside using open-source forensics tools and the VMware Virtual Disk Development Kit. Our experiments show that the proposed approach overcomes the identified challenges.
579 A Lightweight Access for Hybrid Mobile Web Cloud Content Architecture, Shawkat K. Guirguis, Adel A. El-Zoghabi, Mohamed A. Hassan
The web was first designed simply to provide information hosted over the traditional client-server model. The fast growth and large amount of web content today create a trend toward utilizing cloud computing and the mobile web, which provide instant computing power, scalability, availability, and savings in time and administration effort. Today there is an increasing demand for accessing the Internet from mobile devices, which are becoming very popular; mobile web access is now an integral part of our lives, yet the majority of current web content ignores mobility, which is a great challenge for web content creators. The main requirement, when talking about the future of the web, is to combine the features of cloud computing and mobile web content: achieving hybrid mobile web cloud content allows a faster access technique and most of the benefits of mobile mashups for cloud computing. The main contribution of this paper is to combine the mobile web with cloud computing to introduce an innovative computing model called mobile cloud computing. Experimental results from implementing the proposed architectural style show that the access response and execution time are decreased, the transferred data size is minimized, and the three-screen vision view is strongly utilized.
580 An Implementation of Early Warning of Floods along Zambezi Basin Through the Use of Context-Awareness, Munyaradzi Magomelo, Hilton Chikwiriro, Clive Gurure
The ever-growing research area of early warning in floods is a broad field that has received much attention from researchers. This is because early warning has proven to be effective in reducing deaths, injuries and property damage due to floods if done appropriately. In this research, firstly, the researcher performed an in-depth study and analysis of existing flood warning systems with the aim of evaluating what has been done and identifying weaknesses that exist if any. Also, improvements over the more traditional early warning flood systems were evaluated with the aim of justifying the applicability of each in the studied research implementation area. Secondly, the researcher used an experimental research design that included surveys as a method to infer conclusions from the sample population, which included the influential flood warning people selected in the implementation area. The context-aware model was then designed and implemented using a number of open source tools using an agile software methodology strategy in the research implementation area and various qualitative and quantitative result data obtained. These results were then analyzed and the researcher concluded that context awareness did manage to improve early warning of floods.
The main aim of objective image quality assessment (IQA) is to evaluate image quality consistently with human perception. Different perceptual IQA metrics exist, but none accurately represents the degradations from all types of distortions; e.g., existing structural similarity metrics perform well on content-dependent distortions, while the peak signal-to-noise ratio (PSNR) performs better on content-independent distortions. In this paper, we integrate the merits of the existing IQA metrics with the guide of the recently revealed internal generative mechanism (IGM). The IGM indicates that the human visual system actively predicts sensory information and tries to avoid residual uncertainty for image perception and understanding. Motivated by the IGM theory, we use an autoregressive prediction algorithm to decompose an input scene into two portions: the predicted portion, containing the predicted visual content, and the disorderly portion, containing the residual content. Distortions on the predicted portion degrade the primary visual information, and structural similarity procedures are employed to measure this degradation; distortions on the disorderly portion mainly change the uncertain information, and the PSNR is employed for it. Finally, based on the noise energy distribution over the two portions, we combine the two evaluation results to acquire the overall quality score. Simulation results show performance comparable with the state-of-the-art quality metrics.
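The two-portion evaluation described above can be sketched as follows. This is a toy illustration, not the authors' implementation: a 4-neighbour mean filter stands in for the autoregressive predictor, a single-window SSIM-style formula stands in for the full structural similarity metric, and the energy-based weighting and constants are illustrative assumptions.

```python
import numpy as np

def local_mean(img):
    """4-neighbour average: a simple stand-in for the autoregressive predictor."""
    p = np.pad(img, 1, mode="edge")
    return (p[:-2, 1:-1] + p[2:, 1:-1] + p[1:-1, :-2] + p[1:-1, 2:]) / 4.0

def decompose(img):
    pred = local_mean(img)       # predicted (orderly) portion
    return pred, img - pred      # disorderly residual portion

def psnr(a, b, peak=255.0):
    mse = np.mean((a - b) ** 2)
    return 100.0 if mse < 1e-12 else 10.0 * np.log10(peak ** 2 / mse)

def structural_sim(a, b, c=1e-4):
    """Global (single-window) SSIM-style similarity."""
    ma, mb = a.mean(), b.mean()
    va, vb = a.var(), b.var()
    cov = np.mean((a - ma) * (b - mb))
    return ((2 * ma * mb + c) * (2 * cov + c)) / ((ma**2 + mb**2 + c) * (va + vb + c))

def igm_quality(ref, dist):
    pr, rr = decompose(ref)
    pd, rd = decompose(dist)
    s = structural_sim(pr, pd)             # degradation of primary visual information
    p = min(psnr(rr, rd), 100.0) / 100.0   # degradation of uncertain information
    e_pred = np.mean((pr - pd) ** 2)
    e_res = np.mean((rr - rd) ** 2)
    w = e_res / (e_pred + e_res + 1e-12)   # weight by noise energy on each portion
    return (1 - w) * s + w * p
```

For identical images the score is 1; adding noise lowers it, with the weight shifting toward the PSNR term as more distortion energy lands in the residual portion.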
582 Segmentation of Medical Images using Image Registration, A. Nirmala, V. Sridevi
Medical image segmentation is one of the most essential tasks in many medical image applications, as well as one of the most complex. It aims at partitioning a medical image into its constituent regions or objects and isolating multiple anatomical parts of interest in the image. The precision of segmentation often determines the final success or failure of the whole application. For example, when doctors want to reconstruct a 3D volumetric model of the heart, they need to segment the regions of the heart in a series of 2D images; if segmentation is done wrongly, the reconstruction will be erroneous. Therefore, considerable care should be taken to improve the reliability and accuracy of segmentation in medical image analysis and processing. If the regions of interest in an image have homogeneous visual features, segmentation is easy. In more general medical applications, however, images are much more complex, and difficulties inevitably arise in segmenting them. The difficulties of medical image segmentation stem mainly from the nature of the imaging technology: low-contrast images with noise, image properties, and overlapping parts of an image. Due to these difficulties, intelligent algorithms are needed to segment multiple anatomical parts of medical images. One promising approach is registration-based segmentation: a model of the anatomical parts of interest is constructed and registered to the image of a patient, and when registration is correctly performed, segmentation of the various anatomical parts follows. By representing prior knowledge in the model, registration-based segmentation can handle complex segmentation problems and produce accurate and complete results automatically.
583 The Role of Data Warehousing Concept for Improved Organizations Performance and Decision Making, Nwakanma Ifeanyi, Egbivwie Oghenevwoke, Azike Uchenna, Nwaobilo Amarachi, Ibeji Chinedum, Ohia Nwabueze
In challenging times good decision-making becomes critical. The best decisions are made when all the relevant available data is taken into consideration, and the best possible source for that data is a well-designed data warehouse. The concept of data warehousing is not hard to understand: it is to create a permanent storage space for the data needed to support analysis, reporting, and other organizational activities. The goal of this paper is to elicit the crucial role of data warehousing in an organization's performance and decision making.
584 A Hybrid Approach used to Stem Punjabi Words, Puneet Thapar
Stemming is the process of removing the affixes from inflected words without doing complete morphological analysis. A stemming algorithm is a procedure to reduce all words with the same stem to a common form [20]. The purpose of stemming is to obtain the stem or radix of those words which are not found in the dictionary: if the stemmed word is present in the dictionary, it is a genuine word; otherwise it may be a proper name or an invalid word. Stemming is useful in many areas of computational linguistics and information retrieval, and the technique is used by various search engines. The algorithm is the basic building block of the stemmer, and the stemmer is used in information retrieval systems to improve performance. This paper presents a stemmer for Punjabi which uses a naive algorithm together with a suffix-stripping technique. Similar techniques can be used to build stemmers for other languages such as Hindi, Bengali and Marathi. An in-depth analysis of a Punjabi news corpus was made, various possible noun suffixes were identified (???, i??, ???, ??, ? ??, etc.), and rules for noun and proper-name stemming were generated. The stemmer's results are good, and it can be effective in an information retrieval system. It also reduces the problems of over-stemming and under-stemming.
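The dictionary-validated suffix stripping described above can be sketched as follows. The suffix list and dictionary here are hypothetical Romanized stand-ins for illustration only, not the Gurmukhi suffixes identified in the corpus study.

```python
# Illustrative stand-ins (NOT the paper's actual Gurmukhi suffix/stem lists).
DICTIONARY = {"kitab", "munda", "kudi", "ghar"}                  # hypothetical stems
SUFFIXES = ["eaan", "iaan", "aan", "ean", "an", "e", "i", "o"]   # hypothetical suffixes

def stem(word):
    """Strip the longest matching suffix whose remainder is a dictionary word."""
    if word in DICTIONARY:                 # already a genuine dictionary word
        return word
    for suf in sorted(SUFFIXES, key=len, reverse=True):
        if word.endswith(suf) and len(word) - len(suf) >= 2:
            candidate = word[: -len(suf)]
            if candidate in DICTIONARY:    # validated stem
                return candidate
    return word                            # proper name or invalid word: keep as-is
```

Checking the candidate stem against the dictionary before accepting it is what limits over-stemming; refusing to strip when no suffix matches limits under-stemming.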
585 Advance Reservation of Resources in Workflow System, Lalit, Dr. Hardeep Singh
This work concerns reserving resources in advance in workflow scheduling in cloud computing. The problem with traditional resource allocation is that a resource may not be available when needed, so the requesting application is rejected. To overcome this problem, an advance reservation technique is used. Various techniques exist to reserve resources in advance, such as ECA rules, HAARS, coordinator-based, and neighborhood-based approaches. There are two main factors in advance resource reservation: 1) the start time of resource allocation, and 2) the resource utilization time. This paper presents a new time-based approach that has been used in grid systems but not in workflow systems. The implementation was done in NetBeans. The paper also surveys the existing techniques for advance reservation of resources and proposes a new methodology for it.
586 Service Selection by Predicting Website Attender Information, Prof. Dr. Zafer Agdelen; Dr. Amir Reza Shahbazkia
Web services are becoming a common and convenient means of doing business over the Internet. More and more web services keep arriving over the Internet, offering the same set of services to end users, and the availability of similar web services increases the complexity of both the discovery and the selection of web services. The traditional way of discovering a web service involves keyword-based searching followed by manual selection, and keyword-based search is not efficient. In this paper, we use an improved mechanism for web service selection based on the biorhythm, age, time of attendance and origin society of the user. Website owners are interested not only in keeping their customers but also in increasing them to earn more income, and attracting more customers than any competitor raises the chance of winning this competition. Improving a business follows a number of rules which sellers should obey: business rules such as negotiation, body language, time management, and selling strategy have been thoroughly discussed in MBA and DBA courses, but for websites there is not nearly as much guidance. In this study we introduce new rules for websites to act more attractively. Before any negotiation, company managers should choose the best negotiator. This duty has several steps: first, negotiators should study courses on negotiation strategy; second, they should understand biorhythm, not only for the company's speakers but also for the other side. Nowadays, websites are an important negotiator for any company.
587 Neighbourhood Countries Text Translation by Help of Mathematical Rules (Independent of Any Electronic Devices), Prof. Dr. Ali Haydar; Dr. Amir Reza Shahbazkia
Interest in human, logic-based translation has increased worldwide with travel to neighboring countries, while people are unable to memorize huge numbers of foreign words. We have developed a logical way of translating Azerbaijani, Turkmen, Uzbek, Qashqai, Crimean Tatar and other neighbor languages of Turkish into Turkish. In this way, instead of memorizing a huge number of words, people can transfer the words of their own language into another language just by memorizing a few rules. Many known languages share the same roots and originate from a few base languages; because of this, languages that originated from the same root(s) may be translated into each other without using a dictionary, such as translating neighbors' languages into Turkish. The first step is detecting the suffix of a word in the neighbors' languages in order to separate the main word from its suffixes; morphological rules are then applied. Moreover, both languages place verbs, nouns and adverbs in the same positions in sentences, so text translation is as easy as word translation. In this paper we propose a new method for analyzing and finding the exact translation of a word by applying algorithmic rules. The use of this research is to present ten rules to tourists from the neighboring countries of Turkey, so that while visiting they can communicate using their own mother tongue: they simply apply the easy rules to their own words to form the Turkish words. In the future this idea can be extended to other countries.
588 Machine Translation by Homograph Detector with the Help of Grammatical Base of Persian Words, Prof. Dr. Zafer Agdelen; Dr. Amir Reza Shahbazkia
Language is the core medium of communication, and translation is the core tool for understanding information in an unknown language. Machine translation helps people understand information in an unknown language without the help of a human translator. This study is a brief introduction to machine translation and a solution for homographs. Machine translation has been developed for many popular languages, and much research and development has been applied to those languages, but a significant problem in Persian (the language of Iran, Afghanistan, etc.) is detecting homographs, which is not generally problematic in other languages except Arabic. Detection of homographs in Arabic has been extensively studied; however, although Persian and Arabic share 28 characters and differ in only 4, they are two quite different languages. Homographs, words with the same spelling and different translations, are more problematic to detect in Persian because not all pronounced vowels are written in the text (only about 20% of vowels are written), so the number of homographs in Persian is about a thousand times greater than in other languages except Arabic. In this paper we propose a new method for analyzing and finding the exact translation of homographs by algorithmic and grammatical rules.
589 Superimposed Rule-Based Classification Algorithm in IoT , Ivy Kim D. Machica; Bobby D. Gerardo; Ruji P. Medina
The application of the Internet of Things (IoT) in agriculture captures an enormous amount of data for decision making. However, hard-to-detect abnormal data points that are transmitted can be harmful if not caught at an early stage. This paper presents an application of the Superimposed Rule-Based Classification Algorithm (SRBCA) using IoT in agriculture that lowers false positives in anomaly detection by training on a one-class dataset of ground-truth agricultural sensor readings and evaluating on a balanced test set combining ground-truth and synthetic data. The CRoss-Industry Standard Process for Data Mining (CRISP-DM) methodology was used as a guide in the development of the study. The SRBCA was developed to detect conditionally anomalous instances, and the model was tested with one (1) year of daily environmental sensor readings. The algorithm considers behavior and indicator features for anomaly detection. Moreover, a confusion matrix shows the accuracy of SRBCA compared with the One-Class Support Vector Machine (OCSVM) and its variants, which were considered the closest prior art. The experimental results show that SRBCA performed better at identifying conditional anomalies than OCSVM and its varieties.
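A minimal sketch of the rule-based, one-class idea: learn per-feature normal ranges from an all-normal training set, then flag any reading that violates a learned rule. This is an illustration of the general approach only, not the published SRBCA, and the feature names are hypothetical.

```python
def learn_rules(normal_rows):
    """Learn (min, max) range rules per feature from one-class (normal) data.
    normal_rows: list of dicts, e.g. {'temp': 28.4, 'humidity': 71.0}."""
    rules = {}
    for key in normal_rows[0]:
        values = [row[key] for row in normal_rows]
        rules[key] = (min(values), max(values))
    return rules

def is_anomalous(rules, row):
    """A reading is anomalous if any feature falls outside its learned range."""
    return any(not (lo <= row[k] <= hi) for k, (lo, hi) in rules.items())
```

Because the rules are learned only from ground-truth normal readings, an instance is never flagged for resembling the training data, which is one way a rule-based scheme can keep false positives low.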
590 Development of Text Recognition Prototype with Classification of Neural Networks AND Text-To-Speech in Javanese Scripts Using Incremental Methods , Ifan Prihandi; Indra Ranggadara; Boy Yuliandi
Artificial neural networks are information processing systems with characteristics similar to biological neural networks. A learning model must be trained on an artificial neural network before it is used to solve problems, by examining and correcting any errors that occur during the learning process. Developments in information technology affect particular fields of scientific expertise: image processing is no longer the only method for solving such problems, but is now combined with artificial intelligence to find solutions in a variety of applications. Akshara Jawa (Javanese script) is a priceless cultural heritage; the form of the script and the art of writing it are relics that deserve to be preserved. The script is used not only in Java but also in Sunda and Bali, with only small differences in writing. The purpose of this research is to create a model that processes images of the script and converts them into text and then into voice, so that people can learn Akshara Jawa and help preserve the culture of Indonesia; it is also expected to be a reference for mobile application development at a later stage.
591 Tableau Big Data Visualization Tool in the Higher Education Institutions for Sustainable Development Goals , Ahmed M. Amer; Mohamed M. EL-Hadi
The purpose of this paper is to use Tableau in higher education institutions (HEIs) to achieve the Sustainable Development Goals (SDGs). Tableau can analyse, visualize and share knowledge from large data sets, such as those covering the whole of Egypt, HEIs, governorates, universities, colleges, programs, courses, faculty members, and students, to determine demography and performance. It provides a variety of graphs, chart forms and dashboards that can serve as a better tool for sustainability in HEIs. The analysis is based on a literature review of visualization tools such as Tableau and of the sustainable development goals. This paper suggests that most of the 17 SDGs and ESDGs can be analysed, visualized and shared using Tableau, and that the tool can be used to study the extent to which higher education institutions contribute to sustainable development both inside and outside their geographical boundaries. The paper emphasizes the application of a relevant practical underpinning to support the SDGs.
592 Detecting Digital Forgery Using Image Processing in Zero Day Attack , L. Haider Hameed Razzaq; Dr. Ghadah Al-Khafaji
Digital image processing has become emblematic of the last century in computer graphics. Many companies started to develop software to meet the significant need for the new technologies, especially software demanded by movie and film production. For personal needs as well, image processing software has been developed to the extent of zero cost for end users, which has helped a wide range of end users build their skills to become professionals in photography. On the other hand, the availability of powerful image processing tools, together with cheap computer prices, encourages other categories of end users to use these facilities illegally to forge digital images and produce fake image content. This research focuses on helping organizations (in developed countries) that use ordinary scanners to convert paper documents into digital images to discover whether the stored digital images have been forged or are still authentic as first scanned, so that they can be used with confidence as reference digital evidence in the future. Discovering the technique used to forge a digital image, or determining the portion of the image where the tampering has been done, is not considered in this paper.
593 Information Governance: A Necessity in Today's Business Environment , Ripon Datta; Mounicasri Valavala; Md Haris Uddin Sharif
The purpose of this research is to identify the necessity of information governance (IG) in businesses today and the impact it has on business survival. Most businesses face IG implementation issues that have no permanent or lasting solutions. It is also still not well known which people in a business are responsible for running IG and what role each individual plays. A qualitative research method is used in which secondary sources of information are analyzed, and a comprehensive literature review is conducted. Today, almost every business experiences issues in the fields of information security, compliance, information and communication automation, policy implementation, and allocating roles and responsibilities to players in IG systems. Some solutions to these problems are provided in this research.
594 Multidimensional Modeling of Semi-Structured Data: XML Documents and Tweets , Kais Khrouf
The considerable development of technology in recent decades has led to the emergence of a panoply of relatively simple Internet applications based on open-source software, along with services designed to bring online collaboration to the large public, such as social networking sites (Twitter, for example) and XML (Extensible Markup Language). It is therefore essential to provide decision makers with efficient tools to help them analyze semi-structured data in a simple way, i.e., as they actually analyze factual or descriptive data. In this paper, we propose a new generic multidimensional model dedicated to semi-structured data (XML documents and Tweets).
595 A New Approach for Data Cryptography , Ziad Alqad; Majid Oraiqat; Salah Al-Saleh; Hind Al Husban; Soubhi Al-Rimawi
Due to the large number of different computer application transactions on the Internet, cryptography is a vital key to ensuring the security of those transactions. Cryptography is an important way of achieving data confidentiality, data integrity, user authentication and non-repudiation. In this paper we introduce a new approach to message encryption and decryption; this approach is implemented, and the experimental results are compared with those of the DES method of data cryptography. The following features of the proposed approach are demonstrated: simplicity, efficiency, a high security level, and flexibility.
596 ESP32 Based Data Logger , Ibrahim Al Abbas
A data logger is an important component of any measurement and instrumentation system. The developed data logger can be used in several measurement chains, such as smart home applications and the IIoT (Industrial Internet of Things). Such a data logger is realized by measuring several physical quantities such as temperature, humidity and position. The sensor readings are monitored by client software running in widely used Internet browsers. The main advantage of the design is the creation of a useful wireless measurement system for monitoring any type of physical parameter. The ESP32 microcontroller is used to record data from several sensors, and the measurements are transmitted via Wi-Fi to the client.
597 Brain Computer Interface (BCI) , Abdallah Abdelaziz
With the technological development in the world, especially in computers, computer simulation is observed in almost everything around us, in fields such as engineering, medicine, and chemistry. So can a computer understand human emotions? Emotions are intrinsically connected to the way people interact with each other and constitute a major influence on human behavior. A human being can read the emotional state of another human and behave in the best way to improve their communication at that moment, because emotions can be recognized through words, voice intonation, facial expression, and body language. Can a computer understand the human brain? Making the computer more empathic to humans is one aspect of affective computing: the computer can actually take a look inside the human's head to observe their mental state. In this study, we examine how a computer can understand the human brain by interpreting the electrical signals that come from it.
598 The Examination of Analyzing Data by Algorithm Performance , Farhad Shamssoolari
An algorithm is a specified set of rules/instructions that the computer follows to solve a particular problem; in other words, we tell the computer how to process the data so we can make sense of it. Data analysis has many facets, ranging from statistics to engineering. In this paper, basic models and algorithms for data analysis are discussed [Songyi Xiao, Wenjun Wang, Hui Wang, 2019]. Novel uses of cluster analysis, precedence analysis, and data mining methods are emphasized, and software for the cluster analysis algorithm and the triangularization is presented. The efficiency, or complexity, of an algorithm is the number of steps it executes to achieve its results. In theoretical analysis of algorithms, it is common to estimate complexity in an asymptotic sense, i.e., to estimate the complexity function for reasonably large input lengths; it is also easier to predict bounds for an algorithm than to predict its exact speed. Asymptotic notation is a shorthand way to write down and talk about the 'fastest possible' and 'slowest possible' running times of an algorithm, using upper and lower bounds on speed. Big O notation, omega notation and theta notation are used to this end [Dr. N. Sairam & Dr. R. Seethalakshmi, 2010].
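Counting basic operations makes the asymptotic classes above concrete: worst-case linear search performs n comparisons, an O(n) growth, while bubble sort performs n(n-1)/2 comparisons, an O(n^2) growth. A small sketch:

```python
def linear_search_comparisons(data, target):
    """Count comparisons made by a linear search; worst case is len(data)."""
    count = 0
    for item in data:
        count += 1
        if item == target:
            break
    return count

def bubble_sort_comparisons(data):
    """Count comparisons made by bubble sort: always n(n-1)/2."""
    a, count = list(data), 0
    for i in range(len(a) - 1):
        for j in range(len(a) - 1 - i):
            count += 1
            if a[j] > a[j + 1]:
                a[j], a[j + 1] = a[j + 1], a[j]
    return count
```

Doubling n doubles the search count but roughly quadruples the sort count, which is exactly the behavior that Big O, omega and theta notation summarize independently of machine speed.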
599 Retina Random Number Generator for Stream Cipher Cryptography , Murooj Aamer Taha; Naji Mutar Sahib; Taha Mohammed Hasan
Biometrics is the measurement of behavioral and physiological characteristics of humans, generally used for identification or verification, but it can also be used as a key for different security applications. Among biometric characteristics such as the ear, voice, fingerprint, face, retina, iris, palm print, and hand geometry, the retina can provide a higher level of security because of its inherent robustness. The main aim of this paper is to design and build a pseudorandom number generator based on the retina for stream cipher cryptography. The proposed system uses a hybrid technique that combines characteristics of the human retina with logistic functions to generate keys with high-quality properties in terms of unpredictability, randomization, and non-regeneration. The NIST test suite and correlation statistical tests show that the generated keys are random, unpredictable, uncorrelated, and robust against different kinds of attack. The retina image keys pass most of the NIST statistical tests with high success rates, and the average security test shows that the encrypted text is secure against entropy attack.
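The hybrid idea of driving a logistic function from a biometric value can be sketched as follows. This is a simplified illustration, not the paper's generator: a plain floating-point seed stands in for the features extracted from a retina image, and the quantization step is an assumption.

```python
def logistic_keystream(seed, n_bytes, r=3.99):
    """Iterate the logistic map x -> r*x*(1-x) and quantize its state to bytes.
    `seed` in (0, 1) stands in for a retina-derived value."""
    x, out = seed, bytearray()
    for _ in range(100):                 # discard transient iterations
        x = r * x * (1.0 - x)
    for _ in range(n_bytes):
        x = r * x * (1.0 - x)
        out.append(int(x * 256) % 256)   # map chaotic state to one key byte
    return bytes(out)

def xor_cipher(data, seed):
    """Stream cipher: XOR with the keystream; the same call decrypts."""
    ks = logistic_keystream(seed, len(data))
    return bytes(b ^ k for b, k in zip(data, ks))
```

Because XOR is its own inverse, applying the cipher twice with the same seed recovers the plaintext; the security of such a scheme rests entirely on the unpredictability of the keystream, which is what the NIST tests assess.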
600 Numerical Simulation of the Coupled Dynamic Thermoelastic Problem for Orthotropic Bodies , Kalandarov A.A.; Babadjanov M.R.
The article considers the coupled dynamic thermoelasticity problem for a two-dimensional orthotropic material. The boundary value problem consists of the equations of motion and of heat conduction, of hyperbolic and parabolic type respectively, in which the unknowns are the displacements and the temperature. Explicit and implicit difference schemes are constructed and solved numerically in two ways, and the coincidence of the numerical results is shown.
601 Optimized Siting and Sizing of DG in a HESCO Feeder using Particle Swarm Optimization , Shahbaz Ahmed; Mahesh Kumar; Syed Hadi Hussain; Zubair Ahmed Memon
A power distribution system is typically a radial network comprising a large number of nodes and branches. In Pakistan especially, large networks suffer power losses and voltage drops due to outdated infrastructure. Distributed Generation (DG) is one of the emerging solutions to compensate load demand while improving the power quality and voltage profile of the electrical power system. However, proper placement and sizing of DG remains a challenge, as improper placement and sizing may put the network in a more severe situation. This paper analyses a real 11 kV radial distribution network of HESCO. An artificial intelligence technique, Particle Swarm Optimization (PSO), is utilized to identify the optimal placement and size for DG integration. The proposed technique proved very effective for reducing power loss, with fast convergence.
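A bare-bones PSO loop is sketched below. The 2-D sphere function stands in for the network power-loss objective; in the paper the particle would encode the DG bus location and size, and the fitness would come from a load-flow calculation. The swarm parameters here are common textbook choices, not the authors' settings.

```python
import random

def pso(fitness, dim=2, n_particles=20, iters=200, bounds=(-10.0, 10.0)):
    """Minimize `fitness` with a basic global-best particle swarm."""
    rnd = random.Random(42)
    lo, hi = bounds
    pos = [[rnd.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]                 # personal best positions
    gbest = min(pbest, key=fitness)[:]          # global best position
    w, c1, c2 = 0.7, 1.5, 1.5                   # inertia, cognitive, social
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                vel[i][d] = (w * vel[i][d]
                             + c1 * rnd.random() * (pbest[i][d] - pos[i][d])
                             + c2 * rnd.random() * (gbest[d] - pos[i][d]))
                pos[i][d] = min(max(pos[i][d] + vel[i][d], lo), hi)
            if fitness(pos[i]) < fitness(pbest[i]):
                pbest[i] = pos[i][:]
                if fitness(pbest[i]) < fitness(gbest):
                    gbest = pbest[i][:]
    return gbest

sphere = lambda p: sum(x * x for x in p)        # stand-in loss objective
```

Each particle is pulled toward its own best-found point and the swarm's best-found point, which is what gives PSO its characteristically fast convergence on placement-and-sizing style problems.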
602 A Review on Cloud Computing Security , Oluyinka. I. Omotosho
Cloud computing commonly refers to any work done on a computer, mobile or other device where the data, and possibly the application being used, do not reside on the device but rather on an unspecified device elsewhere on the Internet. The basic premise of cloud computing is that consumers (individuals, industry, government, academia and so on) pay for IT services from cloud service providers (CSPs). Services offered in cloud computing are generally based on three standard models (Infrastructure-as-a-Service, Platform-as-a-Service, and Software-as-a-Service) defined by the National Institute of Standards and Technology (NIST). The reason for the cloud's existence is to resolve the data management problems firms face: either capacity was limited by the business's infrastructure, or excess capacity wasted capital. Beyond major factors such as initial capital and the fixed service cost, the sophisticated effort required for patching, managing and upgrading internal infrastructure is a huge obstacle to a firm's development and mobility. For many firms that lack the competency to manage large data center environments and infrastructure, it is wise to upload their files or data backups to another machine via the Internet, in order to concentrate on the organization's primary objectives.
603 Semantic based e-Housing Platform for Urban Regions , Omotosho, Oluyinka. I.; Adeyanju, Ibrahim A.
Shelter is a basic necessity of life, and in this age of technological advancement every human is entitled to secure a desired accommodation with ease. Currently, the process of securing accommodation in most cities of developing countries is characterized by stress, discouragement, confusion, and fear of exorbitant rent charges and estate agents' commissions, among others. While there have been drastic improvements in first- and second-world countries, enormous work is still required in third-world countries to eradicate the challenges surrounding the acquisition of basic needs, among which accommodation is primary. This work leverages semantic web technology in a platform that integrates all players in e-Housing for a lasting solution strategy.
604 A POPBL CONCEPTUAL FRAMEWORK FOR THE DESIGN AND IMPLEMENTATION OF ASICs , Terungwa Stephen Akor; Kamalularifin bin Subari; Hanifah binti Jambari; Amirmudin bin Udin; Sarimah binti Ismail
The popularity of ASICs grows every day due to the high demand for customization and the need for hardware/software integration in the digital society, especially in the IoT. A major benefit of using custom ASIC models in IoT is that products can be delivered at very low cost, enabling designers across vertical markets such as industrial, smart utilities and medical devices to differentiate IoT products through custom ASIC designs. This article develops a POPBL conceptual framework for the design and implementation of ASICs. To this effect, a content analysis of scientific literature, models and frameworks was carried out with a focus on ASIC application, design and implementation, and on PbBL, PjBL, and POPBL processes. The resulting conceptual framework is a logical and systematic procedure starting from problem identification/analysis, activation of prior knowledge, setting of objectives/specifications, project initiation/execution, assessment/evaluation, and public presentation. The POPBL process stages act as the anchor for ASIC design and implementation from start through product specification, architecture, logic design, physical design, and tape-out. The framework will be useful to students and researchers for developing soft skills such as problem-solving, critical thinking, creativity and innovation.
605 Odd Posts Identification through the Vocabulary by Semantic Sentiment Analysis Using Machine Learning Algorithm , Dr. Mohammed Ali Alzahrani
This work considers the importance of detecting odd posts. It explains what odd posts are and describes detection methods using machine learning algorithms. First, a survey is conducted of what circulates on the Internet and why eliminating odd posts is required. Since data is increasing exponentially, various techniques are used to eliminate odd posts; among the best, TF-IDF is used as the clustering technique, and SVM is used to eliminate odd posts and is observed to be a good solution. However, among machine learning algorithms, an RNN produces good results, as it is feasible for large-scale datasets. Sentiment analysis using machine learning approaches is used to obtain progressive results aimed at eliminating odd posts from the data, since the current major sources of data are social media platforms. Accuracy is improved, and the model is implemented on a dataset of tweets that includes negative, positive and neutral tweets.
606 Semantic Web Service Discovery Approaches: A Comparative Study , Omnia Saidani Neffati; Oumaima Saidani
Nowadays, Web services are deeply involved in business computing for the development of distributed applications across diverse networks. Consequently, finding the appropriate Web service that meets users' requirements (Web service discovery) has become a crucial issue. In the literature, many Web service discovery approaches have been proposed to simplify the discovery process. Existing approaches differ in terms of objective, the issue addressed, the techniques/methods used, etc. In this paper, we first propose a literature review of approaches addressing semantic Web service discovery. Second, we provide a comparative study of these approaches on the basis of multiple criteria such as scalability, heterogeneity, context-awareness and accuracy.
607 Talking Diary: A Novel Approach for Automatic Audio Note Categorization and Event Scheduling for Android Application , A.Hani Munir; Abubakar Manzoor; Utba Aziz
Smartphone usage is very common nowadays, and people try to organize their routine tasks using different mobile applications. In this research, a novel approach to note classification and scheduling is proposed to help mobile users automatically organize their daily routine tasks from a single audio note. The proposed model has been implemented in an Android application named Talking Diary, which comprises three modules: automatic audio note classification, automatic audio note scheduling, and a working-hours calculator. The classifier computes a similarity score by extracting N-gram weights from ConceptNet to perform classification. The scheduler generates events and alarms on receiving an audio note, and working hours are calculated on the basis of GPS location. Talking Diary was tested on 500 sample notes collected from a random population of participants. Accuracy, recall and precision were calculated to assess the correctness of the classifier, and test cases were developed to verify the performance of the scheduler and working-hours calculator.
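The classification idea can be sketched in miniature. The paper scores notes against categories using N-gram weights from ConceptNet; the sketch below substitutes plain word-bigram overlap against hypothetical category seed phrases (the categories, phrases and scoring are assumptions for illustration, not the paper's actual model):

```python
def ngrams(tokens, n=2):
    """Return the set of word n-grams in a token list."""
    return {tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1)}

def similarity(note, category_phrases, n=2):
    """Jaccard overlap between a note's n-grams and a category's n-grams."""
    note_grams = ngrams(note.lower().split(), n)
    cat_grams = set()
    for phrase in category_phrases:
        cat_grams |= ngrams(phrase.lower().split(), n)
    if not note_grams or not cat_grams:
        return 0.0
    return len(note_grams & cat_grams) / len(note_grams | cat_grams)

# hypothetical categories with seed phrases
categories = {
    "meeting": ["schedule a meeting", "team meeting at office"],
    "shopping": ["buy groceries from store", "shopping list for market"],
}

def classify(note):
    """Assign the note to the category with the highest overlap score."""
    return max(categories, key=lambda c: similarity(note, categories[c]))
```

A scheduler module could then map the predicted category to an event or alarm template.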
608 A Robust Review of SHA: Featuring Coherent Characteristics , Ghazala Shaheen
The Secure Hash Algorithms (SHA) produce a message digest based on principles similar to those used in the design of the MD4 and MD5 message digest algorithms, but with a more conservative design. The four SHA families are structured differently and are named SHA-0, SHA-1, SHA-2 and SHA-3. The SHA series appears to provide greater resistance to attacks, supporting the NSA's assertion that the changes increased security. This review study compares the different secure hashing algorithms with respect to the attributes considered performance pillars of cyber security systems.
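Two of the attributes such reviews compare, digest size and the avalanche effect, can be observed directly with Python's standard `hashlib` (a quick illustration, not part of the reviewed paper):

```python
import hashlib

msg = b"The quick brown fox jumps over the lazy dog"

# digest sizes illustrate the family's progression: larger digests
# raise the cost of collision and preimage attacks
for name in ("sha1", "sha256", "sha512", "sha3_256"):
    h = hashlib.new(name, msg)
    print(name, h.digest_size * 8, "bits:", h.hexdigest()[:16], "...")

# avalanche effect: a one-character change in the input yields a
# completely different digest
a = hashlib.sha256(b"abc").hexdigest()
b = hashlib.sha256(b"abd").hexdigest()
print("sha256('abc'):", a)
print("sha256('abd'):", b)
```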
609 Creating Human Speech Identifier using WPT , Dr. Amjad Hindi; Dr. Majed Omar Dwairi; Prof. Ziad Alqadi
The human speech signal is widely used in vital applications such as computer security systems. A speech signal usually has a large size, which makes it difficult to identify speech by direct sample-by-sample matching, so an efficient and accurate method of creating a speech print is an important part of the speech identification process. In this research paper we introduce wavelet packet decomposition: the generated wavelet packet tree is analyzed in order to create unique features (an identifier) for each speech signal. We show how wavelet packet decomposition offers flexibility in creating speech identifiers by providing a variety of feature selections, each of which leads to a unique, compact identifier that can later be used in any application requiring human speech recognition.
610 Accessibility Analysis of Bangladesh Government Websites Based on WCAG 2.0 , Mostak Ahmed; Zhijun Yan; Shariful Islam; Md Mehedi Hasan Sunny
The website is one of the most widely used platforms for information sharing, and every citizen has an equal right to obtain information from government websites. The main objective of this research is to analyze the accessibility, usability and security aspects of the government websites of Bangladesh. The research was carried out against WCAG 2.0 (level AA) using tools such as TAW, WAVE, CCA, Total Validator, CynthiaSays, keyboard-accessibility checks and readability tests. The analysis covered 20 randomly selected government websites listed in the National Web Portal of Bangladesh. The results show that all of the examined websites failed to meet the minimum requirements of the WCAG 2.0 (AA) guidelines. The paper concludes with recommendations for increasing web accessibility.
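Tools like TAW and WAVE automate checks against individual WCAG success criteria. As a minimal sketch of one such check, WCAG 1.1.1 (non-text content needs a text alternative), the following stdlib-only Python snippet flags `img` tags without a non-empty `alt` attribute (illustrative; the surveyed tools test many more criteria):

```python
from html.parser import HTMLParser

class AltTextChecker(HTMLParser):
    """Flag <img> tags lacking a non-empty alt attribute (WCAG 1.1.1)."""
    def __init__(self):
        super().__init__()
        self.missing = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attr_map = dict(attrs)
            if not attr_map.get("alt"):
                self.missing.append(attr_map.get("src", "?"))

page = ('<html><body><img src="logo.png" alt="University logo">'
        '<img src="banner.jpg"></body></html>')
checker = AltTextChecker()
checker.feed(page)
print("images missing alt text:", checker.missing)
```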
611 Review of Cryptography in Cloud Computing , Samar Zaineldeen; Abdelrahim Ate
Cloud computing offers a distinctive way to share distributed resources: it is a model of Internet-based sharing of distributed assets that enables ubiquitous, on-demand access to a shared pool of configurable computing resources. Consequently, security becomes a critical issue in cloud computing, and securing information is the key problem in the field of network security. Cryptography stands out as one of the best methods to improve data security. This paper discusses the role of cryptography in cloud computing in enhancing information and data security.
612 ICT Diffusion and Primary Care Methodological Contribution on Clustering Methods to Partition Medical Practices in the USA , Professor Christine C Huttin
OBJECTIVES: This project presents an analysis of IT processes on variations in prescribing patterns for patients diagnosed with type II diabetes. It follows a first study on electronic billing and analyses various stages of IT processes in clinical practices. METHODS: A sample of 610 patients is extracted from the CDC physician survey (Huttin/Wong dataset 2010). RESULTS: Two hierarchical clustering methods, Average Linkage (AL) and Ward's method, were used and led to a partitioning of medical records into three clusters, showing significant differences in levels of computerized clinical information (Savage test). CONCLUSIONS: More research is needed to include other clustering techniques and a generalization with a similarity matrix, possibly with multi-objective optimization. This type of research can be used to analyze and manage the propagation of IT processes inside clinical systems and to control for their effects on physician prescribing behaviors.
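The average-linkage step can be sketched in a few lines. The naive Python implementation below (illustrative; the study's actual feature set and software are not given here) merges the closest pair of clusters, measured by average pairwise distance, until the desired number of clusters remains:

```python
def average_linkage(points, k):
    """Naive agglomerative clustering with average linkage, down to k clusters."""
    clusters = [[p] for p in points]

    def dist(a, b):
        # average pairwise Euclidean distance between two clusters of 2-D points
        total = sum(((x - u) ** 2 + (y - v) ** 2) ** 0.5
                    for (x, y) in a for (u, v) in b)
        return total / (len(a) * len(b))

    while len(clusters) > k:
        # find and merge the closest pair of clusters
        i, j = min(((i, j) for i in range(len(clusters))
                    for j in range(i + 1, len(clusters))),
                   key=lambda ij: dist(clusters[ij[0]], clusters[ij[1]]))
        clusters[i] += clusters.pop(j)
    return clusters

# three loose groups of practices, described by two hypothetical
# computerization scores
pts = [(0, 0), (0, 1), (5, 5), (5, 6), (10, 0), (10, 1)]
result = average_linkage(pts, 3)
```

Production analyses would instead use an optimized library routine; the O(n^3) loop above is only meant to make the linkage criterion explicit.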
The computerized world we live in holds a vast amount of information and data used by a variety of users, for example videos, books and articles. Different users like different content, and discovering what each user likes can be tedious, while every online service provider aims to attract many clients. This is where recommender systems become important: a recommender framework suggests content to a user, for example movies and books, depending on what the user likes. In this research, a new movie recommender system is proposed that improves on existing recommender systems. With this new framework, the client receives an improved prediction compared with existing approaches such as content-based filtering and collaborative filtering. To overcome the disadvantages of both methods (the collaborative filtering algorithm, which is unsupervised, and the content-based filtering algorithm, which is supervised), the new system combines the two, yielding a more stable system than the existing ones.
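A common way to combine the two signals is a weighted blend of a content-based score and a collaborative score. The Python sketch below (a minimal illustration with invented movie names and ratings; the paper's actual combination rule is not specified here) shows the idea:

```python
def content_score(user_profile, item_features):
    """Content-based signal: overlap of the user's preferred genres with the item's."""
    if not item_features:
        return 0.0
    return len(user_profile & item_features) / len(item_features)

def collaborative_score(user, item, ratings):
    """Collaborative signal: mean rating other users gave the item, scaled to [0, 1]."""
    others = [r[item] for u, r in ratings.items() if u != user and item in r]
    return (sum(others) / len(others)) / 5.0 if others else 0.0

def hybrid_score(user, item, profile, features, ratings, alpha=0.5):
    """Blend both signals; alpha weights content vs collaborative evidence."""
    return (alpha * content_score(profile, features[item])
            + (1 - alpha) * collaborative_score(user, item, ratings))

# hypothetical catalog and rating history
features = {"MovieA": {"sci-fi", "action"}, "MovieB": {"romance"}}
ratings = {"u1": {"MovieA": 5, "MovieB": 2}, "u2": {"MovieA": 4}}
profile = {"sci-fi", "action"}          # new user's stated preferences

score_a = hybrid_score("u3", "MovieA", profile, features, ratings)
score_b = hybrid_score("u3", "MovieB", profile, features, ratings)
```

Blending helps because each signal covers the other's cold-start weakness: content features work for new items, collaborative ratings for items the user's taste profile misses.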
614 Exploring Factors Influencing Mobile-Banking Usage among PAAET College of Business Studies Students , Lamyaa S. AlAli; Musaed S. AlAli
This study examines the penetration of mobile banking applications among students of the PAAET College of Business Studies. Panel data, obtained using a questionnaire, are used to examine the relationship between the number of times per week students use their mobile banking applications to conduct bank transactions and a number of factors. The results indicate that, of the ten factors examined, students' age had a statistically significant direct relation with how often students use their m-banking applications, and that students living in the Capital province use their m-banking applications more often than those in other provinces of Kuwait.
615 Simplification of Arithmetic and Variable Script in Hypertext Preprocessor (PHP) , Danial Kafi Ahmad
Programming has become one of the vital skills required in computer science, information technology, space science, engineering and related fields. This can be seen in the computing industry, specifically in software development, where software-based systems are built to perform or automate specific tasks, requiring developers to deploy their programming skills, ability and experience. Beyond producing functioning software, efficiency is also an important factor contributing to the quality of the software produced. In this research, a few lines of Hypertext Preprocessor (PHP) code comprising variables, arithmetic operators and their operands were developed and then simplified into a single line of code. The simplified version performs the same computation as the initial design, and the single line of code is assumed to be more efficient than the multi-line version.
616 Modified Genetic Folding Algorithm for Breast Cancer Classification Dataset , Mohammad A. Mezher
Cancer is a disease that develops in the human body due to gene mutation. Because of various factors, cells can become cancerous and grow rapidly, destroying normal cells in the process. Support vector machines allow accurate classification and detection of classes, and the advantage of kernel selection is the ability to derive global learning rates for SVMs using the Genetic Folding (GF) algorithm. In a comparative analysis conducted under a set of conditions describing the behavior of the compared algorithms, the developed GF algorithm outperforms traditional SVMs on the UCI Breast Cancer Wisconsin Diagnostic (BCWD) dataset, with overall GF performance comparable to SVM. The statistical analysis relies on a careful examination of the ROC curve. Moreover, the GF algorithm obtains accuracy rates adaptively, that is, without knowing the parameters resulting from the margin conditions. The experimental results show that a single GF operator produces superior classification accuracy, and the proposed method plays an important role in detecting breast cancer in an efficient time frame.
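The ROC-based comparison rests on the area under the curve (AUC), which equals the probability that a randomly chosen positive case is scored above a randomly chosen negative one. A stdlib-only Python computation of that statistic (illustrative; the labels and scores below are made up, not from the BCWD experiments):

```python
def roc_auc(labels, scores):
    """AUC as the probability that a random positive outranks a random
    negative (ties count half)."""
    pos = [s for l, s in zip(labels, scores) if l == 1]
    neg = [s for l, s in zip(labels, scores) if l == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# hypothetical classifier scores: 1 = malignant, 0 = benign
labels = [1, 1, 1, 0, 0, 0]
scores = [0.9, 0.8, 0.4, 0.5, 0.3, 0.1]
auc = roc_auc(labels, scores)   # one positive (0.4) ranks below one negative (0.5)
```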
617 Cooperative Domain Ontology Reduction Based on Rough Sets , Wa'el Mohsen; Mostafa Aref; Khaled ElBahnasy
Ontology is widely used in knowledge engineering, web-based data mining, and other areas. The process of developing and evolving inter-organizational domain ontologies easily accumulates redundant information. Rough set theory can be used to reduce the attributes of ontologies: this type of reduction relaxes the harsh requirements of the exact reduct and overcomes the drawback of the possible reduct, namely that the derived decision rules may be incompatible with those derived from the original system. In this paper, we formulate the preliminaries of using rough set theory to solve this problem while building or evolving an inter-organizational domain ontology. The technique can be used to enhance automatic and semi-automatic operations for developing and evolving ontologies.
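The core rough-set construction behind such reduction is the pair of lower and upper approximations of a concept under the equivalence classes induced by a set of attributes. A minimal Python sketch (the universe, partition and target concept below are invented for illustration):

```python
def approximations(partition, target):
    """Lower and upper approximations of a target concept, given the
    equivalence classes (blocks) induced by a set of attributes."""
    lower, upper = set(), set()
    for block in partition:
        if block <= target:      # block entirely inside the concept
            lower |= block
        if block & target:       # block overlaps the concept
            upper |= block
    return lower, upper

# objects 1..6 partitioned by some attribute subset (illustrative)
partition = [{1, 2}, {3, 4}, {5, 6}]
target = {1, 2, 3}               # concept to approximate

lower, upper = approximations(partition, target)
# boundary region = upper - lower; a nonempty boundary means the concept
# is "rough" with respect to these attributes
```

An attribute is dispensable (and so removable in a reduct) when dropping it leaves the induced partition, and hence these approximations, unchanged.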
618 Social Distancing: Role of Smartphone During Coronavirus (COVID – 19) Pandemic Era , Herbert Wanga; Thobius Joseph; Mauna Belius Chuma
In December 2019, an outbreak of severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) infection occurred in Wuhan, Hubei Province, China, and spread across China and beyond. On February 12, 2020, WHO officially named the disease caused by the novel coronavirus Coronavirus Disease 2019 (COVID-19), having on January 30, 2020 declared COVID-19 the sixth public health emergency of international concern. The outbreak has posed significant threats to international health and the economy and has attracted intense attention not only within China but internationally; it has since been declared a pandemic by the World Health Organization. WHO calls for social distancing through several measures: isolation, quarantine, closing schools, working from home instead of at the office, restricting the movement of people, cancelling mass gatherings, cancelling or postponing conferences and large meetings, and avoiding public transportation, including buses, subways, taxis and rideshares. This leads to lockdown. A smartphone installed with the relevant apps brings people together even when the coronavirus (COVID-19) pandemic forces us apart. Life has to go on: students need education, people need food and medication, and the economy has to remain stable. Social distancing should not be complicated. These apps keep loved ones, team members and favourites accessible, and help people treat social distancing as a social responsibility rather than feel imprisoned by it. The apps are categorized into video conferencing, social video chats, medical, entertainment, health & fitness, food & drinks, and apps for visual & hearing impairments.
619 Support Vector Machine (SVM) for Medical Image Classification of Tumorous , Reem Alrais; Nazar Elfadil
Cancer has become a leading cause of death worldwide. Interpreting medical images to discover tumors and their types requires distinct expertise, and machine learning techniques can provide the accuracy and speed needed to analyse these images while avoiding errors due to lack of experience. In this paper, the authors study the support vector machine (SVM), a machine learning technique used here to classify brain images. The SVM is applied, using Matlab software, to analyse brain images and distinguish benign from malignant tumors. The experiments conducted demonstrate the accuracy of the proposed system in classifying the tumor types (benign, malignant) found in medical brain images. In this research the images to be classified are limited to these two tumor types; in future work, pre-processing procedures will be added to the brain images prior to classification.
The central processing unit is the brain of the computer. If you have purchased all the necessary hardware, you have already completed the first stage of assembling your computer. You are advised to put on your anti-static wrist strap, which enables you to discharge yourself before unpacking your components from their original anti-static bags. Discharging yourself is necessary to avoid the danger of damaging your components through electrostatic discharge while handling them; if you don't have an anti-static wrist strap, you can discharge yourself by touching the metal edges of the casing. Most of these parts can be bought together in what is known as a "barebones kit"; in this instance, most of the components were bought together from percenttechnology.com as a barebones kit for around #35,000, and there is a financial advantage in buying parts bundled together. Have the mounting screws that come with the motherboard and a Phillips screwdriver handy, as they will be needed throughout all the stages. Assembling the computer yourself is far cheaper than purchasing a pre-assembled one. After securing the needed parts and materials, it will likely take between 2 and 4 hours to assemble your personal computer.
621 GENERIC FRAMEWORKS FOR SVM, ANN, LGBM, AND LR ALGORITHMS , Nora Ibrahem Alghurair; Mohammad A. Mezher
The World Health Organization describes diabetes as a metabolic condition of multiple etiologies defined by persistent hyperglycemia with anomalies in glucose, lipid and protein metabolism, triggered by deficiencies in insulin secretion, insulin action, or both. Diabetes is one of the 21st century's most daunting health problems in the world, affecting over 425 million people. Data mining is one of the major techniques that develops and supports medical data research. The aim of this research was to establish a diagnostic framework for diabetes, using the Pima Indians Diabetes Dataset (PIDD) for feature description. Two generic frameworks are proposed in the study. The first framework uses an ANN and feeds its output to an SVM, which yields the diagnostic result; five experiments were carried out with this framework, and the highest accuracy achieved was 81.8%. The second framework employs a majority-voting ensemble that combines LGBM, SVM and LR, and achieved an accuracy of 87.9%. The frameworks were compared with other state-of-the-art solutions, and the second framework was found to be the better one.
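The majority-voting combination in the second framework can be sketched in a few lines of Python (the per-model predictions below are invented for illustration; the paper's trained LGBM/SVM/LR models are not reproduced here):

```python
from collections import Counter

def majority_vote(predictions):
    """Combine per-model class predictions sample-by-sample by majority vote."""
    combined = []
    for sample_preds in zip(*predictions):
        combined.append(Counter(sample_preds).most_common(1)[0][0])
    return combined

# hypothetical per-sample outputs of the three base classifiers (1 = diabetic)
lgbm = [1, 0, 1, 1]
svm  = [1, 0, 0, 1]
lr   = [0, 0, 1, 1]

ensemble = majority_vote([lgbm, svm, lr])
```

With an odd number of binary voters there are no ties, so the vote is always decisive; the ensemble corrects the samples where a single model disagrees with the other two.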
622 Measuring Cyber Security Awareness of Students: A Case Study at Fahad Bin Sultan University , Wejdan Aljohani; Nazar Elfadil
In this research paper the authors designed a questionnaire instrument to measure the current level of cyber security awareness (CSA) among Fahad Bin Sultan University (FBSU) students. The questionnaire was designed to fulfil the aims and objectives of the research project; its main goal is to evaluate the level of cyber security awareness among FBSU students. The instrument was adapted from several other cyber-security-awareness questionnaires. A total of 212 students participated in the survey. The findings show that students' awareness is at an average level and that there is no difference in cyber security awareness between male and female students. Furthermore, the survey results indicate that the instrument was effective in measuring students' awareness.
There is constant, ongoing research and innovation to enhance speech processing for ease of communication for all, especially people with disabilities. With recent technological advancements, voice recognition has gained massive acceptance, particularly for making communication with machines more natural and seamless. Voice technology is increasingly being applied in many electronic devices, especially phones, personal computers and web browsers, mainly through a Speech Recognition System (SRS). In this write-up, some of the models, approaches, Application Programming Interfaces (APIs) and performance metrics used in SRS are reviewed. Thereafter, a sample implementation of one of the common APIs, the JavaScript Web Speech API, is presented, noting in particular the simplicity with which speech recognition can be achieved using APIs with just a basic knowledge of a programming language.
624 Organizational Commitment , Bilal Ali Abbas Al Halboosi; Dr. Umut Inan
Based on what has been described previously, there is a clear need to study human behavior with the aim of achieving success and development for the organization. This can be done by considering many factors, including organizational commitment, a critical concept that influences organizations and can be viewed as a positive feeling by the individual towards the organization. Many studies have examined organizational commitment from different points of view. We present a group of key concepts about organizational commitment and confirm the importance and characteristics that make it a unique phenomenon, along with its dimensions, approaches, influencing variables and contributing factors. Above all, we discuss the stages through which organizational commitment reaches its required level, ways to enhance it further, and its implications for both the individual and the organization.
625 Face Recognition System Approach Based on Neural Networks and Discrete Wavelet Transform , Ibtisam Mousa Alatawi; Nazar Elfadil Mohamed
Face recognition is an attractive technology full of research challenges; it is used to recognize people from digital images. Although face recognition plays an important role in several areas such as security, the technology still faces many challenges that need to be solved with more rigorous scientific methods; one of these is the variation in the face of the same person due to lighting or pose. This project explores and investigates the use of combined hybrid algorithms based on neural networks and the discrete wavelet transform for face recognition, in order to enhance the recognition rate for faces from an identified dataset. Two techniques are used in this research: the first applies the discrete wavelet transform to improve and compress the images of the dataset; the second implements the well-known Principal Component Analysis approach. The training and testing face images are selected from the ORL database, which contains 400 images of 40 different persons with minimal pose variation. The experimental results confirm that the proposed methodology provides a feasible and effective solution for recognizing faces.
626 Job Satisfaction , Wisam Mawlood Alazzawi; Dr. Redvan Ghasemlounia
The study addresses important indicators in the development of institutions through achieving a high degree of satisfaction and loyalty among working personnel. The study problem is posed in the following questions: What is the role of the incentive system in increasing performance? What is the degree of relationship between job satisfaction and job performance? The study aims to define the concept of job satisfaction, identify its causes and the most important factors affecting the performance of employees in public and private institutions, and clarify the importance of incentives and their role in achieving job satisfaction and raising the level of performance. The study hypothesizes that the incentive system has a significant role in increasing performance and that work-environment conditions affect employee performance. It relies on the descriptive analytical method for its suitability to this type of study, and finds a statistically significant relationship between the incentive system and performance. The study recommends establishing an incentive system that ensures fair distribution among workers in institutions and improving the existing salary scale by raising the annual salary increase so that employees are satisfied with their jobs.
627 Proposal To Cope Change Resistance Using DevOps , Ahmed Osama Mansour; M. Rizwan Jameel Qureshi
This paper supports a phenomenon in which development and operations (DevOps) teams work together to deliver software at a continuous pace, eliminating the walls of confusion between stakeholders and enabling the business to seize and cope with existing and constantly emerging opportunities. The main processes are development, quality assurance, pre-operations and operations. Many tools are involved in implementing DevOps, such as Git, Jenkins, Chef, Vagrant, Docker and Hadoop. Because of the technical expertise required across this variety of tools, DevOps adoption is hindered by factors such as changes to the working norms of traditional teams, team structure and cost; the most significant factor is resistance to change by traditional teams. In this paper, we identify the personnel likely to resist change and the reasons behind the resistance, and propose a solution to overcome it. A survey is used as the research design to validate the proposed solution. The survey results support the proposed solution, indicating that it will help software companies cope with change resistance while adopting DevOps.
628 Secure Separate Bit Plane Image Processing for Distributed Video Surveillance System (DVSS/DVC) , Kien. T.V; An. B.L; Quynh. L.C
For emerging applications such as wireless video cameras, wireless low-power surveillance networks, and disposable video cameras for medical applications, DVC is useful and may be the best choice. Since the primary objective of DVC is low-complexity video encoding, the bulk of the computation is shifted to the decoder, as opposed to the low-complexity decoder in conventional video compression standards such as H.264 and MPEG. Besides the low bit rate, the transmission also has to be very secure: the complexity of the keys used should be high, and the keyspace has to be large enough to provide individual keys for a large wireless sensor network.
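The "separate bit plane" idea in the title refers to slicing each pixel into its binary planes so that planes can be processed, protected or transmitted independently. A minimal stdlib-only Python sketch of the slicing and reassembly (illustrative of the representation, not of the paper's full encryption/DVC pipeline):

```python
def bit_planes(pixels, bits=8):
    """Split a grayscale image (flat list of 0..255 values) into bit planes;
    plane 0 is the least significant."""
    return [[(p >> b) & 1 for p in pixels] for b in range(bits)]

def reassemble(planes):
    """Recombine bit planes into the original pixel values."""
    n = len(planes[0])
    return [sum(planes[b][i] << b for b in range(len(planes)))
            for i in range(n)]

pixels = [0, 255, 170, 85]      # tiny illustrative "image"
planes = bit_planes(pixels)
msb = planes[7]                 # the most significant plane carries the
                                # coarse image structure
restored = reassemble(planes)
```

Because the significant planes carry most of the visual information, a scheme can afford stronger protection on those planes than on the noisy low-order ones.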
Web services are becoming a common and convenient means of doing business over the Internet. More and more web services keep arriving on the Internet, offering the same set of services to end users, and the availability of similar web services increases the complexity of both the discovery and the selection process. The traditional way of discovering a web service involves keyword-based searching followed by manual selection, and keyword-based search is not efficient. In this paper, we use an improved mechanism for web service selection based on the website as a negotiator, since website owners are interested not only in keeping their customers but also in increasing the number of deals: effective negotiation brings more income than any competitor obtains. Improving a business involves a number of rules that sellers should obey; business rules such as negotiation, body language, time management and selling strategy are discussed thoroughly in MBA and DBA courses, whereas for websites there is little comparable guidance. In this study we introduce new rules for websites to act more effectively as negotiators. Before any negotiation, company managers should choose the best negotiator, a duty with several steps: an important step is that the negotiators should study courses on negotiation strategy; a second step is to understand the customer's position at the negotiation table and his personal behavior. Nowadays, websites are an important negotiator for any company.
To perform this mission best, the first step uses iridology and the position of the computer relative to the user's place; the second step considers astrology, not only for the company's speakers but also for the other side; the third step considers biorhythms, again for both sides; and the fourth step makes predictions from the present condition and position of the moon and other planets, which affect not only our way of thinking but also conditions on earth.
630 Modeling and Simulation of Urban Growth in the Use of Industry 4.0 , Erdal Özbay; Feyza Altunbey Özbay
Industrial growth is the positive increase of various opportunities and factors in a region over time, within the framework of certain planning. Along with developing technologies, Remote Sensing (RS), Geographic Information Systems (GIS), and simulation models are used to observe the effects of industrial growth and to generate predictions, and the findings guide planning, investment studies, and management. In this context, data obtained using RS and GIS serve as input for different simulation models, while the simulation models provide predictions for the future from information about the past and present. Various hybrid techniques are used in computer-aided industrial simulation, including Monte Carlo methods, Petri nets, reality and traffic simulation, and their combinations; the basic components and properties of an industrially strong and healthy simulation are emphasized. In this study, considering the integration of RS and GIS with simulation models for modeling industrial systems, the Von Thünen model, Concentric Zone Theory, Central Place Theory, Sector Theory, artificial neural networks, Markov chains, cellular automata, logistic regression, and the SLEUTH model are examined.
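Cellular-automaton growth models such as SLEUTH work on a grid of developed/undeveloped cells updated by neighborhood rules. The toy Python step below (an assumption-laden simplification: one rule, one threshold, no RS/GIS input) shows the basic mechanism:

```python
def grow(grid, threshold=2):
    """One cellular-automaton step: an empty cell becomes developed when at
    least `threshold` of its 8 neighbours are already developed."""
    rows, cols = len(grid), len(grid[0])
    nxt = [row[:] for row in grid]
    for r in range(rows):
        for c in range(cols):
            if grid[r][c] == 0:
                developed = sum(grid[rr][cc]
                                for rr in range(max(0, r - 1), min(rows, r + 2))
                                for cc in range(max(0, c - 1), min(cols, c + 2))
                                if (rr, cc) != (r, c))
                if developed >= threshold:
                    nxt[r][c] = 1
    return nxt

# a seed settlement in a 4x4 region (1 = developed)
seed = [
    [0, 0, 0, 0],
    [0, 1, 1, 0],
    [0, 1, 0, 0],
    [0, 0, 0, 0],
]
step1 = grow(seed)   # development spreads outward from the seed cluster
```

Real urban-growth automata replace the fixed threshold with calibrated, spatially varying transition probabilities derived from RS/GIS layers such as slope, land use and road proximity.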
631 Usage of Cloud Computing and Big data for Internet of Things , Erdal Özbay; Feyza Altunbey Özbay
The Internet of Things (IoT) has been a very popular research topic, especially over the past few years. Cloud computing and Big Data are two current technologies that increasingly attract almost everyone's attention and that we frequently encounter in our daily lives. With these technologies, the large amounts of data collected today can be stored in large storage systems built on information infrastructures, and high-level computations can be performed on them. Objects have various requirements for interacting with stored data and communicating with other devices, which presents an important opportunity in today's competitive environment. In this study, the relationships between the Internet of Things, cloud computing, and Big Data are examined. The aim is to provide general information to researchers who will conduct data-based studies on cloud systems, so that they can use the Internet of Things and the related methods and technologies effectively.
632 Cryptanalysis of a Self-Recovery Fragile Watermarking Scheme , Oussama Benrhouma; Ahmed Taleb
In this paper, we analyze the security of a self-recovery fragile watermarking scheme proposed by C. Wang et al. The scheme is designed to verify image integrity by locating tampered areas and then recovering the tampered zones. We demonstrate an attack against C. Wang et al.'s scheme: we were able to manipulate the watermarked images without being detected by the extraction procedure. The theoretical and experimental results show that the analyzed scheme is not secure against attacks.
633 User Experience of Academic Library Websites , Najwa Samrgandi
Despite ongoing technological shifts, the success of a website is anchored in the dimensions of effectiveness, perception, usefulness, satisfaction, reliability and efficiency. In this research, heuristic evaluation was used to assess the primary content elements on the homepages of academic library websites, and the heuristic findings were compared with task-based usability tests. The usability-heuristic results correlate with the task-based usability findings, supplementing the measurement of users' expectations along multiple dimensions. Culture and national origin determine to a great extent how students assess and use university library websites. Universities can achieve optimized operational performance by upholding the underlying principles of the heuristic model. This study concludes that the principles of heuristic evaluation can guide the design of an academic library website that combines efficiency and satisfaction.
634 Towards an Automated Islamic Fatwa System: Survey, Dataset and Benchmarks , Amr A. Munshi; Wesam H. AlSabban; Abdullah Tarek Farag; Omar Essam Rakha; Ahmad A. Al Sallab; Majid Alotaibi
Islam is the second largest and the fastest growing religion, and Islamic Law, Sharia, is a profound component of the day-to-day lives of Muslims. This creates a large number of queries about specific problems that require answers, or Fatwas. While the sources of Sharia are available to anyone, providing a Fatwa often requires a highly qualified person, the Mufti, who must undergo a long and sophisticated education process to become certified. With Muslims representing almost 25% of the planet's population and generating many queries, and with the demanding Mufti qualification process creating a shortage of Muftis, we have a supply-demand problem that calls for automation. This motivates the application of Artificial Intelligence (AI) to automated Islamic Fatwa. In this work, we explore the potential of AI, machine learning and deep learning, together with technologies such as Natural Language Processing (NLP), to help automate Islamic Fatwa. We start by surveying the state of the art (SoTA) in NLP and explore potential use-cases for question answering and text classification in Fatwa automation. We then present the first and major enabling component for applying AI to Islamic Fatwa: the data. We build the largest dataset for Islamic Fatwa, spanning the widely used Fatwa websites. Moreover, we present baseline systems for topic classification, topic modelling and retrieval-based question answering, to set the direction for future research and benchmarking on our dataset. Finally, we release our dataset and baselines to the public domain to help advance future research in the area.
635 Multi-Level Access Control System in Automated Teller Machines, Ismaila W. Oladimeji; Omidiora E. Olusayo; Ismaila Folasade M.; Falohun Adeleye S.
E-commerce theft involves using lost or stolen debit/credit cards, forging checks, misleading accounting practices, and similar schemes. Due to the carelessness of cardholders and the criminal activities of fraudsters, the personal identification number (PIN) and account-level fraud detection techniques are inadequate to curb fraud. In recent times, researchers have made efforts to improve cyber-security by employing biometric-trait-based security systems for authentication. This paper proposes a multi-level fraud detection system for automated teller machine (ATM) operations, comprising a PIN level, an account level, and a biometric level. A RealScan-F scanner was used to capture live fingerprints, and transactional data were generated for each fingerprint with a unique PIN. Simulation results showed that (i) classification at the account level alone yielded on average 84.3% precision, 94.5% accuracy, and a 5.25% false alarm rate; (ii) matching at the biometric level using liveness fingerprint samples yielded 0% APCER, 0% NPCER, and 100% accuracy, better than plain fingerprint samples, which produced 4.25% APCER, 2.33% NPCER, and 93.42% accuracy; (iii) combining the three levels under the condition that all levels must be positive produced 87.5% precision, 84.9% accuracy, and a 2.65% false alarm rate; and (iv) classification using a voting technique yielded 99.15% precision, 97.35% accuracy, and a 0.47% false alarm rate.
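The two fusion policies compared in the results — requiring every level to pass versus majority voting — can be sketched as follows. This is a minimal illustration, not the authors' implementation; the function names, score ranges, and thresholds are assumptions.

```python
def level_decisions(pin_ok, account_score, biometric_score,
                    account_threshold=0.5, biometric_threshold=0.9):
    """Per-level pass/fail decisions (thresholds are hypothetical)."""
    return [bool(pin_ok),
            account_score >= account_threshold,
            biometric_score >= biometric_threshold]

def strict_and(decisions):
    # "all levels must be positive" policy
    return all(decisions)

def majority_vote(decisions):
    # voting policy: approve when at least two of the three levels agree
    return sum(decisions) >= 2

# example: PIN correct, account classifier confident, biometric borderline
decisions = level_decisions(True, 0.8, 0.85)
```

Under the strict policy this transaction is rejected (the biometric level fails), while the voting policy accepts it — one way to see why the two fusion schemes report different precision and false alarm rates.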
636 Design and Implementation of a Hybrid Barangay Information Management System, Rogelio Bon. Intud, Jr.
Pursuant to Republic Act No. 6975, the Department of the Interior and Local Government Act of 1990, the National Barangay Operations Office (NBOO) is mandated to establish and update the master list of barangays, barangay officials, and barangay socio-economic profiles. In compliance, the proponents developed a Hybrid Barangay Information Management System to rapidly gather, encode, store, and maintain barangay data, resulting in a systematic manner of accumulating and retrieving the information essential to informed decisions on local governance issues. A number of problems were observed with the laborious manual methods currently used day to day, such as retrieving huge numbers of file folders of constituents' data, which delays the delivery of services and introduces inaccuracies into the completion of tasks. With the advent of modern technology, wider opportunities open for a barangay to serve its constituents better through computerization of the documents it provides, such as barangay clearance, certificate of indigency, letters of recommendation, and generated reports. Motivated by the vision of empowering this self-governing political system digitally, the proponents aimed to determine these advantages and develop a Hybrid Barangay Information Management System that hastens the transactions performed and documents provided by barangays. The system is designed to be accessed only by authorized users to ensure the integrity of all transactions. It is designed and implemented using Microsoft Visual Basic 2010 as the front end and Microsoft Access as the back end, and it includes an embedded responsive intranet web portal that can be used by barangay functionaries.
This study designed and developed a Coastal Resource Management and Fish Catch Monitoring System for the Northern Negros Aquatic Resources Management and Advisory Council (NNARMAC). It specifically examined the quality of the developed software based on McCall's Software Quality Model. It also determined the level of system acceptability based on the required application functionality, provided a fast and easy way to store raw fish catch data, and generated reports showing monthly and annual fish catch statistics. The system is connected to the NNARMAC website, which enables users to post updates and news about their coastal area. The research proceeded from a preliminary survey through initial design, and the developed system underwent evaluations, reviews, and updates. Findings revealed that the developed system passed all criteria of McCall's software quality model. It is concluded that the system is highly acceptable to end users and provides a fast and easy way to monitor fish catch data.
638 Internet Architecture: Current Limitations Leading Towards Future Internet Architecture, Maudlyn I. Victor-Ikoh; Ledisi G. Kabari
The original internet design was guided by the end-to-end principle in the early 1980s, which formed the foundation of the existing internet architectural model. The priorities of the original internet designers no longer match the needs of today's actual users; the rise of new players, demanding applications, the erosion of trust, and questions of rights and responsibilities are pushing the internet in a new direction. This paper presents the goals and principles behind the design of the original internet architecture, the resulting issues and limitations of the existing network architecture, and the approaches that are driving the future internet architecture.
639 Quality Extended Use Case Point (QUCP): An Improved Cost Estimation Method, Ibrahim Mohammad Ba’abbad; M. Rizwan Jameel Qureshi
The quality of a product is one of the major interests of the manufacturing process in all industries. The software industry structures a project into several phases to ensure the production of high-quality software. A software development company estimates the time, effort, and cost of a project during the planning phase, and accurate estimates are important to reduce the risk of project failure. Several cost estimation methods are practiced in software development companies, such as Function Points (FP), Use Case Points (UCP), Constructive Cost Model I and II, and Story Points (SP). This research takes the UCP method and aims to improve the accuracy of its estimates. UCP estimation depends on the use case diagram of the proposed system, which describes its main functional requirements. UCP only partially considers non-functional requirements, through the technical and environmental factors, and lacks consideration of quality attributes in the estimation process. This paper proposes an extended version of the existing UCP method, named Quality Extended Use Case Point (QUCP), in which quality attributes are included to improve the accuracy of cost estimation. A questionnaire was used to validate the proposed QUCP method. Data analysis found that seventy-five percent of participants agreed that the proposed method will not only improve the accuracy of cost estimation but also enable a software development company to deliver high-quality products.
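For context, the standard UCP calculation that QUCP extends combines unadjusted use case points with technical and environmental factors. The sketch below uses Karner's published coefficients; the abstract does not specify how the QUCP quality factor enters the formula, so it is not reproduced here.

```python
def ucp(uaw, uucw, tf_sum, ef_sum):
    """Standard Use Case Points (Karner).

    uaw:    unadjusted actor weight
    uucw:   unadjusted use case weight
    tf_sum: weighted sum of the 13 technical factors
    ef_sum: weighted sum of the 8 environmental factors
    """
    uucp = uaw + uucw              # unadjusted use case points
    tcf = 0.6 + 0.01 * tf_sum      # technical complexity factor
    ecf = 1.4 - 0.03 * ef_sum      # environmental complexity factor
    return uucp * tcf * ecf

# example project: 10 actor points, 90 use case points, illustrative factor sums
estimate = ucp(10, 90, 40, 20)     # 100 * 1.0 * 0.8
```

Note how higher technical factor sums inflate the estimate while a capable environment (high `ef_sum`) deflates it; a quality extension would presumably add a third adjustment factor of the same shape.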
640 Traffic Analysis Using Image Processing, Mounica B; Nithya B S; Rakshitha N; Sirisha M
Vehicle congestion on roads is increasing day by day, and managing such large traffic volumes by the traditional approach is no longer adequate. To address this problem, the project uses machine learning: a model is trained to extract the needed traffic information from images. Information extracted from image sequences populates a database of captured scenes, such as accidents, foggy locations, vehicle collisions, traffic signals, and free-flowing traffic. A new image is compared against the trained model to identify the reason for a violation or accident, and data processing determines the underlying cause. The application uses image processing methods designed and adapted to the needs and constraints of traffic analysis, and it is shown that it can reduce traffic congestion and avoid wasted time.
641 Examination of Assorted Social Engineering Attack by Different Types of Machine Learning Algorithms, Amal Alhamad; Dalal Aldablan; Raghad Albahlal
The most powerful attack on systems is the social engineering attack, because it exploits human psychology: no hardware or software alone can prevent or defend against it, so people must be trained to resist it [1]. Social engineering is mostly carried out by phone or email. This research, which builds on previous research we have conducted, aims to highlight the different social engineering attacks and how they can be prevented in social networks, where social engineering is one of the biggest threats to privacy and security. The project takes a dataset and analyzes it with the Weka tool. To evaluate defenses against these attacks, we assessed three decision tree algorithms, RandomForest, REPTree, and RandomTree, and also compared them with the J48 algorithm. The paper additionally contains a complete overview of social engineering attacks.
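The evaluation workflow — train several classifiers on the same labelled data and compare their accuracy — can be illustrated without Weka. The sketch below is a deliberately tiny pure-Python stand-in (a one-rule "stump" versus a majority-class baseline) on a made-up phishing-style feature; the actual study uses Weka's RandomForest, REPTree, RandomTree, and J48 on a real dataset.

```python
def accuracy(preds, truth):
    """Fraction of predictions that match the ground-truth labels."""
    return sum(p == t for p, t in zip(preds, truth)) / len(truth)

# hypothetical data: feature = message contains an urgency keyword,
# label = 1 for a social engineering attempt, 0 for a benign message
data = [({"urgent": 1}, 1), ({"urgent": 0}, 0),
        ({"urgent": 1}, 1), ({"urgent": 0}, 0)]

def stump(x):
    # one-rule classifier: predict attack iff the urgency feature fires
    return x["urgent"]

def majority_baseline(_x):
    # naive baseline: always predict the most common class
    return 1

truth = [y for _, y in data]
stump_acc = accuracy([stump(x) for x, _ in data], truth)
baseline_acc = accuracy([majority_baseline(x) for x, _ in data], truth)
```

Real tree learners generalize this idea: they search over many candidate feature splits and keep the ones that maximize accuracy (or information gain) on held-out data.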
642 An Operative Application of Distributed Ledger Technology for Banking Domain, Sharmila S P; Harsha Pandit Moger
Banking systems have used centralised networks for many years, and any attack on the central unit puts a large amount of banking data at risk. To avoid this, blockchain is a more appropriate approach for holding large volumes of data safely, because it is a decentralised network. The purpose of this paper is to provide a better understanding of using blockchain in the banking sector by outlining the opportunities, benefits, and challenges of the technology. We propose a blockchain-based banking technique that makes operations simpler, safer, and more transparent. Restrictions such as a transaction amount limit, a daily limit, or a credit limit can also be imposed according to banking rules; through these restrictions, hacking attempts and misuse can be prevented. This work would benefit the banking sector and shape the direction of finance.
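The idea of hash-linked blocks with rule-based transaction limits can be sketched as follows. This is a minimal illustration under assumed names (`Chain`, `TRANSACTION_LIMIT`), not a production banking ledger: real deployments add consensus, signatures, and distributed storage.

```python
import hashlib
import json

TRANSACTION_LIMIT = 10_000  # hypothetical per-transaction cap set by bank rules

class Chain:
    """Append-only list of blocks, each linked to the previous block's hash."""

    def __init__(self):
        self.blocks = [{"index": 0, "prev": "0" * 64, "tx": None}]  # genesis

    def _hash(self, block):
        return hashlib.sha256(
            json.dumps(block, sort_keys=True).encode()).hexdigest()

    def add_transaction(self, sender, receiver, amount):
        if amount > TRANSACTION_LIMIT:
            raise ValueError("transaction exceeds the configured limit")
        prev = self.blocks[-1]
        block = {"index": prev["index"] + 1,
                 "prev": self._hash(prev),
                 "tx": {"from": sender, "to": receiver, "amount": amount}}
        self.blocks.append(block)
        return block
```

Because each block embeds the previous block's hash, tampering with any earlier transaction invalidates every later link — the property that makes a decentralised ledger hard to alter.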
643 Information System in Booking Lightweight Steel Based on Web, Rayan Soqati
The development of information technology today requires all activities to run well: almost all agencies and companies already use a variety of applications to support their work and ease daily operations. Similarly, PT. XYZ, a lightweight steel roofing distributor in the West Jakarta region, wants to provide the best ordering services for its customers and business partners. As an emerging distributor of lightweight steel roofing, PT. XYZ makes use of a web-based application system as the means of ordering lightweight steel roofs. This research designs a web-based ordering application using the OOAD (Object-Oriented Analysis and Design) method, with modelling based on the object-oriented UML (Unified Modelling Language) approach. The intended result is a web-based ordering application that makes it convenient for customers to order lightweight steel roofs and that also serves as a marketing tool for lightweight steel roofs at PT. XYZ.
644 Edge Computing and Its Convergence with Blockchain in 6G: Security Challenges, Alex Mathew
Even though the 5G wireless network has not yet been investigated exhaustively, visions of sixth-generation (6G) ecosystems are already being debated. To solidify and consolidate privacy and security within 6G networks, this paper examines edge computing and its convergence with blockchain in 6G and the associated security challenges. The paper examines how security might affect 6G wireless systems, the potential obstacles characterizing various 6G technologies, and possible remedies. It unveils the 6G security vision alongside key performance indicators and a tentative threat landscape premised upon the predicted 6G infrastructure. A discussion of the privacy and security challenges that might emerge from envisioned 6G applications and demands is presented, and the paper sheds light on research-level projects and standardization efforts. Specific attention is given to security considerations for 6G-enabling technologies, including quantum computing, visible light communication (VLC), distributed ML/AI, physical layer security, and distributed ledger technology (DLT). Overall, this paper seeks to guide the subsequent investigation of 6G privacy and security at this early stage, from envisioning to practicality.
The agriculture field plays a vital role in the development of a smart India. To raise economic output, the production of fruits, crops, and vegetables can use computer-aided detection (CAD) techniques built on image processing tools. Identifying diseases in fruits is a challenging image processing task, which can be done through continuous monitoring of visual photos or videos. Automated image processing research helps control the use of pesticides on fruits and vegetables. In this paper we focus on detecting tomato diseases at an early stage. The proposed system shows how algorithms such as color thresholding segmentation and K-means clustering are used, and demonstrates that K-means clustering outperforms the RGB color thresholding method for detecting tomato diseases at the beginning stage.
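K-means colour clustering, one of the two segmentation methods compared, can be sketched as follows. This is a minimal Lloyd's-algorithm illustration; the pixel values and the interpretation of the two clusters (healthy skin versus lesion) are invented for the example, not taken from the paper.

```python
def dist2(p, q):
    """Squared Euclidean distance between two RGB pixels."""
    return sum((a - b) ** 2 for a, b in zip(p, q))

def kmeans(pixels, centers, iters=10):
    """Lloyd's algorithm: assign pixels to the nearest center, then recenter."""
    labels = [0] * len(pixels)
    for _ in range(iters):
        labels = [min(range(len(centers)), key=lambda j: dist2(p, centers[j]))
                  for p in pixels]
        for j in range(len(centers)):
            members = [p for p, l in zip(pixels, labels) if l == j]
            if members:
                centers[j] = [sum(c) / len(members) for c in zip(*members)]
    return labels, centers

# toy "image": red-ish pixels vs. brown-ish pixels (illustrative values only)
pixels = [(200, 30, 30), (210, 25, 35), (90, 60, 20), (85, 55, 25)]
labels, centers = kmeans(pixels, centers=[[200, 30, 30], [90, 60, 20]])
```

Each label partitions the image into colour regions; the diseased region is then the cluster whose center is closest to the lesion colour, which is why clustering can outperform a single fixed RGB threshold.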
646 A Conceptual Framework for Minimizing Peak Load Electricity using Internet of Things, Amira Hassan Abed; Mona Nasr; Laila Abd Elhamid
Electricity load demand varies frequently throughout the day, and meeting time-varying demand, particularly at peak times, is a big challenge for electric utilities. Persistent growth in peak load increases the prospect of power failure and raises the marginal cost of supplying electricity. Balancing the production and consumption of electricity, i.e., addressing peak load, has therefore become a key concern of utilities. Most previous work has focused on shaving or shifting peak load to solve energy scarcity. In this study, we introduce four significant technologies and techniques for achieving peak load shaving: the Internet of Things (IoT) in energy systems, on-site generation systems (renewable energy resources), Demand Side Management (DSM) applications of the control center, and Energy Storage Systems (ESSs). The impact of these four major methods on peak load shaving in the grid is discussed in detail. Finally, we suggest a conceptual framework as a guiding tool illustrating the presented shave/shift peak load technologies in energy systems.
647 Innovative IOT Covid-19 Monitoring System to Ameliorate Medical Professionals, Hannah Alex
Special quarantine centers set up to handle COVID-19 patients have experienced an overflow of patients as cases of the infectious disease keep rising. Doctors assigned to these quarantine centers have had a difficult time keeping track of the health conditions of the patients in quarantine, and their interactions with patients put them at increased risk of infection. To enable health workers to monitor quarantined patients efficiently and reduce in-service infections, this study proposes an innovative IoT-based health monitoring system, built on the IOT Gecko platform, able to remotely monitor patients' health and send automated reports to doctors over an internet connection. The proposed system will be equipped with a heartbeat sensor, a temperature sensor, and a blood pressure sensor to track the respective health conditions of the patients. If successfully designed and implemented, the system will enable doctors to remotely monitor a patient's heartbeat, temperature, and blood pressure, reducing the risk of infection and increasing the number of patients a single doctor can monitor at a time.
648 Deep Reinforcement Learning for Cybersecurity Applications, Alex Mathew
There has been rapid growth in the number of devices connected to the internet over the last decade for various Internet of Things (IoT) applications. The increase in these smart devices has posed a great security concern for the IoT ecosystem, which must be protected from such threats. Cybersecurity professionals have proposed reinforcement learning (RL) to provide the security tools needed to secure IoT systems, since an RL agent can interact with its environment and learn how to detect threats. This paper presents comprehensive research on cybersecurity threats to IoT system applications, and RL algorithms are presented for understanding attacks on the IoT. Reinforcement learning is widely employed in cybersecurity because it learns from its own experience by exploring and exploiting an unknown environment, which enables it to solve many complex problems. The paper also explores RL's capabilities in dealing with cybercrime challenges.
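The learn-by-interaction loop that makes RL attractive for threat detection can be shown with a deliberately tiny example: a one-step ("bandit-style") Q-learning agent that learns to block attack traffic and allow benign traffic. The states, actions, and rewards are invented for illustration; deep RL replaces the Q table with a neural network.

```python
import random

ACTIONS = ["allow", "block"]
STATES = ["attack", "benign"]

def reward(state, action):
    # +1 for blocking an attack or allowing benign traffic, -1 otherwise
    return 1 if (state == "attack") == (action == "block") else -1

def train(episodes=500, alpha=0.5, epsilon=0.2, seed=0):
    """Epsilon-greedy Q-learning on a one-step environment."""
    rng = random.Random(seed)
    q = {(s, a): 0.0 for s in STATES for a in ACTIONS}
    for _ in range(episodes):
        s = rng.choice(STATES)
        if rng.random() < epsilon:   # explore: try a random action
            a = rng.choice(ACTIONS)
        else:                        # exploit: use the current estimate
            a = max(ACTIONS, key=lambda x: q[(s, x)])
        q[(s, a)] += alpha * (reward(s, a) - q[(s, a)])
    return q

q = train()
policy = {s: max(ACTIONS, key=lambda a: q[(s, a)]) for s in STATES}
```

After training, the greedy policy blocks attacks and allows benign traffic without ever being told the rule explicitly — it was discovered purely from reward feedback, which is the core appeal of RL for evolving threats.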
649 A Review on Image Segmentation Techniques and Its Recent Applications, Iqraa Qasim
Image segmentation is the technique of attaching a label to each element of a picture so that elements with the same label share certain visual qualities (pixels, colours, values, patterns, and other image features). The output of image segmentation is a series of fragments or regions that collectively cover the full picture. After segmentation, classification detects classes in an image on the basis of similarities and differences between the detected segments; together, these two tools greatly improve object detection in images. A big world produces big data, and this big data needs robust algorithms for understanding images without a human. This paper summarises image segmentation and classification techniques by their efficiency, ending with a comparison table showing which performs best.
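As a concrete instance of the thresholding family such surveys cover, Otsu's method picks the grey-level threshold that maximises between-class variance. The sketch below is a straightforward histogram implementation run on an invented bimodal "image".

```python
def otsu_threshold(gray):
    """Threshold (0-255) maximizing between-class variance over grey levels."""
    hist = [0] * 256
    for v in gray:
        hist[v] += 1
    total = len(gray)
    sum_all = sum(i * h for i, h in enumerate(hist))
    best_t, best_var = 0, -1.0
    w_b = sum_b = 0              # background weight and intensity sum so far
    for t in range(256):
        w_b += hist[t]
        if w_b == 0:
            continue
        w_f = total - w_b        # foreground weight
        if w_f == 0:
            break
        sum_b += t * hist[t]
        mean_b = sum_b / w_b
        mean_f = (sum_all - sum_b) / w_f
        between = w_b * w_f * (mean_b - mean_f) ** 2
        if between > best_var:
            best_var, best_t = between, t
    return best_t

# toy bimodal "image": dark background and bright object pixels
gray = [10, 12, 11, 13] * 5 + [200, 210, 205, 198] * 5
t = otsu_threshold(gray)
mask = [v > t for v in gray]     # foreground segmentation mask
```

The resulting binary mask is exactly the kind of segment that a downstream classifier then labels, which is the segmentation-then-classification pipeline the review describes.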
The study entitled "Ripening Behavior of Carabao Mango Fruits (Mangifera indica) Treated with Climacteric Fruits as Ripening Stimulants" was conducted at the Surigao State College of Technology – Mainit Campus Crop Science Laboratory (Crop Processing) over a period of seven days, from June 18 to June 25, 2015. The study was undertaken to determine the effects of selected climacteric fruits on the ripening behavior of Carabao mango fruits. This experimental research was laid out in a Completely Randomized Design (CRD) consisting of six treatments replicated four times: T1 (control), T2 (750 g ripe Cavendish banana), T3 (750 g ripe tomato), T4 (750 g ripe mango), T5 (750 g ripe avocado), and T6 (750 g ripe honeydew melon). The parameters assessed were the number of days to ripen, ripening percentage, fruit color, weight loss, and taste quality. Data were analyzed using the mean and Analysis of Variance (ANOVA) for a CRD to determine whether the resulting means varied significantly. The ANOVA revealed that treatment effects were highly significant for the number of days to ripen, weight loss, and taste quality. For results showing a highly significant difference, Duncan's Multiple Range Test (DMRT) was employed to find out which treatments varied significantly. Results showed that T6 (750 g ripe honeydew melon) gave the best mean for the number of days to ripen and for weight loss. For ripening percentage, T2 (750 g ripe Cavendish banana) and T5 (750 g ripe avocado) registered the highest mean; T2 also revealed the highest mean for fruit color, while T3 (750 g ripe tomato) and T5 obtained the highest mean for taste quality. It is therefore concluded that the use of climacteric fruits affects the ripening behavior of Carabao mango fruits.
651 Paperless Enrollment System: Functionality and Credibility as an Online Platform, Niño V. Hayagan
Dozens of universities and colleges already had online platforms long before the pandemic: they had online enrollment systems and electronic learning for online classes, but these were a low priority because the usual way was to go to the school premises and process enrollment physically. Face-to-face classes were likewise the focus of every school organization, since they promote the personal interaction essential to developing and boosting confidence among students. This usual way of learning required immediate change and an effective solution during the pandemic, which confined everybody to their houses and stopped almost everything done outside the home; a rapid fall in every country's economy and widespread food scarcity are among the main problems faced. Online processes are the key to complying with the safety protocols being implemented. The online enrollment system needs to be prioritized and improved, as students cannot afford to discontinue or skip their remaining semesters and subjects in school. Creating the online system is a huge advantage for parents, students, and the school itself: as an online platform, it can cater to all incoming students, current students, and returnees, easing the hassle of filing documents for enrollment.
652 Barangay Integrated Management System with Mobile Support, Jeffred P. Lim
Due to the prevailing pandemic, public demands must be managed with utmost precaution while maintaining better and faster services to the people. The purpose of this study is to improve the public services of the barangay office and barangay health center by developing a system that centralizes data from both offices, manages barangay public information, filters constituents' profiles by requests, complaints, and health services availed, and displays the inventory of medical and office supplies. Moreover, face-to-face transactions are limited through a mobile application that allows registered users to set appointments and file complaints. Based on a thorough evaluation by experts and respondents, the Barangay Integrated Management System with Mobile Support is highly usable, secure, and efficient, and provides a fast and easy way to manage residents' profiles, public information, supplies, complaints, appointments, and medical transactions, and to generate significant reports. The study shows that the barangay office and the barangay health center can benefit greatly from using the developed system to provide better and faster services while limiting face-to-face transactions through the mobile application for registering complaints and setting residents' appointments.