Similar Journals
International Journal of Soft Computing and Software Engineering
Number of Followers: 14. ISSN (Print): 2251-7545. Published by Advance Academic Publisher.
- Phase Shift and Code Hopping Spread Spectrum Physical Layer Design and
Implementation
Authors: Saifuldeen Abdulameer Mohammed
Abstract: Over the last four years, communication security has become an obsession, even a nightmare, for companies, countries, and individuals, compounded by the need for high-speed messaging (high bit rates) driven by the growth of social networking, media, IoT, visualization, and cloud computing. Conferences and workshops have been held to gather new ideas that are applicable, quick to implement, and require little or no synchronization, but the focus has been almost entirely on data encryption and key exchange, and the physical layer has received little attention; researchers in companies and governments forget that the most effective way to break encryption is to collect data by eavesdropping on the well-known physical layers, electromagnetic waveforms, and protocols still in use today. For the fifth generation (5G), a proposed system integrates DS-CDMA (multi-user) and OFDM (multi-carrier) into MC-CDMA, delivered through multiple antennas, especially from the main base stations to the end users. Recent research has provided a new way of implementing DS-CDMA in which constellation points are transmitted by the IFFT using several orthogonal codes. This paper presents a new technique, Phase Shift Hopping and Code Hopping, that enhances security further in a very simple way. Finally, we show how Phase Shift Hopping Spread Spectrum (PSHSS) and Code Hopping Spread Spectrum (CHSS) work without any significant change to the system, the bit rate, or the bit error rate; all results were tested in MATLAB R2014b using 8 users in Code Division Multiple Access (CDMA).
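A minimal sketch of the code-hopping idea for a single user of a DS-CDMA system with 8 orthogonal Walsh codes (not the authors' PSHSS/CHSS implementation, which also hops the phase): each symbol interval, transmitter and receiver switch to a new orthogonal code according to a shared hopping pattern, so an eavesdropper who does not know the pattern cannot despread the signal. The names `walsh_codes` and `hop_pattern` are illustrative assumptions.

```python
import numpy as np

def walsh_codes(n):
    """Generate an n x n Hadamard/Walsh matrix (n must be a power of two)."""
    h = np.array([[1]])
    while h.shape[0] < n:
        h = np.block([[h, h], [h, -h]])
    return h

def spread_with_hopping(bits, codes, hop_pattern):
    """Spread +/-1 bits, using a different orthogonal code for each symbol."""
    chips = []
    for i, b in enumerate(bits):
        code = codes[hop_pattern[i % len(hop_pattern)]]
        chips.append(b * code)
    return np.concatenate(chips)

def despread_with_hopping(chips, codes, hop_pattern):
    """Recover the bits; this only works if the receiver shares the hopping pattern."""
    n = codes.shape[1]
    symbols = chips.reshape(-1, n)
    bits = []
    for i, s in enumerate(symbols):
        code = codes[hop_pattern[i % len(hop_pattern)]]
        bits.append(np.sign(s @ code))
    return np.array(bits, dtype=int)

if __name__ == "__main__":
    codes = walsh_codes(8)                      # 8 orthogonal codes, as for 8 users
    rng = np.random.default_rng(0)
    hop_pattern = rng.permutation(8)            # shared secret hopping sequence
    bits = rng.choice([-1, 1], size=16)
    rx = despread_with_hopping(spread_with_hopping(bits, codes, hop_pattern),
                               codes, hop_pattern)
    assert np.array_equal(rx, bits)
```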
- Performance Evaluation of Software Projects using Criteria Importance
Through Inter-criteria Correlation Technique
Authors: Madan Kumaraswamy; Ramaswamy R
Abstract: Software project management problems are highly complex, multi-dimensional, and not always objective in nature, and performance evaluation of software projects is a multi-criteria decision. This paper demonstrates the applicability of a multi-criteria decision-making approach to software project management decision problems. A multi-criteria decision model helps project managers identify the decision criteria that carry the most information, identify the criteria with the most conflicting information, and rank projects on their performance over the correlated criteria. A field study on the performance evaluation of software development projects, based on three criteria (project complexity, project team size, and actual effort to complete the project), illustrates the multi-criteria project performance evaluation problem. The paper corroborates the finding in the literature that project complexity is the criterion that transmits the most information and therefore carries the highest importance in project evaluation decision making. The project performance scores computed from the criteria weights are then used to rank the projects by their relative performance on the correlated criteria.
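A short sketch of the standard CRITIC weighting step named in the title (not necessarily the authors' exact formulation): each criterion's weight combines its contrast intensity (standard deviation of the normalized scores) with its conflict with the other criteria (one minus the pairwise correlations). The sample project data are made up, and normalization assumes "larger is better".

```python
import numpy as np

def critic_weights(scores):
    """CRITIC weights from an (n_projects x n_criteria) matrix of raw values."""
    mn, mx = scores.min(axis=0), scores.max(axis=0)
    x = (scores - mn) / (mx - mn)          # normalize each criterion to [0, 1]
    sigma = x.std(axis=0, ddof=1)          # contrast intensity per criterion
    r = np.corrcoef(x, rowvar=False)       # inter-criteria correlations
    conflict = (1.0 - r).sum(axis=0)       # total conflict per criterion
    c = sigma * conflict                   # information carried by the criterion
    return c / c.sum()                     # normalized weights

# Hypothetical example: 5 projects scored on complexity, team size, effort.
scores = np.array([[0.8, 12, 340],
                   [0.4,  6, 120],
                   [0.9, 20, 800],
                   [0.3,  5,  90],
                   [0.6, 10, 260]], dtype=float)
w = critic_weights(scores)
ranking = (scores / scores.max(axis=0) @ w).argsort()[::-1]
print("weights:", w, "ranking (best first):", ranking)
```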
- Impacts of MAC Layer on LANDY Routing Protocol Performance
Authors: ADAM MACINTOSH
Abstract: The aim of this paper is to investigate the impact of the MAC layer on our proposed MANET routing protocol, LANDY, and to assess how data load affects routing protocol performance under four different MAC layer environments. In this study, LANDY, DSR, GPSR, and OLSR are used as the routing protocols (position-based, reactive, and proactive) with default settings. The OPNET simulator was used to design and build a unified simulation environment in which to evaluate the performance of the different protocols proposed in the IETF under different scenarios.
- Researching the Development of the Electrical Power System Using
Systemically Evolutionary Algorithm
Authors: Jerzy Tchorzewski; Emil Chyzy
Abstract: The paper presents the concept and the results of research concerning an evolutionary algorithm identified on the basis of systems control theory, called the Systemically Evolutionary Algorithm (SEA). Special attention is paid to two elements of evolutionary algorithms that have not yet been fully solved, namely the method used to create the initial population and the method of constructing the robustness (fitness) function. The other elements of the SEA, i.e. cross-over, mutation, selection, etc., are also defined from a systemic point of view. Computational experiments were conducted on a selected subsystem of the Polish Electrical Power System using three programming languages: Java, C++ and Matlab. Selected comparative results for the SEA in the different implementations are also presented.
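A generic, hedged sketch of the evolutionary-algorithm loop the abstract refers to (initial population, fitness, selection, cross-over, mutation); it does not reproduce the authors' systemic definitions, and the fitness function and chromosome encoding here are placeholders.

```python
import random

def evolve(fitness, init_individual, pop_size=50, generations=100,
           cx_rate=0.8, mut_rate=0.1):
    """Minimal generational evolutionary algorithm (maximizes `fitness`)."""
    population = [init_individual() for _ in range(pop_size)]
    for _ in range(generations):
        scored = sorted(population, key=fitness, reverse=True)
        parents = scored[: pop_size // 2]             # truncation selection
        children = []
        while len(children) < pop_size:
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, len(a))         # one-point cross-over
            child = a[:cut] + b[cut:] if random.random() < cx_rate else a[:]
            if random.random() < mut_rate:            # point mutation
                i = random.randrange(len(child))
                child[i] = random.uniform(0, 1)
            children.append(child)
        population = children
    return max(population, key=fitness)

# Placeholder fitness: maximize the sum of a 10-gene real-valued chromosome.
best = evolve(fitness=sum,
              init_individual=lambda: [random.random() for _ in range(10)])
print(best, sum(best))
```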
- The Impact of EHR on the Quality of Care
Authors: Arshia Khan
Abstract: Provision of consistent, high-quality health care is lacking in the United States, and there is significant evidence identifying the need for improvement. The implementation of health information technology has been proposed as a strategy to reduce costs and improve the quality of care, and researchers have found that it benefits the healthcare management of patients by improving the quality of care provided. Additionally, there is evidence that critical access hospitals lag in quality of care and have higher mortality (i.e., death) rates than urban hospitals. A study identified a gap in knowledge about quality of care with respect to critical access hospitals.
- The Robotic System Achievement of Rugged Mobile Non-Intrusive Imaging
Inspection System
Authors: Carol Niznik
Abstract: The Lagrangian Deformable Tensor Software (LDTS) Protocol is a mathematical technique that addresses many of the new sensor imaging concerns raised by terrorist threats since 9/11. The goals of a rugged mobile non-intrusive imaging inspection system for vehicle interiors and cargo areas require prediction of motion, weight, and sensitivity perturbation in a three-dimensional perspective, and must be modeled by a Lagrangian Deformable Tensor equation within the LDTS Protocol for the detection of explosives and biological and chemical agents in containers in concrete and other construction materials. The LDTS Protocol contains the mathematical complexity needed to interpret the imagery content of containers for Global Information Grid (GIG)/CIA simulation, intelligence interpretation, and response to protect humanity from terrorist threats from single or multiple IED bombers via a Robotic YOYO Device for bomb disposal: a portable, handheld device capable of imaging a briefcase-sized object for interior explosive detection in one pass, with single-sided imaging capability and the high resolution provided by the LDTS Protocol. An additional Heart Protocol Geometric Software Structure links the LDTS Geometric Software Structure to the two sensor hardware geometric structures, i.e. the Overlayed Circular Sensor Structure and the Cross Sensor Structure. The basic Robotic YOYO model enables image sensing in place, using the singing property of the YOYO, from the sensors in the periphery of the Overlayed Circular Sensor Structure shaped device with overlayed larger-diameter sections, and from the Cross Sensor Structure with circular sections containing sensors at the edge of each arm. This sensing property enables placement of the Robotic YOYO System on the floor of a vehicle or chamber to track a terrorist suspect, with either wired or wireless communication with other robotic devices or security personnel. The Robotic YOYO System, which uses the YOYO singing property to establish the length of the sensing time correlated to each terrorist threat, is an evidence-based knowledge tool that enhances the administration of justice and public safety by guiding policy and practice.
- Caching Real-Time Optimal Strategy Visualization Interval Protocol
Development And Performance Analysis With Distributed Tactical Intrusion
Detection
Authors: Carol Niznik
Abstract: This paper presents a Distributed Tactical Intrusion Detection Protocol with a virtual data reduction capability via a caching virtual memory. The basic conceptual software principle is the overlay of the cache virtual memory on the six software algorithms comprising the Real Time Visualization Interval (RTVI) Protocol: (1) Real-Time, (2) Gray Scale and Binary Image Retrieval, (3) Risk Analysis for Edge Detection, (4) Risk Analysis for Image Detection, (5) Minimax Edge Detection for Image Reconstruction, and (6) Clustering. The cache virtual memory is implemented in the Chinese Checker Geometric Software Structure (CCGSS) for the RTVI Protocol as sectors replacing the Chinese Checker marble locations within the six star facets and the central hexagonal structure. Within each of the algorithms, the cache structures are related to the storage required to process them. It is noted that the clustering algorithm utilizes a hexagonal structure for its placement as sections of the cluster. The six star facets of the CCGSS can be processed by the cache virtual memory in sequence to execute surveillance of multiple-source attack information, realizing the Optimal Strategy Visualization Interval. The Optimal Strategy Visualization Interval is generated theoretically using the concept of invariant imbedding of an optimization-theory equation as a kernel in another equation; here, the Games of Timing optimal strategy and optimal interval equation overlay contains, as its kernel, the optimization equation of the six-facets equation for the RTVI Protocol.
- Tactical Ballistic Missile (TBM) Composite Tracking Protocols For Single
Integrated Air Picture (SIAP) Optimized Attributes Risk Constrained By
Discrimination, Classification, Architecture And Data Registration, Army
System Of Systems (ASoS)
Authors: Carol Niznik
Abstract: Alternative composite tracking protocols for a composite track file will be achieved by the theoretical performance evaluation of two classes of software theory protocols on a network-centric topology. The network-centric topology for this evaluation of the two protocol classes, the Composite Network Centric Fractal/Graphic (CTNCF/G) Protocol and the Composite Network Centric Holographic (CTNCH) Protocol, will be comprised of a STAR topology with a simulation facility at the center, i.e. the Army GIG through the CIA, and multiple STAR facets comprised of gateway software linking the simulation facility and the sensors. The optimization of the Single Integrated Air Picture (SIAP) attributes under the following constraints, namely the data registration concepts, the Army Integrated Air and Missile Defense (AlAMO) System of Systems (SoS) architecture, and the discrimination and classification parameters, will be reflected, along with perturbation sensitivity theory, in the two theoretical software protocol classes representing solutions for the imaging composite track files. The key technical and programmatic risks are developed for the composite tracking capability.
- Development, Implementation And Performance Evaluation Of The Optimal
Anti-Worm Detection Defense Software (OAWDDS) Protocol
Authors: Carol Niznik
Abstract: Computer worms, known as malicious code, are programs that replicate without infecting other programs; some worms spread by copying themselves from disk to disk or from computer to computer across a network, while others replicate in memory to slow the computer down. A worm has five components, and individual worm nodes can be linked in a communication network to build a larger worm network with one of five topologies: Hierarchical Tree, Centrally Connected, Shockwave-Rider-Type, Hierarchical Tree with Several Layers of Authority and Many Centralized Nodes, and Full Mesh. The Optimal Anti-Worm Detection Defense Software (OAWDDS) Protocol realizes three optimization constraints from the mathematical formalization of a worm: the five components, the five topologies, and five structures. Queueing-system modeling of the exponential worm growth characteristics with GI/G/1, M/M/1, and E2/M/1 queueing parameters within the three constraints, together with computer network congestion control between possible worm hosts and worm nodes, will ensure 100% elimination of the worm network. The analytical Lagrangian optimization in the OAWDDS Protocol of the proportion of vulnerable machines compromised, under the three constraints, will enforce the following three criteria: (A) stopping worm attacks 100% of the time, (B) providing 100% protection for a given system without knowing any of the signatures of the individual worms, and (C) maintaining 100% effectiveness without periodic updates like virus protection. These three criteria will help secure the critical government, public, and private infrastructure systems required to maintain the national and economic security of the U.S.
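The abstract models worm growth with GI/G/1, M/M/1, and E2/M/1 queues; as a small, grounded reminder of the simplest of these, here are the textbook M/M/1 utilization and delay results (this is standard queueing theory, not the OAWDDS optimization itself, and the traffic figures are hypothetical).

```python
def mm1_metrics(arrival_rate, service_rate):
    """Standard M/M/1 results: utilization, mean number in system, mean delay."""
    if arrival_rate >= service_rate:
        raise ValueError("M/M/1 is unstable unless arrival_rate < service_rate")
    rho = arrival_rate / service_rate          # utilization
    L = rho / (1.0 - rho)                      # mean number in system
    W = 1.0 / (service_rate - arrival_rate)    # mean time in system (Little's law)
    return rho, L, W

# Hypothetical worm-scan traffic: 40 probes/s arriving, node handles 50 probes/s.
print(mm1_metrics(40.0, 50.0))   # (0.8, 4.0, 0.1)
```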
- A Study for Regression Testing Techniques and Tools
Authors: Passant Kandil; Sherin Mousse, Nagwa Badr
Abstract: Regression testing is part of the software testing activity, which is an important activity of the software development life cycle and the maintenance process. It is carried out to ensure that fixes or enhancement changes do not affect previously working functionality. Regression testing is mostly done by re-running existing test cases against the modified code to determine whether the changes break anything. This requires a lot of cost and time, which increases as the size and complexity of the software increase. Instead of re-running all the test cases, a number of different approaches have been studied to solve regression-testing problems. There has been an explosion in the use of data mining techniques for exploring and analysing large quantities of data in order to discover meaningful patterns and rules, and data mining models have been introduced in software testing to design a minimal set of regression tests. This helps solve regression testing problems in large-scale systems, which are usually accompanied by thousands of test cases, where it is considered impossible to re-run all of them each time a system update is applied; data mining is therefore investigated to handle such cases. In this paper, we survey the different techniques proposed to solve regression testing problems, conducting a comprehensive study for analysis and evaluation. We also discuss the regression testing tools available on the market. Finally, we present our proposed approach for regression testing using data mining techniques. The main advantage of this new approach is that it can be applied to large-scale systems with thousands of test cases. The proposed regression testing algorithm considers time and cost constraints with no human intervention.
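A hedged sketch of one common data-mining route to a reduced regression suite (not the paper's specific algorithm): cluster test cases by their code-coverage vectors and keep one representative per cluster, so the smaller suite still exercises distinct behaviours. The coverage matrix and cluster count are illustrative.

```python
import numpy as np
from sklearn.cluster import KMeans

def select_regression_suite(coverage, n_select):
    """coverage: (n_tests x n_code_units) binary matrix; returns selected test indices."""
    km = KMeans(n_clusters=n_select, n_init=10, random_state=0).fit(coverage)
    selected = []
    for c in range(n_select):
        members = np.where(km.labels_ == c)[0]
        # Representative = cluster member closest to the cluster centroid.
        d = np.linalg.norm(coverage[members] - km.cluster_centers_[c], axis=1)
        selected.append(int(members[d.argmin()]))
    return sorted(selected)

# Hypothetical coverage data: 8 test cases over 6 code units.
cov = np.array([[1, 1, 0, 0, 0, 0], [1, 1, 1, 0, 0, 0],
                [0, 0, 1, 1, 0, 0], [0, 0, 1, 1, 1, 0],
                [0, 0, 0, 0, 1, 1], [1, 0, 0, 0, 1, 1],
                [0, 1, 1, 0, 0, 1], [1, 1, 0, 0, 0, 1]])
print(select_regression_suite(cov, n_select=3))
```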
- Designing a Data Warehouse from OWL sources
Authors: Yassine Laadidi; Mohamed Bahaj
Abstract: The Semantic Web is the extension of the current Web that makes data "understandable" by computers in order to construct one global source of information; as a result, a huge quantity of semantic data is available. RDF (Resource Description Framework) is a standard data model used and designed to describe information on the Semantic Web. The Web Ontology Language (OWL) is the standard language used to describe semantic relationships and allows us to specify far more about properties and classes. A data warehouse, as a dimensional schema, is designed to change and grow over time in response to business needs. This paper describes our approach to defining a dimensional fact model from OWL ontology sources. The method treats a complex ontology structure in two parts: first, a simplification process that cleans up the ontology and focuses on the important concepts and needed data; second, the construction of the dimensional fact model from the OWL structure resulting from the first part.
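A small, hedged sketch of the kind of OWL inspection such an approach starts from (not the authors' method): using rdflib to list OWL classes and the datatype properties declared on them, as rough candidates for dimensions and attributes of a dimensional fact model. The file name `sales.owl` is hypothetical.

```python
from rdflib import Graph
from rdflib.namespace import RDF, RDFS, OWL

def sketch_dimensional_candidates(owl_path):
    """List OWL classes and their datatype properties as rough candidates
    for dimensions/attributes of a dimensional fact model."""
    g = Graph()
    g.parse(owl_path)  # rdflib infers the RDF serialization from the file
    candidates = {}
    for cls in g.subjects(RDF.type, OWL.Class):
        candidates[cls] = []
    for prop in g.subjects(RDF.type, OWL.DatatypeProperty):
        domain = g.value(prop, RDFS.domain)
        if domain in candidates:
            candidates[domain].append(prop)
    return candidates

# Usage (hypothetical ontology file):
# for cls, props in sketch_dimensional_candidates("sales.owl").items():
#     print(cls, "->", props)
```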
- Model Based Refinement and the Design of Retrenchments
Authors: Richard Banach
Abstract: The ingredients of typical methodologies for model based development via refinement are re-examined, and some well-known frameworks are reviewed, drawing out commonalities and differences. It is observed that the ingredients of these formalisms can frequently be ‘mixed and matched’ much more freely than is often imagined, resulting in semantic variations on the original formulations. It is also noted that similar alterations in the semantics of specific formalisms have taken place de facto due to applications pressures and for other reasons. This analysis suggests prioritising some criteria and proof obligations over others within this family of methods. These insights are used to construct a foundation for the design of notions of retrenchment appropriate for, and complementary to, given notions of refinement. The notions of retrenchment thus derived for the specific refinement formalisms examined earlier, namely Z, B, Event-B, ASM, VDM, RAISE, IO-automata and TLA+, are presented, and within the criteria given, all turn out to be very similar.
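As background for the proof obligations the abstract discusses, a hedged sketch of the standard forward-simulation (operation correctness) refinement obligation, in roughly the B/Event-B style; the retrenchment notions the paper derives weaken this with within/concedes relations, which are not shown here.

```latex
% Abstract operation AOp over state u, concrete operation COp over state v,
% retrieve (gluing) relation R(u, v).
\[
  \forall u, v, v'.\; R(u,v) \wedge \mathit{COp}(v,v')
    \;\Rightarrow\; \exists u'.\; \mathit{AOp}(u,u') \wedge R(u',v')
\]
```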
- An Improved feature selection based on neighborhood positive approximation
rough set in document classification
Authors: Leena Patil; Mohammed Atique
Abstract: Feature selection is a challenging problem in the fields of machine learning, pattern recognition, and data mining, and feature subset selection has become an important preprocessing step in data mining. In rough set theory, the feature selection problem, called attribute reduction, aims to retain the discriminatory power of the original features. A large number of features is a problem in text categorization: many features are noisy, redundant, or irrelevant, can mislead the classifier, and may have different predictive power. Therefore, feature selection is often used in text categorization. It is important to reduce the dimensionality of the data to obtain a smaller subset of features carrying the relevant information within an efficient computational time, since time complexity is a major issue in feature selection. Many feature selection algorithms are available to deal with this problem, but such algorithms are often computationally time-consuming and suffer from problems of accuracy and stability. To overcome these problems we developed a framework based on a neighborhood positive approximation rough set for feature subset selection, in which the size of the neighborhood depends on a threshold value δ. In the proposed framework we obtain several representative, rank-preserving significance measures of attributes. In this paper, document preprocessing is performed first; second, the neighborhood positive approximation is used to accelerate attribute reduction; third, result validations based on classifiers are performed. Experimental results show that the improved feature selection based on the neighborhood positive approximation rough set model is more efficient in terms of stability, computational time, and accuracy when dealing with large datasets.
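A minimal sketch of the plain neighborhood rough-set dependency measure and a greedy forward reduction driven by it; this is the unaccelerated baseline, not the positive-approximation speed-up the paper develops, and δ and the greedy strategy are illustrative assumptions.

```python
import numpy as np

def neighborhood_positive_region(X, y, attrs, delta):
    """Fraction of samples whose delta-neighborhood (on the chosen attributes)
    is pure with respect to the class label: the neighborhood positive region."""
    Xa = X[:, attrs]
    pos = 0
    for i in range(len(Xa)):
        dist = np.linalg.norm(Xa - Xa[i], axis=1)
        neighbors = y[dist <= delta]
        if np.all(neighbors == y[i]):
            pos += 1
    return pos / len(Xa)

def forward_feature_selection(X, y, delta=0.15):
    """Greedy forward selection by the gain in dependency (positive region size)."""
    selected, remaining = [], list(range(X.shape[1]))
    best = 0.0
    while remaining:
        gains = [(neighborhood_positive_region(X, y, selected + [a], delta), a)
                 for a in remaining]
        gamma, a = max(gains)
        if gamma <= best:          # no remaining attribute improves the dependency
            break
        best = gamma
        selected.append(a)
        remaining.remove(a)
    return selected, best
```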
- Single Channel Source Separation Using Non-Gaussian NMF and Modified
Hilbert Spectrum
Authors: Seyyed Reza Sharafinezhad; Mohammad Eshghi, Habib Alizadeh
Abstract: In this paper, a new and powerful method for Blind Source Separation (BSS) of single-channel mixtures is presented. The method is based on non-Gaussian nonnegative matrix factorization (NG-NMF) in which a modified Hilbert spectrum is employed. In the proposed algorithm, Adaptive EEMD (AEEMD) is introduced to transform the signal into Enhancement Intrinsic Mode Functions (EIMFs). The Hilbert spectra of the EIMFs are used as artificial observations. To estimate the spectra of the sources' EIMFs using NMF, the maximization of non-Gaussianity is used. The spectra of the estimated oscillation modes are then transferred to the time domain by the inverse Hilbert spectrum (IHS). To cluster these oscillation modes, a k-means clustering algorithm based on the Kullback-Leibler divergence (KLD) is used. Simulation results indicate that the proposed algorithm successfully separates speech and interfering sounds from a single-channel mixture.
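A hedged baseline sketch of the NMF factorization-and-masking step that single-channel separation methods of this kind build on; it uses plain Euclidean multiplicative updates on a magnitude spectrogram, not the non-Gaussian NMF, AEEMD, or KLD clustering of the paper, and the per-source rank is an assumption.

```python
import numpy as np

def nmf(V, rank, n_iter=200, eps=1e-9):
    """Plain multiplicative-update NMF (Euclidean cost): V ~= W @ H."""
    rng = np.random.default_rng(0)
    W = rng.random((V.shape[0], rank)) + eps
    H = rng.random((rank, V.shape[1])) + eps
    for _ in range(n_iter):
        H *= (W.T @ V) / (W.T @ W @ H + eps)
        W *= (V @ H.T) / (W @ H @ H.T + eps)
    return W, H

def separate(V, rank_per_source=2):
    """Split a magnitude spectrogram into two sources by grouping NMF
    components and applying Wiener-style masks."""
    W, H = nmf(V, 2 * rank_per_source)
    s1 = W[:, :rank_per_source] @ H[:rank_per_source, :]
    s2 = W[:, rank_per_source:] @ H[rank_per_source:, :]
    total = s1 + s2 + 1e-9
    return V * s1 / total, V * s2 / total
```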
- Semi-automated construction mechanism of heterogeneous artifacts
Authors: Mounir ZEKKAOUI; Abdelhadi FENNAN
Abstract: An artifact is a general term for any kind of created information produced, modified, or used by developers in the implementation of software systems. These artifacts include source code, analysis and design models, unit tests, XML deployment descriptors, user guides, and properties files. We consider that an application is described by a set of heterogeneous and distributed software artifacts. All artifacts can evolve over time: artifacts can be removed, others can be added, and each artifact may change. This can be a source of degradation, in functional, qualitative, or behavioral terms, of the modified software. Hence the need for a unified approach to the extraction and representation of the different heterogeneous artifacts, in order to ensure a unified and detailed description of heterogeneous software artifacts that is exploitable by several software tools and that enables those responsible for software evolution to reason about the changes concerned.
- A Comparison of Two Semantic Sensor Data Storages for Total Data
Transmission
Authors: Manaf Sharifzadeh; Saeid Aragy, Kaveh Bashash, Shahram Bashokian, Mehdi Gheisari
Abstract: The creation of small and cheap sensors has promoted the emergence of large-scale sensor networks. Sensor networks allow monitoring a variety of physical phenomena, such as weather conditions (temperature, humidity, atmospheric pressure, ...), traffic levels on highways, or room occupancy in public buildings. Some sensors produce large volumes of data, such as weather temperature readings, and these data must be stored somewhere to answer user queries. In this paper two known sensor data storage methods that store data semantically are compared, and it is shown that storing data in ontology form consumes more energy, so the lifetime of the sensor network decreases. The reason we chose these two methods is that they are useful and popular.
- SPI Model’s and Software Review’s in Software Enterprises
Authors: Nomi Baruah
Abstract: This work reports the results of a survey aimed at understanding the relevance of software reviews, i.e. the aspects reviewed and the frequency of reviews, in software enterprises. The main purpose of the study was to investigate small- and medium-scale software enterprises. The research was conducted using a questionnaire covering 21 software process improvement models. In order to conduct the survey the following software enterprises were visited: FutureSoft, Delhi; Exilant Technologies Pvt Ltd, Bhubaneswar; Targus Technologies, Chandigarh; SIQUES, Noida; GET, Guwahati; DZ Engineering, Pune; Verschaska Infotech Pvt. Ltd, Mumbai; Zaloni Technologies, Guwahati; Aris Global Software Pvt Ltd, Bangalore; and Roma ThinkSoft Pvt. Ltd, Bangalore. It is noteworthy that software reviews are applicable to software products throughout the software life cycle and are a vital step in the requirements, design, coding, testing, and maintenance of engineering systems and software projects. The software review process area is used with the intent of locating software defects and is a powerful process area for improving the quality of a software product. It should be tailored by identifying which software work products are to be reviewed, the people involved in the review, and the appropriate review process in light of the software work product's scope, and by documenting the review plan, schedule, and types; the review schedule should then be published during the project planning stage. The software review verifies that the work product satisfies the specifications found in any predecessor work product, identifies any deviation from standards, and identifies improvement opportunities for the software developer.
- Discrete Cat Swarm Optimization for Solving the Quadratic Assignment
Problem
Authors: Mohammed Essaid RIFFI; Abdelhamid BOUZIDI
Abstract: Discrete cat swarm optimization is a metaheuristic based on the natural behavior of cats; each cat has two modes, the seeking mode and the tracing mode. The seeking mode is when a cat is at rest, which is how a cat spends most of its lifetime; the tracing mode is when a cat is hunting. This paper proposes a new discrete cat swarm optimization algorithm to solve the quadratic assignment problem, one of the well-known combinatorial optimization problems, which belongs to the NP-hard class. To test the performance of the algorithm described herein, we solve some instances from the quadratic assignment problem library.
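A minimal sketch of the QAP objective and a crude swap-based move in the spirit of the tracing mode (this is not the authors' discrete cat swarm operators, and the 4x4 instance is made up).

```python
import random

def qap_cost(perm, dist, flow):
    """Quadratic assignment objective: sum_ij flow[i][j] * dist[perm[i]][perm[j]]."""
    n = len(perm)
    return sum(flow[i][j] * dist[perm[i]][perm[j]]
               for i in range(n) for j in range(n))

def tracing_move(perm, dist, flow, tries=20):
    """Crude stand-in for a cat's tracing mode: try random facility swaps
    and keep the best improving one."""
    best, best_cost = perm[:], qap_cost(perm, dist, flow)
    for _ in range(tries):
        i, j = random.sample(range(len(perm)), 2)
        cand = perm[:]
        cand[i], cand[j] = cand[j], cand[i]
        c = qap_cost(cand, dist, flow)
        if c < best_cost:
            best, best_cost = cand, c
    return best, best_cost

# Tiny hypothetical instance (4 facilities assigned to 4 locations).
dist = [[0, 2, 3, 1], [2, 0, 1, 4], [3, 1, 0, 2], [1, 4, 2, 0]]
flow = [[0, 5, 2, 0], [5, 0, 3, 0], [2, 3, 0, 4], [0, 0, 4, 0]]
print(tracing_move(list(range(4)), dist, flow))
```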
- An Enhanced Port Tunneling & Device Tracking Authentication
Mechanism
Authors: Yamini S; D. Maheswari
Abstract: Port knocking is a technique by which only a single packet or a special sequence of packets will cause the firewall to open a port on a machine where all ports are closed by default. It is an unobtrusive authorization technique that offers firewall-level authentication to ensure authorized access to otherwise unprotected network services; however, the method is vulnerable to attack when attackers monitor the network. This paper proposes a new method, called "Enhanced Port Tunneling & Device Tracking (EPT & DT)", to eliminate both DoS-knocking and NAT-knocking attacks. The source IP address from which an offending activity originated is of limited value because it does not specify a physical location, only an endpoint in a network for the exclusive purpose of routing. Furthermore, people and their devices move across the network, changing IP addresses as a consequence. It is useful to have some hints about where a device was at the time the offending action was performed; nevertheless, it is prudent to connect different pieces of evidence to ascertain additional information, such as the IP addresses used by the corresponding device. Devices repeatedly accessing a private network at different times can be profiled by analyzing and correlating Network and Port Address Translation (NAPT) logs in order to reveal recurring activity patterns, and it is feasible to recognize some of the users from their traffic abnormalities without considering the exposed IP addresses. Experiments were conducted on NAPT logs collected in a campus network, with DHCP data providing control points for validation. The main purpose of using the NAPT logs is device tracking.
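A minimal sketch of classic port knocking as defined at the start of the abstract (not the EPT & DT mechanism): a tracker watches per-source connection attempts and only signals the firewall to open the protected port when the secret knock sequence is completed in order within a timeout. The ports, timeout, and class names are illustrative assumptions.

```python
import time

KNOCK_SEQUENCE = (7000, 8000, 9000)   # illustrative secret knock ports
KNOCK_TIMEOUT = 10.0                  # seconds allowed to complete the sequence

class KnockTracker:
    """Track per-source progress through the secret knock sequence."""
    def __init__(self):
        self.progress = {}            # src_ip -> (next_index, deadline)

    def observe(self, src_ip, dst_port):
        """Feed one observed connection attempt; return True when src_ip
        completes the knock and the protected port should be opened for it."""
        idx, deadline = self.progress.get(src_ip, (0, None))
        now = time.monotonic()
        if deadline is not None and now > deadline:
            idx = 0                   # sequence expired, start over
        if dst_port == KNOCK_SEQUENCE[idx]:
            idx += 1
            if idx == len(KNOCK_SEQUENCE):
                self.progress.pop(src_ip, None)
                return True
            self.progress[src_ip] = (idx, now + KNOCK_TIMEOUT)
        else:
            self.progress.pop(src_ip, None)   # wrong knock resets progress
        return False

tracker = KnockTracker()
for port in (7000, 8000, 9000):
    opened = tracker.observe("10.0.0.5", port)
print("firewall opens port for 10.0.0.5:", opened)
```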