Abstract: Multicriteria models have been proposed for inventory classification in previous studies. However, it is also important to decide when a particular multicriteria inventory classification model should be preferred over the others, and whether the best-performing model remains the best at all times. Companies constantly look for ways to improve the customer order fulfillment process. This paper shows how better inventory classification can improve the customer order fill rate in variable settings. A method for comparing inventory classification models with regard to improving the customer order fill rate is proposed. A cut-off point is calculated that indicates when the model currently in use should be dropped in favor of another model to increase revenue by filling more orders. Sensitivity analysis is also performed to determine how holding cost and demand uncertainty affect the performance metric. Finally, regression analysis and hypothesis testing inform the decision-maker of how a model’s performance differs from that of other models at various values of holding cost and standard deviation of demand. PubDate: Wed, 05 Apr 2017 08:54:41 +000
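The cut-off idea can be sketched numerically: given fill-rate measurements for two models at several holding-cost levels, fit a line to each and solve for the crossing point beyond which the other model wins. The abstract does not specify the regression setup, so the model names and all numbers below are illustrative.

```python
import numpy as np

# Hypothetical fill rates (%) of two classification models measured
# at increasing holding-cost levels; all figures are illustrative.
holding_cost = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
fill_rate_a = np.array([96.0, 94.5, 93.0, 91.5, 90.0])  # model A degrades fast
fill_rate_b = np.array([94.0, 93.5, 93.0, 92.5, 92.0])  # model B degrades slowly

# Fit a line (fill rate vs. holding cost) to each model's observations.
slope_a, icept_a = np.polyfit(holding_cost, fill_rate_a, 1)
slope_b, icept_b = np.polyfit(holding_cost, fill_rate_b, 1)

# Cut-off point: the holding cost at which the two fitted lines cross;
# beyond it, the model currently in use should be dropped for the other.
cutoff = (icept_b - icept_a) / (slope_a - slope_b)
print(round(cutoff, 2))
```

With the sample data above the lines cross at a holding cost of 3.0, so model A would be preferred below that level and model B above it.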

Abstract: The paper numerically compares the results of two real option valuation methods, the Datar-Mathews method and the fuzzy pay-off method. The Datar-Mathews method is based on Monte Carlo simulation within a probabilistic valuation framework, while the fuzzy pay-off method models the real option valuation with fuzzy numbers in a possibilistic space. The results show that the real option values from the two methods are consistent with each other. The fuzzy pay-off method is more robust and remains usable when not enough information is available to construct a simulation model. PubDate: Wed, 12 Oct 2016 11:07:56 +000
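A minimal sketch of the Datar-Mathews idea: simulate the project's payoff distribution and value the option as the mean of the payoffs truncated at zero (the option not to invest). The distribution, investment cost, and all parameter values below are assumptions for illustration, not the paper's case data.

```python
import numpy as np

rng = np.random.default_rng(seed=1)

# Hypothetical project: lognormally distributed present value of operating
# cash flows against a fixed investment cost (all figures illustrative).
n_scenarios = 100_000
pv_cash_flows = rng.lognormal(mean=np.log(100.0), sigma=0.4, size=n_scenarios)
investment = 110.0

# Datar-Mathews: the real option value is the mean of the payoff
# distribution with losses truncated at zero.
payoffs = pv_cash_flows - investment
real_option_value = np.maximum(payoffs, 0.0).mean()
print(round(real_option_value, 2))
```

The fuzzy pay-off method replaces the simulated distribution with a fuzzy number built from a few cash-flow scenarios, which is why it remains usable when a full simulation model cannot be constructed.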

Abstract: This paper aims to cluster entities that are described by a data matrix. Under the assumption that the observations in each table are normally distributed, each entity is represented by a sample from a Gaussian distribution, that is, a number of measurements in the data matrix, the sample mean vector, and the sample covariance. We propose a new distance based on Mahalanobis’s discriminant score to measure the similarity between objects. The study addresses both the quest for an adequate model of the data representation and the choice of a distance index between entities that justifies the homogeneity of the observed classes. PubDate: Thu, 29 Sep 2016 08:34:56 +000
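The paper's discriminant-score distance is not spelled out in the abstract; as a sketch of the underlying idea, the distance below compares two entities' sample means under a pooled sample covariance, which is the generic Mahalanobis construction. The data are synthetic.

```python
import numpy as np

rng = np.random.default_rng(seed=7)

# Two hypothetical entities, each observed as a sample of measurement
# vectors (rows) assumed to come from a Gaussian distribution.
sample_a = rng.normal(loc=[0.0, 0.0], scale=1.0, size=(50, 2))
sample_b = rng.normal(loc=[3.0, 3.0], scale=1.0, size=(50, 2))

def mahalanobis_between(x, y):
    """Mahalanobis-type distance between two samples, using the
    pooled sample covariance of both entities."""
    diff = x.mean(axis=0) - y.mean(axis=0)
    pooled_cov = (np.cov(x, rowvar=False) + np.cov(y, rowvar=False)) / 2.0
    return float(np.sqrt(diff @ np.linalg.inv(pooled_cov) @ diff))

d_ab = mahalanobis_between(sample_a, sample_b)
d_aa = mahalanobis_between(sample_a, sample_a)
print(round(d_ab, 2), d_aa)
```

Unlike Euclidean distance, this index rescales differences by the observed covariance, so classes that are tight in some directions and spread in others are compared fairly.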

Abstract: The analytic hierarchy process (AHP) has the advantages that the total number of comparisons can be reduced via a hierarchy structure and that the consistency of responses can be verified via a consistency ratio. At the same time, however, the AHP has the disadvantages that the resulting values vary with the form of the hierarchy structure and that consistency among responses is difficult to maintain. If the number of comparisons is to be reduced, a comparison within a single level is optimal, and if comparisons are made while the priority among entities is maintained, consistency is maintained automatically. Thus, in this study, we propose a weighting method that applies the hierarchy structure and pairwise comparisons of the AHP but remedies its disadvantages. The method reduces the number of comparisons and maintains consistency automatically by first determining priorities over multiple entities and then comparing only entities with adjoining priorities. PubDate: Thu, 28 Apr 2016 12:07:54 +000
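The adjoining-priority idea can be sketched as follows: once the entities are ranked, the decision-maker supplies only the ratio between each adjacent pair, and chaining those ratios yields a weight vector that is consistent by construction (any derived ratio is the product of the adjacent ones). The ratios below are hypothetical.

```python
import numpy as np

# Hypothetical: four criteria already ranked by priority; only adjacent
# pairs are compared (how many times more important is item i than i+1).
adjacent_ratios = [2.0, 1.5, 3.0]   # w1/w2, w2/w3, w3/w4

# Weight of each item relative to the last: cumulative product of the
# downstream ratios, then normalized to sum to one.
raw = [1.0]
for r in reversed(adjacent_ratios):
    raw.insert(0, raw[0] * r)
weights = np.array(raw) / sum(raw)
print(weights.round(4))
```

Only three comparisons are needed for four items (versus six in a full pairwise matrix), and the consistency ratio check becomes unnecessary because no cycle of judgments can contradict another.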

Abstract: Greenhouse gas (GHG) emissions from transport operations have increased drastically in recent years. Sea transport in particular contributes 2.7 to 3 percent of global CO2, a major component of GHG emissions. Numerous measures have been undertaken locally and internationally to alleviate sea transport’s share of greenhouse gases. However, most of these measures will be fruitful only if ship investors (e.g., ship owners and operators) fully employ the GHG emission reduction strategies. Given the scarcity of statistical data in this respect, this study presents a rough set synthetic assessment (RSSA) model of GHG emission abatement strategies in the Tanzanian shipping sector. The assessment reveals that the Tanzanian shipping companies engaged in cabotage trade are aware of the abatement strategies and apply them moderately. PubDate: Thu, 31 Mar 2016 08:17:07 +000

Abstract: Traditional methods for investment project appraisal, such as the net present value (hereinafter NPV), usually do not incorporate the operational flexibility offered by including a real option in the project. In this paper, real options, and more specifically the option to abandon, are analysed as a complement to the cash flow sequence that quantifies the project. By exploiting the analogy with financial options, a mathematical expression is derived using the binomial options pricing model. This methodology provides the value of the option to abandon the project within one, two, and, in general, n periods. The paper thus aims to be a useful tool for determining the value of the option to abandon according to its residual value, making it easier to control the uncertainty element within the project. PubDate: Tue, 01 Mar 2016 06:48:24 +000
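A minimal binomial-tree sketch of the option to abandon: at every node the project is worth the larger of its continuation value and the residual (salvage) value. The paper's closed-form expression is not reproduced here; the up/down factors, rate, and salvage value below are illustrative assumptions.

```python
def abandonment_value(v0, salvage, u, d, r, n):
    """Value of a project with the option to abandon for a residual
    value at any node of an n-period binomial tree."""
    q = (1.0 + r - d) / (u - d)          # risk-neutral up probability
    # Terminal project values after n periods, floored at salvage.
    values = [max(v0 * u**j * d**(n - j), salvage) for j in range(n + 1)]
    # Roll back through the tree, allowing abandonment at each node.
    for step in range(n, 0, -1):
        values = [
            max((q * values[j + 1] + (1 - q) * values[j]) / (1.0 + r), salvage)
            for j in range(step)
        ]
    return values[0]

with_option = abandonment_value(v0=100.0, salvage=85.0, u=1.25, d=0.8, r=0.05, n=3)
print(round(with_option, 2))
```

Without the abandonment floor the rolled-back value would equal the initial project value of 100; the excess over 100 is the value of the option to abandon at its residual value.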

Abstract: We study asset pricing dynamics in an artificial financial market model. The market is populated with agents following two heterogeneous trading beliefs, the technical and the fundamental prediction rules. Agents switch between trading rules according to their past performance. The agents are loss averse over asset price fluctuations, and loss aversion behaviour depends on the past performance of the trading strategies in terms of an evolutionary fitness measure. We propose a novel application of prospect theory to agent-based modelling and investigate, by simulation, the effect of the evolutionary fitness measure on the adaptive belief system. For comparison, we study the pricing dynamics of a financial market populated with chartists who perceive losses and gains symmetrically. One of our contributions is validating the agent-based models using real financial data from the Egyptian Stock Exchange. We find that our framework can explain important stylized facts in financial time series, such as random walk price behaviour, bubbles and crashes, fat-tailed return distributions, power-law tails in the distribution of returns, excess volatility, volatility clustering, the absence of autocorrelation in raw returns, and power-law autocorrelations in absolute returns. In addition, we find that loss aversion improves market quality and stability. PubDate: Thu, 08 Oct 2015 07:08:41 +000

Abstract: The main purpose of this paper is to improve the conceptual as well as the methodological aspects of the balanced scorecard (BSC) as a quantitative model by combining elements of traditional BSC thinking with systems thinking, that is, by combining short- and long-term aspects of measurement. The result is used to build a balanced scorecard model for strategic learning, aimed specifically at maintaining satisfied customers and motivated employees. Strategic planning, operational execution, feedback, and learning are among the most important features of any performance measurement model. The paper addresses not only the conceptual domain of the BSC, that is, learning and system dynamics causality and feedback, but also the methodological concept of precision, handled by differential equations. Our results show that a move from a static strategic vision map to a linked and dynamic understanding may not be fully realistic but is very useful for learning purposes. The new knowledge obtained from the learning feedbacks enriches both decision discussion and decision-making and indicates what may be required to move to the next level of BSC and system dynamics integration. PubDate: Tue, 05 May 2015 11:38:50 +000

Abstract: This paper proposes a new hybrid multiattribute decision making (MADM) model that deals with the interactions that usually exist between hostel attributes when measuring students’ satisfaction with a set of hostels and identifying the optimal strategies for enhancing their satisfaction. The model uses a systematic random stratified sampling approach for data collection, as students dwelling in hostels are “naturally” clustered by block and gender; factor analysis for reducing a large set of hostel attributes into fewer independent factors; the λ-measure for characterizing the interactions shared by the attributes within each factor; the Choquet integral for aggregating the interactive performance scores within each factor; Mikhailov’s fuzzy analytic hierarchy process (MFAHP) for determining the weights of the independent factors; and the simple weighted average (SWA) operator for measuring the overall satisfaction score of each hostel. A real evaluation involving fourteen Universiti Utara Malaysia (UUM) hostels was carried out to demonstrate the model’s feasibility. The same evaluation was performed using an additive aggregation model to illustrate the effects of ignoring the interactions shared by attributes in hostel satisfaction analysis. PubDate: Thu, 30 Apr 2015 08:45:18 +000
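The λ-measure and Choquet steps can be sketched as follows: the λ parameter is the nontrivial root of the Sugeno identification equation, the measure of any attribute subset follows from it, and the Choquet integral aggregates the sorted scores against those measures. The densities and scores below are hypothetical, and the MFAHP weighting step is not reproduced.

```python
def solve_lambda(densities, tol=1e-12):
    """Find the nontrivial root of prod(1 + lam*g_i) = 1 + lam by
    bisection; when the densities sum to less than 1 the root is positive."""
    def f(lam):
        prod = 1.0
        for g in densities:
            prod *= 1.0 + lam * g
        return prod - (1.0 + lam)
    lo, hi = 1e-9, 1.0
    while f(hi) < 0.0:
        hi *= 2.0
    while hi - lo > tol:
        mid = (lo + hi) / 2.0
        if f(mid) < 0.0:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2.0

def lam_measure(subset_densities, lam):
    """Measure of an attribute subset under the Sugeno lambda-measure."""
    prod = 1.0
    for g in subset_densities:
        prod *= 1.0 + lam * g
    return (prod - 1.0) / lam

def choquet(scores, densities, lam):
    """Discrete Choquet integral of interactive attribute scores."""
    order = sorted(range(len(scores)), key=lambda i: -scores[i])
    total, g_prev, top = 0.0, 0.0, []
    for i in order:
        top.append(densities[i])
        g_cur = lam_measure(top, lam)
        total += scores[i] * (g_cur - g_prev)
        g_prev = g_cur
    return total

densities = [0.3, 0.3, 0.3]          # hypothetical attribute importances
lam = solve_lambda(densities)
agg = choquet([0.9, 0.6, 0.3], densities, lam)
print(round(lam, 3), round(agg, 3))
```

Because the densities sum to less than one, λ comes out positive, modelling complementary attributes; a simple weighted average of the same scores would ignore that interaction.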

Abstract: Almost all conditions of modern daily life depend on electricity. Countries, either by themselves or together with international institutions and/or organizations, have been trying to find the best methods, ways, and projects to supply electricity to their societies. One important tool for increasing the amount and quality of electricity generation is to activate private investment capabilities and opportunities. The electricity generation market in Turkey is an open market for both foreign and domestic private investors, who have therefore been looking for the most suitable electricity generation plant projects. Small hydropower plant (SHPP) investments (SHPPIs) are one of the alternatives in the Turkish electricity generation market, especially for private investors searching for renewable energy investments. This experimental research study investigates the use of the ELECTRE III/IV, Shannon’s entropy, and Saaty’s analytic hierarchy process (AHP) subjective criteria weighting methods for this problem. In the experimental case study, the most appropriate among five alternative SHPPIs at their predevelopment investment stages in Turkey were evaluated and ranked. PubDate: Tue, 03 Feb 2015 11:44:09 +000
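Of the weighting methods named, Shannon's entropy is the most mechanical and can be sketched directly: criteria on which the alternatives differ more carry more information and receive larger weights. The decision matrix below is hypothetical, not the paper's SHPPI data.

```python
import numpy as np

# Hypothetical decision matrix: 5 alternative plants (rows) scored on
# 3 criteria (columns); all numbers are illustrative.
x = np.array([[7.0, 120.0, 3.0],
              [6.0, 150.0, 2.0],
              [8.0, 100.0, 4.0],
              [5.0, 180.0, 2.0],
              [7.0, 130.0, 3.0]])

p = x / x.sum(axis=0)                        # column-normalized shares
k = 1.0 / np.log(x.shape[0])                 # normalizing constant 1/ln(m)
entropy = -k * (p * np.log(p)).sum(axis=0)   # entropy of each criterion
weights = (1.0 - entropy) / (1.0 - entropy).sum()  # degree of diversification
print(weights.round(3))
```

These objective weights can then be contrasted with the subjective AHP weights, or fed into the ELECTRE III/IV outranking step.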

Abstract: Challenges in text processing of ancient document images are mainly due to the high degree of variation in foreground and background. Image binarization is an image segmentation technique used to separate an image into text and background components. Although several techniques for binarizing text documents have been proposed, their performance varies and depends on the image characteristics; selecting among binarization techniques is therefore key to achieving improved results. This paper proposes a framework for selecting binarization techniques for palm leaf manuscripts using support vector machines (SVMs). The overall process is divided into three steps: (i) feature extraction, in which feature patterns are extracted from grayscale images based on global intensity, local contrast, and intensity; (ii) treatment of imbalanced data, in which the imbalanced dataset is balanced using the Synthetic Minority Oversampling Technique to improve prediction performance; and (iii) selection, in which an SVM is applied to select the appropriate binarization technique. The proposed framework was evaluated on palm leaf manuscript images and a benchmark dataset from the DIBCO series, comparing prediction performance between imbalanced and balanced datasets. Experimental results showed that the proposed framework can be used as an integral part of an automatic selection process. PubDate: Mon, 26 Jan 2015 09:49:35 +000

Abstract: Organizations typically employ the ABC inventory classification technique to maintain efficient control over a huge number of inventory items. The ABC inventory classification problem is the classification of a large number of items into three groups: A, very important; B, moderately important; and C, relatively unimportant. Traditional ABC classification accounts for only one criterion, namely the annual dollar usage of the items, but other important real-world criteria strongly affect the classification. This paper proposes a novel methodology based on a common-weight linear optimization model to solve the multiple criteria inventory classification problem. The proposed methodology classifies inventory items via a set of common weights, which is essential for a fair classification. It offers remarkable computational savings compared with existing approaches, needs no subjective information, and is easy for managers to apply. The proposed model is applied to an illustrative example and a case study taken from the literature. Both numerical results and qualitative comparisons with existing methods reveal several merits of the proposed approach for ABC analysis. PubDate: Thu, 01 Jan 2015 09:34:07 +000
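The common-weight idea can be sketched downstream of the optimization: once a single weight vector is fixed for all items (the paper derives it from a linear program, which is not reproduced here), every item gets a composite score and the ranked list is split into A/B/C groups. The scores, weights, and the 20/30/50 split below are illustrative assumptions.

```python
import numpy as np

# Hypothetical items scored on three criteria (e.g., annual dollar usage,
# lead time, criticality), already min-max normalized to [0, 1].
scores = np.array([[0.9, 0.2, 0.8],
                   [0.1, 0.9, 0.3],
                   [0.6, 0.5, 0.5],
                   [0.2, 0.1, 0.1],
                   [0.8, 0.7, 0.9]])
common_weights = np.array([0.5, 0.2, 0.3])   # one weight set for ALL items

composite = scores @ common_weights
order = np.argsort(-composite)               # best first
n = len(composite)
labels = np.empty(n, dtype="<U1")
a_end = max(1, round(0.2 * n))               # top 20% -> A
b_end = a_end + round(0.3 * n)               # next 30% -> B, rest -> C
labels[order[:a_end]] = "A"
labels[order[a_end:b_end]] = "B"
labels[order[b_end:]] = "C"
print(labels)
```

Because every item is scored with the same weights, no item can owe its class to a weight vector chosen in its own favor, which is the fairness property the abstract emphasizes.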

Abstract: Traditional inventory systems implicitly assume that the buyer pays the seller as soon as the items are received. In today’s competitive industry, however, the seller usually offers the buyer a delay period to settle the account. Not only the seller but also the buyer may apply trade credit as a strategic tool to stimulate customer demand. This paper investigates the effects of the latter policy, two-level trade credit, on a retailer’s optimal ordering decisions within the economic order quantity framework with allowable shortages. Unlike most previous studies, customer demand is considered to increase with time. The objective of the retailer’s inventory model is to maximize profit. The optimal replenishment decisions are obtained using a genetic algorithm. Two special cases of the proposed model are discussed, and the impacts of the parameters on the decision variables are investigated. Numerical examples demonstrate the profitability of the developed two-level supply chain with backorders. PubDate: Wed, 31 Dec 2014 00:10:24 +000

Abstract: The present study is concerned with the cost modeling of an inventory system with multiple perishable items having stock-dependent demand rates in an inflationary market environment. The concept of permissible delay in payment is taken into account. The study provides a cost analysis of the inventory system under the decision criteria of time value of money, inflation, deterioration, and stock-dependent demand. Numerical illustrations are derived from the quantitative model to validate the results. The inventory cost and optimal time are also computed for different values of the system parameters, and the results are compared with neurofuzzy results. PubDate: Thu, 18 Dec 2014 00:10:13 +000

Abstract: The attitudes of general practitioners (GPs) play an influential role in their decision making about patient treatment and care. Considering the GP-patient encounter as a complex system, the interactions between a GP and their personal network of peers give rise to “aggregate complexity,” which in turn influences the GP’s decisions about patient treatment. This study models aggregate complexity and its influence on decision making in primary care using social network metrics. Professional network and attitudinal data on decision-making responsibility from 107 rural GPs were analysed. The social network measures of “density” and “inclusiveness” were used to compute the “interrelatedness” of components within such a “complex system.” The “number of components” and “degree of interrelatedness” were used to determine complexity profiles, which were then related to each GP’s responsibility in decision making. GPs with simple profiles (i.e., few components and interactions), in contrast to those with nonsimple profiles, indicate higher responsibility for the decisions they make in medical care. This study suggests that social network-based complexity profiles are useful for understanding decision making in primary care, as they account for the role of influence through the professional networks of GPs. PubDate: Thu, 11 Dec 2014 07:38:18 +000
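The two network measures named are standard and can be sketched directly on a small hypothetical undirected network; how the study combines them into complexity profiles is not reproduced here.

```python
# Social-network metrics used to profile a GP's professional network
# (sketched on a small hypothetical undirected network of peers).
def density(n_nodes, edges):
    """Fraction of possible undirected ties that are present."""
    possible = n_nodes * (n_nodes - 1) / 2
    return len(edges) / possible

def inclusiveness(n_nodes, edges):
    """Share of nodes connected to at least one other node."""
    connected = {v for e in edges for v in e}
    return len(connected) / n_nodes

edges = [(0, 1), (1, 2), (0, 2)]   # node 3 is an isolate
print(density(4, edges), inclusiveness(4, edges))
```

In the example, three of six possible ties exist (density 0.5) and three of four nodes are connected (inclusiveness 0.75); a GP whose network had more components and fewer ties would sit toward the "simple" end of the profile spectrum.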

Abstract: To stimulate demand for their products, firms generally give their customers a credit period. However, selling on credit exposes the firm to the additional dimension of bad debt expense (i.e., customer default). Moreover, the credit period, through its influence on demand, becomes a determinant of inventory decisions, and inventory sold on credit is converted into accounts receivable, indicating the interaction between the two. Since inventory and credit decisions are interrelated, they must be determined jointly. Consequently, in this paper, a mathematical model is developed to determine inventory and credit decisions jointly. The demand rate is assumed to be a logistic function of the credit period. The accounts receivable carrying cost, along with an explicit consideration of bad debt expense, both often ignored in previous models, are incorporated in the present model. The discounted cash flow (DCF) approach is used to develop the model, and the objective is to maximize the present value of the firm’s net profit per unit time. Finally, a numerical example and sensitivity analysis illustrate the effectiveness of the proposed model. PubDate: Mon, 08 Dec 2014 00:10:04 +000

Abstract: We present implemented concepts and algorithms for a simulation approach to decision evaluation with second-order belief distributions in a common framework for interval decision analysis. The rationale behind this work is that decision analysis with interval-valued probabilities and utilities may lead to overlapping expected utility intervals, yielding difficulties in discriminating between alternatives. By allowing for second-order belief distributions over interval-valued utility and probability statements, these difficulties are not only remedied; the approach also enables decision evaluation concepts and techniques that provide additional insight into a decision problem. The approach is based upon sets of linear constraints together with the generation of random probability distributions and utility values from implicitly stated uniform second-order belief distributions over the polytopes given by the constraints. The result is an interactive method for decision evaluation with second-order belief distributions, complementing earlier methods for decision evaluation with interval-valued probabilities and utilities. The method has been implemented for trial use in user-oriented decision analysis software. PubDate: Sun, 30 Nov 2014 00:10:07 +000
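The sampling idea can be sketched with a simple rejection scheme: a uniform second-order belief over the constrained probability polytope is approximated by drawing from the flat Dirichlet distribution on the simplex and keeping the draws that satisfy the interval statements. The intervals and utilities below are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

rng = np.random.default_rng(seed=3)

# Hypothetical interval statements: p1 in [0.2, 0.5], p2 in [0.1, 0.4],
# p3 = 1 - p1 - p2 unconstrained; fixed utilities for the three outcomes.
lo = np.array([0.2, 0.1, 0.0])
hi = np.array([0.5, 0.4, 1.0])
utilities = np.array([1.0, 0.4, 0.0])

# Flat Dirichlet = uniform distribution over the probability simplex;
# rejection keeps only draws inside the interval-constrained polytope.
samples = rng.dirichlet(np.ones(3), size=200_000)
keep = samples[((samples >= lo) & (samples <= hi)).all(axis=1)]
expected_utils = keep @ utilities
print(len(keep), round(float(expected_utils.mean()), 3))
```

Instead of a single, possibly overlapping expected utility interval, the analyst now sees a whole distribution of expected utilities, which supports finer discrimination between alternatives.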

Abstract: This work describes the design of a decision support system for detecting the fraudulent behavior of selling stolen goods in online auctions. In this system, each seller is associated with a type of certification, namely “proper seller,” “suspect seller,” or “selling stolen goods.” The certification level is determined on the basis of a seller’s behavior and especially on the basis of contextual information originating outside online auction portals. In this paper, we focus on representing knowledge about sellers in online auctions, the influence of additional information available from other Internet sources, and reasoning about bidders’ trustworthiness under uncertainty using the Dempster-Shafer theory of evidence. To demonstrate the practicability of our approach, we performed a case study using real auction data from the Czech auction portal Aukro. The analysis results show that our approach can be used to detect the selling of stolen goods. By applying Dempster-Shafer theory to combine multiple sources of evidence for the detection of this fraudulent behavior, the proposed approach can reduce the number of false positives compared with approaches using a single source of evidence. PubDate: Tue, 25 Nov 2014 11:11:28 +000
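The evidence-combination step rests on Dempster's rule, which can be sketched directly: two mass functions over the seller-certification frame are multiplied pointwise over set intersections, and mass assigned to contradictory (empty) intersections is renormalized away. The frame labels and mass values below are illustrative, not the paper's data.

```python
from itertools import product

def combine(m1, m2):
    """Dempster's rule: combine two mass functions whose focal elements
    are frozensets; conflicting mass (empty intersections) is removed
    and the remainder renormalized."""
    combined, conflict = {}, 0.0
    for (a, w1), (b, w2) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + w1 * w2
        else:
            conflict += w1 * w2
    return {s: w / (1.0 - conflict) for s, w in combined.items()}

# Hypothetical evidence about a seller: frame = {proper, suspect, stolen}.
frame = frozenset({"proper", "suspect", "stolen"})
m_history = {frozenset({"stolen", "suspect"}): 0.6, frame: 0.4}
m_context = {frozenset({"stolen"}): 0.5, frame: 0.5}
m = combine(m_history, m_context)
print(m[frozenset({"stolen"})])
```

Each additional independent source sharpens the belief committed to the "selling stolen goods" hypothesis only when the sources agree, which is what suppresses false positives relative to a single-source rule.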

Abstract: In many statistical applications, it is often necessary to obtain an interval estimate for an unknown proportion or probability or, more generally, for a parameter whose natural space is the unit interval. The customary approximate two-sided confidence interval for such a parameter, based on some version of the central limit theorem, is known to be unsatisfactory when its true value is close to zero or one or when the sample size is small. A possible way to tackle this issue is to transform the data through a proper function that makes the approximation to the normal distribution less coarse. In this paper, we study the application of several such transformations to the estimation of the reliability parameter for stress-strength models, with a special focus on the Poisson distribution. From this work, some practical hints emerge on which transformations most effectively improve standard confidence intervals in which scenarios. PubDate: Thu, 23 Oct 2014 00:00:00 +000
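The basic transformation idea can be sketched with the logit, one common choice for unit-interval parameters (the paper studies several transformations; this is a generic illustration, not its stress-strength setup): build the normal-approximation interval on the transformed scale, then map it back.

```python
import math

def wald_ci(successes, n, z=1.96):
    """Plain normal-approximation (Wald) interval for a proportion."""
    p = successes / n
    se = math.sqrt(p * (1 - p) / n)
    return p - z * se, p + z * se

def logit_ci(successes, n, z=1.96):
    """Interval built on the logit scale, then mapped back: it is
    guaranteed to stay inside (0, 1), unlike the Wald interval."""
    p = successes / n
    logit = math.log(p / (1 - p))
    se = 1.0 / math.sqrt(n * p * (1 - p))   # delta-method standard error
    inv = lambda t: 1.0 / (1.0 + math.exp(-t))
    return inv(logit - z * se), inv(logit + z * se)

# Near-boundary case where the normal approximation is coarse.
print(wald_ci(1, 20))    # lower bound falls below zero
print(logit_ci(1, 20))   # stays inside (0, 1)
```

With 1 success in 20 trials, the Wald lower bound is negative and hence meaningless for a probability, while the back-transformed interval respects the parameter space.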

Abstract: The authors address final assembly scheduling using genetic algorithms (GAs). The objective of the research was to study in depth the use of GAs for scheduling mixed-model assembly lines and to propose a model able to produce feasible solutions under the particular requirements of an important Italian motorbike company, as well as to capture the results of this change in terms of better operational performance. The “chessboard shifting” of work teams among the company’s mixed-model assembly lines makes the scheduling problem more complex, so a correspondingly rich scheduling model is required. We apply GAs in order to test their effectiveness on real scheduling problems. The high quality of the final assembly plans, with high adherence to the delivery dates and obtained in a short elaboration time, confirms that the choice was right and suggests the use of GAs in other complex manufacturing systems. PubDate: Thu, 02 Oct 2014 11:18:22 +000

Abstract: In today’s rapidly changing and highly competitive business environment, innovation is broadly recognized as a powerful competitive weapon. Innovation is a dynamic process that needs continuous, evolving, and mastered management, so companies must monitor and measure their innovation capacity to manage the innovation process. Yet the current innovation literature lacks a psychometrically valid scale for the innovation capacity construct. The purpose of this paper is to develop a reliable and valid measurement scale for innovation capacity. To test its unidimensionality, reliability, and several components of validity, we used data collected from 175 small and medium-sized enterprises (SMEs) in Iran and performed a series of analyses. The reliability measures, exploratory and confirmatory factor analyses, and validity tests strongly support a four-dimensional scale for measuring innovation capacity. The dimensions are knowledge and technology management, idea management, project development, and commercialization capabilities. PubDate: Tue, 30 Sep 2014 00:00:00 +000

Abstract: This paper discusses the notion of trapezoidal fuzzy intuitionistic fuzzy sets (TzFIFSs) and some of their arithmetic operations. A correlation coefficient for TzFIFSs is proposed based on the membership, nonmembership, and hesitation degrees. The weighted averaging (WA) and weighted geometric (WG) operators are proposed for TzFIFSs. Based on these operators and the correlation coefficient defined for the TzFIFS, new multiattribute decision making (MADM) models are proposed and a numerical illustration is given. PubDate: Mon, 04 Aug 2014 11:50:57 +000

Abstract: A new solution concept for hypergames called subjective rationalizability is proposed. Hypergame theory is a game-theoretical framework that deals with agents who may misperceive game structures and explicitly takes into account the hierarchy of perceptions, that is, an agent’s view of another agent’s view, and so on. An action of an agent is called subjectively rationalizable when the agent thinks it can be a best response to the others’ choices, each of which the agent thinks each agent thinks is a best response to the others’ choices, and so on. It is then proved that subjective rationalizability is equivalent to the standard notion of rationalizability under a condition called inside common knowledge. The result makes the new solution concept a practical tool in hypergame analyses. Theoretically, it is characterized as a concept that provides the precise implication, that is, the predicted outcomes, of a given hypergame structure. PubDate: Thu, 10 Jul 2014 08:36:10 +000

Abstract: We describe an improvement of Chergui and Moulaï’s method (2008), which generates the whole efficient set of a multiobjective integer linear fractional program based on the branch and cut concept. The general step of this method consists in optimizing (maximizing, without loss of generality) one of the fractional objective functions over a subset of the original continuous feasible set; then, if necessary, a branching process is carried out until an integer feasible solution is obtained. At this stage, an efficient cut is built from the criteria’s growth directions in order to discard a part of the feasible domain containing only nonefficient solutions. Our contribution concerns, firstly, the optimization process, where a linear program defined later is solved at each step rather than a fractional linear program, and, secondly, the use of local ideal and nadir points as bounds to prune branches leading to nonefficient solutions. The computational experiments show that the new method outperforms the old one on all the treated instances. PubDate: Tue, 01 Jul 2014 00:00:00 +000

Abstract: We consider in this paper an M/G/1 type queueing system with the following extensions. First, the server is unreliable and subject to random breakdowns. Second, the server also implements the well-known -policy. Third, instead of a Bernoulli vacation schedule, the more general notion of a binomial schedule with vacations is applied. A cost function with two decision variables is developed, and a numerical example shows the effect of the system parameters on the optimal management policy. PubDate: Thu, 15 May 2014 10:02:29 +000

Abstract: The optimal production and advertising policies for an inventory control system of a multi-item, multiobjective problem under a single management are formulated as an optimal control problem with resource constraints under inflation and discounting in a fuzzy rough (Fu-Ro) environment. The objectives and constraints in Fu-Ro are made deterministic using the fuzzy rough expected value method (EVM). Here, the production and advertisement rates are unknown and considered as control (decision) variables. The production, advertisement, and demand rates are functions of time t. Maximization of the total proceeds from perfect and imperfect units and minimization of the total cost consisting of production, holding, and advertisement costs are formulated as optimal control problems and solved directly using a multiobjective genetic algorithm (MOGA). In another solution method, membership functions of the objectives are derived and the multiobjective problem is transformed into a single objective by a convex combination of the membership functions; the problem is then solved by the generalized reduced gradient (GRG) method. Finally, a numerical experiment and graphical representation are provided to illustrate the system. PubDate: Mon, 07 Apr 2014 13:37:36 +000

Abstract: Previous studies have reported mixed and ambiguous results on the relationship between TQM practices and performance. This study investigated the impacts of TQM practices on various performance measures, as well as the reasons for and barriers to the TQM practices of firms in Turkey. We used a cross-sectional survey methodology, with the plant as the unit of analysis. The sample was selected from the member firms of the Turkish Quality Association and the firms located in the Kocaeli-Gebze Organized Industrial Zone. We obtained 242 usable questionnaires, a satisfactory response rate of 48.4 percent, and conducted exploratory factor analysis and multiple regression analysis. This study has shown that different TQM practices significantly affect different performance outcomes. Results revealed that the primary obstacles facing firms in Turkey were lack of employee involvement, awareness, and commitment; inappropriate firm structure; and lack of resources. It is recommended that firms continue to implement TQM across all its dimensions to improve performance, strengthen employees’ involvement in, commitment to, and awareness of TQM, enhance firm structure, and provide the resources needed to overcome the barriers that prevent effective implementation of TQM practices. PubDate: Sun, 16 Mar 2014 00:00:00 +000