Central European Journal of Operations Research   [SJR: 0.837]   [H-I: 17]   Hybrid journal (it can contain Open Access articles)   ISSN (Print) 1613-9178 - ISSN (Online) 1435-246X   Published by Springer-Verlag
• Cheap talk and cooperation in Stackelberg games
• Authors: Raimo P. Hämäläinen; Ilkka Leppänen
Pages: 261 - 285
Abstract: Previous literature on cheap talk suggests that it is used to increase cooperation. We study cheap talk and the effect of the leader’s private payoff information in new repeated Stackelberg game settings. Our results confirm earlier studies that the players cooperate in repeated Stackelberg games with complete payoff information. In the cheap talk setting the follower has the actual first mover advantage and should in theory benefit from it, but we find that many followers cooperate instead. Similarly, many leaders do not use cheap talk for cheating but commit to symmetric joint-optimum quantities. The leader’s private payoff information results in a low frequency of cooperation, but in the presence of cheap talk players do cooperate.
PubDate: 2017-06-01
DOI: 10.1007/s10100-016-0444-9
Issue No: Vol. 25, No. 2 (2017)

• The impact of asynchronous trading on Epps effect on Warsaw Stock Exchange
• Authors: Henryk Gurgul; Artur Machno
Pages: 287 - 301
Abstract: The main goal of the analysis is the verification of whether asynchrony in transaction times is a considerable cause of the Epps effect on the Warsaw Stock Exchange among the most liquid assets. A method for compensating for the impact of asynchrony in trading on the Epps effect is presented. The method is easily applicable. Calculations are made using the exact time of transactions and prices of the assets. The estimation is not biased by intervals during which no transactions have taken place. Among all the analyzed stock pairs, asynchrony turns out to be the main cause of the Epps effect. However, the corrected correlation estimator seems to be more volatile than the regular estimator of the correlation. The presented analysis can be reproduced for the same data or replicated for another dataset; all R codes used in the process of writing this article are available upon request. The main novelty/value added of this paper is the application to an emerging market of a new method for compensating for asynchrony in trading.
PubDate: 2017-06-01
DOI: 10.1007/s10100-016-0442-y
Issue No: Vol. 25, No. 2 (2017)
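The asynchrony mechanism behind the Epps effect described in this abstract can be illustrated with a short simulation. The sketch below (plain Python, with made-up price dynamics and trade times; it is not the authors' correction method) samples two correlated price paths by the previous-tick rule and shows that the return correlation measured at short sampling intervals falls below the one measured at long intervals:

```python
import math
import random

random.seed(1)

# Fine-grained "true" price paths: a common component plus tiny noise.
n = 20000
common = [0.0]
for _ in range(n - 1):
    common.append(common[-1] + random.gauss(0, 0.01))
a = [c + random.gauss(0, 0.001) for c in common]
b = [c + random.gauss(0, 0.001) for c in common]

# Each asset trades only at its own random transaction times.
times_a = sorted({0} | set(random.sample(range(n), n // 20)))
times_b = sorted({0} | set(random.sample(range(n), n // 20)))

def previous_tick(path, trade_times, grid):
    """Last observed trade price at each sampling point (previous-tick rule)."""
    out, j = [], 0
    for t in grid:
        while j + 1 < len(trade_times) and trade_times[j + 1] <= t:
            j += 1
        out.append(path[trade_times[j]])
    return out

def return_corr(x, y):
    """Pearson correlation of the return series of two sampled price paths."""
    rx = [v - u for u, v in zip(x, x[1:])]
    ry = [v - u for u, v in zip(y, y[1:])]
    mx, my = sum(rx) / len(rx), sum(ry) / len(ry)
    cov = sum((u - mx) * (v - my) for u, v in zip(rx, ry))
    var = math.sqrt(sum((u - mx) ** 2 for u in rx) * sum((v - my) ** 2 for v in ry))
    return cov / var

short_grid = list(range(0, n, 20))    # short sampling interval
long_grid = list(range(0, n, 1000))   # long sampling interval
c_short = return_corr(previous_tick(a, times_a, short_grid),
                      previous_tick(b, times_b, short_grid))
c_long = return_corr(previous_tick(a, times_a, long_grid),
                     previous_tick(b, times_b, long_grid))
print(c_short, c_long)   # c_short is depressed relative to c_long: the Epps effect
```

Because the average gap between trades is comparable to the short sampling interval, many short-interval returns are computed from stale prices, which attenuates the measured correlation.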

• Multi-period resource allocation for estimating project costs in
competitive bidding
• Authors: Yuichi Takano; Nobuaki Ishii; Masaaki Muraki
Pages: 303 - 323
Abstract: In competitive bidding for project contracts, contractors estimate the cost of completing a project and then determine the bid price. Accordingly, the bid price is markedly affected by the inaccuracies in the estimated cost. To establish a profit-making strategy in competitive bidding, it is crucial for contractors to estimate project costs accurately. Although allocating a large amount of resources to cost estimates allows contractors to prepare more accurate estimates, there is usually a limit to available resources in practice. To the best of our knowledge, however, none of the existing studies have addressed the resource allocation problem for estimating project costs in competitive bidding. To maximize a contractor’s expected profit, this paper develops a multi-period resource allocation method for estimating project costs in a sequential competitive bidding situation. Our resource allocation model is posed as a mixed integer linear programming problem by making piecewise linear approximations of the expected profit functions. Numerical experiments examine the characteristics of the optimal resource allocation and demonstrate the effectiveness of our resource allocation method.
PubDate: 2017-06-01
DOI: 10.1007/s10100-016-0438-7
Issue No: Vol. 25, No. 2 (2017)
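The piecewise-linear approximation that turns expected-profit functions into MILP-compatible form can be sketched independently of the full model. The snippet below (plain Python; the profit curve and breakpoints are invented for illustration, not taken from the paper) interpolates a concave function between breakpoints and shows that a finer breakpoint grid reduces the approximation error:

```python
import math

def profit(x):
    """Hypothetical concave expected-profit curve (illustrative only)."""
    return 10 * math.log(1 + x)

def piecewise(breaks, f, x):
    """Piecewise-linear interpolant of f at x, as used in MILP linearizations."""
    for a, b in zip(breaks, breaks[1:]):
        if a <= x <= b:
            lam = (x - a) / (b - a)
            return (1 - lam) * f(a) + lam * f(b)
    raise ValueError("x outside breakpoint range")

coarse = [0, 2.5, 5, 7.5, 10]          # few breakpoints
fine = [0.5 * i for i in range(21)]    # many breakpoints on the same range
x = 3.3
err_coarse = abs(piecewise(coarse, profit, x) - profit(x))
err_fine = abs(piecewise(fine, profit, x) - profit(x))
print(err_coarse, err_fine)   # the finer grid gives a smaller error
```

In a MILP this interpolation is expressed with convex-combination (lambda) variables per segment; the trade-off is the same: more breakpoints mean a tighter approximation but more variables.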

• Assessing efficiency of public health and medical care provision in OECD
countries after a decade of reform
• Authors: Yasar A. Ozcan; Jaya Khushalani
Pages: 325 - 343
Abstract: The objective of this study was to examine the change in efficiency of health care systems of 34 OECD countries between 2000 and 2012, a period marked by significant health reform in most OECD countries. This paper uses a novel Dynamic Network Data Envelopment Analysis (DNDEA) model to analyze the efficiency of the public health system and the medical care system of these OECD countries independently along with assessing the efficiency of their overall health system. This helps understand the relative priorities for improving the overall health system. The data for this study was obtained from the OECD Health Facts database. The study findings suggest that countries which improved their public health system were more likely to show overall improvement in efficiency.
PubDate: 2017-06-01
DOI: 10.1007/s10100-016-0440-0
Issue No: Vol. 25, No. 2 (2017)

• Synchronizing vans and cargo bikes in a city distribution network
• Authors: Alexandra Anderluh; Vera C. Hemmelmayr; Pamela C. Nolz
Pages: 345 - 376
Abstract: One of the significant side-effects of growing urbanization is the constantly increasing amount of freight transportation in cities. This is mainly performed by conventional vans and trucks and causes a variety of problems such as road congestion, noise nuisance and pollution. Yet delivering goods to residents is a necessity. Sustainable concepts of city distribution networks are one way of mitigating difficulties of freight services. In this paper we develop a two-echelon city distribution scheme with temporal and spatial synchronization between cargo bikes and vans. The resulting heuristic is based on a greedy randomized adaptive search procedure with path relinking. In our computational experiments we use artificial data as well as real-world data of the city of Vienna. Furthermore we compare three distribution policies. The results show the costs caused by temporal synchronization and can give companies decision support in planning a sustainable city distribution concept.
PubDate: 2017-06-01
DOI: 10.1007/s10100-016-0441-z
Issue No: Vol. 25, No. 2 (2017)

• No such thing as a perfect hammer: comparing different objective function
specifications for optimal control
• Authors: D. Blueschke; I. Savin
Pages: 377 - 392
Abstract: Linear-quadratic (LQ) optimization is a fairly standard technique in the optimal control framework. LQ is very well researched, and there are many extensions for more sophisticated scenarios like nonlinear models. Conventionally, the quadratic objective function is taken as a prerequisite for calculating derivative-based solutions of optimal control problems. However, it is not clear whether this framework is as universal as it is considered to be. In particular, we address the question whether the objective function specification and the corresponding penalties applied are well suited in case of a large exogenous shock an economy can experience because of, e.g., the European debt crisis. While one can still efficiently minimize quadratic deviations around policy targets, the economy itself has to go through a period of turbulence with economic indicators, such as unemployment, inflation or public debt, changing considerably over time. We test four alternative designs of the objective function: a least median of squares based approach, absolute deviations, cubic and quartic objective functions. The analysis is performed based on a small-scale model of the Austrian economy and illustrates a certain trade-off between quickly finding an optimal solution using the LQ technique (reaching defined policy targets) and accounting for alternative objectives, such as limiting volatility in economic performance. As an implication, we argue in favor of the considerably more flexible optimization technique based on heuristic methods (such as Differential Evolution), which allows one to minimize various loss function specifications, but also takes additional constraints into account.
PubDate: 2017-06-01
DOI: 10.1007/s10100-016-0446-7
Issue No: Vol. 25, No. 2 (2017)
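The trade-off between the LQ objective and higher-order alternatives can be seen in a toy calculation. The sketch below (plain Python; the deviation paths are invented, not from the Austrian model used in the paper) compares absolute, quadratic and quartic penalties on two paths of deviations from a policy target, one persistently moderate and one with a single large shock:

```python
# Two hypothetical paths of deviations from a policy target (illustrative).
smooth = [1.0] * 8              # persistent moderate deviations
spiky = [0.0] * 7 + [2.5]       # on target except for one large shock

def loss(devs, p):
    """Sum of |deviation|^p; p = 2 is the classical LQ penalty."""
    return sum(abs(d) ** p for d in devs)

for p, name in [(1, "absolute"), (2, "quadratic"), (4, "quartic")]:
    print(name, loss(smooth, p), loss(spiky, p))
# The quadratic objective prefers the spiky path (6.25 < 8.0), while the
# quartic objective penalizes the large shock heavily (39.0625 > 8.0),
# i.e. it favors limiting volatility in economic performance.
```

This is the essence of the trade-off the abstract describes: which path counts as "better" depends entirely on the exponent chosen in the loss specification.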

• Usability of Nomology-based methodologies in supporting problem
structuring across cultures: the case of participatory decision-making in
Tanzania rural communities
• Authors: Joseph R. Kakeneno; Cathal MacSwiney Brugha
Pages: 393 - 415
Abstract: In this paper, we present the results of an empirical study that was conducted to demonstrate how the Structured MCDM methodology, which is based on Nomology, the science of the laws of the mind, could be used to support problem structuring and improve rural community participation in a developing country in Africa. The results support the view that a model which is based on a generic structure is flexible and transferable to similar problem contexts and various situations across cultures and beyond national borders, and that it can easily support distributed participatory decision-making or be integrated into a Participatory Decision Support System.
PubDate: 2017-06-01
DOI: 10.1007/s10100-016-0460-9
Issue No: Vol. 25, No. 2 (2017)

• Multi-objective two-stage grey transportation problem using utility
function with goals
• Authors: Sankar Kumar Roy; Gurupada Maity; Gerhard-Wilhelm Weber
Pages: 417 - 439
Abstract: Multi-Objective Goal Programming is applied to solve problems in many application areas of real-life decision making. We formulate the mathematical model of a Two-Stage Multi-Objective Transportation Problem (MOTP) in which we design the feasibility space based on the selection of goal values. Considering the uncertainty in real-life situations, we incorporate grey parameters for supply and demand into the Two-Stage MOTP, and a procedure is applied to reduce the grey numbers to real numbers. Thereafter, we present a solution procedure for the proposed problem by introducing an algorithm and using the approach of Revised Multi-Choice Goal Programming. In the proposed algorithm, we introduce a utility function for selecting the goals of the objective functions. A numerical example is presented to demonstrate the applicability and feasibility of our proposed study. Finally, the paper ends with a conclusion and an outlook on future investigations.
PubDate: 2017-06-01
DOI: 10.1007/s10100-016-0464-5
Issue No: Vol. 25, No. 2 (2017)

• Heuristic algorithms for the minmax regret flow-shop problem with interval
processing times
• Authors: Michał Ćwik; Jerzy Józefczyk
Abstract: An uncertain version of the permutation flow-shop problem with unlimited buffers and the makespan as a criterion is considered. The investigated parametric uncertainty is represented by given interval-valued processing times. The maximum regret is used for the evaluation of uncertainty. Consequently, the minmax regret discrete optimization problem is solved. Due to its high complexity, two relaxations are applied to simplify the optimization procedure. First of all, a greedy procedure is used for calculating the criterion’s value, as this calculation is an NP-hard problem in itself. Moreover, the lower bound is used instead of solving the internal deterministic flow-shop problem. A constructive heuristic algorithm is applied to the relaxed optimization problem. The algorithm is compared with other previously developed heuristic algorithms based on the evolutionary and the middle-interval approaches. The conducted computational experiments showed the advantage of the constructive heuristic algorithm with regard to both the criterion value and the computation time. The Wilcoxon paired-rank statistical test confirmed this conclusion.
PubDate: 2017-07-29
DOI: 10.1007/s10100-017-0485-8
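The ingredients of the problem, a permutation flow-shop makespan under interval processing times and a regret evaluation, can be sketched briefly. The snippet below (plain Python, with invented 4-job, 2-machine interval data) implements the standard makespan recursion and the midpoint-scenario ("middle interval") idea named in the abstract; the regret bound shown is a crude upper estimate, not the authors' constructive algorithm:

```python
from itertools import permutations

def makespan(perm, p):
    """Completion time of the last job on the last machine (permutation flow shop)."""
    m = len(p[0])
    done = [0.0] * m
    for j in perm:
        done[0] += p[j][0]
        for k in range(1, m):
            done[k] = max(done[k], done[k - 1]) + p[j][k]
    return done[-1]

# Interval processing times [lo, hi] for 4 jobs on 2 machines (made-up data).
lo = [[2, 4], [3, 2], [4, 3], [1, 5]]
hi = [[4, 6], [5, 3], [6, 5], [2, 7]]
mid = [[(a + b) / 2 for a, b in zip(jl, jh)] for jl, jh in zip(lo, hi)]

# Middle-interval heuristic: optimize the deterministic midpoint instance.
best = min(permutations(range(4)), key=lambda s: makespan(s, mid))

# Crude upper bound on the maximum regret of `best`: its worst-case makespan
# minus the best makespan achievable in the most favorable scenario.
regret_bound = makespan(best, hi) - min(makespan(s, lo)
                                        for s in permutations(range(4)))
print(best, regret_bound)
```

Computing the exact maximum regret requires optimizing over the adversary's scenario choice as well, which is what makes the full problem so hard and motivates the relaxations described above.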

• Time to dispense with the p-value in OR?
• Authors: Marko Hofmann; Silja Meyer-Nieberg
Abstract: Null hypothesis significance testing is the standard procedure of statistical decision making, and p-values are the most widespread decision criteria of inferential statistics, both in science in general and in operations research in particular. p-values are of paramount importance in the life and human sciences, and dominate statistical summaries in the natural and technical sciences as well as in operations research, a domain in which the p-value seems to be a common denominator for decision making based on samples. Yet the use of significance testing in the analysis of research data has been criticized by numerous statisticians, continuously for almost 100 years. This criticism has recently (March 7, 2016) been given an official status by a statement from the American Statistical Association on p-values. Is it time to dispense with the p-value in OR? The answer depends on many factors, including the research objective, the research domain, and, especially, the amount of information provided in addition to the p-value. Despite this dependence on context, three conclusions can be drawn that should concern the operational analyst. First, p-values can legitimately cast doubt on a null hypothesis or its underlying assumptions, but they are only a first step of analysis which, standing alone, lacks expressive power. Second, the statistical layman almost inescapably misinterprets the evidentiary value of p-values. Third and foremost, p-values are an inadequate choice for a succinct executive summary of statistical evidence for or against a research question. In statistical summaries, confidence intervals of standardized effect sizes provide much more information than p-values without requiring much more space.
PubDate: 2017-07-28
DOI: 10.1007/s10100-017-0484-9
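The closing recommendation, reporting a confidence interval of a standardized effect size instead of a bare p-value, is easy to operationalize. The sketch below (plain Python with invented sample data) computes Cohen's d for two groups together with an approximate 95% interval; the variance formula is the usual large-sample normal approximation, stated here as an assumption rather than anything taken from the paper:

```python
import math
import statistics

a = [5.1, 4.9, 5.4, 5.6, 5.0, 5.3, 4.8, 5.5]   # illustrative group samples
b = [4.6, 4.9, 4.7, 5.1, 4.5, 4.8, 5.0, 4.4]

na, nb = len(a), len(b)
pooled_sd = math.sqrt(((na - 1) * statistics.variance(a) +
                       (nb - 1) * statistics.variance(b)) / (na + nb - 2))
d = (statistics.mean(a) - statistics.mean(b)) / pooled_sd   # Cohen's d

# Large-sample approximation of Var(d), then a 95% normal-theory interval.
se = math.sqrt((na + nb) / (na * nb) + d * d / (2 * (na + nb)))
low, high = d - 1.96 * se, d + 1.96 * se
print(round(d, 2), (round(low, 2), round(high, 2)))
```

Unlike a p-value, the pair (effect size, interval) conveys both the magnitude of the difference and the precision with which it is estimated.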

• Investments in supplier-specific economies of scope with two different
services and different supplier characters: two specialists
• Authors: Günter Fandel; Jan Trockel
Abstract: Firms have to choose their market positions. Suppliers can offer a wide range of services as generalists, or they can act as specialists by offering a small range of services. In this paper, based on Chatain/Zemsky (Manag Sci 53:550–565, 2007) and Chatain (Strateg Manag J 32:76–102, 2011), we analyse how supplier-specific economies of scope generated by investments can compensate for the loss caused by a non-optimal organisational structure (resource configuration) of production. These considerations are modelled as a non-cooperative game with one buyer and two suppliers. We show how the buyer can gain from supplier-specific economies of scope. In this case, the buyer will never split the orders between the two suppliers; i.e. he should always place both orders with a single supplier if the tasks have similar characteristics and the investment costs of a supplier result in higher specific economies of scope relevant to the choice of the buyer. The size of the specific economies of scope determines with which of the suppliers the buyer will place both orders. However, if the investment costs of the suppliers are very high and/or the gains of the buyer are rather low, the pure strategy combination “no investments” for the two suppliers becomes the unique Nash equilibrium, in which case the buyer places each of the two orders with the supplier who is the specialist for it.
PubDate: 2017-07-27
DOI: 10.1007/s10100-017-0483-x

• Editorial
• Authors: Tibor Csendes; Csanád Imreh; József Temesi
PubDate: 2017-06-20
DOI: 10.1007/s10100-017-0482-y

• Tight upper bounds for semi-online scheduling on two uniform machines with
known optimum
• Authors: György Dósa; Armin Fügenschuh; Zhiyi Tan; Zsolt Tuza; Krzysztof Węsek
Abstract: We consider a semi-online version of the problem of scheduling a sequence of jobs of different lengths on two uniform machines with given speeds 1 and s. Jobs are revealed one by one (the assignment of a job has to be done before the next job is revealed), and the objective is to minimize the makespan. In the considered variant the optimal offline makespan is known in advance. The most studied question for this online-type problem is to determine the optimal competitive ratio, that is, the worst-case ratio of the solution given by an algorithm in comparison to the optimal offline solution. In this paper, we make a further step towards completing the answer to this question by determining the optimal competitive ratio for s between $$\frac{5 + \sqrt{241}}{12} \approx 1.7103$$ and $$\sqrt{3} \approx 1.7321$$, one of the intervals that were still open. Namely, we present and analyze a compound algorithm achieving the previously known lower bounds.
PubDate: 2017-06-14
DOI: 10.1007/s10100-017-0481-z
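The role of the known optimum in this semi-online setting can be illustrated with a simple threshold rule. This is not the compound algorithm of the paper: the sketch below just keeps the fast machine's completion time under a guessed bound of r times the known optimum, and r = 1.6, the speed value and the job sequence are arbitrary illustrations, not the paper's optimal competitive ratio or data:

```python
from itertools import product

def offline_opt(jobs, s):
    """Brute-force optimal makespan on two uniform machines with speeds 1 and s."""
    best = float("inf")
    for assign in product((0, 1), repeat=len(jobs)):
        loads = [0.0, 0.0]
        for j, m in zip(jobs, assign):
            loads[m] += j
        best = min(best, max(loads[0], loads[1] / s))
    return best

def threshold_schedule(jobs, s, opt, r):
    """Semi-online rule exploiting the known optimum: assign to the fast
    machine while its completion time stays below r * opt, else to the slow one."""
    fast = slow = 0.0
    for j in jobs:
        if (fast + j) / s <= r * opt:
            fast += j
        else:
            slow += j
    return max(slow, fast / s)

s, r = 1.72, 1.6                  # speed inside the paper's interval; r guessed
jobs = [3, 1, 4, 1, 5, 2, 6]      # made-up job sequence, revealed one by one
opt = offline_opt(jobs, s)
cmax = threshold_schedule(jobs, s, opt, r)
print(cmax / opt)                 # empirical ratio for this instance
```

Knowing opt in advance is what lets the rule budget each machine against a fixed target; proving a worst-case guarantee over all sequences is the hard part the paper addresses.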

• Basin Hopping Networks of continuous global optimization problems
• Authors: Tamás Vinkó; Kitti Gelle
Abstract: Characterization of optimization problems with respect to their solvability is one of the focal points of many research projects in the field of global optimization. Our study contributes to these efforts with the usage of the computational and mathematical tools of network science. Given an optimization problem, a network formed by all the minima found by an optimization method can be constructed. In this paper we use the Basin Hopping method on well-known benchmarking problems and investigate the resulting networks using several measures.
PubDate: 2017-05-30
DOI: 10.1007/s10100-017-0480-0
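A minimal version of the minima-network construction can be sketched in a few lines. In the snippet below (plain Python; the test function, descent routine and acceptance rule are simplifications invented for illustration, not the benchmark problems or the exact Basin Hopping variant of the paper), each perturbation-plus-descent step records which local minimum was reached from the current one, yielding the edge list of a small basin-hopping network:

```python
import random

random.seed(0)

def f(x):
    """Two-basin test function (illustrative), minima near x = -2.03 and x = 1.97."""
    return (x * x - 4) ** 2 + x

def descend(x, step=1e-3):
    """Crude local descent to the bottom of the current basin."""
    while True:
        if f(x - step) < f(x):
            x -= step
        elif f(x + step) < f(x):
            x += step
        else:
            return round(x, 2)   # coarse rounding identifies the basin

edges = {}                        # minimum -> set of minima reachable from it
x = descend(random.uniform(-3, 3))
for _ in range(200):
    y = descend(x + random.gauss(0, 1.0))   # perturb, then minimize locally
    edges.setdefault(x, set()).add(y)
    if f(y) <= f(x):                        # greedy acceptance step
        x = y
print({k: sorted(v) for k, v in edges.items()})
```

On real benchmark problems the resulting directed graph over minima is the object whose network measures (degree distributions, connectivity, and so on) the paper then studies.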

• The dynamic vehicle rescheduling problem
• Authors: Balázs Dávid; Miklós Krész
Abstract: The pre-planned schedules of a transportation company are often disrupted by unforeseen events. As a result of a disruption, a new schedule has to be produced as soon as possible. This process is called the vehicle rescheduling problem, which aims to solve a single disruption and restore the order of transportation. However, there are multiple disruptions happening over a “planning unit” (usually a day), and all of them have to be addressed to achieve a final feasible schedule. From an operations management point of view the quality of the final solution has to be measured by the combined quality of every change over the horizon of the “planning unit”, not by evaluating the solution of each disruption as a separate problem. The problem of finding an optimal solution where all disruptions of a “planning unit” are addressed will be introduced as the dynamic vehicle rescheduling problem (DVRSP). The disruptions of the DVRSP arrive in an online manner, but giving an optimal final schedule for the “planning unit” would mean knowing all information in advance. This is not possible in a real-life scenario, which means that heuristic solution methods have to be considered. In this paper, we present a recursive and a local search algorithm to solve the DVRSP. In order to measure the quality of the solutions given by the heuristics, we introduce the so-called quasi-static DVRSP, a theoretical problem where all the disruptions are known in advance. We give two mathematical models for this quasi-static problem, and use their optimal solutions to evaluate the quality of our heuristic results. The heuristic methods for the dynamic problem are tested on different random instances.
PubDate: 2017-05-24
DOI: 10.1007/s10100-017-0478-7

• A framework for sensitivity analysis of decision trees
• Authors: Bogumił Kamiński; Michał Jakubczyk; Przemysław Szufel
Abstract: In the paper, we consider sequential decision problems with uncertainty, represented as decision trees. Sensitivity analysis is always a crucial element of decision making and in decision trees it often focuses on probabilities. In the stochastic model considered, the user often has only limited information about the true values of probabilities. We develop a framework for performing sensitivity analysis of optimal strategies accounting for this distributional uncertainty. We design this robust optimization approach in an intuitive and not overly technical way, to make it simple to apply in daily managerial practice. The proposed framework allows for (1) analysis of the stability of the expected-value-maximizing strategy and (2) identification of strategies which are robust with respect to pessimistic/optimistic/mode-favoring perturbations of probabilities. We verify the properties of our approach in two cases: (a) probabilities in a tree are the primitives of the model and can be modified independently; (b) probabilities in a tree reflect some underlying, structural probabilities, and are interrelated. We provide a free software tool implementing the methods described.
PubDate: 2017-05-24
DOI: 10.1007/s10100-017-0479-6
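The kind of perturbation such a framework studies can be reproduced on a two-option toy tree. In the sketch below (plain Python; the tree, payoffs and the pessimistic "shift mass to the worst branch" rule are invented illustrations, not the authors' exact formalization), a modest pessimistic perturbation of the chance-node probabilities flips the expected-value-maximizing strategy from the risky option to the safe one:

```python
# Toy decision tree: a decision node with two options; the chance node holds
# (probability, child) branches; leaves are payoffs (all numbers invented).
tree = {
    "type": "decision",
    "options": {
        "risky": {"type": "chance", "branches": [(0.6, 100), (0.4, -50)]},
        "safe": 30,
    },
}

def ev(node, shift=0.0):
    """Expected value; `shift` moves probability mass toward the worst branch
    (a simple pessimistic perturbation of the stated probabilities)."""
    if isinstance(node, (int, float)):
        return node
    if node["type"] == "decision":
        return max(ev(child, shift) for child in node["options"].values())
    branches = [(p, ev(child, shift)) for p, child in node["branches"]]
    worst = min(range(len(branches)), key=lambda i: branches[i][1])
    k = len(branches) - 1
    adj = [(p + shift if i == worst else p - shift / k, v)
           for i, (p, v) in enumerate(branches)]
    total = sum(p for p, _ in adj)           # renormalize defensively
    return sum(p * v for p, v in adj) / total

def best_option(tree, shift=0.0):
    return max(tree["options"], key=lambda name: ev(tree["options"][name], shift))

print(best_option(tree, 0.0), best_option(tree, 0.2))   # risky -> safe
```

Scanning `shift` from zero upward identifies the smallest pessimistic perturbation at which the optimal strategy changes, which is the stability question the framework above formalizes.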

• Should business rely on business cycle forecasting?
• Authors: Tobias F. Rötheli
Abstract: We investigate the circumstances in which business cycle forecasting is beneficial for business by addressing both the short-run and the long-run aspects. For an assessment of short-run forecasting we make a distinction between using publicly available information of cycle probabilities and the use of resources to sharpen this outlook. A sharpened forecast can pay off because it helps the firm to optimally select its output mix. For a long-run perspective we show that firms whose optimal level of operation varies with varying selling prices gain from an accurate assessment of the likelihood of the states of expansion and recession. Petroleum refining in the U.S. is econometrically studied as an exemplary industry. The results document cyclical regularities that indicate that forecasting is advantageous for firms in this industry.
PubDate: 2017-05-22
DOI: 10.1007/s10100-017-0477-8

• Order Batching and Picker Routing in manual order picking systems: the
benefits of integrated routing
• Authors: A. Scholz; G. Wäscher
Abstract: Order Batching and Picker Routing Problems arise in warehouses when items specified by customer orders have to be retrieved from their storage locations. The Order Batching Problem includes the grouping of a given set of customer orders into feasible picking orders such that the total length of all picker tours is minimized. In order to calculate the length of a picker tour, the sequence according to which the items contained in the picking order will be picked has to be determined. This problem is known as the Picker Routing Problem. Although quite sophisticated heuristics and even efficient exact solution approaches exist for the Picker Routing Problem in warehouses with up to two blocks, the routing problem does not receive much attention when dealing with the Order Batching Problem. Instead, the order pickers are assumed to follow a certain simple routing strategy when making their way through the warehouse. The advantage of this approach is that, in particular for single-block warehouse layouts, the corresponding picker tours are very straightforward and can be memorized easily by the order pickers. This advantage diminishes, however, when more complex, multi-block layouts have to be dealt with. Furthermore, in such cases, the approach may result in picker tours which are far from optimal. For multi-block layouts, we integrate different routing algorithms into an iterated local search approach for the batching in order to demonstrate the benefits of solving the Order Batching and Picker Routing Problems in a more integrated way. By means of numerical experiments it is shown that paying more attention to the Picker Routing Problem results in a substantial improvement of the solution quality without increasing computing times.
PubDate: 2017-01-31
DOI: 10.1007/s10100-017-0467-x

• One-dimensional stock cutting: optimization of usable leftovers in
consecutive orders
• Authors: Luka Tomat; Mirko Gradišar
Abstract: This paper deals with usable leftovers (UL) in one-dimensional stock cutting in consecutive orders. UL are leftovers longer than a certain threshold and are returned to stock in order to be used in future orders. Shorter leftovers are treated as trim loss. If UL are used at a slower pace than they are generated over a longer period of time, excessive growth of UL in stock can occur. This is not acceptable due to the higher costs of manipulation and warehousing. However, a certain amount of UL in stock is desirable because it contributes to a greater variety of stock lengths, which in general results in lower trim loss. The proposed method solves the problem of how to calculate the near-optimal amount of UL in stock and how to control the stock. It is tested by means of a computer simulation in which UL from previous orders are used in subsequent ones instead of being randomly generated. The computational results indicate that trim loss in consecutive orders is reduced, and excessive growth of UL in stock is prevented.
PubDate: 2017-01-16
DOI: 10.1007/s10100-017-0466-y
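The interplay between usable leftovers and trim loss over consecutive orders can be mimicked with a first-fit sketch. Below (plain Python; the threshold, stock length, order data and the first-fit-decreasing rule are all invented for illustration, not the paper's method), leftovers at or above the threshold flow back into stock for the next order, while shorter remnants count as trim loss:

```python
STOCK_LEN = 100     # length of a new stock bar (assumed)
THRESHOLD = 20      # leftovers at least this long are usable (UL) (assumed)

def cut_order(items, stock):
    """Cut each item, largest first, from the first piece it fits into;
    open a new full-length bar when nothing fits. Returns (UL, trim loss)."""
    pieces = list(stock)
    for item in sorted(items, reverse=True):
        for i, p in enumerate(pieces):
            if p >= item:
                pieces[i] = p - item
                break
        else:
            pieces.append(STOCK_LEN - item)
    ul = [p for p in pieces if p >= THRESHOLD]
    trim = sum(p for p in pieces if 0 < p < THRESHOLD)
    return ul, trim

stock, total_trim = [], 0
for order in [[60, 55, 30], [45, 40, 25], [70, 15, 10]]:   # consecutive orders
    stock, trim = cut_order(order, stock)
    total_trim += trim
print(stock, total_trim)   # UL left in stock, accumulated trim loss
```

Raising the threshold shrinks the UL pool but turns more remnants into trim loss, which is exactly the balance the proposed method tries to control over time.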

• Stochastic sensitivity analysis of concentration measures
• Authors: Martin Boďa
Abstract: The paper extends the traditional approach to measuring market concentration by embracing an element of stochasticity that should reflect the analyst’s uncertainty associated with the future development regarding concentration on the market. Whereas conventional practice relies on deterministic assessments of a market concentration measure with the use of current market shares, this says nothing about possible changes that may happen even in a near future. The paper proposes to model the analyst’s beliefs by dint of a suitable joint probability distribution for future market shares and demonstrates how this analytic framework may be employed for regulatory purposes. A total of four candidates for the joint probability distribution of market shares are considered: the Dirichlet distribution, the conditional normal distribution, the Gaussian copula with conditional beta marginals and the predictive distribution arising from the market share attraction model. It is shown how their hyperparameters can be elicited so that a minimum burden is placed on the analyst. The proposed procedure for stochastic sensitivity analysis of concentration measures is demonstrated in a case study oriented on the Slovak banking sector.
PubDate: 2017-01-10
DOI: 10.1007/s10100-016-0465-4
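The stochastic layer the paper adds on top of a deterministic concentration measure can be imitated with the simplest of the four candidates, the Dirichlet distribution. The sketch below (plain Python; the market shares, the precision parameter and the 2500-point cut-off are invented illustrations, not the paper's elicitation procedure) draws future share vectors from a Dirichlet centred on current shares and estimates the probability that the Herfindahl-Hirschman index exceeds a regulatory threshold:

```python
import random

random.seed(42)

def dirichlet(alpha):
    """Sample from Dirichlet(alpha) via normalized Gamma draws."""
    g = [random.gammavariate(a, 1.0) for a in alpha]
    s = sum(g)
    return [x / s for x in g]

def hhi(shares):
    """Herfindahl-Hirschman index on the conventional 0-10000 scale."""
    return 10000 * sum(x * x for x in shares)

current = [0.35, 0.30, 0.20, 0.15]   # illustrative current market shares
precision = 50                        # higher = analyst more confident in `current`
alpha = [precision * s for s in current]

draws = [hhi(dirichlet(alpha)) for _ in range(5000)]
p_high = sum(d > 2500 for d in draws) / len(draws)
print(round(hhi(current)), round(p_high, 3))
```

Instead of reporting the single deterministic value, the analyst reports a whole distribution of plausible future concentration levels and the probability of crossing a regulatory line.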

JournalTOCs
School of Mathematical and Computer Sciences
Heriot-Watt University
Edinburgh, EH14 4AS, UK
Email: journaltocs@hw.ac.uk
Tel: +00 44 (0)131 4513762
Fax: +00 44 (0)131 4513327
