Abstract: Error compensation techniques have been widely applied to improve multiaxis machine accuracy. However, owing to the lack of reliable instrumentation for direct and overall measurement, all compensation methods are based on offline measurements of each error component taken separately. The results of these measurements are static in nature and reflect only the conditions at the moment of measurement. They are not representative of real working conditions because of disturbances from load deformations, thermal distortions, and dynamic perturbations. The present approach involves the development of a new measurement system capable of dynamically evaluating the errors along all six degrees of freedom. The developed system generates useful data that cover all machine states regardless of the operating conditions. The obtained measurements can be used for performance evaluation of the machine, calibration, and real-time error compensation. The system performs dynamic measurements reflecting the global accuracy of the machine tool without a long and expensive analysis of the contribution of the various error sources. Finally, the system exhibits metrological characteristics compatible with high-precision applications. PubDate: Tue, 20 Dec 2016 09:03:20 +000

Abstract: Alginate hydrogels are suitable for the encapsulation of a great variety of biomolecules. Several alternatives to the conventional alginate formulation are being studied for a broad range of biotechnological applications; among them, the addition of sugars and biopolymers arises as a good and economical strategy. Sugars (trehalose and β-cyclodextrin), a cationic biopolymer (chitosan), an anionic biopolymer (pectin), and neutral gums (Arabic, guar, espina corona, and vinal gums) provided different characteristics to the beads. Here we discuss the influence of bead composition on several physicochemical properties, such as size and shape, analyzed through digital image analysis, as well as water content and water activity. The results showed that the addition of a second biopolymer, β-CD, or trehalose produced more compact beads, but compactness does not necessarily imply a concomitant increase in circularity. Espina corona beads showed the highest circularity value, making them useful for applications that require controlled, high circularity and assured quality control. Beads with trehalose showed lower water content than the rest of the systems, followed by those containing galactomannans (espina corona, vinal, and guar gums), revealing effects of polymer structure. A complete characterization of the beads was performed by FT-IR, assigning the characteristic bands to each individual component. PubDate: Wed, 07 Sep 2016 09:58:55 +000
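Shape descriptors such as the circularity discussed above are typically computed from the segmented bead outline. A minimal sketch of the standard image-analysis circularity metric (4πA/P², equal to 1 for a perfect circle) is shown below; the formula is the common definition, not necessarily the exact descriptor used by the authors, and the example shapes are illustrative.

```python
import math

def circularity(area, perimeter):
    """Standard shape descriptor: 4*pi*A / P**2, equal to 1.0 for a
    perfect circle and smaller for elongated or irregular outlines."""
    return 4.0 * math.pi * area / perimeter ** 2

# A perfect circle of radius 2 (area pi*r^2, perimeter 2*pi*r):
c_circle = circularity(math.pi * 4.0, 4.0 * math.pi)
# A 4 x 1 rectangle of the same area is far less circular:
c_rect = circularity(4.0, 10.0)
```

Two beads of equal area can thus differ strongly in circularity, which is why compactness alone does not guarantee a high circularity value.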

Abstract: Although medical equipment maintenance has been carefully managed for years, very few in-depth studies have been conducted to evaluate the effectiveness and efficiency of the implemented preventive maintenance strategies, especially since the debate about the credibility of manufacturers' recommendations intensified in the clinical engineering community. Facing the dilemma of merely following the manufacturer's maintenance manual or establishing evidence-based maintenance, medical equipment maintenance could exploit an advanced area of operations research: maintenance optimization. In this paper, we carefully review and examine the status of application-oriented research on preventive maintenance optimization of medical devices. This study addresses preventive healthcare maintenance with a focus on the factors influencing maintenance decision making. The analysis is structured by defining the different aspects necessary to construct a maintenance optimization model. We conclude by proposing directions for developing suitable tools for better healthcare maintenance management. PubDate: Wed, 27 Jul 2016 11:28:56 +000

Abstract: This paper presents a multifactor decision-making approach based on the grey-complex proportional assessment (COPRAS-G) method with a view to overcoming the limitations of Failure Mode Effect and Criticality Analysis (FMECA). In this model, the scores against each failure mode are expressed as grey numbers instead of crisp values in order to evaluate the criticalities of the failure modes under uncertainty. The suggested study is carried out to identify the weights of major failure causes for bearings, gears, and shafts of an aluminium wire rolling mill plant. The primary findings of the paper are that sudden impact on the rolls appears to be the most critical failure cause and loss of power the least critical. It is suggested to modify the current control practices with a proper maintenance strategy based on the achieved maintainability criticality index (MCI) for the different failure causes. The outcome of the study will be helpful in deriving an optimized maintenance plan to maximize the performance of the process industry. PubDate: Thu, 14 Jul 2016 17:00:47 +000
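The core idea of grey scoring can be sketched briefly: each failure cause receives an interval [lower, upper] rather than a crisp score, and intervals are "whitened" into crisp values only at the ranking stage. The causes, criteria weights, and scores below are hypothetical illustrations, not the study's data, and the sketch omits the normalization steps of the full COPRAS-G procedure.

```python
# Hypothetical grey (interval) scores [lower, upper] for three failure
# causes on two weighted criteria; illustrative values only.
weights = [0.6, 0.4]                      # assumed criteria weights
grey_scores = {
    "sudden impact on rolls": [(8, 10), (7, 9)],
    "misalignment":           [(4, 6),  (5, 7)],
    "loss of power":          [(1, 3),  (2, 4)],
}

def whitened_index(intervals, weights, xi=0.5):
    # Whitenisation: a grey number [a, b] becomes (1 - xi)*a + xi*b.
    return sum(w * ((1 - xi) * a + xi * b)
               for w, (a, b) in zip(weights, intervals))

ranking = sorted(grey_scores,
                 key=lambda c: whitened_index(grey_scores[c], weights),
                 reverse=True)
# ranking[0] is the most critical cause, ranking[-1] the least critical.
```

Keeping the interval form until the last step is what lets the method express scoring uncertainty that crisp FMECA ratings cannot.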

Abstract: The family of consecutive-type reliability systems is under investigation. More specifically, an up-to-date presentation of almost all generalizations of the well-known consecutive k-out-of-n: F system that have been proposed in the literature is given, and several recent and fundamental results for each member of the aforementioned family are stated. PubDate: Sun, 31 May 2015 07:27:54 +000
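For the baseline member of this family, the reliability of a linear consecutive-k-out-of-n:F system with i.i.d. components admits a well-known recursion; a minimal sketch is given below (it assumes n ≥ k and independent, identical components, the simplest case discussed in such surveys).

```python
def consec_k_out_of_n_F(n, k, p):
    """Reliability of a linear consecutive-k-out-of-n:F system with i.i.d.
    components of reliability p: the system fails if and only if at least
    k consecutive components fail. Assumes n >= k."""
    q = 1.0 - p
    R = [1.0] * (n + 1)            # fewer than k components cannot fail
    R[k] = 1.0 - q ** k
    for j in range(k + 1, n + 1):
        R[j] = R[j - 1] - p * q ** k * R[j - k - 1]
    return R[n]

# k = 1 reduces to a plain series system, where R = p ** n.
r_series = consec_k_out_of_n_F(5, 1, 0.9)
```

The recursion conditions on the position of the first working component after a potential failing run, which is why the k = 1 case collapses to the familiar series-system product.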

Abstract: Advancement in technology has led to greater accessibility of massive and complex data in many fields, such as quality and reliability. Proper management and utilization of valuable data can significantly increase knowledge and reduce cost through preventive actions, whereas erroneous and misinterpreted data can lead to poor inference and decision making. On the other hand, it has become more difficult to process streaming high-dimensional time-to-event data with traditional approaches, specifically in the presence of censored observations. This paper presents a multipurpose analytic model and practical nonparametric methods for analyzing right-censored time-to-event data with high-dimensional covariates. In order to reduce redundant information and to facilitate practical interpretation, the inefficiency of each variable with respect to failure time is determined for the specific field of application. To investigate the performance of the proposed methods, they are compared with recent relevant approaches through numerical experiments and simulations. PubDate: Sun, 10 May 2015 09:46:49 +000
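The classic nonparametric baseline for right-censored time-to-event data is the Kaplan-Meier product-limit estimator; the sketch below illustrates how censored observations reduce the risk set without contributing failure events. This is a standard reference method, not necessarily the specific estimator the paper proposes, and the data are invented for illustration.

```python
from collections import Counter

def kaplan_meier(times, events):
    """Product-limit survival estimate; events: 1 = failure, 0 = censored.
    Returns {failure time: S(t)} for each observed failure time."""
    deaths = Counter(t for t, e in zip(times, events) if e)
    removed = Counter(times)
    at_risk, S, curve = len(times), 1.0, {}
    for t in sorted(set(times)):
        d = deaths.get(t, 0)
        if d:
            S *= 1.0 - d / at_risk    # survival drops only at failures
            curve[t] = S
        at_risk -= removed[t]         # censored items leave the risk set
    return curve

# Hypothetical data: event flag 0 marks a right-censored observation.
curve = kaplan_meier([2, 3, 3, 5, 8], [1, 1, 0, 1, 0])
```

The censored observation at time 3 never appears as a drop in the curve, yet it shrinks the risk set, which is exactly the mechanism that makes naive complete-case analysis biased.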

Abstract: The aim of coal quality control in coal mines is to supply power plants daily with extracted raw material within certain coal quality constraints. Using the example of a selected part of a lignite deposit, the problem of quality control for the run-of-mine lignite stream is discussed. The main goal is to understand potential fluctuations and deviations from production targets dependent on design options before an investment is made. A single quality parameter of the deposit is selected for this analysis: the calorific value of raw lignite. The approach requires an integrated analysis of the deposit's inherent variability, the extraction sequence, and the blending options during material transportation. Based on drill-hole data, models capturing the spatial variability of the attribute under consideration are generated. An analysis based on two modelling approaches, Kriging and sequential Gaussian simulation, reveals their advantages and disadvantages and leads to conclusions about their suitability for the control of raw material quality. In a second step, based on a production schedule, the variability of the calorific value in the lignite stream is analysed. In a third step, the effect of different design options, multiple excavators and a blending bed, is investigated. PubDate: Thu, 29 Jan 2015 16:46:46 +000

Abstract: The present paper analyzes a two-unit cold standby system wherein both units may become operative depending upon the demand. Initially, one of the units is operative while the other is kept as cold standby. If the operative unit fails, or the demand increases to the extent that one operative unit cannot meet it, the standby unit becomes operative instantaneously. Thus, both units may become operative simultaneously to meet the increased demand. Availability is considered in three types of upstates: (i) when the demand is less than or equal to the production of one unit; (ii) when the demand is greater than the production of one unit but less than or equal to the production of two units; and (iii) when the demand is greater than the production of two units. Other measures of system effectiveness have also been obtained for the general case as well as for a particular case. Techniques of semi-Markov processes and regenerative processes have been used to obtain the various measures of system effectiveness. PubDate: Mon, 01 Dec 2014 00:10:02 +000
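To see the flavour of such availability results, the special case of exponential failure and repair times reduces to a three-state birth-death chain whose steady-state availability has a closed form. The rates below are invented, and the exponential assumption is a simplification: the paper's semi-Markov and regenerative techniques handle general distributions.

```python
# Simplified Markov sketch with constant failure rate lam and repair rate mu
# (assumed values; the paper's semi-Markov analysis is more general).
lam, mu = 0.01, 0.1
rho = lam / mu
# States: 0 = both units good, 1 = one unit under repair, 2 = both failed.
# Balance equations give p1 = rho * p0 and p2 = rho**2 * p0.
p0 = 1.0 / (1.0 + rho + rho ** 2)
p1 = rho * p0
p2 = rho ** 2 * p0
availability = p0 + p1           # up whenever at least one unit is working
```

With a cold standby the second unit does not age while idle, which is what makes the simple chain above a reasonable first approximation before demand-dependent operation is layered on.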

Abstract: Reliability is an important phase in durable system design, specifically in the early phase of product development. In this paper, a new methodology is proposed for complex systems' design for reliability. The scarcity of specific test and field failure data is treated here as a challenge to implementing design for reliability of a new product. In the developed approach, modeling and simulation of the system are accomplished using the reliability block diagram (RBD) method. Generic data are corrected to account for the effects of design and environment on the application. The integrated methodology evaluates the reliability of the system and assesses the importance of each component. In addition, the availability of the system is evaluated using Monte Carlo simulation. Available design alternatives with different components are analyzed for reliability optimization. Evaluating the reliability of complex systems in competitive design efforts is one application of this method. Its advantage is that it is applicable in the early design phase, where only limited failure data are available. As a case study, horizontal drilling equipment is used to assess the proposed method. Benchmarking the results against a system with more available failure and maintenance data verifies the effectiveness and performance quality of the presented method. PubDate: Sun, 30 Nov 2014 09:19:57 +000
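The combination of an RBD structure function with Monte Carlo sampling can be sketched in a few lines. The block layout and component reliabilities below are hypothetical, not the drilling-equipment case study; the point is only how a structure function turns sampled component states into a system-level estimate.

```python
import random

def system_up(up):
    # Hypothetical RBD: block A in series with a parallel pair (B, C).
    return up["A"] and (up["B"] or up["C"])

def mc_reliability(rel, trials=100_000, seed=1):
    """Monte Carlo estimate of system reliability from an RBD structure
    function and per-component reliabilities."""
    rng = random.Random(seed)
    ok = 0
    for _ in range(trials):
        up = {c: rng.random() < r for c, r in rel.items()}
        ok += system_up(up)
    return ok / trials

rel = {"A": 0.95, "B": 0.90, "C": 0.90}
est = mc_reliability(rel)
# Analytic check for this layout: 0.95 * (1 - 0.10 * 0.10) = 0.9405
```

For small RBDs the analytic value is easy, but the same sampling loop scales to structures and repair models where closed forms are unavailable, which is the situation the paper targets.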

Abstract: We consider the estimation of the stress-strength reliability R based on lower record values when X and Y are independent but not identically distributed inverse Rayleigh random variables. The maximum likelihood, Bayes, and empirical Bayes estimators of R are obtained and their properties are studied. Confidence intervals, exact and approximate, as well as Bayesian credible sets for R are obtained. A real example is presented in order to illustrate the inferences discussed. A simulation study is conducted to investigate and compare the performance of the intervals presented in this paper and some bootstrap intervals. PubDate: Sun, 16 Nov 2014 09:49:57 +000
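Under the common scale parametrisation F(x) = exp(-λ/x²) for the inverse Rayleigh distribution (an assumption here; the paper may use another form), the stress-strength reliability R = P(Y < X) has the closed form λ_X / (λ_X + λ_Y), which a quick inverse-transform Monte Carlo check confirms:

```python
import math, random

def inv_rayleigh(lam, rng):
    # Inverse-transform sample from CDF F(x) = exp(-lam / x**2), x > 0.
    return math.sqrt(-lam / math.log(rng.random()))

l_x, l_y = 2.0, 1.0              # assumed scale parameters for X and Y
rng = random.Random(7)
n = 200_000
hits = sum(inv_rayleigh(l_y, rng) < inv_rayleigh(l_x, rng) for _ in range(n))
r_mc = hits / n
r_exact = l_x / (l_x + l_y)      # closed form under this parametrisation
```

The closed form follows from the substitution u = 1/x² in the integral of F_Y against the density of X, which reduces it to a single exponential integral.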

Abstract: Statistical process control (SPC) is one of the most important statistical tools for monitoring production processes. It can be effectively designed and implemented when the process or product specifications are observed consecutively under mass production conditions. Normally, short-cycle productions do not yield sufficient data to implement SPC. This research introduces how to design and implement a short-run control chart for batch production conditions. Monitoring critical specifications of parts supplied to the automotive industry is proposed as the application. The results revealed that normally distributed variables with unequal means and variances, which may fluctuate over time, can be standardized against the center line so that multiple products, each with multiple dimensions, can be monitored on a single chart. Out-of-control signals and nonrandom patterns can accordingly be recognized on the developed short-run control chart. PubDate: Thu, 25 Sep 2014 06:25:51 +000
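The standardization idea behind short-run charts can be sketched concisely: each observation is converted to a Z-score against its own product's target mean and standard deviation, so dissimilar products share one chart with fixed ±3 limits. The part names, targets, and observations below are hypothetical, and real short-run charts add further refinements this sketch omits.

```python
# Hypothetical per-product targets (mean, standard deviation); standardizing
# each observation lets products with unequal means and variances share one
# chart with fixed control limits at +/-3.
targets = {"part_A": (10.0, 0.2), "part_B": (25.0, 0.5)}

def z_score(part, x):
    mu, sigma = targets[part]
    return (x - mu) / sigma

observations = [("part_A", 10.1), ("part_B", 26.8), ("part_A", 9.9)]
z = [z_score(p, x) for p, x in observations]
signals = [abs(v) > 3 for v in z]   # True marks an out-of-control point
```

Because every point is dimensionless after standardization, a single run of mixed part numbers can still reveal nonrandom patterns against the common limits.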

Abstract: A new load-share reliability model of systems under changeable load is proposed in the paper. It is assumed that the load is a piecewise smooth function, which can be regarded as an extension of piecewise constant and continuous functions. The condition of residual lifetime conservation, which means continuity of the cumulative distribution function of time to failure, is adopted in the proposed model. A general algorithm for computing reliability measures is provided. Simple expressions for determining the survivor functions under the assumption of a Weibull probability distribution of time to failure are given. Various numerical examples illustrate the proposed model with different forms of the system load and different probability distributions of time to failure. PubDate: Thu, 11 Sep 2014 06:10:43 +000
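The residual-lifetime-conservation condition can be illustrated in the simplest setting the model generalizes: a single load step with Weibull lifetimes. When the scale changes from η1 to η2 at the switch time, continuing from the equivalent age t_eq = t_switch · η2/η1 keeps the CDF continuous. All numbers below are invented, and the piecewise-constant load is only the simplest member of the paper's piecewise smooth class.

```python
import math

def weibull_cdf(t, eta, beta):
    return 1.0 - math.exp(-((t / eta) ** beta))

beta = 2.0
eta1, eta2 = 100.0, 50.0   # assumed Weibull scales before/after a load step
t_switch = 30.0            # time at which the load increases
# Equivalent age under the new load, chosen so that F stays continuous:
t_eq = t_switch * eta2 / eta1

def cdf_under_load_step(t):
    if t <= t_switch:
        return weibull_cdf(t, eta1, beta)
    return weibull_cdf(t - t_switch + t_eq, eta2, beta)

# Continuity at the switch point (residual lifetime conservation):
gap = abs(weibull_cdf(t_switch, eta1, beta) - weibull_cdf(t_eq, eta2, beta))
```

The identity (t/η1)^β = ((t·η2/η1)/η2)^β holds for any shape β, which is why the equivalent-age construction works for the whole Weibull family.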

Abstract: A quality investigation of a coordinate measuring machine (CMM) for dimension and form metrology was designed and performed at the NIS. The experimental investigation of CMM performance was developed using a reference Flick standard. The measurement errors of the corresponding geometric evaluation algorithms (LSQ, ME, MC, and MI) and probe scanning speeds (1, 2, 3, 4, and 5 mm/s) were obtained through repeated arrangement, comparison, and judgment. The experimental results show that the roundness error deviation can be evaluated effectively and exactly for CMM performance using the Flick standard. Some influencing quantities for diameter and roundness form errors may dominate the results for all fitting algorithms under certain circumstances. It is shown that a 2 mm/s probe speed gives a smaller roundness error than 1, 3, 4, and 5 mm/s, within 0.2-0.3 μm, which indicates that measurement at 2 mm/s best satisfies the required level of accuracy under the given conditions. Using the Flick standard as a quality evaluation tool revealed a high-precision improvement in diameter and roundness form indication, meaning that the transfer stability of CMM quality can be significantly improved. Moreover, some error formulae for the data sets have been postulated to correlate the diameter and roundness measurements within the application range. The uncertainty resulting from the CMM and environmental temperature has been evaluated and confirms the degree of confidence in the proposed performance investigation. PubDate: Thu, 17 Jul 2014 10:01:54 +000
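Of the fitting algorithms named above, the least-squares (LSQ) reference circle is the most common; one standard realization is the Kasa algebraic fit, sketched below on invented probe points. This is an illustrative LSQ variant, not necessarily the exact algorithm used at the NIS.

```python
import math

def solve3(A, b):
    # Gaussian elimination with partial pivoting for a 3x3 linear system.
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for i in range(3):
        p = max(range(i, 3), key=lambda r: abs(M[r][i]))
        M[i], M[p] = M[p], M[i]
        for r in range(i + 1, 3):
            f = M[r][i] / M[i][i]
            for c in range(i, 4):
                M[r][c] -= f * M[i][c]
    x = [0.0, 0.0, 0.0]
    for i in (2, 1, 0):
        x[i] = (M[i][3] - sum(M[i][c] * x[c] for c in range(i + 1, 3))) / M[i][i]
    return x

def fit_circle(pts):
    """Kasa least-squares fit: x^2 + y^2 = a*x + b*y + c, giving centre
    (a/2, b/2) and radius sqrt(c + (a/2)^2 + (b/2)^2)."""
    A = [[0.0] * 3 for _ in range(3)]
    rhs = [0.0, 0.0, 0.0]
    for x, y in pts:
        row, z = (x, y, 1.0), x * x + y * y
        for i in range(3):
            for j in range(3):
                A[i][j] += row[i] * row[j]
            rhs[i] += row[i] * z
    a, b, c = solve3(A, rhs)
    cx, cy = a / 2.0, b / 2.0
    return cx, cy, math.sqrt(c + cx * cx + cy * cy)

# Four hypothetical probe points on a circle of centre (1, 2), radius 3:
cx, cy, r = fit_circle([(4.0, 2.0), (1.0, 5.0), (-2.0, 2.0), (1.0, -1.0)])
```

The roundness error is then the spread of the probe points' radial distances about the fitted reference circle, which is why the choice of fitting algorithm can dominate the reported value.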

Abstract: This paper presents the optimum design of a time-censored constant-stress partially accelerated life test sampling plan (PALTSP) in which each item runs either at use or at accelerated conditions and product life follows the Burr type XII distribution. The optimal plan consists in finding the sample proportions allocated to use and accelerated conditions by minimizing the asymptotic variance of the test statistic for deciding on acceptance or rejection of the lot, such that the producer's and consumer's interests are safeguarded. The method developed has been illustrated using an example. A sensitivity analysis has also been carried out. PubDate: Wed, 18 Jun 2014 07:07:54 +000

Abstract: This paper aims to enlarge the family of one-class classification-based control charts, referred to as OC-charts, and extend their applications. We propose a new OC-chart using the k-means data description (KMDD) algorithm, referred to as the KM-chart. The proposed KM-chart gives the minimum closed spherical boundary around the in-control process data. It measures the distance between the center of the KMDD-based sphere and each new incoming sample to be monitored. Any sample at a distance greater than the radius of the KMDD-based sphere is considered an out-of-control sample. Phase I and Phase II analysis of the KM-chart was evaluated through a real industrial application. In a comparative study based on the average run length (ARL) criterion, the KM-chart was compared with the kernel-distance-based control chart, referred to as the K-chart, and the k-nearest-neighbor data description-based control chart, referred to as the KNN-chart. Results revealed that, in terms of ARL, the KM-chart performed better than the KNN-chart in detecting small shifts in the mean vector. Furthermore, the paper provides the MATLAB code for the KM-chart, developed by the authors. PubDate: Mon, 09 Jun 2014 10:14:53 +000
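The monitoring logic can be sketched in its simplest form: with a single cluster, k-means reduces to the centroid of the Phase I data, the sphere radius becomes the control limit, and Phase II samples are flagged by their distance to the centre. The data below are invented, and setting the radius to the maximum Phase I distance is one simple choice; the paper's KMDD construction may differ.

```python
import math

# Phase I: hypothetical in-control samples (two characteristics each).
in_control = [(0.0, 0.1), (0.2, -0.1), (-0.1, 0.0), (-0.1, 0.2)]

def centroid(data):
    n = len(data)
    return tuple(sum(p[i] for p in data) / n for i in range(len(data[0])))

def dist(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

# With a single cluster, k-means converges to the plain centroid in one step.
center = centroid(in_control)
radius = max(dist(p, center) for p in in_control)   # sphere = control limit

def km_signal(sample):
    # Phase II: flag any sample falling outside the KMDD sphere.
    return dist(sample, center) > radius
```

Unlike a classical Shewhart chart, no distributional assumption enters: the boundary is determined entirely by the geometry of the in-control data.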

Abstract: This paper reports the development of an approach that integrates appropriate modeling techniques for estimating the effect of project quality management (PQM) on construction performance. The modeling approach features a causal structure that depicts the interaction among the PQM factors affecting quality performance in a given construction operation. In addition, it makes use of fuzzy sets and fuzzy logic to incorporate the subjectivity and uncertainty implicit in the performance assessment of these PQM factors into discrete-event simulation models. The outcome is a simulation approach that allows experimenting with different performance levels of the PQM practices implemented in a construction project and obtaining the corresponding productivity estimates for the construction operations. These estimates are intended to facilitate decision making regarding the improvement of a PQM system implemented in a construction project. A case study demonstrates the usefulness of the proposed simulation approach for evaluating diverse performance improvement alternatives for a PQM system. PubDate: Wed, 07 May 2014 11:27:53 +000

Abstract: The renewal and renewal-intensity functions with minimal repair are explored for the Normal, Gamma, Uniform, and Weibull underlying lifetime distributions. The Normal, Gamma, and Uniform renewal and renewal-intensity functions are derived by the convolution method. Unlike these three failure distributions, the Weibull, except at a shape parameter of 1, does not have a closed-form expression for the n-fold convolution. Since the Weibull is the most important failure distribution in reliability analyses, its approximate renewal and renewal-intensity functions were obtained by a time-discretizing method using the Mean Value Theorem for Integrals. A Matlab program outputs all reliability and renewal measures. PubDate: Wed, 19 Mar 2014 16:47:20 +000
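The time-discretizing idea can be sketched by solving the fundamental renewal equation M(t) = F(t) + ∫₀ᵗ M(t−x) dF(x) on a uniform grid. The simple left-endpoint scheme below is an illustration of the general approach, not the authors' specific Mean-Value-Theorem construction, and it uses the shape-1 (exponential) case, where M(t) = t/η exactly, as a sanity check.

```python
import math

def weibull_cdf(t, eta, beta):
    return 1.0 - math.exp(-((t / eta) ** beta)) if t > 0 else 0.0

def renewal_function(t, eta, beta, steps=1000):
    """Approximate M(t) by time-discretizing the renewal equation
    M(t) = F(t) + integral_0^t M(t - x) dF(x) on a uniform grid."""
    h = t / steps
    F = [weibull_cdf(i * h, eta, beta) for i in range(steps + 1)]
    M = [0.0] * (steps + 1)
    for i in range(1, steps + 1):
        M[i] = F[i] + sum((F[j] - F[j - 1]) * M[i - j]
                          for j in range(1, i + 1))
    return M[steps]

# Sanity check: shape 1 is the exponential case, where M(t) = t / eta.
m_exp = renewal_function(1.0, 1.0, 1.0)
```

Each grid point reuses all earlier values of M, so the scheme is O(steps²) but needs no closed-form convolution, which is precisely what the Weibull lacks away from shape 1.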

Abstract: Based on the existing literature, this paper proposes two assumptions and designs a system of ecological supply chain performance evaluation indicators. Because these indicators are interdependent, this paper selects the analytic network process and builds an ANP network model to evaluate the degree of ecological supply chain management practice among Chinese manufacturing enterprises. The evaluation results show that the level of ecological supply chain management does indeed differ across enterprises: the better the degree of ecological supply chain management practice, the more quickly the ecological supply chain management performance level increases. PubDate: Thu, 06 Mar 2014 08:32:28 +000

Abstract: We consider optimal replacement policies with periodic imperfect maintenance actions and minimal repairs. The multistate system is minimally repaired at failure, and imperfect maintenance actions are regularly carried out for preventive maintenance. The discrete modified Weibull distribution is introduced, and some cost functions applied to this distribution are defined in order to be minimized. Moreover, we assume that the costs of preventive maintenance depend on the degree of repair via a Kijima type 2 model. For illustrative purposes, the obtained results are applied to sets of simulated data. PubDate: Thu, 27 Feb 2014 12:15:48 +000
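The Kijima type 2 mechanism mentioned above models each repair as rejuvenating the unit to a fraction q of its accumulated age: the virtual age evolves as V_n = q(V_{n−1} + X_n). A minimal simulation sketch with continuous Weibull lifetimes is given below; the parameter values are invented, and the paper itself works with a discrete modified Weibull distribution rather than this continuous stand-in.

```python
import math, random

def sim_kijima2(eta, beta, q, n_failures, seed=3):
    """Simulate successive failure times of a Weibull unit under Kijima
    type 2 imperfect repair: virtual age V_n = q * (V_{n-1} + X_n)."""
    rng = random.Random(seed)
    v, t, times = 0.0, 0.0, []
    for _ in range(n_failures):
        u = rng.random()
        # Weibull lifetime conditional on having survived virtual age v,
        # by inverting S(v + x) / S(v) = u:
        x = eta * ((v / eta) ** beta - math.log(u)) ** (1.0 / beta) - v
        t += x
        times.append(t)
        v = q * (v + x)   # q = 1: minimal repair; q = 0: good-as-new repair
    return times

times = sim_kijima2(eta=100.0, beta=2.0, q=0.5, n_failures=5)
```

The repair-degree parameter q is exactly the quantity the paper's preventive maintenance costs depend on, interpolating between minimal repair and perfect renewal.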

Abstract: The recent proliferation of Markov chain Monte Carlo (MCMC) approaches has led to the use of the Bayesian inference in a wide variety of fields. To facilitate MCMC applications, this paper proposes an integrated procedure for Bayesian inference using MCMC methods, from a reliability perspective. The goal is to build a framework for related academic research and engineering applications to implement modern computational-based Bayesian approaches, especially for reliability inferences. The procedure developed here is a continuous improvement process with four stages (Plan, Do, Study, and Action) and 11 steps, including: (1) data preparation; (2) prior inspection and integration; (3) prior selection; (4) model selection; (5) posterior sampling; (6) MCMC convergence diagnostic; (7) Monte Carlo error diagnostic; (8) model improvement; (9) model comparison; (10) inference making; (11) data updating and inference improvement. The paper illustrates the proposed procedure using a case study. PubDate: Tue, 14 Jan 2014 11:55:25 +000
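Step (5), posterior sampling, is the computational heart of such a procedure; a minimal random-walk Metropolis sketch for an exponential failure rate with a Gamma prior is shown below. The data, prior hyperparameters, and tuning values are invented; the conjugate Gamma posterior is used only as a built-in correctness check of the kind steps (6)-(7) call for.

```python
import math, random

def log_post(lam, data, a=2.0, b=1.0):
    # Gamma(a, b) prior (rate parametrisation) times exponential likelihood.
    if lam <= 0.0:
        return -math.inf
    return (a - 1 + len(data)) * math.log(lam) - (b + sum(data)) * lam

def metropolis(data, iters=20_000, step=0.3, seed=11):
    """Random-walk Metropolis sampler for the exponential failure rate."""
    rng = random.Random(seed)
    lam, lp, draws = 1.0, log_post(1.0, data), []
    for i in range(iters):
        cand = lam + rng.gauss(0.0, step)
        lpc = log_post(cand, data)
        if rng.random() < math.exp(min(0.0, lpc - lp)):
            lam, lp = cand, lpc
        if i >= iters // 4:               # discard burn-in
            draws.append(lam)
    return draws

data = [0.5, 1.2, 0.8, 2.0, 0.6]          # hypothetical failure times
draws = metropolis(data)
post_mean = sum(draws) / len(draws)
# Conjugacy check: the posterior is Gamma(a + n, b + sum(data)),
# so the exact posterior mean is (2 + 5) / (1 + 5.1).
```

In a real reliability study the known conjugate answer is unavailable, which is why the procedure's convergence and Monte Carlo error diagnostics take its place.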

Abstract: The microelectromechanical system (MEMS) is one of the most diversified fields of microelectronics and is rated to be the most promising technology of modern engineering. MEMS can sense, actuate, and integrate mechanical and electromechanical components of micro- and nanosizes on a single silicon substrate using microfabrication techniques. The MEMS industry is on the verge of transforming the semiconductor world into a MEMS universe; apart from other hindrances, the reliability of these devices is the focal point of recent research, since commercialization is highly dependent on it. MEMS require a high level of reliability, so the technological factors, operating conditions, and environmental effects influencing the performance of MEMS devices must be completely understood. This study reviews some of the major reliability issues and failure mechanisms. Specifically, fatigue in MEMS is a major material reliability issue resulting in structural damage and crack growth; lifetime measurements of MEMS devices are considered in the light of statistical distributions and the implementation of Paris' law for fatigue crack accumulation under undesirable operating and environmental conditions. PubDate: Sun, 12 Jan 2014 00:00:00 +000
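Paris' law relates the crack growth rate to the stress intensity range, da/dN = C(ΔK)^m with ΔK = YΔσ√(πa), and lifetime estimates follow by integrating cycles from the initial to the critical crack size. The sketch below numerically integrates this relation; the material constants C and m, the stress range, and the crack sizes are hypothetical illustrations, not MEMS measurement data.

```python
import math

def cycles_to_failure(a0, ac, C, m, dsigma, Y=1.0, steps=10_000):
    """Integrate Paris' law da/dN = C * (dK)^m, dK = Y * dsigma * sqrt(pi*a),
    from initial crack size a0 to critical size ac (midpoint rule)."""
    h = (ac - a0) / steps
    N = 0.0
    for i in range(steps):
        a = a0 + (i + 0.5) * h
        dK = Y * dsigma * math.sqrt(math.pi * a)
        N += h / (C * dK ** m)       # dN = da / (C * dK^m)
    return N

# Hypothetical constants chosen purely for illustration:
N_est = cycles_to_failure(a0=1e-3, ac=1e-2, C=1e-12, m=4.0, dsigma=100.0)
```

Because ΔK grows with crack size, most of the lifetime is spent while the crack is small, which is why scatter in the initial flaw size dominates the statistical distribution of MEMS lifetimes.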