Abstract: Orthogonal Frequency Division Multiplexing (OFDM) is the most popular multicarrier communication technique, but its disadvantage is a large Peak-to-Average Power Ratio (PAPR). In recent years, researchers have presented several techniques, such as companding, to mitigate this problem. Moreover, the Single-Carrier Frequency Division Multiple Access (SC-FDMA) system is popular in mobile communications because of its low PAPR, but reducing its PAPR further remains an open research issue. Hence, this paper extends the companding work applied to OFDM to the SC-FDMA case. New companding schemes, aided by the Discrete Wavelet Transform (DWT), are adopted in the presence of channel degradations to lower the PAPR of the SC-FDMA system while achieving a low Bit Error Rate (BER). PubDate: 2022-12-01
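
As a concrete illustration of the companding idea this abstract alludes to (a classic mu-law compander on a generic OFDM-style signal, not the paper's specific DWT-based scheme; all parameter values are illustrative assumptions), the following sketch measures the PAPR before and after companding:

```python
import numpy as np

rng = np.random.default_rng(0)

def papr_db(x):
    """Peak-to-Average Power Ratio of a complex baseband signal, in dB."""
    p = np.abs(x) ** 2
    return 10 * np.log10(p.max() / p.mean())

# OFDM-like multicarrier signal: random QPSK symbols passed through an IFFT.
N = 256
symbols = rng.choice(np.array([1 + 1j, 1 - 1j, -1 + 1j, -1 - 1j]), size=N)
x = np.fft.ifft(symbols)

# Mu-law companding of the signal envelope (phase preserved, peak preserved).
mu = 8.0
v = np.abs(x).max()
compressed = (v * np.log1p(mu * np.abs(x) / v) / np.log1p(mu)
              * np.exp(1j * np.angle(x)))

papr_before = papr_db(x)
papr_after = papr_db(compressed)
```

Because the compander boosts small amplitudes toward the (unchanged) peak, the average power rises and the PAPR drops.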

Abstract: Signatures are a crucial behavioral trait widely used to authenticate a person's identity. Financial and legal institutions, including commercial banks, consider them a legitimate method of document authentication. Despite the emergence of various biometric authentication techniques such as fingerprints, retinal scans, and facial recognition, signature verification is still a prevalent authentication method among Indian industries, especially in the banking sector. Signature verification is used while processing cheques and other documents, even when only digital copies of such documents are available. An example of signature verification on digital documents is the Cheque Truncation System of India, adopted by all scheduled commercial banks in India. However, manual signature verification is tedious and vulnerable to human error. This paper compares the efficacy of Convolutional Neural Network and Support Vector Machine algorithms in automating the process of signature verification. These algorithms incorporate various image features to verify whether a signature is genuine or fraudulent without human intervention. The Support Vector Machine algorithm performs better, considering the computational limitations of production systems. PubDate: 2022-12-01
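
A rough sketch of the SVM side of such a comparison, using synthetic stand-in feature vectors rather than the paper's signature dataset or descriptors:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Stand-in data: each row imitates a vector of image descriptors extracted
# from a signature; label 1 = genuine, 0 = forged (synthetic, illustrative).
X, y = make_classification(n_samples=400, n_features=64, n_informative=12,
                           random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

# Standardize features, then fit an RBF-kernel SVM.
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
clf.fit(X_tr, y_tr)
accuracy = clf.score(X_te, y_te)
```

The pipeline keeps the scaler's statistics fitted on the training split only, which is the usual guard against leakage in such comparisons.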

Abstract: The proposed alternative method combines management accounting, specifically the activity-based system, with statistical tools to develop a method for assessing and predicting human capital within a research laboratory. The statistical tools are the Standardized Mean Difference, Hierarchical Cluster Analysis, and Discriminant Analysis. The first normalizes the activities of the laboratory; the second classifies the results obtained, while the third standardizes these results by expressing them in terms of probability. The standardized scores are used for the computation and prediction of human capital in research laboratories via activity regrouping centers. The originality of this work is to fill a research gap in the hybridization of human capital calculation and prediction by integrating the two disciplines mentioned above. Likewise, it lies in the use of an activity-based accounting architecture to process outputs (rather than costs) related to intangible aspects. The proposed method has research and social implications, since it supports appropriate research policy and adequate management control and improves organizational relations within the laboratory concerned. The findings show, through an illustration, the applicability of the proposed method and the usefulness of the tools used. PubDate: 2022-12-01

Abstract: Feature selection is an important preprocessing step in analyzing large-scale data. In this paper, we prove the monotonicity property of the \(\chi^2\)-statistic and use it to construct a more robust feature selection method. In particular, we show that \(\chi^2_{Y, X_1} \le \chi^2_{Y, (X_1, X_2)}\). This result indicates that a new feature should be added to an existing feature set only if it increases the \(\chi^2\)-statistic beyond a certain threshold. Our stepwise feature selection algorithm significantly reduces the number of features considered at each stage, making it more efficient than other similar methods. In addition, the selection process has a natural stopping point, thus eliminating the need for user input. Numerical experiments confirm that the proposed algorithm can significantly reduce the number of features required for classification and improve classifier accuracy. PubDate: 2022-12-01
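
The monotonicity claim can be checked empirically: refining a feature X1 into the pair (X1, X2) never decreases the Pearson chi-square statistic of its contingency table with Y. A minimal sketch with synthetic categorical data (this illustrates only the inequality, not the paper's full stepwise algorithm):

```python
import numpy as np
from scipy.stats import chi2_contingency

rng = np.random.default_rng(1)

def chi2_stat(y, x):
    """Pearson chi-square statistic of the contingency table of y vs x."""
    cats_y = np.unique(y)
    cats_x = np.unique(x)
    table = np.array([[np.sum((y == a) & (x == b)) for b in cats_x]
                      for a in cats_y])
    return chi2_contingency(table, correction=False)[0]

n = 500
x1 = rng.integers(0, 3, n)
x2 = rng.integers(0, 2, n)
y = (x1 + x2 + rng.integers(0, 2, n)) % 3   # y depends on both features

# The joint feature (x1, x2) encoded as one categorical variable.
joint = x1 * 2 + x2

single = chi2_stat(y, x1)        # chi2 of Y vs X1
pair = chi2_stat(y, joint)       # chi2 of Y vs (X1, X2)
```

Here `pair >= single` holds because partitioning each X1 cell by X2 can only refine the table, which is exactly the inequality the abstract states.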

Abstract: Dara and Ahmad (Recent advances in moment distribution and their hazard rates, Academic Publishing GmbH KG, Lap Lambert, 2012) proposed the length-biased exponential (LBE) distribution and proved that the LBE distribution is more flexible than the exponential distribution. In this paper, we obtain new explicit algebraic expressions and some recurrence relations for both single and product moments of order statistics from the LBE distribution. Further, these expressions are used to compute the means, variances, and covariances of order statistics for different sample sizes and for arbitrarily chosen parameter values. Next, we use these moments to obtain the best linear unbiased estimates of the location and scale parameters based on complete as well as Type-II right censored samples. Finally, we carry out a simulation study to show the application of our results. PubDate: 2022-12-01

Abstract: We introduce and study a new three-parameter lifetime distribution, the Marshall–Olkin inverted Nadarajah–Haghighi (MOINH) distribution. In comparison with some existing lifetime models, the new distribution has very flexible shapes for the hazard rate as well as for the probability density function. The related mathematical functions of the new distribution are presented. The maximum likelihood method is used to obtain the parameter estimates of the MOINH distribution. A simulation study is presented to assess the behavior of the proposed estimators. The potential of the distribution is illustrated by fitting two real data sets. PubDate: 2022-12-01

Abstract: The T-NH{Y} family is developed and studied in this paper. Various statistical properties such as the mode, quantiles, moments, and Shannon entropy are derived. Two special distributions, namely the exponential-NH{log-logistic} and Gumbel-NH{logistic}, are developed. Plots of the failure rate functions of these distributions for some given parameter values indicate that the hazard rate functions can exhibit different types of non-monotonic failure rates. Two applications using real datasets on failure times reveal that the exponential-NH{log-logistic} distribution provides better fits to the datasets than the other fitted models. PubDate: 2022-12-01

Abstract: In this paper, we establish explicit expressions and some recurrence relations for single and product moments of order statistics from the exponentiated Burr XII distribution. Using these results, we can calculate the means and variances based on order statistics for the given distribution. PubDate: 2022-12-01

Abstract: In this paper, we investigate the estimation problems of unknown parameters of the Kumaraswamy distribution under type I progressive hybrid censoring. This censoring scheme is a combination of progressive type I and hybrid censoring schemes. We derive the maximum likelihood estimates of parameters using an expectation-maximization algorithm. Bayes estimates are obtained under different loss functions using the Lindley method and importance sampling procedure. The highest posterior density intervals of unknown parameters are constructed as well. We also obtain prediction estimates and prediction intervals for censored observations. A Monte Carlo simulation study is performed to compare proposed methods and one real data set is analyzed for illustrative purposes. PubDate: 2022-12-01
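
As a simplified, uncensored stand-in for the estimation problem described (the paper itself handles type I progressive hybrid censoring via an EM algorithm), the Kumaraswamy likelihood can be maximized numerically; the sample size, true parameters, and starting point below are illustrative:

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(2)

def kumaraswamy_sample(a, b, size, rng):
    """Inverse-CDF sampling from F(x) = 1 - (1 - x**a)**b on (0, 1)."""
    u = rng.uniform(size=size)
    return (1.0 - (1.0 - u) ** (1.0 / b)) ** (1.0 / a)

def neg_log_lik(params, x):
    """Negative log-likelihood of f(x) = a*b*x**(a-1)*(1 - x**a)**(b-1)."""
    a, b = params
    if a <= 0 or b <= 0:
        return np.inf
    return -len(x) * (np.log(a) + np.log(b)
                      + (a - 1) * np.mean(np.log(x))
                      + (b - 1) * np.mean(np.log1p(-x ** a)))

x = kumaraswamy_sample(2.0, 3.0, 2000, rng)
res = minimize(neg_log_lik, x0=[1.0, 1.0], args=(x,), method="Nelder-Mead")
a_hat, b_hat = res.x
```

With a complete sample of this size the numerical MLE lands close to the generating values (a = 2, b = 3); censoring is what makes the paper's EM treatment necessary.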

Abstract: In this paper, we obtain exact explicit expressions as well as several recurrence relations satisfied by single and product moments of dual (lower) generalized order statistics from the Topp-Leone weighted Weibull distribution. These relations are deduced for moments of order statistics and lower record values. Further, these recurrence relations and other properties of dual generalized order statistics are used to characterize this distribution. PubDate: 2022-12-01

Abstract: In this paper, a single acceptance sampling inspection plan for the generalized half-normal distribution, when the lifetime experiment is truncated at a pre-fixed time, is introduced. For the proposed plan, ratios of the true mean lifetime to the specified mean lifetime for different choices of confidence level are provided. The minimum sample sizes necessary to ensure a certain specified mean lifetime are obtained. Operating characteristic values of the proposed plan are also presented. A real-life example is considered to show the applicability of the proposed sampling plan. PubDate: 2022-12-01
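
The "minimum sample size" step in truncated-life-test plans of this kind reduces to a binomial search: find the smallest n whose acceptance probability (at most c failures by the truncation time) does not exceed 1 - P* when the per-item failure probability is p. A generic sketch; the failure probability, acceptance number, and confidence below are hypothetical values, not ones derived from the generalized half-normal distribution:

```python
from math import comb

def min_sample_size(p, c, confidence):
    """Smallest n such that accepting the lot (<= c failures among n items
    tested to the truncation time) has probability at most 1 - confidence
    when the per-item failure probability is p."""
    n = c + 1
    while True:
        accept_prob = sum(comb(n, i) * p**i * (1 - p)**(n - i)
                          for i in range(c + 1))
        if accept_prob <= 1 - confidence:
            return n
        n += 1

# Hypothetical numbers: failure probability 0.25 over the truncated test,
# acceptance number c = 2, consumer confidence P* = 0.95.
n_min = min_sample_size(0.25, 2, 0.95)
```

For these inputs the search returns n = 23; tightening c to 0 drops the requirement to n = 11, which is the usual trade-off tabulated in such plans.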

Abstract: The purpose of this paper is to introduce an adaptive progressive hybrid censoring scheme for the bivariate model, which expands the limited applicability of failure-censored schemes for bivariate models in several product fields. The paper also discusses a new bivariate model, based on adaptive progressive hybrid censoring, that is more efficient than the traditional models. Based on the FGM copula function and the Odd-Weibull family, we introduce the bivariate FGM Weibull-Weibull distribution. To estimate the model parameters, maximum likelihood and Bayesian estimation are used. In addition, asymptotic confidence intervals and highest posterior density credible intervals for the model parameters are calculated. A Monte Carlo simulation analysis of the maximum likelihood and Bayesian estimators is carried out. Finally, we demonstrate the utility of the suggested bivariate distribution using real medical data on diabetic nephropathy. PubDate: 2022-10-23
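
The FGM copula coupling mentioned here has a simple conditional-inverse sampler, which the sketch below combines with Weibull quantile transforms to simulate a bivariate Weibull-Weibull pair (parameter values are illustrative; this is a sampling sketch, not the paper's censoring or estimation code):

```python
import numpy as np

rng = np.random.default_rng(3)

def fgm_weibull_sample(theta, k1, lam1, k2, lam2, size, rng):
    """Draw (X, Y) with Weibull margins coupled by the FGM copula
    C(u, v) = u*v*[1 + theta*(1 - u)*(1 - v)], with |theta| <= 1."""
    u = rng.uniform(size=size)
    w = rng.uniform(size=size)
    a = theta * (1.0 - 2.0 * u)
    # Invert the conditional copula C(v | u) = v*[1 + a*(1 - v)] for v;
    # the a ~ 0 entries fall back to the independent case v = w.
    mask = np.abs(a) < 1e-12
    a_s = np.where(mask, 1.0, a)
    b = 1.0 + a_s
    v = np.where(mask, w, (b - np.sqrt(b * b - 4.0 * a_s * w)) / (2.0 * a_s))
    v = np.clip(v, 0.0, 1.0 - 1e-12)
    # Weibull quantile transforms of the two margins.
    x = lam1 * (-np.log(1.0 - u)) ** (1.0 / k1)
    y = lam2 * (-np.log(1.0 - v)) ** (1.0 / k2)
    return x, y

x, y = fgm_weibull_sample(0.8, 1.5, 1.0, 2.0, 2.0, 20000, rng)
corr = np.corrcoef(x, y)[0, 1]
```

With theta = 0.8 the simulated pair shows the mild positive dependence the FGM family is limited to (its Spearman correlation cannot exceed 1/3).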

Abstract: To address the need for secure digital image transmission, an algorithm that fulfils all prominent prerequisites of a steganography technique is developed. By incorporating the salient features of fractal cover images, dual-layer encryption using a standard chaotic map and DNA-hyperchaotic cryptography, and DWT-SVD embedding, key aspects such as robustness, better perceptual quality, and high payload capacity are targeted to build a blind colour image steganography algorithm in this work. A fractal cover image is used to hide a DNA-chaotic encrypted colour image via the DWT-SVD embedding method. A two-dimensional standard chaotic map, which exhibits robust chaos over a very large parameter range, is used to generate pseudo-random number sequences of cryptographic quality. One of the core novelties of the proposed method is the two-layer chaotic encryption used to generate the DNA-encrypted secret image, which is finally embedded in a fractal cover image using the DWT-SVD transform-domain technique, capable of withstanding the false-positive attack. Comprehensive statistical security tests and standard evaluation benchmarks show that this efficient yet simple hybrid steganography algorithm is highly robust and sustainable against removal, geometrical, image-enhancement, and histogram attacks, offers better perceptual image quality, and also yields high perceptual quality of the extracted image. PubDate: 2022-10-22

Abstract: In this article, step-stress partially accelerated life testing (SSPALT) with competing risks is studied when the lifetime of test units follows the Nadarajah–Haghighi (NH) distribution. The maximum likelihood estimates (MLEs) and Bayes estimates (BEs) of the model parameters are derived under progressive Type-II censoring. Furthermore, the approximate confidence intervals and credible intervals of the parameters are computed. A numerical example is constructed to illustrate the methods used in the study. Finally, simulation studies are performed to demonstrate the accuracy of the MLEs and BEs for the parameters of the Nadarajah–Haghighi distribution, and the BEs show better results than the MLEs. PubDate: 2022-10-22

Abstract: The key assumption in accelerated life testing is that the mathematical model relating the lifetime of the item to the stress is known or can be assumed. In several situations, such life-stress relationships are not known and cannot be assumed, i.e., accelerated life testing information cannot be extrapolated to the use condition. In such cases, a partially accelerated life test, in which tested objects are subjected to both normal and accelerated conditions, is a more appropriate testing method. Due to continual improvement in manufacturing design, it is increasingly difficult to obtain information about the lifetime of highly reliable products or materials when testing under normal conditions. One approach to accelerate failures is the step-stress partially accelerated life test, which increases the load applied to the items in a particular discrete sequence. In this study, the maximum likelihood estimators of the generalized inverse Lindley distribution parameters and the acceleration factor are investigated in a step-stress partially accelerated life test model under two types of progressive hybrid censoring schemes. Furthermore, the performance of the model parameter estimators under the two progressive hybrid censoring schemes is analyzed and compared in terms of bias and mean squared error using a Monte Carlo simulation approach. PubDate: 2022-10-20
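
The bias/MSE comparison described is a standard Monte Carlo exercise. The sketch below runs it for a much simpler stand-in (exponential lifetimes with Type-I censoring rather than the generalized inverse Lindley model under progressive hybrid censoring); the rate, censoring time, and replication counts are illustrative:

```python
import numpy as np

rng = np.random.default_rng(4)

def exp_mle_censored(t, tau):
    """MLE of the exponential rate from a Type-I censored sample:
    lifetimes above tau are censored at tau; rate_hat = d / total time."""
    observed = np.minimum(t, tau)
    d = np.sum(t <= tau)              # number of observed failures
    return d / np.sum(observed)

true_rate, tau, n, reps = 1.5, 1.0, 200, 2000
estimates = np.array([exp_mle_censored(rng.exponential(1 / true_rate, n), tau)
                      for _ in range(reps)])

bias = estimates.mean() - true_rate
mse = np.mean((estimates - true_rate) ** 2)
```

Repeating such a loop for each censoring scheme and tabulating bias and MSE side by side is exactly the comparison structure the abstract describes.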

Abstract: In this paper, we present a new G family of probability distributions. Some of its mathematical properties are derived. Based on a special member of the new family, a single acceptance sampling plan is considered. The design of a single sampling plan when the lifetime test is truncated at a pre-determined period is discussed. For different acceptance levels and confidence limits, the ratio of test time and the sample size required to assure the specified mean life are obtained. The lowest ratios of actual mean life to specified mean life that ensure acceptance with a given probability are presented. A case study is presented for this purpose. PubDate: 2022-10-20

Abstract: In this article, we propose a new extension of the Topp–Leone family of distributions. Some important properties of the model are developed, such as the quantile function, stochastic ordering, series representation, moments, stress–strength reliability parameter, Renyi entropy, order statistics, and moment of residual life. A particular member, the new extended Topp–Leone exponential (NETLE) distribution, is discussed. Maximum likelihood estimation (MLE), least-squares estimation (LSE), and percentile estimation (PE) are used for model parameter estimation. Simulation studies were conducted using the NETLE to assess MLE, LSE, and PE performance by examining their bias and mean square error (MSE), and the results were satisfactory. Finally, applications of the NETLE to two real data sets, daily new deaths due to COVID-19 in California and New Jersey, USA, are provided to illustrate the importance of the new family in practice. The new model outperformed many other existing Topp–Leone- and exponential-related distributions in the real data illustrations. PubDate: 2022-10-18

Abstract: In recent times, various machine learning approaches have been widely employed for effective diagnosis and prediction of diseases like cancer, thyroid disorders, Covid-19, etc. Likewise, Alzheimer's disease (AD) is a progressive malady that destroys memory and cognitive function over time. Unfortunately, there are no dedicated AI-based solutions for the diagnosis of AD to go hand in hand with medical diagnosis, even though multiple factors contribute to the diagnosis, making AI a very viable supplementary diagnostic solution. This paper reports an endeavor to apply various machine learning algorithms, namely SGD, k-Nearest Neighbors, Logistic Regression, Decision Tree, Random Forest, AdaBoost, Neural Network, SVM, and Naïve Bayes, to a dataset of affected victims to diagnose Alzheimer's disease. Longitudinal collections of subjects from the OASIS dataset have been used for prediction. Moreover, feature selection and dimension reduction methods such as Information Gain, Information Gain Ratio, Gini Index, Chi-Squared, and PCA are applied to rank different factors and identify the optimum number of factors from the dataset for disease diagnosis. Furthermore, the performance of each classifier is evaluated in terms of ROC-AUC, accuracy, F1 score, recall, and precision, along with a comparative analysis between the algorithms. Our study suggests that approximately 90% classification accuracy is achieved using the four top-ranked features: CDR, SES, nWBV, and EDUC. PubDate: 2022-10-17
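
The factor-ranking step can be sketched with scikit-learn on synthetic stand-in data (the real study uses the OASIS dataset and features such as CDR, SES, nWBV, and EDUC, none of which are reproduced here; mutual information stands in for the information-gain criterion):

```python
from sklearn.datasets import make_classification
from sklearn.feature_selection import SelectKBest, mutual_info_classif
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

# Stand-in tabular data imitating clinical factors (synthetic, illustrative).
X, y = make_classification(n_samples=300, n_features=10, n_informative=4,
                           random_state=0)

# Rank factors by mutual information with the label and keep the top four.
selector = SelectKBest(mutual_info_classif, k=4).fit(X, y)
X_top = selector.transform(X)

# Compare cross-validated accuracy with all features vs the top four.
acc_all = cross_val_score(LogisticRegression(max_iter=1000), X, y, cv=5).mean()
acc_top = cross_val_score(LogisticRegression(max_iter=1000), X_top, y, cv=5).mean()
```

Comparing `acc_all` and `acc_top` mirrors the study's question of how few ranked factors suffice for accurate diagnosis.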

Abstract: Underwater Acoustic (UWA) wireless communication systems are plagued by a slew of impairments that restrict their performance, including high attenuation in seawater, sediment type, acidity concentration, water temperature, and sound speed propagation. One of the available solutions is Orthogonal Frequency Division Multiplexing (OFDM). Unfortunately, OFDM systems suffer from the Carrier Frequency Offset (CFO) phenomenon, which causes Inter-Carrier Interference. One means to overcome this problem is joint equalization and CFO compensation. In this paper, the conventional OFDM system is adapted for Multiple-Input Multiple-Output (MIMO)-OFDM communication utilizing the Discrete Wavelet Transform (DWT) rather than the Discrete Fourier Transform (DFT). The DWT-based OFDM system has certain benefits over its DFT counterpart. The trade-off, on the other hand, is the necessity for an extra DFT/IDFT to complete the frequency-domain equalization procedure, which increases the total computational complexity. In addition, we present a Joint Low Regularized Linear Zero Forcing equalizer for DWT-based MIMO-OFDM that employs the banded-matrix approximation approach. The suggested approach avoids signal-to-noise ratio estimation. Simulation results show that the proposed scheme outperforms other schemes under the same UWA channel conditions, especially in the case of estimation errors. PubDate: 2022-10-14
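
The regularized linear zero-forcing idea can be sketched for a toy MIMO system: the equalizer solves (H^H H + lambda*I) x = H^H y, where the small lambda keeps the inverse well conditioned without needing an SNR estimate. This is a generic regularized ZF solve with an illustrative fixed channel, not the paper's banded-matrix joint equalizer/CFO-compensation design:

```python
import numpy as np

rng = np.random.default_rng(5)

def regularized_zf(H, y, lam):
    """Regularized linear zero-forcing equalizer:
    x_hat = (H^H H + lam*I)^(-1) H^H y."""
    n = H.shape[1]
    return np.linalg.solve(H.conj().T @ H + lam * np.eye(n), H.conj().T @ y)

# Toy 2x2 MIMO link: fixed well-conditioned channel, QPSK symbols, small noise.
H = np.array([[1.0 + 0.2j, 0.3 - 0.1j],
              [0.2 + 0.1j, 0.9 - 0.3j]])
x = rng.choice(np.array([1 + 1j, 1 - 1j, -1 + 1j, -1 - 1j]), size=2) / np.sqrt(2)
y = H @ x + 0.01 * (rng.standard_normal(2) + 1j * rng.standard_normal(2))

x_hat = regularized_zf(H, y, lam=1e-3)
error = np.linalg.norm(x_hat - x)
```

Setting lam = 0 recovers plain zero-forcing, which amplifies noise on ill-conditioned channels; the regularizer trades a small bias for that robustness.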