Abstract: This review discusses methods for learning parameters for image reconstruction problems using bilevel formulations. Image reconstruction typically involves optimizing a cost function to recover a vector of unknown variables that agrees with collected measurements and prior assumptions. State-of-the-art image reconstruction methods learn these prior assumptions from training data using various machine learning techniques, such as bilevel methods.

One can view the bilevel problem as formalizing hyperparameter optimization, as bridging machine learning and cost function based optimization methods, or as a method to learn variables best suited to a specific task. More formally, bilevel problems attempt to minimize an upper-level loss function, where variables in the upper-level loss function are themselves minimizers of a lower-level cost function.

This review contains a running example problem of learning tuning parameters and the coefficients of sparsifying filters used in a regularizer. Such filters generalize the popular total variation regularization method, and learned filters are closely related to convolutional neural network approaches that are rapidly gaining in popularity. Here, the lower-level problem is to reconstruct an image using a regularizer with learned sparsifying filters; the corresponding upper-level optimization problem involves a measure of reconstructed image quality based on training data.

This review discusses multiple perspectives to motivate the use of bilevel methods and to make them more easily accessible to different audiences. We then turn to ways to optimize the bilevel problem, providing pros and cons of the variety of proposed approaches. Finally, we overview bilevel applications in image reconstruction.

Suggested Citation: Caroline Crockett and Jeffrey A. Fessler (2022), "Bilevel Methods for Image Reconstruction", Foundations and Trends® in Signal Processing: Vol. 15: No. 2-3, pp. 121-289.
http://dx.doi.org/10.1561/2000000111 PubDate: Thu, 05 May 2022 00:00:00 +020
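The nested structure described in this abstract can be written in a standard generic form (the symbols below are illustrative, not necessarily the monograph's exact notation): an upper-level loss $\ell$ of learnable parameters $\gamma$, evaluated at a reconstruction that is itself a minimizer of a lower-level cost $\Phi$:

```latex
\hat{\gamma} \in \arg\min_{\gamma} \; \ell\bigl(\hat{x}(\gamma)\bigr)
\quad \text{s.t.} \quad
\hat{x}(\gamma) \in \arg\min_{x} \; \Phi(x; \gamma)
```

In the running example, $\ell$ would measure reconstructed image quality against training data, while $\Phi$ is the reconstruction cost function whose regularizer (e.g., tuning parameters and sparsifying filter coefficients) is parameterized by $\gamma$.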

Abstract: This monograph addresses operating characteristics for binary hypothesis testing in both classical and quantum settings, and overcomplete quantum measurements for quantum binary state discrimination. We specifically explore decision and measurement operating characteristics, defined as the tradeoff between probability of detection and probability of false alarm as parameters of the pre-decision operator and the binary decision rule are varied. In the classical case, we consider in detail the Neyman-Pearson optimality of the operating characteristics when they are generated using threshold tests on a scalar score variable rather than threshold tests on the likelihood ratio. In the quantum setting, informationally overcomplete POVMs are explored to provide robust quantum binary state discrimination. We focus on equal trace rank one POVMs, which can be specified by arrangements of points on a sphere that we refer to as an ETRO sphere.

Suggested Citation: Catherine A. Medlock and Alan V. Oppenheim (2021), "Operating Characteristics for Classical and Quantum Binary Hypothesis Testing", Foundations and Trends® in Signal Processing: Vol. 15: No. 1, pp. 1-120. http://dx.doi.org/10.1561/2000000106 PubDate: Tue, 16 Nov 2021 00:00:00 +010
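As a toy illustration of the operating characteristic idea (not code from the monograph), the sketch below sweeps a threshold over a scalar score variable, here assumed Gaussian under each hypothesis, and traces the resulting (probability of false alarm, probability of detection) pairs:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Hypothetical scalar score variable, assumed Gaussian under each hypothesis
scores_h0 = rng.normal(0.0, 1.0, n)  # H0: noise only
scores_h1 = rng.normal(1.5, 1.0, n)  # H1: signal present (mean shift assumed)

# Threshold test: decide H1 when score > t; sweep t to trace the curve
thresholds = np.linspace(-4.0, 6.0, 201)
p_fa = [(scores_h0 > t).mean() for t in thresholds]  # probability of false alarm
p_d = [(scores_h1 > t).mean() for t in thresholds]   # probability of detection

# Each (p_fa[i], p_d[i]) pair is one operating point; raising the threshold
# lowers both probabilities, tracing the tradeoff the abstract describes.
```

For a monotone likelihood ratio family like this equal-variance Gaussian shift, thresholding the score is equivalent to thresholding the likelihood ratio, so this sweep attains the Neyman-Pearson optimal tradeoff; for other score variables it generally does not, which is the distinction the monograph examines.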

Abstract: Imagine a coverage area where each mobile device is communicating with a preferred set of wireless access points (among many) that are selected based on its needs and cooperate to jointly serve it, instead of creating autonomous cells. This effectively leads to a user-centric post-cellular network architecture, which can resolve many of the interference issues and service-quality variations that appear in cellular networks. This concept is called User-centric Cell-Free Massive MIMO (multiple-input multiple-output) and has its roots in Network MIMO. The main challenge is to achieve the benefits of cell-free operation in a practically feasible way, with computational complexity and fronthaul requirements that are scalable to enable massively large networks with many mobile devices. Starting from a definition of User-centric Cell-Free Massive MIMO, this monograph covers the foundations of channel estimation, signal processing, pilot assignment, dynamic cooperation cluster formation, power optimization, fronthaul signaling, and spectral efficiency evaluation in uplink and downlink, under different degrees of cooperation among the access points and arbitrary linear combining and precoding.

Suggested Citation: Özlem Tugfe Demir, Emil Björnson and Luca Sanguinetti (2020), "Foundations of User-Centric Cell-Free Massive MIMO", Foundations and Trends® in Signal Processing: Vol. 14: No. 3-4. http://dx.doi.org/10.1561/2000000109 PubDate: Thu, 31 Dec 2020 00:00:00 +010