Information and Inference
Number of Followers: 0 | Free journal
ISSN (Print): 2049-8764 | ISSN (Online): 2049-8772
Published by Oxford University Press
• Does SLOPE outperform bridge regression?
• Authors: Wang S; Weng H, Maleki A.
Pages: 1 - 54
Abstract: A recently proposed SLOPE estimator [6] has been shown to adaptively achieve the minimax $\ell _2$ estimation rate under high-dimensional sparse linear regression models [25]. Such minimax optimality holds in the regime where the sparsity level $k$, sample size $n$ and dimension $p$ satisfy $k/p\rightarrow 0, k\log p/n\rightarrow 0$. In this paper, we characterize the estimation error of SLOPE under the complementary regime where both $k$ and $n$ scale linearly with $p$, and provide new insights into the performance of SLOPE estimators. We first derive a concentration inequality for the finite sample mean square error (MSE) of SLOPE. The quantity that the MSE concentrates around takes a complicated and implicit form. With a delicate analysis of this quantity, we prove that among all SLOPE estimators, LASSO is optimal for estimating $k$-sparse parameter vectors that do not have tied nonzero components in the low noise scenario. On the other hand, in the large noise scenario, the family of SLOPE estimators is sub-optimal compared with bridge regression estimators such as the Ridge estimator.
PubDate: Mon, 15 Nov 2021 00:00:00 GMT
DOI: 10.1093/imaiai/iaab025
Issue No: Vol. 11, No. 1 (2021)
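The contrast the abstract draws between SLOPE's sorted-ℓ1 penalty and LASSO can be made concrete in a few lines. The weights and coefficient vector below are illustrative values, not taken from the paper:

```python
import numpy as np

def slope_penalty(beta, lam):
    """Sorted-l1 (SLOPE) penalty: sum_i lam[i] * |beta|_(i), where
    |beta|_(1) >= |beta|_(2) >= ... and lam is non-increasing."""
    abs_sorted = np.sort(np.abs(beta))[::-1]   # magnitudes, descending
    return float(np.dot(lam, abs_sorted))

beta = np.array([3.0, -1.0, 0.5, 0.0])
lam_slope = np.array([2.0, 1.5, 1.0, 0.5])     # decreasing weights
lam_lasso = np.full(4, 1.0)                    # equal weights

p_slope = slope_penalty(beta, lam_slope)  # 2*3 + 1.5*1 + 1*0.5 + 0.5*0 = 8.0
p_lasso = slope_penalty(beta, lam_lasso)  # plain l1 norm = 4.5
```

With equal weights the penalty reduces to the plain ℓ1 norm, which is why LASSO is a member of the SLOPE family the abstract compares against.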

• Secure multiparty computations in floating-point arithmetic
• Authors: Guo C; Hannun A, Knott B, et al.
Pages: 103 - 135
Abstract: Secure multiparty computations enable the distribution of so-called shares of sensitive data to multiple parties such that the parties can effectively process the data while being unable to glean much information about the data (at least not without collusion among all parties to put back together all the shares). Thus, the parties may conspire to send all their processed results to a trusted third party (perhaps the data providers) at the conclusion of the computations, with only the trusted third party being able to view the final results. Secure multiparty computations for privacy-preserving machine learning turn out to be possible using solely standard floating-point arithmetic, at least with a carefully controlled leakage of information less than the loss of accuracy due to roundoff, all backed by rigorous mathematical proofs of worst-case bounds on information loss and numerical stability in finite-precision arithmetic. Numerical examples illustrate the high performance attained on commodity off-the-shelf hardware for generalized linear models, including ordinary linear least-squares regression, binary and multinomial logistic regression, probit regression and Poisson regression.
PubDate: Mon, 18 Jan 2021 00:00:00 GMT
DOI: 10.1093/imaiai/iaaa038
Issue No: Vol. 11, No. 1 (2021)
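The share-distribution idea in the abstract can be sketched with additive secret sharing in plain floating-point arithmetic. This toy version ignores the paper's leakage and stability analysis and only shows the mechanics (splitting, reconstruction and local addition of shares):

```python
import numpy as np

rng = np.random.default_rng(0)

def share(x, n_parties=3, scale=1.0):
    """Split x into n additive shares: each party alone sees only a
    noise-like value, but the shares sum back to x (up to roundoff)."""
    shares = rng.uniform(-scale, scale, size=n_parties - 1)
    return np.append(shares, x - shares.sum())

def reconstruct(shares):
    return shares.sum()

secret = 3.14159
recovered = reconstruct(share(secret))

# Shares are additively homomorphic: parties add their shares locally,
# and only the combined result reveals the sum of the secrets.
total = reconstruct(share(2.0) + share(3.5))   # ~ 5.5
```

The homomorphic-addition step is what lets the parties "effectively process the data" before anything is sent to the trusted third party.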

• Distributed information-theoretic clustering
• Authors: Pichler G; Piantanida P, Matz G.
Pages: 137 - 166
Abstract: We study a novel multi-terminal source coding setup motivated by the biclustering problem. Two separate encoders observe two i.i.d. sequences $X^n$ and $Y^n$, respectively. The goal is to find rate-limited encodings $f(x^n)$ and $g(y^n)$ that maximize the mutual information $\textrm{I}(f(X^n); g(Y^n))/n$. We discuss connections of this problem with hypothesis testing against independence, pattern recognition and the information bottleneck method. Improving previous cardinality bounds for the inner and outer bounds allows us to thoroughly study the special case of a binary symmetric source and to quantify the gap between the inner and the outer bound in this special case. Furthermore, we investigate a multiple description (MD) extension of the CEO problem with a mutual information constraint. Surprisingly, this MD-CEO problem permits a tight single-letter characterization of the achievable region.
PubDate: Fri, 14 May 2021 00:00:00 GMT
DOI: 10.1093/imaiai/iaab007
Issue No: Vol. 11, No. 1 (2021)
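The binary symmetric source special case mentioned in the abstract is easy to explore numerically. A hedged sketch that computes the mutual information of a binary symmetric source from its joint pmf (the crossover probability 0.1 is an arbitrary choice for illustration):

```python
import numpy as np

def mutual_information(pxy):
    """I(X;Y) in bits from a joint pmf matrix pxy."""
    px = pxy.sum(axis=1, keepdims=True)      # marginal of X (column)
    py = pxy.sum(axis=0, keepdims=True)      # marginal of Y (row)
    mask = pxy > 0
    return float((pxy[mask] * np.log2(pxy[mask] / (px @ py)[mask])).sum())

# Binary symmetric source: Y is X flipped with crossover probability p.
p = 0.1
pxy = np.array([[(1 - p) / 2, p / 2],
                [p / 2, (1 - p) / 2]])

i_xy = mutual_information(pxy)   # equals 1 - h2(p), about 0.531 bits
```

Here the identity encodings attain $\textrm{I}(X;Y) = 1 - h_2(p)$; rate limits on $f$ and $g$ can only decrease this value, which is what the inner and outer bounds in the paper quantify.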

• A dimensionality reduction technique for unconstrained global optimization
of functions with low effective dimensionality
• Authors: Cartis C; Otemissov A.
Pages: 167 - 201
Abstract: We investigate the unconstrained global optimization of functions with low effective dimensionality, which are constant along certain (unknown) linear subspaces. Extending the technique of random subspace embeddings in Wang et al. (2016, J. Artificial Intelligence Res., 55, 361–387), we study a generic Random Embeddings for Global Optimization (REGO) framework that is compatible with any global minimization algorithm. Instead of the original, potentially large-scale optimization problem, within REGO a Gaussian random, low-dimensional problem with bound constraints is formulated and solved in a reduced space. We provide novel probabilistic bounds for the success of REGO in solving the original, low effective-dimensionality problem, which show its independence of the (potentially large) ambient dimension and its precise dependence on the dimensions of the effective and embedding subspaces. These results significantly improve existing theoretical analyses by providing the exact distribution of a reduced minimizer and its Euclidean norm and by the generality of the assumptions required on the problem. We validate our theoretical findings by extensive numerical testing of REGO with three types of global optimization solvers, illustrating the improved scalability of REGO compared with the full-dimensional application of the respective solvers.
PubDate: Wed, 19 May 2021 00:00:00 GMT
DOI: 10.1093/imaiai/iaab011
Issue No: Vol. 11, No. 1 (2021)
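The REGO framework itself is simple to emulate: draw a Gaussian matrix, pull the objective back to a low-dimensional bounded domain and hand the reduced problem to any solver. A minimal sketch, with an illustrative objective of effective dimension 2 in a 100-dimensional ambient space and scipy's bounded L-BFGS-B standing in for a global solver (the paper allows any global minimization algorithm):

```python
import numpy as np
from scipy.optimize import minimize

D, d = 100, 4                       # ambient and embedding dimensions

def f(x):
    """Low effective dimensionality: depends only on x[0] and x[1]."""
    return (x[0] - 1.0) ** 2 + (x[1] + 2.0) ** 2

rng = np.random.default_rng(1)
A = rng.normal(size=(D, d))         # Gaussian random embedding

# Solve the reduced d-dimensional problem g(y) = f(A @ y) with bound
# constraints, as in REGO, instead of the 100-dimensional original.
res = minimize(lambda y: f(A @ y), np.zeros(d),
               bounds=[(-5.0, 5.0)] * d)
best = f(A @ res.x)                 # near the global minimum value 0
```

Because the two effective coordinates of `A @ y` can generically reach any pair of values, the reduced problem shares the original's global minimum, which is the success event the paper's probabilistic bounds quantify.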

• Compressed Sensing with 1D Total Variation: Breaking Sample Complexity
Barriers via Non-Uniform Recovery
• Authors: Genzel M; März M, Seidel R.
Pages: 203 - 250
Abstract: This paper investigates total variation minimization in one spatial dimension for the recovery of gradient-sparse signals from undersampled Gaussian measurements. Recently established bounds for the required sampling rate state that uniform recovery of all $s$-gradient-sparse signals in ${\mathbb{R}}^n$ is only possible with $m \gtrsim \sqrt{s n} \cdot{\operatorname{PolyLog}}(n)$ measurements. Such a condition is especially prohibitive for high-dimensional problems, where $s$ is much smaller than $n$. However, previous empirical findings seem to indicate that this sampling rate does not reflect the typical behavior of total variation minimization. The present work provides a rigorous analysis that breaks the $\sqrt{s n}$-bottleneck for a large class of “natural” signals. The main result shows that non-uniform recovery succeeds with high probability for $m \gtrsim s \cdot{\operatorname{PolyLog}}(n)$ measurements if the jump discontinuities of the signal vector are sufficiently well separated. In particular, this guarantee allows for signals arising from a discretization of piecewise constant functions defined on an interval. The key ingredient of the proof is a novel upper bound for the associated conic Gaussian mean width, which is based on a signal-dependent, non-dyadic Haar wavelet transform. Furthermore, a natural extension to stable and robust recovery is addressed.
PubDate: Sat, 15 May 2021 00:00:00 GMT
DOI: 10.1093/imaiai/iaab001
Issue No: Vol. 11, No. 1 (2021)
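The role of gradient sparsity can be illustrated by writing a piecewise-constant signal as the cumulative sum of a sparse jump vector, which turns 1D TV minimization into a small LASSO problem. The sketch below uses illustrative dimensions (not the paper's asymptotic regime) with well-separated jumps, solves the LASSO with FISTA, and refits on the detected support:

```python
import numpy as np

rng = np.random.default_rng(2)
n, m, s = 80, 60, 3

# Piecewise-constant signal: x_true = cumsum(z_true) for a sparse jump
# vector z_true with well-separated jump locations.
z_true = np.zeros(n)
z_true[[0, 25, 55]] = [1.0, -2.0, 1.5]
x_true = np.cumsum(z_true)

A = rng.normal(size=(m, n)) / np.sqrt(m)    # Gaussian measurements
b = A @ x_true

# x = C z with C the cumulative-sum operator, so TV minimization becomes
# min_z 0.5*||A C z - b||^2 + lam*||z||_1, solved here with FISTA.
C = np.tril(np.ones((n, n)))
M = A @ C
step = 1.0 / np.linalg.norm(M, 2) ** 2
lam = 0.01 * np.max(np.abs(M.T @ b))

z, y, t = np.zeros(n), np.zeros(n), 1.0
for _ in range(3000):
    g = M.T @ (M @ y - b)
    z_new = y - step * g
    z_new = np.sign(z_new) * np.maximum(np.abs(z_new) - step * lam, 0.0)
    t_new = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
    y = z_new + ((t - 1.0) / t_new) * (z_new - z)
    z, t = z_new, t_new

# Debias: least-squares refit on the s largest-magnitude jumps.
support = np.argsort(np.abs(z))[-s:]
z_hat = np.zeros(n)
z_hat[support] = np.linalg.lstsq(M[:, support], b, rcond=None)[0]
rel_err = np.linalg.norm(C @ z_hat - x_true) / np.linalg.norm(x_true)
```

With the jumps this well separated, the noiseless recovery is essentially exact, matching the qualitative message of the non-uniform guarantee (though these tiny dimensions say nothing about the $\sqrt{sn}$ versus $s$ scaling itself).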

• Compressive learning with privacy guarantees
• Authors: Chatalic A; Schellekens V, Houssiau F, et al.
Pages: 251 - 305
Abstract: This work addresses the problem of learning from large collections of data with privacy guarantees. The compressive learning framework proposes to deal with the large scale of datasets by compressing them into a single vector of generalized random moments, called a sketch vector, from which the learning task is then performed. We provide sharp bounds on the so-called sensitivity of this sketching mechanism. This allows us to leverage standard techniques to ensure differential privacy—a well-established formalism for defining and quantifying the privacy of a random mechanism—by adding Laplace or Gaussian noise to the sketch. We combine these standard mechanisms with a new feature subsampling mechanism, which reduces the computational cost without damaging privacy. The overall framework is applied to the tasks of Gaussian modeling, k-means clustering and principal component analysis, for which sharp privacy bounds are derived. Empirically, the quality (for subsequent learning) of the compressed representation produced by our mechanism is strongly related to the induced noise level, for which we give analytical expressions.
PubDate: Sat, 15 May 2021 00:00:00 GMT
DOI: 10.1093/imaiai/iaab005
Issue No: Vol. 11, No. 1 (2021)
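A toy version of the sketch-and-noise mechanism: compress a dataset into averaged random Fourier moments, then add Laplace noise scaled to a crude sensitivity bound. The constants below are illustrative placeholders, not the paper's sharp sensitivity bounds:

```python
import numpy as np

rng = np.random.default_rng(3)
n, d, m = 5000, 2, 40                  # samples, data dimension, sketch size

X = rng.normal(size=(n, d))            # dataset to be summarized
W = rng.normal(size=(m, d))            # random frequencies

# Sketch: empirical mean of random Fourier moments exp(1j * <w, x>).
phi = np.exp(1j * X @ W.T)             # (n, m), every entry has modulus 1
sketch = phi.mean(axis=0)

# Replacing one sample moves each coordinate by at most 2/n (modulus-1
# features), so Laplace noise calibrated to a crude l1-sensitivity bound
# over real and imaginary parts gives a differentially private release.
epsilon = 1.0
sensitivity = 4.0 * m / n              # crude bound, not the paper's sharp one
scale = sensitivity / epsilon
noise = (rng.laplace(scale=scale, size=m)
         + 1j * rng.laplace(scale=scale, size=m))
private_sketch = sketch + noise
```

Any downstream task (Gaussian modeling, k-means, PCA in the paper) then sees only `private_sketch`, never the raw samples.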

• Learning deep linear neural networks: Riemannian gradient flows and
convergence to global minimizers
• Authors: Bah B; Rauhut H, Terstiege U, et al.
Pages: 307 - 353
Abstract: We study the convergence of gradient flows related to learning deep linear neural networks (where the activation function is the identity map) from data. In this case, the composition of the network layers amounts to simply multiplying the weight matrices of all layers together, resulting in an overparameterized problem. The gradient flow with respect to these factors can be re-interpreted as a Riemannian gradient flow on the manifold of rank-$r$ matrices endowed with a suitable Riemannian metric. We show that the flow always converges to a critical point of the underlying functional. Moreover, we establish that, for almost all initializations, the flow converges to a global minimum on the manifold of rank-$k$ matrices for some $k\leq r$.
PubDate: Mon, 08 Feb 2021 00:00:00 GMT
DOI: 10.1093/imaiai/iaaa039
Issue No: Vol. 11, No. 1 (2021)
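The overparameterized product structure is easy to reproduce: discretized gradient flow on the factors of a two-layer linear network drives the end-to-end product toward the target map. A minimal sketch with illustrative sizes and a small random initialization (the generic case covered by the almost-all-initializations result):

```python
import numpy as np

rng = np.random.default_rng(4)

# Target linear map and data; the network computes W2 @ W1 @ X, so training
# evolves the end-to-end product of the two layer factors.
X = rng.normal(size=(3, 50))
W_true = rng.normal(size=(2, 3))
Y = W_true @ X

W1 = 0.1 * rng.normal(size=(4, 3))   # hidden width 4: overparameterized
W2 = 0.1 * rng.normal(size=(2, 4))

lr = 1e-3                            # small step: discretized gradient flow
for _ in range(20000):
    R = W2 @ W1 @ X - Y              # residual of the end-to-end map
    G1 = W2.T @ R @ X.T              # gradient w.r.t. W1
    G2 = R @ X.T @ W1.T              # gradient w.r.t. W2
    W1 -= lr * G1
    W2 -= lr * G2

loss = 0.5 * np.linalg.norm(W2 @ W1 @ X - Y) ** 2   # near zero
```

The flow on the factors `(W1, W2)` induces a flow on the product `W2 @ W1`, which is the object the paper re-interprets as a Riemannian gradient flow on a fixed-rank manifold.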

• Breaking the waves: asymmetric random periodic features for low-bitrate
kernel machines
• Authors: Schellekens V; Jacques L.
Pages: 385 - 421
Abstract: Many signal processing and machine learning applications are built from evaluating a kernel on pairs of signals, e.g., to assess the similarity of an incoming query to a database of known signals. This nonlinear evaluation can be simplified to a linear inner product of the random Fourier features (RFFs) of those signals: random projections followed by a periodic map, the complex exponential. It is known that a simple quantization of those features (corresponding to replacing the complex exponential by a different periodic map that takes binary values, which is appealing for their transmission and storage) distorts the approximated kernel, which may be undesirable in practice. Our take-home message is that when the features of only one of the two signals are quantized, the original kernel is recovered without distortion; its practical interest appears in several cases where the kernel evaluations are asymmetric by nature, such as a client-server scheme. Concretely, we introduce the general framework of asymmetric random periodic features, where the two signals of interest are observed through random periodic features—random projections followed by a general periodic map, which is allowed to be different for both signals. We derive the influence of those periodic maps on the approximated kernel and prove uniform probabilistic error bounds holding for all pairs of signals picked in an infinite low-complexity set. Interestingly, our results allow the periodic maps to be discontinuous, thanks to a new mathematical tool, i.e., the mean Lipschitz smoothness. We then apply this generic framework to semi-quantized kernel machines (where only one of the signals has quantized features and the other has classical RFFs), for which we show theoretically that the approximated kernel remains unchanged (with the associated error bound) and confirm the power of the approach with numerical simulations.
PubDate: Tue, 11 May 2021 00:00:00 GMT
DOI: 10.1093/imaiai/iaab008
Issue No: Vol. 11, No. 1 (2021)
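The semi-quantized setting can be simulated directly: quantize the random Fourier features of one signal to signs (a binary periodic map, the square wave) and rescale the inner product. The pi/4 correction below comes from the first Fourier coefficient of the square wave; the dimensions and signals are illustrative:

```python
import numpy as np

rng = np.random.default_rng(5)
d, m, sigma = 5, 20000, 1.0

x = rng.normal(size=d)
y = rng.normal(size=d)

W = rng.normal(size=(m, d)) / sigma          # random projections
b = rng.uniform(0.0, 2.0 * np.pi, size=m)    # random phases

def rff(u):
    """Classical RFFs: <rff(x), rff(y)> approximates the Gaussian kernel
    exp(-||x - y||^2 / (2 sigma^2))."""
    return np.sqrt(2.0 / m) * np.cos(W @ u + b)

def rff_sign(u):
    """Binary periodic map (square wave): only the sign of each feature is
    kept, which is what a low-bitrate client would transmit."""
    return np.sqrt(2.0 / m) * np.sign(np.cos(W @ u + b))

k_true = np.exp(-np.linalg.norm(x - y) ** 2 / (2.0 * sigma ** 2))
k_full = rff(x) @ rff(y)                         # both sides full precision
k_semi = (np.pi / 4.0) * (rff_sign(x) @ rff(y))  # one side quantized, rescaled
```

Only the fundamental of the square wave correlates with the full-precision cosine features after phase averaging, so after rescaling the semi-quantized estimate is unbiased for the same Gaussian kernel, which is the paper's take-home message.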

• High-temperature structure detection in ferromagnets
• Authors: Cao Y; Neykov M, Liu H.
Pages: 55 - 102
Abstract: This paper studies structure detection problems in high-temperature ferromagnetic (positive interaction only) Ising models. The goal is to distinguish whether the underlying graph is empty, i.e., the model consists of independent Rademacher variables, vs. the alternative that the underlying graph contains a subgraph of a certain structure. We give matching upper and lower minimax bounds under which testing this problem is possible/impossible, respectively. Our results reveal that a key quantity called graph arboricity drives the testability of the problem. On the computational front, under a conjecture of the computational hardness of sparse principal component analysis, we prove that, unless the signal is strong enough, there are no polynomial-time tests which are capable of testing this problem. In order to prove this result, we exhibit a way to give sharp inequalities for the even moments of sums of i.i.d. Rademacher random variables which may be of independent interest.
PubDate: Sat, 26 Dec 2020 00:00:00 GMT
DOI: 10.1093/imaiai/iaaa032
Issue No: Vol. 11, No. 1 (2020)
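The even moments of sums of i.i.d. Rademacher variables, the technical tool highlighted at the end of the abstract, satisfy exact combinatorial identities that are easy to check by simulation (the values of n and the trial count below are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(6)
n, trials = 10, 200000

# S = X_1 + ... + X_n for i.i.d. Rademacher (+/-1) variables X_i. The even
# moments have exact combinatorial values, e.g. E[S^2] = n and
# E[S^4] = n*E[X^4] + 3n(n-1)*E[X^2]^2 = 3n^2 - 2n.
R = rng.choice([-1, 1], size=(trials, n))
S = R.sum(axis=1).astype(float)

m2 = (S ** 2).mean()   # expected value: n = 10
m4 = (S ** 4).mean()   # expected value: 3n^2 - 2n = 280
```

Sharp versions of such moment bounds are what the paper uses to transfer the sparse-PCA hardness conjecture to the Ising testing problem.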

• Recovery analysis of damped spectrally sparse signals and its relation to
MUSIC
• Authors: Li S; Mansour H, Wakin M.
Pages: 355 - 383
Abstract: One of the classical approaches for estimating the frequencies and damping factors in a spectrally sparse signal is the MUltiple SIgnal Classification (MUSIC) algorithm, which exploits the low-rank structure of an autocorrelation matrix. Low-rank matrices have also received considerable attention recently in the context of optimization algorithms with partial observations, and nuclear norm minimization (NNM) has been widely used as a popular heuristic of rank minimization for low-rank matrix recovery problems. On the other hand, it has been shown that NNM can be viewed as a special case of atomic norm minimization (ANM), which has achieved great success in solving line spectrum estimation problems. However, as far as we know, the general ANM (not NNM) considered in many existing works can only handle frequency estimation in undamped sinusoids. In this work, we aim to fill this gap and deal with damped spectrally sparse signal recovery problems. In particular, inspired by the dual analysis used in ANM, we offer a novel optimization-based perspective on the classical MUSIC algorithm and propose an algorithm for spectral estimation that involves searching for the peaks of the dual polynomial corresponding to a certain NNM problem, and we show that this algorithm is in fact equivalent to MUSIC itself. Building on this connection, we also extend the classical MUSIC algorithm to the missing data case. We provide exact recovery guarantees for our proposed algorithms and quantify how the sample complexity depends on the true spectral parameters. In particular, we provide a parameter-specific recovery bound for low-rank matrix recovery of jointly sparse signals rather than relying on incoherence properties as in the existing literature. Simulation results also indicate that the proposed algorithms significantly outperform some relevant existing methods (e.g., ANM) in frequency estimation of damped exponentials.
PubDate: Wed, 07 Oct 2020 00:00:00 GMT
DOI: 10.1093/imaiai/iaaa024
Issue No: Vol. 11, No. 1 (2020)
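The subspace step underlying MUSIC can be sketched on a noiseless damped signal: a Hankel matrix built from the samples has rank equal to the number of modes, and its column space encodes the damped poles. The closed-form shift-invariance (ESPRIT-style) step below is a close cousin of MUSIC's pseudospectrum search, used here only because it avoids a 2D grid over frequency and damping; it is not the authors' algorithm, and all signal parameters are illustrative:

```python
import numpy as np

# Damped spectrally sparse signal: a sum of damped complex exponentials
# x[t] = sum_k exp((-d_k + 2*pi*1j*f_k) * t).
n, r = 64, 2
t = np.arange(n)
poles_true = np.exp(-np.array([0.01, 0.02])
                    + 2j * np.pi * np.array([0.12, 0.31]))
x = (poles_true[None, :] ** t[:, None]).sum(axis=1)

# Subspace step shared by MUSIC and its relatives: the L x (n-L+1) Hankel
# matrix H[i, j] = x[i + j] has rank r, and its column space is spanned by
# the Vandermonde vectors of the poles.
L = 32
H = np.array([x[i:i + n - L + 1] for i in range(L)])
U = np.linalg.svd(H)[0]
Us = U[:, :r]                                  # signal subspace

# Shift invariance recovers the damped poles in closed form; MUSIC would
# instead scan a pseudospectrum built from the noise subspace U[:, r:].
Psi = np.linalg.lstsq(Us[:-1], Us[1:], rcond=None)[0]
poles_est = np.linalg.eigvals(Psi)

freqs_est = np.sort(np.angle(poles_est) / (2 * np.pi))   # ~ [0.12, 0.31]
damps_est = np.sort(-np.log(np.abs(poles_est)))          # ~ [0.01, 0.02]
```

In the missing-data case the paper considers, this full Hankel matrix is not available, which is where the low-rank matrix recovery machinery (NNM and its dual polynomial) enters.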

JournalTOCs
School of Mathematical and Computer Sciences
Heriot-Watt University
Edinburgh, EH14 4AS, UK
Email: journaltocs@hw.ac.uk
Tel: +00 44 (0)131 4513762