Subjects -> MATHEMATICS (Total: 1100 journals)
    - APPLIED MATHEMATICS (88 journals)
    - GEOMETRY AND TOPOLOGY (23 journals)
    - MATHEMATICS (812 journals)
    - MATHEMATICS (GENERAL) (43 journals)
    - NUMERICAL ANALYSIS (24 journals)
    - PROBABILITIES AND MATH STATISTICS (110 journals)

MATHEMATICS (812 journals)

Showing 1 - 200 of 538 Journals sorted alphabetically
Abakós     Open Access   (Followers: 5)
Abhandlungen aus dem Mathematischen Seminar der Universität Hamburg     Hybrid Journal   (Followers: 4)
Academic Voices : A Multidisciplinary Journal     Open Access   (Followers: 2)
Accounting Perspectives     Full-text available via subscription   (Followers: 7)
ACM Transactions on Algorithms (TALG)     Hybrid Journal   (Followers: 16)
ACM Transactions on Computational Logic (TOCL)     Hybrid Journal   (Followers: 4)
ACM Transactions on Mathematical Software (TOMS)     Hybrid Journal   (Followers: 6)
ACS Applied Materials & Interfaces     Hybrid Journal   (Followers: 40)
Acta Applicandae Mathematicae     Hybrid Journal   (Followers: 1)
Acta Mathematica     Hybrid Journal   (Followers: 12)
Acta Mathematica Hungarica     Hybrid Journal   (Followers: 2)
Acta Mathematica Scientia     Full-text available via subscription   (Followers: 5)
Acta Mathematica Sinica, English Series     Hybrid Journal   (Followers: 6)
Acta Mathematica Vietnamica     Hybrid Journal  
Acta Mathematicae Applicatae Sinica, English Series     Hybrid Journal  
Advanced Science Letters     Full-text available via subscription   (Followers: 12)
Advances in Applied Clifford Algebras     Hybrid Journal   (Followers: 4)
Advances in Calculus of Variations     Hybrid Journal   (Followers: 6)
Advances in Catalysis     Full-text available via subscription   (Followers: 5)
Advances in Complex Systems     Hybrid Journal   (Followers: 10)
Advances in Computational Mathematics     Hybrid Journal   (Followers: 23)
Advances in Decision Sciences     Open Access   (Followers: 4)
Advances in Difference Equations     Open Access   (Followers: 3)
Advances in Fixed Point Theory     Open Access   (Followers: 8)
Advances in Geosciences (ADGEO)     Open Access   (Followers: 19)
Advances in Linear Algebra & Matrix Theory     Open Access   (Followers: 11)
Advances in Materials Science     Open Access   (Followers: 19)
Advances in Mathematical Physics     Open Access   (Followers: 8)
Advances in Mathematics     Full-text available via subscription   (Followers: 17)
Advances in Nonlinear Analysis     Open Access   (Followers: 1)
Advances in Numerical Analysis     Open Access   (Followers: 9)
Advances in Operations Research     Open Access   (Followers: 13)
Advances in Operator Theory     Hybrid Journal   (Followers: 4)
Advances in Porous Media     Full-text available via subscription   (Followers: 5)
Advances in Pure and Applied Mathematics     Hybrid Journal   (Followers: 10)
Advances in Pure Mathematics     Open Access   (Followers: 11)
Advances in Science and Research (ASR)     Open Access   (Followers: 9)
Aequationes Mathematicae     Hybrid Journal   (Followers: 2)
African Journal of Educational Studies in Mathematics and Sciences     Full-text available via subscription   (Followers: 9)
African Journal of Mathematics and Computer Science Research     Open Access   (Followers: 7)
Afrika Matematika     Hybrid Journal   (Followers: 3)
Air, Soil & Water Research     Open Access   (Followers: 13)
AKSIOMA Journal of Mathematics Education     Open Access   (Followers: 3)
AKSIOMATIK : Jurnal Penelitian Pendidikan dan Pembelajaran Matematika     Open Access   (Followers: 1)
Al-Jabar : Jurnal Pendidikan Matematika     Open Access   (Followers: 1)
Al-Qadisiyah Journal for Computer Science and Mathematics     Open Access   (Followers: 1)
AL-Rafidain Journal of Computer Sciences and Mathematics     Open Access   (Followers: 6)
Algebra and Logic     Hybrid Journal   (Followers: 8)
Algebra Colloquium     Hybrid Journal   (Followers: 4)
Algebra Universalis     Hybrid Journal   (Followers: 2)
Algorithmic Operations Research     Open Access   (Followers: 5)
Algorithms     Open Access   (Followers: 12)
Algorithms Research     Open Access   (Followers: 1)
American Journal of Computational and Applied Mathematics     Open Access   (Followers: 10)
American Journal of Mathematical Analysis     Open Access   (Followers: 2)
American Journal of Mathematical and Management Sciences     Hybrid Journal   (Followers: 1)
American Journal of Mathematics     Full-text available via subscription   (Followers: 7)
American Journal of Operations Research     Open Access   (Followers: 8)
American Mathematical Monthly     Full-text available via subscription   (Followers: 6)
An International Journal of Optimization and Control: Theories & Applications     Open Access   (Followers: 11)
Anadolu University Journal of Science and Technology B : Theoretical Sciences     Open Access  
Analele Universitatii Ovidius Constanta - Seria Matematica     Open Access  
Analysis and Applications     Hybrid Journal   (Followers: 1)
Analysis and Mathematical Physics     Hybrid Journal   (Followers: 6)
Analysis Mathematica     Full-text available via subscription  
Analysis. International mathematical journal of analysis and its applications     Hybrid Journal   (Followers: 5)
Anargya : Jurnal Ilmiah Pendidikan Matematika     Open Access   (Followers: 7)
Annales Mathematicae Silesianae     Open Access   (Followers: 2)
Annales mathématiques du Québec     Hybrid Journal   (Followers: 4)
Annales Universitatis Mariae Curie-Sklodowska, sectio A – Mathematica     Open Access   (Followers: 1)
Annales Universitatis Paedagogicae Cracoviensis. Studia Mathematica     Open Access  
Annali di Matematica Pura ed Applicata     Hybrid Journal   (Followers: 1)
Annals of Combinatorics     Hybrid Journal   (Followers: 4)
Annals of Data Science     Hybrid Journal   (Followers: 14)
Annals of Discrete Mathematics     Full-text available via subscription   (Followers: 8)
Annals of Functional Analysis     Hybrid Journal   (Followers: 3)
Annals of Mathematics     Full-text available via subscription   (Followers: 2)
Annals of Mathematics and Artificial Intelligence     Hybrid Journal   (Followers: 14)
Annals of PDE     Hybrid Journal  
Annals of Pure and Applied Logic     Open Access   (Followers: 5)
Annals of the Alexandru Ioan Cuza University - Mathematics     Open Access  
Annals of the Institute of Statistical Mathematics     Hybrid Journal   (Followers: 1)
Annals of West University of Timisoara - Mathematics     Open Access  
Annals of West University of Timisoara - Mathematics and Computer Science     Open Access   (Followers: 2)
Annuaire du Collège de France     Open Access   (Followers: 6)
ANZIAM Journal     Open Access   (Followers: 1)
Applicable Algebra in Engineering, Communication and Computing     Hybrid Journal   (Followers: 3)
Applications of Mathematics     Hybrid Journal   (Followers: 3)
Applied Categorical Structures     Hybrid Journal   (Followers: 4)
Applied Computational Intelligence and Soft Computing     Open Access   (Followers: 14)
Applied Mathematics     Open Access   (Followers: 4)
Applied Mathematics     Open Access   (Followers: 8)
Applied Mathematics & Optimization     Hybrid Journal   (Followers: 10)
Applied Mathematics - A Journal of Chinese Universities     Hybrid Journal   (Followers: 1)
Applied Mathematics and Nonlinear Sciences     Open Access  
Applied Mathematics Letters     Full-text available via subscription   (Followers: 4)
Applied Mathematics Research eXpress     Hybrid Journal   (Followers: 1)
Applied Network Science     Open Access   (Followers: 2)
Applied Numerical Mathematics     Hybrid Journal   (Followers: 5)
Applied Spatial Analysis and Policy     Hybrid Journal   (Followers: 6)
Arab Journal of Mathematical Sciences     Open Access   (Followers: 4)
Arabian Journal of Mathematics     Open Access   (Followers: 2)
Archive for Mathematical Logic     Hybrid Journal   (Followers: 4)
Archive of Applied Mechanics     Hybrid Journal   (Followers: 6)
Archive of Numerical Software     Open Access  
Archives of Computational Methods in Engineering     Hybrid Journal   (Followers: 6)
Arkiv för Matematik     Hybrid Journal   (Followers: 1)
Armenian Journal of Mathematics     Open Access   (Followers: 1)
Arnold Mathematical Journal     Hybrid Journal   (Followers: 1)
Artificial Satellites     Open Access   (Followers: 24)
Asia-Pacific Journal of Operational Research     Hybrid Journal   (Followers: 3)
Asian Journal of Algebra     Open Access   (Followers: 1)
Asian Research Journal of Mathematics     Open Access   (Followers: 1)
Asian-European Journal of Mathematics     Hybrid Journal   (Followers: 3)
Australian Mathematics Teacher, The     Full-text available via subscription   (Followers: 7)
Australian Primary Mathematics Classroom     Full-text available via subscription   (Followers: 5)
Australian Senior Mathematics Journal     Full-text available via subscription   (Followers: 2)
Automatic Documentation and Mathematical Linguistics     Hybrid Journal   (Followers: 5)
Axioms     Open Access   (Followers: 1)
Baltic International Yearbook of Cognition, Logic and Communication     Open Access   (Followers: 2)
Banach Journal of Mathematical Analysis     Hybrid Journal   (Followers: 2)
Basin Research     Hybrid Journal   (Followers: 5)
BIBECHANA     Open Access   (Followers: 2)
Biomath     Open Access  
BIT Numerical Mathematics     Hybrid Journal   (Followers: 1)
Boletim Cearense de Educação e História da Matemática     Open Access  
Boletim de Educação Matemática     Open Access  
Boletín de la Sociedad Matemática Mexicana     Hybrid Journal  
Bollettino dell'Unione Matematica Italiana     Full-text available via subscription   (Followers: 3)
British Journal of Mathematical and Statistical Psychology     Full-text available via subscription   (Followers: 18)
Bruno Pini Mathematical Analysis Seminar     Open Access  
Buletinul Academiei de Stiinte a Republicii Moldova. Matematica     Open Access   (Followers: 13)
Bulletin des Sciences Mathématiques     Full-text available via subscription   (Followers: 4)
Bulletin of Dnipropetrovsk University. Series : Communications in Mathematical Modeling and Differential Equations Theory     Open Access   (Followers: 3)
Bulletin of Mathematical Sciences     Open Access   (Followers: 1)
Bulletin of Symbolic Logic     Full-text available via subscription   (Followers: 3)
Bulletin of the Australian Mathematical Society     Full-text available via subscription   (Followers: 2)
Bulletin of the Brazilian Mathematical Society, New Series     Hybrid Journal  
Bulletin of the Iranian Mathematical Society     Hybrid Journal  
Bulletin of the London Mathematical Society     Hybrid Journal   (Followers: 3)
Bulletin of the Malaysian Mathematical Sciences Society     Hybrid Journal  
Cadernos do IME : Série Matemática     Open Access   (Followers: 1)
Calculus of Variations and Partial Differential Equations     Hybrid Journal  
Canadian Journal of Mathematics / Journal canadien de mathématiques     Hybrid Journal  
Canadian Journal of Science, Mathematics and Technology Education     Hybrid Journal   (Followers: 22)
Canadian Mathematical Bulletin     Hybrid Journal  
Carpathian Mathematical Publications     Open Access   (Followers: 1)
Catalysis in Industry     Hybrid Journal   (Followers: 1)
CEAS Space Journal     Hybrid Journal   (Followers: 3)
CHANCE     Hybrid Journal   (Followers: 5)
Chaos, Solitons & Fractals     Hybrid Journal   (Followers: 3)
Chaos, Solitons & Fractals : X     Open Access  
ChemSusChem     Hybrid Journal   (Followers: 8)
Chinese Annals of Mathematics, Series B     Hybrid Journal  
Chinese Journal of Catalysis     Full-text available via subscription   (Followers: 2)
Chinese Journal of Mathematics     Open Access  
Ciencia     Open Access   (Followers: 1)
Clean Air Journal     Full-text available via subscription   (Followers: 1)
CODEE Journal     Open Access   (Followers: 3)
Cogent Mathematics     Open Access   (Followers: 2)
Cognitive Computation     Hybrid Journal   (Followers: 3)
Collectanea Mathematica     Hybrid Journal  
College Mathematics Journal     Hybrid Journal   (Followers: 4)
COMBINATORICA     Hybrid Journal  
Combinatorics, Probability and Computing     Hybrid Journal   (Followers: 4)
Combustion Theory and Modelling     Hybrid Journal   (Followers: 15)
Commentarii Mathematici Helvetici     Hybrid Journal  
Communications in Advanced Mathematical Sciences     Open Access  
Communications in Combinatorics and Optimization     Open Access  
Communications in Contemporary Mathematics     Hybrid Journal  
Communications in Mathematical Physics     Hybrid Journal   (Followers: 4)
Communications On Pure & Applied Mathematics     Hybrid Journal   (Followers: 4)
Complex Analysis and its Synergies     Open Access   (Followers: 3)
Complex Variables and Elliptic Equations: An International Journal     Hybrid Journal  
Composite Materials Series     Full-text available via subscription   (Followers: 9)
Compositio Mathematica     Full-text available via subscription  
Comptes Rendus Mathematique     Full-text available via subscription  
Computational and Applied Mathematics     Hybrid Journal   (Followers: 4)
Computational and Mathematical Methods     Hybrid Journal  
Computational and Mathematical Methods in Medicine     Open Access   (Followers: 2)
Computational and Mathematical Organization Theory     Hybrid Journal   (Followers: 1)
Computational Complexity     Hybrid Journal   (Followers: 4)
Computational Mathematics and Modeling     Hybrid Journal   (Followers: 9)
Computational Mechanics     Hybrid Journal   (Followers: 5)
Computational Methods and Function Theory     Hybrid Journal  
Computational Optimization and Applications     Hybrid Journal   (Followers: 9)
Computers & Mathematics with Applications     Full-text available via subscription   (Followers: 11)
Concrete Operators     Open Access   (Followers: 4)
Confluentes Mathematici     Hybrid Journal  
Contributions to Discrete Mathematics     Open Access   (Followers: 2)
Contributions to Game Theory and Management     Open Access  
COSMOS     Hybrid Journal  
Cryptography and Communications     Hybrid Journal   (Followers: 13)
Cuadernos de Investigación y Formación en Educación Matemática     Open Access  
Cubo. A Mathematical Journal     Open Access  
Current Research in Biostatistics     Open Access   (Followers: 8)
Czechoslovak Mathematical Journal     Hybrid Journal   (Followers: 1)
Daya Matematis : Jurnal Inovasi Pendidikan Matematika     Open Access   (Followers: 2)
Demographic Research     Open Access   (Followers: 15)
Demonstratio Mathematica     Open Access  


Computational Optimization and Applications
Journal Prestige (SJR): 1.127
Citation Impact (CiteScore): 2
Number of Followers: 9  
 
  Hybrid Journal (may contain Open Access articles)
ISSN (Print) 0926-6003 - ISSN (Online) 1573-2894
Published by Springer-Verlag  [2626 journals]
  • Convergence rates of subgradient methods for quasi-convex optimization
           problems
    • Abstract: Quasi-convex optimization plays a pivotal role in many fields, including economics and finance; the subgradient method is an effective iterative algorithm for solving large-scale quasi-convex optimization problems. In this paper, we investigate the quantitative convergence theory, including the iteration complexity and convergence rates, of various subgradient methods for solving quasi-convex optimization problems in a unified framework. In particular, we consider a sequence satisfying a general (inexact) basic inequality, and investigate the global convergence theorem and the iteration complexity when using the constant, diminishing or dynamic stepsize rules. More importantly, we establish the linear (or sublinear) convergence rates of the sequence under an additional assumption of weak sharp minima of Hölderian order and upper bounded noise. These convergence theorems are applied to establish the iteration complexity and convergence rates of several subgradient methods, including the standard/inexact/conditional subgradient methods, for solving quasi-convex optimization problems under the assumptions of the Hölder condition and/or the weak sharp minima of Hölderian order.
      PubDate: 2020-09-01
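
A minimal numpy sketch of the kind of iteration analyzed above: a projected subgradient method with normalized steps and a diminishing stepsize 1/sqrt(k+1). The function names, stepsize rule, and toy problem are illustrative assumptions, not the authors' exact scheme.

```python
import numpy as np

def projected_subgradient(f, subgrad, project, x0, iters=500):
    # Normalized subgradient steps with diminishing stepsize 1/sqrt(k+1);
    # normalization matters for quasi-convex objectives, where only the
    # direction of a subgradient is informative. Track the best iterate,
    # since subgradient steps need not decrease f monotonically.
    x = np.asarray(x0, dtype=float)
    x_best, f_best = x.copy(), f(x)
    for k in range(iters):
        g = subgrad(x)
        gn = np.linalg.norm(g)
        if gn < 1e-12:
            break
        x = project(x - g / (gn * np.sqrt(k + 1)))
        fx = f(x)
        if fx < f_best:
            x_best, f_best = x.copy(), fx
    return x_best, f_best

# Example: minimize the quasi-convex f(x) = |x - 3| over the interval [0, 2].
sol, val = projected_subgradient(
    f=lambda x: abs(x[0] - 3.0),
    subgrad=lambda x: np.array([np.sign(x[0] - 3.0)]),
    project=lambda x: np.clip(x, 0.0, 2.0),
    x0=np.array([0.5]))
print(sol, val)   # x close to 2.0, the constrained minimizer
```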
       
  • Convergence study of indefinite proximal ADMM with a relaxation factor
    • Abstract: The alternating direction method of multipliers (ADMM) is widely used to solve separable convex programming problems. At each iteration, the classical ADMM solves the subproblems exactly. For many problems arising from practical applications, it is usually impossible or too expensive to obtain the exact solution of a subproblem. To overcome this, a special proximal term is added to ease the solvability of a subproblem. In the literature, the proximal term can be relaxed to be indefinite while still with a convergence guarantee; this relaxation permits the adoption of larger step sizes to solve the subproblem, which particularly accelerates its performance. A large value of the relaxation factor introduced in the dual step of ADMM also plays a vital role in accelerating its performance. However, it is still not clear whether these two acceleration strategies can be used simultaneously with no restriction on the penalty parameter. In this paper, we answer this question affirmatively and conduct a rigorous convergence analysis for indefinite proximal ADMM with a relaxation factor (IP-ADMM\(_{r}\)), reveal the relationships between the parameter in the indefinite proximal term and the relaxation factor to ensure its global convergence, and establish the worst-case convergence rate in the ergodic sense. Finally, some numerical results on basis pursuit and total variation-based denoising with box constraint problems are presented to verify the efficiency of IP-ADMM\(_{r}\).
      PubDate: 2020-09-01
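
For context, a plain relaxed ADMM on a lasso (basis pursuit denoising) instance, with the relaxation factor gamma applied in the dual step; this hedged sketch omits the indefinite proximal term that is the paper's subject, and all names and parameter values are illustrative.

```python
import numpy as np

def admm_lasso(A, b, lam=0.1, rho=1.0, gamma=1.5, iters=200):
    # Relaxed ADMM for  min 0.5||Ax - b||^2 + lam||z||_1  s.t.  x - z = 0,
    # in scaled form; gamma is the relaxation factor in the dual step,
    # classically restricted to (0, (1 + sqrt(5)) / 2).
    n = A.shape[1]
    x = np.zeros(n); z = np.zeros(n); u = np.zeros(n)
    AtA, Atb = A.T @ A, A.T @ b
    L = np.linalg.cholesky(AtA + rho * np.eye(n))    # factor once, reuse every iteration
    for _ in range(iters):
        x = np.linalg.solve(L.T, np.linalg.solve(L, Atb + rho * (z - u)))
        v = x + u
        z = np.sign(v) * np.maximum(np.abs(v) - lam / rho, 0.0)   # soft-thresholding
        u = u + gamma * (x - z)                      # relaxed dual update
    return z

rng = np.random.default_rng(0)
A = rng.standard_normal((40, 100))
x_true = np.zeros(100); x_true[:5] = 1.0
z = admm_lasso(A, A @ x_true)
print(np.round(z[:8], 2))   # first five entries near 1, the rest near 0
```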
       
  • On the convergence of steepest descent methods for multiobjective
           optimization
    • Abstract: In this paper we consider the classical unconstrained nonlinear multiobjective optimization problem. For such a problem, it is particularly interesting to compute as many points as possible in an effort to approximate the so-called Pareto front. Consequently, to solve the problem we define an “a posteriori” algorithm whose generic iterate is represented by a set of points rather than by a single one. The proposed algorithm takes advantage of a linesearch with extrapolation along steepest descent directions with respect to (possibly not all of) the objective functions. The sequence of sets of points produced by the algorithm defines a set of “linked” sequences of points. We show that each linked sequence admits at least one limit point (not necessarily distinct from those obtained by other sequences) and that every limit point is Pareto-stationary. We also report numerical results on a collection of multiobjective problems that show the efficiency of the proposed approach over more classical ones.
      PubDate: 2020-09-01
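
A textbook sketch of the underlying steepest descent step for two objectives: the common descent direction is the negative of the minimum-norm element of the convex hull of the two gradients, which admits a closed form in the two-objective case. The linesearch with extrapolation and the set-of-points iterate of the paper are not reproduced; everything below is an illustrative assumption.

```python
import numpy as np

def biobjective_steepest_descent(grads, x0, step=0.1, iters=100, tol=1e-8):
    # Common descent direction for two objectives: the negative of the
    # minimum-norm element of conv{g1, g2}, available in closed form.
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        g1, g2 = grads(x)
        diff = g1 - g2
        lam = 0.5 if diff @ diff < tol else np.clip(-(g2 @ diff) / (diff @ diff), 0.0, 1.0)
        d = -(lam * g1 + (1.0 - lam) * g2)
        if np.linalg.norm(d) < tol:        # Pareto-stationary: 0 in conv{g1, g2}
            break
        x = x + step * d
    return x

# f1(x) = ||x - a||^2 and f2(x) = ||x - b||^2: the Pareto set is the segment [a, b].
a, b = np.array([0.0, 0.0]), np.array([1.0, 1.0])
x = biobjective_steepest_descent(lambda v: (2 * (v - a), 2 * (v - b)),
                                 x0=np.array([3.0, -1.0]))
print(np.round(x, 3))   # ends on (or near) the segment between a and b
```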
       
  • An augmented Lagrangian algorithm for multi-objective optimization
    • Abstract: In this paper, we propose an adaptation of the classical augmented Lagrangian method for dealing with multi-objective optimization problems. Specifically, after a brief review of the literature, we give a suitable definition of the augmented Lagrangian for equality and inequality constrained multi-objective problems. We exploit this object in a general computational scheme that is proved to converge, under mild assumptions, to weak Pareto points of such problems. We then provide a modified version of the algorithm which is more suited for practical implementations, again proving convergence properties under reasonable hypotheses. Finally, computational experiments show that the proposed methods not only work in practice but are also competitive with state-of-the-art methods.
      PubDate: 2020-09-01
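
As background, a single-objective classical augmented Lagrangian loop, the object the paper generalizes to vector-valued objectives; the crude inner gradient solver, parameter values, and toy problem are illustrative assumptions.

```python
import numpy as np

def augmented_lagrangian(f_grad, h, h_jac, x0, mu=10.0, outer=10, inner=300, lr=0.02):
    # Classical augmented Lagrangian loop for  min f(x)  s.t.  h(x) = 0:
    # approximately minimize L(x) = f(x) + lam.h(x) + (mu/2)||h(x)||^2 by
    # gradient descent, then take the first-order multiplier update.
    x = np.asarray(x0, dtype=float)
    lam = np.zeros(h(x).shape)
    for _ in range(outer):
        for _ in range(inner):
            x = x - lr * (f_grad(x) + h_jac(x).T @ (lam + mu * h(x)))
        lam = lam + mu * h(x)      # first-order multiplier update
    return x, lam

# min x1^2 + x2^2  s.t.  x1 + x2 = 1  ->  x* = (0.5, 0.5), lam* = -1.
x, lam = augmented_lagrangian(
    f_grad=lambda x: 2 * x,
    h=lambda x: np.array([x[0] + x[1] - 1.0]),
    h_jac=lambda x: np.array([[1.0, 1.0]]),
    x0=np.zeros(2))
print(np.round(x, 3), np.round(lam, 3))   # ~ [0.5 0.5] [-1.]
```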
       
  • An active-set algorithmic framework for non-convex optimization problems
           over the simplex
    • Abstract: In this paper, we describe a new active-set algorithmic framework for minimizing a non-convex function over the unit simplex. At each iteration, the method makes use of a rule for identifying active variables (i.e., variables that are zero at a stationary point) and specific directions (that we name active-set gradient related directions) satisfying a new “nonorthogonality” type of condition. We prove global convergence to stationary points when using an Armijo line search in the given framework. We further describe three different examples of active-set gradient related directions that guarantee linear convergence rate (under suitable assumptions). Finally, we report numerical experiments showing the effectiveness of the approach.
      PubDate: 2020-09-01
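
A baseline that the active-set framework above refines: projected gradient over the unit simplex, with the standard sort-based Euclidean projection. The paper's identification rule and active-set gradient related directions are not reproduced; this is a hedged sketch with illustrative names and data.

```python
import numpy as np

def project_simplex(v):
    # Euclidean projection onto the unit simplex (sort-based O(n log n) method).
    u = np.sort(v)[::-1]
    css = np.cumsum(u) - 1.0
    rho = np.nonzero(u - css / np.arange(1, len(v) + 1) > 0)[0][-1]
    return np.maximum(v - css[rho] / (rho + 1.0), 0.0)

def projected_gradient_simplex(grad, x0, step=0.1, iters=300):
    # Plain projected gradient over the simplex; an active-set method would
    # additionally fix the variables identified as zero at a stationary point.
    x = project_simplex(np.asarray(x0, dtype=float))
    for _ in range(iters):
        x = project_simplex(x - step * grad(x))
    return x

# min ||x - c||^2 over the simplex: the solution is the projection of c.
c = np.array([0.9, 0.5, -0.2, 0.1])
x = projected_gradient_simplex(lambda x: 2 * (x - c), x0=np.full(4, 0.25))
print(np.round(x, 3))   # ~ [0.7, 0.3, 0, 0]: zeros are the "active" variables
```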
       
  • On the resolution of misspecified convex optimization and monotone
           variational inequality problems
    • Abstract: We consider a misspecified optimization problem, requiring the minimization of a function \(f(\cdot;\theta ^*)\) over a closed and convex set X, where \(\theta ^*\) is an unknown vector of parameters that may be learnt by a parallel learning process. Here, we develop coupled schemes that generate iterates \((x_k,\theta _k)\) such that, as \(k \rightarrow \infty\), \(x_k \rightarrow x^*\), a minimizer of \(f(\cdot;\theta ^*)\) over X, and \(\theta _k \rightarrow \theta ^*\). In the first part of the paper, we consider the solution of problems where f is either smooth or nonsmooth. In smooth strongly convex regimes, we demonstrate that such schemes still display a linear rate of convergence, albeit with larger constants. When strong convexity assumptions are weakened, it can be shown that the convergence in function value sees a modification in the canonical convergence rate of \({{{\mathcal {O}}}}(1/K)\) by an additive factor proportional to \(\Vert \theta _0-\theta ^*\Vert\), where \(\Vert \theta _0-\theta ^*\Vert\) represents the initial misspecification in \(\theta ^*\). In addition, when the learning problem is assumed to be merely convex but admits a suitable weak-sharpness property, the convergence rate deteriorates to \({\mathcal {O}}(1/\sqrt{K})\). In both convex and strongly convex regimes, diminishing steplength schemes are also provided and are less reliant on the knowledge of problem parameters. Finally, we present an averaging-based subgradient scheme that displays a rate of \({\mathcal {O}}(1/\sqrt{K}) + {\mathcal {O}}(\Vert \theta _0-\theta ^*\Vert (1/K))\), implying no effect on the canonical rate of \({{{\mathcal {O}}}}(1/\sqrt{K})\). In the second part of the paper, we consider the solution of misspecified monotone variational inequality problems, motivated by the need to contend with more general equilibrium problems as well as the possibility of misspecification in the constraints. In this context, we first present a constant steplength misspecified extragradient scheme and prove its asymptotic convergence. This scheme is reliant on problem parameters (such as Lipschitz constants) and leads to a misspecified variant of iterative Tikhonov regularization, an avenue that does not necessitate the knowledge of such constants.
      PubDate: 2020-09-01
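
A toy illustration of the coupled iterates \((x_k,\theta _k)\): a gradient step on \(f(\cdot;\theta _k)\) interleaved with a running-average learning step for \(\theta\). The specific objective, learning rule, and constants are illustrative assumptions, not the paper's schemes.

```python
import numpy as np

rng = np.random.default_rng(1)

# Coupled scheme sketch: minimize f(x; theta*) = 0.5*||x - theta*||^2 while
# theta* is simultaneously estimated from noisy samples by a running average.
theta_star = np.array([2.0, -1.0])
x = np.zeros(2)
theta = np.zeros(2)
for k in range(1, 2001):
    sample = theta_star + 0.1 * rng.standard_normal(2)   # parallel learning data
    theta += (sample - theta) / k                        # theta_k -> theta*
    x -= 0.1 * (x - theta)                               # gradient step on f(.; theta_k)
print(np.round(x, 2), np.round(theta, 2))                # both near theta*
```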
       
  • Nonlinear optimal control: a numerical scheme based on occupation measures
           and interval analysis
    • Abstract: This paper presents an approximation scheme for optimal control problems using finite-dimensional linear programs and interval analysis. This is done in two parts. Following Vinter's approach (SIAM J Control Optim 31(2):518–538, 1993) and using occupation measures, the optimal control problem is written as an infinite-dimensional linear programming problem (weak formulation). Thanks to interval arithmetic, we provide a relaxation of this infinite-dimensional linear programming problem by a finite-dimensional linear programming problem. A proof that the optimal value of the finite-dimensional linear programming problem is a lower bound on the optimal value of the control problem is given. Moreover, according to the fineness of the discretization and the size of the chosen test function family, the optimal values of the finite-dimensional linear programming problems form a sequence of lower bounds which converges to the optimal value of the initial optimal control problem. Examples illustrate the principle of the methodology.
      PubDate: 2020-09-01
       
  • Weak convergence of iterative methods for solving quasimonotone
           variational inequalities
    • Abstract: In this work, we introduce self-adaptive methods for solving variational inequalities with Lipschitz continuous and quasimonotone mappings (or Lipschitz continuous mappings without monotonicity) in real Hilbert spaces. Under suitable assumptions, the convergence of the algorithms is established without knowledge of the Lipschitz constant of the mapping. The results obtained in this paper extend some recent results in the literature. Some preliminary numerical experiments and comparisons are reported.
      PubDate: 2020-08-07
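
A generic self-adaptive extragradient sketch in the spirit of the paper: the stepsize is backtracked whenever a local Lipschitz-type test fails, so no Lipschitz constant is needed a priori. The test, constants, and the monotone toy VI are illustrative assumptions, not the authors' algorithms.

```python
import numpy as np

def adaptive_extragradient(F, project, x0, step=1.0, tau=0.9, iters=500):
    # Extragradient (predict-correct) iterations; the stepsize is halved
    # whenever the local test  step*||F(x)-F(y)|| <= tau*||x-y||  fails,
    # so no global Lipschitz constant of F is required in advance.
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        while True:
            y = project(x - step * F(x))              # prediction step
            if step * np.linalg.norm(F(x) - F(y)) <= tau * np.linalg.norm(x - y):
                break
            step *= 0.5                               # adaptive backtracking
        x = project(x - step * F(y))                  # correction step
    return x

# Monotone affine VI over the box [-1, 1]^2; the solution solves A x + b = 0.
A = np.array([[2.0, 1.0], [-1.0, 2.0]])
b = np.array([-1.0, 1.0])
x = adaptive_extragradient(lambda v: A @ v + b,
                           lambda v: np.clip(v, -1.0, 1.0),
                           x0=np.zeros(2))
print(np.round(x, 3))   # ~ [0.6, -0.2]
```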
       
  • On the interplay between acceleration and identification for the proximal
           gradient algorithm
    • Abstract: In this paper, we study the interplay between acceleration and structure identification for the proximal gradient algorithm. While acceleration is generally beneficial in terms of functional decrease, we report and analyze several cases where its interplay with identification has negative effects on the algorithm behavior (iterates oscillation, loss of structure, etc.). Then, we present a generic method that tames acceleration when structure identification may be at stake; it benefits from a convergence rate that matches the one of the accelerated proximal gradient under some qualifying condition. We show empirically that the proposed method is much more stable in terms of subspace identification compared to the accelerated proximal gradient method while keeping a similar functional decrease.
      PubDate: 2020-08-06
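
For reference, the accelerated proximal gradient method (FISTA) on an \(\ell _1\)-regularized least-squares problem, where the prox makes iterates exactly sparse and support identification can be observed; a standard textbook sketch, not the paper's stabilized variant.

```python
import numpy as np

def fista(A, b, lam=0.5, iters=300):
    # Accelerated proximal gradient (FISTA) for  min 0.5||Ax-b||^2 + lam||x||_1.
    L = np.linalg.norm(A, 2) ** 2                     # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1]); y = x.copy()
    t = 1.0
    for _ in range(iters):
        z = y - A.T @ (A @ y - b) / L                 # gradient step at the extrapolated point
        x_new = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)   # soft-thresholding prox
        t_new = (1 + np.sqrt(1 + 4 * t * t)) / 2
        y = x_new + ((t - 1) / t_new) * (x_new - x)   # momentum / extrapolation
        x, t = x_new, t_new
    return x

rng = np.random.default_rng(0)
A = rng.standard_normal((30, 60))
x_true = np.zeros(60); x_true[:3] = 1.0
x = fista(A, A @ x_true)
print(np.count_nonzero(x))   # prox steps give exactly sparse iterates
```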
       
  • Consistent treatment of incompletely converged iterative linear solvers in
           reverse-mode algorithmic differentiation
    • Abstract: Algorithmic differentiation (AD) is a widely-used approach to compute derivatives of numerical models. Many numerical models include an iterative process to solve non-linear systems of equations. To improve efficiency and numerical stability, AD is typically not applied to the linear solvers. Instead, the differentiated linear solver call is replaced with hand-produced derivative code that exploits the linearity of the original call. In practice, the iterative linear solvers are often stopped prematurely to recompute the linearisation of the non-linear outer loop. We show that in the reverse-mode of AD, the derivatives obtained with partial convergence become inconsistent with the original and the tangent-linear models, resulting in inaccurate adjoints. We present a correction term that restores consistency between adjoint and tangent-linear gradients if linear systems are only partially converged. We prove the consistency of this correction term and show in numerical experiments that the accuracy of adjoint gradients of an incompressible flow solver applied to an industrial test case is restored when the correction term is used.
      PubDate: 2020-08-03
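
A tiny illustration of the hand-produced derivative code the abstract refers to: the reverse-mode rule for a linear solve exploits linearity instead of differentiating through solver iterations. The paper's correction term is not shown; the formulas below are the standard exact-solve adjoints on a made-up system.

```python
import numpy as np

# Forward: x = A^{-1} b. Reverse-mode rule exploiting linearity: given the
# output adjoint xbar, the input adjoints are
#   bbar = A^{-T} xbar   and   Abar = -bbar x^T.
# If the forward solve were only partially converged, these hand-coded
# adjoints would no longer match the derivative of the computed output,
# which is the inconsistency the paper corrects.
A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
x = np.linalg.solve(A, b)
xbar = np.array([1.0, 0.0])            # seed: differentiate the first output
bbar = np.linalg.solve(A.T, xbar)
Abar = -np.outer(bbar, x)
print(np.round(bbar, 4))
print(np.round(Abar, 4))
```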
       
  • Global optimization via inverse distance weighting and radial basis
           functions
    • Abstract: Global optimization problems whose objective function is expensive to evaluate can be solved effectively by recursively fitting a surrogate function to function samples and minimizing an acquisition function to generate new samples. The acquisition step trades off between seeking for a new optimization vector where the surrogate is minimum (exploitation of the surrogate) and looking for regions of the feasible space that have not yet been visited and that may potentially contain better values of the objective function (exploration of the feasible space). This paper proposes a new global optimization algorithm that uses inverse distance weighting (IDW) and radial basis functions (RBF) to construct the acquisition function. Rather arbitrary constraints that are simple to evaluate can be easily taken into account. Compared to Bayesian optimization, the proposed algorithm, that we call GLIS (GLobal minimum using Inverse distance weighting and Surrogate radial basis functions), is competitive and computationally lighter, as we show in a set of benchmark global optimization and hyperparameter tuning problems. MATLAB and Python implementations of GLIS are available at http://cse.lab.imtlucca.it/~bemporad/glis.
      PubDate: 2020-07-27
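
A hedged, self-contained caricature of the surrogate/acquisition loop (not the released GLIS code): a Gaussian RBF interpolant supplies the exploitation term, and a distance bonus stands in for the IDW exploration term. Kernel width, weights, and the test function are arbitrary assumptions.

```python
import numpy as np

def rbf_surrogate(X, y, eps=10.0):
    # Gaussian RBF interpolant of the samples (X, y); small ridge for stability.
    K = np.exp(-eps * ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1))
    w = np.linalg.solve(K + 1e-6 * np.eye(len(X)), y)
    return lambda Z: np.exp(-eps * ((Z[:, None, :] - X[None, :, :]) ** 2).sum(-1)) @ w

def acquisition(Z, X, surrogate, alpha=0.5):
    # Exploitation (surrogate value) minus a distance-based exploration bonus;
    # a crude stand-in for the IDW exploration function used by GLIS.
    d2 = ((Z[:, None, :] - X[None, :, :]) ** 2).sum(-1).min(axis=1)
    return surrogate(Z) - alpha * np.sqrt(d2)

f = lambda x: (x[:, 0] - 0.3) ** 2 + 0.1 * np.sin(8 * x[:, 0])   # "expensive" toy objective
rng = np.random.default_rng(0)
X = rng.uniform(0, 1, (5, 1)); y = f(X)                          # initial design
for _ in range(15):                                              # sequential sampling loop
    s = rbf_surrogate(X, y)
    Z = rng.uniform(0, 1, (200, 1))                              # random candidate pool
    z = Z[np.argmin(acquisition(Z, X, s))]                       # acquisition minimizer
    X = np.vstack([X, z]); y = np.append(y, f(z[None, :]))
print(np.round(X[np.argmin(y)], 3), round(float(y.min()), 3))    # best sample found
```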
       
  • On a numerical shape optimization approach for a class of free boundary
           problems
    • Abstract: This paper is devoted to a numerical method for the approximation of a class of free boundary problems of Bernoulli’s type, reformulated as optimal shape design problems with appropriate shape functionals. We show the existence of the shape derivative of the cost functional on a class of admissible domains and compute its shape derivative by using the formula proposed in Boulkhemair (SIAM J Control Optim 55(1):156–171, 2017) and Boulkhemair and Chakib (J Convex Anal 21(1):67–87, 2014), that is, by means of support functions. On the numerical level, this allows us to avoid the tedious computations of the method based on vector fields. A gradient method combined with a boundary element method is performed for the approximation of this problem, in order to overcome the re-meshing task required by the finite element method. Finally, we present some numerical results and simulations concerning practical applications, showing the effectiveness of the proposed approach.
      PubDate: 2020-07-21
       
  • A new method based on the proximal bundle idea and gradient sampling
           technique for minimizing nonsmooth convex functions
    • Abstract: In this paper, we combine the positive aspects of the gradient sampling (GS) and bundle methods, two of the most efficient classes of methods in nonsmooth optimization, to develop a robust method for solving unconstrained nonsmooth convex optimization problems. The main aim of the proposed method is to take advantage of both the GS and bundle methods while avoiding their drawbacks. At each iteration of this method, to find an efficient descent direction, the GS technique is utilized for constructing a local polyhedral model for the objective function. If necessary, via an iterative improvement process, this initial polyhedral model is improved by some techniques inspired by the bundle and GS methods. The convergence of the method is studied, which reveals that the global convergence property of our method is independent of the number of gradient evaluations required to establish and improve the initial polyhedral models. Thus, the presented method needs far fewer gradient evaluations in comparison to the original GS method. Furthermore, by means of numerical simulations, we show that the presented method provides promising results in comparison with GS methods, especially for large-scale problems. Moreover, in contrast with some bundle methods, our method is not very sensitive to the accuracy of the supplied gradients.
      PubDate: 2020-07-20
       
  • A variation of Broyden class methods using Householder adaptive transforms
    • Abstract: In this work we introduce and study novel Quasi Newton minimization methods based on a Hessian approximation Broyden Class-type updating scheme, where a suitable matrix \(\tilde{B}_k\) is updated instead of the current Hessian approximation \(B_k\) . We identify conditions which imply the convergence of the algorithm and, if exact line search is chosen, its quadratic termination. By a remarkable connection between the projection operation and Krylov spaces, such conditions can be ensured using low complexity matrices \(\tilde{B}_k\) obtained projecting \(B_k\) onto algebras of matrices diagonalized by products of two or three Householder matrices adaptively chosen step by step. Experimental tests show that the introduction of the adaptive criterion, which theoretically guarantees the convergence, considerably improves the robustness of the minimization schemes when compared with a non-adaptive choice; moreover, they show that the proposed methods could be particularly suitable to solve large scale problems where L-BFGS is not able to deliver satisfactory performance.
      PubDate: 2020-07-14
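
For orientation, the standard BFGS member of the Broyden class; the paper instead updates a Householder-structured projection of \(B_k\) rather than \(B_k\) itself, which this sketch does not attempt. The quadratic test and constants are illustrative.

```python
import numpy as np

def bfgs_update(B, s, y):
    # Standard BFGS update (Broyden class, phi = 0) of a Hessian approximation
    # B from a step s and gradient difference y; it enforces the secant
    # equation B_new @ s = y and preserves positive definiteness when y @ s > 0.
    Bs = B @ s
    return B - np.outer(Bs, Bs) / (s @ Bs) + np.outer(y, y) / (y @ s)

# On a quadratic with Hessian H, exact pairs satisfy y = H s.
H = np.array([[3.0, 1.0], [1.0, 2.0]])
B = np.eye(2)
rng = np.random.default_rng(0)
for _ in range(10):
    s = rng.standard_normal(2)
    B = bfgs_update(B, s, H @ s)
print(np.allclose(B @ s, H @ s))   # True: secant holds for the latest pair
print(np.round(B, 2))              # B agrees with H along recent step directions
```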
       
  • The distance between convex sets with Minkowski sum structure: application
           to collision detection
    • Abstract: The distance between sets is a long-standing computational geometry problem. In robotics, the distance between convex sets with Minkowski sum structure plays a fundamental role in collision detection. However, it is typically nontrivial to compute, even if the projection onto each component set admits an explicit formula. In this paper, we explore the problem of calculating the distance between convex sets arising in robotics. Building on recent progress in the convex optimization community, the proposed model can be efficiently solved by intensively studied first-order methods, e.g., the alternating direction method of multipliers or the primal-dual hybrid gradient method. Preliminary numerical results demonstrate that these first-order methods are fairly efficient in solving distance problems in robotics.
      PubDate: 2020-07-13
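
A first-order baseline for the set-distance problem: alternating projections between the two sets, assuming each projection is available in closed form. The paper's ADMM/PDHG treatment of Minkowski-sum-structured sets is not reproduced; the sets and names below are illustrative.

```python
import numpy as np

def set_distance(proj_A, proj_B, x0, iters=200):
    # Alternating projections: x in A and y in B converge to a pair of
    # nearest points of two disjoint closed convex sets.
    x = np.asarray(x0, dtype=float)
    y = x.copy()
    for _ in range(iters):
        y = proj_B(x)
        x = proj_A(y)
    return np.linalg.norm(x - y), x, y

# A = unit ball at the origin, B = box [2, 3] x [-1, 1]; dist(A, B) = 1.
proj_ball = lambda v: v if np.linalg.norm(v) <= 1 else v / np.linalg.norm(v)
proj_box = lambda v: np.clip(v, [2.0, -1.0], [3.0, 1.0])
d, p, q = set_distance(proj_ball, proj_box, x0=np.array([0.0, 0.5]))
print(round(d, 3), np.round(p, 3), np.round(q, 3))   # ~ 1.0 [1. 0.] [2. 0.]
```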
       
  • An explicit Tikhonov algorithm for nested variational inequalities
    • Abstract: We consider nested variational inequalities, consisting of an (upper-level) variational inequality whose feasible set is given by the solution set of another (lower-level) variational inequality. Purely hierarchical convex bilevel optimization problems and certain multi-follower games are particular instances of nested variational inequalities. We present an explicit and ready-to-implement Tikhonov-type solution method for such problems. We give conditions that guarantee the convergence of the proposed method. Moreover, inspired by recent works in the literature, we provide a convergence rate analysis. In particular, for the simple bilevel instance, we are able to obtain enhanced convergence results.
      PubDate: 2020-07-06
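
A schematic of the Tikhonov idea for nested problems: step on the lower-level map plus a vanishing multiple of the upper-level map, so the upper level selects among lower-level solutions. The maps, stepsizes, and vanishing schedule below are illustrative assumptions only, not the paper's method.

```python
import numpy as np

def tikhonov_nested(F_low, F_up, project, x0, iters=3000):
    # Explicit Tikhonov-type scheme for a nested VI: projection steps on
    # F_low + eps_k * F_up with eps_k -> 0, so the upper-level map acts as
    # a vanishing selection among lower-level solutions.
    x = np.asarray(x0, dtype=float)
    for k in range(1, iters + 1):
        eps = 1.0 / np.sqrt(k)
        step = 0.5 / np.sqrt(k)
        x = project(x - step * (F_low(x) + eps * F_up(x)))
    return x

# Lower level: solutions of VI(F_low, R^2) with F_low(x) = (x1 - 1, 0) form
# the line {x1 = 1}; upper level F_up(x) = x selects the minimum-norm point.
x = tikhonov_nested(lambda v: np.array([v[0] - 1.0, 0.0]),
                    lambda v: v,
                    lambda v: v,        # unconstrained: projection is the identity
                    x0=np.array([3.0, 2.0]))
print(np.round(x, 2))                   # tends to (1, 0) as eps -> 0
```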
       
  • Acceleration techniques for level bundle methods in weakly smooth convex
           constrained optimization
    • Abstract: We develop a unified level-bundle method, called the accelerated constrained level-bundle (ACLB) algorithm, for solving constrained convex optimization problems where the objective and constraint functions can be nonsmooth, weakly smooth, and/or smooth. ACLB employs Nesterov’s accelerated gradient technique, and hence retains the same iteration complexity as existing bundle-type methods if the objective or one of the constraint functions is nonsmooth. More importantly, ACLB can significantly reduce iteration complexity when the objective and all constraints are (weakly) smooth. In addition, if the objective contains a nonsmooth component which can be written as a specific form of maximum, we show that the iteration complexity of this component can be much lower than that for a general nonsmooth objective function. Numerical results demonstrate the effectiveness of the proposed algorithm.
      PubDate: 2020-07-06
       
  • Make \(\ell _1\) regularization effective in training sparse CNN
    • Abstract: Compressed sensing using \(\ell _1\) regularization is among the most powerful and popular sparsification techniques in many applications, but why has it not been used to obtain sparse deep learning models such as convolutional neural networks (CNNs)? This paper aims to provide an answer to this question and to show how to make it work. Following Xiao (J Mach Learn Res 11(Oct):2543–2596, 2010), we first demonstrate that the commonly used stochastic gradient descent training algorithm and its variants are not an appropriate match for \(\ell _1\) regularization, and we then replace it with a different training algorithm based on a regularized dual averaging (RDA) method. The RDA method of Xiao (J Mach Learn Res 11(Oct):2543–2596, 2010) was originally designed specifically for convex problems, but with new theoretical insight and algorithmic modifications (using proper initialization and adaptivity), we have made it an effective match with \(\ell _1\) regularization, achieving state-of-the-art sparsity for the highly non-convex CNN compared to other weight pruning methods without compromising accuracy (achieving 95% sparsity for ResNet-18 on CIFAR-10, for example).
      PubDate: 2020-07-04
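
A convex toy run of \(\ell _1\)-RDA, the training algorithm the paper adapts: the iterate is a closed-form soft-thresholding of the running gradient average, so it is exactly sparse along the way. The quadratic objective and constants are illustrative; this is not the paper's CNN training code.

```python
import numpy as np

def rda_l1(grad, n, lam=0.1, gamma=1.0, iters=20000):
    # Regularized dual averaging (RDA, Xiao 2010) with an l1 regularizer:
    # the iterate minimizes <gbar, w> + lam*||w||_1 + (gamma/(2*sqrt(t)))*||w||^2
    # in closed form, i.e. a soft-thresholding of the running gradient average
    # gbar -- hence the iterates are exactly sparse.
    w, gbar = np.zeros(n), np.zeros(n)
    for t in range(1, iters + 1):
        gbar += (grad(w) - gbar) / t                 # dual (gradient) averaging
        w = -(np.sqrt(t) / gamma) * np.sign(gbar) * np.maximum(np.abs(gbar) - lam, 0.0)
    return w

# min 0.5*||w - c||^2 + lam*||w||_1 has the soft-thresholded solution soft(c, lam).
c = np.array([1.0, 0.05, -0.5, 0.0])
print(np.round(rda_l1(lambda w: w - c, n=4), 2))   # ~ [0.9, 0, -0.4, 0]
```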
       
  • Inverse point source location with the Helmholtz equation on a bounded
           domain
    • Abstract: The problem of recovering acoustic sources, more specifically monopoles, from point-wise measurements of the corresponding acoustic pressure at a limited number of frequencies is addressed. To this purpose, a family of sparse optimization problems in measure space in combination with the Helmholtz equation on a bounded domain is considered. A weighted norm with unbounded weight near the observation points is incorporated into the formulation. Optimality conditions and conditions for recovery in the small noise case are discussed, which motivates concrete choices of the weight. The numerical realization is based on an accelerated conditional gradient method in measure space and a finite element discretization.
      PubDate: 2020-06-29
       
  • A second-order shape optimization algorithm for solving the exterior
           Bernoulli free boundary problem using a new boundary cost functional
    • Abstract: The exterior Bernoulli problem is rephrased into a shape optimization problem using a new type of objective function called the Dirichlet-data-gap cost function, which measures the \(L^2\)-distance between the Dirichlet data of two state functions. The first-order shape derivative of the cost function is explicitly determined via the chain rule approach. Using the same technique, the second-order shape derivative of the cost function at the solution of the free boundary problem is also computed. The gradient and Hessian information is then used to formulate an efficient second-order gradient-based descent algorithm to numerically solve the minimization problem. The feasibility of the proposed method is illustrated through various numerical examples.
      PubDate: 2020-06-25
       
 