In this repository, my colleague Nimra Nawaz and I implemented core and advanced optimization algorithms, namely Steepest Descent and the quasi-Newton method BFGS, to explore the problem of estimating the matrix 2-norm as an unconstrained optimization problem. The project was developed for the Optimization for Data Science course taught by Prof. Antonio Frangioni at Università di Pisa in the academic year 2023/24.
(P) is the problem of estimating the matrix norm ||A||₂ for a (possibly rectangular) matrix A ∈ ℝ^{m×n}, using its definition as an unconstrained maximum problem.
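In other words, (P) maximizes the Rayleigh quotient of AᵀA; one standard way to write this definition (using common notation, not necessarily the exact formulation in our report) is

$$\|A\|_2 \;=\; \max_{x \neq 0} \frac{\|Ax\|_2}{\|x\|_2}
\qquad\Longleftrightarrow\qquad
\|A\|_2^2 \;=\; \max_{x \neq 0} \; f(x) = \frac{x^\top A^\top A x}{x^\top x},$$

so the optimal value of the unconstrained maximization is the largest eigenvalue of AᵀA, and its square root is the matrix 2-norm.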
(A1) is a standard steepest (gradient) descent approach.
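A minimal Python/NumPy sketch of such a steepest-ascent iteration on f, with a simple Armijo-style backtracking line search, might look as follows (function names are illustrative; this is not the repository's actual code, and convergence to the largest eigenvalue is only what one typically observes from a random start):

```python
import numpy as np

def f_and_grad(x, A):
    """Rayleigh quotient f(x) = ||Ax||^2 / ||x||^2 and its gradient."""
    Ax = A @ x
    xx = x @ x
    fx = (Ax @ Ax) / xx
    grad = 2.0 * (A.T @ Ax - fx * x) / xx
    return fx, grad

def steepest_ascent(A, x0, tol=1e-8, max_iter=10000):
    """Gradient (steepest) ascent on f with backtracking (illustrative sketch)."""
    x = x0.astype(float).copy()
    for _ in range(max_iter):
        fx, g = f_and_grad(x, A)
        if np.linalg.norm(g) <= tol:  # approximately stationary point
            break
        alpha = 1.0
        # halve the step until a sufficient-increase (Armijo) condition holds
        while f_and_grad(x + alpha * g, A)[0] < fx + 1e-4 * alpha * (g @ g) and alpha > 1e-12:
            alpha *= 0.5
        x = x + alpha * g
    return np.sqrt(f_and_grad(x, A)[0])  # estimate of ||A||_2

# usage: compare with NumPy's reference value
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 30))
print(steepest_ascent(A, rng.standard_normal(30)), np.linalg.norm(A, 2))
```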
(A2) is a quasi-Newton method such as BFGS or L-BFGS.
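For comparison only (this is not the BFGS implementation developed in this repository), the same maximization can be handed to an off-the-shelf quasi-Newton solver by minimizing −f; a sketch using SciPy's L-BFGS-B:

```python
import numpy as np
from scipy.optimize import minimize

def neg_f_and_grad(x, A):
    """Negated Rayleigh quotient and its gradient, so that a minimizer maximizes f."""
    Ax = A @ x
    xx = x @ x
    fx = (Ax @ Ax) / xx
    grad = 2.0 * (A.T @ Ax - fx * x) / xx
    return -fx, -grad

def norm2_estimate(A, x0):
    """Estimate ||A||_2 by minimizing -f with SciPy's limited-memory BFGS."""
    res = minimize(neg_f_and_grad, x0, args=(A,), jac=True, method="L-BFGS-B")
    return np.sqrt(-res.fun)

rng = np.random.default_rng(1)
A = rng.standard_normal((40, 25))
print(norm2_estimate(A, rng.standard_normal(25)), np.linalg.norm(A, 2))
```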
- Learned mathematical concepts necessary to construct algorithms for the solution of optimization problems.
- Understanding of the mathematics behind the optimization of convex and non-convex multivariate functions.
- Univariate continuous unconstrained optimization.
- Multivariate continuous unconstrained smooth optimization.
- Multivariate continuous unconstrained nonsmooth optimization.
- Sparse hints at Data Science applications.