
The Lawson-Hanson Algorithm with Deviation Maximization: Finite Convergence and Sparse Recovery

Monica Dessole; Fabio Marcuzzi
2023

Abstract

In this work we devise a new version of the Lawson-Hanson with Deviation Maximization (LHDM) algorithm, a block method for the solution of NonNegative Least Squares (NNLS) problems. We show that the new algorithm presented here converges in a finite number of steps under suitable conditions. An extensive campaign of experiments is carried out to evaluate the performance gain with respect to the standard Lawson-Hanson algorithm. We also explore the sparse recovery ability of LHDM, comparing it against several l1-minimization solvers in terms of solution quality and time-to-solution on a large set of dense instances.
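As context for the abstract, the sketch below sets up the NNLS problem that LHDM targets, min_x ||Ax - b||_2 subject to x >= 0, and solves a small dense instance with SciPy's active-set NNLS routine, which is in the Lawson-Hanson family of methods the paper uses as a baseline. The problem sizes, sparsity level, and thresholds are illustrative assumptions, not values from the paper.

```python
# Minimal NNLS example: min ||A x - b||_2  s.t.  x >= 0,
# solved with SciPy's active-set solver as a reference baseline.
import numpy as np
from scipy.optimize import nnls

rng = np.random.default_rng(0)
m, n, k = 200, 100, 10          # rows, columns, nonzeros in the ground truth

A = rng.standard_normal((m, n))
x_true = np.zeros(n)
support = rng.choice(n, size=k, replace=False)
x_true[support] = rng.uniform(0.5, 2.0, size=k)
b = A @ x_true                   # noiseless right-hand side

x_hat, res_norm = nnls(A, b)     # returns the solution and ||A x_hat - b||_2
print("residual norm:", res_norm)
print("recovered support size:", np.count_nonzero(x_hat > 1e-8))
```

In sparse recovery experiments like those described in the abstract, the quality of such a solution is typically judged by how well the recovered support matches the true support, alongside the time-to-solution of the solver.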
Files for this record:
LHDM_revised_preprint.pdf — Preprint (submitted version), open access, Adobe PDF, 986.6 kB
Use this identifier to cite or link to this document: https://hdl.handle.net/11577/3400338
Citations: Scopus 9; Web of Science 8