Quantum Algorithms for Least Squares Regression
Quantum computing offers fast algorithms for least squares regression on overdetermined linear systems, and for computing statistical leverage scores and matrix coherence. These algorithms exploit quantum mechanics to achieve speed-ups over classical methods, at the price of outputting approximate, quantum-sketch solutions.
Fast quantum algorithms for Least Squares Regression and Statistic Leverage Scores
Yang Liu, Shengyu Zhang
The Chinese University of Hong Kong
Part I. Linear regression: output a quantum sketch of the solution. Part II. Computing leverage scores and matrix coherence: output the target numbers.
Part I: Linear regression
Solve the overdetermined linear system Ax = b, where A ∈ ℝ^(n×p), n ≥ p, b ∈ ℝ^n, x ∈ ℝ^p. Goal of Least Squares Regression (LSR): compute argmin_x ‖Ax − b‖₂.
Closed-form solution
A closed-form solution is known: x* = A⁺b = (AᵀA)⁻¹Aᵀb, where A⁺ is the Moore-Penrose pseudo-inverse of A. If the SVD of A is A = Σ_{i=1}^{r} σᵢ uᵢ vᵢᵀ with r = rank(A), then A⁺ = Σ_{i=1}^{r} σᵢ⁻¹ vᵢ uᵢᵀ. Classical complexity: O(np² + p³), prohibitively slow for big matrices A.
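As an illustration of the closed-form solution above (a NumPy sketch, not part of the slides), the pseudo-inverse, the normal equations, and a library least-squares solver all agree on x*:

```python
import numpy as np

# Closed-form LSR: x* = A^+ b = (A^T A)^{-1} A^T b for an overdetermined system.
rng = np.random.default_rng(0)
n, p = 8, 3
A = rng.standard_normal((n, p))
b = rng.standard_normal(n)

x_pinv = np.linalg.pinv(A) @ b                  # Moore-Penrose pseudo-inverse
x_normal = np.linalg.solve(A.T @ A, A.T @ b)    # normal equations
x_lstsq = np.linalg.lstsq(A, b, rcond=None)[0]  # library least-squares solver

assert np.allclose(x_pinv, x_normal) and np.allclose(x_pinv, x_lstsq)
```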
Relaxations
Relaxation: approximate — output x̃ ≈ x*. Important special case: sparse and low-rank A, solvable classically in time roughly O(n + p), up to factors polynomial in s and k*1,2, where s = # of non-zero entries in each row/column and k = rank(A). Quantum speedup? Even writing down the solution x* takes linear time.
*1. K. Clarkson, D. Woodruff. STOC, 2013. *2. J. Nelson, H. Nguyen. FOCS, 2013.
Quantum sketch
A similar issue arises in solving the linear system Ax = b for full-rank A ∈ ℝ^(n×n), with closed-form solution x = A⁻¹b. [HHL09]*1 outputs |x⟩ ∝ x in time O(κ²s² log n). Here the condition number is κ = σ₁/σ_n, where σ₁ ≥ … ≥ σ_n > 0 are A's singular values; s is the sparsity (non-zeros per row); ∝ means "proportional to".
*1. A. Harrow, A. Hassidim, S. Lloyd, PRL, 2009.
Controversy
Useless? One cannot read out each solution variable xᵢ from |x⟩. Useful? As an intermediate step, e.g. when some global information about x is needed: ⟨x, b'⟩ can be obtained from |x⟩ by the SWAP test. Could a classical algorithm also run in polylog(n) time? Impossible unless P = BQP.
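The SWAP test mentioned above outputs ancilla 0 with probability (1 + |⟨ψ|φ⟩|²)/2, so repeated runs estimate the squared overlap of two unit vectors. A classical simulation of those measurement statistics (illustrative only, not the quantum circuit itself):

```python
import numpy as np

# SWAP-test statistics: Pr[ancilla = 0] = (1 + |<psi|phi>|^2) / 2,
# so sampling outcomes lets us estimate the overlap |<psi, phi>|^2.
rng = np.random.default_rng(1)
psi = rng.standard_normal(4); psi /= np.linalg.norm(psi)
phi = rng.standard_normal(4); phi /= np.linalg.norm(phi)

p0 = 0.5 * (1.0 + np.dot(psi, phi) ** 2)   # exact ancilla-0 probability
samples = rng.random(200_000) < p0         # simulated measurement outcomes
overlap_est = 2.0 * samples.mean() - 1.0   # invert p0 = (1 + overlap)/2

assert abs(overlap_est - np.dot(psi, phi) ** 2) < 0.01
```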
LSR results
Back to the overdetermined system: x* = A⁺b. [WBL12]*1: output |x*⟩ ∝ x* in time O(log(n+p) · s³κ⁶). Ours: the same approximation in time O(log(n+p) · s²κ³), with a simpler algorithm. We can also estimate ‖x*‖, which is used for, e.g., computing ⟨x*, b'⟩. Extensions: Ridge Regression, Truncated SVD.
*1. N. Wiebe, D. Braun, S. Lloyd, PRL, 2012.
Our algorithm for LSR
Input: Hermitian A ∈ ℝ^(n×n) with spectral decomposition A = Σ_{j=1}^{k} λⱼ vⱼ vⱼᵀ, where 1 ≥ λ₁ ≥ … ≥ λ_k ≥ 1/κ and the remaining λⱼ's are 0; a vector b ∈ ℝ^n. (The non-Hermitian case reduces to the Hermitian one.) Output: |x̃⟩ ≈ |x*⟩ and ‖x*‖. Note: writing b = Σⱼ βⱼ vⱼ, the desirable output is A⁺b = Σ_{j∈[k]} λⱼ⁻¹ βⱼ vⱼ.
Algorithm
Tool: the Phase Estimation quantum algorithm, which outputs the eigenvalue for a given eigenvector.
Σ_{j∈[n]} βⱼ|vⱼ⟩ → Σ_{j∈[n]} βⱼ|vⱼ⟩|λⱼ⟩ // phase estimation
→ Σ_{j≤k} βⱼ|vⱼ⟩|λⱼ⟩( (1/(2κλⱼ))|1⟩ + √(1 − 1/(2κλⱼ)²)|0⟩ ) + Σ_{j>k} βⱼ|vⱼ⟩|λⱼ⟩|0⟩ // attach |0⟩, rotate if λⱼ ≥ 1/(2κ)
→ ∝ Σ_{j≤k} λⱼ⁻¹ βⱼ|vⱼ⟩ // select the |1⟩ component
which is just (proportional to) A⁺b.
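The linear-algebra content of the steps above can be checked classically (a sketch, not the quantum algorithm): expand b in A's eigenbasis, invert only the eigenvalues above the cutoff 1/κ, and compare with the pseudo-inverse solution A⁺b:

```python
import numpy as np

# Eigenbasis filtering: keep 1/lambda_j only for lambda_j >= 1/kappa,
# mirroring the rotate-and-postselect step of the quantum algorithm.
rng = np.random.default_rng(2)
n, kappa = 6, 2.0
V = np.linalg.qr(rng.standard_normal((n, n)))[0]  # orthonormal eigenvectors
lam = np.array([1.0, 0.8, 0.5, 0.0, 0.0, 0.0])    # rank-3 spectrum, kappa = 2
A = V @ np.diag(lam) @ V.T
b = rng.standard_normal(n)

beta = V.T @ b                                    # beta_j = <v_j, b>
inv = np.where(lam >= 1.0 / kappa,
               1.0 / np.where(lam == 0.0, 1.0, lam), 0.0)
x = V @ (inv * beta)                              # sum_j beta_j / lambda_j * v_j

assert np.allclose(x, np.linalg.pinv(A) @ b)      # matches A^+ b
```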
Extension 1: Ridge regression
What about ill-conditioned (i.e. large κ) input? Two classical remedies exist. The first is ridge regression: min_x ‖Ax − b‖₂² + λ‖x‖₂², with closed-form solution x* = (AᵀA + λI)⁻¹Aᵀb. Previous algorithms: O(np² + p³) in general; roughly O(n + p) (up to poly(s, k) factors) for sparse and low-rank A. Ours: O(log(n+p) · s²κ_λ³), where κ_λ is the effective condition number of the regularized matrix AᵀA + λI.
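The ridge closed form can be sketched in NumPy; the check below verifies x* by confirming that the gradient of the regularized objective vanishes there:

```python
import numpy as np

# Ridge regression closed form: x* = (A^T A + lam*I)^{-1} A^T b.
# The regularizer lam keeps the system well conditioned even when A is not.
rng = np.random.default_rng(3)
n, p, lam = 10, 4, 0.1
A = rng.standard_normal((n, p))
b = rng.standard_normal(n)

x_ridge = np.linalg.solve(A.T @ A + lam * np.eye(p), A.T @ b)

# x_ridge minimizes ||Ax - b||^2 + lam*||x||^2, so the (half-)gradient
# A^T(Ax - b) + lam*x must vanish at the solution.
grad = A.T @ (A @ x_ridge - b) + lam * x_ridge
assert np.allclose(grad, 0)
```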
Extension 2: Truncated SVD
Goal: min_x ‖A_δ x − b‖₂, where A_δ = A with singular values < δ truncated. Ours: time O(log(n+p) · s²/min{δ³, δ²·gap}), where gap = σ_k − σ_{k+1} with σ_k > δ ≥ σ_{k+1}.
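Classically, the truncated-SVD solution drops the small singular values before inverting. A minimal sketch of that computation:

```python
import numpy as np

# Truncated-SVD regression: invert only singular values >= delta,
# i.e. solve min_x ||A_delta x - b||_2.
rng = np.random.default_rng(4)
n, p, delta = 10, 4, 0.5
A = rng.standard_normal((n, p))
b = rng.standard_normal(n)

U, s, Vt = np.linalg.svd(A, full_matrices=False)
keep = s >= delta                                 # truncate small singular values
x_trunc = Vt[keep].T @ ((U[:, keep].T @ b) / s[keep])

# With no truncation this recovers the pseudo-inverse solution A^+ b.
assert np.allclose(Vt.T @ ((U.T @ b) / s), np.linalg.pinv(A) @ b)
```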
Part II. Statistic leverage scores
Let A have SVD A = Σ_{i=1}^{r} σᵢ uᵢ vᵢᵀ. The i-th leverage score is sᵢ = ‖Uᵢ‖₂², where Uᵢ is the i-th row of U = [u₁, …, u_r]. Matrix coherence: max_i sᵢ. The leverage score sᵢ measures the importance of row i. It is a well-studied measure, very useful in large-scale data analysis, matrix algorithms, outlier detection, low-rank matrix approximation, etc.*1
*1. M. Mahoney, Randomized Algorithms for Matrices and Data, Foundations & Trends in Machine Learning, 2010.
Computing leverage scores
The classical algorithm*1 finds all the sᵢ in time O(np); no better algorithm is known for finding max_i sᵢ. Our quantum algorithms: finding each sᵢ: O(log(n+p)); finding all sᵢ: O(n log(n+p)); finding max_i sᵢ: O(√n log(n+p)).
*1. P. Drineas, M. Magdon-Ismail, M. Mahoney, D. Woodruff. J. MLR, 2012.
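The classical definition above is easy to state in NumPy (an illustrative sketch): the leverage scores are the squared row norms of the left singular factor, and they sum to rank(A).

```python
import numpy as np

# Statistical leverage scores: s_i = ||U_i||^2, the squared norm of
# row i of the left singular factor U; coherence is their maximum.
rng = np.random.default_rng(5)
n, p = 8, 3
A = rng.standard_normal((n, p))

U = np.linalg.svd(A, full_matrices=False)[0]  # n x p, orthonormal columns
scores = (U ** 2).sum(axis=1)                 # leverage score of each row
coherence = scores.max()                      # matrix coherence

# For a full-column-rank matrix the scores lie in [0, 1] and sum to p.
assert np.isclose(scores.sum(), p)
assert coherence <= 1.0 + 1e-12
```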
Algorithm for leverage scores
Input: rank-k Hermitian A ∈ ℝ^(n×n) with A = Σ_{j=1}^{k} λⱼ vⱼ vⱼᵀ, where 1 ≥ λ₁ ≥ … ≥ λ_k ≥ 1/κ and λ_{k+1} = … = λ_n = 0; an index i ∈ [n]. Output: s̃ᵢ ≈ sᵢ(A).
Key Lemma: sᵢ = ‖Uᵢ‖² = (AA⁺)ᵢᵢ = eᵢᵀ AA⁺ eᵢ = Σ_{j=1}^{k} vⱼ(i)², since AA⁺ = Σ_{j=1}^{k} vⱼ vⱼᵀ is the projector onto the column space of A.
Algorithm
|eᵢ⟩ = Σ_{j∈[n]} vⱼ(i)|vⱼ⟩ → Σ_{j∈[n]} vⱼ(i)|vⱼ⟩|λⱼ⟩ // phase estimation
→ Σ_{j∈[n]} vⱼ(i)|vⱼ⟩|λⱼ⟩|qⱼ⟩ // last qubit rotated to |1⟩ if λⱼ ≥ 1/(2κ), left as |0⟩ otherwise
Estimate the probability of observing 1 when measuring the last qubit: Pr[1] = Σ_{j=1}^{k} vⱼ(i)² = sᵢ, the target.
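The identity behind that measurement step can be verified with a classical calculation (not a circuit): for rank-k Hermitian A, the probability of seeing |1⟩ equals Σ_{j≤k} vⱼ(i)² = (AA⁺)ᵢᵢ = sᵢ.

```python
import numpy as np

# Check: sum of v_j(i)^2 over the nonzero eigenvalues equals (A A^+)_{ii}.
rng = np.random.default_rng(6)
n, i = 5, 0
V = np.linalg.qr(rng.standard_normal((n, n)))[0]  # orthonormal eigenvectors
lam = np.array([1.0, 0.7, 0.0, 0.0, 0.0])         # rank-2 spectrum
A = V @ np.diag(lam) @ V.T

pr_one = np.sum(V[i, lam > 0] ** 2)               # Pr[1] in the algorithm
s_i = (A @ np.linalg.pinv(A))[i, i]               # leverage score (A A^+)_{ii}

assert np.isclose(pr_one, s_i)
```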
Summary
We give efficient quantum algorithms for two canonical problems on sparse inputs: least squares regression, and statistical leverage scores. The problems are linear-algebraic, rather than group-, number-, or polynomial-theoretic.