Sample-Efficient Low Rank Phase Retrieval

2021 
This work studies the Low Rank Phase Retrieval (LRPR) problem: recover an $n \times q$ rank-$r$ matrix $\boldsymbol{X}^{\ast}$ from $\boldsymbol{y}_{k} = |\boldsymbol{A}_{k}^\top \boldsymbol{x}^{\ast}_{k}|$, $k = 1, 2, \ldots, q$, when each $\boldsymbol{y}_{k}$ is an $m$-length vector containing independent phaseless linear projections of $\boldsymbol{x}^{\ast}_{k}$. Here $|\cdot|$ takes element-wise magnitudes of a vector. The matrices $\boldsymbol{A}_{k}$ are i.i.d., and each contains i.i.d. standard Gaussian entries. We obtain an improved guarantee for AltMinLowRaP, an Alternating Minimization solution to LRPR that was introduced and studied in our recent work. As long as the right singular vectors of $\boldsymbol{X}^{\ast}$ satisfy the incoherence assumption, we show that the AltMinLowRaP estimate converges geometrically to $\boldsymbol{X}^{\ast}$ if the total number of measurements satisfies $mq \gtrsim nr^{2}(r + \log(1/\epsilon))$. In addition, we also need $m \gtrsim \max(r, \log q, \log n)$ because of the specific asymmetric nature of our problem. Compared to our recent work, we improve the sample complexity of the AltMin iterations by a factor of $r^{2}$, and that of the initialization by a factor of $r$. We argue, based on comparison with related well-studied problems, that the above sample complexity cannot be improved any further for non-convex solutions to LRPR. We also extend our result to the noisy case: we prove stability to corruption by small additive noise.
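To make the measurement model concrete, the following is a minimal NumPy sketch that generates a rank-$r$ matrix $\boldsymbol{X}^{\ast}$ and the phaseless measurements $\boldsymbol{y}_{k} = |\boldsymbol{A}_{k}^\top \boldsymbol{x}^{\ast}_{k}|$. The dimensions, random seed, and variable names are illustrative assumptions, not values from the paper.

```python
import numpy as np

# Illustrative (assumed) problem sizes: n-dim signals, q columns, rank r, m measurements each
n, q, r, m = 50, 100, 3, 200
rng = np.random.default_rng(0)

# Rank-r ground truth X* = U* B*, whose k-th column is x*_k (length n)
U_star = np.linalg.qr(rng.standard_normal((n, r)))[0]   # n x r orthonormal column span
B_star = rng.standard_normal((r, q))                    # r x q coefficient matrix
X_star = U_star @ B_star                                # n x q matrix of rank r

# Independent n x m Gaussian measurement matrices A_k, stacked along the first axis
A = rng.standard_normal((q, n, m))

# Phaseless measurements: column k of Y is y_k = |A_k^T x*_k|, an m-length vector
Y = np.abs(np.einsum('knm,nk->mk', A, X_star))          # shape (m, q)
```

Any recovery method only sees `Y` and the `A_k`; the per-column sample complexity quoted above refers to how large `m * q` (and `m` itself) must be for such data to determine `X_star` up to the per-column phase ambiguity.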