Talk:Low-rank approximation


It seems to me that in Recommender system applications, the low-rank approximation may consist of categorical data, but that is not necessarily the case.

Similarly, in Machine Learning (including Recommender systems), the data may be nonlinearly structured, but that is not necessarily the case. — Preceding unsigned comment added by AndrewMcN (talk · contribs) 07:20, 25 December 2013 (UTC)

Proof of matrix approximation theorem

While the theorem is stated in terms of the Frobenius norm, the proof is given for the spectral norm. This should be fixed.

It is also possible to formulate it as a direct proof: take the span of the first <math>k+1</math> right singular vectors of <math>A</math> and the null space <math>\ker(B)</math> of any competing matrix <math>B</math> with <math>\operatorname{rank}(B) \le k</math>. By the dimension formula, the intersection is non-trivial, so we can choose <math>x</math> in it with <math>\|x\|_2 = 1</math>. This leads directly to <math>\|(A - B)x\|_2 \ge \sigma_{k+1}</math>, proving <math>\|A - B\|_2 \ge \sigma_{k+1}</math>.
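
In detail (writing <math>v_1, \dots, v_{k+1}</math> for those right singular vectors, <math>\sigma_1 \ge \sigma_2 \ge \dots</math> for the singular values of <math>A</math>, and <math>n</math> for the number of columns; the notation is chosen here for illustration and may not match the article exactly), the dimension count is <math>\dim \operatorname{span}(v_1, \dots, v_{k+1}) + \dim \ker(B) \ge (k+1) + (n-k) > n</math>, and for a unit vector <math>x</math> in the intersection

<math display="block">\|A - B\|_2^2 \ \ge\ \|(A - B)x\|_2^2 \ =\ \|Ax\|_2^2 \ =\ \sum_{i=1}^{k+1} \sigma_i^2 \,(v_i^\top x)^2 \ \ge\ \sigma_{k+1}^2 .</math>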

I assume that proving the theorem for the Frobenius norm might pose a greater challenge.

The proof for the Frobenius norm can be found in the 1936 paper "The approximation of one matrix by another of lower rank" in the reference section (the PDF can be found with Google Scholar). However, I'm not sure whether this is the simplest proof known to date. Also, what should we do about the proof for the spectral norm? Should we modify the problem description to account for both cases? Bbbbbbbbba (talk) 03:26, 20 November 2014 (UTC)

The current proof for the Frobenius norm is wrong; there is no quick justification for the "clearly..." step. One nice (valid) proof is given here: http://math.stackexchange.com/a/759174/81360 (via Weyl's inequalities). Bengski68 (talk) 09:33, 21 June 2016 (UTC)
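
For reference, here is a sketch of that argument in my own notation (<math>A_k</math> for the truncated rank-<math>k</math> SVD of <math>A</math>, <math>B</math> for any matrix with <math>\operatorname{rank}(B) \le k</math>): the singular-value form of Weyl's inequalities gives

<math display="block">\sigma_{i+k}(A) \ \le\ \sigma_i(A - B) + \sigma_{k+1}(B) \ =\ \sigma_i(A - B),</math>

since <math>\operatorname{rank}(B) \le k</math> forces <math>\sigma_{k+1}(B) = 0</math>. Summing squares over <math>i</math> then yields

<math display="block">\|A - B\|_F^2 \ =\ \sum_i \sigma_i(A - B)^2 \ \ge\ \sum_{j > k} \sigma_j(A)^2 \ =\ \|A - A_k\|_F^2 .</math>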

Proof of uniqueness?

The theorem statement mentions uniqueness but there is no uniqueness argument in the proof. This also needs to be fixed. Uniqueness can hold only up to orthogonal rotations when some of the r largest singular values are repeated. Jfessler (talk) 22:27, 9 January 2016 (UTC)
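
As a concrete illustration of that caveat (my example, not from the article): for <math>A = I_2</math> the two singular values coincide, <math>\sigma_1 = \sigma_2 = 1</math>, and every rank-1 orthogonal projector <math>B = uu^\top</math> with <math>\|u\|_2 = 1</math> attains the optimal error

<math display="block">\|I_2 - uu^\top\|_2 \ =\ \|I_2 - uu^\top\|_F \ =\ 1 \ =\ \sigma_2 ,</math>

so the best rank-1 approximation is unique only up to the choice of <math>u</math>.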

Alternating projections algorithm - what size should w be?

I can't get this to run: every way I try it, the code crashes out on line 6. What size should w be?

 d = rand(4,3); w = ones(size(d)); [u s v] = svd(d, 0); p = v(:,1:2); tol = 1e-6; maxiter = 10;
 >> wlra_ap(d, w, p, tol, maxiter)
 Error using *
 Inner matrix dimensions must agree.
 Error in wlra_ap (line 6)
    vl = (bp' * w * bp) \ bp' * w * d(:);

Can someone add a working example? Thanks. — Preceding unsigned comment added by 76.88.33.36 (talk) 16:13, 16 February 2018 (UTC)
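
Not a definitive answer, but judging from line 6 of the listing (bp' * w * bp multiplying d(:)), w appears to act on the vectorized data and therefore needs to be of size m*n-by-m*n rather than m-by-n, and p appears to need m rows (one per row of d). Under those assumptions, which I have not checked against the full listing, a call along the following lines should at least avoid the dimension mismatch:

 % Hypothetical call -- assumes w is an (m*n)-by-(m*n) weight matrix acting
 % on the vectorized data d(:), and that p is an m-by-r initial basis for the
 % column space (both inferred from line 6, not verified against the listing).
 d = rand(4, 3);
 [m, n] = size(d);
 w = eye(m * n);           % identity weights, i.e. the plain unweighted problem
 [u, s, v] = svd(d, 0);
 p = u(:, 1:2);            % initial m-by-r basis with r = 2
 tol = 1e-6;
 maxiter = 100;
 wlra_ap(d, w, p, tol, maxiter)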

Conflict of interest problem

Recently someone at 192.33.105.77 removed several references to sources by I. Markovsky, calling attention to the fact that Markovsky themselves (User:Imarkovs) added those references. From my understanding of WP:SELFCITE, such references are allowed as long as they are relevant and not excessive. Seeing as User:Imarkovs also created the first versions of this page, and those references have been there for a long time, I am unsure whether they should count as excessive, even though 4 of the 6 references added by User:Imarkovs are self-citations (one of them was later converted to an external link; see Special:PermanentLink/493378742).

As I am currently not very familiar with the topic of this article, I'd like to discuss with the community what to do here. In my opinion, if the text related to the references (such as the current first bullet point in the "Applications" section) is significant, those references should either be restored or replaced by more relevant ones. Otherwise, maybe the text itself should be removed or revised as appropriate. Bbbbbbbbba (talk) 22:24, 9 May 2023 (UTC)