Talk:Kronecker product
This article is rated Start-class on Wikipedia's content assessment scale.
References
Is it possible to insert a reference in which one can find proofs for the listed properties of the Kronecker product? E.g., Abstract properties / 1. Spectrum needs, in my opinion, a citation. — Preceding unsigned comment added by 161.116.80.135 (talk) 15:55, 10 September 2012 (UTC)
This edit request by an editor with a conflict of interest has now been answered.
I suggest we could add some other applications of the Khatri-Rao product.
Please note that one of the articles cited is of my authorship. Nevertheless, it gives new applications of the Khatri-Rao product, and, more importantly, presents a new relationship between the Khatri-Rao and the Kronecker product, which is in accordance with the contents of this Wikipage. If you wish to verify the contents of the paper I listed, you can find a copy of it on my webpage http://www.decom.fee.unicamp.br/~masiero or directly at http://www.decom.fee.unicamp.br/~masiero/articles/Journal/LSP2674969.pdf
Suggested text: This column-wise version of the Khatri-Rao product is useful in linear algebra approaches to data analytical processing[1] and in optimizing the solution of inverse problems dealing with a diagonal matrix[2][3].
Bmasiero (talk) 17:29, 27 March 2017 (UTC)Dr. Bruno Masiero
- Done I'm just going to add it in anyway and let the regulars of the article verify the references. jd22292 (Jalen D. Folf) (talk) 16:58, 14 July 2017 (UTC)
References
- ^ See e.g. Macedo, H. D.; Oliveira, J. N. (2015). "A linear algebra approach to OLAP". Formal Aspects of Computing. 27 (2): 283–307.
- ^ Lev-Ari, Hanoch (2005-01-01). "Efficient Solution of Linear Matrix Equations with Application to Multistatic Antenna Array Processing". Communications in Information & Systems. 5 (1): 123–130. ISSN 1526-7555.
- ^ Masiero, B.; Nascimento, V. H. (2017-05-01). "Revisiting the Kronecker Array Transform". IEEE Signal Processing Letters. 24 (5): 525–529. doi:10.1109/LSP.2017.2674969. ISSN 1070-9908.
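For editors evaluating the request, a minimal NumPy sketch of the column-wise Khatri-Rao product under discussion (the helper khatri_rao and the example matrices are illustrative, not taken from the cited papers):

```python
import numpy as np

def khatri_rao(A, B):
    """Column-wise Khatri-Rao product: column j of the result is
    np.kron(A[:, j], B[:, j])."""
    if A.shape[1] != B.shape[1]:
        raise ValueError("A and B must have the same number of columns")
    # einsum forms all products A[i, j] * B[k, j]; the reshape stacks
    # each (i, k) slice into a single column of length A.rows * B.rows
    return np.einsum('ij,kj->ikj', A, B).reshape(-1, A.shape[1])

A = np.array([[1., 2.], [3., 4.]])
B = np.array([[5., 6.], [7., 8.], [9., 10.]])
C = khatri_rao(A, B)                     # shape (6, 2)
assert np.allclose(C[:, 0], np.kron(A[:, 0], B[:, 0]))
assert np.allclose(C[:, 1], np.kron(A[:, 1], B[:, 1]))
```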
Request
Requesting addition/articles for the Khatri-Rao and Tracy-Singh products. [1] Shyamal 04:39, 25 July 2006 (UTC)
- Done - actually a few months ago. -- StevenDH (talk) 20:23, 16 April 2008 (UTC)
add bases to comparison to abstract tensor product
The paragraph should note that a choice of bases is involved: if A and B represent homomorphisms with respect to certain bases of the vector spaces involved, then the Kronecker product of A and B represents the tensor product of these homomorphisms with respect to the bases of the tensor products of the domain and codomain vector spaces, ordered as a_1 ⊗ b_1, a_1 ⊗ b_2, ..., a_1 ⊗ b_n, a_2 ⊗ b_1, ... 84.190.181.201
- Done - actually a few months ago, as well. RobHar (talk) 16:18, 18 August 2009 (UTC)
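A short NumPy illustration of the basis ordering described above (a sketch; np.kron happens to use exactly this ordering):

```python
import numpy as np

a = np.array([1, 2])       # coordinates in the basis a_1, a_2
b = np.array([3, 4, 5])    # coordinates in the basis b_1, b_2, b_3

# np.kron orders the product basis exactly as described in the comment:
# a_1⊗b_1, a_1⊗b_2, a_1⊗b_3, a_2⊗b_1, a_2⊗b_2, a_2⊗b_3
print(np.kron(a, b))       # [ 3  4  5  6  8 10]
```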
Column-wise Khatri-Rao product
An anonymous user edited (concerning the final related matrix operation):
- <!-- comment: the following definition is the same as the above except that it uses implicit partitions instead of explicit partitions... is there really need for this second example? -->
The reason I got myself a Wikipedia account in the first place was that I needed the definition of the column-wise KR product for my master's thesis, and I was tired of always looking in the paper by Liu. Later I examined this paper in which I saw (on p. 3 in the pdf) what I had by then found out, namely that the Khatri-Rao product is implied to operate on matrices with their columns as partitions. I wasn't sure whether this was a mistake or a different (and confusing) convention or something; therefore, and also for my own reference, I added it to the article as a separate case. But maybe it needs some clarification. -- StevenDH (talk) 20:23, 16 April 2008 (UTC)
- I was the anonymous user who added that. It's interesting to hear why you initially added it. As you can see, I changed the wording in the article to say that both may be called the KR product. I've used the KR product in a couple of papers recently in which I just define it as implicitly partitioning columns to avoid any confusion. As it stands, I left the example you added because it probably is better to include both examples (it appears both definitions are used). 24.91.117.221 (talk) 17:03, 26 May 2008 (UTC)
Question!!!
If A is n-by-n, B is m-by-m and I_k denotes the k-by-k identity matrix, then we can define the Kronecker sum, A ⊕ B, by
- A ⊕ B = A ⊗ I_m + I_n ⊗ B.
(Note that this is different from the direct sum of two matrices.)
But the notation for the Kronecker sum and the direct sum is the same! So is it a mistake? Gvozdet (talk) 13:29, 18 August 2009 (UTC)
- Do you mean that they are both denoted by ⊕? That's not really a problem. There aren't enough symbols in math, so it's common to reuse symbols. I am surprised this is called the Kronecker sum in the first place, though. RobHar (talk) 15:37, 18 August 2009 (UTC)
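A small NumPy check of the definition quoted above, showing that the Kronecker sum and the direct sum are different operations despite sharing the ⊕ symbol (the matrices are arbitrary examples):

```python
import numpy as np

A = np.array([[1., 2.], [3., 4.]])   # n = 2
B = np.array([[0., 1.], [1., 0.]])   # m = 2

# Kronecker sum: A ⊕ B = A ⊗ I_m + I_n ⊗ B
kron_sum = np.kron(A, np.eye(2)) + np.kron(np.eye(2), B)

# direct sum: block-diagonal stacking [[A, 0], [0, B]]
direct_sum = np.block([[A, np.zeros((2, 2))],
                       [np.zeros((2, 2)), B]])

print(np.allclose(kron_sum, direct_sum))   # False: same symbol, different operations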
Link to the article in Russian is missing
http://ru.wikipedia.org/wiki/%D0%A2%D0%B5%D0%BD%D0%B7%D0%BE%D1%80%D0%BD%D0%BE%D0%B5_%D0%BF%D1%80%D0%BE%D0%B8%D0%B7%D0%B2%D0%B5%D0%B4%D0%B5%D0%BD%D0%B8%D0%B5 —Preceding unsigned comment added by 217.29.95.125 (talk) 14:23, 23 July 2010 (UTC)
The Kronecker Product is not the Tensor Product
There is a confusing clash of nomenclature regarding the Kronecker product.
Despite the occasional use of the phrase "tensor product" to describe the Kronecker product, the Kronecker product doesn't coincide with the usual definition of the tensor product. The tensor product is an operation that produces a tensor of higher rank. That is, in coordinates the tensor product adds indices: (A ⊗ B)_{ijkl} = A_{ij} B_{kl}.
(Refer, for example, to: John Lee, Introduction to Smooth Manifolds)
As an example, the tensor product of two vectors gives you a matrix (ignoring covariant and contravariant issues for the moment). But the tensor product of two matrices is a fourth-rank tensor, not another matrix. The Kronecker product of matrices simply gives you another matrix (albeit of larger dimensions), and is therefore not the same thing as the tensor product.
The distinction might be blurred in some literature where they presumably call the actual tensor product a Cartesian product, which is also bad nomenclature since the Cartesian product is a product on sets, not tensors; better nomenclature would be direct product (and indeed MathWorld adopts an explicit hybrid nomenclature to be absolutely clear: [2]).
The relevant section of the Tensor Product page agrees with me except for the instances that mention the Kronecker product. In fact, this confusion arises on the talk page of the Tensor Product article as well, where Physdragon also agrees with me.
I think the confusion is that when working with only vectors and matrices it's easy to identify matrices and their vectorized counterparts (or indeed fourth-rank tensors and their matricized counterparts) -- something that can't be done with general tensors. The Kronecker product in this light is then the "matricized" version of the fourth-rank tensor which maps between second-rank tensors of the form X ↦ B X A^T,
or more explicitly is the matrix which maps between vectorized tensors of the form (A ⊗ B) vec(X) = vec(B X A^T).
The Kronecker product should then be contrasted with the tensor product and should not be used to exemplify the tensor product, which is a different -- albeit related -- beast. 129.32.11.206 (talk) 18:22, 15 October 2012 (UTC)
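The vectorized correspondence above is the standard vec identity; a quick NumPy check, assuming the column-stacking vec convention:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3))
B = rng.standard_normal((4, 4))
X = rng.standard_normal((4, 3))

vec = lambda M: M.flatten(order='F')   # column-stacking vec

# (A ⊗ B) vec(X) = vec(B X A^T): the Kronecker product is the
# "matricized" map X ↦ B X A^T acting on vectorized second-rank tensors
lhs = np.kron(A, B) @ vec(X)
rhs = vec(B @ X @ A.T)
print(np.allclose(lhs, rhs))           # True
```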
- If you read what this page says, it is explained that "the Kronecker product of matrices corresponds to the abstract tensor product of linear maps". In short, if S and T are two linear maps, they induce a linear map S ⊗ T whose matrix (with respect to a specific fairly natural choice of basis) is the Kronecker product of the matrices representing S and T. There's really no need to talk about ranks of tensors here. RobHar (talk) 03:57, 16 October 2012 (UTC)
- Thank you for your responsive and constructive feedback. I agree that ranks of tensors aren't necessarily relevant to the Kronecker product, which is why I think this article should not contain any mention of the tensor product at all, except perhaps for contrast. Maybe I didn't make the purpose of my post clear. This is what I think should happen with this article: I think these phrases should be removed or revised:
- "[The Kronecker product] is a generalization of the outer product (which is denoted by the same symbol) from vectors to matrices, and gives the matrix of the tensor product with respect to a standard choice of basis." (emphasis mine), and
- "If A and B represent linear transformations V1 → W1 and V2 → W2, respectively, then A ⊗ B represents the tensor product of the two maps, V1 ⊗ V2 → W1 ⊗ W2." (emphasis mine)
- The first phrase is misleading because "matrix of the tensor product" is ambiguous. It could be made more precise by calling it the matricization of the tensor product, which refers to an actual operation on tensors, as opposed to implying that there is an obvious "matrix of a tensor". It may seem obvious in the case of second rank, but it's better to be delicate when going between tensors and matrices, because problems can arise (for example, the transformation rules which characterize a tensor don't work properly when the tensor has been matricized).
- (See below this post for a correspondence between the Kronecker and tensor products as a matricization, which I feel is more precise than the current language in the article.)
- The second phrase is misleading because it's false (we've already defined ⊗ to mean Kronecker product here!), except in the sense where "tensor product" is a synonym for "Kronecker product", which is a bad synonym to introduce since it's highly ambiguous and confusing. 129.32.11.206 (talk) 17:18, 16 October 2012 (UTC)
Maybe in order to clarify the relationship, we could put up something like this:
- For two second-rank tensors A and B, their Kronecker product can be written as:
- A ⊗_K B = M(A ⊗ B),
- where ⊗ denotes the abstract tensor product, and M is the matricization map taking rank-4 tensors over elements of V, viewed as a space of rank-2 tensors, to rank-2 tensors over elements of V ⊗ V, viewed as a space of vectors (rank-1 tensors).
One problem with this is that the ⊗ between vector spaces is an abstract tensor product and not a Kronecker product, so to be consistent the notation will be funny-looking: ⊗ versus ⊗_K, etc.
Another problem is the ambiguity of saying "viewed as a space of..." However I think this is sufficiently clear that one shouldn't have to obfuscate the notation to be precise. 129.32.11.206 (talk) 17:18, 16 October 2012 (UTC)
- I understood what you meant. I was pointing out that you are incorrect in thinking that what is written is incorrect. Let me rephrase. The tensor product is a bifunctor on the category of vector spaces. So, given two linear transformations S and T from V1 to W1 and V2 to W2, one obtains by functoriality a linear transformation S ⊗ T from V1 ⊗ V2 to W1 ⊗ W2. Given any linear transformation S from V1 to W1 and a basis of V1 and a basis of W1, one obtains a matrix A(S) representing the linear transformation; this is basic linear algebra (and ditto for T). What the article is saying is that in a perfectly natural choice of basis for V1 ⊗ V2 and W1 ⊗ W2, the matrix A(S ⊗ T) equals the Kronecker product of A(S) with A(T). This is a simple fact that is quite easy to check (if you still don't believe me). And there's nothing misleading going on here: S and T are linear transformations and S ⊗ T is the standard notation for the map one obtains from the fact that the tensor product is a bifunctor; A(S) and A(T) are matrices and, as this article states in the first line, the Kronecker product is a binary operation on matrices and that's what A(S) ⊗ A(T) means. None of these things are being thought of as multidimensional arrays. These are all abstract mathematical objects. As for your objections to "[The Kronecker product] is a generalization of the outer product (which is denoted by the same symbol) from vectors to matrices, and gives the matrix of the tensor product with respect to a standard choice of basis.", once again you are being confused by your perspective. In this case, the sentence is not so clear, I give you that, but what it's trying to say is quite true and again simple (at least when viewed from the right perspective). Here goes. When one speaks of the outer product of two vectors, this is what physicists would typically write as |v⟩⟨w|, a column vector times a row vector.
- Now, it is a very common convention to (given a fixed basis) write elements of a vector space V as column vectors. A row vector is then thought of as (the coordinates in the dual basis of) an element of the dual V*. Thus, in a standard way, the above is thought of as an element of V ⊗ V*. This vector space V ⊗ V* is canonically isomorphic to Hom(V,V), i.e. the linear transformations from V to itself. Now one has a basis of V lying around already (since you wrote down a column vector) and so one can write down the canonically associated element of Hom(V,V) in that basis. That is a matrix. That matrix is equal to the Kronecker product of the two above vectors thought of as 3x1 and 1x3 matrices. Should the article explain this in more detail? Yes. But this is all perfectly natural. It seems like your perspective on this comes from manifolds (and maybe computer programming given your link to matricization), but this is misleading you. This is just multilinear algebra. I think these things should be better explained, but they are certainly not false. RobHar (talk) 05:39, 17 October 2012 (UTC)
- Thank you again for your explanation, RobHar. I agree now that the article is correct, but the various equalities were not explicit enough to be clear for me. You're correct that my background is with manifolds, but I'm actually encountering the Kronecker product in the context of nonlinear elasticity, where the explicit coordinate representations of the Kronecker product AND the tensor product are simultaneously important. For this reason, I feel it's still appropriate to - at some point in the article - explicitly reference the isomorphisms "behind the scenes" which are tacitly obvious to an algebraist. Perhaps there's a standard notation for (invariant versions of) the vec and matricization maps? (129.32.11.206) 76.99.97.70 (talk) 13:33, 19 October 2012 (UTC)
Formulas for the inverse of a sum of Kronecker products
Miller (1981) derives a formula for the inverse of sums of Kronecker products (p. 72) in the special case where the "last" matrix is of rank 1. Specifically, let W = A ⊗ G + B ⊗ E, where rank(E) = 1. Then W^{-1} = A^{-1} ⊗ G^{-1} - T ⊗ G^{-1}EG^{-1}, where T = (A + gB)^{-1}BA^{-1} and g = trace(EG^{-1}). Miller also works out examples where G is the identity matrix and B is also of rank 1. Are these too detailed to include in the page? If they are appropriate, I can add them. --Nathanvan (talk) 22:31, 1 January 2013 (UTC)
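The formula is straightforward to sanity-check numerically before adding it to the page; a sketch with random matrices and E forced to rank 1:

```python
import numpy as np

rng = np.random.default_rng(1)
n, m = 3, 4
A = rng.standard_normal((n, n))
B = rng.standard_normal((n, n))
G = rng.standard_normal((m, m))
E = np.outer(rng.standard_normal(m), rng.standard_normal(m))  # rank(E) = 1

W = np.kron(A, G) + np.kron(B, E)

# Miller's formula: W^{-1} = A^{-1} ⊗ G^{-1} - T ⊗ G^{-1} E G^{-1}
Ainv, Ginv = np.linalg.inv(A), np.linalg.inv(G)
g = np.trace(E @ Ginv)
T = np.linalg.inv(A + g * B) @ B @ Ainv
W_inv = np.kron(Ainv, Ginv) - np.kron(T, Ginv @ E @ Ginv)

print(np.allclose(W_inv, np.linalg.inv(W)))   # True
```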
Property on transpose can be conjugate transpose
The section on Properties says:
"The operation of transposition is distributive over the Kronecker product:
- (A ⊗ B)^T = A^T ⊗ B^T"
This can be made more generic by also including the complex case and considering the conjugate transpose instead:
- (A ⊗ B)^H = A^H ⊗ B^H
I think this is worth noting. I believe it is easier to see from this that distributivity also holds for the real case with the transpose than the other way around. anoko_moonlight (talk) 11:15, 12 August 2013 (UTC)
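A quick numerical check of the conjugate-transpose version (arbitrary complex matrices; .conj().T stands in for ^H):

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((2, 3)) + 1j * rng.standard_normal((2, 3))
B = rng.standard_normal((4, 2)) + 1j * rng.standard_normal((4, 2))

# (A ⊗ B)^H = A^H ⊗ B^H; for real matrices this reduces to the transpose rule
print(np.allclose(np.kron(A, B).conj().T,
                  np.kron(A.conj().T, B.conj().T)))   # True
```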
Pronunciation
How does one pronounce A ⊗ B? If there's a standard way, I think it should be mentioned the first time the notation appears in the article. Loraof (talk) 17:28, 5 July 2018 (UTC)
Kronecker sum
edit"Kronecker sums appear naturally in physics when considering ensembles of non-interacting systems.[citation needed] Let Hi be the Hamiltonian of the ith such system. Then the total Hamiltonian of the ensemble is
- ."
It's not quite true. I would say "Direct sums appear naturally ...", not "Kronecker sums ...".
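For concreteness, a minimal sketch of the construction the quoted passage describes (H1 and H2 are arbitrary diagonal stand-ins; the spectrum of the Kronecker sum consists of the pairwise sums of the subsystem energies):

```python
import numpy as np

def kron_sum(H1, H2):
    """Kronecker sum H1 ⊕ H2 = H1 ⊗ I + I ⊗ H2 on the
    tensor-product space of the two systems."""
    return (np.kron(H1, np.eye(H2.shape[0]))
            + np.kron(np.eye(H1.shape[0]), H2))

# eigenvalues of the Kronecker sum are all pairwise sums λ_i + μ_j
H1 = np.diag([0.0, 1.0])
H2 = np.diag([0.0, 2.0])
print(np.sort(np.linalg.eigvalsh(kron_sum(H1, H2))))  # [0. 1. 2. 3.]
```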
Perfect shuffle matrices and Commutation matrices
editProperty 2 of the Kronecker product uses the MATLAB notation without explaining it (I suspect it is just a copy/paste of the paper quoted [3]), which is unclear. Moreover, there is more to it, and there is even an article about this fact : https://en.wikipedia.org/wiki/Commutation_matrix. Perhaps the page should be edited to fill in the gaps and clarify this "Shuffle matrix" thing ? AnthonyStC (talk) 21:33, 6 August 2020 (UTC)