Talk:Slutsky's theorem


old discussion…


The result in the article is not known as Slutsky's Theorem (that is a different result), but rather Slutsky's Lemma. The two results are cited often enough that the distinction should be made. — Preceding unsigned comment added by 98.223.197.174 (talk) 16:34, 2 January 2013 (UTC)

The claim is wrong for general X_n, Y_n. —Preceding unsigned comment added by 91.19.96.104 (talkcontribs)

Agreed. According to Fumio Hayashi's Econometrics textbook, Slutsky's Theorem says nothing about X_n and Y_n BOTH converging in distribution. Instead, it says that if X_n converges in distribution and Y_n converges in probability, then X_n + Y_n ... X_n*Y_n ... as stated already.

This is a very important distinction, but I'm not an expert. —Preceding unsigned comment added by 24.105.142.135 (talk) 16:30, 11 March 2008 (UTC)
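
For reference, here is the usual textbook form of the statement described above, written out as a LaTeX sketch (my own transcription, not a quotation from Hayashi or from the article; the constant limit is written c here):

    % Sketch of Slutsky's theorem in the form discussed above.
    % If $X_n \xrightarrow{d} X$ and $Y_n \xrightarrow{p} c$ for a constant $c$, then:
    \begin{align*}
      X_n + Y_n &\xrightarrow{d} X + c, \\
      X_n Y_n   &\xrightarrow{d} cX, \\
      X_n / Y_n &\xrightarrow{d} X / c \quad \text{provided } c \neq 0.
    \end{align*}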

Yes, I agree, this looks strange. One reference could be Bickel and Doksum, Mathematical statistics, theorem A.14.9, page 467. —Preceding unsigned comment added by 128.32.132.218 (talk) 19:29, 12 March 2008 (UTC)

- The theorem, as stated, only makes sense if one of the variables X or Y is constant. Otherwise the distributions of X_n+Y_n, X+Y, X_n*Y_n and X*Y are not properly defined! To have a proper definition of these distributions, one would need the joint distributions of the (X_n,Y_n)'s and of (X,Y). Now if one of the limiting variables, say Y, is actually a constant, then Y_n converges to Y in distribution if and only if it converges to that constant in probability.

- In the third point (convergence of X_n/Y_n) it should be imposed that, for n large enough, Y_n is almost surely nonzero. Otherwise X_n/Y_n might not be defined on some non-negligible set of the probability space, even if Y \neq 0 a.s.

- To recap, the correct statement is: Let (Xn) and (Yn) be sequences of univariate random variables. If (Xn) converges in distribution to X and (Yn) converges in distribution to a constant a, then

   * (Xn + Yn) converges in distribution to X + a,
   * (XnYn) converges in distribution to a*X, and
   * (Xn / Yn) converges in distribution to X / Y if Y \neq 0 almost surely.

Note that convergence in distribution of a sequence of random variables (Yn) to a constant a is equivalent to convergence in probability to a. —Preceding unsigned comment added by 81.57.2.37 (talk) 00:35, 12 April 2008 (UTC)

Sorry, the last point is: (Xn / Yn) converges in distribution to (X / a) if a \neq 0 and Yn \neq 0 almost surely for n large enough. —Preceding unsigned comment added by 81.57.2.37 (talk) 00:40, 12 April 2008 (UTC)
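
To make the failure mode concrete, here is a small simulation sketch in Python (my own illustration, not taken from the article or its sources; it only assumes NumPy). Take X_n = Z and Y_n = -Z for a single standard normal Z: each sequence converges in distribution to N(0, 1), yet X_n + Y_n is identically zero rather than anything resembling a sum of the two limits.

    # Illustrative counterexample: marginal convergence in distribution of X_n and Y_n
    # says nothing about the distribution of X_n + Y_n without the joint behaviour.
    import numpy as np

    rng = np.random.default_rng(0)
    z = rng.standard_normal(100_000)

    x_n = z    # X_n = Z, marginally ~ N(0, 1) for every n
    y_n = -z   # Y_n = -Z, also marginally ~ N(0, 1) for every n

    print(np.var(x_n))        # close to 1
    print(np.var(y_n))        # close to 1
    print(np.var(x_n + y_n))  # exactly 0: the sum is degenerate, not N(0, 2)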

All these issues have been resolved; the theorem’s statement is now correct. ... stpasha » talk » 04:42, 13 September 2009 (UTC)

Validity w.r.t. convergence in probability


The article says (in the notes section): "The theorem remains valid if we replace all convergences in distribution with convergences in probability (due to this property)." Here "this property" says that convergence in probability implies convergence in distribution. I think the reasoning here does not work: to replace *all* occurrences of "convergence in distribution" by "convergence in probability", we would need "this property" in both directions. --Dominique Unruh (talk) 15:59, 18 December 2019 (UTC)

@DniQ You're right; I have deleted the last half of that sentence. Thatsme314 (talk) 00:19, 20 April 2020 (UTC)
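
For anyone reading along, a short sketch of the two implications at issue (my own summary, not from the article):

    % Convergence in probability always implies convergence in distribution:
    \[ Y_n \xrightarrow{p} Y \;\Longrightarrow\; Y_n \xrightarrow{d} Y \]
    % The converse holds only when the limit is a constant c:
    \[ Y_n \xrightarrow{d} c \;\Longrightarrow\; Y_n \xrightarrow{p} c \]
    % So the cited property alone turns "in probability" hypotheses into "in distribution"
    % ones, but not the other way around, which is the gap pointed out above.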

URL to footnote 4 is not working anymore


The link to the counterexample is not working anymore. — Preceding unsigned comment added by Irazall (talkcontribs) 14:32, 7 June 2022 (UTC)