Wikipedia:Reference desk/Archives/Mathematics/2011 July 5

Mathematics desk
Welcome to the Wikipedia Mathematics Reference Desk Archives
The page you are currently viewing is an archive page. While you can leave answers for any questions shown below, please ask new questions on one of the current reference desk pages.


July 5

Problem with continuous function

This problem is from Spivak's Calculus (4th ed.). Suppose f is a continuous function on [0,1] with the property that f(0) = f(1) and n is any natural number. The first part requires the reader to prove that the equation f(x) = f(x + 1/n) has at least one root in [0,1]; I managed to do that easily enough, by deducing a contradiction. However, the second part has me stumped: Suppose a is in (0,1) and is not equal to 1/n for any natural n. Find an f with the properties above for which the equation f(x) = f(x + a) has no root in [0,1]. Some help with constructing f, particularly by using the first part of the question, would be great. Thanks in advance. —Anonymous DissidentTalk 06:20, 5 July 2011 (UTC)[reply]

Hint: Start with a function for which f(x + a) = f(x) for every x. Then modify it to satisfy both f(0) = f(1) and f(x + a) ≠ f(x) for every x. -- Meni Rosenfeld (talk) 08:28, 5 July 2011 (UTC)[reply]
The obvious choice to start with is something trigonometric with period a, such as f(x) = sin(2πx/a). I'm not sure how to modify that to meet the additional requirements though. I thought about saying that if a is in [1/(n+1), 1/n], then offset the period by making it 1/n, but this is not enough. Could you give more advice about a good way to modify a function which satisfies f(x) = f(x + a) for all x into a function that satisfies it for no x? —Anonymous DissidentTalk 10:31, 6 July 2011 (UTC)[reply]
Add to it a function g(x) which satisfies it for no x. -- Meni Rosenfeld (talk) 11:18, 6 July 2011 (UTC)[reply]
But such a g is what we're looking for... —Anonymous DissidentTalk 12:00, 6 July 2011 (UTC)[reply]
You're looking for a function which satisfies it for no x and takes the same value at 0 and 1. Just the former is easy. -- Meni Rosenfeld (talk) 12:13, 6 July 2011 (UTC)[reply]
Right, great. g(x) = (1-x)sin(2π/a) works a treat. Thanks for your input. I guess it came down to figuring out how to go from "for all" to "for none". I'll remember the "add convenient g(x)" idea in the future. —Anonymous DissidentTalk 13:11, 6 July 2011 (UTC)[reply]
Just a small caveat: This particular choice of f doesn't work for a = 2/n (where n is odd) since sin(2π/a) = 0, but it shouldn't be hard to come up with a similar replacement in that particular instance. Rckrone (talk) 02:04, 7 July 2011 (UTC)[reply]
Good point, the periodic function should have a global extremum (unique up to period) at 0. The OP actually has a typo so it's not clear what he meant. If he meant f(x) = (1 - x)sin(2πx/a) then that doesn't actually work, since f(0) = f(a) = 0. What I originally had in mind, which works whenever sin(2π/a) ≠ 0, is f(x) = sin(2πx/a) - x·sin(2π/a). To make it work for all a, you can use f(x) = cos(2πx/a) + x(1 - cos(2π/a)). -- Meni Rosenfeld (talk) 08:20, 7 July 2011 (UTC)[reply]
That's not what I meant. I meant g(x) = (1-x)sin(2π/a) and f(x) = sin(2πx/a) + g(x). Rckrone's point still holds though. —Anonymous DissidentTalk 23:35, 7 July 2011 (UTC)[reply]
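The construction arrived at above can be sanity-checked numerically. A minimal sketch (the choice a = 0.3 is an arbitrary admissible value, i.e. not of the form 1/n and with sin(2π/a) ≠ 0; f is the function from the last exchange):

```python
import math

def f(x, a):
    # f(x) = sin(2*pi*x/a) + (1 - x)*sin(2*pi/a): a periodic part with
    # period a, plus a linear correction that forces f(0) == f(1) while
    # making f(x + a) - f(x) the nonzero constant -a*sin(2*pi/a).
    return math.sin(2 * math.pi * x / a) + (1 - x) * math.sin(2 * math.pi / a)

a = 0.3  # not of the form 1/n, and sin(2*pi/a) != 0, so the construction applies

# Boundary condition: f(0) = f(1)
assert abs(f(0, a) - f(1, a)) < 1e-12

# f(x + a) - f(x) is the same nonzero constant for every x in [0, 1 - a],
# so the equation f(x) = f(x + a) has no root there.
diffs = [f(k / 1000 + a, a) - f(k / 1000, a) for k in range(701)]
assert all(abs(d + a * math.sin(2 * math.pi / a)) < 1e-9 for d in diffs)
assert min(abs(d) for d in diffs) > 0.1
```

The same check fails exactly when sin(2π/a) = 0, i.e. a = 2/n for odd n, which is Rckrone's caveat.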

Maximum likelihood estimation / regression

Could someone explain in a succinct way the difference between these approaches? Is one a subset of the other?

In particular I have two large sets of variables X and Y, and I know Y to be linearly related to X and I suppose the error to be distributed as a Gaussian. I know I can use simple linear regression here to determine the intercept and slope. MLE is quite new to me so I apologise if I'm thinking about it stupidly. In this case it 'feels' like they're related, but it also feels like there must be some distinction that would cause one to be preferred. --Iae (talk) 11:31, 5 July 2011 (UTC)[reply]

MLE and regression are both very general terms and comparing them depends on the context to which they are applied. It so happens that for the particular case of a linear model with Gaussian errors of fixed variance, the log-likelihood is proportional to the sum of squared errors, so the MLE will give the same results as simple linear regression. -- Meni Rosenfeld (talk) 11:44, 5 July 2011 (UTC)[reply]

@Iae: You seem to imagine that MLE and regression are two approaches to something. "Regression" is a vague and general term that means estimation of the population mean, population median, or similar location parameter of one variable, conditional on the value of another variable, all based on a sample. Maximum likelihood estimation is estimation in problems involving parametrized families of probability distributions. In some situations, least-squares estimates correspond exactly with maximum likelihood estimates; in others maximum likelihood makes no sense because there is no parametrized family of probability distributions, but regression is still done. Michael Hardy (talk) 05:23, 7 July 2011 (UTC)[reply]

I think it's more accurate to say that MLE is a generalizable approach to estimation, and regression is an estimation procedure for determining the impact of one or more variables on a dependent variable. The results of a linear regression using ordinary least squares are equivalent to those you would get using MLE. But this is not true of other regression procedures; and MLE can be used to derive estimators for many quantities whose relationship cannot be characterized through regression. 12.186.80.1 (talk) 18:49, 7 July 2011 (UTC)David[reply]

Thanks everyone. I think I confused myself a bit in fixating on linear parameter estimation. --Iae (talk) 23:12, 7 July 2011 (UTC)[reply]
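The equivalence described above (OLS = MLE for a linear model with Gaussian errors of fixed variance) can be illustrated numerically. A sketch on made-up data; the true intercept 2, slope 3, and noise level 0.5 are arbitrary illustrative choices:

```python
import math
import random

random.seed(0)
# Synthetic data: y = 2 + 3x + Gaussian noise (values chosen for illustration)
xs = [i / 50 for i in range(100)]
ys = [2 + 3 * x + random.gauss(0, 0.5) for x in xs]

n = len(xs)
mx, my = sum(xs) / n, sum(ys) / n

# Ordinary least squares, closed form
slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
intercept = my - slope * mx

def log_likelihood(b0, b1, sigma=0.5):
    # Gaussian log-likelihood of the data under y = b0 + b1*x + N(0, sigma^2).
    # For fixed sigma this is a constant minus SSE/(2*sigma^2), so maximizing
    # it is the same as minimizing the sum of squared errors.
    return sum(-0.5 * math.log(2 * math.pi * sigma ** 2)
               - (y - b0 - b1 * x) ** 2 / (2 * sigma ** 2)
               for x, y in zip(xs, ys))

# The OLS fit maximizes the Gaussian log-likelihood: nudging either
# parameter in any direction can only decrease it.
best = log_likelihood(intercept, slope)
for db0 in (-0.01, 0.01):
    for db1 in (-0.01, 0.01):
        assert log_likelihood(intercept + db0, slope + db1) <= best
```

As the thread notes, this equivalence is specific to the Gaussian fixed-variance case; with other error models the MLE and the least-squares fit generally differ.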

Brownian motion on a Riemannian manifold as a scaling limit

Does anyone know of a reference that presents a clear account of how to obtain a Brownian motion on a Riemannian manifold as a scaling limit of random walks? Sławomir Biały (talk) 13:38, 5 July 2011 (UTC)[reply]

Since Riemannian manifolds are locally Euclidean, I don't see why the scaling should be any different than it is for a Euclidean manifold. Looie496 (talk) 21:38, 6 July 2011 (UTC)[reply]
Riemannian manifolds are only locally topologically Euclidean, not metrically Euclidean. Brownian motion is sensitive to the metric. Sławomir Biały (talk) 22:35, 6 July 2011 (UTC)[reply]
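Not a reference, but the scaling in question can be sketched in the simplest compact setting: a geodesic random walk on the unit circle with arc-length step h and t/h² steps. (The circle is flat, so this only illustrates the diffusive scaling, not the curvature effects the question is really about; all parameters here are illustrative.)

```python
import random

random.seed(1)

def walk_on_circle(t, h):
    # Geodesic random walk on the unit circle: each step moves arc-length h
    # left or right; taking n = t/h^2 steps makes the walk approximate
    # Brownian motion run for time t as the step size h -> 0.
    theta = 0.0
    for _ in range(int(t / h ** 2)):
        theta += random.choice((-h, h))
    return theta  # unwrapped angle

# Under the diffusive scaling, the unwrapped angle at time t has variance
# close to t, matching Brownian motion on the circle lifted to the line.
t = 1.0
samples = [walk_on_circle(t, 0.05) for _ in range(2000)]
var = sum(s * s for s in samples) / len(samples)
assert 0.8 < var < 1.2
```

On a curved manifold the analogous construction takes steps along geodesics in uniformly random tangent directions, which is where the metric (rather than just the topology) enters.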