Wikipedia:Reference desk/Archives/Mathematics/2015 November 19

Mathematics desk
Welcome to the Wikipedia Mathematics Reference Desk Archives
The page you are currently viewing is an archive page. While you can leave answers for any questions shown below, please ask new questions on one of the current reference desk pages.


November 19

Finite endomorphism ring

Is there an infinite abelian group with a finite endomorphism ring? GeoffreyT2000 (talk) 01:17, 19 November 2015 (UTC)

I don't think so. If an abelian group has a finite endomorphism ring, then it is necessarily a torsion abelian group. (Otherwise multiplication by an integer gives an obvious injection from the set of integers into the endomorphism ring.) In fact, for the same reason, the elements must have bounded order: if the orders were unbounded, multiplication by distinct integers would again give distinct endomorphisms. By the first Prüfer theorem, a torsion abelian group of bounded order is isomorphic to a direct sum of cyclic groups. Because the orders of these cyclic factors are bounded, some cyclic group is repeated infinitely often as a direct factor by the pigeonhole principle, and so in that case the endomorphism ring is uncountably infinite. Sławomir Biały 12:07, 19 November 2015 (UTC)
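The two injections used in the argument above can be written out explicitly (a sketch, with A standing for the abelian group in question):

```latex
% If element orders in A are unbounded, then n \mapsto n\,\mathrm{id}_A
% gives distinct endomorphisms for distinct n, so End(A) is infinite:
\[
  \mathbb{Z} \hookrightarrow \operatorname{End}(A),
  \qquad n \longmapsto (x \mapsto nx).
\]
% With bounded exponent, A \cong \bigoplus_{i \in I} \mathbb{Z}/n_i\mathbb{Z}
% (first Prüfer theorem). If A is infinite, some value n = n_i occurs for an
% infinite index set J \subseteq I, and acting componentwise on those factors
% already yields uncountably many endomorphisms:
\[
  \prod_{j \in J} \operatorname{End}\bigl(\mathbb{Z}/n\mathbb{Z}\bigr)
  \subseteq \operatorname{End}(A), \qquad |J| = \infty .
\]
```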

Probability distributions, frequency of events

Hi, I'd like to create an IID stochastic process X_1, X_2, ... with each X_n in {0,1}, where 1 indicates that an event has occurred, and 0 that it has not. I had been using X_n ~ B(p), where B(p) is the Bernoulli distribution with parameter p. This works OK, and I can vary the mean frequency of events, but the variance is p(1-p), and so the variance increases as I increase the frequency - I am primarily interested in 0 < p < 1/2. What distribution should I use that would allow me to manipulate the frequency and variance independently? Or at least have a fixed variance for all frequencies? I need at the end to have a string of length N like 010...001, so counting events in an interval is not helpful. To be clear, I'd like the time between events (expected value of 1/p) to have the same variance for all frequencies p. I can think of a few ways to generate strings with the necessary properties programmatically, but it would be far better if I could do it with a simple distribution. Any ideas? I feel like there must be something simple I'm forgetting about. Thanks, SemanticMantis (talk) 16:14, 19 November 2015 (UTC)
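For reference, the setup described in the question can be sketched as follows (function names are mine, not from the thread):

```python
import random

def bernoulli_process(p, n, seed=0):
    """IID 0/1 string of length n; each bit is 1 with probability p."""
    rng = random.Random(seed)
    return [1 if rng.random() < p else 0 for _ in range(n)]

def bernoulli_variance(p):
    """Variance of a single Bernoulli(p) trial: p(1-p)."""
    return p * (1 - p)
```

On 0 < p < 1/2 the variance p(1-p) is strictly increasing in p, which is exactly the coupling between frequency and variance the question wants to break.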

That's impossible. The only distribution with support {0,1} is Bernoulli. To get what you described (which may or may not be what you want), you'll have to make your events dependent (contrary to the assumption of IID).
And the best way to do that is probably to start with the distribution you want for the time between successive events, with a given mean and variance (plenty of choice there - a good choice is the maximum entropy distribution), then simply run that and derive the process from it (with 0's to fill in the gaps between successive 1's). But again, this means there will be dependence between the events at given times (the dependence will be stronger for nearby events). -- Meni Rosenfeld (talk) 18:53, 19 November 2015 (UTC)
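A minimal sketch of this gap-filling construction (the names and the example gap distribution are my choices, not from the thread): draw inter-event times g >= 1 from any distribution on the positive integers, and emit g-1 zeros followed by a 1.

```python
import random

def process_from_gaps(gap_sampler, n, rng):
    """Build a 0/1 string of length n from inter-event times.

    gap_sampler(rng) must return an integer g >= 1, the number of steps
    from one event to the next; each draw contributes (g - 1) zeros and a 1.
    """
    out = []
    while len(out) < n:
        g = gap_sampler(rng)
        out.extend([0] * (g - 1))
        out.append(1)
    return out[:n]  # truncate the final (possibly partial) gap

# With a geometric gap distribution this recovers the plain Bernoulli(p)
# process; any other gap distribution makes the bits dependent, as noted above.
rng = random.Random(42)
string = process_from_gaps(lambda r: r.randint(5, 15), 60, rng)
```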
@Meni Rosenfeld: D'oh! Thanks, that makes sense - I forgot to include option "c) Is this impossible?" Of course I see now that what I asked for is indeed impossible. To clarify, you're suggesting that I could instead create an IID process S_n for the spaces, then create O = 1...1...1, where the number of zeros in each ... is given by S_n. Then O_n is not IID, but I could construct S_n such that the mean and variance are independent. But can I do that with maximal entropy? E.g. I thought the geometric distribution had maximal entropy on {(0),1,2,...}, and that won't let me pick mean and variance independently. If I want to demand independent mean and variance, what are my options for support on \mathbb{N}? I think maybe I can re-parameterize the Beta-binomial distribution by mean and variance, like you can do with the Beta distribution, but that's only quasi-independent, because once you pick the mean it bounds the allowable variances. I may well stick with my Bernoulli setup for now, but I'm interested in the time-dependent case as a possible future refinement. SemanticMantis (talk) 19:54, 19 November 2015 (UTC)
I guess I could pick the mean \mu and then let S_n ~ DiscreteUniform(\mu-k, \mu+k). The variance could then at least be made arbitrarily large or small. SemanticMantis (talk) 20:01, 19 November 2015 (UTC)
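Sketched under the same naming assumptions as above: DiscreteUniform(\mu-k, \mu+k) has mean \mu and variance k(k+1)/3, so the two moments are tuned separately, subject to \mu - k >= 1 so that gaps stay positive.

```python
import random
from statistics import fmean, pvariance

def uniform_gaps(mu, k, count, seed=1):
    """Inter-event gaps S_n ~ DiscreteUniform(mu - k, mu + k).

    Mean is mu and variance is k*(k+1)/3, independent of mu as long as
    mu - k >= 1 (gaps must be positive integers).
    """
    if mu - k < 1:
        raise ValueError("need mu - k >= 1 for positive gaps")
    rng = random.Random(seed)
    return [rng.randint(mu - k, mu + k) for _ in range(count)]

gaps = uniform_gaps(10, 3, 100_000)
```

With mu = 10 and k = 3 the theoretical variance is 3*4/3 = 4, regardless of the mean.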
Yes, that is the process I proposed.
I meant "maximum entropy for given mean and variance, supported on the positive integers" (the geometric is max. ent. for a given mean alone, without a variance specification). That distribution has the same functional form as the normal distribution - p(n) proportional to exp(an + bn^2) with b < 0 - but the scale & shift parameters will not be exactly the mean and s.d., since the restriction to positive integers changes the mean and variance for a given formula. You'd have to do a bit of work to find the correct parameters.
Alternatively, a combination of the binomial distribution (for low variance) and the negative binomial (for high variance) can work, and it's easier to find the parameters, but together they don't cover all possible combinations of mean & variance. -- Meni Rosenfeld (talk) 20:18, 19 November 2015 (UTC)
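A sketch of the "bit of work" for the max-entropy case: take p(n) proportional to exp(-(n-a)^2 / (2b^2)) on n = 1, 2, ..., compute the realized moments, and (not shown here) search over (a, b) until they hit the targets. The names and the cutoff N are my assumptions.

```python
import math

def discrete_gaussian_pmf(a, b, N=10_000):
    """p(n) proportional to exp(-(n - a)^2 / (2 b^2)) on n = 1..N.

    This is the max-entropy shape for fixed mean and variance on the
    positive integers (N is a practical cutoff), but the realized moments
    differ from a and b^2 because of discreteness and the truncation at n >= 1.
    """
    w = [math.exp(-((n - a) ** 2) / (2.0 * b * b)) for n in range(1, N + 1)]
    z = sum(w)
    return [x / z for x in w]

def pmf_moments(pmf):
    """Mean and variance of a PMF indexed from n = 1."""
    m = sum(n * p for n, p in enumerate(pmf, start=1))
    v = sum((n - m) ** 2 * p for n, p in enumerate(pmf, start=1))
    return m, v
```

For a = 10, b = 2 the truncation barely matters and the realized moments come out very close to (10, 4); push a down toward 1 and they drift away from (a, b^2), which is why the parameters need adjusting.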
Note also that a uniform distribution will be fairly restrictive - you must keep it positive, so it limits how wide an interval you can take, and hence you can't have high variance. Negative binomial doesn't have this problem. -- Meni Rosenfeld (talk) 20:31, 19 November 2015 (UTC)
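The parameter-matching Meni mentions can be done in closed form from the standard moment formulas; the split at v = m (binomial needs v < m, negative binomial needs v > m) is exactly why neither family alone covers everything. Function names are mine.

```python
def neg_binomial_params(m, v):
    """Negative binomial (number of failures before the r-th success,
    support {0, 1, 2, ...}) matching mean m and variance v; needs v > m.
    Moments: mean = r(1-p)/p, variance = r(1-p)/p^2 = mean/p."""
    if not v > m > 0:
        raise ValueError("negative binomial needs v > m > 0")
    p = m / v
    r = m * m / (v - m)   # r need not be an integer (Polya form)
    return r, p

def binomial_params(m, v):
    """Binomial(n, p) matching mean m and variance v; needs 0 < v < m.
    Moments: mean = n*p, variance = n*p*(1-p).
    n = m^2/(m - v) is generally not an integer, so in practice one
    rounds n and re-solves p = m/n, perturbing the moments slightly."""
    if not 0 < v < m:
        raise ValueError("binomial needs 0 < v < m")
    p = 1 - v / m
    n = m / p
    return n, p
```

To use either as a gap distribution on {1, 2, ...}, shift by +1 (sample, then add 1), so the target mean here would be the desired gap mean minus 1.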
Thanks again Meni, very helpful. SemanticMantis (talk) 22:31, 19 November 2015 (UTC)