Wikipedia:Reference desk/Archives/Mathematics/2017 January 20

Mathematics desk
Welcome to the Wikipedia Mathematics Reference Desk Archives
The page you are currently viewing is a transcluded archive page. While you can leave answers for any questions shown below, please ask new questions on one of the current reference desk pages.


January 20

How much of math is used/useful?

While reading this thread from a couple of days ago, I was stopped in my tracks by something User:Meni Rosenfeld wrote: "[Mathematics in the real world] is somewhat limited in depth, because the vast majority of mathematics has no known applicability." This made me wonder: are there any even somewhat reasonable estimates of just how much of math is actually used to solve real-world problems? Is math anything like English, which I hear has over a million words now, yet you can get along with just a couple of thousand words in your vocabulary and be super-fluent with 20-30 thousand? I understand there are probably many obstacles to a precise answer to this type of question. I'm looking for more of a rule-of-thumb type of answer (and maybe a little reckless speculation, just for good measure :). Bobnorwal (talk) 04:26, 20 January 2017 (UTC)

Have to take the opposing position here. Basically all areas of math are applicable and applied these days. And frequently the stuff that people huff and puff the most to keep pure in one generation turns out to be the most useful and practical stuff to apply in the next. <namedrop> I think Paul Dirac said something similar somewhere - so his answer / rule of thumb / reckless speculation / reasonable estimation was: ultimately, 100%. (I sat in front of him at a lecture once, when I was knee high to a grasshopper.)</namedrop> John Z (talk) 10:39, 20 January 2017 (UTC)
I wasn't talking about areas (though arguably it's true with this as well, depending on what you call "area"). I was talking about actual specific results/theorems.
Also, it's true that many areas/results find an application eventually, but at any specific moment in time, most results have not yet been applied. -- Meni Rosenfeld (talk) 12:24, 20 January 2017 (UTC)
  • It obviously depends on the quantification you take for "how much of math" (by the amount of text written on a particular topic? by dollars invested in grants?), but also on what "used/useful" means. Technology readiness level jumps to mind: there are discoveries that everyone knows will be useful at some point in the future, but are not on the market yet; how do you count those? TigraanClick here to contact me 11:43, 20 January 2017 (UTC)
[ugh, I "posted" this hours ago, didn't notice there was an edit conflict, and left it as is... Posting for real now.]
I don't know the relevant numbers myself, but I'd say that yes, math is similar to English in this regard.
I'm weighing by specific results rather than general fields. For example, you could take the collection of all theorems that have been proven in peer-reviewed journal articles, and ask which of those have ever been applied. I bet the vast majority have not.
For example, I'm thinking about something like circle packing (Circle_packing_in_a_circle, [1]) (that's a relatively well-known problem; there are much more esoteric ones). In 1994 the optimal way to pack 11 circles inside a circle was found. This no doubt required a tremendous amount of work. Was this result ever applied? I doubt it. When we do want to pack circles, we will usually either truncate an optimal plane packing or run a heuristic algorithm to find an approximate solution for the specific inner and outer radii given.
Provably optimal packings of squares inside a circle can be useful for the chip fabrication industry, but that is a separate, unique problem. There are tons of results on this overall type of problem ([2]) which have probably mostly remained unused. -- Meni Rosenfeld (talk) 12:21, 20 January 2017 (UTC)
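(For readers wondering what "run a heuristic algorithm" can look like in practice, here is a minimal Python sketch, not taken from the thread: the function name and parameters are invented for illustration, and the relaxation scheme below is just one of many possible heuristics, with no optimality guarantee.)

import numpy as np

def pack_circles(n, R=1.0, r=0.22, iters=5000, seed=0):
    """Crude relaxation heuristic: try to place n circles of radius r inside a
    circle of radius R by repeatedly pushing overlapping circles apart and
    pulling stray circles back inside the container. No optimality guarantee."""
    rng = np.random.default_rng(seed)
    c = rng.uniform(-(R - r), R - r, size=(n, 2)) * 0.5   # random initial centers

    for _ in range(iters):
        for i in range(n):
            for j in range(i + 1, n):
                d = c[i] - c[j]
                dist = np.linalg.norm(d)
                if dist < 2 * r:                  # circles i and j overlap
                    if dist < 1e-12:              # coincident centers: random nudge
                        d = rng.normal(size=2)
                        dist = np.linalg.norm(d)
                    shift = 0.5 * (2 * r - dist) * d / dist
                    c[i] += shift
                    c[j] -= shift
        norms = np.linalg.norm(c, axis=1)
        outside = norms > (R - r)                 # circles poking out of the container
        c[outside] *= ((R - r) / norms[outside])[:, None]

    # worst remaining constraint violation (>= 0 means a valid packing was found)
    pair_gap = min(np.linalg.norm(c[i] - c[j]) - 2 * r
                   for i in range(n) for j in range(i + 1, n))
    wall_gap = (R - r) - np.linalg.norm(c, axis=1).max()
    return c, min(pair_gap, wall_gap)

centers, gap = pack_circles(11)   # 11 circles, with r chosen comfortably below the optimum
print("feasible packing found" if gap >= -1e-9 else "no feasible packing found", gap)

The idea is simply to resolve overlap and boundary violations over and over until, with luck, a feasible layout remains; it reports honestly whether it succeeded.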
I can imagine a lot of people would not get the 11-circle packing given a week to work on it heuristically! I wonder if anyone has ever turned chips to pack them on a wafer better; I think it would be possible to do accurately with the registration used in current step-and-repeat fabrication, but it certainly sounds very worrisome. Dmcq (talk) 13:55, 20 January 2017 (UTC)
There's also a distinction between "useful now" and "useful when it was created". Prime numbers were for the most part trivia - it's useful to know about them for some everyday tasks (numbers with lots of prime factors are easier to divide arbitrarily, and gears with prime numbers of teeth are less susceptible to wear), and some people thought they had some spiritual value, but there was little value to knowing, say, Mersenne primes. Until, that is, mathematical cryptography came along and suddenly being able to generate and study very large prime numbers was of critical importance for security reasons. Other examples of pure mathematics that graduated to applied are conic sections, which were just pretty curves until people realized they described orbits, non-Euclidean geometry and Riemannian geometry, which allowed the formulation of relativity, and Hilbert spaces, which turn out to be very important in quantum physics. Lots more examples here. So even things that appear useless at the moment may have great applications in a few decades' time. Smurrayinchester 16:34, 20 January 2017 (UTC)
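(A small illustration of how routine generating large primes has become, added here for completeness; this sketch assumes the sympy library, and the 512-bit size is only an example.)

from sympy import isprime, randprime

# Generate two random 512-bit primes (real deployments use larger sizes);
# their product is an RSA-style modulus whose factors are believed hard to recover.
p = randprime(2**511, 2**512)
q = randprime(2**511, 2**512)
n = p * q
print(isprime(p), isprime(q), n.bit_length())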

The integral test for convergence has a condition that the function be continuous

My professor says that this condition is unnecessary. I argue that if you have a function that's not defined at any x-value but the integers, the integral test would return a finite sum (namely 0); I think he said something like the function having to have a value at every point, but isn't that just requiring a continuous function? Maybe I misunderstood his objection, but I think I'm right anyway. 69.22.242.15 (talk) 14:55, 20 January 2017 (UTC)

  • "Continuous" is stronger than "defined". Consider for instance the Dirichlet function or Thomae's function, both of which are defined on all reals but not continuous on any nonzero interval.
Now, the integral test for convergence assumes the function to be monotonically decreasing and positive. There is a theorem out there that says such a function has a limit (which is 0 when the series converges). And I seem to remember another theorem that says that the set of discontinuities of a function that is monotonic (and bounded (not sure that is necessary?)) on an interval must be a discrete set. If so, that means the function is almost continuous. Wrong, see counterexample provided by 111.69.101.73 below. TigraanClick here to contact me 15:11, 20 January 2017 (UTC)
The integral test is always presented with the function required to be eventually decreasing (the partial sum over the region where it is not decreasing is finite anyway) and continuous, but I've not seen any proof that explains the continuity requirement. That's really what I'd like to know: a proof that you need a continuous function for the test. It definitely needs to be bounded (thus the non-negative requirement). 69.22.242.15 (talk) 20:41, 20 January 2017 (UTC)
You don't need continuous, you just need eventually decreasing and integrable. Continuity is usually used as a shorthand for integrable because it's easier to describe and suffices in almost all cases. As for the proof that the integral test works, almost every calculus text has a picture of the proof: draw width 1 rectangles to the left, and the series is less than the area under the function; draw width 1 rectangles to the right, and it's greater than the area. Continuity was never used.--111.69.101.73 (talk) 10:36, 21 January 2017 (UTC)
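(Spelled out in symbols, for completeness: this is the standard bound, assuming only that f is non-negative and decreasing, hence Riemann-integrable, on [1, N+1].)

    \int_{1}^{N+1} f(x)\,dx \;\le\; \sum_{n=1}^{N} f(n) \;\le\; f(1) + \int_{1}^{N} f(x)\,dx

Letting N → ∞ shows the series and the improper integral converge or diverge together; only monotonicity is used, never continuity.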
This sounds possible, since no one actually proves the continuous part in those proofs, but I thought mathematicians were supposed to be precise. Anyway, I looked up Riemann_integral#Integrability, which says, "If a real-valued function is monotone on the interval [a, b] it is Riemann-integrable, since its set of discontinuities is at most countable, and therefore of Lebesgue measure zero." Well, aren't we requiring an eventually monotonic function here? Then why the extra specification for continuity or integrability? 69.22.242.15 (talk) 14:25, 22 January 2017 (UTC)
The statement with continuity is both correct and precise. Most undergraduate courses in calculus largely avoid the issue of integrability, and continuity is a convenient substitute. --JBL (talk) 00:45, 23 January 2017 (UTC)
I'm afraid you're mistaken. Consider f defined on [0,1] by f(x) = 1/n for x in (1/(n+1), 1/n], and f(0) = -1. It has discontinuities at each 1/n (n ≥ 2), and also at 0.
What's true is that all discontinuities of a monotonic function are jump discontinuities, and a function can have only countably many of those. That may have been what you were thinking of.--111.69.101.73 (talk) 10:36, 21 January 2017 (UTC)
Discrete set ≠ finite set. Although your example has a non-isolated discontinuity at 0, it is not monotonic either. TigraanClick here to contact me 14:11, 21 January 2017 (UTC)
I'm aware of the meaning of discrete set, which is why I included an accumulation point in the discontinuities. And you're mistaken, it is monotonic. It's not strictly increasing, but it could be easily made to be. In fact, with a bit more effort, you could make a strictly increasing function with discontinuities at precisely the rational numbers.--2406:E006:3A70:1:74D7:EB87:1E:BFC5 (talk) 23:09, 21 January 2017 (UTC)
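(The construction alluded to in that last sentence can be written down explicitly; this is a standard example, added for completeness: enumerate the rationals as q_1, q_2, q_3, … and set)

    f(x) = \sum_{n \,:\, q_n \le x} 2^{-n}

Between any two reals x < y there is some q_n, so f(y) - f(x) ≥ 2^{-n} > 0, making f strictly increasing; it jumps by 2^{-n} at each rational q_n and is continuous at every irrational.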
f(0) = -1 < 0 < f(x) for any positive x, whereas f is decreasing for x > 0, so it is not monotonic (and cannot be easily fixed to be). Either you restrict the function to the positive reals, and then it has no discontinuity at 0 since it is not defined at 0, or you include 0 in the domain of definition and then you cannot (or so I seem to remember) have a monotonic function with an accumulation point of discontinuities. TigraanClick here to contact me 11:59, 24 January 2017 (UTC)
f is not decreasing for positive x. Look again. It's a stepped version of the identity function.--2406:E006:3A70:1:3811:AF51:5425:D660 (talk) 12:40, 24 January 2017 (UTC)
...huh. Yes. Sorry. My brain parsed the definition as "f([1/(n+1),1/n[)=n". So that is indeed a counterexample to the "theorem" I "remembered". That must be what happens when you are rusty. TigraanClick here to contact me 14:17, 25 January 2017 (UTC)

Locus: Ellipse with Axes 1 and 3

A couple of years ago, I was researching various geometric loci with interesting properties. (For instance, to give just one such example, I asked myself what the set of triangles with two perpendicular medians looks like; in this particular case, the locus described by the vertex not lying on either of the two perpendicular medians turned out to be a simple circle.) Now, one of the several loci I discovered at the time was an ellipse whose axes were in the ratio 1 : 3. Unfortunately, I completely forgot what the locus was supposed to represent, and I cannot for the life of me remember what its defining property was. (Before anyone asks, this ellipse is indeed determined by two equilateral triangles sharing a common side, but that's not it.) I know this is a long shot, but I was hoping that maybe one of you could help me. Thank you. — 79.113.193.22 (talk) 14:58, 20 January 2017 (UTC)

If the ratio between the ellipse's two axes were 1 : 2 instead of 1 : 3, for instance, then this locus would be related to the set of triangles whose angle bisector (also) bisects the segment determined by the foot of its median and the foot of its altitude. — 79.113.193.22 (talk) 18:04, 20 January 2017 (UTC)
Never mind, now I remember: It's related to the set of triangles whose Euler line is parallel to its base. — 79.118.187.232 (talk) 04:15, 21 January 2017 (UTC)
  Resolved